Lead Cloud Data Developer (100% REMOTE/NO C2C) @ Amerit Consulting | Jobright.ai
San Diego, CA · 200+ applicants · Posted by Agency
Amerit Consulting · 15 hours ago

Consulting · Human Resources
No H1B sponsorship
Hiring Manager: Bhupesh Khurana

Responsibilities

Own the overall architecture of a specific module within a product (e.g., data ingestion, near-real-time data processing); perform design and assist with implementation, accounting for system characteristics to achieve optimal performance, reliability, and maintainability.
Provide technical guidance to team members, ensuring they are working towards the product's architectural goals.
Create and manage RFCs (Requests for Comments), ADRs (Architecture Decision Records), design notes, and technical documentation for your module, following the architecture governance processes.
Lead a team of data engineers, providing mentorship, setting priorities, and ensuring alignment with business goals.
Architect, design, and build scalable data pipelines for processing large volumes of structured and unstructured data from various sources.
Collaborate with software engineers, architects, and product teams to design and implement systems that enable real-time and batch data processing at scale.
Be the go-to person for PySpark-based solutions, ensuring optimal performance and reliability for distributed data processing.
Ensure that data engineering systems adhere to the best data security, privacy, and governance practices in line with industry standards.
Perform code reviews for the product, ensuring adherence to company coding standards and best practices.
Develop and implement monitoring and alerting systems to ensure timely detection and resolution of data pipeline failures and performance bottlenecks.
Act as a champion for new technologies, helping ease transitions and addressing concerns or resistance from team members.

Qualifications


AWS · Data Pipeline Design · PySpark · SQL · Python · Cloud PaaS Technologies · Data Engineering Team Leadership · Database Design · ETL Tools · AWS Certified Data Engineer · AWS Certified Solutions Architect · C# · GoLang · JavaScript · ReactJS · Snowflake · Databricks · Synapse · MS SQL Server · ML / Notebooks · Healthcare Industry Experience · Legal Compliance Knowledge · Soft Skills

Required

Candidate must be authorized to work in the USA without requiring sponsorship
Experience leading a data engineering team, with a strong focus on software engineering principles such as KISS, DRY, YAGNI, etc.
Candidate MUST have experience owning large, complex system architectures, plus hands-on experience designing and implementing data pipelines across large-scale systems.
Experience implementing and optimizing data pipelines with AWS is a must.
Production delivery experience in cloud-based PaaS big data technologies (EMR, Snowflake, Databricks, etc.)
Experienced in multiple cloud PaaS persistence technologies, with in-depth knowledge of cloud-based ETL offerings and orchestration technologies (AWS Step Functions, Airflow, etc.)
Experienced in stream-based and batch processing, applying modern technologies
Working experience with distributed file systems (S3, HDFS, ADLS), table formats (Hudi, Iceberg), and various open file formats (JSON, Parquet, CSV, etc.)
Strong programming experience in PySpark, SQL, Python, etc.
Database design skills including normalization/de-normalization and data warehouse design
Knowledge and understanding of relevant legal and regulatory requirements, such as SOX, PCI, HIPAA, Data Protection
Experience in the healthcare industry is a plus
A collaborative and informative mentality is a must!
Bachelor's or Master's degree in Computer Science, Information Systems, or an engineering field, or relevant experience.
10+ years of related experience in developing data solutions and data movement.

Preferred

AWS certifications, preferably AWS Certified Data Engineer and AWS Certified Solutions Architect.
Proficiency in at least one of: C#, GoLang, JavaScript, or ReactJS
Spark / Python / SQL
Snowflake / Databricks / Synapse / MS SQL Server
ETL / Orchestration Tools (Step Functions, dbt, etc.)
ML / Notebooks

Company

Amerit Consulting

Amerit Consulting is a staffing and recruiting company that offers temporary staffing and payrolling services.

Funding

Current Stage
Late Stage

Leadership Team

Jordan Schultheis
National Account Executive
Company data provided by Crunchbase