Lead GCP Data Engineer @ AptivaCorp | Jobright.ai
Lead GCP Data Engineer jobs in Bentonville, AR
Be an early applicant · Less than 25 applicants

AptivaCorp · 11 hours ago

Lead GCP Data Engineer

Analytics · Cloud Data Services
Growth Opportunities
H1B Sponsor Likely

Responsibilities

Design and develop big data applications using the latest open source technologies.
Work within an offshore delivery model with managed outcomes (desired).
Develop logical and physical data models for big data platforms.
Automate workflows using Apache Airflow.
Create data pipelines using Apache Hive, Apache Spark, and Apache Kafka (see the sketch after this list).
Provide ongoing maintenance and enhancements to existing systems and participate in rotational on-call support.
Learn our business domain and technology infrastructure quickly and share your knowledge freely and actively with others in the team.
Mentor junior engineers on the team.
Lead daily standups and design reviews.
Groom and prioritize the backlog using JIRA.
Act as the point of contact for your assigned business domain.
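
The items above call out Airflow, Spark, Hive, and Kafka. As a rough illustration only (not taken from the posting), here is a minimal Airflow DAG sketch of that kind of workflow automation: a daily run that submits a Spark transform and then registers the output as a Hive partition. The DAG id, script path, and table names are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG                          # Airflow 2.x style imports
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_sales_pipeline",               # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                           # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Submit a Spark batch job that transforms raw events into a curated dataset.
    transform = BashOperator(
        task_id="spark_transform",
        bash_command=(
            "spark-submit --master yarn "
            "/opt/jobs/transform_sales.py --run-date {{ ds }}"   # placeholder script
        ),
    )

    # Register the day's output as a new Hive partition for downstream consumers.
    load = BashOperator(
        task_id="hive_load",
        bash_command=(
            "hive -e \"ALTER TABLE curated.sales "
            "ADD IF NOT EXISTS PARTITION (dt='{{ ds }}')\""      # placeholder table
        ),
    )

    transform >> load
```

In a Kafka-fed setup, a separate streaming job (for example, Spark Structured Streaming) would typically handle ingestion, with a DAG like this orchestrating the batch layer.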

Qualifications

Required

Hadoop: 8+ years of experience
Spark: 8+ years of experience
Scala: 8+ years of experience
GCP: 5+ years of experience
ETL process / data pipeline: 8+ years of experience
2+ years of recent GCP experience
Experience building data pipelines in GCP
GCP Dataproc, GCS, and BigQuery experience (see the sketch after this list)
5+ years of hands-on experience with developing data warehouse solutions and data products
5+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, or Spark, plus Airflow or another workflow orchestration solution
2+ years of hands-on experience in modeling and designing schema for data lakes or for RDBMS platforms
Experience with programming languages: Python, Java, Scala, etc.
Experience with scripting languages: Perl, Shell, etc.
Practical experience working with, processing, and managing large data sets (multi-TB/PB scale)
Exposure to test driven development and automated testing frameworks
Background in Scrum/Agile development methodologies
Capable of delivering on multiple competing priorities with little supervision
Excellent verbal and written communication skills
Bachelor’s degree in computer science or equivalent experience
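
Since the requirements above ask for pipelines built on GCP Dataproc, GCS, and BigQuery, here is a minimal PySpark sketch of that pattern, suitable for submission to a Dataproc cluster: read raw files from GCS, aggregate, and write to BigQuery through the spark-bigquery connector. Bucket, dataset, and column names are hypothetical placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("gcs-to-bigquery").getOrCreate()

# Read raw JSON events from a GCS bucket (placeholder path); Dataproc clusters
# ship with the GCS connector, so gs:// paths work out of the box.
events = spark.read.json("gs://example-raw-bucket/events/dt=2024-01-01/")

# Simple aggregation: order counts and revenue per store for the day.
daily = (
    events.groupBy("store_id")
          .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue"))
)

# Write to BigQuery; assumes the spark-bigquery connector is on the cluster and
# a staging bucket exists (both placeholders here).
(daily.write.format("bigquery")
      .option("table", "example_dataset.daily_store_sales")
      .option("temporaryGcsBucket", "example-temp-bucket")
      .mode("overwrite")
      .save())
```

A job like this would usually be submitted with gcloud dataproc jobs submit pyspark, or wired into an Airflow DAG via the Dataproc operators.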

Preferred

Retail domain experience
Gitflow
Atlassian products: BitBucket, JIRA, Confluence, etc.
Continuous Integration tools such as Bamboo, Jenkins, or TFS

Company

AptivaCorp

Aptiva was founded by a team of experienced professionals with a solid background in IT and management.

H1B Sponsorship

AptivaCorp has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. Additional information is provided below for reference. (Data powered by the US Department of Labor)
[Chart: distribution of job fields receiving sponsorship; the highlighted field is similar to this job.]
Trends of Total Sponsorships
2023 (30)
2022 (21)
2021 (36)
2020 (23)

Funding

Current Stage: Late Stage
Company data provided by crunchbase