Senior Data Engineer @ GM Financial | Jobright.ai
Senior Data Engineer jobs in Irving, TX
200+ applicants
This job has closed.

GM Financial · 4 days ago

Senior Data Engineer


Responsibilities

Code, test, deploy, orchestrate, monitor, document, and troubleshoot cloud-based data engineering processes and associated automation according to best practices and security standards (see the sketch after this list).
Work closely with data scientists, data architects, ETL developers, and business partners to extract features of interest from external and internal data sources.
Contribute to evaluation, research, and experimentation efforts with batch and streaming data engineering technologies.
Inform and showcase capabilities of emerging technologies to enable their adoption within the data engineering practice.
Define and refine processes and procedures for the data engineering practice.
Educate and develop ETL developers on data engineering cloud-based initiatives for a smooth transition.
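
For context, a minimal PySpark sketch of the kind of cloud-based batch process these responsibilities describe: ingest raw source data from object storage, extract a few features of interest, and persist the result for downstream consumers. The storage paths, columns, and table layout are hypothetical placeholders, not GM Financial's actual pipelines.

# Illustrative sketch only; paths, columns, and names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_batch_ingest").getOrCreate()

# Ingest raw JSON events from cloud object storage (ADLS-style path shown).
raw = spark.read.json("abfss://raw@exampleaccount.dfs.core.windows.net/events/")

# Extract simple features of interest for data science consumers.
features = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .groupBy("customer_id", "event_date")
       .agg(F.count("*").alias("event_count"))
)

# Persist as Parquet, partitioned by date, for the curated layer.
(features.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("abfss://curated@exampleaccount.dfs.core.windows.net/customer_daily_events/"))

In practice such a job would be orchestrated, monitored, and documented per the standards above; a streaming analogue would read with spark.readStream instead of spark.read.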

Qualifications


Hadoop, HDFS, Spark, Kafka, Flume, JSON, Parquet, SequenceFile, Cloud Databases, MQ, Relational Databases, Oracle, Azure, AWS, GCP, Azure ARM Templates, HashiCorp Terraform, AWS CloudFormation, Cloud Computing, Hybrid Cloud Computing, Virtualization Technologies, Infrastructure as a Service, Platform as a Service, Software as a Service, Object Storage Technologies, Data Lake Storage Gen2, S3, MinIO, Ceph, ADLS

Required

Experience with processing large data sets using Hadoop, HDFS, Spark, Kafka, Flume or similar distributed systems
Experience ingesting data from a variety of source formats and systems, such as JSON, Parquet, SequenceFile, cloud databases, MQ, and relational databases such as Oracle
Experience with cloud technologies (such as Azure, AWS, GCP) and native toolsets such as Azure ARM Templates, HashiCorp Terraform, and AWS CloudFormation
Understanding of cloud computing technologies, business drivers and emerging computing trends
Thorough understanding of hybrid cloud computing: virtualization technologies, the Infrastructure as a Service, Platform as a Service, and Software as a Service delivery models, and the current competitive landscape
Working knowledge of object storage technologies, including but not limited to Data Lake Storage Gen2, S3, MinIO, Ceph, and ADLS
Experience with containerization, including but not limited to Docker, Kubernetes, Spark on Kubernetes, and the Spark Operator
Working knowledge of Agile development/SAFe, Scrum, and Application Lifecycle Management
Strong background with source control management systems (Git or Subversion); build systems (Maven, Gradle, Webpack); code quality (Sonar); artifact repository managers (Artifactory); and continuous integration/continuous deployment (Azure DevOps)
Experience with NoSQL data stores such as CosmosDB, MongoDB, Cassandra, Redis, Riak or other technologies that embed NoSQL with search such as MarkLogic or Lily Enterprise
Creating and maintaining ETL processes
Knowledgeable of best practices in information technology governance and privacy compliance
Experience with REST APIs
Advanced knowledge of the Databricks platform and associated features including Workflows, Unity Catalog, Delta Live Tables, Time Travel, the SQL Statement Execution API, etc.
Understanding of the Databricks medallion architecture (see the sketch after this list)
Advanced knowledge of programming concepts and languages including SQL and Python/PySpark
5-7 years of hands-on experience with data engineering required
4-6 years of hands-on experience with processing large data sets required
4-6 years of hands-on experience with SQL, data modeling, relational databases and/or NoSQL databases required
Bachelor’s Degree in related field or equivalent work experience required
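
As a point of reference for the Databricks items above, a minimal sketch of a medallion-style flow (bronze raw landing, silver cleansed, gold business aggregates) in PySpark with Delta tables. Table, path, and column names are hypothetical placeholders; a production version would typically run under Databricks Workflows or Delta Live Tables rather than as a single script.

# Illustrative medallion-style flow; table and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion_example").getOrCreate()

# Bronze: land raw source data as-is.
bronze = spark.read.json("/mnt/landing/payments/")
bronze.write.format("delta").mode("append").saveAsTable("example.bronze_payments")

# Silver: conform types, drop incomplete and duplicate records.
silver = (
    spark.table("example.bronze_payments")
         .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
         .dropna(subset=["payment_id", "amount"])
         .dropDuplicates(["payment_id"])
)
silver.write.format("delta").mode("overwrite").saveAsTable("example.silver_payments")

# Gold: business-level aggregate for reporting and analytics.
gold = (
    spark.table("example.silver_payments")
         .groupBy("account_id")
         .agg(F.sum("amount").alias("total_paid"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("example.gold_account_totals")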

Preferred

Ability to troubleshoot complex problems and work across teams to meet commitments
Excellent computer skills and proficiency in digital data collection
Ability to work in an Agile/Scrum team environment
Strong interpersonal, verbal, and writing skills
Digital technology solutions (DMPs, CDPs, tag management platforms, cross-device tracking, SDKs, etc.)
Knowledge of Real-Time CDP and Journey Analytics solutions
Understanding of big data platforms and architectures, data stream processing pipelines/platforms, data lakes, and data lakehouses
SQL experience: querying data and communicating the insights that can be derived
Understanding of cloud solutions such as Google Cloud Platform, Microsoft Azure, and Amazon AWS cloud architecture and services
Understanding of GDPR, privacy, and security topics

Benefits

401K matching
Bonding leave for new parents (12 weeks, 100% paid)
Tuition assistance
Training
GM employee auto discount
Community service pay
Nine company holidays

Company

GM Financial

GM Financial is the captive finance company and a wholly-owned subsidiary of General Motors Company.

Funding

Current Stage: Late Stage
Total Funding: unknown
2010-09-29: Acquired by General Motors

Leadership Team

Katie DeGraaf
Senior Vice President, OnStar Insurance, Product & Telematics
Beverly Fells-Bohanon
Vice President
Company data provided by crunchbase