Rackspace Technology · 4 days ago
Senior Big Data Hadoop ML Engineer (GCP) - USA
Responsibilities
Develop scalable and robust code for large-scale batch processing systems using Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase.
Develop, manage, and maintain batch pipelines supporting Machine Learning workloads.
Leverage GCP for scalable big data processing and storage solutions.
Implement automation/DevOps best practices for CI/CD, IaC, etc.
Qualifications
Required
Bachelor’s degree in Computer Science, Software Engineering, or a related field of study.
10+ years of experience in customer-facing software/technology or consulting.
5+ years of experience with on-premises-to-cloud migrations or IT transformations.
5+ years of experience building and operating solutions on GCP.
Strong experience in the Apache Hadoop ecosystem, especially with Oozie and Pig.
Strong programming skills with Java, Python, and Spark.
Experience applying infrastructure and DevOps principles in daily work.
Experience with continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC) tools such as Terraform.
Proven experience in engineering batch processing systems at scale.
Experience with batch pipelines supporting Machine Learning workloads.
Company
Rackspace Technology
Be ready for what’s next with multicloud solutions from Rackspace Technology™. We are the multicloud solutions experts.
Funding
Current Stage: Late Stage
Total Funding: unknown
Acquired: 2016-08-08
Recent News
GlobeNewswire News Room
2024-01-15
Company data provided by Crunchbase