This job has closed.

Jobs via Dice · 6 hours ago

Senior Data DevOps Engineer

EPAM Georgia is a team of innovators united by a passion for technology. We are seeking a Senior Data DevOps Engineer to support the development and operationalization of an Enterprise Data Platform (EDP) for a leading oil and gas company. The role focuses on implementing and optimizing an on-premises data platform, integrating advanced technologies for data ingestion, processing, and analytics while emphasizing automation and operational excellence.
Computer Software

Responsibilities

Install and configure platform components ensuring seamless integration with the EDP stack
Set up and manage RBAC (Role-Based Access Control) to enforce security best practices
Design, implement, and maintain CI/CD pipelines (e.g., GitLab CI) for automated build, test, and deployment of platform components and data workflows
Integrate Infrastructure as Code (IaC) tools (e.g., Terraform) into pipelines for repeatable, auditable deployments
Deploy and manage logging and monitoring solutions using the LGTM stack (Loki, Grafana, Tempo, Mimir)
Build and configure a centralized management console for the EDP
Establish and support multi-tenancy for secure, independent environments across teams and locations
Automate deployment pipelines for data ingestion, transformation, and querying frameworks
Manage infrastructure for scalability, high availability, and reliability using Kubernetes and Red Hat OS
Implement and enforce security policies with HashiCorp Vault and Open Policy Agent (OPA)
Proactively monitor, troubleshoot, and optimize platform components for high performance
Collaborate with Data Engineering and Platform teams to streamline releases and promote continuous delivery
Work closely with the customer's technical team to align goals and resolve issues

Qualifications

Kubernetes, Red Hat OS, CI/CD pipeline design, Infrastructure as Code, RBAC security policies, HashiCorp Vault, LGTM stack, Multi-tenancy management, Troubleshooting skills, Self-sufficiency, Apache Kafka, Apache Spark, PostgreSQL, Trino, Big data architectures, S3-compatible storage, DevOps certifications, Problem-solving skills, Communication skills, Independent work

Required

Proven experience as a Data DevOps Engineer or similar role in enterprise data platforms
Hands-on expertise with Kubernetes and Red Hat OS for infrastructure management
Strong background in CI/CD pipeline design and automation (preferably GitLab CI)
Proficiency with Infrastructure as Code tools (e.g., Terraform)
Experience implementing RBAC security policies and secrets management (HashiCorp Vault, OPA)
Familiarity with logging and monitoring stacks (LGTM or similar)
Solid understanding of multi-tenancy and centralized management in data platforms
Excellent troubleshooting and problem-solving skills
High reliability, self-sufficiency, and ability to work independently
Strong communication skills for effective collaboration with technical teams and stakeholders

Preferred

Experience with Apache Kafka, Apache Spark (including Spark Streaming), MinIO, Apache Iceberg, PostgreSQL, and Trino
Prior involvement in building or optimizing on-premises enterprise data platforms
Knowledge of distributed systems and big data architectures
Familiarity with S3-compatible object storage solutions
Experience supporting multi-location or multi-team environments
Certifications in Kubernetes, Red Hat, or relevant DevOps technologies

Benefits

Participation in the Employee Stock Purchase Plan
Monetary bonuses for engaging in the referral program
Comprehensive medical & family care package
Five trust days per year (sick leave without a medical certificate)
Benefits package (sports activities, a variety of stores and services)

Company

Jobs via Dice

Welcome to Jobs via Dice, the go-to destination for discovering the tech jobs you want.

Funding

Current Stage
Early Stage