Jobs via Dice · 13 hours ago
Senior Data DevOps Engineer
EPAM Georgia, a team of innovators united by a passion for technology, is seeking a highly skilled Senior Data DevOps Engineer to support the development of an Enterprise Data Platform (EDP) for a leading oil and gas company. This role focuses on implementing and optimizing an on-premises data platform, with an emphasis on automation, infrastructure management, and operational excellence.
Computer Software
Responsibilities
Install and configure platform components ensuring seamless integration with the EDP stack
Set up and manage Role-Based Access Control (RBAC) to enforce security best practices
Design, implement, and maintain CI/CD pipelines (e.g., GitLab CI) for automated build, test, and deployment of platform components and data workflows
Integrate Infrastructure as Code (IaC) tools (e.g., Terraform) into pipelines for repeatable auditable deployments
Deploy and manage logging and monitoring solutions using the LGTM stack (Loki, Grafana, Tempo, Mimir)
Build and configure a centralized management console for the EDP
Establish and support multi-tenancy for secure independent environments across teams and locations
Automate deployment pipelines for data ingestion, transformation, and querying frameworks
Manage infrastructure for scalability, high availability, and reliability using Kubernetes and RedHat OS
Implement and enforce security policies with HashiCorp Vault and Open Policy Agent (OPA)
Proactively monitor, troubleshoot, and optimize platform components for high performance
Collaborate with Data Engineering and Platform teams to streamline releases and promote continuous delivery
Work closely with the customer's technical team to align goals and resolve issues
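As an illustrative sketch only (not part of the posting), the CI/CD responsibilities above — GitLab CI pipelines that integrate Terraform for repeatable, auditable deployments — might look roughly like the following `.gitlab-ci.yml`; all stage names, the image tag, and file paths are hypothetical:

```yaml
# Hypothetical .gitlab-ci.yml sketch: a Terraform-driven deployment
# pipeline for a platform component, assuming Terraform code lives at
# the repository root and state is configured in the backend.
stages:
  - validate
  - plan
  - apply

validate:
  stage: validate
  image: hashicorp/terraform:light
  script:
    - terraform init -backend=false
    - terraform validate

plan:
  stage: plan
  image: hashicorp/terraform:light
  script:
    - terraform init
    - terraform plan -out=tfplan
  artifacts:
    paths:
      - tfplan        # saved plan is passed to the apply stage

apply:
  stage: apply
  image: hashicorp/terraform:light
  script:
    - terraform init
    - terraform apply -auto-approve tfplan
  when: manual        # manual gate keeps production applies auditable
```

Applying a previously reviewed plan file, with the apply stage behind a manual gate, is one common way to keep deployments both repeatable and auditable.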
Qualifications
Required
Proven experience as a Data DevOps Engineer or similar role in enterprise data platforms
Hands-on expertise with Kubernetes and RedHat OS for infrastructure management
Strong background in CI/CD pipeline design and automation (preferably GitLab CI)
Proficiency with Infrastructure as Code tools (e.g., Terraform)
Experience implementing RBAC security policies and secrets management (HashiCorp Vault, OPA)
Familiarity with logging and monitoring stacks (LGTM or similar)
Solid understanding of multi-tenancy and centralized management in data platforms
Excellent troubleshooting and problem-solving skills
High reliability, self-sufficiency, and the ability to work independently
Strong communication skills for effective collaboration with technical teams and stakeholders
Preferred
Experience with Apache Kafka, Apache Spark (including Spark Streaming), MinIO, Apache Iceberg, PostgreSQL, and Trino
Prior involvement in building or optimizing on-premises enterprise data platforms
Knowledge of distributed systems and big data architectures
Familiarity with S3-compatible object storage solutions
Experience supporting multi-location or multi-team environments
Certifications in Kubernetes, RedHat, or other relevant DevOps technologies
Benefits
Opportunity to work abroad for up to two months per year
Relocation opportunities within our offices in 55+ countries
Corporate and social events
Leadership development, career advising, soft skills and well-being programs
Certifications, including Google Cloud Platform, Azure and AWS
Unlimited access to LinkedIn Learning and getAbstract
Free English classes with certified teachers
Participation in the Employee Stock Purchase Plan
Monetary bonuses for engaging in the referral program
Comprehensive medical & family care package
Five trust days per year (sick leave without a medical certificate)
Benefits package (sports activities, a variety of stores and services)
Company
Jobs via Dice
Welcome to Jobs via Dice, the go-to destination for discovering the tech jobs you want.
Funding
Current Stage
Early Stage (company data provided by Crunchbase)