Delta System & Software, Inc.
Senior/Lead Data Engineer
Delta System & Software, Inc. is seeking an experienced Senior/Lead Data Engineer to design and deliver scalable data solutions on the Azure ecosystem. The role involves architecting data platforms, optimizing data workflows, and leading the development of ETL pipelines while collaborating with various teams to ensure high-quality data management.
Responsibilities
Architect, design, and implement scalable data platforms and pipelines on Azure and Databricks
Build and optimize data ingestion, transformation, and processing workflows across batch and real-time data streams
Work extensively with ADLS, Delta Lake, and Spark (Python) for large-scale data engineering
Lead the development of complex ETL/ELT pipelines, ensuring high quality, reliability, and performance
Design and implement data models, including conceptual, logical, and physical models for analytics and operational workloads
Work with relational and lakehouse systems including PostgreSQL and Delta Lake
Define and enforce best practices in data governance, data quality, security, and architecture
Collaborate with architects, data scientists, analysts, and business teams to translate requirements into technical solutions
Troubleshoot production issues, optimize performance, and support continuous improvement of the data platform
Mentor junior engineers and contribute to building engineering standards and reusable components
Qualifications
Required
8+ years of hands-on data engineering experience in enterprise environments
Strong expertise in Azure services, especially Azure Databricks, Azure Functions, and Azure Data Factory
Advanced proficiency in Apache Spark with Python (PySpark)
Strong command of SQL, query optimization, and performance tuning
Deep understanding of ETL/ELT methodologies, data pipelines, and scheduling/orchestration
Hands-on experience with Delta Lake (ACID transactions, optimization, schema evolution)
Strong experience in data modelling (normalized, dimensional, lakehouse modelling)
Experience in both batch processing and real-time/streaming data (Kafka, Azure Event Hubs, or similar)
Solid understanding of data architecture principles, distributed systems, and cloud-native design patterns
Ability to design end-to-end solutions, evaluate trade-offs, and recommend best-fit architectures
Strong analytical, problem-solving, and communication skills
Ability to collaborate with cross-functional teams and lead technical discussions
Preferred
Experience with CI/CD tools such as Azure DevOps and Git
Familiarity with IaC tools (Terraform, ARM)
Exposure to data governance and cataloging tools (Azure Purview)
Experience supporting machine learning or BI workloads on Databricks