Interactive Resources - iR
Senior Data Platform Engineer
Interactive Resources - iR is a nationally distributed organization operating in a regulated financial environment. It is seeking a highly experienced Databricks Data Engineer to help modernize and scale its cloud data platform. The role involves designing and optimizing large-scale data solutions on the Databricks Lakehouse platform, enabling secure analytics and reporting across the business.
Human Resources · Information Technology
Responsibilities
Design, build, and optimize large-scale data pipelines on the Databricks Lakehouse platform with a focus on reliability, scalability, and governance
Contribute to the modernization of an Azure-based data ecosystem, including cloud architecture, distributed data processing, and CI/CD automation
Develop and maintain ETL/ELT pipelines using Python, PySpark, Spark SQL, and Databricks notebooks (a PySpark example is sketched after this list)
Design and optimize Delta Lake data models to support high-performance analytics and enterprise reporting
Implement and manage Unity Catalog to support role-based access control, lineage, governance, and secure data sharing
Build reusable frameworks using Databricks Workflows, Repos, and Delta Live Tables
Create scalable ingestion pipelines for APIs, relational databases, files, streaming sources, and master data systems
Automate API ingestion and data workflows using Python and REST APIs (a Python ingestion example is sketched after this list)
Leverage orchestration tools such as Apache Airflow to manage complex workflows and dependencies (an Airflow example is sketched after this list)
Support data governance, metadata management, lineage, and cataloging initiatives
Enable downstream use cases including BI, analytics, data science, and application integrations
Write highly optimized SQL queries and stored procedures, and build curated datasets for reporting and analytics
Automate deployments, testing, and DevOps workflows across Databricks and Azure environments
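For context, a minimal sketch of the kind of PySpark-to-Delta pipeline this role involves. It assumes a Databricks notebook where `spark` is already provided; the storage path and the catalog, schema, and table names are hypothetical placeholders, not part of the posting.

```python
# Minimal PySpark ETL sketch: raw JSON files -> cleaned, governed Delta table.
# Assumes a Databricks notebook context where `spark` is predefined.
from pyspark.sql import functions as F

# Hypothetical ADLS landing path
raw_path = "abfss://landing@examplestorage.dfs.core.windows.net/orders/"

raw_df = (
    spark.read
    .format("json")
    .option("multiLine", "true")
    .load(raw_path)
)

# Light cleansing and typing before the curated layer
clean_df = (
    raw_df
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])
)

# Write to a Delta table under a Unity Catalog three-level name (hypothetical)
(
    clean_df.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("finance_dev.curated.orders")
)
```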
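A minimal sketch of automated REST API ingestion in Python, as mentioned in the responsibilities above; the endpoint, token handling, pagination scheme, and landing path are all hypothetical.

```python
# Sketch of paginated REST API ingestion into a raw landing zone.
import datetime
import json

import requests

API_URL = "https://api.example.com/v1/accounts"   # hypothetical endpoint
LANDING_DIR = "/dbfs/landing/accounts"            # hypothetical landing path


def fetch_page(url: str, token: str, page: int) -> dict:
    """Fetch one page of results, raising on HTTP errors."""
    resp = requests.get(
        url,
        headers={"Authorization": f"Bearer {token}"},
        params={"page": page, "page_size": 500},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


def ingest(token: str) -> None:
    """Paginate through the API and write raw JSON files for downstream parsing."""
    page = 1
    while True:
        payload = fetch_page(API_URL, token, page)
        records = payload.get("data", [])
        if not records:
            break
        stamp = datetime.datetime.utcnow().strftime("%Y%m%dT%H%M%S")
        with open(f"{LANDING_DIR}/accounts_{stamp}_p{page}.json", "w") as fh:
            json.dump(records, fh)
        page += 1
```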
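A minimal sketch of Airflow orchestration for Databricks jobs, assuming Airflow 2.x with the apache-airflow-providers-databricks package and a configured Databricks connection; the DAG name, job IDs, and schedule are hypothetical.

```python
# Sketch of an Airflow DAG that triggers existing Databricks jobs in sequence.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="daily_lakehouse_refresh",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 5 * * *",                  # daily at 05:00 UTC
    catchup=False,
) as dag:
    ingest = DatabricksRunNowOperator(
        task_id="run_ingestion_job",
        databricks_conn_id="databricks_default",
        job_id=12345,                      # hypothetical Databricks job ID
    )

    transform = DatabricksRunNowOperator(
        task_id="run_transformation_job",
        databricks_conn_id="databricks_default",
        job_id=67890,                      # hypothetical Databricks job ID
    )

    # Run transformation only after ingestion succeeds
    ingest >> transform
```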
Qualifications
Required
8+ years of experience designing and delivering scalable data pipelines in modern cloud-based data platforms
Deep expertise in data engineering and data warehousing with end-to-end ownership of production systems
Advanced proficiency in SQL and Python, including performance tuning and optimization
Strong hands-on experience with Databricks and Azure-based data architectures
Experience working with governed or regulated datasets and applying strong data security and compliance practices
Proven ability to lead technical initiatives and collaborate across cross-functional and matrixed teams
Experience integrating cloud data platforms with enterprise systems and downstream consumers
Benefits
Competitive compensation and comprehensive benefits package
Strong commitment to professional development, training, and long-term career growth
Tuition reimbursement available for qualified education and certifications