Morph Enterprise
Senior Azure Databricks Engineer
Morph Enterprise is seeking a Senior Data Engineer to serve as the primary technical engine for a client's Medicaid data ecosystem. The role involves maintaining legacy systems while leading modernization efforts to implement Azure Synapse and a Databricks Lakehouse for enhanced data management and analytics capabilities.
Responsibilities
Leads the adoption or implementation of an advanced technology or platform
Serves as an expert on the functionality and usage of a particular system, platform, or technology product
Serves as a consultant to clients, guiding the efficient use or adoption of a particular IT product or platform
Creates implementation, testing, and/or integration plans
Demonstrates expertise in a particular IT platform or service, maximizing the value of the IT investment
Qualifications
Required
Bachelor's degree in Information Technology or a related field, or equivalent experience
High-level mastery of the current legacy environment, characterized by SSIS ETL processes managed via Team Foundation Server (TFS)
Maintain, troubleshoot, and modify complex SSIS packages handling high-volume Medicaid claims, provider, and member data
Manage code deployments and branching strategies within TFS, ensuring continuous integration of legacy SQL assets
Support and optimize SSRS report queries and SSAS tabular/multidimensional models to ensure federal and state compliance reporting remains uninterrupted
Implement a 'Medallion Architecture' (Bronze/Silver/Gold) using Azure Databricks (PySpark/SQL), as illustrated in the first sketch following this list
Lead the transition of legacy SSIS logic into Azure Data Factory (ADF) and Databricks notebooks
Facilitate the migration of source control and CI/CD pipelines from TFS to Azure DevOps (Git)
Build and tune Dedicated and Serverless SQL Pools within Azure Synapse to facilitate advanced analytics and AI-readiness
Implement Row-Level Security (RLS) and automated data masking for PHI/PII in accordance with HIPAA, CMS MARS-E, and NIST standards (see the second sketch after this list)
Develop automated data validation frameworks to ensure data parity between legacy SQL systems and the new cloud Lakehouse (see the final sketch after this list)
Ability to pivot between a 10-year-old SSIS package and a modern Databricks Spark job in the same day
Ability to take high-level architectural blueprints from the Lead Architect and translate them into high-performance, production-ready code
Absolute precision in Medicaid data handling, where an error in logic can impact member benefits or federal funding
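As a rough illustration of the Medallion pattern referenced in the qualifications above, the PySpark sketch below stages claims data through Bronze, Silver, and Gold Delta tables. The landing path, table names (bronze.claims_raw, silver.claims_clean, gold.claims_monthly_summary), and column names are hypothetical placeholders, not the project's actual schema.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: ingest raw claims files as-is, preserving source fidelity.
# The landing path and table names are illustrative placeholders.
bronze = (
    spark.read.format("parquet")
    .load("/mnt/landing/medicaid/claims/")
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("bronze.claims_raw")

# Silver: apply cleansing and conformance rules (dedup, typing, basic validation).
silver = (
    spark.table("bronze.claims_raw")
    .dropDuplicates(["claim_id"])
    .withColumn("paid_amount", F.col("paid_amount").cast("decimal(18,2)"))
    .filter(F.col("member_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.claims_clean")

# Gold: business-level aggregates ready for reporting and analytics.
gold = (
    spark.table("silver.claims_clean")
    .groupBy("provider_id", F.date_trunc("month", "service_date").alias("service_month"))
    .agg(
        F.sum("paid_amount").alias("total_paid"),
        F.countDistinct("member_id").alias("members_served"),
    )
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.claims_monthly_summary")
```

Keeping each layer as its own Delta table leaves the raw Bronze data replayable while confining cleansing and business logic to the Silver and Gold steps.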
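For the RLS and PHI/PII masking requirement, one common Databricks technique is a dynamic view that filters rows and masks columns based on group membership. The view name, group name ('phi_readers'), mapping table, and columns below are assumptions for illustration only, not the project's security model.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Dynamic view enforcing row-level security and column masking for PHI/PII.
# Group names, table names, and columns are placeholders for illustration.
spark.sql("""
CREATE OR REPLACE VIEW gold.members_secure AS
SELECT
  member_id,
  eligibility_category,
  county_code,
  -- Mask SSN and DOB unless the user belongs to the PHI-cleared group
  CASE WHEN is_member('phi_readers') THEN ssn ELSE '***-**-****' END AS ssn,
  CASE WHEN is_member('phi_readers') THEN date_of_birth ELSE NULL END AS date_of_birth
FROM silver.members_clean
-- Row-level filter: non-cleared analysts only see rows for their assigned region
WHERE is_member('phi_readers')
   OR region_code IN (SELECT region_code
                      FROM reference.analyst_region_map
                      WHERE analyst_user = current_user())
""")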
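For the legacy-to-Lakehouse parity requirement, a minimal validation sketch might compare coarse fingerprints (row counts, distinct keys, summed amounts) between the SQL Server source and the Delta target. The JDBC connection details, credentials, and table names below are placeholders.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Legacy SQL Server source, read over JDBC (connection details are placeholders).
legacy = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://legacy-host:1433;databaseName=Medicaid")
    .option("dbtable", "dbo.Claims")
    .option("user", "svc_validation")
    .option("password", "<secret>")
    .load()
)

# Modern Lakehouse target (placeholder table name).
modern = spark.table("silver.claims_clean")

def profile(df, key_cols, amount_col):
    """Row count, distinct keys, and summed amount as a coarse parity fingerprint."""
    return df.agg(
        F.count("*").alias("row_count"),
        F.countDistinct(*key_cols).alias("distinct_keys"),
        F.sum(amount_col).alias("total_amount"),
    ).collect()[0]

legacy_profile = profile(legacy, ["claim_id"], "paid_amount")
modern_profile = profile(modern, ["claim_id"], "paid_amount")

mismatches = [
    name for name in ("row_count", "distinct_keys", "total_amount")
    if legacy_profile[name] != modern_profile[name]
]
if mismatches:
    raise ValueError(f"Parity check failed on: {mismatches}")
```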