This job has closed.

LogicExcell · 7 hours ago

Senior Azure Data Integrations Engineer

LogicExcell is seeking a Senior Azure Data Integrations Engineer to play a critical role in building and operating a modern, enterprise-grade data platform that supports analytics and data-driven decision making. This hands-on role involves designing, implementing, and operating scalable data ingestion frameworks on Azure and Databricks, and collaborating with stakeholders to translate complex data flows into reliable, trusted datasets.

Information Technology & Services
H1B Sponsor Likely
Hiring Manager
Mukul Sharma

Responsibilities

Design, build, and operate data ingestion pipelines from diverse enterprise systems into Azure Data Lake and Databricks
Implement batch and near-real-time ingestion patterns with support for incremental processing, schema evolution, and replayability
Apply Medallion architecture principles (Bronze / Silver / Gold) to ensure raw, refined, and curated data layers are clearly defined, governed, and analytics-ready (see the illustrative sketch after this list)
Own the curated data layer that serves analytics and reporting platforms, ensuring datasets are performant, well-modeled, and consistently defined
Partner closely with BI developers and analytics users to translate reporting and dashboard requirements into backend data models and datasets optimized for consumption
Support analytics tools (e.g., Sigma, Power BI, or similar platforms) by ensuring reliable connectivity, appropriate data access patterns, and production-ready data structures
Establish and maintain Dev / Test / Prod environment segregation across data pipelines, storage, and Databricks assets
Implement and manage CI/CD pipelines for data assets, including ADF pipelines, Databricks notebooks/jobs, and configuration promotion
Enforce deployment discipline, including approvals, validation, rollback strategies, and environment-specific parameterization
Implement secure access patterns using managed identities, service principals, Key Vault, and least-privilege principles
Ensure sensitive data is handled appropriately through access controls, segmentation, and governance standards
Define and maintain operational practices, including monitoring, alerting, error handling, and runbooks for production support
Proactively identify and address data quality, performance, reliability, and cost optimization concerns
Set technical standards and best practices for data integration and pipeline development across the platform
Review designs and implementations to ensure scalability, maintainability, and alignment with platform architecture
Collaborate with architects, data science leadership, and external partners to align execution with broader platform strategy
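
For context on the Medallion and incremental-ingestion responsibilities above, the following is a minimal sketch of a Bronze-to-Silver step on Databricks. It assumes a notebook where spark is predefined; all paths, table names, and columns (e.g., order_id) are illustrative placeholders, not part of this role's actual platform.

from pyspark.sql import functions as F
from delta.tables import DeltaTable

# Hypothetical locations for the Bronze files and the Silver table.
bronze_path = "abfss://bronze@examplelake.dfs.core.windows.net/sales/orders"
silver_table = "silver.orders"

# Bronze: land raw source data as-is, tagging each batch for replayability.
# mergeSchema allows additive schema evolution on append.
raw = (
    spark.read.format("json")
    .load("/mnt/landing/orders/")
    .withColumn("_ingested_at", F.current_timestamp())
)
raw.write.format("delta").mode("append").option("mergeSchema", "true").save(bronze_path)

# Silver: light cleansing plus an incremental MERGE so reruns are idempotent.
cleaned = (
    spark.read.format("delta").load(bronze_path)
    .filter(F.col("order_id").isNotNull())
    .dropDuplicates(["order_id"])
)

if spark.catalog.tableExists(silver_table):
    (DeltaTable.forName(spark, silver_table).alias("t")
        .merge(cleaned.alias("s"), "t.order_id = s.order_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())
else:
    cleaned.write.format("delta").saveAsTable(silver_table)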

Qualifications

Azure Data Factory · Databricks · Azure Data Lake Storage · Medallion architecture · SQL · Python · CI/CD practices · Security management · Clear communication · Collaboration skills · Problem-solving · Time management · Adaptability

Required

8+ years of experience in data engineering, cloud data integration, or analytics platform development
Strong hands-on experience with:
Azure Data Factory (ADF) – dynamic pipelines, triggers, integration runtimes
Databricks – Spark / PySpark development, Delta Lake, job orchestration
Azure Data Lake Storage (ADLS Gen2)
Proven experience implementing Medallion architecture in production environments
Strong SQL skills and working knowledge of Python for transformations and automation
Experience enabling analytics and BI platforms through Databricks or similar data backends
Solid understanding of CI/CD practices applied to data platforms and multi-environment deployments
Strong grasp of security, identity, and access management in cloud data environments (see the illustrative sketch after this list)
Ability to operate effectively in complex, enterprise, multi-stakeholder settings
Clear, concise communication skills and the ability to translate technical concepts for non-technical partners
Availability during Mountain Standard Time (MST) working hours
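
As a reference point for the security and access-management expectations above, here is a minimal sketch of secret retrieval with the azure-identity and azure-keyvault-secrets libraries; the vault URL and secret name are hypothetical.

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential resolves to a managed identity when running on Azure
# compute and to a developer login locally, so no credentials live in code.
credential = DefaultAzureCredential()
client = SecretClient(
    vault_url="https://example-kv.vault.azure.net",  # hypothetical vault
    credential=credential,
)

# Fetch a connection secret at runtime instead of embedding it in pipeline config.
source_db_password = client.get_secret("source-db-password").value  # hypothetical secret name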

Preferred

Experience building metadata-driven ingestion frameworks (see the illustrative sketch after this list)
Familiarity with API-based ingestion and secure credential management
Experience working with external vendors or implementation partners
Background in highly governed or regulated environments
Azure certifications (e.g., DP-203) are a plus
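
To illustrate the metadata-driven ingestion pattern mentioned above, here is a minimal sketch in which each source is described by a configuration record rather than a hand-coded pipeline. It assumes a Databricks notebook where spark is available; the source names, paths, and keys are hypothetical, and a real framework would read them from a control table.

# Each source is described by a metadata record instead of a bespoke pipeline.
sources = [
    {"name": "orders",    "format": "json",    "path": "/mnt/landing/orders/",    "keys": ["order_id"]},
    {"name": "customers", "format": "parquet", "path": "/mnt/landing/customers/", "keys": ["customer_id"]},
]

# One generic loop lands every configured source into its Bronze table.
for src in sources:
    df = spark.read.format(src["format"]).load(src["path"])
    (df.dropDuplicates(src["keys"])
       .write.format("delta")
       .mode("append")
       .saveAsTable(f"bronze.{src['name']}"))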

Company

LogicExcell

LogicExcell is your trusted partner for IT staffing. We specialize in connecting businesses with top-tier IT talent.

H1B Sponsorship

LogicExcell has a track record of offering H1B sponsorship. Please note that this does not guarantee sponsorship for this specific role. The information below is provided for reference. (Data powered by the US Department of Labor.)
Charts: distribution of job fields receiving sponsorship (highlighting the field most similar to this job) and trend of total sponsorships (2 in 2025).

Funding

Current Stage: Early Stage
Company data provided by Crunchbase