This job has closed.

Jobs via Dice · 1 day ago

Data | ETL Architect

Jobs via Dice is seeking ETL Architects to drive the development of data integration pipelines for the Correction Information Management System (CIMS) Data Warehouse/Data Lake on Azure. The role involves designing high-performing ETL processes, ensuring data integrity, security, and performance while collaborating with cross-functional teams.

Computer Software

Responsibilities

Scope of Work/Job Characteristics

The Data Architects, under the working job title of Extract, Transform, Load (ETL) Architects, will serve as the principal line of communication for the project team. The ETL Architects will drive the development of data integration pipelines, enabling efficient, reliable access to critical data within the Correction Information Management System (CIMS) Data Warehouse/Data Lake on Azure. They will work with Azure Data Factory (ADF), Azure Databricks, Azure Synapse, Power BI, and Azure Purview, and will be at the forefront of transforming complex data into actionable insights. The ETL Architects will be responsible for ensuring data integrity, security, and performance, all while meeting mission-critical needs. The specific duties and responsibilities of this position are as follows:
ETL Pipeline Design and Development:
Lead the design and development of high-performing ETL processes to integrate and transform data across disparate sources;
Deliver efficient, reliable pipelines that meet business needs and maintain the highest standards of security; and
Utilize ADF to automate and streamline data workflows, ensuring smooth transitions from source to target
Data Integration and Transformation:
Build and manage complex ETL workflows that extract, transform, and load data for downstream analytics and reporting, ensuring data is accurate, timely, and secure; and
Take ownership of data quality and validation, creating resilient ETL processes that ensure only trusted data reaches its destination
Cloud Platform Expertise:
Leverage the full power of the Azure ecosystem (ADF, Databricks, Synapse, and Purview) to manage and process high volumes of structured and unstructured data, delivering solutions that are scalable and performance-optimized; and
Integrate large datasets into Azure Synapse Analytics, enabling analytics teams to deliver data-driven insights that support the Department's mission
Performance Optimization:
Continuously optimize ETL jobs to minimize latency and maximize throughput; and
Ensure the architecture supports fast, reliable data access for end-users and systems, meeting stringent performance metrics
Security and Compliance:
Embed security and compliance best practices in every step of the ETL process;
Protect sensitive data by adhering to industry standards and ensuring compliance with the Department's data governance policies; and
Use Azure Purview to enforce data governance, track data lineage, and ensure that data handling meets the highest standards of integrity
Collaboration and Stakeholder Engagement:
Partner with cross-functional teams (e.g., data engineers, analysts, business stakeholders, and security experts) to design and implement ETL solutions that meet the Department's evolving needs

Qualifications

ETL development, Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Python, SQL, Data governance, Security compliance, Collaboration, Documentation

Required

A bachelor's degree from an accredited college or university in Computer Science, Information Systems, or a related field is required. Alternatively, equivalent work experience, including experience in Service-Oriented Architecture (SOA) and Microsoft Azure Cloud Solutions, can be substituted for the educational requirement on a year-for-year basis, when applicable
Seven (7) or more years of experience in ETL development and data engineering
Three (3) or more years of hands-on experience working with ADF, Azure Cloud, Azure Databricks, Azure Synapse Analytics, and Azure Purview
Proven track record of building and optimizing large-scale ETL pipelines for high-performance, high-availability environments
Extensive expertise in Spark, Python, and/or Scala for large-scale data transformations
Strong Structured Query Language (SQL) proficiency and experience working with complex data structures
In-depth knowledge of data governance, security protocols, and role-based access control (RBAC) within the Azure ecosystem
Ability to design ETL processes that are resilient, efficient, and fully compliant with regulatory standards
The selected candidates must successfully complete a Level II Background Check

Preferred

Possession of a Microsoft certification as an Azure Data Engineer Associate, Azure Solutions Architect Expert, or Azure Fundamentals
Databricks Certification as a Data Engineer Associate

Company

Jobs via Dice

Welcome to Jobs via Dice, the go-to destination for discovering the tech jobs you want.

Funding

Current Stage: Early Stage
Company data provided by Crunchbase