Zyxware Technologies
ETL/Data Engineer (Federal Government Project)
Zyxware Technologies is seeking a Mid-Level Data Engineer to design, build, and manage data pipelines and systems that enable enterprise analytics and decision-making. The role involves developing ETL workflows, writing Python scripts, and leveraging AWS and Azure cloud services while ensuring scalability and compliance with industry standards for security and data governance.
Responsibilities
Design and implement data pipelines to support analytics and reporting
Develop and maintain ETL/ELT workflows using AWS Glue, Azure Data Factory, and Snowflake Tasks
Build and optimize data pipelines using Python, PySpark, and SQL (an illustrative sketch follows this list)
Work with solution architects to define and maintain data models for data warehouses and data marts
Optimize and tune workflows for performance and scalability
Perform data profiling and quality checks, and implement Change Data Capture (CDC) using AWS DMS
Collaborate with stakeholders to define data requirements and deliver solutions
Enable reporting and dashboards using Power BI and Tableau
Apply data governance, security, and compliance standards (ISO 27001, SOC 2, GDPR/CCPA)
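For context only, the snippet below is a minimal sketch of the kind of Python/PySpark ETL step the responsibilities above describe (extract, cleanse, load). It assumes a Spark environment; the bucket paths, column names, and data-quality rule are hypothetical placeholders, not part of this role's actual stack.

    # Purely illustrative sketch of a PySpark ETL step of the kind described above;
    # all bucket names, paths, and columns are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("example-orders-etl").getOrCreate()

    # Extract: read raw records from a hypothetical S3 landing zone
    orders = spark.read.parquet("s3://example-bucket/raw/orders/")

    # Transform: basic cleansing and a simple data-quality rule
    cleaned = (
        orders.dropDuplicates(["order_id"])
        .filter(F.col("order_total").isNotNull())
        .withColumn("load_date", F.current_date())
    )

    # Load: append to a curated zone, partitioned for downstream reporting
    cleaned.write.mode("append").partitionBy("load_date").parquet(
        "s3://example-bucket/curated/orders/"
    )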
Qualifications
Required
Bachelor's degree in Computer Science, Engineering, or a related field
3–5 years of experience in data engineering or data management
Proficiency in SQL and experience with data modeling
Hands-on experience with AWS and Azure cloud platforms (AWS experience preferred)
Experience with Snowflake and Oracle for data warehousing and data marts
Strong problem-solving and debugging skills
1–2 years of experience with Power BI/Tableau (deployment pipelines, DirectQuery vs. dataflows)
Preferred
Experience with ETL tools (AWS Glue, Azure Data Factory, Talend)
Knowledge of data warehousing concepts, data quality analysis, and profiling
Exposure to CI/CD pipelines (GitHub Actions, Jenkins) and infrastructure automation (Terraform, CloudFormation)