The Carrera Agency
Senior Data Engineer (with Azure Data Factory)
Human Resources · Information Technology
Growth Opportunities
Responsibilities
Lead the technical design, development, testing, and documentation of Data Warehouse and ETL projects
Perform data profiling, logical/physical data modeling to build new ETL designs and solutions
Develop, implement, and deploy ETL solutions to update data warehouses and data marts
Maintain quality control, document technical specifications, and conduct unit testing to ensure data accuracy and quality
Implement, stabilize, and establish DevOps processes for version control and deployment across environments
Troubleshoot, debug, and diagnose ETL issues, providing production support and off-hours operational support as needed
Perform performance tuning and enhancement of SQL and ETL processes, preparing related technical documentation
Collaborate with offshore teams to coordinate development work and operational support
Stay up to date with the latest ETL technologies and plan for their effective utilization
Play a key role in planning the migration of our EDW system to a modern global data warehouse architecture
Assess and implement new EDW/Cloud technologies to evolve the EDW architecture for efficiency and performance
Communicate clearly and professionally with users, peers, and all levels of management, in both written and verbal form
Lead ETL tasks and activities related to BI projects, assigning, coordinating, and following up on activities to meet project timelines
Contribute to AI/ML projects as assigned
Perform code reviews on ETL and reporting changes where appropriate
Coordinate with the DBA team on migration, configuration, and tuning of ETL code
Act as a mentor for other data engineers in the BI Team
Adhere to established processes and work policies defined by management
Qualifications
Required
Bachelor's degree in Computer Science, MIS, Data Science, or a related field with 8+ years of relevant experience; or a Master's degree with 6 years of experience
Understanding of ERP business processes (Order to Cash, Procure to Pay, Record to Report, etc.) and of data warehouse and BI concepts, with the ability to apply education and practical experience to improve business intelligence applications and deliver simplified, standardized solutions that achieve business objectives
Expert knowledge of data warehouse architecture, including modern data warehouse concepts, EDW, and Data Lake/Cloud architecture
Expertise in dimensional modeling and star schema design, including best practices for indexing, partitioning, and data loading
Advanced experience with SQL, including writing stored procedures and SQL tuning using Oracle PL/SQL
Strong experience with data integration using Azure Data Factory (ADF)
Well versed in database administration tasks and in working with DBAs to monitor and resolve SQL/ETL issues and tune performance
Experience with DevOps processes in ADF, preferably using GitHub (experience with other version control tools is helpful)
Experience troubleshooting data warehouse refresh issues and validating BI report data against source systems
Excellent communication skills
Ability to organize and handle multiple tasks simultaneously
Ability to mentor and coordinate activities for other data engineers as needed
Preferred
Experience working with Oracle EBS or other major ERP systems like SAP
Experience with AI/ML; proficiency in R, Python, or PySpark is a plus
Experience with cloud EDW technologies such as Databricks, Snowflake, and Synapse
Experience with Microsoft Fabric, Data Lakehouse concepts, and related reporting capabilities