Take2 Consulting, LLC · 2 days ago
Senior Data Engineer
Responsibilities
Design, develop and tune data warehouses and data pipelines within Azure
Design and develop ML model execution pipelines using Azure Databricks
Evangelize engineering design and development standards
Act as a key contributor to the design and development lifecycle of analytic applications utilizing Microsoft Azure and BI technology platforms
Participate in Agile ceremonies including daily stand-ups, sprint planning, retrospectives, and product demonstrations
Produce efficient and elegant code that meets business requirements
Author unit tests that adhere to code coverage guidelines
Proactively communicate progress, issues, and risks to stakeholders
Accurately estimate assignments
Create and maintain technical documentation
Mentor less experienced engineers
Contribute to the growth and maturity of the Software Engineering Group
Perform other related duties as directed
Qualifications
Required
10 years of hands-on experience designing complex data models for both OLTP and OLAP systems
10 years of hands-on experience designing and implementing large-scale data warehouse solutions
5 years of experience in designing, developing, and tuning modern Azure data warehouses including in-depth knowledge of slowly changing dimensions and advanced performance tuning techniques
5 years of hands-on experience designing and implementing large-scale Azure data pipelines including Change Data Capture (CDC) solutions for structured, semi-structured, and unstructured data in batch and real-time environments
5 years of experience working with data governance
3 years of hands-on experience in Azure data services
3-5 years of hands-on experience with Azure Databricks, Unity Catalog, and Data Lakehouse
Broad experience in Microsoft SQL technologies
Broad multi-tenant data architecture and implementation experience across different data stores, messaging systems, and data processing engines
Experience with data integration through APIs, Web Services, SOAP, and/or REST services
Experience using Azure DevOps and CI/CD as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence
Knowledge of SOA and microservices application architecture
Ability to work in a fast-paced, collaborative team environment
Excellent written and verbal communication skills and ability to express ideas clearly and concisely
Preferred
3 years of hands-on experience with Python
Bachelor's degree in Computer Science or a related field; master's degree preferred