Montage Health · 3 days ago
Senior Data Warehouse Engineer
Responsibilities
Develop and maintain data pipelines, including API-based and file-based data flows between source systems and the data warehouse
Use innovative tools and techniques to automate common data preparation and integration tasks with the goal of reducing defects and ensuring data quality
Implement best practices to ensure the integrity of data with exception-handling routines
Develop, support, and maintain source-to-target mappings
Lead troubleshooting efforts and convene interdisciplinary task forces to resolve ETL issues
Design, develop, and deploy data structures and data transformations in the enterprise data warehouse using Python, SSIS, and ADF
Maintain and extend Epic Caboodle platform and develop custom Caboodle data modeling components
Form relationships and coordinate with business stakeholders to identify data needs, clarify requirements, and implement solutions
Contribute to the department’s short-term and long-term strategic plan
Make appropriate recommendations on the management of data extraction and analysis
Maintain knowledge of the current regulations and technologies related to data management
Assist with data governance initiatives in the areas of data quality, data security, metadata and master data management
Actively contribute to all aspects of the data project lifecycle, including request intake and acknowledgment, project estimation, time tracking, and prioritization of tasks
Perform other duties as required or assigned
Qualifications
Required
7+ years of experience as a Data Engineer
In-depth knowledge of SQL, data warehouses, and data transformation techniques
Proven experience with designing and building data pipelines
Expert knowledge of metadata management and related tools
Advanced knowledge of data ETL concepts, processes, and tools such as MS SSIS, ADF
Advanced knowledge of Python
Ability to read and understand various data structures
Ability to work independently and as part of a team
Strong analytical, technical, and troubleshooting skills
Ability to assess requirements from multiple sources and their impact on potential solutions
Ability to work in a complex environment
Ability to be organized and proficient at tracking tasks, defining next steps, and following project plans
Advanced knowledge of database and data warehousing concepts, including data lakes, relational and dimensional database design concepts, and data modeling practices
Intermediate knowledge of Jupyter Notebooks
Familiarity with Agile project management methods such as Scrum, Lean, and/or Kanban
Advanced knowledge of healthcare data structures, workflows, and concepts, from Electronic Health Record systems like Epic
Preferred
Knowledge of the Azure cloud platform, Fabric data platform, ADF, and DevOps is highly preferred
Bachelor's degree in a technical, scientific, and/or healthcare discipline, or equivalent work experience
Certifications
Epic Cogito, Clarity, and Caboodle certifications are required within 90 days of hire
Epic Caboodle Development and Clarity-Caboodle Development certifications are required within 120 days of hire
All certifications must be maintained throughout employment
Benefits
Health Insurance
Dental Insurance
Vision Insurance