The University of Kansas
Data Modeler
The University of Kansas is seeking a Data Modeler to support its data strategy and data-informed decision making. The role involves designing, building, and maintaining complex data systems, including data pipelines and data warehouses, while collaborating with cross-functional teams to enable advanced analytics and ensure data governance.
Consulting · Health Care · Non Profit
Responsibilities
Design, develop, and maintain scalable ETL/ELT pipelines integrating data from diverse internal and external sources
Develop and optimize SQL-based processes to support data loading, transformation, and validation tasks
Maintain and enhance KU’s Data Warehouse and Data Lake environments to support analytics, dashboards, and operational reports
Support AI/ML model integration pipelines and prepare data for model training, scoring, and inferencing
Collaborate with BI Analysts, Data Scientists, and business stakeholders to translate requirements into performant data engineering solutions
Maintain and document data workflows, source-to-target mappings, and architecture diagrams in accordance with data governance policies
Ensure metadata capture and compliance with KU’s data catalog and dictionary
Provide production support for ETL/ELT jobs, data pipelines, and cloud-based infrastructure
Troubleshoot issues related to data ingestion, transformation, and availability
Monitor job performance and system health, coordinating with IT operations as needed
Participate in upgrade cycles for ETL tools, cloud platforms, and database systems
Execute data validation strategies for all data pipelines and models
Implement audit controls, data quality checks, and reconciliation procedures
Collaborate with QA teams and business users for UAT and production validation
Participate in campus-wide data initiatives, pilot projects, and tool evaluations
Contribute to continuous improvement and innovation in KU’s data engineering practices
Qualifications
Required
Bachelor's degree in Computer Science, Information Technology, Engineering, Mathematics, Statistics, or a related field
Three (3) years of demonstrated experience using Python for scripting, automation, and data manipulation across academic, professional, or project-based environments
Two (2) years of hands-on experience with at least one major cloud platform, such as AWS, Azure, or Google Cloud Platform
Exposure to integrating AI/ML workflows or components into data pipelines through coursework, projects, or professional experience
Working knowledge of SQL and relational database systems (e.g., Oracle, PostgreSQL, MySQL), as evidenced by application materials
Excellent communication, documentation, and collaboration skills, as evidenced by application materials
Preferred
Exposure to CI/CD practices and version control systems such as Git, as evidenced by application materials
Experience working with APIs, including data ingestion from RESTful services, as evidenced by application materials
Company
The University of Kansas
The University of Kansas is a major comprehensive research and teaching university and a center for learning, scholarship, and creative endeavor.