Veritas Veterinary Partners · 7 hours ago
Data Engineer
Health Care · Veterinary
No H1B
Responsibilities
Design and implement scalable, flexible data marts, data warehouses, and data models for analytics and reporting purposes
Model and structure data for effective reporting and business intelligence use cases, focusing on Snowflake, MySQL, and SQL Server databases, as well as visualization and reporting tools from Microsoft and Tableau
Ensure the data architecture can integrate seamlessly with business applications through integration components, cloud data connectors and API calls
Develop and manage ETL/ELT pipelines using tools like SQL Server Integration Services (SSIS) or Azure Data Factory and other third-party ETL platforms
Extract, transform, and load data from disparate systems into a centralized data warehouse
Optimize data flows and processing for performance, reliability, and scalability
Ensure data pipelines can handle both structured and semi-structured data from various sources
Leverage data connectors & APIs to extract and feed data into databases or BI systems
Manage and optimize database performance to ensure data integrity and fast query performance
Troubleshoot issues related to data integration, extraction, and transformation
Prepare data for, and work with, data visualization tools such as Power BI
Ensure that the data architecture and data models support self-service analytics and are accessible to non-technical business users
Partner with BI Analysts to automate data feeds behind interactive dashboards and reports
Work closely with business analysts, field operators, senior leadership and other stakeholders to understand data needs and provide relevant solutions
Present technical topics in a clear and understandable way
Monitor and optimize data pipelines for performance and efficiency
Troubleshoot and resolve issues related to data extraction, transformation, and processing
Document data processes, workflows, and reporting solutions for transparency and knowledge sharing
Stay updated with industry best practices and emerging technologies to continuously improve data engineering practices
Continuously explore modern data tools and methodologies, incorporating industry trends to enhance the data strategy
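As a concrete illustration of the ETL/ELT responsibilities above, here is a minimal sketch of an extract-transform-load flow. It uses only Python's standard library, with an in-memory SQLite database standing in for the warehouse; all table and field names (`clinic_visits`, `visit_id`, etc.) are hypothetical examples, not taken from this posting.

```python
import csv
import io
import sqlite3

# Illustrative CSV feed standing in for a source system.
RAW_CSV = """visit_id,clinic,amount
1,Austin, 120.50
2,Dallas,80.00
"""

def extract(text):
    """Extract: parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: strip whitespace and convert dollar amounts to integer cents."""
    return [
        (int(r["visit_id"]), r["clinic"].strip(), int(round(float(r["amount"]) * 100)))
        for r in rows
    ]

def load(conn, rows):
    """Load: upsert the cleaned rows into a centralized table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS clinic_visits "
        "(visit_id INTEGER PRIMARY KEY, clinic TEXT, amount_cents INTEGER)"
    )
    conn.executemany("INSERT OR REPLACE INTO clinic_visits VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(RAW_CSV)))
total = conn.execute("SELECT SUM(amount_cents) FROM clinic_visits").fetchone()[0]
print(total)  # 20050
```

In production this shape is typically expressed in SSIS packages or Azure Data Factory pipelines rather than hand-rolled scripts, but the extract/transform/load stages map one-to-one.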
Qualifications
Required
Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field
7+ years of experience in data engineering, ETL processes, and data pipeline development
Proven experience with SQL, MySQL, PostgreSQL, NoSQL, and advanced query writing (T-SQL)
Expertise in SQL Server, MySQL (AWS-hosted), Snowflake, Azure SQL, and Azure Data Factory
Experience designing and implementing data models (e.g., star schema, dimensional models) for analytics
Familiarity with APIs, data integration techniques and managing structured and semi-structured data
Experience with reporting and visualization tools such as Power BI and understanding how to structure data to meet BI needs
Proven ability to work with data from HR, ATS, ERP, and financial platforms
Hands-on experience developing and managing ETL processes using SSIS or similar tools
Strong problem-solving skills with the ability to communicate complex technical concepts to non-technical stakeholders
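To make the dimensional-modeling requirement above concrete, here is a minimal star-schema sketch: one fact table surrounded by dimension tables, queried the way a BI tool would. It uses SQLite via Python's standard library, and every table and column name is an illustrative assumption, not taken from this posting.

```python
import sqlite3

# Minimal star schema: a fact table referencing two dimension tables.
# All names (dim_date, fact_visits, ...) are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
    full_date TEXT NOT NULL
);
CREATE TABLE dim_clinic (
    clinic_key  INTEGER PRIMARY KEY,
    clinic_name TEXT NOT NULL
);
CREATE TABLE fact_visits (
    date_key      INTEGER NOT NULL REFERENCES dim_date(date_key),
    clinic_key    INTEGER NOT NULL REFERENCES dim_clinic(clinic_key),
    visit_count   INTEGER NOT NULL,
    revenue_cents INTEGER NOT NULL
);
INSERT INTO dim_date VALUES (20240115, '2024-01-15');
INSERT INTO dim_clinic VALUES (1, 'Austin'), (2, 'Dallas');
INSERT INTO fact_visits VALUES (20240115, 1, 3, 45000),
                               (20240115, 2, 2, 30000);
""")

# A typical BI query: join the fact table to a dimension and aggregate.
rows = conn.execute("""
    SELECT c.clinic_name, SUM(f.revenue_cents) AS revenue
    FROM fact_visits f
    JOIN dim_clinic c ON c.clinic_key = f.clinic_key
    GROUP BY c.clinic_name
    ORDER BY c.clinic_name
""").fetchall()
print(rows)  # [('Austin', 45000), ('Dallas', 30000)]
```

The same structure translates directly to Snowflake or Azure SQL; only the DDL dialect and surrogate-key generation change.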
Preferred
Experience with cloud-based ETL tools like Azure Data Factory
Familiarity with data pipeline tools (e.g., dbt, Stitch, Fivetran)
Knowledge of data governance and security best practices
Benefits
Competitive salary and benefits package
Professional development and career growth opportunities