Data Engineer, Business Intelligence and Data Warehousing @ Quick Med Claims, LLC | Jobright.ai
Pittsburgh, PA · 94 applicants
Quick Med Claims, LLC · 19 hours ago

Data Engineer, Business Intelligence and Data Warehousing

Industries: Automotive, Health Care

Responsibilities

Support and optimize the existing AWS Redshift data warehouse and Hanger ETL pipeline.
Implement data optimizations to improve the performance of large datasets, including data partitioning, indexing, and query performance tuning.
Lead the migration from AWS Redshift to Databricks Lakehouse, implementing Delta Lake for data storage and processing.
Design, develop, and maintain scalable ETL pipelines using Python, Spark, SQL, and Databricks, ensuring data quality, consistency, and timeliness (see the pipeline sketch after this list).
Integrate structured, semi-structured, and unstructured data from various internal and external sources, both on-premises and on cloud platforms such as AWS.
Utilize ETL frameworks and scheduling tools (e.g., Airflow, Databricks Jobs) to automate monitoring, testing, and validation of data quality and pipeline health (see the scheduling sketch after this list).
Perform data analysis and data mapping from SQL Server-based RCM transactional systems and other source systems, transforming data into business intelligence and reporting formats residing in the data warehouse.
Apply dimensional modeling techniques (e.g., star schemas) to ensure effective data organization and modeling for BI, reporting, and machine learning.
Implement SCD techniques (Types 1, 2, and 3) to ensure accurate tracking and storage of historical data changes, particularly in operational and transactional data (see the SCD sketch after this list).
Work with Sisense to develop interactive dashboards and with Jaspersoft Reporting to develop and enhance reports that support operational and strategic decision-making.
Implement and integrate large language models (LLMs) in Databricks to solve specific business problems, such as improving billing processes, predicting trends, and enhancing operational efficiency (see the LLM sketch after this list).
Work with DevOps tools such as Kubernetes, Jenkins, GitHub, Slack, and Terraform to automate deployments and infrastructure management.
Ensure data security and compliance with industry regulations, including HIPAA, by adhering to best practices in data governance and privacy standards and by managing access control and encryption for sensitive data.
Maintain documentation for data models, data workflows, ETL pipelines, machine learning models, system architectures, and design and coding standards.
Collaborate with data engineers, data analysts, business analysts, and other stakeholders to ensure data availability for reporting, modeling, and decision-making.
Effectively communicate complex technical concepts to non-technical stakeholders.
Lead projects to successful completion.
Lead, mentor, and provide guidance to junior team members, promoting best practices and code quality.
Stay current with emerging technologies, methodologies, and industry trends.
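
To make the pipeline work concrete, here is a minimal PySpark sketch of the kind of extract-transform-load job the role describes, assuming a Databricks environment; the table names (rcm_raw.claims, rcm_dw.fact_claims_staging) and columns are illustrative assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Extract: read the latest raw claims exported from the SQL Server RCM system.
raw = spark.read.table("rcm_raw.claims")

# Transform: basic cleansing and conformance before the warehouse layer.
clean = (
    raw.dropDuplicates(["claim_id"])
       .withColumn("service_date", F.to_date("service_date"))
       .withColumn("billed_amount", F.col("billed_amount").cast("decimal(12,2)"))
       .filter(F.col("claim_id").isNotNull())
)

# Load: write a Delta table partitioned by service date so large scans can
# prune partitions, one of the optimizations the role calls out.
(clean.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("service_date")
      .saveAsTable("rcm_dw.fact_claims_staging"))
```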
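The SCD requirement might look like the following Delta Lake sketch of a Type 2 load, where changed rows are expired and new current versions are appended; the dimension (rcm_dw.dim_payer), staging table, and tracked column are hypothetical.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
updates = spark.read.table("rcm_staging.payer_updates")
dim = DeltaTable.forName(spark, "rcm_dw.dim_payer")

# Step 1: expire the current row for any payer whose tracked attribute changed.
(dim.alias("d")
    .merge(updates.alias("u"), "d.payer_id = u.payer_id AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.payer_name <> u.payer_name",
        set={"is_current": "false", "end_date": "current_date()"})
    .execute())

# Step 2: append a fresh 'current' version for new and changed payers
# (after step 1, changed payers no longer have an open row).
current = spark.read.table("rcm_dw.dim_payer").filter("is_current = true")
new_rows = (
    updates.join(current, "payer_id", "left_anti")
           .withColumn("start_date", F.current_date())
           .withColumn("end_date", F.lit(None).cast("date"))
           .withColumn("is_current", F.lit(True))
)
new_rows.write.format("delta").mode("append").saveAsTable("rcm_dw.dim_payer")
```

Expiring and appending in two steps keeps the merge condition simple; production loads often collapse both into a single MERGE over a staged union of inserts and updates.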
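For the scheduling side, a minimal Airflow DAG (assuming Airflow 2.4+ for the schedule parameter) might chain the nightly load to a data-quality gate; the DAG id, task names, and stubbed callables are illustrative.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def load_claims():
    """Trigger the Databricks job that runs the claims ETL (stubbed here)."""
    pass

def validate_load():
    """Placeholder data-quality gate, e.g. row-count and null checks."""
    pass

with DAG(
    dag_id="nightly_claims_load",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # run at 2 AM daily
    catchup=False,
) as dag:
    load = PythonOperator(task_id="load_claims", python_callable=load_claims)
    validate = PythonOperator(task_id="validate_load", python_callable=validate_load)
    load >> validate  # validation runs only after the load succeeds
```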
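One plausible shape for the LLM work is Databricks' ai_query() SQL function, which calls a model serving endpoint from within a query; the endpoint name, table, and classification prompt below are assumptions for illustration only.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Classify free-text claim denial notes so billing staff can triage them.
# Endpoint name and table are hypothetical; ai_query() requires a Databricks
# workspace with Foundation Model APIs enabled.
flagged = spark.sql("""
    SELECT claim_id,
           ai_query(
             'databricks-meta-llama-3-1-8b-instruct',
             CONCAT('Classify this claim denial note as BILLING_ERROR, ',
                    'MISSING_DOCS, or OTHER. Answer with one word: ', denial_note)
           ) AS denial_category
    FROM rcm_dw.claim_denials
""")
flagged.show(truncate=False)
```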

Qualifications


Skills: AWS Redshift · Databricks Lakehouse · ETL pipeline development · Machine Learning · Python · SQL · Data modeling · Data Integration · DevOps tools · Airflow · Talend · Informatica · Jaspersoft Reporting · Sisense · Docker · Kubernetes · Jira · Confluence · CSS · JavaScript · HIPAA compliance

Required

Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field.
5+ years of experience in data engineering, with a focus on ETL pipeline design and development, data warehouse design and management, structured and unstructured database management systems, and cloud technologies.
Experience with AWS Redshift, including integration with S3, Redshift Spectrum, Lambda, and Glue for data processing and transformation.
1+ years of hands-on experience with Databricks Lakehouse, Delta Lake, and Unity Catalog, including data lake management and optimization of storage and processing.
Solid proficiency in Python and SQL for developing ETL pipelines, querying relational databases, and transforming data.
Experience with ETL tools, ETL frameworks, and scheduling tools like Apache Airflow, Databricks Jobs, AWS Glue, Talend, and Informatica.
Strong background in data modeling, including dimensional modeling (star and snowflake schemas) to support business intelligence and reporting tools (see the schema sketch after this list).
Experience implementing Slowly Changing Dimensions (SCD) techniques to manage and track historical data changes.
Expertise in machine learning integration within Databricks to solve business problems and optimize business processes.
Familiarity with DevOps practices and tools such as Jenkins, GitHub, Slack, and Terraform. Experience with containerization tools like Docker and Kubernetes for packaging and deploying applications.
Basic understanding of cloud infrastructure management and monitoring using tools like CloudWatch and Databricks Monitoring.
Experience working in an Agile development environment, using Jira and Confluence to manage tasks and collaboration according to the Software Development Life Cycle (SDLC).
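
As a rough illustration of the dimensional modeling expected here, the following Spark SQL sketch creates a star schema with one dimension and one fact table; all names, columns, and the identity surrogate key are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS rcm_dw.dim_payer (
        payer_key  BIGINT GENERATED ALWAYS AS IDENTITY,  -- surrogate key
        payer_id   STRING,    -- natural key from the RCM source system
        payer_name STRING,
        start_date DATE,      -- SCD Type 2 effective-dating columns
        end_date   DATE,
        is_current BOOLEAN
    ) USING DELTA
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS rcm_dw.fact_claim (
        claim_id      STRING,
        payer_key     BIGINT,          -- foreign key to dim_payer
        date_key      INT,             -- foreign key to a date dimension
        billed_amount DECIMAL(12,2),
        paid_amount   DECIMAL(12,2)
    ) USING DELTA
    PARTITIONED BY (date_key)
""")
```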

Preferred

Experience with Delta Lake in Databricks and data lake best practices for large-scale data storage and management.
Familiarity with data privacy regulations, especially in healthcare (HIPAA).
Experience with administration and management of Sisense BI platform.
Experience with JavaScript and CSS.
Experience with leading teams and projects.
Experience in Healthcare or RCM.

Company

Quick Med Claims, LLC

Quick Med Claims (QMC) is a nationally recognized leader in emergency medical transportation billing and reimbursement.

Funding

Current Stage: Growth Stage
Total Funding: Unknown
Key Investors: GreyLion Capital
2018-11-12: Private Equity
2015-01-16: Private Equity

Leadership Team

Harry Sichi, Co-founder and CEO
Michael Lewis, Executive Chairman of the Board and Co-founder
Company data provided by Crunchbase