
Canal Insurance Company · 1 day ago

ETL Team Lead

Canal Insurance Company specializes in insurance for commercial trucking and specialty transportation operations. The ETL Team Lead will own end-to-end operational support for Canal's existing data stack, ensuring effective ETL processes, leading technical teams, and developing scalable data ingestion pipelines.


Responsibilities

Monitor daily ETL loads across SQL jobs, DHIC (GW DataHub and InfoCenter) and legacy SSIS packages
Work with AMS team where necessary to troubleshoot pipeline failures, performance issues, schema mismatches, permissions issues, and cloud resource failures
Work with AMS team where necessary to perform root-cause analysis and to implement permanent fixes
Ensure SLA adherence and on-time delivery of critical reporting datasets for scheduled ETL jobs
Provide direction for both AMS and ETL Developers on legacy and current ETL maintenance
Refactor or retire outdated or redundant ETL processes
Work with AMS team to assist with the creation and/or enhancement of operational runbooks, SOPs, monitoring dashboards, and incident response workflows
Partner with other IT operational segments, business SMEs, and AMS team to minimize downtime and to ensure that business SLAs are met
Improve existing proactive monitoring for daily processing, and implement new monitoring where needed
Work with AMS team to ensure development support coverage for critical data pipelines (rotation-based)
Support month-end and quarter-end financial reporting cycles
Coordinate production releases and validate deployments
Serve as technical lead guiding onshore/offshore developers
Review code, enforce best practices, and mentor junior engineers
Partner with Scrum Masters, Project Managers, Enterprise Architecture, QA Automation, Change Management, and AMS support teams
Develop reusable ingestion patterns for Guidewire DataHub and InfoCenter, HubSpot, Telematics, and other facets of the business
Work with Canal Architects to modernize existing ETL workloads using Delta Lake, Medallion Architecture, and Fabric Lakehouse
Build scalable data ingestion pipelines using technologies under evaluation (Azure Data Factory, MS Fabric, Databricks, Synapse Pipelines, etc.)
Bring internal and external integration data into the platform
Design and implement real-time data pipelines using Event Hub, Fabric Real-Time Analytics, Databricks Structured Streaming, and KQL-based event processing
Develop and enable real-time operational insights and automation capabilities—including Telematics alerting, FNOL automation, and fraud/VNOS/VNOP detection—through event-driven architectures and streaming analytics
Lead the strategy, design, and engineering of Canal’s modern Azure data ecosystem using next-generation tools and Medallion Architecture, including:
Implementing Medallion Architecture (Bronze/Silver/Gold) across Fabric Lakehouse, Warehouse, Eventhouse, and KQL Database
Leveraging Delta tables with schema enforcement, ACID compliance, and versioning
Develop curated, analytics-ready datasets to support Power BI, operational reporting, and advanced analytics use cases
Assist Canal architect with implementation of Data Governance tools
Establish robust data quality, validation, alerting, and observability frameworks to maintain accuracy, consistency, and trust in enterprise data
Prepare ML-ready datasets for pricing, risk, fraud detection, underwriting, claims leakage, and predictive insights

Qualifications

Microsoft SQL Server
T-SQL Scripting
SQL Server Integration Services
Data Pipeline Development
Azure Data Factory
Oracle Database
SAP BODS
PostgreSQL Scripting
Event-Driven Architecture
Data Governance
Soft Skills

Required

Maintain and improve existing pipelines that utilize the following technologies: Microsoft SQL Server database programming, T-SQL Scripting, SQL Server Integration Services, Microsoft PowerShell, Guidewire DataHub and InfoCenter, Oracle database programming, Oracle PL/SQL Scripting, SAP BODS (SAP BusinessObjects Data Services), PostgreSQL Scripting

Preferred

Prepare ML-ready datasets for pricing, risk, fraud detection, underwriting, claims leakage, and predictive insights

Benefits

Basic & Voluntary Life Insurance Plans
Medical, Dental, & Vision
Short Term & Long Term Disability
401(k) plan with company match up to 6%
Flexible Spending Accounts
Employee Assistance Programs
Generous PTO Plan

Company

Canal Insurance Company

Canal Insurance Company has specialized in insurance for commercial trucking and specialty transportation operations since 1939.

Funding

Current Stage
Growth Stage

Leadership Team

Paul Brocklebank
President and Chief Executive Officer
Matthew Grimm
Vice President, Chief Underwriting Officer
Company data provided by crunchbase