Canal Insurance Company
ETL Team Lead
Canal Insurance Company specializes in insurance for commercial trucking and specialty transportation operations. The ETL Team Lead will own operational support for the existing data stack, ensuring SLA adherence, directing ETL developers, and leading the modernization of Canal's data operations layer.
Responsibilities
Monitor daily ETL loads across SQL jobs, DHIC (Guidewire DataHub and InfoCenter), and legacy SSIS packages
Work with AMS team where necessary to troubleshoot pipeline failures, performance issues, schema mismatches, permissions issues, and cloud resource failures
Work with AMS team where necessary to perform root-cause analysis and to implement permanent fixes
Ensure SLA adherence and on-time delivery of critical reporting datasets for scheduled ETL jobs
Provide direction for both AMS and ETL Developers for Legacy & Current ETL maintenance
Refactor or retire outdated or redundant ETL processes
Work with AMS team to assist with the creation and/or enhancement of operational runbooks, SOPs, monitoring dashboards, and incident response workflows
Partner with other IT operational segments, business SMEs, and AMS team to minimize downtime and to ensure that business SLAs are met
Improve existing, and implement new, proactive monitoring for daily processing
Work with AMS team to ensure development support coverage for critical data pipelines (rotation-based)
Support month-end and quarter-end financial reporting cycles
Coordinate production releases and validate deployments
Serve as technical lead guiding onshore/offshore developers
Review code, enforce best practices, and mentor junior engineers
Partner with Scrum Masters, Project Managers, Enterprise Architecture, QA Automation, Change Management, and AMS support teams
Develop reusable ingestion patterns for Guidewire DataHub and InfoCenter, HubSpot, Telematics, and other facets of the business
Work with Canal Architects to modernize existing ETL workloads using Delta Lake, Medallion Architecture, and Fabric Lakehouse
Build scalable data ingestion pipelines using potential upcoming technologies (Azure Data Factory, MS Fabric, Databricks, Synapse Pipelines, etc.)
Bring internal and external integration data into the platform
Design and implement real-time data pipelines using Event Hub, Fabric Real-Time Analytics, Databricks Structured Streaming, and KQL-based event processing
Develop and enable real-time operational insights and automation capabilities—including Telematics alerting, FNOL automation, and fraud/VNOS/VNOP detection—through event-driven architectures and streaming analytics
Lead the strategy, design, and engineering of Canal’s modern Azure data ecosystem using next-generation tools and Medallion Architecture, including:
Implementing Medallion Architecture (Bronze/Silver/Gold) across Fabric Lakehouse, Warehouse, Eventhouse, and KQL Database
Leveraging Delta tables with schema enforcement, ACID compliance, and versioning
Develop curated, analytics-ready datasets to support Power BI, operational reporting, and advanced analytics use cases
Assist Canal Architects with the implementation of Data Governance tools
Establish robust data quality, validation, alerting, and observability frameworks to maintain accuracy, consistency, and trust in enterprise data
Prepare ML-ready datasets for pricing, risk, fraud detection, underwriting, claims leakage, and predictive insights
Qualifications
Required
Maintain and improve existing pipelines that utilize Microsoft SQL Server Database programming
Maintain and improve existing pipelines that utilize T-SQL Scripting
Maintain and improve existing pipelines that utilize SQL Server Integration Services
Maintain and improve existing pipelines that utilize Microsoft PowerShell
Maintain and improve existing pipelines that utilize Guidewire DataHub and InfoCenter
Maintain and improve existing pipelines that utilize Oracle Database programming
Maintain and improve existing pipelines that utilize Oracle PL/SQL Scripting
Maintain and improve existing pipelines that utilize SAP BODS (SAP BusinessObjects Data Services)
Maintain and improve existing pipelines that utilize PostgreSQL Scripting
Preferred
Prepare ML-ready datasets for pricing, risk, fraud detection, underwriting, claims leakage, and predictive insights
Benefits
Basic & Voluntary Life Insurance Plans
Medical, Dental, & Vision
Short Term & Long Term Disability
401(k) plan with company match up to 6%
Flexible Spending Accounts
Employee Assistance Programs
Generous PTO Plan
Company
Canal Insurance Company
Canal Insurance Company has specialized in insurance for commercial trucking and specialty transportation operations since 1939.