Ziply Fiber
Senior Data Engineer
Ziply Fiber is a local internet service provider dedicated to elevating the connected lives of the communities we serve. The Senior Data Engineer will be responsible for designing, building, and maintaining scalable data pipelines, data models, and infrastructure that support business intelligence, analytics, and operational data needs.
Internet · Manufacturing · Telecommunications
Responsibilities
Design, develop, and maintain scalable data pipelines for ingestion, transformation, and storage of large datasets
Troubleshoot and resolve data pipeline and ETL failures, implementing robust monitoring and alerting systems
Automate data workflows to increase efficiency and reduce manual intervention
Optimize data models for analytics and business intelligence reporting
Build and maintain data infrastructure, ensuring performance, reliability, and scalability
Implement best practices for data governance, security, and compliance
Work with structured and unstructured data, integrating data from various sources including databases, APIs, and streaming platforms
Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and design appropriate solutions
Mentor and train junior engineers, fostering a culture of learning and innovation
Develop and maintain documentation for data engineering processes and workflows
Perform other duties as required to support the business and evolving organization
Qualifications
Required
Bachelor's degree in Computer Science, Engineering, or a related field
Minimum of eight (8) years of experience in data engineering, ETL development, or related fields
Strong proficiency in SQL and database technologies (PostgreSQL, MySQL, Oracle, SQL Server, etc.)
Familiarity with Linux/Unix environments and the scripting technologies used on them
Proficiency in programming languages such as Python for data engineering tasks
Hands-on experience with cloud platforms such as Microsoft Azure and its data services, including Azure Data Factory and Azure Synapse Analytics
Experience working with data warehouses such as Snowflake or Azure SQL Data Warehouse
Familiarity with workflow automation tools such as Autosys
Knowledge of data modeling, schema design, and data architecture best practices
Strong understanding of data governance, security, and compliance standards
Ability to work independently in a remote environment across different time zones and collaborate effectively across teams
Exposure to GraphQL and RESTful APIs for data retrieval and integration
Familiarity with NoSQL databases such as MongoDB
Experience with version control software such as GitLab
Preferred
Proven aptitude for independently managing complex procedures, even when encountered infrequently
Proactive approach to learning and optimizing operational workflows
Familiarity with DevOps practices and CI/CD pipelines for data engineering, including Azure DevOps
Proficient in designing, writing, and maintaining complex stored procedures and stored procedure–based ETL workflows for robust data processing
Comfortable working in complex ecosystems with heterogeneous data sources and diverse end-user requirements, adapting solutions to fit unique contexts
Working knowledge of data wrangling and ETL tools, including Alteryx or similar technologies
Understanding of data privacy regulations such as GDPR and CCPA
Benefits
Medical
Dental
Vision
401(k)
Flexible spending account
Parental leave
Quarterly performance bonus
Training
Career growth and education reimbursement programs
Company
Ziply Fiber
Ziply Fiber offers fiber-optic phone, TV, and internet with coverage checks, support, and package details.
Funding
Current Stage: Late Stage
Total Funding: $500M
Key Investors: Cable ONE
2024-11-05: Acquired
2022-11-04: Corporate Round · $50M
2022-09-08: Private Equity · $450M
Recent News
lightreading · 2026-01-17
iphoneincanada.ca · 2025-11-08
Company data provided by Crunchbase