
Horizon3.ai · 1 day ago

Senior Product Analytics Developer

Horizon3.ai is a fast-growing, remote cybersecurity company dedicated to enabling organizations to proactively find and fix exploitable attack vectors. They are seeking a Senior Product Analytics Developer to build a world-class data ecosystem, collaborating with various stakeholders to design and maintain scalable data pipelines, ensuring high-quality data accessibility for analytics efforts.
Artificial Intelligence (AI) · Cyber Security · Enterprise Software · Machine Learning · Network Security

Responsibilities

Data Pipeline Development — Design, build, and maintain scalable data pipelines that extract, transform, and load (ETL/ELT) data from various internal and external systems
Data Modeling & Architecture — Develop efficient and reliable data models that support reporting and analytics needs across business functions
Data Quality & Integrity — Implement data quality checks and monitoring to ensure accuracy, completeness, and consistency of critical datasets
Data Infrastructure — Manage and optimize data storage solutions (e.g., data warehouses, data lakes), ensuring scalability, security, and performance
Business Enablement — Partner closely with analysts and business stakeholders to understand their data needs and deliver well-documented, high-quality data assets
Cross-Functional Collaboration — Work with Engineering teams to instrument new data sources and drive data-driven culture across the organization
Automation & Monitoring — Develop automated processes for data ingestion, transformation, and validation; implement monitoring to proactively detect issues
Data Visualization — Work with Product and other stakeholders across the organization to deliver insights via easily digestible visualizations in Tableau and through ad-hoc analyses
Performance Optimization — Optimize compute performance and cost in Redshift, including query tuning, resource scaling, and data modeling best practices for analytics and reporting data
Anticipate data infrastructure changes and improvements based on new product features and initiatives
Develop and maintain a deep understanding of the company's data assets within the data lake/warehouse, serving as a subject matter expert for data availability, lineage, and quality
Own and maintain Tableau/BI tooling data sources (scheduling, permissions, refreshes, etc.)
Develop and maintain ETL/ELT release protocols from lower environments to production. Manage DWH changes across multi-region infrastructure
Build data models with a bias for self-service. Internal stakeholders should have a clear understanding of table relationships and the data therein
Build trust in the data by ensuring accuracy and availability; monitoring, alerting on, and addressing data anomalies are among your top priorities
Engage regularly with Engineering and Product teams to understand data infrastructure changes that will impact the DWH and organizational business needs
Engage with owners and stakeholders of applications that have a bi-directional relationship with the DWH to ensure data accuracy and availability
Maintain clear documentation (ERD, Data Dictionary) and promote data engineering best practices across the team
Utilize software engineering best practices such as version control via Git, CI/CD, and release management
Approach schema and table design with a bias towards growth, ensuring scalability and continuous optimization

Qualifications

SQL · ETL/ELT pipelines · Data modeling · Data quality · Python · Cloud platforms · Data visualization · Agile mindset · Version control · Cross-functional collaboration · Self-starter

Required

Bachelor's degree or equivalent in Computer Science, Engineering, Information Systems, or a related field
4+ years of experience in data engineering or a related field
Hands-on experience designing and building scalable data pipelines and data models
Experience working in fast-paced environments and collaborating across multiple business functions
Experience with translating stakeholder requirements into analytics readouts
Strong SQL skills and experience with relational databases and cloud data warehouses (e.g., Snowflake, BigQuery, Redshift)
Proficiency in programming languages used for data engineering (e.g., Python, Scala)
Experience building and maintaining ETL/ELT pipelines using modern data tools (e.g., dbt, Airflow, Fivetran, AWS DMS)
Familiarity with data modeling concepts and data architecture best practices
Strong understanding of data quality, data governance, and security principles
Ability to collaborate effectively with cross-functional teams and communicate technical concepts to non-technical stakeholders
Agile mindset with ability to iterate quickly and adapt to changing business priorities
Self-starter who takes ownership and drives initiatives to completion
Up to 10% travel required

Preferred

Experience with BI and data visualization tools (e.g., Looker, Tableau, Power BI)
Experience working with cloud platforms (AWS, GCP, Azure)
Familiarity with version control (Git) and CI/CD for data pipelines

Benefits

Health, vision & dental insurance for you and your family
Flexible vacation policy
Generous parental leave
Equity package in the form of stock options

Company

Horizon3.ai

Horizon3.ai offers an autonomous penetration testing platform that helps organizations proactively find and fix security vulnerabilities.

Funding

Current Stage
Late Stage
Total Funding
$178.5M
Key Investors
Prosperity7 Ventures · New Enterprise Associates · Craft Ventures
2026-01-13 · Series Unknown
2025-05-22 · Series D · $100M
2023-08-08 · Series C · $40M

Leadership Team

Snehal Antani
Co-Founder & CEO
Holly Grey
Chief Financial Officer
Company data provided by Crunchbase