The Value Maximizer
Data Product Architect
The Value Maximizer is seeking a senior Data Product Architect to define and govern the end-to-end architecture of enterprise data products. This role involves designing scalable data architectures and partnering with engineering, governance, and business stakeholders to ensure alignment with enterprise standards and regulatory requirements.
Information Technology & Services
Responsibilities
Define end-to-end architecture for data products from source systems through analytics and downstream consumption
Design and govern logical, physical, and semantic data models (facts, dimensions, metrics, hierarchies)
Apply domain-driven and data-product design principles to ensure consistency and reusability
Establish and govern data contracts and domain interfaces
Define architectural patterns across Hadoop, lakehouse, and streaming platforms
Guide batch, near-real-time, and event-driven designs using Spark and Kafka
Ensure alignment across on-prem and cloud-based platforms in a hybrid enterprise environment
Review and guide ingestion and data service designs built on Java/Spring Boot and Python
Architect Kafka-based pipelines for decoupled, event-driven data products
Apply graph modeling patterns where relationship-centric use cases require it
Define enterprise semantic models supporting BI and analytics tools (Power BI, Fabric, Tableau)
Ensure consistent business definitions and metrics across reporting and analytics
Enable one-to-many consumption where a single data product supports multiple use cases
Embed data quality, lineage, metadata, and observability into architectural designs
Partner with centralized governance, security, and risk teams to meet regulatory requirements
Define data product ownership, stewardship, and lifecycle standards
Act as the architectural authority for data products within the organization
Review and approve solution designs and reference implementations
Bridge enterprise architecture standards with delivery execution across teams
Qualifications
Required
12+ years of experience in data architecture, data engineering, or analytics architecture
Proven experience designing enterprise-scale data products and platforms
Strong expertise in data modeling, lakehouse architectures, and streaming systems
Excellent communication skills with technical and business stakeholders
Bachelor's degree in Computer Science, Engineering, or related field (Master's preferred)
Proficiency in Data Platforms: Hadoop, modern lakehouse architectures
Experience with Streaming & Processing: Spark, Spark Streaming, Kafka
Programming skills: Java, Spring Boot (design/review), Python
Knowledge of Modeling & Analytics: Dimensional, canonical, domain-driven modeling; semantic layers
Familiarity with Observability: Data observability and operational monitoring (ELK preferred)
Understanding of Governance & Security: Data governance, lineage, quality, and compliance
Preferred
Banking or financial services experience (deposits, loans, transactions)
Familiarity with data mesh or domain-oriented operating models
Experience supporting BI modernization initiatives
Exposure to Azure, AWS, or GCP in regulated enterprise environments
Company
The Value Maximizer
At The Value Maximizer, we empower businesses to unlock their full potential through cutting-edge AI-based platforms.
Funding
Current Stage
Early Stage