dentsu · 1 day ago
Senior Director, Data Engineering
Dentsu is a leading global marketing and advertising agency seeking a Senior Director of Data Engineering to architect and enhance its data ecosystem. The role involves designing scalable data infrastructure, optimizing data pipelines, and enabling AI-driven insights in collaboration with teams across the business.
Advertising · Information Services · Marketing
Responsibilities
Build, scale, and maintain robust data pipelines/models using DBT, Python, PySpark, Databricks, and SQL, integrating AI-first foundations and semantic layers for consistent data interpretation
Design and manage semantic models, star schemas, ontologies, taxonomies, knowledge graphs, and glossaries using DBT YAML, GitHub, Unity Catalog, Fabric/OneLake, and Power BI for unified understanding and AI reasoning
Utilize low-code/no-code tools (Trifacta, DBT, Power BI, Tableau, Fabric/OneLake, Copilot Studio) to build governed semantic layers supporting natural language querying, vector search, and hybrid AI indexing
Own AI deployment pipelines with containerized agents and automation using Kubernetes, n8n, LangChain, Azure AI Foundry, and the Model Context Protocol (MCP) for multi-step retrieval, summarization, and notifications
Strengthen AI accuracy/governance via metadata, access controls, and grounding (vector DBs, search indexes, knowledge graphs) to deliver reliable responses, source citation, and 'why' reasoning
Design modular, reusable data models for analytics, reporting, AI enablement, and agentic apps, including LLM integration for intent parsing, routing, retrieval, and synthesis
Develop and monitor mapping tables, validation rules, lineage, error logging, and observability for ETL/ELT health, data integrity, schema control, and real-time quality monitoring
Collaborate with analysts, engineers, and stakeholders to transform raw data into governed datasets, leveraging Adverity for multi-source integration and normalization
Implement agentic AI and Copilot integrations to enhance data accessibility, autonomous resolution, and dynamic insights across processes
Drive innovation in Data Quality Suite roadmap, including real-time monitoring, dynamic interfaces, self-serve tools, and AI-enhanced features for scalability
Contribute to medallion architecture (bronze/silver/gold), best practices for reusable components, semantic layer extension (e.g., RAG indexing), and AI infrastructure
Manage Databricks Unity Catalog, Workflows, SQL Analytics, Notebooks, and Jobs for governed analytics and ML workflows
Develop pipelines/tools with Microsoft Fabric, Power BI, Power Apps, Azure Data Lake/Blob, and Copilot Studio, tied to GitHub, n8n, and Kubernetes orchestration
Leverage GitHub and GitHub Copilot for version control, CI/CD, automation, code suggestions, and collaboration on SQL, Python, YAML, and agent development
Utilize Java or Scala for custom processing scripts, scalable ingestion, and advanced AI actions like code execution and vector search
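The pipeline-health responsibilities above (mapping tables, validation rules, error logging) could be sketched as a minimal validation routine in plain Python. All names here (`Rule`, `validate_rows`, the example fields) are illustrative assumptions, not part of the role's actual stack:

```python
# Minimal, hypothetical sketch of row-level validation rules with error
# logging, in the spirit of the ETL/ELT health checks described above.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]  # returns True when the row passes

def validate_rows(rows: list[dict], rules: list[Rule]):
    """Apply each rule to each row; return (clean_rows, error_log)."""
    clean, errors = [], []
    for i, row in enumerate(rows):
        failed = [r.name for r in rules if not r.check(row)]
        if failed:
            # Record which rules failed so the error log supports lineage
            # and observability rather than silently dropping rows.
            errors.append({"row": i, "failed_rules": failed})
        else:
            clean.append(row)
    return clean, errors

# Illustrative rules for a hypothetical ad-spend feed.
rules = [
    Rule("spend_non_negative", lambda r: r.get("spend", 0) >= 0),
    Rule("campaign_id_present", lambda r: bool(r.get("campaign_id"))),
]
rows = [
    {"campaign_id": "c1", "spend": 120.0},
    {"campaign_id": "", "spend": -5.0},
]
clean, errors = validate_rows(rows, rules)
```

In practice the same pattern would live inside DBT tests or a Databricks job rather than a standalone script; the point is that each rule is named, composable, and produces a structured error record.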
Qualifications
Required
8+ years of experience as a Data Engineer or in a similar role building scalable data infrastructure, with at least 2 years focused on AI-integrated systems, semantic layers, or agentic AI deployments
Bachelor's Degree in Computer Science, Engineering, Information Systems, or a related field
Advanced expertise in SQL, Python, and DBT; strong experience with PySpark, Databricks, and semantic layer tools such as DBT YAML, Unity Catalog, and knowledge graphs
Hands-on experience with ETL/ELT design tools like Trifacta (Alteryx), Adverity, Azure Data Factory, Fabric/Power BI DAX, or similar, including data normalization and workflow automation
Proven experience building and extending semantic layers for AI applications, including ontologies, taxonomies, vector databases, and integration with LLMs for enhanced reasoning, accuracy, and 'why' question resolution
Deep experience with the Microsoft data stack, including Power BI, Power Apps, Fabric/OneLake, Azure Data Lake Storage (ADLS Gen2), Azure Blob Storage, Copilot Studio, and Azure AI Foundry for ModelOps and intelligent actions
Experience with AI deployment and orchestration tools such as Kubernetes, n8n, LangChain, and Model Context Protocol (MCP) for containerized agents, multi-step workflows, and governance
Strong experience in developing and managing API endpoints, integrating with external systems, and supporting LLM access for conversational AI and automation
Proficiency in Java or Scala for large-scale data processing, ingestion workflows, and custom AI integrations
Experience supporting data observability, quality frameworks (e.g., unit tests, reconciliation logic, job monitoring), and AI governance (e.g., metadata embedding, compliance rules)
Strong familiarity with Git-based development, GitHub Copilot for AI-assisted coding, and structured code collaboration in environments like DBT Cloud and GitHub Actions
Self-starter mindset: acts quickly and independently, with a proven ability to learn new tools and technologies on the fly while delivering scalable solutions using any combination of tools in the tech stack to drive continuous improvement and impact
Preferred
Exposure to building tools in Microsoft Power Apps or other low-code platforms, including Copilot integrations for monitoring and workflows
Experience in advertising, marketing, or digital media environments, particularly with use cases like performance reporting, reconciliation automation, or brand visibility optimization
Benefits
Medical, vision, and dental insurance
Life insurance
Short-term and long-term disability insurance
401k
Flexible paid time off
At least 15 paid holidays per year
Paid sick and safe leave
Paid parental leave
Company
dentsu
We are dentsu.
Funding
Current Stage: Late Stage
Total Funding: $24.88M
Key Investors: Epiris
2012-07-12: Acquired
1993-11-01: Private Equity · $24.88M
Recent News
Morningstar.com (2025-06-13)
Company data provided by Crunchbase