Dudek · 13 hours ago

Enterprise Data Analytics and AI Developer

Dudek is a 100% employee-owned firm supporting clients nationwide and delivering projects that improve and protect the built and natural environments. The Enterprise Data Analytics and AI Developer is responsible for the design, implementation, and deployment of enterprise-scale data management, data analytics, and AI capabilities, leveraging Microsoft technologies to deliver secure and reliable solutions.

Consulting · Information Technology
Growth Opportunities
No H1B · U.S. Citizen Only

Responsibilities

Partner with enterprise architecture & engineering to define data and AI roadmaps that align with business objectives and operating models
Develop reference architectures and patterns for Fabric (OneLake, Lakehouse/Warehouse, Data Factory) and AI Foundry (agents, grounding, evaluations, guardrails)
Shape data service standards (naming, domains, data contracts), semantic modeling conventions, and model lifecycle policies
Contribute to backlog planning, estimation, release planning, and solution sizing for enterprise programs
Influence security, privacy, and compliance requirements (RBAC, sensitivity labels, DLP) for data and AI workloads
Provide technical leadership and mentorship to technical teams and practitioners while establishing code review, testing, and deployment standards
Translate business outcomes into technical designs and acceptance criteria; communicate tradeoffs and risks to non-technical stakeholders
Collaborate with corporate governance teams to ensure responsible AI and governed data usage
Enable knowledge transfer with high-quality documentation, runbooks, and enablement sessions for end users and support teams
Lead end-to-end technical delivery for multiple initiatives—from discovery and design through build, test, release, and operations
Define technical work breakdown structures (WBS), estimates, and resource plans; provide progress updates tied to backlog items and milestones
Own technical quality gates: design reviews, data model reviews, security reviews, and production readiness assessments
Coordinate integration with third-party systems and data providers; support vendor RFP/SOW technical inputs and evaluation criteria
Drive non-functional requirements (performance, availability, observability, cost) and execute performance/scalability tests prior to go-live
Facilitate UAT, cutover planning, and incident response playbooks; ensure smooth transitions to operations
Design, configure, and deploy custom copilots using Microsoft Copilot Studio
Train technical users on the design and prototyping of custom copilots
Integrate custom copilots into Teams and SharePoint user interfaces
Design Lakehouse and Warehouse architectures in OneLake and implement domain-driven data services
Build ingestion and transformation ETL pipelines with Data Factory, notebooks, shortcuts, and mirroring (see the ingestion sketch after this list)
Develop Power BI semantic models and datasets; optimize aggregations, partitions, incremental refreshes, and query performance
Implement KQL databases for streaming/operational analytics and monitoring use cases
Harden solutions with OAuth, SAML assertions, RBAC, sensitivity labels, row-level/object-level security, and workspace isolation; integrate with Purview where applicable
Automate CI/CD for Fabric items (Lakehouse, Warehouse, Semantic Models, Data Factory) using deployment pipelines
Select and evaluate models via the model catalog and implement model router policies and versioning/upgrade strategies
Build and host single- and multi-agent solutions with Agent Service, integrating agent frameworks as needed
Implement retrieval-augmented generation (RAG) using Azure AI Search and vector indices while securely grounding agents with Fabric Data Agents where applicable (see the retrieval sketch after this list)
Instrument tracing, evaluations, and guardrails and configure data leakage prevention per enterprise policy
Operate with the Foundry control plane for fleet governance, cost controls, and policy enforcement while integrating alerts with enterprise monitoring
Apply machine learning tools to design and train AI models that solve business challenges
Apply experience with data pipelines across enterprise structured and unstructured data models
Maintain a deep understanding of Azure tooling
Establish CI/CD for data and AI assets using GitHub and implement environment promotion, approvals, and rollback strategies
Create automated tests (unit, pipeline, data quality, prompt/agent evals) and define associated runbooks (see the test sketch after this list)
Set up cost observability and right-size capacity and throughput
Implement telemetry and logging for data pipelines, query performance, agent runs, tool calls, and error handling, and publish operational dashboards (see the logging sketch after this list)
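
As an illustration of the ingestion work described above, the following is a minimal PySpark sketch of a bronze-to-silver step as it might run in a Fabric notebook. The landing path, table name, and columns (Files/landing/orders, silver_orders, order_id, order_date, amount) are hypothetical placeholders, not part of this posting.

```python
# Minimal bronze-to-silver ingestion sketch for a Fabric Lakehouse notebook.
# All paths, table names, and columns below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw files landed in the lakehouse Files area (bronze).
raw = (spark.read
       .option("header", "true")
       .csv("Files/landing/orders/"))  # hypothetical landing path

# Light cleansing and typing before promoting to a governed silver table.
silver = (raw
          .withColumn("order_date", F.to_date("order_date"))
          .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
          .dropDuplicates(["order_id"]))

# Persist as a Delta table so Warehouse queries and semantic models can use it.
silver.write.format("delta").mode("overwrite").saveAsTable("silver_orders")
```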
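The RAG responsibility pairs a vector index with an agent or LLM call. The following is a hedged sketch of just the retrieval step against an Azure AI Search vector index using the azure-search-documents SDK; the endpoint, index name, and field names (enterprise-docs, contentVector, content) are assumptions for illustration only.

```python
# Retrieval step of a RAG flow against an Azure AI Search vector index.
# Endpoint, key, index, and field names are placeholders/assumptions.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery

search_client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",  # placeholder
    index_name="enterprise-docs",                                 # hypothetical index
    credential=AzureKeyCredential("<search-api-key>"),            # placeholder
)

def retrieve_context(query_embedding: list[float], top_k: int = 5) -> list[str]:
    """Return the top-k grounding passages for a pre-computed query embedding."""
    results = search_client.search(
        search_text=None,  # pure vector search; a hybrid query would also pass text
        vector_queries=[VectorizedQuery(
            vector=query_embedding,
            k_nearest_neighbors=top_k,
            fields="contentVector",  # hypothetical vector field
        )],
        select=["content"],  # hypothetical text field
    )
    return [doc["content"] for doc in results]
```

The returned passages would then be injected into the agent's prompt, with access to the index itself governed by the RBAC and sensitivity-label controls the security bullets above call for.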
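For the automated-testing responsibility, here is a minimal pytest-style data quality gate, assuming a Spark session wired to the lakehouse; the table and columns (silver_orders, order_id, amount) reuse the hypothetical names from the ingestion sketch.

```python
# Data quality gate sketch: run in CI before promoting a table to consumers.
# Table and column names are hypothetical.
import pytest
from pyspark.sql import SparkSession

@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.getOrCreate()

def test_silver_orders_has_no_null_keys(spark):
    """The primary-key column must be fully populated before promotion."""
    df = spark.table("silver_orders")
    assert df.filter(df.order_id.isNull()).count() == 0

def test_silver_orders_amounts_are_non_negative(spark):
    """Negative amounts indicate a transformation bug upstream."""
    df = spark.table("silver_orders")
    assert df.filter(df.amount < 0).count() == 0
```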
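Finally, for the telemetry bullet, a minimal sketch of structured run logging using only the Python standard library; the step wrapper and field names are hypothetical, and a production setup would more likely export to enterprise monitoring (e.g., Azure Monitor).

```python
# Structured run logging sketch: emits one JSON record per pipeline step,
# suitable for scraping into an operational dashboard. Names are hypothetical.
import json
import logging
import time
import uuid

logging.basicConfig(level=logging.INFO, format="%(message)s")
logger = logging.getLogger("pipeline")

def run_step(step_name, fn, *args, **kwargs):
    """Execute a pipeline step and log its status and duration as JSON."""
    run_id = str(uuid.uuid4())
    start = time.monotonic()
    status = "failed"
    try:
        result = fn(*args, **kwargs)
        status = "succeeded"
        return result
    finally:
        logger.info(json.dumps({
            "run_id": run_id,
            "step": step_name,
            "status": status,
            "duration_s": round(time.monotonic() - start, 3),
        }))
```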

Qualifications

Microsoft Fabric · Microsoft AI Foundry · SQL · Python · KQL · Data governance · CI/CD · Data modeling · Azure · Communication skills · Problem-solving · Collaboration · Mentorship · Adaptability

Required

7+ years of hands-on experience in data engineering, analytics engineering, and/or AI application development in enterprise environments
Deep understanding of data modeling frameworks: Kimball, EDW, streaming, and Lakehouse architectures
Expertise with Microsoft Fabric: OneLake, Data Factory, Lakehouse, Warehouse, Real-Time Intelligence/KQL, and Power BI semantic models
Expertise with Microsoft AI Foundry: model catalog, Agent Service, evaluations/observability, safety/guardrails, and Control Plane
Proficiency in SQL, Python, and KQL; experience with data modeling (star, data vault), and performance tuning
Experience implementing RAG pipelines (Azure AI Search/vector indices) and securely grounding agents to governed enterprise data
Proven delivery of production-grade solutions with CI/CD, IaC, automated testing, and operations runbooks on Azure
Strong understanding of data governance, privacy, and security (RBAC, sensitivity labels, row-level/object-level security, DLP)
Excellent communication skills with the ability to explain complex technical topics to diverse stakeholders
Must possess a valid driver's license and have active personal automobile liability insurance by the first day of employment

Preferred

Bachelor's degree in Computer Science, Information Systems, Data Engineering, or related field
Certifications such as Microsoft Certified: Azure Data Engineer Associate, Azure AI Engineer Associate, or Microsoft Fabric certifications
Experience with Purview governance, DLP policies, and compliance frameworks in regulated industries
Experience integrating ERPs or other enterprise business systems
Ability to work across multi-platform services

Company

Dudek

Dudek is an environmental and engineering consulting firm providing services for client projects.

Funding

Current Stage
Late Stage

Leadership Team

Helder Guimarães
Chief Financial Officer

Eric Wilson
Vice President, Environmental
Company data provided by Crunchbase