Odyssey Logistics
Sr. IT Architect
Odyssey Logistics is a global leader in multimodal logistics, dedicated to solving complex supply chain challenges. The Sr. IT Architect owns the enterprise analytics ecosystem, with responsibilities spanning cloud system architecture, data platform leadership, and the ongoing evolution of the company's engineering practices.
Consulting · Logistics · Supply Chain Management
Responsibilities
Architect and maintain a multi-region Databricks on AWS environment, enforcing "secure by default" networking via customer-managed VPCs, PrivateLink, Transit Gateway, and strict IAM cross-account roles
Design the underlying infrastructure for Generative AI, including configuring MLflow for model governance, Databricks Vector Search for RAG applications, and GPU-optimized compute clusters for inference
Enforce reproducibility by managing all platform resources (workspaces, clusters, jobs) via Terraform and Databricks Asset Bundles (DAB), ensuring no manual changes exist in Production
Systematically audit legacy technical choices from a "clean slate" perspective, actively seeking disconfirming evidence against the status quo to prevent confirmation bias and escalation of commitment to outdated strategies
Lead monthly FinOps reviews using AWS Cost Explorer and Databricks system tables (a usage-query sketch follows this list) to distinguish necessary investment from sunk costs, ensuring resources are not committed to failing projects simply because they are already underway
Own strategic vendor relationships (Databricks, AWS, Fivetran, Sigma), holding partners accountable for successful outcomes and resolving support blockers aggressively rather than passively accepting roadmap delays
Drive a Data Mesh culture by treating data assets as "Data Products." Define clear contracts, SLOs (Service Level Objectives), and publication standards to decouple producers from consumers
Design and operationalize data privacy frameworks to satisfy GDPR, CCPA, and SOC 2 requirements, including automated workflows for "Right to Be Forgotten" (RTBF) requests and PII masking within the Lakehouse architecture (a masking and RTBF sketch follows this list)
Administer the Sigma Computing environment, overseeing workspace architecture, version tagging strategies, and the promotion path of analytics assets from Dev to Production
Design and enforce comprehensive Role-Based Access Control (RBAC) policies across Unity Catalog (catalogs, schemas, tables) and AWS IAM, ensuring "least privilege" access while maintaining operational velocity (a grant-scripting sketch follows this list)
Manage high-volume data replication pipelines using Fivetran and AWS DMS, ensuring data fidelity, efficient schema drift handling, and cost-optimized sync frequencies across heterogeneous sources
Perform rigorous code reviews on high-impact pipelines, enforcing best practices such as Z-ordering, Liquid Clustering, and Schema Evolution (a table-maintenance sketch follows this list), with the authority to reject sub-standard code
Partner with analytics, data science, and business teams to design scalable data solutions that align to strategic priorities and unlock advanced use cases
Drive innovation by introducing emerging technologies, frameworks, and best practices into engineering workflows to improve scalability, automation, and productivity
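The sketches that follow illustrate several of the items above; they are minimal examples under stated assumptions, not the team's actual implementations. First, the FinOps review: a 30-day DBU summary from the Databricks system.billing.usage system table, assuming that schema is enabled and readable in the workspace.

```python
# Hedged sketch: summarize DBU consumption per workspace and SKU over the
# last 30 days from Databricks system tables. Requires read access to
# system.billing.usage; the real reporting layout is not implied here.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

usage_30d = spark.sql("""
    SELECT
      workspace_id,
      sku_name,
      SUM(usage_quantity) AS dbus_30d
    FROM system.billing.usage
    WHERE usage_date >= date_sub(current_date(), 30)
    GROUP BY workspace_id, sku_name
    ORDER BY dbus_30d DESC
""")

usage_30d.show(truncate=False)
```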
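Next, the privacy item: one plausible shape for PII masking via a Unity Catalog dynamic view and a hard delete for an RTBF request. All object, column, and group names are hypothetical.

```python
# Hedged sketch: a dynamic view that redacts email for users outside a
# privileged group, plus a hard delete for a "Right to Be Forgotten" request.
# analytics.curated.customers and pii_readers are placeholder names.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Dynamic view: is_account_group_member() is evaluated for the querying user.
spark.sql("""
    CREATE OR REPLACE VIEW analytics.curated.customers_masked AS
    SELECT
      customer_id,
      CASE
        WHEN is_account_group_member('pii_readers') THEN email
        ELSE 'REDACTED'
      END AS email,
      country
    FROM analytics.curated.customers
""")

def forget_customer(customer_id: str) -> None:
    # Remove one data subject's rows; file-level cleanup still depends on
    # the table's VACUUM/retention settings.
    (
        DeltaTable.forName(spark, "analytics.curated.customers")
        .delete(F.col("customer_id") == customer_id)
    )
```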
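For the RBAC item, a sketch of scripting "least privilege" Unity Catalog grants from a notebook or job; the catalog, schema, and group names are assumptions rather than the actual security model.

```python
# Hedged sketch: apply a small "least privilege" grant set idempotently.
# Group and object names (data_analysts, analytics.curated, ...) are made up.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

grants = [
    # Analysts can discover the catalog and read curated data only.
    "GRANT USE CATALOG ON CATALOG analytics TO `data_analysts`",
    "GRANT USE SCHEMA ON SCHEMA analytics.curated TO `data_analysts`",
    "GRANT SELECT ON SCHEMA analytics.curated TO `data_analysts`",
    # Engineers own the staging schema end to end.
    "GRANT ALL PRIVILEGES ON SCHEMA analytics.staging TO `data_engineers`",
]

for statement in grants:
    spark.sql(statement)
```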
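Finally, the table-layout practices named in the code-review item: Z-ordering an existing Delta table versus opting a new table into Liquid Clustering. Table and column names are illustrative.

```python
# Hedged sketch: the two clustering approaches named above.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Existing table: compact files and Z-order on common filter columns.
spark.sql("OPTIMIZE analytics.curated.shipments ZORDER BY (carrier_id, ship_date)")

# New table: Liquid Clustering (CLUSTER BY) instead of static partitioning
# or Z-ordering; clustering keys can be changed later without a full rewrite.
spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.curated.shipments_v2 (
      shipment_id STRING,
      carrier_id  STRING,
      ship_date   DATE,
      status      STRING
    )
    CLUSTER BY (carrier_id, ship_date)
""")
```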
Qualifications
Required
8+ years in Data Engineering/Architecture, including 3+ years of hands-on, daily production experience with Databricks on AWS
Deep familiarity with global data compliance standards (GDPR, CCPA, HIPAA) and experience implementing technical controls for PII protection (e.g., dynamic views, column-level encryption) in a distributed data environment
Expert-level proficiency with modern ELT and BI tools, specifically Fivetran (connector configuration, transformation), AWS DMS, and Sigma Computing (administration, version control, security)
Deep understanding of Role-Based Access Control (RBAC) models within distributed data systems, specifically regarding Unity Catalog grants and AWS IAM Identity Center
Deep expertise in AWS networking (Transit Gateway, VPC Endpoints, Private Subnets) and security (SCPs, encryption, IAM Identity Center)
Proven production experience deploying CI/CD pipelines via GitHub Actions and Databricks Asset Bundles (DAB); a deployment-command sketch follows this list
Demonstrated experience managing enterprise software vendors, including contract utilization, technical escalation, and roadmap alignment
A track record of making high-stakes technical decisions, including examples of when you recommended stopping a project to save resources (managing Sunk Cost bias)
Strong analytical problem-solving mindset, capable of identifying patterns, optimizing processes, and unlocking new business value through scalable solutions
Demonstrated ability to navigate ambiguity—driving clarity, alignment, and momentum in evolving requirements and complex stakeholder landscapes
Working knowledge of agile/scrum methodologies and use of agile lifecycle management tools (e.g., Jira, Confluence)
Exceptional communication and leadership skills to mentor engineers, influence stakeholders, and drive adoption of data products and practices
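A sketch of the Databricks Asset Bundles deployment step such a pipeline runs (the same commands a GitHub Actions job could invoke). It assumes the Databricks CLI (v0.205+) is installed and authenticated and that a databricks.yml exists in the working directory; the target names are placeholders.

```python
# Hedged sketch: validate and deploy a Databricks Asset Bundle to a target,
# as a CI/CD job step might. Assumes the Databricks CLI is on PATH, auth is
# configured (e.g. via environment variables), and databricks.yml exists.
import subprocess

def deploy_bundle(target: str = "prod") -> None:
    subprocess.run(["databricks", "bundle", "validate", "--target", target], check=True)
    subprocess.run(["databricks", "bundle", "deploy", "--target", target], check=True)

if __name__ == "__main__":
    deploy_bundle()
```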
Preferred
Databricks Certified Data Engineer Professional (strongly preferred)
AWS Solutions Architect Professional
Familiarity with Generative AI concepts and modern ML techniques (LLMs, transformers, ensemble methods, deep learning frameworks) and how they integrate into data engineering workflows
Benefits
A choice of medical plans with FSA and HSA options
Dental Insurance
Vision Insurance
Company-paid Life and Disability Insurance
401(k) Plan with Company Match
Employee Assistance Program
Company Health & Wellness Program
Discounts with Preferred Vendors
Company
Odyssey Logistics
Odyssey resolves your logistics challenges and offers adaptive multimodal logistics on a global scale.
Funding
Current Stage: Late Stage
Total Funding: $168.8M
Key Investors: Goldman Sachs, Logispring, Trident Capital
2017-08-30: Acquired
2014-09-04: Series Unknown · $40M
2014-01-28: Series Unknown · $48M
Company data provided by crunchbase