Coda Search│Staffing
Snowflake Data Architect
Coda Search is a global consulting firm seeking an experienced Data Architect with deep Snowflake expertise to design, build, and modernize enterprise-scale data platforms. The role focuses on improving data reliability, quality, scalability, and cost efficiency while enabling advanced analytics and predictive use cases.
Consulting · Recruiting · Staffing Agency
Responsibilities
Architect and evolve modern Snowflake-based data warehouse and data lake architectures on Azure or AWS
Design conceptual, logical, and physical data models supporting analytical, operational, and predictive workloads
Define reference architectures, ingestion patterns, transformation strategies, and optimization standards for Snowflake environments
Architect solutions that combine data from diverse structured, semi-structured, and unstructured sources into consistent, analytics-ready formats
Design, build, and maintain scalable, fault-tolerant ETL/ELT pipelines using Snowflake, Python, and SQL
Implement and standardize ELT workflows using dbt, ensuring modular, testable, and well-documented transformations
Develop batch and real-time ingestion frameworks using orchestration tools such as Apache Airflow or Azure Data Factory
Optimize Snowflake workloads using partitioning, clustering, micro-partition management, and query performance tuning
Write and optimize complex SQL queries and stored procedures to support analytics and downstream applications
Provide hands-on guidance and code reviews for data engineers, ensuring performance, reliability, and maintainability
Troubleshoot and resolve complex pipeline, performance, and data quality issues across the platform
Tune distributed processing workflows and leverage pushdown optimization and parallel processing for cost and performance efficiency
Implement data quality and reliability frameworks, including validation, profiling, and automated testing
Monitor resource usage and Snowflake costs, applying lifecycle management and cost-control strategies
Design self-healing and monitoring mechanisms to ensure high availability and reliability of critical pipelines
Maintain clear documentation of architectures, data models, and workflows
Partner with analysts, data scientists, and stakeholders to translate business requirements into scalable Snowflake solutions
Support predictive and prescriptive modeling use cases by enabling clean, trusted, and well-modeled datasets
Stay current with Snowflake features and modern data architecture trends, recommending adoption where appropriate
Qualifications
Required
Deep, hands-on experience with Snowflake, including warehouse design, performance optimization, and cost management
Strong proficiency in SQL and Python for data engineering and transformation workflows
Experience designing and implementing ETL/ELT pipelines using Snowflake, Azure Data Factory, Airflow, or similar tools
Strong experience with dbt for transformation, testing, and documentation
Proven expertise in data modeling, including star and snowflake schemas and dimensional modeling
Solid understanding of data warehouse and data lake architectures on Azure or AWS
Hands-on experience with cloud storage and services (e.g., Azure Blob Storage, AWS S3, Glue, Lambda)
Familiarity with distributed data processing concepts (Spark, Hadoop, Databricks)
Preferred
Experience with real-time or streaming platforms (Kafka, Kinesis)
Snowflake certifications
Cloud certifications (Azure, AWS, GCP)
Experience integrating Snowflake with BI and visualization tools such as Power BI, Tableau, or Looker
Exposure to predictive or prescriptive analytics workflows
Company
Coda Search│Staffing
At Coda, we believe the best way to serve our clients is through an inclusive and personalized approach. We're not order-takers; we're consultants.
Funding
Current Stage
Growth Stage