Artmac
Senior Snowflake Data Engineer
Artmac is a technology consulting and service-oriented IT company dedicated to providing innovative technology solutions and services to customers. They are seeking a Senior Snowflake Data Engineer to design and implement Snowflake schemas, build data ingestion pipelines, and integrate with cloud platforms and BI tools.
Software, Apps, Marketing, Information Technology, Digital Marketing, Web Design, Web Development
Responsibilities
Design and implement Snowflake schemas (star, snowflake, data vault) optimized with micro-partitioning, clustering keys, materialized views, and the search optimization service
Build real-time and batch ingestion pipelines into Snowflake using Snowpipe, Kafka Connect, Fivetran, Matillion, Informatica, or dbt (a Snowpipe sketch follows this list)
Automate incremental data processing with Streams & Tasks to support CDC (Change Data Capture), as sketched after this list
Use Zero-Copy Cloning for environment management, testing, and sandboxing (see the cloning and Time Travel sketch after this list)
Apply Time Travel and Fail-safe features for data recovery and auditing
Develop data transformation logic in Snowpark (Python, Scala, or Java) to push compute directly into Snowflake (see the Snowpark sketch after this list)
Design integrations with cloud storage (S3, Azure ADLS, GCS) for staging and external tables
Implement data sharing and data marketplace solutions via Snowflake Secure Data Sharing and Snowflake Marketplace
Enable semi-structured data handling (JSON, Avro, Parquet, ORC, XML) using VARIANT columns and lateral flattening (see the flattening sketch after this list)
Integrate Snowflake with BI tools (Power BI, Tableau) via live connections and semantic layers
Implement RBAC (Role-Based Access Control), Row Access Policies, and Dynamic Data Masking for data security (see the policy sketch after this list)
Integrate with data catalog & governance platforms (Collibra, Alation, Informatica CDGC) using Snowflake metadata and APIs
Support CI/CD automation for Snowflake code deployment using GitHub Actions, Azure DevOps, or dbt Cloud
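The sketches below illustrate several of the Snowflake features named in the responsibilities above. They are minimal, hedged examples using the Snowpark for Python API; every credential, database, schema, table, stage, role, and warehouse name in them is a hypothetical placeholder, not part of this role's actual environment.
A Snowpipe sketch for continuous ingestion from an external stage, assuming an S3 bucket and a storage integration named S3_INT already exist (auto-ingest also requires S3 event notifications to be configured):

    from snowflake.snowpark import Session

    # Hypothetical connection parameters; replace with real account details.
    session = Session.builder.configs({
        "account": "<account>", "user": "<user>", "password": "<password>",
        "warehouse": "LOAD_WH", "database": "RAW", "schema": "EVENTS",
    }).create()

    # External stage over an S3 prefix used for staging JSON files.
    session.sql("""
        CREATE STAGE IF NOT EXISTS raw_events_stage
          URL = 's3://example-bucket/events/'
          STORAGE_INTEGRATION = S3_INT
          FILE_FORMAT = (TYPE = 'JSON')
    """).collect()

    # Landing table with a single VARIANT column for the raw JSON payload.
    session.sql("CREATE TABLE IF NOT EXISTS raw_events (payload VARIANT)").collect()

    # Snowpipe with auto-ingest: files arriving in the stage are loaded continuously.
    session.sql("""
        CREATE PIPE IF NOT EXISTS raw_events_pipe AUTO_INGEST = TRUE AS
          COPY INTO raw_events FROM @raw_events_stage
    """).collect()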
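A Streams & Tasks sketch for incremental CDC processing: a stream tracks changes on a hypothetical raw_orders table, and a scheduled task merges them into curated_orders only when new change rows exist:

    from snowflake.snowpark import Session

    # Hypothetical connection parameters.
    session = Session.builder.configs({
        "account": "<account>", "user": "<user>", "password": "<password>",
        "warehouse": "ETL_WH", "database": "ANALYTICS", "schema": "STAGING",
    }).create()

    # Stream records inserts/updates/deletes on the source table since last consumption.
    session.sql("CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw_orders").collect()

    # Task wakes every 5 minutes but runs only when the stream has data.
    session.sql("""
        CREATE TASK IF NOT EXISTS merge_orders_task
          WAREHOUSE = ETL_WH
          SCHEDULE = '5 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
        AS
          MERGE INTO curated_orders t
          USING orders_stream s ON t.order_id = s.order_id
          WHEN MATCHED THEN UPDATE SET t.status = s.status, t.amount = s.amount
          WHEN NOT MATCHED THEN INSERT (order_id, status, amount)
            VALUES (s.order_id, s.status, s.amount)
    """).collect()

    # Tasks are created suspended; resume to start the schedule.
    session.sql("ALTER TASK merge_orders_task RESUME").collect()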
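A Zero-Copy Cloning and Time Travel sketch: an instant, metadata-only clone of a hypothetical ANALYTICS database serves as a QA sandbox, and Time Travel reads the data as of one hour ago (within the configured retention period):

    from snowflake.snowpark import Session

    # Hypothetical connection parameters.
    session = Session.builder.configs({
        "account": "<account>", "user": "<user>", "password": "<password>",
        "warehouse": "DEV_WH", "database": "ANALYTICS", "schema": "PUBLIC",
    }).create()

    # Zero-Copy Clone: a storage-free copy of production for testing and sandboxing.
    session.sql("CREATE DATABASE IF NOT EXISTS analytics_qa CLONE analytics").collect()

    # Time Travel: read the table as it looked 3600 seconds ago.
    one_hour_ago = session.sql(
        "SELECT * FROM analytics.public.curated_orders AT(OFFSET => -3600)"
    )
    print(one_hour_ago.count())

    # (After an accidental DROP, "UNDROP TABLE analytics.public.curated_orders"
    #  restores the table from Time Travel; Fail-safe covers the period beyond it.)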
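A Snowpark for Python transformation sketch: the DataFrame operations are translated to SQL and executed inside Snowflake, so no data is pulled to the client; the table and column names are hypothetical:

    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col, sum as sum_

    # Hypothetical connection parameters.
    session = Session.builder.configs({
        "account": "<account>", "user": "<user>", "password": "<password>",
        "warehouse": "XFORM_WH", "database": "ANALYTICS", "schema": "PUBLIC",
    }).create()

    # Lazily evaluated DataFrame over the curated orders table.
    orders = session.table("analytics.public.curated_orders")

    # Aggregate completed orders into daily revenue; compute is pushed down to Snowflake.
    daily_revenue = (
        orders.filter(col("status") == "COMPLETED")
              .group_by(col("order_date"))
              .agg(sum_(col("amount")).alias("revenue"))
    )

    # Persist the result as a table in a (hypothetical) marts schema.
    daily_revenue.write.save_as_table("analytics.marts.daily_revenue", mode="overwrite")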
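A semi-structured data sketch: a JSON payload stored in a VARIANT column is queried with path notation, and its items array is exploded with LATERAL FLATTEN; the payload structure and names are hypothetical:

    from snowflake.snowpark import Session

    # Hypothetical connection parameters.
    session = Session.builder.configs({
        "account": "<account>", "user": "<user>", "password": "<password>",
        "warehouse": "QUERY_WH", "database": "RAW", "schema": "EVENTS",
    }).create()

    # One output row per element of the payload:items array.
    flattened = session.sql("""
        SELECT
            payload:order_id::STRING AS order_id,
            item.value:sku::STRING   AS sku,
            item.value:qty::NUMBER   AS qty
        FROM raw_events,
             LATERAL FLATTEN(input => payload:items) item
    """)
    flattened.show()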
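A data security sketch combining Dynamic Data Masking, a Row Access Policy, and RBAC grants; the roles (PII_ANALYST, GLOBAL_ANALYST, BI_READER), tables, and the simplified region rule are hypothetical:

    from snowflake.snowpark import Session

    # Hypothetical connection parameters (requires a role allowed to create policies).
    session = Session.builder.configs({
        "account": "<account>", "user": "<user>", "password": "<password>",
        "warehouse": "ADMIN_WH", "database": "ANALYTICS", "schema": "PUBLIC",
    }).create()

    # Dynamic Data Masking: only PII_ANALYST sees raw e-mail addresses.
    session.sql("""
        CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING) RETURNS STRING ->
          CASE WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val ELSE '***MASKED***' END
    """).collect()
    session.sql(
        "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask"
    ).collect()

    # Row Access Policy: non-global roles only see EMEA rows (simplified rule).
    session.sql("""
        CREATE ROW ACCESS POLICY IF NOT EXISTS region_rap AS (region STRING) RETURNS BOOLEAN ->
          CURRENT_ROLE() = 'GLOBAL_ANALYST' OR region = 'EMEA'
    """).collect()
    session.sql("ALTER TABLE orders ADD ROW ACCESS POLICY region_rap ON (region)").collect()

    # RBAC: read-only grants for a BI/reporting role.
    session.sql("GRANT USAGE ON SCHEMA analytics.public TO ROLE BI_READER").collect()
    session.sql("GRANT SELECT ON ALL TABLES IN SCHEMA analytics.public TO ROLE BI_READER").collect()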
Qualification
Required
10+ years of data engineering experience, with 5+ years in Snowflake Data Cloud
Expertise in SQL optimization and Snowflake performance tuning
Hands-on with Snowpipe, Streams & Tasks, Snowpark, Zero-Copy Cloning, and Secure Data Sharing
Proficiency in Python, Scala, or Java for Snowpark development
Experience integrating with cloud platforms like AWS
Exposure to ETL/ELT tools (Informatica, Matillion, Fivetran)
Familiarity with CI/CD, Git, and DevOps practices for data operations
Bachelor's degree or equivalent combination of education and experience
Preferred
SnowPro Core certification
Snowflake-native feature design and implementation (Snowpark, Streams, Time Travel, Secure Data Sharing)
Data ingestion (Snowpipe, CDC, Kafka, Fivetran)
Semi-structured data handling (VARIANT, JSON, Avro, Parquet)
Advanced SQL and performance tuning
Data governance (RBAC, masking, lineage, catalogs)
Cloud data platform integrations (AWS S3, Azure ADLS, GCP GCS)
BI and analytics tool integration
Cost optimization and warehouse orchestration
Company
Artmac
Artmac provides digital, consulting, and management IT services to clients globally. As your trusted partner, let's reimagine how your business gets done.
H1B Sponsorship
Artmac has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. For reference, US Department of Labor data shows 3 total sponsorships in 2023 and 3 in 2021.
Funding
Current Stage: Early Stage (company data provided by Crunchbase)