PriceSenz · 17 hours ago
Data Architect
Responsibilities
Design and implement data structures for storing, processing, and accessing data within the Snowflake platform.
Optimize data pipelines, workflows, and queries for performance, scalability, and cost-efficiency.
Develop ETL/ELT processes and integrate Snowflake with cloud services (e.g., AWS, Azure, GCP), third-party tools, and on-premises systems.
Apply Snowflake features such as data sharing, secure data exchange, cloning, and data masking to meet business requirements.
Ensure secure data access by implementing encryption, role-based access control (RBAC), and data masking.
Automate administrative tasks using Snowflake SQL and scripting languages like Python or Shell scripting.
Implement data loading methods, including bulk loading with COPY INTO, real-time ingestion with Snowpipe, and external tables.
Manage Snowflake Virtual Warehouses, including scaling, resizing, and auto-suspend/resume settings.
Implement and manage roles, privileges, and access control via Snowflake RBAC.
Integrate Snowflake Single Sign-On (SSO) and System for Cross-domain Identity Management (SCIM) for identity and access management.
Monitor and troubleshoot Snowflake environments, track usage, and optimize query performance.
Collaborate with data engineers, data scientists, and business analysts to ensure the Snowflake environment meets stakeholder needs.
Develop technical documentation, data flow diagrams, and support documentation.
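The loading, access-control, and warehouse-management duties above correspond to a handful of Snowflake SQL statements. A minimal sketch follows, written as Python string constants so it is self-contained; all object names (my_db, raw_stage, analyst_role, etl_wh) are hypothetical placeholders, not part of the posting:

```python
# Illustrative Snowflake SQL for the responsibilities listed above.
# Object names are hypothetical.

# Bulk loading a staged file into a table with COPY INTO.
copy_into = """
COPY INTO my_db.public.orders
  FROM @my_db.public.raw_stage/orders/
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
""".strip()

# Continuous ingestion with Snowpipe (auto-ingest from cloud storage).
snowpipe = """
CREATE PIPE my_db.public.orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO my_db.public.orders
    FROM @my_db.public.raw_stage/orders/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
""".strip()

# Role-based access control: grant read access to a hypothetical role.
rbac = """
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE ON DATABASE my_db TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA my_db.public TO ROLE analyst_role;
""".strip()

# Warehouse sizing with auto-suspend/resume for cost control.
warehouse = """
ALTER WAREHOUSE etl_wh SET
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;
""".strip()

for stmt in (copy_into, snowpipe, rbac, warehouse):
    print(stmt, end="\n\n")
```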
Qualifications
Required
8 years of experience in data modeling, integration, warehousing, and governance.
8 years of experience with Snowflake architecture and components, including Databases, Procedures, Tasks, and Streams.
8 years of experience with Snowflake cloning capabilities for databases and schemas.
8 years of experience in managing Snowflake Warehouses and optimizing performance for query execution.
8 years of experience implementing and managing Snowflake RBAC for secure access.
8 years of experience integrating Snowflake with tools like Informatica and Azure Data Factory (ADF) for ETL/ELT processes.
8 years of experience automating tasks using Snowflake SQL and scripting languages like Python or Shell scripting.
8 years of experience monitoring and troubleshooting Snowflake environments.
8 years of experience with Snowflake’s security features, including encryption, data masking, and network policies.
8 years of technical writing and diagramming experience (Visio, Erwin, Microsoft Office Suite).
8 years of experience working in an agile sprint team, using JIRA software.
8 years of experience managing multiple teams and prioritizing tasks.
Knowledge of Informatica 10.5 and experience developing reports in Cognos Analytics 11.1.
Preferred
5 years of familiarity with CI/CD pipelines and version control for managing Snowflake code deployments.
5 years of prior experience in the healthcare industry or with HHS agencies.
5 years of experience working with PII or PHI data and HL7 data.
5 years of experience working with Azure cloud services.
4 years of relevant experience or a Bachelor’s degree in Computer Science, Information Systems, Business, or equivalent.