Trilyon, Inc.
Architect
Trilyon, Inc. is seeking a Snowflake Architect to join its team. The role involves designing and implementing scalable data lake architectures and leading cloud migration initiatives, with a focus on data modernization and high-performance data platforms.
Responsibilities
Guide teams in migrating data platforms from on-premises Cloudera environments to Azure cloud services
Design and implement scalable data lake solutions using Snowflake and Databricks
Develop and optimize data pipelines for data ingestion, transformation, and storage
Manage data governance, quality, and security across cloud environments
Implement performance tuning, automation, and CI/CD processes for data workflows
Collaborate with cross-functional teams to support cloud migration initiatives
Install, configure, manage, and monitor Cloudera Hadoop clusters, including HDFS, YARN, and related ecosystem components
Ensure high availability, performance, and security of Hadoop clusters
Tune Hadoop, Hive, and Spark jobs; optimize queries and configurations; troubleshoot issues related to Linux servers, networks, cluster health, job failures, and performance bottlenecks
Implement security and governance controls such as Kerberos, Apache Ranger, and Apache Atlas
Manage secrets and privileged access using HashiCorp Vault and CyberArk
Migrate Hadoop, Hive, and Spark data and applications to Azure services including Azure Synapse Analytics, Azure Databricks, and Snowflake
Develop automation scripts using shell, Ansible, and Python
Create and maintain technical documentation
Collaborate with vendors on upgrades and vulnerability remediation
Provide operational support to ensure stable and secure data platforms
Qualifications
Required
Experience with Snowflake
Experience with Databricks
Experience with Azure cloud platforms
Experience with data lake architecture
Experience with Hadoop ecosystems
Experience with data governance
Passion for cloud data modernization
Passion for high-performance data platforms
Benefits
Comprehensive benefits package
Opportunities for growth and professional development
Collaborative and inclusive company culture