
Brillio · 9 hours ago

Principal Data Engineer - R01560587

Brillio is seeking a Principal Data Engineer to design and implement scalable Snowflake data architectures for enterprise data warehousing and analytics. The role involves optimizing performance, developing data pipelines, managing AWS-based data solutions, and enforcing security best practices.
Analytics · Artificial Intelligence (AI) · Big Data · Cloud Computing · Consulting · Enterprise Applications · Machine Learning · Mobile
H1B Sponsor Likely

Responsibilities

Design and implement scalable Snowflake data architectures to support enterprise data warehousing and analytics needs
Optimize Snowflake performance through advanced tuning, warehousing strategies, and efficient data sharing solutions
Develop robust data pipelines using Python and DBT, including modeling, testing, macros, and snapshot management
Implement and enforce security best practices such as RBAC, data masking, and row-level security across cloud data platforms
Architect and manage AWS-based data solutions leveraging S3, Redshift, Lambda, Glue, EC2, and IAM for secure and reliable data operations
Orchestrate and monitor complex data workflows using Apache Airflow, including DAG design, operator configuration, and scheduling (see the sketch after this list)
Utilize version control systems such as Git to manage codebase and facilitate collaborative data engineering workflows
Integrate and process high-volume data using Apache ecosystem tools such as Spark, Kafka, and Hive, with an understanding of Hadoop environments
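The sketch below ties together the Airflow and DBT responsibilities above: a minimal Airflow DAG that schedules a nightly dbt build. It is illustrative only; the DAG id, schedule, and project path are assumptions, not details from this posting.

```python
"""Minimal sketch, assuming Airflow 2.4+ and a dbt project at /opt/dbt/analytics.

All names here (dag_id, path, schedule) are hypothetical.
"""
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_dbt_build",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",            # daily at 02:00; `schedule` requires Airflow 2.4+
    catchup=False,
) as dag:
    # dbt resolves model dependencies itself, so one task covers models,
    # tests, and snapshots; larger pipelines would split these into
    # separate operators or use a dbt-aware provider.
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="cd /opt/dbt/analytics && dbt build",
    )
```

BashOperator keeps the example self-contained; in practice teams often wrap dbt in a dedicated integration (e.g. astronomer-cosmos) for task-level visibility in the Airflow UI.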

Qualifications

Snowflake · Python · AWS · Apache Airflow · SQL · DBT · Git · Apache Spark · Hadoop · Data Warehousing · Data Modelling · Documentation Skills

Required

12–15 years of experience, including significant hands-on expertise in Snowflake data architecture and data engineering
Advanced hands-on experience with Snowflake, including performance tuning and warehousing strategies
Expertise in Snowflake security features such as RBAC, data masking, and row-level security (illustrated in the sketch after this list)
Proficiency in advanced Python programming for data engineering tasks
In-depth knowledge of DBT for data modeling, testing, macros, and snapshot management
Strong experience with AWS services including S3, Redshift, Lambda, Glue, EC2, and IAM
Extensive experience designing and managing Apache Airflow DAGs and scheduling workflows
Proficiency in version control using Git for collaborative development
Hands-on experience with Apache Spark, Kafka, and Hive
Solid understanding of Hadoop ecosystem
Expertise in basic and advanced SQL, including SnowSQL, PL/SQL, and T-SQL
Strong requirements-gathering, presentation, and documentation skills; able to translate business needs into clear, structured functional and technical documents and present them effectively to stakeholders
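To make the Snowflake security items above concrete, here is a minimal, hypothetical sketch that applies RBAC-driven masking and row-level security through the snowflake-connector-python client. Every account, role, table, and policy name is invented for illustration; the SQL follows Snowflake's documented masking-policy and row-access-policy syntax.

```python
"""Minimal sketch: column masking + row access policies in Snowflake.

Assumes the snowflake-connector-python package; all identifiers below
(account, roles, the customers table) are hypothetical.
"""
import snowflake.connector

STATEMENTS = [
    # RBAC-driven masking: only the PII_READER role sees raw emails.
    """CREATE MASKING POLICY IF NOT EXISTS mask_email AS (val STRING)
       RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
            ELSE '***MASKED***' END""",
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY mask_email",
    # Row-level security: non-admin roles only see US rows (a toy rule).
    """CREATE ROW ACCESS POLICY IF NOT EXISTS region_rls AS (region STRING)
       RETURNS BOOLEAN ->
       CURRENT_ROLE() = 'SYSADMIN' OR region = 'US'""",
    "ALTER TABLE customers ADD ROW ACCESS POLICY region_rls ON (region)",
]

conn = snowflake.connector.connect(
    account="my_account",        # hypothetical credentials; use a secrets
    user="deploy_user",          # manager rather than literals in practice
    password="change-me",
    role="SECURITYADMIN",
    warehouse="ADMIN_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```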

Preferred

Experience with Salesforce Data Cloud integration
Familiarity with data cataloging tools such as Alation
Exposure to real-time streaming architectures (see the consumer sketch after this list)
Experience working in multi-cloud environments
Knowledge of DevOps or DataOps practices
Certifications in data cloud technologies
Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field
Relevant certifications in Snowflake, AWS, or data engineering technologies are highly desirable
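For the real-time streaming item above, the sketch below shows a minimal JSON consumer built on the kafka-python client. The topic, broker address, and event shape are assumptions; a production pipeline would land these events in Snowflake or S3 rather than print them.

```python
"""Minimal sketch of a streaming consumer, assuming the kafka-python package.

Topic name, broker address, and message format are hypothetical.
"""
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                            # hypothetical topic
    bootstrap_servers="localhost:9092",
    group_id="demo-consumer",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

# Each record arrives as a dict after deserialization; the loop below just
# prints it, standing in for a real sink such as S3 or Snowflake.
for message in consumer:
    print(message.topic, message.partition, message.offset, message.value)
```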

Company

Brillio is a technology consulting and services company focused on the implementation of digital technologies.

H1B Sponsorship

Brillio has a track record of sponsoring H1B visas, though this does not guarantee sponsorship for this specific role. The information below is provided for reference. (Data powered by the US Department of Labor.)
Distribution of Different Job Fields Receiving Sponsorship (chart)
Trends of Total Sponsorships
2020: 406 · 2021: 196 · 2022: 314 · 2023: 281 · 2024: 240 · 2025: 258

Funding

Current Stage
Late Stage
Total Funding
unknown
Key Investors
The Orogen Group
2023-09-05: Private Equity
2019-01-14: Acquired

Leadership Team

Raj Mamodia
Chairman, Founder & CEO
Santosh Padmanabhan
Architect
Company data provided by Crunchbase