Senior Data Engineer – Snowflake / Qlik / DBT jobs in United States

NuStar Technologies · 5 hours ago

Senior Data Engineer – Snowflake / Qlik / DBT

NuStar Technologies is seeking a Senior Data Engineer to design, develop, and maintain data pipelines and data warehouse structures. The role involves ensuring data quality, implementing governance policies, and optimizing performance for large-scale data operations.

Analytics · Consulting · Enterprise Resource Planning (ERP) · IT Management · Staffing Agency
H1B Sponsor Likely

Responsibilities

Design, develop, and maintain data pipelines for efficient real-time and batch data processing and integration
Implement and optimize ETL processes to extract, load, transform and integrate data from various sources
Enhance data flows and storage solutions for improved performance
Design and implement data warehouse structures
Ensure data quality and consistency within the data warehouse
Apply data modeling techniques for efficient data storage and retrieval
Implement data governance policies and procedures
Implement data quality frameworks, standards and documentation
Ensure compliance with relevant data regulations and standards
Implement data security measures and access controls
Maintain data protection protocols
Analyze and optimize system performance for large-scale data operations
Troubleshoot data issues and implement robust solutions
Write unit test cases, validate data integrity and consistency requirements, and automate data pipelines using GitLab, GitHub, and CI/CD tools
Follow release management processes to promote code deployments across environments, including production, and support disaster recovery and production-support activities
Collaborate with data scientists, analysts, and other teams to understand and meet data requirements
Participate in cross-functional projects to support data-driven initiatives
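The data-quality and unit-testing responsibilities above can be sketched with a minimal example. This is an illustrative sketch only, not the employer's actual tooling; all function, field, and variable names are hypothetical (a real pipeline would typically use DBT tests or Snowflake constraints instead):

```python
# Hypothetical sketch of a post-load data-integrity check: compare a
# source extract against its warehouse target for row-count parity and
# nulls in required fields. Names and structure are illustrative only.

def validate_load(source_rows, target_rows, required_fields):
    """Return a list of human-readable integrity violations (empty = clean load)."""
    issues = []
    # Row-count parity between source and target
    if len(source_rows) != len(target_rows):
        issues.append(
            f"row count mismatch: source={len(source_rows)} target={len(target_rows)}"
        )
    # Null checks on required fields in the target
    for field in required_fields:
        nulls = sum(1 for row in target_rows if row.get(field) is None)
        if nulls:
            issues.append(f"{nulls} null value(s) in required field '{field}'")
    return issues
```

In practice a check like this would run as a pipeline step after each load, failing the CI/CD job when the returned list is non-empty.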

Qualifications

Snowflake · DBT · Qlik · Data Pipeline Development · Data Governance · ETL Processes · Data Warehousing · Python · SQL · AWS Services · Performance Optimization · Financial Banking Experience · Cross-Functional Collaboration · Soft Skills

Required

Hands-on experience building robust metadata-driven, automated data pipeline solutions leveraging modern cloud-based data technologies and tools for large data platforms
Hands-on experience applying data security and governance methodologies to meet data compliance requirements
Experience building automated ELT data pipelines and Snowpipe frameworks leveraging Qlik Replicate, DBT Cloud, and Snowflake with CI/CD
Hands-on experience building data integrity solutions across multiple data sources and targets such as SQL Server, Oracle, mainframe DB2, files, and Snowflake
Experience working with various structured and semi-structured data files: CSV, fixed-width, JSON, XML, Excel, and mainframe VSAM
Experience using AWS services such as S3, Lambda, SQS, SNS, Glue, and RDS
Proficiency in Python, PySpark, and advanced SQL for ingestion frameworks and automation
Hands-on data orchestration experience using DBT Cloud and Astronomer Airflow
Experience implementing logging, monitoring, alerting, observability, and performance-tuning techniques
Experience implementing and maintaining sensitive-data protection strategies: tokenization, Snowflake data masking policies, dynamic and conditional masking, and role-based masking rules
Strong experience designing and implementing RBAC and data access controls, and adopting governance standards across Snowflake and supporting systems
Strong experience adopting release management guidelines, deploying code to various environments, implementing disaster recovery strategies, and leading production activities
Experience implementing schema drift detection and schema evolution patterns
Must have one or more certifications in the relevant technology fields
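The schema drift detection requirement above can be illustrated with a minimal sketch. This is a hypothetical helper, not the employer's implementation; real deployments would typically compare against Snowflake's INFORMATION_SCHEMA or use Qlik Replicate's built-in change handling:

```python
# Hypothetical schema drift check: compare the columns a source currently
# delivers against the columns the target table expects, and report what
# was added or removed so an evolution step (e.g. ALTER TABLE) can react.

def detect_schema_drift(expected_columns, observed_columns):
    """Return added/removed column names between expected and observed schemas."""
    expected, observed = set(expected_columns), set(observed_columns)
    return {
        "added": sorted(observed - expected),    # new columns to evolve into the target
        "removed": sorted(expected - observed),  # columns the source stopped sending
    }
```

A pipeline would run a check like this before each load and either fail fast or apply an agreed schema-evolution policy when drift is detected.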

Preferred

Financial banking experience (nice to have)

Company

NuStar Technologies

NuStar Technologies provides staffing, recruiting, cloud enablement, ERP implementation, analytics, web solution, and managed services.

H1B Sponsorship

NuStar Technologies has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. Additional information is presented below for reference. (Data powered by the US Department of Labor)
Trends of Total Sponsorships: 2025 (1), 2024 (1), 2023 (3), 2022 (1), 2020 (3)

Funding

Current Stage: Early Stage
Company data provided by Crunchbase