NuStar Technologies
Lead Data Architect
NuStar Technologies is seeking a Lead Data Architect to design and implement scalable data architectures aligned with business objectives. The role involves hands-on development of data pipelines, governance policies, and mentoring junior engineers.
Analytics · Consulting · Enterprise Resource Planning (ERP) · IT Management · Staffing Agency
Responsibilities
Design and implement scalable, efficient data architectures
Lead the development of data strategy aligned with business objectives
Evaluate and integrate new technologies to enhance data capabilities
Implement complex data pipelines for real-time and batch processing
Optimize data flows for high-volume, high-velocity data environments
Develop advanced ETL processes for diverse data sources
Establish, drive, and enforce data governance policies and best practices
Implement data quality frameworks and monitoring systems
Ensure compliance with data regulations and standards
Analyze and optimize system performance for large-scale data operations
Troubleshoot complex data issues and implement robust solutions
Mentor junior data engineers and provide technical guidance
Stay current on data technologies and recommend best practices and standards
Collaborate with cross-functional teams and leadership to drive data literacy
Write and enforce unit test cases, validate and review data integrity and consistency results, and drive automated data pipelines using GitLab, GitHub, and CI/CD tools (a minimal test sketch follows this list)
Review and approve code promotions, and enforce release management procedures for code deployment to various environments, including production, disaster recovery, and support activities
Collaborate with business stakeholders to understand data requirements and translate them into an efficient architecture solution
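To make the testing and automation responsibilities above concrete, here is a minimal, hypothetical pytest sketch of the kind of data integrity and consistency checks a GitLab or GitHub CI/CD job could run on each commit; the dataset and column names are illustrative assumptions, not taken from this posting.

import pandas as pd

def load_transformed_orders() -> pd.DataFrame:
    # Stand-in for real pipeline output; in practice this would be read from
    # the Snowflake target or a staged extract rather than built in memory.
    return pd.DataFrame(
        {
            "order_id": [1, 2, 3],
            "customer_id": [10, 11, 12],
            "amount": [99.50, 12.00, 47.25],
        }
    )

def test_primary_key_is_unique_and_not_null():
    df = load_transformed_orders()
    assert df["order_id"].notna().all(), "order_id contains nulls"
    assert not df["order_id"].duplicated().any(), "order_id contains duplicates"

def test_row_count_reconciles_with_source():
    # Hypothetical source count; a real check would query the upstream
    # system (SQL Server, Oracle, DB2, etc.) for the same load window.
    expected_source_rows = 3
    assert len(load_transformed_orders()) == expected_source_rows

Running these with pytest in a CI stage means a failed integrity check blocks promotion of the pipeline code.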
Qualifications
Required
Deep expertise in designing and building robust, metadata-driven, automated data pipeline solutions leveraging modern cloud-based data technologies
Deep experience applying data security and governance methodologies to meet data compliance requirements
Strong hands-on experience designing and building medallion-architecture ELT pipelines, Snowpipe, and streaming frameworks using Qlik Replicate, dbt Cloud transformations, Snowflake, and GitLab with CI/CD
Strong experience designing and building data integrity solutions across multiple data sources and targets, such as SQL Server, Oracle, mainframe DB2, flat files, and Snowflake
Strong design and development experience in Python/PySpark and advanced SQL for ingestion frameworks and automation
Strong experience architecting and building solutions using AWS services such as S3, Lambda, SQS, SNS, Glue, and RDS
Strong experience working with structured and semi-structured data files: CSV, fixed-width, JSON, XML, Excel, and mainframe VSAM
Strong orchestration experience using dbt Cloud and Astronomer Airflow
Design and implement logging, monitoring, alerting, and observability using tools like Dynatrace
Strong experience in problem solving and performance tuning
Design and implement schema drift detection and schema evolution patterns (a minimal detection sketch follows this list)
Strong experience designing and implementing sensitive-data protection strategies: tokenization, Snowflake data masking policies, dynamic and conditional masking, and role-based masking rules (a hedged masking example also follows this list)
Strong experience designing and implementing RBAC and data access controls, and adopting governance standards across Snowflake and supporting systems
Strong experience enforcing and adopting release management guidelines, deploying code to various environments, implementing disaster recovery strategies, and leading production activities
Ability to make technical data decisions for key business use cases
Ability to prioritize investment on new data-related technologies in conjunction with business units
Must have one or more certifications in the relevant technology fields
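On the schema drift requirement, the sketch below shows one simple, assumption-based way to detect drift by diffing an incoming feed's columns against an expected contract; the column names and type labels are hypothetical and the comparison is deliberately naive.

from typing import Dict, List

# Hypothetical expected contract for an incoming feed.
EXPECTED_SCHEMA: Dict[str, str] = {
    "order_id": "int",
    "customer_id": "int",
    "amount": "float",
}

def detect_schema_drift(incoming_columns: Dict[str, str]) -> List[str]:
    """Return human-readable drift findings between expected and incoming schemas."""
    findings: List[str] = []
    for col in EXPECTED_SCHEMA:
        if col not in incoming_columns:
            findings.append(f"missing column: {col}")
    for col, dtype in incoming_columns.items():
        if col not in EXPECTED_SCHEMA:
            findings.append(f"new column: {col} ({dtype})")
        elif dtype != EXPECTED_SCHEMA[col]:
            findings.append(f"type change on {col}: {EXPECTED_SCHEMA[col]} -> {dtype}")
    return findings

if __name__ == "__main__":
    # Example: one column is missing, one is new, and one changed type.
    observed = {"order_id": "int", "amount": "str", "discount_code": "str"}
    for finding in detect_schema_drift(observed):
        print(finding)

In practice the findings would feed an alerting or schema-evolution workflow rather than a print statement.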
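On the Snowflake masking requirement, here is a hedged sketch (not this employer's actual policy) of creating and attaching a role-based dynamic masking policy through the snowflake-connector-python package; the connection settings, role name, table, and column are placeholders.

import snowflake.connector

# Placeholder credentials -- supply real values via a secrets manager in practice.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="my_wh",
    database="my_db",
    schema="my_schema",
)

create_policy_sql = """
CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_READER') THEN val  -- authorized role sees clear text
    ELSE '***MASKED***'                             -- all other roles see a masked value
  END
"""

# Attach the policy to a hypothetical column so masking is applied at query time.
apply_policy_sql = (
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask"
)

cur = conn.cursor()
try:
    cur.execute(create_policy_sql)
    cur.execute(apply_policy_sql)
finally:
    cur.close()
    conn.close()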
Preferred
Banking/financial services experience is a plus
Company
NuStar Technologies
NuStar Technologies provides staffing, recruiting, cloud enablement, ERP implementation, analytics, web solutions, and managed services.
H1B Sponsorship
NuStar Technologies has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. The information below is provided for reference. (Data powered by the US Department of Labor.)
Distribution of Different Job Fields Receiving Sponsorship (chart not reproduced)
Trends of Total Sponsorships: 2025 (1), 2024 (1), 2023 (3), 2022 (1), 2020 (3)
Funding
Current Stage: Early Stage (company data provided by Crunchbase)