System One
AWS Data Engineer
System One has an exciting Data Engineering opportunity with a partner based in the Tulsa, OK area. This position will be responsible for designing and developing data architecture, building and maintaining data pipelines, and ensuring data quality and integrity using AWS services.
Staffing Agency
Responsibilities
Design and develop data architecture: Create scalable, reliable, and efficient data lakehouse solutions on AWS, using the Apache Iceberg table format alongside AWS services
Build and maintain data pipelines: Design, construct, and automate ETL/ELT processes to ingest data from diverse sources into the AWS ecosystem
Create and manage data APIs: Design, develop, and maintain secure and scalable RESTful and other APIs to facilitate data access for internal teams and applications, typically leveraging AWS services
Implement AWS services: Utilize a wide array of AWS tools for data processing, storage, and analytics, such as Amazon S3, Amazon EMR, and AWS Lake Formation, with native Iceberg support
Manage Iceberg tables: Build and manage Apache Iceberg tables on Amazon S3 to enable data lakehouse features like ACID transactions, time travel, and schema evolution
Optimize data performance: Implement partitioning strategies, data compaction, and fine-tuning techniques for Iceberg tables to enhance query performance
Ensure data quality and integrity: Implement data validation and error-handling processes, leveraging Iceberg's transactional capabilities for consistent data
Ensure security and compliance: Implement robust data security measures, access controls, and compliance with data protection regulations, including using AWS Lake Formation with Iceberg and implementing authorization on APIs via IAM or Cognito
Collaborate with stakeholders: Work closely with data scientists, analysts, software engineers, and business teams to understand their data needs and deliver effective solutions
Provide technical support: Offer technical expertise and troubleshooting for issues in data pipelines and API endpoints
Maintain documentation: Create and maintain technical documentation for data workflows, processes, and API specifications
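As a sketch of the data-quality responsibility above (plain Python; the record schema and field names are hypothetical, and in practice a step like this would run before committing a batch to an Iceberg table):

```python
from datetime import datetime

# Hypothetical schema for an ingested event; field names are illustrative only.
REQUIRED_FIELDS = {"event_id", "event_time", "amount"}

def validate_record(record: dict) -> list:
    """Return a list of validation errors for one record (empty list = valid)."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
        return errors
    if not isinstance(record["amount"], (int, float)) or record["amount"] < 0:
        errors.append("amount must be a non-negative number")
    try:
        datetime.fromisoformat(record["event_time"])
    except (TypeError, ValueError):
        errors.append("event_time is not ISO-8601")
    return errors

def split_valid_invalid(records):
    """Partition a batch so only clean rows are written onward.

    Returns (valid_records, quarantined) where quarantined pairs each
    rejected record with its error list.
    """
    valid, quarantined = [], []
    for rec in records:
        errs = validate_record(rec)
        if errs:
            quarantined.append((rec, errs))
        else:
            valid.append(rec)
    return valid, quarantined
```

In a real pipeline the quarantined rows would typically land in a dead-letter location in S3 for review, while the valid batch is written atomically, relying on Iceberg's transactional guarantees for consistency.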
Qualifications
Required
Bachelor's degree in Computer Science, Information Technology, or a related field
Proven experience in data engineering, with significant hands-on experience using AWS cloud services
Proficiency in programming languages like Python, Java, or Scala
Strong SQL skills for querying, data modeling, and database design
Expertise in relevant AWS services such as S3, EMR, Lambda, API Gateway, SageMaker, and IAM
Hands-on experience building and managing Apache Iceberg tables
Experience with big data technologies like Apache Spark and Hadoop
Experience creating and deploying RESTful APIs, with knowledge of best practices for performance and security
Experience with ETL tools and workflow orchestration tools like Apache Airflow
Familiarity with DevOps practices, CI/CD pipelines, and infrastructure as code (e.g., Terraform)
Strong problem-solving and analytical skills
Excellent communication and collaboration skills
Ability to work independently and as part of an agile team
Proof of ability to work in the U.S. without sponsorship
Preferred
AWS Certified Data Analytics – Specialty
AWS Certified Data Engineer – Associate
Other relevant AWS certifications
Benefits
Health and welfare benefits coverage options including medical, dental, vision, spending accounts, life insurance, voluntary plans
Participation in a 401(k) plan
Company
System One
System One is a professional staffing firm.
Funding
Current Stage: Late Stage
Total Funding: $301.8M
Key Investors: Truist, Oaktree Capital Management, Prospect Capital Corporation
2021-01-28 · Debt Financing · $290M
2020-12-08 · Private Equity
2016-09-20 · Acquired
Company data provided by crunchbase