AWS Data Engineer - Fully Remote - US Only @ Scalepex | Jobright.ai
Plano, TX · 168 applicants · Posted by Agency

Scalepex · 21 hours ago

AWS Data Engineer - Fully Remote - US Only

Finance · Professional Services
No H1B · Security Clearance Required


Responsibilities

Design and Build Data Pipelines: Develop scalable, reliable data pipelines using AWS services (e.g., Glue, S3, Redshift) to process and transform large datasets from utility systems like smart meters or energy grids
Workflow Orchestration: Use AWS Step Functions to orchestrate workflows across data pipelines; experience with Airflow is acceptable but Step Functions is preferred
Data Integration and Transformation: Implement ETL/ELT processes using PySpark, Python, and Pandas to clean, transform, and integrate data from multiple sources into unified datasets (an illustrative sketch follows this list)
Distributed Systems Expertise: Leverage experience with complex distributed systems to ensure reliability, scalability, and performance in handling large-scale utility data
Serverless Application Development: Use AWS Lambda functions to build serverless solutions for automating data processing tasks (see the Lambda sketch after this list)
Data Modeling for Analytics: Design data models tailored for utilities use cases (e.g., energy consumption forecasting) to enable advanced analytics
Optimize Data Pipelines: Continuously monitor and improve the performance of data pipelines to reduce latency, enhance throughput, and ensure high availability
Ensure Data Security and Compliance: Implement robust security measures to protect sensitive utility data and ensure compliance with industry regulations
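
For illustration only, here is a minimal sketch of the kind of PySpark ETL step described in the data integration responsibility above: clean and deduplicate raw smart-meter readings, aggregate them, and write an analytics-friendly dataset back to S3. The bucket paths, column names, and schema are hypothetical assumptions, not details from this role:

```python
# Illustrative sketch only: a minimal PySpark ETL step for smart-meter data.
# Bucket names, column names, and the schema are hypothetical assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("smart-meter-etl").getOrCreate()

# Read raw meter readings landed in S3 (hypothetical location).
raw = spark.read.json("s3://example-raw-bucket/smart-meter/readings/")

# Basic cleaning: drop malformed rows, parse timestamps, deduplicate.
clean = (
    raw.dropna(subset=["meter_id", "reading_kwh", "read_at"])
       .withColumn("read_at", F.to_timestamp("read_at"))
       .dropDuplicates(["meter_id", "read_at"])
)

# Simple transformation: hourly consumption per meter, partitioned by date.
hourly = (
    clean.groupBy("meter_id", F.date_trunc("hour", "read_at").alias("hour"))
         .agg(F.sum("reading_kwh").alias("kwh"))
         .withColumn("date", F.to_date("hour"))
)

# Write a curated, columnar dataset back to S3 (e.g., for Redshift or Athena).
hourly.write.mode("overwrite").partitionBy("date").parquet(
    "s3://example-curated-bucket/smart-meter/hourly/"
)
```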
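Likewise, for the workflow orchestration and serverless responsibilities, a minimal sketch of an AWS Lambda handler that starts a Step Functions execution when a new raw file lands in S3. The state machine ARN (read from an environment variable), the bucket layout, and the event shape are assumptions:

```python
# Illustrative sketch only: a Lambda handler that triggers a Step Functions
# workflow for each new raw file landing in S3. The state machine ARN
# (read from an environment variable) and the bucket layout are assumptions.
import json
import os

import boto3

sfn = boto3.client("stepfunctions")

def handler(event, context):
    # S3 put-event notification: pull out the bucket and object key.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Hand the file off to the orchestration workflow.
    response = sfn.start_execution(
        stateMachineArn=os.environ["PIPELINE_STATE_MACHINE_ARN"],
        input=json.dumps({"bucket": bucket, "key": key}),
    )
    return {"executionArn": response["executionArn"]}
```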

Qualifications


AWS Data Engineering · AWS Step Functions · AWS Lambda · AWS Glue · AWS Redshift · Python · PySpark · Pandas · Data Engineering · ETL/ELT processes · Distributed Systems · Data Modeling · Data Security · Data Governance

Required

Minimum of 5 years of experience in data engineering
Proficiency in AWS services such as Step Functions, Lambda, Glue, S3, DynamoDB, and Redshift
Strong programming skills in Python with experience using PySpark and Pandas for large-scale data processing
Hands-on experience with distributed systems and scalable architectures
Knowledge of ETL/ELT processes for integrating diverse datasets into centralized systems
Familiarity with utilities-specific datasets (e.g., smart meters, energy grids) is highly desirable
Strong analytical skills with the ability to work on unstructured datasets
Knowledge of data governance practices to ensure accuracy, consistency, and security of data
Strong experience in AWS data engineering
Ability to work independently
Ability to work with cross-functional teams, including interfacing and communicating with business stakeholders
Professional oral and written communication skills
Strong problem-solving and troubleshooting skills, with experience exercising mature judgment
Excellent teamwork and interpersonal skills
Ability to obtain and maintain the required clearance for this role

Company

Scalepex

Scalepex offers staffing and placement solutions for the healthcare sector.

Funding

Current Stage
Growth Stage

Leadership Team

Naveed Patel, Founder and CEO
Amin Chandani, Founder and Managing Partner
Company data provided by Crunchbase
