Certified Cloud Engineer (AWS, Red Hat & Python) @ Codeworks IT Careers | Jobright.ai
Certified Cloud Engineer (AWS, Red Hat & Python) jobs in United States
132 applicants
This job has closed.
Codeworks IT Careers · 1 week ago

Certified Cloud Engineer (AWS, Red Hat & Python)

Information Technology
Actively Hiring
Growth Opportunities

Responsibilities

Collaborate with data engineering, business analysts, and development teams to design, develop, test, and maintain robust and scalable data pipelines from Workday to AWS Redshift.
Architect, implement, and manage end-to-end data pipelines, ensuring data accuracy, reliability, data quality, performance, and timeliness.
Provide expertise in Redshift database optimization, performance tuning, and query optimization.
Assist with design and implementation of workflows using Airflow.
Perform data profiling and analysis to troubleshoot data-related issues and build solutions to address them.
Proactively identify opportunities to automate tasks and develop reusable frameworks.
Work closely with the version control team to maintain a well-organized and documented repository of code, scripts, and configurations using Git/Bitbucket.
Provide technical guidance and mentorship to fellow developers, sharing insights into best practices, tips, and techniques for optimizing Redshift-based data solutions.
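The Workday-to-Redshift pipeline the responsibilities describe typically follows an extract → stage-to-S3 → COPY-into-Redshift pattern. As a minimal, dependency-free sketch of that pattern — every name here (report rows, bucket, table, IAM role ARN) is a hypothetical placeholder, not something from the posting:

```python
# Conceptual sketch of a Workday -> S3 -> Redshift load, with placeholder names.
import json

def extract_workday_report(report_rows):
    """Stand-in for pulling a Workday report; serializes rows as JSON lines."""
    return "\n".join(json.dumps(row) for row in report_rows)

def stage_key(dataset, run_date):
    """Deterministic S3 key so a rerun overwrites the same daily partition."""
    return f"workday/{dataset}/dt={run_date}/data.json"

def build_copy_sql(table, bucket, key, iam_role_arn):
    """Redshift COPY from S3 using an IAM role instead of static credentials."""
    return (
        f"COPY {table} FROM 's3://{bucket}/{key}' "
        f"IAM_ROLE '{iam_role_arn}' FORMAT AS JSON 'auto';"
    )

rows = [{"employee_id": 1, "dept": "IT"}]
payload = extract_workday_report(rows)
key = stage_key("workers", "2024-01-01")
sql = build_copy_sql(
    "hr.workers", "my-data-lake", key,
    "arn:aws:iam::123456789012:role/redshift-copy",
)
```

In a real deployment each of these steps would become an Airflow task (extract, stage, COPY), with the S3 partition key making reruns idempotent.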

Qualification

AWS Data Lake Solutions, Redshift Integration, Python Programming, Airflow DAGs, PySpark, ETL Scripting, CloudFormation Templates, YAML, IAM Policies, IAM Roles, CloudWatch Logs, Logs Insights, CloudTrail, ETL Best Practices, Data Integration, Data Modeling, Data Transformation, Performance Tuning, Git, Version Control Systems

Required

Advanced hands-on professional experience designing AWS data lake solutions.
Experience integrating Redshift with other AWS services, such as DMS, Glue, Lambda, S3, Athena, Airflow.
Proficiency in Python programming with a focus on developing efficient Airflow DAGs and operators.
Experience with PySpark and Glue ETL scripting, including functions like relationalize, performing joins, and transforming DataFrames with PySpark code.
Competency developing CloudFormation templates to deploy AWS infrastructure, including YAML defined IAM policies and roles.
Experience with Airflow DAG creation.
Familiarity with debugging serverless applications using AWS tooling such as CloudWatch Logs & Logs Insights, CloudTrail, and IAM.
Ability to work in a highly complex, object-oriented Python platform.
Strong understanding of ETL best practices, data integration, data modeling, and data transformation.
Proficiency in identifying and resolving performance bottlenecks and fine-tuning Redshift queries.
Familiarity with version control systems, particularly Git, for maintaining a structured code repository.
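One requirement above calls out Glue's relationalize, which splits nested records into linked flat tables suitable for Redshift. As a rough, pure-Python illustration of that idea — this is a conceptual sketch with hypothetical field names, not the actual Glue API:

```python
# Pure-Python illustration of the idea behind Glue's relationalize:
# flatten one level of nesting into a parent row plus child tables.

def relationalize_one(record, root="root"):
    """Split a nested dict into {table_name: [flat_rows]}."""
    tables = {root: [{}]}
    for key, value in record.items():
        if isinstance(value, list):
            # An array field becomes its own child table, linked by index.
            child = f"{root}_{key}"
            tables[child] = [{"index": i, **item} for i, item in enumerate(value)]
        elif isinstance(value, dict):
            # Struct fields are flattened into dotted column names.
            for k, v in value.items():
                tables[root][0][f"{key}.{k}"] = v
        else:
            tables[root][0][key] = value
    return tables

worker = {
    "id": 7,
    "name": {"first": "Ada", "last": "Lovelace"},
    "roles": [{"title": "Engineer"}, {"title": "Lead"}],
}
out = relationalize_one(worker, root="worker")
```

The real Glue transform operates on DynamicFrames and recurses through arbitrarily deep nesting, but the output shape is the same: one flat parent table plus one table per array, joinable back to the parent.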

Company

Codeworks IT Careers

Our mission is to be a pillar in the business technology ecosystem by cultivating partnerships as a trusted advisor for quality solution delivery, empowering our consultants and employees to grow professionally and personally, and strengthening the communities we serve.

Funding

Current Stage
Growth Stage
Company data provided by crunchbase