Senior Data Engineer - Python Airflow @ Worldpay | Jobright.ai
Worldpay · 9 hours ago

Senior Data Engineer - Python Airflow

Banking · Payments


Responsibilities

Develop and implement strategies for data engineering initiatives using Python, AWS, Airflow, and Snowflake technologies
Monitor trends in the data engineering industry and stay up to date on current technologies
Collaborate with the product team to develop solutions that meet its goals and objectives
Act as a subject matter expert for Apache Airflow and provide technical guidance to team members
Install, configure, and maintain Astronomer Airflow environments
Build complex data engineering pipelines using Python and Airflow
Design, develop, and maintain scalable workflows and orchestration systems using Astronomer
Create and manage Directed Acyclic Graphs (DAGs) to automate data pipelines and processes (see the sketch after this list)
Leverage AWS Glue, Step Functions, and other services for orchestrating data workflows
Develop custom operators and plugins for Astronomer to extend its capabilities
Integrate code with the defined CI/CD framework and the AWS services required for building secure data pipelines
Manage user access and permissions, ensuring data security and compliance with company policies
Implement and monitor security controls, including encryption, authentication, and network security
Conduct regular security audits and vulnerability assessments
Manage data ingestion and ETL processes
Automate routine tasks and processes using scripting and automation tools
Maintain comprehensive documentation for configurations, procedures, and troubleshooting steps
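
As referenced above, the DAG and custom-operator responsibilities lend themselves to a short illustration. The following is a minimal sketch, assuming a recent Airflow 2.x environment (2.4+ for the schedule argument), of a daily pipeline that stages a file and loads it into Snowflake; every name in it (DAG id, object keys, table, connection details, the SnowflakeLoadOperator class) is a hypothetical placeholder, not Worldpay's actual pipeline.

from datetime import datetime

from airflow.decorators import dag, task
from airflow.models.baseoperator import BaseOperator


class SnowflakeLoadOperator(BaseOperator):
    """Toy custom operator; real code would use the Snowflake provider's hook."""

    # Make stage_key Jinja-templated so it can pull from XCom at runtime.
    template_fields = ("stage_key",)

    def __init__(self, table: str, stage_key: str, **kwargs):
        super().__init__(**kwargs)
        self.table = table
        self.stage_key = stage_key

    def execute(self, context):
        # Placeholder for a COPY INTO <table> FROM @stage statement.
        self.log.info("Loading %s into %s", self.stage_key, self.table)


@dag(
    dag_id="example_s3_to_snowflake",  # hypothetical DAG name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["example"],
)
def example_s3_to_snowflake():
    @task
    def extract() -> str:
        # Real code would fetch the object key from S3 (e.g. via the Amazon
        # provider's S3Hook); here it just returns a placeholder key.
        return "raw/transactions/2024-01-01.csv"

    @task
    def transform(key: str) -> str:
        # Placeholder transformation; real logic would clean and validate data.
        return key.replace("raw", "curated")

    load = SnowflakeLoadOperator(
        task_id="load_to_snowflake",
        table="ANALYTICS.TRANSACTIONS",
        stage_key="{{ ti.xcom_pull(task_ids='transform') }}",
    )

    transform(extract()) >> load


example_s3_to_snowflake()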

Qualifications


Python · Airflow · AWS · Snowflake · ETL development · Data engineering · Apache Airflow · CI/CD tools · Scripting languages · Databricks · PySpark · Hadoop · AWS Glue · Kafka · Bash · PowerShell · Terraform · AWS CLI · Prometheus · Grafana · ELK stack · AWS certification · Airflow Certification · Python Certification

Required

5+ years in a pivotal Software/Data Engineering role with deep exposure to modern data stacks, particularly Snowflake, Airflow, DBT, and AWS data services.
Proficiency in cloud platforms such as AWS, Azure, or Google Cloud.
Experience with the ETL development life cycle and ETL pipeline best practices, including thorough hands-on data warehouse work using a combination of Python, Snowflake, and AWS services.
Understanding of data pipelines and modern approaches to automating them with cloud-based testing, with the ability to clearly document implementations so others can easily understand the requirements, implementation, and test conditions.
Experience serving as the Databricks account owner, including security and privacy setup, marketplace plugins, and integration with other tools.
Strong experience with Amazon Web Services (AWS) accounts and high-level usage monitoring.
Proficiency in scripting languages such as Python, Bash, or PowerShell.
Experience with CI/CD tools like Jenkins, GitLab CI, CircleCI, or similar.
Familiarity with monitoring and logging tools (e.g., Prometheus, Grafana, ELK stack).
Excellent problem-solving skills and attention to detail.
Strong communication skills and the ability to work collaboratively in a team environment.
Experience with the AWS CLI and networking.
Experience with architecting and maintaining high-availability production systems.
Experience with developing monitoring architecture and implementing monitoring agents, dashboards, escalations, and alerts.
Knowledge of security controls for the public cloud (encryption of data in motion and at rest, and key management).
Demonstrated knowledge and hands-on experience with AWS alerting/monitoring tools.
Experience with infrastructure as code (IaC) tools such as Terraform.
AWS certification.
Airflow Certification.
Python Certification.

Preferred

Experience with PySpark/Hadoop, AWS Glue ETL, and/or Databricks with Python is preferred (a minimal PySpark sketch follows this list).
Data engineering experience with AWS services (S3, Lambda, Glue, Lake Formation, EMR), Kafka, streaming, and Databricks is highly preferred.
Experience with Astronomer and/or Airflow.
Experience with Unity Catalog migration, workspaces, and audit logs.
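
As referenced above, here is a minimal, self-contained PySpark sketch of the kind of batch ETL the preferred skills describe, assuming a PySpark environment is available; the bucket paths and column names are hypothetical placeholders, not part of the actual role.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Read raw CSV data (path is a placeholder).
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/transactions/")

# Light cleanup: cast the amount column and drop obviously bad rows.
curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
)

# Write partitioned Parquet for downstream consumers (path is a placeholder).
curated.write.mode("overwrite").partitionBy("transaction_date").parquet(
    "s3://example-bucket/curated/transactions/"
)

spark.stop()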

Benefits

Time to support charities and give back to your community
Parental leave policy
Global recognition platform
Virgin Pulse access
Global employee assistance program

Company

Worldpay

Worldpay provides electronic payment and banking services.

Funding

Current Stage: Public Company
Total Funding: unknown
2023-07-06: Acquired
2015-10-13: IPO

Leadership Team

Kevin Hennessy
VP, Strategic Solutions
Patrick Keaney
Business Development Manager
Company data provided by Crunchbase