myGwork - LGBTQ+ Business Community · 2 days ago
Senior Data Engineer
Responsibilities
Design, develop, and maintain scalable data pipelines using Snowflake, Apache Airflow, dbt (SQL), and Python on AWS services.
Support and contribute to the continuous improvement, tuning, application and infrastructure development, process controls, and upgrades of the data platform.
Provide guidance, hands-on development, and operational support for the deployment of database scripts and changes across multiple environments.
Collaborate with Moody's technical teams and business owners as needed during the design and implementation.
Manage individual time and tasks effectively.
Collaborate with cross-functional teams to understand data requirements and implement effective solutions.
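As a purely illustrative sketch of the pipeline work described in the responsibilities above (the function names, schema, and data are invented for this example, not taken from the posting), a minimal extract-transform-load flow in plain Python:

```python
def extract():
    # Stand-in for pulling rows from a source system (e.g. into Snowflake).
    return [{"region": "EU", "amount": 10}, {"region": "US", "amount": 5},
            {"region": "EU", "amount": 7}]

def transform(rows):
    # Aggregate amounts per region -- the kind of step a dbt model
    # would typically express in SQL.
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0) + row["amount"]
    return totals

def load(totals, target):
    # Stand-in for writing results to a warehouse table.
    target.update(totals)

warehouse = {}
load(transform(extract()), warehouse)
# warehouse now holds the per-region totals
```

In the stack this role describes, Airflow would schedule and orchestrate these steps, dbt would own the SQL transformations, and Snowflake would be the target warehouse.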
Qualification
Required
Bachelor's degree or equivalent experience; Master's degree is a plus
3+ years of experience contributing to and providing technical leadership in a data engineering or software development team
Hands-on experience designing and developing data integration/ETL pipelines across various data sources and data formats
Knowledge of best practices for Airflow DAG management, scheduling, and monitoring.
Experience with implementing data quality checks, error handling, and retries in Airflow workflows.
Hands-on experience designing, developing, and maintaining data pipelines using Apache Airflow, including creating custom operators, sensors, and plugins.
Experience across all phases of software development, working with Agile teams, product owners, and external stakeholders
Experience driving technical ideas and communicating them clearly to both technical and non-technical audiences at all levels of the organization
Strong development, testing, debugging skills at all levels (unit, system, integration, and performance testing) along with detail-oriented documentation skills
Effective communication and problem-solving skills.
Strong database engineering skills, with an emphasis on PostgreSQL, DynamoDB, and Snowflake
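The Airflow requirements above (retries, error handling, data quality checks) can be illustrated with a minimal, framework-free Python sketch. In a real DAG these concerns map to Airflow's `retries`/`retry_delay` task arguments and a downstream validation task; the helper names here (`run_with_retries`, `check_row_count`) are hypothetical, invented for this example:

```python
import time

def run_with_retries(task, retries=3, retry_delay=0.0):
    """Run a callable, retrying on failure -- analogous to Airflow's
    `retries` / `retry_delay` task arguments."""
    for attempt in range(retries + 1):
        try:
            return task()
        except Exception:
            if attempt == retries:
                raise  # retries exhausted: surface the error to the scheduler
            time.sleep(retry_delay)

def check_row_count(rows, minimum=1):
    """A simple data quality gate: fail the pipeline if the extract
    produced fewer rows than expected."""
    if len(rows) < minimum:
        raise ValueError(f"expected >= {minimum} rows, got {len(rows)}")
    return rows

# A flaky extract that fails transiently and succeeds on its third call.
calls = {"n": 0}
def extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient source error")
    return [{"id": 1}, {"id": 2}]

rows = run_with_retries(extract, retries=3)
check_row_count(rows, minimum=1)
```

The same shape scales up in Airflow: the retry policy is declared per task rather than hand-coded, and the quality gate becomes its own task that blocks downstream loads when it raises.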
Preferred
Hands-on experience implementing, managing, supporting, and developing with AWS database technologies: AWS RDS (Microsoft SQL Server, PostgreSQL) and AWS DynamoDB
Strong understanding of data quality best practices and data governance principles.
Strong understanding of object-oriented programming, design, and architectural patterns
Experience with development technologies: Git (Bitbucket or GitHub), Jira, Rally, Asana, Jenkins, Cypress, Postman
Experience with AWS cloud technologies: ECS/Fargate, ECR (Elastic Container Registry), Lambda, DynamoDB, Aurora PostgreSQL, API (Application Programming Interface) Gateway, VPC (Virtual Private Cloud), ALB, NLB, Neptune
Experience working with big data technologies: Apache Airflow, Apache Spark, Apache Kafka, AWS Kinesis, AWS Redshift
Experience with development languages: Java (Spring Boot), Scala, PySpark
Benefits
Medical
Dental
Vision
Parental leave
Paid time off
401(k) plan with employee and company contribution opportunities
Life insurance
Disability insurance
Accident insurance
Discounted employee stock purchase plan
Tuition reimbursement
Company
myGwork - LGBTQ+ Business Community
myGwork is the largest global platform for the LGBTQ+ business community.
Funding
Current Stage: Early Stage
Total Funding: $4.77M
Key Investors: 24 Haymarket, Innovate UK
2023-08-17 · Series Unknown · $1.66M
2023-08-17 · Grant · Undisclosed
2021-12-07 · Series A · $2.12M
Company data provided by Crunchbase