Data Engineer III @ Availity | Jobright.ai
Data Engineer III jobs in North Dakota, United States
Fewer than 25 applicants
This job has closed.

Availity · 17 hours ago

Data Engineer III

Health Care · Fitness
Growth Opportunities
No H1B sponsorship


Responsibilities

Develop a scalable, resilient cloud data platform and scalable data pipelines.
Ensure industry best practices around data pipelines, metadata management, data quality, data governance, and data privacy.
Build highly scalable AWS Infrastructure (from scratch or through 3rd party products) to enable Big Data Processing in the platform.
Optimize cloud resource usage to minimize costs while maintaining system reliability, including effective use of reserved instances and spot instances.
Apply performance-sensitive development best practices and troubleshoot across the data platform using monitoring tools (e.g., Splunk, New Relic, CloudWatch) to ensure performance measurement and monitoring.
Participate in coding best practices, guidelines and principles that help engineers write clean, efficient, and maintainable code.
Participate in code reviews to catch issues, improve code quality, and provide constructive feedback to team members.

Qualification


Spark framework, AWS EMR, ETL tasks, AWS Terraform, Airflow, Python, SQL, Java, Scala, big data technologies, Linux, Bash, Node.js, Oracle, SQL Server, Git, R, AWS Certified Developer - Associate, AWS Certified Solutions Architect - Associate, soft skills, collaborative attitude, problem-solving proficiency

Required

In-depth understanding of the Spark framework and scripting languages (e.g., Python, Bash, Node.js).
Advanced knowledge of programming languages (e.g., SQL, Java, Scala) to design, build, and maintain complex data processing, ETL (Extract, Transform, Load) tasks, and AWS automation.
Experience working with big data and large datasets (100 GB or more).
Experience with Terraform on AWS.
Experience building workflows with Airflow.
A firm understanding of unit testing.
Possess in-depth knowledge of AWS services and data engineering tools to diagnose and solve complex issues efficiently, specifically AWS EMR for big data processing.
In-depth understanding of Git or other distributed version control systems.
Excellent communication skills, essential to performing at maximum efficiency within the team.
Collaborative attitude. This role is part of a larger, more dynamic team that nurtures collaboration.
Strong technical, process, and problem-solving proficiency.
Must have experience with SQL and relational database systems (e.g., Oracle, SQL Server).
Must have experience with Linux.

Preferred

4+ years of development experience with big data technologies.
4+ years using Apache Spark.
Experience with cloud computing platforms (e.g., AWS, Azure, Google Cloud).
Experience with Software Development Life Cycle.
Experience with Software Development Best Practices (Version control, Change Management, Unit Testing, etc.).
Familiarity with standard data science toolkits, such as R and Python is a plus.
Demonstrated experience with operationalization and observability in a production environment.
Holding relevant certifications (e.g., AWS Certified Developer - Associate or AWS Certified Solutions Architect - Associate) would be a plus.

Benefits

Generous HSA company contribution
Healthcare
Vision
Dental benefits
401k match program
Unlimited PTO for salaried associates + 9 paid holidays
Education reimbursement
Paid Parental Leave for both moms and dads
Gym memberships reimbursement
Participation in racing events
Weight management programs

Company

Availity

Availity offers healthcare professionals free access to real-time information and instant responses.

Funding

Current Stage
Late Stage
Total Funding
$200M
Key Investors
Novo Holdings, Francisco Partners
2021-07-07 · Secondary Market
2017-10-19 · Private Equity · $200M

Leadership Team

Russ Thomas
CEO
Frank Petito
Chief Financial Officer
Company data provided by crunchbase