Releady · 3 months ago
Full Stack Data Engineer - Senior
Releady is a company focused on delivering secure and high-quality software solutions. The Senior Full Stack Data Engineer will be responsible for designing, developing, and deploying cloud-native data pipelines while implementing DevSecOps best practices and collaborating with cross-functional teams.
Consulting · Recruiting · Staffing Agency
Responsibilities
Build data pipelines: Create, maintain, and optimize workloads from development to production for specific use cases, with a focus on cloud-native solutions and modern frameworks
Develop efficient and cost-effective implementations, leveraging reusable components where possible
Drive operational excellence, including incident management, AI-assisted process automation, and smooth deployments of product and platform features
Monitor and manage software configuration changes to anticipate and address data reliability and customer satisfaction issues, leveraging cloud monitoring tools and practices
Coordinate sustaining support for multiple application platforms or business processes, ensuring seamless integration and operation in a cloud environment
Apply significant knowledge of IT and healthcare industry trends
Analyze and remediate root causes, including deficiencies in technology, process, or resource capabilities
Work in an agile/DevSecOps pod model alongside solution leads, data modelers, analysts, business partners, and other developers in the delivery of data
Support monitoring and tuning of application code to ensure optimal availability, performance, and resource utilization
Provide technical expertise, working with analysts and business users to translate complex and varied functional specifications into technical designs
Qualifications
Required
Requires a bachelor's degree in Computer Science, Information Technology, Management Information Systems, or a related field (or equivalent experience), with a minimum of 3 years of relevant experience in enterprise application support and cloud-based solution delivery
Experience with a cloud platform, preferably Azure (or AWS or GCP), and its related technical stack, including ADLS, Synapse, Azure Data Factory, etc.
Experience with Snowflake and/or Databricks
Solid experience with JavaScript, along with responsive CSS design practices
Strong technical understanding of data modeling (Data Vault 2.0), data mining, master data management, data integration, data architecture, data virtualization, data warehousing, and data quality techniques
Hands-on experience with data management technologies such as Informatica PowerCenter/IICS, Collibra, Reltio Master Data Management, dbt Cloud, dbt Core, Denodo, and GoldenGate or Striim replication
Working knowledge of testing tools and systems, and of job scheduling software (Tidal, Control-M)
Basic experience working with data governance and data security teams, specifically information stewards and privacy and security officers, to move data pipelines into production with the appropriate data quality, governance, and security standards and certification
Proficiency in Unix command-line operations, including directory navigation, file manipulation, shell scripting, and Python, along with utilities like awk and sed
Hands-on experience with CI/CD pipelines (e.g., Bitbucket Pipelines, GitHub Actions) and Infrastructure as Code tools like Ansible to automate cloud deployments
Demonstrated ability to influence and collaborate with stakeholders, vendors, and cross-functional teams, with excellent verbal and written communication skills to translate and execute technical deliverables
Strong process orientation with the ability to follow and improve procedures for critical maintenance and operational tasks
Preferred
Experience in the healthcare industry