Jobs via Dice · 4 hours ago
Senior Data Engineer
Dice is the leading career destination for tech experts at every stage of their careers, and they are seeking a Senior Data Engineer for their client, Info Dinamica Inc. The role involves designing and building scalable data pipelines, ensuring data quality and compliance, and collaborating with stakeholders to translate business requirements into technical solutions.
Computer Software
Responsibilities
Designs and builds scalable data pipelines, integrates diverse sources, and optimizes storage/processing using Hadoop ecosystem and Greenplum
Ensures data quality, security, and compliance through governance frameworks
Implements orchestration, monitoring, and performance tuning for reliable, cost-efficient operations
Expertise in Hadoop ecosystem (HDFS, Hive, Spark, Kafka) and MPP databases like Greenplum for large-scale data processing and optimization
Collaborates with Data Owners and stakeholders to translate business rules into technical solutions
Delivers curated datasets, lineage, and documentation aligned with SLAs and regulatory standards
Subject matter expert with experience interacting with clients, understanding requirements, and guiding the team
Documents requirements clearly with a defined scope, and plays an anchor role in setting the right expectations and delivering on schedule
Design and develop scalable data pipelines using Hadoop ecosystem and Greenplum for ingestion, transformation, and storage of large datasets
Optimize data models and queries for performance and reliability, ensuring compliance with security and governance standards
Implement data quality checks, monitoring, and orchestration workflows for timely and accurate data delivery
Collaborate with Data Owners and business teams to translate requirements into technical solutions and maintain documentation and lineage
Qualification
Required
Strong work experience as a Data Engineer (Big Data Hadoop, Greenplum, etc.)
Preferred
Experience working in an Agile environment
Company
Jobs via Dice
Welcome to Jobs via Dice, the go-to destination for discovering the tech jobs you want.
Funding
Current Stage
Early Stage