Data Engineer
A confidential company is seeking a Data Engineer to design, build, and maintain robust data pipelines and architectures. The role involves ensuring data reliability, scalability, and accessibility while collaborating with data scientists, analysts, and engineering teams.
Industry: Marketing & Advertising
Responsibilities
Design, develop, and maintain scalable ETL/ELT pipelines to ingest, transform, and store structured and unstructured data
Collaborate with Data Scientists, Analysts, and Product teams to understand data requirements and deliver solutions that support analytics and ML initiatives
Ensure data quality, integrity, and consistency across multiple sources and platforms
Implement data models, schemas, and storage solutions optimized for performance, scalability, and cost efficiency
Monitor, troubleshoot, and optimize data pipelines and workflows for reliability and performance
Participate in data governance, documentation, and compliance initiatives to maintain secure and auditable data practices
Evaluate and implement modern data engineering tools, frameworks, and best practices to improve existing systems
Collaborate with cross-functional teams to support data-driven decision-making and reporting
Qualifications
Required
Strong experience in designing, building, and maintaining data pipelines and ETL/ELT processes
Proficiency in SQL and relational database systems (e.g., PostgreSQL, MySQL, Redshift)
Experience with programming languages such as Python, Java, or Scala for data processing
Familiarity with cloud platforms and data services (AWS, GCP, or Azure)
Knowledge of data warehousing concepts, dimensional modeling, and big data technologies (e.g., Spark, Hadoop)
Strong problem-solving and analytical skills with attention to data quality and reliability
Preferred
3+ years of professional experience as a Data Engineer or in a similar role
Experience with workflow orchestration tools (e.g., Airflow, Prefect)
Experience building data pipelines to support machine learning models or AI applications
Knowledge of data governance, security, and compliance best practices
Familiarity with real-time data streaming frameworks (e.g., Kafka, Kinesis)
Prior experience in a fully remote or distributed team environment