Lasso Informatics · 16 hours ago

Senior Data Engineer - ETL & Integrations

Lasso Informatics is a SaaS start-up focused on research data management and analysis. The company is seeking a Senior Data Engineer to build and operate ETL pipelines and integrations, working with diverse data formats and collaborating with various teams to ensure data quality and reliability.

Information Technology & Services

Responsibilities

Design, build, and operate end-to-end ETL pipelines and data integrations
Develop BPMN-based workflows to model and manage complex process flows
Build integration services and transformation logic in Java and Python
Integrate internal and external systems using REST APIs, API gateways, and asynchronous messaging
Apply appropriate data transformation and loading strategies (batch and near-real-time)
Design and optimize PostgreSQL schemas, queries, indexes, and bulk loading mechanisms
Work with structured and semi-structured data formats (JSON, CSV, XML, Parquet, Avro)
Ensure data quality, consistency, and reliability through validation, deduplication, and idempotency (see the sketch after this list)
Monitor, troubleshoot, and optimize production ETL pipelines and integration services
Collaborate with engineering, product, and external partners on integration contracts and data models
Document ETL pipelines, workflows, schemas, and operational procedures
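
For context, the kind of idempotent loading this role calls for (validation, deduplication, and conflict-safe upserts into PostgreSQL) might look roughly like the sketch below. It is illustrative only: the samples table, its columns, and the load_samples helper are hypothetical, and it assumes the psycopg2 driver and a unique key on sample_id.

```python
# A minimal sketch, assuming psycopg2 and a hypothetical "samples" table
# with a unique constraint on sample_id. Names and columns are illustrative.
import psycopg2
from psycopg2.extras import execute_values

def load_samples(conn, records):
    """Validate, deduplicate, and upsert a batch of records idempotently."""
    # Basic validation: drop rows missing the business key or the value.
    valid = [r for r in records if r.get("sample_id") and r.get("value") is not None]

    # Deduplicate within the batch, keeping the last occurrence per key.
    by_key = {r["sample_id"]: r for r in valid}
    rows = [(r["sample_id"], r["value"], r.get("source", "unknown"))
            for r in by_key.values()]

    # ON CONFLICT makes the load idempotent: re-running the same batch
    # updates existing rows instead of inserting duplicates.
    sql = """
        INSERT INTO samples (sample_id, value, source)
        VALUES %s
        ON CONFLICT (sample_id) DO UPDATE
        SET value = EXCLUDED.value,
            source = EXCLUDED.source
    """
    with conn.cursor() as cur:
        execute_values(cur, sql, rows)
    conn.commit()
```

Because the upsert is keyed on the business identifier, retries and replays of the same batch are safe, which is the property that matters when operating pipelines in production.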

Qualifications

ETL pipelines, Python, PostgreSQL, REST APIs, Java, BPMN workflows, Data transformation, Distributed systems, Troubleshooting, Debugging, Operational mindset, Architectural patterns

Required

5+ years of experience in data engineering, ETL, or systems integration roles
Strong experience building and operating production ETL pipelines
Proficiency in Python and/or Java in backend or data-processing environments
Strong PostgreSQL and SQL experience, including performance tuning
Hands-on experience with data transformation and loading techniques (ETL vs ELT, incremental loads, CDC concepts); see the sketch after this list
Experience integrating systems via REST APIs and API gateways
Experience working with BPMN-based workflow engines or workflow modeling tools
Experience operating distributed systems in production environments
Strong troubleshooting and debugging skills, and an operational mindset
Familiarity with common architectural patterns (e.g., layered architectures, event-driven systems, integration patterns)
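
As an illustration of the incremental-load and CDC concepts mentioned above, a simple watermark-based extract might look roughly like this. The sketch assumes the psycopg2 driver; the source_events and etl_watermarks tables, their columns, and the pipeline name are hypothetical, and it presumes the source system maintains an updated_at column.

```python
# A minimal sketch of a watermark-based incremental load, assuming psycopg2
# and hypothetical source_events / etl_watermarks tables.
import psycopg2

def incremental_extract(conn, pipeline_name="events_sync"):
    """Pull only rows changed since the last recorded watermark."""
    with conn.cursor() as cur:
        # Read the high-water mark recorded by the previous run.
        cur.execute(
            "SELECT last_loaded_at FROM etl_watermarks WHERE pipeline = %s",
            (pipeline_name,),
        )
        row = cur.fetchone()
        watermark = row[0] if row else None

        # Fetch only rows updated after the watermark (full load on first run).
        if watermark is None:
            cur.execute("SELECT id, payload, updated_at FROM source_events")
        else:
            cur.execute(
                "SELECT id, payload, updated_at FROM source_events "
                "WHERE updated_at > %s",
                (watermark,),
            )
        rows = cur.fetchall()

        # Advance the watermark so the next run starts where this one ended.
        if rows:
            new_watermark = max(r[2] for r in rows)
            cur.execute(
                """
                INSERT INTO etl_watermarks (pipeline, last_loaded_at)
                VALUES (%s, %s)
                ON CONFLICT (pipeline) DO UPDATE
                SET last_loaded_at = EXCLUDED.last_loaded_at
                """,
                (pipeline_name, new_watermark),
            )
    conn.commit()
    return rows
```

A full CDC setup would read the database's change stream (for example via PostgreSQL logical decoding or a tool such as Debezium) rather than polling a timestamp column, but the watermark pattern is a common lightweight alternative for incremental loads.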

Preferred

Experience with specific BPMN workflow engines such as Camunda, Zeebe, or Flowable
Experience with event-driven architectures or message queues
Experience with cloud platforms (AWS, GCP, or Azure)
Experience with Docker and Kubernetes
Experience building CI/CD pipelines for data or backend systems
Experience working in regulated or compliance-driven environments

Benefits

Competitive salary and benefits package
Opportunities for leadership and professional growth
Collaborative team committed to innovation, quality, and scientific impact
Access to training resources and ongoing professional development

Company

Lasso Informatics

We are driven by a passion for scientific research and a commitment to making complex study design, data workflows, and data analysis simple, reliable, and repeatable.

Funding

Current Stage: Growth Stage