ALTAK INC
Ab Initio Developer
Altak Group is seeking an experienced Ab Initio Developer to build and optimize high-throughput, resilient data pipelines. The role involves designing, developing, and supporting Ab Initio graphs and plans, integrating with AWS and/or Azure services, and collaborating with cross-functional teams to deliver governed, production-grade data at scale.
Consumer Electronics · Electronics · Manufacturing
Responsibilities
Design & develop Ab Initio graphs/plans using GDE, Co>Operating System, EME, and Conduct>It to ingest, transform, and publish data
Migrate pipelines from on-prem to cloud targets (batch and near-real-time), ensuring restartability, parameterization (PDL), metadata management, and resiliency
Integrate with AWS (e.g., S3, Redshift, Glue/Glue Catalog, Lambda, EMR, MSK/Kinesis) and/or Azure (e.g., ADLS Gen2, Synapse, Data Factory, Event Hubs)
Build connectors/jobs for Oracle, SQL Server, DB2, files, MQ/Kafka; implement incremental loads/CDC patterns where applicable
Tune memory/parallelism/partitioning, optimize file and database I/O, and implement robust error handling, alerting, and SLA monitoring
Apply data quality checks (e.g., Ab Initio DQE), lineage/metadata practices, and enterprise security controls (IAM, KMS/Key Vault, tokenization)
Maintain version control (Git/EME), automate build and deployment, manage environment promotion, and coordinate infrastructure with platform teams
Troubleshoot production issues, perform capacity planning, and deliver clear runbooks
Partner with data architects, platform/cloud engineers, and analysts; contribute to standards and best practices
Qualifications
Required
5+ years hands-on Ab Initio development (GDE, EME, Co>Operating System; Conduct>It scheduling)
Proven delivery of high-volume batch and near-real-time pipelines, including restart/recovery, parameterization, and metadata-driven design
Strong SQL and performance-tuning skills across major RDBMS (Oracle, SQL Server, DB2); Redshift/Synapse/Snowflake a plus
Production experience integrating Ab Initio with AWS and/or Azure for data landing, processing, and analytics
Solid Linux/Unix fundamentals and scripting (bash, Python preferred)
Experience with Kafka/MQ (publish/subscribe), file transfer patterns, and secure networking (VPC/VNet, PrivateLink/Private Endpoints)
Familiarity with data quality, lineage, and compliance for sensitive data (e.g., PHI/PII)
Excellent troubleshooting skills and the ability to own solutions end-to-end
Preferred
Cloud certs (e.g., AWS Data Analytics / Solutions Architect, Azure Data Engineer Associate)
Experience with Glue/Spark or Synapse Spark for complementary processing; EMR/Databricks exposure a plus
Orchestration with Airflow / ADF alongside Conduct>It; event-driven designs with Lambda/Functions
IaC awareness (Terraform/CloudFormation/Bicep) to collaborate with platform teams
Experience with Snowflake or Delta Lake patterns on S3/ADLS
Monitoring/observability (CloudWatch, Azure Monitor, Prometheus/Grafana, Splunk)
Agile/SAFe delivery in regulated environments (healthcare/financial services)
Company
ALTAK INC
Current stage: Early stage (company data provided by Crunchbase)