
Abyss Solutions Ltd · 2 days ago

Level 05 - Junior Software / Data Operations Engineer

Abyss Solutions Ltd is dedicated to creating innovative AI and robotics solutions. The Data Operations Engineer will execute and support daily data-processing workflows, investigate pipeline failures, and collaborate with senior developers to resolve issues while performing rapid scripting tasks.

Artificial Intelligence (AI) · Computer Vision · Industrial Automation · Infrastructure · Machine Learning · Robotics

Responsibilities

Run daily data-processing workflows — Execute production workflows each day, ensuring processing jobs start, complete, and deliver on time
Monitor workflow execution and detect anomalies — Observe job statuses, alerts, and logs to detect failures or unusual patterns
Investigate pipeline failures to isolate root cause — Determine if a failure is due to data quality, metadata inconsistency, transformation bug, or orchestration error
Collaborate with senior developers on debugging sessions — Shadow or join senior devs in root-cause analysis of complex failures, and apply remediation
Re-run failed jobs and validate corrected output — After fixes, restart jobs, confirm correct outputs, and log results
Perform ad-hoc scripting tasks for rapid turnaround (“BlackOps”) — Script transformations, cleanup or merging of CSV/JSON files and handle point-cloud file preparation as needed
Maintain operational logs, trackers, and documentation — Update task-tracking systems (e.g., ClickUp), workflow execution logs, and document debugging learnings for SOP updates
Operate and manage cloud-based compute and orchestration infrastructure — Use and maintain cloud compute/storage resources and work within orchestration tool flows (e.g., Prefect flows); see the sketch after this list
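
To make these duties concrete, here is a minimal sketch of the kind of daily processing flow described above, assuming Prefect 2.x; the task names, file paths, and cleanup rule are hypothetical illustrations, not the company's actual pipeline:

```python
# A minimal sketch, assuming Prefect 2.x. Task names, file paths, and
# the cleanup rule are hypothetical, not the company's actual pipeline.
import csv
import json

from prefect import flow, task


@task(retries=2, retry_delay_seconds=60)  # retry transient ingest failures
def ingest(path: str) -> list[dict]:
    # Read raw CSV rows into dictionaries.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


@task
def transform(rows: list[dict]) -> list[dict]:
    # Example cleanup: strip whitespace and drop fully empty rows.
    cleaned = [{k: (v or "").strip() for k, v in row.items()} for row in rows]
    return [row for row in cleaned if any(row.values())]


@task
def publish(rows: list[dict], out_path: str) -> None:
    # Write cleaned output as JSON for downstream consumers.
    with open(out_path, "w") as f:
        json.dump(rows, f, indent=2)


@flow(log_prints=True)
def daily_processing(in_path: str = "input.csv", out_path: str = "output.json"):
    rows = transform(ingest(in_path))
    publish(rows, out_path)
    print(f"Published {len(rows)} rows to {out_path}")


if __name__ == "__main__":
    daily_processing()
```

In Prefect, each task run's state, retries, and logs surface in the orchestration UI, which is where the monitoring, anomaly-detection, and re-run duties above would typically play out.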

Qualifications

Python scripting · Data processing workflows · Cloud platforms · Orchestration tools · CSV/JSON handling · Analytical mindset · Collaboration skills · Documentation skills

Required

Minimum of 4-5 years of experience in data operations, pipeline monitoring, or data engineering support
Proficiency in Python scripting for data manipulation
Experience working with CSV, JSON, and ideally point-cloud file formats
Familiarity with cloud platforms and data workflow orchestration tools (e.g., Prefect)
Strong analytical and debugging mindset; able to distinguish data failures vs. code/flow failures
Excellent collaboration and communication skills; able to work with senior developers and operations teams
Comfortable in an operationally-focused, fast-paced environment with daily delivery demands
Knowledge of data-processing workflow concepts (ingestion, transformation, validation, output)
Knowledge of pipeline orchestration tools and how tasks, triggers, and retries function (e.g., Prefect, Airflow)
Knowledge of cloud compute and storage infrastructure (e.g., Google Cloud Platform, buckets, VMs, job scheduling)
Knowledge of data formats and their characteristics (CSV, JSON, point-cloud files, metadata structures)
Knowledge of debugging and root-cause-analysis methodologies (log inspection, stack trace interpretation, data vs code differentiation)
Knowledge of version control workflows (e.g., Git) and how code changes affect production pipelines
Knowledge of operational workflow management (SOPs, monitoring, alerts, task-tracking tools)
Skill in executing and monitoring data-processing workflows via orchestration tools
Skill in interpreting logs, alerts, and job execution statuses to identify anomalies
Skill in diagnosing whether a failure is due to data issues (e.g., malformed input, missing metadata) or processing/flow bugs
Skill in writing and modifying Python scripts for data transformation (CSV/JSON) and point-cloud preparation (see the sketch after this list)
Skill in operating cloud-based compute/storage resources (launching jobs, managing buckets, handling permissions)
Skill in collaborating with development and operations teams, communicating clearly about debugging outcomes and next steps
Skill in documenting operational flows, troubleshooting steps, and updating SOPs or workflow trackers
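
As a concrete illustration of the CSV/JSON scripting and the data-vs-code triage skills listed above, here is a minimal sketch; the exports/*.csv pattern and the required asset_id column are hypothetical assumptions, not part of the actual role:

```python
# A minimal sketch of ad-hoc CSV/JSON scripting with data-vs-code triage.
# The exports/*.csv pattern and required "asset_id" column are hypothetical.
import csv
import glob
import json
import sys


def merge_exports(pattern: str, out_path: str) -> int:
    """Merge CSV exports into one JSON file, failing loudly on bad data."""
    rows: list[dict] = []
    for path in sorted(glob.glob(pattern)):
        with open(path, newline="") as f:
            for lineno, row in enumerate(csv.DictReader(f), start=2):
                if not row.get("asset_id"):
                    # A missing required field points at the input data,
                    # not at this script: report it as a data issue.
                    sys.exit(f"{path}:{lineno}: missing asset_id (data issue)")
                rows.append(row)
    with open(out_path, "w") as f:
        json.dump(rows, f, indent=2)
    return len(rows)


if __name__ == "__main__":
    print(f"Merged {merge_exports('exports/*.csv', 'merged.json')} rows")
```

A run that exits with the data-issue message points at malformed input or missing metadata; a run that crashes with a Python traceback points at a code or flow bug — the distinction the requirement above asks candidates to make.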

Preferred

Prior experience with 3D scanning or CAD/point-cloud workflows
Experience with other workflow tools (Airflow, Dagster) and CI/CD pipelines
Familiarity with task-tracking and project management tools (e.g., ClickUp)
Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field

Company

Abyss Solutions Ltd

Abyss pioneers large-scale asset integrity management and maintenance planning using advanced, cloud-based inspection solutions for energy operators worldwide.

Funding

Current Stage: Growth Stage
Total Funding: $13.51M
Key Investors: Airtree Ventures

2022-11-21 · Series Unknown · $9.91M
2021-01-06 · Series Unknown · $2.54M
2017-05-09 · Seed · $1M

Leadership Team

Masood Naqshbandi · Co-Founder
Abraham Kazzaz · Chief Operating Officer
Company data provided by Crunchbase