EBSCO Information Services · 3 hours ago
Senior MLOps Engineer
EBSCO Information Services is a leader in delivering optimized research experiences through innovative technology. As a Senior MLOps Engineer, you will design and maintain ML pipelines within an AWS-based ecosystem, collaborating with data engineers and scientists to operationalize ML models and ensure their reliability and security.
Content Creators · Database · SaaS
Responsibilities
Design, build, and maintain ML Ops pipelines supporting model training, validation, and deployment across AWS environments
Implement automation for model packaging, testing, deployment, and monitoring using CI/CD best practices
Collaborate with data engineers and data scientists to operationalize ML workloads within the data lakehouse ecosystem
Develop and maintain integrations between data ingestion, feature stores, and model repositories
Apply infrastructure-as-code (Terraform, AWS CDK, CloudFormation) to automate ML pipeline infrastructure
Implement and manage model versioning, reproducibility, and lineage tracking using tools such as MLflow or SageMaker Model Registry
Define and automate monitoring, alerting, and retraining strategies for deployed models
Ensure all ML infrastructure and pipelines meet enterprise security, compliance, and governance standards
Participate in code reviews, knowledge sharing, and continuous improvement of ML Ops practices
Mentor junior engineers and contribute to documentation, standards, and best practices for ML Ops across teams
Qualifications
Required
Bachelor's Degree in Computer Science, Data Engineering, or a related technical field, or equivalent experience
4+ years of professional experience in software, data, or ML engineering
2+ years of direct experience implementing and maintaining ML pipelines in production
Strong proficiency in Python and familiarity with ML frameworks such as PyTorch, TensorFlow, or Scikit-learn
Hands-on experience with AWS services (SageMaker, Step Functions, Lambda, ECR, S3, Glue, IAM)
Solid understanding of CI/CD and containerization (Docker)
Experience building CI/CD pipelines (Jenkins, GitHub Actions, etc.)
Experience with infrastructure-as-code and automation (Terraform, AWS CDK, or CloudFormation)
Strong understanding of data pipelines, ETL/ELT concepts, and feature engineering in a lakehouse environment
Proven ability to apply software engineering practices to machine learning workflows
Strong communication and collaboration skills across multidisciplinary teams
Preferred
Experience with feature stores, data catalogs, and metadata management
Familiarity with model governance and compliance frameworks
Experience with model monitoring and drift detection tools (CloudWatch or custom solutions)
Understanding of data lakehouse technologies such as Apache Iceberg or Delta Lake
Contributions to open-source ML Ops or DevOps tooling
Experience in Agile development environments and cross-functional collaboration
Company
EBSCO Information Services
At EBSCO Information Services, we hire great people and let them thrive.
H1B Sponsorship
EBSCO Information Services has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. The information below is provided for reference. (Data powered by the US Department of Labor)
Trends of Total Sponsorships
2024 (1)
2021 (7)
2020 (7)
Funding
Current Stage: Late Stage
Company data provided by Crunchbase