
MARA · 4 weeks ago

Senior ML Engineer – ML/Inference

MARA is redefining the future of sovereign, energy-aware AI infrastructure. The company is seeking a Machine Learning Engineer to lead the deployment, optimization, and lifecycle management of AI models, focusing on efficiency, observability, and scalability in production environments.

Computer Software
H1B Sponsor Likely

Responsibilities

Own the end-to-end lifecycle of ML model deployment—from training artifacts to production inference services
Design, build, and maintain scalable inference pipelines using modern orchestration frameworks (e.g., Kubeflow, Airflow, Ray, MLflow)
Implement and optimize model serving infrastructure for latency, throughput, and cost efficiency across GPU and CPU clusters
Develop and tune Retrieval-Augmented Generation (RAG) systems, including vector database configuration, embedding optimization, and retriever–generator orchestration (a minimal sketch follows this list)
Collaborate with product and platform teams to integrate model APIs and agentic workflows into customer-facing systems
Evaluate, benchmark, and optimize large language and multimodal models using quantization, pruning, and distillation techniques
Design CI/CD workflows for ML systems, ensuring reproducibility, observability, and continuous delivery of model updates
Contribute to the development of internal tools for dataset management, feature stores, and evaluation pipelines
Monitor production model performance, detect drift, and drive improvements to reliability and explainability
Explore and integrate emerging agentic and orchestration frameworks (LangChain, LangGraph, CrewAI, etc.) to accelerate development of intelligent systems
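
To give the RAG and serving responsibilities above a concrete shape, here is a minimal, hypothetical Python sketch of a retriever–generator service exposed through FastAPI. The in-memory index, placeholder embedding function, and stitched-together answer are illustrative assumptions only; they stand in for a real vector database (Milvus, Weaviate, LanceDB, pgvector) and an LLM served via vLLM or Triton, and do not describe MARA's actual stack.

# Hypothetical retriever–generator (RAG) sketch; every component is a placeholder.
import hashlib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
DOCS = [
    "MARA builds energy-aware AI infrastructure.",
    "Inference pipelines are served across GPU and CPU clusters.",
]

def embed(text: str) -> np.ndarray:
    # Placeholder embedding: deterministic pseudo-random unit vector per text.
    seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:4], "big")
    vec = np.random.default_rng(seed).standard_normal(64)
    return vec / np.linalg.norm(vec)

INDEX = np.stack([embed(d) for d in DOCS])

class Query(BaseModel):
    question: str
    top_k: int = 1

@app.post("/rag")
def rag(query: Query) -> dict:
    # Retrieve: cosine similarity against the toy index (embeddings are unit-norm).
    scores = INDEX @ embed(query.question)
    hits = [DOCS[i] for i in np.argsort(scores)[::-1][: query.top_k]]
    # Generate: a real deployment would prompt an LLM with the retrieved context here.
    return {"context": hits, "answer": "Based on: " + " ".join(hits)}

Served locally with uvicorn (for example, uvicorn rag_service:app, where the module name is hypothetical), the endpoint returns the retrieved context alongside the assembled answer, which is the retriever–generator orchestration pattern in its smallest form.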

Qualifications

ML infrastructure engineering · Inference optimization · MLOps practices · Python · Distributed systems · Collaboration skills · Documentation skills

Required

5+ years of experience in applied ML or ML infrastructure engineering
Proven expertise in model serving and inference optimization (TensorRT, ONNX, vLLM, Triton, DeepSpeed, or similar); a small benchmarking sketch follows this list
Strong proficiency in Python, with experience building APIs and pipelines using FastAPI, PyTorch, and Hugging Face tooling
Experience configuring and tuning RAG systems (vector databases such as Milvus, Weaviate, LanceDB, or pgvector)
Solid foundation in MLOps practices: versioning (MLflow, DVC), orchestration (Airflow, Kubeflow), and monitoring (Prometheus, Grafana, Sentry)
Familiarity with distributed compute systems (Kubernetes, Ray, Slurm) and cloud ML stacks (AWS SageMaker, GCP Vertex AI, Azure ML)
Understanding of prompt engineering, agentic frameworks, and LLM evaluation
Strong collaboration and documentation skills, with the ability to bridge ML research, DevOps, and product development
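
To make the inference-optimization requirement concrete, here is a small, hedged benchmarking sketch: it applies PyTorch dynamic int8 quantization to a toy MLP and compares per-batch latency against the FP32 original. The architecture, batch size, and run count are illustrative assumptions; production work would target real serving stacks such as TensorRT, vLLM, or Triton.

# Hypothetical benchmark: dynamic int8 quantization of a toy MLP vs. its FP32 copy.
import time
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 256)).eval()
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
batch = torch.randn(64, 1024)

def bench(m: nn.Module, runs: int = 50) -> float:
    # Average wall-clock latency per forward pass, in milliseconds (CPU).
    with torch.no_grad():
        m(batch)  # warm-up
        start = time.perf_counter()
        for _ in range(runs):
            m(batch)
    return (time.perf_counter() - start) / runs * 1e3

print(f"fp32: {bench(model):.2f} ms  |  int8 dynamic: {bench(quantized):.2f} ms")

The same measure-before, measure-after loop generalizes to pruning, distillation, and serving-level changes; the toy numbers matter less than the benchmarking discipline itself.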

Preferred

Background in HPC, ML infrastructure, or sovereign/regulated environments
Familiarity with energy-aware computing, modular data centers, or ESG-driven infrastructure design
Experience collaborating with European and global engineering partners
Strong communicator who can bridge engineering, business, and vendor ecosystems seamlessly

Company

MARA

MARA (NASDAQ: MARA) deploys digital energy technologies to advance the world's energy systems.

H1B Sponsorship

MARA has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. Additional information is provided below for reference. (Data powered by the US Department of Labor)
Distribution of Different Job Fields Receiving Sponsorship (chart; highlighted field represents a job field similar to this one)
Trends of Total Sponsorships: 2024 (1), 2023 (1)

Funding

Current Stage: Growth Stage

Leadership Team

Fred Thiel
Chairman & Chief Executive Officer
Salman Khan
Chief Financial Officer
Company data provided by Crunchbase