Voucherify
AI Research Engineer (Python/NodeJS)
Voucherify is a small but profitable software company offering an API-first Promotion & Loyalty Engine. They are seeking an experienced AI Research Engineer to design and develop AI-powered services and integrate large language models into their SaaS product, impacting campaign performance for global customers.
Coupons · Developer APIs · Digital Marketing · E-Commerce Platforms · Loyalty Programs · SaaS
Responsibilities
Build AI-powered backend services – design and maintain APIs and intelligent services in Python and NodeJS, making them robust, scalable, and production-ready
Bring LLMs into production – integrate large language models with retrieval-augmented generation (RAG) pipelines and agent-based workflows, using frameworks like LangChain, LangGraph, or FastMCP
Design smart orchestration – develop multi-agent workflows, communication layers, and orchestration logic to help automate and optimize promo and loyalty campaign management
Connect the dots – ensure seamless and secure communication between AI modules, Voucherify’s core services, and external tools through event-driven interfaces
Leverage the cloud – deploy and optimize microservices on AWS, making full use of Kubernetes and cloud-native best practices
Collaborate & innovate – work side by side with frontend developers, system architects, and the sales and product teams to deliver features that push the boundaries of AI in SaaS
Continuously improve – tune performance, reliability, and fault tolerance, ensuring that what you build runs smoothly at enterprise scale
Qualifications
Required
5+ years of hands-on Python experience (bonus points if you also know NodeJS), preferably in AI, ML, or backend development
Proven experience designing and implementing RAG functionality and working with LLMs in production
Experience with agentic frameworks such as LangChain or LangGraph, and a passion for building tools that can 'reason' and act
Strong understanding of microservices, event-driven systems, and cloud-native architectures
Familiarity with vector databases and prompt engineering
Experience with MCP servers (SSE, streaming, async communication)
Proven track record of deploying ML solutions into production environments on AWS (Kubernetes, scaling, monitoring)
Strong grasp of secure coding, encryption, and data protection in AI systems
Benefits
Competitive pay
Flexible working hours
Autonomy
Fully remote work
Healthy work-life balance