ServiceNow · 3 weeks ago

Senior Staff Software Developer - FinOps Cloud Development Platform

ServiceNow is a global market leader in innovative AI-enhanced technology, and it is seeking a Senior Staff Software Developer to join its FinOps Tools team. This role involves designing and implementing a cloud-native data development platform to empower data practitioners and streamline analytics production.

Business Process Automation (BPA) · Cloud Management · Enterprise Software · Robotic Process Automation (RPA) · SaaS
Growth Opportunities · H1B Sponsor Likely

Responsibilities

Design and develop scalable, maintainable, and reusable software components with a strong emphasis on performance and reliability
Collaborate with product managers to translate requirements into well-architected solutions, owning features from design through delivery
Build intuitive and extensible user experiences using modern UI frameworks, ensuring flexibility for customer-specific needs
Contribute to the design and implementation of new products and features while enhancing existing product capabilities
Integrate automated testing into development workflows to ensure consistent quality across releases
Participate in design and code reviews, ensuring best practices in performance, maintainability, and testability
Develop comprehensive test strategies covering functional, regression, integration, and performance aspects
Foster a culture of continuous learning and improvement by sharing best practices in engineering and quality
Promote a culture of engineering craftsmanship, knowledge-sharing, and thoughtful quality practices across the team
Design and architect the foundational cloud development platform for notebook-based data workflows
Lead technical decision-making on workspace provisioning, developer experience, and productionization pathways
Establish best practices for notebook-to-production workflows, including git integration, parameterization, validation, and automated deployment
Drive innovation in data development platforms, leveraging AI/ML tools for enhanced developer productivity
Move fast: deliver a working MVP in 3 months and a production-scale system in 6 months
Build and customize cloud workspace infrastructure using Coder (open source) on Kubernetes
Develop VS Code extensions (TypeScript) for productionization workflows: notebook validation, parameterization, and Argo Workflow generation
Implement opinionated notebook templates and validation rules for production-ready data pipelines
Create seamless integrations between notebooks and ServiceNow's data stack: Trino queries, Iceberg table outputs, Lightdash previews, dbt transformations
Build backend services (Python) for workflow orchestration, notebook parsing, and metadata management
Deploy JupyterHub initially, then progressively replace components with custom platform features based on user feedback
Design container images with embedded security policies, pre-configured data access to Trino/Iceberg tables, and optimized dependencies
Implement git-native workflows with automated notebook versioning, code review integration, and CI/CD pipelines
Build observability and monitoring for workspace health, user activity, and pipeline success rates
Establish infrastructure foundation that scales from 5 early adopters to 30+ practitioners within first year
Create "template-based" notebook workflows with opinionated structure: parameterization (Papermill-style), Iceberg table outputs, validation checkpoints
Build CLI and UI tooling for one-click productionization: notebook → Argo Workflow with minimal manual intervention
Establish developer guardrails: credential management, data access policies, resource quotas
Collaborate closely with early adopter data scientists to rapidly iterate on workflows and validate usability
Prioritize platform stability and clear productionization paths over feature breadth in first 6 months
Leverage cutting-edge AI development tools (e.g., Cursor, Windsurf, ChatGPT, GitHub Copilot) to accelerate development velocity
Establish AI-augmented development practices and mentor future team members on effective AI tool utilization
Drive innovation in AI-assisted code generation, testing, and platform optimization
Work autonomously with guidance from Engineering and FinOps leadership
Collaborate with DevOps team on Kubernetes infrastructure, CI/CD pipelines, and security policies
Partner with FinOps Tools team members working on Trino, dbt, Lightdash, and Iceberg to ensure seamless integrations
Contribute to open-source projects in the notebook and developer tooling ecosystem
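The responsibilities above call for one-click productionization of a parameterized notebook into an Argo Workflow, with Papermill-style parameterization. As a rough sketch of what that generation step could look like, assuming a hypothetical runner image and output path (this is illustrative only, not ServiceNow's actual implementation):

```python
import json

def notebook_to_argo_workflow(notebook_path: str, parameters: dict,
                              image: str = "registry.example/notebook-runner:latest") -> dict:
    """Render a minimal Argo Workflow manifest that runs a parameterized
    notebook via the papermill CLI. Image name, volume layout, and output
    path are assumptions for illustration."""
    name = notebook_path.rsplit("/", 1)[-1].removesuffix(".ipynb")
    # papermill CLI shape: papermill <input.ipynb> <output.ipynb> -p key value ...
    args = [notebook_path, f"/outputs/{name}-out.ipynb"]
    for key, value in parameters.items():
        args += ["-p", key, str(value)]
    return {
        "apiVersion": "argoproj.io/v1alpha1",
        "kind": "Workflow",
        "metadata": {"generateName": f"{name}-"},
        "spec": {
            "entrypoint": "run-notebook",
            "templates": [{
                "name": "run-notebook",
                "container": {
                    "image": image,
                    "command": ["papermill"],
                    "args": args,
                },
            }],
        },
    }

# Example: productionize a hypothetical monthly cost-report notebook
wf = notebook_to_argo_workflow("pipelines/cost_report.ipynb", {"month": "2025-01"})
print(json.dumps(wf, indent=2))
```

A real implementation would also handle the validation checkpoints, credential injection, and Iceberg table outputs mentioned above; the point is that the notebook-to-workflow step reduces to rendering a declarative manifest from a template.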

Qualifications

Cloud-native architecture · Python · Kubernetes · Data workflows · AI integration · Full-stack development · CI/CD pipelines · Modern UI frameworks · API design · Technical writing · Collaboration skills · Problem-solving · Continuous learning

Required

Experience leveraging AI, or thinking critically about how to integrate it, in work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry
12+ years of experience in software engineering, with a track record of delivering high-quality products and deep expertise in full-stack development and cloud-native architecture, plus a Bachelor's degree in Computer Science, Engineering, or a related technical field; or 8 years with a Master's degree; or 5 years with a PhD; or equivalent experience
Strong Python skills for backend services, API development, and data tooling (notebook parsing, workflow generation)
Proven track record of rapid execution in greenfield environments with evolving requirements
Hands-on experience building and scaling developer platforms or internal tools at enterprise scale
Deep understanding of cloud development environments (Coder, GitHub Codespaces, Gitpod, or similar)
Strong Kubernetes and containerization expertise for cloud-native application deployment
Experience with data workflows and tooling: Jupyter, notebooks, orchestration systems (Airflow/Argo), data catalogs
Full professional proficiency in English
Proficiency in Python, Java, or similar object-oriented languages
Experience with modern front-end frameworks such as Angular, React, or Vue
Strong knowledge of data structures, algorithms, object-oriented design, design patterns, and performance optimization
Familiarity with automated testing frameworks (e.g., JUnit, Selenium, TestNG) and integrating tests into CI/CD pipelines
Understanding of software quality principles including reliability, observability, and production readiness
Ability to troubleshoot complex systems and optimize performance across the stack
Experience with AI-powered tools or workflows, including validation of datasets, model predictions, and inference consistency
Comfort with development tools such as IDEs, debuggers, profilers, source control, and Unix-based systems
Proven track record building internal developer platforms or productivity tools from scratch
Experience designing opinionated workflows that balance flexibility with guardrails
Strong understanding of developer personas: data scientists, analysts, engineers
Ability to iterate rapidly with early adopters and incorporate feedback without over-engineering
Experience with workspace security: secrets management, network policies, image scanning
Comfort operating at startup velocity within enterprise constraints
Proven ability to work autonomously and drive technical decisions in ambiguous, greenfield environments
Strong bias toward action: prototype quickly, gather feedback, iterate aggressively
Strong technical writing and documentation skills for developer-facing content
Excellent collaboration skills across engineering, DevOps, and data teams
Ability to establish technical foundations for new products with long-term vision while delivering short-term results

Preferred

Open-source contributions to the Jupyter ecosystem or developer tooling
Experience with Argo Workflows, Tekton, or Kubernetes-native CI/CD systems
Familiarity with data validation frameworks (Great Expectations, dbt tests, etc.)
Experience with Apache Iceberg or lakehouse architectures
Conference speaking or technical blogging on developer platforms or data tooling

Benefits

Health plans
Flexible spending accounts
401(k) Plan with company match
ESPP
Matching donations
Flexible time away plan
Family leave programs

Company

ServiceNow

ServiceNow is an AI platform that delivers IT operations, field service management and app engine solutions.

H1B Sponsorship

ServiceNow has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. Additional information is provided below for reference. (Data powered by the US Department of Labor)
Trends of Total Sponsorships
2025 (910)
2024 (876)
2023 (807)
2022 (840)
2021 (447)
2020 (439)

Funding

Current Stage
Public Company
Total Funding
$83.7M
Key Investors
Sequoia Capital · JMI Equity
2022-12-09 · Post-IPO Equity
2012-07-29 · IPO
2012-03-20 · Private Equity · $10.98M

Leadership Team

Bill McDermott, Chairman and CEO
Pat Casey, Chief Technology Officer & EVP of DevOps
Company data provided by Crunchbase