Berkeley Lab · 8 hours ago

Data Science Workflows Architect

Lawrence Berkeley National Laboratory is hiring a Data Science Workflows Architect within the NERSC division. In this role, you will support the user community in adapting and optimizing their workflows for High Performance Computing systems, leveraging cutting-edge technologies to advance the scientific mission of the Department of Energy.

Responsibilities

Work with domain scientists to integrate HPC resources at NERSC into their workflows
Collaborate with other teams at NERSC to ensure that next-generation NERSC systems, such as the new Doudna supercomputer, can be harnessed by the NERSC user community
Collaborate with teams across the Department of Energy to support Integrated Research Infrastructure and the emerging American Science Cloud
Optimize the user environment on NERSC supercomputers to maximize scientific productivity for the user community. This includes installing, maintaining and documenting a productive and performant set of tools (e.g. Jupyter, Podman, Julia or Python), engaging with the developer and user community, helping to optimize systems to meet user needs, and monitoring system performance from an application perspective
Develop and support services in data management, data movement, agentic AI, and workflow orchestration
Educate and train users by creating content for the NERSC website with online tutorials and documentation, giving presentations, and attending conferences. Communicate with users about new opportunities and capabilities in software and systems and advise them in effectively transitioning to new technologies
Communicate clearly to both domain scientists and computer systems engineers, explaining the subtleties of using an HPC system and translating scientific requirements into computing needs
Work on and resolve complex issues where analysis of situations or data requires an in-depth evaluation of variable factors
Exercise judgment in selecting methods, techniques and evaluation criteria for obtaining results
Determine methods and procedures on new assignments and may coordinate activities of other personnel
Network with key contacts outside one’s own area of expertise
Work on and resolve significant and unique issues where analysis of situations or data requires an evaluation of intangibles
Exercise independent judgment in methods, techniques and evaluation criteria for obtaining results
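Several of the responsibilities above involve workflow orchestration: running scientific tasks in an order that respects their data dependencies. As a minimal illustration only (the task names and `run_task` body are hypothetical, not a NERSC tool), a dependency-aware pipeline can be sketched with Python's standard library:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
dependencies = {
    "fetch_data": set(),
    "preprocess": {"fetch_data"},
    "simulate": {"preprocess"},
    "analyze": {"simulate"},
    "publish": {"analyze"},
}

def run_task(name):
    """Placeholder for real work (e.g., submitting a batch job)."""
    return f"{name}: done"

def run_pipeline(deps):
    # Execute tasks in an order that satisfies every dependency edge.
    order = TopologicalSorter(deps).static_order()
    return [run_task(task) for task in order]

results = run_pipeline(dependencies)
```

Production workflow tools add scheduling, retries, and distributed execution on top of exactly this kind of dependency graph.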

Qualifications

High-Performance Computing · Scientific computational workflows · Workflow tools · Software engineering · C/C++ programming · Python programming · Data management · Data analysis · Networking · Communication · Interpersonal skills · Problem-solving · Team collaboration

Required

Typically requires a minimum of 8 years of related experience with a Bachelor's degree; or 6 years and a Master's degree; or equivalent experience
Wide-ranging expertise in the areas of scientific computational workflows, workflow tools, and/or High-Performance Computing
Software engineering experience, including version control, testing, debugging, and CI/CD
Programming background in one or more of C/C++, Python, Julia, Rust, Go, or shell scripting
Prior experience managing computational needs and resources, either as a user or as an administrator
Excellent communication and interpersonal skills, with the ability to express yourself clearly in both written (e.g., conference papers, technical papers, documentation, email) and oral (e.g., via Zoom and in person) communication
Demonstrated ability to work effectively as part of a cross-disciplinary team
Ability to troubleshoot and solve problems of diverse scope where analysis of data requires evaluation of identifiable factors
Ability to resolve complex issues in creative and effective ways
Ability to network and collaborate with key contacts outside one's own area of expertise

Preferred

Advanced degree (Master's or Ph.D.) in a scientific domain
Experience working with experiments that have real-time and/or large-scale computing requirements
Experience with HPC systems, including use of a batch scheduling system
Experience implementing and/or integrating computational workflow tools to support scientific research
Understanding of and experience with data analysis and/or AI tools and platforms
Experience deploying and utilizing Jupyter or other interfaces for interactive data exploration
Experience with cloud technologies, software, interfaces/APIs and containers
Experience with (and enthusiasm for) modern computing architectures including GPUs and other Accelerators
Experience in working on interdisciplinary projects involving multiple scientific and technical teams and institutions
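The batch-scheduling experience listed above typically means driving a scheduler such as Slurm, often programmatically. As a hedged sketch (the helper function and its defaults are illustrative, not site policy; the flags `--nodes`, `--time`, `--qos`, and `--account` are standard Slurm options):

```python
import shlex

def build_sbatch_cmd(script, nodes=1, walltime="00:30:00",
                     qos="regular", account=None):
    """Assemble an sbatch command line for a Slurm batch scheduler.

    Illustrative only: real sites layer on queues, constraints, and
    accounting rules beyond what is shown here.
    """
    cmd = ["sbatch", f"--nodes={nodes}", f"--time={walltime}", f"--qos={qos}"]
    if account is not None:
        cmd.append(f"--account={account}")
    cmd.append(script)
    return cmd

# Build (but do not submit) a command for a hypothetical 4-node job.
cmd = build_sbatch_cmd("train.sh", nodes=4, walltime="02:00:00",
                       account="m0000")
print(shlex.join(cmd))
```

Separating command construction from submission like this makes the logic easy to unit-test without access to a live scheduler.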

Company

Berkeley Lab

Berkeley Lab is a national laboratory that creates advanced new tools for scientific discovery.

H1B Sponsorship

Berkeley Lab has a track record of offering H1B sponsorships. Please note that this does not guarantee sponsorship for this specific role. The information below is provided for reference (data from the US Department of Labor).
Distribution of Different Job Fields Receiving Sponsorship
Trends of Total Sponsorships
2025 (154)
2024 (159)
2023 (163)
2022 (154)
2021 (165)
2020 (107)

Funding

Current Stage
Late Stage

Leadership Team

Mary Barnum, MBA
Business Manager, COO Office
Rebecca Rishell
Deputy Chief Operating Officer
Company data provided by Crunchbase