The most powerful computers in human history are not doing what you'd expect.

They're not just running climate models or simulating nuclear reactions. Right now, some of the world's most powerful computers — at Argonne, Oak Ridge, Google DeepMind, and Microsoft Research — are being used to train large language models not for chatbots, but for science.

The bet: if you train a massive neural network on decades of scientific data — millions of protein structures, terabytes of materials properties, petabytes of simulation outputs — you get something qualitatively different from a textbook. You get a model that has internalized the patterns of nature, one that can suggest new hypotheses, accelerate experiments, and compress what once took five years of graduate research into a few months.

This is the frontier that tech companies, biotech startups, research universities, and national labs are all racing to define. The career window it opens is real, and it's open right now.

What the Labs Are Doing

Argonne National Laboratory — AuroraGPT and Polybot

Argonne is home to Aurora, one of the world's first exascale supercomputers, capable of more than a quintillion calculations per second. The flagship AI program is AuroraGPT — a foundation model for science, trained not on internet text but on scientific literature, molecular property databases, and decades of simulation outputs. The goal: a general-purpose AI research assistant that can answer "What is the most promising synthesis pathway for a high-entropy alloy with these target properties?"

Argonne also runs Polybot — an autonomous chemistry laboratory. Polybot combines robotic fluid handling with active learning: the AI designs an experiment, the robot executes it, the result feeds back into the model, and the system iterates overnight without human intervention. What once required a postdoc spending six months synthesizing compounds now runs at machine speed.
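A Polybot-style loop can be sketched in a few lines. This is an illustrative toy, not Argonne's software: the Gaussian-process model, the upper-confidence-bound pick, and the simulated `run_experiment` "robot" are all stand-ins.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)

def run_experiment(x):
    # Stand-in for the robotic synthesis step: an unknown yield curve
    # the system is trying to map, peaking near x = 0.7.
    return float(np.exp(-(x - 0.7) ** 2 / 0.02))

# Seed data from a few initial experiments.
X = rng.uniform(0, 1, 3).reshape(-1, 1)
y = np.array([run_experiment(x) for x in X.ravel()])

candidates = np.linspace(0, 1, 201).reshape(-1, 1)
for _ in range(10):  # ten unattended iterations "overnight"
    # 1. Refit the model on everything measured so far.
    model = GaussianProcessRegressor(alpha=1e-6).fit(X, y)
    # 2. The AI designs the next experiment: pick the candidate with the
    #    best optimism-under-uncertainty score (upper confidence bound).
    mean, std = model.predict(candidates, return_std=True)
    pick = candidates[np.argmax(mean + std)]
    # 3. The "robot" executes it, and the result feeds back in.
    X = np.vstack([X, [pick]])
    y = np.append(y, run_experiment(pick[0]))

print(f"best yield found: {y.max():.3f} at x = {X[np.argmax(y), 0]:.2f}")
```

The key property is that steps 1-3 need no human in the loop: the acquisition rule, not a person, decides what to try next.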

Oak Ridge National Laboratory — Frontier and Climate AI

ORNL operates Frontier, which in May 2022 became the first computer to break the exascale barrier, at 1.1 exaflops (one exaflop is 10¹⁸ floating-point operations per second). Frontier's scientific AI work includes climate modeling at unprecedented resolution, running global simulations at sub-kilometer scale to understand regional weather patterns and extreme events. ORNL researchers are also deploying AI to accelerate materials discovery in battery science, identifying electrolyte compositions that could push EV range beyond current lithium-ion limits.

Lawrence Berkeley National Laboratory — NERSC and X-Ray AI

LBNL's Advanced Light Source and National Energy Research Scientific Computing Center (NERSC) work as a paired instrument: X-ray experiments generate vast data about material structures, and AI trained on that data is replacing months-long analysis processes. Researchers at LBNL have shown that AI can determine protein structures from diffraction data in minutes rather than weeks.

Pacific Northwest National Laboratory — Grid Security AI

PNNL applies AI to chemical sciences and grid security — using machine learning to predict chemical reaction outcomes, design materials for carbon capture, and detect anomalies in the Pacific Northwest power grid before they cascade into outages. PNNL's AI for grid security work is directly funded by DOE's Office of Electricity and represents some of the most consequential applied AI in the country.

Lawrence Livermore National Laboratory — WarpX and El Capitan

LLNL uses AI for stockpile stewardship — ensuring the safety of the nuclear arsenal without live testing. Their WarpX project applies ML to plasma physics simulation, and El Capitan (commissioned 2024) is designed with AI workloads as a first-class use case. LLNL researchers have published models that predict high-energy-density physics experiment outcomes with accuracy that would have required weeks of supercomputer time a decade ago.
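The surrogate-model idea behind results like this is simple to sketch: train a cheap neural network on precomputed simulation outputs, then query the network instead of rerunning the simulator. Everything below is a toy under that assumption — `expensive_simulation` stands in for a real physics code, and the architecture is illustrative, not any lab's actual model.

```python
import torch

def expensive_simulation(x):
    # Stand-in for a physics code that might take hours per run.
    return torch.sin(3 * x) + 0.5 * x ** 2

# A few hundred precomputed simulation results serve as training data.
torch.manual_seed(0)
x_train = torch.rand(256, 1) * 4 - 2
y_train = expensive_simulation(x_train)

# A small MLP learns the input-to-output map of the simulator.
surrogate = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(surrogate(x_train), y_train)
    loss.backward()
    opt.step()

# The trained surrogate now answers in microseconds what the
# simulator answers in hours, at some cost in accuracy.
x_test = torch.linspace(-2, 2, 50).reshape(-1, 1)
err = (surrogate(x_test) - expensive_simulation(x_test)).abs().max()
print(f"max surrogate error on the test grid: {err.item():.3f}")
```

The trade is accuracy for speed: a surrogate is only trustworthy inside the region its training data covers, which is why labs pair them with uncertainty estimates and spot-check runs of the full simulator.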

Career Tracks Covered in This Issue
  • AI/ML Research Scientist — Deep understanding of both ML methods (transformers, graph neural networks, diffusion models) and a scientific domain (chemistry, materials science, nuclear physics). Entry: $90K; senior: $180K+. A strong CS or applied-math background paired with undergraduate research in a scientific field makes a competitive applicant.
  • Scientific Software Engineer — Builds training pipelines, data-processing systems, and model serving at HPC scale. Proficiency in Python, C++, CUDA, and distributed computing frameworks (MPI, PyTorch Distributed). Demand is high: far more domain scientists want to use AI than there are engineers who know how to build it.
  • Computational Research Scientist — Runs experiments in silico using LAMMPS, VASP, OpenMC, plus AI-based surrogate models. Typically requires graduate degree, but SULI internships during undergrad are a direct pathway to PhD programs and post-grad lab positions.
  • Data Scientist / Research Data Engineer — Data infrastructure: ingesting petabytes of experimental data, building training datasets, ensuring data quality and provenance. Less domain expertise required, more engineering depth. Entry point for data science or statistics backgrounds.

The Core Skill Stack

  • Python (PyTorch)
  • NumPy / pandas
  • Graph neural networks
  • HPC / CUDA
  • MPI / distributed computing
  • Linux / Bash
  • A scientific domain
  • Git + GitHub
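Of these, graph neural networks are the item most specific to scientific ML, because molecules and crystals are naturally graphs: atoms are nodes, bonds are edges. A minimal message-passing layer in plain PyTorch gives the flavor; all names here are illustrative, and real work would use a library like PyTorch Geometric.

```python
import torch

class MessagePassingLayer(torch.nn.Module):
    # One round of message passing: each atom sums its neighbors'
    # features, then updates its own through a small MLP.
    def __init__(self, dim):
        super().__init__()
        self.update = torch.nn.Sequential(
            torch.nn.Linear(2 * dim, dim), torch.nn.ReLU()
        )

    def forward(self, node_feats, edges):
        # edges: (2, E) tensor of (source, target) atom indices
        src, dst = edges
        messages = torch.zeros_like(node_feats)
        messages.index_add_(0, dst, node_feats[src])  # sum neighbor features
        return self.update(torch.cat([node_feats, messages], dim=-1))

# A toy "molecule": a 4-atom chain with 8-dim features per atom
# and each bond listed in both directions.
feats = torch.randn(4, 8)
edges = torch.tensor([[0, 1, 1, 2, 2, 3],
                      [1, 0, 2, 1, 3, 2]])
out = MessagePassingLayer(8)(feats, edges)
print(out.shape)  # torch.Size([4, 8])
```

Stacking several such layers lets information propagate across the whole molecule, which is how models like this learn to predict properties from structure.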

How to Get There

The Science Undergraduate Laboratory Internships (SULI) program is the primary DOE pathway for undergraduates. Applications open twice a year. If you're a CS, data science, applied math, or engineering student, your target labs for AI research are Argonne, Oak Ridge, LBNL, and PNNL — all have active AI/ML programs and take SULI students specifically for computational work.

The Community College Internships (CCI) program offers the same access for community college students. The program is often overlooked, which means acceptance rates are more favorable than SULI's. If you're at a community college and interested in computing or data science, CCI is worth a serious look.

What to include in your application:

  • Any experience with Python, NumPy, PyTorch, or TensorFlow — even self-taught coursework counts
  • Evidence of scientific curiosity beyond your coursework (a GitHub project, a competition, independent reading in a scientific domain)
  • A clear statement of which lab's research interests you and why — mentors respond to specificity

The AI-for-science moment is early. The researchers who get in now — as SULI interns, as PhD students, as early-career scientists — will be there when the first AI-designed material ships to market, when the first AI-accelerated drug passes clinical trials, when the first AI-assisted reactor design gets licensed. The labs are building these programs today. The door is open.


Resources to Go Deeper

  • DOE SULI Applications — For undergraduates. Target: Argonne (Aurora/INCITE), ORNL (Frontier), LBNL (NERSC), PNNL (grid AI).
  • DOE CCI Applications — For community college students. Same labs, same opportunities, often overlooked.
  • INCITE Allocation Program — Competitive HPC access for researchers at ALCF (Argonne). Undergrads have applied through mentored research projects.
  • NERSC at LBNL — National Energy Research Scientific Computing Center. Major user facility with student research opportunities.
  • DOE AI for Science — Official DOE initiative page covering all national lab AI programs and funding opportunities.