NASA has announced that it will launch the Nancy Grace Roman Space Telescope in September 2026, eight months ahead of schedule. The new space telescope is expected to deliver some 20,000 terabytes of data to astronomers over its lifetime.
Add to that the 57 gigabytes of breathtaking images downlinked daily from the James Webb Space Telescope, which launched in 2021, and the Vera C. Rubin Observatory in the mountains of Chile, which is expected to begin its survey later this year and collect 20 terabytes of data each night.
For comparison, the Hubble Space Telescope, once the gold standard, returns just 1 to 2 gigabytes of data each day. The days of sifting through such measurements by hand are long gone; like others grappling with large volumes of data, astronomers are now turning to GPUs.
Brant Robertson, an astrophysicist at the University of California, Santa Cruz, has had a front-row seat to this step change in the science, both supporting and using data from these missions. For the past 15 years, Robertson has worked with Nvidia to apply GPUs to problems in understanding the universe, first testing theories about supernova explosions through advanced simulations and now developing tools to analyze the flood of data from modern observatories.
“There has been an evolution from observing a small number of objects to CPU-based analysis on large datasets and then running GPU-accelerated versions of those same analyses,” he told TechCrunch.
Robertson and then-graduate student Ryan Hausen developed a deep learning model called Morpheus that can pore over large datasets and identify galaxies. Their initial AI analysis of Webb data identified a surprising number of disk galaxies of a particular type, with new implications for theories about the evolution of the universe.
Now, Morpheus is changing with the times. Robertson is switching its architecture from convolutional neural networks to transformers, the architecture behind the rise of large language models. The change will let the model analyze several times more sky area at once, making it much faster to work with.
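To illustrate the architectural shift the article describes (this is a minimal sketch, not Morpheus's actual code; the class names, layer sizes, and five-class output are invented for the example), here is how a transformer differs from a CNN for this kind of task: instead of classifying one small cutout at a time, it splits a wide image into patch tokens and classifies every patch in a single pass.

```python
# Illustrative sketch only -- not Morpheus's actual code. It contrasts a small
# CNN classifier with a transformer that ingests an image as a sequence of
# patches, so one forward pass covers a much larger field of view.
import torch
import torch.nn as nn


class CNNClassifier(nn.Module):
    """Convolutional baseline: classifies one small cutout at a time."""

    def __init__(self, n_classes=5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):
        return self.net(x)


class PatchTransformer(nn.Module):
    """Transformer variant: splits a wide image into patches and emits a
    morphology prediction for every patch in a single forward pass."""

    def __init__(self, patch=16, dim=128, n_classes=5):
        super().__init__()
        # Patchify: one convolution with stride == kernel size.
        self.embed = nn.Conv2d(1, dim, kernel_size=patch, stride=patch)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)
        self.head = nn.Linear(dim, n_classes)  # per-patch class logits

    def forward(self, x):
        tokens = self.embed(x).flatten(2).transpose(1, 2)  # (B, n_patches, dim)
        return self.head(self.encoder(tokens))             # one result per patch


# A single 512x512 cutout becomes 1,024 patch tokens classified at once.
logits = PatchTransformer()(torch.randn(1, 1, 512, 512))
print(logits.shape)  # torch.Size([1, 1024, 5])
```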
Robertson is also working on generative AI models, trained on space telescope data, to improve the quality of observations from ground-based telescopes, which are distorted by Earth's atmosphere. Despite advances in rocket technology, getting an 8-meter mirror like Rubin's into orbit remains out of reach, so using software to sharpen its observations is the next best option.
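A toy illustration of that idea follows; this is not Robertson's model, and the helper name and data here are made up, with random tensors and a Gaussian blur standing in for real pairs of ground-based and space-based images. The principle is simply supervised restoration: show a network the atmosphere-blurred view alongside a sharp target of the same field, and it learns to undo the blur.

```python
# Toy sketch of training an image-restoration model on blurred/sharp pairs.
# Everything here is synthetic: atmospheric_blur() is a hypothetical stand-in
# for real seeing, and random tensors replace real telescope images.
import torch
import torch.nn as nn
import torch.nn.functional as F


def atmospheric_blur(img, sigma=2.0, ksize=9):
    """Simulate atmospheric seeing with a Gaussian point-spread function."""
    ax = torch.arange(ksize) - ksize // 2
    g = torch.exp(-(ax**2) / (2 * sigma**2))
    kernel = (g[:, None] * g[None, :]) / (g.sum() ** 2)  # normalized 2D kernel
    return F.conv2d(img, kernel.view(1, 1, ksize, ksize), padding=ksize // 2)


# Small fully convolutional network that maps blurred images to sharp ones.
restorer = nn.Sequential(
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, 3, padding=1),
)
opt = torch.optim.Adam(restorer.parameters(), lr=1e-3)

for step in range(200):
    sharp = torch.rand(8, 1, 64, 64)    # pretend space-telescope targets
    blurred = atmospheric_blur(sharp)   # simulated ground-based view
    loss = F.mse_loss(restorer(blurred), sharp)
    opt.zero_grad()
    loss.backward()
    opt.step()
```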
But he still feels the pressure of global demand for GPUs. Robertson used National Science Foundation funding to build a GPU cluster at UC Santa Cruz, but the hardware is aging even as more researchers want to apply compute-intensive techniques to their work. Meanwhile, the Trump administration has proposed cutting NSF's budget by 50% in its current budget request.
“People want to do this kind of AI and ML analysis, and GPUs are the way to do that,” Robertson said. “You need an entrepreneurial spirit…especially when you’re working at the cutting edge of technology. Universities are very risk-averse because they have limited resources. So you have to go out and show them, ‘Look, this is where we want to be as a field.'”
