Two decades ago, Saul Perlmutter, Brian Schmidt, and Adam Riess shocked the world when they published research showing not only that the universe was expanding, but that the expansion was occurring at an accelerating rate. The discovery came as a complete surprise even to the astronomers themselves, and netted them the Nobel Prize in Physics in 2011.
The accelerating expansion was attributed to "dark energy," which is neither dark nor energy, but which represents some force counteracting gravity's pull, causing galaxies to speed apart from one another faster than prevailing theories predict. What this force is remains unknown.
The discovery of accelerating expansion had another consequence: it spurred efforts to quantify the expansion more precisely in the hope of understanding what lies at its root.
With that goal in mind, Karl Gebhardt, a professor of astronomy at The University of Texas at Austin and an expert at uncovering the dynamics of distant, invisible phenomena, came up with an experiment that would look deeper into the past of the cosmos than ever before to determine with great accuracy how fast the expansion of the universe has been accelerating.
"We're struggling on the theory side to explain what's going on," Gebhardt said. "There's a lot of ideas out there, but we have a lack of observations."
The experiment — known as the Hobby-Eberly Telescope Dark Energy Experiment, or HETDEX — involved upgrading an extremely sensitive optical telescope to survey the sky for galaxies that were active 10 billion years ago and determine how fast they were traveling outward. The project is supported by more than $42 million in grants from the state of Texas, the United States Air Force, and the National Science Foundation (NSF), along with contributions from many private foundations and individuals.
"The expansion rate that had been used was only done on data that was relatively recent in history of the universe," Gebhardt said. "HETDEX is looking back in time where no one else has looked before. We don't know if the rate has been consistent. If it's a constant, then that calls for a specific physical interpretation. If it changes, it's a very different physical interpretation."
Today, more than 12 years after the project was first proposed, the experiment is underway and the team is gathering one of the largest data collections in astrophysics to answer one of the most essential questions in astronomy.
Based at the McDonald Observatory in West Texas, HETDEX equips one of the largest telescopes in the world with the largest spectrograph on the planet — actually a set of 156 spectrographs. Each spectrograph is fed by 225 optical fibers, each capturing the light from a patch of sky 1/3600 the size of the full moon. When completed, the instrument will have 35,000 optical fibers that funnel the light from a large portion of the sky into the spectrographs, measuring thousands of objects simultaneously. (It currently has 25,000.)
The five-year HETDEX survey will create a spectral map of a region of the northern sky near the Big Dipper equivalent in area to 1,000 full moons. If a galaxy aligns with one of the fibers, its light is carried to a spectrograph for analysis. The survey will produce not only a map of the region but also spectra of every object within it, allowing the team to measure their velocities and other physical characteristics.
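To make the instrument's scale concrete, here is a back-of-the-envelope sketch in Python using only the figures quoted above; the Moon's apparent diameter of roughly half a degree is an added assumption, not a number from the article.

```python
# Back-of-the-envelope numbers from the figures quoted in the article.
# The full-moon angular diameter (~0.5 degrees) is an assumption, not from the text.
import math

spectrographs = 156
fibers_per_spectrograph = 225
total_fibers = spectrographs * fibers_per_spectrograph
print(f"Total fibers: {total_fibers:,}")         # ~35,100, i.e. the ~35,000 quoted

moon_diameter_deg = 0.5                           # assumed apparent diameter of the Moon
moon_area_sq_deg = math.pi * (moon_diameter_deg / 2) ** 2
fiber_patch_sq_deg = moon_area_sq_deg / 3600      # each fiber sees 1/3600 of a full moon
survey_area_sq_deg = 1000 * moon_area_sq_deg      # survey region ~1,000 full moons
print(f"Area seen by one fiber: {fiber_patch_sq_deg:.2e} square degrees")
print(f"Survey footprint: about {survey_area_sq_deg:.0f} square degrees")
```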
Measuring speed and distance 10 billion years in the past
As complicated as the experiment is, Gebhardt says that in the end the exploration comes down to two factors: the velocity at which the galaxies are moving, and the distance they have traveled from their origin following the Big Bang.
The velocity can be gleaned from the redshift of spectral lines toward longer wavelengths (the red end of the spectrum), which is proportional to the velocity. The distance is determined by comparison with a baseline structure of the universe that astronomers established long ago and that served as the basis for most cosmology before the discovery of dark energy.
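As a rough illustration of the first measurement, the sketch below computes a redshift from the shift of a single emission line; the wavelengths and line choices are illustrative assumptions, not HETDEX measurements.

```python
# Minimal sketch: redshift from the shift of a spectral line toward longer wavelengths.
# The wavelengths below are illustrative placeholders, not HETDEX data.

C_KM_S = 299_792.458  # speed of light in km/s

def redshift(observed_nm: float, rest_nm: float) -> float:
    """Fractional shift of a line from its rest wavelength to its observed wavelength."""
    return observed_nm / rest_nm - 1.0

# A nearby galaxy: a line at rest wavelength 656.3 nm observed at 658.5 nm.
z_near = redshift(658.5, 656.3)
print(f"z = {z_near:.4f}, v ~ cz ~ {C_KM_S * z_near:,.0f} km/s (approximation valid for small z)")

# A distant galaxy of the kind HETDEX targets: the same idea gives a much larger z,
# where cosmologists quote z itself rather than a simple velocity.
z_far = redshift(364.5, 121.6)
print(f"z = {z_far:.2f}")
```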
"When galaxies are made, they're made in a pattern, like a fingerprint," Gebhardt said. "If we can measure that pattern, it's the same as measuring the distance between the ridges in your fingerprint. You can tell how much the universe has expanded. We do this for millions of galaxies and from those millions of galaxies what you get is a map."
This map tells astronomers how far those galaxies had traveled between the Big Bang and the moment the light left them to make its 10-billion-light-year journey to the telescope. By identifying and mapping millions of galaxies from this early period of the universe's expansion, and then determining their velocities and the distances they traveled, the team believes it can determine the expansion rate during that earlier phase to within 1 percent.
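To show how an assumed expansion history turns into distances that can be compared with such a map, here is a minimal sketch assuming a flat Lambda-CDM model with illustrative parameters; none of the numbers are HETDEX results.

```python
# Sketch of how an expansion history translates into distances, assuming a flat
# Lambda-CDM model with illustrative parameters (not HETDEX results).
import numpy as np

H0 = 70.0          # km/s/Mpc, assumed present-day expansion rate
OMEGA_M = 0.3      # assumed matter density
OMEGA_L = 0.7      # assumed dark-energy density
C_KM_S = 299_792.458

def H(z):
    """Expansion rate at redshift z for a flat Lambda-CDM universe."""
    return H0 * np.sqrt(OMEGA_M * (1 + z) ** 3 + OMEGA_L)

def comoving_distance_mpc(z, steps=10_000):
    """Comoving distance: integrate c / H(z') from 0 to z."""
    zs = np.linspace(0.0, z, steps)
    return np.trapz(C_KM_S / H(zs), zs)

# Comparing measured galaxy separations (the "fingerprint") with distances like these
# at different redshifts is what constrains how the expansion rate has changed.
print(f"H(z=2.5) ~ {H(2.5):.0f} km/s/Mpc")
print(f"comoving distance to z=2.5 ~ {comoving_distance_mpc(2.5):,.0f} Mpc")
```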
Establishing that rate and how it has changed over time will help solve the mystery of dark energy and what's causing the expansion.
"It could be that we don't understand gravity on large scales — it could go from an attractive to a repulsive force. Or the expansion could be caused by the energy of empty space," Gebhardt said. "There are five or six other hypotheses — extremely different ideas. Measurements of the expansion rate at different times in the universe is how we limit these models."
Over the course of the experiment, HETDEX will capture 400 billion resolution elements — a mountain of data that travels continuously from a mountaintop in West Texas straight to the Texas Advanced Computing Center (TACC), where some of the most powerful academic supercomputers in the world analyze it. In particular, Gebhardt has relied heavily on TACC's Wrangler supercomputer, a powerful data analysis system supported by NSF.
"We're running a code to find individual point sources: a filter over all of the spatial elements to find individual galaxies," Gebhardt said. "Most of the CPU time is used to perform that analysis."
AstroTinder
As with any idea formulated years before implementation, some aspects of the experiment have proved tricky. In particular, Gebhardt underestimated the noisiness of the data generated by the optical fibers.
He had anticipated a straightforward analysis, but found that he first needed a way to separate real target galaxies from false positives. Strangely enough, humans can readily detect the difference, but most computational algorithms cannot.
So, to address this problem, he is training a machine learning algorithm on human-labeled readings to make the distinction. Working with students in the UT Computer Science department, he created an app that he calls 'AstroTinder' to assist in the process.
Individuals with minimal training are able to look at spectral lines and images of point sources and swipe left or right, depending on whether they believe it is a real galaxy or something else — an artifact of the algorithm or a speck of dust on the sensor.
After enough of these determinations are made, Gebhardt will use TACC's machine learning-centric Maverick supercomputer to train the system to make the distinction itself. The system will then be off to the races, sifting through the billions of data points to identify and map the 100,000 target galaxies.
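The article does not specify the model, so the following is a minimal sketch of the general approach of training a classifier on human-labeled candidates; the features, labels, and model choice are placeholders rather than the HETDEX system.

```python
# Minimal sketch of training a classifier on human-labeled candidates.
# The model, features, and data here are placeholders, not the HETDEX system.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend features summarizing each candidate detection (e.g. line width, peak
# signal-to-noise, spatial extent); real features would come from the survey data.
n_candidates = 5_000
features = rng.normal(size=(n_candidates, 3))
# Pretend human swipes from the labeling app: 1 = real galaxy, 0 = artifact.
labels = (features[:, 0] + 0.5 * features[:, 1] + rng.normal(0, 0.5, n_candidates) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")

# Once trained, the classifier can score the remaining unlabeled candidates in bulk.
```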
From those, further analysis will establish the velocity and distance, and then the rate of acceleration of the expansion.
Discoveries en route to the expansion rate
After three years of operation, the survey is about 20 percent complete. Gebhardt anticipates his team will need one-third of the data before they can say anything definitive about the expansion rate. In the meantime, the survey is collecting a wealth of interesting, unanticipated data about astronomical objects, including naked black holes, highly active star-forming galaxies, asteroids, and meteors.
"No one's looked at the universe in this way," he said. "We're finding things that couldn't be discovered in any other way."
The data processing and data management challenges for HETDEX harken back to a statement by Tony Tyson, chief scientist for the Large Synoptic Survey Telescope (recently renamed the Vera C. Rubin Observatory), that "the telescope is just a peripheral to the data management system."
"Great care has gone into the design of the data management system such that both the HETDEX science team and the astronomical research community in general will be able to exploit the data, both for the dark energy studies that are the core focus of the project, as well as for other purposes," said Bob Hanisch, director of the Office of Data and Informatics within the Material Measurement Laboratory at the National Institute for Standard and Technology (NIST), who is not directly involved in the project.
"It is great to see the close collaboration between the HETDEX astronomers and the computational scientists at TACC, such that HETDEX data can be moved to the HPC and data storage facilities efficiently and made available for analysis and distribution."
"Astronomical data science has evolved in amazing ways over the past 20 years," said Niall Gaffney, director of Data Intensive Computing at TACC and former designer of the archives at the Space Telescope Science Institute which holds the data from the Hubble Space Telescope. "HETDEX is taking the lessons learned from missions like the Hubble Space Telescope and Kepler and combining them with modern machine learning techniques pioneered in industry to better understand a fundamental force in nature in ways we could not 20 years ago. Having a facility and staff like those at TACC help bridge these technologies to bring about these new discoveries."
In terms of science outcomes, Gebhardt believes the experiment will have a major impact, helping astronomers understand either how gravity works or how the Big Bang occurred. For non-scientists, the research helps clarify our place in the universe.
"We are completely insignificant as humans in the universe, but we're able to understand how the universe evolved," Gebhardt said. "Being able to do that, I think, is amazing."