By Roland Piquepaille
An international group of cosmologists, the Virgo Consortium, has completed the first simulation of the entire universe, starting 380,000 years after the Big Bang and running up to the present day. In "Computing the Cosmos," IEEE Spectrum writes that the scientists used a 4.2-teraflops system at the Max Planck Society's Computing Center in Garching, Germany, to do the computations. The whole universe was simulated with ten billion particles, each having a mass a billion times that of our sun. Because computing the gravitational interactions between each of the ten billion mass points and all the others would have kept the machine busy for about 60,000 years, the computer scientists devised a couple of tricks to cut the amount of computation. In June 2004, the first simulation of our universe was completed. The resulting data, about 20 terabytes, will be made available to everyone in the months to come, at least to people with a high-bandwidth connection. Read more... Here is a general overview of the project.
The group, dubbed the Virgo Consortium -- a name borrowed from the galaxy cluster closest to our own -- is creating the largest and most detailed computer model of the universe ever made. While other groups have simulated chunks of the cosmos, the Virgo simulation is going for the whole thing. The cosmologists' best theories about the universe's matter distribution and galaxy formation will become equations, numbers, variables, and other parameters in simulations running on one of Germany's most powerful supercomputers, an IBM Unix cluster at the Max Planck Society's Computing Center in Garching, near Munich.
Now, here are some details about this cluster -- and its limitations.
The machine, a cluster of powerful IBM Unix computers, has a total of 812 processors and 2 terabytes of memory, for a peak performance of 4.2 teraflops, or trillions of calculations per second. It took 31st place late last year in the Top500 list, a ranking of the world's most powerful computers by Jack Dongarra, a professor of computer science at the University of Tennessee in Knoxville, and other supercomputer experts.
But as it turns out, even the most powerful machine on Earth couldn't possibly replicate exactly the matter distribution conditions of the 380 000-year-old universe the Virgo group chose as the simulation's starting point. The number of particles is simply too large, and no computer now or in the foreseeable future could simulate the interaction of so many elements.
To understand why such a powerful system cannot handle this simulation in a reasonable amount of time, we need to look at the parameters of this simulation.
The fundamental challenge for the Virgo team is to approximate that reality in a way that is both feasible to compute and fine-grained enough to yield useful insights. The Virgo astrophysicists have tackled it by coming up with a representation of that epoch's distribution of matter using 10 billion mass points, many more than any other simulation has ever attempted to use.
These dimensionless points have no real physical meaning; they are just simulation elements, a way of modeling the universe's matter content. Each point is made up of normal and dark matter in proportion to the best current estimates, having a mass a billion times that of our sun, or 2,000 trillion trillion trillion (2 × 10^39) kilograms. (The 10 billion particles together account for only 0.003 percent of the observable universe's total mass, but since the universe is homogeneous on the largest scales, the model is more than enough to be representative of the full extent of the cosmos.)
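For the curious, the figure quoted above is simply a billion solar masses expressed in kilograms. A quick sketch of the arithmetic (the solar-mass constant is a standard textbook value, not from the article):

```python
# Arithmetic behind the "2 x 10^39 kilograms" figure (standard solar mass assumed).
SOLAR_MASS_KG = 1.989e30                # mass of the sun, in kilograms
particle_mass_kg = 1e9 * SOLAR_MASS_KG  # one simulation point = a billion suns
print(f"mass per simulation point: {particle_mass_kg:.1e} kg")  # ~2.0e39 kg
```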
With these ten billion points, the Virgo team faced a serious challenge.
The software [astrophysicist Volker Springel] and his colleagues developed calculates the gravitational interactions among the simulation's 10 billion mass points and keeps track of the points' displacements in space. It repeats these calculations over and over, for thousands of simulation time steps.
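To make that concrete, here is a minimal sketch of what such a direct-summation N-body step looks like -- a toy illustration in Python/NumPy, not the Virgo group's actual production software, and with all numbers purely illustrative:

```python
import numpy as np

# Toy direct-summation N-body sketch: at every time step, compute the
# gravitational acceleration on each point from every other point, then
# advance positions and velocities with a simple leapfrog scheme.

G = 6.674e-11  # gravitational constant, SI units

def accelerations(pos, mass, softening=1e-2):
    """Acceleration on every particle from all the others (O(n^2) work)."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        d = pos - pos[i]                              # vectors to all other points
        r2 = (d ** 2).sum(axis=1) + softening ** 2    # softened squared distances
        r2[i] = np.inf                                # no self-interaction
        acc[i] = (G * mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
    return acc

def run(pos, vel, mass, dt, steps):
    """Leapfrog (kick-drift-kick) integration over a number of time steps."""
    acc = accelerations(pos, mass)
    for _ in range(steps):          # the real run uses thousands of time steps
        vel += 0.5 * dt * acc       # kick
        pos += dt * vel             # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc       # kick
    return pos, vel

# Usage sketch: 100 random points in a unit box, purely illustrative numbers.
rng = np.random.default_rng(0)
pos = rng.random((100, 3))
vel = np.zeros((100, 3))
mass = np.ones(100)
pos, vel = run(pos, vel, mass, dt=1.0, steps=10)
```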
The simulation, therefore, has to calculate the gravitational pull between each pair of mass points. That is, it has to choose one of the 10 billion points and calculate its gravitational interaction with each of the other 9 999 999 999 points, even those at the farthest corners of the universe. Next, the simulation picks another point and does the same thing again, with this process repeated for all points. In the end, the number of gravitational interactions to be calculated reaches 100 million trillion (1 followed by 20 zeros), and that's just for one time step of the simulation. If it simply chugged through all of the thousands of time steps of the Millennium Run, the Virgo group's supercomputer would have to run continuously for about 60,000 years.
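As a back-of-the-envelope check on those figures (the operations-per-interaction and step-count values below are my own assumptions, not from the article), the brute-force cost does indeed land in the tens-of-thousands-of-years range:

```python
# Rough cost estimate for the brute-force approach described above.
n = 10_000_000_000                 # 10 billion mass points
pairs_per_step = n * (n - 1)       # ~1e20 interactions per time step
flops_per_pair = 10                # assumption: operations per interaction
steps = 10_000                     # assumption: "thousands of time steps"
peak_flops = 4.2e12                # the cluster's 4.2 teraflops peak

seconds = pairs_per_step * flops_per_pair * steps / peak_flops
years = seconds / (3600 * 24 * 365)
# Prints ~1e20 interactions per step and a total on the order of 1e5 years,
# the same ballpark as the roughly 60,000 years quoted in the article.
print(f"{pairs_per_step:.0e} interactions per step, ~{years:.0e} years total")
```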
Because this brute-force approach was obviously unacceptable, Springel and his colleagues used a couple of tricks to reduce the amount of computation.
First, the researchers divided the simulated cube into several billion smaller volumes. During the gravitational calculations, points within one of these volumes are lumped together -- their masses are summed. So instead of calculating, say, a thousand gravitational interactions between a given particle and a thousand others, the simulation uses an algorithm to perform a single calculation if those thousand points happen to fall within the same volume. For points that are far apart, this approximation doesn't introduce notable errors, while it does speed up the calculations significantly.
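Here is a minimal sketch of that lumping idea, assuming a regular grid of cells (a simplification of my own; the production code is far more elaborate, and nearby points still have to be handled individually, as the next excerpt explains):

```python
import numpy as np

G = 6.674e-11  # gravitational constant, SI units

def lump_into_cells(pos, mass, box_size, cells_per_side):
    """Bin points into a coarse grid; return each occupied cell's total mass
    and center of mass, so far-field forces need only one term per cell."""
    cell = np.floor(pos / box_size * cells_per_side).astype(int)
    cell = np.clip(cell, 0, cells_per_side - 1)
    key = (cell[:, 0] * cells_per_side + cell[:, 1]) * cells_per_side + cell[:, 2]

    lumped = {}
    for k, p, m in zip(key, pos, mass):
        total, weighted = lumped.get(k, (0.0, np.zeros(3)))
        lumped[k] = (total + m, weighted + m * p)
    # center of mass = mass-weighted position sum / total mass
    return {k: (m, w / m) for k, (m, w) in lumped.items()}

def far_field_acceleration(point, cells):
    """Acceleration on one point from all lumped cells: one term per cell
    instead of one term per particle (nearby cells are ignored here)."""
    acc = np.zeros(3)
    for total_mass, com in cells.values():
        d = com - point
        r2 = (d ** 2).sum()
        if r2 > 0:                      # skip a degenerate zero-distance case
            acc += G * total_mass * d / r2 ** 1.5
    return acc

# Usage sketch: lump 100,000 random points into a 64^3 grid (illustrative).
rng = np.random.default_rng(2)
pos = rng.random((100_000, 3))
mass = np.ones(100_000)
cells = lump_into_cells(pos, mass, box_size=1.0, cells_per_side=64)
a = far_field_acceleration(np.array([0.5, 0.5, 0.5]), cells)
```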
They used another method for short-distance interactions.
Springel developed new software with what is called a tree algorithm to simplify and speed up the calculations for this realm of short-distance interactions. Think of all 10 billion points as the leaves of a tree. Eight of these leaves attach to a stem, eight stems attach to a branch, and so on, until all the points are connected to the trunk. To evaluate the force on a given point, the program climbs up the tree from the root, adding the contributions from branches and stems found along the way until it encounters individual leaves. This trick reduces the number of required calculations from an incomputable n² to a much more manageable n log₁₀ n, says Springel.
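The description matches the classic Barnes-Hut octree scheme. Below is a compact sketch of that general idea -- my own illustrative reconstruction, not the Virgo group's code; the opening-angle threshold is an assumed tuning parameter:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, SI units
THETA = 0.5    # opening angle: larger = faster but less accurate (assumed value)

class Node:
    """One cube of space: either a leaf holding a single point, or an internal
    node with up to eight children (the 'stems' and 'branches' of the tree)."""
    def __init__(self, center, size):
        self.center, self.size = center, size
        self.mass = 0.0
        self.com = np.zeros(3)      # running mass-weighted position sum
        self.children = None        # None means this node is a leaf
        self.point = None

    def insert(self, p, m):
        if self.children is None and self.mass == 0.0:   # empty leaf: store point
            self.point, self.mass, self.com = p, m, m * p
            return
        if self.children is None:                        # occupied leaf: split in 8
            old_p, old_m = self.point, self.mass
            self.point, self.mass, self.com = None, 0.0, np.zeros(3)
            self.children = [None] * 8
            self._push_down(old_p, old_m)
        self._push_down(p, m)

    def _push_down(self, p, m):
        self.mass += m
        self.com += m * p
        octant = int(np.dot(p > self.center, [1, 2, 4]))
        if self.children[octant] is None:
            offset = np.array([octant & 1, (octant >> 1) & 1, (octant >> 2) & 1]) - 0.5
            self.children[octant] = Node(self.center + offset * self.size / 2,
                                         self.size / 2)
        self.children[octant].insert(p, m)

    def acceleration(self, p, theta=THETA):
        """Walk the tree: use a whole node's lumped mass if it looks 'small'
        from p, otherwise open it and descend into its children."""
        if self.mass == 0.0:
            return np.zeros(3)
        com = self.com / self.mass
        d = com - p
        r = np.sqrt((d ** 2).sum())
        if r == 0.0:                                   # the point itself; no self-force
            return np.zeros(3)
        if self.children is None or self.size / r < theta:
            return G * self.mass * d / r ** 3          # lumped contribution
        return sum((c.acceleration(p, theta) for c in self.children if c is not None),
                   start=np.zeros(3))

# Usage sketch: build a tree over random points and evaluate one acceleration.
rng = np.random.default_rng(3)
points = rng.random((10_000, 3))
root = Node(center=np.full(3, 0.5), size=1.0)
for p in points:
    root.insert(p, 1.0)
a = root.acceleration(points[0])
```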
After these two tricks were introduced into the software, the simulation started. It was completed in June 2004, generating about 20 terabytes of results. These results, which represent 64 snapshots of a virtual universe, will be available to all of us in the months to come. But who will really have access to such an amount of data outside universities and research centers? My guess is that the Virgo Consortium will find a way to reduce the size of the snapshots for regular folks. So stay tuned for the next developments.
Sources: Alexander Hellemans & Madhusree Mukerjee, "Computing the Cosmos," IEEE Spectrum, Vol. 41, No. 8, p. 28, August 2004