Understanding turbulence is critical for a wide range of terrestrial and astrophysical applications. For example, turbulence on Earth is responsible for the transport of pollutants in the atmosphere. Turbulence also plays a central role in astrophysics: the turbulent motions of gas and dust particles in protostellar disks enable the formation of planets, and virtually all modern theories of star formation rest on the statistics of turbulence (Federrath & Klessen 2012; Padoan et al. 2014).

Here we present the first results of the world's largest simulation of supersonic turbulence. The simulation has an unprecedented resolution of 10,048^3 grid cells, ran on 65,536 compute cores at the Leibniz Supercomputing Centre using 131 TB of main memory, and produced about 2 PB of three-dimensional output data. The main aim of the simulation is to understand the transition from supersonic to subsonic turbulence, the so-called 'sonic scale', which is a key ingredient for star formation.
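To give a rough sense of the numbers involved, here is a minimal back-of-the-envelope sketch. The per-cell variable count and the Mach number used below are illustrative assumptions for this sketch, not the actual simulation parameters; the sonic-scale estimate uses the commonly quoted velocity scaling v(l) ~ l^(1/2) for supersonic turbulence.

```python
# Back-of-the-envelope estimates for a 10,048^3 uniform-grid turbulence run.
# The per-cell variable count and the Mach number are assumptions for
# illustration only, not the actual parameters of the simulation above.

N = 10048                     # grid cells per dimension
cells = N**3                  # total number of cells (~1.0e12)

# Memory estimate: assume roughly 16 double-precision variables per cell
# (hydro state plus temporary buffers and guard cells) at 8 bytes each.
bytes_per_cell = 16 * 8
memory_tb = cells * bytes_per_cell / 1e12
print(f"cells: {cells:.3e}, rough memory: {memory_tb:.0f} TB")
# -> ~130 TB, of the same order as the quoted 131 TB of main memory

# Sonic-scale estimate: with v(l) ~ v_L * (l / L)**(1/2) and v(l_s) = c_s,
# one gets l_s / L ~ 1 / Mach**2 (up to an order-unity coefficient).
mach = 4.0                    # assumed rms Mach number on the driving scale L
l_s_over_L = 1.0 / mach**2
print(f"sonic scale: l_s ~ {l_s_over_L:.1e} * L")
```

The second estimate illustrates why such a high resolution is needed: the sonic scale sits far below the driving scale, and the simulation must resolve turbulence well on both sides of it.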
Below is a simulation movie of the projected gas density in the regime of fully developed turbulence:
[ EXTREME_coldens_new.mp4, 13MB high-res mpeg 4 video ]
The same projection with the colour-bar scaling and the full simulation box shown:
[ EXTREME_coldens.mpeg, 67MB high-res mpeg ]
This figure shows a scaling test of our modified version of the FLASH code, which was used to run the simulation. The code shows excellent scaling up to 100,000 compute cores.
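As an aside, parallel scaling of this kind is usually quantified by speedup and efficiency relative to a baseline core count. The snippet below shows that standard calculation on made-up wall-clock times; these numbers are not the FLASH benchmark data behind the figure.

```python
# Standard strong-scaling metrics, computed on illustrative timing numbers.
# The actual FLASH benchmark data behind the scaling figure are not reproduced here.

# cores -> wall-clock time per time step in seconds (made-up values)
timings = {4096: 64.0, 8192: 33.0, 16384: 17.5, 32768: 9.2, 65536: 5.0}

base_cores = min(timings)          # smallest run is the baseline
base_time = timings[base_cores]

for cores, t in sorted(timings.items()):
    speedup = base_time / t                # measured speedup over the baseline
    ideal = cores / base_cores             # ideal (linear) speedup
    efficiency = speedup / ideal           # parallel efficiency
    print(f"{cores:>6} cores: speedup {speedup:5.1f}x "
          f"(ideal {ideal:4.0f}x), efficiency {efficiency:5.1%}")
```

An efficiency close to 100% across the whole range of core counts is what "excellent scaling" refers to in the figure.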
Turbulence has a wide range of applications in science and engineering, including the amplification of magnetic fields, star and planet formation, mixing of pollutants in the atmosphere, fuel ignition in engines, and many more. In generating the huge turbulence dataset presented here, we have begun to reach the technical limits of what is currently feasible on any supercomputer in the world. We are pushing the boundaries further with this extremely high-resolution simulation, hoping to unravel the statistics of supersonic and subsonic magnetized turbulence in the near future.
The simulations used supercomputing resources on SuperMUC provided by the Gauss Centre for Supercomputing and the Leibniz Supercomputing Centre (grants pr48pi and pr32lo). This work was supported by the Discovery Projects Scheme of the Australian Research Council (grants DP150104329 and DP17010603). The software used here was in part developed by the DOE-supported ASC/Alliance Center for Astrophysical Thermonuclear Flashes at the University of Chicago.