William Roper, Peter Thomas, Chris Lovell, Kyle Oman, Thomas Sandnes, Victor Forouhar
Session type:
Regular
Description:
Simulations are currently looking down the barrel of a new regime of observational data, whether from the ever-expanding high-redshift frontier and the mysteries of exoplanets probed by JWST, or from the massive surveys due to be conducted by the likes of the Euclid Space Telescope and the Vera C. Rubin Observatory. For our "virtual twins" to keep up, they must probe statistically significant samples at high enough resolution to resolve detailed physics within robust models. This demands high-resolution simulations of large samples or, in the case of stellar and planetary simulations, numerous individual realisations of objects: a hugely computationally demanding proposition.
In addition to the required volumes, these observational frontiers probe physical mechanisms that are poorly represented in large-volume computational astrophysics studies: from the reionisation of the Universe on the largest scales, to the coevolution of black holes and their accretion disks, and everything in between and beyond. The fidelity with which we model the Universe needs to improve in step with the fidelity of observational data, as do our analysis methods. To truly compare theoretical and observational results, we need to compare like for like. This requires robust and complete physical models as a foundation, and forward-modelling software built on top of that foundation to produce synthetic observations.
Even so, we are not lost in this game of catch-up: novel and efficient implementations are plentiful, HPC facilities are ever increasing in computational power, and the Exascale regime is just around the corner in the UK. With the rise of the Exascale regime comes the advent of accelerator-based systems, ever-higher thread counts and ample memory. However, this increase in computational potential could become a poisoned chalice without proper consideration. Not only do we have to adapt to new architectures and handle calculations orders of magnitude larger than those of the past, we also need to store and analyse the huge data footprint they produce, and mitigate the energy consumption inherent in running these simulations and storing their data.
In this session, we want to promote both scientific and technical discussions surrounding the next generation of simulations across spatial regimes. From planetary impact and formation simulations to massive cosmological simulations encompassing the Hubble volume, what do we need to interpret and understand observations at the frontier? What physics is missing from our models? How do we best utilise the hardware at our disposal? How do we efficiently analyse the huge datasets we will produce? Do we have common workflows and best practices? And, most importantly, how do we do all of this without undue environmental impact from our energy usage?
Topic:
Techniques
Schedule:
30 Minutes
Matthieu Schaller
The SWIFT code, the FLAMINGO model, and the road to exa-scale
15 Minutes
Alex Richings
The CHIMES non-equilibrium chemistry package: Modelling the chemistry of ions and molecules in simulations of galaxy formation
15 Minutes
David O'Ryan
Painting Galaxies: Putting Statistical Constraint on Galaxy Interaction
15 Minutes
Eva Duran Camacho
Numerical simulations of Milky Way-type galaxies: findings and limitations
15 Minutes
Akshay Priyadarshi
A Galactic population synthesis approach to exoplanet demography
All attendees are expected to show respect and courtesy to other attendees and staff, and to adhere to the NAM Code of Conduct.