GRChombo: An adaptable numerical relativity code for fundamental physics

GRChombo is an open-source code for performing numerical relativity time evolutions, built on top of the publicly available Chombo software for the solution of PDEs. Whilst GRChombo uses standard techniques in NR, it focusses on applications in theoretical physics where adaptability, both of the grid structure and of the code itself, is a key driver.

Strong gravity regimes are described by the Einstein Field Equation (EFE) of General Relativity (Einstein, 1916), which in units where $G = c = 1$ reads

$$R_{\mu\nu} - \frac{1}{2} R \, g_{\mu\nu} = 8\pi \, T_{\mu\nu} \,,$$

where $g_{\mu\nu}$ is the gravitational metric describing spacetime distances, $R$ and $R_{\mu\nu}$ are related to its second derivatives in space and time, and $T_{\mu\nu}$ is the stress-energy tensor of any matter or fields present. The indices $\mu, \nu$ run over the spacetime coordinates (in 4 dimensions, $t, x, y, z$), and since $g_{\mu\nu}$ is symmetric in its indices, the EFE is a set of ten coupled, nonlinear, second-order partial differential equations for $g_{\mu\nu}$, describing the curvature of spacetime in the presence of the stress-energy of any matter or fields in the spacetime. Analytic solutions to the EFE exist only where there is a high degree of symmetry; in general the equations must be solved numerically. The need for observational predictions has thus led to the development of numerical relativity (NR): methods for solving the EFE numerically, typically utilising high-performance computing (HPC) resources.
One common approach to NR is to specify an initial spatial distribution for the metric and matter fields (subject to certain constraints), and then to evolve all metric and matter quantities forward in time, thus populating their values throughout the four-dimensional spacetime.
The canonical example of this is the simulation of two black holes in orbit around each other, which permits extraction of the gravitational wave signal produced during the merger. Such numerical results have been instrumental in discovering signals in the noisy LIGO/Virgo detector data, as well as confirming the predictions of GR to a high precision in the strong field regime (B. P. Abbott & others, 2016; R. Abbott & others, 2020).
GRChombo is an open-source code for performing such NR time evolutions, built on top of the publicly available Chombo software (Adams & others, 2015) for the solution of PDEs. Whilst GRChombo uses standard techniques in NR, it focusses on applications in theoretical physics where adaptability, both of the grid structure and of the code itself, is a key driver.

Key features of GRChombo
Since its initial announcement in 2015 (Clough et al., 2015), the GRChombo code has become a fully mature, open-source NR resource. Its key features include:
• Boundary Conditions: The code implements periodic, Sommerfeld (radiative), extrapolating and reflective boundary conditions.
• Initial Conditions: The current examples provide analytic or semi-analytic initial data for black hole binaries, Kerr black holes and scalar matter. The code also incorporates a standalone version of the TwoPunctures code (Ansorg et al., 2004) for accurate binary BH data of arbitrary masses, momenta and spins (up to the usual limit for Bowen-York data of around a/M = 0.9 for the dimensionless spin parameter).
• Diagnostics: GRChombo has routines for finding black hole horizons, calculating spacetime masses, angular momenta, densities, fluxes and extracting gravitational waves.
• C++ class structure: GRChombo is written in the C++ language, and makes heavy use of object-oriented programming (OOP) and templating.
• Parallelism: GRChombo uses hybrid OpenMP/MPI parallelism with explicit vectorisation of the evolution equations via intrinsics, and is AVX-512 compliant. Our code demonstrates efficient strong scaling up to several thousand CPU-cores for a typical BH binary problem, and further for larger problem sizes.
• Adaptive Mesh Refinement: The underlying Chombo code provides Berger-Oliger style (M. J. Berger & Oliger, 1984) AMR with block-structured Berger-Rigoutsos grid generation (M. Berger & Rigoutsos, 1991). The tagging of refinement regions is fully flexible and can be based on truncation error or other user-defined measures.
The code continues to be actively developed with a number of ongoing projects to add new features.
Images of the inhomogeneous inflaton field in (Aurrekoetxea, Clough, et al., 2020) and the evolution of the equation of state and density in (Joana & Clesse, 2021).
Images of testing cosmic censorship with higher-dimensional black rings in (Figueras et al., 2017) and mapping regions of validity for modified gravity in (Figueras & França, 2020).
Images of scalar field accretion around a spinning BH from (Bamber et al., 2021), and the relativistic scaling of dynamical friction from (Traykova et al., 2021).
Recent applications also include the study of gravitational recoil in unequal mass binaries (Radia et al., 2021).

The authors acknowledge the use of the following HPC facilities and resources:
• PRACE (Partnership for Advanced Computing in Europe) resources under grant numbers 2018194669 and 2020225359. Systems used include:
- Cartesius (SURF), Netherlands
- Marenostrum (BSC), Spain
• the Argonne Leadership Computing Facility, including the Joint Laboratory for System Evaluation (JLSE), which is a U.S. Department of Energy (DOE) Office of Science User Facility supported under Contract DE-AC02-06CH11357
• the Texas Advanced Computing Center (TACC) at the University of Austin (http://www.tacc.utexas.edu) and the San Diego Supercomputing Center (SDSC) (https://www.sdsc.edu), under project PHY-20043 and XSEDE Grant No. NSF-PHY-090003
• the Consortium des Équipements de Calcul Intensif (CÉCI), funded by the Fonds de la Recherche Scientifique de Belgique (F.R.S.-FNRS), Belgium
• the Marconi HPC resources and software support (awarded by CINECA), Italy
• the Glamdring cluster, Astrophysics, Oxford, UK
• the Fawcett cluster, Faculty of Mathematics, Cambridge, UK
• the Argo cluster at ICTP, Trieste, Italy
• the Apocrita cluster at QMUL, UK
• the Athena cluster at HPC Midlands Plus, UK
• the Cosmo cluster at CURL, University of Louvain, Belgium
• the STFC DiRAC HPC Facility (www.dirac.ac.uk), which is part of the National e-Infrastructure:
- the DiRAC Data Intensive service at Leicester, operated by the University of Leicester IT Services; the equipment was funded by BEIS capital funding via STFC capital grants ST/K000373/1 and ST/R002363/1 and STFC DiRAC Operations grant ST/R001014/1
- the DiRAC at Durham facility, managed by the Institute for Computational Cosmology; the equipment was funded by BEIS capital funding via STFC capital grants ST/P002293/1 and ST/R002371/1, Durham University, and STFC operations grant ST/R000832/1
- the DiRAC Complexity system, operated by the University of Leicester IT Services; this equipment was funded by BIS National e-Infrastructure capital grant ST/K000373/1 and STFC DiRAC Operations grant ST/K0003259/1