GRDzhadzha: A code for evolving relativistic matter on analytic metric backgrounds

The following brief overview has been prepared as part of the submission of the code to the Journal of Open Source Software. The code itself can be found at https://github.com/GRChombo/GRDzhadzha.


Summary
Strong gravity environments such as those around black holes provide us with unique opportunities to study questions in fundamental physics, such as the existence and properties of dark matter and dark energy. Characterising the behaviour of new fields and other types of matter in highly relativistic environments generally necessitates numerical simulations unless one imposes significant symmetries. We therefore need to turn to numerical methods to study the dynamics and evolution of the complex systems of black holes and other compact objects in different environments, using numerical relativity (NR). These methods allow us to split the four-dimensional Einstein equations into three-dimensional spatial hypersurfaces and a time-like direction. Then, if a solution is known on the initial spatial hypersurface, it can be numerically evolved in time into the regime where an analytic solution no longer exists. Whilst the tools of NR provide the most complete (i.e., approximation free) method for evolving matter in such environments, in many cases of interest the density of the matter components is negligible in comparison to the curvature scales of the background spacetime metric, in which case it is a reasonable approximation to neglect their backreaction on it and treat the metric as fixed (assuming the background itself is stationary or otherwise has an analytic form).
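For orientation, the 3+1 split referred to above decomposes the four-dimensional line element into spatial and time-like parts. In standard ADM notation (this equation is textbook material, included here only for reference):

```latex
ds^2 = -\alpha^2 \, dt^2
     + \gamma_{ij}\left(dx^i + \beta^i dt\right)\left(dx^j + \beta^j dt\right)
```

where $\alpha$ is the lapse, $\beta^i$ the shift vector and $\gamma_{ij}$ the metric on the spatial hypersurfaces. For a fixed analytic background, all of these are known functions of the coordinates rather than evolved variables.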
In such cases, one does not need to evolve all the metric degrees of freedom as in NR, but only the additional matter ones. It is possible to do this using any NR code in a trivial way by setting the evolution of the metric variables to zero, but this is clearly rather inefficient. This code, GRDzhadzha, directly evolves the matter variables on an analytically specified background. This significantly speeds up the computation time and reduces the resources needed (both in terms of CPU hours and storage) to perform a given simulation. The code is based on the publicly available NR code GRChombo [1,2], which itself uses the open source Chombo framework [3] for solving PDEs. In the following sections we discuss the key features and applications of the code, and give an indication of the efficiencies that can be achieved compared to a standard NR code.

Key features
GRDzhadzha inherits many of the features of GRChombo and Chombo, but avoids the complications introduced when evolving the metric. The key features are:

• Background metrics: The backgrounds currently available in the code are a static Kerr black hole in horizon penetrating Kerr-Schild coordinates and a boosted black hole in isotropic Schwarzschild coordinates. These backgrounds can easily be adapted to other coordinate systems for different problems. The code is templated over the background so it can easily be changed without major code modification.

• Matter evolution: The code calculates the evolution of the matter variables on the metric background using an ADM decomposition in space and time; currently we have implemented a real and a complex scalar field as examples of matter types. Again, the code is templated over the matter class so that the matter types can be exchanged with minimal modification.

• Accuracy: The metric values and their derivatives are calculated exactly at each point, whereas the matter fields are evolved with a 4th order Runge-Kutta time integration, and their derivatives are calculated with the same finite difference stencils used in GRChombo (4th and 6th order are currently available).

• Initial conditions: The current examples provide initial data for real and complex scalar field matter. Since backreaction is ignored, there are no constraint equations to satisfy in the case of a scalar field, and the initial data can be freely specified.

• Diagnostics: GRDzhadzha has routines for verifying the conservation of matter energy densities, angular and linear momentum densities, and their fluxes, as discussed in [4,5].

• C++ class structure: Following the structure of GRChombo, GRDzhadzha is also written in C++ and uses object oriented programming (OOP) and templating.

• Parallelism: GRChombo uses hybrid OpenMP/MPI parallelism with explicit vectorisation of the evolution equations via intrinsics, and is AVX-512 compliant.

• Adaptive mesh refinement: The code inherits the flexible AMR grid structure of Chombo, which provides Berger-Oliger style [6] AMR with block-structured Berger-Rigoutsos grid generation [7]. Depending on the problem, the user may specify the refinement to be triggered by the matter or by the background spacetime [8]. One nice feature is that one does not need to resolve the horizon of the black hole unless matter is present at that location, so for an incoming wave a lot of storage and processing time can be saved by resolving only the wave, and not the spacetime background.
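To illustrate the templating pattern described above (an analytic background class combined with an exchangeable matter class), the following is a minimal sketch. All class and function names here are hypothetical and chosen for illustration; they are not the actual GRDzhadzha API. The lapse shown is the standard analytic expression for non-spinning Kerr-Schild coordinates, computed on the fly rather than stored on the grid.

```cpp
#include <array>
#include <cmath>

// Hypothetical analytic background: a non-spinning black hole in
// Kerr-Schild coordinates, where the lapse alpha = 1/sqrt(1 + 2M/r)
// is a known closed-form function of position.
struct KerrSchildBH {
    double M = 1.0; // black hole mass
    double lapse(const std::array<double, 3>& x) const {
        double r = std::sqrt(x[0] * x[0] + x[1] * x[1] + x[2] * x[2]);
        return 1.0 / std::sqrt(1.0 + 2.0 * M / r);
    }
};

// Hypothetical matter class: state of a real scalar field at one grid
// point (field value phi and its conjugate momentum Pi).
struct ScalarField {
    double phi = 0.0;
    double Pi = 0.0;
};

// The evolution class is templated over both background and matter, so
// either can be swapped without modifying the evolution code itself.
template <class background_t, class matter_t>
struct MatterEvolution {
    background_t m_background;

    // Toy right-hand side for dphi/dt, using the analytic lapse.
    // (Sign and gauge conventions are illustrative only.)
    double phi_rhs(const matter_t& vars,
                   const std::array<double, 3>& x) const {
        return -m_background.lapse(x) * vars.Pi;
    }
};
```

The point of the design is that `MatterEvolution<KerrSchildBH, ScalarField>` can become, say, `MatterEvolution<BoostedBH, ComplexScalarField>` with no change to the evolution machinery.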
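For readers unfamiliar with the numerical schemes mentioned under "Accuracy", the following standalone sketch shows the standard 4th order centred first-derivative stencil and a classic 4th order Runge-Kutta step. This is generic textbook numerics written for illustration, not code taken from GRChombo or GRDzhadzha.

```cpp
#include <cmath>
#include <functional>

// 4th order centred finite difference approximation to f'(x):
// f'(x) ~ (-f(x+2h) + 8 f(x+h) - 8 f(x-h) + f(x-2h)) / (12 h)
double deriv4(const std::function<double(double)>& f, double x, double h) {
    return (-f(x + 2 * h) + 8.0 * f(x + h)
            - 8.0 * f(x - h) + f(x - 2 * h)) / (12.0 * h);
}

// One classic 4th order Runge-Kutta step for du/dt = rhs(t, u).
double rk4_step(const std::function<double(double, double)>& rhs,
                double t, double u, double dt) {
    double k1 = rhs(t, u);
    double k2 = rhs(t + 0.5 * dt, u + 0.5 * dt * k1);
    double k3 = rhs(t + 0.5 * dt, u + 0.5 * dt * k2);
    double k4 = rhs(t + dt, u + dt * k3);
    return u + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4);
}
```

The stencil is exact for polynomials up to 4th degree, and the RK4 step has a local truncation error of order dt^5, matching the accuracy statements in the feature list.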

Statement of Need
As mentioned above, any numerical relativity code like GRChombo can in principle undertake these simulations. Many such NR codes exist, both public and private: for example, CosmoGRaPH is based on the SAMRAI infrastructure and has targeted fluid and MHD applications, whilst GRAthena++ [30] makes use of oct-tree AMR to maximise scaling. However, whilst these codes can in principle be used to perform simulations of fundamental fields on a fixed BH background, most do not have the efficiency advantages of GRDzhadzha. In particular, the fact that the ADM variables and their derivatives are not evolved or stored on the grid saves a lot of both simulation run time and output file storage space. To get a rough idea of the improvement in storage and CPU hours one can achieve, we performed a short test simulation using GRDzhadzha and compared it to a simulation performed using the full NR capabilities of GRChombo. We find that on average GRDzhadzha is 15-20 times faster than GRChombo and requires about 3 times less file storage. An additional advantage of this code versus a full NR code, for problems with negligible backreaction, is that the metric variables are calculated analytically at every point on the grid, which significantly decreases the margin for numerical error and means that resolution can be focussed on the location of the matter, rather than on the spacetime curvature.
It is important to note that whilst backreaction is neglected in the metric calculation, this does not mean that backreaction effects cannot be calculated. Fixed background simulations provide a first order (in the density) estimate of the gravitational effects caused by the matter, taking into account its relativistic behaviour. This is discussed further in [4], and some examples using the approach are [33,34,35].
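As a concrete example of the quantity involved in such estimates, for a real scalar field $\varphi$ with potential $V(\varphi)$ the stress-energy tensor takes the standard form:

```latex
T_{\mu\nu} = \nabla_\mu\varphi \, \nabla_\nu\varphi
           - g_{\mu\nu}\left(\tfrac{1}{2}\,\nabla^\lambda\varphi \, \nabla_\lambda\varphi
           + V(\varphi)\right)
```

evaluated on the fixed background metric $g_{\mu\nu}$. Projections of $T_{\mu\nu}$ onto the spatial hypersurfaces give the energy and momentum densities and fluxes tracked by the diagnostic routines described under Key features.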
Since the interface and structure of the code is very close to the GRChombo numerical relativity code, it is possible for the results of these fixed background simulations to be used as initial data in full numerical relativity simulations (and vice versa), as was done in [36]. Therefore, if the backreaction is found to be significant due to some growth mechanism, the simulation can be continued in full NR. Examples of problems that have been studied with this approach include:
• Studying the interference patterns in neutrino flavour oscillations around a static black hole [37].
• Growth of scalar hair around a Schwarzschild [38] and a Kerr [33] black hole.
• Determining the relativistic drag forces on a Schwarzschild black hole moving through a cloud of scalar field dark matter [34,35].
• Studying the dynamical friction effects on a Kerr black hole (Magnus effect) [39].
• BH mergers in wave dark matter environments [36].

Acknowledgements
We acknowledge the use of the following computing resources:
• DiRAC (Distributed Research utilising Advanced Computing) resources under the projects ACSP218, ACSP191, ACTP183, ACTP186 and ACTP316. DiRAC is part of the National e-Infrastructure. Systems used include:
  - Cambridge Service for Data Driven Discovery (CSD3), part of which is operated by the University of Cambridge Research Computing on behalf of the STFC DiRAC HPC Facility (www.dirac.ac.uk).
  - DiRAC Complexity system, operated by the University of Leicester IT Services, which forms part of the STFC DiRAC HPC Facility (www.dirac.ac.uk). This equipment is funded by BIS National E-Infrastructure capital grant ST/K000373/1 and STFC DiRAC Operations grant ST/K0003259/1.
• Sakura, Cobra and Raven clusters at the Max Planck Computing and Data Facility (MPCDF) in Garching, Germany.
• PRACE (Partnership for Advanced Computing in Europe) resources under grant numbers 2018194669 and 2020225359. Systems used include:
  - SuperMUC-NG, Leibniz Supercomputing Centre (LRZ), Germany
  - JUWELS, Juelich Supercomputing Centre (JSC), Germany
• The Argonne Leadership Computing Facility, including the Joint Laboratory for System Evaluation (JLSE), which is a U.S. Department of Energy (DOE) Office of Science User Facility supported under Contract DE-AC02-06CH11357.
• The Texas Advanced Computing Center (TACC) at the University of Texas at Austin (http://www.tacc.utexas.edu) and the San Diego Supercomputing Center (SDSC) (https://www.sdsc.edu), under project PHY-20043 and XSEDE Grant No. NSF-PHY-090003.
• The Glamdring cluster, Astrophysics, Oxford, UK.