DMStag: Staggered, Structured Grids for PETSc

“Staggered grid” codes arise in many applications in which a physical problem is discretized and solved in terms of unknowns attached to different locations on a structured grid. For example, the Navier-Stokes and Stokes equations can be solved on a staggered grid with the classical Marker and Cell (MAC) method (Harlow & Welch, 1965), storing pressure unknowns in element centers and velocities on element boundaries. This gives a simple, compact method which exhibits superior numerical properties compared to colocated grid methods, which consider all unknowns to be located on the same grid of points.

A DMStag object encapsulates:

1. A discrete topological space (here, a structured cell complex)
2. An atlas of overlapping local patches, assigned to MPI ranks
3. A field assigning sets of scalar unknowns to each point in the topological space (here, a constant number of unknown "DOFs" for each point in a given stratum, defined as cells of a given dimension)
4. A special field for coordinates of each point

DMStag is included with and requires PETSc. As of PETSc 3.17, the vast majority of the source code implementing DMStag can be found in src/dm/impls/stag/, include/petscdmstag.h, and include/petsc/private/dmstagimpl.h. For more information, please consult the PETSc manual chapter on DMStag and the DMStag manual pages.
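As an illustrative sketch (not taken from the paper's examples), the components above map onto a short setup sequence: a 2D grid for a MAC-type discretization would assign zero DOFs to vertices, one to each face (velocity components), and one to each element (pressure). The grid sizes and coordinate range here are arbitrary choices for illustration.

```c
#include <petscdmstag.h>

int main(int argc, char **argv)
{
  DM dm;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* 2D staggered grid: 0 DOF per vertex, 1 per face, 1 per element */
  PetscCall(DMStagCreate2d(PETSC_COMM_WORLD,
                           DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, /* non-periodic */
                           32, 32,                             /* global elements (illustrative) */
                           PETSC_DECIDE, PETSC_DECIDE,         /* ranks per dimension */
                           0, 1, 1,                            /* DOF per vertex, face, element */
                           DMSTAG_STENCIL_BOX, 1,              /* stencil type and width */
                           NULL, NULL,                         /* default element distribution */
                           &dm));
  PetscCall(DMSetFromOptions(dm));
  PetscCall(DMSetUp(dm));
  /* The special coordinate field, on the unit square */
  PetscCall(DMStagSetUniformCoordinatesExplicit(dm, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0));
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}
```

The per-stratum DOF counts correspond directly to component 3 of the list above; the atlas of overlapping patches (component 2) is determined by the stencil type and width together with the rank decomposition.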

Statement of need
The discretization of PDEs on structured (or "regular") grids of cells, associating unknown quantities with several different geometric entities (for instance, cell centers and cell faces), has applications in many fields. The highly structured nature of this discretization and the relative simplicity of many of the stencils can lead to highly efficient and maintainable application codes which solve important problems in fluid simulation (including weather and climate simulation and geophysical flows) and electromagnetic and plasma simulations. Users have commonly implemented these methods with PETSc, using the DMDA object. However, since this represents a structured grid with an equal number of unknowns attached to each vertex (or cell center, if interpreted that way), a staggered grid must either be represented as a collection of DMDA objects, or by introducing a convention to store staggered data at a nearby non-staggered location, ignoring unused points on the boundary. This is inconvenient and can also have performance consequences. The choice of approach commits one to interlacing or segregating data from different physical fields, and the presence of unused "dummy" points in the global representation may induce additional complications when attempting to use grid-aware solvers like PETSc's geometric or algebraic multigrid solvers, or block-based preconditioners (PCFIELDSPLIT).
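To make the DMDA-based workaround concrete, a hypothetical sketch (the layout convention and sizes here are illustrative, not prescribed by PETSc): a 2D MAC discretization can be stored on a single DMDA with three interlaced DOFs per point, interpreting them as the pressure at the element center and the velocities on the element's left and bottom faces, at the cost of an extra "dummy" layer of points for faces on the top and right boundaries.

```c
#include <petscdmda.h>

int main(int argc, char **argv)
{
  DM da;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* One extra point in each direction so faces on the top/right boundary
     have a storage location; some entries there are unused "dummy" points. */
  PetscCall(DMDACreate2d(PETSC_COMM_WORLD,
                         DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_BOX,
                         33, 33,                     /* 32x32 elements plus a dummy layer */
                         PETSC_DECIDE, PETSC_DECIDE,
                         3,                          /* interlaced p, vx, vy per point */
                         1,                          /* stencil width */
                         NULL, NULL, &da));
  PetscCall(DMSetUp(da));
  PetscCall(DMDestroy(&da));
  PetscCall(PetscFinalize());
  return 0;
}
```

This single choice interlaces all three fields; splitting them across several DMDA objects instead segregates them, and either way the dummy points remain visible to grid-aware solvers, which is the complication DMStag removes.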
DMStag is currently used in research simulating magmatic flow in the Earth's mantle as part of the FD-PDE framework, and in MHD simulations of tokamaks. Its base functionality was introduced into PETSc with version 3.11 (March 2019).
We are not aware of any directly-comparable general-purpose frameworks. In many application codes, the relatively simple data structures involved can be implemented directly. This approach has the obvious advantage of avoiding a heavy dependency like PETSc, but could incur a larger investment of implementation effort to experiment with new, more-scalable solvers or with GPU-backed data structures, both of which PETSc's focus on composability and portability aims to make more accessible. Projects like GridTools and STELLA (Gysi et al., 2015) provide high-performance stencil operations for particular domains, in their case climate and weather modelling.

Examples
A few example codes are available with PETSc, currently in src/dm/impls/stag/tutorials. Additional codes in src/dm/impls/stag/tests are used for testing.