MOOSE Newsletter (August 2021)
AD Scalar Kernels
Scalar kernels can now use automatic differentiation (AD). While AD is not necessary for systems of ordinary differential equations (ODEs) involving only scalar variables (due to the exact Jacobians offered by ParsedODEKernel, for example), ODEs involving contributions from field variables benefit greatly from AD. For example, an elemental user object may compute an ADReal value from field variable(s) on a domain, which then may be used in a scalar equation.
To create an AD scalar kernel, derive from ADScalarKernel and implement the method computeQpResidual(). As a caution, if using user objects to compute ADReal values, be sure to execute those user objects on NONLINEAR to ensure that the derivatives in the ADReal values are populated.
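As a minimal sketch (not taken from the newsletter), an AD scalar kernel might look like the following. The class name, the lambda parameter, and the use of the _u and _i members are illustrative assumptions patterned on other MOOSE scalar kernels; only the base class and computeQpResidual() override come from the text above.

```cpp
// Hypothetical AD scalar kernel (header-only for brevity): adds a decay term
// lambda * u to the scalar equation's residual. The Jacobian is obtained
// automatically through AD, so no hand-coded Jacobian methods are needed.
#pragma once

#include "ADScalarKernel.h"

class ExampleADScalarDecay : public ADScalarKernel
{
public:
  static InputParameters validParams()
  {
    InputParameters params = ADScalarKernel::validParams();
    params.addClassDescription("Adds a decay term lambda * u to a scalar equation.");
    params.addParam<Real>("lambda", 1.0, "Decay coefficient");
    return params;
  }

  ExampleADScalarDecay(const InputParameters & parameters)
    : ADScalarKernel(parameters), _lambda(getParam<Real>("lambda"))
  {
  }

protected:
  // Residual for the current component of the scalar variable; _u and _i are
  // assumed here to be the scalar variable value and component index
  // provided by the base class, as in other MOOSE scalar kernels.
  virtual ADReal computeQpResidual() override { return _lambda * _u[_i]; }

  const Real _lambda;
};
```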
Bug Fixes and Minor Enhancements
In ExplicitSSPRungeKutta, a bug was fixed in which a segmentation fault could occur in some corner cases when using automatic differentiation: computeADTimeDerivatives was called before some stage data had been initialized, even though the resulting value is not used in those cases. A quiet NaN is now returned instead of dereferencing an invalid pointer.

When using time integrators derived from ExplicitTimeIntegrator, setting solve_type to anything other than LINEAR caused a segmentation fault. Now, an error message is given instead.