Volume 56, pp. 86-101, 2022.

Structure preservation for the Deep Neural Network Multigrid Solver

Nils Margenberg, Christian Lessig, and Thomas Richter

Abstract

The simulation of partial differential equations is a central subject of numerical analysis and an indispensable tool in science, engineering, and related fields. Existing approaches, such as finite elements, provide (highly) efficient tools, but deep neural network-based techniques have emerged in recent years as an alternative with very promising results. We investigate the combination of both approaches for the approximation of the Navier-Stokes equations and ask to what extent structural properties such as divergence freedom can and should be respected. Our work is based on DNN-MG, a deep neural network multigrid technique that we introduced recently and which uses a neural network to represent fine-grid fluctuations not resolved by a geometric multigrid finite element solver. Although DNN-MG provides solutions with very good accuracy and is computationally highly efficient, we observed that the neural network-based corrections substantially violate the divergence freedom of the velocity vector field. In this contribution, we discuss these findings and analyze three approaches to address the problem: a penalty term that encourages divergence freedom of the network output; a penalty term on the corrected velocity field; and a network that learns the stream function and hence yields divergence-free corrections by construction. Our experimental results show that the third approach, based on the stream function, outperforms the other two and improves not only the divergence freedom but also the overall fidelity of the simulation.
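The abstract only names the three strategies, so the following minimal JAX sketch illustrates the underlying idea under our own assumptions; the network `mlp`, its architecture, and all helper names are hypothetical and not the DNN-MG implementation. A network predicts a scalar stream function psi(x, y), the velocity correction is taken as u = (dpsi/dy, -dpsi/dx), which is divergence-free by construction, and the pointwise divergence penalty used by the first two strategies is shown for comparison.

```python
import jax
import jax.numpy as jnp


def init_mlp(key, sizes):
    """Initialize a small fully connected network (hypothetical architecture)."""
    params = []
    for d_in, d_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (d_in, d_out)) / jnp.sqrt(d_in),
                       jnp.zeros(d_out)))
    return params


def mlp(params, x):
    """Plain tanh MLP mapping a point (x, y) to a scalar stream function value."""
    for w, b in params[:-1]:
        x = jnp.tanh(x @ w + b)
    w, b = params[-1]
    return (x @ w + b)[0]


def velocity_from_stream(params, xy):
    """2D correction u = (dpsi/dy, -dpsi/dx).

    Since div u = psi_yx - psi_xy = 0, the output is divergence-free
    by construction, independent of the network weights.
    """
    dpsi = jax.grad(mlp, argnums=1)(params, xy)
    return jnp.array([dpsi[1], -dpsi[0]])


def divergence_penalty(vel_fn, params, xy):
    """Alternative strategy: a loss term penalizing (div u)^2 pointwise."""
    jac = jax.jacobian(vel_fn, argnums=1)(params, xy)  # 2x2 Jacobian du_i/dx_j
    return jnp.trace(jac) ** 2


# Quick check: the divergence of the stream-function velocity vanishes to
# floating-point precision at an arbitrary sample point.
key = jax.random.PRNGKey(0)
params = init_mlp(key, [2, 32, 32, 1])
xy = jnp.array([0.3, 0.7])
print(divergence_penalty(velocity_from_stream, params, xy))  # ~0
```

The design difference is that the penalty approaches only encourage small divergence through the loss, whereas the stream-function formulation removes the divergence error entirely, which matches the abstract's finding that the third approach performs best.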


Key words

multigrid methods, incompressible Navier-Stokes equations, machine learning, artificial neural networks

AMS subject classifications

65M55, 76D05
