The essentially non-oscillatory (ENO) procedure and its variant, the ENO-SR procedure, are very efficient algorithms for interpolating (reconstructing) rough functions. We prove that the ENO and ENO-SR procedures are equivalent to deep ReLU neural networks. This demonstrates the ability of deep ReLU neural networks to approximate rough functions to high order of accuracy. Numerical tests show that the resulting trained neural networks perform excellently at interpolating functions, approximating solutions of nonlinear conservation laws, and compressing data.
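To make the setting concrete, the following is a minimal sketch (not the authors' code) of one second-order ENO interpolation step: each cell selects the less oscillatory of its two candidate stencils by comparing undivided second differences, then evaluates the quadratic interpolant of that stencil at the cell midpoint. The function name eno2_midpoint and the use of NumPy are illustrative assumptions; the comparison-and-selection logic in the loop is the piecewise-linear structure that the paper relates to deep ReLU networks.

import numpy as np

def eno2_midpoint(f):
    """Predict f at the cell midpoints x_{i+1/2}, 1 <= i <= n-3, from
    uniformly spaced samples f[i] = f(x_i), 0 <= i <= n-1."""
    mid = np.empty(len(f) - 3)
    for i in range(1, len(f) - 2):
        # Undivided second differences of the two candidate stencils.
        d2_left  = f[i-1] - 2.0*f[i]   + f[i+1]   # stencil {x_{i-1}, x_i, x_{i+1}}
        d2_right = f[i]   - 2.0*f[i+1] + f[i+2]   # stencil {x_i, x_{i+1}, x_{i+2}}
        # ENO stencil selection: keep the smoother (smaller) second difference.
        d2 = d2_left if abs(d2_left) <= abs(d2_right) else d2_right
        # Quadratic interpolant of the selected stencil at the midpoint:
        # endpoint average plus a curvature correction.
        mid[i-1] = 0.5*(f[i] + f[i+1]) - 0.125*d2
    return mid

# Usage on samples of a piecewise smooth function with a jump at x = 0.5.
x = np.linspace(0.0, 1.0, 33)
f = np.where(x < 0.5, np.sin(2*np.pi*x), 1.0 + np.cos(2*np.pi*x))
print(eno2_midpoint(f)[:5])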