COMPUTATION COMPLEXITY OF DEEP RELU NEURAL NETWORKS IN HIGH-DIMENSIONAL APPROXIMATION

Dinh Dũng, Van Kien Nguyen, Mai Xuan Thao

Authors

  • Dinh Dũng, Information Technology Institute, Vietnam National University, Hanoi
  • Van Kien Nguyen, Faculty of Basic Sciences, University of Transport and Communications
  • Mai Xuan Thao, Department of Natural Sciences, Hong Duc University

DOI:

https://doi.org/10.15625/1813-9663/37/3/15902

Keywords:

Deep ReLU neural network, computation complexity, high-dimensional approximation, Hölder-Nikol'skii space of mixed smoothness

Abstract

The purpose of the present paper is to study the computation complexity of deep ReLU neural networks that approximate functions in the Hölder-Nikol'skii spaces of mixed smoothness $H_\infty^\alpha(\mathbb{I}^d)$ on the unit cube $\mathbb{I}^d:=[0,1]^d$. In this context, for any function $f\in H_\infty^\alpha(\mathbb{I}^d)$, we explicitly construct nonadaptive and adaptive deep ReLU neural networks whose output approximates $f$ with a prescribed accuracy $\varepsilon$, and prove dimension-dependent bounds for the computation complexity of this approximation, characterized by the size and the depth of the deep ReLU neural network, explicitly in $d$ and $\varepsilon$. Our results show the advantage of the adaptive method of approximation by deep ReLU neural networks over the nonadaptive one.


Published

28-09-2021

How to Cite

[1]
D. Dũng, V. K. Nguyen, and M. X. Thao, "Computation complexity of deep ReLU neural networks in high-dimensional approximation", JCC, vol. 37, no. 3, pp. 291–320, Sep. 2021.

Issue

Section

SPECIAL ISSUE DEDICATED TO THE MEMORY OF PROFESSOR PHAN DINH DIEU - PART A
