LSE Research Online

Towards lower bounds on the depth of ReLU neural networks

Hertrich, Christoph ORCID: 0000-0001-5646-8567, Basu, Amitabh, Di Summa, Marco and Skutella, Martin (2023) Towards lower bounds on the depth of ReLU neural networks. SIAM Journal on Discrete Mathematics, 37 (2). pp. 997-1029. ISSN 0895-4801

Text (Towards lower bounds on the depth of ReLU neural networks) - Published Version (645kB)

Identification Number: 10.1137/22M1489332

Abstract

We contribute to a better understanding of the class of functions that can be represented by a neural network with ReLU activations and a given architecture. Using techniques from mixed-integer optimization, polyhedral theory, and tropical geometry, we provide a mathematical counterbalance to the universal approximation theorems which suggest that a single hidden layer is sufficient for learning any function. In particular, we investigate whether the class of exactly representable functions strictly increases by adding more layers (with no restrictions on size). As a by-product of our investigations, we settle an old conjecture about piecewise linear functions by Wang and Sun [IEEE Trans. Inform. Theory, 51 (2005), pp. 4425-4431] in the affirmative. We also present upper bounds on the sizes of neural networks required to represent functions with logarithmic depth.
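As an illustration of the depth question the abstract raises (this sketch is not taken from the paper; the helper names relu, max2, and max_n are hypothetical), a standard identity shows how ReLU networks compute maxima exactly: max(a, b) = (a + b + |a - b|)/2, where both t and |t| decompose into ReLUs via t = ReLU(t) - ReLU(-t) and |t| = ReLU(t) + ReLU(-t). Chaining this pairwise max in a tournament computes the maximum of n numbers with depth growing logarithmically in n; whether fewer layers can suffice is the kind of lower-bound question studied here.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def max2(a, b):
    # max(a, b) = (a + b + |a - b|) / 2, with
    # a + b   = ReLU(a + b) - ReLU(-(a + b))  and
    # |a - b| = ReLU(a - b) + ReLU(b - a).
    # This is realizable by a ReLU network with one hidden layer of 4 units.
    return 0.5 * (relu(a + b) - relu(-a - b) + relu(a - b) + relu(b - a))

def max_n(xs):
    # Pairwise tournament: ceil(log2(n)) rounds of max2, i.e. a ReLU
    # network whose number of hidden layers grows logarithmically in n.
    xs = list(xs)
    while len(xs) > 1:
        nxt = [max2(xs[i], xs[i + 1]) for i in range(0, len(xs) - 1, 2)]
        if len(xs) % 2 == 1:
            nxt.append(xs[-1])  # carry the odd element to the next round
        xs = nxt
    return xs[0]

print(max_n([3.0, -1.0, 7.5, 2.0]))  # 7.5
```

Each round of the tournament corresponds to a constant number of hidden layers, so n inputs are handled with logarithmic depth; this matches the regime of the paper's upper bounds, while its lower-bound results ask whether such depth is also necessary.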

Item Type: Article
Additional Information: © 2023 Society for Industrial and Applied Mathematics.
Divisions: Mathematics
Subjects: Q Science > QA Mathematics
Date Deposited: 25 Jul 2023 11:24
Last Modified: 16 Sep 2024 16:03
URI: http://eprints.lse.ac.uk/id/eprint/119828

