LSE Research Online

The computational complexity of ReLU network training parameterized by data dimensionality

Froese, Vincent, Hertrich, Christoph and Niedermeier, Rolf (2022) The computational complexity of ReLU network training parameterized by data dimensionality. Journal of Artificial Intelligence Research, 74. pp. 1775-1790. ISSN 1076-9757

Full text: Published Version, Download (350kB)

Identification Number (DOI): 10.1613/JAIR.1.13547

Abstract

Understanding the computational complexity of training simple neural networks with rectified linear units (ReLUs) has recently been a subject of intensive research. Closing gaps and complementing results from the literature, we present several results on the parameterized complexity of training two-layer ReLU networks with respect to various loss functions. After a brief discussion of other parameters, we focus on analyzing the influence of the dimension d of the training data on the computational complexity. We provide running time lower bounds in terms of W[1]-hardness for parameter d and prove that known brute-force strategies are essentially optimal (assuming the Exponential Time Hypothesis). In comparison with previous work, our results hold for a broad(er) range of loss functions, including ℓp-loss for all p ∈ [0, ∞]. In particular, we improve a known polynomial-time algorithm for constant d and convex loss functions to a more general class of loss functions, matching our running time lower bounds also in these cases.
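
For orientation, the training problem studied can be sketched roughly as follows (standard notation, not taken from the article itself: k, a_j, w_j, b_j denote the hidden width and the network weights). A two-layer ReLU network computes

\[
f(x) \;=\; \sum_{j=1}^{k} a_j \,\max\bigl(0,\; w_j \cdot x + b_j\bigr),
\]

and training with respect to the ℓp-loss asks for weights minimizing

\[
\sum_{i=1}^{n} \bigl| f(x_i) - y_i \bigr|^{p}
\]

over given training points \((x_1, y_1), \dots, (x_n, y_n) \in \mathbb{R}^d \times \mathbb{R}\). The parameter analyzed in the paper is the data dimension d.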

Item Type: Article
Additional Information: © 2022 AI Access Foundation.
Divisions: Mathematics
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Date Deposited: 13 Oct 2022 11:24
Last Modified: 04 Dec 2023 05:57
URI: http://eprints.lse.ac.uk/id/eprint/116972

