LSE Research Online

Training fully connected neural networks is ∃R-complete

Bertschinger, Daniel, Hertrich, Christoph ORCID: 0000-0001-5646-8567, Jungeblut, Paul, Miltzow, Tillmann and Weber, Simon (2024) Training fully connected neural networks is ∃R-complete. In: Oh, A., Naumann, T., Globerson, A., Saenko, K., Hardt, M. and Levine, S. (eds.) NIPS '23: Proceedings of the 37th International Conference on Neural Information Processing Systems. Curran Associates, Inc., pp. 36222-36237.

Full text not available from this repository.

Abstract

We consider the algorithmic problem of finding the optimal weights and biases for a two-layer fully connected neural network to fit a given set of data points, also known as empirical risk minimization. We show that the problem is ∃R-complete. This complexity class can be defined as the set of algorithmic problems that are polynomial-time equivalent to deciding whether a multivariate polynomial with integer coefficients has a real root. Furthermore, we show that arbitrary algebraic numbers are required as weights to train some instances to optimality, even if all data points are rational. Our result already applies to fully connected instances with two inputs, two outputs, and one hidden layer of ReLU neurons. This strengthens a result by Abrahamsen, Kleist and Miltzow [NeurIPS 2021]. A consequence is that a combinatorial search algorithm like the one by Arora, Basu, Mianjy and Mukherjee [ICLR 2018] cannot exist for networks with more than one output dimension, unless NP = ∃R.
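To make the problem statement concrete, the following is a minimal sketch (not from the paper) of the empirical risk minimization objective the abstract describes: fitting a two-layer fully connected ReLU network with two inputs, two outputs, and one hidden layer to data. The hidden width `m`, the squared loss, and all identifiers are illustrative assumptions.

```python
import numpy as np

def relu(z):
    # ReLU activation, applied elementwise
    return np.maximum(z, 0.0)

def empirical_risk(W1, b1, W2, b2, X, Y):
    """Total squared error of f(x) = W2 @ relu(W1 @ x + b1) + b2.

    Shapes (m = hidden width, n = number of data points):
    X: (n, 2) inputs, Y: (n, 2) targets,
    W1: (m, 2), b1: (m,), W2: (2, m), b2: (2,).
    """
    H = relu(X @ W1.T + b1)   # hidden-layer activations, shape (n, m)
    P = H @ W2.T + b2         # network outputs, shape (n, 2)
    return float(np.sum((P - Y) ** 2))

# The decision problem shown ∃R-complete asks: given rational data (X, Y)
# and a rational threshold t, do real-valued weights and biases exist with
# empirical_risk(W1, b1, W2, b2, X, Y) <= t?
```

Since optimal weights may need to be irrational algebraic numbers even for rational data, no finite enumeration over rational candidate weights, such as a combinatorial search over activation patterns, can certify optimality in this setting unless NP = ∃R.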

Item Type: Book Section
Official URL: https://dl.acm.org/doi/proceedings/10.5555/3666122
Additional Information: © 2023 Neural Information Processing Systems Foundation, Inc.
Divisions: Mathematics
Subjects: Q Science > QA Mathematics
Date Deposited: 21 May 2024 13:57
Last Modified: 14 Sep 2024 10:26
URI: http://eprints.lse.ac.uk/id/eprint/123554
