LSE Research Online

Handling hard affine SDP shape constraints in RKHSs

Aubin-Frankowski, Pierre-Cyril and Szabo, Zoltan ORCID: 0000-0001-6183-7603 (2022) Handling hard affine SDP shape constraints in RKHSs. Journal of Machine Learning Research, 23. ISSN 1532-4435

Text (21-0007) - Published Version
Available under License Creative Commons Attribution.

Download (2MB)

Abstract

Shape constraints, such as non-negativity, monotonicity, convexity or supermodularity, play a key role in various applications of machine learning and statistics. However, incorporating this side information into predictive models in a hard way (for example at all points of an interval) for rich function classes is a notoriously challenging problem. We propose a unified and modular convex optimization framework, relying on second-order cone (SOC) tightening, to encode hard affine SDP constraints on function derivatives, for models belonging to vector-valued reproducing kernel Hilbert spaces (vRKHSs). The modular nature of the proposed approach makes it possible to simultaneously handle multiple shape constraints and to tighten an infinite number of constraints into finitely many. We prove the convergence of the proposed scheme and that of its adaptive variant, leveraging geometric properties of vRKHSs. Due to the covering-based construction of the tightening, the method is particularly well-suited to tasks with small to moderate input dimensions. The efficiency of the approach is illustrated in the context of shape optimization, robotics and econometrics.
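As an informal illustration of the idea sketched in the abstract (not the authors' code), the following minimal Python sketch enforces a hard monotonicity constraint f' >= 0 on [0, 1] for a scalar Gaussian-kernel RKHS model by tightening it into finitely many second-order cone constraints of the form f'(x_m) >= eta * ||f||_K at virtual grid points x_m. The kernel, grid, and buffer eta are illustrative assumptions, not the paper's covering-based construction.

```python
# Hedged sketch: SOC-tightened monotonicity constraint in kernel ridge regression.
# Assumes a Gaussian kernel, a uniform virtual grid on [0, 1], and a fixed buffer eta.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, sigma, lam, eta = 30, 0.2, 1e-2, 0.1

# Noisy observations of an increasing function.
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * x) + 0.05 * rng.standard_normal(n)

def k(s, t):   # Gaussian kernel k(s, t)
    return np.exp(-(s[:, None] - t[None, :]) ** 2 / (2 * sigma ** 2))

def dk(s, t):  # derivative of k with respect to its first argument
    d = s[:, None] - t[None, :]
    return -d / sigma ** 2 * np.exp(-d ** 2 / (2 * sigma ** 2))

K = k(x, x) + 1e-8 * np.eye(n)   # Gram matrix (jitter for numerical stability)
L = np.linalg.cholesky(K)        # ||f||_K = ||L.T @ a|| for f = sum_n a_n k(., x_n)
xg = np.linspace(0, 1, 50)       # virtual grid points covering [0, 1]
Kd = dk(xg, x)                   # f'(x_m) = (Kd @ a)_m

a = cp.Variable(n)
loss = cp.sum_squares(y - K @ a) + lam * cp.sum_squares(L.T @ a)
# SOC tightening: require a derivative margin proportional to the RKHS norm,
# so the constraint also holds in a neighbourhood of each grid point.
constraints = [Kd @ a >= eta * cp.norm(L.T @ a)]
cp.Problem(cp.Minimize(loss), constraints).solve()
print("fitted coefficients:", a.value[:5])
```

In the paper's framework the buffer would be derived from the covering radius of the grid so that the finitely many SOC constraints certify the constraint on the whole interval; here eta is simply fixed for illustration.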

Item Type: Article
Official URL: https://jmlr.org/papers/v23/21-0007.html
Additional Information: © 2022 JMLR.
Divisions: Statistics
Subjects: H Social Sciences > HA Statistics
Date Deposited: 01 Aug 2022 09:27
Last Modified: 20 Dec 2024 00:46
URI: http://eprints.lse.ac.uk/id/eprint/115724
