Private Learning of Halfspaces: Simplifying the Construction and Reducing the Sample Complexity

Eliad Tsfadia
NeurIPS 2020

Abstract

We present a private learner for halfspaces over a finite grid $G$ in $\mathbb{R}^d$ with sample complexity $d^{2.5}\cdot 2^{\log^*|G|}$, which improves the state-of-the-art result of [Beimel et al., COLT 2019] by a $d^2$ factor. The building block for our learner is a new differentially private algorithm for approximately solving the linear feasibility problem: given a feasible collection of $m$ linear constraints of the form $Ax\geq b$, the task is to privately identify a solution $x$ that satisfies most of the constraints. Our algorithm is iterative: in each iteration it determines the next coordinate of the constructed solution $x$.
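To make the iterative, coordinate-by-coordinate structure concrete, here is a minimal hypothetical Python sketch of a private heuristic for approximate linear feasibility. It is not the paper's algorithm or analysis; it only illustrates the shape of the task, assuming the standard exponential mechanism as the private selection step and a per-coordinate utility that counts the constraints satisfied by the partial solution (unset coordinates treated as 0). The names `private_feasibility_sketch` and `exponential_mechanism`, the candidate grid, and the privacy accounting via basic composition are all assumptions introduced for illustration.

```python
import numpy as np

def exponential_mechanism(scores, eps, sensitivity=1.0, rng=None):
    """Sample an index with probability proportional to exp(eps * score / (2 * sensitivity))."""
    rng = rng or np.random.default_rng()
    logits = eps * np.asarray(scores, dtype=float) / (2.0 * sensitivity)
    logits -= logits.max()              # subtract max for numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return rng.choice(len(scores), p=probs)

def private_feasibility_sketch(A, b, grid, eps, rng=None):
    """Toy coordinate-by-coordinate DP heuristic for A x >= b (NOT the paper's algorithm).

    Each iteration fixes one coordinate of x by running the exponential
    mechanism over the candidate values in `grid`, scoring a candidate by
    how many constraints the partially built solution already satisfies
    (unset coordinates are treated as 0). Counting satisfied constraints
    has sensitivity 1 per constraint, so each iteration is eps-DP and the
    full loop is (d * eps)-DP by basic composition.
    """
    rng = rng or np.random.default_rng()
    m, d = A.shape
    x = np.zeros(d)
    for j in range(d):
        scores = []
        for v in grid:
            x[j] = v
            satisfied = np.count_nonzero(A[:, : j + 1] @ x[: j + 1] >= b)
            scores.append(satisfied)
        x[j] = grid[exponential_mechanism(scores, eps, rng=rng)]
    return x

# Example usage on a synthetic feasible system (illustrative only):
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true - 0.1                    # feasible by construction
grid = np.linspace(-3, 3, 25)           # finite grid of candidate coordinate values
x_hat = private_feasibility_sketch(A, b, grid, eps=0.5, rng=rng)
print(np.mean(A @ x_hat >= b))          # fraction of constraints satisfied
```

The sketch only mirrors the high-level structure stated in the abstract, namely that the solution is built one coordinate per iteration under a differentially private selection rule; the paper's actual construction and its sample-complexity guarantees differ.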