Projects developed for the Mathematics for Data course at EPFL
We consider a binary classification task modeled with logistic regression. We find a classifier using first-order methods, including accelerated gradient descent, and explore proximal gradient methods to obtain sparse solutions. In the second part, proximal methods are applied to image reconstruction.
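As an illustration of the proximal gradient approach for sparse solutions, here is a minimal sketch of ISTA applied to l1-regularized logistic regression. This is not the course's exact setup; the data matrix `A`, labels `b` in {-1, +1}, and the regularization weight `lam` are assumptions for the example.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_logreg(A, b, lam, step, iters=500):
    """Proximal gradient (ISTA) for min_x f(x) + lam * ||x||_1,
    where f(x) = mean(log(1 + exp(-b * (A @ x)))) is the logistic loss
    over labels b in {-1, +1} (assumed setup for this sketch)."""
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        z = b * (A @ x)
        # Gradient of the smooth logistic-loss part.
        grad = -(A.T @ (b / (1.0 + np.exp(z)))) / n
        # Gradient step on the smooth part, prox step on the l1 part.
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

The soft-thresholding prox is what drives small coordinates to exactly zero, which is why the proximal route yields sparse classifiers where plain gradient descent does not.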
We implement a simple GAN with a linear generator and a dual variable, in order to limit the computational requirements. We also explore two variants of stochastic gradient descent, Adam and RMSProp, and then use these algorithms to train a neural network for an image classification task.
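To make the two SGD variants concrete, here is a minimal sketch of the RMSProp and Adam update rules on a single parameter vector; the state dictionaries and default hyperparameters are assumptions for the example, not the project's exact configuration.

```python
import numpy as np

def rmsprop_step(x, grad, state, lr=1e-3, beta=0.9, eps=1e-8):
    # RMSProp: divide the step by a running average of squared gradients,
    # so coordinates with large recent gradients take smaller steps.
    state["v"] = beta * state["v"] + (1 - beta) * grad ** 2
    return x - lr * grad / (np.sqrt(state["v"]) + eps)

def adam_step(x, grad, state, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: RMSProp-style scaling plus momentum, with bias correction
    # for the zero-initialized running averages m and v.
    state["t"] += 1
    state["m"] = b1 * state["m"] + (1 - b1) * grad
    state["v"] = b2 * state["v"] + (1 - b2) * grad ** 2
    m_hat = state["m"] / (1 - b1 ** state["t"])
    v_hat = state["v"] / (1 - b2 ** state["t"])
    return x - lr * m_hat / (np.sqrt(v_hat) + eps)
```

In a training loop, `grad` would be the stochastic gradient of the mini-batch loss; the per-coordinate scaling is what makes both methods less sensitive to the learning rate than vanilla SGD.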
In the first part, we investigate projection-free convex optimization by comparing the computational costs of the proximal operator prox_X and the linear minimization oracle lmo_X for a given set X. We then implement the Frank-Wolfe method for solving a blind image deconvolution problem. In the second part, we use a variant of the Frank-Wolfe method for linearly constrained problems, as well as the Vu-Condat primal-dual method, to obtain numerical solutions to the classical k-means clustering problem.
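To illustrate the projection-free idea, here is a minimal sketch of a Frank-Wolfe iteration for a least-squares problem constrained to an l1 ball; the l1-ball setting is a stand-in assumption (not the blind deconvolution problem above), chosen because its lmo reduces to a single argmax while the corresponding prox would require a projection.

```python
import numpy as np

def lmo_l1(grad, tau):
    # Linear minimization oracle over the l1 ball of radius tau:
    # argmin_{||s||_1 <= tau} <grad, s> places all the mass +-tau on
    # the coordinate of largest |grad| -- one argmax, no projection.
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -tau * np.sign(grad[i])
    return s

def frank_wolfe(A, b, tau, iters=300):
    """Frank-Wolfe for min_x 0.5 * ||A x - b||^2 s.t. ||x||_1 <= tau."""
    x = np.zeros(A.shape[1])
    for k in range(iters):
        grad = A.T @ (A @ x - b)
        s = lmo_l1(grad, tau)
        gamma = 2.0 / (k + 2.0)          # standard step-size schedule
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x
```

Since every iterate is a convex combination of feasible points, the constraint is maintained for free; this is the cost trade-off the first part examines, as lmo_X can be far cheaper than prox_X for sets such as the nuclear-norm ball.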