## Stéphane Ayache

Aix-Marseille Université (AMU)

### Towards Deeper Understanding of Deep Learning

In recent years, major advances have been made in Machine Learning thanks to a revival
of Neural Networks. Recent progress in Deep Learning has given rise to technological revolutions in many
fields, transforming large parts of our society and everyday life. Deep Learning models
are indeed very powerful, but we still need a better understanding of these models and architectures
to face the societal challenges they raise. After a broad introduction to Neural Network concepts, this talk goes deeper into the
understanding of convolution layers through simple tools from linear algebra.
We then introduce weighted convolutions and show a few preliminary experiments on a transfer
learning task. Deep Learning is not as easy as three lines of code; understanding it well will lead to better
architectures and explainable models.
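As an illustrative sketch of the linear-algebra view of convolutions (the example is ours, not taken from the talk): a 1D convolution layer is simply multiplication by a Toeplitz matrix, which makes its rank, spectrum, and other properties accessible to standard linear-algebra tools.

```python
# Illustrative sketch: a 1D "valid" convolution (cross-correlation, as in deep
# learning frameworks) expressed as multiplication by a Toeplitz matrix.

def conv_matrix(kernel, n):
    """Build the (n - len(kernel) + 1) x n Toeplitz matrix of the convolution."""
    k = len(kernel)
    rows = n - k + 1
    return [[kernel[j - i] if 0 <= j - i < k else 0.0 for j in range(n)]
            for i in range(rows)]

def matvec(m, v):
    return [sum(a * b for a, b in zip(row, v)) for row in m]

signal = [1.0, 2.0, 3.0, 4.0, 5.0]
kernel = [1.0, 0.0, -1.0]          # simple finite-difference filter

T = conv_matrix(kernel, len(signal))
out = matvec(T, signal)            # same result as sliding the kernel over the signal
```

Viewing the layer as the matrix `T` is what lets one reason about convolutions with spectral and rank arguments.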

## Luc Giffon

Aix-Marseille Université (AMU)

### Deep Networks with Adaptive Nyström Approximation

Recent work has focused on combining kernel methods and deep learning to exploit the
best of both approaches. Here, we introduce a new neural network architecture in which we replace
the top dense layers of standard convolutional architectures with an approximation of a kernel function
based on the Nyström method. Our approach is simple and highly flexible: it is compatible with any
kernel function and allows exploiting multiple kernels. We show that our architecture matches the
performance of standard architectures on datasets such as SVHN and CIFAR100. One benefit of the method
lies in its limited number of learnable parameters, which makes it particularly suited to small training
sets, e.g. from 5 to 20 samples per class.
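A minimal sketch of the Nyström feature map underlying such a layer (illustrative pure-Python code under our own assumptions, not the authors' implementation): given landmark points, the map phi satisfies phi(x)·phi(y) ≈ k(x, y), and is exact when x and y are landmarks.

```python
import math

def rbf(x, y, gamma=1.0):
    """Gaussian RBF kernel (one possible choice; the method accepts any kernel)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def cholesky(K):
    """Cholesky factor L of a symmetric positive-definite matrix, K = L L^T."""
    n = len(K)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(K[i][i] - s) if i == j else (K[i][j] - s) / L[j][j]
    return L

def forward_sub(L, b):
    """Solve the lower-triangular system L y = b."""
    y = [0.0] * len(b)
    for i in range(len(b)):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    return y

def nystrom_features(x, landmarks, L):
    """phi(x) = L^{-1} k(x, landmarks), so phi(x).phi(y) = k_x^T K_mm^{-1} k_y."""
    kx = [rbf(x, m) for m in landmarks]
    return forward_sub(L, kx)

landmarks = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
K_mm = [[rbf(a, b) for b in landmarks] for a in landmarks]
L = cholesky(K_mm)

f = [nystrom_features(m, landmarks, L) for m in landmarks]
dot01 = sum(a * b for a, b in zip(f[0], f[1]))   # recovers K_mm[0][1] on landmarks
```

In the proposed architecture, a layer of this form (with learnable landmarks) replaces the top dense layers, which is where the small parameter count comes from.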

## Benoît Grémaud

Centre de Physique Théorique (CPT)

### Quantum tensor networks: from condensed matter to machine learning

First, I will review some of the main aspects of quantum many-body systems (entanglement and the area law), emphasizing the
origin of the complexity of computing the properties of these systems. Then, I will explain how the
numerical tools developed within the tensor network framework, such as
DMRG (density matrix renormalization group) and MPS (matrix product states), can help overcome these
difficulties. Finally, I will briefly review some possible applications of
tensor networks in machine learning.
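As a toy illustration of the MPS idea (our example, not from the talk): the 2^n amplitudes of an n-qubit GHZ state can be stored in one small matrix per site, and any individual amplitude is read out by a matrix product, which is the compression that makes tensor-network methods tractable.

```python
# Illustrative sketch: matrix product state (MPS) for the n-qubit GHZ state,
# (|00...0> + |11...1>) / sqrt(2), with bond dimension 2 and periodic boundaries.

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def trace(a):
    return sum(a[i][i] for i in range(len(a)))

# One matrix per local basis state; the same tensor is reused at every site.
A = {0: [[1.0, 0.0], [0.0, 0.0]],
     1: [[0.0, 0.0], [0.0, 1.0]]}

def amplitude(bits):
    """Amplitude <bits|GHZ> = Tr(A[b1] A[b2] ... A[bn]) / sqrt(2)."""
    m = A[bits[0]]
    for b in bits[1:]:
        m = matmul(m, A[b])
    return trace(m) / (2 ** 0.5)
```

Only the all-zeros and all-ones strings give a nonzero trace, so n matrices of size 2x2 encode a state whose dense representation has 2^n entries.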

## Anupam Prakash

Institut de recherche en informatique fondamentale (IRIF)

### Quantum Support Vector Machines

We present a quantum algorithm for support vector machines (SVMs) that can potentially achieve
significant speedups. The algorithm is based on a quantum interior point method for second order cone
programs (SOCPs), a class of structured convex optimization problems which generalize quadratic programs.
I will also introduce techniques from quantum linear algebra and machine learning that are required for
this algorithm.
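For context, here is a classical baseline for the optimization problem the quantum algorithm targets (a hedged sketch of our own, not the quantum interior point method): the soft-margin SVM primal solved by Pegasos-style stochastic subgradient descent. The SOCP formulation mentioned in the abstract casts this same problem in conic form.

```python
import random

def train_svm(data, lam=0.01, epochs=200, seed=0):
    """Soft-margin linear SVM via stochastic subgradient descent (Pegasos-style).

    Minimizes (lam/2)||w||^2 + mean(hinge loss); data is a list of (x, y)
    pairs with y in {-1, +1}. Hyperparameters here are illustrative choices.
    """
    rng = random.Random(seed)
    w = [0.0] * len(data[0][0])
    t = 0
    for _ in range(epochs):
        for x, y in rng.sample(data, len(data)):
            t += 1
            eta = 1.0 / (lam * t)                       # decaying step size
            margin = y * sum(wi * xi for wi, xi in zip(w, x))
            w = [(1 - eta * lam) * wi for wi in w]      # regularization shrinkage
            if margin < 1:                              # hinge-loss subgradient step
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return w

# Tiny linearly separable toy problem (made up for illustration).
data = [((2.0, 1.0), 1), ((1.5, 2.0), 1), ((-1.0, -1.5), -1), ((-2.0, -0.5), -1)]
w = train_svm(data)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else -1 for x, _ in data]
```

The quantum interior point method instead attacks the conic reformulation of this objective, which is where the potential speedups over such classical solvers would come from.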