Dear quanta,

Tomorrow we will have a group meeting talk from visitor Daniel Stilck Franca (Copenhagen) at 11am.  Then at 1:30pm we'll have a seminar from Srinivasan.

Here are the topics of the two talks. 

Title: Quantum speedups for certain convex feasibility problems
Abstract: We give an algorithm that either determines that a collection of closed convex sets of matrices has empty intersection with the set of quantum states, or outputs a state close to all of the sets. The algorithm is guaranteed to converge after a logarithmic number of queries to membership oracles for the convex sets of interest.
Moreover, it is particularly well suited to be run on a quantum computer, as it is based on preparing a sequence of quantum Gibbs states.
Exploiting this, we show how to obtain quantum speedups for certain classes of SDPs.
This is joint work with Richard Kueng and Fernando Brandão.
 
Title: Strengths and weaknesses of quantum examples for learning

Abstract: Recently, there has been a lot of interest in quantum
algorithms for machine learning. These aim to improve over classical
algorithms at generalizing or otherwise learning from given data (the data
itself could also be quantum).

I consider the following setting: let C be a concept class of Boolean
functions f:{0,1}^n->{-1,1}. In the classical learning model, a learner is
given samples (x, f(x)) where x is drawn according to a distribution D.
Quantumly, an example would be a *superposition* over samples. The goal of
the learner is to output a hypothesis with error at most eps. In this talk,
I describe a few instances where quantum examples help and where they don't.
No prior knowledge of quantum computing or quantum physics is necessary to
understand this talk.

a) In the classical PAC model, D is unknown to the learner. Fundamental
results in classical learning show that the classical sample complexity of
a concept class C in the PAC model is tightly determined by the so-called
VC-dimension of C. Our main result is to show that exactly the same formula
gives the quantum sample complexity, showing that quantum examples do not
help in the PAC model of learning (nor in the related agnostic model).
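For reference, the standard tight classical bound for (eps, delta)-PAC learning a class C of VC-dimension d can be written as follows (delta, the failure probability, is my notation and is not mentioned in the abstract above); the result described in a) says this same expression is tight for quantum examples:

```latex
% (eps, delta)-PAC sample complexity of a concept class C with VC-dimension d,
% up to constant factors; the quantum sample complexity matches this formula.
\Theta\!\left(\frac{d}{\epsilon} + \frac{\log(1/\delta)}{\epsilon}\right)
```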

b) Let D be the uniform distribution and eps=0 (i.e., we need to *identify* f),
and suppose C consists of Boolean functions having at most k non-zero Fourier
coefficients; then how many classical examples suffice to identify f? This
question has been studied for over a decade under the names of sparse
recovery and Fourier sampling, with applications in compressed sensing
and the data-stream model. Haviv and Regev (CCC'15) showed that Theta(nk)
classical samples are necessary and sufficient for this problem. In this
work, we solve the problem using O(k^{1.5}) quantum samples and show that
Omega(k) quantum samples are necessary (importantly, these bounds are
independent of n).

c) I also consider the coupon collector problem and show how quantum
examples give a logarithmic improvement over the classical algorithms that
solve it.
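As background for c): classically, collecting all k coupon types from uniform random draws takes about k ln k draws in expectation, and the quantum improvement is measured against this baseline. A minimal simulation of the classical baseline (an illustration only, not code from the talk):

```python
import math
import random

def coupon_collector_draws(k, rng):
    """Count uniform draws from {0,...,k-1} until all k types are seen."""
    seen = set()
    draws = 0
    while len(seen) < k:
        seen.add(rng.randrange(k))
        draws += 1
    return draws

k = 100
trials = 200
rng = random.Random(42)  # fixed seed for reproducibility
avg = sum(coupon_collector_draws(k, rng) for _ in range(trials)) / trials
# Average should land near k * H_k ~ k ln k (about 519 for k = 100).
print(f"average draws for k={k}: {avg:.0f}; k ln k = {k * math.log(k):.0f}")
```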

Based on joint work with Ronald de Wolf and others.