Schedule and abstracts

Date: Wednesday, June 25, 2014

Location: Convention Hall No. 4E (Level 1 of the Beijing International Convention Center)

09:00-10:00 Invited talk: Elina Robeva
10:00-10:20 Poster spotlights
10:20-10:40 Coffee break
10:40-11:40 Invited talk: Prateek Jain
11:40-11:50 Poster spotlights
11:50-12:30 Poster session
12:30-14:00 Lunch
14:00-15:00 Invited talk: Aravindan Vijayaraghavan
15:00-15:40 Poster session / coffee break
15:40-16:40 Invited talk: Anima Anandkumar

Invited talks

Elina Robeva (UC Berkeley):

Title: Orthogonally Decomposable Tensors
Abstract: We describe tensor decomposition and the tensor power method from an algebraic perspective. Given an orthogonally decomposable tensor, we give a formula for all of its eigenvectors. Moreover, we give polynomial equations that cut out the space of orthogonally decomposable tensors.

Slides: pdf
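The tensor power method mentioned in the abstract above can be sketched in a few lines. This is a generic illustration (not code from the talk): we build a small orthogonally decomposable symmetric 3-tensor T = sum_r lambda_r e_r^{⊗3} and run the power iteration, which converges to one of the component eigenvectors.

```python
import numpy as np

def power_iteration(T, n_iter=100, seed=0):
    """One run of the tensor power method on a symmetric 3-tensor T."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(T.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        # Contract T along two modes: w_i = sum_{j,k} T[i,j,k] v_j v_k
        w = np.einsum('ijk,j,k->i', T, v, v)
        v = w / np.linalg.norm(w)
    lam = np.einsum('ijk,i,j,k->', T, v, v, v)  # eigenvalue T(v, v, v)
    return lam, v

# An orthogonally decomposable tensor: T = 3 * e1^{⊗3} + 1 * e2^{⊗3}
lams = np.array([3.0, 1.0])
T = sum(l * np.einsum('i,j,k->ijk', e, e, e)
        for l, e in zip(lams, np.eye(2)))
lam, v = power_iteration(T)
```

For this decomposable example the iteration converges (generically over the random start) to one of the orthonormal components, here e_1 with eigenvalue 3; the talk characterizes the full set of eigenvectors of such tensors.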

Prateek Jain (MSR India):

Title: Provable Non-convex Projections for Low-rank Matrix Recovery
Abstract: Typical low-rank matrix recovery problems, such as low-rank matrix completion and robust PCA, can be solved using non-convex projections onto the set of low-rank matrices. However, providing theoretical guarantees for such methods is difficult due to the non-convexity of the projections. In this talk, we will discuss two of our recent results showing that methods based on non-convex projections can solve two important problems in this area: a) low-rank matrix completion, and b) robust PCA. For low-rank matrix completion, we provide the first provable algorithm with time and storage complexity O(n) (assuming the rank is constant) while avoiding any dependence on the condition number of the underlying matrix to be recovered. For robust PCA, we provide the first provable algorithm with time complexity O(n^2 r), which matches the time complexity of a standard SVD and is faster than the usual nuclear-norm + L_1 regularization methods, which incur O(n^3) time complexity. This talk is based on joint works with Animashree Anandkumar, U N Niranjan, Praneeth Netrapalli, and Sujay Sanghavi.

Slides: pdf
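The non-convex projection at the heart of the abstract above is the truncated-SVD projection onto rank-r matrices. The following is a minimal sketch of that idea in a simple projected-imputation loop for matrix completion; it illustrates the projection step, not the exact algorithms or guarantees of the talk.

```python
import numpy as np

def project_rank(M, r):
    """Non-convex projection onto the set of rank-<=r matrices (truncated SVD)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def complete(M_obs, mask, r, n_iter=500):
    """Alternate between imposing the observed entries and projecting to rank r."""
    X = np.zeros_like(M_obs)
    for _ in range(n_iter):
        X = project_rank(np.where(mask, M_obs, X), r)
    return X

rng = np.random.default_rng(0)
n, r = 20, 1
M = np.outer(rng.standard_normal(n), rng.standard_normal(n))  # rank-1 target
mask = rng.random((n, n)) < 0.8                               # observe ~80% of entries
X = complete(M * mask, mask, r)
rel_err = np.linalg.norm(X - M) / np.linalg.norm(M)
```

On this easy random rank-1 instance the iteration recovers M to high accuracy; the talk's contribution is making such non-convex schemes provable with O(n) complexity and no condition-number dependence.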

Aravindan Vijayaraghavan (CMU):

Title: Smoothed Analysis of Tensor Decompositions and Learning Overcomplete Latent Variable Models
Abstract: Low-rank tensor decompositions, the high-dimensional analog of matrix decompositions, are a powerful tool that arises in machine learning, statistics and signal processing. However, tensors pose significant algorithmic challenges, and tensor analogs of much of the matrix algebra toolkit are unlikely to exist because of hardness results. For instance, efficient tensor decompositions in the overcomplete case (where rank exceeds dimension) are particularly challenging.

I will introduce a smoothed analysis model for studying tensor decompositions. Smoothed analysis gives an appealing way to analyze non-worst-case instances: the assumption here is that the model is not adversarially chosen, formalized by a perturbation of the model parameters. Using smoothed analysis, we give efficient algorithms for decomposing tensors even in the highly overcomplete case (rank polynomial in the dimension). This yields new efficient algorithms for learning probabilistic models such as mixtures of axis-aligned Gaussians and multi-view models, even in highly overcomplete settings.

Based on joint work with Aditya Bhaskara, Moses Charikar and Ankur Moitra.

Slides: pptx | pdf
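A small numerical sketch of the phenomenon behind the abstract above (an illustration of the smoothed-analysis idea, not the talk's algorithm): with overcomplete components (more components than dimensions), an adversarial choice can make the "flattened" Khatri-Rao matrix of components degenerate, but a small random perturbation of the components makes its smallest singular value bounded away from zero, which is what the decomposition algorithms exploit.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 5, 8  # overcomplete: number of components k exceeds dimension n

def khatri_rao(A):
    """Column-wise outer products a_i ⊗ a_i, stacked as an (n^2 x k) matrix."""
    return np.einsum('ir,jr->ijr', A, A).reshape(A.shape[0] ** 2, A.shape[1])

# Adversarial components: all columns equal, so every flattening is rank 1.
A_bad = np.tile(rng.standard_normal((n, 1)), (1, k))

# Smoothed components: the same columns plus small random perturbations.
A_smooth = A_bad + 0.1 * rng.standard_normal((n, k))

s_bad = np.linalg.svd(khatri_rao(A_bad), compute_uv=False)
s_smooth = np.linalg.svd(khatri_rao(A_smooth), compute_uv=False)
# s_bad[-1] is (numerically) zero; s_smooth[-1] is bounded away from zero.
```

The smoothed-analysis results quantify how large s_smooth[-1] is with high probability over the perturbation, which drives the recovery guarantees.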

Anima Anandkumar (UC Irvine):

Title: Learning Overcomplete Latent Variable Models Through Tensor Decompositions
Abstract: Tensor decompositions provide a guaranteed approach to unsupervised learning of latent variable models such as Gaussian mixtures, topic models and independent components. For these models, tensors computed from low-order moments (up to fourth order) are decomposed to yield the model parameters. For overcomplete models, where the latent dimensionality exceeds the observed dimensionality, decomposing the tensor is challenging, since the tensor rank exceeds the observed dimensionality. In this talk, I will present recent results on guaranteed recovery of overcomplete tensors for generic parameters. This is joint work with Majid Janzamin and Rong Ge.

Slides: pdf
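To make "tensors computed from low-order moments" concrete, here is a hypothetical illustration (an exchangeable single-topic model, not necessarily the exact setup of the talk): the population third moment of three one-hot words drawn from a latent topic equals sum_h w_h a_h^{⊗3}, and the empirical moment tensor estimated from samples converges to it. Decomposing that tensor recovers the weights w_h and word distributions a_h.

```python
import numpy as np

rng = np.random.default_rng(3)
d, k = 3, 2
w = np.array([0.6, 0.4])                  # mixing weights of the latent topic
A = rng.dirichlet(np.ones(d), size=k).T   # columns a_h: word distributions

# Population third moment: E[x1 ⊗ x2 ⊗ x3] = sum_h w_h a_h ⊗ a_h ⊗ a_h,
# where x1, x2, x3 are one-hot encodings of three words from one document.
T = sum(w[h] * np.einsum('i,j,k->ijk', A[:, h], A[:, h], A[:, h])
        for h in range(k))

# Empirical estimate from N sampled three-word documents.
N = 200000
T_hat = np.zeros((d, d, d))
topics = rng.choice(k, size=N, p=w)
for h in range(k):
    n_h = int((topics == h).sum())
    words = rng.choice(d, size=(n_h, 3), p=A[:, h])
    np.add.at(T_hat, (words[:, 0], words[:, 1], words[:, 2]), 1.0)
T_hat /= N
max_err = np.abs(T_hat - T).max()  # shrinks as N grows
```

In the overcomplete regime discussed in the talk, k exceeds d and the resulting tensor has rank larger than the observed dimension, which is precisely what makes the decomposition step nontrivial.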


Contributed posters

Invited posters