MS-We-D-44
Mathematics of Information and Low Dimensional Models - Part II of III
For Part I, see MS-Tu-E-44
For Part III, see MS-We-E-44
Date: August 12
Time: 13:30–15:30
Room: 21
Organizer:
Blanchard, Jeffrey (Grinnell College)
Abstract: This minisymposium considers a variety of ill-posed inverse problems associated with information theory, signal processing, and image processing. By exploiting low-dimensional structure, such as in compressed sensing and low-rank matrix completion, tractable algorithms permit construction of accurate approximate solutions and low-dimensional representations. The minisymposium will include state-of-the-art work on algorithms, theoretical analysis, and relationships with high-dimensional geometry from researchers at all stages of their careers.
Notes to ICIAM Committee:
 Jared Tanner (Oxford) is a co-organizer of this symposium but does not have a PIN.
 This symposium is sponsored by the SIAM Activity Group (SIAG) on Linear Algebra.
MS-We-D-44-1
13:30–14:00
Multiscale Geometric Methods for Statistical Learning and Data in HighDimensions
Maggioni, Mauro (Duke Univ.)
Liao, Wenjing (Duke Univ. & SAMSI)
Abstract: We discuss a family of algorithms for analyzing various new and classical problems in the analysis of highdimensional data sets. These methods rely on the idea of performing suitable multiscale geometric decompositions of the data, and exploiting such a decomposition to perform a variety of tasks in signal processing and statistical learning. In particular, we discuss the problem of dictionary learning, in Euclidean and metric spaces, and their applications in statistical signal processing.
MS-We-D-44-2
14:00–14:30
Low-dimensional quantized representations of signals and their distances
Boufounos, Petros (Mitsubishi Electric Research Laboratories)
Abstract: Recently developed low-dimensional quantized representations, such as universal embeddings, have proven very powerful in coding signals for cloud computing and big-data applications. This talk explores how such representations provide rate-efficient, low-latency, and low-complexity transmission for inference over communication channels. Such embeddings can also be used in coding signals when side information about the signal is available at the receiver, thus enabling low-complexity, efficient compression methods. We demonstrate results in image compression and retrieval applications.
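The flavor of such quantized representations can be illustrated with a minimal sketch: a one-bit random embedding that projects signals onto random directions and keeps only the signs, so that Hamming distance between short binary codes tracks the distance between the original signals. This is an illustrative baseline under standard assumptions, not the universal embeddings construction of the talk.

```python
import numpy as np

def binary_embed(X, M, rng):
    """One-bit random embedding: project the rows of X onto M random
    Gaussian directions and keep only the sign bit. The Hamming distance
    between codes approximates the angular distance between the original
    vectors (illustrative sketch, not the talk's construction)."""
    d = X.shape[1]
    P = rng.standard_normal((d, M))        # random projection directions
    return (X @ P >= 0).astype(np.uint8)   # quantize to one bit per direction

rng = np.random.default_rng(1)
d, M = 128, 2048                           # signal dimension, code length
x = rng.standard_normal(d)
y_close = x + 0.1 * rng.standard_normal(d) # nearby signal
y_far = rng.standard_normal(d)             # unrelated signal
codes = binary_embed(np.vstack([x, y_close, y_far]), M, rng)

ham = lambda a, b: np.mean(a != b)         # normalized Hamming distance
d_close = ham(codes[0], codes[1])
d_far = ham(codes[0], codes[2])
```

Nearby signals disagree on far fewer bits than unrelated ones, which is what makes such codes usable for low-complexity retrieval and inference over channels.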
MS-We-D-44-3
14:30–15:00
Does $\ell_p$-minimization outperform $\ell_1$-minimization?
Maleki, Arian (Columbia Univ.)
Abstract: In many application areas, ranging from bioinformatics to imaging, we are faced with the following question: Can we recover a sparse vector $x_o \in \mathbb{R}^N$ from its undersampled set of noisy observations $y \in \mathbb{R}^n$, where $y = A x_o + w$? This talk presents an accurate analysis of a class of recovery algorithms known as $\ell_p$-regularized least squares for different values of $0 \leq p \leq 1$, in the asymptotic setting.
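The $p = 1$ member of this family is the LASSO, which can be solved with iterative soft thresholding (ISTA). The sketch below recovers a sparse $x_o$ from undersampled noisy measurements $y = A x_o + w$; the dimensions, regularization weight, and solver are illustrative choices, not the analysis presented in the talk.

```python
import numpy as np

def ista(A, y, lam, n_iter=1000):
    """Iterative soft thresholding for l1-regularized least squares:
    min_x 0.5*||y - A x||^2 + lam*||x||_1 (the p = 1 case; a standard
    baseline, not the talk's estimator)."""
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L              # gradient step on the smooth term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

# Undersampled noisy observations of a k-sparse vector: y = A x_o + w.
rng = np.random.default_rng(0)
N, n, k = 200, 80, 5                               # ambient dim, measurements, sparsity
x_o = np.zeros(N)
x_o[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((n, N)) / np.sqrt(n)       # random sensing matrix
y = A @ x_o + 0.01 * rng.standard_normal(n)        # small additive noise
x_hat = ista(A, y, lam=0.02)
```

For $0 \le p < 1$ the penalty $\|x\|_p^p$ is non-convex, and the talk's question is precisely when this buys better recovery than the convex $p = 1$ baseline above.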
MS-We-D-44-4
15:00–15:30
Metric learning with rank and sparsity constraints
Bah, Bubacarr (Univ. of Texas at Austin)
Abstract: Choosing a distance-preserving metric is fundamental to many signal processing algorithms, such as k-means, nearest neighbor searches, compressive sensing, etc. In virtually all these applications, the efficiency of the signal processing algorithm depends on how fast we can evaluate the learned metric. Moreover, storing the chosen metric can create space bottlenecks in high-dimensional signal processing problems. As a result, we consider data-dependent metric learning with rank as well as sparsity constraints. We propose a new fast nonconvex algorithm and empirically demonstrate its performance on various datasets.
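A common building block behind rank-constrained metric learning is projecting a candidate Mahalanobis matrix onto the positive semidefinite matrices of rank at most $r$: a rank-$r$ metric $M = LL^T$ lets distances $(x-y)^T M (x-y) = \|L^T(x-y)\|^2$ be evaluated and stored in only $r$ dimensions, which is the speed and storage gain the abstract refers to. The sketch below shows this standard projection step, not the talk's algorithm.

```python
import numpy as np

def project_rank_psd(M, r):
    """Project a symmetric matrix onto the positive semidefinite matrices
    of rank <= r by keeping the r largest nonnegative eigenvalues
    (a standard building block; not the talk's nonconvex algorithm)."""
    w, V = np.linalg.eigh((M + M.T) / 2)   # symmetrize, then eigendecompose
    w = np.clip(w, 0.0, None)              # enforce positive semidefiniteness
    keep = np.argsort(w)[::-1][:r]         # indices of the r largest eigenvalues
    return (V[:, keep] * w[keep]) @ V[:, keep].T

# Project a random symmetric matrix onto rank-3 PSD metrics.
rng = np.random.default_rng(2)
d, r = 50, 3
S = rng.standard_normal((d, d))
M_r = project_rank_psd(S + S.T, r)
```

In an iterative scheme, this projection would alternate with gradient steps on a data-dependent loss (and, for sparsity constraints, with hard thresholding of the entries).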
Footnote: Code: Type-Date-Time-Room No.
Type: IL=Invited Lecture, SL=Special Lectures, MS=Minisymposia, IM=Industrial Minisymposia, CP=Contributed Papers, PP=Posters
Date: Mo=Monday, Tu=Tuesday, We=Wednesday, Th=Thursday, Fr=Friday
Time: A=8:30–9:30, B=10:00–11:00, C=11:10–12:10, BC=10:00–12:10, D=13:30–15:30, E=16:00–18:00, F=19:00–20:00, G=12:10–13:30, H=15:30–16:00
Room No.: TBA
