Multilinear Subspace Learning Via Linear Transforms And Grassmannian Manifold Analysis

Multilinear Subspace Learning Via Linear Transforms and Grassmannian Manifold Analysis

Furthermore, the number of observations is, in general, small relative to the feature-vector dimension, which can result in poor conditioning (the small sample size problem). Due to these issues, particularly when dealing with higher-order data of high dimensionality, there has been growing interest in multilinear subspace learning (MSL), which maintains the natural representation of multidimensional arrays (commonly referred to as tensors). Exploring, analyzing, and drawing insights from such data requires new mathematical tools that bridge the gap between traditional machine learning models and their multilinear counterparts. In this dissertation, we present new approaches and formulate mathematical theory for such data from a multilinear (tensor-tensor) perspective. In particular, we provide insights into several different application areas within the machine learning community and illustrate how multilinear extensions can be achieved.
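To make the abstract's two ideas concrete, here is a minimal NumPy sketch (not code from the dissertation): mode-n unfolding, the basic operation behind multilinear subspace learning, and a comparison showing why vectorizing a tensor inflates the feature dimension relative to working with the modes separately. The tensor sizes are illustrative assumptions.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move axis `mode` to the front and flatten the rest,
    so the mode-n fibers become the columns of a matrix. (Column ordering
    conventions vary in the literature; this is one common choice.)"""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

# One 8x8x8 observation: vectorizing it yields a 512-dimensional feature,
# so with only a few dozen samples the covariance is badly conditioned
# (the small sample size problem). Each mode unfolding instead poses a
# factor problem of size only 8 x 64.
X = np.random.default_rng(0).standard_normal((8, 8, 8))
print(X.reshape(-1).shape)   # vectorized: (512,)
print(unfold(X, 0).shape)    # mode-1 unfolding: (8, 64)
print(unfold(X, 1).shape)    # mode-2 unfolding: (8, 64)
```

MSL methods learn a small projection per mode (here, three 8 x d factors) rather than one 512 x d projection, preserving the array structure the abstract refers to.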
Generalized Principal Component Analysis

This book provides a comprehensive introduction to the latest advances in the mathematical theory and computational tools for modeling high-dimensional data drawn from one or more low-dimensional subspaces (or manifolds) and potentially corrupted by noise, gross errors, or outliers. This challenging task requires new algebraic, geometric, statistical, and computational methods for efficient and robust estimation and segmentation of one or more subspaces. The book also presents real-world applications of these methods in image processing, image and video segmentation, face recognition and clustering, and hybrid system identification. It is intended to serve as a textbook for graduate students and beginning researchers in data science, machine learning, computer vision, image and signal processing, and systems theory. It contains ample illustrations, examples, and exercises and is largely self-contained, with three appendices that survey basic concepts and principles from statistics, optimization, and algebraic geometry used in the book. René Vidal is a Professor of Biomedical Engineering and Director of the Vision Dynamics and Learning Lab at The Johns Hopkins University. Yi Ma is Executive Dean and Professor at the School of Information Science and Technology at ShanghaiTech University. S. Shankar Sastry is Dean of the College of Engineering, Professor of Electrical Engineering and Computer Science, and Professor of Bioengineering at the University of California, Berkeley.
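As a point of reference for the modeling task the blurb describes, here is a hedged sketch of the classical single-subspace case that GPCA generalizes: fitting one low-dimensional subspace to noisy samples with a truncated SVD (i.e., PCA). The dimensions and noise level are illustrative assumptions, not values from the book.

```python
import numpy as np

rng = np.random.default_rng(1)
d, D, n = 2, 10, 200                                   # subspace dim, ambient dim, samples

# Samples lying near a d-dimensional subspace of R^D, plus small noise.
basis = np.linalg.qr(rng.standard_normal((D, d)))[0]   # orthonormal true basis
X = basis @ rng.standard_normal((d, n)) + 0.01 * rng.standard_normal((D, n))

# PCA via truncated SVD: the top-d left singular vectors estimate the subspace.
U, s, _ = np.linalg.svd(X, full_matrices=False)
U_hat = U[:, :d]

# Cosines of the principal angles between true and estimated subspaces;
# values close to 1 mean the subspaces nearly coincide.
overlap = np.linalg.svd(basis.T @ U_hat, compute_uv=False)
print(overlap.min())
```

The harder problem the book treats is data drawn from a *union* of several such subspaces, where the segmentation (which sample belongs to which subspace) must be estimated jointly with the subspaces themselves.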
Optimization Algorithms on Matrix Manifolds

Author: P.-A. Absil
language: en
Publisher: Princeton University Press
Release Date: 2007-12-23
Many problems in the sciences and engineering can be rephrased as optimization problems on matrix search spaces endowed with a so-called manifold structure. This book shows how to exploit the special structure of such problems to develop efficient numerical algorithms. It places careful emphasis on both the numerical formulation of each algorithm and its differential-geometric abstraction--illustrating how good algorithms draw equally from the insights of differential geometry, optimization, and numerical analysis. Two more theoretical chapters provide readers with the background in differential geometry necessary for algorithmic development. In the other chapters, several well-known optimization methods such as steepest descent and conjugate gradients are generalized to abstract manifolds. The book provides a generic development of each of these methods, building upon the material of the geometric chapters, then guides readers through the calculations that turn these geometrically formulated methods into concrete numerical algorithms. The state-of-the-art algorithms given as examples are competitive with the best existing algorithms for a selection of eigenspace problems in numerical linear algebra. Optimization Algorithms on Matrix Manifolds offers techniques with broad applications in linear algebra, signal processing, data mining, computer vision, and statistical analysis. It can serve as a graduate-level textbook and will be of interest to applied mathematicians, engineers, and computer scientists.
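To illustrate the "steepest descent generalized to manifolds" pattern the blurb mentions, here is a hedged NumPy sketch (not code from the book): Riemannian gradient ascent on the unit sphere for the Rayleigh quotient f(x) = x'Ax, whose maximizer is a leading eigenvector--one of the eigenspace problems such books treat. The matrix size, step size, and iteration count are illustrative assumptions; the projection-plus-normalization steps stand in for a general tangent-space projection and retraction.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
A = A + A.T                                  # symmetric test matrix

x = rng.standard_normal(5)
x /= np.linalg.norm(x)                       # start on the unit sphere
for _ in range(500):
    egrad = 2 * A @ x                        # Euclidean gradient of x'Ax
    rgrad = egrad - (x @ egrad) * x          # project onto tangent space at x
    x = x + 0.05 * rgrad                     # steepest-ascent step in the tangent space
    x /= np.linalg.norm(x)                   # retract back onto the sphere

lam_max = np.linalg.eigvalsh(A)[-1]
print(abs(x @ A @ x - lam_max))              # gap to the top eigenvalue
```

The two manifold-specific ingredients--projecting the Euclidean gradient onto the tangent space and retracting the update back onto the manifold--are exactly the abstractions the book develops in general, for the sphere, Stiefel, and Grassmann manifolds among others.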