Regularization, Optimization, Kernels, and Support Vector Machines
Regularization, Optimization, Kernels, and Support Vector Machines offers a snapshot of the current state of the art of large-scale machine learning, providing a single multidisciplinary source for the latest research and advances in regularization, sparsity, compressed sensing, convex and large-scale optimization, kernel methods, and support vector machines. Consisting of 21 chapters authored by leading researchers in machine learning, this comprehensive reference:

- Covers the relationship between support vector machines (SVMs) and the Lasso
- Discusses multi-layer SVMs
- Explores nonparametric feature selection, basis pursuit methods, and robust compressive sensing
- Describes graph-based regularization methods for single- and multi-task learning
- Considers regularized methods for dictionary learning and portfolio selection
- Addresses non-negative matrix factorization
- Examines low-rank matrix and tensor-based models
- Presents advanced kernel methods for batch and online machine learning, system identification, domain adaptation, and image processing
- Tackles large-scale algorithms including conditional gradient methods, (non-convex) proximal techniques, and stochastic gradient descent

Regularization, Optimization, Kernels, and Support Vector Machines is ideal for researchers in machine learning, pattern recognition, data mining, signal processing, statistical learning, and related areas.