The increasing role of artificial intelligence in automating decision-making sparks concern about potential AI-based discrimination.
Matrices of small integers—innocuous as they may seem—can clearly provoke interesting behavior.
Joseph Teran spoke about the use of mathematical models for computer-generated imagery at AN18.
Gil Strang identifies continuous piecewise linear functions as powerful approximators in an effort to transform shallow learning into deep learning.
On the occasion of a birthday celebration, Walter Gautschi describes his interest in different research areas.
Nature can overcome the second law of thermodynamics and (nearly) exchange the temperatures of two substances.
Per-Gunnar Martinsson describes two randomized algorithms designed to help process large datasets in high-dimensional spaces.
2017 / x + 484 pages / Softcover / ISBN 978-1-611974-98-0 / List Price $97.00 / SIAM Member Price $67.90 / Order Code MO25
Keywords: nonlinear optimization; convex analysis; first-order methods; decomposition methods; scientific computing
The primary goal of this book is to provide a self-contained, comprehensive study of the main first-order methods that are frequently used in solving large-scale problems. First-order methods exploit information on values and gradients/subgradients (but not Hessians) of the functions composing the model under consideration. With the increase in the number of applications that can be modeled as large- or even huge-scale optimization problems, there has been a revived interest in using simple methods that require low iteration cost as well as low memory storage.
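As a minimal sketch of the idea (not an example from the book), the following illustrative code applies plain gradient descent to a least-squares objective, using only function gradients and never Hessians; the function name, matrix `A`, and step size are hypothetical choices for illustration.

```python
# Illustrative sketch: gradient descent for min f(x) = 0.5 * ||Ax - b||^2,
# a first-order method -- it uses only the gradient A^T (Ax - b), no Hessians.
import numpy as np

def gradient_descent(A, b, step, iters):
    """Run fixed-step gradient descent on the least-squares objective."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)   # first-order information only
        x = x - step * grad
    return x

# Small example: the minimizer of ||Ax - b||^2 here is x = (1, 3).
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([2.0, 3.0])
x = gradient_descent(A, b, step=0.2, iters=200)
```

Each iteration costs only a few matrix-vector products and stores only the current iterate, which is exactly the low per-iteration cost and low memory footprint the book's methods are designed around.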
The author has gathered, reorganized, and synthesized (in a unified manner) many results that are currently scattered throughout the literature, many of which typically cannot be found in optimization books.
First-Order Methods in Optimization
This book is intended primarily for researchers and graduate students in mathematics, computer science, and electrical and other engineering departments. Readers with a background in advanced calculus and linear algebra, as well as prior knowledge of the fundamentals of optimization (some convex analysis, optimality conditions, and duality), will be best prepared for the material.
About the Author
Amir Beck is a Professor at the School of Mathematical Sciences, Tel-Aviv University. His research interests are in continuous optimization, including theory, algorithmic analysis, and applications. He has published numerous papers and has given invited lectures at international conferences. He serves on the editorial boards of several journals. His research has been supported by various funding agencies, including the Israel Science Foundation, the German-Israeli Foundation, the United States–Israel Binational Science Foundation, the Israeli Science and Energy ministries, and the European Community.
View this book