The BBVA Foundation Frontiers of Knowledge Award in Information and Communication Technologies goes, in this twelfth edition, to Isabelle Guyon, Bernhard Schölkopf and Vladimir Vapnik for their fundamental contributions to machine learning, specifically in the theory and applications of support vector machines (SVMs) and kernel methods.
Machine learning is transforming the everyday world, improving fields as diverse as medical diagnosis, computer vision, natural language processing, and climate-change monitoring. This powerful tool teaches computers to infer relationships and patterns from real-world data, but doing so accurately and reliably requires a method for separating data points into distinct classes. This problem was solved by the theory and implementation of SVMs, originally developed by Vapnik and Guyon, and made efficient through elegant kernel methods conceived by Schölkopf.
Vapnik pioneered the field of statistical learning theory while at the Institute of Control Sciences in Moscow, where, with his late student Alexey Chervonenkis, he conceived the Vapnik-Chervonenkis dimension, a powerful measure of the capacity of a learning model. Moving to Bell Labs in the early 1990s, Vapnik collaborated with Guyon on methods to find the best linear classification of data. These SVMs provided, for the first time, a formal foundation for learning optimally from data and generalizing to unseen examples. Schölkopf worked with Vapnik to show that SVMs could be extended to a broad class of nonlinear classifiers, and to nonlinear principal component analysis, via mathematical kernel functions. Kernels enable enormous simplifications in SVMs by implicitly mapping complex data sets into higher-dimensional representations. Schölkopf and Guyon have also advanced the science of causal discovery, which uncovers cause-and-effect relationships in observational data, a problem considered by many to be unsolvable.
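The "kernel trick" behind that simplification can be illustrated in a few lines. The sketch below (an illustrative example, not drawn from the laureates' own code) shows that a degree-2 polynomial kernel evaluated on two 2-D points equals an ordinary inner product taken after an explicit map into a 3-D feature space; the kernel obtains the same value without ever constructing the higher-dimensional representation.

```python
import math

def phi(x):
    """Explicit degree-2 polynomial feature map for a 2-D point."""
    x1, x2 = x
    return (x1 * x1, math.sqrt(2) * x1 * x2, x2 * x2)

def dot(a, b):
    """Plain inner product of two equal-length vectors."""
    return sum(ai * bi for ai, bi in zip(a, b))

def poly2_kernel(x, y):
    """Degree-2 polynomial kernel: k(x, y) = (x . y)^2."""
    return dot(x, y) ** 2

x, y = (1.0, 2.0), (3.0, 0.5)

explicit = dot(phi(x), phi(y))   # inner product computed in 3-D feature space
implicit = poly2_kernel(x, y)    # same value, computed directly in 2-D

assert math.isclose(explicit, implicit)  # both equal 16.0 for these points
```

An SVM trained with such a kernel finds a linear separator in the implicit feature space, which corresponds to a nonlinear decision boundary in the original space.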