Episode
Vladimir Vapnik: Statistical Learning
Description
Vladimir Vapnik is the co-inventor of support vector machines, support vector clustering, VC theory, and many foundational ideas in statistical learning. His work has been cited over 170,000 times. He has some very interesting ideas about artificial intelligence and the nature of learning, especially on the limits of our current approaches and the open problems in the field. A video version is available on YouTube. If you would like more information about this podcast, go to https://lexfridman.com/ai or connect with @lexfridman on Twitter, LinkedIn, Facebook, or YouTube, where you can watch the video versions of these conversations.
Chapters
00:00 - 01:48 (01:48)
Summary
Vladimir Vapnik, co-inventor of Support Vector Machines and pioneer in statistical learning, discusses the current limits of AI and the open problems in the field, as well as his views on the extent to which God may play dice with the fundamental nature of reality.
01:50 - 09:25 (07:35)
Summary
The episode discusses the idea of the unreasonable effectiveness of mathematics in the natural sciences and whether it is a reflection of God's laws or simply a tool for making predictions.
09:25 - 14:42 (05:16)
Summary
This chapter discusses how mathematically modeling the process of learning in the human brain compares to machine learning, and why the teaching process is far from irrelevant.
14:42 - 23:05 (08:22)
Summary
This chapter discusses the concept of an admissible set of functions in science and mathematics. The goal is to construct a set of functions with small capacity, or VC dimension, leading to a more effective and accurate representation.
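For reference, the capacity idea can be made precise with Vapnik's classical generalization bound: for a function class of VC dimension h and n training examples, with probability at least 1 - \eta, every function f in the class satisfies

R(f) \le R_{\mathrm{emp}}(f) + \sqrt{\frac{h\left(\ln(2n/h) + 1\right) + \ln(4/\eta)}{n}}

where R is the true (expected) risk and R_{\mathrm{emp}} the empirical risk on the training data. A smaller VC dimension h shrinks the capacity term, which is the formal sense in which a low-capacity admissible set of functions generalizes better.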
23:06 - 28:58 (05:52)
Summary
In this chapter, the speaker questions the nature of learning and the interpretations behind deep learning. He argues that the optimal solution for learning lies in shallow networks rather than deep learning.
28:58 - 35:26 (06:27)
Summary
Deep learning methods can create admissible sets of functions from significantly less training data. However, it is important to recognize that not all problems can be solved through deep learning, and that constructing an admissible set of functions still requires sufficient training data.
35:27 - 42:08 (06:40)
Summary
The speaker discusses how people can develop specific abilities, such as playing like a butterfly or swimming like a duck, through the use of formal statistics and the uniform law of large numbers.
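For context, the uniform law of large numbers referenced here says that empirical averages of the loss converge to their expectations uniformly over the whole function class \mathcal{F}, not just for one fixed function:

\sup_{f \in \mathcal{F}} \left| \frac{1}{n} \sum_{i=1}^{n} L\big(y_i, f(x_i)\big) - \mathbb{E}\, L\big(y, f(x)\big) \right| \xrightarrow{\;P\;} 0 \quad \text{as } n \to \infty.

For indicator functions, Vapnik and Chervonenkis showed that this uniform convergence holds if and only if the VC dimension of \mathcal{F} is finite.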
42:09 - 48:54 (06:45)
Summary
The concept of symmetry and invariants is crucial in image recognition, where it allows the creation of mathematical measures that can be applied across a variety of images to recognize objects from far fewer examples.
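One way Vapnik has formalized this idea, a sketch drawn from his later work on learning using statistical invariants rather than from anything stated verbatim in this chapter, is as equality constraints: a chosen predicate \psi, such as a symmetry measure of an image, should average the same against the learned decision function f as against the observed labels:

\frac{1}{n} \sum_{i=1}^{n} \psi(x_i)\, f(x_i) \approx \frac{1}{n} \sum_{i=1}^{n} \psi(x_i)\, y_i.

Each such invariant cuts down the admissible set of functions, which is how recognition from far fewer examples becomes possible.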
48:55 - 54:14 (05:18)
Summary
In this episode, a renowned mathematician shares his philosophy of science, the importance of honesty in research, and some of his happiest and most profound moments as a researcher.