Episode
Vladimir Vapnik: Predicates, Invariants, and the Essence of Intelligence
Description
Vladimir Vapnik is the co-inventor of support vector machines, support vector clustering, VC theory, and many foundational ideas in statistical learning. He was born in the Soviet Union and worked at the Institute of Control Sciences in Moscow, then moved to the US, where he worked at AT&T, NEC Labs, and Facebook AI Research; he is now a professor at Columbia University. His work has been cited over 200,000 times.

This conversation is part of the Artificial Intelligence podcast. If you would like more information about this podcast, go to https://lexfridman.com/ai or connect with @lexfridman on Twitter, LinkedIn, Facebook, Medium, or YouTube, where you can watch the video versions of these conversations. If you enjoy the podcast, please rate it 5 stars on Apple Podcasts, follow on Spotify, or support it on Patreon.

This episode is presented by Cash App. Download it (App Store, Google Play) and use code "LexPodcast".

Here's the outline of the episode. On some podcast players you should be able to click the timestamp to jump to that time.

00:00 - Introduction
02:55 - Alan Turing: science and engineering of intelligence
09:09 - What is a predicate?
14:22 - Plato's world of ideas and world of things
21:06 - Strong and weak convergence
28:37 - Deep learning and the essence of intelligence
50:36 - Symbolic AI and logic-based systems
54:31 - How hard is 2D image understanding?
1:00:23 - Data
1:06:39 - Language
1:14:54 - Beautiful idea in statistical theory of learning
1:19:28 - Intelligence and heuristics
1:22:23 - Reasoning
1:25:11 - Role of philosophy in learning theory
1:31:40 - Music (speaking in Russian)
1:35:08 - Mortality
Chapters
00:00 - 02:25 (02:25)
Summary
In this podcast, the host talks about the PCI DSS security standard and how it applies to Cash App to ensure secure digital transactions.
Episode: Vladimir Vapnik: Predicates, Invariants, and the Essence of Intelligence
Podcast: Lex Fridman Podcast
02:25 - 11:42 (09:16)
Summary
The speaker discusses the significance of predicates in constructing invariants and how they can be used to summarize human behavior and the world we live in.
11:42 - 18:32 (06:49)
Summary
The speaker discusses the concept of Plato's world of ideas and world of things, and the idea that there is a connection between the two through the use of predicates.
18:32 - 27:25 (08:52)
Summary
This episode discusses selecting a set of admissible functions that give the same value with a chosen predicate, converging towards a desired function, and minimizing the square difference between functions to find the desired function.
27:25 - 31:47 (04:22)
Summary
This podcast episode explores the importance of discovering good predicates in developing intelligent machines and questions whether machines can do this task effectively.
31:47 - 37:08 (05:20)
Summary
By choosing an admissible set of functions, we can define a subset of functions from which to build an architecture. Closed-form solutions operate over this set of functions, which is not the same as the piecewise linear functions used by neural networks; it is better to work with such a set of functions than to rely on piecewise linear functions.
37:08 - 42:34 (05:26)
Summary
Good predicates are crucial for reducing the set of admissible functions, leading to smaller and smaller sets of functions as new invariants are incorporated.
42:34 - 44:21 (01:47)
Summary
This episode delves into the 31 structural notions common in stories, from obsession to happily ever after, resonating even in non-folk narratives such as detective serials and everyday life.
44:21 - 51:37 (07:15)
Summary
The theorem claims that if you use all available functions from a Hilbert space, you won't need training data. However, if you only use a few good predicates, you'll need some training data. Thus, the search for good predicates remains an ongoing challenge in machine learning.
51:37 - 56:07 (04:30)
Summary
The speaker discusses the challenge of selecting a small set of admissible functions good enough to extract universal invariants for machine learning and image understanding; with several such invariants, one could match the performance of the best neural net using 100 times fewer examples.
56:08 - 1:02:08 (06:00)
Summary
The selection of a data set is a pivotal task in discovering useful predicates or functions, and finding the right data can lead to an understanding of common sense in our three-dimensional world.
1:02:09 - 1:06:50 (04:40)
Summary
This podcast episode discusses the importance of having a small set of functions for statistical analysis, specifically focusing on the difference between the law of large numbers and the uniform law of large numbers.
1:06:50 - 1:13:55 (07:04)
Summary
The limitations of handwriting recognition and natural language processing are discussed, with a comparison between language models, which work over a finite set of letters and simpler benchmarks, and image analysis, which requires more complicated and sophisticated approaches. The role of symmetry in real-life 2D images is also highlighted.
1:13:55 - 1:19:00 (05:05)
Summary
The use of weak convergence is essential in reducing the set of possible admissible functions and understanding their properties, especially when it comes to learning. Although uniform convergence is useful, it is not enough without the power of weak convergence.
1:19:02 - 1:30:41 (11:39)
Summary
The speaker discusses the need to make corrections to the radial basis function for bond prediction, and questions whether human-level intelligence will have a closed-form solution for understanding life at a high level of abstraction.
1:30:43 - 1:40:08 (09:25)
Summary
The process of collecting and analyzing different sets of data, including language, music, and images, can spark ideas for creating new collections of descriptions and predicates for digit recognition.
1:40:08 - 1:45:04 (04:55)
Summary
The guest shares his insights on literature and human relationships, offering his perspective on life and the big blocks of life.