Chapter
Scaling Memory Networks for Human Reasoning
Deep learning and human reasoning are not the same thing, and trying to scale a memory network to contain all of Wikipedia doesn't quite work. To achieve human-like reasoning, neural nets may require prior structure and different math than what is commonly used in computer science.
Clips
07:52 - 09:50 (01:57)
Summary
The speaker discusses the power of deep learning: neural nets with non-convex objective functions and enormous numbers of parameters can, contrary to what traditional textbooks suggest, learn effectively from relatively small amounts of data using stochastic gradient descent.
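The claim above can be made concrete with a minimal sketch of my own (not code from the episode): stochastic gradient descent training an over-parameterised one-hidden-layer net on a non-convex objective with only eight data points.

```python
import numpy as np

# Illustrative sketch, not from the episode: SGD on a tiny
# over-parameterised neural net (~90 parameters, 8 data points).
rng = np.random.default_rng(0)

# Toy data: 8 points from a sine curve.
X = np.linspace(-3, 3, 8).reshape(-1, 1)
y = np.sin(X)

# One hidden layer of 30 tanh units.
W1 = rng.normal(0, 0.5, (1, 30)); b1 = np.zeros(30)
W2 = rng.normal(0, 0.5, (30, 1)); b2 = np.zeros(1)

lr = 0.05
for step in range(5000):
    i = rng.integers(len(X))        # one random sample: "stochastic"
    x, t = X[i:i+1], y[i:i+1]
    h = np.tanh(x @ W1 + b1)        # forward pass
    pred = h @ W2 + b2
    err = pred - t                  # gradient of squared error
    # Backpropagation by hand.
    gW2 = h.T @ err; gb2 = err.sum(0)
    gh = err @ W2.T * (1 - h**2)
    gW1 = x.T @ gh; gb1 = gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(mse)  # training error drops despite the non-convex objective
```

Despite having far more parameters than data points and no convexity guarantee, plain SGD fits the training set, which is the textbook-defying behaviour the clip describes.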
Episode: Yann LeCun: Deep Learning, Convolutional Neural Networks, and Self-Supervised Learning
Podcast: Lex Fridman Podcast
09:50 - 11:20 (01:30)
Summary
Yann LeCun, Chief Artificial Intelligence Scientist at Facebook, delves into the basics of learning and intelligence of the human brain, the inseparability of learning and intelligence, and the use of machine learning to automate intelligence. He also discusses the compatibility of reasoning with gradient-based learning.
11:20 - 14:13 (02:52)
Summary
This podcast discusses the use of cybernetic math in deep learning and the challenges of reconciling discrete reasoning with gradient-based learning. It poses the question of how much prior structure needs to be put into neural networks to achieve human-like reasoning.
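One standard trick for reconciling discrete choices with gradient-based learning (a common technique, not necessarily what LeCun proposes in the episode) is to soften a hard argmax into a temperature-controlled softmax, so the "choice" becomes differentiable:

```python
import numpy as np

# Illustrative sketch: a hard discrete choice (argmax) gives no useful
# gradient; a softmax relaxation is smooth and trainable, and lowering
# the temperature recovers the discrete behaviour in the limit.
def softmax(z, temperature=1.0):
    z = z / temperature
    e = np.exp(z - z.max())
    return e / e.sum()

logits = np.array([2.0, 0.5, -1.0])

hard = np.eye(3)[np.argmax(logits)]            # discrete one-hot choice
soft_warm = softmax(logits, temperature=1.0)   # smooth, gradients flow
soft_cold = softmax(logits, temperature=0.1)   # near-discrete

print(hard)
print(soft_warm)
print(soft_cold)
```

At low temperature the relaxed choice concentrates almost all its weight on the winning option, which is one way gradient-based systems approximate discrete reasoning steps.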
14:13 - 17:14 (03:01)
Summary
Building a memory network that can contain all of Wikipedia's knowledge is challenging. In addition, the network needs to access and process that information iteratively, which calls for an architecture like the transformer.
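The core memory-access operation shared by memory networks and transformers can be sketched as follows (my own minimal illustration, not code from the episode): a query vector softly attends over a table of stored key/value memories, letting the network "look up" a fact in a differentiable way.

```python
import numpy as np

# Illustrative sketch: scaled dot-product attention as a soft memory read.
def attention_read(query, keys, values):
    """Weight each stored memory by how well its key matches the query."""
    scores = keys @ query / np.sqrt(keys.shape[1])  # scaled dot products
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                        # softmax over memories
    return weights @ values                         # weighted mixture

rng = np.random.default_rng(1)
keys = rng.normal(size=(5, 8))     # 5 stored memories, key dim 8
values = rng.normal(size=(5, 8))   # the content paired with each key

# A query that strongly matches memory #2: the read is dominated
# by values[2], i.e. the network retrieves that stored fact.
query = keys[2] * 5.0
out = attention_read(query, keys, values)
print(out)
```

Iterating this read (feeding the result back in as the next query) is the kind of multi-step access the summary refers to, and stacked attention layers in a transformer do exactly that at scale.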