Chapter

Scaling Memory Networks for Human Reasoning
Episode
Yann LeCun: Deep Learning, Convolutional Neural Networks, and Self-Supervised Learning
Podcast
Lex Fridman Podcast
07:52 - 17:14 (09:21)

Deep learning and human reasoning are not the same thing, and scaling a memory network to contain all of Wikipedia does not quite work. To achieve human-like reasoning, neural nets may require prior structure and different math from what is commonly used in computer science.

Clips
07:52 - 09:50 (01:57)
Deep Learning
Summary

The speaker discusses the power of deep learning: neural nets with non-convex objective functions and huge numbers of parameters can still be trained effectively with stochastic gradient descent and generalize from relatively little data, contrary to what traditional textbooks teach.
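
As a rough illustration of that point (a sketch under assumed settings, not code from the episode), the snippet below fits a deliberately over-parameterized network, whose training loss is non-convex, to a tiny dataset with plain stochastic gradient descent. The architecture, dataset, and hyperparameters are all arbitrary choices for demonstration.

```python
# Illustrative sketch only: an over-parameterized MLP (far more parameters
# than training points) fit with stochastic gradient descent. All sizes and
# hyperparameters are arbitrary assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Tiny dataset: 32 points of a noisy sine wave.
x = torch.linspace(-3, 3, 32).unsqueeze(1)
y = torch.sin(x) + 0.1 * torch.randn_like(x)

# Roughly 17,000 parameters for 32 training points: heavily
# over-parameterized, with a non-convex loss surface.
model = nn.Sequential(
    nn.Linear(1, 128), nn.Tanh(),
    nn.Linear(128, 128), nn.Tanh(),
    nn.Linear(128, 1),
)

opt = torch.optim.SGD(model.parameters(), lr=0.05)
loss_fn = nn.MSELoss()

for step in range(2000):
    # "Stochastic": sample a small random minibatch at each step.
    idx = torch.randint(0, x.shape[0], (8,))
    loss = loss_fn(model(x[idx]), y[idx])
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final training loss: {loss_fn(model(x), y).item():.4f}")
```

Despite the non-convex objective and a parameter count that dwarfs the dataset, plain SGD still fits this data, which is the textbook-defying behavior the clip alludes to.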

09:50 - 11:20 (01:30)
Machine Learning
Summary

Yann LeCun, Chief Artificial Intelligence Scientist at Facebook, delves into the basics of learning and intelligence in the human brain, the inseparability of learning and intelligence, and the use of machine learning to automate intelligence. He also discusses whether reasoning is compatible with gradient-based learning.

11:20 - 14:13 (02:52)
Deep Learning
Summary

This clip discusses the cybernetics-style math used in deep learning and the challenge of reconciling discrete reasoning with gradient-based learning. It asks how much prior structure needs to be built into neural networks to achieve human-like reasoning.
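
One way to picture the tension the clip describes (a toy illustration, not anything from the episode) is the difference between a hard, discrete choice and a softmax relaxation of the same choice: the hard version blocks gradients, while the soft version stays trainable. The scores, options, and temperature below are illustrative assumptions.

```python
# Illustrative sketch: a hard (discrete) choice blocks gradients, while a
# softmax relaxation of the same choice stays differentiable. The values and
# temperature are arbitrary assumptions.
import torch

scores = torch.tensor([1.0, 2.5, 0.3], requires_grad=True)
options = torch.tensor([10.0, 20.0, 30.0])

# Hard selection: argmax is a discrete operation, so no gradient reaches
# `scores` through this path.
hard_choice = options[scores.argmax()]
print("hard choice (no gradient path):", hard_choice.item())

# Soft selection: a softmax over the scores gives a weighted mixture of the
# options, and gradients flow back into `scores`.
temperature = 0.5
weights = torch.softmax(scores / temperature, dim=0)
soft_choice = (weights * options).sum()

soft_choice.backward()
print("soft choice:", soft_choice.item())
print("gradient w.r.t. scores:", scores.grad)
```

Attention in transformers follows the same pattern: a softmax turns what would otherwise be a discrete lookup into a differentiable weighted average.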

14:13 - 17:14 (03:01)
Memory Networks
Summary

Building a memory network large enough to contain all of Wikipedia's knowledge is challenging. The network also needs to access and process that information iteratively, which calls for a neural architecture like the transformer.
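
As a toy sketch of that iterative access pattern (an assumption-laden illustration, not the actual memory-network or transformer implementation discussed in the episode), the code below stores "facts" as key/value vectors and reads from the store with dot-product attention over several hops, folding each retrieval back into the query.

```python
# Toy sketch of iterative memory access: facts are stored as key/value vectors
# and read with dot-product attention over several "hops". The dimensions,
# number of hops, and query-update rule are illustrative assumptions.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

num_facts, dim = 1000, 64           # tiny stand-in for a huge knowledge store
keys = torch.randn(num_facts, dim)
values = torch.randn(num_facts, dim)

query = torch.randn(dim)            # encoding of the question
query_update = torch.nn.Linear(2 * dim, dim)  # learned in a real system

for hop in range(3):
    # Attention read: compare the query against every key, then take a
    # softmax-weighted average of the values (the "retrieved" information).
    scores = keys @ query / dim ** 0.5
    weights = F.softmax(scores, dim=0)
    retrieved = weights @ values

    # Fold what was retrieved back into the query and read again.
    query = query_update(torch.cat([query, retrieved]))

print("final query representation:", query.shape)
```

At Wikipedia scale the store would hold millions of entries rather than a thousand, which is where the engineering difficulty the clip mentions comes in.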
