Chapter

The Limitations of Deep Learning Metaphors for Understanding the Mind
38:54 - 50:58 (12:03)

The common metaphor that the mind is a neural network trained through exposure to data has limitations, according to François Chollet. While natural language processing has made progress, flaws remain, and one potential way forward is to write programs over the latent space these models operate on.

Clips
Some AI researchers view the human mind as a neural network that is blank at birth and gets trained through exposure to data; however, there is skepticism regarding the ability of neural networks to reason.
38:54 - 44:24 (05:29)
AI
Summary

Some AI researchers view the human mind as a neural network that is blank at birth and gets trained through exposure to data; however, there is skepticism regarding the ability of neural networks to reason.

Chapter
The Limitations of Deep Learning Metaphors for Understanding the Mind
Episode
#120 – François Chollet: Measures of Intelligence
Podcast
Lex Fridman Podcast
Simply scaling up GPT-3 with more training data and transformer layers won't address its flaws.
44:24 - 48:38 (04:14)
AI
Summary

Simply scaling up GPT-3 with more training data and transformer layers won't address its flaws. A possible way forward is to write programs over the latent space that these models operate on, in collaboration with self-supervised models.

The scaling of GPT-3 is limited not by model size or training time, but by the amount of available training data.
48:38 - 50:58 (02:19)
GPT-3
Summary

The scaling of GPT-3 is limited not by model size or training time, but by the amount of available training data. GPT-3 is currently trained by OpenAI on a crawl of the entire web, and the impact of a GPT-3-style model with 100 trillion parameters is yet to be seen.
