Chapter
Challenges of Scaling Deep Learning Models
2:13:56 - 2:19:22 (05:25)

The podcast discusses the challenges and opportunities of scaling deep learning models to run on systems with hundreds of trillions of parameters and to run for thousands of years. They explore the idea of sparsely gated mixture of experts and algorithmic improvements.
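The sparsely gated mixture-of-experts idea mentioned here scales parameter count by adding many expert sub-networks while keeping per-token compute low, because a gating network routes each token to only a few experts. Below is a minimal NumPy sketch of such a layer with top-k gating; the layer sizes, expert count, and variable names are illustrative assumptions, not details from the episode.

    import numpy as np

    rng = np.random.default_rng(0)
    d_model, d_hidden, n_experts, top_k = 16, 32, 8, 2

    # Each expert is a small two-layer MLP; the gate picks top_k experts per
    # token, so compute grows with top_k rather than with n_experts.
    experts = [(rng.standard_normal((d_model, d_hidden)) * 0.1,
                rng.standard_normal((d_hidden, d_model)) * 0.1)
               for _ in range(n_experts)]
    gate_w = rng.standard_normal((d_model, n_experts)) * 0.1

    def moe_layer(x):
        """x: (n_tokens, d_model) -> (n_tokens, d_model)."""
        logits = x @ gate_w                            # gating scores
        top = np.argsort(logits, axis=-1)[:, -top_k:]  # top-k expert indices
        out = np.zeros_like(x)
        for i, token in enumerate(x):
            chosen = top[i]
            # Softmax over the selected experts only (sparse gating).
            w = np.exp(logits[i, chosen] - logits[i, chosen].max())
            w /= w.sum()
            for weight, e in zip(w, chosen):
                w1, w2 = experts[e]
                out[i] += weight * (np.maximum(token @ w1, 0.0) @ w2)
        return out

    tokens = rng.standard_normal((4, d_model))
    print(moe_layer(tokens).shape)  # (4, 16)

In a real system the experts would be full feed-forward blocks sharded across many devices, but the routing idea is the same: only the experts selected by the gate do work for a given token.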

Clips
As technology continues to advance rapidly, there is a concern that upcoming generations may not be able to fully appreciate or comprehend new technology, since the technology they grow up with will quickly become outdated.
2:13:56 - 2:16:06 (02:09)
Technology
Summary

As technology continues to advance rapidly, there is a concern that upcoming generations may not be able to fully appreciate or comprehend new technology, since the technology they grow up with will quickly become outdated. The machine learning world has been captivated by GPT-3 and language models, but with the prospect of even more advanced technology over the next 100 years, there is also a sense of worry.

Chapter
Challenges of Scaling Deep Learning Models
Episode
#131 – Chris Lattner: The Future of Computing and Programming Languages
Podcast
Lex Fridman Podcast
The technical challenges of scaling up large networks, such as a sparsely gated mixture of experts with hundreds of trillions of parameters, that have to run for thousands of years.
2:16:06 - 2:19:22 (03:15)
Artificial Intelligence
Summary

The technical challenges of scaling up large networks, such as a sparsely gated mixture of experts with hundreds of trillions of parameters, that have to run for thousands of years. The challenge is also to explore how such technologies can be applied without causing great harm.
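For a sense of why scale at this level is hard, here is a rough back-of-envelope calculation for memory alone; the parameter count and byte width are illustrative assumptions, not figures from the episode.

    # Illustrative only: memory for the weights of a dense 100-trillion-
    # parameter model stored in 16-bit floating point.
    params = 100e12          # "hundreds of trillions" -> take 100 trillion
    bytes_per_param = 2      # fp16
    print(f"weights alone: ~{params * bytes_per_param / 1e12:.0f} TB")  # ~200 TB

A sparsely gated mixture of experts does not shrink that storage requirement, but it keeps the compute per token far below what the raw parameter count suggests, which is part of its appeal for networks of this size.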
