Chapter: Challenges of Scaling Deep Learning Models
The podcast discusses the challenges and opportunities of scaling deep learning models to hundreds of trillions of parameters on systems that run for thousands of years. They explore the idea of sparsely gated mixtures of experts and algorithmic improvements.
Clips
As technology continues to advance rapidly, there is a concern that upcoming generations may not be able to fully appreciate or comprehend it due to growing up with outdated technology.
2:13:56 - 2:16:06 (02:09)
Summary
As technology continues to advance rapidly, there is a concern that upcoming generations may not be able to fully appreciate or comprehend it due to growing up with outdated technology. The machine learning world has been captivated by GPT-3 and language models, but the prospect of even more advanced technology over the next 100 years brings a sense of worry.
Chapter: Challenges of Scaling Deep Learning Models
Episode: #131 – Chris Lattner: The Future of Computing and Programming Languages
Podcast: Lex Fridman Podcast
The technical challenges of scaling up large networks, such as sparsely gated mixtures of experts with hundreds of trillions of parameters that run for thousands of years.
2:16:06 - 2:19:22 (03:15)
Summary
The technical challenges of scaling up large networks, such as sparsely gated mixtures of experts with hundreds of trillions of parameters that run for thousands of years. The challenge is to explore how such technologies can be applied without causing great harm.
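The sparsely gated mixture-of-experts idea mentioned in the clip makes such enormous parameter counts tractable by activating only a small number of expert subnetworks per input, so compute grows with the number of *active* experts rather than the total. A minimal NumPy sketch of top-k gating, not from the episode; all names, shapes, and the use of plain linear layers as "experts" are illustrative assumptions:

```python
import numpy as np

def sparse_moe(x, expert_weights, gate_weights, k=2):
    """Route input x to the top-k experts by gate score and return
    the score-weighted sum of their outputs (illustrative sketch)."""
    scores = gate_weights @ x                    # one gate score per expert
    top_k = np.argsort(scores)[-k:]              # indices of the k best experts
    # Softmax over only the selected experts' scores.
    probs = np.exp(scores[top_k] - scores[top_k].max())
    probs /= probs.sum()
    # Only k experts are evaluated, regardless of how many exist in total.
    outputs = [expert_weights[i] @ x for i in top_k]
    return sum(p * out for p, out in zip(probs, outputs))

# Toy dimensions: 8 experts, each a 4x4 linear map (hypothetical).
rng = np.random.default_rng(0)
num_experts, d = 8, 4
experts = rng.standard_normal((num_experts, d, d))
gates = rng.standard_normal((num_experts, d))
x = rng.standard_normal(d)
y = sparse_moe(x, experts, gates, k=2)           # only 2 of 8 experts run
```

The design point is that adding more experts increases total parameters without increasing per-input compute, which is one way networks reach the parameter counts discussed above.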