Chapter: The Exciting Possibility of Shrinking Computing Performance
The possibility of more powerful computers is exciting, but for some, shrinking computing performance is even more so. In response, designers are exploring architectural optimizations and ways to use more transistors without being swamped by their complexity.
Clips
Designing equipment that is more efficient and effective can help cope with changes in technology and materials.
35:41 - 39:11 (03:30)
Summary
Designing equipment that is more efficient and effective can help cope with changes in technology and materials. This can lead to more efficient use of resources and faster construction.
Chapter: The Exciting Possibility of Shrinking Computing Performance
Episode: Jim Keller: Moore’s Law, Microprocessors, Abstractions, and First Principles
Podcast: Lex Fridman Podcast
Advancements in computing performance are exciting: rather than just shrinking the technology, massive parallelism through stacking CPUs is one of the biggest possibilities for advancement.
39:11 - 42:36 (03:25)
Summary
Advancements in computing performance are exciting: rather than just shrinking the technology, massive parallelism through stacking CPUs is one of the biggest possibilities for advancement.
The optimization of neural networks is not necessarily a search process, as the computation involved in teasing out attributes is more of a projection process.
42:36 - 43:36 (00:59)
Summary
The optimization of neural networks is not necessarily a search process, as the computation involved in teasing out attributes is more of a projection process. While inference may involve some level of search, the training process itself is not based on search.
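The training-versus-search distinction above can be sketched with a toy example (hypothetical code, not from the episode): gradient-descent training computes an update direction directly from the loss, while a search-style approach enumerates candidates and keeps the best one.

```python
# Toy 1-D model: fit w so that f(x) = w * x matches y = 3 * x.
# All names and values here are illustrative assumptions.

def train_gradient_descent(steps=200, lr=0.01):
    """Training as direct computation: each step projects w toward the
    minimum using the loss gradient -- no candidate enumeration."""
    w = 0.0
    data = [(x, 3.0 * x) for x in range(1, 6)]
    for _ in range(steps):
        # Gradient of mean squared error with respect to w.
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def train_search(candidates):
    """Search-style optimization: evaluate every candidate weight and
    keep the one with the lowest loss."""
    data = [(x, 3.0 * x) for x in range(1, 6)]
    loss = lambda w: sum((w * x - y) ** 2 for x, y in data)
    return min(candidates, key=loss)
```

Both recover w ≈ 3, but the gradient-descent path never enumerates alternatives; it follows a computed direction, which is closer to the "projection" framing in the clip.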
The definition of optimization differs when it comes to neural networks, as compared to a chess board.
43:36 - 44:34 (00:57)
Summary
The definition of optimization differs for neural networks compared to a chess board. The high-dimensional space in which a network optimizes is nothing like searching a database of chess positions.