Chapter
The Limits of Cost Functions in Evolution and Deep Learning
This chapter explores the idea that evolution does not have a cost function in the sense used in deep learning, raising the question of whether cost functions are holding the field back. It considers the possibility of different architectures that could account for dynamic or multiple cost functions.
Clips
The cost-function analogy for optimization is not always useful: systems with multiple or dynamic cost functions may not be easily optimized with gradient-descent algorithms.
09:00 - 11:14 (02:14)
Summary
The cost-function analogy for optimization is not always useful: systems with multiple or dynamic cost functions may not be easily optimized with gradient-descent algorithms.
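A minimal toy sketch (illustrative only, not from the episode) of the point above: gradient descent reliably minimizes a single fixed cost, but when two objectives pull in opposite directions, their summed gradient drives the parameter to a compromise that minimizes neither.

```python
def grad_descent(grad, x, lr=0.1, steps=200):
    """Plain gradient descent on a scalar parameter."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Single cost f(x) = (x - 3)^2, gradient 2(x - 3): converges to x = 3.
single = grad_descent(lambda x: 2 * (x - 3), x=0.0)

# Two conflicting costs: f1 pulls x toward 3, f2 pulls toward -3.
# Summed gradient 2(x - 3) + 2(x + 3) = 4x drives x to 0,
# a compromise that minimizes neither objective on its own.
multi = grad_descent(lambda x: 2 * (x - 3) + 2 * (x + 3), x=1.0)
```

If the cost function itself changes over time (as it plausibly does in evolution), even this compromise point keeps moving, which is the difficulty the clip gestures at.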
Episode: #94 – Ilya Sutskever: Deep Learning
Podcast: Lex Fridman Podcast
The podcast explores the role of cost functions in deep learning and questions whether they may be holding us back from discovering new ways to approach problems.
11:14 - 13:24 (02:09)
Summary
The podcast explores the role of cost functions in deep learning and questions whether they may be holding us back from discovering new ways to approach problems. It considers how neuroscience could offer a new perspective on learning rules, such as spike-timing-dependent plasticity (STDP), which could open alternate paths of exploration.
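The neuroscience-inspired learning rule mentioned in the clip, usually called spike-timing-dependent plasticity (STDP), can be sketched as follows (a hedged, textbook-style toy; parameter values are illustrative assumptions, not from the episode): a synapse strengthens when the presynaptic spike precedes the postsynaptic one, and weakens when the order is reversed, with the effect decaying exponentially in the time gap.

```python
import math

def stdp_delta_w(t_pre, t_post, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Weight change for one pre/post spike pair (times in ms).

    Classic exponential STDP window: potentiation if pre fires before
    post (dt > 0), depression if post fires before pre (dt < 0).
    """
    dt = t_post - t_pre
    if dt > 0:    # pre before post: long-term potentiation
        return a_plus * math.exp(-dt / tau)
    if dt < 0:    # post before pre: long-term depression
        return -a_minus * math.exp(dt / tau)
    return 0.0

# Pre fires 5 ms before post: the synapse strengthens.
ltp = stdp_delta_w(t_pre=0.0, t_post=5.0)
# Post fires 5 ms before pre: the synapse weakens.
ltd = stdp_delta_w(t_pre=5.0, t_post=0.0)
```

Unlike gradient descent, this rule is local in time and per-synapse with no global cost function, which is why it is raised as a possible alternate path of exploration.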