Chapter

The Limits of Cost Functions in Evolution and Deep Learning
09:00 - 13:24 (04:24)

This podcast explores the idea that evolution does not have a cost function in the sense used in deep learning, raising the question of whether cost functions are holding the field back. It considers whether different architectures could accommodate dynamic or multiple cost functions.

Clips
09:00 - 11:14 (02:14)
Optimization
Summary

The cost function analogy for optimization may not always be useful: systems with multiple or dynamic cost functions may not be easily optimized with gradient descent algorithms.
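
As a rough illustration of this point, the Python sketch below (not from the episode; the toy cost functions and parameter names are illustrative assumptions) runs plain gradient descent first on a single fixed cost, then on a cost whose target drifts every step, where the same update rule ends up chasing a moving objective rather than converging.

import numpy as np

def gradient_descent(grad_fn, w0, lr=0.1, steps=100):
    # Plain gradient descent: repeatedly step against the gradient of ONE cost.
    w = np.array(w0, dtype=float)
    for _ in range(steps):
        w -= lr * grad_fn(w)
    return w

# Static cost C(w) = (w - 3)^2, gradient 2*(w - 3): the update settles near w = 3.
print(gradient_descent(lambda w: 2.0 * (w - 3.0), w0=[0.0]))

# "Dynamic" cost: the target drifts on every step, so there is no single fixed
# minimum for the same update rule to converge to.
state = {"target": 0.0}
def drifting_grad(w):
    state["target"] += 0.1  # the objective itself changes over time
    return 2.0 * (w - state["target"])
print(gradient_descent(drifting_grad, w0=[0.0]))  # chases a moving target

A system juggling several such objectives at once poses a similar difficulty: there is no single scalar cost for a gradient step to descend.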

11:14 - 13:24 (02:09)
Deep Learning
Summary

The podcast explores the role of cost functions in deep learning and questions whether they may be holding us back from discovering new ways to approach problems. They consider how neuroscience could offer a new perspective on learning rules, such as spike-timing-dependent plasticity (STDP), which could open alternative paths of exploration.
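
As a rough sketch of the kind of learning rule mentioned here (not from the episode; the time constants and amplitudes below are illustrative assumptions), a pair-based STDP update strengthens a synapse when the presynaptic spike precedes the postsynaptic one and weakens it otherwise, with the size of the change decaying with the time gap.

import math

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    # Weight change for one pre/post spike pair (times in ms).
    # Pre before post -> potentiation; post before pre -> depression.
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    if dt < 0:
        return -a_minus * math.exp(dt / tau)
    return 0.0

print(stdp_dw(t_pre=10.0, t_post=15.0))  # positive: pre led post by 5 ms
print(stdp_dw(t_pre=15.0, t_post=10.0))  # negative: post led pre by 5 ms

Note that this update depends only on local spike timing rather than on the gradient of a global cost, which is part of why such rules come up as an alternative perspective on learning.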

Chapter
The Limits of Cost Functions in Evolution and Deep Learning
Episode
#94 – Ilya Sutskever: Deep Learning
Podcast
Lex Fridman Podcast