
Strategies for More Efficient Neural Network Training
51:19 - 53:20 (02:00)

To curb the need for ever-larger datasets in supervised learning, the focus should shift from architecture alone toward selecting better examples for neural networks to learn from. Furthermore, taking inspiration from biology, it is beneficial to build in only a weak prior, leaving room for flexibility and opportunism in machine learning.
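One common example-selection heuristic in this spirit is to prioritize the samples the model currently finds hardest, e.g. those with the highest loss. The clip does not specify a method; the sketch below is purely illustrative, and the function name and selection rule are assumptions:

```python
import numpy as np

def select_informative(losses, k):
    """Hypothetical selection rule: return indices of the k highest-loss
    examples, hardest first, so training can focus on them."""
    return np.argsort(losses)[-k:][::-1]

# Per-example losses from a forward pass (made-up values)
losses = np.array([0.1, 2.3, 0.05, 1.7, 0.9])
print(select_informative(losses, 2))  # → [1 3]
```

In practice such hard-example mining is usually combined with some random sampling, since always training on the hardest cases can overfit to noise or outliers.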
