Chapter

General Problem Solving Machines and Artificial Curiosity
36:03 - 49:27 (13:24)

Creating general problem-solving machines involves building artificial curiosity systems that push beyond current knowledge to understand more of the world and become more general problem solvers. In this process, curiosity emerges naturally in a recurrent network that learns to solve problems more efficiently by forming small subnetworks representing properties of the agent, such as its hands and other actuators.

Clips
36:03 - 38:26 (02:22)
Curiosity
Summary

The practice of being curious and experimenting for the sake of survival applies not only to professional scientists but also to babies, who use their toys to understand how the world works. Curiosity is an important human trait: it lets us explore and create situations that transcend our existing knowledge, making us more successful problem solvers.

Chapter
General Problem Solving Machines and Artificial Curiosity
Episode
Juergen Schmidhuber: Godel Machines, Meta-Learning, and LSTMs
Podcast
Lex Fridman Podcast
38:26 - 41:30 (03:03)
Artificial Intelligence
Summary

The discussion turns to the role of curiosity and creativity in building problem-solving machines. It distinguishes applied creativity from pure creativity, the latter being essential to human-level intelligence.

41:30 - 45:57 (04:26)
Recurrent Networks
Summary

Recurrent networks are likely to form subnetworks that encode properties of the agent, such as its hands and other sensors, in order to compress and better predict data influenced by the agent's own actions. With such a predictive model, the agent can plan ahead to maximise reward in its environment, for example reaching its charging station in time while avoiding painful obstacles.
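The idea of learning a predictive model of the environment and then planning with it can be sketched in a toy form. The following NumPy example is an illustrative assumption on my part, not anything from the episode: a 1-D "environment" with unknown dynamics, a linear model fitted from random experience, and a hypothetical "charging station" goal reached by simulating candidate actions with the learned model.

```python
# Toy sketch of model-based planning: learn the world's dynamics from
# experience, then choose actions by simulating them with the model.
# All specifics (linear dynamics, goal at 3.0) are assumptions for the demo.
import numpy as np

rng = np.random.default_rng(1)

def env_step(s, a):
    # Unknown-to-the-agent dynamics: next_state = state + action + noise.
    return s + a + rng.normal(0, 0.01)

# Collect random experience tuples (state, action, next_state).
data, s = [], 0.0
for _ in range(200):
    a = rng.uniform(-1, 1)
    s_next = env_step(s, a)
    data.append((s, a, s_next))
    s = s_next

# Fit a linear predictive model  s' ~ w0*s + w1*a  by least squares.
X = np.array([(si, ai) for si, ai, _ in data])
y = np.array([sn for _, _, sn in data])
w, *_ = np.linalg.lstsq(X, y, rcond=None)

def plan(s, goal, actions=np.linspace(-1, 1, 21)):
    """Pick the action whose *predicted* next state lands closest to the goal."""
    preds = w[0] * s + w[1] * actions
    return actions[np.argmin(np.abs(preds - goal))]

# Use the learned model to reach the "charging station" at position 3.0.
s, goal = 0.0, 3.0
for _ in range(10):
    s = env_step(s, plan(s, goal))
print(abs(s - goal) < 0.2)  # prints True
```

The point of the sketch is the separation of roles: the model compresses experience into a predictor of action consequences, and planning is just search over actions inside that predictor rather than in the real world.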

45:57 - 49:27 (03:30)
Recurrent Neural Networks
Summary

The speaker discusses the use of Recurrent Neural Networks (RNNs) and Long Short-Term Memory networks (LSTMs) in speech recognition, explaining that RNNs extend feedforward networks with recurrent connections, while LSTMs are a type of RNN designed for much better long-term memory retention.
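To make the memory-retention point concrete, here is a minimal LSTM cell sketched in NumPy. This is a generic textbook-style illustration of the gating mechanism, assumed for the example; the weight layout and dimensions are arbitrary, and it is not a reference implementation of any library's LSTM.

```python
# Minimal LSTM cell: gates decide what to write into, keep in, and read
# from a persistent cell state, which is what preserves long-range memory.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step.

    x: input vector; h_prev, c_prev: previous hidden and cell states.
    W: weights of shape (4*hidden, input+hidden); b: bias of shape (4*hidden,).
    """
    H = h_prev.size
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0:H])        # input gate: how much new info to write
    f = sigmoid(z[H:2*H])      # forget gate: how much old memory to keep
    o = sigmoid(z[2*H:3*H])    # output gate: how much memory to expose
    g = np.tanh(z[3*H:4*H])    # candidate cell update
    c = f * c_prev + i * g     # cell state carries long-term memory
    h = o * np.tanh(c)         # hidden state is this step's output
    return h, c

# Run a short random sequence through the cell.
rng = np.random.default_rng(0)
H, X = 4, 3
W = rng.standard_normal((4 * H, X + H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):
    h, c = lstm_step(rng.standard_normal(X), h, c, W, b)
print(h.shape)  # prints (4,)
```

The additive update `c = f * c_prev + i * g` is the key difference from a plain RNN: information can flow through the cell state across many time steps without being repeatedly squashed, which is what gives LSTMs their better memory retention.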
