Chapter

The Role of Markov Chains in Text Compression and Modeling
Episode
#115 – Dileep George: Brain-Inspired AI
Podcast
Lex Fridman Podcast
1:30:50 - 1:39:10 (08:19)

This chapter explores the use of Markov chains and higher-order models in text compression and modeling. The conversation touches on the limitations of Markov-chain models, as well as the potential for more sophisticated models to capture key abstractions and common-sense knowledge.

Clips
By creating higher-order models or more sophisticated structures in machine learning, such as transformer networks, it is possible to form compressed representations of large amounts of text.
1:30:50 - 1:34:52 (04:02)
Machine learning
Summary

By creating higher-order models or more sophisticated structures in machine learning, such as transformer networks, it is possible to form compressed representations of large amounts of text. Such a compressed representation, it is suggested, could serve as an approximation of a world model that may be useful for reasoning.
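To make the link between higher-order models and compression concrete, here is a minimal sketch (not from the episode) of an order-2 character Markov model in Python. It counts two-character contexts, estimates next-character probabilities, and uses Shannon's rule that a symbol with probability p can be coded in about -log2(p) bits. The order k=2, the add-one smoothing over 256 byte values, the sample string, and the function names are illustrative assumptions, not anything specified in the conversation.

import math
from collections import defaultdict, Counter

def train_markov(text, k=2):
    # Count next-character frequencies for every k-character context.
    counts = defaultdict(Counter)
    for i in range(len(text) - k):
        context, nxt = text[i:i + k], text[i + k]
        counts[context][nxt] += 1
    return counts

def bits_to_encode(text, counts, k=2):
    # Approximate code length of `text` under the trained model,
    # with crude add-one smoothing over 256 possible byte values.
    total_bits = 0.0
    for i in range(len(text) - k):
        context, nxt = text[i:i + k], text[i + k]
        ctx_counts = counts[context]
        p = (ctx_counts[nxt] + 1) / (sum(ctx_counts.values()) + 256)
        total_bits += -math.log2(p)  # Shannon code length for this symbol
    return total_bits

sample = "the quick brown fox jumps over the lazy dog. " * 50
model = train_markov(sample, k=2)
print(f"{bits_to_encode(sample, model, k=2) / 8:.0f} bytes vs {len(sample)} raw bytes")

A real Markov-based compressor (PPM-style arithmetic coding, for instance) would update the counts adaptively while encoding; a transformer language model plays the same role as the probability model here, just with a far richer notion of context.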

Although the neural network architecture is good at text modeling, it is still a feed-forward architecture, meaning that information will not be present in the model unless someone has explicitly written it down.
1:34:52 - 1:39:10 (04:17)
Artificial Intelligence
Summary

Although the neural network architecture is good at text modeling, it is still a feed-forward architecture, meaning that information will not be present in the model unless someone has explicitly written it down. This raises the question of whether a neural network with hundreds of trillions of parameters could store all of the information in the world.
