Chapter: Making Tensor Computations Easier with Domain-Specific Languages and Higher-Level Programming Languages
The development of domain-specific languages and higher-level programming languages, such as Swift, aims to make tensor computations easier and to enable more creative architectures, such as RNNs and sparse convolutional neural networks.
Clips
This transcript discusses how programming languages can be a barrier to implementing cutting-edge research in deep learning, and suggests that higher-level languages like Swift could allow for easier experimentation with neural network architectures like RNNs and convolutional neural networks.
16:25 - 19:53 (03:28)
Episode: Jeremy Howard: fast.ai Deep Learning Courses and Research
Podcast: Lex Fridman Podcast
This episode discusses how projects such as Tensor Comprehensions, MLIR, and Halide make it possible to create domain-specific languages for tensor computations, to express how compilation is structured in separate stages, and to target multiple GPU backends with relative ease.
19:53 - 23:02 (03:08)