Neural language models and an explanation of recurrent neural networks
Chapters
0:00 Introduction
1:54 Neural N-Gram Models
6:03 Recurrent Neural Networks
11:47 LSTM Cells
12:22 Outro
Ready to level up your vector search game? 🚀 Ditch traditional keywords and discover the power of vector search! This video shows how vector search can make search smarter and generate creative text along the way. Get hands-on with vector search on Vertex AI! Jump directly to the topics you want to learn: 00:00…
What is Generative AI and how does it work? What are common applications for Generative AI? Watch this video to learn all about Generative AI, including common applications, model types, and the fundamentals for how to use it.
Check out how large language models (LLMs) and generative AI intersect to push the boundaries of possibility. Unlock real-world use cases and learn how the power of a prompt can enhance LLM performance. You’ll also explore Google tools to help you learn to develop your own gen AI apps. https://www.youtube.com/watch?v=RBzXsQHjptQ
This is the last of a series of 3 videos where we demystify Transformer models and explain them with visuals and friendly examples.
00:00 Introduction
01:50 What is a transformer?
04:35 Generating one word at a time
08:59 Sentiment Analysis
13:05 Neural Networks
18:18 Tokenization
19:12 Embeddings
25:06 Positional encoding
27:54 Attention
32:29 Softmax
35:48 Architecture of a Transformer
39:00 Fine-tuning
42:20 Conclusion
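The “Softmax” and “Generating one word at a time” chapters above describe the same core step: turning a vector of scores (logits) into a probability distribution over the vocabulary and picking the next token. A minimal sketch in Python, with a toy vocabulary and logit values invented purely for illustration:

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability, then normalize exponentials.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Toy vocabulary and logits (hypothetical values, not from the video).
vocab = ["the", "cat", "sat"]
logits = [1.0, 3.0, 0.5]

probs = softmax(logits)                       # probabilities sum to 1
next_token = vocab[probs.index(max(probs))]   # greedy decoding: most probable token
```

In a real model this step repeats autoregressively: the chosen token is appended to the input and the model is run again to score the word after it.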
LangChain became immensely popular when it launched in 2022, but how can it impact your development and application of AI models, large language models (LLMs) in particular? In this video, Martin Keen shares an overview of the features and uses of LangChain.
If you’re interested in the herculean task of interpreting what these large networks might actually be doing, the Transformer Circuits posts by Anthropic are great. In particular, it was only after reading one of these that I started thinking of the combination of the value and output matrices as being a combined low-rank map from…
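The “combined low-rank map” idea the paragraph alludes to can be stated compactly. This is my own sketch in standard per-head Transformer notation, not a quotation from the Anthropic posts:

```latex
% Within one attention head, the value projection W_V and the output
% projection W_O only ever act in composition, so they form a single map:
\[
  W_{OV} \;=\; W_O W_V \;\in\; \mathbb{R}^{d_{\text{model}} \times d_{\text{model}}},
  \qquad
  \operatorname{rank}(W_{OV}) \;\le\; d_{\text{head}} \;\ll\; d_{\text{model}}.
\]
% Each head therefore moves information between residual-stream positions
% through a low-rank bottleneck of dimension d_head.
```

Viewing the pair as one matrix rather than two separate steps is what makes the head's action on the residual stream easy to reason about.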