Similar Posts
Artificial Intelligence Full Course 2024 | AI & Machine Learning Full Course
This Artificial Intelligence full course video covers all the topics you need to know to become a master in AI and ML. It covers the basics of Machine Learning, the different types of Machine Learning, and the various applications of Machine Learning across different industries. This video will also help…
Vector Search and Embeddings
Ready to level up your vector search game? 🚀 Ditch traditional keywords and discover the power of vector search! This video will help you discover ways to make search smarter and generate creative text along the way. Get hands-on with vector search on Vertex AI! Jump directly to the topics you want to learn: 00:00…
Prompt Engineering Tutorial – Master ChatGPT and LLM Responses
Learn prompt engineering techniques to get better results from ChatGPT and other LLMs.
The math behind Attention: Keys, Queries, and Values matrices
This is the second of a series of 3 videos where we demystify Transformer models and explain them with visuals and friendly examples. 00:00 Introduction · 01:18 Recap: Embeddings and Context · 04:46 Similarity · 11:09 Attention · 20:46 The Keys and Queries Matrices · 25:02 The Values Matrix · 28:41 Self and Multi-head Attention · 33:54 Conclusion
Reliable, fully local RAG agents with LLaMA3.2-3b
The LLaMA3.2 release includes a new set of compact models designed for on-device use cases, such as locally running assistants. Here, we show how LangGraph can enable these kinds of local assistants by building a multi-step RAG agent – this combines ideas from 3 advanced RAG papers (Adaptive RAG, Corrective RAG, and Self-RAG) into a single…
The Attention Mechanism in Large Language Models
Attention mechanisms are crucial to the recent boom in LLMs. In this video you’ll see a friendly pictorial explanation of how attention mechanisms work in Large Language Models. This is the first of a series of three videos on Transformer models. https://www.youtube.com/watch?v=OxCpWwDCDFQ
