Podcast: AI Knowhow
Published On: Mon Dec 02 2024

Description: Ever wondered what really powers large language models (LLMs) like ChatGPT, Claude, or Gemini? In this episode, Courtney Baker, David DeWolf, and Mohan Rao are joined by John Fowler (Knownwell's Chief Science Officer) and Ramsri Goutham Golla (Lead Data Scientist) to break down the mechanics of LLMs in a way that's accessible and relevant for all professionals, not just data scientists. John and Ramsri explain how LLMs predict the next word in a sentence, what makes them so powerful, and the role of neural networks and attention mechanisms. They also dive into real-world applications, such as retrieval-augmented generation (RAG), and how it's replacing fine-tuning for more efficient and reliable AI performance.

Ready to harness AI for your business? Learn how Knownwell's AI-powered platform can empower you to stay ahead at Knownwell.com.

Watch this episode on YouTube: https://youtu.be/l1S3GGyo_o8

Show Notes & Related Links
- Watch a guided Knownwell demo
- Read "Here's what's really going on inside an LLM's neural network" in Ars Technica
- Read the "Attention Is All You Need" research paper from a number of leaders at Google that's referenced in the episode
- Connect with Ramsri Goutham Golla on LinkedIn
- Connect with John Fowler on LinkedIn
- Connect with David DeWolf on LinkedIn
- Connect with Mohan Rao on LinkedIn
- Connect with Courtney Baker on LinkedIn
- Follow Knownwell on LinkedIn
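The "predict the next word" idea discussed in the episode can be sketched in miniature. This toy bigram counter is an assumption-laden stand-in, not how a real LLM works (real models learn probabilities over tokens with a neural network, not raw counts), but it shows the core framing: given the current word, score candidate next words and pick the most likely.

```python
from collections import Counter, defaultdict

# Toy illustration: count which word most often follows each word
# in a tiny corpus, then "predict" by picking the top count.
corpus = "the cat sat on the mat and the cat ran".split()

follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word`, or None if unseen."""
    counts = follows[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" twice, "mat" only once
```

An LLM replaces the count table with a learned function of the entire preceding context, which is where neural networks and attention come in.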
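The RAG pattern mentioned in the episode can also be sketched: instead of fine-tuning a model's weights on your documents, you retrieve relevant text at question time and prepend it to the prompt. The documents and the word-overlap similarity below are illustrative assumptions (production systems typically use embedding-based vector search), but the retrieve-then-prompt shape is the technique itself.

```python
# Minimal RAG sketch: retrieve the most relevant document, then build a
# prompt that grounds the model's answer in that retrieved context.
documents = [
    "Knownwell provides AI-powered client intelligence for services firms.",
    "Attention mechanisms let models weigh which input tokens matter most.",
    "Fine-tuning retrains model weights on domain-specific examples.",
]

def retrieve(query, docs):
    """Return the document sharing the most words with the query (naive)."""
    query_words = set(query.lower().split())
    return max(docs, key=lambda d: len(query_words & set(d.lower().split())))

def build_prompt(query, docs):
    """Prepend retrieved context so the model answers from it."""
    context = retrieve(query, docs)
    return f"Context: {context}\nQuestion: {query}\nAnswer:"

print(build_prompt("What do attention mechanisms do?", documents))
```

Because the knowledge lives in the retrieved documents rather than the model's weights, updating the system means updating the document store, which is a large part of why RAG is often preferred over repeated fine-tuning.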