Beginning of LLMs (Transformers): The Introduction
About this audio
This trailer introduces Season 5 of the Adapticx Podcast, where we begin the story of large language models. After tracing AI’s evolution from rules to neural networks and attention, this season focuses on the breakthrough that changed everything: the Transformer.
We preview how "Attention Is All You Need" reshaped language modeling, enabled large-scale training, and led to early models like BERT, GPT-1, GPT-2, and T5. We also introduce scaling laws: the insight that performance improves predictably as data, compute, and model size grow.
This episode sets the direction for the season and explains why the Transformer marks the start of the modern LLM era.
This episode is part of the Adapticx AI Podcast. Listen via the link provided or search “Adapticx” on Apple Podcasts, Spotify, Amazon Music, or most podcast platforms.
Sources and Further Reading
Additional references and extended material are available at:
https://adapticx.co.uk