Attention, Transformers, and LLMs: a hands-on introduction in PyTorch
- Landing Page
- Preparing data for LLM training
- Small Language Models: an introduction to autoregressive language modeling
- Attention is all you need
- Other LLM Topics
This workshop builds an understanding of the fundamentals of attention and the transformer architecture, so that you can see how LLMs work and use them in your own projects.
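
As a taste of what the workshop builds toward, here is a minimal sketch of scaled dot-product attention, the core operation covered in depth on the "Attention is all you need" page. The function and variable names are illustrative, not taken from the workshop materials:

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (batch, seq_len, d_k) tensors of queries, keys, and values
    d_k = q.size(-1)
    # Similarity of every query with every key, scaled to keep the softmax stable
    scores = q @ k.transpose(-2, -1) / d_k**0.5
    weights = F.softmax(scores, dim=-1)  # attention weights sum to 1 per query
    return weights @ v                   # weighted average of the values

# Toy example: a batch of 4 tokens with 8-dimensional embeddings
q = k = v = torch.randn(1, 4, 8)
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 4, 8])
```

The pages above unpack each piece of this computation, from preparing the token embeddings to stacking attention into a full transformer.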