Lecture 09 & 11: Scaling Laws
Why do large language models perform better at scale? Lectures 09 and 11 introduce the basic concepts and principles behind scaling laws, examining the relationships among model size, dataset size, and compute budget. They cover the main types of scaling laws (in parameter count, data volume, and compute) and how each affects model performance, and use concrete case studies to show how scaling laws are applied in deep learning practice.
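To make the size/data/compute relationship concrete, here is a minimal sketch (not code from the lecture) that fits a Chinchilla-style parameter scaling law L(N) = E + A / N^α to hypothetical pilot-run losses, then sizes a model for a larger budget using the common C ≈ 6·N·D FLOPs approximation and the roughly 20-tokens-per-parameter rule of thumb from Hoffmann et al. (Chinchilla). All loss values and the compute budget below are illustrative placeholders.

```python
# A minimal sketch, assuming a Chinchilla-style law L(N) = E + A / N^alpha.
# Loss data and the FLOPs budget are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

def scaling_law(n_params, E, A, alpha):
    """Irreducible loss E plus a power-law term that shrinks with model size N."""
    return E + A / n_params ** alpha

# Hypothetical (parameter count, final training loss) pairs from small pilot runs.
n_params = np.array([1e7, 3e7, 1e8, 3e8, 1e9])
losses   = np.array([4.20, 3.85, 3.52, 3.28, 3.05])

(E, A, alpha), _ = curve_fit(scaling_law, n_params, losses, p0=[2.0, 30.0, 0.2])
print(f"fitted law: L(N) = {E:.2f} + {A:.1f} / N^{alpha:.3f}")

# Extrapolate beyond the fitted range -- the main practical use of the law.
print(f"predicted loss at N = 1e10: {scaling_law(1e10, E, A, alpha):.3f}")

# Compute-optimal sizing under the common approximations C ~= 6*N*D FLOPs
# and D ~= 20*N tokens (Chinchilla rule of thumb), hence C ~= 120*N^2.
C = 1e23  # hypothetical training FLOPs budget
N_opt = (C / 120) ** 0.5
D_opt = 20 * N_opt
print(f"budget {C:.0e} FLOPs -> ~{N_opt/1e9:.0f}B params, ~{D_opt/1e9:.0f}B tokens")
```

This is the workflow that makes scaling laws useful in practice: the curve is fit on cheap small-scale runs, and the single expensive training run is sized by extrapolating to the target compute budget.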