Enroll Now!
What you’ll learn
- Build a GPT-2-style language model with 20 million parameters from scratch using JAX, the open-source library behind Google’s Gemini, Veo, and Nano Banana models.
- Learn JAX’s core primitives (automatic differentiation, JIT compilation, and vectorized mapping) and how to combine them to define, train, and checkpoint a neural network efficiently (see the sketch after this list).
- Load a pretrained MiniGPT model and run inference through a chat interface, completing the full workflow from data preprocessing and training to generating text with the trained LLM.
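
To give a flavor of the primitives named above, here is a minimal sketch of `jax.grad`, `jax.jit`, and `jax.vmap` on a toy function. The function `loss` and the variable names are illustrative examples, not code from the course materials:

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    # Toy squared-error loss for a single scalar example.
    return (w * x - 1.0) ** 2

grad_loss = jax.grad(loss)                         # automatic differentiation w.r.t. w
fast_grad = jax.jit(grad_loss)                     # JIT-compile the gradient with XLA
batched_grad = jax.vmap(fast_grad, in_axes=(None, 0))  # vectorize over a batch of x

xs = jnp.array([0.5, 1.0, 2.0])
print(batched_grad(3.0, xs))                       # per-example gradients: [0.5, 4.0, 20.0]
```

The same composition pattern scales up to the course project: the gradient of a transformer's loss is computed with `grad`, compiled once with `jit`, and applied across batches, rather than written by hand.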
