In September 2025, I completed Luis Serrano's Mathematics for Machine Learning Specialization, which provided a solid mathematical foundation and sparked my interest in building a more research-oriented understanding of machine learning.
After that, I started exploring classical ML algorithms, from linear regression to k-Means, from decision boundaries to clustering, from training models to questioning how learning itself actually works. But something felt incomplete.
When using libraries like scikit-learn, it's very easy to apply an algorithm, yet just as easy to remain unaware of what is actually happening under the hood.
For example, when someone applies k-Means clustering using scikit-learn, they often don't realize that the algorithm is fundamentally solving a constrained optimization problem: it repeatedly updates cluster centroids to minimize the total within-cluster sum of squared distances, a process grounded in linear algebra, calculus, and iterative optimization.
Most high-level libraries condense all of this mathematics into a single line of code, but real understanding lives in the math.
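To make that concrete, here is a minimal from-scratch sketch of the centroid-update loop described above (Lloyd's algorithm). The function name and the toy data are my own illustration, not code from the book or the repository:

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Plain k-Means (Lloyd's algorithm): alternate between assigning
    points to their nearest centroid and recomputing each centroid as
    the mean of its assigned points. Each step can only decrease the
    within-cluster sum of squared distances (the objective)."""
    rng = np.random.default_rng(seed)
    # Initialize centroids by picking k distinct data points at random.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: squared distance from every point to every centroid.
        d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Update step: move each centroid to the mean of its cluster
        # (keep the old centroid if a cluster ends up empty).
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break  # converged: assignments will no longer change
        centroids = new_centroids
    # Final assignments and the minimized objective (inertia).
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    labels = d2.argmin(axis=1)
    inertia = d2.min(axis=1).sum()
    return centroids, labels, inertia

# Two well-separated Gaussian blobs: the algorithm should split them cleanly.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
centroids, labels, inertia = kmeans(X, k=2)
```

This is exactly what a call like `KMeans(n_clusters=2).fit(X)` hides: a handful of lines of linear algebra iterated until the objective stops improving.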
That's when I discovered the book "Mathematics of Machine Learning" by Tivadar Danka. This book is a goldmine that beautifully balances theoretical depth with practical machine learning intuition.
Iโve just completed Chapter 1, and to stay consistent and accountable, I created a public GitHub repository:
ML-Math-Bridge: https://github.com/msami-ullah-ai/ML-Math-Bridge
There Iโll be posting:
- Chapter-wise notes
- Python implementations of the mathematics
- Parallel Python projects
I'll be sharing this journey openly. If you're interested in the mathematics behind ML, feel free to follow along. Let's learn together!