In-browser, locally run LLMs

I built a website that runs LLMs locally in the browser using WebGPU and WebAssembly.
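For context, the rough idea is to use WebGPU when the browser exposes it and fall back to WebAssembly otherwise. A minimal sketch of that check (not the site's actual code; it casts `navigator.gpu` to `any` instead of pulling in `@webgpu/types`):

```ts
// Pick an inference backend: WebGPU when an adapter is available,
// otherwise fall back to a CPU path compiled to WebAssembly.
async function pickBackend(): Promise<"webgpu" | "wasm"> {
  // navigator.gpu is only defined in browsers with WebGPU enabled.
  const gpu = (navigator as any).gpu;
  if (gpu) {
    const adapter = await gpu.requestAdapter();
    if (adapter !== null) {
      return "webgpu"; // GPU-accelerated path
    }
  }
  return "wasm"; // WebAssembly fallback on the CPU
}

pickBackend().then((backend) => console.log(`Using ${backend} backend`));
```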

Looking for feedback and suggestions on which models I should add next.

Wow! This is a very cool idea!
However, I was not able to download any model. I’m running Chrome on Ubuntu.
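In case it helps with debugging: one guess is that WebGPU simply isn't exposed on that Chrome/Ubuntu setup (it has historically been gated on Linux), or that storage quota is blocking the model files from being cached. A quick diagnostic sketch you could run in the console (just illustrative, not part of the site):

```ts
// Check the two things that most often block in-browser model downloads:
// WebGPU availability and storage quota for caching the weights.
async function diagnose(): Promise<void> {
  const gpu = (navigator as any).gpu; // @webgpu/types would give proper typings
  if (!gpu) {
    console.warn("WebGPU not exposed; on Linux Chrome check chrome://gpu and browser flags.");
  } else {
    const adapter = await gpu.requestAdapter();
    console.log(adapter ? "WebGPU adapter found" : "navigator.gpu exists but no adapter returned");
  }

  // Rough idea of how much space is available for cached model files.
  const { usage, quota } = await navigator.storage.estimate();
  console.log(`Storage used: ${usage ?? "?"} of ${quota ?? "?"} bytes`);
}

diagnose();
```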