Hacker News

I have a 32 GB M2, but most local models I use fit on my old 8 GB M1 laptop.

I can run the QwQ 32B model with Q4 quantization on my 32 GB M2.
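A rough back-of-the-envelope sketch of why a 32B-parameter model at Q4 fits in 32 GB (the ~4.5 bits/weight figure is an assumption covering quantization overhead, not an exact number for any particular quant):

```python
# Estimate weight memory for a 32B-parameter model at ~Q4 quantization.
params = 32e9            # 32 billion parameters
bits_per_weight = 4.5    # assumed: ~4 bits plus quantization overhead
weight_gb = params * bits_per_weight / 8 / 1e9
print(round(weight_gb, 1))  # ≈ 18 GB
```

That leaves roughly 14 GB of headroom for the KV cache, the OS, and other apps, which is why it's workable on a 32 GB machine but not an 8 GB one.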

I suggest using https://Ollama.com on Mac, Windows, and Linux. I experimented with all the options on Apple Silicon and liked Ollama best.



