
The M4 I have lets me run GPT-OSS-20B on my Mac, and it's surprisingly responsive. I was even able to get LM Studio to run a local web API for it, which Zed detected. I'm pleasantly surprised by how powerful it is. My gaming PC with a 3080 can't even run the same model (not enough VRAM).
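For reference, LM Studio's local server speaks an OpenAI-compatible chat-completions protocol, so any HTTP client can hit it. A minimal sketch, assuming the server is on its default port 1234 and the model ID is "openai/gpt-oss-20b" (check LM Studio's Developer tab for the actual values on your machine):

```python
# Sketch of querying LM Studio's local server from a script.
# Assumptions: default port 1234 and model ID "openai/gpt-oss-20b" --
# both may differ on your setup.
import json
import urllib.request

def build_request(prompt, model="openai/gpt-oss-20b",
                  url="http://localhost:1234/v1/chat/completions"):
    # The endpoint is OpenAI-compatible, so the payload follows
    # the standard chat-completions format.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    req = build_request("Say hello in five words.")
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Since it's the same wire format, editors like Zed that support OpenAI-style providers can point at the same URL.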

