
Is "tiny LLM" an oxymoron? I believe Apple has told us it's a transformer language model, but not specifically an LLM.


It's like how the "New Forest" is really old now: even small LLMs are (from what I've seen, which isn't exhaustive) large compared to Markov language models.
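
For contrast, a Markov language model is little more than a lookup table of n-gram counts. A minimal bigram sketch (illustrative only, not any particular system's implementation):

    from collections import defaultdict, Counter
    import random

    def train_bigram(corpus: list[str]) -> dict[str, Counter]:
        # The whole "model" is a table of next-word counts per word.
        table = defaultdict(Counter)
        for sentence in corpus:
            words = sentence.split()
            for prev, nxt in zip(words, words[1:]):
                table[prev][nxt] += 1
        return table

    def predict_next(table: dict[str, Counter], word: str) -> str:
        candidates = table.get(word)
        if not candidates:
            return "<unk>"
        words, counts = zip(*candidates.items())
        return random.choices(words, weights=counts, k=1)[0]

    table = train_bigram(["the cat sat on the mat", "the dog sat on the rug"])
    print(predict_next(table, "the"))  # e.g. "cat", "dog", "mat", or "rug"

The table only grows with the n-grams actually seen in the training text, which is why even a "small" transformer's tens of millions of learned weights look large by comparison.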


According to this article[1], it has about 34 million parameters.

[1] https://jackcook.com/2023/09/08/predictive-text.html
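
For a rough sense of where a figure like 34M comes from, here's a back-of-the-envelope estimate for a small decoder-only transformer. The vocabulary size, width, and depth below are assumptions picked for illustration, not Apple's actual configuration (the article only reports the total):

    def transformer_param_count(vocab_size: int, d_model: int, n_layers: int) -> int:
        """Approximate parameter count, ignoring biases and layer norms."""
        embedding = vocab_size * d_model      # token embedding (often tied with the output head)
        per_layer = 12 * d_model ** 2         # ~4*d^2 for attention + ~8*d^2 for the feed-forward block
        return embedding + n_layers * per_layer

    # Example: a hypothetical 6-layer model with d_model=512 and a 30k-token
    # vocabulary lands in the same ballpark as the reported figure.
    print(transformer_param_count(vocab_size=30_000, d_model=512, n_layers=6))
    # -> 34234368, i.e. ~34M
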


There's no difference. An LLM is just a transformer language model that's "large".


Yeah, they meant a TLM


That's print('Hello, world!')



