Hacker News
_flux on April 24, 2024 | on: Snowflake Arctic Instruct (128x3B MoE), largest op...
And huggingface is hosting (randomly assuming 8-64 GB per model) 5..40 PB of models for free? That's generous of them. Or can the models share data? Ollama seems to have some ability to do that.
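The range quoted above can be checked with a quick back-of-envelope sketch. The ~625,000 model count below is an assumption inferred from the 5..40 PB figures, not something stated in the comment:

```python
# Back-of-envelope check of the storage estimate.
# Assumption (not in the source): roughly 625,000 hosted models,
# the count implied by the 5..40 PB range at 8-64 GB per model.
MODELS = 625_000          # hypothetical model count
GB_PER_PB = 1_000_000     # decimal units: 1 PB = 1,000,000 GB

for size_gb in (8, 64):   # the comment's assumed per-model size range
    total_pb = MODELS * size_gb / GB_PER_PB
    print(f"{size_gb} GB/model -> {total_pb:.0f} PB")
# 8 GB/model  -> 5 PB
# 64 GB/model -> 40 PB
```

Deduplication (sharing identical weight files across repos), as the comment speculates, would shrink the real footprint well below this naive total.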