
Not to open that can of worms, but in most definitions self-improvement is not an AGI requirement. That's already ASI territory (Artificial Superintelligence). That's the proverbial Skynet (for pessimists) or singularity (for optimists).


Hmm, my bad. Yeah, I always thought that was the endgame of humanity, but isn't AGI supposed to be that (the endgame)?

What would AGI mean, then? Solving some problem it hasn't seen before, or what exactly? I mean, I think AGI is solved, no?

If not, I see people mentioning that Horizon Alpha is actually a GPT-5 model, and it's predicted to release on Thursday on some betting market, so maybe that fits the AGI definition?



