
We are in the middle of an LLM bubble.

Nvidia's problem will sort itself out naturally in the coming months/years.



As someone put it: we are in the 3D glasses phase of AI. Remember when all TVs came with them?


Same thing was said about Nvidia's crypto bubbles, and then look what happened.

Jensen isn't stupid. He's making accelerators for anything so that they'll be ready to catch the next bubble that depends on crazy compute power that can't be done efficiently on CPUs. They're so far the only semi company beating Moore's law by a large margin due to their clever scaling tech while everyone else is like "hey look our new product is 15% more efficient and 15% more IPC than the one we launched 3 years ago".

They may be overvalued now but they definitely won't crash back to their "just gaming GPUs" days.


They got extremely lucky with AI following crypto. The timing was close to perfect. I'm not sure there will be another wave like that at all for a long while.


Maybe, but it's not like all those AI compute units or whatever Nvidia called them will be thrown in the dumpster after the AI bubble pops. There are a lot of problems that can be solved on them, and researchers are always looking for new problems to solve as compute becomes accessible.

I'm tired of hearing about Nvidia's "luck". There was no luck involved. Nvidia has shipped CUDA on consumer GPUs since 2006. That's almost 20 years researchers have had to find use cases for that compute, and Nvidia made it possible. In other words, the AI bubble happened because Nvidia laid the necessary groundwork for it to happen; they didn't just fall into it by luck.


They will not be thrown in the dumpster, but that's actually a bad thing for NVIDIA. We had a very short period when lots of miners dumped their RTX cards on eBay and prices fell a lot for some time (then AI on RTX became a thing at small scales). When the A100/H100s get replaced, they will flood the market. There are many millions of dollars stuck in those assets right now, and in a few years they will dominate research and the top end of hobbies. Only high-profile companies/universities/researchers will look at buying anything newer. Maybe NVIDIA can do 2 more generations of those cards, but ASIC-based solutions will hit the market and generic CUDA will become a problem rather than a blessing. Same story as BTC miners and graphics cards.

Sure, they didn't get lucky with the tech they had to offer; that was well developed for years. They just got lucky that the next big thing was compute-based. If the next thing is memory/storage-based, they're screwed while the compute market stays saturated for years, and they have only gamers left.


> they're screwed and the compute market is saturated for years

If all this extra computing power is available, smart people will find a way to use it somehow.


There is no recurring revenue on them though. So NVIDIA needs to continue selling obscene amounts of new chips.


There could be other "AI" waves after LLMs. And we have still not hit the self-driving car wave; that could happen in the next 20 years (or 40). Nor general-purpose robots, which could also happen. Personalized medicine has also not happened yet. Nor virtual reality, which might take off one day (or not). There are still many industries that could go big in terms of computational demands.


It wasn't a coincidence either, though: the amount of compute available is probably the main driver of this wave.


I think there's also a very high prospect of virtual worlds with virtual people (SFW or otherwise) becoming popular, rendered with Apple/Meta goggles... that could require insane amounts of compute. And this is just one possibility. Relatively cheap multimodal smart glasses you wear when out and about that offload compute to the cloud are another.

Nvidia could just as easily triple in short order as get cut in half from here imho.


I thought Meta Horizon and the sales numbers of the Vision Pro[0][1] already prove your thesis wrong. Even Zuckerberg stopped talking about it.

[0]https://www.macrumors.com/2024/04/23/apple-cuts-vision-pro-s... [1] https://www.macrumors.com/2024/04/22/apple-vision-pro-custom...


I am referring to the future (the actual future, not simulated ones), so it is not possible to know if I am wrong.

I predict this is yet another domain rich with opportunity for AI.


You mean like Omniverse from Nvidia, where you can simulate entire factories, or, as Nvidia does, data centers before they are built?

Or how about building the world virtually? https://www.nvidia.com/en-us/high-performance-computing/eart...


Ya, I saw those demos (I own the stock); incredible. But I'm thinking of things more like just a single VR friend who has memory and (optionally) prior knowledge of your background. I think these could be an absolute blessing for a lot of people who have lots of time on their hands but no one to talk to.

Or, leaning more towards your examples, a Grand Theft Auto style environment containing millions of them, except lifelike.


Intel's all time high was in 2000, if I'm reading the charts correctly.

Of course the equivalent can happen to Nvidia. Seems almost certain.



