
Will the number of cores in a GPU level off? It seems like intensive computing of all sorts will migrate to GPGPU programming.


Seems like the latest Nvidia GPUs aren't really an improvement over the previous ones, but just bigger and proportionally more expensive. So maybe the leveling off in performance is already starting to happen.


That is not true.

The 4090 uses less power than a 3080ti while being 63% faster.

https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-f...

It also delivers 45% better performance than a 3090ti while using 2% less power (at 4k):

https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-f...
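Quick back-of-the-envelope on efficiency from those cited figures (just the arithmetic on the review's numbers, nothing new):

    # perf/W gain of the 4090 over the 3090ti:
    # +45% performance at 2% less power (at 4k)
    gain = 1.45 / 0.98
    print(f"{gain:.2f}x perf per watt")  # ~1.48x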

Can't find the link now, but I saw a YouTube video that tested it at different power limits, and it performed very nicely.

The 4090 draws a lot of power, but Nvidia has simply chosen to operate at the diminishing-returns end of the power/performance curve.

It's a halo product for people who will pay for the top of the range. I mean, look at the price!
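To make "diminishing returns" concrete, here's a toy model where performance scales sublinearly with the power limit (the exponent is made up for illustration, not measured data):

    # Illustrative only: perf ~ power^alpha with alpha < 1,
    # so the last watts buy very little extra performance.
    def rel_perf(power_w, base_w=300.0, alpha=0.4):
        return (power_w / base_w) ** alpha

    for w in (300, 350, 400, 450):
        print(f"{w} W -> {rel_perf(w):.2f}x perf")
    # 450 W is 50% more power than 300 W for only ~18% more perf.

Under a curve like this, capping the card well below its stock limit costs surprisingly little performance, which matches what the power-limit tests tend to show.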


Shrinking has almost stopped too, and you can only make a chip so big before it runs into other constraints.


There is a lot of room for development before the exponential curve needs to be carried by the next paradigm: at least for desktop computers, we are still decades away from case-filling 3D "compute cubes".


It's quite possible that kind of thing has hard limits set by cooling.


Sure, but usually a new paradigm is ready to take over before hard limits are reached.


The wafer-scale computing folks would disagree...




