
In the large, no, they don't care about the 90-95% of the code base that's not performance critical. And these days, the stuff that is critical will be #ifdef and asm(...) stew.
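For illustration, a minimal sketch of what that kind of stew tends to look like (the function names and the USE_ASM_SUM build flag are made up for the example):

    #include <stdint.h>
    #include <stddef.h>

    /* Portable fallback: sum an array of 32-bit ints. */
    static int64_t sum_portable(const int32_t *v, size_t n)
    {
        int64_t s = 0;
        for (size_t i = 0; i < n; i++)
            s += v[i];
        return s;
    }

    int64_t sum(const int32_t *v, size_t n)
    {
    #if defined(__x86_64__) && defined(USE_ASM_SUM)
        /* Hypothetical hand-tuned path, selected at build time. */
        int64_t s = 0;
        for (size_t i = 0; i < n; i++)
            __asm__("addq %1, %0" : "+r"(s) : "r"((int64_t)v[i]));
        return s;
    #else
        return sum_portable(v, n);
    #endif
    }

The point being: the performance-critical 5-10% gets isolated behind build-time switches, and everything else stays plain C.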

I can't tell you how many projects I have been on where disabling optimization made no measurable difference in performance.

This being said, I cannot speak for game devs or video device driver developers.



I have to say, I have never encountered a program where compiling without optimizations made no difference. If you have seen that, then I would agree that C was a very, very poor choice for that particular domain.


It made no measurable difference. The acceptance criteria for the system were not materially affected. Since optimization did not improve performance by that measure, it was generally set one way (mostly off) and left that way.
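To make "measurable" concrete, here is a minimal sketch of the kind of check involved (the workload is a made-up stand-in; in systems like those described, most wall-clock time goes to I/O and hardware waits rather than compiled code). Build it twice, e.g. with -O0 and -O2, and compare the times against the acceptance threshold:

    #include <stdio.h>
    #include <time.h>

    static volatile long sink;  /* keep the loop from being elided */

    static void workload(void)
    {
        long s = 0;
        for (long i = 0; i < 10000000; i++)
            s += i % 7;
        sink = s;
    }

    int main(void)
    {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        workload();
        clock_gettime(CLOCK_MONOTONIC, &t1);
        double ms = (t1.tv_sec - t0.tv_sec) * 1e3
                  + (t1.tv_nsec - t0.tv_nsec) / 1e6;
        printf("workload: %.1f ms\n", ms);
        return 0;
    }

If the acceptance criterion is, say, "respond within 50 ms" and both builds clear it with room to spare, the flag setting is immaterial.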

Teams I'm on have written some tight - but readable - code, too. Well-architected, low-latency, low-jitter.


Business applications, as was common in the 90s.

Writing them in C or VB made no difference for the guys sitting at the desk.


This ranged from networking equipment to phone switches to equipment control. Most of it was bare metal.

I've never written a business application in my life.


You can try an experiment: build an application like Firefox from source with optimizations disabled. I bet you'll notice a massive difference.

Even more important are things that run in datacenters on thousands and thousands of machines. Even if optimizations make only a small difference per machine, at the scale of today's infrastructure needing 5% fewer machines saves huge amounts of electricity.
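A back-of-envelope sketch of that claim (the fleet size and per-machine power draw are made-up round numbers):

    #include <stdio.h>

    int main(void)
    {
        double machines = 20000.0;  /* hypothetical fleet size */
        double watts    = 500.0;    /* hypothetical draw per machine */
        double saved_kw = machines * 0.05 * watts / 1000.0;
        double mwh_year = saved_kw * 24.0 * 365.0 / 1000.0;
        printf("5%% fewer machines: %.0f kW continuous, ~%.0f MWh/year\n",
               saved_kw, mwh_year);
        return 0;
    }

With those numbers that's 500 kW running continuously, roughly 4,400 MWh a year, before even counting cooling.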


Why would I want to build Firefox from source? I don't build things that run on thousands of machines. Last thing I ran any code on had a 2500 HP diesel; the electronics were in the noise.



