The author doesn't claim it's perfect or that there's no way to improve on it, but I think he has a very salient point, which is that most of the OO buzzwords and programming fads that come and go in "higher-level languages" simply end up making a bigger mess of things as a project grows. Basically, the author loves C because it is simple, straightforward, and restrictive -- it forces you to write [relatively] simple, straightforward code too, instead of concocting a terrible Frankenstein of custom classes and types intertangled into a grotesque, intractable mass of dependencies and subdependencies. If something is in C, you know it is going to be built from the basics, and in many cases, this simplicity is a life saver as a project matures.
There are definitely annoyances and issues, but they are known and can be taken into account with much less hassle than attempting to grok a Java project that requires you to traverse into the basest-level of classes like BusinessObject2013SingletonDispatcherFactoryFactory every time something needs to be debugged or fixed.
It doesn't mean you should use C for your Web 2.0 startup, but his point is well taken. I heard someone 'round these parts once acknowledge that the modern equivalent of spaghetti code (i.e., code that intractably descends through hundreds of code paths with gotos, etc.) is OO hell, i.e., code with huge dependency stacks and equally intractable and unjustifiable inheritance models, where you have to descend into all kinds of classes and special cases to make a meaningful change.
>I'm pretty sure the author hasn't written a line of ANSI-C compliant code in his life, otherwise he'd never write something like this.
Damien Katz is actually fairly accomplished, and it definitely sounds like he's written multiple lines of C to me.
> OO buzzwords and programming fads that come and go in "higher-level languages" simply end up making a bigger mess of things as a project grows
No, it's not the language features which make a mess. It's people lacking judgement and common sense -- like trying to apply patterns everywhere. I was a domain-specific [crypto] consultant on such a project and watched it blow the schedule by more than 2x. It wasn't the language [Java], it was the people.
> sounds like he's written multiple lines of C to me
I don't dispute that he's written a lot of C. I DO, however, dispute that he's written ANSI C. If you're writing ANSI C, you have to account for AT LEAST behaviors like the following:
For example, if f is some function, then in the sequence
    int x = 0;
    f(x++, x++);
the call to f is undefined behavior because x is modified twice without an intervening sequence point. Anybody claiming that such language is "high-level" is a moron, regardless of their "accomplishments".
In Scheme, argument evaluation order is not specified. So (pretending that set! returns the assigned value),

    (f (set! x (+ 1 x)) (set! x (+ 1 x)))

is undefined and unpredictable as well. I don't think anyone would claim Scheme isn't high level.
"Undefined behavior" in C is much more insidious than merely "unspecified" or "unpredictable". All those security holes and exploits resulting from buffer overflows, stack overwrites, heap spraying, etc. are manifestations of UB made possible by a badly written program. (Or by the interaction between an invalid program and an optimistic compiler, as in the case of assuming strict pointer aliasing.)