In truth, because the world of C (I was a C and then C++ programmer 30 years ago) did not enable building effective applications cheaply and quickly. Yes - you can write anything, but it took months and months and months.
We've traded efficiency for productivity. We have been able to do this because of Moore's law and all its friends.
Also, C code is hellish to read and understand (#define anyone?), and logging almost never happened.
As a sysadmin you can look at the organisation you work in and count up the number of sysadmins; my guess is that there are fewer than 50% of the number from 10 years ago. Maybe that's the problem?
"So, why are we doing this?"
I think there are a lot of reasons: market share, control of technology, not-invented-here syndrome, ego issues, fads, shifting focus to specific sets of problems. I personally think OO proliferation (C++), Java, and the enterprise mindset were the point of no return: the mental point where the majority left simplicity and maintainability out of its scope.

There was (and still is) an era where complexity (and the ability to comprehend it) is a synonym for mental machismo. Simplicity was left as something lesser minds/developers had to worry about, while the 10x devs (another widely proliferated myth) that everyone worth their salt should strive to be (so the myth went) had to not only conquer complexity but also thrive on it and create more of it. Thus you got a multitude of tools, languages, and stacks that do the same things differently; you got Godzilla OO hierarchies and architecture diagrams that look like a dozen overlapped London metro maps.

Now we've got microservices and k8s, which is itself a testament to irony: it started as an attempt to simplify microservice/cluster management and has evolved into perhaps the most complicated ecosystem ever (one has serious trouble even comprehending what most of the tooling in k8s land is supposed to do, much less using it - and don't get me started on really understanding the code in there).
I personally think more people should read and embrace the Unix philosophy of simple, reusable tools, but on the other hand I'm afraid the ship is unstoppable now.
Tangential, but it's interesting how we can't seem to agree who the mythical 10x devs are.
> There was (and still is) an era where complexity (and the ability to comprehend it) is a synonym for mental machismo. Simplicity was left as something lesser minds/developers had to worry about, while the 10x devs (another widely proliferated myth) that everyone worth their salt should strive to be (so the myth went) had to not only conquer complexity but also thrive on it and create more of it. Thus you got a multitude of tools, languages, and stacks that do the same things differently; you got Godzilla OO hierarchies and architecture diagrams that look like a dozen overlapped London metro maps.
I always believed that this kind of stuff is what mediocre developers relish. Meanwhile, the 10x developer looks at the whole complex cloud-scale data pipeline and does the same job 3x faster with a few Unix tools piped together on their mid-tier laptop.
As far as I know, the 10x developer comes from a throwaway comment in a small-sample paper comparing people's implementations of a toy problem. That's where its actual basis in reality ends.
> Yes - you can write anything, but it took months and months and months.
The C & C++ of 30 years ago, where compile times took days - sure. Nowadays building a whole Yocto software stack, basically a full Linux distro - from GCC, to the kernel, to glibc, to X11, to Qt - takes 6 to 8 hours on a good laptop.
In addition, modern language features and the general shift from runtime to compile-time polymorphism, especially in C++, sharply decrease the potential for errors, while language simplifications - standard-library improvements, terser syntax - also drive time-to-release down.
Finally, tooling has greatly improved: I barely ever need to run a build from my IDE to check for errors, because IDEs now embed clang, clang-tidy, clazy... which check your code as you type and highlight issues in the editor. And ASan and UBSan handle the remaining cases that cannot be caught at compile time, leaving only logic errors to deal with.
Pretty sure the comment you're replying to is talking about development time, not compile time. They're claiming it took months longer to develop a similar app in C, not that it took longer to compile. Even if compile times are negligible now for the reasons you state, they still have a point about it taking longer to develop an equivalent app.
This. The one problem is the lack of a solid networking library. Unix sockets/Winsock are ancient and terrible, with tons of corner cases; Boost.Asio is creaky and "boosted"; other useful things like 0mq are good for IPC but not the Internet; Qt eats the world and is a pain to integrate; libevent, libev, etc. are not C++ and are still a pain to integrate.
Maybe there is some library that is lesser known but good.
As for time to develop, networking aside, C++ wins even for hack projects, even in comparison with Python; a JS/HTML/CSS-with-framework stack is far slower and makes for terrible UX. (I'm not writing about languages I do not use.)
Java and C# can be fast too, but they hit their limits when you actually have to calculate things, which gets real awkward real fast.
The main win is... tooling. Java tooling is best now that Maven and Ant are mostly legacy; then C++ (I'm counting Qt Creator and Glade; CMake is eh, CMake plus something is OK); then finally JS, tied with Python. JS fails terribly on debugging and on creaky, rarely adaptable UX libs, while Python after all these years still has tooling issues (plus performance, where it matters).
Rust is not there yet; maybe soon. Scala is nice, though fewer people can use it than modern C++.
Honorable mention to specialized tools like R, Faust, Lua, MATLAB, and Fortran. Note that they integrate well with C and C++... try anything else and you're in for a ride. Even the second-most-supported, C# and Java, go through a C FFI layer.