Yes. The claim that you don't need more than 3 decimal places is laughable. Your artificial design doesn't need decimal points. At all. You can probably do even better by ignoring the last 4 bits completely.
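(To be fair, dropping low-order mantissa bits is a real trick from lossy float compression. A minimal Python sketch of what "ignoring the last 4 bits" would mean for a double, assuming IEEE 754 layout:)

```python
import struct

def drop_low_mantissa_bits(x: float, bits: int = 4) -> float:
    """Zero out the lowest `bits` bits of a double's 52-bit mantissa."""
    # Reinterpret the double as a 64-bit unsigned integer.
    (as_int,) = struct.unpack("<Q", struct.pack("<d", x))
    # Mask off the low-order mantissa bits.
    mask = ~((1 << bits) - 1) & 0xFFFFFFFFFFFFFFFF
    # Reinterpret the masked integer back as a double.
    (result,) = struct.unpack("<d", struct.pack("<Q", as_int & mask))
    return result

# The result still agrees with the input to roughly 15 significant digits.
print(drop_low_mantissa_bits(3.141592653589793))
```

The point being: 4 bits of a 52-bit mantissa is noise at the 15th significant digit, nowhere near the 3rd decimal place.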
That said, the article is very interesting, and that claim applies in a different context. It's just aimed at the wrong one.
As an article that was here recently claims, every verification you do in a chain increases the total time of your work by an order of magnitude. So, it's only worth optimizing any productive task if you have already removed most verifications.
Now, some people claim that you need to improve the reliability of your productive tasks so you can remove the verifications and be faster. Those people are, of course, a bunch of cowardly Luddites.
> But there is an entire cohort of people who can think about specifying systems but lack the training to do so using the current methods
Nah, it would be extremely surprising if even one such person exists.
On the other hand, there are lots of people who can write code but still can't specify a system. In fact, if you keep increasing the size of the system, you will eventually fit every single programmer into that category.
Ok, but you need peer-reviewed publications to graduate with a PhD.
And if you retort that the whole academic system is obsolete, well, it still carries a lot of prestige and legitimacy that makes politicians interested in maintaining it, so it's not going anywhere soon.