>Sorry, I must disagree. Two people working 20 hours a week are refreshed, not burned out, have time for educational side projects, etc.
Take a look at Brooks' The Mythical Man-Month for data that demonstrates the opposite, at least in programming; I haven't followed the field closely, but education and industrial organization researchers have done similar work in other professions.
Brooks was at IBM; they adhere to a 37.5-hour work week. Programmers who can produce 3-4 hours of real work per day are considered productive. Any more and you will start to burn them out.
We stuck to 37 when I was there, so everyone could take off a little early on a Friday.
I mean, officially I was there 37 hours; it was a good week if I was doing real, actual work for 20. Yes, I made up for it by working my tail off when it was needed.
We coders can be absurdly productive for short bursts, or be slow and steady. You can't be massively productive and steady for very long. In your early 20s you have a few years of this in you, but the more you push it, the more jaded you'll be later.
Not to detract from your overall point, but, as an IBMer, I can assure you that - while my full-time employment contract says max 37.5 hours - the reality is that at least 40 hours are expected.
1. That was about 40 years ago when the waterfall process was state of the art. Modern, iterative processes are much more flexible.
2. The loss of productivity levels off as the team size increases. Going from 50 to 100 developers is not the same as 1 to 2.
All said, I expect some loss of productivity is inevitable due to context switching and communication overhead, but it isn't necessarily a major loss.
Pretty sure waterfall was never state of the art, rather an example of what not to do [0]. It's often held up as a straw man in promoting 'agile'. Brooks' book emphasises prototyping and 'plan to throw one away; you will, anyhow', which isn't 'waterfall'.
Brooks' book also clearly demonstrates that applying a dumb formula derived from the number of programmers doesn't predict the output, and that using such a formula to estimate a project's timeline doesn't positively affect the project's outcome. Its main argument is that programmers added late to a project will actually cause it to take even longer to complete.
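For illustration, here's a toy model (in Python) of why the naive person-months formula breaks down. Only the pairwise-channel count n*(n-1)/2 comes from Brooks; the 5% coordination cost per pair is a made-up assumption, not anything from the book.

    # Naive estimate: time = effort / people (the "dumb formula").
    def naive_schedule(total_person_months, team_size):
        return total_person_months / team_size

    # Same estimate, but every pair of developers burns a fraction of a
    # person-month per month on coordination (n*(n-1)/2 channels).
    def schedule_with_overhead(total_person_months, team_size,
                               overhead_per_pair=0.05):  # assumed figure
        channels = team_size * (team_size - 1) / 2
        effective_output = team_size - channels * overhead_per_pair
        if effective_output <= 0:
            return float("inf")  # team spends all its time coordinating
        return total_person_months / effective_output

    for n in (1, 2, 5, 10, 20):
        print(n, round(naive_schedule(60, n), 1),
              round(schedule_with_overhead(60, n), 1))

Run it and the naive schedule keeps shrinking as you add people, while the overhead-adjusted one flattens out and eventually gets worse, which is roughly the shape of Brooks' argument.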
Brooks' book is remarkable - a lot of programming lore and culture originates there or is popularized by it. I'd compare it to Casablanca or Citizen Kane rather than dismiss it out of hand.
In software, the economies of scale are still in a single person's time. Throwing more people at a project is still detrimental even if you use something like Agile. It is a major loss.
The easy way to combat this is to have longer timelines for projects, or, if necessary, crunch periods followed by more relaxed recharge periods.
Brooks laid out a method as close to assembly-line programming as he could, so yes, in that model the number of hours spent mindlessly following the formula would affect output. But no one programs like that now, and I doubt they ever did. What we do now is closer to writing stories. How many writers do you know who spend a constant 40 hours per week writing for weeks on end?