I think huherto is suggesting beer concurrency. That is, instead of having one 20 gallon setup, having four 5 gallon setups. It will take longer because you will have to do the mixing, testing, etc. four times, but if the longest part of the process is waiting, you win in that respect.
The question is: would this make it easier to be more consistent?
As a homebrewer I think this would be a pretty rough way to try and scale. The actual brew time would be the same, but you've increased your cleaning and maintenance significantly, you need a solution to pipe from multiple stations into fermenting vessels, you need a significant amount of extra space dedicated to brewing that could otherwise be used for fermentation vessels, etc.
I think the right answer is to get your equipment and do test batches to rework your recipes at scale. If you're successful as a brewery it's a process you'll have to do multiple times as you grow anyways, so avoiding it once seems like a silly optimization.
Thanks latj, this is what I was thinking. Big batches may be a good model for a large brewery but not necessarily for a small one that is growing organically.
I can imagine several advantages. You can replicate without having to extrapolate quantities, pressures, etc. You get to run more experiments; I can envision a supervised machine learning system that learns which parameters make the best beer. You don't have to throw out big batches when something goes wrong. Sure, it may require more labor, but you get other advantages.
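To make the parameter-learning idea concrete, here's a minimal sketch in Python. It assumes batches are logged as tuples of brew parameters plus a taste rating; the parameter names and all values are invented for illustration, and a real system would use an actual regression model rather than this simple grouping.

```python
# Hypothetical batch log: (mash_temp_F, boil_minutes, rating).
# All names and numbers are made up for the sketch.
from statistics import mean

batches = [
    (152, 60, 7.5),
    (156, 60, 8.2),
    (156, 90, 8.8),
    (148, 60, 6.9),
    (152, 90, 8.0),
]

def best_setting(batches, index):
    """Average the rating per distinct value of one parameter;
    return the value with the highest mean rating."""
    by_value = {}
    for batch in batches:
        by_value.setdefault(batch[index], []).append(batch[-1])
    return max(by_value, key=lambda v: mean(by_value[v]))

print(best_setting(batches, 0))  # mash temperature with the best mean rating
print(best_setting(batches, 1))  # boil length with the best mean rating
```

With many small parallel batches you get many such rows per month, which is exactly what makes this kind of comparison feasible; a single 20 gallon batch gives you one data point per brew day.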
How about a co-op of home brewers: everyone agrees to brew a certain beer recipe that month, all the beer gets blended together and redistributed. What does that taste like?
I visited a village once that did this with their wine and distilled liquor.