I can second this. Working in industry, I've found the bar for rigor to be quite high. The general attitude of industrial researchers is to be very skeptical of academia, since a lot of published results just don't reproduce (cherry-picked data, p-hacking, results that only hold in a narrow domain, and so on). These researchers are almost all people with PhDs in various science fields, so not exactly knee-jerk skeptics.
Industry works solely on stuff that's reproducible because it wants to put these things into practice. That makes for an admirable level of rigor, but it also constrains industry's freedom to look at unprofitable and unlikely ideas. Academia does have that freedom, and chasing unlikely ideas inevitably produces some inadvertent p-hacking: the first attempt to look at something unexpected is always "This might be nothing, but..."
Academics also call in other people earlier, because they're not protecting trade secrets or trying to gain a competitive advantage. They do want priority, and arguably it would be better if they could wait longer and do more work first, but the funding goes to whoever discovers something first.
So there's no real reason for either academics or industry scientists to look askance at each other. They're doing different things, with standards that differ because they're pursuing different goals. And they need each other: applications bring in money that pushes for new ideas, and ideas result in new applications.
I agree with you, and I don't want my comment to be read as an indictment of academia exactly - we couldn't live without it, it has huge returns on investment, etc. It's worth reading 10 bad papers to find 1 with a kernel of a good idea (and worth spending research funding on 100 bad experiments to get 1 useful result).
I think what I mean to say is that the skills required in industrial research (which can be quite speculative in well-funded companies, by which I mean a 5% chance of success or so) are somewhat different from those required in academia.