I have no insight into the natural sciences, but I've spent a couple of years in computer science academia. With that in mind:
> None of it would be possible without academia though. Industry just applies academic research.
Meh, that vastly oversells academic research. Very little academic research in computer science is actually used in industry. It's not that industry is ignorant, but rather that the majority of academic work is useless: it invents artificial problems [1] and solves them in shoddy ways, with hand-picked benchmark results, and frequently without even publishing the source code.
It's probably not surprising, given that the typical incentive is getting a PhD. So you need a "problem" that can reliably be solved in 3-5 years and that yields 5-10 conference papers with your name on them.
[1] I'm not talking about theoretical fields – my comment is purely about supposedly practical research.
I once watched a VC interview a snooty machine-vision scientist at Johns Hopkins who was talking up how good his research was at recognizing 3D objects. So the VC pulled out his cellphone, took a photo of a box on the table, and asked the professor to have the software highlight the rectangular solid. Whoop. He never heard back. The software that was supposedly so great in the lab couldn't do a very basic task outside its preapproved set of tasks.
I do think academia can be the source of some great ideas, but academics often end up believing their own BS.
I worked at Google, and there's just tons of stuff that never existed in academia and was created, launched, and then replaced by something better entirely within the company, without any publications!