No, that's fixing the problem in the wrong way (by using your version control tool as an FTP server, more or less), complicating code reviews and unnecessarily growing your repository.
If you want to be safe in case NPM goes down, what you need is to keep an archive of built "binary" releases: zip files, Docker images, or whatever.
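For example, a release step in CI could look roughly like this (a minimal sketch; the artifact name, version, and build script are placeholders):

    # Minimal sketch of an archiving step; all names are placeholders.
    # "npm ci" installs exactly what package-lock.json pins, so the
    # build is reproducible without committing node_modules to the repo.
    npm ci
    npm run build
    # Bundle the built output and the lockfile into a versioned artifact,
    # then push it to durable storage (an artifact server, object storage,
    # or a Docker registry via "docker build" + "docker push").
    tar -czf myapp-v1.2.3.tgz dist/ package.json package-lock.json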
You can also run an internal NPM proxy or cache.
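Verdaccio and Nexus are common choices for that; pointing npm at one is a one-liner (the registry URL below is made up, substitute your own instance):

    # Route installs through an internal caching proxy (e.g. Verdaccio
    # or Nexus). The URL is hypothetical.
    npm config set registry https://npm-proxy.internal.example.com/
    # The proxy caches every package it serves, so installs keep working
    # even if registry.npmjs.org is unreachable.
    npm install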
Committing external dependencies into the codebase is a terrible solution. And no, Google doing it doesn't mean it's good for everyone else; if anything, it's probably the opposite of what you need.
Yes, you did. Quoting: "...Anyone not committing the entire node_modules to their repo..."
> you're not revisioning your deployments
That's exactly what I'm saying. Quoting myself: "...keep an archive of built "binary" releases: zip files, Docker images, or whatever..."
Using a Git/SVN/etc. repository for this (which is what I gather from your responses) is just the wrong tool for the job.