I use Node.js (via browserify) for each of my web apps, all of which have some dependencies in common and others specific to themselves. Each of these apps has a `package.json` file that specifies which versions of which modules it needs.
Right now, I have a `/node_modules` directory in the parent folder of my apps for modules that they all need to reference, and then I put app-specific modules in a `node_modules` folder in that app's directory. This works fine in the short term, since my `require()` statements are able to keep looking upward in the file structure until they find the `node_modules` directory with the correct module in it.
Where this gets tricky is when I want to go back to an old project and run `npm install` to make sure it can still find all the dependencies it needs. (Who knows what funny business has occurred since then at the parent directory level.) I was under the impression that `npm install` did this:
- for each module listed in `package.json`, first check whether it's present, moving up the directory tree the same way `require` does. If it's not, install it to the local `node_modules` directory (creating that directory if necessary).
When I run `npm install` inside an app folder, however, it appears to install everything locally regardless of what else may exist upstream. Is that the correct behavior? (It's possible there's another reason, like bad version language in my `package.json`.) If this IS the correct behavior, is there a way for me to make `npm install` behave like the above?
It's not a big deal to replicate the modules inside every app, but it feels messy, and it prevents me from making small improvements to the common modules without having to update every old `package.json` file. Of course, this could be a good thing...