One of the most important concepts in the Node ecosystem is packages. They are at the heart of what we all love about working with Node: functionality can be broken down into small chunks and published to npm. When you need to do something specific, you just look for the best package on npm, and one `npm i` later you are moving on with your app.
Now that you are setting up your application as SOA apps, you still want to be able to share some code between all those shiny new apps, because DRY is still a thing. So, like every good Node developer, we decided to make packages out of our shared code. We created a directory called `modules` and threw a bunch of directories in it. To use those packages we took three approaches:
- Just `require`-ing the path: `require('../../../modules/log')`
- Symlinking into `node_modules`: `require('log')` returns the code at `/modules/log` via `/node_modules/log`
- File-based package installs: `require('log')` returns the code at `/modules/log` via `/node_modules/log`, installed with npm
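The symlink approach can be sketched in a couple of shell commands (the `modules/log` layout here is just an illustration of the structure described above, not the actual codebase):

```shell
# Hypothetical layout: modules/log holds the shared code.
mkdir -p modules/log node_modules
echo "module.exports = (msg) => console.log('[log]', msg);" > modules/log/index.js

# Symlink it into node_modules so require('log') resolves to modules/log.
ln -s ../modules/log node_modules/log

# Node now finds it like any installed package:
# node -e "require('log')('hello')"
```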
All three of those have serious problems, some shared by all of them and some unique to each. The first option means you always have to figure out how many `../`s to add, and while that seems like a silly problem, man did it frustrate me. It also meant you couldn't just grep for usages of the package with `grep -rn "require('log')"`, and because we had packages required from other packages, we could never be fully sure we knew what was using a package when we went to change it. Lastly, it means you cannot support different versions of the same package in different apps.
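The grep issue is partly a shell-quoting problem: the pattern itself contains single quotes, so it has to be wrapped in double quotes to reach grep intact. A minimal illustration (the `apps/web` tree is made up):

```shell
# Made-up app tree with one usage of the shared package.
mkdir -p apps/web
echo "const log = require('log');" > apps/web/server.js

# Double quotes let the inner single quotes reach grep intact.
grep -rn "require('log')" apps
```

This finds direct usages, but as noted above it says nothing about packages that require the package transitively.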
Number two solves the silly frustration of the `../`s, but it means you either have to share npm's single global namespace or prefix your internal packages. We took the prefixing approach, so all of our modules started with `v-`. Don't ask about the choice; it's a running joke because it really makes no sense. This means the modules that started as `require('../../modules/log')` became `require('v-log')`: a lot easier to type, but much more confusing when trying to find the code. It also shares the problem with #1 that all apps get the same version of the code.
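Concretely, prefixing just means the internal package's manifest claims a `v-` name. A made-up minimal manifest for the renamed package:

```shell
mkdir -p modules/log
# Hypothetical minimal manifest for the renamed internal package.
cat > modules/log/package.json <<'EOF'
{
  "name": "v-log",
  "version": "1.0.0",
  "main": "index.js"
}
EOF
```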
Number three was arguably the best solution, because it used npm to do the installs, which avoided the custom symlinking, and all you had to do to find out where a module came from was look at the package.json. The issue was that while developing a feature you STILL had to symlink the code, or risk forgetting to apply your edits to the actual source file if you made them in the `node_modules` copy.
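With file-based installs the app's package.json records the source directly, so "where did this come from" is answered in one place. A sketch, assuming the shared code lives in `./modules/log` (the app name is made up):

```shell
# Hypothetical app manifest: npm copies ./modules/log into node_modules
# on install and records the file: source right here.
cat > package.json <<'EOF'
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "log": "file:./modules/log"
  }
}
EOF

# npm install   # would copy modules/log into node_modules/log
```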
The apps in the monolith-SOA app shared a single package.json and all accessed a shared `modules` directory. That directory held everything from models to tooling, caching libs to database queries and front-end helpers. When a developer needed something in an application, they would just `require('')` it and use it.
There will come a point when you have code shared between your applications, probably on day 2. In the past there was a hard decision to make: should you set up and maintain your own internal npm registry? I say "in the past" because it is now easy to set up and run thanks to NPM Inc's NPM On-Site.
This also solves a large security issue that was recently publicized with the `left-pad` incident. Because you are running a local copy, you reduce your risk of falling victim to malicious packages. A local npm registry in combination with `npm shrinkwrap` means you can rely on reproducible builds and secure code.
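The combination works because `npm shrinkwrap` pins every transitive dependency to an exact version and resolved URL in `npm-shrinkwrap.json`, which later `npm install` runs obey. A hand-written fragment showing the shape of that file (app name, version, and registry URL are all illustrative):

```shell
# Illustrative npm-shrinkwrap.json fragment: each dependency is pinned
# to an exact version, and "resolved" points at the local registry.
cat > npm-shrinkwrap.json <<'EOF'
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "left-pad": {
      "version": "1.1.3",
      "resolved": "https://npm.internal.example.com/left-pad/-/left-pad-1.1.3.tgz"
    }
  }
}
EOF
```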
- in the app
- moved to a packages directory
- published to an internal npm
The biggest issue with all of these options actually comes down not to the problems I mentioned above, though. The real issue came when we wanted to update Node to 4.x across our stack. It turns out that from 0.10.x to 4.x there were a ton of incompatible modules, and those modules were integrated into all of these