
Better local require() paths for Node.js



When the directory structure of your Node.js application (not library!) has some depth, you end up with a lot of annoying relative paths in your require calls like:

var Article = require('../../../models/article');

Those suck for maintenance and they're ugly.

Possible solutions

Ideally, I'd like to have the same basepath from which I require() all my modules. Like any other language environment out there. I'd like the require() calls to be first-and-foremost relative to my application entry point file, in my case app.js.

Only solutions that work cross-platform are considered here, because 42% of Node.js users use Windows as their desktop environment (source).

1. The Symlink

Stolen from: focusaurus / express_code_structure # the-app-symlink-trick

  1. Create a symlink under node_modules to your app directory:
    Linux: ln -nsf ../app node_modules/app
    Windows: mklink /D node_modules\app ..\app

  2. Now you can require local modules like this from anywhere:

    var Article = require('models/article');

Note: you cannot have a symlink like this inside a Git repo, since Git does not handle symlinks cross-platform. If you can live with a post-clone git-hook and/or an instruction for the next developer to create the symlink, then sure.

Alternatively, you can create the symlink on the npm postinstall hook, as described by scharf in this awesome comment. Put this inside your package.json:

"scripts": {
    "postinstall" : "node -e \"var s='../app',d='node_modules/app',fs=require('fs');fs.exists(d,function(d){d||fs.symlinkSync(s,d,'dir')});\""
}

2. The Global

  1. In your app.js:

    global.__base = __dirname + '/';
  2. In your very/far/away/module.js:

    var Article = require(__base + 'app/models/article');

3. The Module

  1. Install some module:

    npm install app-module-path --save
  2. In your app.js, before any require() calls:

    require('app-module-path').addPath(__dirname + '/app');
  3. In your very/far/away/module.js:

    var Article = require('models/article');

4. The Environment

Set the NODE_PATH environment variable to the absolute path of your application, ending with the directory you want your modules relative to (in my case .).

There are 2 ways of achieving the following require() statement from anywhere in your application:

var Article = require('app/models/article');

4.1. Up-front

Before running your node app, first run:

Linux: export NODE_PATH=.
Windows: set NODE_PATH=.

Setting a variable like this with export or set will remain in your environment as long as your current shell is open. To have it globally available in any shell, set it in your user profile and reload your environment.

4.2. Only while executing node

This solution will not affect your environment other than what node perceives. It does change your application start command.

Start your application like this from now on:
Linux: NODE_PATH=. node app
Windows: cmd.exe /C "set NODE_PATH=.&& node app"

(On Windows this command will not work if you put a space in between the path and the &&. Crazy shit.)

5. The Start-up Script

Effectively, this solution also uses the environment (as in 4.2), it just abstracts it away.

With one of these solutions (5.1 & 5.2) you can start your application like this from now on:
Linux: ./app (also for Windows PowerShell)
Windows: app

An advantage of this solution is that if you want to force your node app to always be started with v8 parameters like --harmony or --use_strict, you can easily add them in the start-up script as well.

5.1. Node.js

Example implementation:

5.2. OS-specific start-up scripts

Linux, create app (and chmod +x app) in your project root:

#!/bin/sh
NODE_PATH=. node app.js

Windows, create app.bat in your project root:

@echo off
cmd.exe /C "set NODE_PATH=.&& node app.js"

6. The Hack

Courtesy of @joelabair. Effectively also the same as 4.2, but without the need to specify NODE_PATH outside your application, making it more foolproof. However, since this relies on a private Node.js core method, it is also a hack that might stop working on a previous or future version of node.

This code needs to be placed in your app.js, before any require() calls:

process.env.NODE_PATH = __dirname;
require('module').Module._initPaths();

7. The Wrapper

Courtesy of @a-ignatov-parc. Another simple solution which increases obviousness: simply wrap the require() function in one that resolves relative to the path of the application's entry point file.

Place this code in your app.js, again before any require() calls:

global.rootRequire = function(name) {
    return require(__dirname + '/' + name);
};

You can then require modules like this:

var Article = rootRequire('app/models/article');

Another option is to always use the initial require() function, basically the same trick without a wrapper. Node.js creates a new scoped require() function for every new module, but there's always a reference to the initial global one. Unlike most other solutions this is actually a documented feature. It can be used like this:

var Article = require.main.require('app/models/article');


1. The Symlink
If you're using CVS or SVN (but not Git!), this solution is a great one which works; otherwise I don't recommend it to anyone.

2. The Global
You're effectively swapping ../../../ for __base +, which is only slightly better if you ask me. However, it's very obvious to the next developer what's exactly happening. That's a big plus compared to the other magical solutions around here.

3. The Module
Great and simple solution. Does not touch other require calls to node_modules.

4. The Environment
Setting application-specific settings as environment variables globally or in your current shell is an anti-pattern if you ask me. E.g. it's not very handy for development machines which need to run multiple applications.

If you're adding it only for the currently executing program, you're going to have to specify it each time you run your app. Your start-app command is not easy anymore, which also sucks.

5. The Start-up Script
You're simplifying the command to start your app (always simply node app), and it gives you a nice spot to put your mandatory v8 parameters! A small disadvantage might be that you need to create a separate start-up script for your unit tests as well.

6. The Hack
Most simple solution of all. Use at your own risk.

7. The Wrapper
Great and non-hacky solution. Very obvious what it does, especially if you pick the require.main.require() one.


Just set up your stuff as modules, and put them in node_modules folder, and then they're top-level things. Problem solved.

tj commented

solution we often use:

  • a single path (usually ./lib) exposed via NODE_PATH
  • shallow nesting (if ever)

lets you drop in node modules if you need to "fork" them and don't yet have a private registry. Lots of nesting in an app ends up sucking more often than not, and I'd argue that ../ in any module is usually an anti-pattern, maybe other than var pkg = require('../package') for bin .version etc


@isaacs; yes I know that's an option, but the node_modules folder currently is a nice clean place for only the external modules we use. All the application-specific modules are not generic enough to be put inside node_modules. Like all kinds of Controllers, Models and stuff. I don't think the node_modules folder is intended for that, is it?


yeah, whenever i see '../../../dir/name' i immediately think that someone has either 1) prematurely broken out their app into a million directories and files or 2) hasn't modularized these components into modules yet, and they should.


@branneman we do things in 3 phases.

1) something is a single file library in our app
2) we break it in to a proper node module and check it in to node_modules
3) we publish it and give it its own repository.

If it has application logic, it's not in node_modules. If a lot of things call it or depend on it, it shouldn't have application logic in it, it should be a node_module.

This helps us keep things clean and lets us write things for ourselves, make sure they work, then publish them and hopefully see others getting use from them and contributing.

tj commented

I should note that NODE_PATH can be confusing too if you're not familiar with the app; it's not always clear where a module is coming from unless it's named in an obvious way. We prefix ours with s- so it's obvious, but they now live in a private registry


Thanks for all the feedback!

I hear mostly: if you have this problem, you have a bad architecture or bad application design. I also hear: maybe it's time for a private npm repository?

As an example, most modules in one of my applications depend on a config file. Still, I can't remove application logic from that, and I'm already using a proper (external) module to handle common config logic. But the data itself needs to be either loaded a lot or passed around a lot.

Would it then be a best practice to save that config object once per request to the req variable in express.js? I doubt that, because I'm touching objects I don't own. What is the way to do that kind of thing?

One of the other things I tried with an old version is require.paths, but that's removed now. That was actually the most elegant solution in my opinion. At least everything would stay inside the app; it's the developer's responsibility to use it wisely.


I used to use the symlink method, but it's too much trouble on windows so I don't use it anymore.

In most of my projects nowadays I don't have this problem. I use relative requires for intra-package modules.

I used to mix local deps with npm deps in node_modules, but that made my .gitignore too much trouble to only ignore certain deps.

My current behavior is:

1 - Write a single file
2 - when it gets too big, start moving parts to other files in the same folder with relative requires
3 - When there are too many modules, package some into reusable modules independent of my app or library.

I use symlinks (or nested directories on windows) to link my different packages to each other, but each has its own git repo and, if it's generally usable, its own npm name.


A while back I proposed the file:/// dependency for private installs.

Essentially the following in your package.json

"dependencies": {
    "whatever": "file:///relative/path/to/folder"
}

It would only work for private packages but is an easy way to have the package management/install system take care of setting up the symlink for you at install time. This avoids all of the above described hacks and also has the benefit of letting you reference package.json when you want to learn about a dependency (which you do already).


The start-up script is a good option, though all the solutions have some drawback. At the very least, others looking at your code might not know where the require is looking for modules. You also want to eliminate the possibility of new dependencies colliding with modules of the same name.

I haven't noticed anyone mention using the relationship between your dependencies and your project root. So I went and built it myself: requireFrom. This method is intuitive to anyone looking at it, and requires no extra steps outside of adding a dependency. Third-party modules can use it relative to themselves, as well.

var requireFrom = require('requirefrom');
var models = requireFrom('lib/components/models');

var Article = models('article');

Thanks for writing up this overview.


I've been using symlinks with the following structure:

    /client -> ../client
    /server -> ../server
    /shared -> ../shared

it also solves the problem of not knowing where the modules come from, because all app modules have client/server/shared prefixes in require paths


I ran into the same architectural problem: wanting a way of giving my application more organization and internal namespaces, without:

  • mixing application modules with external dependencies or bothering with private npm repos for application-specific code
  • using relative requires, which make refactoring and comprehension harder
  • using symlinks or environment variables which don't play nicely with source control

The start-up script is a good idea, but I didn't like the extra moving parts.

In the end, I decided to organize my code using file naming conventions rather than directories. A structure would look something like:

  • node_modules
    • ...
  • package.json
  • npm-shrinkwrap.json
  • src
    • app.js
    • app.config.js
    • app.web.js
    • app.web.routes.js
    • ...

Then in code:

var app_config = require('./app.config');
var app_models_foo = require('./');

or just:

var config = require('./app.config');
var foo = require('./');

and external dependencies are available from node_modules as usual:

var express = require('express');

In this way, all application code is hierarchically organized into modules and available to all other code relative to the application root.

The main disadvantage is of course that in a file browser, you can't expand/collapse the tree as though it was actually organized into directories. But I like that it's very explicit about where all code is coming from, and it doesn't use any 'magic'.



the start-up script doesn't work very well with nodemon (or node forever).
If something changes, nodemon tries to restart the start-up script, and in my case the child process (express js) is still bound to my IP and I get an EADDRINUSE error.
I also tried to kill the child process, but this gets executed too late.

var app = spawn(process.execPath, args, opt);

process.on('exit', function() {
    console.log("kill child process");
    app.kill();
});
I've switched to the approach used by alexgorbatchev, using a server and shared folder and making symlinks to the node_modules folder.
Thank you, it works great.


@visionmedia: quite like the idea of the no/low nesting, but how does that work with a larger source base - I have seen a few of your github repos which manifest what you say - I'm thinking that maybe an application has more sprawling areas of functionality? (I'm a newbie on node so I might be speculating?)


I also found a good way to use the start-up script solution with Grunt and nodemon.

In my Gruntfile.js, I just have set:

        concurrent: {
            dev: {
                tasks: ['nodemon', 'node-inspector', 'watch', 'mochaTest'],
                options: {
                    logConcurrentOutput: true
                }
            }
        },
        nodemon: {
            dev: {
                script: 'index.js',
                options: {
                    nodeArgs: ['--debug'],
                    env: {
                        NODE_PATH: './app'
                    }
                }
            }
        }

So, just by setting options.env inside the nodemon configuration, my application still starts by just calling $ grunt


Here's another option to consider:

app-module-path modifies the internal Module._nodeModulePaths method to change how the search path is calculated for modules at the application level. Modules under node_modules are not impacted, because they do not get the modified search path.

It of course bothers me that a semi-private method needed to be modified, but it works pretty well. Use at your own risk.

The start-up script solution will impact module loading for all installed modules, which is not ideal. Plus, that solution requires that you start your application in a different way, which introduces more friction.


You can create a helper function in the global scope to be able to require modules relative to the root path.

In app.js:

global.app_require = function(name) {
    return require(__dirname + '/' + name);
};

var fs = require('fs'),
    config = app_require('config'),
    common = app_require('utils/common');

It will also work in other files.


@gumaflux I believe @visionmedia is only talking about modules which usually wouldn't require "sprawling areas of functionality" because a single module isn't meant to do as much as an application. I think the nesting issue is more of a problem in applications, especially MVC apps.


I'm using browserify for a browser app.

The problem with using paths, or putting code into node_modules, is that in your app you may have sources to transform, for example CoffeeScript or JSX files.

When using require("some_private_node_module"), browserify doesn't seem to transform the files and builds a bundle with unprocessed sources.


@slorber Put the transforms in each module's package.json

Now your code will work and is less vulnerable to system-wide configuration changes and upgrades because each component can have its own local transforms and dependencies.

See also: avoiding ../../../../../../.., which pretty much echoes what @isaacs has said already: just use node_modules/.

If you're worried about how node_modules might clutter up your app, create a node_modules/app and put all your modules under that package namespace. You can always require('app/whatever') for some package node_modules/app/whatever.

Not sure how node_modules/ works? It's really nifty!



This is a small hack. It relies only on node.js continuing to support the NODE_PATH environment variable. The NODE_PATH env setting is a fine method for defining an application specific local modules search path. However, I don't like relying on it being properly set external to javascript, in all cases (i.e. export, bash profile, or startup cmd). Node's module.js absorbs process.env's NODE_PATH into a private variable for inclusion into a list of global search paths used by require. The problem is, node only looks at process.env['NODE_PATH'] once, on main process init, before evaluating any of the app's code. Including the following 2 lines allows the re-definition of NODE_PATH, post process-init, and should be included prior to any local module specific requires. In a top level file include:

process.env['NODE_PATH'] = __dirname + '/lib';
require('module').Module._initPaths();

Then simply require any modules in ./lib

var myLocalLibModule = require('myLocalLibModule'); 

This does not change the behavior of module.js as documented; node_modules, package.json, and global modules all behave as expected.


Another option for complex application logic (config files, loggers, database connections, etc) is to use inversion of control (IoC) containers with dependency injection. See @jaredhanson's Electrolyte for one implementation.


I just updated the article again and added more solutions. Thanks for all the feedback, keep it coming!

@joelabair: Great suggestion, added it as solution 6.

@a-ignatov-parc: Love the simplicity, added it as solution 7. Great and non-hacky.

@dskrepps: I don't like the fact that I would need to call require('requirefrom') in every file, unless you make it global like @a-ignatov-parc's solution as well. And then it's not that different from solution 7. (Although I now see that you commented that one first!)

/cc @isaacs, @visionmedia, @mikeal, @creationix, @defunctzombie, @dskrepps, @alexgorbatchev, @indirectlylit, @flodev, @gumaflux, @tuliomonteazul, @patrick-steele-idem, @a-ignatov-parc, @esco, @slorber, @substack, @joelabair, @kgryte


FWIW, in case anyone is using Jest for testing, I tried solution 1 referenced above and it broke everything. But after hacking around, I figured out a way to make symlinks work: facebook/jest#98


This might be the worst IDEA ever, but what do you guys think about this?

# CoffeeScript Example
$require = require
require = (file) ->
    if /^\/\/.*$/.test file
        file = file.slice 1, file.length
        $require process.cwd() + file
    else
        $require file

//JavaScript Example
var $require, require;
$require = require;
require = function(file) {
  if (/^\/\/.*$/.test(file)) {
    file = file.slice(1, file.length);
    return $require(process.cwd() + file);
  } else {
    return $require(file);
  }
};
You can add that at the top of your entry file to override the require function, keeping a reference to the original...

now, you can use require("express") as normal, and require("//lib/myLibFile"); the difference is the leading //, inspired by the protocol-relative // in http requests


My current solution is to have my script spawn a child-process to itself if NODE_PATH isn't set. This allows me to just run node file.js and not worry about anything else:

if( !process.env.NODE_PATH ){
    // set NODE_PATH to `pwd`
    process.env.NODE_PATH = __dirname + '/';

    require( 'child_process' ).spawn( 'gulp', [].slice.call( process.argv, 2 ), {
        stdio: 'inherit'
    } );

    // "throw away" logging from this process. The child will still be fine since it has access to stdout and its own console.log
    console.log = function(){};
}
// start app

Thank you for this write-up! I went with #7 and have a global method Require which complements require.


And what about:
var myModule = require.main.require( './path/to/module' ) ;
... seems to work pretty well as long as your main js file is at the root of your project.

azu commented

npm 2.0 supports Local Paths.


I made a lib when I tried to restructure some source code in a large project: it moves a source file and updates all require paths to the moved file.


@azu nice! Still...

This feature is helpful for local offline development and creating tests that require npm installing where you don't want to hit an external server, but should not be used when publishing packages to the public registry.

What I've been doing is to exploit the require.cache. If I have a package, say utils, in node_modules, I'll do a lib/utils and in there I'll merge the cache of utils to have whatever I want. That is:

var util = require('utils');
util.which = require('which');
util.minimist = require('minimist');
module.exports = util;

So I only have to require that package once, and then utils.<some package> will give the necessary package.


This is my contribution to this topic:

It just shortens the relative paths by introducing marks, points from which paths can be relative.


My solution is:

var path = require('path');

global._require = function(name) { // I call it 'reversal require'
    return require(path.join(__dirname, name));
};

// PS: This code should be in the root-level folder of your project!

You are now basically requiring your .js files from the base instead of the cwd


A word of caution for people using the symlink approach with Browserify: you are likely to break transforms. This has been my experience with brfs and trying to include a module through a symlinked path. The transformer seems to ignore symlinked paths (or probably packages that are in the node_modules directory).

However, it turns out that there's an additional option for strategy #4 if you're using a build tool like gulp (and still works with browserify transforms). I've simply added process.env.NODE_PATH = "./my/include/path:" + (process.env.NODE_PATH || ""); to my gulpfile.js and everything works great now.


I released requirish, a solution that mixes strategy #3 (rekuire) and #7 (require.main.require).
The tool is also a browserify transform that converts all the require() statements back for the browser, re-adding the long relative paths only for the browserify processor


@azu A local path in npm isn't synchronized with the original source code when I edit it in the original folder. It doesn't make a symbolic link.


I just made this module (my first) so I'd love to hear feedback (on my github page, not on this thread):

// require this module without assigning export

// you may now use additional global objects in any module,
// in addition to built-ins: __filename and __dirname
console.log('__line: ' + __line); // ex: 6
console.log('__file: ' + __file); // ex: server
console.log('__ext: ' + __ext); // ex: js
console.log('__base: ' + __base); // ex: /home/node/apps/5pt-app-model-example/api-example
console.log('__filename: ' + __filename); // ex: /home/node/apps/5pt-app-model-example/api-example/server/server.js
console.log('__function: ' + __function); // ex: (anonymous) 
console.log('__dirname: ' + __dirname); // ex: /home/node/apps/5pt-app-model-example/api-example/server

For me the hack presented by @joelabair works really well. I tested it with node v0.8, v0.10 and v0.11. In order to reuse this solution, I made a little module where you can just add the folders that should behave like the node_modules folder.

require('local-modules')('lib', 'components');

like @creationix, I didn't want to mess with private dependencies in node_modules folder.


If you put parts of your app into node_modules you can't exclude node_modules from search scope anymore. So you lose the ability to quick search through project files. This kinda sucks.


As for local-modules solution and likes...

When you start to import app modules like require("something") and those modules don't really reside in node_modules, it feels like evil magic to me. The import semantics were changed under the cover.

I actually think it should be resolved by adding special PROJECT ROOT symbol and patching native require. Syntax may be like require("~/dfdfdf").
But ~ will be confused with unix home dir so it's better to choose something else like require("@/dfdfdf").

Explicit is better than implicit, as no one can miss the "@" symbol in import statements.
We basically add different syntax for different semantics which is good imo.

I believe having a special shims.js file for every non-standard installation like this in project folder is sane and safe enough.

What do you guys think?


This is my second approach. It just implements the __root solution which, in my opinion, is the best solution to this problem, and nodejs/iojs should implement it.

I also like the require("@/dfdfdf") approach.


I wrote in my blog about a few solutions presented here versus ES6 problems:


@gustavohenke nice one, very hackish but cleaner and cross-functional among OSes. But the problem with it is the same as with putting the modules inside node_modules. Having a require call like require('my/package') is very confusing for me, because I associate require paths without a leading ./ with core or external modules. You could have an external module named my; collisions may happen.


Yeah @gagle, I understand these problems, but my case is special, I won't be dropping ES6 modules. Fortunately, I have taken care of namespacing my libs so there's only a single collision point. Also, my app is well documented for developers.


This gist is so incredibly helpful. Kind of embarrassing that Node has an issue with this many hackish solutions.


NODE_PATH seems like the cleanest solution


seems like:
if you can turn this into a node module, do it;
else just define it in your index.js or app.js:

if (!global.__base) {
    global.__base = __dirname + '/';
}


Holy crap. Lots of hacky solutions here.

Try this instead: rootrequire

The readme:


Require files relative to your project root.


npm install --save rootrequire


var root = require('rootrequire'),
    myLib = require(root + '/path/to/lib.js');


  • You can move files around more easily than you can with relative paths like ../../lib/my-lib.js
  • Every file documents your app's directory structure for you. You'll know exactly where to look for things.
  • Dazzle your coworkers.

Learn JavaScript with Eric Elliott

This was written for the "Learn JavaScript with Eric Elliott" courses. Don't just learn JavaScript. Learn how to change the world.


To make node.js search for modules in an additional directory, you could use the require.main.paths array.

// require('node-dm'); <-- Exception
require.main.paths.push('/home/username/code/projectname/node_modules/'); // <- any path here
console.log(require('node-dm'));  // All good

I'm using the wrapper solution. No magic just elegance.

Thanks for this post!


@ericelliott, with your solution IDE navigation is lost in the same way as with the others...
There is no escape from this problem at the app code level. Every "trick" breaks the IDE's move-to functionality.
Of all those "solutions", only symlinks keep the IDE working as it should.


Thanks for the post, very useful and detailed. I found the wrapper solution to be the most elegant, works on any latest node instance and does not require any pre-setup / hacks for it to work.

Besides it let me set the path to the library and avoid any potential name conflict issues.


I'll add my library to the list: (It's very Java-like though)


Turns out that npm now flattens your dependency tree which breaks the "rootrequire" method by @ericelliott.

I found a work around though:


Thanks for the awesome tutorial


Create symlink using node in npm postinstall

Since symlink is the only solution that does not confuse IDEs (as @ivan-kleshnin noted), here is my solution: add a postinstall script to the package.json that creates a symlink from the app directory to node_modules (note the srcpath link is specified relative to node_modules):

  "scripts": {
    "postinstall" : "node -e \"var srcpath='../app'; var dstpath='node_modules/app';var fs=require('fs'); fs.exists(dstpath,function(exists){if(!exists){fs.symlinkSync(srcpath, dstpath,'dir');}});\""
  }

The script could also be put into a separate file, but I prefer to specify it directly inside the package.json...

For readability, here is the one-liner well formatted:

// the src path relative to node_modules
var srcpath = '../app';
var dstpath = 'node_modules/app';
var fs = require('fs');
fs.exists(dstpath, function (exists) {
    // create the link only if the dest does not exist!
    if (!exists) {
        fs.symlinkSync(srcpath, dstpath, 'dir');
    }
});
I think it should work on windows as well, but I have not tested it.


Would like to see an updated article for JS module syntax, as it requires your imports to be static; many of these solutions won't work


@scharf, on windows it works. You only need to run cmd as admin.
But fs.exists always returns false, so I replaced it with fs.readlink:

fs.readlink(dstpath, function(err, existLink){ if(!existLink){ fs.symlinkSync(srcpath, dstpath, 'dir'); } });

I developed wires because we had configuration and routing nightmares at my company. We've been using it for 2 years now and I just released version 0.3.0 which is world-ready, so have fun using it and don't hesitate with feedback, questions or death-threats :P

Using wires, you would create a wires.json file at the root of your app:

{
    ":models/": "./lib/models/"
}

And then just require models like this:

require( ":models/article" );
require( ":models/client" );

And call your main script using the wires binary:

wires startServer

There's a lot more to wires but I felt like sharing on this specific topic.

Hope this helps! :)


We (sineLABS) created and published the very minimal rqr node package for this as well.


If this issue was solved, I think local modules would be the ultimate key to the problem.

All we need is npm outdated and npm update to not ignore private (local + not published) modules and handle them properly based on local package.json version.

Here is a proof of concept project, showing how clean and easy it would be.


yet another solution to this… building atop npm's local modules:


I was surprised how unsolved this situation is, I summarized some of the available techniques here (with a focus on private rather than local modules) - I'd appreciate any thoughts/corrections/feedback!


My PR solving the npm local module handling has been merged and is now shipped with npm 2.9.0 / iojs 2.0.0.

It's very simple to refer to a local module, simply update your package.json:

{
  "name": "my-app",
  "dependencies": {
    ...
  }
}

You can now use local modules and enjoy a very simple, clean, non-hacky way. It comes with additional perks, like having proper modules.

Explained here, proof of concept updated.
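For reference, npm's local-path dependency syntax (documented since npm 2.0) looks like this; the module name and folder here are illustrative:

```json
{
  "name": "my-app",
  "dependencies": {
    "my-module": "file:local_modules/my-module"
  }
}
```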


The package.json script for #1 has an error: dstpath needs to just be d. Also you could make it even shorter by doing f=require('fs'). If you're into that kind of thing.


Updated. Thanks for the contributions all!


The package.json postinstall code doesn't work: the destination d variable is shadowed by the fs.exists() callback's d parameter. Additionally, fs.exists() will be deprecated:

See fixed code below:

"scripts": {
  "postinstall" : "node -e \"try{require('fs').symlinkSync('../app','node_modules/app','dir')}catch(e){}\""

Tested with npm version 2.7.6 and node version v0.12.2


Joyent and then JetBrains need to support "~/" in the require path parameter; this would reference process.cwd(). If a module is being loaded from the node_modules directory (meaning it is a dependency), "~/" should be the module's root and not process.cwd().

Every other workaround is a temporary solution.
I tried to do that using a different approach, by overriding the require prototype function:

module.constructor.prototype.require = function (path) {
    try {
        var dirname = pathModule.dirname(this.filename);
        if (path.indexOf("./") > -1 && path.indexOf("./") < 2) { // if starts with ./ or ../
            path = pathModule.resolve(dirname, path);
        }
        var mdl = globalContainer.resolve(path);
        if (!mdl) {
            mdl = this.constructor._load(path, this); // todo: support DI here also?
            mdl = globalContainer.instantiate(mdl);
        }
        return mdl;
    } catch (err) {
        handleException(err, path);
    }
};
This allowed me to easily create mocks and modify the path just before passing it to the real require.
It could potentially let me support "~/", but then I'd lose IntelliJ's "go to declaration" feature.


Hey -

I've also been working on this problem recently. Here's what I've come up with: use-module and projectjs. The latter is a work-in-progress.


For anyone looking to do this with webpack there's an alias setting:


@rapilabs Thank you! :beers:


This was VERY well-written. Kudos!


@mmahalwy used your method, works great!


A simple "require-proxy" solution that depends on npm-clone (note: doesn't work out-of-the-box for multiple include path cases):

var modules_dir_path = '/path/to/node_modules/';
var _require = require(modules_dir_path + 'clone')(require);
require = function (name) { return _require(modules_dir_path + name); };

. . .

// Undo:
require = _require;
modules_dir_path = _require = undefined; // delete doesn't work on var declarations

It creates aliases for base and local paths; you might find it useful.

Just add getmodule once and enjoy ;)


Symlinking works okay, but I prefer to have each module in my projects contain their own dependencies explicitly stated in their respective package.json files. I find this cleaner and easier to keep up-to-date than having each module's dependencies stored in the global package.json file. With this in mind, symlinks don't cut it.

I've done a few projects where the approach was placing modules directly in node_modules with some prefix, like node_modules/app or node_modules/@app. However, I also feel like I shouldn't have a moment's panic when issuing rm -rf node_modules, so I am not that keen to implement it that way.

The local file: approach is great, but a nuisance while still developing the modules, as they require re-installation with each modification.

Another issue to keep in mind is if one of your local modules requires another of your local modules. The npm install step here (with each module maintaining its own requirements) can lead to all kinds of fun (!).


After reading npm/npm#7426, I see there hasn't been much progress in solving this. I understand everyone is looking at it with different requirements. Since a local module, defined by using a relative path as the target in a package.json file, doesn't carry a version number, I feel it should be handled in such a way that it always remains up-to-date. Whether that means checking the local module's version in package.json or not doesn't really bother me. So, if I have a dependency like "@app/router": "./lib/router" and I modify router, then update the router package.json file to a new version, I would think it would make some difference, but it doesn't. In the project root, cat node_modules/@app/router/package.json | grep version will still reflect the old version originally installed at the project's initial npm install time.

After looking into some other recent comments, I saw someone approach it with an example, but the steps suggested at the bottom still do not work for me as described (npm update in a local module after a version bump in its package.json file). I am using npm 2.14.2, so I am up-to-date there.

I know there are solutions out there, such as @timoxley 's linklocal, but I really feel this should come free with npm modules.

Until this is all resolved, I am left having to always issue rm -rf node_modules && npm install && npm start or the like. For each module. Every time. Evar.


I just use the following line in my app.js:

global.requireFrom = require.main.require;

I'd like to suggest a new way that I've thought of after unsuccessfully trying to use the solutions listed above (I use Git and I work on Windows). I consider it a hack, because it abuses the require algorithm. But it is almost painless, so maybe someone else will find it useful too.

The solution

Move your project to node_modules/app/ (or anything else instead of app).

Let's say you have a project in a dir project/.
That means every file project/path/to/file.js becomes project/node_modules/app/path/to/file.js.
Any file project/node_modules/something.js becomes project/node_modules/app/node_modules/something.js.

Now you can require files using a simple require('app/path/to/file');.

The cons

  • Longer paths - This could possibly be a problem on Windows, but with the excellent work from the npm team the module structure has now become quite flat. So 14 + appName.length additional chars shouldn't be the thing that tips the scale

  • How it looks - Horrible, I know. Although more advanced editors allow you to specify project directories, so replacing the current one with the nested one shouldn't be a problem

  • Possible problems with tools that ignore node_modules - in particular, I had a problem with making Nodemon work, as the default config caused everything to be ignored, and ignore: [] wasn't a solution. This was because of Nodemon's merging algorithm that merged options with the defaults. ignore: ['.git'] solved this

The pros

  • No additional environment variables - Nobody likes them

  • No messing with scripts - This often makes the project incompatible with some tools, e.g. Browserify

  • It's cross-platform - For free

This isn't exactly a "new" way, as it has been suggested before, but I only stumbled upon that answer on SO after thinking up the solution myself.


If you're using Babel, you can hook into the resolveModuleSource option.

In app/index.js:

require('babel-core/register')({
  presets: ['es2015'],
  resolveModuleSource: require('babel-resolver')(__dirname)
});


In app/app.js:

import User from 'models/User';
// => resolves: "app/models/User.js"

Just a sidenote: I assume that most IDEs have no idea what to do with transformed/aliased require/import paths. Take this into consideration when looking for better require paths.


Thanks for sharing!

There might be a little error in section 1, The Symlink. The outer string variable d is hidden by the inner argument d, which is not a string.

"postinstall" : "node -e \"var s='../app',d='node_modules/app',fs=require('fs');fs.exists(d,function(d){d||fs.symlinkSync(s,d,'dir')});\""


"postinstall" : "node -e \"var s='../app',d='node_modules/app',fs=require('fs');fs.exists(d,function(obj){obj||fs.symlinkSync(s,d,'dir')});\""

This works for me:


Which prepends the current directory to the list of module search directories.
The module object's paths field is not documented, though, so it's still a hack.


I like using require.main.require(); however, you will lose IntelliSense with that approach in Visual Studio Code. Are there any IDEs or code editors that can still give IntelliSense in such a situation?


If you use the Symlink solution (since node_modules is always in my .gitignore, I have tended to use this method), you may run into the following bug if you later try to npm install anything locally:

npm ERR! Cannot read property 'localeCompare' of undefined

npm/npm#9766 may occur if you do not include a package.json file in your target directory. Just an FYI, may be worth a note.

If you're using Webpack, @rapilabs' suggestion of using resolve.alias seems like the right way to go.
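For reference, a minimal sketch of that resolve.alias setting (the directory names are assumptions based on the gist's layout, not from the original comment):

```javascript
// webpack.config.js (sketch)
var path = require('path');

module.exports = {
  resolve: {
    alias: {
      // require('models/article') resolves to <project>/app/models/article
      models: path.resolve(__dirname, 'app/models')
    }
  }
};
```

Webpack rewrites the prefix at bundle time, so no runtime globals or symlinks are needed.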
