Do we really need `node_modules` or `npm`?

I remember thinking that the way we do JavaScript is complex, but that we don't have any choice. For the last few years we've been downloading a lot of JavaScript modules from npm into our node_modules folder, then transforming and bundling them for browsers using webpack and Babel. This was necessary because browsers didn't support the new features, most importantly modules, and sending a lot of separate files to the browser was inefficient, so we transformed and bundled ahead of time.

Now the times are changing. Many popular browsers support the crucial features, including modules, and HTTP/2 makes it more efficient to send a bunch of files. But we're stuck in the old ways, and we're paying the price for what made sense at the time. As it turns out, putting all your JavaScript in one bundle is not that efficient either, since you're sending non-essential code, which makes the page slower to load and parse and hurts the user experience. Caching is hard too, as your bundle changes with even the tiniest edit. Now we're seeing more granular approaches, like route-specific bundling, which should improve on the problems mentioned.

When I think of the chunking approach that would be the most cacheable, it's no surprise that sending modules separately seems to be it. If we send lodash separately, for instance, it will stay cached until we bump the version, and all subsequent visits will hit the cache. How about the initial load? It would be best if browsers were aware of bundles, could unbundle them and cache the modules individually, but that doesn't seem to be on the roadmap. So either you develop intricate strategies to build bundles of "just fine" size, or you trust HTTP/2 and optimize for returning users. And it's not an either/or scenario: you can use server rendering and graceful degradation so users don't need to wait for scripts to load and parse to experience the basic functionality of your website.

The way to solve this is to trust the browsers. They are now module-aware, so you can add dependencies without adding them to the global scope, and they are much better at downloading multiple resources than before. So my suggestion would be to add script tags to your HTML for the modules you use, served from UNPKG, which is an excellent CDN. You may need a tool that notifies you of out-of-date dependencies and bumps them for you, but that's it. This is the vanilla case, which would cover the needs of many projects.
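To make that concrete, here's a minimal sketch of the vanilla case. The package, version, and file path are only examples; check UNPKG for the actual ES module entry point of whatever you depend on.

```html
<!-- index.html: dependencies come straight from the CDN as ES modules -->
<script type="module">
  // Illustrative import; lodash-es ships each function as a separate ES module.
  import debounce from "https://unpkg.com/lodash-es@4.17.21/debounce.js";

  const onResize = debounce(() => console.log("resized"), 200);
  window.addEventListener("resize", onResize);
</script>
```

Each dependency gets its own URL, so a returning visitor only re-downloads the ones whose versions you bumped.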

The advanced case is a bit more complicated, and for good reason, but the tooling necessary for it is currently nonexistent. If you plan to build a web app that has server rendering and fast page transitions with client-side validation and maximum code sharing, then you must have some kind of hybrid solution. What I would prefer is to write the routes of the application in JavaScript, and have a tool generate the HTML for each route, turning the imports into script tags and injecting the script for page transitions. Once that script is loaded, internal links can be intercepted, so the script for the target page and its data requirements can be fetched without a full-page refresh. The tool must also be able to generate an API route for the data.
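A rough sketch of what that injected page-transition script could look like follows. The /routes and /api URL shapes, and the render export, are hypothetical outputs of the tool I'm describing, not an existing convention.

```js
// transitions.js: intercept internal links and swap pages without a full reload.
document.addEventListener("click", async (event) => {
  const link = event.target instanceof Element ? event.target.closest("a") : null;
  if (!link || link.origin !== location.origin) return; // only internal links
  event.preventDefault();

  const route = link.pathname;
  // Fetch the route's data and its module in parallel (URL shapes are hypothetical).
  const [data, page] = await Promise.all([
    fetch(`/api${route}`).then((res) => res.json()),
    import(`/routes${route}.js`),
  ]);
  page.render(document.querySelector("#app"), data);
  history.pushState({}, "", route);
});
```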

In this case we can also go with UNPKG and use the hypothetical dependency-tracking tool, but we also need IDE support, and we may want to serve the scripts ourselves. (No, the shared cache does not work anymore. [https://www.jefftk.com/p/shared-cache-is-going-away]) In that case using npm and node_modules can be unavoidable, but the bright side is that you get all the niceties of npm. Maybe in the future the tooling can be improved so we can get away with simpler tools than npm that mimic the UNPKG way.
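If you do end up serving the scripts yourself, the server side can stay small. Here's a minimal sketch using Express (any static file server would do), assuming your pages reference their dependencies under a /modules path:

```js
// server.js: serve npm-installed dependencies from your own origin.
const express = require("express");
const app = express();

// Pages can then use e.g.
// <script type="module" src="/modules/lodash-es/debounce.js"></script>
app.use("/modules", express.static("node_modules"));
app.use(express.static("public"));

app.listen(3000);
```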

It also makes sense to keep in mind that static generation of routes is an important optimization with a bright future. [https://joshwcomeau.com/gatsby/a-static-future/] The logical conclusion is a multi-pass rendering scheme where semi-static routes are partially generated, so unnecessary calls to the database are not made frequently. In that case the database can be a trigger for the build process, but that's a topic for another note.

Key takeaways:

  • If your website is mostly static, think about adding modules from UNPKG to keep things simple, with a small tool to manage your dependencies
  • If your website is dynamic, define your routes with JavaScript and React, use server rendering, let a tool add script tags for your dependencies, create API routes, and inject the fast-transition script
  • Seriously consider what you can push to static, explore multi-pass rendering
@frontsideair:

Looks like Snowpack partly solves the problem, but Pika CDN by the same folks takes this a step further, and I think I'll start using it for static pages instead of UNPKG, and maybe for more complex projects as well! I also had some questions about speed; it turns out there's an excellent study that Snowpack mentions, and it basically says you're okay unless you have 1000 direct dependencies.
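For comparison with the UNPKG example above, the same kind of import from Pika CDN would look roughly like the following; the URL shape here is my assumption, so check the CDN's docs for the exact format.

```html
<script type="module">
  // Assumed Pika CDN URL shape; the CDN resolves the package's ESM entry point.
  import { debounce } from "https://cdn.pika.dev/lodash-es@^4.17.0";
</script>
```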
