What is the NERP stack? I’m glad you asked! It’s a fun little development stack acronym, my counterpoint to MEAN. In Part 1, we consider the Pros and Cons of Node.js!
What is MEAN? It’s a set of development technologies, frequently called a ‘stack’:
Now, here we are, over two years later, and we’ve realized the error of our ways. Angular’s tendency toward nested controllers, confusing $scope, hard-to-debug two-way binding and so on have unseated it from its frontend throne. And many have realized the substantial downsides of Mongo. Sadly, new articles are still being posted recommending MEAN.
This is the first in a series meant to bring balance to the force. Er, I mean, tech acronyms on the internet.
Say you’re new to the world of Node.js. What should your default stack be? I humbly propose the NERP stack:
A quick introduction to the new players in this updated acronym:
PostgreSQL is a battle-tested, open-source relational database appropriate for tutorial-style use and for high-traffic production websites. It’s not a hip NoSQL database, but it can do schema-free JSON if you so desire. You’re likely already familiar with SQL databases, so you’ll be right at home.
React is a radical rethinking of the way we approach frontend development, which brings big benefits. Its recommended one-way data flow, taken directly from software architectures first developed in the 1960s, makes things more predictable and verifiable.
Let’s get into the details!
Node.js is a good choice for our first foray into the technologies of the NERP stack, since it’s also part of MEAN. But that’s not enough reason to use it. We’d better be sure, because we can’t use Express, also shared between the two acronyms, unless we also choose Node.js.
So, let’s remind ourselves why Node.js is interesting in the first place. Yes, lots of people are using it, but is that enough? Are there more reasons beyond simply following the crowd? I believe the answer is a resounding “yes!”
First let’s talk about the good things about Node.js:
On the unix command line you can compose things very easily from a set of very basic components. For example, we can chain three tools together and very easily accomplish something that would take many confused clicks in most GUIs - delete all items which match a search query (‘temp’) within a directory hierarchy:
find . | grep temp | xargs rm
Node.js ships with just the runtime and the npm command line tool. That’s it. There’s a very high bar for things making it into the installer, and everything else is delivered as a user package.
Thus Node.js is very lightweight. You can use it to build a command-line utility like one of the unix tools above. Or you can use it like the shell itself above: chaining a lot of modules together to build a more complex whole. Node.js is particularly well-suited for microservices, which bring the unix ethos to the web.
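To make that concrete, here is a minimal sketch of a grep-style filter in Node.js. The function name and sample data are illustrative, not from any particular package:

```javascript
// A composable, unix-style filter: keep only the lines that
// contain a given pattern. This is the kind of small, focused
// building block Node.js is well suited for.
function keepMatching(lines, pattern) {
  return lines.filter((line) => line.includes(pattern));
}

// Sample input standing in for the output of a find-style walk:
const paths = ['notes.txt', 'temp-1.log', 'src/temp.js', 'README.md'];

console.log(keepMatching(paths, 'temp')); // [ 'temp-1.log', 'src/temp.js' ]
```

Wired up to process.stdin and process.stdout, a function like this becomes a command-line tool that slots into pipelines just like grep above.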
With npm in the box, and very few core libraries, Node.js was set up for a strong ecosystem from the start. And it didn’t disappoint: there is an unbelievable amount of energy in the Node.js ecosystem. The npm registry has a huge number of libraries, about 230k as of this writing. And many new libraries are published every day!
Want to print color to the console? Use chalk. Want to parse command-line arguments? Try commander. The generic utilities you’ll want on day one? lodash. And it goes on and on and on: full command-line tools you install globally, complete server frameworks, development tools, and more.
And it’s all open source. When you install a node module, the code is right there, on disk like the old days of pre-minified web pages. You can learn from them, modify them, even contribute improvements back to them. That’s the beauty of open source. You can stand on the shoulders of giants, then contribute back to the community and allow others to stand on your shoulders.
One Node.js web service process can handle many, many concurrent connections. How? Node.js is made for the internet! Just as a web page or image is broken into a lot of little packets for transmission over TCP/IP, Node.js does best working a little bit at a time. But unlike a server such as Apache or a framework like Ruby on Rails, it does all the work on one thread.
The key to being so productive on one thread is asynchronous I/O. Going to disk, going to a database, going to a web service across the internet - all of them are treated the same: request the data, and then do other things while you wait for the response.
Now, you might be saying to yourself: “But other frameworks do this!” Sure, platforms like EventMachine (Ruby) and Tornado (Python) have the same single-threaded asynchronous I/O. The difference is that in the world of Node.js, anything you install via npm is compatible with it. Installations via pip will likely use synchronous APIs, which would eliminate most of the benefits of EventMachine or Tornado.
Node.js allows you to take that knowledge to the server! One developer can implement a feature from the database all the way to the client-side, with no waiting for another team to do their part! You can even share the exact same code on the server and in the browser, like data validations or HTML rendering.
In this world you can reduce complexity. No more maintaining completely different environments for the server and the browser. You can use the same coding standards, linting tools and configuration, test frameworks, etc. across all of your development.
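As a small illustration, here is a hypothetical validation function that could run unchanged in both environments (the name and the rule are made up for this example):

```javascript
// One function, two runtimes: on the server it can guard a
// route handler; in the browser it can validate a form field
// before the request is ever sent.
function isValidUsername(name) {
  return typeof name === 'string' && /^[a-z0-9_]{3,20}$/.test(name);
}

console.log(isValidUsername('nerp_fan')); // true
console.log(isValidUsername('no'));       // false (too short)
```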
Now let’s consider the other side, the negative aspects of Node.js. You already know about the dangerous cliffs. What else is there?
It’s got no proper number type. It has only very limited support for immutability (though it does have immutable libraries). It does no enforcement of any kind on function parameters - you can add too many or provide none at all. Its arguments keyword looks like an Array, but you’d get an error if you tried to call arguments.slice(). It’s got no real hash maps, since all object keys are first coerced to strings. And that’s just the beginning of its many, many quirks.
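A few of those quirks, demonstrated directly:

```javascript
// Every number is an IEEE-754 double, so decimal math drifts:
console.log(0.1 + 0.2); // 0.30000000000000004

// arguments looks like an Array but isn't one; calling
// arguments.slice() throws, so you have to borrow the method:
function tail() {
  console.log(Array.isArray(arguments)); // false
  return Array.prototype.slice.call(arguments, 1);
}
console.log(tail('a', 'b', 'c')); // [ 'b', 'c' ]

// Object keys are coerced to strings, so 1 and '1' collide:
const counts = {};
counts[1] = 'number key';
counts['1'] = 'string key';
console.log(Object.keys(counts)); // [ '1' ]
console.log(counts[1]); // 'string key'
```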
If you want to produce quality code, linting is necessary. The language and runtime won’t help.
The number of packages available on npm is a double-edged sword. It makes finding the right library for your needs that much harder. Say you’re looking for a generic set of utility functions: do you use lodash, or something else?
How do you decide? If you’re on a team of people who have each used different options in the past, how will you decide what to use for a new project? When you make a final choice, will those unfamiliar with that library cause more bugs?
You start a new project and decide on grunt for your project automation framework. A couple months later your new hires start complaining that you’re not using gulp for your automation. The next month your designer says that broccoli is the latest thing. Now your developers want to follow the latest trend - npm scripts directly in the project’s package.json.
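For reference, the npm-scripts approach needs no extra tool at all - just a couple of entries in package.json (the commands shown here are placeholders):

```json
{
  "scripts": {
    "build": "node build.js",
    "test": "node test.js"
  }
}
```

Running npm run build then invokes the command directly, with the project’s locally installed binaries available on the PATH.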
Node.js is often described as fast. But other technologies are very fast to begin with. Java Virtual Machines have very mature hotspot optimization. C and C++ lack garbage collection overhead and can be hand-tuned down to the finest detail. When these technologies start using the same single thread and async I/O techniques, they can go faster than Node.js.
Of course, so much of it comes down to the specific workload, as well as the specific algorithms and data structures used. But it’s a mistake to say that Node.js is fast without also including some sort of reference point.
Alright! I think our N is solidly justified. We are well on our way to building an acronym which will stand the test of time! Will this one be even more popular than LAMP? We can certainly dream.
Now you’re ready for E for Express (NERP stack part 2)…