Marching on Toward a Node Maintenance Mode

May 23, 2016 | Product

Since the beginning, the goal with Wrenchmode has been to support pretty much any language or framework that has a middleware stack. Rack for Ruby was our first target, but Express for NodeJS is a close second. We’re hoping to ship the npm module for Express any day now; in the meantime, I thought I’d talk through some of the interesting design considerations, and how the Express implementation differs from the Rack middleware layer.

Ruby: Threaded…Kinda…

The core problem for the Wrenchmode middleware is how to stay in touch with the main Wrenchmode server without interfering with requests as they come in. With Ruby/Rack, my solution was concurrency. Even though Ruby (at least MRI, with its global interpreter lock) doesn’t support true parallelism, it still schedules threads intelligently, swapping out any thread that is blocked or sleeping. For the Wrenchmode middleware, this means a separate thread can be responsible for polling the Wrenchmode server for status. The main thread still handles passing requests up the Rack middleware stack, and there is no real performance impact because the second thread spends 99.9999% of its life either sleeping or blocked on an HTTP request.

I eventually came up with a fairly elegant solution, but multithreading is always tricky: I had to think through every case where a race condition might occur, especially around updates to the core data structures shared by both threads.

In short, it works, and it’s nice. But if only there were a language that supported a better model…

Async to the Rescue

Node supports asynchronicity much more elegantly than Ruby. Rather than relying on explicit multithreading, I can lean on Node’s async/callback model. Simply put, I can write a simple middleware layer that just “does” the polling of the main Wrenchmode server right alongside the main request handling. Node queues the callback for the Wrenchmode status request on the event loop and gets right back to handling incoming requests. No worries about deadlock, and no fancy mutexes required.
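To make that concrete, here’s a minimal sketch of the pattern — not the actual Wrenchmode npm module. The shape of the status object, the 503 response, and the `fetchStatus` helper are all assumptions for illustration:

```javascript
// A sketch of event-loop polling with a connect/Express-style middleware
// signature. The status shape, the 503 response, and the `fetchStatus`
// helper are illustrative assumptions, not the real module.

// Shared state. Because Node runs our callbacks one at a time on the
// event loop, reads and writes here can never race -- no mutex needed.
let status = { switched: false };

// Kick off polling: every `intervalMs`, ask the Wrenchmode server for
// the latest status. A real implementation would make an HTTP request
// here; `fetchStatus(cb)` stands in for that call.
function startPolling(fetchStatus, intervalMs) {
  return setInterval(() => {
    fetchStatus((err, latest) => {
      if (!err && latest) {
        status = latest; // safe: runs on the same event loop as requests
      }
    });
  }, intervalMs);
}

// The middleware itself: if Wrenchmode has switched the app into
// maintenance mode, short-circuit with a 503; otherwise pass the
// request up the stack.
function wrenchmode(req, res, next) {
  if (status.switched) {
    res.statusCode = 503;
    res.end('Down for maintenance');
  } else {
    next();
  }
}
```

In an Express app this would presumably be mounted with something like `app.use(wrenchmode)` ahead of the rest of the stack, with a `clearInterval` on the handle from `startPolling` to shut the poller down cleanly.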

More to Come

I’m still putting together the actual code, but I’m hopeful that it will come together very quickly. This is a problem space that NodeJS handles very elegantly, and that means the coding itself will be a joy. I’ll be sure to come back here and update this post with my experiences once it’s all done. Stay tuned!