I have been working on a service which exposes its functionalities via a REST API. So, we have a number of REST services listening online and a REST client to control it. The whole server/client setup was originally written in Java.

I was instructed to rewrite the client in JS so that we could have a web UI, and I was given one and a half days (36 hours) to finish. Should not be much of a challenge, right? Or so I thought.

Plans and More Plans

I split the timeline into slots to learn a decent amount of CSS, brush up on my JS, design the UI, and port the entire client to JS. I had already sketched some design concepts in my free time, which should have made things easier. Challenge Accepted.

Asynchronous Programming

The paradigm of asynchronous programming welcomed me with a kick to the face. I admit it took me a while to find my feet. JS in the browser is completely different from what one would write for Node.js or for Cinnamon extensions.

In the world of JS in browsers, nothing blocks. Anything that needs to wait for a resource runs asynchronously: you pass it callbacks, which are executed on success or failure accordingly, while the rest of your code carries on.
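As a minimal sketch of that style (with `setTimeout` standing in for a real network request, and a made-up URL), notice that the calling code moves on immediately while the callbacks fire later:

```javascript
// Minimal sketch of callback-style async: setTimeout stands in for a
// real network request, and the URL is purely illustrative.
function fetchResource(url, onSuccess, onFailure) {
    setTimeout(function () {
        if (url) {
            onSuccess('response from ' + url);
        } else {
            onFailure(new Error('no URL given'));
        }
    }, 0);
}

fetchResource('http://azure.gtux.in/ping', function (data) {
    console.log(data);               // runs later, on success
}, function (err) {
    console.error(err.message);      // runs later, on failure
});
console.log('this line runs first'); // nothing above blocked
```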

The Problem

Now, we have to make a series of REST calls (around 20 to 25) to different endpoints. All these calls must be made sequentially, each one starting only after the previous one succeeds, and if even one of them fails, the entire process has to be terminated. AJAX and JS gave me the option to do what they do best: fire all 25 calls at the same time. But that is not what I need. I want them called one after the other.
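To illustrate the mismatch, here is a sketch (endpoint paths are made up, and random `setTimeout` delays stand in for network latency): firing every request at once gives no guarantee about the order in which they complete.

```javascript
// Sketch: launching all requests at once, as AJAX happily lets you do.
// Random setTimeout delays stand in for varying network latency;
// the endpoint paths are illustrative.
var endpoints = ['/ping', '/vm/snapshot', '/vm/restore'];
var completed = [];

endpoints.forEach(function (url) {
    setTimeout(function () {
        // Completion order depends on latency, not on the array order.
        completed.push(url);
    }, Math.random() * 10);
});
```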

Sync Mode?

AJAX has an option to make HTTP requests in synchronous mode. But that is hardly useful in our case, for two reasons:

  • Synchronous requests on the main thread are deprecated.
  • It locks up the browser, so the user cannot perform any action until all the calls have completed.

Endless Cascades

The only way to achieve what we need is to make each subsequent call in the success callback of the previous AJAX request. So the code would probably look like this:

    $.ajax({
        url: 'http://azure.gtux.in/ping',
        success: function (data) {
            $.ajax({
                url: 'http://azure.gtux.in/vm/snapshot',
                success: function (data) {
                    $.ajax({
                        url: 'http://somewhere',
                        success: function (data) {
                            // do this for 20 more times
                        }
                    });
                }
            });
        }
    });
If this kind of approach had been adopted, the resultant spaghetti code would have made any kind of debugging or feature addition impossible in the future.

Instinct of a Reverse Engineer

This called for a better approach to the problem. I remembered an idea from virtual-machine-based obfuscation of binary executables, where the original program is compiled into a custom virtual bytecode, and an interpreter with a dispatcher is built to execute that bytecode.

Heavily inspired by this, I started building a simple interpreter that performs an execute-increment-repeat cycle.

I rewrote all the API calls to the services as methods, each of which immediately returns a Promise instance. All these methods accept only the self instance as an argument (one cannot rely on reaching self from inside callbacks, so we pass the reference explicitly as a function parameter). The actual parameters are passed through a shared object memory (they cannot be passed the usual way, since the number of arguments differs from one method to another).
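A sketch of what one such method might look like (the names `pingService` and `self.memory` are illustrative, not the actual client code):

```javascript
// Hypothetical sketch of one rewritten API call. The method takes only
// the self instance and immediately returns a Promise; its real
// parameters live in the shared object memory (self.memory here).
function pingService(self) {
    return new Promise(function (resolve, reject) {
        // Fetched from shared memory, not from the argument list.
        var vmName = self.memory.vmName;
        // The real client would fire an AJAX request here; we resolve
        // immediately to keep the sketch self-contained.
        if (vmName) {
            resolve('pinged ' + vmName);
        } else {
            reject(new Error('vmName missing from shared memory'));
        }
    });
}

var self = { memory: { vmName: 'azure-vm-1' } };
pingService(self).then(function (result) {
    console.log(result); // → "pinged azure-vm-1"
});
```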

Consider all these methods as atomic assembly instructions. We then prepare a code array: references to the methods, ordered by their sequence of execution.

We also keep an instruction pointer that points to the currently executing method. Together, these form our simple dispatcher mechanism.

    var instructionPointer = 0;
    var code = [ /* references to the API methods, in execution order */ ];

    function dispatch(self) {
        if (instructionPointer >= code.length) {
            return; // every instruction has been executed
        }
        var nextFunction = code[instructionPointer++];
        nextFunction(self);
    }

Each instruction (function) has an epilogue in its success callback that calls the dispatcher; the dispatcher then invokes the next method, and so the execution continues. If a call fails, the dispatcher is simply never invoked again, and the whole process terminates, exactly as required.
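Putting it together, here is a self-contained sketch of the cycle (`fakeAjax` stands in for a real AJAX call and invokes its callback immediately so the example runs without a server; the method and endpoint names are made up):

```javascript
// Sketch of the execute-increment-repeat cycle.
var instructionPointer = 0;
var trace = [];

function fakeAjax(url, onSuccess) {
    onSuccess('ok:' + url); // a real call would invoke this asynchronously
}

function ping(self) {
    fakeAjax('/ping', function (data) {
        trace.push(data);
        dispatch(self); // epilogue: hand control back to the dispatcher
    });
}

function snapshot(self) {
    fakeAjax('/vm/snapshot', function (data) {
        trace.push(data);
        dispatch(self); // the same epilogue appears in every instruction
    });
}

var code = [ping, snapshot];

function dispatch(self) {
    if (instructionPointer < code.length) {
        code[instructionPointer++](self);
    }
}

dispatch({}); // kick off; trace ends up as ['ok:/ping', 'ok:/vm/snapshot']
```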

This is fun.