ECMAScript Harmony – Compile ES6 code to ES5 using Traceur/Babel

ECMAScript Edition 6 (ES6) comes with a lot of features such as classes, generators, destructuring, modules, and much more. ES6 is not yet fully implemented in all browsers and JS engines, so it is worth checking a compatibility table. Using the developer tools in Firefox and the latest Chrome you can already test features like arrow functions; in Chrome you will also need to enable the harmony flag: chrome://flags/#enable-javascript-harmony.

It may be a bit early, but if you want to use the latest ES6 features today you will need a compiler that compiles your ES6 code down to regular JavaScript that can run in current browsers. The two most popular compilers for this purpose are Traceur and Babel.

Traceur

You can try Traceur in several ways:

  • Include Traceur in a Web page and it will compile ES6 code on the fly.
  • Or use node to compile ES6 to ES5 offline and include the result in Web pages or just run the result in node.

For the browser you can include the following scripts, and Traceur will compile your code automatically:

<!DOCTYPE html>
<html>
 <body>
 <script src="https://google.github.io/traceur-compiler/bin/traceur.js"></script>
 <script src="https://google.github.io/traceur-compiler/src/bootstrap.js"></script>
 <script type="module">
  var arr = [{a: 1, b: 1}, {a: 1, b: 4}];
  console.log(arr.map((x) => x.a));
 </script>
 </body>
</html>

In the example above, the first file included is the Traceur compiler; next comes a small bootstrap script that applies the compiler to the Web page. Traceur itself is written in ES6. The simple example manipulates an array using the higher-order function .map, which takes an arrow function as its callback.

You can also import ES6 modules:

<script type="module">
 import './module.js';
</script>

The other way is offline compilation, for which you need Node installed. Traceur includes a shell script, traceur, that compiles ES6 code down to ES5/regular JavaScript:

./traceur --out scripts/module.js --script module.js

We can test the module.js we just compiled like:

<html>
<head>
<script src="bin/traceur-runtime.js"></script>
<script src="scripts/module.js"></script>
</head>
<body>
</body>
</html>

The traceur-runtime.js file contains polyfills as well as helper functions that reduce the size of the generated code. You can find more about Traceur in its project documentation.

Babel

Babel is another compiler for ECMAScript 6. Install it along with a module loader:

npm install babel-core es6-module-loader

You will need to include the following files in your web page:

  • node_modules/babel-core/browser.js
  • node_modules/es6-module-loader/dist/es6-module-loader.js

The following is an example of importing an ES6 module:

<!DOCTYPE html>
<html>
<body>
<script src="js/vendors/browser.js"></script>
<script src="js/vendors/es6-module-loader.js"></script>
<script src="js/module.js"></script>
 <script type="module">
  System.transpiler = 'babel';
  import Module from 'js/module';
  var instance = new Module();
 </script>
</body>
</html>

Let’s set up a development environment and start using ES6 with Babel in your applications. You’ll need to install the following presets and plugins:

npm install --save-dev babel-cli babel-preset-es2015 babel-preset-stage-0

babel-preset-es2015 covers ES6, while babel-preset-stage-0 enables experimental ES7 features. The latter is not required in our case, but it may be a good idea to have it.
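Babel picks these presets up from a .babelrc file in the project root; a minimal configuration for the two presets above could look like this:

```json
{
  "presets": ["es2015", "stage-0"]
}
```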

A simple Express application written in ES6 could look like this:

let express = require('express'),
    app = express(),
    PORT = 3636;

app.get('/', (req, res) => {
  res.status(200);
  res.write(JSON.stringify({'data': "visiting home url."}));
  res.end();
});

app.listen(PORT, () => {
  console.log('listening at port: ' + PORT);
});

babel express.js --out-file out/express.js

The command above transpiles the ES6 code in express.js and writes the result to out/express.js, plain JavaScript that can run in environments supporting only ES5 or earlier.

Following is the output:

'use strict';

var express = require('express'),
    app = express(),
    PORT = 3636;

app.get('/', function (req, res) {
	res.status(200);
	res.write(JSON.stringify({ 'data': "visiting home url." }));
	res.end();
});

app.listen(PORT, function () {
	console.log('listening at port: ' + PORT);
});

It is also possible to compile a whole directory of ES6 files in one go:

babel src --out-dir out


Node.js Streams, Pipe and chaining

A stream in Node.js is an abstract interface that is implemented by various objects. Thanks to Node.js’s asynchronous and event-driven nature, it excels at handling I/O-bound tasks and streams, which are similar to Unix pipes.

For example, an HTTP server request is a stream, just like the standard output (stdout). The request is a readable stream, while the response is a writable stream. Streams can be classified as Readable, Writable, or Duplex (both readable and writable). Readable streams allow you to read data from a source, while writable streams allow you to write data to a destination. A “duplex” stream, such as a TCP socket connection, can both read and write data.

All streams in Node.js are instances of EventEmitter. Here is an example:

var fs = require('fs');
var readStream = fs.createReadStream('file.txt');
var text = '';

readStream.on('data', function(chunk) {
  text += chunk;
});

readStream.on('end', function() {
  console.log(text);
});

In the code above, fs.createReadStream() provides a readable stream object. By using this object, we can listen for the data event and attach a callback. Whenever chunks of data are read, they are passed to the callback and appended to the text string. When all data has been read, the stream emits an end event, and we log the text.

You can set the encoding on a stream by calling readStream.setEncoding('utf8'). The data will then be interpreted as UTF-8 and passed to your callback as a string instead of a Buffer.

Additionally, Node.js provides a useful function called pipe() for transferring the flow of data from one stream to another. For instance, we can pipe a readable stream to a writable stream like this:

var fs = require('fs');
var readStream = fs.createReadStream('file1.txt');
var writeStream = fs.createWriteStream('file2.txt');

readStream.pipe(writeStream);

In this case, we pipe the readable stream of file1.txt to the writable stream for file2.txt. The pipe() method manages the data flow automatically, so you don’t have to handle the callbacks manually.

The pipe() method also supports chaining, which allows you to pipe data through multiple destinations:

var fs = require('fs');
var readStream = fs.createReadStream('file.txt');
var zlib = require('zlib').createGzip();
var writeStream = fs.createWriteStream('file.txt.gz');

readStream.pipe(zlib).pipe(writeStream);

In this example, we pipe the readStream to zlib (for compression), and then pipe the resulting data to writeStream. The purpose of the pipe() method is to manage buffering and ensure that data is transferred at a rate that prevents memory overload, especially when dealing with sources and destinations of varying speeds.

Angular.js Promises – Deferred and Promise handling

A promise is an object that represents the return value or the thrown exception that the function may eventually provide. A promise can also be used as a proxy for a remote object to overcome latency.

Promises are highly useful when you want to synchronize multiple asynchronous functions while avoiding JavaScript callback hell. The following is an asynchronous example using nested callbacks:

step1(function (value1) {
  step2(value1, function (value2) {
    step3(value2, function (value3) {
      // do something.
    });
  });
});

An alternative, using a promise library such as Q, flattens the example above:

Q.fcall(step1)
  .then(step2)
  .then(step3)
  .then(function (value3) {
    // Do something
  })
  .catch(function (error) {
    // Handle any error from all the steps above
  })
  .done();

Angular.js provides a service, $q, an implementation of promises/deferred objects inspired by Kris Kowal’s Q. You can simply start using it as a service:

angular.module('PromiseExample').controller('PromiseCtrl', ['$scope', '$http', '$q', function ($scope, $http, $q) {

  function getUser() {
    var deferred = $q.defer();
    $http.get(ApiUrl + '/user/' + $scope.uuid)
    .success(function (result) {
      deferred.resolve(result);
    })
    .error(function (error) {
      deferred.reject(error);
    });

    return deferred.promise;
  }

  getUser().then(function (user) {
    $scope.user = user;
  });

}]);

In the code above, getUser() creates a $q.defer() object and returns its promise at the end of the function; this is the remote object we mentioned earlier. deferred.resolve provides the result in case of success, while deferred.reject tells us why the call failed.

A $q.defer() object also has a notify() method, which reports the status of the promise’s execution. It may be called multiple times before the promise is either resolved or rejected.

You can also wrap multiple promises using $q.all. This function combines multiple promises into a single promise, and you can later iterate through the responses:

var first = $http.get("/api/v1/user/educations/" + user_id),
    second = $http.get("/api/v1/user/employments/" + user_id),
    third = $http.get("/api/v1/user/posts/" + user_id);

$q.all([first, second, third]).then(function (result) {
  // iterate through all responses.
  angular.forEach(result, function (response) {
    console.log(response.data);
  });
});

If you want to look at further examples of promises, see Kris Kowal’s Q and the Angular $q service documentation.

Introduction to WebSockets – Server, Client examples using Node.js

Initially the paradigm of the web was the client/server model, in which the client’s job is to request data from a server, and the server’s job is to fulfill those requests. This paradigm served the web well for a number of years, but the later introduction of AJAX enabled us to communicate with the server asynchronously.

AJAX made it easy to communicate with the server, but we still need more power for real-time applications. A simple HTTP request could look like this:

GET /users/ HTTP/1.1
Host: example.com
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5 (.NET CLR 3.5.30729)
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Cookie: PHPSESSID=r2t5uvjq435r4q7ib3vtdjq120
Pragma: no-cache
Cache-Control: no-cache
Every time you make an HTTP request, a bundle of headers and cookie data is transferred to the server, and the response carries its own headers along with the data. If you are working on a real-time application or a game running in the browser, you will have trouble keeping things smooth, because every request carries a lot of data that is unnecessary on each round trip. This is where WebSockets come in: a way of creating a persistent, low-latency connection that can support transactions initiated by either the client or the server.

Using WebSockets you can open a bidirectional connection that the client and server can use to send data at any time. The client establishes the connection through a process called the WebSocket handshake: it sends a regular HTTP request to the server with an Upgrade header included, to let the server know that it wants to establish a WebSocket connection.
GET /chat HTTP/1.1
Host: example.com:8000
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
Sec-WebSocket-Version: 13

If the server supports the WebSocket protocol, it sends the Upgrade header back in the response:

HTTP/1.1 101 Switching Protocols
Upgrade: websocket
Connection: Upgrade
Sec-WebSocket-Accept: s3pPLMBiTxaQ9kYGzzhZRbK+xOo=

After the handshake is complete, the initial HTTP connection is replaced by a WebSocket connection over the same underlying TCP/IP connection, and either party can start sending data.

A simple client-side connection looks like this:

var socket = new WebSocket('ws://example.com');

// onopen: the connection is ready to send/receive data.
socket.onopen = function (event) {
  console.log('connected to server.');
};

// Handle any errors that occur.
socket.onerror = function (error) {
  console.log('WebSocket Error: ' + error);
};

To send a message through the WebSocket connection you call the send() method. You can send both text and binary data through a WebSocket.

Now let’s create a simple socket server using Express.js and Socket.IO:

var app = require('express')();
var http = require('http').Server(app);
var io = require('socket.io')(http);

app.get('/', function (req, res) {
  res.sendfile('index.html');
});

io.on('connection', function (socket) {
  console.log('a user connected');
});

http.listen(3000, function () {
  console.log('listening on *:3000');
});

The server is now listening on port 3000 and can receive and send data.