Shaun Xu

The Sheep-Pen of the Shaun



Shaun, the author of this blog, is a semi-geek, clumsy developer, passionate speaker and incapable architect with about 10 years' experience in .NET and JavaScript. He hopes to prove that software development is art rather than manufacturing. He's into cloud computing platforms and technologies (Windows Azure, Amazon and Aliyun), and right now Shaun is attracted by JavaScript (Angular.js and Node.js) and he likes it.

Shaun works at Worktile Inc. as the chief architect, responsible for the overall design and development of worktile, a web-based collaboration and task management tool, and lesschat, a real-time communication aggregation tool.

 

When I described how to host a Node.js application on Windows Azure, one question that might be raised is how to consume the various Windows Azure services, such as storage, service bus, access control, etc. Interacting with Windows Azure services from Node.js is possible through the Windows Azure Node.js SDK, a module available on NPM. In this post I would like to describe how to use Windows Azure Storage (a.k.a. WAS) as well as the service runtime.

 

Consume Windows Azure Storage

Let's first have a look at how to consume WAS through Node.js. As we know from the previous posts, we can host a Node.js application on Windows Azure Web Site (a.k.a. WAWS) as well as on Windows Azure Cloud Service (a.k.a. WACS). In theory, WAWS is also built on top of WACS worker roles with some more features. Hence in this post I will only demonstrate hosting in a WACS worker role.

The Node.js code for consuming WAS can also be used when hosted on WAWS. But since there are no roles in WAWS, the code for consuming the service runtime mentioned in the next section cannot be used in a WAWS Node.js application.

We can use the solution that I created in my last post. Alternatively, we can create a new Windows Azure project in Visual Studio with a worker role, add "node.exe" and "index.js", install the "express" and "node-sqlserver" modules, and set all files to "Copy always".

In order to use Windows Azure services we need the Windows Azure Node.js SDK, also known as a module named "azure", which can be installed through NPM. Once it has been downloaded and installed, we need to include its files in our worker role project and set them to "Copy always".

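As a quick reference, getting the module in place looks roughly like the sketch below. I'm assuming the NPM command is run inside the worker role project folder so the files land next to "index.js"; adjust the location to your own layout.

    // install the SDK into the worker role project folder, so that the files under
    // "node_modules\azure" can be included in the project and set to "Copy always":
    //     npm install azure
    var azure = require("azure");
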
You can use my "Copy all always" tool mentioned in my last post to update the current worker role project file. You can also find the source code of this tool here.

image

The source code of the Windows Azure SDK for Node.js can be found on its GitHub page. It contains two parts. One is a CLI tool which provides a cross-platform command line package for Mac and Linux to manage WAWS and Windows Azure Virtual Machines (a.k.a. WAVM). The other is a library for managing and consuming various Windows Azure services, including tables, blobs, queues, service bus and the service runtime. I will not cover all of them, but will only demonstrate how to use tables and service runtime information in this post. You can find the full documentation of this SDK here.

Back in Visual Studio, open "index.js" and let's continue our application from the last post, which was working against Windows Azure SQL Database (a.k.a. WASD). The code should look like this.

   1: var express = require("express");
   2: var sql = require("node-sqlserver");
   3:  
   4: var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:ac6271ya9e.database.windows.net,1433;Database=synctile;Uid=shaunxu@ac6271ya9e;Pwd={PASSWORD};Encrypt=yes;Connection Timeout=30;";
   5: var port = 80;
   6:  
   7: var app = express();
   8:  
   9: app.configure(function () {
  10:     app.use(express.bodyParser());
  11: });
  12:  
  13: app.get("/", function (req, res) {
  14:     sql.open(connectionString, function (err, conn) {
  15:         if (err) {
  16:             console.log(err);
  17:             res.send(500, "Cannot open connection.");
  18:         }
  19:         else {
  20:             conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
  21:                 if (err) {
  22:                     console.log(err);
  23:                     res.send(500, "Cannot retrieve records.");
  24:                 }
  25:                 else {
  26:                     res.json(results);
  27:                 }
  28:             });
  29:         }
  30:     });
  31: });
  32:  
  33: app.get("/text/:key/:culture", function (req, res) {
  34:     sql.open(connectionString, function (err, conn) {
  35:         if (err) {
  36:             console.log(err);
  37:             res.send(500, "Cannot open connection.");
  38:         }
  39:         else {
  40:             var key = req.params.key;
  41:             var culture = req.params.culture;
  42:             var command = "SELECT * FROM [Resource] WHERE [Key] = '" + key + "' AND [Culture] = '" + culture + "'";
  43:             conn.queryRaw(command, function (err, results) {
  44:                 if (err) {
  45:                     console.log(err);
  46:                     res.send(500, "Cannot retrieve records.");
  47:                 }
  48:                 else {
  49:                     res.json(results);
  50:                 }
  51:             });
  52:         }
  53:     });
  54: });
  55:  
  56: app.get("/sproc/:key/:culture", function (req, res) {
  57:     sql.open(connectionString, function (err, conn) {
  58:         if (err) {
  59:             console.log(err);
  60:             res.send(500, "Cannot open connection.");
  61:         }
  62:         else {
  63:             var key = req.params.key;
  64:             var culture = req.params.culture;
  65:             var command = "EXEC GetItem '" + key + "', '" + culture + "'";
  66:             conn.queryRaw(command, function (err, results) {
  67:                 if (err) {
  68:                     console.log(err);
  69:                     res.send(500, "Cannot retrieve records.");
  70:                 }
  71:                 else {
  72:                     res.json(results);
  73:                 }
  74:             });
  75:         }
  76:     });
  77: });
  78:  
  79: app.post("/new", function (req, res) {
  80:     var key = req.body.key;
  81:     var culture = req.body.culture;
  82:     var val = req.body.val;
  83:  
  84:     sql.open(connectionString, function (err, conn) {
  85:         if (err) {
  86:             console.log(err);
  87:             res.send(500, "Cannot open connection.");
  88:         }
  89:         else {
  90:             var command = "INSERT INTO [Resource] VALUES ('" + key + "', '" + culture + "', N'" + val + "')";
  91:             conn.queryRaw(command, function (err, results) {
  92:                 if (err) {
  93:                     console.log(err);
  94:                     res.send(500, "Cannot retrieve records.");
  95:                 }
  96:                 else {
  97:                     res.send(200, "Inserted Successful");
  98:                 }
  99:             });
 100:         }
 101:     });
 102: });
 103:  
 104: app.listen(port);

Now let's create a new function that copies the records from WASD to the table service. It will:

1. Delete the table named "resource".

2. Create a new table named "resource". These two steps ensure that we start with an empty table.

3. Load all records from the "resource" table in WASD.

4. For each record loaded from WASD, insert it into the table one by one.

5. Notify the user when finished.

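As a roadmap, the function we are about to build has roughly the shape sketched below. This is only a skeleton of the steps above; the real implementation, with error handling, is built up step by step in the rest of this section.

    app.get("/was/init", function (req, res) {
        // 1 & 2: recreate the "resource" table in the table service
        // 3: load all records from the "resource" table in WASD
        // 4: insert each record into the table service as an entity
        // 5: report the result back to the browser when everything has finished
    });
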
In order to use the table service we need the storage account name and key, which can be found in the developer portal. Just select the storage account and click the Manage Keys button.

image

Then create two local variables in our Node.js application for the storage account name and key. Since we need to use WAS we need to import the azure module. I also created another variable to store the table name.

In order to work with the table service I need to create a storage client for it. This is very similar to the Windows Azure SDK for .NET. In the code below I created a new variable named "client" through "createTableService", specifying my storage account name and key.

   1: var azure = require("azure");
   2: var storageAccountName = "synctile";
   3: var storageAccountKey = "/cOy9L7xysXOgPYU9FjDvjrRAhaMX/5tnOpcjqloPNDJYucbgTy7MOrAW7CbUg6PjaDdmyl+6pkwUnKETsPVNw==";
   4: var tableName = "resource";
   5: var client = azure.createTableService(storageAccountName, storageAccountKey);

Now create a new function for the URL "/was/init" so that we can trigger it through the browser. In this function we will first load all records from WASD.

   1: app.get("/was/init", function (req, res) {
   2:     // load all records from windows azure sql database
   3:     sql.open(connectionString, function (err, conn) {
   4:         if (err) {
   5:             console.log(err);
   6:             res.send(500, "Cannot open connection.");
   7:         }
   8:         else {
   9:             conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
  10:                 if (err) {
  11:                     console.log(err);
  12:                     res.send(500, "Cannot retrieve records.");
  13:                 }
  14:                 else {
  15:                     if (results.rows.length > 0) {
  16:                         // begin to transform the records into table service
  17:                     }
  18:                 }
  19:             });
  20:         }
  21:     });
  22: });

Once we have successfully loaded all records we can start to transform them into the table service. First I need to recreate the table in the table service. This can be done by deleting and re-creating the table through the table client I had just created.

   1: app.get("/was/init", function (req, res) {
   2:     // load all records from windows azure sql database
   3:     sql.open(connectionString, function (err, conn) {
   4:         if (err) {
   5:             console.log(err);
   6:             res.send(500, "Cannot open connection.");
   7:         }
   8:         else {
   9:             conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
  10:                 if (err) {
  11:                     console.log(err);
  12:                     res.send(500, "Cannot retrieve records.");
  13:                 }
  14:                 else {
  15:                     if (results.rows.length > 0) {
  16:                         // begin to transform the records into table service
  17:                         // recreate the table named 'resource'
  18:                         client.deleteTable(tableName, function (error) {
  19:                             client.createTableIfNotExists(tableName, function (error) {
  20:                                 if (error) {
  21:                                     error["target"] = "createTableIfNotExists";
  22:                                     res.send(500, error);
  23:                                 }
  24:                                 else {
  25:                                     // transform the records
  26:                                 }
  27:                             });
  28:                         });
  29:                     }
  30:                 }
  31:             });
  32:         }
  33:     });
  34: });

As you can see, the azure SDK provides its methods in the callback pattern. In fact, almost all modules in Node.js use the callback pattern.

For example, when I delete a table I invoke the "deleteTable" method, providing the name of the table and a callback function which will be executed once the table has been deleted or the operation has failed. Under the hood, the azure module performs the table deletion asynchronously, and once it's done the callback function is invoked. This is the reason we need to nest the table creation code inside the deletion callback. If we put the table creation code after the deletion call, the two operations would run in parallel.

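To make the ordering point concrete, here is a minimal sketch with a hypothetical asynchronous operation named "doSomethingAsync" (it is not part of the azure SDK, just an illustration of the callback pattern):

    // a hypothetical async operation following the Node.js callback pattern
    function doSomethingAsync(name, callback) {
        setTimeout(function () {
            console.log(name + " finished");
            callback(null);
        }, 100);
    }

    // sequential: "second" starts only after "first" has completed
    doSomethingAsync("first", function () {
        doSomethingAsync("second", function () { });
    });

    // parallel: both operations are started immediately, one right after the other
    doSomethingAsync("first", function () { });
    doSomethingAsync("second", function () { });
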
Next, for each record from WASD I create an entity and insert it into the table service. Finally I send the response to the browser.

Can you find a bug in the code below? I will describe it later in this post.

   1: app.get("/was/init", function (req, res) {
   2:     // load all records from windows azure sql database
   3:     sql.open(connectionString, function (err, conn) {
   4:         if (err) {
   5:             console.log(err);
   6:             res.send(500, "Cannot open connection.");
   7:         }
   8:         else {
   9:             conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
  10:                 if (err) {
  11:                     console.log(err);
  12:                     res.send(500, "Cannot retrieve records.");
  13:                 }
  14:                 else {
  15:                     if (results.rows.length > 0) {
  16:                         // begin to transform the records into table service
  17:                         // recreate the table named 'resource'
  18:                         client.deleteTable(tableName, function (error) {
  19:                             client.createTableIfNotExists(tableName, function (error) {
  20:                                 if (error) {
  21:                                     error["target"] = "createTableIfNotExists";
  22:                                     res.send(500, error);
  23:                                 }
  24:                                 else {
  25:                                     // transform the records
  26:                                     for (var i = 0; i < results.rows.length; i++) {
  27:                                         var entity = {
  28:                                             "PartitionKey": results.rows[i][1],
  29:                                             "RowKey": results.rows[i][0],
  30:                                             "Value": results.rows[i][2]
  31:                                         };
  32:                                         client.insertEntity(tableName, entity, function (error) {
  33:                                             if (error) {
  34:                                                 error["target"] = "insertEntity";
  35:                                                 res.send(500, error);
  36:                                             }
  37:                                             else {
  38:                                                 console.log("entity inserted");
  39:                                             }
  40:                                         });
  41:                                     }
  42:                                     // send the response
  43:                                     console.log("all done");
  44:                                     res.send(200, "All done!");
  45:                                 }
  46:                             });
  47:                         });
  48:                     }
  49:                 }
  50:             });
  51:         }
  52:     });
  53: });

Now we can publish it to the cloud and give it a try. But normally we'd better test it on the local emulator first. In the Node.js SDK there are three built-in properties which provide the account name, key and host address of the local storage emulator. We can use them to initialize our table service client. We also need to change the SQL connection string to point to my local database. The code changes are shown below.

   1: // windows azure sql database
   2: //var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:ac6271ya9e.database.windows.net,1433;Database=synctile;Uid=shaunxu@ac6271ya9e;Pwd={PASSWORD};Encrypt=yes;Connection Timeout=30;";
   3: // sql server
   4: var connectionString = "Driver={SQL Server Native Client 11.0};Server={.};Database={Caspar};Trusted_Connection={Yes};";
   5:  
   6: var azure = require("azure");
   7: var storageAccountName = "synctile";
   8: var storageAccountKey = "/cOy9L7xysXOgPYU9FjDvjrRAhaMX/5tnOpcjqloPNDJYucbgTy7MOrAW7CbUg6PjaDdmyl+6pkwUnKETsPVNw==";
   9: var tableName = "resource";
  10: // windows azure storage
  11: //var client = azure.createTableService(storageAccountName, storageAccountKey);
  12: // local storage emulator
  13: var client = azure.createTableService(azure.ServiceClient.DEVSTORE_STORAGE_ACCOUNT, azure.ServiceClient.DEVSTORE_STORAGE_ACCESS_KEY, azure.ServiceClient.DEVSTORE_TABLE_HOST);

Now let's run the application and navigate to "localhost:12345/was/init", as I hosted it on port 12345. We can see that it transferred the data from my local database to the local table service.

image

Everything looks fine. But there is a bug in my code. If we have a look at the Node.js command window we will find that it sent the response before all records had been inserted, which is not what I expected.

image

The reason is that, as I mentioned before, Node.js performs all IO operations in a non-blocking model. When we inserted the records, the table service insert calls were executed in parallel, and the operation of sending the response was also executed in parallel, even though I wrote it at the end of my logic.

image

The correct logic should be: once all entities have been copied to the table service with no error, send the response to the browser; otherwise send the error message to the browser. To do so I need to import another module named "async", which helps us coordinate our asynchronous code.

Install the module and import it at the beginning of the code. Then we can use its "forEach" method for the asynchronous code that inserts the table entities. The first argument of "forEach" is the array to iterate over. The second argument is the operation to perform on each item in the array. And the third argument is a callback invoked once all items have been processed or any error has occurred. Here we can send our response to the browser.

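The "async" module is installed through NPM like any other module and must be required at the top of "index.js". A minimal sketch, assuming the same worker role project layout used so far:

    // install into the worker role project folder, include the files in the
    // project and set them to "Copy always", just like the other modules:
    //     npm install async
    var async = require("async");
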
   1: app.get("/was/init", function (req, res) {
   2:     // load all records from windows azure sql database
   3:     sql.open(connectionString, function (err, conn) {
   4:         if (err) {
   5:             console.log(err);
   6:             res.send(500, "Cannot open connection.");
   7:         }
   8:         else {
   9:             conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
  10:                 if (err) {
  11:                     console.log(err);
  12:                     res.send(500, "Cannot retrieve records.");
  13:                 }
  14:                 else {
  15:                     if (results.rows.length > 0) {
  16:                         // begin to transform the records into table service
  17:                         // recreate the table named 'resource'
  18:                         client.deleteTable(tableName, function (error) {
  19:                             client.createTableIfNotExists(tableName, function (error) {
  20:                                 if (error) {
  21:                                     error["target"] = "createTableIfNotExists";
  22:                                     res.send(500, error);
  23:                                 }
  24:                                 else {
  25:                                     async.forEach(results.rows,
  26:                                         // transform the records
  27:                                         function (row, callback) {
  28:                                             var entity = {
  29:                                                 "PartitionKey": row[1],
  30:                                                 "RowKey": row[0],
  31:                                                 "Value": row[2]
  32:                                             };
  33:                                             client.insertEntity(tableName, entity, function (error) {
  34:                                                 if (error) {
  35:                                                     callback(error);
  36:                                                 }
  37:                                                 else {
  38:                                                     console.log("entity inserted.");
  39:                                                     callback(null);
  40:                                                 }
  41:                                             });
  42:                                         },
  43:                                         // send reponse
  44:                                         function (error) {
  45:                                             if (error) {
  46:                                                 error["target"] = "insertEntity";
  47:                                                 res.send(500, error);
  48:                                             }
  49:                                             else {
  50:                                                 console.log("all done");
  51:                                                 res.send(200, "All done!");
  52:                                             }
  53:                                         }
  54:                                     );
  55:                                 }
  56:                             });
  57:                         });
  58:                     }
  59:                 }
  60:             });
  61:         }
  62:     });
  63: });

Run it locally and now we can see the response is sent only after all entities have been inserted.

image

Querying entities against the table service is simple as well. Just use the "queryEntity" method of the table service client, providing the partition key and row key. We can also provide more complex query criteria, for example the code here; a small sketch of that follows the example below.

In the code below I query an entity by partition key and row key, and return the proper localization value in the response.

   1: app.get("/was/:key/:culture", function (req, res) {
   2:     var key = req.params.key;
   3:     var culture = req.params.culture;
   4:     client.queryEntity(tableName, culture, key, function (error, entity) {
   5:         if (error) {
   6:             res.send(500, error);
   7:         }
   8:         else {
   9:             res.json(entity);
  10:         }
  11:     });
  12: });

And then I tested it on the local emulator.

image
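
As for the more complex criteria mentioned above, the SDK also offers a query builder that can be passed to "queryEntities". The sketch below reflects my understanding of its shape; the method and builder names should be checked against the SDK documentation before use.

    // query all entities under one partition (one culture) instead of a single key lookup
    var query = azure.TableQuery
        .select()
        .from(tableName)
        .where("PartitionKey eq ?", "en-US");
    client.queryEntities(query, function (error, entities) {
        if (error) {
            console.log(error);
        }
        else {
            console.log("%d entities found", entities.length);
        }
    });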

Finally, if we want to publish this application to the cloud, we should change the database connection string and the storage account accordingly.

For more information about how to consume the blob and queue services, as well as the service bus, please refer to the MSDN page.

 

Consume Service Runtime

As I mentioned above, before we publish our application to the cloud we need to change the connection string and account information in our code. But if you have played with WACS you should know that the service runtime provides the ability to retrieve configuration settings, endpoints and local resource information at runtime. That means we can have these values defined in the CSCFG and CSDEF files and let the runtime retrieve the proper values.

For example, we can add some role settings through the property window of the role, specifying the connection string and storage account for both the cloud and local environments.

image

And we can also use the endpoint defined in the role environment in our Node.js application.

image

In the Node.js SDK we can get an object from "azure.RoleEnvironment" which provides the functionality to retrieve the configuration settings, endpoints, etc. In the code below I define the connection string variables and then use the SDK to retrieve them and initialize the table client.

   1: var connectionString = "";
   2: var storageAccountName = "";
   3: var storageAccountKey = "";
   4: var tableName = "";
   5: var client;
   6:  
   7: azure.RoleEnvironment.getConfigurationSettings(function (error, settings) {
   8:     if (error) {
   9:         console.log("ERROR: getConfigurationSettings");
  10:         console.log(JSON.stringify(error));
  11:     }
  12:     else {
  13:         console.log(JSON.stringify(settings));
  14:         connectionString = settings["SqlConnectionString"];
  15:         storageAccountName = settings["StorageAccountName"];
  16:         storageAccountKey = settings["StorageAccountKey"];
  17:         tableName = settings["TableName"];
  18:  
  19:         console.log("connectionString = %s", connectionString);
  20:         console.log("storageAccountName = %s", storageAccountName);
  21:         console.log("storageAccountKey = %s", storageAccountKey);
  22:         console.log("tableName = %s", tableName);
  23:  
  24:         client = azure.createTableService(storageAccountName, storageAccountKey);
  25:     }
  26: });

In this way we don't need to amend the code when switching configurations between the local and cloud environments, since the service runtime takes care of it.

At the end of the code we also make the application listen on the port retrieved from the SDK.

   1: azure.RoleEnvironment.getCurrentRoleInstance(function (error, instance) {
   2:     if (error) {
   3:         console.log("ERROR: getCurrentRoleInstance");
   4:         console.log(JSON.stringify(error));
   5:     }
   6:     else {
   7:         console.log(JSON.stringify(instance));
   8:         if (instance["endpoints"] && instance["endpoints"]["nodejs"]) {
   9:             var endpoint = instance["endpoints"]["nodejs"];
  10:             app.listen(endpoint["port"]);
  11:         }
  12:         else {
  13:             app.listen(8080);
  14:         }
  15:     }
  16: });

But if we test the application right now we will find that it cannot retrieve any values from the service runtime. This is because, by default, the entry point of this role is the worker role class. In the Windows Azure environment the service runtime opens a named pipe to the entry point process, so that it can connect to the runtime and retrieve values. But in this case, since the entry point is the worker role class and Node.js is launched inside the role, the named pipe is established between our worker role class and the service runtime, so our Node.js application cannot use it.

image

To fix this problem we need to open the CSDEF file under the azure project and add a new element named Runtime, then add an element named EntryPoint which specifies the Node.js command line. That way the Node.js application has the connection to the service runtime and is able to read the configuration.

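The resulting fragment of the CSDEF file looks roughly like the sketch below. The role name matches the worker role used earlier in this series, and the relative path to "index.js" is my assumption about the package layout; verify both against your own project.

    <WorkerRole name="NodejsHost" vmsize="Small">
      <Runtime>
        <EntryPoint>
          <ProgramEntryPoint commandLine="node.exe .\index.js" setReadyOnProcessStart="true" />
        </EntryPoint>
      </Runtime>
      <!-- endpoints and configuration settings omitted -->
    </WorkerRole>
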
image

Starting Node.js in the local emulator, we can see that it retrieved the local connection string and storage account.

image

And if we publish our application to azure, it works with WASD and the storage service through the cloud configuration.

image

 

Summary

In this post I demonstrated how to use the Windows Azure SDK for Node.js to interact with the storage service, especially the table service. I also demonstrated how to use the WACS service runtime to retrieve the configuration settings and endpoint information. In order to make the service runtime available to my Node.js application, I needed to create an entry point element in the CSDEF file and set "node.exe" as the entry point.

I have used five posts to introduce and demonstrate how to run a Node.js application on the Windows platform, and how to use Windows Azure Web Site and a Windows Azure Cloud Service worker role to host a Node.js application. I also described how to work with other services provided by the Windows Azure platform through the Windows Azure SDK for Node.js.

Node.js is a very young network application platform. But since it's simple, easy to learn and deploy, and uses a single-threaded non-blocking IO model, Node.js has become more and more popular for web application and web service development, especially for IO-intensive projects. And as Node.js is very good at scaling out, it's all the more useful on a cloud computing platform.

Using Node.js on the Windows platform is new, too. The modules for SQL database and the Windows Azure SDK are still under development and enhancement. "node-sqlserver" does not support SQL parameters yet, and "azure" does not support using a storage connection string to create the storage client. But Microsoft is working on making them easier to use and on adding more features and functionality.

 

PS, you can download the source code here. You can download the source code of my “Copy all always” tool here.

 

Hope this helps,

Shaun

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

 

Microsoft has just announced an update for Windows Azure Web Site (a.k.a. WAWS). There are four major features added to WAWS: free scaling mode, GitHub integration, custom domains and multiple branches.

Since I'm working in Node.js and would like to have my code in GitHub and deployed automatically to my Windows Azure Web Site whenever I sync it, this feature is very good news to me.

 

It's very simple to set up GitHub integration in WAWS. First we need a clean WAWS. On its dashboard page click "Set up Git publishing".

image

Currently WAWS doesn't support changing the publishing setting. So if you have an existing WAWS that publishes via TFS or local Git, you have to create a new WAWS and set up Git publishing on it.

Then on the deployment page we can see that WAWS now supports three Git publishing modes:

- Push my local files to Windows Azure: In this mode we create a new Git repository on the local machine, commit, and publish our code to Windows Azure through the Git command line or some GUI.

- Deploy from my GitHub project: In this mode we have a Git repository created on GitHub. Once we push our code to GitHub, Windows Azure will download it and trigger a new deployment.

- Deploy from my CodePlex project: Similar to the previous one, but our code lives in a CodePlex repository.

 

Now let's go back to GitHub and create a new repository to publish from.

Currently WAWS GitHub integration only supports public repositories. Support for private repositories will be available in several weeks.

We can manage our repositories on the GitHub website. But as a Windows geek I prefer the GUI tool. So I opened GitHub for Windows, logged in with my GitHub account, selected the "github" category and clicked the "add" button to create a new repository on GitHub.

image

You can download the GitHub for Windows here.

I specified the repository name, description and local path, and did not check "Keep this code private". After a few seconds it created a new repository on GitHub and associated it with that folder on my local machine.

image

We can find this new repository on the GitHub website. And in GitHub for Windows we can also find the local repository by selecting the "local" category.

image

 

Next, we need to associate this repository with our WAWS. Back in the Windows Azure developer portal, open "Deploy from my GitHub project" on the deployment page and click the "Authorize Windows Azure" link. It will bring up a new window on GitHub asking me to allow the Windows Azure application to access my repositories.

image

After I clicked "Allow", Windows Azure retrieved all my public GitHub repositories and let me select which one I want to integrate with this WAWS. I selected the one I had just created in GitHub for Windows.

image

So that's all; we have completed the GitHub integration configuration. Now let's give it a try. In GitHub for Windows, right-click this local repository and click "open in explorer". Then I added a simple HTML file.

   1: <html>
   2:     <head>
   3:     </head>
   4:     <body>
   5:         <h1>
   6:             I came from GitHub, WOW!
   7:         </h1>
   8:     </body>
   9: </html>

Save it, go back to GitHub for Windows, commit this change and publish. This will upload our changes to GitHub, and Windows Azure will detect the update and trigger a new deployment.

If we go back to the azure developer portal we can find the new deployment. Our commit message is shown as the deployment description as well.

image

And here is the page deployed to WAWS.

image

 

Hope this helps,

Shaun

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

 

In my previous post I demonstrated how to develop and deploy a Node.js application on Windows Azure Web Site (a.k.a. WAWS). WAWS is a new feature of the Windows Azure platform. It's low-cost, and it provides the IIS and IISNode components so that we can host our Node.js application through Git, FTP and WebMatrix without any configuration or component installation.

But sometimes we need to use Windows Azure Cloud Service (a.k.a. WACS) and host our Node.js application in a worker role. Below are some benefits of using a worker role.

- WAWS leverages IIS and IISNode to host the Node.js application, which runs in x86 WOW mode. This reduces performance compared with x64 in some cases.

- A WACS worker role does not need IIS, hence there are no IIS restrictions, such as the 8,000 concurrent request limitation.

- WACS provides more flexibility and control to developers. For example, we can RDP into the virtual machines of our worker role instances.

- WACS provides service configuration settings which can be changed while the role is running.

- WACS provides more scaling capability than WAWS. In WAWS we can have at most 3 reserved instances per web site, while in WACS we can have up to 20 instances in a subscription.

- Since with a WACS worker role we start node ourselves in a process, we can control the input, output and error streams. We can also control the version of Node.js.

 

Run Node.js in Worker Role

Node.js can be started with just its executable file. This means that in Windows Azure we can have a worker role containing "node.exe" and the Node.js source files, and then start it in the Run method of the worker role entry class.

Let's create a new Windows Azure project in Visual Studio and add a new worker role. Since we need our worker role to execute "node.exe" with our application code, we need to add "node.exe" to our project. Right-click the worker role project and add an existing item. By default Node.js is installed in the "Program Files\nodejs" folder, so we can navigate there and add "node.exe".

Then we need to create the Node.js entry code. In WAWS the entry file must be named "server.js", because the site is hosted by IIS and IISNode, and IISNode only accepts "server.js". But here, as we control everything, we can choose any file as the entry code. For example, I created a new JavaScript file named "index.js" in the project root.

Since we created a C# Windows Azure project we cannot create a JavaScript file from the "Add new item" context menu. We have to create a text file and then rename it with the JavaScript extension.

After we add these two files we should set their "Copy to Output Directory" property to "Copy Always" or "Copy if Newer". Otherwise they will not be included in the package when deployed.

image

Let's paste some very simple Node.js code into "index.js" as below. As you can see, I created a web server listening on port 12345.

   1: var http = require("http");
   2: var port = 12345;
   3:  
   4: http.createServer(function (req, res) {
   5:     res.writeHead(200, { "Content-Type": "text/plain" });
   6:     res.end("Hello World\n");
   7: }).listen(port);
   8:  
   9: console.log("Server running at port %d", port);

Then we need to start "node.exe" with this file when our worker role starts. This can be done in its Run method. I locate "node.exe" and the entry JavaScript file, and then create a new process to run them. Our worker role then waits for the process to exit. If everything is OK, once our web server has started the process will stay there listening for incoming requests and should not terminate. The code in the worker role would look like this.

   1: public override void Run()
   2: {
   3:     // This is a sample worker implementation. Replace with your logic.
   4:     Trace.WriteLine("NodejsHost entry point called", "Information");
   5:  
   6:     // retrieve the node.exe and entry node.js source code file name.
   7:     var node = Environment.ExpandEnvironmentVariables(@"%RoleRoot%\approot\node.exe");
   8:     var js = "index.js";
   9:  
  10:     // prepare the process starting of node.exe
  11:     var info = new ProcessStartInfo(node, js)
  12:     {
  13:         CreateNoWindow = false,
  14:         ErrorDialog = true,
  15:         WindowStyle = ProcessWindowStyle.Normal,
  16:         UseShellExecute = false,
  17:         WorkingDirectory = Environment.ExpandEnvironmentVariables(@"%RoleRoot%\approot")
  18:     };
  19:     Trace.WriteLine(string.Format("{0} {1}", node, js), "Information");
  20:  
  21:     // start the node.exe with entry code and wait for exit
  22:     var process = Process.Start(info);
  23:     process.WaitForExit();
  24: }

Then we can run it locally. In the compute emulator UI the worker role started and executed Node.js, and the Node.js window appeared.

image

Open the browser to verify the website hosted by our worker role.

image

Next let's deploy it to azure. But we need some additional steps. First, we need to create an input endpoint. By default there is no endpoint defined on a worker role. So we open the role property window in Visual Studio and create a new TCP input endpoint on the port we want our website to use. In this case I will use 80.

Even though we created a web server, we should add a TCP endpoint to the worker role, since Node.js always listens on TCP rather than HTTP.

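In the CSDEF file this shows up as an InputEndpoint element along the lines of the sketch below; the endpoint name "nodejs" matches the one looked up in the service runtime example earlier in this document, and the exact attributes should be checked against the CSDEF that Visual Studio generates.

    <Endpoints>
      <InputEndpoint name="nodejs" protocol="tcp" port="80" />
    </Endpoints>
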
image

Then change "index.js" to let our web server listen on port 80.

   1: var http = require("http");
   2: var port = 80;
   3:  
   4: http.createServer(function (req, res) {
   5:     res.writeHead(200, { "Content-Type": "text/plain" });
   6:     res.end("Hello World\n");
   7: }).listen(port);
   8:  
   9: console.log("Server running at port %d", port);

Then publish it to Windows Azure.

image

Then in the browser we can see our Node.js website running in a WACS worker role.

image

We may encounter an error if we try to run our Node.js website on port 80 in the local emulator. This is because the compute emulator has registered port 80 and maps the 80 endpoint to 81, but our Node.js code cannot detect this. So when it tries to listen on 80 it fails, since 80 is already in use.

 

Use NPM Modules

When we use WAWS to host Node.js, we can simply install the modules we need and then just publish or upload all files to WAWS. But if we are using a WACS worker role, we have to take some extra steps to make the modules work.

Assume that we plan to use "express" in our application. First of all we should download and install this module through the NPM command. But after the install finishes, the files are just on disk and not included in the worker role project. If we deploy the worker role right now the module will not be packaged and uploaded to azure. Hence we need to add the files to the project. In the Solution Explorer window click the "Show all files" button, select the "node_modules" folder and in the context menu select "Include In Project".

image

But that's not enough. We also need to set all files in this module to "Copy always" or "Copy if newer", so that they are uploaded to azure along with "node.exe" and "index.js". This is a painful step since there might be many files in a module. So I created a small tool which updates a C# project file and marks all its items as "Copy always". The code is very simple.

   1: static void Main(string[] args)
   2: {
   3:     if (args.Length < 1)
   4:     {
   5:         Console.WriteLine("Usage: copyallalways [project file]");
   6:         return;
   7:     }
   8:  
   9:     var proj = args[0];
  10:     File.Copy(proj, string.Format("{0}.bak", proj));
  11:  
  12:     var xml = new XmlDocument();
  13:     xml.Load(proj);
  14:     var nsManager = new XmlNamespaceManager(xml.NameTable);
  15:     nsManager.AddNamespace("pf", "http://schemas.microsoft.com/developer/msbuild/2003");
  16:  
  17:     // add the output setting to copy always
  18:     var contentNodes = xml.SelectNodes("//pf:Project/pf:ItemGroup/pf:Content", nsManager);
  19:     UpdateNodes(contentNodes, xml, nsManager);
  20:     var noneNodes = xml.SelectNodes("//pf:Project/pf:ItemGroup/pf:None", nsManager);
  21:     UpdateNodes(noneNodes, xml, nsManager);
  22:     xml.Save(proj);
  23:  
  24:     // remove the namespace attributes
  25:     var content = xml.InnerXml.Replace("<CopyToOutputDirectory xmlns=\"\">", "<CopyToOutputDirectory>");
  26:     xml.LoadXml(content);
  27:     xml.Save(proj);
  28: }
  29:  
  30: static void UpdateNodes(XmlNodeList nodes, XmlDocument xml, XmlNamespaceManager nsManager)
  31: {
  32:     foreach (XmlNode node in nodes)
  33:     {
  34:         var copyToOutputDirectoryNode = node.SelectSingleNode("pf:CopyToOutputDirectory", nsManager);
  35:         if (copyToOutputDirectoryNode == null)
  36:         {
  37:             var n = xml.CreateNode(XmlNodeType.Element, "CopyToOutputDirectory", null);
  38:             n.InnerText = "Always";
  39:             node.AppendChild(n);
  40:         }
  41:         else
  42:         {
  43:             if (string.Compare(copyToOutputDirectoryNode.InnerText, "Always", true) != 0)
  44:             {
  45:                 copyToOutputDirectoryNode.InnerText = "Always";
  46:             }
  47:         }
  48:     }
  49: }

Please be careful when using this tool. I created it only for demo purposes, so do not use it directly in a production environment.

Unload the worker role project and execute this tool with the worker role project file name as the command line argument; it will set all items to "Copy always". Then reload the worker role project.

Now let’s change the “index.js” to use express.

   1: var express = require("express");
   2: var app = express();
   3:  
   4: var port = 80;
   5:  
   6: app.configure(function () {
   7: });
   8:  
   9: app.get("/", function (req, res) {
  10:     res.send("Hello Node.js!");
  11: });
  12:  
  13: app.get("/User/:id", function (req, res) {
  14:     var id = req.params.id;
  15:     res.json({
  16:         "id": id,
  17:         "name": "user " + id,
  18:         "company": "IGT"
  19:     });
  20: });
  21:  
  22: app.listen(port);

Finally, let's publish it and have a look in the browser.

image

 

Use Windows Azure SQL Database

We can use Windows Azure SQL Database (a.k.a. WASD) from Node.js when hosting in a worker role as well. Since we can control the version of Node.js, here we can use the x64 version of "node-sqlserver". This is better than hosting Node.js on WAWS, which only supports x86.

Just install the "node-sqlserver" module from NPM and copy "sqlserver.node" from the "Build\Release" folder to the "Lib" folder. Include the files in the worker role project and run my tool to set them to "Copy always". Finally, update "index.js" to use WASD.

   1: var express = require("express");
   2: var sql = require("node-sqlserver");
   3:  
   4: var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:{SERVER NAME}.database.windows.net,1433;Database={DATABASE NAME};Uid={LOGIN}@{SERVER NAME};Pwd={PASSWORD};Encrypt=yes;Connection Timeout=30;";
   5: var port = 80;
   6:  
   7: var app = express();
   8:  
   9: app.configure(function () {
  10:     app.use(express.bodyParser());
  11: });
  12:  
  13: app.get("/", function (req, res) {
  14:     sql.open(connectionString, function (err, conn) {
  15:         if (err) {
  16:             console.log(err);
  17:             res.send(500, "Cannot open connection.");
  18:         }
  19:         else {
  20:             conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
  21:                 if (err) {
  22:                     console.log(err);
  23:                     res.send(500, "Cannot retrieve records.");
  24:                 }
  25:                 else {
  26:                     res.json(results);
  27:                 }
  28:             });
  29:         }
  30:     });
  31: });
  32:  
  33: app.get("/text/:key/:culture", function (req, res) {
  34:     sql.open(connectionString, function (err, conn) {
  35:         if (err) {
  36:             console.log(err);
  37:             res.send(500, "Cannot open connection.");
  38:         }
  39:         else {
  40:             var key = req.params.key;
  41:             var culture = req.params.culture;
  42:             var command = "SELECT * FROM [Resource] WHERE [Key] = '" + key + "' AND [Culture] = '" + culture + "'";
  43:             conn.queryRaw(command, function (err, results) {
  44:                 if (err) {
  45:                     console.log(err);
  46:                     res.send(500, "Cannot retrieve records.");
  47:                 }
  48:                 else {
  49:                     res.json(results);
  50:                 }
  51:             });
  52:         }
  53:     });
  54: });
  55:  
  56: app.get("/sproc/:key/:culture", function (req, res) {
  57:     sql.open(connectionString, function (err, conn) {
  58:         if (err) {
  59:             console.log(err);
  60:             res.send(500, "Cannot open connection.");
  61:         }
  62:         else {
  63:             var key = req.params.key;
  64:             var culture = req.params.culture;
  65:             var command = "EXEC GetItem '" + key + "', '" + culture + "'";
  66:             conn.queryRaw(command, function (err, results) {
  67:                 if (err) {
  68:                     console.log(err);
  69:                     res.send(500, "Cannot retrieve records.");
  70:                 }
  71:                 else {
  72:                     res.json(results);
  73:                 }
  74:             });
  75:         }
  76:     });
  77: });
  78:  
  79: app.post("/new", function (req, res) {
  80:     var key = req.body.key;
  81:     var culture = req.body.culture;
  82:     var val = req.body.val;
  83:  
  84:     sql.open(connectionString, function (err, conn) {
  85:         if (err) {
  86:             console.log(err);
  87:             res.send(500, "Cannot open connection.");
  88:         }
  89:         else {
  90:             var command = "INSERT INTO [Resource] VALUES ('" + key + "', '" + culture + "', N'" + val + "')";
  91:             conn.queryRaw(command, function (err, results) {
  92:                 if (err) {
  93:                     console.log(err);
  94:                     res.send(500, "Cannot retrieve records.");
  95:                 }
  96:                 else {
  97:                     res.send(200, "Inserted Successful");
  98:                 }
  99:             });
 100:         }
 101:     });
 102: });
 103:  
 104: app.listen(port);

Publish it to azure and now we can see our Node.js application working with WASD through the x64 version of "node-sqlserver".

image

 

Summary

In this post I demonstrated how to host a Node.js application in a Windows Azure Cloud Service worker role. By using a worker role we can control the version of Node.js as well as the entry code, and it's possible to do some preparation work before the Node.js application starts. It also removes the IIS and IISNode limitations. I personally recommend using a worker role for Node.js hosting.

But there are some problems with the approach I mentioned here. The first one is that we need to set all JavaScript files and module files to "Copy always" or "Copy if newer" manually. The second one is that, this way, we cannot retrieve the cloud service configuration information. For example, we defined the endpoint in the worker role properties, but we also hard-coded the listening port in Node.js. Ideally Node.js should retrieve the endpoint, but I can tell you it won't work with this approach.

In the next post I will describe another way to execute "node.exe" and the Node.js application, so that we can get the cloud service configuration in Node.js. I will also demonstrate how to use Windows Azure Storage from Node.js through the Windows Azure Node.js SDK.

 

Hope this helps,

Shaun

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

 

In the first post of this series I introduced how to run a Node.js application on Windows. In the second one I demonstrated how to use SQL Server from Node.js. Those were the steps I took while learning Node.js: first make it work on Windows, then try to use SQL Server. But coming back to my original goal, I need to implement a synchronization service for Wang Tao's worktile, and I planned to have a prototype hosted on Windows Azure. So the previous two posts were preparation. In this post I am going to describe how to host a Node.js website on Windows Azure Web Site.

 

Node.js on Windows Azure Web Site via Git Deployment

Windows Azure Web Site (a.k.a. WAWS) is a new service introduced at the Microsoft Meet Windows Azure event in June. It provides a low-cost and simple way, with built-in templates, to host websites in the cloud. For more information about WAWS, please refer to my previous posts here, here and here.

WAWS is, I think, the simplest way for us to deploy a Node.js website.

- IIS and IISNode are installed and configured in the WAWS environment, so we don't need to install or configure anything when deploying.

- WAWS supports multiple deployment approaches such as TFS, FTP and Git. For Node.js, FTP and Git deployment are very easy and quick.

- WAWS provides three scale modes: free, shared and reserved. The cost is lower than Windows Azure Cloud Service (a.k.a. WACS) if we only need a website with a database.

- WAWS supports Windows Azure SQL Database (a.k.a. WASD) as well as MySQL.

 

Let's have a look at how simple and easy it is to get a Node.js website running on WAWS. First of all, we need to create a new WAWS. Since I will demonstrate how to use WASD from Node.js later in this post, we will create a WASD here as well. Go to the Windows Azure developer portal and from the NEW button select COMPUTE, WEB SITE, CREATE WITH DATABASE.

image

Once the website has been created we go to its dashboard and click the Set up Git publishing link on the right side.

image

After a few seconds Windows Azure will finish the Git deployment configuration. If this is the first time you have configured Git or FTP deployment on Windows Azure, you need to click the Reset deployment credentials link on the dashboard to provide the deployment user name and password.

image

Next, let's clone the Git repository from Windows Azure to our local disk. If you don't have Git installed on your machine you can download it here. After installation, open Git Bash from the Git folder in the start menu. In the command window, navigate to the folder we want to clone into. For example, in my case I'm going to use the folder "D:\Research\nodejs\apps\nodejssample", so I need to go to "D:\Research\nodejs\apps" and make sure the folder "nodejssample" does NOT exist, since Git will create it when it clones.

Back on the developer portal deployment page, copy the Git URL and then execute the Git clone command as shown below. When it asks for a password, specify the password we set in the previous step.

image

Now the repository has been cloned from WAWS to our local disk. Next we will use another GUI tool, GitHub for Windows, to commit and push our changes.

GitHub for Windows is a GUI tool running on Windows for easily controlling repositories on GitHub. We can use this tool to control our repository cloned from WAWS as well. You can download it here.

Open GitHub for Windows, open the folder of the repository we just cloned in File Explorer, and then drag this folder into GitHub for Windows.

image

When we click this repository in the GitHub for Windows window we need to input the credentials, which are the ones we specified in the developer portal. Currently there are no changes in the local folder.

Since WAWS hosts the Node.js application through IIS and IISNode, it will start the JavaScript file named "server.js". So we must create a source file named "server.js", which is the entry point of our website. Create a web server and let it listen on the port that comes from "process.env.PORT".

"process.env.PORT" means it will retrieve the port number from the environment variable named "PORT". Since WAWS hosts all websites behind the Windows Azure router and firewall, this environment variable represents the correct internal port our WAWS is listening on.

   1: var http = require("http");
   2:  
   3: http.createServer(function (req, res) {
   4:     res.writeHead(200, {"Content-Type": "text/plain"});
   5:     res.end("Hello Node.js and Windows Azure Website!\n");
   6: }).listen(process.env.PORT);
   7:  
   8: console.log("Server started.");

Once we save the file and go back to the GitHub for Windows window, we find that it has detected this change. Then we can commit it.

image

Then click the "publish" button at the top of the window. It will publish our changes to the WAWS remote repository, and then WAWS will begin to deploy.

image

Now, back in the Windows Azure developer portal, there will be a new deployment on the deployment page under this website, and we can give it a try.

image

NPM Modules in Windows Azure Web Site

Using NPM modules and deploying them along with our Node.js website is pretty simple, too. All modules we add are located in the "node_modules" subfolder, so no extra effort is needed.

For example, I installed the "express" module in my website through the NPM command. It downloaded all necessary files into the "node_modules" subfolder. Then I changed my code to use "express".

   1: var express = require("express");
   2: var app = express();
   3:  
   4: app.get("/", function(req, res) {
   5:     res.send("Hello Node.js, Express and Windows Azure Web Site.");
   6: });
   7:  
   8: app.get("/Echo/:value", function(req, res) {
   9:     var value = req.params.value;
  10:     res.json({
  11:         "Value" : value,
  12:         "Time" : new Date()
  13:     });
  14: });
  15:  
  16: console.log("Web application opened.");
  17: app.listen(process.env.PORT);

Then go back to GitHub for Windows, commit and sync to the remote repository.

image

Then in Windows Azure we will find the new deployment.

image

And if we refresh the website home page we will find the new content. We can also test the new function we added in this change.

image

 

Work with Windows Azure SQL Database

Let's continue to use the SQL schema and data I mentioned in my previous post, and execute the script below against the SQL Database I created at the beginning.

   1: /****** Object:  Table [dbo].[Resource]    Script Date: 9/4/2012 3:47:14 PM ******/
   2: SET ANSI_NULLS ON
   3: GO
   4: SET QUOTED_IDENTIFIER ON
   5: GO
   6: CREATE TABLE [dbo].[Resource](
   7:     [Key] [varchar](256) NOT NULL,
   8:     [Culture] [varchar](8) NOT NULL,
   9:     [Value] [nvarchar](4000) NOT NULL,
  10:  CONSTRAINT [PK_Resource] PRIMARY KEY CLUSTERED 
  11: (
  12:     [Key] ASC,
  13:     [Culture] ASC
  14: )WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF)
  15: )
  16:  
  17: GO
  18: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Controller_HomeAbout_Message', N'en-US', N'Your app description page.')
  19: GO
  20: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Controller_HomeAbout_Message', N'zh-CN', N'???????')
  21: GO
  22: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Controller_HomeContact_Message', N'en-US', N'Your contact page.')
  23: GO
  24: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Controller_HomeContact_Message', N'zh-CN', N'?????????')
  25: GO
  26: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Controller_HomeIndex_Message', N'en-US', N'Modify this template to jump-start your ASP.NET MVC application.')
  27: GO
  28: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Controller_HomeIndex_Message', N'zh-CN', N'??????,??????ASP.NET MVC?????')
  29: GO
  30: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_Password_Display', N'en-US', N'Password')
  31: GO
  32: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_Password_Display', N'zh-CN', N'??')
  33: GO
  34: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_Password_Required', N'en-US', N'Please input {0}.')
  35: GO
  36: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_Password_Required', N'zh-CN', N'???{0}?')
  37: GO
  38: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_RememberMe_Display', N'en-US', N'Remember me?')
  39: GO
  40: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_RememberMe_Display', N'zh-CN', N'???????')
  41: GO
  42: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_UserName_Display', N'en-US', N'User Name')
  43: GO
  44: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_UserName_Display', N'zh-CN', N'???')
  45: GO
  46: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_UserName_Required', N'en-US', N'Please input the {0}.')
  47: GO
  48: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_UserName_Required', N'zh-CN', N'???{0}?')
  49: GO
  50: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_RegisterModel_ConfirmPassword_Compare', N'en-US', N'The password and confirmation password do not match.')
  51: GO
  52: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_RegisterModel_ConfirmPassword_Compare', N'zh-CN', N'???????????')
  53: GO
  54: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_RegisterModel_ConfirmPassword_Display', N'en-US', N'Confirm password')
  55: GO
  56: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_RegisterModel_ConfirmPassword_Display', N'zh-CN', N'??????')
  57: GO
  58: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_RegisterModel_Password_StringLength', N'en-US', N'The {0} must be at least {2} characters long.')
  59: GO
  60: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_RegisterModel_Password_StringLength', N'zh-CN', N'{0}?????')
  61: GO
  62: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountLogin_ExtenalAccount', N'en-US', N'Use another service to log in.')
  63: GO
  64: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountLogin_ExtenalAccount', N'zh-CN', N'?????????')
  65: GO
  66: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountLogin_LocalAccount', N'en-US', N'Use a local account to log in.')
  67: GO
  68: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountLogin_LocalAccount', N'zh-CN', N'?????????')
  69: GO
  70: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountLogin_RegisterIfNoAccount', N'en-US', N'{0} if you don''t have an account.')
  71: GO
  72: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountLogin_RegisterIfNoAccount', N'zh-CN', N'??????,?{0}?')
  73: GO
  74: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountRegister_Message', N'en-US', N'Create a new account.')
  75: GO
  76: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountRegister_Message', N'zh-CN', N'????????')
  77: GO
  78: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_ExternalLoginsListPartial_MessageInfo', N'en-US', N'There are no external authentication services configured. See <a href="http://go.microsoft.com/fwlink/?LinkId=252166">this article</a> for details on setting up this ASP.NET application to support logging in via external services.')
  79: GO
  80: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_ExternalLoginsListPartial_MessageInfo', N'zh-CN', N'???????????????????ASP.NET?????????????????,????<a href="http://go.microsoft.com/fwlink/?LinkId=252166">??</a>?')
  81: GO
  82: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_ExternalLoginsListPartial_SocialLoginList', N'en-US', N'Log in using another service')
  83: GO
  84: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_ExternalLoginsListPartial_SocialLoginList', N'zh-CN', N'?????????')
  85: GO
  86: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomeAbout_Title', N'en-US', N'About')
  87: GO
  88: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomeAbout_Title', N'zh-CN', N'??')
  89: GO
  90: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomeContact_Title', N'en-US', N'Contact')
  91: GO
  92: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomeContact_Title', N'zh-CN', N'????')
  93: GO
  94: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomeIndex_Title', N'en-US', N'Home Page')
  95: GO
  96: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomeIndex_Title', N'zh-CN', N'??')
  97: GO
  98: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Featured', N'en-US', N'To learn more about ASP.NET MVC visit <a href="http://asp.net/mvc" title="ASP.NET MVC Website">http://asp.net/mvc</a>. The page features <mark>videos, tutorials, and samples</mark> to help you get the most from ASP.NET MVC. If you have any questions about ASP.NET MVC visit <a href="http://forums.asp.net/1146.aspx/1?MVC" title="ASP.NET MVC Forum">our forums</a>.')
  99: GO
 100: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Featured', N'zh-CN', N'???????ASP.NET MVC??????<a href="http://asp.net/mvc" title="ASP.NET MVC??">http://asp.net/mvc</a>??????<mark>??,?????</mark>,??????????ASP.NET MVC???????????ASP.NET MVC???,??????<a href="http://forums.asp.net/1146.aspx/1?MVC" title="ASP.NET MVC??">??</a>?')
 101: GO
 102: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest', N'en-US', N'We suggest the following:')
 103: GO
 104: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest', N'zh-CN', N'????:')
 105: GO
 106: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_1_Title', N'en-US', N'Getting Started')
 107: GO
 108: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_1_Title', N'zh-CN', N'??')
 109: GO
 110: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_1_Val', N'en-US', N'ASP.NET MVC gives you a powerful, patterns-based way to build dynamic websites that enables a clean separation of concerns and that gives you full control over markup for enjoyable, agile development. ASP.NET MVC includes many features that enable fast, TDD-friendly development for creating sophisticated applications that use the latest web standards. <a href="http://go.microsoft.com/fwlink/?LinkId=245151">Learn more...</a>')
 111: GO
 112: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_1_Val', N'zh-CN', N'ASP.NET MVC?????????????????????????,???????????,????,?????????????? ASP.NET MVC???????,????????????,?????Web??,TDD??????<a href="http://go.microsoft.com/fwlink/?LinkId=245151">????…</a>')
 113: GO
 114: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_2_Title', N'en-US', N'Add NuGet packages and jump-start your coding')
 115: GO
 116: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_2_Title', N'zh-CN', N'??NuGet???,??????')
 117: GO
 118: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_2_Val', N'en-US', N'NuGet makes it easy to install and update free libraries and tools. <a href="http://go.microsoft.com/fwlink/?LinkId=245153">Learn more...</a>')
 119: GO
 120: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_2_Val', N'zh-CN', N'NuGet??????????????????????<a href="http://go.microsoft.com/fwlink/?LinkId=245153">????…</a>')
 121: GO
 122: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_3_Title', N'en-US', N'Find Web Hosting')
 123: GO
 124: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_3_Title', N'zh-CN', N'??????')
 125: GO
 126: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_3_Val', N'en-US', N'You can easily find a web hosting company that offers the right mix of features and price for your applications. <a href="http://go.microsoft.com/fwlink/?LinkId=245157">Learn more...</a>')
 127: GO
 128: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_3_Val', N'zh-CN', N'???????????Web????,?????????????????????<a href="http://go.microsoft.com/fwlink/?LinkId=245157">????…</a>')
 129: GO
 130: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_Layout_LogoHere', N'en-US', N'your logo here')
 131: GO
 132: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_Layout_LogoHere', N'zh-CN', N'???????')
 133: GO
 134: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_Layout_Title', N'en-US', N'My ASP.NET MVC Application')
 135: GO
 136: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_Layout_Title', N'zh-CN', N'??ASP.NET MVC????')
 137: GO
 138: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_LoginPartial_Login', N'en-US', N'Log in')
 139: GO
 140: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_LoginPartial_Login', N'zh-CN', N'??')
 141: GO
 142: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_LoginPartial_Logoff', N'en-US', N'Log off')
 143: GO
 144: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_LoginPartial_Logoff', N'zh-CN', N'??')
 145: GO
 146: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_LoginPartial_Register', N'en-US', N'Register')
 147: GO
 148: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_LoginPartial_Register', N'zh-CN', N'??')
 149: GO
 150: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_PageName_About', N'en-US', N'About')
 151: GO
 152: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_PageName_About', N'zh-CN', N'??')
 153: GO
 154: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_PageName_Contact', N'en-US', N'Contact')
 155: GO
 156: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_PageName_Contact', N'zh-CN', N'????')
 157: GO
 158: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_PageName_Home', N'en-US', N'Home')
 159: GO
 160: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_PageName_Home', N'zh-CN', N'??')
 161: GO

Then we need to add the node-sqlserver module to our local repository. As I mentioned in the previous post, node-sqlserver works with both SQL Server and WASD. But there is a problem if we use NPM to install it. As you may remember, when NPM installs node-sqlserver it builds some C++ code through Python. The output binary depends on whether the local machine and Python are x86 or x64. If our local machine is x86 there is no problem. But if it is x64, the resulting node-sqlserver module cannot be used on WAWS, since all WAWS websites are hosted in IIS in x86 WOW mode.

One solution is to switch our development machine to x86, which may not be acceptable since x64 is preferred for other Windows Azure development such as WASD and WACS.

Or we can find an x86 machine to download and compile an x86 version of node-sqlserver for us, then copy it to our working PC.

Or we can download the x86 version of node-sqlserver from Microsoft, named "Microsoft Driver for Node.JS for SQL Server Preview", which can be found here. Then, following its instructions, we can install it and copy the "node-sqlserver" folder under the "node_modules" folder in our repository.

Now we can use source code similar to the previous post to work with WASD; we just need to change the connection string. The code looks like this.

   1: var express = require("express");
   2: var sql = require("node-sqlserver");
   3:  
   4: var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:{YOUR SERVER NAME}.database.windows.net,1433;Database=nodejssample;Uid={YOUR LOGIN}@{YOUR SERVER NAME};Pwd={YOUR PASSWORD};Encrypt=yes;Connection Timeout=30;";
   5: var port = process.env.PORT
   6:  
   7: var app = express();
   8:  
   9: app.configure(function () {
  10:     app.use(express.bodyParser());
  11: });
  12:  
  13: app.get("/", function(req, res) {
  14:     sql.open(connectionString, function(err, conn) {
  15:         if(err) {
  16:             console.log(err);
  17:             res.send(500, "Cannot open connection.");
  18:         }
  19:         else {
  20:             conn.queryRaw("SELECT * FROM [Resource]", function(err, results) {
  21:                 if(err) {
  22:                     console.log(err);
  23:                     res.send(500, "Cannot retrieve records.");
  24:                 }
  25:                 else {
  26:                     res.json(results);
  27:                 }
  28:             });
  29:         }
  30:     });
  31: });
  32:  
  33: app.get("/text/:key/:culture", function(req, res) {
  34:     sql.open(connectionString, function(err, conn) {
  35:         if(err) {
  36:             console.log(err);
  37:             res.send(500, "Cannot open connection.");
  38:         }
  39:         else {
  40:             var key = req.params.key;
  41:             var culture = req.params.culture;
  42:             var command = "SELECT * FROM [Resource] WHERE [Key] = '" + key + "' AND [Culture] = '" + culture + "'";
  43:             conn.queryRaw(command, function(err, results) {
  44:                 if(err) {
  45:                     console.log(err);
  46:                     res.send(500, "Cannot retrieve records.");
  47:                 }
  48:                 else {
  49:                     res.json(results);
  50:                 }
  51:             });
  52:         }
  53:     });
  54: });
  55:  
  56: app.get("/sproc/:key/:culture", function(req, res) {
  57:     sql.open(connectionString, function(err, conn) {
  58:         if(err) {
  59:             console.log(err);
  60:             res.send(500, "Cannot open connection.");
  61:         }
  62:         else {
  63:             var key = req.params.key;
  64:             var culture = req.params.culture;
  65:             var command = "EXEC GetItem '" + key + "', '" + culture + "'";
  66:             conn.queryRaw(command, function(err, results) {
  67:                 if(err) {
  68:                     console.log(err);
  69:                     res.send(500, "Cannot retrieve records.");
  70:                 }
  71:                 else {
  72:                     res.json(results);
  73:                 }
  74:             });
  75:         }
  76:     });
  77: });
  78:  
  79: app.post("/new", function(req, res) {
  80:     var key = req.body.key;
  81:     var culture = req.body.culture;
  82:     var val = req.body.val;
  83:     
  84:     sql.open(connectionString, function(err, conn) {
  85:         if(err) {
  86:             console.log(err);
  87:             res.send(500, "Cannot open connection.");
  88:         }
  89:         else {
  90:             var command = "INSERT INTO [Resource] VALUES ('" + key + "', '" + culture + "', N'" + val + "')";
  91:             conn.queryRaw(command, function(err, results) {
  92:                 if(err) {
  93:                     console.log(err);
  94:                     res.send(500, "Cannot retrieve records.");
  95:                 }
  96:                 else {
  97:                     res.send(200, "Inserted Successful");
  98:                 }
  99:             });
 100:         }
 101:     });
 102: });
 103:  
 104: app.listen(port);

The connection string can be found on our WASD page in the developer portal. In Node.js we need to use the ODBC connection string, and we must make sure to change the password placeholder before we commit and sync. A sketch of one way to keep the real password out of the public repository follows.
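
As a small aside (not from the original post), we could read the password from an environment variable instead of hard-coding it; process.env is built into Node.js, and the variable name WASD_PASSWORD below is just an assumption for illustration.

    // read the password from an environment variable so the real value never lands in the repository
    var password = process.env.WASD_PASSWORD || "{YOUR PASSWORD}"; // WASD_PASSWORD is an assumed name
    var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:{YOUR SERVER NAME}.database.windows.net,1433;Database=nodejssample;Uid={YOUR LOGIN}@{YOUR SERVER NAME};Pwd=" + password + ";Encrypt=yes;Connection Timeout=30;";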

Save the file, then commit and sync it in the GitHub window. Our website will be deployed automatically.

image

If we change our C# console application a little bit, we can test the POST function of our Node.js application hosted on WAWS. The updated C# console code looks like this; only the remote URL was changed.

   1: static void Main(string[] args)
   2: {
   3:     var key = args[0];
   4:     var culture = args[1];
   5:     var val = args[2];
   6:  
   7:     var req = HttpWebRequest.Create("http://nodejssample.azurewebsites.net/new");
   8:     req.ContentType = "application/x-www-form-urlencoded";
   9:     req.Method = WebRequestMethods.Http.Post;
  10:  
  11:     var param = string.Format("key={0}&culture={1}&val={2}", key, culture, val);
  12:     var bytes = System.Text.Encoding.UTF8.GetBytes(param);
  13:     req.ContentLength = bytes.Length;
  14:     using(var stream = req.GetRequestStream())
  15:     {
  16:         stream.Write(bytes, 0, bytes.Length);
  17:     }
  18:  
  19:     var res = req.GetResponse();
  20:     using (var sr = new StreamReader(res.GetResponseStream()))
  21:     {
  22:         Console.WriteLine(sr.ReadToEnd());
  23:     }
  24:     Console.ReadKey();
  25: }

Then run this application to add some records in WASD.

image

Then go back to the browser and find the items we just inserted.

image

 

Summary

In this post I demonstrated how to host a Node.js website on Windows Azure Web Site. It's very simple and easy to deploy using the Git command client and the new GitHub GUI.

I also described how to use Windows Azure SQL Database from Node.js on Windows Azure Web Site. Make sure you import the right version of sqlserver.node. To cover both the development and the cloud environment we can keep two node-sqlserver modules, the x86 version in a folder named "node-sqlserver-x86" and the x64 version in "node-sqlserver-x64". Then we can import the x64 one during local development and change it to "node-sqlserver-x86" before committing and syncing to Azure.
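
A minimal sketch of an alternative to hand-editing the require call, assuming the two folders "node-sqlserver-x64" and "node-sqlserver-x86" described above exist under "node_modules" (this is an illustration, not part of the original solution):

    // process.arch reports "x64" on 64-bit Node.js and "ia32" on the 32-bit build
    // used by WAWS, so the matching binary is picked automatically.
    var sqlModuleName = process.arch === "x64" ? "node-sqlserver-x64" : "node-sqlserver-x86";
    var sql = require(sqlModuleName);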

In Windows Azure there are other services besides SQL Database, such as Storage, Service Bus, etc. Almost all Windows Azure services can be used from Node.js through another module named "azure", which is the Windows Azure Node.js SDK. In the next post I will demonstrate how to host our Node.js application in a Windows Azure Cloud Service worker role, how to consume the Storage service and how to retrieve the Cloud Service configuration.

 

Hope this helps,

Shaun

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

 

In my previous post I demonstrated how to install, configure and run a Node.js application on Windows. In this post I will cover a specific and short topic: how to use SQL Server from Node.js.

 

Install SQL Server Module

In NPM there are many SQL Server modules. From the command window we can use "npm find" to search for modules by a term. For example, below is the result when I searched for "sqlserver".

image

In this post I will describe one of them, the "node-sqlserver" module. This module was built by Microsoft. It uses a library written in C++ as the bridge between the JavaScript API and the underlying SQL Server Native Client. When we download and install this module, it retrieves the source code from NPM and compiles the C++ code with a script written in Python. Hence there are some prerequisite components.

- Node.js: version 0.6.10 or higher. But according to Microsoft it is not yet fully tested on Node.js 0.8.x.

- Python 2.7.x: Python 3.x is reportedly not supported.

- Visual C++ 2010: Installed along with Visual Studio Express 2010 or higher.

- SQL Server Native Client 11.0: Can be found in the SQL Server 2012 Feature Pack.

If we have those installed then we can use NPM to download and install this module.

image

After it is installed there is still one step that needs to be done manually. It seems to be a bug in the module installation package: by default the binary of the SQL client (compiled from the C++ source code) is not placed in the module's library folder, so we need to copy it ourselves.

For example, assuming we installed node-sqlserver under the folder "D:\Research\nodejs\apps", we need to go to "D:\Research\nodejs\apps\node_modules\node-sqlserver\build\Release", find the file named "sqlserver.node" and copy it to "D:\Research\nodejs\apps\node_modules\node-sqlserver\lib".

image

The sqlserver.node binary is compiled against the architecture of the Python we installed, x86 or x64. Basically, if we are using x64 Node.js then we should use x64 Python to get an x64 sqlserver.node. Otherwise Node.js will raise an exception saying it cannot find "sqlserver.node".
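
If in doubt, a quick, illustrative way to check which architecture the installed node.exe is (and therefore which sqlserver.node it needs) is to ask Node.js itself:

    // prints "x64" for 64-bit Node.js and "ia32" for 32-bit; the Python build
    // used to compile sqlserver.node must match this architecture
    console.log("Node.js architecture: " + process.arch);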

 

Querying in Node.js

Here I would like to use the database I introduced in my Caspar project. It contains only one table with three columns, which is very simple. The full schema and data creation script is listed below.

   1: /****** Object:  Table [dbo].[Resource]    Script Date: 9/4/2012 3:47:14 PM ******/
   2: SET ANSI_NULLS ON
   3: GO
   4: SET QUOTED_IDENTIFIER ON
   5: GO
   6: CREATE TABLE [dbo].[Resource](
   7:     [Key] [varchar](256) NOT NULL,
   8:     [Culture] [varchar](8) NOT NULL,
   9:     [Value] [nvarchar](4000) NOT NULL,
  10:  CONSTRAINT [PK_Resource] PRIMARY KEY CLUSTERED 
  11: (
  12:     [Key] ASC,
  13:     [Culture] ASC
  14: )WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF)
  15: )
  16:  
  17: GO
  18: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Controller_HomeAbout_Message', N'en-US', N'Your app description page.')
  19: GO
  20: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Controller_HomeAbout_Message', N'zh-CN', N'???????')
  21: GO
  22: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Controller_HomeContact_Message', N'en-US', N'Your contact page.')
  23: GO
  24: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Controller_HomeContact_Message', N'zh-CN', N'?????????')
  25: GO
  26: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Controller_HomeIndex_Message', N'en-US', N'Modify this template to jump-start your ASP.NET MVC application.')
  27: GO
  28: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Controller_HomeIndex_Message', N'zh-CN', N'??????,??????ASP.NET MVC?????')
  29: GO
  30: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_Password_Display', N'en-US', N'Password')
  31: GO
  32: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_Password_Display', N'zh-CN', N'??')
  33: GO
  34: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_Password_Required', N'en-US', N'Please input {0}.')
  35: GO
  36: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_Password_Required', N'zh-CN', N'???{0}?')
  37: GO
  38: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_RememberMe_Display', N'en-US', N'Remember me?')
  39: GO
  40: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_RememberMe_Display', N'zh-CN', N'???????')
  41: GO
  42: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_UserName_Display', N'en-US', N'User Name')
  43: GO
  44: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_UserName_Display', N'zh-CN', N'???')
  45: GO
  46: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_UserName_Required', N'en-US', N'Please input the {0}.')
  47: GO
  48: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_LoginModel_UserName_Required', N'zh-CN', N'???{0}?')
  49: GO
  50: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_RegisterModel_ConfirmPassword_Compare', N'en-US', N'The password and confirmation password do not match.')
  51: GO
  52: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_RegisterModel_ConfirmPassword_Compare', N'zh-CN', N'???????????')
  53: GO
  54: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_RegisterModel_ConfirmPassword_Display', N'en-US', N'Confirm password')
  55: GO
  56: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_RegisterModel_ConfirmPassword_Display', N'zh-CN', N'??????')
  57: GO
  58: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_RegisterModel_Password_StringLength', N'en-US', N'The {0} must be at least {2} characters long.')
  59: GO
  60: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'Model_AccountModels_RegisterModel_Password_StringLength', N'zh-CN', N'{0}?????')
  61: GO
  62: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountLogin_ExtenalAccount', N'en-US', N'Use another service to log in.')
  63: GO
  64: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountLogin_ExtenalAccount', N'zh-CN', N'?????????')
  65: GO
  66: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountLogin_LocalAccount', N'en-US', N'Use a local account to log in.')
  67: GO
  68: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountLogin_LocalAccount', N'zh-CN', N'?????????')
  69: GO
  70: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountLogin_RegisterIfNoAccount', N'en-US', N'{0} if you don''t have an account.')
  71: GO
  72: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountLogin_RegisterIfNoAccount', N'zh-CN', N'??????,?{0}?')
  73: GO
  74: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountRegister_Message', N'en-US', N'Create a new account.')
  75: GO
  76: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_AccountRegister_Message', N'zh-CN', N'????????')
  77: GO
  78: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_ExternalLoginsListPartial_MessageInfo', N'en-US', N'There are no external authentication services configured. See <a href="http://go.microsoft.com/fwlink/?LinkId=252166">this article</a> for details on setting up this ASP.NET application to support logging in via external services.')
  79: GO
  80: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_ExternalLoginsListPartial_MessageInfo', N'zh-CN', N'???????????????????ASP.NET?????????????????,????<a href="http://go.microsoft.com/fwlink/?LinkId=252166">??</a>?')
  81: GO
  82: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_ExternalLoginsListPartial_SocialLoginList', N'en-US', N'Log in using another service')
  83: GO
  84: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_ExternalLoginsListPartial_SocialLoginList', N'zh-CN', N'?????????')
  85: GO
  86: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomeAbout_Title', N'en-US', N'About')
  87: GO
  88: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomeAbout_Title', N'zh-CN', N'??')
  89: GO
  90: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomeContact_Title', N'en-US', N'Contact')
  91: GO
  92: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomeContact_Title', N'zh-CN', N'????')
  93: GO
  94: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomeIndex_Title', N'en-US', N'Home Page')
  95: GO
  96: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomeIndex_Title', N'zh-CN', N'??')
  97: GO
  98: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Featured', N'en-US', N'To learn more about ASP.NET MVC visit <a href="http://asp.net/mvc" title="ASP.NET MVC Website">http://asp.net/mvc</a>. The page features <mark>videos, tutorials, and samples</mark> to help you get the most from ASP.NET MVC. If you have any questions about ASP.NET MVC visit <a href="http://forums.asp.net/1146.aspx/1?MVC" title="ASP.NET MVC Forum">our forums</a>.')
  99: GO
 100: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Featured', N'zh-CN', N'???????ASP.NET MVC??????<a href="http://asp.net/mvc" title="ASP.NET MVC??">http://asp.net/mvc</a>??????<mark>??,?????</mark>,??????????ASP.NET MVC???????????ASP.NET MVC???,??????<a href="http://forums.asp.net/1146.aspx/1?MVC" title="ASP.NET MVC??">??</a>?')
 101: GO
 102: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest', N'en-US', N'We suggest the following:')
 103: GO
 104: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest', N'zh-CN', N'????:')
 105: GO
 106: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_1_Title', N'en-US', N'Getting Started')
 107: GO
 108: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_1_Title', N'zh-CN', N'??')
 109: GO
 110: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_1_Val', N'en-US', N'ASP.NET MVC gives you a powerful, patterns-based way to build dynamic websites that enables a clean separation of concerns and that gives you full control over markup for enjoyable, agile development. ASP.NET MVC includes many features that enable fast, TDD-friendly development for creating sophisticated applications that use the latest web standards. <a href="http://go.microsoft.com/fwlink/?LinkId=245151">Learn more...</a>')
 111: GO
 112: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_1_Val', N'zh-CN', N'ASP.NET MVC?????????????????????????,???????????,????,?????????????? ASP.NET MVC???????,????????????,?????Web??,TDD??????<a href="http://go.microsoft.com/fwlink/?LinkId=245151">????…</a>')
 113: GO
 114: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_2_Title', N'en-US', N'Add NuGet packages and jump-start your coding')
 115: GO
 116: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_2_Title', N'zh-CN', N'??NuGet???,??????')
 117: GO
 118: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_2_Val', N'en-US', N'NuGet makes it easy to install and update free libraries and tools. <a href="http://go.microsoft.com/fwlink/?LinkId=245153">Learn more...</a>')
 119: GO
 120: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_2_Val', N'zh-CN', N'NuGet??????????????????????<a href="http://go.microsoft.com/fwlink/?LinkId=245153">????…</a>')
 121: GO
 122: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_3_Title', N'en-US', N'Find Web Hosting')
 123: GO
 124: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_3_Title', N'zh-CN', N'??????')
 125: GO
 126: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_3_Val', N'en-US', N'You can easily find a web hosting company that offers the right mix of features and price for your applications. <a href="http://go.microsoft.com/fwlink/?LinkId=245157">Learn more...</a>')
 127: GO
 128: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_HomIndex_Suggest_3_Val', N'zh-CN', N'???????????Web????,?????????????????????<a href="http://go.microsoft.com/fwlink/?LinkId=245157">????…</a>')
 129: GO
 130: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_Layout_LogoHere', N'en-US', N'your logo here')
 131: GO
 132: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_Layout_LogoHere', N'zh-CN', N'???????')
 133: GO
 134: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_Layout_Title', N'en-US', N'My ASP.NET MVC Application')
 135: GO
 136: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_Layout_Title', N'zh-CN', N'??ASP.NET MVC????')
 137: GO
 138: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_LoginPartial_Login', N'en-US', N'Log in')
 139: GO
 140: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_LoginPartial_Login', N'zh-CN', N'??')
 141: GO
 142: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_LoginPartial_Logoff', N'en-US', N'Log off')
 143: GO
 144: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_LoginPartial_Logoff', N'zh-CN', N'??')
 145: GO
 146: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_LoginPartial_Register', N'en-US', N'Register')
 147: GO
 148: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_LoginPartial_Register', N'zh-CN', N'??')
 149: GO
 150: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_PageName_About', N'en-US', N'About')
 151: GO
 152: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_PageName_About', N'zh-CN', N'??')
 153: GO
 154: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_PageName_Contact', N'en-US', N'Contact')
 155: GO
 156: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_PageName_Contact', N'zh-CN', N'????')
 157: GO
 158: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_PageName_Home', N'en-US', N'Home')
 159: GO
 160: INSERT [dbo].[Resource] ([Key], [Culture], [Value]) VALUES (N'View_PageName_Home', N'zh-CN', N'??')
 161: GO

It's very simple to connect to SQL Server through node-sqlserver. First of all, we need to import the module. Then we invoke its "open" function to open the connection. The first parameter is the connection string, while the second parameter is a callback which will be invoked once the connection is opened or fails.

   1: var sql = require("node-sqlserver");
   2:  
   3: var connectionString = "Driver={SQL Server Native Client 11.0};Server={.};Database={Caspar};Trusted_Connection={Yes};";
   4:  
   5: sql.open(connectionString, function(err, conn) {
   6:     if(err) {
   7:         // error handler code
   8:     }
   9:     else {
  10:         // do what you want to do to this database
  11:     }
  12: });

Once we have the connection opened, we can execute any T-SQL command through the "queryRaw" function. This function has two parameters: the first is the T-SQL command, while the second is a callback that will be invoked once the query completes or fails.

   1: sql.open(connectionString, function(err, conn) {
   2:     if(err) {
   3:         // error handler here
   4:     }
   5:     else {
   6:         var command = "{YOUR_COMMAND_HERE}";
   7:         conn.queryRaw(command, function(err, results) {
   8:             if(err) {
   9:                 // error handler here
  10:             }
  11:             else {
  12:                 // results in JSON format here
  13:             }
  14:         });
  15:     }
  16: });

The parameter "results" represents all records in JSON format. It contains two parts: the first, named "meta", holds the metadata of each column of the result; the second, named "rows", is an array containing all records.

For example, below is the query result against the table I just created. The meta part shows that there are three columns in the result: Key, Culture and Value. The rows part is an array of all records.

image

Hence if we need to retrieve data from the result we have to use the rows array. Currently the module doesn't support getting a result value by column name. (A small helper that pairs the rows with the column names is sketched after the code below.)

   1: conn.queryRaw(command, function(err, results) {
   2:     if(err) {
   3:         // error handler here
   4:     }
   5:     else {
   6:         for(var i = 0; i < results.rows.length; i++) {
   7:             console.log(i + ": " + "Key = " + results.rows[i][0]);
   8:             console.log(i + ": " + "Culture = " + results.rows[i][1]);
   9:             console.log(i + ": " + "Value = " + results.rows[i][2]);
  10:         }
  11:     }
  12: });
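
Until the module supports reading values by column name, we can pair each row with the column names reported in the meta part ourselves. A small illustrative helper (not from the original post), assuming each meta entry exposes a "name" property as the screenshot above suggests:

    // turn each [value, value, ...] row into a { columnName: value, ... } object
    function rowsToObjects(results) {
        return results.rows.map(function (row) {
            var record = {};
            results.meta.forEach(function (column, index) {
                record[column.name] = row[index];
            });
            return record;
        });
    }

    // illustrative usage inside the queryRaw callback:
    // console.log(rowsToObjects(results));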

You can find the source code and the wiki of node-sqlserver on its GitHub page. For more information about this module please refer here.

Let's create a simple web server to demonstrate how to use the node-sqlserver module. I created a new Node.js application named "hellosql.js" and installed two modules: express and node-sqlserver. When the user navigates to the home page, the application connects to the database and returns all the records as JSON.

   1: var express = require("express");
   2: var sql = require("node-sqlserver");
   3:  
   4: var app = express();
   5: var connectionString = "Driver={SQL Server Native Client 11.0};Server={.};Database={Caspar};Trusted_Connection={Yes};";
   6:  
   7: app.configure(function () {
   8:     app.use(express.bodyParser());
   9: });
  10:  
  11: app.get("/", function(req, res) {
  12:     sql.open(connectionString, function(err, conn) {
  13:         if(err) {
  14:             console.log(err);
  15:             res.send(500, "Cannot open connection.");
  16:         }
  17:         else {
  18:             conn.queryRaw("SELECT * FROM [Resource]", function(err, results) {
  19:                 if(err) {
  20:                     console.log(err);
  21:                     res.send(500, "Cannot retrieve records.");
  22:                 }
  23:                 else {
  24:                     res.json(results);
  25:                 }
  26:             });
  27:         }
  28:     });
  29: });
  30:  
  31: app.listen(10000);

Since node-sqlserver currently doesn't support SQL parameters, we have to build the command text by string concatenation if we need query criteria (a minimal escaping sketch is shown after the screenshot below). In the code below I added another route where the user can specify the resource key and culture in the URL, and the application will search for the value.

   1: app.get("/text/:key/:culture", function(req, res) {
   2:     sql.open(connectionString, function(err, conn) {
   3:         if(err) {
   4:             console.log(err);
   5:             res.send(500, "Cannot open connection.");
   6:         }
   7:         else {
   8:             var key = req.params.key;
   9:             var culture = req.params.culture;
  10:             var command = "SELECT * FROM [Resource] WHERE [Key] = '" + key + "' AND [Culture] = '" + culture + "'";
  11:             conn.queryRaw(command, function(err, results) {
  12:                 if(err) {
  13:                     console.log(err);
  14:                     res.send(500, "Cannot retrieve records.");
  15:                 }
  16:                 else {
  17:                     res.json(results);
  18:                 }
  19:             });
  20:         }
  21:     });
  22: });

The result looks like the screenshot below.

image
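
Because the routes above build the T-SQL text by concatenating user input, a malicious key or culture value could inject arbitrary SQL. A minimal mitigation sketch (not from the original post) is to double any single quotes before concatenation; parameterized queries would be the proper fix once the module supports them.

    // double any single quote so the value stays inside the T-SQL string literal
    function escapeSqlLiteral(value) {
        return String(value).replace(/'/g, "''");
    }

    // illustrative usage:
    // var command = "SELECT * FROM [Resource] WHERE [Key] = '" + escapeSqlLiteral(key) +
    //               "' AND [Culture] = '" + escapeSqlLiteral(culture) + "'";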

 

Insert, Update and Delete

As I mentioned, the "queryRaw" function can be used to insert, update and delete rows in the database. In fact we can pass any T-SQL command to this function. For example, we can pass "EXEC sp_GetItem 'MY_KEY', 'en-US'" to invoke a stored procedure.

In the code below I added a route for the user to insert a new record into the database. The user must send the request with the HTTP POST method and specify the values in the request body. The application retrieves the values from the body, builds the SQL command and then invokes the node-sqlserver module to insert the data.

   1: app.post("/new", function(req, res) {
   2:     var key = req.body.key;
   3:     var culture = req.body.culture;
   4:     var val = req.body.val;
   5:     
   6:     sql.open(connectionString, function(err, conn) {
   7:         if(err) {
   8:             console.log(err);
   9:             res.send(500, "Cannot open connection.");
  10:         }
  11:         else {
  12:             var command = "INSERT INTO [Resource] VALUES ('" + key + "', '" + culture + "', '" + val + "')";
  13:             conn.queryRaw(command, function(err, results) {
  14:                 if(err) {
  15:                     console.log(err);
  16:                     res.send(500, "Cannot retrieve records.");
  17:                 }
  18:                 else {
  19:                     res.send(200, "Inserted Successful");
  20:                 }
  21:             });
  22:         }
  23:     });
  24: });

Since we need to use the POST method, this cannot be done directly from the browser address bar, so I created a simple console application in C#.

   1: class Program
   2: {
   3:     static void Main(string[] args)
   4:     {
   5:         var key = args[0];
   6:         var culture = args[1];
   7:         var val = args[2];
   8:  
   9:         var req = HttpWebRequest.Create("http://EPEKW-Q1100423.is.ad.igt.com:10000/new");
  10:         req.ContentType = "application/x-www-form-urlencoded";
  11:         req.Method = WebRequestMethods.Http.Post;
  12:  
  13:         var param = string.Format("key={0}&culture={1}&val={2}", key, culture, val);
  14:         var bytes = System.Text.Encoding.UTF8.GetBytes(param);
  15:         req.ContentLength = bytes.Length;
  16:         using(var stream = req.GetRequestStream())
  17:         {
  18:             stream.Write(bytes, 0, bytes.Length);
  19:         }
  20:  
  21:         var res = req.GetResponse();
  22:         using (var sr = new StreamReader(res.GetResponseStream()))
  23:         {
  24:             Console.WriteLine(sr.ReadToEnd());
  25:         }
  26:         Console.ReadKey();
  27:     }
  28: }

Let's run this console application and specify the values we want to insert into the database.

image

Then, back in the database, we can see the record has been inserted.

image

 

Summary

In this post I introduced a SQL Server Node.js module named "node-sqlserver". This module was created by Microsoft and works with SQL Server as well as Windows Azure SQL Database. It's still in an early preview phase, so the functionality is not complete yet: for example there is no SQL parameter support, no stored procedure support, and no way to read a result value by column name. But Microsoft is working to add more features.

In the next post I will describe how to run a Node.js application on Windows Azure. I will demonstrate how to use Windows Azure Web Site to host our Node.js application and how to connect to Windows Azure SQL Database as well.

 

PS: The full source code below.

   1: var express = require("express");
   2: var sql = require("node-sqlserver");
   3:  
   4: var app = express();
   5: var connectionString = "Driver={SQL Server Native Client 11.0};Server={.};Database={Caspar};Trusted_Connection={Yes};";
   6:  
   7: app.configure(function () {
   8:     app.use(express.bodyParser());
   9: });
  10:  
  11: app.get("/", function(req, res) {
  12:     sql.open(connectionString, function(err, conn) {
  13:         if(err) {
  14:             console.log(err);
  15:             res.send(500, "Cannot open connection.");
  16:         }
  17:         else {
  18:             conn.queryRaw("SELECT * FROM [Resource]", function(err, results) {
  19:                 if(err) {
  20:                     console.log(err);
  21:                     res.send(500, "Cannot retrieve records.");
  22:                 }
  23:                 else {
  24:                     res.json(results);
  25:                 }
  26:             });
  27:         }
  28:     });
  29: });
  30:  
  31: app.get("/text/:key/:culture", function(req, res) {
  32:     sql.open(connectionString, function(err, conn) {
  33:         if(err) {
  34:             console.log(err);
  35:             res.send(500, "Cannot open connection.");
  36:         }
  37:         else {
  38:             var key = req.params.key;
  39:             var culture = req.params.culture;
  40:             var command = "SELECT * FROM [Resource] WHERE [Key] = '" + key + "' AND [Culture] = '" + culture + "'";
  41:             conn.queryRaw(command, function(err, results) {
  42:                 if(err) {
  43:                     console.log(err);
  44:                     res.send(500, "Cannot retrieve records.");
  45:                 }
  46:                 else {
  47:                     res.json(results);
  48:                 }
  49:             });
  50:         }
  51:     });
  52: });
  53:  
  54: app.get("/sproc/:key/:culture", function(req, res) {
  55:     sql.open(connectionString, function(err, conn) {
  56:         if(err) {
  57:             console.log(err);
  58:             res.send(500, "Cannot open connection.");
  59:         }
  60:         else {
  61:             var key = req.params.key;
  62:             var culture = req.params.culture;
  63:             var command = "EXEC GetItem '" + key + "', '" + culture + "'";
  64:             conn.queryRaw(command, function(err, results) {
  65:                 if(err) {
  66:                     console.log(err);
  67:                     res.send(500, "Cannot retrieve records.");
  68:                 }
  69:                 else {
  70:                     res.json(results);
  71:                 }
  72:             });
  73:         }
  74:     });
  75: });
  76:  
  77: app.post("/new", function(req, res) {
  78:     var key = req.body.key;
  79:     var culture = req.body.culture;
  80:     var val = req.body.val;
  81:     
  82:     sql.open(connectionString, function(err, conn) {
  83:         if(err) {
  84:             console.log(err);
  85:             res.send(500, "Cannot open connection.");
  86:         }
  87:         else {
  88:             var command = "INSERT INTO [Resource] VALUES ('" + key + "', '" + culture + "', '" + val + "')";
  89:             conn.queryRaw(command, function(err, results) {
  90:                 if(err) {
  91:                     console.log(err);
  92:                     res.send(500, "Cannot retrieve records.");
  93:                 }
  94:                 else {
  95:                     res.send(200, "Inserted Successful");
  96:                 }
  97:             });
  98:         }
  99:     });
 100: });
 101:  
 102: app.listen(10000);

 

Hope this helps,

Shaun

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

 

Two weeks ago I had a talk with Wang Tao, a C# MVP in China who is currently running his startup company and its product, worktile. He asked me to figure out a synchronization solution that could help his product in the future, and he preferred that I implement the service in Node.js, since worktile itself is written in Node.js.

Even though I have some experience in ASP.NET MVC, HTML, CSS and JavaScript, I don't consider myself a JavaScript expert; in fact I'm quite new to it. So it scared me a bit when he asked me to use Node.js. But after about one week of investigation I have to say Node.js is very easy to learn, use and deploy, even with very limited JavaScript skills. And I think I have fallen in love with Node.js.

Hence I decided to start a series named "Node.js Adventure", where I will share my story of learning and using Node.js on Windows and Windows Azure. This is the first post.

 

(Brief) Introduction of Node.js

I don't want to give a fully detailed introduction of Node.js here; there are many resources on the internet, and the best one is its homepage.

Node.js was created by Ryan Dahl and is sponsored by Joyent. It consists of about 80% C/C++ for the core and 20% JavaScript for the API. It uses CommonJS as its module system, which we will explain later. The official definition of Node.js is:

Node.js is a platform built on Chrome's JavaScript runtime for easily building fast, scalable network applications. Node.js uses an event-driven, non-blocking I/O model that makes it lightweight and efficient, perfect for data-intensive real-time applications that run across distributed devices.

First of all, Node.js uses JavaScript as its development language and runs on top of the V8 engine, which is also used by Chrome. It brings JavaScript, a client-side language, into the backend service world. That's why many people say, even though it's not entirely accurate, that "Node.js is server-side JavaScript".

image

Additionally, Node.js uses an event-driven, non-blocking IO model. This means that in Node.js there's no way to block the current working thread; every IO operation in Node.js executes asynchronously. This is a huge benefit, especially when our code needs IO operations such as reading from disk, connecting to a database, consuming a web service, etc.
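
A tiny illustration of this model using the built-in "fs" module (the file name is just an example): readFile returns immediately, the disk work happens in the background, and the final console.log runs before the callback fires.

    var fs = require("fs");

    fs.readFile("example.txt", "utf8", function (err, data) {
        // this callback runs later, once the asynchronous disk read has finished
        if (err) {
            console.log("read failed: " + err);
        }
        else {
            console.log("file length: " + data.length);
        }
    });

    console.log("readFile has been called, but we did not wait for it.");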

Unlike IIS or Apache, Node.js doesn't use a multi-threaded model. In Node.js there is only one working thread serving all user requests and responses, shown as the ST star in the figure below. There is also a POSIX async thread pool in Node.js which contains many async threads (the AT stars) for IO operations.

image

When a user issues an IO request, the ST serves it but does not perform the IO operation itself. Instead the ST goes to the POSIX async thread pool, picks up an AT, hands the operation to it, and then goes back to serve other requests. The AT actually performs the IO operation asynchronously.

image

Suppose another user arrives before the AT completes the IO operation. The ST serves this new request, picks up another AT from the POSIX pool and then goes back.

image

When the first AT finishes the IO operation it brings the result back and waits for the ST. The ST takes the result, returns the AT to the POSIX pool, and then responds to the user.

image

And when the second AT finishes its job, the ST responds to the second user in the same way.

image

As you can see, in Node.js there is only one thread serving clients' requests and POSIX results. This thread loops between the users and the POSIX pool, passing data back and forth, while the async jobs are handled by POSIX. This is the event-driven, non-blocking IO model.

The performance of this model is much better than the multi-threaded blocking model. For example, Apache is built on the multi-threaded blocking model while Nginx uses the event-driven non-blocking model. Below is a performance comparison between them.

image

And below is the memory usage comparison between them.

image

These charts are captured from the video "NodeJS Basics: An Introductory Training", presented by a Cloud Foundry Developer Advocate.

 

Node.js on Windows

Executing a Node.js application on Windows is very simple. First of all we need to download the latest Node.js platform from its website. After installation, it registers its folder in the system PATH variable so that we can run Node.js from anywhere. To confirm the installation, just open a command window and type "node"; it will show the Node.js console.

image

As you can see, this is an interactive JavaScript console. We can type simple JavaScript code and commands here. To run a Node.js application, just specify the source code file name as the argument of the "node" command.

For example, let's create a Node.js source file named "helloworld.js" and copy the sample code from the Node.js website.

image

   1: var http = require("http");
   2:  
   3: http.createServer(function (req, res) {
   4:     res.writeHead(200, {"Content-Type": "text/plain"});
   5:     res.end("Hello World\n");
   6: }).listen(1337, "127.0.0.1");
   7:  
   8: console.log("Server running at http://127.0.0.1:1337/");

This code creates a web server that listens on port 1337 and returns "Hello World" for every request. Run it in the command window.

image

Then open a browser and navigate to http://localhost:1337/.

image

As you can see, when using Node.js we are not so much creating a web application as creating a web server. We need to deal with the request, the response and the related headers, status codes, etc. This is one of the benefits of using Node.js: it is lightweight and straightforward.
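
To illustrate the point about handling the request and response ourselves, here is a small variant of the hello-world server (illustrative only) that inspects req.url and returns a 404 status code for anything other than the root path:

    var http = require("http");

    http.createServer(function (req, res) {
        if (req.url === "/") {
            res.writeHead(200, { "Content-Type": "text/plain" });
            res.end("Hello World\n");
        }
        else {
            // we are responsible for the status code and headers ourselves
            res.writeHead(404, { "Content-Type": "text/plain" });
            res.end("Not Found\n");
        }
    }).listen(1337, "127.0.0.1");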

But creating a website from scratch again and again is not acceptable. The good news is that Node.js uses CommonJS as its module system, so we can leverage existing modules to simplify our job. Furthermore, there are about ten thousand modules available on the internet, covering almost every area of server-side application development.

 

NPM and Node.js Modules

Node.js uses CommonJS as its module system. A module is a set of JavaScript files. In Node.js, if we have an entry file named "index.js", then all modules it needs are located in the "node_modules" folder, and in "index.js" we can import a module by specifying its name.

For example, in the code we've just created we imported a module named "http", which is a built-in module installed along with Node.js, so that we can use the code in this "http" module.
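
As a small illustration of the CommonJS module system (the file names here are just examples), a local module exposes its API through module.exports, and the entry file pulls it in with require; built-in modules and modules under "node_modules" are required by name, local files by path:

    // greeter.js - a local module
    module.exports.hello = function (name) {
        return "Hello, " + name;
    };

    // index.js - the entry file
    var http = require("http");          // built-in module, required by name
    var greeter = require("./greeter");  // local file, required by relative path
    console.log(greeter.hello("Node.js"));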

image

Besides the built-in modules there are many modules available on the NPM website. Thousands of developers contribute and download modules there. This is another benefit of using Node.js: there are many modules we can use, the number of modules grows very fast, and we can also publish our own modules to the community. When I wrote this post there were 14,608 modules on NPM and about ten thousand downloads per day.

Installing a module is very simple. Let's go back to our command window and enter the command "npm install express". This command installs a module named "express", which is an MVC framework on top of Node.js.

image

Let's create another JavaScript file named "helloweb.js" and copy the code below into it. I imported the "express" module. When the user browses the home page it responds with a text message. If the incoming URL matches "/Echo/:value", where "value" is whatever the user specified, it sends the value back together with the current date and time in JSON format. Finally, the website listens on port 12345.

   1: var express = require("express");
   2: var app = express();
   3:  
   4: app.get("/", function(req, res) {
   5:     res.send("Hello Node.js and Express.");
   6: });
   7:  
   8: app.get("/Echo/:value", function(req, res) {
   9:     var value = req.params.value;
  10:     res.json({
  11:         "Value" : value,
  12:         "Time" : new Date()
  13:     });
  14: });
  15:  
  16: console.log("Web application opened.");
  17: app.listen(12345);

For more information and the API of "express", please have a look here.

Start the application from the command window with "node helloweb.js", then navigate to the home page and we can see the response in the browser.

image

And if we go to, for example http://localhost:12345/Echo/Hello Shaun, we can see the JSON result.

image

The "express" module is very popular on NPM. It makes the job simple when we need to build an MVC website. There are many other useful modules on NPM:

- underscore: A utility module that covers many common functions such as each, map, reduce, select, etc.

- request: A very simple HTTP request client.

- async: A library for coordinating async operations (a short sketch follows this list).

- wind: A library that lets us control flow with plain JavaScript for asynchronous programming (and more) without additional pre-compiling steps.
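
A short sketch of the "async" module mentioned above, assuming it has been installed with "npm install async": async.series runs the steps one after another and collects their results.

    var async = require("async");

    async.series([
        function (callback) {
            // first step; report its result through the callback
            callback(null, "step 1 done");
        },
        function (callback) {
            // second step, started only after the first one has completed
            callback(null, "step 2 done");
        }
    ], function (err, results) {
        // results is ["step 1 done", "step 2 done"] when no error occurred
        console.log(err || results);
    });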

 

Node.js and IIS

I demonstrated how to run a Node.js application from the console. Since we are on Windows, another common question would be, "can I host Node.js in IIS?" The answer is "yes".

Tomasz Janczuk created a project named IISNode, which can be found in his GitHub space here. Scott Hanselman also published a blog post introducing it.

 

Summary

In this post I provided a very brief introduction to Node.js, including its official definition, its architecture and how it implements the event-driven, non-blocking model. Then I described how to install and run a Node.js application from the Windows console. I also described the Node.js module system and the NPM command. At the end I gave some links about IISNode, an IIS extension that allows Node.js applications to run on IIS.

Node.js has become a very popular server-side application platform, especially this year. By leveraging its non-blocking IO model and async features, it is very useful for building highly scalable, asynchronous services. I think Node.js will be widely used in cloud application development in the near future.

 

In the next post I will explain how to use SQL Server from Node.js.

 

Hope this helps,

Shaun

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

 

This is the upgraded version of one of my previous posts, named "Localization in ASP.NET MVC – 3 Days Investigation, 1 Day Job". I updated my solution to the latest ASP.NET MVC 4 and Visual Studio 2012, with some bug fixes.

I also tried to provide some solutions that I mentioned in the original post but didn't implement at that time.

 

Thanks and History

Since I published the original post two years ago it has collected more than 3,000 views and about 55 comments so far. I would like to take this opportunity to thank all the people who read and commented on that post. Your feedback gave me the passion to continue blogging, and it's my great honor.

I wrote that post when I was working at Ethos Technologies on a web application project. At that moment I needed to add a localization feature to an ASP.NET MVC 2 web application.

Since I moved to IGT I have been focusing more on cloud computing, backend service development and architecture, especially on Windows Azure and WCF. Even though I keep an eye on the ASP.NET platform, I haven't spent much time using it. This is the reason I didn't provide enough answers to the questions on that post.

This week my team successfully released a new version of our product, so I would like to go back to the ASP.NET world and use a separate post to upgrade my localization solution. Before reading this post I'd suggest you take several minutes to quickly go through the original one, which should provide some background knowledge.

In this post I'm going to focus on the areas below:

- Migrate the whole solution to the latest ASP.NET MVC 4 and Visual Studio 2012.

- Move the localization code into a separate project so that it can be reused more easily.

- Provide a solution that abstracts the localization resource provider, so that we can put the localization items in any data source such as a database, a configuration file, or of course a resource file.

- Update the language selector helper class so that we can use text, images, scripts, etc.

 

Principle of Localization

The basic principle of localization is to retrieve the localized value from a central localization framework by specifying a localization key. Never put any final localized string directly in the application. For example, when we need to print "Hello World", we should ask the localization framework for the localized "Hello World" by providing the culture and the key.

image

In ASP.NET MVC, all strings in the application, not only in the view pages but also in the controllers, models and backend services, should be referenced by a localization key, and that key is then used to get the final value from the localization framework.

The localization framework is responsible for:

- Helping the ASP.NET MVC application determine the culture from the URL and setting it as the UI culture.

- Figuring out the current culture from the URL and rendering the proper URLs based on that culture.

- Providing an HTML helper to render the language selector section in views.

- Retrieving the localized value based on the localization key.

 

Upgraded to ASP.NET MVC 4 and a Separate Project

We still need to use the routing feature of ASP.NET to help us implement localization. When the application starts we need to register new routes that carry the language information, so that we know which culture the user currently selected.

In ASP.NET MVC 4, all configuration procedures are split into the App_Start folder. So we will add a new static class to our localization project, which should be invoked from the Application_Start method in Global.asax.

In this post there will be one project that contains all the localization code, and an ASP.NET MVC 4 web application that uses this project.

   1: public static void RegisterRoutes(RouteCollection routes)
   2: {
   3:     routes.MapRoute(
   4:         "Account", // Route name
   5:         "Account/{action}", // URL with parameters
   6:         new { controller = "Account", action = "Index" } // Parameter defaults
   7:     );
   8:  
   9:     routes.MapRoute(
  10:         Constants.ROUTE_NAME, // Route name
  11:         string.Format("{{{0}}}/{{controller}}/{{action}}/{{id}}", Constants.ROUTE_PARAMNAME_LANG), // URL with parameters
  12:         new { controller = "Home", action = "Index", id = UrlParameter.Optional } // Parameter defaults
  13:     );
  14: }

Another class contains all the constants used by the localization framework.

   1: internal static class Constants
   2: {
   3:     internal static string ROUTE_NAME = "Localization";
   4:     internal static string ROUTE_PARAMNAME_LANG = "lang";
   5:     // cookie name used by the localization helper later in this post;
   6:     // the actual cookie name value here is an assumption
   7:     internal static string COOKIE_NAME = "_localization_culture";
   8: }

Then we can add the routes in Global.asax. One thing to notice is that our localization routes must be registered in front of the default route. So we must invoke LocalizationConfig.RegisterRoutes(RouteTable.Routes) before the default route registration RouteConfig.RegisterRoutes(RouteTable.Routes).

I really don't know why Microsoft doesn't provide insert or reorder functions on its RouteCollection class. It uses a Collection<RouteBase> to store all routes but a private dictionary to store the relationship between route names and route objects. And there's no way for us to touch that private dictionary and change the order of the routes.

   1: protected void Application_Start()
   2: {
   3:     AreaRegistration.RegisterAllAreas();
   4:  
   5:     WebApiConfig.Register(GlobalConfiguration.Configuration);
   6:     FilterConfig.RegisterGlobalFilters(GlobalFilters.Filters);
   7:  
   8:     // register the localization routes
   9:     // note: this must be invoked before the RouteConfig.RegisterRoutes
  10:     LocalizationConfig.RegisterRoutes(RouteTable.Routes);
  11:     
  12:     RouteConfig.RegisterRoutes(RouteTable.Routes);
  13:     BundleConfig.RegisterBundles(BundleTable.Bundles);
  14:     AuthConfig.RegisterAuth();
  15: }

The next step is to implement the localization procedure in all controllers. In the previous post I created a base controller which overrode the ExecuteCore method and put the localization-related logic there. All controllers in my project then had to inherit from this base controller to have the localization feature enabled.

But there are some problems with this solution. The first one is that in ASP.NET MVC 4 the ExecuteCore method will NOT be invoked by default. So I need to move the localization logic to BeginExecuteCore.

In order to support asynchronous controllers, ASP.NET MVC 4 will not invoke ExecuteCore unless you override DisableAsyncSupport to return TRUE. For more information about this breaking change please refer here and here.
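
For reference, if you really wanted to keep the old ExecuteCore path in MVC 4, a base controller could opt back out of the async pipeline as in the minimal sketch below. This is only an illustration of the breaking change (the class name is made up); the approach I actually take in this post does not need it.

using System.Web.Mvc;

// sketch: forces ASP.NET MVC 4 back onto the synchronous ExecuteCore path
public class LegacySyncController : Controller
{
    // MVC 4 only calls ExecuteCore when this returns true
    protected override bool DisableAsyncSupport
    {
        get { return true; }
    }
}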

The second one is related to architecture. Putting the localization logic in a base controller is OK, but it means all your controllers must inherit from that base controller. This could be a problem if you already have an existing base controller that cannot be modified. For example, assume we are going to create an ASP.NET MVC website where all controllers must inherit from Contoso.BaseController, which contains some common business logic of my company and is defined in an assembly that we cannot change. Now if I need localization, all controllers would have to inherit from the localization base controller as well as the company base controller. How can we do that?

image

Hence, I decided to move my localization logic into a standalone class. Then we can create a base controller class derived from Contoso.BaseController, invoke our localization logic in its BeginExecuteCore, and have all my controllers inherit from this base one so that all of them get both the common business logic and the localization feature.

image

I moved the code from the base controller's ExecuteCore into this new helper class (a sketch of a base controller that calls it follows the helper below).

   1: public class LocalizationControllerHelper
   2: {
   3:     public static void OnBeginExecuteCore(Controller controller)
   4:     {
   5:         if (controller.RouteData.Values[Constants.ROUTE_PARAMNAME_LANG] != null &&
   6:             !string.IsNullOrWhiteSpace(controller.RouteData.Values[Constants.ROUTE_PARAMNAME_LANG].ToString()))
   7:         {
   8:             // set the culture from the route data (url)
   9:             var lang = controller.RouteData.Values[Constants.ROUTE_PARAMNAME_LANG].ToString();
  10:             Thread.CurrentThread.CurrentUICulture = CultureInfo.CreateSpecificCulture(lang);
  11:         }
  12:         else
  13:         {
  14:             // load the culture info from the cookie
  15:             var cookie = controller.HttpContext.Request.Cookies[Constants.COOKIE_NAME];
  16:             var langHeader = string.Empty;
  17:             if (cookie != null)
  18:             {
  19:                 // set the culture by the cookie content
  20:                 langHeader = cookie.Value;
  21:                 Thread.CurrentThread.CurrentUICulture = CultureInfo.CreateSpecificCulture(langHeader);
  22:             }
  23:             else
  24:             {
  25:                 // set the culture by the location if not specified
  26:                 langHeader = controller.HttpContext.Request.UserLanguages[0];
  27:                 Thread.CurrentThread.CurrentUICulture = CultureInfo.CreateSpecificCulture(langHeader);
  28:             }
  29:             // set the lang value into route data
  30:             controller.RouteData.Values[Constants.ROUTE_PARAMNAME_LANG] = langHeader;
  31:         }
  32:  
  33:         // save the location into cookie
  34:         HttpCookie _cookie = new HttpCookie(Constants.COOKIE_NAME, Thread.CurrentThread.CurrentUICulture.Name);
  35:         _cookie.Expires = DateTime.Now.AddYears(1);
  36:         controller.HttpContext.Response.SetCookie(_cookie);
  37:     }
  38: }
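
For clarity, below is a minimal sketch of how a base controller could wire this helper into BeginExecuteCore. I'm assuming it derives directly from Controller here; in the Contoso scenario above it would derive from Contoso.BaseController instead, and it plays the role of the DefaultController that the sample controllers use later in this post.

using System;
using System.Web.Mvc;

// sketch: a base controller that runs the localization logic before each action
public class DefaultController : Controller
{
    protected override IAsyncResult BeginExecuteCore(AsyncCallback callback, object state)
    {
        // apply the culture from the route data, the cookie or the browser languages
        LocalizationControllerHelper.OnBeginExecuteCore(this);
        return base.BeginExecuteCore(callback, state);
    }
}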

Lastly, let's move the language selector bar into the new project without any code changes (a usage example follows the class below).

   1: public static class LanguageBarHelper
   2: {
   3:     private class Language
   4:     {
   5:         public string Url { get; set; }
   6:         public string ActionName { get; set; }
   7:         public string ControllerName { get; set; }
   8:         public RouteValueDictionary RouteValues { get; set; }
   9:         public bool IsSelected { get; set; }
  10:  
  11:         public MvcHtmlString HtmlSafeUrl
  12:         {
  13:             get
  14:             {
  15:                 return MvcHtmlString.Create(Url);
  16:             }
  17:         }
  18:     }
  19:  
  20:     private static Language LanguageUrl(this HtmlHelper helper, string cultureName, bool strictSelected = false)
  21:     {
  22:         // set the input language to lower
  23:         cultureName = cultureName.ToLower();
  24:         // retrieve the route values from the view context
  25:         var routeValues = new RouteValueDictionary(helper.ViewContext.RouteData.Values);
  26:         // copy the query strings into the route values to generate the link
  27:         var queryString = helper.ViewContext.HttpContext.Request.QueryString;
  28:         foreach (string key in queryString)
  29:         {
  30:             if (queryString[key] != null && !string.IsNullOrWhiteSpace(key))
  31:             {
  32:                 if (routeValues.ContainsKey(key))
  33:                 {
  34:                     routeValues[key] = queryString[key];
  35:                 }
  36:                 else
  37:                 {
  38:                     routeValues.Add(key, queryString[key]);
  39:                 }
  40:             }
  41:         }
  42:         var actionName = routeValues["action"].ToString();
  43:         var controllerName = routeValues["controller"].ToString();
  44:         // set the language into route values
  45:         routeValues[Constants.ROUTE_PARAMNAME_LANG] = cultureName;
  46:         // generate the language specify url
  47:         var urlHelper = new UrlHelper(helper.ViewContext.RequestContext, helper.RouteCollection);
  48:         var url = urlHelper.RouteUrl(Constants.ROUTE_NAME, routeValues);
  49:         // check whether the current thread ui culture is this language
  50:         var current_lang_name = Thread.CurrentThread.CurrentUICulture.Name.ToLower();
  51:         var isSelected = strictSelected ?
  52:             current_lang_name == cultureName :
  53:             current_lang_name.StartsWith(cultureName);
  54:         return new Language()
  55:         {
  56:             Url = url,
  57:             ActionName = actionName,
  58:             ControllerName = controllerName,
  59:             RouteValues = routeValues,
  60:             IsSelected = isSelected
  61:         };
  62:     }
  63:  
  64:     public static MvcHtmlString LanguageSelectorLink(this HtmlHelper helper,
  65:         string cultureName, string selectedText, string unselectedText,
  66:         IDictionary<string, object> htmlAttributes, bool strictSelected = false)
  67:     {
  68:         var language = LanguageUrl(helper, cultureName, strictSelected);
  69:         var link = helper.RouteLink(language.IsSelected ? selectedText : unselectedText,
  70:             Constants.ROUTE_NAME, language.RouteValues, htmlAttributes);
  71:         return link;
  72:     }
  73: }
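
As a quick usage example, a view could render the selector bar roughly as below, assuming the namespace containing LanguageBarHelper is imported in the view or in Views/web.config; the culture names, link texts and the CSS class are just examples.

@using System.Collections.Generic
<ul class="language-bar">
    <li>@Html.LanguageSelectorLink("en-us", "[English]", "English", new Dictionary<string, object> { { "class", "lang-link" } })</li>
    <li>@Html.LanguageSelectorLink("zh-cn", "[中文]", "中文", new Dictionary<string, object> { { "class", "lang-link" } })</li>
</ul>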

 

Oops! Bugs in System.ComponentModel.DataAnnotations

All the migration work is done, so let's have a try. First of all we need two resource files, one for English and the other for Chinese, and put some values in them.

image

In order to keep this sample simple I just demonstrate the localization feature in the three major places in ASP.NET MVC: view pages, controllers and model attributes. Now let's have a very quick test. In the home index view I specified the title through the resource.

   1: @{
   2:     ViewBag.Title = Resources.Global.View_HomeIndex_Title;
   3: }
   4: @section featured {
   5:     <section class="featured">
   6:         <div class="content-wrapper">
   7:             <hgroup class="title">
   8:                 <h1>@ViewBag.Title.</h1>
   9:                 <h2>@ViewBag.Message</h2>
  10:             </hgroup>
  11:             <p>
  12:                 ... ...
  13:             </p>
  14:         </div>
  15:     </section>
  16: }
  17: ... ...
  18: ... ...

In the home controller I set the message through the resource.

   1: public class HomeController : DefaultController
   2: {
   3:     public ActionResult Index()
   4:     {
   5:         ViewBag.Message = Resources.Global.Controller_HomeIndex_Message;
   6:  
   7:         return View();
   8:     }
   9: }

And in the account login model we specified the user name field's display name and validation message through the resource on the attributes.

   1: public class LoginModel
   2: {
   3:     [Required(ErrorMessageResourceName = "Model_Account_UserName_Required",
   4:               ErrorMessageResourceType = typeof(Resources.Global))]
   5:     [Display(Name = "Model_Account_UserName_Display",
   6:              ResourceType = typeof(Resources.Global))]
   7:     public string UserName { get; set; }
   8:  
   9:     [Required]
  10:     [DataType(DataType.Password)]
  11:     [Display(Name = "Password")]
  12:     public string Password { get; set; }
  13:  
  14:     [Display(Name = "Remember me?")]
  15:     public bool RememberMe { get; set; }
  16: }

Let's start our MVC application and have a look. First, the proper strings were shown on the home page: one comes from the view and the other comes from the controller.

image

And it worked well once we clicked the language selector bar.

image

Then click "Log in" and verify the localization of the System.ComponentModel.DataAnnotations attributes. Oops!

image

This error message means that there's no public static string property in the class generated from our resource file, so ASP.NET MVC cannot retrieve the localized string for the DisplayAttribute.

If we dig into the source code of DisplayAttribute in System.ComponentModel.DataAnnotations we will find that it retrieves the localized value through an internal class named LocalizableString. And the LocalizableString class tries to find a visible property, with the name given by DisplayAttribute.Name, on the type specified in DisplayAttribute.ResourceType.

image

But if we open the Resources.Global class we will find that the class and all its properties were generated as "internal". This means they cannot be retrieved by the DisplayAttribute, since IsVisible is FALSE. Some people believe this is a bug in the DataAnnotations assembly, since another attribute, ValidationAttribute, retrieves the resource from the property only requiring that it is static.

image

I'm not sure whether this was implemented "by design" or is a mistake in the .NET BCL. I hope it could be fixed in ".NET 5.0", but for now we need a workaround. One solution is to move the resources into another project so that we can change their access modifier from "internal" to "public".

image

Alternatively we can force Visual Studio to generate the resource class as "public". Select the resource files, open their property windows, change the Build Action to Embedded Resource and change the Custom Tool to PublicResXFileCodeGenerator (a simplified sketch of the generated class follows the screenshot below).

image
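
To see why this fixes the problem, the public generator emits a class shaped roughly like the simplified sketch below: both the class and the property getters are public and static, which is exactly what DisplayAttribute can reach through reflection. The namespace and the resource base name here are assumptions and may differ in your project.

using System.Globalization;
using System.Resources;

namespace Resources
{
    // simplified sketch of what PublicResXFileCodeGenerator produces
    public class Global
    {
        private static readonly ResourceManager ResourceMan =
            new ResourceManager("Resources.Global", typeof(Global).Assembly);

        public static CultureInfo Culture { get; set; }

        // one public static string property per localization item
        public static string Model_Account_UserName_Display
        {
            get { return ResourceMan.GetString("Model_Account_UserName_Display", Culture); }
        }
    }
}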

We also need to modify the code that uses the resource class, since this generator puts the output class under the App_GlobalResources namespace. Now let's retry our web application and we will see the localization strings in the view page, the controller and the model attributes are all working well.

image

 

Beyond the Resources: Localization Source Provider

On my previous post there are some comments wondering how to use a database to store the localization strings instead of resource files. At that moment I replied that we can implement our own resource manager, but this might not be a simple task.

In ASP.NET MVC there are three places where we may need localization: views, controllers and models. Retrieving the localized string from a database should not be a big problem in views and controllers, since we have full control of the code there. For example, assume we created an interface named ILocalizationSource which contains one method that returns the localized string. Then in views and controllers we can simply use it like the pseudo-code below (a fuller interface sketch follows the list).

- <p>@ILocalizationSource.GetString(cultureName, "Home_Index_Title_Key")</p>

- ViewBag.Message = ILocalizationSource.GetString(cultureName, "Home_Index_Message_Key");
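
To make the idea concrete, a minimal sketch of such an interface could look like the code below. The interface name, its member and its parameters are assumptions inferred from the pseudo-code above, not an existing API.

public interface ILocalizationSource
{
    // returns the localized string for the given culture and localization key,
    // e.g. GetString("zh-cn", "Home_Index_Title_Key")
    string GetString(string cultureName, string key);
}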

But this will be difficult to achieve in the model attributes.

As I described in the previous post, when ASP.NET MVC receives an HTTP request it creates the related controller, uses the action invoker and builds the model binder. By default, the model binder validates the input model. In ASP.NET MVC 4 almost all parts can be extended. So when the default model binder tries to validate a property of the model, it retrieves the appropriate ModelValidator and calls its Validate method. The validation result is then returned and added into BindingContext.ModelState.

image

ASP.NET MVC 4 uses DataAnnotationsModelMetadataProvider to retrieve the metadata for each model, and DataAnnotationsModelValidatorProvider for model validation. Both of these providers leverage the DisplayAttribute and ValidationAttribute in DataAnnotations to retrieve the localized display name and error message. For example, ValidationAttribute invokes a private method named SetResourceAccessorByPropertyLookup to get a delegate that retrieves the localized string. It then invokes this delegate (_errorMessageResourceAccessor), adds the display name, and returns the error message along with the ModelValidationResult.

image

The problem is that Microsoft doesn't provide any extension points in the DataAnnotations assembly. When it needs the localized validation message it looks for a static string property on the ResourceType. If found, it uses reflection to invoke this property and retrieve the localized string; otherwise it throws an exception. All of this logic is hard-coded in the DataAnnotations assembly.
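
Conceptually, the lookup that DataAnnotations performs looks roughly like the sketch below. This is only an illustration of the observable behavior, not the actual BCL source.

using System;
using System.Reflection;

internal static class ResourceLookupSketch
{
    // rough equivalent of what DataAnnotations does with the ResourceType and resource name
    public static string LookupResource(Type resourceType, string resourceName)
    {
        var property = resourceType.GetProperty(
            resourceName, BindingFlags.Public | BindingFlags.Static);
        if (property == null || property.PropertyType != typeof(string))
        {
            // this is essentially the exception we hit in the screenshot above
            throw new InvalidOperationException(
                "Cannot retrieve property '" + resourceName + "' from type '" + resourceType + "'.");
        }
        return (string)property.GetValue(null, null);
    }
}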

In order to use any kind of localization source there are two solutions. The first one is very straightforward: since the default behavior of DataAnnotations is to look for a static string property for each localization item, we can create our own class which has a property for each item.

For example, I created a table in SQL Server which contains all the localization strings. The structure of this table is shown below; the primary key consists of Key and Culture.

image

Then copy the resource values into this table.

image

Now let's create a class which contains a static property for each localization item. It will connect to the database and use the current UI culture and the key to find the proper localized string.

   1: public class DbResources
   2: {
   3:     private static string CST_CONNSTRING = "Data Source=.;Integrated Security=True;Connect Timeout=15;Encrypt=False;TrustServerCertificate=False";
   4:  
   5:     private static string GetValue(string key)
   6:     {
   7:         var culture = Thread.CurrentThread.CurrentUICulture.Name;
   8:         using(var conn = new SqlConnection(CST_CONNSTRING))
   9:         {
  10:             using(var cmd = conn.CreateCommand())
  11:             {
  12:                 cmd.CommandText = "SELECT [Value] FROM [Caspar].[dbo].[Resource] WHERE [Key] = @key AND [Culture] = @culture";
  13:                 cmd.Parameters.AddWithValue("key", key);
  14:                 cmd.Parameters.AddWithValue("culture", culture);
  15:                 conn.Open();
  16:                 var value = cmd.ExecuteScalar();
  17:                 return (string)value;