Updates from September, 2011

  • Richard 4:14 pm on September 30, 2011 Permalink |  

    Introducing Node-Azure 

Node-Azure is a Node.js package (available via NPM) which allows native JavaScript access to the Windows Azure storage API.

Node-Azure currently supports most of the methods on these storage types:

    • Blob storage
    • Table storage
    • Queues

    Getting Started

To include the node-azure dependency and set up the details of your account, add these lines of code:

    // include the node-azure dependency
    var azure = require('azure');
    	
    // every request has an account parameter, which is an object like this:
    var account = {
      name : "YOURACCOUNTNAME",
      key : "YOURACCOUNTKEY",
      blob_storage_url : "https://YOURACCOUNTNAME.blob.core.windows.net",
      table_storage_url : "https://YOURACCOUNTNAME.table.core.windows.net",
      queue_storage_url : "https://YOURACCOUNTNAME.queue.core.windows.net"
    }
    

    Blob Examples

    Upload text as a blob:

    azure.blob.put_blob(account, "container", azure.blob.BlockBlob, "blobname",
      "Hello world!", {'Content-Type': "text/plain"}, callback);

    Download a blob:

    azure.blob.get_blob(account, "container", "blobname", function(x) {
      // x == "Hello world!"
    });
    
    

    Table Examples

    Insert an entity:

    azure.tables.insert_entity(account, 'tablename',
      { RowKey:'123', PartitionKey: 'xyz', Value: 'foo' }, callback);

    Get an entity:

azure.tables.get_entity(account, 'tablename', 'xyz', '123', function(entity){
  // entity == { RowKey:'123', PartitionKey: 'xyz', Value: 'foo' }
});

    Query a table:

    azure.tables.query_entities(account, 'tablename', "Value+eq+'foo'", function(entities){
      // entities is an array of matching items
    });

    Queue Examples

    Put a message on a queue:

    azure.queues.put_message(account, q, {Test:"Message"}, callback);

    Pop the message off the queue:

    azure.queues.get_message(account, q, function(message){
      // our javascript object is returned: message.Data
    });
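
Putting the two examples together – a minimal round-trip sketch. The examples above assume q holds a reference to your queue; treating it as a plain queue name string (like the container name in the blob examples) is an assumption, so check the package for the exact parameter:

var q = "myqueue"; // assumption: queues are referenced by name, like blob containers

// put a message, then read it back
azure.queues.put_message(account, q, {Test: "Message"}, function() {
  azure.queues.get_message(account, q, function(message) {
    console.log(message.Data.Test); // logs "Message"
  });
});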

    To install

Run:

    npm install node-azure

Alternatively, manually copy the repository into a node_modules/node-azure folder.
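
Putting it all together – a minimal end-to-end script, using only the blob calls shown above (the account values are placeholders):

var azure = require('azure');

var account = {
  name : "YOURACCOUNTNAME",
  key : "YOURACCOUNTKEY",
  blob_storage_url : "https://YOURACCOUNTNAME.blob.core.windows.net",
  table_storage_url : "https://YOURACCOUNTNAME.table.core.windows.net",
  queue_storage_url : "https://YOURACCOUNTNAME.queue.core.windows.net"
};

// write a blob, then read it back
azure.blob.put_blob(account, "container", azure.blob.BlockBlob, "blobname",
  "Hello world!", {'Content-Type': "text/plain"}, function() {
  azure.blob.get_blob(account, "container", "blobname", function(x) {
    console.log(x); // "Hello world!"
  });
});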

  • Richard 2:33 pm on September 30, 2011 Permalink |  

    Lucene.NET 

    An example of how to create a Lucene.NET index, add a document and search it:

    using System;
    using System.IO;
    using Lucene.Net.Analysis;
    using Lucene.Net.Documents;
    using Lucene.Net.Index;
    using Lucene.Net.QueryParsers;
    using Lucene.Net.Search;
    using Lucene.Net.Store;
    
    class Program
    {
        static void Main(string[] args)
        {
            // create the index
            var directory = FSDirectory.Open(new DirectoryInfo(@"d:\index\"));
            var analyzer = new SimpleAnalyzer();
            var indexWriter = new IndexWriter(directory, analyzer, IndexWriter.MaxFieldLength.UNLIMITED);
            var doc = new Document();
    
            // read a file into the index (in this case a file containing lorem ipsum)
            using (StreamReader sr = new StreamReader(@"d:\foo.txt"))
            {
                doc.Add(new Field("id", @"D:\foo.txt", Field.Store.YES, Field.Index.NO));
                doc.Add(new Field("content", sr.ReadToEnd(), Field.Store.YES, Field.Index.ANALYZED));
            }
            indexWriter.AddDocument(doc);
            indexWriter.Optimize();
            indexWriter.Commit();
            indexWriter.Close();
            // we have created and saved the index
    
            // create a query against the index
            var queryParser = new QueryParser(Lucene.Net.Util.Version.LUCENE_29, "content", analyzer);
            var query = queryParser.Parse("Lorem");
            var indexSearcher = new IndexSearcher(directory, true);
            var hits = indexSearcher.Search(query);     // this method is to be deprecated, however, I can't find an easier way of doing it!
    
            // display the results
            Console.WriteLine("Found {0} results", hits.Length());
            for (int i = 0; i < hits.Length(); i++)
            {
                var doc2 = hits.Doc(i);
                Console.WriteLine("Result num {0}, score {1}", i + 1, hits.Score(i));
                Console.WriteLine("ID: {0}", doc2.Get("id"));
                Console.WriteLine("Text found: {0}" + Environment.NewLine, doc2.Get("content"));
            }
    
            // close 
            indexSearcher.Close();
            directory.Close();
        }
    }

    Never have I worked with a library with so many obsolete and deprecated methods. It’s quite hard going!

     
  • Richard 1:07 pm on September 22, 2011 Permalink |  

    Windows Azure and Node.js 

As Rob Blackwell pointed out, Node.js has a clear part to play in cloud computing. But what does an application powered by Node look like? Can we build something entirely with JavaScript and run it on Azure now? I decided to explore this by building a simple todo application.

The application lists the items in your todo list; new items are added by typing in the input box and pressing ‘Add’. Clicking on an item shows a different page, which asks you if you want to delete it.

Very simple. There is no security, no error trapping, and it looks awful in Internet Explorer – but it proves a concept. Let’s dig under the covers.

    Design Tenets

1. The application running in the browser should be a static HTML file. This allows us to consider offline use with HTML5 storage if available (see the sketch after this list).
2. When the application is running, only data (JSON) should be exchanged with the server. We should not be rendering markup on the server. This keeps our bandwidth requirements to a minimum.
3. The only code I am going to write is JavaScript.
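
As an illustrative sketch of tenet 1 (saveLocal and loadLocal are hypothetical helpers, not part of the application):

// cache the todo items in HTML5 localStorage when it's available
function saveLocal(items) {
  if (window.localStorage) {
    localStorage.setItem('todo', JSON.stringify(items));
  }
}

function loadLocal() {
  if (!window.localStorage) return [];
  return JSON.parse(localStorage.getItem('todo') || '[]');
}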

    The User Interface

The application is a single static HTML file, and uses a browser-based MVC architecture. In particular, these standard JavaScript libraries are used:

    Backbone.js

Backbone provides the client-side routing in the application. All links in the application point to anchors (i.e. they only modify the part of the URL after the ‘#’). These links do not cause a round trip back to the server; instead, Backbone picks them up and routes you to a controller. So in this application I have these two routes set up:

    var AppRouter = Backbone.Router.extend({
      routes: {
        "/view/:id": "viewItem",
        "*actions": "defaultRoute"
      },
      defaultRoute: function(actions){
        two10.actions.defaultAction();
      },
      viewItem: function(id){
        two10.actions.view(id);
      }
    });
var router = new AppRouter();
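
One detail worth noting: Backbone routes only start firing once the history module has been started, typically straight after the router is created:

// start listening for hash changes – without this, no routes fire
Backbone.history.start();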

So when the URL matches the first route (i.e. ‘…/#/view/1234’), the ‘two10.actions.view’ function is called, passing in ‘1234’. Otherwise the second route matches, and ‘two10.actions.defaultAction’ is called.

This simple piece of functionality is very powerful, as it means my application can be bookmarked. It also means I don’t have a complicated web of JavaScript functions dotted around my application; I mostly just use links like this:

      <a href="#/view/1234">View</a>

jQuery Templates

My controllers need to write HTML out for the user to see. To do this I use jQuery Templates. There are dozens of options for templating in JavaScript; I’m just sticking with what I know. The template to display the todo item, with a delete button, looks like this (I have simplified the markup):

    <script id="view" type="text/x-jquery-tmpl">
      <div>
        <div>
          <h1>${item.get("name")}</h1>
          <a href="#">Back</a>
        </div>
        <div>
          <p>${item.get("name")}</p>
          <p><a href="javascript:two10.actions._delete('${item.cid}');">Delete</a></p>
        </div>
      </div>
    </script>

Note how the link for the delete button bypasses the routing and invokes the controller directly. This is because I want to emulate a POST, and I don’t want the URL for deleting the item to appear in the browser or the routing table.
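
As a rough sketch, the delete action might look something like this (two10.data.remove is an assumed helper – only the byId call appears in this post):

two10.actions._delete = function(cid) {
  var item = two10.data.byId(cid); // look the item up in the in-memory collection
  two10.data.remove(item);         // assumed removal helper
  window.location.hash = '#';      // route back to the list, like the 'Back' link
};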

    To actually render the view, my code needs to pass in the model (i.e. the object which holds the todo item):

var item = two10.data.byId(id);         // retrieve the item from the in-memory collection
$("#body").html($("#view").tmpl(item)); // render the template, passing in the item

jQuery Mobile

I attempted at first to use jQuery Mobile to enhance my markup (and make my application look pretty). However, jQuery Mobile has a routing capability of its own, which conflicts with Backbone. After failing to disable it, I decided to just use the CSS from jQuery Mobile, and put the enhanced markup in my templates. Whilst this requires more effort when writing the templates, it probably improves runtime performance by reducing the amount of DOM manipulation required in the browser.

    Server Side

    Serving the page

The server-side code is a Node.js script. The first thing it needs to do is serve the HTML page to the browser.

    var http = require('http');
    var fs = require('fs');
    http.createServer(function (request, response) {
      switch (request.url) {
        case '/':
          fs.readFile('./default.htm', function(error, content) {
            if (error) {
              response.writeHead(500);
              response.end();
            }
            else {
              response.writeHead(200, { 'Content-Type': 'text/html' });
              response.end(content, 'utf-8');
            }
          });
          break;
     ...

    Connecting to Azure Storage

The todo items should be persisted in Azure Table Storage. To do this, I forked a prototype version of the node-azure library, which makes it straightforward to save/retrieve records in table storage using these methods:

    function insert_entity(account, tablename, data, callback)
    function get_entity(account, tablename, partitionKey, rowKey, callback)

The code running in the browser does a GET as soon as the application starts; node retrieves the data from table storage and returns it to the browser. Every time the user changes the items in the todo list, the application does a POST to node, which saves the data in table storage.
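
For illustration, the data endpoint might slot into the same switch statement. This is a hedged sketch – the '/todo' route name and the fixed partition/row keys are assumptions, and a real application would update rather than blindly insert:

// (assumes var azure = require('azure') and an account object, as in the node-azure post)
case '/todo':
  if (request.method === 'POST') {
    // collect the JSON body, then save it to table storage
    var body = '';
    request.on('data', function(chunk) { body += chunk; });
    request.on('end', function() {
      azure.tables.insert_entity(account, 'todo',
        { PartitionKey: 'todo', RowKey: '1', Value: body }, function() {
          response.writeHead(200, { 'Content-Type': 'application/json' });
          response.end(body, 'utf-8');
        });
    });
  }
  else {
    // GET: read the items back out of table storage and return them as JSON
    azure.tables.get_entity(account, 'todo', 'todo', '1', function(entity) {
      response.writeHead(200, { 'Content-Type': 'application/json' });
      response.end(JSON.stringify(entity), 'utf-8');
    });
  }
  break;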

    Deploying to Azure

    I tried to use the iisnode technique to get node running on a web role, and failed. Instead, I added node as a background startup task to a worker role. These are the steps to set it up:

    1. Create a Worker Role, open up port 80, and register a startup command. Your Service Definition file should look like this:

    <?xml version="1.0" encoding="utf-8"?>
    <ServiceDefinition name="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
      <WorkerRole name="WorkerRole1" vmsize="ExtraSmall">
        <Startup>
          <Task commandLine="Startup.cmd" executionContext="elevated" taskType="background"/>
        </Startup>
        <Endpoints>
          <InputEndpoint name="Endpoint1" protocol="tcp" port="80" />
        </Endpoints>
      </WorkerRole>
    </ServiceDefinition>

2. Create a startup.cmd file, and add it to your Worker Role. The file should contain the instruction to start node, passing in your script:

    node server.js

3. Add node.exe and all of your JavaScript/HTML files to your Worker Role project, ensuring all these files are set to copy local (node dependencies can be added to a ‘node_modules’ sub-folder).

    4. Deploy, and stand back.

    Conclusion

Developing in JavaScript is fun, and more productive than I had thought.

I would like to see more investment in node on the Windows platform. Currently the Node Package Manager (NPM) hasn’t been ported (read how to do this manually), and it would be better to have node running under IIS than as a background task in a Worker Role.

The power of routing and templating in the browser shouldn’t be underestimated. With the addition of some HTML5 features, this technique could seriously compete with Silverlight.

I don’t think this stack is the solution for everything, but if you want multi-device support, good scalability and low bandwidth requirements, you should certainly consider it. Bear in mind though, node is in its infancy; this is the bleeding edge.

    Source

    The entire source can be downloaded here.

     
  • Richard 8:26 am on September 22, 2011 Permalink |  

    Pattern for processing Azure Queue Messages in parallel 

By using a function that dequeues messages as an IEnumerable, you can easily plug in the Task Parallel Library to process messages simultaneously.

    using System;
    using System.Collections.Generic;
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;
    
    class Program
    {
        static void Main(string[] args)
        {
            var account = CloudStorageAccount.DevelopmentStorageAccount;
            var client = account.CreateCloudQueueClient();
            var queue = client.GetQueueReference("foo");
    
            Parallel.ForEach<CloudQueueMessage>(GetMessages(queue), (message) =>
            {
                // process your message here
                Console.WriteLine(message.AsString);
                queue.DeleteMessage(message);
            });
        }
    
        private static IEnumerable<CloudQueueMessage> GetMessages(CloudQueue queue)
        {
            while (true)
            {
                var message = queue.GetMessage();
                if (message != null)
                {
                    yield return message;
                }
                else
                {
                    yield break;
                }
            }
        }
    }
    
The added advantage of this approach is that messages added to the queue once the loop has started will still get processed. You could wrap the whole Parallel.ForEach in a loop, and use Thread.Sleep to pause whilst there is nothing in the queue to process.
     
  • Richard 2:27 pm on September 21, 2011 Permalink |
    Tags: node.js   

    NPM NOW WORKS IN WINDOWS IGNORE THIS POST… 

    NPM NOW WORKS IN WINDOWS – IGNORE THIS POST

Windows is missing NPM (the Node Package Manager), so the easiest way I have found to manually install a node package is to create a ‘node_modules’ folder in the same location as your .js file, and copy in the package’s JavaScript file (normally located in the lib folder).

You can then include your package in the normal way:

    var package = require('package');
     
  • Richard 9:07 am on September 20, 2011 Permalink |  

    Running Classic ASP on Azure 

By default, Classic ASP is not enabled on a Windows Azure Web Role. You can easily enable it with a startup task:

    1. Create a Startup.cmd file in your ASP.NET project.

2. By default, text files are saved as ‘Unicode (UTF-8 with signature) – Codepage 65001’. Unfortunately this adds an unwanted character (a byte order mark) to the start of the file, which prevents the script from being run. Go to ‘Advanced Save Options…’ in the File menu of Visual Studio, and select ‘US-ASCII – Codepage 20127’.

    3. On the properties of this file, set ‘Copy to Output Directory’ to ‘Copy if newer’.

    4. Add this line to the file:

    start /w pkgmgr /iu:IIS-ASP

    5. Modify your ServiceDefinition file, to add the startup task:

    <?xml version="1.0" encoding="utf-8"?>
    <ServiceDefinition name="ClassicASPonAzure" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
      <WebRole name="WebRole1" vmsize="ExtraSmall">
         <Startup>
          <Task commandLine="Startup.cmd" executionContext="elevated" taskType="simple" />
        </Startup>
    ...

    6. Publish your instance, and you should have classic ASP enabled.

     
• steven 1:09 pm on February 22, 2013 Permalink |

I don’t use Visual Studio – can I still get Azure to run ASP? And where could I find the steps? Thanks

• Richard 3:41 pm on February 26, 2013 Permalink |

      Sure, the Azure Websites support classic ASP. You can deploy using FTP or Git.

  • Richard 1:14 pm on September 12, 2011 Permalink |  

    AzureSugar 

    A lightweight .NET library which makes working with the Azure API easier.

    https://github.com/richorama/AzureSugar

    Table Storage

    Define a class which represents your table, and optionally supply a name using the ‘TableName’ attribute.

    [TableName("Customers")]  // <- this is optional
    public class Customer : TableServiceEntity
    {
        public string Firstname { get; set; }
        public string Lastname { get; set; }
    }

    The AzureSugarTableContext allows you to create new customers easily:

    using (var context = new AzureSugarTableContext(CloudStorageAccount.DevelopmentStorageAccount))
    {
        var customer = context.Create<Customer>();
        customer.Firstname = "John";
        customer.Lastname = "Smith";
    }

Primary keys (GUIDs) are automatically assigned. All commits are performed on disposal of the context.

    It’s just as easy to query the table:

    using (var context = new AzureSugarTableContext(CloudStorageAccount.DevelopmentStorageAccount))
    {
        foreach (var customer in (from c in context.Query<Customer>() where c.Firstname == "John" select c))
        {
            Console.WriteLine(customer.Firstname);
        }
    }

    Queues

Queues are strongly typed. Let’s say we are working with this class:

    public class Foo
    {
        public string Bar { get; set; }
        public string Baz { get; set; }
    }

    To push a message on to the queue, just do this:

    var foo = new Foo { Bar = "bar", Baz = "baz" };
    var queue = new AzureSugarQueue<Foo>(CloudStorageAccount.DevelopmentStorageAccount);
    queue.Push(foo);

    To pop a message from a queue, we just need to do this:

    var queue = new AzureSugarQueue<Foo>(CloudStorageAccount.DevelopmentStorageAccount);
    using (var message = queue.Pop())
    {
        Foo foo = message.Content;
        // do something with foo
    }

Your object is automatically deleted from the queue on disposal of the message; however, you can take more control over this:

    using (var message = queue.Pop(false))
    {
        var foo2 = message.Content;
        message.VoteCommit();
    }

    In this case, the message will only be deleted if ‘VoteCommit’ is called.

The queue name is automatically derived from the type name ‘Foo’; however, you can override this as well.

     
  • Richard 8:43 am on September 12, 2011 Permalink |  

    Sticky Sessions and Windows Azure 

One of the problems that often crops up when moving legacy applications to the cloud is reliance on sticky sessions. The Azure load balancer works on a round-robin basis, so if your application has been designed around sticky sessions, you may have some work to do.

Although it is possible to use sticky sessions in Azure (see this example), they’re not a great fit for cloud architecture, for these reasons:

1. When you provision new instances, only new sessions will be routed to them. Depending on your load balancing logic, new sessions may also still be provisioned on the old instances. This means it can take a long time (depending on the average length of your sessions) for load to be evenly distributed across your instances.
2. Cloud solutions should be designed for failure. One of your instances may be removed at any time for patching or hardware failure. In a sticky session scenario, the clients with a session on that instance will probably have to log in again, or may see an error page, depending on your load balancing logic.
3. Unless your load balancer shares state in some way, it’s likely that you’re load balancing with a single instance (a single point of failure) and not eligible for the Microsoft SLA.

So what options are there?

1. Use the Azure AppFabric Cache to share ASP.NET session state across your instances. This walk-through shows you how, by simply changing your Web.config.
2. Ideally, move to a stateless architecture, making use of cookies or the database where appropriate. Designing an application in this way will give you the best scaling potential, and allow you to get the most from the cloud.
     
  • Richard 3:46 pm on September 2, 2011 Permalink |  

    Mount Azure Blob Storage as a Windows Drive 

Do you need to access blobs as files, from your local computer or from Azure? AzureBlobDrive solves this problem by mounting blob storage as a drive in Windows. Containers appear as folders, and each blob is displayed as a file.

    AzureBlobDrive can run either on an Azure instance, or on your local machine.

Works well for these scenarios:

• If you have an existing application which you want to migrate to Azure, but it needs to read/write to a persistent drive.
• If you have an application split across Azure and a local data centre, and they need to share files.
• If you want convenient access to blobs and containers from your local computer.

    Not suitable for these scenarios:

• Reading/writing large files.
• Storing files that change frequently (e.g. database files).

    Demonstration

    Run locally

1. Install the Dokan driver, which can be downloaded from here: http://dokan-dev.net/wp-content/uploads/DokanInstall_0.6.0.exe
2. Download the AzureBlobDrive source: https://github.com/richorama/AzureBlobDrive
3. Update the app.config file in the Two10.AzureBlobDrive.Console project to point to your Azure storage account (alternatively you can use the Azure emulator).
4. Run the Two10.AzureBlobDrive.Console project. An ‘R’ drive will be mounted, and you should be able to see your containers.

    Run in Azure

1. Download the AzureBlobDrive source: https://github.com/richorama/AzureBlobDrive
2. Update the Two10.AzureBlobDrive.Console.exe.config in the Two10.AzureBlobDrive.WorkerRole project to point to your Azure storage account.
3. Build the solution.
4. Publish the Two10.AzureBlobDrive.CloudProject project to Azure. The role will install the Dokan driver with an elevated startup command, and the worker role will mount the ‘R’ drive in its ‘Run’ method.

    Current limitations

AzureBlobDrive is alpha-quality code, and has a number of limitations. Some of these stem from the inherent limitations of Azure blob storage.

    • Files can only be placed in a folder (container). You cannot have files in the root directory.
    • Folders cannot contain folders (blob storage does not support hierarchy).
• Root folder names (containers) must be in lower case, and cannot contain spaces or other special characters.
    • Performance is poor, and large files are not recommended.
    • Files and folders are cached for one minute, so changes made by other machines may not be instantly viewable.
    • Files (blobs) cannot be empty (i.e. have a zero size).

    How does it work?

AzureBlobDrive uses the Dokan file system driver (http://dokan-dev.net/en/). Dokan provides you with an interface for calls to the file system, such as ReadFile, WriteFile etc. The Two10.AzureBlobDrive project contains a class which implements these methods, accessing blob storage to retrieve the information. The Azure Worker Role is configured to install the Dokan dependencies (Startup.cmd), and then start the Two10.AzureBlobDrive.Console application as a background task.

Some of these limitations will be improved over time. Feel free to fork the repo!

     
• Parv 12:20 pm on December 7, 2011 Permalink |

      hi.. awsm wkr with this project. i was looking smthing exactly like this cuz i need to build a system to in azure for files with programatic access restrictions. however can u help becuz the run method of the worker role present contains no code to mount the drive and also the console that is supposed to work inside windows locally is also exiting without any error and without any drive being mounted to my system.
      thankx in advance

• Parv 7:01 am on December 9, 2011 Permalink |

      awsm wrk i hav worked things out but because i dnt hav a documentation of which function does what.. plz help

• Gibberish 5:47 am on January 4, 2012 Permalink |

      What are you trying to say? Stopped reading your post because your writing style is annoying, please use proper english to be heard and understood.

• parv 6:56 am on January 4, 2012 Permalink |

      was tht for me? if yes.. through my last comment here i was asking for some documentation to be added so that the one (like me) attempting to edit code knows exactly how each method in the different classes work and what exactly is expected out of them.. also what all will break if a particular function is removed. u get it now? i apologize if my style of writing annoyed you
