Updates from January, 2012

  • Richard 11:41 am on January 24, 2012 Permalink |  


    I went along to the Gov Camp 2012 event on Saturday 21st Jan. I didn’t quite know what to expect, but I was there with my laptop and an eagerness to roll up my sleeves and bash out some code.

    The day started with the ~200 delegates introducing themselves in turn. It was clear that there was some real talent in the room. We had an army of developers as well as UX people, PR and media types, and most importantly – people who understood Government. What could we accomplish? What could we build? The possibilities were endless.

    After the intros, people were invited to pitch for workshops or tasks they would like to undertake/run. Lots of ideas were put forward, and a few of them were ‘let’s build this’, ‘let’s design that’. I went for one of those workshops.

    We had some great discussions, bounced ideas around, and expert opinion and experience were injected into the conversation. However, we failed to really get anything done. The same was true of a later session: what was advertised as ‘Design and Build’ was just a talking shop. Don’t get me wrong, if other people in the session got something out of it, then it was a success, but I didn’t leave with a sense of having achieved anything. I can’t help but think that if there had been a little more focus on creating solutions rather than discussing problems, something real and worthwhile could have been produced.

    I did enjoy meeting some interesting people, and I took away several business cards from people I intend to contact to continue our conversations. I was also inspired to see that Government are taking open source very seriously and embracing technologies like GitHub to collaborate both internally and with the public.


  • Richard 2:04 pm on January 13, 2012 Permalink |  

    Improving Blob Upload Speeds 

    The Challenge:

    Upload a 144 MB file into Azure Blob Storage as fast as possible.


    The CloudBlob.UploadFile() method doesn’t do a bad job. Behind the scenes it chunks the file into blocks, and uses the ‘PutBlock’ method with the Task Parallel Library to upload them simultaneously.

    However, the Task Parallel Library will not upload all of the blocks at once; instead it uses a thread pool to upload a few at a time. This is good, but we can do better.
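    The pooled behaviour described above can be sketched as follows. This is an illustrative Node.js sketch rather than the actual .NET client code: splitIntoBlocks and the uploadBlock callback are hypothetical names, and the real UploadFile() internals will differ.

    ```javascript
    const BLOCK_SIZE = 4 * 1024 * 1024; // 4 MB blocks, as used in this post

    // Split a Buffer into fixed-size blocks (the last may be smaller).
    function splitIntoBlocks(buffer, blockSize = BLOCK_SIZE) {
      const blocks = [];
      for (let offset = 0; offset < buffer.length; offset += blockSize) {
        blocks.push(buffer.slice(offset, offset + blockSize));
      }
      return blocks;
    }

    // Upload blocks with a bounded number of simultaneous uploads,
    // mimicking the thread-pool behaviour of the default client.
    async function uploadWithPool(blocks, concurrency, uploadBlock) {
      let next = 0;
      async function worker() {
        while (next < blocks.length) {
          const i = next++; // claim the next block index
          await uploadBlock(i, blocks[i]);
        }
      }
      const workers = [];
      for (let w = 0; w < Math.min(concurrency, blocks.length); w++) {
        workers.push(worker());
      }
      await Promise.all(workers);
    }
    ```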

    All at once

    After looking through some performance metrics on blob upload, it seems that Azure storage performance improves when you use 20-40 threads to upload your file.

    I modified my code to chunk the 144 MB file into 36 x 4 MB chunks, then used 36 threads to upload all of the chunks simultaneously.
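    The all-at-once variant can be sketched like this (again illustrative Node.js rather than my actual code; uploadBlock is a hypothetical stand-in for the real PutBlock call). A 144 MB file at 4 MB per block gives 144 / 4 = 36 blocks, each started immediately:

    ```javascript
    const FILE_MB = 144;
    const BLOCK_MB = 4;

    // 144 / 4 = 36 blocks for the file used in this post.
    const blockCount = Math.ceil(FILE_MB / BLOCK_MB);

    // Start every block upload at once instead of queueing
    // behind a small pool.
    async function uploadAllAtOnce(blocks, uploadBlock) {
      await Promise.all(blocks.map((block, i) => uploadBlock(i, block)));
    }
    ```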

    The result: it’s faster (most of the time).


    My intention is to create a more complete list of benchmarks, but here’s what I’ve got at the moment.

    Location                            CloudBlob.UploadFile()   All-at-once technique
    Azure instance (Extra Large)        6.1 seconds              5.3 seconds
    T1 connection                       21 seconds               14 seconds
    10 Mbps connection                  143 seconds              615 seconds *
    Domestic ADSL (0.36 Mbps upload)    Timeout                  923 seconds

    (*) The network equipment is throttling the number of simultaneous outbound connections.

    Where to go from here?

    More performance gains can be made, but with added complexity. Peer networking could provide one answer: if the file exists in more than one place, perhaps uploading different parts from both locations would help?

    Another answer is compression. The particular file I used was already highly compressed. A more compressible file, however, could be uploaded to an Azure compute instance as a GZipped stream, then inflated and inserted into blob storage by a background process.

    I have tested this approach, and significant gains can be made, in direct correlation to the compressibility of the data. Obviously there is a cost implication.
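    A minimal sketch of the compress-then-inflate idea using Node’s built-in zlib module (the function names are mine, not part of any SDK — the gain depends entirely on how compressible the payload is):

    ```javascript
    const zlib = require('zlib');

    // Gzip the payload before transfer, so less travels over the wire.
    function compressForUpload(buffer) {
      return zlib.gzipSync(buffer);
    }

    // Inflate it on the compute instance before writing to blob storage.
    function inflateAfterDownload(gzipped) {
      return zlib.gunzipSync(gzipped);
    }
    ```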

  • Richard 1:47 pm on January 3, 2012 Permalink |  

    Getting the samples to work on the Azure Node.js SDK 

    (You probably want to do this on Linux)

    Install Node.js & NPM

    echo 'export PATH=$HOME/local/bin:$PATH' >> ~/.bashrc
    . ~/.bashrc
    mkdir ~/local
    mkdir ~/node-latest-install
    cd ~/node-latest-install
    curl http://nodejs.org/dist/node-latest.tar.gz | tar xz --strip-components=1
    ./configure --prefix=$HOME/local
    make install
    curl http://npmjs.org/install.sh | sh

    Download the SDK

    npm install azure

    Download the dependencies

    cd azure-sdk-for-node
    npm install 
    cd examples/blog
    npm install
    cd ../tasklist
    npm install
    cd ../../

    Update the storage credentials

    Update ./lib/services/serviceclient.js with your credentials.

    The examples use the dev storage, so just replace the dev storage details with the live endpoint URIs, your account name, and your access key.

    Run the examples

    Start node with either of the following commands:

    cd examples/blog
    node server


    cd examples/tasklist
    node server

    Then browse to http://localhost:1337/ to try the applications.

    • Glenn Block 9:42 am on January 6, 2012 Permalink |

      Hi Richard

      Nice post!

      Instead of having to install all the modules yourself you should be able to just rely on the package.json. Go to the directory where the SDK lives (also for the samples) and just type npm install.

    • Richard 8:51 pm on January 6, 2012 Permalink |

      Thanks Glenn, I have updated the post accordingly. It makes the script quite a bit simpler!
