Updates from November, 2012

  • Richard 7:34 am on November 21, 2012

    Uploading files directly to Blob Storage from the browser 

    Based on some work I did previously to upload data to blobs from the browser, I thought I’d take it a step further and upload whole files.

    The layout of the application should be exactly the same. The browser opens a page which is served from blob storage; it then obtains a shared access signature from a website using a JSONP request. The following node.js code serves that signature, and is the same as before:

    var azure = require("azure");
    var app = require('express')();
    app.enable("jsonp callback");

    var blobService = azure.createBlobService();
    blobService.createContainerIfNotExists("container", function(error){});

    app.get('/getsignature/:file', function(req, res){
        var url = blobService.generateSharedAccessSignature("container", req.params.file, {
            AccessPolicy : {
                Permissions : "rwdl",
                Expiry : getDate()
            }
        });
        res.jsonp({url: url.url()});
    });

    // signatures expire an hour after they are issued
    function getDate(){
        var date = new Date();
        date.setHours(date.getHours() + 1);
        return date;
    }

    app.listen(process.env.PORT || 3000); // start the server

    In the browser, you need some code which will call this service and retrieve the URL containing the shared access signature.

    jQuery.ajax("http://website.com/getsignature/example.txt", {
        dataType: "jsonp",
        success: function(x){
            bloburl = x.url;
        }
    });

    You then need some HTML to choose the file:

    <form id="form1">
     <input type="file" name="fileToUpload" id="fileToUpload"/>
     <input type="button" onclick="uploadFile()" value="Upload" />
    </form>

    …and some code to upload the file to Blob Storage:

    function uploadFile() {
        var file = document.getElementById("fileToUpload").files[0];
        var xhr = new XMLHttpRequest();
        xhr.upload.addEventListener("progress", uploadProgress, false);
        xhr.addEventListener("load", uploadComplete, false);
        xhr.addEventListener("error", uploadFailed, false);
        xhr.addEventListener("abort", uploadCanceled, false);
        xhr.open("PUT", bloburl);
        // blob storage rejects a PUT without a blob type header
        xhr.setRequestHeader("x-ms-blob-type", "BlockBlob");
        xhr.send(file);
    }

    Simple! This seems to work in Chrome, Firefox and IE10.
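The four event handlers wired up in uploadFile (uploadProgress, uploadComplete, uploadFailed, uploadCanceled) aren't shown in the post. Here is a minimal sketch of what they might look like; the progressPercent helper and the console messages are illustrative assumptions, not part of the original:

```javascript
// Hypothetical helper: turn a progress event into a whole-number percentage,
// or null when the total upload size is unknown.
function progressPercent(evt) {
    if (!evt.lengthComputable) return null;
    return Math.round((evt.loaded / evt.total) * 100);
}

function uploadProgress(evt) {
    var pct = progressPercent(evt);
    console.log(pct === null ? "uploading..." : pct + "% uploaded");
}

function uploadComplete(evt) { console.log("upload finished"); }
function uploadFailed(evt)   { console.log("upload failed"); }
function uploadCanceled(evt) { console.log("upload cancelled"); }
```

Hooking these to xhr.upload's progress event (as uploadFile does) is what lets you drive a progress bar during the PUT.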

  • Richard 9:47 pm on November 18, 2012

    Rendering Enums as drop downs in ASP.NET MVC 

    This is a handy view, for automatically turning an Enum property on a ViewModel into a drop down list.

    @model object
    @functions {
        IEnumerable<SelectListItem> ToSelectListItems(Enum enumObj) {
            foreach (var obj in Enum.GetValues(enumObj.GetType())) {
                yield return new SelectListItem {
                    Value = obj.ToString(),
                    Text = Enum.GetName(enumObj.GetType(), obj).Replace("_", " ")
                };
            }
        }
    }
    @if (this.Model is Enum) {
        @Html.DropDownListFor(m => m, ToSelectListItems(this.Model as Enum))
        @Html.ValidationMessageFor(m => m)
    } else {
        @Html.TextBox("", ViewData.TemplateInfo.FormattedModelValue, new { @class = "text-box single-line" })
    }

    Simply call it ‘string.cshtml’ and add it to your Views\Shared\EditorTemplates folder.

    Now when you call @Html.EditorForModel(Model as object) in your view, you get Enum properties rendered as drop downs. Underscores in the Enum names will be displayed as spaces.

  • Richard 11:50 am on November 5, 2012

    Azure SDK 2.0 

    With the release of the new 1.8 Azure SDK came version 2.0 of the .NET storage client.

    A complete list of changes is available, but the main difference is an entirely new table storage client. This article covers what I’ve found out after a quick play.

    Gone is the LINQ support; in its place is an API that makes the developer think about reading and writing entities in the most efficient way.

    To get started, use the nuget package:

    Install-Package WindowsAzure.Storage

    The new SDK affords quite a bit of flexibility, but the easiest way to start is to create a class for each table and inherit from TableEntity. TableEntity provides the PartitionKey, RowKey, ETag, etc. properties, so you just need to define your own (Bar and Baz in this case).

    public class Foo : TableEntity
    {
        public string Bar { get; set; }
        public string Baz { get; set; }
    }

    To get a reference to your table:

    var account = CloudStorageAccount.Parse("xxx");
    var tableClient = account.CreateCloudTableClient();
    var table = tableClient.GetTableReference("foo");

    To write an entity to your table:

    var foo = new Foo
    {
        RowKey = "1",
        PartitionKey = "1",
        Bar = "Hello World"
    };
    table.Execute(TableOperation.Insert(foo));

    To read from the table, use the Execute function again:

    TableResult result = table.Execute(TableOperation.Retrieve<Foo>("1", "1"));
    Foo foo = result.Result as Foo; // this will be null if it doesn't exist

    There are a number of table operations for merge, replace, delete, etc. This is great, as it makes the developer acutely aware of exactly what will happen to the entity when it’s saved. It’s also obvious how to batch operations together (although there’s only one operation in this case!):

    var batch = new TableBatchOperation();
    batch.Insert(foo);
    table.ExecuteBatch(batch);

    Use the TableQuery class to execute queries on other columns, but be careful, these may not be indexed:

    var query = new TableQuery<Foo>()
        .Where(TableQuery.GenerateFilterCondition("Bar", QueryComparisons.Equal, "Hello World"));
    Foo foo = table.ExecuteQuery(query).First();

    A TableQuery can also be used to limit which columns are returned, or just take a count.

    If you don’t specify the type, you get a DynamicTableEntity, which has a dictionary of properties:

    var query = new TableQuery();
    var results = table.ExecuteQuery(query);
    // each item in results is a DynamicTableEntity; read a column with item.Properties["Bar"].StringValue

    In summary, this looks like a well thought out library. The LINQ support in the old SDK hid quite a bit of complexity which, whilst powerful, distanced the developer from the underlying features of the system. The new library feels only slightly closer to the metal, but gives me much more confidence about the code I write.
