
Building in the Cloud with Azure Storage and the Azure SDK


Introduction


Windows Azure and Azure Storage offer a scalable and
robust architecture that borrows much from the familiar feel
of ASP.NET applications while introducing plenty of new
features. This shift away from the traditional client-server
architecture will bring developers new options and new
headaches alike. While "the cloud" is not intended to be the
answer for every application and situation, it can only
become another tool in the proverbial tool belt if you have
at least a general understanding of how it works.


The following walkthrough steps through a cloud-architected
application. It explores how to build a multi-role cloud
service that uses two of the Azure Storage services, Tables
and Queues, by covering the following:


  • Creating a Windows Azure application with a Web Role and Worker Role

  • Azure Storage


    • Set up configuration files with account information

    • Using a Table to store and retrieve data

    • Using a Queue to send messages to the Worker Role


  • Using the Azure SDK StorageClient project for Table and Queue interaction


If you plan to follow along with the code examples in your
own environment, you will need the following installed:

  • Windows Azure SDK (July 2009 CTP)

  • Windows Azure Tools for Microsoft Visual Studio



Important: Installing the Windows Azure Tools for Visual
Studio requires Windows Vista SP1 or greater; the tools
cannot be installed on Windows XP.


The code and screenshots below are based on the July
2009 CTP of Azure and are subject to change before the
production release. Please consult up-to-date documentation
after the release if certain functionality no longer works
or has been changed.


The application built in the following walkthrough uses
one Web Role to take two numeric inputs from a user and
allows that user to come back later to view the results of
adding them together. A Worker Role will be used to pick up
a Queue message, retrieve the two inputs and add them
together, storing the results back to Table storage. While
using a Worker Role for simple addition is overkill, it will
demonstrate all the necessary steps to architect your cloud
application to maintain user interface responsiveness and
scalability.


Creating a Windows Azure Application


Before opening Visual Studio, get in the habit of running
it as an Administrator when working on Azure projects. The
Azure development fabric and development storage, which come
with the Azure Tools for Visual Studio, require
administrative rights when you debug your application.
You can think of the development fabric and storage as your
own little mock cloud running locally.

In Visual Studio, after installing the Azure Tools for
Visual Studio, you will have a new project type option of
"Cloud Service" with one template to pick. Give the project
a name as you would with any new project,
"MyCloudService" for this example.



Figure 1.1 New Cloud Service

When you click OK, Visual Studio launches a wizard
interface that lists the role types you can add to the Cloud
Service project, as shown in Figure 1.2.

Each role added will create a separate project file in
the Visual Studio Solution Explorer. Before clicking OK you
have the ability to rename each role by clicking on the
pencil icon to the right of the role’s name. This is useful
when you have multiple Web or Worker roles that will be
designated to perform different functions.



Figure 1.2 Creating and Naming Cloud Service Roles

Click OK to finish the wizard. In the Solution Explorer
window you will notice that, in addition to the two role
projects you added in the wizard, there is a third project
for the cloud service. Each cloud service solution has this
additional project, which contains the
ServiceDefinition and
ServiceConfiguration files. These files control, among other
things, the number of instances Azure will start for each
role and the Azure Storage configuration.


Since we will be using Azure Table and Queue storage, open
both the ServiceDefinition and
ServiceConfiguration files and add the following
to each respective file.


ServiceDefinition.csdef


<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="MyCloudService">
  <WebRole name="MyAppWebRole" enableNativeCodeExecution="false">
    <InputEndpoints>
      <!-- Must use port 80 for http and port 443 for https when running in the cloud -->
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </InputEndpoints>
    <ConfigurationSettings>
      <Setting name="AccountName" />
      <Setting name="AccountSharedKey" />
      <Setting name="QueueStorageEndpoint" />
      <Setting name="TableStorageEndpoint" />
    </ConfigurationSettings>
  </WebRole>
  <WorkerRole name="MyAppWorkerRole" enableNativeCodeExecution="false">
    <ConfigurationSettings>
      <Setting name="AccountName" />
      <Setting name="AccountSharedKey" />
      <Setting name="QueueStorageEndpoint" />
      <Setting name="TableStorageEndpoint" />
    </ConfigurationSettings>
  </WorkerRole>
</ServiceDefinition>

ServiceConfiguration.cscfg


<?xml version="1.0"?>
<ServiceConfiguration serviceName="MyCloudService">
  <Role name="MyAppWebRole">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="AccountName" value="devstoreaccount1" />
      <Setting name="AccountSharedKey" value="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==" />
      <Setting name="QueueStorageEndpoint" value="http://127.0.0.1:10001" />
      <Setting name="TableStorageEndpoint" value="http://127.0.0.1:10002" />
    </ConfigurationSettings>
  </Role>
  <Role name="MyAppWorkerRole">
    <Instances count="1" />
    <ConfigurationSettings>
      <Setting name="AccountName" value="devstoreaccount1" />
      <Setting name="AccountSharedKey" value="Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==" />
      <Setting name="QueueStorageEndpoint" value="http://127.0.0.1:10001" />
      <Setting name="TableStorageEndpoint" value="http://127.0.0.1:10002" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>


For development, the AccountName and
AccountSharedKey values are fixed to what is
shown above; when working with the development environment
you must use that name and key. This allows you to test
your authentication code when accessing and storing data in
Azure Storage.

The endpoint values shown are for the development
environment. To verify your development Azure Storage
endpoints, debug the application and open the Development
Storage management window found in the system tray. If this
is your first time starting the development storage, the
Table service will be stopped because the development
storage tables have not yet been created. You will create
them in a later step.



Figure 1.3 First Run of Development Storage

In Default.aspx, add two text boxes and a
button for collecting the two values to add, along with a
label to display a retrieval key. Then add an additional
text box and button to retrieve the calculated value and a
label to display it. Create the click event handler for
each button. Figure 1.4 below shows the form with all
controls, plus layout and some descriptive labels.



Figure 1.4 ASP.NET Form w/Controls
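
The page markup is not shown in this walkthrough, but based on the control names used in the code-behind later on, Default.aspx might look roughly like the sketch below. The control IDs and click handlers match the later code; the button text and layout are assumptions made purely for illustration.

<%-- Sketch of the Default.aspx controls; IDs and OnClick handlers are taken from
     the code-behind shown later, everything else is illustrative. --%>
<asp:TextBox ID="txtValueOne" runat="server" />
<asp:TextBox ID="txtValueTwo" runat="server" />
<asp:Button ID="cmdStartCalculation" runat="server" Text="Start Calculation"
            OnClick="cmdStartCalculation_Click" />
<asp:Label ID="lblCalculationKey" runat="server" Visible="false" />

<asp:TextBox ID="txtCalculationKey" runat="server" />
<asp:Button ID="cmdRetrieveCalculation" runat="server" Text="Retrieve Calculation"
            OnClick="cmdRetrieveCalculation_Click" />
<asp:Label ID="lblCalculatedValue" runat="server" Visible="false" />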

At this point in the walkthrough you have completed the
initial setup of the application, configured it with Azure
Storage account information and laid out the user interface.
The following steps will add the wiring to consume Azure
Storage and interact across cloud service roles.


Windows Azure Storage


There are three types of storage available in Windows
Azure Storage: Tables, Blobs and Queues. Blob storage is
meant for storing large chunks of unstructured data. Queue
storage is intended for temporary messaging
between Web roles and Worker roles. Finally, Table storage
is meant for storing structured data in a scalable way.
Below is a table for quick reference.

Windows Azure Storage Types

Storage Type   Purpose
Blob           Large chunks of unstructured data
Queue          Temporary messages passed between Web roles and Worker roles
Table          Structured, non-relational data stored in a scalable way

Important: Table storage is NOT the same thing as
SQL Azure, nor does it have anything at all to do with
a relational database model. Table storage is not
relational.


Azure Storage is accessed via a REST interface and is
therefore accessible by anything that can make HTTP
requests. It also means that Table storage is accessible
using ADO.NET Data Services. Lucky for us, Microsoft has
already built a managed code wrapper for interfacing with
Azure Storage as part of the Azure SDK. As of the July CTP
the project is called StorageClient and can be
found (full source) in "C:\Program Files\Windows Azure
SDK\v1.0\Samples\StorageClient" after installing the Azure
SDK. You can either add this project directly to your cloud
service solution or compile the assembly and reference it in
your solution. The StorageClient interface will
be used throughout this example to consume Table and Queue
storage.
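
To give a feel for what the wrapper buys you, the short snippet below shows account information being read from the service configuration and a queue reference being created in just a few lines of managed code. It uses only the StorageClient types and calls that appear later in this walkthrough (StorageAccountInfo, QueueStorage and MessageQueue), so treat it as a preview rather than new API surface.

using Microsoft.Samples.ServiceHosting.StorageClient;

// Read the account name, shared key and endpoint from the role configuration...
StorageAccountInfo accountQueueInfo = StorageAccountInfo.GetDefaultQueueStorageAccountFromConfiguration();

// ...and let StorageClient issue the underlying REST requests for you.
QueueStorage queueStorage = QueueStorage.Create(accountQueueInfo);
MessageQueue queue = queueStorage.GetQueue("calculations");
if (!queue.DoesQueueExist())
    queue.CreateQueue();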


Using Windows Azure Table Storage


Table storage in development varies slightly from what is
found in the Windows Azure Storage services. One example of
this difference is that development Table storage requires
that the tables have a fixed schema. Table storage in the
full Azure Storage service does not require table
configuration and supports adding varying numbers and types
of properties to different entities within the same table. A
complete list of development and production differences can
be found at http://msdn.microsoft.com/en-us/library/dd320275.aspx.


To create the table schema and the data service context for
accessing the table's data, create two separate classes. Call
the first class "Calculation.cs" and the second
"CalculationDataServiceContext.cs". In the
project where these two classes are created, add a new
reference to System.Data.Services.Client. Then
add the following to each respective class.


(Inline comments provide additional details.)
Calculation.cs


public class Calculation : Microsoft.Samples.ServiceHosting.StorageClient.TableStorageEntity
{
    // Inheriting from TableStorageEntity predefines the 3 properties required
    // for storing any entity in Table storage: PartitionKey, RowKey and Timestamp.
    public Calculation()
    {
        // Hardcoded for this example but can be anything; best practice
        // is to assign the PartitionKey based on a value that groups
        // your data to optimize search performance.
        PartitionKey = "TEST";

        // Unique identifier within each partition
        RowKey = DateTime.Now.Ticks.ToString();
    }

    public int ValueOne { get; set; }
    public int ValueTwo { get; set; }
    public int CalcValue { get; set; }
}

CalculationDataServiceContext.cs

public class CalculationDataServiceContext : TableStorageDataServiceContext
{
    public CalculationDataServiceContext(StorageAccountInfo accountInfo)
        : base(accountInfo)
    { }

    // Properties that return IQueryable<T> are used by Visual Studio to create a
    // table in development storage named after the property. The schema is
    // defined by the public properties of the model class.
    public IQueryable<Calculation> Calculations
    {
        get
        {
            return this.CreateQuery<Calculation>("Calculations");
        }
    }

    // Takes the two input values, inserts them into Table storage and
    // returns the RowKey so the result can be retrieved later.
    public string AddCalculation(int valueOne, int valueTwo)
    {
        Calculation calc = new Calculation();
        calc.ValueOne = valueOne;
        calc.ValueTwo = valueTwo;

        this.AddObject("Calculations", calc);
        this.SaveChanges();

        return calc.RowKey;
    }
}

After successfully creating the two classes, the next
step is to create the test storage tables via Visual Studio.
In the Solution Explorer, right-click on the
MyCloudService project and select “Create Test
Storage Tables”. Visual Studio will display a message when
it is complete. You will now be able to debug the
application, open the Development Storage management window
and see the Table service running.



Figure 1.5 Development Storage After Creating Table

You are now ready to add code to the
"Default.aspx.cs" file so that new Calculation
objects can be created and saved to Table storage. This
is done by adding code to both the
Page_Load method and the start calculation
button's click event handler.


Default.aspx.cs (Part 1)


using Microsoft.Samples.ServiceHosting.StorageClient;
using System.Data.Services.Client;

// ...

protected void Page_Load(object sender, EventArgs e)
{
    try
    {
        StorageAccountInfo accountInfo = StorageAccountInfo.GetAccountInfoFromConfiguration("TableStorageEndpoint");

        // Not required in development, since the tables are created
        // manually, but needed in the cloud to create the tables the
        // first time the application runs.
        TableStorage.CreateTablesFromModel(typeof(CalculationDataServiceContext), accountInfo);
    }
    catch (DataServiceRequestException ex)
    {
        // Unable to connect to Table storage. In development,
        // verify the service is running. In the cloud, check that
        // the configuration values are correct.

        // Handle error
    }
}

protected void cmdStartCalculation_Click(object sender, EventArgs e)
{
    StorageAccountInfo accountTableInfo = StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration();

    CalculationDataServiceContext calculationDSContext = new CalculationDataServiceContext(accountTableInfo);

    lblCalculationKey.Text =
        calculationDSContext.AddCalculation(int.Parse(txtValueOne.Text), int.Parse(txtValueTwo.Text));
    lblCalculationKey.Visible = true;
}

When you run the application now, you will be able to
enter two values, store them in Table storage and get back
the RowKey so the entity can be retrieved later. The next
step is to send a message to the Worker Role via Queue
storage to retrieve the values and add them together.


Using Windows Azure Queue Storage


To send a message to the Worker Role you will first need to
insert code into the "Default.aspx.cs" file to
add the message to the Queue. The code needs to be added to
the same button click event where the Calculation object
is created and the RowKey is returned. The
Worker Role needs to know both the
PartitionKey and RowKey to
retrieve the correct values and perform the calculation. For
this example the PartitionKey is hardcoded to
"TEST", so you only need to pass the RowKey. In
a more robust application you would most likely have to pass
both values for the Worker Role to be able to retrieve a
unique record; a sketch of that variation follows the code below.


Default.aspx.cs (Part 2)


protected void cmdStartCalculation_Click(object sender, EventArgs e)
{
    // ...

    StorageAccountInfo accountQueueInfo = StorageAccountInfo.GetDefaultQueueStorageAccountFromConfiguration();

    QueueStorage qs = QueueStorage.Create(accountQueueInfo);
    MessageQueue queue = qs.GetQueue("calculations");
    if (!queue.DoesQueueExist())
        queue.CreateQueue();

    Message msg = new Message(lblCalculationKey.Text);
    queue.PutMessage(msg);
}
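
As noted above, a more robust application would pass both the PartitionKey and the RowKey in the message. One simple approach, shown below as a sketch rather than as part of the walkthrough's code, is to pack both values into the message text with a delimiter and split them apart again in the Worker Role. It assumes the click handler has access to the PartitionKey (for example, by changing AddCalculation to return the whole Calculation entity); receivedMessage is a hypothetical name for the message pulled off the queue.

// Web Role: pack both keys into a single delimited message (illustrative only).
Message msg = new Message(calc.PartitionKey + "|" + calc.RowKey);
queue.PutMessage(msg);

// Worker Role: split the message back into its two parts.
string[] keys = receivedMessage.ContentAsString().Split('|');
string partitionKey = keys[0];   // use this in the query instead of the hardcoded "TEST"
string rowKey = keys[1];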


The next step is to open the "WorkerRole.cs"
file and add the code that retrieves the message from
Queue storage, performs the calculation and updates the
entity in Table storage. To do this, add the following code
to the Start method of the
"WorkerRole.cs" file.


WorkerRole.cs


using Microsoft.Samples.ServiceHosting.StorageClient;
using System.Data.Services.Client;   // SaveChangesOptions
using System.Linq;                    // LINQ query over the data service context
using System.Threading;               // Thread.Sleep

// ...

public override void Start()
{
    RoleManager.WriteToLog("Information", "Worker Process entry point called");

    // Create account info objects for retrieving messages and accessing Table storage
    StorageAccountInfo accountTableInfo = StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration();
    StorageAccountInfo accountQueueInfo = StorageAccountInfo.GetDefaultQueueStorageAccountFromConfiguration();

    CalculationDataServiceContext calcDSContext = new CalculationDataServiceContext(accountTableInfo);

    QueueStorage qs = QueueStorage.Create(accountQueueInfo);
    MessageQueue queue = qs.GetQueue("calculations");

    while (true)
    {
        Thread.Sleep(10000);

        if (queue.DoesQueueExist())
        {
            Message msg = queue.GetMessage();
            if (msg != null)
            {
                // Retrieve the Calculation entity from Table storage
                var calc = (from c in calcDSContext.Calculations
                            where c.PartitionKey == "TEST"
                               && c.RowKey == msg.ContentAsString()
                            select c).FirstOrDefault();

                if (calc != null)
                {
                    calc.CalcValue = calc.ValueOne + calc.ValueTwo;

                    // Update the entity with the new CalcValue property populated
                    calcDSContext.UpdateObject(calc);
                    calcDSContext.SaveChanges(SaveChangesOptions.ReplaceOnUpdate);

                    // Need to delete the message once it has been processed!
                    queue.DeleteMessage(msg);
                }
            }
        }
    }
}


You now have a functioning Worker Role that checks the
"calculations" queue for messages every 10 seconds.
When it finds a message, it retrieves the corresponding
Calculation entity from Table storage, populates the
calculated value property and saves the entity back to Table
storage. The last, and important, step is that the
message is deleted from the queue so it is not retrieved
again.
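
If the Worker Role were to fail partway through processing, a message that has not yet been deleted becomes visible on the queue again after its visibility timeout and can be picked up on a later pass. A defensive variation, sketched below and not part of the walkthrough's code, is to delete the message only after the table update succeeds:

Message msg = queue.GetMessage();
if (msg != null)
{
    try
    {
        // ... retrieve the entity, calculate and save changes as shown above ...

        // Delete only once the Table storage update has succeeded.
        queue.DeleteMessage(msg);
    }
    catch (Exception)
    {
        // Leave the message on the queue; it becomes visible again after
        // its timeout and will be retried on a later pass.
    }
}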

Putting It All Together


Finally, you will want to add the code to
"Default.aspx.cs" that allows the user to
retrieve the calculated value. To enable this, add the
following code to the retrieve calculation button's
click event handler.


Default.aspx.cs (Part 3)


protected void cmdRetrieveCalculation_Click(object sender, EventArgs e)
{
    // Create an account info object for accessing Table storage
    StorageAccountInfo accountTableInfo = StorageAccountInfo.GetDefaultTableStorageAccountFromConfiguration();

    CalculationDataServiceContext calcDSContext = new CalculationDataServiceContext(accountTableInfo);

    // Retrieve the Calculation entity from Table storage
    Calculation calc = (from c in calcDSContext.Calculations
                        where c.PartitionKey == "TEST"
                           && c.RowKey == txtCalculationKey.Text
                        select c).FirstOrDefault();

    // Guard against an unknown key; FirstOrDefault returns null if no entity matches.
    if (calc == null)
        return;

    lblCalculatedValue.Text = calc.CalcValue.ToString();
    lblCalculatedValue.Visible = true;
}


Debug the application and you will now be able to start a
calculation and retrieve the result. Keep in mind that the
Worker Role only wakes up every 10 seconds to check the
queue, so wait an appropriate amount of time before trying
to retrieve the calculated value.



Figure 1.6 Final Functioning Application Screen

Conclusion

While building a basic adding machine in the cloud is
hardly necessary, the walkthrough above demonstrates many
important features of cloud service applications. It has
shown you how to create a new cloud service, set up the
configuration files to use Azure Storage, use Table
storage to store, retrieve and update data, use Queue
storage to pass messages between cloud service roles and,
finally, how the StorageClient project provided
with the Azure SDK can be leveraged for quick development
against the Azure Storage services.


About the Author


Matt Goebel is the Founder and President of AP
Innovation, Inc. in Indianapolis, Indiana. He can be reached
at 859-802-7591 or matt.goebel@apinnovation.com.
