Friday, December 31, 2010

Windows Azure and Cloud Computing Posts for 12/31/2010+

A compendium of Windows Azure, Windows Azure Platform Appliance, SQL Azure Database, AppFabric and other cloud-computing articles.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the single article to which you want to navigate.


Azure Blob, Drive, Table and Queue Services

The MSDN Library added Storing Data in the Windows Azure Platform for Windows Phone on 12/15/2010 (missed when posted):

The Windows Azure platform provides several data storage options for Windows Phone applications. This topic introduces components of the Windows Azure platform and describes how they relate to a general architecture for storing non-public data in the cloud. For more information about how Windows Phone applications can use the Windows Azure platform, see Windows Azure Platform Overview for Windows Phone.

This topic describes current Windows Azure platform features and a basic Windows Azure application architecture. For information about the latest Windows Azure features, see the Windows Azure home page and the Windows Azure AppFabric home page.

WCF services and WCF Data Services hosted on the Windows Azure platform can be consumed by Windows Phone applications just like other HTTP-based web services. For more information about consuming web services in your Windows Phone applications, see Connecting to Web and Data Services for Windows Phone.

Architectural Overview

This basic “client-server” architecture is comprised of three tiers. The Windows Phone application is the “client” application in the client tier; the Windows Azure Web role is the “server” application in the Web services tier; and Windows Azure storage services and SQL Azure provide data storage in the data storage tier. This architecture is shown in the following diagram:

Windows Azure Platform Storage for Windows Phone

Note: This architecture is designed for non-public data that requires authentication. For public data, blobs and blob containers can be directly exposed to the Web and read via anonymous requests. For more information, see Setting Access Control for Containers.
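Anonymous access works because every blob has a predictable URI derived from the account, container, and blob names. The sketch below shows the addressing scheme; the account, container, and blob names used here are hypothetical:

```python
def public_blob_url(account, container, blob):
    """Build the anonymous-access URL for a blob in a publicly readable container."""
    return "http://{0}.blob.core.windows.net/{1}/{2}".format(account, container, blob)

# A plain HTTP GET against this URL succeeds when the container permits public read access.
print(public_blob_url("myaccount", "guestbookpics", "image_1.jpg"))
# http://myaccount.blob.core.windows.net/guestbookpics/image_1.jpg
```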

Client Tier

The client tier is comprised of the Windows Phone application and isolated storage. Isolated storage is used to store the application data that is needed for subsequent launches of the application. Isolated storage can also be used to temporarily store data before it is saved to the data storage tier. For more information about isolated storage, see Isolated Storage Overview for Windows Phone.

Web Service Tier

The Web service tier is comprised of a Windows Azure web role that hosts one or more web services based on Windows Communication Foundation (WCF) or WCF Data Services. WCF is a part of the .NET Framework that provides a unified programming model for rapidly building service-oriented applications. WCF Data Services (formerly known as ADO.NET Data Services) enables the creation and consumption of Open Data Protocol (OData) services on the Web. For more information, see the WCF Developer Center and the WCF Data Services Developer Center.

In this architecture, the Web service tier enables abstraction of the data storage tier. By using widely available public specifications to define the protocols and abstract data structures that the web service implements, a wide variety of clients can interact with the service, including Windows Phone applications. Abstraction of the data storage tier also allows the data storage implementation to adapt to changing business requirements without affecting the client tier.

For WCF services, abstract data structures are defined by a data contract. The data contract is an agreement between the client and service that describes the data to be exchanged. For more information, see Using Data Contracts.

For OData services, abstract data structures are defined by a data model. WCF Data Services supports a wide variety of data models. For more information, see Exposing Your Data as a Service (WCF Data Services).
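Because an OData service exposes its data model through URIs, queries are composed as system query options on the address itself. A minimal sketch of the convention, using an invented service root and entity set (URL encoding is omitted for clarity):

```python
def odata_query(service_root, entity_set, **options):
    """Compose an OData query URI; keyword names map to $-prefixed system query options."""
    uri = "{0}/{1}".format(service_root.rstrip("/"), entity_set)
    query = "&".join("${0}={1}".format(k, v) for k, v in sorted(options.items()))
    return uri + ("?" + query if query else "")

print(odata_query("http://example.com/guestbook.svc", "Entries", top=10))
# http://example.com/guestbook.svc/Entries?$top=10
```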

In this architecture, the Windows Phone application communicates with the web service to authenticate users based on their username and password. Depending on the value of the data, user credentials for accessing the Web service tier may or may not be stored on the phone. The Windows Phone application does not directly connect to the data storage tier. Instead, the Web role accesses the data storage tier on behalf of the Windows Phone application.

Security Note: We recommend that Windows Phone applications do not connect to the data storage tier directly. This prevents keys and credentials for the data storage tier from being stored or entered on the phone. In this architecture, only the Web role is granted access to the data storage tier. For more information about web service security, see Web Service Security for Windows Phone.

Data Storage Tier

The data storage tier is comprised of the Windows Azure storage services and SQL Azure. The Windows Azure storage services include the Blob, Queue, and Table services. SQL Azure provides a relational database service. A Windows Azure role can use any combination of these services to store and serve data to a Windows Phone application. For more information about these services, see Understanding Data Storage Offerings on the Windows Azure Platform.

Note: The Windows Azure platform also provides Windows Azure roles with a temporary storage repository named local storage. A Windows Azure role can access local storage like a file system. Local storage is not recommended for long-term, durable storage of your data.

Configuring a Windows Azure Storage Service

Before using a Windows Azure storage service, the respective endpoint must first be created and configured programmatically. For example, to store images to a blob service for the first time, the Web role must first create and configure the blob container that will store the images.
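Creating and configuring a storage endpoint boils down to a single REST call against the Blob service, which is why the web role only has to perform it once. A sketch of the request that the storage client library issues on your behalf (the account and container names are hypothetical):

```python
def create_container_request(account, container):
    """Return the (method, url) pair for the Blob service's Create Container operation."""
    url = "http://{0}.blob.core.windows.net/{1}?restype=container".format(account, container)
    return ("PUT", url)

method, url = create_container_request("myaccount", "guestbookpics")
print(method, url)
# PUT http://myaccount.blob.core.windows.net/guestbookpics?restype=container
```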

Configuring a SQL Azure Database

There are several ways that you can create and configure a SQL Azure database:

  • Windows Azure Platform Management Portal: Use Windows Azure Platform Management Portal to create and manage databases.

  • SQL Server Management Studio: Manage a SQL Azure database similar to an on-premise instance of SQL Server, using SQL Server Management Studio from SQL Server 2008 R2.

  • Transact-SQL: Use a Windows Azure role to programmatically issue Transact-SQL statements to SQL Azure.

Important Note: You must first configure the SQL Azure firewall to connect to the database from inside or outside of the Windows Azure platform. For more information, see How to: Configure the SQL Azure Firewall.

Getting Started with the Windows Azure Platform

Perform the following steps to get started building web services like the one described in this topic:



Learn: Review the developer centers for the latest information about the products and educational references.

Install: Install the development tools that emulate Windows Azure on your computer. Install SQL Server 2008 R2 Express to develop local or SQL Azure relational databases.

Join: To use Windows Azure or SQL Azure, you will need an account. Note: An account is not required for developing applications and databases locally.

Create: Create your first Windows Azure local application. Create a local or SQL Azure database.

Develop: Develop a web role that hosts a WCF service or WCF data service. For storage, the web role can use development storage (a local simulation of Windows Azure storage services) or a relational database. If using a database, the local web service can use SQL Azure or a local database that is hosted with SQL Express.

Deploy: Deploy your web role application and database to the cloud.

Other Resources

<Return to section navigation list> 

SQL Azure Database and Reporting

No significant articles today.

<Return to section navigation list> 

MarketPlace DataMarket and OData

Justin Plock (@jplock) posted his Python OData Client Library to github on 12/30/2010:

I’ve not been able to find any documentation for the library and it’s not listed on the OData SDK page.
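Lacking documentation, it helps to remember that an OData feed is plain Atom XML, so even a thin Python client mostly reduces to namespace-aware parsing. A hedged sketch against a canned payload (the feed contents are invented; the namespaces are the standard OData ones):

```python
import xml.etree.ElementTree as ET

M = "{http://schemas.microsoft.com/ado/2007/08/dataservices/metadata}"
D = "{http://schemas.microsoft.com/ado/2007/08/dataservices}"

SAMPLE_FEED = """<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom"
      xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"
      xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices">
  <entry>
    <content type="application/xml">
      <m:properties>
        <d:GuestName>Ada</d:GuestName>
        <d:Message>Hello from the cloud</d:Message>
      </m:properties>
    </content>
  </entry>
</feed>"""

def parse_entries(feed_xml):
    """Extract each entry's OData properties into a plain dict."""
    root = ET.fromstring(feed_xml)
    return [{child.tag[len(D):]: child.text for child in props}
            for props in root.iter(M + "properties")]

print(parse_entries(SAMPLE_FEED))
```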

Luxor Technologies asserted “Now you can import Money Service Business data directly from the cloud into your proprietary application, Microsoft Excel and Microsoft Word documents through Microsoft's Azure DataMarket” in an introduction to its MSBScope™ DataMarket Money Service Business Data on the Cloud post of 12/2010:

Luxor has partnered with Microsoft to offer its MSBScope data through Microsoft's Azure DataMarket. Azure is Microsoft's answer to cloud computing and provides a growing collection of program features that can be accessed directly from Microsoft programs.

Using Excel, Word or your own customized Windows-based program, you can plug in MSB data from Luxor's MSBScope™ DataMarket service. Using Microsoft's easy-to-implement interface, you are a click away from dragging MSB data into your application.

Match MSB data to names in your Excel spreadsheet, or use Access to implement a screen-based front end for entering name information and easily match that data against more than 40,000 FinCEN-registered money service businesses.

Phonetic searches are done under your control. You can set the percentage matching threshold, the maximum number of matches and a list of states within which to search. Geographic searches list MSBs within a mileage radius of your target address. Results include legal name, dba name, address, dates, services offered, states within which the MSB does business, and latitude and longitude for mapping. Optionally, an HTML-formatted report is returned for each match.

Billing is done through Microsoft, and a variety of search bundles are available.


  • Phonetic Name Search
  • Geographic Target Search
  • Microsoft Azure Interface
  • Simple Billing
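Luxor doesn't say which algorithm drives its phonetic name search, but Soundex is the classic technique behind this style of matching and illustrates the idea; the sketch below is purely illustrative and is not MSBScope's implementation:

```python
def soundex(name):
    """Classic Soundex: the first letter plus three digits encoding consonant groups."""
    codes = {c: str(d) for d, group in enumerate(
        ["BFPV", "CGJKQSXZ", "DT", "L", "MN", "R"], start=1) for c in group}
    name = name.upper()
    result = name[0]
    last = codes.get(name[0])
    for c in name[1:]:
        code = codes.get(c)
        if code and code != last:
            result += code
        if c not in "HW":  # H and W do not reset the previous consonant code
            last = code
    return (result + "000")[:4]  # pad or truncate to four characters

# Names that sound alike map to the same code.
print(soundex("Robert"), soundex("Rupert"))
# R163 R163
```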

I couldn’t find an entry for Luxor Technologies or MSBScope in the current DataMarket Publishers page or by browsing the list of available DataMarket services.

The Windows Azure DataMarket Team asserted “DataMarket enables governments to achieve their goals of transparency, participation and collaboration, streamlining the process of publishing data, and making it easier for constituents to access the data from the applications they currently use and for developers to programmatically access the data” as an introduction to its two-page DataMarket for Government PDF brochure published on 12/20/2010 (missed when posted):


Elisa Flasko reiterated the DataMarket Team’s intent to secure DataMarket Section 508 Compliance & FISMA Certification on 12/19/2010:

At Microsoft, we are committed to developing products that are accessible to everyone. We take a strategic approach to accessibility by focusing on integrating accessibility into product planning, research and development, product development, and testing. As such, we are very excited to announce that DataMarket is Section 508 compliant and that the Windows Azure Marketplace DataMarket v1.0 Voluntary Product Accessibility Template (VPAT) is now available on the Microsoft Section 508 VPATs site.

At this time, we are also announcing our intent to secure certification for DataMarket to demonstrate compliance with the Federal Information Security Management Act (FISMA). FISMA establishes responsibility and accountability for the security of all federal agency information systems and defines security requirements that must be met by all US Federal government information systems.

The MSDN Library added Connecting to Web and Data Services for Windows Phone on 12/15/2010:

The Internet is host to an extensive variety of web and data services that you can use in your Windows Phone applications to create compelling new user experiences. This topic introduces web and data services, and describes the primary classes and utilities that you can use for building web-integrated Windows Phone applications.

Introduction to Web and Data Services

Much of the success of the Internet is due to the Hypertext Transfer Protocol (HTTP). HTTP is a relatively simple and nearly ubiquitous networking protocol that web browsers and web service client applications use for exchanging information with servers across the Internet. HTTP is the foundation on which most web services are built.

Web services enable programmatic access to a wide variety of data over the Internet. A data service is an HTTP-based Web service that implements the Open Data Protocol (OData) to expose data as resources that are defined by a data model and addressable by Uniform Resource Identifiers (URIs).

Web and data services each use an open XML-based language to describe their web-based API. The Web Service Description Language (WSDL) is used to describe the services that a web service offers. The Conceptual Schema Definition Language (CSDL) describes the Entity Data Model (EDM) that a data service offers. For more information, see Web Services Description Language (WSDL) and Conceptual Schema Definition File Format.

Classes and Utilities

The following list contains the classes that you can use directly to make web requests, as well as the utilities available for generating other classes optimized for particular kinds of web requests from your Windows Phone applications:

  • WebClient Class: Provides common methods for sending data to and receiving data from a URI-based resource.

  • HttpWebRequest Class: Provides an HTTP-specific implementation of the abstract WebRequest class.

  • Silverlight Service Model Proxy Generation Tool (SLsvcUtil.exe): Generates proxy classes based on a web service WSDL file.

  • Visual Studio Add Service Reference Feature: Generates proxy classes based on a web service WSDL file.

  • WCF Data Service Client Utility (DataSvcUtil.exe): Generates proxy classes based on a data service CSDL file.

Note:

In this release of the Windows Phone Application Platform, the Visual Studio Add Service Reference feature is not supported for data services (OData).

The following list shows which classes can be used for the various types of HTTP-based programming:

  • WebClient and HttpWebRequest: general HTTP requests, web services, and data services.

  • Proxy classes generated by SLsvcUtil.exe or the Add Service Reference feature: web services.

  • Proxy classes generated by DataSvcUtil.exe: data services (OData).

The WebClient and HttpWebRequest classes can be used for a wide range of HTTP-based programming, from general HTTP requests to programming web and data services. Depending on how your application uses a web or data service, using the WebClient or HttpWebRequest classes exclusively may require you to write a significant amount of code.

When developing a web or data services client application, an alternative to programming at the HTTP level is to use a proxy class. A proxy class represents the web or data service and is generated from the corresponding WSDL or CSDL file, respectively. See the following sections of this topic for more information.

Web Services

Because the vast majority of Web services published on the Internet are based on HTTP, you can use the HttpWebRequest and WebClient classes to access web services from Windows Phone applications. To help ease the task of generating the additional code that web services often require, you can use the Silverlight Service Model Proxy Generation Tool (SLsvcUtil.exe) or the Visual Studio Add Service Reference feature to generate a proxy class.

A web service proxy class implements the serialization, request, and response code for a web service, based on the web service WSDL file. You can use the generated proxy class in your Windows Phone application for communicating with the corresponding web service. For more information, see Using SLsvcUtil.exe to Access a Service.

Important Note:

Only code generated from the Silverlight 3 version of SLsvcUtil.exe is supported for use with Windows Phone applications. This version is included in Microsoft Visual Studio® 2010 Express for Windows® Phone and cannot generate code for features that are specific to Silverlight 4. For more details, see Networking in Silverlight for Windows Phone.

Data Services (OData)

A data service is an HTTP-based Web service that implements the Open Data Protocol (OData) to expose data as resources that are defined by a data model and addressable by URIs. This enables you to access and change data using the semantics of representational state transfer (REST), specifically the standard HTTP verbs of GET, PUT, POST, and DELETE. For more information about OData, see Open Data Protocol (OData) Overview for Windows Phone.
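The REST mapping is mechanical: each CRUD operation against a resource URI becomes one of the four verbs. A sketch of that mapping, using an invented service root and entity set:

```python
VERBS = {"create": "POST", "read": "GET", "update": "PUT", "delete": "DELETE"}

def odata_request(operation, service_root, entity_set, key=None):
    """Pair the HTTP verb with the resource URI for a CRUD operation on an OData service."""
    uri = "{0}/{1}".format(service_root.rstrip("/"), entity_set)
    if key is not None:
        uri += "({0})".format(key)  # OData addresses an individual entity by key
    return (VERBS[operation], uri)

print(odata_request("read", "http://example.com/svc", "Entries"))
# ('GET', 'http://example.com/svc/Entries')
```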

Because data services are based on HTTP, you can use the HttpWebRequest and WebClient classes to access data services from Windows Phone applications. To help ease the task of generating the additional code that a data service requires, you can use the WCF Data Service Client Utility, DataSvcUtil.exe, to generate a proxy class based on the data service CSDL file. You can use the generated proxy class in your Windows Phone application for communicating with the corresponding data service. DataSvcUtil.exe is part of the OData Client Library for Windows Phone. For an example of using this utility, see How to: Consume an OData Service for Windows Phone.

Note:

WCF Data Services enables the creation and consumption of Open Data Protocol (OData) services from the Web in .NET Framework applications. For more information, see the WCF Data Services Developer Center.

Security Considerations

When connecting to a web service that requires an application key, do not store the application key with an application that will be run on a device. Instead, you can create a proxy web service to authenticate a user and call an external cloud service with the application key. For more information about security recommendations, see Web Service Security for Windows Phone.


Networking support for Windows Phone is based on Silverlight 3. For a full list of differences in networking support for Windows Phone between Silverlight 3 and Silverlight 4, see Networking for Windows Phone.

When porting web service client code for use in a Windows Phone application, check the Silverlight APIs to ensure that methods used in the code are supported. For more information about supported Silverlight APIs for Windows Phone, see Class Library Support for Windows Phone.

See Also

<Return to section navigation list> 

Windows Azure AppFabric: Access Control and Service Bus

The MSDN Library added an API Reference (Windows Azure AppFabric CTP Caching) topic on 9/28/2010 (missed when published):

The Windows Azure AppFabric CTP October release uses the same cache client programming model as the on-premises Windows Server AppFabric solution. In most cases, you can learn how to programmatically access a Windows Azure AppFabric cache by reviewing the Windows Server AppFabric cache client development documentation and class library reference. However, there are important differences when developing a Windows Azure AppFabric solution.

The following information provides more details on the support for specific APIs in the Windows Azure AppFabric CTP SDK.

API Support in Windows Azure AppFabric Labs

The following classes, methods, or properties have no support or limited support in Windows Azure AppFabric CTP October release. These are in the Microsoft.ApplicationServer.Caching namespace.


Important: The previous list highlights the most important differences in the API, but it is not exhaustive. For example, there are many methods that have an overload which takes a region parameter. These overloads would not be supported, because custom regions are not supported.

See Also

Concepts: Windows Azure AppFabric CTP Caching

<Return to section navigation list> 

Windows Azure Virtual Network, Connect, RDP and CDN

No significant articles today.

<Return to section navigation list> 

Live Windows Azure Apps, APIs, Tools and Test Harnesses

D. R. McGhee described Beginning development with Windows Azure in this 12/31/2010 post:

Getting started with Windows Azure isn’t as tricky or daunting as you may think.

After all, it’s familiar tools, with the end product residing in a different place.

I’d concur with my colleague Tom Hollander who argued recently that the best way to learn new technologies is via a real or realistic assignment.

The following links are just ways of filling in your knowledge gaps. It’s also worthwhile checking out instructor-led events such as the RDN if you are like me in putting off training for another day. It might be worth considering the Windows Azure platform Acceleration Technical Training Tour – Sydney, 27th/28th January, if you are in town.

Starting links:

  1. On Demand training (example – there are many in the series)
  2. Training kits
  3. Boot camp series

Whatever your choice it may be worthwhile drawing up a learning plan (like Buck Woody’s example) and signing up to one of the offers or discovery packs.

Morebits continues his/her series with Building Windows Azure Service Part4: Web Role UI Handler of 12/30/2010:

In this post, you will create the UI that enables the user to perform read and write operations on the GuestBookEntry table. You will update the web role project generated when you created the Windows Azure service. Specifically, you will perform the following tasks:

  • Add a page to the project that contains the UI to display the GuestBookEntry table.
  • Create the code that enables the user to store guest information in Table Storage and images in Blob Storage.
  • Finally, configure the storage account used by the web role.

To render the guest book

  1. In Solution Explorer, right-click the GuestBook_WebRole project and select Add Reference.
  2. Add a reference to the Microsoft.WindowsAzure.StorageClient assembly.
  3. Add a reference to the GuestBook_Data project.
  4. Open the default.aspx file.
  5. Replace the file content with the following markup.
<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="GuestBook_WebRole._Default" %>

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head id="Head1" runat="server">
    <title>Windows Azure Guestbook</title>
    <link href="main.css" rel="stylesheet" type="text/css" />
</head>
<body>
    <form id="form1" runat="server">
    <asp:ScriptManager ID="ScriptManager1" runat="server" />
    <div class="general">
        <div class="title">
            Windows Azure GuestBook
        </div>
        <div class="inputSection">
            <label for="NameLabel">Name:</label>
            <asp:TextBox ID="NameTextBox" runat="server" CssClass="field" />
            <asp:RequiredFieldValidator ID="NameRequiredValidator" runat="server"
                ControlToValidate="NameTextBox" Text="*" />
            <label for="MessageLabel">Message:</label>
            <asp:TextBox ID="MessageTextBox" runat="server" TextMode="MultiLine"
                CssClass="field" />
            <asp:RequiredFieldValidator ID="MessageRequiredValidator" runat="server"
                ControlToValidate="MessageTextBox" Text="*" />
            <label for="FileUpload1">Photo:</label>
            <asp:FileUpload ID="FileUpload1" runat="server" size="16" />
            <asp:RequiredFieldValidator ID="PhotoRequiredValidator" runat="server"
                ControlToValidate="FileUpload1" Text="*" />
            <asp:RegularExpressionValidator ID="PhotoFormatValidator" runat="server"
                ControlToValidate="FileUpload1"
                ErrorMessage="Only .jpg or .png files are allowed"
                ValidationExpression="([a-zA-Z\\].*(.jpg|.JPG|.png|.PNG)$)" />
        </div>
        <div class="inputSignSection">
            <asp:ImageButton ID="SignButton" runat="server" OnClick="SignButton_Click"
                AlternateText="Sign GuestBook"
                ImageAlign="Bottom" />
        </div>
        <asp:UpdatePanel ID="UpdatePanel1" runat="server">
            <ContentTemplate>
                <asp:Timer ID="Timer1" runat="server" Enabled="false" Interval="15000"
                    OnTick="Timer1_Tick" />
                <asp:DataList ID="DataList1" runat="server">
                    <ItemTemplate>
                        <div class="signature">
                            <div class="signatureImage">
                                <a href='<%# Eval("PhotoUrl") %>' target="_blank">
                                    <img src='<%# Eval("ThumbnailUrl") %>'
                                         alt='<%# Eval("GuestName") %>' />
                                </a>
                            </div>
                            <div class="signatureDescription">
                                <div class="signatureName">
                                    <%# Eval("GuestName") %>
                                </div>
                                <div class="signatureSays">
                                    says
                                </div>
                                <div class="signatureDate">
                                    <%# ((DateTime)Eval("Timestamp")).ToShortDateString() %>
                                </div>
                                <div class="signatureMessage">
                                    "<%# Eval("Message") %>"
                                </div>
                            </div>
                        </div>
                    </ItemTemplate>
                </asp:DataList>
            </ContentTemplate>
        </asp:UpdatePanel>
    </div>
    </form>
</body>
</html>
To store guest information in Table and Blob Storage

To allow the user to enter guest information and store the entry in Table Storage and the related image in the Blob Storage, you must execute the following steps.

  1. In Solution Explorer open the default.aspx.cs file.
  2. Replace the file content with the following code.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Net;
using GuestBook_Data;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;
using System.Text;

namespace GuestBook_WebRole
{
    public partial class _Default : System.Web.UI.Page
    {
        private static bool storageInitialized = false;
        private static object gate = new Object();
        private static CloudBlobClient blobStorage;
        private static CloudQueueClient queueStorage;

        protected void Page_Load(object sender, EventArgs e)
        {
            if (!Page.IsPostBack)
            {
                Timer1.Enabled = true;
            }
        }

        protected void SignButton_Click(object sender, EventArgs e)
        {
            if (FileUpload1.HasFile)
            {
                InitializeStorage();

                // upload the image to blob storage
                CloudBlobContainer container = blobStorage.GetContainerReference("guestbookpics");
                string uniqueBlobName = string.Format("image_{0}.jpg", Guid.NewGuid().ToString());
                CloudBlockBlob blob = container.GetBlockBlobReference(uniqueBlobName);
                blob.Properties.ContentType = FileUpload1.PostedFile.ContentType;
                blob.UploadFromStream(FileUpload1.FileContent);
                System.Diagnostics.Trace.TraceInformation(
                    "Uploaded image '{0}' to blob storage as '{1}'",
                    FileUpload1.FileName, uniqueBlobName);

                // create a new entry in table storage
                GuestBookEntry entry = new GuestBookEntry() { GuestName =
                    NameTextBox.Text, Message = MessageTextBox.Text, PhotoUrl =
                    blob.Uri.ToString(), ThumbnailUrl = blob.Uri.ToString() };
                GuestBookEntryDataSource ds = new GuestBookEntryDataSource();
                ds.AddGuestBookEntry(entry);
                System.Diagnostics.Trace.TraceInformation(
                    "Added entry {0}-{1} in table storage for guest '{2}'",
                    entry.PartitionKey, entry.RowKey, entry.GuestName);

                // queue a message to process the image
                var queue = queueStorage.GetQueueReference("guestthumbs");
                var message = new CloudQueueMessage(String.Format("{0},{1},{2}",
                    uniqueBlobName, entry.PartitionKey, entry.RowKey));
                queue.AddMessage(message);
                System.Diagnostics.Trace.TraceInformation(
                    "Queued message to process blob '{0}'", uniqueBlobName);
            }

            NameTextBox.Text = "";
            MessageTextBox.Text = "";
        }

        protected void Timer1_Tick(object sender, EventArgs e)
        {
            // periodically refresh the entries displayed in the UpdatePanel
            InitializeStorage();
            DataList1.DataSource = new GuestBookEntryDataSource().Select();
            DataList1.DataBind();
        }

        private void InitializeStorage()
        {
            if (storageInitialized)
            {
                return;
            }

            lock (gate)
            {
                if (storageInitialized)
                {
                    return;
                }

                try
                {
                    // Create a new instance of a CloudStorageAccount object from a specified configuration setting.
                    // This method may be called only after the SetConfigurationSettingPublisher
                    // method has been called to configure the global configuration setting publisher.
                    // You can call the SetConfigurationSettingPublisher method in the OnStart method
                    // of the web role or in the Application_Start method in the Global.asax.cs file.
                    // If you do not do this, the system raises an exception.
                    var storageAccount =
                        CloudStorageAccount.FromConfigurationSetting("DataConnectionString");

                    // create blob container for images
                    blobStorage = storageAccount.CreateCloudBlobClient();
                    CloudBlobContainer container =
                        blobStorage.GetContainerReference("guestbookpics");
                    container.CreateIfNotExist();

                    // configure container for public access
                    var permissions = container.GetPermissions();
                    permissions.PublicAccess =
                        BlobContainerPublicAccessType.Container;
                    container.SetPermissions(permissions);

                    // create queue to communicate with worker role
                    queueStorage = storageAccount.CreateCloudQueueClient();
                    CloudQueue queue =
                        queueStorage.GetQueueReference("guestthumbs");
                    queue.CreateIfNotExist();
                }
                catch (WebException)
                {
                    StringBuilder buffer = new StringBuilder();
                    buffer.Append("Storage services initialization failure.");
                    buffer.Append(" Check your storage account configuration settings.");
                    buffer.Append(" If running locally,");
                    buffer.Append(" ensure that the Development Storage service is running.");

                    throw new WebException(buffer.ToString());
                }

                storageInitialized = true;
            }
        }
    }
}
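The message queued above is just the comma-separated string "blobName,partitionKey,rowKey", so the worker role's first step is to split it back apart. A Python sketch of that parsing step (the walkthrough's actual worker role is written in C#):

```python
def parse_thumbnail_message(message_text):
    """Split the 'blobName,partitionKey,rowKey' message queued by the web role."""
    blob_name, partition_key, row_key = message_text.split(",", 2)
    return {"blob": blob_name, "partition": partition_key, "row": row_key}

print(parse_thumbnail_message("image_42.jpg,a,0001"))
# {'blob': 'image_42.jpg', 'partition': 'a', 'row': '0001'}
```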
To configure the storage account for the web role

In order for the web role to use the Windows Azure storage services, you must provide account settings as shown next.

  1. In Solution Explorer, expand the Role node in the GuestBook project.
  2. Double click GuestBook_WebRole to open the properties for this role and select Settings tab.
  3. Click Add Settings.
  4. In the Name column, enter DataConnectionString.
  5. In the Type column, select ConnectionString from the drop-down list.

  6. In the Value column, select Use development storage from the drop-down list.

  7. Click OK, and then press Ctrl+S to save your changes.


Figure 6 Configuring Storage Account For Web Role

A storage account is a unique endpoint for the Windows Azure blob, queue and table services. You must create a storage account to use these services. For more information, see Windows Azure Platform.

This walkthrough uses the development storage included in the Windows Azure SDK development environment to simulate the blob, queue, and table services available in the cloud. Windows Azure by default uses SQL Server Express to simulate these services. You can also use a local instance of SQL Server; to do so, you must define the connection string that the development storage can use to connect to the server. For more information, see Using Windows Azure Development Environment Essentials.

To use the development storage, you set the value of the UseDevelopmentStorage keyword in the connection string for the storage account to true. When you deploy your application to Windows Azure, you need to update the connection string to specify storage account settings including your account name and shared key. For example,
<Setting name="DataConnectionString" value="DefaultEndpointsProtocol=https;AccountName=YourAccountName;AccountKey=YourAccountKey" />
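Both the development and cloud forms of the connection string are semicolon-delimited key=value pairs, which is why moving to the cloud is purely a configuration change. A sketch of how such a string decomposes (the account values are the placeholders from the example above):

```python
def parse_connection_string(value):
    """Split an Azure storage connection string into its individual settings."""
    settings = {}
    for pair in value.split(";"):
        if pair:
            key, _, val = pair.partition("=")  # split at the first '=' only, so base64 keys survive
            settings[key] = val
    return settings

cloud = parse_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=YourAccountName;AccountKey=YourAccountKey")
print(cloud["AccountName"])
# YourAccountName
```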

For related topics, see the following posts.

The MSDN Library added a Windows Azure Platform Overview for Windows Phone topic on 12/15/2010:

The Windows Azure platform is an internet-scale cloud services platform hosted through Microsoft data centers. It provides highly-scalable processing and storage capabilities, a relational database service, and premium data subscriptions that you can use to build compelling Windows Phone applications.

This topic provides an overview of the Windows Azure platform features that you can use with the Windows Phone Application Platform. For information about how to use the Windows Azure platform for data storage, see Storing Data in the Windows Azure Platform for Windows Phone. For information about using web services with your Windows Phone applications, see Connecting to Web and Data Services for Windows Phone.

Windows Azure Compute Service

The Windows Azure Compute service is a runtime execution environment for managed and native code. An application built on the Windows Azure Compute service is structured as one or more roles. When it executes, the application typically runs two or more instances of each role, with each instance running in its own virtual machine (VM).

You can use Windows Azure roles to offload work from your Windows Phone applications and perform tasks that are difficult or not possible with the Windows Phone Application Platform. For example, a web role could directly query a SQL Azure relational database and expose the data via a Windows Communication Foundation (WCF) service. For more information about writing Windows Phone applications that consume web services, see Connecting to Web and Data Services for Windows Phone.

There are several benefits to using a Windows Azure Compute service in conjunction with your Windows Phone application:

  • Programming options: When writing managed code for a Windows Azure role, developers can use many of the .NET Framework 4 libraries common to server and desktop applications. Although a substantial number of Silverlight and XNA components are available for developing a Windows Phone application, there are limits to what can be done with those components.

  • Availability: Windows Azure roles run in a highly-available internet-scale hosting environment built on geographically distributed data centers. Considering that the phone can be turned off, a Windows Azure role may be a better choice for long-running tasks or code that needs to be running all the time.

  • Processing capabilities: The processing capabilities of a Windows Azure role can scale elastically across servers to meet increasing or decreasing demand. In contrast, on a Windows Phone, a single processor with finite capabilities is shared by all applications on the phone.

A Windows Azure web role can provide Windows Phone applications access to data by hosting multiple web services, including Windows Communication Foundation (WCF) services and WCF data services. WCF is a part of the .NET Framework that provides a unified programming model for rapidly building service-oriented applications. WCF Data Services (formerly known as ADO.NET Data Services) enables the creation and consumption of Open Data Protocol (OData) services on the web. For more information, see the WCF Developer Center and the WCF Data Services Developer Center.

Windows Azure Storage Services

Storage resources on the phone are limited. To optimize the user experience, Windows Phone applications should minimize the use of isolated storage and only store what is necessary for subsequent launches of the application. One way to minimize the use of isolated storage is to use Windows Azure storage services instead. For more information about isolated storage best practices, see Isolated Storage Best Practices for Windows Phone.

The Windows Azure storage services provide persistent, durable storage in the cloud. As with the Windows Azure Compute service, Windows Azure storage services can scale elastically to meet increasing or decreasing demand. There are three types of storage services available:

  • Blob service: Use this service for storing files, such as binary and text data. For more information, see Blob Service Concepts.

  • Queue service: Use this service for storing and delivering messages that may be accessed by another client (another Windows Phone application or any other application that can access the Queue service). For more information, see Queue Service Concepts.

  • Table service: Use this service for structured storage of non-relational data. A Table is a set of entities, which contain a set of properties. For more information, see Table Service Concepts.

Note: To access Windows Azure storage services, you must have a storage account, which is provided through the Windows Azure Platform Management Portal. For more information, see How to Create a Storage Account.

We do not recommend that Windows Phone applications store the storage account credentials on the phone. Rather than accessing the Windows Azure storage services directly, we recommend that Windows Phone applications use a web service to store and retrieve data. The exception to this recommendation is for public blob data that is intended for anonymous access. For more information about using Windows Azure storage services, see Storing Data in the Windows Azure Platform for Windows Phone.
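The anonymous-access exception is worth making concrete: a public blob lives at a predictable URL, so a phone client can fetch it with a plain HTTP GET and no account key ever reaches the device. A minimal Python sketch (the account, container, and blob names are placeholders, and the helper name is hypothetical):

```python
from urllib.parse import quote

def public_blob_url(account, container, blob_name):
    """URL of a blob in a container configured for anonymous (public) read
    access. No account key is involved, so it is safe for a phone client
    to request this URL directly."""
    return (f"https://{account}.blob.core.windows.net/"
            f"{quote(container)}/{quote(blob_name)}")

url = public_blob_url("myaccount", "images", "logo.png")
# A Windows Phone client would issue an ordinary HTTP GET against this URL.
# Anything that is not public should instead go through your own web service,
# which holds the storage credentials server-side.
```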

SQL Azure

Microsoft SQL Azure Database is a cloud-based relational database service built on SQL Server technologies. It is a highly available, scalable, multi-tenant database service hosted by Microsoft in the cloud. SQL Azure Database helps to ease provisioning and deployment of multiple databases. Developers do not have to install, set up, update, or manage any software. High availability and fault tolerance are built-in and no physical administration is required.

Similar to an on-premises instance of SQL Server, SQL Azure exposes a tabular data stream (TDS) interface for Transact-SQL-based database access. Because the Windows Phone Application Platform does not support the TDS protocol, a Windows Phone application must use a web service to store and retrieve data in a SQL Azure database. For more information about using SQL Azure with Windows Phone, see Storing Data in the Windows Azure Platform for Windows Phone.

SQL Azure enables a familiar development environment. Developers can connect to SQL Azure with SQL Server Management Studio (SQL Server 2008 R2) and create database tables, indexes, views, stored procedures, and triggers. For more information about SQL Azure, see SQL Azure Database Concepts.

Windows Azure Marketplace DataMarket

Windows Azure Marketplace DataMarket is an information marketplace that simplifies publishing and consuming data of all types. The DataMarket enables developers to discover, preview, purchase, and manage premium data subscriptions. For more information, see the Windows Azure Marketplace DataMarket home page.

The DataMarket exposes data using OData feeds. The Open Data Protocol (OData) is a Web protocol for querying and updating data. The DataMarket OData feeds provide a consistent Representational State Transfer (REST)-based API across all datasets to help simplify development. Because DataMarket feeds are based on OData, your Windows Phone application can consume them with the OData Client Library for Windows Phone or use the HttpWebRequest class. For more information, see Connecting to Web and Data Services for Windows Phone.
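Because every flexible-query dataset answers the same OData system query options, a client can compose query URIs mechanically. A rough Python sketch of that composition (the service root and entity set below are made-up placeholders, not a real DataMarket feed, and the helper name is hypothetical):

```python
from urllib.parse import quote

def odata_query_url(service_root, entity_set, filter_expr=None, top=None):
    """Compose an OData query URI from two common system query options
    ($filter, $top). The same composition works against any
    flexible-query OData feed."""
    options = []
    if filter_expr is not None:
        options.append("$filter=" + quote(filter_expr))
    if top is not None:
        options.append("$top=" + str(top))
    url = service_root.rstrip("/") + "/" + entity_set
    return url + ("?" + "&".join(options) if options else "")

url = odata_query_url(
    "https://api.datamarket.azure.com/SomeProvider/SomeDataset/",  # placeholder
    "Entries", filter_expr="Year eq 2010", top=10)
```

On the phone, the resulting URI would be handed to the OData Client Library or to HttpWebRequest, as described in Connecting to Web and Data Services for Windows Phone.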

Note: In this release of the Windows Phone Application Platform, the Visual Studio Add Service Reference feature is not supported for OData data services. To generate a proxy class for your application, use the DataSvcUtil.exe utility that is part of the OData Client Library for Windows Phone. For more information, see How to: Consume an OData Service for Windows Phone.

There are two types of DataMarket datasets: those that support flexible queries and those that support fixed queries. Flexible query datasets support a wider range of REST-based queries. Fixed query datasets support only a fixed number of queries and supply a C# client library to help client applications work with data. For more information about these query types, see Fixed and Flexible Query Types.

See Also

Other Resources

<Return to section navigation list> 

Visual Studio LightSwitch

No significant articles today.

<Return to section navigation list> 

Windows Azure Infrastructure

Tim Anderson (@timanderson) asserted “Microsoft got cloud religion” in his Ten big tech trends from 2010 post of 12/31/2010:

Microsoft got cloud religion

Only up to a point, of course. This is the Windows and Office company, after all. However – and this is a little subjective – this was the year when Microsoft convinced me it is serious about Windows Azure for hosting our applications and data. In addition, it seems to me that the company is willing to upset its partners if necessary for the sake of its hosted Exchange and SharePoint – BPOS (Business Productivity Online Suite), soon to become Office 365.

This is a profound change for Microsoft, bearing in mind its business model. I spoke to a few partners when researching this article for the Register and was interested by the level of unease that was expressed.

Microsoft also announced some impressive customer wins for BPOS, especially in government, though the price the customers pay for these is never mentioned in the press releases.

Read the rest of Tim’s New Year’s post here.

Jay Fry (@jayfry3) took A cloudy look back at 2010 in this 12/31/2010 post:

Today seemed like a good day to take stock of the year in cloud computing, at least according to the view from this Data Center Dialog blog – and from what you as readers thought was interesting over the past 12 months.

Setting the tone for the year: cloud computing M&A

It probably isn’t any big surprise that 3 of the 4 most popular articles here in 2010 had to do with one of the big trends of the year in cloud computing – acquisitions. (Especially since my employer, CA Technologies, had a big role in driving that trend.) CA Technologies made quite a bit of impact with our successive acquisitions of Oblicore, 3Tera, and Nimsoft at the beginning of the year. We followed up by bringing onboard others like 4Base, Arcot, and Hyperformix.

But those first three set the tone for the year: the cloud was the next IT battleground and the big players (like CA) were taking it very seriously. CRN noted our moves as one of the 10 Biggest Cloud Stories of 2010. Derrick Harris of GigaOm called us out as one of the 9 companies that drove cloud in 2010.

As you’d expect, folks came to Data Center Dialog to get more details on these deals. We had subsequent announcements around each company (like the release of CA 3Tera AppLogic 2.9), but the Nimsoft one got far and away the most interest. I thought one of the more interesting moments was how Gary Read reacted to a bunch of accusations of being a “sell-out” and going to the dark side by joining one of the Big 4 management vendors they had been aggressively selling against. Sure, some of the respondents were competitors trying to spread FUD, but he handled it all clearly and directly -- Gary's signature style, I’ve come to learn.

What mattered a lot? How cloud is changing IT roles

Aside from those acquisitions, one topic was by far the most popular: how cloud computing was going to change the role of IT as a whole – and for individual IT jobs as well. I turned my November Cloud Expo presentation into a couple posts on the topic. Judging by readership and comments, my “endangered species” list for IT jobs was the most popular. It included some speculation that jobs like capacity planning, network and server administration, and even CIO were going the way of the dodo. Or were at least in need of some evolution.

Part 2 conjured up some new titles that might be appearing on IT business cards very soon, thanks to the cloud. But that wasn’t nearly as interesting for some reason. Maybe fear really is the great motivator. Concern about the changes that cloud computing is causing to peoples’ jobs certainly figured as a strong negative in the survey we published just a few weeks back. Despite a move toward “cloud thinking” in IT, fear of job loss drove a lot of the negative vibes about the topic. Of course, at the same time, IT folks are seeing cloud as a great thing to have on their resumes.

All in all, this is one of the major issues for cloud computing, not just for 2010 but in general. The important issue around cloud computing is not so much figuring out the technology; it's figuring out how to run and organize IT in a way that makes the best use of technology, creates processes that are useful to the business, and that people can live and work with on a daily basis. I don't think I'm going out on a limb here to say that this topic will be key in 2011, too.

Learning from past discussions on internal clouds

James Urquhart noted in his “cloud computing gifts of 2010” post at CNET that the internal/private cloud debate wound down during the year, ending in a truce. “The argument died down…when both sides realized nobody was listening, and various elements of the IT community were pursuing one or the other – or both – options whether or not it was ‘right.’” I tend to agree.

These discussions (arguments?), however, made one of my oldest posts, “Are internal clouds bogus?” from January 2009, the 5th most popular one – *this* year. I stand by my conclusion (and it seems to match where the market has ended up): regardless of what name you give the move to deliver a more dynamic IT infrastructure inside your 4 walls, it’s compelling. And customers are pursuing it.

Cloud computing 101 remained important

2010 was a year in which the basics remained important. The definitions really came into focus, and a big chunk of the IT world joined the conversation about cloud computing. That meant that things like my Cloud Computing 101 post, expanding on my presentation on the same topic at CA World in May, garnered a lot of attention.

Folks were making sure they had the basics down, especially since a lot of the previously mentioned arguments were settling down a bit. My post outlined a bunch of the things I learned from giving my Cloud 101 talk, namely don’t get too far ahead of your headlights. If you start being too theoretical, customers will quickly snap you right back to reality. And that’s how it should be.

Beginning to think about the bigger implications of cloud computing

However, several forward-looking topics ended up at the top of the list at Data Center Dialog this year as well. Readers showed interest in some of the things that cloud computing was enabling, and what it might mean in the long run. Consider these posts as starting points for lots more conversations going forward:

Despite new capabilities, are we just treating cloud servers like physical ones? Some data I saw from RightScale about how people are actually using cloud servers got me thinking that despite the promise of virtualization and cloud, people perhaps aren’t making the most of these new-fangled options. In fact, it sounded like we were just doing the same thing with these cloud servers as we’ve always done with physical ones. It seemed to me that missed the whole point.

Can we start thinking of IT differently – maybe as a supply chain? As we started to talk about the CA Technologies view of where we think IT is headed, we talked a lot about a shift away from “IT as a factory” in which everything was created internally, to one where IT is the orchestrator of service coming from many internal and external sources. It implies a lot of changes, including expanded management requirements. And, it caught a lot of analyst, press, customer, -- and reader – attention, including this post from May.

Is cloud a bad thing for IT vendors? Specifically, is cloud going to cut deeply into the revenues that existing hardware and software vendors are getting today from IT infrastructure? This certainly hasn’t been completely resolved yet. 2010 was definitely a year where vendors made their intentions known, however, that they aren’t going to be standing still. Oracle, HP, IBM, BMC, VMware, CA, and a cast of thousands (OK, dozens at least) of start-ups all made significant moves, often at their own user conferences, or events like Cloud Expo or Cloud Connect.

What new measurement capabilities will we need in a cloud-connected world? If we are going to be living in a world that enables you to source IT services from a huge variety of providers, there is definitely a need to help make those choices. And even to just have a common, simple, business-level measuring stick for IT services in the first place. CA Technologies took a market-leading stab at that by contributing to the Service Measurement Index that Carnegie Mellon is developing, and by launching the Cloud Commons community. This post explained both.

So what’s ahead for 2011 in cloud computing?

That sounds like a good topic for a blog post in the new year. Until then, best wishes as you say farewell to 2010. And rest up. If 2011 is anything like 2010, we’ll need it.

Jay is Strategy VP for CA Technologies’ cloud business. He joined CA via its Cassatt acquisition (private cloud software).

Cory Fowler (@SyntaxC4) and John Bristowe (@jbristow) co-authored Essential Resources for Getting Started with Windows Azure of 12/30/2010:

This blog post was co-authored by John Bristowe [left] and Cory Fowler [right]. [John is a Senior Developer Evangelist for Microsoft Canada; Cory is a Windows Azure MVP and Developer as a Service, Consultant at @ObjectSharp.]

Important! Check out the Windows Azure Introductory Special. It is the easiest way to get started with Windows Azure. To make it even easier, Barry Gervin and Cory Fowler have created some step-by-step videos on how to register for Windows Azure using the Introductory Special. If you’re a [Premium, Ultimate, or BizSpark] MSDN subscriber, you are eligible for Windows Azure benefits that include free consumption of Windows Azure services.

  • Essential Downloads
  • Folks in the Know
  • Blogs and Websites
  • Essential Reading
  • Essential Listening
  • Projects, Third Party Tools and Other Downloads
  • Essential Code/Scripts/Virtual Labs

This article also appears on Canadian Developer Connections.

<Return to section navigation list> 

Windows Azure Platform Appliance (WAPA), Hyper-V and Private Clouds


No significant articles today.

<Return to section navigation list> 

Cloud Security and Governance


No significant articles today.

<Return to section navigation list> 

Cloud Computing Events


No significant articles today.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Phil Leggetter claimed Yahoo’s Open Sourced S4 Could be a Real-time Cloud Platform in a 12/31/2010 post to the ProgrammableWeb blog:

In a world where real-time data streams are becoming much more common, and the volume of that data continues to increase, it makes sense that a framework would be developed to make that data easier to process. Yahoo! S4 isn’t the first such framework to be conceived, or even open sourced, but it is likely to massively increase awareness that such frameworks exist and what problems they may help solve, to get developers thinking about how they could use the technology, and to increase the likelihood of somebody moving S4-like capabilities into the cloud and offering them as a service.

The requirement for a “distributed stream computing platform” came about because Yahoo! needed to process thousands of search queries per second, from potentially millions of users per day, to facilitate the generation of highly personalized adverts for web search. A new framework was required because Yahoo! felt that MapReduce, which is commonly used to process large datasets in batch jobs, was “hard to apply to stream computational tasks”.

Yahoo! describes the S4 framework using a number of terms that have become commonplace in the world of cloud computing:

S4 is a general-purpose, distributed, scalable, partially fault-tolerant, pluggable platform that allows programmers to easily develop applications for processing continuous unbounded streams of data.

Exactly what Yahoo! S4 is, and what it is capable of, has been discussed in a number of other places. The term most commonly applied to comparable frameworks is Complex Event Processing, with applications including filtering, correlation, and pattern matching. These discussions will no doubt continue, but ultimately a framework is something that can be put to multiple uses, which is why Yahoo! chose to call it “general-purpose”.

Yahoo! have created a couple of examples to demonstrate some of the basic capabilities and clarify what S4 can do. One of the examples receives data from the Twitter real-time Garden Hose stream, counts the number of times a hashtag is mentioned, and keeps an ordered list of the most commonly mentioned hashtags. Each step of the process is performed in what Yahoo! are calling Processing Elements, and it’s these elements that enforce the separation of each logical step of the process (e.g. receive update, extract hashtags, count hashtags, order hashtag-count list) and allow the execution of the process to take place on a distributed system.
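The hashtag example decomposes naturally into small, single-purpose steps. As a rough Python sketch of the same logic (without the keying, partitioning, and distribution that S4 itself provides), each function below stands in for one Processing Element; the function names are made up for illustration:

```python
import re
from collections import Counter

HASHTAG = re.compile(r"#\w+")

def extract_hashtags(tweet_text):
    # "receive update -> extract hashtags" step
    return [tag.lower() for tag in HASHTAG.findall(tweet_text)]

def count_hashtags(streams_of_tags):
    # "count hashtags" step: fold per-tweet tag lists into running totals
    counts = Counter()
    for tags in streams_of_tags:
        counts.update(tags)
    return counts

def top_hashtags(counts, n=10):
    # "order hashtag-count list" step
    return counts.most_common(n)

tweets = ["Happy #NewYear from #Azure!", "#azure rocks", "#NewYear #newyear"]
ranked = top_hashtags(count_hashtags(extract_hashtags(t) for t in tweets), n=2)
# ranked == [("#newyear", 3), ("#azure", 2)]
```

In S4 these steps would run as separate, keyed elements across a cluster; the point of the sketch is only the separation of concerns the article describes.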

One thing potentially holding S4 adoption back is that it is not yet offered as a service. As well as writing their own Processing Elements, developers will have to host their own distributed stream computing platform. If S4 proves to be a useful and popular framework, we may start to see hosted distributed stream computing platform services, in the same way that we’ve already seen MapReduce offered as a service by Amazon.

Yahoo! S4 is yet another powerful real-time component now available to the Programmable Web. It opens up a number of possibilities for developers to start building exciting data-centric applications, mashups or hosted services which could integrate with other components such as real-time APIs, real-time client push services and DaaS services.

<Return to section navigation list>