Tuesday, October 11, 2011

Windows Azure and Cloud Computing Posts for 10/11/2011+

A compendium of Windows Azure, SQL Azure Database, AppFabric, Windows Azure Platform Appliance and other cloud-computing articles.


Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

Azure Blob, Drive, Table and Queue Services

Avkash Chauhan (@avkashchauhan) described Collecting Windows Azure Storage REST API level metrics data without a single line of programming, just by using tools in a 10/11/2011 post:

[From t]ime to time you might want to know the total transaction count when you perform any storage activity against Windows Azure Storage.

For example, when you mount a VHD from Windows Azure Storage to your Azure VM, or when you write X files totaling Y MB/GB of data. I decided to write this article to answer all of these questions.

First you really need to understand how Windows Azure billing works, and there is no better place than the link below:


The above link explains how Windows Azure billing works, along with a few examples:

  • A single GetBlob request to the blob service = 1 transaction
  • PutBlob with 1 request to the blob service = 1 transaction
  • Large blob upload that results in 100 requests via PutBlock, and then 1 PutBlockList for commit = 101 transactions
  • Listing through a lot of blobs using 5 requests total (due to 4 continuation markers) = 5 transactions
  • Table single entity AddObject request = 1 transaction
  • Table Save Changes (without SaveChangesOptions.Batch) with 100 entities = 100 transactions
  • Table Save Changes (with SaveChangesOptions.Batch) with 100 entities = 1 transaction
  • Table Query specifying an exact PartitionKey and RowKey match (getting a single entity) = 1 transaction
  • Table query doing a single storage request to return 500 entities (with no continuation tokens encountered) = 1 transaction
  • Table query resulting in 5 requests to table storage (due to 4 continuation tokens) = 5 transactions
  • Queue put message = 1 transaction
  • Queue get single message = 1 transaction
  • Queue get message on empty queue = 1 transaction
  • Queue batch get of 32 messages = 1 transaction
  • Queue delete message = 1 transaction
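Since the billing model above is purely per-request, transaction counts for a workload can be estimated with simple arithmetic. Here is a minimal sketch (Python, used here as illustrative pseudocode) combining three of the examples listed above:

```python
# Estimate billable storage transactions for a hypothetical workload,
# applying the per-operation rules listed above.

def block_upload_transactions(num_blocks):
    # Each PutBlock is one transaction, plus one PutBlockList to commit.
    return num_blocks + 1

def table_query_transactions(continuation_tokens):
    # One request, plus one extra request per continuation token.
    return 1 + continuation_tokens

total = (
    block_upload_transactions(100)   # large blob upload -> 101
    + table_query_transactions(4)    # table query, 4 continuations -> 5
    + 1                              # queue batch get of 32 messages -> 1
)
print(total)  # 107
```

Note that a batch get of 32 queue messages still counts as a single transaction, which is why batching is one of the cheapest optimizations available.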

Now please follow the steps below to collect these Windows Azure Storage REST API-level details:

Step 1: Download CloudBerry Explorer for Windows Azure, version 1.4.1 or later.

Step 2: Now create a Windows Azure Storage service and configure it in CloudBerry Azure Blob Storage Explorer. After configuration, enable Storage Metrics as below:

2.1 Open menu Analytics > Settings:

2.2 Select your Azure Storage Name:

2.3 Now enable logging as below (you can also select “Delete data older than X days” if you want):

2.4 Now enable Metrics data as below (you can also select “Delete data older than X days” if you want):


2.5 Select Apply to confirm all of the above settings.

Step 3: Now, for testing, copy a file to the Windows Azure Storage account you configured above. As you can see below, I have copied a 64MB VHD to my Azure Storage.


Step 4: Now let’s check the statistics for the copy operation.
4.1 Go to Analysis > View Log as below:

4.2 You will see a long list of table entries, one for each Storage API call used in the transaction. If the log is not visible, try setting the Date Interval correctly.

Log Analysis:
You can also copy/save the log and open it in TextAnalysisTool.NET (download from this link) to parse it as below:


Or you can use Microsoft Excel 2010 to filter the log file as below to get API specific details:

This way you can get API-specific metric data for your Azure Storage.
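Beyond TextAnalysisTool.NET and Excel, a short script can also tally transactions per API. The sketch below (Python) assumes the semicolon-delimited Storage Analytics log layout with the REST operation name as the third field; the log excerpt is hypothetical, so verify the field positions against your own logs:

```python
import collections
import csv
import io

# Hypothetical excerpt of a Storage Analytics log: semicolon-delimited,
# with the REST operation name assumed to be the third field.
log_text = """1.0;2011-10-11T20:44:48Z;GetBlob;Success;200
1.0;2011-10-11T20:44:49Z;PutBlock;Success;201
1.0;2011-10-11T20:44:49Z;PutBlock;Success;201
1.0;2011-10-11T20:44:50Z;PutBlockList;Success;201"""

# Count how many transactions each REST API operation generated.
counts = collections.Counter(
    row[2] for row in csv.reader(io.StringIO(log_text), delimiter=";")
)
print(counts["PutBlock"])  # 2
```

The same counter, run over a real downloaded log blob, gives the per-API transaction totals that the billing examples earlier in this post describe.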

<Return to section navigation list>

SQL Azure Database and Reporting

SD Times Newswire reported CA Technologies to Deliver ERwin Data Modeling Solution for SQL Azure on 10/11/2011:

CA Technologies (NASDAQ: CA) today announced CA ERwin Data Modeler for SQL Azure, a powerful new solution that helps customers manage and integrate their database infrastructure with the SQL Azure cloud database environment.
Many customers have concerns about moving their data to the cloud, fearing the potential complexity and risk of designing and deploying systems in an off-premise environment. CA ERwin Data Modeler for SQL Azure enables customers to make fact-based decisions about which data to move to the cloud, and which to keep on premise, by empowering them to easily:

  • Create an inventory of data assets, including both cloud and on-premise databases, using a single visual interface
  • Migrate from on-premise databases, such as SQL Server, to cloud-based databases, such as SQL Azure
  • Maintain their cloud-based databases using a familiar desktop data modeling paradigm, eliminating the need for specialized skills or training

“Maximizing the benefits of the cloud is challenging if you do not fundamentally understand the data that currently exists, let alone which data would best reside in cloud property,” said Mike Crest, general manager, Data Management, CA Technologies. “CA ERwin Data Modeler for Microsoft SQL Azure provides visibility into data assets and the complex relationships between them, enabling customers to remain in constant control of their database architectures even as they move to public, private and hybrid IT environments.”

CA Technologies unveiled CA ERwin Data Modeler for SQL Azure at PASS Summit 2011, where it is conducting product demonstrations at booth #207. An early preview of the product, which is scheduled for availability in the first quarter of 2012, is available here. [Requires site registration.]

“CA ERwin Data Modeler for SQL Azure is the latest example of CA Technologies commitment to helping customers fully capitalize on the cloud without needing special resources or skill sets,” said Kim Akers, general manager for ISV partners, Microsoft Corp. “We are pleased that CA Technologies has chosen to support the SQL Azure platform, easing the way for customers to migrate to cloud-based databases with assurance and confidence.”

CA ERwin Data Modeler is a key component of the CA ERwin family of products, which offers an integrated set of technologies that help enable best practices for database design and modeling.

CA Technologies is the global market share leader in data modeling tools, based on 2010 revenues, according to IDC.

Bruce Kyle reported Samples, Tool Helps You Convert SQL Lite to SQL Server Compact for Phone Apps in a 10/11/2011 post to the US ISV Evangelism blog:

A database conversion tool helps you convert your SQLite database to SQL Server Compact. The tool and examples, combined with previous extensive guides (for Android, iPhone, and Symbian Qt), accelerate your ramp-up time and improve your experience in porting apps to Windows Phone from iPhone and Android.

More information about the tool can be found on the team blog post More guidance and tool for porting iPhone & Android apps to Windows Phone by Jean-Christophe Cimetiere.

You will find information about:


A series of samples to aid you in the process of migrating your iPhone & Android applications over to Windows Phone. We’ve started with 3 samples:

  • In-App Advertisements
  • Geo-Location
  • Group Messaging

You’ll find the source code on Android/iPhone, the Windows Phone ported version and the porting notes.

Conversion Tool

SQLite2SQLCE is a tool developed to make the conversion process simple by converting a SQLite database into SQLCE while simultaneously creating the default classes needed to incorporate the new database into your Windows Phone application.

In addition, you’ll find a tool designed to aid developers in converting their SQL queries to LINQ while simultaneously helping them to learn the new query language.

API Mapping Tool Updated

The API Mapping tool has been expanded: it now covers a few more features like sensors (Camera, Compass & Gyro), multitasking (notification, app switching & background agents), data access (SQL, file access), and launchers/choosers.

<Return to section navigation list>

MarketPlace DataMarket and OData

The Visual Studio LightSwitch Team (@VSLightSwitch) described How to Create a RIA Service Wrapper for an Editable OData Source in a 10/11/2011 post:


LightSwitch has built-in support for SQL Server and SharePoint data sources. To access other data sources, you can write a custom WCF RIA DomainService. This post will show you how to read and write from an OData service by wrapping access to it in a DomainService. …

Read more in the Visual Studio LightSwitch and Entity Framework v4+ section below. This feature should have been built into the v1 release.

Jan van der Haegen recommended that you Organize your chickens: NuGet for the enterprise in a 10/11/2011 post:

Just finished giving a presentation on NuGet.

As promised, I did the entire session wearing my official LightSwitch blue polo that a colleague of mine gave to me! :-)

During one of the examples, I created the Metro Theme NuGet package, uploaded it and used it in a brand new LightSwitch application. You could see at least half the room looking at the LightSwitch framework more than at the actual NuGet implementation, nice!

Now enjoying the second part by Maarten & Xavier!

Special thanks to Gill and the Visug community for having us!

From the Visual Studio User Group (Belgium) blog:

Two for the price of one: Organize your chickens: NuGet for the enterprise

Presented by: Thomas Houtekier, Jan Van der Haegen, Xavier Decoster, Maarten Balliauw on: Tuesday, October 11, 2011

2 sessions in one evening about NuGet

Managing software dependencies, whether those created in-house or from third parties can be a pain in the behind. Whether dependencies feel like wild chickens or people run around like chickens dealing with dependencies, the NuGet package manager can be a cure. Let us guide you to creating enterprise (chicken) NuGets and dealing with them in a structured, easy-to-maintain manner. From developer workstation to build server, NuGet tastes great! We'll provide you the dip sauce.

Thomas Houtekier: http://about.me/thomas.houtekier
Jan Van der Haegen: http://about.me/jan.van.der.haegen
Xavier Decoster: http://about.me/xavierdecoster
Maarten Balliauw: http://about.me/maartenballiauw

Minh Cuong updated the source code for CodePlex’s SocialPhotos: WP7, Android, iOS, ASP.net MVC, LightSwitch, OData... on 10/11/2011:

What is “SocialPhotos”?

SocialPhotos is a social networking application for photography lovers that is based on smart phone, web 2.0, and cloud technologies. With SocialPhotos, users can share their photos, as well as view, rate, and comment on others’ photos from any smartphone or HTML5 web browser. Smartphones, including Windows Phone, Android, and iPhone, provide access to photos from mobiles. Not only can you take and upload a photo from your smartphone, but you can also tag a photo with information about the location where the photo was taken. Based on these tags, you can search for photos taken in a specific location.
Because SocialPhotos maintains your photos in the cloud, you won’t lose your photos, even if you lose your smartphone. In the cloud, your photos are protected and backed-up by our high-availability Windows Azure-based cloud computing services.

The idea for a “SocialPhotos” service was initiated by Cuong Trinh to demonstrate the Microsoft cloud services platform of WCF Data Services/OData, Windows Azure, SQL Azure, Windows Phone, as well as other non-Microsoft technologies, including Google Android, Apple iOS, HTML5, JavaScript, and jQuery. The team plans to demonstrate how these technologies, by aligning around the Open Data Protocol (OData), can provide a seamless, multi-platform user experience.

The text is edited by Glenn Gailey for better readability. Thanks Glenn for your support and first sample code.

Windows Phone Emulator


Rob Tiffany continued his MEAP series with Consumerization of IT Collides with MEAP: iPhone + iPad > On-Premises on 10/11/2011:

In my last ‘Consumerization of IT Collides with MEAP’ article, I described how to connect a Windows Phone device to Microsoft’s Cloud servers in Azure. By now you’re probably thinking, “It’s easy to talk about Microsoft endpoints talking to Microsoft servers.” So in this week’s scenario, I’ll use the picture below to illustrate how iOS devices like the iPhone and iPad can utilize many of Gartner’s Critical Capabilities to connect to Microsoft’s On-Premise infrastructure:


As you can see from the picture above:

  1. For the Management Tools Critical Capability, iOS uses Microsoft Exchange for On-Premise policy enforcement via Exchange ActiveSync (EAS) but has no private software distribution equivalent to System Center Configuration Manager 2007 from Microsoft today. Instead, in-house apps are hosted and distributed via a web server over wireless by having a user click on a URL. In the future, System Center Configuration Manager 2012 will be able to better manage iOS devices.
  2. For both the Client and Server Integrated Development Environment (IDE) and Multichannel Tool Critical Capability, iOS uses Visual Studio. While the Server/EAI development functionality is the same as every other platform, endpoint development will consist of HTML5, ECMAScript 5, and CSS3 delivered by ASP.NET. WCF REST + JSON Web services can also be created and consumed via Ajax calls from the browser.
  3. For the cross-platform Application Client Runtime Critical Capability, we will rely on iOS’s WebKit browser called Safari to provide HTML5 + CSS3 + ECMAScript5 capabilities. Offline storage is important to keep potentially disconnected iPhones and iPads working and this is facilitated by Web Storage which is accessible via JavaScript.
  4. For the Security Critical Capability, iOS provides AES 256 hardware encryption as well as Data Protection based on the user’s device passcode for data-at-rest. Data-in-transit is secured via SSL, VPN, and 802.1X. Built-in LDAP support allows it to access corporate directory services.
  5. For the Enterprise Application Integration Tools Critical Capability, iOS can reach out to servers directly via Web Services or indirectly via SQL Server or BizTalk using SSIS/Adapters to connect to other enterprise packages.
  6. The Multichannel Server Critical Capability to support any open protocol directly, via Reverse Proxy, or VPN is facilitated by ISA/TMG/UAG/IIS. Cross-platform wire protocols riding on top of HTTP are exposed by Windows Communication Foundation (WCF) and include SOAP, REST and AtomPub. Cross-platform data serialization is also provided by WCF including XML, JSON, and OData. These Multichannel capabilities support thick clients making web service calls as well as thin web clients making Ajax calls. Distributed caching to dramatically boost the performance of any client is provided by Windows Server AppFabric Caching.
  7. While the Hosting Critical Capability may not be as relevant in an on-premises scenario, Windows Azure Connect provides an IPSec-protected connection to the Cloud and SQL Azure Data Sync can be used to move data between SQL Server and SQL Azure.
  8. For the Packaged Mobile Apps or Components Critical Capability, iOS runs cross-platform mobile apps including OneNote, Bing, Tag, and of course the critical ActiveSync component that makes push emails, contacts, calendars, and device management policies possible.

As you can see, iOS meets many of Gartner’s Critical Capabilities. It has really improved over the years in areas of security and device management. As you can see from the picture, the big gap is with the client application runtime critical capability. Native development via Xcode/Objective-C is where Apple wants to steer you and Microsoft doesn’t make native tools, runtimes or languages for this platform. You can certainly kick the tires and perform your own due diligence on MonoTouch from our friend Miguel de Icaza and his colleagues. From a Microsoft perspective though, you’re definitely looking at HTML5 delivered via ASP.NET.

Next week, I’ll cover how iOS connects to the Cloud.

<Return to section navigation list>

Windows Azure AppFabric: Apps, Access Control, WIF and Service Bus

Neil MacKenzie (@mknz) described his Windows Azure AppFabric Applications Presentation at SVCC in a 10/11/2011 post:

I did a presentation on Windows Azure AppFabric Applications CTP1 at the Silicon Valley Code Camp yesterday. This was based on the post I did several months ago. I uploaded the deck to SlideShare for the benefit of those who were there.

During the talk, I pointed people in the direction of a number of posts on Windows Azure AppFabric Applications by Alan Smith (@alansmith):

  • Introduction to AppFabric Applications [Webcast]
  • Create Custom External Service in Azure AppFabric June CTP [Tutorial]

The Windows Azure AppFabric Applications CTP can be accessed on the Windows Azure AppFabric Labs portal. The MSDN documentation is here.

Subscribed to Alan Smith’s blog.

Alan Smith (@alansmith) announced a CTP Version of The Developers Guide to AppFabric on 10/3/2011 (missed when published):

I’ve just published a CTP version of “The Developers Guide to AppFabric”. Any feedback on the content would be great, and I will include it in the full release next week.

“The Developer’s Guide to AppFabric” is a free e-book for developers who are exploring and leveraging the capabilities of the Azure AppFabric platform.

The goal is to create a resource that will evolve and mature in parallel with the Azure AppFabric technologies. The use of an electronic format will allow sections to be added as new technologies are released and improved as the knowledge and experience of using the technologies grows within the developer community.

The CTP version, published on the 3rd of October 2011, marks seven years to the day since the first version of “The Blogger’s Guide to BizTalk” was published.

The first section will provide an introduction to the brokered messaging capabilities available in the Azure AppFabric September 2011 release. The next section will go into a lot more depth and explore the brokered messaging feature set. Future sections will cover the Access Control Service, relayed messaging, and cache. Features like the application model and integration services will be covered as they emerge and are released.

The book will always be free to download and available in CHM and PDF format, as well as a web based browsable version. The planned release schedule for the remainder of 2011 is to update the guide with new content monthly, around the start of each month. Updates will be made available on the website and announced through my blog and twitter.

Developer’s Guide to AppFabric: devguide.cloudcasts.net. The feedback form is here.

Alan Smith (@alansmith) recommended Two Great AppFabric Resources in a 9/23/2011 post (missed when published):

If you want to get up to speed on Azure AppFabric brokered messaging there are a couple of resources you should check out.

SB CTP Messaging User Guide

This is a 30-page Word document put together by the AppFabric dev team. It’s a great read to get an introduction to Queues, Topics and Subscriptions. If you are working in the AppFabric Labs environment using the May or June CTP the code will work fine. There have been significant changes in the API in the September 2011 release, so don’t expect any of the code to compile in that release, but the theory section is still very much valid.

Best Practices for Leveraging Windows Azure Service Bus Brokered Messaging API

This is a 10-page blog post from the Windows Azure CAT team that provides some great tips for developers who have some familiarity with brokered messaging. The level is pretty advanced, and it’s impressive to see how much experience the CAT team has accumulated on brokered messaging. All the code here is on the September 2011 release. It’s a must-read if you plan to develop anything using service bus brokered messaging.

Alan Smith (@alansmith) posted Another AppFabric Walkthrough - Introducing Topics and Subscriptions on 9/21/2011:

I’ve added a third walkthrough to the AppFabric Walkthrough series.

“The following walkthrough will show how topics and subscriptions can be used to implement a simple publish-subscribe messaging channel. The next walkthrough will build on this sample and explore the use of filters on subscriptions.”

Introducing Topics and Subscriptions

The other two walkthroughs are here.

Simple Brokered Messaging

Creating a Simple Queue Management Tool

At present there are no charges for using the new brokered messaging capabilities in AppFabric, so I have been taking advantage of this to explore the new functionality. This free offer will not last forever, so now is a great time to take a look at the new capabilities.

<Return to section navigation list>

Windows Azure VM Role, Virtual Network, Connect, RDP and CDN

No significant articles today.

<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Himanshu Singh posted a Security Firm G4S Moves to the Cloud with Windows Azure case study on 10/11/2011 to the Windows Azure blog:

Leading international security solutions group G4S is focused on protecting rock stars and sports stars as well as delivering cash and bullion worldwide. The company has developed a system called eViper in order to plan the most efficient routes and track the movement of cash and valuables worldwide. G4S transports more than £300 billion in cash each year so eViper is a mission critical system for G4S; quite simply if it fails then cash stops moving. G4S has just completed its move to the cloud by moving eViper to Windows Azure.

G4S wanted to move away from its dependency on legacy code base, which had become difficult to manage, and infrastructure supplied by outsourced managed service providers, which had poor availability. G4S knew it needed to create an application that would ensure high availability, scalability, cost effectiveness and security. To secure these features G4S realised it had to migrate to a modern platform and turned to Microsoft for help.

Microsoft worked with G4S on a solution based on Windows Azure, which allows the eViper application to be hosted in different data centres at different times. This means that, if an issue impacts the primary data centre, the application will quickly failover to the secondary data centre, ensuring maximum reliability and uptime.

Richard Wallace, Technology Director at G4S explains: “Security is at the core of everything that we do. We believe Windows Azure is the safest environment we could use to host the eViper system - we conducted a 170-control point assessment and found that Windows Azure was more secure than our existing infrastructure partners. I often get told that I do not know where the data is held. The reality is that I know exactly where the data is held and I know that the data cannot be accessed by anyone other than G4S. By working on a Microsoft cloud we are saving money and also have the ability to scale up operations around the world without a huge capital outlay on data centre infrastructure.”

In addition to reducing running costs by two thirds, G4S also sees benefits from moving to Windows Azure in the areas of scalability and agility. Running eViper on Windows Azure enables G4S to deploy into a new country without needing to establish a data centre there. But the most important reason G4S looked to Microsoft for a cloud platform was security. The company chose Windows Azure largely due to Microsoft’s commitment to - and experience in - providing some of the most reliable, private and secure cloud services in the industry.

Michael Newberry, Windows Azure product manager at Microsoft UK, sums it up this way: “G4S asked Microsoft for a reliable platform able to handle one of its most important systems and keep its data secure and private. Windows Azure was a natural fit – a trusted cloud platform that can manage complex, sensitive data and systems. Our heritage in providing products and services built on a foundation of security and privacy was a primary factor in G4S’s decision. This, coupled with our comprehensive offering of secure and reliable cloud offerings has meant we have been able to help G4S drive its business forward by enabling it to move to the cloud on its own terms and at its own pace.”

David Aiken (@TheDavidAiken) explained Implementing Windows Azure Retry Logic in a 10/10/2011 post:

Windows Azure will automatically repair itself. Can your service? In this post I’m going to show you a simple way to make your service a little more resilient by adding retry logic.

Transient Datacenter conditions

When you have to deal with external services of any type there are times when the service might not respond. This could be due to any number of reasons from network connectivity & service throttling, to hardware failure. Windows Azure is designed to withstand these failures, not by avoiding them, but by taking corrective action when they do occur. Windows Azure auto heals itself. These conditions are sometimes referred to as transient conditions because they are not typically long lasting.

As an example, SQL Azure can give you connection errors, and throttling errors, Windows Azure Storage can give you timeout and throttling errors and Service Bus has ServerBusy and MessagingCommunication Exceptions.

Any other external dependency will also likely have similar conditions. Without defensive coding for these transient conditions, your app will suffer unnecessary outages. Fortunately the problem can be easily resolved.

Retry Logic

Handling these conditions is usually as easy as repeating the operation after a short delay.
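In its simplest form, the idea can be sketched as a loop that retries a bounded number of times, sleeping between attempts. The sketch below is in Python as language-neutral pseudocode of the pattern; the transient exception types are placeholders for whatever your client library actually throws:

```python
import time

def execute_with_retry(action, max_retries=10, delay_seconds=2.0,
                       transient=(ConnectionError, TimeoutError)):
    """Repeat `action` after a short delay when a transient error occurs."""
    for attempt in range(max_retries):
        try:
            return action()
        except transient:
            if attempt == max_retries - 1:
                raise  # give up: the condition was not transient after all
            time.sleep(delay_seconds)

# Example: a hypothetical operation that fails twice before succeeding.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(execute_with_retry(flaky, delay_seconds=0))  # ok
```

The key design point is catching only the *transient* error types; a permanent failure (bad credentials, malformed request) should surface immediately rather than burn through retries.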

The Windows Azure Storage Client Library that ships with the SDK already has retry behavior that you need to switch on. You can set this on any storage client by setting the RetryPolicy Property.

SQL Azure doesn’t provide a default retry mechanism out of the box, since it uses the SQL Server client libraries. Service Bus also doesn’t provide a retry mechanism.

Transient Fault Handling Framework

To provide an easy way to handle this, the Windows Azure Customer Advisory Team have developed a Transient Fault Handling Framework – http://windowsazurecat.com/2011/02/transient-fault-handling-framework/. The framework provides a number of ways to handle specific SQL Azure, Storage, Service Bus and Cache conditions.

The most interesting aspect to me however is the ExecuteAction and ExecuteAction<T> methods. These methods allow you to basically wrap any user code in a retry block. Example:

var policy = new RetryPolicy<SqlAzureTransientErrorDetectionStrategy>(MaxRetries, TimeSpan.FromMilliseconds(DelayMs));
policy.ExecuteAction(() => myObject.DoSomething());
Retry Pattern

What is great about these methods is that they enable you to use the decorator pattern to add retry logic to your service. This of course assumes you built your service with extensibility in mind.

In my example I have a UriRepository which is defined by the IUriRepository interface. I have a SQLAzureUriRepository that implements the interface. This class however contains no retry logic. Instead I implemented a RetryUriRepository that also implements IUriRepository. RetryUriRepository allows you to specify, via constructor injection, which IUriRepository to wrap with retry logic.

Here is a snippet of the RetryUriRepository:

public class RetryUriRepository : IUriRepository
{
    private readonly IUriRepository _uriRepository;
    private const int MaxRetries = 10;
    private const double DelayMs = 2000;

    public RetryUriRepository(IUriRepository uriRepository)
    {
        _uriRepository = uriRepository;
    }

    public void InsertUri(string shortUri, string longUri, string ipAddress)
    {
        var policy = GetRetryPolicy();
        policy.ExecuteAction(() => _uriRepository.InsertUri(shortUri, longUri, ipAddress));
    }

    private static RetryPolicy GetRetryPolicy()
    {
        return new RetryPolicy<SqlAzureTransientErrorDetectionStrategy>(MaxRetries, TimeSpan.FromMilliseconds(DelayMs));
    }
}

Using the supplied framework might be overkill, but it should give you an idea on how to implement retry logic in your service.

Too Much Retry

One thing that becomes interesting is when the number of retries increases. This typically indicates either a longer error condition, or that you are overloading the services you are consuming. The most likely cause, and the only one we can do anything about, is the latter. The more throttling that goes on, the more retries. The more retries, the less throughput. The less throughput, the slower the response time. Poor response time = disgruntled users (and executives).

Don’t be tempted to turn off the retry logic when this happens. This will just make the problem much worse. About the only solution when dealing with overloading a service is to either scale that service out, or attempt to delay the processing using a queue/worker pattern.


Implementing retry logic is critical if you want your service to keep running. Monitoring the frequency of these retries can be a good indicator you are starting to experience scale issues. Don’t turn your retry logic off to handle scale issues.


Richard Seroter (@rseroter) decides When to use SDKs and when to “Go Native” in a 10/10/2011 post:

I’m going to China next week to speak at QCon and have spent the last few weeks building up some (hopefully interesting) demos. One of my talks is on “cloud integration patterns” and my corresponding demos involve Windows Azure, a .NET client application, Salesforce.com, Amazon Web Services (AWS) and Cloud Foundry. Much of the integration that I show uses AWS storage and I had to decide whether I should try and use their SDKs or go straight at their web service interface. More and more, that seems to be a tough choice.

Everyone loves a good SDK. AWS has SDKs for Java, .NET, Ruby and PHP. Microsoft provides an SDK for .NET, Java, PHP and Ruby as well. However, I often come across two issues when using SDKs:

  1. Lack of SDK for every platform. While many vendors do a decent job of providing toolkits and SDKs for key languages, you never see one for everything. So, even if you have the SDK for one app, you may not have it for another. In my case, I could have used the AWS SDK for .NET for my “on-premises” application, but would have still likely needed to figure out the native API for the Salesforce.com and Cloud Foundry apps.
  2. Abstraction of API details. It’s interesting that we continue to see layers of abstraction added to technology stacks. Using the native, RESTful API for the Azure AppFabric Service Bus (think HttpWebRequest) is quite different from using the SDK objects. However, there’s something to be said for understanding what’s actually happening when consuming a service. SDKs frequently hide so much detail that the developer has no idea what’s really going on. Sometimes that’s fine, but to point #1, the knowledge gained through an SDK is rarely portable to environments where no SDK exists.

I’ll write up the details of my QCon demos in a series of blog posts, but needless to say, using the AWS REST API is much different than going through the SDK. The SDK makes it very simple to query or update SimpleDB for example, but the native API requires some knowledge about formatting the timestamp, creating a hashed signature string and parsing the response. I decided early on to go at the REST API instead of the .NET SDK for AWS, and while it took longer to get my .NET-based integration working, it was relatively easy to take the same code (language changes notwithstanding) and load it into Cloud Foundry (via Ruby) and Salesforce.com (via Apex). Also, I now really understand how to securely interact with AWS storage services, regardless of platform. I wouldn’t know this if I only used the SDK.
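To give a flavor of what the SDK hides: signing a request generally means HMAC-hashing a canonical string (built from the verb, timestamp, and resource) with your secret key and Base64-encoding the result. The sketch below (Python) is illustrative only; the exact canonical string format is defined by the AWS documentation and must be followed precisely, and the key and resource names here are hypothetical:

```python
import base64
import hashlib
import hmac
from datetime import datetime, timezone

def sign_request(secret_key, string_to_sign):
    # HMAC-SHA256 the canonical string with the secret key, then Base64-encode.
    digest = hmac.new(secret_key.encode("utf-8"),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode("ascii")

# Build a simplified canonical string from the HTTP verb, a timestamp,
# and the resource path (real services mandate a stricter layout).
timestamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
signature = sign_request("my-secret-key", "GET\n" + timestamp + "\n/my-bucket")
print(len(signature))  # 44 characters: Base64 of a 32-byte SHA-256 digest
```

An SDK performs this dance for you on every call; doing it once by hand makes clear why a clock-skewed client or a stray whitespace character in the canonical string yields an authorization failure.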

I thought of this issue again when reading a great post on using the new Azure Service Bus Queues. The post clearly explains how to use the Azure AppFabric SDK to send and receive messages from Queues. But when I finished, I also realized that I haven’t seen many examples of how to do any of the new Service Bus things in non-.NET environments. I personally think that Microsoft can tell an amazing cloud integration story if they just make it clearer how to use their Service Bus resources on any platform. Would we be better off seeing more examples of leveraging the Service Bus from a diverse set of technologies?
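As a hedged illustration of what such a non-.NET example might look like, the sketch below (Python) merely assembles the pieces of the REST call that sends a message to a Service Bus queue: a POST to the queue's /messages address carrying a WRAP access token in the Authorization header, which was the 2011-era convention. The namespace, queue name and token are placeholders, and the endpoint shape and token scheme should be verified against the AppFabric Service Bus REST documentation.

```python
def build_send_request(namespace, queue, token, body):
    """Assemble the parts of a REST 'send message' call to a Service Bus
    queue and return (method, url, headers, body) without sending anything,
    so the shape of the request is visible on any platform."""
    url = f"https://{namespace}.servicebus.windows.net/{queue}/messages"
    headers = {
        # 2011-era Service Bus REST calls authenticated with a WRAP
        # access token issued by the Access Control Service.
        "Authorization": f'WRAP access_token="{token}"',
        "Content-Type": "text/plain",
    }
    return ("POST", url, headers, body)
```

Any HTTP-capable environment (Ruby on Cloud Foundry, Apex on Salesforce.com) could then issue this request directly, no SDK required.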

So what do you think? Do SDKs make us lazy developers, or are we smarter for not concerning ourselves with plumbing if a vendor has reliably abstracted it for us? Or should developers first work with the native APIs, and then decide if their production-ready code should use an SDK instead?

<Return to section navigation list>

Visual Studio LightSwitch and Entity Framework 4.1+

The Visual Studio LightSwitch Team (@VSLightSwitch) described How to Create a RIA Service Wrapper for an Editable OData Source in a 10/11/2011 post:


LightSwitch has built-in support for SQL Server and SharePoint data sources. To access other data sources, you can write a custom WCF RIA DomainService. This post will show you how to read from and write to an OData service by wrapping access to it in a DomainService.

There are a couple of limitations on the OData services that can be exposed using RIA Services and LightSwitch.

Complex Types

While both OData and RIA Services support complex types on entities, LightSwitch does not. If a complex type property is exposed on an entity, LightSwitch will import the entity, ignoring that property. There are a couple of workarounds for this that we will detail in another blog post.

Navigation Properties without Foreign Keys

An OData service can contain navigation properties that are not associated with any foreign key. This is likely the case with many-to-many relationships, but can also occur for 0..1-Many or 1-Many relationships. For example, the Netflix OData Catalog contains a many-to-many relationship between Titles and Genre. Unfortunately, RIA Service associations are foreign key based. If an OData association is not foreign key based, there isn't a good way to represent it over a RIA Service.

If an OData service does contain these types of associations, there isn't currently a way to represent them in LightSwitch. However, you can add parameterized queries on your RIA Service that can be called from LightSwitch. Using this functionality, queries that represent these unsupported associations can be exposed. For Netflix, for example, you could define GetGenresByTitle and GetTitlesByGenre queries on your RIA Service, which call into the appropriate OData navigation properties.

The basic steps to create an OData wrapper DomainService for LightSwitch are as follows:

  1. Create a class library project
  2. Add a service reference to your OData service
  3. Add a WCF DomainService to your project
  4. Add a metadata class to provide appropriate information to LightSwitch about the classes exposed by the service reference
  5. Add query methods to your DomainService to expose each entity class on the OData service
  6. Add methods to your DomainService to Create, Update and Delete each entity class

Steps 1-5 are covered in the How to create a RIA service wrapper for OData Source post. This blog post will build on and update the DomainService created in that post.

Allowing LightSwitch To Specify Connection Information

The first blog post assumed that the address of the OData service was hard-coded in your DomainService. We will now modify our class to allow the address to be specified when consuming the RIA Service from LightSwitch.

The Add Data Source wizard in LightSwitch will prompt developers for a connection string when adding a DomainService data source. This connection string will be stored in the web.config for the project using the class name of the DomainService. We'll modify our DomainService to get the connection information from the web.config in its Initialize method.

First add references to System.Web and System.Configuration.


Add a Description attribute to the DomainService class. This description will be displayed in the Add Data Source wizard when requesting a connection string from the user.

<Description("Specify the address to the ProductCatalog Service")> _
Public Class ProductService
   Inherits DomainService

   ' Modify the Initialize method to check for and use the address specified from LightSwitch.
   Public Overrides Sub Initialize(ByVal context As System.ServiceModel.DomainServices.Server.DomainServiceContext)
      ' Get connection information from the web.config
      Dim connection = Web.Configuration.WebConfigurationManager.ConnectionStrings(GetType(ProductService).FullName)
      If connection Is Nothing OrElse String.IsNullOrWhiteSpace(connection.ConnectionString) Then
         Throw New Exception("The address to the RIA Service must be provided when attaching to this data source from LightSwitch.")
      End If
      Dim url As String = connection.ConnectionString
      _context = New ProductCatalog.ProductCatalogEntities(New Uri(url))
      MyBase.Initialize(context)
   End Sub
Providing a Submit Method for the RIA Service

The Submit method of the DomainService will be called whenever LightSwitch attempts to save any changes for the data source. The Submit method will need to process each changed entity and then save changes to the OData service.

In our OData Service, each Product has an associated Category. To ensure that the reference between Product and Category is correctly maintained, our RIA Service will need to process Categories prior to Products. This is done by re-ordering the set of changed entities prior to processing them. This reordering will need to be customized based on the structure of each OData service. The following class will handle ordering the change set.

Public Class ProductEntitiesComparer
   Inherits Comparer(Of Object)

   Public Overrides Function Compare(x As Object, y As Object) As Integer
      ' Categories sort before Products so that they are processed first.
      If TypeOf x Is ProductCatalog.Product AndAlso TypeOf y Is ProductCatalog.Category Then
         Return 1
      ElseIf TypeOf x Is ProductCatalog.Category AndAlso TypeOf y Is ProductCatalog.Product Then
         Return -1
      Else
         Return 0
      End If
   End Function
End Class

Once the change set has been reordered, we will need to process each record in the change set. This is achieved by calling the base implementation of Submit. The base implementation of Submit simply calls the separate Update, Create and Delete methods for each entity type. We will provide these next.

After each record has been processed, we need to save the changes to the OData service. Since a given save can include more than one record and these records are dependent on each other, we'll need to save in batch mode.

Public Overrides Function Submit(changeSet As ChangeSet) As Boolean
   ' Reorder the change set to ensure that categories are processed before
   ' products, since products are dependent on categories.
   Dim c As New ChangeSet(changeSet.ChangeSetEntries.OrderBy(Function(entry) entry.Entity, New ProductEntitiesComparer()))
   Dim baseResult As Boolean = MyBase.Submit(c)
   ' Save the accumulated changes to the OData service as a single batch,
   ' since the records in a save can depend on each other.
   _context.SaveChanges(SaveChangesOptions.Batch)
   Return baseResult
End Function
Providing the Create, Update and Delete Methods for Category

For each method, we first need to attach the Category to the DataServiceContext object. For the Update and Delete methods, we also need to specify what operation is occurring on the attached object. The methods are listed below.

Public Sub CreateCategory(ByVal c As ProductCatalog.Category)
   ' Add the new category to the service reference context
   _context.AddObject("Categories", c)
End Sub

Public Sub UpdateCategory(ByVal c As ProductCatalog.Category)
   ' Attach the object to the context and specify that it has been updated
   _context.AttachTo("Categories", c)
   _context.UpdateObject(c)
End Sub

Public Sub DeleteCategory(ByVal c As ProductCatalog.Category)
   ' Attach the object to the context and specify that it has been deleted
   _context.AttachTo("Categories", c)
   _context.DeleteObject(c)
End Sub
Providing the Create, Update and Delete Methods for Product

These methods are very similar to those for Category. However, for the CreateProduct method, we need to inform the DataServiceContext that there is a relationship (link) between Product and Category. This will ensure that the newly added Product will be correctly associated with a Category. It is necessary to do this on the "1" side of a relationship.

Public Sub CreateProduct(ByVal p As ProductCatalog.Product)
   ' Add the new product to the service reference context
   _context.AddObject("Products", p)
   ' Need to set the link between Product and Category (to ensure that inserts to the database are ordered correctly)
   ' For existing categories, get the category first
   If p.Category Is Nothing Then
      p.Category = _context.Categories.Where(Function(c) c.ID = p.CategoryID).FirstOrDefault()
   End If
   ' Set the link between the product and category
   _context.SetLink(p, "Category", p.Category)
End Sub

Public Sub UpdateProduct(ByVal p As ProductCatalog.Product)
   ' Attach the object to the context and specify that it has been updated
   _context.AttachTo("Products", p)
   _context.UpdateObject(p)
End Sub

Public Sub DeleteProduct(ByVal p As ProductCatalog.Product)
   ' Attach the object to the context and specify that it has been deleted
   _context.AttachTo("Products", p)
   _context.DeleteObject(p)
End Sub

This approach can be extended to an OData service with an arbitrary number of entity types. The only modifications that need to be made are in the Submit and the Create<Entity> methods. In the Submit method, the change set will need to be re-ordered to ensure that parent types are processed before their children. In the Create<Entity> methods, the links between entity types will need to be specified on the "1" or child end of the relationship.
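The re-ordering rule generalizes. As a language-neutral sketch (Python here, rather than the Visual Basic used above), the change set can be sorted against a declared parent-to-child ordering of entity types, so that every parent is processed before its children; the type names and dictionary shape below are illustrative, not part of the RIA Services API.

```python
def order_change_set(entries, type_order):
    """Sort change-set entries so parent entity types come before their
    children. `type_order` lists type names from parents to children;
    unknown types keep their relative position at the end (sorted is stable)."""
    rank = {name: i for i, name in enumerate(type_order)}
    return sorted(entries, key=lambda e: rank.get(e["type"], len(type_order)))

# Categories must be processed before the Products that reference them.
changes = [
    {"type": "Product", "id": 7},
    {"type": "Category", "id": 1},
    {"type": "Product", "id": 8},
]
ordered = order_change_set(changes, ["Category", "Product"])
```

With more entity types, extending `type_order` (e.g. `["Supplier", "Category", "Product"]`) is the only change needed, which mirrors what a per-service comparer class does in the VB version.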

Beth Massi (@bethmassi) posted Resources from Silicon Valley Code Camp LightSwitch Sessions on 10/10/2011:

Thanks to Peter Kellner and all the folks who made for another extraordinary Silicon Valley Code Camp this year! There were over 2300 people that came to Foothill College in Los Altos this weekend. And what gorgeous weather we had!

Thanks to all those folks who came out to my LightSwitch talks Saturday afternoon, I had a blast! Here are all the resources from the sessions to check out. Of course, all of these resources are available from the LightSwitch Developer Center, including the free trial download.

Intro Session: Building Business Applications Quickly with Visual Studio LightSwitch
Advanced Session: LightSwitch Advanced Development and Customization Techniques

And here are some more good resources that you’ll want to explore as you start building your LightSwitch business apps!


The ADO.NET Entity Framework Team announced EF 4.2 Release Candidate Available on 9/28/2011 (missed due to Atom feed failure):

We recently posted about our plans to rationalize how we name, distribute and talk about releases. We heard a resounding ‘Yes’ from you so then we posted about our plans for releasing EF 4.2. We then shipped EF 4.2 Beta 1.

Third party EF provider writers tried out EF 4.2 Beta 1 and identified a couple more areas that were causing issues for them. We have been working to improve these areas and today we are making EF 4.2 Release Candidate available. The final release of EF 4.2 will be available in the near future.

EF 4.2 = Bug Fixes + Semantic Versioning

When we released ‘EF 4.1 Update 1’ we introduced a bug that affects third party EF providers using a generic class for their provider factory implementation, such as WrappingProviderFactory<TProvider>. We missed this during our testing and it was reported by some of our provider writers after we had shipped. If you hit this bug you will get a FileLoadException stating “The given assembly name or codebase was invalid”. This bug is blocking some third party providers from working with ‘EF 4.1 Update 1’, and the only workaround for folks using an affected provider is to remain on EF 4.1. Third party provider writers also identified some areas in EF where it was hard to get EF to work with their providers, so we decided to address these issues in the EF 4.2 release. These provider-related changes will be the only changes between ‘EF 4.1 Update 1’ and ‘EF 4.2’.

Obviously a single bug fix wouldn’t normally warrant bumping the minor version, but we also wanted to take the opportunity to get onto the semantic versioning path rather than calling the release ‘EF 4.1 Update 2’.

Getting Started

The following walkthroughs are available for EF 4.2:

Getting EF 4.2 Release Candidate

EF 4.2 Release Candidate is available via NuGet as the EntityFramework.Preview package. If you already have the EntityFramework package installed then updating to the latest version will give you EF 4.2.
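Concretely, from the Visual Studio Package Manager Console this looks like the following (using the package names given above):

```powershell
# Pull in the EF 4.2 Release Candidate preview package
Install-Package EntityFramework.Preview

# Or, if the EntityFramework package is already installed,
# update it to the latest version to get EF 4.2
Update-Package EntityFramework
```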


Model First & Database First Templates

The templates for using the DbContext API with Model First and Database First are now available under the “Online Templates” tab when “Right-Click –> Add Code Generation Item…” is selected on the EF Designer.



This is a preview of features that will be available in future releases and is designed to allow you to provide feedback on the design of these features. It is not intended or licensed for use in production. If you need assistance we have an Entity Framework Pre-Release Forum.

What’s Not in This Release?

As covered earlier this release is just a small update to the DbContext & Code First runtime. The features that were included in EF June 2011 CTP require changes to the Core Entity Framework Libraries that are part of the .NET Framework and will ship at a later date. Note that Code First Migrations is not compatible with the EntityFramework.Preview package. Please continue to use the most recent EntityFramework package when working with Migrations. Our Code First Migrations work is continuing and we are working to get the next release in your hands soon.

Open attached fileEntity-Framework-4.2-RC-_2D00_-EULA.rtf

<Return to section navigation list>

Windows Azure Infrastructure and DevOps

Avkash Chauhan (@avkashchauhan) recommended on 10/10/2011 that you Try SCOM for Deploying and Managing Windows Azure Applications for 180 days free using [a] pre-configured VHD:

So if you always wanted to try the System Center Operations Manager (SCOM) plugin for managing Windows Azure applications, [I] think this post will give you a good start. You just need to download a VHD which is already pre-configured with everything you need in this regard.

If you are new to Windows Azure:

  • You can actually deploy a new Sample Windows Azure Application and then manage it with SCOM plugin for Windows Azure.

If you already have a running Windows Azure application:

  • You can use this pre-configured VHD to actually manage & monitor your already running Windows Azure Application.

In this evaluation, users will follow a step-by-step guide to deploy a sample application to Windows Azure and then use System Center Operations Manager to monitor and manage that application.

The following components are included in the Deploying and Managing Windows Azure Applications evaluation:

  • Windows Azure Platform Trial - Sign up for a free trial of the Windows Azure platform or use your own account.
  • Evaluation Guide - This step-by-step guide walks you through deploying and managing an application on Windows Azure.
  • Windows Azure-ready Application - You will deploy this sample application on Windows Azure.
  • System Center Operations Manager 2007 R2 VHD - This fully configured virtual machine provides the environment from which you will deploy the sample application on Windows Azure and demonstrates System Center Operations Manager's capabilities for managing applications and IT services in a Windows Azure public cloud environment.
  • Windows Azure Application Monitoring Management Pack - The Windows Azure Application Monitoring Management Pack for System Center Operations Manager enables you to monitor the availability and performance of applications that are running on Windows Azure.
  • Windows Azure Software Development Kit and Tools for Visual Studio - To further explore Windows Azure, these tools extend Visual Studio 2010 and Visual Web Developer 2010 Express Edition to enable you to create, build, debug, run, and package scalable services on Windows Azure.

System Requirements

  • Windows Server 2008 R2 with the Hyper-V role enabled.
  • RAM: 8 GB or more recommended
  • Hard disk space required for install: 40 GB minimum (80 GB recommended)
  • Internet connection

Available in:

  • 64-bit edition
  • VHD (virtual hard drive) download
  • These languages: English
  • The image has a 180-day evaluation period. After this time, you will need to uninstall the software or upgrade to fully licensed versions of the products on this VHD.

Visit Announcement: http://technet.microsoft.com/en-us/evalcenter/hh282846.aspx?wt.mc_id=otc-f-corp-jtc-DPU-TEC_97_1_22

My Configuring the Systems Center Monitoring Pack for Windows Azure Applications on SCOM 2012 Beta post of 9/7/2011 describes issues with using the SCOM Monitoring Pack for Windows Azure with SCOM 2012 beta.

Stuart J. Johnson (@stuartj1000) asserted “Neglected until recently, platform-as-a-service (PaaS) is getting set to come into its own” in a deck for his Gartner: Platform-as-a-Service Taking Off article of 10/10/2011 for Datamation:

Use of online services that let customers build applications without investing in all the hardware and software necessary to build them in their own data centers is clearly on the rise -- proof that saving money can be a strong motivator.

However, while more visible categories of services get the limelight today, that may be changing.

So-called platform-as-a-service (PaaS) revenues are likely to reach $707.4 million this calendar year, a big jump from $512.4 million last year, according to a leading researcher's latest report.

"Cloud [computing] has three technological aspects -- infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS) and finally software-as-a-service (SaaS)," Fabrizio Biscotti, research director at analyst firm Gartner, said in the report.

"While SaaS is the most developed aspect, PaaS is the least developed, and it is where we believe the battle between vendors is set to intensify," Biscotti said.

Additionally, although initial deployments focused primarily on application services, PaaS is becoming more sophisticated over time.

"The market has since expanded to encompass other middleware capabilities as a service, such as integration, process management, and portal and managed file transfers (MFTs)," the report said.

In fact, the low ends of the portal, application server, and business process management markets are already seeing erosion due to increasingly mature PaaS offerings. In the not-too-distant future, the upper ends of those markets will also come under threat.

That, in turn, will likely grow the application integration and middleware market by attracting customers that otherwise might choose to go with packaged applications and desktop productivity products.

However, market forces such as customers' desire to keep platform capabilities in-house, and to choose best-of-breed PaaS services that reside in different data centers, will likely drive consolidation.

"Mainstream users of PaaS services will likely look for providers that deliver comprehensive and integrated PaaS functionality suites -- forcing the specialist offerings to consolidate," Yefim Natis, vice president and distinguished analyst at Gartner, said in the report.

PaaS will mature in steps over the next few years.

"Around 2013, PaaS functionalities will consolidate around specific usage scenarios, paving the way for integrated comprehensive PaaS offerings to emerge from 2015 and beyond," the report said.

The report is entitled, "Forecast: Platform as a Service, Worldwide, 2010-2015, 3Q11 Update."

<Return to section navigation list>

Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds

Scott Densmore (@scottdensmore) posted Windows Azure Guidance V3 - Hybrid Applications First Drop on 10/10/2011:

We have been working on a new guide that focuses on hybrid applications with Windows Azure, connecting on-premises and off-premises applications. The scenario features a fictitious company, "Trey Research," that supplies custom parts, with fulfillment handled by offsite vendors. We show how to connect the application running in the cloud back to the corporate on-premises application.

This first drop focuses on the conversation with the shipping providers. It uses the Service Bus V2 (AppFabric) shipped at Build. The web site runs on Windows Azure and allows you to create orders, and a simple WinForms application polls the queue.

To run the app: start the Windows Azure Cloud project, then right-click the WinForms application and choose Debug > Start New Instance.

This is our first drop and we have quite a bit to go. Your feedback can help us shape the future.

Go get the drop here and leave your feedback!

<Return to section navigation list>

Cloud Security and Governance

Lori MacVittie (@lmacvittie) posted 1024 Words: If Neo Were Your CSO … on 10/11/2011:


When nearly half of folks experienced a stateful firewall failure under attack last year[1], maybe more of the same isn’t the right strategy.

[1] Arbor Networks, Network Infrastructure Security Report

<Return to section navigation list>

Cloud Computing Events

1105 Media will present a SharePoint Live Virtual Conference & Expo on 10/26/2011 from 7:30 AM to 4:30 PM PDT:

SharePoint Live Virtual Conference & Expo offers a day of in-depth technical training for IT professionals and developers. Learn best practices from the field as our technical experts show you how to deploy, manage and build solutions for SharePoint.

Steve Fox will present the opening keynote address, SharePoint and the Cloud:

As IT moves to the cloud, there are many positive implications for SharePoint. In this keynote, we'll explore this mass migration to the cloud and talk about how key technologies such as Windows Azure, Bing Services, and Windows Phone 7 can not only integrate with SharePoint, but also augment the experience—both from the on-premises SharePoint and SharePoint Online perspectives.

Brian Prince (@brianhprince) interviewed Cory Fowler (@SyntaxC4) in Bytes by MSDN’s Cory Fowler and Brian Prince video segment of 10/11/2011:

Join Brian Prince, Sr. Architect Evangelist at Microsoft, and Cory Fowler, Consultant at ObjectSharp in Canada as they discuss Windows Azure. Cory talks about the community work he’s done to lend his tips and tricks of the trade to help customers of all backgrounds deploy or migrate their projects to the cloud. He also hits on the importance of knowing and loving start up tasks and the benefits of utilizing Azure’s VM Role. Tune in for some great tips on Windows Azure!  

Video Downloads
WMV (Zip) | WMV | iPod | MP4 | 3GP | Zune | PSP

Audio Downloads
AAC | WMA | MP3 | MP4

Mike Benkovich (@mbenko) will present an MSDN Webcast: Technical Chat Series on Windows Azure (Part 04): Windows Azure and the Phone (Level 200) on 10/18/2011 at 8:00 AM PDT:

  • Event ID: 1032494325
  • Starts: Tuesday, October 18, 2011 8:00 AM
  • Time zone: (GMT-08:00) Pacific Time (US & Canada)
  • Duration: 1 hour(s)
  • Language(s): English.
  • Product(s): Windows Azure.
  • Audience(s): Pro Dev/Programmer.

Join us as we look at developing a typical Windows Phone 7 application by using data and services in the web. We cover creating the data service and consuming it in our application, and talk about the available tools and techniques that make this easy.

Join us for this live Technical Chat webcast and have your questions answered by Microsoft experts.

Presenter: Mike Benkovich, Senior Developer Evangelist, Microsoft Corporation

For additional chats in this series through 11/30/2011, see my Windows Azure and Phone ‘Mango’ Live MSDN Chats for 10/12 through 11/30/2011 post of 10/11/2011.

Mike Benkovich (@mbenko) will present an MSDN Webcast: Technical Chat Series on Windows Azure (Part 05): Windows Azure and SharePoint (Level 200) on 10/25/2011 at 8:00 AM:

  • Event ID: 1032494334
  • Starts: Tuesday, October 25, 2011 8:00 AM
  • Time zone: (GMT-08:00) Pacific Time (US & Canada)
  • Duration: 1 hour(s)
  • Language(s): English.
  • Product(s): Windows Azure.
  • Audience(s): Pro Dev/Programmer.

Microsoft SharePoint provides collaboration and list processing capabilities as well as a great way to host a Microsoft Silverlight application. During this webcast, we build out a Silverlight web part that uses the cloud to process a workflow and is delivered from a SharePoint solution.

Join us for this live Technical Chat session and have your questions answered by Microsoft experts.

Presenter: Mike Benkovich, Senior Developer Evangelist, Microsoft

Jesus Rodriguez announced on 10/11/2011 a Managing your IT Systems from your SmartPhone or Tablet: Moesion Webinar Tomorrow (10/12/2011) at 11:00 AM PDT:

This Wednesday we will be hosting the first public webinar about Moesion (http://www.moesion.com). In this webinar, we will be highlighting how you can manage your IT systems, whether on-premises or in the cloud, from your smartphone or tablet. Specifically, we will be demonstrating how you can use Moesion to manage many capabilities of your IT infrastructure including:


  • Event Log
  • Windows Services
  • Windows Processes
  • File system
  • Internet Information Services 6.0, 7.0,7.5
  • SQL Server 2005, 2008, 2008 R2
  • SharePoint Server 2007, 2010
  • BizTalk Server 2004, 2006, 2006 R2, 2009, 2010
  • Windows Azure
  • System Info
  • Users
  • Groups
  • Devices
  • Hotfixes
  • Disks
  • Environment Variables

Additionally, we will show you how you can publish, execute and share your own scripts from your mobile device. We will also highlight ho[w] you can start managing your servers with Moesion in 2 clicks. Finally, we will discuss the immediate Moesion roadmap and how you can help us to make the product better.

You can register for the webinar here http://www.regonline.com/Register/Checkin.aspx?EventID=1020612.

Robin Shahan (@RobinDotNet) posted Code from the October 2011 Silicon Valley Code Camp Talks on 10/10/2011:

As promised, I am posting the code from my talk at the Silicon Valley Code Camp on Saturday, October 8th. For those of you who attended both sessions, I added a boolean to the WCF service to let you flip back and forth between SQL Azure and Windows Azure Table Storage without changing the code.


I have also included a copy of the SQL Azure database that you can run locally or migrate to Azure. You can download the goods by clicking here.

<Return to section navigation list>

Other Cloud Computing Platforms and Services

Bill McColl (@mccoll) asserted “We're now entering the third generation of office software” in a deck for his Get Ready for the "BigData Office" article for the Cloudscale blog:

First it was just about documents, then about collaboration, now it's all about data. Office software is changing, and changing fast. It's moving to the cloud, and moving to the age of big data.

Data is growing exponentially everywhere. With this relentless data deluge, every enterprise from global Fortune 50 companies down to SMBs, every government agency, and every R&D lab, now needs a "big data office" solution to ensure that everyone in the organization has instant access to all the information they require, at all times.

We're now entering the third generation of office software.

The first generation was all about documents - emails, calendars, word processing, presentations, spreadsheets - with products such as Microsoft Office, Google Docs, OpenOffice, Apple iWork, Zoho and others. The second generation was primarily focused on collaboration - enterprise social networks, document sharing, file sharing - with products like SharePoint, DocVerse, Jive, Yammer, Chatter, Dropbox.

The third wave of innovation in office software will be all around data, and in particular around big data. Third generation office software will complement the previous two generations by providing data stores and app stores that support all kinds of data sharing, analytics and app sharing. Think of it like a "Dropbox for big data analytics" where anyone can easily store, share, explore and analyze the exponentially growing volumes of data in their work and in their life. Big data analytics meets the consumer web.

Unlike traditional analytics tools like SQL and Hadoop, these new office platforms will be highly interactive and realtime, and will be for everyone - business users, data scientists, app developers, individuals. Anyone, or any organization, that needs a simpler way to handle today’s explosively growing data volumes.

Within large organizations, the growth of big data office software will be viral, just like Dropbox or Skype. Sharing data and apps will create powerful network effects, unleashing data-driven creativity and innovation everywhere.

The key technological challenge in building big data office software is how to deliver the extreme simplicity, speed and scale required. How do we enable business users and other non-programmers to easily and quickly build fast, scalable big data apps?

At Cloudscale we recently cracked the code on this problem, and we've now developed the first big data office platform. Easy-to-use, super-fast and super-scalable, Cloudscale can be used in any application area - business, web, finance, government, healthcare, science. It's available as a public cloud service or as in-house private cloud software.

Bill left Oxford University to found Cloudscale. At Oxford he was Professor of Computer Science, Head of the Parallel Computing Research Center, and Chairman of the Computer Science Faculty.

Adron Hall (@adron) posted First Looks @ AWS Toolkit for Visual Studio 2010 on 10/10/2011:

I’ll be presenting on the AWS Toolkit for Visual Studio 2010 in the very near future (check out the SAWSUG Meetup on October 12th, that’s this Wednesday). I’ll be covering a number of things about the new AWS Toolkit for Visual Studio. My slides are available below (with links to the Google Docs and Slideshare versions).

Direct link to Google Docs Presentation or the SlideShare Presentation.

The code for the presentation is available on Github under AWS-Toolkit-Samples. Beware, this code will be changing over time, the core will stay the same though.

Mark Smith (@marksmithvr) posted Cloudy Forecast for Oracle Fusion for CRM and HCM to his Ventana Research blog on 10/7/2011 (missed when published):

I did not go to Oracle OpenWorld this year because it seemed the company was fixated on appliances and technology with little emphasis on its Fusion applications business; that business focus is a major interest of our firm. Based on the reports of my colleagues on its applications discussion (See: “Apps Hard to Find at Oracle Open World“) and Oracle Exalytics (See: “Oracle Unveils BI Appliance Called Exalytics“) and my review of Oracle’s online materials and keynotes, I was right to skip it. The week was full of diatribes about appliances and infrastructure, while applications played second fiddle. This is a longstanding imbalance for Oracle, perhaps understandable given its history and the need to build revenue from its expensive Sun Microsystems hardware acquisition.

I was a lot more bullish in my analyses of Oracle Fusion CRM and Fusion HCM at last year’s Oracle OpenWorld. This year Oracle made the expected announcements of cloud versions of Oracle Fusion CRM and Oracle Fusion HCM as part of Oracle Public Cloud, but it showed a serious lack of enthusiasm for, and understanding of, its own customers’ dilemma when it comes to getting to Oracle Fusion in the cloud.

In Oracle Fusion HCM and what the company calls the Talent Management Cloud Service, Oracle has made a specific set of its applications available, including compensation, performance and analytics. You can access the cloud computing site for Oracle HCM and see that the developers are getting an onboarding experience started but still face some challenges in making it work. For instance, they say little or nothing about application and data migration from existing applications, whether Oracle’s or someone else’s. No one can start from scratch anymore, so both preloading data and synchronizing it back across the enterprise are important issues. Workforce analytics are also weak, yet our research identified that 68 percent of HR organizations spend most of their time in data-related activities, so users need data integration that operates across and within cloud computing and enterprise environments. Oracle Fusion HCM offers little guidance on how to address the dilemma of incorporating all employee data, nor does it promote the company’s own data-related technologies. Other providers that operate in the cloud might be a better choice.

My latest review of the talent management advancements and business technology innovations that were unveiled at the recent HR Technology Conference confirmed that Oracle is just one of dozens of providers in talent management and one of the newest in providing it in the cloud. But then its archrival SAP is just coming out with its HCM applications for the cloud computing environment, too. Oracle will have to work hard to get the growth that CEO Larry Ellison no doubt expects.

Oracle Fusion CRM and Sales Cloud Service show indications that the company is starting to understand that organizations need not simply sales force automation (SFA) but a broad portfolio of applications designed for specific sales activities and processes. This should help Oracle advance in our next Value Index for Sales assessment. Oracle is already in a respectable position, but it can take a stronger one by getting more adoption of Oracle Fusion CRM.

You can access the cloud site for CRM and see the same type of information as you’ll find for HCM. Considering the number of iterations of Oracle CRM OnDemand that have been released, the company should be more ready to help organizations get up and running, especially if it wants to compete against the latest from Salesforce.com and against cloud newcomer SAP with its new Sales OnDemand offering.

These Oracle Fusion Applications have promise, and though demonstrations are not accessible on the Internet, you can view some screen shots. Oracle has considered usability, functionality and manageability by business. Now it’s addressing reliability and scalability across its public cloud environment, which remains to be proven. Oracle needs to better understand the existing cloud and enterprise data challenges to make its applications operational. Our benchmark research on business data in the cloud found that organizations face a significant challenge in their ability to use data across and within the cloud.

I am not sure that business applications are a high enough priority for Oracle, and that goes double for cloud-based apps, which customers can rent from other providers today. I hope Oracle puts more effort into Oracle Fusion and its applications, especially its cloud computing editions. I think they have potential, but the company is not good at communicating their value. This problem seems to be getting worse even as Oracle has improved the user experience of the applications. I also continue to hear Oracle is one of the most difficult technology companies to work with.

Oracle might consider adding an Oracle AppsWorld to gain credibility along with business attendance and attention; that could lead to customer and business growth. Once Oracle gets fully serious about business applications, and about its customers for them, it will find itself more often on the short list of vendors to evaluate. That won’t happen immediately: in many cases, potential customers evaluating HCM or talent management, along with CRM and more specifically sales in the cloud, currently leave Oracle off the list, and rightfully so.

Mark is CEO and Chief Research Officer of Ventana Research.

<Return to section navigation list>