Sunday, July 26, 2009

LINQ and Entity Framework Posts for 7/20/2009+

Note: This post is updated daily or more frequently, depending on the availability of new articles.

Entity Framework and Entity Data Model (EF/EDM)

Gunnar Peipman demonstrates Entity Framework 4.0: Generating SQL script from model in this 7/23/2009 post:

Entity Framework 4.0 is able to generate a database schema based on the model. If you built your model first and now want to create a database for it, you can use the new Generate Database Script from Model feature. Let’s see how it works. I will use my example gallery model.

Alex James’ Customizing T4 Templates post of 7/22/2009 shows you how to reuse the code from his recent Entity Framework POCO template walkthrough post to “modify the generated code to do some validation, … using a Regular Expression.”
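
Alex’s post modifies the template itself; as a rough, hand-written illustration of the net effect (the Person entity, Email property and pattern below are mine, not Alex’s code), the customized template ends up emitting a validating setter along these lines:

    using System;
    using System.Text.RegularExpressions;

    // Approximation of the kind of setter a validation-aware template could
    // generate; the entity name, property and pattern are illustrative only.
    public class Person
    {
        private static readonly Regex EmailPattern =
            new Regex(@"^[^@\s]+@[^@\s]+\.[^@\s]+$", RegexOptions.Compiled);

        private string _email;

        public string Email
        {
            get { return _email; }
            set
            {
                if (value == null || !EmailPattern.IsMatch(value))
                    throw new ArgumentException("Email is not in a valid format.");
                _email = value;
            }
        }
    }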

Julie Lerman reports a Small EF4/WPF bug that bit me in the butt during my VTdotNET presentation on 7/22/2009 that involves a customized T4 template and a Custom Tool setting.

Alex James continues his Entity Framework tip series this week.

Julie Lerman was also Checking out one of the new stored procedure features in EF4 on 7/22/2009 and found “that it is now simple to call the function in your code without spinning up and executing an EntityConnection and EntityCommand.” She also lamented Life without EF or any ORM the same day.

Matthieu Mezil’s How to have FK in EF v1? post of 7/21/2009 shows you how to add a foreign key property, generated with a T4 template, to populate a DataGridViewComboBoxColumn and an ASP.NET DropDownList.

Tony Sneed’s What’s New and Cool in Entity Framework 4.0 article for DevelopMentor explains “the new support for POCO and n-tier applications in EF 4.0.”

LINQ to SQL

Chris Swain’s Linq to Sql WCF Service Generator post of 7/25/2009 begins:

The thought is that if we could just expose our server application data through a controlled interface, then a variety of client applications could make use of it and all would be right with the world. (Okay, well maybe not that last part.)

The hard part is writing all of the plumbing code to expose your data to the world in the form of a service. WCF has made this much easier for us by generating the WSDL and exposing various attributes to allow us to mark code items related to our service. Then it’s just a matter of writing some methods that you want to expose to the world and adding the appropriate attributes to them. Pretty easy, right?

But what about those instances when you have a very complex database with lots of tables and even more stored procedures? Well, assuming you’re using Sql Server as your database and you can create a Linq to Sql data access layer, then the Linq to Sql WCF Generator Item Template can help.

The i-think Twenty-Two blog’s LINQ to SQL and tables with no Primary Key post observes:

I ran into an interesting issue with LINQ to SQL yesterday. I had to update a table with no Primary Key. As I expected, LINQ to SQL wasn’t too happy with this scenario. Unfortunately LINQ to SQL will only throw an exception when you try to Insert or Delete a record with no primary key. Updates fail silently. …

Without a primary key the two following interfaces aren’t emitted: INotifyPropertyChanging and INotifyPropertyChanged

Therefore LINQ to SQL doesn’t know that your record has changed (so can’t warn you that it can’t update).

Now that you understand the problem the solution is simple: Define a primary key in your table.
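
If you map the table by hand rather than with the designer, the key column also has to be flagged in the LINQ to SQL mapping. A minimal sketch, with illustrative table and column names:

    using System.Data.Linq.Mapping;

    // Illustrative hand-written mapping; the designer generates richer classes
    // (including the INotifyPropertyChanging/INotifyPropertyChanged plumbing)
    // once the underlying table actually has a primary key.
    [Table(Name = "dbo.Setting")]
    public class Setting
    {
        [Column(IsPrimaryKey = true)]
        public int SettingId { get; set; }

        [Column]
        public string Value { get; set; }
    }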

LINQ to Objects, LINQ to XML, et al.

Ayende Rahien reports in his NHibernate Linq 1.0 released! post of 7/26/2009 that LINQ to NHibernate has RTW’d:

The most requested feature for NHibernate for the last few years has been Linq support, and it gives me great pleasure to announce the release of NHibernate Linq 1.0 RTM, which you can download from here.

NHibernate Linq support is based on the existing, proven in production, Linq provider in NHibernate Contrib. The plans to overhaul that and merge it into NHibernate proper for the next release are still active, but the project team feels most strongly that production quality Linq support is something that we ought to provide for our users now.

This Linq release supports just about anything that you can do with the criteria API. We do not support group joins or subqueries in select clauses, since they aren’t supported by the criteria API. NHibernate’s Linq support has been tested (in production!) for the last couple of years, and most people find it more than sufficient for their needs.

We do plan for expanding the Linq support to support more, but the decision has been made that it makes absolutely no sense not to provide an interim, highly capable, release in the meantime for our users. …
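
If memory serves, the provider hangs off ISession through a Linq<T>() extension method and then composes with the standard query operators. A minimal sketch, assuming an already-mapped Customer entity and an open session (both placeholders for your own model):

    using System.Collections.Generic;
    using System.Linq;
    using NHibernate;
    using NHibernate.Linq;

    // Customer stands in for any mapped entity in your own model.
    public class Customer
    {
        public virtual int Id { get; set; }
        public virtual string Name { get; set; }
        public virtual string City { get; set; }
    }

    public static class CustomerQueries
    {
        // The provider translates this to SQL by way of the criteria API.
        public static IList<string> NamesInCity(ISession session, string city)
        {
            return session.Linq<Customer>()
                          .Where(c => c.City == city)
                          .OrderBy(c => c.Name)
                          .Select(c => c.Name)
                          .ToList();
        }
    }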

LinqMaster explains Using IEqualityComparer and Lambda Expressions in this 7/26/2009 post:

Anyone using LINQ to manipulate in-memory collections is probably also using plenty of lambda expressions to make things quite easy. These two additions were really meant for each other. One of our interns here recently ran into an interesting problem while using LINQ. As a relatively new user of .NET based languages, reference types caused him a bit of trouble.
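
The gotcha is that Distinct, Except and friends compare reference types by reference unless you hand them a comparer. One common workaround (my own sketch, not necessarily the article’s) is a small adapter that turns a pair of lambdas into an IEqualityComparer<T>:

    using System;
    using System.Collections.Generic;

    // Adapts two lambdas into an IEqualityComparer<T> so LINQ's set-based
    // operators can compare by a chosen key instead of by reference.
    public class LambdaComparer<T> : IEqualityComparer<T>
    {
        private readonly Func<T, T, bool> _equals;
        private readonly Func<T, int> _getHashCode;

        public LambdaComparer(Func<T, T, bool> equals, Func<T, int> getHashCode)
        {
            _equals = equals;
            _getHashCode = getHashCode;
        }

        public bool Equals(T x, T y) { return _equals(x, y); }

        public int GetHashCode(T obj) { return _getHashCode(obj); }
    }

Used, for example, as people.Distinct(new LambdaComparer<Person>((a, b) => a.LastName == b.LastName, p => p.LastName.GetHashCode())), where Person and LastName are placeholder names.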

Brian Mains’ LINQ to XML: Non white space characters cannot be added to content post of 7/25/2009 notes, “I got this error when setting up some LINQ to XML code and I couldn't figure out why I kept getting this error as it made no sense.”
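
Brian’s excerpt doesn’t name the cause, but one common way to hit that exception is handing an XML string directly to the XDocument constructor, which treats it as text content (and a document may only contain whitespace text directly); parsing the string instead avoids it:

    using System;
    using System.Xml.Linq;

    class XmlStringDemo
    {
        static void Main()
        {
            string xml = "<root><child/></root>";

            // new XDocument(xml) would throw System.ArgumentException:
            // "Non white space characters cannot be added to content."

            XDocument doc = XDocument.Parse(xml);   // parse the markup instead
            Console.WriteLine(doc);
        }
    }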

Jim Wooley reports his Site updated with MVC on 7/24/2009. His MVC Sitemap Navigation with XML Literals post of 7/22/2009 describes part of the update process.

Beth Massi describes her Channel 9 Interview: LINQ Language Deep Dive with Visual Studio 2008 of 7/21/2009 as follows:

In this interview I sit down with Jonathan Aneja, a Program Manager on the Visual Basic Compiler team, who dives deep into these features like Type Inference, Anonymous Types, Lambda Expressions, Expressions Trees, and more. He explains what's actually happening behind the scenes and all the work the compiler is doing for you when you write a LINQ query.

ADO.NET Data Services (Astoria)

No significant new Astoria articles this week.

ASP.NET Dynamic Data (DD)

David Ebbo’s Using an Associated Metadata Class outside Dynamic Data post of 7/24/2009 begins:

A while back, I blogged about how ASP.NET Dynamic Data apps can use an Associated Metadata class (aka a ‘buddy’ class) to add metadata attributes to properties defined in a generated class.  It’s a mostly ugly thing that was made necessary by limitations of the C# and VB.NET languages: they don’t let you add attributes to properties defined in another partial class.

What I didn’t mention there is that this ‘buddy’ class mechanism is actually not specific to Dynamic Data apps, and can in fact be used anywhere.  Since I’ve recently heard of several cases of users trying to do something similar, I’ll describe how it’s done.  If you’re familiar with TypeDescriptionProviders (which have been around since ancient times), this will look very trivial.
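
For anyone who hasn’t seen the mechanism, it boils down to two pieces: a MetadataType attribute pointing at the buddy class, and (outside Dynamic Data) registering the associated-metadata TypeDescriptionProvider so TypeDescriptor surfaces the buddy’s attributes. A rough sketch with illustrative Product names, not David’s exact code:

    using System.ComponentModel;
    using System.ComponentModel.DataAnnotations;

    // The "real" class, typically a partial class generated elsewhere.
    [MetadataType(typeof(ProductMetadata))]
    public partial class Product
    {
        public string Name { get; set; }
        public decimal Price { get; set; }
    }

    // The buddy class carries the attributes the generated partial can't.
    public class ProductMetadata
    {
        [Required]
        [StringLength(40)]
        public object Name { get; set; }
    }

    public static class MetadataRegistration
    {
        // Call once at startup so TypeDescriptor picks up the buddy metadata.
        public static void Register()
        {
            TypeDescriptor.AddProviderTransparent(
                new AssociatedMetadataTypeTypeDescriptionProvider(typeof(Product)),
                typeof(Product));
        }
    }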

Rick Anderson’s Using DataAnnotations in MVC 2 - Catching up to Dynamic Data post of 7/23/2009 appears to be missing most of its intended content. If Rick wrote an article for MSDN Magazine, I couldn’t find it.

Miscellaneous (WPF, WCF, MVC, Silverlight, etc.)

Aaron Skonnard’s RESTful Services With ASP.NET MVC article for MSDN Magazine’s July 2009 issue “describes how to use XHTML and ASP.NET MVC to implement REST services.”

Justin Etheredge describes Building Testable ASP.NET MVC Applications in MSDN Magazine’s July 2009 issue.

Windows Azure and Cloud Computing Posts for 7/20/2009+

Windows Azure, Azure Data Services, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 

••• Updated 7/25/2009: Minor updates and additions
•• Updated 7/23 and 7/24/2009: Correction to Azure instance cost, Azure ROI Calculator, Google Apps Sync, other additions
• Updated 7/21 and 7/22/2009 9:30 AM PDT: Additions
Updated 7/20/2009 2:30 PM PDT: New Azure Toolkit and SDK July 2009 CTP (see Infra)

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections.

To use these links, click the post title to display the single article you want to navigate.

Azure Blob, Table and Queue Services

<Return to section navigation list> 

• Eric Florenzano reviews the following (mostly key-value) alternatives to SQL RDBMSs in his My Thoughts on NoSQL post of 7/21/2009:

  • Tokyo Cabinet / Tokyo Tyrant
  • CouchDB
  • Redis
  • Cassandra

Steve Lesem’s Cloud Storage and The Innovator's Dilemma 7/19/2009 essay begins:

Too many think of cloud storage as just another or the next type of storage.  As usual with this view, it is associated with a view that the "next" storage type is bigger, faster and cheaper.  Because each generation of storage is always bigger, faster and cheaper.  As such, proponents of this view generally believe that access via traditional approaches, like WebDAV, NFS, cifs and others, is a critical capability.  Some may even argue that Web Services APIs are not the critical differentiation of Cloud Storage.  We disagree.

Cloud storage is a radical change.  It enables new application types.  The critical capability for cloud storage is a Web services API access, revealing the full promise of SOA (Service Oriented Architecture).  Second, the services that are revealed by the API access go far beyond "put" and "get".  Anytime and anywhere access, tagging, sharing and collaboration, geo storage via a single namespace, and policy management of storage are some of the services that the new applications will expect to find in the storage clouds they chose.  Also, storing massive amounts of data in the cloud and having these services available to act on all the data is required.

SQL Azure Database (SADB, formerly SDS and SSDS)

<Return to section navigation list> 

James Hamilton’s HadoopDB: MapReduce over Relational Data post of 7/25/2009 contends that Hadoop “could run over a full relational database”:

MapReduce has created some excitement in the relational database community. Dave DeWitt and Michael Stonebraker’s MapReduce: A Major Step Backwards is perhaps the best example. In that posting they argued that MapReduce is a poor structured storage technology, the execution engine doesn’t include many of the advances found in modern, parallel RDBMS execution engines, it’s not novel, and it’s missing features.

In MapReduce: A Minor Step Forward I argued that MapReduce is an execution model rather than a storage engine. It is true that it is typically run over a file system like GFS or HDFS or simple structured storage system like BigTable or Hbase. But, it could be run over a full relational database. …

John Willis passes along RightScale first to support IBM DB2 [Express] in the cloud (Press Release) on 7/22/2009 with the full text of the release, starting:

RightScale, Inc., the leader in cloud computing management, today announced that the RightScale Cloud Management Platform is the first solution that lets users create, manage and automate IBM DB2 Express-C 9.7 database software on the cloud. Now, RightScale users can more easily build, test and deploy applications on leading clouds such as Amazon EC2 using the most advanced version of IBM DB2 directly from the RightScale platform. …

Markus Klems tackles Relaxed Consistency in this 7/22/2009 essay with links to additional references:

Since Werner Vogels’ famous blog post & paper about eventual consistency (aka relaxed consistency) some time has passed. Relaxed consistency is a concept related to the complexity of distributed systems. Since it can take some time for a read or write operation to be properly validated and passed through the various layers of a distributed system, the question arises, “how does this affect user experience?” Clearly, not in a good way.

Werner Vogels explains that for certain applications it might be necessary to tolerate a certain amount of system inconsistency in order to speed up the user-perceived time it takes to complete an operation (such as adding an item to the famous shopping cart). The concept is somewhat similar to Ajax, however, Ajax happens on the presentation layer whereas relaxed consistency happens somewhere in a back-end layer of the software stack. …

Fortunately, SADB doesn’t suffer from the problems associated with relaxed or eventual consistency.

.NET Services: Access Control, Service Bus and Workflow

<Return to section navigation list>

• Peter G. Jones’ Azure .Net Service Bus & TCP post of 7/22/2009 provides a real-world application for the .NET Service Bus:

We recently implemented the .Net Service bus to expose some in-house WCF services to the world wide world. It may be useful for me and others if I describe how to do this :) This setup allows you to switch between TCP and HTTP relay binding with configuration.

For the in-house systems you need to create a host – you can’t use IIS as the service bus requires a connection to be initiated by both ends of the communication.  We created a Windows Service application that also runs as a console app.  This is much simpler when developing and debugging.
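
Peter’s post has the details; purely as a hedged sketch of the self-hosting pattern he describes (the solution name, contract and credential handling below are all placeholders, and real code would normally pull the binding from configuration so it can flip between netTcpRelayBinding and an HTTP relay binding), a console-style host looks roughly like this:

    using System;
    using System.ServiceModel;
    using Microsoft.ServiceBus;

    [ServiceContract]
    public interface IEchoContract
    {
        [OperationContract]
        string Echo(string text);
    }

    public class EchoService : IEchoContract
    {
        public string Echo(string text) { return text; }
    }

    class RelayHost
    {
        static void Main()
        {
            // "yoursolution" is a placeholder; relay credentials (omitted here)
            // must also be supplied, typically from configuration.
            Uri address = ServiceBusEnvironment.CreateServiceUri("sb", "yoursolution", "Echo");

            var host = new ServiceHost(typeof(EchoService));
            host.AddServiceEndpoint(typeof(IEchoContract), new NetTcpRelayBinding(), address);
            host.Open();

            Console.WriteLine("Listening on {0} - press Enter to exit.", address);
            Console.ReadLine();
            host.Close();
        }
    }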

Richard Seroter explains Sending Messages From Azure Service Bus to BizTalk Server 2009 in this 7/19/2009 post:

In my last post, I looked at how BizTalk Server 2009 could send messages to the Azure .NET Services Service Bus.  It’s only logical that I would also try and demonstrate integration in the other direction: can I send a message to a BizTalk receive location through the cloud service bus?

Richard’s preceding post is Securely Calling Azure Service Bus From BizTalk Server 2009 of 7/12/2009.

Live Windows Azure Apps, Tools and Test Harnesses

<Return to section navigation list> 

Davide Zordan shows you how to deploy David J. Kelly’s Simon Silverlight 3 game to Azure in his Simon in the cloud: deploying your existing Silverlight application to the Windows Azure platform post. You can run a demo of the ported project here.

David Pallman’s Azure ROI Calculator post of 7/22/2009 announces:

… [T]he availability of Neudesic's Azure ROI Calculator, available online at http://azureroi.cloudapp.net. This is a beta tool we are soliciting feedback on.

The ROI Calculator lets you estimate what your monthly cloud computing charges will be on the Azure Services Platform. It also helps you determine what your return on investment will be if you migrate an application over to the cloud. …

Om Malik and Michael Cote chose Omar Del Rio’s Personal Radio Station as the international winner of newCloudApp() - The Azure™ Services Platform Developer Challenge on 7/22/2009. According to the help file:

With Personal Radio Station you can share your music with your friends, recommend music and post comments about the music you are listening to.

It is as simple as signing into your Windows Live account and your Mesh account (Your Mesh account will work as a storage drive for your favorite songs).

Jim Nakashima announces the availability of Windows Azure Tools and SDK July 2009 CTP on 7/20/2009. VS 2008 gets a new dialog for creating Cloud Services with multiple WebRoles and WorkerRoles, as well as Associating an ASP.NET Web Application (including MVC) as a Web Role. For more details, see Azure Infrastructure.

 

Azure Infrastructure

<Return to section navigation list> 

••• Paul Krill’s Spring may be coming soon for Microsoft's Azure Computerworld post of 7/24/2009 speculates:

Java developers fond of the popular Spring Framework might gain an interesting deployment option -- the Microsoft Windows Azure cloud platform -- if an alliance can emerge between Spring proponents and Microsoft.

And given Microsoft's newly stated intention to possibly align with Spring advocates, it looks like such an option could be in the works.

•• David Chappell replies to my mischaracterization of his cloud presentation in a Comment to my A Comparison of Azure and Google App Engine Pricing post of 7/19/2009:

Just to be clear, the presentation linked to above (it's not a white paper) wasn't sponsored by Microsoft. Rather, it was a keynote I wrote for a Silicon Valley cloud computing conference, and it expresses my personal perspective. (In fact, this is true of everything I write, regardless of the sponsor.)

My goal in that talk wasn't to show which platforms were richer than others. Instead, I was trying to find a more useful framework for comparison than IaaS/PaaS, which I really do think is too simplistic. For example, AWS has some aspects of PaaS, such as SimpleDB and SQS. And depending on which dimension you look at, Windows Azure can look like either IaaS or PaaS. It's priced like IaaS--by instance hour--but you give it an application to run rather than explicitly creating a VM, like PaaS. The simple IaaS/PaaS breakdown once made sense, but it's no longer enough.

This is a big part of why I think scenario-based comparisons can sometimes be more useful. Still, things have changed since I wrote that talk: AppEngine now supports more of my scenarios with its addition of tasks and the Task Queue API. If I were to update the comparison, AppEngine would today cover pretty much the same terrain as Windows Azure.

•• David Linthicum’s Why on-premise SOA technology vendors should move to cloud computing pricing post of 7/24/2009 challenges vendors to “Move to service-based pricing, I dare you!”

Let's face it, there are no guarantees around the success of SOA technology within the enterprise. We pay our million-plus dollars for huge license agreements, prior to one server going online, and then find out after implementation that the technology has fallen short and the project [is] a failure. Sound familiar?

I don't put this on the SOA technology vendors, by the way. In many instances, the SOA architects just did not ask the right questions and testing was not done; when you do that, you get what you deserve. However, if you're looking at this from the SOA architect's point of view, they would say that they have to expend the money when the budget dictates and make specific assumptions that the technology will work as advertised.

But make sure the CPU pricing emulates Google’s usage-based formula for App Engine, not Azure’s fixed daily charge, regardless of user activity.

•• David Worthington reports that Burton research director Anne Thomas Manes recommends not using unmanaged code in Azure projects in his Analyst: Beware of unmanaged code in Windows Azure article of 7/24/2009 for Software Development Times:

The March CTP of Azure introduced .NET Full Trust, a method for developers to call into unmanaged COM DLLs; Platform Invocation Services, which enables managed code to call native code; and PHP support.

Anne Thomas Manes, research director with the Burton Group, said that Microsoft has three primary motivations for including unmanaged code: Certain aspects of Windows Communication Foundation (WCF) and Windows Workflow Foundation may require Full Trust, including Microsoft Message Queuing, a channel that is used in WCF; PHP is popular for new development; and COM support is necessary for hosting legacy applications.

… Manes cautioned that developers should have a strong business case for using COM in the cloud.

"COM development has a legacy of viruses and is unsafe in the cloud. Azure is a target of hackers. People should not write COM applications on Azure," she added.

•• C. Burns and B. Guptil co-authored Saugatuck Research’s MIT Cloud Computing Forums: Organizational Politics Cloud[s] Adoption Research Alert of 7/22/2009 (site registration required).

From June 11 through June 18, Saugatuck Research VP Charlie Burns took part in four expert panel and networking reception events hosted by the MIT Technology Review, that examined the realities of Cloud Computing, and their effects on user business and IT strategy, planning and management (See Note 1, and Research Alert MIT Cloud Computing Forums: Executives Don’t Know What They Don’t Know, RA-610, published 24Jun09).

[I]n this Research Alert, we highlight one important question that was asked of the event attendees: What is the single greatest challenge to the adoption of Cloud Computing within their own firms? Interestingly, 40 percent of the 149 respondents indicated “organization/politics.” The next-ranked challenges were “legacy assets” (31 percent) and “workload migration” (29 percent).

John Gauntt will author a Mobile Cloud Computing Project for GigaOm Pro that’s expected to publish after Labor Day, according to this 7/22/2009 post:

I’ll be canvassing both the mobile and cloud computing sectors for data and insight. We’re still knocking around the outline, which should be sorted by next week. Suffice to say that it’s going to pivot between an app-centric and service-centric view of Act II of the Mobile Internet.

Ping me with ideas and contacts if you’ve got a strong view on mobile cloud computing.

Nikhil Chinchwade started a new thread on 7/22/2009 in the Azure forum with this question: July CTP feature (Multiple web role and worker role). Will it be deployed on same VM or different VM's? The official answer from Microsoft’s Li-Lun Luo was:

[E]ach instance will be deployed on a separate VM. So of course different roles will run on different VMs.

Nikhil then asked:

I mean, if I have configured two web roles under same cloud service (per the new feature provided in July CTP) with 1 instance each, would it use 2 VM's or 1?

Aleks Gershaft of the Azure team clarified Li-Lun’s use of the term instance to mean role instance:

It will be two VMs. At the current time, each role instance is given a separate VM. [Emphasis added.]

That means US$0.12 per hour, $2.88 per day, or $86.40 per month per role instance after RTW. I find this hard to swallow in an environment as competitive as cloud computing platforms. (7/23/2009: Compute cost corrected from $0.15 per hour in the original post.)

As developer Pita.O points out:

The whole idea of the feature request for multiple roles per instance was for us to use one VM efficiently. …

That was my understanding also. Hopefully, by RTW the Azure team will enable all role instances to run on one VM when traffic is light.

• Elizabeth Montalbano’s Company Helps ISVs Assess Cost of Microsoft's Windows Azure review of 7/22/2009 for PC World describes PreEmptive’s new Dotfuscator release, which:

[I]ncludes a way for ISVs (independent software vendors) to monitor an application not only to find out how many computing resources it requires when running on Windows Azure, but also to find out how people are using it, said Sebastian Holst, the chief marketing officer at PreEmptive, based in Mayfield Village, Ohio. …

PreEmptive charges US$12,000 for a development group to use the full suite, and then if users want to subscribe to a PreEmptive-hosted dashboard to view the data, it's $2,000 per user per year, Holst said.

A bit pricey, n’est-ce pas?

• James Watters takes NY Times editorial contributor and Harvard law professor Jonathan Zittrain to the woodshed in a scathing NYT Kicks Off Cloud Paranoia Series response of 7/21/2009 to Zittrain’s Lost in the Cloud op-ed piece of 7/19/2009. Zittrain claims the cloud is a threat to innovation. As Watters observes, Zittrain confuses Facebook and Apple’s AppStore with cloud computing:

Innovation will be alive and well because the fundamental technologies at the core of cloud computing are designed for massive, vibrant, explosive, awesome, and amazing application innovation. There will always be a big place in the market for companies who achieve design simplicity by limiting what can be done on their platforms—Apple and Facebook may march to massive market share by this principle—but as long as the technologies underpinning the network are open, programmable, extensible, modular, and dynamic as they are and will be, innovation is in good hands.

Ask any new cloud platform what they really want in abundance—it’s developers.  As always, as in the desktop era, you win them by giving them the best place to innovate and create.

I agree with Watters that Zittrain’s essay completely missed the mark.

• Gordon Haff analyzes the trade-off between local and cloud-based computing in his Moore's Law vs. the Cloud article of 7/21/2009 for CNet News:

… [A]lthough cloud computing tracks improvements in networks, it doesn't necessarily sync up so cleanly with the parallel improvements going on in computers themselves. As a commenter put it in a recent post of mine: "The thing that I don't understand about the move to "cloud-based services" is that it seems at odds with Moore's Law. Specifically, devices are going to have more & more processing power, disk space & memory - why would you want to offload processing to the cloud?" …

• Simon Davies’ Windows Azure SDK July 2009 PowerShell Role Sample post of 7/21/2009 describes:

[A] new sample project in the [Azure July 2009] SDK called PowerShellRole. This sample highlights the fact that PowerShell is available – and usable – from Windows Azure. There is a sample web role that enables PowerShell commands to be executed; a screenshot in the post shows using the CloudDrive PowerShell provider to list the contents of a blob store.

Jim Nakashima announces the availability of Windows Azure Tools and SDK July 2009 CTP on 7/20/2009. VS 2008 gets a new dialog for creating Cloud Services with multiple WebRoles and WorkerRoles, as well as Associating an ASP.NET Web Application (including MVC) as a Web Role. For more details, see Azure Infrastructure.  (Repeated from Live Windows Azure Apps, Tools and Test Harnesses.)

Here’s the full What’s New for the July 2009 CTP?

    • Support for developing and deploying services containing multiple web and worker roles. A service may contain zero or more web roles and zero or more worker roles with a minimum of one role of either type.
    • New project creation dialog that supports creating Cloud Services with multiple web and worker roles.
    • Ability to associate any ASP.NET Web Application project in a Cloud Service solution as a Web Role
    • Support for building Cloud Services from TFS Build
    • Enhanced robustness and stability

Phil Wainewright questions whether Microsoft is “really aiming to become the leading cloud provider in those spheres?” in his Assessing Microsoft's Cloud Intentions post of 7/20/2009 to ebizQ:

… I'm sure the Azure and Online Services teams are ambitious for their own products, but Microsoft as a whole has to balance their aspirations with those of other business units, many of which bring in much larger revenues. No such considerations hold back Amazon in formulating the business strategy for its cloud computing services, or Google as it promotes its Google Apps offering to enterprises. It's pretty clear what these companies are trying to achieve. Whereas Microsoft's intentions are always liable to ambivalence, compromised by the demands of other, more established product lines. …

Maria Spinola’s White Paper: An Essential Guide to Possibilities and Risks of Cloud Computing of 7/19/2009:

The goal of this White Paper is to provide a realistic perspective of the possibilities, benefits and risks of Cloud Computing; what to look for, what to avoid, and also some tips and best practices on implementation, architecture and vendor management strategies. It is important to consider all those aspects before you decide either to move (but without putting the carriage before the horse) or not to move your systems, applications, and/or data to the “Cloud”, in a “hype free” approach.

Click here to download the White Paper

Eric Lai claims “One integrator who uses rival cloud services questions Microsoft's math” in his Windows Azure will outcompete Amazon Web Services on features, total cost post of 7/15/2009 to InfoWorld’s CloudComputing blog:

While The Register declared Azure to be cheaper than Amazon.com's price for hosted Windows and more expensive than a Linux instance, Silicon Valley Insider called Azure's price "not significantly different than either Google or Amazon."

"The actual per-unit pricing is totally uninteresting in my mind," Prashant Ketkar, director of marketing for Windows Azure, told Computerworld on Tuesday. "What will it cost me end-to-end?"

Ketkar says that Azure offers a number of standard features that, if purchased as add-ons for most competing platforms, cause their prices "to be substantially more expensive than us."

He cited Azure's automated service management as a "killer feature" that enables apps on a downed server to be reloaded onto another server with minimal interruption using Microsoft's "fabric controller".

Outcompete seems a stretch to me.

Elizabeth Montalbano reports Microsoft: Azure enterprise licenses will be simple on 7/17/2009 for ComputerWorld, but the following argues against her premise:

In particular, pricing for its hosted Business Productivity Online Suite (BPOS) -- which includes hosted versions of Exchange, SharePoint, LiveMeeting and Office Communications -- is causing customers some concern, said Paul DeGroot, an analyst with research firm Directions on Microsoft.

If a customer purchases a BPOS subscription for employees who will access only those services, the customer must still purchase CALs for those users, DeGroot said, even though they are not accessing the on-premise software as well. Microsoft gives customers a discount on other parts of their license in such scenarios -- on the Software Assurance (SA) maintenance program required for enterprise agreements, for example -- but they still end up paying for something they are not using, DeGroot said.

Depending on how it wants to give companies access to Azure beyond the pay-as-you-go pricing model, the company could run into the same trouble with its cloud-computing platform, he said. "With Azure it could get even more complicated," DeGroot said, though it remains to be seen until Microsoft unveils specific terms of Azure's integration into enterprise contracts.

Emma Stewart and John Kennedy offer their environmental The Sustainability Potential of Cloud Computing: Smarter Design analysis of 7/20/2009, which starts with this premise:

[F]or most financial, software and service sector companies, data centers are a major – and growing – source of greenhouse gas emissions. In 2006, the DOE estimates that U.S. data centers used 61 billion kWh of electricity, representing 1.5 percent of all U.S. electricity use, or the amount used by about 6 million US houses.

Emma and John work for Autodesk, Inc.

Cloud Security and Governance

<Return to section navigation list>

•• Chris Hoff (@Beaker) proposes “embed[ding] a standardized and open API layer [of automated audit and security management capability] into each IaaS, PaaS and SaaS offering” in his Extending the Concept: A Security API for Cloud Stacks post of 7/24/2009:

This way you win two ways: automated audit and security management capability for the customer/consumer and a streamlined, cost effective, and responsive way of automating the validation of said controls in relation to compliance, SLA and legal requirements for service providers. [Emphasis Beaker’s.]

Since we just saw a story today titled “Feds May Come Up With Cloud Security Standards” — why not use one they already have in SCAP to suggest we leverage it to get even better bang for the buck from a security perspective.  This concept extends well beyond the Public sector and it doesn’t have to be SCAP, but it seems like a good example.

Of course we would engineer in authentication/authorization to interface via the APIs and then you could essentially get ISV’s who already support things like SCAP, etc. to provide the capability in their offerings — physical or virtual — to enable it.

Microsoft is promoting automated operation and has SAS 70 attestation and ISO/IEC 27001:2005 certification, so it sounds to me like a good model for the Azure Services Platform.

••• Beaker added the following update to the preceding post later in the afternoon:

Update: Wow, did this ever stir up an amazing set of commentary on Twitter. No hash tag, unfortunately, but comments from all angles.  Most of the SecTwits dropped into “fire in the hole” mode, but it’s understandable.  Thank you @rybolov (who was there when I presented this to the gub’mint) and @shrdlu, who was the voice of, gulp, reason.

Tim Green asserts Feds may come up with cloud security standards in this 7/24/2009 NetworkWorld Cloud Security Alert:

The federal government may step up with a set of cloud-security standards to meet government requirements for protecting sensitive data.

Federal CIO Vivek Kundra says he wants to certify cloud services that pass government muster so federal agencies can buy the computing or applications services they need and turn them on quickly. …

That requires establishing standards that officially meet 2002 Federal Information Security Management Act requirements that federal IT infrastructure must comply with. …

@Beaker tweets “Interesting given my discussions yesterday.” Perhaps Chris was referring to his anti-panelist duties for the Open Group’s The Cloud Security Debate: Is Cloud Computing More or Less Secure than Traditional In-house IT? discussion. In case you’re wondering, @Beaker’s official job title is Director of Cloud and Virtualization Solutions, Cisco.

John Pescatore’s Financial Friday: The Cost of a Security Incident Is Usually Much Greater Than Preventing It post of 7/24/2009 analyzes the total cost of a recent “loss of unencrypted CDs that contained on the order of 180,000 [HSBC] customer records” and concludes that it was US$20 million, without including soft costs, such as loss of business.

John is a Gartner analyst who specializes in IT security matters.

Stephen Foskett addresses the issue of business continuity planning in his Can You Leverage Cloud Services For Disaster Recovery? post of 7/24/2009, which asserts: “Cloud resources are inherently flexible, giving needed capacity on demand.” Another post with the same title claims that “The current economic climate is pushing many to look for current-year budget savings.”

Alan Wilensky’s Cloud insanity – the Shills come out of the woodwork post of 5/30/2009 (which just appeared in my feed reader today) takes on:

[A] shill for the cloud industry [who] says, in so many words, that the time to question cloud-hosted apps is over, they are established and able to deliver, and that self-styled analysts, like me, have NO BID-NESS asking what if the service goes down, whaaaaaa! Self-hosted solutions go down. And then commenter Russell says one of the most amazingly naive things I have ever seen in print, maybe in my entire life:

“Many of the PaaS providers are in business with deep pockets (Force and Quickbase), well funded by professional investors (Bungee Labs), running with established management teams (Quickbase), or conservatively managed with established customer bases (WorkXpress).”

See the actual [comment] thread [on Jane McCarty’s blog] here.

Chris Hoff (@Beaker) points out several cloud-security papers (including his) from the USENIX HotCloud 09 conference held in San Diego June 15, 2009 in his Tons Of Interesting Papers/Presentations From Usenix/HotCloud ‘09 post of 7/21/2009. Beaker’s post also includes many links to his own recent papers, articles, etc.

The “Geneva” Team Blog’s 7/21/2009 Official Name for "Geneva" repeats the preceding week’s announcement and introduces Microsoft’s

Business Ready Security strategy to help both partners and customers 1) protect everywhere and access anywhere, 2) integrate and extend security across the enterprise, and 3) simplify the security experience and manage compliance.

For more information about these announcements and others that were made, check out the Microsoft Forefront Team Blog.

• Tom McHale asks Does Every Cloud Have a Silver Compliance Lining? in his 7/21/2009 post to CA’s Governance, Risk and Compliance blog:

While CIOs are considering cloud computing as a potential hedge for future IT infrastructure investments, the various compliance teams are saying/thinking: “You want to put what, where?” I have had discussions with a few members of the media on this issue (read more here, here, and here) and I thought I would summarize my few thoughts in a post.

• Ivan Lucas analyzes Password Recovery Speeds in this 7/10/2009 article on Lockdown.co.uk:

This document shows the approximate amount of time required for a computer or a cluster of computers to guess various passwords. The figures shown are approximate and are the maximum time required to guess each password using a simple brute force "key-search" attack, it may (and probably will) be possible to guess correctly without trying all the combinations shown using other methods of attack or by having a "lucky guess".

He offers times for passwords up to nine characters long consisting of numbers only, letters only, numbers and letters, or numbers, letters and special characters, as guessed by seven computer classes (from Pentium 100s to supercomputers).
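
As a rough sanity check on figures like these (my arithmetic, not a number from the article): an eight-character password drawn from the 62 mixed-case letters and digits has 62^8, or roughly 2.2 × 10^14, combinations; at ten million guesses per second a full key search takes about 250 days, and adding a ninth character multiplies that by another 62.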

Erik M. Fiterman asks Can Cloud Defend Against DDoS Attacks? on 7/20/2009 and claims:

If you've been thinking about moving your applications into the cloud but weren't sure how to best justify the investment, you can probably thank the North Koreans for helping to write your business case. …

Fiterman is a former FBI special agent and founder of Methodvue, a consultancy that provides cybersecurity and computer forensics services to the federal government and private businesses.

Ellen Messmer reports McAfee getting more aggressive on cloud-based security in this 7/20/2009 article for NetworkWorld:

McAfee Monday says it intends to expand its security-as-a-service offerings in recognition that customers are opting more and more to adopt cloud-based deployments.

"We already have a good foundation for this," says Marc Olesen, McAfee's senior vice president and general manager of the new software-as-a-service business unit. McAfee Total Protection Service, which has about 5 million users, is primarily cloud-based for endpoint and mail security scanning.

In addition, McAfee's Web Protection Service, wholly in the cloud, provides URL filtering and reputation analysis, while the company's Vulnerability Assessment service can scan Internet-facing systems to discover software vulnerabilities.

But McAfee anticipates a much wider push into security-as-a-service in the course of the coming year.

Cloud Computing Events

<Return to section navigation list>

Leon Katsnelson describes a forthcoming IBM and xkoto Webinar, Scalability in the Cloud: Fact or Fiction?,  in his The Great Cloud Computing Deception post of 7/22/2009:

… One of the most hyped and the least understood attributes of the Cloud is “elasticity” or, in other words, the ability to get compute capacity when you need it on a moment’s notice and to pay only for the capacity that you actually used. This is indeed a fantastic feature and for some it will result in thousands or even millions in cost savings and more agile IT. That is – if it can be achieved – and it is one big “IF” and it is precisely the subject of this webinar. So come join us for an hour on July 28 in the conversation on this topic.

We will have Ariff Kassam, co-founder and CTO of xkoto, Rav Ahuja our IBM DB2 Cloud Computing Product Manager, Paul Lapointe, Solutions Architect from xkoto and me (Leon Katsnelson) share our opinions, do a demonstration and answer questions or engage in a debate if time permits. You can register free of charge here.

When: 7/28/2009 9:00 to 10:00 AM PDT 
Where: Internet (Webinar) 

• Bryce Cameron’s Welcome to [the Chicago] Azure July User Group meeting announcement of 7/22/2009 describes the 7/30/2009 meeting:

Our topic this month is Multi-Enterprise Business Applications (MEBAs), a new category of application for business collaboration that the cloud makes possible. We'll review what the current thinking on MEBAs is from Microsoft and the cloud community followed by an in-depth demo and code exploration of an Azure business collaboration application.

Dave Bost, Developer Evangelist from the Developer Platform and Evangelism team at Microsoft will be the presenter. He will discuss how MEBAs facilitate business processes that span enterprises, how they are enacted by the exchange of messages, and how complex, cross-organizational challenges are managed through these applications (e.g. Security, Data, Management and Governance).

When: 7/30/2009 6:00 PM to 8:00 PM
Where: Microsoft Downers Grove Office, 3025 Highland Parkway, Downers Grove, IL, USA 

• Dan Logan reports on Cloud Presentations in the boonies of the central California coast for the week of 7/20/2009:

[T]he Cloud Computing Forum will offer a look at how cloud computing can be used to cut costs and lessen a business’ impact on the environment and [a] slide show … entitled “The Future of Computing: Green, Virtual, and Cloudy” will be presented by Winston Bumpus, president of an industry consortium called the Distributed Management Task Force.

When: 7/21/2009 4:00 PM - 8:00 PM
Where: Veterans’ Memorial Building, 801 Grand Ave., San Luis Obispo, CA, USA

Josh Fraser, vice president of business development at Santa Barbara-based RightScale, will give Softec, the Central Coast Software and Technology Association, an overview of cloud computing and discuss the benefits of deploying and automating in the cloud.

When: 7/22/2009 6:00 PM
Where: Pelican Point Restaurant, 2555 Price St., Pismo Beach, CA, USA

(Sorry for the short notice.)

• SYS-CON’s Cloud Computing Bootcamp Returns to Silicon Valley according to this 7/21/2009 post:

One-day immersive learning on November 3, 2009, at the Santa Clara Convention Center Santa Clara, CA

When: 11/3/2009
Where: Santa Clara Convention Center, Santa Clara, CA, USA

• Barbara Darrow analyzes Windows Azure pricing announced at the Worldwide Partners Conference in her Microsoft partners ponder Azure pricing complexities post of 7/14/2009. She notes:

… Perhaps most important for partners is that they will be able to bill their customers for any partner-developed Azure services and thus "own" that customer relationship. There had been concern that Microsoft would insist on controlling that transaction and relationship. [Carl] Mazzanti said it is very good for partners that they can price and bill out their own Azure services. …

Carl Mazzanti is co-founder of eMazzanti Technologies, a Hoboken, N.J.-based Microsoft Gold VAR with a hosting business.

Microsoft’s Worldwide Partner Conference features 16 Azure-tagged sessions, including:

  1. AP005 Extend Your Application to the Azure Cloud with SQL Azure Database (Mon 7/13 | 3:30 PM-4:30 PM | 220-222) Slides, Video
  2. SS001 Software-Plus-Services: Bringing it all together across MS Online Services, Partner Hosted and Windows Azure. (Mon 7/13 | 3:30 PM-4:30 PM | 217)
  3. SS003 Lap around Windows Azure, Business Edition (Mon 7/13 | 5:00 PM-6:00 PM | 220-222) Slides (earlier slides by Dave Bost) Video
  4. AP002 Partnering with the Azure Services (Tue 7/14 | 2:00 PM-3:00 PM | 220-222) Slides Video
  5. SS006 The Azure Services Platform Partner Model and Pricing (Tue 7/14 | 3:30 PM-4:30 PM | 220-222) Video
  6. SS001R Software-Plus-Services: Bringing it all together across MS Online Services, Partner Hosted and Windows Azure (Tue 7/14 | 5:00 PM-6:00 PM | 220-222)
  7. SS008 Embracing the Cloud: How ISVs Can Use New Microsoft Programs to Move into the Software-Plus-Services World (Tue 7/14 | 5:00 PM-6:00 PM | 215-216) Video (Accompanying text for SS006 is incorrect)
  8. US007 US Public Sector: Cloud Computing - the Government Perspective (Thu 7/16 | 3:00 PM-4:00 PM | 277)
  9. CI011 Building the Foundation for a Cloud Computing Infrastructure (Tue 7/14 | 3:30 PM-4:30 PM | No Room Data)
  10. Dynamic Datacenter: Enabling the Hosted Cloud and Managed Services Video

Slides are linked from the “Latest Resources” section of the Azure Partner Quickstart portal. This section will be updated as other slide decks appear. (Copied from Windows Azure and Cloud Computing Posts for 7/13/2009+ in the hope of adding more video links.)

Other Cloud Computing Platforms and Services

<Return to section navigation list> 

•• Steve Lesem describes Google Apps Sync for two-click synchronizing of Microsoft Exchange and Outlook data with App Engine projects in his Disrupting Microsoft: Google migrates Microsoft customers to the Cloud post of 7/24/2009:

The phrase “razor sharp focus” is a tired cliché in our field, but you have to hand it to Google. They have just announced a “two-click data migration tool which allows employees to easily copy existing data from Exchange or Outlook into Google Apps.”

By building a tool to make this migration a “point-and-click” experience, they are hastening the defection rate for businesses looking for an alternative to Microsoft’s office suite. What’s more, three service providers - NuVox, Netfirms and IKANO - have already begun offering this tool to their customer base.

Google Apps Sync, as the migration tool is called, has already been put to use at enterprises like Genentech and Avago.

It’s a case-study in business model disruption. The cost? One-sixth the price of Microsoft.

Of course we’re still in the “early days” and the jury is still out. Microsoft will surely counter with Azure, but you can see why Ray Ozzie is worried.

For Google, on the other hand, the state of cloud computing is promising. They claim around 1.75 million companies are running Google Apps. The enterprise, as Gray noted earlier, is ready for Cloud Computing. And why is this?  We’ve mentioned the economics before, but here is Google’s take on the benefits of Cloud Computing.

Steve’s post contains a video of “some compelling Google propaganda” for Google Apps Sync.

•• Aliya Sternstein’s White House mulls making NASA a center for federal cloud computing post of 7/24/2009 reports:

Officials at the space agency and the Office of Management and Budget have "broached the idea of NASA becoming an IT service provider," said Mike Hecker, NASA's associate chief information officer for architecture and infrastructure. But, "NASA as an IT service provider takes us into a new realm. We're still debating if that's a good idea or not."

NASA is developing a cloud computing model, called Nebula, to support some of its projects. For example, the agency uses Nebula to share NASA images and statistics with international partners and academic institutions. The system provides high-capacity computing, storage and network connectivity.

James Urquhart reports Sybase to enter cloud through mobility in this 7/23/2009 post:

A recent report noting some big wins in the mobile-platform space by former database powerhouse Sybase has some interesting observations about what this means for the company's role in the cloud-computing market. Since 1998, Sybase CEO John Chen has been redirecting the company's efforts toward mobility, which he claims is now paying off big time.

Most interesting to me, however, is Chen's claim that cloud computing means a big opportunity for his mobile-platform business. As the article notes:

“Sybase's mobile platform may provide a cloud-based lifeline for the likes of SAP, Microsoft, and Oracle, providing those legacy enterprise application vendors an entry into the mobile-computing world of the future.”

Another arrow in the back of Windows Mobile and SQL Server CE.

John Foley’s Amazon Web Services Secrets Revealed InformationWeek expose of 7/22/2009 begins:

Amazon.com exercises tight control over information related to its cloud computing business, a source of frustration to anyone trying to get a complete picture of Amazon Web Services. So I went in search of information from other sources. Here's what I found.

First, Amazon does provide a few details about the size and scope of AWS. In a mid-year status report, the company said that "hundreds of thousands" of developers have registered for AWS and that the network bandwidth consumed by two AWS services – its Elastic Compute Cloud (EC2) and Simple Storage Service (S3) – now exceeds the bandwidth required for all of Amazon's global Web sites. It also disclosed that 52 billion objects are stored in S3 and that S3 requests regularly peak at 80,000 requests per second. …

And continues with more details gleaned from other sources.

Chris Kanaracus claims Oracle Grid Update Tied to Emerging Cloud Trend in this 7/22/2009 PC World article:

Oracle this week shipped an update to its Coherence in-memory data grid, a member of a class of middleware that some say may be on the cusp of broader adoption for cloud computing.

In-memory data grids store information that applications need in memory across a pool of servers, instead of reading it off disks, resulting in major performance gains.

The Coherence product is one of the more mature in a space occupied by offerings from IBM as well as smaller companies like GigaSpaces and a number of open-source projects. Microsoft is also developing a system dubbed "Velocity."

• Tom Lounibos announces a NEW Web Services Performance Certification Program from SOASTA in this 7/20/2009 post:

I’m very excited to announce the new SOASTA Performance Certification Program designed to enable companies deploying software in the Cloud, at hosted data centers, or behind corporate firewalls to certify that their websites have been tested and have met or exceeded industry benchmarks for performance at peak levels of user traffic.  For the past ten years, the dirty little secret in the web development  community has been that whether due to cost, complexity or lack of resources, the vast majority of web applications and sites have not been tested at normal user volumes, much less for unexpected spikes in traffic. Which means our user communities have become the testers for virtually every website, a risk that has proven very costly time and again.

John Foley comments on Tom’s post in his Is Your Cloud App Ready For 100,000 Users? post of 7/21/2009 for InformationWeek.

• Jo Maitland interviews Mike Repass, product manager for Google App Engine, in her Google opens up on App Engine post of 6/29/2009:

Mike Repass: The question gets at, what are Google's core competencies? We know how to deal with hundreds of thousands of machines. All our hardware is custom built and not something we could easily serve up at a raw level in a way that makes sense to people. Infrastructure as a service would be a play against Google's core competencies.

We're saying, let's play with clouds and see if we can catch lightning in a bottle. Google does not say, "Let's build a product." The company doesn't work like that. …

Antone Gonsalves reports HP Buys Cloud-Computing Vendor IBrix on 7/20/2009 for InformationWeek:

Hewlett-Packard (NYSE: HPQ) has agreed to buy IBrix, a maker of file-serving software, to boost HP's infrastructure offerings to cloud-computing customers.

Founded in 2000, IBrix, in Billerica, Mass., has 53 employees and more than 175 corporate customers spanning the communications, media, entertainment, Internet, oil and gas, healthcare, life sciences and financial services industries. HP uses the company's technology in several products, including StorageWorks storage area networks, ProLiant servers, BladeSystems and ProCurve Ethernet switches and management software. …

Stacey Higginbotham claims PrimaCloud’s Virtualized I/O Takes Cloud Computing to the Next Level in her Giga Om post of 7/20/2009:

The folks behind PrimaCloud, a cloud computing and storage product that offers a service-level agreement that it claims delivers 99.99 percent reliability (that means it can go down 53 minutes each year), said today it will save $1 million by virtualizing its network and will spend 50 percent less to deliver its high reliability cloud. The company has installed boxes from Xsigo Systems that sit between the servers and switches and create a cloud through which the network traffic from the virtual machines loaded on the servers is routed. The network can handle traffic destined for other servers or for the storage network without requiring separate cables. …

Michael Wolf lists some of What Happened in Cloud Computing in Q2? in this 7/20/2009 post to Giga Om Pro. Topics are:

  • Old Guard Does Cloud Dance
  • McKinsey Rattles Cloud Crowd
  • VMware Launches “Cloud OS”
  • Networking Giant Cisco Jumps In
  • Oracle and EMC Make Big Buys

Alan Williamson’s “Quick Q&A with Geert Bevin, Evangelist, Terracotta Inc.” in his Scaling Java From the Enterprise To the Cloud post of 7/20/2009 begins:

Simple scalability - that's the Terracotta value proposition. In this Quick Q&A with SYS-CON's Cloud Computing Journal editor-in-chief Alan Williamson, Geert Bevin (pictured) - Evangelist at Terracotta Inc - explains that how Terracotta helps enable Java in the Cloud...and clarifies the difference between the commercial and OSS editions of Terracotta.

Ian Grant reports BT to host Microsoft cloud software - but not Azure on 7/20/2009 for ComputerWeekly.com: 

Less than a week after Microsoft announced its Azure cloud computing initiative, BT said today it will market and deliver Microsoft Online Services to business customers, giving them access to integrated cloud computing and voice services.

Customers will have internet access to Microsoft's Exchange Online, SharePoint Online, Office Communications Online and Microsoft Office Live Meeting, all hosted by BT.

BT will embed the suite into its multi-protocol layer switching (MPLS) networks, giving customers a fully hosted service that includes power, performance management, maintenance and software upgrades. …

Neither firm responded to queries as to how this deal related to Azure, the Microsoft-hosted processing, storage and networking package launched last week.

It seems to me a bit premature to discuss commercial Azure deployment in the UK when the product’s still in beta in the US.

Maureen O’Gara claims “Engine Yard Cloud platform along with Flex is a cloud service plan for production-level Rails applications” in her Ruby-on-Rails Apps Get Cloud Lift post of 7/20/2009:

Engine Yard, the Ruby-on-Rails doyen, is supposed to announce its Engine Yard Cloud platform today along with Flex, a cloud service plan for production-level Rails applications.

The company says its Engine Yard Cloud leverages hundreds of man-years of experience in deploying, managing and scaling some of the world's biggest Rails sites and will put that know-how to work for businesses that want to run Rails on Amazon's EC2 public cloud.

It automates the deployment and management of applications that fetch up to a million unique visitors a month; beyond that, the company will advise shifting to its dedicated infrastructure.

Rich Miller’s Report: Microsoft Plans Data Center in Brazil Data Center Knowledge post of 7/17/2009 says:

Microsoft (MSFT) plans to add a new data center in Brazil to support its online services business in Latin America, company officials have told Business News Americas, which said the facility will be operational in the fourth quarter of this year, and will host Microsoft’s online services for the business market.

Although the report quoted several executives from Microsoft’s Latin America business unit, the company is not confirming any plans for a data center in Brazil.

“We don’t have anything to announce about new facilities at this time,” said a spokesperson for Microsoft Global Foundation Services, which builds and operates the company’s data centers. …

Microsoft’s Confirming Commercial Availability and Announcing Business Model release from #WPC09 says, inter alia:

A few months after, in March 2010 to be more precise, Microsoft will target 16 additional countries with Windows Azure: Brazil, Chile, Colombia, Czech Republic, Greece, Hong Kong, Hungary, Israel, South Korea, Malaysia, Mexico, Poland, Puerto Rico, Romania, Singapore, and Taiwan. [Emphasis added.]

<Return to section navigation list> 

Sunday, July 19, 2009

A Comparison of Azure and Google App Engine Pricing

Update 10/24/2009: See my MSDN Premium Subscribers and BizSpark Members to Receive Windows Azure and SQL Azure Benefits post of 10/20/2009 for updated information on developer quotas for Windows Azure and SQL Azure.

Update 7/16/2009: Added GAE Data Store CPU Time, Azure and GAE instance resources
           7/19/2009: Clarified need for free service threshold or developer accounts

The computer trade press on 7/14/2009 was full of comparisons of Amazon Web Services (AWS) and Windows Azure pricing with headlines such as Is Microsoft Starting a Cloud Price War? by Reuven Cohen. What’s missing from the price-war posts I’ve seen to date is a comparison between Google App Engine (GAE) and Azure pricing.

Most cloud computing observers (including me) classify GAE and Azure as Platforms as a Service (PaaS) and AWS as an Infrastructure as a Service (IaaS) offering. Both GAE and Azure offer a multi-language development environment: Google supports Python and Java, while Azure accommodates .NET’s Visual Basic and C#, IronPython and IronRuby, plus Java on the desktop and in the cloud. Therefore, it seems much more logical to me to compare GAE and Azure costs.

Following are GAE and Azure pricing presented in the tabular format from Google’s Billing and Budgeting Resources page with data from the Windows Azure Pricing & Licensing Overview added:

Resource | Unit | GAE Unit Cost | GAE Daily Quota | Azure Unit Cost
Outgoing Bandwidth | gigabyte | $0.12 | 1 | $0.15
Incoming Bandwidth | gigabyte | $0.10 | 1 | $0.10
Application CPU Time* | CPU hours | $0.10 | 6.5 | $0.12
Stored Data | gigabytes/month | $0.15 | 1 | $0.15
Data Store CPU Time* | CPU hours | $0.10 | 60 | N/A
Recipients Emailed | recipient | $0.0001 | 2,000 | N/A
Storage Transactions | 100,000 | N/A | N/A | $0.10
.NET Service Messages | 100,000 | N/A | N/A | $0.15

*Clarified 7/16/2009

Disregarding GAE’s free daily Bandwidth and CPU Time quotas, Azure appears a bit more expensive than GAE. However, GAE defines CPU Time as follows:

The total processing time for handling requests, including time spent running the app and performing datastore operations. This does not include time spent waiting for other services, such as waiting for a URL fetch to return or the image service to transform an image.

CPU time is reported in "seconds," which is equivalent to the number of CPU cycles that can be performed by a 1.2 GHz Intel x86 processor in that amount of time. The actual number of CPU cycles spent varies greatly depending on conditions internal to App Engine, so this number is adjusted for reporting purposes using this processor as a reference measurement.

Azure’s CPU Time is the clock time per instance deployed, $2.88/day; GAE’s CPU Time is based on application activity. GAE claims its free daily quota, which is worth up to $7.02 per day, will “serve a reasonably efficient application around 5 million page views per month, completely free.”
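
For a concrete comparison, here’s a quick C# sketch that reproduces the daily figures above from the pricing table. It assumes the 1 GB stored-data allowance is valued at its full $0.15 monthly rate and the e-mailed-recipients quota is excluded, which is the combination that matches Google’s $7.02 figure.

```csharp
using System;

class PricingComparison
{
    static void Main()
    {
        // Azure compute is billed on wall-clock time per deployed instance.
        const double azureComputePerHour = 0.12;
        double azurePerInstancePerDay = 24 * azureComputePerHour;   // $2.88/day

        // Approximate daily value of GAE's free quotas at GAE's unit rates.
        // Assumption: stored data valued at its full $0.15/GB-month rate and
        // e-mailed recipients excluded, which reproduces the $7.02 figure.
        double gaeFreeQuotaPerDay =
            6.5 * 0.10     // application CPU time: 6.5 CPU hours
          + 60  * 0.10     // datastore CPU time: 60 CPU hours
          + 1   * 0.12     // outgoing bandwidth: 1 GB
          + 1   * 0.10     // incoming bandwidth: 1 GB
          + 1   * 0.15;    // stored data: 1 GB

        Console.WriteLine("Azure, one instance, 24 hours: ${0:F2}", azurePerInstancePerDay);
        Console.WriteLine("GAE free daily quota value:    ${0:F2}", gaeFreeQuotaPerDay);
    }
}
```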

Conclusion: If there is a “cloud price war,” Azure has lost it to Google.

Update 7/16/2009: Comparison of instance resources:

Each Windows Azure instance is the equivalent of a 1.5–1.7 GHz AMD processor and runs Windows Server 2008 x64 with 2 GB RAM, which leaves about 1.7 GB to run the application. Each instance has the equivalent of a 250-GB fixed disk for transient storage and a 100-Mbps network connection.

For free GAE accounts, the maximum CPU usage rate is equivalent to 15 CPUs (15 CPU minutes/minute); billable accounts offer up to 72 CPUs (72 CPU minutes/minute). Free incoming and outgoing bandwidths are limited to 56 MB/minute; billable accounts receive up to 740 MB/minute.

Of note: Joab Jackson’s Revving up Google App Engine post of 3/3/2009 to the Government Computer News blog quotes Wayne Beekman, co-founder and co-owner of consulting company Information Concepts, who delivered a seminar at Google’s Reston, VA offices:

Thus far, about 45,000 apps have been built on GAE, and about 10 million developers have registered for the service. …

Uptime was another concern audience members voiced. Google offers no specific guarantees of how reliable you could expect the service to be, called a service-level agreement (SLA) by the industry. When your users come calling, you want to make sure the app is ready. The discussion of downtime is pertinent given that one Google service has had a few unscheduled downtimes of late.

I haven’t found any evidence that Google has offered an SLA for GAE subsequently.

Update 7/15/2009: Aashish Dhamdhere (@dhamdhere) of the Azure team said in a 7/15/2009 Tweet responding to my Twittered suggestion about free Azure quotas or special no-charge development accounts: “We are actively considering this. More soon.”

The Azure Team’s Steve Marx (@smarx) added the following in another couple of Tweets:

“Note that GAE's cost function is different. Small app w/ no traffic = no cost to them. Not so in Windows Azure.”

“That said, we're still looking at lots of ways to evolve our pricing model, and I do hear the feedback. Stay tuned.”

Update 7/19/2009: I believe it’s imperative that Microsoft offer either a free service threshold or free developer accounts for Windows Azure and SQL Azure Database after their RTW at PDC 2009. AWS and GAE are proven cloud-computing hosts; Azure and SADB are relatively unknown entities, especially SADB. Failure to provide potential users and developers free access for initial testing will severely affect the ultimate commercial uptake of Azure services. The reported 45% development discount won’t suffice.

Windows Azure and Cloud Computing Posts for 7/13/2009+

Windows Azure, Azure Data Services, SQL Azure Database and related cloud computing topics now appear in this weekly series.

•••• Update 7/18 – 7/19/2009: A few additions
••• Update 7/16 – 7/17/2009: Additions and clarifications, two more WPC session videos 
•• Update 7/15/2009: New content from #WPC09 and elsewhere
• Update 7/14/2009: Live from the Microsoft Worldwide Partner Conference

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use these links, click the post title to display the single article, then navigate to the section you want.

Azure Blob, Table and Queue Services

<Return to section navigation list> 

Steve Nagy’s Azure Worker Role Management – A WorkSharing Framework post of 7/13/2009 suggests Azure developers consider parallel processing with WorkerRoles:

I’ve seen a few demos of Windows Azure and one of the common themes I see around the worker role is that people want to demonstrate scalability through increasing the number of instances in their service definition. That’s fine but we need to also remember that we are scaling up an entire virtual machine each time we increase our instance count, and maybe that just isn’t necessary when we consider parallelization instead. …

Steve then goes on to provide detailed code examples for what he calls WorkSharing. Full code is available for Steve’s WorkSharing framework.
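
As a rough illustration of the idea (this is not Steve’s WorkSharing framework), a single worker role can fan work out to the .NET thread pool and throttle its own concurrency instead of relying on extra instances. The queue-polling and message-processing methods below are hypothetical placeholders.

```csharp
using System;
using System.Threading;

public class ParallelWorker
{
    // Assumption: tune the cap to the VM size; 8 is just an example.
    const int MaxConcurrentItems = 8;
    static readonly Semaphore Throttle = new Semaphore(MaxConcurrentItems, MaxConcurrentItems);

    public void Run()
    {
        while (true)
        {
            string workItem = GetNextWorkItem();   // e.g., dequeue from an Azure queue
            if (workItem == null)
            {
                Thread.Sleep(1000);                // back off when there is no work
                continue;
            }

            Throttle.WaitOne();                    // cap the number of in-flight items
            ThreadPool.QueueUserWorkItem(state =>
            {
                try { ProcessWorkItem((string)state); }
                finally { Throttle.Release(); }
            }, workItem);
        }
    }

    string GetNextWorkItem() { return null; }      // hypothetical: poll a queue here
    void ProcessWorkItem(string item) { }          // hypothetical: do the real work here
}
```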

SQL Azure Database (SADB, formerly SDS and SSDS)

<Return to section navigation list> 

•••• Jeff Currier was Coming up for air.. on 7/17/2009:

This week has been a week of relief for the team.  We're rounding out the release that will be available for early access to a limited set of early access customers.  We've got the bits running in the cluster, we're passing our load/stress tests on the system as well as our security tests.  It's one of those quiet moments where you reflect back on the hard work you've done before the storm kicks up again next week as we make the final engineering push for the PDC.

And notes that the abbreviation for SQL Azure Database is SADB, not SAD.

The Data Platform Insider blog’s SQL Azure Database is on its way; new pricing and licensing information announced at WPC post of 7/14/2009 offers additional SAD details from these links:

    • The business model and pricing for Windows Azure, .NET Services, and SQL Azure
    • The partner channel model and pricing for Azure services
    • The Windows Azure QuickStart program for partners

John Treadway’s Databases and Cloud Computing Roundup post of 7/13/2009:

[D]ivide[s] cloud DBMS offerings into four categories based on whether or not they are “relational” and the degree to which they are “native” to the cloud (e.g. integrated part of a cloud service).  Note that I specifically exclude SaaS platforms with their underlying databases because often it’s not possible to tell what’s under the covers.  Here’s the general breakdown:

.NET Services: Access Control, Service Bus and Workflow

 <Return to section navigation list>

Lori MacVittie asks “Is ESB just an expensive integration hub or is there more to the story than we heard? …” in her Use The Source, Luke! post of 7/17/2009, which begins:

[I]t was somewhat surprising to see the CTO of an organization that offers an (open-source) ESB essentially quoted as discouraging the use of ESBs unless it’s for use as an integration hub. Dana Gardner, in To ESB, or Not to ESB?, analyzes MuleSource CTO Ross Mason’s recent blog that actively discourages architects from leveraging an ESB unless it’s necessary.

Matias Woloski explains how to get a token from ADFS (Geneva Server) using Windows Identity Foundation and WSTrustClient in his Getting a token from ADFS (ex Geneva Server) using WCF post of 7/17/2009, which provides a link to downloadable code.

Brent Stineman shows you how to use the .NET Services Service Bus in his .NET Service Bus – Hands On with Relays (Part 1) post of 7/16/2009. Brent begins:

Today we’re going to start diving into the first of three features of .NET Services, the Service Bus. I’m running the July CTP bits on a Windows 7 VPC with Visual Studio 2008. This article series has taken longer to come together than I would have liked because just like when I was working with Azure storage and REST calls, I’m a bit of a noob when it comes to WCF. And while WCF isn’t required for working with the Service Bus, its the easiest way to work with it when using .NET (there are Java, PHP, and Ruby SDK’s either already ready or in the works).
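
For orientation, the relay pattern Brent walks through looks roughly like the sketch below: an ordinary WCF service exposed through the Service Bus by swapping in a relay binding and a service-bus URI. The solution name is a placeholder, and I’ve left out the relay credential setup because it changed between CTPs; see Brent’s post for the July CTP specifics.

```csharp
using System;
using System.ServiceModel;
using Microsoft.ServiceBus;   // from the .NET Services SDK

[ServiceContract]
public interface IEchoContract
{
    [OperationContract]
    string Echo(string text);
}

public class EchoService : IEchoContract
{
    public string Echo(string text) { return text; }
}

class Program
{
    static void Main()
    {
        // Assumption: "mysolution" stands in for your .NET Services solution name.
        Uri address = ServiceBusEnvironment.CreateServiceUri("sb", "mysolution", "Echo");

        var host = new ServiceHost(typeof(EchoService));

        // The relay binding registers the endpoint with the Service Bus so clients
        // anywhere on the Internet can reach it through the relay.
        var endpoint = host.AddServiceEndpoint(typeof(IEchoContract), new NetTcpRelayBinding(), address);

        // Attach the relay credential behavior to the endpoint here (for example,
        // TransportClientEndpointBehavior); the credential scheme varies by CTP.

        host.Open();
        Console.WriteLine("Listening at {0} -- press Enter to exit.", address);
        Console.ReadLine();
        host.Close();
    }
}
```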

Vittorio Bertocci’s The “Geneva” suite of products get official names post of 7/13/2009 announces the following changes from code to official names for the Geneva identity suite:

Code name | Official Name
Geneva Server | Active Directory Federation Services (ADFS)
Geneva Framework | Windows Identity Foundation
Windows CardSpace “Geneva” | Windows CardSpace

I’m surprised to see official names assigned while the products are still in beta test (Beta 2).

I asked Vibro in a comment how the change of names affects his previous observation of Geneva Framework’s incompatibility with Azure WebRoles in his Claims and Cloud: Pardon our Dust post of 4/1/2009. So far, I haven’t seen an answer.

Live Windows Azure Apps, Tools and Test Harnesses

<Return to section navigation list> 

•••• Mario Gandasegui posted two Azure-hosted applications, AzureBright Blog & Forum, to CodePlex on 7/9/2009 as entries in the NewCloudApp contest. Mario says the apps’ goals were:

    1. To have a fast, clean (html), scalable, Web 2.0 look and feel, and SEO Friendly, Blog and Forum (StackOverflow.com-like functionality) for the .Net community.
    2. To share development knowledge and experiences, of new challenges encountered with Windows Azure Service’s Platform.

•••• BlueThread Technologies’ StoragePoint 2.0 cloud storage adapters enable storing SharePoint documents on Windows Azure, EMC Atmos Online, and Amazon S3. This 10-minute video posted on 7/16/2009 shows how it works:

••• Cliff Saran reports for ComputerWeekly.com on 7/14/2009 in easyJet to use Microsoft Azure for mobile passenger services:

Budget airline easyJet is planning to use Microsoft Azure to allow its ground staff to upgrade customers' seats or pay for excess baggage from mobile terminals.

The airline aims to reduce the number of fixed airport desks it needs to operate.

Airport operators charge airlines for each desk, such as the ticketing desk and the check-in desk, each of which requires a separate mainframe terminal. Each desk means a new queue.

The programme, dubbed Halo, will use a virtual private network based on 3G or Wi-Fi to plug mobile devices into the Azure cloud on the internet.

Jon Udell describes the migration of his Elm City Calendar Aggregator Azure application from one to two WorkerRole instances in his More fun than herding servers post of 7/14/2009.

Sharon Pian Chan’s Microsoft Cloud Computing Gets Down to Earth story of 7/13/2009 for the Seattle Times is decked out with “As U.S. companies begin exploring cloud computing this year, a school system on the other side of the globe has already leapt into the cloud. Ethiopia is rolling out 250,000 laptops to schoolteachers all over the country, all running on Microsoft's platform called Azure.”

The story describes FullArmor’s participation in the Azure-based project:

As U.S. companies start exploring doing some of this computing this year, a school system on the other side of the globe has already leapt into the cloud. Ethiopia is rolling out 250,000 laptops to its schoolteachers nationwide, all running on Microsoft's cloud platform, called Azure.

The laptops will allow teachers to download curriculum, keep track of academic records and securely transfer student data throughout the education system, without having to build a support system of hardware and software to connect them.

"They're going to be able to leapfrog ahead of most companies in the U.S.," said Danny Kim, chief technology officer of FullArmor, a Boston company working on the software deployment in the Ethiopian project.

Microsoft is expected to announce more details about Azure at its Worldwide Partners Conference, which begins today in New Orleans. Kim is scheduled to present a demonstration during a keynote presentation Tuesday with Bob Muglia, president of Microsoft's Server and Tools business.

Learn more in a video about FullArmor’s work from the Azure Partner Quickstart (see below).

Andrea DiMaio reports Conservatives Push For Google And Microsoft To Take Over Patient Records in this 7/13/2009 post:

Google Health and Microsoft HealthVault received their first blessing in Europe from the Conservative party in the UK. The Centre for Policy Studies published a report with an intriguing title: “It’s Ours - Why we, not government, must own our data.”

According to Mary Jo Foley’s Microsoft links HealthVault service with Amalga software post of 4/6/2009:

HealthVault is Microsoft’s consumer-focused health-records-management Software+Service platform, which the company unveiled officially in 2007. (The service component of HealthVault is one of a handful of Microsoft services that already is hosted on top of Azure.) Amalga UIS, (one of the products formerly under the Azyxxi brand), is one of the main elements of Microsoft’s enterprise health-information-system platform.

Azure Infrastructure

<Return to section navigation list> 

•••• David Deans claims “The anticipated benefits from adopting managed cloud services have reached the executive suites of many corporations” in his Cloud Services Interest Erupts in Groundswell post of 7/19/2009:

The anticipated benefits from adopting managed cloud services have reached the executive suites of many corporations. Proactive CEOs and CFOs are pushing their IT leadership team to seek out actionable information and guidance.

There's also a constant stream of service providers announcing new offerings -- and the momentum is becoming a global phenomenon. As a result, Forrester Research has witnessed an expanding number of client inquiries around cloud computing. …

David is a co-author and moderator of the Business Technology Roundtable and is a member of the Service Provider marketing team at Cisco Systems, Inc.

•••• Don MacVittie says “By the time your application knows it should be doing something, it’s too late” in his Advanced Load Balancers for Developers post of 7/17/2009:

For me, as a developer, the big differentiator between a Load Balancer and an Application Delivery Controller (ADC) is the ability to use code to help manage how my application and the network interact. Some things you just can’t do from your application because by the time your application knows it should be doing something, it’s too late, some things are just easier done on a network device (yeah, or a VM pretending to be a network device if your name is Izzy ;-)).

Don is a Strategic Architect at F5 Networks working on the DevCentral Team and probably is related to Lori MacVittie, who also works at F5 (?).

••• Alan Murphy says “Now if only they’d franchise Azure we’d really be cookin’” in his Choosing Between Azure and VMM Private Clouds post to the Virtual Data Center blog of 7/17/2009. He concludes the post with these questions:

Does Microsoft have it right in keeping Azure and private clouds completely separate for enterprise customers because they are in fact two different beasts, yet saying they’ll work together? Or is Microsoft comparing the two because it’s not yet sure how customers will use and embrace Azure?

According to earlier Azure team explanations, the reason for keeping the systems separate is that the Windows Azure fabric requires non-standard hardware configurations.

••• Mary Jo Foley finds “it interesting who Turner failed to mention when talking up Microsoft’s competition. Amazon sure has a hefty head start in the rent-a-cloud space” in her Microsoft's fiscal 2010 battle cry: Growing our share post of 7/16/2009.

James Urquhart asks Will Microsoft Azure promote efficient software development? in this 7/16/2009 post and implies the answer is “yes” because Microsoft’s charges for resources will promote application efficiency.

The VAR Guy’s Microsoft Windows Azure Cloud Meets MySQL post of 7/16/2009 claims Azure can run MySQL:

Now, Microsoft is seeking to repeat that success with Windows Azure, the company’s newly launched cloud computing environment. On the one hand, Azure is a platform that allows traditional ISVs (independent software vendors) to re-write their on-premise server applications for cloud computing. On the other hand, Windows Azure could open the door to a range of new, innovative cloud apps.

Along the way, some big-name open source applications could land in the Azure cloud. For example, “we’ve enabled MySQL on top of Windows Azure,” said Microsoft’s Prashant Ketkar, director of product marketing for Windows Azure. Ketkar made the statement during a July 15 meeting with The VAR Guy at the Microsoft Worldwide Partner Conference 2009 (WPC09) in New Orleans.

My A Comparison of Azure and Google App Engine Pricing post of 7/15/2009 (updated 7/16/2009) concludes:

If there is a “cloud price war,” Azure has lost it to Google.

However, it appears that the Azure team is considering a countermove.

The Azure Team has posted an official, comprehensive Pricing & Licensing Overview for CTP (no-charge) and the Release to Web (RTW) versions of Windows Azure, Azure Storage Services, .NET Services, and SQL Azure Database (SADB, formerly SQL Data Services, SDS). Partner and MSDN discounts aren’t mentioned.

The page contains SLA availability guarantees for dual Windows Azure instances, storage, .NET Services and SADB. See Toby Wolpe’s article below for non-conformance penalties.

The Azure Team is Confirming Commercial Availability and Announcing Business Model with this 7/14/2009 post:

Today, during the Microsoft Worldwide Partner Conference 2009 we announced the business and partner model for the Windows Azure platform including service level agreements and support programs.

Windows Azure, SQL Azure and .NET Services will be commercially available at the Professional Developer Conference 2009 and we hope you will continue building on the Community Technology Preview (CTP) at no cost today.

Upon commercial availability we will offer Windows Azure through a consumption-based pricing model, allowing partners and customers to pay only for the services that they consume.

Windows Azure | SQL Azure | .NET Services
Compute @ $0.12 / hour | Web Edition – Up to 1 GB relational database @ $9.99 / month | Messages @ $0.15/100K message operations, including Service Bus messages and Access Control tokens
Storage @ $0.15 / GB stored | Business Edition – Up to 10 GB relational database @ $99.99 / month | 
Storage Transactions @ $0.01 / 10K | | 

Partners receive a 5% discount from the above pricing. Microsoft will announce subscription pricing on commercial release and volume licensing discounts at an undisclosed interval after release. Enterprise Agreement (EA) licensees will receive EA discounts.

SLA: For compute, we guarantee that when you deploy two or more role instances in different fault and upgrade domains your Internet facing roles will have external connectivity at least 99.95% of the time. Additionally, we will monitor all of your individual role instances and detect within two minutes when a role instance’s process is not running and initiate corrective action. For storage, we guarantee that at least 99.9% of the time we will successfully process correctly formatted requests that we receive to add, update, read and delete data. We also guarantee that your storage accounts will have connectivity to our Internet gateway.

99.95% (3-1/2 nines) availability allows a maximum of 21.6 minutes per month of downtime. The post doesn’t include details of the penalty payments if Azure doesn’t conform to the SLA terms and conditions.
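
To put the consumption model and the SLA numbers in perspective, here’s a back-of-the-envelope sketch. The workload (two small instances running all month, a Business Edition database, 50 GB of storage, a million storage transactions, and 20 GB of traffic each way) is hypothetical, and the bandwidth rates are the ones from the pricing comparison earlier in this post.

```csharp
using System;

class AzureEstimates
{
    static void Main()
    {
        // Hypothetical monthly workload at the announced consumption rates.
        const double hoursPerMonth = 24 * 30;
        double compute      = 2 * hoursPerMonth * 0.12;   // two instances @ $0.12/hour = $172.80
        double sqlAzure     = 99.99;                      // Business Edition (up to 10 GB)
        double storage      = 50 * 0.15;                  // 50 GB @ $0.15/GB-month = $7.50
        double transactions = 1000000 / 10000 * 0.01;     // $0.01 per 10K transactions = $1.00
        double bandwidth    = 20 * 0.10 + 20 * 0.15;      // 20 GB in + 20 GB out = $5.00

        Console.WriteLine("Estimated monthly bill: ${0:F2}",
            compute + sqlAzure + storage + transactions + bandwidth);   // $286.29

        // Downtime each SLA level allows over a 30-day month.
        const double minutesPerMonth = 30 * 24 * 60;                    // 43,200 minutes
        Console.WriteLine("99.95% uptime allows {0:F1} minutes of downtime", minutesPerMonth * 0.0005);  // 21.6
        Console.WriteLine("99.9%  uptime allows {0:F1} minutes of downtime", minutesPerMonth * 0.001);   // 43.2
    }
}
```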

However, Toby Wolpe quotes Mark Taylor, Microsoft director of developer and platform evangelism, in his Microsoft sets Azure pricing and service levels post of 7/14/2009 for ZDNet UK:

Microsoft will provide a 10 percent credit if compute connectivity falls below 99.95 percent uptime; a 10 percent credit if role-instance uptime or storage falls below 99.9 percent uptime.

Commercial availability will coincide with the Professional Developer’s Conference, to be held 11/17 to 11/19/2009 in Los Angeles.

Ina Fried reports a “development accelerator” discount for CNET News in her Microsoft announces Azure pricing, details article of 7/14/2009:

The discount plan, dubbed the "development accelerator" comes in two forms and offers a 15 percent to 30 percent discount off the consumption charges. It requires a six-month commitment, with overage charges billed at the regular rates. After six months, the pricing reverts to the standard Azure rates.

You can find out more about the “development accelerator” discount in the Microsoft Unveils Windows Azure Platform Business Model, Bringing New Revenue Opportunities to Partners Worldwide press release of 7/14/2009.

The OakLeaf Blog hit Techmeme’s Discussion list for Ina Fried’s Microsoft announces Azure pricing, details article (see above).

John Foley writes in his Microsoft Beats Amazon By A Whisker In Cloud Pricing post to InformationWeek’s Plug Into the Cloud blog of 7/14/2009:

Microsoft has disclosed pricing on its forthcoming Windows Azure services, and in one small but significant way, Microsoft has undercut rival Amazon on pay-per-use fees. Amazon charges 12.5 cents per hour for a bare bones Windows Server instance; Microsoft's list price is 12 cents.

Microsoft officials had previously indicated that Windows Azure pricing would be competitive, but the price differential may be more symbolic than material. At their published rates, if you ran a Window server in the cloud every hour of the day for an entire year, you'd save a mere $43.80 going with Microsoft. Indeed, if penny pinching is important, Amazon Web Services actually has a cheaper alternative, though it's not Windows. Amazon charges 10 cents per hour for "small" virtualized Linux and Unix servers.

Microsoft promised to price Azure competitively with other Platform as a Service (PaaS, e.g. Google App Engine, GAE) and Infrastructure as a Service (IaaS, e.g. Amazon Web Services, AWS) offerings. Google offers free quotas for GAE; Microsoft has made no mention of free quotas that I’ve heard or seen. Here’s the “Will Azure Pricing be Competitive with Google App Engine?” question I posted to askwpc@microsoft.com on 7/14/2009.

Hello,

Microsoft promised Azure pricing would be competitive with other PaaS/IaaS providers. Google App Engine has a free threshold for development/demonstration apps. Here are the details from Google (http://code.google.com/appengine/docs/quotas.html#Free_Changes):

“On June 22nd, 2009, the free quota levels for the billable quotas were reduced. App Engine will always remain free to get started. We believe these new levels will continue to serve a reasonably efficient application around 5 million page views per month, completely free.

The new free quota levels are as follows:

  • CPU Time: 6.5 hours of CPU time per day
  • Bandwidth: 1 gigabyte of data transferred in and out of the application per day
  • Stored Data & Email Recipients: unchanged

Fixed quotas for applications with billing enabled were not affected.”

I have three demo apps running on Azure: http://oakleaf.cloudapp.net/Default.aspx (Azure tables), http://oakleaf2.cloudapp.net/Default.aspx (Azure blobs), and http://oakleaf5.cloudapp.net/Default.aspx (Azure queues). I won’t be able to maintain these demo apps after Azure goes live.

Thanks in advance for your reply.

If I receive a reply, I’ll post it here and to the current issue of this post.

Mary Jo Foley’s Microsoft announces its Azure cloud computing pricing post of 7/14/2009 summarizes the pricing and SLA data.

• Gavin Clarke reports in his Microsoft's Azure cloud price pipped by Amazon's Linux article of 7/14/2009 for The Register:

Microsoft has announced pricing for Azure that marginally undercuts Amazon on raw computing for Windows-based clouds but remains more expensive than the mega book warehouse's Linux option.

The company has said it will charge $0.12 per compute hour for its Windows Azure Compute. Amazon's price for an on-demand Windows instance starts at $0.125.

Amazon's Linux-based service undercuts Windows, with pricing starting at $0.10 per computing hour. Add in storage, and Azure's price will creep up further against Amazon: Azure will charge $0.15 per gigabyte stored versus $0.10 per gigabyte each month from Amazon.

Clarke continues:

But initially, Azure will undercut Amazon's Linux with a free service: Microsoft threw open Azure to early adopters on Tuesday at its Worldwide Partner Conference in New Orleans, Louisiana.

Azure services have been free for development since the initial Community Technology Preview (CTP) of SQL Server Data Services (SSDS) at MIX 08. The free ride is over as of mid-November 2009 (not October).

David Chappell’s Windows Azure and ISVs: A Guide for Decision Makers is a 12-page *.pdf file that concludes:

Like all new platforms, Windows Azure will succeed only if ISVs choose to build applications on it. Microsoft clearly understands this, and so making their new cloud platform attractive to this audience is a priority. The core attractions are these:

  • Because Windows Azure lets ISVs run applications and store data in a very large data center while paying only for the resources used, it can provide appealing economics.
  • By providing a ready-made platform designed to support scalable and reliable cloud applications, Windows Azure reduces the time and money required to create and run SaaS applications and other cloud-based code.

Cloud computing looks like the next great wave in our industry. Just as ISVs have had to adapt to the changes brought by PCs, mobile devices, and other new platforms, they now need to decide how to exploit cloud platforms. And just as Windows played a significant part in those earlier shifts, Windows Azure is poised to take an important role in this new world. If you’re responsible for charting your firm’s path, understanding and evaluating the Azure environment makes good sense.

Thanks to @WadeWegner for the heads-up.

Microsoft announced the Azure Partner Quickstart on 7/13/2009 at the Worldwide Partner Conference:

This community portal provides Microsoft partners with a single destination to start driving their business with Windows Azure. We update this page regularly as new content, whitepapers and presentations become available, provide you with the latest training resources and toolkits to help you get started with your Windows Azure projects, and connect you to the Microsoft partner community and the Microsoft sales, marketing and product management community around Windows Azure.

Doug Tidwell announced in his A New Era in Cloud Standards post of 7/13/2009:

[T]oday the Object Management Group announced cloud-standards.org, a site for coordinating the cloud standards work of various organizations. The announcement was made at the OMG's Cloud Standards Summit. The groups involved are:

The site is a Wiki maintained by representatives of the five groups above. Some of the relevant activities include the DMTF's Open Cloud Standards Incubator, the OGF's Open Cloud Computing Interface and SNIA's Cloud Data Management Interface.

Thanks to John Willis (a.k.a. @Botchagalupe) for the heads up.

Peter Loh claims “Cloud computing will change the processes and tools that IT organizations currently use” in his Managing Cloud Applications post of 7/13/2009:

As enterprises evaluate if and how cloud computing fits into their core IT services, they must consider how they will manage cloud services as part of their day-to-day operations. This article examines how operational management of cloud computing differs from traditional methods, and examines techniques for addressing these needs. …

David Bernstein, Erik Ludvigson, Krishna Sankar, Steve Diamond, and Monique Morrow are the authors of Blueprint for the Intercloud - Protocols and Formats for Cloud Computing Interoperability presented at the IEEE Computer Society’s 2009 Fourth International Conference on Internet and Web Applications and Services. Here’s the abstract:

Cloud Computing is a term applied to large, hosted datacenters, usually geographically distributed, which offer various computational services on a “utility” basis. Most typically the configuration and provisioning of these datacenters, as far as the services for the subscribers go, is highly automated, to the point of the service being delivered within seconds of the subscriber request. Additionally, the datacenters typically use hypervisor based virtualization as a technique to deliver these services. The concept of a cloud operated by one service provider or enterprise interoperating with a cloud operated by another is a powerful idea. So far that is limited to use cases where code running on one cloud explicitly references a service on another cloud. There is no implicit and transparent interoperability. Use cases for interoperability, as well as work-in-progress around inter-cloud protocols and formats for enabling those use cases, are discussed in this paper.

You can purchase a full-text copy for US$ 19.

Cloud Security and Governance

<Return to section navigation list>

•••• Chris Hoff (@Beaker) reports that he updated his CSA “Cloud Architectural Framework” in his Cloud Computing [Security] Architectural Framework post of 7/19/2009:

For those of you who are not in the security space and may not have read the Cloud Security Alliance’s “Guidance for Critical Areas of Focus,” you may have missed the “Cloud Architectural Framework” section I wrote as a contribution.

We are working on improving the entire guide, but I thought I would re-publish the Cloud Architectural Framework section and solicit comments here as well as “set it free” as a stand-alone reference document.

Please keep in mind, I wrote this before many of the other papers such as NIST’s were officially published, so the normal churn in the blogosphere and general Cloud space may mean that  some of the terms and definitions have settled down.

•••• Wyatt Kash’s Standards groups form alliance to set cloud-computing standards article of 7/17/2009 reports:

A group of leading standards development organizations are working jointly to foster common standards for cloud computing and storage, beginning with the launch this week of a new wiki resource site called cloud-standards.org.

“We brought together a large number of players so we don’t get an enormous mess of standards,” said Richard Soley, chairman and chief executive officer of Object Management Group, one of the organizations participating in the effort. Soley announced the formation of the group at a cloud-computing symposium held by National Defense University’s Information Resource Management College July 15.

The organizations joining in the collaborative effort include the Cloud Security Alliance, the Distributed Management Task Force, the Open Grid Forum, the Storage Networking Industry Association and the Open Cloud Consortium, with other groups expected to participate, Soley said. …

•••• David Linthicum claims those doing SOA and cloud computing are stubbing their toes on three avoidable mistakes in his 7/16/2009 Three critical -- and avoidable -- cloud computing mistakes InfoWorld article. The mistakes are:

  1. Looking at cloud computing as a mere platform change and not architecture.
  2. Ignoring performance.
  3. Asking "when" instead of "why."

••• Ron Schmelzer continues the “Cloud governance: something old, something new, something borrowed …” theme in his Cloud Governance Awakens post of 7/17/2009:

Perhaps the reason why usage of the Cloud is still nascent in the enterprise is because of an increasing chorus of concerns being voiced about the usage of Cloud resources:

Cloud availability. Cloud security. Erosion of data integrity. Data replication and consistency issues. Potential loss of privacy. Lack of auditing and logging visibility. Potential for regulatory violations. Application sprawl & dependencies. Inappropriate usage of Services. Difficulty in managing intra-Cloud, inter-Cloud, and Cloud and non-Cloud interactions and resources. And that’s just the short list.

Ron Schmelzer is founder and senior analyst of ZapThink.

••• Gray Hall’s Cloud Computing and ITIL: Service Delivery and Cloud SLAs post of 7/16/2009 begins:

One of the interesting side effects of the rapid adoption of Cloud Computing by the enterprise is the impact this adoption will have on the design and delivery of IT service processes.

In his article Assessing cloud providers, Frank Ohlhorst reminds us that "moving to the cloud is primarily a business decision" dependent on the metrics of ROI (Return On Investment), performance, sustainability and suitability to task.

If you're familiar with the IT Infrastructure Library (ITIL) V.3, you'll recognize this service model overview:

Chris Hoff (a.k.a. @Beaker) takes a Beckett tack in Cloud Security: Waiting For Godot & His Silver Bullet post of 7/16/2009:

Referencing my prior post about the state of Cloud security, I’m reminded of the fact that as a community of providers and consumers, we continue to wait for the security equivalent of Godot to arrive and solve all of our attendant Cloud security challenges with the offer of some mythical silver bullet.  We wait and wait for our security Godot as I mix metaphors and butcher Beckett’s opus to pass the time.

Mala Ramakrishnan, Sriram Chakravarthy, Srini Vinnakota, and Chris Nguyen contend that “Governance is the key for enterprises to successfully deliver applications in the cloud” in this 7/16/2009 post:

The advantages of an enterprise application leveraging the public cloud sound like utopia - lowered total cost of ownership and overhead costs, ease of maintenance, inherent high availability and scalability that is built into the infrastructure. Yet when the theory of moving to the cloud is put into practice, the biggest hurdle that stalls the success of the transition is governance. This article analyzes its importance and the various aspects of governance in the realm of cloud computing.

James Urquhart’s Lawyers shine light on real cloud concerns post of 7/14/2009 begins:

Like moths to a porch light (or trial lawyers to ambulances), many lawyers are finding the uncertain legal and regulatory terrain of cloud computing fertile ground for new legal analysis--and new legal business.

The effect of cloud computing on our legislative and regulatory world has long been a sub-interest of sorts for me. I have long been fascinated by the ways in which a truly dynamic, multiparty compute environment will challenge laws that assume that electronic assets behave the same as their paper or celluloid brethren--static, not easily duplicated and stored on the owner's premises.

The gap between the cloud and the current state of legislation is serious.

James continues with a recent history of articles about cloud computing’s legal issues.

William Hurley asks Do dark terms of service signal storm clouds? in this 7/13/2009 post decked “Some IT pros are taking advantage of cloud computing, but without reading the terms of service agreement.” Hurley writes:

[M]any IT pros are jumping onto the cloud bandwagon feet first without gauging the risk to their company. I believe this is widespread, although I haven't read about any major snafus yet. It's all too easy to click through an agreement that may bind you, your company, or worse, your company's data to deplorable terms you would never have thought a provider would have the audacity to stipulate.

And goes on to analyze Blizzard Entertainment’s vicious Terms of Service for their new Battle.net “cloud service.”

Jason Bloomberg asks “How do you apply SOA Governance best practices to Cloud Governance?” in his Cloud Governance: Something Old, Something New, Something Borrowed… post of 7/13/2009:

As we predicted earlier in the year, Cloud computing is starting to take hold, especially if you believe the marketing literature of vendors and consulting firms. Yet, we are seeing an increasing number of Cloud success stories, ranging from simplistic consumption of utility Services and offloading of compute resources to the sort of application and process clouds we discussed in a previous ZapFlash. Perhaps the reason why usage of the Cloud is still nascent in the enterprise is because of an increasing chorus of concerns being voiced about the usage of Cloud resources. …

Cloud Computing Events

<Return to section navigation list>

•••• O’Reilly Media’s Open Source Convention (OSCON) runs from 7/20 to 7/24/2009 at the San Jose Convention Center. OSCON features 13 sessions in the Cloud Computing track.

When: 7/20/2009 to 7/24/2009 
Where: San Jose Convention Center, San Jose, CA, USA 

•••• David Pallman reports OC Azure User Group Meeting Thursday 7/23/09 on Azure MEBAs and Multi-Tenant SaaS:

The topic for this month's Azure User Group meeting is Multi-Enterprise Business Applications (MEBAs), a new class of applications the cloud is ideally suited for. We'll also look at SaaS considerations and weigh single-tenant vs. multi-tenant approaches.

In addition we'll share the recent Azure announcements, including rates.

When: 7/23/2009 6:00 PM to 8:00 PM PDT
Where: Quickstart Intelligence, 16815 Von Karman Ave. Suite 100, Irvine, CA 92606

My A Comparison of Azure and Google App Engine Pricing post of 7/15/2009 (updated 7/16/2009) concludes:

If there is a “cloud price war,” Azure has lost it to Google.

However, it appears that the Azure team is considering a countermove. (Repeated from the Azure Infrastructure section.)

Microsoft’s Worldwide Partner Conference features 16 Azure-tagged sessions, including:

  1. AP005 Extend Your Application to the Azure Cloud with S+S SQL Data Services Azure Database (Mon 7/13 | 3:30 PM-4:30 PM | 220-222) Slides, Video 
  2. SS001 Software-Plus-Services: Bringing it all together across MS Online Services, Partner Hosted and Windows Azure. (Mon 7/13 | 3:30 PM-4:30 PM | 217)
  3. SS003 Lap around Windows Azure, Business Edition (Mon 7/13 | 5:00 PM-6:00 PM | 220-222) Slides (earlier slides by Dave Bost) Video
  4. AP002 Partnering with the Azure Services (Tue 7/14 | 2:00 PM-3:00 PM | 220-222) Slides Video
  5. SS006 The Azure Services Platform Partner Model and Pricing (Tue 7/14 | 3:30 PM-4:30 PM | 220-222) Video
  6. SS001R Software-Plus-Services: Bringing it all together across MS Online Services, Partner Hosted and Windows Azure (Tue 7/14 | 5:00 PM-6:00 PM | 220-222)
  7. SS008 Embracing Leveraging the Cloud: How ISVs Can Use New Microsoft Programs to Move into the Software-Plus-Services World (Tue 7/14 | 5:00 PM-6:00 PM | 215-216) Video (Accompanying text for SS006 is incorrect)
  8. US007 US Public Sector: Cloud Computing - the Government Perspective (Thu 7/16 | 3:00 PM-4:00 PM | 277)
  9. CI011 Building the Foundation for a Cloud Computing Infrastructure (Tue 7/14 | 3:30 PM-4:30 PM | No Room Data)
  10. Dynamic Datacenter: Enabling the Hosted Cloud and Managed Services Video 

Slides are linked from the “The Latest Resources” section of the Azure Partner Quickstart portal. This section will be updated as other slide decks appear.

Also of note:

Microsoft To Announce Cloud Infrastructure Program at WPC:

On July 13, Microsoft will announce the Cloud Computing Infrastructure Initiative’s Hosted Partner Network Program and outline further details on the Enterprise Dynamic Datacenter Toolkit (DDTK). 

On July 14, be sure to attend the “Building the Foundation for a Cloud Computing Infrastructure” breakout session to get an overview of Microsoft’s end-to-end cloud strategy and the business opportunities involving the private cloud infrastructure:

Get an overview of Microsoft's cloud computing infrastructure strategy and how it can help customers lower costs while improving agility within their datacenters. We share the architecture and demo both the Dynamic Datacenter Toolkit for Hosters (available now!) that enables hosting Partners to offer managed servers in a hosted environment, and the Dynamic Datacenter Toolkit for Enterprises (available Q4 CY 2009) that enables building the foundation for a Private cloud.

Update 7/8/2009: Follow the action at the WPC09 Press Room and @WPC09 on Twitter.

The preceding was copied from last week’s OakLeaf post.

Update 7/13/2009: Mary Jo Foley says Microsoft to flesh out further its private cloud strategy on 7/14/2009:

Microsoft is crystalizing its “private cloud” positioning and plans to run it by the 6,000 or so partners attending its Worldwide Partner Conference (WPC) this week.

Microsoft officials previously have said that they won’t allow customers to run the Microsoft Azure cloud operating system on customers’ on-premise servers, but that they will make available to users many of the advances in Windows Server, System Center, Hyper-V and other Microsoft technologies so users can create their own “private clouds.”

Microsoft is expected to tout its Dynamic Data Center Toolkit for Enterprises at the show. The product, originally expected to ship by the end of 2009 — according to a private cloud fact sheet that was on Microsoft’s site earlier today but is gone — is now slated for the first half of 2010. It is a “free, partner-extensible toolkit that will enable datacenters to dynamically pool, allocate, and manage resources to enable IT as a service.” Microsoft already offers a version of the Dynamic Data Center Toolkit for its hosting partners. …

In related news, Microsoft is expected to unveil Azure pricing and licensing on Tuesday, July 14, at the Worldwide Partner Conference.

The Azure Team announced Seattle Lunch 2.0 @ Windows Azure to be held on 7/31/2009 in Bellevue.

You've seen all the latest press this week about Windows Azure - but there are two cloud services out of Seattle (three if you count EMC Decho). What's the best choice as a startup in the area? The Windows Azure team is going to host a Seattle Lunch 2.0 for you and will answer ANY questions you can come up with.

When: 7/31/2009 11:30 AM to 1 PM PDT
Where: Microsoft Windows, 700 Bellevue Way NE 15th Floor, Bellevue, WA 98004

Kevin Jackson is the Technical Chair for the 1st Annual Government IT Conference & Expo
to be held on 10/6/2009 in Washington, DC. Kevin writes in his Letter from the Technical Chair post of 7/13/2009:

To highlight the importance of this conference, I only need to repeat the guiding principles driving President Obama’s technology agenda:

  • Innovation in the Economy: Drive Economic Growth and Solve National Problems By Deploying a 21st Century Information Infrastructure 
  • Innovation in Science: Invest in Science and Science Education
  • Innovation in Public Administration: Creating an Open and Secure Government
  • Restoring a Culture of Accountability through Openness and Transparency of Government Operations and Information
When: 10/6/2009 
Where: Washington, DC (venue TBD)

Other Cloud Computing Platforms and Services

<Return to section navigation list> 

Charles Babcock reports BMC Software Sees Hybrid Clouds On Horizon in this article of 7/17/2009 for InformationWeek:

BMC (NYSE: BMC) Software is working to give IT managers the power to provision their internal users with virtual machines, regardless of whether those VMs are hosted on company servers or in the Amazon (NSDQ: AMZN) EC2 cloud.

It's another step toward what observers call the hybrid cloud, or an internal data center working hand-in-glove with an external set of resources in a public cloud. Such a linkage would allow data centers to carry the bulk of a company's computing workload, but rely on a cloud to handle load spikes or non-critical, variable tasks, such as software testing. It may sound like BMC is just climbing aboard a hot button issue, but Herb Van Hook, VP of business planning, said BMC "has been led to the cloud by our customers. We had a joint customer with Amazon who was building out a hybrid cloud." …

•••• Maureen O’Gara casts a more jaundiced eye on BMC’s offering in her BMC Latest Cloud Me-Too post of 7/17/2009: “BMC is leveraging Amazon’s EC2 to get a piece of the cloud action.” …

•••• Alex Handy reported The end of Sun in his 7/16/2009 post for Software Development Times:

At 10:05 a.m. Pacific time today, Sun Microsystems' fate was sealed. At that exact moment, shareholder voting closed, and the motion to accept the acquisition offer from Oracle was approved. There was little fanfare. Jonathan Schwartz, Sun's CEO, and Scott McNealy, its chairman, were both absent. Schwartz was said to be sick. I can't help but think it was psychosomatic. …

Truly an ignominious end.

Sun Microsystems’ System News interviews Sun VP Hal Stern in its Change Is in the Wind for Sysadmins Says Sun VP post of 7/17/2009:

The cloud, whether private or public, says Sun VP Hal Stern, will change the nature of the system administrator's job, demanding a shift of emphasis from hardware and reliability to software. He shared his views with Alex Goldman in an interview for internetnews.com. "With services, we are leaving the hardware world," Stern said.

William Vambenepe's REST in practice for IT and Cloud management (part 1: Cloud APIs) post of 7/16/2009 "compare[s] four public Cloud APIs (AWS EC2, GoGrid, Rackspace and Sun Cloud) to see what practical benefits REST provides for resource management protocols."

Rackspace announced its Public API for Cloud Computing Servers in this 7/14/2009 post to Cloudonomics Journal:

Rackspace Hosting has announced the availability of the public beta of its Cloud Servers® API. Cloud Servers, part of the company's portfolio of cloud services, is an Infrastructure as a Service (IaaS) offering that provides inexpensive compute capacity that can be instantly sized allowing businesses to pay only for what it uses—as needed. Through the open, standards-based API, Rackspace Cloud customers can now manage their cloud infrastructure with greater control and flexibility. The API, for example, enables elastic scenarios as users can write code that programmatically detects load and scales the number of server instances up and down.

You also can read An Interview With The Architects of The [Rackspace] Cloud Servers API posted 7/14/2009.
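
As a rough sketch of the REST pattern described above, a Cloud Servers client authenticates once and then calls the returned management URL with a token. The auth endpoint and header names below follow the v1.0 API as commonly documented; treat them (and the credentials) as assumptions to verify against Rackspace’s documentation.

```csharp
using System;
using System.IO;
using System.Net;

class CloudServersSketch
{
    static void Main()
    {
        // Authenticate: the response headers carry the token and the management URL.
        var authRequest = (HttpWebRequest)WebRequest.Create("https://auth.api.rackspacecloud.com/v1.0");
        authRequest.Headers["X-Auth-User"] = "your-username";   // assumption: your account name
        authRequest.Headers["X-Auth-Key"] = "your-api-key";     // assumption: your API key

        string token, managementUrl;
        using (var authResponse = (HttpWebResponse)authRequest.GetResponse())
        {
            token = authResponse.Headers["X-Auth-Token"];
            managementUrl = authResponse.Headers["X-Server-Management-Url"];
        }

        // List the account's servers; POSTs and DELETEs against the same /servers
        // resource are what let code scale instances up and down programmatically.
        var listRequest = (HttpWebRequest)WebRequest.Create(managementUrl + "/servers");
        listRequest.Headers["X-Auth-Token"] = token;
        using (var listResponse = (HttpWebResponse)listRequest.GetResponse())
        using (var reader = new StreamReader(listResponse.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());   // XML or JSON list of servers
        }
    }
}
```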

<Return to section navigation list>