Friday, December 25, 2009

OakLeaf Blog Analytics for November 2009

I’ve decided to post monthly reports of the OakLeaf Blog’s traffic in order to spot trends. The analytics so far show little difference month to month for the year 2009, although a controversial post occasionally raises weekly traffic by as much as 50%.

Here’s Google Analytics’ default report for the OakLeaf Blog during November 2009 (click image for full-size screen capture):

The most popular post for the month was Windows Azure and Cloud Computing Posts for 11/9/2009+, which immediately preceded the week of PDC 2009:

The most interesting statistics for this post, at least to me, are the almost three minutes that the average visitor spent on the page and the number of returning visitors.

Statistics for December 2009 will be posted the first week of January 2010.

Wednesday, December 23, 2009

Windows Azure and Cloud Computing Posts for 12/21/2009+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.


• Update 12/23/2009: Microsoft Events: Tech*Ed 2008 Tracks and Content Topics; Windows Azure Platform AppFabric Team: The Windows Azure platform AppFabric December 2009 Release is Live

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display it as a single article; the links will then navigate to the section you want.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

On 9/29/2009, Wrox’s Web site manager posted a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage,” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010. 
* Content for managing DataHubs will be added as Microsoft releases more details on data synchronization services for SQL Azure and Windows Azure.

Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.

Azure Blob, Table and Queue Services

Tom explains the mysterious There is not enough space on the disk message you or your site’s users receive when uploading files greater than 100 MB in size with the ASP.NET FileUpload control in this 12/21/2009 post:

… There are a few ways that you can work around this issue.  The best solution would be to use Silverlight to handle the upload instead of the ASP.NET FileUpload control.  By using Silverlight, you can have the client directly upload the file to blob storage and reduce how many places the file gets copied.

There are also some 3rd party controls that you can use to do this as well. …

This appears to be the first useful post to Tom’s Azure Support Team blog. Subscribed.
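Tom’s block-oriented workaround can be sketched roughly like this; `upload_block` below is a hypothetical stand-in for a real blob-storage call, not an actual SDK method:

```python
# Sketch of the block-upload idea: split a large file into fixed-size
# blocks so no single HTTP request carries the whole payload.

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MB, the block-blob default in late 2009

def split_into_blocks(data: bytes, block_size: int = BLOCK_SIZE):
    """Yield (block_id, chunk) pairs covering the data in order."""
    for i in range(0, len(data), block_size):
        yield i // block_size, data[i:i + block_size]

def upload_file(data: bytes, upload_block):
    """Upload each block via the caller-supplied upload_block(id, chunk)."""
    block_ids = []
    for block_id, chunk in split_into_blocks(data):
        upload_block(block_id, chunk)
        block_ids.append(block_id)
    # a real blob client would now commit the ordered block list
    return block_ids
```

A Silverlight client uploading straight to blob storage would follow the same pattern, just with the loop running on the client instead of the web server.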

Rob Gillen explores a problem uploading both large (1.5-GB) and small (92.5-MB) blobs in 4-MB blocks to Azure Storage Services on 12/18/2009 in his Time to do some digging… post of 12/21/2009:

I’ve been getting my test harness and reporting tools setup for some performance baselining that I’m doing relative to cloud computing providers and when I left the office on Friday I set off a test that was uploading a collection of binary files (NetCDF files if you care) to an Azure container. I was doing nothing fancy… looping through a directory, for each file found, upload to the container using the defaults for BlobBlock and then record the duration (start/finish) for that file and the file size. The source directory contained 144 files representing roughly 58 GB of data. 32 of the files were roughly 1.5 GB each and the remainder were about 92.5 MB.

I came in this morning expecting to find the script long finished with some numbers to start looking at. Instead, what I found is that, after uploading some 70 files (almost 15 GB), every subsequent upload attempt failed with a timeout error – stating that the operation couldn’t be completed in the default 90-second time window. …

Rob continues with a detailed analysis, which pointed to the default 4-MB block size as the problem. Reducing the blob block size to 256 KB restored upload capability on 12/21/2009, but at a dramatically slower rate than usual.
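A quick back-of-the-envelope calculation (my arithmetic, not Rob’s code) shows why the smaller block size helped: with the 90-second window fixed, the minimum sustained throughput needed to get one block through scales linearly with block size.

```python
# Minimum sustained throughput needed to complete one block upload
# inside a fixed per-request timeout window.

TIMEOUT_S = 90  # the default timeout Rob's uploads were hitting

def min_throughput_kbps(block_size_bytes: int, timeout_s: int = TIMEOUT_S) -> float:
    """KB/s required to finish a block of the given size before timing out."""
    return block_size_bytes / 1024 / timeout_s

# 4-MB blocks demand ~45.5 KB/s sustained; 256-KB blocks only ~2.8 KB/s,
# so shrinking the block size let uploads survive a degraded link.
```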

In a comment to Rob’s post, I mentioned My Windows Azure Table Test Harness App Was Down for 02 Hours and 30 - 40 Minutes Yesterday, which links to an Azure forum thread by Microsoft’s Steve Marx reporting that the South Central US data center had been having problems with processing Azure queues and blobs/tables were affected for some users. The times don’t correspond, but the two issues might be related.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

André van de Graaf provides yet another illustrated guided tour to Getting Started with SQL Azure, create your first SQL database in the cloud on 12/22/2009:

Microsoft® SQL Azure™ Database is a cloud-based relational database service built on SQL Server® technologies. … Therefore, I started creating a SQL Azure database in the cloud to see the current status of the cloud-based relational database service.

Buck Woody’s Monitoring SQL Azure For Performance post of 12/22/2009 observes:

In SQL Server Azure, there are no Dynamic Management Views (DMVs) or Performance Monitor Objects and Counters that you can access, so you can’t run your standard performance monitoring that way. I suspect that as time goes on, SQL Azure will have some instrumentation, but for the time being, you’ll have to go with a different metric – round trip throughput.

What I mean by that is you’ll need to measure the start of a transaction and its completion. In the end, this is the only metric that matters anyway, but it is helpful to know what to fix – and this metric doesn’t help you with that.

What’s been working for me is to develop my queries locally using all my tricks and tools, and then post those up to SQL Azure. While it’s not a one-to-one map, it does seem to fit the bill for now. I’ll keep you posted.
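Buck’s round-trip metric amounts to wrapping each query in a timer. A minimal sketch, with `run_query` as a hypothetical stand-in for the actual database call:

```python
# Measure "round trip throughput" for SQL Azure queries: since there are
# no DMVs or perf counters to inspect, time each query start-to-finish.

import time

def timed_round_trip(run_query, sql: str):
    """Return (result, elapsed_seconds) for one query round trip."""
    start = time.perf_counter()
    result = run_query(sql)
    elapsed = time.perf_counter() - start
    return result, elapsed
```

Logging the elapsed values over time gives a baseline to compare against after query changes, which is about all the visibility Buck describes having for now.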

Cihan Biyikoglu promises to start blogging after a two-year hiatus in his Feels good to be back post of 12/22/2009:

Hi folks, I am back and blogging again about my new cause. Your data in the cloud with SQL Azure.

Here is what I have been up to the last few years; HealthVault. A health information database for the consumers and closely followed developments on the Azure side. We shipped the first beta back in 2007 and shipped the production release in 2009. HealthVault certainly changed the way I maintain and share my health information and I think it will change yours too. Especially with support for digital signatures (here is the presentation I did at the Connected Health Conference 09) and the recent support for large healthcare data like X-Rays and MRIs.

SQL Azure has been on my watch list the last few years and finally made the leap back to my native land – SQL Server. I spent some time at PDC 09 this year and the interest for SQL Azure and the rest of the Azure stack is phenomenal. Finally there is a good excuse to start blogging again. Expect to see programmability techniques and best practices for SQL Azure. [Emphasis added.] …

Subscribed.

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

• The Windows Azure Platform AppFabric (née .NET Services) Team announces The Windows Azure platform AppFabric December 2009 Release is Live on 12/22/2009:

The Windows Azure platform AppFabric December release is live as of December 18th 2009.  This release includes improvements in stability, scale, and performance. Please refer to the release notes for a complete list of the breaking changes in this release. You are encouraged to visit the AppFabric portal to retrieve the latest copy of the SDK.

Be sure to read the Release Notes before attempting to run the sample applications. There are yet more breaking changes.

Eugenio Pace’s Updated code samples & chapters for Claims Identity Guide – Release Candidate post of 12/21/2009 announces:

In synch with the availability of ADFS V2.0 Release Candidate, I’m very happy to announce that we are posting a new update of the guide. Our own “RC”.

You’ll find new samples and new chapters, both content-complete now.

We are now covering the following scenarios:

  1. Single Sign on for web applications: one company, 2 applications, before and after claims
  2. Windows Azure: an extension of scenario 1, this shows how to host a web application in Windows Azure while keeping the SSO experience.
  3. Simple federation sample: 2 companies collaborating. SSO across different security realms.
  4. Federation with multiple partners: demonstrates an application with multiple federation relationships. It also shows WIF and MVC.
  5. Web Services: this is essentially scenario #3, but using WCF and a WPF smart client.

The samples are now packaged as a self-extracting zip file that includes a dependency-checking tool to help you identify, install, and configure all prerequisites.


<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

• Lynn Langit (@llangit, a.k.a. SoCalDevGal) recently posted a sample Windows Azure application that connects to the AdventureWorksLT database running in an SQL Azure instance:

This is a sample application which connects to SQL Azure, queries and displays the query result using an ASP.NET application.

In this example, the ASP.NET application is hosted in Windows Azure. It is NOT a requirement when using SQL Azure that the 'front-end' be hosted in Windows Azure, for example the front-end could be hosted by an ISP (public website), or on-premise (intranet) scenario.

The sample database used in this application is a version of AdventureWorksLT (available on CodePlex at http://www.CodePlex.com) which is compatible with the SQL Server features which are supported by SQL Azure.  To learn more about how I wrote and published this simple application to the Azure Platform, including details on which SQL Server RDBMS features are supported in SQL Azure (and which features are not), see my blog http://blogs.msdn.com/SoCalDevGal.
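For reference, a SQL Azure connection string of this era took roughly the following shape (server name and credentials are placeholders; Lynn’s post has the authoritative details):

```
Server=tcp:yourserver.database.windows.net;Database=AdventureWorksLT;
User ID=yourlogin@yourserver;Password=...;Encrypt=True;
```

Note the `login@server` user-ID form and the required encrypted connection, both SQL Azure conventions that differ from an on-premise SQL Server connection string.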

The BizSpark team launched the new StartUp Zone, “The guide to working with Microsoft for software startups and their investors,” as a stealth portal in mid-December. The team wants startups to:

Join Microsoft® BizSpark™

  • Get current full-featured Microsoft development tools and production licenses
  • No upfront costs and minimal requirements
  • Get support, training and marketing visibility
  • Get guidance and mentorship from Network Partners

Learn more

It’s an impressive site.

Mike Amundsen’s The Path to HTTP/REST Mastery of 12/22/2009 delivers a list of what it takes to demonstrate Black-Belt RESTMaster prowess:

[W]ant to be known as an HTTP/REST guru? [R]eady to step onto the path of HTTP/REST mastery? [H]ere's my list of things you should have already done, be doing now, or be preparing to do in the near future. [I]f you've got this list taken care of (not just 'covered', but really nailed) then [I]'d consider you eligible for the title of HTTP/REST Master. [Emphasis Mike’s.]

Mike is one of the technical editors of Cloud Computing with the Windows Azure Platform and is a contributor to Subbu Allamaraju’s forthcoming RESTful Web Services Cookbook, which is scheduled to hit the bookstores in March 2010.

Subbu Allamaraju deals with the issue of whether the use of custom media types is RESTful in his Media Types, Plumbing and Democracy post of 12/22/2009. He concludes:

So what is the right thing to do? Here is my democratic approach.

  • If the sender is formatting representations using standard extensible formats such as XML or JSON, use standard media types such as application/xml and application/json.
  • Mint new media types when you invent new formats.
  • If you are just looking for a way to communicate application level semantics for XML and JSON messages, use something else (e.g. XML namespaces and conventions).
  • If the goal is versioning, use version identifiers in URIs.

(For those not clear about the difference between a format and a media type, a media type is an identifier for a format.)
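Subbu’s rules can be boiled down to a small decision helper; this is an illustration of the bullets above, not anything from his post (the `vnd.example` vendor tree is a placeholder):

```python
# Map a wire format to a media type per Subbu's "democratic" rules:
# reuse standard types for standard formats, mint new types only for
# genuinely new formats, and stay generic otherwise.

def pick_media_type(fmt: str, invented: bool = False,
                    vendor: str = "example") -> str:
    """Return a media type identifier for the given format."""
    if invented:
        # a brand-new format warrants minting a new vendor media type
        return f"application/vnd.{vendor}.{fmt}"
    if fmt in ("xml", "json"):
        # standard extensible formats keep their registered types
        return f"application/{fmt}"
    # anything else: fall back to a generic type rather than minting one
    return "application/octet-stream"
```

Versioning and application-level semantics stay out of the media type entirely, per his last two bullets.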

Gladinet announced the capability to Backup Music to Microsoft Windows Azure with Gladinet Cloud Desktop v1.4.2 on 12/21/2009:

Gladinet Cloud Desktop v1.4.2 includes the capability to define file type based backup sets. This gives users the ability to easily backup their music to Microsoft Windows Azure. As shown below, the user interface is very easy to use.

First, select “Backup My Music” from Gladinet Cloud Desktop’s systray menu.


After doing this, a backup wizard will appear. The first page allows selection of a name for the backup task and the included extensions. It also allows the addition of other file types. …


Other Gladinet posts of the same date describe backing up additional file types to Azure blobs.

Krishnan Subramanian’s Healthcare And Cloud Computing post of 12/21/2009 begins:

I am a strong proponent of tapping into the Cloud to solve healthcare problems. Zoli and I have both written, at different times, about Cloud-based healthcare initiatives by Google and Microsoft in this blog. Even though the progress on both the Google Health and Microsoft HealthVault services is disturbingly slow, we need the big players to step in along with the government to accelerate the use of cloud computing in healthcare. …

David Aiken shows you How to Create a x509 Certificate for the Windows Azure Management API in this step-by-step tutorial of 12/21/2009.

Jim O’Neil completes his “Discovering ‘Dallas’” series with Discovering Dallas: Part 3 of 12/21/2009:

This is the final post of a trifecta examining Microsoft “Dallas”, a marketplace for data services announced at PDC 2009.  In my first post, I provided an overview of “Dallas”, including how to access various trial data services via the developer portal and in code. 

I followed that up last week with a post that walks through modifying the auto-generated C# service proxy classes to provide asynchronous invocation capabilities, a must have for Silverlight client access and, in general, a good way to go to keep your application UI responsive in the face of slow or unpredictable network performance.

For this article, I’m going to leverage the code changes I made in the last article (to enable asynchronous access) to build a Silverlight application that accesses the same Data.gov crime statistics that the previous Windows Forms application did.  As I mentioned in that post, Silverlight has two primary constraints in terms of accessing resources and services over the web:

  • Access must be asynchronous, and
  • Resources must be accessed from the point-of-origin of the Silverlight application (although there are options we’ll discuss next for working around this). …

Soyatec published its Windows Azure Tools for Eclipse site to the Web on 12/22/2009 (or possibly earlier) with the following overview:

The Windows Azure platform offers an intuitive, reliable and powerful platform for the creation of web applications and services. The Windows Azure platform is comprised of Windows Azure: an operating system as a service; SQL Azure: a fully relational database in the cloud; and .NET Services: consumable web-based services that provide both secure connectivity and federated access control for applications.

The purpose of this project is the creation of a feature-rich open source PHP application development environment in Eclipse that enables development and deployment of PHP applications for Windows Azure. The windowsazure4e plug-in builds upon the PHP Development Toolkit (PDT) and integrates Web Tools Platform (WTP) to provide a complete toolkit for Windows Azure Web Application development.

The windowsazure4e extensions offer an end-to-end Windows Azure development experience, including:

  • Project Creation & Migration: The New Project Wizard creates a new PHP Web Application targeting Windows Azure. Existing PHP projects can be converted to Windows Azure projects (or vice-versa) using the migration tool.
  • Azure Project Structure & Management: The windowsazure4e plug-in creates the project artifacts that Windows Azure expects, including a Windows Azure Service project and a Web-role Project, as well as Windows Azure configuration and definition files. Project and Windows Azure settings are exposed via the properties window in Eclipse.
  • Storage Explorer: As part of the plug-in, a Windows Azure Storage Explorer is provided within the Eclipse environment. The Storage Explorer allows easy management of Windows Azure Storage Accounts. In addition, it also provides a friendly user-interface for performing Create, Read, Update, and Delete (CRUD) operations on Blobs, Queues, and Tables. The Storage Explorer is built using the Windows Azure SDK for Java™.
  • Azure Project Deployment: Once the PHP application for Windows Azure has been developed and tested locally on the Windows Azure Development Fabric, the application can be packaged up for Windows Azure deployment by right-clicking the target project from within Eclipse.

The team promises to start a blog shortly.

Ben Riga continues his Windows Azure Lessons Learned video series with a Windows Azure Lessons Learned: Invensys post of 12/21/2009:

In this episode of Windows Azure Lessons Learned I chat with Paul Forney, System Architect for Invensys and Aleksey Savateyev, Senior Architect in Microsoft’s Global ISV group working with Invensys.  Invensys is well known for industrial automation and control systems.  They’ve been working to develop a system for the power industry to manage the large network of smart meters that will be used to build out smart grids delivering electricity from suppliers to consumers.  To do this Invensys is using Windows Azure AppFabric (formerly called “.NET Services”).  The AppFabric Service Bus is the magic that allows this type of application.  It allows those meters not only to connect across the cloud to on-premises systems but also does it in a way that can scale to the millions of homes and businesses that will form the smart grids.

Channel 9: Windows Azure Lessons Learned: Invensys

and Windows Azure Lessons Learned: GoGrid of 12/22/2009:

One question that is often asked is how hosters can benefit from the Windows Azure Platform.  While the platform can be used to deploy many types of web apps we expect many partners including hosters to develop on top of the Windows Azure platform infrastructure.  In this episode of Azure Lessons Learned I chat with Paul Lappas, VP Engineering at GoGrid and Mehul Shah and Madhavrao Pachupate from Blue Star Infotech.  GoGrid has been working on a hybrid solution that builds on the GoGrid infrastructure to assist in development and load testing of Windows Azure applications.

For more information on the GoGrid solution for Windows Azure have a look here: http://www.gogrid.com/azure/

Channel9: Windows Azure Lessons Learned: GoGrid

Matt Kerner posted the sample code from his Windows Azure Monitoring, Logging, and Management APIs PDC 2009 session (SVC15) to the Windows Azure Diagnostics, Logging and Monitoring CodePlex site on 12/18/2009:

This code sample demonstrates Windows Azure Diagnostics, the framework in the WA SDK that allows service developers to scalably control and gather standard Windows and .Net instrumentation data from their roles running in the cloud. These samples were demonstrated at PDC 2009 in the SVC 15 talk: http://microsoftpdc.com/Sessions/SVC15.

There are four samples posted on the "Downloads" tab:

  1. Windows Azure Diagnostics Controller - this is a command-line application to be run from your desktop that allows you to retrieve the diagnostic configuration of role instances running in the Windows Azure cloud. It also allows you to initiate an on-demand transfer of diagnostic data to Windows Azure Storage from your role instances in the cloud.
  2. Windows Azure Diagnostics Demo - this is a sample website that integrates the Windows Azure Diagnostic Monitor. It buffers data locally and can also be configured to transfer the data to Windows Azure storage on-demand, or on a scheduled basis.
  3. Windows Azure Diagnostics Hello World - this is a very simple sample website that integrates with the Windows Azure Diagnostic Monitor.
  4. Windows Azure Sample Tracing Library - these tracing routines make it easy to dump the Windows Azure Diagnostic Monitor configuration to the console. …

<Return to section navigation list> 

Windows Azure Infrastructure

• Eric Nelson ponders Multi-tenanted applications on Windows Azure and SQL Azure – added to my todo list for 2010 in this 12/23/2009 post:

Back when I was an Application Architect working with UK ISVs I spent a lot of time helping companies understand how to architect multi-tenanted applications. However, I have never pondered about the challenges (and opportunities) of doing this with Windows Azure and SQL Azure. Until now.

What sparked it off was a fairly simple question from a UK ISV that I ended up copied on.

The question in essence was: “What is the recommended approach to building a multi-tenancy solution on Windows Azure and SQL Azure in relation to domains, IP addresses and https certificates for each customer”

I thought there would be a lot of public information on this – but after a 30 minute search I was surprised to find relatively little on this area. …

Eric continues with a list of recent resources for writing multi-tenanted Windows Azure applications and concludes:

… It is worth mentioning that “Azure v1” was not designed to specifically support building multi-tenanted applications. Hence potentially there are easier ways to build multi-tenanted applications than by leveraging Windows Azure. Check out SaaSGrid for one such example. SaaSGrid lets you utilize Microsoft .NET languages to write code and provides a slim, simple yet very powerful API layer that allows the application to interact with SaaSGrid in certain explicit ways.

The final upshot of the above is… I have added this topic to my “things to understand in 2010” list :-)

• Bruce Guptil and Charlie Burns assert Cloud IT Being Approached as “Mainstream” IT in this 12/23/2009 Research Alert for Saugatuck Research (site registration required):

The latest Saugatuck research report indicates that the vast majority of user enterprises, regardless of size, industry, or geographic location, are approaching Cloud Computing, especially the acquisition and use of Cloud infrastructure solutions, in the same manner as they would any other form or type of IT.

The approaches to Cloud services consideration, acquisition, and management, including expectations of benefits and use, mirror those of traditional forms of IT.
For Saugatuck, this is strong support for our position that Cloud-based IT is well on its way to becoming a regular component of “mainstream” IT and business strategy and management. It also indicates a significant gap between user executives’ perceptions of Cloud adoption and use, and the realities of Cloud effects and costs on user enterprises.

The first of two critical studies from Saugatuck using this research is being released this week to clients of our premium subscription Continuous Research Service (CRS) (please see Note 1). …

Bruce and Charlie continue by fleshing out their assertion in detail.

The Windows Azure Platform Team confirmed their Windows Azure Platform CTP Upgrade Path in a 12/23/2009 email to Windows Azure and SQL Azure token holders. The only information that was new to me is emphasized in bold below:

During the first week of January 2010, you will receive an email to this account with detailed upgrade instructions. You will then have until January 31, 2010 to upgrade. If you elect not to upgrade, on February 1, 2010 your CTP account(s) will be disabled and any Windows Azure Storage will be made read-only. SQL Azure CTP accounts will be able to keep using their existing databases but will no longer be able to create new databases. On March 1, 2010, the SQL Azure CTP accounts that have not been upgraded will be deleted. On April 1, 2010, the Windows Azure Storage CTP accounts that have not been upgraded will be deleted. It is important to export your data if you do not plan to upgrade to a commercial subscription prior to these dates.

If you have any questions regarding the upgrade process, please feel free to contact customer support. If you are participating in the Microsoft Codename "Dallas" CTP, your service remains in CTP and is not impacted by this announcement.

• John Willis (@botchagalupe) presents The 2009 Cloudies Awards on 12/23/2009:

This is the second year of the “Cloudies” award and still only one judge (me).  However, there are things in the works to make the “Cloudies” a more official and less tongue-in-cheek affair next year.  I did solicit some tweets this year for awards. Please don’t be offended if you are not in this list.  This list represents my radar and is somewhat of a goof.  I am a one-man show, not a global organization.  If you don’t agree with me, please post a comment, and if you have a good argument I will create an updated post. …

I agree with John’s choice of Chris Hoff (@Beaker) as Cloud Hero of 2009 (see And the 2009 Cloudie Award Goes To… ), but not his failure to give Windows Azure at least one pick in his numerous categories.

John anoints Lori MacVittie with the “Rookie of the year” award. See Lori’s paean to @Beaker in the Cloud Security and Governance section.

Randy Bias explains How Clouds Enable Global Reach in this 12/19/2009 post:

Over a year and a half ago, I mentioned that there were four key aspects to cloud computing: scalability, leverage, speed, and reach.  All of these still hold true today.  In particular, the one area that was underdeveloped was the notion of using clouds for global reach.

As you know, since then quite a bit has changed.  Amazon’s Elastic Compute Cloud crossed the Atlantic to Europe, EC2 opened up a U.S. West Coast presence, AWS also recently pre-announced their Asian expansion, and a number of other clouds sprung up across the globe, including a very strong new Australian entrant, Cloud Central.[1]

All of this goes to show that my prediction around the importance of reach in cloud computing is coming true.  One of the examples that brings this home that I enjoy talking about is Friendster.

Eric Knorr asserts “A lot of people scratched their heads over cloud computing this year. Here are the answers to the most persistent questions” in his Five big questions about cloud computing post of 12/22/2009 from InfoWorld to NetworkWorld:

    1. What defines a cloud service? …
    2. Is there such a thing as a private cloud? …
    3. Will cloud services replace the Microsoft desktop? …
    4. Do cloud services mean the end of IT as we know it? …
    5. Does the cloud really enable anything new? …

Michael Coté’s A little something extra… topic from his Links for December 18th through December 22nd post asserts:

Several people have asked us recently about Microsoft Azure and Microsoft’s cloud strategy. A little while ago, James Governor pointed out that Microsoft seemed to have been trying to figure out who in Microsoft would “own” cloud computing. And it seemed, James said, that Bob Muglia’s group ended up with it. As your little something extra today, here’s a recent response along those lines I sent to a reporter asking after Microsoft and cloud:

“Microsoft is well positioned with Azure. Now that the internal decision process of where cloud should reside has been resolved (in Bob Muglia’s Server and Tools Business), there’s less distraction in figuring out whether Microsoft’s answer to the cloud will be consumer-centric (more the Ray Ozzie line of thought) or business-centric.

“Muglia’s group has done well executing of late, and they seem to have clamped down old school Microsoft style on Azure. They identified their core strength – millions of “Microsoft developers” – and have delivered a cloud offering along those lines – a platform as a service, a development platform.

“This differentiates Microsoft’s cloud offering from Amazon’s (which is purely at the infrastructure, operations, “build it yourself” level), Salesforce’s (tailored more towards ERP and application extensions), and Google’s (not too business oriented). Others like IBM and HP are more interested in tooling private clouds, whereas Microsoft seems very keen on delivering a new way for general software developers to deliver applications over the public Internet.

“For more on Azure, check out the three interviews I did with Microsofties on the topic back at MIX 2009.”

Michael is an industry analyst with RedMonk.

Paul Thurrott asserts “Cloud computing will likely be the defining topic of 2010” in his 2010: More of the Same ... But in a Good Way post of 12/22/2009 to the Windows IT Pro blog:

… Cloud computing will likely be the defining topic of 2010 as well, with Microsoft finally unleashing its Windows Azure services and the long-awaited Office Web Applications. Meanwhile, previously released products, such as the Business Productivity Online Suite (BPOS), which repositions traditional servers like Exchange and SharePoint as cloud services, will continue to grow—and grow dramatically. This is a huge and important shift in Microsoft's focus and I expect it to pay off big time in the coming year. …

R “Ray” Wang recommends that you “Keep In Mind Basic Rules Still Apply Regardless Of Deployment Option” in his Tuesday’s Tip: 10 Cloud and SaaS Apps Strategies For 2010 post of 12/22/2009 to the SoftwareInsider blog:

The proliferation of SaaS solutions provides organizations with a myriad of sorely needed point and disruptive solutions.  Good news - business users can rapidly procure and deploy, while innovating with minimal budget and IT team constraints.  Bad news - users must depend more on their SLA guarantees and deal with a potential integration nightmare of hundreds if not thousands of potential SaaS apps.  Though the 7 key benefits of SaaS outweigh most downside risks, organizations must design their SaaS apps strategies with the same rigor as any apps strategy.  Just because deployment options have changed, this does not mean basic apps strategy is thrown out the window.  Concepts such as SOA, business process orchestration, and enterprise architecture will be more important than ever.

Ray continues with “10 strategies to consider as organizations take SaaS mainstream.”

John Fontana claims “For Microsoft, 2010 is a critical year for its cloud computing platform” in his Microsoft's 2010 task: Make the cloud clear post of 12/21/2009 to NetworkWorld:

For Microsoft, 2010 is a platform building and marketing year with no less than the future success of its cloud strategy hanging in the balance, according to observers.

Experts say Microsoft's charge is not only to begin developing and delivering technology that will define its external, internal and hybrid cloud environments, but to clearly articulate to an overwhelming majority of corporate IT pros just how and why they want to live in a cloud.

"In terms of the cloud, it is important for Microsoft to be on the right trajectory, it's not necessarily important to their business from a revenue standpoint to capture lots of revenue out of cloud in the next 24 months," says Al Gillen, program vice president for system software at IDC. "But if they don't get in line to compete, they put themselves at a significant risk of being not there when real money starts to get spent in this space."

Lining up that trajectory will dominate 2010, as Microsoft clearly has work to do across its product line to define the cloud as part of its software-plus-services and three-screens-and-a-cloud strategies.

"Our initial focus was to make it as easy for the new applications coming in [to the cloud]," said Amitabh Srivastava, senior vice president of Microsoft's new Server and Cloud division, referring to the work he has overseen on the Azure cloud platform. Srivastava, who spoke to Network World at Microsoft's November PDC conference said, "The shift you are seeing now is where our head is going. And one place is to go after legacy apps, we have to move those to the cloud." …

@Colinizer’s Azure Platform Billing On-Ramp post of 12/22/2009 reports:

Here’s the timeline for the ramp-up of Azure Platform Services billing:

  • Jan 4, 2010 – CTP accounts can be upgraded to commercial accounts …
  • Feb 1, 2010 – Billing starts for upgraded accounts and non-upgraded accounts are disabled with Windows Azure Storage going read-only and no new database creation in SQL Azure. 
  • Mar 1, 2010 – non-upgraded SQL Azure databases will be deleted
  • Apr 1, 2010 – non-upgraded Windows Azure Storage will be deleted …

Bruce D. Kyle lets you Take a Tour of Data Center in a Container in this video posted 12/22/2009:

Frank Arrigo takes you on a tour of what the cloud physically looks like by walking you through a container that was shown at Professional Developers Conference (PDC09). The container is similar to one that is placed in data centers that host your Windows Azure applications.

James Urquhart recommends Seven [SaaS and cloud-related] businesses to look out for in 2010 in a 12/21/2009 update to a 1/1/2008 post:

In January of 2008, dreading the idea of a cliche "prediction" post, I wrote a post that attempted to somewhat humorously outline seven businesses that would result from the then nascent cloud computing movement. As I look back at that post this year, I'm surprised to find myself thinking that most--if not all--of these should appear in one form or another in the coming year. …

He follows with an “updated commentary from this year in italics.”

Michael Krigsman offers Three success predictions for 2010 on 12/21/2009. The first of the three is:

1. The cloud grows bigger and implementation innovation becomes more important

Cloud implementations offer the potential for simpler, smaller deployments with shorter cycle times and reduced risk.

A significant part of the risk reduction associated with on-demand software occurs because the scope of these deployments tends to be smaller than full-blown on-premise implementations. In that sense, a simple cloud / on-premise cost comparison isn’t fair. For example, a basic Salesforce.com CRM implementation will naturally be faster than an SAP ERP deployment.

Nonetheless, cloud software can pull time and effort from customer implementations, leading to lower cost and risk. I disagree with those who think the cloud is pure panacea — it’s not — but when a vendor’s software as a service (SaaS) offering matches customer needs, the results can be great.

This trend will accelerate through 2010 and beyond, forcing established enterprise software and services firms to figure out innovative ways to improve implementations. …

This prediction bodes well for Windows Azure and SQL Azure, which exhibit considerable “implementation innovation” and ease of use, especially for .NET developers.

Steve Clayton’s Windows Azure for Microsoft Partners post of 12/21/2009 provides a comprehensive list of Windows Azure and SQL Azure resources of interest to members of the Microsoft Partner Network in the following categories:

    • Business Content …
    • Technical Content …
    • Architectural Content …
    • Developer Content …

<Return to section navigation list> 

Cloud Security and Governance

Lori MacVittie prefaces her ‘Twas Two Weeks Past (Cloud) Deployment poem of 12/23/2009 with “Here comes St. [@]Beaker and Santa Cloud …”:

‘Twas two weeks past deployment and all through the house
Echoed taps on a keyboard and clicks from a mouse
The apps were all running inside VMware
In hopes compute resources soon would they share.

The dashboard showed statuses green and not red
our admins had thoughts of going home in their heads
The director was ready to call it a wrap
and I began thinking I'd soon take a nap.

When all of a sudden our illusions did shatter
I called up a console to see what was the matter
On the keyboard my fingers they flew like a flash
To open a terminal and the shell they call bash …

Is there a word that corresponds to “doggerel” for squirrels?

Lori received John Willis’s “Rookie of the year” Cloudies award on 12/23/2009.

Scott Morrison’s Cloud Security Alliance Guidance v2 Released post of 12/23/2009 summarizes v2.1’s domains:

Last week, the Cloud Security Alliance (CSA) released its Security Guidance for Critical Areas of Focus in Cloud Computing V2.1. This is a follow-on to the first guidance document released only last April, which gives you a sense of the speed at which cloud technology and techniques are moving. I was one of the contributors to this project.

The guidance explores the issues in cloud security from the perspective of 13 different domains:

Cloud Architecture

  • Domain 1: Cloud Computing Architectural Framework

Governing in the Cloud

  • Domain 2: Governance and Enterprise Risk Management
  • Domain 3: Legal and Electronic Discovery
  • Domain 4: Compliance and Audit
  • Domain 5: Information Lifecycle Management
  • Domain 6: Portability and Interoperability

Operating in the Cloud

  • Domain 7: Traditional Security, Business Continuity, and Disaster Recovery
  • Domain 8: Data Center Operations
  • Domain 9: Incident Response, Notification, and Remediation
  • Domain 10: Application Security
  • Domain 11: Encryption and Key Management
  • Domain 12: Identity and Access Management
  • Domain 13: Virtualization

I thought the domain classification was quite good because it serves to remind people that technology is only a small part of a cloud security strategy. I know that’s become a terrible security cliche, but there’s a difference between saying this and understanding what it really means. The CSA domain structure–even without the benefits of the guidance–at least serves as a concrete reminder of what’s behind the slogan.

Have a close look at the guidance.  Read it; think about it; disagree with it; change it–but in the end, make it your own. Then share your experiences with the community. The guidance is an evolving document that is a product of a collective, volunteer effort. It’s less political than a conventional standards effort (look through the contributors and you will find individuals, not companies). The group can move fast, and it doesn’t need to be proscriptive like a standard–it’s more a distillation of considerations and best practices. This one is worth tracking.

I agree.

Chris Hoff (@Beaker)’s 2010 – It’s Time for Security Resolutions Not Predictions… post of 12/21/2009 begins:

November and December usually signal the onslaught of security predictions for the coming year. They’re usually focused on the negative.

I’ve done these a couple of times and while I find the mental exercise interesting, it really doesn’t result in anything, well, actionable.

So, this year I’m going to state what I am *going* to do rather than what I think others *might.*  I’ve spent the last couple of years talking about the challenges, now it’s time to focus on the solutions. …

@Beaker continues with a list of 10 security-related resolutions for 2010.

<Return to section navigation list> 

Cloud Computing Events

Microsoft Events has published Tech*Ed 2010 Tracks, which includes a Cloud Computing and Online Services (COS) track, and Tech*Ed 2010 Call for Content Topics (PDF), which includes a Windows Azure AppFabric topic in the Application Integration (AIN) track and a Microsoft SQL Azure topic in the Database Platform (DAT) track. Here’s a link to the Call for Content page, which doesn’t include a Windows Azure [Platform] topic.

Use the code RSVP10-ARC to sign in to the Call for Topics, as recommended by the Tech*Ed North America 2010 - Architecture Track Call for Content page; I found RSVP10-COS also works.

Either there isn’t any interest in Windows Azure topics or the @TechEd_NA team doesn’t want Windows Azure topics presented by independent developers or software vendors. Looks like the birth of another Epic #FAIL to me.

The good news is that Tech*Ed has returned to New Orleans, which means my wife and I can enjoy another stay at the Omni Royal New Orleans and the food at Galatoire’s, et al.

Andre Leibovici announces Second CloudCamp Sydney confirmed in a 12/22/2009 post:

The second CloudCamp Sydney has been confirmed for March 4th, 2010.

I had the opportunity to go to the first event and that’s when I learned about the unconference panel. CloudCamp is by nature an unconference and there are no specific subjects to discuss, no keynotes and no presenters. The attendees are part of the discussion and are responsible for deciding the themes to be discussed.

The discussion points for this CloudCamp? Nobody knows… it will be decided there, but one thing I’m sure – It will be about clouds! Visit the official registration site and see you there!
http://cloudcamp-sydney-10.eventbrite.com/

Also check out CloudCamp Sydney from August 2009.

Sys-Con Events announced Cloud Expo New York Call for Papers Extended to January 15, 2010 in this 12/22/2009 post:

In response to massive demand, the world's largest Cloud event - the International Cloud Computing Expo series - is expanding its number of tracks and sessions for its upcoming New York event: 5th International Cloud Expo.

The deadline for the very popular Call for Papers, which resulted in a greater deluge of submissions than ever seen before, is being extended to the other side of the holiday period - to January 15, 2010.

Additionally, SYS-CON Events announces the addition of additional tracks, additional session slots, and a general all-round expansion of the program in line with the expectations and needs of the 5,000 delegates who are expected to register for the event.

Greg Willis’s Launching Windows Azure in Australia post of 12/21/2009 announces:

I’m very pleased to share that we will be holding a launch event for the Windows Azure Platform in Australia on February 23rd 2010 in Sydney.  We are also planning a local roadshow around this date, including a stop in Melbourne on February 25th.

Full details forthcoming soon, including our visiting speakers – watch this space.  In the meantime save the date in your diaries!

Commercial availability of Windows Azure in Australia is planned for the March 2010 timeframe as detailed in the official Azure FAQ.   One of the great things for Australian developers about being in the second launch wave is that we get a few extra months of free local access to the full Community Technology Preview platform so go and get started and let us know what you create!

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

• Ellen Rubin lists “Holiday parties, snow, and new features from cloud providers” as Holiday Presents from the Cloud in her 12/23/2009 post, which analyzes Amazon Web Services’ latest upgrades and concludes:

… The news from Amazon comes on top of what was already an outstanding year for cloud computing with major announcements from many key players, including: IBM software running in the cloud, new VMware-based public clouds, reduced pricing for servers and storage in the cloud, and Microsoft’s Azure gaining momentum. Each of the cloud providers is growing and maturing its cloud offerings, and we are reaching a tipping point where there are multiple clouds with sufficient features to support enterprise workloads. Get ready for 2010—it’s going to be an exciting year as large-scale enterprise cloud computing takes off. [Emphasis added.]

Pat Romanski reports “New solutions for developers to create and deliver software in both public and private cloud environments” in her IBM Introduces New Cloud Offerings post of 12/21/2009:

IBM announced new solutions for developers to create and deliver software in both public and private cloud environments.

  • IBM Rational Software Delivery Services for Cloud Computing include a set of ready-to-use application lifecycle management tools for developing and testing in the IBM Cloud, and use infrastructure management capabilities, to help organizations build software applications in the cloud. With these new services, clients can lower costs and respond quicker to organizational demands. For example, organizations can reduce the time it takes to provision a test environment from weeks to hours, and in some cases even minutes.
  • IBM Smart Business Development and Test on the IBM Cloud is a free public cloud beta for software development that provides compute and storage as a service, as well as Rational Software Delivery Services to help application developers and testers speed the development and delivery of software applications. IBM is inviting free and open participation at www.ibm.com/cloud/developer.

Clint Boulton’s IBM Preps for Cloud Computing War vs. Google, Microsoft in 2010 post of 12/21/2009 to eWeek’s Cloud Computing blog asserts:

IBM will ramp up its cloud computing efforts in the messaging and collaboration market in 2010, focusing on extending the security of on-premises solutions to its LotusLive SAAS offerings. From January through October, IBM launched LotusLive Engage, a broad social networking and collaboration platform; LotusLive Connections, a SAAS version of its social networking suite; and its LotusLive iNotes hosted mail solution. IBM will have more to say on its future cloud computing direction in January at Lotusphere 2010.

Customers and industry watchers can expect IBM to accelerate its cloud computing efforts into 2010 and beyond, investing at a rate that is commensurate to a $120 billion cloud computing market, an IBM executive told eWEEK. …

<Return to section navigation list> 

Monday, December 21, 2009

Windows Azure and Cloud Computing Posts for 12/18/2009+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

• Update 12/19/2009: Upperside Conferences: Call for Papers for Cloud Telco 2010; Rob Gillen: Windows Azure, Climate Data, and Microsoft Surface; Azure AppFabric Team: New Windows Azure platform AppFabric SDK V1.0; Silverlight - TN: Tunisian Silverlight Developers Blog live on Azure; Michael Krigsman: Modern SOA governance: Adoption and measurement; Christofer Löf: CRUDing with “ActiveRecord for Azure”; ADO.NET Data Services (Astoria) Team: Update on the Data Services Update for the .NET 3.5 SP1; Mikael Ricknäs: Amazon adds media streaming to content delivery service; Scott Guthrie: Installing .NET 4 on Windows Azure and more.


Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.

Azure Blob, Table and Queue Services

Cory Fowler’s Setting up the Development Storage Service post of 12/20/2009 explains:

If you are setting up your development environment for Windows Azure, you may want to avoid installing SQL Server Express. I didn’t want to use SQL Server Express because I’ve already installed the SQL Server 2008 R2 CTP, so I can interact with my SQL Azure Database in the cloud.

and continues with a detailed tutorial for launching the Development Storage Initialization Tool (DSInit.exe) with a default SQL Server 2008 R2 November 2009 CTP instance.

The easier approach, however, is to retain the .\SQLEXPRESS instance already installed and just download and install SQL Server Management Studio (SSMS) 2008 R2 Express November 2009 CTP with the “Tools Only” option from here.
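Either way, pointing development storage at your chosen instance is a one-line DSInit.exe invocation. Here’s a sketch (the named-instance name is a placeholder; flag spellings follow the Azure SDK documentation of the time, so double-check against your SDK version):

```bat
REM Point development storage at the default SQLEXPRESS instance:
DSInit /sqlinstance:SQLEXPRESS

REM Or target a named SQL Server 2008 R2 CTP instance, recreating the
REM development storage database if it already exists:
DSInit /sqlinstance:SQL2008R2 /forceCreate
```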

My Windows Azure Table Test Harness App Was Down for 02 Hours and 30 - 40 Minutes Yesterday post of 12/28/2009 describes a serious outage of my Azure Table Test Harness application running in the South-Central US (San Antonio, TX) data center.

Steve Marx explained the reason (a problem with Azure Queues) in his RESOLVED: Recent errors in storage and portal thread of 12/27/2009 in the Windows Azure forum:

This evening some CTP participants with storage accounts in the "South Central US" region received errors from the storage service.  Because the Windows Azure portal relies on the storage service, some operations in the portal resulted in errors as well.  This issue has already been resolved, and no data was lost.

The root cause was a bug in queue storage, which had a cascading effect on blobs and tables for some customers.  We applied a manual workaround to restore service to full functionality, and we're working on a code fix for the underlying bug.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

• Johan Åhlén offers a detailed, well-illustrated tutorial for getting started with SQL Azure Database and SQL Server Management Studio 2008 R2 in his SQL Azure - some tips & tricks post of 12/20/2009, which covers the following topics:

    • What is SQL Azure
    • Connecting to SQL Azure
    • Scripting
    • Copying data
    • [Connection] Encryption
    • Connection closing
    • Collations

Johan describes his live Windows Azure application in his Nyhetskoll - my contribution to the Windows Azure Developer Challenge post of 11/13/2009.

Harish Ranganathan explains how to integrate SQL Azure Database and Windows Azure techniques in his detailed Moving your ASP.NET Application to Windows Azure – Part I tutorial of 12/18/2009:

Earlier I had written 2 posts – Taking your Northwind Database to SQL Azure and binding it to an ASP.NET GridView Part I and Part II .  I thought [I would] complete the series with a post on moving your ASP.NET [SQL Azure] Application as well to Windows Azure making it a truly cloud based application. …

Moving your ASP.NET Application to Windows Azure – Part II of 12/19/2009 begins:

In the previous post I had described the steps to secure your Windows Azure tokens and get the necessary Visual Studio templates as well as making your web application Azure ready by adding the cloud project and building against it.

Once you have tested the Development Fabric, the instances as well as the application, the next step would be to publish it to the Windows Azure platform.  Select the “CloudService1” project that you added to the solution, right click and select “Publish”

Ben Riga continues his Azure Lessons Learned video series with this Azure Lessons Learned: Embarcadero post of 12/18/2009 that features Embarcadero Technologies:

Database tooling is important for many developers and DBAs as they manage numerous databases across the enterprise and the cloud.  In this episode of Azure Lessons Learned I chat with Scott Walz, Sr. Director Product Management at Embarcadero Technologies responsible for the DBArtisan product.

Scott walks us through the DBArtisan product to show how SQL Azure integrates seamlessly into this cross-DBMS product.  It was interesting to hear how quickly the effort to add SQL Azure went.  I think that bodes well for other tooling in general for SQL Azure.  Since SQL Azure is so very close to SQL Server it should be relatively simple for ISVs to add SQL Azure support to products that support SQL Server today.

See Rackspace Partners with FathomDB for Database in the Cloud in the Other Cloud Computing Platforms and Services section.

Bruce Kyle explains how to Share Real-time Premium Data with Codename "Dallas" in this 12/18/2009 post to the US ISV Community blog:

Microsoft® Codename "Dallas" is a new service allowing developers and information workers to easily discover, purchase and manage premium data subscriptions in the Windows Azure platform.

Dallas was announced at Professional Developers Conference (PDC09).

Dallas is an information marketplace that brings data, imagery, and real-time web services from leading commercial data providers and authoritative public data sources together into a single location, under a unified provisioning and billing framework. Additionally, Dallas APIs allow developers and information workers to consume this premium content with virtually any platform, application or business workflow. …

For an illustrated tour of Codename “Dallas,” see my Codename “Dallas” Developer Portal Walkthrough post of 12/17/2009:

Harish Ranganathan explains Binding Entity Framework to your SQL Azure Database – Visual Studio 2010 Beta 2 in this 12/17/2009 post:

If you have used the Entity Framework that shipped with Visual Studio 2008 SP1, you would really start appreciating the flexibility it offers for building schema driven data access layer and get it to the UI Layer either directly or using a middle tier such as WCF RIA Service.   Check my earlier post on this, if you are interested further :)

Meanwhile, the other exciting stuff that has been around is the SQL Azure which is part of the Windows Azure platform.  SQL Azure provides relational data over the web which means, the Database is hosted, maintained and all is done by us and you get to store your database and query the same as if you were running it in your local Data Center or server.  Of course, SQL Azure is currently CTP and you can get free access to it if you have the Azure Tokens.

While I had earlier written about Migrating your database to SQL Azure that example used an ASP.NET front end which had a GridView doing direct data binding with SQL DataSource.   Obviously, one would want to use some of the more abstract controls such as LINQ DataSource / Entity DataSource. …
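Whichever data source control you choose, remember that SQL Azure connection strings differ from on-premises SQL Server: the server name takes a tcp: prefix and the login uses the user@server form. A sketch with placeholder server, database, and credentials:

```text
Server=tcp:myserver.database.windows.net;Database=Northwind;
User ID=myuser@myserver;Password={your password};
Trusted_Connection=False;Encrypt=True
```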

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

Eran Hammer-Lahav delivers the first chapter, “Protocol History” of his forthcoming Authoritative Guide to OAuth book in his Sneak Peek: The Authoritative Guide to OAuth post of 12/20/2009.

The new book will replace Eran’s OAuth Beginner’s guide of 10/2007. Windows Identity Foundation (WIF) now supports Web Resource Authorization Protocol (WRAP) v2.0, a related enterprise-oriented protocol supported by Microsoft, Yahoo and Google, which derives from OAuth.

Eran offers his negative view of WRAP in WRAP, and the Demise of the OAuth Community of 11/23/2009. Microsoft’s Dare Obasanjo compares the issue of the OAuth vs. OAuth WRAP APIs with a fictitious Facebook fork of the Twitter API in his Some Thoughts on the Twitter API as a "standard API" for microblogging post of 12/21/2009:

Things get even more interesting if Facebook actually did decide to create their own fork or "profile" of the Twitter API due to community pressure to support their scenarios. Given how this has gone down in the past such as the conflict between Dave Winer and the RSS Advisory board or more recently Eran Hammer-Lahav's strong negative reaction to the creation of OAuth WRAP which he viewed as a competitor to OAuth, it is quite likely that a world where Facebook or someone else with more features than Twitter decided to adopt Twitter's API wouldn't necessarily lead to everyone singing Kumbaya.

Mario Szpuszta of Microsoft Austria (a.k.a mszCool) presents an alternative view in his Live from PDC 2009 in L.A. – Windows Identity Foundation Released and further cool announcements… post of 11/18/2009 which supports WIF’s use of OAuth WRAP.

The Azure AppFabric Team described a New Windows Azure platform AppFabric SDK V1.0 posted on 12/18/2009 as follows:

This SDK includes API libraries and samples for building connected applications with the .NET platform. It spans the entire spectrum of today’s Internet applications – from rich connected applications with advanced connectivity requirements to Web-style applications that use simple protocols such as HTTP to communicate with the broadest possible range of clients.

Technical details, rather than propaganda, would be appreciated.

The “Geneva” Team confirms Vibro’s report (see below) in their Announcing the AD FS 2.0 Release Candidate and More of 12/18/2009:

We are happy to announce several updated federated identity product releases that are available NOW!

The team’s Announcing WIF support for Windows Server 2003 !! post of the same date confirms an earlier (11/2009) post:

We are glad to announce that Windows Identity Foundation (WIF) RTW for Windows Server 2003 is available NOW! This release supports both Windows Server 2003 SP2 and Windows Server 2003 R2 platforms and following seven languages: English (en-us), German (de-DE), Spanish (es-ES), French (fr-FR), Italian (it-IT), Dutch (nl-NL), and Japanese (ja-JP).

You can download the language specific WIF RTW packages for Windows Server 2003 from here.

Vittorio Bertocci reports that Active Directory Federation Services (ADFS) v2.0 is almost cooked in his ADFS 2.0 RC is Here! post of 12/18/2009:

ADFS RC

The release candidate is always an important milestone for a product: if possible, it is even more so for a component as essential as your identity provider or your federation provider, which must be absolutely rock solid, secure, always available... you know the drill.

Well, good news everyone! From this morning Active Directory Federation Service 2.0 is officially in Release Candidate mode: read the announcement here and download bits & goodies from here.

As you’ve come to expect, the Id Element is here to provide in video the juicy details, directly from the protagonists:

Active Directory Federation Services (ADFS) 2.0 finally reached the Release Candidate phase! This special episode of the Id Element is all about the new features introduced in the RC: Matt Steele, Senior PM in the ADFS team, makes his second appearance on the show and gives us an insider view on how the feedback on Beta2 helped to improve the product. From SAML protocol interop to farms and certificates management, going through new authorization capabilities and improved user experience, in this release there's something for everybody!

The video is available here. [Emphasis Vibro’s.] …

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

• The Windows Azure Team’s Azure at PDC 2009: Replays Now Available! post of 12/20/2009 recaps the session videos for Windows Azure and SQL Azure with brief descriptions of their content.

Neil MacKenzie explains the Service Management API in Windows Azure in this 12/19/2009 detailed tutorial, which covers the following topics:

    • Authentication
    • RESTful Interface
    • Service Management Operations
    • Get Operation Status
    • Create Deployment
    • Get Deployment
    • Change Deployment Configuration
    • Delete Deployment

Neil provides sample C# code and configuration sections where appropriate and concludes:

Hopefully, Microsoft will release a higher-level .NET API to supplement this low-level REST API.
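Until then, the REST conventions Neil describes boil down to three moving parts: a management endpoint keyed by subscription ID, a client certificate for authentication, and an x-ms-version header on every request. Here’s a minimal Python sketch of that plumbing (the subscription ID and certificate path are placeholders, and this is my illustration of the API’s conventions, not Neil’s C# code):

```python
import ssl
from http.client import HTTPSConnection

MANAGEMENT_HOST = "management.core.windows.net"
API_VERSION = "2009-10-01"  # version header required by the CTP-era API


def build_request(subscription_id, operation_path):
    """Return the URL path and headers for a Service Management call."""
    path = "/%s/%s" % (subscription_id, operation_path.lstrip("/"))
    headers = {"x-ms-version": API_VERSION}
    return path, headers


def list_hosted_services(subscription_id, cert_file):
    """GET the hosted-services list, authenticating with a client cert."""
    path, headers = build_request(subscription_id, "services/hostedservices")
    context = ssl.create_default_context()
    context.load_cert_chain(cert_file)  # the API management certificate
    conn = HTTPSConnection(MANAGEMENT_HOST, context=context)
    conn.request("GET", path, headers=headers)
    return conn.getresponse()


# Request construction only (no network call):
path, headers = build_request("00000000-0000-0000-0000-000000000000",
                              "services/hostedservices")
```

The same build_request helper covers the other operations Neil lists (Get Operation Status, Create Deployment, and so on) by swapping the operation path.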

The ADO.NET Data Services (Astoria) Team’s Update on the Data Services Update for the .NET 3.5 SP1 announces:

We’ve found an issue with the update we released earlier this week and as a result we have removed the update from the download site while we address the issue. We will make an updated version of the download available as soon as possible.

The issue is due to a change to the IDataServiceHost interface and only affects existing Data Services that have a “custom host” (i.e. directly implement the IDataServiceHost interface). This issue does not affect Data Services that use the standard WCF/ASP.NET host (the host that your Data Service will have if you have used the built in tools in Visual Studio to create your service). The issue causes services that use “custom hosts” to fail to initialize.

We believe this issue affects only a small number of existing data services but it affects enough that we have made the decision to remove the update from the download page until we have addressed the problem. We are currently working on the fix and will have an updated version available as soon as possible.

In the meantime, the latest CTP Download is available here: http://www.microsoft.com/downloads/details.aspx?FamilyID=a71060eb-454e-4475-81a6-e9552b1034fc&displaylang=en.

• Christofer Löf describes CRUDing with “ActiveRecord for Azure” in this 12/19/2009 post:

In my previous blog post I introduced you to my little experiment – a sample implementation of the ActiveRecord pattern for the Windows Azure Storage system which I call “ActiveRecord for Azure”. (It’s easier to refer to something if it has a name – right?). In this post I want to elaborate a little bit further on the features previously mentioned. Since most of us associate the ActiveRecord pattern with MVC style apps I’m going to show the Create, Read, Update and Delete (CRUD) support by implementing a simple Task List application using ASP.NET MVC. …

• Rob Gillen’s Windows Azure, Climate Data, and Microsoft Surface post of 12/18/2009 describes an Azure visualization application:

… I built a simple visualization app that does a real-time query against the data in Azure and displays it. Originally the app was built as a simple WPF desktop application, but I got to thinking that it would be particularly interesting on the Surface and therefore took a day or two to port it over. The video … is a walkthrough of the app – the dialog is a bit cheesy but the app is interesting as it provides a very tactile means of interacting with otherwise stale data.

• Soumow Atitallah (@soumow) deployed the Tunisian Silverlight Developers Blog live to Azure (Silverlight – TN) on 12/19/2009, according to a tweet of the same date.

Archetype posted a live, personalized Archetype Holiday Card app to Windows Azure on 12/17/2009:

You can personalize the cookie’s shape and decorations, and share it with friends by e-mail or Facebook. Why not Twitter?

<Return to section navigation list> 

Windows Azure Infrastructure

• Julie Bort asserts “By 2011, 89% of 212 enterprises surveyed plan to use W7 but most are also leaning toward Google for cloud computing” in her Most business will adopt Windows 7 by 2011, but prefer Google's cloud post of 12/18/2009 to NetworkWorld’s Microsoft Subnet blog:

As for cloud computing, the news isn't completely bleak for Microsoft. It has its biggest foothold in its most coveted customer, the large enterprise with $1 billion or greater in annual revenues. Although only one-quarter of the total respondents said they were interested in Azure in 2010 for hosted Microsoft apps, most of those interested (14%) were large enterprises. This is a big jump from the last CIO Zone survey on SaaS in June when so few details of Azure were known. At that time, most organizations said they thought they would be heavily using Google in 2010.

On the other hand, you can also say the glass is half empty for Azure. Even among the [largest], wealthiest companies, most plan to use Google, and use it more heavily, too. When asked how much usage they expected to give a specific cloud computing platform in 2010, on a scale of 1-5, with 1 indicating greatest usage and 5 indicating least usage, large enterprises ranked their planned Google usage at 3.03. They ranked their planned Azure usage at 3.48. Planned Amazon usage came in third at 3.49.

The problem with Julie’s analysis is its failure to distinguish Google Apps from Google App Engine, which is the actual Windows Azure competitor. Windows Azure isn’t intended to “host Microsoft apps,” which I assume means Office productivity applications, a.k.a. Office Web Apps. That will be Microsoft Office Live’s job. Windows Azure is intended to host custom cloud-computing applications written primarily by .NET developers.

Julie replied to my comment in her thanks, Roger, but I didn't confuse those two post of 12/21/2009.

Eric Golpe announced that “customers will be able to use their Azure benefits for normal (production) use” in his Windows Azure for MSDN Premium Subscribers & BizSpark Members post of 12/19/2009:

Starting January 4th, MSDN Premium subscribers and BizSpark members in 21 countries can sign up for Windows Azure Platform benefits. Previously, we communicated that the Azure benefit usage for subscribers would be limited to development and testing. We’ve lifted this restriction so that customers will be able to use their Azure benefits for normal (production) use! In doing so, they won't need a separate account to transition between development and production; however, customers cannot consolidate or pool the Azure benefits from multiple subscriptions onto one account.

Updated details on Azure benefits for MSDN subscribers are here.

• Jeffrey Schwartz reports “As CIOs largely reject the early crop of cloud services for business-critical apps, Redmond readies private and hybrid cloud platforms” in his Microsoft's Private Cloud Formation post to Visual Studio Magazine’s In Depth blog of 12/17/2009:

As Microsoft rolls out its Windows Azure and SQL Azure public cloud services in January 2010, the first implementers will likely include those building greenfield Web 2.0-type apps as well those who develop and test software looking for capacity on demand. But for cloud computing to take hold in the enterprise for business-critical applications, Microsoft knows it must extend Windows Azure to integrate securely and seamlessly with internally hosted systems.

Hence, the next phase of Windows Azure will enable enterprises to build private and hybrid clouds with a new set of deliverables that will evolve throughout 2010 and likely into the following years.

The allure of cloud services is that they provide infrastructure on demand and remove the capital and administrative requirements of running internal systems. Yet the vast majority of CIOs say they simply can't put certain types of applications and data into the current incarnation of cloud services.

"It's going to be a tough sell," says John Merchant, assistant vice president at The Hartford Financial Services Group Inc., a large insurance company. "As a Fortune 500 company with highly regulated data and a very conservative outlook, it's going to be difficult for any insurance company or any financial institution of any size to migrate any data to the cloud."

On a panel in November at Interop New York addressing the top cloud engineers at Amazon.com Inc., Google Inc. and Microsoft, Rico Singleton, deputy CIO for the State of New York, asked: "Can you give me a private cloud that can provide all the benefits that you provide now on my private network closed to the outside, and still be able to give me similar ROI?" The answer by top cloud engineers at Microsoft, Amazon and Google was a resounding: "Not yet." …

• Scott Guthrie reports in an Installing .NET 4 on Windows Azure comment to his Visual Studio 2010 and .NET 4 Update post of 12/17/2009:

We are working with the Azure folks right now to try and get .NET 4 installed on it as soon as possible.  Unfortunately I don't have an exact ETA yet.

Information Week: Analytics presents their Strategy: Outlook 2010 Cloud Computing Brief:

The results of InformationWeek Analytics’ Outlook 2010 survey, where we asked 360 business technology pros about their plans for the year ahead, don’t make you want to break out the party hats and blowers. But there are some signs that IT spending will at least level off and that customer-facing and sales-supporting projects will be on the rise. Compare that to last summer, when we heard a lot about cost-cutting infrastructure projects and renegotiations with vendors but not a lot about IT initiatives that drive growth.

In terms of emerging technology, cloud computing’s momentum is real, as markedly more IT pros are considering it than they were a year ago. Data center innovation remains a high priority. Despite some optimism, the IT hiring outlook remains weak, and if there’s budget cutting ahead, IT will take its share of the lumps. …

The price is US$99.00

Reuven Cohen’s Introducing the Private Partner Cloud post of 12/18/2009 describes a South Korean mobile service provider’s cloud infrastructure for developing mobile phone applications for the provider’s network:

I'm currently in Seoul South Korea for a variety of meetings with SK government and technology industry folks. Yesterday I had a very interesting meeting with the largest South Korean mobile provider. During the meeting they described a great potential use case for telecom focused IaaS cloud offerings. Basically what they've done is created an on demand compute infrastructure specifically for their network of mobile application developers. The service is offered free of charge to their partners and provides all the tools necessary for the development, testing and deployment of mobile applications specifically tailored to their particular mobile network environment. This may be one of the best use cases for semi-private cloud I've heard of.

In a sense they're subsidizing the infrastructure costs for mobile application developers they work with. They are basically covering the costs associated with the more routine aspects of mobile app development while also empowering a new and broader group of potential partners by providing a quick and easy way to develop applications for their environment. Another advantage is in gaining a greater pool of potential network specific applications & developers. Very smart. …

Agreed. I believe Microsoft should provide Windows Mobile developers with some free Azure bandwidth and support for creating WinMo apps.

Frank Gens’ New IDC IT Cloud Services Survey: Top Benefits and Challenges post of 12/15/2009 begins:

This year’s IDC IT cloud services survey reveals many of the same perceptions about cloud benefits and challenges as seen in last year’s survey.  But there are a few interesting  shifts this year, driven largely by: 1) budget pressure from the challenging economy, and 2) a growing sophistication in users’ understanding of cloud services.

This year’s survey was fielded, like last year’s, from the IDC Enterprise Panel of IT executives and their line-of-business (LOB) colleagues.  The respondent population is very similar to that of last year’s survey, validating comparisons with last year’s results.

Economics and Adoption Speed Still Top Benefits; Standardization Moves Up

This year’s survey shows, once again, that economic benefits are key drivers of IT cloud services adoption. Three of the top five benefits were about perceived cost advantages of the cloud model: pay for use (#1), payments streamed with use (#3) and shift of IT headcount and costs to the service provider (#5).

IDC Cloud Survey 2009 Benefits

While pay-for-use slightly edged out last year’s #1 – easy/fast to deploy – these two are essentially in a tie for #1. It’s pretty safe to ascribe the slight edge for pay-for-use to the enormous pressure that the Great Recession has put on IT budgets, and the consequent increased focus on cloud economics in the minds of customers.  But it’s still clear that speed/simplicity of adoption remains a key driver of demand for cloud services. …

That last observation bodes well for Windows Azure and SQL Azure.

Mary Jo Foley includes Azure in her Microsoft products worth watching in 2010 article for Redmond Magazine, which I missed when published:

… Microsoft is launching the final version of its cloud-based hosting platform, Azure, next month. Live Mesh -- the consumer-focused collaboration and synchronization service that will be one of Microsoft's first Azure-based offerings -- is supposed to be a proof point for the platform. Both Azure and Live Mesh are Chief Software Architect Ray Ozzie's pet projects. Microsoft has taken a different tack than other cloud vendors like Amazon and Google. Instead of simply providing data center space and resources, Microsoft is trying to build a cloud platform that's similar to Windows and .NET. The company hopes developers will want and need an OS, a database, collaboration and other building blocks. …

I’m more sanguine about the prospects for Azure than Live Mesh.

<Return to section navigation list> 

Cloud Security and Governance

• Michael Krigsman’s Modern SOA governance: Adoption and measurement post of 12/18/2009 to the Enterprise Irregulars blog claims:

Recent discussions have brought attention to the important and evolving role of governance in the world of service-oriented architecture (SOA).

“Modern conceptions of SOA governance,” to borrow a phrase from SOA expert, Dion Hinchcliffe, recognize that technical architecture is only one component of successful adoption. Achieving deeper success involves bringing technology deployment into conformity with business needs.

Governance is critical to successful SOA adoption. For this reason, I brought together Dion and Software AG’s Vice President and Chief Strategist, Miko Matsumura, for a podcast discussion on this topic. Together, the three of us explored interconnections between SOA, technology, business, and trust. …

Information Week: Analytics presents their Research: Cloud Governance, Risk and Compliance report:

Navigating the Storm: Governance, Risk and Compliance in the Cloud
Q: What’s more fashionable than government bailouts, Twitter, hybrids and pimping your greenness?

A: Cloud computing, that sexy new IT concept that everyone is talking about, but no one seems able to clearly define.

Besides buzzwords like SaaS (software as a service), PaaS (platform as a service) and IaaS (infrastructure as a service), cloud computing provides IT groups with extra potential layers of abstraction, extremely complex interdependency models—and an unsettling level of uncertainty about where our data goes, how it gets there and how protected it will be over time. If you’ve got a nagging feeling that much of the current discussion seems new, yet somehow strangely familiar, you aren’t losing your mind. We struggled through similar issues a few years ago when application service providers were all the rage. This time around, when it comes to defining the scope of the phenomenon, the only thing all parties seem to agree on is that cloud computing represents something that is not local—not at your site. This oversimplification is understandable given that, for network engineers, the generic cloud icon has for decades represented everything from foreign networks and remote sites to the rats’ nests we really don’t want anyone asking about. …

This report appears to be free to download.

<Return to section navigation list> 

Cloud Computing Events

• Upperside Conferences: Call for Papers for Cloud Telco 2010 to be held at the Novotel Convention Center and Spa, Paris CDG France on 6/1 – 6/4, 2010:

Several major telcos have recently announced their wish to enter the cloud computing services market. During the past years, operators have moved beyond broadband voice and data into network-dependent applications like videoconferencing and telepresence and have secured deeper enterprise relationships. Providing hosted solutions to companies is the next move many carriers are considering today.

Indeed, telecommunication providers could play an important and lucrative role in the burgeoning world of cloud computing by combining their natural advantages as network operators with a new wave of technological innovation. The opportunity represented by cloud-based services is potentially immense because, for starters, it increases the value of carrier networks in multiple ways and creates new roles and revenues for telecom service providers. …

The call for proposals is online.

Sergey Barskiy reports in his Upcoming talks post of 12/17/2009 that he will present sessions about:

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Bob Evans reports “Ellison speaks out on Oracle's new Sun-enabled strategy and how that points to where the entire IT industry is headed” in his 12/18/2009 Oracle CEO Larry Ellison On The Future Of IT Global CIO column for InformationWeek:

… Oracle founder and CEO Larry Ellison spoke in considerable detail about how his vision of the computer industry of the future is centered on the idea of optimized systems that provide high value to customers because they don't need to do or pay for a lot of systems integration, and in return provide high margins to the providers.

Ellison also quite casually wove the terms "private clouds" and "cloud computing" into his strategic overview without lampooning them, which was a big step forward even though Ellison's discomfort with the term is shared by IBM CEO Sam Palmisano and Hewlett-Packard CEO Mark Hurd. It was a big step because whatever his personal misgivings over cloud terminology might be, it's a name and concept that has truly begun to fire the imagination of customers and industry players alike, and the combination of Ellison's new acceptance of the term combined with his ambitious plans for Oracle to become a major supplier of cloud systems can only accelerate that already forceful trend. …

• James Hamilton contrasts the openness of storage subsystems with networking hardware in his Networking: The Last Bastion of Mainframe Computing post of 12/19/2009:

The networking world remains one of the last bastions of the mainframe computing design point. Back in 1987 Garth Gibson, Dave Patterson, and Randy Katz showed we could aggregate low-cost, low-quality commodity disks into storage subsystems far more reliable and much less expensive than the best purpose-built storage subsystems (Redundant Array of Inexpensive Disks). The lesson played out yet again where we learned that large aggregations of low-cost, low-quality commodity servers are far more reliable and less expensive than the best purpose-built scale up servers. However, this logic has not yet played out in the networking world.

The networking equipment world looks just like the mainframe computing ecosystem did 40 years ago. A small number of players produce vertically integrated solutions where the ASICs (the central processing unit responsible for high speed data packet switching), the hardware design, the hardware manufacture, and the entire software stack are single sourced and vertically integrated. Just as you couldn’t run IBM MVS on a Burroughs computer, you can’t run Cisco IOS on Juniper equipment.

James’ article offers an interesting counterpoint to Larry Ellison’s paean to Sun’s proprietary hardware approach to cloud computing (see above):

"So customers will be able to buy high-end SMP machines that are high-performance and high-value, or a high-end private cloud, with all of the pieces including processing, storage, and networking integrated together with Oracle-slash-Sun software. We think that will heavily differentiate our offerings from the offerings of IBM, HP and Dell, and we think we're gonna be able to compete very effectively there and that will deliver high margins and allow us to deliver that $1.5 billion additional profit in our first full year of owning Sun."

• Bob Evans hints “Its 13 petabytes include archived data from the world's top banks and pharma companies, and it's growing rapidly. The owner's name starts with A -- but it's not Amazon” as the answer to his The World's Largest Private Cloud: Who's Number One? question posed in the second “The Cloud Imperative” article of 9/16/2009 for InformationWeek’s Global CIO column:

Leaning hard into the cloud-computing phenomenon that has become the major business-technology theme for 2010, Autonomy Corp. is claiming to be King of the Cloud by virtue of its massive Digital Safe archiving system, which spans 6,500 servers across seven data centers and handles 3 million new files per hour. …

And that private-cloud beast is only in the early stages of an astonishing growth spurt: just 8 months ago, it was at 10 petabytes. And Autonomy CMO Nicole Eagan says the surge to the cloud for archiving has only just begun.

Cloud-based data archiving at this scale is a significant vote of confidence for cloud computing in general.

Danny Tuppeny’s Microsoft Windows Azure vs Google App Engine: Pricing post of 12/18/2009 concludes:

I really hope Microsoft re-evaluate their pricing for small apps. It's too expensive to play around with small prototypes at those prices, whereas Google's offering will let me get started completely free, until my app is churning a considerable amount of traffic, and even then, it'll work out cheaper for the same processing/transfer.

Sorry Microsoft. I love .NET and Visual Studio, but Google App Engine is just so easy and cheap that it's going to be my "toy of choice" for my hobby coding for the immediate future!
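Danny’s complaint is easy to quantify. A rough comparison for an always-on hobby app, assuming the launch-era Windows Azure small-instance rate of $0.12 per compute-hour (an assumption here — check the current price lists) against App Engine’s free quota:

```python
# Back-of-the-envelope monthly compute cost for an always-on hobby app.
azure_small_per_hour = 0.12   # assumed Azure small-instance launch rate (USD)
hours_per_month = 24 * 30

azure_monthly = azure_small_per_hour * hours_per_month
gae_monthly = 0.0             # App Engine's free quota covers a small app

print(f"Azure: ${azure_monthly:.2f}/month")   # Azure: $86.40/month
print(f"GAE:   ${gae_monthly:.2f}/month")     # GAE:   $0.00/month
```

At roughly $86 a month before storage and bandwidth, an idle prototype on Azure costs real money, which is exactly the gap Danny is pointing at.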

• Mikael Ricknäs asserts “Amazon's CloudFront supports on-demand streaming, will add live events next year” in his Amazon adds media streaming to content delivery service post of 12/16/2009 for the IDG News Service:

Amazon Web Services has added support for audio and video streaming to the beta version of CloudFront, its Web service for content delivery, the company said on Wednesday.

The support for streaming is based on Adobe's Flash Media Server. Today, the service supports on-demand streaming, but Amazon plans to add support for live streaming next year, it said.

To stream content customers must first store the original copies of their movies and songs on Amazon's S3 (Simple Storage Service), and then enable streaming of the content using the AWS Management Console or Amazon's APIs (application programming interfaces) for CloudFront, according to Amazon.

CloudFront can stream content from 14 locations in the U.S., Europe, Hong Kong and Japan. Users are automatically sent to the best location, Amazon said. …

Salvatore Genovese reports “Orange announces complete cloud computing services, from infrastructure to real-time business applications” in his Orange Sets Out Its Ambitions in Cloud Computing post of 12/18/2009, which reads more like a press release:

Orange is the key brand of France Telecom, one of the world’s leading telecommunications operators. With 126 million customers, the Orange brand now covers Internet, television and mobile services in the majority of countries where the Group operates.

Leveraging its cloud-ready network, Orange is best placed to provide enterprises with simpler, safer and more flexible cloud services.

Orange Business Services, which has already rolled out successful cloud services, such as IT Plan (desktop virtualization) and Flexible Computing (hosted virtualized infrastructure), will launch a dozen new cloud computing services in the coming 24 months, covering six main areas including real-time applications, collaboration, security, infrastructure, cloud-ready networking and vertical solutions for specific industries. …

Geva Perry offers his Thoughts on Amazon EC2 Spot Instances in this 12/28/2009 post:

The innovation just keeps on coming from the good folks at Amazon Web Services. This week they announced a new pricing model for Amazon EC2 instances: spot pricing. Spot pricing is the third pricing model Amazon is offering for EC2 instances -- with On-Demand and Reserved being the other two -- and it brings us closer to an efficient and commoditized IT infrastructure market, and it got my mind racing on the various possibilities of it, and where it goes if taken to its logical conclusion.

You can read explanations of what it is and how it works from Jeff Barr, Werner Vogels, RightScale, Reuven Cohen and James Urquhart.

James has a very succinct explanation of the key tenets of the new offering:

  • Each customer sets a maximum price he or she is willing to pay for "spot instances."

  • Amazon sets a "spot price" for instances hour-by-hour, based on available supply and demand.

  • Customers pay whatever the spot price is up to their maximum price. So, if someone bids $0.07/hour, and the spot price is $0.05/hour, the person pays $0.05/hour.

  • If the spot price exceeds the customer's maximum price, the customer's instances are terminated.

I had to open my old finance textbook from business school and think of all sorts of possibilities: call options, put options, futures, and other forms of derivatives and hedging techniques. It will be interesting to see if any of those evolve over time. By the way, there already is a real-time ticker for Amazon spot pricing, called Cloud Exchange. But here are some thoughts on issues that are relevant in the shorter term. …
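James’s four tenets above reduce to a small decision function. This sketch assumes the simplest reading of the rules (the function name is illustrative, not an AWS API):

```python
def spot_outcome(bid, spot_price):
    """Apply the spot-instance rules: you pay the spot price, never
    your bid; if the spot price exceeds your bid, the instance is
    terminated (no charge for that hour)."""
    if spot_price > bid:
        return (False, None)       # outbid: instance terminated
    return (True, spot_price)      # runs: charged spot, not bid

print(spot_outcome(0.07, 0.05))    # (True, 0.05): bid $0.07, pay $0.05
print(spot_outcome(0.07, 0.09))    # (False, None): outbid, terminated
```

The asymmetry — you cap your exposure but pay the market rate — is what makes the option/futures analogies Geva reaches for feel so natural.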

Geva goes on to discuss Workloads and Bidding activities.

Kent Langley points to a fledgling spot exchange cloud market report in his AWS EC2 Spot Price Visualization Site and a few thoughts about CPU cycles post of 12/18/2009:

This is rather interesting to see.  Someone already put up a set of live charts keeping track of the AWS compute instances.

http://cloudexchange.org

What's interesting to me is that the same resource can have different prices in different regions (obvious, but interesting) and that in many cases (if not all) the costs are substantially below the retail rate for the same instance.

For example: us-east-1, c1.xlarge, $0.25 / hour.  The retail for that is $0.68 per hour.  Nice discount. …
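Kent’s example works out to a discount of roughly 63% off the on-demand rate:

```python
spot = 0.25        # observed c1.xlarge spot price, us-east-1 (USD/hour)
on_demand = 0.68   # on-demand ("retail") rate for the same instance

discount = 1 - spot / on_demand
print(f"discount: {discount:.0%}")   # discount: 63%
```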

Liam Eagle reports Rackspace Partners with FathomDB for Database in the Cloud with rates as low as US$0.02/hour ($14.40/month) in this 12/18/2009 post:

The Rackspace Cloud (www.rackspacecloud.com), the cloud hosting division of Rackspace Hosting (www.rackspace.com) announced on Thursday that it has partnered with FathomDB (www.fathomdb.com), which it calls a pioneer in the realm of database as a service technology, to create a version of FathomDB’s cloud database offering using Rackspace’s cloud hosting solutions.

Rackspace says FathomDB helped to launch the database as a service business by creating a user interface and analytics engine that support MySQL databases in the cloud, initially powered by Amazon’s EC2 and S3 cloud products, but now also using Rackspace’s Cloud Servers product.

Rackspace says that, built on the Rackspace Cloud’s API, the new FathomDB offering will provide “a seamless database management experience” using Rackspace’s Cloud Servers and simplifying administration tasks. The FathomDB offering handles a long list of database tasks that includes automated backup and routine maintenance, analytics tools, real-time monitoring and performance reporting and simple configuration tools.

The release describes FathomDB as a “strategic partner,” rather than simply as a customer, which would suggest that the relationship goes a little deeper than just an application optimized to work with Rackspace’s cloud API. At least part of that relationship appears to be the inclusion of the solution within Rackspace’s “cloud tools” ecosystem. …

MG Siegler reported Rackspace Goes Down. Again. Takes The Internet With It. Again. for TechCrunch later on 12/18/2009:

Another day, another Rackspace outage. The hosting company had a complete and total failure today that took down a number of big sites on the Internet, including ours. This has been happening all too often in recent months, including downtime just last month.

The failure apparently originated in the company’s Dallas-area server farm. But unlike previous times, this does not appear to be a power issue, the company says. Some other sites that are currently affected include: 37signals, Brizzly, Scoble’s blog, all of the sites hosted by Laughing Squid, Tumblr custom domains, and many others.

This is another black eye for the company, though they are generally responsive with other issues we’ve had throughout our time with them. But until they can prove to be more reliable, we’ve decided to get a backup version of TechCrunch up and running at another datacenter, for when someone inevitably trips over a power cord at the Dallas Rackspace center again.

MG continues with a “few updates from the company” starting at 3:45 PM CST.

Bruce Guptill and Charlie Burns ring in about Amazon EC2 Spot Instances Enables and Demands, Change in Cloud Buying and Use on 12/18/2009 in this Saugatuck Research Alert (site registration required):

What is Happening?  On December 14, 2009, Amazon Web Services (AWS) announced a new Cloud Computing offering, Spot Instances, intended to complement their previous AWS offerings (On-Demand Instances and Reserved Instances).

Spot Instances introduces a dynamic model for pricing, selling, buying, and using unused Amazon EC2 capacity: dynamic pricing for dynamic resources. Buyers bid for one or more EC2 instances based on a price that they are willing to pay. Based on the supply and demand, Amazon sets the prices for these unused resources. The prices can be expected to fluctuate periodically based on levels of demand, time of the day, and other typical resource use factors. If a user’s bidding price exceeds the spot price set by Amazon, their instances will be run and they will be charged the current spot price, not their bid. When the spot price goes over that bidding price, the instances will be terminated. If and when prices come back down, the user’s instances can run again, automatically.

AWS sees Spot Instances as best-suited for such non-time-dependent batch processing jobs as software development testing, scientific research, video rendering, and financial modeling, and massive data analysis (e.g., seismic data). Saugatuck sees Spot Instances as a harbinger of Cloud disruptions to come, especially for buyers and for Cloud services providers. …

Bruce and Charlie continue with analyses of Why is it Happening? and Market Impact.

Sun Microsystems published a rogues’ gallery of their Sun Cloud team members and offers links to several whitepapers in a recent (undated) Sun Cloud post to the Cloudbook blog. Here are the whitepaper offerings:

Sun also offers a video peek into their data center: A Look Inside the Sun Cloud - June 2009:

A tour of SuperNAP, the datacenter that is home to the Sun Cloud. Highlights the security, availability and energy efficiency features of the facility.

<Return to section navigation list>