Monday, September 14, 2009

Windows Azure and Cloud Computing Posts for 9/7/2009+

Windows Azure, Azure Data Services, SQL Azure Database and related cloud computing topics now appear in this weekly series.

•• Update 9/12/ and 9/13/2009: Live CloudDotNet sample project, Wade Wegner’s Azure presentation, NY Times and Hadoop, and more.
• Update 9/10/ and 9/11/2009: Sample Azure app down 25 minutes overnight, Dell offers migration from paper to cloud-based medical record storage, NHibernate Shards, SQL Azure Reference Data (OGDI), Microsoft’s Chicago data center goes live.
Note: Articles for 9/7/2009 have been duplicated from Windows Azure and Cloud Computing Posts for 9/3/2009+.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

  • Azure Blob, Table and Queue Services
  • SQL Azure Database (SADB, formerly SDS and SSDS)
  • .NET Services: Access Control, Service Bus and Workflow
  • Live Windows Azure Apps, Tools and Test Harnesses
  • Windows Azure Infrastructure
  • Cloud Security and Governance
  • Cloud Computing Events
  • Other Cloud Computing Platforms and Services

To use these links, first click the post title to display the single post, and then click the link for the section you want.

Azure Blob, Table and Queue Services

Dare Obasanjo’s long-awaited (at least by me) Building Scalable Databases: Denormalization, the NoSQL Movement and Digg post of 9/10/2009 begins:

Database normalization is a technique for designing relational database schemas that ensures that the data is optimal for ad-hoc querying and that modifications such as deletion or insertion of data do not lead to data inconsistency. Database denormalization is the process of optimizing your database for reads by creating redundant data. A consequence of denormalization is that insertions or deletions could cause data inconsistency if not uniformly applied to all redundant copies of the data within the database.
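
To make that consistency point concrete, here’s a minimal sketch of my own (a hypothetical Orders/OrderDetails schema, not Dare’s code): once a line-item count is stored redundantly on the order row for cheap reads, every write must touch both copies, typically inside a transaction.

    using System.Data.SqlClient;

    class OrderWriter
    {
        // Hypothetical denormalized schema: Orders carries a redundant
        // LineItemCount column so reads avoid aggregating over OrderDetails.
        // Every insert must update BOTH copies atomically, or they drift apart.
        public static void AddLineItem(string connectionString, int orderId,
                                       int productId, int quantity)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();
                using (var tx = conn.BeginTransaction())
                {
                    var insert = new SqlCommand(
                        "INSERT INTO OrderDetails (OrderId, ProductId, Quantity) " +
                        "VALUES (@oid, @pid, @qty)", conn, tx);
                    insert.Parameters.AddWithValue("@oid", orderId);
                    insert.Parameters.AddWithValue("@pid", productId);
                    insert.Parameters.AddWithValue("@qty", quantity);
                    insert.ExecuteNonQuery();

                    // The redundant copy: skip this and reads stay fast but wrong.
                    var bump = new SqlCommand(
                        "UPDATE Orders SET LineItemCount = LineItemCount + 1 " +
                        "WHERE OrderId = @oid", conn, tx);
                    bump.Parameters.AddWithValue("@oid", orderId);
                    bump.ExecuteNonQuery();

                    tx.Commit();
                }
            }
        }
    }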

His “The No-SQL Movement vs. Abusing Relational Databases for Fun & Profit” topic appears to denigrate SQL Azure:

If you’re a web developer interested in building large scale applications, it doesn’t take long in reading the various best practices on getting Web applications to scale, such as practicing database sharding or eschewing transactions, before it begins to sound like all the advice you are getting is about ignoring or abusing the key features that define a modern relational database system. Taken to its logical extreme, all you really need is a key<->value or tuple store that supports some level of query functionality and has decent persistence semantics. Thus the NoSQL movement was born.

Remember that Microsoft abandoned its Entity-Attribute-Value-based SQL Server Data Services (SSDS), later SQL Data Services (SDS), in favor of a fully relational SQL Azure implementation.

Waiming Mok analyzes non-relational data storage formats in his Data Stores: No One Solution post of 9/7/2009:

As Michael Stonebraker points out in his blog on ACM: there is no one database that solves all problems.

The same thing can be said of data stores: it’s no longer a choice between direct-attached disks, SAN block storage, and NAS file systems. Many of the developments came about because of the use of multiple servers to build web services, cloud computing, and grid computing. The developers found that they can significantly improve system performance by locating data closer to the application or to the edge (where the data is accessed), that they could ease the complexity of development by relaxing consistency constraints (which are not needed) — see CAP Theorem, and that they could significantly lower costs by avoiding or minimizing the use of expensive SAN storage or CDN data delivery.

Waiming concludes his post with a table that briefly describes Hadoop HDFS, Facebook Haystack and Yahoo! MObstor.

• Shawn Wildermuth’s ADO.NET Data Services 1.5 Feature: Projections post of 9/9/2009 begins:

If you've been following my blog, you should know that I am keeping a pretty close watch on ADO.NET Data Services. The team recently released a second CTP of the new version with some interesting features. This CTP has some pretty compelling additions, but I am going to focus on one in particular.

I've been teaching and using ADO.NET Data Services for a long time and I like showing off exposing a LINQ-based provider (Entity Framework, NHibernate or others) to a Silverlight application. While ADO.NET Data Services does expose its API through a REST API, the magic for me is in its use in Silverlight. In case you haven't been following along, you can issue a LINQ query through the Silverlight client (though in fairness, the full power of LINQ is not supported in the client).

Shawn continues with C# examples.
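
For context, here’s roughly what a projection query looks like with the new client library; this is a hedged sketch of my own, not Shawn’s code, and the Customer class and service URI are placeholders:

    using System;
    using System.Data.Services.Client;
    using System.Linq;

    // Placeholder client-side entity; normally generated by DataSvcUtil.
    public class Customer
    {
        public string CustomerID { get; set; }
        public string CompanyName { get; set; }
        public string City { get; set; }
        public string Country { get; set; }
    }

    class ProjectionDemo
    {
        static void Main()
        {
            // With the 1.5 CTP, an anonymous-type projection translates to a
            // $select option, so only the requested properties cross the wire.
            var ctx = new DataServiceContext(
                new Uri("http://example.cloudapp.net/Northwind.svc")); // placeholder

            var query = from c in ctx.CreateQuery<Customer>("Customers")
                        where c.Country == "Germany"
                        select new { c.CompanyName, c.City };

            foreach (var c in query)
                Console.WriteLine("{0} ({1})", c.CompanyName, c.City);
        }
    }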

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Stephen Forte discovers SQL Azure Does Stored Procedures in this 9/11/2009 post:

When SQL Data Services (now SQL Azure) released its first CTP it did not look anything like SQL Server: there were no tables, stored procedures, views, etc. With the new CTP, SQL Azure embraces SQL Server in the sky and supports the relational model, including stored procedures. This is good since there are millions of lines of stored procedures out there in production today and migrating them to SQL Azure is pretty easy.

He continues with a demonstration of creating a simple T-SQL stored proc in SADB.
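
The pattern is the familiar one; here’s a minimal sketch of my own (not Stephen’s exact demo, and the server, database and credentials are placeholders):

    using System;
    using System.Data;
    using System.Data.SqlClient;

    class StoredProcDemo
    {
        static void Main()
        {
            // Plain T-SQL DDL and a CommandType.StoredProcedure call work
            // against SQL Azure unchanged; only the connection string differs
            // from on-premises SQL Server.
            var cs = "Server=tcp:yourserver.database.windows.net;" +
                     "Database=Northwind;User ID=user@yourserver;Password=...;";

            using (var conn = new SqlConnection(cs))
            {
                conn.Open();

                new SqlCommand(
                    "CREATE PROCEDURE dbo.GetCustomer @id nchar(5) AS " +
                    "SELECT CompanyName FROM Customers WHERE CustomerID = @id",
                    conn).ExecuteNonQuery();

                var call = new SqlCommand("dbo.GetCustomer", conn)
                    { CommandType = CommandType.StoredProcedure };
                call.Parameters.AddWithValue("@id", "ALFKI");
                Console.WriteLine(call.ExecuteScalar()); // Alfreds Futterkiste
            }
        }
    }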

• Simon Munro’s The Trouble With Sharding post of 9/10/2009 points out that:

Database sharding, as a technique for scaling out SQL databases, has started to gain mindshare amongst developers.  This has recently been driven by the interest in SQL Azure, closely followed by disappointment because of the 10GB database size limitation, which in turn is brushed aside by Microsoft who, in a vague way, point to sharding as a solution to the scalability of SQL Azure.  SQL Azure is a great product and sharding is an effective (and successful) technique, but before developers that have little experience with building scalable systems are let loose on sharding (or even worse, vendor support for ‘automatic’ sharding), we need to spend some time understanding what the issues are with sharding, the problem that we are trying to solve, and some ways forward to tackle the technical implementation. …

Simon continues with an analysis of sharding techniques and issues, and concludes with:

I am disappointed that the SQL Azure team throws out the bone of sharding as the solution to their database size limitation without backing it up with some tools, realistic scenarios and practical advice.  Sharding a database requires more than just hand waving and PowerPoint presentations and requires a solid engineering approach to the problem.  Perhaps they should talk more to the Azure services team to offer hybrid SQL Azure and Azure Storage architectural patterns that are compelling and architecturally valid.  I am particularly concerned when it is offered as a simple solution to small businesses that have to make a huge investment in a technology and an architecture that they are possibly unable to maintain.

I’m disappointed, too, Simon. Perhaps the SQL Azure team will throw some light on recommended sharding patterns and practices at PDC09.
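
To make the “hand waving” concrete, the minimal piece developers end up writing themselves today is a shard router; a naive sketch of my own (nothing like this ships with SQL Azure):

    using System.Data.SqlClient;

    // Naive static sharding: route each customer to one of N 10GB SQL Azure
    // databases by hashing the shard key. The hard parts Simon describes
    // (rebalancing, cross-shard queries and transactions) are left unsolved.
    class ShardRouter
    {
        private readonly string[] _shardConnectionStrings; // one per database

        public ShardRouter(string[] shardConnectionStrings)
        {
            _shardConnectionStrings = shardConnectionStrings;
        }

        public SqlConnection GetConnectionFor(string customerId)
        {
            // Real systems need a stable hash; String.GetHashCode can vary
            // across CLR versions, so treat this as illustration only.
            int shard = (customerId.GetHashCode() & 0x7FFFFFFF)
                        % _shardConnectionStrings.Length;
            return new SqlConnection(_shardConnectionStrings[shard]);
        }
    }

Even this toy shows the problem: add one more database and the modulus reshuffles every customer’s data, which is why “automatic” sharding needs more than a PowerPoint slide.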

See Dare Obasanjo’s Building Scalable Databases: Denormalization, the NoSQL Movement and Digg post of 9/10/2009 in the preceding section.

• Michael Otey lists 7 Facts About SQL Azure in this 9/3/2009 article for SQL Server Magazine:

The move to the cloud has been a big trend for technology vendors this year. However, businesses haven’t been so fast to follow. Moving to the cloud entails making many application-related changes. Even so, the cloud promises scalability and cost savings that make it worth considering. The cloud in this case is SQL Azure, formerly SQL Data Services, Microsoft’s SQL Server-based database cloud offering. Let’s look at seven things you need to know about SQL Azure.

Ayende Rahien proposes completing the NHibernate Shards project with volunteers in his SQL Azure, Sharding and NHibernate: A call for volunteers post of 9/10/2009. Ayende says:

Sharding is a term that was invented by Google, and a few years ago several Google engineers decided that they want to use Sharding with Hibernate. Thus, the Hibernate Shards project was born, bringing transparent sharding support to Hibernate.

The equivalent project for NHibernate was started, but porting was never complete. This is a call for volunteers to help continue the port of Hibernate Shards to NHibernate. You now have a very clear goal for why you would want that.

My fully illustrated Using the SQL Azure Migration Wizard with the AdventureWorksLT2008 Sample Database post of 9/7/2009 describes how to use George Huey’s schema migration utility to duplicate schemas of on-premises SQL Server databases in SQL Azure Database tables in Microsoft Data Centers and the problems I encountered when using the Wizard with the AdventureWorksLT2008 sample database to populate the tables with data.

Ayende Rahien (a.k.a. Oren Eini) reports in his NHibernate on the cloud: SQL Azure post of 9/7/2009:

I just finished running the NHibernate test suite against SQL Azure. I am very happy to say that It Just Works*.

Hat tip to Microsoft for managing to create an environment so similar to that of SQL Server under drastically different conditions.

* The only caveat is that some Schema Export scripts fail on SQL Azure; it seems like DROP TABLE has slightly different behavior on SQL Azure: it does not drop the table immediately, but there seems to be some delay before it actually happens (especially if you use a different connection to check for the existence of the table).
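
If you hit the same caveat, one workaround is to poll until the drop becomes visible before re-creating the table; a rough sketch (my assumption about a sensible retry policy, not Ayende’s fix):

    using System.Data.SqlClient;
    using System.Threading;

    class SchemaExportHelper
    {
        // Poll sys.tables until the dropped table actually disappears (or give
        // up), since DROP TABLE appears to complete asynchronously on SQL Azure.
        public static bool WaitForTableDrop(string connectionString, string tableName)
        {
            const int attempts = 10;
            for (int i = 0; i < attempts; i++)
            {
                using (var conn = new SqlConnection(connectionString))
                {
                    conn.Open();
                    var cmd = new SqlCommand(
                        "SELECT COUNT(*) FROM sys.tables WHERE name = @name", conn);
                    cmd.Parameters.AddWithValue("@name", tableName);
                    if ((int)cmd.ExecuteScalar() == 0) return true; // really gone
                }
                Thread.Sleep(1000); // wait a second between checks
            }
            return false; // still visible after ten checks
        }
    }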

<Return to section navigation list> 

.NET Services: Access Control, Service Bus and Workflow

Nigel Watling explains What are .NET Services? in the 00:07:01 Channel9 video of 9/10/2009:

This brief video gives an introduction to .NET Services. We'll explore its components, what they do and what value they provide.

The .NET Services Team reports .NET Services Scheduled Maintenance (September 8th, 2009) – COMPLETE on 9/8/2009. If the “Routine maintenance on the storage layer of the Access Control Service” actually took down ACS for the scheduled six hours, .NET Services uptime for September would be 99.17% (6 of the month’s 720 hours), not 99.5% or 99.95%.

<Return to section navigation list> 

Live Windows Azure Apps, Tools and Test Harnesses

cirrious offers CloudDotNet, a Windows Azure project that lets you browse all projects by keyword. However, the app displays only the first two OakLeaf sample projects when searching with OakLeaf as the keyword.

Jean-Christophe Cimetiere’s Viewing public government data with Windows Azure and PHP: a cloud interoperability scenario using REST post of 9/10/2009 to the Interoperability@Microsoft blog demonstrates a government data and Platform Interoperability scenario in the cloud:

This week Microsoft is participating in the first Gov 2.0 Summit produced by O'Reilly Media, Inc. and TechWeb in Washington D.C., to explore how technology can enable transparency, collaboration and efficiency in government. Today, we're pleased to present a cloud interoperability scenario which takes advantage of the recently announced Toolkit for PHP with ADO.NET Data Services to view public government data with Windows Azure and PHP.

As you may recall, a few weeks ago Microsoft announced the Toolkit for PHP with ADO.NET Data Services, a new bridge enabling PHP developers to connect to .NET using a RESTful architecture. Today, we've published a cloud interoperability scenario showing how a Windows Azure application exposes data in a standard way (XML/Atom) and how you can simply “consume” this data from a PHP web application. This scenario takes advantage of the Open Government Data Initiative (OGDI), another piece of Microsoft's Open Government effort, built on the foundation of transparency, choice and interoperability.

The Open Government Data Initiative (OGDI) is a project launched in May by our colleagues from the Microsoft Public Sector Developer Platform Evangelism team.

In a nutshell, Open Government Data Initiative (OGDI) is a cloud-based collection of software assets that enables publicly available government data to be easily accessible. Using open standards and application programming interfaces (API), developers and government agencies can retrieve the data programmatically for use in new and innovative online applications, or mashups.
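
Because the data is exposed as plain Atom, any Atom-capable stack can read it, which is the whole interoperability point. For instance, a few lines of C# (the dataset URL below is illustrative, not a real OGDI endpoint) consume the same kind of feed the PHP toolkit does:

    using System;
    using System.Xml.Linq;

    class OgdiFeedReader
    {
        static void Main()
        {
            // Load an ADO.NET Data Services Atom feed and list its entry titles.
            // The URL is a placeholder; substitute a real OGDI dataset endpoint.
            XNamespace atom = "http://www.w3.org/2005/Atom";
            var feed = XDocument.Load("http://ogdi.example.com/v1/dc/SomeDataSet");

            foreach (var entry in feed.Descendants(atom + "entry"))
                Console.WriteLine((string)entry.Element(atom + "title"));
        }
    }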

Reuven Cohen adds his US$0.02 to the preceding project in his Governmental Cloud Interoperability on The Microsoft Cloud post of 9/11/2009.

• Glenn Laffel, MD takes the Office of the National Coordinator (ONC) for Health IT to task for its failure to adhere to a January 2010 timeframe for a definition of “meaningful use” of Electronic Health Records (EHR) in his ONC: A Call to Action post of 9/11/2009:

ONC had planned to have finalized meaningful use criteria by this January, but after receiving far more public feedback than expected, ONC punted the sign-off date all the way to spring.

In itself, the delay appears innocuous, but seen in the context of ONC’s complex agenda and recent developments in the marketplace, the delay could spell trouble (as we describe below).

Meanwhile, the EHR vendor certification process remains in a planning stage. ONC’s HIT Policy Committee announced a few weeks back that it would recommend that 10-12 entities be empowered to certify EHR systems, but so far as we know, ONC has not signed off on this, much less begun to screen and select the entities.

Entity selection itself will take time, yet it is a relatively small step in the deployment of a full-blown certification process. …

My Electronic Health Record Data Required for Proposed ARRA “Meaningful Use” Standards post of 9/5/2009 explains the Personal Health Record (PHR) objectives of the “meaningful use” criteria to qualify for ARRA provider subsidies and bonus payments. PHR is expected to be a major market for cloud computing services during the 2010s.

John ?’s CCHIT Town Hall Meeting on Preliminary ARRA Certified EHR post of 9/11/2009 takes a dim view of CCHIT “meaningful use” certification of EHRs:

I’ll be honest with you. For my own health I took off the last month from reading about CCHIT. I guess the birth of my third child made a difference as well. However, I’d been getting some comments and emails lately about CCHIT’s new certification programs and so I had to go and take a look at what was going on. Well, let’s just say that CCHIT has yet to disappoint me. They are so full of CCHIT that it’s not even funny. The conclusions they come to are crazy. Ok, now that I’ve made my bias clear, take a look at some of the things they’re saying. …

John Chilmark says that today’s “Personal Health Record” definition is outmoded and PHRs should be replaced by Personal Health Platforms in his Time to Kill the PHR Term: Part 1 and Time to Kill the PHR Term: Part 2 posts of 9/9 and 9/10/2009:

Yesterday, we outlined why the PHR term has the potential to stunt future advances in consumer health and engagement via HIT.  Our thesis is that the PHR term is rooted in a dated concept of simply providing the user/citizen a virtual file cabinet for their health records.  Since the initial introduction of Internet-based PHRs nearly a decade ago, adoption has been by and large abysmal.  Our belief is that adoption, or lack thereof, is symptomatic of PHRs not having a sufficient value proposition for the vast majority of potential users.

But where we really get concerned with the PHR term is in the meaningful use recommendations that were accepted in July.  Under meaningful use guidelines, those obtaining Stimulus (ARRA) funding for adoption of a certified EHR must provide a PHR to their patients by 2013.  The trouble here is how HHS will define what that PHR is.  Last year, HHS paid a princely sum to have the PHR term defined (see below).  This term, we have been told, is what will be used within the context of meaningful use rule making.  If this is indeed true, adoption of PHRs will continue to be lackluster.

The “princely sum” was the US$500,000 that HHS paid Bearing Point to define six health-related acronyms.

• Bill Peterson’s Risk Management and BI Cloud Computing – Will It Work and Who’s in Charge? post of 9/11/2009 to Tibco’s Spotfire blog briefly discusses risk management for cloud-based business intelligence projects:

As Cloud Computing matures, will it evolve into a viable platform used by financial services organizations, including Risk Management?  If so, are Business Intelligence (BI) tools the best mechanism to access, navigate, manipulate, and report on the Cloud-based data and content?  If we oversimplify and look at Cloud Computing as simply picking up our data centers and placing them in the Cloud, then our existing BI tools would likely follow. However, if we look at financial services, and specifically Risk Management, the questions become:

  • Who is in charge of minimizing risk when BI happens in the Cloud?
  • Should any of the risk extend to the Cloud provider? …

• Jamie Thomson released his RESTful Northwind on SQL Azure sample application to public access on 9/10/2009:

I recently gained access to SQL Azure, the hosted database part of Microsoft’s Azure cloud offering. I don’t currently have a reason to use SQL Azure in my day-to-day work so I set myself a small project that would enable me to prod and poke at the service and hopefully learn a bit about it. I decided to implement a RESTful service on top of the venerable Northwind database using SQL Azure (for data storage), ADO.Net Data Services (for service enabling the data) and Windows Azure (for hosting the service). The resultant service is available at: http://northwindazure.cloudapp.net/Northwind.svc/

Following are a few of Jamie’s sample queries (make sure feed reading is turned off in your browser):

  • Customer ALFKI: http://northwindazure.cloudapp.net/Northwind.svc/Customers('ALFKI')
  • Order Details items with a price > US$250: http://northwindazure.cloudapp.net/Northwind.svc/Order_Details?$filter=UnitPrice%20gt%20250
  • All orders by customer ANATR: http://northwindazure.cloudapp.net/Northwind.svc/Customers('ANATR')?$expand=Orders
  • All order details of all orders by customer ANATR: http://northwindazure.cloudapp.net/Northwind.svc/Customers('ANATR')?$expand=Orders/Order_Details
  • All order details for the first 3 customers: http://northwindazure.cloudapp.net/Northwind.svc/Customers?$top=3&$expand=Orders/Order_Details
  • All products in all orders for the first 3 customers: http://northwindazure.cloudapp.net/Northwind.svc/Customers?$top=3&$expand=Orders/Order_Details/Products

Jamie continues with a description of the project. Source code is available from CodePlex.
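
Since the service returns plain Atom over HTTP, Jamie’s URIs work from any stack. Here’s a quick sketch of mine (not Jamie’s code) that issues the UnitPrice filter query above and counts the entries returned:

    using System;
    using System.Linq;
    using System.Xml.Linq;

    class NorthwindQueryDemo
    {
        static void Main()
        {
            // Issue one of the sample queries directly and count the
            // Order_Details entries with UnitPrice > 250; no client
            // library required, just Atom over HTTP.
            XNamespace atom = "http://www.w3.org/2005/Atom";
            var uri = "http://northwindazure.cloudapp.net/Northwind.svc/" +
                      "Order_Details?$filter=UnitPrice%20gt%20250";

            var feed = XDocument.Load(uri);
            Console.WriteLine("Entries returned: {0}",
                feed.Descendants(atom + "entry").Count());
        }
    }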

• Sopima announces its “Online Contract Bank – A Contract Management Software-as-a-Service from the Microsoft Azure Cloud” in this Sopima Demystifies Contract Management post of 9/10/2009:

Sopima is a new online service for Contract Creation, Deal Making and Contract Management. The service consists of (1) a smart secure contract bank, (2) online negotiation and signature tools, and (3) contract management best practices.

Sopima makes the content of contracts smart and interactive. This includes automated reminders on tasks and deadlines. The service ensures collaboration and true business productivity for financial and administrative managers and legal consultants. The value is brought to the customer by shortening the time to negotiate deals, saving time in daily tasks, and enabling transparency with extensive search and reporting on all contract assets. Sopima also takes risk management to a new level with no more expensive lessons from neglected or forgotten commitments.

Sopima is a Finnish member of Microsoft’s BizSpark program. Not surprisingly, its Web site is regionalized in Finnish.

Maarten Balliauw released PHP Azure CTP 3 (v0.3.0) to CodePlex on 9/10/2009. I can’t find a description of the changes from CTP 2 on CodePlex or in Maarten’s Tweets on the topic.

My Lobbying Microsoft for Azure Compute, Storage and Bandwidth Billing Thresholds for Developers post of 9/8/2009 proposes that Microsoft offer developers a Microsoft Azure Platform instance at a monthly charge of $9.95 for compute, 1GB of storage, and 1GB of ingress and egress bandwidth for development or public-facing demo applications.

Anand Iyer shows you how to Exchange Business Cards on Twitter – @TwtMyCard in this 9/8/2009 post:

First off, my friend Kevin Marshall (@ksmarshall) deserves a HUGE pat on the back for this. I first had this idea at SxSW and had created an ugly-as-sin prototype several months later. I’d coded it using WebForms in ASP.NET and C# and hosted it on Windows Azure. I shared the idea with Kevin, who basically threw away all my code and rewrote the ENTIRE thing in ASP.NET MVC and of course made it look ridiculously pretty.

There’s a front-end Windows Azure web role that you get to see, and a background worker role (we call it the ‘bot’) that wakes up every few minutes, reads all the direct messages (since the last time it read DMs), parses the message, and sends @ messages to the recipient with a link to the sender’s profile.

All the profile data is hosted on Azure Table Storage.
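
That architecture (a web role front end, a polling worker-role “bot,” and table storage) is a common Azure pattern. In outline the bot is just a loop; here’s a hedged sketch of mine, in which ITwitterClient, DirectMessage and the card URL are placeholders rather than Kevin’s actual code:

    using System;
    using System.Collections.Generic;
    using System.Threading;

    // Placeholder Twitter abstraction; the real app wraps the Twitter REST API.
    public class DirectMessage
    {
        public long Id;
        public string SenderScreenName;
        public string Text;
    }

    public interface ITwitterClient
    {
        IEnumerable<DirectMessage> GetDirectMessagesSince(long lastSeenId);
        void SendMention(string screenName, string text);
    }

    // Outline of the 'bot' worker role: wake every few minutes, read the DMs
    // received since the last poll, parse each one, and @-message the
    // recipient with a link to the sender's profile (kept in table storage).
    public class BotWorker
    {
        private readonly ITwitterClient _twitter;
        private long _lastSeenId;

        public BotWorker(ITwitterClient twitter) { _twitter = twitter; }

        public void Run()
        {
            while (true)
            {
                foreach (var dm in _twitter.GetDirectMessagesSince(_lastSeenId))
                {
                    _lastSeenId = Math.Max(_lastSeenId, dm.Id);
                    var recipient = dm.Text.TrimStart('@').Split(' ')[0];
                    _twitter.SendMention(recipient,
                        "@" + dm.SenderScreenName + " sent you their card: " +
                        "http://twtmycard.example.com/" + dm.SenderScreenName); // placeholder URL
                }
                Thread.Sleep(TimeSpan.FromMinutes(5)); // "wakes up every few minutes"
            }
        }
    }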

Bill Crounse, MD requests that you Please take care in selecting an EMR for your practice in this 9/8/2009 post to the Microsoft HealthBlog:

As reported by HDM on-line, the Office of the National Coordinator for Health Information Technology has published additional information on a $598 million grant program to fund the creation of about 70 Health Information Technology Regional Extension Centers.  The centers will help hospitals and physicians select, acquire and use electronic health records systems.

No doubt some serious education and hand-holding will be needed as more physicians and hospitals take the plunge into electronic medical record systems and “meaningful use”.  If taking the plunge is anything like what I saw and heard during a visit to my own doctor last week, doing your EMR homework before you buy is an important step if you hope to swim rather than sink. …

With SQL Azure coming on line, the Windows Azure Platform becomes a more attractive candidate for hosting Electronic Medical Records (EMR), in addition to Personal Health Records (PHR).

Robert Rowley MD calls Electronic Medical Records a Disruptive innovation in healthcare in this 9/8/2009 post:

… [A]n Electronic Health Record (EHR) system needs to be able to create data that is standardized, such that data from different sources can be exchanged transparently, and used to create reports and decision-support at the point of care.

In a setting where large, legacy vendors (the “health IT mainframes”) have created a landscape of separate, locally-installed, proprietary and closed systems, the only way to achieve these national health goals is to enforce a set of standards for data exchange – this means that specific pieces of clinical data need to be identified as being “important” (like medications, allergies, immunizations, demographics, and lab values), a standard format for import/export of these pieces of data needs to be required of all vendors (the CCD and CCR standards), and some data-interchange platforms need to be created where this standardized data can be uploaded to and downloaded from (regional Health Information Exchanges, or HIEs, and the National Health Information Network, or NHIN). This is especially important if one believes that “the future will be a bigger version of the past,” and that the landscape will continue to be local, segregated and proprietary. …

Reuven Cohen reports on CloudLoop - Universal Cloud Storage API in this 9/8/2009 post:

… The CloudLoop project is described as a universal, open-source Java API and command-line tool for cloud storage, which lets you store, manage, and sync your data between all major providers.

According to the announcement Cloudloop aims to solve cloud storage related problems by putting a layer in between your application and its storage provider. It gives you one simple storage interface that supports a full directory structure and common filesystem-like operations (e.g. mv, cp, ls, etc).

The project currently supports Amazon S3, Nirvanix, Eucalyptus Walrus, Rackspace CloudFiles and Sun Cloud, with support coming soon for the Microsoft Azure, EMC Atmos, Aspen and Diomede storage clouds. … [Emphasis added.]

Alan Smith has created "a Windows Azure Hosted community site featuring webcasts from the Microsoft development community” according to the MVP Releases Windows Azure Community Site post of 9/2/2009 to The Microsoft MVP Award Program blog:

CloudCasts.net aims to cover a wide range of Microsoft technologies and provides a learning resource for developers whilst providing an option for community members to make their ideas available to a wider audience. The About page contains contact information for anyone wishing to have their webcasts added, along with tips for producing your own webcasts.

The site already has some famous names from the MVP community and Microsoft, and it would be great to hear from other MVPs or individuals who are producing webcasts or screencasts. You can contact Alan through his blog, or the CloudCasts Twitter.

My illustrated Using the SQL Azure Migration Wizard with the AdventureWorksLT2008 Sample Database post of 9/7/2009 describes how to use George Huey’s schema migration utility to duplicate schemas of on-premises SQL Server databases in SQL Azure Database tables in Microsoft Data Centers and the problems I encountered when using the Wizard with the AdventureWorksLT2008 sample database. [Repeated from SQL Azure Database (SADB).]

Magnus Mårtensson’s Extensible Windows Azure projects using MEF post of 9/7/2009 explains how to extend Azure projects and contains links to earlier testability and persistence-ignorance projects:

Here is how to enable a rich extensibility model for Windows Azure projects and how to run create jobs on Windows Azure Storage only once in your Windows Azure projects. This sample and the related AzureContrib release leverage the Managed Extensibility Framework (MEF) – an upcoming component of .NET Framework 4.0.

There have been three releases of AzureContrib, each one aimed at making the basic Windows Azure project template a bit richer and more intelligent.

The new release of AzureContrib adds a couple of important services (AzureContrib.ServiceHosting.ServiceRuntime.Services): the IWorkService and the IOneTimeWorkService. It also adds a bit more intelligence to a Windows Azure Page, UserControl and, most importantly, to the Windows Azure WorkerRole.
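
For readers new to MEF, the discovery mechanism AzureContrib leans on looks roughly like this; a self-contained sketch in which my IWorkService stand-in only mimics the idea of AzureContrib’s contract:

    using System;
    using System.ComponentModel.Composition;
    using System.ComponentModel.Composition.Hosting;
    using System.Reflection;

    // Stand-in for AzureContrib's work-service idea, not its actual contract.
    public interface IWorkService { void Execute(); }

    [Export(typeof(IWorkService))]
    public class CleanupService : IWorkService
    {
        public void Execute() { Console.WriteLine("cleanup ran"); }
    }

    public class WorkerHost
    {
        [ImportMany] // MEF injects every exported IWorkService it discovers
        public IWorkService[] Services { get; set; }

        public static void Main()
        {
            var catalog = new AssemblyCatalog(Assembly.GetExecutingAssembly());
            var container = new CompositionContainer(catalog);
            var host = new WorkerHost();
            container.ComposeParts(host); // finds CleanupService via its [Export]

            foreach (var svc in host.Services)
                svc.Execute(); // a worker role's Run loop would call these
        }
    }

Drop in a new assembly with another [Export(typeof(IWorkService))] and it gets picked up without changing the host, which is the extensibility model Magnus is after.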

Here are links to 23 videos about Microsoft HealthVault.

<Return to section navigation list> 

Windows Azure Infrastructure

Sam Johnston’s An obituary for Infrastructure as a Product (IaaP) post of 9/13/2009 begins:

There's been an interesting discussion in the Cloud Computing Use Cases group this week following a few people airing grievances about the increasingly problematic term "private cloud". I thought it would be useful to share my response with you, in which I explain where cloud came from and why it is inappropriate to associate the term "cloud computing" with most (if not all) of the hardware products on the market today.

All is not lost however - where on-site hardware is deployed (and maintained by the provider) in the process of providing a service then the term "cloud computing" may be appropriate. That said, most of what we see in the space today is little more than the evolution of virtualisation, and ultimately box pushing.

Jim Liddle’s Cloud Computing Best Practices post of 9/12/2009 describes what to be aware of “when putting your application on the cloud.” Topics include:

  • Licensing
  • Data transfer costs
  • Latency
  • State
  • Data regulations
  • Dependencies
  • Standardisation
  • Security
  • Compliance
  • Quality of Service
  • System hardening

• Bob Evans reports on the Central US (Chicago) data center in his Global CIO: Microsoft Opens $500M Data Center Mothballed In January post of 9/9/2009 for InformationWeek: “Now that Microsoft's massive Chicago data center has been delayed, mothballed, purchased, and finally opened, it's time for the $500 million drama queen to grow up.”

You want your data center to be many things, but drama queen is not one of them. And while Microsoft (NSDQ: MSFT)'s brand-new, colossal, $500 million data center near O'Hare Airport will probably turn out to be as outwardly dull as a pantomime in the fog, it's had a turbulent gestation period: in January, Microsoft delayed its opening indefinitely as part of the company's plan to cut $700 million in costs, including the phased layoff of 5,000 workers. …

• Pingdom reported on 9/10/2009 that My OakLeaf Systems Azure Table Services Sample Project was down 25 minutes during the early morning:

Pingdom DOWN alert: Azure Tables (oakleaf.cloudapp.net) is down since 09/10/2009 02:03:21AM.

Pingdom UP alert: Azure Tables (oakleaf.cloudapp.net) is UP again at 09/10/2009 02:28:21AM, after 25m of downtime.

25 minutes of downtime = 25 / (60 * 24 * 30) = 0.058% downtime, or 99.942% uptime, with <Instances count="2" />. That’s just barely under the Service Level Agreement’s 99.95% uptime guarantee with two instances.
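
For anyone who wants to rerun the arithmetic, a trivial sketch:

    using System;

    class UptimeCheck
    {
        static void Main()
        {
            // Monthly uptime from downtime minutes, assuming a 30-day month.
            const double minutesInMonth = 60 * 24 * 30;   // 43,200
            const double downtimeMinutes = 25.0;
            double uptimePercent = (1 - downtimeMinutes / minutesInMonth) * 100;
            Console.WriteLine("{0:F3}% uptime vs. the 99.95% SLA", uptimePercent);
            // Prints: 99.942% uptime vs. the 99.95% SLA
        }
    }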

Kevin L. Jackson claims 1 Billion Mobile Cloud Computing Subscribers !! by 2014 in this 9/9/2009 post:

Yes. That's what I said! A recent EDL Consulting article cites the rising popularity of smartphones and other advanced mobile devices as the driving force behind a skyrocketing mobile cloud computing market.

According to ABI Research, the current figure for mobile cloud computing subscribers worldwide in 2008 was 42.8 million, representing 1.1 percent of all mobile subscribers. The 2014 figure of 998 million will represent almost 19 percent of all mobile subscribers. They also predicted that business productivity applications will take the lead in mobile cloud computing applications, including collaborative document sharing, scheduling, and sales force automation.

"The major platform-as-a-service providers - Force.com, Google and Amazon - are expected to start "aggressively" marketing their mobile capabilities starting in 2010. An earlier study from ABI Research reported that mobile cloud computing will generate annual revenues of more than $20 billion by 2014."

Lori MacVittie describes the Impact of Load Balancing on SOAPy and RESTful Applications: “A load balancing algorithm can make or break your application’s performance and availability” in this 9/9/2009 post:

It is a (wrong) belief that “users” of cloud computing, and before that “users” of corporate data center infrastructure, didn’t need to understand any of that infrastructure. Caution: proceed with infrastructure ignorance at the (very real) risk of your application’s performance and availability. Think I’m kidding? Stefan’s SOA & Enterprise Architecture Blog has a detailed and very explanatory post on Load Balancing Strategies for SOA Infrastructures that may change your mind.

George Moore spells out the business folks’ view in his Windows Azure Business Model for Developers - An Introduction Channel9 interview of 9/8/2009:

A 21-year Microsoft veteran, Software Architect George Moore is involved in defining and implementing an effective strategy for taking Windows Azure from technology preview to enterprise business ready. Specifically, George is responsible for the integration of all Azure services (Windows Azure, SQL Azure, .NET Services) with other systems at Microsoft. This includes the billing system integration across all Azure services, the business owner portal, and the developer portal for all Azure services.

Here, we get to know a bit more about the thinking behind the commercialization of Windows Azure (which you will learn more about in great detail at PDC 09).

Steve Lessem claims Cory Doctorow Misses the Point of Cloud Computing in his 9/8/2009 rejoinder:

With all due respect to Cory Doctorow, he's wrong.
In his article Not every cloud has a silver lining (Guardian) he states:

“There's something you won't see mentioned by too many advocates of cloud computing - the main attraction is making money from you.”

And I suppose all the vendors of physical storage, the hard drives, etc., are interested in your spiritual well-being!

Here's the heart of Doctorow's beef with cloud computing:

“Rather than buying a hard-drive once and paying nothing - apart from the electricity bill - to run it, you can buy cloud storage and pay for those sectors every month. Rather than buying a high-powered CPU and computing on that, you can move your computing needs to the cloud and pay for every cycle you eat.”

The point he misses is that cloud computing exists because it answers a real need. … [Emphasis Steve’s.]

B. Guptil and B. McNee authored An Endless Cycle of Innovation: Saugatuck SaaS Scenarios Through 2014, a $1,295 Saugatuck Research Report that covers SaaS and Cloud Computing scenarios:

This report includes analysis, insights and guidance developed from Saugatuck’s fourth annual SaaS research program, which comprised a web survey of 1,788 qualified user enterprise executives; interviews with 30 user enterprise executives with SaaS experience; and briefings with 25 SaaS vendors/providers.

“The research shows us a combination of changing SaaS acquisition and adoption, both as a result of the global recession, and as a result of the changing nature of SaaS itself. How users do business with SaaS is changing how providers develop and deliver SaaS, and is changing how ISVs and other players will need to compete over the next several years,” according to Saugatuck founder and CEO Bill McNee, one of the study’s lead authors. “Failure to recognize and adapt to these changes will make it extremely difficult, and much more costly than it should be, for anyone to benefit from SaaS.”

The following is from the Research Summary topic:

    • Despite impressive investments in SaaS development and adoption in different parts of the world, SaaS (and Cloud Computing) will not become the primary IT standard and practice by YE 2012. SaaS will instead be primarily an important “agent of change” through this time period.
    • By YE 2014, however, SaaS (and Cloud Computing) will become integral to infrastructure, business systems, operations and development within all aspects of user firms, with variations in status and roles based on region and business culture.
    • While SaaS has favored both startups and established firms with a variety of management styles and pocketbooks, the current economic challenges will weed out all but the better-funded and better-managed SaaS providers by YE 2012 (especially those that are not cash-flow positive).

Anthony Ha asks Is it time for businesses to embrace the cloud? in this 9/7/2009 post to Venture Beat’s “Conversations on Innovation” section that’s sponsored by Microsoft:

This is part of a series of posts about cutting-edge areas of innovation. The series is sponsored by Microsoft. Microsoft authors will participate, as will VentureBeat writers and outside experts.

Cloud computing has become a magic phrase over the last year or so. Everyone agrees it’s a hot trend, but people are still arguing about what it is, and who should be using it.

Over the next few days, we’ll look at the cloud as part of our Conversations on Innovation series (sponsored by Microsoft). The big question we’re tackling: What needs to come together, such as policy standards and programming models, to reach the cloud’s true potential?

To kick things off, I’ll look at where cloud computing stands, and the challenges it faces. …

<Return to section navigation list> 

Cloud Security and Governance

McGarr Solicitors describe potential pitfalls of cloud computing in the EC in the firm’s Cloud Computing: European Data Protection Dangers post of 9/11/2009:

Cloud computing is rapidly becoming a buzzphrase in IT-reliant businesses. Its proponents include some of the largest technology companies in the world. But while enterprises may be able to save money by moving into the cloud, it is difficult to see how they can do so with their customers’ personal information without breaching EU data protection law. [Emphasis added.] …

Krishnan Subramanian’s Context Is Important While Talking About Cloud Security post of 9/10/2009 begins:

Recently, I was talking to a group of people about cloud computing and its plan for world domination. One of the issues that surprised me is the way people take a binary approach while talking about cloud security. The debate usually goes on with one side making a blanket statement that cloud computing is totally insecure and the other side claiming that it is very secure. In fact, I have been a participant in these kinds of debates in the past but I am more enlightened now. There is no one-size-fits-all solution when it comes to security. It applies to cloud computing too.

All sides in this debate forget a few important points while letting their emotions run high. Some of them are:

  • There is no foolproof security in the traditional computing environment either
  • We can take most of the stuff from our traditional security toolkit and apply it to the cloud
  • We need some rethinking in the way we approach cloud security, especially the public clouds. One aspect is the cloud scale itself and the other is multi-tenancy
  • Cloud Computing is a, relatively, new field and it needs time to mature both in terms of its technological capabilities and in terms of its security. Security always follows any new technological advance with a time lag. …
  • The needs of all people (well, businesses) are not created equal

Tim Wilson posits “'Cross-VM attacks' could threaten sensitive data in shared environments, researchers say” in his University Research Exposes Potential Vulnerabilities In Cloud Computing post of 9/8/2009:

Users of cloud computing infrastructures should be aware that their sensitive data could be potentially leaked, a group of university researchers say.

In a new research paper (PDF), several computer scientists from the University of California at San Diego (UCSD) and the Massachusetts Institute of Technology (MIT) say they have discovered soft spots in the cloud computing concept that could leave data vulnerable to attack.

"Overall, our results indicate that there exist tangible dangers when deploying sensitive tasks to third-party compute clouds," the paper says.

In a nutshell, the researchers argue that by taking the right steps, an attacker could place a malicious virtual machine (VM) in close proximity to a target server in a shared, "cloud" environment. From there, it would be possible to launch a "cross-VM attack" using a variety of different hacking strategies, they say. …

Greg Ness highlights some of Chris Hoff’s best blog posts in The Infrastructure 2.0 Young Turks: Hoff of 9/9/2009:

Chris Hoff (@Beaker) asks Is DDoS – A Moose On Cloud’s Table Or A Pea Under The Mattress? in this 9/7/2009 post:

… Depending upon where you stand, especially if we’re talking about Public Clouds — and large Public Cloud providers such as Google, Amazon, Microsoft, etc. — you might cock your head to one side, raise an eyebrow and focus on the sentence fragment “…and in particular the biggest threat facing cloud computing.”  One of the reasons DDoS is under-appreciated is because in relative frequency — and in the stable of solutions and skill sets to deal with them — DDoS is a long tail event.

With unplanned outages afflicting almost all major Cloud providers today, the moose on the table seems to be good ol’ internal operational issues at the moment…that’s not to say it won’t become a bigger problem as the models for networked Cloud resources changes, but as the model changes, so will the defensive options in the stable …

<Return to section navigation list> 

Cloud Computing Events

Wade Wegner offers on 9/13/2009 slides and videos from the Windows Azure in the Real World session he presented with Joseph Paradi to the Wisconsin .NET Users Group earlier in the week. From the abstract:

Does this sound familiar? Your boss is asking about cloud computing and Windows Azure, but you’re not sure how to separate the hype from the reality. Or perhaps you’ve heard about Windows Azure and had a chance to try it out, but you still don’t quite understand how or why to use it. Or maybe you’ve been using Windows Azure since PDC in 2008, but you’d like a clearer picture of the roadmap and pricing. If any of these points resonate with you, or if you have different questions and concerns, please join Wade Wegner, Architect Evangelist with Microsoft, and Joseph Paradi, Innovation Lead with Accenture, as they provide you with an update on the Windows Azure Platform and show you how companies like Accenture are using the cloud today. Additionally, Wade and Joseph will discuss the migration of existing internal applications to Windows Azure, securing applications through claims-based authentication and passive federation with Geneva Server, using relational databases in the cloud with SQL Azure, migrating data to the cloud through tools like SSIS, and more.

• Vivek Kundra, the Obama administration’s chief information officer, “will hold a news conference Tuesday [9/15/2009] in Silicon Valley to ‘outline his vision for a new federal government cloud computing initiative,’” according to Rich Miller’s Federal Cloud Announcement Due Tuesday post of 9/11/2009:

The event is being held at NASA’s Ames Research Center at 10 a.m. Pacific on Tuesday, Sept. 15. The event will be streamed live on the NASA TV web site.

The event will be attended by NASA Deputy Administrator Lori Garver and “top Silicon Valley information technology leaders.” Who might be there? The NASA Ames facility is around the block from the Googleplex, as well as Microsoft’s Silicon Valley office.

Robert MacMillan shed more light on the event in his White House CIO to disclose cloud computing plans post of 9/10/2009.

When: 9/15/2009 10 AM PDT  
Where: National Aeronautics and Space Administration, Ames Research Center, Mountain View, CA, USA

Mike Taulty recommends the Architect Forum. Cloud: An Architectural View event taking place 9/25/2009 at Microsoft Cardinal Place, London:

Change is the one constant in IT and today is no exception. In a time when economic necessities dictate that we do more with less, faster and cheaper than ever before, we are still seeing projects fail at an alarming rate. The not-so-new buzz is cloud computing, which the analysts are falling over themselves to convince us is the next big thing. Well, there is no doubt that it is becoming ever more tangible as the main vendors like Microsoft seek to ready their propositions in the cloud space. Since its announcement at PDC08, Windows Azure has set itself apart from the infrastructure-as-a-service crowd by offering a full compute platform capability as a service. With another PDC due this year that will herald the launch of Azure with increased features and services, pricing and SLAs, we will see the cloud become ever more real! Undoubtedly this is the platform of choice for start-ups and ISVs, but is it ready for Enterprise time? What are the opportunities and barriers for forward-thinking organisations; is it too early to take to the skies? What is the architecture of the enterprise going to look like? Is it all about private/public clouds, virtualised infrastructures? Or are these just the vestiges of an already overloaded and constrained architecture? Will the cloud really allow us to break up the silos and truly realise the service-oriented dream? Time will tell.

When: 9/25/2009 8:45 AM to 4:30 PM GMT  
Where: Microsoft Cardinal Place, Auditorium 2, 100 Victoria Street, Cardinal Place, London SW1E 5JL, United Kingdom

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Savio Rodrigues asks What's the New York Times doing with Hadoop? in this 9/11/2009 post to InfoWorld’s Open Sources blog that’s subtitled “A Times software engineer talks about how Hadoop is driving business innovation at the newspaper and Web site”:

With Hadoop World NYC just around the corner on Oct. 2, 2009, I thought I'd share two pieces of news.

First, I've received a 25 percent discount code for readers thinking about attending Hadoop World. Hurry because the code expires on Sept. 21.

Second, check out this Q&A with New York Times software engineer and Hadoop user, Derek Gottfrid. Derek's doing some very cool work with Hadoop and will be presenting at Hadoop World. …

Howard Anderson reports Dell Tries to Make an EHR Splash in his 9/10/2009 post to the Health Data Management blog. The article appears to be based on Steve Lohr’s article (below), with some additional background material.

• Steve Lohr’s Tech Companies Push to Digitize Patients’ Records article of 9/10/2009 for the NY Times reports that cloud computing will aid small physician groups to migrate from paper to electronic medical records (EMR) and Dell will “act as the hardware supplier and general contractor” to automate the process:

EClinicalWorks has added four data centers in the last year, bringing the total to 10, for hosting electronic health records as a service over the Internet. The company offers its records both as conventional PC software and as a Web service. “The software as a service is where the biggest growth is,” said Girish Kumar Navani, president of eClinicalWorks.

On Thursday, Dell, the personal computer maker, plans to join the scramble in earnest, announcing its plan to form a partnership with hospital groups around the country to offer electronic health records — hardware, software, consulting services and financing — to their affiliated physicians. Dell, like the other players, sees the big opportunity as being in offices with 10 doctors or fewer, where three-fourths of the nation’s physicians practice medicine.

Dell plans to act as the hardware supplier and general contractor, working with partners like eClinicalWorks, a maker of electronic health record software, and Perot Systems for data-center hosting, if the medical groups outsource that task.

Dell already has pilot projects under way with a few hospital groups, including Memorial Hermann Healthcare System in Houston and Tufts Medical Center in Boston. This year, Dell announced it was teaming up with Sam’s Club, a division of Wal-Mart, to offer the hardware, software and services for electronic health records to doctors in small practices.

“The technology has to be a simplified, affordable package for physicians,” said Jamie Coffin, general manager of Dell’s health care business. “We’re really going after this market in a concerted way.”

Dell’s press release of 9/10/2009 on the topic is New Dell Solution Helps Hospitals Open Up Benefits of EMR to More Physicians and Patients.

Sounds to me like Perot Systems and eClinicalWorks are HealthVault competitors.

Lance Whitney supplements the preceding story with his Dell service to help hospitals with digital records post of 9/10/2009 to CNet’s HealthTech blog:

One key component of U.S. health care reform is the move toward digital medical records. Dell is hoping to play a role in that move.

Dell announced Thursday a new service to help doctors and hospitals more easily switch to electronic medical records (EMR).

Already in use by certain hospitals, the new EMR service--a combination of hardware, software, and support--is designed to make the transition from paper to digital records more affordable and practical for the average physician or medical staff.

Dell said its EMR system will also connect doctors and their sponsoring hospitals so they can share patient information, helping to coordinate care and slash administrative costs.

As part of its EMR package, Dell will go on site to a hospital to determine its needs and readiness. The company will install all hardware and software, offer training to the hospital staff, and provide 24-7 hardware and software support. The EMR application can be hosted either by the hospital or with a Dell EMR partner in a secure data center.

Dell said hospitals can integrate the service into their own information systems and offer it to affiliated doctors for their local practices. Dell's EMR system is modular, so hospitals can tailor it to their specific needs. …

Jason Massie describes his company’s new Disaster Recovery in the Cloud approach, hosted by Terremark, in his 9/9/2009 post:

… With IaaS, you can keep an online copy of your data in the cloud with minimal latency. The main advantage is cost and the savings can be huge. You do not have to invest in the infrastructure that goes along with a DR site like cabinets, network gear, and support services like AD, DNS, backups, etc. There is also the CapEx that goes with the actual server hardware.

The speed of implementation can also allow you to have many fewer virtual machines than you would need if they were physical machines. You can just keep a master copy of a web server and each kind of app server. If you fail over to the DR cloud, you can spin up 10 or 20 of these really fast. Of course, to have low latency with your data, your DB tier will have to be fully implemented. …

The subtitle of Lauren McKay’s Salesforce.com's Second Stab at Service post of 9/9/2009 reads “The company releases Service Cloud 2 with innovation around Knowledge, Answers, and Twitter.” Lauren continues:

It's been nine months since Salesforce.com introduced its Service Cloud, an integration of in-the-cloud customer service information including third-party data from services such as microblogging marvel Twitter. (That launch was a significant factor in CRM magazine's April 2009 decision to name Salesforce.com a Rising Star in customer service.) Since then, the company says that more than 8,000 customers have signed on to the Service Cloud, and the software-as-a-service (SaaS) pioneer seems to have no interest in slowing down, as its announcement today of Service Cloud 2 indicates.

Jeremy Geelan claims Unisys & Cloud Computing: Cloud-in-a-Box Is On Its Way: “Security is a big concern as well as privacy of the data once it leaves a client's data centers”:

Unisys recently announced a four-part cloud computing strategy that will enable clients to move their enterprise application workloads securely to tailored cloud environments and give them confidence in maintaining the integrity of their critical information.

In this Exclusive Q&A for Cloud Computing Journal with SYS-CON's Cloud Computing Expo Conference Chair, Jeremy Geelan, Rich Marcello - Unisys President, Systems & Technology - introduces the Unisys Secure Cloud and sets the scene for the upcoming Unisys Cloud-in-a-box to be delivered later this year.

James Urquhart posits Enterprise cloud computing coming of age in this 9/8/2009 post to CNet News’ The Wisdom of Clouds blog:

One of the most interesting aspects of the weeks leading up to and including this year's VMworld was the incredible innovation in cloud-computing service offerings for enterprises--especially in the category of infrastructure as a service. A variety of service providers are stepping up their cloud offerings, and giving unprecedented capabilities to their customers' system administrators.

In this category, enterprises are most concerned about security, control, service levels, and compliance; what I call the "trust" issues. Most of the new services attempt to address some or all of these issues head on. Given that this is the infancy of enterprise cloud computing, I think these services bode well for what is coming in the next year or two.

James then goes on to analyze the individual offerings. …

John Willis’ AWS Latest News (Update) post of 9/7/2009 summarizes Amazon Web Services’ e-mail to its developers of the same date. Topics include:

    • Introducing Amazon Virtual Private Cloud
    • Announcing the AWS Solution Providers Program
    • AWS Multi-Factor Authentication (AWS MFA)
    • Seamlessly Rotate Your Access Credentials
    • AWS Import/Export Enters Unlimited Beta, Adds Export for Amazon S3
    • AWS Management Console Adds Support for Amazon CloudWatch
    • New Lower Prices for Amazon EC2 Reserved Instances
    • AWS Start-Up Challenge Deadline Extended to September 25, 2009

Sam Dean reports Eucalyptus Systems Bridges Private and Public Clouds in this 9/9/2009 post:

On the heels of the launch and funding of open source cloud computing player Eucalyptus Systems, the company has now announced its first commercial product. The Eucalyptus Enterprise Edition (EEE) enables customers to implement an on-premise Eucalyptus cloud with VMware's vSphere virtualization platform and ESX hypervisor.

vSphere is VMware's cloud operating system. Not only will Eucalyptus' EEE solution allow on-premise Eucalyptus clouds on VMware's platform, but it also supports other hypervisors, including Xen and KVM. With EEE, users can leverage all of these environments and additionally develop applications compatible with Amazon's EC2.

“Eucalyptus Systems’ mission has been to support the open source Eucalyptus on-premise cloud platform while also delivering solutions for large-scale enterprise deployments,” said Dr. Rich Wolski, Eucalyptus Systems co-founder, CTO and former director of the Eucalyptus research project at the University of California, Santa Barbara (UCSB). …

Ryan Howard’s Practice Fusion Announces Investment from Salesforce.com and Cloud Computing Initiative press release of 9/7/2009 claims:

Practice Fusion offers a revolutionary application and delivery model – cloud computing – enabling physician practices to deliver superior care to their patients. Practice Fusion provides free, web-based electronic medical records (EMR), practice management, patient scheduling and more.

Practice Fusion is launching its patient health record on Force.com, salesforce.com’s enterprise cloud computing platform. Force.com provides everything companies need to quickly build and deliver business applications in the cloud, including the database, unlimited real-time customization, powerful analytics, real-time workflow and approvals, programmable cloud logic, integration, real-time mobile deployment, programmable user interface and Web site capabilities. Applications built on Force.com benefit from the proven security, reliability and scalability of salesforce.com’s real-time global service infrastructure. …

Ryan Howard is CEO of Practice Fusion.

Free Personal Health Record management applications, such as HealthVault and PassportMD (free to Medicare recipients), are common, but free EMR and practice management (PM) software for physicians is not.

Alan Williamson reports in Amazon SimpleDB + SQS : Simple Java POJO Access of 9/6/2009 that he has updated his two Java classes that let you access Amazon Web Services’ SimpleDB and Simple Queue Service (SQS):

SimpleDB features

  • No external dependencies
  • Single POJO
  • Full API support: CreateDomain, DeleteDomain, DomainMetaData, select, GetAttributes, PutAttributes, BatchPutAttributes, DeleteAttributes
  • NextToken support
  • Signature2 authentication
  • Error Reporting
  • Last Request ID and BoxUsage reporting

SimpleSQS features

  • No external dependencies
  • Single POJO
  • Full API support: CreateQueue, DeleteQueue, ListQueues, DeleteMessage, SendMessage, ReceiveMessage, GetAttributes, ChangeMessageVisibility, AddPermission, RemovePermission
  • Signature2 authentication
  • Error Reporting
  • Public Domain license

Jo Maitland says VMware vCloud Express: Right move, wrong focus in this 9/4/2009 essay for IT Knowledge Exchange’s Troposphere blog:

VMware is right to introduce a cloud computing service that competes with Amazon EC2. But wrong to focus on the aspect of buying these services with a credit card. We know of at least one company where the act of punching in a credit card number to buy servers is immediate grounds for dismissal.

vCloud Express, unveiled at VMworld in San Francisco this week, lets companies running VMware software hook up to a hosting provider running a public cloud also based on VMware, for additional compute resources on demand.

vCloud Express competes with Amazon.com’s EC2, now infamous for the speed at which users can buy and turn on servers, the low cost point for entry and the ability to use only what you need, when you need it. But chasing Amazon.com’s value proposition of “fast and cheap”, which is how VMware CEO Paul Maritz referred to vCloud Express in his keynote, is the wrong focus for enterprise IT.

Yes, IT managers want more agility and lower costs, but most of them won’t touch cloud services with a 10-foot pole, from VMware or anyone else, until they are sure of the security and reliability of these services. That’s where VMware should be putting its effort and focus, not on a simplistic web interface for entering credit card numbers. …

<Return to section navigation list> 
