Tuesday, February 02, 2010

Windows Azure and Cloud Computing Posts for 2/1/2010+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 
• Update 2/2/2010: The following articles were added in today’s update:

  • Ken North: Oracle's Direction in Cloud Computing
  • Lori MacVittie: Alice in Wondercloud: The Bidirectional Rabbit Hole
  • Nenshad Bardoliwalla: The Unified Performance, Risk, and Compliance Model – Part IV – Model and Optimize
  • Om Malik: Microsoft Finally Opens Azure for Business
  • Robert Rowley, MD: Should the feds certify EHR Usability?
  • David Linthicum: The why and how of private clouds
  • SYS-CON Events: Cloud Expo 2010 West Venue Announced
  • Craig Balding: Brucon 2009: The Belgian Beer Lovers Guide to Cloud Security 6/7
  • John Moore: iPHR Market Report Executive Summary Now Up at Scribd
  • Steve Clayton: Microsoft and cloud interoperability
  • David Robinson: SQL Azure Database Now Generally Available – SLAs in effect
  • Sumit Mehrotra: February 2010 release of Windows Azure Tools and SDK
  • Brad Calder: Beta Release of Windows Azure Drive

Note: This post is updated daily or more frequently, depending on the availability of new articles in the sections that follow.


Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage,” here on 9/29/2009.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010. 
* Content for managing DataHubs will be added as Microsoft releases more details on data synchronization services for SQL Azure and Windows Azure.

Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.

Azure Blob, Table and Queue Services

• Brad Calder delivers the technical details of the new Beta Release of Windows Azure Drive (formerly XDrive) on 2/2/2010:

Today we are providing access to a beta release of Windows Azure Drive (announced as XDrive at PDC 2009).

Customers have told us that they want to take their already running Windows applications, run them in the cloud using the standard Windows NTFS APIs, and be sure that the data is durable. With Windows Azure Drive, your Windows Azure applications running in the cloud can use existing NTFS APIs to access a durable drive. This can significantly ease the migration of existing Windows applications to the cloud, giving customers a more seamless migration experience while reducing the time it takes to move an application from an on-premises Windows environment to a Windows Azure environment. The Windows Azure application can read from or write to a drive letter (e.g., X:\) that represents a durable NTFS volume for storing and accessing data. The durable drive is implemented as a Windows Azure Page Blob containing an NTFS-formatted Virtual Hard Drive (VHD).

For the beta release of Windows Azure Drive, customers will be billed only for the storage space used by the Page Blob and the read/write transactions to the Page Blob. This will be incorporated into the standard Windows Azure usage rates, and there will not be a separate line item on the bill.

Let’s discuss some more of the technical details of the Windows Azure Drive feature. The Page Blob can be mounted as a drive only within the Windows Azure cloud, where all non-buffered/flushed NTFS writes are made durable to the drive (Page Blob). If the application using the drive crashes, the data remains persistent via the Page Blob, and the drive can be remounted when the application instance is restarted, or remounted elsewhere for a different application instance to use. Since the drive is an NTFS-formatted Page Blob, you can also use the standard blob interfaces to upload and download your NTFS VHDs to the cloud. …
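In code, Brad’s workflow comes down to a handful of calls against the beta CloudDrive/StorageClient library. Here’s a minimal C# sketch as I understand the beta API, not production code; the setting name “DataConnectionString”, the local-resource name “AzureDriveCache”, and the blob path are assumptions for illustration:

    using System.IO;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.ServiceRuntime;
    using Microsoft.WindowsAzure.StorageClient;

    public class DriveExample
    {
        // Call from a role's OnStart/Run; drives mount only in the cloud or the dev fabric.
        public static void MountAndUse()
        {
            CloudStorageAccount account = CloudStorageAccount.Parse(
                RoleEnvironment.GetConfigurationSettingValue("DataConnectionString")); // assumed setting name

            // Reserve local disk for the drive's read cache ("AzureDriveCache" is an
            // assumed LocalStorage resource declared in the service definition).
            LocalResource cache = RoleEnvironment.GetLocalResource("AzureDriveCache");
            CloudDrive.InitializeCache(cache.RootPath, cache.MaximumSizeInMegabytes);

            // The durable drive is an NTFS-formatted VHD stored in a Page Blob.
            CloudDrive drive = account.CreateCloudDrive(
                account.BlobEndpoint.AbsoluteUri + "/drives/mydata.vhd");  // assumed blob path
            try { drive.Create(64); }        // size in MB; throws if the VHD already exists
            catch (CloudDriveException) { }  // already created on an earlier run

            // Mount returns a drive letter; ordinary NTFS APIs then work against it.
            string root = drive.Mount(cache.MaximumSizeInMegabytes, DriveMountOptions.None);
            File.WriteAllText(Path.Combine(root, "hello.txt"), "durable!");

            drive.Unmount();                 // flushes writes and releases the drive
        }
    }

Note the cache initialization: reads can be served from local disk, while non-buffered/flushed writes go through to the Page Blob, which is what makes the drive durable.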

• Sumit Mehrotra notes in his February 2010 release of Windows Azure Tools and SDK post of 2/1/2010:

Windows Azure Drive is available in Beta form and the SDK allows you to use it in your simulation environment as well as in the production cloud today. In order to take advantage of Windows Azure Drive in your application, you need to choose the right version of the Guest Operating System for your application. This SDK now allows you to specify the OS Version as an attribute in your .cscfg file.

You can find more information on the supported Guest OS versions in the Windows Azure Guest OS Versions and SDK Compatibility Matrix.
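For reference, here’s a sketch of what that looks like; the attribute rides on the root element of ServiceConfiguration.cscfg, and the version string shown is only illustrative of the documented WA-GUEST-OS-x.y_YYYYMM-nn format:

    <?xml version="1.0" encoding="utf-8"?>
    <!-- osVersion pins the Guest OS image your role instances run on. -->
    <ServiceConfiguration serviceName="MyService"
        osVersion="WA-GUEST-OS-1.1_201001-01"
        xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
      <Role name="WebRole1">
        <Instances count="2" />
      </Role>
    </ServiceConfiguration>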

For more details, see the Live Windows Azure Apps, Tools and Test Harnesses section.

Heads-up from David Burela: The SDK v1.1 and Guest OS v1.1 require VS2010 RC, which hasn’t been released yet.

Jim Nakashima reports that the February 2010 Release of the Windows Azure Tools for Microsoft Visual Studio v1.1 supports:

Windows Azure Drive: Enable a Windows Azure application to use existing NTFS APIs to access a durable drive. This allows the Windows Azure application to mount a page blob as a drive letter, such as X:, and enables easy migration of existing NTFS applications to the cloud.

For more details, see the Live Windows Azure Apps, Tools and Test Harnesses section.

John Brodkin’s CommVault ties backup software to the cloud post of 2/1/2010 claims “Vendor builds integrations with Amazon and other cloud storage services”:

Storage vendor CommVault is extending its backup, archive and de-duplication software to the cloud with a new connector that lets customers move data from inside the firewall to cloud services such as Amazon's Simple Storage Service, EMC's Atmos, Microsoft Azure and Nirvanix. [Emphasis added.]

Although few businesses have adopted cloud storage so far, the market is getting a lot of attention and it is likely that many traditional storage companies will integrate their products with public cloud services.

CommVault on Monday announced that its Simpana software now treats cloud services just like any other tier of storage, whether disk or tape, allowing de-duplication and encryption of data before it goes to a cloud, and enforcement of policies on retention requirements in cloud platforms. …

More details on Simpana are available in this CommVault Ships Cloud-Optimized Simpana Software press release of the same date.

Jerry Huang asserts “So much happened in January 2010 for the cloud storage world” as a preface to his The Cloud Storage Wars: Windows Azure vs. Google post of 2/1/2010:

First, the Windows Azure Platform went from public preview to full production. Second, Google Docs opened up to uploads and downloads of any file type, effectively making it a GDrive in the cloud.

How will these changes affect you? This article compares the two cloud storage offerings on price, speed, usability, service level agreement, and developer support. …

Jerry continues with a price comparison of blob storage: Google Docs ($0.02/GB/month, no data transfer charges) and Windows Azure ($0.15/GB/month plus data transfer and transaction charges). A more appropriate comparison would be Google Docs versus Office Live, which lets you store up to 5 GB online at no charge, or Windows Live SkyDrive, which has a 25-GB free quota.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

• David Robinson reported SQL Azure Database Now Generally Available – SLAs in effect on 2/1/2010:

Starting today, customers and partners in countries across the globe will be able to launch their SQL Azure Database production applications and services with the support of the full Service Level Agreements (SLAs). 

Customers who have yet to upgrade their CTP accounts to paid commercial subscriptions are encouraged to do so as soon as possible in order to maintain access to their accounts. SQL Azure Database CTP customers who have not upgraded their accounts will be able to keep using their existing databases but will no longer be able to create new databases. On March 1, 2010, SQL Azure Database CTP accounts that have not been upgraded will be deleted. On April 1, 2010, Windows Azure Storage CTP accounts that have not been upgraded will be deleted. It is important to export your data if you do not plan to upgrade to a commercial subscription prior to these dates.

As a number of time zones apply to our customers and partners worldwide, Microsoft will begin charging for use of Windows Azure and SQL Azure Database starting at 12:00 AM February 2, 2010 GMT to ensure that customers and partners are not charged for their free usage in the month of January.

New customers can visit www.WindowsAzure.com to sign up to start building and deploying cloud applications today on the Windows Azure platform.

Dave continues with brief FAQs about SQL Azure.

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

Dave Kearns takes “A look back at an early identity-based application” in his The 'game' of identity post of 1/29/2010:

Back in the day when personal computers were starting to become commonplace, the various machines (Commodore VIC-20, Atari, TRS-80, Apple I and others) each had its own operating system. Software could run on only one machine, which meant there were few choices and high prices. But what they all had was built-in BASIC, the programming language. Most had a BASIC interpreter, but the IBM PC came with a BASIC compiler (showing, I guess, that it was intended for bigger and better things).

It struck me the other day that it was back then, on my old VIC-20, that I first encountered an identity-based application. It was a game called "Animal," written in BASIC, that had to be typed into the computer. The game (and you can see the code here) purported to be a "guessing" game where the computer guessed the name of an animal you were thinking of. It did this by asking a series of yes/no questions and following a decision tree to an endpoint. Then it would ask, for example: "Are you thinking of an elephant?" If you answered "No" it would ask what animal you were thinking of (say, a camel) and what question would differentiate it from an elephant ("It lives in the desert"). It thus built up a series of attributes which -- taken together -- identified a particular animal.

That's not so very different from how we identify people. …
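The learning loop Dave describes fits in a few dozen lines in any language. Here’s a minimal C# reconstruction of the same decision-tree idea (my sketch, not the original BASIC listing):

    using System;

    // A node is either a yes/no question (two children) or a leaf naming an animal.
    class Node
    {
        public string Text;       // question text, or the animal name at a leaf
        public Node Yes, No;      // both null at a leaf
        public bool IsLeaf { get { return Yes == null; } }
    }

    class AnimalGame
    {
        static bool Ask(string prompt)
        {
            Console.Write(prompt + " (y/n) ");
            return Console.ReadLine().Trim().ToLower().StartsWith("y");
        }

        static void Main()
        {
            Node root = new Node { Text = "elephant" };   // classic one-animal start
            do
            {
                Node node = root, parent = null;
                bool wentYes = false;
                while (!node.IsLeaf)                      // walk the decision tree
                {
                    parent = node;
                    wentYes = Ask(node.Text);
                    node = wentYes ? node.Yes : node.No;
                }
                if (Ask("Are you thinking of a(n) " + node.Text + "?"))
                    Console.WriteLine("Got it!");
                else
                {
                    // Learn a new animal plus the attribute that distinguishes it.
                    Console.Write("What animal were you thinking of? ");
                    string animal = Console.ReadLine();
                    Console.Write("What yes/no question is true for a(n) " + animal +
                                  " but not a(n) " + node.Text + "? ");
                    Node q = new Node { Text = Console.ReadLine(),
                                        Yes = new Node { Text = animal }, No = node };
                    if (parent == null) root = q;
                    else if (wentYes) parent.Yes = q;
                    else parent.No = q;
                }
            } while (Ask("Play again?"));
        }
    }

Every wrong guess splices a new question node above the failed leaf, so the tree of identifying attributes grows exactly the way Dave remembers.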

Michele Leroux Bustamante’s Caching Issued Security Tokens at the Client article for the January 2010 issue of DevConnections carries an “Add federated and claims-based security to .NET Framework apps” deck.

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

• John Moore offers a part of Chilmark Research’s Internet-based Personal Health Record study in his iPHR Market Report Executive Summary Now Up at Scribd post of 2/1/2010:

Chilmark Research has just released the Executive Summary of its iPHR Market Report to the general public via the open publishing website, Scribd.

Maybe this was an act of generosity in honor of the shortest month of the year.

Maybe this is just to further get the word out on Chilmark Research in advance of the big industry confab HIMSS, which begins one month to the day from today’s date.

Or maybe this is just in response to the upcoming eBook wars between Amazon and Apple.

Then again, maybe it was just time to allow the broader healthcare community to take advantage of this report in light of future “meaningful use” requirements where those seeking federal stimulus dollars for the adoption and use of certified EHRs will need to provide their customers with a PHR by 2013 (at least that is what the draft rules are calling for). …

Read the Executive Summary here.

My question: Where is HealthVault? HealthVault doesn’t appear in the TOC, but Google Health does.

• Robert Rowley, MD asks Should the feds certify EHR Usability? in this 2/2/2010 post:

In an interesting turn, the Commerce Department’s National Institute for Standards and Technology (NIST) is looking to develop standards for evaluating ease-of-use of health IT systems. This raises some questions about the appropriate federal role in guiding the evolution of Electronic Health Records (EHR) systems – should the feds be specifying “usability standards” in the first place?

The NIST notice is currently very preliminary – they are simply looking for companies with expertise in quantifying and measuring Usability in health IT systems. However, the NIST has been charged with developing the specific testing and process documents that will be used (by organizations yet to be selected) to certify EHR systems. The overall policy and specification about Meaningful Use of a Certified EHR, which is needed to access ARRA stimulus moneys available beginning in 2011, have been published for open commentary. However, the specific nuts-and-bolts of certification is being hammered out by the NIST. They have already contracted with Booz Allen Hamilton to help with this process. …

So, why would the NIST be interested in evaluating Usability, given that this is not one of the criteria specified in the Certification guidelines? Poor usability has been cited as one of the main impediments to EHR adoption (besides cost), and stimulating EHR adoption is one of the central goals of the Office of the National Coordinator (ONC) for health IT. Historically, CCHIT (the exclusive legacy certification body prior to ARRA) did not include Usability as a certification domain – too difficult to quantify and test. The result has been that many large, legacy health IT systems are so cumbersome – have such poor Usability – that they are prone to mistakes (not from lack of data, but from bad presentation of that data to the end-user).

It is our opinion that Usability is an important factor in EHR selection, and such selection is determined by the market. Market factors will result in faster development of high-quality EHRs than a process regulated by the feds. The federal rule-making process is susceptible to influence by established well-funded corporations who have an interest in the status quo. While it is interesting that the NIST is considering input from expertise around quantifying Usability, it is uncertain that a federal-regulatory approach will be effective. We will be watching this development with interest.


Jim Nakashima reports availability of the February 2010 Release of the Windows Azure Tools for Microsoft Visual Studio v1.1 in this 2/1/2010 (afternoon) post:

I’m pleased to announce that the Windows Azure Tools for Microsoft Visual Studio 1.1 (direct link while the release propagates) has been released to coincide with the general availability of Windows Azure.

New for version 1.1:

  • Windows Azure Drive: Enable a Windows Azure application to use existing NTFS APIs to access a durable drive. This allows the Windows Azure application to mount a page blob as a drive letter, such as X:, and enables easy migration of existing NTFS applications to the cloud.
  • OS Version Support: Allows a Windows Azure application to choose the appropriate Guest OS to run on in the cloud.
  • Bug Fixes
    • StorageClient: Expose account key from storage credentials, expose continuation tokens for pagination, and reduce maximum buffer size for parallel uploads.
    • Windows Azure Diagnostics: Fix path corruption for crash dumps, OnDemandTransfer now respects LogLevelFilter.
    • VS 2010: Improved packaging performance.
    • VS 2010: Support for linked files in Web projects.
    • VS 2010: Support for ASP.NET web project web.config transformations.
    • Certificate selection lists certificates from LocalMachine\My instead of CurrentUser\My.
    • Right click on Role under Roles folder to select whether to launch the browser against HTTP, HTTPS or not at all.

Updated and additional samples are available at: http://code.msdn.microsoft.com/windowsazuresamples

Aleksey Savateyev’s RCA Framework and Sample Released to the Web post of 1/31/2010 describes his rich cloud application (RCA):

I've finally published to MSDN what we've been working on with our partner Murano Software for the last couple of months - the Rich Cloud Application Framework and Sample. This is a proof of concept (POC) application based on RCA concepts I described before.

To summarize - the RCA framework is contained entirely inside the POC solution and takes the form of server-side Azure components and client-side Silverlight components. It provides best practices on how to build scalable applications with rich UX. It has the following components provided in a single package:

1. Scalable duplex notification infrastructure working with Azure and Silverlight 3
2. MVVM implementation in Silverlight 3
3. Data access implementation allowing Silverlight 3 apps to seamlessly access Azure Storage just like any WCF Data Source
4. Live ID Authentication and Authorization with Silverlight and Azure storage …

The Product Support Online sample is also deployed live in Windows Azure: http://pso.cloudapp.net. Give it a try!

Full source code is published on MSDN Code Gallery under MS-PL license and is free to use in your own social networking, project management and other products requiring collaboration features such as these.

Aleksey is an architect working with Global ISVs on building software based on new and emerging technologies mostly in the Web and S+S spaces.

Vadim Kreynin’s Part 2: Configuration – Learning Azure with me post of 1/31/2010 is a guided, illustrated tour of the ServiceConfiguration.cscfg file:

In Part 1, “Hello Azure,” we briefly talked about the Service Configuration file (ServiceConfiguration.cscfg). In that article we didn’t modify this file; we just accepted all the values provided by the Cloud Service Visual Studio template. …

The root element, ServiceConfiguration, has only one attribute, serviceName, which is required. If you omit this attribute, Visual Studio is going to complain about your XML during compilation. You can see that this attribute has the same value as our Cloud Project, which is LearnAzure. The name of the service must match the name of the service in the service definition. If you look inside the ServiceDefinition.csdef file, you’ll see that the value of the name attribute in the ServiceDefinition element is also LearnAzure.

The ServiceConfiguration element can have only Role elements as children. Because the only role we have is the ASP.NET Web role, LearnAzureWeb, we see only one Role element. If our project contained another role, the Service Configuration file would reflect that fact.

The Role element also has a single name attribute, which is required. The name attribute represents the name of the role and must match the role name defined in the service definition’s WebRole element. …
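The matching rule is easier to see with the two files side by side. Here’s a skeletal pair for the LearnAzure project (my trimmed sketch; Vadim’s actual files carry more settings):

    <!-- ServiceConfiguration.cscfg -->
    <ServiceConfiguration serviceName="LearnAzure"
        xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
      <Role name="LearnAzureWeb">
        <Instances count="1" />
        <ConfigurationSettings />
      </Role>
    </ServiceConfiguration>

    <!-- ServiceDefinition.csdef: the service names and the role names must line up -->
    <ServiceDefinition name="LearnAzure"
        xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
      <WebRole name="LearnAzureWeb">
        <InputEndpoints>
          <InputEndpoint name="HttpIn" protocol="http" port="80" />
        </InputEndpoints>
      </WebRole>
    </ServiceDefinition>

If either pair of names drifts out of sync, the build breaks in much the way Vadim describes for a missing serviceName.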

Chris Kanaracus claims “New IDC report predicts the SaaS BI market will grow at a compound annual growth rate of 22 percent through 2013” as a preface to his SaaS BI growth will soar in 2010 article of 2/1/2010 for InfoWorld’s Cloud Computing blog:

Mounting evidence suggests that in 2010, the hottest segment in BI (business intelligence) software will revolve around offerings delivered from the cloud, thanks to increased product sophistication, strained IT budgets, and other factors.

A new IDC report finds the SaaS BI market will experience triple the growth of the market overall, soaring at a compound annual growth rate of 22.4 percent through 2013, although actual revenue totals will remain small compared to on-premise BI applications. …

There are plenty of good reasons to adopt SaaS (software as a service) BI, according to a new report from Forrester Research.

It can get BI tools to typically under-served users, such as front-office workers, a lot faster, analyst Boris Evelson wrote. The model may also become more attractive as enterprises turn to on-demand software for other needs, such as CRM (customer relationship management).

"The more applications (and therefore BI data sources) are moved into the cloud, the fewer reasons there may be to build and operate BI applications in-house," Evelson said.

But SaaS BI could be ideal for certain types of big companies as well. Forrester cites a retailer that does 90 percent of its business during the holidays. Most of the time, the unnamed company handles BI and reporting with an on-premises data mart, but turns to a SaaS vendor for an assist to crunch the flood of year-end numbers. …

David Linthicum asserts “There are three models of cloud computing, and the one you use determines the kind of performance you get” in his How to gauge cloud computing performance article of 1/28/2010 for InfoWorld’s Cloud Computing blog:

Does cloud computing perform well? That depends on whom you ask. Those using SaaS systems and dealing with standard Web latency can't tell you much about performance. However, those using advanced "big data" systems have a much different story to relate.

You need to consider the performance models, which you can break into three very basic categories:

  • Client-oriented (performance trade-off)
  • Cloud-oriented (performance advantage)
  • Hybrid (depends on the implementation)

Client-oriented cloud computing architectures are those systems where the cloud computing providers, typically SaaS (software as a service), interact with users constantly over the Internet. The issue here is not that the cloud provider is slow, but that there is latency with the constant back-end machine-to-machine conversation that occurs between the SaaS provider and the browser. …

Dave continues with a detailed analysis of the three performance models.

<Return to section navigation list> 

Windows Azure Infrastructure

• Om Malik analyzes the Azure Services Platform’s competitive stance in his Microsoft Finally Opens Azure for Business post of 2/1/2010 to GigaOm:

Microsoft, after talking about it endlessly, has finally opened up its cloud platform, Windows Azure, for business. Windows Azure and SQL Azure are now available in 21 countries worldwide, the company said. So today the rubber meets the road — and we get to see how Azure does in the marketplace.

What will Azure’s financial impact be? It’s hard to say, but I think it’s safe to assume that Microsoft, which has gotten used to the 50 percent (or higher) margins on its products, will see those margins shrivel in the face of the competition. Azure is joining a crowded marketplace, one dominated by Amazon and its web services, which now boasts of thousands of developers in addition to an entire ecosystem.

And if that wasn’t enough, Azure will be competing with offerings from IBM, Cisco Systems, EMC Corp. and VMware, Rackspace and other small players. Microsoft says it’s not worried, though, and claims companies like 3M and GXS are already using the Azure cloud.

Not only are established partners and ISVs looking to the Windows Azure platform as a way to extend their revenue capabilities, but startups like Lokad are betting their businesses on it. Lokad has deployed a scalable forecasting cloud-based service to its retail and manufacturing customers that provides real-time forecasting data to allow for inventory optimization and ultimately bring them to profitability. …

• David Linthicum asserts “With the interest in cloud computing, but the need for security and control, many enterprises are opting for private clouds” in his The why and how of private clouds post of 2/2/2010 to the InfoWorld Cloud Computing blog:

Last month I pointed out that cloud computing is influencing internal IT, including the redevelopment of traditional data centers around SOA and cloud computing concepts -- or private clouds. Indeed, through 2012, Gartner forecasts IT organizations will spend more money on private cloud computing investments than on offerings from public cloud providers.

The top three reasons that enterprises are looking at private clouds are:

  1. Security and privacy. Many Global 2000 companies are not ready to trust public cloud providers. They also worry about the providers giving up their data to the authorities or competitors, or just losing it altogether.
  2. Efficiency. Most nonvirtualized servers run at about 1 to 5 percent of capacity most of the time. The use of virtualization allows you to do much more with much less, and that's the name of the game in IT today.
  3. Control. While most CIOs will talk about the advantages of using public cloud computing, and perhaps point to their Salesforce.com subscription as proof that cloud computing is part of their portfolio, the thought of using servers that they can't touch is still a bit disconcerting to many in IT. …

Dave continues with a description of “the issues you should consider when implementing a private cloud.”

Will the public vs. private cloud issue ever quiet down?

• Steve Clayton weighs in on Microsoft and cloud interoperability in this 2/1/2010 post:

… Jean Paoli adds that working with open source technologies like PHP and Java will be key to Microsoft’s cloud success. Yes, you read that right – working with open source really is pretty important, because if we think the cloud is going to be an all-Microsoft thing, we’re living in cloud cuckoo land. To be fair to my colleagues in Redmond, this approach is patently obvious if you look at services like Windows Azure that positively welcome PHP, Java, Ruby, Apache, Tomcat and MySQL. You couldn’t have said that a few years back about Microsoft…but now, we’re arguably more open than some of our competitors whom you’d expect to be leading on openness.

We’re continuing to push for more openness and added a new work stream to our Interoperability Executive Customer (IEC) Council of 35 chief technology officers and chief information officers from organizations around the world. It’s not just about our own forums, of course; we also participate in industry fora such as the Open Cloud Standards Incubator – all the other players you’d expect are in here – IBM, RedHat, Cisco, VMWare, HP etc.

The cloud is the future, and hopefully it’ll usher in an era of never-before-seen interop and openness that takes away the hassle of making systems talk to each other for end users and makes all our toys and tech play nice together. I know that sounds quite utopian for those of us who have been around for a while, but personally I’m optimistic and like the idea of competing on merit rather than competing with gate after gate.

Mary Jo Foley announces Microsoft's Azure cloud is officially open for business in her 2/1/2010 post to ZDNet‘s All About Microsoft blog:

As of February 1, Microsoft officially jumps into the cloud-computing fray and now is charging customers for developing and running apps in its Azure cloud.

Microsoft has been working on Azure for more than three years; beta testers have been kicking its tires for more than a year. With Azure, Microsoft is attempting to recreate its Windows ecosystem in the form of a utility. Today, developers and customers can develop and deploy on the Windows Azure operating system and make use of the SQL Azure hosted database. In the coming months, Microsoft will make available to developers its Azure AppFabric Web-service utilities. And as 2010 progresses, Microsoft is slated to make available to developers and customers more of the on-premises “private cloud” complements to Azure.

While many (including yours truly), describe Azure as Microsoft’s “cloud,” Microsoft actually has many different public and private clouds. Very few Microsoft properties are currently hosted on Azure. Those that are include Live Mesh, Microsoft’s HealthVault service and its energy-monitoring Hohm service. Mega-scale services like Windows Hotmail and Xbox Live don’t run on Azure. Neither does Microsoft’s hosted Exchange Online, SharePoint Online, CRM Online, its Business Productivity Online Suite (BPOS) or its Danger services for mobile devices. Microsoft hasn’t provided a timetable as to when (or definitively if) it will move these services to Azure.

Microsoft officials say that there already are “tens of thousands” of apps and services running on Azure, a total which includes everything from pilot “hobbyist” apps to full-fledged commercial ones. (February 1 is the cut-off date, by the way, for those with Azure Community Technology Preview accounts to decide whether they are going to upgrade to paying accounts; Microsoft is advising users who don’t want to subscribe to export their data pronto so it won’t be deleted.)

Mary Jo quotes me later in the article.

The Windows Azure Team claims Windows Azure Platform Now Generally Available in 21 Countries in this 2/1/2010 post:

In today's era of computing, businesses face the challenge of reducing their IT infrastructure costs while extending the value of their current investments. For us, this has meant that the experiences of our customers and the developer community are key in building a cloud services platform that provides the flexibility and agility our customers need to tackle their business problems in new ways.  As we continue this journey, today marks a significant milestone, not only for us at Microsoft, but also for our customers and partners.  We are excited to announce the general availability of Windows Azure and SQL Azure in 21 countries.  

Starting today, customers and partners in countries across the globe will be able to launch their Windows Azure and SQL Azure production applications and services with the support of the full Service Level Agreements (SLAs). The Windows Azure platform AppFabric Service Bus and Access Control will continue to be free until April 2010 for those that sign up for a commercial subscription. Additionally, “Project Dallas” will continue to be a free CTP. …

As a number of time zones apply to our customers and partners worldwide, Microsoft will begin charging for Windows Azure and SQL Azure starting at 12:00 AM February 2, 2010 GMT to ensure that customers and partners are not charged for their free usage in the month of January.

New customers can visit www.WindowsAzure.com to sign up to start building and deploying cloud applications today on the Windows Azure platform. 

Alin Irimie quotes Microsoft’s Jean Paoli and Craig Shank in this Microsoft Brings Cloud Interoperability Down to Earth post of 2/1/2010 based on a press release of the same date from Microsoft Presspass:

Cloud interoperability is specifically about one cloud solution, such as Windows Azure, being able to work with other platforms and other applications, not just other clouds. Customers also want the flexibility to run applications either locally or in the cloud, or on a combination of the two. Microsoft is collaborating with others in the industry and working hard to ensure that the promise of cloud interoperability becomes a reality.

Leading Microsoft’s interoperability efforts are general managers Craig Shank and Jean Paoli. Shank spearheads the company’s interoperability work on global standards and public policy, while Paoli collaborates with Microsoft’s product teams as they map product strategies to customers’ needs.

Shank says one of the main attractions of the cloud is the degree of flexibility and control it gives customers: ‘There’s a tremendous level of creative energy around cloud services right now — and the industry is exploring new ideas and scenarios together all the time. Our goal is to preserve that flexibility through an open approach to cloud interoperability.’

Adds Paoli, ‘This means continuing to create software that’s more open from the ground up, building products that support the existing standards, helping customers use Microsoft cloud services together with open source technologies such as PHP and Java, and ensuring that our existing products work with the cloud.’

Shank and Paoli firmly believe that welcoming competition and choice will make Microsoft more successful in the future. ‘This may seem surprising,’ notes Paoli, ‘but it creates more opportunities for its customers, partners and developers.’ …

… Cloud interoperability requires a broad perspective and creative, collaborative problem-solving. Looking ahead, Microsoft will continue to support an open dialogue among the different stakeholders in the industry and community to define cloud principles and incorporate all points of view to ensure that in this time of change, there is a world of choice.

Lori MacVittie asserts “Clouds Are Like Onions. Which of course are like Ogres. They’re big, chaotic, and have lots of layers of virtualization” in this 2/1/2010 post:


In discussions involving cloud it is often the case that someone will remind you that “virtualization” is not required to build a cloud. But that’s only partially true, as some layers of virtualization are, in fact, required to build out a cloud computing environment. It’s only “operating system” virtualization that is not required. The problem is that unlike the term “cloud”, “virtualization” has come to be associated with a single, specific kind of virtualization; specifically, it’s almost exclusively used to refer to operating system virtualization, a la Microsoft, VMware, and Citrix. But many kinds of virtualization have existed for much longer than operating system virtualization, and many of them are used extensively in data centers both traditional and cloud-based. Like ogres, the chaotic nature of a dynamic data center based on these types of virtualization can be difficult to manage.

Layer upon layer of virtualization within the data center, like the many layers of an onion, are enough to make you cry at the thought of how to control that volatility without sacrificing the flexibility and scalability introduced by the technologies. You can’t get rid of them, however, as some of these types of virtualization are absolutely necessary to the successful implementation of cloud computing. All of them complicate management and make more difficult the task of understanding how data gets from point A to point B within a cloud computing environment. …

Eric Nelson’s Seven things that may surprise you about the Windows Azure Platform article of 2/1/2010 for the UK MSDN Flash newsletter “will be the basis of a number of upcoming blog posts – probably about seven of them :-)”:

…You can now run applications for your end users entirely inside the Windows Azure Platform using some or all of the services. You can even build hybrid applications which are a mix of on-premise and “in the cloud” (Software + Services). But the best bit is that the skills you already have are the skills you need to build for the Windows Azure Platform. The UK Windows Azure Platform site is a great place to start exploring Azure, but I thought it would be fun to pique your interest with seven less well known facts.

  1. Azure is not just about “the next twitter” …
  2. You can build applications for the Windows Azure Platform using C++, Java, Ruby, PHP...
  3. The Windows Azure Platform is still free to try …
  4. There are hundreds of great tools for Azure …
  5. SQL Azure works with Integration Services, Reporting Services and Analysis Services …
  6. All your data is replicated many times to offer high availability …
  7. A Windows Azure Storage account can store up to 100TB of data …

Lee Gomes prefaces his Abolish 'Cloud Computing!' article of 2/1/2010 for Forbes Magazine with “Not the idea, just the phrase. We'll think more clearly if we do:”

Doesn't the expression "in the cloud" sound so much prettier, so much more ethereal, than "over the Internet," even though they are essentially the same thing? Yes it does, which is precisely why we should stop the cloud-talk right away.

It's not news to anyone that there is a lot of discussion of using Wide Area Network technology to shift the sorts of computing tasks once done in-house to computers in remote locations--sometimes even at other companies. This is true for relatively simple tasks like e-mail, as well as for more complicated functions, including those essential to a company's basic operations.

But what to call this?

It is perfectly accurate to say that someone using the services of Gmail or Salesforce.com  is working "over the Internet." The phrase is not only accurate, it is also quickly understood by even the most non-technical person. It has the added advantage of quickly evoking both the potentials (for example, efficiencies and conveniences) as well as pitfalls (for example, hackers and outages) of doing business online. …

Working “over the Internet,” which includes basic browsing, is far too broad a concept to be a synonym for “cloud computing.”

<Return to section navigation list> 

Cloud Security and Governance

• Lori MacVittie asserts “Emerging architectures are conflating responsibilities up and down the application stack” and asks “Who is responsible for integration when services reside in the network?” in her Alice in Wondercloud: The Bidirectional Rabbit Hole post of 2/2/2010:

While preparing for an upcoming panel I’m moderating at Cloud Connect (in the “New Infrastructure” track), the panelists and I had a great discussion on the topics we wanted to discuss in the session. During that discussion it became increasingly clear that an interesting phenomenon has been occurring: the conflation of network and application responsibilities in the traditional “stack.”

Much of this inversion is absolutely necessary for emerging models of networking and computing to be successful. Traditional methods of handling QoS (Quality of Service) and identity management, for example, are no longer adequate in the inherently volatile world of cloud computing and dynamic networks. Interestingly, the driver behind the inversion appears to be largely the access specific layers have to context, which is necessarily replacing IP addresses as a method of client – and server – identification. …

Nenshad Bardoliwalla has reached the fourth installment of his The Unified Performance, Risk, and Compliance Model – Part IV – Model and Optimize series in this 2/1/2010 post to the Enterprise Irregulars blog:

This is the fourth in a four part series on the Unified Performance, Risk, and Compliance Model. Part I covered the strategize and prioritize phase, Part II covered the plan and execute phase, and Part III covered the monitor and analyze phase. In the model and optimize phase of the Performance Management Lifecycle, we strive to assess the drivers of performance and risk at a deep level to understand the various alternatives we can pursue, with the goal of making the best decision given a certain set of constraints. (The phase is depicted graphically in the original post.)

Model. Modeling falls into three categories:

  1. Revenue, Cost, and Profitability Modeling. Modeling the costs, revenue, and profitability implications of performance management, risk management, and compliance management activities and their drivers can be achieved at a very detailed level using activity-based costing and associated methodologies.
  2. Scenario Modeling. Scenario modeling can be applied to financial and operational modeling and focuses on creating different business scenarios. Simple scenario modeling can include creating a base case and then high and low cases based on changes made to input variables, such as market growth rates or inflation rates. This technique is often used in modeling market and business opportunities and creating business plans.
  3. Simulation Modeling. More advanced modeling including Monte Carlo simulation supports creating a broad range of scenarios based on multiple iterations of input assumptions and combinations. With this technique, probabilities can be assigned to the various outcomes. These techniques allow the uncertainty associated with a given forecast to be estimated and to reduce risk by applying sensitivity analysis, correlation, and trend extrapolation. By simulating the effect of uncertainty, it becomes possible to answer questions such as, “How certain are we that a given project (or group of projects) will result in a minimum outcome of x?” Or, conversely, “What’s the minimum outcome that we can be, for example, 90% certain of achieving?” Simulation also makes it possible to identify and rank the various contributors to overall uncertainty.

Optimize. The goal at this phase of the PM lifecycle is to determine the optimal way to achieve objectives by taking into account the entire context of the problem, including all relevant constraints and assessments (costs, benefits, risk, labor and time), as well as business strategies, objectives, risks, and compliance factors. Optimization can be done both through human evaluation as well as through advanced algorithmic techniques.
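Nenshad’s “90% certain” question is exactly what a few lines of Monte Carlo code can answer. Here’s a toy C# sketch with invented distributions, purely illustrative of the technique:

    using System;
    using System.Linq;

    class OutcomeSimulation
    {
        static void Main()
        {
            Random rng = new Random(42);
            const int trials = 100000;

            // Toy model: project outcome ($K) = uncertain revenue x uncertain margin - fixed cost.
            double[] outcomes = new double[trials];
            for (int i = 0; i < trials; i++)
            {
                double revenue = 900.0 + 200.0 * Gaussian(rng);  // ~N(900, 200)
                double margin  = 0.25  + 0.05  * Gaussian(rng);  // ~N(0.25, 0.05)
                outcomes[i] = revenue * margin - 150.0;
            }

            Array.Sort(outcomes);
            // "What minimum outcome can we be 90% certain of achieving?" = the 10th percentile.
            double p10 = outcomes[(int)(0.10 * trials)];
            Console.WriteLine("90%-certain minimum outcome: {0:F1} $K", p10);
            Console.WriteLine("Mean outcome: {0:F1} $K", outcomes.Average());
        }

        // Box-Muller transform: two uniform draws -> one standard normal draw.
        static double Gaussian(Random rng)
        {
            double u1 = 1.0 - rng.NextDouble();   // in (0, 1]
            double u2 = rng.NextDouble();
            return Math.Sqrt(-2.0 * Math.Log(u1)) * Math.Cos(2.0 * Math.PI * u2);
        }
    }

Swap in real revenue, cost, and risk distributions and the same sort-and-read-a-percentile trick yields the sensitivity and confidence answers described in the modeling categories above.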

Having reached Part IV, it’s clear that Nenshad was serious about his topic. I recommend the three preceding posts.

• Craig Balding’s 10-minute Brucon 2009: The Belgian Beer Lovers Guide to Cloud Security 6/7 video’s abstract reads as follows:

In a hurry? The short version: learn about cloud security and in the process win a tasty Belgian beer by answering easy questions! When Amazon CEO Jeff Bezos was photographed standing in front of a vintage 1890s electric generator, it was widely assumed he was paying homage to Nick Carr’s “electric generator” metaphor of utility computing. This was understandable, but quite wrong. Reminiscent of the Bruce Lee movie where the student is chided for failing to look “out there” instead of staring at his own hand, the cloud commentators failed to notice his surroundings. Bezos — and the electric generator — were standing in the middle of a Belgian brewery! This will be the starting point of our journey through Cloud Security using a fuller flavour metaphor: Belgian beer.

Download the #brucon [2009] videos and presentations

Tim Brown suggests Cloud Security: Ten Questions to Ask Before You Jump In in this 1/26/2010 Computerworld article:

The hype around cloud computing would make you think mass adoption will happen tomorrow. But recent studies by a number of sources have shown that security is the biggest barrier to cloud adoption. The reality is cloud computing is simply another step in technology evolution following the path of mainframe, client server and Web applications, all of which had -- and still have -- their own security issues.

<Return to section navigation list> 

Cloud Computing Events

• SYS-CON Events reports on 2/1/2010 Cloud Expo 2010 West Venue Announced:

SYS-CON Events announced on Monday that Cloud Expo 2010 West, the 7th International Cloud Computing Conference & Expo, will take place November 1-3, 2010, at the Santa Clara Convention Center in Santa Clara, with more than 6,000 delegates and a record number of sponsors. Call for Papers for Cloud Expo 2010 West is also open. Cloud Expo is the world's leading Cloud-focused event since 2007, and is held five times a year, in New York City, Silicon Valley, Prague, Tokyo and Hong Kong.

Adam Grocholski announces “Azure Deployment and Maintenance” will be his topic for the February Meeting of the Twin Cities Cloud Computing User Group to be held at the Microsoft Office – Bloomington, 8300 Norman Center Drive, Suite 950, Bloomington, MN 55437 on 2/11/2010:

So, you've created your first Azure application. Now what? In this session Adam Grocholski joins us to help walk us through moving our Azure applications from our local development environments to the cloud. Areas to be discussed include:

  • Creating Azure services
  • Preparing your application for deployment
  • Packaging your application for deployment
  • Deploying your application to Azure's staging environment
  • Moving your application from Azure's staging environment to production
  • Upgrading your Azure application
  • Creating deployment and maintenance scripts with PowerShell

Kevin Jackson requests on 2/1/2010 the pleasure of your company at the Cloud Computing for DoD & Government conference:

Please join me at the Cloud Computing for DoD & Government training conference, February 22-24, 2010 at the Hilton Old Town in Alexandria, VA. This unique conference agenda blends interactive workshops and actionable information with an unprecedented opportunity to network with government cloud computing leaders.

The conference kicks off Monday with classroom-style sessions where attendees will examine the strategies behind cloud standards and security. The second day features presentations from leading government cloud implementors, prepared to share their real world experiences and learnings. Speakers include:

  • Patrick Dowd, Chief Architect, NSA - "Requirements for Collaboration in the Cloud"
  • Peter Tseronis, Deputy Associate CIO, US Dept. of Energy - "The Federal Cloud Computing Initiative and the Progress-to-Date"
  • Dr. Stephen M. Jarrett, CTO ISR/IO/IA & Cyber Security, US Navy SPAWAR - "Securing Cloud-Based Applications"
  • June Hartley, CIO Department of Interior - "Application Hosting and Cloud Computing"
  • Dr. Richard W. Etter, Sr. Information Assurance Officer, Act. Director, Cyber Security & Critical Infrastructure, Office of the CIO, Dept of the Navy - "Open Source Infrastructure for Data Intensive Computing"
  • Henry Sienkiewicz, Technical Program Director, DISA - "Cloud Computing with RACE"

Day three is also fully loaded, featuring:

  • Maj Gen Stephen Gross, USAFR, Mobilization Asst to the Chief of Warfighting Integration and CIO - "The USAF's Position and Initiatives in Cloud Computing"
  • William C. Barker, Chief Cybersecurity Advisor, NIST - "Applying Various Virtualization Techniques within the Cloud"
  • Peter Mell, Cloud Computing Project Lead & Senior Computer Scientist, NIST / James Tsujimoto, Office of Government Wide Policy, GSA - "Cloud Computing: Secure and Effective Use"
  • Dr. Mary Ann Malloy, Warfighter Architecting, The MITRE Corporation - "Tactical Cloud Computing"

Thomas LaRock offers his SQLSaturday #34 Recap post of 2/1/2010 with a recommendation for speakers demoing SQL Azure:

There were six session slots lined up for the day, each one lasting about 75 minutes. I was able to attend six talks during the day, which I thought was a fantastic way to spend a cold winter day near Boston. …

The next session was by Matt Van Horn and it was on “Developing with SQL Azure”. All I can say about this is that if you are planning to work with SQL Azure, or to give a talk that includes a demo on SQL Azure, you should make sure you can connect to the intertubz. Unless, of course, you either (1) want to show people the biggest reason not to push your app to the cloud or (2) want to show people what to do when they lose connectivity and how important it is to have your data local, just in case. Still, Matt gave it about 30 minutes before ending his talk, and I did manage to take away a few items even without a demo. I also spent a few minutes chatting with Matt afterwards and I hope to stay in touch with him in the very near future.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

• Ken North prophesies Oracle's Direction in Cloud Computing in this 1/28/2010 post to Dr. Dobb’s Code Talk:

There has been much speculation in recent months about Oracle's future direction following its acquisition of Sun Microsystems and Chairman Larry Ellison's widely reported, disparaging remarks about cloud computing. Because the Sun acquisition recently received European regulators' approval, Sun's goal of offering open cloud services is likely to end under Oracle ownership.

Oracle has decided not to join the competition in the Infrastructure-as-a-Service (IaaS) space. But those offering Software-as-a-Service (SaaS) and Platform-as-a-Service (PaaS), such as Microsoft Azure and Salesforce.com, will likely see more competition from Oracle in the future. First, Oracle has a broad spectrum of BPM, ERP and CRM suites. Second, the Sun acquisition adds Java, Glassfish and MySQL to the cornucopia of technologies that boost its PaaS credentials.

Oracle is also releasing a cloud-based version of the OpenOffice suite. Cloud Office will put Oracle in competition with online office suites from Microsoft and Google. Oracle has announced that going forward it will continue to support the Linux and Solaris operating systems.

<Return to section navigation list> 
