Tuesday, October 20, 2009

Windows Azure and Cloud Computing Posts for 10/19/2009+

Windows Azure, Azure Data Services, SQL Azure Database and related cloud computing topics now appear in this weekly series.

• Update 10/20/2009: MSDN: Premium Subscribers and BizSpark Members Receive Windows Azure and SQL Azure Benefits; IDC: IDC UK analyst slams cloud economics while IDC US touts cloud conference; Leena Rao: VS 2010 is cloud-ready but where are Azure Tools for Beta 2; ADO.NET Data Services Team: SharePoint lists support Astoria; Ben Kepes: EuroCloud Forms; Eugenio Pace: Claims based Identity & Access Control Guide updated; James Urquhart: Cloud computing and the future of IT; and more.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the sections below.

To use the section navigation links, first click the post’s title to display the single article you want to navigate.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage,” here on 9/29/2009.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page.
* Content for managing DataHubs will be added when Microsoft releases a CTP of the technology.

Azure Blob, Table and Queue Services

The ADO.NET Data Services Team’s Share your data across data sources (Sharepoint, SQL Server, Azure, Reporting Services, etc) & applications (.NET, Silverlight, Excel, etc) using Data Services post announced on 10/20/2009:

This week at the SharePoint conference, Pablo Castro will be announcing that the next version of SharePoint natively integrates with data services such that all SharePoint lists will be exposed as an ADO.NET Data Service.  This means you can now program against SharePoint list data using all the VS tools and runtime client libraries (.NET, Silverlight, AJAX, Java, PHP, etc…) which support ADO.NET Data Services.  Also, since the Data Service interface is just plain HTTP and XML/Atom or JSON, pretty much any environment with an HTTP stack can now browse, update and interact with list and document data in SharePoint. We are very excited to be able to talk about SharePoint’s data service integration (more code-centric blogs to come on this topic) as this has been an ask we hear a lot each time we present data services.

This integration is also a good example of a step towards our larger vision for data services and, more generally, open data access on the web.  The remainder of this post outlines our roadmap around data services and what parts of that roadmap we are announcing this week. …

Developer tools:

We updated the protocol itself as well as our frameworks and developer tools in Visual Studio 2010, .NET 4.0 (starting with Beta 2) and Silverlight 4.  We’ll also release an update to .NET 3.5 SP1.

Reporting Services:

Starting with SQL Server 2008 R2, all reports created with SQL Server Reporting Services will expose an option to render as an Atom feed that follows the Data Services convention as well. This makes it possible for developers and information workers to see the data “underneath” a report (whether it displays as tables, charts, etc.) and use it in their own applications.

Excel and Microsoft SQL Server PowerPivot for Excel 2010 (aka “Project Gemini”):

Also starting with SQL Server 2008 R2, the PowerPivot plug-in for Excel, which allows information workers to bring in data from all sources for analysis, will natively understand and load data from any Data Service. Now information workers can obtain data from reports, SharePoint sites, public sites that expose reference data and even custom applications, and load it directly into their data analysis environments.
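Since the team stresses that the interface is “just plain HTTP and XML/Atom or JSON,” here’s a minimal Python sketch of consuming a SharePoint 2010 list through the Data Services convention. The server name and the “Announcements” list are placeholders, and the beta’s URL details may differ:

```python
import urllib.request

# Hypothetical SharePoint 2010 Data Services endpoint; the server name and
# the "Announcements" list are illustrative placeholders.
BASE = "http://sharepoint.example.com/_vti_bin/listdata.svc"

def fetch(url, accept):
    """Issue a plain HTTP GET and return the response body as text."""
    req = urllib.request.Request(url, headers={"Accept": accept})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

# The default representation is an Atom feed (XML)...
atom_feed = fetch(BASE + "/Announcements", "application/atom+xml")

# ...and the same resource renders as JSON when the client asks for it.
json_feed = fetch(BASE + "/Announcements", "application/json")

print(atom_feed[:300])
print(json_feed[:300])
```

Any HTTP stack — .NET, Silverlight, PHP, Java, or the few lines of Python above — gets the same two representations, which is the point of the announcement.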

• Pablo Castro’s Every SharePoint 2010 server is a Data Services server post of 10/19/2009 begins:

I haven't been writing much here, mostly because I've been way too busy but also because I couldn't discuss publicly many of the things I'm doing. Now that SharePoint 2010 has been announced and its feature set published everywhere, I can finally discuss one of the coolest things we've been up to lately.

SharePoint is a repository of resources (list items and documents in document libraries) that are collected and manipulated collaboratively. Resources have a bunch of security and business logic attached to them, such as who can see each item, who gets to change it, or whether a particular column in a list needs to conform to a particular validation formula.

When SharePoint folks said they wanted a RESTful interface this was great news...the system is just a perfect fit. Not only is it a perfect fit for RESTful services in general, but also for Astoria in particular. In the end SharePoint is very data-centric in nature, and it already supports queries and business logic as part of the uniform interface.

So we're really excited to announce that as of SharePoint 2010, every SharePoint server is an Astoria server out of the box. No configuration required or anything, just make sure the proper version of ADO.NET Data Services is in the box. For SharePoint 2010 beta, the "right" version is ADO.NET Data Services v1.5 CTP2. We'll put details out there for future iterations as they come.

The Data Platform Insider blog announced in its At SharePoint Conference, Project “Gemini” name officially revealed as Microsoft SQL Server PowerPivot for Excel post of 10/19/2009:

Finally, the Microsoft ADO.NET Data Services team announced that, starting with SharePoint 2010, all SharePoint sites are automatically exposed as RESTful data services that follow the ADO.NET Data Services convention. Now, any client with an HTTP stack can read and write to SharePoint just by using simple HTTP methods and Atom (XML) or JSON formatted data. Also, clients can post documents and read documents easily by using the HTTP interface. Since SharePoint exposes full metadata through the service, all Data Services tools such as Visual Studio, .NET client, and cross platform clients (e.g. PHP, Java clients recently announced) will work out of the box with SharePoint.

Furthermore, all the Silverlight support for data services will work as well, making it straightforward to front SharePoint sites with Silverlight applications that directly access SharePoint data without the need of server-side code. By enabling a standard based REST-style interface, ADO.Net Data Service greatly simplifies the programmatic access to SharePoint sites. For more details, visit the ADO.NET Data Service team blog at http://blogs.msdn.com/astoriateam/.
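As a companion to the read example earlier in this section, the sketch below illustrates the “write” half of the claim: a plain HTTP POST of a JSON-encoded item creates a new list entry. The “Tasks” list and its “Title” field are invented for the example, not a documented schema:

```python
import json
import urllib.request

# Same hypothetical endpoint as the earlier sketch; "Tasks" and "Title"
# are placeholders.
URL = "http://sharepoint.example.com/_vti_bin/listdata.svc/Tasks"

item = {"Title": "Review SPC09 session notes"}
req = urllib.request.Request(
    URL,
    data=json.dumps(item).encode("utf-8"),
    headers={"Content-Type": "application/json", "Accept": "application/json"},
    method="POST",  # POST to the collection URI creates a new entry
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read()[:200])  # expect 201 Created plus the new entry
```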

I’m curious to learn from the SharePoint Conference 2009 whether SP2010 enables Azure tables as data sources for lists. The Externalizing BLOB Storage in SharePoint 2010 session (#SPC399 on Twitter) might shed light on connecting to Windows Azure Blob Storage Services. Use Jeremy Thake’s SPC09 Session codes and titles list to follow Tweets by session code.

Michael Desmond delivers a synopsis of the SharePoint Conference 2009 (SPC09) in his Microsoft Gives SharePoint A Facelift post of 10/19/2009 to the Visual Studio Magazine blog:

Microsoft today previewed the next generation of its popular SharePoint Server family, which will sport a new user interface, tighter integration capabilities, a cloud-based version and much-needed support for the company's Visual Studio integrated development environment (IDE).

The new release, dubbed SharePoint 2010, will be available for general beta testing in November and should be released to manufacturing in the first half of 2010, Microsoft CEO Steve Ballmer said in the opening keynote address at the SharePoint Conference 2009 in Las Vegas.

During a detailed presentation, Microsoft offered a rundown of SharePoint SKUs and versions. With the 2010 wave, Microsoft is renaming Windows SharePoint Services (WSS) SharePoint Foundation, while the Microsoft Office SharePoint Server (MOSS) product becomes simply SharePoint Server. The new release will also come with an improved SharePoint Designer, the freely available tool for building custom SharePoint sites and behaviors. Designer will gain the Office Ribbon UI and other enhancements to help end users build mash ups, connect to external data and enable workflows, said Jeff Teper, Microsoft corporate vice president for SharePoint Server, in a blog post.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

• SearchSQLServer.com’s Q&A: Moving forward with SQL Server in the cloud of 10/19/2009 is an interview with “Brent Ozar, a SQL Server expert with Quest Software, about the potential promise and pitfalls of cloud computing and how enhancements to Microsoft's SQL Azure Database could change the way people view databases in the cloud.”

For those who are relatively unfamiliar with cloud computing, what is the case for cloud-based databases? Is it all about performance?

Brent Ozar: It's about a different kind of performance – the job performance of the DBA. How many times has someone come to the DBA and said, "I need a database, but I can't tell you how it's going to perform, because we've never worked with this app or this code before. I can't tell you how many users we're going to have, because I have no idea how popular this application is going to be. Just give me a database and we'll let you know if performance is bad. Oh, and by the way, we don't have much money in the budget."

Cloud computing gives us an answer to questions like that. In theory, we can deploy a database and scale it up based on demand with much less difficulty than scaling traditional database servers. Unfortunately, this really only means scaling up in terms of data size and user quantity – it doesn't mean we can whip out our credit cards and make the application twice as fast. If the queries are bad and the table design is bad, we're still going to be stuck with less-than-desirable performance.

Ozar is scheduled to present a session entitled Yes, I'm Actually Using the Cloud at PASS Summit 2009 in November.

The Microsoft Sync Framework team announced Sync Framework 2.0 Available for Download with this 10/19/2009 post:

Sync Framework 2.0 expands on the capabilities offered by Sync Framework 1.0:

  • Adds features that cater to new scenarios or scenarios that were difficult to support.
  • Reduces the amount of work required to develop providers.
  • Supports more data sources with new built-in providers.
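The “providers” in the second bullet are the components you write to plug a data store into the sync runtime. The toy Python sketch below is conceptual only — it is not the Sync Framework API — but it shows the two questions every provider answers: what changed since the last sync anchor, and how do incoming changes get applied:

```python
# Conceptual sketch of a sync provider -- not the Sync Framework API.
class DictProvider:
    """Wraps a dict-backed store and tracks changes with a version counter."""

    def __init__(self):
        self.items = {}    # key -> (value, version)
        self.version = 0   # monotonically increasing change counter

    def put(self, key, value):
        self.version += 1
        self.items[key] = (value, self.version)

    def changes_since(self, anchor):
        """Answer: what changed after the given sync anchor?"""
        return {k: v for k, v in self.items.items() if v[1] > anchor}

    def apply(self, changes):
        """Answer: how do incoming changes land in this store?"""
        for key, (value, _) in changes.items():
            self.put(key, value)

def sync(source, destination, anchor=0):
    """One-way sync: ship everything the source changed after `anchor`."""
    destination.apply(source.changes_since(anchor))
    return source.version  # the anchor for the next incremental pass

a, b = DictProvider(), DictProvider()
a.put("doc1", "hello")
anchor = sync(a, b)      # initial full sync
a.put("doc2", "world")
sync(a, b, anchor)       # incremental: only doc2 is shipped
print(sorted(b.items))   # ['doc1', 'doc2']
```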

Still no word about SQL Azure and the DataHub, but the Data Platform Insider blog announced in its At SharePoint Conference, Project “Gemini” name officially revealed as Microsoft SQL Server PowerPivot for Excel post of 10/19/2009:

Also today, the Microsoft Sync Framework team announced that Microsoft Office 2010 is replacing the internal synchronization engine that ties Groove Workspaces to SharePoint with the Microsoft Sync Framework.  This change supports synchronization for larger numbers of users sharing Groove Workspaces – up to 28 times the currently enabled number of users.

<Return to section navigation list> 

.NET Services: Access Control, Service Bus and Workflow

Eugenio Pace’s Claims based Identity & Access Control Guide – Updated drafts & samples available post of 10/20/2009 announces:

Yesterday, we uploaded a new release of the Guide and the samples. You can download the content from here. (Note: if you downloaded them yesterday, you might want to check again. We mistakenly uploaded the samples with no docs. It is fixed now).

You’ll find:

  • Updated introduction & WebSSO chapters, incorporating quite a bit of feedback
  • New updated samples, including scenario #2 (Federation with Partners). This is inspired by this article I wrote some time ago.

Eugenio continues with an illustrated guide to using the samples.

The Geneva Team Blog’s How Geneva helps with access control and what is its relationship to AzMan post of 10/20/2009 discusses how the Windows Identity Foundation relates to Windows’ Authorization Manager (AzMan):

The new Identity and Access products wave from Microsoft brings a new, claims-based approach to the identity and access space. This new approach is based on the principles defined by the Identity Meta-System. Windows Identity Foundation (WIF), which is part of this new Identity and Access products wave, gives applications a much richer and more flexible way to deal with identities by introducing the claims-based identity concept. WIF also decouples the application business logic from the authentication and identity-attribute lookup details by externalizing these processes into a component called a security token service (STS). The primary goal of this new Identity and Access products wave is to enable this new, claims-based, identity and access model and make it as easy as possible for existing and new applications to leverage the exciting capabilities that this model provides. …
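The decoupling described here boils down to a simple contract: the application reasons only over claims vouched for by a trusted STS, and never touches passwords or directory lookups itself. Here’s a conceptual Python sketch of that model (illustrative only — WIF’s actual API is .NET, and the issuer URL and claim names are invented):

```python
from dataclasses import dataclass

@dataclass
class Claim:
    # A claim is a typed statement about a subject, vouched for by an issuer.
    claim_type: str
    value: str
    issuer: str

def check_access(claims, trusted_issuer="https://sts.example.com"):
    """Business logic sees only claims; authentication is the STS's job."""
    trusted = {c.claim_type: c.value for c in claims if c.issuer == trusted_issuer}
    # The authorization rule is expressed against claims, not group lookups.
    return trusted.get("role") == "purchaser" and trusted.get("department") == "sales"

# A token the (hypothetical) STS issued after authenticating the user.
token = [
    Claim("role", "purchaser", "https://sts.example.com"),
    Claim("department", "sales", "https://sts.example.com"),
]
print(check_access(token))  # True
```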

• Johan Danforth announced in a 10/20/2009 post: Just released [on CodePlex] a new version [0.2.2] of SQL Azure Explorer that works with Visual Studio 2010 Beta 2.

Wade Wegner announced in a 10/19/2009 tweet “Minor updates to Passive Federation w/ Windows Azure & ADFS v2 post given feedback from @eugenio_pace.” 

<Return to section navigation list> 

Live Windows Azure Apps, Tools and Test Harnesses

• Leena Rao reports Microsoft Moves Visual Studio Towards The Cloud in this 10/19/2009 post to TechCrunch:

Microsoft is making a significant announcement for developers today, upgrading and adding functionality to Visual Studio 2010 to make the product more cloud-friendly in anticipation of Microsoft’s release of its commercial cloud platform Azure. Visual Studio is Microsoft’s development environment that can be used to develop web applications, sites and services based on Microsoft’s technology platforms. …

Microsoft is making it much easier for developers to build on the Azure cloud with these new tools. The specific Windows Azure tools for Visual Studio let developers build ASP.NET web applications and services that are hosted in Azure’s cloud services operating system. The tools also include an SDK environment and a simulated cloud environment that runs on the developer’s machine, so developers can test and debug their applications locally.

VS 2010 Beta 2 won’t be cloud-friendly until November, when the Windows Azure team expects to release Windows Azure Tools for Visual Studio 2010 Beta 2, as noted in Jim Nakashima’s post at the end of this section.

Vaibhav Bhandari’s Running HealthVault Apps On Windows Azure post of 10/19/2009 is a detailed tutorial for getting a simple HealthVault application running under Windows Azure (requires the HealthVault SDK):

HealthVault SDK 1.0 introduces an interesting capability by which a HealthVault application can read its application certificate from a file. Eric has some details about this on his blog.

I’m going to describe how this capability of reading an application’s certificate from the file store could be used to run a HealthVault application on Windows Azure.

Check out a simple (HelloHV) application running on Windows Azure here.

You must have an existing HealthVault account that HealthVault Azure can access. Note that HealthVault Azure doesn’t provide the level of security of Microsoft’s official HealthVault application.

Vaibhav is a software engineer on the HealthVault Solution Provider team.

Wade Wegner announced in a 10/19/2009 tweet “Minor updates to Passive Federation w/ Windows Azure & ADFS v2 post given feedback from @eugenio_pace.”  (Repeated from the .NET Services: Access Control, Service Bus and Workflow section.)

Mary Jo Foley reports Free Silverlight streaming service to be replaced by paid Azure-hosted service in this 10/19/2009 post to ZDNet’s All About Microsoft blog:

Microsoft is discontinuing its free Silverlight Streaming service and replacing it with a paid, Azure-hosted service before the end of calendar 2009.

Microsoft is confirming the report I saw on LiveSide.Net from October 18, which is based on a blog post from Microsoft’s Silverlight Streaming team. Silverlight Streaming is a Windows Live beta service that supports hosted audio/video content. Microsoft officials described the offering as a companion service for Silverlight for delivering and scaling rich media.

The Softies are saying Silverlight Streaming will be discontinued at some time in the future, with no specific timeframe offered.

“A new Windows Azure-based hosting and delivery service will be launched by the end of 2009, though this is not a direct replacement for Silverlight Streaming and will have costs associated with its use,” according to the Silverlight Streaming team.

Alin Irimie asserts Visual Studio 2010 Is Cloud Friendly in this post of 10/19/2009 about the new release (to MSDN subscribers) of VS 2010 and .NET Framework 4:

Microsoft announced today the immediate availability of Microsoft Visual Studio 2010 Beta 2 and Microsoft .NET Framework 4 Beta 2 to MSDN subscribers; general availability will follow on Oct. 21.

New testing options in Visual Studio 2010 will help ensure quality code. Enhancements to the integrated development environment mean that whether modeling, coding, testing or debugging, developers can use existing skills to deploy a growing number of application types. Built-in tools for Windows 7 and Microsoft SharePoint 2010, new drag and drop bindings for Silverlight and Windows Presentation Foundation, and interoperability with innovative technologies (such as those for the database, ASP.NET model view controller, unified modeling language, Expression, and multicore) allow developers to bring their visions to life.

With the .NET Framework 4, developers can experience immensely smaller deployments with up to an 81 percent reduction in the framework size when using the Client Profile. Other .NET Framework 4 developer benefits include additional support for industry standards, inclusion of the Dynamic Language Runtime for more language choice, new support for high-performance middle-tier applications (including parallel programming, workflow and service-oriented applications) and backward compatibility through side-by-side installation with .NET Framework 3.5.

Jim Nakashima reviews some new VS 2010 Beta 2 features in his Windows Azure Tools and Visual Studio 2010 post of 10/18/2009:

We’re really excited about this release as not only does it support Visual Studio 2010 Beta 2, but it also adds a new UI over the service definition and configuration files, adds new template options for creating roles, improves debugging integration with the development fabric and integrates with a number of new platform features and improvements.

<Return to section navigation list> 

Windows Azure Infrastructure

• My MSDN Premium Subscribers and BizSpark Members to Receive Windows Azure and SQL Azure Benefits post of 10/20/2009 describes details of the eight-month introductory and subsequent free Windows Azure computing, storage and bandwidth quotas, as well as the SQL Azure Web edition instances available to MSDN subscribers and BizSpark members. (But apparently not to WebsiteSpark members.)

• Stephen Shankland reports that cloud computing is Gartner’s #1 strategic technology for 2010 in his Gartner: Brace yourself for cloud computing article of 10/20/2009, filed from Orlando, FL, for CNet’s DeepTech blog:

Cloud computing isn't going to be vapor much longer, Gartner said Tuesday.

The general idea--shared computing services accessible over the Internet that can expand or contract on demand--topped Gartner's list of the 10 top technologies that information technology personnel need to plan for. It's complicated, poses security risks, and computing technology companies are latching onto the buzzword in droves, but the phenomenon should be taken seriously, said analyst Dave Cearley here at the Gartner Symposium.

Gartner's top trends to watch. (Credit: Gartner)

Specifically, companies should figure out what cloud services might give them value, how to write applications that run on cloud services, and whether they should build their own private clouds that use Internet-style networking technology within a company's firewall.

• Ben Kepes’ It’s All About Networks – EuroCloud Forms post of 10/20/2009 reports that Phil Wainewright is:

[P]art of a group setting up Eurocloud. EuroCloud is an:

“European network of local SaaS and cloud computing communities from Denmark, UK, Belgium, Finland, Luxembourg, France and Spain, including vendors and industry experts. It is aiming at developing the next generation of added value applications. EuroCloud plans to create further local communities in Germany, the Netherlands, Poland, and Sweden, and will set up its headquarters in Brussels. Through its diverse membership, EuroCloud will promote cloud computing in Europe, including current state of the markets and future innovations, and will become a critical exchange platform across the different continents.”

EuroCloud is led by Pierre-José Billotte, President and Founder of the French ASP Forum. With a team of SaaS & cloud computing players from the UK, Denmark, Finland, Belgium, Luxembourg and Spain, EuroCloud gathers together leading SaaS vendors, enablers, integrators and industry experts to share best practice and expand their businesses across the continent.

• James Urquhart continues his The Wisdom of Clouds series with Cloud computing and the big rethink: Part 5 of 10/19/2009:

To date, this series has tried to guide you through the changes happening from the infrastructure, developer, and end user perspectives that signal the demise of the full-featured server operating system and the virtual server. Virtualization, and the large scale, multi-tenant operations model we know and love as "cloud computing," are enabling IT professionals to rethink the packaging, delivery, and operation of software functionality in extremely disruptive--and beneficial--ways.

So, what does this mean to the future of information technology? How will the role of IT, and the roles within IT, change as a result of the changing landscape of the technology it administers? What new applications--and resulting markets--are enabled by the "big rethink"?

Following are abbreviated versions of James’ observations:

  1. Software packaging will be application focused, not server focused.
  2. Enterprise IT will begin to bend enterprise and solutions architectures to align better with what is offered from the cloud.
  3. The changing relationship between software and hardware will result in new organizational structures within the IT department.
  4. The changing landscape of software development platforms will result in new philosophies of software architecture, deployment, and operations.
  5. The need for tactical systems administrators will be reduced.

Larry Dignan asks How did IT fall so far behind the tech curve? in this 10/19/2009 post to ZDNet’s Between the Lines blog that was inspired by a session at Gartner Symposium from analysts David Mitchell Smith and Tom Austin:

Information technology departments are overloaded, missing the consumerization wave, and failing to use new developments to cut their budgets.

Those are some of the takeaways from a Gartner presentation at the IT Symposium in Orlando. The spiel by Gartner analysts David Mitchell Smith and Tom Austin revolves around the state of IT departments as technology is rapidly being changed by their users. How exactly did IT become so crotchety?

Here’s the scene setter:

Most IT professionals want the world to proceed in orderly, incremental fashion, with no massive overnight changes and with plenty of time to adapt to external change. Significant discontinuities are the stuff of which nightmares are made. For example, when assumptions about the useful life of an asset shift early in a project, plummeting from several years to several months, investors can get ruined, and people can lose their jobs and more. We see five major intersecting discontinuities on the horizon. They amplify each other. Any one of them can upset the balance of power between users and their IT organization (or between vendors in a segment). Put the five together and let them amplify each other’s dislocating impact, and there is major trouble looming.

These five amplifying developments are:

  • Software as a service
  • Open source
  • Cloud computing
  • Web 2.0
  • Consumerization

More of the argument:

There is a fundamental mismatch between what enterprise IT is good at and what is happening on the Internet. For investment projects, IT organizations typically spend six to eight years from initial conceptualization through selling, planning, testing and implementation of the first release.  Project cycles, life spans and frequencies of Internet-related developments (and consumer-related product or service introductions) are radically different.

Jeremy Geelan interviews a Microsoft technical architect in this Infrastructure-as-a-Service Will Mature in 2010: Microsoft's David Chou post of 10/19/2009:

While acknowledging that lots of work is currently being done to differentiate and integrate private and public cloud solutions, Microsoft Architect David Chou believes that Infrastructure-as-a-service (IaaS) is the area of Cloud Computing that will make its impact most noticeably in 2010 - especially for startups, and small-medium sized businesses.

In this quickfire mini-interview with SYS-CON's Cloud Computing Journal, in the run-up to November's Cloud Computing Expo in Santa Clara, CA, Chou also mentions who he deems to be the Top Five Companies in the Cloud as at Fall 2009. One of the five he chooses, hardly surprisingly, begins with M.

This sounds to me as if Chou is damning Azure by faint praise or ignoring it completely.

Kevin Fogarty’s Five Problems Keeping Legacy Apps Out of the Cloud post of 10/15/2009 for CIO.com asks: “Did you think all those legacy apps would just float up into someone else's cloud infrastructure? Management, licensing and migration concerns highlight the list of troubles that vendors are now trying to address.”

The hype about cloud computing has gotten so loud that Gartner Group used Cloud as the lead in its hype-parazzi special report Hype Cycle 2009. The sharply sloping graph in the report places cloud, along with e-book readers, wireless power and social software suites, at or near the "Peak of Inflated Expectations," preparing for a dive into the "Trough of Disillusionment."

One thing that may drive it into that trough — other than the unrealistic projections by some providers of cost-savings and easy capacity planning — is the difficulty in getting certain applications to run on it effectively, according to analysts and vendors selling technology to help bridge the gap.

Following are abbreviated versions of Kevin’s five problems:

    1. Today's clouds are not alike
    2. Security worries
    3. Licensing and interoperability concerns
    4. You don't know your own legacy
    5. Migration is manual and darn few tools will help

<Return to section navigation list> 

Cloud Security and Governance

David Linthicum warns Beware of cloud computing consultants who focus on infrastructure in this 10/19/2009 post to InfoWorld’s Cloud Computing blog: “The rise of cloud computing has led to a lot of 'consultants' who care more about the simple relocation of systems than real architecture.”

Along with the rapid rise of cloud computing, an army of people with the phrase "cloud computing consultant" on their business cards has arrived. However, many of these individuals, while well intentioned, come from the infrastructure side and have a tendency to miss the core value of cloud computing. They disregard the potential for cloud computing to better support the business, and instead view it simply as a change of platforms.

Let me be very clear: The movement toward cloud computing is an architectural issue, meaning that you have to take the whole business into account, understand all existing systems at a detail level, and only then begin to consider cloud computing options. Moreover, you need to think about how the systems are divided among on-premise and cloud-delivered platforms, the synergy of the architecture with the mission, and the future needs of the business. …

Christina Tynan-Wood asks Should I back up data stored in the cloud? in this 10/19/2009 post to InfoWorld’s Adventures in IT blog:

Yesterday, I participated in a blog radio discussion with some folks from Google, another journalist, and Sree Sreenivasan, dean of student affairs and digital media professor at the Columbia Journalism School. The topic was how journalists use Google Docs and cloud computing, but the conversation took some interesting turns. In the aftermath of the Sidekick data failure, we were worried: How backed up is the data we store in the cloud?

People who rely on Google Docs often have a great deal of important work there. And as Gunner points out in the comments to my last post, "While it could be argued that a customer is still liable for their own backup, this 'cloud backup' is a service that is touted as being a backup -- not 'another' backup. For a vast majority of users, it was intended as their only backup, and why not?"

Windows Azure stores the original and two replicas on three separate servers in a single data center. If you’re concerned about catastrophic loss of the data center, you’ll need to adopt multiple geo-locations when Microsoft offers the feature, presumably at PDC 2009.

GovInfoSecurity’s Fed Regulation of Private Data Mulled post of 10/16/2009 is subtitled “House Cyber Panel Chair Suggests a National Data Breach Law.”

Congress should consider enacting legislation allowing the federal government to regulate how the private sector handles and stores data to battle the increasing problem of data breaches, says the chairwoman of a House panel that has jurisdiction over cybersecurity.

Rep. Yvette Clarke, the Brooklyn, N.Y., Democrat who chairs the House Homeland Security Subcommittee on Emerging Threats, Cybersecurity and Science and Technology, says she hopes to hold hearings on what she calls the National Data Breach Law either later this year or in early 2010.

"There is no way that we can, giving the evolving nature of data breaches, not (to) regulate, bring some uniformity to our expectations on how data is managed, dealt with and stored," Clarke said in an interview with GovInfoSecurity.com on Friday. "Everyone has come to that realization."

Tanya Forsheit’s Legal Implications of Cloud Computing -- Part Two (Privacy and the Cloud) post of 9/30/2009, which I missed at the end of September, begins:

Last month we posted some basics on cloud computing designed to provide some context and identify the legal issues.  What is the cloud?  Why is everyone in the tech community talking about it?  Why do we as lawyers even care?  Dave provided a few things for our readers to think about -- privacy, security, e-discovery.

Now, let's dig a little deeper.

I am going to start with privacy and cross-border data transfers.  Is there privacy in the cloud?  What are the privacy laws to keep in mind?  What are an organization's compliance obligations?   As with so many issues in the privacy space, the answer begins with one key principle -- location, location, location.  For those of you who prefer to listen, check out my recent webinar on International Regulatory Issues in the Cloud, or you can download the slides (PPTX).

I also missed her earlier Legal Implications of Cloud Computing — Part One (the Basics and Framing the Issues) post of 8/18/2009.

<Return to section navigation list>

Cloud Computing Events

• Dmitry Sotnikov’s Cloud and BPOS sessions at Oct 21 Virtual Tradeshow post of 10/20/2009 observes that:

Quest Connect is a big online conference put together by Quest, Microsoft, Dell, NetApp, Vizioncore, Scriptlogic, Techrepublic, Oracle Magazine, Redmond Magazine, and The Code Project. The agenda is packed with a lot of useful material on Windows Server 2008 R2, AD, Identity Management, Exchange 2010, Virtualization, Cloud Computing, SharePoint, SQL, Oracle – see full agenda here – and they include some sessions specifically on cloud computing and Microsoft Online Services. …

See live session list here and on-demand list here.

• IDC announced its Cloud Computing Forum, Getting Down to Business with the Cloud, to be held November 4th at the Marriott Marquis in New York City, in this 10/20/2009 press release:

The IDC Cloud Computing Forum will feature a keynote session with Frank Gens, senior vice president and chief analyst at IDC and the leader of IDC's Cloud Computing Team. This session will share the latest research and perspectives on new "cloud-enabled" IT models that leverage software-as-a-service, "mash-up" application models, next-generation on-premise systems, and very large, global solution marketplaces to "democratize" IT access and enable organizations to out-innovate and out-execute their competitors, based on the power of their ideas rather than the size of their IT budgets or skills. Gens will also serve as the Forum Chair.

Forum case studies and presentations will focus on key topics including:

  • Here Comes the Cloud: New IT Models for Growth and Innovation
  • Managing and Securing Information in the Cloud
  • Golf in the Cloud? How the United States Golf Association has Leveraged the Cloud for Technology Advances in Tough Economic Times
  • Making Private Clouds Part of Your Data Center Transformation
  • Becoming a Cloud Services Provider: Issues, Patterns, and Considerations
  • Core Business Apps in the Cloud: Which are Ready for Prime Time?

You can get more information or register for the IDC Cloud Computing Forum here.

Leo King reports in his IDC: Use cloud as 'stop-gap' measure in recession article of 10/19/2009 for Computerworld UK:

Businesses should only use cloud computing as a "stop-gap" measure to help them survive the cost pressures of the recession.

That is the verdict of analyst house IDC, which said research showed cloud services only provided short-term cost savings. Within three years of implementing software as a service, it said, large businesses regularly found the costs exceeded those of running their own on-premise systems.

"Cloud costs need to come down much further to be a realistic long term option," said Matthew McCormack, IDC analyst, at the company's recent Cloud Computing Summit in London. "It could be useful in the short term financially for companies with severe cost overruns."

"Your datacentre would have to be really poorly run for it to be more expensive than cloud in the long run," he added.

Sounds to me like IDC’s UK and US analysts aren’t reading from the same page when it comes to cloud computing.

John Pescatore’s At Gartner Symposium: Gartner Uses Every Part of the Analyst, Including the Oink post to the Gartner blog of 10/19/2009 questions the lack of security-related sessions:

This week I’ll be sucked into the Gartner IT Symposium vortex, where life is pretty much a constant rotation of 1-1 meetings with attendees, giving presentations, doing the normal inquiry phone calls with Gartner clients, and sneaking time online to work off the never-ending flow of email.

Looking through my calendar at the one-on-one attendee meetings scheduled, the topics run the gamut. However, a few trends stand out:

  1. Mobility – secure telework, secure use of smartphones and WLAN security questions.
  2. Outsourcing – the terms are different (“use of the cloud” replaced “consume X as a service,” which replaced “use an external hoster,” which replaced “outsourcing”) but the questions are still about how the business can maintain security while outsourcing some function to external parties.
  3. Threat update – what are the new threats we should worry about?

The questions I don’t see are to me the most interesting. Things like “How do I keep our corporate websites secure?” and “how do we make sure we aren’t already compromised by bot clients?” are missing. Essentially, there is a lack of attention to the current state of security.

Aaron Skonnard announced his return from the Heartland Developer’s Conference (HDC) ‘09 in this 10/19/2009 post:

I had a great time at the Heartland Developer’s Conference (HDC) ‘09 in Omaha last week. I really liked the size of the event (not too small, not too big), the overall energy/vibe, and the social elements they provided for attendees. The speaker line-up was quite good for a regional event. …

If you’re looking for my demos from the show, you can grab them here.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

ChrisW@AWS posted Announcement: Auto Scaling groups can now span multiple availability zones to the Amazon Elastic Compute Cloud forum on 10/19/2009:

We are excited to announce that starting today you can configure your Auto Scaling groups to span multiple Availability Zones. With Multi-AZ Auto Scaling groups, we provide a way to achieve a balanced group of EC2 instances that are spread across multiple Availability Zones for high availability, and provide a single entity for you to manage. In addition, we mitigate the problem of zones becoming unavailable or congested by temporarily allocating capacity in other zones and rebalancing the group back over time.

Please visit our Auto Scaling detail page for additional detail on EC2 Auto Scaling and our developer documentation for more information on the new functionality.
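In code, spanning zones is just a matter of naming more than one Availability Zone when you define the group. Here’s a sketch using the present-day boto3 SDK for illustration (the 2009-era tooling was the Auto Scaling command-line tools and raw API); the group name, AMI ID and zones below are placeholders:

```python
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")

# Launch configuration the group will use; the AMI ID is a placeholder.
autoscaling.create_launch_configuration(
    LaunchConfigurationName="web-lc",
    ImageId="ami-12345678",
    InstanceType="m1.small",
)

# Naming two zones is what makes this a Multi-AZ group: Auto Scaling
# spreads the instances across them and rebalances over time.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="web-asg",
    LaunchConfigurationName="web-lc",
    MinSize=2,
    MaxSize=8,
    AvailabilityZones=["us-east-1a", "us-east-1b"],
)
```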

Andrea DiMaio posits Government as Cloud Service Provider: The Battle Has Started in this 10/19/2009 post:

Yesterday, I addressed our audience at the Gartner Symposium in Orlando on cloud computing in government about the possible competition between vendor cloud providers (like Google, Amazon, Terremark, and so forth) and government infrastructure owners (such as NASA Ames, DISA, NBC). I have blogged about this before and I also asked a related question to Vivek Kundra, the US Federal CIO, while I was interviewing him on stage. Before that, my colleague Jeff Vining moderated a panel with those same government agencies on stage.

By one of those coincidences that make life so interesting, pretty much at the same time NASA was announcing that the Office of Management and Budget is moving USASpending.gov to NASA Ames Research Center’s cloud computer infrastructure called Nebula.

On Tuesday Oct 20 at 12:30 pm I will moderate a vendor panel with Microsoft, Google, Salesforce.com, Terremark and CSC to get their views about where they think cloud computing in government should and will go and to provoke some discussion about whether they think they will cooperate or rather compete with NASA Ames and the likes.

James Staten’s 3 Keys to Using IaaS Public Clouds Wisely post of 10/19/2009 for CIO.com begins with the premise:

Public IaaS cloud computing is the delivery of compute (virtualized servers, storage, and networking) on demand as a shared service. The promises of IaaS clouds combine real flexibility and instant capacity with compelling economics like $0.10 per CPU per hour. While the economics are true, they are a tease.

The value of IaaS clouds lies in developer productivity and time-to-market more than cost, as IaaS clouds let developers entirely control the provisioning, configuration, and deployment of the VM themselves. The key is deploying the right kinds of applications, for the right types of uses, with the right business model behind this practice.

Following are abbreviated versions of James’ three keys:

    1. Best Practice 1: Test and Development in the Cloud
    2. Best Practice 2: Deploy Web Applications
    3. Best Practice 3: High-Performance Computing (HPC)

Rich Miller asserts Google CapEx Spending Rebounds Slightly in this 10/19/2009 post:

The third-quarter capital expenditure (CapEx) total was up slightly from $139 million in the second quarter but dramatically lower than the company spent during the same period in 2006 ($492 million), 2007 ($453 million) and 2008 ($452 million). Here’s a look at the recent trend:

[Chart: Google quarterly CapEx through Q3 2009]

John Treadway’s Cloud Computing in the Enterprise – Private (Internal) Clouds post of 10/19/2009 begins:

I’ve been doing a lot of work on private (internal) clouds lately – it’s a result of my new job with Unisys.  Part of that work has been spending time with customers on their plans for cloud computing — internal and external.  There’s some very interesting work going on in the private cloud space, and the solutions available to enterprises to build their clouds are many.

Note – I make the (internal) distinction for a reason.  The term “private cloud” is now starting to morph from purely internal, to internal and external clouds controlled closely by IT.  Amazon’s Virtual Private Cloud is an example of a private cloud in an external provider setting.

I have seen charts from Gartner that show how private (internal) clouds will get more money from IT over the next few years than public clouds.  I’ve also seen the benefits of a private cloud in the development/test workload scenario here at Unisys.  The numbers are pretty staggering (we are publishing a paper on this).

Reuven Cohen asks Is IaaS (as a term) Doomed? in this 10/18/2009 post:

Interesting post over at thewhir.com by Joshua Beil, the Director of Market Strategy and Research for Parallels. In his post Beil asks a simple yet profound question. Is the term "IaaS" doomed?

He says "It’s not that the concept of “infrastructure-as-a-service” is flawed… it’s the acronym that is doomed. Let’s face it, “SaaS and “PaaS” can be said out loud, but when you say “IaaS” in same way, well, it just doesn’t work. I’m reminded about something my mother once told me about what happens when I assume."

He goes on to outline three options.

  1. Migrate from IaaS to CaaS and make Verizon’s day (CaaS is Computing as a Service, and what Verizon calls their IaaS service)

  2. Consolidate IaaS and PaaS into just PaaS, as the distinction between these two is getting blurrier as offerings from Amazon and Microsoft’s Azure evolve.

  3. Replace IaaS with another term that’s not a “C”… but what? I spent some time looking at synonyms for the word “infrastructure” but just didn’t see anything that worked really well. …

Ruv continues by saying that he agrees with Joshua’s analysis and explains why.

M. Koenig and B. Guptill claim Best of Both Worlds: Here Comes the Hybrid Cloud! in this Saugatuck Research Alert of 10/15/2009 (site registration required):

On Monday October 12, 2009, Salesforce.com announced the availability of new, jointly-marketed Cloud-based CRM solutions from Salesforce.com and Dell Inc. Dell will offer Salesforce.com's SaaS-based CRM and sales automation applications to its SMB clients, and will provide implementation and integration with existing systems and databases. Subscription based pricing starts from $9 per user per month. The next day, at Oracle Corporation’s OpenWorld conference, Salesforce.com CEO Marc Benioff appeared on stage, with special guest Michael Dell of Dell Corporation to deliver a keynote entitled “The Best of Both Worlds,” that laid out a vision of a hybrid cloud in which on-premises applications work together with on-demand applications.

Taken together, this week’s announcement and speech represent an acknowledgement by three of the biggest forces in software and hardware (Dell, Salesforce, and host company Oracle) that the IT model of today – and of the future – is one that integrates on-premise and Cloud-based hardware, software and services. …

<Return to section navigation list> 
