Sunday, August 09, 2009

Windows Azure and Cloud Computing Posts for 8/6/2009+

Windows Azure, Azure Data Services, SQL Azure Database and related cloud computing topics now appear in this weekly series.

• Update 8/7 - 8/9/2009: Electronic Health Records, Personal Health Records, and other additions

Note: This post is updated daily or more frequently, depending on the availability of new articles in the sections below.

To use these links, click the post title to display the single article to which you want to navigate.

Azure Blob, Table and Queue Services

Cloudera, Inc.’s Building a Data Intensive Web Application with Cloudera, Hadoop, Hive, Pig, and EC2 tutorial of 8/6/2009 will:

[S]how you how to use Amazon EC2 and Cloudera's Distribution for Hadoop to run batch jobs for a data intensive web application. During the tutorial, we will perform the following data processing steps:

  • Configure and launch a Hadoop cluster on Amazon EC2 using the Cloudera tools
  • Load Wikipedia log data into Hadoop from Amazon Elastic Block Store (EBS) snapshots and Amazon S3
  • Run simple Pig and Hive commands on the log data
  • Write a MapReduce job to clean the raw data and aggregate it to a daily level (page_title, date, count)
  • Write a Hive query that finds trending Wikipedia articles by calling a custom mapper script
  • Join the trend data in Hive with a table of Wikipedia page IDs
  • Export the trend query results to S3 as a tab delimited text file for use in our web application's MySQL database …
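For readers who haven’t written a Hadoop Streaming job before, here’s a minimal Python reducer sketch of the daily-aggregation step. It’s my own illustration, not Cloudera’s code, and it assumes the mapper has already emitted records as tab-separated page_title, date, count fields:

```python
#!/usr/bin/env python
# Hypothetical Hadoop Streaming reducer: collapses hourly Wikipedia pagecount
# records into daily (page_title, date, count) totals. Assumes Hadoop has
# sorted the mapper output so identical keys arrive on consecutive lines.
import sys

current_key, total = None, 0
for line in sys.stdin:
    page_title, date, count = line.rstrip("\n").split("\t")
    key = (page_title, date)
    if key != current_key:
        if current_key is not None:
            print("%s\t%s\t%d" % (current_key[0], current_key[1], total))
        current_key, total = key, 0
    total += int(count)
if current_key is not None:
    print("%s\t%s\t%d" % (current_key[0], current_key[1], total))
```

The tutorial’s Hive and Pig steps express the same aggregation declaratively; the streaming version just makes the shuffle-and-reduce mechanics visible.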

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

The SQL Azure Team has posted documentation for SQL Azure CTP1 on the following topics:

See Jon Oltsik proposes A Prudent Approach for Storage Encryption and Key Management in the Cloud Security and Governance section. Storage encryption and key management are important topics for “SQL Server in the Cloud.”

<Return to section navigation list> 

.NET Services: Access Control, Service Bus and Workflow

The Geneva Team made three Identity Samples available on Code Gallery on 8/7/2009, but they don’t involve Windows Azure. The team describes the samples:

    • FabrikamShipping. This is a fairly complete example of how to use the Windows Identity Foundation for addressing common tasks in the development of web solutions: accepting identities from an external identity provider, driving the UI using claims, invoking back-end WCF services via delegated authentication, handling claims-based authorization and so on. The sample is based on the scenario described in Kim Cameron’s PDC08 session. You can download the sample from here; a detailed description is available here.
    • ClaimsDrivenModifierControl. This is a sample ASP.NET control that demonstrates how you can take advantage of claims for driving the behavior of your web UX without the need of writing any code! You can download the sample from here; a detailed description is available here; finally, if you want to see the control in action a screencast is available here.
    • SecurityTokenVisualizerControl. This is a very simple ASP.NET control that can help you to debug your websites secured with the Windows Identity Foundation, by allowing you to inspect identity information in the current context such as the claims list, raw XML of the incoming token, signing certificates and more. You can download the sample from here; a detailed description is available here.

They’re included in this post because any application that demonstrates Geneva technologies aids in understanding federated identity services and the like.

Vittorio Bertocci (a.k.a. Vibro) posted a Windows Identity Foundation and Windows Azure passive federation guide on the MSDN Code Gallery on 7/4/2009:

This guide provides step-by-step instructions for hosting in Windows Azure a web application which accepts identities from an external identity provider by taking advantage of Windows Identity Foundation.

This is a provisional sample whose purpose is giving you a chance to experiment with federated scenarios in WA today, by using publicly available bits (Windows Azure July CTP and Geneva Framework Beta 2, the codename beta release of Windows Identity Foundation). The code shown here is NOT production ready and contains various temporary compromises: you can expect many of those to become unnecessary in future releases of the products.

There’s more detail about integrating Geneva features with Windows Azure in Vibro’s A Guide for Enabling Federated Authentication on Windows Azure WebRoles… using Windows Identity Foundation post of 8/4/2009.

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

• Steve Marx asks Agility Makes Up for Bad Preparation? in this 8/7/2009 post following the live chat in the Windows Azure Lounge.

I’d say “Absolutely.” After retrenching, reprogramming, and restarting an hour later to get rid of the trolls, I believe all participants were more than satisfied with the chat session. It looks like the next chat is scheduled for Friday, 8/21/2009 at 12:00 noon PDT.

Steve promises to post on Monday or later a transcript of the complete chat session for those of us who didn’t allow popups before logging in and thus lost the ability to save it.

• Simon Munro’s Windows Azure Chat Nuggets post of 8/7/2009 offers a brief summary of the topics covered in the first chat session.

Steve Marx describes his Live Feedback Mashup: Tinychat + PollDaddy + Windows Azure in a post of 8/7/2009 that introduces the Windows Azure Lounge at http://wazl.cloudapp.net.

• Laith Noel Yousif posts on 8/7/2009 the source code for Building Twtmug, which runs in Windows Azure, looks up your public photos on SmugMug, and then posts them to Twitter with a short link.

• Jim Nakashima’s Windows Azure Tools and Visual Studio 2010 post of 8/7/2009 says:

The May and July 2009 CTPs of the Windows Azure Tools both support Visual Studio 2010 Beta 1.

As far as support for Visual Studio 2010 Beta 2 -- we won't have a release of Windows Azure Tools that supports Visual Studio 2010 Beta 2 until around the time that Beta 2 officially ships publicly. 

Stay tuned.

Bill McColl’s The Intercloud: Coming to a Desktop or Mobile Device Near You post of 8/6/2009 claims the “Cloudscale platform can handle all types of realtime data, and is as easy to use as a spreadsheet” with Windows Azure and Amazon EC2:

At Cloudscale we're preparing for the public launch, later in the year, of the first Intercloud service, a platform for the world's realtime apps. In response to a number of requests for information, here is a brief overview of what we have been developing.

The Cloudscale platform can handle all types of realtime data, and is as easy to use as a spreadsheet. Complex analytics can be run continuously, with automatic scaling and fault tolerance to ensure realtime responsiveness. The platform offers users seamless integration from standard desktop or mobile clients to realtime intercloud apps.

As a self-service platform aimed at mass market adoption, there is no database software and administration to deal with, and no IT department delays and complexity to get in the way of delivering immediate value to users. The platform will drive the democratization of data, unleashing creativity and “turning data into action” everywhere. Like the iPhone platform, Cloudscale’s AppStore will provide a commercial marketplace for the many exciting new apps developed on the platform.

<Return to section navigation list> 

Windows Azure Infrastructure

• Bob Gourley’s Cloud Computing vs. SOA: Look For a Cross-over in Hype post of 8/9/2009 observes that:

Cloud Computing is one of the many things enterprise CIOs, CTOs and other engineers will master in delivering capability.  I believe in the power of new Cloud Computing technologies and concepts and think we should all continue our focus there.

I have said, and still say, the same thing about design approaches like Service Oriented Architecture (SOA).  The constructs, methods and models of SOA are good practices that result in good designs for enterprises.   It is smart to separate data from application logic and smart to enable agility and mashups the way good SOA design does.

And then goes on to plot the declining volume of Web searches for “SOA” versus an increasing volume for “cloud computing,” and concludes:

… Your defense against this [flood of hype] will be the strength of your own position in Cloud Computing.  Therefore, I strongly recommend you personally think through what Cloud Computing means to your enterprise.  Also, think through your definition of cloud computing.  Hopefully the NIST definition will suit your use, since the more people who form up on that the better.

Chris Hoff (@Beaker) points out that There’s A Difference Between Application/OS Multitenancy and Data(base) Multitenancy in this essay of 8/8/2009 about a cloud provider with a multitenant backend database and a multiuser front-end application.
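Hoff’s distinction is worth making concrete. Here’s a minimal sketch of why database multitenancy hinges on disciplined tenant filtering; it’s my illustration of a generic shared-schema design, not any particular provider’s actual schema:

```python
# Shared-schema database multitenancy: all tenants' rows live in one table,
# distinguished only by a tenant_id column. (Illustrative sketch.)
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (tenant_id TEXT, id INTEGER, amount REAL)")
conn.execute("INSERT INTO invoices VALUES ('tenant_a', 1, 99.00)")
conn.execute("INSERT INTO invoices VALUES ('tenant_b', 2, 42.00)")

# Every query must carry the tenant predicate; omit it and one tenant sees
# another's data. That risk doesn't exist in the same form when each tenant
# gets its own database, which is why the two designs shouldn't be conflated.
rows = conn.execute(
    "SELECT id, amount FROM invoices WHERE tenant_id = ?", ("tenant_a",)
).fetchall()
print(rows)  # [(1, 99.0)]
```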

Dana Blankenhorn reports Google, Microsoft demand place at medical stimulus table in this 8/8/2009 article for ZDNet’s Healthcare blog:

Google and Microsoft are demanding Web access to electronic medical records and took their complaints right to the top this week.

Google chairman Eric Schmidt and Microsoft chief strategy officer Craig Mundie both hammered National Coordinator for Health IT (NCHIT) David Blumenthal on the issue during an advisory committee meeting Thursday.

What’s most interesting is that both Blumenthal and U.S. Chief Technology Officer Aneesh Chopra pushed back. The HITECH stimulus money goes to hospitals that automate records and deliver the functional requirements of a recently-passed plan on meaningful use.

The resulting Electronic Health Records (EHRs) are covered by the HIPAA law in terms of how they can be shared. What Schmidt and Mundie want is for EHRs to respond to Web standards so the records can be turned into Personal Health Records (PHRs) controlled by patients.

For other takes on the committee meeting, see iHealthBeat’s Google, Microsoft Question Government's Health IT Plans article of 8/7/2009 and Bob Brewin’s Google, Microsoft executives criticize Obama's e-health records plan post of 8/6/2009.

According to Mary Jo Foley’s Microsoft links HealthVault service with Amalga software post of 4/6/2009, reported in Windows Azure and Cloud Computing Posts for 7/13/2009+:

HealthVault is Microsoft’s consumer-focused health-records-management Software+Service platform, which the company unveiled officially in 2007. (The service component of HealthVault is one of a handful of Microsoft services that already is hosted on top of Azure.) Amalga UIS (one of the products formerly under the Azyxxi brand) is one of the main elements of Microsoft’s enterprise health-information-system platform.

See also Scott Shreeve, MD’s Microsoft Vaults Ahead into the Personal Health Information Space post of 8/5/2009 to the Crossover Health site:

I am flying home from the HealthVault Connected Care Conference in Seattle. I left with two big takeaways which will be addressed in two separate posts. It was a great trip, in fact refreshing in many ways, coming on the heels of a wonderful but intellectually strenuous series of meetings for the X PRIZE. In fact, I have never been to Seattle when it was so beautiful – perfectly warm and sunny days with intermittent cumulus clouds and light breezes whose temperature was nearly imperceptible. My favorite evening in town was spent watching sailboats glide effortlessly around the Sound in the fading sunlight of a perfect day. Magic.

Perhaps the setting got me in a good mood, but I walked away very clearly impressed with what Microsoft is attempting to do with their health care strategy. I have to be clear – as an ardent and passionate open source advocate (recovering zealot) – I was very ambivalent about stepping clearly into and over “enemy” lines during my sojourn in Redmond. I was quickly put at ease by the West Coast flavor of the meeting (i.e., casual business dress with a young-ish crowd, high energy music, and overall good karma) and the impressive lineup of speakers and attendees. Furthermore, this was the first time I was actually able to figure out what the heck HealthVault really is and how all these various partnerships I keep reading about even begin to make sense. …

So kudos to Peter Neupert and crew for the progress to date. I was impressed.

But I was also puzzled at the same time – Where (on earth) Is Google Health?

His CLEAR! Shocking Google Health Back to Life post of the same date compares the investments and projects the chances of success of Google Health and Microsoft HealthVault.

John Moore’s Is Google Health Irrelevant? post of 8/6/2009 agrees with Shreeve’s post about Google Health and contends:

… Google Health has been nothing more than a distraction to the broader market.  A distraction in that Google Health has really done very little to create a truly compelling platform, yet due to its size, market presence and media and market pundits belief that Google is the be all to end all, Google Health gets far more press and attention than it rightfully deserves …

John is managing director of Chilmark Research, which describes itself as “a healthcare technology industry analyst firm focusing on personal healthcare technology that will enable citizens to take more direct responsibility for their health and the health of loved ones.”

The HealthVault Connected Care Conference was held 6/10 to 6/12/2009 at the Meydenbauer Center in Bellevue, WA. Microsoft’s site offers PDF presentations and video segments of the sessions. The US$19 billion allocated to health information technology (HIT) by the American Recovery and Reinvestment Act (ARRA) of 2009 has greatly increased interest in HIT as well as EHR and PHR implementations. Alice Lipowicz’s Is the nation's health network healthy? article of 8/7/2009 in FederalComputerWeek throws more light on the ARRA incentives and the National Health Information Network (NHIN), the Health and Human Services Department’s network for exchanging health information over the Internet.

HealthVault and Practice Fusion, which was the recipient of a recent investment by Salesforce.com (see last item in this post), appear to be competitors. However, Practice Fusion offers free EHR services primarily to physicians while HealthVault provides PHR storage for patients.

Richard MacManus analyzes the Salesforce investment in his Practice Fusion Partners With Salesforce-But Is Cloud Computing Suitable For Healthcare? article of 8/7/2009 for the NY Times.

Phillip Longman’s Code Red: How software companies could screw up Obama’s health care reform cover story in Washington Monthly’s August 2009 issue discusses the issue of proprietary health IT systems versus the Veterans Administration’s VistA system.

• Chris Hoff (@Beaker) goes off the deep end with his Introducing the “Cloud For Clunkers Program” post of 8/8/2009:

As compelling as the offer of Cloud may be, in order to pull off incentivizing large enterprises to think differently, it requires an awful lot going on under the covers to provide this level of abstracted awesomeness; a ton of heavy lifting and the equipment and facilities to go with it. [Emphasis added.] …

To get ready for the gold rush, most of the top-tier IaaS/PaaS Cloud providers are building data processing MegaCenters around the globe in order to provide these services, investing billions of dollars to do so…all supposedly so you don’t have to.  Remember, however, that service providers make money by squeezing the most out of you while providing as little as they need to in order to ensure the circle of life continues.  Note, this is not an indictment of that practice, as $deity knows I’ve done enough of that myself, but just because it has the word “Cloud” in front of it does not make it any different from a business case.  Live by the ARPU, die by the ARPU. …

Datacenter Dynamics’ The American cloud’s weakest link post of 8/7/2009 has “Dark fiber developer brings the cloud back to the ground and shares his broadband stimulus experience” as its deck:

An essential element is often left out of excited industry discourse around cloud-computing. Is the physical network infrastructure in the country sufficient to support visions of the future cloud?

Allied Fiber CEO Hunter Newby offered a rather sobering view of reality at the DatacenterDynamics conference in Seattle, Wash., Thursday. The company specializes in building out dark fiber infrastructure.

“Without physical there is no virtual,” Newby said. “Without fiber there is no cloud.”

“Moving apps into the cloud is very dangerous if you don’t know your physical fiber route. You can be buying from three or four different providers but there’s only one path and everybody else is tied too on that path and you think you’re redundant and diverse, but in fact, you’re not. Those are very basic questions you need to ask before you do anything in higher layers. I believe that if you’re not aware of the basic fundamental things that are very simple to understand, your entire business that you build above it is in jeopardy.”

Fiber-rich patches, such as coastal areas, are sporadically spread around the country, enabling a healthy amount of competition in those areas, but connectivity outside of those areas leaves a lot to be desired. “And no one company can afford to build out the proper infrastructure to make it all work and that’s the problem.”

Of the $787 billion the U.S. government allocated to stimulating the economy, $7.2 billion was dedicated to developing the country’s broadband infrastructure. Newby feels that, while a lot can be accomplished with $7.2 billion, the amount is insufficient for satisfying the country’s broadband needs. …

David Linthicum warns Don't be fooled: Cloud computing is not so simple on 8/7/2009: “Beware oversimplifications such as 'the cloud is like the electric grid' -- they're destined to make you fail.”

I enjoyed James Urquhart's post, "In cloud computing, data is not electricity," which points out some of the sillier analogies we're seeing in the emerging cloud computing space. Specifically, Urquhart refers to Nick Carr's classic vision of cloud computing, "The Big Switch," which compares traditional on-premises computing to generating your own power and cloud computing to using the standard power grid.

"However, some have taken electricity as an analogy to cloud adoption to an extreme, and declared that there will be a massive and sudden shift from corporate datacenters to entirely external cloud computing environments -- public cloud utilities, if you will. They are wrong," Urquhart writes.

Hear, hear.

John Foley chimes in with Microsoft's Drag-And-Drop Windows Azure Cloud, his analysis of 8/7/2009 of the Quincy, WA data center sales tax issue for InformationWeek. (See additional stories below):

Citing an unfavorable change in tax laws, Microsoft is moving its Windows Azure cloud from a data center in Washington state to one in Texas. It's an interesting new twist in the cloud computing market—moving a cloud across state lines in response to the regulatory climate.

Of course, the problem will be that there will only be a single US data center available for some time, which means that geolocation for disaster recovery won’t be an option for early Azure adopters.

Miko Matsumura (a.k.a. @MikoJava) claimed SOA Arrogance is Dead when he followed Anne Thomas Manes’ (@atmanes) session at the Burton Group’s Catalyst Conference on 7/29/2009 and made the following point in his 8/7/2009 post:

First and foremost, the most stupid and ignorant reading of “SOA is DEAD” is that the perspective of SOA is no longer needed in the Enterprise. This point of view is stupid, particularly when SOA is so important for mash-ups, Cloud Computing, SaaS, PaaS, BSM, IT Governance, Portfolio management and most modern IT practices.

The problem of Enterprise IT Complexity (and Entropy) *DOES* need to be solved. SOA is one of many key architectural perspectives that can make this happen.

Everything is a service (SOA) is an incredibly powerful view.

But within appropriate bounds, everything can also be appropriately viewed as a Process, an Event, an Object, a database table, or other abstraction.

The idea that an enterprise architect could become so focused on “one architecture to rule them all” is as preposterous as “one vendor to rule them all”.

@atmanes’ session was In Memory of SOA, a sequel to her famous SOA is Dead; Long Live Services blog post of 1/5/2009. Her current presentation carried the following abstract:

In most organizations, SOA has become a bad word. Except in rare situations, SOA has failed to deliver its promised benefits. IT Groups have invested hundreds of thousands, if not millions of dollars into SOA with little return to show for it. The people holding the purse strings are fed up. Funding for these SOA initiatives has dried up.

It’s time to face reality: the term “SOA” now carries too much baggage. It’s time to declare SOA dead and move on.

So what went wrong? Was SOA really just a great failed experiment? Or did we just lose our way? Should we abandon our architectural efforts? Can we salvage any value from our past efforts?

Research and Markets, which claims to be “The World’s Largest Market Research Resource,” offers to share their Worldwide Cloud Computing Market Shares, Strategies, and Forecasts, 2009-2015 report with you for US$3,400 (US$6,800 for a site license).

This 2009 study has 712 pages, 211 Tables and Figures. Worldwide markets are poised to achieve significant growth as search engines use efficient automated process to drive new advertising and communications capabilities. Applications can be built without programming. …

SOA reaches into every industry and every segment of the economy via cloud computing. SOA drives innovation for the very large enterprises. Mid-range companies and very small organizations are adopting technologies similar to what the enterprises use, creating automated processes to replace manual processes. Cloud computing markets at $36 billion in 2008 are expected to reach $160.2 billion by 2015. …
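For perspective, that forecast implies a compound annual growth rate of roughly 24 percent, as this quick back-of-the-envelope calculation shows:

```python
# Implied CAGR of the cited forecast: $36B in 2008 growing to $160.2B in 2015.
cagr = (160.2 / 36.0) ** (1.0 / 7) - 1  # seven-year horizon
print("Implied CAGR 2008-2015: %.1f%%" % (cagr * 100))  # ~23.8%
```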

The question is what will be the market for cloud-computing research reports in 2015?

David Linthicum reduces his blood pressure by writing Cloud Computing & SOA: Getting the Links Straight Between Them, “How SOA benefits cloud computing”:

Want to know what gets my blood pressure up? It's when there's both a huge shift in thinking around how we should do computing, namely cloud computing, and at the same time, there's a bunch of information out there that causes confusion. As cloud computing hype spikes to a frenzy, so does the number of less-than-intelligent things that I hear about it and its relationship to SOA.

We've got a herd mentality in IT. We're always chasing the next paradigm shift, which seems to come along every five years, claiming that whatever the last paradigm shift was had "failed" and that's why we're looking at something new. However, these hype-driven trends are often complementary, and so the real power is in figuring out how known approaches fit with what's new, and not look to replace, but how to build on the foundation. The best case for that scenario has been how SOA benefits cloud computing, but few understand how and why. …

Lori MacVittie explains Why the cloud operating system is a myth in this 8/6/2009 article for ZDNet UK:

Google's foray into the operating-system business was barely public before it was being hailed as the latest cloud operating system, with some industry watchers saying this product could be pivotal in the cloud revolution.

To be fair, Google's Chrome OS is not the only operating system to which the cloud handle has been attached. It is merely the latest in a long line of attempts to capitalise on the growing interest and hype surrounding cloud computing.

Novell, Dell, Microsoft — in fact, anyone who is anyone with a stake in operating systems has been mentioned at least once in conjunction with a cloud operating system.

There is no such thing. It is a myth existing entirely in the minds of those who cannot seem to get enough cloud in their daily technology diets. And the problem in perpetuating that myth is that it continues to confuse an already confused market.

John Foley reports that the State of Washington Chooses Data Center Over The Cloud in this post of 8/6/2009 to InformationWeek’s Plug in to the Cloud blog:

The state of Washington is investing $180 million to build a new data center, and not everyone is thrilled about it. Opponents wonder if cloud computing wouldn't be a cheaper alternative. Ironically, Washington is home to two of the biggest cloud service providers, Amazon.com and Microsoft.

As reported by The Olympian, a bond sale and groundbreaking for the new facility, which will also serve as the headquarters for the state's Information Services division, is imminent. Construction equipment is due to arrive on the site in Olympia within a few days.

Two state representatives, Reuven Carlyle and Hans Dunshee, tried to put a halt to the project. In a letter to Gov. Chris Gregoire, they pointed to data centers operated by Google, Microsoft, and others -- a.k.a. cloud computing centers -- as potentially cheaper alternatives. For a state that spends upwards of $1 billion annually on IT (according to Carlyle), lawmakers and taxpayers can't be blamed for balking.

The State of Washington probably doesn’t pay sales tax to itself. For more detail on this issue, see my Mystery Move of Azure Services Out of USA – Northwest (Quincy, WA) Data Center post updated 8/5/2009.

Mike Manos adds background to Microsoft’s move of the Windows Azure platform from its Quincy, Washington, data center in his The Cloud Politic – How Regulation, Taxes, and National Borders are shaping the infrastructure of the cloud post of 8/6/2009. My preceding post has an update that includes an excerpt from Mike’s essay.

Mike currently is responsible for the global data center design, construction, ongoing operations and professional services for Digital Realty Trust; his past roles include similar responsibilities at Microsoft Corporation and leadership roles at Walt Disney, Rhythms NetConnections, and Nuclio Corporation (now part of Sun Microsystems).

It would be nice to hear from a Microsoft executive on this topic.

Herb Torrens reports on an Evans Data Survey: Cloud Dev Efforts on the Rise in this 8/5/2009 article for Redmond Developer News:

The move to the Internet cloud will pick up steam in the next year for developers, according to a new survey from Santa Cruz, Calif.-based Evans Data Corp.

Nearly half (48 percent plus) of the 500-plus developers surveyed expect to deploy private cloud applications in the coming year. Development for the cloud is also happening now: more than 29 percent said they are currently building applications for a private cloud.

Evans Data announced some of the results on Tuesday, but the company's "Cloud Development Survey 2009" publication is expected to be released sometime next week. The survey also examines public cloud trends among developers. …

Randy Bias’s Bifurcating Clouds post of 8/6/2009 claims:

There will soon be two major paths for cloud computing providers: commodity and premium.  If you read my series, Cloud Futures, you’ll know that I broke down cloud service providers into three major categories: service clouds, consumer clouds (previously ‘commodity’)[1], and focused clouds. In retrospect I realize now that there are possibly four, not three major categories. The missing category is premium enterprise clouds. Previously I had lumped these under focused clouds, but I now realize that, in fact, there are likely to be so many of these that they deserve their own category. I’ll go even further and suggest that in terms of markets targeted, there will really only be two ends of a spectrum: enterprise and non-enterprise. …

Krishnan Subramanian asks Obstacles To Enterprise Cloud Adoption: Who Is The Culprit? on 8/5/2009:

Slowly, but steadily, enterprises are warming up to Cloud technologies. No, they are not queuing outside the Amazon headquarters waiting to order public cloud infrastructure like Amazon's EC2 offerings, yet. But the idea of private clouds and the advantages of tapping the public clouds for non-mission-critical operations like testing are slowly making the enterprise community comfortable with Cloud Computing. In fact, a recent Gartner survey predicts that by 2012, 80 percent of Fortune 1000 enterprises will be paying for some cloud computing services and 30 percent will be paying for cloud computing infrastructure services. …

BasicGov Software issued a Customer Case Study Using BasicGov Web-Based, SaaS, Cloud Computing Software for Permits and Inspections press release about the town of Waxhaw, North Carolina, which is “Using BasicGov web-based software (SaaS and cloud computing) for Permits and Inspections to Improve Planning and Community Development Department:”

In January 2008, Waxhaw started a new Building Inspections Department. This function was moved from the county level in order to be more responsive to Waxhaw citizens while ensuring quality building construction for new developments and historic restorations.

Greg Mahar, the Director of Planning and Community Development for Waxhaw, sought software that would better enable his team to manage planning and community development in a more effective manner. After extensive research, Greg chose BasicGov web-based software because of its affordability and reliability. …

R. McNeill and B. Guptill wrote their Cloud Computing: A Silver Lining for Outsourcing Providers? Research Alert for Saugatuck Technology (site registration required):

IT services providers regularly contact Saugatuck, seeking to understand the potential opportunities and limitations of introducing Cloud Computing and SaaS to their customer bases. What we see is the emergence of web-based Cloud Computing outsourcing alternatives (including SaaS) that are substantially reshaping the way technology-enabled services are purchased and used. This in turn is fundamentally shifting the IT outsourcing landscape, creating new opportunities that grow from established, traditional categories of IT outsourcing. These emergent shifts - and opportunities - include the following:

  • From Infrastructure Outsourcing (IO) to Infrastructure-as-a-service (IaaS) and Platform-as-a-Service (PaaS);
  • From Application Management Outsourcing (AMO) to PaaS and SaaS; and
  • From Business Process Outsourcing to Cloud Enabled Business Services and IT as a Service.

Future Saugatuck Strategic Perspectives will examine these areas of change opportunity in more detail, from both the user and provider points of view. …

Dale Vile asks Will cloud put traditional hosters out of business? in this essay of 8/4/2009 for The Register:

It sometimes seems as if the whole world has gone cloud crazy - well at least most of the vendors, pundits and many in the media. If we listen to the evangelists, the days of the enterprise data centre are numbered and players like Google, Amazon and Microsoft will inherit the earth. Even David Cameron, the illustrious leader of the opposition to the UK government, has been talking about handing over the country's health records for storage and management to one of these big American multinationals.

In the midst of all this noise and hype, many have lost sight of the fact that getting a third party to run some of your infrastructure for you has been around for at least three decades. Indeed, those who have been taking advantage of hosted services - or on the other side of the fence, delivering them - must be wondering what all the fuss is about. Just what, exactly, is this cloud thing bringing to the party that is supposed to change the way everything works? …

<Return to section navigation list> 

Cloud Security and Governance

• Peter Choi explains How to Develop an Effective Security Strategy to Play in the Public Cloud and “Develop an effective security strategy with the right blend of technology and processes” in this lengthy essay of 8/7/2009:

Look all around and you can easily see that there is no shortage of press regarding the promises of cloud computing. Cloud evangelists have touted cloud computing as the next big thing, a game changer - a disruptive technology that will spark innovation and revolutionize the way businesses acquire and deliver IT services. The staggering volume of these sales pitches is to be expected, considering that cloud computing is at or near the peak of its hype cycle, but as with any new technology or model, reality will eventually set in and the public relations blitz will fade. As people continue to define cloud computing and debate its pros and cons, one thing is certain - one of the biggest obstacles to widespread cloud computing adoption will be security.

Peter Choi is the cloud computing security lead for Apptis, Inc.

• Steve Lesem posits Cloud Storage and Security [Is] Not a New Concept and asks “Everybody Talks About It, But Is It Really All That Different?” in this 8/7/2009 post:

Articles and blog posts associated with security and cloud computing are a daily occurrence, unless some well-publicized breach occurs in the cloud.  At that point the number of commentaries and discussions will increase exponentially, and then, over the following week, return to normal frequency.

I decided to focus on security as it relates to cloud storage, to see if something really new and different is occurring, and if overall changes need to be contemplated when it comes to classic data security activities. When I focused in this way, I quickly discovered that not much has changed, and security of data in the cloud is highly dependent on the same precautions and understandings as security of your data in a private data center.

In this recent article, it was suggested that files of one owner residing on a physical device with the files of others could somehow result in unauthorized access. It could, and the answer to this and a myriad of concerns fits within traditional approaches and understandings of security. 

Eric Chabrow asks Janet Napolitano: The Cyber Czar? in this 8/5/2009 post:

Homeland Security Secretary Janet Napolitano isn't the federal cybersecurity czar, and has no desire to become the president's top IT security adviser. But if one of the responsibilities of the White House cybersecurity coordinator is to be the cheerleader for federal government cybersecurity initiatives, then Napolitano is filling that bill.

I’d say that Janet is angling to fill the power vacuum created by Melissa Hathaway’s resignation; Melissa, the White House’s acting senior director for cyberspace, had been expected to be appointed to the job.

Eric’s Cyber Czar Waiting Game post of 8/4/2009, reported in Windows Azure and Cloud Computing Posts for 8/3/2009+, suggests a more significant stumbling block to filling the job:

[H]aving two bosses, each with strong personalities and their own power centers in the White House. Who would want two bosses like that?

I wouldn’t.

Krishnan Subramanian questions Will Government Alter The Cloud SLA Game? in this 8/6/2009 post to the CloudAve blog:

One of the key parameters in the push to accelerate enterprise cloud adoption is the SLA (Service Level Agreement). It is an important requirement before enterprises can even think of jumping into the cloud. After a slow start, companies are coming out with SLAs for their services, but it is still a messy affair, with different companies offering varying terms with ambiguity. Recently, the US General Services Administration, part of the federal government, came up with an RFQ (Request For Quotations) that demands 99.95% uptime per month. Let us try to understand the SLA dynamics in this post and see how the government's requirement will affect the SLA game.

Microsoft requires two compute instances for a 99.95% uptime guarantee and only offers 99.9% uptime for data accessibility when Windows Azure RTMs.
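To put those percentages in concrete terms, here’s what they allow in monthly downtime (simple arithmetic, assuming a 30-day month):

```python
# Monthly downtime permitted by an uptime percentage, assuming a 30-day month.
def allowed_downtime_minutes(uptime_pct, days=30):
    minutes_per_month = days * 24 * 60
    return minutes_per_month * (1 - uptime_pct / 100.0)

print("99.95%% uptime allows %.1f minutes/month" % allowed_downtime_minutes(99.95))  # ~21.6
print("99.9%% uptime allows %.1f minutes/month" % allowed_downtime_minutes(99.9))    # ~43.2
```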

Jon Oltsik proposes A Prudent Approach for Storage Encryption and Key Management:

In this white paper, Jon Oltsik, Principal Analyst at ESG, cuts through the hype and provides recommendations to protect your organization's data, with today's budget. Oltsik shows you where to start, how to focus on the real threats to data, and what actions you can take today to make a meaningful contribution to stopping data breaches.

As part of the paper's storage encryption to-do list, Oltsik details three realistic steps to provide the necessary protection for stored data based on risk.

The white paper covers:

  • What are the real threats to data today
  • Where do you really need to encrypt data first
  • How does key management fit into your encryption plans
  • What shifts in the industry and vendor developments will mean to your storage environment and strategy

Downloading requires registration.

Robert Lemos offers 5 Lessons from Dark Side of Cloud Computing in this 8/6/2009 article for ComputerWorld’s Security section:

While many companies are considering moving applications to the cloud, the security of the third-party services still leaves much to be desired, security experts warned attendees at last week's Black Hat Security Conference. …

"Guys at the low end are using (cloud infrastructure) to save money, but the danger is that the guys at the top end start to use it without any auditing," says Haroon Meer, technical director at security firm SensePost, who discussed his team's research into some aspects of Amazon's Elastic Compute Cloud (EC2) at the Black Hat security conference. …

Reuven Cohen’s The Battle for Cloud Application Neutrality post of 8/5/2009 claims “The concept of Cloud Application Neutrality extends upon the core tenets of the existing network neutrality debate.” Ruv says:

It's hard to believe that it's been a year since we first created the Cloud Computing Interoperability Forum (CCIF) with the goal of defining and enabling interoperable enterprise-class cloud computing platforms through application integration and stakeholder cooperation. Over the last 12 months a lot has happened. For me the most notable change has been how the conversation has shifted from "why use the cloud" & "what is cloud computing" to how to implement it. The need for interoperability among vendors has also become a central point of discussion with the concept being included in recent US federal government cloud requirements. But like it or not the battle for an open cloud ecosystem is far from over.

Brian Krebs warns Researchers: XML Security Flaws are Pervasive in this 8/5/2009 Washington Post article:

Security researchers today unveiled details about a little-known but ubiquitous class of vulnerabilities that may reside in a range of Internet components, from Web applications to mobile and cloud computing platforms to documents, images and instant messaging products. [Emphasis added.]

At issue are problems with the way many hardware and software makers handle data from an open standard called XML. Short for "eXtensible Markup Language," XML has been used for many years as a fast and efficient way to transport, store and structure information across a wide range of often disparate applications.

Researchers at Codenomicon Ltd., a security testing company out of Oulu, Finland, say they found multiple critical flaws in XML "libraries," chunks of code that are typically used and re-used in software applications to process XML data. …
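The Codenomicon team found its flaws by fuzzing parsers with malformed input. As a reminder of how little XML it takes to hurt an unhardened library, here’s the classic, long-known “billion laughs” entity-expansion payload; it’s illustrative only and unrelated to the specific bugs the researchers reported:

```python
# The classic "billion laughs" payload: nested entity definitions that a naive
# XML parser expands exponentially, exhausting memory. Shown as a string only;
# don't feed it to an unhardened parser.
BILLION_LAUGHS = """<?xml version="1.0"?>
<!DOCTYPE lolz [
 <!ENTITY lol "lol">
 <!ENTITY lol2 "&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;">
 <!ENTITY lol3 "&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;">
]>
<lolz>&lol3;</lolz>"""

# A full ten-level version expands to 10**10 "lol"s; hardened parsers cap
# entity-expansion depth or disable DTD processing entirely.
print(len(BILLION_LAUGHS))
```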

<Return to section navigation list>

Cloud Computing Events

• Bob Gourley’s Announcing the Federal Technology Events Calendar post of 8/8/2009 describes his

[P]ublic calendar titled the Federal Technology Events Calendar. This calendar uses Google Calendar technologies so it is fast and easy to maintain, which means it should be easy to keep it up to date.

Calendar data also is available for download in XML, ICAL and HTML formats.

The B.NET (Bangalore.net) User Group announces a series of sessions focused on Windows Azure starting on 8/8/2009:

B.NET brings you a series of sessions on the Microsoft Cloud Computing platform - Windows Azure. Spread across 6 sessions of 90 minutes each, this series takes you through the nuts and bolts of Windows Azure. Some of the key concepts of Windows Azure, like the Fabric, Web Role, Worker Role, Tables, Blobs, Queues and configuration, will be covered in depth in the sessions. By the end of the series, you will be able to architect cloud services for Azure or migrate your existing applications to the Azure platform.

What's more? Those members who attend all the 6 sessions and successfully complete a quiz by the end of the series stand to win cool prizes from our sponsors!

When: 8/8/2009 11:00 AM to 1:00 PM 
Where: PANANI Room, Microsoft Signature Building, EGL, Koramangala Ring road, Bangalore

[The five remaining sessions don’t have dates assigned.]

Indiana University issued a Call for Student Participation: Cloud Computing and Collaborative Technologies in the Geosciences on 8/7/2009:

Students from all IU campuses and other university students from across the US have an opportunity to consider the implications of cloud computing on the geosciences while networking with some of the leading thought leaders in the field. The Indiana University Pervasive Technology Institute Data to Insight Center (D2I) is soliciting student abstracts for an upcoming workshop titled "Cloud Computing and Collaborative Technologies in the Geosciences."

Sponsored by the National Science Foundation, the workshop will be hosted by the Pervasive Technology Institute and the Linked Environment for Atmospheric Discovery (LEAD) Project and will take place September 17-18, 2009, at the University Place Conference Center on the campus of Indiana University-Purdue University Indianapolis.

Abstracts for poster sessions will be accepted through August 20, 2009. Funding awards for travel and accommodations will be recommended to those posters targeted to: geosciences, including atmospheric, earth sciences, hydrology, environmental sciences, and climatology; collaborative technologies; and cloud computing. …

When: 9/17 to 9/18/2009   
Where: University Place Conference Center, Indiana University-Purdue University, Indianapolis, IN, USA 

InformationWeek’s Vivek Kundra, United States Federal CIO, Announced as Keynote Speaker for 2009 InformationWeek 500 Conference press release of 8/5/2009 says:

Vivek Kundra, Federal CIO, will deliver the opening keynote address on Monday, September 14, at 8:15 a.m. PT at the 2009 InformationWeek 500 Conference and Gala Awards, to be held at the St. Regis Monarch Beach Resort in Dana Point, Calif. Kundra will share his unique perspectives on getting things done within the massive federal bureaucracy, ensuring that his $75 billion annual IT budget delivers maximum value and impact. …

Bring your surfboard.

When: 9/14/2009 8:15 AM   
Where: St. Regis Monarch Beach Resort, Dana Point, CA, USA

Adam Grocholski announces the Twin Cities Cloud Computing User Group’s August Meeting to take place August 26, 2009 from 8:30 to 10:30 AM at the Microsoft Office in Bloomington, MN.

Featured Speaker:
David Chappell, Chappell & Associates
An Overview of the Windows Azure Platform
In this session David Chappell will discuss the Windows Azure Platform, which includes Windows Azure, SQL Azure and .NET Services. He will also compare Windows Azure with Amazon Web Services (AWS), Google App Engine and Salesforce.com.

When: 8/26/2009, 8:30 to 10:30 AM   
Where: 8300 Norman Center Drive, Suite 950, Bloomington, MN 55437

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

• Martin Eley asks SalesForce.com time to rethink the price tag? about its application development platform in this 8/5/2009 post:

SalesForce have recently been heavily promoting their application development platform. The platform offers all of the benefits of cloud computing (scalability, lower costs, etc.) with the added bonus of best-in-breed CRM and sales support.

As more and more organizations are looking towards cloud computing to reduce their ICT spend, cloud computing is very attractive. Add in all of the ready-to-use functionality SalesForce offers and it would appear to be the best solution, but there are some hidden costs to consider. For example, if an organization wanted to authenticate users to their Visual Force site, they would need to purchase a license for each user or pay per sign-on.

An organisation wanting to host large volumes of data would have to pay extra once they have exceeded their data allowance (typically 1GB for an enterprise license).

For SalesForce to compete with rivals such as Amazon Web Services, they will need to consider changing their price structure. Their price structure worked well in the past for CRM and sales software as a service (SaaS), but they are now in the application development market, and those high prices cannot compete with competitors who are offering cloud storage for $0.25 per GB/month.

This is another case of miscategorizing the competition. Like Azure, the Salesforce.com app platform is a PaaS, not an IaaS like Amazon Web Services. Fortunately, Martin compares SalesForce’s pricing with AWS and Google App Engine in these Google spreadsheets. It’s unfortunate that he didn’t include Azure in the comparison.

Peritor Consulting uncovers a relationship between instance size and bandwidth on Amazon EC2 and reports their results in an Elastic Load Balancer and EC2 instance bandwidth post of 8/6/2009:

So we are working on a caching-related project on EC2. In this scenario high performance is very important.

We are setting up a Varnish cluster on EC2 to evaluate if it can replace an existing caching infrastructure in terms of costs and requests per second. Our benchmarks yielded some interesting results. It seemed that for our caching scenario the limiting factor is bandwidth. Varnish is very humble with CPU/RAM consumption. We could easily deliver 500 to 600 requests per second with a small instance and have the box idle around 95% (uncompressed content).

It turns out we are limited by bandwidth and not by CPU.

In our benchmarks we were only able to push 35 MB/s on small instances, so the actual requests per second were dependent on the object size we were pushing. The limit was always ~35 MB/s. Our typical HTML pages were around 50 to 70 KB, so we couldn’t reach the desired requests per second as our instance was at its bandwidth limit. …
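Peritor’s numbers check out with simple arithmetic: when an instance is bandwidth-bound, the request ceiling is just the bandwidth cap divided by the object size. A quick sketch:

```python
# Requests/sec ceiling for a bandwidth-bound server: bandwidth / object size.
def max_requests_per_second(bandwidth_mb_s, object_size_kb):
    return (bandwidth_mb_s * 1024.0) / object_size_kb

for size_kb in (50, 60, 70):
    print("%d KB pages: ~%.0f req/s" % (size_kb, max_requests_per_second(35, size_kb)))
# 50 KB -> ~717, 60 KB -> ~597, 70 KB -> ~512, consistent with the 500-600 req/s they observed
```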

Bernard Golden addresses the bandwidth issue in his The Skinny Straw: Cloud Computing's Bottleneck and How to Address It post of 8/6/2009 to CIO.com:

For cloud computing, bandwidth to and from the cloud provider is a bottleneck. We recently performed a TCO analysis for a client, evaluating whether it would make sense to migrate its application to a cloud provider. Interestingly, our analysis showed that most of the variability in the total cost was caused by assumptions about the amount of network traffic the application would use. This illustrates a key truth about computing: there's always a bottleneck, and solving one shifts the system bottleneck to another location.

Virtualization implementers found that the key bottleneck to virtual machine density is memory capacity; now there's a whole new slew of servers coming out with much larger memory footprints, removing memory as a system bottleneck. Cloud computing negates that bottleneck by removing the issue of machine density from the equation—sorting that out becomes the responsibility of the cloud provider, freeing the cloud user from worrying about it. …
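Golden’s point about traffic assumptions dominating the model is easy to see with a toy calculation. Here’s a minimal sketch using a hypothetical per-GB rate, purely illustrative and not his client’s actual numbers:

```python
# Toy sensitivity check: monthly network cost scales linearly with traffic,
# so the traffic assumption can swamp everything else in a cloud TCO model.
def monthly_network_cost(gb_transferred, price_per_gb=0.15):  # hypothetical rate
    return gb_transferred * price_per_gb

for traffic_gb in (100, 1000, 10000):
    print("%6d GB/month -> $%8.2f" % (traffic_gb, monthly_network_cost(traffic_gb)))
```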

Zacks Investment Research’s Analyst Blog reports CRM Focuses on Healthcare on 8/6/2009:

Healthcare provides a good business opportunity to information technology companies, a fact reinforced by Salesforce.com (CRM - Analyst Report). The company is investing in Practice Fusion, which is involved in the business of electronic health records, health policy, health information technology and consumer medical data topics. Salesforce.com will invest around $10.0 million for a marginal stake in the company, which will generate around $1.0 million of revenue a year for the company. …

<Return to section navigation list> 
