Thursday, August 13, 2009

Windows Azure and Cloud Computing Posts for 8/10/2009+

Windows Azure, Azure Data Services, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 

Update 8/13/2009: Book cover photo redacted
• Update 8/11 and 8/12/2009: VMware’s SpringSource acquisition, other additions

My forthcoming Cloud Computing with the Windows Azure Platform book’s cover gets a facelift(?) that’s being applied to all new Wrox Programmer-to-Programmer series members. The stock photo that appeared on 8/12/2009 has been redacted because it isn’t certain what graphic will be used ultimately.

Jim Minatel’s New Wrox Covers post of 8/7/2009 explains the change from Wrox’s tradition of author mugshots on the cover of “Programmer to Programmer” titles. My reading of the comments to this post indicates that Wrox authors (including me) are far from enthusiastic about the new design.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use these links, click the post title to display this post by itself, and then click the link for the section you want to read.

Azure Blob, Table and Queue Services

Brad Calder of the Windows Azure Storage Group posted New Windows Azure Blob Features – August 2009 to the Windows Azure blog on 8/11/2009:

Today we are releasing a new set of features for Windows Azure Blob. Windows Azure Blob enables applications to store and manipulate large objects and files in the cloud. The blobs (files) can be up to 50GB in size for the CTP.

All changes for this release are versioned changes, using “x-ms-version: 2009-07-17”. All prior versions of commands executed against the storage system will continue to work, as we extend the capabilities of the existing commands and introduce new commands.

The features for this release are:

  • Update Blob with PutBlockList
  • Root Blob Container
  • Shared Access Signatures for Signed URLs

Brad’s post contains additional detail on the new features and a link to the updated Blob Service API documentation.
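
For readers who want to see the versioning in action, here’s a minimal Python 3 sketch of a List Blobs call that opts in to the new release. The account and container names are placeholders I’ve invented, and the request assumes a container with anonymous read access so the SharedKey Authorization header can be omitted; only the x-ms-version value comes from Brad’s post:

    import urllib.request

    # Hypothetical storage account and container (placeholders, not from Brad's post);
    # assumes the container allows anonymous read so no Authorization header is needed.
    account = "myaccount"
    container = "publicfiles"

    url = ("http://%s.blob.core.windows.net/%s?restype=container&comp=list"
           % (account, container))
    request = urllib.request.Request(url)
    # Opt in to the feature set released with this update
    request.add_header("x-ms-version", "2009-07-17")

    with urllib.request.urlopen(request) as response:
        print(response.status)           # 200 on success
        print(response.read()[:400])     # beginning of the XML blob list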

• Steve Marx’s New Storage Feature: Signed Access Signatures post of 8/11/2009 offers a live demo and sample code for this new blob feature:

Among other things, the latest storage release supports something called “signed access signatures.”  The basic idea is to let you create signatures that are more granular than the shared key for the whole storage account and then embed these signatures directly in a blob URL instead of an authorization header.

Making simple use of this new feature, I put together a sample called “WazDrop,” which operates similarly to applications like DropSend and YouSendIt.  It lets you upload a file and specify a duration it should be available.  It then returns you a special URL that can be used to access the uploaded file until the expiration time.

Play with the sample at http://wazdrop.cloudapp.net, and download the full source code.

Note: I received an “Authentication Failed: Signature not valid in the specified time frame” error when I uploaded a file to the WazDrop sample app.
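
To give a feel for what Steve’s “special URL” is doing, here’s a rough Python sketch of minting a time-limited, read-only signed blob URL. The string-to-sign layout and query-parameter names below are my assumptions for illustration only; the authoritative format is in the updated Blob Service API documentation. An overly tight start/expiry window is exactly the sort of thing that produces the “Signature not valid in the specified time frame” error mentioned in the preceding note:

    import base64, hashlib, hmac
    from datetime import datetime, timedelta, timezone
    from urllib.parse import quote

    def make_signed_url(account, key_b64, container, blob, minutes=60):
        """Mint a time-limited read-only URL for one blob (illustrative format only)."""
        start = datetime.now(timezone.utc)
        expiry = start + timedelta(minutes=minutes)
        st = start.strftime("%Y-%m-%dT%H:%M:%SZ")
        se = expiry.strftime("%Y-%m-%dT%H:%M:%SZ")
        sp = "r"   # read-only permission

        # Assumed string-to-sign: permissions, start, expiry, canonicalized resource.
        # The real layout is defined in the Blob Service API documentation.
        string_to_sign = "\n".join(
            [sp, st, se, "/%s/%s/%s" % (account, container, blob), ""])
        digest = hmac.new(base64.b64decode(key_b64),
                          string_to_sign.encode("utf-8"), hashlib.sha256).digest()
        sig = base64.b64encode(digest).decode()

        return ("http://%s.blob.core.windows.net/%s/%s?st=%s&se=%s&sp=%s&sig=%s"
                % (account, container, blob, quote(st), quote(se), sp, quote(sig)))

    # Anyone holding the returned URL can read the blob until the expiry time,
    # with no storage-account key and no Authorization header required.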

Reuven Cohen’s Waiting in the Cloud Queue post of 8/9/2009 asks:

Which would you rather have? A compute job that gets done over a 12 hour period on a Supercomputer with the catch that you need to wait 7 days until the job actually runs? Or a job that runs over a 60 hour period on a lower performance public cloud infrastructure that can start immediately? …

And then concludes with:

In a sense this question perfectly illustrates the potential economies of scale that cloud computing enables (a long-run concept that refers to reductions in unit cost as the size of a facility, or scale, increases). On a singular basis my job will take a significantly longer period of time to execute. But on the other hand, by using a public cloud there is significantly more capacity available to me, so I am able to do significantly more at a much lower cost per compute cycle in roughly the same time my original job was in the queue. …
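
A quick back-of-the-envelope comparison makes the point concrete (the per-hour prices are invented purely for illustration):

    # Option 1: supercomputer - 12-hour run, but a 7-day wait in the queue
    queue_hours, run_hours = 7 * 24, 12
    supercomputer_turnaround = queue_hours + run_hours       # 180 hours to a result

    # Option 2: public cloud - slower 60-hour run that starts immediately
    cloud_turnaround = 60                                     # 60 hours to a result

    # Purely hypothetical prices per compute-hour
    supercomputer_cost = run_hours * 500          # premium HPC time
    cloud_cost = cloud_turnaround * 10            # commodity instances

    print(supercomputer_turnaround, cloud_turnaround)   # 180 vs. 60 hours
    print(supercomputer_cost, cloud_cost)               # 6000 vs. 600 (notional dollars)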

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

The Data Platform Insider blog recommends that everyone Download Microsoft SQL Server 2008 R2 August CTP and Register for PASS Summit 2009 Today on 8/12/2009 with either (or both) of these two options:

First, you can download the CTP for SQL Server 2008 R2 at the Microsoft TechNet Evaluation Center. If you are participating in the Office 2010 Technical Preview, you can also download a CTP of the Project “Gemini” Excel add-in from the Microsoft Connect site for “Gemini”. The “Gemini” add-in to Excel will empower users to create their own BI solutions using data from both IT-managed and external sources, carry out rich ad-hoc analysis and modeling, extract value from data more easily, and publish and share reports for others to view.

The second option is through the Professional Association of SQL Server (PASS) organization, offering a quick and easy preview of the Application and Multi-server Management capabilities in a virtual environment – a great way to explore the new features before downloading the full trial. Application and Multi-Server Management helps DBAs proactively manage database environments at scale and accelerates consolidations and upgrades across the application lifecycle. Access to the hosted trial will be available on Friday, August 14th at http://betaforsqlserver.com.

The Data Platform Insider blog announces that MSDN and TechNet subscribers can Download SQL Server 2008 R2 August CTP Today, 8/10/2009. The CTP will be generally available on Wednesday, August 12th.

This public preview offers the opportunity to experience early, pre-release feature capabilities including:

  • Application and Multi-server Management
  • SMP scale up with support for up to 256 logical processors
  • Report Builder 3.0 with support for geospatial visualization

It’s likely that SQL Azure Database will run at least parts of SQL Server 2008 R2 when it RTMs at PDC 09.

<Return to section navigation list> 

.NET Services: Access Control, Service Bus and Workflow

• Eugenio Pace’s Announcing new project – patterns & practices - Claims based Authentication & Authorization Guide post of 8/11/2009 reports:

For the next couple of months I’ll be working on a new project here at patterns & practices, developing a new guide for claims based authentication and authorization.

I’m personally very happy to be working on this project, for many reasons. I believe frameworks like “Geneva” (previously known as “Zermatt”, now Windows Identity Foundation) and products like “Geneva Server” (now ADFS) are great platform additions to enable a new set of scenarios.

I realize that SSO, Federated Identity and Claims are not new. It’s just that we have much better tools and higher abstractions to implement these scenarios much more easily than ever.

I also feel privileged to work with such a great team. I’ll be sitting on giants’ shoulders: Dominick Baier, Vittorio Bertocci, Keith Brown, David Hill and Matias Woloski. Many others are joining as advisors and reviewers.

Here’s Eugenio’s Guide Map to claims-based authentication and authorization, which looks to me like a part of the New York subway or London tube map.
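
As a rough illustration of what claims-based authorization boils down to once a token from a trusted issuer has been validated, here’s a short Python sketch; the claim URIs and issuer address are hypothetical and not drawn from the forthcoming guide:

    # Claims a relying party might see after a security token has been validated
    claims = [
        {"type": "http://schemas.example.org/claims/role",
         "value": "purchaser", "issuer": "https://sts.example.org"},
        {"type": "http://schemas.example.org/claims/email",
         "value": "alice@example.org", "issuer": "https://sts.example.org"},
    ]

    TRUSTED_ISSUERS = {"https://sts.example.org"}   # hypothetical STS address

    def has_claim(claims, claim_type, value):
        """Authorize against claims asserted by a trusted issuer, not a local user table."""
        return any(c["type"] == claim_type and c["value"] == value
                   and c["issuer"] in TRUSTED_ISSUERS for c in claims)

    if has_claim(claims, "http://schemas.example.org/claims/role", "purchaser"):
        print("approve the purchase order")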

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

• .NET Rocks presents the Vishwas Lele on Real-World Azure podcast on 8/11/2009.

Vishwas is back to talk about his experiences programming for Microsoft Windows Azure. Vishwas did a port of the famous DinnerNow website using the Azure cloud computing platform.

Vishwas Lele is Chief Technology Officer (.NET Technologies) at Applied Information Sciences, Inc., where he has worked for the last fourteen years. In his current role, he is responsible for assisting organizations in envisioning, designing, and implementing enterprise solutions that are based on the .NET technologies.

Vishwas also serves as the Microsoft Regional Director for the Washington DC area.

Vishwas starts the conversation about the DinnerNow port at 06:30 into the podcast. His Porting DinnerNow.NET to Azure post of 7/4/2009 describes the process, which “took about a week or two,” and you can test the live Azure app at http://ais.cloudapp.net. The future transition from SQL Data Services to SQL Azure Database complicates the explanation of how the Azure application works with Entity-Attribute-Value (EAV) model Azure Tables.
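
For readers new to the EAV style of Azure Tables, the essential point is that an entity carries only its PartitionKey, RowKey and system-maintained Timestamp plus whatever properties it happens to have, so two entities in the same table needn’t share a schema. A loose Python sketch with invented table and property names (not taken from the DinnerNow port):

    # Two entities in the same table with different property sets; only the
    # PartitionKey/RowKey pair is required, the rest varies entity by entity.
    menu_items = [
        {"PartitionKey": "restaurant-seattle-01", "RowKey": "pizza-margherita",
         "Price": 11.50, "Vegetarian": True},
        {"PartitionKey": "restaurant-seattle-01", "RowKey": "family-combo",
         "Price": 18.00, "Serves": 4, "Items": "pizza;salad;soda"},
    ]

    # Entities sharing a PartitionKey are stored together and queried as a group
    def by_partition(entities, partition_key):
        return [e for e in entities if e["PartitionKey"] == partition_key]

    for entity in by_partition(menu_items, "restaurant-seattle-01"):
        print(entity["RowKey"], entity.get("Serves", 1))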

• Mike Ormond explains Deploying Your First Windows Azure App in this lavishly illustrated 8/11/2009 post that continues from where his Getting Up and Running with Windows Azure post of 8/5/2009 left off.

• Steve Marx uploaded the Summary and Transcript: First Windows Azure Lounge Chat to his personal blog on 8/10/2009 and adds a link to Simon Munro’s Windows Azure Chat Nuggets summary of 8/7/2009.

Joseph Conn’s Making the ‘Big Switch’ to cloud computing article (part 1 of 2) of 8/10/2009 for Modern Healthcare magazine uses outsourced Health Information Technology (HIT) as a precursor of cloud-based Electronic Health Record (EHR) or Electronic Medical Record (EMR) and Personal Health Records (PHR).

One precursor to the cloud—outsourced IT service—is one of the oldest information technologies in healthcare. Daniel Emig is vice president of hosting services for Siemens Healthcare, based in Malvern, Pa., where he started working 25 years ago with Shared Medical Systems Corp., or SMS, an outsourced IT service provider for hospitals and medical groups. SMS, founded in 1969, was acquired by Siemens 10 years ago. By then, SMS was well-established, reporting 1,000 healthcare customers and revenue of $1.2 billion. According to Siemens, its Malvern data center now hosts more than 2,800 different applications and processes—194 million transactions during an average day.

From a customer's perspective, Emig said, cloud computing looks very much like what they've been seeing for decades in IT outsourcing.

“Cloud is the latest terminology, but the concept has always been the same as a buyer,” Emig said. “You plug into a network and have them do work you want done.”

The ability to move work around and divide up mainframe computers also is quite old, Emig said. “In the server world, the systems world, there are now virtualization technologies in the market that allow you to do the same thing,” he said. “You can take one physical server and create 10 virtual servers on it.”

In a follow-up e-mail, Emig said Siemens' service offers its customers most of the five NIST essential characteristics of cloud computing, noting its pricing is based on “a more fixed and predictable cost structure over the term of their contracts” rather than completely variable pricing based on usage. “We only charge for scale up when customers significantly exceed the statistics provided to Siemens. Scale down is completely transparent to healthcare providers. However, we do carefully meter usage for our own internal cost accounting.”

Note: Cloud-based HIT, EHR, EMR and PHR topics, including Microsoft HealthVault, have moved from Azure Infrastructure (last week) to this section.

Chris Seper’s A better widgetized (and monetized?) MayoClinic.com article for MedCity News reports:

Vicki Moore[, vice chair of Mayo’s Department of Global Products and Services,] said Mayo is working to integrate online records from its software, Mayo Clinic Health Manager, as well as from Microsoft HealthVault to create personalized applications that could be associated with MayoClinic.com and address specific patient health issues. For example, a patient dealing with diabetes and sleep issues could soon have tools to track certain issues specific to the health problems, and have more “personalized guidance” on how best to deal with the problems.

“We’re interested in self-management,” Moore said.

Milt Freudenheim reports on threats to patient privacy in his And You Thought a Prescription Was Private article of 8/8/2009 for the NY Times, which describes families whose prescription history was sold for marketing purposes:

… Like many other people, Ms. Krinsk thought that her prescription information was private. But in fact, prescriptions, and all the information on them — including not only the name and dosage of the drug and the name and address of the doctor, but also the patient’s address and Social Security number — are a commodity bought and sold in a murky marketplace, often without the patients’ knowledge or permission.

That may change if some little-noted protections from the Obama administration are strictly enforced. The federal stimulus law enacted in February prohibits in most cases the sale of personal health information, with a few exceptions for research and public health measures like tracking flu epidemics. It also tightens rules for telling patients when hackers or health care workers have stolen their Social Security numbers or medical information, as happened to Britney Spears, Maria Shriver and Farrah Fawcett before she died in June.

“The new rules will plug some gaping holes in our federal health privacy laws,” said Deven McGraw, a health privacy expert at the nonprofit Center for Democracy and Technology in Washington. “For the first time, pharmacy benefit managers that handle most prescriptions and banks and contractors that process millions of medical claims will be held accountable for complying with federal privacy and security rules.” …

Vince Kuraitis asks Microsoft HealthVault is a Serious Business Strategy. Will Google Health Become More than a Hobby? in this 8/7/2009 post to the e-CareManagement blog:

Google Health…please stick around….but please also get your stuff together.

Over the past few days, several of my respected colleagues have written excellent blog posts essentially asking “Does Google Health have life?”

I share their observations and sentiments.  I see Microsoft HealthVault as a serious business strategy while Google Health is more like a hobby (one of probably hundreds at Google).

And then goes on to explain why he thinks “Google should stick around in healthcare.” See the links in my Windows Azure and Cloud Computing Posts for 8/6/2009+ post for more about HealthVault vs. Google Health.

Elizabeth Tchii reports “Microsoft will provide resources such as the Windows Azure cloud computing operating platform, as well as a general cloud computing environment, in order to foster innovative Internet communications-related technologies” to “Asia’s first cloud computing center in National Chiao Tung University in Hsinchu [Taiwan]” in her 8/11/2009 article in the Taipei Times newspaper. [Emphasis added.]

The collaboration will involve investment funding from the US company and the government at a ratio of 1 to 3, and Microsoft will not put a claim on any intellectual property that arises as a result of this alliance, Microsoft Taiwan general manager Davis Tsai (蔡恩全) said at a media briefing yesterday.

Since this will be a 10-year project, no initial investment amounts have been allocated yet, the US company said.

<Return to section navigation list> 

Windows Azure Infrastructure

Alex Goldman adds more background to Gartner’s Hype Cycle chart (below) in his Gartner: Cloud Computing Hype 'Deafening' post of 8/11/2009 to InternetNews that draws on the accompanying report.

• Maureen O’Gara reports “It’s supposed to be two–five years away from mainstream adoption” in her Cloud Hype at Height: Gartner post of 8/11/2009:

Behold, according to Gartner, cloud computing has ascended to the very apex of the famous Hype Cycle that buzzword technologies are said to travel during their lives. It's supposed to be two-five years away from mainstream adoption and who knows how long away from the Great Trough of Disillusionment that it must pass through to get there.

It’s good to see SOA halfway up the Slope of Enlightenment but surprising that mainstream adoption is still 2 to 5 years away.

• Bill McColl claims “With intercloud computing, the world is going to look quite different, with the computing power moving to where the data is” in his The Intercloud: Turning Computing Inside Out post of 8/11/2009:

Moving massive volumes of data around means more bandwidth, more storage, and more latency. We need instead to position the computing infrastructure next to where the data is located. With intercloud computing, we can build global apps and services where a single app can operate on data that may be spread across many public clouds and private datacenters. Intercloud computing means less bandwidth, less storage, and less latency.

Bill McColl is Founder and CEO of Cloudscale Inc., which is developing a massively parallel cloud-based platform for continuous real-time intelligence on live data streams.
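
A rough calculation shows why shipping the computation to the data beats shipping the data to the computation; the dataset size and link speed below are hypothetical:

    # How long does it take just to move the data?
    data_terabytes = 10
    link_megabits_per_second = 100            # hypothetical WAN link

    data_bits = data_terabytes * 8 * 10**12
    transfer_seconds = data_bits / (link_megabits_per_second * 10**6)
    print(transfer_seconds / 86400)           # roughly 9.3 days of WAN transfer

    # Versus shipping a few megabytes of code to where the data already lives
    code_megabytes = 5
    print((code_megabytes * 8 * 10**6) / (link_megabits_per_second * 10**6))  # well under a second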

• Ric Telford’s The disruptive nature of cloud computing essay of 8/11/2009 for ZDNet News is of interest only because Ric is Vice President, Cloud Services, IBM.

Dana Gardner and his team debate the future of enterprise IT in his BriefingsDirect analysts debate the 'imminent death' of enterprise IT as cloud models ascend post and podcast of 8/10/2009:

Some of the analyst reports these days indicate that hundreds of billions of dollars in IT spending will soon pass through the doors of corporate IT and into the arms of various cloud-service providers. We might conclude that IT is indeed about to expire.

Not all of us, however, subscribe to this extent in the pace of the demise of on-premises systems, their ongoing upkeep, maintenance, and support. To help us better understand the actual future role of IT on the actual floors inside of actual companies, we're joined by our guests and analysts this week: Jim Kobielus, senior analyst at Forrester Research; Tony Baer, senior analyst at Ovum; Brad Shimmin, principal analyst at Current Analysis; Ron Schmelzer, senior analyst, ZapThink; Sandy Rogers, former program director at IDC, and now independent IT analyst and consultant, and, as our guest this week, Alex Neihaus, vice president of marketing at Active Endpoints.

Joe McKendrick riffs on Dana’s posts and his summaries of guest and analysts opinions in his Panel: too much to lose by moving to generic services and cloud? post of the same date.

Jason Bloomberg’s The Buckaroo Banzai Effect: Location Independence, SOA, and the Cloud ZapFlash post of 8/10/2009 claims “As SOA becomes mainstream and fades from view, the SOA best practice of location independence moves to the fore”:

Well-meaning pundits, analysts, bloggers, and others have sought to name this Next Big Thing -- SOA 2.0, Web 3.0, etc. -- but simply naming a concept before anybody really knows what that concept represents is sheer folly, and inevitably a lightning rod for derision. This ZapFlash, therefore will do no such thing. Instead, we'll seek in a mere 1,500 words or so to identify the elements of the Next Big Thing that we can find in today's trends, and identify the one thread -- location independence -- that may lead us to identify the successor to SOA.

Joe McKendrick recommends Richard Watson’s SOA services: stop worrying about protocols, worry about the business advice in this 8/10/2009 post:

Richard Watson says there is too much hand wringing over service protocols and standards (REST, WS-*, etc.), and not enough thought given to why a service may be needed by the business in the first place. In a new post, he states that while “debates about whether to use REST or WS-* interface styles are seductive. But, these are the wrong questions to ask first.”

Chris Hoff (@Beaker) delivers on 8/9/2009 a summary of his Cloudifornication: Indiscriminate Information Intercourse Involving Internet Infrastructure session he originally intended for the Black Hat USA 2009 conference:

The summary of CI^6 goes something like this:

What was in is now out.

This metaphor holds true not only as an accurate analysis of what happens to our data with the adoption trends of disruptive technology and innovation in the enterprise, but also parallels the amazing velocity of how our datacenters are being re-perimeterized and quite literally turned inside out thanks to Cloud computing and virtualization. …

Todd Bishop clarifies US – Northwest data center status with his Microsoft not deserting Quincy entirely, but tax debate rages on TechFlash article of 8/10/2009:

Microsoft's decision to shift its Azure cloud-computing platform away from its data center in Quincy, Wash., may have left the impression that the company was giving up on the facility entirely. Not so, Microsoft says. Even though the company won't be offering storage and computing to others from the facility via Azure, it still plans to run its own online services there.

The Spokesman-Review of Spokane first noted last week that Microsoft would remain in Quincy to some extent. A Microsoft representative confirmed the stance today: "Microsoft will continue to host many Microsoft online services out of its mega data center in Quincy, and its other locations. Microsoft continues to be committed to our business in the state of Washington and the Quincy data center." …

<Return to section navigation list> 

Cloud Security and Governance

Gartner’s Richard Hunter predicts IT Regs Will Be Enacted in 5 Years in this post of 8/11/2009 to Government Information Security Podcasts:

Like the airlines, automotive, financial services, pharmaceutical and telecommunications industries, the government will soon - probably within the next half decade - begin to regulate the IT industry, IT adviser Gartner predicts.

"There's a trajectory that industries tend to follow; when an industry is extremely successful - that is to say that when an industry succeeds in moving its products and services right into the heart of daily life, regulation tends to follow. in the 20th century," Richard Hunter, a Gartner fellow and vice president, says in an interview with GovInfoSecurity.com.

"We saw the Food and Drug Administration, we saw regulation of telecom, we saw regulation of the airlines industry, we saw regulation of the automobile industry," he says. "I think the information technology industry has been extraordinarily successful in the last 40 to 50 years in increasing the importance of its products and services to almost every aspect of modern life. And, what usually happens in any industry when you reach that level of importance in society is that regulation takes place."

In the interview, Hunter discusses how:

  • Mounting pressure to regulate the IT industry has gained momentum as the number of breaches has rocketed over the past decade.
  • Innovation could be stifled, especially for startups and the open source community that don't have the financial wherewithal of a Microsoft or Oracle to conduct the testing regulation likely would require.
  • IT vendors will produce off-the-shelf tiered products, including those for information security, that would assure a certain level of quality, for a price.

Eric Chabrow, GovInfoSecurity.com managing editor, interviewed Hunter.

• Dana Gardner’s Cloud Security Panel: Is cloud computing more or less secure than on-premises IT? post of 8/12/2009 includes a podcast and excerpts of a discussion from The Open Group’s 23rd Enterprise Architecture Practitioners Conference in Toronto among the following participants:

Glenn Brunette, distinguished engineer and chief security architect at Sun Microsystems and founding member of the Cloud Security Alliance (CSA); Doug Howard, chief strategy officer of Perimeter eSecurity and president of USA.NET; Chris Hoff, technical adviser at CSA and director of Cloud and Virtualization Solutions at Cisco Systems; Dr. Richard Reiner, CEO of Enomaly; and Tim Grance, program manager for cyber and network security at the National Institute of Standards and Technology (NIST). Dana moderated the discussion.

That’s a lot of talent for a single panel. As usual, Chris Hoff (@Beaker) contributed incisive observations captured in the written excerpts.

Brian Ott posted The Journey to a Secure Cloud – Beyond Virtualization and Automation to the Unisys Cloud Computing blog on 8/12/2009:

… We began our cloud journey with a foundation that started with server consolidation and then added virtualization and eventually automation. While this gave us a solid foundation it still didn’t get us to the secure cloud.

We initiated our journey to the cloud in our engineering organization to support their development and testing activities. They are a global team that has very demanding IT requirements. They are frequently requesting multiple virtual and physical machines along with the infrastructure and security to support them. They may use them for a short period of time and then have new requirements for a totally different set of IT configurations or need to keep them for an extended period of time across multiple geographies. They need the ability to understand and track the utilization of each of the IT resources and to be able to repurpose machines on the fly. This high degree of dynamic requirements results in a constantly changing and challenging IT environment. …

• John Pescatore claims A Token Effort Might Be The Right Approach to help with PCI compliance in his 8/12/2009 post to the Gartner Blog Network:

Avivah Litan and I just published a research note “Using Tokenization to Reduce PCI Compliance Requirements.” Tokenization does not replace encryption, but in many scenarios it can help reduce the number of places that card data (or any other type of sensitive data) is stored - which is invariably a good thing.

However, tokenization is just about at the peak of a rapid hype cycle - it is not a panacea for PCI compliance, and it brings on many unique challenges, as we go through in the note. We’ll be putting out a decision framework research note soon on the “buy vs. build” of tokenization in the PCI context, with guidance on how to think through whether to outsource payment processing or implement your own encryption and tokenization solution.
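
For readers unfamiliar with the technique, tokenization swaps the card number for a meaningless surrogate at the point of capture, so downstream systems never store the real number. A toy Python sketch of the idea follows; a production token vault adds encryption at rest, access control and auditing, which the Gartner note presumably addresses:

    import secrets

    class TokenVault:
        """Toy vault: only this component ever stores the real card number (PAN)."""
        def __init__(self):
            self._vault = {}   # token -> PAN; a real vault encrypts, audits and restricts this

        def tokenize(self, pan):
            token = "tok_" + secrets.token_hex(8)   # random surrogate, no relation to the PAN
            self._vault[token] = pan
            return token

        def detokenize(self, token):
            return self._vault[token]               # callable only by the payment step

    vault = TokenVault()
    token = vault.tokenize("4111111111111111")       # a standard test card number
    print(token)   # e.g. tok_3fa9c1e2b47d05aa - safe to keep in order history, CRM and logs
    # Only the payment step calls detokenize(), shrinking the number of systems in PCI scope.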

• Neil MacDonald’s Security Thought for Tuesday: Cloud Computing Should be a More Secure Model post of 8/11/2009 to the Gartner Blog Network begins:

A computing paradigm based on the exchange and execution of arbitrary code is inherently risky. Yet, that’s pretty much the foundation of what we do today with personal computers. Consider that this model is the primary reason we pay billions of dollars to AV vendors to scan our machines for known malicious executable code. Consider that most malware today is web-based – malicious executable code downloaded from the web onto the end-user’s machine and executed, in most cases because the user was tricked into doing so. Sure, some baseline of local executables are needed for the OS and perhaps some personal productivity applications that require offline access. I can use whitelisting to control these. But do we really need the ability to download and install arbitrary executables for new types of functionality and services on our desktop? …
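
Neil’s aside about whitelisting is easy to illustrate: rather than scanning for known-bad code, only code on a known-good list is allowed to run. A minimal Python sketch with a placeholder allow list:

    import hashlib

    # Allow list of SHA-256 hashes for binaries IT has approved (placeholder value)
    ALLOWED_HASHES = {
        "0000000000000000000000000000000000000000000000000000000000000000",
    }

    def may_execute(path):
        """Default-deny: a binary runs only if its hash appears on the allow list."""
        with open(path, "rb") as binary:
            digest = hashlib.sha256(binary.read()).hexdigest()
        return digest in ALLOWED_HASHES

    # if not may_execute("downloaded_setup.exe"): block it and log the attempt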

Neil’s Don’t Underestimate Microsoft post of 8/7/2009 warns the competition:

After the latest financial results were announced by Microsoft (including the first year over year revenue decline in its history), I heard an increase in the comments from press and some analysts along the lines of ‘Microsoft has hit its peak’.

Don’t underestimate Microsoft.

Microsoft is at its best when it is threatened. …

Neil is a Gartner analyst.

Christina Torode’s Beware these risks of cloud computing, from no SLAs to vendor lock-in post of 8/10/2009 regurgitates a laundry list of the perceived hazards in adopting cloud computing, but adds a twist I haven’t heard to date:

The on-demand computing model in itself is a dilemma. With the on-demand utility model, enterprises often gain a self-service interface so users can self-provision an application, or extra storage from an Infrastructure as a Service provider. This empowers users and speeds up projects.

The flip side: Such services may be too easy to consume. Burton Group Inc. analyst Drue Reeves, speaking at the firm's Catalyst show last week, shared a story of a CIO receiving bills for 25 different people in his company with 25 different accounts with cloud services providers. Is finance aware of this, or will it be in for a sticker shock?

That appears to be an issue of enterprise financial governance, not cloud computing.

John Pescatore asks Does Private Cloud Equal Secure Cloud? on 8/10/2009:

I’m continually having conversations with Gartner clients along the line of “We are getting pressure to use cloud computing services, what are the security issues?”

As I mentioned here, 90% of the time it turns out the pressure is really to consume some application as a service, not really cloud computing. 9.9% of the remaining conversations are more about the potential security issues of private cloud use. Which makes sense, since Gartner has projected that the actual enterprise use of true public cloud services has been way overhyped. Tom Bittman has a nice series of blog posts on private cloud issues here.

Now, whenever the word “private” is included in the name of technology, many people leap to the conclusion that security is built in. But, usually all “private” means is a closed address space, not any guarantee that the necessary security controls are baked in. For example, calling MPLS a “Virtual Private Network” caused many to assume that transport encryption was built in, but of course it is not. In fact, Bjarne Munch and I have a Gartner research note in final review on the issues of adding encryption to MPLS. …

Paul Venezia explains Why a 'protective' techie just can't trust the cloud in this 8/10/2009 post to InfoWorld’s Deep End blog. Paul says:

I know that at some point in the future the cloud will be a foregone conclusion, as much a part of modern life as cell phones, laptops, and Twitter. I see the advantages, I see the cost savings. I see the benefits. But for the moment, they simply don't outweigh the detriments.

Kevin Jackson’s US DoD Chief Security Officer on Cybersecurity Priorities post of 8/10/2009 reports:

In a Federal Executive Forum interview, Robert Lentz, Chief Security Officer for the US Department of Defense, highlighted the department’s cybersecurity priorities.

Mr. Lentz is the Deputy Assistant Secretary of Defense for Cyber, Identity and Information Assurance (CI&IA) in the Office of the Assistant Secretary of Defense, Networks and Information Integration/Chief Information Officer. Since November 2000, he has been the Chief Information Assurance Officer (CIAO) for the Department of Defense (DoD) and, in this capacity, oversees the Defense-wide IA Cyber Program, which plans, monitors, coordinates, and integrates IA Cyber activities across DoD.

Along with the need to increase network speed and hardening, Mr. Lentz also mentioned efforts to increase the number of "cyber defenders" from the current 45,000. Other priorities include:

  • Ensuring that information can flow from the cloud all the way to the edge
  • Helping defense industrial partners increase their cybersecurity posture
  • Implementing a robust identity management infrastructure; and
  • Increasing cybersecurity education, training and awareness

The entire interview will be broadcast on August 13th, 2009 at 2:00pm during this week's Federal Executive Forum on Federal News Radio. These 1 hour radio and video programs are produced and broadcast monthly in Washington, DC and feature 3-4 Top Government IT Executives discussing mission critical issues. …

Stephanie Balaouras warns Don't rely on industry averages for cost of downtime in her 8/10/2009 post to the ZD Net’s The View from Forrester Research blog:

On a weekly basis, I get at least one inquiry request from either a vendor or an end-user company seeking industry averages for the cost of downtime. Vendors like to quote these statistics to grab your attention and to create a sense of urgency to buy their products or services. BC/DR [business continuity/disaster recovery] planners and senior IT managers quote these statistics to create a sense of urgency with their own executives who are often loath to invest in BC/DR preparedness because they view it as a very expensive insurance policy.

BC/DR planners, senior IT managers and anyone else trying to build the business case for BC/DR should avoid the use of industry averages and other sensational statistics. While these statistics do grab attention, more often than not, they are misleading and inaccurate, and your executives will see through them. You’ll hurt your business case in the end because you haven’t done your homework and your execs will know it. …
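
Stephanie’s advice amounts to doing the arithmetic with your own numbers instead of quoting someone else’s average; a sketch with purely hypothetical inputs:

    # Organization-specific inputs - every figure below is made up; substitute your own
    revenue_per_hour = 20000         # revenue flowing through the affected system
    revenue_actually_lost = 0.30     # not every interrupted transaction is gone for good
    affected_staff = 40
    loaded_hourly_cost = 55          # salary plus benefits per affected employee
    productivity_hit = 0.50          # staff are slowed down, not fully idle
    outage_hours = 6

    cost = outage_hours * (revenue_per_hour * revenue_actually_lost
                           + affected_staff * loaded_hourly_cost * productivity_hit)
    print(f"Estimated cost of this outage: ${cost:,.0f}")   # about $42,600 with these inputs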

<Return to section navigation list> 

Cloud Computing Events 

Lance Weatherby’s Government’s Gone Cloud post of 8/12/2009 is based on John Willis’s presentation to the Military Open Source Software Conference at the Georgia Tech Research Institute on the same date:

Last week the first sentence of an article in the InformationWeek periodical specifically targeted at IT employees of the U.S. Government read as follows:

‘The General Services Administration has issued a Request For Quotation for cloud storage, Web hosting, and virtual machine services.’

This dry and seemingly innocent statement is in reality a blockbuster, a headliner worthy of amazement possibly and further investigation surely.  Any computer industry veteran with federal government dealings will tell you the phrase ‘U.S. government technology innovation’ is an oxymoron (with the notable exceptions of the DOD and NASA).  And now – lo and behold! – the stodgiest of the stodgy is rapidly moving (that’s correct – rapidly) past all but the most innovative organizations in the world into the era of cloud computing! …

Sam Diaz pans Judy O’Brien Chavis’ opening keynote in his OpenSource World kicks off with sparse crowds, nothing-new keynote post of 8/12/2009. Judy is director for business development and global alliances at Dell.

• John Willis (a.k.a. @botchagalupe) posted the slides for what he calls Cloud Computing for Healthcare - Awsome Meeting 8/11/09, Peter Schwoerer’s presentation about Sentry Data Systems’ Datanex platform, to SlideShare on 8/11/2009.

• Steve Marx uploaded the Summary and Transcript: First Windows Azure Lounge Chat to his personal blog on 8/10/2009 and adds a link to Simon Munro’s Windows Azure Chat Nuggets summary of 8/7/2009. (Repeated from Live Windows Azure Apps, Tools and Test Harnesses section above.)

• Bob La Loggia reports that Cloud Camp Phoenix will be held on October 24, 2009 from 11 AM to 5 PM at the University of Phoenix in his CloudCamp Phoenix date is set. Gathering sponsors post to the Cloud Camp Google Group:

We are putting together our organizing committee and beginning to look for sponsors. If you have any contacts with companies who have sponsored for other CloudCamps, please forward that information to me. I appreciate it.

When: 10/24/2009 11 AM to 5 PM   
Where: University of Phoenix, Phoenix, AZ, USA

Oracle offers only 16 sessions tagged “Cloud Computing” at its forthcoming Oracle Open World conference scheduled for October 11 to 15, 2009 at San Francisco’s Moscone Center.

When: 10/11 to 10/15/2009  
Where: Moscone Center, San Francisco, CA, USA

IDG World Expo’s OpenSource World, Next Generation Data Center and CloudWorld Kick Off This Week in San Francisco according to this detailed press release of 8/10/2009:

OpenSource World focuses on innovative solutions in real-world environments using open source, open standards and open architecture as part of an integrated IT infrastructure. Next Generation Data Center is the only strategic IT event focused on the complete end-to-end solution for the 21st century data center and the new technologies from which these data centers are being built. CloudWorld, debuting this year, will bring together the buyers and sellers of Web-centric software, infrastructure and services designed to drive acceptance and deployment of cloud computing in the data center.

When: 8/12 – 8/13/2009   
Where: Moscone Center West, San Francisco

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Kurt Mackie, reporting for Redmond Developer News from the Pacific Crest Technology Leadership Forum held in Vail, Colo., writes Microsoft Tells Investors Why VMware Will Lose on 8/11/2009:

Microsoft's Brad Anderson on Tuesday talked about the company's management product strategies, while downplaying those of VMware, at an investment banker-sponsored event.

Anderson is corporate vice president for Microsoft's Management and Services Division. He was recently promoted and now oversees Microsoft's profitable System Center line of products worldwide, along with Windows client and server management components. …

VMware and SpringSource have indicated that they will combine their technologies to enter the datacenter and cloud computing markets, which are being contested by Microsoft with its Windows Azure platform. VMware already has its own cloud "operating system" called vSphere, with version 4 rolled out in late April.

Anderson suggested that the VMware-SpringSource deal would fall short of the mark. He said that VMware is basically acquiring a development platform for J2EE applications, but he dismissed the potential competition with Microsoft's Windows Azure cloud computing platform.

"I look at that [VMware-SpringSource deal] as a response to what Microsoft has been communicating in the market about the application architecture," Anderson said in a Webinar. "But I think they're moving into a space that really is away from what their core competency is, and moving into a space where, if you look at what Microsoft has with Visual Studio, I think Microsoft has a lot of strength there."

Windows Azure continues Microsoft's .NET Framework into the cloud, and Visual Studio will provide .NET developers with a hook into that cloud computing environment. …

Finally! An official response from Microsoft to VMware’s SpringSource acquisition.

• James Urquhart’s VMware, SpringSource to revolutionize Java development post of 8/12/2009 adds his seal of approval to the SpringSource acquisition:

VMware's acquisition of SpringSource this week is a significant development in the history of the Java development platform.

I have been working closely with VMware for some time now, and I know a little bit about how the company sees its role in the cloud. The acquisition of the commercial open-source middleware/framework company makes perfect sense to me. …

And adds corroborating quotes from recent posts by Forrester Research's James Staten and Redmonk's Stephen O'Grady.

• Charles Babcock adds his two cents to the VMware/SpringSource story with the VMware's Cunning Acquisition: SpringSource post of 8/11/2009 to InformationWeek’s Plug into the Cloud blog:

Cloud computing and virtualization function hand in glove. We knew that. What we didn't know was that there are likely to be efficiencies if the application is built from the ground up for the cloud. The Spring Framework is one of those new development platforms that make it easier to develop Java applications--for the cloud.

In buying SpringSource, VMware may be correct in assuming that, in the long run, frameworks will serve as one of the springboards to cloud computing. And that means VMware has stopped thinking as a virtualization vendor and started thinking as a cloud supplier.

That is, thinking like a direct competitor of the Windows Azure Platform.

• Darryl K. Taft analyzes VMware’s SpringSource acquisition in his detailed VMware and SpringSource: It’s All About the Cloud, Baby eWeek article of 8/11/2009:

Though Facebook's acquisition of FriendFeed seemed to garner most of the headlines on Aug. 10, the bigger story for the enterprise was easily VMware's announcement of its intent to acquire SpringSource for $362 million to bolster its cloud strategy against Microsoft and others.

Make no mistake, that is what's at the core of VMware's move here. With SpringSource under its wing, VMware can become the Java-based equivalent to what is expected to be Microsoft's Azure private cloud play -- which has .NET as its development platform. But rather than .NET, VMware will have the Java-based Spring Framework and its surrounding set of Eclipse-based tools as the development environment for the emerging VMware vCloud private cloud initiative.

Microsoft will likely leverage its Azure, .NET, Hyper-V, System Center, and PowerShell, as well as Windows Server and other software to deliver its private cloud play. In a blog post, Microsoft’s Steven Martin talks a bit about this. More on Microsoft’s plans is expected at the Microsoft Professional Developers Conference (PDC) in November in Los Angeles. …

Darryl’s post includes links to an interview with “Rod Johnson, CEO of what will become the SpringSource division of VMware” and quotes Paul Maritz, former Microsoft executive vice-president and now CEO of VMware.

Milestone 3 of Schakra Inc.’s Java SDK for Microsoft .NET Services was released to SourceForge on 7/7/2009. For more background on Windows Azure’s support for Java, see Mary Jo Foley’s What is Microsoft doing to add Java support to Azure? All About Microsoft blog post of 4/13/2009 and Steve Marx’s Does Windows Azure Support Java? post of 4/15/2009 to his personal blog.

Larry Dignan reports in with his VMware's SpringSource purchase sparks head scratching; Still doesn't solve the Microsoft problem ZDNet post of 8/11/2009, which quotes securities analysts regarding the US$420 million price paid by VMware and concludes with two unanswered questions:

Open source mojo: Will VMware be able to hang on to SpringSource’s strong developer community? The grand plan for VMware is to use SpringSource’s popular frameworks to garner an edge in the cloud computing OS wars. The big question: Will there be a clash between SpringSource’s community and VMware’s plans for its vSphere operating system?

Talent: VMware portrayed the SpringSource deal as a great way to acquire developer talent. Can VMware keep that talent? If it doesn’t the SpringSource deal won’t pay off.

• Matt Asay claims VMware puts squeeze on Red Hat with SpringSource buy in this 8/10/2009 CNet News article:

First, with every acquisition of a leading open-source company by anyone other than Red Hat, Red Hat becomes more and more isolated. Other companies are integrating open source into their business strategies. Red Hat's differentiation as "the" open source company doesn't have much of a shelf life left.

Matt’s post includes a SpringSource diagram showing what he says the SpringSource + VMware vision looks like (credit: SpringSource).

Reuven Cohen analyzes VMware’s purchase of SpringSource, which had previously purchased Hyperic, in his VMware Getting into PaaS with SpringSource Acquisition thread of 8/10/2009 in the  Cloud Computing Interoperability Forum (CCIF) Google Group.

Hot on the heels of SpringSource's recent acquisition of Hyperic, VMware today announced their intention to acquire SpringSource.

At first glance this move may seem puzzling, why would VMware want to buy an open source enterprise application development platform? Could it be for Hyperic, an open source IT management platform? I doubt it. I'd say it's all about planning for the future, a future where the OS no longer matters, a future where all applications are built, deployed and consumed via the Internet. Yes folks, I'm talking about Platform as a Service.

According to the post by VMware CTO Steve Herrod, since its founding 11 years ago VMware has focused on simplifying IT. More to the point: "VMware has traditionally treated the applications and operating systems running within our virtual machines (VMs) as black boxes with relatively little knowledge about what they were doing."

G. Pascal Zachary’s An Operating System for the Cloud article of 8/10/2009 for the MIT Technology Review says “Google is developing a new computing platform equal to the Internet era. Should Microsoft be worried?”

The way I read the tea leaves is that Google is layering a browserlike UI over a Linux kernel variant, which isn’t likely to reduce Windows’ market share dramatically. Linux with Google lipstick is still Linux, which hasn’t been very successful on the desktop so far.

InformationWeek’s Serdar Yegulalp rings in with his Is Linux Irrelevant? post of the same date, which contends:

It's not the distro or even the Linux kernel that matter. It's the things made with Linux -- the servers, smartphones, netbooks, and other mobile devices. …

Most people got their first hands-on experience with Linux via one of about a dozen or so popular distros. The packaging dictated the utility: Most distributions are designed to work as an end-user OS. Some (like Ubuntu, fast becoming synonymous with end-user Linux) are specifically designed to work as a substitute for Microsoft Windows on the desktop, going so far as to provide Windows-specific migration tools.

But the mere presence of an alternative -- even one available at no cost, even one where the transition from Windows has been heavily automated -- hasn't caused the kind of exodus from Windows that we've seen with the OS X-era Macintosh.

People do indeed abandon Windows and use Linux -- many with great success -- but not in great numbers. The exact usage statistics for end-user desktop Linux hover at around 1%; Linux's very protean nature makes it difficult to pin down how many installations are currently out there, or how they're deployed, or how long they remain in use.

Steven Arnold concludes in his Google’s Data Center Strategy Questioned post of 8/10/2009:

  • Google’s core technology is getting long in the tooth, but so far, that technology has proven itself to be scalable and extensible. Googlers think away each day looking for angles. So far, so good.
  • Microsoft’s approach has been to emulate Google. The idea seems good but Google uses the sort of gizmos you can buy at Fry’s Electronics for the most part. (There are some big exceptions, however.) Microsoft, on the other hand, uses name brand hardware. There’s a cost issue that has not been fully understood even by the legions of azure chip and blue chip consultants romping in this field of inquiry.
  • Newer solutions work where infrastructure is up to the task. The future, in my opinion, is going to look more like a blend of old and new.

<Return to section navigation list> 
