Thursday, December 10, 2009

Windows Azure and Cloud Computing Posts for 12/9/2009+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 
• Update 12/10/2009: Amazon Web Services: Amazon EC2 Now Offers Windows Server 2008; Marek Czarzbon: How to install SharePoint 2010 on Amazon Cloud : EC2; InformationWeek Webcasts: Smarter PC and Laptop Management with Cloud-Based BI; John Moore: Savvy Move, MSFT Acquires Sentillion; Liz MacMillan: Mortgage Technology in the Cloud; Salvatore Genovese: Amazon EC2 Outage; Anton Staykov: Windows Azure role stuck in Initializing/Busy/Stopping; Jason Miller: Agencies to justify not using cloud computing to OMB; Katrina Woznicki: When Asked, Patients Can't Tell; Barbara Darrow: For Microsoft Azure platform, late is good; and more.

Note: This post is updated daily or more frequently, depending on the availability of new articles, in the following sections:

  • Azure Blob, Table and Queue Services
  • SQL Azure Database (SADB)
  • AppFabric: Access Control, Service Bus and Workflow
  • Live Windows Azure Apps, Tools and Test Harnesses
  • Windows Azure Infrastructure
  • Cloud Security and Governance
  • Cloud Computing Events
  • Other Cloud Computing Platforms and Services

To use these links, first click the post’s title to display the individual post; the section links then navigate within it.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page.
* Content for managing DataHubs will be added when Microsoft releases a CTP of the technology.

Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009, SharePoint Nightmare: Installing the SPS 2010 Public Beta on Windows 7 of 11/27/2009, and Unable to Activate Office 2010 Beta or Windows 7 Guest OS, Run Windows Update or Join a Domain from Hyper-V VM (with fix) of 11/29/2009.

Azure Blob, Table and Queue Services

• Julien Hanssens (@jhanssens) reminds SQL Azure users that his SQL Azure Manager application (now at v0.0.1.7) can substitute for SQL Server Management Studio (SSMS) 2008 R2.

Bill Kallio (@billlkallio) reminded folks on 12/9/2009 to try John O’Brien’s free AzureGadget tool to measure Azure queue sizes. John describes his project on CodePlex as follows:

So you have built yourself an Azure worker process to crunch through your work, but how do you keep an eye on things? Facing that question, I investigated and then built the world’s first Vista / Win7 Gadget to monitor your queue size.

[Screenshot: AzureGadget in action]

The cool part is that this is a pure JavaScript solution! Yes, I managed to set the headers, compute the Hash-based Message Authentication Code (HMAC) with SHA256, and handle the Base64 encoding. I’ll put together another post to explain all that very soon.

It’s the first time I’ve seen a reference to the gadget.
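
John’s gadget has to sign its REST calls itself, which is where the HMAC-SHA256 and Base64 work he mentions comes in. If you just want the same approximate-message-count figure from managed code, the StorageClient library in the Windows Azure SDK wraps the signing and the Get Queue Metadata call for you. Here’s a minimal sketch; the queue name and connection string are placeholders, and the method names reflect the SDK 1.0 StorageClient, so check them against your SDK version:

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class QueueMonitor
{
    static void Main()
    {
        // Placeholder account name and key; substitute your own storage credentials.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey");

        CloudQueueClient client = account.CreateCloudQueueClient();
        CloudQueue queue = client.GetQueueReference("workitems");

        // Surfaces the x-ms-approximate-messages-count header returned by Get Queue Metadata.
        int count = queue.RetrieveApproximateMessageCount();
        System.Console.WriteLine("Approximate messages in queue: {0}", count);
    }
}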

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

The SQL Azure Team has updated the SQL Azure portal page with AppFabric and Marketplace (Pinpoint) pages.

I’ll add more detail after the Pinpoint team lets me add OakLeaf Systems’ profile to the Marketplace.

• Eric Nelson interviewed David Robinson for the MSDN Flash Podcast 015 – SQL Azure at TechEd Europe posted 12/10/2009:

While at TechEd Europe in November 2009 I had a chance to catch up with David Robinson, Program Manager in the SQL Azure team and discuss... SQL Azure :-)

We cover the "reboot", large databases, the new support for SQL Azure in SQL Server Management Studio, customer feedback and futures.

V2 hints include:

  • Bigger databases
  • Automatic partitioning
  • Spatial data types

You might also want to check out the slides of a SQL Azure session I recently delivered, based on the session David gave at TechEd, and take a look at these SQL Azure session recordings from PDC, including those of David presenting.

Listen/Subscribe: Download/Play MSDN Flash Show 015

Panagiotis Kefalidis shows you how to Programmatically set your firewall settings for SQL Azure in this 11/9/2009 post:

One of the latest features introduced in SQL Azure is the ability to apply firewall settings to your database and allow only specific IP ranges to connect to it. This can be done through the SQL Azure Portal or through code using stored procedures.

If you want to take a look at which rules are active on your SQL Azure database, you can use:

select * from sys.firewall_rules

That will give you a view of your firewall rules.

If you want to add a new firewall rule, you can use the "sp_set_firewall_rule" stored procedure. The syntax is "sp_set_firewall_rule <firewall_rule_name>, <ip range start>, <ip range end>". For example:

exec sp_set_firewall_rule N'My setting','192.168.0.15','192.168.0.30'

If you want to delete that rule, you can use:

exec sp_delete_firewall_rule N'My setting'
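
If you’d rather call these stored procedures from application code than from a query window, plain ADO.NET against the server’s master database does the job. The sketch below uses placeholder server, credential and IP-range values; note that your client’s IP must already be covered by an existing rule (added via the portal), or the connection itself will be refused:

using System.Data.SqlClient;

class FirewallRuleSample
{
    static void Main()
    {
        // Placeholder server name and credentials; the firewall procedures live in master.
        const string connectionString =
            "Server=tcp:yourserver.database.windows.net;Database=master;" +
            "User ID=yourlogin@yourserver;Password=yourpassword;Encrypt=True;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "exec sp_set_firewall_rule @name, @startIp, @endIp", connection))
        {
            command.Parameters.AddWithValue("@name", "My setting");
            command.Parameters.AddWithValue("@startIp", "192.168.0.15");
            command.Parameters.AddWithValue("@endIp", "192.168.0.30");

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}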

Liam Cavanagh (@liamca) posted Announcing SQL Azure Data Sync (November CTP) Available for Download on 11/17/2009 (during PDC 2009). My Windows Azure and Cloud Computing Posts for 11/16/2009+ post included a link to Bill Zack’s 11/18 Microsoft Sync Framework Power Pack for SQL Azure post but Liam’s slipped through the cracks:

Today in the opening keynote at PDC we announced the availability of SQL Azure Data Sync – November CTP, an early preview open to the public through a demonstration with Kelley Blue Book.  For those of you who have been following our blog, you may be asking yourself, what exactly does this include and how does it compare to Project “Huron” that we have been talking about for some time now?  In this post I want to give some additional details.

You can think of SQL Azure Data Sync as the first part of our overall Project “Huron” vision, which is to create a Data Hub in the Cloud, or more specifically a place for you to easily consolidate and share all of your information. With SQL Azure Data Sync we have worked to simplify the task of sharing information, whether that is from on-premises SQL Server to the cloud or from the cloud down to mobile users, retail stores or remote offices. All of this is powered by the Microsoft Sync Framework.

SQL Azure Data Sync allows developers and DBAs to:

  • Link existing on-premises data stores to SQL Azure.

  • Create new applications in Windows Azure without abandoning existing on-premises applications.

  • Extend on-premises data to remote offices, retail stores and mobile workers via the cloud.

  • Take Windows Azure and SQL Azure based web applications offline to provide an “Outlook-like” cached-mode experience.

All of this is accomplished through:

  • An end-user Data Sync Tool that keeps on-premises SQL Server data in sync with SQL Azure.

  • Visual Studio templates enabling developers to take Windows Azure and SQL Azure based web applications offline using SQL Compact and SQL Server databases. …

Here’s Liam’s capture of the Data Sync Tool’s fourth dialog for choosing the tables to sync:

[Screenshot: SQL Azure Data Sync - Select Tables dialog]

The Visual Studio 2008 template for taking SQL Azure databases offline offers a C# version but lacks a Visual Basic counterpart, as noted in a comment from Jayaram Krishnaswamy.

I’m glad to see Project “Huron” finally taking shape.
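
The CTP’s Data Sync Tool and Visual Studio templates generate the synchronization plumbing for you, but it helps to see the underlying Microsoft Sync Framework pattern they build on. The following sketch syncs a single table between an on-premises SQL Server and SQL Azure; the scope name, table name and connection strings are placeholders, the provisioning signatures vary between Sync Framework releases, and the SQL Azure Power Pack adds Azure-specific provider types not shown here:

using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;

class DataSyncSketch
{
    static void Main()
    {
        using (var onPremises = new SqlConnection("placeholder on-premises connection string"))
        using (var sqlAzure = new SqlConnection("placeholder SQL Azure connection string"))
        {
            // Describe a sync scope containing the Customers table and provision both databases.
            var scope = new DbSyncScopeDescription("CustomersScope");
            scope.Tables.Add(SqlSyncDescriptionBuilder.GetDescriptionForTable("Customers", onPremises));

            new SqlSyncScopeProvisioning(scope).Apply(onPremises);
            new SqlSyncScopeProvisioning(scope).Apply(sqlAzure);

            // Push and pull changes in both directions.
            var orchestrator = new SyncOrchestrator
            {
                LocalProvider = new SqlSyncProvider("CustomersScope", onPremises),
                RemoteProvider = new SqlSyncProvider("CustomersScope", sqlAzure),
                Direction = SyncDirectionOrder.UploadAndDownload
            };
            SyncOperationStatistics stats = orchestrator.Synchronize();
            System.Console.WriteLine("Uploaded {0} and downloaded {1} changes.",
                stats.UploadChangesTotal, stats.DownloadChangesTotal);
        }
    }
}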

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

No significant articles so far today.

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

• Katrina Woznicki reports “Ninety-six percent of the 50 patients surveyed left out at least one drug when they were asked to list their medications, and, on average, patients omitted 6.8 medications” in her When Asked, Patients Can't Tell article of 12/10/2009 for MedPageToday:

Hospitalized patients were often clueless when asked about their medications, with almost all of them unable to name all their medications and many leaving out as many as a half-dozen drugs they have been prescribed, according to a small survey of patients in a Colorado hospital.

Moreover, 44% of the patients thought they were taking a medication that had not been prescribed.

The researchers conducted the patient survey as part of a larger project examining a potential role for patients in reducing medication errors and improving patient safety.

"This study is a first for raising the questions 'How involved should patients be in their hospital medication safety?' and 'How do you involve them?'" Cumbler told MedPage Today.

"We don't live in a perfect healthcare system and errors do occur. If you have a patient who wants to be involved in their medication safety, you have to let him or her know what they're taking and to let them be an active participant."

Among scheduled medications, patients commonly omitted several important therapeutics, including antibiotics, cardiovascular drugs, and antithrombotics. …

This story points out the importance of easily accessible, cloud-based personal health records (PHRs) to patient safety.

Anton Staykov’s Windows Azure role stuck in Initializing/Busy/Stopping post of 12/10/2009 briefly explains how to avoid deployment problems with Windows Azure projects by making sure Web and Worker roles are valid.
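
One common cause of the endless Initializing/Busy/Stopping cycle is a role entry point that throws during OnStart or lets Run return, which makes the fabric recycle the instance over and over. As a quick sanity check, a worker role’s entry point should look something like this minimal skeleton (the loop body is a placeholder for your real work):

using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Keep OnStart short and let it return true; an exception or a false
        // return value here leaves the role cycling through Initializing/Busy/Stopping.
        return base.OnStart();
    }

    public override void Run()
    {
        // Run must never return; if it does, the instance is restarted.
        while (true)
        {
            // Placeholder: drain a queue, process messages, etc.
            Thread.Sleep(10000);
        }
    }
}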

Liz MacMillan reports predictions for Mortgage Technology in the Cloud in her post of 12/10/2009:

Dorado Corporation has released its predictions for major trends and developments likely to shape residential mortgage lending and mortgage technology in 2010. They are:

  • Software-as-a-service (SaaS) adoption will reach critical mass in mortgage originator use, with more than 30% of all originations in North America occurring in the cloud.
  • Loan volume will fall below 2009 levels, primarily as a result of reduced FHA originations and an expectation that refinancing levels will soon peak. However, the drop will be mitigated by several factors: continued low interest rates for at least the first half of the year, employment stabilization, the continued high supply of affordable homes, and proactive lender marketing driven by advances in lead capture technology and new entrants such as Google into the lead generation space.
  • Regulatory compliance will become a competitive advantage, as those lenders that adopt turnkey approaches to updating their systems and processes take market share away from those institutions mired in human error, penalties, operational disruptions, and customer complaints.
  • New integrations and interoperability capabilities will provide banks and their borrowers alike with unprecedented choice in external services, driving the cost of these services downwards.
  • Requirements for lenders to maintain a greater financial interest in the loans sent to the secondary markets will further drive a trend towards clean, error-free loans and increased transparency in the creation and handling of loan file information.
  • The demarcation between originators and servicers will continue to blur, related to the increased exposure of lenders to the pools that they feed, and an increased level of pre-close analytics and other safeguards instituted in the origination process to minimize risk.
  • The mid-tier group of banks from 2009 will begin to differentiate themselves, with a handful of aggressive, volume-focused lenders using technology and new ways of doing business to capture market share. The result is a breakout group of “super regionals” that will force former mid-level players further down the food chain.

John Moore credits Microsoft with a Savvy Move, MSFT Acquires Sentillion in this 12/10/2009 post:

Today, Microsoft announced that it will be acquiring the healthcare IT security software firm Sentillion. This is Microsoft Health Solutions Group’s biggest acquisition to date and will add critical security features to its clinician-centric solution, Amalga UIS.

Chilmark Research sees this as a very savvy acquisition that will further extend the capabilities, and thus market opportunities, for Microsoft in the healthcare sector. For example, in the hot market for Health Information Exchanges (HIE), managing security access across multiple entities within a given region is challenging – the Sentillion suite of security solutions will slot into this market need quite readily. For Sentillion, this is also a good move, as it provides them the backing, resources and distribution channel to take their solution suite global far faster than if they attempted to do it organically.

Mr. HIStalk had the opportunity to interview both MSFT and Sentillion and has a good summary write-up on the acquisition as well.

Joseph Goedart adds his analysis of the Sentillion acquisition in a Microsoft to Buy Sentillion post of 12/10/2009 to the Health Data Management blog:

Microsoft Corp. will acquire Sentillion Inc., an Andover, Mass.-based vendor of context management and single-sign-on software, for an undisclosed sum.

The vendors already are partners. Redmond, Washington-based Microsoft in mid-2009 signed a license to use Sentillion's software as a module with the Amalga Unified Intelligence System. Amalga is advanced data integration and aggregation software. Sentillion's applications enable users to access and simultaneously view patient data from multiple information systems during a single session.

Sentillion will continue to sell and support its products while Microsoft invests in the long-term evolution of the combined product suite, according to the companies. Sentillion will continue to operate out of its Andover headquarters. The companies expect the acquisition to close in early 2010.

More than 115 hospitals use Amalga and 1,000 use Sentillion's applications, according to the vendors. More information is available at sentillion.com and microsoft.com/amalga/default.mspx.

See John O’Brien’s free AzureGadget tool to measure Azure queue sizes in the Azure Blob, Table and Queue Services section.

Eric Nelson reports Windows Azure Platform gets a Current Status Dashboard in this 12/9/2009 post:

We are getting closer to being 100% live (and billable!). Spotted this today – a status dashboard.

[Screenshot: Windows Azure platform status dashboard]


Microsoft claims “The European Environment Agency's Eye on Earth site lets citizens track important environmental data. The Windows Azure-powered portal is being showcased at this week's United Nations Climate Change Conference in Copenhagen” in a Microsoft Helps Europeans Keep an 'Eye on Earth' press release of 12/9/2009:

… The new vantage point comes compliments of Eye on Earth, a joint project between Microsoft and the European Environment Agency (EEA) that is being shown at the United Nation's 15th Climate Change Conference in Copenhagen (COP15), which kicked off Dec. 7. One of the first applications built on the Microsoft Windows Azure cloud-computing platform, the Eye on Earth portal provides real-time environmental information to the 500 million people who live in the EEA's 32 member countries. It serves up that data in a visual format via Bing Maps. [Emphasis added.]

Keeping an Eye on Earth: At the Eye on Earth home page, visitors can click on links to sensors that show water or air quality at many locations across Europe.

John Moore asks Can RWJF’s Common [Health Record Development] Platform Gain Traction? [RWJF = Robert Wood Johnson Foundation] and concludes “it is unlikely that this investment will gain much traction in the market, remaining by and large an academic exercise” in this 12/8/2009 post:

… As much as Chilmark would love to say that the CDE is a great idea that will be met with broad adoption and use, we just don’t see that playing out for several reasons:

1) Google Health & HealthVault. It wasn’t that long ago that both Google Health and HealthVault were nothing but rumors. At that time the PHA/PHR market was in a funk, with numerous apps being developed but few gaining much traction, and the path to market was convoluted. The concept of a CDE was a welcome one and demonstrated foresight on the part of those leading Project HealthDesign. But PHA development is now coalescing around the major commercial platform plays of Google and Microsoft, and a third platform (CDE) with little market visibility will wither.

2) Marketing. RWJF is a great source of funding to push the envelope on what might be in regards to PHAs, but getting beyond the academic funding exercise to the promotion and marketing of the results of that funding is lackluster at best. They simply do not have the gravitas in the market, they do not get the ink, and subsequently few even know of the CDE, let alone have accessed the code (according to RWJF, since June ‘09 there have been 38 downloads altogether of either the compiled or source code for the CDE – hardly a stampede by developers of PHAs).

3) Smartphones. The advent of the iPhone with its thousands of medical, health & wellness apps, the more recent introduction of Google’s Android mobile OS, and the onslaught from virtually every other mobile app OS to have its own “AppStore” has completely changed the equation of what consumers will ultimately use as the input device for their online health information.  From collecting ODLs to granting access to personal health information (PHI) on the fly the smartphone will become the modality of choice within the next 5 years.  This is where the market for consumer-facing healthcare apps is headed and likewise, where developers will be focusing the majority of their attention and of course limited resources. …

<Return to section navigation list> 

Windows Azure Infrastructure

Barbara Darrow asserts For Microsoft Azure platform, late is good in her 12/10/2009 post to the SearchITChannel.com site:

The Microsoft Azure platform, when it makes its commercial debut in February, will be late to the cloud computing party compared to Amazon, Salesforce.com and Google. But it won't be too late, according to developers and solution providers weighing their own move into cloud computing applications.

In fact, some think that Microsoft's tardiness to market may pay off as it has in the past. The company was famously late to graphical user interfaces, Internet browsers, spreadsheets and word processors and went on to dominate all of those categories through sheer perseverance.

Even some Microsoft-centric developers and partners that had blasted the lack of commercially-hosted services from Microsoft just a few months ago, now say that Azure may hit the Web just as a critical mass of customers are finally ready to trust at least some of their information technology processes or data to a cloud of some type. …

• Steve Clayton adds his analysis of the Windows Azure + SQL Azure –> Server & Cloud Division reorg in his Windows Azure means business post of 12/10/2009:

At the Professional Developers Conference last month, Ray Ozzie and Bob Muglia made a series of announcements around Microsoft’s cloud platform that moved it from something of interest to developers to something that I think will catch the attention of CIOs too. A few things of note:

  • App Fabric & Project Sydney – bringing cloud and on premises together
  • System Center “cloud” on Muglia’s slides
  • VM support in Windows Azure
  • Dallas – putting public data sets in the cloud with a marketplace and open APIs

We’re now clearly in the IaaS, PaaS and SaaS world, as shown on this slide from Bob’s presentation:

[Slide from Bob Muglia’s presentation: IaaS, PaaS and SaaS]

What does this all mean? It means that Microsoft now has a full deck of cards on the table around cloud computing – though there is more to come with things like Office Web Apps – and we’re now showing customers that if they prefer to deploy IT capability in the cloud, be it infrastructure, apps they build or services we sell (like Exchange), they can choose to do that. If they’d like to continue on premises, they can choose to do that… and if they’d like a hybrid, they can choose to do that. Choice… not something other vendors are really offering. They tend to be all cloud, or all on premises. Of course this runs the risk of being confusing for customers, but the reality is, I think it’s the hybrid approach that many customers will take, putting some infrastructure in the cloud or “bursting” to the cloud when they need it during peak times whilst keeping some things, like HR systems, on premises for the moment. For a CIO and an IT director, that choice gives them plenty of ways to save money or to spend in a different way – i.e., on demand vs. up front. It also gives them a way to do things like proofs of concept or short-lived projects in a matter of moments, as they can dial servers up and down as they need. The elasticity of the cloud is a godsend for CIOs, CFOs and IT folks.

Fortunately, it’s not just me who thinks this, as there have been a number of posts on the web over the last week observing that Azure is ripe for business adoption. CIO magazine talked with Crispin Porter + Bogusky, who have been using Azure in anger already, and Chevron is eyeing it up.

Steve goes on to quote Gartner analyst Ray Valdes, as well as Network World and CRN writers about Azure’s appeal to businesses.

• Jason Miller’s Agencies to justify not using cloud computing to OMB post of 12/10/2009 to the Federal News Radio blog reports:

FederalNewsRadio has learned budget passback language also calls for alternative analyses for major IT projects in 2012.

The Office of Management and Budget will require agencies to develop an alternative analysis discussing how they could use cloud computing for all major technology projects for the fiscal 2012 budget.

Agencies will be expected to tell OMB why they wouldn't use cloud computing for these initiatives, according to the 2011 budget passback language obtained by FederalNewsRadio.

And in 2013, agencies must give OMB a complete alternatives analysis, showing how they could move to cloud computing, for mixed life-cycle projects where agencies are spending both new money (known as development, modernization and enhancement) and steady-state or operations and maintenance funding, the budget instructions say.

This language is on top of the plan by OMB to require agencies to launch a series of cloud computing pilots across the government in 2010 using the E-Government Fund. In the Financial Services bill, the House's version includes $33 million for the General Services Administration, which manages the fund for OMB. The Senate's version includes $35 million.

Congress has yet to finalize the bill, but OMB will receive more money for e-government than ever before. …

You can download an MP3 file of the broadcast here.

Reuven Cohen offers his 2010 Predictions - Cloudy with a Chance of Convergence in this 12/10/2009 post:

I'm off to Seoul, South Korea next week, but before I leave I wanted to give you a little holiday gift: yes, the gift of my prognostication. Before I do, as anyone who routinely reads my blog will understand, all I pretty much do is attempt to predict the future. As an entrepreneur, that has always been a key part of my successes & failures. (That, and I also seem to be an eternal optimist.) Generally, my view of the future is not shaped by selecting any particular point in time but instead by what I see from my ever-changing vantage point in the present. …

Ruv’s predictions fall into these categories:

    • Anytime Data - Real Time, Anytime and Anywhere
    • Emergent Clouds
    • Technological Convergence

Darryl K. Taft reports “Microsoft’s "Software + Services" strategy – delivering bits to customers in a variety of ways, from on-premises software to full-blown public/private cloud services -- is nowhere more evident than in the company's approach to the federal sector” in his Microsoft Takes Windows Azure to the Feds post to eWeek’s Cloud Computing News segment:

… Perhaps it is because of federal rules, regulations, procurement policies and whatnot, but in the federal sector, Microsoft's strategy, which many have criticized as blurry if not hyped, becomes as clear as a cloudless day.

At a FedScoop Cloud Computing Shoot Out here on Dec. 8, Susie Adams, Microsoft's chief technology officer for the federal sector, and Yousef Khalidi, a Microsoft distinguished engineer and member of the founding team that created the core of Microsoft's Windows Azure cloud platform, helped deliver some of that clarity.

In back-to-back conversations with eWEEK Adams and Khalidi laid out the Microsoft plan to deliver software across the "full spectrum," from on-premise IT to the cloud -- including private cloud-like environments -- and to take the lessons learned in doing so and parlay those back into the product and services lines.

"We're taking our learnings from the process of building Azure and putting that back into the Windows Server product and our other technology," Khalidi said.

Moreover, "We fundamentally don't believe that private clouds are going to go away," Adams said, noting that certain agencies with certain information and workloads will never want to see that stuff in a public cloud environment.

Although Windows Azure is a public cloud technology, "We have dedicated offerings -- dedicated clouds [if you will]," Adams said. This consists of a dedicated network pipe, compute services that Microsoft runs and the customer "manages who gets access and we run it for them." …

Ellen Rubin asks What Does Enterprise IT Really Want from Cloud Computing? in this 12/9/2009 post that’s subtitled “IT managers are serious about testing the cloud today:”

Analysts, bloggers and mainstream media have spent 2009 promoting cloud computing as “the next big thing” that will revolutionize the way companies buy and use computing power. But beyond the hype and the C-level interest in an exciting trend, there’s value to the cloud that appeals to the pragmatic, “show me” nature of enterprise IT.

The two main drivers for cloud computing are the same ones that have always motivated enterprise IT: save money (do more with less) and be more responsive to business needs. These goals are typically in conflict with each other, so that in tough times the first takes precedence and in boom times the second one does. …

Paul MacDougall reports “Software maker reorganizes server group and strikes NetApp deal as it prepares to release cloud platform” in his Microsoft Shakeup Signals Azure Launch post of 12/9/2009 for InformationWeek:

In a move that starts the countdown to Microsoft's Jan. 1 launch of its Windows Azure cloud services platform, Microsoft has shifted the product from a development group headed by chief software architect Ray Ozzie to a commercial unit under server boss Bob Muglia.

The company also announced it will partner with NetApp on the development of some cloud technologies. …

Microsoft is betting big on so-called cloud, or hosted, computing. The company has invested billions developing Azure and opening data centers from which to deliver services. Azure provides cloud based OS, development, and storage services that will offer enterprise customers off-premises computing.

Microsoft also plans to offer cloud systems that business customers can run in their own data centers. In keeping with that, the company on Wednesday announced a three-year partnership with storage and virtualization specialist NetApp.

Under the arrangement, the two companies will collaborate on product development, integration, and marketing of products and services for in-house cloud environments. In particular, the vendors will work to integrate NetApp's storage system with Microsoft's Windows Server 2008 R2 server OS and Hyper-V virtualization technology. …

James Urquhart delivers his take on the new Microsoft cloud reorg in his Microsoft Azure, Server teams form new cloud division post of 12/8/2009 to the CNet News Wisdom of Clouds blog:

… The move makes sense, as the company's "software plus services" strategy requires consistency in the management and execution capabilities of both Windows Server and Windows Azure. Microsoft has been working on both Azure and private cloud capabilities for some time now, though its Web site currently pitches its Dynamic Data Center Toolkit as a "foundation" for both private and partner cloud services.

It should be noted that this move means that CTO Ray Ozzie is no longer heading the Azure team, a signal that Azure has graduated from a technical project to a full-fledged Microsoft business. …

Nancy Gohring’s Microsoft, Cisco, IBM and Others Form Cloud Computing Group article of 12/8/2009 for the IDG News Service reports:

A group of companies is starting up an Enterprise Cloud Buyers Council in hopes of removing barriers to enterprise use of hosted cloud computing.

Initial members include companies that offer hosted cloud computing as well as enterprises that use such services, including Microsoft, IBM, HP, Cisco, AT&T, BT, EMC, Deutsche Bank, Alcatel-Lucent, Amdocs, CA, Nokia Siemens Networks, Telecom Italia and Telstra. Two industry organizations, Distributed Management Task Force and the IT Service Management Forum, are also involved. The TM Forum, an industry association that helps information and communications companies create profitable services, came up with the idea of the council.

One important issue that the council will try to address is the current fear among enterprises of vendor lock-in, said Gary Bruce, a principal researcher at BT. The council may decide to work on standards-based solutions around various layers of cloud computing, including the virtualization, management and control layers, so that enterprises can more easily port their projects from one cloud computing vendor to another, he said.

In addition, enterprises are often concerned about security and reliability, he said.

I find it hard to believe that Microsoft wants to avoid vendor lock-in to the Windows Azure Platform and SQL Azure.

<Return to section navigation list> 

Cloud Security and Governance

See Health Data Management’s Changes in Healthcare Data Protection Laws: What Enterprises Need to Know Webcast on 12/15/2009 in the Cloud Computing Events section.

See John Moore’s Savvy Move, MSFT Acquires Sentillion post in the Live Windows Azure Apps, Tools and Test Harnesses section.

John Moore recommends the European Network and Information Security Agency (ENISA)’s 120-page Cloud Computing Security Risk Assessment report in his Cloud Computing, Security & Privacy Considerations post of 12/9/2009:

While conducting research for the long-overdue and nearly completed report on Personal Health Clouds (Dossia, Google Health and HealthVault), I came across a recently published report by the European Network and Information Security Agency (ENISA) addressing cloud computing security. Though quite long (over 120 pages), the report provides a very comprehensive overview of cloud computing, its benefits and risks, and some very good risk assessment tools to assist in evaluating a cloud solution offering, including segmentation by SaaS, IaaS and PaaS.

With the rapid migration to the “cloud computing” paradigm in the healthcare sector, be it personal health clouds, HIE vendors transitioning to PaaS vendors (note: Medicity made their own PaaS announcement yesterday – more to follow in the near future), EMR vendors offering hosted solutions, the move to manage and store images in the cloud, or various niche vendors such as Medcommons, which uses Amazon to host its service, a report such as this is quite valuable and instructive both for potential users of cloud services and for those offering them.

If you have even a remote interest in this subject, trust me, just get the report, as it is one of the best I’ve come across to date.

Joe McKendric asserts Governance isn't just for on-premise services anymore and quotes Dave Linthicum in this 12/9/2009 post to ZDNet’s Service Oriented blog:

What works for SOA-aware service governance should work for cloud service governance.

Dave Linthicum, author of the recently published Cloud Computing and SOA Convergence in Your Enterprise: A Step-by-Step Guide, made the point that it’s time to bring cloud services under the same governance umbrella as SOA services:

“In the world of SOA, simply put, governance means designing, building, testing, and implementing policies for services monitoring and their use. Governance as related to services, or service governance, is most applicable to the use of cloud computing, since we are basically defining our architecture as a set of services that are relocatable between on-premise and cloud computing-based services.”

The question is, where are the vendors on this? Most SOA governance solutions, up to this point, have focused on Web services. Now it appears some vendors are extending the concept of service governance to address cloud-based services.

For example, this week, AmberPoint, best known for its SOA management platform, and SOA Software, which has been in the governance game for a few years, both announced new governance offerings, and both point to the clouds. These offerings extend their reach to REST-based services and beyond, both vendors say.

Click here for more information on AmberPoint’s application and SOA governance offerings.

Robert Rowley, MD voices concerns about compliance with HIPAA privacy laws in his Using EHR data for the public-health good post of 12/9/2009:

One of the most important barriers to getting population-health data is the concern that PHI privacy could be violated. After all, health information is very personal and sensitive (perhaps, one could argue, even more than personal banking information), and HIPAA Privacy Laws govern the protection, privacy and security of such information.

In order that data extracted from EHRs can be used for such public health purposes, it would need to be de-identified. But is true de-identification possible? This has been the subject of numerous blog articles, and it has been argued that with just a few pieces of data, re-identification can be achieved.

From the standpoint of HIPAA, there is a “safe harbor” if all 18 identifiers enumerated in section 164.514(b)(2) are removed from an individual patient data point. These identifiers are (1) names, (2) geographic subdivision smaller than a state, (3) all elements of dates (except year) related to an individual (including dates of admission, discharge, birth, death), (4) phone numbers, (5) fax numbers, (6) email address, (7) social security numbers, (8) medical record numbers, (9) health plan beneficiary numbers, (10) account numbers, (11) certificate/license numbers, (12) vehicle VIN and license plate numbers, (13) device identifiers and serial numbers, (14) web URL’s, (15) internet IP addresses, (16) biometric identifiers (including finger and voice prints), (17) full face photos and comparable images, and (18) any unique identifying number.

As noted in legal reviews, the HIPAA Privacy Rule permits covered entities to release data that has been de-identified without obtaining an authorization and without further restrictions upon use or disclosure, because de-identified data is not PHI, and therefore not subject to the Privacy Rule.
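
For developers who want to build cloud-hosted analytics on EHR data, the safe-harbor route boils down to stripping (or generalizing) those 18 categories of fields before any record leaves the covered entity. The toy sketch below illustrates the idea only; the field names are hypothetical, and real de-identification needs a proper compliance review, not a string filter:

using System.Collections.Generic;

static class SafeHarborDeidentifier
{
    // Hypothetical field names standing in for HIPAA's 18 identifier categories.
    static readonly HashSet<string> IdentifierFields = new HashSet<string>
    {
        "Name", "StreetAddress", "City", "ZipCode", "BirthDate", "AdmissionDate",
        "DischargeDate", "Phone", "Fax", "Email", "Ssn", "MedicalRecordNumber",
        "HealthPlanNumber", "AccountNumber", "LicenseNumber", "VehicleId",
        "DeviceSerial", "Url", "IpAddress", "BiometricId", "PhotoId"
    };

    // Returns a copy of the record with identifier fields removed; non-identifying
    // clinical fields (diagnosis codes, lab values, year of birth, state) pass through.
    public static Dictionary<string, string> Deidentify(Dictionary<string, string> record)
    {
        var clean = new Dictionary<string, string>();
        foreach (KeyValuePair<string, string> pair in record)
        {
            if (!IdentifierFields.Contains(pair.Key))
                clean.Add(pair.Key, pair.Value);
        }
        return clean;
    }
}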

<Return to section navigation list> 

Cloud Computing Events

Steve Riley will present two cloud-related sessions on Thursday, 12/10/2009 from 6:00 PM to 9:00 PM EST to the New York IT Security User Group (NYITSUG) at the AXA Financial Building, 1290 6th Avenue (nee Avenue of the Americas), New York, NY 10104 (map):

Fear the cloud no more
Suddenly, it seems, the simple network diagram symbol for the Internet has become a major component for providing infrastructure platforms and service offerings. Unlike the application service provider days of the late 1990s, cloud computing is here to stay. It’s already gained much traction for specialty computing purposes, yet many IT shops remain wary. Moving compute and storage out of your own data center and into someone else’s, mingled among many others, seems daunting at first. Common questions arise around security, manageability, performance, and reliability. Think about it, though–these are the same concerns you’ve always had. Nothing about the cloud requires that you jettison everything you’ve learned during your career. The cloud is a logical next step in the evolution of computing, and when integrated with corporate IT removes much of the burden and allows a business to concentrate on its core functions.

Security and compliance in the cloud
Moving to the cloud raises lots of questions, mostly about security. Providers worthy of your business should answer them clearly and honestly. Amazon Web Services has built an infrastructure and established processes to mitigate common vulnerabilities and offer a safe compute and storage environment.

Steve is a former Microsoft security guru and is now a Senior Technical Program Manager at Amazon.com.

• Health Data Management presents a Changes in Healthcare Data Protection Laws: What Enterprises Need to Know Webcast on 12/15/2009 at 2:00 PM EST (free registration required):

Are you uncertain about EHR protection requirements introduced by ARRA and the HITECH Act? You’re not alone. In this web seminar, Forrester Senior Analyst Andrew Jaquith will help you navigate the new guidelines for data privacy and breach disclosure, and recommend strategies for protecting data and reducing risk. Also, technology experts from Intel and Lenovo will discuss how you can ensure maximum protection for mobile computers – the most common source of electronic health records (EHR) data breaches.

Discussion topics will include:

  • Security obligations under the HITECH Act, HIPAA and state laws
  • Proving compliance in the exchange and storage of electronic records
  • Anti-theft technology for minimizing the risks associated with mobile computers
  • Services for multi-layered security offered by PC providers

If your organization is implementing EHR, register now for this unique chance for expert analysis of data privacy issues.

Speakers:

  • Andrew Jaquith, Senior Analyst, Forrester Research
  • Stacy Cannady, CISSP, Product Manager, Security, Lenovo
  • Jared S. Quoyeser, Americas Healthcare Industry Manager, Intel
  • Geoff Glave, Product Manager, Absolute Software

• InformationWeek Webcasts offer a 45-minute Smarter PC and Laptop Management with Cloud-Based BI webcast on Tuesday, 12/15/2009 at 11:00 AM PST (registration required):

If you are managing PCs, laptops or mobile devices, cloud-based business intelligence services can make your job much easier, not to mention helping you improve data security and reduce management costs.

In this 45-minute webcast, our featured guest, Chris Silva of the independent research firm Forrester Research, Inc., will discuss why PC managers and "Mobile Operations Managers" need to increase their ability to monitor what happens on endpoints. He will also present research results on how operations managers are coping with their most pressing challenges.

Jonathan Dale of MaaS360 will then present case studies showing how a cloud-based business intelligence tool helped two enterprises:

  • Detect unknown security vulnerabilities.
  • Identify why some systems were not ready for software upgrades.
  • Find risky software packages on systems in remote locations.
  • Produce compliance reports that saved weeks of data compilation.
  • Prove that lost or stolen laptops were fully encrypted.

After 45 minutes you will understand how the information provided by "endpoint intelligence" can simplify your job and make you more effective. You will learn how to take the next step by using cloud-based tools to enforce policies and perform remediation. Finally, you will receive information about a simple trial that can be used to assess the value of visibility into endpoints in your own environment.

    Brandon Sanford of Waggener Edstrom announced Microsoft MIX10 registration now open in a 12/9/2009 e-mail:

    Today Microsoft announced that registration is now open for MIX10, as well as the event’s keynote line-up. MIX10 will be held March 15 - 17, 2010 at the Mandalay Bay in Las Vegas.

    Keynoters include Bill Buxton, Microsoft Principal Researcher and author of Sketching User Experiences, and Scott Guthrie, corporate vice president of Microsoft’s .NET Developer Division. The first sessions and workshops were also disclosed, covering topics including design/user experience (UX), mobile, rich Internet applications (RIAs) and web standards. Many of this year’s MIX sessions will be selected via online voting. An open call for session content is now live at http://live.visitmix.com/opencall.

    MIX is Microsoft’s premier event for designers and developers who build innovative consumer web sites including coders, strategists, information architects, visual designers, UX professionals and digital marketers. At MIX10, attendees will learn how Microsoft tools bring together client, server and cloud to create rich applications that span form factors, platforms and devices. 

    Press registration is also now available. Please contact MIX10Registration@waggeneredstrom.com to request your complimentary registration.

    So far, there’s only one session in the Cloud category: James Hamilton’s Cloud Computing Economies of Scale:

    The past few years have seen dramatic increases in the size and efficiency of the world’s largest data centers, hosted at providers like Amazon, Microsoft, and Google. As the industry builds out for the coming age of cloud computing, we are being forced to rethink old problems and learn new lessons. What we are learning about the unique economics of cloud computing will be impacting the industry for years to come. Come hear the latest insights about cloud computing economics and how it will impact you.

    Go here to submit proposals for additional Windows Azure/SQL Azure sessions.

    <Return to section navigation list> 

    Other Cloud Computing Platforms and Services

    • Amazon Web Services reports Amazon EC2 Now Offers Windows Server 2008, as well as SQL Server 2008 Express and Standard, in a 12/10/2009 e-mail plus updated Amazon Elastic Compute Cloud (Amazon EC2) and Amazon Elastic Compute Cloud (EC2) Running Microsoft Windows Server and SQL Server (Beta) pages.

    Following are the latest Windows and SQL Server 2008 AMIs:

    Prices are the same for Windows Server 2003 and 2008 versions. There’s no surcharge for SQL Server 2008 Express but SQL Server 2008 Standard Edition runs $1.08 per hour, which computes to $777.60 per month. If you can live with a 10GB maximum database size, SQL Azure’s Business Edition at $99.95 per month is a comparative bargain. David Robinson of Microsoft’s SQL Azure Team promised at PDC 2009 larger database size limits in future versions.
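
    If you want to check that monthly figure yourself, it’s just the hourly surcharge multiplied by the hours in a month. A trivial sketch using the prices quoted above and an assumed 720-hour (30-day) month:

    using System;

    class Ec2VsSqlAzureCost
    {
        static void Main()
        {
            const double sqlServerStandardPerHour = 1.08;   // EC2 surcharge for SQL Server 2008 Standard
            const double hoursPerMonth = 720;               // 30 days x 24 hours (assumption)
            const double sqlAzureBusinessPerMonth = 99.95;  // SQL Azure Business Edition, 10 GB cap

            double ec2Monthly = sqlServerStandardPerHour * hoursPerMonth;   // 777.60
            Console.WriteLine("EC2 SQL Server 2008 Standard surcharge: ${0:F2}/month", ec2Monthly);
            Console.WriteLine("SQL Azure Business Edition:             ${0:F2}/month", sqlAzureBusinessPerMonth);
            Console.WriteLine("Difference:                             ${0:F2}/month",
                ec2Monthly - sqlAzureBusinessPerMonth);
        }
    }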

    Jeff Barr offers his two cents’ worth in an Amazon EC2 Running Microsoft Windows Server 2008 post of even date. He also provides a retrospective, The AWS Blog: The First Five Years.

    I’m surprised that Werner Vogels hasn’t chimed in on the new offering with a blog post, but he tweeted:

    Big deal for those who want to run a MS 2010 app stack: Amazon EC2 now runs Microsoft Windows Server 2008 ( http://tr.im/HbQB ) #ec2 #aws

    early Thursday morning.

    The Amazon Web Services team continues to put its competitive ducks in a row to prepare for SQL Azure’s formal debut on 1/1/2010.

    Salvatore Genovese reports a “Cloud Computing outage occurred on Dec. 9, 2009 beginning at approximately 3:34 a.m. EST and lasted approximately 44 minutes” in his Amazon EC2 Outage post of 12/10/2009:

    Apparent Networks issued a new Performance Advisory detailing an outage of Amazon’s Elastic Compute Cloud (EC2) services.

    The outage occurred on Dec. 9, 2009 beginning at approximately 3:34 a.m. EST and lasted approximately 44 minutes.

    During that time, access to systems in Amazon’s northern Virginia data center was unavailable to businesses.

    Apparent Networks’ Cloud Performance Center, a free service that offers performance data on leading cloud computing service providers such as Amazon, Google and GoGrid, detected the outage.

    The Cloud Computing Performance Center utilizes Apparent Networks’ PathView Cloud service to test the performance of cloud service providers. The service has been configured to sample path performance to a series of pre-determined targets hosted at Amazon’s data centers every 120 seconds.

    A single 44-minute outage in a month would result in an uptime of 99.8982%, which is lower than Amazon’s 99.95% SLA for its data centers. The Amazon Elastic Compute Cloud (Amazon EC2) page says:

    The Amazon EC2 Service Level Agreement commitment is 99.95% availability for each Amazon EC2 Region.
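
    For the record, the 99.8982% figure is just one minus the downtime divided by the minutes in the month; a quick sketch assuming a 30-day month lands within a rounding error of it:

    class UptimeCheck
    {
        static void Main()
        {
            const double outageMinutes = 44;
            const double minutesPerMonth = 30 * 24 * 60;   // 43,200 minutes in a 30-day month (assumption)

            double uptimePercent = 100 * (1 - outageMinutes / minutesPerMonth);
            // Prints roughly 99.898%, just under the 99.95% SLA threshold cited above.
            System.Console.WriteLine("Monthly uptime: {0:F4}%", uptimePercent);
        }
    }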

    Marek Czarzbon’s How to install SharePoint 2010 on Amazon Cloud : EC2 post of 12/10/2009 gives step-by-step instructions for installing SharePoint 2010 public beta on the following AMI:

    Marek estimates Amazon EC2 charges for the installation alone to be about US$1.50.

    Jeff Barr describes an Amazon EC2 Cost Comparison Calculator spreadsheet as part of a new AWS Economics Center in his The Economics of AWS post of 12/8/2009 to the Amazon Web Services blog:

    For the past several years, many people have claimed that cloud computing can reduce a company's costs, improve cash flow, reduce risks, and maximize revenue opportunities. Until now, prospective customers have had to do a lot of leg work to compare the costs of a flexible solution based on cloud computing to a more traditional static model. Doing a genuine "apples to apples" comparison turns out to be complex — it is easy to neglect internal costs which are hidden away as "overhead".

    We want to make sure that anyone evaluating the economics of AWS has the tools and information needed to do an accurate and thorough job. To that end, today we released a pair of white papers and an Amazon EC2 Cost Comparison Calculator spreadsheet as part of our brand new AWS Economics Center. This center will contain the resources that developers and financial decision makers need in order to make an informed choice. We have had many in-depth conversations with CIO's, IT Directors, and other IT staff, and most of them have told us that their infrastructure costs are structured in a unique way and difficult to understand. Performing a truly accurate analysis will still require deep, thoughtful analysis of an enterprise's costs, but we hope that the resources and tools below will provide a good springboard for that investigation.

    Here’s a screen capture of the Amazon EC2 Cost Comparison Calculator, “a rich Excel spreadsheet that serves as a starting point for your own analysis:”

    [Screenshot: Amazon EC2 Cost Comparison Calculator spreadsheet]

    Ray Le Maistre reports BT, Cisco Claim Cloud Coup in this 12/9/2009 post to the Light Reading Europe blog:

    If you hear an increasingly loud rumbling noise, you'll probably find it's either: that chicken pesto sandwich you had for lunch; or the sound of telecom operators scrambling to position themselves as cloud services pioneers.

    And carriers need to move fast if they're to play a significant role in the cloud revolution. (See Capturing SaaS Cloud Computing Business, TM Forum Seeks Enterprise Help With the Cloud, Outlook Cloudy for Telco IaaS, and Amazon's Lessons for Telcos.)

    The latest operator to make a noise about its hosted services offerings is BT Group plc (NYSE: BT; London: BTA), though this isn't the British incumbent's first foray into the world of so-called cloud services. (See BT, Microsoft Get Cloudy .)

    Today, the operator is taking its next big step down the hosted applications road. In cahoots with long-time partner Cisco Systems Inc. (Nasdaq: CSCO), BT has unveiled a "global hosted IP telephony service" that "allows businesses to bring converged voice, mobile and data services to every desktop in their organisation, using BT and Cisco’s cloud computing-based technologies."

    The Cisco technology in question is the Hosted Unified Communications Services (HUCS) platform. The IP giant, though, has plenty of other ideas about how it can help cloud services reign. [Ed. note: Geddit?] (See Cisco Plays in the Clouds.) …

    The TM Forum Rallies Industry Giants to Create Ecosystem to Accelerate Cloud Services Adoption press release of 12/8/2009 announces:

    Major users to form Enterprise Cloud Buyers Council (ECBC) as the core driver

    Cloud service providers and technology suppliers to join with users to collaborate on a comprehensive program for accelerating commercial availability of managed and secure cloud services

    ORLANDO, FL, USA - December 8, 2009 - TM Forum, the world's premier industry group focused on business effectiveness for the communications and media sectors, today announced the formation of an ecosystem of major industry players in the emerging cloud services sector. The centerpiece of this effort is the creation of the Enterprise Cloud Buyers Council (ECBC) whose goal is to understand the needs of the largest global cloud buyers and ensure any impediments to the uptake of cloud technology are removed. Together with key service and technology suppliers, the ecosystem will initiate a range of programs designed to remove barriers to the growth of commercial cloud services. …

    TM Forum is an industry association dedicated to helping companies in the information, communications and entertainment industries reduce the costs and risks associated with creating and delivering profitable services. The Forum's initiatives focus on providing industry research, publications, technology roadmaps, best practices, software standards, certified training courses and conferences to its more than 700 member companies in 75 countries. Membership includes the world's largest service providers, cable and network operators, software suppliers, equipment suppliers and systems integrators. To learn more, please visit www.tmforum.org.

    <Return to section navigation list> 
