Tuesday, September 29, 2009

Windows Azure and Cloud Computing Posts for 9/28/2009+

Windows Azure, Azure Data Services, SQL Azure Database and related cloud computing topics now appear in this weekly series.

• Update 9/30/2009: WordPress on Windows Azure, porting NopCommerce to Windows Azure and SQL Azure, Kaiser Permanente’s long road to electronic records and its members’ satisfaction with PHRs, what steps to take following a data breach, IBM’s self-service approach to the cloud, and much more.

Note: This post is updated daily or more frequently, depending on the availability of new articles, in the sections that follow.

To use the section links, first click the post’s title to display the single article you want to navigate.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here. 

Azure Blob, Table and Queue Services

Kevin Hoffman explains the why and how of Binary Serialization and Azure Web Applications in this 9/29/2009 post:

You might be thinking, pfft, I'm never going to need to use Binary Serialization...that's old school. And you might be right, but think about this: Azure Storage charges you by how much you're storing and some aspects of Azure also charge you based on the bandwidth consumed. Do you want to store/transmit a big-ass bloated pile of XML or do you want to store/transmit a condensed binary serialization of your object graph?

I'm using Blob and Queue storage for several things and I've actually got a couple of projects going right now where I'm using binary serialization for both Blobs and Queue messages. The problem shows up when you try and use the BinaryFormatter class' Serialize method. This method requires the Security privilege, which your code doesn't have when it's running in the default Azure configuration. [Emphasis Kevin’s.]

So how do you fix this problem so that you can successfully serialize/deserialize binary object graphs and maybe save a buck or two? Easy! Turn on full-trust in your service definition for whichever role is going to be using the binary serialization (in my case both my worker and web roles will be using it...).

Kevin then shows you how to turn on full trust.
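
For context, here’s a minimal sketch (mine, not Kevin’s code) of the kind of BinaryFormatter round trip that hits the Security-permission wall; in the 2009 CTP, the fix he describes amounts to setting enableNativeCodeExecution="true" on the role element in ServiceDefinition.csdef. The OrderMessage type is a made-up placeholder:

    using System;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    [Serializable]
    public class OrderMessage
    {
        public Guid Id;
        public string Sku;
        public int Quantity;
    }

    public static class BinarySerializer
    {
        // Serialize an object graph to a compact byte array suitable for a
        // queue message or blob payload. BinaryFormatter.Serialize demands
        // SecurityPermission, hence the full-trust requirement above.
        public static byte[] ToBytes(object graph)
        {
            using (MemoryStream stream = new MemoryStream())
            {
                new BinaryFormatter().Serialize(stream, graph);
                return stream.ToArray();
            }
        }

        // Deserialize the byte array back into the typed object graph.
        public static T FromBytes<T>(byte[] data)
        {
            using (MemoryStream stream = new MemoryStream(data))
            {
                return (T)new BinaryFormatter().Deserialize(stream);
            }
        }
    }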

Kevin Hoffman’s Configuration Settings in Azure Applications post of 9/28/2009 begins:

One of the double-edged swords of Azure is that it feels so much like building regular web applications. This is a good thing in that you can re-use so much of your existing skills, knowledge, and best practices and they will still apply in the Azure world. However, it is really easy to make assumptions about how things work that turn out to be wrong. …

and then describes when to use a service configuration setting versus a web.config setting for storage account and configuration data.
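
To make the distinction concrete, here’s a rough sketch against the CTP-era RoleManager API (my illustration, not Kevin’s code; the setting names are placeholders). Values you may need to change after deployment, such as storage credentials, belong in ServiceConfiguration.cscfg, which can be edited in the portal without redeploying the package; truly static values can stay in web.config:

    using System.Configuration;
    using Microsoft.ServiceHosting.ServiceRuntime;

    public static class AppSettings
    {
        // Declared in ServiceDefinition.csdef, set in ServiceConfiguration.cscfg;
        // editable in the portal without redeploying the package.
        public static string StorageAccountName
        {
            get { return RoleManager.GetConfigurationSetting("AccountName"); }
        }

        // Values that don't differ between staging and production can stay
        // in web.config and be read the usual ASP.NET way.
        public static string DefaultPageTitle
        {
            get { return ConfigurationManager.AppSettings["DefaultPageTitle"]; }
        }
    }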

CloudBerry Lab announces the beta program for its Azure Blob Storage explorer. According to a CloudBerry comment on a recent OakLeaf post:

CloudBerry Lab is looking for beta testers for their CloudBerry Explorer for Azure Blob Storage. It is a freeware application that helps you manage Azure blob storage with an FTP-like interface. Currently CloudBerry Explorer is the most popular Amazon S3 client on the Windows platform, and we decided to extend it with Azure storage support.

Please sign up here.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Dag König published the source code for his new SQL Azure Explorer to CodePlex on 9/28/2009:

This is an attempt to learn something of SQL Azure and VS2010 Addin development by creating an SQL Azure Explorer *Addin for Visual Studio 2010 Beta 1*. Even though Microsoft will probably provide such a tool eventually, this is for learning and having some programming fun.

This addin will only work for Visual Studio 2010 Beta 1. Also, the performance for anything but small databases is very slow right now, as it is doing a big chunk of querying at startup. This is going to be fixed though. This is a start :)

Some of the main features right now are:

  • SQL Azure Explorer which contains:
    • Databases
    • Tables with columns
    • Views with columns
    • Stored procs with parameters
    • Functions with parameters
  • Context menus for:
    • Open Sql Editor Window
    • Select Top 100 Rows
    • Script as CREATE for all tables, views, stored procs and functions
  • SQL Editor Window with built in:
    • SQL Execute
    • Offline parser
    • Script format[t]er


Dag is a Developer Evangelist for Microsoft Sweden.

<Return to section navigation list> 

.NET Services: Access Control, Service Bus and Workflow

Nate Dudek shows you How to Build Your First Azure-Powered ASP.NET MVC App in this 9/29/2009 post:

The Visual Studio project templates included with the Azure Tools provide a quick way to get started with a cloud-hosted web application. Unfortunately, they only support classic ASP.NET web projects by default. This tutorial will get you going on deploying an ASP.NET MVC web application to Azure.

Prerequisites

To get started, you’ll need to have the following tools installed on your machine:

Installing the Azure SDK will install the two important local development components – Development Fabric, which simulates the cloud on your local machine, and Development Storage, which simulates the Azure Storage components (table, blob, and queue) using SQL Server.

Thanks to Kevin Hoffman for the heads-up.

Kevin Hoffman’s ASP.NET Membership Provider in the Cloud : The Chicken and the Egg Problem post of 9/28/2009 describes how to ensure that you have the admin role for Web apps in the Azure Fabric:

Let's take a look at this pretty common scenario. You're building an ASP.NET application (MVC or otherwise) and you intend to publish it in the cloud and you're using Azure Storage (not SQL Azure) for your underlying data store. You've already hooked your app up with the sample Azure-based Membership provider that comes with the Azure SDK and everything is running along nicely.

Your application has quite a bit of administrator-only functionality so, after you've been using it locally for a while you put in some safeguards to block access to the admin areas unless the user is in the Administrators role. That's awesome and ASP.NET and ASP.NET MVC both have some really great code shortcuts for enabling this kind of situation and you can make yourself an administrator pretty darn easily.

So you're an admin and you deploy your application to staging and you go to run it and you try to log in. Whoops, your account isn't there. This is because for the last couple of weeks you've been running against your local SQL 2008 (or SQL Express) database and you forgot that you did a few tweaks to make yourself an administrator. In the last couple of weeks you removed the code on the site that allows users to self-register since your application is an LOB app with a manually administered user list. …
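
One common way out of the chicken-and-egg trap is to seed the role and account when the application starts. The following is my illustrative sketch using the standard ASP.NET Membership and Roles APIs, not Kevin’s code; the names and credentials are placeholders:

    using System.Web.Security;

    public static class AdminBootstrapper
    {
        // Call once at startup, e.g. from Application_Start in Global.asax,
        // so a freshly deployed cloud instance isn't locked out of its own
        // admin-only pages.
        public static void EnsureAdminUser()
        {
            if (!Roles.RoleExists("Administrators"))
                Roles.CreateRole("Administrators");

            if (Membership.GetUser("admin") == null)
            {
                // In production, read these values from service configuration
                // rather than hard-coding them.
                Membership.CreateUser("admin", "ChangeMe!123", "admin@example.com");
                Roles.AddUserToRole("admin", "Administrators");
            }
        }
    }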

<Return to section navigation list> 

Live Windows Azure Apps, Tools and Test Harnesses

WordPress has uploaded a starter WordPress for Windows Azure project to the Windows Azure Platform. The WordPress on Windows Azure post says, “It uses SQL Azure as its database.”

I tried posting a comment to the post, but received an HTTP 500 error.

Daniel Root’s eCommerce in the Cloud: Running NopCommerce on Azure post of 9/30/2009 explains that he:

was able to get NopCommerce running on Azure in just a few hours with relatively little fuss.  In a real project, there would of course be all of the normal issues, such as setting up products, design, and such, but Azure was really not much more difficult than your typical hosting provider.

Dan continues with a detailed description of how he ported NopCommerce to an Azure Web app and SQL Azure database.

• Jack Mann details Practice Fusion’s Ryan Howard: Five benefits of cloud-based electronic health records (EHRs) in this 9/30/2009 post to the ExecutiveBiz blog. The five benefits, in brief, are:

    1. A cloud-based model is cost-effective.
    2. A cloud-based model is secure.
    3. Federal criteria for meaningful use will likely cover three scenarios.
    4. Aligning with a web-based provider is key.
    5. The future belongs to a centralized platform.

Mann adds considerable detail to each of the stated benefits.

David Pallman reports in Azure ROI Calculator Updated of 9/30/2009:

Neudesic's Azure ROI Calculator has been updated. There are two primary changes in Beta 2 of the calculator:

Compute Time now defaults to 24 hrs/day for all scenarios. Having received some clarification since the July pricing announcement, it's now clear that compute time charges are based not on application usage but on chronological time. Therefore, you'll always be computing your charges based on 24 hours a day for each hosting instance. The calculator now reflects this.

Vertical scrolling is now in place. Previously, you couldn't see all of the calculator on smaller resolution displays.

These fixes make the ROI calculator much easier for most folks to use.
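
For a concrete sense of what the 24-hour rule means, assume the $0.12 per service-hour small-instance rate from the July announcement: one hosted instance accrues 24 × 30 = 720 compute hours in a 30-day month, or roughly $86.40, whether it serves one request or a million.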

Glenn Laffel, MD, PhD reports that Kaiser Permanente’s Medicare Beneficiaries Love their PHRs in this 9/30/2009 post:

The results of a recent survey suggest that Medicare beneficiaries who use Kaiser Permanente’s personal health record are overwhelmingly satisfied with the service, and are in general quite comfortable using the Internet to manage their health care online.

The health plan’s PHR—known as My Health Manager—is available only to Kaiser enrollees and so far as we know, is the only PHR that links directly to an electronic health record (in this case it is HealthConnect, Kaiser’s modified version of an Epic product).

Kaiser presented the gratifying findings last week at the World Health Care Congress’ 5th Annual Leadership Summit on Medicare in Washington, D.C.

Twenty-three percent of the seniors responded to the e-mail survey, which was distributed to more than 15,000 people.

The survey examined respondents’ Internet utilization habits and comfort with computers, as well as current health status and use of prescription drugs.

Nearly 88% of survey respondents reported being satisfied or very satisfied with Kaiser’s PHR.

Kim Nash looks down The Long Road to E-Health Records in this 9/25/2009 Computerworld article about Kaiser Permanente’s use of EMR:

When CIOs debate the difficulty of installing electronic medical records, they inevitably point to Kaiser Permanente. The $40 billion healthcare organization has been deploying electronic medical records (EMR) in various pockets of its provider and insurance network for more than a decade and decided to link them all into one companywide system. System outages, physician rebellion, privacy issues: Kaiser has dealt with it all. CIO Phil Fasano, who joined Kaiser in 2006, talks about weathering the ups and downs.

Anna-Lisa Silvestre, Kaiser Permanente's vice president for online services, and Peter Neupert, Microsoft's corporate vice president for the Health Solutions Group, announced on 6/9/2008 that “Kaiser Permanente and Microsoft will pilot health data transfer from the Kaiser Permanente personal health record, My Health Manager, to the Microsoft HealthVault consumer platform” in a News Conference Call – Microsoft HealthVault & Kaiser Permanente Pilot Program press release.

Jean-Christophe Cimetiere claimed New bridge broadens Java and .NET interoperability with ADO.NET Data Services in this 9/28/2009 post to the Microsoft Interoperability Blog:

Much of the work that we have collaborated on in the past several months has been centered around PHP, but rest assured we have been focused on other technologies as well. Take Java, for example. A big congratulations goes out this week to Noelios Technologies, which just released a new bridge for Java and .NET.

Noelios Technologies is shipping a new version of the Restlet open source project, a lightweight REST framework for Java that includes the Restlet Extension for ADO.NET Data Services. The extension makes it easier for Java developers to take advantage of ADO.NET Data Services.

Microsoft collaborated with the France-based consulting services firm and provided funding to build this extension to the Restlet Framework. It’s always very exciting for me, as a French citizen living in the United States, to witness French companies like Noelios collaborating with Microsoft to develop new scenarios and bridges between different technologies. Noelios specializes in Web technologies like RESTful Web, Mobile Web, cloud computing, and Semantic Web, and offers commercial licenses and technical support plans for the Restlet Framework to customers around the world.

The ADO.NET Data Services extension documentation’s Introduction begins:

REST can play a key role in order to facilitate the interoperability between Java and Microsoft environments. To demonstrate this, the Restlet team collaborated with Microsoft in order to build a new Restlet extension that provides several high level features for accessing ADO.NET Data Services.

The Open Government Data Initiative (OGDI) is an initiative led by Microsoft. OGDI uses the Azure platform to expose a set of public data from several government agencies of the United States. This data is exposed via a RESTful API which can be accessed from a variety of client technologies, in this case Java with the dedicated extension of the Restlet framework. The rest of the article shows how to start with this extension and illustrates its simplicity of use. … [Emphasis added.]

This looks to me like the RESTful start of a StorageClient library for Java programmers.

Joseph Goedert reports Free PHR a Hit at Indiana University on 8/25/2009:

The Indiana University Health Center in Bloomington early this year began testing a free personal health record for students. The goal was to work out bugs, and offer the PHR to the incoming freshman class this fall (see healthdatamanagement.com/issues/2009_67/-28272-1.html).

Just weeks into the new semester, 3,100 of 7,200 incoming students--40% of the class--have activated a PHR and entered some data, says Pete Grogg, associate director at the health center. And half of those with a PHR are sharing data with the center as they start seeking treatment. "We're very happy, we weren't quite sure what to expect," Grogg says.

The university this fall expects to complete integration work and populate PHRs with pertinent patient data from the center's electronic health records system. Students presently can populate the PHR with data they receive from their primary care physician, or the health center can scan that information into the PHR. The PHR vendor, Fort Wayne-based NoMoreClipboard.com, soon will add features to enable students to request medication refills and view their financial history online.

NoMoreClipboard.com integrates with Microsoft HealthVault.

PR Newswire announces CVS Caremark and Microsoft HealthVault Expand Partnership to CVS/pharmacy Customers in a 9/29/2009 press release:

CVS Caremark (NYSE: CVS) today announced the expansion of its partnership with Microsoft HealthVault. Now, CVS/pharmacy customers have the ability to securely download their prescription histories to their individual Microsoft HealthVault record. By visiting CVS.com, consumers who fill their prescriptions at CVS/pharmacy stores can now easily add their prescription history into their HealthVault record.

CVS Caremark has been a partner with Microsoft HealthVault since June 2008. Consumers using CVS Caremark for pharmacy benefit management services can already store, organize, and manage their prescription history information online using Microsoft's HealthVault platform. In addition, patients who receive treatment at MinuteClinic, the retail-based health clinic subsidiary of CVS Caremark, can securely import their visit summaries and laboratory test results into their personal HealthVault record. …

I still haven’t heard when Walgreens will complete their software for uploading prescription data to HealthVault (see the “PHR Service Providers, Microsoft HealthVault and Windows Azure” section of my Electronic Health Record Data Required for Proposed ARRA “Meaningful Use” Standards post of 9/5/2009 for more details).

Steve Lohr’s E-Records Get a Big Endorsement article of 9/27/2009 describes how a New York regional hospital group plans to offer affiliated physicians up to about 90% of the maximum federal subsidy for adopting Electronic Medical Record (EMR) technology:

North Shore-Long Island Jewish Health System plans to offer its 7,000 affiliated doctors subsidies of up to $40,000 each over five years to adopt digital patient records. That would be in addition to federal support for computerizing patient records, which can total $44,000 per doctor over five years.

The federal [ARRA] program includes $19 billion in incentive payments to computerize patient records, as a way to improve care and curb costs. And the government initiative has been getting reinforcement from hospitals. Many are reaching out to their affiliated physicians — doctors with admitting privileges, though not employed by the hospital — offering technical help and some financial assistance to move from paper to electronic health records.

Efforts by hospital groups to assist affiliated doctors include projects at Memorial Hermann Healthcare System in Houston and Tufts Medical Center in Boston. But the size of the North Shore program appears to be in a class by itself, according to industry analysts and executives.

Big hospitals operators like North Shore, analysts say, want to use electronic health records that share data among doctors’ offices, labs and hospitals to coordinate patient care, reduce unnecessary tests and cut down on medical mistakes.

<Return to section navigation list> 

Windows Azure Infrastructure

 Peter Kretzman contends: Cloud computing: misunderstood, but really not that complicated a concept in this 9/29/2009 essay:

…[T]he reason that so many of these mainstream articles get it so wrong, is they’re trying to explain cloud computing as a consumer-oriented phenomenon, and it’s basically not. Not the exciting or “new” part, anyway. Even technology vendors drift into this as they try to tout their cloud offerings: witness a recent TV commercial from IBM entitled “My Cloud: Virtual Servers on the Horizon”, a commercial which would work just as well if it were titled “the incredible power of the Internet”, or even, “aren’t computers cool?” Similarly, that cloud computing “definition” from BusinessWeek is, quite frankly, nonsensical in its broadness: it not only completely misses the point of what makes cloud computing relevant and compelling as a game-changer, it even fails to distinguish it from the last 15+ years of the Internet in general. …

The whole of Peter’s post is definitely worth reading.

The Innov8showcase site’s Architect Journal – Service Orientation Today and Tomorrow post of 9/28/2009 lists the contents of the latest issue, which is devoted to SaaS and cloud technologies:

  • Design Considerations for Software plus Services and Cloud Computing, by Jason Hogg (Rob Boucher) et al.
    Design patterns for cloud-computing applications.
  • Model-Driven SOA with “Oslo”, by César de la Torre Llorente
    A shortcut from models to executable code through the next wave of Microsoft modeling technology.
  • An Enterprise Architecture Strategy for SOA, by Hatay Tuna
    Key concepts, principles, and methods that architects can practically put to work immediately to help their organizations overcome these challenges and lead them through their SOA-implementation journey for better outcomes.
  • Enabling Business Capabilities with SOA, by Chris Madrid and Blair Shaw
    Methods and technologies to enable an SOA infrastructure to realize business capabilities, gaining increased visibility across the IT landscape.
  • Service Registry: A Key Piece for Enhancing Reuse in SOA, by Juan Pablo García-González, Veronica Gacitua-Decar, and Claus Pahl
    A strategy for publishing and providing facilities to access services information.
  • How the Cloud Stretches the SOA Scope, by Lakshmanan G and Manish Pande
    An emerging breed of distributed applications both on-premises and in the Cloud.
  • Event-Driven Architecture: SOA Through the Looking Glass, by Udi Dahan
    Looking back on the inherent publish/subscribe nature of the business and how this solves thorny issues such as high availability and fault tolerance.
  • Is SOA Being Pushed Beyond Its Limits?, by Grace Lewis
    Challenges for future service-oriented systems.
You can download the entire issue as a PDF file here.

    James Urquhart’s Cloud computing and the big rethink: Part 1 of 9/29/2009 analyzes @Beaker’s “Incomplete Thought” post of last week:

    Chris Hoff, my friend and colleague at Cisco Systems, has reached enlightenment regarding the role of the operating system and, subsequently, the need for the virtual machine in a cloud-centric world.

    His post last week reflects a realization attained by those who consider the big picture of cloud computing long enough.

    James concludes:

    So, the problem isn't that OS capabilities are not needed, just that they are ridiculously packaged, and could in fact be wrapped into software frameworks that hide any division between the application and the systems it runs on.

    The irony is that Chris Hoff’s “Incomplete Thought” is far more complete than most of mine that I intend to be complete.

    Chuck Hollis chimes in with his Cloudy Discussions post of 9/29/2009, which begins:

    I have been actively involved in discussing clouds here on my blog, as well as various customer and industry forums for a little over a year.

    I've put forward some fairly definitive concepts (e.g. private cloud) as well as had plenty of time to discuss and occasionally defend my position.  It's added up to quite a few posts.

    I went back to one of the foundational posts I did way back in January, and was surprised as to how well the thinking has held up over time.

    Today, I'd like to pick up the discussion where my esteemed Cisco colleagues Chris Hoff and James Urquhart have taken the discussion, as they give me a convenient jumping-off point for some deeper topics I've been itching to get into.

    Chuck is VP of Global Marketing and CTO for EMC Corporation.

    John Fontana recounts his interview with Microsoft’s Bob Muglia in his Top Microsoft execs outline 2010 challenges post to NetworkWorld of 9/29/2009:

    … When asked in an interview Monday with Network World what the top three threats would be in 2010 for Microsoft's server and tools division, Bob Muglia, president of the unit, pulled a semantic sleight of hand and said he preferred to refer to them as opportunities. …

    "The No. 1 opportunity we have is to look at enterprise applications and grow our share of high-end enterprise applications…" Muglia said. "We still have a disproportionally small percentage of servers and revenue associated with servers that are coming from high-end enterprise applications, which remain predominantly IBM and Oracle based."

    Muglia said the second big opportunity is to help companies transition to the cloud. [Emphasis added.]

    "We really are the company that should be able to do this for our customers because of the huge install base of Windows server applications that they have," Muglia said. "We should provide the best services at the best cost for customers to move into a cloud environment."

    Muglia rounded out his top three opportunities for 2010 saying competition with Linux would be a major focus. …

    Michael Arrington continues with his Steve Ballmer interview from where his TechCrunch post of last week left off in his Microsoft CEO Steve Ballmer On "Moving The Needle" article of 9/28/2009 for the Washington Post:

    What about new technologies like Azure, Mesh, etc? Ballmer says they're "dislocators to technology" that overlay all of these opportunities:

    [Ballmer:] “I don't list the cloud because the cloud has kind of overlaid all of those opportunities. We have opportunities by offering cloud infrastructure to enhance the margins we make in our server business, in our communications and collaboration and productivity business, and that's where things like exchange online, SharePoint online, Windows Azure, they're not really new value propositions, but they are new potential margin streams and dislocators to technology shifters and some of the existing kind of customer propositions that we invest in.” [Emphasis added.]

    Deloitte Development LLC offered Cloud Computing: A collection of working papers through Docuticker on 9/28/2009:

    Cloud computing promises to become a foundational element in global enterprise computing; in fact, many companies are exploring aspects of the cloud today. What leadership seeks is a strategic roadmap that allows them to capitalize on the operational benefits of current cloud offerings, while establishing a migration path towards a business and architectural vision for the role cloud computing will play in their future.

    Deloitte’s Center for the Edge has spent the past year combining extensive research and industry insights to explore the topics of cloud computing and next-generation Web services. The resulting publication, Cloud Computing: A collection of working papers, explores the topic from different perspectives: business drivers, architectural models, and transformation strategies…

    Download Cloud Computing (PDF; 1.76 MB)

    Michael Vizard claims "Although cloud computing, in its current form, is only a couple of years old with fairly limited adoption, it’s already becoming a commodity” in his Cloud Computing: The End Game post of 9/28/2009 to the ITBusinessEdge.com site:

    Every hosting company on the planet has already jumped in, trying to forestall any potential loss of market share to any number of emerging cloud computing infrastructure providers. However, given the downturn in the economy and the simple fact that there is a lot more server capacity than there are applications to run on it, the companies that provide cloud computing services are already engaged in a bruising price war.

    In response, some cloud computing service providers such as SkyTap and IBM have been moving upstream. They not only provide raw computation power, they also provide application testing capabilities and host commercial applications in the hopes of developing a portfolio of software-as-a-service applications.

    That’s all well and good, but cheap computing horsepower derived from cloud computing is not the primary value proposition of cloud computing. In order to drive the next evolution of enterprise computing, cloud computing providers are going to have to evolve in a way that allows services to be dynamically embedded inside customizable business processes that can change in a matter of minutes and days, rather than in weeks and months. …

    Michael continues with a list of what’s needed to shed the “commodity” stigma.

    Ron Miller claims Enterprise 2.0 Brings Knowledge Management to the Forefront in this 9/22/2009 post to IntranetJournal.com:

    Knowledge Management tools emerged in the 90s but never got very far, because for the most part, they relied on individuals to fill out forms about what they knew. Even if they were willing to do that, the forms would provide limited information or become outdated very quickly providing little actual utility. Enterprise 2.0 tools like blogs, wikis and micro-blogging, which you may be adding to your Intranet mix, provide a way to capture knowledge much more organically than its 90s counterparts without people even realizing they are participating in knowledge capture.

    Bill Ives, a consultant who has been working in this space for years, and who writes the Portals and KM blog, says today's tools make it much easier to capture knowledge without nearly as much effort as the older generation of knowledge management tools. …

    <Return to section navigation list> 

    Cloud Security and Governance

    Chris Hoff (@Beaker)’s Cloud Providers and Security “Edge” Services – Where’s The Beef? post of 9/30/2009 begins:

    Previously I wrote a post titled “Oh Great Security Spirit In the Cloud: Have You Seen My WAF, IPS, IDS, Firewall…” in which I described the challenges for enterprises moving applications and services to the Cloud while trying to ensure parity in compensating controls, some of which are either not available or suffer from the “virtual appliance” conundrum (see the Four Horsemen presentation on issues surrounding virtual appliances.)

    Yesterday I had a lively discussion with Lori MacVittie about the notion of what she described as “edge” service placement of network-based WebApp firewalls in Cloud deployments.  I was curious about the notion of where the “edge” is in Cloud, but assuming it’s at the provider’s connection to the Internet as was suggested by Lori, this brought up the arguments in the post above: how does one roll out compensating controls in Cloud?

    and expresses the need for “security services such as DLP (data loss/leakage prevention), WAF, Intrusion Detection and Prevention (IDP), XML Security, Application Delivery Controllers, VPNs, etc. … to be configurable by the consumer.”

    • Harris Corporation Demonstrates Secure Exchange of Public Health Information in 'Cloud' Computing Environment press release of 9/30/2009 is subtitled “Demonstration Part of Interoperability Showcase at Public Health Information Network Conference”:

    … Harris Corporation (NYSE: HRS), in collaboration with Cisco Systems, has demonstrated the ability to rapidly, safely, and securely exchange healthcare information in a virtual - or cloud - computing environment.

    At a recent demonstration during the Public Health Information Network Conference in Atlanta, the companies showed that security and privacy of web-based health information remains protected with a service as data is encrypted in transit and stored securely in the cloud. The demonstration was implemented over the CONNECT health information exchange platform with a Cisco Systems AXP router. …

    John Pescatore’s Back to the Future: The Next Generation Firewall post of 9/30/2009 concludes:

    … At Gartner we’ve long talked about the need for the “Next Generation Firewall” to deal with the new threats and the new business/IT demands. Greg Young  and I are in the final stages of a note on “Defining the Next Generation Firewall” which should be available to Gartner clients next week. Today Greg opines about UTM, which isn’t NGFW – we go through the differences in the research note coming out.

    There is a bit of deja vu all over again – back at [Trusted Information Systems] (TIS) in 1995, I thought by now firewalls would have proxies for every application and Moore’s law would have enabled firewalls to do deeper and broader inspection at wire speeds across all of them. As usual, what should happen always takes a back seat to what can happen, which is then further limited by what actually will happen.

    • Alysa Hutnik, an attorney with the Kelley Drye firm in Washington DC, specializes in information security and privacy, counseling clients on what to do after a security breach. In Privacy and the Law: Alysa Hutnik of Kelley Drye of 9/30/2009, Alysa discusses:

    • Do's and don'ts following a data breach;
    • Privacy legislation trends for 2010;
    • What organizations can do today to prevent privacy/security challenges tomorrow.

    Tim Greene claims The U.S. Patriot Act has an impact on cloud security in this 9/29/2009 post to NetworkWorld’s Cloud Security newsletter:

    Cloud security includes the obligation to meet regulations about where data is actually stored, something that is having unforeseen consequences for U.S. firms trying to do business in Canada.

    Recently several U.S. companies that wanted contracts to help a Canadian program to relocate 18,000 public workers were excluded from consideration because of Canadian law about where personally identifiable information about its citizens can be stored.

    The rule is that no matter the location of the database that houses the information, it cannot place the data in danger of exposure. From a Canadian perspective, any data stored in the U.S. is considered potentially exposed because of the U.S. Patriot Act, which says that if the U.S. government wants data stored in the U.S., it can pretty much get it.

    That effectively rules out cloud service providers with data centers only in the U.S. from doing business in Canada.

    John Pescatore’s Twelve Word Tuesday: The Cloud Needs Its Own MPLS post of 9/29/2009 claims:

    Without an added value security layer, public cloud fails for business applications.

    In this case, MPLS is an abbreviation for Multi-Protocol Label Switching, not Minneapolis. Cisco defines MPLS in their Routing GLOSSARY:

    MPLS is a scheme typically used to enhance an IP network. Routers on the incoming edge of the MPLS network add an 'MPLS label' to the top of each packet. This label is based on some criteria (e.g. destination IP address) and is then used to steer it through the subsequent routers. The routers on the outgoing edge strip it off before final delivery of the original packet. MPLS can be used for various benefits such as multiple types of traffic coexisting on the same network, ease of traffic management, faster restoration after a failure, and, potentially, higher performance.

    Robert Rowley, MD’s HIEs, security, and cloud EHRs post of 9/29/2009 observes:

    Health Information Exchanges (HIEs) have received increasing attention in recent months. They are part of the agenda of the Office of the National Coordinator (ONC) for Healthcare IT, as they take steps to create a Nationwide Health Information Network (NHIN). What is the purpose of such things? What data security risks are raised by such networks? How does this relate to already-connected Internet “cloud”-based EHRs? We will attempt to address these questions in this article.

    One of the problems with a health IT landscape characterized by legacy, locally-installed Electronic Health Record (EHR) systems is that medical data is segregated into practice-centered data silos, much like medical data in a paper environment – every doctor has his/her own “chart rack” (or EHR database), and a given patient may have segments of his/her medical information scattered among many different places.

    There is no one, coherent place where all the information about a patient is kept, and so copying of needed health information and sending to others is how data from outside the practice is updated. Things like lab data, hospital reports, consultation from colleagues, x-ray and imaging reports – all these things make their way into some of the physician’s charts, often in a hit-and-miss fashion.

    Randy Bias claims Cloud Standards are Misunderstood in this 9/29/2009 post:

    Create them now and stifle innovation or create them later when it’s too late? That seems to be the breadth of the discussion on cloud standards today. Fortunately, the situation with cloud computing standards is not actually this muddy. In spite of the passionate arguments, the reality is that we need cloud standards both today and tomorrow. In this posting I’ll explore the cloud standards landscape. …

    <Return to section navigation list> 

    Cloud Computing Events

    • Kevin Jackson’s INPUT FedFocus 2010 post of 9/30/2009 requests:

    Please join me at the 7th Annual FedFocus Conference, November 5, 2009, at the Ritz Carlton in McLean, VA. This conference has been designed to provide crucial information on upcoming federal government procurement plans. I will be the morning keynote, speaking on the use of cloud computing technologies to increase government efficiency and transparency.

    When: 11/5/2009  
    Where: Ritz Carlton hotel, McLean, VA, USA

    • Jeff Currier reports on 9/30/2009 about new SQL Azure-tagged sessions at PDC 2009. Here’s the complete list:

    • Eric Nelson posted his Slides for Software Architect 2009 sessions on 9/21/2009:

    Design considerations for storing data in the cloud with Windows Azure - Wed 30th Sept, 2pm
    The Microsoft Azure Services Platform includes not one but two (arguably three) ways of storing your data. In this session we will look at the implications of Windows Azure Storage and SQL Data Services on how we will store data when we build applications deployed in the Cloud. We will cover “code near” vs “code far”, relational vs. non-relational, blobs, queues and more.

    Dmitry Sotnikov’s Attending TechEd Europe? Vote for Cloud sessions post of 9/28/2009 observes:

    There are two cloud-related sessions in the “community” section of Microsoft TechEd Europe 2009 and you need to vote for them here if you are attending the conference (and obviously if you want them in the agenda).

    Basically, both are on cloud computing: one for developers and the other for IT professionals:

    Going to the Cloud: Are we crazy?

    Are cloud services about efficiency or negligence? About being able to outsource commodity services and concentrate on core competence, or losing control and risking getting out of compliance? Which IT services can be safely moved to the cloud and which should stay in house? Let’s get together and discuss the present and the future of Software + Services use in our companies, share success stories, lessons learned, discuss concerns and best practices.

    Developing on Azure: Stories from the Trenches

    Have you given Windows Azure a try? Whether it was just kicking the tires or you are deep in the enterprise application development, let’s get together and share the lessons we learned on the way.

    Both topics are near and dear to my heart, and as a matter of fact, will be moderated by me should they get into the agenda.

    So if you want these sessions in Berlin this November, please cast your vote here.

    SYS-CON Events will convene the 1st Annual Government IT Conference & Expo in Washington, DC on 10/6/2009:

    Tracks will cover Cloud Computing/Virtualization, SOA, and Security & Compliance.

    There will be breakout sessions on the security issues that are unique to the Cloud, such as the crucial distinction between Private and Public clouds. Expert speakers from government and the software industry alike will be looking at issues such as the requirements for how companies can handle government information and how information can be most successfully shared by multiple clouds. Doing more with less is the new reality for most IT departments, and the Government is no exception. So the cost-effectiveness of technologies such as Virtualization will also be foremost on the agenda.

    When: 10/6/2009  
    Where: The Hyatt Regency on Capitol Hill, Washington DC, USA

    Ray@UKAzure.net announces the third meeting of the UK Azure Users Group on 10/6/2009 from 4:00 PM to 7:00 PM (GMT) at Microsoft Cardinal Place, London:

    In our session, aimed at Developers & Technical decision makers, David Chappell looks at the Windows Azure platform and how it compares with Amazon Web Services, Google AppEngine, and Salesforce.com’s Force.com.

    Following on from David Chappell’s talk, David Gristwood & Eric Nelson from Microsoft will provide a deeper technical insight & update on Windows Azure & SQL Azure. The goal is to provide a foundation for thinking about the Windows Azure platform, then offer guidance on how to make good decisions for using it.

    When: 10/6/2009 4:00 PM to 7:00 PM (GMT)  
    Where: Microsoft Cardinal Place, 100 Victoria Street, London SW1E 5JL United Kingdom

    Health 2.0 will present the Healthcare 2.0 2009 conference for User-Generated Healthcare in San Francisco on 10/6 and 10/7/2009:

    With over a hundred speakers and plenty of new live demos and technologies on display on stage and in the exhibit hall, you’ll get a sweeping overview of the ways that information technology and the web are changing healthcare in areas from online search to health focused online communities and social networks that connect patients and clinicians.

    Aneesh Chopra, Chief Technology Officer of the U.S. Federal Government, will present the opening keynote. Other presentations include:

    • Clinical Groupware and the Next Generation of Clinician-Patient Interaction Tools
    • Adoption of Health 2.0 Platforms by Physicians on Main Street
    • Payers and Health 2.0
    • The Patient is In (presented by Kaiser Permanente)
    • Health 2.0 Tools for Administrative Efficiency
    • Can Health 2.0 Make Health Care More Affordable?
    • The Consumer Aggregators (sponsored by Cisco)
    • Data Drives Decisions (sponsored by Oracle)
    • Innovations in Health 2.0 Tools: Showcasing the Health 2.0 Accelerator
    • Health 2.0 Tools for Healthy Aging
    • Looking Ahead: Cats and Dogs (description below)

    Following the passing of the stimulus and the debate over meaningful use, there’s been lots of tension between the “cats” (the major IT vendors)  & “dogs” (the web-based “clinical groupware” vendors). The real question is how the new wave of EMRs is going to integrate with the consumer facing and population management tools. Can there be unity around the common themes of better health outcomes through physician and patient use of technology? Or will the worlds of Health 2.0 and the EMR move down separate paths? We have three very outspoken leaders to debate the question.

    When: 10/6 – 10/7/2009   
    Where: Design Center Concourse, 635 8th Street (at Brannan), San Francisco, CA, USA

    <Return to section navigation list> 

    Other Cloud Computing Platforms and Services

    • Charles Babcock reports IBM Preparing Self-Service Software Infrastructure in this 9/29/2009 post:

    IBM has been investing in cloud computing for several years, although Willy Chiu, VP of IBM Cloud Labs, acknowledges it may be difficult for those outside IBM to develop a picture of what its cloud initiative will finally look like.

    That's because so far IBM has chosen to make point announcements of limited cloud products. Its CloudBurst appliance, announced in June, is a blade server that can be loaded with IBM software and used as a cloud building block.

    At Structure 09, the June 25 cloud computing conference sponsored by GigaOm in San Francisco, Chiu said: "Cloud computing is a new way of consuming IT." That's a radical view, a step ahead of the evolutionary view that the cloud will start out as an IT supplement. That is, it will absorb specific workloads, such as business intelligence or a new consumer-facing application. In the long run, Chiu said, it will host many IT activities and services.

    In a recent interview, Chiu elaborated. IBM systems management software, Tivoli, has been given a set of services to administer the cloud. They include: Services Automation Manager, Provisioning Manager and Monitoring Manager. So far these services are designed to provision and manage workloads running in VMware virtual machines, but there is no restriction that limits Tivoli to VMware file formats. …

    Ed Moltzen’s Google's Cloud 'Not Fully Redundant,' Company Admits post of 9/25/2009 notes the following statement in Google’s most recent 10-Q filing with the U.S. Securities and Exchange Commission:

    "The availability of our products and services depends on the continuing operation of our information technology and communications systems. Our systems are vulnerable to damage or interruption from earthquakes, terrorist attacks, floods, fires, power loss, telecommunications failures, computer viruses, computer denial of service attacks, or other attempts to harm our systems.

    "Some of our data centers are located in areas with a high risk of major earthquakes. Our data centers are also subject to break-ins, sabotage, and intentional acts of vandalism, and to potential disruptions if the operators of these facilities have financial difficulties. Some of our systems are not fully redundant, and our disaster recovery planning cannot account for all eventualities," the company writes. [Emphasis added.]

    David Linthicum describes Microsoft's one chance to move to the cloud with Microsoft Office Web Apps in this 9/24/2009 post with a “Microsoft could give Google Docs a run for its money -- if it's really serious about the cloud” deck:

    … As Office Web Apps moves out as a "technical preview," last week there were reports that Google Docs is "widely used" at 1 in 5 workplaces. That's killing Office Web Apps, in my book. As I've stated a few times in this blog, I'm an avid Google Docs user, leveraging it to collaborate on documents and cloud development projects, as well as run entire companies. Although Google Docs provides only a subset of features and functions you'll find in Microsoft Office, it's good enough to be productive. But the collaborative features are the real selling point. …

    If Microsoft can provide most of its Office features in the cloud, it has an opportunity to stop Google's momentum, and even perhaps take market share. After all, one of the values of being in the cloud is the ability to change clouds quickly just by pointing your browser someplace else. If Microsoft has a better on-demand product, and the price is right, I'll switch. …

    Ray DePena’s Cloud Talk with Peter Coffee: What's Next for the Cloud? post of 9/28/2009 covers:

    … The economic advantages of the cloud computing model, comparisons of lifecycle costs (TCO) of services vs. acquisition + ongoing maintenance costs of legacy business models, costs of delay, and other detractors of legacy business models compared to the benefits of a public cloud offering like Salesforce.com, as well as the insights and impact of this coming paradigm shift.

    We spoke of public and private clouds, advantages and disadvantages of the models, current industry concerns - security, fail-over, real-time mirroring, and several examples of platform application development speed with Force.com (Starbucks example), which is approximately 5X that of other approaches. …

    What’s missing is a video or audio clip of the interview. Strange.

    Peter Coffee is Director of Research for Salesforce.com.

    Tarry Singh analyzes Xerox’s latest acquisition in his Severe market contraction is coming: After Dell, Xerox buys ACS for $6.4 Bn deal! post of 9/28/2009:

    Xerox, based in Norwalk, Conn., has suffered from declining sales of copiers and printers, and the accompanying diminishing uses of ink, toner and paper. The deal for Dallas-based ACS is expected to triple Xerox’s services revenue to an estimated $10 billion next year from 2008’s $3.5 billion.

    The deal also represents the first bold move by Xerox Chief Executive Ursula Burns, who took over on July 1. Ms. Burns, who became the first African-American woman to head a Fortune 500 company, called the deal “a game-changer” for her company.

    Xerox’s agreement comes a week after Dell Inc. agreed to buy information-technology service provider Perot Systems Corp. for $3.9 billion. The sector’s recent merger activity — which includes Hewlett-Packard Co.’s purchase last year of Electronic Data Systems — leaves Accenture PLC, Computer Sciences Corp. and Unisys Corp. as some of the larger services companies still independent.

    Rich Miller recounts on 9/28/2009 Larry Ellison Rants About Cloud Computing at Palo Alto’s Churchill Club with a five-minute video that Rich introduces as follows:

    Oracle CEO Larry Ellison has bashed cloud computing hype before. So it was unsurprising but nonetheless entertaining when, during an appearance at the Churchill Club on Sept. 21, Ellison unloaded on cloud computing in response to an audience question relayed by moderator Ed Zander. “It’s this nonsense. What are you talking about?” Ellison nearly shouted. “It’s not water vapor! All it is, is a computer attached to a network.” Ellison blamed venture capitalist “nitwits on Sand Hill Road” for hype and abuse of cloud terminology. “You just change a term, and think you’ve invented technology.” …

    Barton George’s Kibitzing about the Cloud: Ellison goes off is similar to Rich Miller’s post, but has a shorter video. Bart says:

    Well, it’s been a year since then and the abuse of the term cloud has gone from bad to worse. As a result, when Mr. Ellison appeared at the Churchill Club last week and the question of Oracle’s possible demise at the hand of the cloud came up, he became a bit animated. Enjoy!

    (I love Ed Zander’s bemusement and reactions) …

    Of note is Larry’s succinct definition of cloud computing:  “A computer attached to a network.”  And its business model? “Rental.”

    SOASTA, Inc. announced M-Dot Network Leverages the Cloud to Test Digital Transaction Platform for 1,000,000 Users in a 9/29/2009 MarketWire press release:

    SOASTA (www.soasta.com), the leader in cloud testing, and M-Dot Network (www.mdotnetwork.com) today announced the successful completion of an unprecedented 1,000,000-user performance test using SOASTA's CloudTest On-Demand service. The test was run from the SOASTA Global Test Cloud against the M-Dot transaction application, which is deployed in Amazon EC2. CloudTest's comprehensive analytics, displayed and updated as the test was running, identified points of stress in their architecture in real time.

    The M-Dot Network platform enables consumers to receive digital coupons via a retailer's web site or micro-web site on their mobile phone. Consumers can find and select coupons online or on their mobile phone. Offers are aggregated and presented directly to consumers from multiple third party digital coupon issuers and from the retailer. …

    Intuit, Inc. supplements QuickBooks and QuickBase with the Intuit Workplace App Center, a putative competitor to Google Apps for small businesses, claiming:

    Improve your productivity using web-based apps that help you solve everyday business challenges like finding new customers or managing your back office. Plus many of these apps sync with QuickBooks! Start saving time and money today—take these apps on a free trial run.

    I didn’t find one instance of the word “cloud” in the marketing propaganda.

    <Return to section navigation list> 
