Tuesday, January 19, 2010

Windows Azure and Cloud Computing Posts for 1/18/2010+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 
• Update 1/19/2010: Windows Azure Team: Windows Azure Platform TCO and ROI Calculator Now Available Online AND Offline; Lori MacVittie: A Fluid Network is the Result of Collaboration Not Virtualization; Brian P. Watson: Sybase's CIO on Cloud Computing, Mobility; Michael Robertson: Apple’s Secret Cloud Strategy And Why Lala Is Critical; Jerry Huang: Windows Azure Storage Windows Clients; David Linthicum: Provocative predictions for cloud computing: How realistic?; Robert Rowley, MD: Meaningful use – laboratory integration; Eric Nelson: Q&A: When do I get charged for compute hours on Windows Azure?; Bill Lodin: Virtual Windows Azure Lab is Live on msdev.com; Janikiram MSV: Download 6 Part [Video] Tutorial on Demystifying The Cloud; SQL Server Team: SQL Server 2008 R2 gets an official date; Randy Bias: Debunking the “No Such Thing as A Private Cloud” Myth; and others.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the single article, then use the links to navigate to the section you want.

Cloud Computing with the Windows Azure Platform published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010. 
* Content for managing DataHubs will be added as Microsoft releases more details on data synchronization services for SQL Azure and Windows Azure.

Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.

Azure Blob, Table and Queue Services

• Jerry Huang finds blob upload speed to Amazon S3 and Windows Azure to be almost identical in his Windows Azure Blob Storage vs. Amazon S3 post of 1/19/2010:

This week, Gladinet Cloud Desktop reached version 1.4.2 (build 232). Windows Azure Blob Storage has been officially added to the list of supported Cloud Storage Providers (S3, AT&T, Google, etc.).

This means you can map a network drive to the Windows Azure Blob Storage and start using it from Windows Explorer (See this link for detailed How-To information).


As always, every time we add a new storage provider integration, we compare it to the existing ones, such as Amazon S3.

The test is simple: I have a 27 MB zip file that I drag and drop into a folder under Amazon S3, watching the upload progress in the Gladinet Task Manager and timing it from beginning to end. Then I repeat the same thing with Windows Azure Blob Storage.

  • Amazon S3: 176 seconds
  • Windows Azure Blob Storage: 175 seconds
    (Upload Speed: 154 KBytes/sec)

It is very interesting that the results are so close, given that the two cloud storage providers are completely different. The servers hosting the data are completely different and the IP routes are different. There must be some kind of server-side mechanism that throttles the speed of a single connection to a certain rate. Maybe an expert on the cloud server end can explain this. …

A speed test shows my upload speed to a random server in the US is 3.97 Mbit/s (~490 KBytes/sec). …

Jerry is a founder of Gladinet.
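A quick sanity check on Jerry’s arithmetic: 27 MB moved in roughly 175 seconds works out to the ~154 KBytes/sec he reports, well under his measured ~490 KBytes/sec line capacity, which is consistent with his per-connection throttling hypothesis. A minimal Python sketch of the computation (assuming decimal megabytes):

```python
# Back-of-the-envelope throughput check for the 27 MB upload test.
FILE_SIZE_BYTES = 27 * 1000 * 1000  # decimal megabytes assumed

for provider, seconds in [("Amazon S3", 176), ("Windows Azure Blob Storage", 175)]:
    kbytes_per_sec = FILE_SIZE_BYTES / seconds / 1000
    print(f"{provider}: {kbytes_per_sec:.0f} KBytes/sec")

# Amazon S3: 153 KBytes/sec
# Windows Azure Blob Storage: 154 KBytes/sec -- matching Jerry's figure and
# leaving roughly two-thirds of the measured line capacity unused on a
# single connection.
```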

• Jerry Huang asserts “I haven't seen an Azure Storage Client on Mac yet but I guess the PHP and Ruby clients can cover those on Mac and Linux” in his Windows Azure Storage Windows Clients post of 1/19/2010:

As the Windows Azure Platform makes the transition from public preview to full production, here is a list of Windows Azure Storage clients to help you leverage Azure Storage:

  • Gladinet Cloud Desktop provides drive mapping and an online backup tool for Windows Azure, Amazon S3, AT&T Synaptic Storage, etc. You can drag and drop from Windows Explorer to Azure directly and set up Automatic Backup tasks. Azure Blob Storage only. Free/Professional. Windows. Commercial. Read more...
  • Azure Storage Explorer is a useful GUI tool for inspecting and altering the data in your Azure cloud storage projects, including the logs of your cloud-hosted applications. All three types of cloud storage can be viewed: blobs, queues, and tables. Free. Windows (.NET/WPF). Open Source.
  • Cloud Storage Studio is a Windows (WPF) based client for Windows Azure Storage. CSS allows you to manage your tables, queues & blob containers. Free for Beta, Windows (.NET/WPF), Commercial.
  • CloudBerry Explorer For Windows Azure Blob Storage is a Windows .NET UI to manage files on Azure. It provides a two-panel, side-by-side view of local/remote storage. It also provides multiple tabs to manage multiple accounts. Azure Blob Storage only. Free/Professional. Windows (.NET 2.0). Commercial.
  • CloudXplorer is a Windows .NET UI to manage files on Azure. It provides a Windows Explorer-like UI for Azure Blob Storage. Free. Beta. Windows (.NET 3.5). Commercial.

These tools overlap and also complement each other in functionality. For example, you can use Gladinet Cloud Desktop to map a drive letter, drag and drop, and edit in place; use Cloud Storage Studio to fine-tune the properties of each blob; and use other tools to cover the functionality gaps in between. …

Jersey Bob reported StoragePoint 2.2 Released on 1/17/2010 with support for Azure blobs:

Windows Azure BLOB Storage support.  The Azure adapter is no longer Beta and is available for purchase. …

We have also released an updated StoragePoint for SharePoint 2010 Beta.  If you are interested in testing the Beta please contact our Support department.

You can request a 30-day trial of StoragePoint along with any of the on-premise or cloud storage adapters at http://www.storagepoint.com/product.aspx?tab=4.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

• The SQL Server Team reported SQL Server 2008 R2 gets an official date in this 1/19/2010 post to the Data Platform Insider blog:

Today, SQL Server 2008 R2 received an official release date. It will be listed on Microsoft’s May price list, and will be available by May 2010.

SQL Server 2008 R2 showcases Microsoft’s continued commitment to business intelligence and mission-critical workloads. Since we made this release available as a Community Technology Preview (CTP) in August 2009, it has been well-received by the community with more than 150,000 downloads. …

Customers with Software Assurance can upgrade to SQL Server 2008 R2 and take advantage of the new features without incurring additional licensing costs. 

Li-Lun Luo of the SQL Azure support team was able to reproduce the problems I reported in my Windows Azure Platform Training Kit (December 2009) StartHere.cmd Utility Throws Exceptions and SSMS 2008 R2 11/2009 CTP Has Scripting Problems with SQL Azure Selected as the Target Data Base Engine Type posts, both of 1/13/2010.

See the next-to-last item in the Has Anyone Else Seen These Bugs in SQL Server 2008 R2's Script Generation for SQL Azure? thread of 1/13/2010 in the SQL Azure - Getting Started forum.

Also, Vinod Kumar Jagannathan, who I believe is a Program Manager at Microsoft India, reports in a comment to this post that the problem is an issue with the 11/2009 CTP and will be fixed in later SSMS 2008 R2 CTPs.

Hilton Giesenow explains How Do I: Integrate An Existing Application With SQL Azure? Part – 1 in this 00:17:08 video segment announced by Liam Cavanagh on 1/18/2010:

What if you could get all the benefits of distributed, on-premises application databases AND hosted cloud-based databases for a real Software + Services implementation? In this video, Hilton Giesenow, host of The MOSS Show SharePoint podcast (http://www.TheMossShow.com/) shows us how to set up a powerful and easy-to-use synchronisation model between a local Microsoft SQL Server database and a Microsoft SQL Azure database with the Microsoft Sync Framework tools for SQL Azure.

Presented by Hilton Giesenow.

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

Dare Obasanjo asks Does the world need OpenID Connect? and concludes “Although we might need OpenID Connect someday, that day isn’t today” on 1/18/2010:

About two weeks ago Chris Messina wrote a post titled OpenID Connect in which he argued for the existence of a Facebook Connect-style technology built on OpenID. He describes the technology as follows:

So, to summarize:

  • for the non-tech, uninitiated audiences: OpenID Connect is a technology that lets you use an account that you already have to sign up, sign in, and bring your profile, contacts, data, and activities with you to any compatible site on the web.
  • for techies: OpenID Connect is OpenID rewritten on top of OAuth WRAP using service discovery to advertise Portable Contacts, Activity Streams, and any other well known API endpoints, and a means to automatically bootstrap consumer registration and token issuance.

This is something I brought up over a year ago in my post Some Thoughts on OpenID vs. Facebook Connect. The fact is that OpenID by itself is simply not as useful as Facebook Connect. The former allows me to sign in to participating sites with my existing credentials, while the latter lets me sign in, share content with my social network, personalize, and find my friends on participating sites using my Facebook identity.

As I mentioned in my previous post there are many pieces of different “Open brand” technologies that can be pieced together to create something similar to Facebook Connect such as OpenID + OpenID Attribute Exchange + Portable Contacts + OAuth WRAP + Activity Streams. However no one has put together a coherent package that ties all of these together as a complete end-to-end solution. This isn’t helped by the fact that these specs are at varying levels of maturity and completion.

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

• Robert Rowley, MD’s Meaningful use – laboratory integration [into electronic health records] post of 1/19/2010 describes the LOINC coding system for clinical laboratory reports:

Ever since the ONC made its recommendations to CMS for Meaningful Use and Certification of Electronic Health Records (EHRs), we have been commenting on several aspects of the 25 criteria categories needed to be eligible for HITECH incentives starting in 2011. We have commented on the question of Certification and stand-alone billing systems, and on challenges around interoperability, particularly with respect to statewide and regional Immunization Registries.

An additional area of challenge for establishing an eventual “plug-and-play” national health information system pertains to laboratory data. Incorporating clinical lab-test results into EHRs as structured data is one of the Meaningful Use criteria (criterion 10) – physicians must be able to demonstrate that >50% of clinical lab tests that were ordered (whose results were numeric or +/-) are incorporated as structured data.

In order to achieve this, a common “language” must be implemented by all laboratories, so that EHRs can import these results in a systematic fashion. The specific vocabulary specified in the Certification documents is to use a coding system called LOINC. There is a standardized LOINC code for each different lab test type, specimen type and specific methodology – there are about 40,000 different LOINC codes currently defined for lab tests. …

Dr. Rowley continues with a description of “two issues [that] emerged when trying to implement standardization around LOINC codes.” LOINC codes should be a candidate for inclusion in the Microsoft Codename “Dallas” databases.
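As a concrete illustration of what “structured data” means here, the sketch below shows a hypothetical LOINC-coded lab result and the criterion-10 test; the field names are illustrative, not a prescribed EHR schema (LOINC 2345-7 is the code for serum/plasma glucose):

```python
# Hypothetical structured lab result keyed by a LOINC code -- a sketch of the
# kind of record an EHR must capture to satisfy Meaningful Use criterion 10.
lab_result = {
    "loinc_code": "2345-7",   # Glucose [Mass/volume] in Serum or Plasma
    "value": 95,
    "units": "mg/dL",
    "collected": "2010-01-15T08:30:00",
}

def structured_fraction(results):
    """Fraction of ordered lab results stored with a LOINC code (must exceed 0.5)."""
    coded = sum(1 for r in results if r.get("loinc_code"))
    return coded / len(results)

# One coded result out of two on file puts this toy practice right at 50%.
assert structured_fraction([lab_result, {"value": "positive"}]) == 0.5
```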

• Eric Nelson asks Q&A: When do I get charged for compute hours on Windows Azure? on 1/19/2010 and confirms you will be charged for compute hours even when the project is:

    • Deployed and running but no one is using it
    • Deployed but Stopped/Suspended

That is:

    • You will be charged even if no one is using your service.
    • It is not sufficient to suspend a service – you need to actually delete it to avoid being charged.

This is the same as Amazon Web Services’ policy for Amazon EC2 instances.
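To put the policy in dollar terms, here is a back-of-the-envelope sketch. The US$0.12 per compute hour small-instance rate is an assumption based on the launch-era price list; check the Azure pricing page for current rates:

```python
# Monthly cost of a deployed-but-idle (or suspended) Azure service, assuming
# the launch-era small-instance rate of $0.12 per compute hour.
RATE_PER_HOUR = 0.12
instances = 2                 # e.g., one web role + one worker role instance
hours_per_month = 24 * 30

print(f"Idle monthly cost: ${RATE_PER_HOUR * instances * hours_per_month:.2f}")
# Idle monthly cost: $172.80 -- and it accrues until the deployment is deleted
```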

• Bill Lodin’s Virtual Windows Azure Lab is Live on msdev.com as of 1/19/2010:

In this virtual lab, you will create a fully functional Windows Azure application from scratch. In the process, you will become familiar with several important components of the Windows Azure architecture including Web Roles, Worker Roles, and Windows Azure storage. You will also learn how to integrate a Windows Azure application with Windows Live ID authentication, and you’ll learn how to store relational data in the cloud using SQL Azure.

Update: If you are returning to continue the Windows Azure virtual lab and you started it before January 6, 2010, please note that the lab has been updated to reflect recent API changes. The exercises as shown here reflect the latest SDK (November 2009).

• tbTechnet announced Updated Documents to Help You Get Your Feet Wet on Windows Azure on 1/18/2010:

To reflect the commercial availability of Windows Azure and the Windows Azure Offers, we went ahead and updated a couple of simple step-by-step documents to help developers get their feet wet with Windows Azure.

John Moore reported RHIO Failure: CalRHIO Goes Belly-up in this 1/18/2010 post:

The much-ballyhooed Health Information Exchange (HIE) in the state of California, CalRHIO, has raised the white flag, dismissing its troops and sending home its arms supplier (Medicity). Despite its founding five years ago, the support of some significant organizations (e.g., UnitedHealth Group, Cisco, HP, the California Hospital Assoc.), spending some $7M to date, and launching a major roadshow in March last year that included the go-live of 23 institutions in Orange County in October, CalRHIO did not get the support of California’s Health and Human Services (CHHS) agency to be the state-designated entity for overseeing ARRA funding for state HIEs.

Based on an article in California Healthline, a number of other organizations had some serious concerns with CalRHIO, enough concerns to start their own organization, the California eHealth Collaborative (CAeHC).  What is surprising here is that one of those that called into question CalRHIO’s operating model was its former CEO Lori Hack, who is now a board member of the competing CAeHC. …

John concludes:

[P]robably the clearest message here is that the governance issue of HIEs is extremely political, especially when a boat-load of federal Stimulus dollars is at stake. The CalRHIO fiasco is unlikely to be the last one we’ll hear of over the next 3-9 months.

I consider “extremely political” an understatement, but otherwise I agree with his conclusion.

Peter Swartout’s Exploring the World of Online Personal Health Records tutorial of 1/18/2010 is the first of a planned series of articles about Online Personal Health Records:

You might have noticed how hard it is to obtain your own health care history.  Most medical records are written on paper. This is changing, but slowly.  Even forward-looking doctors, hospitals, and pharmacies who are converting to electronic records have a hard time integrating with each other, since each IT system is largely independent of other systems.  What you, the patient, would like to see is all of your history together in one place, regardless of who the provider was or where the care was given.

One of the tools that show promise in moving us towards that goal is the PHR, or Personal Health Record.  While the concept of PHR has been around a long time, a relatively new idea, the online PHR, gained a big boost in popularity when Google and Microsoft each announced plans to provide PHR applications, aka Patient Portals, in 2007.

In a series of articles, we will explore online Personal Health Records and see how they might benefit both you and your care providers.  We’ll take a look at how the two major business players architected their PHR applications, and finally dive deeper into how each of these solutions can be integrated into an existing health care IT enterprise. …

Here are the topics of this and the future articles:

    1. This introduction to PHR
    2. Google Health - what it offers, how it works, approaches to integration
    3. Google Health in an SOA as orchestrated with WebSphere Process Server
    4. Microsoft HealthVault - what it offers, how it works, approaches to integration
    5. Microsoft HealthVault in an SOA as orchestrated with Microsoft BizTalk

Dom Green explains how to host a Memcached distributed cache in worker roles on either the Development Fabric or the Cloud Fabric in his Windows Azure Memcached-ed post of 1/17/2010:

Memcached is a distributed cache used to help speed up large-scale web applications by taking pressure off the database. Memcached is used by many of the internet’s biggest sites, including Twitter, Wikipedia, and YouTube, to name just a few.

A distributed cache is one of the things that I’ve been hoping to see released for Windows Azure for quite a while, and I am hoping that AppFabric Caching will make the move to the cloud in the coming year. However, until that happens I was determined to find a way to get a distributed cache, and this great Windows Azure Memcached sample showed me how. …

After installing and setting up Memcached you will be able to cache any data, including data retrieved from your database, so that the next time you need it you can get it from the cache rather than re-querying the database – reducing the pressure on the database and earning the love of your DBA.

You can download the sample code for Windows Azure from the CodePlex download page; you then need to download a Windows-friendly version of Memcached (here is where I got mine). With the sample code from CodePlex, just add the memcached exe to your worker roles. You will now be able to run the sample code either on the dev fabric or in the cloud. …
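The payoff Dom describes is the classic cache-aside pattern. Here’s a minimal sketch in Python using the pymemcache client; the Azure sample itself is .NET, and the localhost endpoint and db.lookup helper below are stand-ins:

```python
from pymemcache.client.base import Client

# In the Azure sample the endpoint would be a memcached worker-role instance;
# memcached's default local port stands in for it here.
cache = Client(("localhost", 11211))

def get_product(product_id, db):
    """Cache-aside read: try memcached first, fall back to the database."""
    key = f"product:{product_id}"
    cached = cache.get(key)
    if cached is not None:
        return cached.decode("utf-8")   # cache hit: no database round trip

    value = db.lookup(product_id)       # db.lookup is a hypothetical helper
    cache.set(key, value, expire=300)   # keep it warm for five minutes
    return value
```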

Maarten Balliauw’s Creating an external facing Azure Worker Role endpoint post of 1/17/2010 begins:

When Windows Azure was first released, only Web Roles were able to have an externally facing endpoint. Since PDC 2009, Worker Roles can now also have an externally facing endpoint, allowing for a custom application server to be hosted in a Worker Role. Another option would be to run your own WCF service and have it hosted in a Worker Role. Features like load balancing and multiple Worker instances are all available. Let’s see how you can create a simple TCP service that can display the current date and time.

Here’s what I want to see when I connect to my Azure Worker Role using telnet (“telnet efwr.cloudapp.net 1234”):

[Screenshot: telnet session to the Azure Worker Role]

Let’s go ahead and build this thing. Example code can be downloaded here: EchoCloud.zip (9.92 kb)

Maarten continues with a detailed tutorial on the topic. He concludes:

Just a quick note: the approach described here can also be used to run a custom WCF host that has other bindings than for example basicHttpBinding.
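Maarten’s sample is .NET, but the core idea – a worker process listening on a raw TCP endpoint behind the Azure load balancer – is easy to see in a minimal Python sketch of the date/time service (port 1234 as in his telnet example):

```python
import socketserver
from datetime import datetime

class DateTimeHandler(socketserver.StreamRequestHandler):
    """Writes the current date and time to each client that connects."""
    def handle(self):
        self.wfile.write(datetime.now().isoformat().encode("ascii") + b"\r\n")

if __name__ == "__main__":
    # Hard-coded port for the sketch; a real worker role would read its
    # endpoint from the service configuration rather than a constant.
    with socketserver.TCPServer(("0.0.0.0", 1234), DateTimeHandler) as server:
        server.serve_forever()
```

Connecting with "telnet localhost 1234" prints the timestamp and disconnects, just as in Maarten’s screenshot.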

Lucas Mearian reports “Incentives in the stimulus bill are also expected to foster a rapid spread of e-health records” in his Kaiser, VA join to give e-health a boost ComputerWorld article of 1/18/2010:

The e-health revolution got a shot in the arm earlier this month when health care network giant Kaiser Permanente and the U.S. Department of Veterans Affairs declared a months-long pilot program of sharing patient electronic health records (EHR) a success.

The pilot program links health records of San Diego area residents stored in the VA's Veterans Affairs Health Information Systems and Technology Architecture (VistA) and Kaiser Permanente's HealthConnect electronic health systems.

Patients must agree to be part of the shared system, which uses the Nationwide Health Information Network, a set of government protocols and standards designed to allow the secure exchange of health information between physicians at both private practices and hospitals.

During a press briefing in San Diego, officials from the VA and Kaiser Permanente also outlined plans to add the EHRs of Department of Defense personnel to the eHealth system and to expand the program geographically in the coming months.

<Return to section navigation list> 

Windows Azure Infrastructure

The Windows Azure Team’s Windows Azure Platform TCO and ROI Calculator Now Available Online AND Offline announced on 1/19/2010 the availability of an offline version of the Azure TCO and ROI calculator:

Want to know how much your organization could save with the Windows Azure Platform? Check out the Windows Azure Platform TCO and ROI Calculator: http://www.microsoft.com/windowsazure/tco/, now available in both online and offline versions! Designed to help measure the potential savings of product development on, or migration to, Windows Azure platform services for organizations of all sizes, the calculator will ask you for detailed information about your organization's needs, its size and other factors. Once you've entered all your data, the Calculator will provide you with a customized report detailing the estimated line-item costs for a TCO analysis and a 1- to 3-year ROI analysis for your organization.

Once you've registered, you'll have the option of downloading the offline version right to your desktop. In addition to the "Reporting" and "Collaborate" features of the online version, the offline version has a "synchronize" option to keep the latest version of the tool, analysis models, templates, etc. in sync with the online version.

While this Calculator is informational only, we hope that it provides you with a better sense of the savings you could experience by building on the Windows Azure Platform. Let us know what you think by posting a comment. We look forward to hearing from you.

• Lori MacVittie asserts “The benefits of automation and orchestration do not come solely from virtualization” as she continues her series on Infrastructure 2.0 with A Fluid Network is the Result of Collaboration Not Virtualization of 1/19/2010:

Virtualization has benefits, there is no arguing that. But let’s not get carried away and attribute all the benefits associated with cloud computing and automation to one member of the “game changing” team: virtualization. I recently read one of the all-too-common end-of-year prediction blogs on virtualization and 2010 that managed to say, with what I think was a straight face, that virtualization of the network is what makes it “fluid”.

From: 2010 Virtualization Predictions - The Year the Network Becomes Fluid and Virtual

Virtualizing the network provides similar benefits to server virtualization through abstraction and automation … The bottom line: In 2010, the network is going to become as fluid and dynamic as the data center is today.

The first problem with these statements is the separation of the network from the data center. The last time I checked, the network was the core foundation upon which data centers are built, making it not only a part of the data center but an integral part of it. The second is implying that the automation from which a fluid network is derived is somehow achieved through virtualization. No. No, it isn’t. Both virtual and physical infrastructure require more than just being shoved into a virtual machine or automatically provisioned to enable the kind of benefits expected. …

• Randy Bias argues that “This myth is misguided because it assumes that all cloud computing is a financial model rather than a technology or service model” in his Debunking the “No Such Thing as A Private Cloud” Myth post of 1/19/2010:

Once upon a time, a network engineer scrawled an amorphous shape upon a whiteboard and wrote “Internet” thereon.  The amorphous circle, a ‘cloud’, soon became the de facto way that we represent “not my problem”, or outsourcing.  Hence, the “cloud” in cloud computing means that cloud is predominantly an outsourcing business model.  Only large scale ‘utilities’ can provide the cost savings benefits associated with cloud computing. — The Private Cloud Myth


This myth is misguided because it assumes that all cloud computing is a financial model rather than a technology or service model. Information Technology is rapidly changing from the older client/server and mainframe computing models to the cloud computing model. This computing model has been pioneered by Amazon and Google, both of whom offer non-utility ‘cloud’ services. It is a model that embraces automation and on-demand self-service. Providing a public utility service requires cloud computing, but cloud computing does not have to be delivered with a predetermined financial model.

An aside: a ‘model’ is a way of doing things.  Technology models are ways of putting technology together.  Financial models are ways to arrange finances.  Service models are ways of providing a service that is consumed by someone else. …

Randy continues with a detailed analysis that supports his contention.

• David Linthicum claims “Cloud computing will be evolutionary not revolutionary, and that should make people more excited about it” in his Provocative predictions for cloud computing: How realistic? post of 1/19/2010 to the InfoWorld Cloud Computing blog:

I was taken aback a bit by this recent article talking about some big predictions from Gartner around the adoption of cloud computing:

“Cloud computing will become so pervasive that by 2012, one out of five businesses will own no IT assets at all, the analyst firm Gartner is predicting.

“The shift toward cloud services hosted outside the enterprise's firewall will necessitate a major shift in the IT hardware markets, and shrink IT staff, Gartner said.”

This is very interesting to me, considering that many new and small businesses are finding a great deal of value in moving to cloud computing. However, I'm not sure I agree with Gartner over the amount of movement that will occur by 2012. Sorry to once again be the buzzkill, but a sure way to bury a space is to overhype and underdeliver.

Don't get me wrong: Cloud computing will have an impact. I suspect that most midsize and small businesses will use e-mail and document management systems that are outside their firewalls. We've seen a lot of movement in this direction in 2009, and with the rapid expansion of Google Enterprise services and the emerging online version of Microsoft Office, this trend will only accelerate. …

• Janikiram MSV suggests that you Download 6 Part [Video] Tutorial on Demystifying The Cloud, which covers Google App Engine, Windows Azure, and Amazon Web Services:

Last month I launched my show called The Cloud9 Show. The first series of the show is called Demystifying The Cloud, where I covered the key concepts of cloud computing along with some of the commercial implementations of the Cloud. The unique thing about The Cloud9 Show is that you can download the video, audio, article and slides for each episode. Within the first few days of launching the show, it crossed 1000 views! I want to thank all the viewers.

Now that I have covered the fundamentals of Cloud Computing, I am working on the content for the next series, which will cover hands-on tutorials. Going forward, you can watch a complete demo and download a step-by-step hands-on-lab guide along with links to all the required components.

Jani works for Alcatel-Lucent as Deputy General Manager for Bell Labs, India.

Bill Zack reports on 1/18/2010 about his presentation on 1/14/2010 to the New York Technology Council in his 3 Screens & the Cloud in New York post:

I spoke about Microsoft’s 3 Screens and the Cloud vision at a local Technology Council panel in New York Thursday night. I was on the panel with Alfred Spector, Google’s Vice President of Research, and a councilwoman from New York City.


3 Screens & the Cloud is Microsoft’s vision that embraces the convergence of content and protocols across all devices: the PC, the mobile phone and television, including not only the ubiquitous television set but also game consoles and other related devices.

They videoed the session and a link to it will be posted here as soon as they make it available. 

Bill links to the following two articles. I’ll update this post when the video becomes available.

Andrew J. Brust reviews the New York Technology Council’s panel discussion about technology trends in his GOOG vs. MSFT vs. NYC Redmond Diary entry of 1/15/2010:

Tonight was the inaugural audience event of the newly formed New York Technology Council, and I must say the organization is off to an excellent start. The event was a panel discussion focusing on technology trends for 2010, and included Alfred Spector, who heads Google's research and special initiatives (and is based in New York City, not Silicon Valley), Bill Zack, an Architect Evangelist for Microsoft focusing on Azure, and New York City Councilmember Gale Brewer, who is the Chair of the Council's Committee on Technology in Government. The panel was moderated by BusinessWeek's Arik Hesseldahl. This was a strong panel, with an excellent moderator and an impressive turnout; a good omen for the future success of NYTECH.

I won't relate the blow-by-blow of the discussion (if you're interested in that, you can read the tweets from the event), but I think a summary of the discourse merits some discussion.

Questions and statements concerning health IT, government open data, mobile devices and broadband were raised. With each question, patterns emerged among the panelists' answers. Google's Spector said his company believes all data will, and should, be interconnected. Google also feels that mobile devices, fetching data from the cloud, will continue to grow in popularity and disrupt. Microsoft offers, not surprisingly, a differing, though not opposing, view. Redmond's take is that on-premise and cloud-based architectures are very different, that each offers distinct advantages, and that in many cases a combination of the two is the most sensible choice. Contrast this with Spector's comment that "it's only incidental whether data is stored on-premise or in the cloud" as long as it's not in a "walled garden." …

Jeffrey Schwartz reports about the discussion in his Microsoft, Google Debate Cloud Future article of 1/15/2010 for Redmond Partner Channel Online:

At the inaugural meeting of the New York Technology Council Thursday night, Google Vice President of Research Alfred Spector and Microsoft architect evangelist Bill Zack debated their views on how data will be stored and shared in the future.

The two were part of a panel discussion moderated by BusinessWeek technology reporter Arik Hesseldahl. Held at the New York headquarters of PricewaterhouseCoopers in front of an audience of 200 influential venture capitalists, IT executives and vendors, the debate underscored the rivals' competing but overlapping strategies for how datacenter architectures and personal information access will evolve using cloud services.

Zack explained Microsoft's mantra of "three screens and the cloud," which is focused on making data universally accessible on PCs, mobile devices and consumer systems, including televisions and gaming consoles. "We see in terms of content and in terms of protocols the convergence of those," Zack said.

"There is some information you can put in the cloud and there is some information that you'd be crazy to put in the cloud," Zack added. "We believe in the online stack and we believe in our cloud stack. And we believe in hybrid applications so you don't have to put information out in the cloud or all information on-premise. You can build an application that leverages the best of both."

Spector, meanwhile, championed Google's vision of having all data residing in the cloud. "I think it's clear to all of us now that information sharing is an essential part of running our society, we cannot have a walled enterprise," Spector said.

"Google and Microsoft each clearly espouse views that correlate to their own agendas," wrote attendee Andrew Brust, chief of new technology at twentysix New York, in a blog post. "Google wants everything to be published and interconnected, so that it can all be indexed, searched and AdWord-ized," Brust noted. "Microsoft, on the other hand, wishes both to promote its new cloud platform (Azure) and protect its legacy PC and server software franchise. Software + Services." …

Lori MacVittie’s Infrastructure 2.0: Squishy Name for a Squishy Concept essay of 1/18/2010 draws on James Urquhart’s Understanding Infrastructure 2.0 and Greg Ness’s The beginning of the end of static infrastructure posts to define her topic:


There’s been increasing interest in Infrastructure 2.0 of late that’s encouraging to those of us who’ve been, well, pushing it uphill against the focus on cloud computing and virtualization for quite some time now. What’s been the most frustrating about bringing this concept to awareness has been that cloud computing is one of the most tangible examples of both what infrastructure 2.0 is and what it can do and virtualization is certainly one of the larger technological drivers of infrastructure 2.0 capable solutions today. So despite the frustration associated with cloud computing and virtualization stealing the stage, as it were, the spotlight is certainly helping to bring the issues which Infrastructure 2.0 is attempting to address into the fore. As it gains traction, one of the first challenges that must be addressed is to define what it is we mean when we say “Infrastructure 2.0.”

Like Web 2.0 – go ahead and try to define it simply – Infrastructure 2.0 remains, as James Urquhart put it recently, a “squishy term.” …

What complicates Infrastructure 2.0 is that not only is the term “squishy” but so is the very concept. After all, Infrastructure 2.0 is mostly about collaboration, about integration, about intelligence. These are not off-the-shelf “solutions” but rather enabling technologies that are designed to drive the flexibility and agility of enterprise networks forward in such a way as to alleviate the pain points associated with the brittle, fragile network architectures of the past.

Greg Ness summed up the concept, at least, very well more than a year ago in “The beginning of the end of static infrastructure” when he said, “The issue comes down to static infrastructure incapable of keeping up with all of the new IP addresses and devices and initiatives and movement/change already taking place in large enterprises” and then noted that “the notion of application, endpoint and network intelligence thus far has been hamstrung by the lack of dynamic connectivity, or connectivity intelligence.”

What Greg noticed is missing is context, and perhaps even more importantly the ability to share that context across the entire infrastructure.  I could, and have, gone on and on and on about this subject so for now I’ll just stop and offer up a few links to some of the insightful posts that shed more light on Infrastructure 2.0 – its drivers, its requirements, its breadth of applicability, and its goals. …

Phil Wainewright interviews Jeff Kaplan of THINKstrategies in a 10-minute podcast, as reported in Phil’s Enterprise Clouds Going Hybrid post of 1/18/2010:

Listen to my conversation with Jeff Kaplan, managing director of strategic consulting firm THINKstrategies, and one of the foremost analysts tracking software-as-a-service, on-demand and cloud computing.

In this podcast, learn why it’s not enough to simply port an existing software package to the cloud without rearchitecting it, and hear about some of the ways enterprises will deal with hybrid environments that mix on-premise and cloud assets.

Listen to or download the 10:05 minute podcast here.

Or read the written transcript of the interview in Phil’s post.

Tony Iams stratifies clouds into high, middle, low and vertical categories in his 1/15/2010 The Four Clouds of the Datacenter analysis of the HP/Microsoft agreement of last week:

In the real world, meteorologists classify clouds into four categories according to their base height: "high" clouds, "middle" clouds, "low" clouds, and "vertical" clouds, which can form at many heights. After this week's announcement that HP and Microsoft would jointly invest $250 million in developing and selling integrated cloud computing technology, it seems that a similar selection will emerge in datacenters, as vendors seek to offer clouds targeting different strata of customers. HP and Microsoft announced that they would jointly develop systems that are highly optimized for hosting Microsoft Exchange Server and Microsoft SQL Server. These systems will provide pre-integrated server, storage, networking and application packages for deploying and managing Microsoft's database and e-mail services with "push-button" simplicity.

While the prospect of tapping into third-party computing infrastructures remains a goal for many organizations, the most pressing concern for most is to virtualize as much as possible of their internal infrastructure into "secure" or "private" clouds. Indeed, for many users "cloud" currently implies nothing more than converging virtualized server, storage, and network resources into a single pool that workloads can draw upon as needed. However, these users are finding that the complexity of deploying virtual infrastructure can be overwhelming, especially in mid-sized organizations, which have the need for datacenter capabilities, but lack the depth of personnel to manage complex new datacenter functions such as virtual infrastructure.

In response, there has been a movement among the major systems vendors to provide integrated stacks that combine multiple layers of IT infrastructure, including servers, storage, networking, and software, into a single package that can be managed as a unit. Cisco is taking advantage of the need for such solutions to break into the server market, collaborating with VMware and EMC as part of the Virtual Computing Environment (VCE) alliance to deliver integrated solutions that are optimized for deploying cloud infrastructure. Oracle has integrated its database software with advanced storage functions based on Solid State Disks (SSD) in its Exadata appliance, and if its acquisition of Sun succeeds, it will be able to fold much of Sun's server, storage and software technology into future systems. IBM has strengthened the integration between its various server platforms by unifying the management experience for administrators across all of them. …

Tony continues with a table that describes the Integrated stacks for deploying datacenter clouds from HP/Microsoft (low), Oracle (middle), IBM (high), and VMWare/EMC (vertical). Tony is an analyst with Ideas International.

<Return to section navigation list> 

Cloud Security and Governance

The GovInfoSecurity site announced on 1/18/2010 the availability of the Global Security Threats & Trends: Cisco 2009 Annual Security Report white paper by Cisco Systems (site registration required for download):

Cisco Security Intelligence Operations announces the Cisco 2009 Annual Security Report. The updated report includes information about 2009 global threats and trends, as well as security recommendations for 2010.

Managing and securing today's distributed and agile network is increasingly challenging, with cloud computing and sharing of data threatening security norms. Online criminals are continuing to exploit users' trust in consumer applications and devices, increasing the risk to organizations and employees.

Report Highlights

  • Online criminals have taken advantage of the large social media following, exploiting users' willingness to respond to messages that are supposedly from people they know and trust.
  • Politically-motivated threats are increasing, while governments are teaming up and promoting online security.
  • Up to 90 percent of spam is untargeted. That includes spam delivered by botnets that floods inboxes with messages from supposed banks, educational institutions, and service providers.
  • More than 80 percent of the web can be classified as "uncategorized" or "unknown", making it challenging for traditional URL filtering technology. The new Cisco Cybercrime Return on Investment Matrix tracks the performance of the underground online criminal marketplace, helping organizations understand the latest targets.

<Return to section navigation list> 

Cloud Computing Events

James Watters (@wattersjames) invited me to the San Francisco Cloud Club meetup (#sfcloudclub) they’re holding tonight. I have a conflict, but signed up despite misgivings about joining a club that would have @Beaker or me as a member.

The 1/18/2010 meetup appears to be overbooked, but I’ll try to make the next one.

INPUT and SIIA will present SaaS/Gov 2010 on 2/11/2010 at The Westin, Washington, DC:

SaaS/Gov is the most comprehensive conference bringing together federal IT purchasers and software industry executives to address the government's movement towards Software as a Service (SaaS) and Cloud Computing.

Featured presenters include:

  • Susie Adams, Chief Technology Officer, Microsoft Corporation (Federal Sales)
  • Philip Berman, SOA Products Group, Sales Engineer, Intel
  • Michael Binko, President and CEO, Kaulkin Information Systems
  • Michael Bowen, Director of IT, Brevard County (FL)
  • Daniel Burton, Senior Vice President, Global Public Policy, Salesforce.com
  • Charlie Catlett, Chief Information Officer, Argonne National Laboratory
  • Scott Chasin, Chief Technology Officer, McAfee Software as a Service
  • Alfred Elliott III, VP – National Intelligence and Homeland Security, JB&A
  • James Gatz, Program Manager, Office of Community Services, United States Department of Health and Human Services

and others.

Sanjay Jain announced Azure BizSpark Camp @ New York: Chance to Win $5000 on 12/5/2009. Here’s a reminder:

With several successful Microsoft BizSpark Incubation Weeks (Azure Atlanta, Win7 Boston, Win7 Reston, CRM Reston, CRM Boston, Win7 Irvine, Mobility Mountain View), we are pleased to announce Microsoft BizSpark Camp for Windows Azure in New York, NY during 28–29 January 2010. Based upon your feedback we have made several changes, including offering a cash prize, a compressed time commitment, and much more. We hope you're tapping into the growing BizSpark community. …

Microsoft BizSparkCamp for Windows Azure is designed to offer the following assistance to entrepreneurs.

  • Chance to win a cash prize of $5000
  • Nomination for BizSpark One (an invitation-only program) for high-potential startups
  • Learn and build new applications in the cloud, or use interoperable services that run on Microsoft infrastructure to extend and enhance your existing applications, with help from on-site advisors
  • Get entrepreneurial coaching from a panel of industry experts
  • Generate marketing buzz for your brand
  • Create an opportunity to be highlighted at an upcoming launch

We are inviting nominations from BizSpark Startups interested in Windows Azure Platform that target one or more of the following:

The Microsoft BizSparkCamp for Windows Azure will be held at Microsoft Technology Center, New York, NY from Thu 1/28/2010 to Fri 1/29/2010. This event consists of ½ day of training, 1 day of active prototype/development time, and ½ day for packaging/finishing and reporting out to a panel of judges for various prizes.

This event is a no-fee event (plan your own travel expenses) and each team can bring 3 participants (1 business person and 1–2 developers). It is required to have at least 1 developer as part of your team. To nominate your team, please submit the following details to Sanjay Jain (preferably via your BizSpark Sponsor) no later than Mon 18 January 2010. Nominations will be judged according to the strength of the founding team, originality and creativity of the idea, and ability to leverage Windows Azure scenarios.

The New York Technology Council’s Cloud Computing SIG will present An Introduction to Microsoft Windows Azure on 2/4/2010 at 6:30 PM EST:

Join the NYTC Cloud Computing SIG for an introduction to the recently launched Microsoft Windows Azure cloud computing environment. The Windows Azure platform offers a flexible, familiar environment for developers to create cloud applications and services.

Presenting for the evening will be Chris Rolon. Chris is an Architectural Consultant for Neudesic, a solutions company focused on the delivery of products and services based on Microsoft .NET technologies. Chris brings an extensive technology background with more than 25 years of industry experience in custom application development and implementation.

Chris now heads the Windows Azure User Groups in New York and Malvern, PA, where he has been speaking on Azure since January of 2009. Chris has just completed a national tour where he spoke to other Microsoft partners on the merits of Windows Azure.

The meeting will be held at Google, 76 Ninth Ave, 4th Floor, New York, NY 10011.
The presentation is free for NYTC members and sponsors, $15 for non-members.

See the Windows Azure Infrastructure section for details about the Council’s inaugural meeting.

John Willis adds his support for CloudCampHaiti in this 1/18/2010 post:

I would love to use this opportunity to inform you of something that CloudCamp.org has set up. CloudCampHaiti is a virtual unconference we are running this Wednesday afternoon (http://www.cloudcamp.org/haiti). Our primary goal is to raise money for the Red Cross; one hundred percent of the proceeds will go to the Haiti earthquake victims. However, we also have a theme: “How The Cloud Can Help”. We want to see how cloud computing and expert resources can be used to help in disasters like this. Our registration process is simple: $25 to attend the virtual conference, $50 to be listed as a special donor, and $250 to have your company logo displayed.

CloudCampHaiti is going to be a great event. We are going to have some of the biggest names in cloud computing present, and we will also have a panel session and open discussion based on the “How The Cloud Can Help” theme. This is a great opportunity to learn, participate and help.

Eric Nelson writes Maybe see you at London CloudCamp on the 21st January 2010 on 1/18/2010:

I will be popping along to CloudCamp this Thursday. I’ve been to CloudCamp once before when it coincided with Qcon London when I was a speaker in 2009. But TBH I really didn’t pay much attention last time around as it was a) the end of a long day and b) I was only dabbling with cloud at the time.

On January 1st I switched full time to the Windows Azure Platform which means this time I will be paying attention :-) …

P.S. Unfortunately the event is now full (http://cloudcamplondon6.eventbrite.com/).

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

• Brian P. Watson’s Sybase's CIO on Cloud Computing, Mobility post of 1/19/2010 to ZDNet’s CIO Insight blog begins with the following introduction to a Q&A interview:

Jim Swartz, a longtime veteran of the CIO ranks, explains his view of the cloud’s pros and cons, as well as what he sees in store for IT leaders in 2010.

Sybase has made a major transition to become a mobility software provider. But beyond boosting the company's core offerings, CIO Jim Swartz is looking heavily at cloud computing, SaaS and maximizing the potential of the Millennial generation.

Swartz, formerly an IT leader at several companies, including SRI International and SAIC, spoke recently with CIO Insight Editor in Chief Brian P. Watson about his priorities for 2010. …

• Michael Robertson claims “… Lala lets Apple move faster in transitioning from their PC software business to a cloud service” in his Apple’s Secret Cloud Strategy And Why Lala Is Critical guest post of 1/19/2010 to TechCrunch:

… Apple’s recent acquisition of digital music startup Lala rekindled speculation of an iTunes subscription service. There’s no shortage of subscription offerings (Napster, Rhapsody, Spotify, Pandora, etc.), but none have attracted the millions of subscribers necessary to make the high royalty structures work. Experts have speculated that Apple’s design expertise and hardware integration could make subscriptions work, and that leveraging Lala’s digital library, licenses from the major labels, and a management team that cycled through several business models, including the ten-cent web song rental, could make it a reality. It’s a logical assumption, but after talking to a wide variety of insider sources it’s clear there is no upcoming Apple subscription service and Apple has far different plans. …

Lala will play a critical role in Apple’s music future, but not for the reasons cited above. Lala’s licenses with major labels are non-transferable, so they’re not usable for any new iTunes service. The 10-cent song rental model never gained traction and does not cover mobile devices, and thus is of little value to Apple. What is of value is the personal music storage service, which was an often overlooked component of Lala’s business. As Apple did with the original iPods, Lala realized that any music solution must include music already possessed by the user. The Lala setup process provides software to store a personal music library online and then play it from any web browser alongside web songs they vend. This technology, plus the engineering and management team, is the true value of Lala to Apple. …

Michael is the founder and former CEO of digital music pioneer MP3.com. He is currently the CEO of music locker company MP3tunes. Robertson is also an adviser to Google Voice.

Elizabeth White describes a “New Innovation Pipeline and Platform for Integration to Fuel Market Expansion” in her IBM Accelerates Cloud Computing With LotusLive post of 1/18/2010:

IBM on Monday announced the technology and business expansion of its LotusLive cloud collaboration platform through a new R&D pipeline from IBM Research and plans to open the LotusLive suite to new partners.

LotusLive cloud services provide integrated email, Web conferencing, social networking and collaboration, with IBM's focus on security and reliability.

IBM Research and Lotus are joining forces to deliver new innovation on the Web through the creation of LotusLive Labs -- a proving ground for advances in business-driven collaboration in the cloud. …

At Lotusphere 2010 this week, a range of LotusLive Labs technology previews is being unveiled, including:

  • Slide Library, a collaborative way to build and share presentations;
  • Collaborative Recorded Meetings, a service that records and instantly transcribes meeting presentations and audio/video for searching and tagging;
  • Event Maps, an interactive way to visualize and interact with conference schedules; and,
  • Composer, the ability to create LotusLive mashups through the combination of LotusLive services.

Expected in the second quarter of 2010 via LotusLive Labs, Project Concord is a new Web-based document editor for creating and sharing documents, presentations and spreadsheets.

James Governor of Monkchips gives his opinion of IBM’s announcement in his Lotus Puts the Labs to Work: On Innovation post of 1/18/2010:

I am here at Lotusphere 2010 in Orlando, sitting in the press room. John Fontana from Network World just walked in and asked me what I thought of the event so far. My reply:

“Well I am not saying Lotus has all its ducks in a row, but at least it has plenty of them now.”

The Lotus portfolio is now looking both increasingly broad and compelling at scale. …

See the Windows Azure Infrastructure section for details of the New York Technology Council’s inaugural meeting, which featured Bill Zack’s 3 Screens & the Cloud presentation, as well as the views of Alfred Spector, Google’s Vice President of Research.

<Return to section navigation list> 
