Sunday, January 31, 2010

Windows Azure and Cloud Computing Posts for 1/29/2010+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.
 
• Updated 1/31/2010: Sales Dot Two Inc.: Sales 2.0 Conference: Sales Productivity in the Clouds; Sam Johnston: Oracle Cloud Computing Forum[?]; John Mokkosian-Kane: Event-Driven Architecture in the Clouds with Windows Azure; Chris Hoff: Where Are the Network Virtual Appliances? Hobbled By the Virtual Network, That’s Where…; Stephen Chapman: Microsoft Confirms “Windows Phone 7” to be Discussed at Energize IT 2010; Microsoft Partner Network: Windows Azure Platform Partner Hub; David Burela: Using Windows Azure to scale your Silverlight Application, Using Silverlight to distribute workload to your clients
• Updated 1/30/2010: Jonny Bentwood: Downfall of AR and the Gartner Magic Quadrant; Joe McKendrick: Design-Time Governance Lost in the Cloud? The Great Debate Rages; Jon Box: Upcoming TechNet and MSDN Events; Gunnar Peipman: Creating configuration reader for web and cloud environments; Dave O’Hara: Washington State proposes legislation to restart data center construction, 15 month sales tax exemption; Greg Hughes: Brent Ozar Puts SQL In The Cloud; Chris Hoff: MashSSL – An Excellent Idea You’ve Probably Never Heard Of…; Misael Ntirushwa: Migrating an Asp.Net WebSite with Sql Server 2008 Database to Windows Azure and Sql Azure; U.S. Department of Defense: Instruction No. 5205.13 Defense Industrial Base (DIB) Cyber Security/Information Assurance (CS/IA) Activities; Tim Weber: Davos 2010: Cyber threats escalate with state attacks; Chris Hoff: Hacking Exposed: Virtualization & Cloud Computing…Feedback Please; Nicholas Kolakowski: Microsoft Promoted Azure, Office 2010 During Earnings Call, But Dodged Mobile; Eugenio Pace: Just Released – Claims-Identity Guide online; Maarten Balliauw: Just another WordPress weblog, but more cloudy; and Emmanuel Huna: Red-Gate’s SQL Compare – now with SQL Azure support.
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:
To use the above links, first click the post’s title to display the post as a single article; the section links will then navigate within it.
Cloud Computing with the Windows Azure Platform published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).
Read the detailed TOC here (PDF) and download the sample code here.
Discuss the book on its WROX P2P Forum.
See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.
Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.
You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:
  • Chapter 12: “Managing SQL Azure Accounts and Databases”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”
HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010. 
Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.

Azure Blob, Table and Queue Services

Alex James’ Getting Started with the Data Services Update for .NET 3.5 SP1 – Part 2, an eight-step walkthrough of 1/28/2010, explains how to write a WPF client to consume the OData service you created in his Getting Started with the Data Services Update for .NET 3.5 SP1 – Part 1 walkthrough of 12/17/2009.
In case you missed it in the preceding Windows Azure and Cloud Computing Posts for 1/27/2010+, the Astoria team announced Data Services Update for .NET 3.5 SP1 – Now Available for Download on 1/27/2010.
<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

• Greg Hughes announced the Brent Ozar Puts SQL In The Cloud podcast on 1/29/2010:
In this RunAs Radio podcast, Richard and I talk to Brent Ozar from Quest Software about running SQL in the cloud.
Part of the conversation focuses on SQL Azure, but Amazon’s EC2 running SQL on a virtual machine is also a version of the same concept. The larger topic is really around DBAs providing services to their organizations — because that’s what the cloud is offering!
Brent Ozar is a SQL Server Expert with Quest Software, and a Microsoft SQL Server MVP. Brent has a decade of broad IT experience, including management of multi-terabyte data warehouses, storage area networks and virtualization. In his current role, Brent specializes in performance tuning, disaster recovery and automating SQL Server management. Previously, Brent spent two years at Southern Wine & Spirits, a Miami-based wine and spirits distributor. He has experience conducting training sessions, has written several technical articles, and blogs prolifically at BrentOzar.com. He is a regular speaker at PASS events, editor-in-chief of SQLServerPedia.com, and co-author of the book, “Professional SQL Server 2008 Internals and Troubleshooting.”
• Emmanuel Huna claims “Support for SQL Azure in SQL Compare was my idea!” in his Red-Gate’s SQL Compare – now with SQL Azure support post of 1/29/2010:
For a few years I’ve been using SQL Compare from Red Gate – an amazing product that allows you to compare and synchronize SQL Server database schemas – and much more.
For more info see http://www.red-gate.com/products/SQL_Compare/index.htm
Red Gate has announced they now have an early access build of SQL Compare 8 that works with SQL Azure!  If you're interested in trying this out please complete the form at http://www.red-gate.com/Azure
Please note that I do not work for Microsoft or Red Gate Software – I’m just a very happy customer. …
• Misael Ntirushwa’s Migrating an Asp.Net WebSite with Sql Server 2008 Database to Windows Azure and Sql Azure post of 1/28/2010 begins:
As we get to the end of January, Windows Azure will be ready for business (the end of the trial period). If you're wondering how you can migrate your existing ASP.NET website to Windows Azure and SQL Azure, this post is for you. I built a website that shows a music catalog containing artists, their albums, and the track listings of those albums. The data store is a SQL Server 2008 database.
For the data access layer, I used LINQ to SQL to access the database. In addition to the classes auto-generated by the LINQ to SQL Designer, I have a LINQ helper class, “LinqQueries”, that contains the queries; the business logic layer uses it to query the database.
The business logic is composed of DTO (Data Transfer Object) classes (Album, Artist, Track), converter classes (from entities to DTO classes), and manager classes that actually query the database through the LinqQueries helper class.
The website displays music albums by genre (Rock, R&B, Pop, …). You can search by artist name or browse by genre. Once you select an artist, you can see all his or her albums. If you select an album, you can see the track list and the release date. It’s a simple ASP.NET WebForms website with a SQL Server database. The question is: how do we migrate it into the cloud, hosted in Windows Azure and SQL Azure?
I will assume here that you’ve activated your Windows Azure account, watched a couple of videos on Windows Azure, or, even better, tried the Windows Azure Platform kit, played with the Windows Azure SDK, and are ready to start creating Hosted Services and Storage accounts in Windows Azure. …
Misael continues with detailed instructions for the migration.
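The layering Misael describes (entities, DTOs, converters, and managers) is a common pattern on any stack. Here is a rough sketch of its shape in Python as a stand-in for the C#/LINQ to SQL original; all names and data are illustrative, not taken from his project:

```python
from dataclasses import dataclass

# Hypothetical entity shaped the way LINQ to SQL might generate it
# (database-facing, includes foreign keys).
@dataclass
class AlbumEntity:
    id: int
    title: str
    release_year: int
    artist_id: int

# DTO: the trimmed, serializable shape the presentation layer consumes.
@dataclass
class AlbumDto:
    title: str
    release_year: int

def to_album_dto(entity: AlbumEntity) -> AlbumDto:
    """Converter: maps a database entity to its transfer object."""
    return AlbumDto(title=entity.title, release_year=entity.release_year)

class AlbumManager:
    """Manager: queries through a helper (a plain list here, standing in
    for the LinqQueries helper class) and hands DTOs to the UI."""
    def __init__(self, query_helper):
        self.query_helper = query_helper

    def albums_by_artist(self, artist_id: int):
        return [to_album_dto(e) for e in self.query_helper
                if e.artist_id == artist_id]

catalog = [AlbumEntity(1, "First Album", 2008, 7),
           AlbumEntity(2, "Other Album", 2009, 8)]
print(AlbumManager(catalog).albums_by_artist(7))
```

The point of the converter layer is that the database schema can change without rippling into the pages, which matters during a migration like this one.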
The DataPlatformInsider team reported Free Download: Microsoft SQL Server Migration Assistant on 11/7/2009, but I missed the significance: the Microsoft SQL Server Migration Assistant 2008 for MySQL v1.0 CTP1 now migrates from MySQL directly to SQL Azure in the cloud.
<Return to section navigation list>

AppFabric: Access Control, Service Bus and Workflow

Eugenio Pace announced Just Released – Claims-Identity Guide online on 1/29/2010:
The entire book [A Guide to Claims-Based Identity and Access Control: Authentication and Authorization for Services and the Web] is now available for browsing online on MSDN here.
Now, to be honest, it doesn’t look as nice as the printed book (small preview here):
But everything is in there! (And it doesn’t look that bad at all either; it’s just that I really like the printed version :-) ).
Eugenio continues with a contents diagram and table of technologies covered. Here’s the Windows Azure item:
Chapters | Technologies | Topics
Windows Azure | ASP.NET WebForms on Azure | Hosting a claims-aware application on Windows Azure.
Jim Nakashima’s Installing Certificates in Windows Azure VMs tutorial of 1/29/2010 begins:
A little while ago I posted How To: Add an HTTPS Endpoint to a Windows Azure Cloud Service which talked about the whole process around adding an HTTPS endpoint and configuring & uploading the SSL certificate for that endpoint.
This post is a follow up to that post to talk about installing any certificate to a Windows Azure VM.  You may want to do this to install various client certificates or even to install the intermediate certificates to complete the certificate chain for your SSL certificate.
In order to peek into the cloud, I’ve written a very simple app that will enumerate the certificates of the Current User\Personal (or My) store. …

As usual, Jim’s post is a detailed, illustrated guide to working with Windows Azure.
See the Microsoft All-In-One Code Framework 2010-1-25: brief intro of new samples item in the Live Windows Azure Apps, Tools and Test Harnesses section for details about recent Workflow samples.
The Windows Azure Platform AppFabric Team reports Additional Data Centers for Windows Azure platform AppFabric on 1/28/2010:
Windows Azure platform AppFabric has now been deployed to more data centers around the world.  Previously, when you provisioned a service namespace, you were asked to select a region from a list that contained only United States (South/Central).  Now, when you provision a service namespace, you have three more regions from which to choose -- United States (North/Central), Europe (North) and Asia (Southeast).  If your firewall configuration restricts outbound traffic, you will need to perform the additional step of opening your outbound TCP port range 9350-9353 to the IP range associated with your selected regional data center.  Those IP ranges are listed at the bottom of this announcement.
Note that your existing service namespaces have already been deployed to United States (South/Central) and cannot be relocated to another region.  If you like, you may delete a service namespace, and when recreating it, associate it with another region.  However, if you do so, data associated with your deleted namespace will be lost.
Windows Azure platform AppFabric plans to deploy to more locations in the months ahead.  Further details will be posted as they become available.
IP Ranges
  • United States (North/Central): 65.52.0.0/21, 65.52.8.0/21, 65.52.16.0/21, 65.52.24.0/21, 207.46.203.64/27, 207.46.203.96/27, 207.46.205.0/24
  • Europe (North): 94.245.88.0/21, 94.245.104.0/21, 65.52.64.0/21, 65.52.72.0/21, 94.245.114.0/27, 94.245.114.32/27, 94.245.122.0/24
  • Asia (Southeast): 111.221.80.0/21, 111.221.88.0/21, 207.46.59.64
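If you script your firewall rules, the published CIDR blocks can be checked programmatically. A minimal sketch using Python's standard ipaddress module; the addresses tested are arbitrary examples, not endpoints from the announcement:

```python
import ipaddress

# CIDR blocks published for the United States (North/Central) region.
US_NORTH_CENTRAL = [
    "65.52.0.0/21", "65.52.8.0/21", "65.52.16.0/21", "65.52.24.0/21",
    "207.46.203.64/27", "207.46.203.96/27", "207.46.205.0/24",
]

def in_region(ip: str, cidrs) -> bool:
    """Return True if the address falls inside any of the region's CIDR blocks."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(c) for c in cidrs)

print(in_region("65.52.3.17", US_NORTH_CENTRAL))   # inside 65.52.0.0/21 -> True
print(in_region("94.245.88.1", US_NORTH_CENTRAL))  # a Europe (North) address -> False
```

A rule generator would pair each block with the TCP port range 9350-9353 mentioned above when emitting outbound firewall entries.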
<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

•• David Burela’s Using Windows Azure to scale your Silverlight Application post of 1/31/2010 describes the benefits of Windows Azure storage and the Content Delivery Network:
Having the client download your application can become the first problem you encounter. If the user is sitting there, watching a loading screen for 5 minutes, they will most likely just move onto another site. Your application can be split up into the .net code, and the supporting files (.jpgs, etc.)
For this we can use ‘Windows Azure Storage’ + the ‘Content Delivery Network’ (CDN) to help us push the bits to the end user. An official overview and setup instructions can be found on the Azure CDN website. But to sum it up, you can put your .xap files and your supporting files into Azure storage, then enable CDN on your account. Azure CDN will then push the files out to servers around the world (currently 18 locations). When a user requests the files, they are downloaded from the closest server, reducing download times.
The loading time of your application can be further reduced by cutting your application up into modules. This way, instead of downloading a 10 MB .xap file, the end user can quickly download the 100 KB core application, while the rest of the modules are loaded using MEF, Prism or something similar. These other modules can also be put onto the Azure CDN (they are just additional .xaps, after all!). Brendan Forster has a quick overview of what MEF is, and there are plenty of tutorials out there on how to integrate MEF into Silverlight. …
David’s Using Silverlight to distribute workload to your clients tutorial of the same date begins:
In my previous article I explained how you can distribute server-side processing on Azure to scale the backend of your Silverlight applications. But what if we did a 180 and used Silverlight to distribute work out to clients instead?
•• John Mokkosian-Kane describes his Event-Driven Architecture in the Clouds with Windows Azure post to the CodeProject of 1/30/2010:
To demonstrate Azure and EDA beyond textbook definitions, this article will show you how to build a flight delay management system for all US airports in the clouds. The format of this article is intended to be similar to a hands-on lab, walking you through each step required to develop and deploy an application to the clouds. This flight delay management system will be built with a mix of technologies including SQL Azure, Visual Studio 2010, the Azure Queue Service, Azure Worker/Web Roles, Silverlight 3, the Twitter SDK, and Microsoft Bing mapping. And while the implementation in this article is based on a flight delay system, the concepts can be leveraged to build IT systems in the clouds across functional domains.
• Gunnar Peipman posted his Creating configuration reader for web and cloud environments tutorial on 1/30/2010:
Currently it is not possible to make changes to web.config file that is hosted on Windows Azure. If you want to change web.config you have to deploy your application again. If you want to be able to modify configuration you must use web role settings. In this blog post I will show you how to write configuration wrapper class that detects runtime environment and reads settings based on this knowledge.
The following screenshot shows you web role configuration that is opened for modifications in Windows Azure site.
Changing web role settings
My solution is simple – I will create one interface and two configuration-reading classes. Both classes – WebConfiguration and AzureConfiguration – implement the IConfiguration interface.
Gunnar continues with C# and VB code for the classes.
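Gunnar's pattern (one interface, two readers, selected at runtime) translates to any stack. A hedged sketch in Python rather than his actual C#/VB; the environment-detection check and setting names here are illustrative assumptions, not his code:

```python
import os
from abc import ABC, abstractmethod

class IConfiguration(ABC):
    """Common interface both configuration readers implement."""
    @abstractmethod
    def get_setting(self, name: str) -> str: ...

class WebConfiguration(IConfiguration):
    """Reads from the local config file (stands in for web.config appSettings)."""
    def __init__(self, settings: dict):
        self.settings = settings
    def get_setting(self, name):
        return self.settings[name]

class AzureConfiguration(IConfiguration):
    """Reads from web role settings (stood in for here by environment variables)."""
    def get_setting(self, name):
        return os.environ[name]

def current_configuration() -> IConfiguration:
    # Hypothetical runtime check: the real class would ask the Azure
    # RoleEnvironment whether the code is running in the fabric.
    if os.environ.get("RUNNING_IN_AZURE") == "1":
        return AzureConfiguration()
    return WebConfiguration({"ConnectionString": "local-db"})

print(current_configuration().get_setting("ConnectionString"))
```

Callers depend only on the interface, so the same page code runs unchanged locally and in the cloud, which is the point of Gunnar's wrapper.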
• Maarten Balliauw posted the slides for his Just another WordPress weblog, but more cloudy presentation at PHP Benelux on 1/30/2010:
While working together with Microsoft on the Windows Azure SDK for PHP, we found that we needed a popular example application hosted on Microsoft’s Windows Azure. WordPress was an obvious choice, but not an obvious task. Learn more about Windows Azure, the PHP SDK that we developed, SQL Azure, and the problems we faced porting an existing PHP application to Windows Azure.
Colbertz reports in a Microsoft All-In-One Code Framework 2010-1-25: brief intro of new samples post of 1/29/2010:
Microsoft All-In-One Code Framework 25th January, 2010 updates. If this is the first time you've heard about the All-In-One Code Framework (AIO) project, please refer to the introduction on our homepage: http://cfx.codeplex.com/
In this release, we added more new samples on Azure:
CSAzureWCFWorkerRole, VBAzureWCFWorkerRole
This sample provides a handy working project that hosts WCF in a Worker Role. This solution contains three projects:
  1. Client project. It's the client application that consumes WCF service.
  2. CloudService project. It's a common Cloud Service that has one Worker Role.
  3. CSWorkerRoleHostingWCF project. It's the key project in the solution, which demonstrates how to host WCF in a Worker Role.
Two endpoints are exposed from the WCF service in CSWorkerRoleHostingWCF project:
  1. A metadata endpoint
  2. A service endpoint for MyService service contract
Both endpoints use TCP bindings.
CSAZWorkflowService35, VBAZWorkflowService35
This sample demonstrates how to run a WCF Workflow Service on Windows Azure. It uses Visual Studio 2008 and WF 3.5.
While currently Windows Azure platform AppFabric does not contain a Workflow Service component, you can run WCF Workflow Services directly in a Windows Azure Web Role. By default, a Web Role runs under full trust, so it supports the workflow environment.
The workflow in this sample contains a single ReceiveActivity. It compares the value of the service operation's parameter with 20, and returns "You've entered a small value." or "You've entered a large value." accordingly. The client application invokes the Workflow Service twice, passing a value less than 20 and a value greater than 20.
CSAZWorkflow4ServiceBus, VBAZWorkflow4ServiceBus
This sample demonstrates how to expose an on-premises WCF Workflow Service to the Internet and cloud using Windows Azure platform Service Bus. It uses Visual Studio 2010 Beta 2 and WF 4.
While the current version of Windows Azure platform AppFabric is compiled against .NET 3.5, you can use its assemblies in a .NET 4 project.
The workflow in this sample uses the standard ReceiveRequest/SendResponse architecture introduced in WF 4. It compares the value of the service operation's parameter with 20, and returns "You've entered a small value." or "You've entered a large value." accordingly. The client application invokes the Workflow Service twice, passing a value less than 20 and a value greater than 20.
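The branch both workflow samples implement is simple to state outside of WF. A sketch of the service operation's logic (Python, with the threshold and messages taken from the sample descriptions above):

```python
def classify(value: int) -> str:
    """Mirror of the sample workflow: compare the parameter with 20."""
    if value < 20:
        return "You've entered a small value."
    return "You've entered a large value."

# The sample client calls twice, once below and once above the threshold.
print(classify(5))
print(classify(50))
```

In the WF samples this comparison lives inside the workflow's activity tree, which is what lets it participate in persistence and tracking; the logic itself is just this branch.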
Emmanuel Huna writes in his How to speed up your Windows Azure Development – two screencasts available of 1/28/2010:
Today I sat down with my co-worker Arif A. and we went over a couple of ways I found to speed up Windows Azure Development.
Basically we discuss how you can host your web roles in IIS and how you can create a keyboard shortcut to quickly attach the Visual Studio debugger to an IIS process – giving you your RAD (Rapid Application Development) process back!  I created two separate screencasts which you can access below:
Each article contains the screencast, additional notes and a sample Visual Studio 2008 project you can download.
John Foley reports “Following a month of no-cost tire kicking, Microsoft will begin charging customers on Feb. 1 for its new Windows Server-based cloud computing service” in his Microsoft To Launch Pennies-Per-Hour Azure Cloud Service Monday article of 1/28/2010 for InformationWeek:
Developers and IT departments can choose from two basic pricing models: a pay-as-you-go "consumption" option based on resource usage, and a "commitment" option that provides discounts for a six-month obligation. At standard rates, a virtualized Windows Server ranges from 12 cents to 96 cents per hour, depending on CPU and related resources. Storage starts at 15 cents per GB per month, plus one cent for every 10,000 transactions. Microsoft's SQL Server costs $9.99 per month for a 1 GB Web database.
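At those published rates, a rough monthly bill is easy to estimate. A sketch (Python; the usage figures are hypothetical, and a real bill includes bandwidth and other line items not modeled here):

```python
def monthly_estimate(compute_hours, storage_gb, transactions,
                     hourly_rate=0.12, storage_rate=0.15,
                     txn_rate_per_10k=0.01):
    """Estimate a month's charge from the standard pay-as-you-go rates
    quoted in the article (smallest compute instance at $0.12/hour)."""
    compute = compute_hours * hourly_rate
    storage = storage_gb * storage_rate
    txns = (transactions / 10_000) * txn_rate_per_10k
    return round(compute + storage + txns, 2)

# One small instance running all month, 50 GB stored, 2M storage transactions.
print(monthly_estimate(730, 50, 2_000_000))  # 730*0.12 + 50*0.15 + 200*0.01 = 97.1
```

This is the kind of arithmetic the TCO and ROI calculators mentioned below automate, with the caveat Microsoft itself attaches: no warranties on the results.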
Azure represents a new, and unproven, business model for both Microsoft and its customers. Developers and other IT professionals need to assess Azure's reliability, security, and cost compared to running Windows servers in their own data centers. Microsoft is providing TCO and ROI calculators to help with that cost comparison, but the company makes "no warranties" on the results those tools deliver.
Will Microsoft's cloud be cheaper than on-premises Windows servers? Every scenario is different, but many customers do stand to save money by moving certain IT workloads from their own hardware and facilities to Azure, says Tim O'Brien, Microsoft's senior director of platform strategy. Early adopters such as Kelley Blue Book and Domino's Pizza are saving "millions," O'Brien says. He admits, however, that Microsoft's cloud services may actually cost more than on-premises IT in some cases.
<Return to section navigation list> 

Windows Azure Infrastructure

• Dave O’Hara reported Washington State proposes legislation to restart data center construction, 15 month sales tax exemption in this 1/29/2010 post to the Green Data Center blog:
In Olympia, Washington, two bills have been introduced with bipartisan support to allow a 15-month sales tax exemption on the purchase and installation of computers for new data centers. …
“Today is a good day. The bills that we support -- SB 6789 and HB 3147 -- were introduced in Olympia with wide bipartisan support. The 13 sponsors of the bills are from all over the state, from Seattle and Spokane to Walla Walla and Wenatchee. And the state Department of Revenue requested the bills.
“The bills allow a 15-month sales-tax exemption on the purchase and installation of computers and energy for new data centers in rural counties. As the bills state, they provide a short-term economic stimulus that will sustain long-term jobs. In other words, the exemption will be temporary, but the jobs and tax revenue from the centers will boost rural counties for years and years to come.” …
We’ll see if data center construction comes to the State of Washington soon.
In the short term, maybe Microsoft will bring back some of its servers from Texas.
Dave quotes Mary Jo Foley’s Tax concerns to push Microsoft Azure cloud hosting out of Washington state article of 8/5/2009 which quotes my Mystery Move of Azure Services Out of USA – Northwest (Quincy, WA) Data Center of the same date.
• Jonny Bentwood describes his Downfall of A[nalyst] R[elations] and the Gartner Magic Quadrant YouTube video of 1/29/2010:
Created by Jonny Bentwood. A parody of Hitler's failure to get his cloud app into the leader section of the Gartner magic quadrant. Disclosure: Created in jest; this does not reflect my view ...
The subtitles are hilarious. Don’t miss it.
• Nicholas Kolakowski’s Microsoft Promoted Azure, Office 2010 During Earnings Call, But Dodged Mobile post of 1/30/2010 to eWeek’s Windows 7, Vista, XP OS and MS Office blog carries this lede:
Microsoft signaled during a Jan. 28 earnings call that a variety of initiatives, including Office 2010, Azure and Project Natal, would help power its revenues throughout 2010. However, Microsoft executives seemed less forthcoming about the possible debut of Windows Mobile 7, the smartphone operating system that could make or break the company's plans in the mobile arena, deferring instead to an announcement during the Mobile World Congress in Barcelona in February. Microsoft also stated that netbooks and a possible Windows 7 Service Pack were non-factors in affecting the company's Windows 7 revenue.
Nick continues with a detailed analysis of the earnings call.
Microsoft is betting its success in 2010 on a variety of initiatives, including the Azure cloud platform, the Project Natal gaming application, and Office 2010. However, executives during the company's Jan. 28 earnings call remained elusive about Windows Mobile 7, the long-rumored smartphone operating system that could potentially mean success or failure for Microsoft in that space.
Peter Klein, Microsoft chief financial officer, promoted both Azure and Natal during the earnings call, referring to the former as the cloud platform that would provide developers with a "smooth transition to the cloud with tools and processes," and the latter as something that "will energize this generation’s gaming and entertainment experience starting this holiday season." …
Microsoft is betting that the rollout of new versions of certain software programs throughout 2010, including Office 2010, will help spark a healthier uptake among enterprises and SMBs (small- to medium-sized businesses). Success of Azure and Natal also has the potential to contribute substantially to the company’s bottom line. But any guesses as to the role of Mobile in Microsoft’s 2010 will likely have to continue to wait until Barcelona.
Michael Ziock’s Response to North America Connectivity Issues of 1/29/2010 states:
Microsoft Online Services strives to provide exceptional service for all of our customers. On January 28, customers served from a North America data center may have experienced intermittent access to services included in the Business Productivity Online Standard Suite. We apologize for any inconvenience this may have caused you and your employees.
Be sure to read the comments, especially the one about the “problems with our RSS feed, which is that we did not communicate the correct information for this incident, which confused customers and cause problems for our support agents. This is high on the list of things we are addressing with regard to customer communications.”
Michael is Sr. Director, Business Productivity Online Service Operations, The Microsoft Online Services Team.
The Windows Azure Dashboard’s Status History for Windows Azure Compute shows no outages for the period 1/23 to 1/29/2010:
The same is true for all other services except AppFabric Access Control, which had 40 minutes of connectivity issues in the South Central US data center on 1/23/2010.
Erik Sherman reported Microsoft Cloud Services Had 5 Hour Outage [UPDATE 2] on 1/29/2010, but it didn’t affect my Windows Azure test harness running in the US South Central data center in San Antonio:
… I received an email from a reader claiming that Microsoft Online Service, the current name for its cloud offerings, had a five hour outage. According to the tip, that included the hosted Exchange email service. This is clearly not the sort of thing corporations want to hear when considering who to trust going forward in cloud computing.
It’s also clearly not the sort of thing that Microsoft executives would enjoy seeing widely communicated. I’ve tried checking online to see if there was any news of this, but there was nothing. That makes me a bit suspicious, as it would seem to be the sort of news that would have leaked out somewhere. However, that also doesn’t necessarily rule it out. I’ve just emailed Microsoft’s PR agency about this. However, given that it’s been two days so far and they still haven’t been able to dredge up a single comment on the registration server outage that lasted five days, I am not expecting any meaningful information in the near future about this. …
The outage appears to have been specific to Office Live; Pingdom and mon.itor.us haven’t reported Windows Azure outages. Still waiting for word from Microsoft.
Eric Nelson’s Q&A: What are the UK prices for the Windows Azure Platform? post of 1/29/2010 begins:
Lots of folks keep asking me for UK prices, and to be fair it does take a little work to find them (you need to start here and bring up this pop-up).
Hence for simplicity, I have copied them here (as of Jan 29th 2010).
Note that there are several rates available. The following is “Windows Azure Platform Consumption”
and continues with a copy of the current rates.
<Return to section navigation list> 

Cloud Security and Governance

Chris Hoff asked Where Are the Network Virtual Appliances? Hobbled By the Virtual Network, That’s Where… and (of course) answered his question in this 1/31/2010 post:
Allan Leinwand from GigaOm wrote a great article asking “Where are the network virtual appliances?” This was followed up by another excellent post by Rich Miller.
Allan sets up the discussion describing how we’ve typically plumbed disparate physical appliances into our network infrastructure to provide discrete network and security capabilities such as load balancers, VPNs, SSL termination, firewalls, etc. …
Ultimately I think it prudent for discussion’s sake to separate routing, switching and load balancing (connectivity) from functions such as DLP, firewalls, and IDS/IPS (security) as lumping them together actually abstracts the problem which is that the latter is completely dependent upon the capabilities and functionality of the former.  This is what Allan almost gets to when describing his lament with the virtual appliance ecosystem today. …
I’ve written about this many, many times. In fact almost three years ago I created a presentation called  “The Four Horsemen of the Virtualization Security Apocalypse” which described in excruciating detail how network virtual appliances were a big ball of fail and would be for some time. I further suggested that much of the “best-of-breed” products would ultimately become “good enough” features in virtualization vendor’s hypervisor platforms.
@Beaker goes on to describe “some very real problems with virtualization (and Cloud) as it relates to connectivity and security” and concludes:
The connectivity layer — however integrated into the virtualized and cloud environments they seem — continues to limit how and what the security layers can do and will for some time, thus limiting the uptake of virtual network and security appliances.
Situation normal.
• Chris Hoff (@Beaker) announced on 1/30/2010 that he’s a co-author of a forthcoming book in his Hacking Exposed: Virtualization & Cloud Computing…Feedback Please post:
Craig Balding, Rich Mogull and I are working on a book due out later this year.
It’s the latest in the McGraw-Hill “Hacking Exposed” series.  We’re focusing on virtualization and cloud computing security.
We have a very interesting set of topics to discuss but we’d like to crowd/cloud-source ideas from all of you.
The table of contents reads like this:
Part I: Virtualization & Cloud Computing:  An Overview
Case Study: Expand the Attack Surface: Enterprise Virtualization & Cloud Adoption
Chapter 1: Virtualization Defined
Chapter 2: Cloud Computing Defined
Part II: Smash the Virtualized Stack
Case Study: Own the Virtualized Enterprise
Chapter 3: Subvert the CPU & Chipsets
Chapter 4: Harass the Host, Hypervisor, Virtual Networking & Storage
Chapter 5: Victimize the Virtual Machine
Chapter 6: Conquer the Control Plane & APIs
Part III: Compromising the Cloud
Case Study: Own the Cloud for Fun and Profit
Chapter 7: Undermine the Infrastructure
Chapter 8: Manipulate the Metastructure
Chapter 9: Assault the Infostructure
Part IV: Appendices
We’ll have a book-specific site up shortly, but if you’d like to see certain things covered (technology, operational, organizational, etc.) please let us know in the comments below.
Also, we’d like to solicit a few critical folks to provide feedback on the first couple of chapters. Email me/comment if interested.
Amazon’s page to pre-order the book is here.
• Joe McKendrick chimes in with his Design-Time Governance Lost in the Cloud? The Great Debate Rages post of 1/30/2010 to the EbizQ SOA in Action blog:
Just when the dust seems to have settled from the whole "SOA is Dead" kerfuffle, our own Dave Linthicum throws more cold water on the SOA/cloud party -- with a proclamation that "Design-time Governance is Dead."  (Well, not dead yet, but getting there...)
We've been preaching both sides of governance as the vital core of any SOA effort -- and with good reason. Ultimately, as SOA proliferates as a methodology for leveraging business technology, and by extension, services are delivered through the cloud platform, people and organizations will play the roles of both creators and consumers of services. The line between the two is blurring more every day, and both design-time and runtime governance discipline, policies, and tools will be required.
In a post that raised plenty of eyebrows (not to mention eyelids), Dave Linthicum says the rise of the cloud paradigm may eventually kill off the design-time aspect of governance, the lynchpin of SOA-based efforts. As Dave explains it, with cloud computing, it's essential to have runtime service governance in place, in order to enforce service policies during execution. This is becoming a huge need as more organizations tap into cloud formations. "Today SOA is a huge reality as companies ramp up to leverage cloud computing or have an SOA that uses cloud-based services," he says. "Thus, the focus on runtime service execution provides much more value." …
• Chris Hoff (@Beaker) recommends MashSSL – An Excellent Idea You’ve Probably Never Heard Of… in this 1/30/2010 post:
I’ve been meaning to write about MashSSL for a while as it occurs to me that this is a particularly elegant solution to some very real challenges we have today.  Trusting the browser, operator of said browser or a web service when using multi-party web applications is a fatal flaw.
We’re struggling with how to deal with authentication in distributed web and cloud applications. MashSSL seems as though it’s a candidate for the toolbox of solutions:
“MashSSL allows web applications to mutually authenticate and establish a secure channel without having to trust the user or the browser. MashSSL is a Layer 7 security protocol running within HTTP in a RESTful fashion. It uses an innovation called “friend in the middle” to turn the proven SSL protocol into a multi-party protocol that inherits SSL’s security, efficiency and mature trust infrastructure.”
Make sure you check out the sections on “Why and How,” especially the “MashSSL Overview” section which explains how it works.
I should mention the code is also open source.
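MashSSL repurposes standard mutual (two-way) SSL, in which both parties present certificates, rather than the usual one-way handshake where only the server does. For orientation only, here is how plain mutual TLS is requested with Python's `ssl` module; this is the underlying primitive MashSSL builds on, not MashSSL itself (certificate loading is omitted).

```python
import ssl

def make_server_context():
    # Server side of mutual TLS: demand a client certificate, which is
    # optional in ordinary one-way SSL. (load_cert_chain call omitted.)
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.verify_mode = ssl.CERT_REQUIRED
    return ctx

def make_client_context():
    # Client side: PROTOCOL_TLS_CLIENT verifies the server's certificate
    # and hostname by default.
    return ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
```

MashSSL's "friend in the middle" then relays this two-party handshake through the untrusted browser, which is what makes it a multi-party protocol.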
• Tim Weber quotes Microsoft’s Craig Mundie in his Davos 2010: Cyber threats escalate with state attacks report for BBC News of 1/30/2010:
Cyber-attacks are rising sharply, mainly driven by state-sponsored hackers, experts at the World Economic Forum in Davos have warned. The situation is made worse by the open nature of the web, making it difficult to track down the attackers.
Craig Mundie, Microsoft's chief research officer, called for a three-tier system of authentication - for people, devices and applications - to tackle the problem.
The biggest cyber-risks, however, are insiders, experts said.
The security of cyberspace and data in general has been a theme at a string of sessions in Davos, with leading security experts warning that internet threats were growing at a geometric rate. …
Microsoft, Mr Mundie said, had been under attack for many years and had probably seen it all.
"We have weathered the storm of nearly every class of attack as it has evolved, and lost some IP [intellectual property] on the way. We had lots of distributed denial of service attacks, but we are coping with that, but to do so you have to have active security these days." …
Few clear strategies were proposed to counter the threat, but one that the experts agreed on was "authentication", where systems verify that you are who you say you are.
"The internet is a wonderful place because it's so easy to get on… but that's because it's unauthenticated. A lot could be prevented just by having a two-step authentication," said one expert.
Microsoft's Mr Mundie pointed to a much more dangerous threat to most companies.
"What we have also experienced is the act of insider threat. As a company we start with the assumption, there are agents of governments and rivals inside the company, so you have to think how you can secure yourself. But most companies don't start with the assumption that there's an insider threat."
The U.S. Department of Defense Instruction No. 5205.13 Defense Industrial Base (DIB) Cyber Security/Information Assurance (CS/IA) Activities of 1/29/2010 appears to be a reaction to state attacks:
PURPOSE. This Instruction establishes policy, assigns responsibilities, and delegates authority in accordance with the authority in DoD Directive (DoDD) 5144.1 (Reference (a)) for directing the conduct of DIB CS/IA activities to protect unclassified DoD information, as defined in the Glossary, that transits or resides on unclassified DIB information systems and networks.
POLICY. It is DoD policy to:
a. Establish a comprehensive approach for protecting unclassified DoD information
transiting or residing on unclassified DIB information systems and networks by incorporating the use of intelligence, operations, policies, standards, information sharing, expert advice and assistance, incident response, reporting procedures, and cyber intrusion damage assessment solutions to address a cyber advanced persistent threat.
b. Increase DoD and DIB situational awareness regarding the extent and severity of cyber threats in accordance with National Security Presidential Directive 54/Homeland Security Presidential Directive 23 (Reference (b)).
c. Create a timely, coordinated, and effective CS/IA partnership with the DIB …
W. Scott Blackmer posted Data Integrity and Evidence in the Cloud to the InformationLawGroup blog on 1/29/2010:
How does cloud computing affect the risks of lost, incomplete, or altered data? Often, the discussion of this question focuses on the security risks in transmitting data over public networks and storing it in dispersed facilities, sometimes in the control of diverse entities. Less often recognized is the fact that cloud computing, if not properly implemented, may jeopardize data integrity simply in the way that transactions are entered and recorded.  Questionable data integrity has legal as well as operational consequences, and it should be taken into account in due diligence, contracting, and reference to standards in cloud computing solutions. …
The interaction between the data entry system and the multiple databases is normally effected through database APIs (application programming interfaces) designed or tested by the database vendors. The input is also typically monitored on the fly by a database “transactions manager” function designed to ensure, for example, that all required data elements are entered and are within prescribed parameters, and that they are all received by the respective database management systems.
Cloud computing solutions, by contrast, are often based on data entry via web applications. The HTTP Internet protocol was not designed to support transactions management or monitor complete delivery of upstream data. Some cloud computing vendors essentially ignore this issue, while others offer solutions such as application APIs on one end or the other, or XML-based APIs that can monitor the integrity of data input over HTTP.
Since the 1980s, database management systems routinely have been designed to incorporate the properties of “ACID” (atomicity, consistency, isolation, and durability). The question for the customer is whether a particular cloud computing solution offers similar fail-safe controls against dangerously incomplete transactions and records. …
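The atomicity property Scott leads with is easy to see in miniature: either every statement in a transaction is recorded, or none is. This sketch uses SQLite purely for illustration; it is the fail-safe behavior he argues a web-based data-entry path over plain HTTP may lack.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER NOT NULL)")

try:
    with conn:  # opens a transaction; commits on success, rolls back on error
        conn.execute("INSERT INTO orders (qty) VALUES (5)")
        conn.execute("INSERT INTO orders (qty) VALUES (NULL)")  # violates NOT NULL
except sqlite3.IntegrityError:
    pass

# The first INSERT was rolled back along with the failed one.
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # 0
```

A cloud service that simply fires two HTTP POSTs has no equivalent of that automatic rollback unless the provider builds one in.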
Scott goes on to analyze how cloud computing environments must handle atomicity, consistency, isolation and durability issues. He concludes:
… As transactions, databases and other kinds of business records follow email into the cloud, we are likely to see more disputes over records authentication and reliability. This suggests that customers should seek out cloud computing service providers that offer effective data integrity as well as security. Customers should also consider inserting a general contractual obligation for the service provider to cooperate as necessary in legal and regulatory proceedings --because sometimes integrity must be proven.
Kevin McPartland asserts that the financial services industry hasn’t adopted public-cloud computing because of regulatory restrictions on the storage of investor information in this 2:17 Cloud Computing Regulation video.
Kevin is a senior analyst with the Tabb Group. His video is a refreshingly concise analysis of the current position of financial services regulators.
Linda Leung reports FTC Debates Clouds, Consumers and Privacy on 1/29/2010 in this post to the Data Center Knowledge blog:
Consumer data stored by cloud computing services should be regulated through a mix of government policies, consumer responsibility, and openness by the cloud providers, according to a panel of cloud companies and consumer advocates speaking at Thursday’s privacy roundtable hosted by the Federal Trade Commission.
The panelists debated a variety of issues, including how much should consumers be aware of what happens to their data when it leaves their hands, especially if the cloud provider provisions some services through third parties.
Nicole Ozer, director of technology and civil liberties policy at the American Civil Liberties Union of Northern California, said her team’s review of the privacy policies of some cloud providers found that some basic information was either lacking or vague. …
<Return to section navigation list>

Cloud Computing Events

•• The Microsoft Partner Network attempts to take Azure social with the Windows Azure Platform Partner Hub that appears to have launched officially on 1/29/2010. (The unofficial launch was on 1/24/2010). The Partner Network also appears to have taken over the Windows Azure Platform Facebook group and offers a newsletter called the Windows Azure Platform Partner Communiqué. My first Communiqué copy contained no news but was marked *Microsoft Confidential: This newsletter is for internal distribution only*:
[Screen capture: Windows Azure Platform Partner Communiqué newsletter]
… As we built the site we set out to simulate custom dev projects and application migrations currently being done by the Microsoft partners building on the Windows Azure Platform. We worked with Slalom Consulting (a Microsoft Gold Certified Partner) to take BlogEngine.NET to Windows Azure. To my knowledge this is one of the first deployments of BlogEngine.NET on Windows Azure and although the two work well together there were still some critical architectural and development decisions that Slalom had to navigate. As we go we’ll share lessons learned about running the site and ways that you can leverage the code or architecture on apps you build for your clients. …
The “lessons learned” and shared code won’t be of much use if they’re “for internal distribution only.” This reminds me of the SLAs for Microsoft Public Clouds Go Private with NDAs fiasco.
•• Stephen Chapman reported Microsoft Confirms “Windows Phone 7;” to be Discussed at Energize IT 2010 and provided a link to Ottawa: Energize IT – From the Client to the Cloud V 2.0, which takes place on 3/30/2010 at the Hampton Inn & Conference Center, Ottawa, Ontario K1K 4S3, Canada:
Description: Energize IT 2010 – Anything is Possible!
Windows Azure. Office System 2010. Visual Studio 2010. Windows Phone 7. The Microsoft-based platform presents a bevy of opportunities for all of us.  Whether you are an IT Manager, Developer, or IT Pro knowing how these will impact you is critical, especially in the new economic reality. 
Registration is now available for you to attend this complimentary full day EnergizeIT event where we will help you to understand Microsoft’s Software+Services vision using a combination of demonstrations and break-outs. You will find out about the possibilities that these technologies help realize and the value that they can bring to your organization and yourself. …
EnergizeIT 2009 Event Resources Are Now Available for Download here.
• Jon Box lists Upcoming TechNet and MSDN Events for the midwest (MI, OH, KY) during February and March 2010 in this 1/29/2010 post:
Windows Azure, Hyper-V and Windows 7 Deployment
Get the inside track on new tips, tools and technologies for IT pros. Join your TechNet Events team for a look at Windows Azure™ and learn the basics of this new online service computing platform. Next, we’ll explore how to build a great virtual environment with Windows Server® 2008 R2 and Hyper-V™ version 2.0. We’ll wrap this free, half-day of live learning with a tour of easy deployment strategies for Windows® 7.
TOPICS INCLUDE:
  • The Next Wave: Windows Azure
  • Hyper-V: Tools to Build the Ultimate Virtual Test Network
  • Automating Your Windows 7 Deployment with MDT 2010
Take Your Applications Sky High with Cloud Computing and the Windows Azure Platform
Join your local MSDN Events team as we take a deep dive into Windows Azure. We’ll start with a developer-focused overview of this brave new platform and the cloud computing services that can be used either together or independently to build amazing applications. As the day unfolds, we’ll explore data storage, SQL Azure™, and the basics of deployment with Windows Azure. Register today for these free, live sessions in your local area.
Alistair Croll’s State of the Cloud webcast of 1/18/2010 is available as an illustrated audio podcast sponsored by Interop and Cloud Connect, which received excellent reviews on Twitter:
[Slide: cloud cost analysis from Alistair Croll’s State of the Cloud presentation]
Alistair is Bitcurrent’s Principal Analyst and Cloud Computing Conference Chair for Interop and Cloud Connect.
The Cloud Connect 2010 conference will take place on 3/16 to 3/18/2010 at the Santa Clara Convention Center, Santa Clara, CA. A Cloud Business Summit will take place on 3/15/2010:
Trends which have shaped the software industry over the past decade are being drawn up into the move to cloud computing including SaaS, web services, mobile applications, open source, Enterprise 2.0 and outsourced development.
Cloud Business Summit brings together an exclusive and influential group of 200 CEOs, entrepreneurs, technologists, VC's, CIOs and service providers who will cut through the hype and discuss and debate the opportunities, business models, funding and adoption paths for the cloud.
Top entrepreneurs and executives from software, infrastructure and services companies who are building their cloud strategies will network with venture capitalists, service providers, channel partners and leading media. Produced by TechWeb, and hosted by MR Rangaswami of the Sand Hill Group, the event carries on the tradition of the Software 200X events in the Silicon Valley.
According to James Watters (@wattersjames), the San Francisco (SF) Cloud Computing Group will hold its fourth meeting on 3/16/2010, probably in Santa Clara.
Interop 2010 will take place in Las Vegas, NV on 4/25 to 4/29/2010. Cloud computing will be one of the conference’s primary tracks. Alistair Croll will conduct an all-day Enterprise Cloud Summit on 4/26/2010:
In just a few years, cloud computing has gone from a fringe idea for startups to a mainstream tool in every IT toolbox. The Enterprise Cloud Summit will show you how to move from theory to implementation. We'll cover practical cloud computing designs, as well as the standards, infrastructure decisions, and economics you need to understand as you transform your organization's IT. We'll also debunk some common myths about private clouds, security risks, costs, and lock-in.
On-demand computing resources are the most disruptive change in IT of the last decade. Whether you're deciding how to embrace them or want to learn from what others are doing, Enterprise Cloud Summit is the place to do it.
Ron Brachman reports Cloud Computing Brainiacs Converge on Yahoo! for Open Cirrus Summit on 1/18/2010:
Today and tomorrow, Yahoo! is hosting the second Open Cirrus Summit, attended by cloud computing thought leaders from around the world. Computer scientists from leading technology corporations, world-class universities, and public sector organizations have gathered in Sunnyvale to discuss the future of computer science research in the cloud. The breadth of the research talent is expanding this week, as the School of Computer Science at Carnegie Mellon University officially joins the Open Cirrus Testbed.
[Photo: Open Cirrus Summit attendees. Image credit: Yahoo Developer Network]
The event will feature technical presentations from developers and researchers at Yahoo!, HP, and Intel, along with updates on research conducted on the Testbed by leading universities. Specifically, the Yahoo! M45 cluster, a part of the Open Cirrus Testbed, is being used by researchers from Carnegie Mellon, the University of California at Berkeley, Cornell, and the University of Massachusetts for a variety of system-level and application-level research projects. Researchers from these universities have published more than 40 papers and technical reports based on studies using the M45 cluster in many areas of computer science, with several studies related to Hadoop. …
Tech*Days 2010 Portugal will feature a two-part Introduction to Azure session by Beat Schwegler according to the event’s Sessões page. As of 1/29/2010, the dates and location were missing. (Try again later or follow Nuno Fillipe Godinho’s blog/tweets.)
<Return to section navigation list>

Other Cloud Computing Platforms and Services

•• Sam Johnston sights Oracle Cloud Computing Forum[?] in this 1/31/2010 screen capture:
 
My comment: Strange goings on from an outfit that canceled the Sun Cloud project.
According to Oracle Cloud Computing Forum in Beijing, China of 1/28/2010, it’s making the rounds in the PRC.
•• Sales Dot Two Inc. adds proof that if a conference name doesn’t include “cloud,” no one will attend. Here’s the description of its Sales 2.0 Conference: Sales Productivity in the Clouds scheduled for 3/8 – 3/9/2010 at San Francisco’s Four Seasons hotel:
The 2010 Sales 2.0 Conference is where forward-looking sales and marketing leaders will learn how to leverage sales-oriented SaaS technologies so their teams can sell faster, better, and smarter in any economic climate. The two-day conference reveals innovative strategies that make the sales cycle more efficient and effective for both the seller and the buyer. …
The only connection to cloud computing is this 55-minute presentation by Gartner’s research VP, Michael Dunne:
Mapping Sales Productivity in the Cloud
Sales needs to work smarter and more efficiently for firms to successfully drive a business recovery. More innovations are emerging with cloud computing but sales often fails to leverage automation. We will examine the top processes for accelerating sales and their implications for improving sales planning, execution and technology adoption.
Dana Gardner asserts “Apple, Oracle plot world domination” in his Apple and Oracle on Way to Do What IBM and Microsoft Could Not post of 1/29/2010:
… Apple is well on the way to dominating the way that multimedia content is priced and distributed, perhaps unlike any company since Hearst in its 1920s heyday. Apple is not killing the old to usher in the new, as Google is. Apple is rescuing the old media models with a viable online direct payment model. Then it will take all the real dough.
The iPad is a red herring, almost certainly a loss leader, like Apple TV. The real business is brokering a critical mass of music, spoken word, movies, TV, books, magazines, and newspapers. All the digital content that's fit to access. The iPad simply helps convince the producers and consumers to take the iTunes and App Store model into the domain of the formerly printed word. It should work, too.
Oracle is off to becoming the one-stop shop for mission-critical enterprise IT ... as a service. IT can come as an Oracle-provided service, from soup to nuts, applications to silicon. The "service" is that you only need go to Oracle, and that the stuff actually works well. Just leave the driving to Oracle. It should work, too.
This is a mighty attractive bid right now to a lot of corporations. The in-house suppliers of raw compute infrastructure resources are caught in a huge, decades-in-the-making vice -- of needing to cut costs, manage energy, reduce risk and back off of complexity. Can't do that under the status quo. …
This is why "cloud" makes no sense to Oracle's CEO Larry Ellison. He'd rather we take out the word "cloud" from cloud computing and replace it with "Oracle." Now that makes sense!
Linda Leung reports Cisco Outlines Plans for Enterprise Clouds, IaaS on 1/28/2010 for Data Center Knowledge:
Cisco Systems (CSCO) this week bolstered its cloud infrastructure offerings to enterprises and service providers by developing separate design guides for those respective markets. The guides present validated designs that enable enterprises and service providers to develop cloud infrastructures using Cisco gear and partner software.
The Secure Multi-Tenancy into Virtualized Data Centers guide is aimed at enterprises wanting to build clouds and is published with NetApp and VMware. It describes the design of what the vendors call the Secure Cloud Architecture, which is based on Cisco Nexus Series Switches and the Cisco Unified Computing System, NetApp FAS storage with MultiStore, and VMware vSphere and vShield Zones.
Jinesh Varia’s 20-page Architecting for the Cloud: Best Practices white paper (January 2010) is written for Amazon Web Services, but most of its content applies to Windows Azure and other PaaS offerings. From the introduction:
For several years, software architects have discovered and implemented several concepts and best practices to build highly scalable applications. In today’s "era of tera", these concepts are even more applicable because of ever-growing datasets, unpredictable traffic patterns, and the demand for faster response times. This paper will reinforce and reiterate some of these traditional concepts and discuss how they may evolve in the context of cloud computing. It will also discuss some unprecedented concepts such as elasticity that have emerged due to the dynamic nature of the cloud.
This paper is targeted towards cloud architects who are gearing up to move an enterprise-class application from a fixed physical environment to a virtualized cloud environment. The focus of this paper is to highlight concepts, principles and best practices in creating new cloud applications or migrating existing applications to the cloud.
[Diagram: AWS cloud architecture from the white paper]
Jinesh is a Web Services evangelist for Amazon.com.
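Elasticity, the "unprecedented concept" Jinesh singles out, usually boils down to sizing a worker pool to the backlog instead of to peak load. A toy sketch of that pattern (the function name and parameters are illustrative, not from the white paper):

```python
def workers_needed(queue_depth, msgs_per_worker=100, min_workers=1, max_workers=20):
    """Target instance count for a work queue, clamped to a safe range."""
    needed = -(-queue_depth // msgs_per_worker)  # ceiling division
    return max(min_workers, min(max_workers, needed))

print(workers_needed(0))      # 1  (keep one warm instance)
print(workers_needed(950))    # 10
print(workers_needed(50000))  # 20 (capped)
```

A fixed data center must provision for the 50,000-message spike all year; a cloud deployment pays for the extra instances only while the spike lasts.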
<Return to section navigation list>

Thursday, January 28, 2010

Windows Azure and Cloud Computing Posts for 1/27/2010+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the single article you want to navigate.

Cloud Computing with the Windows Azure Platform published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock.)

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010. 
* Content for managing DataHubs will be added as Microsoft releases more details on data synchronization services for SQL Azure and Windows Azure.

Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.

Azure Blob, Table and Queue Services

The ADO.NET Data Services Team (a.k.a. the OData Team) announced Data Services Update for .NET 3.5 SP1 – Now Available for Download on 1/27/2010:

We’re very excited to announce that the “Data Services Update for .NET Framework 3.5 SP1” (formerly known as “ADO Data Services v1.5”) has been re-released and is available for download; the issue with the previous update has been resolved. If your target is Windows 7 or Windows 2008 R2 you can pick it up here.  For all other OS versions you can get the release from here. This release targets the .NET Framework 3.5 SP1 platform, provides new client and server side features for data service developers and will enable a number of new integration scenarios such as programming against SharePoint Lists.

As noted in the release plan update post, this release is a redistributable, in-place update to the data services assemblies (System.Data.Services.*.dll) which shipped as part of the .NET Framework 3.5 SP1.  Since this is a .NET Framework update, this release does not include an updated Silverlight client library, however, we are actively working on an updated Silverlight client to enable creating SL apps that take full advantage of the new server features shipped in this release.  We hope to have the updated SL client available shortly into the new year. …

This final release includes all the features that were in the prior CTP1 release and CTP2 releases. …

The team goes on to detail a long list of features and delivers a brief FAQ. The question is “When will Storage Client support all these new features?”  

Jerry Huang claims If you know C#, you know Windows Azure Storage in this 1/26/2010 post to his Gladinet blog:

I have been using Gladinet Cloud Desktop to manage files on the Azure Blob Storage for a while now. It has a drive letter, accessible from Windows Explorer and it works just like another network drive on my PC.

However, just like when you are driving a car on a daily basis but sometimes still curious about what is under the hood and check the oil level on weekends, I am curious about how Azure Storage works.


To dive into the Azure Storage, you will need the Azure SDK (Nov 2009 Release) to work with Visual Studio 2005 or VS08. VS 2010 will have Azure SDK built in.

First, you will need to have some basic knowledge about the Azure Blob Storage. As shown in the following picture, each Windows Live ID (Master Azure Account) can have multiple projects (Accounts). Each Account has multiple containers. Each container may have multiple Blobs. Each Blob may have multiple blocks.  After you know this, the rest will be just C# and .NET. …
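The hierarchy Jerry describes maps directly onto Blob Storage's addressing scheme: account, then container, then blob path. A one-function sketch (the account and blob names below are made up):

```python
def blob_uri(account, container, blob):
    # Azure Blob Storage addresses blobs as
    # http://<account>.blob.core.windows.net/<container>/<blob>
    return f"http://{account}.blob.core.windows.net/{container}/{blob}"

print(blob_uri("myproject", "photos", "2010/jan/report.pdf"))
# http://myproject.blob.core.windows.net/photos/2010/jan/report.pdf
```

Blocks, the lowest level of the hierarchy, don't appear in the URI; they are uploaded separately and committed into a blob via the REST API.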

Pablo Castro breaks a long silence about Astoria topics in his personal blog with an Implementing only certain aspects of OData post of 1/26/2010:

While we focus on keeping things simple, the whole OData protocol does have a bunch of functionality in it, and you don't always need the whole thing. If you're implementing a client or a server, how much of OData do you need to handle?

OData is designed to be modular and grow as you need more features. We don't want to dictate exactly everything a service needs to do. Instead we want to make sure that if you choose to do something, you do it in a well-known way so everybody else can rely on that. …

If you’re not up to date on Microsoft’s latest technology name metamorphoses, OData (O[pen]Data) is the new name for ADO.NET Data Services (formerly code-named Astoria.) Pablo’s last Astoria-related post was in October 2009.
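The modularity Pablo describes is visible in OData's URI conventions: a service can support only the query options it needs, and clients compose the rest. A sketch of building such a query string (the service URL and helper are hypothetical; `$filter`, `$orderby`, and `$top` are real OData options):

```python
def odata_query(base, **options):
    # options like filter="Price gt 20" become $filter=Price%20gt%2020
    parts = [f"${k}={v.replace(' ', '%20')}" for k, v in options.items()]
    return base + "?" + "&".join(parts)

url = odata_query("http://example.com/svc/Products", filter="Price gt 20", top="10")
print(url)
# http://example.com/svc/Products?$filter=Price%20gt%2020&$top=10
```

A service that hasn't implemented `$filter` simply rejects or ignores that option; the addressing scheme itself stays uniform, which is the "well-known way" the post is after.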

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Liam Cavanagh reviews my new Data Sync post (see below) in his Sending Email Notifications Using SQL Azure Data Sync post of 1/28/2010 to the Microsoft Sync Framework blog:

I love to see when users take the products we have created and add extensions to it to suit their specific needs.  Earlier this week I talked about how Hilton Giesenow extended SQL Azure Data Sync to create a custom synchronization application using VB.NET to allow him to do custom conflict resolution and get better control of events in his webcast series "How Do I: Integrate an Existing Application with SQL Azure?".  Today, Roger Jennings of Oakleaf Blog pointed me to his overview of SQL Azure Data Sync in one of his latest posts: "Synchronizing On-Premises and SQL Azure Northwind Sample Databases with SQL Azure Data Sync".

I really like this latest blog post by Roger because not only does he walk through the capabilities of the tool from start to finish, but he also spends some time talking about one of the common issues users have had using the tool (I call it the "Dreaded ReadCred Issue").  But most of all I enjoyed reading how Roger took what had been done in Hilton's webcast and expanded it even further.  At the end of the post Roger explains how you can add the ability to send email notifications when the synchronization process fails, succeeds or both. Definitely a very useful capability for a DBA. …

Many thanks, Liam.

My Synchronizing On-Premises and SQL Azure Northwind Sample Databases with SQL Azure Data Sync article of 1/28/2010 begins:

SQL Azure Data Sync[hronization] is an alternative to using SQL Server 2008 R2 Management Studio (SSMS) [Express] or the SQL Azure Migration Wizard (SQLAzureMW) v3.1.4+ for replicating schemas of on-premises and SQL Azure databases and bulk-loading a snapshot of data after creating the schema.

The most obvious application for SQL Azure Data Sync is automatically maintaining an on-premises backup of an SQL Azure database. Another common use is synchronizing databases stored in multiple data centers for disaster protection.

Note: This post is an addendum to the updated version of Chapter 13, “Exploiting SQL Azure Database’s Relational Features” of my Cloud Computing with the Windows Azure Platform book.

Contents:

  • Committing to SQL Azure Data Sync
  • Sync Framework Background and Prerequisites
  • The Computing Environment Used for this Example
  • Setting Up SQL Server Agent for Data Synchronization
  • Adding the SMTP Server Feature and Enabling Database Mail
  • Creating the On-Premises Northwind Sample Database
  • Creating and Populating the NorthwindDS SQL Azure Database
  • Running the SyncToSQLAzure-Sync_Northwind SQL Server Agent Job Manually
  • Sending E-Mail Notifications when Data Sync Jobs Fail, Succeed or Both
  • Working Around “ReadCred Failed” Errors when Running the SQL Agent Job

The post is lengthy and “lavishly illustrated.”
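The e-mail-notification step listed above composes a message whose subject and recipient depend on the job's outcome. The post wires this up through SQL Server Agent and Database Mail; purely as a sketch of the logic (the job name is from the post, but the function, recipient address, and `notify_on` parameter are illustrative):

```python
from email.message import EmailMessage

def sync_notification(job_name, succeeded, notify_on=("failure",)):
    """Build a notification e-mail for a Data Sync job, or None if unwanted."""
    outcome = "succeeded" if succeeded else "failed"
    wanted = "success" if succeeded else "failure"
    if wanted not in notify_on:
        return None  # the DBA didn't ask to hear about this outcome
    msg = EmailMessage()
    msg["Subject"] = f"Data Sync job '{job_name}' {outcome}"
    msg["To"] = "dba@example.com"
    msg.set_content(f"The SQL Server Agent job {job_name} {outcome}.")
    return msg

m = sync_notification("SyncToSQLAzure-Sync_Northwind", succeeded=False)
print(m["Subject"])  # Data Sync job 'SyncToSQLAzure-Sync_Northwind' failed
```

Notifying on failure only (the default here) keeps a frequently scheduled sync job from flooding the DBA's inbox.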

Rob Sanders describes building A Dynamic Data Website with SQL Azure and the Entity Framework in this 1/26/2010 post:

This is part of a series of entries written about Microsoft’s new SQL Azure database service and the Entity Framework v4.

Following on from my previous posts (check them out before continuing) – this article assumes you have followed steps outlined in the previous posts to create various models and accounts etc.

… Our next step is to create a Dynamic Data website.  If you haven’t come across this yet, it’s most likely because you haven’t been using Visual Studio 2010 or the .Net Framework 4.0.  Recently introduced and compatible with both LINQ-to-SQL and the Entity Framework, this nice site template makes use of the dynamic nature of both LINQ-to-SQL [.dbml] (SqlMetal) and Entity Framework [.edmx] data models. …

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

Brian Loesgen continues his series with Azure Integration Part 3 - Sending a Message from an ESB off-ramp to Azure’s AppFabric ServiceBus of 1/25/2010:

This is the third post in this series. So far, we have seen:

In this third post, we will see how to use an ESB off-ramp to send a message to the Windows Azure AppFabric ServiceBus. We will actually be doing the same thing as we did in the second post; however, we’ll be doing it in a different way.

There is an accompanying video for this post (as I did with the others too), which you can find here.

The sequence used here is:

  1. Message is picked up from a file drop (because that’s how most BizTalk demos start:))
  2. An itinerary is retrieved from the itinerary repository and applied to the message
  3. The itinerary processing steps are performed, and the message is sent to the ServiceBus
  4. The message is retrieved by the receive location I wrote about in my previous post
  5. A send port has a filter set to pick up messages received by that receive port, and persists the file to disk

The last two steps are not covered here, but are shown in the video.
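The itinerary pattern in steps 2 and 3 amounts to a message carrying its own ordered list of processing steps, the last of which hands it to the ServiceBus. A toy sketch, not BizTalk or ESB Toolkit code (the namespace URL and step functions are made up):

```python
def run_itinerary(message, itinerary):
    # Apply each itinerary step to the message in order, threading the
    # (possibly transformed) message through the chain.
    for step in itinerary:
        message = step(message)
    return message

steps = [
    lambda m: {**m, "validated": True},                                      # validate
    lambda m: {**m, "endpoint": "sb://mynamespace.servicebus.windows.net/orders"},  # resolve endpoint
]
result = run_itinerary({"body": "<Order/>"}, steps)
print(result["endpoint"])
```

The point of the pattern is that the routing decision lives in the itinerary repository, not in the sending application, so the ServiceBus endpoint can change without redeploying the sender.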

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

My Determining Your Azure Services Platform Usage Charges post of 1/28/2010 is an illustrated tutorial for determining usage patterns before Microsoft begins charging for the platform services on 2/1/2010:

I see many questions from Azure Services Platform users about determining usage of Windows Azure compute, storage and AppFabric resources, as well as SQL Azure Databases, but seldom see an answer.

Here’s how to determine your daily usage of these resources:

1. Sign in to the Microsoft Online Services Customer Portal with the Windows Live ID for your Windows Azure and SQL Azure services (click reduced-size images to display 1024x768 captures):

UsageMOCP1SignIn979px

2. Click the View my Bills link (highlighted above) to open the Profile Orders page:

UsageMOCP1ViewOnlineBill728px

And continues with the details of how to load the daily usage details into an Excel workbook.

 

Ben Riga’s Windows Azure Lessons Learned: RiskMetrics of 1/28/2010 continues his “Lessons Learned” series:

In this episode of “Azure Lessons Learned,” Rob Fraser from RiskMetrics talks about the work they’ve done to scale some of their heavy computational workloads out to thousands of nodes on Windows Azure.

RiskMetrics specializes in helping to manage risk for financial institutions and government services.  The solution they built on Windows Azure is primarily for calculating financial risk for their clients.  Calculating the risk on portfolios of financial assets is an incredibly compute-intensive problem to solve (Monte Carlo simulations on top of Monte Carlo simulations).  There is an ongoing and increasing demand for this type of computation.  RiskMetrics calculations require enormous computational power but the need for that power tends to come in peaks.  That means the required hardware is idle for much of the time.  Windows Azure solves this problem by allowing RiskMetrics to quickly acquire the very large number of required processors, use them for a short time and then release them.

Channel 9: Windows Azure Lessons Learned: RiskMetrics

To give you a sense of the scale RiskMetrics is talking about, the initial target is to use 10,000 worker roles on Windows Azure.  And that’s just a beginning as Rob thinks they could eventually be using as many as 30,000.
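The episode doesn’t share any code, but the reason this kind of workload scales so cleanly to thousands of worker roles is that Monte Carlo trials are independent: each node can run its own batch and the batches are simply merged at the end. Here’s a toy JavaScript sketch of that idea (purely illustrative, with a crude uniform loss model and a seeded generator so each “worker” is reproducible — this is not RiskMetrics’ implementation):

```javascript
// Deterministic linear congruential generator so each "worker" is reproducible.
function makeRng(seed) {
  let state = seed >>> 0;
  return function () {
    state = (state * 1664525 + 1013904223) >>> 0;
    return state / 4294967296; // uniform in [0, 1)
  };
}

// One batch of simulated one-day portfolio losses (a crude uniform model).
function runBatch(seed, trials) {
  const rng = makeRng(seed);
  const losses = [];
  for (let i = 0; i < trials; i++) {
    losses.push(rng() - 0.5); // loss centered on zero, in [-0.5, 0.5)
  }
  return losses;
}

// 95% VaR: the loss exceeded in only 5% of trials.
function valueAtRisk(losses, confidence = 0.95) {
  const sorted = [...losses].sort((a, b) => a - b);
  return sorted[Math.floor(confidence * sorted.length)];
}

// "Scale out": three independent batches (one per worker), merged at the end.
const merged = [1, 2, 3].flatMap((seed) => runBatch(seed, 10000));
const var95 = valueAtRisk(merged);
console.log(var95.toFixed(2));
```

Because the only cross-node step is the cheap merge at the end, adding nodes cuts wall-clock time almost linearly — which is exactly why acquiring 10,000 roles for a short burst and then releasing them is attractive.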

John Moore delivers an Update: Siemens Brings HealthVault to Europe in this 1/28/2010 analysis of the Siemens/Microsoft agreement:

Today, Siemens announced that it has struck a deal with Microsoft to create a German instance of the HealthVault platform to serve the citizens of Germany.  In a deal similar to the one that Microsoft struck with Canadian telecom, Telus, Siemens IT Solutions and Services (SIS) will re-purpose the base HealthVault platform to meet Germany’s legal framework for Personal Health Information (PHI) and seek German partners to create a rich ecosystem of data providers (insurers, providers) and apps/services to serve this market.

After a joint briefing with Microsoft and Siemens as well as interviews with three German software firms, SAP (one of the world’s largest enterprise software companies), ICW (healthcare IT infrastructure & PHR solutions) and careon (case/disease mgmt & PHR solutions), here is the scoop:

The Skinny:

Siemens SIS is 38,000 employees strong operating in 20 countries with about 10% of those employees dedicated to the healthcare market.  In addition to a deep presence in Germany, Siemens SIS supports CareNet in Belgium – leading one to conclude that Belgium may be the next extension of this agreement with Microsoft.

This is an exclusive license between Microsoft and Siemens to serve the German market, and both companies stated that this is a very long-term contract, as it will take years to develop, deploy and gain traction.  Terms of the agreement were not disclosed, but both companies will share in revenue generated.

The target market/business model is to sell the HealthVault service to potential sponsors that have a desire to improve care and disease management.  Likely candidates include payers and employers.  Hospitals are also a potential target market.

The service will go live in the second half of 2010 and will include the entire HealthVault platform, including Connection Center for biometric devices.  Existing HealthVault ecosystem partners with solutions pertinent to the German market will be included, and Siemens is currently in discussions with many eHealth companies in Germany to on-board them as well upon formal launch of the platform later this year. …

John continues with detailed “Impressions,” “Prospects,” “Challenges” and “The Wrap” sections.

David Linthicum writes “There are three models of cloud computing, and the one you use determines the kind of performance you get” as a preface to his How to gauge cloud computing performance in this 12/28/2009 post:

Does cloud computing perform well? That depends on whom you ask. Those using SaaS systems and dealing with standard Web latency can't tell you much about performance. However, those using advanced "big data" systems have a much different story to relate.

You need to consider the performance models, which you can break into three very basic categories:

  • Client-oriented (performance trade-off)
  • Cloud-oriented (performance advantage)
  • Hybrid (depends on the implementation)

Dave continues with analyses of the three models.

Vijay Rajgopalan reports New version of Zend Framework adds support for Microsoft Windows Azure in this 1/28/2010 post to the Interoperability@Microsoft blog:

Zend Technologies Inc. has announced the availability of Zend Framework 1.10, which among other new features includes support for Microsoft Windows Azure cloud services. We’re very excited about this key milestone, which is the result of a fruitful collaboration! This particular project started last year when we announced the Windows Azure SDK for PHP CTP release and upcoming support in Zend Framework. I also want to thank again Maarten Balliauw who has been a key contributor to the initial project.

With the new Zend Framework 1.10, by simply using the new Zend_Service_WindowsAzure component, developers can easily call Windows Azure APIs from their PHP applications and leverage the storage services, including Blob Storage, Table Storage and Queue Service, offering them a way to accelerate web application development and scale up on demand.

With this announcement, PHP developers now have a great choice when it comes to writing web applications targeting Windows Azure. Besides the Windows Azure SDK included in Zend Framework, there is the Windows Azure SDK for PHP, which is already prepackaged in the Windows Azure tools for Eclipse, and the simpler Simple Cloud API.

The Windows Azure Team reports Jordan Brand Social Mosaic Goes Live with Windows Azure Cloud Services on 1/27/2010:

We love sharing stories about the interesting and creative ways that customers are using the Windows Azure Platform. One great new story about a cool new implementation is the Mosaic 23/25 built by Wirestone for Jordan Brand, a division of Nike, Inc., to celebrate their 25-year anniversary during the NBA 2010 All-Star Weekend. Jordan Brand is leveraging Microsoft solutions to engage consumers, creatively market their new "Air Jordan 2010" collection and enable enthusiasts to personally connect with the Jordan Brand. …

Some of the underlying capabilities of the Windows Azure platform used to support the Social Mosaic include:

  • Windows Azure Blob Storage - The natural place to store the uploaded photos and resultant deep zoom imagery.
  • Windows Azure Queue Storage - Used to provide an asynchronous mediation point between Azure Blob Storage and backend Worker Roles; also used to queue email notifications.
  • Windows Azure Table Storage - Used to quickly and efficiently store data around Worker Role scale out coordination, leveraging atomic change guarantee patterns.
  • SQL Azure - Used to relationally store other important schema-governed data, thus providing richer functionality to manage select data.

Head to the Social Mosaic and see for yourself how these great technologies are working to help create a global groundswell of brand advocacy for Jordan Brand, and how it brings the Jordan Brand experience to life.
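The queue-mediated pattern in the list above is worth a closer look: the front end stores the uploaded photo in Blob Storage and enqueues only a small message pointing at it, while worker roles drain the queue asynchronously. Here is a minimal in-memory JavaScript sketch of that shape (illustrative only — a real implementation would use the Windows Azure storage APIs, and the names here are hypothetical):

```javascript
const blobStore = new Map(); // stands in for Blob Storage
const queue = [];            // stands in for Queue Storage

// "Web role": store the payload, then enqueue a pointer to it.
function acceptUpload(name, bytes) {
  blobStore.set(name, bytes);
  queue.push({ blobName: name }); // small message, not the photo itself
}

// "Worker role": dequeue one message and process the referenced blob.
function processNext(results) {
  const msg = queue.shift();
  if (!msg) return false;
  const bytes = blobStore.get(msg.blobName);
  results.push(`${msg.blobName}: ${bytes.length} bytes tiled`); // e.g. deep zoom tiling
  return true;
}

acceptUpload('photo-1.jpg', new Uint8Array(1024));
acceptUpload('photo-2.jpg', new Uint8Array(2048));

const results = [];
while (processNext(results)) {}
console.log(results);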

Nancy Gohring reports “The company is working on moving its child exploitation app to the cloud” in her Ballmer: The cloud will bring new apps to law enforcement post of 1/27/2010 to NetworkWorld:

A move to the cloud will enable new kinds of applications that public safety and law enforcement agencies can use to do their jobs better, Microsoft’s CEO said during its annual Worldwide Public Safety Symposium on Wednesday.

“It’s fantastically important,” Microsoft CEO Steve Ballmer said at the event at the company’s Redmond, Washington, headquarters. “The cloud isn’t just about cost and efficiency but building a whole new generation of applications that will be far more able to get the job done than anything that would have been able to be built in yesterday’s model.”

Microsoft’s Child Exploitation Tracking System is an example of how moving applications to the cloud can advance an application. CETS is a software product that law enforcement officers use to search, share and analyze evidence in child exploitation cases across police agencies.

CETS is deployed in 10 countries including Canada and Spain. However, data in CETS is not shared across borders.

Microsoft is now working with Interpol to explore moving CETS to the cloud so that agencies in different countries can share data, Ballmer said. “If ever there was an application that would benefit from coming to the cloud so data could be shared by law enforcement in multiple jurisdictions, this is the app,” he said. Criminals involved in child exploitation typically work across borders, meaning law enforcement often loses track of perpetrators as they cross borders. “This highlights why a move to the cloud makes a difference,” Ballmer said. …

Eric Nelson reports the Results of Cloud Computing Survey - Part 1: Is Cloud relevant? in this 1/27/2010 post:

On January 7th 2010 I kicked off a survey on Cloud Computing and the Windows Azure Platform (Now closed). A big thanks to the 100 folks who completed the survey. I have been through the results and removed a few where folks clearly dropped out after page 1 (which is fine – but I felt it wasn’t helping the results).

I promised to share the results which I will do over four posts. This is the first of those four.

  • Part 1: Is Cloud relevant?
  • Part 2: How well do you know the technologies of Microsoft, Amazon, Google and SalesForce?
  • Part 3: What Plans around the Windows Azure Platform?
  • Part 4: My analysis

Some observations:

  • I am a .NET developer (well, I try to be), therefore my expectation is that most folks replying would in the main be .NET developers. This obviously means the results will end up favouring MS technologies.
  • I am UK based, hence most of the respondents are from the UK.
  • At the time of creating the survey I had only just switched to Azure. Hence I think my “readership” at that point were not “Azure fans” or “Cloud fans” – instead they were likely a cross section of the development landscape. Which I think makes the answer to question 2 very interesting. …

Eric continues with screen captures of replies to the questionnaire.

Jim Nakashima’s Windows Azure Debugging: Matching an Instance in the DevFabric to its Process tutorial of 1/26/2010 begins:

One of the things I run into from time to time is the need/desire to match an instance I’m watching in the Development Fabric UI back to its process.

This may be because I am sitting on a breakpoint in Visual Studio and want to look at the corresponding logs for that process in the DevFabric UI, or because I want to use some non-Windows-Azure-aware tool that requires a process ID.

For example, given a Web and a Worker role with 3 instances each, I can watch the instances in the development fabric UI by right-clicking the Windows Azure task tray icon and selecting “Show Development Fabric UI”:


This is useful for a couple of reasons: sometimes I want to double-check the status of my roles and instances, but more commonly I want to watch the logs on a per-instance basis. …

Hovhannes Avoyan asks “But who’s actually got some form of cloud solution working for them?” as the introduction to his Companies Recognize Importance of Cloud, But Minority Act post of 1/26/2010:

Some interesting points about cloud computing were made in Symantec’s recent 2010 State of the Data Center survey, conducted by Applied Research, which polled 1,780 enterprises globally, each with at least 1,000 employees. Basically, the report offers a lot of statistics that say companies think cloud computing is an important priority, but that actual deployments and activity remain pretty low.

For example, more than half surveyed said cloud computing was an important priority for this year, according to a report I read about the survey.  Among them, 57% said private cloud computing is either somewhat or absolutely important; 54% had the same assessment about hybrid cloud computing; and 53% said the same about public cloud computing.

But who’s actually got some form of cloud solution working for them? Only about 20%, apparently – using it as a cost-containment strategy. Of those, just under one-quarter used private cloud computing last year to cut or tighten costs, while 22% relied on a hybrid cloud solution and only one in five used public cloud computing. …

Howard D. Smith’s Dynamax digital signage on Azure post of 1/26/2010 reports:

Dynamax Technologies is to launch a new digital signage application using the Windows Azure platform. digitalsignage.NET will enable customers to securely access their digital signage system from anywhere in the world. It also supports easier administration, whilst removing from the user the arduous task of maintaining servers and technical infrastructure.

"The Windows Azure platform provides greater choice and flexibility in how we develop and deploy scalable, but easy-to-use digital signage solutions, to our world wide partners and customers,” said Howard D. Smith, Director and CTO at Dynamax. “There has always been the issue of scalability and resilience with traditional hosting technologies for SaaS solutions; Windows Azure helps us to address that.

“digitalsignage.NET by dynamax automates and simplifies critical processes such as managing your own servers and data centers, managing security and scalability planning. It is designed for all levels of users; from the single screen deployment right through to enterprise level deployments, all on a simple low cost subscription pricing model. Built on ASP.NET technologies and designed from the ground up to be an easy-to-use digital signage cloud application, digitalsignage.NET by dynamax, we feel, will herald a new benchmark in the digital signage market.”

“Through the technical and marketing support provided by the Front Runner program, we are excited to see the innovative solutions built on the Windows Azure platform by the ISV community,” said Doug Hauger, general manager for Windows Azure at Microsoft. “The companies who choose to be a part of the Front Runner program show initiative and technological advancement in their respective industries.”

Wikipedia says: “Digital signage is a form of electronic display that shows information, advertising and other messages.”

Lydia Leong analyzes the Microsoft/Intuit Windows Azure/QuickBooks partnership in her Cloud ecosystems for small businesses post of 1/26/2010:

As I’ve been predicting for a while, Microsoft and Intuit have joined forces around Quickbooks and Azure: Microsoft and Intuit announced that Intuit would name Microsoft’s Windows Azure as the preferred platform for cloud app development on its Intuit Partner Platform. This is an eminently logical partnership. MSDN developers are a critical channel for reaching the small business with applications, Azure is evolving to be well-suited to that community, and Intuit’s Quickbooks is a key anchor application for the small business. Think of this partnership as the equivalent of Force.com for the small business; arguably, Quickbooks is an even more compelling anchor application for a PaaS ecosystem than CRM is. …

Whatever your business is, if you want to create a cloud ecosystem, you need an anchor service. Take something that you do today, and leverage cloud precepts. Consider doing something like creating a data service around it, opening up an API, and the like. (Gartner clients: My colleague Eric Knipp has written a useful research note on this topic entitled Open RESTful APIs are Big Business.) Use that as the centerpiece for an ecosystem of related services from partners, and the community of users.

Michael Coté’s Using the Intuit Partner Platform, Alterity’s story – RIA Weekly #69 post of 1/26/2010 to the Enterprise Irregulars blog delivers his latest podcast:

In this episode, I talk with Alterity’s Brian Sweat about launching their new application, Easy Analytics for Inventory, on the Intuit Partner Platform. We talk about IPP, Flex, cloud, analytics, and the ready to go QuickBooks customer-base of 4.5 million users – a pretty exciting setup for one development team’s story on using RIAs and the cloud. [Emphasis added.] …

Michael adds:

If you’re interested in an overview of IPP, I suggest last week’s episode with Intuit architect Jeff Collins.

Adam Bird explains Automating Azure deployment with Windows PowerShell in this 1/22/2010 post:

One of the pre-requisites I had for using Azure was that it could be deployed automatically as part of an integration and deployment process.

A quick scan through the Labs in the Windows Azure Platform Kit, which by the way have been an excellent resource so far, gave me Windows PowerShell as the option.

It proved pretty easy to get up and running. Here's my quick start guide. …

<Return to section navigation list>

Windows Azure Infrastructure

James Urquhart reports about feedback on his original proposal in his Payload descriptor for cloud computing: An update post of 1/28/2010 to C|Net News’ The Wisdom of Clouds blog:

Recently, I outlined my thoughts around simplifying application delivery into cloud-computing environments. At the time, I thought what was needed was a way to package applications in a universal format, whether targeted for infrastructure or platform services, Java or Ubuntu, VMs or disk drives.

The core concept was to define this format so that it combines the actual bits being delivered with the deployment logic and run-time service level parameters required to successfully make the application work in a cloud. …

Thankfully, I received tremendous feedback on the application packaging post, both in the comments on CNET and from a large number of followers on Twitter. The feedback was amazing, and it forced me to reconsider my original proposal.

  …

Brenda Michelson reports Enterprise Management Associates: CapEx reduction is largest, but not sole, cloud computing benefit in this 1/28/2010 post:

Each time I think I have my cloud computing survey list set, another is released.  The latest is from Enterprise Management Associates, in a report entitled The Responsible Cloud.  The report is priced outside my range, but Data Center Knowledge provides a good summary.

The survey sample:

“Enterprise Management Associates (EMA) interviewed 159 enterprises with active, or immediately planned cloud deployments, and reports that 75 percent said private cloud is the preferred model. Fifty two percent are implementing both on-premises and off-premises clouds…”

The key findings, according to Data Center Knowledge:

“Of the enterprises already running cloud computing, lowered IT capital costs (hardware, facilities, licenses, etc.) was cited by 61% of respondents. One quarter of all respondents reported that they had reduced both capital expenditure and operational expenditures such as staff, power, rent and maintenance costs.”

Her article continues with more findings from the Data Center Knowledge post.

Lori MacVittie writes “I haven’t heard the term “graceful degradation” in a long time, but as we continue to push the limits of data centers and our budgets to provide capacity it’s a concept we need to revisit” as an introduction to her How to Gracefully Degrade Web 2.0 Applications To Maintain Availability post of 1/27/2010:

You might have heard that Twitter was down (again) last week. What you might not have heard (or read) is some interesting crunchy bits about how Twitter attempts to maintain availability by degrading capabilities gracefully when services are over capacity.

“Twitter Down, Overwhelmed by Whales” from Data Center Knowledge offered up the juicy details:

The “whales” comment refers to the “Fail Whale” – the downtime mascot that appears whenever Twitter is unavailable. The appearance of the Fail Whale indicates a server error known as a 503, which then triggers a “Whale Watcher” script that prompts a review of the last 100,000 lines of server logs to sort out what has happened.

When at all possible, Twitter tries to adapt by slowing the site performance as an alternative to a 503. In some cases, this means disabling features like custom searches. In recent weeks Twitter.com users have periodically encountered messages that the service was over capacity, but the condition was usually temporary. For more on how Twitter manages its capacity challenges, see Using Metrics to Vanquish the Fail Whale.

I found this interesting and refreshing at a time when the answer to capacity problems is to just “go cloud”, primarily because even if (and that’s a big if) “the cloud” was truly capable of “infinite scale” (it is not), it is almost certainly a fact that most organizations’ budgets are not capable of “infinite payments”, and cloud computing isn’t free. …

Kevin Jackson writes “Available now for pre-order on Amazon, this guide is a crystal ball into the future of business” as an introduction to his Review: Executive's Guide to Cloud Computing by Eric Marks and Bob Lozano of 1/27/2010:

Recently, I had the privilege of reviewing an advance copy of Executive's Guide to Cloud Computing by Eric Marks and Bob Lozano.

Available now for pre-order on Amazon, this guide is a crystal ball into the future of business.

Not a technical treatise, this excellent book is an insightful description of how cloud computing can quickly sharpen the focus of information technology and line executives onto the delivery of real value.

Using clear prose, Eric and Bob explain how cloud computing elevates IT from its traditional support role into a new and prominent business position.

The Windows Azure Support Team announced Dallas Feature Voting and reminded folks to vote for Windows Azure and SQL Azure features in We would love to hear your Azure Ideas of 1/26/2010:

Do you want to have a say in the future of Azure?  Do you have a great idea that would really make one of the Azure products great?  Well, we would love to hear from you.  You can submit your ideas on http://www.mygreatwindowsazureidea.com

You can also vote on any of the existing suggestions to help bring awareness to the great idea.  The Azure teams will be monitoring this site and looking to add additional features into future versions of the products.

There is also a place for SQL Azure Feature Voting and Dallas Feature Voting.

John Treadway asserts in his Private (external) Clouds in 2010 post of 1/26/2010:

At the enterprise level, the interest in private clouds still exceeds serious interest in public clouds.  Gartner and others predict that private cloud investments in the enterprise will exceed public cloud through 2012.  In my conversations with people, there appears to be some confusion as to just what is a private cloud, where you might find them, and how they can be used.

My definition of what distinguishes a private cloud from a public cloud is very simple — tenancy at the host level.  If more than one organization is sharing the physical infrastructure, it’s Public.  If it’s just you on the box – it’s Private.  It’s not about where it runs, because some of my smarter cloud colleagues in the industry believe that the hottest deployment model for private clouds this year will be external — in someone else’s data center.  That means that “most” private cloud deployments in 2010 could very well NOT be inside the corporate firewall. [Emphasis added.] …

Dustin Arnheim is “Defining elastic application environments” in his What Does Elastic Really Mean? article of 1/26/2010:

In terms of cloud computing in application environments, elasticity is perhaps one of the more alluring and potentially beneficial aspects of this new delivery model. I’m sure that to many of those responsible for the operational and administrative aspects of these application environments, the idea that applications and associated infrastructure grow and shrink based purely on demand, without human intervention mind you, sounds close to utopia. While I would never dispute that such capability can make life easier and your environments much more responsive and efficient, it’s important to define what elasticity means for you before you embark down this path. In this way, you can balance your expectations against any proposed solutions. …

Steve Clayton reported on a speech Microsoft’s Brad Smith gave in Brussels on 1/26/2010 in The cloud in Europe – challenges and opportunities of the same date:

Brad Smith, Senior Vice President and General Counsel at Microsoft, gave a speech titled “Technology leadership in the 21st century: How cloud computing will change our world”. It follows a similar speech he gave last week in the US about building confidence in the cloud. In that speech, he made some direct requests of US Congress and US industry to act together. He called for both bodies to act to provide a “safe and open cloud” and for the US Congress to deliver a Cloud Computing Advancement Act. Big, bold words, and frankly it’s what customers should be demanding of all cloud vendors, not just Microsoft. There is gold in the cloud for sure, but wherever there are riches, there are inevitably those looking to profit illegally. While many consumers put blind faith in the cloud, Government and industry can’t afford to do so. There are other challenges regarding policy that I’ll touch on below.

In today’s speech in Brussels, Brad focused on a number of areas – some very similar, but some new themes too. He talked about the huge potential the cloud holds for small and medium businesses in terms of economic growth and job creation. Put simply, the cloud enables small guys to have IT operations just like big guys and pay only for what they use.

The Etro Study, “The Economic Impact of Cloud Computing on Business Creation, Employment and Output in Europe” concluded that the adoption of cloud computing solutions could create a few hundred thousand new small-and medium-sized businesses in Europe, which in turn could have a substantial impact on unemployment rates (reduced by 0.3/0.6 %) and GDP growth (increased by 0.1/0.3%). The study also concluded that these positive benefits will be “positively related to the speed of adoption” of cloud computing. …

David F. Carr’s 5 Things You Need to Know about Platform as a service of 1/25/2010 expands on the following points:

  1. It's like glue. …
  2. You can test it now. …
  3. Start small. …
  4. Scalability isn't guaranteed. …
  5. Platforms aren't Portable. …

What’s more interesting about this post to CIO.com is that the San Francisco Chronicle picked it up for its “Business” section. Unlike many other authors of these short summaries, Carr includes Windows Azure in his analysis.

Joel York analyzes Cloud Computing vs. SaaS – Mass Customization in the Cloud in this 1/25/2010 post:

SaaS Do #8 Enable Mass Customization is a core principle for building SaaS applications. Salesforce.com, for example, has taken it to new heights with offerings such as the Force.com platform. However, do SaaS-based development platforms such as Force.com represent a fundamental shift in application development, or are they simply the SaaS equivalent of Microsoft Visual Basic for Access? How do they stack up against cloud computing platforms like Amazon Web services? This post examines the potential for competitive advantage through mass customization in cloud computing vs. SaaS.

The short answer is this…
Mass customization in cloud computing is more natural, more flexible, and offers more potential for competitive advantage than in the wildest dreams of SaaS, because cloud computing is built on Web services that are a) inherently abstracted, b) independent components and c) accessible at every layer of the technology stack. …

<Return to section navigation list> 

Cloud Security and Governance

Ellen Rubin writes “To codify data security and privacy protection, the industry turns to auditable standards” as a preface to her Security vs. Compliance in the Cloud post of 1/28/2010:

Security is always top of mind for CIOs and CSOs when considering a cloud deployment. An earlier post described the main security challenges companies face in moving applications to the cloud and how CloudSwitch technology simplifies the process. In this post, I’d like to dig a little deeper into cloud security and the standards used to determine compliance.

To codify data security and privacy protection, the industry turns to auditable standards, most notably SAS 70 as well as PCI, HIPAA and ISO 27002. Each one comes with controls in a variety of categories that govern operation of a cloud provider’s data center as well as the applications you want to put there. But what does compliance really mean? For example, is SAS 70 type II good enough for your requirements, or do you need PCI? How can your company evaluate the different security claims and make a sound decision?

Ellen continues with detailed discussions about:

    • SAS 70 (Types I and II)
    • PCI (and Its HIPAA Component)
    • Compliance Building Blocks
    • Deploying to the Cloud


Lori MacVittie asserts “Using HTTP headers and default browser protocol handlers provides an opportunity to rediscover the usability and simplicity of the mailto protocol” in her How to Make mailto Safe Again post of 1/28/2010:

Over the last decade it's become unsafe to use the mailto protocol on a website due to e-mail harvesters and web scraping. No one wants to put their e-mail address out on teh Internets because two minutes after doing so you end up on a trillion SPAM lists and the next thing you know you're changing your e-mail address.

But people still wanted to share contact information, so it became common practice to spell out your e-mail address, such as l.macvittie AT F5 dot com. But e-mail harvesters quickly figured out how to circumvent that practice so people got even more inventive, describing how to type the @ sign instead. For example, you can send me an e-mail at l.macvittie SHIFT 2 f5.com. But that's inconvenient and isn't easily automated, and eventually the e-mail harvesters figure that one out, too. …

Lori goes on to explain three solutions.
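One common workaround in this vein (just one possibility; not necessarily among the solutions Lori describes) is to keep the address out of the markup entirely and assemble the mailto: URL in script, so simple harvesters scraping the raw HTML find nothing to collect. A minimal JavaScript sketch:

```javascript
// Build a mailto: URL without the literal address ever appearing in the page
// source. String.fromCharCode(64) is '@', so even the '@' sign stays hidden
// from naive pattern-matching scrapers.
function buildMailto(user, domain) {
  return 'mailto:' + user + String.fromCharCode(64) + domain;
}

// In a real page this would be attached to a link at click time, e.g.:
//   link.href = buildMailto('l.macvittie', 'f5.com');
console.log(buildMailto('l.macvittie', 'f5.com'));
```

This defeats only harvesters that scrape static HTML; bots that execute JavaScript can still recover the address, which is why it is a mitigation rather than a fix.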

David Linthicum explains Why design-time service governance makes less sense in the cloud in this 1/26/2010 post:

I seem to have struck a nerve with my post "Cloud computing will kill these 3 technologies," including my assertion that design-time service governance would fall by the wayside when considering the larger cloud computing picture. Although this is unwelcome news for some, the end result will still be the same and a prediction I stand by.

<Return to section navigation list> 

Cloud Computing Events

Andrew Coates reports on 1/28/2010 that “The one and only Dave Lemphers is coming back to Australia for a week at the end of February” for Windows Azure User Group Briefings:

The one and only Dave Lemphers is coming back to Australia for a week at the end of February for a whirlwind tour of the country to coincide with the launch in Australia of Windows Azure. He'll be doing public technical briefings hosted by the user groups in 5 capital cities.

Register Here for the 2/22/2010 session in Adelaide.

If/when there are registration links for the events in the other 4 cities, I'll update them here. Otherwise, just rock up at the date/time above.

For more details about Windows Azure in Australia, see Greg Willis' post from last month.

John McClelland announced Live Meeting: Is Azure Right For My Business? Webcast for 2/25/2010 in this 1/28/2010 post:

Did you miss the Azure roadshow for Microsoft Partners in December? Are you still pondering the business opportunity behind Azure? Well, block off two hours on your calendar today to attend our Live Meeting presentation of the roadshow content on Thursday, February 25th. Gather up your team for an extended lunch to learn from leading-edge partners on what the Windows Azure opportunity can mean for your business. For more information and to register for the event go to https://swrt.worktankseattle.com/webcast/3936/preview.aspx. …

John is a Principal Partner Evangelist with Microsoft's U.S. Developer and Platform Evangelism team.

Matt Deacon reported Microsoft Architect Insight Conference 2010 – registration open! in this 1/28/2010 post:

The 5th annual Microsoft Architect Insight Conference is now open for registration!

This year we’re moving back to the full 2-day, 3-track format of previous years but with the flexibility to choose whether to attend one or both days.

Day 1 will focus on Architecting Today: From Cost to Innovation
Day 2 will focus on Architecting Tomorrow: Implications of Cloud

There’s a pile of cracking keynotes (see some of our headliners below), and the breakouts will focus on the needs of the Enterprise, Solution and Infrastructure architect. On top of that, we’re planning a series of interactive tracks so we can delve deeper into the topics addressed in the main breakouts!

  • Iain Mortimer, Chief Architect, Merrill Lynch Bank of America
  • Andy Hopkirk, Head of Projects and Programmes and Director e-GIF Programme, NCC
  • David Sprott, CEO, Everware-CBDi International and Founder CBDi Forum
  • Ivar Jacobson, Ivar Jacobson International
  • Kim Cameron, Chief Architect of Identity and Distinguished Engineer, Microsoft Corp
  • Steve Cook, Software Architect, Visual Studio Team System, Microsoft Corp

The conference will be held at Microsoft London (Cardinal Place), 100 Victoria Street, London SW1E 5JL on 3/31/2010 and 4/1/2010.

David Terrar reported in EuroCloud UK members making sense of Cloud standards and security of 1/28/2010:

The newly formed EuroCloud UK group held their first member meeting a week ago at the Thistle City Barbican Hotel – a panel-led group discussion on Cloud standards and security. Chaired by Phil Wainewright, the panel experts were Dr. Guy Bunker, independent consultant and blogger, formerly Symantec’s chief scientist and co-author of ENISA’s cloud security assessment document; Ian Moyse, Channel Director of SaaS provider Webroot; and Adrian Wright, MD, Secoda Risk Management, formerly global head of information security at Reuters.

In the spirit of cooperation we had invited Lloyd Adams from Intellect and Jairo Rojas from BASDA because we want to ensure that the three UK Cloud and SaaS vendor groups keep in close contact and try to coordinate their various deliverables and activities as much as is practical.  In addition we invited Richard Anning who heads the ICAEW’s IT Faculty.  As I’ve reported before, Phil, Jairo, Richard and I have been in discussions, triggered by Dennis Howlett, about trying to achieve some form of pragmatic standard or quality mark on security and best practice.  We decided to use this discussion to identify if there are any sensible, existing standards or initiatives that we could adopt or incorporate in to our thinking. …

SugarCRM Announces SugarCon 2010 Registration and Call for Papers Now Open for Global Customer and Developer Conference in San Francisco, Calif., April 12-14 in this 1/26/2010 post:

SugarCRM, the world's leading provider of open source customer relationship management (CRM) software, today announced that it will host SugarCon 2010, its global customer, partner and developer conference, April 12-14, at The Palace Hotel in San Francisco, Calif.

“Whether you are a customer, partner, developer or just interested in learning about the future of business applications, cloud computing and open source, we invite you to spend three exciting days with us.”

The theme for SugarCon 2010 is “Evolve Your CRM.” Conference keynotes, tracks and sessions will offer practical advice on how companies can take advantage of the big trends – cloud computing, social networking and open source – impacting how companies attract and retain customers. To learn more, please visit: http://www.sugarcon.com.

Attendance cost is $499 per person. Early-bird registrants will receive a 40 percent discount if they register before Saturday, February 13. Please visit: http://www.sugarcrm.com/crm/events/sugarcon/register.html to register.

The conference will include a Business Apps in the Cloud track covering Microsoft Azure, Amazon EC2, GoGrid, Rackspace:

This track will help developers decide where to spend their time and how to start innovating today.

Geva Perry reports about the Under the Radar: Commercializing the Cloud conference in this 1/26/2010 post:

I'm happy to say that this year I'll be working with the organizers of the excellent Under The Radar (UTR) conference as a Content Advisor and on the start-up selection committee. The conference takes place on April 16, 2010 at the Microsoft campus in Mountain View, CA.

If you haven't had a chance to attend past events, it is a unique and kind of fun format where startups get to present to a panel of judges (typically VCs, Fortune 500 execs, analysts, etc.) and receive a critique on the spot.

It's a great place for companies to get exposed to and network with venture capitalists, journalists, potential customers and partners. A lot of great startups presented at UTR and went on to do great things, including Zimbra, 3Tera, Elastra, Heroku, Flickr, LinkedIn, Twilio, Sauce Labs and many others.

If you'd like your company to present you can apply here or just leave me a comment or contact me via Twitter.

Bruce Kyle delivers the details of Free Windows Azure Training Events Nationwide in this 1/26/2010 post to the US ISV Developer Community blog:

Come join in a 15-city tour of Windows Azure events for developers and IT Pros.

Join your local MSDN Events team as we take a deep dive into cloud computing and the Windows Azure Platform.

For more details and for other events in your area, see MSDN Live Events for Developers.

tbtechnet announced the Windows Azure Platform Application Development Contest! on 1/25/2010:

It’s always great to see contests and challenges that get creative thinking into top gear around new technologies.

This page just went live: http://www.msdev.com/promotions/default.aspx

Windows Azure Platform Application Development Contest!

January 22-February 5, 2010

Attention Partners! Give your applications visibility in the USA Public Sector community. Participate in the State and Local Government Azure Applications Development Contest! Created for Microsoft Partners who focus on the State and Local Government market.  Promote new applications and innovative ideas that you are creating for cloud computing on the Azure Services Platform.

Don't Miss Out! The deadline for submissions is February 5, 2010, and winners will be announced at the CIO Summit on February 25, 2010.

Register and view rules - Visit the contest website: http://www.microsoft.com/government/azure

Terms and Conditions

David Leibowitz explains how his start-up’s SQL Scrubs application uses Windows Azure in Partner Interview: SummitCloud of 1/25/2010 on Channel9:

SummitCloud is a BizSpark startup that produces analysis and compliance software used by suppliers and retailers. Microsoft Partner Evangelist John McClelland and I sat down with SummitCloud CEO David Leibowitz to talk about SummitCloud, BizSpark, and the Microsoft Partner Network.

This is an audio interview and I've just added a few slides to it so that it would be easier to view online.

You can find out more about SummitCloud by visiting their web site at http://summitcloud.com

You can find out more about the community edition of SQL Scrubs here.

Paul Tremblett asserts “Simpler solutions are often better than their more complex counterparts” in his Amazon SimpleDB: A Simple Way to Store Complex Data article of 1/22/2010 for Dr. Dobb’s:

The presence of the last two letters in the name "Amazon SimpleDB" is perhaps unfortunate; it immediately invokes images of everything we have learned about databases; unless, like me, you cut your teeth on a hierarchical database like IMS, that means relational databases and all of the baggage that comes with them: strictly defined fields, constraints, referential integrity and having most of what you are allowed to do defined and controlled by a DBA -- hardly deserving of being described as simple. To allay any apprehensions even thinking of such things might arouse, let me state that Amazon SimpleDB is not just another relational database. So just what is SimpleDB? The most effective way I have found to understand SimpleDB is to think about it in terms of something else we all use and understand -- a spreadsheet.
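Tremblett’s spreadsheet analogy can be made concrete: a domain is a sheet, items are rows, attribute names are columns, and each cell holds one or more string values. The toy in-memory model below illustrates that shape only; it is not the actual SimpleDB API (which you would reach through a library such as boto):

```python
from collections import defaultdict

# Toy model of SimpleDB's data shape: a domain ("sheet") maps item names
# ("rows") to attributes ("columns"). Every attribute can hold MULTIPLE
# string values, and rows need not share a schema -- two properties that
# set SimpleDB apart from a relational table.
class ToyDomain:
    def __init__(self, name: str):
        self.name = name
        self.items: dict = {}

    def put_attributes(self, item_name: str, attrs: dict) -> None:
        row = self.items.setdefault(item_name, defaultdict(list))
        for attr, values in attrs.items():
            row[attr].extend(values)          # multi-valued, schema-free

    def select(self, attr: str, value: str) -> list:
        """Item names where `attr` contains `value` (everything is a string)."""
        return [name for name, row in self.items.items()
                if value in row.get(attr, [])]

books = ToyDomain("books")
books.put_attributes("item1", {"title": ["SimpleDB Primer"],
                               "tag": ["cloud", "db"]})
books.put_attributes("item2", {"tag": ["cloud"]})
print(books.select("tag", "cloud"))   # both items carry the "cloud" tag
```

The multi-valued `tag` column is exactly the kind of thing a strict relational schema would push into a join table; in SimpleDB it is just another cell.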

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Geva Perry asks “Is someone out there working on an open source implementation of the Rackspace Cloud API or Microsoft Azure?” in his Does AppScale Have a Commercial Future? post of 1/28/2010:

About 15 months ago I asked the question in the title of this blog post about an open source project from UC Santa Barbara called Eucalyptus (Does Eucalyptus Have a Commercial Future?). Eucalyptus did indeed go on to become a commercial company funded by top-tier VC Benchmark Capital and BV Capital.

A couple of weeks ago, I received an email from Jonathan Kupferman, a master's student at that same university -- UCSB -- who is one of the team members on a project called AppScale. As Jonathan eloquently explains:

“The main idea behind AppScale is to create an open source version of Google AppEngine to allow people to deploy their AppEngine applications on their own hardware or in other clouds. It is of course all open source and runs on top of Xen/KVM and both EC2 and Eucalyptus. Perhaps the simplest way to explain AppScale is: AppScale is to Google AppEngine what Eucalyptus is to Amazon EC2.”

Eric Nelson continues his survey series with Part 2: How well do you know the technologies of Microsoft, Amazon, Google and SalesForce? of 1/28/2010:

On January 7th 2010 I kicked off a survey on Cloud Computing and the Windows Azure Platform (Now closed).

I promised to share the results which I will do over four posts. This is the second of those four.

Eric’s post details answers to a pair of questions about familiarity with Windows Azure’s competitors.

Brenda Michelson’s Gojko Adzic: Tips for keeping your sanity (and apps) in the cloud post of 1/28/2010 begins:

Although Software Design is further down my enterprise considerations list, when I saw Gojko Adzic’s post on lessons he has learned developing in Amazon’s AWS environment, I knew I had to pass it along. The post describes new challenges for developers who have previously worked in a purpose-built, directly controlled infrastructure environment. These challenges range from server reliability to storage speed. After articulating the challenges, Adzic offers advice on “How to keep your sanity”:

“It took me a while to understand that just deploying the same old applications in the way I was used to isn’t going to work that well on the cloud. To get the most out of cloud deployments, applications have to be designed up-front for massive networks and running on cheap unstable web boxes. But I think that is actually a good thing. Designing to work around those constraints makes applications much better – faster, easier to scale, cheaper to operate. Asynchronous persistence can significantly improve performance but I never thought about that before deploying to the cloud and running into IO issues. Data partitioning and replication make applications scale better and work faster. Sections of the system that can work even if they can’t see other sections help provide a better service to customers. This also makes the systems easier to deploy, because you can do one section at a time. …

In the post, Adzic maintains that he is a cloud computing advocate; his goal for the post, and the presentation it came from, was to “expose some of the things that you won’t necessarily find in marketing materials.”

Read Adzic’s post.  Remember the 4th Enduring Aspect of Cloud Computing.
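Adzic’s asynchronous-persistence point can be sketched in a few lines: callers enqueue writes and return immediately, while a background worker drains the queue toward the (slow, remote) store. The class and names below are illustrative, not from his post:

```python
import queue
import threading

# Sketch of asynchronous persistence: save() never blocks on storage IO;
# a single background worker performs the actual (slow, cloud) writes.
class AsyncWriter:
    def __init__(self, store: list):
        self._q: queue.Queue = queue.Queue()
        self._store = store                      # stand-in for S3/SimpleDB/etc.
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def save(self, record) -> None:
        self._q.put(record)                      # returns immediately

    def _drain(self) -> None:
        while True:
            record = self._q.get()
            if record is None:                   # shutdown sentinel
                break
            self._store.append(record)           # the "slow" IO happens here

    def close(self) -> None:
        self._q.put(None)                        # after all queued writes
        self._worker.join()

store: list = []
writer = AsyncWriter(store)
for i in range(3):
    writer.save({"id": i})
writer.close()
print(len(store))
```

The corollary Adzic draws still applies: once writes are asynchronous, the application must tolerate a window in which data is accepted but not yet durable, which is why this design has to be chosen up front rather than bolted on.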

Paul Krill reports Oracle hails Java but kills Sun Cloud in this 1/27/2010 article for InfoWorld (published by ITWorld):

No interest in cloud utilities. The prognosis was not so positive for Sun Cloud, the public computing platform announced by Sun in March 2009 that was due to be deployed last summer. "We're not going to be offering the Sun Cloud service," said Edward Screven, Oracle's chief corporate architect. [Paul’s emphasis.]

Oracle CEO Larry Ellison has questioned just how new or important the cloud computing concept actually is. But, even though Oracle will not sell compute cycles through the Sun Cloud similar to what Amazon.com does, the company will offer products to serve as building blocks for public and private clouds, company officials said. …

CloudBook’s Cloud Products & Services directory featured the Sun Cloud on 1/28/2010, despite the fact that Oracle announced the demise of that product two days earlier:

[Screenshot: the Sun Cloud listing in CloudBook’s directory]

Michael Coté analyzes the future of the Sun/Oracle (Snor[a]cle) combine in his Oracle and Sun – Quick Analysis post of 1/27/2010 to the Enterprise Irregulars blog:

Oracle had its big, 5-hour event today explaining what they’ll be doing with Sun and the new strategies and company that will result. As immense as the topic is, below is an equally sprawling selection of my commentary. While folks like myself relish this kind of thing, I expect most others will have what they call a “meh” reaction, not too sure about what’s new and different. Well, Oracle trying to be your single source for all enterprise IT is the main thing, Oracle trying to take over IBM’s market-spot as a second. …

Michael continues with detailed discussions of:

    • Summary a la Twitter
    • Summary: Portfolio Rollerball
    • The Single Stack
    • Why Oracle thinks it’ll work this time
    • The single-sourced stack
    • The Cloud

He begins “The Cloud” with:

There wasn’t much mention of “cloud computing,” with words like “clusters” and “grids” being used as synonyms. Larry Ellison reserved most of the cloud zingers for himself, and they amounted to one of his kinder statements: “everything’s called cloud now, there’s nothing but cloud computing.” Which is another way of saying, if everything is a cloud, “cloud” is meaningless.

And, of course, they took what I call the LL Cool J stance on cloud computing: don’t call it an innovation, we been doin’ it for years! With virtualization and cloud computing, this is what every elder company has said, so it’s expected.

Ellison said Solaris would be the best at running clusters, “or I guess the [popular] word is ‘cloud,’” he tacked on. So, there you go: cloud for Oracle is about all that existing cluster and grid stuff we’ve had for years. Presumably, with new innovation laced on top. And really, with “cloud computing” potentially having the same specter as “open source” (cheap), who can blame that attitude?

Stephen O’Grady joins his RedMonk partner with Sunset: The Oracle Acquisition Q&A of 1/28/2010:

If yesterday’s epic five hour webcast discussing Oracle’s plans for its finally acquired Sun assets was a long time coming for the analysts listening in, you can imagine how much of a wait it’s been for those on both sides of the transaction. It’s been roughly nine months, remember, since the database giant announced its intention to acquire the one time dot com darling.

Between Ellison, Kurian, Phillips and the rest, we got our share of answers yesterday. But as is almost always the case in such situations, there was as much left unsaid as said. Meaning that significant questions remain; some which can be answered, some which we’ll only be able to answer in future. To tackle a few of these, let’s turn to the Q&A. …

Stephen continues with a very long Q&A section.

Judith Hurwitz tackles the same task as Michael Coté in her Oracle + Sun: Five questions to ponder analysis of 1/27/2010. Here are the five questions (abbreviated):

  • Issue One: Can Oracle recreate the mainframe world?
  • Issue Two: Can you package everything together and still be an open platform?
  • Issue Three: Can you manage a complex computing environment?
  • Issue Four: Can you teach an old dog new tricks? Can Oracle really be a hardware vendor?
  • Issue Five: Are customers ready to embrace Oracle’s brave new world?

As expected, Judith provides a well thought out answer to each.

Reuven Cohen’s Calculating Cloud Service Provider ROI of 1/27/2010 analyzes the new Cisco IaaS ROI and Configuration Guidance Tool:

… Cisco's ROI tool does a good job of shedding light on the complexities in defining cloud infrastructure focused business models. It outlines components such as operational costs like Labor, Power, Maintenance as well as capital costs including Data Center Build out (Construction), system integration, storage and compute. For me by far the most interesting part of the calculator is the compute related options. They've broken them down into 3 basic VM categories (Power, Average, and Light)


I also found the proposed scale (number of servers, customers etc) in which the ROI tool was built quite telling about the market Cisco is going after with minimum capital expenditures in the 12 - 15 million dollar range for a Cisco based IaaS deployment with smaller deployments actually returning negative ROI results. …

Be sure to read the comments from Cisco’s Chris Hoff (@Beaker) and Sunil Chhab.
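A back-of-the-envelope version of what such a calculator computes: cumulative revenue from tiered VMs minus operating costs, measured against the capital build-out. All figures below are invented for illustration; they are not Cisco’s numbers:

```python
# Simplified IaaS ROI: annual revenue per VM tier (Power/Average/Light,
# per Cisco's three categories) minus annual opex (labor, power,
# maintenance), accumulated over a planning horizon, against capex.

def simple_iaas_roi(capex: float, annual_opex: float,
                    vm_counts: dict, monthly_price: dict,
                    years: int = 3) -> float:
    """Cumulative net return over `years`, as a fraction of capex."""
    annual_revenue = sum(
        vm_counts[tier] * monthly_price[tier] * 12 for tier in vm_counts
    )
    net = (annual_revenue - annual_opex) * years - capex
    return net / capex

roi = simple_iaas_roi(
    capex=13_000_000,                 # build-out in the 12-15M range cited
    annual_opex=4_000_000,            # hypothetical labor/power/maintenance
    vm_counts={"power": 500, "average": 3000, "light": 6000},
    monthly_price={"power": 400, "average": 150, "light": 40},
)
print(f"3-year ROI: {roi:.0%}")
```

Even this crude model shows Reuven’s point about scale: shrink the VM counts and the fixed capex quickly drives the ROI negative.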

Dan Morrill explains How HP could give IBM a run for its money in Cloud Computing Security in this 1/26/2010 analysis of the recent HP/Microsoft cloud-computing partnership:

… With HP’s new playground on Azure, as well as a tool set that helps them work out complex policies and support mechanisms for information security inside the cloud, it might be possible to start seeing virtual private clouds that are certifiable against many of the security processes that need to be in place to meet audit and regulatory requirements, as I talked about yesterday in my “Can regulators keep up with Cloud Computing?” article. This is what makes cloud computing security interesting: companies may not have a choice anymore – they will have to start outsourcing their information security practices until they can hire smart professionals that have deep skills in this. HP is providing a direct avenue along with Microsoft to ensure that compliance, along with standards and practices, can be met – albeit only with an outsourced company as support.

HP has a playground, a deep understanding of the Azure network, and tools to help support and define the security standards and practices that might meet audit standards; this is going to be a tough bit of competition. While I would not count anyone out, especially IBM, HP seems to be working in a very smart direction that can truly help support cloud computing security.

<Return to section navigation list>