Tuesday, December 15, 2009

Windows Azure and Cloud Computing Posts for 12/14/2009+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 
• Update 12/15/2009: Azure AppFabric Team: Windows Azure platform AppFabric Breaking Changes Pre-Announcement; Ben Riga: Azure Lessons Learned: Quark Software; Kyle Austin: User Authentication: It Doesn't Belong In Your Application; ISV Developer Community: Windows Azure – Great Resources; Lori MacVittie: Botnets, Worms, and “Open” Clouds: Can Enterprise-Class Clouds Be Far Behind?; The VAR Guy: MySQL: Two Potential Oracle Moves; Ina Fried: Microsoft's server chief talks cloud (Q&A); and many others.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the single article you want to navigate.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010. 
* Content for managing DataHubs will be added as Microsoft releases more details on data synchronization services for SQL Azure and Windows Azure.

Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.

Azure Blob, Table and Queue Services

Cory Fowler (SyntaxC4) is seeking Java, Python, Ruby and PHP developers to create Azure applications in his Open Source Role Call for Windows Azure post of 12/14/2009:

In order to test out the Windows Azure platform to its full extent I am looking for a few developers that program in the open source realm to develop a few simple web applications in Java, Python, or Ruby to be hosted on Windows Azure. These applications don’t need to be anything too extravagant; however, I would prefer that the applications use the Windows Azure Storage API.

You can download the respective storage APIs here: Java, Python, Ruby, and PHP.

If you are an Open Source Developer and would like to try out the Windows Azure Platform, contact me for a Windows Azure Token. Once you have deployed your application into the cloud, I will create a blog account for you on the Canadian Azure Experience Cloud Blog [that I installed using Azure BlogEngine.Net] so you can post your experience with Windows Azure. …
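The storage service those SDKs wrap is just a REST API, so the same operations are reachable from any of those languages. For comparison, here is a minimal C# sketch of a blob upload using the .NET StorageClient library that ships with the November 2009 Windows Azure SDK; the account name, key, container and blob names are placeholders, not values from Cory's post.

    using Microsoft.WindowsAzure;               // CloudStorageAccount
    using Microsoft.WindowsAzure.StorageClient; // CloudBlobClient, CloudBlobContainer, CloudBlob

    class BlobUploadSketch
    {
        static void Main()
        {
            // Placeholder credentials -- substitute your own storage account name and key.
            CloudStorageAccount account = CloudStorageAccount.Parse(
                "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey");

            CloudBlobClient blobClient = account.CreateCloudBlobClient();

            // Create the container on first use, then upload a small text blob.
            CloudBlobContainer container = blobClient.GetContainerReference("samples");
            container.CreateIfNotExist();

            CloudBlob blob = container.GetBlobReference("hello.txt");
            blob.UploadText("Hello from Windows Azure blob storage.");
        }
    }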

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

• The VAR Guy analyzes MySQL: Two Potential Oracle Moves in the context of Oracle’s potential competition with SQL Azure if the EC permits its acquisition of Sun Microsystems:

Oracle has assured the European Union that the software giant will continue investing in Sun’s MySQL open source database. That move potentially clears the way for Oracle’s pending buyout of Sun Microsystems, according to published reports. But more importantly, the move positions Oracle to potentially undercut Microsoft’s SQL Server database. Will VARs choose sides?

Let’s be clear: Microsoft’s SQL Server partner ecosystem is fiercely loyal. And Microsoft’s SQL Azure positions application partners to potentially cash in on Microsoft’s cloud strategy. Heck, even MySQL is jumping into Microsoft’s Windows Azure cloud.

Listen Closely

Still, there are whispers about a forthcoming Oracle move. It involves a potential Oracle plan to launch an Unbreakable MySQL push against Microsoft SQL Server.

The Unbreakable MySQL chatter started during Oracle OpenWorld in October 2009. Fast forward to the present, and Oracle has given the European Union certain assurances that the company will continue to invest in MySQL for years to come. Those assurances potentially clear the way for the EU to approve Oracle’s buyout of Sun Microsystems within the next month or so, reports BusinessWeek.

Already, Oracle has built a Linux specialization for channel partners. Hmmm… could a MySQL specialization follow in 2010? The VAR Guy sure thinks so. Such a specialization could help MySQL to energize a channel partner effort that appears to have gone silent for most of 2009. …

Stephen O’Grady contributes his Oracle, MySQL and the EU: The Endgame Q&A post of 12/15/2009 to the Oracle/MySQL analysis melange:

Like a tea kettle, the ongoing acquisition of Sun by Oracle, objected to by the EU, has gone from cold to boiling to cold to boiling and back again these past few months. The diversity of opinions, even amongst those considered to be experts on the subject, is remarkable and has led to a wide-ranging, passionate debate.

Nor has the debate been purely academic; the past two weeks have seen progress sufficient for some to declare that the transaction’s endgame is – mercifully – at hand.

Still, as evidenced by the ongoing debate, whatever the future might hold, questions remain. And as always, we at RedMonk love to answer questions. So on to the Q&A. …

Steve is an industry analyst with RedMonk.

Liam Cavanagh explains How to Synchronize Multiple Geographically Distributed SQL Server Databases using SQL Azure Data Sync in this detailed 12/14/2009 post:

Imagine that you want to have multiple copies of the same SQL Server databases located in different locations around the country or even around the world. Those SQL Server databases could exist in your headquarters, subsidiaries, retail stores, and even in your remote offices.  Currently, to accomplish this, there are a number of really great technologies to help you use on-premises software that you manage.  Some examples of this include Merge Replication and Sync Framework database providers (previously known as Sync Services for ADO.NET).  The most common concern that we hear with these technologies is the difficulty of getting the system up and running quickly and the complexity of the management requirements.  Quite often it requires working with IT to open holes in the corporate firewall and setting up web servers to host the synchronization logic.  What I would like to show you is an alternate way that you can accomplish this, by using a technique that removes the need to configure corporate firewalls or to install and configure web services.  With this technique you can take virtually any SQL Server database and share it with other users via Windows Azure. 

To get started let’s begin by looking at a very basic example.  Imagine a company we will call Fabrikam that has a SQL Server in their New York headquarters.  They would really like to make an exact copy of that SQL Server database and make it available in their London subsidiary. Down the road they will also want to put one in their Tokyo office.  Fabrikam wants to be able to have the database local to each of these locations to remove any latency issues.  Ultimately, there will be users that make changes to both of these databases, so periodic synchronization will need to take place to move changes to and from each of these locations. 

One of the ways that this can be accomplished is through the use of SQL Azure and in particular SQL Azure Data Sync.  SQL Azure is a fully managed relational database in the cloud.  This database is built on SQL Server technologies.  Using SQL Azure Data Sync, we can easily solve the first half of Fabrikam's problem, in that we can set up synchronization from their existing New York database and synchronize it to a SQL Azure database.  All of this can be done without any specific configuration to the corporate firewall (other than outbound only access on port 1433) and without the need to set up web services.  This is because SQL Azure Data Sync sets synchronization up within the SQL Server as a SQL Agent process that periodically pushes changes to and from the SQL Server and SQL Azure databases.  Since it makes outbound calls to the public SQL Azure database service there is no need to open holes in the corporate firewall.  The first part of the architecture looks like this:

[Diagram: Sync to SQL Azure]
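The outbound-only point is worth underscoring: from the on-premises side, SQL Azure is reached with ordinary ADO.NET over TDS on port 1433, just against an Internet-facing server name. Here is a minimal, hedged C# sketch of that connection; the server, database, login and password values are placeholders.

    using System;
    using System.Data.SqlClient;

    class SqlAzureConnectionSketch
    {
        static void Main()
        {
            // Placeholder values -- substitute your own SQL Azure server, database, and login.
            // Note the login@server form and Encrypt=True, which SQL Azure expects.
            var builder = new SqlConnectionStringBuilder
            {
                DataSource = "tcp:myserver.database.windows.net,1433",
                InitialCatalog = "FabrikamSales",
                UserID = "myadmin@myserver",
                Password = "placeholder-password",
                Encrypt = true,
                TrustServerCertificate = false
            };

            using (var connection = new SqlConnection(builder.ConnectionString))
            using (var command = new SqlCommand("SELECT COUNT(*) FROM sys.tables", connection))
            {
                // Only an outbound TCP 1433 connection is required; no inbound firewall holes.
                connection.Open();
                Console.WriteLine("Tables in the cloud database: {0}", command.ExecuteScalar());
            }
        }
    }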

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

The Azure AppFabric Team issued a Windows Azure platform AppFabric Breaking Changes Pre-Announcement on 12/14/2009:

In this [forthcoming] Windows Azure platform AppFabric release, there are improvements in stability, scale and performance over the previous CTP.  Some of these improvements might require user code modification.  Applications built on or that communicate with AppFabric Service Bus and AppFabric Access Control might be affected and need to change accordingly. 

Access Control

  • WRAP version updated …
  • WRAP Request changes: Token type field, Token field and Scope field …
  • WRAP Response changes: Expiration field, Body field, Issuer in the returned token and Authorization (auth) header …

Service Bus 

  • Message Buffers …
  • Relay Bindings …
  • URIs and Headers …
  • Using Access Control …

It’s clear that the Azure AppFabric (formerly .NET Services) is still a moving target.
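For readers who haven’t touched the Access Control service yet, the WRAP items above refer to the form fields and response values of a plain HTTP POST to the AppFabric token issuer. The C# sketch below shows that request roughly as it works in the current CTP; the endpoint URL, issuer name, key and scope are placeholders, and the wrap_* field names are exactly the things the pre-announcement says are changing, so treat this as illustrative rather than definitive.

    using System;
    using System.Collections.Specialized;
    using System.Net;
    using System.Text;
    using System.Web; // HttpUtility (add a reference to System.Web)

    class WrapTokenRequestSketch
    {
        static void Main()
        {
            // Placeholder endpoint and credentials -- substitute your own AppFabric
            // namespace, issuer (service identity) name, key, and protected scope URI.
            const string stsAddress = "https://mynamespace.accesscontrol.windows.net/WRAPv0.9/";

            var fields = new NameValueCollection
            {
                { "wrap_name",     "myissuer" },                  // issuer name
                { "wrap_password", "placeholder-issuer-key" },    // issuer key
                { "wrap_scope",    "http://localhost/myservice" } // resource the token applies to
            };

            using (var client = new WebClient())
            {
                // POST the form-encoded request; the response body is also form-encoded.
                byte[] responseBytes = client.UploadValues(stsAddress, "POST", fields);
                string response = Encoding.UTF8.GetString(responseBytes);

                // ParseQueryString handles the URL decoding of the Simple Web Token.
                string token = HttpUtility.ParseQueryString(response)["wrap_access_token"];
                Console.WriteLine("SWT: {0}", token);
            }
        }
    }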

Kyle Austin suggests “Rethinking how you handle authentication” in his User Authentication: It Doesn't Belong In Your Application article for Dr. Dobbs of 12/14/2009:

When building a Web-based app, you've got a thousand design and implementation decisions to make -- decisions that affect the usability and performance of your application, as well as its key functionality. Unfortunately, user authentication is typically the last thing you spend design cycles on. Just do what you've always done -- create a user database with accounts and passwords, and maybe hash the passwords for good measure.

That approach doesn't work anymore. The online world is a complex place, and your application doesn't operate in isolation. The problem isn't so much one of technology as it is your users themselves. With dozens of accounts and passwords to manage, what do people do? They share them between applications. This means that even if your password file isn't compromised, attackers can still find ways into your data. And whether or not it's your fault, it's always your problem.

The best illustration of this is the recent concentrated password attack on accounts belonging to Twitter employees. The attacker managed to get access to a personal email account and, through a combination of doggedness, password guessing and password resets, discover and break into numerous other accounts hosting Twitter corporate data as well as personal information. And in doing so they demonstrated how even one authentication problem can compromise a whole chain of accounts. …

Kyle is vice president of engineering at TriCipher.

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

• Wade Wegner announced Significant updates to the SQL Azure Migration Wizard in this 12/15/2009 post:

George Huey has done it again!  He has just published some significant updates to the SQL Azure Migration Wizard.

Previously, I’ve described the SQL Azure Migration Wizard as a tool that helps you migrate your SQL Server database into SQL Azure.  This is still true, but now, thanks to updates made by George Huey, you can also migrate from SQL Azure-to-SQL Server and SQL Azure-to-SQL Azure.  These are significant updates to the tool!

Please watch the [post’s] video for an updated explanation of the tool:

• Lynn Langit (SoCalDevGal) posted Lynn's Windows Azure GuestBook, a simple interactive Azure app.

• Ben Riga interviews Quark Software’s Stephan Friedl in a Channel9 Azure Lessons Learned: Quark Software video of 12/15/2009:

In this episode of Azure Lessons Learned I chat with Stephan Friedl, Chief Architect at Quark Software.  Quark (of Quark XPress fame) has built a new business called Quark Promote for small and medium businesses to design and print high-quality collateral (brochures, business cards, postcards, etc.) to promote their businesses. 

In and of itself this is an interesting Software + Services solution built with a compelling WPF design client and a high-performance ASP.NET server.  Quark chose to deploy this solution using Windows Azure.  That way they could build out their business to handle the numerous relationships they’ve set up with neighborhood printers.  The architecture is service-based specifically so they could handle these types of relationships, host the solution on a partner’s site, and in fact host on multiple sites from that same single multi-tenant solution running on the Windows Azure platform.

• Colinizer’s Deploy This Silverlight Application on Windows Azure in 10 minutes – no Tools Required! post of 12/14/2009 offers:

This post guides you through the process of deploying and configuring the provided Silverlight application on Windows Azure through the Windows Azure Platform web portal using just your compatible web browser.  You do not need any development tools…

The included Guest Wall application is a Silverlight app hosted in an ASP.NET website that runs on Windows Azure Hosted Services and uses Windows Azure Cloud Storage to store messages that anyone can post.  You can configure it to some extent as explained [in his post]. …

• Joannes Vermorel’s Where do Windows Azure folks get their inspiration? post of 12/14/2009 is a paean to Windows Azure:

At PDC'09, Microsoft and Lokad unveiled a case study about Windows Azure. Yet, what was our surprise when we discovered the following video at the Windows Azure booth (check for video links below).

Once upon a time, there was a little company with little funds, but great ambitions. …

Data Analytics with Windows Azure

• CloudBerry Lab announces the availability of its free CloudBerry Explorer for Azure Blob Storage in Microsoft Pinpoint. CloudBerry’s detailed description of its initial utility for Azure storage should serve as a model for others in this and related businesses.

Innov8showcase’s Dominos Pizza Hosts a Java/Tomcat Infrastructure on Windows Azure post of 12/14/2009 announces:

Peter Laudati & Dmitry Lyalin host the edu-taining Connected Show developer podcast on cloud computing and interoperability. Check out episode 21, “It’s a PDC 2009 Pizza Party”.  Peter interviews Tim Wise from Dominos Pizza about how Dominos is hosting its Tomcat/Java-based web-ordering system on Windows Azure to support peak traffic periods. Dominos anticipates this will result in significant infrastructure cost savings.

Peter discusses Azure interoperability with Sumit Chawla on the Microsoft Interop team.  Also, news coverage from the PDC on Silverlight 4, MEF on Mono, F#, and Dallas with special guests, Gary Russo from the New York Java SIG and Sara Chipps.

If you like what you hear, check out previous episodes of the Connected Show at www.connectedshow.com.  You can subscribe on iTunes or Zune.  New episodes approximately every two weeks!

Mike Leach explains Everything You Need to Know About Azure in 5 Minutes in this 12/14/2009 video Webcast:

Well, almost everything.

A basic iteration with Azure consists of:

  • Develop
  • Debug and Test
  • Deploy to Staging
  • Deploy to Production

Mike embeds the player for his 00:05:00 production.
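The “Develop” step in that loop comes down to implementing a role. As a rough illustration (not taken from Mike’s webcast), here is a minimal worker role against the ServiceRuntime API in the November 2009 SDK; the trace category, sleep interval and connection-limit tweak are arbitrary placeholders.

    using System.Diagnostics;
    using System.Threading;
    using Microsoft.WindowsAzure.ServiceRuntime;

    // A minimal worker role: the Azure fabric calls OnStart() once,
    // then Run() for the life of the role instance.
    public class WorkerRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // A common tweak for roles that call storage or other HTTP services heavily.
            System.Net.ServicePointManager.DefaultConnectionLimit = 12;
            return base.OnStart();
        }

        public override void Run()
        {
            while (true)
            {
                Trace.WriteLine("Working", "Information"); // shows up in role diagnostics
                Thread.Sleep(10000);                       // placeholder for real work
            }
        }
    }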

<Return to section navigation list> 

Windows Azure Infrastructure

My New Microsoft Online Services Customer Portal Adds Windows Azure Platform Sign-Up and Billing Features post of 12/15/2009 announces:

In preparation for Microsoft’s Azure Services Platform commercial debut on 1/4/2010, the Microsoft Online Services team added a Customer Portal (MOCP) with sign-up and billing pages to its site:

Judith Hurwitz includes “Cloud computing will move out of the fear, uncertainty and doubt phase to the reality phase for many customers” and “Cloud service providers will begin to drop their prices dramatically as competition intensifies” in her six Predictions for 2010: clouds, mergers, social networks and analytics post of 12/15/2009.

Yes, it is predictions time. Let me start by saying that no market change happens in a single year. Therefore, what is important is to look at the nuance of a market or a technology change in the context of its evolution. So, it is in this spirit that I will make a few predictions. I’ve decided to just list my top six predictions (I don’t like odd numbers). Next week I will add another five or six predictions.

Ina Fried interviews Bob Muglia in her Microsoft's server chief talks cloud (Q&A) article of 12/15/2009 for CNET News:

It's been a busy year for Bob Muglia.

Microsoft's server and tools boss shipped an update to Windows Server, got promoted to division president, and prepared Microsoft's operating system in the clouds--Windows Azure--for its commercial launch.

In what has become a bit of a year-end ritual, Muglia sat down with CNET for a year-end interview. We hit on a range of topics, from the future of Windows Server, to why his bank won't be moving to Windows Azure any time soon, to the changing life of an IT manager, to Microsoft's consumer future. (Spoiler alert: Muglia thinks it is bright.) …

Ina follows with an edited transcript of the extensive interview. I’m underwhelmed with her choice of Bob’s mug shot for the article.

Frederic Lardinois includes Windows Azure in his 10 Web Platforms of 2009 post of 12/14/2009 to the ReadWriteWeb blog:

Azure

Azure is Microsoft's big push towards cloud computing. While it is still too early to judge the success of this platform, we think it would be wrong to underestimate Microsoft's commitment to this space and the size of its developer ecosystem. While Amazon and RackSpace's cloud services are clearly more popular than Microsoft's new service, there can be little doubt that the arrival of Microsoft in this market will help to push the incumbents towards more innovation. …

• Michael Coté contributes a positive spin on Microsoft buys Opalis – Shoring up good old fashioned IT Management in this 12/14/2009 post:

Last week, Microsoft confirmed the rumors that it was buying Opalis, buying what looks like a solid process automation platform to give its System Center line a much-needed boost in the good old fashioned IT Management space.

What Opalis Does

Opalis takes processes in IT that would otherwise have to be done manually and automates them. It uses a combination of workflow creation, run book cataloging, and what’s hopefully endless amounts of integration code to work with the various infrastructure out there to deploy changes, remediate problems, and otherwise do the tasks an admin would have to do by hand. You can imagine that in a virtualized and cloud-driven world, this kind of automation is table stakes. Here’s how I summed it up in 2007 after talking with Opalis:

“The GUI is a console that lets you define workflows for IT management: check the state of a system, if it’s bad, open a help desk ticket, notify this system, then do some other action. The idea is to create a “run book,” or a set of modeled courses of action to take when some event occurs: be it reactive like responding to problems in IT, or less “fire fighting” like provisioning a new server in response to a change request.”

What I liked about what I saw was that it seemed to be an integration layer — a “spanning layer,” even — on top of all sorts of existing IT management software.

Now, I’m always a bit suspicious of the wide applicability of anything that involves drag-and-drop modeling. The simple tasks you see in demos can’t speak for the long-term functionality of such a product. But, the idea and the implementation I saw looked nice. The modeling of workflows and the execution thereof certainly fits in well with enterprise ideas of systems management.

I haven’t really heard much negative about Opalis over the years, and their integration with Big 4 and beyond enterprise IT management seems legit. …

The ISV Developer Community posted Windows Azure – Great Resources, a complete compendium of resources from PDC 2009, on 12/15/2009:

PDC was a major milestone for the Windows Azure Platform, with Ray Ozzie announcing “production” and Bob painting a future vision in the keynote – there is a lot of new content, information and messaging to digest. New services were announced, including “Dallas,” as well as some old favorites being re-branded into the AppFabric.

If you want to get started, or need some more information on any of the technologies, then the links [in this post] will provide you with a wealth of resources to jump start your development.

John Rockerfeller envisions bundling Windows Azure subscriptions with consumer PCs at retail in his Windows Azure is Windows 8 post of 12/13/2009:

I sat in my office last night trying to identify what Microsoft is doing to combat upstart thin-client operating systems like Google Chrome OS, continue making money with its very popular offline Office suite and offline Windows platform, and compete against Amazon for data and web services now that the world is moving into cloud services.

They will have a lot of competition in the next 3 to 5 years against their core, money-making software products and I believe their plan is to leverage the millions of existing .NET developers and all of the skills they’ve spent years developing to change Windows from a boxed product to a subscription-based “Windows-As-A-Service” service.

I’ve been working with the Windows Azure platform for a few weeks now and I have to say I’m quite impressed. Launching apps is pretty easy once you have the required software installed and there are plenty of projects already listed at CodePlex to get you started. Moving from .NET development to Azure development is a piece of cake. They also appear to be much more open to supporting non-Microsoft development languages such as Ruby and PHP. As a Linux guy, I have to admit they’ve put this together pretty damn well.

Currently, the industry has only paid attention to the web application deployment features of Azure. I believe the true power of Azure is not just deploying scaling web applications but in its ability to launch virtualized desktops from the cloud. Let me explain what I envision Microsoft’s plans to be for the future of the entire software lineup. …

• Jeffrey Schwartz interviews Yousef Khalidi about Azure and private clouds in his Q&A: Microsoft Distinguished Engineer Yousef Khalidi Discusses Private Cloud Migration post of 11/20/2009 to the Visual Studio Magazine site:

After a long week at Microsoft's Professional Developers Conference in Los Angeles, Yousef Khalidi flew across the country to New York on Thursday to talk about Microsoft's new Azure cloud services at the Interop trade show.

Khalidi, a distinguished engineer at Microsoft for cloud infrastructure services, sat on a panel among rivals Amazon, Google and hosting provider Joyent in a keynote session. Also on that panel were three prospective customers who grilled Khalidi and his competitors, citing their preference for private clouds.

In an interview following the panel discussion, Khalidi talked about some of those concerns and Microsoft's plans to deliver a private cloud offering.

Tim Anderson’s Reflections on Microsoft PDC 2009 post of 12/14/2009 observes:

… So how was PDC 2009? While there was a ton of good content there, and an impressive launch for Silverlight 4, there was a noticeable lack of direction; maybe that was why Ballmer decided not to show up. It should have been the Windows Azure PDC, but as I have just written elsewhere, Microsoft has little excitement about its cloud. Chief Software Architect Ray Ozzie gave almost exactly the same keynote this year that he gave last year; and the body language, as it were, is more about avoiding the cloud than embracing it. Cross-platform clients, commodity pricing, throw away your servers: from Microsoft’s point of view, what’s not to hate? …

Tim’s earlier Microsoft has little excitement about its cloud post compares Marc Benioff’s keynoting style with Ray Ozzie’s and offers more insight about Microsoft’s relative position in the cloud.

GlassHouse Technologies reports GlassHouse Survey Reveals 60 Percent of Executives Plan to Implement Cloud Technology Next Year in this 12/14/2009 press release on BitWire:

GlassHouse Technologies, a leading independent IT infrastructure consulting and services firm, today announced the results of a survey that highlights the growing popularity of cloud computing. According to GlassHouse’s cloud computing trends survey, 60 percent of executives plan to implement cloud initiatives in the coming year. Additionally, the survey found enterprises are paying increased attention to internal cloud networks. Among IT executives surveyed by GlassHouse, 72 percent consider internal clouds their highest priority.

As a growing number of companies consider the benefits of cloud computing, GlassHouse has recognized the industry’s increasing need for education on how to most effectively manage these environments. To provide today’s IT professionals with much-needed information around cloud technology and trends, GlassHouse is also today releasing the “CIO’s Guide to Cloud Computing” whitepaper.

The GlassHouse whitepaper highlights initiatives companies can take to develop a proactive enterprise-wide strategy for cloud computing that is in line with business objectives. GlassHouse recommends several steps, including applying true demand forecasting and planning capabilities to cloud environments that will enable IT to deliver agile services at the right cost and the right service level. By implementing these practices, organizations will be better able to manage their cloud programs, leading to cost savings and operational benefits.

“For most CIOs, cloud computing is a relatively abstract concept. For practical consideration, cloud needs to be addressed in the context of immediate issues and future opportunities for enterprise IT,” said Jim Damoulakis, chief technology officer at GlassHouse Technologies. “This whitepaper provides organizations with a concrete overview of cloud deployment models and trends, as well as guidelines for planning and evaluating enterprise cloud solutions to obtain the cost savings promised by cloud technology.” …

Sadagopan Singam’s Cloud Computing & IT Spending! post of 12/14/2009 to the Enterprise Irregulars Web site begins:

In continuation of the previous note:

Involved discussions on cloud invariably turn towards the question: would embracing cloud help business bring down IT spend? This question assumes more relevance given that early adopters are seen to be increasing fast and IT spend, by various counts, could be seen to be flat to marginally higher in the coming quarters.

Today cloud vendors seem to mostly follow the path of playing the volume game to make their money – very different from conventional approaches. In most cases, early adopters in the past used to spend at higher costs to gain the first-mover advantage in business and thereby recoup the costs while the technology matured and began to be offered at lower prices. But today, part of the confusion is sown by some vendors when they boldly proclaim huge cuts in IT spend with the adoption of the cloud. I think that this may be true in a few cases, but this is hardly the way to push cloud computing. Why not, one may ask.

As recorded by Joe McKendrick, at a recent panel at Interop AT&T’s Joe Weinman raised doubts about the sustainability of cloud computing economics, describing a scenario in which they break down as enterprise management requirements come into play. “I’m not sure there are any unit-cost advantages that are sustainable among large enterprises,” he said. He expects adoption of external cloud computing in some areas, and private capabilities for others.

<Return to section navigation list> 

Cloud Security and Governance

• Lori MacVittie questions Botnets, Worms, and “Open” Clouds: Can Enterprise-Class Clouds Be Far Behind? in this 12/15/2009 post:

Cloud computing environments are just as suited to illegitimate use as legitimate use. Do providers need a way to separate the chaff from the wheat to reassure enterprise-class customers that they’re doing everything they can to eliminate the hijacking of cloud computing resources for nefarious purposes?

One of the negatives of being the technology darling du jour is that every misstep, problem, and outage is immediately jumped on and reported everywhere. Amazon is particularly susceptible to such coverage, being recognized as one of the leaders in public cloud computing. Last week Amazon suffered yet another outage, true, but more interesting may be the discovery that it had been infected by the Zeus bot, a password-stealing banking Trojan.

On Wednesday, security researchers for CA found that a variant of the infamous password-stealing Zeus banking Trojan had infected client computers after hackers were able to compromise a site on EC2 and use it as their own C&C (command and control) operation.

The Zeus bot has been loose for quite some time and Amazon is certainly not the first – nor likely the last – organization to be infected by this nasty little trojan. In October social networking giant Facebook was targeted by miscreants attempting to spread some Zeus-bot love around as well. The bot is a Windows-specific trojan that, like so many others, attempts to lure its victims into installing it via phishing and drive-by attacks. …

• David Linthicum claims “Service governance technology providers are building products with the cloud in mind” and describes SOA governance's move into the clouds in this 12/15/2009 post to InfoWorld’s Cloud Computing blog:

Governance -- that often-unappreciated discipline of making sure the right thing happens in the right way and the wrong things don't happen -- is heading to the clouds, taking advantage of the insights learned during SOA's glory days. After all, the clouds and SOA share many architectural aspects, as I've argued many times in this blog, so it makes sense to apply SOA governance principles as well.

Vendors are taking notice, too. As InfoWorld's Paul Krill points out, "AmberPoint Governance System monitors an application environment for changes and updates, discovering application components and resources. Policy compliance is automated, and reporting is done in real time, AmberPoint said. The product acts as a single system to enforce governance policies." …

• K. Scott Morrison delivers links to his Identity Buzz Podcast with [Daniel Raskin] Now Available: End-to-End Web Services Security in this 12/14/2009 post:

I recently had a great, freewheeling discussion with Daniel Raskin, Sun’s Chief Identity Strategist. Daniel runs the Identity Buzz podcasts. We talked about issues in identity and entitlement enforcement in SOA, compliance, and the problems you run into as you move into new environments like the cloud.

Daniel’s post about our podcast is on his blog. You can download the podcast directly here.

Linda Musthaler explains how BitArmor achieves Cost-effective data encryption in the cloud in her 12/14/2009 post:

One of the best practices you can implement to secure sensitive data is to encrypt it. This is especially important when the data is most vulnerable, such as when it is being stored or transported on mobile media like a laptop hard disk, a USB stick, a CD or DVD, or when it is attached to an e-mail message. Seeing how many data breaches are the result of a lost or stolen laptop or portable storage device, it should be a given that companies are going to encrypt sensitive data.

If common sense isn't enough to spur you to encrypt your data, industry, state and federal regulations might be your driving force. HIPAA, HITECH and PCI all require organizations in specific industries to protect sensitive data -- with encryption being the de facto means of protection. …

BitArmor recently introduced a hosted encryption service that reduces the cost and complexity of encrypting sensitive data. The solution is available via the cloud, with BitArmor hosting the encryption server and managing the keys. All you have to do is install a piece of software on the PCs containing data you want to protect. The BitArmor service provides full disk encryption for laptops and desktops; protection for data on removable media; and protection for data that is sent as an e-mail attachment.

There are two things that make BitArmor's solution rather unique. The first is how the solution is deployed in the cloud. BitArmor's competitors typically deploy one host server per customer. For efficiency, BitArmor slices one server to provide isolated services to each of its customers. As you can imagine, fewer servers mean far lower costs, allowing BitArmor to charge a relatively low fee of $5 per device per month. Even small companies can afford this kind of fee. …

<Return to section navigation list> 

Cloud Computing Events

Jayaram Krishnaswamy asks Want to hear a 15 minutes intro to SQL Azure? in this 12/14/2009 post that announces he will deliver a session at the NJSQL December 15th Meeting - Lots of Presenters, Prizes and Food!!:

Date:  Tuesday, December 15th

Location: SetFocus (Directions)

Agenda:
6:00pm - 6:30pm: Dinner (Microsoft/ApexSQL)
6:30pm - 6:40pm: NJSQL/Community Updates
6:45pm: Red-Gate Demo - Alec Lazarescu
7:05pm: SQL Server Profiler in a production environment - Bill Foster
7:25pm: Northwind Velocity - Waseem Naik
7:45pm: Dessert (SetFocus)
7:50pm: SQL Azure - 15 questions answered - Jayaram Krishnaswamy
8:10pm: ApexSQL Demo - Michael Coles
8:30pm: Closing/Prizes

Refreshments: Not Pizza...(something a little nicer) 

Sponsors: Microsoft/ApexSQL/Red-Gate
Give-A-Ways: Thumb Drives (Red-Gate), ApexSQL License (ApexSQL),
Zune, Windows 7 & Office 2007 (Microsoft)

NJ SQL Server User Group

Sorry for the short notice.

Stephen Forte reports on 12/14/2009 NYC.NET Developer User Group: Meeting this Thursday (12/17/2009):

Thursday, December 17, 2009: Application Development for the Windows Azure Platform

You must register at https://www.clicktoattend.com/invitation.aspx?code=143228 in order to be admitted to the building and attend.

At PDC 09 Microsoft announced commercial availability of the Windows Azure Platform. This presentation will cover the key features of the Windows Azure distributed fabric-based operating system that make it possible to build applications leveraging distributed computation and storage capabilities in the cloud. It will also cover higher-layer .NET Services such as SQL Azure, Microsoft’s distributed SQL Server database in the cloud, and the .NET Services that provide extended connectivity and security across cloud and on-premises applications.
The focus will be on developing applications using the various platform APIs for development, deployment and management of Windows Azure Platform services.

Speaker: Bill Zack, Microsoft
Bill Zack is an Architect Evangelist with Microsoft. He comes to this role after serving as a Solutions Architect in the Financial Services Group of Microsoft Consulting Services. His experience includes developing, supporting and evangelizing .NET/SOA based frameworks used to jump-start development projects for financial services companies. Prior to joining Microsoft he acted as a Consultant, Architect, Administrator, Developer, and System Integrator. He has also authored several computer books and white papers. …

Date: Thursday, December 17, 2009

Time: Reception 6:00 PM , Program 6:15 PM

Location: Microsoft , 1290 Avenue of the Americas (the AXA building - between 51st and 52nd Sts.), 6th floor

Directions: B/D/F/V to 47th-50th Sts./Rockefeller Ctr
1 to 50th St./Bway
N/R/W to 49th St./7th Ave.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

• Rich Miller’s fawning Rackspace and the Transition to the Cloud post of 12/15/2009 appears to celebrate the Data Center Blog’s start of “The Managed Hosting Channel[, which] is brought to you by Rackspace Hosting:”

John Engates jokes that web hosting is “the original cloud computing.” It’s a reminder of the connection between the past and future for Rackspace Hosting. As the chief technical officer of Rackspace, Engates has helped steer the company through transitions driven by both technology and terminology.

As customers focus on the benefits of cloud computing, hosting companies are pondering the best way to reposition their product offerings to compete. Perhaps no hosting provider has navigated the shift to the cloud more smoothly than Rackspace (RAX), a company founded in the early days of the dot-com boom that has emerged as the second-largest player in cloud computing, according to some analyses.

In the first three quarters of 2009, Rackspace’s cloud computing operation grew from 34,820 customers to 61,616. The company’s revenue from cloud computing in the third quarter was $15.3 million, a fraction of the $147 million brought in by the company’s managed hosting business, but more than double the $6.5 million from the same period in 2008. …

James Urquhart adds his thoughts about Putting Amazon's spot pricing into perspective in this 12/15/2009 post:

As reported on CNET, Amazon Web Services has announced a new pricing option that lets its customers take advantage of spare capacity within the EC2 infrastructure at variable, supply-and-demand-driven pricing.

The news has taken the cloud community by storm. For some, it represents the beginning of a long-anticipated move to market pricing for core IT infrastructure services.

While there is some truth to the importance of AWS spot pricing to the history of cloud computing, let's keep things in perspective: this pricing is set by Amazon, not any market. We are a long way from a true commodity market for any form of cloud computing service. …

Graphics credit: Wikimedia Commons

Reuven Cohen contributes his analysis of Amazon’s spot pricing move in his Spot on, Amazon Commoditizes The Cloud post of 12/14/2009:

The consensus found in the more traditional areas of commerce is that the longer your products sit in your warehouses, the less money you're making off them (these are also known as carrying costs). Over the years retail-focused companies such as Walmart and Amazon have strived to keep these carrying costs to a minimum by implementing various just-in-time (JIT) inventory strategies (a technique first used by Henry Ford at the Ford Motor Company in the early 20th century).

The philosophy of JIT is simple: inventory is waste. The idea behind a JIT strategy is to improve a business's return on investment by reducing the carrying costs associated with underutilized assets, whether that's a toaster or a hosting company's unused server. The data center business is in a lot of ways very similar: the more unused rack space you have, the less you are making. Cloud-centric data centers make this problem even worse; not only do you need to have excess data center space, you now need to have physical hardware in place, just in case your demand spikes. For a lot of the larger players this means unutilized compute capacity is making them nothing.

The folks at Amazon Web Services have come up with a very interesting approach to solve the problem of DC carrying costs by implementing a spot pricing scheme for unused EC2 instances. In case you're not familiar with the concept, Wikipedia describes the spot price of a commodity as the price that is quoted for immediate (spot) settlement (payment and delivery). In securities, the term cash price is more often used. …

Werner Vogels explains spot pricing for EC2 instances in his Expanding the Cloud - Amazon EC2 Spot Instances post of 12/14/2009:

Today we launched a new option for acquiring Amazon EC2 Compute resources: Spot Instances. Using this option, customers bid any price they like on unused Amazon EC2 capacity and run those instances for as long as their bid exceeds the current "Spot Price." Spot Instances are ideal for tasks that can be flexible as to when they start and stop. This gives our customers an exciting new approach to IT cost management.

The central concept in this new option is that of the Spot Price, which we determine based on current supply and demand and will fluctuate periodically. If the maximum price a customer has bid exceeds the current Spot Price then their instances will be run, priced at the current Spot Price. If the Spot Price rises above the customer's bid, their instances will be terminated and restarted (if the customer wants it restarted at all) when the Spot Price falls below the customer's bid. This gives customers exact control over the maximum cost they are incurring for their workloads, and often will provide them with substantial savings. It is important to note that customers will pay only the existing Spot Price; the maximum price just specifies how much a customer is willing to pay for capacity as the Spot Price changes.

Spot Instances are ideal for Amazon EC2 customers who have workloads that are flexible as to when their tasks are run. These can be incidental tasks, such as the analysis of a particular dataset, or tasks where the amount of work to be done is almost never finished, such as media conversion from a Hollywood studio's movie vault, or web crawling for a search indexing company. For most of these tasks their completion is not time critical and as such they are ideal targets for additional cost savings. …
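To make the billing rule concrete, the short C# sketch below simulates it over a made-up sequence of hourly Spot Prices: the instance runs, and is charged the Spot Price rather than the bid, whenever the bid covers the current price, and it is interrupted whenever the Spot Price climbs above the bid. It calls no AWS API, and the numbers are invented purely for illustration.

    using System;

    class SpotBillingSketch
    {
        static void Main()
        {
            // Invented hourly Spot Prices for some instance type (illustration only).
            decimal[] spotPricePerHour = { 0.030m, 0.032m, 0.045m, 0.051m, 0.038m, 0.029m };
            decimal bid = 0.040m;   // the most this customer will pay per instance-hour

            decimal totalCharged = 0m;
            foreach (decimal spot in spotPricePerHour)
            {
                if (bid >= spot)
                {
                    // The instance runs this hour and is charged the Spot Price, not the bid.
                    totalCharged += spot;
                    Console.WriteLine("Spot {0:F3} <= bid {1:F3}: running, charged {0:F3}", spot, bid);
                }
                else
                {
                    // The Spot Price has risen above the bid, so the instance is interrupted.
                    Console.WriteLine("Spot {0:F3} >  bid {1:F3}: interrupted", spot, bid);
                }
            }
            Console.WriteLine("Total charged: {0:F3}", totalCharged);
        }
    }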

I’m waiting for the other shoe to drop on Microsoft’s Azure site.

<Return to section navigation list> 
