Friday, June 04, 2010

Windows Azure and Cloud Computing Posts for 6/3/2010+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this daily series.

 
Vacation Notice: We’ll be on vacation (and attending TechEd North America 2010) in New Orleans next week. The OakLeaf Systems blog won’t be updated from Sunday 6/6 through Friday 6/11/2010. Laissez les bons temps rouler!

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

  • Azure Blob, Drive, Table and Queue Services
  • SQL Azure Database, Codename “Dallas” and OData
  • AppFabric: Access Control and Service Bus, CDN
  • Live Windows Azure Apps, APIs, Tools and Test Harnesses
  • Windows Azure Infrastructure
  • Cloud Security and Governance
  • Cloud Computing Events

To use the above links, first click the post’s title to display the single article you want to navigate.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage,” here on 9/29/2009.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts and Databases”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated in June 2010 for the January 4, 2010 commercial release. 

Azure Blob, Drive, Table and Queue Services

Jim White compares Windows Azure Table Storage vs. Windows SQL Azure in his 6/1/2010 post to the Intertech blog:

Last week, my Intertech colleague and Microsoft MVP, Tim Star, and I presented the Windows Azure Bootcamp for the Twin Cities.  According to Microsoft reps, we broke the record for the most-attended bootcamp in the US, with nearly a hundred people at the event.

A common question that I received during and after the bootcamp was "Why would I want to use Windows Azure Table Storage versus Windows SQL Azure to store my application data?"  It's a good question, and the answer depends on your application and data needs.

Table Storage and SQL Azure Defined

First, allow me to back up and set the stage a little for this discussion.  SQL Azure is essentially SQL Server in the Microsoft cloud computing environment known as Azure.  That is not quite accurate, in that SQL Azure currently has a number of limitations and doesn't support some features that SQL Server 2008 has (here is a starter list of limitations:  http://msdn.microsoft.com/en-us/library/ee336245.aspx).

I like to say that SQL Azure is either SQL Server 2008 minus or SQL Express plus, depending on how you want to view it.  Table Storage is one of three alternate storage mechanisms built into Azure that are collectively called Windows Azure Storage Services, the other two being queues and blobs.  Table Storage allows you to store serialized entities in a table, but the term table here is not a relational database table.  To give people an analogy they can use to get their arms around Table Storage, I like to tell them to think of it as a fancy spreadsheet.  You can store the state of your entities in the columns of the spreadsheet.  However, there is no linkage or relationship (and therefore no joins) between entities - at least none that is automatically managed and maintained by Azure.  There are no custom indexes - at least not today.
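
To make the “no joins” point concrete: each Table Storage entity carries only a PartitionKey and RowKey, which together form its sole index, so a routine relational query like the following Transact-SQL sketch (Customer and Order are hypothetical tables) has no server-side equivalent in Table Storage. There you would either denormalize the two entities into a single table or query each table separately and combine the results in application code.

-- Routine in SQL Azure; Table Storage offers no server-side join like this.
SELECT c.CustomerName, o.OrderDate, o.TotalDue
FROM Customer AS c
INNER JOIN [Order] AS o ON o.CustomerID = c.CustomerID
WHERE c.Region = 'Midwest';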

Interestingly, when Azure was first introduced in 2008, SQL Server was not part of the original picture.  Because of negative developer reaction, SQL Azure was added to the next preliminary release in 2009.  There is a growing faction that is trying to get the software community to look at alternatives to SQL.  The "No-SQL" community (see here and here) has, to some extent, influenced the Azure cloud computing platform through the Table Storage option, but not enough to eliminate SQL from the Microsoft cloud.

While both SQL Azure and Azure Table Storage provide data persistence via table structure, there are a number of differences between them.  The sections below outline some of the key differences and factors you want to weigh before building an application for Azure that requires some form of table persistence. …

Jim continues his analysis of Scale and Performance, Data Access, Portability, Transactions and Concurrency, Queries, Column Types, Cost and Bottom Line topics. Here’s his final [SQL Azure] Future topic:

We are given every indication by Microsoft that SQL Azure will have far more capability in the future - akin to the SQL Server you might find in your data centers today.  So some of the comparisons above may become moot or less important over time.  Additional functionality is also being proposed for Table Storage.  For example, support for secondary (non-key) indexes has already been suggested for a future release (see here).  However, key architectural differences between SQL Azure and Table Storage will remain and leave application designers to pick the best option for their systems.  Welcome to cloud computing.  There is a lot of ROI to be had by running in the cloud, but only with proper application design and architecture.

<Return to section navigation list> 

SQL Azure Database, Codename “Dallas” and OData

Wayne Walter Berry shows you how to overcome BCP Utility Upload Errors in SQL Azure in this 6/4/2010 post to the SQL Azure Team blog:

One cause of bcp utility upload errors in SQL Azure is trying to upload too much data in a single batch. Each batch of rows from your table is a single transaction, and SQL Azure has constraints on transactions that can cause the bcp utility upload to fail if a batch violates them. This article will address how to avoid violating those constraints.

The bcp utility is a command line utility that ships with Microsoft SQL Server. It bulk copies data between SQL Azure (or SQL Server) and a data file in a user-specified format. The bcp utility that ships with SQL Server 2008 R2 is fully supported by SQL Azure. You can find out more about using BCP with SQL Azure in this blog post.

One type of error you can encounter when uploading is:

SQLState = 08S01, NativeError = 10054

Error = [Microsoft][SQL Native Client]TCP Provider: An existing connection was forcibly closed by the remote host.

SQLState = 08S01, NativeError = 10054

Error = [Microsoft][SQL Native Client]Communication link failure

This is an example of SQL Azure closing the connection because of transaction constraints, including the rule that a transaction must not exceed 1 Gigabyte of data.

The default batch size is 1000 for bcp, which means that 1000 rows are uploaded per transaction. If those rows exceed the transaction constraints, you could get the error above. To reduce the number of rows in a batch, run the bcp utility with the -b flag and indicate the number of rows.

bcp AdventureWorksLTAZ2008R2.SalesLT.Customer in C:\Users\user\Documents\MoveDataToSQLAzure.txt -b100 -c -U username@servername -S tcp:servername.database.windows.net -P password

Reducing the row count will make the transfer chattier and slow your rows-per-second transfer rate. However, it might be the only way to get tables with large row sizes into SQL Azure using the bcp utility.

For Better Performance

You can increase your performance by increasing the batch size, so that more rows are inserted per transaction. This will only work if the data length of each row is small, the opposite of the rows giving you the error above. I have been uploading batches of 10,000 rows with good results, especially for many-to-many tables that have only four columns.
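
For example, here’s the same upload with a 10,000-row batch size (a hedged variant of Wayne’s earlier command; whether it succeeds depends on the data length of your rows):

bcp AdventureWorksLTAZ2008R2.SalesLT.Customer in C:\Users\user\Documents\MoveDataToSQLAzure.txt -b10000 -c -U username@servername -S tcp:servername.database.windows.net -P password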

See Wayne’s Creating a SQL Azure Database with SQLCMD post of 6/3/2010 below.

Pinal Dave explains SQL SERVER – Generate Database Script for SQL Azure [2008 R2] in this 6/4/2010 post:

When talking about SQL Azure, the common complaint I hear is that the script generated from a stand-alone SQL Server database is not compatible with SQL Azure. This was true for some time, but it's not any more. If you have SQL Server 2008 R2 installed, you can follow the guideline below to generate a script that is compatible with SQL Azure.

Dave continues with illustrated steps to set Scripting Options for SQL Server 2008 R2 for creating SQL Azure databases.

Stephan Groschupf claims “Cloud computing reduces the cost of big data analytics” in his Why Small Businesses Are Using 'Big Data' commentary of 6/4/2010 for Forbes Magazine:

Big data analytics is no longer just the purview of big companies with big budgets. Increasingly, cloud computing gives small companies an affordable and easy-to-use way to find out how big data can help grow their existing business or uncover new opportunities. Because cloud computing removes the need to invest in expensive infrastructure to try out their new ideas, small companies no longer face barriers to big data innovation.

Infrastructure-as-a-Service (IaaS) providers such as Amazon, Microsoft, Google, GoGrid, Rackspace and Slicehost, along with the on-demand analytics solution vendors that support them, make big data analytics very affordable. A lot of parameters go into computing the exact price of running big data analytics in the cloud, such as usage and configuration, and, of course, each IaaS and analytics vendor has its own pricing model. However, it is safe to say that there are solutions that can allow a small business to perform simple data analytics on a terabyte of data for as low as $100.

So, perhaps you are now convinced that big data analytics is becoming cheap but are wondering what demand from small companies will really be. After all, don't small companies have small data? Do they really have the skill sets required for big data analytics? Are the solutions available to these small companies really as good as what the big players have access to? The answers to these questions may surprise you.

All companies have big data whether they realize it or not. Certainly, most online businesses, large or small, will collect large volumes of data from their Web logs and clickstream data. But internal data can be just a small part of the big data portfolio as the number of publicly available data sources grows. Consider that the World Bank makes its statistical data about the entire world available online or that all Twitter data since March 2006 will soon be archived digitally at the Library of Congress. Further, there are plenty of news and investment data services that offer low-cost access to their information and their prices keep dropping.

Big data analytics functionality is likely to accelerate in the world of cloud analytics instead of being just a subset of what is available in non-cloud products. Just as users in small companies turn to IaaS to gain affordable access to big data analytics, developers of analytics software and associated plug-ins will turn to the infrastructure as an affordable way to develop and test their applications. Affordability will increase ubiquity of solutions and extend their functionality. …

Wayne Walter Berry explains Creating a SQL Azure Database with SQLCMD in this 6/3/2010 post:

Once you have your server allocated via the SQL Azure Portal, you can create and drop databases from your desktop’s command line using sqlcmd.exe. This can be handy for clean-slate testing, where you want to create a database from scratch, load a schema, and then do some testing in a repeatable way.

One thing to note is that you want to target the master database when creating and dropping databases. It is also important to encrypt your connection using the -N parameter; you can learn more about why we encrypt in this blog post.

Creating a Database

First, you need to construct a .sql file that contains the Transact-SQL command to create your database. It should look something like this:

CREATE DATABASE Test

This will create a 1 gigabyte database, the default size. There should be no other Transact-SQL statements in this .sql file. More about the CREATE DATABASE command in SQL Azure can be found on MSDN.
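
A hedged aside: the SQL Azure CREATE DATABASE syntax also accepts a MAXSIZE option if you want the larger Business Edition size up front (check the MSDN topic above for the values currently supported). For example, to create a hypothetical Test2 database at the 10 GB size:

CREATE DATABASE Test2 (MAXSIZE = 10 GB)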

To execute this command from the command line where the file is named CreateDatabase.sql, use this command:

sqlcmd -SyourServer.database.windows.net -UyourLogin@yourServer -PyourPassword -dmaster -N -i CreateDatabase.sql

The user name needs to be in the form: yourLogin@yourServer, where the login is the administrator login for the server on SQL Azure. You can get this information directly from the SQL Azure portal. Notice that the database targeted is master.

Dropping a Database

Dropping a database is done in much the same way. The script looks like this:

DROP DATABASE Test

More about the DROP DATABASE command in SQL Azure can be found on MSDN.

To execute this command from the command line where the file is named DropDatabase.sql, use this command:

sqlcmd -SyourServer.database.windows.net -UyourLogin@yourServer -PyourPassword -dmaster -N -i DropDatabase.sql

Experimenting

Just a note when experimenting with these scripts: the minimum billing period for SQL Azure is one day, and the 1 gigabyte database costs $9.99 per month, or roughly 33 cents a day. Creating and immediately dropping a database will therefore cost you a minimum of 33 cents. More on pricing can be found here.

Summary

The tools you are used to using with SQL Server work with SQL Azure, including sqlcmd.exe. Do you have questions, concerns, comments? Post them below and we will try to address them.

Zach Skyles Owens suggests that you attempt to Win $10k for Building a Rad PHP App on SQL Server or SQL Azure in this 6/3/2010 post:

Whether you have an established business or have a great startup idea this is an awesome opportunity to win some cash.  The SQL Server team is running a contest for PHP developers building apps on top of SQL Server or SQL Azure with a $10,000 prize.  This includes sites built on top of WordPress or Drupal.

I’ve actually been working with WordPress on SQL and SQL Azure a lot lately and wrote an article about this contest at http://wordpress.visitmix.com which has a bunch of details of how everything works along with some ideas for building the winning app.

Click here for entry details.

See Jim White’s Windows Azure Table Storage vs. Windows SQL Azure post of 6/1/2010 in the above Azure Blob, Drive, Table and Queue Services section.

Jesse Liberty shows you how to combine Windows Phone 7: Lists, Page Animation and [O]Data in this 6/3/2010 post:

This is the fourth in a fast-paced series on programming Windows Phone 7. In this mini-tutorial I will demonstrate how absurdly easy it is to create a master page with a list of data and a details page to display more information about the selected item, and to animate the transition from one to the other. To make it more interesting, we’ll get the list and the details from a web service, using [O]Data.

The reason all of this is so easy is that Visual Studio 2010 provides a template for this very case.  While that smells a bit of making a demo that fits the template, the truth is that this is an extremely common scenario, and the template is very flexible. …

Jesse continues with an illustrated tutorial and source code examples.

Thomas Claburn claims “The emerging protocol aims to provide an open mechanism for sharing content across Web sites” in his Google, Microsoft Back OExchange Social Sharing post of 6/3/2010 to InformationWeek’s Software blog:

Furthering efforts to foster open alternatives to Facebook's social platform, Google, LinkedIn, and Microsoft on Thursday declared their support for OExchange, a content sharing protocol spearheaded by Clearspring Technologies.

OExchange provides a way for online services like Google Buzz to receive content shared from other sources, for third-party sharing tools to discover and share content, and for user preferences to be communicated across services.

"OExchange is a promising effort that seeks to simplify and make sharing easier for publishers and service providers alike," explains Google's Chris Messina in a blog post.

The problem that OExchange and related protocols like XAuth are trying to solve is what Messina has described as the NASCAR problem, a reference to the limited number of sponsor logos that can fit on a race car and still have value as advertisements.

OExchange tries to solve this problem by allowing a sharing tool like Clearspring's AddThis, which can be found on InformationWeek pages, to share content with any other site that supports the protocol.

This would be an improvement over the current implementation of AddThis which displays a predetermined set of 13 other Web sites in a small menu or a daunting list of several dozen sharing services in an expanded menu, an experience not unlike NASCAR logo overload. …

It will be interesting to see if OExchange gathers momentum similar to that of OData. The fact that Microsoft and Google support the protocol leads me to believe OExchange has legs.

Liam Cavanagh wrote about his “Using Microsoft SQL Azure as a Datahub to Connect Microsoft SQL Server and Silverlight Clients” session at TechEd North America 2010 in this 6/2/2010 post to the Microsoft Sync Framework blog:

Just a quick post to let you know that a few of the engineers and I will be attending TechEd North America 2010.   This year we have a booth (look for the "SQL Azure Data Sync" banners) and a few sessions, including one I am presenting titled "Using Microsoft SQL Azure as a Datahub to Connect Microsoft SQL Server and Silverlight Clients", as well as some content in a few other sessions including one by Patric McElroy titled "What’s New in Microsoft SQL Azure". 

I can't give away too much, but we do have a couple of great announcements coming that relate to our work on synchronization with SQL Azure. 

So if you happen to be at the conference, hopefully I will see you either at my session or at the booth.

<Return to section navigation list> 

AppFabric: Access Control and Service Bus, CDN

Eric Nelson answered Q&A: What is the UK pricing for the Windows Azure CDN? on 6/4/2010:

The pricing for Windows Azure Content Delivery Network (CDN) was announced last week. The prices are:

  • £0.091 per GB transferred from North America & Europe locations
  • £0.1213 per GB transferred from other locations
  • £0.0061 per 10,000 transactions

CDN rates are effective for all billing periods that begin subsequent to June 30, 2010. All usage for billing periods beginning prior to July 1, 2010 will not be charged.
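
As a rough worked example at these rates, a site serving 500 GB a month from the North America and Europe locations through 5 million requests would pay about 500 × £0.091 = £45.50 for transfer plus 500 × £0.0061 = £3.05 for transactions, or roughly £48.55 in total.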

To help you determine which pricing plan best suits your needs, please review the comparison table, which includes the CDN information.

Steven Nagy has also done an interesting follow up post on CDN.

See Matias Woloski’s announcement of the Cloud Life Science Events in New Jersey and Boston, June 8th and 15th in the Cloud Computing Events section below.

Vittorio Bertocci promotes his TechEd North America 2010 session in this WIF @ Tech.Ed US post of 6/2/2010:

I closed the last of the WIF Workshops barely 5 hours ago, the larynx still slowly recovering from the overuse and abuse I subjected it to in the last 2 days… but claims-based identity never stops, and the calendar already reminds me that Sunday I’ve got to get up real early to catch a plane for New Orleans.

And what happens in New Orleans next week? Well, no no, not that, we can’t talk about that here… I meant the other thing: Tech.Ed US!!!

This year WIF is covered in two sessions, delivered by yours truly. Lo and behold, the people-contoured shots on fuchsia background (including one in pure Roberto Benigni style) of the main theme are not the only new thing at Tech.Ed US: in an uncharacteristically RESTful move from the event portal, I can point you to my sessions by URI! The resource representation you can get includes title, level, track, type, abstract & speaker, but not the time and room of the session: I’ll add them here for your convenience. One step at a time my friends, one step at a time… ;-)

SIA304 | Identity and Access Management: Windows Identity Foundation Overview

Wednesday, June 9  |  11:45 AM - 1:00 PM  |  Auditorium C

Level: 300 - Advanced

Hear how Windows Identity Foundation makes advanced identity capabilities and open standards first class citizens in the Microsoft .NET Framework. Learn how the Claims-Based access model integrates seamlessly with the traditional .NET identity object model while also giving developers complete control over every aspect of authentication, authorization, and identity-driven application behavior. See examples of the point and click tooling with tight Microsoft Visual Studio integration, advanced STS capabilities, and much more that Windows Identity Foundation consistently provides across on-premise, service-based, Microsoft ASP.NET and Windows Communication Foundation (WCF) applications.

SIA304 is the classic intro for practicing developers to claims-based identity and Windows Identity Foundation. After years of introducing the topic, it feels a bit weird to keep doing it, but so far I’ve always been surprised by how much intros are still needed. We’ll see if the trend continues for this session as well, or if we receive a signal that we can start drilling deeper into the topic and the tool.

If you never heard about claims-based identity/WIF, or if you used only specific aspects of WIF and you want to see a bigger picture, I daresay you could benefit from this session. OTOH, if you already know about WIF, chances are that the session won’t be news for you. Unless you really like the Italian accent or presenters with the fallen-rockstar look, you can probably aim directly for lunch (which, perhaps surprisingly, on Wednesday will be from 11:00am to 1:30pm).

SIA303 | Identity and Access Management: Windows Identity Foundation and Windows Azure

Wednesday, June 9  |  3:15 PM - 4:30 PM  |  Rm 295

Level: 300 - Advanced

Claims-based identity provides an open and interoperable approach to identity and access control that can be consistently applied both on-premises and in the cloud. Come to this session to learn about how Windows Identity Foundation can be used to secure your Web Roles hosted in Windows Azure, how you can take advantage of existing on-premises identities and how to make the best of features in our cloud offering, such as certificate management and staged environments.

Ah, SIA303 is a different matter. This breakout is partly based on the homonymous session I gave during the 2nd day of the WIF Workshops, and I’ll have you know it can be pretty tough. Want to know why? Simple. Claims based identity applies equally well on-prem and in the cloud: in order to find differences between the two I have to dig pretty deep, all the way to things like managing sessions in NLB environments, dealing with multiple URIs for the same app deployed in devfabric/staging/production, handling certificates, using tracing without a file system… see what I mean?

You’ll get the most from this session if you already have experience with Windows Identity Foundation or Windows Azure (preferably both). The good news is that so far the feedback has been consistently good: although the topic can be tough at times, I’m told that the info here is not easily available anywhere else. That sounds like a good thing, doesn’t it :-)

Monday and Tuesday mornings I will hang around the Identity booth, blue shirt, “professional shoes” and everything; if you want to chat feel free to swing by, or to grab me if you see me around.

Does “homonymous” = “eponymous?” Ask Vibro. See his 7 years of blogging for a laundry list of his posts for the last 12 months and my Updated List of 75 Cloud-Computing Sessions at TechEd North America 2010 for additional Azure-related sessions.

<Return to section navigation list> 

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Ryan Dunn and Steve Marx present a 00:40:08 Cloud Cover Episode 14 – Diagnostics Webcast as of 6/4/2010:

Join Ryan and Steve each week as they cover the Microsoft cloud. You can follow and interact with the show at @cloudcovershow
In this episode:  

  • Learn about the Diagnostics capabilities in Windows Azure.
  • Discover how to remotely configure each instance's Diagnostics configuration.
  • Listen as we discuss debugging versus monitoring along with which techniques to use in Windows Azure. 

Show Links:
Programming Windows Azure
Developing and Deploying with SQL Azure whitepaper
Windows Azure Architecture Guidance, Part 1
Announcing Windows Azure CDN Pricing
Web Role Crash Dumps

Shamelle’s Windows Azure SDK: Porting Code From July CTP to Windows Azure SDK v1.0 November 2009 post of 6/4/2010 is an illustrated tutorial for moving code from the July 2009 CTP to the v1.0 SDK released in November 2009:

There have been several occasions where I downloaded some sample Windows Azure code and realized that the code was written against the Windows Azure July CTP. The later versions of the Windows Azure SDK are not completely backward compatible with previous CTP releases. Initially, I spent a fair amount of time trying to port such code to Windows Azure SDK v1.0 (the latest is Windows Azure SDK v1.1). Lately, I’ve ported Azure code far too many times, and thought of putting together a blog post.

If this is the first time you are trying to port from Windows Azure CTP to Windows Azure SDK v1.0, this post will save you some time.

Shamelle describes workarounds for the following build errors:

  1. The type or namespace name ‘ServiceHosting’ does not exist in the namespace ‘Microsoft’ (are you missing an assembly reference?)
  2. The name ‘RoleManager’ does not exist in the current context
  3. The type or namespace name ‘RoleException’ could not be found (are you missing a using directive or an assembly reference?)
  4. ‘System.Configuration.ConfigurationSettings.AppSettings’ is obsolete
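
Broadly, most of the fixes are namespace and class renames: the July CTP’s Microsoft.ServiceHosting.ServiceRuntime namespace and its RoleManager class became Microsoft.WindowsAzure.ServiceRuntime and RoleEnvironment in SDK v1.0, and the obsolete System.Configuration.ConfigurationSettings.AppSettings property gives way to ConfigurationManager.AppSettings.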

Jim Nakashima wrote a Hope to see you at TechEd post on 6/4/2010:

If you are attending TechEd 2010 next week, I really hope to see you there.  I’ll be at the Windows Azure booth and speaking on Wednesday June 9th at 5:00 in Room 356. 

I’ll be speaking about the end to end development experience for Windows Azure and have about 3-4 cool new things to show that I’m really excited about. 

COS307 | Using Microsoft Visual Studio 2010 to Build Applications That Run on Windows Azure

A platform is only as powerful as the tools that let you build applications for it. This session focuses on using demos, not slides, to show the best way to use Visual Studio 2010 to develop Windows Azure applications. Learn tips, tricks and solutions to common problems when creating or moving an existing application to run on Windows Azure. Come see how Visual Studio 2010 supports all parts of the development cycle as we show how to take an ASP.NET application running on IIS and make it a scalable cloud application running on Windows Azure.

If you can’t make it, stay tuned, I’ll be posting links to all of the Windows Azure videos and I have some blog posts coming on those cool new things :)

The Windows Azure Team’s Real World Windows Azure: Interview with Paddy Srinivasan, CEO at Cumulux post of 6/3/2010 adds another case study:

As part of the Real World Windows Azure series, we talked to Paddy Srinivasan, CEO at Cumulux, about ManageAxis, the company's monitoring, management, and deployment solution for customers who use the Windows Azure platform for delivering cloud-based applications.

MSDN: Tell us about Cumulux and the services you offer.

Srinivasan: Cumulux is a leading provider of cloud-computing products and services to Fortune 100 enterprises and major independent software vendors. Cumulux is led by Microsoft and industry veterans and was recently named one of the top 150 cloud-computing companies by the Cloud Computing Journal.  Our newest product, ManageAxis, enables robust monitoring, management, and compliance of enterprise-class applications that run on the Windows Azure platform.

MSDN: What are the biggest challenges that ManageAxis is solving for Windows Azure customers?

Srinivasan: Businesses that are developing and deploying cloud applications on the Windows Azure platform face several challenges: 

  • Management and governance: Cloud applications have to be managed efficiently and proactively to control costs. Tight policy-based governance controls over who has access to cloud assets and periodic and on-demand logging and reporting are essential for managing mission-critical cloud deployments.
  • Operational visibility and compliance: Operational and compliance regulations mean that businesses need to monitor and track critical application metrics like performance, security, and geo-location of services. However, many cloud-based applications are black boxes that provide little visibility into performance or other operational metrics.
  • Application life-cycle management: Businesses that adopt the Windows Azure platform need to manage the application life cycle, including managing multiple versions and automating development, testing, and staging configurations for their cloud-based applications.

MSDN: Can you describe how ManageAxis addresses the monitoring, management, and application life-cycle needs of customers using Windows Azure?

Srinivasan: ManageAxis addresses these challenges through the following functionalities:

  • Monitoring: ManageAxis helps customers monitor application performance, storage, and transaction metrics; monitor business key performance indicators (KPIs), service level agreements, and compliance metrics; and monitor operational costs of the application on Windows Azure.
  • Management: ManageAxis enables customers to establish and implement policies for dynamic scaling based on forecasted peak loads or unplanned bursts in traffic. It also gives them the ability to create roles-based compliance workflows and provides reporting on critical business and performance KPIs.
  • Deployment and application life-cycle management: Through ManageAxis, customers can manage multiple deployment configurations through the application life cycle and support rapid deployment with single-click cloning of solution topologies. They also gain roles-based access to manage the deployed version of the application. …

The interview continues with the standard “What makes your solution unique?” and “What kinds of benefits are you realizing with Windows Azure?” closing Q&As.

Karen Forster asks Really? Business Intelligence on SQL Azure? and reports CloudOlap’s port to Windows Azure in this 6/3/2010 post:

Microsoft's SQL Azure cloud platform does not yet provide Business Intelligence (BI) functionality, but a smart developer was bound to fill that gap with a third-party offering. Enter Kevin Ashley, founder of CloudOlap, a BI solution originally built for Amazon's EC2 cloud environment. Seeing an opportunity to expand his business, Ashley is porting his BI solution to Microsoft's Azure platform.

You can listen to Ashley's reasons for moving to Azure by clicking here: TechNet Radio: Business Intelligence in the Cloud. In this 00:23:21 podcast, Ashley and I discuss business cases for cloud computing in the financial industry, and he gives examples from investment banks and hedge funds.  Ashley has some interesting insights into BI implementations and how to make them successful.

In addition, Ashley compares the Amazon EC2 environment with Microsoft's Azure platform. Ashley's knowledge of BI and the cloud provide unique perspectives to help answer questions and clear up confusion.

TechNet describes the podcast as follows:

Kevin Ashley, founder of CloudOlap, a cloud-based BI solution, chats with Karen Forster from PlatformVision.com about Business Intelligence (BI) in the cloud and specifically addresses business cases in the financial industry, giving examples from investment banks and hedge funds. Quoting a Gartner study, Ashley notes that 60 percent of BI projects fail to come in on time or on budget, and he provides insights into how the Microsoft Windows Azure platform can improve BI implementations. Ashley explains why his company is moving its cloud offerings to Microsoft Windows Azure and compares it to the Amazon EC2 cloud environment.

Gunther Lenz reminds developers that Outback Steakhouse uses CloudPoll on Facebook, powered by Windows Azure in this 6/2/2010 post to the US ISV Evangelism blog:

Outback Steakhouse started a series of polls on Facebook to learn more about the demographics, preferences, and habits of their fans. CloudPoll gives them a fully featured, easy-to-use, free Facebook application to do just that. CloudPoll provides the scalability needed for Outback to handle potentially rapid growth of the poll on Facebook.

Check the application out at http://bit.ly/brzlsZ and you can also create and manage your own polls with the scalability of a Windows Azure based cloud solution. Best of all: It is provided by Microsoft DPE free of charge for you!

CloudPoll is built on the Windows Azure Toolkit for Facebook, which you can download as a jump start for your Windows Azure based solution at http://bit.ly/azurfb.

You can also find the CloudPoll Facebook application here: http://apps.facebook.com/cloudpoll/Home/Public

Gunther continues with instructions for trying the Facebook app. My poll is What's Your Opinion of the Cost of Using Windows Azure for Cloud Computing?

<Return to section navigation list> 

Windows Azure Infrastructure

Mary Jo Foley reported Microsoft passes the 10,000 customer milestone with Azure in this 6/4/2010 post to her All About Microsoft ZDNet blog:

Microsoft now has more than 10,000 customers (each with an unspecified number of users) using its Windows Azure cloud environment.

That new milestone was mentioned by Doug Hauger, General Manager of Windows Azure, during his appearance on June 3 at the Cowen & Co. Tech Conference. (I listened to him via the Webcast.)

Hauger shared some other new data and statistics. There are four main workloads customers are running on Windows Azure — Microsoft’s cloud operating environment, which became commercially available in February this year. The four: on/off batch job computing; quick start-up (with no need or money to build out a private data center); unpredictable bursting; and predictable bursting.

Hauger said Microsoft was surprised to find that 50 percent of customers using Azure are running their applications in steady state — in other words, as replacements for on-site/on-premises software.

“Adoption has been very good,” Hauger told attendees of the conference, noting that the customer base included smaller independent software vendors and developers, but also a number of enterprise customers. He said that once business customers took the time to actually assess the regulatory requirements for particular applications/verticals, they discovered that they weren’t as stringent as they believed. Consequently, certain customers were moving their line-of-business applications to Azure, given they “only” needed FAST or ISO certification, he said.

At the same time, there are lots of casual, Facebook-type games being built on the Azure platform, he said. There also are a number of high-performance applications using Azure as a “landing place,” he said.

Hauger said the growth Microsoft is seeing in Azure isn’t at the expense of Windows Server. He said it’s been a case of incremental growth, rather than cannibalization — “net additive for them (Microsoft’s customers) and us.” …

John Brodkin stated “Analysts hope for more details on Windows Azure” as a preface to his Microsoft TechEd event to shed light on cloud computing plans post of 6/3/2010 to NetworkWorld’s Data Center blog:

image Microsoft will shed more light on its cloud computing strategy next week at the annual TechEd conference in New Orleans.

While Microsoft's specific TechEd announcements haven't yet been revealed, some analysts are hoping for details about Windows Azure, Microsoft's answer to Amazon's popular EC2 cloud platform.

There are any number of cloud topics Microsoft could address at TechEd, from its struggle to wrench momentum away from Google Apps, to security of the cloud, to licensing the use of Windows in cloud services.

But Microsoft's strategy around Azure seems to be less well defined than its other cloud ventures, and therefore may receive a bigger focus at TechEd, analysts say.

"I expect them to make a major move" regarding Azure, says Burton Group analyst Drue Reeves, who believes Microsoft has to walk a fine line with Azure, which delivers a cloud-based operating system, relational database and several other services.

Azure potentially poses a conflict of interest for Microsoft, he says. Microsoft wants partners to use the Hyper-V virtualization technology and .Net software framework to build cloud services, but the market presence of Azure might dissuade cloud providers from using those Microsoft technologies, Reeves says.

"If you're a new provider, are you going to use Hyper-V, or are you going to use .Net and offer that as a service and compete with Azure? Not likely," Reeves says. Microsoft is "providing the enabling technologies for cloud providers and selling against them at the same time."

Azure exited beta and went into general availability on Feb. 1 of this year.

But Microsoft has been relatively quiet overall about the cloud service, analysts say. "I have a list of questions to ask [Microsoft] about Azure," says Pund-IT analyst Charles King.

In addition to the cloud-based operating system and SQL database, Azure includes a content delivery network. The CDN was released into beta in November 2009 and has been available to users for free since then. Microsoft has just recently announced that it will charge $0.15 per GB for data transfers starting on June 30.

King says he wants to know what other Azure services will be rolled out, and what kind of interest Azure is receiving from customers so far. While most big IT vendors are focusing on enterprises and service providers, King says Microsoft may be ideally positioned to market its cloud service to small- and medium-sized businesses.

"Microsoft is the de-facto vendor of choice for most small businesses," King says. "I think small- and medium-sized businesses are in a position to really gain some interesting benefits from the cloud."

Reeves says Azure seems to be in flux, with Microsoft still "trying to figure out whether it's platform-as-a-service or infrastructure-as-a-service."

Cloud platforms allow developers to build and deploy web applications without any internal hardware and software, while infrastructure services deliver raw computing and storage capacity to customers.

Microsoft still needs to show that Azure has real, live customers, define proper use cases and say "'this is what it's good for' so they can position themselves in the market," Reeves says. In competing against Amazon, "It's not a question of technology. It's a question of their business model," he says.

A Microsoft spokesperson says the TechEd event "will focus on the cloud strategy for enterprise customers and developers," without revealing other details. With a topic as broad as cloud computing, that leaves open the possibility of several areas Microsoft may choose to focus on.

For example, Azure senior architect Hasan Alkhatib said last December that Microsoft was working on a new security structure for multi-tenant cloud environments, and private cloud software based on the same technology used in Azure.

Microsoft could discuss its vision for using its System Center Virtual Machine Manager, Hyper-V, and Windows Server together as the infrastructure for building private clouds, as well as how customers can create hybrid clouds that allow workloads to shift from internal data centers to external Windows-based cloud services.

Microsoft could also shed more light on Office Web Apps and Business Productivity Online Standard Suite (BPOS), the Web-based versions of its Microsoft Office software products. BPOS is already available, and the consumer-flavored Office Web Apps will go live on June 15.

But Microsoft has still not said exactly when these Web-based tools will be upgraded to the 2010 versions of Office, SharePoint and Exchange. They currently run on the 2007 versions.

Microsoft's cloud strategy wouldn't be complete without a revamp of its licensing policies. Reeves says Microsoft has done a good job with its service provider licenses, which, for example, lets customers run Windows and SQL Server on the Amazon cloud.

But Microsoft has more work to do with service providers to ensure that customers feel comfortable running Windows in numerous types of cloud services, Reeves says.

I disagree with Reeves when he says “Azure seems to be in flux, with Microsoft still ‘trying to figure out whether it's platform-as-a-service or infrastructure-as-a-service.’” Azure always has been and, as far as I can see, will be a Windows-based PaaS, not a generic IaaS play.

Dan Nystedt reported Microsoft Opens Cloud Computing Center in Taiwan in his 6/3/2010 article for PCWorld:

Microsoft opened a joint cloud computing center with Taiwan's economics ministry on Thursday at the Computex electronics show, and announced a plan to work with two local companies on new designs for servers meant specifically for cloud computing, the growing trend towards decentralized, virtualized computing services.

When the project was first announced last November, officials said it would be a first for Microsoft in Asia. Now, it's clear the new center, which Microsoft calls a Software and Services Excellence Center, will be much more than first thought.

Microsoft, which has worked with Taiwanese companies for 20 years, will license patents from its technology portfolio and share its software expertise with companies, academia and research institutes in Taiwan to develop connected devices and cloud data centers, the company said in a statement.

One initiative announced Thursday was a partnership between Microsoft and the two biggest laptop manufacturers in the world, Quanta Computer and Compal Electronics of Taiwan. The three companies plan to develop a new generation of servers designed for cloud computing.

The cloud servers would fit another idea Microsoft has talked up in recent years, data centers built inside 20-foot (6.1-meter) shipping containers. Servers are currently built for traditional data centers, but Microsoft has asked companies to design new ones for containerized data centers.

Taiwan will need such servers for an initiative to build complete containerized data centers that was announced Wednesday at Computex by Taiwan's biggest publicly funded research group, the Industrial Technology Research Institute (ITRI). The research organization is working to halve the cost of building containerized data centers by using standardized computing components and a set stack of software. Containerized data centers can cost millions of dollars, so the project would have a big impact on lowering the cost of building new data centers.

"Cloud computing services are a strategic industry that the government is promoting," said Wu Ming-ji, a director general at Taiwan's economics ministry, adding that Taiwanese companies will be able to take advantage of "the most advanced software technologies as well as cloud data center implementation experience from Microsoft," through the partnership.

Microsoft and the two laptop makers plan to have prototypes of the new cloud computing servers available in the fall. The first prototype containerized data center from Taiwan is due at the end of this year.

The Voices for Innovation (VfI) blog added Cross-Post: The Role of IT in the Era of Cloud Computing on 6/4/2010:

The post below is from Patrick O'Rourke, Director of Microsoft's Server & Tools Business, and it originally appeared on The Microsoft Blog. What stands out for VFI members is the focus on the changing role and opportunities of IT departments and professionals. IT already plays a critical role in enterprises -- and it will play an even more vital role going forward, according to O'Rourke and Bob Muglia, the President of the Server & Tools Business. Here's the post...

The Role of IT in the Era of Cloud Computing

Without question, cloud computing continues to be the hot topic of discussion in information technology circles.  Vendors, customers and industry observers are all weighing in on the opportunities and challenges posed by the cloud. 

Many IT professionals are, quite reasonably, asking questions such as: “What does my job look like as the tech industry and my company move toward cloud computing?”

Of course, IT managers know that their roles never stop evolving. New technologies and business demands arise every day.  Business leaders and employees always want more from IT.  And yes, cloud computing will accelerate that evolution.  As Bob Muglia, president of our Server and Tools Business says in this video, the cloud is “a world-class, dramatic shift.”

The good news, as Muglia points out, is that the shift presents great opportunities for IT to contribute more to their organization’s bottom line.  The cloud will help IT more quickly deploy new capabilities – applications, services, access – that will enable business to happen more efficiently and effectively.

Simply put, the cloud can help IT do more and be more important than ever.  With the cloud, IT managers can help the business bring a new service to market faster than the competition.  Or, it can help IT take advantage of extra computing horsepower to meet seasonal demand without acquiring new hardware and software.  And IT can use the cloud to offload the management and delivery of traditional applications, such as email, in order to devote more time and resources to implementing new, strategic solutions.

For example, Siemens IT Solutions and Services uses the Microsoft Windows Azure platform to distribute software to thousands of Siemens devices around the world – enhancing services and avoiding significant new capital investment. In another example, Kelley Blue Book runs its high-traffic automotive Web site on the Windows Azure platform, saving $100,000 annually and freeing up IT resources for other projects.

It’s an exciting time to be in IT, with cloud computing providing much more efficient hardware, much faster application deployment, and lower operational costs.  And at Microsoft we’re focused on helping IT managers “mind the gap,” as Muglia says, to use their existing expertise and systems to bridge the current, on-premises world of IT with the cloud.

Visit Cloud Computing: A Guide for IT Leaders to view more videos and content providing guidance about cloud computing. Also, Muglia will be talking more about the opportunities of cloud computing at next week’s Tech Ed 2010 conference, taking place in New Orleans.

David Linthicum posits “Private clouds do let IT maintain desired control, but they are not always the right choice” in his Why private clouds are surging: It's the control, stupid! post of 6/3/2010 to InfoWorld’s Cloud Computing blog:

An article by Steve Rosenbush, "Private cloud computing takes off in companies not keen on sharing," indicates that the interest in private cloud computing is outpacing interest in public cloud computing: "For now, though, big companies are going to spend a lot of money building their own private clouds because the comfort level with public clouds isn't high enough. 'Can we actually have a public cloud that can guarantee certain service-level agreements, certain tiers of service?' wonders Greg McCall, an analyst with Sage Asset Management in New York. The answer is: Maybe soon, but not quite yet."

The fact of the matter is that public clouds do a pretty good job at providing storage, applications, and compute services on demand. While the myth is that private cloud computing providers "guarantee certain service-level agreements," the reality is that "private cloud" is just another term for on-premise systems, and on-premise systems typically have an uptime record a bit south of typical public clouds.

That said, it might not matter to those in IT who want to maintain control. If it's not control, it's security; if it's not security, it's cost -- those in IT pushing back against public clouds have a well-rehearsed set of excuses. Of course, the tech press has not helped much by publicizing the cloud computing outages on their front pages.

The trouble with all this is the IT naysayers are right some of the time. However, the lack of thinking around the core requirements of IT -- in favor of a dumb argument around private versus public clouds -- leads many to select the wrong platform for the wrong reasons. Not considering public clouds means you could be missing out on a very compelling architectural option.

Within most enterprises, there are solutions that clearly should be in public clouds, solutions that should be in private clouds, and many IT components that should stay where they are for now (not in a cloud at all). When you don't consider all options, your solution won't be optimal, and within many Global 2000 companies that translates into millions of dollars wasted per month.

That's OK: You are able to maintain control. But how much is that control worth to the business?

InformationWeek Analytics presents Government Cloud Platform Strategy, the second white paper in its Strategy: Cloud Platforms series:

Government Cloud Platform Strategy
In this report, the second in a four-part series on government clouds, we analyze the IT infrastructure considerations that government IT pros must take into account as they look to implement cloud services in federal data centers. We explore the software and hardware environment and management tools that you need to be successful in the cloud, and assess the role that APIs play in supporting hybrid scenarios that tap into public cloud services.

There are three broad types of cloud services: software as a service, platform as a service and infrastructure as a service. In the same way that cloud service providers do business in the commercial market, government agencies are beginning to offer such services to internal users and other government agencies.

When we asked government IT pros which factors are most important in considering deployment of private clouds, cost savings and the ability to meet user demands and achieve scale quickly emerged as top factors. Many see the private cloud model as being more secure than public cloud services, since government IT departments have control over all aspects of the environment. Agencies like DISA offer cloud services that have already been vetted to government security standards. In March 2010, the U.S. government introduced a pilot program called FedRAMP that’s intended to streamline certification and accreditation processes. Government IT organizations must also evaluate how to manage and monitor cloud services. They should expect to see advances aimed at increasing performance in server environments. Government IT pros rank servers, storage, security, networking and virtualization as the leading core technologies that factor into their private cloud plans. Bandwidth and network connectivity are critical to ensuring that users can effectively access services in the cloud.

Utility-based, on-demand computing will be the model of the future. To enable this model, government IT organizations need to ensure that their platform strategy aligns to this vision.

Download

Private Clouds: Powerful Convergence or False Promise? is the title of a June 2010 InformationWeek Cloud Computing Brief:

IT leaders rightfully debate how extensively they should embrace cloud computing techniques inside their data centers, and what kind of benefits they'll get. But make no mistake: Private clouds do offer a new and powerful data center strategy.

Is the private cloud's promise of a more automated and scalable internal data center, using highly virtualized server and storage resources, all that revolutionary? There's a lot of haziness around this terminology as vendors rebrand just about any data center service as cloud, and they try to link the real savings from cloud environments to their specific offerings.

In this report, we will sweep away that haziness and illuminate the critical issues around private clouds. Make no mistake: The "private cloud" is a new and powerful data center strategy.

Download

<Return to section navigation list> 

Cloud Security and Governance

Cumulux will present a Governance & Compliance of Windows Azure Applications (June 14th and 28th) webinar:

Tight policy-based governance over who has access to the various cloud assets, plus periodic logging and reporting, are essential for managing cloud deployments. Operational and compliance regulations require businesses to monitor and track critical application metrics like performance, security, and geo-location of services. In this webinar, we will explore the principles behind governance of cloud-based applications and then dive deep into the following topics:

  1. Role based access for tighter policy based governance
  2. Change Management using Alerts/Notifications
  3. Audit Logging for cloud lifecycle compliance

How do I attend: Find the day and time that works best for you, then click to register. Everything you need will be emailed prior to the start of the training.

See the Windows Azure Team’s Real World Windows Azure: Interview with Paddy Srinivasan, CEO at Cumulux post of 6/3/2010 in the Live Windows Azure Apps, APIs, Tools and Test Harnesses section above.

<Return to section navigation list> 

Cloud Computing Events

See the Cumulux Governance & Compliance of Windows Azure Applications (June 14th and 28th) webinar directly above.

Matias Woloski announced Cloud Life Science Events in New Jersey and Boston, June 8th and 15th on 6/4/2010:

During the next couple of weeks, Southworks will be presenting, together with a Fortune 500 pharmaceutical company, a project that we’ve developed during the last couple of months around Claims Based Federated Identity and the Cloud. Hong Choing and Ben Flock from Microsoft DPE are hosting the events in New Jersey and Boston and kindly invited us to share with other organizations from the Life Science industry the work we’ve done together.

We will be presenting 3 different scenarios and how we approached them using Federated Identity (ADFS and Windows Identity Foundation) and Cloud Computing (Windows Azure and Amazon EC2). We will talk about the architecture behind them, involving an ADFS instance acting as a Federation Hub, the notion of different levels of trust/assurance, and the inclusion of social identity providers like Facebook, Yahoo, LiveId, Twitter, etc.

The solution shows:

  • A web site hosted on Windows Azure that is something like a “Federated SkyDrive,” where a user can assign cross-organization permissions based on email, group and organization claims.
  • Organizations plugged to the hub using identity providers like ADFS, CA SiteMinder or PingFederate
  • Other organizations plugged to the hub using social identity providers like Facebook, Yahoo, Google, Twitter, LiveID
  • Different levels of trust depending on the identity provider that issued the token
  • Multiple cloud computing providers like Amazon EC2 hosting an ADFS v2 and Windows Azure hosting the website

The scenario and architecture used are similar to the ones we described in the Federation with Multiple Partners chapter of the Claims-Based Identity and Access Control guide from patterns & practices. The guide was key to helping some of the stakeholders understand the concepts and artifacts of the solution.

With the advent of the cloud and the need for organizations to collaborate quickly, securely, and cost-effectively, these kinds of concepts and architectures should become the de facto solution. Looking forward to that future!

OpSource and SIIA present Cloud Thought Leadership Videos, brief outtakes from the All About the Cloud conference held on 5/10 – 5/12/2010 at the Santa Clara Convention Center in Santa Clara, CA.

Following are the videos available as of 6/4/2010:

Eileen Boerger, Agilis Solutions | Video
Elliot Curtis, Microsoft | Video
Mike Dunham, Scio Consulting | Video
Mike Flanagan, Less Software | Video
Jon Kondo, Host Analytics | Video
Anita Moorthy, Novell, Inc. | Video
Rick Nucci, Boomi | Video
Ron Papas, Informatica | Video

Simon Peel, Cast Iron Systems | Video
Pamela Roussos, AppFirst | Video
Treb Ryan, OpSource | Video
Narinder Singh, Appirio | Video
Tayloe Stansbury, Intuit | Video
Phil Wainewright, Procullux Ventures | Video
Maynard Webb, LiveOps | Video

Keynotes:

Principles for Creating and Driving Value in the Cloud | Video | Slides
Tayloe Stansbury, SVP & Chief Technology Officer, Intuit

Impact of Cloud Innovation | Video | Slides
Doug Hauger, General Manager, Windows Azure, Microsoft

Companies Need More Than Servers: How and Why the Ecosystem is Critical to Cloud Success | Video | Slides
Treb Ryan, CEO, OpSource, Inc.

Cloud Revolution: Beyond Computing | Video | Slides
Maynard Webb, Chairman & CEO, LiveOps

Panels:

Private Clouds | Video
Phil Wainewright, Director, Procullux Ventures (Moderator)
Zorawar Biri Singh, VP, IBM Enterprise Initiatives, Cloud Computing, IBM
Brian Byun, VP & GM, Cloud, VMware, Inc.
Jeff Deacon, Managing Director, Cloud Services, Verizon
Rebecca Lawson, Enterprise Business Marketing, Hewlett Packard

Public Clouds | Video
Jeffrey Kaplan, Managing Director, THINKstrategies (Moderator)
Scott McMullan, Google Apps Partner Lead, Google Enterprise
Jim Mohler, Sr. Director, Product Development, NTT America
Steve Riley, Sr. Technical Program Manager, Amazon Web Services
John Rowell, Co-Founder & Chief Technology Officer, OpSource, Inc.
Matt Thompson, General Manager, Developer and Platform Evangelism, Microsoft

Chris Czarnecki reported about Learning Tree’s Cloud Computing Technologies: A Comprehensive Hands-On Introduction course at Rockville, MD in Cloud Computing Course Day 2 in this 6/4/2010 post:

Day two began with comprehensive coverage of SaaS and some more hands-on work. We then moved on to investigating software plus services. A key feature of many cloud offerings is that they expose their functionality via Web services, in either REST or SOAP style. Attendees spent some time analysing these service contracts; examples included the Amazon EC2 WSDL contract.
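
As a rough idea of what that contract-analysis exercise involves, the C# sketch below downloads the EC2 WSDL and lists the operations it declares. The versioned URL follows Amazon’s published pattern, but the exact version date used here is an assumption.

using System;
using System.Linq;
using System.Net;
using System.Xml.Linq;

class WsdlOperationsSketch
{
    static void Main()
    {
        // EC2 WSDLs are published at versioned URLs; this date is an
        // assumed example of that pattern.
        var url = "http://ec2.amazonaws.com/doc/2010-06-15/AmazonEC2.wsdl";

        string wsdl;
        using (var client = new WebClient())
            wsdl = client.DownloadString(url);

        // Each SOAP operation is declared once inside the portType element.
        XNamespace w = "http://schemas.xmlsoap.org/wsdl/";
        var operations = XDocument.Parse(wsdl)
                                  .Descendants(w + "portType")
                                  .Elements(w + "operation")
                                  .Select(o => (string)o.Attribute("name"))
                                  .Distinct();

        foreach (var name in operations)
            Console.WriteLine(name);   // e.g. RunInstances, DescribeInstances
    }
}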

Moving through the cloud architecture, PaaS was next. The offerings from Microsoft with Azure, Google with App Engine, and Force.com were examined in detail. Attendees undertook hands-on exercises working with the Azure and App Engine toolsets, seeing how applications are developed and deployed to the cloud. A demonstration by me of developing an application with Force.com illustrated a different approach to application development in the cloud – one that requires no programming skills!

The clean application development cycle supported by PaaS offerings, from building through to cloud deployment, was appreciated by attendees. An interesting question that arose was what level of control a user has over tuning operating system and server parameters when working with PaaS. It is this type of question that attending the course helps to answer. Exposure to a wide range of cloud technologies in a focused, expert-led environment really equips attendees with the skill set to make informed decisions on the appropriate way of implementing cloud computing for their organisations.

Looking forward to day 3.

Chris is the course presenter. Here’s the day 1 report of 6/3/2010: Cloud Computing Course Under Way.

Cory Fowler wrote Clouds as far as the eye can see about Prairie DevCon 2010 and his plans for TechEd NA 2010 on 6/4/2010:

I’m writing to you from my hotel room in Regina, Saskatchewan, where this week I attended and presented at the first-ever PrairieDevCon. Saskatchewan, normally overlooked for technology events, showed its support: the conference drew nearly 100 attendees and featured 50 sessions by 25 industry experts over two days.

I contributed to this incredible event with two sessions, both pertaining to the Microsoft cloud. First up was Taking it to the Cloud, which pulled in a crowd of 10-15 attendees. The code that was presented has been released as an open source project on CodePlex called Azure Email Queuer [a refactored Visual Basic version is also available]. My second session, entitled Making your data Rain from the Clouds, attracted between 7 and 12 attendees. The code samples demonstrated there are also available on CodePlex: download ASP.NET SQL Azure Connection or Ruby SQL Azure Connection to see how to leverage WCF Web Services that expose data using the OData Protocol. …
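
For context on what “expose data using the OData Protocol” looks like from the client side, here is a minimal C# sketch that queries a hypothetical OData endpoint of the kind a WCF Data Service publishes. The service URL and entity set are assumptions, not taken from Cory’s CodePlex samples.

using System;
using System.Net;
using System.Xml.Linq;

class ODataClientSketch
{
    static void Main()
    {
        // OData exposes entity sets at addressable URLs; $filter and $top
        // are standard query options. This endpoint is hypothetical.
        var url = "http://example.com/DataService.svc/Customers" +
                  "?$filter=Country%20eq%20'Canada'&$top=5";

        string atom;
        using (var client = new WebClient())
            atom = client.DownloadString(url);   // results return as an Atom feed

        // Each <entry> element in the feed is one entity.
        XNamespace a = "http://www.w3.org/2005/Atom";
        foreach (var entry in XDocument.Parse(atom).Descendants(a + "entry"))
            Console.WriteLine((string)entry.Element(a + "title"));
    }
}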

What’s Next

Next week, to finish off my three-week conference tour, I’ll be in New Orleans for TechEd North America. I’m looking forward to learning more advanced topics on Windows Azure and SQL Azure, and to getting my feet wet with Windows Phone 7. Besides learning advanced topics, I’ll also be trying to meet some of my programming idols who will be in attendance; I look forward to meeting Craig Shoemaker from Infragistics, Steve Marx from Microsoft, and Shawn Wildermuth of AgiliTrain.

If you see me at the event*, please feel free to come introduce yourself. I look forward to making many new friends there.

I will also attempt to do some Live Blogging of some of the sessions so be sure to keep checking back for some interesting news from New Orleans.

*Added Cory’s mug shot so you can recognize him at TechEd.

Steve Nagy reported CloudCamp Brisbane – Just Around the Corner! in this 6/3/2010 post:

On 8 June 2010, Brisbane will have its first-ever CloudCamp event. I’m very excited about this. I’ve had my head in the Azure space for so long that I’ve somewhat neglected what everyone else is doing.

Register for CloudCamp Brisbane here.

About

Here’s the line direct from the website:

CloudCamp is an unconference where early adopters of Cloud Computing technologies exchange ideas. With the rapid change occurring in the industry, we need a place where we can meet to share our experiences, challenges and solutions. At CloudCamp, you are encouraged to share your thoughts in several open discussions, as we strive for the advancement of Cloud Computing. End users, IT professionals and vendors are all encouraged to participate.

And the tentative schedule:

  • Registration & Networking
  • Welcome and Thank yous
  • Lightning Talks (5 minutes each)
  • Unpanel
  • Begin Unconference (organize the unconference)
  • Unconference Session 1
  • Unconference Session 2
  • Wrap-up
  • Networking in conjunction with Zendesk meetup

Location

The event will be held at Griffith University’s Nathan Campus. Three rooms have been allocated to us (for free, thanks to the School of Computing and Information Technology).

Click here for a map.

As you can see from the map, the University is right in the middle of Toohey Forest.

Steve continues with transport recommendations.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Paul Krill claimed "Users needing enterprise-grade infrastructure can now deploy on Terremark's cloud” in his Engine Yard expands Rails apps cloud services post of 6/3/2010 to InfoWorld’s Cloud Computing blog:

Engine Yard will extend its cloud-based services for Ruby on Rails applications on Thursday, giving users the option to deploy on Terremark infrastructure.

The company's xCloud service enables deployments of Rails applications onto Terremark. Engine Yard already offers Rails application hosting on the Amazon cloud via the Engine Yard AppCloud service.

“[xCloud is] an expansion of our PaaS (platform as a service),” said Jim Shissler, Engine Yard marketing director.

The xCloud service uses infrastructure-as-a-service capacity from Terremark Enterprise Cloud, offering SAN storage, guaranteed resources, hardware support, and compliance guarantees, including SAS70 and PCI, Engine Yard said.

Terremark differs from Amazon in that it can, for example, let users add components such as security infrastructure. Capabilities for high I/O requirements and for maintaining a high application load on a database are featured, Shissler said. Amazon does not support these capabilities at the same level as Terremark, he said.

"Basically, if you've got a Rails application that requires enterprise-grade infrastructure, this is going to be a great product for you," Shissler said. …

Thomas Claburn reported “Months of rapid growth have left Google App Engine unable to scale to meet demand” in his Google App Engine Troubles Prompt Fee Suspension article of 6/3/2010 for InformationWeek:

Google App Engine, Google's on-demand computing infrastructure service, has been growing at a rate of 25% every two months for the past six months, and that has become a problem: Google has not adequately planned for App Engine's success.

Despite App Engine's increased computing footprint and its expansion across multiple Google data centers, the cloud computing service's central database, Datastore, experienced three service interruptions last month, one of which lasted 45 minutes, and is experiencing high latency again on Thursday.

The situation has become serious enough that Google has decided to stop charging for Datastore CPU usage. Google says that Datastore latency since April 1 has been about 2.5 times higher than normal.

"We want you to know we are taking the current problems with the Datastore very seriously," Google said in a blog post. "We have put other Datastore projects on hold to dedicate more people to accelerating improvements to Datastore performance, and to our datacenter configuration. We expect the Datastore may still have a few issues for the next two weeks, as we respond to the problem."

As of Wednesday, Google said it would stop charging for Datastore CPU costs until performance returned to satisfactory levels. This will be reflected starting with the May 31 bill.

The company said it will notify users via the App Engine blog seven days before it begins charging again. It also advised users not to set their Datastore CPU budgets to $0, since that would effectively prevent applications from running.

App Engine users have sought clarification from Google about its plans to address the problem.

"I would personally like more information on these Datastore growing pains," said App Engine developer Bill Edwards in a Google Groups post on Wednesday. "As a startup that has been strongly considering building solely on Google App Engine, we are very, very concerned with these instances of failure. ...If we can't depend 99.9% on GAE right now, that is fine. We will come back when you are ready. But, as a startup building a business application, we can't risk these sorts of downtimes."

Google knows it has to do better. Last month it launched a technology preview of App Engine For Business, which specifically promises 99.9% uptime though its Service Level Agreement.

App Engine for Business’s SLA sounds like wishful thinking to me.

Mikael Ricknäs prefaced his Red Hat's CEO: Clouds can become the mother of all lock-ins post of 6/2/2010 to NetworkWorld’s Data Center blog with “A certification program that lets companies move workloads will help, James Whitehurst said:”

Cloud architecture has to be defined in a way that allows applications to move around, or clouds can become the mother of all lock-ins, warned Red Hat's CEO James Whitehurst.

Once users get stuck in something, it's hard for them to move, Whitehurst said in an interview. The industry has to get in front of the cloud wave and make sure this next generation infrastructure is defined in a way that's friendly to customers, rather than to IT vendors, according to Whitehurst.

Lock-in comes in many different guises, including the inability to move workloads among different clouds, the difficulty of extracting data from the cloud and being forced to use the underlying virtualization platform chosen by the cloud provider.

Red Hat is focusing much of its efforts on the first of these potential issues. Certifying cloud partners is the most important thing Red Hat has been working on this year, according to Whitehurst. Making sure workloads are mobile in the new cloud-based environment is critical, Whitehurst said, and that is what its Premier Cloud Provider Program is about.

"Our customers can run the workload in their data center or migrate it to multiple cloud providers... and we'll support you on it and your ISVs will support you on it," said Whitehurst.

The cloud certification program was announced last year, and Amazon Web Services was the first cloud provider to get certified. Since then, NTT and IBM have been added to the list of certified partners and more are on the way, according to Whitehurst.

For a cloud provider to be certified it has to use a virtualization platform based on VMware's ESX hypervisor, Microsoft's Hyper-V or Red Hat's own hypervisor, which is based on KVM (Kernel-based Virtual Machine).

To be able to move a workload from a data center to a cloud or between two clouds, a connecting API (application programming interface) is needed, and a plethora of different ones is being developed. Fewer would be better, according to Whitehurst. However, the real challenge isn't the API, but ensuring that the application will run with the same performance after it has been moved. That is what Red Hat is focusing on. Getting an API in place that allows a workload to be moved is only 10 percent of the work, Whitehurst said.

The next step is also being able to move licenses along with the workloads, according to Whitehurst. In April, Red Hat announced Cloud Access, which will let enterprises use their subscriptions to support either traditional on-premise servers or servers hosted on Amazon's Elastic Compute Cloud.

Red Hat isn't the only company that wants to make it possible for companies to move their workloads among data centers and clouds. For example, VMware is developing the vCloud Service Director, previously code named Project Redwood. However, the tool is still being beta tested, and the plan is to ship it before the end of the year, according to Richard Garsthagen, senior evangelist at VMware in Europe, the Middle East and Africa.

<Return to section navigation list> 
