Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.
• Update 12/19/2009: Upperside Conferences: Call for Papers for Cloud Telco 2010; Rob Gillen: Windows Azure, Climate Data, and Microsoft Surface; Azure AppFabric Team: New Windows Azure platform AppFabric SDK V1.0; Silverlight - TN: Tunisian Silverlight Developers Blog live on Azure; Michael Krigsman: Modern SOA governance: Adoption and measurement; Christofer Löf: CRUDing with “ActiveRecord for Azure”; ADO.NET Data Services (Astoria) Team: Update on the Data Services Update for the .NET 3.5 SP1; Mikael Ricknäs: Amazon adds media streaming to content delivery service; Scott Guthrie: Installing .NET 4 on Windows Azure and more.
- Azure Blob, Table and Queue Services
- SQL Azure Database (SADB)
- AppFabric: Access Control, Service Bus and Workflow
- Live Windows Azure Apps, Tools and Test Harnesses
- Windows Azure Infrastructure
- Cloud Security and Governance
- Cloud Computing Events
- Other Cloud Computing Platforms and Services
To use the above links, first click the post’s title to display the single article; the links will then navigate within it.
Discuss the book on its WROX P2P Forum.
See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.
Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.
You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:
- Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
- Chapter 13: “Exploiting SQL Azure Database's Relational Features”
HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010.
* Content for managing DataHubs will be added as Microsoft releases more details on data synchronization services for SQL Azure and Windows Azure.
Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.
Cory Fowler’s Setting up the Development Storage Service post of 12/20/2009 explains:
If you are setting up your Development Environment for Windows Azure, you may want to avoid installing SQL Server Express. I didn’t want to use SQL Server Express because I’ve already installed the SQL Server 2008 R2 CTP, so I can interact with my SQL Azure Database in the Cloud.
and continues with a detailed tutorial for launching the Development Storage Initialization Tool (DSInit.exe) with a default SQL Server 2008 R2 November 2009 CTP instance.
The easier approach, however, is to retain the .\SQLEXPRESS instance already installed and just download and install SQL Server Management Studio (SSMS) 2008 R2 Express November 2009 CTP with the “Tools Only” option from here.
My Windows Azure Table Test Harness App Was Down for 02 Hours and 30 - 40 Minutes Yesterday post of 12/28/2009 describes a serious outage of my Azure Table Test Harness application running in the South-Central US (San Antonio, TX) data center.
This evening some CTP participants with storage accounts in the "South Central US" region received errors from the storage service. Because the Windows Azure portal relies on the storage service, some operations in the portal resulted in errors as well. This issue has already been resolved, and no data was lost.
The root cause was a bug in queue storage, which had a cascading effect on blobs and tables for some customers. We applied a manual workaround to restore service to full functionality, and we're working on a code fix for the underlying bug.
• Johan Åhlén offers a detailed, well illustrated tutorial for getting started with SQL Azure Database and SQL Server Management Studio 2008 R2 in his SQL Azure - some tips & tricks post of 12/20/2009, which covers the following topics:
- What is SQL Azure
- Connecting to SQL Azure
- Copying data
- [Connection] Encryption
- Connection closing
Johan describes his live Windows Azure application in his Nyhetskoll - my contribution to the Windows Azure Developer Challenge post of 11/13/2009.
Harish Ranganathan explains how to integrate SQL Azure Database and Windows Azure techniques in his detailed Moving your ASP.NET Application to Windows Azure – Part I tutorial of 12/18/2009:
Earlier I had written 2 posts – Taking your Northwind Database to SQL Azure and binding it to an ASP.NET GridView Part I and Part II. I thought [I would] complete the series with a post on moving your ASP.NET [SQL Azure] Application as well to Windows Azure, making it a truly cloud-based application. …
Moving your ASP.NET Application to Windows Azure – Part II of 12/19/2009 begins:
In the previous post I had described the steps to secure your Windows Azure tokens and get the necessary Visual Studio templates as well as making your web application Azure ready by adding the cloud project and building against it.
Once you have tested the Development Fabric, the instances and the application, the next step would be to publish it to the Windows Azure platform. Select the “CloudService1” project that you added to the solution, right-click and select “Publish.”
Ben Riga continues his Azure Lessons Learned video series with this Azure Lessons Learned: Embarcadero post of 12/18/2009 that features Embarcadero Technologies:
Database tooling is important for many developers and DBAs as they manage numerous databases across the enterprise and the cloud. In this episode of Azure Lessons Learned I chat with Scott Walz, Sr. Director of Product Management at Embarcadero Technologies, responsible for the DBArtisan product.
Scott walks us through the DBArtisan product to show how SQL Azure integrates seamlessly into this cross-DBMS product. It was interesting to hear how quickly the effort to add SQL Azure went. I think that bodes well for other tooling in general for SQL Azure. Since SQL Azure is so very close to SQL Server it should be relatively simple for ISVs to add SQL Azure support to products that support SQL Server today.
Bruce Kyle explains how to Share Real-time Premium Data with Codename "Dallas" in this 12/18/2009 post to the US ISV Community blog:
Dallas was announced at Professional Developers Conference (PDC09).
Dallas is an information marketplace that brings data, imagery, and real-time web services from leading commercial data providers and authoritative public data sources together into a single location, under a unified provisioning and billing framework. Additionally, Dallas APIs allow developers and information workers to consume this premium content with virtually any platform, application or business workflow. …
For an illustrated tour of Codename “Dallas,” see my Codename “Dallas” Developer Portal Walkthrough post of 12/17/2009:
Harish Ranganathan explains Binding Entity Framework to your SQL Azure Database – Visual Studio 2010 Beta 2 in this 12/17/2009 post:
If you have used the Entity Framework that shipped with Visual Studio 2008 SP1, you would really start appreciating the flexibility it offers for building a schema-driven data access layer and getting it to the UI layer, either directly or using a middle tier such as WCF RIA Services. Check my earlier post on this, if you are interested further :)
Meanwhile, the other exciting stuff that has been around is SQL Azure, which is part of the Windows Azure platform. SQL Azure provides relational data over the web, which means the database is hosted and maintained by us, and you get to store your database and query it as if it were running in your local data center or server. Of course, SQL Azure is currently a CTP, and you can get free access to it if you have the Azure tokens.
While I had earlier written about Migrating your database to SQL Azure that example used an ASP.NET front end which had a GridView doing direct data binding with SQL DataSource. Obviously, one would want to use some of the more abstract controls such as LINQ DataSource / Entity DataSource. …
The new book will replace Eran’s OAuth Beginner’s guide of 10/2007. Windows Identity Foundation (WIF) now supports Web Resource Authorization Protocol (WRAP) v2.0, a related enterprise-oriented protocol derived from OAuth and supported by Microsoft, Yahoo and Google.
Eran offers his negative view of WRAP in WRAP, and the Demise of the OAuth Community of 11/23/2009. Microsoft’s Dare Obasanjo compares the issue of the OAuth vs. OAuth WRAP APIs with a fictitious Facebook fork of the Twitter API in his Some Thoughts on the Twitter API as a "standard API" for microblogging post of 12/21/2009:
Things get even more interesting if Facebook actually did decide to create their own fork or "profile" of the Twitter API due to community pressure to support their scenarios. Given how this has gone down in the past such as the conflict between Dave Winer and the RSS Advisory board or more recently Eran Hammer-Lahav's strong negative reaction to the creation of OAuth WRAP which he viewed as a competitor to OAuth, it is quite likely that a world where Facebook or someone else with more features than Twitter decided to adopt Twitter's API wouldn't necessarily lead to everyone singing Kumbaya.
Mario Szpuszta of Microsoft Austria (a.k.a mszCool) presents an alternative view in his Live from PDC 2009 in L.A. – Windows Identity Foundation Released and further cool announcements… post of 11/18/2009 which supports WIF’s use of OAuth WRAP.
• The Azure AppFabric Team described a New Windows Azure platform AppFabric SDK V1.0 posted on 12/18/2009 as follows:
This SDK includes API libraries and samples for building connected applications with the .NET platform. It spans the entire spectrum of today’s Internet applications – from rich connected applications with advanced connectivity requirements to Web-style applications that use simple protocols such as HTTP to communicate with the broadest possible range of clients.
Technical details, rather than propaganda, would be appreciated.
• The “Geneva” Team confirms Vibro’s report (see below) in their Announcing the AD FS 2.0 Release Candidate and More of 12/18/2009:
We are happy to announce several updated federated identity product releases that are available NOW!
The team’s Announcing WIF support for Windows Server 2003 !! post of the same date confirms an earlier (11/2009) post:
We are glad to announce that Windows Identity Foundation (WIF) RTW for Windows Server 2003 is available NOW! This release supports both the Windows Server 2003 SP2 and Windows Server 2003 R2 platforms and the following seven languages: English (en-us), German (de-DE), Spanish (es-ES), French (fr-FR), Italian (it-IT), Dutch (nl-NL), and Japanese (ja-JP).
You can download the language specific WIF RTW packages for Windows Server 2003 from here.
Vittorio Bertocci reports that Active Directory Federation Services (ADFS) v2.0 is almost cooked in his ADFS 2.0 RC is Here! post of 12/18/2009:
The release candidate is always an important milestone for a product: if possible, it is even more so for a component as essential as your identity provider or your federation provider, which must be absolutely rock solid, secure, always available... you know the drill.
Active Directory Federation Services (ADFS) 2.0 finally reached the Release Candidate phase!
This special episode of the Id Element is all about the new features introduced in the RC: Matt Steele, Senior PM in the ADFS team, makes his second appearance on the show and gives us an insider view on how the feedback on Beta2 helped to improve the product.
From SAML protocol interop to farms and certificates management, going through new authorization capabilities and improved user experience, in this release there's something for everybody!
The video is available here. [Emphasis Vibro’s.] …
• The Windows Azure Team’s Azure at PDC 2009: Replays Now Available! post of 12/20/2009 recaps the session videos for Windows Azure and SQL Azure with brief descriptions of their content.
• Neil MacKenzie explains the Service Management API in Windows Azure in this detailed tutorial of 12/19/2009, which covers the following topics:
- RESTful Interface
- Service Management Operations
- Get Operation Status
- Create Deployment
- Get Deployment
- Change Deployment Configuration
- Delete Deployment
Neil provides sample C# code and configuration sections where appropriate and concludes:
Hopefully, Microsoft will release a higher-level .NET API to supplement this low-level REST API.
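Neil’s examples are in C#; purely as an illustration of the request shape his tutorial describes, here is a minimal Python sketch of composing a Get Deployment call. The endpoint and x-ms-version value follow the 2009-era Service Management documentation; the subscription ID, service name, and function name below are placeholders, and client-certificate authentication is omitted:

```python
# Sketch of composing a Service Management API request. The endpoint and
# API version are from the 2009-era documentation; the subscription ID
# and service name used below are placeholders.
MANAGEMENT_BASE = "https://management.core.windows.net"
API_VERSION = "2009-10-01"  # sent in the required x-ms-version header

def get_deployment_request(subscription_id, service_name, slot):
    """Build the URL and headers for a Get Deployment call.

    Authentication (a client certificate registered with the portal)
    is omitted; a real call would attach it to the HTTPS session.
    """
    url = (f"{MANAGEMENT_BASE}/{subscription_id}"
           f"/services/hostedservices/{service_name}"
           f"/deploymentslots/{slot}")
    headers = {"x-ms-version": API_VERSION}
    return url, headers

url, headers = get_deployment_request("00000000-aaaa", "myservice", "production")
print(url)
print(headers["x-ms-version"])
```

Every operation in the list above follows this pattern: a resource-style URL under the subscription, the versioning header, and an HTTP verb (GET, POST, or DELETE) chosen by the operation.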
• The ADO.NET Data Services (Astoria) Team’s Update on the Data Services Update for the .NET 3.5 SP1 announces:
We’ve found an issue with the update we released earlier this week and as a result we have removed the update from the download site while we address the issue. We will make an updated version of the download available as soon as possible.
The issue is due to a change to the IDataServiceHost interface and only affects existing Data Services that have a “custom host” (i.e. directly implement the IDataServiceHost interface). This issue does not affect Data Services that use the standard WCF/ASP.NET host (the host that your Data Service will have if you have used the built in tools in Visual Studio to create your service). The issue causes services that use “custom hosts” to fail to initialize.
We believe this issue affects only a small number of existing data services but it affects enough that we have made the decision to remove the update from the download page until we have addressed the problem. We are currently working on the fix and will have an updated version available as soon as possible.
In the meantime, the latest CTP Download is available here: http://www.microsoft.com/downloads/details.aspx?FamilyID=a71060eb-454e-4475-81a6-e9552b1034fc&displaylang=en.
• Christofer Löf describes CRUDing with “ActiveRecord for Azure” in this 12/19/2009 post:
In my previous blog post I introduced you to my little experiment – a sample implementation of the ActiveRecord pattern for the Windows Azure Storage system which I call “ActiveRecord for Azure”. (It’s easier to refer to something if it has a name – right?). In this post I want to elaborate a little bit further on the features previously mentioned. Since most of us associate the ActiveRecord pattern with MVC style apps I’m going to show the Create, Read, Update and Delete (CRUD) support by implementing a simple Task List application using ASP.NET MVC. …
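Christofer’s implementation is C# against Azure table storage; purely to illustrate the CRUD shape of the ActiveRecord pattern (not his API), here is a minimal Python sketch in which an in-memory dict stands in for the table store. All class and method names are hypothetical:

```python
# A minimal ActiveRecord-style sketch (not Christofer's C# API): each
# entity carries its own save/delete, and the class exposes find/all.
# An in-memory dict stands in for the Azure table; the key plays the
# role of the partition/row keys a real table entity would have.
class ActiveRecord:
    _store = {}  # class name -> {key: attributes}

    def __init__(self, key, **attrs):
        self.key = key
        self.attrs = attrs

    @classmethod
    def _table(cls):
        return cls._store.setdefault(cls.__name__, {})

    def save(self):                      # Create or Update
        self._table()[self.key] = dict(self.attrs)

    def delete(self):                    # Delete
        self._table().pop(self.key, None)

    @classmethod
    def find(cls, key):                  # Read one
        attrs = cls._table().get(key)
        return cls(key, **attrs) if attrs is not None else None

    @classmethod
    def all(cls):                        # Read all
        return [cls(k, **v) for k, v in cls._table().items()]

class Task(ActiveRecord):
    pass

# A CRUD round trip, as in the Task List example
Task("1", title="Write post", done=False).save()
Task("1", title="Write post", done=True).save()   # update overwrites
print(Task.find("1").attrs["done"])               # -> True
Task.find("1").delete()
print(Task.find("1"))                             # -> None
```

The appeal of the pattern for MVC-style apps is visible even in this toy: controller actions can call `save`, `find`, and `delete` directly on the model without a separate repository layer.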
• Rob Gillen’s Windows Azure, Climate Data, and Microsoft Surface post of 12/18/2009 describes an Azure visualization application:
… I built a simple visualization app that does a real-time query against the data in Azure and displays it. Originally the app was built as a simple WPF desktop application, but I got to thinking that it would be particularly interesting on the Surface and therefore took a day or two to port it over. The video … is a walkthrough of the app – the dialog is a bit cheesy but the app is interesting as it provides a very tactile means of interacting with otherwise stale data.
• Soumow Atitallah (@soumow) deployed the Tunisian Silverlight Developers Blog live to Azure (Silverlight – TN) on 12/19/2009, according to a tweet of the same date.
Archetype posted a live, personalized Archetype Holiday Card app to Windows Azure on 12/17/2009:
You can personalize the cookie’s shape and decorations, and share it with friends by e-mail or Facebook. Why not Twitter?
• Julie Bort asserts “By 2011, 89% of 212 enterprises surveyed plan to use W7 but most are also leaning toward Google for cloud computing” in her Most business will adopt Windows 7 by 2011, but prefer Google's cloud post of 12/18/2009 to NetworkWorld’s Microsoft Subnet blog:
As for cloud computing, the news isn't completely bleak for Microsoft. It has its biggest foothold in its most coveted customer, the large enterprise with $1 billion or greater in annual revenues. Although only one-quarter of the total respondents said they were interested in Azure in 2010 for hosted Microsoft apps, most of those interested (14%) were large enterprises. This is a big jump from the last CIO Zone survey on SaaS in June when so few details of Azure were known. At that time, most organizations said they thought they would be heavily using Google in 2010.
On the other hand, you can also say the glass is half empty for Azure. Even among the [largest], wealthiest companies, most plan to use Google, and use it more heavily, too. When asked how much usage they expected to give a specific cloud computing platform in 2010, on a scale of 1-5, with 1 indicating greatest usage and 5 indicating least usage, large enterprises ranked their planned Google usage at 3.03. They ranked their planned Azure usage at 3.48. Planned Amazon usage came in third at 3.49.
The problem with Julie’s analysis is its failure to distinguish between Google Apps and the Google App Engine as a Windows Azure competitor. Windows Azure isn’t intended to “host Microsoft apps,” which I assume means Office productivity applications, a.k.a. Office Web Apps. That will be Microsoft Office Live’s job. Windows Azure is intended to host custom cloud-computing applications written primarily by .NET developers.
Julie replied to my comment in her thanks, Roger, but I didn't confuse those two post of 12/21/2009.
• Eric Golpe announced that “customers will be able to use their Azure benefits for normal (production) use” in his Windows Azure for MSDN Premium Subscribers & BizSpark Members post of 12/19/2009:
Starting January 4th, MSDN Premium subscribers and BizSpark members in 21 countries can sign up for Windows Azure Platform benefits. Previously, we communicated that the Azure benefit usage for subscribers would be limited to development and testing. We’ve lifted this restriction so that customers will be able to use their Azure benefits for normal (production) use! In doing so, they won't need a separate account to transition between development and production; however, customers cannot consolidate or pool the Azure benefits from multiple subscriptions onto one account.
Updated details on Azure benefits for MSDN subscribers are here.
• Jeffrey Schwartz reports “As CIOs largely reject the early crop of cloud services for business-critical apps, Redmond readies private and hybrid cloud platforms” in his Microsoft's Private Cloud Formation post to Visual Studio Magazine’s In Depth blog of 12/17/2009:
As Microsoft rolls out its Windows Azure and SQL Azure public cloud services in January 2010, the first implementers will likely include those building greenfield Web 2.0-type apps as well those who develop and test software looking for capacity on demand. But for cloud computing to take hold in the enterprise for business-critical applications, Microsoft knows it must extend Windows Azure to integrate securely and seamlessly with internally hosted systems.
Hence, the next phase of Windows Azure will enable enterprises to build private and hybrid clouds with a new set of deliverables that will evolve throughout 2010 and likely into the following years.
The allure of cloud services is that they provide infrastructure on demand and remove the capital and administrative requirements of running internal systems. Yet the vast majority of CIOs say they simply can't put certain types of applications and data into the current incarnation of cloud services.
"It's going to be a tough sell," says John Merchant, assistant vice president at The Hartford Financial Services Group Inc., a large insurance company. "As a Fortune 500 company with highly regulated data and a very conservative outlook, it's going to be difficult for any insurance company or any financial institution of any size to migrate any data to the cloud."
On a panel in November at Interop New York addressing the top cloud engineers at Amazon.com Inc., Google Inc. and Microsoft, Rico Singleton, deputy CIO for the State of New York, asked: "Can you give me a private cloud that can provide all the benefits that you provide now on my private network closed to the outside, and still be able to give me similar ROI?" The answer by top cloud engineers at Microsoft, Amazon and Google was a resounding: "Not yet." …
Scott Guthrie replied to developers asking about installing .NET 4 on Windows Azure:
We are working with the Azure folks right now to try and get .NET 4 installed on it as soon as possible. Unfortunately I don't have an exact ETA yet.
Information Week: Analytics presents their Strategy: Outlook 2010 Cloud Computing Brief:
The results of InformationWeek Analytics’ Outlook 2010 survey, where we asked 360 business technology pros about their plans for the year ahead, don’t make you want to break out the party hats and blowers. But there are some signs that IT spending will at least level off and that customer-facing and sales-supporting projects will be on the rise. Compare that to last summer, when we heard a lot about cost-cutting infrastructure projects and renegotiations with vendors but not a lot about IT initiatives that drive growth.
In terms of emerging technology, cloud computing’s momentum is real, as markedly more IT pros are considering it than they were a year ago. Data center innovation remains a high priority. Despite some optimism, the IT hiring outlook remains weak, and if there’s budget cutting ahead, IT will take its share of the lumps. …
The price is US$99.00.
Reuven Cohen’s Introducing the Private Partner Cloud post of 12/18/2009 describes a South Korean mobile service provider’s cloud infrastructure for developing mobile phone applications for the provider’s network:
I'm currently in Seoul, South Korea for a variety of meetings with SK government and technology industry folks. Yesterday I had a very interesting meeting with the largest South Korean mobile provider. During the meeting they described a great potential use case for telecom focused IaaS cloud offerings. Basically what they've done is created an on demand compute infrastructure specifically for their network of mobile application developers. The service is offered free of charge to their partners and provides all the tools necessary for the development, testing and deployment of mobile applications specifically tailored to their particular mobile network environment. This may be one of the best use cases for semi-private cloud I've heard of.
In a sense they're subsidizing the infrastructure costs for mobile application developers they work with. They are basically covering the costs associated with the more routine aspects of mobile app development while also empowering a new and broader group of potential partners by providing a quick and easy way to develop applications for their environment. Another advantage is in gaining a greater pool of potential network specific applications & developers. Very smart. …
Agreed. I believe Microsoft should provide Windows Mobile developers with some free Azure bandwidth and support for creating WinMo apps.
Frank Gens’ New IDC IT Cloud Services Survey: Top Benefits and Challenges post of 12/15/2009 begins:
This year’s IDC IT cloud services survey reveals many of the same perceptions about cloud benefits and challenges as seen in last year’s survey. But there are a few interesting shifts this year, driven largely by: 1) budget pressure from the challenging economy, and 2) a growing sophistication in users’ understanding of cloud services.
This year’s survey was fielded, like last year’s, from the IDC Enterprise Panel of IT executives and their line-of-business (LOB) colleagues. The respondent population is very similar to that of last year’s survey, validating comparisons with last year’s results.
Economics and Adoption Speed Still Top Benefits; Standardization Moves Up
This year’s survey shows, once again, that economic benefits are key drivers of IT cloud services adoption. Three of the top five benefits were about perceived cost advantages of the cloud model: pay for use (#1), payments streamed with use (#3) and shift of IT headcount and costs to the service provider (#5).
While pay-for-use slightly edged out last year’s #1 – easy/fast to deploy – these two are essentially in a tie for #1. It’s pretty safe to ascribe the slight edge for pay-for-use to the enormous pressure that the Great Recession has put on IT budgets, and the consequent increased focus on cloud economics in the minds of customers. But it’s still clear that speed/simplicity of adoption remains a key driver of demand for cloud services. …
That last observation bodes well for Windows Azure and SQL Azure.
Mary Jo Foley includes Azure in her Microsoft products worth watching in 2010 article for Redmond Magazine, which I missed when published:
… Microsoft is launching the final version of its cloud-based hosting platform, Azure, next month. Live Mesh -- the consumer-focused collaboration and synchronization service that will be one of Microsoft's first Azure-based offerings -- is supposed to be a proof point for the platform. Both Azure and Live Mesh are Chief Software Architect Ray Ozzie's pet projects. Microsoft has taken a different tack than other cloud vendors like Amazon and Google. Instead of simply providing data center space and resources, Microsoft is trying to build a cloud platform that's similar to Windows and .NET. The company hopes developers will want and need an OS, a database, collaboration and other building blocks. …
I’m more sanguine about the prospects for Azure than Live Mesh.
• Michael Krigsman’s Modern SOA governance: Adoption and measurement post of 12/18/2009 to the Enterprise Irregulars blog claims:
Recent discussions have brought attention to the important and evolving role of governance in the world of service-oriented architecture (SOA).
“Modern conceptions of SOA governance,” to borrow a phrase from SOA expert, Dion Hinchcliffe, recognize that technical architecture is only one component of successful adoption. Achieving deeper success involves bringing technology deployment into conformity with business needs.
Governance is critical to successful SOA adoption. For this reason, I brought together Dion and Software AG’s Vice President and Chief Strategist, Miko Matsumura, for a podcast discussion on this topic. Together, the three of us explored interconnections between SOA, technology, business, and trust. …
Information Week: Analytics presents their Research: Cloud Governance, Risk and Compliance report:
Navigating the Storm: Governance, Risk and Compliance in the Cloud
Q: What’s more fashionable than government bailouts, Twitter, hybrids and pimping your greenness?
A: Cloud computing, that sexy new IT concept that everyone is talking about, but no one seems able to clearly define.
Besides buzzwords like SaaS (software as a service), PaaS (platform as a service) and IaaS (infrastructure as a service), cloud computing provides IT groups with extra potential layers of abstraction, extremely complex interdependency models—and an unsettling level of uncertainty about where our data goes, how it gets there and how protected it will be over time. If you’ve got a nagging feeling that much of the current discussion seems new, yet somehow strangely familiar, you aren’t losing your mind. We struggled through similar issues a few years ago when application service providers were all the rage. This time around, when it comes to defining the scope of the phenomenon, the only thing all parties seem to agree on is that cloud computing represents something that is not local—not at your site. This oversimplification is understandable given that, for network engineers, the generic cloud icon has for decades represented everything from foreign networks and remote sites to the rats’ nests we really don’t want anyone asking about. …
This report appears to be free to download.
• Upperside Conferences: Call for Papers for Cloud Telco 2010, to be held at the Novotel Convention Center and Spa, Paris CDG, France on 6/1 – 6/4, 2010:
Several major telcos have recently announced their wish to enter the cloud computing services market. Over the past years, operators have moved beyond broadband voice and data into network-dependent applications like videoconferencing and telepresence and have secured deeper enterprise relationships. Providing hosted solutions to companies is the next move many carriers are considering today.
Indeed, telecommunication providers could play an important and lucrative role in the burgeoning world of cloud computing by combining their natural advantages as network operators with a new wave of technological innovation. The opportunity represented by cloud-based services is potentially immense because, for starters, it increases the value of carrier networks in multiple ways and creates new roles and revenues for telecom service providers. …
The call for proposals is online.
Sergey Barskiy reports in his Upcoming talks post of 12/17/2009 that he will present sessions about:
- SQL Azure to the Gwinnett Georgia Microsoft Users Group (no date specified)
- Building an Azure application step-by-step to the Atlanta .NET Users Group at the Microsoft Atlanta office on 1/25/2010
• Bob Evans reports "Ellison speaks out on Oracle's new Sun-enabled strategy and how that points to where the entire IT industry is headed” in his 12/18/2009 Oracle CEO Larry Ellison On The Future Of IT Global CIO column for InformationWeek:
… Oracle founder and CEO Larry Ellison spoke in considerable detail about how his vision of the computer industry of the future is centered on the idea of optimized systems that provide high value to customers because they don't need to do or pay for a lot of systems integration, and in return provide high margins to the providers.
Ellison also quite casually wove the terms "private clouds" and "cloud computing" into his strategic overview without lampooning them, which was a big step forward even though Ellison's discomfort with the term is shared by IBM CEO Sam Palmisano and Hewlett-Packard CEO Mark Hurd. It was a big step because whatever his personal misgivings over cloud terminology might be, it's a name and concept that has truly begun to fire the imagination of customers and industry players alike, and the combination of Ellison's new acceptance of the term combined with his ambitious plans for Oracle to become a major supplier of cloud systems can only accelerate that already forceful trend. …
• James Hamilton contrasts the openness of storage subsystems with networking hardware in his Networking: The Last Bastion of Mainframe Computing post of 12/19/2009:
The networking world remains one of the last bastions of the mainframe computing design point. Back in 1987 Garth Gibson, Dave Patterson, and Randy Katz showed we could aggregate low-cost, low-quality commodity disks into storage subsystems far more reliable and much less expensive than the best purpose-built storage subsystems (Redundant Array of Inexpensive Disks). The lesson played out yet again where we learned that large aggregations of low-cost, low-quality commodity servers are far more reliable and less expensive than the best purpose-built scale up servers. However, this logic has not yet played out in the networking world.
The networking equipment world looks just like the mainframe computing ecosystem did 40 years ago. A small number of players produce vertically integrated solutions where the ASICs (the central processing unit responsible for high-speed data packet switching), the hardware design, the hardware manufacture, and the entire software stack are single-sourced and vertically integrated. Just as you couldn’t run IBM MVS on a Burroughs computer, you can’t run Cisco IOS on Juniper equipment.
James’ article offers an interesting counterpoint to Larry Ellison’s paean to Sun’s proprietary hardware approach to cloud computing (see above):
"So customers will be able to buy high-end SMP machines that are high-performance and high-value, or a high-end private cloud, with all of the pieces including processing, storage, and networking integrated together with Oracle-slash-Sun software. We think that will heavily differentiate our offerings from the offerings of IBM, HP and Dell, and we think we're gonna be able to compete very effectively there and that will deliver high margins and allow us to deliver that $1.5 billion additional profit in our first full year of owning Sun."
• Bob Evans hints “Its 13 petabytes include archived data from the world's top banks and pharma companies, and it's growing rapidly. The owner's name starts with A -- but it's not Amazon” as the answer to his The World's Largest Private Cloud: Who's Number One? question posed in the second “The Cloud Imperative” article of 9/16/2009 for InformationWeek’s Global CIO column:
Leaning hard into the cloud-computing phenomenon that has become the major business-technology theme for 2010, Autonomy Corp. is claiming to be King of the Cloud by virtue of its massive Digital Safe archiving system, which spans 6,500 servers across seven data centers and handles 3 million new files per hour. …
And that private-cloud beast is only in the early stages of an astonishing growth spurt: just 8 months ago, it was at 10 petabytes. And Autonomy CMO Nicole Eagan says the surge to the cloud for archiving has only just begun.
Cloud-based data archiving at this scale is a significant vote of confidence for cloud computing in general.
• Danny Tuppeny’s Microsoft Windows Azure vs Google App Engine: Pricing post of 12/18/2009 concludes:
I really hope Microsoft re-evaluates their pricing for small apps. It's too expensive to play around with small prototypes at those prices, whereas Google's offering will let me get started completely free, until my app is churning a considerable amount of traffic, and even then, it'll work out cheaper for the same processing/transfer.
Sorry Microsoft. I love .NET and Visual Studio, but Google App Engine is just so easy and cheap that it's going to be my "toy of choice" for my hobby coding for the immediate future!
• Mikael Ricknäs asserts “Amazon's CloudFront supports on-demand streaming, will add live events next year” in his Amazon adds media streaming to content delivery service post of 12/16/2009 for the IDG News Service:
Amazon Web Services has added support for audio and video streaming to the beta version of CloudFront, its Web service for content delivery, the company said on Wednesday.
The support for streaming is based on Adobe's Flash Media Server. Today, the service supports on-demand streaming, but Amazon plans to add support for live streaming next year, it said.
To stream content, customers must first store the original copies of their movies and songs on Amazon's S3 (Simple Storage Service), and then enable streaming of the content using the AWS Management Console or Amazon's APIs (application programming interfaces) for CloudFront, according to Amazon.
CloudFront can stream content from 14 locations in the U.S., Europe, Hong Kong and Japan. Users are automatically sent to the best location, Amazon said. …
Salvatore Genovese reports “Orange announces complete cloud computing services, from infrastructure to real-time business applications” in his Orange Sets Out Its Ambitions in Cloud Computing post of 12/18/2009, which reads more like a press release:
Orange is the key brand of France Telecom, one of the world’s leading telecommunications operators. With 126 million customers, the Orange brand now covers Internet, television and mobile services in the majority of countries where the Group operates.
Leveraging its cloud-ready network, Orange is best placed to provide enterprises with simpler, safer and more flexible cloud services.
Orange Business Services, which has already rolled out successful cloud services, such as IT Plan (desktop virtualization) and Flexible Computing (hosted virtualized infrastructure), will launch a dozen new cloud computing services in the coming 24 months, covering six main areas including real-time applications, collaboration, security, infrastructure, cloud-ready networking and vertical solutions for specific industries. …
Geva Perry offers his Thoughts on Amazon EC2 Spot Instances in this 12/28/2009 post:
The innovation just keeps on coming from the good folks at Amazon Web Services. This week they announced a new pricing model for Amazon EC2 instances: spot pricing. Spot pricing is the third pricing model Amazon is offering for EC2 instances -- with On-Demand and Reserved being the other two -- and it brings us closer to an efficient and commoditized IT infrastructure market. It got my mind racing on the various possibilities, and where it goes if taken to its logical conclusion.
James has a very succinct explanation of the key tenets of the new offering:
Each customer sets a maximum price he or she is willing to pay for "spot instances."
Amazon sets a "spot price" for instances hour-by-hour, based on available supply and demand.
Customers pay whatever the spot price is up to their maximum price. So, if someone bids $0.07/hour, and the spot price is $0.05/hour, the person pays $0.05/hour.
If the spot price exceeds the customer's maximum price, the customer's instances are terminated.
I had to open my old finance textbook from business school and think of all sorts of possibilities: call options, put options, futures, and other forms of derivatives and hedging techniques. It will be interesting to see if any of those evolve over time. By the way, there already is a real-time ticker for Amazon spot pricing, called Cloud Exchange. But here are some thoughts on issues that are relevant in the shorter term. …
Geva goes on to discuss Workloads and Bidding activities.
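The bidding rules James summarizes can be sketched as a tiny function (a hypothetical illustration of the mechanism as described, not Amazon's actual billing code):

```python
def spot_charge(bid, spot_price):
    """Sketch of the spot rule quoted above: a running instance is
    charged the current spot price, never the bid; when the spot
    price exceeds the bid, the instance is terminated."""
    if bid >= spot_price:
        return spot_price   # instance runs; pays the spot price
    return None             # spot price exceeded the bid: terminated

# The $0.07 bid / $0.05 spot example from the post:
print(spot_charge(0.07, 0.05))  # 0.05
print(spot_charge(0.07, 0.09))  # None (instance terminated)
```

Note that the customer's bid only sets a ceiling; the hourly charge tracks the market-set spot price.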
Kent Langley points to a fledgling spot exchange cloud market report in his AWS EC2 Spot Price Visualization Site and a few thoughts about CPU cycles post of 12/18/2009:
This is rather interesting to see. Someone already put up a set of live charts keeping track of the AWS compute instances.
What's interesting to me is that the same resource can have different prices in different regions (obvious, but interesting) and that in many cases (if not all) the costs are substantially below the retail rate for the same instance.
For example: us-east-1, c1.xlarge, $0.25 / hour. The retail for that is $0.68 per hour. Nice discount. …
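Kent's example prices work out to roughly a 63% discount off the on-demand rate, which is easy to verify:

```python
# c1.xlarge in us-east-1, per the figures quoted in the post
spot_rate, on_demand_rate = 0.25, 0.68   # $/hour

# Fractional discount of the spot rate relative to on-demand
discount = 1 - spot_rate / on_demand_rate
print(f"{discount:.0%}")  # 63%
```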
Liam Eagle reports Rackspace Partners with FathomDB for Database in the Cloud with rates as low as US$0.02/hour ($14.40/month) in this 12/18/2009 post:
The Rackspace Cloud (www.rackspacecloud.com), the cloud hosting division of Rackspace Hosting (www.rackspace.com), announced on Thursday that it has partnered with FathomDB (www.fathomdb.com), which it calls a pioneer in the realm of database-as-a-service technology, to create a version of FathomDB’s cloud database offering using Rackspace’s cloud hosting solutions.
Rackspace says FathomDB helped to launch the database as a service business by creating a user interface and analytics engine that support MySQL databases in the cloud, initially powered by Amazon’s EC2 and S3 cloud products, but now also using Rackspace’s Cloud Servers product.
Rackspace says that, built on the Rackspace Cloud’s API, the new FathomDB offering will provide “a seamless database management experience” using Rackspace’s Cloud Servers and simplifying administration tasks. The FathomDB offering handles a long list of database tasks that includes automated backup and routine maintenance, analytics tools, real-time monitoring and performance reporting and simple configuration tools.
The release describes FathomDB as a “strategic partner,” rather than simply as a customer, which would suggest that the relationship goes a little deeper than just an application optimized to work with Rackspace’s cloud API. At least part of that relationship appears to be the inclusion of the solution within Rackspace’s “cloud tools” ecosystem. …
MG Siegler reported Rackspace Goes Down. Again. Takes The Internet With It. Again. for TechCrunch later on 12/18/2009:
Another day, another Rackspace outage. The hosting company had a complete and total failure today that took down a number of big sites on the Internet, including ours. This has been happening all too often in recent months, including downtime just last month.
The failure apparently originated in the company’s Dallas-area server farm. But unlike previous times, this does not appear to be a power issue, the company says. Some other sites that are currently affected include: 37signals, Brizzly, Scoble’s blog, all of the sites hosted by Laughing Squid, Tumblr custom domains, and many others.
This is another black eye for the company, though they are generally responsive with other issues we’ve had throughout our time with them. But until they can prove to be more reliable, we’ve decided to get a backup version of TechCrunch up and running at another datacenter, for when someone inevitably trips over a power cord at the Dallas Rackspace center again.
MG continues with a “few updates from the company” starting at 3:45 PM CST.
Bruce Guptill and Charlie Burns ring in about Amazon EC2 Spot Instances Enables, and Demands, Change in Cloud Buying and Use on 12/18/2009 in this Saugatuck Research Alert (site registration required):
What is Happening? On December 14, 2009, Amazon Web Services (AWS) announced a new Cloud Computing offering, Spot Instances, intended to complement their previous AWS offerings (On-Demand Instances and Reserved Instances).
Spot Instances introduces a dynamic model for pricing, selling, buying, and using unused Amazon EC2 capacity: dynamic pricing for dynamic resources. Buyers bid for one or more EC2 instances at the maximum price they are willing to pay. Based on supply and demand, Amazon sets the spot price for these unused resources. Prices can be expected to fluctuate periodically based on levels of demand, time of day, and other typical resource-use factors. If a user’s bid equals or exceeds the spot price set by Amazon, their instances will run and they will be charged the current spot price, not their bid. When the spot price rises above that bid, the instances will be terminated. If and when prices come back down, the user’s instances can run again, automatically.
AWS sees Spot Instances as best suited for such non-time-dependent batch processing jobs as software development testing, scientific research, video rendering, financial modeling, and massive data analysis (e.g., seismic data). Saugatuck sees Spot Instances as a harbinger of Cloud disruptions to come, especially for buyers and for Cloud services providers. …
Bruce and Charlie continue with analyses of Why is it Happening? and Market Impact.
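The terminate-and-restart lifecycle Saugatuck describes can be simulated against a series of hourly spot prices (a hedged sketch; the price series and function are made up for illustration):

```python
def simulate_spot(bid, hourly_spot_prices):
    """Run a spot instance against a made-up series of hourly prices.
    Hours where spot <= bid are billed at the spot price; hours where
    spot > bid the instance is terminated and costs nothing, restarting
    automatically once the price drops back under the bid."""
    total_cost, hours_run = 0.0, 0
    for price in hourly_spot_prices:
        if price <= bid:
            total_cost += price  # instance runs this hour at the spot price
            hours_run += 1
        # otherwise terminated for this hour; no charge accrues
    return hours_run, round(total_cost, 2)

# Hypothetical price series around a $0.07 bid:
print(simulate_spot(0.07, [0.05, 0.06, 0.09, 0.04]))  # (3, 0.15)
```

This stop-start behavior is why AWS pitches the model at interruption-tolerant batch work rather than always-on services.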
Sun Microsystems published a rogues’ gallery of their Sun Cloud team members and offered links to several whitepapers in a recent (undated) Sun Cloud post to the Cloudbook blog. Here are the whitepaper offerings:
- Optimizing Applications for Cloud Computing Environments by Jason Carolan (11/2009)
- Harnessing the Potential of the Cloud with Cloud Patterns, anon. (11/2009)
- Cloud Computing Infrastructure and Architecture by Jason Carolan (6/2009)
Sun also offers a video peek into their data center: A Look Inside the Sun Cloud - June 2009:
A tour of SuperNAP, the datacenter that is home to the Sun Cloud. Highlights the security, availability and energy efficiency features of the facility.