Windows Azure, Azure Data Services, SQL Data Services and related cloud computing topics now appear in this weekly series.
•• Update 5/15-17/2009: Download new Geneva materials from Tech*Ed; Citrix offering for cloud service providers; Windows Azure MMC blob and queue management tool, more Geneva posts, other additions.
• Update 5/13-14/2009: Geneva Beta 2 does not solve the incompatibility problems with Windows Azure (see the .NET Services: Access Control, Service Bus and Workflow section).
• U.S. General Services Administration (GSA) issues Request for Information (RFI) about Infrastructure as a Service (IaaS) offerings (see the Azure Infrastructure section).
Subscribe to the OakLeaf blog on your Amazon Kindle. Get details here.
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:
- Azure Blob, Table and Queue Services
- SQL Data Services (SDS)
- .NET Services: Access Control, Service Bus and Workflow
- Live Windows Azure Apps, Tools and Test Harnesses
- Azure Infrastructure
- Cloud Computing Events
- Other Cloud Computing Platforms and Services
• Brent Stineman has updated his Azure queues series with Hands on Azure Queues – Part 3.5 of 5/14/2009. Brent says this post is:
[A]n update to my QueueDemo solution that combines enhancements I’ve made as a result of my last article on service configuration options. Putting my own words to practice, I have updated the QueueDemo so that it can now be used against either Development or Hosted Storage. The changes are as follows:
- Enhanced the constructor to pull account credentials and the URI endpoint from configuration
- Enhanced URI handling to check the host and create the full URI in either path or host style, as appropriate
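The path-style versus host-style distinction can be sketched roughly as follows. This is an illustrative Python sketch, not Brent's actual C# code, and the account name, endpoints, and detection heuristic are assumptions: local development storage addresses queues in path style (account name in the URI path), while hosted storage uses host style (account name in the DNS host name).

```python
# Illustrative sketch of path-style vs. host-style queue URIs.
# The endpoints and account names below are hypothetical examples.
def queue_uri(endpoint: str, account: str, queue: str) -> str:
    """Build a queue URI in path style (dev storage) or host style (hosted)."""
    host = endpoint.split("://", 1)[1].split("/", 1)[0]
    if host.split(":")[0].replace(".", "").isdigit():
        # Bare IP address (e.g. local development storage): path-style URI,
        # so the account name goes into the path.
        return f"{endpoint.rstrip('/')}/{account}/{queue}"
    # DNS host name (hosted storage): host-style URI,
    # where the account name is already part of the host.
    return f"{endpoint.rstrip('/')}/{queue}"

# Development storage listens on a local IP and port:
print(queue_uri("http://127.0.0.1:10001", "devstoreaccount1", "orders"))
# → http://127.0.0.1:10001/devstoreaccount1/orders

# Hosted storage carries the account in the DNS name:
print(queue_uri("http://myaccount.queue.core.windows.net", "myaccount", "orders"))
# → http://myaccount.queue.core.windows.net/orders
```

Pulling the endpoint string from configuration, as Brent's update does, lets the same code run against either target without a recompile.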
Jim Nakashima’s Adding an HTTPS Endpoint to a Windows Azure Cloud Service post of 5/12/2009 clarifies the instructions in his Enabling SSL Connections on Windows Azure article for adding an HTTPS/port 443 endpoint for a WebRole:
First are the cert requirements.
- The certificate must contain a private key that is marked exportable
- The certificate must have the Server Authentication Intended Purpose
When running on the Development Fabric, the certificate also needs to be self-signed – this is to prevent any security issues around leaking the private key of a real certificate.
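The certificate requirements above pair with an endpoint declaration in the service model. Here's a hypothetical ServiceDefinition.csdef fragment along those lines; the service and role names are invented, and the element layout follows the Azure SDK schema of that era, so verify the details against your SDK version and Jim's posts rather than treating this as authoritative:

```xml
<!-- Hypothetical WebRole definition with an HTTPS input endpoint on port 443.
     The SSL certificate bound to this endpoint must meet the requirements
     above: exportable private key, Server Authentication intended purpose,
     and self-signed when running on the Development Fabric. -->
<ServiceDefinition name="MyCloudService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole1">
    <InputEndpoints>
      <!-- Standard HTTP endpoint -->
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
      <!-- HTTPS endpoint; the certificate itself is associated with the
           deployment separately (locally via project settings, in the cloud
           via the portal). -->
      <InputEndpoint name="HttpsIn" protocol="https" port="443" />
    </InputEndpoints>
  </WebRole>
</ServiceDefinition>
```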
•• Still waiting for videos of the Tech*Ed 2009 SDS sessions. It’s a total FAIL not to have session videos available within 24 hours of the presentation.
•• Vittorio Bertocci (a.k.a. Vibro) delivers More details about the Identity Developer Training Kit, which includes Exercise 5: Accepting Tokens from .NET Access Control Service, in this 5/15/2009 post.
•• Sidd Shenoy’s Deep Dive into "Geneva" Next Generation Identity Server and Framework presentation at Tech*Ed 2008 is regurgitated for attendees only in Tech*Ed 2009 online. Beth Massi’s TechEd Sessions Online – My Favorites post of 5/15/2009 lists four Tech*Ed 2009 session videos, but I’ve yet to see a 2009 Azure session.
•• The “Geneva” Team offers Step-By-Step Guides, Virtual Machines and “Geneva” Whitepapers from Tech*Ed 2009 hands-on labs and presentations in this 5/15/2009 post:
The Step-by-Step guides and virtual machines that were used at the Tech Ed "Geneva" hands on labs and sessions are now available for download. You should find these helpful if you were unable to get to TechEd, missed the Geneva hands on lab sessions, or just want to be able to go through the material at your own pace. They are a great way to get your hands on and play with the Geneva technology.
You can download these materials from here.
“Geneva” interop whitepapers include "Geneva" and Sun OpenSSO and Novell Access Manager, which you can download from here.
Additionally, for Beta 2 we've updated our "Geneva" Framework for Developers whitepaper and the "Geneva" Datasheet. You can download these papers from here.
• John Fontana’s Microsoft IT goes live with its Geneva identity platform post of 5/13/2009 reports:
The company's IT department will change DNS records today on its internal network so all its identity federations are handled through its Geneva server environment rather than the current five Active Directory Federation Servers (ADFS) the company runs, according to Brian Puhl, a technology architect for Microsoft IT [Puhl link added].
Microsoft has nearly 410,000 computers and 165,000 users on its network. …
Puhl also said Microsoft will federate Live IDs with its Geneva infrastructure to support access to cloud services.
So far, it appears to me that only Stephane Gunet and those using his workaround (see below) can use Geneva Beta 2 now to federate Windows Live IDs with Azure WebRoles.
Here’s a diagram from John’s article that depicts the planned integration:
Capture courtesy of Network World
• Update 5/13/2009: Geneva Beta 2 does not solve the incompatibility problems with Windows Azure. According to Stephane Gunet’s ACS passive federation in a webrole : with source and demo post of 5/13/2009 to a thread in the .NET Services - Technical Discussions forum:
First of all, it seems the issues we had with beta 1 are not solved: the session cookie is still encrypted using DPAPI, and the certificate reference is still a reference to the certificate stores of the machine.
• John Fontana mentions Azure twice in his Microsoft's identity cloud platform enters Beta 2 post of 5/12/2009 for NetworkWorld:
The company has linked Geneva and the Microsoft Azure cloud operating system to create SSO to cloud-based services.
Also part of the platform is the Microsoft Service Connector, the Microsoft Federation Gateway and the .Net Access Control Service, which are designed to create a sort of identity backbone and connection to the cloud, specifically Azure.
The incompatibility problems noted in the preceding post make claims of cloud-based identity services with Azure suspect.
Vittorio Bertocci’s Announcing the Identity Developer Training Kit post of 5/11/2009 does what it says. Vibro describes it:
The Identity Developer Training Kit is a set of hands-on labs and resources designed to help developers to take advantage of Microsoft’s identity products and services. Being designed for developers, the kit focuses on the Geneva Framework: however it also gives guidance on how to take advantage of Geneva Server, Windows Live ID, the Microsoft Federation Gateway and the .NET Access Control Service (which is featured in a renewed and expanded lab). Most of the tasks are demonstrated both for ASP.NET web applications and for WCF services. We went to great lengths for eliminating as much as possible the friction that is traditionally associated with security samples, by providing configuration scripts and tools which automate many of the setup steps. (Emphasis Vibro’s.)
He also posted Geneva Beta 2 is out! on the same day. Here’s what’s new in the Geneva Framework:
- Enhanced FedUtil Tool with local STS for easy offline development
- New Visual Studio templates for building claims-aware web applications, web services, and security token services
- Support for SharePoint 2007
- Revised token handlers
- Revised federation authentication module
- New Claims Authorization Manager API
- Updated config support
Here’s what’s new in CardSpace:
- Support for Group Policy-based Information Card provisioning.
- Updated management UI
- Updated card tile
- Group Policy-based way for administrator to make card selection decisions for specific sites
  - Improved provisioning of X509-backed cards
  - Compatible with most existing managed cards
The Geneva Forum is here, this is the support email address, and this is the Geneva website. The Microsoft Code Name "Geneva" beta 2 Resource Page has links to more than you want to know about the product. You receive a message with a link to this page after registering.
I’m downloading and testing now; I’ll update this post after I determine Beta 2’s compatibility with Windows Azure projects. I’m asking around. Keep an eye on the latest additions to Stephane Gunet’s ACS passive federation in a webrole : with source and demo thread in the .NET Services - Technical Discussions forum.
Note: The download requires registration, removes your existing CardSpace “Geneva” tool, and requires a reboot. I removed CardSpace “Geneva” and the Microsoft “Geneva” SDK with the Control Panel applet before installing Beta 2.
•• Ryan Dunn announces the availability of the Windows Azure MMC in this 5/14/2009 post:
Available immediately from Code Gallery, download the Windows Azure MMC. The Windows Azure Management Tool was created to manage your storage accounts in Windows Azure.
Following are some of the tool’s features:
- Manage multiple storage accounts
- Easily switch between remote and local storage services
- Manage your blobs
  - Create containers and manage permissions
  - Upload files or even entire folders
  - Read metadata or preview the blob contents
- Manage your queues
  - Create new queues
  - Monitor queues
  - Read (peek) messages
  - Post new messages to the queue
  - Purge queues
Note that the tool doesn’t work with the Windows 7 RC due to a bug in the RC’s PowerShell feature.
• Alin Irimie’s Windows Azure PHP Development Kit post of 5/14/2009 announces:
Microsoft announced today at TechEd India (a show running simultaneously with TechEd 2009 in Los Angeles) a new software development kit for people interested in building apps in PHP for the Windows Azure cloud. The PHP SDK for Windows Azure, a.k.a. PHPAzure, is an open source project, available for download from Microsoft CodePlex. The SDK provides a consistent programming model for Windows Azure Storage (Blobs, Tables, Queues).
but doesn’t offer a link to download the open-source code, which is here. Mary Jo Foley’s Microsoft makes available PHP development kit for its Azure cloud post of 5/13/2009 contains more information about PHPAzure.
• Jamie Thomson describes an SQL Server Integration Services (SSIS) 2008 package to read Azure log files in his Viewing Windows Azure log files using SSIS post of 5/14/2009. His package doesn’t download the log blobs, but it does format them nicely for readability in a viewer. The post includes a link to the package’s source code.
Steve Nagy’s Converting A Web App Into An Azure Cloud Service Web Role post of 3/11/2009 shows you how to start with a Blank Cloud Service, make a change to the Web app’s *.csproj file to add a <RoleType>Web</RoleType> element, and then affiliate the Web app with the cloud service. Steve says:
So there you go! Quick and easy. The important thing now is that you can either run the web app standalone, or you can run it on your local development fabric, OR you can deploy it to Windows Azure.
•• James Hamilton reviews a minibook by Luiz André Barroso and Urs Hölzle of the Google infrastructure team in his The Datacenter as a Computer post of 5/16/2009. James has high praise for The Datacenter as a Computer: An Introduction to the Design of Warehouse-Scale Machines, which “is just over 100 pages long but an excellent introduction into very high scale computing and the issues important at scale.” Here’s the eBook’s abstract:
As computation continues to move into the cloud, the computing platform of interest no longer resembles a pizza box or a refrigerator, but a warehouse full of computers. These new large datacenters are quite different from traditional hosting facilities of earlier times and cannot be viewed simply as a collection of co-located servers. Large portions of the hardware and software resources in these facilities must work in concert to efficiently deliver good levels of Internet service performance, something that can only be achieved by a holistic approach to their design and deployment. In other words, we must treat the datacenter itself as one massive warehouse-scale computer (WSC).
We describe the architecture of WSCs, the main factors influencing their design, operation, and cost structure, and the characteristics of their software base. We hope it will be useful to architects and programmers of today’s WSCs, as well as those of future many-core platforms which may one day implement the equivalent of today’s WSCs on a single board.
•• Andrea DiMaio’s US Federal Government Blesses “Government 2.0” post of 5/16/2009 discusses how the White House’s Crosscutting Programs document addresses transparency as well as participation and collaboration, with emphasis on:
- USAspending.gov, a web site that will allow citizens to verify “when, with whom, and on what the Government is spending taxpayer funds, and whether or not that money is delivering results”. Data will be made available in such a way that users will be able to “combine them into different data sets, conduct analysis and research, or power new information-based products and businesses”.
- Data.gov, the much discussed repository to access public data from across the whole federal government to help unlock the so-called “power of information” and to create value by mashing up public and non-government information. As I mention in earlier posts, this has already triggered initiatives by vendors who are (or want to be seen as) proactive.
- Recovery.gov, which applies the same principles as USAspending.gov to the tracking of funds coming from the Stimulus Package.
•• James Urquhart continues his interoperability series with his Exploring cloud interoperability, part 3 article of 5/16/2009, which asserts:
Today, the focus is on providing a unified API for Infrastructure as a Service operations. In addition to standardizing how systems are provisioned, when they are active and what policies apply for situations like component failure or load spikes, it is also critical that this API unifies the way in which images are imported and exported from each provider's platform. A cloud operations API needs to cover as much of the system life cycle as possible, including provisioning and deployment.
There is so much effort being made in this space right now, by so many groups, that it is at times a little overwhelming. Luckily, I found a resource that has helped me organize not only the predominant operations API effort, but also the image/data and application/service interoperability classes. Believe it or not, it is a wiki dedicated to cloud efforts in the federal government market. (I'm telling you, the feds seem to be way ahead of the pack when it comes to organizing cloud activities these days.)
•• VMGuy’s Could Microsoft Be Contemplating a Data Services Cloud Offering? post of 5/15/2009 analyzes Microsoft’s “plans to consolidate its data storage and Web services business units:”
While most saw this as a sign of the bad economic times, Joe McKendrick thinks maybe it’s actually a smart business move.
Effectively, this move puts all of Microsoft’s SOA-related initiatives, including Oslo, under the same branch as its data storage and cloud services. Here’s why McKendrick thinks it’s a money-making, rather than cost-cutting, move:
I don’t think Microsoft is retrenching or cutting back SOA to save money — rather, I think the vendor sees more opportunity in the cloud, with the growing service-orientation of data management — with SOA as the enabler.
The connection between the cloud and SOA is pretty clear. SOA is services, cloud is services delivered via the Internet. And it’s pretty obvious how cloud computing and data storage go together. But what’s the connection between all three?
VMGuy’s post contains his answer.
• David Linthicum seconds Chris’s concern with “Right to Audit” costs in his SaaS/Cloud Audit Demands Could be Costly post of 5/14/2009. Quoting SC Magazine's Angela Moscaritolo, who focuses on security in the world of SaaS and cloud computing:
With respect to data security, organizations must review the vendor's data protection techniques to ensure appropriate cryptography is used for both data at rest and in motion, and make sure the appropriate documentation is available for auditors. In addition, the provider's access control and authentication procedures should be reviewed, and companies should find out if third parties have access to the information.
• Chris Hoff’s Incomplete Thought: The Crushing Costs of Complying With Cloud Customer “Right To Audit” Clauses post of 5/14/2009 observes:
Almost all of the Cloud providers I have spoken to are being absolutely hammered by customers acting on their “right to audit” clauses in contracts. This is a change in behavior. Most customers have traditionally not acted on these clauses as they used them more as contingency/insurance options. With the uncertainty relating to confidentiality, integrity and availability of Cloud services, this is no more. Cloud providers continue to lament that they really, really want a standardized way of responding to these requests*
These providers — IaaS, PaaS and especially SaaS — are having to staff up and spend considerable amounts of time, money and resources on satisfying these requests from customers.
Chris suggests that the Cloud Security Alliance could, “as a community, facilitate both expectations and deliverables from both the consumer and provider perspective.”
• Robert Westervelt reports Forrester advises cautious approach to cloud computing services in this detailed 5/14/2009 post to SearchSecurity.com. The post analyzes Forrester’s US$1,999 "How secure is your cloud?" report by Chenxi Wang, a principal analyst at Forrester. Chenxi will present a session on the topic in Forrester’s IT Forum 2009 conference’s Track H: Protecting Your Data, Safeguarding Your Reputation on May 19, 4:00 PM at the Palazzo, Las Vegas.
• Reuven Cohen reports in his Federal Cloud Capability RFI Released by U.S. Government post of 5/13/2009 that the U.S. General Services Administration (GSA) has released a Request for Information (RFI) for Infrastructure as a Service (IaaS) offerings, which is the first step in the procurement process leading to selecting IaaS vendor(s):
Description: The GSA Office of the Chief Information Officer (OCIO), in concert with the IT Infrastructure Line of Business (ITI LoB), requests Capability Statements and responses to Business Model, Pricing Model, and Service Level Agreement (SLA) questions, 1 through 5, from vendors who provide Infrastructure as a Service (IaaS) offerings.
Ruv’s post analyzes the RFI document with an emphasis on interoperability and data portability:
For me the most important aspect of this RFI is the emphasis they've placed on cloud Computing Interoperability and Portability, specifically in #5 of the RFI Doc. Something I've been pushing for in my recent Washington meetings. I'm ecstatic they've included some of my recommendations including an "exit strategy", prevention of vendor lock-in and multi-cloud (“cloud-to-cloud”) support.
Terremark Worldwide probably has an inside track because they were chosen to host the USA.gov site, as noted in Rich Miller’s The Obama Team’s Cloudy Ambitions post of 5/13/2009. There’s more about Terremark in my Windows Azure and Cloud Computing Posts for 4/27/2009+ post’s Other Cloud Computing Platforms section.
• Andrea DiMaio chimes in with his US Federal Government Puts Its Toes into the Cloud Computing Water post of 5/12/2009:
Suggested pilots cover a broad spectrum, ranging from user computing to data centers, from portals to content and records management, from case management to enterprise software. It is clear that, while the most immediate impact will be on the IT Infrastructure Line of Business (not by chance GSA’s Patrick Stingley has been named Federal Cloud CTO), other Lines of Business (such as Financial Management or HRM) are soon to come.
It will be interesting to watch how this plays out and affects the marketplace. This is probably the largest scale endorsement of cloud computing of any government and, although it is still at a nascent stage, may set the bar for many other governments to follow.
• J. Nicholas Hoover’s Federal Budget Lays Out Government Cloud Computing Plans article of 5/12/2009 for InformationWeek analyzes “a supplement to the administration's proposed 2010 budget:”
A section of the Analytical Perspectives document on Crosscutting Programs calls for a number of pilot projects that would help the government roll out government-wide common services, including some using cloud computing. According to the document, the government would use these tests to determine security and privacy requirements, develop standards, gather data, and benchmark costs and performance, but the pilots eventually will roll out more widely to federal agencies.
Kevin Jackson concludes “What a great day for Federal Cloud Computing !!” in his President Obama's 2010 Budget Highlights Cloud Computing post of 5/12/2009:
President Obama's 2010 Budget (pp. 157-158) has highlighted cloud computing as a key tool for improving innovation, efficiency and effectiveness in Federal IT.
"Cloud-computing is a convenient, on-demand model for network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. The cloud element of cloud-computing derives from a metaphor used for the Internet, from the way it is often depicted in computer network diagrams. Conceptually it refers to a model of scalable, real-time, internet-based information technology services and resources, satisfying the computing needs of users, without the users incurring the costs of maintaining the underlying infrastructure. Examples in the private sector involve providing common business applications online, which are accessed from a web browser, with software and data stored on the “cloud” provider’s servers."
Reuven Cohen’s White House Leading Cloud Computing Charge post of 5/12/2009 carries an “Optimizing Common Services and Solutions/Cloud-Computing Platform” deck and begins:
Very interesting developments today from the U.S. federal government on cloud computing. Bob Marcus at the OMG has sent me an overview of a White House Cross-Cutting Programs Document released earlier. The document outlines the administration's 2010 budget requests. According to the document White House officials want agencies to launch pilot projects that identify common services and solutions and that focus on using cloud computing. I think the most important aspect of this announcement is that "cloud computing" is now being mandated from the highest levels of the U.S. government.
John Treadway of Cloud Bzz asked my colleague Ben Pring, at our Outsourcing Summit, about how we derived our cloud forecast. Ben’s answer is apparently causing a bit of concern. I figured it might be useful for me to respond publicly, since I’m one of the authors of the forecast.
The full forecast document (clients only, sorry) contains a lot of different segments, which in turn make up the full market that we’ve termed “cloud computing”. We’ve forecasted each segment, along with subsegments within them. Those segments, and their subsegments, are Business Process Services (cloud-based advertising, e-commerce, HR, payments, and other); Applications (no subcategories; this is “cloud SaaS”); Application Infrastructure (platform and integration); and System Infrastructure (compute, storage, and backup).
John Treadway says “Gartner’s definition of cloud computing is at odds with the U.S. Government's” in his Cloud Computing Expo: The Gartner Cloud vs. Everybody Else post of 5/12/2009. John contends:
Gartner’s definition of cloud computing is at odds with the U.S. Government (http://csrc.nist.gov/organizations/fissea/2009-conference/presentations/fissea09-pmell-day3_cloud-computing.pdf), McKinsey (http://uptimeinstitute.org/content/view/353/319), IDC (http://blogs.idc.com/ie/?p=190), U.C. Berkeley (http://www.eecs.berkeley.edu/Pubs/TechRpts/2009/EECS-2009-28.html) and nearly every other definition that’s out there. It provides no actionable value, which in the end is where Gartner tends to shine, and it’s not defensible in any discussion with practitioners and other experts.
James Urquhart asks Are the feds the first to a common cloud definition? on 5/10/2009 (updated 5/12/2009) and then quotes Reuven Cohen and Chris Hoff (@Beaker) posts regarding the recently-replicated cloud computing definition from the National Institute for Standards and Technology (NIST).
Andrea DiMaio reports in his A New Perspective on Cloud Computing in Government post of 5/10/2009 to the Gartner Blog Network:
Over the last several months I have been researching on both web 2.0 and cloud computing in government. Incidentally, both topics are top of mind for the new US administration.
I am working on a research note that explores in detail the analogies between these two topics, but I want to share some of my thoughts on this blog.
It seems to me that most discussions about cloud computing have a bottom-up nature, since they relate to the ability and willingness of governments to gradually commoditize portions of their infrastructure as well as some of their enterprise applications. At the same time though, governments are discussing the role of “open government data” and social media in improving citizen engagement. …
The commoditization of IT assets and services is the “server-side” version of the consumerization of client devices and social media tools that is driving agencies to consider the use of consumer tools to engage with citizens as well as to support internal operations. [Emphasis Andrea’s.]
His subsequent What Does “Commoditization of Government” Mean? post of 5/11/2009 makes this point:
[A]ll aspects of government, ranging from service delivery to operations, from IT to policy making, are at risk of being commoditized as a consequence of changes we see already happening. The commoditization of government concerns infrastructure and applications, as well as the socialization of information and the engagement of external constituents in service delivery, problem solving and policy-making through crowdsourcing. These are just early signs of a deeper, longer term process that will cause many government organizations to rethink how special they really are and to what extent they can leverage resources that are commodities.
•• Mike Ormond’s Interested in Azure? post of 5/14/2009 announces the next meeting of the UK Azure NET users group. Additional details and registration are at http://ukazurenet-in-rest.eventbrite.com/. The signup list is quite long at the moment.
When: May 19, 2009
Where: Microsoft London (UK) Office
Randy Bias’s CloudSlam ‘09 Conference Materials post of 5/10/2009 delivers links to *.wmv files on Amazon S3 of presentations at the CloudSlam ‘09 virtual conference. The list matches, as far as I can determine, the CloudSlam schedule posted here. The links are reported to be good until 5/20/2009, so don’t let grass grow under your feet before downloading them. File size is about 100 MB, on average.
When: April 20-24, 2009
Where: Virtual Event (Internet)
Geva Perry says in his Enterprise Cloud Summit post of 5/4/2009:
When: May 18-19, 2009
I'll be speaking on a panel at the upcoming Enterprise Cloud Summit in Las Vegas (@Interop), May 18-19. The topic of the panel is "Where Can Things Go Wrong?" and should be a nice conversation with the moderator Greg Ness and these panelists:
Peter Coffee, Director, Platform Research, salesforce.com
Randy Rowland, General Manager, Managed Hosting & Cloud Computing Services, Terremark Worldwide, Inc.
Geva Perry, Founder, Thinking Out Cloud
Bill McGee, Vice President, Products and Technology, Third Brigade
The rest of the agenda also looks very interesting. Check it out here. And there's also a CloudCamp event on Monday evening.
If you haven't signed up already, you can register here and get a 40% discount.
Where: Mandalay Bay Convention Center, Las Vegas
Don Dodge’s What’s Next in Tech? Boston post of 5/12/2009 describes the upcoming What’s Next In Tech - Where will the next waves of growth come from? panel. Don says “A panel of entrepreneurs will reveal their thoughts on the next big thing, followed by a panel of VCs discussing where they are investing their money.”
One of the topics is cloud computing; Don provides the following as a discussion starter:
When: June 25, 2009
Price: $40
Cloud Computing – Obviously…everyone knows this. Amazon, Google, Microsoft, and other big players are making huge investments in Cloud Computing. But, where are the opportunities for startups? Every decade it seems a platform shift creates opportunities for tools (management, monitoring, migration, security, performance, auditing, etc) to manage the transition. All the tools that worked so well in the mainframe era didn’t work for client/server. New companies emerged to fill the gap. Then web based applications came along and created a new need for software tools to manage the process. The old client server tools didn’t work. The move to virtualization did it again. And now we have a new set of challenges in cloud computing. There are lots of gaps and holes to fill.
Where: Boston University, School of Management, 595 Commonwealth Ave, Boston, 02215
Brandon Watson says in his GigaOm Structure ‘09 Event post of 5/12/2009 that he’ll be a member of the Toward Cloud Computing: Private Enterprise Clouds As A First Step panel at Om Malik’s GigaOm Structure ‘09 event in San Francisco on 6/25/2009. The theme of Structure ‘09 is “Put Cloud Computing to Work.” Here’s the panel’s description:
Enterprises aren’t yet able or ready to migrate their applications to public clouds. Public cloud infrastructure can’t run existing enterprise applications without requiring a rewrite. Yet the lure of usage-based resourcing is strong. The solution? Build your own cloud with your existing infrastructure investment as a first step. This panel will contrast vendor visions about how internal systems can work in unison with external cloud platforms. A must attend for those looking to find solutions or opportunity in the migration path to cloud services.
Click here to register with an Azure-related discount code.
When: June 25, 2009
Where: Mission Bay Conference Center, San Francisco, CA
•• Yogesh Gupta says Cisco Gets into Cloud Computing with Unified Service Delivery in this 5/15/2009 post:
Cisco strengthened its foray into cloud-based domain with the launch of Cisco Unified Service Delivery (USD). The solution will aid service providers to build a platform for cloud services by combining Cisco's datacenter portfolio and Unified Computing architecture to deliver data and video services to any place and device. Elaborating on the Cisco DataCentre 3.0 strategy, Rajesh Chainani, Country Director- Service Provider, Cisco India said, "From datacenter 2.0 focusing on client server and distributed computing, the market has evolved into third phase of datacenter services which revolve around service oriented and web 2.0 based virtualized environment."
•• Reuven Cohen’s Citrix Jumps on Cloud Hosting Bandwagon post of 5/15/2009 reports:
The cloud hosting & service provider market seem to be becoming the key battle ground for cloud enablers formerly known as virtualization vendors. Following upon recent announcements from VMware and Cisco, Citrix has announced a Service Provider Program aimed squarely at service/hosting providers who deliver software services and hosted applications to end-user customers on a rental, subscription or services basis, A.K.A. Cloud Service Providers (CSP).
The most interesting aspect of the new program has to do with Citrix's approach to billing. The program is designed with cloud business goals in mind with no up-front license fee commitments. Cloud hosters need only submit monthly usage reports and are invoiced accordingly. The program is being offered as part of Citrix Cloud Center (C3) which they describe as designed to give cloud providers a complete set of service delivery infrastructure building blocks for hosting, managing and delivering cloud-based computing services.
• James Watters asks IBM’s Webspan Cloud to Compete with Azure? on 5/14/2009 and responds with a definite “No way”:
Having thrown a little sand at IBM’s lack of a cloud compute service, I read the recent headlines about IBM Webspan with some trepidation. Geek’s.com proclaimed “IBM announces WebSpan, the first Windows Azure alternative,” gulp, had IBM really created a massive cloud service to rival the scope of Azure?
No: despite the hyperbolic headline, this is an existing SaaS business integration provider named Hubspan building a Websphere-based extension to its platform. Hubspan/Webspan is not a general compute platform; instead it is business integration SaaS with existing integration customers such as Visa. As I detailed in my initial piece, the economic pressures and operational challenges of a generalized compute cloud remain unattractive to big blue. If Azure wants to be the OS for the next 50 years, backed by massive data-center investments, Webspan is hardly comparable in scope or investment.
• Aaron Ricadela asks What's Holding Back Google Apps? in this 5/14/2009 BusinessWeek article. The topic is Google’s office productivity apps but the objections to their use unquestionably apply to PaaS and IaaS cloud computing, too.
Andrew Ross Sorkin analyzes Sun Microsystems’ Preliminary Proxy Statement from the SEC in his Sun’s Deal Saga and the Mystery Suitor “Dealbook” post of 5/12/2009, which choreographs what became the mating dance of Oracle and Sun:
The blow-by-blow account describes a saga in which various suitors showed interest, backed away and then re-engaged. One leading contender, described in Tuesday’s regulatory filing as “Party A,” is a dead ringer for I.B.M., whose offer for Sun was complicated by Sun’s concerns that antitrust regulators might block the deal.
James Urquhart’s As Citrix vies for cloud lead, is anyone following? post of 5/12/2009 observes:
Last week's announcement of enhancements to Citrix Cloud Center (C3) at Citrix Synergy 2009 was one that made me sit up and take notice. A while ago, I proclaimed that the era of the "cloud os" had begun, and I called out VMWare vCloud, Citrix C3 and 3TERA AppLogic as examples of what would eventually become cloud operating systems.
Citrix (and the former XenSource team) has been strangely silent since that post. Yeah, there have been one or two "announcements" that basically positioned existing Citrix technologies as being cloud infrastructure, but all in all both VMWare and 3TERA greatly outstripped Citrix in the marketing department.
Geva Perry notes that “It's no surprise then that consumer-oriented companies, such as Amazon and Google, are the ones leading the charge in what is essentially a B2B market” in his Marketing Cloud Computing: Uncharted Territories post of 5/12/2009.
Geva Perry’s Hubs, Spokes and Islands in the Cloud post of 5/12/2009 begins a series about the roles of startups in the cloud marketspace:
This first post in the series has to do with the cloud offering's role in its ecosystem. Before I jump into it, I acknowledge the fact that there is a very large body of work about this topic. Please see my footnote on this.
We can think about it as the hub-spoke-island dilemma: cloud providers (IaaS, PaaS, SaaS) and cloud services need to make a strategic decision about their role within the ecosystem.
Some are the sun of their solar system (hub) and others play the role of the planets (spoke). But things can get a bit more complex than that and the decision on what role to play is not trivial.