••• Update 8/1 and 8/2/2009: Additions (July 2009 traffic report SeaDragon view, NIST SP 800-53 Revision 3 and others)
•• Update 7/30 and 7/31/2009: Additions (Cloud Computing Use Cases whitepaper, Cloud Camp Boston reports, Cloud Security podcast, National Business Center and many more)
• Update 7/28 and 7/29/2009: Additions
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:
- Azure Blob, Table and Queue Services
- SQL Azure Database (SADB)
- .NET Services: Access Control, Service Bus and Workflow
- Live Windows Azure Apps, Tools and Test Harnesses
- Windows Azure Infrastructure
- Cloud Security and Governance
- Cloud Computing Events
- Other Cloud Computing Platforms and Services
To use these links, click the post title to display the single article you want to navigate.
••• The Tahoe-LAFS group’s ANNOUNCING Tahoe, the Lofty-Atmospheric Filesystem, v1.5 post of 8/2/2009:
Tahoe-LAFS is the first cloud storage technology which offers security and privacy in the sense that the cloud storage service provider itself can't read or alter your data. Here is the one-page explanation of its unique security and fault-tolerance properties: http://allmydata.org/source/tahoe/trunk/docs/about.html
This release is the successor to v1.4.1, which was released April 13, 2009. This is a major new release, improving the user interface and performance, fixing a few bugs, and adding ports to OpenBSD, NetBSD, ArchLinux, NixOS, and embedded systems built on ARM CPUs. See the NEWS file for more information.
Tahoe-LAFS also integrates with Hadoop, TiddlyWiki, and more; see the Related Projects page on the wiki.
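Tahoe's "provider can't read or alter your data" property comes from doing encryption and integrity checking on the client before anything is uploaded. The Python sketch below illustrates only that principle; Tahoe's actual design (capability strings, erasure coding) is far more sophisticated, and the toy SHA-256 counter-mode keystream here is for illustration, not production cryptography.

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream from SHA-256 in counter mode -- illustration only,
    # NOT production cryptography.
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def client_encrypt(key: bytes, plaintext: bytes):
    # Encrypt and fingerprint on the client; the provider never sees either.
    ciphertext = bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))
    digest = hashlib.sha256(plaintext).hexdigest()   # kept locally by the client
    return ciphertext, digest

def client_decrypt_and_verify(key: bytes, ciphertext: bytes, digest: str) -> bytes:
    plaintext = bytes(a ^ b for a, b in zip(ciphertext, keystream(key, len(ciphertext))))
    if hashlib.sha256(plaintext).hexdigest() != digest:
        raise ValueError("storage provider returned altered data")
    return plaintext

# The provider stores only ciphertext; key and digest never leave the client.
key = secrets.token_bytes(32)
stored, digest = client_encrypt(key, b"my private notes")
assert stored != b"my private notes"                           # provider can't read it
assert client_decrypt_and_verify(key, stored, digest) == b"my private notes"
```

Tampering on the provider side shows up as a digest mismatch on retrieval, which is the "can't alter" half of the claim.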
•• Maarten Balliauw added on 7/31/2009 a Table storage session handler to the PHP SDK for Windows Azure source code on CodePlex:
When running a PHP application on the Windows Azure platform in a load-balanced mode (running 2 Web Role instances or more), it is important that PHP session data can be shared between multiple Web Role instances. The PHP SDK for Windows Azure provides the Microsoft_Azure_SessionHandler class, which uses Windows Azure Table Storage as a session handler for PHP applications.
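For readers who want the shape of the pattern without the PHP SDK in front of them, here's a minimal Python sketch: session state is persisted to shared table storage so any load-balanced instance can serve any request. A plain dict stands in for the Azure table, and the class and method names here are mine, not the SDK's.

```python
import pickle
import time

class TableBackedSessionHandler:
    """Session-handler pattern: persist session state to shared table storage
    so any load-balanced web role instance can serve any request.
    A dict stands in for the Windows Azure table in this sketch."""

    def __init__(self, table, lifetime_seconds=1440):
        self.table = table          # shared storage, visible to all instances
        self.lifetime = lifetime_seconds

    def read(self, session_id):
        row = self.table.get(session_id)
        if row is None or row["expires"] < time.time():
            return {}               # missing or expired session
        return pickle.loads(row["data"])

    def write(self, session_id, session_data):
        self.table[session_id] = {
            "data": pickle.dumps(session_data),
            "expires": time.time() + self.lifetime,
        }

    def destroy(self, session_id):
        self.table.pop(session_id, None)

# Two "web role instances" sharing one table see the same session:
shared_table = {}
instance_a = TableBackedSessionHandler(shared_table)
instance_b = TableBackedSessionHandler(shared_table)
instance_a.write("sess-1", {"user": "maarten"})
print(instance_b.read("sess-1"))  # {'user': 'maarten'}
```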
• Maarten Balliauw announced on 7/29/2009 that you can now Query the cloud with PHP (PHPLinq and Windows Azure):
I’m pleased to announce PHPLinq currently supports basic querying of Windows Azure Table Storage. PHPLinq is a class library for PHP, based on the idea of Microsoft’s LINQ technology. LINQ is short for language integrated query, a component in the .NET framework which enables you to perform queries on a variety of data sources like arrays, XML, SQL server, ... These queries are defined using a syntax which is very similar to SQL.
In addition to querying arrays, XML, and objects, which was already supported, PHPLinq now enables you to query Windows Azure Table Storage in the same manner you would query a list of employees: simply pass PHPLinq a Table Storage client and a table name as a storage hint in the in() method.
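If you haven't seen LINQ-style querying before, this rough Python analogue shows the fluent where()/select() chaining that PHPLinq provides; the class here is invented for illustration, and with Table Storage the source would be a table client rather than an in-memory list.

```python
class Linq:
    """A tiny LINQ-style fluent query over any iterable -- a rough Python
    analogue of PHPLinq's in()/where()/select() chain."""

    def __init__(self, source):
        self.source = list(source)

    def where(self, predicate):
        return Linq(x for x in self.source if predicate(x))

    def select(self, projector):
        return Linq(projector(x) for x in self.source)

    def to_list(self):
        return self.source

employees = [{"name": "Ann", "age": 34}, {"name": "Bob", "age": 29}]
names = (Linq(employees)
         .where(lambda e: e["age"] > 30)
         .select(lambda e: e["name"])
         .to_list())
print(names)  # ['Ann']
```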
••• Eric Baldeschwieler’s News Flash: Hadoop Development Going Strong @ Yahoo! post of 7/30/2009 to the Yahoo! Developer Network blog says:
Many folks have been asking us what impact Yahoo's recently announced search deal with Microsoft will have on our Hadoop team.
Don't Panic! We are as committed as ever to building a world class open source Cloud Computing infrastructure and Apache Hadoop remains our solution for batch computing. Hadoop is used to solve many, many internet scale problems beyond search at Yahoo. Today's deal only improves Yahoo's ability to invest in Hadoop.
Eric is VP Hadoop Software Development at Yahoo!
•• Maximilian Ahrens discusses Databases in the Cloud in his 7/30/2009 post:
[T]echnically, there is nothing that prevents databases from residing in the cloud. To understand the complex relation between databases and the cloud, one needs to understand the complex chain of problems that need to be solved before a database with important data resides in the cloud. These problems are:
- [L]egal aspects of where the data resides
- [L]ong term custody warranties
- [T]rust in the cloud
The primary issue to date has been whether relational databases (as opposed to the entity-attribute-value—Hadoop or BigTable—data model) are suited for cloud deployment. Microsoft’s SQL Azure Database will tell the tale when we have a chance to test it.
Rick Grehan’s Open source Hive: Large-scale, distributed data processing made easy InfoWorld article of 7/23/2009 claims:
Thank heaven for Hive, a data analysis and query front end for Hadoop that makes Hadoop data files look like SQL tables.
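Hive's trick, making flat data files queryable as SQL tables, is easy to demonstrate with sqlite3 standing in for the Hadoop stack; the difference is that Hive compiles comparable queries into MapReduce jobs over files in HDFS rather than executing them locally. The sample data below is invented.

```python
import sqlite3

# A flat "data file" of comma-separated records:
raw_log = "alice,200\nbob,404\ncarol,200\n"

# Project the file into a SQL table and query it -- Hive's core idea,
# with sqlite3 playing the role of Hive/Hadoop for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE access_log (user TEXT, status INTEGER)")
rows = [line.split(",") for line in raw_log.strip().split("\n")]
conn.executemany("INSERT INTO access_log VALUES (?, ?)", rows)

cursor = conn.execute(
    "SELECT status, COUNT(*) FROM access_log GROUP BY status ORDER BY status")
print(cursor.fetchall())  # [(200, 2), (404, 1)]
```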
•• John Fontana reports on 7/30/2009 from Burton’s Catalyst09 conference: Cloud-based identity services taking on a different look as they grow:
Cloud-based identity services are starting to gain a foothold among corporate users, but the evolving architecture looks nothing like the platforms companies have been building internally, according to Bob Blakley, vice president and research director at the Burton Group.
“The perception has been that [cloud-based identity services] would be this big monolithic thing, but that is not what the service providers built,” said Blakley, who spoke on the opening day of the Burton Group Catalyst Conference in San Diego. “What the market is building is a set of small specialty firms that handle individual identity tasks and offer discrete billable units that companies can put together.”
What is emerging, says Blakley, is the ability to build a virtual identity provider using a multitude of different services.
That’s what federated identity (and “Geneva”) is all about.
••• Jim Nakashima updated on 7/29/2009 the Windows Azure Code Samples on the MSDN Code Gallery for the Windows Azure July 2009 CTP.
•• Cumulux’s Demystifying Cloud Computing Costs, a “Multi-part Series on Azure Pricing for different Application Scenarios” starting on 7/31/2009 is based on the premise that:
With the proliferation of Cloud platforms, Enterprises are gearing up to develop and deploy Cloud-based applications. One of the first questions they have to answer is "Does it make economic sense?" But determining the economic impact of the cloud is an inexact science.
Complex, granular prices like $0.10 per compute hour or $0.01 per 10K messages make it very hard to put costs in the perspective of real-world applications and to determine whether the potential benefits are worth it and what the Total Cost of Ownership (TCO) is likely to be.
Cumulux is authoring a three-part white paper series that demystifies the pricing model of the Cloud in the context of typical application scenarios. It is aimed at decision makers, enabling them to make educated decisions about the Total Cost of Ownership of running applications on the Azure Services Platform.
Apparently, Cumulux missed the name change from Azure Services Platform to Windows Azure Platform.
“Part I - Compute Intensive Financial Services Application” is available for download now.
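Back-of-the-envelope, the rates Cumulux mentions are easy to turn into a monthly estimate once you pick a workload; only the two unit prices below come from the quoted excerpt, while the instance count and message volume are hypothetical.

```python
COMPUTE_PER_HOUR = 0.10     # rate quoted: $0.10 per compute hour
PER_10K_MESSAGES = 0.01     # rate quoted: $0.01 per 10K messages

def monthly_compute_cost(instances, hours_in_month=744):
    # 744 hours = a 31-day month; instances bill whether busy or idle
    return instances * hours_in_month * COMPUTE_PER_HOUR

def monthly_message_cost(messages):
    return messages / 10_000 * PER_10K_MESSAGES

# Hypothetical workload: 2 load-balanced instances, 50 million queue messages
compute = monthly_compute_cost(2)             # about $148.80
messages = monthly_message_cost(50_000_000)   # about $50.00
print(round(compute + messages, 2))           # 198.8
```

Even this trivial model makes the point: the compute meter, not the per-message pennies, dominates the bill for most workloads.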
••• John Treadway says SaaS v. Cloud Should Not Be Contentious in this 8/1/2009 reply to Chris Hoff:
… I think that this approach, while intellectually interesting, is perhaps a bit off the mark. While NIST’s framework for cloud computing is generally accepted on the surface, the lower-level distinctions they make may not be so universally agreed upon. …
••• Chris Hoff (@Beaker) starts another controversy with his Contentious Issue: When Does a SaaS Offering Qualify As a Cloud SaaS Offering? post of 8/1/2009:
I made a comment on Twitter a couple of days ago reacting to how some were positioning McAfee’s purchase of MX Logic as the latter representing a “Cloud Security provider.”
The link above has the article’s author referring to the deal as one focused on the expansion of McAfee’s “Cloud portfolio” whilst all the McAfee quotes refer to it as bolstering their “security-as-a-service” offerings.
I read many articles referring to this deal as “Cloud” in nature and in a fit of frustration I said:
I’m sorry, but MX Logic is not a “Cloud Security Provider”
That caught the eye of Erik Boles (@ErikBoles) who suggested that because MX Logic is a SaaS provider, they are a Cloud provider and have been since their start in 2002. MX Logic’s website advertises them as a SaaS provider, but not a Cloud provider. McAfee refers to them as security-as-a-service. I thought it was pretty clear. Then Erik kept pushing. I’m glad he did.
••• Stefan Reid’s 7/27/2009 Forrester archived teleconference, PaaS: How To Benefit From A $15 Billion Market, is available for listening or watching at $250/pop. Description:
The increasing adoption of software-as-a-service (SaaS) across the world requires software vendors to look for a more efficient way of deploying business applications on multitenant platforms. Because independent software vendors (ISVs) are starting to realize the value of platform-as-a-service (PaaS) offerings, Forrester has applied its sizing methodology for emerging markets to this new platform market.
Based on the current assumptions, the PaaS market will grow over eight years to a size of $15 billion in total volume. This teleconference helps you to understand Forrester's sizing methodology step by step.
- ISVs, outsourcing providers, and corporate data centers can use PaaS.
- Sizing an emerging market — Forrester's sizing methodology
- Forecast of market dynamic
- How can you leverage existing PaaS platforms?
Vendors mentioned: 10gen, Apprenda, Bungee Labs, Caspio, Daptiv, Google, Magic Software, Microsoft, NetSuite, OrangeScape Technologies, Relational Networks (LongJump), Salesforce.com, Stax Networks, and Zoho
••• Stefan Reid presents Yet Another Cloud: Forrester’s First Proposal Of A Cloud Taxonomy (7 pp., US$1,749). Executive summary:
Despite the current crisis, the demand for cloud computing is growing at enterprises and small and medium-sized businesses (SMBs). Some customers consider virtual infrastructure in the Internet as cloud, while others deem their business applications delivered in a software-as-a-service (SaaS) deployment style as cloud computing. This enormous bandwidth of customer perception often leads to confusion. Also, many service providers and product vendors are afraid of missing the cloud bus and are starting to call everything a cloud product.
This report tries to classify the various cloud and SaaS services and relates them to existing products and services. Vendors can categorize their portfolios along this taxonomy and structure their marketing positioning to be more understandable and, more importantly, more credible. IT users can also use the taxonomy to structure their cloud-sourcing approach.
•• The Cloud Computing Use Case Discussion Group presents their 36-page (869-line) Cloud Computing Use Cases whitepaper of 7/31/2009:
Contributors: Dustin Amrhein, Joe Armstrong, Ezhil Arasan B, Richard Bruklis, Reuven Cohen, Rodrigo Flores, Gaston Fourcade, Thomas Freund, William Jay Huie, Sam Johnston, Ravi Kulkarni, Anil Kunjunny, Gary Mazzaferro, Andres Monroy-Hernandez, Dirk Nicol, Lisa Noon, Santosh Padhy, Balu Ramachandran, Jason Reed, German Retana, Dave Russell, Krishna Sankar, Patrick Stingley, Robert Syputa, Doug Tidwell, Kris Walker, Kurt Williams, John M Willis, Yutaka Sasaki, and Fred Zappert.
Public comments on this document are welcomed and encouraged at http://groups.google.com/group/cloud-computing-use-cases.
The paper also includes a taxonomy diagram.
Many well-respected cloud-computing “thought leaders” are missing from the Contributors list.
•• Reuven Cohen describes the whitepaper in his IBM's Crowd-Sourced Cloud Computing Use Cases White Paper Published post of the same date. Ruv announced the first draft in IBM Cloud Computing Use Cases Group Releases Draft White Paper on 7/5/2009.
Maybe IBM’s herding (by Doug Tidwell?) of “lots of people” contributing to the whitepaper is responsible for the missing “thought leaders.”
•• Aaron Skonnard’s Service Virtualization and the Managed Services Engine (MSE) post of 7/30/2009 promotes Microsoft’s MSE:
Over the past several months, I’ve spent a lot of time exploring the concept of service virtualization. Service virtualization is an emerging trend in the SOA landscape that focuses on providing a common infrastructure for building and managing a complex service ecosystem - the Managed Services Engine (MSE) brings service virtualization to life on the Microsoft platform. During my journey, I wrote three papers you might find helpful:
- Why Service Virtualization Matters?, a four-page executive summary on service virtualization
- SOA Simplified: Service Virtualization With The Managed Services Engine, an article published in the May 2009 issue of MSDN Magazine
- An Introduction to Service Virtualization on the Microsoft .NET Platform, a 57-page whitepaper that is an in-depth extension of the MSDN Magazine article
In the end, service virtualization can help reduce your time-to-market for new investments on-premises or in the cloud and it will provide a more realistic approach to managing your service ecosystem as it grows over time – it can help you realize the full benefits SOA has to offer.
•• Lori MacVittie explains “[t]he importance of a full-proxy architecture to application delivery, security, cloud computing, and virtualization” in her Two Different Sock(et)s post of 7/31/2009:
“Two different socks” is probably the most accurate (and simplest) description of a full-proxy based application delivery platform, at least if you’re a developer and have an understanding of network-oriented programming. If you’ve ever written even a simple TCP-based application in, well, just about any environment and have had a need to reference examples you’ll recall that sample code often uses the variable “sock” to represent a reference to an accepted TCP connection over a socket. …
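Lori's point is easiest to see in code: a full proxy terminates the client's TCP connection on one socket and opens a second, independent connection to the backend, which is what lets it inspect or rewrite traffic in both directions. Here's a minimal Python sketch, with a toy upper-casing echo backend standing in for a real application server.

```python
import socket
import threading

def backend(server_sock):
    """A trivial backend: echo one message back upper-cased."""
    conn, _ = server_sock.accept()
    conn.sendall(conn.recv(1024).upper())
    conn.close()

def full_proxy(listen_sock, backend_addr):
    """A full proxy terminates the client's TCP connection (one socket) and
    opens its own, separate connection to the backend (a second socket) --
    two different sock(et)s, so it can inspect or rewrite either side."""
    client_side, _ = listen_sock.accept()                   # socket #1: client <-> proxy
    backend_side = socket.create_connection(backend_addr)   # socket #2: proxy <-> backend
    request = client_side.recv(1024)        # full visibility into the request here
    backend_side.sendall(request)
    client_side.sendall(backend_side.recv(1024))  # ...and into the response here
    backend_side.close()
    client_side.close()

def listener():
    s = socket.socket()
    s.bind(("127.0.0.1", 0))   # OS picks a free port
    s.listen(1)
    return s

backend_sock, proxy_sock = listener(), listener()
threading.Thread(target=backend, args=(backend_sock,)).start()
threading.Thread(target=full_proxy,
                 args=(proxy_sock, backend_sock.getsockname())).start()

with socket.create_connection(proxy_sock.getsockname()) as client:
    client.sendall(b"hello")
    print(client.recv(1024))  # b'HELLO'
```

A half-proxy or simple packet forwarder, by contrast, shuttles bytes between the two endpoints without ever owning a complete connection on each side.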
•• Randy Bias’s The ‘Cloud’ Is NOT Outsourcing post of 7/30/2009 begins:
There was recently a small brouhaha on twitter regarding whether a ‘private’ or ‘internal’ cloud is really a ‘cloud’. There was a very high level of chatter including a ton of the clouderati such as @jamesurquhart, @samcharrington, @boblozano, @jesserobbins, @ITKLcameron, @samj, and many more. The argument of the folks who claim that internal clouds aren’t really clouds is best summarized by @GeorgeReese, who essentially said clouds must be:
- Utility Charge Model (pay by the hour, with no contracts)
My purpose in this particular post is not to pick on George. He’s a super sharp guy, who literally wrote the book on Cloud Application Architectures (with an Appendix by yours truly). George has a point of view that is worthy of consideration; however, I disagree with it and I want people to understand why internal clouds are still ‘cloudy’. More importantly, I’d like them to know that internal clouds are going to be where some of the biggest game-changing in the cloud computing space happens. …
•• Bruce Guptil says “Saugatuck [Research] sees ad-supported SaaS (and, possibly, Cloud) services as an important part of the future of online business and IT” in his Yahoo, Microsoft and Google: Search is a Button; SaaS is Business Research Alert of 7/29/2009 (requires site registration.)
• David Linthicum’s Busted: Three myths of cloud computing post of 7/29/2009 is his “first post for InfoWorld's Cloud Computing blog” in which he deals with “three oft-repeated falsehoods about the cloud:”
- Cloud computing is a throwback to the traditional timesharing model.
- Cloud computing is always cheaper.
- Cloud computing is unsecure.
• Randy Bias’s Up, Out, Centralized, and Decentralized post of 7/28/2009 begins:
It can be confusing to understand how to scale computing systems, but it’s not rocket science. There are really only two main axes of scale: out and up. Closely related to the axis of scale is the general type of architecture: centralized or decentralized. In this article I’m going to briefly revisit scaling and then talk about centralized vs. decentralized architectures.
Randy goes on to discuss:
- The Axes of Scale
- Centralized and Decentralized Systems
Scaling up via centralized systems is still a viable architectural decision for those whose growth needs fit Moore’s Law. Given the advent of cloud computing and the ability to add more servers when needed, scale-out tactics for building decentralized systems have been gaining prevalence. We will begin to see more and more scale-out solutions even within the enterprise as startups like Cloudera, ParaScale, StackJet, and many others build easier-to-manage decentralized systems. I am very much looking forward to this new world as it solves a great many hard problems in a very efficient manner. Just remember that scaling up will always be a viable and, in some cases, cost-effective architectural decision.
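The trade-off Randy describes reduces to simple arithmetic: scaling out means adding commodity nodes until aggregate capacity meets demand, while scaling up means sizing one bigger machine ahead of demand. A sketch with made-up throughput numbers:

```python
import math

def scale_out_nodes(target_rps, per_node_rps):
    """Scale out: add commodity nodes until aggregate capacity meets demand."""
    return math.ceil(target_rps / per_node_rps)

def scale_up_capacity(target_rps, headroom=1.25):
    """Scale up: one bigger machine, sized with headroom for future growth."""
    return target_rps * headroom

# Hypothetical workload growing 10x, served by 500 req/s commodity nodes:
for demand in (1_000, 10_000):
    print(demand, "rps ->", scale_out_nodes(demand, 500), "nodes, or one",
          scale_up_capacity(demand), "rps machine")
```

The asymmetry is the point: scale-out absorbs a 10x demand jump by adding nodes incrementally, while scale-up requires buying (and eventually hitting the ceiling of) ever-larger single machines.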
• David Linthicum cites The Fear of Multitenancy in this 7/28/2009 post:
In the world of SaaS and cloud computing there is one single word that will send chills up the spines in IT: Multitenancy.
He goes on by quoting Wikipedia’s definition and concludes that multitenancy really isn’t or shouldn’t be an issue.
• Brian Sommer asks Is Your IT Shop Mature Enough for Cloud Computing? in this 7/27/2009 post to ZD Net’s Software & Services Safari blog:
Recently, I’ve written about software vendors being ready (or not) for cloud computing. I’ve also written about large consultancies being able to support their clients as they move to the cloud. These consultancies and systems integrators are taking the cloud mainstream.
Last week, I spoke with an HP executive about the cloud and their insights into client adoption for same. One of the subjects we covered focused on the readiness of clients to adopt the cloud for some of their processing needs. Note, these aren’t HP’s words or opinions – they are exclusively mine. But, I thank the folks at HP for the illuminating conversation and the thoughts they provoked.
• Chris Hoff (@Beaker) continues his meditation on the Inter-Cloud with an Inter-Cloud Rock, Paper, Scissors: Service Brokers, Semantic Web or APIs? essay of 7/27/2009, which concludes:
Here’s how I see Inter-Cloud playing out: In the short term we’ll need the innovators to push with their own API’s, then the service brokers will abstract them on behalf of consumers in the mid-stream and ultimately we will arrive at a common, open and standardized way of solving the problem in the long term with a semantic capability that allows fluidity and agility in a consumer being able to take advantage of the model that works best for their particular needs.
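Hoff's mid-term stage, service brokers abstracting provider APIs on behalf of consumers, is essentially the classic adapter pattern. In this sketch both provider APIs and the broker interface are invented for illustration:

```python
class ProviderA:
    """Hypothetical cloud provider with its own native API."""
    def launch_instance(self, size):
        return f"A:{size}"

class ProviderB:
    """Another hypothetical provider, with a different native API."""
    def create_vm(self, flavor):
        return f"B:{flavor}"

class ServiceBroker:
    """A broker hides each provider's native API behind one consumer-facing
    interface, so workloads can move between clouds without being rewritten
    against each provider's own calls."""
    def __init__(self):
        self.adapters = {
            "provider-a": lambda size: ProviderA().launch_instance(size),
            "provider-b": lambda size: ProviderB().create_vm(size),
        }

    def provision(self, provider, size):
        return self.adapters[provider](size)

broker = ServiceBroker()
print(broker.provision("provider-a", "small"))  # A:small
print(broker.provision("provider-b", "small"))  # B:small
```

The long-term semantic stage Hoff envisions would replace the hand-written adapter table with a standardized, machine-readable description of each provider's capabilities.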
• A Microsoft Case Study, Internet Firm Speeds Time-to-Market by 40 Percent with Startup Support, “Cloud” Computing, of learn-to-write Eduify start-up claims:
Eduify was founded to provide students with educational technology that can assist them in researching topics faster and writing better. To create its solution, Eduify received assistance as well, from Microsoft® technologies and services such as the Azure™ Services Platform, the Microsoft BizSpark™ ecosystem, and Windows Mobile® 6.
Eduify took advantage of these technological and business resources to create a software-plus-services solution that delivers online writing help to students anywhere, anytime, with functionality and a user experience not possible from the Web alone. At the same time, Eduify gained a 40 percent faster time-to-market, U.S.$500,000 in development and other savings, and easier access to the largest market of mobile device users.
Obviously written before the name-change to Windows Azure Platform.
Simon Wardley’s OSCON ‘09 keynote video, Cloud Computing – Why IT Matters, is on YouTube. Simon is Software Services Manager / Cloud Computing Strategist at Canonical Ltd. Canonical is the main sponsor and supporter of the Ubuntu distribution.
Microsoft’s U.S. Partner Team offers the following one-hour Live Meeting training courses or presentations for the Windows Azure Platform and its components:
- What Is Azure?
- ISV Innovation: Windows Azure for Developers, Part 1: Fundamentals
- ISV Innovation: Windows Azure for Developers, Part 2: Developing a Windows Azure Application
- ISV Innovation: Windows Azure for Developers, Part 3: .NET Services
- ISV Innovation: Windows Azure for Developers, Part 4: Microsoft SQL Data Services
- ISV Innovation: Windows Azure for Developers, Part 5: Live Framework Services
You might need to register as a Microsoft Partner to view these courses.
••• The National Institute of Standards and Technology updated one of its major IT security guidance publications, characterizing it as "historic in nature." Here’s how the NIST Issues "Historic" Security Controls Guidance article of 7/31/2009 describes SP 800-53 Revision 3:
Special Publication 800-53 Revision 3 - Recommended Security Controls for Federal Information Systems and Organizations - includes security controls in its catalogue for national security and non-national security systems, a first in its continuing initiative to develop a unified IT security framework for government agencies and contractors. NIST said the updated security control catalogue incorporates best practices in information security from the Department of Defense, intelligence community and civilian agencies to produce the most broad-based and comprehensive set of safeguards and countermeasures ever developed for information systems.
Revision 3, according to NIST, contains significant changes from earlier versions, including:
- A simplified, six-step risk management framework;
- Additional security controls and control enhancements for advanced cyber threats;
- Recommendations for prioritizing or sequencing security controls during implementation or deployment;
- Revised security control structure with a new references section;
- Elimination of security requirements from supplemental guidance sections;
- Guidance on using the risk management framework for legacy information systems and for external providers of information system services;
- Updates to security control baselines consistent with current threat information and known cyber attacks;
- Organization-level security controls for managing information security programs;
- Guidance on the management of common controls within organizations; and
- Strategy for harmonizing Federal Information Security Management Act security standards and guidelines with international security standard ISO/IEC 27001.
••• James Urquhart contends that In cloud computing, data is not electricity in this 8/1/2009 post:
… Here is the fundamental difference between data and electricity: With electricity, I don't care what electrons pushed the electrons that ultimately come out of the socket. I also don't care that if I were to generate power and supply it to the grid (through, say, solar panels on my home) who might take that electricity and store it in a battery someplace. An amp is an amp is an amp.
With the cloud, however, I care about exactly which bits come out of my ethernet port. Furthermore, if I generate data and put it out on the Internet, I care exactly where and how my data is stored, and who can have access to it. The Internet is not a shared information grid, it’s a shared network for transmitting information from one specific point to another. There is a difference. …
••• Steve Lessem’s Security and Cloud Storage: Everybody Talks About it, but is it Really All That Different? post of 7/31/2009:
In this recent article, it was suggested that files of one owner residing on a physical device with the files of others could somehow result in unauthorized access. It could, and the answer to this and a myriad of other concerns fits within traditional approaches and understandings of security. For example, Mezeo encrypts all files prior to storage. So, even if you somehow got access to another’s file, it would do you no good. My point is that the cloud introduces a few additional complications, but it is not the problem that the current level of speculation portrays it as. What is required is an extension of typical security practices: diligence, and effective execution and audit of your current practices. …
•• Chris Hoff (@Beaker) announces Ralph the Mouth and Potsie Do A Cloud Security Podcast in this 7/31/2009 post and threatens to do more than one:
I’ll leave it up to you to figure out who’s who [I'm the one with the 'good' accent], but Craig Balding from Cloudsecurity.org and I have teamed up to host a regularly-scheduled (whatever that means) podcast on Cloud Security.
It’s called…wait for it… The Cloud Security Podcast.
You can find it, and the show notes of our very first (and dodgy) version right here, homed at libsyn. We’ll stick it on iTunes shortly.
•• Reuven Cohen’s A Cloud Service Rating System post of 7/30/2009 suggests “the creation of a Cloud Service Provider Rating System similar to a corporate ‘credit rating’ that estimates the service worthiness of a cloud computing provider.” Ruv’s idea has received a generally favorable response among the cloud-oriented folks on Twitter.
The current status of a provider’s SAS 70 attestations and ISO/IEC 27001:2005 certifications are two items I’d like to see in a rating.
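A rating along Ruv's lines could start as nothing fancier than a weighted scorecard. The criteria and weights below are made up for illustration; SAS 70 attestation and ISO/IEC 27001:2005 certification are included as two plausible inputs.

```python
def rate_provider(attributes, weights):
    """Toy scorecard for a hypothetical Cloud Service Provider Rating System:
    sum the weights of the criteria a provider satisfies and normalize to 100.
    Criteria and weights are invented for illustration."""
    score = sum(weights[k] for k, present in attributes.items() if present)
    total = sum(weights.values())
    return round(100 * score / total)

# Hypothetical criteria and weights:
weights = {"sas70": 3, "iso27001": 3, "sla_uptime": 2, "audit_reports": 2}
provider = {"sas70": True, "iso27001": True, "sla_uptime": True,
            "audit_reports": False}
print(rate_provider(provider, weights))  # 80
```

A real rating agency would of course need verified inputs and agreed-upon weights, which is exactly where the "credit rating" analogy gets hard.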
•• Ben Kepes emphasizes issues with data geolocation in his Where is YOUR data? post of 7/30/2009:
I posted recently about the increasingly positive news that cloud email providers have been gearing up – large-scale rollouts, more consumer acceptance, and adoption by enterprises all point to the growing validity of the model.
Ben goes on to quote an unnamed correspondent who’s alarmed about privacy when using Google Web apps and then addresses the issue of New Zealand government RFPs requiring data be domiciled in that country:
We’re rapidly nearing the point where cloud vendors need to appreciate that customers have specific requirements; higher levels of customer satisfaction, increased revenue, and higher uptake of cloud computing will come when those requirements are met.
•• Dean Takahashi reports on 7/30/2009 from the Black Hat Conference that Cloud computing raises stakes for data break-ins:
[C]loud computing comes with a lot of its own security risks outlined today at the Black Hat conference in Las Vegas.
The talk was given by Andrew Becherer, Alex Stamos, and Nathan Wilcox of iSEC Partners. Cloud computing means that lots of Internet host servers are being marshalled to deliver data to you in real time. That data is distributed across a lot of different commodity storage servers, all tied together through software. The apps are stored on servers and can be moved from machine to machine without impacting you. …
•• David Berlind adds his slant on cloud computing at the Black Hat conference in his Black Hat Researcher Rains On Cloud Computing's Parade With Talk Of Vulnerabilities post to David Berlind’s Tech Radar:
iSEC Partners partner (and Black Hat researcher) Alex Stamos says there's really no such thing as cloud computing. According to him, it's just a trendy name to take your money. Regardless of what you want to call it though, the vulnerabilities inherent to it are very real. That was Stamos' message in a briefing he gave this morning at the Black Hat conference in Las Vegas. Among the highlights of my podcast interview with him; Salesforce (NYSE: CRM) gets a gold star and Windows-based virtual machines are architecturally more secure than Linux-based ones. …
"The term cloud computing is useless" said Stamos. "It's way overused. It's mostly about gathering venture capital or selling your products." …
•• Brian Ott of Unisys says, “Like most organizations, we began with server consolidation and virtualization” in his The Journey to a Secure Cloud Continues post of 7/29/2009, and adds:
However, now with the advent and maturing of the cloud environments, there is a whole new set of activities that need to be accomplished. I will cover these in my next blog – “The Journey to a Secure Cloud – Beyond Virtualization and Automation.”
The initial post is The Journey to a Secure Cloud of 7/9/2009.
• John Pescatore claims Charging Your Customers To Reduce Your Security Costs Never Has and Never Will Work in this 7/29/2009 post that quotes a “Network World piece about a credit union in California offering two factor authentication to its customers”:
I like to see businesses “encouraging” customers to use stronger security, since it will reduce the business’s fraud costs, make the customers happier - great business payback. However, later on the piece defines how the credit union will “encourage” customers:
“The credit union, which has 150,000 members, many of them associated with the high-tech industry, will launch the service for free in the roll-out phase but will likely charge for the service down the road. For those using the stronger two-factor authentication, an annual fee of $10 for the service is anticipated, plus $10 for a handheld token. The iPhone and Blackberry applications are available for free.”
- A series of incremental web page samples showing how to invoke CardSpace, culminating in a sample that shows error handling and progress spinners for long-running operations like policy and token retrieval.
- A sample that shows using Geneva Framework’s WSFederationAuthenticationModule to protect a web site with CardSpace credentials.
- A sample for VC++ programmers that demonstrates CardSpace’s API for native programs. If you have wanted to include the CardTile in your own program or browser extension, this is for you!
Last I heard, CardSpace Geneva had been renamed Windows CardSpace.
• Brent Stineman’s Cloud Computing - Health and Activity Monitoring post of 7/27/2009 notes that “there doesn’t seem to be much discussion of health and activity monitoring and/or runtime governance” and discusses AmberPoint’s BPM and SOA monitoring methods:
Leveraging the cloud presents new challenges to monitoring our systems. AmberPoint worked by using nearly invisible agents that could be placed at times in, and at other times near, our various service endpoints. But with the cloud and its variety of new providers, we lose much of our ability to place those agents near our endpoints. Additionally, each vendor has its own solution for service monitoring. This is one thing the cloud, at least as yet, does not do well at all.
So we’re stuck leveraging old solutions, creating logging and aggregation services. Spending our own time creating ways to help monitor and manage our services.
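Rolling your own monitoring, as Brent describes, usually starts with collecting per-service samples and aggregating them yourself. Here's a toy Python aggregator over invented health-check data; a real system would collect the samples from live endpoints rather than a literal.

```python
import statistics

def aggregate_health(samples):
    """Home-grown log aggregation: given (success, latency_ms) samples per
    service, roll up an error rate and median latency for each one."""
    report = {}
    for service, results in samples.items():
        latencies = [ms for ok, ms in results if ok]
        errors = sum(1 for ok, _ in results if not ok)
        report[service] = {
            "error_rate": errors / len(results),
            "p50_ms": statistics.median(latencies) if latencies else None,
        }
    return report

# Invented health-check samples for two hypothetical services:
samples = {
    "billing-svc": [(True, 120), (True, 180), (False, 0)],
    "search-svc":  [(True, 40), (True, 60), (True, 45)],
}
print(aggregate_health(samples))
```

This is exactly the kind of plumbing a vendor-neutral monitoring agent would make unnecessary, which is Brent's complaint.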
Dan Kuznetsky discusses Cloud Computing and Governance in this Virtually Speaking ZDNet article of 7/27/2009:
My recent post concerning Layer 7 and dealing with regulatory compliance and overall governance issues in the Cloud (see Layer 7 Application Services Governance) seems to have hit an industry nerve. I’ve read and received comments from all over the place.
It appears that quite a number of suppliers including Sun, Univa UD, WebLayers and many others have announced products, services or programs that target the need for better control of Cloud computing environments. …
Alex Weisel’s Safety in the Cloud(s): ‘Vaporizing’ the Web Application article of 7/27/2009:
… is intended to pick up where the [Cloud Security Alliance] (CSA) guide left off in terms of defining what a distributed web application firewall (dWAF) should look like in order to meet the standards set within the CSA document.
Kevin L. Jackson reviews Maria Spinola: An Essential Guide to Cloud Computing, which Maria calls “A pragmatic, effective and hype-free approach for strategic enterprise decision making,” on 7/27/2009:
So why should you bother [reading Maria’s white paper] if you've already decide[d] not to use cloud computing? Maria actually has a very succinct answer to that question:
- It’s likely that without your knowledge, some of your departments are already using Cloud Computing,
- You will need to define a Cloud Governance Program and make it available to all your internal customers.
- A department may decide to go to a Cloud Computing service provider and start using their services immediately, instead of waiting months to have an on premise system installed,
- Since Cloud offerings are "free" to start using immediately instead of asking permission to use it, employees may be asking you for forgiveness later.
I expect this guide will be widely distributed because it provides an excellent summary of current cloud computing industry views.
••• John Willis announces AWSome Atlanta (Cloud Computing User's Group) August Meetup in this 8/1/2009 post:
When: 8/11/2009 7:00 to 9:00 PM
7:00 "The Super Secure Cloud: Requirements from the Medical Industry"
This would be based on what Sentry Data Systems has encountered over the last four years as they developed their cloud infrastructure and SaaS applications for the healthcare industry.
Speaker: Peter Schwoerer, Datanex Specialist with Sentry Data Systems.
8:00 PSTUCT – Private Something's That Use Cloud Technology
There has been way too much debate on the subject of what is a cloud and what is not a cloud. For this discussion I have decided to skip that debate and dive right into the topic of private things that look very interesting and appear to use cloud technology. For the past few weeks I have been working on a special project researching a few of the "PSTUCTs" that exist in the marketplace.
I will be covering the differences between products like VMware's vSphere 4.0 and 3Tera's AppLogic, as well as some open source platforms such as Enomaly, Eucalyptus, and OpenNebula. If there is enough time, I will also throw in some of the really interesting things Red Hat is doing to enable cloud computing in the enterprise. If you have any interest in these private somethings that seem to be very interesting, please come join me.
Speaker: John M Willis of Johnmwillis.com (a.k.a botchagalupe).
Where: Secret (only disclosed to members)
••• Point Zero Media’s Cloud Computing World Forum will take place on 10/1/2009 at 76 Portland Place, London according to this Point Zero Media Ltd announces the Cloud Computing World Forum Conference press release of 8/1/2009.
The event will feature a full one-day conference agenda, with delegates hearing from analysts, government ministers, leading case-study presenters, and the key players and providers of cloud services.
The conference will address the state of the cloud computing market, the business value of the cloud, development strategies and integration techniques, security and privacy issues and will take a look into what the future holds from cloud service providers.
It will also look at why cloud computing is a compelling proposition for SMBs.
Where: 76 Portland Place, London, England
•• Reuven Cohen proposes a CloudCamp in the Cloud (A Virtual unConference) in this 7/30/2009 post:
An interesting idea was floating around this afternoon on Twitter. After yesterday's two CloudCamps, I received a few messages asking if there was any way to view the CloudCamp proceedings streaming live online. In short, the answer is no, although a few attendees have posted video afterward.
This got me thinking: why not hold a monthly virtual CloudCamp unconference via WebEx? Cisco is already a sponsor of several CloudCamps, and I'm sure they would be more than happy to donate the WebEx account in return for some promotional consideration. By creating a virtual unconference we could include everyone, anywhere in the world, using the very medium we help promote. Another benefit would be an archive of audio and video posted to the CloudCamp.com website from the monthly events.
Twitterati (including me) were enthusiastic about Ruv’s proposal.
•• Rob Berry adds more reporting on CloudCamp Boston in his Cloud computing application development talk heats up post of 7/31/2009:
"If you just put an existing application out there, you can say it's 'cloud,'" said Igor Moochnick, the founder and VP of engineering at IgorShare Consulting. "But it's basically just hosting."
Moochnick said what makes the cloud useful to application architects is its distributed nature. Distributed computing does have advantages in the degree of parallel processing it opens up. But while a native relational application can be tailored to run in a cloud environment, Moochnick said not converting an app to a distributed model makes a cloud provider little more than a glorified hosting company. That switch to distributed thinking can be a sizeable hurdle for development teams. …
•• Gordon Haff reports about CloudCamp Boston: Inching to the next phase on 7/30/2009 for C|Net News:
Any event of this sort inevitably has lots of different simultaneous threads going on. However, here are a few that I think are worth highlighting:
Is security really an issue? The security aspects of cloud computing are often presented as a matter of trust in service providers and getting comfortable with the loss of direct hands-on control. Those are part of it certainly. But a number of discussions made it clear to me that the situation is a lot more complicated than developing a "comfort level" with the technology. It also goes beyond point products such as data encryption.
Christofer Hoff (who was in one of the unconference sessions I participated in) wrote a post a few days ago that gives a nice window into some of the complexities here. This isn't to say that security control and compliance concerns are stumbling blocks for moving all applications into a cloud, but they do have to be taken into account (in all their myriad complexity) for many core business applications.
What is cloud interoperability? Interoperability seems to be emerging as a bit of a contentious topic in cloud computing. This is partly because interoperability, like cloud computing itself, means different things to different people. Just about everyone agrees that base-level data portability (download your customer records from a CRM system in a readable format) is a must and that a nirvana of totally transparent computing delivery across providers is years away.
Gordon goes on to attempt to answer Where's the business opportunity?
•• John Treadway provides links to CloudCamp Boston reporters and bloggers, and the keynote slide deck in his CloudCamp Boston post of 7/30/2009. John was one of the keynote speakers.
•• Paul Miller offers his Thoughts from last night’s Cloud Camp (Newcastle, not Boston!) on 7/31/2009:
Amazon Web Services Evangelist, Simone Brunozzi, went first and shared some of his views on security in the Cloud. He suggested that psychology plays a significant part in enterprise concerns; it’s not a real belief that the Cloud is less secure, so much as a fear of loss of control. Simone made the distinction between Amazon-controlled physical security of data centres and real computers, and largely customer-controlled responsibility for running virtual machines responsibly. He noted that Amazon’s virtual machine instances are created by default with every network port closed; a new customer needs to request that a port is opened before they can even log in to their new instance.
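Simone’s point about default-closed ports is visible in the EC2 Query API itself: before a customer can even SSH to a new instance, they must authorize ingress on its security group. The sketch below only builds the query parameters for EC2’s AuthorizeSecurityGroupIngress action; actually sending them requires a signed request or a library such as boto, and the group name, port, and CIDR here are just an example:

```python
def authorize_ingress_params(group, port, cidr="0.0.0.0/0"):
    """Build the query parameters for EC2's AuthorizeSecurityGroupIngress
    action, which opens a single TCP port on a security group.

    Parameter names follow the EC2 Query API; in a real request these
    would be accompanied by credentials and a signature.
    """
    return {
        "Action": "AuthorizeSecurityGroupIngress",
        "GroupName": group,
        "IpProtocol": "tcp",
        "FromPort": str(port),
        "ToPort": str(port),
        "CidrIp": cidr,
    }

# Open SSH (port 22) on the default group so we can log in at all.
print(authorize_ingress_params("default", 22))
```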
Paul continues with summaries of brief presentations by Flexiscale, Sun Microsystems, Microsoft, Arjuna, and Rozmic evangelists.
• Redmond Media Group’s VS Live! Orlando conference will feature a Cloud Computing track with the following sessions and presenters:
- Windows Azure: A New Era of Cloud Computing: Aaron Skonnard
- Windows Azure: Is the Relational Database Dead? Benjamin Day
- Architecting an n-tier Application for the Azure Platform: Vishwas Lele
- Programming .NET Services: Aaron Skonnard
Where: Buena Vista Palace hotel, Orlando, FL USA
SD Forum’s Cloud Services and SOA SIG (formerly the Web-Services SIG) meets monthly to discuss this disruptive model for providing application services via the Cloud. The upcoming meeting’s topic is Convergence of Private Clouds:
As enterprises look for ways to benefit from cloud computing, they are seeing enterprise virtualization follow the path of external cloud hosting providers to provide internal Private Clouds. In this presentation, Greg Lato of VMWare will provide an introduction to Private Clouds and highlight the advantages that a Private Cloud will provide to IT Management, IT Operations, and Application Developers.
When: 7/28/2009 6:30 to 9:00 PM
Where: Tibco Software Inc., 3301 Hillview Avenue, Building #2, Palo Alto, CA USA
Price: $15 at the door for non-SDForum members; no charge for SDForum members; no registration required
The Burton Group’s Catalyst Conference runs from 7/27 to 7/31/2009 in San Diego. The Cloud Computing Business Advantage track covers the following topics:
- Defining the Cloud: Architecture, Infrastructure, and Economics
- Using the Cloud: Rewards, Risks, and Practices
- Server Virtualization: The Foundation for Cloud Infrastructure
Here’s a link to the agenda for 7/29 to 7/31/2009.
When: 7/27 to 7/31/2009
Where: Hilton San Diego Bayfront, 1 Park Boulevard, San Diego, CA 92101, USA
•• The U.S. Department of the Interior’s National Business Center wants a piece of the federal cloud action with its NBC Cloud Computing offering “to both NBC's business services clients and data center hosting clients alike.” From the site:
NBC Capabilities Currently Supported
- Human Resources Management Suite
- Mainframe (Z-Linux) and storage virtualization
NBC Planned Offerings for Cloud Computing
- Web 2.0
- Software development and deployment platform
- Automatic provisioning
- X86 virtualization
Sounds to me like inter-agency rivalry is heating up.
•• Reuven Cohen‘s US Federal Government Releases Cloud Computing Initiative RFQ post of 7/31/2009 includes a description and the text of the RFQ:
According to the RFQ document (See document here) the objective of this RFQ is to offer three key service offerings through IaaS providers for ordering activities. The requirements have been divided into three distinct Lots:
- Lot 1: Cloud Storage Services
- Lot 2: Virtual Machines
- Lot 3: Cloud Web Hosting
The RFQ also sheds some light on the potential usage of cloud based infrastructure with the US federal Government. The Federal Cloud Computing initiative is a services oriented approach, whereby common infrastructure, information, and solutions can be shared/reused across the Government. The overall objective is to create a more agile Federal enterprise – where services can be reused and provisioned on demand to meet business needs. …
•• Kevin Jackson adds his commentary in GSA Release Cloud Computing RFQ of the same date.
•• NetSuite offers an On-Demand ERP in the Enterprise—A Practical CIO Guide to Implementation white paper:
Want to Ensure the Success of Your SaaS Implementation Strategy?
Cloud computing and Software as a Service (SaaS) pose exciting opportunities for large enterprises, but also come with challenges. Are you ready to face them?
A new independent research and analysis report by Phil Wainewright, one of the world’s foremost authorities on business automation and a ZDNet writer on SaaS, will give you a framework for crafting your SaaS strategy and successfully implementing SaaS ERP within your organization.
•• Jamal Mazhar describes Building a Private Cloud within a Public Cloud in this 7/29/2009 post:
One of our customers wanted to establish site-to-site connectivity between their datacenter and the public cloud (Amazon EC2), and then have a private network within Amazon EC2 with their own custom IP addresses for their servers in the cloud. The basic idea is to securely augment the internal datacenter resources with resources in the public cloud, so that the servers in the cloud appear to be part of the customer's own private corporate network. The customer's cloud servers are isolated from the rest of the servers in the cloud by a private network, just as corporate internal datacenters are isolated by private networks, with private routers routing the internal traffic.
•• David Linthicum praises Intuit’s code.intuit.com community in his Cloud computing brings developers to the people and claims “New partner programs promote innovative cloud computing application development and provide prebuilt channels”:
In the world of cloud computing, some very innovative offerings have flown under the hype radar. An example of this is Intuit's launch of an open source community for developers interested in creating online applications. In essence, it's a community for building and deploying cloud-based software for fun and/or profit, aimed at the small business.
The reality is that SaaS, while being widely adopted now by the larger enterprises and governments, had initial success within the world of small business. If you can't afford a datacenter or even a few servers, SaaS was a logical option, and guys like Intuit, Salesforce.com, and other older SaaS players had their initial success within small business. Indeed, cloud computing today, while on the lips of almost every Global 2000 company and most government agencies, is finding the most success within small business where the value is almost always there. …
•• John Foley chimes in with his NASA's Next Mission: Cloud Computing post of 7/30/2009 for Information Week:
The Nebula cloud is in limited beta test now, and NASA is accepting applications from interested parties that want to give it a try. Take note: NASA is making Nebula available not just to its own staffers, but to employees and contractors of other federal agencies.
See below for more NASA/Nebula articles.
•• Mark Ferelli’s Cisco Plays Both Sides of the Virtual Coin post of 7/30/2009 to Virtualization Review is subtitled “The Unified Computing System supports VMware and Hyper-V and aims to be a ‘Cloud in a Box’ solution. But can Cisco transition from routers to servers?”
It was code-named "California" and is expected to link computing, network, storage, access and virtualization capabilities together into one cohesive system. Now known as the Unified Computing System (UCS), the new platform from Cisco Systems Inc. has moved the company beyond pure-play networking into the server space with virtualization as the defining feature. And Cisco has lined up two big virtual fish for support -- Microsoft and VMware Inc. …
•• Andrew Conry-Murray expands on Rackspace’s new dedicated private cloud initiative (see below) in his Rackspace Launches Dedicated Cloud post of 7/23/2009 for InformationWeek:
Today Rackspace announced the availability of its Private Cloud offering, in which Rackspace manages a dedicated set of servers inside Rackspace data centers on behalf of customers. Customers use a portal to spin up and spin down VMware-based virtual machines on demand on top of the dedicated infrastructure.
The company says customers will pay an annual base price for the service depending on the hardware configuration, which begins at 16 GHz of processing power and 32 GB of memory. Pricing starts at around $6,000 for a server environment that will support a minimum of eight virtual machines, as well as a dedicated firewall.
In addition to the base price, customers will also pay per VM on a consumption-based model. "A customer who had a batch job that ran on 5 VMs for the last week of the month, they only pay for the VMs at the time they used them," says John Engates, Rackspace CTO. The company will charge for VM use per day, rather than per hour in its public cloud option. …
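The arithmetic behind that consumption model is simple enough to sketch. In the Python below, the per-VM-day rate is purely hypothetical (Rackspace’s announcement doesn’t give one); the usage pattern is the CTO’s own example of a batch job on 5 VMs for the last week of the month:

```python
def monthly_vm_charge(vm_days, rate_per_vm_day):
    """Consumption charge on top of the annual base price.

    vm_days is a list of (vm_count, days) bursts; each is billed
    per VM-day, matching the charge-per-day model described above.
    """
    return sum(vms * days * rate_per_vm_day for vms, days in vm_days)

# The article's example: a batch job on 5 VMs for the last 7 days
# of the month, at a purely hypothetical rate of $2.40 per VM-day.
charge = monthly_vm_charge([(5, 7)], rate_per_vm_day=2.40)
print(charge)  # 5 VMs * 7 days * $2.40 = 84.0
```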
•• Brynn Koeppen’s NASA May Be Prototype For White House Cloud Computing Initiative of 7/29/2009 summarizes details for establishing NASA and Nebula as the foundation of a new federal cloud-computing infrastructure:
The White House may decide to make NASA its pilot project for replacing the government’s current IT infrastructure with cloud computing. According to NASA’s associate chief information officer for architecture and infrastructure, Mike Hecker, NASA officials have “broached the idea of NASA becoming an IT service provider. NASA as an IT service provider takes us into a new realm.” But he also said, “We’re still debating if that’s a good idea or not.”
NASA’s cloud computing model, Nebula, is currently only used for sharing space images and statistics with international partners and the academic sector. Ben Pring, vice president of research at Gartner believes NASA’s Nebula “could be used in a way to run some new applications on it, to see whether it would make sense.” Pring went on to say, “Absolutely it would be a good, smart thing to do that.”
•• Reuven Cohen adds more detail to the federal cloud storefront in his The Rise of the Government App Store post of 7/29/2009:
In a recent post to the CCIF mailing list, Bob Marcus outlined the coming opportunities and challenges facing what he described as "Government Cloud Storefronts". In the post he described Vivek Kundra's (US Federal CIO) vision for the creation of a government Cloud Storefront. This Storefront (run by GSA) will be launched September 9th and will make Cloud resources (IaaS, PaaS, SaaS) available to agencies within the US Federal Government (an $80+ billion a year IT organization).
What's also interesting is that the US isn't alone in this vision of centralized access points for procuring Cloud services and related applications. Several other governments, including the United Kingdom with its G-Cloud app store and Japan with its Kasumigaseki Cloud, are attempting to do the same, with Japan spending upwards of $250 million on its initiative.
Kundra, speaking at a recent conference on cloud computing at the National Defense University, elaborated on his GovApp Store concept: "Any agency can go online and agencies will be able to say 'I'm interested in buying a specific technology' and we will abstract all the complexities for agencies. They don't have to worry about Federal Information Security Management Act compliance. They don't have to worry about certification and accreditation. And they don't have to worry about how the services are being provisioned. Literally, you'll be able to go in as an agency… and provision those on a real-time basis and that is where the government really needs to move as we look at standardization. This will be the storefront that will be simple." …
• Reuven Cohen views Cloud Computing as a Commodity and discusses pricing by the Universal Computing Unit (UCU) and its inverse, the Universal Computing Cycle (UCC) in this 7/29/2009 post:
I seem to keep coming back to the same question when discussing Cloud Computing. Can cloud computing be treated as a commodity that could be brokered and/or exchanged? Recently a few have attempted to do this, notably a German firm called Zimory.
To give you a little background: before the development of the Enomaly ECP platform, I had the grand idea to create what I described as a "Distributed Exchange (DX)" (circa 2004; I've put the site online temporarily for demo purposes). This was actually one of the original motivations for the creation of the original ECP platform (a.k.a. Enomalism). The idea of DX was to create a platform and marketplace that would allow companies to buy and sell excess computing capacity, similar to a commodities exchange. Think Google AdWords and AdSense for compute capacity.
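To make the Distributed Exchange idea concrete, here’s an illustrative Python sketch (not Enomaly’s actual design) of one-shot matching between buyers and sellers of excess compute capacity, clearing trades wherever a bid price meets an ask; all the prices and quantities are made up:

```python
def match_orders(bids, asks):
    """Naive one-shot matching of compute-capacity orders.

    Sellers post excess capacity (asks), buyers post demand (bids);
    each order is a (price_per_unit, units) tuple. Trades clear
    wherever the best bid price >= the cheapest ask price.
    """
    bids = sorted(bids, reverse=True)   # highest bid first
    asks = sorted(asks)                 # cheapest capacity first
    trades = []
    while bids and asks and bids[0][0] >= asks[0][0]:
        (bid_price, bid_units), (ask_price, ask_units) = bids[0], asks[0]
        units = min(bid_units, ask_units)
        trades.append((ask_price, units))   # clear at the seller's price
        bids[0] = (bid_price, bid_units - units)
        asks[0] = (ask_price, ask_units - units)
        if bids[0][1] == 0:
            bids.pop(0)
        if asks[0][1] == 0:
            asks.pop(0)
    return trades

# Two buyers and two sellers of CPU-hours (all numbers hypothetical).
print(match_orders(bids=[(0.12, 100), (0.08, 50)], asks=[(0.10, 80), (0.11, 60)]))
# → [(0.1, 80), (0.11, 20)]
```

A real exchange would of course need standardized units (the UCU/UCC question above), continuous order books, and settlement — this only shows the clearing step.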
• Rackspace announces VMware-based private cloud technology for sale in its Rackspace Private Cloud leverages VMware to Extend Enterprise Computing on Demand press release of 7/29/2009:
“Rackspace provides an excellent option for customers to take advantage of the cloud,” said Dan Chu, vice president, emerging products and markets, VMware. “Enterprises who purchase the Rackspace Private Cloud can count on the powerful combination of VMware’s industry-leading virtualization platform and Rackspace’s reliable hosting environment and infrastructure.”
Rackspace Private Cloud is an evolution of its popular dedicated virtual server (DVS) offering within the managed hosting business unit. In the last year, revenue from virtualization solutions has grown substantially, driven mainly by the increased flexibility, improved asset utilization and lower capital and operating costs that VMware’s virtualization provides.
• CA points out that “CA Federation Manager supports standards such as SAML and WS-Federation” in its CA's Identity Federation for Cloud Computing and SaaS Applications press release of 7/28/2009 and notes:
CA will participate in Burton Group's single sign-on interoperability demonstration for cloud applications Wednesday, July 29, in San Diego at Burton Group's Catalyst Conference. CA will show how CA Federation Manager provides Internet single sign-on and standards-based identity federation to help improve security when accessing multiple cloud and SaaS applications.
Competition for .NET Access Control Service, Windows Identity Foundation, Active Directory Federation Service (ADFS) and Windows CardSpace?
• Jay Fry’s Luckily, it's time for the next chapter in cloud computing post of 7/28/2009 asserts:
What's going on now seems to be the next chapter in the cloud computing story -- the chapter where reality starts to rear its ugly head. All of the promises that have been talked about now need to be matched with some actual delivery.
and goes on to discuss the absorption of Cassatt by CA.
Chris Kanaracus reports Oracle grid update tied to emerging cloud trend in this 7/22/2009 InfoWorld article:
Performance-boosting in-memory data grids, presently used in large-scale Web sites and high-throughput transactional systems, could play a key role in the next generation of cloud services.
As could Microsoft’s Velocity.
Reuven Cohen’s The Inter-Cloud and The Cloud of Clouds post of 7/26/2009 interprets the term inter-cloud, a concept being primarily promoted by Cisco as part of their Unified Computing platform:
My interpretation of the so called "inter-cloud" is the abstract ability to exchange information between distinct computing clouds (storage, compute, messaging etc) be it public or private in a uniform/unified way. I've come to think of it like a higher level inter-connected network atop the current world wide web via linked API's and data sources. Greg Papadopoulos from Sun calls it a Cloud of Clouds. …
Be sure to read James Urquhart’s comment of 7/27/2009.
Pass your mouse over the image to display the controls at the bottom of the image. Drag the Google Analytics image with your mouse. (Tested with Internet Explorer 8 and Mozilla Firefox 3.0.11 only. The quality of the reduced image is better with IE 8.)