Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.
•• Updated 11/8/2009: Vittorio Bertocci: Identity Developer Training Kit November 2009 Release; Wade Wegner: Preview of his PDC 2009 Migration to Azure session; Stephen Elop and Marc Benioff: Will participate in an Economist Debate starting on 11/10/2009; Arshad Ali: Moving databases to SQL Azure; Judith Hurwitz: Why all workloads don’t belong in the cloud; ChristophDotNet: WcfTestClient with Windows Azure Problems; Tony Bishop: Secure Enterprise Clouds; and Herve Roggero: SQL Azure - Auditing Choices.
• Updated 11/7/2009: Jim Nakashima: Using SQL Server Management Studio with SQL Azure; Me: Corrected location for downloading the LIMOGv2 VB.NET 2008 source code; Tim Fischer: Enterprise Apps in Windows Azure - Calling the Internet Service Bus; Microsoft Research: Azure Library for Lucene.Net; and many more.
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:
- Azure Blob, Table and Queue Services
- SQL Azure Database (SADB)
- .NET Services: Access Control, Service Bus and Workflow
- Live Windows Azure Apps, Tools and Test Harnesses
- Windows Azure Infrastructure
- Cloud Security and Governance
- Cloud Computing Events
- Other Cloud Computing Platforms and Services
To use the above links, first click the post’s title to display the individual post, then click the link for the section you want to navigate to.
Discuss the book on its WROX P2P Forum.
See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.
Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.
You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:
- Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
- Chapter 13: “Exploiting SQL Azure Database's Relational Features”
HTTP downloads of the two chapters are available from the book's Code Download page.
* Content for managing DataHubs will be added when Microsoft releases a CTP of the technology
Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009
• My updated Azure Storage Services Test Harness: Table Services 5 – Generating Classes/Collection Initializers with LIMOG v2 post of 11/7/2009 corrects the download location for the LINQ In-Memory Object Generator version 2 (LIMOGv2) code generator for Azure entity-attribute-value (EAV) tables:
As of 11/7/2009, you can download the VB.NET 2008 source code for LIMOGUtilityV2 from Windows Live SkyDrive. After I remove the obsolete code to generate SQL Data Services classes, I will include the source code in the comprehensive download available from Wrox’s Cloud Computing with the Windows Azure Platform site.
Mary Jo Foley reported in her Microsoft puts more Azure cloud plumbing in place post of 11/6/2009:
Microsoft … rolled out on November 5 a new CDN capability that extends the storage piece of the Windows Azure cloud operating system. …
Mary Jo continues:
The new Windows Azure CDN is designed to allow developers to deliver high-bandwidth content more quickly and efficiently. …
She also offers:
[M]ore details from a November 5 blog post by Brad Calder, who is a leader of the Windows Azure Storage team. …
Steve Marx’s Using the New Windows Azure CDN with a Custom Domain post of 11/5/2009 delivers the details for implementing the new CDN:
Today we announced a new service in Windows Azure: the Windows Azure Content Delivery Network (CDN). You can read about the details over on the Windows Azure blog. In short, you can now tap into our global CDN to cache your content close to your users. The Windows Azure CDN is a free preview for now, and we’ll announce pricing information in the future.
As we all know, my blog is an international sensation, and I sometimes include images in my posts. Now I have the opportunity to boost the performance of my blog by serving those images from the Windows Azure CDN. In this post, I’ll show you how I enabled the CDN under a custom domain name for images on my blog.
He explains some of the updates to his original post in this Why my latest blog post didn't work for a while post to his new microblog.
Randy Cooper compares the two primary content delivery networks (CDNs) for cloud-based blob storage in his Azure CDN vs. Amazon CloudFront/S3 post of 11/6/2009:
… Note that the latest addition to the Azure family is in CTP release only. All we know about the Windows Azure platform launch is that PDC 2009 is expected to bring new features later this month, followed by the official launch in January and the first billing cycle in February; it’s likely that the CDN will be available along those timelines as well.
Like CloudFront, Microsoft’s CDN does not solve the HTTPS issue in the first release either. In terms of pricing, if Windows Azure platform pricing is any indication, you can expect to pay ~0.17/Gb for each targeted zone. …
Brad Calder announces a new capability for Accessing Windows Azure Blobs Using Custom Storage Domain Names in this 11/5/2009 post to the Windows Azure Team blog:
Today we are releasing the ability to access Windows Azure Blobs using custom domain names. Windows Azure Blob storage enables applications to store and manipulate large objects and files in the cloud.
The custom storage domain name feature allows you to register a custom domain name for a given storage account for anonymous blob access using that domain name. Currently we provide access to blob storage using the following domain name: http://<account>.blob.core.windows.net/
But if I owned a domain called “toddlers.wingtiptoys.com”, I may instead want my blobs accessible via: http://toddlers.wingtiptoys.com/
When registering a custom storage domain name, you can use that domain name to access the contents of a public container instead of http://<account>.blob.core.windows.net/. For example, given a public container “images” for a storage account named “toys”, we register the custom domain name “toddlers.wingtiptoys.com” for that storage account. …
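The mechanics behind Brad’s example boil down to a DNS change plus Microsoft’s registration step. A hypothetical sketch follows; the storage account (“toys”) and custom domain (“toddlers.wingtiptoys.com”) come from his example, while the container path and blob name are made up for illustration:

```shell
# Hypothetical sketch of the URL mapping a custom storage domain creates.
# A CNAME record (zone-file syntax) points the vanity host at the default
# blob endpoint:
#
#   toddlers.wingtiptoys.com.  IN  CNAME  toys.blob.core.windows.net.
#
# Once the custom domain is registered for the storage account, both of
# these URLs reach the same blob in the public "images" container:
DEFAULT_URL="http://toys.blob.core.windows.net/images/logo.png"
CUSTOM_URL="http://toddlers.wingtiptoys.com/images/logo.png"
echo "$DEFAULT_URL"
echo "$CUSTOM_URL"
```

Note that this only works for anonymous access to public containers; requests for private blobs still have to go through the default endpoint with signed credentials.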
•• Herve Roggero’s SQL Azure - Auditing Choices post of 11/7/2009 begins:
As I am digging more into SQL Azure, it seems choices for auditing will become a little bit more restricted.
Generally speaking there are four ways to audit SQL Server statements; these mechanisms are used by various software vendors to deliver auditing capabilities for compliance mandates and for security reviews. However, as we will see, many of the products will stop working with SQL Azure due to some limitations imposed by the database.
Herve explains the four auditing methods and concludes:
At this point at least, there appears to be no real silver bullet for auditing a SQL Azure database; at least not yet...
Still, most applications using the SQL Azure platform will not likely store any sensitive data, initially. As the SQL Azure platform grows in its use, I would expect some of the options above to be enabled, or new options to become available.
•• Arshad Ali describes Moving your database to the cloud with SQL Azure - Part 1 in this 11/5/2009 article for the MSSQLTips Community:
There has been lots of buzz about cloud computing lately and looking at the benefits it provides (in terms of cost savings, high availability, scalability (scale up/down) etc.) it is now evident that cloud computing is the future for next generation applications. Many of tomorrow's applications will be designed and hosted in the cloud. Microsoft realizes this potential and provides a cloud computing solution with Windows Azure. Windows Azure platform, which is hosted inside Microsoft data centers, offers several services which you can leverage while developing your application if you target them for the cloud. One of them is Microsoft SQL Azure, it's a cloud based relational database service built on Microsoft SQL Server technologies. In this tip series, I am going to show how you can start creating databases and database objects on the cloud with SQL Azure.
• Jim Nakashima details Setting Up SQL Server Management Studio with SQL Server Express 2008 Installed for use with SQL Azure in his 11/2/2009 post:
One of the things I’ve been playing with lately is SQL Azure. I’ll post about my experience using Windows Azure and SQL Azure together shortly, this post is all about setting up SQL Server Management Studio (SSMS) with SQL Server Express 2008.
In order to use SSMS with SQL Azure, you need to have the 2008 version of SSMS installed. I had the 2005 version and it failed to connect.
The reason I’m posting about this is that I ran into a few non obvious things. Now the thing is, I’m not that savvy with SQL Server and I’m sure that contributed to my confusion. On the other hand, I figure other folks may be in my situation and could find this to be useful – if I can save a couple of people’s time, then I’m happy :) …
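For readers who prefer the command line to SSMS, the same connection quirks apply to sqlcmd from the SQL Server 2008 client tools. A hypothetical sketch follows; the server, login, and database names are placeholders (not from Jim’s post), and the script prints the command it would run rather than executing it:

```shell
# Hypothetical sketch: connecting to SQL Azure with sqlcmd. All names are
# placeholders. Two SQL Azure quirks to note: the login must use the
# user@server form, and only SQL Server authentication over TCP port 1433
# is supported (no Windows authentication).
SERVER="myserver.database.windows.net"
LOGIN="sqladmin@myserver"     # user@server form required by SQL Azure
DATABASE="master"
CMD="sqlcmd -S tcp:${SERVER},1433 -U ${LOGIN} -d ${DATABASE}"
echo "$CMD"   # with -P omitted, sqlcmd prompts for the password
```

The same user@server login form is what SSMS expects in its connection dialog, which is one of the non-obvious things that trips people up.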
Dave Robinson reported an SQL Azure Portal issue this week in this 11/6/2009 post:
We received an inquiry earlier this week about a user who was trying to access the SQL Azure Portal and was unable to log in. I want to take a quick moment to respond broadly just in case anyone else had the same experience.
There were no issues with the SQL Azure Database service this week. There was, however, an availability issue with the SQL Azure Portal while some configuration changes were being made in preparation for the launch of SQL Azure Database at PDC on November 17th. The issue we experienced was with our portal only – not the actual service. No one trying to connect to the service or their database(s) was impacted at all.
The portal is different than the service end point. The current portal is a placeholder tool (CTP only) used for signing up for the SQL Azure service prior to go-live. It also offers capabilities for creating and deleting databases within a provisioned SQL Azure Database server but, it is only one of several tools that are used to do this. Other tools, such as SSMS and SQLCMD, are routinely used for these create/delete operations. The use of these tools was not impacted at all due to the portal issue. Your data was, and is, safe, sound and accessible in our highly available service.
Bill Zack announced ORM Comes to SQL Azure! in this brief 11/6/2009 post:
It was only a matter of time before Object Relational Mapping tools were extended to work with SQL Azure.
More ORMs that work with SQL Azure may come out in the future, but to my knowledge this is the first one.
Mary Jo Foley reported Billing system testing behind Microsoft's SQL Azure outage this week in this 11/5/2009 post to her ZDNet All About Microsoft blog:
Testers of Microsoft’s SQL Azure service experienced a three-plus hour unplanned outage this week — just a couple of weeks before Microsoft is set to remove the beta tag from its Azure cloud service.
During prior Azure outages (planned and unplanned), the team made sure to blog about the causes. This week’s outage, which occurred on the opening day of Microsoft’s SQL PASS user group conference, received no mention (other than a brief acknowledgment on the MSDN SQL Azure forums).
A tester wondering what happened sent me a note. From his e-mail:
“Microsoft didn’t formally acknowledge the problem until the outage was almost resolved. That’s 3+ hours wondering when the cloud would recover. Still no details on what happened.”
When I asked about what was behind the outage, I received the following note back from an Azure spokesperson:
“We were doing testing on the connection of the central billing platform yesterday and unfortunately experienced some downtime with SQL Azure. When discovered, we notified (Community Technology Preview) CTP customers right away and within a few hours had the service back online.”
Yes, Azure and SQL Azure are still in the test phase. But Microsoft is trying to lay the groundwork to get consumers, developers and enterprise customers to trust the availability, reliability and privacy guarantees of the service. Speaking of privacy guarantees, Microsoft published today a white paper outlining the company’s privacy policies for cloud computing.
James Hamilton claims One Size Does Not Fit All in this 11/3/2009 post, which analyzes Amazon’s RDS and SimpleDB:
Last week AWS announced the Amazon Relational Database Service (Amazon RDS) and I blogged that it was big step forward for the cloud storage world: Amazon RDS, More Memory, and Lower Prices. This really is an important step forward in that a huge percentage of commercial applications are written to depend upon Relational Databases. But, I was a bit surprised to get a couple of notes asking about the status of Simple DB and whether the new service was a replacement. These questions were perhaps best characterized by the forum thread The End is Nigh for SimpleDB. I can understand why some might conclude that just having a relational database would be sufficient but the world of structured storage extends far beyond relational systems. In essence, one size does not fit all and both SimpleDB and RDS are important components in addressing the needs of the broader database market.
Relational databases have become so ubiquitous that the term “database” is often treated as synonymous with relational databases like Oracle, SQL Server, MySQL, or DB2. However, the term preceded the invention and implementation of the relational model and non-relational data stores remain important today.
Relational databases are incredibly rich and able to support a very broad class of applications but with incredible breadth comes significant complexity. Many applications don’t need the rich programming model of relational systems and some applications are better serviced by lighter-weight, easier-to-administer, and easier-to-scale solutions. Both relational and non-relational structured storage systems are important and no single solution is appropriate for all applications. I’ll refer to this broader, beyond-relational database market as “structured storage” to differentiate it from file stores and blob stores.
George Huey shows you how to take advantage of Chunking BCP output to upload lots of data into SQL Azure, thanks to a guest post of 11/2/2009 on Wade Wegner’s blog:
One of the things that we found out during a series of Windows Azure Platform Migration Labs held in the Chicago MTC is that you cannot upload hundreds of thousands of records without giving SQL Azure time to catch up. Consequently, you have to chunk your data and give SQL Azure time to process each chunk before uploading the next chunk of data.
The tool that we used for migrating our customer databases to SQL Azure was the SQL Azure Migration Wizard. The migration wizard uses BCP to download data from an on-premise SQL Server database and then uses BCP to upload the data to SQL Azure. BCP allows you to specify the first row (-F), the last row (-L), and the batch size (-b). These options allow you to chunk the data being uploaded to SQL Azure. For example:

BCP MyDb.dbo.Transactions out Transactions.dat -E -q -n -T
The above command extracts data from table Transactions in the database MyDb. At the end of the BCP output, you will find the number of records copied to file (for example: 2,524,520 rows copied). …
George then shows how to upload data in chunks with time delays for Azure to store the chunk before executing the next request.
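George’s -F/-L chunking lends itself to scripting. A hypothetical sketch follows; the 2,524,520-row total comes from his example output, while the table name, file name, chunk size, batch size, and connection details are illustrative. The loop prints each bcp "in" command instead of executing it:

```shell
# Hypothetical sketch of chunked BCP uploads to SQL Azure. Each chunk
# covers CHUNK rows via -F (first row) and -L (last row); in a real run
# you would execute bcp and pause between chunks so SQL Azure can catch up.
TOTAL=2524520      # rows reported by the bcp export
CHUNK=100000       # rows per upload chunk (illustrative)
FIRST=1
CMDS=""
while [ "$FIRST" -le "$TOTAL" ]; do
  LAST=$((FIRST + CHUNK - 1))
  if [ "$LAST" -gt "$TOTAL" ]; then LAST=$TOTAL; fi
  CMD="bcp MyDb.dbo.Transactions in Transactions.dat -F $FIRST -L $LAST -b 1000 -E -q -n -U user@server -S tcp:server.database.windows.net"
  echo "$CMD"
  # Real run: execute bcp here, then e.g. "sleep 30" before the next chunk.
  CMDS="$CMDS$CMD
"
  FIRST=$((LAST + 1))
done
```

With these numbers the 2,524,520 rows split into 26 chunks: 25 full chunks of 100,000 rows and a final chunk of 24,520.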
Kathleen Richards’ Re-Architecting Azure feature article of 11/1/2009 for Visual Studio Magazine’s November issue describes the changes expected in the release to Web (RTW) versions of SQL Azure and Windows Azure: “A year after the first technical previews of Azure, Microsoft is launching a less-ambitious platform with forklift revisions based on developer feedback.”
•• Vittorio Bertocci (Vibro.NET) recommended on 11/6/2009 that you Download the November 2009 release of the Identity Developer Training Kit:
Let’s close the WIF RC day with the November refresh of our Identity Developer Training Kit.
The new version of the Identity Developer Training Kit ported forward the three WIF labs (web site, web services, ASP.NET Membership provider) to the RC, and improved support for Windows 7 and Windows Server 2008 R2.
The ACS labs have been temporarily removed, to give us time to accommodate the new REST scenarios it now supports, but they will be back in no time.
In addition to that, we’ll also be adding some new interesting content very soon… but I won’t spoil the surprise ;-) …
Vibro also reported that the ClaimsDrivenModifierControl has been updated to WIF RC.
• Tim Fischer describes writing Enterprise Apps in Windows Azure - Calling the Internet Service Bus (.NET Services from Azure) in this fully illustrated 11/7/2009 post:
In the last days I implemented a typical enterprise cloud app on Windows Azure.
WARNING: Beyond this step no Hello World scenarios! Watch your step!
The scenario is based on the famous TimeTracker SL3 Sample which you can find in the Expression Gallery. It is a vendor management system where I can track my vendors time and [I] can approve the tasks and then have them sen[t] over to SAP to create a Purchase Order. …
Mary Jo Foley reports Microsoft 'Geneva' identity wares approach the finish line in this 11/6/2009 post:
Microsoft is making available for download the near-final Release Candidate (RC) test build of its “Geneva” framework, the technology officially known as Windows Identity Foundation.
(For all you Microsoft codename trackers out there, “Geneva” is the next version of Active Directory Federation Services (ADFS). The programming framework supporting the next version of ADFS originally was codenamed “Zermatt,” then, later, also took on the “Geneva” codename. Microsoft’s Windows Cardspace is the third component of what Microsoft calls “Geneva.”)
On November 6, Microsoft released the RC bits of the framework, which are designed to provide developers with a new programming model and software development kit for creating identity-aware .Net applications. According to a blog post on the Forefront Team Blog, Windows Identity Foundation “provides developers pre-built .NET security logic for building claims-aware applications, enhancing either ASP.NET or WCF (Windows Communication Foundation) applications.
Geneva and the Geneva framework also are related to Microsoft’s Azure environment, as the next version of ADFS is part of the Azure Services layer in Microsoft’s cloud. (Microsoft’s current Azure diagrams don’t show ADFS as part of Azure, but I hear any new ones we see at the Professional Developers Conference in mid-November will include it.) The goal of Geneva is to provide developers and users with a single, secure sign-in capability across both cloud-based and on-premise applications. [Emphasis added.]
The .NET Services Team has released the Microsoft .NET Services SDK (Nov 2009 CTP) for download. There was no announcement on the team’s blog as of 11/5/2009 10:30 AM PST. The following articles are repeated from the previous (Windows Azure and Cloud Computing Posts for 10/26/2009+) post for convenience:
The .NET Services November 2009 CTP Breaking Changes Announcement and Scheduled Maintenance post of the same date announces the following changes:
- .NET Services Portal address
- Subscription migration
- Solution migration
- Solution region migration
Service Bus will undergo the following changes (a.k.a., a complete transmogrification):
- Queues changes
- Routers removal
- RelayBinding Security Default
- Service Namespace replaces Solution name
- TransportClientCredentialType update
- TransportCredentialOnly is removed
- HttpBufferClient is not public
- TcpRelayConnectionMode.Direct is removed
- Service publishing feed address aligns with service transport
- WSHttpRelayBinding is removed
- WS2007FederationHttpRelayBinding is removed
Access Control Service will undergo the following modifications:
- Solution credentials replaced with Issuer credentials
- WS-Trust STS replaced with Web Resource Authorization Protocol (WRAP) STS
- Access Control Service data will not be migrated
- Access Control Management Portal replaced with a SDK Command-line Tool (acm.exe)
In other words, it’s starting over from ground zero. It will be interesting to hear the .NET Services team’s explanation for such a complete product makeover at this late date (about two weeks before the Azure Services Platform’s commercial release). Like Gregor Samsa, it might turn into a giant beetle. I left the following comment on the .NET Services Team blog’s empty The .NET Services November 2009 CTP Breaking Changes Announcement and Scheduled Maintenance post of 10/30/2009:
Will .NET Services be commercially available when the Azure Services Platform (including SQL Azure) is released at PDC 2009? A transmogrification of this scope within about two weeks of Platform CA seems to me to be premature (and perhaps ill-advised).
The justification for rearchitecting .NET Services would make an interesting read.
Thanks to Wade Wegner for the heads-up.
•• ChristophDotNet’s WcfTestClient with Windows Azure post of 10/30/2009 describes issues with running Azure Windows Communication Foundation (WCF) Web services in Azure’s local Development Fabric:
One of my customers is working on an Azure WCF service. We wanted to test the service with WcfTestClient, but we ran into some issues. We started the dev fabric and had the WebRole running on port 81. When we went to the WCF service metadata page at http://mybox:5101/ProdKService.svc, we got the expected web page, which states:
“To test this service, you will need to create a client and use it to call the service. You can do this using the svcutil.exe tool from the command line with the following syntax:
This will generate a configuration file and a code file that contains the client class. Add the two files to your client application and use the generated client class to call the Service.”
Note that the instructions point you to port 5101 in the service URL. That’s the port where the Azure instance is running in my local development fabric. It is not, as we would expect, the address of the Azure dev fabric, which is running on port 81. We tried to follow the instructions and point WcfTestClient to the address on the page, but instead of testing the service, we got [a] not so friendly error message. …
Christoph goes on to describe the error message and its culprit.
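One likely workaround (my assumption here, not stated in the excerpt) is to generate the client proxy against the dev-fabric address on port 81 rather than the per-instance port the metadata page advertises. A hypothetical sketch, using the host and service name from Christoph’s example (the output file names are made up) and printing the command rather than running it:

```shell
# Hypothetical sketch: point svcutil.exe at the dev-fabric endpoint
# (port 81 in Christoph's setup) instead of the per-instance port (5101)
# shown on the service metadata page. Output file names are illustrative.
ENDPOINT="http://mybox:81/ProdKService.svc"
CMD="svcutil.exe ${ENDPOINT}?wsdl /out:ProdKServiceClient.cs /config:app.config"
echo "$CMD"
```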
• Microsoft Research released Azure Library for Lucene.Net without fanfare on 7/28/2009:
Lucene works on top of an abstract store object called Directory. There are several Directory objects, including FSDirectory, for file systems, and RAMDirectory, for in-memory store. Azure Library for Lucene.Net implements a smart blob-storage Directory object called AzureDirectory which enables the use of Lucene.NET on top of Azure Blob Storage. AzureDirectory automatically creates a local cache of blobs and intelligently auto-uploads them on the fly.
The EULA restricts the library to non-commercial use.
Bill Zack claims The SQL Azure Tools Keep Coming! in this 11/5/2009 post:
SQL Azure is now feature complete for its launch at Microsoft’s Professional Developers Conference (PDC). Some of the other products in the Microsoft platform suite (like SQL Server Management Studio) have not yet caught up to SQL Azure. Hopefully that will happen by PDC. :-)
According to Microguru, the tool “provides an intuitive user interface to connect to and work with SQL Azure databases. Gem Query Tool supports execution of any DDL and DML script supported by SQL Azure. To facilitate authoring of SQL queries, Gem Query Tool for SQL Azure displays tables and columns in your database.”
Read more here…
Wade Wegner’s Leveraging WMI in an Azure Web Role post of 11/4/2009 describes and demonstrates how to take advantage of Windows Management Instrumentation to collect information about the Windows 2008 Server instances that run Windows Azure:
A few weeks ago I threw out a teaser on Twitter:
This got the attention of a few folks, and I promised I’d follow-up on this with some details on how I got this to work. Of course, I completely forgot to do so, and was only recently reminded (thanks, Roger Jennings).
You can still take a look at that application here (if nothing else, I find it interesting to look at the specifications of the CTP machines for Windows Azure): http://wmi.cloudapp.net/
In actuality, the solution is really quite straightforward – I used WMI. …
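Wade’s role code reads WMI via System.Management, but the same data can be poked at from a command prompt on any Windows Server 2008 box with wmic. Which WMI classes he actually queries is my assumption; this sketch just prints the commands so it runs anywhere:

```shell
# Hypothetical sketch: wmic queries against WMI classes an Azure role
# instance could report on (processor, OS, and machine details). The
# choice of classes is an assumption, not taken from Wade's post; the
# commands are printed rather than executed.
OUT=""
for Q in \
  "wmic cpu get Name,NumberOfCores,MaxClockSpeed" \
  "wmic os get Caption,TotalVisibleMemorySize" \
  "wmic computersystem get Model,TotalPhysicalMemory"
do
  echo "$Q"
  OUT="$OUT$Q
"
done
```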
Another of Wade’s 11/4/2009 posts, How to Leverage the RoleEntryPoint in an Azure Web Role post, begins:
One of the advantages to the approach our teams building the Windows Azure Platform have taken is flexibility. Recently, when I spoke at the Day of Cloud presentation, I recall Don Schwarz from Google making these two points (you can see video of his talk here):
- You can’t spin up your own threads in Google App Engine.
- You build your applications according to how Google thinks your apps should be built (the argument being that Google knows how to run highly available services at scale, which I think is a fair statement).
Now to be fair, there are good reasons for this – the Google App Engine has a number of very good use cases (Don Schwarz demonstrated one of them when showing the audience a multiplayer game running on Google App Engine).
Note: I’d like to state for the record that I mean no criticism of other cloud vendors (i.e. Amazon, Google, and SalesForce). I think each of them has a place in the market, and exhibits various strengths. That said, I do believe that the Windows Azure Platform stands out as the only real platform that can bridge the chasm between cloud services and on-premises software. (Note to self: back this statement up in a future blog post.)
I would argue, however, that most enterprise developers require a little more flexibility when building out enterprise class applications. The Windows Azure Platform provides this flexibility (I mean, come on – sometimes you just want to execute some native code!).
I decided to test how far I could take this flexibility in Windows Azure.
•• Judith Hurwitz describes Why all workloads don’t belong in the cloud in this 11/2/2009 post to her Cloud-Centric Weblog:
I had an interesting conversation with a CIO the other day about cloud computing. He had a simple question: I have a relatively old application and I want to move it to the cloud. How do I do that? I suspect that we will see a flurry of activity over the coming year where this question will be asked a lot. And why not — the cloud is the rage and who wouldn’t want to demonstrate that with the cloud all problems are solved. So, what was my answer to this CIO? Basically, I told him that all workloads do not belong in the cloud. It is not because this technically can’t be done. It can. It is quite possible to encapsulate an existing application and place it into a cloud environment so that new resources can be self-provisioned, etc. But, in reality, you have to look at this issue from an efficiency and an economic perspective. …
• Microsoft’s White Paper: The Time to Move to the Cloud is Now post of 11/6/2009 points to a Microsoft-funded whitepaper, A Sense of Urgency for Software Companies: Partnering for Success in the Cloud by Saugatuck Research:
“The challenge for software vendors is no longer about when to include or even shift entirely to SaaS, software-plus services, or other Cloud-based solutions. The time is now. The challenge is how to make it happen,” according to [the] white paper. …
[B]y year end 2010, nearly half of firms worldwide will be depending upon SaaS as part of their regular business operations.
The paper concludes, "The Cloud is the next stage of business solutions development and delivery. For on-premise software vendors the time to move to the Cloud is now.”
• Jay R. Galbraith asks Will Microsoft become the General Motors of software? in this 11/6/2009 post to Fortune Magazine’s BrainstormTech blog:
It has near-monopoly status and nimble, disruptive competitors. We’ve seen this movie before. …
Microsoft also suffers from the incumbent’s curse during a technological transition. The curse is well described in Clayton Christensen’s research. Cloud computing, in which software and other applications are housed in a central location and delivered over networks to end users, could lead to a shift away from desktop-based computing and from complicated operating systems. As Microsoft adapts to it, will it promote cloud computing or protect Windows? Will the team leading Microsoft’s Azure cloud computing business have the freedom to cannibalize the desktop? Or will it be integrated into Windows, where the desktop mafia will slow, modify and dilute the efforts to convert to a new business model?
The General Motors scenario does not have to happen. Ballmer can focus inward on transforming the desktop mafia to the new computing paradigm. Or, better yet, appoint a hands-on, change-experienced chief operating officer who can do it with him.
• Hosting.com presents its 2009 Cloud Computing Trends Report eBook for public download (site registration required):
During December 2008 and January 2009, Hosting.com surveyed nearly 700 individuals regarding Cloud Computing industry and purchasing trends. The resulting Cloud Computing Trends Report introduces new data into the cloud marketplace and is available for download below.
The survey provided insight into the expectations small, medium and large businesses have of Cloud Computing, their intended uses, reasons for adopting, and expected time-frames for implementing cloud-based solutions. The eBook reveals that there is little difference between how larger companies and small businesses will utilize Cloud Computing. [Emphasis by author.]
The survey appears dated to me, which might be the reason for its public release.
Darren Cunningham posits If It’s Not Multitenant, It’s Not Really SaaS in this 11/6/2009 post:
Last year Gartner came out with a SaaS revenue forecast that made the following statement:
“It is important to differentiate SaaS from hosting or application management or application outsourcing. Because the SaaS/on-demand market is ‘hot’, many suppliers are rebranding their hosting or application management or application outsourcing capabilities as SaaS/on-demand. The core proposition behind SaaS/on-demand is the delivery of multi-tenant service from a remote location over an internet protocol (IP) network via a subscription-based outsourcing contract.”
If your hardware and software appliance is hosted in an infrastructure as a service (IaaS) environment like Amazon EC2 , can you call it software-as-a-service (SaaS)? If it’s single tenant, the answer should be no.
Darren goes on to explain why “the answer should be no.”
See Kathleen Richards’ article of 11/1/2009 in the SQL Azure Database (SADB) section.
•• Tony Bishop’s Secure Enterprise Clouds post of 11/7/2009 to Cloud Security Journal begins:
There is so much waste in the data centers of Fortune 1000 companies today that a CIO – as an officer of the company – could be considered in breach of their fiduciary duty to stockholders given the dollars in question. Of course that requires cost transparency, so sadly most are safe for now. It seems that every new technology innovation brings the promise of greater efficiencies and cost savings but in reality tends to leave a mess of ‘legacy’ infrastructure on the floor that results in a net higher TCO than the CIO had in the first place.
So what does this have to do with Cloud Computing? While there is no shortage of companies trying to ply their wares as the ideal enabler for Cloud, I am surprised by the lack of attention from vendors that have the most to gain – the Cloud providers themselves. …
Tony is the founder and CEO of Adaptivity.
•• Judith Hurwitz asks Is cloud security really different than data center security? in this 10/30/2009 article that I missed when posted:
Almost every conversation I have had over the past year or so always comes back to security in the cloud. Is it really secure? Or we are thinking about implementing the cloud but we are worried about security. There are, of course, good reasons to plan a cloud security strategy. But in a sense, it is no different than planning a security strategy for your company. But it is the big scary cloud! Well, before I list the top ten issues I would like to say one thing: if you think you need an entirely different security strategy for the cloud, you may not have a comprehensive security strategy to start with. Yes, you have to make sure that your cloud provider has a sophisticated approach to security.
However, what about your Internet service provider? What about the level of security within your own IT department? Can you throw stones if you live in a glass house (yes, that is a pun…sorry)? So, before you start fretting about security in the cloud, get your own house in order. Do you have an identity management plan? Do you ensure that one individual within the data center can’t control all of the data within a single environment to minimize risks? If you don’t have a well-executed internal security plan, you aren’t ready for the cloud. But let’s say that you have fixed that problem and you are ready to really plan your cloud security strategy. So, here are five of the issues to consider. If you have others, let’s start a conversation.
This post complements her earlier Unintended consequences of the cloud – part II (10/29/2009) and What are the unanticipated consequences of Cloud Computing- Part I (10/28/2009) posts. Read all Judith’s recent Cloud-Centric Weblog posts at http://jshurwitz.wordpress.com/.
Dave Rosenberg reviews Microsoft's weak cloud privacy position on 11/6/2009 for CNet News’ BusinessTech blog:
Microsoft released on Thursday a new position paper, "Privacy in the Cloud Computing Era: A Microsoft Perspective," that includes information about the remote storage and processing of personal information.
Privacy and security concerns continue to be a primary argument that cloud naysayers use against storing data and applications on the Internet. Big IT vendors and service providers like Microsoft and Hewlett-Packard will sooner or later be forced to take the cloud seriously or risk missing out on the whole next wave of IT consumption. And their large enterprise customers will expect them to offer cloud services with the appropriate levels of privacy and security measures in line with their business needs.
The interesting thing about this paper is that Microsoft takes surprisingly minimal responsibility for the data it will manage:
“Unlike our consumer business, in which Microsoft has a direct relationship with consumers and directly controls the policies that govern their data, our cloud services for business customers defer to the policies of those customers. In this case, Microsoft has no direct relationship with the business's employees or the customers to whom the hosted data may pertain. Policies relating to the business's handling of this data in the cloud environment are controlled and set by that business rather than by Microsoft. Our role is to handle and process the data on behalf of the business, much like third-party telephone call centers process customer inquiries, orders, and data for their business customers.
“The division of responsibility between an enterprise or government and its cloud services provider is similar to that of a company that rents physical warehouse space from a landlord for storing boxes of customer or company files. Even though someone else might own the building, access to those files and the use of information within them is still governed by the policies of the company that rents the space. These same principles should apply in the cloud environment.” …
Microsoft's privacy principles are well documented, but as I read through this position paper, I found myself expecting more substantial assurances, especially considering Microsoft wants to be a cloud services provider for not just consumers but for enterprises and governments as well.
Mary Jo Foley reports in her Billing system testing behind Microsoft's SQL Azure outage this week post of 11/5/2009:
Microsoft published today a white paper outlining the company’s privacy policies for cloud computing.
Here’s the start of the “Cloud Computing and Privacy” preamble on page 1:
A new generation of technology is transforming the world of computing. Internet-based data storage and services—also known as “cloud computing”—are rapidly emerging to complement the traditional model of software running and data being stored on desktop PCs and servers. In simple terms, cloud computing is a way to enhance computing experiences by enabling users to access software applications and data that are stored at off-site datacenters rather than on the user’s own device or PC or at an organization’s on-site datacenter.
E-mail, instant messaging, business software, and Web content management are among the many applications that may be offered via a cloud environment. Many of these applications have been offered remotely over the Internet for a number of years, which means that cloud computing might not feel markedly different from the current Web for most users. (Technical readers will rightly cite a number of distinct attributes—including scalability, flexibility, and resource pooling—as key differentiators of the cloud. These types of technical attributes will not be addressed here because they are outside the scope of this document.)
Cloud computing does raise a number of important policy questions concerning how people, organizations, and governments handle information and interactions in this environment. However, with regard to most data privacy questions as well as the perspective of typical users, cloud computing reflects the evolution of the Internet computing experiences we have long enjoyed, rather than a revolution.
Microsoft recognizes that privacy protections are essential to building the customer trust needed for cloud computing and the Internet to reach their full potential. Customers also expect their data and applications stored in the cloud to remain private and secure. While the challenges of providing security and privacy are evolving along with the cloud, the underlying principles haven’t changed—and Microsoft remains committed to those principles. We work to build secure systems and datacenters that help us protect individuals’ privacy, and we adhere to clear, responsible privacy policies in our business practices—from software development through service delivery, operation, and support. …
•• Stephen Elop (President, Microsoft Business Division) and Marc Benioff (Chairman & CEO, salesforce.com) will debate the Economist’s Cloud Computing: This house believes that the cloud can't be entirely trusted topic proposition on 11/10/2009:
There is nothing the computer industry likes better than a big new idea. Cloud computing is the latest example, and companies large and small are already joining the fray. The idea is that computing will increasingly be delivered as a service, over the internet, from vast warehouses of shared machines. Many things work this way already, from email and photo albums to calendars and shared documents. Albeit more slowly, companies are also moving some of their applications into the cloud. But is this a good idea? Can providers of these computing clouds be trusted? Are these mainframes in the sky reliable enough? What happens if data get lost? What about privacy and lock-in? Will switching to another cloud be difficult?
This debate will happen online, and starts on November 10th 2009. You can sign up for email alerts to be notified when this debate begins.
These two should be on the same side. However, Stephen might be debating from the Microsoft Office, rather than the Windows Azure, perspective. Elizabeth Montalban’s MS Office battles Google in the cloud post for Network World of 11/6/2009 provides more background with a recent Elop interview.
•• Wade Wegner’s Preview of “Lessons Learned: Migrating Applications to the Windows Azure Platform” post of 10/7/2009 begins:
It’s hard to believe that the Professional Developers Conference (PDC) 2009 is less than two weeks away. It doesn’t seem that long ago that I sat behind the stage at PDC 2008 providing support for the RedPrairie keynote with Bob Muglia and spoke in a breakout session with Jack Greenfield on Multi-Enterprise Business Applications. I’ll be back again this year, and I’m giving another talk – this time on lessons learned when migrating applications to the Windows Azure platform.
Rather than present this session entirely on my own, I decided to invite some of my customers to come and talk about their own experiences. I am extremely excited that the following three customers will join me at PDC:
I will provide additional information regarding the “mystery company” as soon as I am able. Suffice to say, you know ‘em, and they’re doing some really cool stuff with Windows Azure.
SC Magazine reminds cloud practitioners that its free SC eConference and Expo starts on Tuesday, 11/10/2009 at 8:00 AM PST:
This innovative virtual conference will address the following pertinent issues:
- What are the risks of cloud computing applications?
- How should organizations better protect against these?
- How can organizations implement authentication or access controls?
- What can they do to prevent data leakage?
- How do they ensure that their end-users aren't introducing even greater problems?
Bill Zack announced a Webcast: Designing Multi-Tenanted Applications on Windows Azure in this 11/5/2009 post:
November 25, 2009, 11:00am – 12:30pm PST
Speaker: Joseph Hofstader
Abstract: Cloud computing is one of the hottest topics in information technology today. With all the confusion surrounding acronyms ending in ‘aas’ like Platform as a Service (PaaS), Infrastructure as a Service (IaaS) and Software as a Service (SaaS) it can be intimidating for even seasoned IT professionals. This presentation will briefly discuss the different types of cloud platforms and then address one of the key business scenarios for the cloud: Software as a Service.
Software as a Service is a business model for making applications available over the Internet. One of the key tenets of SaaS is multi-tenancy, or software designed to be used by multiple parties. Designing SaaS applications touches on many of the technologies that comprise the Azure platform: Processing, Storage, Workflow, Database and, most importantly, security. This presentation will discuss how each of these technologies can be utilized to define a flexible architecture for multi-tenant solutions.
Event ID: 1032432981
Joseph Hofstader is an architect/evangelist in Microsoft Communications Sector.
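The multi-tenancy tenet described in the abstract boils down to one design rule: many tenants share the same application and storage infrastructure, but every read and write is scoped by a tenant key. Below is a minimal, hypothetical sketch of that principle (the class and identifiers are illustrative, not from the session); it is analogous to using the tenant ID as the PartitionKey in Azure Table Storage so one tenant can never address another tenant’s data.

```python
# Hypothetical in-memory sketch of tenant-keyed data partitioning.
# In a real Azure SaaS app the tenant ID would typically become the
# PartitionKey of each stored entity; here a dict plays that role.

class MultiTenantStore:
    def __init__(self):
        # (tenant_id, row_key) -> entity: tenants share one store,
        # but keys are always qualified by tenant.
        self._rows = {}

    def put(self, tenant_id, row_key, entity):
        self._rows[(tenant_id, row_key)] = entity

    def get(self, tenant_id, row_key):
        # A caller can only address rows in its own tenant partition.
        return self._rows.get((tenant_id, row_key))

    def query(self, tenant_id):
        # Enumerate only the calling tenant's rows.
        return [e for (t, _), e in self._rows.items() if t == tenant_id]


store = MultiTenantStore()
store.put("contoso", "order-1", {"total": 42})
store.put("fabrikam", "order-1", {"total": 99})

# Same row key, different tenants: the partitioning keeps them isolated.
assert store.get("contoso", "order-1") == {"total": 42}
assert store.query("fabrikam") == [{"total": 99}]
```

The design choice illustrated is data isolation by partition key rather than by separate databases; it trades per-tenant customization for the shared-infrastructure economics that make SaaS viable.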
Dave Robinson summarizes SQL Azure at PASS Summit 2009 in this 11/1/2009 post:
This upcoming week, PASS (Professional Association for SQL Server) will be holding its annual summit in Seattle, WA. The summit runs from November 2nd through the 5th and, as described on the website, provides:
- In-depth technical sessions all focused on SQL Server
- Unparalleled access to the industry’s top SQL Server experts and the Microsoft SQL Server development team
- Unique opportunity to network with your peers, share challenges, and get answers and advice
- Return to work with new skills and knowledge to do your job better, faster, easier – right away.
This year the SQL Azure team will be presenting the following sessions:
- What’s new in SQL Azure - Patric McElroy
- Building Applications with SQL Azure and Windows Azure – David Robinson and Liam Cavanagh
- Roles and Responsibilities Managing a Microsoft SQL Azure Database - Nino Bice
Dave’s post includes brief descriptions of the sessions.
Sanjay Jain’s Microsoft BizSpark Incubation Week for Windows Azure @ Atlanta 09Nov09 post of 10/28/2009 announced:
- Windows Azure
- Microsoft .Net Services
- Microsoft SQL Azure
- Live Services
- Microsoft SharePoint Services
- Microsoft Dynamics CRM Services
The Microsoft BizSpark Incubation Week for Windows Azure will be held at Microsoft Technology Center, Atlanta, GA from Mon 11/09/2009 to Fri 11/13/2009. This event consists of ½ day of training, 3 ½ days of active prototype/development time, and a final day for packaging/finishing and reporting out to a panel of judges for various prizes.
This event is a no-fee event (plan your own travel expenses) and each team can bring 3 participants (1 business and 1-2 technical). To nominate your team, please submit the following details to Sanjay Jain (preferably via your BizSpark Sponsor). Nominations will be judged according to the strength of the founding team, originality and creativity of the idea, and ability to leverage Windows Azure Scenarios.
• Sebastian Rupley lists his 11 Top Open-source Resources for Cloud Computing in an 11/6/2009 post to GigaOm:
Open-source software has been on the rise at many businesses during the extended economic downturn, and one of the areas where it is starting to offer companies a lot of flexibility and cost savings is in cloud computing. Cloud deployments can save money, free businesses from vendor lock-ins that could really sting over time, and offer flexible ways to combine public and private applications. The following are 11 top open-source cloud applications, services, educational resources, support options, general items of interest, and more. …
• Reuven Cohen’s The Open Web Foundation Agreement (OWFa) for Collaborative Open Cloud Standards post of 11/6/2009 reports:
As part of a new initiative at the Open Web Foundation -- a group dedicated to the creation of community-driven specifications & standards -- David Rudin, along with several other individuals & organizations, has crafted a new, simple and easy-to-understand Open Web Foundation agreement (OWFa) targeting collaborative specification development and publishing. You can think of the OWFa as similar to the Creative Commons license. But unlike a CC license, the OWFa was developed with the specific needs of spec & standards developers in mind, covering aspects such as patents, copyright/trademarks and other issues that most contributors (including open source developers) are concerned about.
More specifically it was created with an open collaboration model in mind where both large companies and individuals can equally collaborate without fear of legal ramifications. Using the OWFa the actual spec development can be done in any forum the participants choose (Unincorporated Google groups / Social Networks, non-profits, startups, Enterprises, etc.)
You can download a copy of the v9 draft here.
• Reuven Cohen reports ISO Forms Group for Cloud Computing Standards in this 11/6/2009 article:
Big news on the Cloud Standards front: I was just informed that the International Organization for Standardization (ISO) JTC 1 has formed a new Subcommittee (SC) at its Plenary last week that includes working groups for SOA and Web Services as well as a Study Group for standardization of cloud computing. (This information has not yet been made public; my source has indicated that I am allowed to share this.)
The scope will include Standardization for interoperable Distributed Application Platform and services including Web Services, Service Oriented Architecture (SOA), and Cloud Computing. SC 38 will pursue active liaison and collaboration with all appropriate bodies (including other JTC 1 subgroups and external organizations, e.g., consortia) to ensure the development and deployment of interoperable distributed application platform and services standards in relevant areas.
Similar to other ISO initiatives each member country that’s interested in participating in this group will come up with their own structure to provide feedback on work items and establish voting positions, including the InterNational Committee for Information Technology Standards (INCITS) who will be the US TAG.
Ruv’s post includes the complete text of Resolution 36 ‐ New JTC 1 Subcommittee 38 on Distributed Application Platforms and Services (DAPS).
• Jim Ericson interviews John Willis (@botchagalupe) in the Overheard: Cloud Computing - Learn It Or Lose post of 11/6/2009 for the Information Management blog:
… Do you see the cloud as more than another evolutionary step in IT?
As I picked it up from others, I also came to think that the cloud is kind of a Cambrian explosion that may mark this time in IT history as a spectacular moment. You could say it started with computers and IBM or PCs or the Internet, but the convergence of massively scalable commodity computers, open Internet protocols and our understanding of it looks different than everything I’ve seen in the last 30 years. There’s plenty of hype, but the stories are extremely real and you can’t ignore them. …
James Urquhart reported IBM launches development and test cloud in his 11/4/2009 article for CNET’s The Wisdom of Clouds blog:
With a nod toward the heterogeneous application development environments that exist in most enterprise IT departments, IBM on Wednesday launched a pair of services targeted at building cloud applications.
The first, the IBM Smart Business Development and Test on the IBM Cloud, is a cloud service hosted in IBM's data centers that provides tools and interfaces designed to support developers using Java, .NET, and Open Source environments. This service provides computing and storage capacity, and support for WebSphere middleware, Rational Software Delivery Services, and its Information Management database. It also provides "pre-configured integrations" of some Rational services based on IBM's Jazz framework, its collaborative software platform.
There are no pre-configured integrations announced for third-party or open source tools or languages. …