Tuesday, November 02, 2010

Windows Azure and Cloud Computing Posts for 11/2/2010+

A compendium of Windows Azure, Windows Azure Platform Appliance, SQL Azure Database, AppFabric and other cloud-computing articles.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post's title to display the single article, then navigate to the section you want.


Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now freely download by FTP and save the following two online-only PDF chapters of Cloud Computing with the Windows Azure Platform, which have been updated for SQL Azure’s January 4, 2010 commercial release:

  • Chapter 12: “Managing SQL Azure Accounts and Databases”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available at no charge from the book's Code Download page.


Tip: If you encounter articles from MSDN or TechNet blogs that are missing screen shots or other images, click the empty frame to generate an HTTP 404 (Not Found) error, and then click the back button to load the image.


Azure Blob, Drive, Table and Queue Services

My (@rogerjenn) What Happened to Secondary Indexes for Azure Tables? post of 11/2/2010 laments the Windows Azure Team’s failure to deliver long-promised secondary indexes:

Windows Azure Tables have a single, composite index on a PartitionKey + RowKey unique identifier (equivalent to a relational primary key), so all table data retrieved is sorted in this (ascending) order. One of the more popular requests since the first CTP of Windows Azure has been the ability to specify additional (secondary) indexes, which would enable high-performance filtering by column values instead of laborious row-scans with <, <=, ==, >=, or > operators.
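
For illustration, here's a minimal C# sketch of the difference using the SDK's StorageClient library; the OrderEntity class, table name, and key values are hypothetical:

using System;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Hypothetical entity; PartitionKey and RowKey are inherited from TableServiceEntity.
public class OrderEntity : TableServiceEntity
{
    public string CustomerName { get; set; }
}

class QueryDemo
{
    static void Main()
    {
        CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
        TableServiceContext context =
            account.CreateCloudTableClient().GetDataServiceContext();

        // Fast: served directly by the built-in PartitionKey + RowKey index.
        var byKey = from o in context.CreateQuery<OrderEntity>("Orders")
                    where o.PartitionKey == "2010-11" && o.RowKey == "000042"
                    select o;

        // Slow: CustomerName has no index, so the service must scan every row
        // in the "2010-11" partition and test the predicate against each one.
        var byName = from o in context.CreateQuery<OrderEntity>("Orders")
                     where o.PartitionKey == "2010-11"
                           && o.CustomerName == "Phyllis Harris"
                     select o;

        foreach (OrderEntity o in byName)
            Console.WriteLine(o.RowKey);
    }
}

A secondary index on CustomerName would let the second query seek directly to matching rows instead of scanning the partition.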

The Windows Azure Storage Team and its predecessors have promised secondary indexes for Azure Tables since the initial Azure Beta program in 2008 but haven't delivered in the intervening two years.

As far as I've been able to determine, secondary indexes for Azure Tables weren't even mentioned by the Windows Azure team at PDC 2010. I heard no mention of secondary indexes in Jai Haridas' Windows Azure Storage Deep Dive session on Friday morning, and his slide deck was similarly bereft of them.

Here are Jai’s performance-related slides for Azure Tables Tips & Best Practices:

[Three slides not reproduced here.]

The post continues with more background about missing secondary indexes for Windows Azure Tables.


<Return to section navigation list> 

SQL Azure Database, Azure DataMarket and OData

Steve Yi reported a new addition to the Wiki: Inside SQL Azure in an 11/2/2010 post to the SQL Azure Team blog:

Interested in how SQL Azure works on the inside? The SQL Azure team has just published a white paper in the wiki section of TechNet called "Inside SQL Azure".

The paper examines the internals of SQL Azure databases and how they are managed in Microsoft's data centers to provide high availability and immediate scalability in a familiar SQL Server development environment.

Read Inside SQL Azure.

No significant OData articles today.


<Return to section navigation list> 

AppFabric: Access Control and Service Bus

Wade Wegner recommended his PDC10: Introduction to Windows Azure AppFabric Caching (CS60) session video on 11/2/2010:

In addition to building the Composite Application keynote demo presented by James Conard and Bob Muglia, I presented a session on the new Windows Azure AppFabric Caching service that’s now available as a CTP release in the AppFabric LABS environment.  You can find the presentation here: http://player.microsoftpdc.com/Session/1f607983-c6eb-4d9f-b644-55247e8adda6

Introduction to Windows Azure AppFabric Caching

A few interesting notes on the Caching service (a usage sketch follows the list):

  • It’s a distributed, in-memory, application cache provided entirely as a service – no installation, management, or deployment required
  • Low latency and high throughput (i.e. it’s fast)
  • Based on Windows Server AppFabric Caching (codename "Velocity"), and the development experience is identical
  • Local cache support allows you to keep your data in-memory on the client, reducing network latency penalties
  • You can cache any managed object, regardless of the size; if using local cache you pay no serialization costs
  • Secured by the Access Control service
  • Pre-built providers for ASP.NET session state and page output caching
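
As a concrete illustration, here's a minimal cache-aside sketch against the CTP's DataCache API; the Product type and database loader are hypothetical, and the cache endpoint and Access Control credentials are assumed to be configured in app.config:

using System;
using Microsoft.ApplicationServer.Caching;

// Hypothetical cached type; with the default transport it must be serializable.
[Serializable]
public class Product
{
    public string Id { get; set; }
    public string Name { get; set; }
}

public static class ProductCache
{
    // DataCacheFactory reads the cache endpoint and access token from config;
    // it is expensive to create, so reuse one instance.
    private static readonly DataCacheFactory Factory = new DataCacheFactory();

    public static Product Get(string id)
    {
        DataCache cache = Factory.GetDefaultCache();

        // With local cache enabled, a hit here avoids the network round trip
        // (and the serialization cost) entirely.
        Product product = cache.Get(id) as Product;
        if (product == null)
        {
            product = LoadFromSqlAzure(id); // hypothetical database loader
            cache.Put(id, product);         // populate for later callers
        }
        return product;
    }

    private static Product LoadFromSqlAzure(string id)
    {
        // Stand-in for a real SQL Azure query.
        return new Product { Id = id, Name = "Sample product" };
    }
}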

You’re going to hear a lot about the Caching service from me, as it fills a very significant gap in the Windows Azure Platform. 

For some more information, you should also take a look at these resources:

And of course, visit http://portal.appfabriclabs.com/ to try it out today!


<Return to section navigation list> 

Windows Azure Virtual Network, Connect, and CDN

No significant articles today.


<Return to section navigation list> 

Live Windows Azure Apps, APIs, Tools and Test Harnesses

J. D. Meier summarized Developer Guidance Maps Roundup for ADO.NET, ASP.NET, Silverlight, Windows Azure and Windows Phone on 11/2/2010:

Developer Guidance Maps are treasure maps and guided tours of our developer content collections.  They are consolidated and organized views of content collections spanning Channel9, MSDN Developer Centers, MSDN Library, Code Gallery, CodePlex, the All-in-One Code Framework, www.ASP.net, www.Silverlight.net, WindowsClient.net, etc.

I'm creating these as part of our "IA" effort.  One of the things I've been tasked with is creating an IA, or "information architecture," for the developer guidance ecosystem at Microsoft.  As part of that effort, I have to map out what we already have as well as identify the various sources of content and clearinghouses.   Rather than simply do this behind the scenes, I've decided to share the maps with you as I go so that you benefit … thus Developer Guidance Maps were born.

Each map basically provides a view of the technology (common categories, features, scenarios), sources of where to look for key content, a Getting Started section of content, an Architecture and Design section of content, and then extensive content collections for Code Samples, How Tos, Videos, and Training, organized by scenarios and common tasks.

Benefits of Developer Guidance Maps

  1. Show you the key sources of developer content and where to look (“teach you how to fish”)
  2. Provide an index of the main content collections (Code Samples, How Tos, Videos, and Training)
  3. Use the map as a model for creating your own map of developer guidance to streamline your learning and ramp up.

Developer Guidance Maps Available for Download
Here is a roundup of the current Developer Guidance Maps:

Mental Model for the Maps
Here is a simple mental model for the Developer Guidance Maps:

[Mental-model diagram not reproduced.]

The Approach
Rather than boil the ocean, I’ve used a systematic and repeatable approach.  I’ve focused on common categories and features for key technologies and simple content types.   Here is how we prioritized our focus:

  1. Content Collections: Code, How Tos, Videos, Training
  2. Building Apps: Cloud, Data, Desktop, Phone, Service, Web
  3. Technology Building Blocks:  ADO.NET, ASP.NET, Silverlight, WCF, WPF, Windows Azure, Windows Client, Windows Phone

The Maps are Works in Progress
Keep in mind these maps are works in progress and they help us pressure test our simple information architecture (“Simple IA”) for developer guidance at Microsoft.  Creating the maps also helps me test the model, create a catalog of developer guidance, and easily find the gaps and opportunities.   While the maps are incomplete, they may help you find content and sources of content that you didn’t even know existed.  For example, the All-In-One Code Framework has more than 450 code examples that cover 24 Microsoft development technologies such as Windows Azure, Windows 7, Silverlight, etc. … and the collection grows by six samples per week.

Here’s another powerful usage scenario.  Use the maps as a template to create your own map for a particular technology.  By creating a map or catalog of content for a specific technology, and  organizing it by topic, feature, and content type, you can dramatically speed up your ability to map out a space and leverage existing knowledge. (… and you can share your maps with a friend ;)

My Related Posts


Peter Galli reported Zend Framework 1.11 Ships in an 11/2/2010 post to the interoperability@Microsoft blog:

At the annual ZendCon 2010 in Santa Clara, CA today, Zend Technologies announced general availability of Zend Framework 1.11, the latest release of its PHP application framework. The release adds support for mobile application development and includes the open source Simple Cloud API, which allows PHP developers to build portable cloud applications.

The Zend Framework is a PHP application framework with more than 15 million downloads and over 500 contributors, including Microsoft, Amazon, IBM, Adobe and Google.

According to Zend's announcement, Zend Framework 1.11 gives developers access to the first deliverables for the Simple Cloud API project, including:

  • Document Service integration, which allows developers to utilize a variety of NoSQL cloud storage solutions including Amazon SimpleDB and Microsoft Windows Azure Table storage.
  • Queue Service integration, which lets developers perform asynchronous operations in order to offload heavy-lifting, pre-cache pages, and more. Queue Service integrations include Amazon Simple Queue System (SQS), Microsoft Windows Azure Queue service, and all adapters supported by the Zend Framework Zend_Queue component.
  • Storage Service integration, which allows developers to push static resources such as images and archives to the cloud. Currently supported services include Amazon Simple Storage Service (S3), Microsoft Windows Azure Blob storage, and Nirvanix.

Windows Azure access from the Simple Cloud API is made possible by the Windows Azure SDK for PHP, a project sponsored by Microsoft and developed by RealDolmen. This is yet another example of Microsoft's continuing commitment to the openness of the Windows Azure Platform by working with the larger open source community.

For its part, Microsoft is pleased to see the role this project is playing in "driving adoption among PHP developers for cloud computing platforms, and hope that many of these developers will be encouraged to use Windows Azure," says Jean Paoli, General Manager of Interoperability Strategy at Microsoft Corp.

"The Simple Cloud API is an important catalyst for open and interoperable cloud computing, and Microsoft has an ongoing investment in the Simple Cloud API project, together with Zend and other contributors," Paoli says.

The new mobile device support in Zend Framework 1.11 provides functionality for detecting mobile device types and their capabilities. Developers can choose from the  WURFL database, TeraWurfl, or DeviceAtlas to retrieve device capabilities, or they can write their own classes to leverage additional device databases.

Zend Framework 1.11 mobile support also includes the Dojo Toolkit 1.5 update, which includes the dojox.mobile subproject. This delivers a flexible, lightweight mobile application framework, including CSS3 and JavaScript widgets optimized for use on mobile devices and for mobile-specific contexts.


Ron Jacobs explained why a WCF WebHttp service returns HTTP 415 "Unsupported Media Type" in this 11/1/2010 post:

When I was developing my demos for PDC10 I ran into a problem.  I was using some new HTTP activities for the next release of WF and invoking a simple console application with an HTTP POST.  For some strange reason I ran into the following error:

Request

POST http://127.0.0.1:8080/Marketing HTTP/1.1
Content-Type: application/xml
Host: 127.0.0.1:8080
Content-Length: 91
Expect: 100-continue
Connection: Keep-Alive

<string xmlns="http://schemas.microsoft.com/2003/10/Serialization/">Phyllis Harris</string>

Response

HTTP/1.1 415 Cannot process the message because the content type 'application/xml' was not the expected type 'text/xml; charset=utf-8'.
Content-Length: 0
Server: Microsoft-HTTPAPI/2.0
Date: Wed, 27 Oct 2010 20:00:06 GMT
I have done a fair bit of development with WCF, but I had never seen this error message before.  I was so sure my code was correct that I immediately assumed it must be a problem with our new HttpPost activity, so I first checked with our test team to see if they had a working test.  When I found out they did, we began looking more closely and I found… a copy-paste bug.

That's right: in my hosting code it turned out I was using ServiceHost instead of WebServiceHost to host my WCF WebHttp service.  In most cases, when a service host detects that your service (or its configuration) is somehow invalid, it throws an exception when you try to open the host.  For some strange reason (bug?), in this case it does not throw an exception but happily starts accepting requests and returning 415 when you invoke it.

The solution of course is to use WebServiceHost instead of ServiceHost.  Hopefully this will save somebody out there some time.
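
For anyone who hits the same 415, here's a minimal self-hosting sketch showing the one-line fix; the contract and address are illustrative, not Ron's actual code:

using System;
using System.ServiceModel;
using System.ServiceModel.Web;

[ServiceContract]
public interface IMarketing
{
    [OperationContract]
    [WebInvoke(Method = "POST", UriTemplate = "Marketing")]
    string Enroll(string name);
}

public class MarketingService : IMarketing
{
    public string Enroll(string name)
    {
        return "Enrolled " + name;
    }
}

class Program
{
    static void Main()
    {
        // Wrong: new ServiceHost(...) opens without complaint here, but
        // answers POSTs with HTTP 415 as described above.
        // Right: WebServiceHost adds the webHttp behavior automatically.
        var host = new WebServiceHost(typeof(MarketingService),
                                      new Uri("http://127.0.0.1:8080/"));
        host.Open();
        Console.WriteLine("Listening; press Enter to stop.");
        Console.ReadLine();
        host.Close();
    }
}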



<Return to section navigation list> 

Visual Studio LightSwitch

No significant articles today.


<Return to section navigation list> 

Windows Azure Infrastructure

Mike Wickstrand reminisced in his Windows Azure: A Year of Listening, Learning, Engineering, and Now Delivering! post of 11/2/2010:

Although I've worked at Microsoft for more than 11 years, 2009 marked the first time I had the opportunity to attend Microsoft's Professional Developers Conference. When I walked around PDC 09 in Los Angeles last year and spoke with developers, I found that I was inundated with many great ideas on how to make Windows Azure better, almost too many to sort through and prioritize. As someone who helps chart the future course for Windows Azure this was a fantastic problem to have at that point, because in late 2009 we were finalizing our priorities and engineering plans for calendar year 2010.

Energized by those developer conversations and wanting a way to capture and prioritize it all, on the flight home I launched http://www.mygreatwindowsazureidea.com (Wi-Fi on the plane helped). It's a simple site where Windows Azure enthusiasts and customers (big or small) can tell Microsoft's Windows Azure Team directly what they need by submitting and voting on ideas. I wasn't sure anyone would participate, so I submitted a few ideas of my own to get things going and gauge interest in some ideas we were kicking around within the Windows Azure Team. The goal of the site was and is to better understand what you need from Windows Azure and to build plans around how we make the things that "bubble to the top" a reality for our customers in the future.

So what happened? Well, with a year now gone by and a slew of features on tap for release it’s the perfect time to reflect back. In the past 12 months, more than 2000 unique visitors to mygreatwindowsazureidea.com have submitted hundreds of feature requests and cast nearly 13,000 votes for the features that matter most to them. There were also hundreds and hundreds of valuable comments and blog posts that grew out of the ideas people were sharing on the site. Thank you for this amazing level of participation!

With the announcements last week at PDC 10 and with a look forward to things that weren’t announced, but are in the works, I am pleased to let you know that we are addressing 62% of all of the votes cast with features that are already or soon will be available. Said another way, we are addressing 8 out of the top 10 most requested ideas (and more ideas lower down on the list) that in total account for roughly 8000 of the nearly 13,000 votes cast.

I hope you agree that we are sincerely listening to you and knocking these high-priority ideas off one by one. I am sure some of you want new features to come sooner, or perhaps you're not happy because your requested feature isn't yet available (or isn't available exactly in the way you envisioned). With more than 2000 people participating, this is going to happen; I just hope that with what we are releasing you are now even more enthusiastic about keeping an active dialog going with me. Also, please realize this site is just one of many channels we use to determine our engineering and business priorities; this one just happens to be the most public.

On that note, I received this e-mail the other day that I wanted to share with you:

From: Paul <last name and e-mail address withheld>
Sent: Saturday, October 30, 2010 4:15 AM
To: Mike Wickstrand
Subject: Windows Azure

Mike,

I’ve been keeping a close eye on Windows Azure, and so far it’s been a case of “Wow, I’d love to develop for this, but it’s too expensive”.  I’ve been looking on “mygreatwindowsazureidea.com”, and I have to say, the new announcements for $0.05 per hour instances and being able to run multiple roles <web sites> per instance has tipped me in favour of Azure enough to begin learning and developing for it.

Thanks so much for listening to the feedback of the developer community.  It gives me a warm feeling that we have our Microsoft of old back, who cares and listens to the developer community.

Honestly, this is great news.

Thanks,

Paul (A born-again Microsoft fan-boi)

When I was sitting on that plane last year flying back from PDC 09, I hoped that in a year I would be able to look each of you in the eye and demonstrate that Microsoft listens and that the Windows Azure Team cares about what you need. In the best-case scenario I envisioned hearing from customers like Paul. I gave you my assurance that if you tell us what you want, I will do my best to champion those ideas within Microsoft to make them a reality. I hope you feel I've lived up to that and earned the right to keep hearing your ideas on how to make Windows Azure great for you and your companies.

So…a big thank you to the more than 2000 people who shared ideas and voted for what you want and need from Windows Azure. To the thousands of Windows Azure customers who regularly receive e-mails from me asking for your opinions, thank you and please keep the feedback coming. And lastly, to the Windows Azure Team, thanks for making all of this happen, for Paul and for our thousands of customers just like him.

In the past year we've also added a few more ears to my team, so along the way please don't hesitate to share your ideas with harism@microsoft.com (Haris), adamun@microsoft.com (Adam), or rduffner@microsoft.com (Robert).

We look forward to coming back to you in another year after PDC 11 and having an even better story to tell.

Let's hope we don't need to wait for PDC 11 to get secondary indexes for Windows Azure Tables (see my What Happened to Secondary Indexes for Azure Tables? post of 11/2/2010).


David Linthicum asserted "Here's a prescription of what it should mean, in contrast to the empty marketing usage" as a deck for his What does 'open' really mean in cloud computing? post of 11/2/2010 to InfoWorld's Cloud Computing blog:

This week I'm at Cloud Expo, where I suspect I'll see and hear the word "open" a lot. This is not only a cloud issue -- in much of the software world, "open" is becoming almost a religious belief. Technologists typically assume they need to use "open" technology but are not sure what it means.

The idea is compelling. We all want to invest in technology that works and plays well with other "open" technology, and where we're not dependent on the vendors because it's "open" source. Thus, we can protect our investment in that technology, and all will be right with the world.

It's natural the same aspiration is being applied to the cloud. Thus, standards organizations are emerging around the concept of "open" in the world of cloud computing. For example, just this week the Open Data Center Alliance was formed to sway IT vendors toward supporting a cloud computing world where openness is a priority. Of course there's also the Open Cloud Consortium, which focuses on working with vendors and industry to make sure the cloud is open, including the use of its Open Cloud Testbed.

The truth of the matter is the term "open" is so widely used that it does not have any meaning or value any more. I view it as more of a marketing concept. Perhaps it's time to draw a line in the sand and decide what "open" really means.

Here's my proposed definition:

  • First, the vendor must provide the code for the core cloud product or service -- not a subset of items on a separate code tree, which many vendors call the "open source version," but what the vendor is actually pushing for its customers to use.
  • Second, the vendor must take feedback, fixes, and new features back into the core code tree from outside the organization. 
  • Finally, the vendor doesn't take legal action against anyone who takes its core product and builds something better with it, or includes it in other products.

You'll find that only a very few cloud technologies make it through that filter.


<Return to section navigation list> 

Windows Azure Platform Appliance (WAPA)

The HPC in the Cloud blog reported Daimler Building New Cloud-Based Collaboration Platform on Microsoft Technologies in an 11/2/2010 post:

Microsoft Corp. has won another key customer with its collaboration platform and cloud solution. Following an evaluation phase, Daimler AG decided to migrate onto Microsoft's unified communications, messaging and collaboration solutions. Staff members at Daimler AG now communicate and collaborate more efficiently by using Microsoft technology.

The Microsoft solution helps ensure the highly secure and simple exchange of documents — even beyond organizational boundaries — by including ad-hoc communications, conferencing and traditional e-mail communications in one solution. The specification set by Daimler AG was to reduce the ever-increasing complexity of PC workstations and build a communication and collaboration architecture that will meet the requirements of future working processes. The Microsoft platform with Microsoft Office 2010, Microsoft Exchange Server 2010 and Microsoft SharePoint 2010 helps meet these requirements. In the future, 180,000 Daimler employees worldwide will communicate and collaborate using Microsoft technology through e-mail, instant messaging or web conferences.

"We want to help our automotive customers worldwide transcend the market's turbulence and use technology and innovation as powerful tools to drive profitable growth," said Simon Witts, corporate vice president of Microsoft's Enterprise and Partner Group. "Our commitment to this key industry is unwavering, and our excitement is high as we anticipate the evolution path of today's vehicles."

Building a Private Cloud


For Daimler AG, a key to the launch of a new communications platform was achieving a high degree of standardization and scalability throughout its IT services. Another challenge was creating an open communications platform that also supports fast communication for partners while taking into account the need for protecting intellectual property. Daimler will use a service provider to support its infrastructure via a private cloud.

"We convinced Daimler of the added value of our technology," said Ralph Haupter, chairman of the Management Board of Microsoft Deutschland GmbH. "The project emphasizes that the cloud has already been adopted by businesses and today's discussion is about specific business scenarios."

This significant new customer enables Microsoft to further expand its market share in Germany.

About Microsoft in Automotive

Microsoft has been working with the automotive industry for more than a decade. The Microsoft Automotive and Industrial Equipment vertical works with industry partners to develop solutions based on Microsoft technologies that enable original equipment manufacturers (OEMs), suppliers and customers to help improve efficiency, effectiveness and knowledge across the business. Business applications, Microsoft .NET Framework-based technologies and enterprise platform support help manufacturers accelerate time to market, collaborate globally with engineers, reduce costs by leveraging the power of the Internet, and increase visibility into their production and supply chain processes. More information can be found at http://www.microsoft.com/automotive.



<Return to section navigation list> 

Cloud Security and Governance

Boris Segalis reported Data Commissioners Conference in Jerusalem Focuses on Future of Privacy, Cooperation and Enforcement in this 11/2/2010 post to the Information Law Group blog:

Last week, we joined privacy regulators, practitioners and industry representatives from around the world in Jerusalem for the 32nd International Conference of Data Protection and Privacy Commissioners. On numerous panels, conference participants engaged in lively discussions about privacy compliance and enforcement as well as the future of privacy in light of evolving consumer expectations and advances in technology that tracks and identifies individuals.

In discussions about the current state and future of privacy, some industry representatives took the position that active sharing by consumers of personal data online, including through social networks, is a vote of confidence in the current approach to privacy regulation. In response, some of the regulators and academics called for stronger privacy protections, arguing that consumers are still unaware of the consequences of disclosing their personal data. Notably, opinions on the state and future of privacy did not necessarily split along the industry/regulator lines. Rather, some industry representatives took a decidedly pro-consumer view of privacy protection, seeing it as a good business practice, while some of the privacy regulators, including the Israeli regulator and some of the European officials, sought to balance privacy protection with the interests of the business community.

On the issue of privacy compliance, participants agreed that Europe continues to be a difficult landscape to navigate in understanding the applicability of local data protection laws to personal data processing activities. At the same time, European panelists acknowledged that diverging views on jurisdiction may not be compatible with the fact that data flows do not know physical borders, and called for more uniformity among EU member states.

The topic of privacy enforcement generated great interest among conference participants. It continues to be a source of frustration for the industry and privacy practitioners. At the conference, panelists acknowledged limitations and inconsistencies of the various privacy enforcement regimes. For example, many of the European regulators are constrained by limitations on their investigative or enforcement authority, discretion as to which consumer complaints to address, and budgetary constraints. U.S. regulators appear to be taking privacy seriously. The conference was well attended by representatives of a number of U.S. federal agencies, including the Federal Trade Commission, the State Department, the Commerce Department, and the Department of Homeland Security. The FTC's Director of the Bureau of Consumer Protection, David Vladeck, explained that the FTC is choosing its enforcement actions carefully to give guidance to the industry as to which practices the Commission considers unacceptable. The FTC's expectation is that the industry will follow the guidance provided by its privacy enforcement actions. At the same time, the Commission is ready to increase enforcement if it believes that privacy compliance levels are unsatisfactory. Panelists also suggested that private-action enforcement, such as class actions in the U.S. and group actions in Europe, may be gaining steam, although the practice is still in its infancy.

At the conclusion of the conference, the commissioners took a step in increasing international cooperation on privacy matters by admitting the FTC into membership in the conference. The admission is a vote of confidence in the FTC’s authority and independence in enforcing privacy regulations. It is also without a doubt the result of the FTC’s increased cooperation with European data protection commissioners. According to the FTC’s David Vladeck, this joint work will continue.

There are many more lessons learned from the Jerusalem conference that we expect to mention in future posts, so please stay tuned.


Adrian Lane posted SQL Azure and 3 Pieces of Flair on 11/1/2010:


I have very little social life, so I spent my weekend researching trends in database security. Part of my Saturday was spent looking at Microsoft's security model for the Azure SQL database platform. Specifically I wanted to know how they plan to address database and content security issues with their cloud-based offering. I certainly don't follow all things cloud to the degree our friend Chris Hoff over at RationalSurvivability does, but I do attempt to stay current on database security trends as they pertain to cloud and virtual environments.

Rummaging around MSDN, looking for anything new on SQL Azure database security, I found Microsoft's Security Guidelines and Limitations for SQL Azure Database and downloaded their Security Guidelines for SQL Azure (docx). All 5 riveting pages of it. I have also been closely following the Oakleaf Systems blog, where I have seen many posts on secure session management and certificate issuance. In fact, Adam Langley had an excellent post on the computational costs of SSL/TLS this Saturday. All in all they paint a very consistent picture, but I am quite disappointed in what I see. Most of the technical implementations I have looked at appear sound, but if the public documentation is an accurate indication of the overall strategy, I am speechless.

Why, you ask? Firewall, SSL, and user authentication are the totality of the technologies prescribed. Does that remind you of something?

This, perhaps?

Universal Security Model

With thanks to Gunnar Peterson, who many years ago captured the essence of most web application security strategies within a single picture. Security minimalism. And if they only want to do the minimum, that's okay, I guess. But I was hoping for a little content security. Or input validation tools. Or logging. I'm not saying they need to go wild with features, but at this point the burden's on the application developer to roll their own security.

—Adrian Lane


Chris Hoff (@Beaker) posted Navigating PCI DSS (2.0) – Related to Virtualization/Cloud, May the Schwartz Be With You! on 11/1/2010:

[Disclaimer: I'm not a QSA. I don't even play one on the Internet. Those who are will generally react to posts like these with the stock "it depends" answer, to which I respond "you're right, it does.  Not sure where that leaves us other than with a collective sigh, but...]

The Payment Card Industry (PCI) last week released version 2.0 of the Data Security Standard (DSS). [Legal agreement required]  This is an update from v1.2.1 that, strangely, introduces no major new requirements and instead clarifies language.

Accompanying this latest revision is also a guidance document titled “Navigating PCI DSS: Understanding the Intent of the Requirements, v2.0” [PDF]

One of the more interesting additions in the guidance is the direct call-out of virtualization which, although late to the game given the importance of this technology and its operational impact, is a welcome addition to this reader.  I should mention I've sat in on three of the virtualization SIG calls, which gives me an interesting perspective as I read through the document.  Let me just summarize by saying that "…you can't please all the people, all of the time…" ;)

What I find profoundly interesting is that since virtualization is such a prominent and enabling foundational technology in IaaS Cloud offerings, the guidance is still written as though the multi-tenant issues surrounding cloud computing (as an extension of virtualization) don't exist and shared infrastructure doesn't complicate the picture.  Certainly there are "cloud" providers who don't use infrastructure shared with other providers beyond themselves in order to deliver service to different customers (I think we call them SaaS providers), but think about the context of people wanting to use AWS to deliver services that are in scope for PCI.

Here’s what the navigation document has to say specific to virtualization and ultimately how that maps to IaaS cloud offerings.  We’re going to cover just the introductory paragraph in this post with the guidance elements and the actual DSS in a follow-on.  However, since many people are going to use this navigation document as their first blush, let’s see where that gets us:

PCI DSS requirements apply to all system components. In the context of PCI DSS, “system components” are defined as any network component, server or application that is included in, or connected to, the cardholder data environment. System components” also include any virtualization components such as virtual machines, virtual switches/routers, virtual appliances, virtual applications/desktops, and hypervisors.

I would have liked to see specific mention of virtual storage here and, although they're likely included by implication in the management system/sub-system mentions above and below, direct mention of APIs. Thanks to heavy levels of automation, the operational movements related to DevOps, and APIs becoming the interface of the integration and management planes, these are unexplored lands for many.

I'm also inclined to wonder about virtualization approaches that are not server-centric, such as physical networking devices, databases, etc.

If virtualization is implemented, all components within the virtual environment will need to be identified and considered in scope for the review, including the individual virtual hosts or devices, guest machines, applications, management interfaces, central management consoles, hypervisors, etc. All intra-host communications and data flows must be identified and documented, as well as those between the virtual component and other system components.

It can be quite interesting to imagine the scoping exercises (or de-scoping, more specifically) associated with this requirement in a cloud environment.  Even if the virtualized platforms are operated solely on behalf of a single customer (read: no shared infrastructure — private cloud), this is still an onerous task, so I wonder how — if at all — this could be accomplished in a public IaaS offering given the lack of transparency we see in today's cloud operators.  Much of what is being asked for relating to infrastructure and "data flows" between the "virtual component and other system components" represents the CSP's secret sauce.

The implementation of a virtualized environment must meet the intent of all requirements, such that the virtualized systems can effectively be regarded as separate hardware. For example, there must be a clear segmentation of functions and segregation of networks with different security levels; segmentation should prevent the sharing of production and test/development environments; the virtual configuration must be secured such that vulnerabilities in one function cannot impact the security of other functions; and attached devices, such as USB/serial devices, should not be accessible by all virtual instances.

“…clear segmentation of functions and segregation of networks with different security levels” and “the virtual configuration must be secured such that vulnerabilities in one function cannot impact the security of other functions,” eh? I don’t see how anyone can expect to meet this requirement in any system underpinned with a virtualized infrastructure stack (hardware or software) whether it’s multi-tenant or not.  One vulnerability in the hypervisor makes this an impossibility.  Add in management, storage, networking. This basically comes down to trusting in the sanctity of the hypervisor.

Additionally, all virtual management interface protocols should be included in system documentation, and roles and permissions should be defined for managing virtual networks and virtual system components. Virtualization platforms must have the ability to enforce separation of duties and least privilege, to separate virtual network management from virtual server management.

Special care is also needed when implementing authentication controls to ensure that users authenticate to the proper virtual system components, and distinguish between the guest VMs (virtual machines) and the hypervisor.

The rest is pretty standard stuff, but if you read the guidance sections (next post) it gets even more fun.  This is why the subjectivity, expertise and experience of the QSA is so related to the quality of the audit when virtualization and cloud are involved.  For example, let’s take a sneak peek at section 2.2.1, as it is a bit juicy:

2.2.1 Implement only one primary function per server to prevent functions that require different security levels from co-existing on the same server. (For example, web servers, database servers, and DNS should be implemented on separate servers.) Note: Where virtualization technologies are in use, implement only one primary function per virtual system component.

I  acknowledge that there are “cloud” providers who are PCI certified at the highest tier.  Many of them are SaaS providers.  Many simply use their own server stacks in co-located facilities but due to their size and services merely call themselves cloud providers — many aren’t even virtualized per the description above.   Further, there are also methods of limiting scope and newer technologies such as tokenization that can assist in solving some of the information-centric issues with what would otherwise be in-scope data, but they offset many of the cost-driven efficiencies marketed by mass-market, low-cost cloud providers today.

Love to hear from an IaaS public cloud provider who is PCI certified (to the VM boundary) with customers that are in turn certified with in-scope applications and cardholder data or even a SaaS provider who sits atop an IaaS provider…

Just read this first before responding, please.

/Hoff


Buck Woody listed Windows Azure Security Links on 11/1/2010:

Research shows that companies that are considering a "cloud" platform have various concerns, and that security is at the top of that list. I've put together a list of the resources I use for explaining our security posture, and the steps that you need to take to be secure in Windows and SQL Azure. I'll try and keep this list current – if you don't see something that you need, leave me a comment below and I'll research that for you.


Security in any technology should use a multi-layered approach, and that holds true for cloud computing as well. There are things that Microsoft does for security, and things that you need to do to secure your own code and environment. As always, it’s best to discuss these items with a technical professional, but these links should provide you some good background to have those discussions.

This isn’t an exhaustive list; there will be other sources you can use for that, but I have it in a format that I think is easy to follow. Most of the links I show here have references to yet other sources as you need them.

General Information on Cloud Computing Security:

· General Security Whitepaper – answers most questions: http://blogs.msdn.com/b/usisvde/archive/2010/08/10/security-white-paper-on-windows-azure-answers-many-faq.aspx

· Windows Azure Security Notes from the Patterns and Practices site: http://blogs.msdn.com/b/jmeier/archive/2010/08/03/now-available-azure-security-notes-pdf.aspx

· Great Overview of Azure Security: http://www.windowsecurity.com/articles/Microsoft-Azure-Security-Cloud.html

· Azure Security Resources: http://reddevnews.com/articles/2010/08/19/microsoft-releases-windows-azure-security-resources.aspx

· Cloud Computing Security Considerations: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=68fedf9c-1c27-4642-aa5b-0a34472303ea&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+MicrosoftDownloadCenter+%28Microsoft+Download+Center

· Security in Cloud Computing – a Microsoft Perspective: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=7c8507e8-50ca-4693-aa5a-34b7c24f4579&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+MicrosoftDownloadCenter+%28Microsoft+Download+Center

Physical Security for Microsoft’s Online Computing:

· The Global Foundation Services group at Microsoft handles our physical security. It’s quite robust, and meets ISO 27001 and SAS-70 requirements. More here: http://www.globalfoundationservices.com/security/index.html

· Microsoft’s Security Response Center: http://www.microsoft.com/security/msrc/

Software Security for Microsoft’s Online Computing:

· Windows Azure is developed using the Trustworthy Computing Initiative - you should follow this as well: http://www.microsoft.com/about/twc/en/us/default.aspx and http://msdn.microsoft.com/en-us/library/ms995349.aspx

· Identity and Access in the Cloud: http://blogs.msdn.com/b/technology_titbits_by_rajesh_makhija/archive/2010/10/29/identity-and-access-in-the-cloud.aspx

Security Steps you should take:

· Securing your cloud architecture, step-by-step: http://technet.microsoft.com/en-us/magazine/gg296364.aspx

· Security Guidelines for Windows Azure: http://redmondmag.com/articles/2010/06/15/microsoft-issues-security-guidelines-for-windows-azure.aspx

· Best Practices for Windows Azure Security: http://blogs.msdn.com/b/vbertocci/archive/2010/06/14/security-best-practices-for-developing-windows-azure-applications.aspx

· Active Directory and Windows Azure: http://blogs.msdn.com/b/plankytronixx/archive/2010/10/22/projecting-your-active-directory-identity-to-the-azure-cloud.aspx

· Understanding Encryption (great overview and tutorial): http://blogs.msdn.com/b/plankytronixx/archive/2010/10/23/crypto-primer-understanding-encryption-public-private-key-signatures-and-certificates.aspx

· Securing your Connection Strings: http://blogs.msdn.com/b/sqlazure/archive/2010/09/07/10058942.aspx

· Getting started with Windows Identity Foundation (WIF) quickly: http://blogs.msdn.com/b/alikl/archive/2010/10/26/windows-identity-foundation-wif-fast-track.aspx


<Return to section navigation list> 

Cloud Computing Events

Dan Scarfe announced the Inaugural Cloud Evening Meeting to be held 11/11/2010 at Skills Matter, 116-120 Goswell Road, London EC1V 7DP, UK:

Welcome to the inaugural Cloud Evening. Cloud Evening is the UK's only Cloud-focussed user group.

We want to build on the interest in the market about the Cloud and provide a forum for developers and architects to meet and discuss relevant topics.

We'll be doing 2 presentations per meeting and, of course, have free beer and pizza.

Please RSVP, as spaces are limited. You can register for the event at http://cloudeve.ning.com/events/inaugural-cloud-evening

The following is the agenda for our inaugural Cloud Evening:

6:00pm-6:30pm: Registration

6:30pm-7:30pm: Real world experience of a startup using Azure (by @RossDScott)

We've seen the demos and written the "Hello World," but is it really a good idea? This session will take you through the first 3 months of a new startup developing on the Azure platform.

  • An introduction to the technical concepts (from a coding point of view)
  • Cost driven development, a new way of thinking.
  • Data storage, should I go with the easy option or change my relational mindset?
  • What's it actually going to cost me?
  • A few tips and tricks

7:30pm: Break for Beer & Pizza

8:00pm-9:00pm: PDC 2010 Round Up (by @MarkRendle)

Mark will give a round up of the Cloud announcements at this year's PDC, plus a deep dive into some of the new capabilities of the Azure platform.

Speakers Bios

Ross Scott (@RossDScott) - For the last year Ross has been bootstrapping a startup business in his evenings and weekends. The platform of choice for Ross was Azure, as this gave him the path of least resistance and the greatest confidence.

Ross has been a developer for 10 years and has been contracting for the last 3.

Mark Rendle ???


Adron Hall (@adronbh) reported on 11/2/2010 that he has a Windows Azure (w/ AWS) Presentation Coming Up on 11/10/2010 at 1:00 PM PST:

I have a presentation coming up next week on the 10th.  If you're interested in cloud computing, specifically around storage, you should tune in.  I'll be covering the basics and some of the architectural ideas, uses, and more around Windows Azure Storage and the comparable Amazon Web Services storage services.  I'll also be noting a few of my ongoing projects that you might, if you're into cloud bits, get a kick out of or want to join.

To tune in to the presentation, swing over to https://www.clicktoattend.com/invitation.aspx?code=147809. There is registration information on the page. The presentation will start at 1 PM PST on the 10th of next week and run until about 1:45 PM. We'll make the meeting live about 12:45 for early arrivals, and after about 1:45 there will be a question and answer session. I hope to have a good bit of conversation afterwards discussing the uses, architectures, and patterns around storage use with cloud services.

I hope you’ll join me.


<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Matthew Weinberger reported Google Sues US Government Agency over SaaS & Cloud Contract in an 11/02/2010 post to the MSPMentor blog:

If that headline made you do a double-take, you're not alone. I was surprised when I read a Seattle Times blog indicating Google was suing the US Department of the Interior for only taking bids from Microsoft partners reselling the BPOS/Office 365 cloud solution. The Google lawsuit definitely has cloud channel partner implications. Here's the update.

Just the facts, per that blog: Google and Ohio-based reseller partner Onix Networking say they went after a five-year cloud migration project worth $59 million, but the Department of the Interior said it was only interested in Microsoft BPOS solutions due to security concerns. Google met with department officials but couldn't convince them of the security of its own Google Apps cloud suite.

And now Google’s literally made a federal case of it, saying that preferring Microsoft to the exclusion of other e-mail systems was arbitrary and could potentially end up costing taxpayers millions in licenses — Microsoft Office 365 is slightly more expensive than Google Apps (though Microsoft boosters say you get what you pay for), and that adds up.

And, of course, it would be a major PR blow if Google lost the suit. Google has been making the Google Apps suite government-friendly, and having a court rule against it could kill a lot of the search giant's government momentum.

Moreover, there are channel implications here. Google is essentially going to court over a channel partner losing a deal. Neither Microsoft nor the Department of the Interior are commenting on the suit. But we’ll be watching it closely, so stay tuned.


Bob Warfield reported Dell Buys Boomi: Right Inline With My Cloud Strategy in an 11/2/2010 post to the Enterprise Irregulars blog:

Just read that Dell is buying Cloud data integration company Boomi.  That's right in line with the focus on data strategy I've recommended for Cloud vendors.  I'm not sure how many more companies in this space are available to be picked up.  IBM picked up Cast Iron Systems, which was another great catch.

Just a refresher on why these companies are so pivotal, because it amounts to two key observations:

First, Clouds have latency, which creates a network effect.  It’s easier for apps in the same Cloud to talk to each other than it is to go across Clouds.  Hence Clouds are going to accrete applications based on which ones need to talk to each other.

Second, in the SaaS world, integration is a tremendous competitive and transaction friction issue.  No Enterprise application is an island, they all need to talk to some other application.  If that integration has to be done without the leveraging benefits of a supporting technology, if it has to be done strictly as a service, that adds a lot of friction to the transaction.  That friction can slow down sales momentum.  In addition, from a competitive standpoint, SaaS apps are often bought by the Business with the idea that they will cause minimal support overhead for IT.  Whether IT buys into that or not, the availability of suitable integration technology is going to determine how happy everyone is.  If IT has to constantly dive in and bandaid the integration, nobody is happy.  If IT can get comfortable at the outset that the integration solution will be high quality, everyone will be a lot happier.

These data integration acquisitions are very strategic to the Cloud space.  Good on Dell for going after Boomi.


Jeff Barr asked What Can I Say? Another Amazon S3 Price Reduction! in this 11/1/2010 post:

We've reduced the prices for Amazon S3 storage again. As is always the case, the cost to store your existing data will go down. This is markedly different from buying a hard drive at a fixed cost per byte and is just one of the many advantages of using cloud-based storage. You can also count on Amazon S3 to deliver essentially infinite scalability, eleven nines of durability (99.999999999%), and your choice of four distinct geographic locations for data storage.

So, starting November 1, 2010, you'll see a reduction of up to 19% in your overall storage charges on a monthly basis. We've created a new pricing tier at the 1 TB level, and we have removed the current 50 - 100 TB tier, thereby extending our volume discounts to more Amazon S3 customers.

The new prices for standard storage in the US Standard, EU - Ireland, and APAC - Singapore regions are as follows:

[Pricing table not reproduced; see the Amazon S3 pricing page for current rates.]

Reduced Redundancy storage will continue to be priced 1/3 lower than standard storage in all regions.
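
Since the table image didn't survive, here's a quick sketch of how tiered, volume-discounted storage pricing like S3's is computed; the tier sizes and rates below are placeholders, not Amazon's actual figures:

using System;

class TieredPricingDemo
{
    static void Main()
    {
        // Placeholder tiers (GB in each tier, $ per GB-month); these are
        // NOT Amazon's published rates -- see the S3 pricing page for those.
        double[] tierSizesGB = { 1024, 49 * 1024, double.MaxValue };
        double[] ratesPerGB = { 0.14, 0.125, 0.11 };

        double storedGB = 75 * 1024; // example: 75 TB stored this month
        double bill = 0, remaining = storedGB;

        // Each tier's rate applies only to the bytes that fall within it.
        for (int i = 0; i < tierSizesGB.Length && remaining > 0; i++)
        {
            double inTier = Math.Min(remaining, tierSizesGB[i]);
            bill += inTier * ratesPerGB[i];
            remaining -= inTier;
        }

        Console.WriteLine("Monthly storage charge: {0:C}", bill);
    }
}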

The full price list can be found on the Amazon S3 page. We'll continue to work relentlessly to drive our costs down so that we can pass the savings along to you!

We've got several more announcements related to S3 coming up in the near future, so stay tuned.

The S3 team is hiring Software Development Engineers, a Technical Program Manager, System Engineers, Administrators, and Product Managers. More information and instructions for applying can be found on the Amazon S3 Jobs page.


<Return to section navigation list> 
