Monday, March 12, 2012

Windows Azure and Cloud Computing Posts for 3/12/2012

A compendium of Windows Azure, Service Bus, EAI & EDI, Access Control, Connect, SQL Azure Database, and other cloud-computing articles.


Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:


Azure Blob, Drive, Table, Queue and Hadoop Services


No significant articles today.


<Return to section navigation list>

SQL Azure Database, Federations and Reporting

No significant articles today.


<Return to section navigation list>

MarketPlace DataMarket, Social Analytics, Big Data and OData

Chris Woodruff (@cwoodruff) blew his own horn when reporting Chris Woodruff Speaking at Code PaLOUsa 2012 Conference in a 3/12/2012 post to the Microsoft NSI Partner blog:

Perficient’s National Microsoft Cloud Practice Lead Chris Woodruff will be speaking at this week’s Code PaLOUsa 2012 Conference in Louisville, KY, covering the Open Data Protocol (OData) during a pre-conference workshop on Thursday and two OData presentations on Friday and Saturday. Tickets are still available at the conference home page.

Here is the list of Chris’ workshop and presentations:


Adam Hurwitz reported Support added for Cloud Numerics format by Codename “Data Transfer” in a 3/12/2012 post to the Microsoft Codename “Data Transfer” Blog:

Microsoft Codename "Cloud Numerics" is a SQL Azure Lab that lets you model and analyze data at scale. Now when you want to upload a file with Microsoft Codename "Data Transfer" to Windows Azure Blob storage for use in Cloud Numerics, you can choose to have the file converted to the Numerics Binary Format. This only applies to CSV and Excel files that contain numerical data ready for analysis with Cloud Numerics.

When uploading to blob storage, you will be presented with a choice of output format.

When you select Numerics Binary Format, you will receive additional options regarding the file that you are uploading.

You can run a model in Windows Azure after the data transfer of the Numerics Binary Format file completes. Here is a C# example of a model that computes eigenvalues using Cloud Numerics:

public static void Main(string[] args)
{
    // Step 1: Initialize the Microsoft.Numerics runtime to create
    // and operate on Microsoft Numerics distributed arrays
    // DO NOT REMOVE THIS LINE
    NumericsRuntime.Initialize();

    // Set up the Azure values used in the Data Transfer
    string account = "tbd";   // Azure Account
    string key = "tbd";       // Azure Storage Key
    string container = "tbd"; // Azure Container
    string file = "tbd";      // Name of Numerics Binary Format File

    // Load the Numerics Binary Format file into a distributed array
    var sr = new SequenceReader(account, key, container, file, 0);
    var berlinAdapterResult = Loader.LoadData<double>(sr);
    long[] shape1 = berlinAdapterResult.Shape.ToArray();
    int ndims1 = berlinAdapterResult.NumberOfDimensions;
    Console.WriteLine("Container: {0}, File name: {1}", container, file);
    for (int i = 0; i < ndims1; i++)
    {
        Console.WriteLine("Dimension {0} has length = {1}", i, shape1[i]);
    }

    // Calculate the eigenvalues
    var result = Decompositions.EigenValues(berlinAdapterResult);
    Console.WriteLine("Eigen values :\n {0}", result.ToString());

    // Shut down the Microsoft.Numerics runtime
    NumericsRuntime.Shutdown();
    Console.WriteLine("Numeric Binary Format Successfully Read.");
    Console.WriteLine("Hit enter to continue ... ");
    Console.ReadLine();
}

To learn more about deploying this model, please visit SQL Azure Labs Microsoft Codename “Cloud Numerics”.



<Return to section navigation list>

Windows Azure Access Control, Service Bus and Workflow

No significant articles today.


<Return to section navigation list>

Windows Azure VM Role, Virtual Network, Connect, RDP and CDN

No significant articles today.


<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Maarten Balliauw (@maartenballiauw) reported Pro NuGet is finally there! on 3/12/2012:

Short version: Install-Package ProNuget or http://amzn.to/pronuget

It’s been a while since I wrote my first book. After telling everyone that writing a book is horrendous (try writing a chapter per week after your office hours…) and that I would never write one again, my partner-in-crime Xavier Decoster and I had the same idea at the same time: what about a book on NuGet? So here it is: Pro NuGet is fresh off the presses (or on Kindle).

Special thanks go out to Scott Hanselman and Phil Haack for writing our foreword. Also big kudos to all who’ve helped us out now and then and did some small reviews. Yes Rob, Paul, David, Phil, Hadi: that’s you guys.

Why a book on NuGet?

Why not? At the time we decided we would start writing a book (September 2011), NuGet had been out there for a while already. Yet most users then (and still today) were using NuGet only as a means of installing packages, with some creating packages. But NuGet is much more! And that’s what we wanted to write about. We did not want to create a reference guide on what NuGet commands were available. We wanted to focus on best practices we’ve learned over the past few months using NuGet.

Some scenarios covered in our book:

  • What’s the big picture on package management?
  • Flashback last week: NuGet.org was down. How do you keep your team working if you depend on that external resource?
  • Is it a good idea to auto-update NuGet packages in a continuous integration process?
  • Use the PowerShell console in VS2010/11. How do I write my own NuGet PowerShell Cmdlets? What can I do in there?
  • Why would you host your own NuGet repository?
  • Using NuGet for continuous delivery
  • More!

I feel we’ve managed to cover a lot of concepts that go beyond “how to use NuGet vX” and instead have given as much guidance as possible. Questions, suggestions, remarks, … are all welcome. And a click on “Add to cart” is also a good idea ;-)


Michael Washam (@MWashamMS) reported a Windows Azure PowerShell Cmdlets 2.2.2 Release on 3/12/2012:

Windows Azure PowerShell Cmdlets (v2.2.2)

We have a brand new release of the Windows Azure PowerShell cmdlets that we hope will make getting started and scripting with the cmdlets a much easier task.

The new release can be downloaded from its CodePlex project site here.

Getting Started Improvements

In 2.2.2 we have added a start menu link that starts a PowerShell session with the Windows Azure cmdlets already loaded. We have also added a Start Here link that shows how to complete the setup and a short tour about the capabilities of the Windows Azure PowerShell cmdlets and release changes.

Subscription Management Improvements

We have taken the subscription management improvements from the 2.2 release and made them much better. Specifically, we have added the ability to persist your subscription settings into your user profile. This functionality allows you to set the subscription data once and then, in new scripts or PowerShell sessions, just select the subscription you want to use without the need to specify the subscription ID, certificate and storage accounts each time.

Code Snippet One: Setting Subscription Data

     $subid = "{subscription id}"
     $cert = Get-Item cert:\CurrentUser\My\CERTTHUMBPRINTUPPERCASE

     # Persisting Subscription Settings
     Set-Subscription -SubscriptionName org-sub1 -Certificate $cert -SubscriptionId $subid

     # Setting the current subscription to use 
     Select-Subscription -SubscriptionName org-sub1    

Call the Set-Subscription cmdlet with your certificate and subscription ID. Set-Subscription will persist the certificate thumbprint and subscription ID to C:\Users\{username}\AppData\Roaming\Windows Azure PowerShell Cmdlets\DefaultSubscriptionData.xml, associated with the subscription name.

This functionality supports adding multiple subscriptions to your configuration so you can manage each individually within the same script simply by calling Select-Subscription with the subscription name.

Code Snippet Two: Setting the Default Subscription

        Set-Subscription -DefaultSubscription org-sub1    

Snippet two demonstrates setting the default subscription to use if you do not set one with Select-Subscription.

Code Snippet Three: Associating Storage Accounts with your Subscription

     # Associate two storage accounts with the subscription
     Set-Subscription -SubscriptionName org-sub1 -StorageAccountName mystoragename1 -StorageAccountKey mystoragekey1
     Set-Subscription -SubscriptionName org-sub1 -StorageAccountName mystoragename2 -StorageAccountKey mystoragekey2

     # Specify the default storage account to use for the subscription
     Set-Subscription -SubscriptionName org-sub1 -DefaultStorageAccount mystoragename1    

Snippet three shows that you can associate multiple storage accounts with a single subscription. All it takes to use the correct storage account is to set the default before calling a cmdlet that requires a storage account.

Code Snippet Four: Specifying the Subscription Data File Location

       # Overriding the default location to save subscription settings
       Set-Subscription -SubscriptionName org-sub1 -Certificate $cert -SubscriptionId $subid -SubscriptionDataFile c:\mysubs.xml

       # Retrieving a list of subscriptions from an alternate location
       Get-Subscription -SubscriptionDataFile c:\mysubs.xml    

Each of the subscription cmdlets takes a -SubscriptionDataFile parameter that allows you to specify which XML file to use for operations.

Code Snippet Five: MISC Subscription Management

        # Returns all persisted settings 
       Get-Subscription

       # Removes mysub2 from persisted settings
       Remove-Subscription -SubscriptionName org-sub2

       # Removing a storage account from your persisted subscription settings
       Set-Subscription -SubscriptionName org-sub1 -RemoveStorageAccount mystoragename1    

Other Usability Improvements

We have made many of the cmdlets simpler to use by allowing more parameters to be optional with default values.

  • -Label parameter is now optional in New-AffinityGroup, Set-AffinityGroup, New-HostedService, New-StorageAccount, New-Deployment and Update-Deployment.
  • -Slot parameter is now optional in New-Deployment and Update-Deployment (Production slot is used by default).
  • -Name parameter is now optional in New-Deployment (a Globally Unique Identifier value is used by default).

In addition to the defaults we provided some needed fixes to unblock certain scenarios.

  • Get-Deployment now returns $null if no deployment was found in the specified slot (an error was thrown in previous versions).
  • -Package and -Configuration parameters now accept UNC paths in New-Deployment and Update-Deployment.

Breaking Changes

With improvements like these we did have to make some sacrifices.
Before you download the latest build please review the list below because we have a few breaking changes.

  • -DefaultStorageAccountName and -DefaultStorageAccountKey parameters were removed from Set-Subscription. Instead, when adding multiple accounts to a subscription, each one needs to be added with -StorageAccountName and -StorageAccountKey or -ConnectionString. To set a default storage account, use Set-Subscription –DefaultStorageAccount {account name}.
  • -SubscriptionName is now mandatory in Set-Subscription.
  • In previous releases, the subscription data was not persisted between PowerShell sessions. When importing subscription settings from a publishsettings file downloaded from the management portal, the Import-Subscription cmdlet optionally saved the subscription information to a file that could then be restored using Set-Subscription thereafter. This behavior has changed. Now, imported subscription data is always persisted to the subscription data file and is immediately available in subsequent sessions. Set-Subscription can be used to update these subscription settings or to create additional subscription data sets.
  • Renamed -CertificateToDeploy parameter to -CertToDeploy in Add-Certificate.
  • Renamed -ServiceName parameter to -StorageAccountName in all Storage Service cmdlets (added “ServiceName” as a parameter alias for backward compatibility).

Summary

In the 2.2.2 release we have made a number of fixes such as accepting UNC paths and fixing Get-Deployment to not throw an error on empty slots. We have also substantially improved the getting started experience and how you can manage your Windows Azure subscriptions from PowerShell.

The new release can be downloaded from http://wappowershell.codeplex.com.


Himanshu Singh (@himanshuks) posted Real World Windows Azure: Interview with Nathan Brouwer, Global Partner Support Manager at Figlo on 3/12/2012:

As part of the Real World Windows Azure series, I spoke with Nathan Brouwer, Global Partner Support Manager at Figlo, about how the independent software vendor (ISV) is using Windows Azure to expand internationally and realize a 60 percent cost reduction. Read the customer success story. Here’s what he had to say:

Himanshu Kumar Singh: What does Figlo do?

Nathan Brouwer: Figlo develops financial planning solutions that advisors use in working with their clients. Our flagship product, the Figlo Platform, offers professionals a rich set of data visualization tools for tracking client asset performance. With our solution, advisors can provide clients with anywhere access to account data while offering a variety of ways to interact with their financial information.

HKS: What opportunities do you see with cloud technologies?

NB: The Figlo Platform has an advanced multicurrency calculation engine that can be adapted to fit local financial regulations, which is an advantage in international markets. We wanted to start distributing the solution globally using our established subscription-pricing model. We saw an opportunity to shift from hosting the Figlo Platform in our corporate data center to hosting it in a cloud environment to gain scalability and reduce operating expenses.

HKS: What cloud solutions did you explore?

NB: We began evaluating cloud platforms in late 2010. We had previously used Amazon Elastic Compute Cloud (EC2) to host the Figlo Platform for a customer in the United States. Our developers found that the tools and processes for deploying applications to the Amazon cloud service required several manual, time-consuming steps. We looked into Windows Azure, the Microsoft cloud services development, hosting, and management environment. Because Windows Azure delivers true platform-as-a-service capabilities through a set of easy-to-use tools, it offers a much more complete solution for our business.

HKS: What steps did you take to launch the Figlo Platform in the cloud?

NB: In January 2011, we spent a few weeks moving the Figlo Platform to Windows Azure and SQL Azure. The solution is built on the ASP.NET web application framework and uses Microsoft SQL Server 2008 database management software. It also incorporates user interface components built on Microsoft Silverlight browser plug-in technology. Our developers used Windows Azure Tools for Visual Studio to complete the application conversion and testing processes in two weeks.

HKS: What business model are you using for delivering the Figlo Platform in the cloud?

NB: Since releasing the Figlo Platform on Windows Azure, we have continued to deliver the solution as a subscription service. We offer discounted pricing for large-volume multitenant deployments. Customers who require a single-tenant configuration pay a yearly hosting fee in addition to the base subscription price. We still operate on the same basic business model, but now we can get customers set up in less time, which means we can start realizing revenue faster.

HKS: Describe some of the other benefits of Windows Azure for Figlo.

NB: Figlo spends 10,000 euros (U.S.$14,700) annually per instance to host the Figlo Platform on Windows Azure. Previously, we spent 25,000 euros (U.S.$36,700) annually per instance to maintain a hosted server environment. We calculate a total savings of U.S.$22,000 per instance per year, which is a 60 percent cost reduction. In addition, with Windows Azure, we can tailor the configuration of the infrastructure to fit each customer’s exact needs and deploy our solution in minutes, rather than hours or days.

Read the full story. Learn how others are using Windows Azure.


Steve Plank (@plankytronixx, pictured below) asked and answered Running an illegal website? Use the cloud to check it… in a 3/12/2012 post:

John Mannix from Governor Technology talks about http://cookielaw.org, a service that uses crowd-sourcing to harvest cookies from volunteers who install an agent on their machines, then provides analysis and advice on whether a particular site's cookies fall within the limits of the EU directive that came into force on 27 May 2011 (and that will be enforced in May 2012).

An interesting discussion on how they architected the service to use Azure features so they could easily scale up/down depending on the number of cookies that were harvested.

They didn’t know how many cookies they were going to be receiving – anything from 10 cookies to 10 trillion. But they knew they needed an answer to the possibility that literally anything could happen, hence they chose Windows Azure to host the service and were able to scale up and down as cookies were collected.


<Return to section navigation list>

Visual Studio LightSwitch and Entity Framework 4.1+

Jan Van der Haegen (@janvanderhaegen) posted Monthly LightSwitch column in MSDN Magazine, and other “all things LightSwitch”… on 3/12/2012:

I closed my last post two weeks ago by stating that

I’ll probably be in the LightSwitch 11 zone during the next 48 hours straight.

Judging by the number of new posts ever since, you might think that I have instead been glued to my laptop for 14 days straight, not sharing or writing about my LightSwitch experiments with the community, but those of you who know me in person would probably beg sometimes for a way to have me shut my mouth about LightSwitch for more than 5 minutes… I’ve just been more quiet on this blog…

LightSwitch covered in a monthly column in MSDN Magazine.

I’m extremely proud to announce that MSDN Magazine – the Microsoft Journal for Developers – will soon have its own monthly web column about “all things LightSwitch”, written by yours truly. As one could expect, the articles will deal with both “what comes out of the LightSwitch box” and “thinking outside the LightSwitch box”; from the looks of the proposals as they are now, I’ll be your guide on a LightSwitch-flavored tour through Windows Phone 7.5 (“Consuming a LightSwitch OData service from a Windows Phone application”), application building and Metro design style (“Building data centric applications faster than ever with Microsoft Visual Studio LightSwitch 11 beta” and “Taking LightSwitch for a ride on the Metro”), Azure ACS (“Logging on to a LightSwitch application using a Windows Live ID”) and Windows Workflow Foundation (“LightSwitch 11 – coding’s optional”) mashups.

The first post will be published quite soon (and rest assured I’ll make sure you know where to find it), but I wanted to go ahead and thank everyone involved, with a couple of people in particular: Michael Washington (www.LightSwitchHelpWebsite.com), Michael Desmond and Sharon Terdeman (1105 Media), and Beth Massi (Microsoft).

“My first eBook”.

Working title, by the way. ;-)

Another item on my “LightSwitch list of things to do” is writing my first eBook. The kind people at Syncfusion – Deliver Innovation with Ease – have asked me to author an eBook for them about “all things LightSwitch”. My only request to them was that the eBook be available for free to the general public, which they immediately and gladly accepted. The eBook should be written by May 1st; rest assured I’ll make sure you know where to find it!

“The LightSwitch startup”.

Another working title, I’m afraid.

I’ve already mentioned it a couple of times, and on April 2nd it’s finally happening. Besides my fulltime employment at Centric, I’ll be working for my own startup that will do “all things LightSwitch”: LightSwitch evangelism (training, blogging, writing, speaking, …), LightSwitch consulting, building LightSwitch solutions, and LightSwitch extensions. Actually, my second LightSwitch extension is almost, almost, – really – almost ready to be released in beta, and I promise it will blow your mind!

So anyways, I haven’t been so active on my blog lately, but have instead been playing with LightSwitch 11 beta and other “all things LightSwitch”. If you did anything fun lately that you’d like to share, have a good name for the eBook, or have a suggestion for the startup’s name, know that I’d just love it if you hit that comment button below to let me and the other readers know!


Julie Lerman (@julielerman) described EF Code First Migrations, Update-Database outside of Visual Studio in a 3/12/2012 post:

In a recent blog post comment, someone asked “can you please tell them [EF Dev Team] some developers would like to use the Power Shell command script and not PM console to update database.”

If you look inside the package folder for Entity Framework 4.3 (or 4.3.1, or whatever the current version is in the future), there’s a tools directory, and inside it a migrate.exe command. It is the same as the update-database command, with all of the same parameters.



<Return to section navigation list>

Windows Azure Infrastructure and DevOps

No significant articles today.


<Return to section navigation list>

Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds

Kevin Remde (@KevinRemde) announced the availability of NEW System Center 2012 CTPs for Windows Server “8” beta in a 3/12/2012 post:

Late on Friday, the System Center Virtual Machine Manager Blog released a post enthusiastically titled “System Center 2012 CTP for Windows Server ‘8’ Beta support now available!!!”. It would seem the System Center team is looking for early feedback on running System Center against the next version of Windows Server (currently codenamed Windows Server “8”). They are particularly interested in people trying out the even newer Virtual Machine Manager and Data Protection Manager components, to test how they work with some of the new and exciting features in Server 8.

“But…this is a CTP? I thought System Center 2012 was already in the Release Candidate phase.”

It is. But the upcoming release of System Center 2012 won’t have support for Windows Server “8”, which will still be in beta (or maybe RC – I don’t know the exact timings here) when System Center is released. Server 8 features such as the new VHDX disk format, “shared nothing” live migrations, and live storage migration won’t be directly supported in System Center 2012 until after Server “8” is released, probably in the form of an update to System Center. These CTPs are the first publicly available versions of those updated System Center components.

“These ‘CTPs’? There’s more than one?”

Well… I say that because what you’ll be getting is not the entire System Center 2012 product set, but only the CTPs of the new Virtual Machine Manager and Data Protection Manager components that have the necessary new functionality.

So..

  1. CLICK HERE to get the CTPs of VMM and DPM, and
  2. Give your FEEDBACK HERE.

---

Are you going to try this out? I will, very soon. Let us know your thoughts or experiences with this in the comments (after providing feedback to the SC team, of course).


<Return to section navigation list>

Cloud Security and Governance

Bruce Kyle continued his Windows Azure Security Best Practices – Part 4: What Else You Need to Do series on 3/12/2012:

So which security threats are mitigated by the Windows Azure environment and which security threats must be mitigated by the developer?

The paper, Security Best Practices for Developing Windows Azure Applications, describes what you should consider as the key threats to an application running on Windows Azure. It shows specifically which mitigations Azure provides, which require you to call APIs, and which you need to handle yourself. (It does not address regulatory compliance issues.)


What You Should Handle

I’ll take a selection of the threats, describe what you should do, and provide a reference for where you can learn more about how to implement each mitigation in your code. The list comes from the Windows Azure Security Overview, but putting it into practice is up to you.

This is not an exhaustive list. And as you can tell from the previous parts in this series, you tailor your security practices based on your own application needs.

Threat of Tampering

Tampering/disclosure of credentials or other sensitive application data. Use Windows Identity Foundation and HTTPS mutual authentication for SSL connections.

  • See How to: Manage Service Certificates for information on adding certificates to the store, associating certificates with services, and updating certificates. In these scenarios, the IT manager and the service developer are presumed to be two different people, but they may also be the same person.
  • See Windows Identity Foundation helps simplify user access for developers by externalizing user access from applications via claims and reducing development effort with pre-built security logic and integrated .NET tools.

Threat of Repudiation

Audit log collection, storage and analysis. Use monitoring and diagnostic APIs as needed; transfer logs to private blob/table storage over HTTPS (a minimal configuration sketch follows the references below). See:

  • Take Control of Logging and Tracing in Windows Azure in MSDN Magazine.
  • Azure Monitor for code to monitor your Azure-hosted applications in real-time. It includes a library for capturing runtime process information to cloud table storage; and also a desktop application for viewing the captured information in real-time.
  • Using Windows Azure Diagnostics (about a third of the way down the page). Windows Azure provides integrated features for monitoring, logging, and tracing within the deployed environment; generally referred to as diagnostics. These features are most suited to monitoring performance of applications over time, though they can also be used for debugging purposes as well.
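
As a rough illustration of this approach, the C# sketch below (assuming the Windows Azure SDK 1.x diagnostics API and the standard Diagnostics connection-string plugin name, not code from the article) schedules periodic transfer of trace logs from a role instance to your private storage account:

    using System;
    using Microsoft.WindowsAzure.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Start from the default configuration and schedule periodic
            // transfers of warning-level (and higher) trace logs to the
            // diagnostics storage account (private table storage).
            DiagnosticMonitorConfiguration config =
                DiagnosticMonitor.GetDefaultInitialConfiguration();
            config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Warning;
            config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

            // Point the connection string at an https endpoint so the
            // transfer happens over SSL.
            DiagnosticMonitor.Start(
                "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
                config);

            return base.OnStart();
        }
    }
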
Threat of Information Disclosure

Disclosure of arbitrary secrets in blob/table/queue storage. Pre-encrypt secret data prior to uploading. Do not store decryption keys in Windows Azure Storage.
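
As a minimal sketch of the pre-encryption idea (standard .NET cryptography; the class and method names are mine, not from the paper), encrypt the payload before handing the bytes to the storage client, and keep the key in your own key store rather than in Windows Azure Storage:

    using System.IO;
    using System.Security.Cryptography;

    public static class SecretProtector
    {
        // Encrypts a payload with AES before it is uploaded to blob/table/queue
        // storage. The key (and the IV you persist alongside the ciphertext)
        // must live outside Windows Azure Storage.
        public static byte[] EncryptForUpload(byte[] plaintext, byte[] key, out byte[] iv)
        {
            using (var aes = Aes.Create())
            {
                aes.Key = key;
                aes.GenerateIV();
                iv = aes.IV;

                using (var ms = new MemoryStream())
                using (var cs = new CryptoStream(ms, aes.CreateEncryptor(), CryptoStreamMode.Write))
                {
                    cs.Write(plaintext, 0, plaintext.Length);
                    cs.FlushFinalBlock();
                    return ms.ToArray();
                }
            }
        }
    }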

Disclosure of Shared Access Signatures. Use HTTPS to securely transfer Shared Access Signatures to intended recipients and set appropriate permissions on containers. See Managing Access to Blobs and Containers for how to use Shared Access Signatures.
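
For illustration, a short-lived, read-only container Shared Access Signature built with the 1.x storage client library might look like the following (the container name is hypothetical); the resulting token should only ever be handed to recipients over HTTPS:

    using System;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    public static class SasExample
    {
        public static string CreateReadOnlySas(CloudStorageAccount account)
        {
            CloudBlobClient client = account.CreateCloudBlobClient();
            CloudBlobContainer container = client.GetContainerReference("reports");

            // Grant read-only access for a short window; nothing wider than needed.
            var policy = new SharedAccessPolicy
            {
                Permissions = SharedAccessPermissions.Read,
                SharedAccessStartTime = DateTime.UtcNow,
                SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(30)
            };

            // Append the returned signature to the container (or blob) URI and
            // transfer it to the intended recipient over HTTPS only.
            return container.GetSharedAccessSignature(policy);
        }
    }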

Denial of Service Threat

Request flooding at the customer code/app level. Implement application-level request throttling if necessary. See Autoscaling and Windows Azure.
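
A request throttle can be as simple as a per-client counter over a fixed time window. The sketch below is an in-memory illustration only (the type and names are mine, not from the paper); in a multi-instance role you would typically back the counters with a shared store such as a cache or table storage:

    using System;
    using System.Collections.Concurrent;

    public class RequestThrottle
    {
        private readonly int _limit;
        private readonly TimeSpan _window;
        private readonly ConcurrentDictionary<string, Tuple<DateTime, int>> _counters =
            new ConcurrentDictionary<string, Tuple<DateTime, int>>();

        public RequestThrottle(int limit, TimeSpan window)
        {
            _limit = limit;
            _window = window;
        }

        // Returns false when the caller (identified by IP address, API key, etc.)
        // has exceeded the allowed number of requests in the current window.
        public bool Allow(string clientKey)
        {
            DateTime now = DateTime.UtcNow;
            Tuple<DateTime, int> entry = _counters.AddOrUpdate(
                clientKey,
                _ => Tuple.Create(now, 1),
                (_, current) => now - current.Item1 > _window
                    ? Tuple.Create(now, 1)                             // window expired: reset
                    : Tuple.Create(current.Item1, current.Item2 + 1)); // same window: count
            return entry.Item2 <= _limit;
        }
    }

A web role would call Allow early in request processing and return an HTTP 503 (or similar) response when it comes back false.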

Elevation of Privilege

Misconfiguration of Service/Application settings. Must scope all cookies and the document.domain property to the service subdomain (e.g., http://contoso.cloudapp.net) and NOT to *.cloudapp.net.

Cross-site Request Forgery Attacks against the web role. Use ASP.NET defenses. See Take Advantage of ASP.NET Built-in Features to Fend Off Web Attacks.
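
In an ASP.NET MVC web role, for example, the built-in anti-forgery token is one such defense. A hedged sketch (the controller and action are invented for illustration): the view emits the token with Html.AntiForgeryToken() inside the form, and the action validates it.

    using System.Web.Mvc;

    public class ProfileController : Controller
    {
        // The corresponding form in the view includes @Html.AntiForgeryToken(),
        // and the attribute below rejects posts that lack a matching token.
        [HttpPost]
        [ValidateAntiForgeryToken]
        public ActionResult UpdateEmail(string newEmail)
        {
            // ... apply the change only after the anti-forgery check has passed
            return RedirectToAction("Index");
        }
    }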

Cross-site Scripting Attacks against the web role. Use the Anti-XSS Library.
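
With the Anti-XSS Library, encoding untrusted input before it is rendered looks roughly like this (assuming the Microsoft.Security.Application.Encoder class from AntiXSS 4.x; the method and markup are illustrative):

    using Microsoft.Security.Application;

    public static class CommentRenderer
    {
        // Encode untrusted input for the exact context in which it is emitted.
        public static string RenderComment(string userSuppliedText)
        {
            string safeAttribute = Encoder.HtmlAttributeEncode(userSuppliedText); // attribute value
            string safeHtml = Encoder.HtmlEncode(userSuppliedText);               // element content
            return "<div title=\"" + safeAttribute + "\">" + safeHtml + "</div>";
        }
    }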

API fuzzing attacks on interfaces exposed by the web role. Fuzz all interfaces and endpoints unique to code exposed to the web (or any other services).

Apply security-testing tools including fuzzing tools. "Fuzzing" supplies structured but invalid inputs to software application programming interfaces (APIs) and network interfaces so as to maximize the likelihood of detecting errors that may lead to software vulnerabilities.

File Fuzzing attacks against custom, application-provided file parsers. Fuzz test all proprietary network protocol or file format parsers.

Patching of security vulnerabilities at the Web Role/customer code level. Have a security response and updating plan in place.

You can get tools to assist in your fuzz testing from the Microsoft Security Development Lifecycle (SDL) site. The SDL includes tools and processes that you can use freely. For example, you can use:

SQL Azure

I added this section about SQL Azure because the platform offers some additional ways to mitigate the threats that you should know.

SQL Azure Security Administration. Security administration in SQL Azure Database is similar to security administration for an on-premise instance of SQL Server. Managing security at the database-level is almost identical, with differences only in the parameters available. Because SQL Azure databases can scale to one or more physical computers, SQL Azure Database uses a different strategy for server-level administration.

SQL Azure Firewall. You can lock down your database to provide access to only those users or computers who are authorized. To help protect your data, the SQL Azure firewall prevents all access to your SQL Azure server until you specify which computers have permission. The firewall grants access based on the originating IP address of each request.
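
Firewall rules can also be managed programmatically. A hedged sketch (it assumes the sp_set_firewall_rule stored procedure in the server's master database; the rule name and IP range are placeholders):

    using System.Data;
    using System.Data.SqlClient;

    public static class SqlAzureFirewall
    {
        // Adds or updates a server-level firewall rule by calling the firewall
        // stored procedure in the master database of the SQL Azure server.
        public static void AllowRange(string masterConnectionString,
                                      string ruleName, string startIp, string endIp)
        {
            using (var connection = new SqlConnection(masterConnectionString))
            using (var command = new SqlCommand("sp_set_firewall_rule", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                command.Parameters.AddWithValue("@name", ruleName);
                command.Parameters.AddWithValue("@start_ip_address", startIp);
                command.Parameters.AddWithValue("@end_ip_address", endIp);

                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }

    // Example (hypothetical values): allow a small office range.
    // SqlAzureFirewall.AllowRange(masterConnectionString, "OfficeNetwork",
    //                             "203.0.113.1", "203.0.113.20");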

Resources
Next Up

Windows Azure Security Best Practices – Part 5: Claims-Based Identity, Single Sign On. User identification represents the keys to accessing data and business processes in your application. In this section, I describe how you can separate user identity and the roles of your user out of your application and make it easier to create single sign on applications.


Ed Moyle (@securitycurve) described PCI virtualization compliance: Three steps for PCI compliance in the cloud in a 3/12/2012 post to TechTarget’s SearchCloudSecurity.com:

imageIf your organization uses Infrastructure as a Service (IaaS) or hosts an internal private cloud -- and the scope of your PCI DSS compliance (i.e. your cardholder data environment or CDE) includes that environment -- you’ve probably realized how challenging PCI virtualization compliance can be. Not only are you required to periodically revalidate service providers (i.e., Requirement 12.8, “Maintain a program to monitor service providers’ PCI DSS compliance status at least annually”), but ongoing operational problems can raise issues during annual compliance audits as well.

It’s a challenging situation to be in: You need to keep the environment compliant, but implementing new controls isn’t always in the cards for IaaS through a service provider because you may not control the infrastructure directly. For private deployments, PCI compliance in the cloud can be difficult because keeping pace with controls and control updates is expensive, and the main purpose of cloud computing is cost-control.

Fortunately, there are a few strategies that organizations can implement -- free of charge -- that help to ensure the environment stays compliant over a long-term deployment. None of these suggestions are rocket science, but putting them in practice can help enforce PCI virtualization compliance when it really gets challenging: after one or two years in the field.

PCI virtualization compliance: Assign a “shelf life”

VM sprawl -- the uncontrolled proliferation of VMs -- is one of the biggest operational challenges when it comes to private or public IaaS. Rogue VMs are always bad for security generally, but add PCI DSS to the mix, and you almost always have a recipe for non-compliance.


This is because even spun down, disused, or vestigial snapshots represent areas where DSS security controls should be applied. And because the PCI Standards Council makes it clear in its virtualization guidance that compliance of the hypervisor and VM are linked (i.e. if the guest is in the CDE so is the host and vice-versa), anywhere you move that image to, whether using vMotion or plain old copy/paste, becomes a part of your CDE. Imagine what happens over time as these new VMs -- created on the fly and unintentionally becoming part of the scope of compliance -- start cropping up in a QSA’s audit sample.

One effective strategy to combat this is to define an “upper bound” timeframe within which these VMs can persist. Keep a formalized inventory of mandatory VMs (maybe it’s your official virtual inventory and there’s a controlled process requiring approval to get on it); VMs that are not on the inventory beyond a certain age (for example, three months) get automatically deleted. That way, even if a problematic VM is created, there’s a hard and fast end date for how long it can stay around. Obviously a process preventing it from being created in the first place would be ideal, but implementing a formal inventory and time limits means that even if a mistake happens, the pain is limited.
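
To make the "upper bound" concrete, here is a small hypothetical sketch of the selection logic (the VirtualMachineRecord type and the inventory source are invented for illustration): anything not on the approved inventory and older than the cutoff gets flagged for deletion or escalation.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public class VirtualMachineRecord
    {
        public string Name { get; set; }
        public DateTime Created { get; set; }
    }

    public static class VmShelfLife
    {
        // Returns the VMs that are past their shelf life and not on the
        // formal, approved inventory.
        public static IEnumerable<VirtualMachineRecord> FindExpired(
            IEnumerable<VirtualMachineRecord> discoveredVms,
            ISet<string> approvedInventory,
            TimeSpan maxAge)
        {
            DateTime cutoff = DateTime.UtcNow - maxAge;
            return discoveredVms.Where(vm => !approvedInventory.Contains(vm.Name)
                                             && vm.Created < cutoff);
        }
    }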

PCI virtualization compliance: Monitor dormant VMs

If a VM image is kept dormant (i.e. spun down) for long periods of time, critical security functions don't always happen; for example, patching may not occur and malware signatures may not get updated. When that dormant image finally does get activated again, it may take some time before automated processes have a chance to bring that image to a secure state, and the longer the image has been on the shelf, the longer this process may take. This poses a problem when images contain cardholder data, have access to cardholder data, or live on the same hypervisor as other images that contain cardholder data because of PCI DSS requirements on patching (Requirement 6.1) and malware signatures (Requirement 5.2).

To combat this, a manual or automated process that monitors the “freshness date” of dormant images, and spins them up to make sure they get updates, can ensure secure configuration over time. So, should your QSA evaluate one of those images, required security parameters are current.

PCI virtualization compliance: Follow a naming convention

While inventories of virtual images are critical, they’re notoriously unreliable in practice. It’s not unusual, for example, to have multiple, conflicting inventories specifying different functions and owners for the same VM. Recall that a PCI Report on Compliance (RoC) requires a list of all devices with access to cardholder data so conflicting inventories won’t pass muster.

Ideally, not letting the inventory get out of hand in the first place is best, but since that proves challenging in practice, using a naming convention that makes any image instantly recognizable is helpful. For example, explicitly labeling which images contain or have access to cardholder data (as part of the naming convention) helps prevent moving a VM to a hypervisor that doesn’t have PCI DSS-required security controls.

Each of the steps described above is designed to build in a “safety net” for the kind of bad behavior that challenges virtual environments operationally. They’re simple, but you’d be surprised how often organizations overlook them and end up in lengthy audit and remediation cycles.

Ed Moyle is a senior security strategist with Savvis as well as a founding partner of Security Curve.

Full disclosure: I’m a paid contributor to TechTarget’s SearchCloudComputing.com blog.


<Return to section navigation list>

Cloud Computing Events

Mike Benkov posted Announcing Windows Azure Kick Start Events on 3/12/2012:

It’s spring and once again we’re back on the road, helping people explore what’s possible and see how to get started with cloud computing. Along with the webcast series I’ve been doing (http://benkotips.com/s2nCloud), we’ll be bringing the content to your town. The schedule so far is listed below.

By the way if you have MSDN you have free cloud benefits! This video shows you how to get your risk free access to Azure to explore and learn the cloud or activate your MSDN Cloud benefits here. If you have questions send our Azure team members an email: msnextde at microsoft.com.


Peter Myers of Solid Quality Mentors (@SolidQ) will present a Microsoft Business Intelligence QuickStart Program: Understanding the Benefits of Microsoft Self-Service BI webinar on 3/16/2012 at 11:00 PDT:

Do your end users have the capability to extend and leverage your business intelligence (BI) solution? What role - if any - does self-service BI have in your organization? 

Join us as recognized BI expert Peter Myers discusses the key benefits of Microsoft's unique approach:

  1. Quickly deploy scalable self-service BI solutions
  2. Easily integrate data from multiple sources
  3. Provide users with powerful dashboards and flexible data interaction capabilities
  4. Enable users to intuitively collaborate with other decision makers
  5. Understand how self-service BI complements and completes corporate BI solutions

You don't want to miss this free informative webinar: Register Here

Online - Log in information will be provided in your confirmation email



<Return to section navigation list>

Other Cloud Computing Platforms and Services

Nancy Gohring (@idgnancy) asserted “Microsoft joined Google and Amazon Web Services in cutting the cost of cloud services” in a deck for her Cloud economics improving for users in wake of price cuts article of 3/9/2012 for InfoWorld’s Cloud Computing blog:

Cloud computing price cuts by Google, Amazon, and now Microsoft may indicate that businesses are discovering that moving to the cloud doesn't always save costs.

On Friday, Microsoft dropped the price on its Azure Storage Pay-as-you-Go service and lowered the price of its six-month storage plan. The cost to use Azure Extra Small Compute has been cut in half.

Earlier this week, Google cut the price of its Cloud Storage service and Amazon Web Services dropped prices on Elastic Compute Cloud, Relational Database Service, ElastiCache and Elastic Map Reduce. Amazon highlighted how the new price cuts will particularly reduce costs for big businesses.

The price cuts come as the service providers try to convince potential new customers to move to the cloud model, one expert said.

"The myth is that cloud computing is always cost-effective," said David Linthicum, CTO of Blue Mountain Labs, a company that advises businesses on moving to the cloud. [Ed. note: David Linthicum is also an InfoWorld blogger.] "In many instances, it's not."

Service providers like Amazon are likely hearing this from potential customers, he said. "They're losing on deals where people are going to buy hardware and software because it's cheaper than leasing their services," he said. "They're reacting by reducing price to capture market."

Amazon, Microsoft, and Google are likely willing to drop prices quite a bit, he said. The cloud services are not a primary business for any of the companies, he noted. Plus, once any of them wins a customer, they have a good chance of holding on to the customer for a long period of time because it's difficult for users to switch cloud service providers. For those reasons, Linthicum expects plenty of additional price drops in the future, he said.

The cuts could point to other issues in the market, another analyst said. "They probably signal a developing overcapacity problem in the market for basic public cloud hosting," said Marc Brien, vice president of research at Domicity, a consulting and analysis company.

The continued price cuts by Amazon -- this is its 19th price reduction -- indicate a strategy of trying to drive away competition. "They likely have the most favorable cost structure of anyone in the public hosting industry and want to force a rationalization of the industry in their favor by putting up prices that make the whole thing unappealing and unprofitable for anyone else," Brien said.

Microsoft, Google, and Amazon may also be bracing for the near future when other big companies with deep pockets, like Hewlett-Packard and telecommunications providers, enter the market, he said.


<Return to section navigation list>
