Tuesday, March 13, 2012

Windows Azure and Cloud Computing Posts for 3/13/2012

A compendium of Windows Azure, Service Bus, EAI & EDI, Access Control, Connect, SQL Azure Database, and other cloud-computing articles.


Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:


Azure Blob, Drive, Table, Queue and Hadoop Service

Andrew Brust (@andrewbrust) continued his MapReduce series with MapReduce translations, from skyscrapers to Hadoop clusters, posted to his new Big on Data blog for ZDNet on 3/13/2012:

In my last post [see below], I explained MapReduce in terms of a hypothetical exercise: counting up all the smartphones in the Empire State Building. My idea was to have the fire wardens count up the number of smartphones, by platform, in each suite on the warden’s floor. For each suite, they’d write the smartphone platform name and number of handsets on a separate sheet of paper. They’d put all the sheets in an envelope, and drop the envelope in the mail chute. Down in the lobby, I’d get all the envelopes out of the mailboxes, open them, sort the sheets by platform name into separate piles, then stuff the piles into new envelopes and send a couple each to the fire wardens on the 10th, 20th and 30th floors. The fire wardens on those three floors would enter final tallies for each platform they had sheets for onto a new sheet of paper. Back in my office, the three final tally sheets’ data would then get entered into a spreadsheet and I’d have some valuable data, given that the Empire State Building has enough people to warrant its own ZIP code.

In this post, I want to correlate some of the actors and objects in the skyscraper scenario to MapReduce vocabulary. If you can follow the last post and this one, you’ll understand MapReduce pretty well, all without getting bogged down in lines of code. And if you can do that, Big Data will make a lot more sense.

In our skyscraper analogy, the work the fire wardens did in getting the per-suite, per-platform handset count would be the Map step in our job, and the work the fire wardens on the 10th, 20th and 30th floors did, in calculating their final platform tallies, would be the Reduce step. A Map step and a Reduce step constitute a MapReduce job. Got that?

Let’s keep going. For the building, the collection of suite numbers and smartphone types in each would represent the keys and values in our input file. We split/partitioned that file into a smaller one for each floor which, just like the original input file, would have suite number as the key and smartphone platform data as the value. For each suite, our mappers (the fire wardens on each floor) created output data with the smartphone platform name as key and the count of handsets for that suite and platform as the value. So the mappers produce output which eventually becomes the Reduce step’s input.

But a little work is required there. The work I did in the lobby, making sure all data for a given key from the mappers’ output went to one, and only one, reducer (i.e. one of the fire wardens on the 10th, 20th and 30th floors) as input, made me the master node (as opposed to a worker node, where the mapper and reducer work takes place). As it happens, I sent data for two keys to each reducer node, which is fine. Each reducer then processed its input and generated an output file of its own with platform as key and the grand total handset count in the building for that platform as the value. My assistant, acting as the output writer, concatenated the reducers’ output into a single output file for the job. All of the fire wardens and I acted as the nodes in the cluster.

Belaboring the analogy ever more, imagine that building management didn’t want to be outdone by the new World Trade Center building under construction downtown, and so added 50 new floors to the building. My method for getting smartphone platform counts would still work. I’d merely scale out my cluster by enlisting the new floors’ fire wardens (i.e. I’d add nodes to the cluster). At any point in time, if one of them quit or got fired, I’d merely enlist the help of the new fire warden on that floor, since, in my example, I’d be treating each node as mere commodity hardware. (That would be kind of impersonal of me, but I have work to do). Best of all, I’d get the answer just as quickly this way, since each fire warden would be counting his or her input data at the same time (and thus processing in parallel).

A Map step is typically written as a Java function that knows how to process the key and value of each input and then emit a new set of key-value pairs as output. The Reduce step is written as another function that does likewise. By writing just two functions in this way, very complex problems can be calculated rather simply. And by enlisting more and more nodes in our Hadoop cluster, we can handle more and more data. Hadoop takes care of routing all the data amongst nodes and calling my Map and Reduce functions at the appropriate times.
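To make the two-function idea concrete, here is a minimal, in-memory sketch of a Map and a Reduce step written in C# terms. It mirrors the smartphone-counting analogy rather than the actual Hadoop Java API, and the suite and platform data are made up for illustration:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    class MapReduceSketch
    {
        // Map: one suite (key) and its handsets (value) become (platform, count) pairs,
        // just as each fire warden produced one sheet per platform per suite.
        static IEnumerable<KeyValuePair<string, int>> Map(string suite, IEnumerable<string> handsets)
        {
            return handsets.GroupBy(platform => platform)
                           .Select(g => new KeyValuePair<string, int>(g.Key, g.Count()));
        }

        // Reduce: one platform (key) and all of its per-suite counts become a grand total,
        // like the final tally sheets on the 10th, 20th and 30th floors.
        static KeyValuePair<string, int> Reduce(string platform, IEnumerable<int> counts)
        {
            return new KeyValuePair<string, int>(platform, counts.Sum());
        }

        static void Main()
        {
            // Hypothetical input: suite number -> smartphones found in that suite.
            var input = new Dictionary<string, string[]>
            {
                { "Suite 7201", new[] { "iOS", "Android", "iOS" } },
                { "Suite 7202", new[] { "Windows Phone", "iOS" } }
            };

            // The shuffle/sort done in the lobby: group mapper output by platform key.
            var shuffled = input.SelectMany(kv => Map(kv.Key, kv.Value))
                                .GroupBy(pair => pair.Key, pair => pair.Value);

            foreach (var group in shuffled)
                Console.WriteLine(Reduce(group.Key, group)); // e.g. [iOS, 3]
        }
    }

In a real Hadoop job the framework, not your code, performs the grouping and routing shown in Main; you supply only the two functions.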

But Map and Reduce functions can be written in many languages, not just Java. Moreover, software can be used to provide abstraction layers over MapReduce, allowing us to give instructions in another manner and rely on the software to generate MapReduce code for us. To provide just two examples, a Hadoop companion called Hive provides a SQL (structured query language)-like abstraction over MapReduce and another companion technology called Pig does so with its own Pig Latin language.

We’ll talk more about Hive, Pig and other products in future posts, as we will about languages that can be used in place of Java to write MapReduce code. We’ll look at how analytics and data visualization products use Hive, Pig and other technology to make themselves Big Data-capable. And we’ll understand that it all comes down to MapReduce, which isn’t so mysterious after all.


Andrew Brust (@andrewbrust) started a MapReduce series with The MapReduce 101 story, in 102 stories on 3/12/2012:

A little over a year ago when I started my company, I was able to find a small office in the Empire State Building. I’m on the 72nd floor facing south, so the view is amazing. I wish I had better Internet service options though; I’ve realized it’s just not that attractive to service providers to pull their cables to the top of such a tall, old building. In time, though, I’ve decided that the building might be more tech-savvy than I realized. That’s because, with only a little contrivance, I believe I can use the building to explain MapReduce, without using code.

One of the things I do in my work is follow market share figures for various smartphone platforms. I typically rely on the findings of the larger analyst firms to figure out what’s what, but I dream of one day getting my own numbers instead. It struck me recently that if I had a little more pull at the ESB, I could just total up the different smartphone handsets, by platform, in the building. After all, the building has a good distribution of city and suburban dwellers, different income levels, and a large enough population to have its own 5-digit ZIP code.

As I continue this data-gathering daydream, I think through how I could go about counting all these cell phones. I certainly couldn’t do it myself. Even if I had the patience and the speed, the inefficiencies in getting between floors would hurt my performance, as the elevators can be slow, and no employee in the building is happy about people who get on and then off one floor later.

But then I have an idea. Since every floor has a fire warden whose job it is to count people, maybe I could use those folks as my agents on each floor. Each floor fire warden could go into each suite on his or her floor and write down, on a separate piece of paper for each major smartphone platform, the platform name and total number of handsets. I could tell the fire wardens to create a separate sheet of paper, per suite, for iOS, Android, Blackberry, Windows Phone, webOS and Symbian and could also tell them to disregard other phones. Each fire warden would likely have multiple sheets per platform, of course, since each sheet’s count would correspond to a particular suite on the floor. But that’s just fine.

When the fire wardens were done in all suites, they could put all their sheets in an envelope and drop it in the mail chute (in the hypothetical case that the chutes were still in use). I could be waiting in the lobby, and when I knew that all fire wardens had completed their work, I could go around to the mailboxes at each chute and collect the envelopes with the smartphone count sheets.

As a next step, I’d go sit at the security desk, open all the envelopes and sort the sheets, by smartphone platform, into six new piles, putting each pile in an envelope. I’d have an intern bring two of the new envelopes up to the 10th floor, another intern bring two more to the 20th, and my third intern bring the last two to the 30th floor. The fire wardens on each of those three floors would open an envelope, total up the counts on the individual sheets, and write down the platform name and that grand total on a new sheet of paper. He or she would then repeat the process for the other envelope, writing its platform name and handset total on the same sheet of paper as the first. Each of my three interns would then take these new sheets from the fire wardens up to my office on the 72nd floor, where an assistant would be waiting. He’d then put the data from all three sheets of paper into a single spreadsheet, with platform names in column A and handset counts in column B. And with that I’d have my smartphone stats for the building. With the help of the friendly fire wardens, I’d get my answer pretty quickly too.

This example’s not perfect, and I might update this post over time to make it more so. But if you can understand the process I just explained, then you can understand MapReduce. Just let this stuff sink in for a bit. In my next post, I’ll introduce the vocabulary (jargon?) used in MapReduce-speak to explain what the building employees, suite numbers, smartphone platform names, handset counts, fire wardens, sheets of paper, and the final spreadsheet represent.


<Return to section navigation list>

SQL Azure Database, Federations and Reporting

No significant articles today.


<Return to section navigation list>

MarketPlace DataMarket, Social Analytics, Big Data and OData

Elizabeth Maher explained Using LightSwitch OData Services in a Windows 8 Metro Style Application in a 3/13/2012 post:

One of the exciting new features in Visual Studio 11 Beta is that applications you create with LightSwitch automatically expose Open Data Protocol (OData) services. For good articles covering how OData and LightSwitch work together, see Enhance Your LightSwitch Applications with OData and Creating and Consuming LightSwitch OData Services by Beth Massi. The ability of LightSwitch applications to produce OData services allows other clients to read the data produced by your LightSwitch application. In this walkthrough, we will cover how to read an OData service created by a LightSwitch application in a Windows 8 Metro style application.


For the purposes of this article, the Contoso application will be the LightSwitch application used. Instructions for setting up this sample may be found here: Contoso Construction - LightSwitch Advanced Sample (Visual Studio 11 Beta). We will add a Windows 8 Metro style application to the Contoso solution, change the Metro style application to read the construction project data from the Contoso LightSwitch application, and set up Visual Studio to allow us to debug both the LightSwitch application and the Windows 8 Metro style application at the same time. This walkthrough will focus on the development process of creating a LightSwitch application at the same time as creating a Windows 8 Metro style application. The deployed Windows 8 Metro style application is expected to use the published OData service url.

OData Service from LightSwitch Application

We will be reading the Projects table to get a list of construction projects. To see the data feed from the OData service, you can simply type the service url into Internet Explorer. The url for LightSwitch applications will be in the form [base url]/[DataSourceName].svc. For a Contoso Construction project that is being debugged, the url will be http://localhost:#####/ApplicationData.svc, where ##### is the current port number being used. To see a list of the projects, add ‘/Projects’ to the url to make http://localhost:#####/ApplicationData.svc/Projects. Below is a picture of the OData feed in Internet Explorer, when the feed reading view is turned off.

[Screenshot: the Projects OData feed displayed in Internet Explorer]

To turn off the reading view in Internet Explorer, choose Tools -> Internet Options. Click the Content tab and click the Settings button under the Feeds and Web Slices section. Uncheck ‘Turn on feed reading view’.

[Screenshot: the Feed and Web Slice Settings dialog in Internet Options]
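Because this is a standard OData feed, the usual OData system query options should also work against it. A few hypothetical examples against the Contoso feed (##### is again the current port number, and the property names come from the sample’s Projects table):

    http://localhost:#####/ApplicationData.svc/Projects?$top=5
    http://localhost:#####/ApplicationData.svc/Projects?$orderby=ProjectName
    http://localhost:#####/ApplicationData.svc/Projects?$filter=startswith(ProjectName,'A')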

Now that we know the url to get the data information for construction projects, let’s create a Windows 8 Metro style application that will list the projects in a touch-friendly user interface.

Creating a Windows 8 Metro Style Application

To complete this walkthrough, you will need a version of Visual Studio 11 Beta that has both the LightSwitch project templates and the Windows 8 Metro project templates. Visual Studio 11 Professional Beta, Visual Studio 11 Premium Beta and Visual Studio 11 Ultimate Beta will all have the necessary features. Download Visual Studio 11 Beta here.

For simplicity, we will be using the JavaScript Split Metro style application. Click File –> Add New Project. Select the Split Application template found under JavaScript \ Windows Metro style.

[Screenshot: the Add New Project dialog with the Split Application template selected]

Please note that you need a developer license to create a Metro style application. If creating a Metro style application for the first time, you will be asked to accept the terms of agreement and will be prompted for Microsoft account information.

Script Libraries

In this walkthrough, we will be using two libraries, jQuery and DataJS. See the resource links at the end of this document for more information concerning both libraries. The best way to install these libraries is to use the Package Manager. Open the Package Manager Console by selecting Tools -> Library Package Manager -> Package Manager Console. Type ‘install-package jquery’. Once that command completes, type ‘install-package datajs’ in the Package Manager Console. The Package Manager Console should look like the following.

[Screenshot: the Package Manager Console after installing the jquery and datajs packages]

After both packages are successfully installed, the Scripts folder should list all the necessary files.

[Screenshot: the Scripts folder listing the installed jQuery and DataJS files]

Modifying Metro Style Application

Let’s modify the Metro style application project to read the list of construction projects from ContosoConstruction. The first thing that must be done is to set up the project so we can use the DataJS and jQuery libraries. Open default.html and add references to DataJS and jQuery under the WinJS references by adding the lines

  <!-- jQuery references --> 
  <script src="/Scripts/jquery-1.7.1.js"></script> 
  <!-- datajs references --> 
  <script src="/Scripts/datajs-1.0.2.js"></script> 

Now open js\Default.js to create a variable for OData by adding the line

   var OData = window.OData;

The next thing that must be done is to define the groups that will be displayed on the home page. We will show all projects.

Open the js\data.js file. Replace the sample groups with the following.

    var sampleGroups = [ 
        {
            key: "allProjects", title: "All Projects", subtitle: "All Contoso projects.",
            backgroundImage: darkGray
        },
    ]; 

The next step is to load the actual data. To do this, the code to load the sample data must be replaced. The sample code should look like the following.

// TODO: Replace the data with your real data.
// You can add data from asynchronous sources whenever it becomes available.
sampleItems.forEach(function (item) {
     list.push(item);
});

Replace the sample code with the following code:

    //Generic function for loading data via a odata url
    function loadData(data, odataUrl, dataLoaded) {
        if (data) {
            return WinJS.Promise.as(data);
        }
        else {
            return new WinJS.Promise(function (complete, error, progress) {
                OData.read(odataUrl,
                function (data) {
                    complete(dataLoaded(data.results));
                },
                function (dataerror) {
                    error(dataerror);
                });
            });
        }
    }


    var projectsODataUrl = "http://localhost:#####/ApplicationData.svc/Projects";
    //TODO: Replace projectsODataUrl with url for deployed OData service
    //  before publishing this application.
    var _projects;
    //Loads projects
    function loadProjects() {
        loadData(_projects, projectsODataUrl, function (results) {
            _projects = results;
            return _projects;
        }).then(function (projects) {
            var items = [];
           
            $.each(projects, function (l, e) {
                var notes;
                if (e.Notes === null) {
                    notes = "";
                }
                else {
                    notes = e.Notes;
                }
                items.push({
                    displayName: e.ProjectName, subtitle: "Estimate: $" +
                        e.OriginalEstimate, description: "", content: notes
                });
            });
            showProjects(items.sort(), sampleGroups[0]);
        });
    }

    //Adds projects to binding list.
    function showProjects(items, itemGroup) {
        items.forEach(function (item) {
            list.push(
                {
                    group: itemGroup, title: item.displayName,
                    subtitle: item.subtitle, description: item.description,
                    content: item.content, backgroundImage: lightGray
                }
              )
        });
    }

    loadProjects();
Preparing for Debugging
Launching Projects Simultaneously

To debug the Metro style application that is connected to a non-published LightSwitch OData service, it is best to configure both projects to launch simultaneously on F5. To do this, open the Solution Property Pages. Select the Common Properties\Startup Project properties page. Select ‘Multiple startup projects’. Set ContosoConstruction and the Windows 8 Metro style application project to have a startup action of ‘Start’. Make sure that ContosoConstruction is listed before the Windows 8 Metro style project in the startup order.

[Screenshot: the Solution Property Pages with multiple startup projects configured]

Setting Capabilities for Metro Style Application

It is also necessary to set the capabilities for the Metro style application. Double-click the package.appxmanifest file of the Metro style application project in Solution Explorer. This will show the interface for modifying package properties. Click the Capabilities tab. Check the ‘Private Networks (Client & Server)’ capability to allow your application to connect to intranet resources. Check the ‘Enterprise Authentication’ capability only if your LightSwitch OData service uses Windows Authentication. This capability will allow the credentials of the current user to be used automatically, rather than having the user prompted for their credentials. Note – both of these capabilities are designed for enterprise applications and will not be found in a typical application from the Windows Store.

[Screenshot: the Capabilities tab of package.appxmanifest]
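For reference, checking those boxes simply adds capability entries to the underlying manifest XML, roughly like this sketch (element names are taken from the Consumer Preview schema, so they may shift in later builds):

    <Capabilities>
      <!-- Private Networks (Client & Server): allows intranet access -->
      <Capability Name="privateNetworkClientServer" />
      <!-- Enterprise Authentication: passes the current user's Windows credentials -->
      <Capability Name="enterpriseAuthentication" />
    </Capabilities>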

Getting Current Port Number for LightSwitch Project

You are almost ready to debug. It is important to note that an F5’ed LightSwitch application is not guaranteed to have the same port number every time the application is debugged. Put a breakpoint on the line where the LightSwitch OData url is defined and change it when needed. You can get the port number a few ways. If your LightSwitch application type is set to ‘Web’, you can get the port number out of the url in the browser.

[Screenshot: the port number visible in the browser’s address bar]

You can also use the task manager to get the port number. Open Task Manager and select the Details tab. Show the ‘Command Line’ column for the process. Find the ‘vslshost.exe’ process and look for the ‘/p:’ in the command line arguments. The ‘/p’ stands for port number. For example the command line might be, "C:\Users\CurrentUser\Documents\Visual Studio 11\Projects\Contoso\ContosoConstruction\bin\Debug\VslsHost.exe" /s /SuppressWarmup /AppBridgeInstanceName:2624 /AppBridgeServiceId:1 /p:13743 /UseMEFCache /GrantedPermissionsForDevelopment:"Microsoft.LightSwitch.Security:SecurityAdministration" /n:"ContosoConstruction" /MainThreadCulture:en-US

The number 13743 would be the port number.

Start Application

You are ready to hit F5! Modify the port number for the ContosoConstruction OData service, if needed. You should see a single button with the title ‘All Projects’.

[Screenshot: the app home page showing the ‘All Projects’ group]

After clicking the button, all the construction projects for ContosoConstruction will be shown. See below for an example.

[Screenshot: the list of Contoso construction projects in the Metro style app]

Publishing

Each project must be published independently. First publish the LightSwitch application to your production server. Documentation, including some walkthroughs, explaining how to publish a LightSwitch application can be found at http://msdn.microsoft.com/en-us/library/ff852001.aspx. Once the LightSwitch application is deployed and the production OData service url is known, the url used in the Metro style application must be updated before it is deployed. See http://msdn.microsoft.com/en-us/library/windows/apps/hh454036(v=vs.110).aspx for documentation regarding packaging your Windows 8 Metro style application using Visual Studio.

Wrap Up

I hope this post has helped you understand how to call LightSwitch OData services from a Windows Metro Style application. For more information, see the links below.

 



<Return to section navigation list>

Windows Azure Access Control, Service Bus and Workflow

No significant articles today.


<Return to section navigation list>

Windows Azure VM Role, Virtual Network, Connect, RDP and CDN

No significant articles today.


<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Nick Harris (@cloudnick) reported Updated Release of Windows Azure Toolkit for Windows 8 Consumer Preview in a 3/13/2012 post to the Windows Azure blog:

The Windows Azure Toolkit for Windows 8 makes it easy for developers to create Windows Metro Style applications that can harness the power of Windows Azure. Today we released v1.2.0 on Codeplex and you can download the self-extracting package from here. This version of the toolkit is a refresh that includes required updates for Windows 8 Consumer Preview.

If you are building Windows 8 Metro Style applications with Windows Azure and have not yet downloaded the toolkit, I would encourage you to do so. Why? As a quick demonstration, this video shows how you can use the toolkit to build a Windows 8 Metro Style application that uses Windows Azure and the Windows Push Notification Service to send Toast, Tile and Badge notifications in 4 minutes.

The core features of the toolkit include:

  • Automated Install – Scripted install of all dependencies including Visual Studio 2010 Express and the Windows Azure SDK on Windows 8 Consumer Preview.
  • Project Templates – Windows 8 Metro Style app project templates in Dev 11 in both XAML/C# and HTML5/JS with a supporting C# Windows Azure Project for Visual Studio 2010.
  • NuGet Packages – Throughout the development of the project templates we have extracted the functionality into NuGet Packages for Push Notifications and the Sample ACS scenarios. You can find the packages here and full source in the toolkit under /Libraries.
  • Samples – Five sample applications demonstrating different ways Windows 8 Metro Style apps can use ACS and Push Notifications
  • Documentation – Extensive documentation including install, file new project walkthrough, samples and deployment to Windows Azure.

You can read more about the toolkit on the project site’s wiki where we have documentation that will help you get started using the toolkit step by step. Additionally, I will be writing more detailed technical posts about the toolkit on my blog.

Nick Harris is a Technical Evangelist for Windows Azure. Follow Nick at @cloudnick.

Nick provides his twitter alias, so it would be politic to change the default avatar.


The Windows Azure Toolkit Team posted an updated Windows Azure Toolkit for Windows 8 [Consumer Preview] to CodePlex on 3/13/2012:

Maximize the potential of your Windows Metro style app with Windows Azure. Building a cloud service to support rich Windows Metro style apps is even easier with the Windows Azure Toolkit for Windows 8. This toolkit has all the tools to make it easy to develop a Windows Azure service and deploy it to your users. In addition to documentation, this toolkit includes Visual Studio project templates for a sample Metro style app and a Windows Azure cloud project. This tool is designed to accelerate development so that developers can start enabling Windows 8 features, such as notifications, for their app with minimal time and experience. Use this toolkit to start building and customizing your own service to deliver rich Metro style apps.

Windows Azure
Windows Azure is a cloud-computing platform that lets you run applications and store data in the cloud. Instead of having to worry about building out the underlying infrastructure and managing the operating system, you can simply focus on building your application and deploying it to Windows Azure. Windows Azure provides developers with on-demand compute, storage, networking, and content delivery capabilities.

For more information about Windows Azure, visit the Windows Azure website. For developer focused training material, download the Windows Azure Training Kit or view the online Windows Azure Training Course.

Key Technologies

The Windows Azure Toolkit for Windows 8 uses the following technologies:

  • ASP.NET MVC 3
  • WCF REST Services
  • Windows Azure SDK 1.6
  • Windows Identity Foundation
  • Windows Push Notification Service (WNS)

Content

Learn More:

Videos:

As a brief preview of the toolkit, check out this 4-minute video to see how you can get a jump start using the Windows Azure Toolkit for Windows 8 to send Push Notifications with Windows Azure and the Windows Push Notification Service (WNS). This demonstration uses the Windows Azure Toolkit for Windows 8, which includes a full end-to-end scenario for Toast, Tile and Badge notifications.

To understand WNS, Windows Azure and Push Notifications, watch the in-depth Build conference video with Darren Louie and Nick Harris

Delivering notifications using the Windows Push Notification Service and Windows Azure

Blogs:

Email:

watwindows8@microsoft.com

I had problems installing the Toolkit. I had to load the Windows Azure SDK v1.6 manually, not from the Web Installer. I also received many errors stating that a “later version of the Web Installer was already installed.”

Also, I had previously installed the Visual Studio 11 preview but Dependency Checker didn’t recognize it. Possibly this occurred because I installed VS 10 Ultimate as a prerequisite for the toolkit after VS 11 Ultimate Preview. Dependency Checker recognized VS 10 SP1 as present. Performing a repair on VS 11 didn’t fix the problem with that version.


<Return to section navigation list>

Visual Studio LightSwitch and Entity Framework 4.1+

Julie Lerman (@julielerman) answered VS11 and EF5: Where’s that database that Code First created? on 3/13/2012:

Visual Studio 11 brings a new development database -- SQL Server Local Database. Bye bye SQL Server Express dependency.

I typically let Code First run with its default database of choice – up to now that’s been SQL Server Express -- when I’m creating simple samples where I don’t care too much about keeping the sample database around.

But I’m so used to opening up SSMS to look at detailed information about the database, especially when Code First is involved in creating its schema. I like to see what’s happening.

In Visual Studio 2010’s server explorer, I can’t typically glean the details I’m after. (Maybe I need more instruction here? Smile) For example, I have to open a column’s properties window to see details.

[Screenshot: column properties in the Visual Studio 2010 Server Explorer]

In SSMS, I prefer the view:

[Screenshot: the same table viewed in SSMS]

With EF5 however, the default database for code first is the new SQL Server 2012 LocalDb.

When you add EntityFramework 5 to your project it adds some configuration elements to the application’s config file and in there is where EF is setting the default ConnectionFactory for Code First to use LocalDb.

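If you open the config file after installing the EntityFramework package, the element in question looks something like the following sketch (the exact contents may vary by EF version):

    <entityFramework>
      <!-- Tells Code First to create databases on SQL Server 2012 LocalDb ("v11.0")
           when no explicit connection string is supplied -->
      <defaultConnectionFactory
          type="System.Data.Entity.Infrastructure.LocalDbConnectionFactory, EntityFramework">
        <parameters>
          <parameter value="v11.0" />
        </parameters>
      </defaultConnectionFactory>
    </entityFramework>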

I’m sure lots of people won’t think to look in there and will just be happy that the database is magically there. (Not me. I can’t bear not knowing how things work. Winking smile)

So if you want to look at your data, forget the Server Explorer. Check out the new SQL Server Object Explorer in Visual Studio. It will look very familiar … if you use SSMS, that is. They’ve pulled the explorer from SSMS into Visual Studio and improved upon it. Very nice!!

Also, take note that last time I checked, you could not open up localdb databases in SSMS. That may no longer be the case. But I don’t feel like installing full blown SQL Server on my virtual machine to verify since I now have what I need inside of Visual Studio.

To see your database in the new Object Explorer:

1. Click the New Database icon/glyph/blob (circled)

[Screenshot: SQL Server Object Explorer with the icon circled]

2. In the Connect to Server window, TYPE IN “(localdb)\v11.0”. Don’t click the dropdown unless you want to wait for the wizard to explore the outer reaches of the universe for every possibly accessible SQL Server instance.

ssoe2

3. Then after connecting, you can expand the new connection to explore your database, in detail, inside of Visual Studio. (yay)

[Screenshot: the database expanded in SQL Server Object Explorer]
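If you would rather address the same instance from a config file, an explicit connection string can point at LocalDb directly. This is a sketch with made-up context and database names:

    <connectionStrings>
      <!-- "(localdb)\v11.0" is the automatic SQL Server 2012 LocalDb instance -->
      <add name="BlogContext"
           connectionString="Data Source=(localdb)\v11.0;Initial Catalog=CodeFirstBlog;Integrated Security=True"
           providerName="System.Data.SqlClient" />
    </connectionStrings>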

There are some nice improvements over the SSMS 2008 UI I’m used to. For example, if you right click a table you can choose View Data and there is an option box in the viewer to choose how many rows you want to look at. That’s just one little example. Another example is if you do something like delete a table, you will get the options of creating a script or just executing the change on the database. I’m sure you’ll find lots of information on these types of changes.

Now when you are working with Code First and using default behavior, you know where to find the database it’s created for you and how to inspect that database.


Julie Lerman (@julielerman) explained Updating to Entity Framework v.Latest the Easy Way in a 3/13/2012 post to her Don’t Be Iffy blog:

Entity Framework is evolving rapidly which is why they are releasing via NuGet rather than being strapped to the .NET release cycle. (You can read more about the how’s and why’s of EF’s release cycle here: http://blogs.msdn.com/b/diego/archive/2012/01/15/why-entity-framework-vnext-will-be-ef5-and-nothing-else.aspx ).

The following is about keeping current with Entity Framework Code First and DbContext, not about upgrading the core API that is in .NET.
EF 4.1 – 4.3.1 work with .NET 4.0.
EF 5 (currently in beta) will work with .NET 4.5 (also currently in beta).

It’s recommended that you keep your apps that use EF (Code First/DbContext) up to date. The updates add functionality and fix some bugs, so this is a fairly safe prospect (granted there were a few problems for some very particular scenarios in the past but those have been corrected).

Thanks to the NuGet integration in Visual Studio it’s really easy to update EF assemblies across a solution without having to update each project that might need it. (You’ll need NuGet installed in VS which you can do via the Extension Manager.)

Right click the solution in Solution Explorer and click Manage NuGet Packages for Solution.

[Screenshot: the Manage NuGet Packages for Solution context menu]

Select Updates (circled in the image). The dialog will show you any packages for which updates are available. My solution has 5 projects and I am using EF 4.1 in four of them. So the tool sees that I’ve got those installed and that there’s a newer version available, so it presents that to me. Click Update.

[Screenshot: the Updates tab showing the available EntityFramework update]

Now I am presented with all of the projects that are using an out-of-date version of Entity Framework. By default, they are all checked to have the most current version installed. Click OK.

[Screenshot: the project-selection dialog with all projects checked]

As NuGet updates the packages in your projects, it will show the progress for each package. Here I can see that my console app project has just been updated from 4.1 to 4.3.1.

[Screenshot: package update progress for each project]
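If you prefer typing to clicking, the same solution-wide update can be run from the Package Manager Console; by default the command updates the package in every project that has it installed:

    PM> Update-Package EntityFramework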


See Elizabeth Maher explained Using LightSwitch OData Services in a Windows 8 Metro Style Application in a 3/13/2012 post in the MarketPlace DataMarket, Social Analytics, Big Data and OData section above.


<Return to section navigation list>

Windows Azure Infrastructure and DevOps

Stuart J. Johnston quoted me in his Microsoft Azure prices drop following Amazon, Google cloud rate cuts article of 3/13/2012 for TechTarget’s SearchCloudComputing.com blog:

After Amazon and Google both cut prices for some cloud infrastructure services early last week, Microsoft jumped on the bandwagon and lowered Windows Azure rates.

While the cuts aren't directly comparable because of variations in the way each company structures its cloud offerings, the timing reinforces the premise that cloud infrastructure costs are plunging. That, in turn, drags down competitors' prices across the board. And the race is on to pass at least some of those savings on to customers.

Thursday, in a "me too" move, Microsoft said it had cut rates both for pay-as-you-go storage in Windows Azure and for customers that choose to pay for storage on six-month contracts.

Additionally, the software giant slashed the price for an Extra Small Compute instance by half -- from $0.04 to $0.02 per hour, said Steven Martin, general manager of Windows Azure business strategy and planning, in a blog post.

"The price cuts were aimed at two major Windows Azure market segments,” said Roger Jennings, a Windows Azure MVP and developer. “Users new to Windows Azure or using it to host a website or blog on Extra Small Compute instances, and enterprises who are ready to pony up in advance can take advantage of commitments to six-month plans."


Last Monday, Amazon reduced rates for Elastic Compute Cloud (EC2), as well as Amazon Relational Database Service (RDS), Amazon ElastiCache and Amazon Elastic MapReduce. It was the nineteenth Amazon Web Services (AWS) pricing cut in six years, Amazon officials said.

As a partial comparison, a small Amazon-hosted website that cost $876 per year using pay-as-you-go on-demand pricing in 2006 would cost $250 per year today if the customer converted to a one-year or three-year contract. A "micro" Amazon EC2 instance with pay-as-you-go pricing currently costs $0.020 per hour, running on Linux or Unix, or $0.030 running on Windows.
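As a rough back-of-the-envelope check using those on-demand rates, a single micro instance running around the clock (about 8,766 hours per year) works out to roughly $0.020 × 8,766 ≈ $175 per year on Linux or Unix, or $0.030 × 8,766 ≈ $263 per year on Windows, before any reserved-pricing discounts.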

Tuesday, Google cut pricing for Google Cloud Storage, retroactive to March 1. The starting rate for customers storing less than a terabyte is now $0.12 per gigabyte, down a penny from the previous rate of $0.13. Cuts vary from 7.69% at the low end to 15% off -- a decrease from $0.10 per gigabyte to $0.085 -- to store between 100 and 500TB.

Microsoft, meanwhile, cut Azure's pay-as-you-go storage rate to $0.125 per gigabyte, a 12% drop from the previous rate of $0.14. Customers who pay for cloud storage on a six-month-contract basis see a reduction of 14%, Martin added.

Prices notwithstanding, Microsoft's Martin argues that value-added features differentiate Azure's services. For instance, he emphasized data protection features in Azure.

"While the reduced price ... provides cost benefits, geo-replication differentiates Windows Azure Storage from other services in market," Martin added.

Though these vendors say the price cuts are unrelated, industry watchers say the timing of these three announcements is more than coincidence. "The reduction in pay-as-you-go storage prices is basically meeting competition for what’s quickly becoming a commodity," Jennings said.

Full disclosure: I’m a paid contributor to SearchCloudComputing.com.


David Linthicum (@DavidLinthicum) asserted “With release of new iPad, we're reminded that IT must live with employee-owned mobile devices. The answer is in the cloud” in a deck for his The answer to IT's mobile dilemma is in the cloud article of 3/13/2012 for InfoWorld’s Cloud Computing blog:

The new iPad hits the streets this week (I pre-ordered mine), and IT is once again faced with supporting these devices as more employees walk in with them. This year's crop includes tablets, smartphones, and (still) netbooks. Next year, count on set-top boxes, such as Apple TV and Roku, and gaming consoles. The year after, you'll be dealing with mobile devices embedded in our cars.

The natural reaction of traditional enterprise IT is to toss these evil things into a bonfire just outside the lobby. In reality, supporting these devices leads to better productivity and happier employees -- and that advantage goes right to the bottom line.

What about security, privacy, network performance, and other risks that come with using these devices for business? The answer is to deal with mobile devices a bit differently than you've treated new technology in the past. Moreover, and most important to me, is how IT uses the cloud in support of these devices. But how?

The problem is that most enterprises believe they need to own and control all devices where business data is displayed to a user. Although I understand the urge, these days that wish is either impossible or too expensive. The world is moving toward a BYOD (bring or choose your own devices) strategy.

To find a solution to the BYOD challenge, first picture the world of IT 30 or 40 years ago, when users sat in front of dumb terminals. We've come full circle: We have the advantage of new technology to implement the retrofit of dumb terminals into today's devices. The key word is "abstraction," which means removing the devices and the underlying complexities from IT assets, including compute services and data services. Thus, the mobile devices function as mere terminals, providing a view into enterprise systems and data, typically operational and business intelligence data. These can be native applications running on the device or traditional Web-based applications that automatically adapt to mobile devices' form factors.

Read more.


<Return to section navigation list>

Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds

Thomas W. Shinder posted Yigal Edery’s (@yigaledery) Let’s Build Clouds with Windows Server “8” article on TechNet’s Private Cloud Architecture blog:

Hello Cloud fans!

It gives me great pleasure to kick off a new series of blogs about Windows Server 8 as a cloud-optimized OS! We've been operating in silent mode till now, with limited public exposure (mainly at \\BUILD) but now it's time to start sharing the great stuff with you. It's going to be fun!

When we think of building a cloud, be it a private cloud or public cloud intended to offer Infrastructure as a Service (IaaS), we think first about the underlying platform that enables this cloud to be built . For us, this is Windows Server. It provides the abstracted pool of resources (compute, storage and network) that you then use to place workloads on. It is the combination of Windows Server features coming from all different technology areas such as virtualization, networking, storage, clustering, automation, and much more - that when combined together, create a great cloud platform. Simply put, Windows Server 8 is the first ever operating system that was truly optimized for being the operating system of the cloud.

In the upcoming series of blogs, we will dive deep into the different pieces of the puzzle. We will look at specific technology areas in Windows Server 8 and talk about how each of them contributes to that cloud story, and discuss the impact of these features on how you can now architect a cloud. For today, let’s start by pointing you to some reference material that’s already out there and will hopefully help you get started in understanding how revolutionary Windows Server 8 is going to be for the cloud!

For starters, if you haven't done so already, you should read Bill Laing's blog announcing the Windows Server 8 beta.

To go straight into the complete list of new Windows Server 8 capabilities that make it such a cloud-optimized OS, make sure to read the following white paper and watch this presentation that I co-delivered at BUILD: Using Windows Server 8 for building private and public IaaS clouds. It talks in detail about each feature area and how it delivers value for cloud.

Last, if you're like me, then you want to start by playing with the bits, right? Want to get some hands-on experience? The easiest way for you to do that would be to get your hands on a few servers, and then use one of the two configuration guides describing step by step how to set up your own mini-cloud environment. The first one takes a more traditional approach to datacenter design, while the second really shows you the new cool ways you could architect your cloud using converged 10GbE networks and low-cost file server storage.

Hope you'll have as much fun using Windows Server 8 as we had building it!

Stay tuned for more exciting stuff on this blog and the Windows Server Blog from the server leadership team.

Thanks,
Yigal Edery
Principal Program Manager, Windows Server Manageability

On behalf of the Windows Server 8 Cloud Infrastructure team


<Return to section navigation list>

Cloud Security and Governance

Bruce Kyle continued his security series with Windows Azure Security Best Practices – Part 5: Claims-Based Identity, Single Sign On on 3/13/2012:

Claims-based identity is a simple but powerful way of handling identity and access for your web sites and web services, whether you work on-premises or you are targeting the cloud. You can create more secure applications by reducing custom implementations and using a single simplified identity model based on claims.

Windows Identity Foundation (WIF) is a set of .NET Framework classes. It is a framework for implementing claims-based identity in your applications.

Architecturally, using claims-based identity gets your application out of the authentication business. Single sign-on is much easier to achieve, and your application is no longer responsible for:

  • Authenticating users.
  • Storing user accounts and passwords.
  • Calling to enterprise directories to look up user identity details.
  • Integrating with identity systems from other platforms or companies.

Instead your application uses a claim that arrives to your application as a security token from an issuing authority. A security token service (STS) is the plumbing that builds, signs, and issues security tokens according to the interoperable protocols. Your application is the relying party.

Claim. A claim is some information that your application needs to know about a user. For example, a user’s name or email address, or whether the user is in the sales organization. Your application will accept the claim from an issuing authority.

Security Token. In a Web service, these claims are carried in the security header of the SOAP envelope. In a browser-based Web application, the claims arrive via an HTTP POST from the user’s browser, and may later be cached in a cookie if a session is desired.

Issuing Authority is a Web application or Web service that knows how to issue security tokens. In the scenario of claims-based identity, the issuing authority is responsible for issuing the proper claims (such as name and email or whether the person is in the sales organization).

Security Token Service (STS). STS is trusted by both the client and the Web service to provide interoperable security tokens.

Relying Party. That’s your application or Web service. You can see it described as a claims-aware application or a claims-based application.

SAML Token. Most STSs today issue SAML (Security Assertion Markup Language) tokens. SAML is an industry-recognized XML vocabulary that can be used to represent claims in an interoperable way.

Scenario

There are many scenarios. But in the one I chose, a user points her browser at a claims-aware Web application (relying party). The Web application redirects the browser to the STS so the user can be authenticated.

The STS, wrapped by a simple Web application that reads the incoming request, authenticates the user via standard HTTP mechanisms, and then creates a SAML token and emits a bit of JavaScript that causes the browser to initiate an HTTP POST that sends the SAML token back to the relying party.

The SAML token in the POST body contains the claims that the relying party requested.

[Diagram: the browser is redirected from the relying party to the STS and posts the SAML token back]

Your application takes the SAML token and, using Windows Identity Foundation, uses a few lines of code to open up the token and extract the claims. Now you have access to the requested data, such as name, email, and whether or not the person is in the sales organization.

There are many other scenarios. This one uses WS-Trust.

You don’t have to worry about what domain or security realm your user happens to be part of. In fact, you can support Facebook identity or Windows Live or Google ID or a claim from a user based on their Active Directory. Using claims-based identity makes it a lot easier to federate identity with other platforms or organizations.

Windows Identity Foundation Object Model for Claims

When you build a relying party with WIF, you’re shielded from all of the cryptographic heavy lifting that WIF (and its underlying WCF plumbing) does for you. It decrypts the security token passed from the client, validates its signature, validates any proof keys, shreds the token into a set of claims, and presents them to you via an easy-to-consume object model.

In your code you ask the token for each claim you need.

Here’s a sample that returns an email address.

protected string GetUserEmail(object sender, EventArgs e)
{
    IClaimsIdentity id =
        ((IClaimsPrincipal)Thread.CurrentPrincipal).Identities[0];

    // you can use a simple foreach loop to find a claim...
    string usersEmail = null;
    foreach (Claim c in id.Claims)
    {
        if (c.ClaimType == ClaimTypes.Email)
        {
            usersEmail = c.Value;
            break;
        }
    }

    return usersEmail;
}
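As an aside, the same lookup can be condensed with LINQ. A sketch assuming a using System.Linq; directive and the same WIF types as above:

    // Equivalent claim lookup using LINQ; yields null when no email claim is present.
    string usersEmail = id.Claims
                          .Where(c => c.ClaimType == ClaimTypes.Email)
                          .Select(c => c.Value)
                          .FirstOrDefault();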

The code can assume that the caller was authenticated and that her email address had been sent as a claim. The reason this program can make these assumptions is because it has a web.config file that uses the WS-Federation Authentication Module (FAM) from WIF and configures it with the address of an STS that can authenticate the user and supply these types of claims.

FAM is an HttpModule that is specifically designed to make it easy to build federated claims-aware Web applications using ASP.NET 2.0.

So you need some information in your web.config that is explained in the Microsoft Windows Identity Foundation (WIF) Whitepaper for Developers.
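For orientation only, the relevant web.config pieces look roughly like the following sketch. The URLs are hypothetical, the assembly strong names are abbreviated, and the whitepaper has the complete, authoritative listing:

    <httpModules>
      <!-- The FAM intercepts unauthenticated requests and redirects them to the STS -->
      <add name="WSFederationAuthenticationModule"
           type="Microsoft.IdentityModel.Web.WSFederationAuthenticationModule, Microsoft.IdentityModel, ..." />
      <add name="SessionAuthenticationModule"
           type="Microsoft.IdentityModel.Web.SessionAuthenticationModule, Microsoft.IdentityModel, ..." />
    </httpModules>

    <microsoft.identityModel>
      <service>
        <audienceUris>
          <!-- Hypothetical relying-party address -->
          <add value="https://localhost/MyRelyingParty/" />
        </audienceUris>
        <federatedAuthentication>
          <!-- passiveRedirectEnabled sends unauthenticated browsers to the issuer -->
          <wsFederation passiveRedirectEnabled="true"
                        issuer="https://localhost/MySTS/"
                        realm="https://localhost/MyRelyingParty/"
                        requireHttps="true" />
          <cookieHandler requireSsl="true" />
        </federatedAuthentication>
      </service>
    </microsoft.identityModel>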

WIF offers built-in Visual Studio project templates for creating a claims-aware ASP.NET application or claims-aware WCF services, so you have an excellent starting point.

Writing Your Own STS

You may already be maintaining a membership list of user names and passwords. You can create your own STS to provide identity.

The STS accepts incoming requests, validates, decrypts, and shreds incoming security tokens into claims, and does the opposite for outgoing security tokens. WIF takes care of all of that heavy lifting.

Note: WIF does not provide a framework for managing or administering policy, which you can think of as the logic, or the rules, behind the STS.

Making ASP.NET Membership Provider Into an STS

If you are using the ASP.NET Membership Provider, you can turn that into an STS and make it one of the providers your users can use to access your application. You can do this by adding a simple STS to your ASP.NET membership provider-based website. By adding a simple page containing WIF code you will enable your partners to accept your users in their websites, even enabling Single Sign On for the users already logged in to your website. See Enhancing an ASP.NET Membership Provider Website with Identity Provider Capabilities.

Providing Single Sign On

Once your application uses claims, it is easier to add scenarios where you can use other ways to sign in. The application only cares that the token is provided by a trusted provider. And the STS provides the information the application needs, such a name, email, or whether the person is in the sales role.

Single sign-on (SSO) is where a user’s token is trusted across multiple IT systems or even organizations. Your application can use a federated identity as the means of linking a person’s electronic identity and attributes, stored across multiple distinct identity management systems.

In many of the scenarios, the STS that provides the user’s claims runs inside the same organization as your application. But your application can now take advantage of an STS that is outside the organization.

As long as the application trusts the federation provider STS, that STS can run anywhere—even in the cloud.

Windows Azure Access Control is a federation provider STS that runs in the cloud. And when you connect to another organization, the Active Directory that provides the token might not express a role in the same way as another organization’s does. So Access Control has a straightforward way to map the roles of various providers into the names your application uses.

You can even allow log ins from other providers including Facebook, Google, Windows Live, or Yahoo.

I’ll describe Access Control in the next part of this series.

Context of Windows Identity Foundation

Windows Identity Foundation is part of Microsoft's identity and access management solution built on Active Directory that also includes:

  • Active Directory Federation Services 2.0: a security token service for IT that issues and transforms claims and other tokens, manages user access and enables federation and access management for simplified single sign-on.
  • Windows Azure Access Control Services: provides an easy way to provide identity and access control to web applications and services, while integrating with standards-based identity providers, including enterprise directories such as Active Directory, and web identities such as Windows Live ID, Google, Yahoo! and Facebook.
Getting Started with Windows Identity Foundation
References

Brokered Authentication: Security Token Service (STS)

Windows Identity Foundation (WIF)

Next Up

Windows Azure Security Best Practices – Part 6: How Azure Services Extends Your App Security. In this last part I show how other services in Windows Azure provide secure identity mapping, messaging, and connection to on premises application. This section describes the implications of Windows Azure Active Directory, Windows Azure Connect, and Service Bus for cloud applications, on premises applications, and hybrid applications.


<Return to section navigation list>

Cloud Computing Events

Alan Smith reported on a Sweden Windows Azure Group Meeting - Windows Azure Service Bus, 26th March, Stockholm in a 3/13/2012 post:

I’ll be presenting a session on “Windows Azure Service Bus” for the Sweden Windows Azure Group (SWAG) at AddSkills in Stockholm on the 26th March. It will be a demo intensive session looking at the relayed and brokered messaging capabilities of the Service Bus.

Register for the event here.

Sign up to Sweden Windows Azure Group (SWAG) for notifications of future events here.

Read more about the Windows Azure Service Bus in my e-book “Windows Azure Service Bus Developer Guide”.


<Return to section navigation list>

Other Cloud Computing Platforms and Services

Joe Panettieri (@joepanettieri) asked HP Cloud and Microsoft Windows Azure: On A Collision Course? in a 3/12/2012 post to the TalkinCloud blog:

Hewlett-Packard’s public cloud and corporate cloud strategies are starting to come into focus. And if you take a look, the HP public cloud and Microsoft Windows Azure seem to be on a potential collision course.

In an interview with The New York Times, Hewlett-Packard made it clear that the company wants to leverage open standards (Ruby, Java, PHP) to attract software developers onto its cloud platform.

Most folks are comparing the HP public cloud strategy to that of Amazon Web Services. But in my mind, HP is more likely to collide with Microsoft Windows Azure — which also leverages a range of open standards. Plus, HP wants to promote data management and analytics in its public cloud. To me, that sounds a bit like SQL Azure.

HP and Microsoft: Partnering and Competing

Now for the irony: Hewlett-Packard is one of Microsoft’s most trusted, longest-standing server application partners. HP was one of the first promoters of Windows NT Server back in the 1990s, and HP remains one of the largest corporate integrators of Exchange Server. On premise, it’s clear Microsoft and HP will continue to partner closely. But in the cloud, it’s clear Microsoft and HP will both partner and compete — the latest example of so-called IT “coopetition.”

HP will need to recruit developers to embrace its public cloud efforts. That’s a tall challenge considering HP doesn’t have much history in the ISV (independent software vendor) market. But perhaps HP’s buyout of Autonomy can assist that effort. And in the SMB market, HP has been working with such companies as Axcient — a backup and disaster recovery specialist — in the cloud.

Windows Azure Cloud Update

Meanwhile, I think Microsoft is off to a mixed — though promising — start with Windows Azure and SQL Azure. The company suffered an embarrassing leap year cloud meltdown on February 29, but there are signs that channel-related solutions are shifting into the Azure cloud. Two examples include Quosal (a quoting and sales proposal tool for VARs and MSPs) and CA ARCserve (the backup platform for channel partners), both of which now leverage Azure.

Admittedly, I could be jumping the gun comparing HP’s public cloud strategy to that of Windows Azure. HP hasn’t even officially “announced” its public cloud initiative. But by mentioning a standards-based ISV effort in that New York times article, I think HP has Windows Azure on its radar…

Read More About This Topic

Brian Taylor (@BT_TalkinCloud) asked HP Cloud: A Real Niche Between Amazon, Oracle & IBM? in another 3/12/2012 post to the TalkinCloud blog:

Hewlett-Packard has announced two initiatives in the European cloud marketplace, and there are signs that HP is finally making good on its promise to eventually counter Amazon Web Services (AWS) — while striving to elbow aside Oracle and IBM along the way. The big wild-card: Where do channel partners fit into HP’s public cloud equation?

About a year ago, former HP CEO Leo Apotheker vowed that his company would counter Amazon in the cloud. At the time Apotheker was short on details and offered no insights about when or how HP would actually counter Amazon. By September 2011, Apotheker was ousted and HP’s plan to counter Amazon fell off the media’s radar.

HP Still Going Into the Cloud

Fast forward to the present, and HP Senior VP Zorawar “Biri” Singh told The New York Times that HP is building a business-oriented cloud that will offer services for structured and unstructured data, plus analytics.

Instead of getting into a cloud price war with Amazon, HP told The Times that the IT giant will focus on value-added services and multiple languages — like Ruby, Java and PHP. To some of us here at Talkin’ Cloud, it sounds a little like Microsoft’s cross-platform strategy for Windows Azure.

Singh said the effort will leverage HP’s entire sales channel — though partner program details are still forthcoming. And it’s unclear how HP will ultimately try to build a niche that stands out against cloud initiatives at IBM, Oracle and Amazon.

Watch Europe

Meanwhile, HP is showing some cloud momentum in Europe. With Swisscom AG, a major telecoms company in Switzerland, HP has launched a unified communications and collaboration (UC&C) program using HP’s private cloud infrastructure. Also, in the Netherlands HP announced IT providers Centric and Eshgro as the first certified HP Cloud partners for customers in Holland.

In addition to its nearly 6 million customers and 1.6 million broadband connections, Swisscom also offers managed UC&C services to corporate clients. Swisscom employs the HP Converged Infrastructure based on HP’s CloudSystem to provide managed unified communications for a fixed price each month.

With HP, Centric and Eshgro in the Netherlands are delivering secure, unified cloud solutions based on HP CloudSystem to help businesses benefit from cloud services. Through the HP CloudAgile Program, Centric and Eshgro have changed their business models to help clients achieve their migration to the cloud.

As we know, companies are increasingly turning to cloud providers rather than building their own data centers. Thus firms like Centric and Eshgro are enabled to provide cloud services that fulfill clients’ compliance and business needs in their national and regional markets. Through the CloudAgile program, Centric and Eshgro have access to HP’s local sales force, program incentives including support and financing, and expanded service offerings.

For channel partners, the HP CloudAgile Program includes certified partners delivering cloud services, and also service providers, service hosts, systems integrators and value-added resellers. For more information, visit the HP CloudAgile Service Provider page.

And stay tuned. An HP public cloud — or corporate cloud — is in the works.

Additional reporting by Joe Panettieri.

Read More About This Topic

IMO, HP brings too little too late to the public cloud. “[V]alue-added services and multiple languages — like Ruby, Java and PHP” won’t overcome higher prices for cloud compute instances and storage. Windows Azure already supports those languages, plus .NET of course.


Quentin Hardy reported H.P. Attempts to Take On Amazon’s Cloud Service in a 3/9/2012 post to the NY Times’ Bits blog:

Within two months, Hewlett-Packard will offer a large and powerful cloud computing service similar to Amazon Web Services, but with more business-oriented features, according to the head of the project.

“We’re not just building a cloud for infrastructure,” said Zorawar “Biri” Singh, senior vice president and general manager of H.P.’s cloud services. “Amazon has the lead there. We have to build a platform layer, with a lot of third-party services.” Among the first software applications available as part of the Hewlett-Packard cloud, he said, will be both structured and unstructured databases, and data analytics as a service.

“We won’t pull (Amazon’s) customers out by the horns,” he said, “but we already have customers in beta who see us as a great alternative.” He did not say how much the computing services would cost, but said “we are not coming at this at ‘8 cents a virtual computing hour, going to 5 cents.’” Amazon recently cut its prices, and its lowest cost computing is 2 cents per hour, though with extra features it can cost more. While Amazon tends largely to have a self-service model, Hewlett-Packard’s cloud will also offer more personalized sales and service, Mr. Singh said.

H.P. also plans to offer a number of tools for developers to use popular online software languages, like Ruby, Java, and PHP, as well as ways for customers to provision and manage their workloads remotely. The service will also include an online store where people can offer or rent software for use in the Hewlett-Packard public cloud. Mr. Singh said the company would take precautions to ensure the quality and security of these software offerings from third parties by providing services like user authentication and billing.

Hewlett-Packard’s alternative to A.W.S. has been underway for over a year, and is likely to be the most ambitious project yet under Meg Whitman, who became chief executive of the Palo Alto, Calif., technology company last September. While seemingly focused on Amazon, the company is also looking at the project as a new way to compete with its traditional rivals.

“We want to make it hard for an I.B.M. or an Oracle or anyone to come in,” he said. By offering a lot of tools for developers and business-ready software to corporations, H.P. could find ways to undercut existing enterprise offerings, while surviving against Amazon, a notoriously low-margin competitor. …

Read more.


<Return to section navigation list>
