Tuesday, October 18, 2011

Windows Azure and Cloud Computing Posts for 10/17/2011+

A compendium of Windows Azure, SQL Azure Database, AppFabric, Windows Azure Platform Appliance and other cloud-computing articles.


Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:


Azure Blob, Drive, Table, Queue and Hadoop Services

Andrew Brust (@andrewbrust) posted Putting the ‘BI’ in Big Data on 10/16/2011:

Last week, at the PASS (Professional Association for SQL Server) Summit in Seattle, Microsoft held a coming out party, not only for SQL Server 2012 (formerly “Denali”), but also for the company’s “Big Data” initiative. Microsoft’s banner headline announcement: it is developing a version of Apache Hadoop that will run on Windows Server and Windows Azure. Hadoop is the open source implementation of Google’s proprietary MapReduce parallel computation engine and environment, and it's used (quite widely now) in the processing of streams of data that go well beyond even the largest enterprise data sets in size. Whether it’s sensor, clickstream, social media, location-based or other data that is generated and collected in large gobs, Hadoop is often on the scene in the service of processing and analyzing it.

Microsoft’s Hadoop release will be a bona fide contribution to the venerable open source project. It will be built in conjunction with Hortonworks, a company with an appropriately elephant-themed name (“Hadoop” was the name of the toy elephant of its inventor’s son) and strong Yahoo-Hadoop pedigree. Even before PASS, Microsoft had announced Hadoop connectors for its SQL Server Parallel Data Warehouse Edition (SQL PDW) appliance. But last week Microsoft announced things that would make Hadoop its own – in more ways than one.

Yes, Hadoop will run natively on Windows and integrate with PDW. But Microsoft will also make available an ODBC driver for Hive, the data warehousing front-end for Hadoop developed by Facebook. What’s the big deal about an ODBC driver? The combination of that driver and Hive will allow PowerPivot and SQL Server Analysis Services (in its new “Tabular mode”) to connect to Hadoop and query it freely. And that, in turn, will allow any Analysis Services front end, including PowerView (until last week known by its “Crescent” code name), to perform enterprise-quality analysis and data visualization on Hadoop data. Not only is that useful, but it’s even a bit radical.
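To make the ODBC angle concrete: once the announced Hive ODBC driver is available, the .NET side is ordinary ODBC plumbing. Here is a hypothetical sketch (not from Andrew’s article; the DSN name and the clickstream table are illustrative assumptions only):

using System;
using System.Data.Odbc;

class HiveOdbcSketch
{
    static void Main()
    {
        // "HiveSample" is a hypothetical DSN configured for the Hive ODBC driver.
        using (var connection = new OdbcConnection("DSN=HiveSample"))
        {
            connection.Open();

            // A HiveQL aggregation over a hypothetical clickstream table.
            var command = new OdbcCommand(
                "SELECT page, COUNT(*) AS hits FROM clickstream GROUP BY page", connection);

            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine("{0}: {1}", reader[0], reader[1]);
            }
        }
    }
}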

As powerful as Hadoop is, it’s more of a computer scientist’s or academically-trained analyst’s tool than it is an enterprise analytics product. Hadoop tends to deal in data that is less formally schematized than an enterprise’s transactional data, and Hadoop itself is controlled through programming code rather than anything that looks like it was designed for business unit personnel. Hadoop data is often more “raw” and “wild” than data typically fed to data warehouse and OLAP (Online Analytical Processing) systems. Likewise, Hadoop practitioners have had to be a bit wild too, producing analytical output perhaps a bit more raw than what business users are accustomed to.

But assuming Microsoft makes good on its announcements (and I have pretty specific knowledge that indicates it will), then business users will be able to get at big data, on-premise and in-cloud, and will be able to do so using Excel, PowerPivot, and other tools that they already know, like and with which they are productive.

Microsoft’s Big Data announcements show that Redmond’s BI (Business Intelligence) team keeps on moving. They’re building great products, and they’re doing so in a way that makes powerful technology accessible to a wide commercial audience. For the last seven years, SQL Server’s biggest innovations have been on the BI side of the product. This shows no sign of stopping any time soon, especially since Microsoft saw fit to promote Amir Netz, the engineering brain trust behind Microsoft BI since its inception, to Technical Fellow. This distinction is well-deserved by Mr. Netz and its bestowal is a move well-played by Microsoft.

Last week’s announcements aren’t about just Big Data; they’re about Big BI, now open for Big Business.


<Return to section navigation list>

SQL Azure Database and Reporting

My (@rogerjenn) PASS Summit: SQL Azure Sync Services Preview and Management Portal Demo (Problem Fixed) post updated 10/18/2011 begins:

My (@rogerjenn) Quentin Clark at PASS Summit: SQL Azure Reporting Services and Data Sync CTPs Available from Azure Portal post of 10/13/2011 (updated 10/14/2011) described setting up the SQL Azure Sync Services Preview with the updated Windows Azure Management Portal, which is repeated with additional details below. The Preview and Portal update were announced by Quentin Clark in his 10/13/2011 keynote session.

Update 10/18/2011: The first two passes at this process failed at step 27 due to a serialization error. The problem was caused by my installation of the .NET Framework 4.5 preview with the Visual Studio 11 Developer Preview, which Microsoft released at the BUILD conference in mid-September. It installs .NET Framework v4.5.40805 and sets it as the default version.


See end of post for more details.

The post continues with 34 steps with complete screen captures for setting up, testing, and troubleshooting the SQL Azure Data Sync preview with SQL Azure and SQL Server 2008 R2 [Express] SP1 databases.


The Microsoft Codename “Data Explorer” Team posted a Microsoft Codename “Data Explorer” Walkthrough on 10/17/2011:

In our previous post we introduced you to Microsoft Codename “Data Explorer”, a cloud service that helps you to gain insight into your data by allowing you to discover, enrich, and publish your data. This post walks you through an end-to-end scenario using “Data Explorer”.

In this scenario, our business consultant (Anna) is tasked by Contoso Yogurt to help them decide where to open the next three stores in the Western Washington State area. In order to answer this question, Anna needs to consider various aspects of the target customers, such as the demographics of each potential location. Anna must also forecast the reaction of the people near those new store locations after the stores open; i.e., how people are “feeling” about these stores.

Anna already has access to most of the data sources that contain the information that she needs, either because she owns the data or because the data is part of her company’s information systems. Additionally, Anna knows that there are other useful pieces of information in the wild that she can leverage, such as web pages, forums, social networks or other places on the Internet. She is not yet sure how she might leverage some of this valuable data, but thanks to the power of “Data Explorer” she can now put this data to use and gain new insights.

These are the different data sources that Anna already knows about and considers useful for the decision process:

  • Existing Contoso stores (including store performance rating): This data is available in a SQL Azure database that Contoso owns and maintains.
  • Potential new locations: Anna has done some research and created a list of candidate locations for new Contoso stores. This list contains some of the most popular shopping centers in the area. This data is stored in an Excel spreadsheet.

Anna starts with the Welcome page, where she can start working with her data.


Discover

On the Welcome page, Anna clicks Dashboard to start working with her data.


Anna clicks on Add data source to start adding new data.

This brings Anna to the Add Data page, where she can add data from many different kinds of data sources. She can connect to network resources such as a database, consume contents from a web page, a data feed or data coming from Windows Azure Marketplace. Alternatively she can add data from her local machine in various formats (Excel, Access, Text, etc.) or even create data “ad-hoc” by typing or pasting text, or creating formulas.


Anna starts off by connecting to the SQL Azure database where Contoso stores information about existing stores. To do this, she provides the Server and Database information, and a user name and password with permissions to access this database.


Next, Anna adds the Excel spreadsheet which contains potential new locations (shopping centers). She does this from the Add Data page as well. In this case, she uses File – Excel as the data source type, and she gets to indicate where the Excel file is currently located so that it can be uploaded.


Now that Anna has added her two data sources, she goes back to the Dashboard page and notices that her two existing data sources appear within the dashboard (on the right). In addition, a lot of useful information appears on this page, including classifications about the data that was imported. There are also some recommendations about potentially useful and relevant data sets from Azure Marketplace and Bing that she could leverage.


Anna finds these recommendations interesting and will incorporate some of them into her data later.

The next thing that Anna needs to do is to combine the information from the SQL Azure database regarding Contoso stores with the list of shopping centers for potential new locations that was in the Excel file added earlier. This is generally not a trivial task, but with “Data Explorer” she merely needs to select both sources in the list and click Mashup.


Enrich

Once Anna has clicked on Mashup, she is taken to the Mashup editor. This is where she can start shaping the data, enriching it by connecting the two different tables.


We will talk about the Mashup editor in greater detail in subsequent posts, but for now there are a few concepts that we want you to learn in order to understand the rest of this scenario…

In the top-left corner of the editor, you can see the resource pane currently displaying the two resources that Anna is trying to mash up, namely ShoppingCenters and ContosoStoreTraq.

The New option above the resource pane allows you to add more data via the Add Data experience we covered above.

The Merge option allows you to merge two resources into a single table.


Currently, the selected resource is ContosoStoreTraq, which is being previewed in the editor. Immediately below the ribbon are two gray boxes – we refer to this as the Task Stream. Each of those boxes is a “Step” that transforms or refines the data in a particular way. You can select from the ribbon a wide variety of tasks and apply them to a given resource, thereby adding to its task stream. These tasks cover filtering, ordering, changing column names in the preview, and otherwise transforming the data as you choose. The tabular preview shows the result of adding each task to the task stream, giving you immediate visual feedback on how the data shape is affected at each step. You can click on each task to see how the preview looked during that step.


You will also notice recommendations about datasets that are relevant to the data that you are currently working with (shown in blue in the bottom left). Since Anna is working with stores and shopping center locations, demographics data from Data Market and phone book information from Bing are provided as recommended datasets.


Next, Anna wants to combine these two resources because the information about existing stores contains a store performance rating. She would like to make this performance rating appear next to each shopping center where there is an existing Contoso Yogurt store. She can easily achieve that by adding a lookup column to ShoppingCenters, as displayed below…


After adding this lookup column, Anna selects the recommended data set about demographics and adds it to her mashup. She is then going to merge this resource with the ShoppingCenters resource, based on the Zip code.


Once she has merged these two resources using the Zip and PostalCode columns, Anna can incorporate another one of the recommended datasets; in this case, she is adding data from the Bing Phone Book to enrich her current data with phone numbers.
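Conceptually, this merge behaves like a relational join on the two key columns. The short LINQ sketch below is only an illustration of the equivalent operation in code; the types and sample values are hypothetical and are not generated by “Data Explorer”:

using System;
using System.Linq;

class ShoppingCenter { public string Name; public string Zip; }
class Demographics { public string PostalCode; public int Population; }

class MergeSketch
{
    static void Main()
    {
        // Hypothetical stand-ins for the two resources being merged.
        var shoppingCenters = new[] { new ShoppingCenter { Name = "Northgate", Zip = "98125" } };
        var demographics = new[] { new Demographics { PostalCode = "98125", Population = 41000 } };

        // The equivalent of merging ShoppingCenters with the demographics dataset on Zip = PostalCode.
        var merged = from center in shoppingCenters
                     join demo in demographics on center.Zip equals demo.PostalCode
                     select new { center.Name, center.Zip, demo.Population };

        foreach (var row in merged)
            Console.WriteLine("{0} ({1}): population {2}", row.Name, row.Zip, row.Population);
    }
}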


Using the Bing Phone Book API, Anna is able to create a new column with the count of the number of high schools within a ten mile radius of each store.


She could also add some of the other recommended services to provide social sentiment for each of the shopping centers and pick the one that people like best…


Publish

Finally, Anna wants to share her findings with her colleagues, so she uses the Publish feature of “Data Explorer”, which allows her to publish the results in many different formats (Excel, PowerPivot, etc.). We will explore the different publish mechanisms in detail in subsequent posts.


You will be able to try these capabilities and much more very soon. We are working hard every day so that by the time you get to try them, they are even more powerful! As a side effect of this, some parts of the user interface that you have seen in this post might look a bit different by the time you start using “Data Explorer”.

We hope this tour has helped you get a better understanding of the great opportunities “Data Explorer” brings to you and your data. If you still haven’t had the chance to sign up to try “Data Explorer”, you can follow this link to sign up.


David Pallman continued his series with The Cloud Gourmet 30-Minute Meal: Simple Database Migration to SQL Azure on 10/17/2011:

Bonjour and welcome again to The Cloud Gourmet with Chef Az-ure-D.

Today we have another “30 minute meal” recipe for Windows Azure. This time we will move a simple database into the cloud. For databases that meet our criteria, this is a “piece of cake”.

I must emphasize we are talking about simple databases: large or complex databases will certainly take more than 30 minutes; some take weeks! Below we are specific about the criteria for an easy migration.


Of course we often think of a database as just one tier of a complete solution, but this is not always the case. Sometimes a database on its own is a valued asset, such as reference data that many consume for different reasons. Even if you do need to migrate a complete solution, this recipe can be of use, because it is often the best approach to migrate the data first and then the application code.
To illustrate the steps of the migration, we will use the AdventureWorksLT database from Microsoft. If you wish to practice using the same data, you can obtain it here.

Recipe: Migrating a Simple SQL Server Database to SQL Azure in 30 Minutes
Databases on the Windows Azure platform are hosted by the SQL Azure Database service, available in sizes from 1-50GB at a cost of $9.99/GB/month plus bandwidth charges. SQL Azure is very similar to SQL Server and uses the same protocol (TDS) and programming model. You can work with SQL Azure databases using both the familiar SQL Server Management Studio tool (2008 R2 edition) and the SQL Azure section of the Windows Azure Management Portal.

Plan
An excellent tool for migrating SQL Server databases to SQL Azure is the SQL Azure Migration Wizard. We will use this tool to analyze, revise, and migrate the schema and data from an on-premise database to a cloud database. Nothing we will do will modify the source database in any way.
There are 5 steps we will perform:

  1. Create a SQL Azure Database Server
  2. Create a SQL Azure Database
  3. Migrate the Database
  4. Review SQL Azure Database in SSMS
  5. Review SQL Azure Database in Portal

Databases come in all sizes and complexities. The majority of the world’s databases are small in size and simple in nature. Database products come in all sizes and complexities too. Check that you meet the criteria below before using this recipe.

Criteria
Migrations to SQL Azure are smoothest and fastest when the following are true:

  • SQL Server Database. Your starting point database should be in SQL Server or SQL Server Express, preferably a recent version like 2008 or 2008 R2. If this is not the case, you can expect a longer migration time and you may have to work through feature differences. If you are migrating from a different database product or a very old version of SQL Server, it is recommended you “stage” your migration by moving your database over to SQL Server 2008 R2 first, and then up into the cloud on SQL Azure.
  • Small Database. Database size matters in migrations for several reasons. First of all, there is an upper size limit for a SQL Azure Database (currently 50GB); if your database is larger, then you will need to partition into multiple databases. Second, large data takes a long time to transmit (days perhaps) and you are more likely to be encountering occasional communication failures during upload. If you measure your database size in megabytes, this 30-minute recipe is for you. If you are dealing with gigabytes of data, you can still use this recipe but expect to spend one or more days migrating and you may want to employ the services of an expert.
  • Simple Database. A simple database is one that is mostly about the data. Complex databases are those that depend on database server features which may extend the migration time or pose barriers to migration. Use of features like constraints, granular security, or stored procedures are often fine but can sometimes pose complications. Features like Windows authentication, SQL Agent jobs, replication, full text search, XML indexing, distributed transactions, and transparent data encryption are not available yet in SQL Azure.

Ingredients:
1 SQL Server database meeting the above criteria

You Will Need:

Directions

Step 1: Create a SQL Azure Database Server
In this step you will create a SQL Azure database server using the Windows Azure management portal.
Note: if you have already created a SQL Azure database server, proceed to Step 2.

1A. In a large bowl… er, I mean in a web browser, go to the Windows Azure portal.

1B. Navigate to the Database category on the lower left.

1C. Select a subscription on the upper left.

1D. Click the Create button in the Server toolbar. A dialog appears.

1E. Specify the data center locale where you want to create your database server and click Next.

1F. Now specify a name you would like to use for an Administrator, along with a strong password. Record the administrator name and password for safekeeping. Again click Next.

Note: it’s possible to change the password in the future, but not the admin name.

1G. Finally, you will be prompted to set firewall rules. In order to access the database we will need to add a firewall rule.

A. Check the Allow other Windows Azure services to access this server checkbox.

B. Click Add.

C. Enter a rule name.

D. Enter a starting and ending IP address. You can either specify your own IP (shown in the dialog), or some other range. If you want to allow access for all, you can specify 0.0.0.0 through 255.255.255.255.

E. Click OK and Finish to close the dialogs.

1H. A database server will soon appear in the main area of the Windows Azure portal. You will need to capture the server name, which is auto-generated and cryptic. In our example, it is b7e77waic7. Your name will be different.

So far we have created a database server (It is actually multiple servers, but this single virtual server name fits the SQL Server connection string model). The next step will be to create a database on that server.


Step 2: Create a SQL Azure Database
In this step you will create a database on the database server you created in Step 1.

2A. In the Windows Azure portal, select your database server.

2B. Click the Create button in the Database toolbar. A dialog appears.

2C. Specify (and record) a name for your database, and select an edition/size sufficient for your database. The Web edition offers 1/5GB sizes and the Business edition 10/20/30/40/50GB. Then click OK.

2D. Verify the database was created by finding it and selecting it in the outline at upper left. Notice that in addition to your database the database server also contains a master database.

Now that we have created our cloud database, we can begin the migration.


Step 3: Migrate the Database
In this step you will use the SQL Azure Migration Wizard to migrate your SQL Server database to the SQL Azure database created in Step 2.
3A. Launch the SQL Azure Migration Wizard and select the Analyze and Migrate / SQL Database option. Then click Next.

3B. In the Connect to Server dialog that appears, specify your source database server and authentication credentials as described below. Then click Connect.

  • Server name: Here you must specify server-name or server-name\instance-name. If you are running full SQL Server on your local machine and using the default SQL Server instance, just leave the setting to its default of localhost. If you are running SQL Server Express on your local machine, you are most likely using the .\SQLEXPRESS instance.
  • Authentication: Your source database may be set up for Windows authentication or SQL authentication. Select the appropriate option. For SQL authentication, also enter your username and password credentials.
  • Database: Leave this at its default value of Master DB. This will cause the databases to be listed for you on the next screen.

3C. On the next screen (Select Source), select your database and click Next.

3D. On the Choose Objects dialog, leave the default of Script all database objects and click Next.

3E. On the Script Wizard Summary dialog, make no changes and click Next. When prompted Ready to generate SQL script?, click Yes.

3F. When the Results Summary dialog appears, review the report content fully to see if there are any problems. You may also use the Save button to save a copy of the report.

In the case of the AdventureWorksLT sample database we are using to illustrate this recipe, there is a problem noted in the report for the ProductionDescriptionsSchemaCollection table: XML Schema Collections are currently not supported in SQL Azure. This means our source database is using an unsupported feature in this one table. In our case we are unconcerned and decide to proceed without that one table. If you encounter errors, you will need to evaluate how serious they are and what action you should take about them. Note that it is possible in Step 3D earlier to specify which database objects are and are not included in the migration.

3G. If you have chosen after reading the report to proceed with the migration, click Next. A Connect to Server dialog appears. Otherwise click Exit and you will need to work on resolving your issues.

3H. Specify the information below to connect to the SQL Azure database you created earlier in Steps 1 and 2. Then click Next.

  • Server name: Specify the server name in the form SERVER.database.windows.net where SERVER is the generated database server name from Step 1H (b7e77waic7 in our example).
  • Authentication: Specify Use a specific user ID and password
  • User name: Specify an administrator username in the form USERNAME@SERVER where the administrator user name is the administration name you made up in Step 1F and the server name is the generated database server name from Step 1H (chefazured@b7e77waic7 in our example).
  • Password: Specify the administrator password from Step 1F.
  • Database: Select the Specify Database option and specify the name of the database you created in Step 2C (adventureworks in our example).
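Incidentally, the same values map directly onto an ordinary ADO.NET connection. A minimal sketch, using the example server b7e77waic7, administrator chefazured, and database adventureworks from this recipe (substitute your own values and a real password):

using System;
using System.Data.SqlClient;

class SqlAzureConnectionSketch
{
    static void Main()
    {
        // SQL Azure uses the USERNAME@SERVER login form and an encrypted connection over port 1433.
        var connectionString =
            "Server=tcp:b7e77waic7.database.windows.net,1433;" +
            "Database=adventureworks;" +
            "User ID=chefazured@b7e77waic7;" +
            "Password=YOUR-PASSWORD;" +
            "Encrypt=True;";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            Console.WriteLine("Connected to " + connection.Database);
        }
    }
}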

3I. On the Setup Target Server Connection dialog, confirm the database name is correct and click Next. When prompted to Execute script against destination server?, click Yes.

3J. Now sit back as the database is migrated. The running report will show you what is happening, including errors and remedial actions. Just how long this takes depends on the size of your database and the quality of your Internet connection.

3K. When the processing has completed, you’ll see Processing finished at the bottom of the report. Review the report and decide if you think the migration was successful. You can save this report with the Save button if you wish. Then click Exit.

The last thing to do is confirm the database migration was successful. You can do this with SQL Server Management Studio (Step 4) or the Windows Azure Portal (Step 5).


Step 4: Review SQL Azure Database in SSMS (Optional)
In this step you will confirm your database was successfully migrated to the cloud by accessing it with SQL Server Management Studio. You need the 2008 R2 edition of SSMS for this.

4A. Launch SQL Server Management Studio. The Connect to Server dialog appears.

4B. In the Connect to Server dialog, specify the connection information below:

  • Server name: the generated server name from Step 1H.
  • Authentication: SQL Server Authentication.
  • Login: The login name from Step 1F.
  • Password: the password from Step 1F.
  • Remember password: check if you don’t want to have to re-enter this info.
  • Options / Connect to Database: click the Options button and enter the database name from Step 2C under Connect to Database.

4C. Click Connect to connect to the database. If it fails, check the following:

  • Did you create your database server?
  • Did you create your database?
  • Are you specifying the correct database server name in the right format?
  • Are you specifying the correct admin username and password?
  • Did you specify the database name on the Connection Properties tab?
  • Are there any extraneous leading or trailing spaces in what you entered?
  • Does your firewall block outbound port 1433 (which SSMS requires)?

4D. Now inspect your database. Start by expanding the SSMS outline at left and ensuring the tables you expect to be present are there.

4E. Perform some SELECT queries to ensure the data looks right. Do tables contain the number of rows you expect? Does table data look like you expect?
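If you prefer to script these spot checks, the fragment below shows the idea (it reuses the connection string from the sketch in Step 3H; the table names assume the AdventureWorksLT sample used in this recipe, so adapt them for your own schema):

// connectionString is the same string shown in the Step 3H sketch.
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    // Row count for one table.
    var countCommand = new SqlCommand("SELECT COUNT(*) FROM SalesLT.Customer", connection);
    Console.WriteLine("Customers: " + (int)countCommand.ExecuteScalar());

    // Eyeball a few rows of data.
    var sampleCommand = new SqlCommand("SELECT TOP 5 Name, ListPrice FROM SalesLT.Product", connection);
    using (var reader = sampleCommand.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine("{0}: {1}", reader.GetString(0), reader.GetDecimal(1));
    }
}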


Step 5: Review SQL Azure Database in Portal (Optional)
This step is optional. If you would also like to see your data in the SQL Azure Portal, or if you were unable to use SSMS in Step 4, follow these steps:

5A. In the Windows Azure Management Portal, return to the Database area where you were working in Steps 1 and 2.

5B. Select your database in the outline at left and click the Manage button in the Database toolbar.

5C. In the sign-in dialog, specify the administrator username and password from Step 1F. Then click Log on.

5D. Click Tables at left and the database tables will be listed. Confirm the tables you expect are all there.

5E. Select a table you want to view. Click the Data icon to view its data, or the Design icon to view its design.


Concluding Remarks
Congratulations, you were magnificent! Vous avez été magnifiques! I am very proud of you.

I very much hope that your migration went well. If you ran into complications, if the soufflé did not rise, do not despair. You must appreciate that not all database migrations are simple affairs. For help you can avail yourself of the excellent online guidance, support forums, and community blogs, and perhaps consider obtaining help from an experienced consultant.


<Return to section navigation list>

MarketPlace DataMarket and OData

The ADO.NET Data Services team described Using Geospatial Data in a 10/17/2011 post:

This CTP of WCF Data Services adds support for geospatial data. The release allows use of all of the OData geospatial data types and the geo.distance() canonical function. This enables two key scenarios:

  • Read and write geospatial data (all types supported by SQL Server 2008 R2).
  • Find all entities (i.e. coffee shops) near a location.

Before I illustrate the use of these features, I'd like to mention some limitations of this CTP. First (and most significantly), WCF Data Services only supports geospatial data with custom or reflection providers. You can't use Entity Framework at this time; OData will support geospatial data over EF as soon as there is an EF release that supports geospatial data.

Second, this CTP does not allow null values in geospatial properties. Nulls will be added by RTM.

OK, enough on what it doesn't do. Let's interact with some geospatial data!

Adding Geospatial Data to the Model

https://gist.github.com/1293201 is a simple OData service which lets a user find people and businesses near them. I'll describe the key geospatial parts here.

First, the entities each have a property of type GeographicPoint (one of the new geospatial types):

[DataServiceKey("BusinessId")]
public class Business
{
public int BusinessId { get; set; }
public string Name { get; set; }
public string Description { get; set; }
public GeographicPoint Location { get; set; }
}
[DataServiceKey("Username")]
public class User
{
public User()
{
this.Friends = new List<User>();
}
public string Username { get; set; }
public IList<User> Friends { get; set; }
public GeographicPoint LastKnownLocation { get; set; }
} 

To create sample data values for geospatial types, I use the GeographyFactory (the data creation API is likely to change before the RTM, but this correct for now):

new User { Username = "Chai", LastKnownLocation = GeographyFactory.Point(47.7035614013672, -122.329437255859) }

Finally, geospatial data is only supported in V3 of the OData protocol:

public static void InitializeService(DataServiceConfiguration config)
{
    config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V3;
}

That's it. Geospatial data values are just primitive data values like any other. It requires as little effort to use them as it does to use a DateTime.

Writing the client

Just use Add Service Reference to codegen a client. Consuming geospatial data is no different than consuming any other V3 OData service.

Reading and writing geospatial values

There's nothing special about geospatial values. For example, to update your last known location, you would query for your User entity, set the value of its LastKnownLocation, and call SaveChanges().
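A minimal sketch of that round trip, using the same LocalStuff context class that appears in the query example later in this post (the coordinates are made up, and the availability of GeographyFactory on the client is an assumption of this sketch):

var ctx = new LocalStuff(new Uri("http://localhost/LocalStuff.svc", UriKind.Absolute));

// Query for the user, move them, and push the change back to the service.
var me = ctx.Users.First(u => u.Username == "Chai");
me.LastKnownLocation = GeographyFactory.Point(47.6097, -122.3331);

ctx.UpdateObject(me);
ctx.SaveChanges();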

Enabling geo.distance queries

This sample service is interesting because it allows users to find nearby friends and businesses. We want to write queries that filter or orderby geo.distance().

Unfortunately, the October CTP does not include an in-memory implementation for distance. Computing distance on a round earth is complicated. You'll need to find a good implementation for this operation (Sql Server has such an implementation). Once you have it, you can use the following glue code to hook it up.

public static void InitializeService(DataServiceConfiguration config)
{
    // ...
    // Register my operations
    SpatialOperations.Register(2.0, new MyOperations());
}

internal class MyOperations : SpatialOperations
{
    public override double Distance(Geometry operand1, Geometry operand2)
    {
        // TODO: Put your code here.
        throw new NotImplementedException();
    }

    public override double Distance(Geography operand1, Geography operand2)
    {
        // TODO: Put your code here.
        throw new NotImplementedException();
    }
}
Making a distance query

Now that you've got a service that supports geo.distance(), we want to query it. Here are a couple of queries we can run:

var localStuff = new LocalStuff(new Uri("http://localhost/LocalStuff.svc", UriKind.Absolute));
var me = localStuff.Users.First(u => u.Username == "Chang");
var myNearbyFriends = me.Friends
    .Where(friend => friend.LastKnownLocation.Distance(me.LastKnownLocation) < 1000.0);
var moviesNearMe = localStuff.Businesses
    .Where(b => b.Description.Contains("movie"))
    .OrderBy(b => b.Location.Distance(me.LastKnownLocation))
    .Take(3);

Have fun with the new geospatial data features. Please provide any feedback on the OData.org mailing list.


Alex James (@adjames) updated Actions in WCF Data Services

“Actions will provide a way to inject behaviors into an otherwise data-centric model without confusing the data aspects of the model, while still staying true to the resource oriented underpinnings of OData.”

The October 2011 CTP of WCF Data Services adds powerful, but incomplete support for Actions. The motivation behind Actions stems from wanting to advertise in an OData entry an invocable ‘Action’ that has a side-effect on the OData service.

This statement is broad, but deliberately so; Actions have a lot of power.

Using WCF Data Services to Invoke an Action:

This release’s WCF Data Services client can invoke Actions that have no parameters with any return type (i.e. void, Feed, Entry, ComplexType, Collection of ComplexType, PrimitiveType or Collection of PrimitiveType).

To invoke Actions you call either Execute(..) for void actions or Execute<T>(..) for everything else. For example:

var checkedOut = ctx.Execute<bool>(
    new Uri("http://server/service.svc/Movies(6)/Checkout"),
    HttpMethod.Post,
    true
).Single();

Here the Execute<T> function takes the Uri of the Action you want to invoke, the HttpMethod to use (which in this case is Post because we are invoking a side-effecting action), and singleResult=true to indicate there is only one result (i.e. it is not a collection). The method returns a QueryOperationResponse<bool>, which implements IEnumerable<bool>, so we call Single() to get the lone boolean that is the result of invoking the action.

NOTE: Needing to specify singleResult=true is a temporary CTP only requirement, because in the CTP our deserialization code can’t automatically detect whether the result is a collection or single result.

A nice side-effect of this new feature is that you can now call ServiceOperations too, so long as you craft the full Uri (including any parameters) yourself. For example, the code below calls a ServiceOperation called GetMoviesByGenre that takes a single parameter called Genre and returns a Collection (or feed) of Movies using a Get:

var movies = ctx.Execute<Movie>(
    new Uri("http://server/service.svc/GetMoviesByGenre?genre='Comedy'"),
    HttpMethod.Get,
    true
);

foreach (var movie in movies)
{
    // do something
}
Coming Soon…

By RTM we plan to add full support for parameters, both for actions and service operations.

The current plan is for a new BodyParameter class that could be used to specify Actions parameters like this:

var checkedOutForAWeek = ctx.Execute<bool>(
    new Uri("http://server/service.svc/Movies(6)/Checkout"),
    HttpMethod.Post,
    new BodyParameter("noOfDays", 7)
).Single();

And a new UriParameter class that could be used to specify ServiceOperation parameters too:

var movies = ctx.Execute<Movie>(
    new Uri("http://server/service.svc/GetMoviesByGenre"),
    HttpMethod.Get,
    new UriParameter("genre", "Comedy")
);
Setting up a WCF Data Service with Actions:

Unfortunately, creating actions with WCF Data Services in this release is quite tricky because it requires a completely custom Data Service Provider, but we are striving to make this easy by RTM.

This CTP’s WCF Data Services Server only supports one parameter (i.e. the binding parameter); again, this will change by RTM.

To get started with actions, first create a ServiceAction in your IDataServiceMetadataProvider2 implementation; something like this:

ServiceAction checkout = new ServiceAction(
    "Checkout",
    ResourceType.GetPrimitiveResourceType(typeof(bool)),
    null,
    new List<ServiceOperationParameter> {
        new ServiceOperationParameter("movie", movieResourceType)
    },
    true
);
checkout.SetReadOnly();
ServiceAction currently derives from ServiceOperation, so you will need to add any ServiceActions that you create to the collection of ServiceOperations you expose via both IDataServiceMetadataProvider.ServiceOperations and IDataServiceMetadataProvider.TryResolveServiceOperation(..). Also, because Data Services are locked down by default, you will need to configure your service to expose your actions using SetServiceOperationAccessRule(…).
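For example, the access rule can be granted in InitializeService; a sketch (adjust the rights to whatever your service should actually allow):

public static void InitializeService(DataServiceConfiguration config)
{
    // ServiceActions surface as service operations in this CTP, so they are opened up the same way.
    config.SetServiceOperationAccessRule("Checkout", ServiceOperationRights.All);
    config.SetServiceOperationAccessRule("Checkin", ServiceOperationRights.All);
    config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V3;
}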

IDataServiceMetadataProvider2 also adds a new method to find actions possibly bound to a ResourceType instance, (i.e. to an individual Movie). This is so that when the WCF Data Service is serializing Entities it doesn’t need to walk over all the metadata to find Actions that might bind to a particular entity. Here is a naïve implementation, where _sops is a list of all ServiceOperations:

public IEnumerable<ServiceOperation> GetServiceOperationsByResourceType(ResourceType resourceType)
{
    return _sops.OfType<ServiceAction>()
        .Where(a => a.Parameters.Count > 0 && a.Parameters.First().ParameterType == resourceType);
}

Next implement IDataServiceQueryProvider2.IsServiceOperationAdvertisable(..) to tell Data Services whether an Action should be advertised on a particular entity:

public bool IsServiceOperationAdvertisable(
    object resourceInstance,
    ServiceOperation serviceOperation,
    ref Microsoft.Data.OData.ODataOperation operationToSerialize)
{
    Movie m = resourceInstance as Movie;
    if (m == null) return false;
    var checkedOut = GetIsCheckedOut(m, HttpContext.Current.User);

    if (serviceOperation.Name == "Checkout" && !checkedOut) return true;
    else if (serviceOperation.Name == "Checkin" && checkedOut) return true;
    else return false;
}

Here resourceInstance is the instance that is being serialized to the client, serviceOperation is the ServiceAction that the server is considering advertising, and operationToSerialize is an OData structure representing the action information that’ll be serialized if you return true (note that you can change properties on this class if, for example, you want to override the title or target of the Action in the payload).

As you can see, this code knows that only Movies have actions, and that it has only two actions; Checkin and Checkout. It calls an implementation-specific method to work out whether the current user has the current movie checked out and then uses this information to decide whether to advertise the Action.

Next you need to implement IDataServiceUpdateProvider2.InvokeAction(..) so that when a client invokes the Action you actually do something:

public object InvokeServiceAction(object dataService, ServiceAction action, object[] parameters)
{
    if (action.Name == "Checkin")
    {
        Movie m = (parameters[0] as IQueryable<Movie>).SingleOrDefault();
        return Checkin(m);
    }
    else if (action.Name == "Checkout")
    {
        Movie m = (parameters[0] as IQueryable<Movie>).SingleOrDefault();
        return Checkout(m);
    }
    else
        throw new NotSupportedException();
}

As you can see, this code figures out which action is being invoked and then gets the binding parameter from the parameters collection. The binding parameter will be an unexecuted query (it is unexecuted because this gives a provider the opportunity to invoke an action without actually retrieving the parameter from the datasource, if indeed that is possible), so we extract the Movie by casting parameters[0] to IQueryable<Movie> and calling SingleOrDefault, and then we call the appropriate code for the action directly.

And we are done…

WARNING: This code will need to change by RTM so that Actions actually get invoked during IDataServiceUpdateProvider.SaveChanges(..). This will involve creating delegates and returning something that isn’t the actual results, but rather something from which you can get the results later. See this post on implementing IDataServiceUpdateProvider for more context if you are interested.

Conclusion:

As you can see, Actions is a work in progress, and many things are likely to change. Even though it is a lot of work to implement actions with the CTP (mainly because you have to implement IDataServiceMetadataProvider2, IDataServiceQueryProvider2 and IDataServiceUpdateProvider2 from scratch), it’s worth trying because Actions opens up the world of behaviors to OData.

Come RTM we expect the whole experience to be a lot better.


<Return to section navigation list>

Windows Azure AppFabric: Apps, Access Control, WIF and Service Bus

Chris Klug (@ZeroKoll) described Using the Windows Azure Service Bus - Topics and Subscribers in a 10/18/2011 post:

I guess it is time for another Azure Service Bus post. The previous ones have been relatively popular, so I thought I would do one more post to cover one last feature in the bus. (I say one last now, but I am pretty sure I will be back…)

Topics and subscribers are the basic units behind the Service Bus implementation of the pub/sub pattern. And as expected from a “simple” pattern like this, it should be simple to implement, and it is. The basics would be, create a topic, add subscribers that subscribe to messages from the topic, and finally push some messages to the topic, which are then relayed to the subscribers. Simple as…

Ok, so let’s look at some code already…and yes we will, but first I just want to mention that this post builds upon some of the previous things. It for example assumes that there is a Service Bus namespace up and rolling and so on. All information about getting that set up is available here. It also contains the information about getting the required SDK installed and so on. So if you haven’t read my post about message relaying, do it.

With that information out of the way, let’s start coding. I start off by creating a class library project, which I call Messages. In this project, I will place the messages that will be sent and received from the topic.

Just remember, like previously, we have to change the target framework to .NET 4. As soon as that is out of the way, I create 2 new classes called “MyFirstMessage” and “MySecondMessage”.

I know that is lacking a little bit of imagination, but that will have to do…

The messages are simple classes, and actually have the same single string property called Message. So they look like this

public class MyFirstMessage
{
    public MyFirstMessage() { }

    public MyFirstMessage(string msg)
    {
        Message = msg;
    }

    public string Message { get; set; }
}

public class MySecondMessage
{
    public MySecondMessage() { }

    public MySecondMessage(string msg)
    {
        Message = msg;
    }

    public string Message { get; set; }
}

Yes, they are pretty much identical. But they will do for this simple demo. The important part is to have 2 messages.

Now that we have the messages it is time to create a publisher, and in good old style, I am creating a new Console application. I change the target framework and add a reference to the Microsoft.ServiceBus assembly. And just as in previous samples, I add in an app.config file and place the namespace, issuer and secret in the appsettings.

As soon as that is done, I can start messing with the actual program. I start by adding in the configuration properties and setting up a new NamespaceManager and MessagingFactory, making the application more or less a copy of the one used in the previous post about queueing. The only difference so far, is that I also create a string constant to hold the name of the Topic to interact with. It looks like this

class Program
{
    private const string Topic = "TheTopic";

    static void Main(string[] args)
    {
        var url = ServiceBusEnvironment.CreateServiceUri("sb", GetNamespace(), string.Empty);
        var credentials = TokenProvider.CreateSharedSecretTokenProvider(GetIssuerName(), GetSecret());

        var nsc = new NamespaceManager(url, credentials);
        var mf = MessagingFactory.Create(url, credentials);
        ...
    }

    private static string GetIssuerName()
    {
        return ConfigurationManager.AppSettings["issuer"];
    }

    private static string GetSecret()
    {
        return ConfigurationManager.AppSettings["secret"];
    }

    private static string GetNamespace()
    {
        return ConfigurationManager.AppSettings["namespace"];
    }
}

Ok, so so far, there is nothing new really, and what comes next might be new, but still looks very familiar. I start by using the NamespaceManager to see if there is a Topic with the specified name. If there isn’t I create one. After that is done, I create a TopicClient that will be responsible for communicating with the topic.

As soon as I get my hands on that TopicClient, I start a for-loop that sends 10 messages to the Topic. Five of type MyFirstMessage, and 5 of type MySecondMessage. But before sending the messages, I use another feature of the BrokeredMessage class, the ability to add metadata to the message. This is done by putting the data into an IDictionary<string,object> called Properties. This metadata is then passed along with the message to the client, adding some extra features that you will see later.

For now, all you need to know is that together with the message, I am passing the full name of the message type as a metadata property called “Type”. I also make sure to replace any “.” in the full name with “_” as it fails if there are “.” characters in the data.

The loop looks like this

for (int i = 0; i < 10; i++)
{
    object msg;
    msg = i % 2 == 0 ? (object)new MyFirstMessage("First message " + i) : (object)new MySecondMessage("Second message " + i);

    var brokeredMessage = new BrokeredMessage(msg);
    brokeredMessage.Properties["Type"] = msg.GetType().FullName.Replace('.', '_');

    Console.Write("Sending message " + i + "...");

    client.Send(brokeredMessage);

    Console.WriteLine("Done!");
}

And finally I close the client and add a Console.ReadKey() to keep the window open. This is really not necessary, but it feels wrong to have the window just close…

That was the publisher! Recap, create a Topic, then a TopicClient, then a BrokeredMessage, and finally send it to the message to the Topic.

Next up is a subscriber. Once again, it is a Console application, with changed framework and a new reference. And once again, I add the app.config and the config-retrieving properties. I also add the string constant that I used in the publisher.

I start off the same way as in the publisher, by creating the NamespaceManager and MessagingFactory and making sure that the Topic is there. Normally, the Topic should be created before the subscriber arrives. But to make it foolproof in this case, I add the creation logic in here as well.

But that’s where the commonality ends. As soon as I know the Topic exists, I carry on with the Subscription. I once again use the NamespaceManager to check if there is a Subscription with a name that I have defined, in this case “Subscriber”. If there isn’t, I create one.

So the way that Topics and Subscribers work, is that you have a Topic that messages are sent through. The clients listening connect to a so called Subscription, which is pretty natural. The important thing to understand though, is that each Subscription has a name, and only one client can listen to it. If you connect more than one client to the same Subscription, you will get a race and only one client will receive the messages.

The way it works, at least conceptually, is that each Subscription has a queue. When you send a message to the Topic, it puts that message into all registered Subscription’s queues. And as you connect a client to a Subscription, it looks for messages in that queue. (It might be a little different in the actual implementation, I don’t know, but it basically works like that at least…)

Ok, back to the code. As soon as the Subscription’s existence has been confirmed I use the MessagingFactory to create a new SubscriptionClient. To create one of these, it takes the name of the Topic, and the name of the Subscription.

I then start a loop that receives messages asynchronously. And since it is async, I use a Console.ReadKey() call to keep the application running. After the ReadKey() call, I close the client, and use the NamespaceManager to delete the Subscription. Like this

static void Main(string[] args)
{
    var url = ServiceBusEnvironment.CreateServiceUri("sb", GetNamespace(), string.Empty);
    var credentials = TokenProvider.CreateSharedSecretTokenProvider(GetIssuerName(), GetSecret());

    var nsc = new NamespaceManager(url, credentials);
    var mf = MessagingFactory.Create(url, credentials);

    if (!nsc.TopicExists(Topic))
        nsc.CreateTopic(Topic);

    var name = "Subscriber";
    if (!nsc.SubscriptionExists(Topic, name))
        nsc.CreateSubscription(Topic, name);

    _client = mf.CreateSubscriptionClient(Topic, name);

    BeginReceive();

    Console.WriteLine("Waiting for messages...");
    Console.ReadKey();

    _client.Close();

    nsc.DeleteSubscription(Topic, name);
}

As each Subscription is an “entity” in the bus, removing it when you are done is a good idea… Messages will not go into the Subscription and stay there forever, but they will be there for a while and use up space. I also assume that having Subscriptions around will also slow down the message sending a tiny amount… Anyhow, in this case it makes sense to remove it at least…

Ok, so that was the main part of the subscriber. The part that wasn’t in that code was the message receive loop that was started by the call to BeginReceive(). The BeginReceive() method is very simple. It checks to make sure that the client isn’t closed, and then calls BeginReceive() on it. The BeginReceive() method takes a TimeSpan that defines how long it will wait for a message, a callback method for when a message has been received, or the time has run out, and finally a user state object if needed.

private static void BeginReceive()
{
    if (!_client.IsClosed)
        _client.BeginReceive(TimeSpan.FromMinutes(5), MessageReceived, null);
}

The actual work is in the MessageReceived() method. It takes an IAsyncResult like all async callbacks. This is then used when calling the EndReceive() method on the client. If something has gone wrong, an exception might be thrown, so I wrap it in a try/catch and Console.WriteLine the exception.

If everything goes ok, I get a message back, or potentially nothing. If a timeout is the reason for the ending of the call, the EndReceive() will return null. So I start by checking whether or not I got a message.

If I did, I check the type by looking at the metadata property called “Type”. Remember, the one I added above… I make sure to replace my “_” with “.” and then do an if-statement to handle different message types that were sent.

If the type is one that I recognize, I get the message from the BrokeredMessage by calling GetBody<T>(), and output the message to the console. If it is a type I don't recognize, I output that to the console.

And finally, at the end of the method, I call BeginReceive() again to receive more messages.

private static void MessageReceived(IAsyncResult iar)
{
    try
    {
        var msg = _client.EndReceive(iar);
        if (msg != null)
        {
            var type = (string)msg.Properties["Type"];
            if (!string.IsNullOrEmpty(type))
                type = type.Replace('_', '.');

            if (type == typeof(MyFirstMessage).FullName)
            {
                var myMsg = msg.GetBody<MyFirstMessage>();
                Console.WriteLine("Received a MyFirstMessage: " + myMsg.Message);
            }
            else if (type == typeof(MySecondMessage).FullName)
            {
                var myMsg = msg.GetBody<MySecondMessage>();
                Console.WriteLine("Received a MySecondMessage: " + myMsg.Message);
            }
            else
            {
                Console.WriteLine("Received a message I don't understand...");
            }
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine("Exception was thrown: " + ex.GetBaseException().Message);
    }

    BeginReceive();
}

Ok, that works fine! But there has to be a better way to handle the message type…and there is. And my second subscriber will use it. The second subscriber is identical to the first one, except for 2 things. First of all, it obviously uses another name for its Subscription, and secondly, it only cares about messages of type MyFirstMessage.

I actually create the second subscriber by creating a new Console project as usual, and then copying across the Main() method and the app.config.

So how do I handle this in a good way? Well, when you create a Subscription, you can pass along a filter. This filter makes it easy to filter messages based on their metadata using a T-SQL-like language.

The filter class is called SqlFilter, and is very simple to add like this

if (!nsc.SubscriptionExists(Topic, name))
{
    var filter = new SqlFilter("Type = '" + typeof(MyFirstMessage).FullName.Replace('.', '_') + "'");
    nsc.CreateSubscription(Topic, name, filter);
}

This filter will make sure that the Subscription only gets messages where the metadata property “Type” has the correct value.

You can add as many filters as you want to a Subscription. However, they are calculated using “OR”. So if any of the filters match, the message will be received.
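If your SDK version exposes SubscriptionClient.AddRule (an assumption you should verify against your installed bits), adding a second filter to an existing Subscription looks something like this sketch:

// Sketch: also deliver MySecondMessage messages to this Subscription.
var secondFilter = new SqlFilter("Type = '" + typeof(MySecondMessage).FullName.Replace('.', '_') + "'");
_client.AddRule("SecondMessageRule", secondFilter);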

After that little change in the Subscription creation, I also change the MessageReceived() method to just accept MyFirstMessage messages.

private static void MessageReceived(IAsyncResult iar)
{
    try
    {
        var msg = _client.EndReceive(iar);
        if (msg != null)
        {
            var type = (string)msg.Properties["Type"];
            if (!string.IsNullOrEmpty(type))
                type = type.Replace('_', '.');

            if (type == typeof(MyFirstMessage).FullName)
            {
                var myMsg = msg.GetBody<MyFirstMessage>();
                Console.WriteLine("Received a MyFirstMessage: " + myMsg.Message);
            }
            else
            {
                Console.WriteLine("Received a message I don't understand...");
            }
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine("Exception was thrown: " + ex.GetBaseException().Message);
    }

    BeginReceive();
}

I still have the else statement in there, just as a failsafe, but it will never be called…

That is it! If you start off by starting the subscribers one at the time, and then the publisher, you should see the messages flowing through.

You can also start the publisher first, and then the subscribers. They will still get the messages when they start up.

The only thing is that if you start them at exactly the same time, you might get an exception because two of the apps are both trying to create the Topic at once…

That’s it for this time!

The code is available as usual: DarksideCookie.Azure.ServiceBusDemo.zip


Paolo Salvatori (@babosbird) posted How to integrate a BizTalk Server application with Service Bus Queues and Topics to the MSDN code library on 10/13/2011 (missed when published). From the Description:

Introduction

This sample shows how to integrate a BizTalk Server 2010 application with Windows Azure Service Bus Queues, Topics, and Subscriptions to exchange messages with external systems in a reliable, flexible, and scalable manner.

Queues and topics, introduced in the September 2011 Windows Azure AppFabric SDK, are the foundation of a new cloud-based messaging and integration infrastructure that provides reliable message queuing and durable publish/subscribe messaging capabilities to both cloud and on-premises applications based on Microsoft and non-Microsoft technologies. .NET applications can use the new messaging functionality from either a brand-new managed API (Microsoft.ServiceBus.Messaging) or via WCF thanks to a new binding (NetMessagingBinding), and any Microsoft or non-Microsoft applications can use a REST style API to access these features.
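As a point of reference, sending and receiving through a queue with the new managed API takes only a few lines. A minimal sketch (the namespace, issuer, secret, and queue name are placeholders):

var uri = ServiceBusEnvironment.CreateServiceUri("sb", "yournamespace", string.Empty);
var tokenProvider = TokenProvider.CreateSharedSecretTokenProvider("owner", "yourIssuerSecret");

var namespaceManager = new NamespaceManager(uri, tokenProvider);
if (!namespaceManager.QueueExists("orders"))
    namespaceManager.CreateQueue("orders");

var factory = MessagingFactory.Create(uri, tokenProvider);
var queueClient = factory.CreateQueueClient("orders");

// Send one message and receive it back (PeekLock is the default receive mode).
queueClient.Send(new BrokeredMessage("order #1234"));

var received = queueClient.Receive(TimeSpan.FromSeconds(30));
if (received != null)
{
    Console.WriteLine(received.GetBody<string>());
    received.Complete();
}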

Microsoft BizTalk Server enables organizations to connect and extend heterogeneous systems across the enterprise and with trading partners. The Service Bus is part of Windows Azure AppFabric and is designed to provide connectivity, queuing, and routing capabilities not only for the cloud applications but also for on-premises applications. Using both together enables a significant number of scenarios in which you can build secure, reliable and scalable hybrid solutions that span the cloud and on premises environments:

  • Exchange electronic documents with trading partners.
  • Expose services running on-premises behind firewalls to third parties.
  • Enable communication between spoke branches and a hub back office system.

In this demo you will learn how to use WCF in a .NET and BizTalk Server application to execute the following operations:

  • Send messages to a Service Bus queue.
  • Send messages to a Service Bus topic.
  • Receive messages from a Service Bus queue.
  • Receive messages from a Service Bus subscription.

In this demo you will also learn how to translate the explicit and user-defined properties of a BrokeredMessage object into the context properties of a BizTalk message and vice versa. Before describing how to perform these actions, I’ll start with a brief introduction of the elements that compose the solution.
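On the sending side, those user-defined properties are simply entries in BrokeredMessage.Properties, which the BizTalk sample promotes to and from message context properties. A sketch (the property names are illustrative only, reusing the queueClient from the sketch above):

var message = new BrokeredMessage("order payload");
message.Properties["Application"] = "Contoso ERP";
message.Properties["Country"] = "Italy";
queueClient.Send(message);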

Building the Sample

Inside the zip file you can find a Readme file with the instructions on how to install the demo.

Article

You can read the related article on MSDN at http://msdn.microsoft.com/en-us/library/hh542796(v=VS.103).aspx.



<Return to section navigation list>

Windows Azure VM Role, Virtual Network, Connect, RDP and CDN

imageNo significant articles today.


<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Tim Huckaby (@TimHuckaby) is the interviewer in Bytes by MSDN: October 18 - Simon Hamilton Ritchie of 10/18/2011:

Join Tim Huckaby and Simon Hamilton Ritchie (@simon_h_r, pictured at right), CEO of MatchboxMobile, as they discuss Windows Phone and its many capabilities. Simon walks us through his latest project with T-Mobile, where he helped create an application that helps families communicate through sharing pictures, calendar sync and chat features.

The application was created within 6 weeks prior to the Windows Phone 7 launch and was pre-installed on HTC® phones. Windows Azure was also used to deploy the application and was key in helping the team deliver within 6 weeks. Tune in to hear more!

Open attached file: HDI_ITPro_MSDN_mp3_Simon_Hamilton_Ritchie_MIXEDAUDIO_8000k.mp3


Avkash Chauhan (@avkashchauhan) reported Visual Studio 11 Developer Preview Training Kit is released (Windows 7 and 8) on 10/18/2011:

On 16th October, the Visual Studio team released the first version of the Visual Studio 11 Developer Preview Training Kit. This kit includes hands-on labs to help you understand how to take advantage of the variety of enhancements in Visual Studio 11 and the .NET Framework 4.5, how to support and manage the entire application lifecycle and how to build Windows Metro style apps.

The Training Kit contains the following content:

  • Visual Studio Development Environment
    • A Lap Around the Visual Studio 11 Development Environment
  • Languages
    • Asynchronous Programming in .NET 4.5 with C# and Visual Basic
  • Web
    • What's New in ASP.NET and Visual Studio 11 Developer Preview
    • What's New in ASP.NET Web Forms 4.5
    • Build RESTful APIs with WCF Web API
  • Application Lifecycle Management
    • Building the Right Software: Generating Storyboards and Collecting Stakeholder Feedback with Visual Studio 11
    • Agile Project Management in Team Foundation Server 11
    • Making Developers More Productive with Team Foundation Server 11
    • Diagnosing Issues in Production with IntelliTrace and Visual Studio 11
    • Exploratory Testing and Other Enhancements in Microsoft Test Manager 11
    • Unit Testing with Visual Studio 11: MSTest, NUnit, xUnit.net, and Code Clone
  • Windows Metro Style Apps
    • Windows 8 Developer Preview Hands on Labs from BUILD. NOTE: The Training Kit contains a link to these labs at http://www.buildwindows.com/labs and does not include the labs themselves.

You can download the Training Kit from here: http://go.microsoft.com/?linkid=9779649.

  • The 37 MB file (VS11TrainingKitOctober2001.Setup.exe) contains the entire Training Kit. Install this and you will have all of the labs.
  • The 2 MB file (VS11TK_WebInstaller_Preview.exe) uses the new Content Installer from DPE. When you run this exe you can choose which labs to install. This lets you customize the install to your desires.

Also, in the future, you can download additional labs without having to install the entire kit again.

Note: Don’t install the Visual Studio 11 Developer Preview if you intend to sync SQL Azure with SQL Server 2008 R2 [Express] SP1 databases on the same machine. See my PASS Summit: SQL Azure Sync Services Preview and Management Portal Demo (Problem Fixed) post in the SQL Azure Database and Reporting section above.


Avkash Chauhan (@avkashchauhan) explained Windows Azure Application launch failed with an error "This access control list is not in canonical form and therefore cannot be modified" in a 10/18/2011 post:

Environment: Visual Studio 2010 Professional SP1 & Windows Azure SDK 1.5

While working on an issue, I found an interesting problem with a new hello world ASP.NET-based web role. When launching this application with or without the debugger, the exception below was generated:

System.ServiceModel.FaultException`1 was unhandled
Message=This access control list is not in canonical form and therefore cannot be modified.
Source=mscorlib
Action=http://schemas.microsoft.com/net/2005/12/windowscommunicationfoundation/dispatcher/fault
StackTrace:
Server stack trace:
at System.ServiceModel.Channels.ServiceChannel.ThrowIfFaultUnderstood(Message reply, MessageFault fault, String action, MessageVersion version, FaultConverter faultConverter)
at System.ServiceModel.Channels.ServiceChannel.HandleReply(ProxyOperationRuntime operation, ProxyRpc& rpc)
at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object[] ins, Object[] outs, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)
at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)
Exception rethrown at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at IConfigurator.Deploy(String roleId, WebAppModel webAppModelPath, String roleRootDirectory, String sitesDestinationRootDirectory, String diagnosticsRootDirectory, String roleGuid, Dictionary`2 globalEnvironment)
at Microsoft.WindowsAzure.Hosts.WaIISHost.Program.Main(String[] args)

InnerException:

At the point where the exception occurred, if you open the IIS Manager (Inetmgr.exe) you will see that the site is actually running, with a binding to an IP address similar to 127.255.0.0:82, and if you browse to that address it does show your ASP.NET web application. So the problem is mainly related to the interaction between the compute emulator and the IIS components. After digging for a few more minutes I was not able to find the actual root cause, but I did find a workaround.

Here is what I tried which did not work:

  • Reinstalling Windows Azure SDK 1.5
  • Disabling and then re-enabling IIS from Programs and Features

What worked:

  • When I ran the web role in legacy mode (HWC) by commenting out the whole sites section in the CSDEF file, the exception disappeared.

Unfortunately, if HWC is not an option for you, the faster method to get going again is a fresh installation of the OS and tools.


Don Pattee of the Windows HPC Team announced a Preview of Windows Azure Scheduler and the HPC Pack 2008 R2 Service Pack 3 releases now available in a 10/17/2011 post:

Microsoft's High Performance Computing team has just made our 'release candidate' builds of two products available: the HPC Pack 2008 R2 Service Pack 3 and the Windows Azure Scheduler SDK.

The HPC Pack service pack is an update to the same Windows HPC cluster software that you know and love, with improvements to basic functionality and stability plus a few additional new features. These include integration of the LINQ to HPC runtime (previously released as a beta add-on; check out the BUILD conference presentation at http://channel9.msdn.com/Events/BUILD/BUILD2011/SAC-453T for more information), enhancements to our Windows Azure bursting scenarios that reduce the number of ports you have to open in your firewall (services now use 443 instead of multiple ports), and the ability to install the HPC Pack software on a server not dedicated to your cluster (e.g. a team file server) for use in a manner similar to the Workstation Node functionality previously available.

The big new part is the first chance to try out the Windows Azure Scheduler, previously announced at the BUILD conference (http://channel9.msdn.com/Events/BUILD/BUILD2011/SAC-452T).

The Windows Azure Scheduler for Parallel Applications is a solution that enables you to deploy applications in a scalable, high-performance computing (HPC) infrastructure in Windows Azure. With the Windows Azure Scheduler, you can schedule, submit, and monitor HPC jobs that use your Message Passing Interface (MPI), service-oriented architecture (SOA), or LINQ to HPC applications.

With the Windows Azure Scheduler SDK, you can create Windows Azure deployments that support scalable, compute-intensive, parallel applications. This SDK provides the following features:

  • Built-in job scheduling and resource management.
  • Runtime support for Message Passing Interface (MPI).
  • Service-oriented architecture (SOA).
  • LINQ to high performance computing (HPC) applications, web-based job submission interfaces, and persistent state management of job queue and resource configuration.

Applications that have been built using the on-premises job submission API in Windows HPC Server 2008 R2 can use very similar job submission interfaces in the Windows Azure Scheduler.
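
For reference, job submission against the on-premises Windows HPC Server 2008 R2 scheduler looks roughly like the sketch below; the head node name and command line are assumptions. The post above indicates the Windows Azure Scheduler exposes very similar interfaces, but this sketch uses only the existing on-premises API.

using Microsoft.Hpc.Scheduler;

class JobSubmission
{
    static void SubmitSampleJob()
    {
        // Assumed head node name - replace with your cluster's head node.
        IScheduler scheduler = new Scheduler();
        scheduler.Connect("MyHeadNode");

        ISchedulerJob job = scheduler.CreateJob();
        job.Name = "Sample MPI job";

        ISchedulerTask task = job.CreateTask();
        task.CommandLine = "mpiexec MyMpiApp.exe";   // assumed application
        job.AddTask(task);

        // Passing null credentials reuses cached credentials or prompts for them.
        scheduler.SubmitJob(job, null, null);
    }
}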

To get access to these pre-release installers head over to the HPC team's "Connect" beta website (http://connect.microsoft.com/HPC). Once you sign up, you'll have access to the Release Candidate on the website's "downloads" section.

For questions and comments head over to our discussion forums at http://social.microsoft.com/Forums/en-US/category/windowshpc

Signed up for the former Dryad and DryadLINQ SKUs.


The Blackbird Group reported Blackbird Group Extends Cloud Support Leveraging Microsoft's Azure Platform in a 10/17/2011 press release:

Today at The Experts Conference (TEC) 2011 Europe, where it is a gold sponsor, the Blackbird Group announced support for the Microsoft Azure platform in its Event Vault log consolidation product, which is now shipping.

Blackbird, a Microsoft Managed Gold Partner and leader in compliance and automation management, recognized that its customers would want to take advantage of Microsoft's investment in cloud-based services. Says Blackbird CEO, Christian Ehrenthal, "Now our customers, large and small, can more easily leverage the benefits of cloud storage. Azure will enable them to outsource their SQL and provide maximum flexibility in terms of both demand growth and availability."

Blackbird Event Vault is the newest module in the company's suite of products. It consolidates Windows native log entries into a single repository for centralized management of all event log data. It is the first of Blackbird's products to be Azure-enabled and the company has plans to similarly enable its entire suite of products in the near future.

The Blackbird Management Suite powerfully and seamlessly guides administrators through the entire identity and access lifecycle. It helps enterprises effectively deal with internal security initiatives and regulatory compliance requirements. Its unique integrated approach simplifies real-time auditing, compliance reporting, continuous recovery and access entitlement management, by consolidating them into a single easy-to-use console.

About the Blackbird Group, Inc.

A leader in automation and compliance management, the Blackbird Group provides insight for enhanced governance and cost efficiencies across the Microsoft Windows infrastructure. Blackbird's innovative solutions help organizations make the most of their Windows infrastructure investments through automation and simplified audit, recovery, event management and reporting. The Blackbird Group is a privately held company with more than 15 million seats under management. Its headquarters are located in Manhattan, New York.


Bruce Kyle reported Microsoft Demos Dynamics NAV Running on Windows Azure in a 10/16/2011 post:

    The Microsoft Dynamics NAV team has produced a demo of the application running on Windows Azure. The product was demoed at Directions US, which is an event for Dynamics NAV Partners run by Dynamics NAV Partners.

    In the team’s blog post Excited in Orlando, White writes, “At Convergence 2011, we announced that we would deliver Microsoft Dynamics NAV on Azure with the next major release of the solution - Microsoft Dynamics NAV “7”. Last night we demonstrated Dynamics NAV running on Azure. The development project is in great shape. We expect to ship this release in September/October 2012.”

    About Dynamics NAV

    Microsoft Dynamics NAV (formerly Navision) delivers comprehensive business management functionality, from financials to your supply chain to manufacturing and more. It connects the many moving parts of your organization, giving you better visibility into and control over what's going on in your business. And it supports highly specific industries with powerful solutions created by Microsoft partners.

    To learn more about Microsoft Dynamics NAV visit http://www.microsoft.com/en-us/dynamics/erp-nav-overview.aspx or contact a Microsoft Dynamics partner in your local market.


    Avkash Chauhan (@avkashchauhan) described Handling RoleEnvironment.Stopping event to perform specific action when Windows Azure VM is going down for scheduled update in a 10/16/2011 post:

    If you have read my previous article about Windows Azure VM downtime due to Guest and Host OS updates…

    … I would like to add a little more information on this topic for completeness. You can now be sure that it is quite possible for your Windows Azure VM to be down for a very short while when the Host OS is updated (roughly once a month), and likewise when the Guest OS is updated at about the same frequency. So what else can you do when your Azure VM goes down for a short update?

    One option is to handle the RoleEnvironment.Stopping event and perform some actions there if you wish to do so. One thing to consider is that by the time you get the RoleEnvironment.Stopping event, your VM has already been taken out of the load balancer.

    Code snippet is as below:

    using System;
    using System.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    // In your RoleEntryPoint-derived class (the class name here is just an example):
    public class WebRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Subscribe to the Stopping event before the instance starts taking traffic.
            RoleEnvironment.Stopping += RoleEnvironmentStopping;

            return base.OnStart();
        }

        private void RoleEnvironmentStopping(object sender, RoleEnvironmentStoppingEventArgs e)
        {
            // Add code that is run when the role instance is being stopped
            // (the instance has already been removed from the load balancer at this point).
        }

        public override void OnStop()
        {
            try
            {
                // Add code here that runs when the role instance is to be stopped
            }
            catch (Exception e)
            {
                Trace.WriteLine("Exception during OnStop: " + e.ToString());
                // Take other action as needed.
            }
        }
    }


    Note: Code running in the OnStop method has a limited time to finish when it is called for reasons other than a user-initiated shutdown. After this time elapses, the process is terminated, so you must make sure that code in the OnStop method can run quickly or tolerates not running to completion.

    So if you decide to write some cleanup code based on the note above, you might ask how much time you have to run your cleanup code in the Stopping event, and why there is a limit at all. The main reasons for having a fixed period to get out of this event are stabilizing the Azure VM's health and not keeping your instance out of the load balancer for longer than necessary. Potentially:

    1. It is possible the OS gets stuck because your code won't quit and the OS keeps waiting.
    2. It is also possible the OS could hit a problem during the shutdown.
    3. The guest OS may take some time to spin down, much like a normal shutdown: the OS flushes the system, and it could take a while to get past the shutdown screen, which can be perfectly normal.

    So the period of time is enough for the fabric to detect all of these variations, and it should not be something you have to worry about. You can learn more about the RoleEnvironment.Stopping event in the MSDN documentation.
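
    For illustration, here is a sketch of time-bounded cleanup in OnStop that tries to stay well inside whatever budget the fabric allows. The 20-second budget and the FlushBufferedWork helper are assumptions for the sketch, not platform-defined values.

    using System;
    using System.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class WorkerRole : RoleEntryPoint
    {
        public override void OnStop()
        {
            // Assumed budget; keep it conservative because the process can be
            // terminated if OnStop does not return in time.
            TimeSpan budget = TimeSpan.FromSeconds(20);
            Stopwatch stopwatch = Stopwatch.StartNew();

            // Flush pending work in small chunks until done or the budget is used up.
            while (stopwatch.Elapsed < budget && FlushBufferedWork())
            {
            }

            Trace.Flush();
            base.OnStop();
        }

        // Hypothetical helper: flushes one chunk of pending work and returns
        // true while more work remains.
        private bool FlushBufferedWork()
        {
            return false; // placeholder for application-specific cleanup
        }
    }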


    <Return to section navigation list>

    Visual Studio LightSwitch and Entity Framework 4.1+

    The ADO.NET Team described How We Talk about EF and its Future Versions in a 10/18/2011 post:

    The EF team has been working to become more agile and get new features into your hands faster. Our first two releases shipped as part of the .NET Framework but, as the result of an effort to release more often, in recent times we have been shipping a slew of “out-of-band” features that build on top of the functionality included in .NET.

    These out-of-band releases caused some confusion because we didn’t have a consistent way of talking about versions. To avoid generating more of this confusion in the future we recently decided to adopt semantic versioning principles.

    While consistent versioning is great we still have quite a few EF related packages that ship independently of each other. We need to rationalize how we talk about them and define what we are actually referring to when we talk about “Entity Framework” and future versions of it.

    The purpose of this post is to provide additional clarity about how we plan to talk about our releases in the future and to get your feedback on the approach we are taking.

    The Confusion

    In order to deliver features as quickly as possible, we want to keep as much as possible outside of the .NET Framework. With EF continuing to have features that ship both inside and outside the .NET Framework we need to rationalize how we talk about these.

    All the EF related libraries, packages and downloads we have today are:

    • Core EF libraries that ship in the .NET Framework (i.e. System.Data.Entity.dll, System.Data.Entity.Design.dll, System.Web.Entity.dll, etc.)
    • EntityFramework NuGet package which includes the DbContext API and Code First (i.e. EntityFramework.dll)
    • Entity Designer that ships in Visual Studio
    • T4 Templates for using DbContext API with Model First & Database First that ship on Visual Studio Gallery
    • EF Power Tools (still in Alpha stage) that ship on Visual Studio Gallery
    • Code First Migrations (still in Alpha stage) which ships via NuGet

    The Clarity

    Obviously it’s going to get very confusing if we release different versions of those at different times. To avoid this we are going to focus on the EntityFramework NuGet package as the primary deliverable that we refer to as “The Entity Framework” or “EF”. The version number of this package will now define the current version of EF (currently EF 4.1 and soon to be EF 4.2). This is the main point that will govern how we talk about EF, all of our blog posts, how-to videos, whitepapers etc. will guide you toward using the EntityFramework NuGet package.
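
    For anyone who hasn't used the package, here is a minimal Code First sketch of the kind of thing the EntityFramework NuGet package (EntityFramework.dll) provides on top of the core libraries; the Blog and Post classes are hypothetical examples, not types from the package.

    using System.Collections.Generic;
    using System.Data.Entity;
    using System.Linq;

    public class Blog
    {
        public int BlogId { get; set; }
        public string Name { get; set; }
        public virtual ICollection<Post> Posts { get; set; }
    }

    public class Post
    {
        public int PostId { get; set; }
        public string Title { get; set; }
        public int BlogId { get; set; }
        public virtual Blog Blog { get; set; }
    }

    // DbContext and DbSet<T> ship in EntityFramework.dll from the NuGet package,
    // layered on the core EF libraries in the .NET Framework.
    public class BloggingContext : DbContext
    {
        public DbSet<Blog> Blogs { get; set; }
        public DbSet<Post> Posts { get; set; }
    }

    class Program
    {
        static void Main()
        {
            using (var db = new BloggingContext())
            {
                db.Blogs.Add(new Blog { Name = "ADO.NET team blog" });
                db.SaveChanges();

                var blogNames = db.Blogs.Select(b => b.Name).ToList();
            }
        }
    }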

    We will generally treat the core Entity Framework libraries in the .NET Framework as we do any other .NET libraries, such as System.dll, System.Data.dll etc. Therefore when the next release of the .NET Framework comes out we will release an updated version of EF that takes advantage of the new features in the Framework (including new functionality in System.Data.Entity.dll). In situations when we need to refer explicitly to the EF specific assemblies in the .NET Framework we will call them the EF Core Libraries.

    The Entity Designer will continue to be installed with Visual Studio and will work with the latest version of EF.

    In the future we hope to move the classes from System.Data.Entity.dll into the EntityFramework NuGet package so that we can release features such as Enums without having to wait for a .NET Framework update.

    The T4 Templates for DbContext will be available under the “Online” tab when you select “Add New Code Generation Item…” from the EF designer. The templates are located on the Visual Studio Gallery but you don’t need to be aware of this or visit the gallery to use them. We are leaving the templates on VS Gallery so that we can quickly and easily respond to bug fixes, feature requests etc.

    The EF Power Tools are still in preview mode and we haven’t really decided where they belong yet. For the moment they will remain as a stand-alone download on Visual Studio Gallery as this is the easiest place to distribute them.

    Code First Migrations will remain as a separate NuGet package and when Code First detects that you need to migrate the database schema it will direct you to this NuGet package. Keeping Code First Migrations as a separate package allows us to quickly iterate on this new feature.

    Some Features Waiting on .NET Framework 4.5

    Since we shipped EF 4.0 as part of .NET 4.0, we started talking about upcoming features as “EF vNext” features. When we then released EF 4.1 and more recently published the EF 4.2 previews, many of you logically asked why those features weren’t included.

    While the out-of-band approach has worked well for the DbContext API, Code First and Code First Migrations, the fact is that there are some other features, such as enum type support, that inherently require us to update the core libraries that we previously shipped as part of .NET.

    The obvious solution to this is to ship the entire Entity Framework independent of the .NET Framework. The June 2011 CTP was our first attempt at doing this. While we are still pursuing this option it has become clear that from a technical standpoint we are not ready to achieve this immediately. Because of this our new features that require updates to our core libraries will need to wait for the next .NET Framework release. This includes support for Enum Types, Spatial Types, Table-Valued Functions, Stored Procedures with Multiple Results and Auto-Compiled LINQ Queries.

    These features will reappear in a preview of Entity Framework that we will ship alongside the next public preview of .NET 4.5.

    Summary

    In summary, when we talk about “Entity Framework” we are now going to be talking mainly about the out-of-band features that we ship as the EntityFramework NuGet package. Entity Framework is built on the .NET Framework and will make use of the EF core libraries that ship with the .NET Framework, including System.Data.Entity. The EF specific tooling that ships with Visual Studio will always work with the latest version of EF.


    Beth Massi (@bethmassi) reported on 10/17/2011 her LightSwitch Tips & Tricks on dnrTV:

    Check it out, I’m back with the always entertaining Carl Franklin on dnrTV showing some tips and tricks you can use in your screens and queries. See how you can add command bars to any control, create custom search screens, pass parameters into queries, and fine-tune the layout of your screens.

    Watch: Beth Massi on Tips and Tricks in LightSwitch 2011

    Here are some links I’d like to share from the show:


    <Return to section navigation list>

    Windows Azure Infrastructure and DevOps

    The Editor of the Gadget blog (@gadgetza, South Africa) reported Microsoft unveils Cloud for SA on 10/17/2011:

    Microsoft has announced the imminent South African release of its cloud offerings, including the Azure platform and the Office version for use online, Office 365. It also lifted the lid on the Windows Phone Mango platform. [Emphasis added.]

    Microsoft has stepped up its aggressive drive into the South African consumer and cloud space with the imminent local release in the next few months of two key products: its Azure cloud platform and its cloud subscription version of Office, Office 365 as well as with the recently released Windows Phone Mango platform.

    That was the big news from the opening day of the software maker’s Tech-Ed and Partner Summit 2011 in Durban, which has attracted more than 3 500 technology vendors, developers and executives from across Africa.

    Tech·Ed runs from 17-20 October at the International Conference Centre (ICC), with Partner Summit ending one day earlier on 19 October. Another smaller event, Microsoft’s CIO Summit, will be hosted at the Oyster Box Hotel at the same time, and has drawn nearly 100 of the country’s top chief information officers.

    The Windows Phone Mango update brings 500 new features to local users of the platform, but the major development is the availability of Marketplace, which will allow South African consumers to buy local and international apps using local currency via their credit cards. Mango also includes Xbox live integration, which will allow users to access Xbox mobile games.

    Speaking at the event keynote at the ICC, Microsoft corporate vice-president Jason Zander said Microsoft’s cloud platform, Azure, is planned for local release between March and May 2012. Office 365, which brings cloud productivity to businesses of all sizes, will be commercially available in the first half of 2012, with trial availability towards the end of this year. [Emphasis added.]

    “Microsoft has repeatedly made its commitment to the cloud very clear and has made repeated updates to its cloud offerings,” said Zander. “We recently announced several new updates to the Windows Azure platform – which we see as the most comprehensive operating system for Platform-as-a-service – that will help customers create rich applications that enable new business scenarios in the cloud.”

    In his welcome, Microsoft South Africa MD Mteto Nyati said the event would focus strongly on the two broad trends that are shaping the industry and are shaping Microsoft’s strategy: the cloud, both public and private cloud computing, as well as devices.

    “You're going to see a whole bunch of devices of different shapes, sizes, form factors, speeds, usage types. We need a world of devices, and they need to be smart. They need to create data, they need to connect to the cloud,” said Nyati.

    “The cloud for us is the extension of rich experiences that once began on the desktop or on the server, and it's making them richer and more interesting to users and more compelling every single day.”

    Microsoft has high hopes for Office 365, which it says will bring cloud productivity to businesses of all sizes, particularly smaller businesses without an IT department. The service will be hosted from the Microsoft datacentres in Europe, leveraging the economies of scale present in these large datacentres.

    “Office 365 is the best of everything we know about productivity, all in a single cloud service,” said Zander. “The power of cloud solutions allows companies to rent computing power, rather than acquire it outright. Microsoft Office 365 is software-as-a-service, a form of cloud computing where business services are presented to the end user in a subscription model.”

    Customers and partners can visit www.office365.co.za to pre-register for the trial.

    More details on the events, as well as links to videos, blogs, product downloads, and other information about this year’s event, can be found at www.teched.co.za and www.partnersummit.co.za.


    Mitesh reported NIST Releases Federal Cloud Roadmap, Architecture on 10/14/2011:

    The organization that creates technology standards for the federal government has released a new cloud computing roadmap and reference architecture as part of its continued efforts to help federal agencies adopt this technology model. The documents provide guidance for agencies to help understand the standards they should use when deploying clouds, as well as the categories of cloud services that can be used across the government, according to the National Institute of Standards and Technology (NIST).

    The NIST Cloud Computing Standards Roadmap includes a standards inventory the organization will continue to update as more are created. The inventory covers standards for key features of deploying a cloud computing architecture, such as security, portability, and interoperability. It also identifies models and use cases that are relevant to cloud computing and identifies standardization priorities for the feds in the areas of security auditing and compliance, and identity and access management, according to NIST.

    The guiding principles were used to create the NIST Cloud Computing Reference Architecture, a vendor-neutral design meant to serve as a guideline for agencies, not to be followed specifically, according to NIST. The architecture is based on a system of the so-called “actors” in a cloud architecture (consumer, provider, broker, auditor, and carrier) and defines the roles each plays.

    <Return to section navigation list>

    Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds

    Lori MacVittie (@lmacvittie) added Examining architectures on which hybrid clouds are based… as an introduction to her Cloud Infrastructure Integration Model: Bridging article of 10/17/2011 for F5’s DevCentral blog:

    IT professionals, in general, appear to consider themselves well along the path toward IT as a Service, with a significant plurality of them engaged in implementing many of the building blocks necessary to support the effort. IaaS, PaaS, and hybrid cloud computing models are essential for IT to realize an environment in which (manageable) IT as a Service can become reality.

    That IT professionals – 65% of them, to be exact – note their organization is in progress with or has already completed a hybrid cloud implementation is telling, as it indicates a desire to leverage resources from a public cloud provider.

    What the simple “hybrid cloud” moniker doesn’t illuminate is how IT organizations are implementing such a beast. To be sure, integration is always a rough road and integrating not just resources but its supporting infrastructure must certainly be a non-trivial task. That’s especially true given that there exists no “standard” or even “best practices” means of integrating the infrastructure between a cloud and a corporate data center.

    Specifications designed to address this gap are emerging and there are a number of commercial solutions available that provide the capability to transparently bridge cloud-hosted resources with the corporate data center.

    Without diving into the mechanism – standards-based or product solution – we can still examine the integration model from the perspective of its architectural goals, its advantages and disadvantages.

    THE BRIDGED CLOUD INTEGRATION ARCHITECTURE

    The basic premise of a bridged-cloud integration architecture is to transparently enable communication with and use of cloud-deployed resources. While the most common type of resources to be integrated will be compute, it is also the case that these resources may be network or storage focused. A bridged-cloud integration architecture provides for a seamless view of those resources. Infrastructure and applications deployed within the data center are able to communicate in an environment agnostic-manner, with no requirement of awareness of location.


    This is the premise of the network-oriented standards emerging as a solution: they portend the ability to extend the corporate data center network into a public cloud (or other geographically disparate location) network and make them appear as a single, logical network.

    Because of the reliance of infrastructure components on network topology, this is an important capability. Infrastructure within the data center and the services they provide are able to interact with and continue to enforce or apply policies to the resources located external to the data center. The resources can be treated as being “on” the local network by infrastructure and applications without modification.

    Basically, bridging normalizes the IP address space across disparate environments.

    Obviously this approach affords IT a greater measure of control over cloud-deployed resources than would be otherwise available. Resources and applications in “the cloud” can be integrated with corporate-deployed services in a way that is far less disruptive to the end-user. For example, a load balancing service can easily extend its pool of resources into the cloud to scale an application without the need to adjust its network configuration (VLANs, routing, ACLs, etc…) because all resources are available on what are existing logical networks. This has the added benefit of maintaining operational consistency, especially from a security perspective, as existing access and application security controls are applied inline.

    All is not rosy in bridging land, however, as there are negatives to this approach. The most obvious one should be the impact on performance. Latency across the Internet, implied by the integration of cloud-based resources, must be considered when determining to which uses those remote resources should be put. Scaling applications that are highly latency-sensitive using remote resources in a bridged architecture may incur too high a performance penalty. Alternatively, however, applications integrated using out-of-band processing, i.e. an application that periodically polls for new data and processes it in bulk, behind the scenes, may be well-suited to such an architecture as latency is not usually an issue.

    The bridging model also does not address the need for fault tolerance. If you're relying on remote resources to ensure scalability, and failure may result without them, you run the risk that connectivity issues will cause an outage. It may be necessary to employ a tertiary provider, which could result in increased complexity in the network and require changes to the infrastructure to support it.

    Next time we’ll examine a second approach to cloud infrastructure integration: virtualization.


    No significant articles today.


    <Return to section navigation list>

    Cloud Security and Governance

    Jack Greenfield started a Business Continuity On Azure series on 10/18/2011:

    Azure customers want their services to be continuously available to their users, despite component failures, platform degradations and data center outages. In other words, they want business continuity. To achieve it, the services must be highly available, and must recover from outages quickly and with minimal data loss, while complying with policies and regulations, and conforming to accepted industry practices.


    Windows Azure and SQL Azure currently provide high availability within a data center, but they don’t provide any protection when data centers go down or offline. Customers can deploy services to multiple data centers to work around this limitation, but the platform doesn’t help them integrate the separate deployments to form highly available, geographically distributed services. This is the first of several blog posts on this topic taken from a paper that I recently wrote with Dima Sonkin and Erik Wahlstrom.

    The posts will provide an introduction to business continuity, describe Azure platform features enabling business continuity, and describe how to achieve business continuity with the existing features.


    <Return to section navigation list>

    Cloud Computing Events

    Alan Smith (@alansmith) reported on 10/17/2011 a Sweden Windows Azure Group (SWAG) meeting on Thursday 27th October in Stockholm, "Extending to the Cloud":

    I’ll be presenting a session on “Extending to the Cloud” at the Sweden Windows Azure User Group next week.

    Extending to the Cloud


    Extending to the cloud involves developing hybrid applications that leverage the capabilities of cloud based platforms. Many of the Windows Azure solutions developed today involve extending the capabilities of existing applications and infrastructure to leverage the capabilities of Azure.

    These hybrid applications allow developers to combine the best of both worlds. Existing IT infrastructure can be utilized and the capabilities of cloud platforms used to extend the applications, providing enhanced functionality at minimal cost.

    This demo-centric session will highlight how the hosting, compute and storage capabilities of the Windows Azure platform can be used to extend on-premise applications to the cloud. The rendering of 3D animations and cloud-based source control will be used as case studies.

    Free registration is here.


    Jeff Barr (@jeffbarr) reported on 10/17/2011 the Free Live Stream of AWS GovCloud Summit II on 10/18/2011:

    Due to an overwhelming response, the live registration for tomorrow's AWS GovCloud Summit has been closed.

    We will be streaming the entire event and I invite you to sign up now. The event will feature keynote addresses by GSA CIO Casey Coleman and Amazon.com CTO Werner Vogels, customer presentations, how-to sessions, information on FISMA and ITAR compliance, and tracks specifically designed for technical and senior manager audiences.

    You will also get to hear from the following government presenters:

    • Shawn Kingsberry - Chief Information Officer, Recovery Accountability and Transparency Board.
    • Don Preuss - Head-Systems, National Institute of Health, National Library of Medicine, National Center for Biotechnology Information.
    • Tim Stitely - Director of Administrative Operations, Health and Human Services, Program Support Center.
    • Khawaja Shams - Senior Solutions Architect, NASA Jet Propulsion Laboratory.
    • Todd Myers - Chief Technology Adviser, NSG Expeditionary Architecture.

    There will be one stream for the morning keynotes and customer sessions and three parallel streams for the Executive, Architect, and Solution tracks in the afternoon.

    The Executive track will cover oversight and governance, the AWS GovCloud, cloud payment and pricing models, compliance, and standards.

    The Architect track will cover best practices for architecting in the cloud, migrating applications to the cloud, application security best practices, and selecting storage options.

    Again, the stream is free and you can sign up here.


    <Return to section navigation list>

    Other Cloud Computing Platforms and Services

    Geva Perry (@gevaperry) posted Java PaaS to his Thinking Out Cloud blog 10/17/2011:

    A couple of weeks ago during OracleWorld/JavaOne, Oracle announced its public cloud offering, including its Oracle Java Cloud Service -- the latest entrant into the increasingly crowded Java Platform-as-a-Service space. Although Java is the most popular programming language, PaaS offerings for it were not adopted as quickly as those for Ruby, Python and PHP.

    Not surprisingly, however, once Java PaaS arrived on the scene, many of the big players now have PaaS offerings -- given Java's popularity in the enterprise.

    Here's the list I compiled (in alphabetical order). If I have overlooked anything, please let me know in the comments:

    It should be noted that HP has also made some noise about a Java PaaS, but it hasn't launched yet.

    Additional reading:


    Lydia Leong (@CloudPundit) announced that she’ll be a Gartner Symposium this week in a 10/16/2011 post:

    I am at Gartner Symposium in Orlando this week, and would be happy to meet and greet anyone who feels like doing so.

    I am conducting a workshop on Thursday, at 11 am in Salon 7 in Yacht and Beach, called “Using Amazon Web Services“. (The workshop is full, but it’s always possible there may be no-shows if you’re trying to get in.) This workshop is targeted at attendees who are currently AWS customers, or who are currently evaluating AWS.

    Gartner Invest clients, I’ll be at the Monday night event, and willing to chatter about anything (CDNs, especially Akamai, seem to be the hot topic, but I’m getting a fair chunk of questions about Rackspace and Equinix).

    I hope to blog about some trends from my one-on-one interactions and other conversations at the conference, as we go through the week.


    <Return to section navigation list>
