Tuesday, December 04, 2012

Windows Azure and Cloud Computing Posts for 12/2/2012+

A compendium of Windows Azure, Service Bus, EAI & EDI, Access Control, Connect, SQL Azure Database, and other cloud-computing articles.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:


Azure Blob, Drive, Table, Queue, Hadoop and Media Services

Jonathan Antoine (@jmix90) described a Win8 App : stream videos from a Windows Azure Blob storage in a 12/4/2012 post:

Videos are everywhere, especially in Windows 8 apps. Using Windows Azure Blob storage for them is a good solution, and it is what we chose in one of the apps we built at my company (Infinite Square).

We uploaded the files to blobs and used a MediaElement to play them, but it seemed the video had to be fully downloaded before it would play. That was very frustrating, since the videos weighed dozens of megabytes.

The solution is in the Cloud!

When you use Blob storage, you can actually set a default version for the API. That may be obvious to an Azure expert, but it wasn't to me.

If you set this default version to "2011-08-18", you will be able to do partial and pause/resume downloads on blob objects. This is exactly what we were looking for!

To do this, the only way I found was to create a little .NET program, reference the Azure SDK (thank you, NuGet) and call SetServiceProperties with the correct arguments.

// Requires the Azure SDK storage client library (via NuGet); the exact using
// directives (e.g. Microsoft.WindowsAzure and the storage client blob namespace)
// vary slightly between SDK versions.
string accountName = "your account Name";
string accountKey = "your account key";

var cloudStorageAccount = CloudStorageAccount.Parse(
    string.Format(
        "DefaultEndpointsProtocol=http;AccountName={0};AccountKey={1}",
        accountName, accountKey));

CloudBlobClient client = cloudStorageAccount.CreateCloudBlobClient();

// Read the current service properties so the logging and metrics settings are
// preserved, then change only the default service version.
var prp = client.GetServiceProperties();
client.SetServiceProperties(new ServiceProperties()
{
    DefaultServiceVersion = "2011-08-18",
    Logging = prp.Logging,
    Metrics = prp.Metrics
});

Now you just have to use the URI of the video in your XAML UI (or with a video tag):

<playerFramework:MediaPlayer Source="{Binding VideoUri}"
                             PosterSource="/Assets/Images/videoPoster.png"
                             AutoPlay="False"
                             AutoLoad="False" />


Aung Oo and Matthew Hendle of the Windows Azure Storage Team described AzCopy – Uploading/Downloading files for Windows Azure Blobs in a 12/3/2012 post:

One of the frequent requests we receive is for a simple way to upload or download files between Windows Azure Blob storage and the local file system. We're pleased to release AzCopy (Beta Version), a command-line utility that allows Windows Azure Storage customers to do just that. The utility is designed to simplify the task of transferring data in to and out of a Windows Azure Storage account. Customers can use this as a standalone tool or incorporate this utility in an existing application. The utility can be downloaded from Github.

The command is analogous to other Microsoft file copy utilities like robocopy that you may already be familiar with. Below is the syntax:

AzCopy <Source> <Destination> [filepattern [filepattern…]] [Options]

In this post we highlight AzCopy’s features without going into implementation details. The help command (AzCopy /?) lists and briefly describes all available system commands and parameters.
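
Since the post mentions that AzCopy can be incorporated into an existing application, one low-tech way to drive it from .NET is simply to shell out to the executable and inspect its exit code. The sketch below is purely illustrative: the install path, argument string, and the RunAzCopy helper are assumptions, not part of AzCopy itself.

using System;
using System.Diagnostics;

static class AzCopyRunner
{
    // Hypothetical helper that shells out to AzCopy.exe; the path below is an
    // assumed install location and should be replaced with the real one.
    public static int RunAzCopy(string source, string destination, string options)
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = @"C:\Tools\AzCopy\AzCopy.exe",
            Arguments = string.Format("\"{0}\" \"{1}\" {2}", source, destination, options),
            UseShellExecute = false,
            RedirectStandardOutput = true
        };

        using (var process = Process.Start(startInfo))
        {
            // AzCopy writes its transfer summary to standard output.
            Console.WriteLine(process.StandardOutput.ReadToEnd());
            process.WaitForExit();
            return process.ExitCode;   // non-zero indicates a failed run
        }
    }
}

For instance, RunAzCopy(@"C:\blob-data", "https://myaccount.blob.core.windows.net/mycontainer/", "/destkey:key /S") would mirror Example 1 below.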

Key Features:
  • Efficient and Flexible: AzCopy allows users to copy data selectively. By using recursive mode, customers can copy nested directories of files. Users can specify a file pattern – wildcards in the Windows file system or a prefix in blob storage – to identify source files that are candidates for copying. In addition, users can set an option to copy only files that have the "Archive" attribute set. When copying a large number of files, if a copy fails due to network or other issues, the re-startable option can resume the copy process from where it left off (the files that have not yet been copied), so you will not need to re-copy files that succeeded in the previous attempt.
  • Support for Windows Azure Storage Data Types: AzCopy provides options for customers to specify the destination data in a storage account as a block blob or a page blob. Default is set to block blobs, as this is best suited for nearly all files. When using the page blob option, the blob will be zero-padded to a 512-byte boundary.
  • Naming Rules: URI format (http or https) is used to specify the blob storage path, and the NTFS Windows file folder path is used for the Windows file system. Since some blob names cannot be translated directly into Windows file system names, AzCopy translates them into Windows file system names using its own rules. Here are some of the rules we follow in naming translations:
    • Blobs that conflict with Windows special file names will be renamed using the following rules: “.” => “dot”; “..” => “dotdot”; “/” => “slash”; etc. As with other conflict resolution rules, if a conflict occurs on any of these names the string “(n)” will be added to the conflicting file or blob name such that the conflict is resolved.
    • Because the Windows file system namespace is case insensitive (but case retentive) and the Windows Azure blob namespace is case sensitive, the following rules apply:
      • Blobs in the blob namespace are created with the default case for the file name.
      • Files are created with the default case of the blob namespace.
      • If a case-conflict occurs while copying from the blob-namespace to the file-namespace, we will append the string “(n)” to the conflicting file or blob names.
  • Logging: Users can run AzCopy in verbose mode, which displays the files and directories processed and lists any files that the utility failed to copy. AzCopy also displays a progress indication for each file while running in verbose mode.
Examples:

Example 1: Copy a directory of locally accessible files to blob storage container in a recursive mode.

AzCopy C:\blob-data https://myaccount.blob.core.windows.net/mycontainer/ /destkey:key /S

The above command will copy all files from the "c:\blob-data" directory and all subdirectories as block blobs to the container named "mycontainer" in storage account "myaccount". The "blob-data" folder contains the following files and one subdirectory named "subfolder1":

  • C:\blob-data\car1.docx
  • C:\blob-data\car2.docx
  • C:\blob-data\car3.docx
  • C:\blob-data\train1.docx
  • C:\blob-data\subfolder1\car_sub1.docx
  • C:\blob-data\subfolder1\car_sub2.docx

After the copy operation, “mycontainer” blob container will contain the following blobs:

  • car1.docx
  • car2.docx
  • car3.docx
  • train1.docx
  • subfolder1/car_sub1.docx
  • subfolder1/car_sub2.docx

If we do not use recursive mode (copying without the "/S" option), the "mycontainer" blob container would contain only the following files from the "blob-data" folder, and the files under the "subfolder1" folder would be ignored.

  • car1.docx
  • car2.docx
  • car3.docx
  • train1.docx

Example 2: Recursively copy a set of blobs from a blob storage to a locally accessible directory in both verbose and recursive modes.

AzCopy https://myaccount.blob.core.windows.net/mycontainer c:\blob-data /sourceKey:key /S /V

The command will copy all blobs under the “mycontainer” blob container in account “myaccount” to the “c:\blob-data” directory in both verbose and recursive modes.

“mycontainer” blob container contains the following files:

  • car1.docx
  • car2.docx
  • car3.docx
  • train1.docx
  • subfolder1/car_sub1.docx
  • subfolder1/car_sub2.docx

Since we are using verbose mode, the tool will display the following output, which contains the transfer status of each file in addition to the transfer summary. By default, the tool displays only the transfer summary:

  • Finished Transfer: car1.docx
  • Finished Transfer: car2.docx
  • Finished Transfer: car3.docx
  • Finished Transfer: train1.docx
  • Finished Transfer: subfolder1/car_sub1.docx
  • Finished Transfer: subfolder1/car_sub2.docx
  • Transfer summary:
  • -----------------
  • Total files transferred: 6
  • Transfer successfully: 6
  • Transfer failed: 0

After the copy operation, c:\blob-data folder will contain the files listed below:

  • C:\blob-data\car1.docx
  • C:\blob-data\car2.docx
  • C:\blob-data\car3.docx
  • C:\blob-data\train1.docx
  • C:\blob-data\subfolder1\car_sub1.docx
  • C:\blob-data\subfolder1\car_sub2.docx

Let's try a slightly different scenario by copying the blobs that start with "subfolder1/" by using the following command:

AzCopy https://myaccount.blob.core.windows.net/mycontainer/subfolder1 c:\blob-data /sourceKey:key /S /V

The above command will only copy blobs which begin with “subfolder1/”, and thus the tool will only copy “subfolder1/car_sub1.docx” and “subfolder1/car_sub2.docx” blobs to “c:\blob-data\” folder. After the copy operation, “C:\blob-data” will contain the following files:

  • C:\blob-data\car_sub1.docx
  • C:\blob-data\car_sub2.docx

Example 3: Copy a directory of locally accessible files to a blob account in re-startable mode

AzCopy c:\blob-data https://myaccount.blob.core.windows.net/mycontainer /destkey:key /Z:restart.log /S

Restart.log, a journal file, will be used to maintain a record of the status of the copy operation so that the operation can restart if interrupted. If no journal file is specified along with the re-startable mode parameter, it defaults to "azcopy.log" in the current working directory.

For instance, suppose the "C:\blob-data" folder contains the following five large files, each larger than 100 MB:

  • C:\blob-data\car.docx
  • C:\blob-data\car1.docx
  • C:\blob-data\car2.docx
  • C:\blob-data\car3.docx
  • C:\blob-data\car4.docx

When running with the re-startable option, AzCopy allows you to restart the process in case of failure. If the failure occurred while copying "car.docx", AzCopy will resume the copy from the part of "car.docx" that had not yet been copied. If the failure occurred after "car.docx" was successfully copied, AzCopy will resume the operation with one of the remaining four files that have yet to be copied.

Example 4: Select a subset of files in a blob storage container using a file pattern and copy them to a locally accessible directory.

AzCopy https://myaccount.blob.core.windows.net/mycontainer c:\blob-data car /sourceKey:key /Z /S

“mycontainer” contains the following files:

  • car1.docx
  • car2.docx
  • car3.docx
  • train.docx
  • carfolder/car_sub1.docx
  • carfolder/train_sub2.docx
  • subfolder1/car_sub1.docx
  • subfolder1/car_sub2.docx

After the copy operation, "c:\blob-data" will contain the files listed below. Since the file pattern prefix "car" was specified, the copy operation copies only the blobs that match it. Note that the prefix is applied to the blob name if the blob is directly in the "mycontainer" container, or otherwise to the subdirectory name.

  • C:\blob-data\car1.docx
  • C:\blob-data\car2.docx
  • C:\blob-data\car3.docx
  • C:\blob-data\carfolder\car_sub1.docx
  • C:\blob-data\carfolder\train_sub2.docx
Performance

Within a Windows Azure datacenter (i.e., between a compute instance and a storage account within the same DC), users should be able to achieve about 50 MB/s of upload and download throughput for large amounts of data when using an extra-large compute instance. Transfers to and from a Windows Azure datacenter will be constrained by the bandwidth available to AzCopy.

Known Issues
  • When copying without /XN (Exclude Newer) and /XO (Exclude Older), the tool compares only the names of the source and target files before copying. Therefore, users will be prompted whether to overwrite the target files even when the source and target files are identical.
  • When using /XN and /XO, note that your local system time and the time stored in the storage service may vary slightly. So if the blob and local file were modified at nearly the same time, this comparison may not filter correctly.
  • When copying a file to a page blob with the re-startable option, if the copy operation fails partway through, the tool will restart the copy process from the beginning of the file. This issue does not apply to copying a file to a block blob.
  • When copying blobs to the local %systemdrive%, the tool will not prompt for confirmation before overwriting existing files with the same name.
  • If there are two blobs named "a" and "a/b" under a storage container, copying the blobs under that container with /S will fail, because Windows does not allow a folder named "a" and a file named "a" to exist under the same parent folder.



<Return to section navigation list>

Windows Azure SQL Database, Federations and Reporting, Mobile Services

Chris Klug (@ZeroKoll) described Authenticating users in Windows Azure Mobile Services in a 12/3/2012 post:

In my previous post about Mobile Services, I talked about how to get started with the service. I also promised that I would follow up with a post about how to authenticate the users, so that is what this post is going to be about.

You currently have 4 different options when it comes to authentication: Microsoft ID (previously Live ID), Facebook, Twitter and Google. They are all 3rd-party services, and they require your users to have an account with one of the providers. Luckily, most users already do. And the neat thing about using 3rd-party authentication is that you don't have to care about handling sensitive data such as usernames and passwords. And leaving that to someone else makes your life a lot less complicated. Not to mention that having Mobile Services handle all of the actual interaction with them makes your life ridiculously simple, as you will see.

I must admit that I would have loved to see the Mobile Services authentication run through ACS instead. That way, it would have been easy for us to set up authentication through ADFS and other identity providers, but I guess we can't have it all…at least not at once…

Ok, so how do you set up your Mobile Service to support authentication? Well, it is actually a piece of cake… First you have to decide what identity provider to use, or if you want to support several of them. In this post, I will focus on using Facebook.

Once you have decided what identity provider to use, you have to configure the identity provider for your application. As mentioned, I will use Facebook, so I browse to http://developer.facebook.com/ and click the "Apps" menu item. If you have never used the developer part of Facebook, it is not that complicated to get started. Next, I click "Create New App" at the top right corner, which pops up the following "window"


All you need to do here is to add a unique "App Name" and press Continue. After you have figured out the captcha you are faced with, you get the following view


The last thing you need to do in here is to click the "Website with Facebook Login" link and fill in the URL of the Mobile Services endpoint you are going to use. In my case, that will be https://darksidecookie.azure-mobile.net/, so I fill it out like this


Ok…that's pretty much all you need to do to configure Facebook. Just press "Save Changes" and you are done. The next step is to set up the Mobile Services integration, which is actually just as easy. Just go to the top of the Facebook app page and locate the "App ID" and "App Secret"; these need to be added to the Mobile Services settings in the Azure Portal. So, just browse to the settings page for the Mobile Service you want to work with, and click the "IDENTITY" link.


If you then look just below the settings for microsoft accounts, you will find “facebook settings”, and 2 textboxes with very familiar labels. Just copy your App ID and App Secret from the Facebook developer portal into these, press “SAVE” and confirm that you want to update the settings.

That’s it! Authentication configured…now all there is left to do, is to get the actual application to use it, which once again is a walk in the park.

In my case, I have added a Login button to my application, and hooked up a handler for the click event.

<Button Content="Login" Click="LoginLogoutClick" />

In the click handler, I verify whether the current user is logged in by checking if the CurrentUser property is null. If it is, it is time to log in; if not, it is time to log out.

private async void LoginLogoutClick(object sender, RoutedEventArgs e)
{
    if (App.MobileService.CurrentUser == null)
    {
        await App.MobileService.LoginAsync(MobileServiceAuthenticationProvider.Facebook);
        if (App.MobileService.CurrentUser != null)
        {
            ((Button)sender).Content = "Logout";
            new MessageDialog("You are now logged in [" + App.MobileService.CurrentUser.UserId + "]").ShowAsync();
        }
    }
    else
    {
        App.MobileService.Logout();
        ((Button)sender).Content = "Login";
        new MessageDialog("You are now logged out").ShowAsync();
    }
}

As you can see, the Mobile Services proxy handles all of it for us. All that is needed is to call LoginAsync(), passing in the identity provider to use, and it does the rest. Remember that logging in is async though, so it needs to be "awaited". Logging out, however, is synchronous, so we don't need to "await" it. The result of calling LoginAsync() in an App Store application looks like this


And after entering my credentials, I am requested to authorize the Facebook application to get basic information from Facebook.


And I am then finally greeted by a MessageDialog like this


As you can see, the user id is just a string, which consists of the identity provider, a colon and a unique id given to the service by the identity provider. In the case of Facebook, it is the user id of my Facebook account, but you should not assume that it will always be that…
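
If you ever need the two parts separately, the string can simply be split on that first colon; this is a tiny, hypothetical helper snippet, not part of the Mobile Services SDK:

// Splits a Mobile Services user id such as "Facebook:1234567890" into its
// provider name and provider-specific id.
string userId = App.MobileService.CurrentUser.UserId;
int separatorIndex = userId.IndexOf(':');
string provider = userId.Substring(0, separatorIndex);          // e.g. "Facebook"
string providerUserId = userId.Substring(separatorIndex + 1);   // the provider's id for this user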

Ok, so what now? What can we do now? Well, you now have a user identity to use when you save data in your service, which is really useful. And the cool thing is that this is a one-time thing. The user doesn't need to sign in every time the application starts; the Mobile Services proxy handles all of that for us…

So say that I wanted to store a name and e-mail address for my user. This is really simple. All you need to do is give the user a UI to insert the information, and then use a little bit of code like this

private void SaveUserDetailsClick(object sender, RoutedEventArgs e)
{
    var msg = new JsonObject {{"name", GetName()}, {"email", GetEmail()}};
    App.MobileService.GetTable("userdetails").InsertAsync(msg);
}

As you can see, I do not pass the user id with my entity. I do not trust passing it as part of the message, as the user could easily modify the data being sent to the server and set other users' information. Instead, the best practice in this case is to use an insert script and set the user id on the server.

To do this, I go to the scripts section for my table, and modify the insert script as follows

function insert(item, user, request) {
    item.userId = user.userId;
    request.execute();
}

Thus making sure that the user id is added to the table…but also making sure that it is the correct user id.

Now, to secure the table, I also make sure that only authenticated users can perform operations on the table by setting the permissions as follows


That's it! A few settings, some copy pasting and a line or two of code, and you have authentication and authorization built into your application back end. Very simple if you ask me…


Scott Seely posted MVP Monday - Connected Apps Made Wicked Easy with Windows Azure Mobile Services to the Microsoft MVP Award Program blog on 12/3/2012:

Editor’s note: The following post was written by Microsoft Integration MVP Scott Seely

This article assumes that you have:

  • A Windows Azure subscription.
  • Experience with C#/XAML (this lets me stick with one language/markup style in the article for the phone and tablet).
  • Visual Studio 2012.

You can get the code for this project from github: https://github.com/sseely/wams-demo.git

I’ve been building software since the early 1990s. Because of my own personal interests, I frequently get involved in the architecture of components that handle security, consistent data storage, and server/client interactions. This expertise means that even my personal projects tend to be a bit ambitious in terms of security, distribution, and capability. When building something, I typically prototype an idea first, see how it works locally, and then iteratively work in the back end. Anyone who builds an application does this and has the same, basic needs. I was very excited when I learned that Microsoft launched something to meet those needs called Windows Azure Mobile Services (WAMS). WAMS is a very nice, general purpose back end that handles the needs of many applications. Today, it supports applications running on a diverse range of devices from Windows 8, Windows Phone 8, and Apple iOS. Support for Android is “coming soon.”

In this post, we’ll take a look at what WAMS provides to authenticate a user, store data, and push out notifications to connected devices from other parties. What most tablet and smartphone applications need is the ability to do the following:

  1. Secure the application data such that only the data owner and, possibly others, can see it.
  2. Have some backend bits that run alright on Windows 8 and other mobile platforms.
  3. Make sure that updates from one device get pushed to everyone else who might be interested in the data.

In the rest of this post, we will walk through how WAMS supports these scenarios.

Getting Started

To use WAMS, go over to https://manage.windowsazure.com/ and create a new Mobile Service. WAMS is currently in beta and you may have to sign up for the beta before Mobile Services lights up in the management portal. Once you have Mobile Services enabled, you simply click on Mobile Services, then New, then Mobile Service, then Create. Pick a URL for your service and a region to run in. I chose jeanius-shoppinglist, in the Western US Region with a new SQL Database. You also get to make a few more decisions about how big the database should be. Feel free to use a Web Database with a size of 1 GB if you are following along. When the Mobile Service is completed, make sure to download and install the Mobile Services client libraries. You can find links to the libraries on the home page for your Mobile Services app. Just pick a platform. Under Get Started for each platform, you can pick “Create a new … app”. Under Get the tools are links to the SDK for the platform you have chosen.

Thinking about eventually setting up a shopping list, I then created a set of projects:

  • For the phone: JeaniusFactory.ShoppingList.Phone (Windows Phone 8 project, Databound App)
  • For the Windows 8 devices: JeaniusFactory.ShoppingList.Tablet (Windows Store project, Blank App)
  • For the shared data types: JeaniusFactory.ShoppingList.Model (Portable Class Library, for .NET 4.5, Windows Phone 8 and .NET for Windows Store apps)

On the phone and Windows Store apps, I also added a reference to the Windows Azure Mobile Services Managed Client. Then, for each of the App objects in the Phone and Tablet projects, I added the following code to allow me to talk to WAMS:

private static MobileServiceClient _mobileServiceClient;

public static MobileServiceClient MobileServiceClient
{
    get
    {
        if (_mobileServiceClient == null)
        {
            _mobileServiceClient = new MobileServiceClient(
                "https://jeanius-shoppinglist.azure-mobile.net/",
                "[key]");
        }
        return _mobileServiceClient;
    }
}

So far, nothing spectacular and it all built. The next thing I wanted was a way to authenticate users.

Adding Security

In security, we find that many applications conflate authentication with authorization. Many websites require that you login with a username/password specific to that site in order to access that particular site. This is at odds with what we see in the real world. In the real world, I carry around a driver’s license and, occasionally, a passport. These documents are issued by a government that people trust to have verified things like my birthdate, name, and address. I present these documents to get access to loans, through security at airports, and to be served alcohol at a restaurant. On the Internet, we have a number of other common sites that many people use. As an application developer, I can choose to trust these other sites to verify a username/password. Upon verification, these sites can hand my application the equivalent of a passport—a digital token filled with data about the user. In the same way a waiter uses my driver’s license to verify that I can order a beer, I can use the digital token and its data to authorize the user within my application.

In recognition of the fact that most users have an account with Microsoft, Facebook, Twitter, or Google, WAMS provides a mechanism to use any of these four as authentication mechanisms. Since I’m targeting Windows Store and Windows Phone 8 apps, most of the time a user will have a Microsoft ID. I did this in a few steps:

  1. Created an application in the Windows Store Portal.
  2. Go over to the Live Connect Developer Center to retrieve the key and secret for my application. To do that, just click on the app that was created in the Windows Store Portal, then look for the API Settings group which contains the Client ID and Client secret.
  3. Edit the application API settings, configuring the Redirect Domain to point to your mobile application. For my sample app, this is https://jeanius-shoppinglist.azure-mobile.net/. Then, click Save.
  4. Copy the Client ID and Secret from #2 into the Microsoft account settings area on the WAMS identity tab.
  5. Click Save.

At this point, I just had to add authentication to the applications. Again, this is a small set of steps. For both the Phone and Tablet applications, I opened up MainPage.xaml.cs and added a class variable to track the current user:

MobileServiceUser User { get; set; }

I then added an Authenticate method to get the credentials for the current user to both apps:

private async System.Threading.Tasks.Task Authenticate()
{
    if (App.MobileServiceClient.LoginInProgress)
    {
        return;
    }
    while (User == null)
    {
        string message = null;
        try
        {
            User = await App.MobileServiceClient
                .LoginAsync(MobileServiceAuthenticationProvider.MicrosoftAccount);
        }
        catch (InvalidOperationException)
        {
            message = "You must log in. Login Required";
        }
        if (!string.IsNullOrEmpty(message))
        {
            var dialog = new MessageDialog(message);
            dialog.Commands.Add(new UICommand("OK"));
            await dialog.ShowAsync();
        }
    }
}

Windows Phone 8 does have a slight change to the above code; change the block that displays the message from 3 lines to this:

MessageBox.Show(message);

Finally, in the MainPage.OnNavigatedTo method for both apps, change OnNavigatedTo to be async and add the following line at the end of the method:

await Authenticate();

If you did everything right, the application will now ask you to log in using your Microsoft ID whenever you start it. We now have authentication working! Next, let's add the ability to manage a single list.

Managing Data

The application data is actually pretty simple. Pretty much every application that uses these features will start out with a few basic objects, then add objects that are specific to their application. At a minimum, you have Users and UserDevices. This requires two tables which I named User and UserDevice.

For each of the tables, set the permissions to only allow authenticated users to insert, update, delete, or read. Once created, the tables will have an id column that is essentially a C# long named id. You can put the simple objects into a portable class library project so that the binaries can be used in both the Phone and Tablet projects. The downloadable project is a skeleton application that contains everything needed to bootstrap an application that shares its device URI with a central service.
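
As a rough sketch of what those simple objects could look like in the portable class library (the property names below are assumptions based on the description above, not code taken from the downloadable project):

// Hypothetical POCOs for the two tables. Depending on the Mobile Services
// client version, you may need mapping attributes for the lower-case "id" column.
public class User
{
    public long Id { get; set; }         // the table's auto-generated id column
    public string UserId { get; set; }   // the "provider:id" string from authentication
}

public class UserDevice
{
    public long Id { get; set; }
    public string UserId { get; set; }   // owner of the device
    public string Uri { get; set; }      // push notification channel URI for the device
}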

One thing I found really interesting in all of this is that the underlying tables do not require the columns to exist in the database before the columns are used. You can go ahead and manage columns if you like, but you do not have to. A missing column just gets added automatically for you. Data types for the columns are picked out from the JavaScript value that would be used for the column.
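
To see the dynamic schema in action, you can insert an untyped JsonObject that carries a property the table has never seen. In this hypothetical snippet, the "lastSeen" property and the channelUri variable are illustrative only:

// WAMS adds the "lastSeen" column automatically on first insert, inferring its
// type from the JSON value. Requires Windows.Data.Json and an async context;
// channelUri is assumed to be a push channel URI string obtained elsewhere.
var devices = App.MobileServiceClient.GetTable("UserDevice");
var item = new JsonObject
{
    { "Uri", JsonValue.CreateStringValue(channelUri) },
    { "lastSeen", JsonValue.CreateStringValue(DateTime.UtcNow.ToString("o")) }
};
await devices.InsertAsync(item);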

When accessing the objects client side, you can use regular old LINQ expressions to select data from the tables. The client libraries take those LINQ expressions and translate them into an OData query. Once received, the messages are then dispatched to the Read operation at the server. For example, I could look up the current user record in the backing store using code like this:

var userTable = App.MobileServiceClient.GetTable<User>();
var currentUser = (await (from aUser in userTable
                          where aUser.UserId == User.UserId
                          select aUser).ToEnumerableAsync()).FirstOrDefault();

currentUser would contain the user that matches the query. I could also choose to limit things such that the current user can only ever retrieve their own record. To do this I hook into the scripts that run server side. Each table has accompanying scripts that are executed whenever someone tries to Insert, Update, Delete, or Read a record. The script is not a stored procedure. Instead, the server runs JavaScript within a node.js application running on top of a Windows Azure Web Site. (That's just an implementation detail that you don't really need to worry about.)

Let’s look at the Read operation for the User table. By default, it has this implementation:

function read(query, user, request) {
    request.execute();
}

If we want to restrict the request to only include the authenticated user, we would add one line to the method, immediately before request.execute():

query.where({UserId: user.userId});

What happens here is that any query against the User table automatically has its results filtered to include only rows whose UserId matches the authenticated user. The authenticated user is always delivered in the user parameter. If the table's Read operation doesn't require an authenticated user, this value may occasionally be null.

Push Notifications

To be able to send push updates, you need to register your application in the application store and go through the basic registration. Details on setting up an account in the Windows Store are described here. To wire that registration up to WAMS, go to your WAMS application in the portal and select the Push tab. From there, enter the client secret and package SID and click Save.

One thing we might want to do is notify the user when they have logged on at another location. For this, it might be interesting to at least indicate the type of device being used in the notice. We can tell the difference based on the notification service URL. For example, Windows Phone uses http://sn1.notify.live.net and Windows 8 uses https://bn1.notify.windows.com. This functionality is supported by the global push object, which has two properties hanging off of it for Windows Phone and Windows Store, named mpns and wns respectively. Given a device URI, you can push a message to that device with just a short amount of code and a few handlers: one for success and one for errors. When an error happens, the general recommendation is to forget the failed device URI. Typically, that means removing the URI from your data store. For example, to send a toast via the Microsoft Push Notification Service with a known device URI and message, one would write the following:

push.mpns.sendToast(uri, {text1: "Push", text2: message},
    {success: successMns(id),
     error: errorMns(id, uri),
    });

The success and error functions are then just this:

function successMns(deviceId){
    return function (err, results){
        if (err.shouldDeleteChannel){
            userDeviceTable.del(deviceId);
        }
    }
}

function errorMns(deviceId, uri){
    return function (err, results){
        if (err.shouldDeleteChannel){
            userDeviceTable.del(deviceId);
        }
    }
}

For the Windows Notification Service and regular Windows apps, the code is similarly simple. Push out the toast using:

push.wns.sendToastText04(uri, {text1: message},
    {success: successWns,
     error: errorWns(id, uri)});

Then, handle success and failure with a pair of functions:

function successWns(pushResponse) {
    // Do nothing
}

function errorWns(deviceId, uri){
    return function(err, result){
        if (err.headers['x-wns-notificationstatus'] === "dropped"){
            userDeviceTable.del(deviceId);
        }
    };
}

And, with that, your application is sending out toast to all interested parties!
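
The client-side half of this, which the article's skeleton project handles, is obtaining the device's channel URI and storing it in the UserDevice table so the server scripts have something to push to. A rough Windows Store sketch, assuming the hypothetical UserDevice class shown earlier:

// Requires: using Windows.Networking.PushNotifications;
// Obtain (or renew) the WNS channel for this app and record it server-side.
var channel = await PushNotificationChannelManager
    .CreatePushNotificationChannelForApplicationAsync();

await App.MobileServiceClient.GetTable<UserDevice>().InsertAsync(new UserDevice
{
    UserId = User.UserId,   // the MobileServiceUser captured by Authenticate()
    Uri = channel.Uri       // the URI that push.wns/push.mpns calls will target
});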

Summary

WAMS provides a set of functionality to make it significantly easier to go from idea to implementation with your Windows Store and smart phone applications. It implements a lot of the back end services you need, letting you focus on the differentiators: what the application does, business logic, and availability across form factors. The service is currently in beta. Now is a great time to learn what the service does and to engage with the product team at Microsoft.



<Return to section navigation list>

Marketplace DataMarket, Cloud Numerics, Big Data and OData

Venkatesh Narayanan posted BCS OData Custom Authentication using Extension Provider on 11/30/2012 (missed when posted):

Business Connectivity Services (BCS) in SharePoint 2013 supports connecting to OData-based LOB systems. BCS supports the following authentication modes for authenticating against the OData service:

  • PassThrough
  • RevertToSelf
  • Credentials
  • Windows Credentials
  • Digest Credentials
  • Custom Authentication using OData Extension Provider

In this article, we cover in detail how to connect from SharePoint on-premises to any OData service with custom authentication using an OData extension provider.

Some examples of custom authentication schemes that the OData service provider could support are:

  • Authenticating using Windows Azure Access Control Service (ACS)
  • Client certificate-based authentication

To enable these custom authentication mechanisms, you need to implement an OData extension provider and configure the same in BCS. The OData extension provider is invoked by BCS for every call to the OData LOB Service.

Note: BCS supports custom authentication using an OData extension provider only for apps hosted in on-premises SharePoint or for those apps that connect from SharePoint Online to an on-premises-based LOB system through hybrid. This restriction is because it is not possible to install custom assemblies in SharePoint Online.

Implementing an OData extension provider
  1. Create a new project in Visual Studio 2010. Select Class Library as the project type:
  2. Change the file name and class name to ODataOAuthExtensionProvider.
  3. Add a reference to the Microsoft.BusinessData.dll assembly to the project. Add the namespace Microsoft.BusinessData.SystemSpecific.OData to the class.
  4. The empty custom OData extension provider implementation should be as follows:
  5. Sign the assembly by selecting the “Signing” options under the project settings. Build the SampleODataExtensionProvider.dll assembly.
Installing the OData extension provider assembly

Install the SampleODataExtensionProvider.dll in the global assembly cache (GAC) on the SharePoint web front-end (WFE) machines.

If BCS is configured to connect from SharePoint Online to an on-premises LOB system through a hybrid, then the SampleODataExtensionProvider.dll has to be installed in the GAC in the SharePoint on-premises WFE.

Configuring a connection to the OData service from SharePoint on-premises

In order to connect to the OData service from BCS, a connection setting has to be created in SharePoint using a BCS Windows PowerShell commandlet.

The connection setting contains information required by SharePoint BCS to connect to the OData LOB system (OData Service URI, OData Service Metadata URI, Authentication Mode, and Extension Provider).

New-SPODataConnectionSetting -Name "ContosoServiceApp" -ServiceContext "http://contoso" -ServiceAddressURL "http://tv.telerik.com/services/OData.svc" -AuthenticationMode "Anonymous" -ExtensionProvider "SampleODataExtensionProvider.ODataOAuthExtensionProvider, SampleODataExtensionProvider, Version=1.0.0.0, Culture=neutral, PublicKeyToken=34c4d4fa89a6bb3b"

Configuring the connection in the BCS model

The connection information has to be specified in the BCS model. The BCS runtime uses the connection information for invoking the OData LOB service.

To specify the connection information, add the following property in the BCS model by adding it to the LOB System and LOB System Instance.

<Property Name="ODataConnectionSettingsId" Type="System.String"> ContosoServiceApp </Property>


Summary

This article describes in detail how to support custom authentication with a BCS OData connector using an OData extension provider.

By following the steps mentioned here, you should be able to import a BCS model that connects to an OData service with an extension provider that does custom authentication. In the next article, we will change the OData extension provider to authenticate against Windows Azure ACS.


<Return to section navigation list>

Windows Azure Service Bus, Access Control Services, Caching, Active Directory and Workflow

Clemens Vasters (@clemensv) continued his Subscribe! video series on Channel9 with Getting Started with Service Bus. Part 1: The Portal on 12/4/2012:

With this post (entry? segment?) I'm going to start a series of screen casts for folks who are new to Service Bus or haven't yet had the time to use any or some of its features.

In this first post I'll take you on a 15 minute tour through the new portal sections for Service Bus.

If you're a Service Bus veteran you will likely be familiar with the features of the new portal and may want to use the 15 minutes to make your solutions awesomer (sic!), but if you are just getting started, this will help you create a new namespace and create the first set of resources like Queues and Topics. I'm also going to show you how to get to the ACS portal where you can manage (also see Securing Service Bus with ACS) access control protection for the resources of a Service Bus namespace.

I vote for segment or episode. Looks like Clemens ran over his time estimate by 6 minutes.


Clemens Vasters (@clemensv) continued his Subscribe! video series on Channel9 with Environments and Scale Units on 12/2/2012:

In this video I'm spilling the beans on some of the internal structures we set up for Service Bus for you to consider and potentially adapt as you build out your own global services on Windows Azure. Environments are deployment zones for separate purposes, and Scale Units are sets of resources that are bundled together in configuration and form independent units of deployment and management whose size is informed by the required scale.


Clemens Vasters (@clemensv) started a Subscribe! video series on Channel9 with Hello World on 11/30/2012:


Subscribe! is a video blog about Messaging, Middleware, Architecture, and all sorts of other interesting topics around building larger and more sophisticated solutions than your average website on Windows Azure and Windows Server.

Your host and, mostly, monologist is Clemens Vasters from the Windows Azure Service Bus team who puts this blog together in his studio on his island of solitude in Germany.


<Return to section navigation list>

Windows Azure Virtual Machines, Virtual Networks, Web Sites, Connect, RDP and CDN

Sandrino di Mattia (@sandrinodm) described Creating up to 50 free Windows Azure Web Sites in a single subscription in a 12/3/2012 post:

When Windows Azure Web Sites were announced during the Meet Windows Azure event (June 7th, 2012) the preview version was only available in the East US region. The pricing details page explained what you would get for free per month:

  • Web Sites: Up to 10 web sites
  • Data Transfers: 165 MB of outbound data transfers per day, per subscription; unlimited inbound data transfers
  • Storage: 1 GB
  • Relational Database: 20 MB of a third-party, MySQL database

Windows Azure Web Sites allowed you to run 10 web sites for free; Scott Guthrie and Scott Hanselman even blogged about it. But a lot has changed since then. In September Microsoft announced the shared model (between free and reserved), which added support for CNAMEs, A-Records and naked domains. And the support for multiple regions also improved over the last few months. Today you can deploy your Windows Azure Web Site to 5 different regions: East US, West US, East Asia, West Europe, North Europe.

If you look at the pricing details page, you’ll see that the content for free Web Sites also changed:

With the Windows Azure Web Sites free instance model, you will receive the following at no charge:
  • 10 free web sites per sub region* on the AzureWebSites.net domain
  • 165 MB of outbound data per day per sub region, up to 5 GB per region**; unlimited inbound data
  • 1 GB of storage per sub region (shared by all web sites)
  • 20 MB of a third-party MySQL database per sub region for the first 12 months (charges may apply thereafter)

You are now allowed to run 10 free web sites per sub region. Since Windows Azure Web Sites (Preview) is available in 5 regions, this means you can create up to 50 free Web Sites. The only thing you need to watch out for is choosing a different region once you hit the 10-site limit.

Here is an example of 30 free Web Sites I created over 3 different regions:

No significant articles today



<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Mary Jo Foley (@maryjofoley) asserted “Microsoft's CRM and ERP teams are stepping up the delivery pace, and have new Windows 8, cross-platform, and Azure-hosted capabilities coming to market starting next year” in a deck for her More cross-platform and cloud support ahead for Microsoft Dynamics ERP, CRM article of 12/4/2012 for ZDNet’s All About Microsoft blog:

The Microsoft Dynamics CRM/ERP teams are finishing out 2012 with a handful of new releases. They're also readying more cross-platform and Windows Azure support in the new year.

The Dynamics AX team just made generally available (this past weekend) the Dynamics AX 2012 R2 release -- the follow-on to the Dynamics AX 2012 product the team shipped a little over a year ago. Next up from the team will be a feature pack (delivered in another year-plus), and then the next full Dynamics AX release, which will be out in 2014, said Christian Pedersen, General Manager, Dynamics Enterprise Applications and Services at Microsoft.

By mid-December, the promised Q4 service update for Dynamics CRM (on-premises and online both) is slated to be generally available. This update will include more cross-browser support, but not the cross-platform support originally expected in this update.

To round out the month, on December 19, the Dynamics GP team will be making generally available the Dynamics GP 2013 release. Dynamics GP is one of Microsoft's four different ERP platforms; the other three are the aforementioned Dynamics AX, as well as Dynamics NAV and Dynamics SL.

Microsoft execs had been hoping to get Dynamics NAV 2013 and Dynamics GP 2013 hosted on Windows Azure this year, but neither was ready. It's now looking like NAV 2013 and GP 2013 will be hosted on Azure by mid-calendar 2013, said Errol Schoenfish, Director of Product Management. The Azure-hosted options will be sold via partners, not directly by Microsoft.

The Dynamics AX 2012 R2 update enables multiple languages and legislations to be run from within a single instance, among other new features. The coming GP 2013 update adds improvements to hostability, manageability and deeper Office/Office 365 integration -- and a total of 125 different enhancements, said Microsoft officials. The CRM fall service update adds support for more browsers, embedded Bing Maps and support for Office 2013, among other new features.

Like many other teams at Microsoft, the Dynamics CRM and ERP teams have stepped up the delivery pace of new updates and releases. The CRM team already is delivering two major CRM Online and on-premises updates every year. The NAV team is moving toward a yearly update cycle. The Dynamics GP team is evaluating whether customers in that space are interested in updates delivered any more frequently than the current every two-year pace, said Schoenfish. For customers running GP on premises, more frequent updates might prove harder to digest than not, he said.

Microsoft officials said earlier this fall that a Windows 8 version of its Dynamics CRM application would be available in the Windows Store by mid-2013. On the ERP side of the house, the teams are evaluating what kind of Windows 8 companion apps would make sense. Rather than attempting to make the ERP back end available as a Windows 8 app, the Dynamics GP and AX teams are looking to provide Windows 8 apps that offer a subset of capabilities, such as an app for querying and viewing information about their ERP implementations. So far, there's no public timetable for these applications, however.

At the same time, Windows isn't the Dynamics teams' only focus. The CRM team went back to the drawing board to redo the promised iPad/iOS version of Dynamics CRM and make it a JavaScript/HTML app. It sounds as if the ERP team is thinking along the same lines.

"HTML5 client experiences should give users on other platforms more access to the ERP back end," said Pedersen. He said the ERP teams had nothing yet to announce on the cross-platform front, however.

Schoenfish and Pedersen both emphasized during a phone interview I had with them earlier this week that the Dynamics team is collaborating more closely with other teams outside of their Microsoft Business Division unit than ever before. The Dynamics AX team, for example, has done work with the SQL Server team to enable Power View and PowerPivot to integrate directly with customers' ERP implementations. The team also has worked hand-in-hand with the SharePoint team, resulting in the ability of ERP customers to use either SharePoint Server or SharePoint Online as their repositories.

As we've heard, cross-business-unit collaboration is a priority for Microsoft in 2013 and beyond, so stay tuned for more here.


Scott Bekker (@scottbekker) asserted Microsoft on Track for 100,000 Cloud Essentials Partners this Year in a 12/3/2012 article for 1105 Media’s Redmond Channel Partner magazine:

A relatively minor change in the way Microsoft partners can sign up for the Cloud Essentials program is leading to a boom in the number of participants, according to Microsoft's top channel executive.

Cloud Essentials has always been a relatively easy program for partners to join. A free sign-up and a modest amount of training gives partners access to 25 internal use right seats each for Office 365, Dynamics CRM Online and Windows Intune. Other benefits include presales and technical support, marketing resources and sales incentives.

Yet participation stayed stubbornly low. Of the 640,000 partners Microsoft routinely claims to have, only 43,000 had signed up in the program's first two years.

Apparently, the trick is where and when you ask partners to join. Until a little over a week ago, partners had to navigate to Cloud Essentials-specific pages, such as microsoftcloudpartner.com. But on Nov. 20, Microsoft turned joining Cloud Essentials into a one-button step that is part of the regular re-enrollment process in the Microsoft Partner Network.

"We've had over 15,000 partners signed up in one week," Jon Roskill, corporate vice president of the Microsoft Worldwide Partner Group, said in an interview Thursday. With the program growing by almost a third in one week, Roskill has high hopes for the rest of the year.

"My expectation now is that we're going to track to be over 100,000 in the Cloud Essentials program this calendar year," Roskill said. "What this shows is how important it is to get the experience on the portal right."

Microsoft announced the Cloud Essentials change in a Nov. 20 blog entry, marking the start of a number of MPN changes that had been announced at the Worldwide Partner Conference in July.

Microsoft's goal with Cloud Essentials is to get partners familiar with Microsoft's cloud suites and get them selling without asking for major investments upfront. Some of those investments and commitments come later -- for example, re-enrolling in Cloud Essentials for a second year requires that a partner sell 25 seats worth of cloud services in the first year. The next level program, Cloud Accelerate, is similar in its requirements to obtaining a silver competency in the MPN.

Click here for details of Cloud Essentials benefits from the Windows Azure Portal. Notice that the 375 hours of a small compute instance per month don't allow full-time deployment of a Web or Worker role.

Full disclosure: I’m a contributing editor for 1105 Media’s Visual Studio Magazine. I’m also a Microsoft Partner and have a Cloud Essentials benefit, which I’m attempting (without success so far) to renew. On 12/4/2012, I was unable to log into my partner account due to sign-on problems with Windows Live ID.


Claudio Caldato reported MS Open Tech Contributes Support for Windows ETW and Perf Counters to Node.js in a 12/3/2012 post to the Interoperability @ Microsoft blog:

Here's the latest about Node.js on Windows. Last week, working closely with the Node.js core team, we checked the code that adds support for ETW and Performance Counters on Windows into the open source Node.js master branch. These new features will be included in the new V0.10 when it is released. You can download the source code now and build Node.js on your machine if you want to try out the new functionality right away.

Developers need advanced debugging and performance monitoring tools. After working to assure that Node.js can run on Windows, our focus has been to provide instrumentation features that developers can use to monitor the execution of Node applications on Windows. For Windows developers this means having the ability to collect Event Tracing for Windows® (ETW) data and use Performance Counters to monitor application behavior at runtime. ETW is a general-purpose, high-speed tracing facility provided by the Windows operating system. To learn more about ETW, see the MSDN article Improve Debugging And Performance Tuning With ETW.

ETW

With ETW, Node developers can monitor the execution of node applications and collect data on key metrics to investigate and performance and other issues. One typical scenario for ETW is profiling the execution of the application to determine which functions are most expensive (i.e. the functions where the application spends the most time). Those functions are the ones developers should focus on in order to improve the overall performance of the application.

In Node.js we added the following ETW events, representing some of the most interesting metrics to determine the health of the application while it is running in production:

  • NODE_HTTP_SERVER_REQUEST: node.js received a new HTTP Request
  • NODE_HTTP_SERVER_RESPONSE: node.js responded to an HTTP Request
  • NODE_HTTP_CLIENT_REQUEST: node.js made an HTTP request to a remote server
  • NODE_HTTP_CLIENT_RESPONSE: node.js received the response from an HTTP Request it made
  • NODE_NET_SERVER_CONNECTION: TCP socket open
  • NODE_NET_STREAM_END: TCP Socket close
  • NODE_GC_START: V8 starts a new GC
  • NODE_GC_DONE: V8 finished a GC

For the Node.js ETW events we also added some additional information about the JavaScript stack trace at the time the ETW event was generated. This is important information that the developer can use to determine what code was executing when the event was generated.

Flamegraphs

Most Node developers are familiar with Flamegraphs, which are a simple graphical representation of where time is spent during application execution. The following is an example of a Flamegraph generated using ETW.


For Windows developers we built the ETWFlamegraph tool (based on Node.js), which can parse etl files, the log files that Windows generates when ETW events are collected. The tool can convert the etl file to a format that can be used with the Flamegraph tool that Brendan Gregg created.

To generate a Flamegraph using Brendan’s tool, you need to follow the simple instructions listed in the ETWFlamegraph project page on Github. Most of the steps involve processing the ETW files so that symbols and other information are aggregated into a single file that can be used with the Flamegraph tool.

ETW relies on a set of tools that are not installed by default. You’ll either need to install Visual Studio (for instance, Visual Studio 2012 installs the ETW tools by default) or you need to install the latest version of the Windows SDK tools. For Windows 7 the SDK can be found here.

To capture stack traces:

  1. xperf -on Latency -stackwalk profile
  2. <run the scenario you want to profile, ex node.exe myapp.js>
  3. xperf -d perf.etl
  4. SET _NT_SYMBOL_PATH=srv*C:\symbols*http://msdl.microsoft.com/downloads/symbols
  5. xperf -i perf.etl -o perf.csv -symbols

To extract the stacks for the node.exe process and fold them into perf.csv.fold (which includes all the function-name information that will be shown in the Flamegraph), run:

node etlfold.js perf.csv node.exe (etlfold.js is the file found in the ETWFlamegraph project on GitHub).

Then run the flamegraph script (requires perl) to generate the svg output:

flamegraph.pl perf.csv.fold > perf.svg

If the Node ETW events for JavaScript symbols are available then the procedure becomes the following.

  1. xperf -start symbols -on NodeJS-ETW-provider -f symbols.etl -BufferSize 128
  2. xperf -on Latency -stackwalk profile
  3. run the scenario you want to profile.
  4. xperf -d perf.etl
  5. xperf -stop symbols
  6. SET _NT_SYMBOL_PATH=srv*C:\symbols*http://msdl.microsoft.com/downloads/symbols
  7. xperf -merge perf.etl symbols.etl perfsym.etl
  8. xperf -i perfsym.etl -o perf.csv -symbols

The remaining steps are the same as in the previous example.

Note: for more advanced scenarios where you may want stack traces that include the Node.js core code executed at the time the event is generated, you need to include node.pdb (the debugging information file) in the symbol path so the ETW tools can resolve those symbols and include them in the Flamegraph.

PerfCounters

In addition to ETW, we also added Performance Counters (PerfCounters). Like ETW, Performance counters can be used to monitor critical metrics at runtime, the main differences being that they provide aggregated data and Windows provides a great tool to display them. The easiest way to work with PerfCounters is to use the Performance monitor console but PerfCounters are also used by System Center and other data center management applications. With PerfCounters a Node application can be monitored by those management applications, which are widely used for instrumentation of large cloud and enterprise-based applications.

In Node.js we added the following performance counters, which mimic very closely the ETW events:

  • HTTP server requests: number of incoming HTTP requests
  • HTTP server responses: number of responses
  • HTTP client requests: number of HTTP requests generated by node to a remote destination
  • HTTP client responses: number of HTTP responses for requests generated by node
  • Active server connections: number of active connections
  • Network bytes sent: total bytes sent
  • Network bytes received: total bytes received
  • %Time in GC: % V8 time spent in GC
  • Pipe bytes sent: total bytes sent over Named Pipes.
  • Pipe bytes received: total bytes received over Named Pipes.

All Node.js performance counters are registered in the system so they show up in the Performance Monitor console.
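
Because the counters are registered with the system, they can also be read programmatically from any .NET monitoring process using System.Diagnostics. In the sketch below, the category name "Node.js" and the counter name are assumptions; verify the exact strings (and whether an instance name is needed) in the Performance Monitor console before relying on them.

using System;
using System.Diagnostics;
using System.Threading;

class NodeCounterSample
{
    static void Main()
    {
        // Poll one of the Node.js counters once a second for ten seconds.
        // Category and counter names are assumed; if the counter is registered
        // per-instance, pass the instance name as a third constructor argument.
        using (var requests = new PerformanceCounter("Node.js", "HTTP server requests", true /* read-only */))
        {
            for (int i = 0; i < 10; i++)
            {
                Console.WriteLine("HTTP server requests: {0}", requests.NextValue());
                Thread.Sleep(1000);
            }
        }
    }
}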


While the application is running, it’s easy to see what is happening through the Performance Monitor console:


The Performance Monitor console can also display performance data in a tabular form:


Collecting live performance data at runtime is an important capability for any production environment. With these new features we have given Node.js developers the ability to use a wide range of tools that are commonly used in the Windows platform to ensure an easier transition from development to production.

More on this topic very soon, stay tuned.

Claudio Caldato
Principal Program Manager Lead
Microsoft Open Technologies, Inc.


Cory Fowler (@SyntaxC4) described Provisioning a MySQL Database from the Windows Azure Store in a 12/2/2012 post:

The Windows Azure Store is available as part of the Windows Azure Management Portal, a convenient resource for all of your Windows Azure needs, which can be accessed from a variety of devices including your favorite iDevice, Surface, Windows Phone, Mac or PC.

At the time of writing, the Windows Azure Store is available only in the US.

Click on the + New in the Windows Azure Management Portal Taskbar, then select STORE.

CustomCMS-Store-TaskbarDrawer

The Store opens in a modal dialog; either scroll down or filter to APP SERVICES to find ClearDB MySQL Database, then click the [next] arrow.

CustomCMS-Store-AppService-ClearDB

Select an appropriate database size (or stay with the Free plan and upgrade later once the site is live) and select the Subscription to charge. Provide a name for the database (it can be left at the default; a name will be assigned automatically), then select the region to provision the database in (whenever possible, try to provision the Web Site and database in the same region to avoid latency). Click the [next] arrow.

CustomCMS-Store-ClearDB-Create

The following screen provides an overview of the monthly charges for the new MySQL Database. Be sure to review the terms of use and privacy statement, then click the [purchase] checkmark.

CustomCMS-Store-ClearDB-Purchase

After the Add-on has been provisioned, click on the Connection Info button in the Taskbar.

CustomCMS-Store-ClearDB-ConnectionInfo

Copy the connection string for use in your application. Alternatively, a newly created database can be added as a Linked Resource to an existing Windows Azure Web Site; the credentials will then be surfaced under the connection string section of the CONFIGURE tab.

Brian Swan has an interesting solution for parsing a connection string from the connection string settings found in the CONFIGURE section of a Windows Azure Web Site in his blog entry, Getting Database Connection Information in Windows Azure Web Sites.

CustomCMS-Store-ClearDB-ConnectionInfo-Details
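
For completeness, here is a rough Node.js sketch (not part of Cory's post) showing one way to consume the ClearDB connection information. It assumes the connection string is a semicolon-delimited list of key=value pairs (Database, Data Source, User Id, Password), that it is exposed to the site through an environment variable such as MYSQLCONNSTR_DefaultConnection, and that the mysql npm package is installed; adjust the details to match what the portal actually shows you.

// parse-cleardb.js -- hypothetical sketch: parse a ClearDB-style connection string and connect
// Assumes "key=value" pairs separated by semicolons and the "mysql" package (npm install mysql).
var mysql = require('mysql');

function parseConnectionString(connStr) {
  var settings = {};
  connStr.split(';').forEach(function (pair) {
    var idx = pair.indexOf('=');
    if (idx > 0) {
      settings[pair.slice(0, idx).trim().toLowerCase()] = pair.slice(idx + 1).trim();
    }
  });
  return settings;
}

// MYSQLCONNSTR_DefaultConnection is an assumed variable name; use whatever your site exposes.
var settings = parseConnectionString(process.env.MYSQLCONNSTR_DefaultConnection || '');

var connection = mysql.createConnection({
  host: settings['data source'],
  user: settings['user id'],
  password: settings['password'],
  database: settings['database']
});

connection.connect(function (err) {
  if (err) {
    console.error('Could not connect: ' + err.message);
    return;
  }
  console.log('Connected to the ClearDB MySQL database.');
  connection.end();
});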


Philip Fu described [Sample of Dec 2nd] Azure + Bing Map sample application in a 12/2/2012 post to the Microsoft All-In-One Code Framework blog:

Sample Download:

CS Version: http://code.msdn.microsoft.com/CSASPNETSearchEngine-52f5392c

VB Version: http://code.msdn.microsoft.com/VBASPNETSearchEngine-30f7b126

This sample shows how to implement a simple search engine in an ASP.NET web site.

You can find more code samples that demonstrate the most typical programming scenarios by using Microsoft All-In-One Code Framework Sample Browser or Sample Browser Visual Studio extension. They give you the flexibility to search samples, download samples on demand, manage the downloaded samples in a centralized place, and automatically be notified about sample updates. If it is the first time that you hear about Microsoft All-In-One Code Framework, please watch the introduction video on Microsoft Showcase, or read the introduction on our homepage http://1code.codeplex.com/.



<Return to section navigation list>

Visual Studio LightSwitch and Entity Framework 4.1+

Philip Fu posted [Sample of Dec 3rd] EF4 Complex Type Objects demo to the Microsoft All-In-One Code Framework blog on 12/3/2012:

Sample Download:

CS Version: http://code.msdn.microsoft.com/CSEFComplexType-d058a5a3

VB Version: http://code.msdn.microsoft.com/VBEFComplexType-8ff6c3b3

The CS/VBEFComplexType example illustrates how to work with the Complex Type, which is new in Entity Framework 4.0. It shows how to add Complex Type properties to entities, how to map Complex Type properties to table columns, and how to map a Function Import to a Complex Type.

You can find more code samples that demonstrate the most typical programming scenarios by using Microsoft All-In-One Code Framework Sample Browser or Sample Browser Visual Studio extension. They give you the flexibility to search samples, download samples on demand, manage the downloaded samples in a centralized place, and automatically be notified about sample updates. If it is the first time that you hear about Microsoft All-In-One Code Framework, please watch the introduction video on Microsoft Showcase, or read the introduction on our homepage http://1code.codeplex.com/.

No significant articles today.

 


<Return to section navigation list>

Windows Azure Infrastructure and DevOps

James Staten (@staten7) described Forrester Research’s 2013 Cloud Predictions: We’ll Finally Get Real About Cloud in a 12/3/2012 post:

As the end of 2012 approaches there is one clear takeaway about the cloud computing market: enterprise use has arrived. Cloud use is no longer hiding in the shadows, IT departments are no longer denying it’s happening in their companies, and legitimate budgeting around cloud is now taking place. According to the latest Forrsights Surveys, nearly half of all enterprises in North America and Europe will set aside budget for private cloud investments in 2013, and nearly as many software development managers are planning to deploy applications to the cloud.

So what does that mean for the coming year? In short, cloud use in 2013 will get real. We can stop speculating, hopefully stop cloudwashing, and get down to the real business of incorporating cloud services and platforms into our formal IT portfolios. As we get real about cloud we will institute some substantial changes in our cultures and approaches to cloud investments. We asked all the contributors to the Forrester Cloud Playbook to weigh in with their cloud predictions for the coming year, then voted for the top ten. Herewith is what we expect to happen when enterprise gets real about cloud in 2013:

  1. We'll finally stop saying - everything is going cloud. And get real about what fits and what doesn't. We now have enough understanding about what makes cloud platforms different from traditional virtual infrastructures and traditional hosting environments to make architecturally sound decisions about which applications to move to the cloud. Forrester’s Strategic Rightsourcing method can help you facilitate these discussions inside your company. Be sure to involve your developers who have actual hands on experience with these platforms – they should be your guide.
  2. Cloud and mobile will become one. What’s the value of a mobile app that doesn’t call out through the Internet to back-end services? Not much. And where will these backend services live? Probably not in your datacenter -- unless you plan to poke a big hole in your firewall to accommodate an unpredictable flood of traffic. More often than not, we are finding mobile applications connected to cloud-based back end services (increasingly to commercial Mobile Backends as a Service) that can elastically respond to mobile client engagements and shield your data center from this traffic. Nearly every SaaS application has a mobile client now, which is proof of the model as well. As Forrester analyst Glenn O’Donnell puts it, Cloud plus mobile is a classic "more than the sum of its parts" combination.
  3. We'll stop stressing about cloud SLAs. And recognize that apps have to protect themselves. The best practice for cloud application design and configuration is to build resiliency into the application rather than expect it from the cloud platform. This way you can achieve any SLA regardless of the base SLA provided by the cloud platform. Getting the performance you need is an application-specific goal anyway. What’s the value of having your sourcing & vendor management team negotiate a high and tight SLA from the cloud vendor when only 10% of the applications deployed there need that level of protection?
  4. We'll get real about cost modeling. For two years now, Forrester has been preaching that the cloud isn’t always cheaper, but that it is most likely cheaper with the right use model. Do the math, understand the economics, and monitor and optimize as your use evolves. If you want to get the best ROI out of your use of cloud services and platforms, you need to actively model the cost profile of your applications, monitor their resource use and adjust accordingly. We’re not saying cost should solely drive your cloud deployment decisions, but cost can no longer be ignored or assumed. With cloud cost monitoring tools like Cloudyn, CloudCruiser, Cloudability, Newvem and Rightscale, plus the cost reporting tools that come directly from the leading cloud vendors, you no longer have an excuse for not managing your costs. Good cost management should also drive your hybrid deployment, service selection and discount schedule negotiations. In 2013 your CFO will wise up to this cost management opportunity, so be prepared for him to demand you take this responsibility.
  5. I&O will free the development teams to build apps in the cloud. The developers don’t really need I&O's permission, and our surveys and client inquiries show the business unit-aligned developers certainly aren’t waiting for it. In 2013 the I&O team will get comfortable with the fact that development on public clouds is going to happen whether they like it or not, and that it’s easier for them to engage developers and be part of the conversation about how to do it safely, securely and with appropriate oversight. It also gives I&O, working with AD&D and EA, the ability to set guardrails through a formal cloud policy that shows what type of development is acceptable and to engage in a dialogue about what may not be such a good idea…at least not yet.
  6. We’ll get real about using the cloud for backup & DR. Instead of enterprises buying resources in case of a disaster, cloud computing and its pay-per-use pricing model lets you pay for long-term data storage while only paying for servers when testing or declaring a disaster. It probably won’t replace your existing BCDR resources completely, but the cloud is turning the cost of storage upside down faster every month and what was cheaper to back up to traditional DR storage last year will be cheaper and easier to pu in the cloud is short order -- and faster to recover. Pretty soon we'll wonder why we ever maintained our own long-term cold storage.
  7. We'll stop equating cloud with commodity. Cloud services are highly standardized and automated, but standardization does not have to mean commodity. We’re already seeing cloud services backed by high-end hardware, offering GPUs, SSDs and other clearly non-commodity infrastructure options. In 2013 expect to see the proliferation of these types of choices as cloud providers leverage them to meet specific market demands and to differentiate competitively. But don't think we're saying all non-commodity infrastructure has a future in cloud. Today’s “commodity” server or storage is easier to enhance to be tomorrow’s premier offering than to try and accommodate yesterday's technology. Look for the cloud to disrupt more and more technology sectors thought to be safely “high-margin.”
  8. We'll stop equating cloud with AWS. While Amazon Web Services has opened up a substantial lead in the cloud platforms market – arguably as large as 70% market share – in 2013 we'll see that market position give way to a cadre of strengthening competitors and new entrants. Microsoft and Google have made significant improvements to their platforms and by the end of 2013 we fully expect to see at least 3 substantial OpenStack-based clouds building strong positions. Look for a Forrester Wave of Public Cloud Platforms in mid-2013.
  9. We’ll acknowledge that advanced virtualization is a good thing, and no, it’s not a cloud. The cloudwashing award for 2012 definitely goes to enterprise I&O departments who relabeled last year’s VMware environment a private cloud so they could “get to yes” in the eyes of their CIO. Very few of these environments offered self-service to the developer, fully-automated provisioning, standardized services or cost transparency. In 2013, let’s get real about these environments. A mostly static virtual environment that successfully drives workload consolidation, operational efficiencies and fast recovery is a good thing – a very good thing. It’s just not a cloud and nor should it be. Enterprise I&O teams should be happily bipolar: your optimized and dynamic virtual environment and your on-demand private cloud both have a place in the datacenter. They solve different problems and meet different demands. Don’t waste energy trying to make on into the other when they’re both delivering value.
  10. Developers will awaken to: Development isn't all that different in the cloud. There are no cloud-specific or cloud-best languages. Our Cloud Developer Survey shows that the majority of languages, frameworks and development methodologies we use in the enterprise are also in use in the cloud. What’s different isn’t the coding but the services orientation and the need to configure the application to provide its own availability and performance. And frankly this isn’t all that new either. We’ve had to worry about these aspects with our web sites since 2000. While some of the best practices and cloud services may be new, there is little excuse for a well-trained developer not to be productive in the cloud. So what are you waiting for?

My thanks to the many contributors to this report, including Dave Bartoletti, Michael Gualtieri, Lauren Nelson, Rachel Dines, Michael Facemire, Andre Kindness, Glenn O'Donnell, Liz Herbert, and Chris Voce.


David Linthicum (@DavidLinthicum) asserted “The way to make cloud computing successful is to find the business problems that make sense to attack” in a deck for his Successful cloud adoption: It's the fit, stupid article of 12/3/2012 for InfoWorld’s Cloud Computing blog:

InfoWorld's IT advice columnist Bob Lewis reached out to me last week after my blog post "How AWS can conquer enterprise IT's resistance to public clouds" with a few ideas. He suggested we should take a page out of the early PC and Web playbooks to help readers understand how to match up the new technology with the old problems. For example, we could use Amazon Web Services -- or any cloud computing technology -- to address business problems that would be impractical to deploy on traditional IT platforms.

Indeed, cloud computing providers tend to push their technology as the solution to any and all business problems. Unfortunately, there is not a universal fit for cloud computing technology, so you have to be careful to match the business problem you're looking to solve with the technology that best addresses it. To paraphrase James Carville, the political strategist for former President Bill Clinton, it's the fit, stupid.

In all their hype around cloud computing, the providers portray all public, private, and hybrid cloud computing technology as the way to get IT to the promised land, whatever that might be. However, the geography on the ground does not change, so organizations deploying cloud computing make disappointing discoveries:

  • In some instances, the value is not there. A cloud computing solution may end up costing more and providing less than the traditional IT technology. Such cases are typical, but you have to run the numbers before considering the cloud.
  • Larger organizations are often a tougher fit for cloud computing. They have an investment in hardware and software, and they frequently find that migrating from traditional systems to cloud-based systems adds too much risk and cost.

Cloud computing is not unique in the fact that the fit varies, so make sure to learn from the mistakes and the successes of the past, including the rise of the PC and of the Internet. Although both are universally useful, there were attempts made to push those then-new, shiny technologies at old, rusty problems. Those failures are legendary.

Overall, cloud computing is a good move for most organizations, both large and small. But fight the urge to push it at everything and anything.


<Return to section navigation list>

Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds

No significant articles today


<Return to section navigation list>

Cloud Security and Governance

No significant articles today

 


<Return to section navigation list>

Cloud Computing Events

Tyler Doerksen (@tyler_gd) proclaimed Tis the season in a 12/3/2012 post:

No, this post is not about the holidays. It is about speaking events.

My blog is a bit sparse these days, mainly because I have been involved in so many events in October-November and that steady stream is not slowing down in December.

VS 2012 Launch Event @ Winnipeg IMAX – December 6

Visual Studio 2012

Register Here (eventbrite.ca)

Event Details

  • When: Thursday, December 6th from 8:00 AM – 4:00 PM
  • Where: IMAX Theatre, Portage Place
  • Cost: *FREE!*

Agenda

  • 8:00 – 9:00:    Continental Breakfast and Registration
  • 9:00 – 9:15:    Welcome
  • 9:15 – 10:30:  End-To-End Application Lifecycle Management with TFS 2012
  • 10:30 – 10:45: Break
  • 10:45 – 12:00: Improving Developer Productivity with Visual Studio 2012
  • 12:00 – 1:00:   Lunch Break (Lunch Not Provided)
  • 1:00 – 2:15:    Web Development in Visual Studio 2012 and .NET 4.5
  • 2:15 – 2:30:    Break
  • 2:30 – 3:45:    Microsoft Cloud Development with Azure and Visual Studio 2012
  • 3:45 – 4:00:    Prizes and Thanks

Here is the abstract for my talk at 2:30:

Microsoft Cloud Development with Azure and Visual Studio 2012
Tyler Doerksen, Imaginet

Microsoft’s public cloud platform is nearing its third year of public availability, supporting web site/service hosting, storage, relational databases, virtual machines, virtual networks and much more. Windows Azure provides both power and flexibility. But to capture this power you need to have the right tools! This session will demonstrate the primary ways you can harness Windows Azure with the .NET platform. We’ll explain cloud service development, packaging, deployment, testing and show how Visual Studio 2012 with the Windows Azure SDK and other Microsoft tools can be used to develop for and manage Windows Azure. Harness the power of the cloud from the comfort of Visual Studio 2012.

Also don’t forget…

Vancouver Web Camp – Online – December 4

Vancouver Web Camp

Register Online Here

I will be taking questions in the chat online.

Be sure to attend these great events this week! And stay tuned for more coming up in the new year!


Mick Badran (@mickba) reported a new Azure: Australian Developers Site in a 12/3/2012 post:

Hi guys, while planning for an Azure Based Event (ABE) coming soooon… or at least after Santa has come and gone & given me a birthday pressie, I was directed to a new site in the wings.

Coatsy and his DPE crew have been busy creating a site just for us.

One that accepts our slang and other Aussie quotes.

Register and it will notify you of all the events and other up-and-coming tidbits.

http://azure.msdeveloper.com.au/Default.aspx

+1 for the Aussie know how (Even if we speak US (English) :))


<Return to section navigation list>

Other Cloud Computing Platforms and Services

Barb Darrow (@gigabarb) reported And whomp, here it is: The Pivotal Initiative brought to you by VMware and EMC in a 12/4/2012 article for GigaOm’s Cloud blog:

At long last, here it is: the long-simmering VMware-EMC spin-off, that is. The Pivotal Initiative will focus on bringing resources from the two parent companies to bear on big data and cloud application platforms, according to a statement from the companies.

The effort, to be led by Paul Maritz, the former VMware CEO who transitioned in the fall to become EMC’s chief strategy officer, will be formalized by the second quarter of next year. Although not mentioned here, former Pivotal Labs CEO Rob Mee is expected to play a key role as well. The spin-out will include technology and people from:

  • EMC’s Greenplum data analytics group
  • Pivotal Labs agile development
  • Cetas analytics services
  • VMware’s vFabric (including the SpringSource Java frameworks and GemFire data caching)
  • VMware’s Cloud Foundry Platform-as-a-Service

As GigaOM has reported, this move will allow the semi-independent organization to focus on the cloud application stack, using Spring, GemFire, etc., and will leave VMware, now under CEO Pat Gelsinger, to concentrate on its core server virtualization business, which is profitable but is also seeing increased competition from Microsoft Hyper-V and from the open source Xen and KVM rivals. And the spinoff will also bring EMC, a legacy storage hardware maker, more credibility in the booming big data and cloud space.

There are lots of questions still. It’s not clear where other “non-core” VMware businesses like Zimbra and Socialcast end up, for example. So, stay tuned.


Datanami (@Datanami) reported AWS Marketplace Announces Windows Support on 12/4/2012:

Amazon Web Services Inc. (AWS), an Amazon.com company, today announced that the AWS Marketplace now supports Windows-based software and other new software categories including Big Data solutions. AWS Marketplace is an online store that makes it easy for customers to find, compare, and immediately start using the software they need to build products and run their businesses. Customers can now quickly discover and 1-Click deploy software products running on Windows Server to the AWS Cloud, including well-known business intelligence, database, and hosting solutions.

As with all products in the AWS Marketplace, customers pay only for what they use and can scale their software up or down as needed. Amazon Elastic Compute Cloud (Amazon EC2) instances running Windows Server include the 2003 R2, 2008, 2008 R2, and 2012 editions, and as with all Amazon EC2 Windows instances, customers have the option of enterprise-class support and updates, while taking advantage of the security, scalability, and pay-as-you-go pricing of AWS. To learn more about the AWS Marketplace, visit http://aws.amazon.com/marketplace.

Windows users will now be able to find, deploy, and start using popular application infrastructure, development tools, and business software in minutes. These customers can use the AWS Marketplace to easily search and filter for Windows-compatible software, compare pricing across different server types, and launch in the AWS region of their choice. AWS Marketplace’s 1-Click deployment lets customers start their software in minutes, automating the provisioning of hardware, and simplifying the deployment and configuration of software.

Software charges appear on customers’ existing AWS bills, and AWS consolidates payments through one pipeline – there’s no need to set up and manage new accounts for every software product. With no servers to purchase and configure, and no complicated software deployment process, AWS is making it simple and easy to use Windows-based software.

A selection of popular Windows Server software is now available in the AWS Marketplace including products from Parallels Software, MicroStrategy, and Quest Software. Parallels Software is offering Plesk Panel, a widely used hosting control panel solution; MicroStrategy is helping customers quickly build business dashboards with their MicroStrategy 9.3 business intelligence platform; and Quest Software is making available their Toad database productivity software which helps customers support Oracle, SQL Server and MySQL databases running on AWS.

Windows ISVs and software resellers can now list their software in the AWS Marketplace and sell their solutions to hundreds of thousands of active AWS customers around the world. AWS Marketplace can reduce the time to discover, purchase and deploy software dramatically, shortening it from days (or weeks or months) to minutes compared to traditional sales cycles. ISVs and resellers also benefit by having all billing, collections, and disbursements automatically handled by AWS, with software revenue deposited directly into the software vendor’s or reseller’s account. This includes the ability to support pay-as-you-go pricing with minimal additional programming effort: sellers simply set the pricing for their software by instance size, and AWS tracks usage and calculates the bill. ISVs can version products and the AWS Marketplace can help notify customers when new versions are available.

“Offering customers choice for their cloud infrastructure and enabling open development is a priority for us, which is why we’re excited to provide our Toad solution for on-demand use across managed databases,” said Michael Sotnick, Vice President of Worldwide Channels and Alliances for Quest Software (now part of Dell). “We deliver value to the AWS ecosystem today through our freeware edition that enables more than two million developers to easily leverage Toad for Oracle, Toad for SQL Server and Toad for MySQL within the AWS cloud.”

“We have customers who require an easy, fast method to get our Plesk Control Panel up and running,” said John Zanni, Vice President of Marketing and alliances, Parallels. “The integrated billing, including software in the AWS Marketplace, makes using AWS efficient and effective. Being able to tap into the AWS user base is a really attractive strategy to broaden our customer reach.”
AWS Marketplace now includes a Big Data category for customers who want to analyze large amounts of data and are looking for ways to quickly solve Big Data problems. Customers can now deploy and immediately start using big data technologies such as Couchbase and MongoDB with the click of a button. In addition, AWS Marketplace offers open source operating systems CentOS, Debian and FreeBSD, extending the selection of operating systems available in Marketplace which include Windows Server, Red Hat Enterprise Linux, SUSE Linux Enterprise Server, and Ubuntu.


<Return to section navigation list>
