Tuesday, August 02, 2011

Windows Azure and Cloud Computing Posts for 8/2/2011+

A compendium of Windows Azure, SQL Azure Database, AppFabric, Windows Azure Platform Appliance and other cloud-computing articles.


• Updated 8/3/2011 8:00 AM PDT with new articles marked by Avkash Chauhan, Beth Massi, and Mary Jo Foley.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:


Azure Blob, Drive, Table and Queue Services

No significant articles today.


<Return to section navigation list>

SQL Azure Database and Reporting

No significant articles today.


<Return to section navigation list>

MarketPlace DataMarket and OData

No significant articles today.


<Return to section navigation list>

Windows Azure AppFabric: Apps, Access Control, WIF and Service Bus

No significant articles today.


<Return to section navigation list>

Windows Azure VM Role, Virtual Network, Connect, RDP and CDN

Avkash Chauhan described Modifying RDP access settings in Windows Azure Role to reclaim one input endpoint? in an 8/2/2011 post:

All Windows Azure roles can have internal endpoints, which are limited to 5 per role and are used for internal role-to-role communication. A Web role, like the other roles, can therefore support up to five internal endpoints. Depending on what your service looks like, if this is your front-end role (or your only role), RDP may actually be taking an input endpoint, not an internal endpoint. You are allocated 25 input endpoints across your entire service.

Because of the limit of 5 input endpoints per role and 25 in total, you may need to adjust RDP access in your service to reclaim one input endpoint in a specific role.

If you have RDP enabled in your Windows Azure application, RemoteForwarder will consume an input endpoint on one of your roles, and RemoteAccess will consume an internal endpoint on all of your roles. You can choose which role uses the input endpoint by modifying your csdef to put the RemoteForwarder import into the role you want to expose RDP port 3389 on.
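For example, the relevant ServiceDefinition.csdef imports might look like this; a sketch assuming a two-role service (the role names are illustrative), with the RemoteForwarder import placed only in the front-end role that should own the RDP input endpoint:

```xml
<!-- Sketch: ServiceDefinition.csdef (role names are illustrative assumptions) -->
<ServiceDefinition name="MyService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="FrontEndWebRole">
    <Imports>
      <!-- RemoteForwarder consumes the service's single RDP input endpoint (port 3389) -->
      <Import moduleName="RemoteAccess" />
      <Import moduleName="RemoteForwarder" />
    </Imports>
  </WebRole>
  <WorkerRole name="BackEndWorkerRole">
    <Imports>
      <!-- RemoteAccess alone consumes only an internal endpoint on this role -->
      <Import moduleName="RemoteAccess" />
    </Imports>
  </WorkerRole>
</ServiceDefinition>
```

Moving the RemoteForwarder import between roles moves the input endpoint with it.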


<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Cory Fowler (@SyntaxC4) claimed that you Must have NuGet Packages for Windows Azure Development in an 8/1/2011 post:

If you’re a Microsoft .NET Developer and haven’t yet heard of NuGet, you are missing out. NuGet is a Package Management System which allows for simple import, setup and upgrading of Third Party Software Packages. The best thing about NuGet is that it isn’t limited to Third Party Software, as it is possible to Create Packages and Custom Lists, which makes it perfect for sharing code within a Company.


How does NuGet fit into Windows Azure Development?

To be honest, you can definitely do your Windows Azure Development without NuGet. Where the real power of NuGet comes in is the Configuration and Upgrading.

NuGet out-of-the-box allows for Configuration and Source Code Transformations which simplify the process of using Community Contributions or Open Source Packages from other Software Vendors.

When taking a dependency on a Third Party Component, many projects never update to the newer bits of the packages they use. These updated bits could contain important security or service updates, which are ignored mostly because it’s too hard to upgrade or for fear of breaking changes to the product.

While NuGet doesn’t ultimately mitigate the risk of breaking changes, it does allow for a more controlled upgrade to packages as the manufacturer of the package is handling the update process.

Windows Azure Awesomesauce in NuGet

To get the ball rolling, install NuGet and execute your first command from the PowerShell-based Package Manager Console.

Get-Package -Filter Azure -ListAvailable

Results for "Get-Package -Filter Azure -ListAvailable"

This Query of Packages in NuGet containing reference to Windows Azure returns 34 Packages which can be used in your upcoming Windows Azure Projects.

My Windows Azure NuGet Package Recommendations

To set the stage here, I have only used a few of these packages in projects, but some of them are definite no-brainers when it comes to future projects I work on. Let’s dig in and see what has been made much easier for us! [Note: the Headers are the actual installation commands from the Package Manager Console]

Install-Package System.Web.Providers

The ASP.NET Universal Providers package is currently in a beta release. These extensions provide consistent Provider Support for Membership, Users, Roles, Profile and Session state, regardless of whether you deploy to an On-Premise Server or to the Cloud using Windows Azure.

You can think of System.Web.Providers as the next generation of the original Windows Azure Samples found in Microsoft.Samples.ServiceHosting.AspProviders. The benefit this package will hopefully bring is a fully supported implementation of Role and Membership Management in Windows Azure Table Storage.
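On install, the package also registers its providers in web.config. The sketch below shows roughly the shape of those entries (attributes abbreviated; the NuGet install writes the exact type names, version attributes, and a "DefaultConnection" connection string for you — treat the values here as placeholders):

```xml
<!-- Sketch of the web.config entries the package adds (abbreviated; values are placeholders) -->
<configuration>
  <connectionStrings>
    <!-- placeholder; point this at your cloud database when deploying -->
    <add name="DefaultConnection" connectionString="..." />
  </connectionStrings>
  <system.web>
    <membership defaultProvider="DefaultMembershipProvider">
      <providers>
        <add name="DefaultMembershipProvider"
             type="System.Web.Providers.DefaultMembershipProvider, System.Web.Providers"
             connectionStringName="DefaultConnection" />
      </providers>
    </membership>
  </system.web>
</configuration>
```

Because everything routes through "DefaultConnection", retargeting from on-premise to the cloud is a one-line connection-string change.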

[Follow: System.Web.Providers]

Install-Package WindowsAzure.ServiceBus

This is the June 2011 CTP of the Windows Azure AppFabric Service Bus Features.

This package will install and enable interaction with Windows Azure AppFabric Topics or Queues. I’m looking forward to diving into Topics and Queues for my upcoming presentation at SDEC2011.

[Follow: WindowsAzure.ServiceBus]

Install-Package WindowsAzure.WebRole.MVC3

Since the RTW of ASP.NET MVC 3.0, many developers have been tweeting and asking questions on the Forums about an MVC3-supported WebRole in the Windows Azure Tools. This hasn’t been added to the tooling to date, but you now have the ability to create MVC3-enabled Web Roles for your Windows Azure Deployments.

This package takes care of the Copy Local=true issue that many people have run into during their first deployment.

[Follow: WindowsAzure.WebRole.MVC3]

Install-Package WindowsAzure.Caching

Need to use the Windows Azure AppFabric Caching layer for one of your Web Applications? Look no further than this NuGet Package, which will download the required Assemblies as well as pre-fill the web.config file to get you well on your way to using Windows Azure AppFabric Caching.

[Follow: WindowsAzure.Caching]

Windows Azure CDN Helpers

This one breaks the mold, as there are two different packages. Whether you’re a Web Forms Developer or an MVC Developer, you’re covered, but by separate packages.

Install-Package CdnHelpers.Razor

Provides a Razor Syntax for the Windows Azure Content Delivery Network (CDN).

[Follow: CdnHelpers.Razor]

Install-Package CdnHelpers.ASPX

Provides an ASP.NET Web Forms approach to the Windows Azure Content Delivery Network (CDN).

[Follow: CdnHelpers.ASPX]

Install-Package TransientFaultHandlingFx

One of the important things to remember when dealing with Cloud Development is the possibility of failure. This package is a great find for anyone working with SQL Azure, Windows Azure Storage, or Windows Azure AppFabric.

There are a number of great implementations of Failure Retry Logic, as well as a number of Extension Methods which allow quick additions to pre-existing code that doesn’t currently implement retry logic.

[Follow: TransientFaultHandlingFx]

Wrapping Up

Thanks for reading this post! If you enjoyed the tips in this post you might be interested in my post on creating the Ultimate Windows Azure Development VM (or Environment).


Maarten Balliauw (@maartenballiauw) described A client side Glimpse to your PHP application in an 8/2/2011 post:

A few months ago, the .NET world was surprised with a magnificent tool called “Glimpse”. Today I’m pleased to release a first draft of a PHP version of Glimpse! Now what is this Glimpse thing… Well: "what Firebug is for the client, Glimpse does for the server... in other words, a client side glimpse into what's going on in your server."

For a quick demonstration of what this means, check the video at http://getglimpse.com/. Yes, it’s a .NET based video but the idea behind Glimpse for PHP is the same. And if you do need a PHP-based one, check http://screenr.com/27ds (warning: unedited :-))

Fundamentally Glimpse is made up of 3 different parts, all of which are extensible and customizable for any platform:

  • Glimpse Server Module
  • Glimpse Client Side Viewer
  • Glimpse Protocol

This means any server technology that provides support for the Glimpse protocol can provide the Glimpse Client Side Viewer with information. And that’s what I’ve done.

What can I do with Glimpse?

A lot of things. The most basic usage of Glimpse would be enabling it and inspecting your requests by hand. Here’s a small view of the information provided:

Glimpse phpinfo()

By default, Glimpse offers you a glimpse into the current Ajax requests being made, your PHP Configuration, environment info, request variables, server variables, session variables and a trace viewer. And then there’s the remote tab, Glimpse’s killer feature.

When configuring Glimpse through www.yoursite.com/?glimpseFile=Config, you can specify a Glimpse session name. If you do that on a separate device, for example a customer’s browser or a mobile device you are working with, you can distinguish remote sessions in the remote tab. This allows debugging requests that are being made live on other devices! A full description is over at http://getglimpse.com/Help/Plugin/Remote.

PHP debug mobile browser

Adding Glimpse to your PHP project

Installing Glimpse in a PHP application is very straightforward. Glimpse supports PHP 5.2 and higher.

  • For PHP 5.2, copy the source folder of the repository to your server and add <?php include '/path/to/glimpse/index.php'; ?> as early as possible in your PHP script.
  • For PHP 5.3, copy the glimpse.phar file from the build folder of the repository to your server and add <?php include 'phar://path/to/glimpse.phar'; ?> as early as possible in your PHP script.

Here’s an example of the Hello World page shown above:

<?php
require_once 'phar://../build/Glimpse.phar';
?>
<html>
<head>
    <title>Hello world!</title>
</head>

<?php Glimpse_Trace::info('Rendering body...'); ?>
<body>
    <h1>Hello world!</h1>
    <p>This is just a test.</p>
</body>
<?php Glimpse_Trace::info('Rendered body.'); ?>
</html>

Enabling Glimpse

From the moment Glimpse is installed into your web application, navigate to your web application and append the ?glimpseFile=Config query string to enable/disable Glimpse. Optionally, a client name can also be specified to distinguish remote requests.

Configuring Glimpse for PHP

After enabling Glimpse, a small “eye” icon will appear in the bottom-right corner of your browser. Click it and behold the magic!

Now of course: anyone can potentially enable Glimpse. If you don’t want that, ensure you have some conditional mechanism around the <?php require_once 'phar://../build/Glimpse.phar'; ?> statement.
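A minimal sketch of such a guard, using a hypothetical IP whitelist (the addresses and the phar path below are illustrative assumptions; any condition you trust works):

```php
<?php
// Sketch: load Glimpse only for whitelisted client addresses.
// The whitelist and the phar path are illustrative assumptions.
$glimpseAllowedIps = array('127.0.0.1', '::1');

if (isset($_SERVER['REMOTE_ADDR'])
    && in_array($_SERVER['REMOTE_ADDR'], $glimpseAllowedIps, true)) {
    require_once 'phar://../build/Glimpse.phar';
}
?>
```

A shared secret in a cookie or an environment check (e.g. only on your staging host) works equally well; the point is simply that the require_once never runs for ordinary visitors.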

Creating a first Glimpse plugin

Not enough information on your screen? Working with Zend Framework and want to have a look at route values? Working with WordPress and want to view some hidden details about a post through Glimpse? The sky is the limit. All there is to it is creating a Glimpse plugin and registering it. Implementing Glimpse_Plugin_Interface is enough:

<?php
class MyGlimpsePlugin implements Glimpse_Plugin_Interface
{
    public function getData(Glimpse $glimpse) {
        $data = array(
            array('Included file path')
        );

        foreach (get_included_files() as $includedFile) {
            $data[] = array($includedFile);
        }

        return array(
            "MyGlimpsePlugin" => count($data) > 0 ? $data : null
        );
    }

    public function getHelpUrl() {
        return null; // or the URL to a help page
    }
}
?>

To register the plugin, add a call to $glimpse->registerPlugin():

<?php
$glimpse->registerPlugin(new MyGlimpsePlugin());
?>

And Bob’s your uncle:

Creating a Glimpse plugin in PHP

Now what?

Well, it’s up to you. First of all: all feedback would be welcomed. Second of all: this is on Github (https://github.com/Glimpse/Glimpse.PHP). Feel free to fork and extend! Feel free to contribute plugins, core features, whatever you like! Have a lot of CakePHP projects? Why not contribute a plugin that provides a Glimpse at CakePHP diagnostics?


<Return to section navigation list>

Visual Studio LightSwitch and Entity Framework 4.1+

• Beth Massi (@bethmassi) reported a new article in MSDN Magazine, Programming Made Easy with Visual Studio LightSwitch, on 8/2/2011:

The August issue of MSDN Magazine is out and it features a couple of Visual Studio LightSwitch articles you should definitely check out. LightSwitch is a new product in the Visual Studio family aimed at developers of all skill levels who want to quickly create data-centric business applications for the desktop, Web and cloud.

The first one by Robert Green is a good overview of the development experience and capabilities of LightSwitch:

Build Business Applications with Visual Studio LightSwitch
In this article Robert shows you how LightSwitch simplifies the development process because it does most of the development work for you. You don’t need to write code to interact with databases and you don’t need to manually lay out screens. You can concentrate on the business logic.

The second one (written by yours truly) is aimed at the professional developer looking to customize their LightSwitch applications with more advanced features:

Advanced Programming Made Easy With Visual Studio LightSwitch
In this article I’ll show you how to work with LightSwitch APIs, create custom screen layouts, use advanced query processing, write complex business rules, and use and create LightSwitch extensions. You’ll see that programming even the more advanced features of a LightSwitch business application is simplified dramatically because LightSwitch handles all the plumbing for you.

Enjoy!


• Avkash Chauhan described Windows Azure and Visual Studio LightSwitch 2011 - Part 1/3 : Creating Hello World Application in Visual Studio LightSwitch for deployment on Windows Azure on 8/2/2011:

Visual Studio LightSwitch 2011 offers downloadable starter kits and flexible deployment options that help you create and easily release custom business apps that look professional and polished, with no coding required. Learn more at:

http://www.microsoft.com/visualstudio/en-us/lightswitch/overview

Technology used in this 3 part blog entry:

  • Visual Studio LightSwitch 2011
  • Visual Studio 2010 Pro and Above
  • Windows Azure SDK 1.4 Refresh

Visual Studio LightSwitch is the simplest way to develop business applications for the desktop and the cloud. LightSwitch handles all the plumbing for you so that you can concentrate on the business value. The only code you write is the code that only you can write.

After you have installed LightSwitch 2011, let’s now get to work.

Visual Studio LightSwitch 2011 Application Creation:

Step 1: Start LightSwitch 2011 and create a new application based on your choice of language. In this case I have created an application named "AzureLightSwitchWebRole".

The first screen will look as below:

Step 2: Because we are creating a very simple application which we will host in Windows Azure, and our main objective is Windows Azure, in the above screen you can select the "Create new table" option; you will see a new dialog window to create table items. Create an "AddressList" table and include table items as below:

Step 3: Once the above table items are created in the AddressList table, you can save your project and create a new screen as below:

Step 4: Once you have clicked the "Add Screen" option, you will see a new screen wizard as below:

Step 5: Please select "List and Details Screen" from the templates above and select your "AddressList" table as "Screen Data". Once the above selections are done, you will see the screen layout as below:

Step 6: The following screen just shows how you can use the screen layout properties to set individual table item properties:

That's it. I am not going to write any code behind it, and if you want to learn more you can download the Visual Studio LightSwitch 2011 Training Kit from the link below:

http://www.microsoft.com/download/en/details.aspx?displayLang=en&id=23746

Visual Studio LightSwitch 2011 Application Execution (As Desktop Application):

Now let's run our application; you will see the application executes as a Desktop Application as below:

In this application you can insert new addresses as below and save them. All new addresses are saved in the table, and when you start a new instance of the application the previously saved addresses are populated.

Visual Studio LightSwitch 2011 Application Execution (As Web Application inside Web Browser):

To test the same application inside the Web Browser, you will need to set the application properties as below:

Once the above properties selection is completed, you can run the application and you will see it runs in the web browser as below:

That's it here for us. In the second part of this blog entry we will publish the same application to Windows Azure.

[Next article, see below:] Windows Azure and Visual Studio LightSwitch 2011 - Part 2/3 : Publishing LightSwitch Application to Windows Azure using LightSwitch Publish Wizard

Windows Azure, Code Sample


• Avkash Chauhan continued his series with Windows Azure and Visual Studio LightSwitch 2011 - Part 2/3 : Publishing LightSwitch Application to Windows Azure using LightSwitch Publish Wizard on 8/2/2011:

In this section we will use the application "AzureLightSwitchWebRole" created in the first part of this blog (link below) and "Publish" it to Windows Azure using the LightSwitch Publish Wizard:

Windows Azure and Visual Studio LightSwitch 2011 - Part 1/3 : Creating Hello World Application in Visual Studio LightSwitch

Creating Windows Azure Service for LightSwitch Application:

Before we go further and start using the LightSwitch Azure Deploy wizard, let's visit the "Windows Azure Management Portal" and create an empty hosted service for our application. Open the "Windows Azure Management Portal" (https://windows.azure.com).

First create a new hosted service using "New Hosted Service" selection as below:

Then enter the following Hosted Service information (or you can choose your own name, but please remember it for later) and be sure to set the "Deploy Option" to "Do Not Deploy":

Above I have created a new hosted service named "azurelightswitchwebrole". Please wait to verify that your service is created without any glitch.

That's good. We have created an empty "Hosted Service" named "azurelightswitchwebrole" successfully.

Visual Studio LightSwitch 2011 Application Deployment to Windows Azure:

Please set your application to "Release" mode, then open your application properties and select the Publish option as below:

Once you select the "Publish" option in the above dialog, you will see the "LightSwitch Publish Application Wizard" open as below:

Please select "Web" as your application type and then select "Next" to proceed. You will now see the "Application Server Configuration" settings as below; please select "Windows Azure" as the hosting option and select "Next":

On the next wizard screen you will see that we are now dealing with Windows Azure settings. You will need active Windows Azure account details to proceed further. Now you are at the "Connect to Windows Azure" screen as below:

In the above screen, please enter your "Subscription ID" and then select a certificate which is already stored in the "Windows Azure Management Certificate" section. If you are not sure whether the selected certificate is located in Windows Azure or not, you can use the "Copy Path" button to get the temporary CER certificate location and then use the Windows Azure Management Portal to upload this certificate.

Note: The following screenshot shows where this certificate should be uploaded in your Windows Azure Management Portal:

Once you pass the "Connect to Windows Azure" screen above, you will see the following "Azure Service Configuration" screen, in which you select the Azure Service Name which you created above in step 1.

Once you pass the above "Azure Service Configuration" screen, you will see the "Security Settings" screen, which is designed to add an HTTPS endpoint to your application. Once you add the correct SSL certificate, issued to your domain, you will be able to get the https://www.yourservicename.com endpoint working.

Note: If you use the wrong SSL certificate in the above step, you will get a warning while accessing your application. You will see those errors in my case at the end of this blog, as I am using just any SSL certificate here.

Once you pass the above "Security Settings" screen, you will jump to the next wizard screen, "Database Connections". As your application will run on Windows Azure, you will need to connect it to a database. You can configure your application to connect with any on-premise database; however, here I will use SQL Azure. The "Database Connections" screen needs the following info:

Note: Please remember the following here:

  • Please be sure to add the proper database settings above, because during the application publish process this DB configuration will be used to set up table details; if there is any problem, your publish process will break.
  • Please be sure to add your local development machine's IP address to the SQL Azure firewall access rules, because during publish this IP address will be used to configure your database.
  • Please try the "Test Connection" option above, which will return an error containing your IP address; you can then add this IP address to the SQL Azure firewall.
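For reference, a SQL Azure connection string for the "Database Connections" screen generally takes the following shape (the server name, database and credentials below are placeholders, not values from this walkthrough):

```text
Server=tcp:yourserver.database.windows.net,1433;Database=YourDatabase;User ID=youruser@yourserver;Password=yourpassword;Encrypt=True;Trusted_Connection=False;
```

Note the user@server form of the login, which SQL Azure requires.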

Once you pass the above "Database Connections" screen, you can go to the next step, "Specify a Certificate", which is used to sign your XAP file. In my case I have used the same PFX to sign my XAP file. Because my LightSwitch application will run as a Silverlight application in Azure, you can sign your XAP file for further protection. This is a good option because your XAP will run on the client side, so setting this option is good security practice.

If you decide not to use the XAP signing step, you can bypass it by un-checking the "Specify a certificate" check box.

Once you pass the above "Specify a Certificate" screen, you are almost ready to start the actual publish step. Please select "Next" to see the summary as below:

Now you can start publishing just by using "Publish" option above. In your Visual Studio IDE you will see the publish progress as below:

Now if you look at your Windows Azure Management Portal, you will see a new package being processed as below:

It is possible that you will see your service in a stopped state as below; if that is the case, you can start it using the "Start" option in the Windows Azure Management Portal:

Once the service is started you will see some more progress as below:

Once all goes well, you will see the service in Ready status as below:

Now you can launch your application using the proper URL. In my case my service name is "AzureLightSwitchWebRole", so I can use the http://azurelightswitchwebrole.cloudapp.net or https://azurelightswitchwebrole.cloudapp.net URL to test the application as below:

I have also tested that once I close the browser and open it later, the items are saved. This proves that the database connection is working as expected, as is this simple application.


• Avkash Chauhan completed his series with Windows Azure and Visual Studio LightSwitch 2011 - Part 3/3 : Adding RDP Access to LightSwitch 2011 Application which is already deployed to Windows Azure on 8/2/2011:

In this section we will learn how to add RDP access credentials to an already-deployed LightSwitch application. You can visit parts 1 and 2 of my articles on the same topic below:

The need for this topic arises because the LightSwitch Azure Publish wizard deploys an application directly to Windows Azure (as explained in part 2 above); however, the wizard does not have a way to configure RDP access for the same application. This article fills the gap.

To start with, I assume that you already have your LightSwitch application running in Azure. If you look in your application's bin folder you will see the following files:

As you can see, we have CSPKG, CSCFG and CSDEF files in the above folder, which are the main files related to Windows Azure. To add RDP access we will need to edit both the CSDEF and CSCFG files.

First we need to generate RDP access credentials for a Windows Azure application so we can export them into the above LightSwitch CSDEF and CSCFG files. To start, please create a simple helloworld Azure application and use the Publish option to add RDP access to it.

Note: To learn how to add RDP access to an Azure application, please visit: http://blogs.msdn.com/b/avkashchauhan/archive/2011/04/03/setting-rdp-access-in-windows-azure-application-with-windows-azure-sdk-1-3-1-4.aspx

For RDP access to a Windows Azure application, a PFX certificate is used to encrypt the RDP access credentials, and this PFX certificate must be uploaded to the "Certificates" section of your Windows Azure application. So there are two options:

  1. We can create a new PFX certificate, use it to encrypt the RDP access credentials, and then upload the PFX to the "Certificates" section of your Windows Azure application in the Windows Azure Portal.
  2. We can use the same PFX certificate which we used to sign the XAP file in the "Specify a certificate" section (Part 2).

Because option 1 is already described in my blog linked above, I have decided to use option 2.

In the HelloWorld application publish wizard, I used the same certificate as below:

If you match the certificate thumbprint above with the Windows Azure application certificate, you can verify that the thumbprints match, so I don't need to upload it again.

Now, once the RDP access setup is completed in the sample helloworld application, you can see the following entries, which we will use in our LightSwitch application:

ServiceDefinition.CSDEF (RDP Access related settings only)

<Imports>
  <Import moduleName="RemoteAccess" />
  <Import moduleName="RemoteForwarder" />
</Imports>

ServiceConfiguration.cscfg (RDP Access related settings only)

<ConfigurationSettings>
  <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.Enabled" value="true" />
  <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountUsername" value="avkashc" />
  <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountEncryptedPassword" value="MIIBHwYJKoZIhvcNAQcDoIIBEDCCAQwCAQAxgdkwgdYCAQAwPzArMSkwJwYDVQQDEyBBdmthc2ggV2luZG93cyBBenVyZSBQb3J0YWwgQ2VydAIQXNki5y7CtptNzxrLjUQ52zANBgkqhkiG9w0BAQEFAASBgC2KUWTLVFI0NtfhznAc+LC40l/jmFBdoDlYqh7pBDs4ujEvYCTUDuqfVp2jlqRgKJGUf6UFxaXKDgnT78dirwuRnw8aYvlkLEDb0OvjG1DQWFp72XGwp3U8hSljX41zXnkjprEJo4tgaFQIycXkROU4y+11GfOgfzD4A75A95PHMCsGCSqGSIb3DQEHATAUBggqhkiG9w0DBwQI2oBjQVC07caACMjaoYP3REwm" />
  <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountExpiration" value="2011-11-30T23:59:59.0000000-08:00" />
  <Setting name="Microsoft.WindowsAzure.Plugins.RemoteForwarder.Enabled" value="true" />
</ConfigurationSettings>
<Certificates>
  <Certificate name="Microsoft.WindowsAzure.Plugins.RemoteAccess.PasswordEncryption" thumbprint="A77B40E35556DFDB09C3B246453A548B2D7B9444" thumbprintAlgorithm="sha1" />
</Certificates>

Now we will migrate these settings into our LightSwitch application. Here you will need to update ServiceDefinition.csdef and ServiceConfiguration.cscfg, which are located at the root of the LightSwitch application:



Now let's update the ServiceDefinition.csdef which is located @ C:\Azure\AzureLightSwitchWebRole\AzureLightSwitchWebRole\ServiceDefinition.csdef

ServiceDefinition.csdef (LightSwitch Application updated with RDP Access settings )

<ServiceDefinition name="AzureLightSwitchWebRole" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="LightSwitchWebRole" vmsize="Small" enableNativeCodeExecution="true">
    <ConfigurationSettings>
      <Setting name="Microsoft.LightSwitch.Trace.Enabled" />
      <Setting name="Microsoft.LightSwitch.Trace.LocalOnly" />
      <Setting name="Microsoft.LightSwitch.Trace.Level" />
      <Setting name="Microsoft.LightSwitch.Trace.Sensitive" />
      <Setting name="Microsoft.LightSwitch.Trace.Categories" />
      <Setting name="Microsoft.LightSwitch.RequireEncryption" />
    </ConfigurationSettings>
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="HttpIn" endpointName="HttpIn" />
          <Binding name="HttpsIn" endpointName="HttpsIn" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
      <InputEndpoint name="HttpsIn" protocol="https" port="443" certificate="SSLCertificate" />
    </Endpoints>
    <Certificates>
      <Certificate name="SSLCertificate" storeLocation="LocalMachine" storeName="My" />
    </Certificates>
    <Imports>
      <Import moduleName="RemoteAccess" />
      <Import moduleName="RemoteForwarder" />
    </Imports>
  </WebRole>
</ServiceDefinition>
Now let's update the ServiceConfiguration.cscfg which is located @ C:\Azure\AzureLightSwitchWebRole\AzureLightSwitchWebRole\ServiceConfiguration.cscfg

ServiceConfiguration.cscfg (LightSwitch Application updated with RDP Access settings)

<ServiceConfiguration serviceName="AzureLightSwitchWebRole" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="LightSwitchWebRole">
    <Instances count="1" />
    <ConfigurationSettings>
      <!-- A value of true will enable diagnostic logging on the server -->
      <Setting name="Microsoft.LightSwitch.Trace.Enabled" value="false" />
      <!-- A value of true only lets local access to Trace.axd -->
      <Setting name="Microsoft.LightSwitch.Trace.LocalOnly" value="true" />
      <!-- The valid values for the trace level are: None, Error, Warning, Information, Verbose -->
      <Setting name="Microsoft.LightSwitch.Trace.Level" value="Information" />
      <!-- A value of true will indicate that logging sensitive information is okay -->
      <Setting name="Microsoft.LightSwitch.Trace.Sensitive" value="false" />
      <!-- The semi-colon separated list of categories that will be enabled at the specified trace level -->
      <Setting name="Microsoft.LightSwitch.Trace.Categories" value="Microsoft.LightSwitch" />
      <!-- A value of true will indicate http requests should be re-directed to https -->
      <Setting name="Microsoft.LightSwitch.RequireEncryption" value="true" />
      <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.Enabled" value="true" />
      <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountUsername" value="avkashc" />
      <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountEncryptedPassword" value="MIIBHwYJKoZIhvcNAQcDoIIBEDCCAQwCAQAxgdkwgdYCAQAwPzArMSkwJwYDVQQDEyBBdmthc2ggV2luZG93cyBBenVyZSBQb3J0YWwgQ2VydAIQXNki5y7CtptNzxrLjUQ52zANBgkqhkiG9w0BAQEFAASBgC2KUWTLVFI0NtfhznAc+LC40l/jmFBdoDlYqh7pBDs4ujEvYCTUDuqfVp2jlqRgKJGUf6UFxaXKDgnT78dirwuRnw8aYvlkLEDb0OvjG1DQWFp72XGwp3U8hSljX41zXnkjprEJo4tgaFQIycXkROU4y+11GfOgfzD4A75A95PHMCsGCSqGSIb3DQEHATAUBggqhkiG9w0DBwQI2oBjQVC07caACMjaoYP3REwm" />
      <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountExpiration" value="2011-11-30T23:59:59.0000000-08:00" />
      <Setting name="Microsoft.WindowsAzure.Plugins.RemoteForwarder.Enabled" value="true" />
    </ConfigurationSettings>
    <Certificates>
      <Certificate name="Microsoft.WindowsAzure.Plugins.RemoteAccess.PasswordEncryption" thumbprint="A77B40E35556DFDB09C3B246453A548B2D7B9444" thumbprintAlgorithm="sha1" />
    </Certificates>
  </Role>
</ServiceConfiguration>
Note: It is always good practice to keep a backup of the above CSCFG and CSDEF files.

Now you can rename the CSPKG, CSCFG and CSDEF files which are located @ C:\Azure\AzureLightSwitchWebRole\AzureLightSwitchWebRole\Bin\<Release|Debug>\

Then you can run the same publish wizard and publish again. You will see new CSCFG, CSDEF, and CSPKG files created as below:

After publishing completes, you can verify that RDP access is enabled in your application. When you select your role in the Windows Azure Management Portal, you can see that RDP access is enabled and can be configured (if needed) as below:

When you select your instance in the Windows Azure Management Portal, you can likewise see that RDP access is enabled and can be configured (if needed) as below:

Now you can launch a Remote Desktop connection to your LightSwitch application, and you will have access to your VM without any problem, as below:

If you need assistance using a Remote Desktop connection with an Azure VM, please visit: http://blogs.msdn.com/b/avkashchauhan/archive/2011/04/03/how-to-login-into-windows-azure-virtual-machine-using-remote-desktop.aspx



SD Times Newswire reported a New LightSwitch Reports Extension by DevExpress on 8/2/2011:

DevExpress.com is proud to announce the immediate availability of the XtraReports Suite for LightSwitch, a full-featured reporting and data analysis extension for Visual Studio LightSwitch. With this new LightSwitch reporting extension, you can create a wide range of business reports for the LightSwitch platform. Whether you are generating a simple contact list or an advanced sales report, the XtraReports Suite for LightSwitch ships with a rich design experience to simplify the report generation process and to help deliver compelling and informative reports with ease. Features include:

• Easy-to-Activate Extension
• Fast Report Generation with Asynchronous Data Service
• Native Support for LightSwitch Data Structures
• Native LightSwitch Reporting Screen
• Powerful and Easy-to-Use Report Designer

To learn more about the XtraReports Suite for LightSwitch and how to deliver compelling business solutions targeting Visual Studio LightSwitch, visit:

If you own an active license of the XtraReports Suite, you’ll receive the DevExpress LightSwitch Reporting Extension free-of-charge. If you are new to XtraReports, you can purchase the LightSwitch Extension for $99.99 (per developer). In both instances, DevExpress never charges any runtime royalties or distribution costs.

Experience the DevExpress Difference
To download a free evaluation copy of the DevExpress LightSwitch Extensions along with our entire line-up of market leading Visual Studio Controls, IDE Productivity Tools and Application Frameworks, visit http://www.devexpress.com/LightSwitch

Recent Awards
DevExpress recently won thirteen (13) Visual Studio Magazine Readers' Choice Awards. To view a complete list of DevExpress awards, visit: http://www.devexpress.com/Home/Awards.xml


Jonathan Goodyear (@angryCoder) asserted “LightSwitch offers a good way to quickly create business apps that are easy to maintain and upgrade--and that's good for both end users and developers” in a deck for his Visual Studio LightSwitch Is Live; Should You Care? article of 8/1/2011 for DevProConnections:

Last year, I wrote about a beta product that Microsoft announced called LightSwitch (see "Visual Studio LightSwitch: Mort Lives!"). In a nutshell, Visual Studio LightSwitch 2011 (as it has been formally named at launch) is an approachable development tool positioned neatly between Office/Visual Basic for Applications (VBA) and full-fledged Visual Studio development on the development complexity scale. You can read more about the product and its features at the official Visual Studio LightSwitch website.

The key takeaway that I presented in my previous column is that Mort gets a .NET-friendly way to build line-of-business (LOB) applications quickly and easily, while we hard-core developers don't get ulcers when we inherit these LOBs as they graduate to "officially supported" status in the workplace. After all, unlike Office/VBA LOBs, LightSwitch applications are built around patterns, are scalable, and can be imported directly into Visual Studio.

Aside from this peripheral interest, though, should you care about LightSwitch? After all, we're not in the business of quick-n-dirty CRUD, are we? I disagree. I think that LightSwitch can be extremely useful, even to full-fledged software developers. Here are a few reasons.

One of the first things that popped into my head when I saw LightSwitch was how useful it could be for the creation of back-office applications to support customer-facing applications and websites. As a funny (or not so funny) anecdote, I led a team of developers several years ago that created a commerce website for a major television shopping channel. The project was done under extreme schedule duress, so as a result, we launched with virtually no back-office support. Here we were doing over one million dollars a day in sales, with the website content editors using SQL Server Enterprise Manager to enter product information. It would have been a godsend to have a tool like LightSwitch to help us bang out some back-office tools until we caught our breath. And the best part is that (as previously mentioned) we wouldn't have lost any of our work if we scaled up to a more complex solution.

Another huge benefit of LightSwitch is that since it is built on the Silverlight platform and can easily be re-targeted as a Windows desktop or web application, you instantly get support for the Apple Macintosh platform. In years past, this wouldn't have been such a big deal, but with the surge of Mac usage in corporate America (and beyond), it is increasingly important to keep multi-platform support in mind.

Microsoft (as well as I in my previous column) espouses the value of the hand-off of code from Mort to software developer via LightSwitch; but what about the other way around? Think about it this way. Have you ever built a Microsoft Excel spreadsheet, PowerPoint presentation, or Word document for a user who was not overly technically adept and then showed them how to adapt it for their needs? LightSwitch offers the same possibility for LOB applications.

Some Morts may not quite be proficient enough to wrap their arms around LightSwitch and start from scratch. However, if a seasoned developer got them started in a quick and easy way and gave Mort a head start toward solving their problem, both parties benefit. The time-consuming business rules customization could be performed by Mort by following a simple template, while the software developer can feel good about the code that he or she will inherit when the torch is eventually passed back to them (as we all know it will).

At the end of the day, LightSwitch isn't the kind of flashy new technology that gets us hard-core developers excited. However, it is the kind of technology that, if encouraged and properly implemented, could save many of us countless hours of aggravation porting poorly written LOBs as well as solve our cross-platform back-office tool needs. LightSwitch is a winner, and that's the message that we need to spread.

Jonathan Goodyear is president of ASPSOFT, an Internet consulting firm in Orlando. He is Microsoft regional director for Florida, an ASP.NET MVP, a Microsoft Certified Solution Developer, and a contributing editor for DevProConnections.


<Return to section navigation list>

Windows Azure Infrastructure and DevOps

The Windows Azure Team (@WindowsAzure) sponsored a series of Azure- and cloud-related articles in a Microsoft Cloud Computing section of The Telegraph:

'Cloud computing' is set to transform the IT industry - with Microsoft estimating that within three years it will be worth almost £2bn in the UK market alone, but what exactly is 'Cloud' all about? Check out our videos and articles to find out more about this latest development in IT infrastructure.

In the future, Cloud Computing will be the only choice: Successful businesses may soon have no chief executive, no headquarters and no IT infrastructure, says Nick Martindale.


Speakers from Microsoft, The Telegraph and the Government at the cloud computing conference, hosted by The Telegraph and Microsoft on the 40th floor of London’s famous Gherkin building. Photo: Andy Paradise.

I wouldn’t be surprised to see sections like this appear in other EMEA and US newspapers.


David Linthicum (@DavidLinthicum) asserted “When a VMware exec gets the importance of SOA wrong, you know a lot of folks are dangerously ignorant of the SOA-cloud connection” as a deck for his Ignorance of SOA leads to cloud failure article of 8/2/2011 for InfoWorld’s Cloud Computing blog:

In these days of YouTube, I'm always extra careful about what I say when I speak at conferences. I know that statements can be taken out of context, and saying something silly, stupid, or factually incorrect can haunt you for months. Evidently, at least some of the folks at EMC VMware are not worried about that.

Rod Johnson of VMware, in his keynote presentation (see the video below) and as reported by TheServerSide.com, states that SOA was a fad, whereas cloud computing is real: "If you look at the industry over the past few years, the way in which cloud computing is spoken of today is the way in which SOA was spoken of four or five years ago. I think with respect to SOA, it really was a fad."

In Johnson's defense, I think that declaration was not as uneducated as it sounds, so I urge you to listen to the entire presentation. But there was enough wrong with it that Johnson and anyone who took his comments at face value deserve additional education.

To those who use SOA to get to the cloud, Johnson's statement is akin to saying that sound engineering principles are a fad. Or that the new generation of cars is here to stay, er, until next year's models come out. You won't get to the new generation of cars without solid engineering principles, and those engineering principles are durable across many generations of cars.

The differences and value of both SOA and cloud computing concepts are lost on most in the emerging world of cloud computing -- Johnson included, it appears. Allow me to clear things up.

SOA and cloud computing are two very different concepts. SOA is something you do. It's an architectural pattern and approach where you address core IT resources as sets of services, then configure and reconfigure those services as solutions. SOA provides a great deal of value to the world of cloud computing, considering that you're dealing with clouds through APIs or services, so the use of SOA as a way to both build and leverage cloud computing is a natural fit.



Adron Hall (@adronbh) posted a link to a 00:01:31 Bing Maps Data Center Time Lapse video segment to his Composite Code blog on 8/2/2011:

This is pretty cool. A minute and a half time lapse of a data center build out in Colorado. It’s kind of interesting, in a hardware hacker geek kind of way.

The 00:02:15 version is better.

From the video’s description:

Microsoft Bing Maps works with Dell Data Center Solutions on adding Modular Data Centers as part of its new computing microsite in Longmont, Colorado.

Here’s a Bing Map of the location in Longmont:

Barton George (@barton808) added to the “microsite” description in his Dell’s Modular Data Center powers Bing Maps post of 8/1/2011:

Late last week we announced that Dell's Data Center Solutions group had outfitted Bing Maps' uber-efficient, uber-compact data center (or as Microsoft calls it, "microsite"), located in Longmont, Colorado. The facility is a dedicated imagery-processing site to support the Streetside, Bird's Eye, aerial, and satellite image types provided by Bing Maps. The site's key components are Dell's Modular Data Centers and Mellanox InfiniBand networking.

Brad Clark, Group Program Manager, Bing Maps Imagery Technologies, described their goal for the project: "Our goal was to push technological boundaries, to build a cost-effective and efficient microsite. We ended up with a no-frills, high-performance microsite to deliver complicated geospatial applications that can in effect 'quilt' different pieces of imagery into a cohesive mosaic that everyone can access."

Keeping things cool

The challenge when building out the Longmont site was to design a modular outdoor solution that was optimized for power, space, network connectivity and workload performance.

The modules that Dell delivered use a unique blend of free-air with evaporative cooling technology, helping to deliver world-class efficiency and a Power Usage Effectiveness (PUE) as low as 1.03.
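For context, PUE is simply total facility power divided by the power delivered to IT equipment, so a PUE of 1.03 means only about 3% overhead for cooling and power distribution. A quick sketch of the arithmetic (the wattage figures below are invented for illustration, not Dell's actual numbers):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,030 kW to run 1,000 kW of IT load has a PUE of 1.03,
# while a traditional data center at PUE 2.0 draws 2,000 kW for the same load.
print(round(pue(1030, 1000), 2))   # 1.03
print(round(pue(2000, 1000), 2))   # 2.0
```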

To watch the whole site being built in time-lapse check this out:

Extra-credit reading


<Return to section navigation list>

Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds

• Mary Jo Foley (@maryjofoley) described Reading the fine print on Microsoft's new 'cloud-ready' licensing terms in an article of 8/2/2011 for ZDNet’s All About Microsoft blog:

As of July 1, Microsoft introduced new licensing mobility terms into the company's volume license agreements. Microsoft's stated goal in making the changes was to make it easier for customers to move their application-server workloads from on-premises to the cloud and/or to run hybrid cloud/on-premises setups.

Directions on Microsoft, the Kirkland, Wash.-based Microsoft-watching firm, combed through the fine print and published a report entitled "Enterprise Cloud Licensing Rules Clarified" on some of the new restrictions and caveats.

Directions analyst John Cullen, who spent the last five years at Microsoft writing the licensing rules for Windows Server before joining the firm, is the author of the report. I had a chance to ask him about some of his findings. Here is our exchange:

MJF: What’s your take as to why Microsoft is making these changes (in terms of benefits to customers, partners and to the company itself)?

Cullen: This is to make it more attractive and easier for customers with Software Assurance to migrate to the cloud; specifically to the hoster’s third party cloud. It is also a clear Microsoft attempt to make sure SA maintains its relevancy with the move to cloud computing. Microsoft wants to further enable customers moving to the cloud; whether it’s on Microsoft’s hosted online services; or through a third party hoster under SPLA (Service Provider License Agreement). It also provides more value to the Software Assurance subscription program.

MJF: What do you anticipate the net effect of these changes will be? Are they going to be more positive or more negative for most customers? And positive or negative for the third-party hosters?

Cullen: Overall, these changes create a net positive effect because the customer can now do something that they couldn't do before, and Microsoft is not taking away any existing license rights. These changes provide added flexibility for customers to move licenses to third-party hoster clouds that are multitenant (i.e., hardware shared across multiple customers). The hoster can charge less money to the end customer and still make more money since customers are now bringing their own licenses. However, the hoster will have a more complicated internal process for tracking license compliance with the new rules.

MJF: Do you expect customer costs to go up, down or stay about the same as a result of these changes (and will different customer segments experience different effects)?

Cullen: This gets complicated because it depends on a license cost comparison between the overall costs when customer had to pay hoster for SPLA licensing fees versus the costs of on-premises licenses and SA that are moved to a hoster. Generally, I expect these changes can reduce licensing costs in many scenarios, but as highlighted, there can be cases when moving licenses to a hoster would require additional purchases to be compliant.
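The break-even comparison Cullen describes reduces to simple arithmetic: hoster-supplied SPLA fees accrue monthly, while license mobility means a one-time license purchase plus ongoing Software Assurance. A sketch with entirely hypothetical dollar figures (none of these come from Microsoft price lists):

```python
def total_spla_cost(monthly_spla_fee: float, months: int) -> float:
    """Hoster-supplied licensing: pay the SPLA fee every month."""
    return monthly_spla_fee * months

def total_mobility_cost(license_cost: float, annual_sa_cost: float, months: int) -> float:
    """License mobility: buy the license once, keep Software Assurance current."""
    years = months / 12
    return license_cost + annual_sa_cost * years

# Hypothetical example: $180/month under SPLA vs. a $3,000 license with
# $750/year SA, compared over a three-year term.
months = 36
spla = total_spla_cost(180, months)                 # 6480
mobility = total_mobility_cost(3000, 750, months)   # 5250.0
print(spla > mobility)  # True: mobility is cheaper in this scenario
```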

MJF: What types of customers are likely most affected by the reduction in the number of operating system environments (OSEs) allowed under license mobility?

Cullen: The most affected will likely be customers moving processor licenses for SQL Server Enterprise and Datacenter editions.

A scenario where you "win" (licenses let you do more in the cloud than on-premises): You're running one SQL Server workload on a dual-proc on-premises server licensed with two SQL Enterprise proc licenses. You can move the workload up to a multitenant hoster with a quad-proc box, at times using more proc "horsepower" than you did on-premises, and yet you only need to allocate ONE of your two SQL Enterprise proc licenses to do so.

A scenario where you "lose" (you'll need to supplement with more licenses when you move to the cloud): You're running four SQL Server workloads, each in its own VM, on a dual-proc on-premises server licensed with two SQL Server Enterprise proc licenses. Moving these licenses to the cloud would entitle you to run only two of the four workloads on the hoster's multitenant server.

MJF: Any particular “gotchas” worth pulling out from the report? In other words, should BizTalk users be especially wary? Or SQL Server customers? (and if so, why)

Cullen: The primary "gotcha" from our report concerns SQL Server and BizTalk Server, for both server and processor licenses. Here is an abbreviated chart showing how the use rights become more limiting when the licenses are moved to a hoster's multitenant servers:

image

This chart lists server application licenses and key differences between their on-premises licensing use rights and corresponding licensing use rights when moved to multitenant servers hosted by a service provider hoster. Depending on the product, the results of moving a license to the cloud can vary significantly.

Note that the term “virtual processor” used in the chart has a specific meaning for purposes of licensing. A virtual processor, for purposes of licensing, is the equivalent of one physical processor’s worth of computational power (with a physical processor defined as a chip occupying a socket on the motherboard), which is a very different definition than the industry-accepted technical definition of virtual processor. As stated in Microsoft’s quarterly Product Use Rights (PUR) document, “Solely for licensing purposes, a virtual processor is considered to have the same number of threads and cores as each physical processor on the underlying physical hardware system.”

Directions holds regular Microsoft Licensing Boot Camps, the next of which is slated for Orlando from October 25-26.


Jeff Woolsey posted Beware the VMware Memory vTax; Plus--Good News for Hyper-V… to the Microsoft Virtualization Team Blog on 8/2/2011:

Virtualization Nation,

The last few weeks have been buzzing with virtualization news. Just two examples are the Windows Server "8" Hyper-V Sneak Peek at the Microsoft Worldwide Partner Conference (WPC) and VMware's creation of the Memory vTax.

WPC: A Sneak Peek at Windows Server “8” Hyper-V

imageAt WPC, I participated in a keynote and got to demo the first sneak peek at the next version of Hyper-V. If you’d like to see the Windows Server “8” sneak-peek demo, go here and fast forward to 36:50 of this online video. Don’t wait. I don’t know how long the video will be up. Judging by the tens of thousands of views in the first couple of weeks, I think there’s a bit of interest.

Here’s what we showed:

  • Greater than 16 virtual processors within a Hyper-V VM. We are keenly aware that our customers want more virtual processors within a virtual machine to support large scale up workloads and the new version goes above and beyond in addressing that need. I demoed a 16-virtual-processor virtual machine under heavy load. I pointed out that 16 virtual processors is not the maximum number of virtual processors. It was simply the largest server I was able to ship for the demo. As for support for more virtual processors, I’ll just say, “Stay Tuned” for some good news.
  • Hyper-V Replica. It's time to democratize VM replication. Hyper-V Replica is asynchronous, application-consistent virtual machine replication. It lets you replicate a virtual machine from one location to another using Hyper-V and a network connection. Hyper-V Replica works with virtually any vendor's server hardware, networking, and storage products. In addition, we will provide unlimited VM replication in the box. What do I mean by unlimited VM replication? I mean exactly that. Replicate as many virtual machines as you want. Whether it's 1, 100, or 10,000 VMs, replicate as much as you want. Would you like to replicate VMs:
    • from your primary site to your secondary site?
    • from your branch office to your corporate office? Vice versa?
    • to a private cloud hoster?

We think you should be able to do those things—without paying a per-VM Replication Tax.

This is what our valued customers have asked for, and what we’re delivering.

The VMware Memory Entitlement (vTax)

We've been deluged with email this week asking about the new VMware vRAM entitlement, which has quickly been dubbed the "VMware vTax." Here's a quick description from VMware vSphere 5.0 Licensing, Pricing and Packaging, p. 3:

“vSphere 5.0 will be licensed on a per processor basis with a vRAM entitlement. Each vSphere 5.0 CPU license will entitle the purchaser to a specific amount of vRAM, or memory configured to virtual machines.”

I’ll get into the details below, but specifically, here’s what folks are asking:

  1. What do you think about the new VMware vSphere 5.0 Licensing Changes?
  2. Does this make sense to you?

In a word, NO. But don't take my word for it. Let's see what VMware's customers think of these changes, starting with Twitter. …

Jeff continues with pasted tweets, excerpts from analyst posts, and VMware forum messages complaining about the vTax issue. He then compares the licensing cost of one and ten servers with vSphere 5 and Microsoft Hyper-V Server 2008 R2:

1 Server: vSphere 5 vs. Microsoft Hyper-V Server 2008 R2 SP1

Let’s compare a single server populated with various memory configurations.

In this first comparison, let’s analyze the effect of the VMware Memory Tax and focus on the hypervisor layer. For this comparison, I’m going to use VMware vSphere 5.0 and Microsoft Hyper-V Server 2008 R2 SP1. This comparison allows us to focus on the ability of the hypervisor to fully utilize memory resources in a physical server for virtual machines. Let me preface this example by stating, this comparison doesn’t include hardware, guest operating system licenses, storage, networking or systems management.

This comparison includes VMware’s Support and Subscription (SnS) licensing. I was going to be charitable and omit the SnS subscription, but then I read in the VMware vSphere 5.0 Licensing, Pricing and Packaging whitepaper at the bottom of pages 3-11 in the fine print it states, “SnS is required for all vSphere purchases.” Thus, I included the SnS License per VMware’s requirement. It should be noted that because the VMware Memory Tax requires purchasing more licenses for larger memory footprints and because "a Support and Subscription (SnS) contract is required for every vSphere Edition purchase", the SnS requirement acts as a subtle, additional tax even if the user is purchasing the extra license for vRAM capacity.

The first comparison is vSphere 5 to Microsoft Hyper-V Server 2008 R2 SP1.

image

Table 1: Memory Tax: vSphere 5 vs. MS Hyper-V Server 2008 R2 SP1 (1 Server)
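The per-server license count behind a comparison like Table 1 follows from the vRAM rule quoted above: each vSphere 5.0 CPU license carries a fixed vRAM entitlement, so you need enough licenses to cover both the physical sockets and the memory configured to VMs. A sketch of that arithmetic (the 48 GB default reflects the Enterprise Plus entitlement as announced at launch; treat it, and any pricing you plug in, as an illustrative assumption):

```python
import math

def vsphere5_licenses(cpu_sockets: int, configured_vram_gb: int,
                      vram_entitlement_gb: int = 48) -> int:
    """Licenses needed: at least one per physical socket, and enough
    pooled vRAM entitlements to cover all memory configured to VMs."""
    for_vram = math.ceil(configured_vram_gb / vram_entitlement_gb)
    return max(cpu_sockets, for_vram)

# A 2-socket host needs only 2 licenses until configured vRAM passes 96 GB;
# fill the same host with 384 GB of vRAM and the license count quadruples.
print(vsphere5_licenses(2, 96))    # 2
print(vsphere5_licenses(2, 384))   # 8
```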

10 Servers: vSphere 5 vs. Microsoft Hyper-V Server 2008 R2 SP1

Now, let’s compare populating a pool of virtualization servers with various memory configurations.

In this second comparison, let’s analyze the effect of the VMware Memory Tax on a 10 node cluster (or two 5 node clusters if you prefer). This second comparison also allows us to focus on the ability of the hypervisor to fully utilize memory resources in a pool of physical servers for virtual machines. Like the first example, let me preface this by stating, this comparison doesn’t include hardware, guest operating system licenses, storage, networking or systems management.

This comparison includes VMware’s Support and Subscription (SnS) licensing per VMware’s requirement because “SnS is required for all vSphere purchases.” It should be noted that because the VMware Memory Tax requires purchasing more licenses for larger memory footprints and because "a Support and Subscription (SnS) contract is required for every vSphere Edition purchase", the SnS requirement acts as a subtle, additional tax even if the user is purchasing the extra license for vRAM capacity [only].

image

Table 2: Memory Tax: vSphere 5 vs. MS Hyper-V Server 2008 R2 SP1 (10 Servers)

At this point, you may be thinking, "Doesn't VMware offer a free ESXi version?"

Yes, and VMware has promptly downgraded it. The free version of vSphere ESXi 5 is limited to 8 GB of memory.

1 Server: vSphere 5 vs. Windows Server 2008 R2 SP1 [Full ECI Stack]

In this next analysis, let’s look at the full stack and the effect of the VMware Memory Tax on the complete equation. For this comparison, I’m going to use VMware vSphere 5.0 and the Microsoft ECI Suite.

The Microsoft ECI Suite includes: Windows Server 2008 R2 SP1 Datacenter Edition and System Center Datacenter Edition and Forefront Security. At a very high level, this provides:

  • Unlimited number of virtualized Windows Server instances
  • Management for an unlimited number of virtual machines (and more importantly the apps running within those VMs) for all of System Center including:
    • Operations Manager
    • Configuration Manager
    • Data Protection Manager
    • Service Manager
    • Opalis (Orchestrator)
      • Opalis was released and added to the System Center Suite this past year. Existing customers received it automatically at no additional cost.
    • Virtual Machine Manager
  • Forefront for an unlimited number of virtual machines which provides a unified, multilayered, and highly manageable approach to protecting servers from malware

The VMware figures below include VMware’s Support and Subscription (SnS) licensing per VMware’s requirement that “SnS is required for all vSphere purchases.” It should be noted that because the VMware Memory Tax requires purchasing more licenses for larger memory footprints and because "a Support and Subscription (SnS) contract is required for every vSphere Edition purchase", the SnS requirement acts as a subtle, additional tax even if the user is purchasing the extra license for vRAM capacity. The VMware figures below also include the cost of providing unlimited Windows Server Datacenter instances to more closely match the Microsoft ECI offering. The VMware figures do not include System Center or Forefront licensing. Like the previous examples, this example doesn’t include server hardware or storage.

image

Table 3: Memory Tax: vSphere 5 vs. Microsoft ECI (1 Server)

10 Servers: vSphere 5 vs. Windows Server 2008 R2 SP1

In this final analysis, let’s look at the full stack and the effect of the VMware Memory Tax on the complete equation. For this comparison, I’m going to use VMware vSphere 5.0 and the Microsoft ECI Suite.

The Microsoft ECI Suite includes: Windows Server 2008 R2 SP1 Datacenter Edition and System Center Datacenter Edition and Forefront Security. At a very high level, this provides:

  • Unlimited number of virtualized Windows Server instances
  • Management for an unlimited number of virtual machines (and more importantly the apps running within those VMs) for all of System Center including:
    • Operations Manager
    • Configuration Manager
    • Data Protection Manager
    • Service Manager
    • Opalis (Orchestrator)
      • Opalis was released and added to the System Center Suite this past year. Existing customers received it automatically at no additional cost.
    • Virtual Machine Manager
  • Forefront for an unlimited number of virtual machines which provides a unified, multilayered, and highly manageable approach to protecting servers from malware

The VMware figures below include VMware’s Support and Subscription (SnS) licensing per VMware’s requirement that “SnS is required for all vSphere purchases” and include the cost of providing unlimited Windows Server Datacenter instances to more closely match the Microsoft ECI offering. The VMware figures do not include System Center or Forefront licensing. Like the previous examples, this example doesn’t include server hardware or storage. Let’s take a look at a 10 node cluster (or two 5 node clusters if you prefer).

image

Table 4: Memory Tax: vSphere 5 vs. Microsoft ECI (10 Servers)

Your Questions Answered

So let’s return to your two questions:

  1. What do you (Microsoft) think about the new VMware vSphere 5.0 Licensing Changes?
  2. Does this make sense to you?

There’s very little to say that hasn’t already been said by VMware’s own customers.

VMware's licensing changes lay the foundation to lock customers into high-priced software and into a business model based on taxing customers for achieving greater density and maximizing hardware resources. These changes fly in the face of the benefits of virtualization and cloud computing. Specifically, the vSphere licensing model has devolved from per-processor with physical core restrictions, commonly referred to as the VMware Core Tax, to per-processor with vRAM entitlements, a new VMware Memory Tax. VMware's Memory Tax fundamentally goes against the economics of the private cloud and undermines what you have come to expect from virtualization. Namely, you want to maximize hardware utilization, drive up density, and reduce costs.

What’s unfathomable is that we’re having this conversation at all. Increased hardware utilization, better density and lower costs are why people gravitated to virtualization in the first place. This is Virtualization 101.

VMware also fails to recognize what is important in virtualized environments today, especially as we move toward private cloud solutions. Aspects such as management and monitoring of applications and cross-platform support have been overlooked, and with vCloud Director, VMware's private cloud story is still focused on VMware-only infrastructures. vSphere 5 is the latest VMware toll booth erected on the road to the private cloud, in a history where increased licensing costs are a regular occurrence. Two years ago it was the Core Tax, where many saw their licensing increase over 200%; now it's the Memory Tax, where many are seeing licensing increases of upwards of 200-400% and higher.

With Microsoft, customers can build scalable virtualized infrastructures on Hyper-V and with System Center, accelerate their progression towards private cloud environments with deep application monitoring, protection and management along with rich self-service capabilities. All of this, without the restrictive licensing that accompanies vSphere, ensures that a Microsoft private cloud provides the greatest value at the lowest costs.

As for scalability, you should know that scalability and performance are ongoing development activities at Microsoft. Scalability and performance work is never complete. If you look at Windows Server, we have improved the scalability, performance, and capabilities in every release. Needless to say, the next version of Windows Server will improve on these numbers and you can expect even more capabilities.

At the Worldwide Partner Conference 2011, we demonstrated some of the new capabilities of Windows Server “8,” specifically around Hyper-V. With an ability to create VMs with more than 16 virtual processors and built-in replication with Hyper-V Replica, Microsoft is showcasing its deep commitment to its customers, and our relentless pursuit to provide even more value, at no extra cost. These are just 2 of the hundreds of features coming in the Microsoft Private Cloud, of which you’ll be able to find out more at Microsoft’s BUILD conference, September 13th-16th in Anaheim, CA.

Finally, I don’t understand how VMware can claim a memory tax benefits customers. I’ve had the privilege of working on virtualization for over a decade and not once has a customer told me, “I really wish you would license virtualization by the memory assigned to a VM.”

Not once.

Next question: Does Microsoft plan to do anything similar to the vTax?

NO, we have no intention of imposing:

  • A VM Memory vTax
  • A VM Core vTax
  • A VM Replication vTax

Per VM taxes are what virtualization vendors do, not strategic cloud providers.

See you at Build,

Jeff Woolsey, Windows Server & Cloud

P.S. YES, the amount of memory in a Hyper-V “8” VM is going to go up. Way up.

Jeff Woolsey is a Principal Program Manager Lead for Windows Virtualization in the Microsoft Server and Tools Division.


<Return to section navigation list>

Cloud Security and Governance

Alex Bewley answered The Cost of the Cloud – How to Budget for Public Cloud? in a guest post of 8/2/2011 to the CloudTimes blog:

The ultimate goal of deploying applications or dynamic infrastructure to the cloud is the truly agile and cost-competitive nature of running and managing applications and infrastructure. However, cost can increase exponentially without proper cloud monitoring and cloud cost modeling. It has become crucial for IT to tie cloud success to cost analysis, in addition to overall system performance. In this article, I’ll cover some common pitfalls and pains around current gaps in cloud costing and deployment, as well as a key set of questions to help IT make smart cloud decisions.

Up to now, the success of applications in cloud, virtual and physical environments has usually been viewed in only two dimensions – availability and performance. However, a key new dimension is cost, and it’s cost that will dramatically influence what, when and where IT organizations deploy to the cloud. Presently, there are no cloud monitoring tools that can help IT and LOBs monitor their cloud costs, predict workload/application cost, notify when costs are escalating, or provide deep cloud capacity management. However, we do see this tooling issue changing in the near future.

To date, companies have been oblivious to the workload cost of an application running in the cloud, apart from an unclear monthly bill. We are entering a new era where performance and availability will be baseline requirements, but workload cost efficiency will be the new key to success. This will be the age of ‘economic compute’ and will be defined by how and where companies can run workloads at the best cost (assuming performance and availability remain constant). It won’t matter whether a workload runs internally on physical or virtual servers, or in the cloud, as the economics will drive this decision. However, the linchpin of this costing decision model is missing, namely: how do we monitor and meter cloud costs in real time?

To responsibly manage IT budgets with cloud deployments, companies need visibility to the cost and performance data of workloads, applications and dynamic infrastructure services. However, the industry is missing a complete toolset or product suite that can help IT easily see and predict the cost of cloud deployment. Applications and services can be deployed on cloud infrastructure (assuming it returns acceptable performance and availability), but it’s essential for IT to have clear visibility to what the workloads will cost comparatively, across different cloud vendors or even the cost of an internally run workload. How can IT make a cost-conscious decision without the basic cost data of an application, workload or service? Quite simply, it can’t. This is where the idea of the economic cloud comes into play:

Example #1 – Dynamic Infrastructure Services:

  • Ensure IT Doesn’t Overpay: A company may have provisioned a $500 per month system, but if its CPU is consumed only 10 percent of the time, it is largely overpaying. Now scale that scenario out to a company that is running many services, applications and servers in the cloud.
  • Companies with Many Separate Cloud Accounts: For IT managers trying to understand the cumulative costs of many developers or departments (LOBs) with their own cloud accounts, the task can be almost impossible; there is no clear means of reconciling usage until it’s too late.
  • Manage Cost Across Geographically Dynamic Workloads: For more advanced scenarios, there are now a number of services that allow the creation of cloud instances in specific geographic regions, which enables a new generation of smartphone or mobile applications to exist. There are millions of smartphone users in the world in non-North American geographies, such as Latin America. Now, imagine if you could dynamically and geographically provision cloud resources that are compute heavy, or can service the requests of these remote smartphone clients, in a cost effective manner. This reduces bandwidth requirements, increases the response time and can be done on cheaper, temporarily available compute resources. This kind of dynamism is incredibly powerful, yet monitoring the changing costs and performance of these cloud resources is going to be a difficult problem to solve.

Example #2 – Applications in the Cloud:

  • See Cloud Cost: IT needs to see a clear monthly workload cost of its entire Amazon AWS deployment (by server, application or service) before the bill arrives. For companies that have deployed in AWS, the anxiety associated with the monthly AWS bill can be considerable. Seeing an accurate monthly bill forecast immediately when you make changes to your cloud deployments removes risk and provides much-needed cloud cost certainty.
  • Predict Cloud Cost: Reports are needed that can estimate or predict the cost of running an application or service in AWS before it’s deployed. Predicting cloud cost based on individual workloads, applications or services is essential.
  • Identify Cloud Ready Applications: Reporting that can show which workloads are prime candidates for cloud deployment would be extremely helpful to IT departments wrestling with how to use cloud most effectively.

Example #3 – Development and Testing in the Cloud: Although spinning up new test and development environments in the cloud improves agility, the essential questions to ask are:

  • Internal or Cloud: Is it more cost-effective to host the application or service internally or in the cloud? Can IT prove its decision?
  • Which Cloud is Best: Which cloud vendor should be chosen? What is an easy way to see a cost comparison for the different vendors, based on your workload?
  • How Much will it Cost: How much will this service/workload cost month over month?
  • Failsafe Cloud Alerting and Reports: Additionally, the added problem of developers forgetting to de-commission cloud infrastructure and services drives major cost overruns. Proper notification of these ‘cloud zombies’ is essential to prevent large bills over time. While services like Amazon’s AWS are an incredible boon to being able to create development and test environments in a few clicks, there are latent costs which aren’t always readily apparent. For example, stopped instances still consume storage resources (and cost), snapshots linger even when volumes are deleted (and add more cost), just to name a few. IT needs better visibility.
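The ‘cloud zombie’ check described in the last bullet can be sketched as a sweep over a resource inventory for unattached items idle past a threshold. The inventory format below is a made-up example; a real implementation would pull this data from the provider’s API or billing export:

```python
from datetime import date

# Hypothetical inventory export: stopped volumes and snapshots keep
# accruing storage charges even when nothing references them.
inventory = [
    {"id": "vol-1",  "attached": False, "last_used": date(2011, 5, 1)},
    {"id": "snap-1", "attached": False, "last_used": date(2011, 7, 30)},
    {"id": "i-1",    "attached": True,  "last_used": date(2011, 8, 1)},
]

def find_zombies(resources, today, max_idle_days=30):
    """Flag unattached resources idle longer than the threshold."""
    return [r["id"] for r in resources
            if not r["attached"]
            and (today - r["last_used"]).days > max_idle_days]

print(find_zombies(inventory, date(2011, 8, 2)))  # prints: ['vol-1']
```

Wiring a report like this to an alerting channel is the “failsafe” notification the article argues is essential to keep forgotten test environments from inflating the bill.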

As stated earlier, there are no tools that can help IT (or LOBs) model the cost of their cloud needs, predict their workload costs or notify when costs are escalating. However, there are tools coming in the future.

The most fundamental aspect of optimizing cloud deployments is to understand the relationship between application/infrastructure capacity and cost. Presently, the industry is just beginning to understand how to monitor the performance of applications in the cloud, yet it lacks a cloud-costing dashboard necessary for IT managers to make smart budget related decisions. How can organizations understand the cost of cloud computing without a deeper level of visibility? It has become crucial for IT to tie cloud success to cost analysis, as well as overall system performance.

So the question is…to Cloud or not to Cloud? How will you tie your cloud decisions to cost for justification to senior management? Or will you just deploy and cross your fingers?

Alex is CTO at uptime software (and uptime cloud).


<Return to section navigation list>

Cloud Computing Events

Bruce Kyle reported BUILD Windows Conference Sells Out, Content Will Be Available Online in a 7/2/2011 post:

If you haven’t registered for the BUILD Windows Conference in Anaheim, CA, you will still be able to watch the keynotes live and the sessions online the following day. BUILD is September 13-16.

Microsoft will be announcing the developer strategy for Windows 8 at the BUILD Windows conference.

BUILD is a new event that shows modern hardware and software developers how to take advantage of the future of Windows. Learn how to work with the all-new touch-centric user experience to create fast, fluid, and dynamic applications that leverage the power and flexibility of the core of Windows, used by more than a billion people around the world.

Hear how the UI was designed to work seamlessly with a diversity of devices and form factors. Go behind the scenes and learn all about the new app model that allows you to create powerful new apps, all while retaining the ability to use your existing apps. Web-connected and web-powered apps built using HTML5 and JavaScript have access to the power of the PC. Touch-optimized browsing, with the full power of hardware-accelerated Internet Explorer 10, transforms your experiences with the web. BUILD is the first place to dive deep into the future of Windows.

There was no session or speaker information for the BUILD Conference (@bldwin) as of 12:30 PM on 8/2/2011, less than six weeks in advance of the conference. The site was last updated on 6/1/2011. Neither Mary Jo Foley (@maryjofoley) nor I (@rogerjenn) had received press credentials as of Monday. The conference remains a pig in a poke (#PigInPoke) for me.

Update 8/2/2011 4:30 PM PDT: Finally received press credentials for BUILD on Tuesday.


<Return to section navigation list>

Other Cloud Computing Platforms and Services

No significant articles today.


<Return to section navigation list>

0 comments: