Windows Azure and Cloud Computing Posts for 8/9/2011+
A compendium of Windows Azure, SQL Azure Database, AppFabric, Windows Azure Platform Appliance and other cloud-computing articles.
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:
- Azure Blob, Drive, Table and Queue Services
- SQL Azure Database and Reporting
- Marketplace DataMarket, AppsMarket and OData
- Windows Azure AppFabric: Apps, Access Control, WIF and Service Bus
- Windows Azure VM Role, Virtual Network, Connect, RDP and CDN
- Live Windows Azure Apps, APIs, Tools and Test Harnesses
- Visual Studio LightSwitch and Entity Framework v4+
- Windows Azure Infrastructure and DevOps
- Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds
- Cloud Security and Governance
- Cloud Computing Events
- Other Cloud Computing Platforms and Services
Azure Blob, Drive, Table and Queue Services
Wade Wegner (@WadeWegner) described the WA Toolkit for iOS: New Project Experience for Accessing Windows Azure Storage in an 8/8/2011 post:
The Windows Azure Toolkit for iOS provides an easy and convenient way of accessing Windows Azure storage from iOS-based applications.
The toolkit works in two ways: it can be used to access Windows Azure storage directly, or it can go through a proxy service known as the Cloud Ready package. The Cloud Ready code is the same code used in the WP7 toolkit for Windows Azure (found here) and removes the need for the developer to store the Azure storage credentials locally on the device. If you are planning to test using the proxy server, you’ll need to download and deploy the server controls hosted on the CodePlex site.
Unpacking the v1.2 library zip file
In the zip file, you’ll find several folders:
- /4.3-device – the library binary for iOS 4.3 (device)
- /4.3-simulator – the library binary for iOS 4.3 (simulator)
- /include – the headers for the library
Creating your first project using the toolkit
If you are not familiar with XCode, this is a short tutorial for getting your first project up and running. Launch XCode 4 and create a new project:
Select a View-based application and click Next.
Give the project a name and company. For the purposes of this walkthrough, we’ll call it “FirstAzureProject”. Do not include Unit Tests.
Pick a folder to save the project to, and uncheck the source code repository checkbox.
When the project opens, right click on the Frameworks folder and select “Add Files to…”
Locate the libwatoolkitios.a library file from the download package folder (from either the simulator or device folder), and add it to the Frameworks folder.
Now, click on the top most project (FirstAzureProject) in the left hand column. Click on the target in the second column. Click on the “Build Settings” header in the third column. Ensure that the “All” button is selected to show all settings.
In the search box, type in “header search” and look for an entry called “Header Search Paths”
Double click on this line (towards the right of the line), and click on the “+” button in the lower left.
Add the path to the folder containing the header files (this is the include folder from the download). For example, "~/Desktop/v1.0.1/include" if you have extracted the folder to your desktop. Be sure to wrap the path in quotes if it contains spaces.
Remaining in the “Build Settings” tab, now type “other linker flags” into the search bar. In the “Other Linker Flags” section, add two flags: -ObjC and -all_load.
Finally, click on the “Build Phases” tab and expand the “Link Binary with Libraries” section:
Click on the “+” button in the lower left, and scroll down until you find a library called “libxml2.2.dylib”. Add this library to your project.
Testing Everything Works
Now that you’ve added all of the required references, let’s test that the library can be called. To do this, double click on the [ProjectName]AppDelegate.m file (e.g. FirstAzureProjectAppDelegate.m), and add the following imports to the class:
#import "WAAuthenticationCredential.h"
#import "WACloudStorageClient.h"

Perform a build. If the build succeeds, the library has been correctly added to the project. If it fails, go back and check the header search paths.
Assuming it builds, in the .m file, add the following declarations after the @synthesize lines:
WAAuthenticationCredential *credential;
WACloudStorageClient *client;

Now, add the following lines after the [self.window makeKeyAndVisible] line in the didFinishLaunchingWithOptions method:
credential = [WAAuthenticationCredential credentialWithAzureServiceAccount:@"ACCOUNT_NAME" accessKey:@"ACCOUNT_KEY"];
client = [WACloudStorageClient storageClientWithCredential:credential];

[client fetchBlobContainersWithCompletionHandler:^(NSArray *containers, NSError *error)
{
    if (error)
    {
        NSLog(@"%@", [error localizedDescription]);
    }
    else
    {
        NSLog(@"%i containers were found…", [containers count]);
    }
}];

Be sure to replace ACCOUNT_NAME and ACCOUNT_KEY with your Windows Azure storage account name and key, available on the Windows Azure portal (http://windows.azure.com).
Build and run the project. You should see something similar to the following output in the debug window:
2011-05-06 18:18:46.001 FirstAzureProject[27456:207] 2 containers were found…
The last line shows that this account has 2 containers. This will of course vary, depending on how many blob containers you have set up in your own Windows Azure account.
Doing more with the toolkit
Feel free to explore the class documentation to learn more about the toolkit API. To help, here are some additional examples:
In [ProjectName]AppDelegate.m class, add the following headers:
#import "WAAuthenticationCredential.h"
#import "WACloudStorageClient.h"
#import "WABlobContainer.h"
#import "WABlob.h"
#import "WATableEntity.h"
#import "WATableFetchRequest.h"
#import "WAQueue.h"
#import "WAQueueMessage.h"

In the didFinishLaunchingWithOptions method, after the [self.window makeKeyAndVisible] line, try testing a few of the following commands. Again, running the project will return results in the debugger window.
To authenticate using account name and key:
credential = [WAAuthenticationCredential credentialWithAzureServiceAccount:@"ACCOUNT_NAME" accessKey:@"ACCOUNT_KEY"];

To authenticate instead using the proxy service from the Windows Phone 7 toolkit, you can use the following:
credential = [WAAuthenticationCredential authenticateCredentialWithProxyURL:[NSURL URLWithString:@"PROXY_URL"] user:@"USERNAME" password:@"PASSWORD" withCompletionHandler:^(NSError *error)
{
    if (error)
    {
        NSLog(@"%@", [error localizedDescription]);
    }
    else
    {
        NSLog(@"Successfully logged in");
    }
}];

Replace PROXY_URL, USERNAME, and PASSWORD with the information required to access your proxy service.
To create a new client using the credentials:
client = [WACloudStorageClient storageClientWithCredential:credential];

To list all blob containers (this method is not supported via the proxy server):
// get all blob containers
[client fetchBlobContainersWithCompletionHandler:^(NSArray *containers, NSError *error)
{
    if (error)
    {
        NSLog(@"%@", [error localizedDescription]);
    }
    else
    {
        NSLog(@"%i containers were found…", [containers count]);
    }
}];

To get all tables from storage (this works with both direct access and proxy):
// get all tables
[client fetchTablesWithCompletionHandler:^(NSArray *tables, NSError *error)
{
    if (error)
    {
        NSLog(@"%@", [error localizedDescription]);
    }
    else
    {
        NSLog(@"%i tables found", [tables count]);
    }
}];

To create a table (works with both direct access and proxy):
// create a table
[client createTableNamed:@"testtable" withCompletionHandler:^(NSError *error)
{
    if (error)
    {
        NSLog(@"%@", [error localizedDescription]);
    }
    else
    {
        NSLog(@"Table created");
    }
}];

To delete a table (works with both direct access and proxy):
// delete a table
[client deleteTableNamed:@"wadestable" withCompletionHandler:^(NSError *error)
{
    if (error)
    {
        NSLog(@"%@", [error localizedDescription]);
    }
    else
    {
        NSLog(@"Table was deleted");
    }
}];

To get entities for a table (works with both account key and proxy):
// get entities for the Developers table
WATableFetchRequest *fetchRequest = [WATableFetchRequest fetchRequestForTable:@"Developers"];
[client fetchEntities:fetchRequest withCompletionHandler:^(NSArray *entities, NSError *error)
{
    if (error)
    {
        NSLog(@"%@", [error localizedDescription]);
    }
    else
    {
        NSLog(@"%i entities found in the developer table", [entities count]);
    }
}];

To get entities for a table using a predicate (works with both account key and proxy):
// get entities for the Developers table matching a predicate
NSError *error = nil;
NSPredicate *predicate = [NSPredicate predicateWithFormat:@"Name = 'Wade' || Name = 'Nathan' || Name = 'Nick'"];
WATableFetchRequest *anotherFetchRequest = [WATableFetchRequest fetchRequestForTable:@"Developers" predicate:predicate error:&error];
[client fetchEntities:anotherFetchRequest withCompletionHandler:^(NSArray *entities, NSError *error)
{
    if (error)
    {
        NSLog(@"%@", [error localizedDescription]);
    }
    else
    {
        NSLog(@"%i entities returned by this request", [entities count]);
    }
}];

Doing even more with the toolkit
If you are looking to explore the toolkit further, we would recommend looking at the sample application that can be found in the watoolkitios-samples project. This project demonstrates all of the functionality of the toolkit, including creating, uploading, and retrieving entities from both table and blob storage.
In addition, you might also want to check out the documentation outlining the ACS (Access Control Service) functionality of the toolkit.
See the Wade Wegner (@WadeWegner) described the WA Toolkit for iOS: New Project Experience with Windows Azure Access Control Service in an 8/8/2011 post article in the Windows Azure AppFabric: Apps, Access Control, WIF and Service Bus section below.
<Return to section navigation list>
SQL Azure Database and Reporting
See the SSWUG announced on 8/9/2011 a Free Expo Event: Working with Windows Azure and SQL Azure to be held on 8/19/2011 at 9:00 AM to 3:00 PM PDT article in the Cloud Computing Events section below. The Expo Event is mostly about SQL Azure.
<Return to section navigation list>
Marketplace DataMarket, AppsMarket and OData
The Windows Azure Team (@WindowsAzure) reported New Today: More Windows Azure Marketplace Content & Hands On Lab on 8/9/2011:
It’s time again for the release of more exciting new Windows Azure Marketplace content! Today we announce the release to the Marketplace of 46 new applications and a great new data offer from LexisNexis: Lawyers.com Consumer Legal Articles. Each article is written by lawyers in a manner that's authoritative and informative, yet easy enough for a non-lawyer to understand.
Also announced today, the release of the August 2011 update to the Windows Azure Platform Training kit includes a new Hands-on Lab (HOL), Introduction to Windows Azure Marketplace for Applications. In this HOL you will take an existing application through the full process of prepping the application code to work with the marketplace, as well as the process of submitting the application for publication.
You can download the updated Windows Azure Platform Training Kit – August 2011 Refresh from the Microsoft download center.
Vittorio Bertocci (@vibronet) described the Hands-on lab: Windows Azure Marketplace for Applications in an 8/9/2011 post:
For the series “things I was working on before moving from DPE to the product team”. Today Wade’s gang released an update to the Windows Azure Platform Training Kit containing a new lab, “Introduction to the Windows Azure Marketplace for Applications” [See below post]. Here’s a bit of a brain dump (not too long, it’s already past midnight here) of the thoughts that went into designing the lab: I see that the guys made some changes here and there, but the structure appears to be the same.
You may have seen last month the announcement that Windows Azure opened up a marketplace for applications: the aforementioned lab walks you through the process of adapting an existing SaaS application to take advantage of the marketplace for handling subscriptions & the subscription lifecycle.
The main idea behind the lab was to show how an existing subscription management solution- such as a redeemable code handed out by your salesperson for every new deal - could be handled in a much more natural way by integrating with the Windows Azure marketplace.
Another reason for starting from an existing working solution was to make extra-clear where the marketplace responsibilities end and yours begin: for example, the user registration (via ACS, of course) is entirely up to you. I would have liked to add a feature for handling multiple users (in order to be extra certain that nobody confuses “user” with “tenant”; you won’t believe how many times that happens) but there was no time. In any case, I am satisfied with the amount of identity topics we got in there. I won’t spoil (all) the surprise for you, but integrating your app in the marketplace largely consists of adding to your app an endpoint that the marketplace can call to notify you of a new subscription, a deprovisioning of an existing subscription, and the various details you need to act upon that info (for example, which subscription level the customer paid for). Those calls are secured by…. surprise surprise… OAuth2. In fact, the lab takes advantage of WIF’s OAuth2 extensions.
The Windows Azure marketplace has various developer-friendly features, such as a “playground” from which you can send test subscription and deprovisioning messages; the lab walks you through their use in detail. The lab also demonstrates how to go through the publication process, necessary to get an entry for your app in the application catalog.
Of course another important goal of the lab was to help you wrap your head around how to handle multitenancy (by managing the subscription info accordingly) and think a bit about what a real system may need to cope with (for example, not deleting all tenant data upon deprovisioning messages, or handling idempotency in case the same message needs to be re-sent). You tell me, but I think that the lab covers that pretty nicely.
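To make that idempotency point concrete, here is a minimal Python sketch of the pattern. The message fields and handler name are hypothetical (this is not the marketplace's actual notification schema): a re-sent message is detected by its id and becomes a no-op, and a deprovisioning message suspends tenant data rather than deleting it.

```python
# Hypothetical sketch of idempotent subscription-message handling.
# The message shape below is illustrative, not the actual
# Windows Azure Marketplace notification schema.

tenants = {}           # tenant_id -> {"plan": ..., "status": ...}
processed_ids = set()  # notification ids already handled

def handle_notification(msg):
    """Process a provisioning/deprovisioning message exactly once."""
    if msg["id"] in processed_ids:
        return "duplicate-ignored"  # re-sent message: no-op
    processed_ids.add(msg["id"])
    if msg["action"] == "provision":
        tenants[msg["tenant"]] = {"plan": msg["plan"], "status": "active"}
        return "provisioned"
    elif msg["action"] == "deprovision":
        # Suspend rather than delete, so tenant data survives a
        # mistaken or premature deprovisioning message.
        tenants[msg["tenant"]]["status"] = "suspended"
        return "deprovisioned"
    return "unknown-action"
```

Calling `handle_notification` twice with the same message id returns "provisioned" then "duplicate-ignored", which is exactly the behavior a retrying sender needs.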
Extra goodness? The lab setup uses the ACS cmdlets.
What is there to add? Congratulations to Wade, Donovan & the rest of the gang for the new release, and I wish you the best of luck with your Windows Azure apps in the Marketplace!
Wade Wegner (@WadeWegner) posted an Introduction to Windows Azure Marketplace for Applications on 8/9/2011:
Today we released an update to the Windows Azure Platform Training Kit to include a new HOL: Introduction to Windows Azure Marketplace for Applications. You can download the updated Windows Azure Platform Training Kit – August 2011 Refresh from the download center or view it online now in the Windows Azure Platform Training Course on MSDN.
The important takeaway is that in this HOL you will take an application through the full provisioning process. This is extremely valuable if you have customers/partners looking to publish their applications into the Windows Azure Marketplace for Applications.
Hands-On Lab Details
There are five exercises in the HOL:
- Exercise 0: Understanding a SaaS Subscription Scenario
- Exercise 1: Modify your Application to support Windows Azure Marketplace Subscriptions
- Exercise 2: Register the Application in Windows Azure Marketplace and Test Subscriptions with the Dev Playground
- Exercise 3: Add Support for Unsubscribe messages
- Exercise 4: Learn how to Publish in Windows Azure Marketplace, Explore the Finished App
You can take a look at the SaaS application live in the marketplace: https://datamarket.azure.com/application/AA1EA7A3-E3FE-4EE5-B7B5-CBF4142CB983
Feedback?
We are always looking to make improvements to the WAPTK. If you encounter any bugs or have any problems, please send us feedback at azcfeed@microsoft.com. Additionally, if there’s additional content you’d like to see in the training kit, please let us know on the Windows Azure Platform Training Kit Forum – your feedback is highly valued.
SAP explained How To Write an OData Channel Gateway Service. Part 1 - The Model Provider Class in an 8/2011 manual:
0. Introduction
This document is the first of two documents showing you how to implement a Gateway Service using the OData Channel approach (ABAP coding).
A Gateway Service is implemented by creating two ABAP classes – a Model Provider class and a Runtime Data Provider class.
In this document, we will cover the creation of the Model Provider Class. The coding in this document must be completed before you can start the coding described in the second document.
2. Business Scenario
You would like to develop some ABAP functionality for showing flights to and from various worldwide airports.
3. Background Information
Since you would like to expose this service to users outside the scope of your SAP system, this functionality needs to be made accessible via the SAP NetWeaver Gateway interface.
The SAP NetWeaver Gateway interface has been implemented using the Open Data Protocol (OData). This is a non-proprietary, license free interface based on the Atom Publishing format.
In this document, you will be shown various segments of ABAP. However, rather than simply providing you with working code for you to copy and paste from this document, you will be guided through a step by step analysis of the code in order to gain sufficient understanding to adapt what you have learnt here to your own business situation.
At each stage we will then look at the service through a browser in order to understand what XML elements are generated as a result of the ABAP coding that has just been entered.
4. Prerequisites
This document assumes the following:
- You have access to a NetWeaver 7.02 SP6 or higher system into which the SAP NetWeaver Gateway ABAP add-ons have been installed.
- You have at least a basic understanding of ABAP development.
It should be understood that an SAP NetWeaver Gateway system is not a new type of SAP system; it is merely a set of ABAP add-ons that can be applied to any SAP NetWeaver 7.02 SP6 or higher system. The add-ons, taken together, form the Gateway system; however, the add-ons can be installed either all together on a single ABAP system, or split across two ABAP systems.
Although there are several ABAP add-ons making up a Gateway system, for the purposes of this discussion, the important ones are GW_CORE and IW_BEP because these two components determine in which system your development and configuration are performed.
a) GW_CORE and IW_BEP in the same system
If GW_CORE and IW_BEP are installed in the same SAP system, then this system will contain both your business data and functionality and will also be the system to which web-based clients connect in order to run the Gateway Services.
Such an installation scenario is recommended only for testing purposes, or implementations in which all the end users are within a corporate intranet.
b) GW_CORE and IW_BEP in different systems
If GW_CORE and IW_BEP are in different systems, then the system containing GW_CORE will be your Gateway server, and the system containing IW_BEP will be your backend business system.
This installation option allows you to isolate your backend business system behind a firewall, exposing only your Gateway Server to the public Internet. [1]
In order to perform the tasks in this tutorial, you will need to log on to whichever ABAP system contains the IW_BEP component, with a user ID that has sufficient authorization to perform ABAP development and has been registered as a developer.
You will also need to log on to the ABAP system containing the GW_CORE add-on in order to perform a single configuration task.
[1] A trusted RFC connection must exist between the two systems. This will have been created during the post-installation configuration of your Gateway system.
<Return to section navigation list>
Windows Azure AppFabric: Apps, Access Control, WIF and Service Bus
Steve Peschka (@speschka) offered valuable Tips for Upgrading or Moving ADFS 2.0 in an 8/9/2011 post:
I recently spent too much time trying to get an ADFS Server upgraded, in my case from Windows Server 2008 to 2008 R2. Like many SharePoint folks that are just trying to get along in a claims happy world, seemingly simple things like this can cause a surprising amount of churn. Here are some tips that may help you get through it:
- There really isn't a straight upgrade path from ADFS 2.0 on Windows Server 2008 to Windows Server 2008 R2. It just completely uninstalled ADFS for me. So once you're done you'll need to start over from scratch, sort of. I recommend you back up the database first. More on that next.
- ADFS really wants to use that dang Windows Internal Database. If you're just trying to get things up and going for your SharePoint farm then that's often okay. So how do you manage it though when you need to backup and restore the database? Fortunately there is a free download for managing it. The link I found said SQL Server 2005 but it still worked fine with Windows Internal Database. I downloaded the tool from http://www.microsoft.com/download/en/details.aspx?DisplayLang=en&id=8961, where it calls the tool "SQL Server Management Studio Express".
- The connection you need to use when you open the tool is about as unintuitive as you will find, so I will just paste it here; you should be able to copy from here and paste into the tool when the connect dialog opens: "\\.\pipe\MSSQL$MICROSOFT##SSEE\sql\query" (without the quote marks)
- When you install ADFS again, you may get a warning after you complete the ADFS wizard that says something like the ADFS web site is already installed so it didn't overwrite the contents of it. It then gives you a link that it tells you to follow if you want to redeploy the web site. NEWSFLASH: the link is WORTHLESS! Shocking, I know...please hold your gasps of disbelief in abeyance for now. What's more irritating is that if you look in the IIS Manager snap-in you will not see any ADFS virtual directories. Frustrating! Turns out you need to use appcmd to delete the vdirs. I did it with these two commands:
- C:\Windows\System32\inetsrv>appcmd delete app "Default Web Site/adfs/card"
- C:\Windows\System32\inetsrv>appcmd delete app "Default Web Site/adfs/ls"
- Now, after you've done all that goo you can run the ADFS wizard again to get everything set up. Once it's all up then you can restore the databases that you backed up from above. Here's a tip to help with that though:
- Close the ADFS Management app if you have it open
- Stop the ADFS service
- Restore the AdfsConfiguration database first
- Start the ADFS service
- Restore the AdfsArtifactStore database
- Open up the ADFS Management app and everything should be working and restored
- Finally you want to see what it's using for the token signing certificate. It will again try to use the self-signed certificate that it creates at install time. However if you had previously been using a different certificate, that will of course break when you try to go to any SharePoint sites that were working previously with it (the old not-trusted-root-authority message that I described at http://blogs.technet.com/b/speschka/archive/2010/02/13/root-of-certificate-chain-not-trusted-error-with-claims-authentication.aspx). However, before you can just add a new token signing certificate you must run these PowerShell commands on the ADFS server:
- add-pssnapin Microsoft.adfs.Powershell
- set-adfsproperties -AutoCertificateRollover $false
- If you add a token signing certificate, remember to make it the Primary certificate if that's how you had it configured previously.
Hope this is helpful to you.
Wade Wegner (@WadeWegner) described the WA Toolkit for iOS: New Project Experience with Windows Azure Access Control Service in an 8/8/2011 post:
The v1.2 release of the Windows Azure Toolkit for iOS includes support for the Windows Azure Access Control Service (ACS). This enables iOS developers to create applications that can use and rely on federated identity providers such as Windows Live, Google Accounts, and Facebook Connect.
This walkthrough document covers configuring the ACS service and creating a new iOS project using XCode to authenticate against this service. This is accomplished using the following three steps in this document:
- Manually Configure ACS (Access Control Service).
- Create New Project and Import Library
- Expanding the solution with the Cloud Ready Package
In order to run through this tutorial, you’ll need an active Windows Azure subscription. Further information about creating a new account for Windows Azure can be found at http://www.windowsazure.com.
Step 1 – Manually Configure ACS (Access Control Service)
This section will describe how to configure ACS manually for a new iOS application. If you have already deployed the Cloud Ready package and configured the service using the Cloud Ready Configuration Wizard, you can skip to Step 2.
First, access the Windows Azure portal by navigating to http://windows.azure.com and signing in with your credentials.
Click on the Service Bus, Access Control & Caching menu item and select the Access Control menu item under the AppFabric folder. Select an active Windows Azure subscription and click on the New button in the toolbar.
The new service namespace dialog will open. Ensure that Access Control is selected, and enter a unique namespace and country/region for the service.
The ACS namespace will now be created. This might take a few minutes. Wait until the namespace is showing in an Active state.
Once this is complete, highlight the newly created service and click on the Access Control Service button in the toolbar. This will launch the Access Control Service portal.
Within the portal, click on Identity Providers and add the identity providers you would like to use for your application.
The default is Windows Live ID, but you can add other preconfigured providers (such as Google and Yahoo!) as well as external identity systems configured to use WS-Federation.
Once you have added the required providers, click on the Relying Parties section of the portal.
Click on the Add button and enter the following information for the relying party application:
Name – a given name for your application
Realm – a unique ID for your application. For this walkthrough, we’ll be using uri:wazmobiletoolkit
Return URL – you can leave this blank
Error URL – you can leave this blank
Token Format – select SWT
Token Lifetime – feel free to change the default of 600 seconds.

Select the identity providers that you would like to use, and then under the Token Signing Settings section, click on the Generate button to create a new symmetric key that will be used for this application.
Finally, click on the Save button. This will create the Relying Party Application.
Next, go into the Rule Groups section of the portal and select the default rule group that was created for the application.
Click on the Generate link in order to generate a set of default rules for this rule group.
Select the providers that you wish to use, and click on the Generate button. Once this is complete, you should see a set of rules.
After this step is complete, ACS has now been configured correctly to be used with your iOS application. Make a note of your Service Namespace (found at the top of the portal) and Realm as you’ll need these in the next step of the walkthrough.
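For context on what that symmetric key is used for: a SWT token is a form-encoded string whose trailing HMACSHA256 parameter signs everything before it. The following minimal Python sketch illustrates that signing and verification scheme (illustrative only; a real relying party should also validate the Issuer, Audience, and ExpiresOn values carried in the token):

```python
import base64
import hashlib
import hmac
import urllib.parse

def sign_swt(claims_pairs, key_b64):
    """Append an HMACSHA256 signature to form-encoded SWT content."""
    content = "&".join(f"{k}={urllib.parse.quote(v, safe='')}" for k, v in claims_pairs)
    digest = hmac.new(base64.b64decode(key_b64), content.encode(), hashlib.sha256).digest()
    sig = urllib.parse.quote(base64.b64encode(digest).decode(), safe="")
    return f"{content}&HMACSHA256={sig}"

def verify_swt(token, key_b64):
    """Recompute the HMAC over everything before &HMACSHA256= and compare."""
    content, _, sig = token.rpartition("&HMACSHA256=")
    digest = hmac.new(base64.b64decode(key_b64), content.encode(), hashlib.sha256).digest()
    expected = base64.b64encode(digest).decode()
    return hmac.compare_digest(urllib.parse.unquote(sig), expected)
```

Because the key is a shared secret between ACS and the relying party, anyone who alters a claim in the token body invalidates the signature check.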
Step 2 – Create New Project and Import Library
The following has been tested with XCode 4.0.2 or higher.
Before you begin, you should download the latest version of the Windows Azure Toolkit for iOS library from http://github.com/microsoft-dpe/watoolkitios-lib. Save the zip file in a place that can be found later.
To start, launch XCode 4 and create a new project.
From the project template dialog, select a View-based application and click on the Next button.
Enter a Product Name and Company Identifier and click on the Next button to continue. Select a directory to use for the project file and return to the IDE.
Next, locate the version of the Windows Azure toolkit for iOS library that you downloaded earlier. In the download will be a zip file containing two versions of the library (one for the device, one for the simulator) and some header files for the project.
Right click on your project and select the Add Files to… menu option.
Locate the .a file (for the simulator) and header files and add them to your project. You may want to create a new group (called lib) to store these in.
Now we need to add a reference to a library required for XML parsing. To do this, click on the top most project file, click on the target in the 2nd column of the IDE, and select Build Phases from the tab menu.
In the main window, expand the Link Binary with Libraries option.
Ensure that the libwatoolkitios.a file has been automatically added as a reference, click the + button to add a new library, and select the libxml2.dylib library from the drop down list.
Click on the Add button to add a reference to this library for your project.
Before we start adding any code, we need to add a couple of required linker flags to the project. To do this, click on the Build Settings tab (next to Build Phases).
In the search box, type “other linker” to filter the settings.
You should see a setting called Other Linker Flags. Double click on the right side of this row to add new flags.
Click on the + button to add two flags. The first is -ObjC and the second is -all_load. Once complete, your linker flags should look like the following screenshot:
Click on the Done button to save these settings. The project is now configured correctly to reference the Windows Azure Toolkit library.
To test that the library works, click on the project’s [ProjectName]AppDelegate.m file. Add the following #import statement at the top of the class:
#import "WACloudAccessControlClient.h"
Next, search for a method called didFinishLaunchingWithOptions and after the [self.window makeKeyAndVisible] line, enter the following code.
NSLog(@"Intializing the Access Control Client…");
WACloudAccessControlClient *acsClient = [WACloudAccessControlClient accessControlClientForNamespace:@"iostest-walkthrough" realm:@"uri:wazmobiletoolkit"];
[acsClient showInViewController:self.viewController allowsClose:NO withCompletionHandler:^(BOOL authenticated)
{
    if (!authenticated)
    {
        NSLog(@"Error authenticating");
    }
    else
    {
        NSLog(@"Creating the authentication token...");
        WACloudAccessToken *token = [WACloudAccessControlClient sharedToken];
        /* Do something with the token here! */
    }
}];

Replace the namespace and realm with the service namespace and realm for your own service, as created in Step 1.
As you can see from the above, the code creates a new instance of the access control client, requests that the client shows itself in the current view controller, and then extracts a token.
Build and run the application in the iOS Simulator.
Once the application starts, you should be prompted to select an identity provider from the list that you configured in your ACS service.
Pick one of the providers, and enter a valid set of credentials.
Click on the Remember me checkbox if you want to skip this step when running this application again, and click on the Sign in button.
The first time the application is run, you’ll be prompted to authorize the application to access your provider data.
Click on the Allow button to continue. The login window will now disappear and you’ll be returned to your application.
In the debug window, you should see the following two logs:
2011-07-22 10:12:26.284 iostest-walkthrough[25838:207] Intializing the Access Control Client…
2011-07-22 10:12:36.359 iostest-walkthrough[25838:207] Creating the authentication token…

The final message indicates that the access token was retrieved and is available for further use. The WACloudAccessToken (derived from [WACloudAccessControlClient sharedToken]) contains an NSDictionary of claims and other properties that can be stored within your application. Using these properties on future calls can identify returning users to your application.
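As one hedged illustration of that returning-user idea (sketched in Python rather than Objective-C, and with claim-type URIs that are an assumption you should verify against the dictionary your own token actually contains), pairing the identity-provider claim with the name-identifier claim gives a stable key for a user:

```python
# Hypothetical sketch: recognize returning users by a stable claim pair.
# The claim names below follow common WS-Federation/ACS claim-type URIs,
# but verify them against your own token's claims dictionary.

IDP_CLAIM = "http://schemas.microsoft.com/accesscontrolservice/2010/07/claims/identityprovider"
NAME_ID_CLAIM = "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier"

known_users = set()

def is_returning(claims):
    """True if this (provider, user id) pair has been seen before."""
    key = (claims[IDP_CLAIM], claims[NAME_ID_CLAIM])
    seen = key in known_users
    known_users.add(key)
    return seen
```

The name identifier alone is not globally unique across providers, which is why the pair is used as the key.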
Congratulations! You’ve successfully built an application that can use the Windows Azure ACS service for federated identity!
Step 3 – Expanding the solution with the Cloud Ready Package
If you have deployed the Cloud Ready package and configured your ACS service with the Cloud Ready Configuration Utility, you can also extend your sample by using the returned WACloudAccessToken to securely access blob, table, and queue storage in Windows Azure.
To do this, create a new WAAuthenticationCredential object using the token, use this to initialize the WACloudStorageClient and then make calls to the storage type.
The following code example shows how this can be done.
```objectivec
WAAuthenticationCredential *credential = [WAAuthenticationCredential authenticateCredentialWithProxyURL:[NSURL URLWithString:@"[URL OF YOUR CLOUD READY PACKAGE]"] accessToken:[WACloudAccessControlClient sharedToken]];

NSLog(@"Creating the storage client…");
WACloudStorageClient *storageClient = [WACloudStorageClient storageClientWithCredential:credential];
[WACloudStorageClient ignoreSSLErrorFor:@"[FIRST PART OF THE URL FOR YOUR CLOUD READY PACKAGE]"];

NSLog(@"Accessing table storage…");
[storageClient fetchTablesWithCompletionHandler:^(NSArray *tables, NSError *error) {
    if (!error) {
        NSLog(@"%i Tables were found!", [tables count]);
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Table Storage"
                                                        message:[NSString stringWithFormat:@"%i tables were found!", [tables count]]
                                                       delegate:nil
                                              cancelButtonTitle:@"OK"
                                              otherButtonTitles:nil];
        [alert show];
        [alert release];
    } else {
        NSLog(@"%@", [error localizedDescription]);
    }
}];
```

Replace the URL strings in the above code with the correct URLs for your deployed Cloud Ready package. If everything works, after you sign in to the application you should see an alert box showing the number of tables in your storage account.
<Return to section navigation list>
Windows Azure VM Role, Virtual Network, Connect, RDP and CDN
No significant articles today.
<Return to section navigation list>
Live Windows Azure Apps, APIs, Tools and Test Harnesses
Bruce Kyle recommended that you Dive Into the Cloud with Getting Started, In Depth Tutorial on Windows Azure Site in an 8/9/2011 post:
A new set of tutorials shows you how to get started with Windows Azure. The tutorials begin by explaining how to get the latest tools and how to deploy your first sample application, build an application with ASP.NET MVC, and build a code-first data application with SQL Azure.
- Get Started
- Plan & Design
- Develop Applications
- Migrate Services & Data
- Store & Access Data
- Monitor Services & Data
- Manage Services & Data
- Improve Performance
- Connect Services & Data
- Control Access
You will learn Windows Azure in steps, with topics ranging from beginning through intermediate to advanced. The tutorials point to new content on Windows Azure and to developer material on MSDN.
David Pallman reported Neil Mackenzie's Windows Azure Book is Now Available in an 8/8/2011 post:
If you do much Windows Azure development, you're probably dependent as I am on the blog posts of fellow Windows Azure MVP Neil Mackenzie. Neil is also one of the star community supporters on the MSDN Windows Azure forum.
Neil's Windows Azure book has just been published, Microsoft Windows Azure Development Cookbook by Packt Publishing [see below post]. Congratulations, Neil! Putting out a book is no easy task. Which reminds me, I need to get back to work on my next one. :)
Neil Mackenzie (@mknz) described his new Microsoft Windows Azure Development Cookbook in an 8/8/2011 post:
The reason I didn’t write many posts on the blog in the early part of this year was that I was writing a book on Windows Azure for Packt Publishing. This book, Microsoft Windows Azure Development Cookbook, is now available from all the usual outlets as well as direct from the publisher.
The book is one of a series of cookbooks published by Packt. They are intended to provide “recipes” showing how to implement specific techniques in a particular technology. They are not intended to be narrative books read from cover-to-cover – although they certainly can be.
In the Microsoft Windows Azure Development Cookbook, I show how to perform various development tasks across the Windows Azure Platform – including the Windows Azure Storage Service, Windows Azure hosted services, Windows Azure Diagnostics, and the Windows Azure Service Management REST API. I also provide some “recipes” for the Windows Azure AppFabric and SQL Azure, although these areas are sufficiently large as to deserve their own books rather than be reduced to chapters in a general book. Packt has provided a free download of the chapter on the Windows Azure Service Management REST API.
One of the problems in writing about a new technology is that the technology develops as the book is being written. This was particularly true of the Windows Azure Access Control Service. This is an interesting part of the Windows Azure Platform but is one that is not covered in the book – partly because the technology was changing and partly because of the difficulty of shoehorning information about it into the structure of the book.
As with any large project, the creation of a book takes many people. I am particularly indebted to the technical editors who provided a lot of comments that immeasurably improved the book:
- Maarten Balliauw (@maartenballiauw)
- Michael Collier (@MichaelCollier)
- Gaurav Mantri (@gmantri)
- Brent Stineman (@BrentCodeMonkey)
They, of course, bear no responsibility for the errors that are inevitable in any project like this. I am also grateful to the many people at Packt who worked on the book and helped ensure it came out in a timely fashion.
The chapters on the Windows Azure Storage Service would have been very difficult to do had I not used Cerebrata’s Cloud Storage Studio. This is an excellent product and I highly recommend it to anyone using the Windows Azure Storage Service.
The genesis of the book is clearly in this blog. However, the coverage of the various features is much more extensive in the book. So, if you like the posts on the blog you should look at the Microsoft Windows Azure Development Cookbook.
And now, on to all that new stuff the Windows Azure Team keeps announcing.
<Return to section navigation list>
Visual Studio LightSwitch and Entity Framework 4.1+
Beth Massi (@bethmassi) announced an Updated Deployment Guide (Beth Massi) in an 8/9/2011 post:
I just updated the LightSwitch Deployment guide I wrote back in March for the RTM release of Visual Studio LightSwitch. This guide is meant for people who want to host three-tier LightSwitch applications on their own IIS servers. There were only a couple of screenshot updates, but now it should be clearer how to set this up. You can read it here:
Deployment Guide: How to Configure a Web Server to Host LightSwitch Applications
Nicole Haugen explained How To: Using the Created Method to Set Default Property Values in an 8/8/2011 post:
An extremely useful method that I consider to be a “must-have” in almost any type of LightSwitch application is the Created method. The Created method allows us to easily set the default values for an entity’s properties (e.g. fields in a table). To understand this further, let’s dive into an example that shows how to use the Created method.
If you read my first blog post on how to Prevent a Hole with Security Access Control, then you may remember in that scenario that I created a simple application used to track expense reports (which I creatively called the “Expense Report Tracker”). This application consists of two entities sharing a one-to-many relationship, called ExpenseReport and ExpenseItems. There are two types of users that use the application:
- Employees that create and save expense reports for approval
- Managers that approve or reject employees’ expense reports
In addition, we have a couple of business rules for when an employee creates an expense report – specifically, when an employee creates an expense report:
- The ExpenseReport’s Name and UserAlias fields should be automatically set to the employee’s user information
- The ExpenseReport’s Status field should be automatically set to a “Pending” status
The Created method allows us to implement both of the above business rules by using it to set the default values for the Name, UserAlias, and Status properties when the ExpenseReport entity is created. It’s important to note that the Created method executes on both the client and server tiers. As a result, the Created method is called no matter how an entity is created, whether by clicking the “Add” button on a screen or by constructing the entity in code.
To add the ExpenseReport’s Created method to our application, open the ExpenseReport entity in the designer and select the “ExpenseReport_Created” method from the “Write Code” drop-down menu.
Once the “ExpenseReport_Created” method’s code is generated, we can access the entity and set the properties’ values using the following code:
C#:

```csharp
partial void ExpenseReport_Created()
{
    this.Status = "Pending";
    this.Name = this.Application.User.FullName;
    this.UserAlias = this.Application.User.Name.Substring(
        this.Application.User.Name.LastIndexOf("\\") + 1);
}
```

VB:

```vb
Private Sub ExpenseReport_Created()
    Me.Status = "Pending"
    Me.Name = Me.Application.User.FullName
    Me.UserAlias = Me.Application.User.Name.Substring(
        Me.Application.User.Name.LastIndexOf("\") + 1)
End Sub
```

Next, let’s add a screen called CreateNewExpenseReport. The screen is used for creating expense reports and displays the values of the Name, UserAlias, and Status properties. Since these values are set automatically in the Created method, we don’t want the user to have to enter them. Essentially, we want these values to be non-editable through the screen.
The easiest way to get this behavior is to use Label controls to display the Name, UserAlias, and Status properties. Label controls allow the user to view a value, but not edit it. To set the Control Type to Label, select the control in the screen designer and change the “Control Type” property to “Label”.
It’s important to note that even though our screen prevents the user from entering values for the Name, UserAlias, and Status properties, these values aren’t protected from modification. To properly secure these values and ensure that an employee cannot modify them, we need to implement security access control checks in the save pipeline methods. Refer to my previous blog post for information on how to do this.
Summary
As you can see from the above scenario, the Created method is extremely useful for setting default values for entity properties. In particular, setting default values in this manner helps save the end-user time so that they don’t have to enter data that the application can automatically detect and set for them.
For a video demonstration of this technique see [Beth Massi’s] How Do I: Set Default Values on Fields when Entering New Data?
See the Beth Massi (@bethmassi) announced her LightSwitch Speaking Tour in Fresno, L.A. & Phoenix Areas on 8/8/2011 article in the Cloud Computing Events section below.
Julie Lerman (@julielerman) announced You Win! An EF 4.1 Update to Programming Entity Framework is In the Works! in an 8/9/2011 post:
After I finished writing the first edition of Programming Entity Framework, 832 pages long, I announced to anyone within earshot that if I ever talked about writing another book, they should just shoot me.
After I finished writing the second edition of Programming Entity Framework, which came in at nearly 900 pages, I said, “I really mean it this time!”
And then Entity Framework 4.1 was released with Code First modeling and the sweetness of the DbContext and other additions to this API.
Many asked me if I would update the book. I said “no” a thousand times and explained in this blog post, EF4 books and EF 4.1, why revising the entire book for what amounts to two small additions that don’t impact the core behavior of Entity Framework made no sense.
But you still asked.
I wrote articles and created videos.
But you still asked for a new book.
And then in a moment of insanity (I believe it was during 5 long hours of driving alone in the car to my parents’ house), I decided that maybe I could just write a short book that would essentially “tack-on” to Programming Entity Framework Second Edition.
And so it goes…this is what I am now working on. But I got smart this time! This spring, I worked on a series of content for MSDN with Rowan Miller from Microsoft. Rowan is a Program Manager on the ADO.NET Entity Framework team and has been instrumental in EF 4.1. He knows it better than most anybody. Certainly better than I do! And he’s a good writer. He’s a bit less verbose than I am (you can wrap that in exaggeration) but I’ve been working on him. I liked working on that project with Rowan and it did not take a lot of convincing to get Rowan to agree to do a book with me. I’m very lucky to have him as a partner-in-crime for so many reasons!
Rowan and I are collaborating on all aspects of this project. We are both writing, but we are working very closely together so that it is not disjointed. We’ll have a common writing style and there will be a storyline and buildup of code from beginning to end. We are helping each other with decisions about samples and how information should flow.
The Game Plan
We are writing two “mini-books” for O’Reilly Media. We are writing them as though they are a continuation of Programming Entity Framework. I expect that we’ll have the same Seychelles Blue Pigeon on the cover (or some twist on that). We’ll work with the same business domain, Breakaway Geek Adventures, and there will be references to the previous book (2nd edition). There’s just no reason to repeat explanations of API stuff that is the same.
The pattern for these books follows other recent offerings from O’Reilly. They will be short-ish (targeting 100 pages each) and presented as e-books (with print-on-demand availability). The first book will focus on Code First – more specifically, on building a model, database initialization, etc. The second book will start where the first ends, focusing on the other half of EF 4.1: DbContext, DbSet, etc. In this book we’ll be able to write real code with the combination of DbContext (etc.) and Code First. This is where we’ll create some sample apps, a repository, do some testing, etc. We are already about halfway through writing the Code First book and hope these will be out in mid-fall.
If you take a look at the CouchDB books that Bradley Holt (a friend and neighbor and, just coincidentally, one of the Vermont Code Camp organizers) has written – Writing and Querying MapReduce Views in CouchDB, First Edition and Scaling CouchDB, First Edition (with more to come) – this is what Rowan and I are doing. Note that Bradley’s books are also on Amazon (etc.) and available for Kindle too. I don’t know how the pricing will work out.
The e-books will be in color (yay!). So as we copy our code from Visual Studio, we are leaving the code coloring intact. The print books will be black & white.
We want to get all of the core writing done by mid-September; otherwise Rowan won’t be able to attend his own wedding and honeymoon. That wouldn’t be a great way to start a marriage, so we’re working hard toward this goal.
On order!
<Return to section navigation list>
Windows Azure Infrastructure and DevOps
SD Times Newswire reported AppDynamics Introduces the First .NET Application Management Solution Designed for Cloud and Modern Environments in an 8/9/2011 article:
AppDynamics, Inc., the leader in application performance management (APM) for the cloud generation, has delivered a disruptive new .NET monitoring and management solution designed to remake the face of APM for IT professionals who support critical .NET environments.
Delivered via a Software-as-a-Service (SaaS) model and bolstered by a freemium marketing strategy, where users can “try before they buy” via a free download, AppDynamics’ .NET product delivers unmatched visibility, code-level diagnostics, and extraordinary ease of use.
AppDynamics’ unmatched go-to-market strategy for .NET performance management includes:
- Freemium model: Disrupt the market with a one-of-a-kind free .NET performance troubleshooting tool available to all .NET IT Professionals
- Cloud Ready: Plug-and-play performance monitoring for Microsoft Azure cloud apps
- Software-as-a-Service: AppDynamics Pro edition for .NET and Java is available as Software-as-a-Service (SaaS) and on-premise deployment options
- Support for Modern .NET architectures: AppDynamics Pro provides unmatched visibility and troubleshooting capabilities for modern .NET application architectures that are distributed, componentized and agile
- Transparent and Affordable Pricing: AppDynamics .NET pricing is publicly available on AppDynamics.com and is less than half the price of legacy APM solutions
AppDynamics is distinct from other APM tools due to its ability to perform in the volatile environment of the cloud, making it a perfect fit for application owners who are migrating applications to Microsoft Azure. Designed from the ground up for the cloud as well as complex, distributed environments, AppDynamics self-learns the performance baseline of an application—enabling the solution to keep up with cloud-based applications that constantly spin up and spin down nodes based on pre-configured thresholds.
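The “self-learned baseline” idea described above can be sketched with a simple rolling statistic. The following is a conceptual illustration only, not AppDynamics’ actual algorithm; the class name, parameters, and numbers are all hypothetical: keep an exponentially weighted mean and deviation of observed response times, and flag samples that stray well above the learned norm while the baseline itself keeps adapting.

```python
class PerformanceBaseline:
    """Conceptual sketch of a self-learning baseline (not AppDynamics'
    actual algorithm): an exponentially weighted moving average adapts
    as the app's normal behavior drifts, e.g. as cloud nodes spin up
    and down."""

    def __init__(self, alpha=0.1, threshold=3.0):
        self.alpha = alpha          # how quickly the baseline adapts
        self.threshold = threshold  # deviations above normal => anomaly
        self.mean = None
        self.dev = 0.0

    def observe(self, response_ms):
        if self.mean is None:       # first sample seeds the baseline
            self.mean = response_ms
            return False
        delta = response_ms - self.mean
        anomaly = self.dev > 0 and delta > self.threshold * self.dev
        # update the learned baseline (even anomalies shift it slightly)
        self.mean += self.alpha * delta
        self.dev += self.alpha * (abs(delta) - self.dev)
        return anomaly

baseline = PerformanceBaseline()
for ms in [100, 105, 98, 102, 101, 99, 103]:
    baseline.observe(ms)            # learn normal latency around 100 ms
print(baseline.observe(500))        # a 500 ms spike is flagged: True
```

Because the baseline is learned rather than configured, no fixed threshold has to be re-tuned every time the deployment topology changes, which is the property the article attributes to volatile cloud environments.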
“We've successfully used AppDynamics for the last year to get visibility into our Java applications at a level that we have not seen before," said Jim Wyatt, Systems Architect at Pearson eCollege. "With AppDynamics' support for .NET, we can now trace transactions that span our Java and .NET application tiers. We've tried other tools to watch our applications, but AppDynamics is the first one that enables eCollege to monitor transaction performance across technology stacks with business context. These capabilities have been key for us to respond to new demands on our application."
Damon Edwards (@damonedwards) asked DevOps and Technical Debt: A Debt Crisis in Your Workplace? on 8/9/2011:
With all of the recent global financial news being dominated by various debt crises, this seems like a fitting time to point out that there is another type of debt that is rampant in IT organizations as well.
This type of debt also sneaks up on you if you aren't keeping an eye on it and it too can have devastating effects. I'm talking about Technical Debt.
Technical Debt is already well known in Agile circles as a way of quantifying the deficit created by cutting corners on code quality or completeness in order to speed business feature delivery. The “Technical Debt” is the difference between doing something good enough for now and doing it right.
The debt metaphor is used because it implies that the organization has taken on liabilities that must be "repaid" (i.e. fixed) at some point in the future. The further along in development you move without getting rid of that debt, the more the debt grows. And like monetary debt, there is a nasty compounding effect at work here as well.
My favorite short explanations of the main points of the metaphor are:
- Skipping design is like borrowing money
- Refactoring is like repaying principal
- Slower development due to complexity is like paying interest
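The compounding effect in the metaphor above can be made concrete with a toy calculation. All numbers here are hypothetical, chosen only to illustrate the shape of the curve: assume each sprint of accumulated shortcuts imposes a small “interest” drag on the team’s delivery speed.

```python
def remaining_velocity(sprints_of_shortcuts, interest_rate=0.05):
    """Toy model: each sprint of unpaid technical debt taxes future
    delivery speed, compounding like interest on a loan."""
    velocity = 1.0  # fraction of the team's nominal output
    for _ in range(sprints_of_shortcuts):
        velocity *= (1.0 - interest_rate)  # complexity drag compounds
    return velocity

# After 10 sprints of corner-cutting at a hypothetical 5% drag per
# sprint, the team is delivering at roughly 60% of its original pace.
print(f"{remaining_velocity(10):.2f}")  # prints 0.60
```

The point of the sketch is the nonlinearity: the drag from deferred refactoring multiplies rather than adds, which is exactly why the debt metaphor resonates with executives used to reasoning about compound interest.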
Folks like Ward Cunningham, Martin Fowler, and Israel Gat do a much better job of explaining Technical Debt than I do and I highly recommend reading their work.
So what does this have to do with DevOps? I think it’s becoming increasingly clear that DevOps problems can best be approached and quantified using the concepts of Technical Debt. I hear people all the time digging themselves into deep holes of DevOps problems with the mindset of “let’s just get these features out the door first and then we’ll come back and fix our process and automation issues”. They are taking on massive amounts of technical debt and are usually lacking a way to quantify or account for it.
Let's try the above definitions on for size with one particularly common DevOps problem -- missing or poor quality automation:
- Missing or poor quality process automation is like borrowing money
- Implementing and improving process automation is like repaying principal
- Slower pace of innovation and poor execution due to missing or poor quality process automation is like paying interest
It does seem to fit quite well. And the best part? Aside from being a concept that forward-thinking developers have already embraced, Technical Debt has also been proven to be a persuasive metaphor at the executive level. Now we just have to port these ideas and vocabulary to the mainstream of the DevOps movement.
The next time you are struggling to convince an executive to fund and support DevOps work, remember to look into using tried-and-true Technical Debt arguments.
Below is an excerpt from a recent video interview I did with Israel Gat of the Cutter Consortium. In this segment he goes into what technical debt is and how it can be used to prove the cost of not “doing it right”.
Update: If you are at Agile 2011, go see Israel speak at one of his sessions. It’s worth it. His session on Wednesday, ‘Super-Fresh Code’, promises to be of interest to anyone grappling with DevOps issues.
Victoria Reitano described Moving software architects into the business analyst role in an 8/8/2011 post to the SD Times on the Web blog:
The role of the software architect has morphed into a project manager, forward thinker and business analyst all rolled into one, according to Benjamin Day, owner and founder of Benjamin Day Consulting. While vendors and architects disagree on the responsibilities to be taken on by the new software architects, they all agree that agile is the force driving this change.
“With the rise of Scrum, [development teams] aren’t doing big design up-front anymore," said Day. "In my career, it’s more and more of a collaborative process within the development team to figure out what the right architecture is. Frequently, the architect is also the middle man between the dev side and business side, helping each to understand the other's ideas and mission."
This idea of a software architect as a business analyst is a result of Day’s experience as a consultant for a variety of development teams working in small and large enterprises. Business analysts aren’t going away, but within collaborative teams, architects are taking on portions of their role to communicate the needs of the development team and ultimately achieve an understanding of the business side of the process.
Businesspeople are not as tech savvy as coders are, and someone is often needed to bridge the gap between both teams to help them understand their different roles and objectives within the Scrum development cycle, a role Day has found himself in frequently.
“The biggest change in the software architect role is to help the business team get comfortable with Scrum because now there are metrics showing how well each team is working throughout the process,” Day said. The inefficiencies shown by the Scrum metrics, such as those that measure the efficiency of deployment schedules, are often hard for the business team to cope with, in his experience, as business team leaders often believe that slow release cycles are caused solely by the development team.
Larry O’Brien, consultant and columnist for SD Times, believed that the software industry has started to shift focus from a high-level view of the development process to a lower-level look, mostly prompted by the agile movement. Agile doesn’t really buy into the value of high-level architectural modeling tools sold by vendors, he said. Instead, architecture should become important now. …
For the details about related big-data analytics, including HPC Pack, LINQ to HPC, Project “Dayton” and Excel DataScope, read my Links to Resources for my “Microsoft's, Google's big data [analytics] plans give IT an edge” Article post of 8/8/2011.
<Return to section navigation list>
Windows Azure Platform Appliance (WAPA), Hyper-V and Private/Hybrid Clouds
No significant articles today.
<Return to section navigation list>
Cloud Security and Governance
Cloud Times (@cloudtimesorg) reported The Big Shift to Cloud-based Security in an 8/9/2011 article:
How small and medium-sized organizations can manage their IT risks and maintain regulatory compliance with minimal staff and budget.
Keeping IT systems secure and running within regulatory compliance mandates, especially for mid-sized and even small businesses, seems next to impossible. There are many reasons for this – but fortunately, several recent technological trends show that it doesn’t have to be this way.
- Cyber-threats and regulations don’t care about business size
Most attackers don’t care whether they’re targeting a Fortune 25 firm or a small town manufacturer with 25 employees. What cyber criminals want is data and identities to steal and sell. Likewise, regulators are expecting the same security diligence from small and mid-sized firms as from large corporations. Consider the various data-breach disclosure laws that are in effect. They’re not based on the size of the company but the quantity and type of customer records that have been breached. And, while there may be slight differences in how regulations such as HIPAA, PCI DSS, and others affect mid-sized and even smaller firms, their overarching impact is the same.
- Software flaws: an ever-growing concern
The number of software vulnerabilities announced daily shows no sign of letting up. According to the Common Vulnerabilities and Exposures List, sponsored by the National Cyber Security Division of the U.S. Department of Homeland Security, there were more than 3,500 flaws reported during the first three quarters of 2010. That’s well over 10 newly announced software flaws every day. And these vulnerabilities, which make it possible for many forms of malware and attackers to gain entry to protected systems, are equally detrimental to businesses large and small. It’s not just end-point operating systems, servers, and on-premise software that are at risk. It’s also Web applications. According to a recent study by Web security firm Dasient, more than a million Web domains were infected with malware in just a 90-day span of this year.
- The extended business risk: partners, suppliers, and other stakeholders
All businesses are under internal and external pressure. Increasingly, businesses are demanding to see the security and risk management plans of those with which they do a significant amount of business. They want to know about disaster recovery and business continuity procedures. They want to know how security defenses are managed. And they want to know how their confidential information is protected.
This paper covers how small and medium-sized organizations can manage their IT risks and maintain regulatory compliance with minimal staff and budget.
Download the FREE WHITEPAPER here.
This article appears to me to have been sponsored by Qualys.
<Return to section navigation list>
Cloud Computing Events
Mary Jo Foley (@maryjofoley) reported If you’re among those going to the Windows 8 Build conference next month, be advised the scheduled pre-conference is no longer happening in a summary of her Microsoft cancels pre-conference sessions at Build Windows 8 event post of 8/9/2011 to ZDNet’s All About Microsoft blog:
It’s just over a month until Microsoft’s much-touted Build [@bldwin] conference. There’s still no session list or speakers list. There’s also — as of the past day or so — no pre-conference.
Microsoft’s original plan, as documented on its Build Web site, was to host a pre-conference on Monday, September 12, followed by four days of sessions. Here’s a cached version of the Build page showing the scheduled pre-conference:
Thanks to a tip on August 9 by one of my Twitter buds (@preconsult), I saw the pre-conference had been removed from the agenda.
I asked Microsoft officials why the preconference was removed. Here’s the official statement:
“In order to better focus on all the new content that will be covered at BUILD, we decided to optimize for the four main days of the event, which will include an enormous number of speakers and sessions. Registration is open on Monday but there will not be any formal sessions for attendees that day. We’re communicating the change now so people have the option to change their travel arrangements.”
That’s it. I’m not sure what kind of content was due to be presented or by whom, but it’s moot at this point.
Build is sold out, but Microsoft is planning to stream the keynotes live and make available publicly all of the session content a day after they happen.
Microsoft has positioned the Build show as a replacement for its Professional Developers Conference (PDC) and WinHEC (Windows Hardware Engineering Conference) — at least for this year. There’s no word as to whether Build will be a recurring event or how much (if any) Windows Phone, Xbox/XNA and/or Azure development content will be included as part of the conference.
Build is expected to focus on Windows 8 and the Windows 8 development platform. Microsoft officials have said attendees will have a chance to play with the Windows 8 bits. Company officials have not confirmed whether the company will release a beta or developer preview build of Windows 8 client and server at the conference, but this is widely expected.
Update: A few folks have suggested to me on Twitter that Microsoft couldn’t hold a pre-conference if they wanted to save their big surprises and secret content until Day 1 of the show (Tuesday the 13th). I thought the pre-con might include things like HTML5/JavaScript tutorials and information for those steeped in .Net and Silverlight…. In any case, the pre-con is gone….
Registration from 7:00 AM to 9:00 AM is still listed for Monday (probably in error).
Keith Ward (@Keithinator) posted Microsoft's BUILD Conference: A Riddle Wrapped in an Enigma on 8/9/2011 to his OnWard and Upward column for Visual Studio Magazine:
So Microsoft's BUILD conference is just about a month away. Incredibly, it's been sold out for many weeks now. Fortunately, I have a press pass, so I didn't have to suffer the disappointment many of you undoubtedly did.
I can understand why it's sold out, too. BUILD is, at least for 2011, replacing the Professional Developers Conference and Hardware Engineering Conference. That's two big shows crammed into one. It's also the unveiling of Microsoft's Windows 8 strategy going forward. Big questions need to be answered, like: How much of a hybrid (traditional desktop/laptop and mobile) will it be? Is HTML5/JavaScript the Web platform developers should be learning now? What about Silverlight, for Pete's sake? What about .NET?
In other words, BUILD promises to be huge. It's almost like the developer Super Bowl.
I know you're as excited as I am to look over the session descriptions, keynotes and pre-conference sessions to dive deep into the technology. I think the session I'm most looking forward to is... Well, let's see. There are no session descriptions, as of Tuesday afternoon, Aug. 9. At least I got to sign up for the pre-conference session on... wait a minute. Word has just come down from the irreplaceable Mary Jo Foley that the pre-conference goodies have been cancelled. Hmm. How about the keynote? It looks fantastic, with the speaker being... Oh. No keynoter listed. No keynote agenda/topics listed.
Yes, so far there's a whole lot of expectations for BUILD, but precious little data. For a show this potentially momentous, it's almost inconceivable that nothing regarding the actual, you know, content, has been published yet. Is Microsoft afraid that listing the agenda will give away state secrets? Is that the same reason the pre-conference sessions are six feet under?
I'm starting to feel like the BUILD conference has a huge, plastic dome over it, like the "Cone of Silence" from the old Get Smart TV show. No news gets out, no hints, no whispers, nothing. Let's just hope that Microsoft's execution is better than Agent 86's.
Where are WikiLeaks when you need them?
Full disclosure: I’m a contributing editor for Visual Studio Magazine.
Beth Massi (@bethmassi) announced her LightSwitch Speaking Tour in Fresno, L.A. & Phoenix Areas on 8/8/2011:
I’ve got a few speaking engagements at some .NET user groups coming up in the next couple weeks. If you’re in the area and want to know what Visual Studio LightSwitch is all about, what it can do, and how to make the most out of developing LightSwitch applications come on and check me out… er… I mean check out the talks ;-)
I plan on doing an introduction to the LightSwitch development experience by building and deploying an end-to-end, fully functional data-centric application. Then we’ll dive a bit deeper into what makes these applications tick by discussing the architecture and technologies upon which a LightSwitch application is built. I’ll show how to tap into the LightSwitch API, save pipeline and query processing. I’ll also show you the array of customization available from building your own controls & RIA Services all the way to authoring full-blown extensions like themes and shells using the Extensibility Toolkit.
Fresno, CA - Central California .NET User Group
Thursday August 11th, 6pm
Click here for more details
Santa Monica, CA - Los Angeles Silverlight User Group
Wednesday August 24th, 7pm
Click here for more details
Chandler, AZ - Southeast Valley .NET User Group
Thursday August 25th, 5:30pm
Click here for more details
Hope to see you there!
SSWUG announced on 8/9/2011 a Free Expo Event: Working with Windows Azure and SQL Azure to be held on 8/19/2011 at 9:00 AM to 3:00 PM PDT:
SSWUG.ORG’s virtual expo will review various aspects of developing and working with Windows Azure and SQL Azure, which enable you to build, host and scale applications in Microsoft datacenters. Through our in-depth sessions with some of the leading experts, you will see many demonstrations and examples on working with data and developing applications that run in the cloud. By the end of our event, you should have the tools and understanding needed to deploy applications and services on Microsoft’s scalable, customer-facing platform.
Expo Structure and Costs
All attendees can access and participate in the “Best of SSWUG.ORG” track for free. The Premium track, which includes access to the Personal Recommendation track, is available for $39 to SSWUG.ORG’s full members and $49 to non-members. To become a full member and save on all upcoming SSWUG.ORG events, visit www.sswug.org/join.aspx to start the process.
For a thorough examination of the topics in the comfort of your home or office, DVDs of the sessions are also available for $99.
Please register now to learn from six acclaimed Azure experts on Friday, August 19!
<Return to section navigation list>
Other Cloud Computing Platforms and Services
Alex Handy asserted Cloud platforms are adopting a broader view in an 8/9/2011 article for SD Times on the Web:
The emerging platform-as-a-service market continues to heat up, with new offerings (and new takes on what PaaS actually means) forming rapidly. From Java and .NET to Python and Ruby, PaaS offerings now cover all of the major languages, but the future may be more about all of these languages used together rather than each one standing on its own within a dedicated PaaS cloud.
John Rymer, principal analyst at Forrester Research, said that limiting PaaS' scope could limit appeal. “If what you're trying to do is entice the widest possible audience of developers to your platform, then a single-language strategy is anathema to that.”
Indeed, many successful PaaS companies and emerging PaaS offerings are already tending toward expanding their supported environments. Heroku, for example, has long been the poster child for Ruby PaaS. But in mid-July, the Salesforce.com-owned company announced that it has begun supporting Clojure in its PaaS offering, expanding the appeal of the platform beyond just Ruby users.
Byron Sebastian, general manager of Heroku, said that more platforms supported means more choice for developers. “When you're building distributed applications—where there might be a Web front end or some machine intelligence and data analysis involved—when you're building a complex distributed system, not only do you want different types of data stores and a combination of Web processes and distributed processes, but you might also want to implement them in different processes," he said.
"We felt Clojure made sense for our customers; it's got a lot of mind share, it's another step in the path we're taking, which currently includes Ruby, Node.js and Clojure. We will continue adding more and more languages there."
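The mechanics behind Heroku's polyglot push are worth a quick sketch. On Heroku's Cedar stack (which rolled out around the same time as the Clojure announcement), an app declares its processes in a Procfile, and the platform simply launches whatever command each process type names, so the runtime language becomes the developer's choice rather than the platform's. The commands below are illustrative sketches of three single-line Procfiles, not verbatim from Heroku's documentation:

```
web: bundle exec rails server -p $PORT    (a Ruby app)
web: lein trampoline run -m myapp.web     (a Clojure/Leiningen app)
web: node server.js                       (a Node.js app)
```

The process name on the left (`web`) is the same in all three cases; only the command differs, which is what lets one platform host Ruby, Clojure and Node.js side by side.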
VMware is also modifying its own PaaS strategy to have a broader scope. Originally, the company discussed a Java- and Spring-based PaaS offering from its SpringSource division. But later this summer, VMware will introduce a desktop-hosted version of Cloud Foundry (called Micro Cloud), the company's latest PaaS offering, which aims to include far more than just Java stacks. Micro Cloud is designed to give developers a way of building cloudy PaaS-based applications on their desktops. …
Charlene O’Hanlon reported Apprenda Receives $10 Million in Funding for .NET PaaS in an 8/9/2011 post to the TalkinCloud blog:
Platform as a Service provider Apprenda is seeing some love from investors for its .NET stack, announcing it has closed a $10 million Series B funding round led by Ignition Partners. Previous investors New Enterprise Associates and High Peaks Venture Partners also participated in the funding round.
According to Apprenda, the new capital will be used to accelerate its product roadmap and increase marketing and sales for its .NET private PaaS and framework. And, as per usual with funding deals, Apprenda will welcome Ignition’s managing director Frank Artale to its board of directors.
The funding is significant in the still-nascent PaaS space, the least well-known of the three major "-aaS" categories. However, PaaS is growing: Renub Research predicts the PaaS market will reach $400 million by 2013.
“Apprenda has a growing customer base and a scalable business model to help grow into an extremely profitable business as they are solving a key need for enterprises,” Artale said in a statement announcing the funding.
No doubt we’ll be hearing more about successful funding rounds for more PaaS vendors as the category matures.
Joe Brockmeier (@jzb) reported Google Announces SSAE-16 Compliance in an 8/9/2011 article for the ReadWriteCloud blog:
Google has announced that its Google Apps, App Engine, Postini and Google Storage for Developers products have passed the SSAE-16 Type II and ISAE 3402 Type II certifications. Sounds great, but what does it actually mean?
Well, the SSAE-16 is actually the "American Institute of Certified Public Accountants (AICPA) Statement on Standards for Attestation Engagements 16 (SSAE-16)." Got all that? If you go read the sites related to SSAE-16 and ISAE 3402 linked by Google, you'll probably come away with a head full of jargon and no better idea of what Google has achieved than before.
According to the End Point blog, it basically means that Google is saying that its products are suitable for customers that need to comply with legal requirements like Sarbanes-Oxley and that its data centers are secure. It also means that individual customers don't need to perform their own audits of Google's services – they can use Google's SSAE-16 report in their own reporting.
The SSAE-16 evolved from a previous standard called SAS 70. What's the difference? According to Curt Finch of Journyx, SSAE-16 "prohibits the use of prior evidence," which means that companies can't roll previous audits into the current certification.
SSAE-16 also differs in that the company's management is required to attest to "the fair presentation and design of controls." Previously, only the auditors reported on a company's controls. By the way, one might wonder who performed Google's audit – but Google has declined to disclose its auditor publicly.
In short, the SSAE-16 is a good thing to have, but it's important to understand what it is and isn't. Essentially, Google sets a series of controls and control objectives, and an auditor verifies that Google meets those practices. The SSAE-16 does not appear to be an objective measure that means that every organization that is SSAE-16 compliant is following the same procedures. (Note that the consensus seems to be that Google should announce "compliance" and not "certification.")
If you're really interested in Google's data center security, they've posted a video tour of a Google data center to highlight security and data protections in place, as well as a security white paper that goes into more detail.
Sourya Biswas reported Amazon AWS – Small Bump In The Night in an 8/9/2011 post to the CloudTweaks blog:
Amazon's AWS service was down for close to 25 minutes last night. From its status page (http://status.aws.amazon.com/):
7:39 PM PDT We are investigating connectivity issues for EC2 in the US-EAST-1 region.
7:50 PM PDT We can verify connectivity issues between instances in the US-EAST-1 region and the Internet.
8:03 PM PDT Full connectivity has been restored. The service is operating normally.
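Those three updates pin down the outage window, and a quick back-of-the-envelope check (using only the timestamps quoted above) confirms the headline figure:

```python
from datetime import datetime

# First report of connectivity issues and the all-clear, taken from the
# AWS status page updates quoted above (both PDT, same evening)
first_report = datetime.strptime("7:39 PM", "%I:%M %p")
all_clear = datetime.strptime("8:03 PM", "%I:%M %p")

minutes_down = (all_clear - first_report).seconds // 60
print(f"Outage window: {minutes_down} minutes")  # Outage window: 24 minutes
```

Twenty-four minutes from first report to all-clear, which matches the "close to 25 minutes" figure.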
Some of the more recognizable companies affected by the outage were Netflix, Foursquare and Reddit. Twenty-five minutes of downtime is far more forgivable than the Spring outage, during which some sites were down for two days or more. That incident understandably rattled many customers, and rightfully so: two days of lost revenue can cripple some businesses. Outages like these can erode the public’s trust in services such as Amazon’s.
<Return to section navigation list>