|A compendium of Windows Azure, Windows Azure Platform Appliance, SQL Azure Database, AppFabric and other cloud-computing articles.|
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:
- Azure Blob, Drive, Table and Queue Services
- SQL Azure Database, Codename “Dallas” and OData
- AppFabric: Access Control and Service Bus
- Live Windows Azure Apps, APIs, Tools and Test Harnesses
- Visual Studio LightSwitch
- Windows Azure Infrastructure
- Windows Azure Platform Appliance (WAPA) and private cloud items
- Cloud Security and Governance
- Cloud Computing Events
- Other Cloud Computing Platforms and Services
To use the above links, first click the post’s title to display the single article you want to navigate.
Discuss the book on its WROX P2P Forum.
See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.
Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.
You can now freely download by FTP and save the following two online-only PDF chapters of Cloud Computing with the Windows Azure Platform, which have been updated for SQL Azure’s January 4, 2010 commercial release:
- Chapter 12: “Managing SQL Azure Accounts and Databases”
- Chapter 13: “Exploiting SQL Azure Database's Relational Features”
The two chapters also are available for HTTP download at no charge from the book's Code Download page.
Tip: If you encounter articles from MSDN or TechNet blogs that are missing screen shots or other images, click the empty frame to generate an HTTP 404 (Not Found) error, and then click the back button to load the image(s).
No significant articles today.
• Walter Wayne Berry (@WayneBerry) delivered the fourth in his series, Securing Your Connection String in Windows Azure: Part 4, on 9/10/2010:
This is the fourth part in a multi-part blog series about securing your connection string in Windows Azure. The first blog post (found here) discussed a technique for creating a public/private key pair and using the Windows Azure Certificate Store to store and decrypt the secure connection string. In the second blog post (found here) I showed how the Windows Azure administrator imports the private key to Windows Azure. In the third blog post I showed how the SQL Server Administrator uses the public key to encrypt the connection string. In this blog post I will discuss the role of the developer and the code he needs to add to the web role project to get the encrypted connection string.
This technique includes a web-developer role: he has access to the public key (though he doesn't need to use it) and to the encrypted web.config file given to him by the SQL Server Administrator. His job is to:
- Reference the Thumbprint of the private key in the web.config.
- Add the provider assembly (PKCS12ProtectedConfigurationProvider.dl[l]) to the project.
The developer's role is the most restricted role in this technique. He has access to neither the private key nor the connection string.
Downloading and Compiling the Provider
The provider needs to be compiled so it can be referenced by the web role project. You will need Visual Studio 2005* or Visual Studio 2008 or 2010 on your box. We discussed this in Part 3 for the SQL Azure Administrator; however, the developer needs a compiled instance of this as well. In some cases the developer will compile it for the SQL Azure Administrator; in other cases the code will be checked in and compiled with every build – follow your company's guidelines. The steps to compile it are:
- From the MSDN Code Gallery, download PKCS12ProtectedConfigurationProvider.zip with the source code.
- Save everything in the .zip file to your local machine.
- Find the PKCS12ProtectedConfigurationProvider.sln file and open it as a solution with Visual Studio.
- From the menu, choose Build | Build Solution.
- In the PKCS12ProtectedConfigurationProvider\bin\Release directory there should be a PKCS12ProtectedConfigurationProvider.dll file.
- Copy PKCS12ProtectedConfigurationProvider.dll to where your other third-party assemblies are located in the web role; if you are using source control, check it in. This way the other developers will not need to recompile it.
Updating the Project to Use the Provider
- Add an assembly reference for this custom protected configuration provider (the PKCS12ProtectedConfigurationProvider.dll you built above) to the Web role project. To do so, right-click References in the Web Role project and click Add Reference, then browse to the redistributables directory for the custom provider and select "Pkcs12CertProtectedConfigurationProvider.dll". Right-click the added reference, click Properties, and set the reference's Copy Local property to True (most likely it will already be set to True). This is required so that the assembly is deployed to Windows Azure.
- From the Windows Azure Administrator, get the thumbprint of the private key from the Windows Azure Portal. Replace the thumbprint in the web.config with the thumbprint from the Windows Azure portal; this is the thumbprint of the private key and is needed by the provider to decrypt the connection string.
- Check in the web.config file and the project file with the newly added reference.
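For orientation, the pieces the two steps above touch sit together in web.config. A sketch along these lines (the provider name, type string and thumbprint below are placeholders patterned on the MSDN PKCS12 sample, not verbatim from it):

```xml
<configuration>
  <configProtectedData>
    <providers>
      <add name="CustomProvider"
           thumbprint="0000000000000000000000000000000000000000"
           type="Pkcs12ProtectedConfigurationProvider.Pkcs12ProtectedConfigurationProvider, PKCS12ProtectedConfigurationProvider" />
    </providers>
  </configProtectedData>
  <connectionStrings configProtectionProvider="CustomProvider">
    <EncryptedData><!-- ciphertext produced by the SQL Server Administrator --></EncryptedData>
  </connectionStrings>
</configuration>
```

The thumbprint attribute is the only value the developer edits; the EncryptedData content comes from the SQL Server Administrator unchanged.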
Now you are ready to create your deployment package and deploy to Windows Azure. The web.config file with the encrypted connection string, along with the PKCS12ProtectedConfigurationProvider.dll assembly, will be deployed to Windows Azure. Working with the private key in the Windows Certificate store, referenced by the thumbprint, the provider will be able to decrypt the connection string for the code.
A Developer’s Life
Have you ever noticed that when things become more secure, the developer's job gets harder? One thing about this technique is that the production web.config file will not work on the developer's box running the development fabric. The reason is that the private key is not on the developer box, and that private key is needed to decrypt the web.config. The solution is not to install the private key on the developer's box; that would compromise the connection string. Instead, have the developers run a different web.config, one that contains connection strings to development SQL Azure databases. This version of the connection string doesn't need to be encrypted.
Any code running on the production Windows Azure servers that has access to the web.config and the Windows Azure Certificate store has access to the SQL Azure connection string. For example, this code:

Response.Write("Clear text connection string is: " +
    System.Web.Configuration.WebConfigurationManager.ConnectionStrings["SQLAzureConn"].ConnectionString);
running on the production server would print out the connection string. This means that all code running on the Windows Azure server needs a security code review to make sure a rogue developer doesn't compromise the integrity of the security work we have done by encrypting the connection string. It also means that anyone who can deploy to the production Windows Azure server has the ability to figure out the connection string.
* I’ve left a comment on Wayne’s post questioning the requirement for VS 2005 or VS 2008 noted above. Part 3 mentions VS 2008 or VS 2010, which makes more sense to me.
• Update 9/11/2010: Wayne changed the required VS version to 2008 or 2010.
• On the subject of keys: Lori MacVittie (@lmacvittie) asked "There's a rarely mentioned move from 1024-bit to 2048-bit key lengths in the security demesne … are you ready? More importantly, are your infrastructure and applications ready?" as a preface to her F5 Friday: The 2048-bit Keys to the Kingdom post of 9/10/2010 to F5's DevCentral blog; see the Cloud Security and Governance section below.
• Vittorio Polizzi translated his Replicare le funzionalità di Database Mail su SQL Azure post of 9/5/2010 to Database Mail on SQL Azure of 9/11/2010:
Among the SQL Server 2008 features not available in SQL Azure, as shown on MSDN, are all those related to SQL Server Agent.
Database Mail, which lets you send emails from the database engine using some stored procedures in msdb, is one of these.
Creating a mechanism to queue e-mail messages in a SQL Azure database and send them through a worker role is easy, but I would like to go further, making existing code that uses Database Mail compatible with SQL Azure.
In this post we will see how we can adapt the system stored procedures to make them run on SQL Azure. We will consider only the most common stored procedures. All others, if necessary, may be adapted in the same way.
Obtaining the original SQL Server 2008 scripts
Obtaining the original code of a stored procedure is quite simple: just open the msdb database with SSMS, right-click the object you want to change, and choose Modify on the context menu. For almost all items it was possible to obtain the code to be modified this way, except for the dbo.get_principal_id and dbo.get_principal_sid functions and the xp_sysmail_format_query extended procedure. The first two have been easily rewritten, while for the extended procedure xp_sysmail_format_query, given the purpose of this post, it was decided not to proceed. This procedure is used to append to the body, or attach as a file, the result of a query. If you really need it, you will need to write the code that formats the output of a query as a string representing the output table.
Messages are sent using profiles, which contain all the account information and the SMTP server address. They also allow for a redundancy mechanism that ensures that your message is sent even if an SMTP server is down or your account has expired. Typical operations for the creation of profiles are:
-- Creating a Database Mail account
EXECUTE msdb.dbo.sysmail_add_account_sp
    @account_name = 'AdventureWorks2008R2 Public Account',
    @description = 'Mail account for use by all database users.',
    @email_address = 'db_users@Adventure-Works.com',
    @replyto_address = 'danw@Adventure-Works.com',
    @display_name = 'AdventureWorks2008R2 Automated Mailer',
    @mailserver_name = 'smtp.Adventure-Works.com' ;

-- Creating a Database Mail profile
EXECUTE msdb.dbo.sysmail_add_profile_sp
    @profile_name = 'AdventureWorks2008R2 Public Profile',
    @description = 'Profile used for administrative mail.' ;

-- Adding an account to the profile
EXECUTE msdb.dbo.sysmail_add_profileaccount_sp
    @profile_name = 'AdventureWorks2008R2 Public Profile',
    @account_name = 'AdventureWorks2008R2 Public Account',
    @sequence_number = 1 ;

-- Granting profile access to users
EXECUTE msdb.dbo.sysmail_add_principalprofile_sp
    @profile_name = 'AdventureWorks2008R2 Public Profile',
    @principal_name = 'public',
    @is_default = 1 ;
The stored procedures involved, along with their dependencies, are:
These stored procedures can be created in SQL Azure starting from the SQL Server 2008 scripts. In general, the stored procedures not involved in the creation of credentials (that is, all except sysmail_create_user_credential_sp and sysmail_add_account_sp) require no changes other than the removal of any explicit reference to the msdb database.
The sysmail_add_account_sp stored procedure stores the username and password as a credential. Since it is not possible to retrieve a credential's secret, this mechanism must be changed to allow a worker role to get the password for SMTP authentication.
An easy way is to create a new table for storing the SMTP username and password and change the sysmail_create_user_credential_sp stored procedure in order to use that table instead of server credentials.
The password field needs to be adequately protected, but because of SQL Azure's current limitations we cannot use the EncryptBy* statements. For the sake of simplicity we will store passwords in clear text (the password field is named "cyphertext" so we don't forget to protect it).
CREATE TABLE sysmail_account_credential(
    credential_id int IDENTITY(1,1) NOT NULL,
    username nvarchar(256) NULL,
    cyphertext nvarchar(256) NULL,
    CONSTRAINT [SYSMAIL_ACCOUNT_CREDENTIALIDMustBeUnique] PRIMARY KEY CLUSTERED
    (
        credential_id ASC
    ) WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF)
)
CREATE PROCEDURE [dbo].[sysmail_create_user_credential_sp]
    @username nvarchar(256),
    @password nvarchar(256),
    @credential_id int OUTPUT
AS
BEGIN
    -- The commented-out portions refer to the original code
    SET NOCOUNT ON

    --DECLARE @rc int
    --DECLARE @credential_name UNIQUEIDENTIFIER
    DECLARE @credential_name_as_str varchar(40)
    --DECLARE @sql NVARCHAR(max)

    ---- create a GUID as the name for the credential
    --SET @credential_name = newid()
    SET @credential_name_as_str = convert(varchar(40), @username) --@credential_name)

    --SET @sql = N'CREATE CREDENTIAL [' + @credential_name_as_str
    --         + N'] WITH IDENTITY = ' + QUOTENAME(@username, '''')
    --         + N', SECRET = ' + QUOTENAME(ISNULL(@password, N''), '''')
    --EXEC @rc = sp_executesql @statement = @sql
    --RETURN @rc

    INSERT INTO dbo.sysmail_account_credential (username, cyphertext) VALUES (@username, @password)

    --SELECT @credential_id = credential_id
    --WHERE name = convert(sysname, @credential_name)
    SELECT @credential_id = credential_id FROM dbo.sysmail_account_credential WHERE credential_id = @@IDENTITY

    IF (@credential_id IS NULL)
        RAISERROR(14616, -1, -1, @credential_name_as_str)
END
The following tables are used by the stored procedures:
Again the tables can be created using the scripts generated by SSMS connected to SQL Server 2008. To fix some incompatibilities with SQL Azure we need to remove all references to filegroups and the ALLOW_PAGE_LOCKS = ON, ALLOW_ROW_LOCKS = ON and PAD_INDEX = OFF options from the scripts.
You can use the stored procedures on SQL Azure as usual, with the exception that they do not reside in the msdb system database but in your user database.
The sp_send_dbmail stored procedure
The sp_send_dbmail stored procedure enqueues a mail message for sending. The message can be either plain text or HTML and can have file attachments. The body of the message or an attachment may also contain the result of a query.
A glance at the stored procedure's source code reveals that some changes are needed to make it run on SQL Azure. First, it is necessary to comment out the code that checks for the Service Broker and the ExternalMailQueue queue:
--Check if SSB is enabled in this database
--IF (ISNULL(DATABASEPROPERTYEX(DB_NAME(), N'IsBrokerEnabled'), 0) <> 1)
--    RAISERROR(14650, 16, 1)
--    RETURN 1

--Report error if the mail queue has been stopped.
--sysmail_stop_sp/sysmail_start_sp changes the receive status of the SSB queue
--IF NOT EXISTS (SELECT * FROM sys.service_queues WHERE name = N'ExternalMailQueue' AND is_receive_enabled = 1)
--    RAISERROR(14641, 16, 1)
--    RETURN 1
We also need to remove the call to sp_SendMailQueues:
-- Create the primary SSB xml message
--SET @sendmailxml = ''
--    + CONVERT(NVARCHAR(20), @mailitem_id) + N''

-- Send the send request on queue.
--EXEC @rc = sp_SendMailQueues @sendmailxml
--IF @rc <> 0
--    RAISERROR(14627, 16, 1, @rc, 'send mail')
--    GOTO ErrorHandler;
and, of course, all references to the msdb database.
The original stored procedure calls dbo.sp_validate_user. This stored procedure identifies the default profile, if none is specified, and checks whether the user has the rights to use it. At first, this process is carried out using the information stored in sysmail_principalprofile. If the default profile is not identified, the stored procedure tries a profile lookup based on Windows group membership. Of course, since we only have SQL authentication, this does not make sense in SQL Azure and we have to comment out the few related lines.
The security check makes use of the functions dbo.get_principal_id and dbo.get_principal_sid, which are not present in SQL Azure. The function code is not available, but you can easily write it:
CREATE FUNCTION dbo.get_principal_id (@sid varbinary(85)) RETURNS int
AS
BEGIN
    DECLARE @id int
    SELECT @id = principal_id FROM sys.database_principals WHERE sid = @sid
    RETURN @id
END

CREATE FUNCTION dbo.get_principal_sid (@id int) RETURNS varbinary(85)
AS
BEGIN
    DECLARE @sid varbinary(85)
    SELECT @sid = sid FROM sys.database_principals WHERE principal_id = @id
    RETURN @sid
END
Through the @file_attachments parameter of sp_send_dbmail you can specify a list of files to be attached to the message, given as a semicolon-separated list of absolute file paths. The stored procedure sp_GetAttachmentData, using the extended procedure xp_sysmail_attachment_load, stores the file content in sysmail_attachments_transfer. This mechanism cannot work in SQL Azure, so you must remove the code.
Since the attachment names are stored in sysmail_mailitems, they can still be used for sending. Of course, the files will be in cloud storage, so the file path will be an absolute URL or the container/file name pair of a known storage account.
The sp_send_dbmail stored procedure also allows you to append the result of a query to the message body (or attach it as a file). This feature relies on the xp_sysmail_format_query extended procedure, which is called by sp_RunMailQuery; xp_sysmail_format_query executes the query and builds the string that represents its output in tabular format. For the sake of simplicity this feature is removed. You can port it to SQL Azure by adding to sp_RunMailQuery the code required to execute the query and format the output table as a string.
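If you do need that feature, the formatting half is straightforward. Here is a minimal sketch in Python of what "format the output of a query as a string representing the output table" might look like; this is an illustration only, not the msdb algorithm, and the real replacement would live in sp_RunMailQuery or in the worker role:

```python
def format_query_output(headers, rows):
    """Render a result set as a fixed-width text table, roughly the
    kind of output xp_sysmail_format_query used to produce."""
    cells = [[str(value) for value in row] for row in rows]
    # Each column is as wide as its widest cell (or its header)
    widths = [
        max(len(h), *(len(r[i]) for r in cells)) if cells else len(h)
        for i, h in enumerate(headers)
    ]

    def line(values):
        return "  ".join(v.ljust(w) for v, w in zip(values, widths)).rstrip()

    out = [line(headers), line(["-" * w for w in widths])]
    out.extend(line(r) for r in cells)
    return "\n".join(out)
```

Calling `format_query_output(["id", "name"], [[1, "Anna"], [2, "Bo"]])` yields a small aligned table with a header row and separator, which can then be appended to the message body.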
The original definition of the stored procedure uses impersonation as dbo (CREATE PROCEDURE [dbo].[sp_send_dbmail] … WITH EXECUTE AS 'dbo'). SQL Azure does not allow the use of SUSER_SNAME() in impersonation contexts, and since it is used in the code and as a default field value, the WITH EXECUTE AS clause is removed.
Sending an e-mail
The sp_send_dbmail stored procedure relies on the sysmail_mailitems table for its operation. The sent_account_id field is set to the account id actually used, the sent_status value indicates the message status (1 = sent, 2 = sending failed, 3 = not sent) and sent_date is set to the date and time the message was sent.
With a join between sysmail_mailitems, sysmail_profile, sysmail_profileaccount, sysmail_account, sysmail_server and sysmail_account_credential you can retrieve all information needed for sending:
SELECT sysmail_mailitems.recipients, sysmail_mailitems.copy_recipients, sysmail_mailitems.blind_copy_recipients, sysmail_mailitems.subject, sysmail_mailitems.body,
sysmail_mailitems.body_format, sysmail_mailitems.importance, sysmail_mailitems.sensitivity, sysmail_mailitems.file_attachments, sysmail_mailitems.sent_status,
sysmail_account.email_address, sysmail_account.display_name, sysmail_account.replyto_address, sysmail_server.servername, sysmail_server.port,
sysmail_server.username, sysmail_profileaccount.sequence_number, sysmail_account_credential.cyphertext AS [Password]
FROM sysmail_profileaccount INNER JOIN
sysmail_account ON sysmail_profileaccount.account_id = sysmail_account.account_id INNER JOIN
sysmail_mailitems INNER JOIN
sysmail_profile ON sysmail_mailitems.profile_id = sysmail_profile.profile_id ON sysmail_profileaccount.profile_id = sysmail_profile.profile_id INNER JOIN
sysmail_server ON sysmail_account.account_id = sysmail_server.account_id LEFT OUTER JOIN
sysmail_account_credential ON sysmail_server.credential_id = sysmail_account_credential.credential_id
WHERE (sysmail_mailitems.sent_status = 0)
ORDER BY sysmail_profileaccount.sequence_number
The complete script for creating the tables and stored procedures can be downloaded here. Of course, before you use this code in a production environment, you should test it thoroughly and add tighter security checks.
The worker role needs to check whether there are messages waiting in the queue (sent_status = 0) and send them using an SMTP client, trying several accounts if needed. After sending a message, it should update the sysmail_mailitems table, setting the sent_status, sent_date, from_address and reply_to fields. The values for the latter two come from the account actually used.
As mentioned earlier, attachment files should be stored in cloud storage; the worker role transfers them to local storage and instructs the SMTP client accordingly.
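The send-with-failover loop the worker role needs might be sketched as follows. This is an illustrative Python mock-up (a real worker role would be .NET and would read the join query shown earlier); the dict keys mirror the columns of that query, but the helper names are invented for the example:

```python
import smtplib
from datetime import datetime
from email.message import EmailMessage

SENT, FAILED = 1, 2  # sent_status values used by sp_send_dbmail

def build_message(item, account):
    """Assemble the outgoing mail from a sysmail_mailitems row and
    the account chosen for this attempt."""
    msg = EmailMessage()
    msg["From"] = account["email_address"]
    msg["To"] = item["recipients"]
    msg["Subject"] = item["subject"]
    msg.set_content(item["body"])
    return msg

def send_with_failover(item, accounts, smtp_factory=smtplib.SMTP):
    """Try each account in sequence_number order until one works,
    mirroring Database Mail's redundancy behavior. Returns the fields
    to write back to sysmail_mailitems."""
    for account in sorted(accounts, key=lambda a: a["sequence_number"]):
        try:
            with smtp_factory(account["servername"], account["port"]) as smtp:
                if account.get("username"):
                    smtp.login(account["username"], account["password"])
                smtp.send_message(build_message(item, account))
            return {"sent_status": SENT,
                    "sent_date": datetime.utcnow(),
                    "from_address": account["email_address"],
                    "reply_to": account["replyto_address"]}
        except OSError:
            continue  # SMTP server down or auth failed: try the next account
    return {"sent_status": FAILED}
```

The `smtp_factory` hook exists only so the loop can be exercised without a live SMTP server; in production the default `smtplib.SMTP` (or its .NET equivalent) would be used.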
I hope I've provided some guidance to ease the process of porting your solution to SQL Azure, limiting the need for changes. If you need clarification, you can also contact me directly using the form on "Contatti".
• Humberto (humlezg) listed support for OData in his Released! CRM 2011 Beta. My Favorite Features post of 9/10/2010:
The last 3 years of a lot of people's work have been released and it ROCKS! The CRM 2011 beta is live! http://www.crm2011beta.com/
[The] Beta is a quality, feature-complete release of CRM 2011 that will help us fine-tune the last set of pieces for the official release of CRM 2011. There are A TON of features and enhancements in the product, but people always have favorites, right? Here are my top 5+; yes, totally biased, because my team worked on many of them. :)
Solutions Management: We took development of business applications on top of the CRM framework to a whole new level. We introduced the concept of unmanaged and managed solutions, which allow you to define business applications that bundle your components; transport them; deploy them in all flavors (on-premises and online) and all clients (web and Outlook) of CRM 2011; update them (while preserving end-user customizations); uninstall them if necessary; and restrict customization of selected components, with automatic dependency tracking, etc.
- Unmanaged solutions represent the "source" (so to speak) of your application. They allow developers to create and update applications, and also allow those used to working with CRM 4 to use the same techniques they were accustomed to for customizing CRM (just customize it).
- Once a solution is ready for final distribution, it can be exported as managed. When a solution is installed as managed, the framework enforces restrictions established by the developer (e.g., it prevents customizations if the component is not customizable) and tracks changes performed on it as such: customizations "on top". Multiple managed solutions that customize the same component are handled automatically by the framework, and changes to key components such as forms, the ribbon and sitemap navigation are merged. Any subsequent update to the solution gives the option to preserve existing customizations or to overwrite them; all without having to write a single line of code or complicated installers.
You will see a lot more material on this (solutions management is a vast topic) including information about the Dynamics Marketplace but you can start by looking into some of the enhancements on this video (see Video 9): http://offers.crmchoice.com/CRM2011Beta-Landing
Next week I'm slated to appear in a "meet the team" video explaining the whole feature set a bit more and giving a demo, so stay tuned.
Web Resources + OData endpoint. This is by all means the feature with the most bang for the buck for CRM developers. It completely unlocks the client-side extensibility capabilities of the CRM framework. Instead of forcing developers to learn new languages, Web Resources allow them to use familiar technologies such as HTML pages, rich Silverlight applications, images, style sheets, XML and more. You can extend forms, dashboards and ribbons, and create your own stand-alone pages, libraries, etc., while the framework takes care of storage, rendering and caching of the resources. Web Resources work On-Premises and Online, and they also work in Outlook, both online AND offline. If you add on top of that our new RESTful endpoint, you have a killer combination: rich UI + rich server-side interaction. Did I mention that now we also support multiple handlers for the same form/field event? Oh yes, we do. [Emphasis added.]
Check them out at http://offers.crmchoice.com/CRM2011Beta-Landing (video 2)
Charts, Dashboards and Filtering. Those sexy graphs with real-time insight and drill-down capabilities are killer features of CRM 2011. A lot of competitors do dashboards and charts; they are not really new to business applications, but the way CRM 2011 delivers them is unique: they let you navigate your system while drilling into data. You can also extend dashboards with your own controls using Web Resources.
The new Outlook Client. Yes, I'm talking about a whole new client, completely revamped for CRM 2011, with many familiar features but lots of new goodies that seamlessly blend CRM and Outlook. You can now take advantage of native Outlook grids, the reading pane, categories, filters, support for multiple organizations, streamlined installation and updates, and much, much more. Any custom solution automatically flows into the Outlook client without your having to write a single line of C++ code; all your form customizations, web resources (yes, Silverlight too), ribbon modifications, dashboards and charts work in the Outlook client too. This is as close as it gets to the holy grail of "code once, use everywhere". :)
Form Editor and Customize Tab. I list them together because the same feature team (not mine, btw) made them happen. The new Form Editor speaks for itself: drag and drop, rich controls, navigation editing; now it is just plain fun to edit forms. The Customize tab has an interesting story behind it; now everybody loves it, because from almost any object you just click Customize on the ribbon and make whatever changes you want; it seems like a no-brainer, right? But the love was not always there, and in fact at some point in the release we weren't sure it was going to make it. Literally, one day the tab just showed up and everybody realized we had to finish it; now it is one of the most demoed enhancements in CRM 2011; customization at your fingertips. :)
Hidden Gems. I’ll probably write more about these additional features but wanted to at least list them here initially:
- Dialogs. Not really a hidden gem, in fact a pretty hallmark feature for this release so I suspect you’ll hear a lot about it. Wizards anybody?
- Global Option Sets. The new fancy name for global picklists. Finally, define an option set once and use it in multiple entities
- Filtered Lookups. Indeed, we got them in! You can now declaratively filter lookups based on relationships just by configuring them in the form editor or, for more advanced users, you can create brand-new views on the fly using the client-side API. Want to filter based on dynamic values on a form? You got it. No hay problema!
- Most Recently Used items in lookups. That's right: lookups now remember the items you frequently select; the next time you type something, the UI will automatically show you recent items to pick from, without your having to launch the full lookup dialog.
- New SOAP Endpoint. Totally based on .NET 4 and WCF, the newly revamped endpoint makes it a snap to interact with CRM back and forth. I'm personally a fan of the OData (REST) endpoint instead, but hard-core .NET developers will love the new WCF endpoint.
- Claims Support. People might not realize the potential that supporting standard claims authentication has, but trust me, this is a game changer. I'll let the security experts on the team speak of it.
Download the Microsoft Dynamics CRM 2011 Beta here.
See also Bruce Kyle suggested that you Get Your Cloud App Development Started with Microsoft Dynamics CRM 2011 in this 9/10/2010 post to the US ISV Evangelism blog in the Live Windows Azure Apps, APIs, Tools and Test Harnesses section.
Matt Witteman took A Quick Look at OData in the Dynamics CRM 2011 Beta in his 10/11/2010 post to the C5insight blog:
SharePoint 2010 and the upcoming Dynamics CRM 2011 are two of Microsoft's premier products that are leveraging the new Open Data Protocol, or OData. At C5 Insight, we focus on SharePoint and CRM, so we're excited about diving into OData and finding ways to use it to enhance our clients' investments in these systems.
OData is an open standard for performing read, create and update operations using nothing more than query-string parameters appended to a URL. A RESTful architecture, it's the evolution of Microsoft's ADO.NET Data Services. Data sources like SharePoint lists and CRM tables can be enabled for OData access using the .NET Framework 4's Windows Communication Foundation.
OData uses standard HTTP messages (GET, PUT, POST, and DELETE) combined with the OData syntax to make it easy to connect web-based applications. To use OData, you address the data service's endpoint. This is essentially a URL, something like www.foo.com/mydata.svc. Browsing to this address would bring back an XML feed that you could read in your browser or reference in code.
Inside each data service endpoint will be collections or sets of data. You can think of them as the lists or tables. For example, there might be a "documents" set. So if you were to browse to foo.com/mydata.svc/documents you'd get an XML feed that contained the document data. OData endpoints also provide access to metadata. The standard way to access this would be to browse to the data service endpoint and then add the /$metadata parameter (foo.com/mydata.svc/$metadata).
To query a data collection or add or update records, you add the query parameters. OData has a syntax for this. A simple example is to select the top five records: $top=5 (foo.com/mydata.svc/documents?$top=5). You can combine query parameters as well, so you can do more complex queries.
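As a quick illustration of the syntax described above, here is a hedged Python sketch that assembles such query URIs; the service and set names are the hypothetical foo.com examples from the text:

```python
from urllib.parse import urlencode, quote

def odata_query(service, entity_set, **options):
    """Build an OData query URI: keyword options such as top=5 or
    select='name' become $-prefixed query-string parameters."""
    params = {"$" + key: value for key, value in options.items()}
    url = service.rstrip("/") + "/" + entity_set
    if params:
        # Keep '$' readable; percent-encode everything else (spaces -> %20)
        url += "?" + urlencode(params, safe="$", quote_via=quote)
    return url
```

For example, `odata_query("http://www.foo.com/mydata.svc", "documents", top=5)` produces `http://www.foo.com/mydata.svc/documents?$top=5`, and extra options combine with `&` for more complex queries.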
In Microsoft Dynamics CRM 2011, each organization will have its own OData service address, and each entity will be represented as a "set" under the OData service. The URIs for CRM 2011's OData service look like this:
- The main OData service: http://crmserver/orgname/xrmservices/2011/organizationdata.svc
- The metadata URI: http://crmserver/orgname/xrmservices/2011/organizationdata.svc/$metadata
- Accessing the Account data set: http://crmserver/orgname/xrmservices/2011/organizationdata.svc/AccountSet
- A query that returns the account number and city for an account named "Advanced Components": http://crmserver/OrgName/xrmservices/2011/organizationdata.svc/AccountSet?$select=AccountNumber,Address1_City&$filter=Name eq 'Advanced Components'
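One practical note on that last URI: the spaces and quotes in the $filter expression must be percent-encoded before the request goes on the wire. A small Python sketch using the placeholder server and organization names from the list above:

```python
from urllib.parse import quote

# Placeholder names from the URI list; a real deployment substitutes its own
base = "http://crmserver/OrgName/xrmservices/2011/organizationdata.svc"

# $filter values contain spaces and quotes, which need percent-encoding
filter_expr = quote("Name eq 'Advanced Components'", safe="")

url = (base + "/AccountSet"
       "?$select=AccountNumber,Address1_City"
       "&$filter=" + filter_expr)
```

The resulting URI carries `Name%20eq%20%27Advanced%20Components%27` in its $filter, which the CRM OData service decodes back to the original expression.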
Here's the XML the query above would return:
Download the Microsoft Dynamics CRM 2011 Beta here.
• Eve Maler's Making identity portable in the cloud post of 9/10/2010 reported that Microsoft is touting OData as the successor to LDAP:
Yesterday I had the opportunity to contribute to BrightTALK’s day-long Cloud Security Summit with a webcast called Making Identity Portable in the Cloud.
Some 30 live attendees were very patient with my Internet connection problems, meaning that the slides (large PDF) didn’t advance when they were supposed to and I couldn’t answer questions live. However the good folks at BrightTALK fixed up the recording to match the slides to the audio, and I thought I’d offer thoughts here on the questions raised.
“Framework provider – sounds suspiciously like an old CA (certificate authority) in the PKI world! Why not just call it a PKI legal framework?” Yeah, there’s nothing new under the sun. The circles of trust, federations, and trust frameworks I discussed share a heritage with the way PKIs are managed. But the newer versions have the benefit of lessons learned (compare the Federal Bridge and the Open Identity Solutions for Open Government initiative) and are starting to avail themselves of technologies that fit modern Web-scale tooling better (like the MDX metadata exchange work, and my new favorite toy, hostmeta). PKI is still quite often part of the picture, just not the whole picture.
“How about a biometric binding of the individual to the process and the requirement of separation of roles?” I get nervous about biometric authentication for many purposes because it binds to the bag of protoplasm and not the digital identity (and because some of the mechanisms are actually rather weak). If different roles and identities could be separated out appropriately and then mapped, that helps. But with looser coupling come costs and risks that have to be managed.
"LDAP, AD, bespoke, or a combination?" Interestingly, this topic was hot at the recent Cloud Identity Summit (an F2F event, unlike the BrightTALK one). My belief is that some of today's tiny companies are going to outsource all their corporate functions to SaaS applications; they will thrive on RESTfulness, NoSQL, and eventual consistency; and some will grow large, never having touched traditional directory technology. I suspect this idea is why Microsoft showed up and started talking about what's coming after AD and touting OData as the answer. (Though in an OData/GData deathmatch, I'd probably bet on the latter…). [Emphasis added.]
Thanks to all who attended, and keep those cards and letters coming.
I’d bet on OData; Google has been very quiet about GData lately.
• Patrick Butler Monterde lists SQL Azure Papers in his 9/10/2010 retrospective:
This is a list of the currently published SQL Azure papers:
SQL Azure FAQ
SQL Azure Database is a cloud based relational database service from Microsoft. SQL Azure provides relational database functionality as a utility service. Cloud-based database solutions such as SQL Azure can provide many benefits, including rapid provisioning, cost-effective scalability, high availability, and reduced management overhead. This document addresses some of the most frequently asked questions by our customers.
Comparing SQL Azure with SQL Server
SQL Azure Database is a cloud based relational database service from Microsoft. SQL Azure provides relational database functionality as a utility service. Cloud-based database solutions such as SQL Azure can provide many benefits, including rapid provisioning, cost-effective scalability, high availability, and reduced management overhead. This paper provides an architectural overview of SQL Azure Database, and describes how you can use SQL Azure to augment your existing on-premises data infrastructure or as your complete database solution.
Developing & Deploying with SQL Azure
This document provides guidelines on how to deploy an existing on-premise SQL Server database into SQL Azure and guidelines around best practices during data migration.
Security Guidelines with SQL Azure
SQL Azure Database is a cloud database service from Microsoft. SQL Azure provides web-facing database functionality as a utility service. Cloud-based database solutions such as SQL Azure can provide many benefits, including rapid provisioning, cost-effective scalability, high availability, and reduced management overhead. This document provides an overview of security guidelines for customers connecting to SQL Azure Database, and building secure applications on SQL Azure.
Scaling out with SQL Azure
SQL Azure Database is a cloud database service from Microsoft. SQL Azure provides web-facing database functionality as a utility service. Cloud-based database solutions such as SQL Azure can provide many benefits, including rapid provisioning, cost-effective scalability, high availability, and reduced management overhead. This paper provides an overview on some scale out strategies, challenges with scaling out on-premise and how you can benefit with scaling out with SQL Azure.
Troubleshooting & Optimizing Queries with SQL Azure
SQL Azure Database is a cloud based relational database service from Microsoft. SQL Azure provides web-facing database functionality as a utility service. Cloud-based database solutions such as SQL Azure can provide many benefits, including rapid provisioning, cost-effective scalability, high availability, and reduced management overhead. This paper provides guidelines on the Dynamic Management Views that are available in SQL Azure and how they can be used for troubleshooting purposes.
MS TechNet: http://social.technet.microsoft.com/wiki/contents/articles/troubleshooting-and-optimizing-queries-with-sql-azure.aspx
Sync Framework for SQL Azure
SQL Azure Database is a cloud database service from Microsoft. SQL Azure provides web-facing database functionality as a utility service. Cloud-based database solutions such as SQL Azure can provide many benefits, including rapid provisioning, cost-effective scalability, high availability, and reduced management overhead. This document is not intended to provide comprehensive information on SQL Azure Data Sync. However, the intent is to provide best practices guidelines on synchronizing SQL Azure with SQL Server and to supplement the information available at the links in the References section.
The library contains a Data Source Control for XAML forms. The control extends the proxy class (created by Visual Studio) to access OData services and cooperates with other Silverlight GUI controls. The control supports master-detail, paging, filtering, sorting, validation, and data editing, and you can append your own validation rules. It works like a Data Source Control for RIA Services.
You can try [a] live demo here http://alutam.cz/samples/DemoOData.aspx
In detail, there are three basic classes: ODataSource, ODataSourceView and Entity.
- ODataSource is a control and you can use it in XAML Forms.
- ODataSourceView is a class which contains a data set. ODataSourceView has the following interfaces: IEnumerable, INotifyPropertyChanged, INotifyCollectionChanged, IList, ICollectionView, IPagedCollectionView, IEditableCollectionView.
- Entity is a class for one data record and has the following interfaces: IDisposable, INotifyPropertyChanged, IEditableObject, IRevertibleChangeTracking, IChangeTracking.
Here’s a screen capture of the Basic version of the live ODataSource Control sample:
There’s also a master-detail version.
Scott Klein explained Automating a SQL Azure Database Copy with SQL Server Integration Services in this 9/9/2010 post:
On August 24th, Microsoft released Service Update 4 for SQL Azure. The major component of this release was the live release of Database Copy functionality, which allows you to make a real-time "backup", or snapshot, of your database into a different database. Backup functionality has been at the top of feature requests for SQL Azure for a long time, so this feature is certainly welcome.
However, this great feature is certainly not without its limitations. For example, there is no built-in way to automate a backup; for now it is all manual via the CREATE DATABASE statement. Also, if you want to create multiple backups, you will need to create a new database for each new copy. Keep in mind that there is a cost for each database. However, the Database Copy is a nice and convenient way to back up your database, and this blog post will walk you through the steps to automate the backup, or Copy, of a SQL Azure database using SQL Server Integration Services (SSIS) and a SQL job. This is one of the topics covered in the book Herve and I are writing, Pro SQL Azure by Apress.
As a quick intro, creating a Database Copy is done via the CREATE DATABASE statement, as follows:
CREATE DATABASE targetdatabasename AS COPY OF sourcedatabasename
Simple as that. So, the first step in automating the Database Copy is to create a new SSIS project.
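One detail worth knowing is that the copy runs asynchronously, so the CREATE DATABASE … AS COPY OF statement returns before the copy is complete. As a sketch (the database names are the placeholders from above, and column details may vary by service release), you can poll the copy's progress from the master database:

```sql
-- Connect to the master database after issuing the copy statement.
-- sys.dm_database_copies shows an in-flight copy and its percent_complete.
SELECT partner_server, partner_database, percent_complete, modify_date
FROM sys.dm_database_copies;

-- When the copy finishes, its row disappears above and the new database
-- reports ONLINE here:
SELECT name, state_desc
FROM sys.databases
WHERE name = 'targetdatabasename';
```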
Once the project is created, place two Execute SQL Tasks on the Control flow, then link them.
Next, create a new connection to your SQL Azure master database. Be sure to test the connection to ensure you can properly connect to your SQL Azure database.
In the first Execute SQL Task, set the properties as shown in the following Figure. The Connection property should point to the connection you just created, and the SQL Statement property should simply be:
DROP DATABASE database
This database should be the target (Copy) database.
In the second Execute SQL Task, set the properties as shown in the following Figure. The Connection property should point to the same connection, and the SQL Statement property should simply be:
CREATE DATABASE targetdatabasename AS COPY OF sourcedatabasename
Your SSIS package is now complete. Test your package by running it manually. If all goes well, both tasks should be green when the package is finished executing.
Now, this example assumes that the target (Copy) database already exists when you first run this. If it doesn't, you will have to execute the command manually first via SQL Server Management Studio. Unfortunately you can't wrap the DROP DATABASE statement in an IF EXISTS statement because if you do you will get an error stating that the DROP DATABASE statement must be the only statement in the batch.
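One way to sidestep the single-statement restriction (a sketch, not from the original post) is to keep the existence check in its own task and gate the DROP on its result, leaving DROP DATABASE alone in its batch:

```sql
-- Task 1: check whether the target copy already exists (run against master).
SELECT COUNT(*) AS copy_exists
FROM sys.databases
WHERE name = 'targetdatabasename';

-- Task 2: run only when copy_exists = 1 (e.g., gated by an SSIS precedence
-- constraint); DROP DATABASE must be the sole statement in its batch.
DROP DATABASE targetdatabasename;

-- Task 3: recreate the copy.
CREATE DATABASE targetdatabasename AS COPY OF sourcedatabasename;
```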
At this point, you can deploy your package to an on-premise SSIS Server. To automate the backup, create and schedule a SQL job that calls this SSIS package. For example, you can schedule the job to run nightly at 1:00 AM which will back up your database each night at 1:00 AM.
BrowserBoy quoted me in his Project Houston offers cloud alternative for SSMS post of 9/10/2010:
Currently dubbed Project Houston, the software is a free Microsoft Silverlight-based application that developers and DBAs can use to connect directly to SQL Azure from a browser. The company released a community technical preview (CTP) for the app in late July.
Before Houston, users were required to run SQL Server Management Studio (SSMS) 2008 R2 to connect to SQL Azure. This was fine for machines already running SQL Server 2008 R2, where SQL Azure support comes built in. Otherwise, users were forced to download and install SSMS 2008 R2 Express to access their SQL Azure databases, a process that Microsoft program manager Dan Jones recently described as a less-than-practical solution.
Project Houston, on the other hand, requires only Microsoft Silverlight to connect to SQL Azure. From there, users can create database objects and tables, write and edit stored procedures, run queries and perform other tasks similar to what they can do with SQL Server Management Studio Express.
Roger Jennings, principal consultant with California-based OakLeaf Systems, said he was impressed with Houston’s functionality, but added that its only real benefit seems to be the lack of SSMS requirement. [Emphasis added.]
“I was able to do everything they advertised that you could do, but of course you can do the same thing with SSMS,” he said. “I think that Houston was basically an exercise in hardcore Silverlight programming — a demonstration that it was relatively easy to write a front end for a SQL Azure cloud database.”
The value of Project Houston could expand over time. Mark Kromer, a data platform technology specialist with Microsoft, said that the long-term plan is for Houston to support SQL Server Express and other databases in the enterprise.
“A lot of people use tools similar to Houston today to manage their off-premise SQL Server databases, so there is more and more pressure [for Microsoft] to do that,” he said.
The timetable for enterprise functionality beyond SQL Azure is unclear. For now, Kromer said that some organizations have already begun to test Houston for development databases. The idea is to allow for in-and-out testing and development through SQL Azure, without requiring developers to “get dangerous with SSMS,” he said.
Houston could potentially expand to non-IT scenarios, as well. Jennings noted that the Silverlight-based interface could make the app a good fit for in-house demos and presentations. “It might be a good demo for C-level executives,” he said. “People would probably use that rather than SSMS for executive briefings because it’s a little flashier.”
Originally hosted in Microsoft’s North Central U.S. data center, Houston was recently added to all Microsoft data centers for SQL Azure. The CTP is currently available at the SQL Azure Labs site, and OakLeaf Systems’ Jennings said he’d be surprised if Microsoft made any major changes before its official release.
“I think this CTP is it,” he said. “For now, I don’t think you could justify investing a lot more money into that app when you consider the limited benefits it provides.”
Mike Flasko asked What do you want to see added/changed in WCF Data Services? in this 9/10/2010 post to the WCF Data Services Team blog:
As we’ve posted a few times before on this blog, the data services team is beginning to explore improvements and new features for our next release. As part of this process, it’s critical that we hear your feedback, as it helps us ensure that what we build actually meets your requirements in real-world scenarios. Today we’re launching a new site that will allow you to interact more directly with the development team and provide input: http://dataservices.mswish.net.
The site’s pretty simple and self-explanatory – you can add a new feature request or vote for feature requests that are already there. We’ve already pre-populated this list with items that were submitted on our prior Connect site and we’ve also transferred over votes. We hope you’ll try it out and vote on the features you most want to see added. Finally, as features move from ideas into actual development we’ll post our thoughts and ideas to this blog as design notes.
Juval Löwy’s very successful WCF book is now available in its third edition, and Juval asked me to update the foreword this time around. It’s been over three years since I wrote the foreword to the first edition, so an update was due: WCF has moved on quite a bit, and its use in the customer landscape and inside Microsoft has deepened. We’re building a lot of very interesting products on top of the WCF technology across all businesses, not least of which is the Azure AppFabric Service Bus that I work on, which is entirely based on WCF services. [Emphasis added.]
You can take a peek into the latest edition at the O’Reilly website and read my foreword if you care. To be clear: It’s the least important part of the whole book :-)
The book is subtitled “Mastering WCF and the Azure AppFabric Service Bus.” On order at Amazon.
• Bruce Kyle suggested that you Get Your Cloud App Development Started with Microsoft Dynamics CRM 2011 in this 9/10/2010 post to the US ISV Evangelism blog:
Microsoft Dynamics CRM 2011 beta gives ISVs and developers an opportunity to get started rapidly developing line of business cloud applications.
Developers can start building their applications right away, as solution packages built on the beta will be compatible with the final release of the product. It is a great opportunity for developers to build applications now and get ahead of the competition by listing them in the Dynamics Marketplace, which will be available in the next few days. Partners can also attend one of our free day-long global readiness tour events happening worldwide.
The beta for Microsoft Dynamics CRM 2011, the next version of Dynamics CRM, was released this week. You can download or sign up for the beta today at http://www.crm2011beta.com to evaluate it and start building solutions based on the new release.
ISVs are developing applications that integrate with Dynamics CRM by
- Using CRM as a convenient, customizable data store
- Adding functionality to a customer relationship management system
- Stripping out the customer/sales-team portion and building line-of-business applications that require identity, customizable data entry, and a great, familiar user interface
And the same code you write for Dynamics CRM 2011 in the cloud can be used in an on premises version of your software. [Emphasis added.]
CRM 2011 Global Readiness Tour
In the United States, the CRM 2011 Global Readiness Tour schedule includes:
- Sept 8 – Reston, VA
- Sept 14 – Irvine, CA
- Sept 16 – Chicago, IL
- Sept 21 – Dallas, TX
- Sept 23 – San Francisco, CA
- Sept 28 – Southfield, MI
- Sept 30 – Alpharetta, GA
- Oct 5 – Bellevue, WA
- Oct 7 – Iselin, NJ
The day kicks off with a general session keynote, presentation of what’s new, how to win with CRM 2011, and then an interactive Q&A lunch. Following lunch there are two session tracks for marketing and sales, and technical pre-sales. In some cities there is a pre-day or post-day session also scheduled; please check PartnerSource to sign up.
Getting Started with Dynamics CRM
See these videos on MSDEV that describe what Dynamics CRM is and how you can get started.
My colleague John O’Donnell, Sanjay Jain, and Andrew Bybee have put together a set of videos that show you how to get started writing your line of business application.
- Microsoft Dynamics CRM: Building Line-of-Business Applications
- Microsoft Dynamics CRM 4.0 : Installation with John O'Donnell
- Microsoft Dynamics CRM 4.0 : Data Migration Manager with John O'Donnell
- Microsoft Dynamics CRM 4.0 : Workflow and Processes with John O'Donnell
- Microsoft Dynamics CRM 4.0 : Report Wizard with John O'Donnell
- Microsoft Dynamics CRM 4.0 : Import Data Wizard with John O'Donnell
- Microsoft Dynamics CRM 4.0 : Multicurrency with John O'Donnell
- Microsoft Dynamics CRM 4.0 : Duplicate Detection with John O'Donnell
- Microsoft Dynamics CRM 4.0 Multi-Language User Interface (MUI) with Sanjay Jain
Microsoft Platform Ready
Enroll in Microsoft Platform Ready for special limited time tech support, free developer training, and marketing assistance offered worldwide for Microsoft Partners in key technologies, such as Microsoft Dynamics CRM 2011.
About Microsoft Dynamics CRM 2011
Microsoft Dynamics CRM 2011 delivers the Power of Productivity through familiar, connected and intelligent experiences for users inside and outside an organization. The key enhancements include:
- Familiar experiences through a next-generation native Microsoft Outlook client, Microsoft Office contextual CRM Ribbon, RoleTailored design and user personalization
- Intelligent experiences through guided process dialogs, inline data visualizations, performance and goal management, and real-time dashboards
- Connected experiences through cloud development, Windows Azure integration, contextual Microsoft SharePoint document repositories, teamwork and collaboration, and the Microsoft Dynamics Marketplace.
Bruce D. Kyle
ISV Architect Evangelist | Microsoft Corporation
Update 9/12/2010: Download the Microsoft Dynamics CRM 2011 Beta here. Installing Dynamics CRM 2011 will install the Windows Azure Platform AppFabric SDK v1.0.1006.0 if it’s not already installed.
Join Ryan and Steve each week as they cover the Microsoft cloud. You can follow and interact with the show at @cloudcovershow
In this episode:
- Listen as we talk about how you can deploy and manage the services you have running in Windows Azure
- Discover the various options, and the caveats, for upgrading your services
- Learn some tips and tricks on how to run your apps with zero downtime
Developing Applications for the Cloud – Code Samples
Brute Force Migration of Existing SQL Server Databases to SQL Azure
TechEd Australia Cloud Computing Videos
Storage Client Hotfix Release – September 2010
Pkcs12 Protected Configuration Provider
DAVID Systems GmbH announced on 9/10/2010 DAVID Systems shows Tri-Media Production in the Cloud based on Windows Azure:
DAVID Systems, a leading provider of global broadcast solutions, will present a preview of their upcoming tri-media production system at IBC 2010, in hall 7, at booth 7.G34, and in Microsoft Corp.’s booth in the Topaz Lounge. The new solution lets Radio and TV stations use cloud-based content storage and content-processing services leveraging Windows Azure, Microsoft’s cloud computing platform, and integrates seamlessly with Microsoft Office 2010 to support collaboration between journalists and editors.
Today, Radio and TV broadcasters need to reduce their costs and complexity of IT, while maintaining the capability to scale storage and server capacities in line with their business needs. At the same time, tri-media production and multichannel distribution have become key success factors for staying competitive.
DAVID Systems is addressing these needs
• by extending its new tri-media solution into the cloud, deploying broadcast-specific content management services on Microsoft’s cloud computing platform, Windows Azure, and
• by integrating its professional audio and video editing tools with Microsoft’s familiar business productivity environment, Microsoft Office.
The goal is to enable tri-media journalism across locations, devices and formats.
The new DAVID Systems tri-media solution can be integrated with the Microsoft Office System 2010 via the business connectivity services within Microsoft Office 2010. The Microsoft Solution Framework for Editorial Collaboration and Mobile Journalism for Office enables Radio and TV editors and journalists to share story proposals, manage resources, find relevant content, plan event coverage, and schedule outputs through a web-based user interface accessible from both Windows and Mac clients. DAVID Systems also offers the media management and audio/video editing functionality required to support tri-media production and distribution. Using DAVID Systems tools, journalists can ingest, view and edit video content, including voice-over, within a single application environment. As communication is key for creative work, the solution provides for integration with Microsoft Unified Communications, combining e-mail, instant messaging, Voice over IP (VoIP) and video conferencing in a consistent set of communication tools.
“Broadcasters are used to working with IT islands from production to playout. But in times where audiences expect cross-channel coverage and continuous updates, these islands hamper collaboration, delay processes, and increase cost,” says Vincent Benveniste, CEO, DAVID Systems GmbH. “Our new tri-media solution combines strong support for editorial collaboration with proven audio and video tools, and, for the first time, gives broadcasters the option of using cloud storage and cloud computing resources. Its web-based UIs are accessible from mobile platforms as well as Windows and Mac desktop clients, from anywhere in the world where journalists can connect to the Internet.”
“We are delighted to host DAVID Systems in our booth at IBC,” said Gabriele Di Piazza, managing director for the Media & Entertainment business in the Communications Sector at Microsoft. “DAVID Systems’ tri-media solution is an excellent example for how our partners are taking advantage of cloud computing by utilizing Windows Azure to address some of the key business challenges broadcasters are facing today: reducing cost and enabling collaboration across sites, teams, and channels.”
To arrange a live demo with DAVID Systems at IBC, please visit us in hall 7, booth 7.G34, or at the Microsoft booth in the Topaz Lounge. You are also invited to contact us at email@example.com for more information and to schedule a meeting.
For more information about DAVID Systems and its products and services, please visit the company online at www.davidsystems.com. Follow news from DAVID Systems at http://www.twitter.com/DAVID_Systems and join DAVID Systems on http://www.facebook.com/DAVIDSystems
All trademarks are the property of their respective owners.
• Paul Patterson (@PaulPatterson) explained Microsoft LightSwitch – Send an Email from LightSwitch with VB.NET code on 9/10/2010:
This afternoon I saw a post on the Microsoft LightSwitch forum in which someone asked how to send an email from LightSwitch. The conversation thread that resulted got my creative juices flowing and got me to thinking about where I might be able to use functionality like this.
For the last couple of weeks I’ve been playing, er, I mean working very, very hard at creating an application for my wife’s photography business. Using LightSwitch, I’ve created a simple application named MyPhotoWorkshop that does some customer management, order processing, and appointment scheduling. With the expectation that she will soon be earning buckets of money, I should be able to quit my day job and work for her as her secretary. Think of the benefits package!
So, I got to thinking. If I am taking calls all day at the office from prospective customers, and she is out schmoozing it up with clients, what would be the most effective way for me to let her know of any new appointments? I could call her, but then I would have to rely on her to remember the appointment. Being a busy lady and all, an email to her mobile phone might be more effective – something she could easily look up and recall.
Here is what I did…
I already had an Appointment table. This Appointment table has references to both a Customer table, and an Employee table.
The Appointment table with some relationships
In my Employee table, I have a field for the email address of the employee.
The Employee table (note the relationship to the Appointment table)
Also, in the Customer table, there are fields such as customer name, contact name, and phone number. Pretty straight forward stuff.
The Customer table
Now for the fun stuff. I know that LightSwitch runs as a Silverlight application. I also know that the application uses WCF RIA services. So, it should be just fine to add some logic that will use some .Net 4 stuff to fire out an email, right? I just bet that if I pop some code somewhere into my application, I can get that appointment out to my wife’s mobile phone.
And where do I do this? Well, the stuff that will be responsible for sending out the email has to run in a place that will let it run. I know that the Silverlight client will be running on a client computer somewhere, so I know I can’t send the email directly from there because of trust issues. However, I can send it from the server where the WCF RIA services are running, I think.
So, I go all control freak like and change the Solution Explorer for my project to File View.
Changing the solution explorer to File View
In the file view of the project, I head on down to the Server item and expand it. I then right click the UserCode node and select to add a new class.
Creating a new class
I then name my new class MailHelper.vb.
Sending an email is easy using .Net 4. The magical fairy dust is in the System.Net.Mail namespace. So I have to add that as a reference to the Server project.
I double click My Project within the Server project, select the References tab, and then click the Add… button.
Adding a Reference to the Server project.
I then locate and select the System.Net assembly and click the OK button.
Select the System.Net reference
Groovy! Now that I have the necessary reference, I can add a little logic to do some emailing.
Back in my MailHelper.vb class, I import a couple of namespaces, add some properties, and then tie it all off with a method. The result is a beautiful piece of code…
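The code itself appears in the post only as a screen shot, so here is a minimal sketch of what a MailHelper class along those lines might look like. The SMTP host, port, and credentials are placeholders you would replace with your own provider's settings, and the property and method names are assumptions:

```vb
Imports System.Net
Imports System.Net.Mail

Public Class MailHelper

    ' Properties that the calling code fills in before sending.
    Public Property SendTo As String
    Public Property SendFrom As String
    Public Property Subject As String
    Public Property Body As String

    Public Sub SendEmail()
        Dim message As New MailMessage(SendFrom, SendTo, Subject, Body)

        ' Placeholder SMTP settings -- substitute your provider's host,
        ' port, and credentials.
        Dim client As New SmtpClient("smtp.example.com", 25)
        client.Credentials = New NetworkCredential("username", "password")
        client.Send(message)
    End Sub

End Class
```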
Armed with a class within my Server project, I should be able to send an email from anywhere in my application. All I need to do is pass in some variables and ka’blamo, a message is sent out via the InterWeb.
So I am going to do exactly that. I am going to send an email message out when a new appointment is added. What’s more, I am going to send out an email to whoever is assigned the appointment.
I change back into the Logical View of my Solution Explorer.
Back to the Logical view I go
I open the Appointment table designer. Because I want to send an email when a new appointment is created, I drop down the Write Code button, and select the Appointments_Inserted method.
Selecting the Appointments_Inserted method stub
Selecting the Appointments_Inserted method opens up the ApplicationDataService.vb class of my application, with a stub for the selected method. Within this method I add the code I want to use to send an email when an appointment record is inserted into the appointments collection.
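That code is also shown only as a screen shot. A sketch of what the stub might contain follows; the entity and field names (Employee.Email, Customer.CustomerName, AppointmentDate) are assumed from the tables described earlier, and the MailHelper members used here are hypothetical:

```vb
' Sketch only: field names and MailHelper members are assumptions.
Private Sub Appointments_Inserted(entity As Appointment)
    Dim mail As New MailHelper()
    mail.SendFrom = "appointments@example.com"   ' placeholder sender address
    mail.SendTo = entity.Employee.Email
    mail.Subject = "New appointment: " & entity.Customer.CustomerName
    mail.Body = "An appointment has been scheduled for " & _
                entity.AppointmentDate.ToString()
    mail.SendEmail()
End Sub
```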
Now to see if it works.
I hit the F5 key to run the application in debug mode. I then add a new employee record to the application, making sure I use a valid email address for the employee.
An Employee with an email address
I then add a new Appointment record.
Adding the appointment
I click the save button. The appointment shows as added and saved, so I assume the email went out (I should really have done some better coding and added some validation, or a message or something).
Getting excited, I run upstairs and grab my wife’s blackberry and check it out…
…okay, that is way too cool!
So there you have it. I am the new office hero.
Beth Massi (@BethMassi) explained Using Microsoft Word to Create Reports For LightSwitch (or Silverlight) on 9/10/2010:
LightSwitch has a really nice feature on data grids that allows you to export them to Excel if running out of browser (full trust).
This gives you a basic way of getting data out of the system to create reports or do further analysis on the data using Excel. However, in business applications you typically want to provide some client-side reporting that also formats the data in a certain way so it can be printed easily. Microsoft Word is a great tool for this.
So the last couple days I’ve been playing around with COM automation in LightSwitch and using Word to create client-side reports based on LightSwitch entities. Ideally I’d like to use Word 2007/2010 Open XML features & Content Controls to generate these reports. I’ve written a lot about using Open XML in the past. It’s a great way to manipulate documents without having to have Office installed on the machine, which is especially handy for server-side generated documents. If you haven’t checked out the Open XML SDK yet, I highly recommend it. In a nutshell, Word, Excel and PowerPoint documents are just .ZIP files with XML inside. The Open XML SDK wraps the .NET System.IO.Packaging classes to make it easy to access the contents.
Unfortunately, the Open XML SDK and the System.IO.Packaging classes are not available in Silverlight, and that’s what we’re working with when we create a LightSwitch client. Fortunately, however, Silverlight now supports calling COM components, so we can use the Office object models to manipulate the documents on the client (or do anything else that this rich OM gives us). For this exercise I want to allow the user to create a “report” in Word by defining content controls on the document surface. As long as they name the controls according to the fields on the entity, we can easily add some custom XML into the document and bind the controls to that data from our LightSwitch (or Silverlight) client. This gives the end user the flexibility of creating reports, letters, memos or anything else you can do with Word. This is similar to a mail merge, except users can invoke it quickly right from within your LightSwitch application.
Creating the Template
To begin, I suggest downloading the Content Control Toolkit. Even though we’re going to bind the content controls in code, you can use this to play around with how you want your custom XML to look and bind. For instance, I created a template in Word that has some content controls laid out how I want them on the page in order to create a simple customer report. To add content controls to a document you first need to enable the Developer tab. You do this by opening up the File –> Options –> Customize Ribbon and making sure the developer tab is checked.
Now you can use the controls to lay out the fields on your document. Click Properties and enter a title that corresponds to the fields on your entity (in my case Customer). You can also lock the controls so that users cannot delete or edit them.
Here’s what my simple Customer report looks like in design mode:
Binding Data Manually with the Content Control Toolkit
Now all we need to do is add our custom XML with the actual data to the document and then bind that data to these content controls. We can do that dynamically in code (and we will), but you can also use the content control toolkit to do it manually. This is an option if you don’t want users to create the templates, instead you want to supply them with your application.
When you open the document with this tool you see a view of all the content controls on the left and any custom XML parts on the right. In fact, a document can store multiple custom XML parts separated by namespaces that you control, allowing you to create some pretty complex binding if needed. First you specify the XML data that you want to bind by clicking “Create a new Custom XML Part” under Actions on the bottom right of the tool. Then you can paste your XML into the Edit View. For this example we will use this simple set of data, which correlates to fields on my Customer entity (note that I’m using all lower case in the elements; XML is case sensitive!):

<ns0:root xmlns:ns0="urn:microsoft:ordermanager:customer">
  <customer>
    <lastname>Massi</lastname>
    <firstname>Beth</firstname>
    <gender>F</gender>
    <phone>5551212</phone>
    <email>firstname.lastname@example.org</email>
    <address1>1234 Main Street</address1>
    <address2></address2>
    <city>San Francisco</city>
    <state>CA</state>
    <postalcode>94115</postalcode>
  </customer>
</ns0:root>
Once you enter the custom XML into the Edit View, you can flip to the Bind View and then drag the XML elements onto the content controls to bind them (make sure you click twice on the element before dragging it). This sets up the XPath binding expressions.
Click Save and then you can open it back up in Word to see the report filled out with data. So what exactly did this do? I mentioned that Word, Excel and PowerPoint files are just .ZIP packages with XML inside. To see our custom XML part in the package, rename the .docx file to .zip, then look inside the customXml folder and open item1.xml to see the custom XML we entered above.
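Incidentally, you don’t even need the rename trick in code: a .docx can be read directly as a ZIP archive. Here’s a minimal Python sketch showing the idea (the path and part name are just the conventional ones from this example):

```python
import zipfile

def read_custom_xml(docx_path, part_name="customXml/item1.xml"):
    """Open an Office Open XML package (an ordinary ZIP archive)
    and return the text of one custom XML part."""
    with zipfile.ZipFile(docx_path) as package:
        return package.read(part_name).decode("utf-8")
```

Pointing this at your saved template should print the same custom XML you pasted into the Content Control Toolkit.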
Next we’ll set up a class to handle generating the data for this report. I’ll first show how we can distribute this already bound template with our LightSwitch application and populate it with customer data then I’ll show how we can populate a user-supplied Word document and bind the content controls dynamically in code.
Creating the Report Module
First we’ll need to write a class that encapsulates the report logic. Then we can call this from a button on our Customer screen to generate the report. To add your own class to the LightSwitch client you’ll first need to switch to File View in the Solution Explorer and then you can right-click on the Client project and add your class. I named mine MyReports.vb.
The first thing I’m going to do is rename the class to a Module, since I’m using Visual Basic, and put it in the same namespace as my application. This creates a Shared/static class for you so that it’s easy to call the report methods. I’m also going to need to import a couple of namespaces: one for the LightSwitch COM automation, and the XML namespace that we used in the custom XML part above. (We’ll need the XML namespace when we create the custom XML and bind to it dynamically later.)

    Imports System.Runtime.InteropServices.Automation
    Imports <xmlns:ns0="urn:microsoft:ordermanager:customer">

    Namespace OrderManagement
        Module MyReports

        End Module
    End Namespace
Generating a Report from the Fixed Template
Now that we have our report module set up we can write our code that creates the XML data and then stuffs it into the document. In this first example we’re going to use the template we created above that already has the content controls bound to the custom XML part.
First we need to check if the COM automation is available – meaning that the application is running out-of-browser with elevated permissions. Next we create an XElement with our customer data. In this case it’s really easy to use LINQ to XML and XML Literals to loop through all the properties of the customer entity dynamically. That way if we add new fields to our Customer entity, we don’t need to change this code at all. (If you’re new to LINQ to XML you can read a few articles I’ve written before here, particularly Getting Started with LINQ to XML.) The query uses the Details property on the entity to get at the collection of properties (fields) on the Customer. Then I create the XML element based on the property name, and then write out the value. If the value is blank then I want to put a dash (-) in the report.
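If it helps to see that looping idea outside of VB’s XML literals, here’s a hypothetical Python sketch of the same pattern (the field names are illustrative): each property name becomes a lower-cased element, with a dash substituted for blank values.

```python
import xml.etree.ElementTree as ET

def customer_xml(fields):
    """Build a <customer> element from a dict of field names/values,
    mirroring the LINQ to XML loop: element names are lower-cased,
    and blank values are replaced with a dash."""
    customer = ET.Element("customer")
    for name, value in fields.items():
        child = ET.SubElement(customer, name.lower())
        child.text = value if value else "-"
    return ET.tostring(customer, encoding="unicode")
```

Because the loop is driven by the field collection, adding a new field to the entity requires no change to this code, which is exactly the property of the VB version above.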
The next thing we do is create the Word automation object using the LightSwitch System.Runtime.InteropServices.Automation.AutomationFactory (if you are using plain Silverlight you would use System.Windows.Interop.ComAutomationFactory). We call CreateObject, passing it the name of the registered COM object to instantiate, in this case Word.Application. Then we can open the document and find the custom XML part by its namespace. Please note that this will throw an exception if the part isn’t found, so you’d better wrap all your automation code in a Try…Catch block. COM is all about late binding, so you won’t get any IntelliSense on the members either; keep the MSDN documentation on the Word object model close by. When you’re doing COM programming from .NET it’s helpful to set debugging breakpoints and then explore the members dynamically using the Immediate and Watch windows.
Once we get the custom XML part we can replace the <customer> node with our data. We do this using XPath to select the node we want to replace. (BTW, I am totally lame at XPath; that’s why I use the Content Control Toolkit to help me by looking at the bindings.)
Once we replace the custom XML with our data we can show the document to the user so they can print it.

    Public Sub RunCustomerReportFixedTemplate(ByVal cust As Customer)
        If AutomationFactory.IsAvailable Then
            Try
                'Create the XML data from our entity properties dynamically
                Dim myXML = <customer>
                                <%= From prop In cust.Details.Properties.All
                                    Select <<%= prop.Name.ToLower %>><%= If(prop.Value, "-") %></> %>
                            </customer>
                Using word = AutomationFactory.CreateObject("Word.Application")
                    Dim doc = word.Documents.Open("C:\Reports\CustomerDetails.docx")
                    'Grab the existing bound custom XML in the doc
                    Dim customXMLPart = doc.CustomXMLParts("urn:microsoft:ordermanager:customer")
                    Dim all = customXMLPart.SelectSingleNode("//*")
                    Dim replaceNode = customXMLPart.SelectSingleNode("/ns0:root/customer")
                    'Replace the <customer> node in the existing custom XML with this new data
                    all.ReplaceChildSubtree(myXML.ToString, replaceNode)
                    word.Visible = True
                End Using
            Catch ex As Exception
                Throw New InvalidOperationException("Failed to create customer report.", ex)
            End Try
        End If
    End Sub
Distributing the Report Template with the LightSwitch Client
Notice that in the above code we are hard-coding the path to where the template is located: “C:\Reports\CustomerDetails.docx”. Another option would be to include the report template in the client directly. LightSwitch creates a client XAP file located in your <project>\bin\Release\Web folder, and this is what is deployed to and runs on a user’s machine. To add the CustomerDetails.docx report template, right-click on the ClientGenerated project, select “Add Existing Item” to select the document, and then set the Build Action property to “Content” as shown below:
Now we can change our code above to read the file using GetResourceStream:

    Dim resourceInfo = System.Windows.Application.GetResourceStream(
        New Uri("CustomerDetails.docx", UriKind.Relative))
    Dim fileName = CopyStreamToTempFile(resourceInfo.Stream, ".docx")
    Dim doc = word.Documents.Open(fileName)
We just need a couple of helper methods in our MyReports module to write the resourceInfo.Stream to disk:

    Private Function CopyStreamToTempFile(ByVal stream As System.IO.Stream,
                                          ByVal ext As String) As String
        Dim path = GetTempFileName(ext)
        'Create the temp file
        Dim file = System.IO.File.Create(path)
        file.Close()
        'Write the stream to disk
        Using fileStream = System.IO.File.Open(path, System.IO.FileMode.OpenOrCreate,
                                               System.IO.FileAccess.Write,
                                               System.IO.FileShare.None)
            Dim buffer(0 To CInt(stream.Length) - 1) As Byte
            stream.Read(buffer, 0, buffer.Length)
            fileStream.Write(buffer, 0, buffer.Length)
            fileStream.Close()
        End Using
        Return path
    End Function

    Private Function GetTempFileName(ByVal ext As String) As String
        'Return a unique file name in My Documents\Reports based on a GUID
        Dim path = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments) + "\Reports"
        If Not Directory.Exists(path) Then
            Directory.CreateDirectory(path)
        End If
        Dim filename = Guid.NewGuid().ToString() & ext
        path = System.IO.Path.Combine(path, filename)
        Return path
    End Function
Generating a Report from a User-Defined Template
The above works nicely if we have fixed templates, but I want to allow users to create these templates themselves. However, I don’t want to make them do any of the binding. All I want them to do is lay out the content controls where they want them on the document surface and then set each control’s Title to the field name they want to appear. This means that our code will need to add the custom XML to the document and then dynamically bind it to the controls it finds.
To do that we first create the custom XML, but instead of starting with the <customer> element like the above, we need to create the entire tree starting with the <root>. By importing the XML namespace at the top of our code file, Visual Basic will automatically handle putting this XML into that namespace when it is generated. We just specify the namespace on the root element like <ns0:root> and the rest is the same as before.
Now after we open the report template document we need to make a copy of it; I’m doing that in My Documents\Reports. Then we can add the custom XML part and loop through all the content controls that aren’t already bound in order to set the binding to the correct XPath expressions. These are the same XPath expressions that you see in the Content Control Toolkit (that’s why it’s handy to install it). So if I find a content control with the title “LastName” then it will bind to the XPath /ns0:root/customer/lastname. We also need to specify the namespace and the part to bind to in the call to XMLMapping.SetMapping.

    Public Sub RunCustomerReportDynamicTemplate(ByVal cust As Customer)
        If AutomationFactory.IsAvailable Then
            Try
                Dim templateFile = Environment.GetFolderPath(
                    Environment.SpecialFolder.MyDocuments) + "\Reports\CustomerDetails.docx"
                'Create the XML data from our entity properties dynamically
                Dim myXML = <ns0:root>
                                <customer>
                                    <%= From prop In cust.Details.Properties.All
                                        Select <<%= prop.Name.ToLower %>><%= If(prop.Value, "-") %></> %>
                                </customer>
                            </ns0:root>
                Using word = AutomationFactory.CreateObject("Word.Application")
                    Dim tempFile = GetTempFileName(".docx")
                    File.Copy(templateFile, tempFile)
                    Dim doc = word.Documents.Open(tempFile)
                    'Add the new custom XML part to the document
                    Dim customXMLPart = doc.CustomXMLParts.Add(myXML.ToString())
                    'Bind any content controls that we find based on the title of the control
                    For i = 1 To doc.ContentControls.Count
                        Dim ctrl = doc.ContentControls(i)
                        If Not ctrl.XMLMapping.IsMapped Then
                            ctrl.XMLMapping.SetMapping(
                                "/ns0:root/customer/" + ctrl.Title.ToString.ToLower(),
                                "xmlns:ns0=""urn:microsoft:ordermanager:customer""",
                                customXMLPart)
                        End If
                    Next
                    word.Visible = True
                End Using
            Catch ex As Exception
                Throw New InvalidOperationException("Failed to create customer report.", ex)
            End Try
        End If
    End Sub
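The title-to-XPath convention that the binding loop relies on is simple enough to capture in one helper. Here’s a hedged Python sketch of it (the ns0 prefix and the customer path match this article’s example namespace, not a general rule):

```python
def binding_xpath(control_title, ns_prefix="ns0"):
    """Map a content control's Title to the XPath its XML mapping
    uses in this example: /ns0:root/customer/<title, lower-cased>."""
    return "/{0}:root/customer/{1}".format(ns_prefix, control_title.lower())
```

So a control titled “LastName” binds to /ns0:root/customer/lastname, which is why the user only has to get the Title right and never has to touch the bindings.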
Calling the Report Module from the Customer Screen
All that’s left is calling this baby from a button on our Customer screen. I want to put this on the screen’s command bar (the ribbon across the top), so in the Screen Designer expand the Screen Command Bar at the top, click Add to add a new button, and name it Print. Then right-click on it and select “Edit Execute Code”.
Then we can write code that calls the report and passes it the Customer entity on the screen. We can also edit the CanExecute code so that the button is only enabled for out-of-browser deployments.

    Private Sub Print_Execute()
        ' Write your code here.
        MyReports.RunCustomerReportDynamicTemplate(Me.Customer)
    End Sub

    Private Sub Print_CanExecute(ByRef result As Boolean)
        ' Write your code here.
        result = System.Runtime.InteropServices.Automation.AutomationFactory.IsAvailable
    End Sub
Now when we run the application, users can click on the Print button on the Customer screen and generate reports that they created in Word.
With COM automation available in Silverlight, a lot of possibilities open up for business applications that need to interact with Office. I hope I’ve shown you one practical way to get simple client-side reporting into a LightSwitch or Silverlight application.
To download LightSwitch & access instructional videos and articles please visit the LightSwitch Developer Center.
Visual Studio LightSwitch depends on Entity Framework v4 for data access, so the ADO.NET Team’s The EF Team Wants to Hear from You! post of 9/10/2010 is apropos:
Now that Entity Framework 4.0 has shipped, the team’s working hard to add more great features for the next release. As part of this process, it’s critical that we hear your feedback, as it helps us ensure that what we build actually meets your requirements in real-world scenarios. Today we’re launching a website that will allow you to interact more directly with the development team and provide input: http://ef.mswish.net.
The site’s pretty simple and self-explanatory. No sign-in is required, and everyone gets 10 votes that they can allocate across features. We’ve already pre-populated this list with items that were submitted on Connect and we’ve also transferred over votes. We hope you’ll try it out and vote on the features you most want to see added. Finally, as features move from ideas into actual development, the best place to join the discussion is on the Entity Framework Design Blog.
Bruce Kyle claimed Entity Framework Supports Your Way to Develop Your Data Tier in a 9/9/2010 post to the US ISV Evangelism blog:
For ISVs wanting to update legacy applications or for new green-field development, ADO.NET Entity Framework 4 has become a time-saving, cost-effective way to build out the data tier of your application. The Entity Data Model (EDM) provides a uniform way for you to work with data by specifying the data structure of a client application in terms of business concepts, namely entities and relationships.
Entity Framework 4 now supports:
- POCO Support: You can now define entities without requiring base classes or data persistence attributes.
- Lazy Loading Support: You can now load sub-objects of a model on demand instead of loading them up front.
- N-Tier Support and Self-Tracking Entities: Handle scenarios where entities flow across tiers or stateless web calls.
- Better SQL Generation and SPROC Support: EF4 executes better SQL and includes better integration with SPROCs.
- Automatic Pluralization Support: EF4 includes automatic pluralization support of tables (e.g. Categories->Category).
- Improved Testability: EF4’s object context can now be more easily faked using interfaces.
- Improved LINQ Operator Support: EF4 now offers full support for LINQ operators.
Your Choice: Database-First, Model-first, and Code-first
Entity Framework 4 and Visual Studio 2010 supports your choice of development styles.
- Database-first is where you construct your model layer on a design surface from an existing database.
- Model-first is where you create a conceptual model first and then derive a storage model, database, and mappings from it.
- Code-first is where you first define your model layer in code, and can then use it to generate the database schema.
If you want to do model-first development, EF4 offers a Generate Database Wizard to create the database and parts of the EDM (SSDL, MSL) from a conceptual model.
EF’s “code first development” support is currently enabled with a separate download that runs on top of the core EF built-into .NET 4. CTP4 of this “code-first” library shipped recently and can be downloaded here.
About ADO.NET Entity Framework
The ADO.NET Entity Framework enables developers to create data access applications by programming against a conceptual application model instead of programming directly against a relational storage schema. The goal is to decrease the amount of code and maintenance required for data-oriented applications. Entity Framework applications provide the following benefits:
- Applications can work in terms of a more application-centric conceptual model, including types with inheritance, complex members, and relationships.
- Applications are freed from hard-coded dependencies on a particular data engine or storage schema.
- Mappings between the conceptual model and the storage-specific schema can change without changing the application code.
- Developers can work with a consistent application object model that can be mapped to various storage schemas, possibly implemented in different database management systems.
- Multiple conceptual models can be mapped to a single storage schema.
- Language-integrated query (LINQ) support provides compile-time syntax validation for queries against a conceptual model.
Getting Started with Entity Framework 4
See Getting Started (Entity Framework) on MSDN.
Kathleen Richards of Redmond Magazine penned an article providing an overview of Entity Framework 4, entitled Cover Story: Get Ready for the Entity Framework.
Database-first. The Quickstart tutorial demonstrates how to build an Entity Framework application from an existing database.
Model-first. See Model First for step-by-step tutorial on how to generate a database from a model.
Code-first. To see how to use Entity Framework using code-first with an existing database, see Scott Guthrie’s posts:
David Linthicum asserted “The interest in private clouds and the growth of cloud providers are increasing demand for hardware” as a deck for his The numbers don't lie -- cloud computing boosts server sales post of 9/10/2010:
We've been reporting on server growth around cloud computing for a while. And now, according to IDC, worldwide server market revenues increased 11 percent in the second quarter. Clearly, this is not an aberration, but a counterintuitive trend.
These figures are driven partially by the worldwide economic recovery, but also by corporate interest in cloud computing, especially private clouds. It appears that cloud computing's initial promise of fewer data centers and servers is having the opposite effect on hardware purchases.
The reasons behind this are obvious. As made clear at VMWorld, many enterprises are looking into their options with private clouds, but instead of converting existing hardware to cloud approaches and technologies, they're making new purchases. Moreover, hardware vendors such as IBM and HP are offering "cloud bundles" or a prebuilt "cloud in a box," which are just server and software packages. These deals are especially tempting for enterprises that want to get their private clouds up and running quickly.
Cloud computing providers are also driving hardware growth. As their businesses expand, these providers are building out their infrastructure, and they need new servers to do so.
Ironically, cloud computing promised to reduce the number of servers we managed, through better sharing of existing hardware and by placing much of our compute and storage processing on public clouds. But then came the interest in private clouds, which seems to be running in parallel with the movement to public clouds, as well as the budgets to buy new hardware. Until now, enterprises and government agencies had been resisting the call to allocate money for new servers.
As a consequence, we're seeing decreased ROI from cloud computing. The cloud is supposed to help us become more effective and efficient with our hardware and software assets, but we're heading for strange times with the new emphasis on private clouds and, in effect, our individual data centers. Where we should be zigging, we're actually zagging.
K. Scott Morrison posted Public vs. Private Clouds on 9/10/2010:
Christian Perry has an article in Processor Magazine that I contributed some quotes to. The article is about the ongoing debate about the merits of public and private clouds in the enterprise.
One of the assertions that VMWare made at last week’s VMWorld conference is that secure hybrid clouds are the future for enterprise IT. This is a sentiment I agree with. But I also see the private part of the hybrid cloud as an excellent stepping stone to public clouds. Most future enterprise cloud apps will reside in the hybrid cloud; however, there will always be some applications, such as bursty web apps, that can benefit tremendously from the basic economics of public clouds.
See David Linthicum asserted “The interest in private clouds and the growth of cloud providers are increasing demand for hardware” as a deck for his The numbers don't lie -- cloud computing boosts server sales post of 9/10/2010 in the Windows Azure Infrastructure section above.
Bernard Golden addresses Data Compliance and Cloud Computing Collide: Key Questions in a 9/10/2010 article for CIO.com with this deck: On the crucial issue of data compliance, do you understand what you are responsible for versus your cloud service provider? One thing you don't want is a costly and labor-intensive manual audit mechanism, says CIO.com's Bernard Golden.
Forrester has been putting out really interesting reports on cloud computing lately. I discussed one of them in a recent post entitled "Cloud Computing: Whose Crystal Ball is Correct," which addressed the topic of private clouds. In that post, I examined Forrester's James Staten's point that implementing private cloud computing requires far more than buying vSphere and a few add-on modules—it requires standardization, process re-engineering, and organizational alignment.
This week brought another excellent report from Forrester, "Compliance with Cloud: Caveat Emptor," written by Dr. Chenxi Wang, exploring the challenges raised by the collision between data compliance requirements and cloud computing real-world offerings.
As Dr. Wang notes, most data compliance laws and regulations are written with an assumption that the liable party controls the infrastructure the data is stored on, as well as the placement decision about where that storage is located. Practically none of the laws and regulations recognize that a service provider may hold the data on behalf of the liable organization. Therefore, most compliance situations assign all of the responsibility to the user of a cloud computing environment despite the manifest fact that much of the control of the data is out of the hands of the user.
Several things about Dr. Wang's analysis stood out to me:
1. It may be easier to learn where an IaaS provider's data centers (and therefore, data storage location) are than for an SaaS provider. Google is identified in the report as not being able to state, definitively, where one's data is hosted or that its location will be restricted to any given region. Obviously, any opaqueness about location causes a real problem for users to ascertain if they are in compliance with applicable laws and regulations.
2. Only one law is identified as specifically recognizing the role of a service provider: HITECH for HIPAA. All other laws and regulations leave all of the responsibility with the user. At HyperStratus, we refer to this situation as asymmetric risk: despite the fact that compliance is a shared responsibility, most or all of the risk falls upon the user.
3. Those who trumpet that cloud providers accept responsibility for legal compliance measures overlook an obvious difficulty—cloud providers often don't know what data is being stored in their infrastructure and can't know what legal conditions apply to the data.
For a company like Amazon, the fact that someone can begin executing a cloud-based application with nothing more than a credit card and an account id means that it has no way to validate (or indeed, even understand) an application's compliance requirement. This is worth repeating—absent a discussion, there is no way for a cloud provider to have any idea what measures should be taken for compliance reasons—so insisting the cloud provider step up and meet compliance requirements may be unrealistic.
Lori MacVittie (@lmacvittie) asked “There’s a rarely mentioned move from 1024-bit to 2048-bit key lengths in the security demesne … are you ready? More importantly, are your infrastructure and applications ready?” as a preface to her F5 Friday: The 2048-bit Keys to the Kingdom post of 9/10/2010 to F5’s DevCentral blog:
Everyone has likely read about DNSSEC and the exciting day on which the root servers were signed. In response to security concerns – and very valid ones at that – around the veracity of responses returned by DNS, which underpins the entire Internet, the practice of signing responses was introduced. Everyone who had anything to do with encryption and certificates said something about the initiative.
But less mentioned was a move to leverage longer RSA key lengths as a means to increase the security of the encryption of data, a la SSL (Secure Socket Layer). While there have been a few stories on SSL vulnerabilities – Dan Kaminsky illustrated flaws in the system at Black Hat last year – there’s been very little public discussion about the transition in key sizes across the industry.
The last time we had such a massive move in the cryptography space was back when we moved from 128-bit to 256-bit keys. Some folks may remember that many early adopters of the Internet had browser support issues back then, and the performance and capacity of infrastructure were very negatively impacted.
Well, that’s about to happen again as we move from 1024-bit keys to 2048-bit keys – and the recommended transition deadline is fast approaching. In fact, NIST is recommending the transition by January 1st, 2011 and several key providers of certificates are already restricting the issuance of certificates to 2048-bit keys.
- Recommends transition to 2048-bit key lengths by January 1, 2011: Special Publication 800-57 Part 1, Table 4
- Started focusing on 2048-bit keys in 2006; complete transition by October 2010. Indicates the transition is to comply with best practices as recommended by NIST
- Clearly indicates why it transitioned to only 2048-bit keys in June 2010
- Also following NIST recommendations: TN 7710 - Entrust is moving to 2048-bit RSA keys
- “We enforced a new policy where all newly issued and renewed certificates must be 2048-bit.” Extended Validation (EV) required 2048-bit keys on 1/1/2009
Note that it isn’t just providers who are making this move. Microsoft uses and recommends 2048-bit keys per the NIST guidelines for all servers and other products. Red Hat recommends 2048+ length for keys using RSA algorithm. And as of December 31, 2013 Mozilla will disable or remove all root certificates with RSA key sizes smaller than 2048 bits. That means sites that have not made the move as of that date will find it difficult for customers and visitors to hook up, as it were.
THE IMPACT on YOU
The impact on organizations that take advantage of encryption and decryption to secure web sites, sign code, and authenticate access is primarily in performance and capacity. The decrease in performance as key sizes increase is not linear, but more along the lines of exponential. For example, though the key size is shifting by a factor of two, F5 internal testing indicates that such a shift results in approximately a 5x reduction in performance (as measured by TPS – Transactions per Second). This reduction in performance has also been seen by others in the space, as indicated by a recent Citrix announcement of a 5x increase in performance of its cryptographic processing. This decrease in TPS is due primarily to heavy use of the key during the handshaking process.
The impact on you is heavily dependent on how much of your infrastructure leverages SSL. For some organizations – those that require SSL end-to-end – the impact will be much higher. Any infrastructure component that terminates SSL and re-encrypts the data as a means to provide inline functionality (think IDS, load balancer, web application firewall, anti-virus scan) will also need to support 2048-bit keys, and if new certificates are necessary these, too, will need to be deployed throughout the infrastructure.
Any organization with additional security/encryption requirements over and above simple SSL encryption, such as FIPS 140-2 or higher, is looking at new/additional hardware to support the migration.
Note: There are architectural solutions to avoid the type of forklift upgrade necessary; we’ll get to that shortly.
If your infrastructure is currently supporting SSL encryption/decryption on your web/application servers, you’ll certainly want to start investigating the impact on capacity and performance now. SSL with 1024-bit keys typically requires about 30% of a server’s resources (RAM, CPU) and the increase to 2048-bit keys will require more, which necessarily comes from the resources used by the application. That means a decrease in capacity of applications running on servers on which SSL is terminated and typically a degradation in performance.
In general, the decrease we (and others) have seen in TPS performance on hardware should give you a good idea of what to expect from software or virtual network appliances. As a general rule, determine what level of SSL transactions you are currently licensed for and divide that number by five to determine whether you can maintain the capacity you have today after a migration to 2048-bit keys.
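That divide-by-five rule of thumb is easy to capture in code. Here’s a rough Python sketch (the 5x factor is the approximate slowdown cited above, not a measured constant for any particular hardware):

```python
def estimate_post_migration_tps(licensed_tps, slowdown_factor=5.0):
    """Estimate remaining SSL TPS capacity after moving from
    1024-bit to 2048-bit keys, using the ~5x slowdown observed
    in F5's (and others') testing as a planning approximation."""
    return licensed_tps / slowdown_factor
```

So an appliance licensed for 10,000 SSL TPS today should be planned as roughly 2,000 TPS after the migration; compare that figure against your peak load to see whether you still have headroom.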
It may not be a pretty picture. …
Lori continues with ADVANTAGES of SSL OFFLOAD, ARE YOU READY? and THIS IS IN MANY REGARDS INFOSEC’S “Y2K” topics.
Tim Negris asserted “User Ignorance, Provider Apathy, and Hidden Cost Make Cloud A Big Risk for Small Business” in a preface to his SMB Cloud Is A Hacker's Paradise post of 9/9/2010:
Cheaper, Easier, Scarier -
Small and medium-sized businesses are increasingly turning to cloud computing as an easier, cheaper alternative to in-house IT or shared and dedicated server hosting solutions. And, they are finding social media to be an accessible, inexpensive way to build brands, distribute content, and assist customers.
Correspondingly, cloud services and social networking providers are increasingly targeting the SMB segment for revenue they can't get from consumers and margins they can't get from large businesses.
Meanwhile, abetted by user ignorance, provider apathy, and the high cost of security solutions, hackers are turning to cloud computing and social media as an easier, cheaper alternative to botnets; and they are finding small business tenants and users to be accessible, inexpensive targets for crime and violence.
The biggest cloud security problems are largely unique to SMB and often ignored and downplayed by naïve or cynical cloud boosters, but they are real and growing faster than public awareness or available solutions. Consumers using public clouds primarily for storage and backup or using social networks for communication and file sharing are already pretty safe and getting safer. And, enterprises using private clouds for IT flexibility and efficiency or using social networks for crowdsourcing and brand building are also not facing particularly higher risks than with other, more established technologies.
However, small and medium businesses are increasingly using public clouds for business applications and commercial web sites, and they are using social networks for collaboration, communication and customer care. In so doing, they are leaving themselves open to a growing array of risks from an increasing number of sources. These include Distributed Denial of Service (DDoS) attacks, receiving and spreading trojans and other malware, criminal extortion, competitive dirty tricks, phishing and spoofing attacks, and more.
People in small businesses often think that hackers are only interested in attacking large companies and government agencies, but that is not true. Most large enterprises employ concentrated IT resources that are very difficult to hack and not of much value to most exploits.
Most hacking schemes benefit from the availability of large numbers of unprotected systems. In the past, this has mainly meant personal computers in homes and small businesses and, to a lesser extent, distributed embedded and industrial systems. But, now it also includes cloud-based virtual servers. One such server, closely connected to fast fiber, can do the dirty work of many compromised home computers. This makes the less secure cloud hosting infrastructure increasingly used by small businesses a very attractive target for hackers.
Attacks From the Cloud
At DEFCON 18, a computer hacking convention held last month in Las Vegas, one of the most talked-about presentations was one given by two young network security consultants and entitled Cloud Computing, A Weapon of Mass Destruction? The question mark in the title proved to be a bit gratuitous and the title overall only slightly hyperbolic.
As reported on the highly respected DarkReading security site (http://tinyurl.com/2d38kee), the presenters showed how, by spending $6 with a credit card "that could have been stolen" to deploy a simple computer program on a few virtual servers in the Amazon EC2 cloud, they were able to launch a DDoS attack that took a small financial services company, their client, off the internet for a long time. "With the help of the cloud, taking down small and midsize companies' networks is easy," one presenter said. "It's essentially a town without a sheriff."
DarkReading goes on to report that the presenters claimed they found no bandwidth restrictions in their Amazon agreement, that there was no apparent automated malicious-server detection operating in the cloud, and that complaints to Amazon by the test victim went unanswered. Amazon responded to DarkReading only in general terms, asserting that they have both detection and complaint-response mechanisms in place.
Infrastructure-as-a-Service clouds are a place where it is easy and inexpensive for attackers to set up and run DDoS and other types of attacks that, in many cases, go undetected for long periods of time.
Attacks In the Cloud
In the previous case, the attack originated from a few virtual servers within the cloud and was directed against a conventional web site on the internet. A more common case is where an attack originates on a conventional botnet and is directed against cloud-based web sites and services.
Many sites built with social media and content management services or software run on public cloud infrastructure and can suffer a variety of cloud security problems, owing to the inherent complexity in the interplay of multiple companies, programs and services. Such might be the case with, say, a commercial web site built using the WordPress open source content management system, run in the Rackspace public cloud, with an address resolved through a third-party DNS service. Some problems are technical, while others pertain to weaknesses in security management processes and accountability between the software and services companies.
Here is a technical problem example. Posterous is a web-based service that allows people and businesses to upload and share content with others through Posterous web pages, emails, and social networks. The Posterous service and the sites created with it run on the Rackspace cloud. Posterous was recently the target of two virulent DDoS attacks that forced them to take technical measures that included circumventing Rackspace's security provisions after those failed. Here are the real-time tweets from the incident.
"Our datacenter is experiencing heavy packet loss. We're on the line with Rackspace now." 12:32 PM
"Network issues have been resolved for now. We're working with Rackspace to determine the cause." 12:59 PM
"We're catching up on email queues and circling back to do everything we can to stay ahead of the attack." 2:37 PM
"We're back online and systems are operational. Fools can't hold us back! Still see problems? Please email us at email@example.com." 3:09 PM
The short duration of the event and sanguine tone of the last two tweets in the series belie the seriousness of the event and the impact it had on the company. Those are better reflected in the following text from an email sent by the Posterous CEO to his customers.
"On Wednesday and Friday, our servers were hit by massive Denial of Service (DoS) attacks. We responded quickly and got back online within an hour, but it didn’t matter; the site went down and our users couldn’t post.
"On Friday night, our team worked around the clock to move to new data centers, better capable of handling the onslaught. It wasn’t easy. Throughout the weekend we were fixing issues, optimizing the site, some things going smoothly, others less so.
"Just at the moments we thought the worst was behind us, we’d run up against another challenge. It tested not only our technical abilities, but our stamina, patience, and we lost more than a few hairs in the process.
"I’m happy to report Posterous is at 100% and better than ever. Switching to a new data center will help us avoid the type of attacks we saw last week, and the new, bigger, beefier servers will speed up the site and increase capacity. We were hit pretty hard, but we’ve come out stronger in the end.
"While we were certainly frustrated, we know that no one was more frustrated than you. Your website was down, and I humbly apologize for that. Know that throughout these six days, restoring your site and your trust has been our number one priority."
Now, here is an example of the personnel and policy clashes that often add to the problems of cloud security. Alison Gionotto is a technical author and experienced web developer who builds and manages web sites that include many built with WordPress and hosted in the Rackspace cloud. After suffering through many security problems with those sites, she happened to receive a form letter from Rackspace listing security tips that included this one:
"Many applications, like WordPress, have optional plugins developed by the community. Since these add-ons are often not as well vetted, it’s extremely important to carefully evaluate and manage third party application plugins, themes, or other functionality that is introduced to a running web application. Most hackers are exploiting these plug-ins."
That was apparently the wrong thing to say to Alison. She said it made her "brain melt," and it also prompted her to write and post a furious public open letter to Rackspace that, in addition to citing numerous specific Rackspace practices that make it difficult for her to do her job, included the following passages:
"This week, I have personally had to repair 11 WordPress websites hosted on the [Rackspace] Cloud that were hacked, all were running [the latest WordPress version] and had very few plugins in common. The plugins they do have in common, like WP-Supercache, are plugins Rackspace suggests to keep the CPU-cycle raping down to a minimum.
"I would like to know what Rackspace is doing to help developers isolate these issues?
"If I am going to continue hosting with Rackspace, I want to be assured that Rackspace is actually doing something to help us protect ourselves other than send emails that overstate the obvious.
"Your customers are under attack, and I want to know what you plan to do to help us protect ourselves and our clients, or I am taking my business to a company that values my time and reputation."
The Rising Cost of Safety and The Dropping Price of Mayhem
Cloud computing can significantly lower the regular and predictable costs of IT for small business, but, as the examples above show, it comes with a potential for unpredictable problems that can be very costly to fix and, in some extreme cases, can even kill a company. And, as bad as these problems were, they occurred in relation to some of the largest, most secure cloud service providers.
For every such major provider, there are many dozens of companies jumping into the cloud computing land rush who lack the scale of companies like Amazon and Rackspace. For these much smaller companies to be price-competitive with the bigger players, they must cut corners, often in the area of security software and personnel. Phrases like "bare bones" are code for "limited security".
Such offerings are creating opportunities for Security-as-a-Service providers, like Zscaler. They sell add-on security solutions directly to end users and also to cloud service providers, adding cost for customers, one way or another. But, any small business considering using cloud computing or relying on social networking cannot afford not to do everything they can to ensure security, even if it ends up costing more than they thought it would.
While the cost of prevention and protection will continue to rise as threats multiply, until better solutions are deployed en masse, the cost of making mischief continues to go down. For example, botnets of the type that attacked Posterous can be rented - by a competitor, disgruntled user, extortionist, or anybody else - for as little as $200/day for a 10,000 agent network. Or, the Eleonore Exploit Pack, a toolkit for exploiting browser flaws and spreading viruses, which was used recently to bring down a US Treasury Dept. site running in the Network Solutions cloud (another top-tier provider), costs only $700 and requires very modest programming skills to use.
The Real Solution is Warmware
Hackers, security consultants, cloud service providers, and experienced users all agree on one thing: security software and hardware are not enough. Safety in the cloud also requires warmware - geek speak for "people".
Even the smallest companies usually have somebody on staff who handles things like setting up email accounts and passwords or helping users with applications. Those people should be trained to be on the lookout for trouble and in what to do if and when it happens. There may be someone who manages the relationship with the company's cloud service provider, the social network accounts, the web site content, and so forth. Those people may think of themselves as accounting or creative types, but, as the people with their hands on "the stuff", they are in the best position to ensure that things are set up right for maximum security and that vendors are held accountable for doing their part.
Better still, even the smallest company should consider making cloud security management something that is purposefully budgeted and staffed. The Cloud Security Alliance (CSA) is a non-profit organization, founded and supported by a large number of technology vendors and service providers, that is dedicated to making cloud computing more secure and to helping users protect themselves.
The CSA coordinates the implementation of cloud security standards and provides educational resources for end-users, and last week announced a new certification program called the Certificate of Cloud Security Knowledge. It is a web-based training program and certification test designed to cultivate and certify cloud security management competence. See http://www.cloudsecurityalliance.org/ for more details. It is something worth looking into for any size of company using cloud computing technology. The test costs $195 until the end of the year and $295 thereafter, either way a bargain compared to the cost of ignorance.
Eric Nelson posted TRAINING: “Hardcore Entity Framework” in London with Julie Lerman, 22nd October on 9/10/2010:
Entity Framework 4.0 (EF4) is the strategic .NET ORM from Microsoft. Hmm… the “strategic” word. I will come back to that. However, the important bit of news is that we have Julie Lerman at Microsoft in London on the 22nd October devoting a full day to training on “Hardcore Entity Framework 4”.
It’s a rare visit so if you’re interested in learning about the data stack from a recognised world-wide authority and author on the topic then register now (£300).
Now back to that word “strategic”. As developers we are in a unique position. We could code everything up from scratch. We could code our own database, our own web server, our own ORM. However, we choose to use other people’s implementations of these to save time and effort – but taking dependencies on technologies can be a dangerous business. For an ISV with a product that will likely live in many organisations outside of their direct control it is even more important to only take smart dependencies. Dependencies on strategic technologies from vendors are often smart dependencies. However, the word “strategic” does tend to be a little overused in our industry, which is why I advise ISVs to also look at how the technology is being used by the vendor and whether there is a healthy eco-system forming around it. I am confident that Entity Framework is ticking both those boxes:
- Other products/technologies from Microsoft are taking dependencies on EF4 – it pops up in places such as WCF Data Services, WCF RIA Services, ASP.NET Dynamic Data, etc.
- We have a strong eco-system forming in the blog community, great skills in our training companies and consultancies and some first class books including “Programming Entity Framework” from Julie Lerman (You need the 2nd Edition which came out Aug 2010 to cover EF4)
I hope to pop in on the 22nd – maybe I will see you there.
You can’t tell EF4’s methods from its properties without a program, and Julie Lerman’s (@julielerman) just-published Programming Entity Framework, 2nd edition (O’Reilly, 2010) is a must-have for anyone serious about object/relational mapping (O/RM) in LightSwitch and other .NET technologies.
• Mary Jo Foley (@maryjofoley) reported “Microsoft still has no low-cost retort to Amazon in the cloud” in her Microsoft week in review: Installment one of iPhone-to-WP7 app-porting tutorial and more post of 9/10/2010:
… Amazon launched this week “Micro Instances” of Unix/Linux and Windows on its EC2 infrastructure. As Microsoft Azure expert Roger Jennings noted — in spite of Microsoft developers’ continued requests for a lower-priced, entry-level licensing plan from the Azure team — Microsoft doesn’t have anything like this.
I asked the Softies just to be sure and was told they did not have a counter (at least yet) to Amazon’s new pricing/licensing plan. Microsoft officials have said they are working to make virtual-machine roles available on Azure, via which users could host legacy Windows apps, but we’ll likely have to wait until PDC 2010 in late October to hear any more details on that option (which isn’t really the same as an instance, in any case). [Emphasis added.]
John Treadway claimed that Micro Instances Do Not a Web Host Make in this 9/10/2010 post to his CloudBzz blog:
Amazon’s announcement of Micro Instances this week is great news for web sites that need a lower-capacity instance type for simple operations or low-volume processes. Some people have equated Micro Instances with a VPS model, or specifically as competition to traditional mass-market web hosts.
A micro instance is not an offering that replaces a web host.
- Is there pushbutton deployment of WordPress or Drupal? No.
- Can you provision a FREE MySQL database as part of the service? No.
- Is there an easy-to-use cPanel-like front end? No.
- Do they have reseller accounts? No.
- Do they offer built-in POP, SMTP, mailboxes, FTP and other standard web host services? No.
You have to install all of that software manually, configure it, and make sure it stays running. GoDaddy does this for you, for $4.95/month.
It’s nice they have a cheaper option, but this doesn’t change Amazon’s fundamental service one iota.
Jeff Barr posted Amazon S3: Console Enhancements and Import/Export of 8TB Devices to the Amazon Web Services blog on 9/9/2010:
I've got two items that will be of interest to Amazon S3 users: console support for RRS notifications, and AWS Import/Export support for 8TB devices.
Console Support for RRS
A couple of months ago I blogged about the integration between Amazon S3 and Amazon SNS. This integration allows S3 to send a notification to an SNS topic if an object stored with Reduced Redundancy Storage is lost. You can now enable this feature for an S3 bucket by selecting Properties on the bucket's right-click menu.
Then you select the Notifications tab, enable notifications, and enter the name of an SNS topic.
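Under the covers, the console steps above correspond to a single S3 REST operation: a PUT to the bucket's ?notification subresource with a small XML document naming the SNS topic and the s3:ReducedRedundancyLostObject event. The sketch below just builds that XML body; the topic ARN is a hypothetical placeholder, and the signed HTTP request itself is omitted:

```python
# Sketch: build the XML body for S3's "PUT Bucket notification" operation,
# which is what the console's Notifications tab configures. The topic ARN
# below is a made-up example.
import xml.etree.ElementTree as ET

def build_notification_config(topic_arn):
    """Return the NotificationConfiguration document for PUT /?notification."""
    root = ET.Element("NotificationConfiguration")
    topic_cfg = ET.SubElement(root, "TopicConfiguration")
    ET.SubElement(topic_cfg, "Topic").text = topic_arn
    # The event S3 raises when a Reduced Redundancy Storage object is lost.
    ET.SubElement(topic_cfg, "Event").text = "s3:ReducedRedundancyLostObject"
    return ET.tostring(root, encoding="unicode")

body = build_notification_config(
    "arn:aws:sns:us-east-1:123456789012:my-rrs-alerts")  # hypothetical ARN
print(body)
```

In practice you would send this body with a library such as boto (or the console, as shown above) rather than hand-signing the PUT request yourself.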
AWS Import/Export Support for 8 TB Devices
The AWS Import/Export Service now supports devices with capacities of up to 8 Terabytes, reducing the number of devices needed for large data transfers. We've been working a lot with the LaCie 4big Quadra.
We are also interested in speaking with users of even larger devices such as the Micronet 10TB Platinum RAID SATA and the Data Robotics Drobo S. If you are interested in using a device that can hold more than 8 TB, please contact us at firstname.lastname@example.org.