Note: This post is updated daily or more frequently, depending on the availability of new articles.
•• Update 11/15/2008 5:00 PM PST: Additions to SDS/Cloud Computing section
• Update 11/14/2008 8:00 AM PST: Additions
• Matthieu Mezil explains How to simulate a 1 to 0..1 relationship when you must have a 1 to 1 using EF in his 11/14/2008 post. His approach automatically creates a missing entity to prevent exceptions when saving changes.
• Kim Major continues his series about Renaissance Computer Systems, Ltd.’s (Hashmonaim, Israel) use of a repository interface to the Entity Framework in his Testable Data Access With The Repository Pattern post of 11/13/2008.
Kim includes the code to implement an in-memory version of the repository, which is one of the topics covered in Chapter 4, “Working with Advanced Query Operators and Expressions” of my forthcoming Professional ADO.NET 3.5 with LINQ and the Entity Framework book from Wiley/WROX.
The first member of Kim’s series is the Data Access With The Entity Framework post of 11/12/2008.
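Kim’s posts contain his actual interface and in-memory implementation; the sketch below only illustrates the general idea with a hypothetical IRepository&lt;T&gt; of my own devising (the member names are assumptions, not Kim’s):

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical repository contract; Kim's actual interface differs in detail.
public interface IRepository<T>
{
    void Add(T item);
    void Remove(T item);
    IQueryable<T> Query();
}

// In-memory implementation backed by a List<T>, suitable for unit tests
// that shouldn't touch the Entity Framework or a live database.
public class InMemoryRepository<T> : IRepository<T>
{
    private readonly List<T> _items = new List<T>();

    public void Add(T item) { _items.Add(item); }

    public void Remove(T item) { _items.Remove(item); }

    public IQueryable<T> Query() { return _items.AsQueryable(); }
}
```

Because both the EF-backed and in-memory versions satisfy the same interface, test code can swap one for the other without changes.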
Matthieu Mezil’s How to split a data table? post of 11/13/2008 describes how to split a single table into two EDM entities to enable delayed loading of part of the table.
No significant new posts as of 11/13/2008 5:00 PM PST.
• Eric White’s 11/14/2008 post offers The SkipLast Extension Method that lets you skip the last n items of a LINQ to XML query resultset without performing an initial Count() operation.
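Eric’s post has his full implementation; a minimal sketch of the buffering approach (not his exact code) keeps at most n items in a queue so the sequence is enumerated only once:

```csharp
using System.Collections.Generic;

static class EnumerableExtensions
{
    // Skip the last 'count' items of a sequence without calling Count()
    // first: buffer up to 'count' items in a queue, and only yield an item
    // once 'count' newer items have been seen after it.
    public static IEnumerable<T> SkipLast<T>(this IEnumerable<T> source, int count)
    {
        var buffer = new Queue<T>(count + 1);
        foreach (T item in source)
        {
            buffer.Enqueue(item);
            if (buffer.Count > count)
                yield return buffer.Dequeue();
        }
    }
}
```

This matters for streaming sources (such as a LINQ to XML query) where a preliminary Count() would force a second full pass over the data.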
Rob Conery demonstrates the RAD capabilities of T4 templates by responding to a user’s request to add IEnumerable<T> overloads to the Add, Delete, and Update methods on the repository class in his SubSonic 3.0 Repository Template Update post of 11/13/2008.
Rob Conery’s SubSonic 3.0 Preview 2 post of 11/12/2008 describes his flagship O/RM tool’s LINQ to SubSonic features, such as LINQ queries that perform updates, inserts and deletions, as well as batched queries that return DataReaders to which you can apply the ToList<T> method. Rob also provides an IRepository<T> interface.
Another interesting feature is that v3p2 uses T4 templates to generate all except “core data access” code. If you don’t like Rob’s codegen, you can roll your own templates.
• Gil Fink expands on his earlier Service Operations post (see below) with Consuming Data Services Service Operations of 11/14/2008, which explains how to consume ADO.NET data services service operations from clients.
• Pablo M. Cibrano offers a walkthrough, Using the WCF OAuth channel with an ADO.NET service, based on his detailed OAuth channel for WCF RESTful services essay of 11/14/2008. Pablo says:
My WCF channel implementation for OAuth mounts on top of Alex Henderson’s OAuth library and it basically transforms an OAuth token into a .NET security principal that can be used later within the service implementation. The channel is implemented as a RequestInterceptor, one of the new features introduced in the REST WCF Starter Kit.
• Gil Fink’s Service Operations – Adding Business Logic to a Data Service post of 11/14/2008 explains service operations and how to use them to add business logic to ADO.NET Data Services.
Wally McClure’s ASP.NET Podcast Show #127 - Dynamic Data post of 11/10/2008 includes source code for the Global.asax file discussed by the participants.
This demo stores about 200 original images (plus another 400 thumbnail and preview versions) along with JS and CSS files directly in SDS storage (about 80 MB in all). A small C# ASP.NET handler accepts anonymous requests, processes them and, when needed, initiates authenticated requests to an SDS-Proxy that then talks directly to SDS servers to get the images, scripts, and stylesheets.
You can test-drive the demo at http://amundsen.com/examples/sds-photos/.
Jim Nakashima warns that Visual Studio "Publish" of a Large Windows Azure Cloud Service may Fail if the size of the package is too large (11/15/2008). The workaround is to use the cspack.exe command-line packing utility.
•• Sriram Krishnan’s Compressed GZip content from Windows Azure blob storage post of 11/15/2008 observes:
Windows Azure currently doesn't compress uncompressed data on the fly for you. However, there's nothing stopping you from storing the data compressed in the first place. The key is to set the Content-Encoding header to 'gzip' when uploading the blob so that when the blob storage serves it back out, clients know that the content they're getting is compressed and know how to deal with it.
and posts the sample code required to compress blob files for storage.
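See Sriram’s post for the actual upload code; the fragment below sketches only the compression step, leaving the storage-client call (and the assignment of the blob’s Content-Encoding property to "gzip") to the reader:

```csharp
using System.IO;
using System.IO.Compression;

static class BlobGzip
{
    // Gzip a byte array before uploading it as a blob. The blob's
    // Content-Encoding property must then be set to "gzip" (via the
    // storage client's blob properties, not shown here) so clients
    // know to decompress the content they receive.
    public static byte[] GzipBytes(byte[] raw)
    {
        using (var output = new MemoryStream())
        {
            // Closing the GZipStream flushes the final compressed block.
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
                gzip.Write(raw, 0, raw.Length);
            return output.ToArray();
        }
    }
}
```

Because the decompression happens in the browser, this trades a one-time compression cost at upload for smaller transfers on every download.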
•• Ryan Dunn has a fix for the Azure Services Training Kit’s missing AjaxControlToolkit.dll in his Fixing the SDS HOL from Azure Training Kit post of 11/14/2008. See David Aiken’s announcement later in this section.
• Shan McArthur wonders How can we use Live ID for web authentication in the Azure deployment model? in his detailed thread in the Windows Azure forum. It appears from Shan’s post that the cards are stacked against Live ID authentication in the current Azure Services version.
• Aaron Ricadela analyzes VMware's Lofty Cloud Computing Goals for Paul Maritz’s forthcoming Virtual Data Center Operating System in this Business Week article of 11/13/2008. The VDC OS appears to me to be an Amazon EC2 clone with an on-premises option.
Roger Jennings’ unfinished Azure Storage Services Test Harness: Table Services 1 of 11/13/2008 describes a test harness that’s currently undergoing feature bloat. The post provides a table that compares execution times for various Table Services operations on data uploaded from the Northwind Customers table (instead of trivial Contacts or blog Items/Comments tables).
You’ll be surprised at how close the performance of Azure Storage and Fabric is to that of the same operations run with Development Storage and Fabric over a relatively slow DSL connection to the Windows Azure service.
David Aiken announces fixes to the Azure Services Training Kit in his Hands on Labs Updated post of 11/13/2008. The update contains bug fixes to the original kit, which includes labs for Windows Azure as well as .NET Services, SQL Services, and Live Services.
Gus Perez’s Windows Azure Links page is:
[A] single page I update with links to good info on Windows Azure from other peers at Microsoft as well as those coming from the community instead of posting individual entries every once in a while.
As of 11/13/2008, the OakLeaf Blog was one of Gus’s six “Community” sites.
John Foley brings folks up to date on IBM’s recent cloud-computing machinations with his IBM Turns To Cloud Management post of 11/13/2008 to InformationWeek’s “Plug into the Cloud” blog.
I've gotten [the project] to the point where I can authenticate against a storage endpoint, be it development storage on your local machine or the *.core.windows.net endpoints in the sky. I spent some time today implementing the basic blob primitives (list containers, get/put blob), but there is a long way to go before it is usable.
Jim Nakashima’s Using the CloudDrive Sample to Access Windows Azure Logs post of 11/12/2008 provides step-by-step instructions for copying log files to Blob Storage services and using PowerShell and the CloudDrive sample to copy them to a local folder.
John Spurlock has updated his SpaceBlock front-end to handle S[S]DS and Azure Blob storage, in addition to Amazon S3 and Nirvanix service accounts. You can install the project with ClickOnce or download source code from the CodePlex.SpaceBlock site.
No significant new posts as of 11/13/2008 3:00 PM PST.
Simon Segal’s WCF Transactions - Treat with care post of 11/13/2008 is a detailed treatise on the issues that arise when you combine WCF transactions with long-running services.