
Tuesday, December 08, 2009

Windows Azure and Cloud Computing Posts for 12/7/2009+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.


• Update 12/8/2009: Windows Azure Blog: Windows Azure Joins Windows Server in a New Organization: Server & Cloud Division; Jim O’Neill: Discovering “Dallas”; Ben Riga: Azure Lessons Learned: Kelley Blue Book; Allan da Costa Pinto: I’d Like To Teach The World to Cloud – Windows Azure Present and Future; David Linthicum: Top 5 cloud computing predictions for 2010; Jayaram Krishnaswamy: New 'Upgrade' feature in Windows Azure [is] quite nice; Lori MacVittie: Silos Belong on Farms Not in Clouds; Tom Bailey: Take the Windows Azure Challenge and Get Paid; Steve Marx and Scott Hanselman: Live at PDC09: Steve Marx; Dan Teodor: ESRI Releases MapIt 1.1; Neil MacKenzie: Diagnostics in Windows Azure and Custom Diagnostics in Windows Azure; and Ryan Dunn: Windows Azure Service Management CmdLets.

Note: This post is updated daily or more frequently, depending on the availability of new articles in the sections below.

To use the links above, first click the post’s title to display the single article to which you want to navigate.

Cloud Computing with the Windows Azure Platform was published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page.
* Content for managing DataHubs will be added when Microsoft releases a CTP of the technology.

Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009, SharePoint Nightmare: Installing the SPS 2010 Public Beta on Windows 7 of 11/27/2009, Unable to Activate Office 2010 Beta or Windows 7 Guest OS, Run Windows Update or Join a Domain from Hyper-V VM (with fix) of 11/29/2009, and Installing SharePoint 2010 Public Beta on a Hyper-V Windows 7 VM Causes Numerous Critical and Error Events of 12/2/2009.

Azure Blob, Table and Queue Services

Bruno Terkaly describes Deploying Azure apps in 1000 words or less in this illustrated tutorial of 12/6/2009 that starts with your receipt of an email with an invitation code and ends with promoting an Azure application that uses Azure Data Services from staging to production.

See Dan Teodor’s ESRI Releases MapIt 1.1 in the next section.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Dan Teodor announced ESRI Releases MapIt 1.1 on 12/8/2009:

ESRI MapIt version 1.1 is now available. The latest release includes new capabilities, enhancements, and functionality that allow users to better visualize the connection between tabular data and its geographic properties. It also includes Microsoft Windows Azure and SQL Azure support for developing applications in the cloud.
Download a free 60-day evaluation of MapIt 1.1

<Return to section navigation list> 

AppFabric: Access Control, Service Bus and Workflow

See Ron Lai’s Novell to extend identity management to cloud, virtualized apps article in the Cloud Computing Events section.

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

Neil MacKenzie digs deeper in his Custom Diagnostics in Windows Azure second post of 12/8/2009:

This post follows on from the previous post on Diagnostics in Windows Azure [see below], which should probably be read before this one.

Windows Azure Diagnostics supports the ability to integrate custom logging with the provided diagnostics handling. This is achieved by the creation of a custom file-based data buffer configured through the DirectoriesBufferConfiguration and DirectoryConfiguration classes. DirectoriesBufferConfiguration is declared:

public class DirectoriesBufferConfiguration : DiagnosticDataBufferConfiguration {
    // Constructors
    public DirectoriesBufferConfiguration();

    // Properties
    public IList<DirectoryConfiguration> DataSources { get; }
}
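A minimal sketch of how such a custom buffer is typically wired up with the November 2009 SDK’s DiagnosticMonitor API (the local path, container name, quota, and transfer period below are illustrative, not prescriptive):

```csharp
using System;
using Microsoft.WindowsAzure.Diagnostics;

public class CustomLoggingSetup
{
    // Typically called from a role's OnStart() method.
    public static void ConfigureCustomLogging()
    {
        DiagnosticMonitorConfiguration config =
            DiagnosticMonitor.GetDefaultInitialConfiguration();

        // Register a local directory as a custom data source; the diagnostic
        // monitor periodically copies its files to the named blob container.
        config.Directories.DataSources.Add(new DirectoryConfiguration
        {
            Path = @"C:\CustomLogs",   // illustrative: directory the app writes to
            Container = "custom-logs", // illustrative: destination blob container
            DirectoryQuotaInMB = 100   // cap on local disk usage
        });
        config.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

        DiagnosticMonitor.Start("DiagnosticsConnectionString", config);
    }
}
```

The application then simply writes log files into the registered directory; the scheduled transfer moves them to blob storage without further code.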

Neil MacKenzie introduces Diagnostics in Windows Azure in this detailed post of 12/8/2009:

Microsoft introduced a set of classes supporting diagnostics monitoring in the November 2009 Azure Services SDK. The classes are composed into two namespaces:

The Diagnostics namespace comprises classes for the configuration and collection of diagnostic information. The Diagnostics.Management namespace comprises classes supporting local and remote management of the collection of diagnostics information and its transfer to persistent storage in Azure Tables and Blobs. This post is focused on the Diagnostics namespace and a subsequent post will consider the Diagnostics.Management namespace.

MSDN has a good section on Implementing Windows Azure Diagnostics for Hosted Services that includes the following sections:

Matthew Kerner gave an excellent presentation on Windows Azure Monitoring, Logging, and Management APIs at PDC 2009. The November 2009 Update of the Windows Azure Platform Training Kit has a good hands-on lab, Windows Azure: Deploying and Monitoring Applications in Windows Azure, with Example 3 being Monitoring Applications in Windows Azure. Sumit Mehrotra, from the Azure team, has a couple of posts – here and here – looking at Azure Diagnostics. …

• Maarten Balliauw describes his PHP-oriented visit to the Microsoft Web Development Summit 2009 on 12/7/2009:

  • Did a session on Windows Azure SDK for PHP, PHPExcel and PHPLinq.
  • Did an interview for the Connected Show
  • Met a lot of people I knew from Twitter and e-mail, and met a lot of new people, both Microsoft and PHP community. Nice to meet you all!
  • Event focus was on feedback between Microsoft and PHP community, overall I think the dialogue was respectful and open and helpful to both parties. …

He also posted the slide deck from his recent Cloud computing and the Windows Azure Services Platform (KU Leuven) presentation:

It was a fun session yesterday at KU Leuven university! I did a session on cloud computing and Windows Azure there for the IEEE Student Branch Leuven.

Abstract: "This session covers the basics of the Windows Azure Services Platform and drills into some architectural challenges. Learn what components the Windows Azure Services Platform is built of and how they can be leveraged in building a scalable and reliable application."

Jayaram Krishnaswamy says the New 'Upgrade' feature in Windows Azure [is] quite nice in this 12/7/2009 post:

You created a Hello World sample like I did and are wondering what if you wanted to run another sample with your account. One token in the CTP gives you just one hosting service. You have the option to delete your service and create a new service.

I was not prepared to do this because I was not sure whether I still retain the ability to create a new service.

I went back to my Hello World project, made changes to the Default.aspx, then rebuilt the project and published it again. This time, in the Windows Azure portal, I chose to make an upgrade in the staging area. It took quite some time (while I was holding my breath). But finally it did work as intended. Quite good.

OK. Now, can you make it a little faster?

Ben Riga announced a new Channel9 video in his Azure Lessons Learned: Kelley Blue Book post of 12/7/2009:

… As I mentioned, over the past few months I’ve been working on a number of activities related to the Windows Azure Platform.  In particular, I’ve been working with several partners as we prepared for the PDC’09 conference.  While we were preparing for the conference we welcomed a few partners to a deep-dive event in Redmond where they did some architectural reviews and met with various members of the product team in the final sprint to releasing solutions.  While they were in Redmond I took advantage to record a few videos for Channel 9.

Kelley Blue Book stands out as they were featured on the main stage during Bob Muglia’s keynote on day 1 of PDC (Andy comes on at about 1:27). In this video, Andy Lapin, Director of Enterprise Architecture, spends a few minutes showing off the site and then discussing some of the lessons the KBB team learned as they ported their site from a hosted facility to the Windows Azure Platform: Channel 9: Azure Lessons Learned: Kelley Blue Book.

• Jim O’Neill’s Discovering “Dallas” post of 11/30/2009 provides a guided tour through Microsoft’s new Azure-based data repository:

“Dallas” is essentially a repository of data repositories, a service - built completely on Windows Azure - that allows consumers (developers and information workers) to discover, access, and purchase data, images, and real-time services. 

“Dallas” essentially democratizes data, enabling a one-stop shopping place (via PinPoint) for all types of premium content.  With “Dallas” one can opt in to a pay-as-you-grow type model, facilitating access to data that may have previously only been accessible via expensive subscriptions directly with the data provider.

Developers can access “Dallas” via REST-based APIs and Atom feeds or in raw format (as many of the content providers had made available pre-“Dallas”).  The web-accessible “Dallas” Service Explorer (shown below) allows the consumer to explore the data as well as the HTTP URLs that are constructed and executed to retrieve the data set based on the user-provided parameters. …

"Dallas" Service Explorer
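As a hedged sketch of what consuming a “Dallas” data set might look like from .NET: the request URL, account-key header name, and key value below are placeholders — the Service Explorer generates the actual request URL and credentials for each subscribed data set:

```csharp
using System;
using System.Net;
using System.Xml.Linq;

class DallasFeedSample
{
    static void Main()
    {
        // Placeholder URL and key -- copy the real request URL and account
        // key from the "Dallas" Service Explorer for your data set.
        const string url =
            "https://api.sqlazureservices.com/SomeProvider/SomeDataSet?$format=atom10";

        using (var client = new WebClient())
        {
            client.Headers.Add("$accountKey", "<your-account-key>");

            // The response is an Atom feed; list each entry's title.
            XNamespace atom = "http://www.w3.org/2005/Atom";
            XDocument feed = XDocument.Parse(client.DownloadString(url));

            foreach (XElement entry in feed.Descendants(atom + "entry"))
                Console.WriteLine((string)entry.Element(atom + "title"));
        }
    }
}
```

Because the payload is standard Atom, any feed-capable client (LINQ to XML, SyndicationFeed, or a non-.NET stack) can consume it the same way.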

• Ryan Dunn posted Windows Azure Service Management CmdLets to CodePlex on 11/17/2009. Ryan describes them as:

A set of PowerShell cmdlets wrapping the Windows Azure Service Management API. These cmdlets make it simple to script out your deployments, upgrades, and scaling of your Windows Azure applications.

I missed Ryan’s contribution at the time he posted it.

Kathy Mahdoubi reports Imaging Forecast: Cloud Computing Takes RSNA by Storm in this 12/7/2009 post to DotMed News [RSNA = Radiological Society of North America]:

It's been rolling in for a few years, and now an accumulation of clouds -- data clouds, that is -- has taken over the health care industry. In particular, at this year's RSNA technical exhibit, a number of health care IT companies brought out their newest fix for this off-site approach to data and image management.

What is cloud computing? The term means that end users no longer have to know where their data resides. It is all "up in the air" in a sense. No longer are hospitals, imaging centers and private physicians' offices tied to on-site hardware to access the data and imagery required to get through their daily caseload.

"The forecast at RSNA is that everything is about the cloud, because the cloud represents a new tier of data management," said InSite One's Mitchell Goldburgh, senior vice president of marketing and business development. "With the advent of the cloud, it's all virtual. It's like when you do a Google search for a restaurant. You don't care where that data is coming from as long as it is the data you need." …

David C. Kibbe, MD and Brian Keppler, Ph.D. observe “Organizations like Microsoft, Google, Salesforce, Covisint, IBM, Intel, and Amazon not only are marshaling their forces to create new health care products, but have the resource bases and very deep IT infrastructures required to rapidly scale the kind of effort that will be required in a sector as vast and sophisticated as health care” in their 2009: A Year of Surprises and Change for the EHR Technology Market retrospective analysis of 12/6/2009 that covers the following events:

    • Payment for Meaningful Use of EHR technology, not for the software and hardware itself.
    • It's become PC to ask tough questions about EHRs, quality, and health care costs
    • CCHIT's loss of invulnerability and the displacement of its monopoly on EHR certification
    • The Power Shift Away from Legacy HIT Firms
    • Interest in HIT by Big Technology Companies

A very interesting forecast on the direction in which health information technology (HIT) is headed.

Karsten Januszewski’s Adventures With Windows Azure Diagnostics post of 12/4/2009 to the Mix Online blog begins:

I recently took the Azure plunge, in preparation for an upcoming Mix Online Service we will be announcing soon. Things went relatively smoothly, but I hit few gotchas, especially when trying to get diagnostics and logging working.  Below is a chronicle of said gotchas, with tips on how to resolve them.

I first came across Karsten when he was spearheading Microsoft’s ill-fated UDDI program for XML Web services.

Carl Brooks asserts Cloud computing is ready for personal health records, healthcare is not in his 12/2/2009 part 1 of a two-part series for SearchCloudComputing.com:

Vacationing in exotic Morocco, you fall and break your arm on one of the shady streets of Casablanca's Old Quarter. Speaking neither French nor Arabic, you are desperate to inform the doctors at the nearest hospital that you are diabetic and due for insulin any minute, along with concerns about your allergies to several common medications and materials such as penicillin and latex.

The attending physician has nothing to go on but your passport and an Internet connection -- and that's more than enough. Before prescribing or treating you, she pulls up a medical dossier that contains all pertinent information, including a calendar of necessary treatments and all the medication you've ever taken.

You get the insulin you need, the doctor wears nitrile gloves and your X-ray and a clinical note in French are uploaded to your file when you are finished. Your doctor back home gets an email about the change, and you meet for a follow-up when you return. That's the promise of personal health records (PHRs), and proponents say cloud computing could help make them a reality. …

Of course, this outcome assumes that the attending physician can read English.

Joseph Goedart asks Are HIEs the Answer? in this 12/1/2009 post to the Health Data Management Magazine site [HIE = Health Information Exchange]:

The numbers are daunting. Nearly two decades after the advent of community health information networks and more than five years after the Bush Administration started pushing for electronic health records and health information exchanges, only 28 states have one or more operational HIEs. And operational doesn't mean everyone in a region, much less a state, is active in the HIE.

In a nation of 300 million residents over 3.5 million square miles, there are 193 HIEs in various stages of development, according to eHealth Initiative, a Washington-based industry advocacy organization. By self-attestation, 57 of the HIEs are operational. Most HIEs don't have a sustainable business model, and getting a critical mass of regional stakeholders to cooperate in exchanging their data remains an extremely difficult proposition. …

HealthBridge's Robert Steffel has been impressed with the Obama Administration's health information technology strategy and the massive amount of stimulus funding that goes with it. "There has been a focused, diligent and honest effort to figure out how to do this and do it well." But Steffel doesn't believe all the money will be spent well. "HIEs are very hard to do. Money's not the hard part; it certainly is a barrier, but collaboration is the hard part."

Others say it's too early to know whether the money being thrown at HIEs and other I.T. initiatives will be well spent. "My concern is there's a lot of money going into the system very quickly and the system isn't used to that, so there's a lot of opportunity for mistakes to be made," says [John] Moore of Chilmark Research. …

Mitch Wagner asks “Microsoft HealthVault and Google Health want to be the repository of choice for millions of personal health records. Are they up to the task?” in his feature-length Microsoft, Google Face Off On Healthcare article for the InformationWeek HealthCare newsletter:

Microsoft and Google are taking their rivalry to the doctor's office, running competing services that allow people to store their medical records online for access by family members and healthcare providers.

Google Health and Microsoft HealthVault are similar approaches: They let patients input their own medical data either by typing it in or by giving permission for the vendor to get the information from a healthcare provider or insurer with which it's partnering. Google Health and Microsoft HealthVault then provide tools for those partners to give the patient personalized health advice and other services built around the person's records.

These "personal health records"--PHRs for short--complement electronic medical records. Both types of records contain a lot of the same information on the patient's conditions, test results, prescriptions, and other medical data. But PHRs are compiled and controlled by the patient, while EMRs are compiled and, for the most part, controlled by the doctors, hospitals, and other healthcare organizations.

Fred Schulte reports Switch to Electronic Records Getting Mixed Reviews at Hospitals and Clinics on 11/24/2009 for the Huffington Post Investigative Fund and American University's Investigative Reporting Workshop:

More than five years ago, one of California’s leading hospitals decided to leap into the future of medical care by digitizing its patients’ health records. Despite a $50 million investment and countless hours trying to overcome persistent technical headaches, the system is still not fully up and running.

This summer, the University of California San Francisco Medical Center quietly wrote off more than a third of the money it has spent, terminated its contractor and prepared to start part of the project from scratch. …

“Our basic position is that the current products cannot meet our quality, safety or efficiency needs,” said Kendall Rogers, an internist at the University of New Mexico. He chairs an information technology task force for the Society of Hospital Medicine, a doctors’ group whose members work primarily in hospitals. …

In the case of the University of California San Francisco, the hospital ended its contract with General Electric “based on overall delay in getting an integrated system in place and fully functional,” a spokeswoman said.

Despite the problems, many at the hospital remain committed to the idea of electronic records. Glitches are worrisome but the products are “getting better as more people use them,” said Robert M. Wachter, a professor of medicine and an expert on patient safety issues. …

General Electric is one of the leading vendors of “legacy” electronic health record (EHR) software.

<Return to section navigation list> 

Windows Azure Infrastructure

The Windows Azure Group and the Windows Server & Solutions Group announced on 12/8/2009 at 8:14 PM PST that Windows Azure Joins Windows Server in a New Organization: Server & Cloud Division:

Microsoft is announcing today a new organization within the Server & Tools Business (STB) that combines the Windows Azure group with the Windows Server & Solutions group.  This new organization is called the Server & Cloud Division. …

This move better aligns our resources with our strategy – creating a single organization focused on delivering solutions for customers that span on-premises data centers and the cloud.

As our new teammates stated on their blog today, SCD comprises the following:

  1. The Windows Azure Development team will move from under Chief Software Architect Ray Ozzie to the Server & Tools Business, led by Bob Muglia, President, Server and Tools Division.  Senior Vice President Amitabh Srivastava will lead the newly formed SCD, reporting to Bob.
  2. The Windows Server and Solutions group, led by Corporate Vice President Bill Laing, will join the Windows Azure team to form the Server & Cloud Division.  Bill will report to Amitabh and will continue his role as a key member of the STB leadership team.  Bill will partner with Amitabh to continue the bilateral sharing of technology between Windows Server and Windows Azure, which has been a key design goal of Microsoft’s software + services strategy.
  3. The Windows Azure Business and Marketing team will continue to be led by Doug Hauger.  Doug will join the Server and Tools Marketing Group, led by Corporate Vice President Robert Wahbe, reporting to Corporate Vice President Bob Kelly, who is also responsible for Windows Server, System Center, and Forefront.

As SCD, together with our colleagues in Windows Server, we’ll ensure that customers get the full benefit of Microsoft offerings that span Microsoft’s public cloud, on-premises solutions, private clouds, and clouds that our partners host.

The Windows Azure team is excited to be joining the Server & Tools Business and working together with the Windows Server team to deliver our customers the best of software + services.

My take: Bob Muglia 7, Amitabh Srivastava 3, Ray Ozzie 0.

• Mary Jo Foley adds more background details about the reorganization in her Microsoft combines Windows Server and Azure to form new Server & Cloud unit post to her All About Microsoft blog:

It’s not a surprising move, but as of December 8, it is an official one: Microsoft has moved its Windows Azure business into the company’s Server and Tools Business (STB) unit.

The move sets the stage for Microsoft to strengthen its story that it will offer customers a range of solutions, from private cloud to public cloud ones. The actual private-cloud offerings from Microsoft are still not publicly available, but sound like they’re coming together slowly, based on some early info company officials shared in November at the Microsoft Professional Developers Conference (PDC).

Prior to today, the Azure team reported directly to Microsoft Chief Software Architect Ray Ozzie.

Now, while Microsoft President Bob Muglia will continue to run STB, Microsoft’s Windows Azure chief Amitabh Srivastava will be reporting directly to Muglia. Srivastava will be running a new unit that combines Windows Server and Windows Azure (codenamed RedDog) into the newly minted Server & Cloud Division. Windows Server Corporate Vice President Bill Laing will be reporting to Srivastava and will work on the newly combined unit. SQL Azure — the other main component of Microsoft’s cloud offering — already is under STB, according to Microsoft officials.

• Lori MacVittie asserts Silos Belong on Farms Not in Clouds in her 12/8/2009 post:

Beware the danger of building out isolated network and application network infrastructures in the cloud lest we end up with silos from which it is difficult to escape.


While writing a separate post on the business value of public versus private cloud computing investments I specifically called out the fact that infrastructure – virtual or physical – provisioned in a cloud environment is applicable only to that cloud environment; it really can’t be shared within the enterprise architecture or other public cloud computing environments, for that matter.

That led to considering the impact of the cloud computing deployment model on general application architecture and how much “sharing” of provisioned resources would occur “out there”. There appears to be a very real possibility that the lack of visibility in cloud computing environments may very well lead to the creation of silos in the cloud.

ISOLATION is not always A GOOD THING

One of the alleged benefits of public cloud computing is that anyone within your organization can take advantage of it. We’ve seen the results of isolated, disconnected departmental-level architecture and development before; internal technological silos. When there is no centralized infrastructure management, each department/project is left to its own devices. This isolation and separation from a shared IT infrastructure management could easily lead to two or more different applications being provisioned with their own “copies” of infrastructure in a public cloud computing environment. …

• David Linthicum offers his Top 5 cloud computing predictions for 2010 and forecasts “Microsoft will find itself well-placed in the cloud” in this 12/8/2009 post for InfoWorld’s Cloud Computing blog:

Cloud computing standards and major cloud computing outages top Linthicum's list.

Evidently it's a requirement that all of those in the cloud computing world must chime in with their cloud computing predictions for 2010, so here are mine:

  1. Rise of standards …
  2. First major cloud computing provider outages …
  3. Microsoft will be relevant in the cloud …
  4. Rapid consolidation of existing providers …
  5. Rapid rise of cloud computing startups …

Of course, David fleshes out his predictions with background detail.

• Allan da Costa Pinto reviews Manuvir Das’ Windows Azure Present and Future (SVC13) PDC09 session, I’d Like To Teach The World to Cloud – Windows Azure Present and Future, in this 11/22/2009 post:

On a time budget and looking for a great video that gives you a vantage point of what is Windows Azure and what’s next for it? I’d like to suggest this 1 hour session from the Microsoft PDC: Windows Azure Present and Future (SVC13) by Manuvir Das.

As part of this session, Manuvir described Windows Azure “bit by bit” and the available, shipping feature set (e.g. secure certificate store, logging and diagnostics system, service management API, in-place rolling upgrades)…

Windows Azure Logical View

Windows Azure - flexible application hosting, lights out service management and storage at massive scale.

…while highlighting a great brand like Coca-Cola Enterprises doing real world Azure today…

Real World Azure

"The resilience and reliability of this platform is some of the benefits that we are seeing." – SVP & CIO, Coca-Cola Enterprises

…and showed what’s next for Windows Azure – a great set of features (e.g. Administrator privileges, user-driven VMs and terminal server access) – that many enterprise development and IT shops have asked for.

Eric Knorr asks “Will IT recover from a disastrous 2009? What will be the leading trend? Analyst group IDC takes its best shot” in his 2010 as predicted by IDC article of 12/7/2009 for InfoWorld:

It's "predictions season" at IDC, which always kicks off with broad forecasts for the coming year.  This year's initial report has the dramatic title "Recovery and Transformation," which sounds more optimistic than it really is.

Basically, IDC expects IT to grow 3.2 percent worldwide (3 percent in the United States), which is slightly faster than most people believe the economy will grow. IDC senior vice president and chief analyst Frank Gens told me that the most dramatic swing will be in small and medium-size businesses, which will see 3.6 percent growth, up from negative 3.5 percent in 2009. He expects large companies to be more conservative, upping their spending by just 1.8 percent. …

This includes -- you knew this was coming -- cloud service providers. "Cloud greatly expands and matures" is IDC's biggest prediction for IT. IDC expects more "enterprise grade" offerings (as IBM introduced recently) that provide not only infrastructure but meaningful SLAs. At the same time, in large enterprises, private clouds will start to take off. …

Mary Jo Foley continues her Microsoft Big Brains series with this All About Microsoft post of 12/7/2009 about Anders Vinberg, which reports, in part:

Vinberg also is spending a lot of cycles at present on Microsoft’s evolving private-cloud strategy and vision. As one would expect, he is talking extensively to the Azure team about how to provide the kinds of infrastructural advances available to customers of Microsoft’s hosted solutions to users who want to host their own data on-premises. Currently, there isn’t really an equivalent to System Center built into Windows Azure. Meanwhile, applications aren’t standing still. Next-generation applications that customers may want to run in a public or private cloud “will have different qualities than existing apps,” Vinberg said.

“Programmers used to be an elite group of developers,” Vinberg reminisced. “Then Visual Basic arrived. In the cloud, we need the moral equivalent of Visual Basic to take this technology into the mainstream.” [Emphasis added.]

Brenda Michelson’s Gartner Fellows Interview Microsoft’s Ray Ozzie on Cloud Computing detailed post of 11/19/2009 confirms Microsoft’s future hybrid cloud intentions with a quotation:

"Ozzie: I believe in a hybrid model. I fundamentally, deeply believe in a hybrid model at the experience side and at the back-end side.

At the back-end side, it depends on the size of enterprise and the workload, as well as the segment of the enterprise and whether it is highly regulated or whatever. The decisions regarding what to keep on-premises versus what to distribute into the cloud will vary dramatically. Very small businesses will put almost everything into the cloud. Very large businesses will put all their infrastructural systems, such as mail, phone systems and document management, into the cloud. Enterprise applications that have high integration requirements and a lot of legacy issues will stay on-premises. What happens in the middle is a mix."

“…But again, it’s a hybrid architecture.  If you don’t have the center, then you can’t rendezvous. You can’t find each other. You can’t connect in any way, shape or form.  However, if you don’t have the edge, then you don’t have the agility. You pay for ingress and egress when you don’t have to.”

Phil Wainwright worries that “Windows Azure, and similar platforms, will lure developers into half-baked (half-aaSed?) cloud deployments without realizing they haven’t completed the journey to the cloud” in his Windows Azure and the many faces of cloud post of 12/7/2009 to ZDNet’s Software as Services blog:

One of the reasons it’s so difficult to satisfactorily define cloud computing is that people have many different needs and expectations from a cloud platform. To start a conversation about cloud — especially one that seeks to evaluate the relative merits of competing cloud platforms — without first identifying what needs are being met is to invite misunderstanding and confusion. So before I come to my analysis of Microsoft’s Windows Azure platform and the hidden danger lurking there for many ISVs looking to embark on a SaaS strategy, I’m going to segment cloud computing into several important but separate categories.

First and foremost, we need to be clear whether we’re talking about cloud as a service or cloud as an infrastructure asset. I often find myself halfway through a conversation about cloud computing, in which I’ve made the natural assumption that we’re talking about delivery of cloud computing as-a-service by a third-party operator, when it suddenly dawns on me that the other person has been talking about buying software and hardware in order to implement their own cloud infrastructure. …

Maria Spinola offers a Cloud Computing Implementation Road-Map in this 12/6/2009 page that details the following four steps:

    1. Determine the Bad and Good “Candidates” for the Cloud
    2. Prepare Your IT portfolio for the Cloud
    3. Key Questions to Ask Cloud Computing Providers
    4. Test, Deploy, Monitor and Measure ROI

She includes a list of her earlier cloud-computing white papers.

Lori MacVittie asserts Next-Generation Management of Data Centers Should be Modeled on Social Networking in this 12/4/2009 post:

Should the next generation management of network and application network devices look and act more like Facebook and Twitter? Infrastructure 2.0 could take us there.

You may think I’m kidding and certainly I make this proposal with some amount of humorous intent, but there is some value, I think, in applying the concepts of Web 2.0 and social networking to network management systems (NMS).

There’s a reason it’s called social networking, after all. It’s modeled closely on networking, and NMS is primarily about managing not just individual network and application network devices but also the relationships between them. “Dependencies” are often included in NMS applications to better visualize and traverse the myriad relationships between network, application network, storage, and applications that make up the data center infrastructure. Understanding which devices are “friends” and which are “followers” is nothing new to NMS and IT professionals who spend their days mired inside these applications. …

<Return to section navigation list> 

Cloud Security and Governance

See Ron Lai’s Novell to extend identity management to cloud, virtualized apps article in the Cloud Computing Events section.

<Return to section navigation list> 

Cloud Computing Events

• Sanjay Jain touts next January’s Azure BizSpark Camp @ New York: Chance to Win $5000 in this 12/8/2009 post to the Microsoft BizCamp blog:

With several successful Microsoft BizSpark Incubation Weeks under our belt (Azure Atlanta, Win7 Boston, Win7 Reston, CRM Reston, CRM Boston, Win 7 Irvine, Mobility Mountain View), we are pleased to announce Microsoft BizSpark Camp for Windows Azure in New York, NY during January 28–29, 2010. Based upon your feedback we have made several changes, including offering a cash prize, a compressed time commitment and much more. We hope you’re tapping into the growing BizSpark community.

The current economic downturn is putting many entrepreneurs under increasing pressure, making it critical to find new resources and ways to reduce costs and inefficiencies. Microsoft BizSpark Camp for Windows Azure is designed to offer the following assistance to entrepreneurs.

  • A chance to win a cash prize of $5,000
  • Nomination for BizSpark One (an invitation-only program) for high-potential startups
  • Learn and build new applications in the cloud, or use interoperable services that run on Microsoft infrastructure to extend and enhance your existing applications, with help from on-site advisors
  • Get entrepreneurial coaching from a panel of industry experts
  • Generate marketing buzz for your brand
  • Create an opportunity to be highlighted at an upcoming launch

Steve Marx describes his role as an Azure technical strategist and answers developers’ questions about Windows Azure and SQL Azure in an interview by Scott Hanselman in this 00:30:00 live Channel9 video recorded at PDC09 and released on 12/8/2009: Channel 9 Live at PDC09: Steve Marx.

Tom Bailey suggests that you Take the Windows Azure Challenge and Get Paid in this 12/8/2009 post to TechNet’s Windows Azure Platform, Web Hosting and Web Services blog:

Elance has just launched another one of its amazing challenges. This time, you get paid to play around with Windows Azure.

It takes just a few minutes to earn $50 and also the chance to earn $1,000 to $10,000 for the coolest Azure applications. Here’s the scoop:

Elance is launching the Windows Azure Challenge for US based developers.

You will need to have an Elance account to participate.  Sign up for a free account at: https://secure.elance.com/php/reg/main/providersignup.php

There are two ways to get paid for creating a Windows Azure based application:

  • EARN $50 for each accepted application you build on Windows Azure and submit through Elance by December 31, 2009.  Any type of Windows Azure based application from simple samples to extremely complex solutions will qualify.
  • WIN UP TO $10,000 for the top application and $1k each for the top 5 runner up applications.

Click here to take the challenge.

Ron Lai reports “Novell plans eight new virtualized and cloud apps with built-in security that will aid in 'intelligent workload management'” in his 12/7/2009 Novell to extend identity management to cloud, virtualized apps article for InfoWorld’s Cloud Computing blog:

Virtualization and cloud computing have taken off, despite strong lingering concerns over how companies can secure and manage those apps and data.

Novell says it can help companies with both sides of the equation, accelerating the creation of virtualized and cloud apps with built-in security.  Over the next year, Novell plans to release eight new products or upgrades to aid in what it calls "intelligent workload management."

The upcoming Novell Identity Manager 4 will add the ability for IT managers to embed identity management and other security features into both Web-hosted and virtualized apps, Novell CEO Ron Hovsepian said in an interview last week.

Novell Identity Manager 4 will arrive by the middle of next year. That will work closely with Novell Cloud Security Service, also due in 2010, in order to extend identity and security policies onto apps and data hosted in the cloud.

Ben Riga’s Catching Up and PDC’09: So many sessions; so little time post of 12/7/2009 lists PDC 2009 video resources by these categories:

    • Getting Started
    • Windows Azure
    • Codename “Dallas”
    • SQL Azure
    • Identity
    • Customer and Partner Showcases

The Organizing Committee of the Nineteenth International World Wide Web Conference (WWW2010) welcomes the participation of researchers from around the world to submit original and pioneering research related to the Web for presentation and discussion next April in Raleigh, North Carolina. According to Tweets from @WSREST2010:

[T]he proposal for WS-REST 2010 at #WWW2010 accepted as a full-day workshop! [O]rganized by @dret @alexandrosM @pautasso. More details soon.

Keynote speakers will be Vint Cerf and danah boyd.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

• A Fujitsu press release of 12/8/2009 titled Fujitsu Launches End-to-End Cloud Services in North America announced “Fujitsu Will Help Enterprises Build Secure Clouds in Their Own Data Centers and Will Host Applications From ISVs -- Such as CoolRock Software and Intershop Communications -- in Upgraded Fujitsu Data Center in Sunnyvale:”

SUNNYVALE, CA -- (Marketwire) -- 12/08/09 -- Fujitsu (http://solutions.us.fujitsu.com) today announced it will be offering end-to-end enterprise cloud services designed to help enterprises and ISVs in North America quickly reap the cost and agility benefits of reliable, secure cloud computing. Fujitsu enterprise cloud services will let companies migrate existing multi-platform and multi-vendor mission-critical systems to enterprise clouds. Clients moving to the cloud will minimize their capital expense-intensive investment in technology and instead purchase a service that can be tailored to meet their specific business requirements, allowing them to align their IT costs with revenue.

In preparation for a Q1 2010 launch, Fujitsu has upgraded its environmentally-friendly data center at its Sunnyvale, Calif., headquarters to the Tier III level and will also support the cloud application interface (API) specification, which it recently submitted to the Open Cloud Standards Incubator of the Distributed Management Task Force (DMTF) in an effort to promote interoperability of cloud computing environments. CoolRock Software, an ISV specializing in email management software for archiving, ediscovery and collaboration, and Intershop Communications, a leading ecommerce solutions ISV, intend to leverage the Fujitsu cloud computing solution to offer their software to clients using a SaaS business model.

Designed for enterprises in manufacturing, finance, healthcare, retail and other compute- and data-intensive industries, Fujitsu is offering a complete array of cloud solutions, including system construction, operations, maintenance services and full-featured vertical applications. In order to comply with vertical industry standards and regulations, retail transactional applications will be hosted in a PCI-compliant data center and health care applications will be hosted in a HIPAA-compliant environment. …

Eric Engleman reports Amazon cuts some cloud pricing, adds bigger storage options in this 12/8/2009 post to the Puget Sound Business Journal’s TechFlash blog:

With Microsoft rolling out its Azure cloud computing platform, Amazon.com is cutting prices again on its cloud services and adding some new incentives. Amazon Web Services evangelist Jeff Barr, in a blog post, said the company is reducing the cost of its S3 storage service in the EU region by more than 15 percent, adding new storage options for big customers, and waiving inbound data transfer fees for all cloud computing services through June 2010.

Here's what Barr writes about the new storage options:

“In addition, we have reduced the price of our 500 TB tier and have added two new storage tiers across all Amazon S3 Regions for our largest customers. S3 storage in excess of 1 PB per month is billed at $0.08 per GB per month in the US Standard and EU (Ireland) Regions and $0.095 in the Northern California Region. S3 storage in excess of 5 PB per month is billed at $0.055 per GB per month in the US Standard and EU (Ireland) Regions and $0.070 per GB per month in the Northern California Region.”

Amazon in October also slashed prices on its core EC2 cloud computing service. It will be interesting to see what else Amazon does to counter Azure and other cloud rivals.
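The rates Barr quotes are the top of a tiered, per-GB-month rate card, where each gigabyte is billed at the rate of the tier it falls into. Here's a minimal sketch of that arithmetic, assuming marginal tiers and an illustrative $0.10/GB base rate below the 1 PB mark (the actual sub-petabyte rate card has more tiers than shown); only the two "in excess of" rates come from the quote above:

```python
GB_PER_PB = 1_000_000  # S3 pricing uses decimal units: 1 PB = 1,000,000 GB

# (tier upper bound in GB, $ per GB-month) -- marginal tiers.
# The base rate is an illustrative assumption; the 1 PB+ and 5 PB+
# rates are the US Standard / EU (Ireland) figures quoted above.
TIERS = [
    (1 * GB_PER_PB, 0.100),   # assumed base rate up to 1 PB
    (5 * GB_PER_PB, 0.080),   # quoted: storage in excess of 1 PB
    (float("inf"), 0.055),    # quoted: storage in excess of 5 PB
]

def monthly_storage_cost(gb):
    """Bill each GB at the rate of the marginal tier it falls into."""
    cost, lower = 0.0, 0
    for upper, rate in TIERS:
        if gb > lower:
            cost += (min(gb, upper) - lower) * rate
        lower = upper
    return cost
```

Under these assumptions, a 6 PB customer pays the 5.5-cent rate only on the sixth petabyte, not on the whole bucket.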

Reuven Cohen discusses Open Cloud Services & Co-operative Community Clouds in this 12/7/2009 post:

Now that I'm back and have had a chance to recuperate from my trip to Israel, I thought I'd share a few of the more interesting ideas to come out of the conversations I've had. In particular were several comments that Alistair Croll made at CloudCamp Tel Aviv about the potential opportunities for what he described as "Free / Open Cloud Services," as well as an idea I had around the potential of so-called "Cooperative Community Clouds."

First, in regard to Open Cloud Services, the concept basically goes like this: as we move away from the traditional client/server models of the past to the more web-centric, service-oriented opportunities of the future, we will see open source shift from being application centric (source code) toward free open services and information. Cloud providers will essentially give away access in return for greater adoption of their platforms and services, increased customer acquisition, and accelerated creation of data and information. Basically the same reasons companies open source their applications today, just applied in a cloud context.

His comments really did get me thinking and reminded me of a potentially huge but generally overlooked opportunity for Cloud Computing. What I'm talking about is that of the "Community Cloud" or "Cloud Cooperative" which may be a potential avenue to enable these types of free or shared cloud services.

The question is “How will ‘free cloud services’ pay their bills?”

<Return to section navigation list> 

Friday, May 22, 2009

Ten Slides from the “What’s New in Microsoft SQL Data Services” (DAT202) Tech*Ed 2009 Session

Rick Negrin’s What’s New in Microsoft SQL Data Services (DAT202) session is available to Tech*Ed 2009 attendees only:

Come and learn how SQL Data Services has evolved over the past year based on your feedback. In this session, learn how SQL Data Services delivers on the promise of Database as a Service (DaaS). See how easy it is to take an existing class of SQL Server applications and extend them to the DaaS service using existing SQL Server knowledge, protocols, client libraries and tools. With minimal changes, your application will be running in a highly available and scalable service. Finally, we touch on the business model and terms of use, and present a roadmap for the service.

Update 5/22/2009: Current SDS survey forms for the CTP are read-only for Windows Live IDs used to complete surveys for original SQL Server Data Services (SSDS) CTP. See last Negrin slide. Added a slide from and link to David Robinson’s SDS presentation at Microsoft’s Enterprise Developer and Solutions Conference.

Following are captures of 10 slides from Rick’s presentation for those of you who didn’t make it to Tech*Ed 2009.

Clicking the Join the Mailing List link on the SQL Data Services (SDS) page (above URL) takes you to a Microsoft Connect page with two mailing lists dated 3/5/2008 and 4/7/2008 from the original SQL Server Data Services (SSDS) CTP. If you completed a survey for an SSDS account with your currently active Windows Live ID, the original survey (read-only) opens.

The following is a slide from David Robinson’s SDS presentation at Microsoft’s Enterprise Developer and Solutions Conference, courtesy of Dennis Gobo.

You can watch Dave’s presentation, which was similar to Rick’s, here. Disregard the “Presentation Not Available” message that might appear and click the arrow to start the segment.

Sunday, November 02, 2008

Cloud Computing at PDC and Elsewhere: Day 4 (10/30)

A daily compendium of PDC keynotes and sessions about Cloud Computing, SQL Data Services, and related topics. This post will be updated frequently from 8:00 AM to 5:00 PM PDT or later. Unless otherwise noted all sessions were on 10/29/2008.

SQL Data Services and Related Session Videos From Channel9

Channel9 is posting videos of PDC 2008 interviews and sessions about 24 hours after the session’s end. Following are segments from Wednesday’s sessions that are related to SQL Data Services (SDS):

Pablo Castro and Niranjan Nilakantan: ES07 Windows Azure: Modeling Data for Efficient Access at Scale (Wed 10/29 | 1:15 PM-2:30 PM | 403AB):

Learn how to use the highly scalable, available and durable Table Storage service. This session presents a deep dive with demos into the programming APIs and data models for structured storage.

Unfortunately, the screen captures are drastically overexposed. For more background about ADO.NET Data Services (Astoria) as a front-end for Table Services and SQL Data Services, read Pablo’s ADO.NET Data Services in Windows Azure: pushing scalability to the next level post of 11/1/2008.

Patrick McElroy: SQL Services: Futures (Wed 10/29 | 10:30 AM-11:45 AM | 408B):

Learn about new capabilities in upcoming versions of SQL Services. See demonstrations of services like advanced query processing, sync services, and reporting services running in the cloud. Also learn about plans to integrate with other services.

Andrew E. Fitzhugh: How HP Built their Magcloud Service on Windows Azure (Wed 10/29 | 10:30 AM-11:45 AM | 411). Here’s the abstract:

Hear from an online magazine publishing service that was prototyped on Microsoft cloud services. Learn from the development team about what they built, why they built it, and what they learned about the platform along the way.

You can learn more about the project by reading Kurt Mackie’s PDC: HP Startup Tests Windows Azure Cloud Platform article of 10/31/2008.

Other .NET Services Session Videos From Channel9

Following are segments from Wednesday’s sessions that are related to Azure cloud computing services other than SQL Data Services (SDS):

Clemens Vasters: .NET Services: Connectivity, Messaging, Events, and Discovery with the ServiceBus (Wed 10/29 | 10:30 AM-11:45 AM | 406A):

Learn how to use the ServiceBus in the Microsoft .NET Services to address difficult connectivity, security, and discoverability issues.

Michael Conrad, John Shriver-Blake: Showcase: Windows Azure Enables Live Meeting (Wed 10/29 | 12:00 PM-12:45 PM | 408A):

The next generation conferencing team discusses why the cloud computing platform was selected for a next-generation, high-scale conferencing service that requires high performance and reliability and the ability to automatically adjust scale. The team shares learnings from the prototyping phase and discusses features on the cloud services roadmap that are central to their success.

Steve Garrity, Mark Gilbert: .NET Services: Logging, Diagnosing, and Troubleshooting Applications Running Live in the Cloud (Wed 10/29 | 1:15 PM-2:30 PM | 515B):

Learn the real world ways of tracking down and fixing problems on services and systems running live in the cloud. These black belt techniques and application patterns are used to simplify the operations and debugging of large scale services while they run. Also get a peek at how we run and monitor the .NET Services at Microsoft.

Justin Smith: .NET Services: Access Control In Microsoft .NET Services (Wed 10/29 | 3:00 PM-4:15 PM | 408B):

Learn how to manage access control to the ServiceBus, Workflow, and Data Services via the Access Control Service. This session illustrates the access control capabilities of these services, and several common patterns for building your application securely using the .NET Services Access Control Service.

Justin Smith: .NET Services: Access Control Service Drilldown (Wed 10/29 | 4:45 PM-6:00 PM | 515A):

This session shows how to use the access control service in your application. Learn how to secure your application or service using the Access Control service's APIs. We'll then drill into the protocols and security patterns the service uses along with explaining some of the service internals.

Sriram Krishnan: Windows Azure: Cloud Service Development Best Practices (Wed 10/29 | 3:00 PM-4:15 PM | Petree Hall CD):

This session goes beyond the "Hello World" development experience, giving best practices across common tasks for serious service developers. These tasks include logging, debugging, deployment, management, and maintenance of individual services.

 

Sunday, September 07, 2008

LINQ and Entity Framework Posts for 9/2/2008+

Note: This post is updated daily or more frequently, depending on the availability of new articles.

•••• Updated 9/6/2008 5:00 PM PDT
••• Updated 9/5/2008 2:00 PM PDT
•• Updated 9/4/2008 12:40 PM PDT
• Updated 9/3/2008 11:40 AM PDT

Entity Framework and Entity Data Model (EF/EDM)

••• David Sceppa says in his SQLite's ADO.NET Provider Supports the ADO.NET Entity Framework! post of 9/6/2008:

[T]he initial build of the SQLite provider that supported the RTM version of the Entity Framework was available the same day that the Entity Framework released.

According to the PhxSoftware site, SQLite for ADO.NET 2.0 v3.6.2 of 8/30/2008 (build 1.0.58.0 on SourceForge.Net):

System.Data.SQLite is the original SQLite database engine and a complete ADO.NET 2.0 provider all rolled into a single mixed mode assembly.  It is a complete drop-in replacement for the original sqlite3.dll (you can even rename it to sqlite3.dll).  Unlike normal mixed assemblies, it has no linker dependency on the .NET runtime so it can be distributed independently of .NET.

However, the site offers the following caveat about EF:

Support for the ADO.NET 3.5 Entity Framework
SQLite's EF provider is still in beta for now, but go ahead and kick the tires!

PhxSoftware should confirm support for EF RTM, not a beta version.

Note: SQLite is the local database engine for Google Gears and several other technologies. According to Robert Accettura’s The Winner For Most Embedded Is: SQLite post of 2/27/2008:

What is interesting is that SQLite really dominates [the embedded database market] right now. Adobe Air, Mozilla Prism, Google Gears, Android, iPhone SDK (likely through Core Data API), Symbian, Ruby On Rails (default DB in 2.0), PHP 5 (bundled but disabled in PHP.ini by default).

I question whether a lightweight database engine (redistributable = 600 KB) justifies a heavyweight data layer, such as EF.

For more information on the SQLite database engine, see http://www.sqlite.org/.
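The "lightweight" point is easy to demonstrate: SQLite is a zero-configuration embedded engine, so a complete database session needs no server, connection pool, or mapping layer. A minimal sketch using Python's bundled sqlite3 module rather than the System.Data.SQLite provider the post discusses (table and data are invented for illustration):

```python
import sqlite3

# In-memory database: no server process, no connection string
# beyond ":memory:" -- the whole engine lives inside the process.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE photos (id INTEGER PRIMARY KEY, tag TEXT)")
con.executemany("INSERT INTO photos (tag) VALUES (?)",
                [("cloud",), ("azure",), ("cloud",)])
rows = con.execute(
    "SELECT tag, COUNT(*) FROM photos GROUP BY tag ORDER BY tag").fetchall()
con.close()
```

Whether layering a full O/RM such as EF over something this small is worth the overhead is exactly the question raised above.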

•••• Kathleen Dollard explains how to Customize Code Generation in EF in her “Ask Kathleen” column for Visual Studio Magazine’s September 2008 issue. She shows how to customize the EF team’s SampleEdmxCodeGenerator to remove the default DataMember attribute from specific properties of EF EntityObjects so that the DataContractSerializer ignores them. The process is by no means a walk in the park and is a tribute to Kathleen’s analytical and coding prowess.

••• David Sceppa announces in his Devart's New Providers Support the ADO.NET Entity Framework! post of 9/5/2008 that Devart (formerly Core Laboratories) is the first third-party Entity Framework data provider to support the VS 2008 SP1 RTM implementation. For more information, see Devart’s New Versions of ADO.NET Data Providers Available! post.

•• Diego Vega updated the ADO.NET Entity Framework & LINQ to Relational Data wiki on 9/4/2008 to note that all Entity Framework Toolkits & Extensions have been updated to VS 2008 SP1 RTM except Danny Simmons’ Perseus (EntityBag) project.

So far, there’s no word from Danny about his intention to update the project.

Andrew Peters will be a developer on the Entity Framework team, according to his Joining Microsoft post of 9/2/2008. Andrew is a cofounder of Mindscape, a software solutions provider in Wellington, NZ, and the initial force behind the LightSpeed object/relational mapping (O/RM) tool.

Andrew will be working with fellow Kiwi, Alex James, EF’s Metadata program manager, but from the Microsoft Canada Developer Center in Vancouver, BC, because he missed this year’s H1B visa cutoff. (Damien Guard, a member of the LINQ to SQL team, also has temporary duty in Vancouver.)

Kristofer Andersson’s Tools - Part 5 - Add-ins - Documentation features in Entity Framework vs Linq-to-SQL of 9/2/2008 praises the EF team for including documentation for classes and their properties. However, he faults the group for not using the entity’s Summary description for EntitySets and for not enabling its editing in the designer. I agree with Kris that the default “There are no comments for [name] in the schema” is redundant and a waste of screen real estate.

See the related entry in the “LINQ to SQL” category.

Muhammad Mosa’s LINQ to Entities, Workarounds on what is not supported post of 9/2/2008 handles the EF features he found missing in his earlier LINQ to Entities, what is not supported? post of 8/24/2008 by:

  • Using Business objects as Wrappers for Entity objects
  • Converting to client evaluation, LINQ to Objects way

LINQ to SQL

Scott Guthrie resumes posting about ASP.NET MVC with LINQ to SQL as the data source in his very lengthy ASP.NET MVC Preview 5 and Form Posting Scenarios post of 9/2/2008. I estimate the post to be at least 42 screens long (i.e., classic Guthrie detail).

Kristofer Andersson describes the Huagati DBML Tools add-in’s new documentation feature for LINQ to SQL in his Tools - Part 5 - Add-ins - Documentation features in Entity Framework vs Linq-to-SQL post of 9/2/2008.

The add-in provides a new “Update Linq-to-SQL documentation from database” menu choice that imports the Description property of tables and columns as “summary xml comments and description attributes for all entities/classes, members/properties and data context entity accessor properties.”

Daniel Crenna has reached part four of his series that demonstrates using LINQ to SQL as the data transfer layer for a WCF service consumed by Silverlight. Following are links to the current and three previous installments:

Dan’s From ASP.NET to Silverlight in Five Leaps of 7/25/2008 describes the five major differences between developing in ASP.NET and Silverlight.

LINQ to Objects, LINQ to XML, et al.

•••• Eric White’s Announcing the First CTP of Open XML SDK V2 post reports the availability of the new OOXML SDK, which includes the following new features:

  • Strongly typed document object model (DOM).

  • Tool to compare two Open XML files.

  • Class explorer that helps you understand the markup and determine which classes to use in the strongly typed DOM.

  • Document reflector that can write a lot of your code for generating documents or content.

Eric explains the benefits of the strongly typed DOM with LINQ to XML:

The gist of the strongly typed DOM is that the SDK defines classes for elements in the markup. When you access the contents of a part, you access it via these classes. For example, instead of retrieving a collection of System.Xml.Linq.XElement objects for the paragraph (w:p) elements, you retrieve a collection of DocumentFormat.OpenXml.Wordprocessing.Paragraph objects.

and provides small examples that compare the weakly and strongly typed approaches with LINQ to XML.

•• Diego Lopez Ruiz suggests using additional XML technologies with LINQ to XML in his LINQ to XML, XmlReader, XmlWriter - more than meets the eye tidbit of 9/5/2008.

•• LinqMaster’s Using LINQ ElementAt and LINQ ElementAtOrDefault post of 9/4/2008 explains the syntax of these two LINQ extension methods.

•• Lisa Feigenbaum has just returned from an around-the-world tour giving VB presentations. In her Visual Basic at TechEd SouthEast Asia (Lisa Feigenbaum) post of 9/3/2008, she posts descriptions and materials for the two LINQ-related sessions she gave at Tech*Ed SouthEast Asia 2008.

Amanda Silver, Lino Tadros, Steve Lasker, Erick Thompson, and Charlie Calvert discuss LINQ in the UI Layer with Carl Franklin in this panel interview. Here’s the description:

LINQ is commonly known as a great way to query data from external sources (databases, Web services, XML, SharePoint, etc). However, there are numerous other places where LINQ can make the developer's life much easier when working on the end user interface, especially when you incorporate DataSet into the mix. This panel will talk about these approaches and the possible pitfalls of using them.

Bart De Smet offers a translation of LINQ query expression syntax to LINQ method calls in his C# 3.0 Query Expression Translation Cheat Sheet post of 8/30/2008.

ADO.NET Data Services (Astoria)

•••• Mike Flasko reports in his 0 to 60 With ADO.NET Data Services post of 9/5/2008 that he’s updated his “Using Microsoft ADO.NET Data Services” white paper for VS 2008 SP1. Following are a few of the more important topics in the white paper:

  • Creating a Data Service using the ADO.NET Entity Framework
  • Creating a data service from any data source (from a CLR object graph)
  • Trying an ADO.NET Data Service (in the browser)
  • Finding and Pointing to Data: URLs in Data Services
  • Expression Syntax
  • Options for Data Representation (AtomPub, JSON)
  • Changing Data in ADO.NET Data Services
  • Optimistic concurrency
  • Custom Behaviors on Data Services
  • AJAX Applications
  • Language Integrated Query (LINQ to ADO.NET Data Services)
  • Using the client library from Microsoft Silverlight 2
  • Controlling Data Service Policies

Unsurprisingly, there’s no mention of LINQ to SQL as an Astoria data source in this white paper.
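Two of the white-paper topics above, addressing data with URLs and the expression syntax, come down to Astoria's URI conventions: entities are addressed by key, and query operators ride along as `$`-prefixed query-string options. A rough sketch of the URI shapes (the service root, entity set, and key values here are hypothetical):

```python
from urllib.parse import quote

# Hypothetical ADO.NET Data Services ("Astoria") service root.
root = "http://example.com/Northwind.svc"

# Address a single entity by its key.
by_key = f"{root}/Customers('ALFKI')"

# Apply a filter expression; spaces and quotes must be URL-encoded.
filtered = f"{root}/Customers?$filter=" + quote("Country eq 'USA'")

# Order and page the result set with $orderby, $top, and $skip.
paged = f"{root}/Customers?$orderby=CompanyName&$top=10&$skip=20"
```

The same URIs work whether the client is a browser, the .NET client library, or Silverlight's System.Data.Services.Client.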

••• Shawn Wildermuth’s My Silverlight 2 Data Services Article Code Updated post of 9/4/2008 announces the availability of updated code for his Silverlight 2 MSDN Magazine article in the September 2008 issue. He also includes links to these three updated versions on his new Silverlight Data site:

  • Simple Example: Single page, three-table schema example that reads and writes data via ADO.NET Data Services and Entity Framework.
  • NHibernate Example: Single page, three-table schema example that reads and writes data via ADO.NET Data Services and NHibernate's new NHibernate.LINQ provider.
  • AG Games (Down for Maintenance): A more complex example using a richer Entity Framework model with complex mapping, data templates, multiple entity read/writing.

• Phani Raj announces an Interim Release: Making SL2 Beta 2 Clients Work With .NET Fx 3.5 SP1 RTM Servers on 9/2/2008. The new release solves the interoperability problem between Astoria RTM in VS 2008 SP1 and Silverlight 2 Beta 2 that Shawn reported in his post below.

• Rick Strahl warns about JSON UTC date encoding issues that might affect Astoria users in his Watch out for Date Kind in JSON Deserialization post of 9/3/2008.

• Roger Jennings: Out-of-band update 9/3/2008 4:45 PM PDT.

Shawn’s sample code works fine with VS 2008 SP1 RTM and the Silverlight 2 Beta 2 runtime, SDK, and VS 2008 Silverlight Tools when you replace the SL2 Beta 2 version of System.Data.Services.Client.dll with the interim release from today’s Astoria Project post.

Two or three fixes are required:

  1. Replace opResponse.HasErrors with opResponse.Error !=null in ProductList.cs line 121 (and similar for VB)
  2. Replace EntitySetRights.WriteUpdate with EntitySetRights.WriteReplace in Products.svc line 20 (and similar for VB)
  3. If you’re using SQL Server Express, replace DataSource=.; with Data Source=.\SQLEXPRESS; in Web.config

Shawn Wildermuth warns readers of his “Create Data-Centric Web Applications With Silverlight 2” article for the MSDN Magazine’s September 2008 issue that his code examples require VS 2008 SP1 Beta and Silverlight 2 Beta 2 in “Caveats About My Silverlight 2 Data Services Article.” VS 2008 SP1 RTM won’t work.

He promises to update the sample code, which uses ADO.NET Data Services as its WCF-enabled data source, when Silverlight RTM becomes available.

ASP.NET Dynamic Data (DD)

•••• Matt Berseth shows you how to customize the default confirmation message box for deleting an entity instance with details about the item you’re deleting in his Dynamic Data - Customizing the Delete Confirmation Dialog post of 9/7/2008. Here’s a sample (courtesy of Matt):

•••• Steve Naughton posts the code for all seven parts of his FieldTemplate Septet (with apologies to Lawrence Durrell):

  1. The Anatomy of a FieldTemplate.
  2. Your First FieldTemplate.
  3. An Advanced FieldTemplate.
  4. A Second Advanced FieldTemplate.
  5. An Advanced FieldTemplate with a GridView.
  6. An Advanced FieldTemplate with a DetailsView.
  7. An Advanced FieldTemplate with a GridView/DetailsView Project.

on the page for link #7 above, dated 9/6/2008.

••• Steve Naughton discovers in his Dynamic Data Futures – Part 2 - AnyColumn Filter post of 9/5/2008 that creating the AnyColumn filter from the Dynamic Data Futures Integer filter just requires you to “add the filter attribute Integer to the property you want filtered,” which makes for a much shorter article than usual.

•• Steve Naughton explains how to display and edit multiple fields in a custom FieldTemplate in his Dynamic Data Compound Column post of 9/4/2008. He uses a compound Point(x, y) type as a Coordinate property for his tutorial. Steve says:

At a later date I would like to try this with Point being a User Def[in]ed Type in SQL Server 2005/2008, this would get rid of the need for the compound property and would mean only a new FieldTemplate was required.

• Steve Naughton, true to his word, posted part 1 of Dynamic Data Futures - Advanced Filters, Getting Dynamic Data Futures filters working in File Based Website, on 9/3/2008. The post is a lavishly illustrated, step-by-step tutorial that covers these topics:

  1. Adding Dynamic Data Futures to your Website
  2. Adding a Reference to Dynamic Data Futures and AjaxToolkit
  3. Adding the new Filter User Controls to the Website
  4. Converting Web Application files to work in a file based Website
  5. Making Necessary Changes to allow Advanced Filter to Work
  6. Testing the AdvancedFilterRepeater and new Filters

I wouldn’t try installing Dynamic Data Futures to a file-based Web site without this article.

Steve Naughton reports that he’s back from vacation in Normal Service Resumed of 9/2/2008. While away, he worked on the following new episodes in his series of DD improvements and will publish them shortly:

  • How to get the new Advanced Filters component of DD Futures working in a file based website.
  • Making an Any Column Filter (Works on any column not just FK columns)
  • Adding Insert facility to my GridView FieldTemplate.
  • A Time entry control and FieldTemplate.

SQL Server Data Services (SSDS)

••• Roger Jennings’ Microsoft Introduces Data Mining “Cloud Services” post of 9/4/2008 says the new online Table Analysis Tools for data mining appear to be a conventional Web service or Software as a Service (SaaS) application, not a “cloud service.”

However, hooking SSDS as the data service for the tools might qualify them for “cloud computing” status.

Ryan Dunn posted PhluffyFotos v2 Released on 9/4/2008. According to Ryan:

This sample application is an ASP.NET MVC and Windows Mobile application showing how to build a photo tagging and sharing site using our cloud data service, SSDS.

Following are abbreviated descriptions of v2’s new features:

  • Updated to MVC Preview 4
  • Updated to add thumbnail support
  • Updated to use the SSDS blob support 
  • Updated to use the latest SSDS REST library

Ryan adds:

The sample is available for download at CodePlex, and a live version is available to play with at PhluffyFotos.com.  I am opening this one up to the public to upload photos.  Maybe I am playing with fire here, so we will see how well it goes.

Ryan Dunn announces in his SSDS REST Library v2 Released post of 9/4/2008 that the new REST-based library for SSDS available from the MSDN Code Gallery’s SSDS REST Library offers the following new features:

  • Concurrency support via Etags and If-Match, If-None-Match headers
  • Blob support
  • Parallelization support via extension methods
  • Bug fixes
  • Better test coverage

See Ryan’s post for more details on the added features.
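The ETag-based concurrency feature in the list above follows standard HTTP optimistic-concurrency semantics: a client reads an entity, captures its ETag, and sends it back in an If-Match header so the update fails with 412 Precondition Failed if another writer got there first. Here is a minimal in-memory sketch of that pattern (my own stand-in for a REST store, not the SSDS REST Library’s actual API):

```python
# Sketch of ETag-based optimistic concurrency, as enforced by If-Match.
# EtagStore is a hypothetical in-memory stand-in for a REST data service.

class EtagStore:
    def __init__(self):
        self._data = {}   # key -> (etag, value)
        self._next = 0

    def get(self, key):
        """Return (etag, value), like a GET response with its ETag header."""
        etag, value = self._data[key]
        return etag, value

    def put(self, key, value, if_match=None):
        """Update key; return (status, etag). 412 mimics Precondition Failed."""
        if key in self._data and if_match is not None:
            current_etag, _ = self._data[key]
            if if_match != current_etag:
                return 412, current_etag  # another writer updated the entity
        self._next += 1
        new_etag = f'"{self._next}"'
        self._data[key] = (new_etag, value)
        return 200, new_etag

store = EtagStore()
store.put("photo1", {"caption": "old"})        # initial write
etag_a, _ = store.get("photo1")                # client A reads
etag_b, _ = store.get("photo1")                # client B reads the same version
status_a, _ = store.put("photo1", {"caption": "A"}, if_match=etag_a)
status_b, _ = store.put("photo1", {"caption": "B"}, if_match=etag_b)
print(status_a, status_b)  # 200 412: the first update wins, the stale one fails
```

The losing client would then re-GET the entity, reapply its change, and retry with the fresh ETag.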

Roger Jennings’ How To Get a List of Your SSDS Account’s Authorities post of 9/4/2008 answers a question that Mike Amundsen and I asked in the How do I get a list of my defined Authorities? thread of the SQL Server Data Services (SSDS) - Getting Started forum.

Pat Helland analyzes Amazon S3’s recent outage and suggests what’s required for cloud-based data reliability in his lengthy Confidence in the Cloud post of 9/1/2008. Topics include:

  • Some Observations about Reliable Process Pairs
  • Less Is More
  • N-Version Programming
  • Availability Over Consistency
  • Eventual Consistency
  • Front-Ending the Cloud
  • It's Going To Be a Fun Ride!

He concludes:

It is my opinion that we will be designing systems to support eventual consistency.  Part of this trend will be to back away from our traditional separation of storage (e.g. database) from applications.

The SSDS team touts immediate (ACID, transactional) consistency as one of the primary selling points of SQL Server Data Services.
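To make that distinction concrete, here is a toy model (my own sketch, not from Pat’s post) of the eventual-consistency behavior he describes: a write lands on the primary replica immediately, but a read from a lagging replica returns stale data until replication catches up. An immediately consistent store such as SSDS would never return the stale value.

```python
# Toy contrast between eventual and immediate consistency: reads from a
# lagging replica can miss a recent write until replication catches up.

class EventuallyConsistentStore:
    def __init__(self, replicas=3):
        self.replicas = [{} for _ in range(replicas)]
        self.pending = []  # (replica_index, key, value) not yet applied

    def write(self, key, value):
        self.replicas[0][key] = value          # primary applies immediately
        for i in range(1, len(self.replicas)):
            self.pending.append((i, key, value))  # secondaries lag behind

    def read(self, key, replica):
        return self.replicas[replica].get(key)

    def replicate(self):
        """Apply the backlog; after this, all replicas agree."""
        for i, key, value in self.pending:
            self.replicas[i][key] = value
        self.pending.clear()

store = EventuallyConsistentStore()
store.write("x", "new")
stale = store.read("x", replica=2)   # None: replica 2 hasn't caught up yet
store.replicate()
fresh = store.read("x", replica=2)   # "new" once replication completes
print(stale, fresh)
```

The application-level consequence Pat highlights is that code reading from such a store must tolerate the stale window rather than assume every read reflects the latest write.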

(Pat left his job as a data architect at Microsoft to join Amazon, returned to Microsoft in March 2007, and ultimately joined the SQL Server data programmability group.)

Note: If you haven’t been to downtown Chicago for a few years (like me), check out Pat’s photographs in A Wonderful Few Days at a Wedding in Chicago of the same date.

SQL Server Compact (SSCE) 3.5 and Sync Services

• Aaron Greene, Andrei Maksimenka and Christian Liensberger “dig into the details of the Sync Framework and explore the complexities of sync, generally” in Channel9’s one-hour Synchronizing Data: Inside the Microsoft Sync Framework video segment. The description starts with:

The Microsoft Sync Framework is a comprehensive synchronization platform that enables collaboration and offline scenarios for applications, services and devices. It supports any kind of data type and any kind of data store. The framework was designed to be transfer protocol independent and to allow the developers to map any kind of network topology.

Visual Studio 2008 Service Pack 1 (General)

No new posts as of 9/2/2008 2:00 PM PDT 

Miscellaneous (WPF, WCF, Silverlight, etc.)

•• Beth Massi demonstrates how to incorporate a Windows Forms DataGridView control in a WPF form by using the WindowsFormsHost control in her Editing Tabular Data in WPF Using the Winforms DataGridView post of 9/4/2008.

• Rob Conery is updating his MVC Storefront project to Preview 5, as noted in MVC Storefront: Intermission 2 of 9/3/2008.

Roger Jennings decries Google Chrome’s draconian end-user license agreement in Chrome’s Evil Terms of [Software as a] Service of 9/3/2008. Other sources (ReadWriteWeb, The Register) cite mandatory copyright licensing, but there’s much more to fear when you read all the fine print. (Chrome is uninstalled here.)

• Derik Whittaker’s Upgrading Dimecasts.Net from MVC Preview 4 to Preview 5 post of 9/3/2008 expands on Casey Charlton’s post below.

Casey Charlton describes seven Problems Upgrading ASP.NET MVC to Preview 5 in this post of 9/2/2008.

Roger Jennings gripes about Technorati’s missing entries for the OakLeaf Blog’s new posts in his Technorati Fails to Index OakLeaf Systems Blog Posts complaint of 9/2/2008.

WebSlice Content for Internet Explorer 8 Beta 2

OakLeaf LINQ and Entity Framework Links

Last Updated: 9/4/2008 12:40 PM PDT

  • Entity Framework Additions
  • LINQ to Objects, XML, etc. Additions
  • ADO.NET Data Services Additions
  • ASP.NET Dynamic Data Additions
  • SSDS REST Library V2