Tuesday, January 05, 2010

Windows Azure and Cloud Computing Posts for 1/4/2010+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 
Note: This post is updated daily or more frequently, depending on the availability of new articles in the sections below.

To use the section-navigation links, first click the post’s title to display the article on its own page.

Cloud Computing with the Windows Azure Platform published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock).

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010. 
* Content for managing DataHubs will be added as Microsoft releases more details on data synchronization services for SQL Azure and Windows Azure.

Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.

Azure Blob, Table and Queue Services

My OakLeaf Systems Azure Table Services Sample Project example from my Cloud Computing with the Windows Azure Platform book is live as an upgraded, commercial service.

However, the book’s three other live test harness accounts (Azure Blobs, Azure Tables SSL, and Azure Queues) won’t be available until I have time to add them to the new Table Services Sample Project.

<Return to section navigation list> 

SQL Azure Database (SADB, formerly SDS and SSDS)

Chris Rolon will present MSDN Webcast: geekSpeak: SQL Azure Under the Hood with Chris Rolon (Level 200) on Wednesday 1/6/2010 at 12:00 PM PST:

In this episode of geekSpeak, Chris Rolon gives us a look under the hood of Microsoft SQL Azure to see how it was constructed. Chris discusses the issues involving high availability, failure detection, automatic failover, and the distributed data fabric. Be sure to bring your questions about SQL Azure for Chris. This geekSpeak is hosted by Glen Gordon and Lynn Langit.

Cihan Biyikoglu and Henry Zhang co-wrote the Evaluating Application Performance and Throughput in SQL Azure post of 1/5/2010:

The discussion of performance and throughput came up a few times in discussions with customers at PDC this year. Here are a few important things to understand when evaluating SQL Azure performance.

Key takeaways: The SQL Azure environment utilizes a large number of nodes, and the total capacity across these nodes means massive scale. SQL Azure provisions a logical server for each account. These servers are logical groupings of databases. Physically, these databases are spread across the environment to various nodes. Each node services many databases. Since placement of new databases is spread across the environment, it is not likely that two of your databases are on the same server. …
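The "not likely on the same server" claim can be sanity-checked with a birthday-problem estimate. The sketch below assumes, purely for illustration, uniform random placement across a hypothetical 1,000-node cluster; Microsoft does not publish SQL Azure's actual node count or placement policy:

```python
# Back-of-the-envelope birthday-problem estimate: probability that at
# least two of k databases land on the same node, assuming uniform
# random placement across n nodes. Both numbers are hypothetical.
def collision_probability(k: int, n: int) -> float:
    p_all_distinct = 1.0
    for i in range(k):
        # i-th database must avoid the i nodes already occupied
        p_all_distinct *= (n - i) / n
    return 1.0 - p_all_distinct

# With an assumed 1,000-node cluster, two of your databases share a
# node only 0.1% of the time:
print(f"{collision_probability(2, 1000):.4f}")   # 0.0010
# Even ten databases collide less than 5% of the time:
print(f"{collision_probability(10, 1000):.4f}")
```

Under these assumptions the quoted takeaway holds: with many nodes and only a handful of databases per account, co-location of two of your databases is unlikely.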

Steve Clayton’s Running those 1.3 billion Hotmail inboxes post of 1/5/2010 quotes Arthur de Haan’s post of 12/22/2009 regarding the scale of Hotmail:

    • We currently have over 155 petabytes of storage deployed (70% of storage is taken up with attachments, typically photos).
    • We’re the largest SQL Server 2008 deployment in the world (we monitor and manage many thousands of SQL servers). [Emphasis added.]

<Return to section navigation list> 

Song analyzes three DBaaS entries in his Database as a Service – from Force.com to Amazon RDS and SQL Azure of 1/3/2010:

These days, those who are building internet applications have many choices for databases.  Traditionally (i.e., up to the last few years), you have the usual commercial and open source solutions such as Oracle, Microsoft SQL Server, MySQL, and PostgreSQL to store user and application data.  Usually, these databases run in the company’s own data centers or a co-lo facility managed by AT&T, Verizon or the like.  However, offerings from Force.com (from Salesforce.com), Amazon Web Services and Windows Azure are changing this landscape.  In fact, in some cases, they start to converge despite their differences.

I started thinking about database as a service not because I was building one or looking for a DBaaS solution.  In fact, I was building a full-stack internet application based on open source technology.  However, the offering from Force.com was rather intriguing, as they claim to offer faster development time, an easy-to-use language, and a secured infrastructure, as this InformationWeek article indicates.

However, when you dig a little deeper, you will find that if you do not want your UI to look like the Salesforce.com form-based interface and instead want a rich, interactive user experience, Force.com offers limited options.  The reason is that Force.com offers only a database-driven form application UI or a heavy client written in Adobe AIR and Flex.  The former is reminiscent of web applications from the late 1990s, and the latter requires significant processing power on the client desktop.

AppFabric: Access Control, Service Bus and Workflow

The Windows Azure Team’s Now Available: New Identity Developer Training Kit for WIF with Windows Azure post of 1/5/2010 announces:

At the Professional Developers Conference last November, we announced the availability of Windows Identity Foundation (WIF), a new extension to the .NET Framework that makes it easier for developers to create more secure applications with interoperable, identity-based access.  The software and documentation are available here.  You can watch various video discussions about WIF on Channel 9 here.

The new identity developer training kit and Channel 9 training course contain a new hands-on lab, Federated Authentication in a Windows Azure Web Role Application, which provides step-by-step guides for hosting in Windows Azure (WAZ) a Web application accepting identities from an external identity provider, leaving you free to focus on the business function of your application. You can also go here for a standalone version of the lab. …

If you'd like an overview of the Windows Identity Foundation, please watch Vittorio Bertocci's WIF introductory talk at PDC.  If you are interested in going deeper into the topic, take a look at the recording of the excellent talk Microsoft Architect Hervey Wilson gave on this subject at PDC09.

The Windows Azure Platform AppFabric Team’s Announcing Windows Azure platform commercial offer availability and updated AppFabric pricing of 1/4/2010 also announces a change in Service Bus and Access Control Services pricing, as well as a name change from “Message Operations” to “Transactions”:

As part of today's announcement about the commercial availability of Windows Azure platform offers, we are also introducing updated pricing for the Windows Azure platform AppFabric, which helps developers connect cloud and on-premises applications. Based on discussion and feedback from hundreds of customers during the CTP process, we have made the pricing simpler and more predictable. Service Bus will now be priced at $3.99 per Connection-month, and Access Control will be $1.99 per 100,000 Transactions. …

For Service Bus, the pricing meter has changed from "Message Operations" to "Connections". In many cases, each application instance that connects to the Service Bus will require just one Connection, which means that predicting your usage is often as simple as counting the number of application instances or devices that you need to connect. Whether your application requires two-way messaging, event distribution, protocol tunneling, or another architecture, the Connection-based model is designed to suit your business needs. Connections are charged at a rate of $3.99 per Connection per month (plus applicable data transfer charges), and will be billed on a pay-as-you-go, consumptive basis. Alternatively, for customers who are able to forecast their needs in advance, we offer the option to purchase "Packs" of Connections: a pack of 5 Connections for $9.95, a pack of 25 for $49.75, a pack of 100 for $199.00, or a pack of 500 for $995.00 per month (plus data transfer). Connection Packs represent an effective rate of $1.99 per Connection-month. Pack sizes larger than 500 may be available on request. In our FAQ, we provide more details on how Connections are defined, measured, and billed. …
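The pack prices above all work out to the same effective rate; here is a quick sketch confirming the stated $1.99 per Connection-month (figures copied directly from the announcement, not from an official calculator):

```python
# Service Bus Connection pricing from the announcement:
# pay-as-you-go at $3.99 per Connection-month, or monthly Connection Packs.
PAYG_RATE = 3.99                                        # $/Connection-month
PACKS = {5: 9.95, 25: 49.75, 100: 199.00, 500: 995.00}  # pack size: $/month

for size, price in PACKS.items():
    effective = price / size                      # $/Connection-month in the pack
    discount = (1 - effective / PAYG_RATE) * 100  # % below pay-as-you-go
    print(f"{size:>3}-pack: ${effective:.2f} per Connection-month "
          f"({discount:.0f}% below pay-as-you-go)")
```

Access Control’s meter reduces the same way: $1.99 per 100,000 Transactions is roughly $0.002 per 100 token requests.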

For Access Control, the pricing meter has changed from "Message Operations" to "Transactions". In practice, these meters are the same; only the name has been changed to reflect the Access Control function more accurately. As previously announced, token requests and service management operations will both be counted as Transactions, and charged at a rate of $1.99 per 100,000 Transactions. …

The Service Bus and Access Control are available today; however, to give customers more time to adjust to the new pricing structure, charges will start to accrue in April 2010. Usage until that time will be free of charge, so we encourage you to upgrade your account and sign up for an offer today. Starting today, customers can already take advantage of the same support and benefits provided across the Windows Azure platform.  SLAs will take effect when charges begin to accrue in April 2010.  To help customers monitor and predict their usage before charges begin to accrue, Connection and Transaction usage reports will be made available soon on the developer portal at http://appfabric.azure.com/.

For more information, please visit our FAQ and pricing pages.

Mamoon Yunus asserts “Without Federated SOA, enterprise-class cloud computing will lack mass adoption” in his Federated SOA: A Pre-requisite for Enterprise Cloud Computing post of 1/4/2010:

According to Massimo Pezzini, VP and Gartner Fellow, "Federated SOA is a systematic approach to large-scale, enterprise wide SOA that enables organizations to integrate semi-independent SOA initiatives. Often used to fix an initial lack of coordination, federated SOA should be proactively pursued from the inception of major, strategic SOA initiatives." -- Divide and Conquer: Taming Complexity Through Federated SOA.

Successful enterprise SOA implementations build on a set of localized, project-level efforts with services that have clearly identified and accountable business and technology owners. Ownership defines a SOA domain. SOA domains may exist within corporate boundaries or may be a set of external third-party services. Deciding which services are core to a business owner and should be implemented within his or her domain, versus consumed from another SOA domain, becomes a critical part of building Federated SOA. Understanding the core capabilities provided by SOA domains is a crucial enterprise-level task for encouraging efficiency through reuse and for keeping focus on core business services.

Mamoon’s claim that federated identity is the first prerequisite of federated SOA isn’t surprising, because he’s the founder of Forum Systems, which manufactures Web Services security gateways and firewalls.

<Return to section navigation list>

Live Windows Azure Apps, Tools and Test Harnesses

My OakLeaf Systems Azure Table Services Sample Project example from my Cloud Computing with the Windows Azure Platform book is live as an upgraded, commercial service. (Repeated from the Azure Blob, Table and Queue Services section.)

This post is as lengthy as many of Scott Guthrie (@ScottGu)’s tutorial posts for ASP.NET MVC and related topics.

Panagiotis Kefalidis discusses Windows Azure - How to detect if you’re running on the cloud or locally in this 1/5/2010 post:

Recently at the MSDN Forums, people were asking how they can detect whether their web application is running in the cloud or locally (Dev Storage). Besides the obvious case, code inside a Web Role or Worker Role Start() method exists only in a cloud template, but what if you want to make that check somewhere else, for example inside a Page_Load method or inside a library (DLL)?

PK explains both.

Lori MacVittie’s The Application Delivery Deus Ex Machina post of 1/4/2010 begins:

In storytelling a deus ex machina is not necessarily a good thing. In fact, its use is often attributed to the author’s inability to resolve a plot point and thus divine intervention, or some other too-good-to-be-true coincidental discovery of a vital piece of information, is used to solve the problem. The term comes from Greek plays in which the gods descended from the heavens to solve an unsolvable problem for mere mortals. In those times a primitive machine was used to lower the “gods” from the heavens onto the stage, hence the term which translates as “god from machine”.

While many IT personnel have almost certainly prayed on any number of long, frustrating evenings for a deus ex machina solution to some problem with which they were struggling, the reality is that very few “machines” can suddenly drop out of the sky into the network and solve all application-related problems. Unlike storytelling, however, the existence of a deus ex machina solution would certainly be a good thing and having one drop out of the sky would be, if you’ll pardon the pun, a god-send.

While no such beast exists completely today, a unified application delivery platform comes pretty close. …

John Moore details Part One: Stage One Meaningful Use Winners in this 1/3/2010 post that lists:

  • Consultants
  • Payers
  • Large, Established EHR/EMR Vendors
  • Revenue Cycle Management (RCM) Vendors
  • Medication Checking Reconciliation & eRx Apps

as the principal beneficiaries and offers explanations for their membership in the ARRA revenue winners circle.

John’s Achieving Meaningful Use: View from the Trenches post of 1/5/2010 ends:

Couple of final points:

1) Halamka is speaking from the perspective of a CIO in a major healthcare organization.  His plan will not be terribly useful for a small practice or rural hospital that does not have the same resources at its disposal.

2) As part of their plan, BIDMC/Halamka frequently refers to Microsoft’s HealthVault and Google Health as a key part of enabling consumer access to and control of their personal health information (PHI) via use of CCD or CCR standards. It is Chilmark Research’s opinion that to meet meaningful use criteria for consumer access to and control of PHI, practices and hospitals will increasingly enable such functionality through their own gateways to these two Personal Health Platform (PHP) services. [Emphasis added.]

Brandon Werner has updated his detailed How To Host Your Site and Content On Azure Quickly and Easily tutorial post of late November 2009:

This entry seeks to provide you with a quick and easy way to get up to speed on Azure quickly by deploying your own personal website as an MVC application in to the cloud. Consider it a “Hello World”. I will do the following:

  • Demonstrate how to write and deploy a simple Azure hosted website
  • Demonstrate how to create your own image and content server using Azure Storage and expose your content publicly through URLs
  • Demonstrate how to use new tools like Azure Storage Explorer to access your cloud storage

<Return to section navigation list> 

Windows Azure Infrastructure

My How to Create and Activate an Introductory MSDN Premium Windows Azure and SQL Azure Account Upgrade post of 1/5/2010 is an illustrated tutorial for upgrading Windows Azure and SQL Azure accounts from CTPs to commercial (paid) versions, with or without Introductory MSDN Premium benefits.

Gunther Lenz explains Azure benefits through MSDN and BizSpark! in this 1/5/2010 post to the ISV Developer Community blog:

… MSDN Premium subscribers and BizSpark members in the countries listed above are now able to sign up for a special introductory Windows Azure platform offer. This offer lasts 8 months from when the subscriber signs up and provides 750 hours of Windows Azure compute per month plus three SQL Azure databases, AppFabric, and bandwidth. An email communication will be sent out the week of January 11th to all MSDN Premium subscribers in the 21 countries alerting them that they can sign up.

We are not yet communicating when signup for this special introductory offer ends, but it will be available only for a limited time, and at least through June 30, 2010. …

The Windows Azure Team’s Sign up for a Windows Azure platform offer today and get visibility into your usage post of 1/4/2010 begins:

Today marks an important step towards our goal of enabling you, our customers and partners, to build and grow your businesses on the Windows Azure platform. We are pleased to announce that starting today you can upgrade your Community Technology Preview (CTP) accounts of the Windows® Azure™ platform (i.e., Windows Azure, SQL Azure and/or Windows Azure platform AppFabric) to paid commercial subscriptions. If you upgrade your CTP accounts during the month of January, 2010, all Windows Azure platform usage incurred during this month will be at no charge. You will also have full visibility during this month to your Windows Azure platform usage. Billing and SLAs for all commercial accounts will begin on February 1st, 2010.

To upgrade your CTP accounts, please visit our offer page and select the offer of your choice. When you purchase the selected offer, you will need to sign in with the same Windows Live ID as that associated with your CTP accounts. If you wish to purchase a new commercial subscription but NOT upgrade your existing CTP accounts, please utilize a different Windows Live ID than your CTP accounts when ordering or remove all applications and data associated with your CTP accounts prior to sign up.

If you elect not to upgrade, on February 1, 2010 your CTP accounts will be disabled and any Windows Azure Storage will be made read-only. SQL Azure CTP accounts will be able to keep using their existing databases but will no longer be able to create new databases.  Also, Windows Azure platform AppFabric namespaces will be disabled. On March 1, 2010, the SQL Azure CTP accounts that have not been upgraded will be deleted. On April 1, 2010, the Windows Azure Storage CTP accounts and Windows Azure platform AppFabric namespaces that have not been upgraded will be deleted. It is important to export your data if you do not plan to upgrade to a commercial subscription prior to these dates.

See my tutorial above for details on creating new or upgrading CTP accounts to commercial status.

My Windows Azure and SQL Azure Billing System Snafu on 1/4/2010: Epic FAIL! post of 1/4/2010 describes issues relating to upgrading accounts prior to correction of the initial upgrade problem reported above.

Crystal Nichols asks Does Cloud Computing Make Good Business Sense? in this 1/3/2010 post to the IT Solutions blog:

Whether or not cloud computing makes good business sense isn’t really the question. The question should be more like, “In what ways does cloud computing make good business sense” in any given organization. While cloud computing isn’t always the best solution for every problem, it does address a number of significant issues facing IT today.

There are several areas you can make a business case for cloud computing, and there are also some challenges that face an IT department who attempts to implement cloud computing solutions. …

<Return to section navigation list> 

Cloud Security and Governance

Kim Hart reports FTC set to examine cloud computing in this 1/4/2010 post to The Hill:

The Federal Trade Commission (FTC) is investigating the privacy and security implications of cloud computing, according to a recent filing with the Federal Communications Commission.

The FTC, which shares jurisdiction over broadband issues, says it recognizes the potential cost-savings cloud computing can provide. "However, the storage of data on remote computers may also raise privacy and security concerns for consumers," wrote David Vladeck, who helms the FTC's Consumer Protection Bureau.

"For example, the ability of cloud computing services to collect and centrally store increasing amounts of consumer data, combined with the ease with which such centrally stored data may be shared with others, create a risk that larger amounts of data may be used by entities not originally intended or understood by consumers," the filing says.

The FTC is also looking at identity management systems — i.e., how people authenticate their identities when logging into websites — and how they can better protect citizens' privacy.

Both examinations are part of a "broader initiative" to investigate various models for privacy. The agency is holding a roundtable Jan. 28 to focus on privacy protections. It will include specific discussions about cloud computing, identity management, mobile computing and social networking. …

<Return to section navigation list> 

Cloud Computing Events

Chris Rolon will present MSDN Webcast: geekSpeak: SQL Azure Under the Hood with Chris Rolon (Level 200) on Wednesday 1/6/2010 at 12:00 PM PST. More details are in the SQL Azure Database (SADB) section above.

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Krishnan Subramanian’s MySQL, Oracle And Cloud Computing post of 1/5/2010 analyzes the effect of Oracle’s pending acquisition of MySQL in conjunction with its Sun Microsystems takeover:

Ever since Oracle announced the acquisition of Sun Microsystems along with MySQL, all hell broke loose in the open source community. With the EU questioning the deal, there is a war (of words) erupting inside the community, with one side asking the EU to block the deal or, at the very least, change MySQL's license from the GPL to another open source license, and the other side urging the EU to allow the transaction to go through. Even though I have no love for Oracle, I think it is time to let the deal go through, at least for the sake of the Sun employees who are sitting there with their futures unknown. At the same time, I am not unduly worried about the future of MySQL because I have complete confidence in its open source license. Let me try to explain my position here in this post. …

Krishnan concludes:

I think it is time for some sanity to prevail in the community and let Oracle absorb Sun and MySQL. The very nature of open source will ensure that users are never left in the lurch. MySQL and any other open source software absorbed by proprietary vendors in the future will survive irrespective of what the new owner does to the OS software they buy. Along with other factors, cloud computing will also help them survive.

Pablo Castro asserts HTML5 does databases in this 1/4/2010 post:

The HTML5* specification has been cooking for a while, and lately the amount of buzz around it has been growing at full speed. Just search for #HTML5 on Twitter and you'll see what I mean. After even a quick look at it, it becomes evident that the next version of HTML aims to go much further into the application space than earlier ones. Not only are there a lot of highly anticipated presentation features such as <video> and <canvas>, but also several APIs to do things that applications do, from background work (with Web Workers) to direct communication (with Web Sockets) to offline support (with the App Cache) and databases (with Web SQL Database and/or the Indexed Database API).

I've always been attracted to things that bring data and the Web together. So a while back, when I first saw browsers and databases in the same spec, I had to get involved. There are a bunch of us at Microsoft interested in HTML5 from different angles, and we now have good momentum to explore this space. So now I'm spending a good chunk of time on the database aspects of HTML5: the API, the developer story, etc.

This explains why Pablo has been so quiet on the blogging front lately.

Sean Kelly announces New Release: Database Sync - Oracle and SQL Compact in this 1/4/2010 post:

Openness is an important aspect of the Sync Framework and one of our jobs is to knock down as many walls as possible, empowering developers to synchronize data between any two stores.  We would love for you to use the Sync Framework to connect with other Microsoft products such as SQL Server and SQL Azure and oftentimes enhance the experience by lighting up on those platforms.  That said, we recognize that most companies are not homogeneous in terms of the software vendors they depend upon and it is important to provide the ability to synchronize with existing investments such as Oracle or DB2.  This sample demonstrates how one would go about synchronizing SQL Compact with Oracle using our new P2P providers.  Enjoy!  Special thanks to Jesse Low who took the time to build this sample.  He is one of the developers on the Sync Framework team and his sample work resulted in a handful of improvements to the core runtime in order to support the ability to sync data with Oracle.

The sample source code is here on the MSDN Code Gallery.

Rich Miller reports Salesforce.com Hit by One Hour Outage in this 1/4/2010 post:

Enterprise cloud computing provider Salesforce.com says it has resolved an outage that knocked its services offline for about an hour and 15 minutes this afternoon. Salesforce.com has nearly 68,000 customers using its online applications, including Dell, Dow Jones Newswires and SunTrust Banks. The company says the incident “affected all instances.”

“The Salesforce.com Technology Team has resolved the service disruption issues on all instances from 12:10PM PST to 1:25PM PST,” the company reported on its status dashboard. “All services are restored at this time. We are performing a review of the incident and will take any corrective action needed. We apologize for any inconvenience this may have caused you and appreciate your patience.”

UPDATE: Users on Twitter report continuing problems trying to log onto their apps. As of 1:55 p.m. Pacific time, Salesforce.com is reporting that its NA2 instance is back offline.

<Return to section navigation list> 
