Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.
• Updated 4/7/2010 with a link to Darrell West’s Saving Money Through Cloud Computing research paper from the Brookings Institution’s The Economic Gains of Cloud Computing event. See the Windows Azure Infrastructure and Cloud Computing Events sections.
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:
- Azure Blob, Table and Queue Services
- SQL Azure Database, Codename “Dallas” and OData
- AppFabric: Access Control and Service Bus
- Live Windows Azure Apps, APIs, Tools and Test Harnesses
- Windows Azure Infrastructure
- Cloud Security and Governance
- Cloud Computing Events
- Other Cloud Computing Platforms and Services
To use the above links, first click the post’s title to display it as a single article; the section links will then navigate within that article.
Discuss the book on its WROX P2P Forum.
See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.
Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.
You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:
- Chapter 12: “Managing SQL Azure Accounts and Databases”
- Chapter 13: “Exploiting SQL Azure Database's Relational Features”
HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated in April 2010 for the January 4, 2010 commercial release.
No significant articles today.
Phani Raju YN published A Checklist for OData Feed publishers post on 4/6/2010:
Phani’s original posts cover the following topics:
- Allow Silverlight clients to access your data
- Server Driven Paging for your entity sets
- Web Friendly Feed annotations for prettier feeds
- Make your entity sets read-only
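The server-driven paging item means a client must follow continuation links rather than assume a single response holds the whole feed. Here is a minimal Python sketch of that client-side loop, using the verbose-JSON `__next` convention; the `fetch` callable and the two-page Products feed are simulated stand-ins, not a real OData endpoint:

```python
def read_all_entries(fetch, url):
    """Accumulate entries from an OData feed that uses server-driven
    paging: each page may carry a __next link pointing at the next
    partial result set; the last page omits it."""
    entries = []
    while url:
        page = fetch(url)          # returns the decoded JSON payload
        entries.extend(page["results"])
        url = page.get("__next")   # absent on the final page
    return entries

# Hypothetical three-entry feed split across two pages.
pages = {
    "/Products": {"results": [{"ID": 1}, {"ID": 2}],
                  "__next": "/Products?$skiptoken=2"},
    "/Products?$skiptoken=2": {"results": [{"ID": 3}]},
}
all_entries = read_all_entries(pages.__getitem__, "/Products")
```

A client that ignores `__next` silently truncates its results, which is exactly the pitfall Phani's checklist item warns feed publishers to document.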
Update 4/6/2010 4:00 PM PDT David Robinson reports Yellow dashboard back to green in this update to the following post of 4/5/2010:
Yesterday afternoon we reported an intermittent connection issue and I am happy to report that the issue has been identified and resolved. The issue was tracked to an out-of-date Kerberos encryption policy on one of our machines.
As an additional heads up, today the SQL Azure service will be undergoing a planned upgrade (SU2), as announced at the MIX conference a couple weeks ago. This is a rolling deployment across all data centers worldwide and is expected to complete without impact to availability by the end of the week (9 April). Please check back here for more details once the deployment is complete.
David Robinson (my book’s Technical Editor) answers on 4/5/2010 the Yellow Dashboard? question about “intermittent authentication issues” reported in my Windows Azure and Cloud Computing Posts for 4/5/2010+ post:
SQL Azure has been yellow on the Windows Azure Platform dashboard for a few days and I wanted to take a moment to brief you on why and what we are doing to correct it. A small number of customers are seeing intermittent connection issues and so we flipped the status on the service dashboard to reflect this. The exact status on the dashboard states:
We are experiencing intermittent authentication issues within our service which are also causing intermittent authentication errors for some customers attempting to access their database. The length of each occurrence varies widely from seconds to several minutes. In some cases closing and retrying your connection may mitigate the issue. We are actively investigating this issue and will continue to do so until we have resolved it.
Even though the number of affected customers is low, we are focused on resolving this issue with the highest priority.
So what is going on? As some of you that have been to one of the many presentations on SQL Azure know, SQL Azure is comprised of a number of different tiers, front-end machines that handle the initial user request and back-end machines that ultimately execute that request against the database. The problem we are seeing is that intermittently, some front-end machines fail to connect to the appropriate back-end machine due to a transient authentication issue. The result is that the user receives an error message stating “Login failed for user NT AUTHORITY\ANONYMOUS LOGON”. If your application has retry logic built in, you probably didn’t even notice the problem was occurring since it is an intermittent issue.
The SQL Azure Development Team, Operations and our Data Center staff are working around the clock on this. Once we complete our root cause analysis, we will let you know our findings. I’ll post an update on our progress in the next 24 hours.
Rest assured your data is completely safe! We maintain multiple copies of your database and this issue is only related to intermittent connection failures. I would also like to reiterate that although this is an intermittent problem affecting a small subset of the users, we are treating it with the utmost urgency and people are working around the clock to get it fixed.
What if you are one of the people seeing this? Adding retry logic to your application or simply trying to reconnect will work around this issue.
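Robinson's workaround amounts to simple retry logic around connection attempts. Here is a hedged Python sketch; the `LoginError` class and the failing-then-succeeding `connect` function are simulated stand-ins for the transient failure he describes, not SQL Azure's actual client API:

```python
import time

class LoginError(Exception):
    """Stands in for the transient 'Login failed for user
    NT AUTHORITY\\ANONYMOUS LOGON' error described above."""

def with_retries(action, attempts=3, delay=0.0):
    """Run action(), retrying on transient login failures and
    re-raising only after the final attempt fails."""
    for attempt in range(1, attempts + 1):
        try:
            return action()
        except LoginError:
            if attempt == attempts:
                raise
            time.sleep(delay)  # brief pause before reconnecting

# Simulated connection that fails once, then succeeds.
calls = {"n": 0}
def connect():
    calls["n"] += 1
    if calls["n"] < 2:
        raise LoginError()
    return "connected"

result = with_retries(connect)
```

As Robinson notes, an application with this kind of logic built in would likely never have surfaced the intermittent failures to its users.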
Gil Fink describes how to write an OData Visualizer Extension in this 4/6/2010 post:
Yesterday I wrote about the VS2010 Extension Manager. One of the great extensions that you can download and use is the OData Visualizer. It gives us a visual graph of an OData service’s types, properties, associations, and other aspects of the EDM that you get from an OData metadata endpoint.
Installing OData Visualizer
Installing the OData Visualizer is very easy. Go to your Extension Manager and search for OData Visualizer in the online gallery. Once you find it, install it and restart VS2010 for it to become available. It is also available in the Visual Studio Gallery.
Gil continues with these topics:
- Setting the Environment
- Using the OData Visualizer
Sajid Qayyum explains Exporting and Synchronize local Database to SQL Azure in this 4/5/2010 illustrated tutorial, which is similar to my Synchronizing On-Premises and SQL Azure Northwind Sample Databases with SQL Azure Data Sync post of 1/28/2010:
I assume that you already have a SQL Azure account with a Server name and an Administrator. Both the server name and the administrator username are written under "Server Information" on the SQL Azure Server Administration screen.
We start by establishing a connection from SQL Server Management Studio to SQL Azure. Before we can do so, we must allow our computer to connect to SQL Azure, which blocks all communication by default. To do that, log in to your SQL Azure account.
Sajid continues with the details.
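Once the firewall allows your machine through, SQL Azure of this era also expects logins in user@servername form over an encrypted connection. As a sketch only, here is a Python helper that assembles an ADO.NET-style connection string; the server, database, and credentials are placeholders, and nothing here actually connects:

```python
def azure_connection_string(server, database, user, password):
    """Build an ADO.NET-style connection string for SQL Azure.
    SQL Azure expects the login as user@servername and accepts
    only encrypted connections."""
    return (
        f"Server=tcp:{server}.database.windows.net;"
        f"Database={database};"
        f"User ID={user}@{server};"   # note the @server suffix
        f"Password={password};"
        "Encrypt=True;"
    )

cs = azure_connection_string("myserver", "Northwind", "admin", "secret")
```

Forgetting the `@server` suffix on the login is one of the most common first-connection mistakes in tutorials like Sajid's.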
John R. Durant’s Connect Microsoft Excel To SQL Azure Database post of 4/5/2010 is an illustrated tutorial for connecting Excel 2010 RC to an SQL Azure database:
A number of people have found my post about getting started with SQL Azure pretty useful. But it's all worthless if it doesn't add up to user value. Databases are like potential energy in physics: a promise that something could be put in motion. Users actually making decisions based on analysis is analogous to kinetic energy in physics. It's the fulfillment of the promise of potential energy.
So what does this have to do with Office 2010? In Excel 2010 we made it truly easy to connect to a SQL Azure database and pull down data. Here I explain how to do it.
By following these steps you will be able to:
- Create an Excel data connection to a SQL Azure database
- Select the data to import into Excel
- Perform the data import
John continues with details and illustrations of the five steps required to connect. You might also want to check out John’s earlier Getting Started Integrating Windows Azure with Microsoft Office Solutions post.
See Jason Maurer’s proposal for a CRUD for the Web: OData, GData, and You session for the 2010 Open Source Bridge conference, to be held 6/1 to 6/4/2010 in Portland, OR, in the Cloud Computing Events section.
See Jon Flanders’ MSDN Webcast: geekSpeak: Windows Azure AppFabric (Level 200) presentation on 4/7/2010 in the Cloud Computing Events section.
Bruce Kyle and James Brundage show you How to Call PowerShell from Your Application in this YouTube video of 4/5/2010:
Join James Brundage, Tester from the Windows PowerShell team, and Bruce Kyle for a quick [00:04:19] introduction to how to embed PowerShell within your C# application.
The U.S. federal government spends nearly $76 billion each year on information technology, and $20 billion of that is devoted to hardware, software, and file servers (Alford and Morton, 2009). Traditionally, computing services have been delivered through desktops or laptops operated by proprietary software. But new advances in cloud computing have made it possible for public and private sector agencies alike to access software, services, and data storage through remote file servers. With the number of federal data centers having skyrocketed from 493 to 1,200 over the past decade (Federal Communications Commission, 2010), it is time to more seriously consider whether money can be saved through greater reliance on cloud computing.
Cloud computing refers to services, applications, and data storage delivered online through powerful file servers. As pointed out by Jeffrey Rayport and Andrew Heyward (2009), cloud computing has the potential to produce “an explosion in creativity, diversity, and democratization predicated on creating ubiquitous access to high-powered computing resources.” By freeing users from being tied to desktop computers and specific geographic locations, clouds revolutionize the manner in which people, businesses, and governments may undertake basic computational and communication tasks (Benioff, 2009). In addition, clouds enable organizations to scale up or down to the level of needed service so that people can optimize their needed capacity. Fifty-eight percent of private sector information technology executives anticipate that “cloud computing will cause a radical shift in IT and 47 percent say they’re already using it or actively researching it” (Forrest, 2009, p. 5).
To evaluate the possible cost savings a federal agency might expect from migrating to the cloud, in this study I review past studies, undertake case studies of government agencies that have made the move, and discuss the future of cloud computing. I found that the agencies generally saw between 25 and 50 percent savings in moving to the cloud. For the federal government as a whole, this translates into billions in cost savings, depending on the scope of the transition. Many factors go into such assessments, such as the nature of the migration, a reliance on public versus private clouds, the need for privacy and security, the number of file servers before and after migration, the extent of labor savings, and file server storage utilization rates.
West continues with a description of five steps that should “be undertaken in order to improve efficiency and operations in the public sector.” See the Cloud Computing Events section for more details on the event.
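As a back-of-the-envelope illustration (my arithmetic, not figures from West’s paper), applying his 25-to-50-percent agency savings range to the $20 billion hardware/software/server slice cited above gives a sense of the scale involved:

```python
hardware_budget = 20e9   # annual federal hardware/software/server spend cited above
low, high = 0.25, 0.50   # savings range West reports for agencies that migrated

savings_low = hardware_budget * low    # lower-bound annual savings in dollars
savings_high = hardware_budget * high  # upper-bound annual savings in dollars
```

Even the lower bound lands in the billions per year, consistent with West’s “billions in cost savings, depending on the scope of the transition.”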
The Brookings Institution describes itself as follows:
The Brookings Institution is a nonprofit public policy organization based in Washington, DC. Our mission is to conduct high-quality, independent research and, based on that research, to provide innovative, practical recommendations that advance three broad goals:
- Strengthen American democracy;
- Foster the economic and social welfare, security and opportunity of all Americans; and
- Secure a more open, safe, prosperous and cooperative international system.
Brookings is proud to be consistently ranked as the most influential, most quoted and most trusted think tank.
The Straits Times reports S[inga]pore key to Microsoft's regional cloud push on 2/7/2010:
MICROSOFT sees Singapore as a focal point for its cloud computing push in the region, and will be investing more here to support this business, says Stephen Elop, president of its business division.
He said Singapore is now a hub for 'very massive investments around data centres to support countries in the region'.
And Singapore's appetite for Microsoft's cloud services has so far been among the biggest in the region, he revealed.
'Singapore is very important. It has a beautiful environment in which to learn about the future of technology.
'It is also very much the case that broadband services, connectivity rates and so forth are very good here, and therefore we can do things here that other countries are not equipped to do,' he said in an interview.
'The regulatory environment here is also conducive for us to do new forms of business.'
To grow its cloud computing business, Microsoft is now spending substantially on data centres, in order to host applications and customer data.
Mr Elop said Microsoft has been making such investments in Singapore to support the company's cloud customers from across the region, including those in Australia. …
• Chris Hoff’s Good Interview/Resource Regarding CloudAudit from SearchCloudComputing… post of 4/6/2010 describes his interview:
The guys from SearchCloudComputing gave me a ring and we chatted about CloudAudit. The interview that follows is a distillation of that discussion and goes a long way toward answering many of the common questions surrounding CloudAudit/A6. You can find the original here.
And continues with a verbatim copy of the Q&A session.
The SearchSecurity.com staff interviewed Chris Hoff (@Beaker) for this Q&A: CloudAudit targets automated risk assessment, management post of 4/6/2010:
CloudAudit, launched in January 2010, brings together cloud computing providers, integrators and consultants in an effort to create a common interface and namespace. The volunteer initiative aims to help with an automated risk assessment and audit of Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS) or Infrastructure-as-a-Service (IaaS) environments. Christofer Hoff, who earned a respected reputation as a long-time independent researcher and is now the director of cloud and virtualization systems at Cisco Systems Inc., and a technical advisor and founding member of the Cloud Security Alliance (CSA), spoke to SearchSecurityChannel.com about CloudAudit's mission.
What are the biggest challenges when auditing cloud-based services, particularly for the solution providers?
Christofer Hoff: One of the biggest issues is their lack of understanding of how the cloud differs from traditional enterprise IT. They're learning as quickly as their customers are. Once they figure out what to ask and potentially how to ask it, there is the issue surrounding, in many cases, the lack of transparency on the part of the provider to be able to actually provide consistent answers across different cloud providers, given the various delivery and deployment models in the cloud.
How does the cloud change the way a traditional audit would be carried out?
Hoff: For the most part, a good amount of the questions that one would ask specifically surrounding the infrastructure is abstracted and obfuscated. In many cases, a lot of the moving parts, especially as they relate to the potential to being competitive differentiators for that particular provider, are simply a black box into which operationally you're not really given a lot of visibility or transparency.
If you were to host in a colocation provider, where you would typically take a box, the operating system and the apps on top of it, you'd expect, given who controls what and who administers what, to potentially see a lot more, as well as there to be a lot more standardization of those deployed solutions, given the maturity of that space. …
The interview continues with additional Q&A.
Christine Jacobs, Communications Officer, Governance Studies for the Brookings Institution announced in an e-mail this morning:
Mr. Kundra also announced that the National Institute of Standards and Technology will host a “Cloud Summit” on May 20, with government agencies and the private sector. The Summit will introduce NIST efforts to lead the definition of the Federal Government’s requirements for cloud computing, key technical research, and United States standards development. Furthermore, Mr. Kundra stated that the government will engage with industry to collaboratively develop standards and solutions for cloud interoperability, data portability, and security. [Emphasis added.]
The Brookings Institution will hold The Economic Gains of Cloud Computing event with Vivek Kundra, Federal CIO on 4/7/2010 from 9:00 AM to 11:00 AM EDT at Falk Auditorium, The Brookings Institution, 1775 Massachusetts Ave., NW, Washington, DC:
The Brookings Institution will host a policy forum that examines the economic benefits of cloud computing for local, state, and federal government. Vivek Kundra, federal chief information officer, will deliver a keynote address on the role of the government in developing and promoting cloud computing. Brookings Vice President Darrell West will moderate a panel of experts and detail the findings in his paper, "Saving Money through Cloud Computing," which analyzes its governmental cost-savings potential.
Here are the experts:
BrightTALK will air a Data Privacy in the New Decade summit on 4/8/2010 starting at 8:00 AM (Time Zone not specified):
With the emergence of new technologies like social media, mobile and cloud computing, electronic data is gathered, stored and shared in ways unimaginable a decade ago. The imperatives of security have led to a modernization in Data Privacy Laws to safeguard emerging privacy and security vulnerabilities. Join industry experts as they navigate their way through international regulatory ambiguities and discuss industry best practices for data privacy risk minimization within the digital economy.
Reuven Cohen reports he’s Kicking off CloudChasers Radio Show on 4/6/2010:
I'm happy to announce that I will be helping kick off the new CloudChasers Radio Show sponsored by Novell this Thursday, April 8, @ 1:30pm (Eastern).
CloudChasers is a live Internet radio show that tackles the drama surrounding cloud computing head on. Each week, a Novell guest and a third-party subject matter expert will examine the emerging ideas, challenges and opportunities relevant to cloud computing in a spirited, non-formatted discussion. It's a must-listen for every IT professional chasing the cloud.
Date: April 8, 2010
Topic: Cloud Adoption: What's Holding Us Back?
Time: 1:30pm EDT (10:30PT)
Host: Leslie Poston, writer for Mashable.com, author of Twitter for Dummies and founder of Podcamp NH
Jon Bultmeyer, Director, Identity R&D at Novell
Reuven Cohen, blogger at elasticvapor.com, founder of Cloud Camp, and CTO at Enomaly
There's no shortage of voices saying that cloud computing offers the promise of enormous agility, flexibility, and cost savings. But if the cloud is set to revolutionize IT, why aren't more enterprises embracing it fully for business-critical operations? Is it security risk? Regulatory concerns? Integration worries? Join us as we discuss what's really holding us back from taking advantage of the cloud – whether it's something technological, operational, or even cultural—and how those obstacles can be overcome.
tbtechnet announced Windows Azure One Week Virtual Boot Camp April 5th - April 12th 2010 on 4/5/2010:
Learn Windows Azure at your own pace, in your own time and without travel headaches.
Special Windows Azure one week pass provided so you can put Windows Azure and SQL Azure through their paces.
NO credit card required.
You can start the Boot Camp any time during April 5th -12th and work at your own pace.
The Windows Azure virtual boot camp pass is valid from 5am USA PST April 5th through 6pm USA PST April 12th.
Follow these steps:
- Get a Windows Azure One Week Pass here
- Sign in to the Windows Azure Developer Portal and use the pass to access your Windows Azure account.
- Please note: your Windows Azure application will automatically de-provision at the end of the virtual boot camp on April 12th
- Since you will have a local copy of your application, you will be able to publish your application package on to Windows Azure after the virtual boot camp should you decide to sign up for a regular Windows Azure account after the Boot Camp.
- For USA developers, no-cost phone and email support during and after the Windows Azure virtual boot camp with the Front Runner for Windows Azure program.
- For non-USA developers - sign up for Green Light https://www.isvappcompat.com/global
- Get the Tools
- To get started on the boot camp, download and install these tools:
- Download Microsoft Web Platform Installer
- Download Windows Azure Tools for Microsoft Visual Studio
- Learn about Azure
- Learn how to put up a simple application on to Windows Azure
- Take the Windows Azure virtual lab
- Read about Developing a Windows Azure Application
- View the series of Web seminars designed to quickly immerse you in the world of the Windows Azure Platform
- Why Windows Azure - learn why Azure is a great cloud computing platform with these fun videos
- Dig Deeper into Windows Azure
- Download the Windows Azure Platform Training Kit
Lynn Langit will host and Jon Flanders will present a MSDN Webcast: geekSpeak: Windows Azure AppFabric (Level 200) on 4/7/2010 at 12:00 PM PDT:
In this geekSpeak webcast, industry guru and author Jon Flanders provides details on the caching feature in the recently released Windows Azure AppFabric. AppFabric is a set of integrated technologies that make it easier to build, scale, and manage Web and composite applications that run on Internet Information Services (IIS). Jon shows us how using AppFabric caching capabilities for Web applications provides high-speed access, scale, and high availability to application data. This geekSpeak is hosted by Lynn Langit.
The geekSpeak webcast series brings you industry experts in a "talk-radio" format hosted by developer evangelists from Microsoft. These experts share their knowledge and experience about a particular developer technology and are ready to answer your questions in real time during the webcast.
Presenter: Jon Flanders, Flanders Software Consulting
I’m guessing that whoever wrote the abstract confused Windows Azure AppFabric with Windows Server AppFabric.
Jason Maurer’s CRUD for the Web: OData, GData, and You session description reads: Why do you have to relearn yet another API every time you want to really use someone's data source on the Web? It's time we moved beyond just consuming feeds -- we need full-function data access APIs! That's what the Open Data Protocol (OData) and the Google Data Protocol (GData) aim to do. Learn about these efforts, how they are used, and why you should adopt them for your next web API.
There are many APIs for working with data on the Web available today… way too many. Almost every site or service with an API rolls their own interface to work with their data. This inconsistent approach limits access, creates needless silos, and hinders the evolution of the Web.
We’ve seen great benefits from standardizing Read with RSS and Atom syndication; imagine what we could do if we standardized Create, Update and Delete as well. That’s what the Open Data Protocol and the Google Data Protocol aim to do. Learn about these efforts, how they are used, and why you should adopt them for your next web API.
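The standardized Create, Read, Update and Delete the abstract describes map directly onto HTTP verbs against entity URIs in OData. Here is a minimal Python sketch of that mapping; the service root and entity set are hypothetical, and the function only builds request tuples rather than sending anything over the wire:

```python
def odata_request(operation, service_root, entity_set, key=None, payload=None):
    """Map a CRUD operation to the HTTP verb and URI OData uses.
    Returns (method, url, payload) instead of issuing the request."""
    verbs = {"create": "POST", "read": "GET",
             "update": "PUT", "delete": "DELETE"}
    url = f"{service_root}/{entity_set}"
    if key is not None:
        url += f"({key})"   # entity-by-key addressing, e.g. /Products(1)
    return verbs[operation], url, payload

# Hypothetical update of a single Products entity.
method, url, _ = odata_request("update", "http://example.org/svc", "Products",
                               key=1, payload={"Name": "Chai"})
```

The point of the session, and of OData generally, is that once this mapping is fixed, a single client library can work against any conforming data source instead of a per-site API.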
No significant articles today.