Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.
- Azure Blob, Table and Queue Services
- SQL Azure Database (SADB)
- AppFabric: Access Control, Service Bus and Workflow
- Live Windows Azure Apps, Tools and Test Harnesses
- Windows Azure Infrastructure
- Cloud Security and Governance
- Cloud Computing Events
- Other Cloud Computing Platforms and Services
To use the links above, first click the post’s title to display the individual article you want to navigate.
Discuss the book on its WROX P2P Forum.
See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.
Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.
You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:
- Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
- Chapter 13: “Exploiting SQL Azure Database's Relational Features”
HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010.
* Content for managing DataHubs will be added as Microsoft releases more details on data synchronization services for SQL Azure and Windows Azure.
Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.
Cory Fowler’s Working with Windows Azure Development Storage post of 12/23/2009 explains how to set up the local SQL Server database for developing Azure applications in the Development Fabric:
During the Development Cycle it is necessary to connect to a database to ensure that your data is getting stored in Windows Azure Storage Services. Microsoft was nice enough to give us this functionality out of the box so we don’t actually need a Windows Azure account before beginning development on an application. Of course you will need to Initialize your Development Storage Service on your development machine before you get going.
Once you’re done setting up the Development Storage Service you will need to configure the Development Storage in the ServiceConfiguration.cscfg file. You will need to add the … ConfigurationSettings to each Role element in the ServiceConfiguration.cscfg. …
Once you have these configuration settings in place you will be ready to interact with the Windows Azure Development Storage Service on your development machine. Stay tuned for my next blog series, which will describe the differences between Blob Storage, Queue Storage, and Table Storage and how you will go about interacting with the different storage spaces.
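Cory’s configuration step can be sketched as follows. This is a hypothetical ServiceConfiguration.cscfg fragment (the service and role names are placeholders); the DataConnectionString setting name is a convention from the StorageClient sample library rather than a requirement, and the UseDevelopmentStorage=true value points it at the local Development Storage Service:

```xml
<ServiceConfiguration serviceName="MyCloudService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <Instances count="1" />
    <ConfigurationSettings>
      <!-- Points the role's storage calls at local Development Storage -->
      <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
```

The same Setting element, repeated in each Role element, is what Cory’s post refers to.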
André van de Graaf provides a solution to the SQL Azure Migration Wizard. BCP upload process failed: SQLState = 37000, NativeError = 40531 error in this 12/27/2009 post:
This error can be solved by specifying the server name after the login name, separated by @, instead of the login name alone:
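A minimal sketch of the fix (all database, server, and login names are placeholders; the bcp switches shown are the standard -S server, -U login, -c character-mode, and -P password options):

```
REM Fails against SQL Azure with SQLState = 37000, NativeError = 40531:
bcp MyDatabase.dbo.MyTable in data.dat -c -S myserver.database.windows.net -U mylogin -P secret

REM Works: append @ and the server name to the login:
bcp MyDatabase.dbo.MyTable in data.dat -c -S myserver.database.windows.net -U mylogin@myserver -P secret
```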
Alex Handy’s Embarcadero taps into SQL Azure beta article for SDTimes, copied by Synergistics India on 12/24/2009, describes Embarcadero’s free (for a 90-day trial) DBArtisan v8.7.2 tool for SQL Azure:
To enable developers and administrators to better experiment with the SQL Azure beta, Embarcadero Technologies has partnered with Microsoft to release a free version of DBArtisan specifically tailored to help with the move to Azure.
SQL Azure is based upon some SQL Server technologies, but it is being specially crafted for the cloud. Scott Walz, senior director of product management at Embarcadero, said that both developers and Microsoft have questions to be worked out about how a cloud-based database should work.
As such, DBArtisan 8.7.2 does not include optimization features, nor does it offer deep monitoring capabilities. But that is because even Microsoft hasn't yet figured out how to give that type of information or power to users, said Walz.
DBArtisan focuses on migration and query tools. Because this is a free version limited to a 90-day trial, it is also only able to migrate Microsoft SQL Server databases to SQL Azure. Walz said that the eventual commercial version of DBArtisan for SQL Azure will include migration tools for many different types of databases, but because the Azure platform is not complete, the decision was made to include only Microsoft-specific tools this time. …
Embarcadero will need to increase DBArtisan’s feature set considerably to justify a license fee. George Huey’s free SQL Azure Migration Wizard v3.1.1 handles SQL Server <-> SQL Azure migrations nicely.
Sheila Molnar quotes Slavik Markovich, CTO and founder of the database security company Sentrigo, in her Staying Abreast of SQL Server Database Trends in 2010 article of 12/22/2009 for SQL Server Magazine. Under the topic, “2010: The Enterprise Moves Data to the Cloud,” Sheila writes:
… According to Slavik, the “biggest push in 2010 is the move to cloud-based services. Microsoft will push the Azure cloud platform and SQL Azure database services.” A major hurdle will be “how do you protect the data in the cloud environment?” Organizations need to protect data from attacks both from outside and “from your own data administrators, plus your cloud administrators or administrators from the hosting company.” The questions are: “How do you trust them? Or trust but verify that your data is not being accessed or breached?” And “how do you monitor access to the information while it’s kept in the cloud?”
Slavik notes that DBAs have been slow to move data to the cloud because the market hasn’t been ready. “There weren’t good services out there that offered real SQL Server hosting. What you got from Amazon [for example] for their cloud was just basically the platform. And Google of course provided its own database. Smaller companies provided the SQL Server environment, but didn’t provide the whole vision thing. Whereas Microsoft with Azure provides a really strong platform that offers both platform services and higher-level services—SQL Server web services and a path between them.” For more on SQL Azure database services, see Mike Otey’s “7 Facts about SQL Azure,” InstantDoc ID 102766. …
No significant articles yet.
Geoff Nairn claims Siemens signs up for Microsoft cloud in this brief report in the “What’s New” section of the Financial Times for 12/27/2009:
Siemens IT Solutions and Services aims to be among the first service providers to sign up for Azure. It has struck a deal with Microsoft to remotely distribute software updates to its business customers using the Azure cloud computing platform. The service is part of Siemens’ Common Remote Service Platform, which provides remote maintenance of customers’ IT systems.
The preceding item follows an earlier Microsoft unwraps Windows Azure item of the same date:
Microsoft has finally taken the wraps off its Windows Azure cloud computing platform, which was first announced a year ago. From January 1, Azure will go live as a commercial offering although Microsoft will not start charging customers until February. Microsoft argues that, being based on Windows, Azure is easier to manage than existing cloud platform offerings from the likes of Google or Amazon.
Is Windows Azure becoming a mainstream news item? Nick Eaton included an item entitled “Cloud computing takes center stage” in his 10 biggest Microsoft stories of 2009 article in the SeattlePI Blog:
Microsoft unveiled Windows Azure at PDC08 and announced its commercial availability at PDC09. But what the heck is it? An operating system in the cloud, that exists nowhere and everywhere? And why the heck do we need it? The answer to all those questions is simple: the future.
Microsoft won't start charging for the Azure Platform until February, but it's already got hundreds of developers using the service. With Windows Azure, companies and individuals can create and manage cloud-based applications that people access via the Web. It lets clients scale up and scale back their server use as needed, and can slash a company's IT costs.
Abel Avram reports Information Can Be Sold and Bought in “Dallas” in this 12/24/2009 review of Codename “Dallas” and Microsoft Pinpoint for InfoQ:
Microsoft’s service codename “Dallas” is an information marketplace bringing together data, imagery and service providers and their consumers, facilitating information exchange through a single point of access.
“Dallas” has been built with and on top of the Windows Azure platform; the service consists of three main components:
- Information Discovery – discovering and consuming structured and blob datasets available to any application on any platform.
- Brokerage – facilitates partnership between information providers and organizations interested in consuming it.
- Analytics and Reporting – provides data analysis and reporting.
Data can be accessed through a REST-based API, but C# proxy classes provide an object model facilitating access from within any .NET program. Services provide their data as ATOM feeds or in a tabular form. Data can also be loaded into Excel through PowerPivot. “Dallas” will soon provide data through SQL Server and SQL Azure queries.
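Since “Dallas” services expose their data as ATOM feeds over REST, consuming one from outside .NET reduces to ordinary feed parsing. A minimal sketch using Python’s standard library follows; the feed content here is invented for illustration, and real “Dallas” endpoints and payloads will differ:

```python
import xml.etree.ElementTree as ET

ATOM_NS = "{http://www.w3.org/2005/Atom}"

# A made-up ATOM feed standing in for a "Dallas" dataset response.
feed_xml = """<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Sample Dataset</title>
  <entry><title>Row 1</title><id>urn:row:1</id></entry>
  <entry><title>Row 2</title><id>urn:row:2</id></entry>
</feed>"""

def entry_titles(xml_text):
    """Return the title of each <entry> element in an ATOM feed."""
    root = ET.fromstring(xml_text)
    return [e.find(ATOM_NS + "title").text
            for e in root.findall(ATOM_NS + "entry")]

print(entry_titles(feed_xml))  # ['Row 1', 'Row 2']
```

In a real client the feed_xml string would instead be fetched over HTTP from the dataset’s REST URL.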
Eric Nelson’s SyncToBlog #7 Windows Azure Platform links post of 12/24/2009 is a list of resources for Windows Azure developers in the following categories:
- Recent Windows Azure articles of interest to developers
- A few case studies
- Plus some of the “must remember to read” Azure Blogs
The OakLeaf blog isn’t (yet) on Eric’s “must remember to read” list.
John Savageau predicts A Cloudy Future for Networks and Data Centers in 2010 in this 12/24/2009 post:
The message from the VC community is clear – “don’t waste our seed money on network and server equipment.” The message from the US Government CIO was clear – the US Government will consolidate data centers and start moving towards cloud computing. The message from the software and hardware vendors is clear – there is an enormous investment in cloud computing technologies and services.
If nothing else, the economic woes of the past two years have taught us we need to be a lot smarter on how we allocate limited CAPEX and OPEX budgets. Whether we choose to implement our IT architecture in a public cloud, enterprise cloud, or not at all – we still must consider the alternatives. Those alternatives must include careful consideration of cloud computing. …
John is President of Pacific-Tier Communications.
David Aiken announces a New [Azure] Training kit for the Holidays in this detailed post of 12/23/2009, which includes a catalog of its:
- Hands On Labs
- Presentations and Videos
Lynn Langit (@llangit, a.k.a. SoCalDevGal)’s Developing Windows Azure applications – Setup post of 12/23/2009 explains how to get started with Windows Azure projects in Visual Studio 2008 SP1 or 2010 November 2009 CTP:
I’ve been working with Windows Azure (post PDC09 build) to take a look at the basic mechanics of coding and deploying applications. To that end, I wanted to share what I’ve learned so far in this post.
First, I’ll talk about what you need to download and install to get started. There are several categories of items to get in order for you to start developing. Of course what you’ll need is dependent on what you intend to build.
I initially wanted to build ASP.NET C# applications that either used Windows Azure storage (i.e. table, blob or queue) or that used SQL Azure storage (i.e. RD[B]MS-like tables). I’ll remind you that other application development languages are supported, such as PHP. Also if you were simply using SQL Azure as storage, there is no requirement that the front-end application actually be a Windows Azure application. …
Lynn continues with detailed descriptions of the tools required to create Azure projects, the account (token) acquisition process, and creating a sample to-do application.
Schematic on 12/18/2009 deployed TwittZURE, a Windows Azure/Silverlight 3 sample application that displays Twitter search results and timelines by taking advantage of the new Twitter API:
Clicking the Install Now button performs a Click Once installation of the out-of-browser version.
Schematic is an “interactive agency that creates branded experiences” from offices in Los Angeles, New York, Atlanta, Austin, Minneapolis, San Francisco, San Jose (Costa Rica), and London (UK).
The Silverlight Team Blog posted on 12/23/2009 a Case Study - TwittZure Silverlight Twitter Client on Azure, which describes the project:
… Schematic used the rich internet application capabilities of Silverlight 3 and the responsiveness of the Windows Azure cloud platform to build a cutting-edge Twitter application, the recently released beta version of Twittzure.
TwittZure enables users to access key Twitter features by providing an engaging and sleek user interface to interact with friends and colleagues through the Twitter public APIs. TwittZure takes advantage of a highly available and scalable cloud server platform by using Windows Azure to serve the application and as a bridge between the application and the Twitter APIs. …
Why another Twitter client if there are so many around? TwittZure is also a unique showcase and proof of concept for several emerging technologies and platforms that are integrated into the application and are relevant for Schematic and our clients. TwittZure integrates not only the Twitter user and search REST public APIs, but also uses Windows Azure, Microsoft’s cloud platform, to host the application, providing a highly available and scalable bridge between the application and Twitter’s REST services. …
Anthony Baker’s TwittZure Showcased at Schematic.com post of 12/18/2009 offers a brief history of TwittZure’s development. According to the case study, Anthony is a Software Architect in the Microsoft Platforms Group and was TwittZURE’s Project Lead, Application Architect, and Developer.
My OakLeaf Blog Analytics for November 2009 will be of more interest to me than anyone else, but I intend to post statistics monthly to archive readership trends.
Dion Hinchcliffe explains How Cloud Commoditization Will Transform Enterprise Computing in this 12/23/2009 post to the Enterprise Irregulars blog:
The announcement last week of Amazon’s new EC2 Spot Instances was more than just another move considerably ahead of the rest of the industry by that forward-looking cloud computing leader. Spot Instances also heralds the beginnings of a real trading market for cloud computing resources of all kinds, whether that is storage, support services, real compute power, or even human capital (as in on-demand crowdsourcing.)
Up until now — indeed, still today except for Amazon — you basically had to pay fixed “retail” amounts according to the publicly posted prices from a bulk vendor or refer to the rates listed in your negotiated contract with a private supplier. Now in the new spot market (live price ticker here) you can just look at the current price of unused cloud capacity and if your bid is the highest, it’s all yours. Of course, this is only available in Amazon’s cloud at the moment and it’s just for EC2, their compute cloud. But you can count on this expanding to their other services as well and for competitors to respond, which is where it gets a lot more interesting.
Admittedly, the days of the fixed price retail cloud are far from over. It will take a while to grow broader interest in treating the unused and available portions of the cloud as a commodity. It might even take longer for buyers to start thinking of compute resources in all its forms as an instantaneous product to consume, something that up until now still looked more like long-term fixed investments in 3rd party pooled resources than rapidly fluctuating units of exchange with highly dynamic value and price. …
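The bidding mechanism Dion describes — your capacity runs only while your bid covers the fluctuating spot price — can be sketched as a toy model. This is an illustration of the idea, not Amazon’s actual clearing algorithm, and the prices are invented:

```python
def runs(bid, spot_price):
    """A spot instance keeps running only while the bid covers the current spot price."""
    return bid >= spot_price

# Hourly spot prices (invented numbers, USD per instance-hour).
prices = [0.030, 0.042, 0.055, 0.038]
bid = 0.040

# Only the hours where the bid met the market are run (and billed).
billed = [p for p in prices if runs(bid, p)]
print(billed)  # [0.03, 0.038]
print(sum(billed))  # total charge for the two hours that ran
```

The interesting economic consequence, as the post notes, is that compute becomes a fluctuating commodity rather than a fixed retail purchase.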
Click the graphic above to read the original eBizQ post.
Frances Karamouzis predicts “By 2012, India-centric IT services companies will represent 20% of the leading cloud aggregators in the market (through cloud service offerings)” in her initial (and untitled) Gartner blog post of 12/23/2009:
As we approach the end of the year, I thought I’d take the opportunity to launch this blog with a question and, hopefully, an engaging discussion about the extensively hyped topic of Cloud. As a Gartner analyst, clearly our organization has put forth lots and lots of research regarding Cloud Computing. However, I want to focus on “IT Services” and the Cloud.
Within Gartner, we are referring to this as Cloud enabled Services or Cloud enabled Outsourcing — or just Cloud Services for short. Regardless of the name — the question on the table is how much money, marketshare, channel mastery and MOST importantly in the first 18 months (how much MINDshare) will be commanded by IT services companies (whether its some of the traditional providers like Accenture, Capgemini, ACS, CSC, or some of the Indian vendors (TCS, Wipro, Infosys, Cognizant, HCL, and the other 40+ Indian vendors OR other offshore vendors (Softtek, Neoris, EPAM, CPM Braxis, iSoftstone, VanceInfo, etc. or some of the smaller emerging providers like AppsLabs, Appirio, Tory Harris etc.)) [Emphasis by the author.]
Please weigh in with your thoughts, comments, challenges …
BCS, the UK’s Chartered Institute for IT asserts Cloud computing 'will dominate focus in 2010' in this 12/24/2009 post:
Cloud computing will dominate the focus of the industry in the new year, a new study has revealed.
In a survey of the chief information officers and chief technology officers of several leading companies by Logicalis, the main technology trends of 2010 were found to be Web 2.0 and Cloud computing.
The latter, which was the second most popular term mentioned by the respondents after Web 2.0, was seen as important and popular due to its flexible and scalable architecture.
Commenting on the findings, Adam Bosnian, vice president of products, strategy and sales at security management firm Cyber-Ark, said: 'Almost any size of organisation can use public or private cloud resources and enjoy significantly enhanced economies of scale.'
'Even if an organisation uses private cloud resources … where the server storage environment is effectively outsourced to the service provider - there are still economies of scale to be had.'
Chris Hoff (@Beaker) issues a challenge to the “select few who ignore issues brought to light and seem to suggest that Cloud providers are at a state of maturity wherein they not only offer parity, but offer better security than the ‘average’ IT shop” in his The Great Cloud Security Challenge: I Triple-Dog-Dare You… post of 12/27/2009:
There’s an awful lot of hyperbole being flung back and forth about the general state of security and Cloud-based services.
I’ve spent enough time highlighting both the practical and hypothetical (many of which actually have been realized) security issues created and exacerbated by Cloud up and down the stack, from IaaS to SaaS.
It seems, however, that there are a select few who ignore issues brought to light and seem to suggest that Cloud providers are at a state of maturity wherein they not only offer parity, but offer better security than the “average” IT shop.
What’s missing is context. What’s missing is the very risk assessment methodologies they reference in their tales of fancy. What’s missing is that in the cases they suggest that security is not an obstacle to Cloud, there’s usually not much sensitive data or applications involved. (Author’s emphasis.) …
Chris details his challenge and concludes:
I’m all for evangelism, but generalizing about the state of security (in Cloud or otherwise) is a complete waste of electrons.
Chris Hoff (@Beaker)’s How Many Open Letters To Howard Schmidt Do We Need? Just One post of 12/23/2009 includes a brief message to the newly appointed “Cyber-Security Czar:”
I’ll keep it short.
Let me know how we can help you be successful; it’s a two-way street. No preaching here.
and ends with an offer to volunteer his services:
If Howard called me tomorrow and asked me to quit my job and make sacrifices in order to join up and help achieve the lofty tasks before him for the betterment of all, I would. [Emphasis Beaker’s.]
If I were Howard, I’d take Beaker up on his offer.
Eric Chabrow reports “Lawmakers Seek to Give More Power to White House Infosec Adviser” in his Cybersecurity "Czar" Hubbub Continues post of 12/23/2009:
Don't expect the hullabaloo surrounding the cybersecurity "czar" to vanish despite the appointment Tuesday of Howard Schmidt as the White House cybersecurity coordinator.
Since President Obama announced in late May he would appoint a cybersecurity coordinator, much of the hubbub focused on who that person would be and - as the months rolled by - when the appointment would be made. That's been settled.
But the fact that Schmidt reports not to the president, but to National Security Adviser James Jones, and that the post doesn't require Senate confirmation, bothers some influential lawmakers who believe the job should be situated higher on the White House organizational chart.
While praising the naming of Schmidt, Sen. Joseph Lieberman said he will introduce legislation early next year to require the White House cybersecurity adviser be confirmed by the Senate. The Connecticut Independent Democrat who chairs the Senate Homeland Security and Governmental Affairs Committee, in his statement, did not indicate what powers his bill would give the cybersecurity adviser. …
With friends like Sen. Lieberman, you don’t need any enemies.
David Talbot asserts “Information technology's next grand challenge will be to secure the cloud--and prove we can trust it” in his Security in the Ether cover article for the January/February 2010 issue of MIT’s Technology Review magazine:
… Computer security researchers had previously shown that when two programs are running simultaneously on the same operating system, an attacker can steal data by using an eavesdropping program to analyze the way those programs share memory space. They posited that the same kinds of attacks might also work in clouds when different virtual machines run on the same server.
In the immensity of a cloud setting, the possibility that a hacker could even find the intended prey on a specific server seemed remote. This year, however, three computer scientists at the University of California, San Diego, and one at MIT went ahead and did it (see "Snooping Inside Amazon's Cloud" in above image slideshow). They hired some virtual machines to serve as targets and others to serve as attackers--and tried to get both groups hosted on the same servers at Amazon's data centers.
In the end, they succeeded in placing malicious virtual machines on the same servers as targets 40 percent of the time, all for a few dollars. While they didn't actually steal data, the researchers said that such theft was theoretically possible. And they demonstrated how the very advantages of cloud computing--ease of access, affordability, centralization, and flexibility--could give rise to new kinds of insecurity. …
At Google, we operate many data centers around the world, each of which contains a large number of computers linked to one another in clusters. In turn, the data centers are linked through a high-speed private network. These data centers support applications and services that users can access over the public Internet to tap into virtually unlimited computing power on demand, a process known as cloud computing (see "Security in the Ether"). Amazon, IBM, Microsoft, and others are implementing and experimenting with similar systems. Currently, these clouds operate in isolation, communicating only with users. But I think we need to start developing interfaces so that clouds can communicate directly among themselves.
An integrated cloud would have a number of advantages. Users may wish to move data from one cloud to another without having to download all their data and then upload it again. Or users may want to store the same data in multiple clouds for backup. In this case, reliable mechanisms for synchronizing data across different clouds would be useful. Some may wish to do coördinated computation in multiple clouds.
How can a program running in one cloud reference data in another? If one cloud puts restrictions on access to data, how can those controls be replicated in a second cloud? What protocols, data structures, and formats will allow clouds to interact at users' direction and in accordance with their requirements? …
Vinton Cerf is vice president and chief Internet evangelist at Google. In the 1970s and '80s he worked at DARPA, where he is widely credited with developing the Internet.
Graphic Credit: Paddy Mills
No significant articles yet.
See Dion Hinchcliffe’s How Cloud Commoditization Will Transform Enterprise Computing article about Amazon Web Service’s spot pricing in the Windows Azure Infrastructure section.