|Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.|
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:
- Azure Blob, Table and Queue Services
- SQL Azure Database, Codename “Dallas” and OData
- AppFabric: Access Control and Service Bus
- Live Windows Azure Apps, APIs, Tools and Test Harnesses
- Windows Azure Infrastructure
- Cloud Security and Governance
- Cloud Computing Events
- Other Cloud Computing Platforms and Services
To use the above links, first click the post’s title to display the single article you want to navigate.
Discuss the book on its WROX P2P Forum.
See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.
Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.
You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:
- Chapter 12: “Managing SQL Azure Accounts and Databases”
- Chapter 13: “Exploiting SQL Azure Database's Relational Features”
HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated in May 2010 for the January 4, 2010 commercial release.
Brad Calder describes future posts that will aid Understanding the Scalability, Availability, Durability, and Billing of Windows Azure Storage in this 5/7/2010 post to the Windows Azure Storage Team blog:
Windows Azure Storage allows the application developers, their applications, and their users to leverage the vast amount of highly available, durable and scalable storage in the cloud. This allows developers to build services that can access their data efficiently from anywhere at any time, store any amount of data and for any length of time, and pay for it as a utility based service based only on what is used and stored. The 4 storage abstractions we provide are:
- Blobs – Provides a simple interface for storing named files along with metadata for the file.
- Drives – Provides durable NTFS volumes for Windows Azure applications to use. This allows applications to use existing NTFS APIs to access a network attached durable drive. The goal is to ease the migration of existing applications using NTFS to the cloud, and provide durability of the volumes on application failover and hardware failures.
- Tables – Provides massively scalable structured storage. A Table is a set of entities, which contain a set of properties. An application can manipulate the entities and query over any of the properties stored in a Table.
- Queues – Provide reliable storage and delivery of messages for an application to build loosely coupled and scalable workflow between the different parts (roles) in an application.
Now that Windows Azure is commercially available, we have gotten questions about how Windows Azure Storage works and how to get the best performance from it. Therefore, over the next couple of months we are going to post a series of blog entries focused on understanding how to best use Windows Azure Storage and its Scalability, Availability, Durability and understanding Billing.
The following outlines the series of posts that are planned (may be posted out of order). We will link each of the posts back here once they are available:
- Storage Abstractions and their Scalability
- What are the Windows Azure Storage abstractions and their scalability targets?
- How to use and scale out access to Blobs?
- How to use and scale out access to Tables?
- How to use and scale out access to Queues?
- How to understand and estimate the amount of Windows Azure Storage Bandwidth, Transactions and Storage Capacity for an application?
- Storage Architecture Overview and Availability
- What does the Windows Azure Storage Architecture look like and how does it provide high availability?
- How does Windows Azure Storage provide Durability?
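Pending the billing post promised in that series, the way the three billed quantities (storage capacity, transactions, and bandwidth) combine can be sketched with a back-of-the-envelope estimator. The rates below are illustrative placeholders I have assumed for the sketch, not official Windows Azure pricing:

```python
# Rough estimator for a monthly Windows Azure Storage bill.
# All rate values are illustrative assumptions, not official pricing;
# the promised billing post is the place to find the real numbers.

def estimate_monthly_cost(stored_gb, transactions, egress_gb,
                          storage_rate=0.15,        # $/GB/month (assumed)
                          txn_rate=0.01 / 10_000,   # $ per transaction (assumed)
                          egress_rate=0.15):        # $/GB transferred out (assumed)
    """Return the estimated monthly bill in dollars."""
    return (stored_gb * storage_rate
            + transactions * txn_rate
            + egress_gb * egress_rate)

# Example: 50 GB stored, 2 million transactions, 10 GB served out.
print(round(estimate_monthly_cost(50, 2_000_000, 10), 2))  # 11.0
```

The point of the sketch is that transactions are billed per request, so a chatty application (many small reads) can cost more in transactions than in capacity.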
Ryan Dunn and Steve Marx presented Cloud Cover Episode 10 - Table Storage API, a 00:42:08 Channel9 video on 5/7/2010:
Join Ryan and Steve each week as they cover the Microsoft cloud. You can follow and interact with the show at @cloudcovershow
In this episode:
- Learn about the Table storage model - PartitionKeys, RowKeys, and more.
- Walk around the API using the StorageClient library and learn how to query the service without first building a data model.
- Hear the latest news and announcements for the platform.
- Troubleshoot a common query error when using Table storage.
Windows Azure Guidance - Failure Recovery (via Eugenio)
SELECT INTO with SQL Azure
Application Infrastructure Virtual Learning
Windows Azure Firestarter videos
Protecting Blobs from Application Errors
Protecting Tables from Application Errors
Table Storage Backup and Restore on CodePlex
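The Table storage model the episode covers keys every entity by a (PartitionKey, RowKey) pair: the pair is unique per table, a point lookup that supplies both keys is the cheapest query, and queries scoped to one PartitionKey stay within a single partition. A toy Python model of just the addressing scheme (not the real service or the StorageClient library) makes the distinction concrete:

```python
# Toy model of Azure Table storage addressing: entities live in
# partitions keyed by PartitionKey, and RowKey is unique per partition.
# A local illustration of the scheme only, not the real API.

class ToyTable:
    def __init__(self):
        self.partitions = {}  # PartitionKey -> {RowKey: entity}

    def insert(self, pk, rk, entity):
        self.partitions.setdefault(pk, {})[rk] = entity

    def point_query(self, pk, rk):
        # Fastest case: both keys given, one partition, one row.
        return self.partitions[pk][rk]

    def partition_scan(self, pk, predicate):
        # Scoped to one partition; the filter runs within it.
        return [e for e in self.partitions.get(pk, {}).values() if predicate(e)]

t = ToyTable()
t.insert("movies-2010", "inception", {"title": "Inception", "rating": 9})
t.insert("movies-2010", "toy-story-3", {"title": "Toy Story 3", "rating": 8})
print(t.point_query("movies-2010", "inception")["title"])               # Inception
print(len(t.partition_scan("movies-2010", lambda e: e["rating"] > 7)))  # 2
```

Queries that supply neither key have to scan every partition, which is why choosing a PartitionKey that matches your dominant query is the central design decision.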
Jerry Huang shows you How to Access Windows Azure Storage using CIFS and his Gladinet CloudAFS (Cloud-Attached File Server) in his 5/5/2010 post:
Windows Azure as a cloud service platform went commercial in January this year. Azure Storage is one of its key offerings, providing pay-as-you-go cloud storage for your organization.
Thinking about how to best leverage the Azure offering for your company? Faced with the challenge of providing access to Azure Storage for a group of Active Directory users? Or perhaps your existing file server needs to be upgraded or replaced. In any case, attaching Azure Blob Storage to your file server and publishing it to your user groups can address these issues.
Gladinet CloudAFS (Cloud Attached File Server) allows you to do just that. It also gives users the ability to access the attached, published cloud storage using CIFS. Here’s how:
Publishing an Azure Folder and Assigning Users
CloudAFS maps cloud storage to virtual directories of a network drive, just like Gladinet Cloud Desktop does. One important difference is that a CloudAFS virtual directory can be accessed by multiple remote clients running Gladinet Cloud Desktop or a CIFS/SMB (native OS support) client. To set it up, the virtual directory, or one of its subfolders, needs to be published. Once the folder has been published, users must be assigned to the folder and granted permission to access it.
Publishing the Azure Folder
After you mount the Windows Azure Blob Storage using your Azure credentials, the publishing wizard can be launched from the Published Folder List window shown below:
The following screen pops up. Choose the Azure folder to be published and name it.
Once published, the folder is accessible as shown in the text below.
Jerry continues with the details for “Assigning Users and Permissions,” “View[ing] Published Shares Using Computer Management” and “Accessing Published Azure Shares.” …
Visit http://www.gladinet.com for more information.
Marcelo Lopez Ruiz points in his 5/7/2010 OData support teaser in Windows Live blog post to a new Windows Live Messenger Connect feature:
In case you missed it, something cool is coming on the "RESTful endpoint" side of things for Windows Live / Messenger Connect (to be honest, I don't know much about it, but I found the article terribly exciting, as I'm an avid user):
From Ori Amiga’s Messenger across the web post of 4/29/2010 to the Inside Windows Live blog:
Earlier today, John Richards and Angus Logan took the stage at The Next Web Conference in Amsterdam where they announced Messenger Connect – a new way for partners and developers to connect with Messenger. Messenger Connect allows web, Windows and mobile app developers to create compelling social experiences on their websites and apps by providing them with social promotion and distribution via Messenger. …
Messenger Connect brings the individual APIs we’ve had for a long time (Windows Live ID, Contacts API, Messenger Web Toolkit, etc.) together in a single API that's based on industry standards and specifications (OAuth WRAP, ActivityStrea.ms, PortableContacts, OData) and adds a number of new scenarios. …
Update: I got some questions about OData and we forgot to mention in the original post (doh!)… our RESTful endpoints will also support OData (www.odata.org). We’ll share more info as we roll out the bits.
David Robinson continues pumping posts to the SQL Azure Team blog with Tech Ed 2010 – June 7-10 & SQL Azure of 5/7/2010:
Interested in learning firsthand about SQL Azure? Come meet our team at Microsoft TechEd North America. We will be presenting at six sessions and hosting an open lab so that you can experiment with SQL Azure.
DAT209: What’s New in Microsoft SQL Azure
SQL Azure provides a highly available and scalable relational database engine in the cloud. In this demo-intensive and interactive session, learn how to quickly build Web applications with SQL Azure Databases and familiar Web technologies. Patric McElroy demonstrates several new enhancements we have added to SQL Azure based on the feedback received from the community since launching the service earlier this year.
COS305: Microsoft SQL Azure Development Best Practices
Rick Negrin will be covering best practices for using the SQL Azure cloud relational database. He walks through the creation of a departmental application from scratch. See firsthand how easy it is to provision a SQL Azure Database and start developing against it. We also look at importing and exporting data, and reporting. Time is also spent looking at strategies for migrating your existing applications to the cloud so that you are provided with high availability, fault tolerance and visibility to these often unseen data repositories. Finally we see how the reach of the cloud provides you with opportunities to create a new, differentiated class of applications.
COS311: Migrating Applications to Microsoft SQL Azure
Cihan Biyikoglu walks through the creation of a departmental application from scratch. See firsthand how easy it is to provision a SQL Azure Database and start developing against it. He also covers importing and exporting data, and reporting. Time is also spent looking at strategies for migrating your existing departmental applications to the cloud so that you are provided with high availability, fault tolerance, and visibility to these often unseen data repositories. Finally, he shows how the reach of the cloud provides you with opportunities to create a new, differentiated class of applications.
COS13-INT: Database Performance in a Multi-tenant Environment
Microsoft SQL Azure presents unique opportunities with regard to performance. The key distinction is multi-tenancy, where every user database is co-located with multiple other databases on the same machine. Since your database no longer runs on dedicated hardware, the activities of your neighbor databases can impact your performance. When the neighbors are quiet, your database has the luxury of utilizing all the resources on the machine. However, when the neighbors get busy, your database performance may be affected. In this session, Henry Zhang shows you a case study in which we quantify the multi-tenancy impact on performance. He gives you some guidelines to monitor such effects on your database and predict your database performance.
COS07-INT: Using Microsoft SQL Azure as a Datahub to Connect Microsoft SQL Server and Silverlight Clients
In this session, Liam Cavanagh demonstrates how the Sync Framework is used to enable a new series of exciting scenarios made possible by SQL Azure. First, he demonstrates how the Sync Framework is integrated with SQL Server to make it easy to extend data to SQL Azure in only a few clicks. He then shows how customers can drop down to lower levels in the stack to customize the behavior of that data movement without having to rebuild the application from the ground up. Finally, he demonstrates how the Sync Framework is being enhanced to expand offline client-side reach beyond traditional Windows Presentation Foundation/WinForms applications by adding support for Microsoft Silverlight, HTML 5, Windows Phone 7, and other devices such as the iPhone.
COS208: Building Engaging Apps with Data Using Microsoft Codename “Dallas”
Jamie Thomson asks Is SQL Azure a newbies springboard? and suggests Microsoft waive charges for new SQL Azure/Server users in this 5/6/2010:
Earlier today I was considering the various SQL Server platforms that are available today and I wondered aloud,
Let me explain. My first experience of development was way back in the early 90s when I would crank open VBA in Access or Excel and start hammering out some code, usually by recording macros and looking at the code that they produced (sound familiar?). The reason was simple: Office was becoming ubiquitous, so the barrier to entry was incredibly low and, save for a short hiatus at university, I’ve been developing on the Microsoft platform ever since. These days I spend most of my time using SQL Server.
When I take a look at SQL Azure today, I see a lot of similarities with those early experiences: the barrier to entry is low and getting lower. I don’t have to download any software or actually install anything other than a web browser in order to get myself a fully functioning SQL Server database against which I can ostensibly start hammering out some code, and I believe that to be incredibly empowering. Having said that, there are still a few pretty high barriers, namely:
- I need to get out my credit card
- It’s pretty useless without some development tools, such as SQL Server Management Studio, which I do have to install.
The second of those barriers will disappear pretty soon when Project Houston delivers a web-based admin and presentation tool for SQL Azure so that just leaves the matter of my having to use a credit card. If Microsoft have any sense at all then they will realise the huge potential of opening up a free, throttled version of SQL Azure for newbies to party on; they get to developers early (just like they did with me all those years ago) and it gives potential customers an opportunity to try-before-they-buy.
Perhaps in 20 years time people will be talking about SQL Azure as being their first foray into the world of coding!
David Robinson’s Ready to jump in and learn SQL Azure? post of 5/6/2010 is repeated today because of its importance, as indicated by its #1 position today on the Google Blogs list for SQL Azure (This OakLeaf post is #6):
Introduction to SQL Azure
In this lab, you will walk through a series of simple use cases for SQL Azure, such as preparing your account, managing logins, creating database objects, and querying your database. Here are the direct links to the various sections:
- Exercise 1: Preparing Your SQL Azure Account
- Exercise 2: Working with Data. Basic DDL and DML
- Exercise 3: Build a Windows Azure Application that Accesses SQL Azure
- Exercise 4: Connecting via Client Libraries
Migrating Databases to SQL Azure
In this lab, you will use the AdventureWorksLT2008 database and show how to move an existing on-premise database to SQL Azure including modifying the DDL and moving data via BCP and SQL Server Integration Services. Here are the direct links to the various sections:
- Exercise 1: Moving an Existing Database to the Cloud
- Exercise 2: Using BCP for Data Import and Export
- Exercise 3: Using SSIS for Data Import and Export
- Known Issues
SQL Azure: Tips and Tricks
In general, working with a database in SQL Azure is the same as working against an on-premises SQL Server, with some additional considerations covered in this lab.
- Exercise 1: Manipulating the SQL Azure firewall via APIs
- Exercise 2: Managing Connections – Logging SessionIds
- Exercise 3: Managing Connections – Throttling, latency and transactions
- Exercise 4: Supportability – Usage Metrics
What is SQL Azure?
Just what is SQL Azure? Join Zach in this brief video for an introduction to SQL Azure exploring what makes SQL Azure just like SQL Server and what makes it different and ready for the cloud.
Alex James posted OData Roundup #3 on 5/6/2010:
Many exciting things have happened in the OData world since our last post.
- We released the source code for our .NET client library on codeplex.
- We announced the OData Roadshow which will visit: New York, Chicago, Mountain View, Shanghai, Tokyo, Reading and Paris in the coming months. See if you can get along.
- The next version of Windows Live Messenger will include a RESTful interface that supports OData.
- We asked whether people thought making OData metadata queryable would be useful.
- Tomorrow we are putting on a free webcast, OData and You: An everyday guide for Architects Featuring Douglas Purdy.
- We released an updated version of the OData Explorer that now works with the RTM version of Silverlight 4.
- And as always lots of things are happening on twitter.
Did you notice that Alex dropped “weekly” from the latest title?
Marcelo Lopez Ruiz delivers Some Performance Notes on Enumerable LINQ Operators in this 5/2/2010 post:
This post is the continuation of Layering enumerators; I wanted to have a much shorter post, but it seemed to me like this would be much more useful if I had all the notes in a single post.
First, some caveats. This post deals with the LINQ operators and friends as they work on IEnumerable<T>, with the extension methods of the Enumerable class. LINQ to Objects, if you will. Now, I'm not an expert on this library in particular, but I've used it in the past, and I'd like to share some notes on how some decisions may impact performance.
Many of the observations change significantly if you work with IQueryable<T> and databases, because they typically have specialized auxiliary data structures at their disposal. They may not support all methods, though.
Still, it's interesting that many of these considerations apply across a variety of software components that ultimately have some notion of an iterator (like XmlReader, for example). Also, some operators may work better on lists, as they have random access and keep a count of elements.
Sometimes you might also see changes in performance characteristics depending on whether you use an overload that takes additional selectors or filters. For example, the implementation of Last() works great on a list, as it just picks the last item, but Last(predicate) ends up consuming its complete input.
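Marcelo's Last() example translates to any language with iterators; a quick Python sketch (standing in for the C# operators, which it does not reproduce exactly) shows why the predicate overload must consume its whole input:

```python
# Python analogue of the Last() vs. Last(predicate) observation:
# taking the last element of a list is a constant-time index, but
# "last element matching a predicate" must walk the entire input.

def last(seq):
    return seq[-1]  # random access: no iteration needed

def last_where(iterable, predicate):
    result, found = None, False
    for item in iterable:  # must visit everything to know which match is last
        if predicate(item):
            result, found = item, True
    if not found:
        raise ValueError("no match")
    return result

calls = 0
def is_even(n):
    global calls
    calls += 1  # count how many elements the predicate actually sees
    return n % 2 == 0

data = list(range(10))
assert last(data) == 9                 # no predicate calls at all
assert last_where(data, is_even) == 8  # visited all 10 items to find this
print(calls)  # 10
```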
Without any further ado, here we go.
Wade Wegner’s Release the hounds – Multicasting with Azure AppFabric post of 5/6/2010 shows you how “to start a job simultaneously across multiple worker roles running in Windows Azure”:
On an email thread today, someone was looking for suggestions on how to start a job simultaneously across multiple worker roles running in Windows Azure. For example, imagine you have ten worker roles already running and, through the command of an admin or user, you want to “release the hounds!”
Definitely an interesting scenario, and many different ways to approach it. Initial ideas and thoughts centered around using Windows Azure storage tables or blobs – in fact, Steve Marx quickly threw out some pseudocode highlighting a reasonable way to approach the problem:

while (blob.DownloadText() != "RELEASE THE HOUNDS!")
    Thread.Sleep(TimeSpan.FromSeconds(1));
// do the actual work

Then to release:

blob.UploadText("RELEASE THE HOUNDS!");
You could definitely take this approach and have success.
Of course, to me this scenario screamed multicasting with NetEventRelayBinding.
NetEventRelayBinding supports multiple listeners on the same URI, which means that you can have 1 or 1000 worker roles in Windows Azure all listening to the same URI – this gives you the ability to push out events to all listeners, as any message sent by a client gets distributed to all the listeners.
Clemens Vasters sums up NetEventRelayBinding nicely on his blog:
The NetEventRelayBinding doesn’t have an exact counterpart in the standard bindings. This binding provides access to the multicast publish/subscribe capability in the Relay. Using this binding, clients act as event publishers and listeners act as subscribers. An event-topic is represented by an agreed-upon name in the naming system. There can be any number of publishers and any number of subscribers that use the respective named rendezvous point in the Relay. Listeners can subscribe independent of whether a publisher currently maintains an open connection and publishers can publish messages irrespective of how many listeners are currently active – including zero. The result is a very easy to use lightweight one-way publish/subscribe event distribution mechanism that doesn’t require any particular setup or management.
So, the architecture might look something like this:
In this scenario, an admin sitting on a laptop can send a message to the Service Bus, which in turn relays the message to all the listeners. When the worker roles receive the message they will “release the hounds” and process whatever it is they need to process.
Note: this approach is just as valid for listeners that don’t reside in Windows Azure. For example, if you have an application that is distributed across PCs and you want to send every client a message (without implementing some form of polling) this is the perfect approach.
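The fan-out semantics Wade and Clemens describe can be sketched in-process; this plain Python model only illustrates the one-to-many delivery (the Service Bus relay, the URI rendezvous, and the WCF binding itself are not modeled):

```python
# Minimal in-process sketch of the publish/subscribe shape that
# NetEventRelayBinding provides over the AppFabric Service Bus: any
# number of listeners on one "URI", every published message reaches
# all of them, and publishing with zero listeners is still fine.

class EventTopic:
    def __init__(self, uri):
        self.uri = uri
        self.listeners = []

    def subscribe(self, callback):
        self.listeners.append(callback)

    def publish(self, message):
        for listener in self.listeners:  # one-way fan-out to every subscriber
            listener(message)

topic = EventTopic("sb://example-namespace/hounds")  # hypothetical address
received = []
for worker_id in range(3):  # three "worker roles" listening on the same topic
    topic.subscribe(lambda msg, w=worker_id: received.append((w, msg)))

topic.publish("RELEASE THE HOUNDS!")
print(len(received))  # 3 -- every listener got the message
```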
So, without further ado, here’s the code to release the hounds!
Wade continues with “Comments on the code.”
See Vittorio Bertocci announced June 1st-2nd, WIF Workshop in Redmond: Sign up before we run out of seats! on 5/6/2010 in the Cloud Computing Events section below.
Mike McKeown added a Windows Server AppFabric topic to the TechNet Wiki on 4/23/2010:
Windows Server AppFabric extends Windows Server to provide enhanced hosting, management, and caching capabilities for Web applications and middle-tier services. The AppFabric hosting features add service management extensions to Internet Information Services (IIS), Windows Process Activation Service (WAS), and the .NET Framework 4. This includes Hosting Services and Hosting Administration tools that make it easier to deploy, configure, and manage Windows Communication Foundation (WCF) and Windows Workflow Foundation (WF) based services. The AppFabric caching features add a distributed, in-memory object cache to Windows Server that makes it easier to scale out high-performance .NET applications, especially ASP.NET applications.
The wiki has entries under the following topics:
- Windows Server AppFabric Concepts
- Windows Server AppFabric Development
- Windows Server AppFabric Frequently Asked Questions (FAQ)
- External Links
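The distributed cache Mike describes is typically used in a cache-aside style: check the cache first, fall back to the data store on a miss, and populate the cache for the next reader. A minimal sketch with a local dict standing in for the distributed tier (the actual AppFabric caching client API is not shown):

```python
# Cache-aside sketch: the usage pattern a distributed in-memory cache
# such as AppFabric Caching enables. A local dict stands in for the
# distributed cache tier; the real client API is not modeled here.

cache = {}
db_reads = 0

def load_from_database(key):
    global db_reads
    db_reads += 1  # simulate an expensive back-end read
    return f"value-for-{key}"

def get(key):
    if key not in cache:  # miss: fetch from the store and populate the cache
        cache[key] = load_from_database(key)
    return cache[key]

get("user:42")
get("user:42")  # second call is served from the cache
print(db_reads)  # 1
```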
tbTechnet delivers a list of Channel9 resources for Windows Azure in his Stay Seated: Windows Azure Platform Channel 9 Videos with Workshop post of 5/7/2010:
My colleague, John M. sent these over – some really great Channel 9 videos that cover Windows Azure Platform overviews, introductions and real-world developer interviews:
Building an Azure Business Practice
- The Business Opportunity Behind Azure
- Windows Azure and Partner Opportunities
- Helping Customers Move Forward with Azure – One Partner’s Perspective
- Interview with Hedgehog Development - Azure, the Microsoft Partner Network, and BizSpark
- A TELLUS Perspective on Cloud Computing
- Neudesic Migrates Quark Promote to Windows Azure
- BizSpark Startup Linxter Launches Azure Based MonitorGrid
- While you’re at it, get some hands-on experience yourself by joining a 2-hr hands-on workshop:
- @Home with Windows Azure
- A 2-hr virtual hands-on workshop to guide you through the process of building and deploying a large-scale Azure application. No cost to attend; each attendee receives a temporary, self-expiring, full-access account to work with Azure for a period of two weeks.
Maarten Balliauw delivers his latest PHP-on-Azure slide deck in Linuxwochen Austria of 5/7/2010:
Abstract: “Ever wanted to get started with PHP development on Windows? This session covers the basics of running PHP on the Windows platform and will help getting your development environment ready.”
Thanks for being in this session! I know it is a controversial one on a Linux event :-)
David Robinson announced the availability of Clinic 10332: Introduction to Microsoft SQL Azure in this 5/6/2010 post:
This self-paced, free learning clinic from Microsoft Learning will introduce you to SQL Azure. The clinic covers the following topics:
- Understanding the SQL Azure Platform.
- Designing Applications for SQL Azure.
- Migrating Applications to SQL Azure.
- Achieving Scale with SQL Azure.
Total time: 2 hours.
Mel Beckman is interviewed by Karen Forster for 00:25:03 on 5/6/2010 in TechNet Radio: Windows Azure: A Non-Microsoft Perspective:
Mel Beckman, an open source guru famous for saying, “I don’t do Windows,” talks with Karen Forster about why he is finding Windows Azure to be an excellent platform for distance learning and developing on PHP. Windows Azure's scalability makes it well suited for online, interactive user access, where workloads can vary dramatically with the time of day.
A good example of this application is distance learning, where it can be challenging to host compute-intensive, online courseware locally, while still delivering performance needed for effective instruction. Educators are experts on course content using popular, open source tools like Moodle, but not necessarily proficient at building scalable, network infrastructures for multi-user web applications. Mel Beckman, senior technical editor for Penton Media's technology group, and a specialist in secondary school educational networks, discusses Windows Azure in the context of distance learning with Karen Forster, Vice President of Platform Vision (www.platformvision.com).
Click here to learn more about Microsoft Cloud Services
Geva Perry expresses surprise that Gartner: By 2012, 20% of businesses will own no IT assets. Huh? hasn’t been questioned by analysts and thought leaders in his 5/6/2010 post:
Back in January, Gartner announced its annual predictions for 2010 and beyond. One of the key predictions was that "by 2012 20 percent of businesses will own no IT assets". Here's the excerpt from the press release, which has been widely quoted around the web:
“By 2012, 20 percent of businesses will own no IT assets. Several interrelated trends are driving the movement toward decreased IT hardware assets, such as virtualization, cloud-enabled services, and employees running personal desktops and notebook systems on corporate networks. The need for computing hardware, either in a data center or on an employee's desk, will not go away. However, if the ownership of hardware shifts to third parties, then there will be major shifts throughout every facet of the IT hardware industry. For example, enterprise IT budgets will either be shrunk or reallocated to more-strategic projects; enterprise IT staff will either be reduced or reskilled to meet new requirements, and/or hardware distribution will have to change radically to meet the requirements of the new IT hardware buying points.”
The more I thought about this one, the more it struck me as a very odd statement. Don't get me wrong -- I understand where they are coming from and agree with the explanation and the trend, but some things just don't make sense in their wording.
"20 percent" - compared to what percentage today?
"Businesses" - What businesses? The bodega on the corner of Broadway & 89th? The guy who works on my lawn?
"IT assets" - They probably mean IT assets in the data center because aren't personal desktops and notebooks also IT assets?
Although this prediction was widely quoted, no one seems to have questioned it in any way or asked for a clarification.
Very weird. Peter Silva’s CloudFucius Ponders: High-Availability in the Cloud post of 5/7/2010 to f5.com quotes the Gartner report without making a judgment as to its veracity.
Tim O’Reilly asserted the following in his The State of the Internet Operating System post of 3/29/2010:
[I]t's easy to jump to the conclusion that "cloud computing" platforms like Amazon Web Services, Google App Engine, or Microsoft Azure, which provide developers with access to storage and computation, are the heart of the emerging Internet Operating System.
Cloud infrastructure services are indeed important, but to focus on them is to make the same mistake as Lotus did when it bet on DOS remaining the operating system standard rather than the new GUI-based interfaces. After all, Graphical User Interfaces weren't part of the "real" operating system, but just another application-level construct. But even though for years, Windows was just a thin shell over DOS, Microsoft understood that moving developers to higher levels of abstraction was the key to making applications easier to use.
But what are these higher levels of abstraction? Are they just features that hide the details of virtual machines in the cloud, insulating the developer from managing scaling or hiding details of 1990s-era operating system instances in cloud virtual machines? …
The full contents of this post formed the foundation of his eponymous Keynote speech of 5/6/2010 to the Web 2.0 2010 conference in San Francisco. Read State of the Internet Operating System Part Two: Handicapping the Internet Platform Wars of 4/30/2010.
No significant articles today.
tbtechnet encourages you to register as a Microsoft Partner to take advantage of Azure technical and sales training in his Windows Azure – Getting in on the Act post of 5/6/2010:
There is a large amount of new technical and sales training content emerging about Windows Azure.
Plus, Microsoft partners are getting in on the act with some upcoming webinars.
- There is a brand new virtual lab designed for developers who work with PHP and who would like to be able to run PHP applications on the Microsoft Windows Azure platform: Deploying/Running PHP on Azure Virtual Lab.
- This lab complements some newer PHP on Windows Azure Platform Web seminars and quickstart guide here.
- David Makogon at RDA Corp is presenting a Webinar - How to Setup and Monitor a Basic Azure Application on May 12th. David is the Technical Lead of RDA’s Architecture Guidance Practice Group, and a Microsoft Solutions Advocate.
To Register: http://www.clicktoattend.com/invitation.aspx?code=147728
- For companies that are not yet in the Microsoft Partner Network (MPN), join here – it’s no-cost and there are a ton of benefits that will help you grow revenue and gain technical expertise.
- Once in MPN, you can then access this:
- Windows Azure Platform Technical Training
This technical learning path will help you learn what Windows Azure is, and what is provided by the Azure Services Platform.
- Windows Azure Platform Sales Training
This learning path will provide you with an overview of Windows Azure and how Windows Azure can help drive new business.
Vittorio Bertocci announced June 1st-2nd, WIF Workshop in Redmond: Sign up before we run out of seats! on 5/6/2010:
Only few seats left for the only US date of the WIF Workshops megatour I am bringing around the planet! Apply for your seat at the link below:
Tuesday, June 01, 2010 9:00 AM - Wednesday, June 02, 2010 5:00 PM Pacific Time (US & Canada)
Microsoft Main Campus - Platform Adoption Center (Bld 20)
3709 157th Ave NE
Building 20 Redmond Washington 98052
The workshop is 50% sessions, 50% labs: more detailed description here. We have very few seats, as you can imagine, and given the very high demand it is very likely that we won’t be able to accommodate everyone. If you are already a WIF expert please don’t be bummed if we don’t manage to get you in! We are going to record the presentations in video, and we’ll share them on Channel9/in the training kit shortly after, but of course the live class is the chance to have your questions answered on the spot; that makes all the difference for someone who is ramping up.
The peripatetic Vibro continues with details of his recent European and future Asian journeys.
SearchCloudComputing.com delivers a rogues’ gallery of the Clouderati in their Top 10 cloud computing leaders post of 5/7/2010:
Cloud computing could not have emerged as the fastest growing trend in IT today without a cavalcade of forward-thinking people powering its rise. Whether they are business executives, chief technology officers or influential bloggers, the direction of the cloud is being steered by several powerful and progressive IT minds.
This list of the top 10 cloud computing leaders, as of May 2010, includes both pioneers and innovators. Our goal in creating it is to emphasize those who have made, and continue to make, a true impact in the world of cloud computing.
No one from Microsoft made the cut, but any scoring algorithm that includes Larry Ellison as a cloud-computing leader is questionable, to be generous. Congrats to Werner Vogels (@werner) at #1, Chris Hoff (@Beaker) at #10, and Sam Johnston (@samj) as co-honorable mention.
Mitch Wagner claims Apple is dead in the cloud in this 5/7/2010 post to ComputerWorld’s Tools blog:
Facing the end of the personal computing business as we know it, Apple is betting the company on a new generation of thin devices like the iPhone and iPad. But Apple is weak in the cloud computing applications that are essential for it to survive and thrive in the new future.
Charles Stross got it close to right last week. Stross is both a talented science fiction writer and insightful observer of the technology industry. In a discussion of Apple's refusal to support Flash on the iPhone and iPad, he writes that Jobs "believes he's gambling Apple's future — the future of a corporation with a market cap well over US $200Bn — on an all-or-nothing push into a new market."
PC prices are falling precipitously, the profit margin is thinning. "Apple has so far survived this collapse in profitability by aiming at the premium end of the market — if they were an auto manufacturer, they'd be Mercedes, BMW, Porsche and Jaguar rolled into one," Stross writes. But that won't save Apple forever. The future is in cloud computing.
“Apple are trying desperately to force the growth of a new ecosystem — one that rivals the 26-year-old Macintosh environment — to maturity in five years flat. That's the time scale in which they expect the cloud computing revolution to flatten the existing PC industry. Unless they can turn themselves into an entirely different kind of corporation by 2015 Apple is doomed to the same irrelevance as the rest of the PC industry — interchangeable suppliers of commodity equipment assembled on a shoestring budget with negligible profit.”
In other words: The battle for future dominance in the computer industry is in the cloud.
Stross is right, but he doesn't take the discussion far enough. …
Mitch continues with an analysis of Apple’s cloud-computing shortcomings and concludes:
To thrive in the cloud-computing future, Apple needs to create brilliant cloud applications, and I don't know if that's in Apple's DNA. Apple is a company with a fanatical need to control every element of its ecosystem, and cloud computing is fundamentally opposed to that approach. Google did not succeed by carefully screening every application built on top of its platform, like Apple did with the App Store. To succeed at cloud computing, companies need to publish APIs and throw the doors open to anyone who cares to use them.
I see three possible futures for Apple:
- Failure. It fails to adapt to cloud computing. Apple is wildly successful today, but fortunes change fast in the technology industry, and the company could be on life support by 2015.
- Apple changes its corporate culture to embrace cloud computing and the openness and loss of control that entails. I see that as unlikely; Apple sees control as vital to its survival. It remembers well the bad old days of the 1990s, when its survival depended on the largesse of Microsoft continuing to support Office on the Mac; it won't put itself in the position of begging for scraps from someone else's table again.
- A miracle occurs. With Apple, that's always a possibility. Other companies fail when they can't play by the rules of industries they enter, but Apple changes industries to suit its preferences.
Bob Warfield references Larry Dignan and an Amazon press release in his Netflix’s Movie Cloud is Moving into the Amazon Cloud report of 5/7/2010 to the Enterprise Irregulars blog:
Netflix has always been an extremely progressive company. I know the founders, Marc Randolph and Reed Hastings, well, and many of the employees too. There is an amazing amount of brainpower behind the scenes there, and it shows in their great products and great story.
I read with interest Larry Dignan’s piece about their usage of Amazon Web Services to move key parts of the Netflix infrastructure into the Cloud. It doesn’t seem that long since I remember being asked to visit Netflix and tell them about my company’s experience moving into the Amazon Cloud. I expected to meet in Reed Hastings’ office with perhaps a couple of people, but was surprised to find they had assembled a small auditorium of developers to hear the story. I spent a little over an hour telling them how we’d done it and answering questions, and then went away.
As an aside, this is how smart companies go to school–by sharing information broadly rather than hoarding it at the top, and by bringing in outsiders who can add to the collective knowledge pool. When was the last time your company did something like this? It’s so easy here in Silicon Valley, which is dense with sharp insights and hard-won experience. Take advantage of it, it’s the least you can do after paying the high cost of living here!
I admit, I wondered whether they’d carry it off or even get started, or whether they were just curious. Moving to the Cloud is a big step for a big thriving company. There are a lot of moving parts that have to be orchestrated for it to be successful. But as I said, they are an extremely progressive company with a lot of very bright people. Color me very impressed with the speed at which they were able to move.
PS Amazon has a press release / case study with more detail on just what Netflix is doing.
Well, at least they deliver their catalog in OData format. Eric Engleman chimes in with his Netflix on Amazon's cloud TechFlash article of 5/7/2010:
It's not a secret that DVD rental company Netflix has parts of its technology infrastructure running on Amazon's cloud (the New York Times just wrote about it last month), but Amazon is now giving more details. According to information released by Amazon today, Netflix has been using Amazon Web Services for more than a year "for both customer-facing and backend applications" and is now "expanding the set of applications that it is migrating" to Amazon's cloud.
It's an interesting partnership given that Netflix and Amazon are competitors in the streaming video market and Amazon has long been rumored to be interested in acquiring Netflix.
Amazon says Netflix will be using its web services to power member movie lists, website search, movie transcoding, and its recommendation system.
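The OData mention above can be made concrete. The sketch below builds an OData query URL using the standard system query options (`$filter`, `$top`, `$orderby`); the base URL, `Titles` entity set, and property names (`AverageRating`, `Name`) are illustrative assumptions modeled on the Netflix catalog feed as it was announced, not details taken from the article.

```python
from urllib.parse import urlencode

# Hypothetical base URL for the Netflix OData catalog's Titles entity set.
BASE = "http://odata.netflix.com/Catalog/Titles"

def odata_query(base, filter_expr=None, top=None, orderby=None):
    """Build an OData query URL from the standard system query options.

    Only the options that are supplied end up in the query string, in
    the order $filter, $top, $orderby.
    """
    params = {}
    if filter_expr is not None:
        params["$filter"] = filter_expr
    if top is not None:
        params["$top"] = str(top)
    if orderby is not None:
        params["$orderby"] = orderby
    # urlencode percent-encodes '$' as %24 and spaces as '+', both of
    # which OData services accept in the query string.
    return base + "?" + urlencode(params)

# Example: the five top-rated titles, alphabetically by name.
url = odata_query(BASE, filter_expr="AverageRating gt 4", top=5, orderby="Name")
print(url)
```

Because OData exposes the catalog as a plain Atom/JSON feed over HTTP, a URL like this is the entire client-side "API"; any HTTP library can fetch and parse the result.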
Reuven Cohen wrote Unlocking the Value of IaaS on 5/7/2010:
Until recently I've been in an odd spot: generally speaking, my biggest competition when a potential customer came to us looking for an Infrastructure as a Service (IaaS) platform was either build-it-yourself (a huge risk) or buy it from us. (Yes, there were a few other competitors.) But for the most part the space was a greenfield; we really didn't have to worry about our competitors because there weren't many, and the ones that were out there were positioned in a significantly different way than we were.
The problem with being first is one of education: most potential customers didn't even realize they needed a cloud platform. The good news is that things are changing; the idea of Infrastructure as a Service is no longer a radical one. IaaS companies are getting funded left and right, and customers are buying. I've long held the notion that an industry isn't real until you have direct competitors. This both proves there is an opportunity and brings broader awareness. A rising tide floats all boats, if you will. In my post today, I thought I'd briefly explore the value proposition of IaaS.
So you ask, what's the value of providing Infrastructure as a Service? From a private-cloud point of view it's mostly about efficiency; the drawback is that you need to spend money to save money, which can be a tough sell in a rough economic climate. From a public cloud context it's about converting costs (fixed to variable, Capex to Opex, etc.). OK, we've heard the story before. It's really about saving money, or not losing it to a more nimble competitor. So the real question becomes: how do you unlock the value of an IaaS cloud, internally, externally or both? For me it's all about the application. …
The biggest value of an IaaS platform is as a stepping stone -- one that allows you to gracefully migrate from the traditional physical, single-tenant infrastructure of the past to the multi-tenant distributed cloud of the future, without requiring you to completely re-architect or rebuild your applications. What IaaS does is remove the need to make your applications multi-tenant by making your infrastructure multi-tenant. If it doesn't accomplish this, then it's not IaaS and there's not a lot of value in it.