Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.
• Update 1/23/2010: Lionel Laske: OneNote on iPhone and Palm Pré using Windows Azure; Charles Leadbeater: Let's open up cloud computing; Kevin Jackson: "Shaping Government Clouds" Just Released; Andrea DiMaio: Open Government Directive: A First Wave of Data, or Rather A Trickle?; Jeffrey Schwartz: Microsoft, Intuit Strike Cloud Pact for Small Business; Karsten Januszewski: Downloading and Parsing IIS Logs from Windows Azure; David Sayed: Creating and Publishing a Silverlight Video to Windows Azure; Toddy Mladen: Windows Azure Deployment Stuck in Initializing, Busy, Stopping – Why?; Jamie Thomson: Tweetpoll and RESTful Northwind go bye-bye; Lucas Almeida Romão: Campus Party Brazil: Azure Services Platform;
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:
- Azure Blob, Table and Queue Services
- SQL Azure Database (SADB)
- AppFabric: Access Control, Service Bus and Workflow
- Live Windows Azure Apps, Tools and Test Harnesses
- Windows Azure Infrastructure
- Cloud Security and Governance
- Cloud Computing Events
- Other Cloud Computing Platforms and Services
To use the above links, first click the post’s title to display the post as a single article, then use the links to navigate within it.
Discuss the book on its WROX P2P Forum.
See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.
Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.
You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:
- Chapter 12: “Managing SQL Azure Accounts, Databases, and DataHubs*”
- Chapter 13: “Exploiting SQL Azure Database's Relational Features”
HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the November CTP in January 2010.
* Content for managing DataHubs will be added as Microsoft releases more details on data synchronization services for SQL Azure and Windows Azure.
Off-Topic: OakLeaf Blog Joins Technorati’s “Top 100 InfoTech” List on 10/24/2009.
• Lionel Laske shows you how to port OneNote to the iPhone and Palm Pré using Windows Azure blob storage in this 12/25/2009 post, which I missed when it was published:
This article will show you how you can synchronize OneNote on Windows Azure using the OneNote API and Azure Blob Storage. Then, you'll learn how to develop an ASP.NET application to access it from an iPhone and a WebOS application to access it from a Palm Pré. …
My need is to have permanent access to my OneNote notebooks from a smartphone. At the start of this project "my smartphone" meant an iPhone 3G; by the end of the project, it meant a Palm Pré. So both of these devices are supported here.
OneNote stores all its content locally on your hard drive. Of course, accessing OneNote from anywhere requires synchronizing the content to a shared place, and where better than the Internet to store shared content? So I naturally chose to synchronize OneNote "in the Cloud" and, more precisely, on Windows Azure.
Here is a functional overview of the process.
The previous schema is an overview of the process. The next one is an architecture overview of components used to implement this process.
Lionel continues with a very detailed description of his downloadable application.
Mitch Milam posts links to two Useful Tools for Working with Windows Azure Storage in this 1/21/2010 post:
I wanted to make quick note of two tools that I’ve found invaluable during my recent development efforts.
Allows you to manage your Windows Azure blob storage using a Windows Explorer-like interface:
It’s very useful should you wish to create blob containers or quickly move large files from your file system to Windows Azure.
This is a Windows MMC snap-in that allows you to manage both Windows Azure blobs and queues.
This tool is indispensable for creating and monitoring Azure queues.
The Azure Storage Service provides the ability to store blobs, tables and queues in the Azure cloud. Blobs, tables and queues are identified by URL and could, in theory, be accessed by anyone who knows the appropriate URL. This could be an enormous security hole so the Azure Storage Service requires all access to Azure tables and queues to be authenticated. By default, the Azure Storage Service also requires all access to Azure blobs to be authenticated as well. However, because of the utility of blobs in storing media files, such as images and music, the Azure Storage Service supports various methods of loosening the authentication requirements for blobs and the containers they are located in.
The Azure Storage Service supports the association of an access permission with a container through public access control. This access control allows public read access either to the container and the blobs in it, or merely to the blobs in the container and not to the container itself. The latter would, for example, prohibit unauthenticated listing of all the blobs in the container. The Azure Storage Service also supports shared access signatures, which can be used to provide a token allowing unauthenticated users a time-limited ability to access a container or the blobs in it. Shared access can be further managed through a container-level access policy. …
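The token in a shared access signature is just an HMAC-SHA256 over a newline-delimited "string to sign," computed with the storage account key. The following Python sketch illustrates the general pattern only; the account name, key, field order, and query-parameter names here are illustrative assumptions (the exact string-to-sign varies by storage service version), not a drop-in client.

```python
import base64
import hashlib
import hmac
import urllib.parse

def make_sas_query(account_key_b64, string_to_sign, params):
    """Sign the string-to-sign with the (base64) account key and build the query string."""
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode("utf-8")
    return urllib.parse.urlencode(dict(params, sig=sig))

# Hypothetical account key -- not a real credential.
account_key = base64.b64encode(b"not-a-real-key").decode()

# Newline-delimited fields, following the Blob service signing convention:
# permissions, start, expiry, canonicalized resource, signed identifier.
string_to_sign = "\n".join([
    "r",                     # read-only permission
    "2010-01-23T00:00:00Z",  # start time
    "2010-01-24T00:00:00Z",  # expiry time
    "/myaccount/pictures",   # canonicalized container resource
    "",                      # no stored access policy
])

query = make_sas_query(account_key, string_to_sign, {
    "st": "2010-01-23T00:00:00Z",
    "se": "2010-01-24T00:00:00Z",
    "sp": "r",
})
print(query)  # append this to the container URL to grant time-limited read access
```

The resulting query string is appended to the container or blob URL; the service recomputes the same HMAC on its side and honors the request only until the expiry time passes.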
• George Huey released v3.1.4 of his SQL Azure Migration Wizard (SQLAzureMW.exe) to CodePlex on 1/23/2010 with a fix for problems analyzing (parsing) T-SQL scripts from a file. See the updated Using the SQL Azure Migration Wizard v3.1.3 or v3.1.4 with the AdventureWorksLT2008R2 Sample Database post of 1/23/2010.
• My Using the SQL Azure Migration Wizard v3.1.3 with the AdventureWorksLT2008R2 Sample Database post of 1/22/2010 shows you how to work around problems with source tables that include XML indexes or generate T-SQL with three-part column names. The best new feature of v3.1.3 is support for uploading data to or from SQL Azure with SQL Server’s Bulk Copy Protocol (BCP) native mode.
The next-to-last update to the post covers an issue with importing T-SQL script files with the Analyze and Migrate – TSQL File option. Use the Run TSQL without Analyzing – TSQL File option until George Huey issues a fix for the former in v3.1.4.
The last (1/23/2010) update describes an AdventureWorksLT.zip archive from George Huey with an updated T-SQL script and a set of BCP data files that run without errors with v3.1.3’s Analyzing – TSQL File option. To download and run this file, see the Temporary Workaround for v3.1.3 with a Modified AdventureWorksLT.sql File section at the end of that post.
Full disclosure: This is a very lengthy and detailed post, but most of the length is occupied by screen captures.
Reminder: Vote for new or improved SQL Azure features in the SQL Azure Feature Voting Forum.
Rob Collins’ Microsoft SQL Azure – Microsoft Takes SQL Databases to the Clouds post of 1/20/2010 begins:
A new technology from Microsoft SQL Simply click on the market on 1 January 2010. Or perhaps more accurately, a renowned technology Microsoft has released the market. Azure Services is the most important step in Microsoft’s cloud computing. It operates as a complete platform in the cloud computing, storage and hardware to attach the individual systems into an integrated network that can balance the processing load or share resources.
What is SQL Azure? The part that is most relevant for this discussion is how Azure relates to SQL. Microsoft used to call this service SQL Server Data Services and then SQL Services before finally settling on SQL Azure recently. The name change is just another step in the same direction, as SQL Server has already dealt with cloud computing. SQL Azure provides data storage “in the cloud,” like Amazon S3 and many of the Google Apps. A big advantage is that relational SQL queries can be run against the data stored in the cloud, regardless of whether it is structured, semi-structured or unstructured.
Rob’s last sentence is overenthusiastic about SQL Azure’s capability to deal with “semi-structured or unstructured” data; it was SQL Data Services and its predecessor, SQL Server Data Services, that used the Entity-Attribute-Value (EAV) data structure. Rob also could use a bit more whitespace in his articles.
Hilton Giesenow shows you how to set up SQL Azure DataSync in a 00:17:08 How Do I: Integrate An Existing Application With SQL Azure? Part – 1 video segment:
What if you could get all the benefits of distributed, on-premises application databases AND hosted cloud-based databases for a real Software + Services implementation? In this video, Hilton Giesenow, host of The MOSS Show SharePoint podcast (http://www.TheMossShow.com/) shows us how to set up a powerful and easy-to-use synchronisation model between a local Microsoft SQL Server database and a Microsoft SQL Azure database with the Microsoft Sync Framework tools for SQL Azure.
Andrea DiMaio reports The UK Joins the Open Government Data Train in this 1/21/2010 post:
As I do not judge success by numbers but by impact, I cannot really say whether the British version is really better. For instance, the format in which data is available is much more evident on the US site than on the British one, and in both cases data sets are listed alphabetically, which does not make finding the ones you need very easy.
To be fair, the UK government has been working on this for a long time. The Power of Information report and the ensuing Taskforce, as well as the engagement of a figure of the caliber of Sir Tim Berners-Lee, bear witness to the aspiration to get this right. …
A. Balaji’s Diagnostics in AD FS 2.0 post of 1/21/2010 notes that:
AD FS 2.0 RC has a number of enhancements and additions for troubleshooting/diagnostics of failures and errors that occur when processing token issuance requests. …
Starting with RC, AD FS 2.0 events and traces are logged using Event Tracing for Windows (ETW) framework. ETW is a general-purpose, high-speed tracing facility provided by the operating system. For more information on ETW, refer to http://msdn.microsoft.com/en-us/magazine/cc163437.aspx. …
• Andrea DiMaio questions Open Government Directive: A First Wave of Data, or Rather A Trickle? in this 1/23/2010 post to the Gartner blog:
January 22, 2010 was the first deadline for US federal agencies to comply with the Open Government Directive issued on December 9, 2009. In particular, agencies were requested to identify and publish online, in an open format, at least three high-value datasets. These sets must be registered with Data.gov and should not previously have been available online or in a downloadable format.
Looking at Data.gov this morning (Central European Time) reveals a page with a list of new data sets provided by departments and agencies in compliance with the directive. When I visited the site (at 9:00 am CET, which is midnight on the US West Coast), there were 340 data sets (137 of which were indicated as high-value) from 45 departments or agencies. This is a bit more than half of the entities that were supposed to respond (the US government has 15 executive departments, plus the Executive Office of the President, and about 70 independent agencies).
Departments and agencies took very different approaches. Some – such as the Departments of Justice and of Veterans Affairs – provided loads of data, only a few of which were marked as “high-value”. Others – such as the Department of Transportation or NASA – reported exactly three high-value data sets. A few – such as the Nuclear Regulatory Commission or the Office of Personnel Management – provided only one or two high-value data sets. …
Andrea goes on to question what makes certain data sets “high-value” and concludes:
It is clearly too early to judge whether the Open Government Directive is having an impact. Departments and agencies are clearly moving, but the extent to which they are just complying or really leveraging the benefits of open government (see research note U.S. Open Government Directive: What Should Agencies Do? – Gartner login required) is still to be determined. Their Open Government Plans, due by April 7, will be much more revealing in this respect.
• Jeffrey Schwartz’s Microsoft, Intuit Strike Cloud Pact for Small Business post of 1/21/2010 to the Visual Studio Magazine blog adds more detail about the agreement (see below):
After jettisoning its small-business accounting product line late last year, Microsoft is aligning with former rival Intuit to help bring Intuit's large contingent of QuickBooks users into the cloud.
In an agreement between the two companies announced Wednesday, Microsoft's Windows Azure cloud service will become the preferred platform for Intuit App Center, a marketplace of small-business applications and services launched in October 2009.
"We think there's a big opportunity for developers to build on Azure, a best-of-breed application, and then federate that application into the Intuit App Center so they can get the distribution through the channel that we have," said Alex Chriss, director of Intuit's partner platform.
The two companies are aiming to let their respective developers and channels build and sell those applications to small and mid-sized businesses. Intuit App Center only has 40 contributions to date, but Chriss claims it is gaining traction. The Microsoft deal should help facilitate development and cross-selling of Web-based apps based on the QuickBooks platform, he said.
The pact will also let Microsoft developers and channel partners reach out to QuickBooks customers to develop new apps or those that integrate with Microsoft's core offerings such as Exchange and SharePoint.
"It's another example of vendors joining forces to expand portfolios, capabilities, and provide more complete offerings," said Gartner analyst Tiffani Bova.
As part of the agreement, Microsoft made the beta of a Windows Azure SDK for the Intuit Partner Platform available for download. The SDK includes a Security Assertion Markup Language (SAML) gateway that provides single sign-on, said Jamin Spitzer, director of platform strategy at Microsoft. The APIs that provide the data access are built into the SDK, which will be available to developers using Microsoft Visual Studio.
• Karsten Januszewski describes Downloading and Parsing IIS Logs from Windows Azure in this 1/22/2010 post to the VisitMIX blog:
In this post, I’ll tell you how to get IIS logs out of Windows Azure and onto your local machine so you can do analysis. I wrote a program called AzureLogFetcher to enable this process.
I have already written about how to get logging and diagnostics working with Windows Azure so that you can access both your IIS logs and diagnostic information. In this post, I’ll show you how to get this information out of Windows Azure and onto your local machine so you can analyze the logs. I’ll also show some of the queries I run against the IIS logs using the most excellent Log Parser tool, a free program from Microsoft. …
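Karsten queries the fetched logs with Log Parser; if you would rather stay in script, IIS W3C logs are easy to parse by hand because the #Fields: directive names the columns. A minimal sketch, with a fabricated two-line sample log:

```python
def parse_w3c(lines):
    """Parse W3C extended log lines into dicts keyed by the #Fields: directive."""
    fields, rows = [], []
    for line in lines:
        if line.startswith("#Fields:"):
            fields = line.split()[1:]          # column names follow the directive
        elif line.startswith("#") or not line.strip():
            continue                           # skip other directives and blank lines
        else:
            rows.append(dict(zip(fields, line.split())))
    return rows

sample = [
    "#Software: Microsoft Internet Information Services 7.0",
    "#Fields: date time cs-method cs-uri-stem sc-status time-taken",
    "2010-01-22 10:15:03 GET /default.aspx 200 187",
    "2010-01-22 10:15:04 GET /styles/site.css 304 12",
]
rows = parse_w3c(sample)
print(rows[0]["cs-uri-stem"], rows[0]["sc-status"])  # → /default.aspx 200
```

From here, tallies such as requests per URL or average time-taken are a one-line aggregation over the row dicts.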
• David Sayed explains Creating and Publishing a Silverlight Video to Windows Azure in this detailed and illustrated 1/7/2010 post:
In this post, I will describe how to use Microsoft Expression Encoder 3 to encode and publish a Silverlight video player to a Windows Azure storage account. You will need: (1) an active Windows Azure subscription, (2) an active Windows Azure storage service and (3) installed Microsoft Expression Encoder 3.
David is one of the Web’s best sources of information about working with video content stored in Azure and delivered by Silverlight.
• Toddy Mladen asks Windows Azure Deployment Stuck in Initializing, Busy, Stopping – Why? and provides many possible answers in this 1/23/2010 post:
Recently I have seen a lot of posts on the Windows Azure Forum related to deployments cycling between the Initializing, Busy and Stopping states. In this post I would like to summarize the most common reasons why this is happening. Here they are [as abbreviated titles]:
- Missing runtime dependencies (DLLs)
- Using incorrect platform version of a DLL
- Your code or assembly you use requires admin access (elevated privileges)
- Misconfigured DiagnosticsConnectionString in Service Configuration file
- Misconfigured DataConnectionString in Service Configuration file
- Wrong CSDEF and/or CSCFG schemas
- Read from queue or table that doesn’t exist during initialization
- Certificate without exportable private key
- Returning from Run() method in a Worker Role
- Uncaught exception thrown during initialization
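For the two misconfigured-connection-string causes above, the settings in question live in the ServiceConfiguration.cscfg file. A sketch of a well-formed file follows (service and role names are hypothetical; UseDevelopmentStorage=true only works in the local development fabric, so a deployed role needs a real storage account connection string):

```xml
<ServiceConfiguration serviceName="MyService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <Instances count="1" />
    <ConfigurationSettings>
      <!-- In the cloud, replace with a real account connection string,
           e.g. DefaultEndpointsProtocol=https;AccountName=...;AccountKey=... -->
      <Setting name="DiagnosticsConnectionString" value="UseDevelopmentStorage=true" />
      <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>
```

The setting names here must also be declared in the matching ServiceDefinition.csdef, which is why mismatched CSDEF/CSCFG schemas appear as a separate cause in the list above.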
• Jamie Thomson announced Tweetpoll and RESTful Northwind [will] go bye-bye in this 1/23/2010 post to SQLBlog.com:
On 31st January 2010, Windows Azure and SQL Azure will transition to becoming services that you have to pay for, which means that my three small demos hosted up there are going to disappear; hence I thought now would be a good time to review them before they are digitally ground into dust.
- Tweetpoll was a demo that I wrote back in April 2009 and is hosted on Windows Azure at http://tweetpoll.cloudapp.net/. …
- When SQL Azure entered public beta I wanted to build an app that would demonstrate its capabilities so I built http://northwindazure.cloudapp.net/Northwind.svc/. …
- TwitterCache is a lot simpler than Tweetpoll or RESTful Northwind; it is simply a SQL database hosted on SQL Azure.
Jamie’s post provides detailed information about each departing app. RIP. These are examples that prove Tim Anderson’s point below.
Tim Anderson asserts Windows Azure is too expensive for small apps in this 1/22/2010 post:
I’m researching Windows Azure development; and as soon as you check out early feedback one problem jumps out immediately. Azure is prohibitively expensive for small applications.
Here’s a thread that makes the point:
“Currently I’m hosting 3 relatively small ASP.net web applications on a VPS. This is costing about $100 per month. I’m considering transitioning to Azure.
Q: Will I need to have 1 azure instance per each ASP.net application? So if I have 3 web apps, then I will need to run 3 instances which costs about $300 per month minimum, correct?”
The user is correct. Each application consumes an “instance”, costing from $0.12 per hour, and this cost is incurred whenever the application is available.
Amazon also charges $0.12 per hour for a Windows instance; but the Amazon instance is a virtual machine. You can run as many applications on there as you like, until it chokes.
Google App Engine has a free quota for getting started, and then it is charged according to CPU time. If the app is idle, you don’t pay.
In addition, all these services charge extra for storage and data transfer; but in a low-usage application these are likely to be a small proportion of the total.
Summary: Azure’s problem is that it does not scale down in a way that makes business sense. There is no free quota, unless you count what is bundled with an MSDN subscription.
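The forum poster’s numbers are easy to check with a quick back-of-the-envelope at the announced $0.12/hour small-instance rate, assuming each always-on app occupies its own instance:

```python
RATE_PER_HOUR = 0.12       # announced small-instance compute rate, USD
HOURS_PER_MONTH = 24 * 30  # approximate billing month

per_app = RATE_PER_HOUR * HOURS_PER_MONTH  # one always-on instance per app
three_apps = 3 * per_app

print(f"${per_app:.2f}/month per app, ${three_apps:.2f} for three apps")
# → $86.40/month per app, $259.20 for three apps
```

That is before storage and bandwidth charges, so the poster’s “about $300 per month minimum” estimate is in the right ballpark.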
I’ve been fighting the same battle on relative cost since Microsoft announced Windows Azure pricing. See my A Comparison of Azure and Google App Engine Pricing post of 7/19/2009, Lobbying Microsoft for Azure Compute, Storage and Bandwidth Billing Thresholds for Developers of 9/8/2009 and Amazon Undercuts Windows Azure Table and SQL Azure Costs with Free SimpleDB Quotas of 10/5/2009.
Ben Riga’s SQL Azure Lessons Learned: ESRI post of 1/21/2010 begins:
In this episode of Lessons Learned I chat with Rex Hansen of ESRI. Rex works on MapIt, a product for visualizing enterprise data on maps. It was recently released as an on-premises product that enables developers to work with the tabular and spatial data in SQL Server 2008 and integrate that data with maps on ArcGIS Online and Bing Maps.
ESRI has been working to extend that functionality to Windows Azure and SQL Azure. MapIt takes advantage of SQL Azure to consume location-based data. The MapIt spatial data service can be deployed as a role on Windows Azure and provides spatial data capabilities to applications using SQL Azure. This provides a valuable service to folks that miss the spatial data types they were used to using in SQL Server.
Rex walked me through building a Silverlight application with their Silverlight Control Toolkit in Expression Blend. As Rex mentions in the video ESRI has released the source code for their Silverlight toolkit controls. You can find those on CodePlex here: http://ESRISilverlight.codeplex.com/
It sounds like ESRI has some big plans for where they want to take the MapIt product to provide even better integration with SQL Azure.
Ben’s post includes an embedded viewer for his Channel 9: SQL Azure Lessons Learned: ESRI interview.
Phil Wainwright asks Cloud Content Management to Challenge ECM? and answers “Yes” in his 1/22/2010 post to the Enterprise Irregulars blog:
A phrase that instantly intrigued me when I had a pre-brief of Box.net’s new announcement this week (all Techmeme coverage) was the vendor’s newly coined notion of ‘cloud content management.’ VP of marketing Jen Grant (who I’ve interviewed previously here in a podcast) explained that Box is using the term to denote the application of social computing and enterprise 2.0-inspired principles to content management. This gives rise to a dynamic, participative treatment of all forms of content that’s in sharp contrast to the fusty image of conventional enterprise content management (ECM), which many people think of as the place ‘where content goes to die.’
I liked this so much, I Googled the term as I sat down to write this blog entry and, much to my surprise, the top search result was another podcast I published here on ebizQ a few months back called Content Management Moves to the Cloud. That interview is about web content management — in the sense of content that sits on a public-facing website — a category of content that, personally, I see little sense in managing anywhere other than in the cloud. Enterprise content management is different, however. Much of this content is private and confidential to the enterprise, and so it has to be managed in a way that allows control and governance over who gets to see it, when they see it and what they do with it. …
Jim Nakashima explains issues with Windows Azure WCF Add Service Reference Patch and Windows 7 in this 1/21/2010 post:
In general, WCF works correctly on Windows Azure. There is a problem using "Add Service Reference" or svcutil, but we have a patch to work around the problem. The patch is installed in the cloud, and more information about it is here: http://code.msdn.microsoft.com/wcfazure/Wiki/View.aspx?title=KnownIssues (note that a web.config change is also required)
One of the things this prompted a lot of folks to ask is "Where is the Windows 7 version of this patch?"
Well, I'm happy to announce that we have released this QFE for Windows 7 and Windows Server 2008 R2: http://code.msdn.microsoft.com/KB977420
I also recommend that if you are using WCF on Windows Azure, you spend time browsing the content on http://code.msdn.microsoft.com/wcfazure.
You may also be interested in the REST service templates the WCF team has made available on the Visual Studio Gallery: http://visualstudiogallery.msdn.microsoft.com/en-us/842a05c3-4cc8-49d3-837f-5ec7e6b17e80 (this is the .NET 3.5 C# template, there are also .NET 3.5 VB and .NET 4 C# and VB templates)
Note that the REST templates aren't currently directly supported as a Windows Azure Role in the New Project Dialog, but you can easily use that template in a Windows Azure Cloud Service by following the "Using an Existing Project" section of this post.
Ben Riga’s Real World Windows Azure: Interview with Anthony Du Preez, Founder of Tradeslot and Adslot post of 1/21/2010 begins:
As part of the Real World Windows Azure series, we talked to Anthony Du Preez, Founder of Tradeslot and Adslot, about how the companies use the Windows Azure technology platform for their combinatorial auction platform and the benefits that Windows Azure provides. [Links added.]
and continues with a transcript of the interview. Read more about the case study in Combinatorial Auction Provider Scales Up Quickly, Saves Costs with Cloud Services of 11/17/2009:
Tradeslot specializes in designing and building large-scale business-to-business and government-to-business (G2B) auction platforms; its partner company, Adslot, uses the same auction platform to auction online advertising space. The companies use computation-heavy combinatorial auction algorithms to manage their complex conditional bid processes. However, lacking compute power in their existing infrastructure, the companies found it necessary to manually add constraints to auctions. In addition, the companies wanted to reduce the capital costs and IT resources required to set up new server hardware for customers. Tradeslot and Adslot implemented Windows Azure™ and, as a result, reduced capital costs for customers—from U.S.$60,000 to only $1,000 each—improved their ability to scale up and improve services, saved critical IT resources, and delivered a solution their customers can trust.
Read more Windows Azure success stories here. A transcript of another of Ben’s interviews is in his Real World Windows Azure: Interview with Ezequiel Steiner, CEO at Acumatica post of 1/14/2010.
The VAR Guy asks Intuit Plus Windows Azure: Small Business Clouds? in this 1/21/2010 post:
Microsoft and Intuit nearly exchanged wedding vows in 1994. Fast forward to the present, and the two software giants are infatuated with each other again. This time their mutual admiration involves shifting small businesses to cloud computing, Windows Azure and Intuit’s App Center. Here’s where VARs fit in.
First, let’s reminisce a bit. Back in October 1994, Microsoft was crazy for Intuit Quicken and home banking. Ah, to be young and in love. Microsoft ultimately bid $1.5 billion to buy Intuit — but regulatory issues crushed the proposed deal. Easy come, easy go.
Fast forward to the present. The shrinkwrapped software market has lost its sex appeal. Microsoft and Intuit are leaping into the cloud — and climbing under the covers together again. This time around Microsoft and Intuit say they will help…
- developers write small business applications for the Intuit App Center
- channel partners and developers leverage the Intuit Partner Platform and Windows Azure
For VARs that promote Small Business Server, Intuit Quickbooks and other on-premise applications, the message is clear: You better put your toes in the water and give emerging cloud services a try.
Intuit has been transitioning aggressively to SaaS, and Microsoft is set to officially launch Windows Azure in February. Will the Microsoft-Intuit partnership work out this time around? The answer really doesn’t matter. With or without each other, Intuit and Microsoft are marching into the cloud. VARs should at least give their efforts a look.
VAR Guy continues with links to related items.
The Windows Azure Team reports Intuit and Microsoft Join Forces to Spur Innovation and Expand Cloud Opportunities for Developers and Channel Partners on 1/20/2010:
Did you see the announcement made today by Intuit and Microsoft to integrate the capabilities of their cloud services platforms, the Intuit Partner Platform and the Microsoft Windows Azure™ platform, to enable developers and channel partners to deliver web-based solutions to the millions of employees within businesses that use QuickBooks® financial software? This is significant in that it instantly gives Microsoft developers the ability to develop applications for Intuit's loyal customers and will open up to them a new landscape of sales prospects who are looking for small business solutions.
More specifically, the Intuit Partner Platform will give Windows Azure developers the ability to build applications that their customers can use to take advantage of their QuickBooks data, and give them instant access to a pre-built channel for acquiring customers via the Intuit App Center. Small businesses will be able to click from the QuickBooks Pro and Premier 2010 toolbar to visit the Intuit App Center where they can find, buy and use Web-based business applications built on or integrated with the Intuit Partner Platform.
The companies, who have been strategic partners for more than 20 years, expect this to jump-start development and distribution of new, innovative web-based applications to help small businesses meet current business needs and provide new opportunities. The free Windows Azure software development kit (SDK) (Beta) for Intuit Partner Platform is available today at http://developer.intuit.com/azure.
Click here to read the full press release: http://www.microsoft.com/presspass/press/2010/jan10/1-20IntuitDevelopersPR.mspx
The press release doesn’t mention that Microsoft attempted to purchase Intuit many years ago but was foiled by anti-trust concerns. It will be interesting to learn how the integration will affect Intuit’s QuickBase Web-based database offering.
Joseph Hofstader presents a 1:04:34 Channel9 ARCast.TV Special - Designing Multi-tenant Applications on Windows Azure featuring Joseph Hofstader Webcast beginning 1/20/2010:
Cloud computing is one of the hottest topics in information technology today. With all the confusion surrounding acronyms ending in ‘aas’ like Platform as a Service (PaaS), Infrastructure as a Service (IaaS) and Software as a Service (SaaS) it can be intimidating for even seasoned IT professionals. This presentation will briefly discuss the different types of cloud platforms and then address one of the key business scenarios for the cloud: Software as a Service.
Liam Cavanagh describes a New Case Study – GoGrid and Blue Star Infotech in this 1/20/2010 post to the Microsoft Sync Framework blog:
We recently added a new case study of a solution built by GoGrid on the Windows Azure platform to allow customers to develop, test, deploy, and back up Windows Azure applications. GoGrid partnered with Blue Star Infotech, a global provider of IT and Outsourced Product Development services and solutions, to build a solution that provides developers within the Windows Azure environment the ability to make back-ups of their SQL Azure infrastructure to a SQL Server database hosted in the GoGrid environment.
I had a chance to sit down with Mehul Shah the Microsoft Practice Manager with Blue Star Infotech and he explained to me how this was accomplished.
“GoGrid provides a cloud infrastructure and their focus is primarily on start-up ISVs. These customers use the GoGrid infrastructure for both development and deployment. If someone wants to develop on Windows Azure they will be given a pre-configured environment in the GoGrid environment. When looking at additional services GoGrid could provide their customers, they also chose to implement load testing in the Azure staging environment as well as the ability to make back-ups of their SQL Azure infrastructure to the SQL Server GoGrid environment. To help accomplish this, it uses the Sync Framework to move the data between SQL Azure and SQL Server.”
Blue Star Infotech’s background in creating custom providers with the Microsoft Sync Framework proved extremely useful in this implementation.
For more details on this solution, you can see a video of the GoGrid Application Lifecycle solution here.
B. Guptill, M. West and B. McNee coauthored the Cloud “Rip-and-Replace” Deals Suggest Business Solutions Maturing Saugatuck Research Alert of 1/20/2010 (requires site registration):
Cloud business solutions are changing the rules when it comes to even the most traditional business systems and operations. Evidence is mounting, especially in recent weeks, that “rip-and-replace” is emerging as a viable SaaS strategy for some users – and a threat to traditional software.
Google-City of Los Angeles (34,000 seats) and IBM-Panasonic (380,000 seats) are two recent, large deals that have dramatically called attention to the viability of Cloud business solutions in addressing large enterprise computing requirements – with both deals at the expense of Microsoft’s on-premise software. But this is not just a large-enterprise phenomenon, and not just an email and office / collaboration phenomenon. Recent Saugatuck interviews with SME-focused SaaS providers such as Intacct and NetSuite indicate rapidly increasing numbers of “rip-and-replace” strategies that swap traditional, on-premise SME accounting and finance applications for solutions in the Cloud.
These are important signposts that point toward an important and powerful mid-to-longer term trend. The bottom line for IT markets is that “rip and replace” strategies are becoming more of a reality – and more of a threat for on-premise software vendors. …
J. Nicholas Hoover asserts “Government agencies will continue to struggle with cybersecurity, while making strides in transparency and cloud computing. There will also be new pushes for accountability and IT procurement reform” in his 5 Predictions For Government IT In 2010 article of 1/15/2010 for InformationWeek’s Government Newsletter:
In some ways, government technology appeared to have received a new lease on life in 2009. As cloud computing and Web 2.0 took root in the wider tech industry, President Obama's tech-savvy administration named the United States' first CIO and CTO and launched a wide-ranging effort on transparency that was echoed in many national, state and even local governments.
Government 2.0 and "government as a platform" became new buzz phrases, federal CIO Vivek Kundra launched cloud computing and performance management initiatives, and it seemed as if every government agency started a Twitter feed, joined Facebook and began posting YouTube videos. …
Following are five things to watch for next year in government tech, subject to the caveat that I, like Nostradamus and Punxsutawney Phil, have been known to make a prognostication or two that don't pan out.
- Cybersecurity Will Continue To Be On The Front Burner …
- More Agencies Will Begin Forays Into Cloud Computing, Including Public Clouds …
- Transparency Won't Always Be Easy …
- Dashboards Will Push Agencies To Improve Performance Management …
- IT Procurement May See Some Tentative, Incomplete Reform …
Nick fleshes his predictions out a bit, of course.
• Charles Leadbeater asserts “Before our digital lives disappear too far into 'the cloud', we must wrest it from corporate and governmental control” in his Let's open up cloud computing op-ed piece of 1/22/2010 for the guardian.co.uk site:
The internet, our relationship with it, and our culture are about to undergo a change as profound and unsettling as the development of web 2.0 in the last decade, which made social media and search – Google and YouTube, Facebook and Twitter – mass, global phenomena. The rise of "cloud computing" will trigger a battle for control over a digital landscape that is only just coming into view.
The internet we have grown up with is a decentralised network of separate computers, with their own software and data. Cloud computing may look like an extension of this network-centric logic but, in fact, it is quite different.
As cloud computing comes of age, our links to one another will be increasingly routed through a vast shared "cloud" of data and software. These clouds, supported by huge server farms all over the world, will allow us to access data from many devices, not just computers; to use programs only when we need them and to share expensive resources such as servers more efficiently. Instead of linking to one another through a dumb, decentralised network, we will all be linking to and through shared clouds.
Which raises the question: whose clouds will these be?
Charles proposes an “Open Cloud Declaration,” a five-point “manifesto” condemning homogeneity, corporate control, “industrial-era media companies,” attempted government control and inequality. Sounds to me like a paean to “cloud anarchy.”
See the last item in this section for Microsoft’s contrary view of the government role in cloud computing.
• Kevin Jackson reports "Shaping Government Clouds" Just Released in this 1/14/2010 post:
As part of the On The Frontlines series, Trezza Media Group has just released its latest online electronic magazine. "Shaping Government Clouds" includes:
- Pete Tseronis, Chairman of the Federal Cloud Advisory Council, shares his thoughts on how to embrace cloud computing possibilities;
- US Army Major Larry Dillard explains how the Army Experience Center used the cloud to enhance recruiting;
- Experts from Citrix, HP and SafeNet discuss how they are using the cloud in government solutions;
- And much, much more!
- Henry Sienkiewicz, DISA
- Mike Krieger, US Army
- Robert Carey, US Navy
- Ron Bechtold, US Army
- Curt Aubley, Lockheed Martin
- David Smith, Citrix Systems
- Ronald Ritchey, Booz Allen Hamilton
- Tim Harder, EMC
Read it today! Or better yet, download it as an excellent reference.
Reminder: Vote for new or improved Windows Azure features in the Windows Azure Feature Voting Forum.
Jim Nakashima reviews Windows Azure Instance & Storage Limits in this 1/22/2010 post:
Recently, a colleague of mine wrote about the Windows Azure instance limits: http://blog.toddysm.com/2010/01/windows-azure-role-instance-limits-explained.html
His post is very complete and I recommend you have a look, but here is my take:
These are default limits that are in place to ensure that Windows Azure will always have VMs available to all of our customers. If you have a need for more capacity, we want to help! Please contact us: http://go.microsoft.com/fwlink/?LinkID=123579
The limits are:
- 20 Hosted Service Projects
- 5 Storage Accounts
- 5 roles per Hosted Service (i.e. 3 different web roles + 2 different worker roles or any such combination)
- 20 CPU cores across all of your Hosted Service Projects
The first two are really easy to track: on the Development portal, when you go to create a new service, it’ll tell you how many you have left of each.
5 roles per Hosted Service is also easy to understand; this corresponds to the number of projects you can add as roles to your Cloud Service – that's where I hit my role limit.
Jim continues with an analysis of the 20-CPU core limit and storage quotas.
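The default quotas Jim lists lend themselves to a quick sanity check before you provision. The Python sketch below is my own illustration, not part of either post: the `check_plan` helper is hypothetical, and the per-size core counts (Small=1, Medium=2, Large=4, ExtraLarge=8) are my assumption based on the published VM sizes of the time.

```python
# Assumed core counts per VM size (not taken from Jim's post).
CORES_PER_SIZE = {"Small": 1, "Medium": 2, "Large": 4, "ExtraLarge": 8}
MAX_ROLES_PER_SERVICE = 5   # default per-Hosted-Service role limit
MAX_CORES = 20              # default cores across all Hosted Service projects

def check_plan(services):
    """Validate a deployment plan against the default quotas.

    services: {service_name: [(role_name, vm_size, instance_count), ...]}
    Returns total cores used, or raises ValueError if a limit is exceeded.
    """
    total_cores = 0
    for name, roles in services.items():
        if len(roles) > MAX_ROLES_PER_SERVICE:
            raise ValueError(f"{name}: more than {MAX_ROLES_PER_SERVICE} roles")
        total_cores += sum(CORES_PER_SIZE[size] * count
                           for _, size, count in roles)
    if total_cores > MAX_CORES:
        raise ValueError(f"plan needs {total_cores} cores, limit is {MAX_CORES}")
    return total_cores
```

For example, one service with three Small web-role instances and two Medium worker-role instances uses 3×1 + 2×2 = 7 cores, well inside the default 20.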
Derrick Harris analyzes From Azure to VMware: A Look Back at Infrastructure Trends From Q4 in this 1/22/2010 post to GigaOm:
Looking back at the past three months of data center and cloud computing news, what’s striking is not so much what happened, but what will happen. As I outline in the latest Quarterly Wrap-up for GigaOM Pro (sub. required), there were plenty of major announcements and big happenings, to be sure, but many won’t materialize until later this year. When they do, the results could alter their respective landscapes significantly.
- Data Center Shape-shifting
- Microsoft Azure Wows With What Might Be
- Oracle Cleared Final Hurdle to Sun Buy
- Green Shoots in Q4 Financials
- The Downsides
Colin Cole’s Windows Azure, Cloud Computing, and the Insurance Value Chain post of 1/22/2010 asks:
Curious how ISV insurance partners can leverage Windows Azure to run parts of their business? Software as a Service and Cloud Computing represent new opportunities for the Financial Services industry, and in this webcast, we show customers how they can quickly adopt new business capabilities with minimal to no hardware and software investments.
In this webcast, I walk through the tenets of Windows Azure, and then in 15 minutes I demonstrate the technology by building an Insurance Policy service from scratch, deploying it, and running it from the cloud. A soup-to-nuts policy service. By following the example, customers and partners can quickly see how they can leverage Windows Azure to build and host their insurance assets, taking advantage of all of the benefits of the cloud such as "infrastructure on demand". This webcast starts out discussing cloud + insurance strategy, but it quickly turns technical and dives into a detailed code example.
Watch the Webcast here if you have credentials for academymobile.microsoft.com
Mohamed El-Refaey quotes Mark Ortenzi: "If you don't do it, in two years you won't have a business" in his Cloud Computing Certifications post of 1/22/2010:
Though I am not that fond of certificates in general and believe strongly in practical hands-on experience, it looks as if within two years I should be aware of the training kits and certificates that exist for cloud computing and virtualization technology :) ...
3Tera (Cloud Computing Platform and services company) announced today their cloud computing certifications and education offerings ...
Mark Ortenzi, president of 3Tera partner CariNet, mentioned that "If you don't do it, in two years you won't have a business."
According to these announcements there will be two tracks for these certificates:
- The Certified Cloud Operator program which is targeted at service providers, enterprises, operations professionals and system integrators who deploy and operate cloud services. It covers installing, configuring and maintaining the computing fabric used for building cloud computing services. The certification has emphasis on hardware requirements, service configuration, hardware failure troubleshooting, provisioning of customers and configuration of virtual private data centers.
- The Certified Cloud Architect program is aimed at systems architects, IT operations professionals, application developers and systems engineers who design, integrate, provision, deploy and manage distributed applications. That certification teaches the architectural concepts of 3Tera's AppLogic cloud computing platform, step-by-step deployment procedures, operating and managing applications in the cloud, best practices for security, testing and scaling applications, and how to architect for business continuity.
It's worth mentioning that Rackspace, Amazon (EC2) and Microsoft Windows Azure have training kits and education available, but they don't yet have cloud computing certifications.
We should keep an eye on these vendors' offerings in the near future ...
Cloudworthiness certification for operators, architects and developers might become a significant revenue source for cloud vendors and training organizations.
Lori MacVittie asks and answers WILS: How can a load balancer keep a single server site available? in this 1/20/2010 post:
Most people don’t start thinking they need a “load balancer” until they need a second server. But even if you’ve only got one server, a “load balancer” can help with availability and performance, and make the transition later on to a multiple-server site a whole lot easier.
Before we reveal the secret sauce, let me first say that if you have only one server and the application crashes or the network stack flakes out, you’re out of luck. There are a lot of things load balancers/application delivery controllers can do with only one server, but automagically fixing application crashes or network connectivity issues ain’t in the list. If these are concerns, then you really do need a second server.
But if you’re just worried about standing up to the load, then a load balancer can definitely give even a single-server site a boost. …
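Lori's single-server point reduces to a small piece of logic: a proxy in front of one backend health-checks it and, when the check fails, serves a maintenance response instead of a raw connection error. This Python sketch is illustrative only — the function names and the bare TCP-connect health check are my stand-ins, not any particular load balancer's implementation:

```python
import socket

def backend_is_up(host, port, timeout=1.0):
    """Cheap TCP health check a front-end proxy might run per request."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def route(host, port):
    """Forward when the single backend answers; otherwise serve a
    maintenance page so the site degrades gracefully instead of erroring."""
    return "backend" if backend_is_up(host, port) else "maintenance-page"
```

Real products layer on connection offload, caching, and compression, but even this fallback behavior is something a lone web server can't do for itself once it's down.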
Microsoft Public Relations reported in a Microsoft Urges Government and Industry to Work Together to Build Confidence in the Cloud post of 1/20/2010:
Today, Brad Smith, senior vice president and general counsel at Microsoft Corp., urged both Congress and the information technology industry to act now to ensure that the burgeoning era of cloud computing is guided by an international commitment to privacy, security and transparency for consumers, businesses and government.
During a keynote speech to the Brookings Institution policy forum, “Cloud Computing for Business and Society,” Smith also highlighted data from a survey commissioned by Microsoft measuring attitudes on cloud computing among business leaders and the general population.
The survey found that while 58 percent of the general population and 86 percent of senior business leaders are excited about the potential of cloud computing, more than 90 percent of these same people are concerned about the security, access and privacy of their own data in the cloud. In addition, the survey found that the majority of all audiences believe the U.S. government should establish laws, rules and policies for cloud computing.
At today’s event, Smith called for a national conversation about how to build confidence in the cloud and proposed the Cloud Computing Advancement Act to promote innovation, protect consumers and provide government with new tools to address the critical issues of data privacy and security. Smith also called for an international dialogue on data sovereignty to guarantee to users that their data is subject to the same rules and regulations, regardless of where the data resides. …
Get more details and background in a Cloud Computing presskit of the same date. Kelly Fiveash analyzes the speech in a lengthy Microsoft's top lawyer demands a cloud computing law piece of 1/21/2010 for El Reg.
Lori MacVittie asserts “One of the concerns with cloud bursting specifically for the use of addressing seasonal scaling needs is that cloud computing environments are not necessarily PCI-friendly. But there may be a solution that allows the application to maintain its PCI-compliance and still make use of cloud computing environments for seasonal scaling efficiency” in her Cloud Balancing, Reverse Cloud Bursting, and Staying PCI-Compliant post of 1/22/2010:
The ability to implement such an architecture would require that the PCI-compliant portions of a web application are separated (somehow, perhaps as SOA services or independently accessible RESTful services) from the rest of the application.
The non-PCI related portions of the application are cloned and deployed in a cloud environment. The PCI-related portions stay right where they are. As the PCI related portions are likely less heavily stressed even by seasonal spikes in demand, it is assumed that the available corporate compute resources will suffice to maintain availability during a spike, mainly because the PCI compliant resources have at their disposal all local resources. It is also possible –and likely – that the PCI-related portions of the application will not consume all available corporate compute resources, which means there is some capacity available to essentially reverse cloud burst into the corporate resources if necessary. …
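The routing rule Lori implies can be sketched in a few lines: PCI-scoped paths always resolve to the on-premise pool, everything else goes to the cloud clone, and non-PCI traffic spills back on-premise when the cloud pool is saturated. The path prefixes and pool names below are hypothetical, chosen only to illustrate the split:

```python
# Hypothetical PCI-scoped URL prefixes; in practice these would be the
# separated payment services Lori describes.
PCI_PREFIXES = ("/checkout", "/payment")

def choose_pool(path, cloud_at_capacity=False):
    """Pick a server pool for a request under the split architecture."""
    if path.startswith(PCI_PREFIXES):
        return "on-premise"   # PCI scope never leaves the data center
    if cloud_at_capacity:
        return "on-premise"   # reverse cloud burst: spill back inward
    return "cloud"            # cloned non-PCI portion handles the spike
```

The interesting design point is the second branch: because the PCI portion rarely consumes all local capacity, the corporate data center can absorb overflow from the cloud rather than the other way around.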
• Lucas Almeida Romão announced in his Campus Party Brazil: Azure Services Platform post to the Azure Services BR Community site that Evilázaro Alves will present a session about the Azure Services Platform for Campus Party Brazil at Expo Center Imigrantes in São Paulo/SP on 1/28/2010 at 10:00 to 11:00 AM:
Campus Party is considered to be the largest event of technological innovation, Internet and networked electronic entertainment in the world. It is an annual meeting, held since 1997, that brings together thousands of participants with their computers for seven days to share knowledge, exchange experiences and perform all kinds of activities related to computers, communications and new technologies.
And this year the community moderator of Azure Services BR, MVP Evilázaro Alves, will be speaking about the Azure Services platform.
Translation [edited] from Brazilian Portuguese by the Bing Translator.
Brian Loesgen announced I’ll be presenting a new session next week: Bridging from On-premise ESB to Windows Azure in this 1/22/2010 post:
I will be doing a brand-new, never-seen-before presentation at the Code Camp in Fullerton next week. I’m late signing up as I wasn’t sure if my schedule would permit it, but it all looks good, so I’ll do it.
The session will encapsulate some of the cool stuff I’ve been doing spanning the two environments. This will be a powerful (and I would say essential) presentation for BizTalk developers as it highlights some of the new patterns we now have at our disposal. However, this is also an important session for anyone deploying services to Azure and calling them from on premises, as many will not have an integration background and as such will run into the typical pitfalls that experienced integration devs know to avoid.
I’ve done one post about bridging on-premise to Azure here, and there are more videos working their way through MSDN that should be live any time now. Once they go live, I’ll post accompanying blog posts.
I’ll be presenting at 2:45 on Sat Jan 30th. Hope to see you there if you’re attending this Code Camp.
Having attained the plateau of productivity, companies worldwide are enjoying the benefits and efficiencies that can be realized through a well-defined and implemented SOA strategy. In addition, many are also realizing the business value and agility improvements that come from having an Enterprise Service Bus in place as a messaging backbone to support their SOA infrastructure.
With the recent “go-live” of Microsoft’s Windows Azure platform, intriguing new architectural patterns for distributed applications are being made possible. In this session we will look at what it means to bridge from the on-premise ESB to the Windows Azure platform. In addition, we will cover the value-add that an ESB brings to Azure usage. We will take a pragmatic approach, showing you what can be done today, with the tools available to you right now.
John Willis reports CloudCamp to Hold First OpsCamp for Cloud Operations and Development Professionals in this 1/21/2010 post:
CloudCamp, an organizer of local events to exchange ideas, knowledge and information in a creative and supportive environment, advancing the current state of cloud computing and related technologies, today announced the first OpsCamp for systems management and cloud development professionals. OpsCamp is an event aimed at bringing together IT professionals who are interested in the evolution of systems management and application deployment as it bridges physical and virtual infrastructure and especially cloud computing technologies. The event will be a participant-driven, unconference-style event made popular by events like BarCamp, Bloggercon and Mashup Camp.
Event Details: The event will be held in an unconference format starting with an Unpanel discussion about cloud computing followed by a self-organizing conference format where topics are proposed and then voted on by the attendees.
Saturday, January 30, 2010: 8:00 am – 5:00 pm
Spider House Cafe, 2908 Fruth St., Austin, TX 78705
While attendance is free, RSVP is required: http://www.opscamp.org/austin
TechEd Events offer links to 39 events containing the keyword Azure in this Live Events for IT Pros interactive page:
The most popular topic is: TechNet Events Presents: Windows Azure, Hyper-V and Windows 7 Deployment:
Join your local TechNet Events team for a lively tour of the latest tools and resources for IT Pros. We’ll start with an overview of Windows Azure, and explore how you can use this high-performance hosted platform to build customer-facing applications and add horsepower to your computing infrastructure. Next, we’ll look at all the tools and techniques available for building virtual environments in Hyper-V version 2.0, then finish the day by demonstrating how to simplify your Windows 7 deployments. TechNet Events are free, live learning sessions packed with hands-on technical content. Register today!
Matthew Weinberger writes about the TechNet Events in his Microsoft Taking Azure to the Streets post of 1/21/2010 to the VAR Guy blog:
That headline might go a little far, but Microsoft has announced plans to hold workshops across the United States to educate VARs and other IT pros about what Windows Azure is, what it can do, and how best to deploy it. It seems like the closer we get to the February 1 grand opening of Azure, the more eager Microsoft is to push their new cloud platform.
When I first heard about Azure, I was a little skeptical. It seemed like Microsoft was just looking to make a token nod to software as a service models, and they would go back to releasing version after version of Windows Server as soon as no one was looking. I thought a version of Windows that lived in the cloud was just too risky for the often-conservative company.
That’s why I’m pleasantly surprised at the announcement of this lecture series: it really sounds like Microsoft is serious about Windows Azure’s success and is making a really strong effort to educate the IT channel about the benefits of the cloud. In that same vein, the Redmond giant has released a TCO/ROI calculator to demonstrate how much a VAR might save by leveraging Azure.
Yes, we’re all skeptical of TCO and ROI calculators. But I think we’re all addicted to them, too.
The Microsoft workshops are also going to deal with Hyper-V and Windows 7 deployments. But the fact that they’re uttering “Azure” in the same breath as their new golden child Windows 7 shows just how serious Microsoft is getting. As always, expect more as that February 1st launch date draws nearer.
MSDN Events presents Take Your Applications Sky High with Cloud Computing and the Windows Azure Platform in Alpharetta, GA on 2/25/2010 from 1:00 to 5:00 PM EST:
Join your local MSDN Events team as we take a deep dive into cloud computing and the Windows Azure Platform. We’ll start with a developer-focused overview of this new platform and the cloud computing services that can be used either together or independently to build highly scalable applications. As the day unfolds, we’ll explore data storage, SQL Azure, and the basics of deployment with Windows Azure. Register today for these free, live sessions in your local area.
If you register and attend this event, you will be placed in a raffle to win a chance to bring home one (1) free copy of Windows 7 – you could be the lucky winner! Register today!
SESSION 1: Overview of Cloud Computing and Windows Azure
The Windows Azure platform is a set of high-performance cloud computing services that can be used together or independently and enable developers to leverage existing skills and familiar tools to develop cloud applications. In this session, we’ll provide a developer-focused overview of this new online service computing platform. We’ll explore the components, key features and real day-to-day benefits of Windows Azure.
- What is cloud computing?
- Running web and web service applications in the cloud
- Using the Windows Azure and local developer cloud fabric
- Getting started – tools, SDKs and accounts
- Writing applications for Windows Azure
SESSION 2: Survey of Windows Azure Platform Storage Options
Durable data storage is a key component of any cloud computing offering. The Windows Azure Platform offers many options, which can be used alone or in combination. Windows Azure itself offers ready-to-use and lightweight storage in the form of tables, blobs, and queues. Another choice for storage is SQL Azure, a true relational database in the cloud. In this session, we’ll explore the highlights of these implementations and how to both create and use storage in each form. We’ll give you guidance on choosing the right forms of storage for your application scenarios.
- Understanding table & blob storage
- Programming against table & blob storage
- Working with queue storage
- Managing credentials and connection strings
- Scaling and configuration
- Understanding SQL Azure databases versus local SQL Server databases
- SQL Azure firewall, logins and passwords
- Database creation, deployments and migrations
- Database management using SQL Management Studio
- Programming against SQL Azure databases
SESSION 3: Going Live with your Azure Solution
Windows Azure features a powerful, yet simple deployment model. By focusing on your application and abstracting away the infrastructure details, you can deploy almost any app with minimal fuss. In this session, we’ll walk you through the basics of Windows Azure deployment, including site monitoring, diagnostics and performance issues.
- Start-to-Finish Visual Studio demonstration of a realistic XML data driven business web site from the desktop to the cloud.
- Windows Azure Deployments
- Start-to-Finish Visual Studio demonstration of a realistic SQL Server data driven business web site from the desktop to the cloud.
- Configuration of your application in the cloud
- Guidance and Suggestions to ensure your success
Event ID: 1032439974
Register by Phone: 1-877-MSEVENT (673-8368)
See the Take Your Applications Sky High with Cloud Computing and the Windows Azure Platform blog post of 1/22/2010 for dates and locations in Orlando, Ft. Lauderdale, and Tampa.
Microsoft and Extreme Networks will present a Data Center Virtualization – from Physical to Virtual to Cloud Webinar on 2/3/2010 at 11:00am PST (registration required):
Virtualization promises tremendous benefits, and its deployment is transforming data centers. The benefits of lower cost, flexible and scalable infrastructure are well understood, but there are many challenges to fully realizing the benefits of virtualization. In this webinar, Extreme Networks, Microsoft, Enterprise Strategy Group (ESG) and Network World will discuss a blueprint for how organizations can migrate their data centers from physical to virtualized and then to cloud architectures while enhancing the infrastructure, management and stability essential in data center networks today.
Pinalkumar Dave’s SQLAuthority News – Community Tech Days – Jan 30, 2010 – Event Announcement post of 1/21/2010 reports:
In Ahmedabad [, India] this event will happen on January 30, 2010. Just like the last event, we expect this one to be an astonishing success with a huge response. We will have five tech sessions back to back covering lots of interesting and innovative Microsoft products. The details of the sessions for this event are as follows.
10:15am – 11:15am Insight of Windows Azure Platform By Mahesh Dhola
11:30am – 12:30pm SQL Azure : Extending SQL Data Platform to Cloud By Pinal Dave
01:30pm – 02:30pm Microsoft Silverlight 4 – An Overview By Dushyantsinh Jadeja
02:30pm – 03:30pm Fall in love with SQL Server 2008 R2 By Jacob Sebastian
03:45pm – 04:45pm Visual Studio 2010 enhancement for developers By Prabhjot Bakshi
Ahmedabad is very much known for its love for SQL Server technology. We will have two sessions which will be on latest innovations on SQL Server.
Andre Leibovici reports the First Perth CloudCamp in Australia in this 1/22/2010 post:
The city of Perth in Australia has its first CloudCamp confirmed for April 8, 2010.
CloudCamp is an unconference where early adopters of Cloud Computing technologies exchange ideas. With the rapid change occurring in the industry, we need a place where we can meet to share our experiences, challenges and solutions. At CloudCamp, you are encouraged to share your thoughts in several open discussions, as we strive for the advancement of Cloud Computing. End users, IT professionals and vendors are all encouraged to participate.
Location – Curtin University, Perth, Australia
Maureen O’Gara asserts “Neelie Kroes didn’t want to upset her chances of becoming Europe’s Digital Agenda commissar” in an EC Antitrust Chief’s Job Ambitions Reportedly Delay Oracle-Sun OK post of 1/21/2010:
Everybody and his brother expect the European Commission to rubber stamp Oracle’s acquisition of Sun any minute now but when it didn’t happen Tuesday like it was reportedly supposed to we asked why not and were told – you’re gonna love this – that it was because antitrust czarina Neelie Kroes screwed up her confirmation hearing last week as Europe’s Digital Agenda commissar and didn’t want to upset her chances of getting the job by waving through an acquisition that is increasingly unpopular with the digital agenda constituency by underscoring the EC’s toothless response to it, or so they say.
So she sat on it.
But then the confirmation vote on the new EU commissioners was moved from January 26, the day before the EC’s Snoracle deadline, to February 9 so Neelie had to relent. Approval is supposedly due by the end of the week.
Sounds like the action of a typical EC bureaucrat to me. Timothy Prickett Morgan reports EC approval of the merger in his Schwartz puts comforting arm around stricken Sun article for El Reg of 1/22/2010 and adds: “Uses other arm for Oracle fist-pump.”