|Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.|
Update 2/15/2010: Alternate link to Forrester Research’s Interactive Data Protection Heat Map; you’ll find more 2/14/2010 posts and new 2/15/2010 articles in Windows Azure and Cloud Computing Posts for 2/25/2010+.
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:
- Azure Blob, Table and Queue Services
- SQL Azure Database (SADB)
- AppFabric: Access Control, Service Bus and Workflow
- Live Windows Azure Apps, APIs, Tools and Test Harnesses
- Windows Azure Infrastructure
- Cloud Security and Governance
- Cloud Computing Events
- Other Cloud Computing Platforms and Services
To use the above links, first click the post’s title to display the post as a single article; the section links will then navigate within it.
Discuss the book on its WROX P2P Forum.
See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.
Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.
You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:
- Chapter 12: “Managing SQL Azure Accounts and Databases”
- Chapter 13: “Exploiting SQL Azure Database's Relational Features”
HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the January 4, 2010 commercial release in February 2010.
Ben Day’s Run Azure Development Storage from Visual Studio Unit Tests post of 2/24/2010 reports:
Use Azure development storage from unit tests
by Benjamin Day
If you're writing an application that's going to use Windows Azure, you have two options for your persistent data storage (aka your database). Option #1 is SQL Azure, which is Microsoft's relational database in the cloud. Option #2, Azure Storage, is similar to a relational database but is better thought of as "structured storage". The reasons to choose one Azure storage option over the other are outside the scope of this article, but they basically come down to performance and cost.
If you're writing a Windows Azure application and have chosen to go the Azure Storage route, you get a convenient local version that runs on your desktop called Development Storage. Development Storage (DevelopmentStorage.exe) supplies a set of REST-based Web service endpoints that behave exactly the same as the cloud-based, production Azure Storage endpoints. Since Development Storage runs on your local machine rather than on Microsoft's servers, you don't have to pay for your usage.
So, what does this have to do with unit tests?
Ben continues with the answer.
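The practical upshot of Ben’s approach is that test code can target the well-known local Development Storage endpoints instead of the cloud URIs. A minimal sketch is below; the fixed ports (10000/10001/10002) and the `devstoreaccount1` account name are the documented Development Storage defaults, while the helper function itself is hypothetical:

```python
# Sketch: Development Storage vs. production Azure Storage endpoints.
# Development Storage listens on fixed local ports, one per service, and
# the well-known account name appears in the path rather than the host.
DEV_ACCOUNT = "devstoreaccount1"

def storage_endpoints(account, use_dev_storage=False):
    """Return (blob, queue, table) base URIs for an Azure Storage account."""
    if use_dev_storage:
        return tuple(
            "http://127.0.0.1:%d/%s" % (port, DEV_ACCOUNT)
            for port in (10000, 10001, 10002)  # blob, queue, table
        )
    return tuple(
        "http://%s.%s.core.windows.net" % (account, svc)
        for svc in ("blob", "queue", "table")
    )
```

A unit-test fixture would call `storage_endpoints(account, use_dev_storage=True)` after launching DevelopmentStorage.exe, while production configuration uses the cloud form.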
In the past weeks I've read several articles, blog posts and other digitally expressed thoughts about relational databases and query systems, and how they all suck compared to K/V stores, CQRS, OODBs or whatever the hype-of-the-day term is. While most of them were simply re-labeling 20+-year-old common knowledge, others were pretty stupid and downright sent the (novice) reader the wrong message. By 'wrong' I mean: the conclusions are based on false 'facts', assumptions and hand-waving n==1 pseudo-science.
Instead of writing a long essay here, I'll quote from and link to several Wikipedia articles and other pieces that can help you learn what relational models and databases are all about, what theory they're based on, why they work, and what tools (as in: methodologies) are at your disposal. It's not meant to sell you the picture of 'OODB==bad, RDBMS==good', as that would be silly and as short-sighted as the articles I mentioned above. Instead, you should see this small subset of knowledge about relational models and databases as a starting point for your own research into what to use and how to face a problem domain. After all, you can only make an informed decision if you know what you're talking about.
He continues with detailed sections about:
- Relational model and theory
- Normalization and De-normalization
- Information Analysis
- Friction and the Impedance Mismatch
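The normalization idea Frans covers can be shown in a few lines of SQL. The sketch below uses a hypothetical orders schema in SQLite (via Python's sqlite3 module): the customer's city is stored once in its own table instead of being repeated on every order row, and a join reconstructs the denormalized view on demand.

```python
import sqlite3

# Normalized sketch: customers and orders in separate tables, so a
# customer's city lives in exactly one row (no update anomalies).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        city        TEXT NOT NULL
    );
    CREATE TABLE "order" (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        total       REAL NOT NULL
    );
""")
conn.execute("INSERT INTO customer VALUES (1, 'Alfreds', 'Berlin')")
conn.executemany('INSERT INTO "order" VALUES (?, ?, ?)',
                 [(10, 1, 25.0), (11, 1, 40.0)])

# A join yields the flat, "denormalized" view without storing it.
rows = conn.execute("""
    SELECT o.order_id, c.name, c.city
    FROM "order" o JOIN customer c ON c.customer_id = o.customer_id
    ORDER BY o.order_id
""").fetchall()
```

Changing the customer's city now touches one row; in an unnormalized orders table it would have to touch every order for that customer.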
Frans designed and developed the LLBLGen Pro Object/Relational mapper and code generator, which competes with Microsoft’s Entity Framework and open-source NHibernate.
Bruno Terkaly’s SQL Azure – Relational Database as a Service – Soup to Nuts – The Whole Enchilada post of 12/23/2010 is a fully illustrated, highly detailed tutorial for creating a SQL Azure instance, adding the Northwind sample database, and verifying its operation:
This post will be very direct. It will get a database running in the Azure cloud as quickly as possible. There is plenty of material about why SQL Azure might be of interest and how it works.
- Build a simple data-driven ASP.NET page using the graphical controls in Visual Studio 2008
- Develop against a local SQL Server/SQL Express database before connecting to SQL Azure
Steps in this post
- Create the Northwind database
- Add Northwind to SQL Server
- Create a cloud project with one web role
- Add a grid and attach the grid’s data source to our data (the Northwind database)
- Run our cloud application against local data
- Create a SQL Azure database
- Download and install the SQL Server Migration Wizard
- Upload the customers table to SQL Azure
- Change our connection string to point our app to the cloud
- Run our app and verify everything works using the data from SQL Azure (NorthwindInCloud)
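The last two steps of Bruno’s walkthrough boil down to swapping one connection string. A web.config sketch of that swap is below; the server name “myserver” and the credentials are placeholders, and the SQL Azure form (TCP protocol, encryption on, user name carrying the server suffix) follows the documented connection-string pattern:

```xml
<!-- Sketch: local vs. SQL Azure connection strings ("myserver" is a
     placeholder for your provisioned SQL Azure server name). -->
<connectionStrings>
  <!-- Local development against SQL Server Express:
  <add name="Northwind"
       connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=Northwind;Integrated Security=True" />
  -->
  <!-- SQL Azure (NorthwindInCloud): -->
  <add name="Northwind"
       connectionString="Server=tcp:myserver.database.windows.net;Database=Northwind;User ID=user@myserver;Password=...;Encrypt=True;Trusted_Connection=False" />
</connectionStrings>
```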
The length of Bruno’s post beats most of mine.
My Using the SQL Azure Migration Wizard v3.1.3/3.1.4 with the AdventureWorksLT2008R2 Sample Database, and SSMS 2008 R2 11/2009 CTP Has Scripting Problems with SQL Azure Selected as the Target Data Base Engine Type posts in January 2010 cover the same or similar ground. Synchronizing On-Premises and SQL Azure Northwind Sample Databases with SQL Azure Data Sync shows a different approach.
John Fontana claims “Microsoft eyes new frontiers for directory technology” in his Active Directory: 10 years old and thinking cloud post of 2/24/2010 to NetworkWorld’s Software blog:
Ten years ago today, Microsoft released its Active Directory technology to skepticism that it could build an infrastructure technology to anchor user management and access control. Now the software is an integral part of nearly every corporate network and stands ready for its next frontier: public and private clouds.
Over the years, Active Directory (AD) has strengthened its shaky legs by improving scalability and flexibility, and adding features such as federation and rights management services. The directory today is part of nearly every task a user performs on a Windows-based network, plus there are tools to include Unix and Linux machines under the access controls in AD, and an army of third-party vendors.
Most recently, Microsoft unveiled plans for the Next Generation Active Directory (NGAD), a modular add-on that is built on a database and designed to add querying capabilities and performance never before possible in a directory. NGAD also is a reshaping of the programming model for Active Directory. …
“Microsoft did learn the hard way, which is almost an inevitable process for that level of infrastructure,” says Jamie Lewis, CEO of the Burton Group/Gartner. “It is not easy to build and it takes time for something as ambitious and complex as a directory.”
But Lewis says there is no question that AD today is considered by a lot of enterprises to be the foundation of their user repository and hub for their internal authentication mechanism.
Hopefully, Microsoft learned its lessons well, because Lewis says moving the directory to the cloud will be another hard lesson, especially given that AD is now a piece of legacy technology that many users don’t want disturbed. That is one reason Microsoft is developing NGAD as an add-on. [Emphasis added.] …
One user who requested anonymity notes that despite NGAD’s introduction, there are precious few details surrounding AD for the cloud.
In addition, he says, there have been few appreciable changes in 10 years. “People still are afraid to change the schema.” Also, the Active Directory Lightweight Directory Service (formerly called ADAM), which is mostly an Internet-focused directory, isn’t on par with AD in terms of management tools, making it in essence a separate deployment.
John’s earlier Microsoft touts groundbreaking 'clip-on' for Active Directory post from PDC on 11/18/2009 provides the NGAD backstory:
Microsoft will pass out beta code Wednesday it hopes will define the next evolution of directories. It's a modular add-on that is built on a database and designed to add querying capabilities and performance never before possible in a directory.
The code is so early-stage it does not have an official name, although internally Microsoft calls it Next Generation Active Directory (NGAD). Microsoft introduced NGAD, which it calls a directory federation technology, on the second day of its annual Professional Developers Conference going on this week.
NGAD, however, is not a replacement for Active Directory but a "clip-on" that provides developers a single programming API for building access controls into applications that can run internally, on devices, or on Microsoft's Azure cloud operating system. Users will not have to alter their existing directories but will have the option to replicate data to NGAD instances.
NGAD stores directory data in an SQL-based database and utilizes its table structure and query capabilities to express claims about users such as "I am over 21" or "Henry is my manager." To ensure security, each claim is signed by an issuing source, such as a company, and the signatures stay with the claim no matter where it is stored. …
In addition, the directory design means multitudes of new cloud or other applications won't be hammering the central Active Directory architecture with lookup requests and administrators don't have to perform often tricky updates to directory schema to support those new applications.
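The signed-claims idea John describes can be sketched in a few lines. The HMAC scheme, issuer name, and key handling below are purely illustrative (real claims-based systems use token formats such as SAML with asymmetric signatures); the point is only that the signature travels with the claim, so tampering is detectable wherever the claim is stored:

```python
import hashlib
import hmac
import json

ISSUER_KEY = b"contoso-issuer-secret"  # hypothetical issuer signing key

def issue_claim(subject, claim_type, value, key=ISSUER_KEY):
    """Build a claim such as 'Henry is over 21' and sign it as the issuer."""
    claim = {"issuer": "contoso", "subject": subject,
             "type": claim_type, "value": value}
    payload = json.dumps(claim, sort_keys=True).encode()
    claim["signature"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return claim

def verify_claim(claim, key=ISSUER_KEY):
    """Recompute the signature over everything except the signature itself."""
    unsigned = {k: v for k, v in claim.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, claim["signature"])
```

A relying application can then accept `{"type": "over21", "value": True}` only when `verify_claim` succeeds, without calling back to the central directory.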
Brendan Cournover’s What is Next Generation Active Directory? post of 12/2/2009 to The Windows Server Notebook blog digs deeper into NGAD:
… I spoke with Directory Services MVP Laura E. Hunter, and she described NGAD as a way for Microsoft to provide a “SQL-like frontend” where admins can make authorization decisions. The examples she gave were functions such as “age over 21” or “can approve expense reports = TRUE,” similar to what AD Federation Services 2.0 does now, only taking things one step further.
So where does the cloud fit in? Well, it’s really all about the way administrators deal with directories and applications: creating a common interface whether those directories or apps are on-premises or in the cloud. As Computerworld’s John Fontana puts it, “users will not have to alter their existing directories but will have [the] option to replicate data to NGAD instances.”
NGAD is to be based on the claims-based identity model, which Microsoft describes as when an “application makes identity-related decisions based on claims supplied by the user. This could be anything from simple application personalization with the user’s first name, to authorizing the user to access higher valued features and resources in your application.” In other words, the claims-based model is a simplified way of governing access control.
Again, it’s very early, so no timeframe for NGAD has been given. It’s possible that whenever it is released, it will be a standalone product that also comes with Windows out-of-the-box, similar to Hyper-V.
Will Bell shows you How to install a chained SSL certificate in this 2/24/2010 post to the Azure Support Team blog:
When a user browses to a web site protected by a Secure Sockets Layer (SSL) endpoint, an error is displayed if the proper certificates are not found, or if the certificate is expired or revoked.
The second line of this dialog reports more information on the specific certificate error:
The security certificate presented by this website was not issued by a trusted certificate authority.
If the user clicks “Continue to this website (not recommended),” marked with a red x, the IE 8 Security Status Bar turns red to continue warning the user:
This document discusses how to properly install a chained SSL certificate in a Windows Azure application. …
Will continues with a detailed tutorial.
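Before following Will’s tutorial, the browser’s chain check can be reproduced locally with the OpenSSL command line. The sketch below (file names are placeholders) creates a throwaway CA, signs a server certificate with it, and then verifies the leaf against the chain, which is exactly the relationship a chained SSL certificate must satisfy once installed in Azure:

```shell
# 1. Create a throwaway CA and a server certificate it signs.
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.pem \
    -subj "/CN=Example Test CA" -days 1
openssl req -newkey rsa:2048 -nodes -keyout server.key -out server.csr \
    -subj "/CN=example.com"
openssl x509 -req -in server.csr -CA ca.pem -CAkey ca.key \
    -set_serial 1 -days 1 -out server.pem

# 2. Verify the leaf against the issuing chain, as a browser would.
#    Omitting -CAfile reproduces the "not issued by a trusted
#    certificate authority" class of error described above.
openssl verify -CAfile ca.pem server.pem
```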
Marc Schweigert’s Running Reporting Services Reports in Windows Azure Webcast posted to Channel9 on 2/24/2010 carries this abstract:
In this screencast, Marc shows you how to run a SQL Server Reporting Services 2008 report in Windows Azure using the ReportViewer control that ships with Visual Studio 2010. As an added bonus, he demonstrates using ReportViewer against an OData service through the use of WCF Data Services client libraries and the ObjectDataSource.
diTii.com announced Thor leverages power of Microsoft Tag, Windows Azure, and Exchange Server on 2/24/2010:
Project Thor is an open source project undertaken by the Minneapolis Cloud Computing User Group, led by Adam Grocholski from RBA consulting. Thor focuses on leveraging the power of Microsoft Tag, Windows Azure, and Exchange Server to allow access to the schedules and scheduling services of Exchange on most mobile devices! Thor is designed to be a real solution that showcases the power of cloud-based solutions coupled with on-premise software. Thor is implemented as an Azure-based solution, so you’ll need an Azure account to try it out. (If you really want to use it on-premise, the source code is provided). Thor ships with three different providers: Exchange 2010 Interop, and Exchange 2007 Interop(which use the Exchange Web Services Managed API), and Native Web Services (which uses the EWS autogenerated proxies).
You can download Thor from CodePlex.
The Microsoft Public Sector Developer and Platform Evangelism Team’s CitySourced to leverage Microsoft’s Windows Azure platform for their government customers post of 2/23/2010 announced:
CitySourced has selected Microsoft’s Windows Azure platform as the foundation for their application infrastructure.
We will do a detailed blog post over the next month or two, but below is a short one about FreedomSpeaks/CitySourced: the goals that address their future needs and a broad outline of their plans to leverage Microsoft’s suite of Windows Azure services.
Background about the organization and the service - CitySourced is a real-time mobile civic engagement tool. The CitySourced suite of applications has three main components:
- The Console, an administrative extranet for government agencies to manage incoming customer support requests.
- The Website, a consumer-facing site encouraging citizens to engage with their local government and providing complete transparency into its day-to-day operations.
- A smartphone application that enables citizens and residents to submit non-emergency service requests (e.g., potholes, graffiti, trash, snow removal) directly to their local government.
The CitySourced platform presents a unique opportunity for government to use technology to both save money and improve accountability to those they govern. It also creates a positive, collaborative platform for real action. The mission of CitySourced is to transform civic engagement and enable citizens to make their communities a better place.
Here are some of the goals/drivers that led CitySourced to adopt a cloud-based offering:
- CitySourced’s application uses a multi-layered architecture based on Microsoft ASP.NET on Windows Server 2003/IIS 6.0 and SQL Server 2005 running on three servers. CitySourced continues to get a lot of visibility in various cities around the country and needed a plan for a scalable cloud-based solution to handle the anticipated future demand.
- The focus was on delivering better service and high-value features to both the government and consumer entities, so the goal was to stay away from hardware/software acquisition and from management and monitoring of the infrastructure and the application.
- CitySourced also wanted to improve its back-end job-processing service and log storage.
- CitySourced explored several options with the key goal being to avoid making major changes to their application code-base, in-memory cache layer and database layer.
- All of the above helped CitySourced choose Microsoft’s Windows Azure platform to host and manage the CitySourced application.
Goals & plans for Phase 1: (To be completed by March, 2010)
- Migrate the entire application infrastructure to Azure (web, cache, and databases) with a minimum amount of disturbance to their core business processes.
- Migrate with a minimal amount of refactoring to existing code (focusing on system changes, not code changes).
Goals & plans for Phase 2: (To be completed by May, 2010)
- Leverage Windows Azure Storage & Tables for more scalable storage of logs and other binary data (such as serialized hash tables and image blobs).
- Migrate their existing homegrown (and database dependent) queue workflow to Windows Azure Queue services.
- Offload job processing from Windows-based services onto dedicated worker instances.
Benefits & End-Result: CitySourced is confident that a cloud computing solution based on Microsoft Windows Azure will enable them to deliver better service to a broader audience (more customers) without worrying about infrastructure, software acquisition and service management.
High-Level architecture of their next-generation solution:
The post continues with “… some pointers/resources in case your organization would like to learn and apply Windows Azure as part of your cloud-computing initiatives.”
The Innov8ShowCase Team reported McDonalds Creates a Virtual Winter Olympics Village using IE8, Silverlight and Azure on 1/22/2010:
The premise is simple: McDonald’s has created a McNuggets Village to act as a hub around the Winter Olympics. The central piece to this is McDonald’s introducing a new sauce – the new Sweet Chili Sauce. Through the site, you can score the Sweet Chili Sauce against classic sauces.
This campaign is running during the Winter Olympics so check it out soon!
Ellen Rubin claims “Concerns about security and loss of control in public clouds have led to an alternative model” in a preface to her The Hidden Costs of Internal Clouds post of 2/24/2010:
A public cloud can provide access to computing resources that many companies would otherwise never be able to afford. The arguments for the cloud are well-known by now, but remain compelling—no up-front costs, virtually unlimited computing power on demand, and highly efficient pricing where customers pay only for resources used. There’s also less pressure on corporate IT departments that are charged with managing the infrastructure and budgeting for new equipment to keep up with demand.
But concerns about security and loss of control in public clouds have led to an alternative model—the internal cloud—that replicates the cloud environment inside the corporate firewall. Within these boundaries, enterprise users can provision computing resources as needed, using the cloud’s self-service capabilities while leveraging data center services. Internal clouds are often referred to as private clouds, but since private clouds can also be found in external environments, the “internal” designation is a more precise term for what we’re talking about here. (RightScale's blog post provides helpful definitions of the different cloud variants.)
With servers, applications and data within the enterprise walls, internal clouds can provide many of the benefits of cloud computing without the potential risks when the computing environment is provided by a third party. Unfortunately, the economics of internal clouds makes them inherently less efficient than the public cloud, especially as new technology makes the public cloud safer and more reliable.
Ellen continues with the reasons why internal clouds are inherently less efficient than the public cloud. More fodder for the public vs. private (vs. internal?) cloud controversy.
Ellen Rubin is the founder and vice-president of products at CloudSwitch.
Lori MacVittie asserts “Managing a virtual machine is not the same thing as managing the stuff inside it” to introduce her As Deep as a Puddle post of 2/24/2010:
I’ve been noticing a disturbing, though not unexpected, trend in the world of virtualization and cloud computing around management of infrastructure, particularly around virtual network appliances (VNAs). Specifically this trend is claiming the ability to manage virtualized infrastructure.
You’d think I’d be happy about that. I probably would - if the solutions were actually capable of managing the infrastructure.
Digging into these management solutions shows that for the most part the definition of the term “manage” is about as deep as a puddle; the buck (and control) stops at the virtual machine. What management and automation solutions promise is the ability to provision, manage, and migrate virtualized infrastructure. What they actually provide is provisioning, management, and migration of virtual machines. Whether it is infrastructure or applications running internal to the virtual machine is pretty much irrelevant; the solution is about managing virtual machines.
Lori continues with
- TURNING NETWORK ADMINISTRATORS into INTEGRATION SPECIALISTS …
- LACK of DYNAMISM …
- MANAGING INFRASTRUCTURE requires INTEGRATION …
topics. For more about management of virtual machines in clouds, see the articles about CA, Inc.’s acquisition of 3Tera in the Other Cloud Computing Platforms and Services section.
Dion Hinchcliffe’s The Enterprise Data Cloud: Why Information Power Is The Future of Business post of 2/23/2010 to the Enterprise Irregulars blog begins:
As organizations take a close hard look at cloud computing and how it can help them with their businesses, some are coming away unimpressed by the maturity or with concerns about risk, control, and privacy. Yet others are beginning to notice that there are a number of significant but previously unrelated threads in IT coming together to drive a compelling new cloud agenda. I’m calling this confluence of factors the “Enterprise Data Cloud”, and most organizations already have one, even if they aren’t aware of it.
The Enterprise Data Cloud is an ad hoc and evolving combination of:
- Existing network resources and new cloud delivery models (in particular the ability to seamlessly transition between public and private clouds and the shades of grey in-between.)
- Increasingly open high-value business data typically delivered up via SOA.
- The rapid expansion of business information in machine-readable form.
This last trend in data explosion is due to today’s digital enablement of practically all business artifacts. There has also been the associated rise of corporate social media (aka Enterprise 2.0), open APIs, and mashups. These three items in particular have done a great deal to actively encourage information to be opened, syndicated, and set loose within and across corporate walls, even if the impact is only now starting to be felt. …
Jeremy Geelan claims “Security is core to the work that Google performs and is not an afterthought” in a preface to his Cloud Computing Solves Traditional IT's Security Problems: Google post of 2/24/2010:
In what he called "traditional" IT, the security model - according to Eran Feigenbaum - is broken.
Feigenbaum, who is Director of Security at Google Enterprise, gave a talk in Washington this morning offering the hope that, because of Cloud Computing, some of the existing challenges may now be surmounted.
How do we build our systems to allow for fast recovery times, for zero or very limited loss of data...
“Not all cloud providers are created equal,” Feigenbaum explained. Google has one of the biggest security teams anywhere in the world of the Internet, he added. Furthermore, physical security is tip-top: Google’s datacenters are very low profile, at undisclosed locations - most often unmarked for added protection. They have 24/7 armed guards. He explained how Google “chunks up” users’ data into small, obfuscated pieces, Gmail’s email data for example, and thereby makes the attack surface much harder to understand. There are also six live replications of each email.
Because Google builds its own machines and writes its own OS, it has a very homogenous environment - making it easy to issue patches to the entire system, for example. It doesn’t use virtualization; it uses cheap x86 servers in vast numbers. It has a barcode on every hard drive that tracks how it is being used, who has accessed it, and what they did.
Who outside Google gets to see and appraise all these internal checks and balances? The company does a yearly third-party “security penetration test.”
Moreover it has a rigorous code development process.
That must be a heavyweight barcode to store the amount of information as Feigenbaum claims. A barcode with its own built-in solid-state drive?
Forrester Research’s Do You Know Where Your Data Is In The Cloud? post of 2/24/2010 includes a graphical “Interactive Data Protection Heat Map” and links to Forrester’s research about how Data Protection Regulations affect IT operations:
Country-specific regulations governing privacy and data protection vary greatly. To help you grasp this issue at a high level, Forrester created a privacy heat map that denotes the degree of legal strictness across a range of nations.
Click on a country in the "Map View" or choose from a directory of countries in the "List View" to learn more about country-specific data protection and privacy regulations.
Update 2/25/2010: I wasn’t able to test the heat map on 2/14/2010 because it appeared not to open from the above link, but you can view it from James Staten’s and Onica King’s Where Your Cloud Resides Matters post of 2/23/2010.
Ben Rothke and David Mundhenk answer “How do virtualization and cloud usage affect compliance with PCI?” in a six-page Virtualization, Cloud Computing and the PCI DSS post of 2/24/2010 to the CSO Data Security blog that’s prefaced “QSAs Ben Rothke and David Mundhenk provide practical advice:”
Two of the hottest IT technologies in 2010 are virtualization and cloud computing. Both are heavily evangelized in the industry as the "wave of the future" and the "next big thing." This is primarily due to perceived promises of reductions in hardware, software licensing and maintenance costs. To a large extent, all of these claims have merit. But the overarching issue is that it is easy to get caught up in the hype of these new technologies, while being oblivious to the myriad operational and security challenges in making them work.
Just how hot is cloud computing? 2010 had barely started when HP and Microsoft announced a $250 million partnership to develop integrated data center products that HP will offer as the HP Private Cloud.
Other major cloud news includes none other than Microsoft, who announced the addition of the OS versioning feature to its recently released Windows Azure platform as a service offering. This was needed as Azure users complained about how patches and upgrades unexpectedly affected the operating systems running under Azure.
Historically, many organizations get caught up in the excitement and associated hype of the latest technologies due to the fascination with all things "new and improved." In doing so, they can easily lose sight of the risk implications of quickly and indiscriminately embracing new technologies, without first performing the requisite due diligence exercises, including at the least, a formal risk assessment.
The concept of virtualized computing is deep-rooted in the halcyon days of mainframe computing. Mainframes were then and still are expensive to install and maintain. An enterprise fortunate enough to afford mainframes in the past also had to ensure the logical separation of computing system resources and data assets of the often various, and sometimes competing, business customers paying hefty sums to use them.
Ben and David continue with an analysis of “a logical partition or LPAR, which was conceived and secured to ensure a dedicated virtual environment from which those customers could address various critical business computing requirements” and virtualization as they relate to the Payment Card Industry’s Data Security Standard (DSS).
Bill Brenner describes Five Security Missteps Made in the Name of Compliance in this 2/24/2010 post to NetworkWorld’s Security blog:
Compliance pressures often push companies to make security improvements they wouldn't have tackled otherwise. More budget goes toward technology needed to protect customer data. New policies are created to rein in what employees do online with company machines.
But there's a dark side to this story. …
In the mad rush to comply -- whether the stick takes the shape of PCI DSS or the Red Flags Rule -- companies sometimes make decisions that weaken their security. Poorly chosen and deployed IT security technology is perhaps the best example; for more on that, see “How to Make Things Worse With IT Security Technology.”
Bill continues with descriptions of five common mistakes as related by IT security practitioners, analysts and consultants.
The Microsoft Partner Network will present a two-hour CSD24CAL: Is Azure Right for my Business? briefing via Live Meeting on 2/25/2010 at 11:00 AM EST (8:00 AM PST):
Microsoft Partners are cordially invited to attend an online Live Meeting briefing to hear new insights and points-of-views from well-known industry experts and leading edge partner companies on what the Windows Azure opportunity means for your organization. We will demonstrate how you will be able to change the face of your business and introduce new revenue streams for building solutions for our customers.
Have you pondered the business opportunity behind Azure?
Do you want to know how to:
- Leverage the opportunity and get started / engage your customer?
- Assess an organization for Azure and help them determine their S+S strategy?
- Compute cloud charges and determine the ROI of migrating to the cloud?
- Determine what best practices to apply in migrating applications to Azure?
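The “compute cloud charges” item above lends itself to a back-of-the-envelope sketch. The rates below are my reading of Windows Azure’s announced launch-era list prices (compute $0.12 per instance-hour, storage $0.15 per GB-month, bandwidth $0.10/GB in and $0.15/GB out) and should be treated as assumptions to verify against current pricing before quoting a customer:

```python
# Rough monthly-charge calculator using assumed launch-era Azure rates.
RATES = {"compute_hour": 0.12, "storage_gb_month": 0.15,
         "bw_in_gb": 0.10, "bw_out_gb": 0.15}

def monthly_azure_charge(instances, storage_gb, gb_in, gb_out,
                         hours_per_month=720, rates=RATES):
    """Estimate a month's bill for always-on instances plus storage/bandwidth."""
    return round(
        instances * hours_per_month * rates["compute_hour"]
        + storage_gb * rates["storage_gb_month"]
        + gb_in * rates["bw_in_gb"]
        + gb_out * rates["bw_out_gb"], 2)
```

For example, two always-on small instances with 10 GB of storage and 50 GB of outbound traffic work out to roughly $181.80 per month under these assumed rates; the ROI question is then whether that beats the amortized cost of buying and running the equivalent servers.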
At this briefing you will:
- Hear from a leading Azure partner on how to get-started identifying and developing Azure opportunities
- Hear from peers within your network about real projects and business development initiatives
Who should attend?
This is a business development briefing for Microsoft Partners, specifically system integrators who build custom-developed solutions. Partner attendees most likely to benefit from this briefing include:
- Executive Technology/ Business Liaisons
- Technology Decision Makers
- Development and Architecture Leaders
- 11:00 am - 11:15 am: Welcome by John McClelland, Partner Evangelist, Microsoft Corporation
- 11:15 am - 12:15 pm: Building An Azure Business Practice by Chris Rolon, Architect, Neudesic
- 12:15 pm - 1:00 pm: Showcase of An Azure Business Model by Wayne Beekman, Co-Founder, Information Concepts
Microsoft Unveils New Government Cloud Offerings at Eighth Annual Public Sector CIO Summit according to this press release of 2/24/2010:
Today, at the eighth annual Microsoft U.S. Public Sector CIO Summit, Microsoft Corp. unveiled a number of new enhancements and certifications for the Microsoft Business Productivity Online Suite (BPOS) that continue to raise the bar in cloud security and privacy. In a keynote speech addressing more than 300 public sector CIOs, Ron Markezich, corporate vice president of Microsoft Online, launched an industry first — a new dedicated government cloud offering based on the BPOS. For education, Microsoft also announced it is extending identity federation services to Microsoft Live@edu to help schools improve collaboration and security, as well as simplify identity management and support interoperability among disparate software applications. …
At this week’s CIO Summit, several government entities’ cloud solutions based on Windows Azure, Microsoft’s cloud services platform that provides a development, service hosting and service management environment, will be featured. To aid government transparency, the City of Miami created a public-facing 3-1-1 application using Azure to monitor and analyze non-emergency requests, such as pothole repair or missed trash pickup in their area. Another unique application of Azure, an example provided by the City of Chicago, will also be discussed. Last summer, organizers of the “Taste of Chicago,” one of the largest outdoor food and music festivals in the world, used Azure to provide interactive, online maps that its residents and visitors used to plan visits to the event. Visitors to the event could search by day, stage, event, and vendor and then print out a tailored map to use as a guide at the festival.
Microsoft’s 00:04:16 Cloud Computing for Government: The Power of Choice YouTube video segment of 2/16/2010 emphasizes “Software plus Services” and might have been produced for the Public Sector CIO Summit:
Mary Jo Foley adds details about the new BPOS offerings in her Microsoft unveils a locked-down, hosted app bundle for U.S. government customers post of 2/24/2010 to ZDNet’s All About Microsoft blog:
Microsoft is rolling out a new version of its Business Productivity Online Suite (BPOS) tailored specifically for federal government users, the company announced on February 24 during the kick-off of its annual CIO Summit in Redmond.
The new bundle is known as Business Productivity Online Suite Federal. No, this is not related to the still-unannounced BPOS Lite offering I wrote about recently. This is something aimed at government contractors “and others that require the highest levels of security features and protocols,” according to the Softies.
BPOS Federal is a security-enhanced version of the current BPOS product. BPOS is a Microsoft-hosted collection of Exchange Online, SharePoint Online, Communications Online and Live Meeting. The Federal version is hosted on “separate, dedicated infrastructure in secured facilities,” not in the existing datacenters where Microsoft currently hosts BPOS. (BPOS is a cloud offering, but isn’t currently hosted on Windows Azure. Microsoft officials have said the goal is to move BPOS to Azure, but haven’t offered a timetable for that.) [Emphasis added.]
It will be interesting to see when (and if) BPOS moves to the data centers and Azure. Running BPOS Federal on a dedicated infrastructure isn’t a good advert for Azure’s security.
Rachel Chalmers, Research Director at The 451 Group, “will share her insight on the future of enterprise private cloud-based IT as well as her recommendations for adopting a cloud-based dynamic IT model” in a 45-minute Enterprise Use Cases for Cloud-Hosted Dynamic IT with The 451 Group webcast on 3/3/2010 at 11:00 AM PST:
Date: 3 March 2010
Time: 1pm Central | 2pm Eastern | 11am Pacific
Register: Space is limited! Please click here to register for this live, online event »
In this session, you will learn about:
- Key CIO concerns and how they are impacted by the private cloud
- The difference between managed hosting and the cloud
- Key use cases and real-world case studies of dynamic cloud-based IT
- Recommendations on enterprise private cloud adoption
- The proven Surgient approach to implementing successful enterprise private clouds
Surgient is the Webcast’s sponsor.
Julie Allinson’s Repository & the Cloud event post of 2/24/2010 to the York Digital Library Blog reports on the Repositories and the Cloud event held the day before at Magic Circle Headquarters in London, UK. Here’s Julie’s summary of Alex D. Wade’s description of Microsoft Research’s Zentity and EntityCube projects:
‘Moving to a world where all data is linked and can be stored / analyzed in the Cloud.’
Windows Azure (http://www.microsoft.com/windowsazure/) is Microsoft’s cloud platform.
Zentity Cloud Storage (http://research.microsoft.com/en-us/projects/zentity/)
OGDI SDK: ‘OGDI is [an] open source starter kit written using C# and the .NET Framework that uses the Windows Azure Platform to expose data in Windows Azure Tables as a readonly RESTful service using the Open Data Protocol (OData) via an ASP.NET based Windows Azure web role.’ (http://ogdi.codeplex.com/)
EntityCube is a research prototype for exploring object-level search technologies, which automatically summarizes the Web for entities (such as people, locations and organizations) with a modest web presence. (http://research.microsoft.com/en-us/projects/entitycube/)
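OGDI’s read-only RESTful interface accepts standard OData system query options, such as $filter and $top, on the query string. As a minimal sketch of how such a query URI is assembled (the base URL, entity-set name, and column names below are hypothetical illustrations, not actual OGDI paths):

```python
from urllib.parse import quote

def odata_query_uri(base_url, entity_set, filter_expr=None, top=None):
    """Assemble an OData query URI against a read-only table service.

    The endpoint layout here is illustrative; a real OGDI feed has its
    own base URL and entity-set names.
    """
    options = []
    if filter_expr is not None:
        # Percent-encode the filter expression (spaces, quotes, etc.)
        options.append("$filter=" + quote(filter_expr))
    if top is not None:
        options.append("$top=" + str(top))
    uri = base_url.rstrip("/") + "/" + entity_set
    return uri + "?" + "&".join(options) if options else uri

# Hypothetical feed: first 10 rows where AgencyName equals 'MPD'
print(odata_query_uri("http://ogdi.example.com/v1/dc", "CrimeIncidents",
                      filter_expr="AgencyName eq 'MPD'", top=10))
```

An HTTP GET against the resulting URI would return the matching Windows Azure Table entities as an OData feed.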
According to the sponsors:
This free event will bring together software developers, repository managers, service providers, funding and advisory bodies to discuss the potential policy and technical issues associated with cloud computing and the delivery of repository services in UK HEIs.
We hope that you arrive with a general understanding about utility computing solutions in the public cloud (as those terms are defined on the Wikipedia cloud computing entry), particularly file storage.
We expect you to leave with a good understanding of:
- the technical options for integrating EPrints, DuraSpace and Microsoft Zentity with cloud services;
- the potential policy issues that arise from such usage;
and having had the chance to contribute your ideas and suggestions for future activities in this space.
The day will offer a mix of plenary talks (morning) and themed breakout sessions (afternoon) and is designed to appeal to both geeks and policy bods.
CA, Inc.’s CA To Acquire Cloud Computing Solution Provider 3Tera press release of 2/24/2010 claims “Global IT Management Leader Enables Customers to Rapidly Deploy New and Existing Applications to Private and Public Clouds:”
CA, Inc. (NASDAQ:CA) today announced a definitive agreement to acquire privately-held 3Tera®, Inc., a pioneer in cloud computing. 3Tera’s AppLogic® offers an innovative solution for building cloud services and deploying complex enterprise-class applications to public and private clouds using an intuitive graphical user interface (GUI). Terms of the agreement were not disclosed.
With 3Tera—which follows CA’s recent acquisitions of Cassatt, NetQoS and Oblicore—CA continues to aggressively expand its portfolio of solutions to manage cloud computing as part of an integrated information technology management program.
3Tera enables enterprises and service providers to provision, deploy and scale public and private cloud computing environments while maintaining full control, flexibility and reliability. 3Tera also makes it easy for service providers to offer application stacks on demand by adding applications to the AppLogic catalog, where they can be deployed to a low-cost, shared cloud infrastructure. 3Tera’s customers include more than 80 enterprises and service providers globally, which use the cloud computing technology to provide services to thousands of users.
“CIOs can use cloud computing to build and manage a new type of IT ‘supply chain’ across today’s virtualized internal and external technology infrastructure,” said Chris O’Malley, executive vice president of CA’s Cloud Products & Solutions Business Line. “3Tera technology is a powerful addition to the total solution CA provides for optimizing these high-value supply chains—from the mainframe to the cloud.” …
Barry X. Lynn’s Mainstream IT Buys into Cloud Computing: CA to Acquire 3Tera - A Message from Barry X Lynn, CEO 3Tera adds the 3Tera side of the story:
We started 3Tera to radically ease the way IT deploys, maintains and scales – MANAGES - applications. Our AppLogic® cloud computing platform provides the foundation of our partners’ orchestration of cloud services for public and private clouds around the world. Today, we’re taking the next step in moving toward making cloud computing mainstream by joining CA.
CA and 3Tera share a common vision for the future of cloud computing, and we are excited about the opportunities that this acquisition will create for our customers, partners and their cloud users.
This is a historic moment in Cloud Computing. The significance of this acquisition is a heck of a lot more than just a land grab in a hot space. We are confident that, as a team, CA and 3Tera will extend our leadership of the cloud computing platform market.
We are honored, given the plethora of Cloud Computing companies that have emerged in the last few years, that CA has chosen us. We really are!
It would probably be arrogant to suggest that we, in turn, chose CA. So I won’t suggest that. But the fact is, we had many options for the future and this is the one that excited us the most. …
You can watch a whirlwind video tour of AppLogic here.
James Staten and Glenn O'Donnell add Forrester Research’s take on the CA/3Tera deal in their Welcome to the cloud market, CA post of the same date: "In short, CA may have just taken the lead in IaaS platforms:”
If anyone doubted CA Inc.’s intention to get into the cloud computing market, you can’t get away with that skepticism anymore. This company is serious. Its acquisition of early cloud leader 3Tera takes its nascent cloud entreaties to an entirely new level. 3Tera was one of the poster children of the emerging Infrastructure as a Service (IaaS) market when its AppLogic platform was deployed across a collection of service providers as the basis for their cloud solutions. The company since has grown this into a network of more than 30 service providers across the globe, and a small collection of enterprises who use its software to power their clouds. As a cloud infrastructure platform, that’s a substantial lead in market penetration compared to the other favorites such as Eucalyptus, Enomaly, and of course VMware vCloud Express.
I call 3Tera’s customers a network not because they are interconnected but because the AppLogic software provides a higher level of abstraction than simply the hypervisor, and this abstraction eases the movement of workloads, whether single virtual machines or complex distributed applications, from one cloud to another. It’s also a differentiated platform play from Amazon EC2, Rackspace, and the other IaaS leaders as it has workload management built in, letting you package multiple VMs into a single service and deploy it as one logical entity and manage it as such. This ability to encapsulate applications comes from the roots of AppLogic as a workload management system, resulting in a relatively simple management system that can institute auto-scaling and failover policies. …
The authors continue with additional favorable analysis. However, “Taken the lead” over Amazon Web Services is a stretch in my opinion.
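Staten’s point about AppLogic packaging multiple VMs into one logical, policy-managed service can be sketched abstractly. The classes and the demand-to-instance-count rule below are illustrative inventions for exposition, not 3Tera’s actual API:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualMachine:
    name: str
    cpu: int     # virtual CPUs
    ram_gb: int  # memory in GB

@dataclass
class ScalingPolicy:
    min_instances: int = 1
    max_instances: int = 4

@dataclass
class ApplianceService:
    """A multi-VM application deployed and managed as a single logical entity."""
    name: str
    vms: list = field(default_factory=list)
    policy: ScalingPolicy = field(default_factory=ScalingPolicy)

    def scale(self, demand: float) -> int:
        """Map a clamped 0..1 demand signal to an instance count for the bundle."""
        span = self.policy.max_instances - self.policy.min_instances
        return self.policy.min_instances + round(span * max(0.0, min(1.0, demand)))

# The whole three-tier stack scales as one unit, not VM by VM.
svc = ApplianceService("web-store",
                       vms=[VirtualMachine("lb", 1, 1),
                            VirtualMachine("app", 2, 4),
                            VirtualMachine("db", 4, 8)])
print(svc.scale(0.5))
```

The design point worth noting is that the scaling policy attaches to the service, not to any individual VM, which is the abstraction Staten credits with easing workload movement between clouds.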
Larry Dignan chimes in with a CA adds to cloud portfolio, buys 3Tera article of the same date for ZDNet’s Between the Lines blog:
CA said Wednesday that it will acquire cloud computing player 3Tera.
Terms of the deal weren’t disclosed, but CA gets access to 3Tera’s AppLogic platform. AppLogic allows customers to pull together public and private clouds and display the connections in a handy interface.
With the deal, CA bolsters its cloud computing portfolio. CA has acquired Cassatt, NetQoS and Oblicore. The aim for CA is to bridge its IT management software to cloud computing deployments. …
In a statement, CA said it will integrate 3Tera’s software with CA’s IT infrastructure management tools—Spectrum Automation Manager and Service Assurance—as well as its cloud assets. CA will also extend support of 3Tera to virtualization platforms such as VMware ESX and Microsoft Hyper-V. 3Tera currently works on the Xen virtualization platform.
3Tera has 80 enterprise and service provider customers. …
Leena Rao joins the chorus with a CA Continues Shopping Spree; Acquires 3Tera To Boost Presence In The Cloud piece for TechCrunch:
IT software giant CA is acquiring cloud computing startup 3Tera. Terms of the deal were not disclosed. 3Tera’s product, AppLogic, helps enterprises build and deploy cloud-based applications to both public and private clouds.
CA is opening up the purse strings to boost its presence in the cloud. The company recently acquired Cassatt, NetQoS and Oblicore. 3Tera allows companies to provision, deploy and scale public and private cloud computing environments. 3Tera also makes it easy for service providers to offer application stacks on demand. 3Tera’s client base includes 80 enterprises and service providers globally, which use the cloud computing technology to provide services to users.
CA plans to integrate 3Tera’s AppLogic into its own suite of offerings. CA also plans to extend support of 3Tera to include both VMware ESX and Microsoft Hyper-V.
Reuven Cohen claims CA’s 3Tera acquisition is “bad for customers” in his What CA's Cloud Consolidation Means post of 2/24/2010:
Let me be as direct as possible: the current wave of cloud consolidation is great for entrepreneurs, bad for customers. By now you’ve probably heard about CA’s acquisition of 3Tera. So purely from the standpoint of an entrepreneur, the 3Tera acquisition is great for Enomaly from both a valuation point of view as well as a sales and marketing one. Effectively, in the short term (while the logistics of the deal are worked out) we have one less competitor in a space where the vendor selection was, and is, fairly limited.
Already this morning our sales team has seen an uptick in interest from ex- or soon-to-be-ex-3Tera customers looking for a true Elastic Computing platform. At the end of the day, most hosting customers seem to prefer working with a smaller, customer-driven company that can still offer a personal touch as well as direct influence over the roadmap and features of the product, something a much larger company just can’t do. With Enomaly you can still talk to the founders, and our customers like that.
On the bright side, CA’s acquisition is further proof of the booming opportunity in the cloud service provider market. If you’re a regional telecom, data center, or hosting provider without a cloud offering today, simply put, you’re being left behind. Assuming 3Tera’s pipeline looks like ours, there is a tremendous amount of interest in this space; actually, amazing might be a better adjective. And unlike other areas of cloud computing, there is real money to be made now, not tomorrow. CA’s M&A team obviously sees this opportunity; the real question is whether CA is the best company to capitalize on it.
This brings us to the logical question: is Enomaly next? When we started the company more than six years ago, we did so with the goal of building a unique and different kind of company (an Anomaly, if you will), something we had direct control of, something that was ours. Our heart and soul have gone into this company, and we will continue to put our blood, sweat and tears into our products. Our goal has always been to build an independent, sustainable, world-class company, a company that has arguably been in the middle of one of the biggest shifts ever seen in technology. I can’t tell you whether someday Enomaly will be bought or maybe we’ll start doing the buying, but what I can tell you is that today we’re focusing on what we do best: building an innovative and unique platform for service providers. Simply put, our mission is to make you successful.
@Ruv is Founder and CTO of Toronto-based Enomaly Inc., a 3Tera competitor.
Robin Wauters reports Google App Engine Sputters (Updated) in this 2/24/2010 TechCrunch article:
We’ve been getting a number of tips about the Google App Engine API being down hard, causing a good number of third-party services who depend on it to fail or be downright inaccessible. A quick check on API-status, which tracks that sort of thing, confirmed the service disruption.
The outage was also confirmed by the App Engine team in a Google Groups discussion, making it clear this wasn’t a scheduled event:
“Since 7:53am PST, App Engine has been experiencing an unexpected outage affecting the majority of App Engine applications. The team is working quickly to correct the cause and will have an ETA on the fix shortly. Please watch this thread for updates. We sincerely apologize for the inconvenience.”
Update: it’s back! (approx. 10:15 AM PST)
The word on Twitter is that the outage was due to a problem with the Backup Data Center.