Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.
• Update 3/6/2010: Steve Ballmer’s UofW presentation; Tim Anderson’s analysis of Steve’s speech; Lynn Langit’s SQL Azure slides; Lori MacVittie on Cloud Security with Infrastructure 2.0; Zoli Erdos and the Under the Radar event; Brent Stineman updates his Azure Service Configuration post; Mike Kelly explains A PowerShell Cmdlet for Managing Windows Azure Diagnostics; Matthias Jauernig describes his live Azure Rating Stress Simulator application; ebizQ announces Cloud QCamp 2010; Shuttervoice.com reports Largest Image Update For Bing Maps in live Azure World Tour app; and Vittorio Bertocci announces a series of Windows Identity Foundation Workshops.
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:
- Azure Blob, Table and Queue Services
- SQL Azure Database (SADB)
- AppFabric: Access Control and Service Bus
- Live Windows Azure Apps, APIs, Tools and Test Harnesses
- Windows Azure Infrastructure
- Cloud Security and Governance
- Cloud Computing Events
- Other Cloud Computing Platforms and Services
To use the above links, first click the post’s title to display the single article you want to navigate.
Discuss the book on its WROX P2P Forum.
See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.
Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.
You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:
- Chapter 12: “Managing SQL Azure Accounts and Databases”
- Chapter 13: “Exploiting SQL Azure Database's Relational Features”
HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the January 4, 2010 commercial release in February 2010.
See K. Scott Morrison’s REST Security Does Exist—You Just Need To Apply It post of 3/4/2010 in the Cloud Security and Governance section.
• The SQL Azure Team posted on 3/5/2010 a job opening for a Senior Development Lead to build “mission critical database APIs” for Linux and PHP connectivity to SQL Server and SQL Azure:
We are building a team in SQL Connectivity that will focus on providing PHP and Linux based secure, reliable and scalable connectivity libraries to exchange data with the SQL Server. This team is responsible for broadening our developer base in these strategic areas and ensuring that they will have the full power, performance and scalability necessary to build their applications against SQL Server.
If owning components that are required for the success of ISVs, Enterprises, MS Office, Sql Services, SQL Azure; solving complex technical problems; working on a team that has high visibility and a team that cares to listen to your ideas; are the type of things that get you excited, then this is the position for you.
We are seeking a highly motivated Senior Development Lead who is passionate about working on mission critical database API's. The ideal candidate will have 2-3 years of dev lead experience, strong Linux and/or PHP experience, a strong track record as a manager and mentor, the ability to collaborate and drive consensus across teams, and the ability to go deep in understanding complex subjects across all our SQL Server data access stacks. The technical ownership opportunities include the ability to drive long term connectivity solutions for SQL Server and SQL Azure.
Ryan Dunn explains Calculating the Size of Your SQL Azure Database in this 3/5/2010 post:
In Episode 3 of Cloud Cover, I mentioned the tip of the week was how to measure your database size in SQL Azure. Here are the exact queries you can run to do it:
-- Total database size in MB (space is reserved in 8 KB pages)
select sum(reserved_page_count) * 8.0 / 1024
from sys.dm_db_partition_stats

-- Size in MB, broken out by object
select sys.objects.name, sum(reserved_page_count) * 8.0 / 1024
from sys.dm_db_partition_stats, sys.objects
where sys.dm_db_partition_stats.object_id = sys.objects.object_id
group by sys.objects.name
The first one will give you the size of your database in MB and the second one will do the same, but break it out for each object in your database.
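The arithmetic behind both queries is simple: SQL Server tracks reserved space in 8 KB pages, so multiplying the page count by 8.0 yields kilobytes and dividing by 1,024 yields megabytes. A quick sketch in Python (the function name is mine, purely for illustration):

```python
# SQL Server reserves space in 8 KB pages; (pages * 8.0) gives KB,
# and dividing by 1024 converts KB to MB. This is the same expression
# Ryan's queries compute server-side.
def pages_to_mb(reserved_page_count: int) -> float:
    return reserved_page_count * 8.0 / 1024

# A database reserving 128,000 pages occupies 1,000 MB (about 1 GB).
print(pages_to_mb(128_000))
```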
Hat tip to David Robinson and Tony Petrossian on the SQL Azure team for the query.
Glenn Berry delivers More SQL Azure Samples with a very long T-SQL script in this 3/4/2010 post:
I have been playing around some more with SQL Azure in order to get ready for an upcoming presentation, so I thought I would share some of the T-SQL commands in the script below. I have a partial copy of the sample AdventureWorksLT2008R2 database installed on my SQL Azure logical “server” in the Southern US data center before this demo starts. You need at least the November CTP of SQL Server 2008 R2 in order to talk to SQL Azure with SSMS.
I have mainly just been trying different commands to see what works and what does not work in SQL Azure, so I hope you find this interesting and useful.
You’ll need the AdventureWorksLT2008R2 database running in SQL Azure to take full advantage of the script. See my Using the SQL Azure Migration Wizard v3.1.3/3.1.4 with the AdventureWorksLT2008R2 Sample Database post of 1/23/2010 for the quick and easy way to upload the database schema and data.
Lori MacVittie’s The IP Address - Identity Disconnect post of 3/4/2010 asserts “The advent of virtualization brought about awareness of the need to decouple applications from IP addresses. The same holds true on the client side – perhaps even more so than in the data center” and contests the foundation of SQL Azure’s firewall:
I could quote The Prisoner, but that would be so cliché, wouldn’t it? Instead, let me ask a question: just which IP address am I? Am I the one associated with the gateway that proxies for my mobile phone web access? Or am I the one that’s currently assigned to my laptop – the one that will change tomorrow because today I am in California and tomorrow I’ll be home? Or am I the one assigned to me when I’m connected via an SSL VPN to corporate headquarters?
If you’re tying identity to IP addresses then you’d better be a psychiatrist in addition to your day job because most users have multiple IP address disorder.
IP addresses are often utilized as part of an identification process. After all, a web application needs some way to identify a user that’s not supplied by the user. There’s a level of trust inherent in the IP address that doesn’t exist with my name or any other user-supplied piece of data because, well, it’s user supplied. An IP address is assigned or handed-out dynamically by what is an unemotional, uninvolved technical process that does not generally attempt to deceive, dissemble, or trick anyone with the data. An IP address is simply a number. …
Lori continues with an analysis of the need for user context:
Cookies. Unique names. Device types. Browser. Tokens. Any combination thereof. In other words, user context. It’s not that we need to stop using IP addresses altogether, but we do need to stop using them as an authoritative source on their own. They are still, certainly, part of the equation but it’s not an idempotent one and especially when used for security purposes it should be just one more piece of information used to make the determination to deny, allow, or grade access to application and network resources. …
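Lori’s prescription can be sketched as a toy scoring function. The signal names and weights below are hypothetical, invented purely to illustrate treating the IP address as one input among several rather than as an authoritative identity on its own:

```python
# A toy sketch of multi-signal access decisions: the client IP is one
# weighted signal among cookies, device fingerprint, and tokens.
# All names and weights here are hypothetical, for illustration only.
def access_score(signals: dict) -> float:
    weights = {
        "known_ip": 0.2,      # IP matches one previously seen for this user
        "valid_cookie": 0.3,  # session cookie checks out
        "known_device": 0.2,  # device fingerprint seen before
        "valid_token": 0.3,   # an SSL VPN or federation token, say
    }
    return sum(weights[name] for name, ok in signals.items() if ok)

# A traveling user on an unfamiliar IP, but with cookie, device, and
# token intact, still scores high enough for access.
score = access_score({"known_ip": False, "valid_cookie": True,
                      "known_device": True, "valid_token": True})
print(score)
```

The point is not the particular weights but the shape of the decision: losing the IP match degrades the score instead of denying identity outright.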
Dennis Forbes recommends Getting Real about NoSQL and the SQL-Isn't-Scalable Lie in this 3/2/2010 essay:
I work in the financial industry. RDBMSs and the Structured Query Language (SQL) can be found at the nucleus of most of our solutions.
The same was true when I worked in the insurance, telecommunication, and power generation industries.
So it piqued my interest when a peer recently forwarded an article titled “The end of SQL and relational databases”, adding the subject line “We’re living in the past”.
[Though as Michael Stonebraker points out, SQL the query language actually has remarkably little to do with the debate. It would be more clearly called NoACID]
That series focuses on NoSQL as the challenger to the throne. It isn’t alone as the past year has yielded a bountiful crop of articles and blog entries declaring the imminent death of the decrepit relational database at the hands of this new innovation.
“The ACIDy, Transactional, RDBMS doesn’t scale, and it needs to be relegated to the proper dustbin before it does any more damage to engineers trying to write scalable software.”
And they usually see later edits that blunt the original euphoria.
"Postnote: This isn’t about a complete death of the RDBMS. Just the death of the idea that it’s a tool meant for all your structured data storage needs.”
Dennis continues his argument under the following topics:
- So what is scalability, anyways? …
- When Money Is No Object …
- Artificially Limited Scalability …
- The Needs of a Bank Aren’t Universal …
- SQL is Scalable and NoSQL Isn’t For Everyone …
A convincing essay, IMO. Be sure to read the comments.
Dennis is a Toronto-based software architect focused primarily on the .NET and SQL Server worlds.
Jeff Darcy pushes back against Dennis in his Getting Silly About NoSQL post of 3/3/2010:
… I was going to try to be polite, until I saw your slam against All Things Distributed as a NoSQL advocacy site. What rubbish. Vogels and company know more about scalability than just about anyone, and more about using the right tool for the right job – which is why they provide RDS as well as SimpleDB. Even if that weren’t the case, what you’re engaging in is mere ad hominem. Vogels’s definition of scalability is right or wrong on the merits, not based on who he is or what other opinions you attribute to him. Might as well dismiss all of *your* definitions and claims based on your being an RDBMS advocate. As it happens, I was making Oracle scale before there was a NoSQL, before there was even RAC (it was OPS back then), and from then until now I’ve always used a very similar definition of scalability: maintaining a ratio of work done to resources used. For you to offer a different definition *is* the same sort of self-serving wordplay which you criticize in others.
Chirag Mehta chimes into the NoSQL space with a NoSQL Is Not SQL And That’s A Problem post of 3/5/2010:
I do recognize the thrust behind the NoSQL movement. While some are announcing the end of an era for MySQL and memcached, others are questioning the arguments behind Cassandra’s OLTP claims and the scalability and universal applicability of NoSQL. It is great to see innovative data persistence and access solutions that challenge the long-lasting legacy of RDBMS. Competition between HBase and Cassandra is heating up. Amazon now supports a variety of consistency models on EC2.
However none of the NoSQL solutions solve a fundamental underlying problem – a developer upfront has to pick persistence, consistency, and access options for an application.
I would argue that RDBMS has been popular for the last 30 years because of ubiquitous SQL. Whenever the developers wanted to design an application they put an RDBMS underneath and used SQL from all possible layers. Over a period of time the RDBMS grew in functions and features such as binary storage, faster access, clusters etc. and the applications reaped these benefits.
I still remember the days where you had to use a rule-based optimizer to teach the database how best to execute the query. These days the cost-based optimizers can find the best plan for a SQL statement to take guess work out of the equation. This evolution teaches us an important lesson. The application developers and to some extent even the database developers should not have to learn the underlying data access and optimization techniques. They should expect an abstraction that allows them to consume data where consistency and persistence are optimized based on the application needs and the content being persisted.
Continuing with the NoSQL Theme, Jnan Dash asserts “We also see Amazon's SimpleDB is introducing ‘forced consistency’ to the world of ‘eventually consistent’ model” in his Elastic Database Cloud? post of 3/4/2010:
The NoSQL movement is getting a lot of buzz lately. Recently I read an interesting article, an interview with GigaSpaces CTO Nati Shalom. Several key points are made here. The one-size-fits-all approach has never worked in our industry - one operating system, one DBMS, one middleware, etc. So the exuberant folks behind the NoSQL movement must be careful when they declare that the traditional SQL RDBMS business is going to die soon.
For example, we do understand the emerging needs of extreme scale from Twitter - the number of users grows continuously; each user maintains a list of followers that tends to grow continuously as well; and all communication is done on a many to many approach. "The combination of all that makes the traffic pattern in a twitter environment fairly viral and unpredictable and therefore stretches almost any boundaries of scaling we can think of." Hence innovative approaches are being tried such as a combination of in-memory and file-based access. It's all based on cost/GB and read/write performance.
It's worth looking at an analysis by Stanford researchers called "The Case for RamCloud" that gives a cost comparison between disk and memory-based approaches in terms of cost/performance.
We also see Amazon's SimpleDB is introducing “forced consistency” to the world of the “eventually consistent” model. As features of a full-function RDBMS get added to the NoSQL solutions, they start looking more like the current RDBMS, even though Nati Shalom does not think SimpleDB is a database. …
Jnan is Senior Advisor at EZShield Inc., Advisor at ScaleDB and Board Member at Compassites Software Solutions.
Brent Stineman’s Introduction to SQL Azure post of 3/4/2010 is a detailed tutorial for SQL Azure naifs:
Yet another simple attempt to document my continuing adventures with the Windows Azure Platform. In today’s edition, our hero attempts to move a simple on-premise database to SQL Azure and access it from an on-premise application.
I’ve covered getting Windows Azure platform benefits in another post. So we’ll presume you have already either purchased a subscription for SQL Azure or claimed any benefits you’re entitled to. Now my MSDN Premium subscription benefits include three 1 GB instances of SQL Azure. Today I’m going to set up my SQL Azure server and create one of those instances.
See Vittorio Bertocci’s Coming to a City Near You: Windows Identity Foundation Developer Workshops! post in the Cloud Computing Events section.
/n software inc. describes its new RSSBus for Azure (Beta) in this data sheet of 3/3/2010:
Now you can create and deploy RSSBus Simple Services on Windows Azure! RSSBus for Azure is a pure .NET implementation of RSSBus designed exclusively for deployment to the Microsoft cloud.
Download: RSSBus for Windows Azure [BETA]
RSSBus for Azure facilitates the creation of simple services and service based web applications using the Microsoft Windows Azure. RSSBus for Azure includes the RSSBus Engine, a library of RSSBus Connectors, the Admin Console, and a set of Simple Service demo scripts.
Due to the security restrictions of the Windows Azure platform some RSSBus Connectors may be unavailable for deployment to the Azure Cloud Service. For a complete list of RSSBus Connectors available for RSSBus Server, please visit the Connector Gallery.
RSSBus for Windows Azure Will Help You:
- Connect to information sources with the library of pre-defined services.
- Define new services and access data sources through RSSBus Connectors.
- Create custom feeds, widgets, and services that are accessible through simple standards-based interfaces including SOAP, REST, HTML, RSS, ATOM, JSON, XLS, CSV, and more.
- Developers can enhance RSSBus for Azure with additional data sources with the Extensible Connector architecture. …
Matthias Jauernig asks whether Windows Azure will succeed as a product in a recent post:
At PDC 2008, Ray Ozzie went out on a limb by saying that Windows Azure would be “setting the stage for the next 50 years of systems.” Everyone (me too) was excited about this new technology, and people were inspired by Microsoft’s vision and its new cloud computing platform.
16 months later, here we are: Windows Azure is live, the platform has been consolidated (R.I.P., Live Framework…) and data centers have been built up around the world.
But is Windows Azure this game changer that Microsoft promised and which they bet on? Will Windows Azure (as product) succeed? I’m not sure, but let’s see…
My thoughts on this topic have a certain background. In 2009 my company SDX invested significant research time into the innovative areas of cloud computing in general and Windows Azure in particular. And we’re still moving forward in this area. As part of the NewCloudApp() contest we built a little showcase named Rating Stress Simulator, which you can try out now on Windows Azure. …
Matthias continues with a plea to make Windows Azure more attractive to hobby developers, a feature that Tim Anderson and I also believe Azure lacks.
• Mike Kelly describes A PowerShell Cmdlet for Managing Windows Azure Diagnostics in this 3/5/2010 post:
I decided to write a PowerShell cmdlet to manage Windows Azure diagnostics. This cmdlet will let you retrieve and delete log information in the Azure cloud for your service; right now there isn’t any easy way to do this.
I started out by writing a simple spec for my tool; you can see it here. The spec just lets me flesh out some of the design issues before writing code.
I found that there is a PowerShell 2.0 SDK released a couple of months ago; I installed that.
Next, I fired up Visual Studio 2010 and looked for a template to create a PowerShell Cmdlet. Turns out there isn’t one. So you just create a class library; all there is to a PowerShell Cmdlet is an assembly that inherits from Cmdlet (or PSCmdlet). …
Mike continues with his narrative about how he finished the Cmdlet. He’s Director of Emerging Practices in Microsoft's Engineering Excellence group.
• Shuttervoice.com reports Largest Image Update For Bing Maps, Says Microsoft in this 3/6/2010 post:
Microsoft says Bing Maps has undergone its largest imagery update ever, as measured in square kilometers: 6.7 million square kilometers of new imagery.
The update adds new aerial and bird’s eye imagery on several continents, as shown in the picture below:
The Bing Maps World Tour shows the updates on the map. Another image shows the statistical data for many countries.
• Brent Stineman’s Azure Service Configuration Updated (or “Where did RoleManager go?”) post of 3/5/2010 begins:
In May of 2009, I wrote a blog posting that covered the basics of Azure Service Configuration Options. When I was looking through my blog statistics lately, I noted this posting is seeing a significant amount of traffic. This is tragic because the November release of the Windows Azure platform broke much of what I discuss in that posting. So I figured it was time to do right by that topic and make an update. :)
Brent continues with descriptions of changes to ServiceConfiguration.cscfg and ServiceDefinition.csdef files in the Azure release version.
Ryan Dunn and Steve Marx posted the Cloud Cover Episode 3 - Worker Role Endpoints Webcast to Channel9 on 3/5/2010:
Join Ryan and Steve this week as they cover the Microsoft cloud. You can follow and interact with the show at @cloudcovershow
In this episode:
- Learn how to host other web servers in Worker Roles.
- Hear about some new tools for working with SQL Azure.
- Find out how to calculate the size of your database and objects in SQL Azure.
Steve Marx explains Using Other Web Servers on Windows Azure in this 3/5/2010 post:
Ryan Dunn and I recently recorded the third episode of our new Channel 9 show, Cloud Cover, which should go live on Channel 9 tomorrow (Friday). In this week’s episode, I do a demo of a worker role with an input endpoint. That’s a fancy way to say I use a non-IIS web server in Windows Azure.
The technique I demonstrate is exactly the same as what’s used by our Tomcat Solution Accelerator, but it’s also been used to run Apache, Mongrel, and other web servers. In this post, I’ll show how to run Mongoose, a tiny web server, in the cloud.
Steve’s post covers the following topics:
- Background: Endpoints in Windows Azure
- Worker Roles with Input Endpoints
- Listening for Traffic on the Right Port
- Launching a Web Server
- Packaging the Web Server with Your Role
- Try It Out
I’ve zipped up everything (except the Mongoose binary) so you can try this sample yourself. To get the application working, just download Mongoose and place it in the Mongoose folder under the worker role.
Note that because you don’t have an HTTP input endpoint, Visual Studio doesn’t know to launch the web browser. To view the application once it’s running, open up the Development Fabric UI and check what port you’re on (should be port 80, or often 81 or 82 if you’re running other things). Then just browse to localhost on that port in the browser.
It’s also running in the cloud (for a week or so until I pull it down) at http://workerinput.cloudapp.net.
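The pattern Steve describes, discovering the port assigned to your endpoint and binding a non-IIS web server to it, can be sketched outside .NET with Python’s standard library. The environment-variable name below is hypothetical; the real worker role reads its endpoint from RoleEnvironment in the .NET SDK:

```python
import os
import threading
from http.server import HTTPServer, BaseHTTPRequestHandler
from urllib.request import urlopen

# In a real worker role the fabric hands you your endpoint's port; here
# we simulate that with a hypothetical environment variable, defaulting
# to 0, which asks the OS for any free port.
configured_port = int(os.environ.get("HTTP_IN_PORT", "0"))

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"hello from the worker role")

    def log_message(self, *args):
        pass  # keep the console quiet

# Bind the server and note the port we actually received.
server = HTTPServer(("127.0.0.1", configured_port), Handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The role is now serving HTTP on its endpoint, just as Mongoose does
# in Steve's sample.
body = urlopen(f"http://127.0.0.1:{port}/").read()
server.shutdown()
print(body.decode())
```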
It’s good to see Steve back in the blogging business.
Nuno Silva begins his A simple, yet interesting scenario series with this initial post of 3/3/2010:
Today I had the chance to exercise myself in the Evangelist role when I was confronted with a simple, yet interesting application scenario. I was asked for advice on how to build a small application that had some interesting requirements:
- A simple Silverlight “control” to be placed in a portal, which will be reused for several other portals in different countries.
- The control must allow both read and write access to a set of data
- Due to budget constraints, the organization doesn’t have the resources to acquire a database server to hold that data …
In the end, after carefully analyzing our alternatives, we came up with the following solution:
- Silverlight control (it is a requirement and I would have advised it anyway)
- Database hosted somewhere in the cloud
- Service hosted on Azure to provide both read and write access to the data
- Depending on what they want to achieve they can opt for a WCF service or a REST service (using ADO.NET Data Services).
He continues the process with A simple, yet interesting scenario: creating a database on Microsoft SQL Azure of 3/4/2010 and A simple, yet interesting scenario: hosting a WCF service on Windows Azure of 3/5/2010.
Nuno (@nuno_ms) is a Developer Evangelist at Microsoft Portugal. I’ve subscribed to his feed.
Microsoft posted a new Cloud Services portal to coincide with Steve Ballmer’s The Cloud: Exciting New Possibilities presentation at the University of Washington’s Paul G. Allen Center for Computer Science & Engineering of 3/4/2010 with the theme of “We’re all in” [the cloud.]
So, what I'm going to try to do is share with you five key dimensions in the cloud, five key opportunities, five key things that I think need your best ideas, your best thoughts, your best invention, commercial inventions, academic inventions, product inventions to really drive forward. And I'll give you a little bit of context about what Microsoft is doing in that regard.
In brief, following are Steve’s “five key dimensions in the cloud:”
- The cloud creates opportunities and responsibilities. …
- The cloud learns and helps you learn, decide and take action. …
- The cloud enhances your social and professional interactions. …
- The cloud wants smarter devices. …
- The cloud drives server advances that, in turn, drive the cloud. …
- Todd Bishop’s Notes: Steve Ballmer's 'Five Dimensions of the Cloud' and Nick Eaton’s Ballmer betting on cloud computing: 'We're all in' posts of 3/4/2010 summarize Steve’s speech quite well.
- Nancy Gohring reported “All Microsoft products driven by idea of being connected to the cloud, CEO tells students” in her Microsoft's Ballmer says he has bet the company on the cloud article of 3/4/2010 for the IDG News Service.
- Sebastian Rupley’s take is Ballmer on the Cloud: Throw Out All the World’s Software? for GigaOm.
- Claudine Beaumont delivers the UK view in her Ballmer's Hazy Definition Of The Cloud article for the Daily Telegraph.
A full transcript of the 01:15:48 speech is here.
• Tim Anderson posted his Microsoft maybe gets the cloud – maybe too late analysis of Ballmer’s take on Azure on 3/5/2010:
Someone asks if Microsoft is just reacting to others. Ballmer says:
“You know, if I take a look and say, hey, look, where am I proud of where we are relative to other guys, I’d point to Azure. I think Azure is very different than anything else on the market. I don’t think anybody else is trying to redefine the programming model. I think Amazon has done a nice job of helping you take the server-based programming model, the programming model of yesterday that is not scale agnostic, and then bringing it into the cloud. They’ve done a great job; I give them credit for that. On the other hand, what we’re trying to do with Azure is let you write a different kind of application, and I think we’re more forward-looking in our design point than on a lot of things that we’re doing, and at least right now I don’t see the other guy out there who’s doing the equivalent.”
Sorry, I don’t buy this either. Azure does have distinct advantages, mainly to do with porting your existing ASP.NET application and integrating with existing Windows infrastructure. I don’t believe it is “scale agnostic”; something like Google App Engine is better in that respect. With Azure you have to think about how many virtual machines you want to purchase. Nor do I think Azure lets you write “a different kind of application.” There is too little multi-tenancy, too much of the old Windows server model remains in Azure.
Finally, I am surprised how poor Microsoft has become at articulating its message. Azure was badly presented at last year’s PDC, which Ballmer did not attend. It is not an attractive platform for small-scale developers, which makes it hard to get started.
Mary Jo Foley’s The cloud slide Steve Ballmer should have shown of 3/4/2010 suggests Lynn Langit’s slide as more appropriate to Microsoft’s cloud story:
Based on his presentation, Ballmer seemingly was using the terms “cloud” and “Internet” interchangeably. But to me, the Web is not the same as the cloud. Then again, maybe I’m just splitting hairs…
I understand that there is no single cloud. Is Microsoft Hotmail a cloud app? Sure, it runs in Microsoft’s datacenters somewhere. Ditto with Xbox Live, the Danger Sidekick services, Office Web Apps, Windows Live services, Microsoft’s hosted Business Productivity Online Services (BPOS), etc. There are lots of Microsoft servers running different Web-based apps and services out there, all of which could be called part of “the cloud” even though none of the ones I’ve mentioned is running on Microsoft’s Windows Azure.
Oh yeah. Azure …. When most pundits and industry observers talk about Microsoft and its cloud strategy, they mean Azure. I bet a lot of Microsoft’s customers and developers think this way, too. Ballmer made very few references in his UW talk today to Azure — maybe because on the Azure cloud front, Microsoft is playing catch up (at least timing-wise) to others already out there, including Amazon, Google, Salesforce and more.
Ballmer said Microsoft would support the public cloud, the customer (private) cloud, the partner cloud and the government cloud. Until today, I felt Microsoft’s story about how it would do this was pretty clear and straightforward. It was software+services and/or three-screens-and-a-cloud. According to that “story,” Microsoft offers users a wide span of choices: Run your applications on-premises; partially on-premises and partially in the cloud; or completely in the cloud. On the cloud side, these applications can be hosted by Microsoft partners and/or Microsoft.
In other words, like this. (Click on the slide below to enlarge. Note IaaS is Infrastructure as a service and PaaS is platform as a service.):
(from “Introduction to Windows Azure” by Softie Lynn Langit) …
Mary Jo links to Jason Kincaid’s Steve Ballmer’s Memo To Microsoft Staff: “We Must Move At Cloud Speed” post of 3/4/2010 to TechCrunch, which carries the full text of Steve’s “all-staff” email about the five dimensions and requests Microsoft employees to:
- Watch the speech on demand here
- Learn more about our cloud offerings and how they relate to our overarching software plus services strategy here
- Review your commitments to ensure you are landing our vision with customers and partners.
Mike Wickstrand of the Windows Azure team has posted a quick survey which requests users to choose between manual, automatic, or delayed automatic upgrading of the guest operating system in Microsoft data centers. It took me about two minutes to complete it.
Charlie Burns and Robert McNeill co-authored the User IT Trends Breed Alternatives to Traditional IT Management Research Alert of 3/4/2010 for Saugatuck Research:
Recent announcements by significant IT players continue to indicate new dynamics in the traditionally lethargic arena of IT systems management. New offerings are threatening the status quo of the “Big Four” (i.e., BMC, CA, HP, and IBM).
What is Happening?
As we recently detailed in Strategic Perspective MKT-708, “The Cloudy Future of IT Service Management,” published 26 Feb. 2010, IT service management (ITSM) is one of the traditional IT support functions that are being targeted by a new set of vendors. User IT organizations are selecting from a growing list of functionally-rich alternatives with an array of implementation options ranging from traditional on-premise solutions to on-premise appliances to SaaS and cloud-based tools. …
Charlie and Robert continue an analysis of the IT management tool market, which includes cloud computing data centers.
Enterprise Management Associates’ New EMA Research Examines “The Responsible Cloud” press release of 1/13/2010, which I missed when published, reports:
Enterprise Management Associates (EMA), a leading IT management research and consulting firm, today released its latest research report entitled, “The Responsible Cloud.” … Some of the key findings in the report include:
- 76% of all enterprises report that cloud computing has resulted in real, measurable cost savings. Discounting those that have not measured or cannot tell, 89% of enterprises with production deployments report a real, measurable cost reduction as a result of implementing cloud computing.
- Service improvement and cost reduction are the most important drivers and the strongest outcomes. Over 75% of enterprises report CapEx and/or OpEx savings – on average over 20%. However, HR issues, politics, lock-in, and management are all rated as critical barriers to success.
- 70% of experienced organizations rate the role of IT Management as highly important to the success or failure of cloud computing.
There is no doubt as a result of this research that cloud computing requires far more attention than the slap-dash “cowboy” efforts that many seem to advocate. It requires a carefully planned approach that establishes a firm foundation in virtualization and infrastructure management, adds careful attention to core principles of IT and business service management, and assures commitment to principles of security and compliance throughout. …
• Lori MacVittie claims “The current threat level is … the same as it was yesterday, and the day before, and will be tomorrow” in her When Everything is a Threat Nothing is a Threat post of 3/5/2010:
We’ve all been in the airport before and heard the announcement. “The current threat level is orange. Blah blah blah blah yada yada whatever.” At least that’s what I hear today because I’ve become immune to the fact that “orange” means there’s a threat. There’s always a threat, it seems, and the announcement simply conveys what appears to many of us to be the “status quo.” We have effectively been desensitized to a “higher” threat level as it is now treated as nothing out of the ordinary. It is the norm, rather than something that grabs our attention.
The same is true in the enterprise, where the threat level is always high. Although most organizations likely don’t have a “threat level announcement” the effect is the same: personnel and infrastructure alike treat this allegedly “heightened awareness” as the status quo. It’s no longer actually heightened, or more aware, it’s the way it always is. Many times this is because there’s always a credible threat to the infrastructure and applications of any organization. At any time there may be an incursion, an attempt at penetration, the exploitation of an old or newly discovered vulnerability. This forces information security teams to put into place the infrastructure and solutions, both active and monitoring, that will detect and (one hopes) prevent a successful attack. These solutions are always on alert, twenty-four by seven, three-hundred sixty-five days a year. …
A dynamic infrastructure, enabled by Infrastructure 2.0, however, might resolve this problem – or at least give us the means by which we can architect an infrastructure that can. …
K. Scott Morrison asserts REST Security Does Exist—You Just Need To Apply It in this 3/4/2010 post:
On the eve of the RSA conference this year, Chris Comerford and Pete Soderling published a provocative article in Computerworld titled Why REST security doesn’t exist. It’s a prelude to a talk the authors are delivering at the conference. Their premise is that while good REST security best practices do indeed exist, developers just don’t seem to follow them.
Comerford and Soderling attribute this state of affairs to a combination of two things. First, REST lacks a well-articulated security model. Few would argue with this—REST, by virtue of its grassroots origins, suffers from a security just-do-it-like-the-web nonchalance that’s certainly done it no favors.
The second issue concerns developers who tend to rush implementation without giving due consideration to security. Truthfully, this is the story of security across all of IT, but I might suggest that with REST, the problem is especially acute. The REST style owes much of its popularity to being simple and fast to implement, particularly when faced with the interest-crushing complexity and tooling demands of the WS-* stack. It’s reasonable to think that in the enthusiastic dash to cross the working-application finish line, security is conveniently de-emphasized or forgotten altogether. …
Scott implements Comerford and Soderling’s recommendations in a simple policy for the Layer 7 Technologies SecureSpan Gateway:
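One widely recommended REST security practice of the kind Comerford and Soderling advocate is timestamped, HMAC-signed requests (the approach Amazon S3’s API popularized). The sketch below illustrates the idea only; the shared key, message layout, and tolerance window are illustrative assumptions, not part of Scott’s SecureSpan policy:

```python
import hashlib
import hmac
import time

# Hypothetical shared secret; in practice each API client has its own key.
SECRET_KEY = b"demo-shared-secret"

def sign_request(method: str, path: str, timestamp: int,
                 secret: bytes = SECRET_KEY) -> str:
    """Client side: HMAC-SHA256 over the request line plus a timestamp.

    Binding the timestamp into the signature lets the server reject
    replayed requests that fall outside a small tolerance window.
    """
    message = f"{method}\n{path}\n{timestamp}".encode("utf-8")
    return hmac.new(secret, message, hashlib.sha256).hexdigest()

def verify_request(method: str, path: str, timestamp: int, signature: str,
                   secret: bytes = SECRET_KEY, max_skew: int = 300) -> bool:
    """Server side: recompute the signature and compare in constant time."""
    if abs(time.time() - timestamp) > max_skew:
        return False  # stale or replayed request
    expected = sign_request(method, path, timestamp, secret)
    return hmac.compare_digest(expected, signature)

ts = int(time.time())
sig = sign_request("GET", "/orders/42", ts)
assert verify_request("GET", "/orders/42", ts, sig)
assert not verify_request("GET", "/orders/43", ts, sig)  # tampered path fails
```

Because the signature covers the method, path, and timestamp, any tampering or replay outside the window fails verification without the secret ever crossing the wire.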
Mike Kirkwood suggests “potential upgrades to 802.1 would enable a richer dialog between the server as it starts up its networking process” in his Reinventing the Handshake: Polite Servers and Smart Networks Lead to Active Security post of 3/4/2010:
If there was a real-time tag cloud for the RSA conference this year, three words would be in big bold letters: Security (of course), Cloud, and Virtualization. Paul Congdon, from HP's ProCurve Networking group, gave us a view into the not-so-distant future where servers, like good house guests, knock before entering. In this case, it's the link they request, and to get it they will properly announce themselves and their intentions to allow the host to prepare to accommodate them.
This capability is a linchpin in removing the process bottleneck in provisioning new services in the data center. For most organizations, the network is manually configured. To keep up with the pace of virtual-machine provisioning, the network needs to enable "plug and play".
802.1X is a technology that has been used in WiFi connections. One reason it was useful in that context is that the link is expected to drop and reconnect frequently, so each reconnection is seen as an opportunity to authenticate; the same opportunity is now envisioned for the physical link as well.
B&L Associates quotes Forrester Research in this Business Balking on Cloud Storage post of 3/4/2010:
While the cloud may have an irresistible attraction to some businesses for some purposes, storage doesn't seem to be one of them, at least according to some recent findings from Forrester Research. In a report titled "Business Users Are Not Ready For Cloud Storage," the firm disclosed that IT decision makers in North America and Europe just aren't ready to trust the cloud with their data.
"Given that data storage capacities are growing at 30 percent to 40 percent per year but storage budgets are flat or growing minimally, IT professionals are eager to take advantage of the low cost per gigabyte offered by cloud providers," the report's executive summary noted.
"Storage vendors are eager to be the ones to provide storage-as-a-service or supply storage systems to the cloud providers," the summary continued. "However, data from Forrester's 2009 hardware survey shows that this is just talk, so far."
Don't look for companies to trust their data with services like Amazon S3, EMC Atmos, Nirvanix, The Planet or AT&T Synaptic Hosting any time soon, warned the report, penned by Forrester researcher Andrew Reichman with Stephenie Balaouras and Alex Crumb. "There is long-term potential for storage-as-a-service, but Forrester sees issues with guaranteed service levels, security, chain of custody, shared tenancy, and long-term pricing as significant barriers that still need to be addressed before it takes off in any meaningful way," the summary said.
Not surprisingly, the top concerns of the techies polled for the report about cloud storage were privacy and security. No doubt the recent breach into Google's computers by Chinese crackers threw gasoline on those burning concerns. Of the 2200 IT executives interviewed for the report, 49 percent of those from enterprises and 51 percent from small- and medium-sized businesses cited those concerns for not using the cloud. …
John Pescatore asks Cloud Computing: Will It Be Government’s Venus Fly Trap? in his 3/4/2010 post to the Gartner blog:
The cryptographer’s panel at the RSA conference is always my favorite part. At this year’s conference, Adi Shamir (the S in RSA) made a comment along the lines of “One of my fears for the future is that cloud computing is a ‘dream come true’ for government intelligence agencies.” He actually used a more colorful term for ‘dream come true’ but his basic point was something I point out to Gartner clients all the time: in many countries (the US included) companies are legally (and often illegally) required to cooperate with government requests to surreptitiously monitor communications and content flowing through or stored on their systems.
There is a school of thought that true cloud computing means no care at all about the physical location of the storage. The fact that many governments can compel any company or service provider operating in their country to expose their customers' data means that, for real businesses, location does matter.
He continues with an analysis of potential solutions including encryption, striping or scattering the data across multiple data centers in multiple countries, striping/scattering encrypted bits and “tokenization as a service,” which John believes offers more promise.
John is an analyst in Gartner’s security group.
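The tokenization option John favors can be sketched in a few lines: sensitive values never leave a vault kept under the owner's jurisdiction, and the cloud stores only opaque tokens. The `TokenVault` class below is an illustrative assumption, not any vendor's tokenization-as-a-service API:

```python
import secrets

class TokenVault:
    """Minimal tokenization sketch: real values stay in the vault
    (on-premises or in a trusted jurisdiction); only random tokens
    travel to, and are stored in, the cloud."""

    def __init__(self):
        self._by_token = {}
        self._by_value = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so equal values map to equal tokens,
        # which keeps joins and lookups in the cloud workable.
        if value in self._by_value:
            return self._by_value[value]
        token = secrets.token_hex(16)  # random; reveals nothing about the value
        self._by_token[token] = value
        self._by_value[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._by_token[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
assert token != "4111-1111-1111-1111"      # the cloud only ever sees the token
assert vault.detokenize(token) == "4111-1111-1111-1111"
assert vault.tokenize("4111-1111-1111-1111") == token  # stable mapping
```

A government subpoena served on the cloud provider then yields only meaningless tokens; compelling disclosure requires reaching the vault in its home jurisdiction, which is exactly the property John sees as promising.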
• ebizQ announced Cloud QCamp 2010 on April 7, 2010 in this 3/6/2010 post:
ebizQ has been covering Cloud Computing for the last three years. Last June, ebizQ organized a Cloud Qcamp virtual conference, where leading industry experts and practitioners explored the role of service-oriented architecture (SOA) and business process management (BPM) in supporting cloud-computing initiatives. This April, ebizQ will help enterprises cut through the hype and focus on issues surrounding cloud computing, covering Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS). This year's QCamp will also focus on development of Private Clouds in Enterprises.
- The Economics of Cloud Computing: Joe McKendrick
- Best Practices in Moving Data to the Clouds: David Linthicum
- How Can the Cloud Fit Into Your Applications Strategy?: Phil Wainewright
- Can the Cloud be Governed?: Joe McKendrick
- Platform as a Service: How to Avoid Lock-In: Phil Wainewright
- What is Virtualization and Cloud Computing, and How Interconnected Are They?
• Zoli Erdos’ Under the Radar: Commercializing the Cloud – Apply to Present / Discount Tix Here post of 3/5/2010 to the Enterprise Irregulars blog announces:
Under the Radar is Silicon Valley’s most established startup debut platform: a conference series organized by Dealmaker Media, covering business applications, social media, entertainment, mobility, etc.
This year’s conference in Mountain View, CA on April 16th will focus on Commercializing the Cloud – that’s a fairly wide definition, and one that perfectly meshes with our focus over @ CloudAve, so we’re proud to be Media Partners at this event. That means we’ll be covering it before, during and after, and if you decide to attend, we’ll get you in at a discount rate.
In this American Idol of startups, typically 32 finalists are selected, grouped into categories of 4, and each has about 15 minutes to present in two parallel tracks. They get grilled by the judges and audience, and at the end of the conference the winners of each category are announced. A few years ago I participated in the pre-selection of startups, and I remember having checked out hundreds of companies to come down to the finalist set. At the moment 19 finalists have been announced:
AppDynamics, AppFirst, Aprigo, Cloudant, CloudShare, CloudSwitch, Conformity, CubeTree, Fonolo, GoodData, Layerboom Systems, Makara, MaxiScale, Neo Technology, NorthScale, Reductive Labs, RiverMuse, SaaSure and SendGrid.
This means two things:
- A dozen or so slots are still open
- The Selection Committee will likely sift through another 100+ applications to fill those slots.
So if you consider your startup a (future) leader in SaaS | Collaboration | Business Apps | Development Tools | Compliance | (and more!), don’t waste time, apply here to be a presenter. …
• Vittorio Bertocci (a.k.a. Vibro) reports on 3/4/2010 Coming to a City Near You: Windows Identity Foundation Developer Workshops!:
3/29-3/30 - Brussels, Belgium
4/7-4/8 – Chertsey, UK
5/_-5/_ - Munich, Germany
5/18-5/19 – Singapore
5/_-5/_ - Sydney, Australia
6/1-6/2 – Redmond, USA
Want to gain deep, hands-on knowledge on Windows Identity Foundation? Well, we may just have what you need here!
By the end of this month I’ll hop on a plane and start going around the world, delivering 2-day workshops about WIF.
The Workshops Format
2 days of full immersion in Claims-Based Identity and Windows Identity Foundation, where traditional lectures alternate with instructor-led labs. Everything will be as interactive as it can be, so that you can squeeze as much value as possible from your participation, which is why I am personally flying to every location to deliver the training. …
… Apart from the first 30/40 mins on day 1, the workshop content is quite deep and designed for a developer audience. Architects are welcome as well, provided that they are hands-on and know their way around Visual Studio.
We will touch on all the standard stuff, then we’ll dig deeper on some key topics (extensibility, custom STSes on-premises and in Windows Azure, WIF and Silverlight, WIF+WCF+Windows Azure, etc). [Emphasis modified.]
The idea is not (only) to give you a list of recipes for handling a list of given scenarios, but to make you understand what makes WIF tick so that you know where to put your hands and what to change on every occasion. Perhaps even more important, during the 2 days spent together I will challenge you to learn to think in terms of claims about your scenarios: if you get that, all the rest is quite literally syntactic sugar. …
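Thinking "in terms of claims" means authorizing from attributes asserted by a trusted issuer rather than from local role lookups. WIF itself is a .NET framework, so the Python sketch below only illustrates the pattern; the issuer URL and claim types are hypothetical:

```python
# A claim is an attribute asserted about a subject by an issuer;
# here each claim is a simple (issuer, claim_type, value) tuple.

TRUSTED_ISSUER = "https://sts.contoso.example"  # hypothetical STS

def can_approve_expense(claims: list, amount: float) -> bool:
    """Decide from claims, not from a local user database: the app
    trusts whatever a trusted security token service asserts (here,
    a spending limit) and never hard-codes roles."""
    for issuer, claim_type, value in claims:
        if issuer == TRUSTED_ISSUER and claim_type == "spendingLimit":
            return amount <= float(value)
    return False  # no trusted spending-limit claim means deny

claims = [
    ("https://sts.contoso.example", "name", "alice"),
    ("https://sts.contoso.example", "spendingLimit", "500"),
]
assert can_approve_expense(claims, 250.0)
assert not can_approve_expense(claims, 900.0)
# Claims from an untrusted issuer are simply ignored:
assert not can_approve_expense(
    [("https://evil.example", "spendingLimit", "9999")], 100.0)
```

The payoff is the one Vibro describes: once authorization decisions consume claims, swapping identity providers or moving the app to Windows Azure changes the token plumbing, not the business logic.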
How to Participate
Aaaand we finally get to the interesting part. How to participate? The event is free of charge, apart of course from your T&E if you need to travel to the venue. Unfortunately the highly interactive nature of the workshop format (and the requirement to provide PCs for the labs) imposes hard limits on the number of participants: fewer than 20 people per event. If you want to attend, don’t hesitate!!! The current schedule is [above]. …
If you live in Belgium, the UK, Singapore or the US and you want to participate in the workshop, please get in touch with your local DPE contact.
If you live in Germany or Australia: the two workshops there are still not confirmed, but we are working on it. If you would be interested in participating, please let your local evangelist know.
Lynn Langit’s Intro to Windows Azure from MSDN Irvine, CA Event post of 3/4/2010 provides links to her presentation slides and more:
Expanded deck from yesterday’s talk – includes information on Windows Azure data storage options as well as basic introduction to cloud computing and Microsoft’s Windows Azure offering: Windows Azure Introduction. View more presentations from lynnlangit.
Here’s an image that I like (from Microsoft’s PSA Keith Pijanowski) which shows a logical representation of Microsoft’s Windows Azure cloud computing offering:
Keith has also written a blog post about IaaS, PaaS and SaaS that I found to be helpful – link here.
Also, here’s a link to a comparison of the cloud computing offerings from vendors other than Microsoft (requested by the audience in Irvine).
By the way, after you build and deploy your first cloud application (using ANY vendor), I’d love to hear more about your experiences (good or bad) – drop me a line via this blog.
Brian Loesgen reminds Southern Californians on 3/5/2010 that San Diego: Windows Azure Conference is Tomorrow! (Saturday):
San Diegans, time is running out: the Windows Azure conference (I blogged about it here) is *tomorrow*. This is a great opportunity to ramp up quickly on what Windows Azure is and how it can be used in the real world. Come see why everyone is so excited, and why everyone agrees that this is a major shift in our industry. This is not future-tech, the cloud isn’t vapor anymore :) – this is live and production-ready today.
I will be presenting on Windows Azure platform AppFabric, and specifically how to leverage it to bridge between on-premise and off-premise (or, from-one-premise-to-another-premise).
Hope to see you there!
IDC reports its Directions 2010 Capitalizing on the Recovery: Building the Foundation for the Intelligent Economy 45th annual industry/business briefing will take place on 3/10/2010 at the Santa Clara Convention Center, Santa Clara, CA:
Recent economic conditions have brought near-death experiences for many of the industries' largest players, exposing massive inefficiency, dangerously poor visibility and transparency, and a painfully slow ability to innovate and adapt. As the world slowly recovers from recession, it's no surprise that almost every industry is poised for significant - and, in many cases, long overdue - restructuring and transformation. A common element of industry transformation and revitalization is a deeper embedding and leverage of ICT to foster an intelligent economy, greatly improve growth and efficiency, and stave off future crises.
The Windows Azure Team suggests that you Submit Your Solution for the Windows Azure Platform Partner of the Year Award in this 3/3/2010 post:
Are you a Microsoft partner who has built and delivered an innovative solution on the Windows Azure Platform? This year we're showcasing more partner solutions than ever before at the Microsoft Worldwide Partner Conference (WPC) 2010 to be held in Washington, D.C. July 11 - 15, 2010. Don't miss this opportunity to get your innovative solution recognized with a Microsoft WPC 2010 Award!
Winning a Microsoft WPC Partner Award gives you valuable business advantage through public recognition from your peers, press, analysts, and your customers. This year, the WPC 2010 Awards Program is being expanded to showcase more partner solutions than ever, including the newly added Microsoft Country Partner of the Year award. Past winners have demonstrated significant customer impact, solution innovation, speed to market, deployment, and utilization of advanced features in Microsoft technologies and have optimized Microsoft technologies to meet customer business needs and exceed expectations.
Receive onsite acknowledgement at WPC 2010 and take advantage of a comprehensive marketing and press kit designed to help you attract prospects and gain new customers. Click here for more information and find out how to submit your entry to become the Windows Azure Platform Partner of the Year Award winner.
Dan Worth reports from CeBIT “Security must remain the top priority as businesses rush to embrace cloud computing to help reduce expenditure and improve sales, according to Werner Vogels, chief technology officer at Amazon Web Services (AWS)” in his CeBIT 2010: Firms told to focus on cloud security post of 3/5/2010:
Speaking at the CeBIT trade show today, Vogels said AWS is seeing increasing demand for its services, as it is able to offer the scalability and flexibility that companies need to work in today's IT and business environments.
"We are seeing firms taking advantage of services for traditional reasons such as web hosting or software distribution, but also in new areas like disaster recovery and large-scale analytics," he said.
Vogels added that many IT chiefs see cloud computing as an integral requirement in any purchase of new services, underlining its growing importance to businesses now and for the future.
"CIOs are making bold moves in the cloud space and demanding that new technologies are cloud-ready, not only in areas where it is obvious that technology needs to work in the cloud, such as web hosting or marketing campaigns, but also in areas like media distribution or collaboration tools," he said.
However, he stressed that firms must make sure security remains key when considering cloud products, both to protect their investments and to adhere to data protection laws. …
See Jnan Dash's Elastic Database Cloud? post in the SQL Azure Database (SADB) section, which asserts “We also see Amazon's SimpleDB is introducing ‘forced consistency’ to the world of ‘eventually consistent’ model.”