•• Update 8/19/2009: Zach Skyles Owens on SQL Azure Database invitations
• Update 8/18/2009: SQL Azure Database CTP released.
Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:
- Azure Blob, Table and Queue Services
- SQL Azure Database (SADB)
- .NET Services: Access Control, Service Bus and Workflow
- Live Windows Azure Apps, Tools and Test Harnesses
- Windows Azure Infrastructure
- Cloud Security and Governance
- Cloud Computing Events
- Other Cloud Computing Platforms and Services
To use these links, first click the post title to display the single article you want to navigate within.
Maarten Balliauw’s Signed Access Signatures and PHP SDK for Windows Azure post of 8/17/2009 provides a quickstart demonstration based on Steve Marx’s Wazdrop sample:
The latest Windows Azure storage release featured a new concept: “Shared Access Signatures”. The idea of those is that you can create signatures for specific resources in blob storage and that you can provide more granular access than the default “all-or-nothing” approach that is taken by Azure blob storage. Steve Marx posted a sample on this, demonstrating how you can provide read access to a blob for a specified amount of minutes, after which the access is revoked.
The PHP SDK for Windows Azure is now equipped with a credentials mechanism, based on Shared Access Signatures.
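Under the hood, a shared access signature is essentially an HMAC over a few policy fields, appended to the blob URL as query parameters. The following Python sketch (not the PHP SDK’s actual code; the string-to-sign layout and parameter names are assumed from the storage service documentation of the time) illustrates the general signing idea:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def make_sas_url(account, key_b64, container, blob,
                 start, expiry, permissions="r"):
    """Build a Shared Access Signature URL granting `permissions` on one blob
    between `start` and `expiry` (ISO 8601 UTC strings)."""
    canonical = f"/{account}/{container}/{blob}"
    # String-to-sign: permissions, start, expiry, resource, identifier (empty)
    string_to_sign = "\n".join([permissions, start, expiry, canonical, ""])
    key = base64.b64decode(key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    query = (f"st={quote(start, safe='')}&se={quote(expiry, safe='')}"
             f"&sr=b&sp={permissions}&sig={quote(sig, safe='')}")
    return f"http://{account}.blob.core.windows.net/{container}/{blob}?{query}"

# Grant 30 minutes of read access to a single blob (dummy key for illustration)
url = make_sas_url("myaccount", base64.b64encode(b"secret").decode(),
                   "files", "report.pdf",
                   "2009-08-17T00:00:00Z", "2009-08-17T00:30:00Z")
```

Because the signature covers the expiry time, the URL stops working once the window passes, which is exactly the revocable, granular access Steve Marx’s sample demonstrates.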
Magnus Mårtensson enlists Peter von Lochow to write Using the CloudStorage.API: The Message Queue on 8/17/2009 as the first of three posts that describe how to use their CloudStorage API implementation with Azure Queue Services. The next two posts will cover blob and table storage.
These earlier posts describe the antecedents of the individual storage type implementations:
- Windows Azure + Managed Extensibility Framework (MEF) = true (7/3/2009)
- Introducing the Cloud Storage API (7/15/2009)
• The Data Platform Insider blog announces SQL Server StreamInsight and SQL Azure Database CTP Availability on 8/18/2009:
Also available today is the first community technology preview of SQL Azure Database, a cloud-based relational database service built on Microsoft SQL Server technologies. With SQL Azure Database, you can easily provision and deploy relational database solutions to the cloud, and take advantage of a globally distributed data center that provides enterprise-class availability, scalability, and security with the benefits of built-in data protection, self-healing and disaster recovery. To register for the free trial, visit http://msdn.microsoft.com/en-us/sqlserver/dataservices/default.aspx. To learn more about SQL Azure, visit http://www.microsoft.com/azure/sql.mspx.
If you registered for SSDS or SDS, completed the survey(s), and received an invitation from the Microsoft Connect site, you’re already registered and can’t register again. According to a reply by Dave in the SQL Azure – Getting Started Forum to Jamie Thomson’s Try SQL Azure Database CTP Today thread:
Connect is where you go to register. Since it is showing you as registered, just hang tight. The invites are on their way. Just be patient, it’s going to take some time for the backlog to get processed.
You can get additional information about SQL Azure invitations here.
•• Zach Skyles Owens says in his SQL Azure Invitation Codes post of 8/19/2009:
Over the next week or two everyone who has already signed up for a SQL Azure Invitation Code should be receiving an email sent to the address associated with your Live ID containing the token and a link to redeem it. We understand that everyone would like their tokens yesterday but we need to work through the list and ramp up the service.
Once the list of current requests has been processed, new requests will be fulfilled within a day or two.
We are working on integrating the SQL Azure and Windows Azure provisioning experience. We realize that it is very inconvenient to have to make requests for two different tokens from different places.
What about customers who already have an account on the previous version of SQL Data Services/SQL Server Data Services which had an ACE model with a REST API? When will they get tokens? We will be providing all of those users with a token, but in the meantime I’d recommend that all of those users sign up for the CTP.
If you haven’t already done so, please sign up for the CTP today!
Cory Isaacson’s Database Sharding: The Key to Database Scalability post of 8/14/2009 for Database Trends and Applications explains the principles of sharding to increase database scalability:
The concept of database sharding has gained popularity over the past several years due to the enormous growth in transaction volume and size of business-application databases. This is particularly true for many successful online service vendors, software-as-a-service companies and social networking websites.
Database sharding can be simply defined as a "shared-nothing" partitioning scheme for large databases across a number of servers, enabling new levels of database performance and scalability. If you think of broken glass, you can get the concept of sharding: breaking your database down into smaller chunks called "shards" and spreading them across a number of distributed servers. …
SQL Azure Database gains scalability by sharding. See the Windows Azure Platform Training Kit - August Update post in the Live Windows Azure Apps, Tools and Test Harnesses section.
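The routing logic behind sharding fits in a few lines. This Python sketch (purely illustrative; it is not SADB’s internal mechanism, and the server names are hypothetical) hashes a shard key to pick one of several "shared-nothing" servers, so the same customer always lands on the same shard:

```python
import hashlib

class ShardRouter:
    """Route rows to shards by hashing the shard key."""

    def __init__(self, shard_dsns):
        # One entry per physical server, e.g. connection strings
        self.shards = shard_dsns

    def shard_for(self, key):
        # A stable hash keeps the key-to-shard mapping deterministic
        digest = hashlib.md5(str(key).encode("utf-8")).hexdigest()
        return self.shards[int(digest, 16) % len(self.shards)]

router = ShardRouter(["server0.db", "server1.db", "server2.db"])
shard = router.shard_for("customer-42")  # all of customer-42's rows go here
```

Queries scoped to one shard key touch a single server; the cost, as the article notes, is that cross-shard joins and rebalancing after adding servers become application-level problems.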
The Torrance talk was A First Look at WF 4, which covered all the new features coming in WF 4. I covered the new designer, declarative XAML workflows, new activities in the base activity library, the flowchart workflow model, hosting of workflows, arguments, variables, and expressions, workflow services, and runtime improvements to name a few things.
Dave Kearns reports Trusted frameworks sought for e-government in this overlooked 8/13/2009 post: “The GSA meets with OpenID Foundation, the Information Card Foundation, the Kantara Initiative and InCommon:”
The General Services Administration (GSA) sponsored a one-day workshop ("Open Government Identity Management Solutions Privacy Workshop") last week, which I gather from those who attended was an exciting event. Among the speakers were representatives from the OpenID Foundation, the Information Card Foundation, the Kantara Initiative, InCommon and the federal government. The general theme of the meeting was to explore the "Trust Framework Provider Adoption Process (TFPAP) For Levels of Assurance 1, 2, and Non-PKI 3" document, which was released as a draft last month.
The intent is to leverage existing industry-created credentials and credentialing processes to support e-government activities. To that end, representatives of various non-governmental bodies made presentations about existing frameworks for trusted identity systems in order to show that they satisfied federal requirements as codified by Office of Management and Budget, National Institute of Standards and Technology, and, of course, the GSA. These Trust Frameworks include requirements for trust framework provider (TFP) auditing qualifications and processes, TFP organizational maturity, TFP member identity provider organizational maturity, TFP member identity provider credentials and their issuance, and TFP member identity provider privacy policies.
One of the more interesting presentations was a joint effort of the OpenID Foundation and the Information Card Foundation called "Open Trust Frameworks for Open Government: Enabling Citizen Involvement through Open Identity Technologies" (see here). …
Ryan Dunn (@dunnry) reports in an 8/17/2009 Tweet that the Windows Azure Platform Training Kit - August Update of the same date is available for download. Ryan says, “Lots of SQL Azure content, new WAz labs.” Here are the contents:
Demos [Emphasis added.]
- Azure Platform Overview
- What is Windows Azure?
- Windows Azure Storage Overview
- Introduction to Windows Azure
- Building Services using Windows Azure
- Introduction to SQL Azure
- Building Applications using SQL Azure
- Scaling Out with SQL Azure
- Introduction to .NET Services
- Building Applications Using the .NET Service Bus
Hands On Labs [Emphasis added.]
- Deploying Windows Azure Services
- Hello Windows Azure
- Windows Azure Guestbook Demo
- Windows Azure Logging and Configuration Demo
- Windows Azure using Blobs Demo
- Windows Azure Worker Role Demo
- Windows Azure Using Queues Demo
- Windows Azure Using Tables Demo
- Preparing your SQL Azure Account
- Connecting to SQL Azure
- Managing Logins and Security in SQL Azure
- Creating Objects in SQL Azure
- Migrating a Database Schema to SQL Azure
- Moving Data Into and Out Of SQL Azure using SSIS
- Building a Simple SQL Azure App
- Scaling Out SQL Azure with Database Sharding
- .NET Services Service Bus Direct Connection Demo
- .NET Services Service Bus webHttpRelayBinding
- .NET Services Service Bus Publish and Subscribe
- .NET Services Service Registry
- .NET Services Service Bus NetOneWayRelayBinding
Samples and Tools
- Building Windows Azure Services
- Windows Azure Native Code
- Windows Azure and PHP
- Getting Started with Windows Azure Storage
- Using Windows Azure Tables
- Building ASP.NET MVC Applications with Windows Azure
- Building ASP.NET Web Form Applications with Windows Azure
- Migrating Applications to Windows Azure
- Introduction to SQL Azure
- Migrating Databases to SQL Azure
- Building Your First SQL Azure App
- Introduction to the .NET Service Bus
- Building Hybrid Applications
- Windows Azure MMC
- Bid Now
- Contoso Cycles
Of course, what we need now is SQL Azure Database August CTP to go with the emphasized Demos and Hands On Labs which require an SADB invitation.
Note that the Training Kit August Update mentions (but doesn’t describe any details of) the SQL Azure Secure Data Hub.
Jim Nakashima’s Using app.config in the July 2009 CTP post of 8/17/2009 explains how to work around a bug where:
- [I]f you use the App.Config file in a Worker Role, it won’t get included in your Service Package, whether running on the local Development Fabric or in the Cloud.
- To resolve this issue, simply add the following environment variable before starting Visual Studio:
- This will be fixed in our next release, in fact it’s fixed in our daily builds already.
• Maria Spinola, the author of Cloud Computing in Plain English and An Essential Guide to the Possibilities and Risks of Cloud Computing, warns that Rushing To Cloud Computing Can Be Bad in this 8/18/2009 post that asks “Why Should IT Directors, VPs, CIOs and CEOs Care About Cloud Computing?”
Business managers know that in spite of the benefits of every new technology/business model, there are also risks and issues like trust, loss of privacy, regulatory violation, data replication, coherency and erosion of integrity, application sprawl, and dependencies, among others.
Therefore they realize that rushing things when it comes to Cloud Computing can be a very bad decision. However, ignoring Cloud Computing all together, because of a belief in your ability to secure your own environment better than a service provider ever could, or jumping rapidly into it because the many claims made about Cloud Computing have led you to the point of "irrational exuberance" and unrealistic expectations, isn't smart either. …
• Ben Kepes’ Making the Move From Traditional to SaaS – Credit Where Credit Is Due post of 8/18/2009 analyzes the difficulties and rewards of moving from traditional packaged software to SaaS. As an example, Ben cites Jeff Kaplan’s Callidus Bets on the Cloud post of 7/30/2009:
Callidus Software is taking more extreme action in response to today’s realities. The company announced during its second quarter earnings call on July 28, that it is moving its entire operation and set of offerings to a [multi-tenant] “predictable recurring revenue model”, i.e. SaaS.
Callidus provides Sales Performance Management (SPM) solutions.
• Geva Perry’s The Purpose-Driven Cloud: Introduction post of 8/17/2009 announces the start of a new series about selecting a cloud provider:
Nearly a year ago I posted Thoughts on Platform-as-a-Service, in which I reviewed the state of cloud platforms at the time. A lot has happened since then, and it's time to revisit the latest and greatest in PaaS and specialized cloud platforms.
I've been asked several times by companies I advise or just talk to, as well as by journalists, how should companies select a cloud provider? The answer is, as it almost always is, "it depends." Similarly to what we had already seen in programming languages and traditional application platforms (i.e., installed on-premise), different platforms have been designed to serve different purposes.
I am planning on publishing a series of posts with the general title of "The Purpose-Driven Cloud", which cover the different dimensions that differentiate cloud platforms and make them more suitable for different purposes.
The dimensions I am going to cover are:
- Usability-Driven Clouds [UPDATE: Post is live here]
- Framework-Based Clouds
- Domain-Specific Clouds
- Data-Driven Clouds
- Industry-Focused Clouds
- Geographically-Centric Clouds
Simultaneous to this intro, I am publishing the first in the series: The Usability-Driven Cloud.
• The International Association of Software Architects’ Perspectives of the IASA issue of August 2009 carries the following cloud-computing articles:
- Cloud Software Architecture: The Quest for Zero-Delta, No Lock-In, Hybrid-Deployment-Ready Applications by Julian Keith Loren
- Are You Secure in the Clouds? by Keith McMullin
- Amazon Web Services: Infrastructure in the Cloud by James Murty
- Three Scenarios: When the Cloud Makes Sense by Brian H. Prince
Brian is a Microsoft architect-evangelist.
Philippe Courtot claims in this 8/17/2009 It's time to embrace (and prepare for) the shift to the cloud article for SC Magazine:
The software industry is entering another age of astonishing innovation. It's a time when not only is software advancing at an astounding rate, but so are hardware devices – where power is increasing as quickly as size is decreasing. This is making software and computing power near ubiquitous.
Consider this: a handful of years ago, few would have believed that customer relationship management software would have moved almost completely to the cloud. Or that Lotus Notes, that gray old lady of IT, would have made the jump as well. Even among the proponents of cloud computing, few believed corporate software and data wanted to be liberated so quickly – and make itself readily available anywhere, anytime, on any device, and from within any web browser. Today, it seems more unusual not to have a software as a service (SaaS) or cloud offering that complements, or completely replaces, a software maker's traditional software applications.
Yet, I believe that the SaaS and cloud computing revolution holds the potential to benefit everyone in the software industry, and all who rely on it for their business. For instance, we in the industry are well aware that software is evolving too quickly. It's a never-ending process of software enhancements, upgrades, security fixes and new installations. And, few would disagree that there are too many vulnerabilities affecting too many applications. In this disorder, most of the burden has fallen on the shoulders of corporations that have had to dedicate extraordinary resources to patch and mitigate the security holes. …
Philippe is chairman and CEO of Qualys, Inc., an on demand vulnerability management and policy compliance solutions firm.
Judith Hurwitz offers Ten things I learned while writing Cloud Computing for Dummies on 8/14/2009:
I have been hard at work (along with my colleagues Marcia Kaufman, Robin Bloor, and Fern Halper) on Cloud Computing for Dummies. I will admit that we underestimated the effort. We thought that since we had already written Service Oriented Architectures for Dummies — twice; and Service Management for Dummies that Cloud Computing would be relatively easy. It wasn’t. Over the past six months we have learned a lot about the cloud and where it is headed. I thought that rather than try to rewrite the entire book right here I would give you a sense of some of the important things that I have learned. I will hold myself to 10 so that I don’t go overboard! …
Jon Brodkin reports Enterprise cloud use on agenda for new Open Group committee: “Security, flexibility up for debate among vendors and users” on 8/14/2009 for NetworkWorld:
The Open Group is forming a new cloud computing committee that brings vendors and end-user organizations together to develop a common understanding about how cloud services should be deployed safely and effectively.
The consortium’s Cloud Work Group includes vendors such as IBM and Sun, end-user organizations like Eli Lilly, financial services companies, and U.S. and U.K. government officials.
Several new committees and organizations promoting cloud standards and frameworks have popped up this year. But the Open Group says it aims to contribute something unique by focusing on enterprise requirements for cloud computing, rather than the nitty-gritty technology details.
“The last thing this industry needs is more competing and contradictory information on emerging technologies like cloud computing,” says Dave Lounsbury, the Open Group’s vice president of collaboration services. …
Carl Brooks’ Party’s over, kids: Microsoft has private cloud all sewn up. In 2010. Maybe post to SearchCloudComputing of 8/14/2009 reports:
Microsoft says it will have the definitive virtualized public/private/platform cloud solution ready to go in a “shrink wrap” package by 2010, and that, by the way, hosters that aren’t fully virtualized will go the way of the dodo. Of course, this may come as a surprise to all the hosters already going great guns with any variety of managed, virtualized and dedicated offerings, including cloud computing models.
Zane Adam, Senior Director of Virtualization at Microsoft, announced the Microsoft model for hosting companies and data centers at Tuesday’s Hosting Con 2009 keynote. He said that lowering “human touch” and “fabric management” were the new face of hosting and “those that pull the plug [on virtualization and automation] too late will become dinosaurs.”
Adam pitched Microsoft’s “System Center Solutions” and Dynamic Data Center Toolkit as the provisioning and management glue for Microsoft’s new server products. Get on Server 2008 R2 with Hyper-V, he said, download the software kit and away you go: virtualized, managed, cloud-ready. A wonder no one’s thought of that before. …
• Lori MacVittie reminds us that “Amazon EC2 and S3 are no more or less safe than they were last week despite hype around PCI compliance admission” in her Amazon Compliance Confession About Customers, Not Itself post of 8/18/2009:
The recent admission/announcement that “Amazon EC2 is not PCI compliant” (this is not exactly true, but we’ll get to that later) has set off a rush of blogs, articles, and tweets that say, in effect, EC2 is no longer “safe”. But a lack of compliance does not make Amazon any more or less safe than achieving PCI compliance makes a site more safe. …
PCI compliance doesn’t automatically make a site safe. Lack of PCI compliance doesn’t make EC2 unsafe, either. It means it isn’t compliant with the policies designated by the PCI council for handling credit card transactions and sensitive data. And, if we look past the hand-waving, we’ll find [from an Amazon Web Services message of 8/12/2009 to Jason Rushton] that Amazon admits you can’t build a PCI Level 1 compliant application using EC2 and S3, but you can build a PCI Level 2 compliant application. …
Amazon clearly states you cannot be level 1 compliant because it requires on-site auditing that they simply can’t (or won’t) allow. The inability to meet a requirement because of logistics (level 1 requires an on-site audit which Amazon states is not possible) is hardly the same as failing to meet the requirement for a firewall, or default password use. The inability to meet that one requirement is hardly reason for condemnation of Amazon’s overall security posture. Its inability for you to meet PCI compliance does not automatically mean its systems and environment are “unsafe”. Amazon points to the “on-site audit” requirement as a reason why you cannot achieve PCI Level 1 compliance. For all we know Amazon meets or exceeds every other requirement for PCI level 1 compliance that is required of a service-provider. Inferring anything about the security posture of Amazon’s internal systems from one message in a forum is simply not possible.
Another tempest in a teapot. It’s reasonable to expect Microsoft to take the same stance as AWS with respect to PCI Level 1 compliance for user applications running on Windows Azure. An alternative to storing sensitive customer credit-card information yourself is to use a payment gateway provider that offers tokenized payments.
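The appeal of tokenized payments is that the application never stores the card number at all; it keeps an opaque token and lets the gateway hold the sensitive data, shrinking the PCI audit scope. This toy Python sketch (all class and method names are hypothetical; real gateways expose this as a hosted API, not a local class) shows the basic flow:

```python
import secrets

class TokenVault:
    """Toy stand-in for a payment gateway's tokenization service.
    In practice the vault lives entirely on the gateway's side."""

    def __init__(self):
        self._vault = {}  # token -> card number; never visible to the merchant

    def tokenize(self, card_number):
        # The merchant application stores only this opaque, random token
        token = secrets.token_hex(16)
        self._vault[token] = card_number
        return token

    def charge(self, token, amount_cents):
        # Only the gateway can resolve the token back to the card
        card = self._vault[token]
        return {"ok": True, "last4": card[-4:], "amount": amount_cents}

vault = TokenVault()
tok = vault.tokenize("4111111111111111")   # happens once, at checkout
receipt = vault.charge(tok, 1999)          # later charges reuse the token
```

A breach of the merchant’s database then yields only useless tokens, which is why tokenization pairs naturally with hosting the rest of the application on infrastructure, like EC2 or Windows Azure, that can’t itself be audited to PCI Level 1.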
• Practice Fusion’s Medical Data in the Internet “cloud” (part 2) – Data security article of 8/18/2009 by Robert Rowley, MD – Chief Medical Officer, Practice Fusion Inc., is the second of a three-part series about medical data in the Internet cloud:
A review of issues around medical records ownership and protection shows that medical records are the property of those who prepare them (medical professionals), and not the property of those about whom they are concerned (patients), although patients generally have a right to review them, demand copies of them and demand their confidentiality. With limited and specific exceptions, consent is required in order to disclose such information to others. So, how does one create a framework of security that protects the confidentiality of such records against unauthorized breach?
The first part, Medical Data in the Internet “cloud” (part 1) – Data safety of 8/13/2009, is also by Dr. Rowley:
The question of data security in a “brave new world” of cloud-based Electronic Health Records (EHRs), Personal Health Records, and iPhone and other smartphone apps that could transmit personal health information, has attracted the attention of many. Web-based services – so-called “cloud computing” – are not inherently secure. Such technology is focused more on widespread reach and interconnectedness rather than on making sure that the connections and the data are foolproof. Yet much of our personal information, such as banking information, is housed electronically and accessed through the web – we have become so accustomed to it that we seldom think very much about it. Personal health information, moreover, is protected by law: HIPAA, which is focused around physician and hospital-centered recordkeeping, and now ARRA, which extends HIPAA-like protection to patient-centered Personal Health Records as well.
• Paul Miller discusses Security and the Cloud and asks in this 8/18/2009 post “Will focus shift to the customer?”
I was talking with Avanade’s Senior Director for Enterprise Security, Ace Swerling, earlier today. The conversation touched on a wide range of security and identity management issues that I’ll probably return to, but one of Ace’s comments brought my attention back to an issue that has been nagging at me for a while.
As I’m sure we all know, security concerns often figure highly in discussions about moving Enterprise applications and data to the Cloud. Indeed, I spoke with other Avanade executives earlier this year to report on a survey they had commissioned that suggested just how significant these concerns can be for potential customers.
In today’s conversation, Ace appeared to agree (as do I) with the frequent assertion that Cloud providers’ own systems will tend to be more secure than those that the majority of potential customers have in-house today. These service providers have their entire reputation riding on their security, it’s absolutely core to their business model, and they can invest in the facilities, procedures and people to get it right. They’re not claiming to be invincible; nothing is. But the good ones should certainly be capable of being as secure as anything else connected to a network. [Emphasis Paul’s.] …
David Linthicum’s How government can do cloud computing right post of 8/17/2009 to InfoWorld’s Cloud Computing blog asserts: “The feds are getting serious about the cloud. But before they spend billions on it, they need to avoid both overeager adoption and the tendency to want to control everything.”
According to Bloomberg, IBM is looking to "grab a piece of the more than $1 trillion in global stimulus spending by pitching cloud-computer projects for health care and energy." However, I can tell you that other cloud computing providers and consulting service providers are after that money in a big way. As Bloomberg reported, the U.S. government's stimulus plan will put more than $100 billion toward health-care networks, energy grids, and other technology projects, according to researcher IDC. "Uncle Sam is coming down with funding," Gens said. "Cloud computing's coming at a very good time." Total cloud spending will top $40 billion by 2012, almost triple last year, according to the researcher.
The stimulus money is following the cloud because those in government IT are looking to get a much bigger bang for the IT buck, and they consider cloud computing as the way to do that. Within the federal government, cloud computing has some pretty big supporters these days, including the U.S. CIO, who has been a public advocate for cloud computing. Also, just last week the GSA put out an RFQ looking to provide an easy on-ramp to cloud computing for most government agencies.
I suspect that by the end of 2010 we'll have some pretty huge government-sponsored cloud computing projects under way, and hopefully more effective and efficient directions for government IT. Figure you're going to see a mixed bag of cloud computing successes, with a few bad projects mixed in, as they typically are. …
Chris Hoff (@Beaker) asks Do We Need CloudNAPs? It’s A Virtually Certain Maybe. in this 8/16/2009 post:
Allan Leinwand from GigaOm wrote a really interesting blog the other day titled: “Do Enterprises Need a Toll Road to the Cloud?” in which he suggested that perhaps what is needed to guarantee high performance and high security Cloud connectivity is essentially a middleman that maintains dedicated aggregate connectivity between “…each of the public cloud providers.” …
In the long term the notion of an open market for hybrid Cloud connectivity — the Inter-Cloud — will take form, and much of the evolving work being done with open protocols and those in the works by loose federations of suppliers with common goals and technology underpinnings will emerge.
In the long term do we need CloudNAP’s? No. Will we get something similar by virtue of what we already do today? Probably.
Chris Hoff’s Follow-On: The Audit, Assertion, Assessment, and Assurance API (A6) post of 8/16/2009 expands on his original A6 proposal:
… The idea has since grown legs and I’ve started to have some serious discussions with “people” (*wink wink*) who are very interested in making this a reality, especially in light of business and technical use cases bubbling to the surface of late.
To that end, Ben (@ironfog) has taken the conceptual mumblings and begun work on a RESTful interface for A6. You can find the draft documentation here. You can find his blog and awesome work on making A6 a reality here. Thank you so much, Ben.
I’m thinking of pulling together a more formalized working group for A6 and push hard with some of those “people” above to get better definition around its operational realities as well as understand the best way to create an open and extensible standard going forward. …
Paul Enfield, J.D. Meier, and Prashant Bansode of Microsoft’s patterns & practices Azure Security team released on 8/12/2009 the results of an Azure Security Guidance Survey they started on 7/10/2009. As the authors noted, “At the moment the number of responses is low.”
• Aaron Skonnard will present Azure-related sessions at the Heartland Developers Conference (HDC) 09 on 10/15 and 10/16/2009 at the Qwest Center in Omaha, NE:
Where: Qwest Center, Omaha, NE, USA
• Forrester Research analysts John Rymer and James Staten will present Creating Your Enterprise Cloud Computing Strategy workshops in Dallas, TX on 9/30/2009 and Cambridge, MA on 11/5/2009:
In this Workshop, you benefit from the latest research on the categories of products that use cloud computing concepts, the companies behind those products, the economics, and the emerging practices for employing cloud computing from the more than 100 enterprises Forrester has worked with on the topic. Forrester analysts help you position this new class of platform products relative to your existing architecture and strategy, highlight examples of successful adoption, and lead exercises to help you formulate your policies and business plans.
- Understand Forrester's definitions and categorizations of cloud computing in all of its variations, along with how to apply these categories to enterprise IT problems.
- Master the cloud computing vendor landscape, including products, alliances, pricing, and packaging.
- Learn about what other organizations have experienced when using the cloud approach, whether internally or externally.
- Get an up-to-date view on the maturity and readiness of the various cloud approaches for usage in the enterprise.
- Determine what you should be doing on cloud computing today.
- Work closely with Forrester analysts. Attendance is limited to maximize client-analyst interaction.
- Understand the trends and technologies that matter most to your role, so you'll be able to make informed decisions and gain a competitive advantage.
- Complete hands-on exercises applying the same methodologies that Forrester analysts use for their own research.
- Network with peers facing similar goals and challenges.
- Leave with an action plan and strategy that will generate new growth for your company.
The price for either workshop is $1,975.
Where: Dallas, TX, USA
Where: Cambridge, MA, USA
• The IT Valley Innovation Center announces CloudStorm Belgium to be held on 9/2/2009 at the Atomium, Brussels, Belgium:
At CloudStorm, you will get a 100-minute update on the latest trends and products in Cloud Computing, followed by a mini expo and networking opportunity.
- 10 minutes Cloud Vendor presentations on the newest trends in Cloud Computing
- Showing the real solutions and products.
- An optimal mix on vendors: Private Cloud, Public Cloud, Platform, Load Balancers, etc.
- Drinks and light snacks during and after the presentations.
- Networking, the key in this event!
- Patrick Crasson, Strategic Business Development SUN Microsystems
- Wim De Wispelaere, Product Innovator B-Virtual
- Arvid Fossen, Product Management Director A-Server
- Wouter Maagdenberg, CEO Calamares
- Hans Verbeeck, Evangelism Manager Microsoft
- Thomas Rulmont, CTO CloudSphere
- Marc Vrijhof, CEO REP 42 (Mobilising the Enterprise)
- Yves Pauwels, Product & Account Manager SaaSforce
Where: Atomium, Brussels, Belgium
• CloudStorm London will be held 10/13/2009 at Inner Temple Hall, London, England.
The event will take place at the famous Inner Temple Hall on October 13th. Inner Temple is based in the heart of London’s legal quarter and is steeped in history as one of the four ancient Inns of court.
Combine with Storage Expo in London Olympia on October 14th and 15th!
Where: Inner Temple Hall, London, England, UK
Carl Brooks reports Microsoft, cloud companies pushing platforms on hosters at the HostingCon 2009 conference held 7/10 through 7/12/2009 in Washington, DC:
Cloud computing bigwigs came to deliver a message that many HostingCon 2009 attendees weren't ready to hear.
Google, Microsoft, Salesforce.com and Rackspace appeared in force at a keynote session on Tuesday, but audience reaction was more quizzical than receptive. By a show of hands, hosting companies made up the great majority of attendees, but pitches and predictions about the state of cloud computing focused on Platform as a Service as the way for hosters to stay ahead of the curve.
But hosters at the conference drew a clear line between Infrastructure as a Service, which lets users run any software they want, and Platform as a Service, which restricts users to the provider's proprietary code and operating system. …
When: 7/10 – 7/12/2009
Where: Gaylord National Resort and Convention Center, 201 Waterfront Street, National Harbor, MD 20745, USA
For additional background on SearchCloudComputing’s view of Windows Azure marketing, see:
- Party’s over, kids: Microsoft has private cloud all sewn up. In 2010. Maybe (8/14/2009)
- Azure's battle plan: Be all things to all people (8/3/2009) I’m quoted as “Robert Jennings.”
- Azure will be popular with .Net developers, but questions remain on application deployment (7/17/2009)
• Lori MacVittie reminds us that “Amazon EC2 and S3 are no more or less safe than they were last week despite hype around PCI compliance admission” in her Amazon Compliance Confession About Customers, Not Itself post of 8/18/2009. (Repeated from the Cloud Security and Governance section.)
Krishnan Subramanian reports Joyent Ups The Ante In Enterprise Game in this 8/17/2009 post that puts Joyent in direct competition with the Google App Engine and even more so with Windows Azure:
Joyent, the San Francisco based company offering enterprise class cloud computing solutions, is shifting focus to completely concentrate on enterprise customers. I wrote about Joyent in the early months of Cloud Ave and Joyent has, ever since, prioritized their offerings towards enterprise customers. …
Since that time, along with their existing Joyent Accelerator product, they have added:
- Cloud Control software that can be used to deploy private clouds inside the business’s own datacenters.
Their other offering, Joyent Accelerator for MySQL, a virtual MySQL appliance optimized for high-end performance, was announced last Tuesday. Well, MySQL virtual appliances are nothing new. We have seen many such offerings even before Cloud Computing started to occupy our imagination. What makes this offering interesting is the fact that Joyent has worked with the MySQL team at Sun Microsystems to offer a highly optimized, high-performance appliance. …
Stacey Higginbotham’s IBM Plans Cloud Service to Take on Microsoft, Google & Salesforce post of 8/16/2009 to the GigaOm blog begins:
Earlier this week I spoke with Erich Clementi, General Manager, Enterprise Initiatives (otherwise known as the head of IBM’s cloud computing efforts) about Big Blue’s cloud strategy. After we raked the computer and service provider over the coals earlier this year for talking about the cloud without offering substance, in June IBM finally unveiled part of its cloud plans.
They revolve around providing workload-specific services via an IBM cloud, as a hosted cloud, or inside a company’s own data center. It kicked off its cloud rollout with a test and development service, and last month it announced an analytics offering. Clementi revealed that IBM won’t stop at workload-specific services, and will build a WebSphere platform-as-a-service offering for clients. …