
Friday, April 27, 2012

Creating a Private Data Marketplace with Microsoft Codename “Data Hub”

•• Updated 5/11/2012 with the change of dataset name from US Air Carrier Flight Delays, Monthly to US Air Carrier Flight Delays, reflecting the SQL Azure dataset’s move to the new multi-month/year format. (See my Creating An Incremental SQL Azure Data Source for OakLeaf’s U.S. Air Carrier Flight Delays Dataset post of 5/8/2012 for more information about the new dataset.)
• Updated 4/30/2012 with link to OakLeaf’s new US Air Carrier Flight Delays, Monthly (free) data set on the public Windows Azure Marketplace DataMarket.

Introduction

SQL Azure Labs describes its recent Codename “Data Hub” Community Technology Preview (CTP) as “An Online Service for Data Discovery, Distribution and Curation.” At its heart, “Data Hub” is a private version of the public Windows Azure Marketplace DataMarket that runs as a Windows Azure service. The publishing process is almost identical to the public version’s, except for usage charges and payment transfers. “Data Hub” enables data users and developers, as well as DBAs, to:

  • Make data in SQL Azure discoverable and accessible in OData (AtomPub) format by an organization’s employees
  • Enable data analysts and business managers to view and manipulate data from the Marketplace with Service Explorer, Excel, and Excel PowerPivot
  • Publish datasets for further curation and collaboration with other users in the organization
  • Federate data from the Windows Azure Marketplace DataMarket for the organization’s employees (in addition to the organization’s uploaded data)

The initial CTP supports the preceding features but is limited to SQL Azure as a data source and OData (AtomPub) as the distribution format. Microsoft is considering other data sources and distribution formats.


•• Note: OakLeaf’s US Air Carrier Flight Delays, Monthly data sets are publicly accessible at https://oakleaf.clouddatahub.net/ by clicking the Government or Transportation category link. To issue a query with Data Explorer, do this:

  • Click the Sign In button at the top right of the page
  • Log in with your Windows Live ID
  • Click the US Air Carrier Flight Delays link on the landing page
  • Click the Add to Collection button
  • Click the US Air Carrier Flight Delays link in the My Collection page
  • Click the Explore Data button to open the Data Explorer
  • Click the Run Query button to display the first 23 rows of data:

image

For a more detailed description, see the Exploring the User’s Experience section at the end of this post.

Alternatively, you can register with the public Windows Azure Marketplace DataMarket and then subscribe to the same data from OakLeaf’s new US Air Carrier Flight Delays offering. After you subscribe to the free dataset, you can also use it as a data source for Apache Hadoop on Windows Azure Hive tables.


Obtaining CTP Invitations

“Data Hub” and the “Data Transfer” CTP are invitation-only CTPs. You can request an invitation by clicking the Start Here link for “Data Hub” to open the Welcome page:

image

Click images to display full-size screen captures.

Complete the questionnaire and wait for the e-mail that advises you’ve been approved as a user. At present, users are limited to use of the CTP for three weeks.

“Data Hub” integrates “Data Transfer” for uploading comma-separated-value (*.csv) data files to existing or new SQL Azure database tables. As noted below, I found that using “Data Transfer” independently of “Data Hub” worked for some large files that “Data Hub” wouldn’t process. Therefore, I recommend you apply for the “Data Transfer” CTP by clicking the Start Here button on the landing page:

image

My Test-Drive SQL Azure Labs’ New Codename “Data Transfer” Web UI for Copying *.csv Files to SQL Azure Tables or Azure Blobs of 11/30/2011 was an early tutorial. I’ll post an updated tutorial for using Codename “Data Transfer” with On_Time_Performance_YYYY_MM.csv files shortly.


Creating Your “Data Hub” Marketplace

After you receive your “Data Hub” invitation, follow the instructions in the e-mail and complete the Create Your Marketplace page. The domain prefix, oakleaf for this example, must be unique within the clouddatahub.net domain. Specify the Start and End IP Addresses for your expected users. (0.0.0.0 and 255.255.255.255 admit all users, making your Marketplace public):

image

Click Create Marketplace to take ownership of the subdomain:

image

After the creation process completes, your homepage (at https://oakleaf.clouddatahub.net for this example) appears as shown here:

image

Your Account Key is equivalent to a password for “Data Hub” administration by users with Live IDs other than the administrator’s.


Provisioning a SQL Azure Database to Store *.csv Data

Click the Publish Data menu link to open the Welcome to the Publishing Portal page, which has Connect, Publish, Approve, Federate and View Marketplace menu links.

Click the Connect link to open the Connect to your Data Sources page, which offers five prebuilt sample data sources, AdventureWorks … US Data Gov:

image

I previously provisioned the On_Time_Performance data source from one of the four Free Trial SQL Azure Databases you receive with your invitation, and the On_Time_Performance2 data source from an existing database created with the “Data Transfer” CTP.

To use one of the four free SQL Azure databases, click the Free Trial SQL Azure Databases link in the left pane of the preceding page to open the Create New Database page. Type a unique name for your database (On_Time_Performance_Test for this example):

image

And click Create to add the database to the Data Sources list and display the Upload a File page.


Understanding the FAA’s On_Time_Performance.csv Files

The Creating the Azure Blob Source Data section of my Using Data from Windows Azure Blobs with Apache Hadoop on Windows Azure CTP post of 4/6/2012 described the data set I wanted to distribute via a publicly accessible, free Windows Azure DataMarket dataset. The only differences between it and the tab-delimited *.txt files uploaded to blobs that served as the data source for an Apache Hive table were:

  • Inclusion of column names in the first row
  • Addition of a formatted date field (Hive tables don’t have a native date or datetime datatype, so Year, Month and DayOfMonth fields were required.)
  • Field delimiter character (comma instead of tab)

Following is a screen capture of the first 20 data rows of the ~500,000-row On_Time_Performance_2012_1.csv file:

image

You can download sample On_Time_Performance_YYYY_MM.csv files from the OnTimePerformanceCSV folder of my Windows Live SkyDrive account. The files are narrowed versions of the On_Time_On_Time_Performance_YYYY_MM.csv files from the Bureau of Transportation Statistics’ Research and Innovative Technology Administration site. For more information about these files, see The FAA On_Time_Performance Database’s Schema and Size section of my Analyzing Air Carrier Arrival Delays with Microsoft Codename “Cloud Numerics” article of 3/26/2012. Each original *.csv file has 83 columns, about 500,000 rows and an average size of about 225 MB.

On_Time_Performance_2012_1.csv has 486,133 data rows, nine columns and weighs in at 16.5 MB. Other On_Time_Performance_YYYY_MM.csv files with similar row counts and sizes are being added daily. The On_Time_Performance_Test subfolder also contains truncated versions of the files with 100, 1,000, 10,000, 100,000, 150,000 and 200,000 rows, which I used to pin down the large-file *.csv upload problem with “Data Hub”.

Tab-delimited sample On_Time_Performance_YYYY_MM.txt files (without the first row of column names and formatted date) for use in creating blobs to serve as the data source for Hive databases are available from my Flight Data Files for Hadoop on Azure SkyDrive folder.

Provision of the files through a private Azure DataMarket service was intended to supplement the SkyDrive downloads. I also plan to provide the full files in a free submission to the Windows Azure Marketplace DataMarket site in early May.


Uploading *.csv Files to Your Data Source

Download one of the smaller test files from my SkyDrive On_Time_Performance_Test subfolder to your local machine. Choose the 10,000-row file if you have a DSL connection with slow upload speed.

Click the Upload a File page’s Browse button and navigate to the *.csv file you downloaded:

image

Click the Upload button to start the upload to an Azure blob. You receive no feedback on the progress of the upload until the Update the Table Settings page opens:

image

All data rows are useful, so accept the default Include settings and click Submit to transfer the data from the Azure blob to a SQL Azure table named for the *.csv file; a Loading File to Database message replaces the above page. It often takes longer to load the file into the database than to upload it to a blob.

When the Success message appears, click the Connect menu link to verify that the new database appears in the Data Sources list:

image


Publishing the New Data Source

Click the Publish menu link to open the My Offerings page and click the Add Offering button to open the 1. Data Source page.

Select the table(s) you want to include in the data source, type a Friendly Name for each, click the Columns button for the table to display the column list, and clear the columns that you don’t want to be queryable:

image

Indexes are required on all queryable columns. Month and Year don’t need to be queryable because they are the same for all rows in the table.

Click the 2. Contacts button to save the Data Source information and open the Contacts page. Type your Name, Email alias and Phone Number:

image

Click the 3. Details button to open the Details page. Complete the required data, select from one to four categories, open the Documentation list and add links to appropriate documentation URLs, and add a logo image:

image

Click the 4. Status/Review menu link to open the Status/Review page. Click the Request Approval button to send a request to the Data Hub team to approve the entry.

image

Note: I will post a tutorial on federating content from the public Windows Azure Marketplace DataMarket after Microsoft approves my pending submission, which is identical to this private Marketplace entry.


Previewing and Approving the Marketplace Submission

Click the Approve menu link to open the My Approvals page. Click the Approve/Decline button to open the message pane, mark the Approve option, type an optional message to the requester, and click the Display Actions button to open Preview Offering choices:

image

Click Preview Offering in Marketplace to verify that the offering details appear as expected:

image

Click the Explore This Dataset link to open the dataset in the Service Explorer and click the Run Query button to display the first 22 of 100 rows with this default URL query https://api-oakleaf.clouddatahub.net/Data.ashx/default/US_Air_Carrier_Flight_Delays_Monthly/preview/On_Time_Performance_2012_1?$top=100:

image

Type a carrier code, such as WN, in the Carrier text box and OAK in the Dest[ination] text box to return data for Southwest Airlines flights to Oakland with this URL query: https://api-oakleaf.clouddatahub.net/Data.ashx/default/US_Air_Carrier_Flight_Delays_Monthly/preview/On_Time_Performance_2012_1?$filter=Carrier%20eq%20%27WN%27%20and%20Dest%20eq%20%27OAK%27&$top=100:

image

You also can visualize and export data, as well as click the Develop button to open a text box containing the current URL query. Click the XML button to display formatted OData (AtomPub) content:

image
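If you’d rather issue these queries from code than from the Service Explorer, here’s a minimal C# sketch. It’s my illustration, not from the CTP documentation, and it assumes the “Data Hub” preview endpoint accepts the public DataMarket’s authentication scheme (HTTP Basic over SSL, any user name, your Account Key as the password):

using System;
using System.Net;

class FlightDelayQuery
{
    static void Main()
    {
        // The $filter/$top query shown in the Service Explorer example above.
        const string url =
            "https://api-oakleaf.clouddatahub.net/Data.ashx/default/" +
            "US_Air_Carrier_Flight_Delays_Monthly/preview/On_Time_Performance_2012_1" +
            "?$filter=Carrier%20eq%20%27WN%27%20and%20Dest%20eq%20%27OAK%27&$top=100";

        var client = new WebClient();
        // Assumption: Basic credentials with the Account Key as the password,
        // as on the public DataMarket.
        client.Credentials = new NetworkCredential("user", "<your Account Key>");
        string atom = client.DownloadString(url); // raw OData (AtomPub) XML
        Console.WriteLine(atom);
    }
}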

Return to the My Approvals page and click the Send button to notify the originator of the approval and clear the My Approvals counts.


Publishing the Offering

Click the Publish menu link to display the offering with Draft Approved as the status and the Publish option enabled:

image

Click the Publish button to publish your offering for others to use:

image


Exploring the User’s Experience

Click the Marketplace menu link to return to the landing page and type a search term (OakLeaf for this example) to find your offering:

image

Click the offering name link to open it as an ordinary user and display the Add to Collection button:

image

Click Add to Collection to add the offering to the My Collection list:

image

When you sign out and navigate to the default URL, https://oakleaf.clouddatahub.net/, the landing page appears as shown here:

image

The Science & Statistics item is the default Data.gov offering. Clicking the Government or Transportation and Navigation link opens the user view of the offering’s landing page with a Sign In to Add to Your Collection button:

image

The service URL is: https://api-oakleaf.clouddatahub.net/default/US_Air_Carrier_Flight_Delays_Monthly/. Navigating to this URL and clicking the Show All Data button displays the default collections (for an earlier data source):

image
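Programmatically, that service root returns an AtomPub service document enumerating the feed’s collections. Here’s a hedged sketch of listing them (same Basic-auth assumption as the earlier snippet; the preview service might instead require interactive sign-in):

using System;
using System.Net;
using System.Xml.Linq;

class ServiceDocument
{
    static void Main()
    {
        var client = new WebClient();
        client.Credentials = new NetworkCredential("user", "<your Account Key>");
        string xml = client.DownloadString(
            "https://api-oakleaf.clouddatahub.net/default/US_Air_Carrier_Flight_Delays_Monthly/");

        // AtomPub service documents put <collection> in the app namespace
        // and each collection's title in the Atom namespace.
        XNamespace app = "http://www.w3.org/2007/app";
        XNamespace atom = "http://www.w3.org/2005/Atom";
        foreach (var coll in XDocument.Parse(xml).Descendants(app + "collection"))
            Console.WriteLine(coll.Element(atom + "title").Value);
    }
}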


Thursday, July 28, 2011

Early Google Base Posts on the OakLeaf Systems Blog

Updated below on 7/30/2011 with reasons why I abandoned attempts to use Google for sharing data from relational tables.

Google opened a public beta version of Google Base to testers on 11/16/2005. According to program manager Bindu Reddy’s First Base post to the Official Google Blog:

image

Today we're excited to announce Google Base, an extension of our existing content collection efforts like web crawl, Google Sitemaps, Google Print and Google Video. Google Base enables content owners to easily make their information searchable online. Anyone, from large companies to website owners and individuals, can use it to submit their content in the form of data items. We'll host the items and make them searchable for free. There's more info here.

Google Base pages no longer are available from the Google site and the last post to the Google Base site in the Wayback Machine appears to be 12/18/2008. The following table provides links to my early posts to the OakLeaf Systems blog about a number of Google Base tests I ran in 2005, as well as a review of the DabbleDB beta in March 2006:

  • 11/16/2005: Google Base and Bulk Uploads with Microsoft Access
  • 11/17/2005: Google Base and Atom 0.3 Bulk Uploads
  • 11/27/2005: Bulk-Uploaded Items Disappear from Google Base
  • 11/29/2005: Windows Live "Fremont" vs. Google Base Classifieds
  • 12/5/2005: Problems Uploading a Google Base Custom Item Type from a TSV File
  • 12/12/2005: Google Base and Blogger Items Missing from Google Search
  • 3/19/2006: Dabble DB: The New Look in Web Databases

It’s interesting to note that Google supported data insert operations using the Atom format before AtomPub became a standard. Microsoft later adopted AtomPub as the foundation of its Open Data Protocol (OData) for RESTful create, retrieve, update, and delete (CRUD) operations on Windows Azure storage, as well as SharePoint lists.
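For readers who haven’t worked with the format, the following sketch (mine, with a hypothetical Customers entity set; only the namespaces are the real OData ones) prints the kind of AtomPub request body that both Google’s early Atom uploads and OData inserts are built on: a new row travels as an Atom <entry> whose properties carry the column values.

using System;

class AtomPubInsertSketch
{
    static void Main()
    {
        // Hypothetical POST body for inserting one entity into /Customers.
        string request =
            "POST /service.svc/Customers HTTP/1.1\n" +
            "Content-Type: application/atom+xml\n\n" +
            "<entry xmlns=\"http://www.w3.org/2005/Atom\"\n" +
            "  xmlns:d=\"http://schemas.microsoft.com/ado/2007/08/dataservices\"\n" +
            "  xmlns:m=\"http://schemas.microsoft.com/ado/2007/08/dataservices/metadata\">\n" +
            "  <content type=\"application/xml\">\n" +
            "    <m:properties>\n" +
            "      <d:CustomerID>ALFKI</d:CustomerID>\n" +
            "      <d:CompanyName>Alfreds Futterkiste</d:CompanyName>\n" +
            "    </m:properties>\n" +
            "  </content>\n" +
            "</entry>";
        Console.WriteLine(request); // the wire format of an AtomPub insert
    }
}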

Added 7/30/2011: I abandoned my attempts to use Google Base for sharing database and spreadsheet tables at the end of 2005 because I encountered issues with severe latency (as long as a few hours to publish inserted data), limited and restrictive data type repertoire, chancy (and slow) data upload techniques, and a cumbersome, unintuitive client UI.

Google removed Google Base’s search page on 10/8/2009 and abandoned Google Base in September 2010 by moving it into the Google Merchant Center as the data store for Google Product Search. Google Merchant Center’s data has a restrictive schema with required attributes predefined for selling products. Google announced on 12/17/2010 that Google Base's API had been deprecated in favor of a set of new APIs known as Google Shopping APIs. Google also abandoned its database-backed Google Health service in June 2011. Microsoft currently is attempting to woo former Google Health users to its HealthVault service with a data conversion application.

Monday, April 04, 2011

Amazon Claimed OakLeaf Blog Feed Too Large and Overly Illustrated for Kindle Edition

Amazon has made the OakLeaf Systems blog available for the Kindle for about a year. I occasionally receive $10 or so as my share of subscription proceeds.

image

A bargain.

On 3/13/2011, I received the following message from Amazon’s Kindle Publishing for Blogs group:

Dear Publisher,

image We noticed your blog (listed below) has not updated for more than 60 days.

Blog Title: OakLeaf Systems

Blog ASIN: B0029U16XO

On investigating the feed URL, we received the following error message:

Feed Error: XML error in feed. Details : feed.xml:603:0: junk after document element Feed URL registered with Kindle Publishing: http://oakleafblog.blogspot.com/atom.xml [Emphasis added.]

Kindle customers expect to receive frequent updates for blogs and news feeds to which they subscribe. Because blogs should update at least once per month, we are canceling blogs that have not updated in more than 60 days. Accordingly, if you do not fix this issue and publish new updates within 7 days, we will remove your publication from the Kindle Store.

If you have any questions or concerns, please write to us at kindle-publishing-blogs@amazon.com.

This message surprised me because I update the blog at least three times per week. I replied on the same day:

Hi,

... [Copy of preceding message]

image This blog is updated almost on a daily basis. See http://oakleafblog.blogspot.com.

I use Windows Live Writer, which supports XHTML, for authoring. The Atom feed displays as expected in IE8’s Feed reader. Here’s an example of the first page of today’s feed:

image

Here’s the feed in Mozilla FireFox 3.6:

image

Note that this blog has exceptionally long posts.

Something appears to [be] wrong with your Atom reader.

Please advise status.

Thanks,

Roger Jennings
OakLeaf Blog
Microsoft Access Blog

I received an acknowledgment of my message the next day. Notice that I didn’t take umbrage at Amazon’s characterizing my blog’s content as “junk.”

After prompting, I received the following response on 4/4/2011:

Hello Roger,

image Your blog started publishing regularly to Kindle.

After some investigation, our technical team identified that your blog has too many images and articles resulting in a huge increase in your Kindle Blog's size. This made the blog error out in the Kindle Publishing pipeline.

To prevent this issue in future, we suggest you to reduce the number of images provided in the Kindle Blog feed.

Best regards,

Srinivasa Krishnan
http://www.amazon.com

I don’t think I’m ready to write an AtomPub filter to accommodate Kindle’s limitations. The Kindle delivers illustrated books and thick magazines, such as The New Yorker and PC Magazine; why not lavishly illustrated blogs like mine? Hopefully, they’ve fixed the “Kindle Publishing pipeline.”

Guess I’ll need to buy a Kindle to see how Amazon renders my blog. In the meantime, I’ve added a permalink to the Kindle edition in the left frame.


Thursday, October 14, 2010

Windows Azure and SQL Azure Synergism Emphasized at Windows Phone 7 Developer Launch in Mt. View

image I attended Day 1 of the Windows Phone 7 Developer Launch on 10/12/2010 at the Microsoft Silicon Valley Convention Center in Mt. View, CA and was pleased to see presenters describing Windows Phone 7-Windows Azure Platform synergy.

Updated 10/17/2010: Added capture of Falafel Software’s EventBoard WP7 app.

Updated 10/14/2010: Moved Windows Phone 7 Developer Guide and Problems with Dependency Checker in Drop 5 topics to a new post: Solving Dependency Problems in Drop 5 of p&p’s Windows Phone 7 Developer Guide.

Updated 10/13/2010: See end of post.


image

MobilePay USA (@MobilePayUSA) demonstrated a free application for paying bills with your Windows Phone 7 (or iPhone). Here’s how it works:

  1. When inside a store accepting MobilePay, GPS technology enables the store to be displayed on your phone.
  2. At the checkout stand, tell the cashier you are paying by phone.
  3. On your phone, tap the "PAY STORE" button, enter your PIN and the payment amount due, and tap the "PAY NOW" button.
  4. Now, in just seconds, a payment confirmation will appear on your phone and the merchant’s terminal.

Randy Smith announced that his “team [was] waiting to announce Windows 7 Phone platform at Microsoft headquarters in Silicon Valley! http://ow.ly/2Sk4L” in a 10/12/2010 tweet. You can watch demos by the same pair that presented in Mt. View:

According to MobilePay, a small Windows Azure compute instance can support up to 10,000 transactions/second. Both the iPhone and WP7 versions use Windows Azure for data processing. The demo team said creating the iPhone app took two weeks but they finished the WP7 version in two days.


Lino Tadros and John Waters of Falafel Software described their free EventBoard conferenceware product that’s available for iPhone and Android devices and has been submitted to the new Windows Phone Marketplace.

image

image

The pair’s session covered a “range of real world development stories and deep dive into using Visual Studio 2010 and Expression Blend to build Windows Phone Applications with Windows Azure Cloud Services and Push Notifications.”

image

Falafel developed the initial version of EventBoard for the Silicon Valley Code Camp 2010 to enable attendees to view and manage information about sessions, tracks, rooms, and speakers with the goal of enriching attendees’ conference experience. Here’s the live OData metadata for the initial OData source:

image

And most of the first of the SessionsOverviews:

image

The following example is from the Falafel Software site:

image

The sessions were simulcast at MSDN Simulcast: Windows Phone 7 Developer Launch Event, Oct. 12, but there’s no indication (so far) that a video archive of Day 1 will be provided. The video currently is a static placeholder.


Update 10/13/2010: Microsoft reps refused to discuss the future Windows Phone 7 roadmap at the Developer Launch, but Devindra Hardawar reported Windows Phone 7: Mac sync coming this year, will support removable storage (sort of) in a 10/13/2010 post to MobileBeat:

image Microsoft announced that it will release a tool later this year to let Mac users sync “select content”, Engadget reports. In other news, we finally [have] some idea of how the platform will handle external storage, thanks to Paul Thurrott.

While I don’t think Microsoft’s announcement means that we’ll see a full Zune client on Macs this year, it’s a clear sign that the company isn’t willing to ignore Mac users this time around. Microsoft has long been criticized for only making its Zune media software available on PCs, which in turn prevented its Zune portable players from finding much of an audience with Mac users.

It’s a wise move because Windows Phone 7’s flashy user interface could appeal to some Mac aficionados, and many users also own both Macs and Windows computers. Hopefully, it will also lead the way to a full-fledged Zune client on the Macs — a program that I vastly prefer to iTunes on Windows.

image As for external storage, something that Microsoft has long said won’t be supported on Windows Phone 7, Thurrott explains how some phones on the platform will offer it:

Supported devices (not all Windows Phones will be expandable) will include a micro-SD card slot, which by Microsoft’s requirements must be placed under the battery cover (i.e. next to the actual battery) and not be externally accessible. That’s because this functionality isn’t designed to be something that is swapped out, used with a PC, or whatever. Instead, the micro-SD-based storage will work in tandem with whatever storage is available inside the device.

Microsoft is apparently offering a compromise between easily removable external storage and none at all. On supported phones, you’ll be able to stick in a micro-SD card of your own between 8GB and 32GB, and the OS will combine that storage with the device’s built-in storage. So if you add a 32GB card to a phone with 8GB of storage, the phone will register it as 40GB of total storage.

Thurrott explains that you won’t be able to eject the micro-SD card and read it on your computer due to “technical limitations.” Similarly, you won’t be able to remove the micro-SD card without performing a hard reset on the device, and there’s no way of telling what data is stored on the external card.

Thurrott mentions that Samsung’s Focus supports external memory, which is yet another reason to consider it the most compelling Windows Phone 7 launch device.


Wednesday, May 26, 2010

List of 34 Cloud-Related Sessions at the Microsoft Worldwide Partner Conference (WPC) 2010

image

Microsoft’s Worldwide Partner Conference (WPC, @WPCDC) will take place in Washington, D.C. on July 11-15, 2010 at the Walter E. Washington Convention Center. This five-day partner event offers you the opportunity to learn about Microsoft’s roadmap and best practices for the year, gain exclusive access to Microsoft executives and other partners, and explore the opportunities of Microsoft’s cloud computing strategy. More reasons to attend WPC 2010.

New for WPC 2010! Choose from six types of passes:

  • All Access Event Pass: US$1795.00
  • Day Pass: US$650.00
  • Expo Only Week Pass: US$150.00
  • Guest Pass: US$300.00
  • Business Leadership One-Day Event Pass: US$595.00
  • Hands-on Labs Passes: 1 day pass US$295.00 or 3 day pass US$795.00

Searching WPC 2010’s Session Catalog for Track = Cloud Services returned the following 19 session descriptions categorized by Breakout Sessions, Interactive Discussions and Panel Discussion:

Cloud Services - Breakout Sessions

Successful Selling in the Cloud

  • Session Type: Breakout Session
  • Track: Cloud Services
  • Speaker(s): Jeff Medford, Jon Strausburg, Joshua Shea
  • Level: 200 Level: Intermediate

Discover where opportunities can be found and how to steer your business to success in selling cloud services. Learn the steps you should take to sell Microsoft’s cloud services. Using best practices from our most successful partners we teach you how to have the cloud conversation with your customers, and how to overcome objections and be proficient at answering the most common cloud computing concerns.

Influence of the Cloud on the Channel: How Can Partners Prepare Themselves?

  • Session Type: Breakout Session
  • Track: Cloud Services
  • Speaker(s): Tiffani Bova
  • Level: 300 Level: Interactive; Advanced

The demand for cloud-based solutions is exploding in FY11. How can you as a partner be ready? How do you determine what are the hotspots for your company investments and the business model changes with cloud computing -- today and in the future? During this session hear first-hand from Gartner Research Vice President Tiffani Bova the expected customer adoption rates, customer cloud investment hotspots and business model changes cloud computing will deliver to the channel in 2010.

Microsoft Cloud Strategy 101: Everything You Need to Know about the Cloud

  • Session Type: Breakout Session
  • Track: Cloud Services
  • Speaker(s): Gretchen O'Hara, Tim O'Brien
  • Level: 100 Level: Introductory

Microsoft is “all in” when it comes to the cloud. In this session, come get an end-to-end view of our cloud services strategy and learn how the cloud pervades everything we’re doing. Hear how Microsoft positions itself relative to other offerings in the industry, and how Microsoft’s products and services provide opportunity for your business to win along with us. A must-attend session for all partners.

The Evolution of Microsoft Online Services

  • Session Type: Breakout Session
  • Track: Cloud Services
  • Speaker(s): David Scult
  • Level: 200 Level: Intermediate

Hear about the latest additions to the Microsoft Online Services family: The next versions of Exchange Online, SharePoint Online, Office Communications Online, Office Live Meeting, Dynamics CRM Online, Live@edu, and Windows Intune. In this session, learn about the new services offerings, availability and roadmap, sales and marketing strategies, partner revenue opportunities, and more. After attending this session, you will feel more confident about building a successful and profitable business built on Cloud Services.

Better Together: The Next Generations of Microsoft Online Services + Microsoft Office 2010

  • Session Type: Breakout Session
  • Track: Cloud Services
  • Speaker(s): Eron Kelly
  • Level: 200 Level: Intermediate

The next versions of Microsoft Online Services, Office 2010 and the new Windows Intune are here. In this session, hear about the exciting new updates and all the capabilities that come with this new Online release and how they “light up” Office 2010. Hear about how Windows Intune helps simplify how businesses manage and secure PCs virtually anywhere. Learn strategies to increase your success rate and how to expand your business.

Cloud as a Reality: The Upcoming Decade of Cloud and the Windows Azure Platform

  • Session Type: Breakout Session
  • Track: Cloud Services
  • Speaker(s): Michael Maggs
  • Level: 200 Level: Intermediate

Please join us to learn more on the economics and impact of Cloud computing and why it matters to you. We showcase Cloud trends, our solution (The Windows Azure platform including: SQL Azure and AppFabric), issues and how we’re solving them, and where we’re taking our platform in the next decade and beyond. In this session you can expect to make sense of the importance of the Cloud, the Windows Azure platform and how best you can take advantage of this exciting change in business.

Windows Intune: PC Management with Cloud Services and Windows 7

  • Session Type: Breakout Session
  • Track: Cloud Services
  • Level: 100 Level: Introductory

Microsoft recently announced Windows Intune, a new solution to help you more efficiently manage and secure your customer’s PCs. Windows Intune brings together cloud services for PC management and malware protection, a Windows 7 upgrade subscription, and the Microsoft Desktop Optimization Pack. In this session, learn more about Windows Intune and how this offering can help you grow your business while providing greater value to your customers at a lower cost to you.

Cloud Services - Interactive Discussions

The Cloud Opportunity in Education: Live@edu ISV Discussion

  • Session Type: Interactive Discussion (Small Group)
  • Track: Cloud Services
  • Speaker(s): Raj Mukherjee
  • Level: 100 Level: Introductory

Live@edu is a worldwide program that provides education institutions (students, faculty, staff and alumni) with free online email and collaboration services through Microsoft Outlook Live (based on Microsoft Exchange Server 2010) and Windows Live offerings. This interactive session covers the programmable layer related to Live@edu as of today (mainly Exchange Web Services and Windows PowerShell) and the services that are in the roadmap (e.g. Microsoft SharePoint) for the next 12-18 months. If you are in the education vertical or planning to create an education solution, please join us in this interactive discussion. NDA required and signed on-site.

Microsoft Technology Vision for Next Generation Hosting

  • Session Type: Interactive Discussion (Small Group)
  • Track: Cloud Services
  • Speaker(s): Anil Reddy, Raúl González
  • Level: 200 Level: Intermediate

Learn about how the Microsoft platform can enable service providers to grow their hosting businesses and expand their offerings to reach new customers. In this session, hear about the latest product roadmaps and resources available to hosters as well as the vision of how Microsoft and hosters may partner in an effort to build next generation offerings.

Winning the Cloud with Microsoft Online Services

  • Session Type: Interactive Discussion (Small Group)
  • Track: Cloud Services
  • Speaker(s): Martin Bald, Thomas Rizzo
  • Level: 200 Level: Intermediate

This session helps power your sales engine for success with Microsoft Online Services. Learn how to tell the Microsoft cloud story, build sales confidence and drive sales success. Understand your competition, how they are targeting your customer revenue, how to effectively position yourself, and win with existing and new customer accounts. Finally, hear more about best practices and success stories from other partners who are experiencing success in the cloud today.

Microsoft Transactional Partners: Distributors, LARs, VARs Interested in Windows Azure

  • Session Type: Interactive Discussion (Small Group)
  • Track: Cloud Services
  • Speaker(s): Anders Trolle-Schultz, Tony Bailey
  • Level: 100 Level: Introductory

This interactive session discusses how Microsoft ISV, Hosting Provider, VAR/Reseller/SI and Distributor partners could engage with each other within SaaS/S+S/Cloud Computing and the Windows Azure Platform and reach out to end customers. The discussion includes listening to constructive ideas from a SaaS-IT business enablement consultancy on how Resellers and Distributors can carve out a business for themselves re-selling and distributing cloud-based apps vs. on-premises or boxed apps.

Integrating Microsoft Cloud Services into the License Management Process

  • Session Type: Interactive Discussion (Small Group)
  • Track: Cloud Services
  • Speaker(s): Heather Young, Jim Hill
  • Level: 200 Level: Intermediate

With so many customers considering Microsoft Online Services, what role should the customer’s existing licensing agreement and on-premises software play in your discussions? How should they think about BPOS as compared to the eCAL Suite and server upgrades in terms not only of the license costs, but other elements that can impact these decisions? Customers today need help understanding the total solution, including cost implications of areas such as asset management, hardware, energy, and more when moving to the cloud. This interactive discussion helps you understand these key areas, identify resources, and translate to successful customer conversations.

Click, Try, Buy! A Partner’s Guide to Driving Customer Demand Generation with Microsoft Dynamics CRM!

  • Session Type: Interactive Discussion (Small Group)
  • Track: Cloud Services
  • Speaker(s): David Brown, Patrick Pando
  • Level: 200 Level: Intermediate

Learn the simple and effective way to market your solutions and increase demand for NEW business! Offer your customers IMMEDIATE time to value with no technical issues or platform restrictions to block the sales process. See how easy it is to help customers lower up-front investment costs, accelerate ROI, and drive business process efficacy. Get up to speed on how to offer a complete business solution INSTANTLY with an integrated Microsoft Exchange, SharePoint, Dynamics CRM, and mobile experience supporting one user or x,000 users. See the true power of choice at work: in the cloud, partner hosted, and on-premise.

Access Premium Data as a Service Subscriptions with Microsoft Codename “Dallas”

  • Session Type: Interactive Discussion (Small Group)
  • Track: Cloud Services
  • Speaker(s): Roger Mall
  • Level: 200 Level: Intermediate

Microsoft Codename "Dallas" is a Windows Azure information marketplace that brings data, imagery, and real-time web services from leading commercial data providers and authoritative public data sources together into a single location, under a unified provisioning and billing framework. This new service allows developers and information workers to easily discover, purchase, and manage premium data subscriptions in the Windows Azure platform. Learn how you can incorporate "Dallas" data services into your applications, projects, and solutions and generate revenues from it.

The Rise of Cloud Collaboration

  • Session Type: Interactive Discussion (Small Group)
  • Track: Cloud Services
  • Speaker(s): Jared Spataro
  • Level: 100 Level: Introductory

Cloud-based delivery models for collaboration are growing rapidly. Gartner is predicting that by 2012 one out of five businesses will own no IT assets at all, opting instead to use Cloud-based services for collaboration. With the leading collaboration technologies in market today, Microsoft is well placed to take advantage of this with SharePoint and Exchange Online. This session helps you to position and sell the value of Microsoft's cloud collaboration story and the opportunities that exist for Microsoft partners to win in this rapidly changing market.

Best Practices: Selling Cloud Based Solutions to a Customer

  • Session Type: Interactive Discussion (Small Group)
  • Track: Cloud Services
  • Speaker(s): David Cryer
  • Level: 200 Level: Intermediate

This session is for partners who would like to gain a better understanding of what the Microsoft Cloud strategy is and how to successfully sell cloud solutions to their customers. We discuss the “anatomy of a cloud deal” in an effort to prepare you to have a Microsoft Online Services and Azure sales conversation and effectively deliver the cloud value proposition to customers.

Connect Your On-Prem Investments to In-Cloud Applications

  • Session Type: Interactive Discussion (Small Group)
  • Track: Cloud Services
  • Speaker(s): Derek Pai, Madhu Kachru
  • Level: 200 Level: Intermediate

Customers can only move their applications to Cloud if they can continue to seamlessly connect current on-prem applications to in-Cloud applications. This session discusses how various Microsoft technologies allow our SI partners to bridge the gap between on-prem applications to in-Cloud applications.

Cloud Services Deployment Opportunities for Partners

  • Session Type: Interactive Discussion (Small Group)
  • Track: Cloud Services
  • Speaker(s): Mark Rice
  • Level: 200 Level: Intermediate

What Cloud services provisioning and deployment opportunities exist for partners? In this interactive session, we discuss customer provisioning and deployment scenarios and concepts, and how partners can fulfill them.

Cloud Services - Panel Session

Cloud Services Partner Panel: Partner Perspectives and Best Practices

  • Session Type: Panel Session
  • Track: Cloud Services
  • Level: 300 Level: Interactive; Advanced

This session features a panel of partners who are successfully building solutions, practices, and business portfolios based on Microsoft’s cloud services today. This lively interactive discussion is moderated by a Microsoft executive and features a variety of partner types and business models from around the world.

Other Tracks – Key Word Azure

Searching WPC 2010’s Session Catalog for Key Word = Azure returned the following 15 Azure-related session descriptions in other tracks:

Best Practices in Building Your Solution in Windows Azure

  • Session Type: Interactive Discussion (Small Group)
  • Track: US Partner
  • Speaker(s): Aaron Suzuki, John Lair, Mike Jalonen, Rusty Johnson, Sarim Khan
  • Level: 100 Level: Introductory

Are you an ISV actively building or considering building your solution on Windows Azure? Come and interact with ISV partners that are developing their applications on Windows Azure. Share your ideas and listen to these real-world examples, the challenges, the tips and tricks and the ROI.

Best Practices in Building Your Solution in Windows Azure

  • Session Type: Interactive Discussion (Small Group)
  • Track: US Partner
  • Level: 100 Level: Introductory

Are you an ISV actively building or considering building your solution on Windows Azure? Come and interact with ISV partners that are developing their applications on Windows Azure. Share your ideas and listen to these real-world examples, the challenges, the tips and tricks and the ROI.

Best Practices in Creating New Business with Azure and Software as a Service (SaaS) Offerings for SI Partners

  • Session Type: Interactive Discussion (Small Group)
  • Track: US Partner
  • Speaker(s): Aaron Suzuki, John Lair, Mike Jalonen, Rusty Johnson, Sarim Khan
  • Level: 100 Level: Introductory

If you are an SI looking for best practices for creating new business with Azure/SaaS offerings and how to build the profitable business models and channel partnerships of Azure/SaaS, this is the session for you!

Build Successful Service Line Offerings around the Microsoft Middleware Stack

  • Session Type: Breakout Session
  • Track: Application Platform
  • Speaker(s): Sudhir Hasbe
  • Level: 200 Level: Intermediate

This session helps SI partners understand the Microsoft middleware strategy. Microsoft has various middleware assets delivered in products such as SharePoint, SQL Server, AppFabric, Azure and BizTalk Server. This session tells the all-up middleware story of Microsoft. We also share how our SI partners have successfully leveraged the Microsoft middleware stack to deliver real value to their customers.

Building a Multi-Tenant SaaS Application with Microsoft SQL Azure and Windows Azure AppFabric

  • Session Type: Breakout Session
  • Track: Application Platform
  • Speaker(s): Rick Negrin
  • Level: 200 Level: Intermediate

Come learn how SQL Azure, AppFabric, and the Windows Azure platform will enable you to grow your revenue and increase your market reach. Learn how to build elastic applications that will reduce costs and enable faster time to market using our highly available, self-service platform. These apps can easily span from the cloud to the enterprise. If you are either a traditional ISV looking to move to the cloud or a SaaS ISV who wants to get more capabilities and a larger geo-presence, this is the session that will show you how.

Business Software Mini-Track: Leveraging Business Application Platform Technologies

  • Session Type: Breakout Session
  • Track: Independent Software Vendor
  • Speaker(s): Steve Fox
  • Level: 100 Level: Introductory

The key to success with business applications is to deliver the greatest value to customers as quickly as possible. From Office to SharePoint to Azure, Microsoft can help you deliver more value in your solutions and to your customers with agility, performance and scalability. In this session, see the many ways that Microsoft can help you accomplish that goal through deeper insight into the core productivity platform technologies. To get there, we take a peek into how you can build and deploy solutions using Microsoft’s business application technologies.

Cloud Computing for ISVs

  • Session Type: Breakout Session
  • Track: Independent Software Vendor
  • Speaker(s): Michael Maggs
  • Level: 200 Level: Intermediate

ISVs have very particular and unique needs when it comes to cloud computing. Here we showcase cloud trends, our platform (the Windows Azure platform including: SQL Azure and AppFabric), issues and how we’re solving them and where we’re taking our platform in the next decade and beyond. As an ISV, you can expect this session to make sense of the importance of the cloud, the Windows Azure platform, and how best you can take advantage of this exciting change in business.

Migrating Database Applications to Microsoft SQL Azure

  • Session Type: Interactive Discussion (Small Group)
  • Track: Application Platform
  • Speaker(s): Niraj Nagrani
  • Level: 200 Level: Intermediate

Do you build and manage a ton of small applications in the enterprise? Come and learn how moving those applications up to the cloud can benefit both you and your customer. We cover how easy it is to migrate the application, as well as the benefits you get from running in the cloud.

The Meaning of Cloud Computing for My Small and Medium Business/Practice

  • Session Type: Breakout Session
  • Track: Core Infrastructure, Small and Midsize Business Partner
  • Speaker(s): Chris Phillips, Marco Di Giacomo
  • Level: 200 Level: Intermediate

Is the Cloud going to hurt my business? Can I get rich on it? Join this session to hear from the team who created Microsoft Small Business Server (SBS) what is likely to happen, what you can do to benefit from it, and what we have developed on the roadmap to help you succeed. Upcoming Windows Server products, Intune and the new Azure technology will create opportunities for your business and your customers. Come hear about them!

US SLG: Winning with the Microsoft Cloud

  • Session Type: Breakout Session
  • Track: Public Sector
  • Level: 100 Level: Introductory

Cloud Solutions reloaded: BPOS is in large-scale deployment, and SLG partners need to be a part of that rollout. Hear a recap of the roadmap, why we see customers moving to BPOS, and where partners should be today in their skills and customer base. Azure: move beyond the talk. Get roadmap information, what partners should be doing to engage customers, and how they can use Azure to strengthen their positioning with business and technical decision makers.

Microsoft Transactional Partners: Distributors, LARs, VARs Interested in Windows Azure

  • Session Type: Interactive Discussion (Small Group)
  • Track: Cloud Services
  • Speaker(s): Anders Trolle-Schultz, Tony Bailey
  • Level: 100 Level: Introductory

This interactive session discusses how Microsoft ISV, Hosting Provider, VAR/Reseller/SI and Distributor partners could engage with each other within SaaS/S+S/Cloud Computing and the Windows Azure Platform and reach out to end customers. The discussion includes listening to constructive ideas from a SaaS-IT business enablement consultancy on how Resellers and Distributors can carve out a business for themselves re-selling and distributing cloud-based apps vs. on-premises or boxed apps.

Windows Azure: What a Service Provider Needs to Know to Get Started

  • Session Type: Interactive Discussion (Small Group)
  • Track: Hosting Infrastructure
  • Speaker(s): Pascal Walschots
  • Level: 300 Level: Interactive; Advanced

As a service provider, cloud services are creating a new opportunity to add offerings to your portfolio and optimize your existing offerings. How do you start with integrating Windows Azure services into your current service portfolio? What are the scenarios you can consider? Find out during this session.

Programs, Resources and Tools to Help ISVs Move to Cloud

  • Session Type: Interactive Discussion (Small Group)
  • Track: Independent Software Vendor
  • Speaker(s): Trina Horner
  • Level: 100 Level: Introductory

ISVs are in different stages of development for the Cloud, from thinking about moving to SaaS/Cloud to already in market to trying to increase average revenue per customer. Come learn about how Microsoft can help you at any stage of your development: Comprehensive Advisory Services offer based on 40 business success factors to optimize for the SaaS/Cloud model; Hosters and enabling partners who have expertise to onboard ISVs to SaaS/Cloud; Latest research “Sense of Urgency for Software Companies: Partnering for Success in the Cloud”; How best to engage with partner public and private clouds and Windows Azure, and more.

US Federal: Cloud Opportunities

  • Session Type: Breakout Session
  • Track: Public Sector
  • Level: 100 Level: Introductory

Making the cloud Federal ready. This session covers the nuances of Microsoft’s Cloud Services specific to Federal organizations. During this discussion, learn how Microsoft is making the Cloud consumable for government by allowing them to meet their regulatory obligations. Additionally, understand how partners and System Integrators have found success with their Cloud offerings in both the Online Services and Azure spaces. Bring any questions you have and we will share with you the top questions that we are hearing.

Health Track: Health Plans - Out of the Box and into “The Cloud”

  • Session Type: Breakout Session
  • Track: Public Sector
  • Speaker(s): Hector Rodriguez
  • Level: 200 Level: Intermediate

Microsoft Health Plan Industry Technology Specialist, Hector Rodriguez, discusses how health plans are looking for ways to innovate through “Health Improvement” Technologies (HiT) and how the Windows Azure platform offers an opportunity for our Partners to help them succeed. This session discusses some of the specific innovations we are seeing for health plans and offers some suggestions for how your organization should position an evaluation of the Windows Azure platform, not just as a technology platform but as a potentially new business model.

 

Thursday, April 08, 2010

Windows Azure and Cloud Computing Posts for 4/8/2010+

Windows Azure, SQL Azure Database and related cloud computing topics now appear in this weekly series.

 
Update 4/15/2010: Corrected the spelling of Shannon Lowder’s last name (see the SQL Azure Database, Codename “Dallas” and OData section.)

Note: This post is updated daily or more frequently, depending on the availability of new articles in the following sections:

To use the above links, first click the post’s title to display the single article you want to navigate.

Cloud Computing with the Windows Azure Platform published 9/21/2009. Order today from Amazon or Barnes & Noble (in stock.)

Read the detailed TOC here (PDF) and download the sample code here.

Discuss the book on its WROX P2P Forum.

See a short-form TOC, get links to live Azure sample projects, and read a detailed TOC of electronic-only chapters 12 and 13 here.

Wrox’s Web site manager posted on 9/29/2009 a lengthy excerpt from Chapter 4, “Scaling Azure Table and Blob Storage” here.

You can now download and save the following two online-only chapters in Microsoft Office Word 2003 *.doc format by FTP:

  • Chapter 12: “Managing SQL Azure Accounts and Databases”
  • Chapter 13: “Exploiting SQL Azure Database's Relational Features”

HTTP downloads of the two chapters are available from the book's Code Download page; these chapters will be updated for the January 4, 2010 commercial release in April 2010. 

Azure Blob, Table and Queue Services

Steve Marx’s Leasing Windows Azure Blobs Using the Storage Client Library of 4/8/2010 begins:

One of the unsung heroes of Windows Azure storage is the ability to acquire leases on blobs. This feature can help you solve thorny concurrency challenges, perform leader election, serialize edits, and much more. Look for more discussion of leases over on the Windows Azure Storage Team blog in the coming weeks.

A question came up on the Windows Azure forum yesterday about how to use this blob lease functionality from the storage client library that ships with the Windows Azure SDK. Unfortunately, methods to acquire, renew, break, and release leases are not yet included in the top-level Microsoft.WindowsAzure.StorageClient namespace. This makes it a bit hard to figure out how to use leases. In this post, I’ll show you what’s available in the storage client library to manage leases, and I’ll share some code to help you get going.

Using the Protocol namespace

The lease operations can be found in the Microsoft.WindowsAzure.StorageClient.Protocol namespace, which provides lower-level helpers to interact with the storage REST API. In that namespace, there’s a method called BlobRequest.Lease(), which can help you construct a web request to perform lease operations.

Here’s a simple method which attempts to acquire a new lease on a blob and returns the acquired lease ID. (For convenience, I’ve made this an extension method of CloudBlob, which allows for syntax like myBlob.AcquireLease().)

// Note: CloudBlob is in the Microsoft.WindowsAzure.StorageClient namespace;
// BlobRequest and LeaseAction are in Microsoft.WindowsAzure.StorageClient.Protocol.
// As an extension method, AcquireLease must be declared in a static class.
public static string AcquireLease(this CloudBlob blob)
{
    var creds = blob.ServiceClient.Credentials;
    var transformedUri = new Uri(creds.TransformUri(blob.Uri.ToString()));
    var req = BlobRequest.Lease(transformedUri,
        90, // timeout (in seconds)
        LeaseAction.Acquire, // as opposed to "break" "release" or "renew"
        null); // name of the existing lease, if any
    blob.ServiceClient.Credentials.SignRequest(req);
    using (var response = req.GetResponse())
    {
        return response.Headers["x-ms-lease-id"];
    }
}

The call to BlobRequest.Lease() gives me an HttpWebRequest which I can then execute. To make sure I’m using the correct URL and authorization, I’m using TransformUri() and SignRequest(). The former updates the URL with a Shared Access Signature (if needed), and the latter constructs the correct Authorization header (if needed). Doing both ensures that no matter which kind of access I’m using, I have a properly authorized HttpWebRequest.

Finally I execute the web request and read the x-ms-lease-id header to get back the newly-acquired lease (which will be in GUID format).

Steve continues with code examples for:

    • Using the lease once acquired
    • The rest of the lease methods
    • Simple usage

and provides downloadable sample code.
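As a hedged illustration of the pattern (Steve’s downloadable code is authoritative), a companion release call can mirror AcquireLease with a different LeaseAction, and usage then reduces to a pair of extension-method calls:

// Sketch only: releasing a lease, patterned on the AcquireLease method above.
public static void ReleaseLease(this CloudBlob blob, string leaseId)
{
    var creds = blob.ServiceClient.Credentials;
    var transformedUri = new Uri(creds.TransformUri(blob.Uri.ToString()));
    var req = BlobRequest.Lease(transformedUri,
        90,                  // timeout (in seconds)
        LeaseAction.Release, // as opposed to acquire, break, or renew
        leaseId);            // the lease being released
    blob.ServiceClient.Credentials.SignRequest(req);
    req.GetResponse().Close();
}

// Hypothetical usage, with a made-up container and blob name:
// CloudBlob blob = client.GetBlobReference("mycontainer/myblob.txt");
// string leaseId = blob.AcquireLease(); // GUID-format lease ID
// blob.ReleaseLease(leaseId);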


SQL Azure Database, Codename “Dallas” and OData

Kevin Kline’s The NoSQL Movement: Hype or Hope? article of 4/7/2010 for Database Trends and Applications magazine’s April 2010 issue casts a jaundiced eye on the NoSQL movement’s tenets:

If you spend any time at all reading IT trade journals and websites, you've no doubt heard about the NoSQL movement.  In a nutshell, NoSQL databases (also called post-relational databases) are a variety of loosely grouped means of storing data without requiring the SQL language.  Of course, we've had non-relational databases far longer than we've had actual relational databases.  Anyone who's used products like IBM's Lotus Notes can point to a popular non-relational database.  However, part and parcel of the NoSQL movement is the idea that the data repositories can horizontally scale with ease, since they're used as the underpinnings of a website.  For that reason, NoSQL is strongly associated with web applications, since websites have a history of starting small and going "viral," exhibiting explosive growth after word gets out.

In contrast, most relational database platforms require a lot of modifications to successfully grow in scalability from small to medium to global.  For a good review of such a growth pattern, and the frequent re-designs that explosive growth requires, read the story of MySpace's evolution as a Microsoft SQL Server shop at http://www.microsoft.com/casestudies/Case_Study_Detail.aspx?CaseStudyID=4000004532.

On the negative side, NoSQL databases circumvent the data quality assurance of relational databases best known as ACID (atomicity, consistency, isolation, durability) property of transactions.  So, while NoSQL databases might be very fast and scale easily, they do not typically guarantee that a transaction will be atomic, consistent, isolated, and durable.  In other words, you could lose data, and there is no guarantee that a transaction will always complete successfully, or completely roll back.

The market for NoSQL is still very immature and crowded with many open source and proprietary products.  Well-known vendors for NoSQL databases include Google's Big Table offering and Amazon's Dynamo, both of which are available as inexpensive cloud services.  Some of the most talked about NoSQL platforms on the open source side include Apache's HBase and CouchDB; Facebook's Cassandra; and LinkedIn's Project Voldemort. …

Kevin Kline is the technical strategy manager for SQL Server Solutions at Quest Software. You might be interested in his earlier Server in the Clouds? article with this abstract:

The idea of "SQL Server in the cloud" is all the rage as I write this article. Many SQL Server experts already predict the demise of the IT data center and a complete upending of the current state of our industry, in which large enterprises can spend millions of dollars on SQL Server licenses, hardware and staff. I have to admit, when I first heard about this idea, I was ecstatic. What could be better for an enterprise than to have all the goodness of a SQL Server database with none of the hardware or staffing issues? However, on deeper examination, there is much about which to be cautious.

Shannon Lowder describes Migrating Databases to SQL Azure without current migration tools such as the SQL Server Migration Wizard or Azure Data Sync in this 4/7/2010 post:

When converting a database from an older version of Microsoft SQL to Azure, there will be many gotchas along the way.  I'd like to help you learn from the troubles I had along the way, hopefully sparing you a bit of time that was lost during my first conversion.

Getting Started

I'm going to assume you already have your account, and have already set up the database and firewall settings for your Azure server.  If you haven't please visit http://sql.azure.com and follow their getting started guide.  This will walk you through each of the steps you'll need to have completed before any of the following article will help.

To get started developing in Azure, you can either build a database from scratch or "export" a current database to your Azure server.  Since I have several databases that I've built over the years, I figured I'd start my development by upgrading an existing database to Azure.

Shannon probably could have saved considerable time by checking out these posts:

Ron Jacobs’ Using System.Web.Routing with Data Services (OData) post of 4/5/2010 answers the question “How do you get rid of the .svc extension with WCF Data Services?”:

So you like the new OData protocol and the implementation in WCF Data Services… but you hate having a “.SVC” extension in the URI?

How do you get rid of the .svc extension with WCF Data Services?

Simple… Just use the new ServiceRoute class. For this example, I’m using a project from my MIX10 session Driving Experiences Via Services using the .NET Framework. The sample includes a simple WCF Data Service that returns information about a conference.

As you can see I’ve put the service under the Services folder.  To access it you have to browse to http://localhost:62025/Services/Conference.svc/

By default, the folder and file name in the web site are used to create the URI.  But if you don’t like that, with .NET 4 you can just create a route.

Just add a Global.asax file to your site and add the following code [that Ron shows you].

This code creates a route for the URI http://localhost:62025/Conference that uses the DataServiceHostFactory to create our WCF Data Service class, which is the Conference type.

Simple, easy and very cool…
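Ron's code isn't reproduced above, but a minimal sketch of the Global.asax registration he describes would look something like this (assuming the Conference data service type from his sample; details may differ from his actual post):

using System;
using System.Data.Services;              // DataServiceHostFactory
using System.ServiceModel.Activation;    // ServiceRoute (new in .NET 4)
using System.Web.Routing;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Map the extensionless URI "/Conference" to the data service, so
        // clients no longer need the physical /Services/Conference.svc path.
        RouteTable.Routes.Add(
            new ServiceRoute("Conference", new DataServiceHostFactory(), typeof(Conference)));
    }
}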

Carl and Richard interview Brad Abrams, Bob Dimpsey and Lance Olson about OData for .NET Rocks! Show #519.

Brad is currently the Group Program Manager for Microsoft’s UI Framework and Services team, Bob is the Product Unit Manager for the Application Server Developer Platform, and Lance is a Group Program Manager building developer tools and runtimes for data on the SQL Server team.

<Return to section navigation list> 

AppFabric: Access Control and Service Bus

The Azure AppFabric team announces The Windows Azure platform AppFabric April 2010 Release is Live in this 4/7/2010 post:

The Windows Azure platform AppFabric April Release is now live. In addition to improvements in stability, scale, and performance, this release addresses two previously communicated issues in the March billing preview release. We recommend that you re-check your usage previews for April 7th or later per the instruction in the March Billing Preview announcement to ensure that you sign up for the appropriate pricing plan in preparation for AppFabric billing that will start after April 9. For more information about pricing and the billing related issues addressed in this release, please visit our pricing FAQ page.

Please refer to the release notes for a complete list of known issues and breaking changes in this release. To obtain the latest copy of the AppFabric SDK, visit the AppFabric portal or the Download Center.

Vittorio Bertocci reports in his Patria Natia Tour: keynotes di VS2010 & Basta! Italia, Community Tour a Catania & Venezia post of 4/7/2010 (in Italian) that he will deliver a pair of keynotes and two claims-based identity sessions in mid-April: 

image In September 2005 I embarked on this Redmond venture, buying the domain www.maseghepensu.it like a good Genoese emigrant. At the time I never imagined that a few years later I would come back to Italy to deliver the keynote for one of the most important product launches in our recent history! Needless to say, I am deeply honoured and extremely pleased, and I’m looking forward to meeting Italian developers on tour with Lorenzo, Francesca and all the members of Microsoft Italy involved in launching Visual Studio 2010 (they are doing an amazing job). The agenda is below:

  • 12 April, 14:30–18:00. Visual Studio 2010 launch keynote, streamed live on the event pages at http://www.microsoft.com/italy/visualstudio/2010/evento.aspx
  • 13 April, 9:30–10:15. Basta! Italia keynote in Rome
  • 14 April, 10:00–11:40. Keynote and claims-based identity session at the OrangeDotNet community event in Catania
  • 15 April, 10:00–11:40. Keynote and claims-based identity session at the XeDotNet community event in Venice

A great tour! Especially when you consider that in the 33 years I lived in Italy, I never set foot in either Veneto or Sicily ...

I’m really curious to see whether presenting in Italian again will be like trading the medicine ball for a supertele, or whether almost 5 years of using Italian mainly as a private channel with my wife when we shop have ruined my otherwise proverbial gift of the gab ... I hope you’ll be understanding :-))

See you on tour!!!

Italian –> English translation by Microsoft (Bing) Translator.

<Return to section navigation list>

Live Windows Azure Apps, APIs, Tools and Test Harnesses

Joshua Kurlinski from Symon Communications and Joseph Fultz from the Dallas Microsoft Technology Center deliver this 00:21:15 Scaling Web Sites with Azure and Local Cache Channel9 video segment of 4/8/2010:

In this screencast we learn how Microsoft teamed up with Symon Communications to build a scalable content delivery system for mobile devices. This Proof of Concept was created to help with Symon’s digital signage network, but the caching mechanism could potentially benefit anyone looking to leverage Windows Azure for massive scale. Joshua Kurlinski from Symon talks about the problems we needed to solve, and Joseph Fultz from the Dallas MTC (Microsoft Technology Center) walks us through the solution we created in depth.

To learn more about Symon Communications, visit www.symon.com. If you’d like to learn more about this solution, you can read this article on Joseph’s blog.

Mike Wickstrand’s From your perspective, what is the reasonable length of time it should take to deploy an application to Windows Azure? Twtpoll survey lets you select from various deployment times:

Here were the results as of 11:00 AM on 4/8/2010:

image

I voted for #1.

Mike is Senior Director of Product Planning for Windows Azure.

Maarten Balliauw explains Running PHP on Windows Azure and other related topics on 4/8/2010:

Yesterday I did some talks on PHP and Windows Azure at JumpIn Camp in Zürich together with Josh Holmes. Here are the slide decks and samples we used.

Scaling Big while Sleeping Well

Josh talked about what Windows Azure is, what components are available, and how you can get started with PHP and Windows Azure: Scaling Big While Sleeping Well. View more presentations from Josh Holmes.

Running PHP in the Cloud

I did not do the entire deck, but showed some slides and concepts. This is mainly the same content as Josh’s session with some additions: Running PHP In The Cloud. View more presentations from Maarten Balliauw.

Windows Azure Storage & SQL Azure

This deck talks about the different storage concepts and how to use them in PHP: Windows Azure Storage & Sql Azure. View more presentations from Maarten Balliauw.

Sample code

As a demo, I had ImageCloud, a web application similar to Flickr. Here’s the sample code: ImageCloud.rar (5.00 mb)

tbtechnet answers PHP on Windows Azure… What’s the Scoop? in this 4/7/2010 post to the Windows Azure Platform, Web Hosting and Web Services blog:

Ever wondered what’s involved to get a PHP app to work with Windows Azure? We took a crack at explaining to PHP developers how to work with Windows Azure.

PHP on Windows Azure Quickstart Guide - Creating and Deploying PHP Projects on Windows Azure

The guide shows how to develop and test PHP code on a local development machine, and then how to deploy that code to Windows Azure. The material is intended for developers who are already using PHP with an integrated development environment (IDE) such as Eclipse, so the guide doesn't cover PHP syntax or details of working with Eclipse, but experienced PHP programmers will see that it is easy to set up Eclipse to work with Windows Azure to support web-based PHP applications.

Along with the new Windows Azure Command-line Tools for PHP Developers, which give PHP developers a command-line utility, the guide is worth a look.

<Return to section navigation list> 

Windows Azure Infrastructure

CloudTweaks reports on Microsoft Research’s Cloud Computing Project – “Cloud Faster” in this 4/8/2010 post:

To make cloud computing work, we must make applications run substantially faster, both over the Internet and within data centers. Our measurements of real applications show that today’s protocols fall short, leading to slow page-load times across the Internet and congestion collapses inside the data center. We have developed a new suite of architectures and protocols that boost performance and the robustness of communications to overcome these problems.

About Cloud Faster

The results are backed by real measurements and a new theory describing protocol dynamics that enables us to remedy fundamental problems in the Transmission Control Protocol.

To speed up the cloud, we have developed two suites of technology:

  • DCTCP – changes to the congestion control algorithm of TCP that decreases application latency inside the data center by decreasing queue lengths and packet loss while maintaining high throughput.
  • WideArea TCP – changes to the network stack of the “last-hop server” – the last server to touch packets before they travel to the client – that reduce the latency for transferring small objects (5 to 40 KB) by working around last-mile impairments such as loss and high RTT.

We will demo the experience users will have with Bing Web sites, both with and without our improvements. The difference is stunning. We also will show visualizations of intra-data-center communication problems and our changes that fix them. This work stems from collaborations with Bing and Windows Core Operating System Networking.

DCTCP – Reducing Latency Inside the Data Center

The following videos show what happens when a server (marked 21) in a rack sends a request for information to 20 other servers in the same rack, and then waits for their responses so that it can formulate a summary. This Partition/Aggregate pattern is very common in data center applications, forming the heart of applications as diverse as Web search (querying very large indexes), ad placement (find the best ads to show with a web page), and social networking (find all a user’s friends, or the most interesting info to show to that user).

In both videos, we see a burst of activity as the request is sent out, with all servers responding at roughly the same time with a burst of packets that carries the first part of their response. This burst is known as incast, and it causes the queue at the switch to rapidly grow in size (shown as blocks extending out on a 45 degree angle).

DCTCP

In the case of DCTCP, senders start receiving congestion notifications much earlier than with TCP. They adjust their sending rates, and the queue never overflows. Even after the initial burst of activity, the operation of DCTCP is much smoother than TCP’s, with senders offering roughly equal amounts of traffic — so much so that it even appears they are “taking turns.”
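For the protocol-minded, the sender-side reaction DCTCP adds to TCP is compact enough to sketch in C#. This is a simplified rendering of the published algorithm, with illustrative window and gain values, not production code:

// DCTCP sender sketch: track alpha, a moving estimate of the fraction of
// packets that came back ECN-marked, and cut the congestion window in
// proportion to alpha rather than halving it the way standard TCP does.
class DctcpSender
{
    private double alpha = 1.0;            // congestion-extent estimate, in [0, 1]
    private const double Gain = 1.0 / 16;  // EWMA gain; the paper suggests roughly 1/16

    public double CongestionWindow = 10.0; // in packets, illustrative starting value

    // Called once per window of acknowledged packets.
    public void OnWindowAcked(int acked, int ecnMarked)
    {
        double markedFraction = (double)ecnMarked / acked;
        alpha = (1 - Gain) * alpha + Gain * markedFraction; // smooth the estimate

        if (ecnMarked > 0)
            CongestionWindow *= 1 - alpha / 2; // gentle cut, proportional to congestion
        else
            CongestionWindow += 1;             // normal additive increase
    }
}

Because lightly marked windows produce only small cuts, senders keep switch queues short without giving up throughput, which is the smooth “taking turns” behavior described above.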

Watch the Videos

David Gristwood has updated his SlideShare deck to 70 slides in Understanding The Azure Platform March 2010. Dave is a Microsoft application architect.

<Return to section navigation list> 

Cloud Security and Governance

Ina Fried claims The cloud--it's not for control freaks in this 4/8/2010 post to CNet News’ Beyond Binary column:

Moving server software to the cloud has a lot of advantages. A company no longer has to worry about patches, deploying upgrades, and any number of other concerns.

But it also has one big downside--one that many CIOs are still struggling with--the loss of control.

"They do lose control, when they move to a cloud-based service, of some things," Microsoft Senior Vice President Chris Capossela said during a lunch meeting on Wednesday. "They lose control of when things get updates. They lose control of saying 'no' to some new thing."

Capossela acknowledged that many technology executives, even those who are shifting work to the cloud, see it as a mixed bag.

"On Mondays, Wednesdays, and Fridays they hate it, and on Tuesdays and Thursdays they are really excited by it," Capossela said. "What I mean by that is they see the excitement and the benefits of it and they are also scared of it."

To the end user, it doesn't make a huge difference; Microsoft's software looks basically the same whether it is running in a customer's data center or as a service from Microsoft. If anything, the service customers are happier because they get new versions more quickly.

However, to the IT department, those two scenarios look very different. When they run the software on their own, customers have to budget for upgrades, manage installations, and monitor servers. In the latter scenario, the company doesn't do any of that, but at a different cost: it has little say in which versions of the software are running.

Photo of Chris Capossela by Microsoft.

e.Tune.me’s The Cloud is unsafe? Quite the opposite post of 4/8/2010 posits:

As it turns out, 45% of tech execs believe that the risks of cloud computing outweigh the benefits.

I don’t get it. Yes, the risks are obvious, and I realise that the primary concern for large enterprises entering the cloud is the increased vulnerability of the information contained within it. With almost all cloud solutions, a company does not host its own cloud servers and is therefore not responsible for managing the security of those servers.

This is very much akin to trusting a stranger with your children. If something ever happened to them, wouldn’t you rather it happen when they were with you? After all who can love and protect them better than their own parents? My response to this is simple: If the person you were trusting your children with were a kick-ass Steven Seagal-type character, would you still not trust them to be safe? Certainly you will still love them more than Steven Seagal (maybe??), but barring some divine lifting-a-car-off-your-child type moment, you surely can’t protect them better.

This is how I view the cloud. Disregarding amateurish companies that consider the cloud as dollars for storage with little concern for security, cloud services such as Amazon’s AWS are the Steven Seagals of the cloud. It is their business to ensure that they are the best at what they do and security is absolutely paramount. As Google have stated in the past, one small error and everyone stops trusting you. They simply can’t afford any mistakes, even more so than the enterprises that use them.

With that said, how can enterprises possibly trust their own ‘secure’ non-cloud based solutions over the cloud? Their networks are still connected to the net (although very well firewalled), their information is still accessible to anyone desperate enough to retrieve it (just ask the Chinese government), and surely – SURELY – most enterprise IT experts are no more and probably less skilled than those working for the likes of Amazon. That said, if you were to get hacked, wouldn’t you prefer that liability for the resulting damages lie with someone else? I sure would.

The cloud can in some twisted way be viewed as a form of insurance. They mess up and you sue. You mess up… ????

Slavik Markovich’s The Next Challenge for Database Security: Virtualization and Cloud Computing article of 4/7/2010 for Database Trends and Applications April 2010 issue begins:

It's hard enough to lock down sensitive data when you know exactly which server the database is running on, but what will you do when you deploy virtualization and these systems are constantly moving?  And making sure your own database administrators (DBAs) and system administrators aren't copying or viewing confidential records is already a challenge - how are you going to know whether your cloud computing vendor's staff members are using their privileges inappropriately?  These are just two of the obstacles that any enterprise must overcome in order to deploy a secure database platform in a virtual environment, or in the cloud. In some cases, these concerns have been preventing organizations from moving to virtualization or cloud computing.

Security in a Dynamic Systems Environment

Whether we're talking about your own VMware data center or an Amazon EC2-based cloud, one of the major benefits is flexibility.  Moving servers, and adding or removing resources as needed, allows you to maximize the use of your systems and reduce expense.  But it also means that your sensitive data, which resides in new instances of your databases, is constantly being provisioned (and de-provisioned). While you gain flexibility, monitoring data access becomes much more difficult.  If the information in those applications is subject to regulations like the Payment Card Industry Data Security Standard (PCI DSS) or the Health Insurance Portability and Accountability Act (HIPAA), you need to be able to demonstrate to auditors that it is secure.

As you look at solutions to monitor these "transient" database servers, the key to success will be finding a methodology that is easily deployed on new virtual machines (VMs) without management involvement.  Each of these VMs will need to have a sensor or agent running locally - and this software must be able to be provisioned automatically along with the database software, without intrusive system management such as rebooting whenever you need to install, upgrade or update the agents.  Even better, if it can automatically connect to the monitoring server, you'll avoid the need to constantly reconfigure to add or delete servers from the management console.  The right architecture will allow you to see exactly where your databases are hosted at any point in time, and yet centrally log all activity and flag suspicious events across all servers, wherever they are running. …

Bill Brenner reports from the SaaS Connect Conference “SaaS, Security and the Cloud: It's All About the Contract” post to Network World’s Security blog of 4/7/2010:

The term Software as a Service (SaaS) has been around a long time. The term cloud is still relatively new for many. Putting them together has meant a world of hurt for many enterprises, especially when trying to integrate security into the mix.

During a joint panel discussion hosted by CSO Perspectives 2010 and SaaScon 2010 Wednesday, five guys who've been there sought to help attendees avoid the same ordeal. Perhaps the most important lesson is that contract negotiation with providers is everything. The problem is that you don't always know which questions to ask when the paperwork is being written.

Panelists cited key problems in making the SaaS-Cloud-Security formula work: SaaS contracts often lack contingency plans for what would happen if one or more of the companies involved suffers a disruption or data breach. The partners -- the enterprise customer and the vendors -- rarely find it easy to get on the same page in terms of who is responsible for what in the event of trouble. Meanwhile, they say, there's a lack of clear standards on how to proceed, especially when it comes to doing things in the cloud.

Add to that the basic misunderstandings companies have on just what the cloud is all about, said Jim Reavis, co-founder of the Cloud Security Alliance.

"It's important we understand there isn't just one cloud out there. It's about layers of services," Reavis said. "We've seen an evolution where SaaS providers ride atop the other layers, delivered in public and private clouds."

Somewhere in the mix, plenty can go wrong. …

Jay Heiser asks It’s 11PM, do you know where your data is? in this 4/7/2010 post to the Gartner blogs:

Every evening for several decades, a number of American television stations announced that it was 10pm, and asked the public service question “Do you know where your children are?”  Anyone using a cloud computing service should be asking the same question about their data.

Over the next few months, I’m going to be researching an area of cloud computing risk that hasn’t received adequate attention: data continuity and recovery.

Theoretically, the cloud computing model should be a resilient one, and a number of vendors claim that their model is built to automatically replicate data to an alternate site, protecting their customers from the risk of hardware failure, or even site failure. I have no trouble believing this.

What I do have trouble with is accepting unsubstantiated vendor claims that this is a more reliable mechanism than anything I can do for myself. There is no perfect mechanism for backing up data, but if I choose to be responsible for backing up my own data, I’ve got quite a bit of useful knowledge about the reliability of the mechanisms I choose, and the degree to which the processes are performed.  I can verify the integrity and completeness of the copies, I can store them offsite and post armed guards, and I can periodically test to ensure that restoration is possible.  None of this is foolproof, but it can be reliable to whatever degree I desire.

If I choose instead to rely on a cloud service provider, I have no ability to know where the primary data is, let alone have an ability to verify that redundant copies of all my data exist in a different site. I have no ability to know the likelihood that my provider would be able to restore my data in case of an accident, let alone restore something important that I accidentally deleted.

And if my data in the cloud  is being backed up in real time, it raises another significant question: if the original data is corrupted, won’t the same corruption affect the copy?  Mistakes and errors replicate at the speed of the cloud. What if data loss occurs as the result of some sort of cascading failure, or external attack?  Isn’t it reasonable to assume that this would affect all copies of the data?  Traditional backups are inherently more reliable in that offline data is insulated from failure modes that are inherent to realtime online redundancy models.

If you don’t know where your data is, can you confirm that it will be there when you need it most?

What evidence do you have from your provider that their proprietary technology is reliable?

<Return to section navigation list> 

Cloud Computing Events

Resource Plus announces its State of the Cloud 2010 Executive Conference to be held 4/26 through 4/27/2010 at the Seaport World Trade Center in Boston, MA:

Cloud Computing is in the news. But is it in your business plan? Some businesses are already realizing the significant benefits of Cloud Computing—slashing IT costs, boosting efficiency, and enabling new sales capabilities. Others may be reluctant to jump because it’s perceived as new, unfamiliar, and potentially risky.

The State of the Cloud Executive Conference helps you take Cloud Computing from an intriguing idea to a day-to-day business reality:

  • Separate the hype from the true business potential
  • Hear from companies that have made their move to the Cloud
  • Chart your company’s technological and financial future
  • Learn about all aspects of this critical emerging technology
  • Get objective insights from independent industry experts
  • Find out what third-party solution providers have to offer
Spend a day in the Cloud

It all happens at this convenient, one-stop Cloud Computing conference designed for CEOs, CIOs, senior IT executives, and other key decision-makers. Topics covered in our keynote presentations and panel sessions will help you better understand the current Cloud offerings, price points, and total cost of ownership. Plus, leading Cloud Computing service experts will showcase their Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) offerings.

We’ll explore:
  • How Cloud Computing can give your company a competitive edge
  • Scalability and portability for IaaS, PaaS, and SaaS
  • Application and infrastructure interoperability
  • Total cost of ownership and cost optimization scenarios
  • Cloud Computing security/privacy implications and solutions
  • Public vs. private Cloud solutions
  • Preparation and impact to your organizational workflow

Microsoft is a gold sponsor.

Claude (The Cloud Architect) reports Upcoming CloudStorm Editions in this 4/8/2010 post:

CloudStorm will take place at Cloud Expo in New York later this month. It is being held at 10AM on April 19, the morning before Cloud Expo proper opens its doors at the Javits Convention Center.

CloudStorm will run from 10:00AM - 12 noon on Monday, 19th April and includes lightning talks by Cordys, Soasta (Cloudtest), Rightscale, Virtual Ark, Amplidata, A-Server (DAAS.com), Zenith Infotech (Smart Style Office) and more to come!

Register here if you want to attend; you’ll receive a free pass. If you want to speak at this edition, I have 2 speaking slots left: a booth at Cloud Expo is required. If you have that, the speaking slot is offered free of charge.

Also, we still have some speaking slots left for CloudStorm Düsseldorf on May 4th, CloudStorm Amsterdam on May 6th, and Cloud Expo Europe in London on June 16th. Costs are about €1,000 each, which provides a speaking slot, demo table, delegate bag insert, and a full list of attendees including contact details.

More info on www.cloudstorm.org

Rutrell Yasin reports “The Cloud Summit will turn attention to standards for data interoperability, portability and security” in his Feds, industry to hash out cloud standards at May summit article of 4/7/2010 for Federal Computer Week:

The National Institute of Standards and Technology will host a Cloud Summit on May 20 with federal agencies and the private sector, with the intent of developing data interoperability, portability and security standards for cloud computing that can be applied across agencies.

Vivek Kundra, the federal chief information officer, told an audience at the Brookings Institution today that establishing such standards is essential to making full use of cloud computing's potential.

By Aug. 1, NIST officials plan to move forward with initial specifications, which will lead to the launch of a portal for cloud standards where various stakeholders can collaborate online in a cloud environment, Kundra said.

“NIST will convene people around the table, and part of what we want to do is test case studies,” Kundra said during an address on “The Economic Gains of Cloud Computing,” sponsored by Brookings in Washington, D.C. The event was moderated by Darrell West, vice president and director of Governance Studies with Brookings, which released a report entitled “Saving Money Through Cloud Computing.”

See Windows Azure and Cloud Computing Posts for 4/7/2010+ for more details on the Brookings Institution’s conference.

Simon Munro laments the canned production values of typical Windows Azure demos and announces UK AzureNET User Group Redux in his Emerging Azure Rockstars post of 4/7/2010:

I’ve just about had my fill of Azure demos and presentations. After more than a year in beta, it seemed that the only people who could stand up in front of a crowd and talk about Azure were those from Microsoft or their hymn-singing partners. It is not that the presentations and videos are bad, it is just that they are all much the same – either a ‘Hello cloud’ introduction to the platform, a release of new features that have been asked for, or pimping how similar Azure development is to existing .net development.

Most of the presentations that I have seen, although many of them are very good and done by really smart people, have the sheen of marketing snake oil. It all seems too perfect and simple - all clinical, clean and fashionable like CSI Miami, where everything works, rather than messy and grungy like the development world that we have to live in every day. …

Simon continues:

So it is fortunate (and about time) that the first Azure meeting of the year is going to have presentations – not by some big name to attract the usual sheeple that marketing attracts, but by some people that have worked with Azure and delivered something that is real. I have had some chats with Simon Evans, James Broome and Grace Mollison over the last couple of months as they have developed a solution on top of Azure – a solution that would have no business case if it weren’t for the cloud model. I have heard from James as he wrestled the development fabric to fit in with his BDD style, and of the support of Grace in providing a build server for the team using a product that doesn’t have a server version. I watched from a distance as Simon got his head down and dealt with the persistence issues, and had to ignore him after the nth time that he exclaimed how cool and easy the CDN is.

Next week is a busy week for the Microsoft community. There is all the stuff that Microsoft is putting on around the launches of VS2010, SQL Server 2008 R2 and others. There is a Silverlight user group meeting, and I will be presenting at SQLBits on Friday. Thursday night is the turn of UKAzureNet, and even though the venue might be less central, it is being done in a cinema and our new emerging rockstars will be on a stage of sorts. We need your support and attendance to help get it as full as possible – we don’t expect it to be like the opening weekend of Avatar, but hope that we’ll have more than a handful of usual suspects throwing popcorn.

You are guaranteed to hear some interesting stories from the trenches.

You can register here: UK AzureNET User Group: Phoenix from the Flames

<Return to section navigation list> 

Other Cloud Computing Platforms and Services

Lori MacVittie claims “Stateless applications may be the long term answer to scalability of applications in the cloud, but until then, we need a solution like sticky sessions (persistence)” in her Amazon Makes the Cloud Sticky post of 4/8/2010:

Amazon recently introduced “stickiness” to its ELB (Elastic Load Balancing) offering. I’ve written before about “stickiness”, a.k.a. what we’ve called persistence for, oh, nearly ten years now, so I won’t reiterate it except to say, “it’s about time.” A description of why sticky sessions are necessary was offered in the AWS blog announcing the new feature:

Up until now each Load balancer had the freedom to forward each incoming HTTP or TCP request to any of the EC2 instances under its purview. This resulted in a reasonably even load on each instance, but it also meant that each instance would have to retrieve, manipulate, and store session data for each request without any possible benefit from locality of reference.

-- New Elastic Load Balancing Feature: Sticky Sessions

What the author is really trying to say is that without “sticky sessions” ELB breaks applications because it does not honor state. Remember that most web applications today rely upon state (session) to store quite a bit of application and user specific data that’s necessary for the application to behave properly. When a load balancer distributes requests across instances without consideration for where that state (session) is stored, the application behavior can become erratic and unpredictable. Hence the need for “stickiness”. …

Lori continues with “WHY is THIS IMPORTANT?” and “THE NECESSARY EVIL of STATE” sections.
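If the mechanics of stickiness are unfamiliar, here's a minimal C# sketch of the general technique — this is not Amazon's implementation, and the instance names and session keys are made up:

using System;
using System.Collections.Generic;

// Sticky-session routing sketch: a session seen before goes back to the
// instance that first served it, so server-side session state stays local.
class StickyRouter
{
    private readonly IList<string> instances;
    private readonly Dictionary<string, string> pinned = new Dictionary<string, string>();
    private int next;

    public StickyRouter(IList<string> backendInstances)
    {
        instances = backendInstances;
    }

    public string Route(string sessionId)
    {
        string instance;
        if (pinned.TryGetValue(sessionId, out instance))
            return instance;                            // sticky: same instance as last time

        instance = instances[next++ % instances.Count]; // new session: round-robin
        pinned[sessionId] = instance;                   // remember the assignment
        return instance;
    }
}

Real load balancers usually carry the pin in a cookie rather than in the balancer's memory (ELB's new feature works with cookies), but the routing decision is the same.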

Lydia Leong describes Cogent’s Utility Computing service in this 4/8/2010 post:

A client evaluating cloud computing solutions asked me about Cogent’s Utility Computing offering (and showed me a nice little product sheet for it). Never having heard of it before, and not having a clue from the marketing collateral what this was actually supposed to be (and finding zero public information about it), I got in touch with Cogent and asked them to brief me. I plan to include a blurb about it in my upcoming Who’s Who note, but it’s sufficiently unusual and interesting that I think it’s worth a call-out on my blog.

Simply put, Cogent is allowing customers to rent dedicated Linux servers at Cogent’s POPs. The servers are managed through the OS level; customers have sudo access. This by itself wouldn’t be hugely interesting (and many CDNs now allow their customers to colocate at their POPs, and might offer self-managed or simple managed dedicated hosting as well in those POPs). What’s interesting is the pricing model.

Cogent charges for this service based on bandwidth (on a Mbps basis). You pay normal Cogent prices for the bandwidth, plus an additional per-Mbps surcharge of about $1. In other words, you don’t pay any kind of compute price at all. (You do have to push a certain minimum amount of bandwidth in order for Cogent to sell you the service at all, though.) …
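To make the model concrete with made-up numbers (only the roughly $1 per Mbps surcharge comes from the briefing): if the base bandwidth price were, say, $4 per Mbps, a 100 Mbps commitment would run about ($4 + $1) × 100 = $500 per month, with no separate charge for the rented servers themselves.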

Mike Kirkwood reports This Tweet is Priority 1: SalesForce.com's Chatter is Transactional Social Media in this 4/8/2010 post to the ReadWriteCloud:

Soon, Twitter users will be in a better position to get satisfaction from the companies that they do business with. This morning, SalesForce.com is announcing that the Chatter beta developer preview has grown to 500 companies and is integrated with its popular Service Cloud offering. The company has shown its ability to leverage the disruption of social media - rather than be disrupted by it.

We had a chance to review the new tools and experience what an end-to-end social media driven customer experience looks like. It was eye-opening for us - and it is coming soon to the 70,000-plus customers of the SalesForce platform.

The first thing we learned in our briefing with SalesForce is that the company has fully digested the reality of the new web. The company talks about how it started on a mission to bring the power of great web applications like Amazon.com to enterprise customers. Now, ten years later, the web and the company have moved on toward the new dominant engagement models on the web: Facebook, YouTube, and Twitter. …

<Return to section navigation list>