Continuing the Cerebrata journey, in this blog post I will describe how we built our 2nd product – Azure Diagnostics Manager.
History
During the public beta phase of Cloud Storage Studio (CSS) before PDC 2009, a number of our users asked us to build the capability of visualizing their cloud application’s diagnostics into that application. One of our tasks for one of the public beta builds of CSS was to include this capability. If you were involved with Windows Azure around that time, you may recall that the diagnostics capability in Windows Azure was pretty rudimentary and rather clumsy. Determined to implement this functionality, I shot off an email to all our users saying that we would be working on providing it in the next public beta release of CSS. A big thanks to Steve Marx, who sent me a timely email telling me not to do this. He told me that Microsoft was changing the way diagnostics was handled in Windows Azure and that more information would be revealed during PDC. Until that point, Steve had never responded to my emails [he may have set a filter in his Outlook to automatically move my messages to the “Junk” folder], so when he did respond, I took it very seriously and waited for PDC.
During PDC, Matthew Kerner demonstrated the new diagnostics functionality and, to be honest, I (and, I would think, everybody else) was completely blown away. They basically brought managing diagnostics in Windows Azure on par with managing diagnostics in on-premises applications. They gave us Event Logs, Performance Counters, Trace Logs, IIS Logs, IIS Failed Request Logs (stuff we’re familiar with), and the neat thing was that you didn’t have to learn new tricks to capture all this information. For example, if you were using Trace.WriteLine to trace information in your on-premises code, you would use exactly the same call to trace information in Windows Azure. This again is an important lesson for all of us developers – just because you’re building something new, don’t make your users learn new ways to do things they’re already familiar with.
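To make that concrete, here is a minimal sketch (my own illustration, not code from the original product or post): the same System.Diagnostics tracing call works unchanged inside a Windows Azure role, provided the Azure diagnostics trace listener is configured in the role’s app.config/web.config, and the collected entries end up in the WADLogsTable in Table Storage.

```csharp
using System.Diagnostics;

public class OrderProcessor
{
    public void Process(string orderId)
    {
        // Exactly the same call you would make in an on-premises application.
        // In a Windows Azure role, the diagnostics trace listener (wired up in
        // the role's configuration file) captures it, and the diagnostics agent
        // periodically transfers it to the WADLogsTable in Table Storage.
        Trace.WriteLine("Processing order " + orderId, "Information");
    }
}
```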
Anyway, PDC came and went, we got busy with releasing Cloud Storage Studio, and we kind of put this whole thing on the back burner.
We revisited this again in January but were absolutely clueless as to how we were going to build it. We knew that the diagnostics data was being stored in Windows Azure Table Storage and Blob Storage, but that was about it. We didn’t know how to fetch this data, let alone present it. Then we ran into a blog post.
The Blog Post
As mentioned above, we were clueless as to how we were going to build the application. Emmanuel Huna wrote a blog post about visualizing Windows Azure Diagnostics data, which you can read here: http://blog.ehuna.org/2009/12/visualizing_windows_azure_diag_1.html. Basically, he (and his team) built an application which fetched the data (performance counters only) from Windows Azure Table Storage and saved it in a SQL Server database. He then wrote an application which fetched the data from that SQL Server database and presented it in the form of really nice charts.
You could call it the inspiration or the missing clue, but it gave us the idea of how we were going to build this application.
Development
Having built Cloud Storage Studio and made it commercially available made the development of this product a lot easier. Since Windows Azure Diagnostics data was saved in Table and Blob Storage and we already had our library to manage that data, the only task we were left with was figuring out how to present this data to the user.
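To give a flavor of what fetching this data involved, here is a rough sketch (my own, not Cerebrata’s actual code; the entity class and variable names are made up for illustration). Windows Azure Diagnostics wrote trace entries to a table named WADLogsTable, which could be queried with the storage client library of the time roughly like this:

```csharp
using System;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Illustrative entity for rows in WADLogsTable; the property names follow the
// WAD trace log schema (Role, RoleInstance, Message, EventTickCount, ...).
public class WadLogEntry : TableServiceEntity
{
    public string Role { get; set; }
    public string RoleInstance { get; set; }
    public string Message { get; set; }
    public long EventTickCount { get; set; }
}

public static class DiagnosticsReader
{
    public static void DumpLogsSince(string connectionString, DateTime fromUtc)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var context = account.CreateCloudTableClient().GetDataServiceContext();

        // WAD builds the PartitionKey from "0" followed by the event's tick
        // count, so a time-range filter becomes a simple string comparison.
        string fromKey = "0" + fromUtc.Ticks;

        var entries = context.CreateQuery<WadLogEntry>("WADLogsTable")
            .Where(e => e.PartitionKey.CompareTo(fromKey) >= 0);

        // A real implementation would also handle continuation tokens and
        // paging; this simply enumerates what the first query returns.
        foreach (var entry in entries)
        {
            Console.WriteLine("{0} | {1:u} | {2}", entry.RoleInstance,
                new DateTime(entry.EventTickCount, DateTimeKind.Utc), entry.Message);
        }
    }
}
```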
We started working on this product in late January 2010 and were out with our 1st closed private beta version towards the end of February 2010. The application was not complete by any stretch, but it had enough functionality that users could start using it. In the 1st cut, we had support for viewing Event Logs, Performance Counters, Trace Logs, Infrastructure Logs, IIS Logs and IIS Failed Request Logs.
This is how the application looked in the first private beta version:
Designing the User Interface
One important thing we realized during the development of Cloud Storage Studio is how much it matters to understand what users are most familiar with. Since Microsoft did a great job of keeping the way diagnostics data is captured the same as in on-premises applications, we ensured that we would do the same. Our event logs viewer looked and worked the same way as the event viewer on the desktop, we tried to mimic the PerfMon utility for displaying performance counters data, and so on. That familiarity helped drive wider adoption of this product. Take a look at the Event Logs viewer in the application below:
Versions, Versions and More Versions
When we made the 1st private beta release of the application, we knew that it was not ready and that more features needed to be added. So over the course of the next month or so, we made a number of releases. These obviously added some of the features we had initially left out (like the ability to view Crash Dumps, perform On-Demand Transfers and do Remote Diagnostics Management), but we also implemented a lot of user feedback. Here’s an example of the changes we made in the 2nd private beta release:
Throughout the beta phase (and afterwards as well), we remained in touch with the users and told them what we had done and what we were doing next. That gave users a clear idea of where the product was going, and if they wanted to add some items to our product pipeline, they could do so.
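As an aside, to give a flavor of what “Remote Diagnostics Management” (mentioned a couple of paragraphs up) involved under the hood, here is a rough sketch of my own – not Cerebrata’s code – assuming the Microsoft.WindowsAzure.Diagnostics.Management API of the SDK at the time. An external tool could rewrite a running role instance’s diagnostics configuration through the deployment’s storage account, for example to start collecting a CPU counter:

```csharp
using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.Diagnostics.Management;

public static class RemoteDiagnostics
{
    // Adds a CPU performance counter for every instance of a role and asks the
    // diagnostics agent to push the collected data to Table Storage every 5 minutes.
    public static void EnableCpuCounter(string storageConnectionString,
                                        string deploymentId, string roleName)
    {
        var account = CloudStorageAccount.Parse(storageConnectionString);
        var deploymentManager = new DeploymentDiagnosticManager(account, deploymentId);

        foreach (var instanceManager in
                 deploymentManager.GetRoleInstanceDiagnosticManagersForRole(roleName))
        {
            DiagnosticMonitorConfiguration config = instanceManager.GetCurrentConfiguration();

            config.PerformanceCounters.DataSources.Add(new PerformanceCounterConfiguration
            {
                CounterSpecifier = @"\Processor(_Total)\% Processor Time",
                SampleRate = TimeSpan.FromSeconds(30)
            });
            config.PerformanceCounters.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

            // Write the new configuration back; the diagnostics agent on the
            // instance picks it up and starts collecting the counter.
            instanceManager.SetCurrentConfiguration(config);
        }
    }
}
```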
All in all we made 3 or 4 releases in closed private beta over the span of 2 months or so, and then we went for public beta. We kept the public beta much, much shorter than the one for Cloud Storage Studio, and finally, towards the end of May, we made the application commercially available. With Cloud Storage Studio this cycle was close to 7 months, but here it was just a little over 3 months. There were a few reasons for that:
- Compared to Cloud Storage Studio, this application was less complicated. Also, the breadth of features covered by Cloud Storage Studio was much greater than what we needed to cover in this application.
- Most of the groundwork (especially the wrappers around the REST API and the WPF codebase) had already been done during the development of Cloud Storage Studio, which made our job much easier with this application. You could say that all the R&D we did during the 1st product paid off in this product.
- Most importantly, we felt that we were ready to go live with this application. We knew that we had to add more and more features, but all in all the application was stable and doing what it was supposed to do. And that was good enough for us to proceed.
Going Live
When we went live with Cloud Storage Studio, we had to do a lot of groundwork. We needed to figure out licensing, a payment processor, pricing, etc., but since we had already figured those out, we didn’t have much to worry about this time. It was a rather painless release.
For the initial release we decided to price it at $70.00 ($69.99 to be precise). The reasoning was that it was kind of a specialized application and folks wouldn’t mind paying a bit extra for it.
We kept everything else the same. One would get to try it out for 30 days, and after that the application would switch into “Developer” mode, where users would only be able to connect to the Windows Azure Storage Emulator.
1st License Sale
As with Cloud Storage Studio, we were pleasantly surprised when we got our 1st license sale on the very first day (31st of May 2010). Secretly (greedily) I had been hoping that this would happen, and luckily it did. I was in the process of sending out an email to our existing users about the availability of this application when we got our first order. Again, I was extremely elated! We ended up selling 4 licenses on the very first day and a little over 60 licenses the next month!!!
Party
With Cloud Storage Studio, we waited for a week or so to celebrate, but with this one we didn’t waste any time. Within a day or so the team went out for a fancy lunch (in the middle of the typical Rajasthani heat).
Post Launch
Things were somewhat more under control with this release. We ended up making just 2 releases in the next month. The main reason was the same one I mentioned above – we had a better handle on things, and the application was not as complex as the previous one.
Foundation for Version 2
Our products were getting more popular day by day, and more and more folks were trying them out and, more importantly, buying them. But we realized early on that the architecture/framework we had in place (or the lack of one) wouldn’t get us far. We had to start rebuilding our applications.
When you’re in the technology business, it is extremely important to keep upgrading and updating – be it your products or your knowledge. Otherwise you simply will not survive.
Summary
In one of the next few posts, I will talk about how we built version 2 of our products and also one of the most important stories – the Red Gate acquisition. I will also share the story of our 3rd product – Azure Management Cmdlets.
So Long and Stay Tuned!!!