A First Look at Some Metrics Numbers

Written by Ash Maurya

Last time I shared my conversion dashboard and promised some numbers. I don’t have all the numbers yet, but I have enough to start identifying some actionable next steps.

First, some tools discussion is in order

I had been using Mixpanel and evaluating KISSmetrics for my metrics data. While I still see lots of possibilities using Mixpanel for application level metrics (e.g. what features are really getting used), I find KISSmetrics a lot more aligned with my goals for conversion metrics.

Here’s why. First, some common ground between the two: both are near real-time, both are event-driven, and both let you define custom properties on events (plan type, operating system version, browser version, etc.).
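For illustration, and assuming I have the call signatures right, recording an event with custom properties looks roughly like this in each tool's JavaScript API (the event and property values below are made up):

// Hypothetical example: event and property names are illustrative, not CloudFire's actual events.
// Mixpanel (mpmetrics JS library): properties go in as a second argument.
mpmetrics.track('Signed Up', { 'plan type': 'pro', 'os': 'Mac OS X 10.6', 'browser': 'Firefox 3.5' });
// KISSmetrics: same idea, record an event with an optional properties object.
KM.record('Signed Up', { 'Plan Type': 'pro', 'OS Version': 'Mac OS X 10.6', 'Browser': 'Firefox 3.5' });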

But here’s where KISSmetrics really shines:

Ad-hoc funnel reports
While both Mixpanel and KISSmetrics use events to construct funnel reports, Mixpanel assumes all funnels are linear and fixed. You have to pre-define the exact sequence of steps upfront, then hardcode them into your pages.

For example, to record a “Created Gallery” event during a “Signup Flow” for funnel analysis, you would generate an event like this:

mpmetrics.track_funnel('Signup Flow', 3, 'Created Gallery');

where 3 identifies the third step in your funnel. Pre-defining funnels like this is fragile. If the event occurs out of order, it isn’t counted. If you need to add another event to uncover more detail, you have to touch all the events that follow it in the flow.
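To make that fragility concrete, here's a hypothetical before/after: inserting a new step into the middle of a Mixpanel-style funnel forces you to renumber every call that follows it (the "Viewed Tour" step is made up for illustration):

// Before: 'Created Gallery' is hardcoded as step 3 of the 'Signup Flow' funnel.
mpmetrics.track_funnel('Signup Flow', 3, 'Created Gallery');
// After inserting a hypothetical 'Viewed Tour' step, every downstream
// call has to be found and renumbered by hand.
mpmetrics.track_funnel('Signup Flow', 3, 'Viewed Tour');
mpmetrics.track_funnel('Signup Flow', 4, 'Created Gallery');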

KISSmetrics on the other hand simply collects events that can be later assembled into one or more ad-hoc funnel reports.

You would code the same “Created Gallery” event as:

KM.record("Created Gallery");

You don’t need to identify a step number or funnel for this event. This sort of late composition decouples the raw event data from how it is eventually used in ad-hoc funnel reports, which is incredibly flexible.

What I also found, a little to my surprise, is that CloudFire’s conversion funnel is linear but not fixed. People generally move from the top of the funnel toward the bottom, but they can jump in at multiple points. I’ve had users click through directly to the signup page from another site without going through the landing or pricing pages at all. This wouldn’t get recorded at all in a Mixpanel-style pre-defined funnel since it wouldn’t start at Step 1.

Ability to identify users
Another challenge when collecting metrics is tracking users consistently across sessions. Most analytics tools use unique cookies, but those break down across browsers or multiple computers. CloudFire, being a downloaded p2web app, has the additional requirement of multi-domain support. KISSmetrics offers a simple (almost too simple) solution for tracking users across any of these scenarios.

Before a user’s identity is known, such as pre-signup, KISSmetrics tracks users using a unique cookie (like other analytics tools). But once the identity is known, such as post-login, you can call a

KM.identify("$user_identifier");

method and tell it exactly who the user is, using a persistent identifier such as a username or email address. All pre-login data is then merged with post-login data into a single record.

This way events are really tied to people and much more meaningful.
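Putting record and identify together, here's a minimal sketch of how I understand the merge to work (the email address is made up; the event names mirror the ones above):

// Pre-signup: events accumulate under an anonymous cookie identity.
KM.record("Viewed Pricing Page");
// Post-signup/login: tie that anonymous history to a persistent identifier.
KM.identify("jane@example.com"); // hypothetical identifier
// Later events, even from another browser or machine once identify() is
// called there too, attach to the same person record.
KM.record("Downloaded App");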

1-Page Reports
I really like the 1-page report visualization, which you’ll see a little later. I do, however, wish they made their dashboard a little more useful.

Mine currently looks like this:

[Image: KISSmetrics dashboard]

You might recognize it as my AARRR conversion dashboard. All it’s missing are conversion percentages next to each funnel.

On to the numbers

As I mentioned last time, I am laying the foundation for measuring all the metrics but only focusing on optimizing Activation and Retention initially. As each of the AARRR metrics is itself a sub-funnel, I decided to break them out into separate funnel reports rather than build one giant report that would be a nightmare to maintain in the long run.

The ability to create ad-hoc funnel reports, discussed earlier, allows me to start measuring everything at the macro level and add more detail to drill into the sub-funnels as needed for optimization.
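In practice, that just means dropping a single KM.record call at each macro step now and layering in finer-grained events later only where I need to drill down. A rough sketch (event names approximate my dashboard; placement is illustrative):

// Macro-level AARRR events: one call per stage; funnels are assembled later in ad-hoc reports.
KM.record("Viewed Landing Page");    // Acquisition (top of funnel)
KM.record("Viewed Pricing Page");    // Acquisition (engaged)
KM.record("Signed Up");              // Activation (start of sub-funnel)
KM.record("Downloaded App");         // Activation
KM.record("Created First Gallery");  // Activation (happy first-user experience)
// Retention, Referral, and Revenue events get wired up the same way as I get to them.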

Acquisition Report

I define Acquisition (user engagement) as a visitor who doesn’t abandon and visits the pricing page (usually from the landing page). I am collecting KM events on my landing and pricing pages and have created a funnel report that looks like this:

[Image: Acquisition funnel report]

A 35% conversion (or 65% bounce rate) isn’t particularly great but it’s good enough to drive meaningful traffic to validate my MVP for now (scaling comes after product/market fit). Apart from my earlier adventures in SEM, I have not spent any more money on paid channels, and am instead investing time building up some viable free channels (SEO, blogs) in parallel.

Activation Report

I define Activation (happy first-user experience) as a sub-funnel made up of the following steps: Sign-up, Download App, Create First Gallery.

The first 2 steps occur on the CloudFire website, while the last is done in the downloaded application. Since I just started using KISSmetrics, I haven’t finished integrating it with the app and am relying on an earlier custom database report I created to measure “Created First Gallery”.

Here’s what the KISSmetrics Activation funnel looks like without the last step:

[Image: Activation funnel report]

You’ll see what I meant by CloudFire’s conversion funnel being linear but non-fixed: 11 new people viewed the sign-up form, and 8 new people downloaded the app without starting from the top of the funnel.

Supplementing this, as closely as possible, with my own custom report for the number of users who successfully “Created First Gallery” lowers the Activation conversion rate from 11.5% to 5.3%.

This is where I currently am on the numbers. I will be finishing up data collection for Retention, Referral, and Revenue this week but just the Activation numbers already reveal a number of potential hot spots.

Normalized Conversion Dashboard

[Image: Normalized conversion dashboard]

I like to visualize my conversion funnel as a percentage of total visitors and have normalized the numbers to reflect that. I’ve also blown up the Activation row to show the full Activation sub-funnel since I’m highly motivated to optimize that right now. As I don’t have the other numbers yet, I’m not bothering with showing them for now.
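The normalization itself is simple arithmetic: each stage’s count divided by total unique visitors. A quick sketch with hypothetical counts:

// Normalize raw funnel counts to a percentage of total visitors (all numbers are hypothetical).
var totalVisitors = 1000;
var funnel = {
  "Acquisition (viewed pricing)": 350,
  "Signed Up": 52,
  "Downloaded App": 24,
  "Created First Gallery": 11
};
for (var step in funnel) {
  console.log(step + ": " + (funnel[step] / totalVisitors * 100).toFixed(1) + "%");
}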

These numbers immediately indicate that the Activation process is NOT healthy.

While all three steps exhibit leaky buckets, of particular concern to me was losing more than half the people who chose to download the app but couldn’t successfully finish the first task of creating a gallery. That’s where I decided to start.

It’s not that the others aren’t as important, but they seem more of an optimization problem (pricing, sign-up form, copy, etc.) than a fundamental failing of the MVP (software) itself. Plus, people who made it to the last step successfully navigated the previous steps, so there is some added rationale in starting from the bottom of the funnel.

Diagnosing downloads

Despite having an easy way to contact us (800 number, email, GetSatisfaction) on every page of the sign-up process, most people did not choose to contact us when things went wrong, which placed the burden of figuring out what went wrong on us.

[Image: Sign-up page]

The first step was being able to identify users that downloaded the app. I used to allow users to download the application from the website and complete the signup process from within the app. The idea behind that was to reduce friction and make the app self-contained so it could be distributed from other websites (like, say, download.com). However, if a user had a problem with the installation, there was no way of knowing. So I reordered the flow so that users create an account first on the website, then download the app. That way we have their email address and can contact them if needed.

The second step was being able to identify as quickly as possible if users ran into an issue. It was fairly easy to construct a report (sketched below) that found users who signed up but didn’t finish creating their first gallery within a reasonable timeframe. I sent them all personalized emails (this has since been automated) and happily many replied back. Some downloaded the installer but didn’t know to run it (how do I fix that?). Others had issues with the installation itself, which they shared. No one, so far, has had issues creating a gallery once the app was installed and launched.
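A minimal sketch of that report’s logic (the data shape, field names, and 48-hour cutoff below are assumptions for illustration, not my actual schema):

// Find users who signed up but haven't created a first gallery within a cutoff window.
// users: [{ email, signedUpAt, createdFirstGalleryAt }] with timestamps in ms -- hypothetical shape.
var CUTOFF_MS = 48 * 60 * 60 * 1000; // assumed "reasonable timeframe" of 48 hours
function findStalledSignups(users, now) {
  return users.filter(function (u) {
    return !u.createdFirstGalleryAt && (now - u.signedUpAt) > CUTOFF_MS;
  });
}
// Each stalled user then gets a personalized "can we help?" email (now automated).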

Downloaded apps, in general, are a challenge. The desktop, with multiple operating system versions, java versions, anti-virus programs, NATs and firewalls, is a pretty hostile environment for a new networked application.

One of the issues I uncovered was a nasty shortcut Apple took in force-migrating everyone from 32-bit Java 5 to 64-bit Java 6 in Snow Leopard, using a symbolic link pointing Java 1.5 -> Java 1.6 (WTF!). This broke CloudFire. Fixing it actually required upgrading a third-party component (Eclipse), which required rewriting the software update process (now using P2/OSGi) and my continuous deployment process (future post). Other issues had to do with 64-bit versus 32-bit on Windows and bad pre-existing Java installations.

What I’ve found is that, in the end, users WILL encounter unanticipated problems, because you can only test so many desktop/browser configurations (at least until you can afford to run all of them). The key is to identify users who run into problems as quickly as possible and then try to engage them directly with an offer of help, gift certificates, extended trials… whatever it takes to get them to talk to you, as they hold the answers (actually, they hold the problems; it’s up to you to uncover the answers).

I’ve also started running some more usability tests on the download process. They haven’t revealed anything as significant as the issues already uncovered, so maybe I’ll start seeing some improvement in those numbers soon.

What’s Next

Completing the rest of the conversion dashboard, prioritizing other areas in Activation/Retention that need addressing, A/B testing, usability testing, and customer follow-up interviews.

  • http://www.twitter.com/aainslie Alexander Ainslie (@AAinslie)

    Ash,
    Great post. Thanks for sharing.
    I just discovered http://www.LotusJump.com via @Szetela – seems like a cost effective tool to manage and extend marketing reach.
    Keep the good stuff coming. It’s better and more useful than watching a reality show! ;)

  • Garret Tadlock

    I have been reading your blogs for the past few months and really appreciate you taking the time to track your progress. I’m working on the process as well, just a few steps back and find your posts helpful in setting a guide and then re-aligning specific details to fit my target market.

  • http://www.ashmaurya.com/ Ash Maurya

    Garret –

    I’m happy the posts have been helpful. Writing really helps me crystallize my thinking and forces accountability.
    So thank you for reading..

  • http://www.twitter.com/MichaelZipursky Michael Zipursky

    Ash – your posts kick things up a notch and are stronger than a mug of espresso. Great stuff!

  • http://www.ashmaurya.com/ Ash Maurya

    A mug of espresso has a lot of kick…Thanks for the compliment Michael…

  • Chris Hawkins

    Ash, good discoveries! I’m going through this with an IT tool targeted at system administrators. So far I have had a similar experience where people will sign up on the website, I capture their email address and name, and then they download the installer. But then there is a VERY strong tendency for them to not ask for help when they encounter problems. I am struggling with this issue and I think I will try building some anonymous communications into my installer…

    Something like a ping when they start an install, a ping when they finish certain steps, and then another when the installer completes successfully. I’m feeling the need to know if users are even attempting installs, if they are trying and failing, and if so, where, etc. Your thoughts?

Also – you mentioned changing your process so that you get an email before allowing a download. I took that a step further and put a quick one-page survey in-line with the download link, so before they can get the installer they have to tell me some demographic information that I really want. It only takes them a few seconds and this helped me tremendously in figuring out who was responding to my attempts to drive traffic, and it did not appear to deter anyone from completing the download.

  • http://www.ashmaurya.com/ Ash Maurya

    Chris –

    I got a similar suggestion from Niall Smart (from http://www.echodio.com) to build some more visibility into differentiating “installer downloads” from “installer launches”. I agree and have contemplated doing that before. The challenge, for us, comes in building the right hooks from the installer to the back-end metrics system. We are using NSIS for Windows installs, and PackageManager for Mac OS X installs. I think I’ve figured out a way to do this. Are you using a 3rd party installer or a custom one?

    In the right environment, Mac OS X + Safari web browser, launching the installer isn’t a problem as Apple auto-launches install packages. This is certainly a minority case but worth mentioning.

    I haven’t had much experience with surveys and have always erred on the side of collecting less versus more at sign-up. I want to start with some installer events and maybe a/b test a sign-up form with a survey.

    Cheers

  • http://www.arieldistefano.com/ Ariel Di Stefano

    Great post Ash.

    Regarding the download process, I still believe, based on our own experience and numbers, that the download button has to be the call to action button on the homepage / landing page.

  • http://www.ashmaurya.com/ Ash Maurya

    Thanks Ariel –

We had started with that but wanted to isolate the effect of a downloadable app from the UVP on the landing page. It will make sense to revisit at some point (once a baseline is established) and A/B test its impact again.

  • Free Faler

    Hi Ash,
    thanks for the great stuff you’re writing here. Really helpful!
I was just wondering: how did you calculate Est. value (not cost!) in the Dashboard?

  • http://www.ashmaurya.com/ Ash Maurya

    Hi Free Faler –

    Right now, these are really just based on gut-feel estimates on the value I would place on each of these events, working backwards from the one real dollar event which is Revenue. CloudFire is priced at $50/yr so that’s the starting (or end) point. Then I just discounted from there. Some people choose to use lifetime value of a customer (which assuming 3 yrs would be $150) but I decided to keep things simple for now.

    Once I get all my conversion numbers, then I can actually do the math based on that. So for instance, assuming that 1% of visitors that viewed my pricing page (Acquisition) convert to paying customers (Revenue), the value of Acquiring a user would then be $50*1% = $0.50

  • http://www.reemer.com/ kareem

    ash-

    loving the detailed posts. laying out your thinking as you go through customer discovery and validation gives me new ideas about how to go about the same processes on my biz. please keep up the great posts!

    kareem

  • http://www.ispionage.com/ Leon

    Ash,
    I stumbled on your blog a couple of days ago from Eric Ries’s blog, and can’t stop reading it now!
Great content with many actionable insights. Keep on posting Mr.

    -Leon

  • http://www.flowtown.com/ Dan Martell

    Ash, great post and glad to see such a detailed post on flow analysis and conversion metrics. That being said, this is my favorite paragraph:

    What I’ve found is that, in the end, users WILL encounter unanticipated problems, because you can only test so many desktop/browser configurations (until you can afford running all of them). The key is to be able to identify users that run into problems as quickly as possible and then try to engage them directly with an offer of help, gift certificates, extended trials… whatever it takes to get them to talk to you as they hold the answers (actually they hold the problems, it’s up to you to uncover the answers).

    Thanks for sharing.
