mediascience's posterous — http://mediascience.posterous.com

Mon, 15 Aug 2011 00:27:00 -0700
Behind the scenes: A/B testing part 2: How we test - (37signals)
http://mediascience.posterous.com/behind-the-scenes-ab-testing-part-2-how-we-te

A few weeks ago, we shared some of what we’ve been testing with the Highrise marketing page. We’ve continued to test different concepts for that page and we’ll be sharing some of the results from those tests in the next few weeks, but before we do that, I wanted to share some of how we approach and implement A/B tests like this.

Deciding what to test

Our ideas for what to test come from everywhere: from reading industry blogs (some examples: Visual Website Optimizer, ABtests.com), a landing page someone saw, an ad in the newspaper (our long-form experiments were inspired in part by the classic “Amish heater” ads you frequently see in the newspaper), etc. Everyone brings ideas to the table, and we have a rough running list of ideas – big and small – to test.

My general goal is to have at least one, and preferably several A/B tests running at any given time across one or more of our marketing sites. There’s no “perfect” when it comes to marketing sites, and the only way you learn about what works and doesn’t work is to continuously test.

We might be simultaneously testing a different landing page, the order of plans on the plan selection page, and the wording on a signup form. These tests aren’t always big changes, and may only be exposed to a small portion of traffic, but any time you aren’t testing is an opportunity you’re wasting. People have been testing multiple ‘layers’ in their sites and applications like this for a long time, but Google has really popularized the approach lately (some great reading on their infrastructure is available here).

Implementing the tests

We primarily use two services and some homegrown glue to run our A/B tests. Essentially, our “tech stack” for running A/B tests goes like this:

  1. We set up the test using Optimizely, which makes it incredibly easy for anyone to set up tests – it doesn’t take any knowledge of HTML or CSS to change the headline on a page, for example. At the same time, it’s powerful enough for almost anything you could want to do (it’s using jQuery underneath, so you’re only limited by the power of the selector), and for wholesale rewrites of a page we can deploy an alternate version and redirect to that page. There are plenty of alternatives to Optimizely as well – Visual Website Optimizer, Google Website Optimizer, etc. – but we’ve been quite happy with Optimizely.
  2. We add to the stock Optimizely setup a JavaScript snippet that is inserted on all pages (experimental and original) that identifies the test and variation to Clicky, which we use for tracking behavior on the marketing sites. Optimizely’s tracking is quite good (and has improved drastically over the last few months), but we still primarily use Clicky for this tracking since it’s already nicely set up for our conversion “funnel” and offers API access.
  3. We also add to Optimizely another piece of JavaScript to rewrite all the URLs on the marketing pages to “tag” each visitor that’s part of an experiment with the experimental group. When a visitor completes signup, Queenbee – our admin and billing system – stores that tag in a database. This lets us easily track plan mix, retention, etc. across experimental groups (and we’re able to continue to do this far into the future).
  4. Finally, we do set up some click and conversion goals in Optimizely itself. This primarily serves as a validation—visitor tracking is not an exact science, and so I like to verify that the results we tabulate from our Clicky tracking are at least similar to what Optimizely measures directly.
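The tagging described in steps 2 and 3 could look something like the following minimal sketch. All names here (`tagUrl`, the `ab` query parameter, the experiment and variation ids) are illustrative assumptions, not 37signals’ actual code:

```javascript
// Step 3 (sketch): append an experiment/variation tag to a URL so the
// signup flow -- and later the billing system -- can attribute the visitor.
function tagUrl(url, experimentId, variation) {
  var tag = 'ab=' + encodeURIComponent(experimentId + ':' + variation);
  var separator = url.indexOf('?') === -1 ? '?' : '&';
  return url + separator + tag;
}

// In the browser, a snippet like this would run over every link on the page:
//   var links = document.getElementsByTagName('a');
//   for (var i = 0; i < links.length; i++) {
//     links[i].href = tagUrl(links[i].href, 'highrise-landing-5', 'variation-b');
//   }
// Step 2 would similarly pass the same experiment/variation pair to the
// analytics tracker as a custom property on each pageview.
```

Tagging the URL (rather than relying only on a cookie) is what lets the tag survive into the signup record, so cohorts can still be compared on plan mix and retention long after the test ends.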

Evaluating the results

Once we start a test, our Campfire bot ‘tally’ takes center stage to help us evaluate the test.

We’ve set up tally to respond to a phrase like “tally abtest highrise landing page round 5” with two sets of information:

  1. The “conversion funnel” for each variation—what portion of visitors reached the plan selection page, reached the signup form, and completed signup. For each variation, we compare these metrics to the original for statistical significance. In addition, tally estimates the required sample size to detect a 10% difference in performance, and we let the experiment run to that point (for a nice explanation of why you should let tests run based on a sample size as opposed to stopping when you think you’ve hit a significant result, see here).
  2. The profile of each variation’s “cohort” that has completed signup. This includes the portion of signups that were for paying plans, the average price of those plans, and the net monthly value of a visitor to any given variation’s landing page (we also have a web-based interface to let us dig deeper into these cohorts’ retention and usage profiles). These numbers are important—we’d rather have lower overall signups if it means we’re getting a higher value signup.
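The required-sample-size estimate tally reports can be sketched with the standard two-proportion power calculation. This is our assumption of the method – the post doesn’t show tally’s internals – and the fixed z-scores correspond to a two-sided alpha of 0.05 and 80% power:

```javascript
// Sketch of a required-sample-size estimate (assumed method, not tally's
// actual implementation): the standard two-proportion power calculation.
function requiredSampleSize(baselineRate, relativeLift) {
  var zAlpha = 1.96; // two-sided significance level of 0.05
  var zBeta = 0.84;  // statistical power of 0.80
  var p1 = baselineRate;
  var p2 = baselineRate * (1 + relativeLift); // e.g. a 10% relative lift
  var pBar = (p1 + p2) / 2;
  var numerator = Math.pow(
    zAlpha * Math.sqrt(2 * pBar * (1 - pBar)) +
    zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2)),
    2
  );
  // Visitors needed *per variation* to reliably detect the lift.
  return Math.ceil(numerator / Math.pow(p2 - p1, 2));
}
```

For a 5% baseline conversion rate and a 10% relative lift, this works out to roughly 31,000 visitors per variation – which is exactly why stopping a test as soon as it looks significant, rather than running to the planned sample size, inflates the false-positive rate.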

Tally sits in a few of our Campfire rooms, and anyone at 37signals can check on the results of any test that’s running or recently finished, at any time, in just a few seconds.

Once a test has finished, we don’t just sit back and bask in our higher conversion rates or increased average signup value—we try to infer what worked and what didn’t work, design a new test, and get back to experimenting and learning.

Interesting insight into A/B testing at 37S. Mostly relates to the Mktg site, but a good reference for all if we decide to engage in A/B testing for the MS platform.


Wed, 16 Mar 2011 01:22:00 -0700
eXo Platform kicks off “the year of PaaS” and extends enterprise portals to the cloud
http://mediascience.posterous.com/exo-platform-kicks-off-the-year-of-paas-and-e

Yesterday, Gartner declared that 2011 will be the year of Platform as a Service (PaaS). This means we’ll be seeing not only a wave of innovation but also a consolidation of various offerings over the next few years.

Well, we’ve perhaps already witnessed a bit of consolidation, as with VMware’s acquisition of WaveMaker just last week. That acquisition – which came only a few months after CloudBees bought Stax – should naturally strengthen VMware’s PaaS portfolio.

As for innovation, new offerings are also starting to pop up – especially with Amazon Web Services’ launch of Elastic Beanstalk in January. Traditionally an Infrastructure as a Service (IaaS) provider, AWS seems to be clearly making its way towards PaaS as well. The product (still in beta) is designed to let developers upload their application and then keep their hands off while the system automatically handles deployment.

And coincidentally, eXo Platform is also launching a new cloud-based service for Java developers, called eXo Cloud IDE. The Franco-American company is the publisher of a user experience platform for Java, aiming to make Java applications and websites faster to build and easier to deploy. Cloud IDE, which will be in private beta through the second quarter of 2011, is the first of a series of free services to be provided in the eXo cloud services package. This hosted environment facilitates social coding and the collaborative development of gadgets and mashups that can be deployed directly to popular Java PaaS offerings. In the future, the team says it would like to extend support beyond Java to additional languages and frameworks like Rails, .NET, Node.js, and Play.

The company – which recently appointed the former Director of Red Hat France, Yann Aubry, to lead its sales efforts in the EMEA region – has also announced the launch of eXo Platform 3.5, a multi-tenant user experience platform for Java systems. In addition to multi-tenancy and cloud management capabilities, the new platform apparently features a number of improvements making it easier to write, test, and deploy gadgets, mashups, HTML5, and content applications.

The company, founded in 2003, closed a €4 million Series A round with Auriga Ventures and XAnge last year. eXo – which has been called one of the “three musketeers” of French open source (alongside Bonitasoft and Talend) by the French newspaper La Tribune – says that it seeks to provide the shortest and most efficient path to the cloud for Java enterprises with its new offerings. Developers, feel free to chime in.


Website: exoplatform.com
Location: San Francisco, California, United States
Founded: 2003
Funding: $6M

eXo offers the next generation of Java middleware designed for the new era of cloud-based services. The eXo Platform makes Java websites and applications faster to build and easier to deploy, and offers modern features such as content, collaboration,…

Information provided by CrunchBase

See what you think, Phil – any implications for MS? Either from a potential tech/dev perspective, or maybe just in how we see the world...

