Setup for basic A/B tests for the front page #101
Are there bounce rate stats for the Gittip name and for previous designs of the front page? For me the page is definitely not "scannable" - it is hard to say what it is about without reading it thoroughly. |
Proposed text changes for a first experiment (aimed at targeting our Customer A):
The faster we want results of an experiment, the fewer variations we should test. I propose we test one variation with all these (or similar) changes against the current default. We could definitely try a dramatic design change instead (or in addition) but the key is that we test one variation on the page (not just one change), as we otherwise won't have enough first-time traffic to get results in a reasonable amount of time. BONUS: We'll also see any effect on returning users but that isn't what we're trying to test for here. |
@techtonik Current bounce rates are available in GA, yes, so if you have dates for design changes, I can tell you if they seem to have made a significant difference, but without split testing, confidence will be limited. |
👍 |
Funnel:
[funnel screenshot] |
@webmaven May I give you access to Optimizely? |
IRC re: Segment + Optimizely |
Looks like we're going to use Optimizely directly instead of through Segment, because of price. |
@whit537 Yes, please give me access. |
@webmaven I've attempted to do so via LastPass. Please let me know if you don't get an invite or are otherwise stymied. |
We're looking for dramatic changes in behavior if we can find them, but small ones add up too.

Hypothesis: Changing the content (see #101 (comment)) of the front page to appeal to givers rather than receivers will not measurably change the new-visitor bounce rate.

This is the smallest (and simplest) possible experiment I can think of that might have the desired effect. We can, instead, make a more dramatic change to the page. Depending on how dramatic the behavior change is, it may take a while for statistical significance to accumulate (the smaller the effect on user behavior, the more samples are needed to be certain the effect is there). We may also see other behavior changes in the data worth following up, such as a change in which action is taken next.

Note: Unless the results show a rise in bounce rate or some other clearly undesirable effect, I suggest the text change be made anyway (first, avoid harm), as it better aligns with current goals.

Possible follow-up experiments on the front page: changing the layout, text sizes, arrangement of elements, color scheme, call to action, etc. Other pages we could experiment on include the 'about' page, a new 'thank you for signing up' page, and the logged-out view of the receiver profile page (changed to appeal more directly to givers). |
I'm excited to see what happens with this! |
👍 |
Our first A/B test is live! 💃 Went with:
[screenshot of the chosen variation] |
Using 25% as current CTR, here are some sample sizes per Optimizely's calculator:
We're getting about 1,500 new visitors per week. |
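For a rough sense of where sample sizes like these come from, here is a minimal sketch of the standard two-proportion power calculation, using the 25% baseline CTR mentioned above. It is not Optimizely's actual calculator (which uses its own significance and power assumptions, so its numbers will differ), and the function name and example lift values are just illustrative.

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variation(baseline, relative_lift, alpha=0.05, power=0.80):
    """Approximate visitors needed per variation to detect a relative lift.

    Standard normal-approximation formula for comparing two proportions;
    a ballpark only, not a reproduction of Optimizely's calculator.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_power = NormalDist().inv_cdf(power)
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_power) ** 2 * variance / (p2 - p1) ** 2)

for lift in (0.05, 0.10, 0.20):
    n = sample_size_per_variation(0.25, lift)
    print(f"Detecting a +{lift:.0%} relative lift on a 25% CTR: ~{n:,} visitors per variation")
```

The main takeaway is the shape of the curve: halving the lift you want to detect roughly quadruples the required sample.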
sigh Sorry for the "close" noise. :-/ |
What's the flub? |
"Welcome back to Give Weekly Donations!"? |
What did it say before? |
@webmaven - It used to show "Your balance is XXX" |
Looks like the selector used is |
@rohitpaulk @webmaven How about if we only use Optimizely with anonymous visitors: gratipay/gratipay.com#2893? |
I don't like that option. Can we instead change the CSS class on the p elements (p.greeting vs p.subhead, perhaps)? If that change to the templates is deployed, I can target the Optimizely rule more narrowly. Alternately, just change the class of the greeting from div.action to div.greeting to accomplish the same thing, and I may not even have to change the rule. |
Why? It's ready to go and solves the problem for the time being. The purpose here is to run one single A/B test to get a feel for how this all works. Let's not overthink this. |
Because I want to see the effect of the new language on logged in users as well. |
I'm on my way back home at the moment. In about an hour I should be able to give you a glimpse at what the analytics look like inside of Google Analytics for this test. |
Optimizely is not set up to distinguish between logged-in and logged-out users. It isn't even distinguishing between new and returning users. But in Google Analytics we will be able to slice and dice the information much more finely, as long as we don't throw it away. About 1/4 of the user sessions recorded yesterday (GA has a day's delay) were returning users, and presumably some of those are already logged in. If we stop showing the experiment to logged-in users, we'll have invalidated a huge chunk of our data so far, and we'll be very limited in the kinds of questions we can expect to answer with that data after the experiment is concluded. I would really rather not cut ourselves off from potential insights. |
Alternate fix for the flub (untested): https://github.com/gratipay/gratipay.com/pull/2894/files |
Flub fixed in #2894. IRC |
First of three goals is done! Starting to discuss results and next steps in IRC. |
Per the above, we need 15,000 users to confidently measure a 5% increase. We get about 1,500 new users a week, so it would take us roughly another nine weeks. |
I think it is going slower than it could, because we're only able to test users that aren't blocking 3rd-party JS (most ad blocker software, for example, will prevent Optimizely from working). We're pretty sure at this point that the new version isn't worse at least. I suggest we let this run about another week at most. Meanwhile, no need to let this stop us from furthering the A/B infrastructure and doing other tests in parallel. |
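To make the "how long will this take" reasoning above concrete, here is a tiny sketch that turns the earlier numbers (15,000 users needed, roughly 1,500 new visitors per week) into a duration estimate. The share of visitors whose blockers keep Optimizely from loading is a made-up placeholder, not a measured figure.

```python
from math import ceil

def weeks_to_finish(required_visitors, weekly_new_visitors, blocked_share=0.25):
    """Rough weeks needed, discounting visitors who block third-party JS.

    blocked_share is an assumed placeholder fraction, not a measured number.
    """
    eligible_per_week = weekly_new_visitors * (1 - blocked_share)
    return ceil(required_visitors / eligible_per_week)

print(weeks_to_finish(15000, 1500, blocked_share=0.25))  # with the assumed blocker share
print(weeks_to_finish(15000, 1500, blocked_share=0.0))   # if nothing were blocked
```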
@webmaven Are we tracking only new visitors on Optimizely? |
No. That was why the GA integration was important, so we could differentiate between new and returning visitors there. |
Ah. So in that case the percentages we're getting from Optimizely are meaningless, right? We are interested in the effect of this change on the bounce rate for new visitors, correct? |
Not completely meaningless; it was important to note whether there was an effect in general.
That is our primary interest, yes. |
Some data so far: among new users who land on the front page (1,242 users), GA says the new version has a 53.8% drop-off rate, compared to 55.5% for the original. Among new users who land on another page but then proceed to the homepage (228 users), the new version has 51.3% versus 51.2% for the original. When the homepage is the 3rd page visited (129 users), we see a 48% drop-off rate for the new version and 45% for the original. |
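As a quick way to gauge whether a gap like 53.8% vs 55.5% means anything at this sample size, here is a sketch of a plain two-proportion z-test applied to the first pair of figures above. It assumes the 1,242 users split roughly evenly between the two versions, which the comment does not actually state.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(p_a, n_a, p_b, n_b):
    """Two-sided z-test for the difference between two observed proportions."""
    pooled = (p_a * n_a + p_b * n_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# 53.8% drop-off for the new copy vs 55.5% for the original, assuming an
# even split of the 1,242 front-page landings (an assumption, not reported).
z, p = two_proportion_z_test(0.538, 621, 0.555, 621)
print(f"z = {z:.2f}, p = {p:.2f}")
```

Under that even-split assumption the p-value comes out well above conventional significance thresholds, which matches the reading above that the new version is not clearly worse but not yet clearly better either.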
"Win The Internet With A/B Testing" |
One down! :-) |
Currently, the front page has around an 80% bounce rate, possibly because it is targeting receivers. I propose testing our current Hypothesis for 'Customer A' by running an A/B test on the front page content to see if we can move that needle significantly.
To set up a basic A/B test, we'll need the following steps completed as prerequisites:
- Turn on Optimizely integration for Segment: https://segment.com/docs/integrations/optimizely/
- BONUS: Add a 'thank you for signing up' page for people signing in for the first time. (Didn't make it.)

Note: I am not certain that the variations will end up in GA (Google Analytics) so that we can limit the analysis to first-time visitors and segment by variation, but even if not, we should see a statistically significant change, as first-time visitors currently bounce at a ~75% rate, and we can also see Optimizely's own reports.