Google Analytics, 3 of 4: Content Experiments

My third post is about how I completely misunderstood Google Analytics Content Experiments and learned to still be okay with myself. After very briefly (braindeadly, even) skimming explanations of what Content Experiments are and reading through the vue-analytics documentation, I assumed, with way too much certainty, an incomplete set of marching orders and set to marching. I willy-nilly started creating new elements and alternate pages and adjusting my GA event trackers, all while imagining the sweet insights I was about to rake in from the experiment I’d set up for visitors to unwittingly walk into. But I did it all wrong. (It’s okay, I forgive myself.)

What even are “Content Experiments”?

Content Experiments are similar to A/B testing. A/B testing, also known as split testing or bucket testing, is a form of statistical hypothesis testing that compares two versions of a page differing in a single variable. Content Experiments use a related model, A/B/N: instead of just two versions of the same page, you can test up to 10 full versions of a page, each served from its own URL.

The Google Analytics console offers a wizard of sorts to set up your page variants and generate code that must be added to your website to direct traffic to the variant pages according to your configuration. Heh, this may sound simple enough, but I learned it all too late.
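For the curious: the snippet the wizard generates is an opaque blob you paste into the head of the original page, but Google also documents an equivalent Content Experiments JavaScript API that does the same job. Here is a minimal sketch of the redirect logic, assuming the API script has been loaded and using a placeholder experiment ID and made-up variant URLs (not my real ones):

    // Assumes the experiments API was loaded first, e.g. with:
    // <script src="//www.google-analytics.com/cx/api.js?experiment=YOUR_EXPERIMENT_ID"></script>

    // Ask GA which variation this visitor is assigned to (0 = the original page).
    // The choice is stored in a cookie, so returning visitors stay in the same bucket.
    var chosenVariation = cxApi.chooseVariation();

    // Hypothetical variant URLs, for illustration only.
    var variantUrls = ['/', '/variant-1', '/variant-2'];

    // Visitors assigned to a non-original variation get sent to its URL.
    if (chosenVariation > 0 && chosenVariation < variantUrls.length) {
      window.location.replace(variantUrls[chosenVariation]);
    }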

Screenshot of the Content Experiments wizard

Experiment Objective

This tale starts off well. I think my planning steps were pretty solid, even if my implementation was flawed. Good intentions and all, fwiw. Since my overall objective for the site is to get users to actually contribute a recipe to the repository, the Content Experiment objective I planned was to get users to click a “Contribute” button, fill out a form, and submit it.

The variable I decided to isolate was the location and color of the “Contribute” button. Once I could divide traffic into visitors who reached the submission form via one button or the other, I figured I’d have metrics (which I now realize are extremely uninteresting and not useful) on whether blue or coral button pushers are more likely to submit a recipe.

Screenshot of Karl Sayagin home page showing different buttons

What color button pusher are you?

I set up my event trackers and goals, thinking that what I was orchestrating was a Content Experiment™.
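With vue-analytics, each tracker boils down to a one-line event call. A rough sketch of what that looks like (the method name and event category/action strings here are illustrative; the 'contribute-1' and 'contribute-2' labels match the buttons shown in the dashboard screenshot further down):

    // Inside a Vue component, assuming vue-analytics was registered at app
    // startup with Vue.use(VueAnalytics, { id: 'UA-XXXXXXXX-X' }).
    export default {
      methods: {
        // Illustrative name; fires a GA event hit with category, action, label.
        trackContribute (label) {
          this.$ga.event('contribute-button', 'click', label)
        }
      }
    }

Each button then calls the tracker with its own label, e.g. <button @click="trackContribute('contribute-1')">Contribute</button>.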

I did whaaat?!

You may by now have noticed the mistake I made. My setup involved two different components on a single page, both of which take you to the same submission form. Er, actually, each button takes you to a different submission page that looks exactly like the other (how incredibly boring, right?).

It would have been more appropriate for me to create two genuinely different submission forms, add them as variants within the Google Analytics console, and add the necessary code so that visitors would be delivered to one or the other at random. One variant worth trying would be to give submitters the option of crediting themselves when they add a recipe, since some people may prefer not to be anonymous.
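The Content Experiments API can also serve a variation in place instead of redirecting, which would suit a pair of alternate forms like that. A hypothetical sketch for a Vue component, assuming the cx/api.js script from the earlier snippet is loaded and that the two forms live in separate components (both names invented here):

    // Hypothetical component names for the two form variants.
    import AnonymousForm from './AnonymousForm.vue'
    import CreditedForm from './CreditedForm.vue'

    export default {
      components: { AnonymousForm, CreditedForm },
      data () {
        return {
          // 0 = original (anonymous form), 1 = variant (form with optional credit).
          // Falls back to the original if the experiments API failed to load.
          variation: window.cxApi ? window.cxApi.chooseVariation() : 0
        }
      }
    }

The template would then render one form or the other, e.g. <anonymous-form v-if="variation === 0" /> and <credited-form v-else />.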

The cold hard truth

By the time I realized my mistake, it was too late to set up Content Experiments properly for the purposes of this assignment. Still, even without a true Content Experiment, the event trackers I’d added to the two “Contribute” buttons (each linking to its own page) gave me some insight into how the two buttons performed. Based on this test, each button had two users who clicked it and ultimately submitted a recipe (a two-to-two tie).

Screenshot of GA Dashboard showing success of different contribute buttons

The button in the top-right corresponds to ‘contribute-1’, and the main button below the quote corresponds to ‘contribute-2’.
