Project: Content Experiments

The Hunt Continues

This week, I continued tracking the performance of the Analytics Scavenger Hunt and conducted informal user testing to identify areas where the content could be revised to improve the user experience and goal conversion rates.

Informal User Testing

After analyzing week 8 data and project results, I contacted several participants who had completed or nearly completed the hunt. Several other users also volunteered feedback about their experiences with the Scavenger Hunt.

Although most users reported that they found the Scavenger Hunt engaging and were able to progress through the steps with relative ease, two main areas of concern were identified:

  1. Submitting a comment upon completion of the hunt.
  2. Progressing from the “Squeak the Squirrel” post to the next page in the sequence.

I had also noted during my initial data collection that these two steps were not performing as well as the rest of the scavenger hunt.

Submitting Comments

The instructions for the Scavenger Hunt include posting a comment on the landing page, a blog post that marks the start of the exercise.

[Screenshot: GA-SH-CE]

After analyzing the first wave of analytics, I noted that although a good share of users completed the entire course of web pages (and in the correct order), they were not returning to complete the final task in the instructions. In fact, in the first round of testing, only one of the five users who completed the entire sequence returned to post a comment.

Experiment 1

Goal: Improve Commenting at Completion

The first change I made to improve goal completion was to add a link to the word ‘comment’ on the last page of the scavenger hunt; the first version I ran had no link there.

[Screenshot: GA-SH-CE1]

Then I created a variant with a much more noticeable comment button, to see which version performed better in my content experiment. Rough sketches of both treatments appear after the screenshot below.

[Screenshot: GA-SH-CE2]
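To make the two treatments concrete, here is a rough sketch of what the markup might look like; the URL, class name, and copy are hypothetical placeholders, not taken from the live site:

```html
<!-- Original treatment (Experiment 1): the word "comment" becomes a
     plain link back to the landing page's comment form.
     The URL below is a hypothetical placeholder. -->
<p>Finished the hunt? Head back to the landing page and leave a
   <a href="/scavenger-hunt/#comments">comment</a>.</p>

<!-- Variant treatment: the same destination, presented as a much
     more noticeable button that navigates via script. -->
<button class="hunt-comment-button"
        onclick="window.location.href = '/scavenger-hunt/#comments';">
  Leave a Comment
</button>
```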

After setting up my initial experiment to improve commenting, I sent invitations to a new wave of my contacts. Since the test went live, commenting has improved tremendously.

After implementing these changes, I continued to follow up with users who had made it through the full scavenger hunt to collect feedback. In doing so, I discovered that the button on the variant page does not work for Firefox users.
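I have not yet isolated the cause of the Firefox failure, but a frequent culprit when a “button” works in some browsers and not others is navigation that depends on a script, as in the hypothetical sketch above. One browser-agnostic alternative would be a plain anchor styled to look like a button, so navigation works even when no script runs (class name and URL again hypothetical):

```html
<style>
  /* Style an ordinary link to look like a button. */
  .hunt-comment-button {
    display: inline-block;
    padding: 12px 24px;
    background: #2a7ae2;
    color: #fff;
    text-decoration: none;
    border-radius: 4px;
  }
</style>

<a class="hunt-comment-button" href="/scavenger-hunt/#comments">
  Leave a Comment
</a>
```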

Squeak!

In addition to the Firefox compatibility issue, two users reported difficulty progressing from the “Squeak the Squirrel” post to the next page in the sequence.

I had anticipated that this part of the hunt might become a drop-off point for some users. It is the most complicated task of the hunt, partly because of the amount of content on the blog post, but also because several steps are required to complete it and progress.

[Screenshot: GA-SH-CE3]

After scrolling to the bottom, users find Clue 4.

[Screenshot: GA-SH-CE4]

Some users were able to take the information in the clue and progress to the next task, but reported that it took longer than expected; others reported dropping off at this point.

After receiving this feedback, I decided to run another experiment to improve the experience and the progression to the next task.

Experiment 2

Goal: Improve Clarity of Path from “Squeak” to Finish Line

For this experiment, I left the original page as shown above and created a variant to test against it. I made simple changes to make it clearer where users should go next; a sketch of one form such a change could take follows the screenshot below.

[Screenshot: GA-SH-CE5]
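One way such a clarity change could look, sketched with hypothetical markup and copy (the actual variant's changes may have taken a different form):

```html
<!-- Hypothetical sketch of a clarity change: explicit directions
     placed directly beneath Clue 4, telling users what to do with it
     without giving the destination away. -->
<div class="hunt-next-step">
  <p><strong>Found Clue 4?</strong> It tells you where to go next:
     use it to find the matching page and continue the hunt.</p>
</div>
```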

Comparison Tracking

To compare the results, and surface any discrepancies, between the different data collection methods, I set up several new destination goals in GA to check against the content experiment results.

[Screenshot: GA-SH-CE6]

These goals are tracked independently of the experiments and are set up to monitor traffic and conversions across all possible combinations of the page variations used in the experiments. A sketch of one such goal configuration follows.
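As a sketch of how one of these goals might be configured (the paths below are hypothetical placeholders, not the site's real URLs), a Destination goal with a regular-expression match can count a completion regardless of which page variation a user landed on:

```
Goal setup (GA Admin > View > Goals > New Goal)
  Type:         Destination
  Match type:   Regular expression
  Destination:  /scavenger-hunt/(finish|finish-v2)/
  Funnel:
    Step 1:  /scavenger-hunt/            (required)
    Step 2:  /squeak-the-squirrel/
```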

Because Experiment 2 had a shorter window, I was particularly interested in monitoring real-time activity, in case further adjustments to the content were needed.

Minor Implementation Issues

The process of setting up the experiments went relatively smoothly, except for my initial attempt using the Google Analytics Content Experiments Plugin. When I tried to verify and launch my experiments through the plugin, I received error messages about the placement of the experiment code on the original page. After several attempts, I added the code to the header manually rather than using the plugin; the error message cleared for the original page, and I launched the experiment. A sketch of the placement follows.
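For anyone who hits the same plugin error, here is a minimal sketch of the manual placement, assuming the standard Content Experiments setup: the experiment snippet belongs immediately after the opening <head> tag of the original page only, before any other script, while the variant pages carry just the regular tracking code. The snippet itself should be pasted exactly as the GA experiment wizard generates it; it is abbreviated to comments here:

```html
<html>
<head>
  <!-- 1. Content Experiment snippet: the first thing after the
          opening <head> tag, on the ORIGINAL page only. Paste it
          exactly as generated by the GA experiment wizard
          (abbreviated here). -->
  <script>
    /* ...experiment code generated by Google Analytics... */
  </script>

  <!-- 2. Regular GA tracking snippet: on the original AND all
          variant pages, as usual. -->
  <script>
    /* ...analytics.js tracking code with your property ID... */
  </script>

  <!-- 3. Everything else in the head follows. -->
  <title>Analytics Scavenger Hunt</title>
</head>
</html>
```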

Next Steps

I invited another wave of contacts to participate over the last few days of testing. I will continue informal conversations about site performance as needed, but expect to spend the remainder of the experiment monitoring the data as it comes in.
