Usability Test: Eat Your Greens

Background

Eat Your Greens is an AngularJS web application that performs a plant-based recipe search using terms entered by the user. It returns the top 10 results with basic data and links to full recipes.

[Screenshot: Eat Your Greens home page]
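As a sketch of that behavior (the function and field names here are hypothetical, not the app's actual AngularJS controller code), the results list boils down to trimming the API response to 10 items carrying the fields the list displays:

```javascript
// Hypothetical sketch: reduce an API response to the top 10
// results with the basic fields shown in the results list.
function topTenResults(apiHits) {
  return apiHits.slice(0, 10).map(function (hit) {
    return {
      name: hit.label,        // recipe name
      image: hit.image,       // thumbnail URL
      calories: hit.calories, // calorie count shown in results
      url: hit.url            // link to the full recipe
    };
  });
}
```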

Testing Method

Volunteers for a usability study were recruited from among known friends and acquaintances. They were sent a request to visit and use the site followed by the completion of a questionnaire. Volunteers in this first round of testing are primarily non-web professionals.

Instrument

The questionnaire was created using Google Forms and is stored on Google Drive. A link to the questionnaire was provided in the initial invitation to participate and is also available in the footer and “Contact” sections of the web application.

Initial Results

As of this writing there were 12 respondents to the questionnaire.

Section 1: Overview

Q1. What type of device did you use to visit Eat Your Greens? (check all that apply)

[Screenshot: Q1 responses]

Half of respondents (50%) visited the site via iPhone.

Q2. How long did it take to understand how to use the Eat Your Greens web application? (pick one)

  1. Instantaneous or under 5 seconds
  2. Quickly or under 15 seconds
  3. Not too long or under 30 seconds
  4. Took a while or under 45 seconds
  5. Took a long time or under 1 minute
  6. Took forever or over 1 minute
  7. Still working on this.

The majority of respondents indicated that they were able to understand how to use the app within 5 seconds.

[Screenshot: Q2 responses]

Q3. Were you able to do any of the following while using Eat Your Greens? (check all that apply)

  • Perform a search using the name of a cuisine or dish.
  • Perform a search using an ingredient.
  • Find the name of a recipe in the results.
  • View an image of the recipe.
  • Read the ingredient list in the results.
  • Read fat and calorie content.
  • Read dietary information (e.g. “Low-Fat”, “Low-Sodium”, etc.)
  • Click a button to visit the recipe’s website.

Every task listed was reported completed by at least 40% of respondents.

[Screenshot: Q3 responses]

Q4. Rate overall level of difficulty in performing a search for recipes using Eat Your Greens.

Respondents rated the difficulty of performing a recipe search on a scale of 1 to 5 (with 5 being the most difficult). 75% considered this task easy to perform.

[Screenshot: Q4 responses]

Q5. Did you visit any pages other than the home page?

[Screenshot: Q5 responses]

The home page holds the majority of interest for the user, but 8 respondents (66.7%) indicated that they visited other pages of the site. (These 8 respondents answered Q6 and Q7; all others skipped ahead to Q8.)

Section 2: Site Navigation

Q6. Which pages did you visit?

The about page was the most visited page, although all pages received good traffic.

[Screenshot: Q6 responses]

Q7. How easy was it to navigate from the home page to other pages on the site?

Of the 8 respondents who visited pages other than home, 6 rated the ease of navigation as a 1 on a 1 to 5 scale (where 1= easiest to use).

Section 3: Usefulness

Q8. How likely is it that you will use Eat Your Greens again in the future?

[Screenshot: Q8 responses]

Respondents used a linear rating scale from 1 to 5 to indicate their level of interest in future use of this application (where 5=frequent use, 2x/week and 1=never use again). The majority indicated a rating of 4. Some respondents commented off the record that they had bookmarked the app, because they are “always looking for recipes like this”.

Q9. How likely is it that you would recommend Eat Your Greens to a friend?

Using a scale of 1 to 5 (5 being very likely), 50% of respondents indicated it was very likely that they would recommend this app to a friend.

[Screenshot: Q9 responses]

Q10. How useful were the recipes that you discovered while using Eat Your Greens?

Respondents then ranked the overall usefulness of the recipes they found in their searches on a scale of 1 to 5 (where 1=none were useful and 5=I’ve already made one of the recipes). 7 respondents (58.3%) indicated that, although they hadn’t cooked a recipe yet, they anticipated they might.

[Screenshot: Q10 responses]

Section 4: Suggestion Box

Q11. Would the addition of any of the following features improve your experience with Eat Your Greens? (check all that apply)

  • Ability to save recipes
  • Display more search results
  • More information and/or resources about plant-based diets
  • Site is fine the way it is

75% of respondents indicated they would like the ability to save recipes. Other feature options were not as strongly supported.

[Screenshot: Q11 responses]

Q12. Do you have any other suggestions for ways to improve the overall usefulness of Eat Your Greens?

This was an open-ended question where some respondents submitted additional feedback, including the following suggestions for improving functionality:

“Not super convenient to scroll down to see recipes on home page. What about tiles with the name/image of each that you could scroll past horizontally?”

“If an ingredient has no results, a default message would be nice.”

“Great site! It was really easy. The only problem I had was on the search. I expected to be able to hit enter to submit my search. That might be useful.”

“Spellcheck. LOL For the user, that is, typing in an ingredient or recipe.”

Summary

Next Steps

Add ability to save recipes. This was a feature I considered including in the initial build of this application, but from my own experience with similar sites I had not found it especially useful in practice. I decided to launch the application without this feature and wait for feedback from the user questionnaire. This feature garnered the most interest among respondents as a way to improve their experience.

Add a “No Results” message. This is a suggestion that was made by a user who turned up no results on a search they attempted. The suggestion definitely has merit and will be considered for future iterations of the site.
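A minimal sketch of such a default message (the names here are illustrative, not the app's actual code):

```javascript
// Hypothetical sketch: decide what the results area should show.
// An empty result set yields a default message instead of a
// silently blank page.
function resultsMessage(hits, query) {
  if (!hits || hits.length === 0) {
    return 'No recipes found for "' + query + '". Try another ingredient.';
  }
  return null; // null: render the results list as usual
}
```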

Add ability to hit Enter on the keyboard to initiate a search. This suggestion would add to the accessibility and overall ease of use of the site.
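One way to sketch this (a hypothetical handler, not the app's actual code) is a keydown listener on the search box that triggers the same submit function as the button:

```javascript
// Hypothetical sketch: submit the search when the user presses
// Enter in the search box, in addition to clicking the button.
function onSearchKeydown(event, submitSearch) {
  if (event.key === 'Enter') {
    event.preventDefault(); // avoid a full-page form submission
    submitSearch();
    return true;            // search was triggered
  }
  return false;
}
```

In the browser this would be wired up with something like `input.addEventListener('keydown', e => onSearchKeydown(e, doSearch))`, where `doSearch` is whatever function the search button already calls.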

Consider other feedback and evaluate methods to implement it. Other suggestions to evaluate include:

  • Adding spellcheck to the entry box
  • Considering a “swipe” presentation of results rather than scroll
  • Including more results (or a more results button)

Continue testing. The questionnaire is still accepting responses, as several volunteers have not completed the test yet. Once they have finished, I will close the questionnaire until improvements are implemented.

Implement A/B Testing. For future iterations of the site, I would like to create analytics experiments to study usability of specific features.

Final Project: the Analytics Scavenger Hunt

Background

This is part 3 of a project using the Analytics Scavenger Hunt on this user journal. Clues lead users to visit and scroll down 5 pages. The pages are in a variety of formats, some blog posts, others pages reachable from the top navigation. Content is varied as well, including images, video, and a link to an online game. Every answer is within one click of its clue.

Goals

I tracked a combination of events and destination goals prior to launching content experiments, and added several more to run in tandem with the experiments.

Implementation: I set up the goals using Google Analytics and the events using the WP GA Events plugin.
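For illustration, an event hit in analytics.js syntax carries a category, action, and label; the values below are assumptions for this sketch, not the exact names configured in the plugin:

```javascript
// Hypothetical sketch of the arguments for a Google Analytics
// event hit (analytics.js syntax):
//   ga('send', 'event', category, action, label)
function scrollEventHit(pageName) {
  return ['send', 'event', 'Scavenger Hunt', 'scroll', pageName];
}

// In the browser this would be dispatched as:
//   ga.apply(null, scrollEventHit('Clue 4'));
```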

Goal Table

[Screenshot: goal table]

Event Table

[Screenshot: event table]

Goal Setting

Goals Part 1

  • Scavenger Hunt: This was an early destination goal to track the number of users who clicked a promoted link leading to the landing/instruction page of the hunt.
  • Hunt Comment Funnel: This is the first funnel set up to track progression from landing page through all five subsequent pages.

Associated Events: I set up a series of scrolling events to track user activity on pages where they needed to scroll to read a clue. These are the same pages tracked in the hunt comment funnel goal.

Reasoning: When I set up these goals and events I was primarily interested in analyzing traffic patterns and identifying areas where drop-offs occurred during the course of the hunt.

Goals Part 2

  • Comment: This is a destination goal that tracks user progression from the final clue page to the comment entry box on the landing page. It is the original page in content experiment 1.
  • Comment Var2: This destination goal is a variation page created for the content experiment 1.

Associated Events: There are two click events, one for each variation, to track when the user clicks the link to go to the comment page.

Reasoning: Posting a comment is the final task of the Scavenger Hunt. I wanted to track how many users were arriving at the comment page, and whether they were using the links provided or other navigation methods.

Goals Part 3

  • Hunt Squeak to Game: This destination goal was created to measure whether users were progressing from the “Squeak the Squirrel” page to the next page in the sequence. The series below replaced this goal once I began running content experiment 2.
  • Hunt Squeak Var1 to Game Var1: The following destination goals were created along with content experiment 2 in order to track completions for each possible combination of variation pages.
  • Hunt Squeak Var1 to Game Var2
  • Hunt Squeak Var2 to Game Var1
  • Hunt Squeak Var2 to Game Var2

Associated Events: There is one scroll event to track the total number of users to scroll to the bottom of the page to read the clue.

Reasoning: Initially, I set this goal because I suspected this page would be more likely than others to have drop-offs during the course of the hunt, due to the amount of content and the complexity of the tasks needed to progress to the next page.

Content Experiments

Experiment 1

Goal: Improve Commenting at Completion

Implementation

I set up a content experiment in Google Analytics to determine which of two pages produced more conversions.

Original

The original page in this experiment had a link to the comments in the text of the “Success” message.

[Screenshot: original page]

Variant

In the variant I added a large button to the comment page below the “Success” message.

[Screenshot: variant page with comment button]

Firefox Fix for Variant

Shortly after beginning this experiment, my informal user testing revealed that the button was not working for Firefox users. In order to continue the experiment with this variant running, I chose to add new content at the bottom to make it more accessible.

[Screenshot: Firefox fix content]

I also added a click event to the new Firefox link to see how many users used this link to comment.

Results and Observations

After 6 days of data collection, experiment 1 showed the original page outperforming the variant. However, the variant was delivered to only 14 sessions while the original had 30 opportunities to run.

[Screenshot: experiment 1 results]
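With arms this uneven (30 vs. 14 sessions), raw rates are fragile. A sketch of the comparison, using the session counts above; the conversion counts and the minimum-sessions threshold are illustrative placeholders:

```javascript
// Compare two experiment arms and flag small samples. A rate
// comparison from so few sessions shouldn't be treated as a
// winner declaration, so the result carries an "underpowered"
// flag whenever either arm is below a chosen session floor.
function compareArms(a, b, minSessions) {
  return {
    rateA: a.conversions / a.sessions,
    rateB: b.conversions / b.sessions,
    underpowered: a.sessions < minSessions || b.sessions < minSessions
  };
}
```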

For this experiment, it has to be taken into account that at least 2 Firefox users contacted me to report a non-functioning button. Their feedback precipitated my adjustment to the content to aid Firefox users. Both of the users who contacted me did make it to the comment page and convert, but they had to click off this page to get there.

I had expected the variant to be more successful than the original. The time running without the final revision may have contributed to lower performance.

Comparison Goal Tracking

Findings from the event tracking of the variation page are consistent with these results, with 1 click on the button and 1 scroll event to the bottom of the page recorded, plus 1 destination goal conversion.

At this time, it doesn’t appear that the Firefox link added to the Variant has been clicked.

Overall improved rates of commenting.

Even though the content experiment only shows that there were 6 conversions, 17 new comments were posted during the experiment. That means that 11 users took a different route to the finish.

At least 2 of these new comments came from Firefox users (from before the Firefox link was added) who used a different path back to the comment page. That leaves 9 users who didn’t take the predicted path to the comments.

Next steps for improvement.

It would be best to continue running the experiment for another week in order to collect more data; however, the timetable for the project doesn’t allow for that much extra time.

Therefore, what can be learned from the data and observations at this time? Considering that 11 out of 17 users took a path different than predicted, it would be wise to reconsider the task itself. Traveling back to the landing page after moving 5 or more pages away may be more problematic than whether a link or button is noticeable.

If this project were to continue, the final page of the hunt should be a page where comments can be submitted without moving away to yet another page. Perhaps, adding a comment form to the online game rather than the blog would be even better.

Experiment 2

Goal: Improve Clarity of Path from “Squeak” to Finish Line

Implementation

After informal user testing, I received consistent feedback that this was the most difficult step in the hunt. I set up a content experiment to track two versions of “Squeak”.

Original

Many users felt the original version didn’t give enough information about how to locate the next clue.

[Screenshot: original “Squeak” page]

Variant

In the variant, I added bold to the most relevant information in the clue and also a hint to help users find the link to the game in the top navigation.

[Screenshot: variant “Squeak” page]

Results and Observations

After 4 days of data collection, results were not very enlightening. Out of 8 sessions tracked in the experiment, only 1 user was delivered the variant; all others received the original page.

[Screenshot: experiment 2 results]

Ideally, more time is needed to best evaluate whether one page is performing better. As in the previous experiment, I expected the variant to perform better.

Difficult yet charming.

Although this experiment appears to show the original content as the winner, many users have given feedback stating otherwise.

[Screenshot: user feedback]

Amusingly, even though users find it to be challenging, “squeak” has charmed many who expressed that the video was their favorite part of the experience.

Next steps for improvement.

As in the previous experiment, if the timetable for the project allowed for an additional week of data gathering, it would be helpful. Since this is not possible, based on the data available and clear user feedback, this content should be revised to make the progression between steps easier for the user.

Possible solutions

Reduce the amount of scrolling. Positioning the clue closer to the video might improve conversion rates and reduce user time trying to find the next page to visit.

Create new content that is more directly tied to what “Squeak” has to say. The clue that leads to this content asks the user to watch the video to see what “Squeak” says at a particular time marker, but this doesn’t apply to the next step in the hunt; it is simply a way to encourage interaction with the video. This may confuse users trying to follow the clues and instructions. Adding another step to the hunt with a better connection to “Squeak” may improve the user experience and sense of continuity.

Write a better clue. Combined with the fact that users may expect “Squeak’s” comments to be relevant to the next step in the hunt, the clue itself may simply be too vague.

Project: Content Experiments

The Hunt Continues

This week, I continued tracking the performance of the Analytics Scavenger Hunt and performed informal user testing to assess areas where content could be revised to improve user experience and goal conversion rates.

Informal User Testing

After analyzing week 8 data and project results, I contacted several participants who had completed or nearly completed the hunt. I was also fortunate to receive voluntary reporting from users to let me know about their experiences with the Scavenger Hunt.

Although most users reported that they felt the Scavenger Hunt was engaging and that they were able to progress through the steps with relative ease, two main areas of concern were identified.

  1. Submitting a comment upon completion of the hunt.
  2. Progressing from the “Squeak the Squirrel” post to the next page in the sequence.

I also noted during my initial data collecting that these two steps were not performing as well as the rest of the scavenger hunt.

Submitting Comments

The instructions for the Scavenger Hunt include posting a comment on the landing page, a blog post that marks the start of the exercise.

[Screenshot: scavenger hunt instructions]

After analyzing the first wave of analytics gathered, I noted that although there was a good showing of users completing the entire course of web pages (and in the correct order), they were not returning to complete the final task in the instructions. In fact, in the first round of testing, only 1 of the 5 users who completed the entire sequence returned to post a comment.

Experiment 1

Goal: Improve Commenting at Completion

The first change I made to improve goal completion was adding a link to the word ‘comment’ on the last page of the scavenger hunt; the first version I ran didn’t contain a link here.

[Screenshot: comment link]

Then, I created a variant with a much more noticeable comment button, to see which version performed better in my content experiment.

[Screenshot: comment button variant]

After setting up my initial experiment to improve commenting, I sent out invitations to a new wave of my contacts. Commenting has improved tremendously since implementing this test.

After implementing these changes, I continued to follow up with some users who had made it through the full scavenger hunt to collect feedback. In doing this, I discovered that the button on the variant page was not working for Firefox users.

Squeak!

In addition to discovering the Firefox compatibility issue, 2 users expressed having difficulty progressing from the “Squeak the Squirrel” post to the next page in the sequence.

I had anticipated that this part of the hunt might become a drop-off point for some users. It is the most complicated task of the hunt, in part due to the amount of content on this blog post, but also because several steps are needed to complete it and progress.

[Screenshot: “Squeak the Squirrel” post]

After scrolling to the bottom, users find Clue 4.

[Screenshot: Clue 4]

Some users were able to take the information in the clue and progress to the next task, but reported that it took longer than they expected and some also reported dropping off at this point.

After receiving this user feedback, I decided to run another experiment in order to improve the experience and progression to the next task.

Experiment 2

Goal: Improve Clarity of Path from “Squeak” to Finish Line

For this experiment, I left the original page as shown above and created a variant to test performance. I made simple changes to help give better clarity about where users should travel to next.

[Screenshot: “Squeak” variant]

Comparison Tracking

In order to compare the success and/or discrepancies between different data collection methods, I set up several new destination goals in GA to compare to the content experiment results.

[Screenshot: destination goals]

These goals are being tracked independently from the experiments and are set up to monitor traffic/conversions between all possible combinations of the page variations used in the experiments.

Because time was shorter for Experiment 2, I was particularly interested in being able to monitor real time activity, in case further adjustments to the content might be needed.

Minor Implementation Issues

The process of setting up the experiments went relatively smoothly, except for my initial attempt using the Google Analytics Content Experiments Plugin. Unfortunately, when trying to verify and launch my experiments, I received error messages regarding the placement of the experiment code on the original page. After several attempts, I decided to add the code to the header manually rather than use the plugin. After I did this, the error message cleared for the original page and I launched the experiment.

Next Steps

I invited another wave of contacts to participate over the last few days of testing. I will continue informal conversation about site performance as needed, but expect to spend the remainder of the experiment monitoring the data as it comes in.

Project: Goals and Funnels

For this project, I created a scavenger hunt on this user journal with clues for users to follow in order to visit and scroll down 5 pages. The pages are in a variety of formats, some blog posts, others pages reachable from the top navigation. Content is varied as well, including images, video, and a link to an online game. Every answer is within one click of its clue.

[Screenshot: scavenger hunt landing page]

Set Up

Google Analytics

I set up a destination goal using Google Analytics to track user progress onto the landing page (the blog post with instructions shown above) of the scavenger hunt.

I also created a funnel with the order of pages users would take if following the clues sequentially to the finish. I chose not to make the funnel required, because users may take different routes, or become interested in other content along the way. Reaching all of the pages, not necessarily in order, is more relevant to completion.

WP GA Events

To compensate, in part for not making the funnel required and in part due to my own fascination, I used WP GA Events to add a scrolling event to each page. The id that triggers the scroll event is located within the clue, in order to observe how many users scrolled to the clues on each page. In addition, I added a click event to a comment link on the final page to see how many users would return and comment from the link.
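The visibility test behind such a scroll event can be sketched as follows (a hypothetical helper; the plugin's actual internals aren't shown here):

```javascript
// Hypothetical sketch: a clue counts as "scrolled to" once the
// bottom of the viewport reaches the clue element's offset.
// A real handler would fire the GA event once and then detach.
function clueVisible(scrollTop, viewportHeight, clueOffsetTop) {
  return scrollTop + viewportHeight >= clueOffsetTop;
}
```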

Implementation

Once I created all of the content, coded scroll/click events, and set up the goal/funnel, I sent invitations to some of my personal contacts to drive traffic to the landing page.

Initial Findings

Scroll Events

Below are the top scroll events, ordered by clue number (clue 1 is on the landing page).

[Screenshot: top scroll events]

From the ordering, I can already see that the pages with clues 4 and 3 show higher engagement than the others, which may be due to users returning to those pages more than once. What is interesting is that there are 19 events tracked on the landing page, but only 6 on the page with the final clue.

Goal Completions

At this point, there have been 9 destination goal completions tracked to the landing page. Five of the total originate from Facebook, where I invited personal contacts with a direct link to the landing page.

[Screenshot: goal completions]

Of the 9 total conversions, all but one entered the site via the landing page, which is what I anticipated. The 8 direct entrances are consistent with my main traffic source via the link I sent to contacts.

[Screenshot: conversions by entry page]

Of these 8 conversions, only 2 continued on to the next page in the scavenger hunt to view clue 2.

[Screenshot: user flow 1]

At this point, they are joined by 3 users arriving from the home page of the blog who are traveling along to the contact page and clue 2 as well.

[Screenshot: user flow 2]

There are no drop-offs from this group of 5 until they begin to reach the end of the course of scavenger hunt clues. At that point, 2 drop off and 3 continue to travel the site. As you can see from the images, 1 is returning to the ‘play the game’ page again.

[Screenshot: user flow 3]

Others diverged from the path slightly and are still completing the hunt at this point. In fact, it appears that at least 2 users are going through the pages a second time.

[Screenshot: user flow 4]

All of these users drop off by the 11th interaction on the site (the final pages were Contact Us and Play the Game).

[Screenshot: user flow 5]

This user flow through the site supports my earlier assumption that some pages are receiving higher scroll event counts due to return visits to popular pages.

Conclusion and Next Steps

I’m pleased with the initial data collection results. They are rich and varied, which gives me the opportunity to examine how different measurement methods reflect user behaviors.

One part of the hunt that I didn’t track at this stage was leaving a comment upon completion of the hunt. Only one user completed this task, which may be due to the fact that it requires returning to the landing page to enter the comment (or that users don’t want to leave a comment).

Moving on to the next phase of the project, I plan to look at the content and organization to determine what improvements could be made to overall results, particularly the ease of commenting at completion.

Analytics Scavenger Hunt

Welcome to the Analytics Scavenger Hunt!

Instructions

  1. There are several items to find on this hunt. They are all located somewhere on this site.
  2. Follow the clues. All items are only one click away from their clue.
  3. When you are finished, submit a comment on this post and tell us how many of the items you found.

Clue 1

Sometimes the way to find the best starting point is to contact the source.

Adding the Analytics

Project Summary

I selected this User Science Journal blog (http://responserequest.com/branchb/) for my first analytics experiment, because I have the ability to add analytics and make changes at will.

Adding the Analytics

Adding analytics to this WordPress blog was a relatively simple process. All that was needed was adding the scripts generated by Google Analytics to the site’s header, which is located in the settings. Scripts can also be added to the header of individual pages that are being tracked.
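For reference, the analytics.js snippet that Google Analytics generated in this era looked like the following; 'UA-XXXXXXXX-X' is a placeholder standing in for the property's actual tracking ID:

```javascript
// Standard analytics.js loader snippet, pasted into the theme header.
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXXXXX-X', 'auto'); // the property's tracking ID goes here
ga('send', 'pageview');                // record the page view
```

This runs only in a browser, so it is shown here purely as the configuration fragment that gets added to the header.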

Pre-evaluation Statements

I made the following assumptions about this site prior to data collection with Google Analytics and will explore whether they are supported by the data in this article.

  1. The majority of users hail from the U.S. and more locally the Seattle area.
  2. The majority of users will enter the site from the home page.
  3. The home page is the most popular, because it is a blog and all of the content is available in a continuous scroll.

Preliminary Findings from Data Collection

Overview of all sessions on this site during the collection period Aug 1 to Aug 7.

[Screenshot: sessions overview]

The majority of users hail from the U.S. and more locally the Seattle area.

As predicted, the majority of the users were from the U.S. It is interesting to note, however, that there were international users from both Iraq and India.

[Screenshot: sessions by country]

Also as predicted, the majority of users were located in the Seattle area or nearby Northwest locales. It was surprising that the second most common “city” came from users not sharing a city location at all.

[Screenshot: sessions by city]

The majority of users will enter the site from the home page.

The home page is the most likely entry point or landing page for users, and this was supported by the data collection. Out of 18 total sessions, 13 began on the home page, responserequest.com/branchb/.

[Screenshot: sessions by landing page]

The home page is the most popular, because it is a blog and all of the content is available in a continuous scroll.

As well as being the most common landing page, the “User Science Journal” home page was also the most popular among users with 34 page views out of 46 total views. Users spent an average time of 01:54 on this page.

[Screenshot: views by page title]

Out of Curiosity

I also examined data grouping channel types. The majority of traffic came from a direct source, which would be consistent with my contacts who clicked through a link I sent to them via email. Others came via a link I shared on social media (Facebook).

[Screenshot: traffic by channel]

The remainder came from referral sources I didn’t recognize. Here is a more detailed view.

[Screenshot: traffic by source]

Preliminary Conclusions

My preliminary findings, using the data collected with Google Analytics, support all pre-evaluation statements.

I also discovered that, although the majority of traffic came from sources I generated through my personal contacts, 16.67% of traffic was referred by other sources.

Project: “Squeak the Squirrel” with Subtitles

Published 1957
Usage Public Domain
Subtitles Transcribed July 31, 2016

This video shows how a gold-mantled ground squirrel at Crater Lake National Park has learned to solve problems connected with getting food. It illustrates how an animal can learn to find food that is hidden from view, or is out of reach through a series of exercises and with many peanuts. To license this film and get a higher quality version for broadcast/film purposes, contact A/V Geeks LLC.

Run time 10:00
Production Company Churchill – Wexler Films
Audio/Visual sound, color

Film downloaded from: https://archive.org/details/squeak_the_squirrel

(Did you read what squeak had to say? Scroll to bottom for next scavenger hunt clue)

Challenges and Surprises

Overall this was a very enjoyable project. The most challenging part was selecting an appropriate video. Many that I was drawn to initially didn’t have much content to transcribe, or had long periods with only instrumental music. It took some time to watch through several selections before I hit upon “Squeak the Squirrel”.

I was surprised by how interesting I found it to sync the subtitles with the action in each scene. I was also careful to display the subtitles long enough to read comfortably, but not so long that they became a distraction from the scene. I watched it through several times just to add or remove a few seconds from the subtitles in order to improve the flow.

Project Workload

It took longer than I anticipated to complete the subtitles based on estimations expressed in our assignment video, which suggested about 20 minutes for a 5 minute video. The video I selected was 10 minutes long, but took several hours to properly transcribe. This may be due in part to my own inexperience and also the fact that it is a documentary with more spoken content to transcribe than other types of video.

YouTube Tools

YouTube didn’t generate subtitles for the first film that I selected. This was likely due to poor sound quality on the recording, or because there wasn’t any spoken dialogue for the first minute.

For “Squeak”, the auto generated subtitles worked, but it did require a lot of editing to make them into grammatical sentences. That said, having them as a framework made it much easier than transcribing from whole cloth.

The interface provided by YouTube was relatively simple to use. Being able to see the captions in a strip along the bottom of the video was extremely helpful; it made syncing the text with the scenes much easier.

Results

I’m satisfied with the outcome of this project and look forward to doing more like it. There is definitely an art to transcribing, especially in cases where there are sound effects or music that must be interpreted. I plan to continue acquiring skills like this that improve user experience and access to media.

Stretch Goals

For a stretch goal, I downloaded a copy of the subtitles and shared them with the archive.org community by posting a link to the file in the comments for the film.

It can be found here: https://archive.org/details/SqueakTheSqurrelSubtitles

Clue 4

You have almost finished the hunt. The next item is a 4 digit number. To find it you’ll have to play a game.

Project: Evaluating Accessibility of SPL.org

Website Evaluated

Seattle Public Library (SPL) home page: spl.org

Purpose and Intended Audience

The home page of the public library is intended to be accessible to all community members. The home page provides information about library branches and hours, as well as news about activities sponsored by the library.  Links to policies, online catalog, and other information are provided for visitors to the site.

Evaluation Tools and Standards

I used Chrome’s Accessibility Developer Tools extension to perform audits for color contrast and to examine markup.

For this evaluation, I selected criteria from a tool published by the W3C called “Easy Checks – A First Review of Web Accessibility” (https://www.w3.org/WAI/eval/preliminary.html). The checks are based on the Web Content Accessibility Guidelines (WCAG) 2.0.

  1. Page title
  2. Image text alternatives (“alt text”) (pictures, illustrations, charts, etc.)

Text:

  1. Headings
  2. Contrast ratio (“color contrast”)
  3. Resize Text

Interaction:

  1. Keyboard access and visual focus
  2. Labels

Evaluation Results

Page title

The page titles on the site display well in browser tabs and appear in the <head> with the correct HTML tags (“<title></title>”).

spl-titles

The titles of pages are distinct from each other and well written.

Image text alternatives (“alt text”)

All images include alt text in the markup, and the descriptions are generally well written, relevant to the content and context presented.

spl-altgood

Images like header banners are not “over-described”, and images that relate to content are usually given the appropriate level of description.

Most graphic elements, like a printer icon, are given appropriate alt text to describe their relevance (alt=”print”). But in a few cases, there are graphic elements that are not essential other than as decoration; in these cases the alt text can be simplified or null.

In the examples below, these elements are over-described and repeat the text that follows. Those using a screen reader will hear the same statement twice in a row.

spl-nullalt

spl-altmobile

Text:

Headings

Is the header hierarchy meaningful? The hierarchy for the home page appears to be chosen based on style rather than organization.


Below is an outline of the home page that was generated by the W3C HTML Validator (The W3C Markup Validation Service).

 [h3] SEARCH THE LIBRARY
            [h3] BROWSE
            [h3] Library Locator
    [h1] Library News and Events
            [h3] Michael Swanwick
            [h3] 125 Years of The Seattle Public Library
            [h3] Find It On Friday
            [h3] Music of Teresa Teng
    [h1] SEE UPCOMING EVENTS
    [h1] QUICK LINKS
                [h4] Summer of Learning
                [h4] Playback - Local Music Collection
                [h4] Museum Pass
                [h4] Upcoming Author Readings
                [h4] Tax Help
                [h4] Homework Help
                [h4] Resources for Job Seekers
                [h4] Pay a Fine / Fee
                [h4] News Releases
                [h4] Special Collections Online
                [h4] Meeting & Study Rooms
                [h4] Reserve a Computer
                [h4] Sign up for Library E-News
                [h4] Podcasts
                [h4] ADA Accommodation
    [h1] QUICK LINKS
                [h4] Summer of Learning
                [h4] Playback - Local Music Collection
                [h4] Museum Pass
                [h4] Upcoming Author Readings
                [h4] Tax Help
                [h4] Homework Help
                [h4] Resources for Job Seekers
                [h4] Pay a Fine / Fee
                [h4] News Releases
                [h4] Special Collections Online
                [h4] Meeting & Study Rooms
                [h4] Reserve a Computer
                [h4] Sign up for Library E-News
                [h4] Podcasts
                [h4] ADA Accommodation
            [h3] AUDIENCES
    [h1] Support Your Library!
            [h3] LIBRARY BLOG

You may notice that the page doesn’t begin with an H1, but rather an H3. This is not ideal; however, since a search box is the first content with a header, it isn’t essential that an H1 be attached to this element.

However, this means that the first instance of H1 comes below the main banner and search box. It would be better if it were located above the search box. In fact, H1 is used several times on the page where H2 might be more appropriate.

The use of H4 for numerous quick link items may not be necessary as they are not section headers. It may be better to style the text for emphasis.
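The outline check above can be approximated in a few lines of code. This sketch (standard library only; the sample markup is illustrative, not the actual SPL page) collects heading levels in document order and flags a page that starts below h1 or skips a level:

```python
# Collect heading levels (h1..h6) in document order, then flag two common
# hierarchy problems: not starting at h1, and skipping a level downward.
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

def outline_problems(levels):
    problems = []
    if levels and levels[0] != 1:
        problems.append(f"page starts at h{levels[0]}, not h1")
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:
            problems.append(f"h{prev} jumps to h{cur}")
    return problems

parser = HeadingOutline()
parser.feed("<h3>SEARCH</h3><h1>Library News</h1><h3>Swanwick</h3>")
print(outline_problems(parser.levels))
```

Run against the outline above, a check like this would report the h3 start as well as each h1-to-h3 and h1-to-h4 jump.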

Contrast ratio (“color contrast”)

Many of the headers and navigation elements on the home page had a contrast ratio of 3.00, which falls below the AA level requirement (4.5:1).

  • The top and bottom navigation text = 3.00

spl-topnavcontrast

  • Header 1 and Header 3 text throughout = 3.00

spl-H1H3contrast

  • Email link in footer = 3.00

spl-contrast-email

All of these instances could be improved to the AA level or higher without making major stylistic changes.
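For reference, the contrast figures above come from the WCAG 2.0 formula: the ratio of the relative luminances of the lighter and darker colors, each offset by 0.05. A small sketch:

```python
# WCAG 2.0 contrast ratio: (L1 + 0.05) / (L2 + 0.05), where L1 is the
# relative luminance of the lighter color and L2 of the darker one.

def relative_luminance(rgb):
    """Relative luminance of an sRGB color given as (r, g, b) in 0-255."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white: the maximum possible ratio of 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))  # 21.0
```

Darkening the blue used for the navigation text until this ratio reaches 4.5 would satisfy AA without a visible redesign.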

Resize Text

To evaluate whether the text was accessible at larger magnifications, I used my browser controls to zoom to 200% and then examined the text using the checklist below.

  1. All text gets larger.
  2. Text doesn’t disappear or get cut off.
  3. Text, images, and other content do not overlap.
  4. All buttons, form fields, and other controls are visible and usable.
  5. Horizontal scrolling is not required to read sentences or “blocks of text”.

All text was fully functional at 200% magnification.

Interaction:

Keyboard access and visual focus

To test for keyboard access, I used the following check list:

  • Tab to all and Tab away:
    • I had no difficulty tabbing back and forth through the entire page.
  • Tab order:
    • The tab order was logical and followed the content in an expected order.
  • Visual focus:
    • This was problematic on the top and bottom navigation menus, because the box that indicates the location of focus is not visible against the blue header and footer color.
  • All functionality by keyboard:
    • All links, buttons, drop-down lists, and other elements were accessible by keyboard.
  • Image links:
    • Images with links tabbed in the correct order and were visibly selected.

Labels

The search box on the home page is missing labels in its markup.

spl-searchlabels

Revising markup to include labels would improve accessibility.
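A quick way to spot this kind of issue is to compare each input’s id against the for attributes of the page’s labels. A rough standard-library sketch (the sample form is illustrative, not SPL’s markup):

```python
# Flag form inputs that have no associated <label for="...">. Hidden and
# submit inputs are skipped since they don't need visible labels.
from html.parser import HTMLParser

class LabelAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.input_ids = []
        self.label_fors = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "input" and attrs.get("type") not in ("hidden", "submit"):
            self.input_ids.append(attrs.get("id"))
        elif tag == "label" and "for" in attrs:
            self.label_fors.add(attrs["for"])

audit = LabelAudit()
audit.feed('<input type="text" id="q"><input type="submit">')
unlabeled = [i for i in audit.input_ids if i not in audit.label_fors]
print(unlabeled)  # the "q" text input has no associated label
```

The fix itself is simply adding a label element whose for attribute matches the input’s id (visually hidden if the design requires it).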

Recommendations

Improve Accessibility

Overall, the Seattle Public Library home page and the majority of its associated pages are accessible. However, there are several issues that could be addressed, without major redesign or effort, to improve the overall accessibility of the site.

The recommendations that follow are based on the criteria used to evaluate the general accessibility of the Seattle Public Library site:

  • Alt Text:
    • The image used along with “Support the Library” is only decorative; using a null alt attribute (alt=””) would be appropriate in this instance.
    • The image above “get the app” has alt text that duplicates the link beneath it. I recommend revising to a short descriptive alt (e.g., alt=”mobile”) or null.
  • Headers:
    • Improve the hierarchy of headers. H1 should be used at the top level and in limited use only where it is essential for structural clarity. Many of the instances of H1 could be changed to H2 in order to make the hierarchy more meaningful.
    • H4 is used for long lists of quick links. Improved styling for emphasis may be more effective in this case.
  • Contrast Ratio:
    • Improving the contrast ratio to at least the AA level (4.5:1) will not require major design changes, and it would make the site more accessible to visually impaired users. These areas are recommended for revision:
      • Top and bottom navigation = 3.00
      • H1 and H3 text = 3.00
      • Footer link to email = 3.00
  • Keyboard Interaction:
    • Improve visibility of tab selection of top and bottom navigation items. The highlight box is not visible when tabbing against the light blue background.
  • Labels:
    • Add labels to the markup for the “Search” radio buttons and input box.

Project: Mobile Usability Test of Central Coop Website

Mobile Usability Test Summary

For this mobile usability test, I continued my analysis of the Central Co-op website: http://www.centralcoop.coop

Background

The Central Co-op is an independent co-operative natural food grocer with locations in Seattle and Tacoma. A co-op is a business owned and operated by and for its users. Member-owners receive benefits such as voting privileges, access to special sales, an annual dividend based on individual expenditures, as well as regular communications about the workings of the business. To become a member-owner, an individual must purchase a share in the business.

A previous usability test of the desktop website was completed prior to this investigation of the mobile user experience. Results and recommendations from that test are published in this User Science Journal here: Usability Test of the Central Co-op website

Purpose

As in the desktop site usability test, the mobile usability test examines the ease with which users are able to navigate the mobile site, in order to drive them to content supporting the following primary goals:

  • Member-owner recruitment and education
  • Locating and shopping in the physical store
  • Promoting enrollment in new grocery delivery service

Methods

For this test, there was 1 test administrator (myself) and 4 test subjects.

The test script from the desktop site test was updated to reflect the use of the mobile site. Both were adapted from a free, downloadable script found in Rocket Surgery Made Easy© by Steve Krug.

The opening script was read aloud by the test administrator in order to explain the test procedure and provide opportunity for participants to ask questions prior to the start of the test.

At the end of the introductory script, subjects were asked to give permission for their test session to be recorded; all subjects agreed and signed a consent form prior to the start of the test.

A Logitech webcam with microphone attached to a small tripod was used to record test subject interaction with the mobile device. A white cloth was used under the test area to increase brightness and image quality.

Test-Station

The webcam was tethered to a laptop, which the test administrator used to view the test. The Chrome extension Screencastify was used to record both audio and web cam activity throughout the test from the laptop.

Subjects were asked by the test administrator to navigate to the website on the mobile device and give a general impression of the main page. This also allowed them to be comfortable with the test set up and web camera placement. Time was limited to 3 minutes.

After the initial view of the home page, the subjects were asked to complete 6 scripted tasks. All of the tasks were presented in written form and were also read aloud by the test administrator.

Scripted Tasks:

In the desktop site test summary, I made recommendations to simplify several of the more complex tasks, by breaking them into smaller parts to improve clarity for the test subject. For this mobile test, I revised accordingly, which increased the total number of tasks from 4 to 6.

Tasks 5 and 6 remain the same as in the previous test.

  1. Are you able to locate information about how to join the Central Co-op?
  2. Are you able to find the cost for membership?
  3. Are you able to find out what benefits are included with membership?
  4. Are you able to locate information about what a co-operative business is?
  5. Are you able to locate the address and hours for the Co-op location in Seattle?
  6. The Co-op has started a new grocery delivery service. Are you able to find out whether your zip code is within their delivery area?

The total time for each session from start of introduction to the end of the recording was designed to last no more than 15 minutes.

Participants

The test administrator recruited volunteers from among personal friends and colleagues. Four test subjects volunteered to be participants in this test.

  • Subject 1: Female, an employee of Seattle University, iPhone 5
  • Subject 2: Male, an employee of Seattle University, iPhone 5
  • Subject 3: Female, an employee of Seattle University, iPhone 5
  • Subject 4: Male, an employee of the Federal Government, Samsung Galaxy 3

Only one of these participants, Subject 4, is a member-owner of the Central Co-op. None of the subjects had visited its website on desktop or mobile prior to the test.

Of the three participants who are not member-owners, all expressed having prior knowledge of this business as a brick and mortar store near Seattle University where they are employed.

Results Summary

Initial Impressions

All subjects used an iPhone 5 (IP5), except for Subject 4, who used a Samsung Galaxy 3 (SG3).

The SG3 displayed a small version of the desktop site, with small, hard-to-read elements.

COOP-Mobile-SG3-Vertical

In order to complete the tasks for this test, it was necessary for him to turn the phone horizontally, allowing a more readable presentation of the content.

COOP-Mobile-SG3-HZ

The initial screen presentation of the Central Co-op’s homepage on the IP5 was substantially different than the SG3.

screenshot-drive.google.com 2016-07-17 15-49-30

The IP5 delivered a more responsive layout in appearance, but hid many of the buttons and cards that normally appear on the desktop site.

screenshot-drive.google.com 2016-07-17 15-52-50

This limited the test subjects’ choices when navigating from the home page to complete tasks, but they were provided with a mobile-friendly menu, which was not available to the SG3 user.

COOP-mobile-menu-button-IP

Scripted Tasks

1. Are you able to locate information about how to join the Central Co-op?

4 of 4 subjects completed this task successfully.

Findings

All subjects were able to locate the correct page to join the Co-op, but there was no clearly marked navigational path for them to follow from the home page. Most subjects were able to find this content in under 1 minute, but Subject 1 searched for 1:35 before finding the page.

Subject 1 Pathway:

After scanning the entire home page for a prominent button, she expanded the top menu.

COOP-mobile-menu-IP

From here, she clicked on “Ownership”, bringing her to a long page of content explaining the ownership structure of the Co-op. In the desktop version, this page has clearly marked buttons on the right, but on the IP5 these buttons are off the screen.

COOP-DT-buttons

The subject does not notice a link at the bottom of this page to “Join.” The link is not underlined or styled much differently from the surrounding text.

COOP-mobile-JoinLink-small-IP-

At this point, Subject 1 expands the menu again and clicks “About Us”, opening another long page of content describing what type of co-operative business they run. Again, the subject does not see the link at the bottom of this page to “Join.” This link is nearly identical to the one on the previous page, also not underlined or styled much differently from the surrounding text.

Subject 1 returns to the home page to look again for a button or other recognizable path. From there the subject tries “Hours, Locations, Contact” and after scanning the page comments, “well, joining the Co-op is not super intuitive, I mean, one would think there would be a big button on the main page that says join.”

At this point in the task, the subject has spent approximately 01:08 searching for a way to join and asks the test administrator if she should continue looking. The administrator replies affirmatively and the subject navigates back to the home page to begin again.

From the home page, she clicks the “Weekly Owner Coupon” and is taken back to the “Ownership” page that she had visited near the start of this task. This time, the full page displays along with the large buttons that were off screen on her first visit. From here, she pinches out the view to be able to read the buttons and selects the “Join” button.

COOP-mobile-side-buttons-IP

Total task time was 01:38.

Subject 2 Pathway:

Subject 2 expands the top menu to search for how to join. He selects “Ownership” and scans the page. After some reading, the subject notices the link at the bottom of the page (mentioned above in Subject 1 pathway) and selects it arriving at the correct page to join.

Total task time was 00:49.

Subject 3 Pathway:

After scanning the home page for a link to join, Subject 3 expands the top menu. She reads through the list of menu items and says, “I don’t know where I would click to join.”

She then scans the home page a second time and returns to the menu again and states, “I guess, I’ll try ownership?” When she arrives at the page, she scans it and finds the link at the bottom of the page to join. At this point she says, “I guess they assume that I would know that Co-ops are run on people power.”

Total task time was 00:41.

Subject 4 Pathway:

Subject 4 is the only subject not using an iPhone, which means he had no mobile-friendly menu at the top as the other subjects did. He chooses to use the search box to look for a way to join and quickly navigates to the “Ownership” page based on the results. At this point, he comments, “it brought me to ownership, which is a tab I can see, but I had no idea that ownership meant join.”

Since he was using an SG3 with the device held horizontally, the full width of the site is visible, including the large buttons on the right, which IP5 users weren’t always able to see. He selects the top “Join” button, but comments “the text is really small and buttons really skinny.”

COOP-mobile-SG3-buttons

Total task time was 00:25.

  • Average task time among IP5 users was 01:03.
  • Average task time among all users was 00:53.
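The averages can be recomputed from the individual task times reported above (01:38, 00:49, 00:41, and 00:25):

```python
# Recomputing average task times from the per-subject totals for Task 1.
task_seconds = {"Subject 1": 98, "Subject 2": 49, "Subject 3": 41, "Subject 4": 25}

def mmss(seconds: float) -> str:
    """Format seconds as mm:ss, rounding to the nearest second."""
    m, s = divmod(round(seconds), 60)
    return f"{m:02d}:{s:02d}"

ip5 = [task_seconds[k] for k in ("Subject 1", "Subject 2", "Subject 3")]
print(mmss(sum(ip5) / len(ip5)))                             # 01:03 for IP5 users
print(mmss(sum(task_seconds.values()) / len(task_seconds)))  # 00:53 for all users
```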

Recommendations

Improve Visibility of Navigation Elements for Mobile. Even with a seemingly more mobile-friendly look and feel, the iPhone users took significantly more time than the Samsung user to complete this task. This is due, in part, to missing navigation elements that are available on the desktop and SG3 displays, such as the large buttons on the home page and on the right side of most pages. How to join should not be a hidden element for any visitor to the site.

  • Add buttons in text rather than off to the side on pages for “ownership”, “about us”, and all other pages where a non-member might visit for information.
  • Replace “subtle” links at bottom of pages with more noticeable, clickable formatting, or buttons.

Make Menu Names More User Friendly. Three of the test subjects commented that they did not know that the word “ownership” would lead them to information about becoming a member of the Co-op. I would recommend clearer wording for navigation elements, such as:

  • Add “Join the Co-op” to the top navigation in the first position.
  • Improve positioning of the current “Join the Co-op” button on the home page, which is underneath a news story about light rail. This button is omitted on IP5 and is too small to see on SG3.

Improve Search Function. Even though the SG3 user completed the task at a much faster pace than the other subjects, his strategy of entering “Join” in the search box still brought him to the wrong page initially. Improving the search function on the site would give those users who prefer to use the search function a seamless and nearly instantaneous result.

2. Are you able to find the cost for membership?

Findings

4 of 4 subjects completed this task successfully.

This task was easily accomplished by all subjects. The information can be found on the “Join” page, where most subjects were positioned at the end of Task 1; even if they had navigated away from this page, most of them remembered seeing the information while completing the prior task.

Recommendations

Although this was one of the tasks most easily achieved for the subjects, there is still room for improvement. The textual content of the page is long and important details, like the cost of membership, could be better styled to stand out.

COOP-mobile-cost

3. Are you able to find out what benefits are included with membership?

Findings

3 of 4 subjects completed this task successfully.

This task was by far the most difficult for the majority of test subjects. The three IP5 users scanned and then re-read the “Join” page, anticipating that benefits of membership would be included on this page or that there would be a pathway to navigate to them in the content. Only the SG3 user hit on the link “return on your investment” right away.

COOP-mobile-SG3-Join-ROI

Subject 1 read through the “Join” page for a minute before moving on to “Co-ops 101”, spending nearly another minute reading content before giving up and commenting that she expected to find this information along with how to join.

Subject 2 visited 7 pages before returning to the “Join” page and noticing the “return on investment” link. After spending well over 2 minutes searching, he commented that he “didn’t feel guided” by the content.

Subject 3 spent over a minute reading and re-reading the “Join” page, then navigated to “Co-ops 101” in search of the benefits. Finally, she decided to return to the top menu, but this time in the desktop site (or “Full Site”) view, and found the link to “Benefits” there in the top tabbed navigation.

Subject 4 had the fastest completion time (a mere 00:12), but commented that “return on investment” is lingo that those new to the Co-op might not know, finally saying, “Why doesn’t it just say benefits?”

All in all, the subjects spent the bulk of their time divided between the “Join” and “Co-ops 101” pages.

Benefits-Time-Table

*Subject 2 visited 7 pages total; time on those pages is not reflected in the table.

Recommendations

Use Clear, Jargon-Free Language. Although the phrase “What is the return on my investment?” may be an accurate way to describe what a person can expect in return for becoming a member, it isn’t clear enough for a person new to the concept of a co-operative business. Alternatives might be:

  • What are the benefits of becoming a member of the Co-op?
  • What do I get when I become a member?
  • What can I look forward to as a member of the Co-op?
  • I’m ready to join, what happens now?

Even “benefits” is not an entirely jargon-free word choice. Including it in the task prompt did little to improve completion time, because the term is only used under the “Ownership” menu tab and is not immediately obvious to a user seeking information on how to become a member of the Co-op.

Create Better Navigation for Mobile Users. One of the challenges for most of the test subjects was the lack of clear navigation to the “Benefits” page. The tab navigation that is visible on the desktop site was hard to read and click for the SG3 user. IP5 users were only able to see the tabs while visiting the “Full Site”.

COOP-mobile-SG3-Benefits tab

The SG3 user was never observed opening the tabs at all. Only one IP5 user was observed opening the ownership tab while searching for benefits.

Improve Clickability. Links in text, as in the case of the “return on investment” link, should be styled to be more obvious, especially for mobile users. The content text is already rather small, and the subtle styling of links makes them even more difficult to find.

4. Are you able to locate information about what a co-operative business is?

Findings

4 of 4 subjects completed this task successfully.

The point of this task is to see if the subjects are able to locate general information about co-operative businesses that is located on the “Co-ops 101” page.

Three of the subjects completed this task easily.

  • Subjects 1 and 3 remembered visiting this page in error while searching for how to join or for member benefits, and then navigated back to it to complete the task.
  • Subject 4, using an SG3 with screen horizontal, was able to see the large button for “Co-ops 101” that is sometimes off screen for IP5 users.

Subject 2, however, had a more difficult time for these reasons:

  1. This subject didn’t remember landing on the “Co-ops 101” page in error while completing earlier tasks, as Subjects 1 and 3 did. This is the only “advantage” Subject 2’s fellow IP5 users had in completing this task.
  2. For most of this task, Subject 2 wasn’t able to see the large button for “Co-ops 101” that Subject 4, the SG3 user, could see.

To complete this task, Subject 2 followed this path:

  1. “About Us” page
  2. “Ownership” page
  3. Expanded main menu
  4. Scrolled to the bottom of the home page
  5. Clicked the button to “Visit the Full Site”
  6. Scanned the “Full” version of the home page
  7. Scanned the footer links and finally found “Co-ops 101”

COOP-mobile-IP5-fullsite

Throughout the majority of the steps, Subject 2 was reading through long pages of content, which added significantly to his total time of 01:52.

  • Subject 1 total time 00:15
  • Subject 2 total time 01:52
  • Subject 3 total time 00:09
  • Subject 4 total time 00:08

Recommendations

Use Clearer Button and Menu Titles. Subjects 1 and 3 completed this task very quickly, but only because they remembered visiting the page in error during previous tasks. Although memorability/learnability can be a positive testing result, in this case it suggests that the actual work of the task was shouldered by time spent completing previous tasks, which were difficult to complete and forced users to visit multiple pages.

Subject 4 had the benefit of more visible navigation elements and the quickest completion time, but commented “Why not say, What’s a Co-op? Instead of Co-ops 101?”

“What’s a Co-op?” is a clearer title than “Co-ops 101,” which, as demonstrated in this test, became a catch-all for subjects whenever they became bewildered during the tasks.

Improve Navigation for Mobile. Although Subject 2 didn’t remember visiting “Co-ops 101” in previous tasks, as soon as he saw it in the footer links, he was confident that he had completed the task. If he had had the same access to navigation elements as Subject 4, the SG3 user, it is likely his overall time would have improved.

5. Are you able to locate the address and hours for the Co-op location in Seattle?

Findings

4 of 4 subjects completed this task successfully.

As in the desktop test, this task was successfully completed in minimal time by the test subjects. Most commented that they remembered seeing a link to “Hours, Location, Contact” either in the mobile menu on the IP5 or in the header of the page. In fact, the IP5 test subjects performed better in the mobile test than the desktop test subjects did, because their mobile view isolated the content at the top of the page with no diversions.

Recommendations

Even though this task was easily performed by the test subjects, improvement in the presentation of this content is recommended. There is a lot of white space available on this page, and the text is rather small; some users were observed pinching out the page in order to increase the size of the text.

COOP-mobile-small-hours

The SG3 user demonstrated the size he would prefer to see on a mobile page of this type below.

COOP-mobile-largerhours

6. The Co-op has started a new grocery delivery service. Are you able to find out whether your zip code is within their delivery area?

Findings

3 of 4 subjects completed this task successfully.

Subjects performed this task with only moderate observable difficulty. Pathways to complete the task were split: two subjects navigated to the Instacart page, which is part of a different domain, and two found the info page on the Co-op website.

Subjects 1 and 3 remembered seeing the banner graphic for the delivery service during previous tasks, where they visited the “Full Site”. They easily navigated to the Instacart page from there.

COOP-mobile-IP5-instacartlink

Unlike subjects in the desktop test, neither subject was disturbed by being taken to a new domain. In fact, quite the opposite: the Instacart page is very mobile friendly and appeared consistent with their expectations of web app appearance and functionality.

COOP-mobile-instacart

Subject 2 used the IP5 mobile menu to find the informational page on the Co-op website.

COOP-mobile-IP5-menu-delivery

He navigated to the page about the delivery service, which has a lot of textual content and began scanning for a way to check whether his zip code was in the delivery area.

COOP-mobile-delivery

The page was not very mobile friendly and required that he scroll back and forth to read the content. He also had to scroll down to locate information about delivery areas. He didn’t see the link to check zip codes at all. He scanned the list of selected neighborhoods and was satisfied that there was no point in looking for his exact zip code. He was the only subject who didn’t complete the task.

COOP-mobile-delivery-areas

Subject 4, the SG3 user, employed a similar tactic as he had previously by using the search function to look for “delivery”. He commented that it was a little difficult to access the box, because it was small. His search returned 169 results on 17 different pages of the website.

COOP-mobile-SG3-deliversearch

Luckily, the first result took him to the correct page on the Co-op website. The page displayed more fully on SG3 with a rotated screen than it did for the IP5 user. There was less scrolling needed to find the pertinent information.

COOP-mobil-SG3-delivery-zip

Subject 4 completed the task quickly and commented that the zip code popup was the most readable page he had seen throughout the entire test.

COOP-mobile-SG3-popup

Recommendations

Improve Navigation Elements for Mobile. The IP5 users who had the best experience completing this task navigated from the graphic banner link, available only from the “Full Site”, as opposed to the mobile menu. This navigation element is memorable and should have a place in the initial mobile view for all users. Adding the same or a similar graphic link, displayed correctly, at the top of the informational page would also attract more users to the Instacart page, which is more responsive than the page on the Co-op site.

Improve Responsiveness and Content. The delivery information page on the Co-op site has similar content issues as other pages we have discussed throughout this summary. There is a lot of content in small print that requires scrolling and pinching out for better viewing. Larger, clearer textual elements, like prominent headers instead of a large unclickable banner, that appear near the top of the page would assist users in locating information quickly and painlessly.

Closing Summary

Mobile vs Desktop

The nature of a mobile device is to be “on the go,” so improving task times for mobile users is essential. In this test, users who went beyond a 40-second window to complete tasks became visibly irritated. Desktop testers didn’t seem to be as concerned with speed and, on average, also completed the entire test in less time.

Approximate Average Time for Entire Test*

  • Desktop Users: 07:13
  • Mobile Users: 08:06

*Time calculations do not account for time subjects spent commenting during the test.

iPhone vs Samsung Galaxy

Although the iPhone initially presented a more mobile-friendly appearance, the Samsung user had the best completion times on nearly every task: in part because he chose to use the search function, but also because no elements of the website were hidden on his device.

Next Steps

In general, the look and feel of this site is pleasant for desktop users, but mobile users have a different experience. Both types of users would benefit from more accessible content.

Make Content More Scannable and Easier to Access. Although the overall design of the site is visually pleasant, the textual content is presented in a small font, and many pages could benefit from better use of headers and other graphic elements to break it up. One mobile subject commented that the pages were “cluttered” and that they would like to see information broken up into smaller, logical chunks.

Make Site More Responsive. It is pretty clear that this site is meant for desktop, and it performs best in that environment. When IP5 users receive a “friendly” version of the site, there is a lot of missing or off-screen content. The content that doesn’t appear doesn’t seem to be intentionally selected, and some of it detracts from the user’s experience and ability to navigate the site, like the buttons directing them to join the Co-op.

Additional Testing

It would be worthwhile to perform additional testing for both desktop and mobile users after implementing the recommendations made throughout this summary. If a substantial redevelopment of content based on these findings is being considered, additional testing across a wider breadth of both mobile devices and desktops would provide the practical information needed to create a better user experience overall.

Project: Usability Test of the Central Co-op Website

Usability Test Summary

For this usability test, I selected the Central Co-op website: http://www.centralcoop.coop/home.php

Background

The Central Co-op is an independent co-operative natural food grocer with locations in Seattle and Tacoma. A co-op is a business owned and operated by and for its users. Member-owners receive benefits such as voting privileges, access to special sales, an annual dividend based on individual expenditures, as well as regular communications about the workings of the business. To become a member-owner, an individual must purchase a share in the business.

Purpose

This usability test examines the ease with which users are able to navigate the site, in order to drive them to content supporting the following primary goals:

  1. Member-owner recruitment and education
  2. Locating and shopping in the physical store
  3. Promoting enrollment in new grocery delivery service

Methods

This test involved one test administrator (myself) and three test subjects.

A test script adapted from a free, downloadable script found in Rocket Surgery Made Easy by Steve Krug was used for this test.

The opening script was read aloud by the test administrator in order to explain the test procedure and provide opportunity for participants to ask questions prior to the start of the test.

At the end of the introductory script, subjects were asked to give permission for their test session to be recorded; all subjects agreed and signed a consent form prior to the start of the test.

The Chrome extension Screencastify was used to record both audio and desktop activity throughout the test.

Subjects were asked by the test administrator to give a general impression of the main page of the website, followed by 4 scripted tasks.

All of the tasks were presented in written form and were also read aloud by the test administrator.

Scripted Tasks:

  1. Are you able to locate information about what a co-op is and how to join the Central Co-op?
  2. Are you able to find the cost for membership and figure out what benefits are included with membership?
  3. Are you able to locate the address and hours for the Co-op location in Seattle?
  4. The Co-op has started a new grocery delivery service. Are you able to find out whether your zip code is within their delivery area?

The total time for each session from start of introduction to the end of the recording was designed to last no more than 15 minutes.

Participants

The test administrator recruited volunteers from among attendees at an open lab session for a Web Development Certificate program at Seattle University. Three test subjects volunteered to be participants in this test.

  • Subject 1: Male, a student in a Web Development Certificate program at Seattle University.
  • Subject 2: Male, an instructor in a Web Development Certificate program at Seattle University.
  • Subject 3: Female, a student in a Web Development Certificate program at Seattle University.

None of these participants were member-owners of the Central Co-op, nor had any of them visited its website prior to the test. Of the three participants, Subjects 1 and 2 expressed prior knowledge of this business as a brick-and-mortar store near Seattle University.

Results Summary

1. Are you able to locate information about what a co-op is and how to join the Central Co-op?

Findings

What is a Co-op?

There is an informational web page on the Co-op’s website called “Co-ops 101”. This part of the task was meant to test whether the subjects were able to locate this page and interact with the content. One subject navigated here successfully to complete the task. Two of the three subjects never found this page, but found other pages that they felt satisfied the request.

The two subjects diverted by other pages with similar content navigated from the home page using the top navigation. Both scanned through the navigation tabs in search of the correct page.

  • Subject 1 selected “Join” rather than “Co-ops 101” from the “Ownership” menu tab. This page presents a lot of content detailing the membership process at this particular co-op, but doesn’t go into detail about what co-ops are as a type of business.

COOP-101a

  • Subject 3 selected “About Us”, which led to a page describing the Central Co-op and its principles, but didn’t connect the “Co-ops 101” button with the task.

COOP-101b

Based on these findings, the “Co-ops 101” page is not presented clearly enough to lead users to a general description of a co-op-run business. Since providing this educational content is an important part of recruiting new members and customers, it should be more accessible to newcomers.

Recommendations

  • Only 1 of the 3 subjects successfully completed the first portion of the task, although all of them believed they had. However, all subjects did find the “Join” page, which has large buttons to this information as well as a button that links back to the current page. Removing the “Join” button from this list and moving “Co-ops 101” to the top would make it appear more important.

COOP-101rec2

  • Adding an in-text button on the “Join” and “About Us” pages with the description “Learn more about co-ops and how they work” or something similar would also make the information on “Co-ops 101” appear more important.

COOP-101rec1

  • The “Join” page only has a small link halfway down the page. More emphasis on this link to “Co-ops 101” would improve user navigation to this information.

COOP 101 recs3

How do I join?

Most of the pages on this website have clearly marked buttons to join, which all subjects located with relative ease.

COOP-join-sub

However, navigating from the home page was less straightforward. The only clear navigation element for joining the Co-op sits in an unexpected location, below a large button for a news item.

COOP-home-join

In the top navigation, the “Join” page is buried under “Ownership.” Subjects 1 and 3 didn’t see the button and went directly to the top navigation, scanning through the tabs to find a way to join. Subject 2 indicated that he was drawn more to the large button for a news item than to the “Join” button, but resisted the distraction and navigated to the correct page from the button.

Recommendations

  • For new users and customers, one of the primary functions of this website is to increase membership in the Co-op. Yet, from the home page, the only clear navigation element to “Join” is located below a button to a news item that may distract the user. The addition of another button in a more prominent and “expected” location, such as in the header, would improve visibility.

COOP-HOME-join-suggest

2. Are you able to find the cost for membership and figure out what benefits are included with membership?

Findings

What does a membership cost?

All subjects were able to discover the cost of membership ($100) rather quickly, but benefits that come from the membership were more difficult to decipher.

All three subjects found the membership cost in one click or less; two came across the information before the task had begun.

COOP-member cost

What are the benefits of becoming a member?

Finding a good reason to become a member was not as clear for the test subjects. The only clear navigation element leading to information on “Benefits” that come with joining is under the “Ownership” tab in the top navigation. None of the test subjects initially followed this navigation to find the benefits. Only Subject 1 found this link in the navigation, after scanning through the “Join” and “About Us” pages first.

COOP_bennies-nav

There is only one other mention of benefits suggested by a small link on the “Join” page entitled, “What is the return on my investment?” This is how Subject 2 found the “Benefits” page.

COOP_bennies-ROI

Subject 3 found the “Co-ops 101” page while searching for benefits and at first thought the benefits were embedded in this content; after scanning it, she backtracked to the “About Us” page and then back to “Join”, but was unsuccessful.

Both of the subjects who did eventually find the “Benefits” page commented while scanning it that the benefits described didn’t relate to them directly until more than halfway down the page. The content is presented as a “Principles” statement and prioritizes community benefit over consumer gain. Organizing the content in this way required more work from the subjects to fully complete the task.

Recommendations

  • Finding the cost of membership (and the “Join” page) was among the easier tasks for the test subjects to accomplish. Therefore, it makes sense to capitalize on the ease of finding this content in order to better promote member benefits. The addition of a “Member Benefits” button to the “Join” page would be a good start toward making this information more accessible, both with graphical emphasis and clearer language.

COOP-bennies button

  • The content on the benefits page is not easily digestible for a new user, especially if the concept of a co-op is new to them. Culling the information that relates directly to the consumer, like discounts, sales, and dividends, and presenting it clearly on the “Join” page along with links to “more benefits” would simplify information gathering for a new user.

3. Are you able to locate the address and hours for the Co-op location in Seattle?

Findings

From the home page subjects were able to navigate rather quickly using a clear, well placed link in the upper right corner of the page.

COOP-HOME_location
Clicking this link took them to a “Contact Us” page displaying basic information about grocery locations and hours. All three subjects paused and scanned the entire page from top to bottom before finding the Seattle location (the first item on the page).

COOP-CONTACT_city

Even though the information regarding store hours was listed just below the store name, all subjects hesitated when trying to locate them. Subjects 2 and 3 squinted at the page during this process. Subject 2 suggested that the text might be too small.

COOP-CONTACT_hours

Recommendations

Although the link “Hours, Location, Contact” is prominently placed and users found it quickly, the “Contact Us” page it navigates to could use improvement. Based on the findings from this test, I would recommend the following changes (illustrated in mock-up beneath list):

  • Create new headers to clearly identify the “Grocery Locations” from “Co-op Events and Meeting Spaces.”
  • Reorganize layout in order to present the “Grocery Locations” side by side and higher on the page.
  • Emphasize both the city name and open hours by increasing font size and weight as well as adding additional space around them.

CONTACT PAGE REVAMP

4. The Co-op has started a new grocery delivery service. Are you able to find out whether your zip code is within their delivery area?

Findings

All three subjects were able to accomplish this task rather quickly.

Subject 1 didn’t recognize the graphic in the header referring to the delivery service as a clickable link and instead scanned through all of the top navigation menus trying to locate the information about the service.

COOP-Menu-nav-delivery

From the “order delivery” link pictured, Subject 1 arrived at an informational page detailing how the new service works, along with a list of neighborhoods where delivery is available. Just below this list, below the fold, is a somewhat unobtrusive link to a list of zip codes, which the subject found after scanning most of the page. The zip code link is in muted gray text with no underline, but has a rollover effect that darkens it to black and adds an underline.

COOP-Delivery-info-zips

After clicking this link a small popup appeared with a list of zip codes. This list gave the subject the information needed to complete this task.

COOP-zip-pop

Subjects 2 and 3 had a different experience completing this task. They navigated to the delivery service link from the home page.

COOP-HOME-Delivery

This link took them to a page administered by Instacart, the delivery service, that is outside the Co-op’s domain. Both of these subjects seemed momentarily unsure of where they had landed, because of the change in domain and noticeably different look and feel of the page.

COOP-Instacart

From this page, they were able to enter their own zip code into the form and receive a message indicating that they were within the delivery area.

COOP-Zip-form

However, Subject 3 had difficulty finding the confirmation message because it was not given enough emphasis. She also indicated that the message “Hooray, we offer Delivery in Seattle!” didn’t make her certain that her zip code was accepted.

COOP-Zip-form-response

Recommendations

Discovering whether this service is available in a customer’s area should require a relatively simple path. Having multiple navigation pathways to a list of zip codes is a plus; however, both pathways had elements that frustrated the subjects.

Improve the visibility of clickable items.

  • The path taken by Subject 1 suggests that the graphic in the header may not be recognizable as clickable to some users. Changing this graphic to a design that resembles a button, or adding a rollover effect with “Click to Order”, would help clarify that it is not just part of a large logo.

COOP-Rollover

  • The link to a list of zip codes on the informational page is located on the bottom half of the page and is styled with a muted gray color and no underline. Although the rollover effect compensates for this, restyling this link as a button located higher on the page would make it more convenient for users.

Improve experience for new users.

  • Subjects 2 and 3 both expressed some surprise when taken to a page outside of the Co-op’s domain. Adding a popup or landing page where users can select “Returning” or “New” would allow content to be better tailored to the type of user navigating to this service. Those “Returning” could go directly to the account login and shopping page, while those who are “New” could explore a signup page with the most popular questions answered at the top.
  • The path for users to verify that their location is within the delivery zone should be clear and direct. Subject 3 expressed that the form on the Instacart page seemed like a tactic to trick her into signing up before she was ready. Adding visual emphasis to the response to the zip code query would help relieve that pressure.
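A more direct zip-code check, hosted on the Co-op’s own informational page rather than behind the Instacart signup form, could be as simple as a membership test against the published list of zip codes. The sketch below is a hypothetical illustration; the function name, message wording, and zip codes are invented for the example, not taken from the actual site.

```javascript
// Hypothetical sketch: confirm a zip code against the delivery list with an
// explicit message. The zip codes below are examples, not the Co-op's real list.
const deliveryZips = new Set(["98101", "98102", "98112", "98122"]);

function checkDelivery(zip) {
  const cleaned = zip.trim();
  return deliveryZips.has(cleaned)
    ? `Yes! ${cleaned} is in our delivery area.`
    : `Sorry, we don't deliver to ${cleaned} yet.`;
}

console.log(checkDelivery("98122")); // prints "Yes! 98122 is in our delivery area."
```

Because the message names the exact zip code entered, a response like this would also address Subject 3’s uncertainty about whether her zip code was actually accepted.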

Next Steps

The recommendations below supplement those made throughout this summary.

Test and Re-test

I would recommend performing this test again, but with simpler tasks. In reviewing the results, I found that in several instances test subjects found information needed for a future task in advance, which may have changed the way they interacted with the content in subsequent tasks. Suggested revisions to the tasks are listed below (tasks 5 and 6 are unchanged from the original tasks 3 and 4):

  1. Are you able to locate information about what a co-operative business is?
  2. Are you able to find information about how to join the Central Co-op?
  3. Are you able to find the cost for membership?
  4. Are you able to find out what benefits are included with membership?
  5. Are you able to locate the address and hours for the Co-op location in Seattle?
  6. The Co-op has started a new grocery delivery service. Are you able to find out whether your zip code is within their delivery area?

Test Clarity of Content

I would also recommend testing centered on the general accessibility of content. Much of the content is written for those already acquainted with this type of business. Jargon in some of the content and navigation elements may prevent the uninitiated from learning what they need to know to make an informed decision about membership.