Mobile usability test (Week 4)

This week, I conducted further usability tests on the same site (Seattle University's College of Education website), with the same test script, test scenario, and tasks as last week, but this time on a mobile device (an iPhone 6s) with three new participants. I again recorded the tests using QuickTime Player on a MacBook Pro, connecting the iPhone to the MacBook with a USB cable.

Mobile test participants

  • #1: 36-year-old male; heavy web user (8-10 hours a day); Seattle University employee (so familiar with the website generally) but had never visited the College of Education website before.
  • #2: 38-year-old female; medium web user (3-4 hours a day, by her own generous estimate); not a smartphone user; full-time graduate student at Seattle U (so familiar with the website generally) but had never visited the College of Education website before.
  • #3: 44-year-old male; medium web user (~4 hours a day); not an iPhone user; research assistant at Seattle U (so familiar with the website generally) but had never visited the College of Education website before.

The test scenario and the tasks

For the purposes of this usability test, I asked users to pretend they were all prospective graduate students looking for more information on the Master in Teaching (MIT) program at Seattle University.

Task 1: Find out the date of the next MIT information session.

Task 2: Find out the next deadline for applying to the MIT program.

Task 3: Find out what the admissions requirements are for the MIT program.

Task 4: Find out who to contact if you have additional questions about applying to the MIT program.

Test results

PARTICIPANT 1

Task 1: Participant 1 navigated from the College of Education home page to the "Graduate Degrees" page by clicking a black flag button link in the primary content section (rather than using the menu), scrolled down to find the right program, clicked the black flag button for the MIT program, and then scrolled down to the "Upcoming information sessions" section toward the bottom of the page. He noticed two concerning things in this section: 1) the "upcoming" sessions still listed past dates, and 2) a typo, "Novermber." He completed the first task in about 30 seconds.

[Screenshot: mobile test, Participant 1, Task 1]

Task 2: Participant 1 first clicked the "Apply Now" call-to-action button, but when he realized he had left the MIT program website for the Graduate Admissions site, he immediately hit the back button. He then scrolled to the top of the Master in Teaching home page, used the navigation menu to click "Apply" and then "Admissions Requirements," and scrolled down to find the application deadline in the data table just below the heading (though he was irritated that the table did not fit nicely on the mobile screen). Again, it took Participant 1 only about 30 seconds to complete the task.

[Screenshot: mobile test, Participant 1, Task 2]

Task 3: As he was already on the Admissions Requirements page, it took Participant 1 only about 5 seconds to scroll down and find the list of admissions requirements.

Task 4: From the Admissions Requirements page, he scrolled back to the top and opened the menu, first clicking on "FAQs." After scanning the FAQ page without finding contact information, he went back to the MIT home page and clicked the "More information" call-to-action button, which took him to the Graduate Admissions "Request information" form and frustrated him. He then scrolled to the bottom of the page, found contact information for Graduate Admissions in the footer, and said that is who he would contact. This took him about a minute. While he never found contact information for the MIT program itself, contacting Graduate Admissions would also work, if by a more circuitous route.

PARTICIPANT 2

Task 1: Participant 2 scrolled down to the bottom of the College of Education home page and found the "Information Sessions" section, completing task 1 in about 10 seconds (though she pointed out that the list of info sessions still included dates that had already passed).

Task 2: Participant 2 followed a route similar to Participant 1's: she scrolled up and clicked the "Graduate Programs" button, then the "MIT Program" button, all without opening the menu. She scanned the MIT home page, took a detour to Graduate Admissions by clicking "Apply Now," scanned that page, saw no dates, and hit the back button to return to the MIT page. She then tried the "Request Information" button, which took her back to Graduate Admissions and brought her to the web form. The form did not load immediately on the phone, so it initially looked like a blank page, which confused her, and she clicked back before the form had loaded.

[Screenshot: mobile test, Participant 2, Task 2]

When she returned to the MIT home page, she finally accessed the menu and selected the Apply page, and then the Admissions Requirements page. It took her about two and a half minutes to complete task 2.

Task 3: She completed task 3 in about 15 seconds by slowly scrolling down the “Admissions Requirements” page and finding the list there.

Task 4: Participant 2 then completed task 4 in about 20 seconds by returning to the "Apply" page and scrolling to the bottom, where she stated she would contact the program at mit@seattleu.edu.

PARTICIPANT 3

Task 1: Participant 3 completed task 1 in about 10 seconds, quickly and easily scrolling down to the “Information session” section on the College of Education home page.

Task 2: Participant 3 seemed especially perplexed by task 2 and navigated through several pages scanning for deadline information without finding it. From the College of Education home page, he opened the menu and clicked "Graduate Degrees," then "Master in Teaching." Seeing no deadline information, he clicked the "Apply now" call-to-action button and ended up in Graduate Admissions; seeing no deadlines there either, he clicked the "Request information" call-to-action button and got a blank page for a while, until the form loaded. He clicked back to the MIT program page, opened the menu, and selected "FAQ," which again presented no information about application deadlines. He clicked the back button once more, returned to the menu, and this time selected "Degree Options." By this point he was very frustrated, said he was giving up on surfing for the information, and turned to the search bar. That frustrated him further: the main Seattle U navigation menu is so long on a mobile device that he didn't see the search bar at all when he first opened the menu and had to scroll down to finally find it.

[Screenshot: mobile test, Participant 3, Task 2]

As soon as he found the search bar and typed in "Master in Teaching deadline," the first search result was the "Application requirements" page on the MIT website. It took him seven and a half minutes to complete task 2.

Task 3: After his frustration with the second task, task 3 was much more straightforward: he completed it in about 10 seconds by scrolling further down the "Application requirements" page.

Task 4: Task 4 was equally quick (about 10 seconds) for Participant 3, as he scrolled down to the footer and declared he would contact “mit@seattleu.edu.”

Analysis and next steps

It is very interesting to compare and contrast the results of these usability tests on a mobile device versus a laptop computer. It seems these sites were not designed "mobile first." That didn't necessarily hinder participants' ability to complete the tasks, with one exception: the mobile participants used the menu less, because on the mobile site the menu is hidden and must be tapped to open, whereas the participants who did the test on the laptop had much quicker access to it. But the fact that content didn't always fit on the screen did bother the participants to varying degrees.

Participant 1 pointed out several places where it was clear the pages he was navigating were not intended to be viewed on a mobile device, and he found this especially annoying while completing the tasks: the jumbotron picture on the College of Education home page didn't fit the mobile screen, and neither did the embedded YouTube videos or the tables. He also remarked that the heading font size seemed too large for the screen while the body text seemed too small, and that the low contrast of the yellow headings on the white background was especially difficult on the mobile site.

[Screenshot: mobile layout issues noted by Participant 1]

Participant 2 spent some time remarking on how strange it was that the "Info sessions" information was so far down the page; she thought it was important enough to be closer to the top. (On the laptop's wider screen, those content boxes appear in the right-hand column near the top, but at the narrower screen size they are pushed to the bottom of the page, forcing mobile users to scroll much further to find the information.) Participants 1 and 3 both had difficulty tapping links on the iPhone, holding a finger down too long and inadvertently activating the zoom feature. Participant 3 was also sometimes unsure what was a link or otherwise clickable on the mobile site, and stated several times that he preferred to use the Seattle U website on a device with a larger screen.

I would recommend that Seattle U do more mobile usability testing in general, to determine whether there are fixes to the design template that would make the conversion to mobile sizes easier on users.

 

Just add analytics (Week 7)

So for this week's project, I decided to dip into the analytics data for Seattle University's Center for Faculty Development website. I am currently the administrative assistant for the Center for Faculty Development, and our website is a very important marketing and communication tool for reaching and supporting our users: Seattle University faculty. I was curious to see some actual data about how our site is being used.

Luckily, the university's Marketing and Communications division already has Google Analytics tracking all of our webpages, so with a quick request to MarCom, I was granted access to Seattle U's Google Analytics account and could dig into the data.

QUESTIONS AND HYPOTHESES

But first, I had to decide what I wanted to find out about our website that the analytics data might help shed some light on. I came up with some basic, general questions:

  • What parts of our website are the most popular?
  • When do we get the most traffic on our site? (in terms of the academic quarter)
  • Are our users primarily local/internal to Seattle U? Are we getting much traffic from outside Seattle U?

Without looking at any of the data yet, here are some of the things I’ve hypothesized:

  • In the past year, aside from the home page, the “Current Events” and “Faculty Resources” pages are the most popular parts of our site.
  • In the past year, we get the most traffic at the beginnings of the academic quarters (late September, early January, late March) when we send out our quarterly event announcement email to all faculty.
  • In the past year, the majority of our website users were local.

PRELIMINARY FINDINGS

[Screenshot: Google Analytics report for the Center for Faculty Development website]

In the past year, the Center for Faculty Development's website had over 11,000 pageviews.

The most popular page, with 21% of those pageviews, was the home page, as I was expecting. And my first hypothesis was correct: the "Faculty Resources" page was the second most popular, with just under 12% of the pageviews, and the "Current Events" page was third, with just over 7%.

The second hypothesis had some validity for sure. Some of the highest spikes in pageviews over the past year correspond with the dates we sent our quarterly event announcement email to all faculty (big spikes on January 5 and March 30). The fall announcement on September 22 didn't generate as high a spike as the earlier emails, but the call for applications for the new Associate Director for Learning and Teaching position, which went out to all full-time faculty on September 27, generated a very large spike in traffic. So I'd hypothesize that our email communication to faculty is what drives traffic to our site, rather than the time of the quarter per se.

[Screenshot: Google Analytics traffic report for the Faculty Development website]
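If I wanted to check this email-driven-traffic hypothesis more systematically, I could line up daily pageviews against our send dates. Below is a minimal sketch in Python, assuming a CSV export of daily pageviews from Google Analytics; the filename, the column names, and the years on the send dates are my assumptions for illustration, not taken from the actual report.

    # Sketch: flag daily pageview spikes and check them against known email
    # send dates. Assumes a Google Analytics CSV export with "Date" (YYYYMMDD)
    # and "Pageviews" columns; adjust names and years to match the real export.
    import pandas as pd

    EMAIL_SEND_DATES = ["2016-01-05", "2016-03-30", "2016-09-22", "2016-09-27"]

    df = pd.read_csv("cfd_analytics_export.csv")
    df["Date"] = pd.to_datetime(df["Date"], format="%Y%m%d")
    df = df.sort_values("Date").set_index("Date")

    # Call a day a "spike" if pageviews exceed the mean by 2 standard deviations.
    threshold = df["Pageviews"].mean() + 2 * df["Pageviews"].std()
    spikes = df[df["Pageviews"] > threshold]

    # Report each spike, noting whether it falls within 2 days of an email send.
    sends = pd.to_datetime(pd.Series(EMAIL_SEND_DATES))
    for day, row in spikes.iterrows():
        near_send = bool((abs(sends - day) <= pd.Timedelta(days=2)).any())
        note = "  <- near an email send" if near_send else ""
        print(f"{day.date()}: {int(row['Pageviews'])} pageviews{note}")

If most flagged spikes land next to send dates, that would support the idea that our emails, not the academic calendar itself, drive the traffic.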

The third hypothesis also held: the majority of our users are definitely coming from the Seattle area (and, I would hazard a guess, from within Seattle U), but we are generating external traffic as well; for example, a number of sessions originating in Chicago show up on our Resources page.

There is so much to learn and so many interesting metrics to explore in Google Analytics, and I am eager to dig in further.

Video with subtitles (Week 6)

Description

This video is from Seattle University's Center for Faculty Development (where I am currently the administrative assistant). It is the introductory part of a three-part video series called "Teaching Tales" that the Center for Faculty Development created last spring (May 2016) in honor of an event called "The Year of the Teacher." The series features clips of interviews with ten different Seattle University faculty from around campus talking about different aspects of teaching. The first of the three parts is a short video in which each of the ten interviewees introduces themselves: their name, where they are located within the university, and what they teach.

Process

As I have some experience doing transcription, and I’m quite familiar with the video footage from these interviews, the transcription process was an absolute breeze for me once I got used to the YouTube transcription interface (and the feature where the video stops playing when you’re typing in the transcription field).

The most difficult part was waiting for YouTube's auto-sync to finish processing, and once it was done, I was astonished at how well it turned out. Overall, the process was so easy, and the YouTube "Transcribe and auto-sync" tools so straightforward, that I'm really astonished more YouTube videos don't have captions; it takes so little effort to make a big difference in media accessibility. After seeing how incredibly easy it was, I am eager to add subtitles to all of the other videos in the Center for Faculty Development's video resource library.
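For videos hosted somewhere without a tool like YouTube's, it's worth knowing that a caption file is just plain text you can generate yourself. Here is a minimal Python sketch that writes a WebVTT subtitle file; the timings, cue text, and output filename are made up for illustration, since YouTube's auto-sync produced the real ones for me.

    # Sketch: write a minimal WebVTT caption file by hand. The cue timings and
    # text below are invented for illustration, not the actual video's captions.
    cues = [
        ("00:00:00.000", "00:00:03.500", "Hi, I'm a faculty member at Seattle University."),
        ("00:00:03.500", "00:00:07.000", "I teach in the College of Education."),
    ]

    with open("teaching_tales_intro.vtt", "w", encoding="utf-8") as f:
        f.write("WEBVTT\n\n")  # required file header, followed by a blank line
        for i, (start, end, text) in enumerate(cues, 1):
            f.write(f"{i}\n")                          # optional cue identifier
            f.write(f"{start} --> {end}\n{text}\n\n")  # timing line, then cue text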

Evaluating accessibility (Week 5)

This week, I conducted an accessibility evaluation of Seattle University’s New Faculty Institute website: http://www.seattleu.edu/nfi. I am currently responsible for the maintenance of this website, as the administrative assistant supporting this program.

Site’s purpose and intended audience

This site is a resource for new full-time faculty who are invited to participate in the New Faculty Institute (NFI), an annual two-day intensive orientation workshop held in early September. It introduces the event itself, including the program and a calendar of follow-up events held in the fall, and provides new faculty with a list of resources useful for getting started at SU.

My supervisor and I were discussing the fact that if and when a new faculty member with disabilities attends NFI, our materials, including our website, aren't especially accessible to individuals with various disabilities, so we aren't currently equipped to fully support new faculty with disabilities through the NFI program. That is why I decided to conduct an accessibility evaluation of the NFI website.

Evaluation tools

I opted to use WebAIM's Web Accessibility Evaluation tool (WAVE), which checks for compliance with WCAG 2.0 and Section 508.

Results

  • One issue that WAVE flagged on the NFI website was a contrast error (very low contrast) between the yellow heading text and the white background. Contrast between the text color and the background color is important for all users, but especially for users with low vision. (A sketch of how the contrast ratio behind this error is computed follows the screenshot below.)

[Screenshot: WAVE report of the NFI site, contrast error identified]
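Since the exact colors live in the site's stylesheet, here is a sketch of how the WCAG 2.0 contrast ratio is computed; the hex values are placeholders standing in for the actual heading yellow and background white, not the site's real palette.

    # Sketch: compute the WCAG 2.0 contrast ratio between two colors.
    # The hex values used below are placeholders, not the site's actual palette.
    def relative_luminance(hex_color: str) -> float:
        """Relative luminance per WCAG 2.0, from an 'RRGGBB' hex string."""
        channels = []
        for i in (0, 2, 4):
            c = int(hex_color.lstrip("#")[i:i + 2], 16) / 255.0
            # Linearize the sRGB channel value before weighting.
            channels.append(c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4)
        r, g, b = channels
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    def contrast_ratio(color_a: str, color_b: str) -> float:
        """Contrast ratio (lighter + 0.05) / (darker + 0.05), from 1:1 to 21:1."""
        lighter, darker = sorted(
            (relative_luminance(color_a), relative_luminance(color_b)), reverse=True)
        return (lighter + 0.05) / (darker + 0.05)

    ratio = contrast_ratio("#F2C75C", "#FFFFFF")  # placeholder yellow on white
    print(f"{ratio:.2f}:1 (WCAG AA requires 4.5:1 for normal-size text)")

A light yellow on white typically comes out well under the 4.5:1 AA threshold, which is exactly what WAVE is flagging.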

  • Another issue that WAVE flagged is that there are images on the page with "null or empty alternative text," which means that if an image is conveying content, that content is not being conveyed to users of screen readers.

[Screenshot: WAVE report of the NFI site, images with no alternative description]

  • Another issue WAVE discovered is linked images on the page that are missing alternative text. When an image is the only thing inside a link, it needs descriptive alternative text; otherwise the link appears empty, and a screen reader has no content to present to the user.

[Screenshot: WAVE report of the NFI site, linked image missing alt text]

  • In a similar vein, the WAVE report discovered several "empty" links. Viewed with styles on, these links hold social media icons, but the icons don't display as either image or text, so WAVE flags the links as empty, and empty links confuse users of screen readers.

[Screenshot: WAVE report of the New Faculty Institute site, empty links flagged]

Recommendations

  • One recommendation to fix the contrast error would be to change the yellow headings to a darker color with higher contrast against the white background.
  • We can determine whether the images are indeed conveying important content, and if they are, add an alternative (alt) description so that information can be conveyed to screen readers. But if we determine an image is not conveying important content (or that the content is conveyed another way, through a caption, for example) and can be ignored by screen readers, then we can make the conscious decision to leave the alt text empty.
  • I would also recommend that for the image links and the social media icon links in the footer (and other so-called "empty" links), we provide text or an alternative description to convey meaning to users of screen readers. A rough way to self-audit these last two issues across our pages is sketched after this list.
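To triage the alt-text and empty-link issues across our other pages without running each one through WAVE by hand, a short script can approximate those two checks. This is only a sketch, assuming the common requests and BeautifulSoup libraries are available; it is no substitute for a full WAVE evaluation.

    # Sketch: rough first pass at two of WAVE's checks -- images missing alt
    # text and links with no readable content. Not a substitute for WAVE.
    import requests
    from bs4 import BeautifulSoup

    url = "http://www.seattleu.edu/nfi"  # the NFI site evaluated above
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    for img in soup.find_all("img"):
        if not img.get("alt"):  # alt attribute missing entirely, or present but empty
            print(f"Image with null/empty alt text: {img.get('src')}")

    for link in soup.find_all("a"):
        # A link is "empty" if it contains no text and no image with alt text,
        # which is roughly what WAVE flagged for our social media icon links.
        has_text = bool(link.get_text(strip=True))
        has_img_alt = any(i.get("alt") for i in link.find_all("img"))
        if not has_text and not has_img_alt:
            print(f"Empty link: {link.get('href')}")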

Usability testing (Week 3)

For our assignment this week, we were asked to conduct a set of quick usability tests after reading chapter 9 ("Usability testing on 10 cents a day") in Steve Krug's Don't Make Me Think. Using Krug's test script as a template, I developed a test script, with a test scenario and four short tasks, for three participants, starting on Seattle University's College of Education website (http://www.seattleu.edu/education/).

[Screenshot: Seattle U College of Education home page]

The test scenario and the tasks:

For the purposes of this usability test, I asked users to pretend they were all prospective graduate students looking for more information on the Master in Teaching (MIT) program at Seattle University.

Task 1: Find out the date of the next MIT information session.

Task 2: Find out the next deadline for applying to the MIT program.

Task 3: Find out what the admissions requirements are for the MIT program.

Task 4: Find out who to contact if you have additional questions about applying to the MIT program.

Participants

  • #1: 31-year-old female, self-described tech-savvy, heavy web user (about 8-10 hours per day online). She already has a Master's degree (so is well educated). As a Seattle University staff member, she had been to the site before, but had never looked at it through the lens of a prospective student.
  • #2: 27-year-old male, also a fairly heavy web user (about 5 hours a day online). A current Seattle University graduate student (though in a different college). He had never been to the College of Education's site before (though, as a current student, he is familiar with Seattle University's website in general).
  • #3: 29-year-old female, graduate student, another heavy web user (~10 hours per day). She had never been to the COE site before, but was already familiar with SeattleU's website generally.

Methods

Tests were all conducted in person, with all participants using the same device (a MacBook Pro). Tests were recorded using the screen recording function in QuickTime Player, which also captured audio through the MacBook's internal microphone.

Test results

PARTICIPANT 1:

Task 1: While Participant 1 did scroll to the bottom of the home page, she was looking primarily at the center column and did not see that the answer to the first task was on the home page, in the right-hand column partway down the page. Because she didn't notice the "Information Sessions" section on the home page, she navigated away from it, first clicking "Graduate Degrees" (the fourth item in the navigation menu along the left side of the page) and then, still looking only in that left-hand menu, scanning for "Master in Teaching" (the sixth item in the Graduate Degrees submenu) and clicking it. On the "Master in Teaching" page, she immediately spotted the "Upcoming information sessions" section toward the top of the right-hand column. She completed task 1 in under a minute, but had she spotted the information on the home page, she could have completed it in seconds.

Task 2: From the "Master in Teaching" program home page, Participant #1 scanned for the word "deadline" and anything resembling dates. Determining there were no application deadlines listed on the page, she turned to the menu and clicked "Apply," the second item in the navigation menu. On the "Apply" sub-menu she scanned for "deadlines," didn't see anything resembling what she was looking for, and opted to click the "Apply Now" button at the top of the right-hand column. That button took her off the College of Education site to the Graduate Admissions website's "Apply" page (she said she recognized that she had left the site she was on and entered a totally different one). She scrolled down, scanning for dates, and seeing only general application requirements, clicked another "Apply Now" button at the top of the right-hand column. Here she said she was getting frustrated: instead of the information she was looking for, she reached a log-in screen for the Seattle University application system, and since she had no log-in and didn't want to create one, she clicked the back button to return to the Graduate Admissions "Apply" page. Rather than scanning further for deadlines there, she immediately clicked back again to the Master in Teaching "Apply" page. Remembering that she had already scanned that page's center column without finding the deadline, she turned to the left-hand navigation menu and selected "Admissions Requirements," the first item in the "Apply" sub-menu. There she completed task 2, finding the December 1 application deadline at the top of the middle column in a data table (though she misread the table, reading across the row rather than down the column, and stated the deadline was December 1 for a February 1 start date). It took her only about two and a half minutes to complete the task, but not without becoming frustrated and reaching (in her words) "a dead end."

[Screenshot: Participant 1, Task 2]

Task 3: From the “Admissions Requirements” page, it took her less than 30 seconds to complete task 3, as all she needed to do was continue to scroll further down the page in order to find all of the admissions requirements listed.

[Screenshot: Participant 1, Task 3]

Task 4: Still on the "Admissions Requirements" page, she scrolled up and down again, scanning for contact information without finding it, then returned to the left-hand navigation menu looking for something related to people. She paused briefly on the "Frequently Asked Questions" link while scrolling but did not click it, deciding instead to click the "Faculty and Staff" menu item at the bottom of the menu. On the "Faculty and Staff" page, she scanned and scrolled down the list of MIT program faculty until she reached the bottom and found the name and contact information of the administrative assistant, Jessica Fenner. She said she would contact Jessica, believing an administrative assistant could probably answer questions about admissions better than the faculty could. It took her just over a minute to accomplish task 4.

[Screenshot: Participant 1, Task 4]

PARTICIPANT 2:

Task 1: Participant #2 completed task 1 almost immediately (maybe 10 seconds): he found the "Information Sessions" section in the right-hand column after scrolling a little way down the page. After asking me for the day's date, he was able to find the date of the next information session straight away.

[Screenshot: Participant 2, Task 1]

Task 2: Still on the College of Education home page, Participant #2 scanned the navigation menu down the left-hand column and quickly clicked the first menu item, "Admissions." Without scrolling further down the page, he scanned the menu and selected "Application Deadlines," the third item in the College of Education's "Admissions" sub-menu. On the "Application Deadlines" page, he slowed down as he scrolled through a table full of data, including a long list of program titles, looking for "Master in Teaching," the second-to-last item in the table's first column. Once he found the right program, he quickly scrolled back to the top of the table to check the column headings, then scrolled back down to the MIT row and said the deadline was December 1 to start in spring quarter. It took him about two minutes to complete task 2.

Task 3: He scrolled to the top of the "Application Deadlines" page to get back to the left-hand menu and almost immediately clicked "Application Requirements" (the menu item just above the currently selected one). He found a short list of application requirements at the top of the page (slightly different from the "admissions requirements" I had asked for in the task), but kept scrolling down and found another table of additional program requirements. Again he scrolled down the page scanning for the Master in Teaching program, and once he found it, he again had to scroll back to the top of the page to check the column headings, just as he had in task 2. It took him about a minute to accomplish task 3.

Task 4: From the "Application Requirements" page, he scrolled to the top, where near the top of the right-hand column was a "More Information" link, which he clicked. It took him to a "Request Information" web form on the Graduate Admissions site, and he said he would fill out and submit the form and let the MIT program contact him, rather than seeking out a person himself. Participant #2 completed task 4 in about 30 seconds.

[Screenshot: Graduate Admissions "Request Information" form]

PARTICIPANT 3:

Task 1: Participant #3 completed task 1 almost immediately (about 5-10 seconds) because she found the "Information Sessions" section in the right-hand column after scrolling a little way down the page, though she remarked that it was vexing that the list was not current (out-of-date information was still listed).

Task 2: Participant #3 had virtually the same experience as Participant #2 and successfully completed task 2 in less than a minute. She encountered the same difficulty as Participant #2: after scrolling far down the table to find the MIT program in the list of graduate programs, she had to scroll back to the top of the page solely to refresh her memory of the column headings. She remarked that she expected the programs to be listed with the most popular ones toward the top, so she wouldn't have to scroll so far down to find the one she was looking for.

[Screenshot: application requirements table]

Task 3: Participant #3 again took the same route as Participant #2, clicking the "Application Requirements" sub-menu item and scrolling down the page until she found the MIT program listed in that table as well. It took her about 30 seconds to complete task 3, though she found the admissions requirements listed in the table a bit simplistic and not detailed enough, so she clicked the link to the MIT program site; when it did not immediately elaborate on the requirements, she clicked the back button.

Task 4: This task seemed to take Participant #3 longer than everything else so far. She first scanned the menu for anything like "contact us," and not finding it, clicked "Apply Now" (the top menu item), which took her to the Graduate Admissions page. Seeing no contact information, she selected "FAQs," scanned them, and determined they were all general information about applying, with nothing about the program she was interested in. She clicked the back button and selected "Request Info," but was frustrated when it took her to a web form on the Graduate Admissions website, so she went back to the menu and clicked "Contact" on the Graduate Admissions site, which gave her contact information for Graduate Admissions, not the MIT program specifically. Only then did she realize she was no longer on the program site she was interested in; she exclaimed in frustration and hit the back button twice until she returned to the "Application Requirements" page on the College of Education site. She went to the left-hand navigation menu and selected "Frequently Asked Questions," wondering aloud, "Did I already look at this? No, this one is different." On that page she found the contact information she said she would use: coeinfo@seattleu.edu. Task 4 took her over three minutes, longer than all the other tasks combined.

[Screenshot: Participant 3, Task 4]

Analysis and next steps

A caveat before I start with the analysis and recommendations: all three of my participants were in roughly the same age bracket, well educated, moderate to heavy web users, and already affiliated with Seattle University in some way, with varying degrees of familiarity with the Seattle University website in general (if not the College of Education website specifically). These tasks were very probably easier and faster for them than they would have been for a random user, so I would definitely recommend more usability tests with a greater range of users before making definitive changes.

My three participants all managed to complete the first three tasks quickly and with relative ease, though Participant #1 missed the information sessions content in the right-hand column of the home page. That makes me think important information perhaps should not be placed in the right column, since users tend to scan in an F-shaped pattern, left to right and top to bottom, and might miss it. On the other hand, two out of three participants did see it, so before making any final recommendation I would run more tests and gather more data.

The fourth task was the trickiest for all three participants, and none of the three picked the same contact information. Though technically all three options would eventually get users the answers they sought, they could take varying amounts of time, which could frustrate users who don't reach the right person immediately. I would recommend that the COE figure out how to send a more obvious and consistent message to users about the best person or best way to contact someone with questions about applying to the program.

Participants 2 and 3 both experienced some difficulty with the length of the data tables: they had to scroll a long way to find the row they were looking for while remembering what the column headings were. Though not a very specific recommendation, I would suggest the College of Education revise the content of the tables to shorten them on the page, so it's easier for users to extract information from them successfully.

The thing that seemed to cause the most frustration was that clicking the very prominent "Apply Now" call to action did not give participants the results they were looking for: it took them away from the College of Education site to the Graduate Admissions site, and it was not clear to them that they had switched to the site of an entirely different entity. They became confused trying to surf around the Graduate Admissions website because they didn't realize they had left the place they thought they were in. It reminds me of the time I got lost in two adjoining Las Vegas casinos: I started wandering, thinking I was still in my hotel, and inadvertently wandered into a completely different hotel; it was very jarring and confusing when I realized I wasn't where I thought I was. Minimally, I would recommend better visual cues to let users know they have wandered into a totally different location. More generally, I think the relationship between the graduate program websites and the Graduate Admissions website ought to be examined and further investigated to make the application process a bit smoother for prospective students.

Allabouthawaii.com revisited: Usable design analysis (Week 2)

I decided to revisit www.allabouthawaii.com and analyze a few different criteria, this time focusing on usable design. In Don't Make Me Think, Steve Krug says that the design of the home page and the navigation are both major factors in helping users decide "Do these guys know what they're doing?" And while the site conveys that they know a thing or two about Hawaii, I'm not sure it instills the kind of confidence they might wish to see in their users. But there are definite steps that can be taken to improve the usability of the site's design.

Is there clear, simple and consistent navigation?

On the plus side, the primary navigation is consistent and persistent throughout the whole site (it follows the user from the home page to all of the secondary pages and stays the same throughout), and the site very effectively uses tabs (one of Steve Krug's favorite design features), with a contrasting blue color creating a not-too-subtle you-are-here indicator on the active tab. This effectively gives the user a sense of the scope of the site, and always gives users a quick, easy way to get back "home."

[Screenshot: allabouthawaii.com primary navigation tabs]

On the negative side, the primary navigation does not stick to the content sections but includes utilities like "Search," "About Us," and "Agent Login," which, while important parts of the site, are not content and should be separated from the primary content hierarchy. Separating the utilities from the content would simplify the primary navigation immensely.

Additionally, while the primary navigation is consistent and persistent throughout the site, the secondary navigation is not. The secondary navigation runs in a blue column along the left side of the page, so when there is a secondary layer of content, it shows up in that blue column. But when the user clicks on a page with no secondary content in the hierarchy, instead of simply leaving the blue column in place, the site removes it completely. It's a very jarring design inconsistency for users to encounter.

[Screenshot: allabouthawaii.com secondary navigation column]

Can the user easily search for what they’re looking for (if they don’t want to browse)?

Well, kind of. Allabouthawaii.com does have a "search" function, but it defies web conventions in a couple of ways. First, the search is included as a tab in the content hierarchy rather than in a separate utilities section. Second, when the user clicks on the search tab, they encounter one of the strangest "search" utilities I've ever seen. You don't get a box where you can type whatever keywords you want; instead, you're presented with a list of "available keywords," an empty "selected keywords" box next to it, and arrow buttons between the two so the user can move pre-selected keywords into the search box.

[Screenshot: allabouthawaii.com search page]

So search-dominant users (as Jakob Nielsen would call them) can conceivably find the search (though not in the usual place, per web conventions), but once they get to the search page, they may well be flummoxed. Searching is not easy on allabouthawaii.com.

Does the home page convey “the big picture,” establish credibility and trust, without promotional overload?

[Screenshot: allabouthawaii.com home page]

In terms of usable design, the home page is a mixed bag. The first thing I noticed when I first visited allabouthawaii.com was how dated the visual design is, so much so that I immediately scrolled down to the footer looking for a recent copyright date, convinced there was no way the site had been created in the current decade (I was utterly shocked to see a 2016 copyright date). The site ID (the All About Hawaii logo) is pixelated and old-fashioned and in desperate need of a design overhaul. The site could use a serious visual makeover to help establish more credibility and trust with its users.

Secondly, I don't think the user can quickly and easily grasp the big picture from the home page. This is the site of a travel and tourism company, and its purpose is to help travel agents and travelers plan trips to Hawaii, including accommodations, activities, car rental, etc. I honestly don't think the site ID "All About Hawaii" conveys that; it would be more effective paired with a clear, informative tagline stressing travel planning to Hawaii. The welcome blurb is slightly more informative (despite never using the keyword "travel"), but users really have to dig deep and squint. Additionally, "All About Hawaii" isn't even the company's name: as I discovered on the "About Us" page, the site is operated by a travel and tourism company called "All About Travel." (Again, not a great way to establish trust and credibility with users.)

Lastly, the home page suffers from what Steve Krug refers to as "overgrazing of the commons," or more bluntly, promotional overload: other than the primary navigation and the welcome blurb, the home page is entirely advertisements, deals, and promotions. For a user in the very beginning stages of planning a trip to Hawaii, who doesn't even know which island to visit (as I was when I first visited; see my previous post), the promotional overload was incredibly overwhelming and even confusing.

[Screenshot: allabouthawaii.com home page advertisements]

One element that was especially confusing to me as a total newbie (new to the site, and new to Hawaii more generally) was a gallery of featured hotel properties toward the bottom of the home page. There were 19 photo thumbnails with different-colored banners stating the name of each property, and I just could not figure out why the banners were different colors. After combing through the site more extensively, I believe I figured it out: the site assigns each island a specific color on the sub-pages, so properties on Oahu have a yellow banner, properties on Maui have a pink banner, and so on. (To go back to my previous post, this was a big "question mark" for me as a new user that ought to be eliminated.)

Next steps

There are a number of possible next steps to improve the usable design of the page. Here are a few I think would help:

  • In the primary navigation, separate the utilities, like “Search,” “About Us,” and “Agent Login” from the primary content hierarchy in the navigation. It would simplify the primary navigation and give users a better sense of the actual scope of the content.
  • Make sure the secondary navigation (blue-colored column on the left-hand side of the page) elements are used consistently.
  • Replace the very strange current search function with a more conventional search box, to help eliminate the question marks (like "how the heck do I use this search tool?!") and aid search-dominant users in finding what they need.
  • Give the site, and the logo, a design overhaul so it doesn't look so dated and instead engenders more confidence, credibility, and trust in the company among users.
  • Try to scale back the sheer number of advertisements on the front page so as to avoid promotional overload — perhaps move the island-specific hotel advertisements onto the secondary pages about those specific islands.

Usability analysis of allabouthawaii.com (Week 1)

I recently got home from a vacation in Hawaii. When I was in the planning stages many months ago, having never been to Hawaii before, I decided to do a little research on the web about where I wanted to go and what I wanted to do. My travel agent sent me to www.allabouthawaii.com, the website of a travel company specializing in custom Hawaii vacations, including lodging, activities, car rental, etc. But honestly, it wasn't the best web experience I've ever had. I did eventually manage to extract the information I was looking for, but it took a while and it wasn't easy.

As a student in web development, when I was given an assignment to analyze the usability of a website (one where you could probably make suggestions to improve the usability), my brain went straight to allabouthawaii.com (and I'm pretty sure that's not the type of memorable experience they were going for).

[Screenshot: allabouthawaii.com home page]

Where should I begin?

Usability expert Steve Krug says that one of the basic principles of usability is to eliminate the question marks, and that one of the many things a user should not have to spend time thinking about is "Where should I begin?" But that was precisely the first thing I thought when I arrived at allabouthawaii.com.

When I first arrived at the website, I wasn't even sure what island I wanted to visit. There were two pretty clear calls to action ("Check availability and book online" in a red box in the top right corner, and a bright orange "Book Now" box in the left-hand column, about halfway down the page), but I wasn't ready to book yet, so I wasn't sure where to begin. I looked for a "getting started," "first-time visit," or "overview" section to help me decide, but couldn't see anything like that. I tried clicking on the "Travel Guide" tab, but it had information about "what to pack" and "security screening tips," and nothing about how to start planning a vacation.

Is it obvious what’s clickable?

It wasn't terribly obvious to me what was clickable on the home page. The navigation tabs obviously were, and both calls to action were clickable, but only the red box in the top right was obviously a button (it changed when I hovered over it with my mouse, and buttons should be clickable). For the other boxes, it was unclear whether they were supposed to be clickable or not. Are they just ads, or are they important information?

Is it scannable?

For the most part, there wasn't much text on the home page, so it was very scannable, with the exception of the four-line paragraph of what Steve Krug refers to as "happy talk": "It's the introductory text that's supposed to welcome us to the site and tell us how great it is"[1].

The sub-pages weren't anywhere near as scannable as the home page; in fact, the pages for each individual island and the "Travel Guide" page were walls of words.

Is it free of visual distractions?

Right in the middle of the page was a small white box with text scrolling by inside it. That in and of itself is distracting enough, but the size of the box also prevents the user from seeing the whole message at a glance. I had to sit and wait for the words to slowly scroll by.

And when I hovered my mouse over the box, the scrolling would stop, which is another distraction in itself. It was like, "Hey! I was reading that! Why did you stop??"

The general style was also distracting because it did not conform to a clear visual hierarchy. The page had very inconsistent use of color, font, and text size: at least three different shades of blue, along with yellow, orange, red, and two different shades of green. I could easily distinguish at least three different fonts and several different font sizes.

And the important things weren’t the largest, most prominent things on the page.

That's not to say the site was completely without merit. Even though the text on the sub-pages wasn't terribly scannable, there was a wealth of interesting information about the different islands, and, as advertised, the site helped me evaluate different lodging options and activities at a good price. I had a wonderful Hawaiian vacation, thanks in large part to allabouthawaii.com. But with a bit of updating and some usability analysis and testing, users' Hawaiian vacations could start off on an even better footing.

Next steps

Aside from a major design overhaul, if I were to offer some next steps for increasing the usability of the page, these are a few small changes I think would help:

  • Add a “Getting Started” tab with introductory/overview information for someone who is just beginning their vacation planning so that there is a clear place for users to start.
  • Add visual cues to the clickable boxes to differentiate them from what's not clickable; for example, add a shadow around the box to make it look 3D (among other simple options).
  • On the secondary pages, make the pages more scannable by revising the text using Krug’s advice: plenty of headings, keep paragraphs short, use bulleted lists and highlight key terms.
  • In order to minimize distractions, I would say either ditch the small box with scrolling text, or make the box bigger so that users can see the entire message in one glance rather than having to wait for the message to scroll by.
  • Revise the site so that it has more consistent use of color, font type, and font size, and a clearer visual hierarchy, especially making sure that the most important things on the page (the call-to-action buttons) are the largest in size.

[1] Krug, Steve. Don't Make Me Think, Revisited: A Common Sense Approach to Web Usability (Voices That Matter). Pearson Education, 2013. Kindle edition, locations 764-765.