
Lu Chung

Product designer @ NYC


Case Study #1: Remote Moderated User Test Report

Usability testing to evaluate the A.R.T. website for aspiring archivists in NYC.

OVERVIEW

Four usability professionals conducted remote moderated usability testing on UserZoom Go to evaluate and identify usability issues on the Archivists Round Table (A.R.T.) website. I handled three of the testing sessions, serving as moderator twice and as note-taker once, and I wrote the Methodology, Appendix, and References sections of our report.

I also served as the art director: I designed the report and slides and delivered all of the tables, charts, data visualizations, illustrations, and graphics in the project.

To provide comprehensive solutions that improve the overall experience of the site, I created homepage mockups that present the visual recommendations, along with high-fidelity prototypes for mobile and a prototype video that shows how the Archivists Round Table website could work in a way that is both aesthetic and functional.

OBJECTIVE

Moderated remote test report

ITEM

Graduate team project

IN CHARGE

Design of report and slides; mockups and prototypes for desktop and mobile; video

TOOLS

UserZoom Go, Google Docs, Google Slides, Sketch, InVision, Craft, Adobe Photoshop, Adobe Premiere Rush, Instagram Story/Reel, Unfold App

PERIOD

Sep 2020 - Dec 2020

BACKGROUND

The Archivists Round Table of Metropolitan New York, Inc. (A.R.T.) is an online platform that provides archiving-related information for aspiring archivists, such as events, publications, jobs, and networking opportunities.


Challenge

It is essential to create a clear hierarchy in the navigation menu and to group the sections on the homepage. These improvements help users have a seamless experience and a positive first impression when browsing the site.


Problem Statement

Aspiring archivists need an online platform to find archiving-related events, publications, and communities that share their interests, because they wish to get involved in the New York archiving community and access helpful resources.

EVALUATION PROCESS

A summary of the steps involved in a remote moderated usability test, as described in the Nielsen Norman Group article “Remote Moderated Usability Tests: How to Do Them,” is shown below:

Note: I handled three of the usability testing sessions, serving as moderator twice and as note-taker once.

What research questions did we address?

  1. What are users’ opinions of this website, and how do they feel when they come to it?
  2. Can they find the information they expect to get from this website?

What research goal did we want to achieve?

  • We decided to use an evaluative research method.
  • The goal is to help our clients understand how to iterate on and redesign their website.

With this research goal in mind, we came up with four usability tasks...


What research method did we conduct?

We decided to conduct remote usability testing, which, compared to in-person testing, is less expensive and less time-consuming. We also created screening, pre-test, and post-test questionnaires to learn our participants' profile information. During each session, every participant completed the four tasks below:


In the first task, we intended to collect users' attitudinal data with semi-structured questions.

  • First of all, we asked for their first impression of the website.

The second, third, and fourth tasks are behavioral research that addresses our second research question.

  • Then, in the second task, participants need to find an event.
  • In Task 3, they have to find and contact a member on the A.R.T. website.
  • In Task 4, participants need to find programs available to progress their careers.

What did I learn from each step?

With the usability testing plan prepared, we ran the tests on UserZoom Go. But the method also has limitations, such as:

  • The data is often not as rich as in in-person testing.
  • Participants tend to be less engaged.
  • The process is harder to control.

Our protocol let participants decide which device they wanted to use for the test. We prepared a script to follow during testing to make sure the process went smoothly. After participants agreed to take part, we emailed them consent forms to sign. Before the real testing, we also ran a pilot test to rehearse our usability testing.


From Client Kick-Off Meeting

Our clients are not familiar with usability testing, but they do want to improve the user experience of the A.R.T. website. They noted that the large amount of archiving-related information on the homepage easily confuses their users, and they asked whether we could provide mockups and prototypes to inspire and help them from a visual perspective.


From Pilot Testing

After the moderated remote testing plan was set, the moderators scheduled time to test the online testing tool. In this process, we made sure the facilitator and team were prepared and could run the test without problems. We identified issues to fix during the pilot test and made each task clear enough not to mislead participants. This step ensured a smoother test and more substantial results.


What did we learn from the participants?

Based on the clients' feedback, they hope to develop their website further to serve their members and encourage new ones to join. The research team segmented these insights into three user profiles: archiving professionals, history enthusiasts, and aspiring archivists. To define our participants and narrow the research scope, we targeted the aspiring-archivist audience.

After the usability testing, we collected participants' profile information from the screening, pre-test, and post-test questionnaires and came up with the following summary of our user profile:

  • Are they members of A.R.T.?
    • Only one participant is a member of the A.R.T. website.
  • What degree are they currently pursuing?
    • Six participants are majoring in Library and Information Science.
    • Two participants are pursuing a master's degree in Museum and Digital Culture.
    • One participant is pursuing a dual degree in History of Art and Design.
  • What do they use for networking?
    • There are many ways to network; the top three are LinkedIn, Facebook, and news websites.
    • Four participants said they use LinkedIn for networking, and two use Facebook.
  • Have they used this site before?
    • Four of nine participants are familiar with the A.R.T. website.
  • How often do they visit the library?
    • Six of nine participants visit one to three times per week or per month (three weekly, three monthly).

Evaluation Results

After we got feedback from our participants, we moved into the Miro board app, where we noted their profile information, pre- and post-test questionnaire answers, and input from the four tasks, then used the card sorting method to categorize each user. Next, we went to Google Sheets to group the usability findings, identify where they happen on the site, and rank their severity. Finally, we sorted the grouped results with a pivot table, calculating each issue's grand total score and its percentage at each severity level. (A rough sketch of that last step follows below.)
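As an illustration of that pivot-table step only, the sketch below rolls hypothetical finding records up into totals and severity counts; the `Finding` shape and field names are illustrative, not our actual spreadsheet columns.

```typescript
// Hypothetical finding records; the real data lived in a Google spreadsheet.
interface Finding {
  category: string;            // e.g. "Page layout", "Navigation menu"
  severity: 0 | 1 | 2 | 3 | 4; // 0 = no problem / positive, 4 = most severe
}

function summarize(findings: Finding[]): void {
  const grandTotal = findings.length;
  const byCategory = new Map<string, { total: number; bySeverity: number[] }>();

  for (const f of findings) {
    const row = byCategory.get(f.category) ?? { total: 0, bySeverity: [0, 0, 0, 0, 0] };
    row.total += 1;
    row.bySeverity[f.severity] += 1;
    byCategory.set(f.category, row);
  }

  // Print each category's grand total and its share of all issues,
  // mirroring the pivot-table columns described above.
  for (const [category, row] of byCategory) {
    const pctOfAll = ((100 * row.total) / grandTotal).toFixed(1);
    console.log(`${category}: ${row.total} issues (${pctOfAll}% of all), by severity ${row.bySeverity.join(" / ")}`);
  }
}
```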


Overall Results

Across the evaluation tasks, Task 3 (find and contact a member) was the most difficult, with an average difficulty score of 2.2. Task 4, "find programs available to progress your career," was the easiest, with an average score of 3.8. (See the charts below for detailed information.)


This bar chart visualizes the users’ behavioral data: the difficulty scores for each task.

How well did the product perform?

Based on the participants' feedback, I created the severity table in our report from the grouped usability findings. Approximately 35% of the issues are at a severity level we recommend prioritizing. For the attitudinal data from Task 1, which addresses our first research question, we collected 39 opinions that participants shared about the website's layout and appearance; 34% of these were positive or neutral feedback. (See the figures: Number of Issues by Severity and Severity Table in Summary.)

EVALUATION LEARNING

My Key Findings

According to the severity table in our report, the page layout category includes a grand total of 64 issues mentioned by our participants during usability testing. Although 39 of these were rated severity 0 (no problem or positive feedback), 16 were ranked severity 3, and four participants reported struggling with Task 1 on mobile devices. The layout problems did negatively impact users' experience of the A.R.T. website.

Note: The Nielsen Norman Group article “The Aesthetic-Usability Effect” points out that aesthetics can influence how users evaluate usability problems: “People tend to believe that things that look better will work better — even if they aren’t actually more effective or efficient.”

My Recommendations

Therefore, I decided to address the page layout issues using design principles and the Web Content Accessibility Guidelines (WCAG). Keeping the existing elements and primary color of the current A.R.T. website consistent, I extracted the website's contents and redesigned the information architecture. Click the buttons to view the high-fidelity mockups and their rapid prototypes.

VIEW DESKTOP VIEW MOBILE

What is my point of view on the recommended fixes?

Overall Layout

  • Group and hierarchize the content - apply different background colors, titles, and pictures to categorize each section.
  • Keep the navigation menu fixed on top as the user scrolls down the page (see the sketch below).
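As a minimal sketch of that fixed-navigation behavior (the class names are placeholders, not taken from the A.R.T. site), the menu can simply toggle a fixed style once the page scrolls past it; `position: sticky` in CSS achieves the same effect without any script.

```typescript
// Keep the navigation menu pinned to the top once the user scrolls past it.
// ".site-nav" and "is-fixed" are placeholder names for this sketch.
const nav = document.querySelector<HTMLElement>(".site-nav");

if (nav) {
  const threshold = nav.offsetTop;
  window.addEventListener("scroll", () => {
    nav.classList.toggle("is-fixed", window.scrollY > threshold);
  });
}
```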

Long Content

  • Lighten the content with a carousel - keep only the latest news on the homepage and provide CTA buttons that lead to the detail pages.

Footer

  • Chevron buttons - fold the detail links to lighten the content; users click a button to unfold each section's menu.
  • Back-to-top button - provide a shortcut so users can jump back to the top quickly without scrolling through the long content again. (See the sketch below.)
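A minimal sketch of the folded footer and back-to-top behavior described above; the selectors are hypothetical and would depend on the final markup.

```typescript
// Chevron buttons fold and unfold each footer section (selectors are placeholders).
document.querySelectorAll<HTMLButtonElement>(".footer-chevron").forEach((button) => {
  button.addEventListener("click", () => {
    const section = button.nextElementSibling as HTMLElement | null;
    if (!section) return;
    const willOpen = section.hasAttribute("hidden");
    section.toggleAttribute("hidden", !willOpen);
    button.setAttribute("aria-expanded", String(willOpen));
  });
});

// Back-to-top button: jump to the top without scrolling through the long content again.
document.querySelector(".back-to-top")?.addEventListener("click", () => {
  window.scrollTo({ top: 0, behavior: "smooth" });
});
```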



Navigation Menu

  • The first layer - prioritize the key buttons: the search icon and the menu icon.
  • The second layer - show all the options on one page so users can see the whole site map without scrolling.
  • Login button in text style - a plain text button, rather than an avatar that shows login and logout status, is better for the first development stage. In our evaluation, only one of nine participants was an A.R.T. member, and most users do not log in to an account when they visit the website.
  • VIEW PROTOTYPING VIDEO

What design work did I deliver in this project?


Design of Report & Slides

I served as the art director for this report. My work includes designing the 41-page report and the 44-page slide deck, keeping the visual identity and branding of the A.R.T. website consistent across our document layout and the final presentation.

VIEW REPORT VIEW SLIDES


Tables, Charts & Data Visualization

To focus on details and aesthetics, I delivered all the tables, charts, data visualizations, illustrations, and graphics in our project following a consistent design guideline.



Illustrations, Graphics, Mockups & Prototypes

To provide comprehensive solutions that improve the overall experience of the site, I created homepage mockups that present the visual recommendations, along with high-fidelity prototypes for mobile.


CONCLUSIONS

Summary

Based on our findings, the following four essential recommendations in this remote moderated report would improve the usability of the site:


Clients' Feedback

Our clients were delighted with the evaluation results and prototypes. They mentioned that both the research and the design are useful for improving the A.R.T. website.


Next Steps

With the high-fidelity prototypes, we can create a site map and run tree testing with users. Then we can design the navigation menu and plan the user flows for the A.R.T. website's child pages, conduct further usability testing, and iterate on the website design.

TAKEAWAYS

Although remote testing is typically more cost-effective than in-person research, it still has limitations. The qualitative data is often not as rich as in in-person research. A remote test can also reduce participants' engagement and adherence. Researchers may have difficulty controlling the process, for example when participants do not follow the instructions or guidelines, and it is less efficient when moderators have to deal with urgent technical issues.

Next time, we can improve this process by checking tool and device limitations and scheduling technology-practice sessions before the real testing, using a website or app that is completely unrelated to the one being tested and asking participants to try it.

PRESENTATION

Slides of My Research Story


Slides of the Team's Final Presentation

Case Study #2: Unmoderated User Test Report

A usability theory and practice report on the Farfetch website using the remote research method.

OVERVIEW

Three usability experts conducted an Unmoderated Remote User Test to evaluate the usability of Farfetch.com in order to identify navigation patterns, potential problems, and design opportunities. The usability experts worked collaboratively to come up with a scenario, a number of tasks, a pre-task questionnaire, and a post-task questionnaire for the usability test.

OBJECTIVE

Unmoderated remote test report

ITEM

Graduate team project

IN CHARGE
Report design

TOOLS

Google Docs, UserTesting

PERIOD

Sep 2020 - Dec 2020

Brief of Testing Plan

Nine participants were recruited in total for this test and were asked to complete the following steps.

Pre-test questionnaire

  1. What’s your age range?
  2. How frequently do you shop online?
  3. Where do you mainly shop online? (Open answer)
  4. What would be the reason that makes you prefer to purchase a product online? (Multiple answers)
  5. What factors do you consider when you want to do online shopping? (Multiple answers)
  6. What are the reasons you are willing to shop on a new website? (Multiple answers)

Scenario

Winter is coming, and you want to shop on a fashion website to find a nice outfit for your friend’s birthday party next month.

Tasks

  • Task 1: Find a pair of black leather ankle boots for your winter collection.
  • Task 2: Find the product (Cherry print sweatshirt from Gucci, Size M) and add it to your wishlist.
  • Task 3: Navigate to your shopping cart, add a product from your wishlist to your cart, and proceed to checkout.
  • Task 4: Find out the return policy and see if you can return a product you received 15 days ago and how to do so.

Post-test questionnaire

  1. How easy was it to find what you were looking for on our website? (Rating scale 1-5, poor - excellent)
  2. Which feature(s) on the website was the most important or useful to you? (Multiple answers)
  3. How likely is it that you would recommend our website to a friend or colleague? (Rating scale 1-5, very likely)
  4. Do you have any other comments about how we can improve our website? (Open answer)

Findings and Recommendations

  • Finding 1: Users are unaware of how to use keywords in the search bar.
    • Recommendation: Create placeholder or example keywords for the Search Input.
  • Finding 2: Inconvenient layout of the filter section.
    • Recommendation: Redesign the filter layout.
  • Finding 3: Inaccurate information placement on return policy.
    • Recommendation: Reorganize information placement and add connections between sections.

Case Study #3: Heuristic Evaluation Report

Evaluating the Resy website against ten usability heuristics and rating the severity of its usability problems.

OVERVIEW

To enhance the reservation process on the Resy website and offer seamless experiences for its users, two usability experts conducted a heuristic evaluation focused on the three most essential features of the site: search input and results, creating a favorite list of restaurants, and the booking process. For each scenario, the researchers evaluated the website against the ten usability heuristics and the severity rating scale established by the Nielsen Norman Group.

OBJECTIVE

Heuristic evaluation report

ITEM

Graduate project

TOOLS

Google Docs, Miro

PERIOD

Sep 2020 - Dec 2020

Brief of Testing Plan

  • Two usability experts conducted a heuristic evaluation of the restaurant reservation service on the Resy website
  • A facilitator gave a brief demonstration of the following scenarios: search input and results, creating a favorite list of restaurants, and the booking process.
  • During the twenty-minute evaluation, experts noted any usability issues they encountered and assigned a severity rating for each problem.

Scenarios

You are a new user visiting the Resy website for the first time. You need to book a reservation at a Japanese restaurant near the West Village because you want to invite a friend to dinner next weekend. You are trying to find a restaurant rated over four stars while keeping your budget under $$$. Open the website and perform the following tasks:

  1. Search for restaurants near the West Village.
  2. Add some restaurants to your hit list.
  3. Pick a restaurant from your list.

Findings and Recommendations

Recommendation 1: Create Filters and Criteria for the Search Input

Recommendation 1 is intended to address two usability problems encountered during testing:

  • Issue #2: When searching for restaurants in the search bar, users expect to input keywords as criteria, such as location, price range, or zip code.
  • Issue #6: Users expect the search bar to have filters to narrow the search, such as specifying location, cuisine, etc.

In the present design, the Resy website does not show a placeholder or hints for the input criteria, and users are unable to narrow the search scope with filters (Figure 1-1).

Figure 1-1: the search bar on the homepage

Both evaluators encountered obstacles when entering search keywords; they had no idea the search bar allows multiple keywords to refine the results. Both also expected filters they could use as criteria to narrow down the selected restaurants.
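The sketch below illustrates the idea behind Recommendation 1 under assumed element names (not Resy's actual markup or API): example keywords in the placeholder hint at the accepted criteria, and the filters are collected into the query before results are requested.

```typescript
// Illustrative only: placeholder keywords plus explicit filter criteria.
interface SearchCriteria {
  keywords: string;
  location?: string;                        // e.g. "West Village"
  cuisine?: string;                         // e.g. "Japanese"
  priceRange?: "$" | "$$" | "$$$" | "$$$$"; // budget filter
}

// Example keywords in the placeholder hint at what the search accepts.
const searchInput = document.querySelector<HTMLInputElement>("#restaurant-search");
if (searchInput) {
  searchInput.placeholder = 'Try "Japanese, West Village, $$" or a zip code';
}

// Filters become query parameters so the result set is narrowed before rendering.
function buildSearchUrl(criteria: SearchCriteria): string {
  const params = new URLSearchParams({ q: criteria.keywords });
  if (criteria.location) params.set("location", criteria.location);
  if (criteria.cuisine) params.set("cuisine", criteria.cuisine);
  if (criteria.priceRange) params.set("price", criteria.priceRange);
  return `/search?${params.toString()}`;
}
```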

Recommendation 2: Refine Micro-Interaction for Interactive Feedback

Both evaluators pointed out that the Resy website lacks interactive feedback. An appropriate micro-interaction can guide users toward the next step or draw their attention to relevant information, which became especially clear when both users tried to add restaurants to their favorite list (the Create-fav-list button). Recommendation 2 is intended to solve two issues, including Issue #1, to which evaluator #1 assigned a severity rating of “4 – Catastrophic.” The Create-fav-list button is associated with the following issues:

  • Issue #1: Users are confused about adding restaurants to the favorite list; the heart-shaped icon makes the feature hard to understand and discover.
  • Issue #5: After selecting a restaurant on the map, it is hard to find the corresponding one in the list on the left without any feedback.

For Issue #1, the image below (Figure 2-1) shows what happened after evaluator #1 searched the keyword “Japanese” and wanted to add a restaurant to the favorite list: the small pop-up window on the map distracted the user from discovering the Create-fav-list button on the left side. The image also illustrates Issue #5; evaluator #1 noted having trouble because the corresponding restaurant on the left was not highlighted. (A sketch of the feedback pattern follows the figure.)

Figure 2-1: the small pop-up window in the map
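A minimal sketch of the interactive feedback proposed here, assuming hypothetical class names and data attributes: when a restaurant is selected on the map, the matching card in the list is highlighted and scrolled into view.

```typescript
// Highlight the list card that corresponds to the restaurant selected on the map.
// Class names and data attributes are placeholders for this sketch.
function highlightListItem(restaurantId: string): void {
  document.querySelectorAll<HTMLElement>(".result-card").forEach((card) => {
    card.classList.toggle("is-highlighted", card.dataset.restaurantId === restaurantId);
  });
  // Bring the matching card into view so it is not lost in a long result list.
  document
    .querySelector(`.result-card[data-restaurant-id="${restaurantId}"]`)
    ?.scrollIntoView({ behavior: "smooth", block: "nearest" });
}

// A map-marker click handler would call it, for example:
// marker.addEventListener("click", () => highlightListItem(marker.dataset.restaurantId!));
```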

Recommendation 3: Provide Relevant Information to Reduce Users’ Learning Time

Recommendation 3 focuses on Issue #3. The search results present related restaurants on a map of New York, which confused users who may not know the area. The results should use a more minimalist design instead of showing a map right away. For example, browsing restaurants on a map could be an optional feature, letting users flexibly switch between the list and the map.

Figure 3-1: the New York map as the search result

Recommendation 4: Adapt Users’ Behavior with Considerate Interruptions

In Issue #4, neither evaluator expected to have to register an account to access the hit list after clicking the heart icon. Both misunderstood the icon as a "like" feature that would automatically add the restaurant to the list without a login (Figure 4-1).

Therefore, this improvement aims to avoid distraction and interruption. The design solution has to consider how to adapt to users and their behavior. For instance, the registration step should come after users finish creating a favorite list: the current task flow keeps priority until users complete the process, and only then is another step offered, if they are willing to move forward.

Figure 4-1: a registration window pops up after clicking the heart shape icon

Recommendation 5: Use Action-Packed Text for Call-to-Action Button

Finally, the last recommendation of this report addresses Issue #7. Evaluator #2 stated that the button is not clear enough to tell users how to complete the booking action. The usability problem occurs when a button does not communicate its status, so users have no idea what will happen next (Figure 5-1, Figure 5-2).

Figure 5-1: unclear booking button
Figure 5-2: the page where to book a restaurant

To solve the issue, use action-packed text to describe the button. For example, change the current label “Dinner” to “Book A Table,” a readable description that tells users what the button will do after they click it.
