
Round 9 UEF Pattern Testing Usability Findings

Results overview from Round 9 of the user feedback sessions

Background

The Mobile UEF team conducted usability testing to evaluate specific UEF patterns in the context of a linear application on mobile and desktop devices. The existing iClaim application was modified to evaluate particular patterns, including:

  • Date (Calendar)
  • Multi-Select Modal
  • Single-Row Action Table (new pattern)
  • Pagination Table (new pattern)
  • Review Summary Error
  • Document Viewer
  • Portal Navigation (for devices with viewports of 768 pixels and wider)
  • Username (Create)

Testing was conducted on the following types of devices:

  • Smartphones (iOS and Android)
  • Tablets (iOS and Android)

The Prototype

Design

The existing iClaim application was modified to incorporate several new patterns. The team chose a linear path application in order to test the new Single-Row Action Table and Pagination Table patterns. Each screen within the path incorporated at least one pattern being evaluated. Due to limitations of the Axure prototyping software, the Single-Row Action Table and Pagination Table patterns were not built in that tool.

Viewport Sizes

As with prior responsive prototypes used in Mobile UEF testing, this prototype was designed with a single breakpoint.

The devices with a viewport width of less than 768 pixels in portrait orientation were the iPhone 5, Samsung Galaxy S3, Samsung Galaxy S5, Samsung Galaxy Note 4, iPhone 6, and iPhone 6 Plus.

The devices with a viewport width of 768 pixels and larger were the Samsung Galaxy Note 10.1 and iPad.
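
A single-breakpoint design of this kind is typically driven by one media-query check. The sketch below is an illustration only, not part of the Axure prototype; the 768-pixel threshold comes from the description above, while the function name and CSS class names are hypothetical.

```ts
// Illustrative sketch: a single breakpoint at 768 pixels (threshold from the text above).
// applyLayout and the "layout-*" class names are hypothetical, not from the prototype.
const largeBreakpoint = window.matchMedia('(min-width: 768px)');

function applyLayout(query: MediaQueryList | MediaQueryListEvent): void {
  // Large breakpoint: tablets and desktops (viewport width of 768px and wider).
  // Small breakpoint: smartphones (viewport width below 768px).
  document.body.classList.toggle('layout-large', query.matches);
  document.body.classList.toggle('layout-small', !query.matches);
}

applyLayout(largeBreakpoint);                             // set the initial layout
largeBreakpoint.addEventListener('change', applyLayout);  // re-apply on rotation or resize
```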

The viewport sizes for each mobile device used in this round of testing are as follows:

| Mobile Device | Viewport Size | Operating System |
| --- | --- | --- |
| iPhone 5 | 320 x 568 | iOS |
| Samsung Galaxy S3 | 360 x 640 | Android |
| Samsung Galaxy S5 | 360 x 640 | Android |
| Samsung Galaxy Note 4 | 360 x 640 | Android |
| iPhone 6 | 375 x 667 | iOS |
| iPhone 6 Plus | 414 x 736 | iOS |
| iPad | 768 x 1024 | iOS |
| Samsung Galaxy Note 10.1 | 800 x 1280 | Android |

What We Did

Working with members of the general public, Mobile UEF Team members:

  • Conducted user testing with 15 participants on May 7, 2015 at the Baltimore County Public Library (Towson Branch).
    • Fifteen participants tested on one of the following types of devices:
      • Smartphone: 10 total participants
        • 3 using an iPhone 5
        • 2 using an iPhone 6
        • 1 using an iPhone 6 Plus
        • 1 using a Samsung Galaxy S3
        • 2 using a Samsung Galaxy S5
        • 1 using a Samsung Galaxy Note 4
      • Tablet: 5 total participants
        • 3 using an iPad
        • 2 using a Samsung Galaxy Note 10.1
      • No desktop users were tested.
  • Collected participant information in a pre-test demographic survey, which indicated:
    • Participants ranged in age from 17 to 74, with a median age of 45;
    • All 15 participants owned and used at least one type of mobile device;
    • Seven participants had used Social Security’s online services;
    • Four participants would use a smartphone to access SSA.gov or a MySocialSecurity account.
  • Analyzed the results, including:
    • Navigation methods and preferences;
    • Participant issues or comments regarding specific UEF patterns or screen details;
    • User satisfaction scores on the overall experience as indicated in a post-test questionnaire.

Challenges & Constraints

As with prior Mobile UEF testing sessions, volunteer participants were recruited on-site during the testing session, with outreach to a broad range of library patrons. The usability test scenario and tasks were designed to be completed within 15-20 minutes; prior mobile testing had shown that this time range yielded the optimal balance of participants and data in a single day.

Metrics

Metrics for this usability test were established by the Mobile UEF Workgroup as follows:

  • Completion Rate – Percentage of participants who successfully completed the application without assistance
    • Target > 80% for each device type
  • Ease of Use – Percentage of participants who indicated the application was “very easy” to use, as measured by Questions #3, #5, and #8 of the post-test survey
    • Target > 80% for each device type
  • User Satisfaction – Percentage of participants who indicated they were “very satisfied,” as measured by questions #4 and #7 of the post-test survey
    • Target > 80% for each device type
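
For clarity, the sketch below shows one way these per-device percentages could be computed and checked against the 80% target. It is a hypothetical illustration; the data structure and function names are not part of the workgroup's tooling.

```ts
// Hypothetical sketch of the Completion Rate calculation described above.
interface SessionResult {
  device: 'smartphone' | 'tablet';
  completedWithoutAssistance: boolean;
}

const TARGET = 0.80; // 80% target for each device type

// Fraction of participants on a device type who completed the application unaided.
function completionRate(results: SessionResult[], device: SessionResult['device']): number {
  const onDevice = results.filter(r => r.device === device);
  if (onDevice.length === 0) return 0;
  return onDevice.filter(r => r.completedWithoutAssistance).length / onDevice.length;
}

// Ease of Use and User Satisfaction would be computed the same way from the
// "very easy" / "very satisfied" responses to the post-test survey questions.
const meetsTarget = (rate: number): boolean => rate >= TARGET;
```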

What We Learned

Metrics for task completion, ease of use and user satisfaction, as measured by the post-test questionnaire, were as follows:

| Metric | Target (All Devices) | Actual Smartphone | Actual Tablet |
| --- | --- | --- | --- |
| Completion Rate | ≥ 80% | 95.1% | 100% |
| Ease of Use | ≥ 80% | 82% | 78.7% |
| User Satisfaction | ≥ 80% | 88% | 76% |

Post-Test Questionnaire

The following table lists the Post-Test Questionnaire responses by device type as well as overall.

Scale of 1-5, where 1 = lowest and 5 = highest

| Question | Smartphone (n=10) | Tablet (n=5) | Overall (n=15) |
| --- | --- | --- | --- |
| How well did the website match your expectations? | 3.80 | 3.60 | 3.73 |
| How well did the website support the task you were asked to perform? | 4.60 | 4.40 | 4.53 |
| How difficult or easy was the website to use? | 4.10 | 4.40 | 4.20 |
| Are you satisfied with the content? | 4.60 | 4.00 | 4.40 |
| How difficult or easy was it to move through sections of the website? | 4.10 | 4.00 | 4.07 |
| How easy were the words on the website to understand? | 4.90 | 4.20 | 4.67 |
| How satisfied are you with the speed at which you can complete tasks? | 4.20 | 3.60 | 4.00 |
| How difficult or easy was it to find information you needed? | 4.10 | 3.40 | 3.87 |
| How long would it take you to learn to use this website? | 4.90 | 4.00 | 4.60 |
| How confident did you feel using this application? | 4.80 | 4.40 | 4.67 |
| Average User Satisfaction Score by device type | 4.41 | 4.00 | 4.27 |

Qualitative Assessment

Usability issues, as well as observations and participant comments, are listed below.

Small Breakpoint: Below 768 pixels (n=10)

Large Breakpoint: 768 pixels and above (n=5)

UEF PATTERNS

Date (Calendar)

This pattern included dates in two visible states, light gray (indicating unavailable dates) and black-bordered (indicating the current date), along with two buttons: Exit and Clear.

Small Breakpoint

  • There were no major functionality issues.
  • One participant skipped this task because he was using an iPhone 6.
  • One participant had physical difficulty selecting fields and tapping links because of large fingers.
  • Grayed-out Date:
    • Three participants expected the gray date to indicate holidays or otherwise unavailable dates.
    • Two participants did not know why the date was gray.
    • Three participants thought the gray date indicated a deadline, a future date, or an already selected date.
    • One participant expected the gray date to indicate today's date or end of enrollment.
  • Date with Black Border:
    • Five participants expected this date to indicate the current or today's date.
    • Three participants expected this date to indicate the selected date.
    • One participant did not know why the date had a black border.
  • All participants used the “x icon” to exit the calendar without prompting.
  • “Clear” Button:
    • Six out of nine participants noticed the “Clear” button.
    • Three participants expected the “Clear” button to remove a selected date.
    • Five participants expected the “Clear” button to reset or change the date, to select another date, or to start over, or simply did not know what it would do.
  • Six participants expected the calendar to disappear after selecting a date.
  • Three participants did not expect the calendar to disappear.

Large Breakpoint

  • There were no major functionality issues.
  • One participant had physical difficulty selecting fields and tapping links because of large fingers.
  • Grayed-out Date:
    • Three participants expected the gray date to indicate holidays or otherwise unavailable dates.
    • One participant did not know why the date was gray.
    • One participant thought the gray date indicated a selected or unavailable date.
  • Date with Black Border:
    • Four participants expected this date to indicate the current or today's date.
    • One participant did not know why the date had a black border, but commented that it brings the date to his attention.
  • All participants used the “x icon” to exit the calendar without prompting.
  • Two out of five participants did not notice the Clear button until they scrolled up.
  • Four participants expected the “Clear” button to remove a selected date.
  • One participant was hesitant to click on the “Clear” button, thinking that it may wipe out all the information he just entered.
  • Four participants expected the calendar to disappear after selecting a date.

Dates shown to participants at the time of testing are not represented in the screenshot below.

Date Calendar

Multi-Select Modal

Small Breakpoint

  • Six participants did not expect the selected items to move to the top of the selectable list.
  • Eight participants noticed the number change in the “Selected” button after saving.

Large Breakpoint

  • Two participants noticed the selected items moving to the top.
  • One participant did not like the selected items behavior.
  • Two participants suggested increasing the size of the left and right navigation arrows.
  • Three participants did not notice the number change in the “Selected” button after saving.

Multi-Select Field

Multi-Select Modal

Single-Row Action Table

Two columns of data were “hidden” when viewed on smartphones and one column was hidden when viewed on tablets.
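
A responsive table like this typically decides which columns to render from the current device type or viewport width. The sketch below is an assumption-based illustration; the column names are hypothetical, and only the counts of hidden columns come from the description above.

```ts
// Hypothetical illustration: hiding lower-priority columns per device type, matching the
// behavior described above (two columns hidden on smartphones, one on tablets).
// The column names themselves are assumptions, not taken from the prototype.
type DeviceType = 'smartphone' | 'tablet' | 'desktop';

const ALL_COLUMNS = ['Claim', 'Date Filed', 'Status', 'Amount', 'Notes'];

function visibleColumns(device: DeviceType): string[] {
  const hiddenCount = device === 'smartphone' ? 2 : device === 'tablet' ? 1 : 0;
  // Drop the lowest-priority columns from the end of the list.
  return ALL_COLUMNS.slice(0, ALL_COLUMNS.length - hiddenCount);
}
```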

Small Breakpoint

  • Only one participant noticed the column sorting functionality.
  • One participant expected to sort the rows by clicking on the “Edit” button.
  • One participant wanted to have the ability to move items on the screen (move table rows).
  • Chevron:
    • Five participants attempted to expand or collapse the table rows by clicking on the chevron.
    • Five participants attempted to expand or collapse the table rows by clicking on the entire row.
    • One participant thought the chevron was a check mark and expected it to delete the record if they unchecked it.
    • One participant thought the chevron could be mistaken for a bullet point.

Single Row Action Table on SmartPhone

Large Breakpoint

  • None of the participants noticed the column sorting functionality.
  • One participant expected to sort the rows with a finger.
  • Chevron:
    • Four participants attempted to expand or collapse the table rows by clicking on the chevron.
    • One participant attempted to expand or collapse the table rows by clicking on the entire row.

Single Row Action Table on Desktop

Pagination Table

Small Breakpoint

  • There were no major functionality issues.
  • Chevron:
    • Six participants attempted to expand or collapse the table rows by clicking on the chevron.
    • Three participants attempted to expand or collapse the table rows by clicking on the entire row.
    • One participant attempted to expand or collapse the table rows by clicking on both the chevron and the entire row.
  • One participant suggested adding a search feature to the table.

Pagination Table on SmartPhone

Large Breakpoint

  • There were no major functionality issues.
  • All participants attempted to expand or collapse the table rows by clicking on the chevron.
  • One participant expected to see a search feature in the table.

Pagination Table on Desktop

Review Summary Error

Small Breakpoint

  • There were no major functionality issues.
  • All participants understood the error and were successful in correcting it.

Review Summary with Error on SmartPhone

Large Breakpoint

  • There were no major functionality issues.
  • All participants understood the error and were successful in correcting it.

Review Summary with Error on Desktop

Document Viewer

Small Breakpoint

  • There were no major functionality issues.
  • Three participants could not locate the email link and did not look in the “More” menu.
  • Two participants expected to see the email option as a separate button, rather than inside the “More” drop down list.
  • One participant stated that they liked being able to see the original screen in the background.
  • Seven participants used the “X” icon to exit the modal.
  • Three participants used the browser “Back” button to exit the modal.

Document Viewer on SmartPhone

Large Breakpoint

  • There were no major functionality issues.
  • One participant expected to see the email option as a separate button, rather than inside the “More” drop down list.
  • One participant stated that they liked being able to see the original screen in the background.
  • Four participants used the “X” icon to exit the modal.
  • One participant was afraid to click on “Submit” on the previous page and failed this task.

Document Viewer on Desktop

Portal Navigation

(for devices with viewports of 768 pixels and wider)

Small Breakpoint

There was no intention to test the Portal Navigation pattern on devices with a viewport less than 768 pixels.

Large Breakpoint

  • There were no major functionality issues.
  • One participant expected that tapping on the SSA logo would return them to the home page.
  • All participants used the Portal Navigation pattern to navigate through the application without prompting.
  • One participant thought the menu items in the navigation looked gray and became confused as to why those items were clickable while the name, Terry Smith (next to the Sign Out link), was the same color but not clickable.
  • One participant clicked on the username (Terry Smith) to change login settings.

Portal Navigation on Desktop

Username (Create)

Small Breakpoint

  • There were no major functionality issues.
  • Two participants failed this task because they were not able to get to this page.
  • Six participants liked that the username requirements were above the field.
  • Two participants wanted more information about creating a more secure username, such as using numbers and symbols.
  • Six participants liked the use of the checkmarks for the requirements.
  • One participant using landscape orientation was unable to see the checkmarks as he was typing. He thought the text was straight to the point.
  • One participant suggested turning off the device’s auto-correct functionality while typing in the username and password fields.

Large Breakpoint

  • There were no major functionality issues.
  • Four participants liked that the username requirements were above the field.
  • One participant thought that placing the username requirements above the field was typical [standard].
  • All participants liked the use of the checkmarks for the requirements.
  • One participant wanted more information about creating a more secure username, such as using numbers and symbols.

Username Default View

Username Success View

Recommendations and Next Steps

Based on this round of testing, the following patterns were found to be problematic for enough participants to necessitate retesting or design refinements:

  • Date (Calendar), for the “Clear” button only
  • Multi-Select Modal
  • Single-Row Action Table
  • Pagination Table

Pattern recommendations based on the findings are below.

| Pattern | Recommendation | Rationale |
| --- | --- | --- |
| Date (Calendar) | Remove the “Clear” button from the modal and provide “clear date field” functionality near/in the field. | Users were not sure what action to expect from the “Clear” button. |
| Multi-Select Modal | 1. Keep selected items in place (remove the jumping behavior) and indicate that they are selected. 2. For devices with a viewport wider than 767 pixels, make the left and right arrows more prominent. 3. Keep the “Selected” button as designed. | 1. Users did not like selected items jumping to the top and preferred that they stay in place but be highlighted. 2. Users had difficulty noticing the left and right navigation arrows. 3. Users acknowledged the number change in the “Selected” button after saving. |
| Single-Row Action Table | Re-evaluate the table icons used, despite workgroup research, and replace the chevron with a different icon. | Users thought that the chevron resembled a check mark. Some users expected the entire row to be clickable. |
| Pagination Table | Keep the current pagination design. | There were no major issues. |
| Review Summary Error | Keep the current design. | There were no major issues. |
| Document Viewer | Keep the current design. | Users stated that they liked being able to see the original screen in the background. |
| Portal Navigation | Keep the current design. | There were no major issues. |
| Username (Create) | Keep the current design, but turn off the device’s auto-correct and auto-complete functionality for both the username and password fields. | Users had difficulty using the device’s keyboard with auto-correct and auto-complete turned on. |
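
The Username (Create) recommendation above can be applied with a small markup or script change. The sketch below is a hypothetical illustration: the '#username' and '#password' selectors are assumptions, and the autocorrect attribute is a non-standard hint honored mainly by iOS Safari.

```ts
// Hypothetical sketch: suppressing auto-correct/auto-complete on the credential fields.
// The element IDs are assumptions, not the prototype's actual IDs.
for (const selector of ['#username', '#password']) {
  const field = document.querySelector<HTMLInputElement>(selector);
  if (!field) continue;
  field.autocomplete = 'off';                  // standard HTML attribute
  field.spellcheck = false;                    // avoid spell-check underlining
  field.setAttribute('autocorrect', 'off');    // non-standard; honored by iOS Safari
  field.setAttribute('autocapitalize', 'off'); // widely supported on mobile browsers
}
```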

Recommendations for future testing rounds are below.

| Recommendation | Rationale |
| --- | --- |
| Bring a stylus for the next round of testing, but provide it only at the participant’s request. | Users had physical difficulty using touch-screen devices with large fingers. |