Results overview from Round 8 of the user feedback sessions
The Mobile UEF team conducted usability testing to evaluate specific UEF patterns in the context of a linear application on mobile and desktop devices. The existing iClaim application was modified to evaluate the following patterns:
- Password (Create)
- Date Calendar
- Application Steps (new pattern)
- Basic Table
- Summary (Enhanced and Receipt)
- Document Viewer
- Standard Template Footer
Testing was conducted on the following types of devices:
- Smartphones (iOS and Android)
- Tablets (iOS and Android)
The team chose a linear-path application in order to test the new Application Steps pattern; each screen within the path incorporated at least one pattern under evaluation.
As with prior responsive prototypes used in Mobile UEF testing, this prototype was designed with a single breakpoint at 768 pixels. Devices with a portrait viewport width under 768 pixels included the iPhone 5, iPhone 6, Samsung Galaxy S3, and Samsung Galaxy S5; devices with a portrait viewport width of 768 pixels or larger included the iPad and Samsung Galaxy Note 10.1.
The viewport sizes for each mobile device used in this round of testing are as follows:
Mobile Device | Viewport Size (px) | Operating System |
---|---|---|
iPhone 5 | 320 x 568 | iOS |
Samsung Galaxy S3 | 360 x 640 | Android |
Samsung Galaxy S5 | 360 x 640 | Android |
iPhone 6 | 375 x 667 | iOS |
iPad | 768 x 1024 | iOS |
Samsung Galaxy Note 10.1 | 800 x 1280 | Android |
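The single-breakpoint behavior described above can be sketched as follows. This is an illustrative Python snippet, not part of the prototype; the 768-pixel threshold and device widths are taken from the report.

```python
# Illustrative sketch of the report's single-breakpoint logic.
BREAKPOINT = 768  # portrait viewport width, in pixels

# Portrait viewport widths from the device table above.
DEVICES = {
    "iPhone 5": 320,
    "Samsung Galaxy S3": 360,
    "Samsung Galaxy S5": 360,
    "iPhone 6": 375,
    "iPad": 768,
    "Samsung Galaxy Note 10.1": 800,
}

def layout_for(width_px: int) -> str:
    """Return which layout a device receives at the report's breakpoint."""
    return "small" if width_px < BREAKPOINT else "large"

for name, width in DEVICES.items():
    print(f"{name}: {layout_for(width)} breakpoint")
```

Under this rule the four smartphones fall in the small breakpoint and both tablets in the large breakpoint, matching the participant groupings reported below.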
Working with members of the general public, Mobile UEF Team members:
- Conducted user testing with 14 participants at Stevenson University in Owings Mills, MD on February 5, 2015. Each participant tested on one of the following types of devices:
  - Smartphone: 9 total participants
    - 2 using an iPhone 5
    - 4 using an iPhone 6
    - 1 using a Samsung Galaxy S3
    - 2 using a Samsung Galaxy S5
  - Tablet: 5 total participants
    - 4 using an iPad
    - 1 using a Samsung Galaxy Note 10.1
  - No desktop users were tested.
- Collected participant information in a pre-test demographic survey, which indicated:
  - Participants ranged in age from 18 to 65, with a median age of 23;
  - All 14 participants owned and used at least one type of mobile device;
  - Thirteen participants had not used Social Security’s online services;
  - Thirteen participants would use a tablet or a smartphone to access SSA.gov or a MySocialSecurity account.
- Analyzed the results, including:
  - Navigation methods and preferences;
  - Participant issues or comments regarding specific UEF patterns or screen details;
  - User satisfaction scores on the overall experience as indicated in a post-test questionnaire.
As with prior Mobile UEF testing sessions, recruiting of volunteer participants was performed on-site during the testing session with outreach to a broad range of college patrons. The usability test scenario and tasks were designed to be completed within 15-20 minutes; prior mobile testing had shown this time range yielded the optimal balance of participants and data in any single day.
Metrics for this usability test were established by the Mobile UEF Workgroup as follows:
- Completion Rate – Percentage of test participants who successfully complete the application without assistance
  - Target = 80% for each device type
- Ease of Use – Percentage of test participants who indicated the application was “very easy” to use on Questions #3, #5, and #8 of the post-test survey
  - Target = 80% for each device type
- User Satisfaction – Percentage of test participants who indicated they were “very satisfied” on Questions #4 and #7 of the post-test survey
  - Target = 80% for each device type
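As an illustration, each metric above reduces to a pass/fail flag per participant, aggregated as a percentage and compared against the 80% target. The sketch below is hypothetical; the response flags are invented for illustration and are not actual session data.

```python
# Hypothetical sketch of how the workgroup's metrics could be computed
# from raw per-participant pass/fail flags. The data below is illustrative.
TARGET = 0.80

def pct(flags):
    """Fraction of participants whose flag is True."""
    return sum(flags) / len(flags)

# Illustrative flags for one device type (not real session data):
completed = [True] * 9                      # completed without assistance
very_easy = [True] * 7 + [False] * 2        # rated the app "very easy"
very_satisfied = [True] * 8 + [False]       # rated themselves "very satisfied"

for name, flags in [("Completion Rate", completed),
                    ("Ease of Use", very_easy),
                    ("User Satisfaction", very_satisfied)]:
    rate = pct(flags)
    status = "meets" if rate >= TARGET else "misses"
    print(f"{name}: {rate:.0%} ({status} the 80% target)")
```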
Metrics for task completion, ease of use, and user satisfaction, as measured by the post-test questionnaire, were as follows:
Metric | Target (All) | Actual (Phone) | Actual (Tablet) |
---|---|---|---|
Completion Rate | 80% | 100% | 100% |
Ease of Use | 80% | 85.8% | 92% |
User Satisfaction | 80% | 90% | 90% |
The following table lists the Post-Test Questionnaire responses by device type as well as overall. (Note: One smartphone participant did not complete the questionnaire.)
Scale of 1-5, where 1 = lowest and 5 = highest
Questions | Smartphone (n=9) | Tablet (n=5) | Overall (n=13) |
---|---|---|---|
How well did the website match your expectations? | 4.22 | 4.20 | 4.15 |
How well did the website support the task you were asked to perform? | 4.67 | 4.60 | 4.62 |
How difficult or easy was the website to use? | 4.56 | 4.40 | 4.46 |
Are you satisfied with the content? | 4.44 | 4.00 | 4.23 |
How difficult or easy was it to move through sections of the website? | 4.56 | 4.60 | 4.62 |
How easy were the words on the website to understand? | 4.56 | 5.00 | 4.69 |
How satisfied are you with the speed at which you can complete tasks? | 4.67 | 5.00 | 4.77 |
How difficult or easy was it to find information you needed? | 3.89 | 4.80 | 4.15 |
How long would it take you to learn to use this website? | 4.67 | 4.60 | 4.62 |
How confident did you feel using this application? | 4.44 | 4.60 | 4.46 |
Average User Satisfaction Score by device type | 4.41 | 4.58 | 4.48 |
Qualitative Assessment
Usability issues, as well as observations and participant comments, are listed below.
Small Breakpoint: Below 768 pixels (n=9)
Large Breakpoint: 768 pixels and above (n=5)
Password (Create)
Small Breakpoint
- One smartphone participant did not meet all the password criteria but decided to move on anyway.
- One smartphone participant was not sure whether the ‘x’ symbol meant the character she had entered was wrong, and asked, "Do I need to enter [this], or I am not supposed to use it?"
- One smartphone participant thought the password requirements were merely suggestions.
- One smartphone participant was unclear if the requirements were associated with the password field below.
Large Breakpoint
- One tablet participant did not meet all the password criteria but decided to move on anyway.
- One tablet participant expected a password strength indicator.
Date Calendar
This pattern included dates in one visible state: light gray (indicating unavailable dates).
Small Breakpoint
- There were no major functionality issues.
- Two participants used the new month drop-down list to change months.
- One smartphone participant preferred the Date Calendar pattern over the Date (Drop Down) pattern, commenting “The calendar is nice, easier than [the drop-down] numbers.”
- Five participants expected the gray dates to indicate holidays, or otherwise unavailable dates.
- One smartphone participant thought the gray dates indicated either deadlines, the current date, or already selected dates.
- Three participants did not know why the dates were gray.
Dates shown to participants at the time of testing are not visible in the screenshot.
Large Breakpoint
- There were no major functionality issues.
- One participant used the new month drop-down list to change months.
- One tablet participant preferred the Date Calendar pattern over the Date (Drop Down) pattern.
- One tablet participant preferred using the Date (Drop Down) pattern rather than the Date Calendar pattern for selecting a date. (Both patterns appeared on the same page of the prototype.) According to participant, "It's easier, especially if you are picking a date that's far in the future."
- One tablet participant expected the gray dates to indicate holidays, or otherwise unavailable dates.
- Three participants thought the gray dates indicated either deadlines, the current date, or already selected dates.
- One tablet participant did not know why the dates were gray.
Dates shown to participants at the time of testing are not visible in the screenshot.
Application Steps
Small Breakpoint
- Five participants used the Application Steps pattern to navigate in the application without prompting.
- One smartphone participant didn't notice the Application Steps pattern and used the Form Controls to navigate.
- Seven participants needed prompting to use the Application Steps pattern. Once they were prompted, most used it without issue.
- Navigation methods to the “Information About You” page:
  - Five participants used the Previous button
  - Three participants used the Application Steps
  - One participant used the browser back button
- Navigation methods to return to the “Children” page:
  - Six participants used the Previous button
  - Three participants used the Application Steps
Large Breakpoint
- Five participants used the Application Steps pattern to navigate in the application without prompting.
- Two participants needed prompting to use the Application Steps pattern. Once prompted, however, most used it without issue the next time.
- Navigation methods to the “Information About You” page:
  - Two participants used the Previous button
  - Two participants used the Application Steps
  - One participant used the browser back button
- Navigation methods to return to the “Children” page:
  - Four participants used the Previous button
  - One participant used the Application Steps
- One participant suggested making the Application Steps sticky while scrolling.
Basic Table
This was the first test of a responsive table; two columns of data were “hidden” when viewed on smartphones, and one column was hidden when viewed on tablets.
Small Breakpoint
- Two participants did not realize that there was more information available in the table, and did not click on the chevron to display the additional information.
- Two participants wanted the table to appear expanded by default with an option to collapse.
- One smartphone participant expected the entire row to be clickable.
- One smartphone participant expected the entire cell containing the chevron to be clickable.
- One smartphone participant suggested that the chevron should match that used in the Application Steps pattern.
Large Breakpoint
- Three participants did not realize that there was more information available in the table and did not click on the chevron to display the additional information.
- One tablet participant expected the entire row to be clickable.
- One tablet participant expected the entire cell containing the chevron to be clickable.
Summary (Enhanced and Receipt)
Small Breakpoint
- Six participants did not notice the “Not Answered” text but understood its meaning after prompting.
- All nine participants used the Edit button to go back to the “Marriage Information” page to add the Spouse’s SSN information.
- One participant wanted instructions regarding formatting for the SSN.
- Two participants wanted a Save button on the page they had just updated.
- One participant wanted a confirmation message saying that the SSN had been updated after filling in that information per the scenario.
Large Breakpoint
- One participant did not notice the “Not Answered” text, but understood its meaning after prompting.
- Four of the five participants used the Edit button to go back to the “Marriage Information” page to add the Spouse’s SSN information.
- One participant expected to be notified about the missing data after selecting the Primary button labeled “Return to Summary”.
Document Viewer
- There were no major functionality issues.
- One participant expected the Back button to be on the left side instead of the right.
- One participant could not locate the email link and did not look in the “More” menu.
- One participant thought the "x" close icon indicated a complete exit and wanted more instructions or a button to hide the Document Viewer.
Standard Template Footer
Small Breakpoint
- There were no major design issues.
- Most participants were able to click on the link they wished to without issue.
- One participant did not understand the task, but was able to click on their chosen link after prompting.
- One participant wanted a “zoom-in auto-function” behavior when touching the link.
Large Breakpoint
- There were no major design issues.
- Most participants were able to click on the link they wished to without issue.
- One participant would have looked for SSA contact information by going instead to Done > My Account > Search rather than by clicking on a footer link.
Based on this round of testing, the following patterns were found to be problematic for enough participants to necessitate retesting or design refinements:
- Password (Create) (retesting)
- Basic Table
Recommendations based on the findings are below.
Pattern | Recommendation | Rationale |
---|---|---|
Password (Create) | Re-evaluate the Password (Create) pattern in conjunction with the proposed redesign of the Username pattern. | The two patterns are often used in tandem and should be consistent with regard to placement of requirements. |
Date (Calendar) | Continue with the current design. | There were no major issues with this pattern. |
Application Steps | Continue with the current design, but consider making the chevron icons and their functionality consistent across all UEF patterns. | There were no major issues with this pattern.
Basic Table | Consider making the chevron icons and their functionality consistent across all UEF patterns. Re-test and continue to note any further issues when testing future UEF table patterns. | Participants were either unaware of, or unsure about, the extra information available for each row.
Review Summary | Continue with the current design but consider re-testing with the Error Check Page. | There were no major issues with this pattern. |
Document Viewer | Continue with the current design. | There were no major issues with this pattern. |
Footer (links) | Continue with the current design. | There were no major issues with this pattern. |