
Page Quality Audit WIP


Usability Test Plan

Goal

Validate first pass of method.

Research Questions

  • Can they complete the method?
  • Do they see value/use for this method?

Participants

At least 5 UX Team members.

Checklist

  • Introduce yourself and any observers.
  • Ask for permission to record.
  • Start the recording.
  • Repeat everyone’s names and consent for the recording.
  • Ask for consent to take photos.
  • Explain expectations:
    • This is not a test of you, only a test of some ideas.
    • There are no wrong answers.
    • Think out loud as you go through the activity.
    • We’re looking for all feedback, both positive and negative.
  • Ask if the participant has any questions before you begin.

Scenario

You're just getting started on a new project for the Department of Education, and you've been asked to identify some "low hanging fruit."

Task

  1. Go to this page and review the method. (https://github.com/Bixal/methods/wiki/Page-Quality-Audit-(Draft))
  2. Run the method on this page: https://www.ed.gov/
  3. If you haven't already, run the audit on a couple of additional pages on the site. (A batch-run sketch follows this list.)
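
The audit tool appears to be Lighthouse-based (P4 below: "this uses Lighthouse"), so step 3 can also be scripted when several pages need auditing. A minimal sketch, assuming the Lighthouse CLI is installed (npm install -g lighthouse) and Chrome is available; the commented-out second URL is a hypothetical placeholder, not part of the test plan:

```python
import subprocess
from pathlib import Path

# Pages to audit: the homepage from the task, plus any additional pages.
URLS = [
    "https://www.ed.gov/",
    # "https://www.ed.gov/about",  # hypothetical example; swap in real pages
]

OUT_DIR = Path("audit-reports")
OUT_DIR.mkdir(exist_ok=True)

for url in URLS:
    # Derive a filesystem-safe file name from the URL.
    name = url.rstrip("/").split("//")[-1].replace("/", "_")
    out_path = OUT_DIR / f"{name}.json"
    # Run Lighthouse headlessly; the JSON report holds the same detail
    # as the tool's "View Report" page.
    subprocess.run(
        ["lighthouse", url,
         "--output=json", f"--output-path={out_path}",
         "--chrome-flags=--headless", "--quiet"],
        check=True,
    )
    print(f"saved {out_path}")
```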

Post-Task Questions

  1. What are your initial thoughts on the output of this audit?
  2. Are there any issues identified that you would feel comfortable providing remediation guidance on?
  3. How would you communicate the results of this activity to the project team?
  4. What other next steps would you take?
  5. How might you use this method on current or future projects?

Conclusion

  • Let participant know the activity is complete.
  • Ask if they have any questions or other comments they want to share about any part of the activity.
  • Thank them for their participation.
  • Stop recording.

Findings

First couple of tests

  • Connection of results to user experience (or at least UX role) not immediately clear.
  • Value of using throughout project lifecycle not immediately clear.
  • High-level steps getting a bit lost in the details.
  • Order of steps around sharing and estimating is confusing.
  • Language around saving the report is confusing.
  • Might need guidance on how to interpret scores. (A score-band sketch follows this list.)
  • Could emphasize value of "whole-team" activity.
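
That guidance could start from Lighthouse's published score bands, which the tool's color coding follows. A minimal sketch; the helper name is ours, but the thresholds are Lighthouse's documented ones (0-49 poor, 50-89 needs improvement, 90-100 good):

```python
def interpret_score(score: int) -> str:
    """Map a 0-100 Lighthouse category score to its documented band."""
    if score >= 90:
        return "good (green)"
    if score >= 50:
        return "needs improvement (orange)"
    return "poor (red)"

# Illustrative scores only, not real audit results.
for category, score in [("Performance", 43), ("Accessibility", 95)]:
    print(f"{category}: {score} -> {interpret_score(score)}")
```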

Second round

Changed the format of the How instructions to have clearer high-level steps with some sub-steps or tips for each one. Also tweaked the Why and What language slightly.

  • High-level steps seemed to be easier to follow.
  • Lots of talk about Excel and wanting to export the data. (A CSV export sketch follows this list.)
  • Seemed to understand next steps better and the need to bring in the team.
  • Still looking for more context about "when" to use this in a project and how to convey the value/purpose to the team. ("If I were to go to sprint planning and say I need to do this, I need to have reasons why this is a good use of time.")
  • Uncertainty about how long the audit takes to run.
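
Export came up again with P3 ("would be great if I could export to Excel"). A minimal sketch that flattens the JSON reports from the batch sketch above into a CSV Excel can open; the field names (categories, score, requestedUrl) follow Lighthouse's JSON report format as we understand it, so verify them against your own report files:

```python
import csv
import json
from pathlib import Path

REPORT_DIR = Path("audit-reports")  # written by the batch sketch above
rows = []

for report_path in sorted(REPORT_DIR.glob("*.json")):
    report = json.loads(report_path.read_text())
    row = {"url": report.get("requestedUrl", report_path.stem)}
    # Each category (performance, accessibility, best-practices, seo)
    # carries a 0-1 score; scale to the familiar 0-100 display value.
    for key, category in report["categories"].items():
        score = category.get("score")
        row[key] = round(score * 100) if score is not None else ""
    rows.append(row)

fieldnames = ["url"] + sorted({k for r in rows for k in r if k != "url"})
with open("audit-scores.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
```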

Third round

Tested as a live page in the context of the current site (on the staging repo).

New findings:

  • The View Report feature is actually very important for seeing the details of the issues with the page (as opposed to the general guidance, shown on the first screen, about what those issues mean). This is not clear, and the link is pretty understated.
  • The hierarchical approach to instructions seemed to help. (Participant scanned the numbered instructions, though not sub-points, before diving in.)
  • Participant seemed to see value of using tool at different points in the project.

Raw Notes

P1

  • web service where I can put in a URL and it will give me a report
  • guessing CSS, contrast, USDS validation
  • how well the website performs, think “page load”
  • like audit
  • what does it mean to you at this point?
    • bar of scores not a whole lot of value
  • not a whole lot falls underneath my domain
  • see full report
    • more meaningful
  • not a whole lot in here for a UX designer
  • maybe for a bit of QA at the end?
  • nothing related to the domain
  • what would you do?
    • ping developers over teams
    • let them know this resource is available
  • in terms of LHF
    • set up time with dev
    • let me know if there are any questions I can answer
  • if I saw alongside heatmap, might be more meaningful
    • thumbnails don’t tell me anything
  • would be triage, don’t know how to prioritize these
  • are they critical
  • don’t know what the contentful paint element is
    • sounds like it would be meaningful
  • any other team members?
    • snapshot of this today, took another one prior to launch
    • have some value for the client as a member of the team
    • establishing a baseline, showing outcome of work
    • toss over to dev team and then forget about it
    • if we had an interactive designer, let them know they were going to be judged by it
    • suggest as a tool that would make their job easier
  • see how low we can get that
    • if audience sees something, perceives it as loading quicker
    • sit down with devs, score these based on effort
    • as UXD go back in, can’t do this, but maybe we can do that
  • do they affect UX?
    • confidence
  • doesn’t say what to do next, what to do with the insights
    • would like “best use” kind of bit
    • if it communicates to the engineering side
    • things that could be, help them prioritize what to tackle (dev)
    • good interface between design and engineering
  • step 3 out of place, that should come at the end
  • see sharing in #4
  • say
    1. this collects information and measures
    2. outcome: produces a thing for conversation with development team for collaborative effort
  • for dev to use and then reach out to UX designer, “hey, help us prioritize these”
  • any parting thoughts?
    • more context within larger scheme of things
    • about data collection and suggested next steps
    • better sense of value this would bring to me, would it be worth effort
    • not a lot of work

P2

  • reading method instructions
  • a little confused about order of things
  • see results (score numbers)
  • now it makes sense
  • what does #3 mean to you?
    • now that I have results
    • need to communicate to team members
    • somehow share with them
    • gather team members
  • need to figure which issues are for who
  • not sure how I would do it
  • “this version of the report”
  • how do I save this?
  • initial thoughts?
    • “not many issues with AX”
    • lower performance, not too bad
    • probably reading through what exactly those are
    • don’t know what it means, would need some help from technical team
    • or delegate to them
    • most important thing to fix because it’s in red
  • anything you can provide guidance on?
    • getting warmer, images with lower resolution
    • easy fix
    • SEO, probably content people
    • meta description looks pretty easy to fix
  • what would you do next?
    • definitely let people on the team know
    • this is what we’re using as a tool
    • here’s the tool
    • share these findings with them
    • see if they can help
    • i will point them to the portion of report they need to look at
    • developers do first couple of issues
    • then bring them back to have conversation
    • what would be least time consuming, quickest fixes here, how long it would take
  • use on current or future project
    • good to use on existing site
    • assess what you’re dealing with
    • hope that we build something new, there won’t be a need
    • not as straightforward, need to read line by line, learning curve
    • not very much time
    • like the color coding
    • good tool for cross-functional conversation
    • start adding to backlog and bringing value faster
    • depends on project, whole redesign effort, still keep current site working well
    • it’s automated, don’t have to do it manually
    • I see the value in it
    • just need to get situated with the tool
  • step 5
    • just did the homepage
    • how to determine which pages I need to do
  • other thoughts
    • PMs could do this on their own

P3

  • seems like a good tool
  • how long is it going to take
  • read scoring
  • not sure what contentful paint means
  • reading everything
  • “what are the four categories?”
  • looks believable (lower performance, higher accessibility)
  • may not understand what I’m looking at
  • need more guidance on that (hey dev, can you do this?)
  • not an unmanageable list
  • “let’s click on it”
  • I think I know what that means, wouldn’t have access to edit links
  • but could talk about why it’s important
  • familiar with tap targets (change design and tell dev)
  • don’t know what crawl is
  • familiar with meta desc, but not the person to write it
  • would you know who to direct them to?
  • figuring out front end from BE is always…
  • next step?
  • would be great if I could export to Excel, label it (transfer what they have already)
  • “hey, I did this thing (test), I have an Excel sheet that tells us what we need to fix”
  • planning meeting, talk about who’s going to do what
  • start with PM and tech person, maybe
  • trying another page, switch URL
  • oh, it says view report (maybe that’s where I can get an Excel)
  • view report: this is pretty
  • looks like they can get it in some other format
  • start taking notes for myself in some other document
  • focusing on high impact issues
  • probably go into this process with a few assumptions
  • seems like it would be good for something like FSIS, when trying to do a whole redesign
  • get some baseline metrics, so when project is over, we increased your SEO score by X.
  • use on something already built? something Bixal created?
  • probably, although I would hope that development has something in place to audit their own
  • couldn’t hurt, definitely consider trying it
  • any other questions?
    • for Methods, I like to have a clear idea of when this is good for, with as much direction as I can get (the beginning, the end)

post-convo

  • more questions about tool than instructions
  • mostly good
  • how can I get this data and touch it
  • maybe one or two at the top about the context
  • if I were to go to sprint planning and say this, I need to have reasons why this is a good use of time

P4

  • awareness makes sense for low hanging fruit
  • common early on
  • reminds me of Lighthouse you told me about, gives specs and report
    • haven’t done anything with it since then
  • on specific project: misunderstanding about needing work for devs
  • tool you can use for free
  • read value props
  • this uses Lighthouse
  • instead of Chrome, more of an accessible dashboard
    • more user friendly, less dev speak
  • good color scale: bad, OK, good
  • maybe I need to sign in
  • ’cause this is taking a long time
  • looking for something to make it go faster
  • initial impressions
    • AX: because of RL, became more of a skeptic on AX scans
    • RL scores really high, heuristic eval with James, lots of things not AX
    • other things to keep in mind, manual checks
    • this is a starting point
    • something I need to do going forward, solid foundational knowledge on other guidelines
  • performance? what does it mean?
    • 404 errors
    • load time slow
    • is the server malfunctioning
  • comfortable?
    • the buttons, content related (simple change)
    • guide instead of what the issues are
    • inspect? all components?
    • ask dev to go through all the buttons
    • unused CSS reminds me of USWDS, moving to a more tokenized site
    • that’s not really low-hanging fruit, not immediate return
  • for response time, ask BE dev
  • low res images: something harmless to improve
  • next steps:
    • go back to how to
    • reading high-level instructions
    • in Miro board, screens and annotations, visual way to communicate these problems
    • but these are more technical, producing report, summary, demoing it almost
    • I am proposing that FE, BE handle this, are we all in agreement? off base? PM go forth and assign people things
  • short-term applications you see for this?
    • only project is AL
    • RL already not doing quick fixes (want to do over whole thing)
    • specific project: doesn’t have any devs on that project, probably not, team extremely siloed
    • don’t have any input into what devs are working on, not what I’m focused on
    • more open-minded client
    • change averse, start small
  • how else might you use this?
    • post-launch of re-designed website
    • different layout, content, different IA
    • serve as “launched new thing, check if things are being met”
    • would have been great to do when intranet just launched
  • only just noticed URL for view report: should be a bigger call to action
    • assuming large
    • bring in dev to help me translate a lot of this stuff
    • would look through if I was trying to make recommendations
  • methods around transitioning a USWDS site, maturity model