Remote usability testing

Last month we carried out our first round of remote usability testing. I was worried it wouldn't work as well as our in-person sessions, but it turned out to be easier to organise, and just as much fun.

Usability testing involves asking a small number of participants to run through a series of tasks related to our web pages, online apps, or printed materials. As we watch the participants navigate their way through the pages, we can see the pain points they encounter that we may not have otherwise spotted.

I’ve written before about our usability testing process, when we were looking at the undergraduate prospectus. This time around, it was a set of web pages we wanted to test. The Transport, maps and parking section has recently been redeveloped, and we wanted to make sure it was user friendly.

Preparation

I started by working with Sam, who’d redesigned the content, to work out what we wanted people to be able to achieve when using the pages. What was the key information that they would need to find? I then used those scenarios to write 11 tasks for our participants to carry out.

So far, the process was pretty similar to how we normally test web pages, but because we’re working from home at the moment, everything else would be a bit different. 

Normally we get three participants to come into the office for up to an hour each, and take them through the tasks while live streaming their screen and audio to a nearby room with observers who are involved in the project.

It can be quite hard work to organise. You have to book two rooms, three participants, all of the observers, and someone to run the observation room while I’m with the participants, all on the same morning, as well as having someone on standby in case a participant doesn’t show up.

Remote testing was much easier. I decided to try running it with five participants, and recorded each Zoom session to play back at a later date. I held the sessions on different days, at times to suit both the participant and myself. I then booked in the observation session with my colleagues, with only a few calendars to coordinate and no room to book.

Liz from our Support Team volunteered to be a guinea pig participant so I could carry out a practice session. This is a useful way to check that the tasks make sense and are in a logical order, and it gives me an idea of how long it takes to run through them. This time around it also gave me the opportunity to check that recording and screen sharing over Zoom would run smoothly.

Running the testing sessions

Running through the tasks with the participants was easy. I read each one out in turn, but I’d also sent them a link to the document so they could refer back to it if they needed to. Normally I’d be sitting next to them to watch what they were doing, but this time I just had them share their screen. 

The only slight downside was that a slow internet connection could be a bit frustrating for the participant, and the video broke up a little. On the plus side, it gave us the additional insight of seeing how our pages load on slower connections, and some ideas for improvements we could make to speed up page load times.

Once I’d run all of the sessions, I had five videos, ranging from 15 to 30 minutes in length, ready to play to my three colleagues.

Observation session

We got together on a Zoom call, and I played each recording in turn using screen share. As we watched, we used a shared Jamboard to write down any problems we spotted as the participants attempted to follow the tasks to find transport information.

[Image: many, many post-its on our Jamboard]

We used a different colour for each participant. We didn’t worry if we were duplicating comments – if we had lots of similar post-its, that reinforced that something was a problem.

By the end of the session we had a very colourful board of thoughts. We discussed what we considered to be the main problems experienced by our participants, as well as working out some possible solutions.

As often happens in this type of session, the observers were fascinated by the unexpected ways people navigate web pages. When you’re working on a set of pages, it can be difficult to appreciate that people will use them in different ways until you see it for yourself. Something you may have thought was logical or straightforward is actually not that obvious to someone new to the pages. But more often than not, you can also see that there’s an easy way to fix it.

After the session, I wrote up the notes summarising the main themes and the recommendations we’d come up with as a group. We’ll now make a start on fixing the issues.

I’m definitely keen to use this method of testing again, even when we’re all back in the office.

Published by

Aimee Phillips

I'm a User Experience Designer in Communications at the University of York. My role includes carrying out usability testing, and ensuring accessibility is at the heart of everything we produce.
