Remote usability testing has long been a common practice for companies with an international user base. It's simply not enough to speak only with your local customers when your biggest user segment lives abroad. This also applies to Pipedrive, where I work – most of our usability testing sessions are remote: I moderate the sessions from Tallinn, while the users test the solution in the USA, Australia, or South Africa.

Companies with local customers have had the luxury of practicing on-site usability testing. COVID-19 has pushed them into a new reality where on-site testing has to be replaced with a remote setup to follow the rules of social distancing. For desktop solutions this is rather simple, but how do you test mobile designs remotely and hassle-free?

Remote usability testing for desktop

Conducting usability testing for desktop solutions doesn't technically differ much from a regular remote meeting, which the post-COVID world is now familiar with. Remote usability testing is in its essence a video meeting with screen sharing. The cursor indicates what the user is doing on the screen, and when the webcam works, it's also possible to see the participant's facial expressions and body language. We have been using Zoom for the user interviews, as it allows easy screen sharing and recording of the sessions.

Pro tip: We recently discovered that Zoom has an auto-transcription feature that works when the videos are saved to the Zoom cloud. It helps to save time in the analysis phase.

Remote usability testing for mobile – screen sharing limitations

Testing a mobile app from a distance can be tricky. When applying the same screen sharing method on mobile, we face some limitations. First of all, asking the users to install the Zoom app on their phone and share their mobile screen can be a hassle (especially with iOS settings).

Most importantly, with screen sharing in a video call tool, the moderator misses out on seeing what the user is doing with the prototype. There is no cursor to follow on the screen when testing mobile, and Zoom has no feature to show the moderator where the user is tapping. If you are testing the live app, you are in a slightly better position, because you can often infer the user's actions by observing the changes on the screen.

The limitation becomes a deal-breaker when testing a Marvel/InVision/Figma prototype with limited functionality – then it's even more critical to examine the misclicks users make, as those don't become visible on the screen. Although the think-aloud protocol is used during the sessions, it doesn't replace the power of witnessing what the user is actually doing.

Hug your computer – the smoothest way to usability test mobile remotely

After testing the live mobile app with Zoom screen sharing and experiencing the limitations, I asked for advice as an upcoming prototype test was already on the horizon. A colleague introduced me to this clever testing hack: ask the user to hug their laptop.

The "hugging the laptop" method means that the user joins the call on their laptop and shows their mobile screen via the webcam, so their computer must have a working webcam. When the initial introduction to the session is done, the moderator asks the participant to turn their laptop 180 degrees so that the screen faces away from them. When the participant then places their hands around their computer and holds their mobile in their hands, it becomes visible on the webcam. This sensationally smooth solution requires nothing from the user besides a laptop with a webcam.

Pro tip: Make sure already in the recruitment phase that the participants have a working webcam.

In my experience, participants readily go along with the computer-hugging idea. I always acknowledge that it is a bit funny to hug the computer, because we usually don't use our computers this way. I have also found that it is easier to explain the setup to users by demonstrating it first on my own device.

Pro tip: Once you turn around your computer and show how the mobile looks on the webcam, everyone gets the idea.

The level of detail and clarity of the design on the moderator's screen is quite good. In the case of prototype testing, you know the solution well anyway and can check details in the prototype if needed.

It's essential to guide the tester to position the laptop screen and their hands at an angle that lets you see the full mobile display. Depending on the light, there might be a reflection that makes it hard to recognize what is on the screen – this is easily solved by guiding the participant to change their position a bit.

Pros and cons of the "hugging your computer" approach

The most significant advantage of this solution is the possibility to see the full movement of the fingers. Seeing hand gestures is especially important when testing a prototype where not everything is clickable. You can see where the users misclick and how they move their fingers while thinking. You can see where they point when explaining something, and whether they try to swipe or scroll when that's not part of the prototype.

The trade-off with the hugging-your-computer setup is that you can't see the facial expressions of the user at the same time, since the webcam is pointed at the mobile device in their hands. I am willing to make this trade-off when testing mobile, because the fingers express more than the face.

Steps of remote mobile usability testing:

  1. The facilitator and the participant join the Zoom call on desktop devices. (Any other video calling tool works, too.)
  2. The facilitator asks the participant to turn on the webcam.
  3. The facilitator asks the participant to turn their laptop 180 degrees so that the screen faces away from the participant – the "hug your computer" method.
  4. Pro tip: it is easier to explain the method when the facilitator demonstrates it with their own computer.
  5. Once the participant has turned their computer around, the facilitator asks them to adjust the screen angle so that their mobile screen and hands are fully visible in the camera.
  6. The facilitator sends the prototype link to the participant in a way that they can open it on their mobile. With a live application, the participant opens the app/webpage.
  7. The facilitator moderates the session as planned and observes the user's behavior on the screen.
  8. I highly recommend recording the session for later analysis.