Evaluating our Paper Prototype for Microsoft Teams

Sam Cummins
4 min read · Nov 8, 2020

Following on from Paper Prototyping a feature for Microsoft Teams.

Initial User Testing

With our script defined and our paper prototype digitized, we were able to conduct our first round of user testing. We each arranged to test one or two participants, giving them one overall task:

You are a teacher and presenting a PowerPoint to your class online.
You would like to increase student engagement.
We would like you to create a poll and share it with your students.
Ask them what they think about the displayed slide, aiming for a yes or no answer.
We would then like you to choose a student at random through the app to discuss their answer with the class.

and various scripted questions and subtasks throughout the interview. Alongside these we had our own list of observation questions, e.g. how long did it take them to find a certain button? Did they know what each field was?

Following this we had some closing questions to gauge their general enthusiasm for the feature, and whether they could see themselves using it and in what context. (Notes from the interview I conducted here.)

We felt this granularity of questioning and observation of the prototype in use would allow us to evaluate the prototype on three main levels:

  • Ease of use and findability — can the user find and execute the flow with no/minimal prompts?
  • Clarity of options and outcomes — can the user predict what clicking each button will do? Do they understand what the options outside of those prototyped for this flow are?
  • General enthusiasm — would they use this feature, and find it to be an asset?

Once again we met as a group and compared and discussed our findings.

Negatives

  • Lack of clarity in language used.
  • Users didn’t know that some labels represented text fields — mainly due to the low fidelity of our paper prototype.
  • Most importantly: the part of the flow that allowed the teacher to ask for participation was not clear. This part of the flow needed a redesign in our next iteration.

Positives

  • The flow was easy to execute — Users could find the entry point and walk through the steps.
  • The functionality was well received as a concept — the users thought it would help in a classroom environment and allow them to get a high-level overview of a class's knowledge, opinions or understanding quickly and efficiently.

Prototype Iteration

A view of us iterating our paper prototype on an InVision Freehand board.

This testing really highlighted one core issue — the part of the flow where the teacher opts in to asking a random student a follow-up question was confusing.

  • The users didn't know exactly what it meant.
  • There were more steps than necessary.
  • You really had to complete the flow once before you were sure what was going to happen.

We removed this opt-in step — instead of the teacher asking the students if they would be willing to answer a follow-up question and the students opting in, the teacher is now prompted at the end of the poll to hand the mic to a random person who answered.
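The mechanics of that end-of-poll prompt are simple enough to sketch. The snippet below is purely illustrative — the data structure and function names are assumptions of ours, not part of any real Teams API — but it shows the core behaviour: once the poll closes, pick one respondent at random (optionally filtered by how they answered) and hand them the mic.

```python
import random

# Hypothetical poll results: student name -> answer given.
# Illustrative data only; not from any real Teams API.
poll_responses = {
    "Aoife": "yes",
    "Brian": "no",
    "Ciara": "yes",
    "Dara": "yes",
}

def pick_random_respondent(responses, answer=None):
    """Pick a random student, optionally only among those who gave `answer`.

    Returns None if no one matches, so the teacher's prompt can be skipped.
    """
    candidates = [name for name, given in responses.items()
                  if answer is None or given == answer]
    return random.choice(candidates) if candidates else None

# When the teacher accepts the end-of-poll prompt:
chosen = pick_random_respondent(poll_responses)
print(f"Handing the mic to {chosen}")
```

Because the selection happens after the poll closes, no opt-in step is needed — every respondent is a candidate by default, which is exactly the simplification our redesign made.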

Our streamlined prototype after iterating on the user-testing feedback.

Testing our new Iteration

  • The issue with the flow was largely addressed — it was clearer and had more predictable outcomes.
  • The simpler flow made people more inclined to use it.
  • Our participants all said they would use it in a classroom scenario — at least to try it out.

Prototype Reflections

I think as a group we proposed quite a powerful addition to the Teams meeting functionality. Engagement in online meetings is a problem that affects far more than just school children and teachers, and this solution could have wide-reaching applications — from the increasingly popular online quizzes to important votes at corporate board meetings.

Project Reflections

The benefits of research-driven design were obvious almost every step of the way. When we were debating a decision as a group, what we invariably referred back to was our user research, our personas and our problem statement. With these in mind it usually became clear which path was best supported by our previous work and our gathered evidence. I think it is highly unlikely we would have gone down a similar path if all our designs and understandings had been based on assumptions and our own experiences.

Similarly, the iterative design of the prototype allowed us to step back and see the design in the hands of someone outside our group. We had all been thinking about and debating it for weeks, so it wasn't always clear to us that a flow was convoluted or confusing, whereas it became clear very quickly in someone else's hands. This really allowed us to shift our perspective and view our designs more objectively.

Also, as a group we all had extremely similar backgrounds — young, tech-literate professionals. It was important for us to get feedback from a far wider demographic, in order to cater to a wider array of users and have confidence in our solution.
