
Saturday, October 11, 2008

Real-Time, Quantitative Capture of User Response to Streaming Content

1. Introduction

Usability studies use both qualitative and quantitative methods to capture user response to the interface being tested. We can measure mouse clicks, time on task, task completion rates and other valuable data. We can also collect verbal feedback about ease of use, visual design, layout and other subjective responses. Processing collected verbal data is expensive because recordings have to be transcribed, tagged and often edited for readability. This is a labor-intensive process, and if the testing is done with users who speak different languages, translation is also required. Moreover, even when interviews are carefully scripted and prompts are consistent, responses are often difficult to reconcile: participants' answers can be inconsistent, vague, and generally hard to analyze and interpret.

Verbal feedback is also used to capture participants' response to streaming content and to gauge their level of engagement with that content. Typically the tester pauses the media and prompts the participant for her or his opinion. The benefit of this method is that the feedback is contextually related to the content that has just been displayed and is fresh in the respondent's mind. The disadvantage is the labor-intensive post-session processing and interpretation of the information gathered. Alternatively, a user can be given a questionnaire at the end of the streaming content. The benefit of a questionnaire is that the responses are easier to process and measure, but the drawback is that the participant is not likely to recall in much detail their response to, or sense of engagement with, content that was displayed minutes ago.

This paper describes a method I developed to capture in real-time participants' response to streaming content as well as their engagement levels throughout the presentation. The key benefits of this method are:
  1. Capture users' responses in real time to streaming content such as web seminars, tutorials and demos, where the user interface itself plays a smaller role in the interaction.
  2. Significantly lower the time and labor costs associated with processing the feedback, which may help budget for larger samples.
  3. Capture response to streaming content on test pages you set up yourself, or on any website or application.
The method involves the use of TechSmith's Morae*, which is currently the only commercial, out-of-the-box software for usability testing. The method leverages Morae's capability to capture, among other things, mouse movement and mouse clicks.

2. Methods
2.1 Create your own test pages
The first approach is to create your own test pages. This scenario works well when you:
  1. Wish to separate the tested content from the associated company's identity by isolating it from the rest of the company's site and the site's URL.
  2. Are testing several draft variations of the content, don't want to bother site admins with posting them, and need to run the test locally off your machine.
An added benefit is that you can perform the test without worrying about the quality of bandwidth at the test location, or about an internet connection altogether.
Some technical skill in creating a standard web page is required to set up your own test pages, but a typical page is really simple, composed of the embedded streaming content (typically a Flash file, so you will need the SWF file) and a single graphic used to capture the feedback on content and engagement. See image 1 below:
You need to create an image that will be used to capture the user's responses to content and the user's engagement level. This graphic can be as fancy as you wish, but my suggestion is to keep it simple and remember that the main event on the page is the streaming content, not these graphics. Here is an image I typically use:

The image is divided into 2 sections:
  1. Left side - Response to content. A rating scale from 1 to 7, with 1 being "I don't care -- trivial content" to 7 being "Really important -- Tell me more!"
  2. Right side - Engagement level. A rating scale from 1 to 7, with 1 being "I'm bored" to 7 being "I'm fully engaged"
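Putting the pieces together, the page itself needs nothing beyond basic HTML. Here is a minimal sketch, generated by a short Python script; the file names demo.swf and rating-bar.png and the 640-pixel width are placeholders for your own assets:

```python
# Minimal local test page: embedded Flash clip on top, static
# rating-bar image directly below it, both the same width.
# demo.swf and rating-bar.png are placeholder names.
PAGE = """<!DOCTYPE html>
<html>
<head><title>Streaming content test</title></head>
<body style="margin:0; text-align:center; background:#fff">
  <object width="640" height="480">
    <param name="movie" value="demo.swf">
    <embed src="demo.swf" width="640" height="480">
  </object>
  <img src="rating-bar.png" width="640"
       alt="content and engagement rating bars">
</body>
</html>
"""

with open("test_page.html", "w") as f:
    f.write(PAGE)
```

Open the resulting file in the browser on the test machine; since everything is local, no connection is needed.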
Morae Study Configuration
To maximize efficiency of logging sessions in Morae Manager, it is best to prepare the study configuration in advance. See image below:

For a 7-point rating scale, prepare 7 markers for content and 7 markers for engagement, and label them Content 1, Content 2, etc. Change the letter associated with each marker to a sequence that will make it easy to use keyboard shortcuts while logging. Finally, assign one color to all content markers and a different one to all engagement markers. The different colors provide clear differentiation once you finish placing all the markers.
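It also helps to keep the same letter-to-rating mapping in whatever script you later use to process the logged markers. A sketch, assuming letters a-g for content and h-n for engagement (my own convention; the actual letter assignments are whatever you set in Morae):

```python
# Letter-to-rating mapping, mirroring the marker setup in Morae.
# The a-g / h-n assignment is an assumed convention, not a Morae default.
CONTENT_KEYS = dict(zip("abcdefg", range(1, 8)))     # Content 1..7
ENGAGEMENT_KEYS = dict(zip("hijklmn", range(1, 8)))  # Engagement 1..7

def marker_value(key):
    """Translate a logged marker letter back to a (scale, rating) pair."""
    if key in CONTENT_KEYS:
        return "content", CONTENT_KEYS[key]
    if key in ENGAGEMENT_KEYS:
        return "engagement", ENGAGEMENT_KEYS[key]
    return None  # not one of the rating markers
```

For example, marker_value("a") gives ("content", 1) and marker_value("n") gives ("engagement", 7).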

How it Works:
Ask the user to click the relevant ratings on the content and engagement bars as the content streams, as many times as makes sense. Morae captures mouse clicks on the bars, which are easy to see and log. (The red triangle in the image below is generated by Morae Recorder during the session.)
When logging the session it is possible to identify with a high degree of accuracy which section of the streaming content the participant rated, and, of course, the assigned value. With a big enough sample you can get good insight into participants' opinions about the content (both narration and visuals) as well as their engagement level throughout the stream.
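If you jot down the clip's outline before the session, each marker timestamp can be translated into the segment being rated. A minimal sketch of that lookup; the segment names and times below are invented for illustration:

```python
def segment_for(t, segments):
    """Return the name of the content segment a rating timestamp falls in.

    segments: list of (start_seconds, name) pairs, sorted by start time.
    t: seconds from the start of the clip (marker time minus the time
    the clip started playing in the recording).
    """
    name = None
    for start, label in segments:
        if t >= start:
            name = label  # the latest segment that has already started
        else:
            break
    return name

# Hypothetical outline for a two-minute demo clip:
OUTLINE = [(0, "intro"), (30, "feature walkthrough"), (90, "pricing")]
```

With this outline, a rating logged 45 seconds in is attributed to the feature walkthrough.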

Some production tips:
  1. For the screen to look aesthetically pleasing and professional, I adjust the width of the image to match the width of the embedded content I'm testing.
  2. The buttons on the bar should be clear, easy to see, and easy to click.
  3. The labels should be clear and easy to read.
  4. This is a static image, so there is no need to create mouse-over states.
  5. Keep the number of shades and colors used for the buttons to a minimum: the participant needs to focus on the media, not the buttons, so minimize visual overload.
  6. Differentiate between the low and high scores. I use a gradual shift from white (1) to yellow (7).
  7. Make sure you have good speakers so that the participant can clearly hear the narration.
2.2 Capture any web page
The second approach makes it possible to capture users' response to any streaming content, on any site. This scenario works when:
  1. You want to test content that is on a production site but you don't have the media file locally
  2. You want to capture response to a section of a competitor's site
  3. You want to capture response to streaming content while also conducting a traditional usability test of the site (navigation, workflow, tasks and so on)
  4. For some reason you cannot use self-created test pages.
Just keep in mind that an internet connection will be required during testing; try to avoid a wireless connection at all costs and opt for an ethernet cable, if available.

How it Works:
Since a measurement-bar graphic cannot be used, I suggest a low-tech solution: drafting tape. The simplest method: apply a strip of drafting tape directly to the monitor, above the clip you want to test. With a Sharpie, write 'Content' in the top center, the number 1 on the left, 2 in the middle and 3 on the right. Apply a second strip at the bottom of the clip, and write 'Engagement' and the three numbers.

The strips guide the user to well-defined areas of the screen where you want them to click. The tape is semi-transparent, so the user can see the mouse pointer when they click, and since the clicks land in areas that are not part of the content object, the streaming is not interrupted. When you view the recorded session later, the drafting-tape strips will obviously not be there, but since you know their meaning, the clusters of clicks at the top and bottom of the clip, and to the left, middle and right, will let you collect the relevant data as effectively as if a graphic had been there. Once this section of the study is done, you can peel the tape off the screen and move on to another topic.
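When tallying the clicks afterwards, the same top/bottom and left/middle/right logic can be scripted. A sketch, assuming you have measured the clip's position and size on the recorded screen; all geometry values here are placeholders:

```python
def tape_rating(x, y, clip_left, clip_top, clip_width, clip_height, strip=30):
    """Classify a recorded click near the clip into (scale, rating 1-3).

    The strip above the clip is the content scale, the strip below it
    the engagement scale; each is split into left/middle/right thirds
    for ratings 1/2/3. `strip` is the assumed tape height in pixels.
    """
    if not (clip_left <= x < clip_left + clip_width):
        return None  # click is outside the clip's horizontal span
    third = clip_width / 3
    rating = min(int((x - clip_left) / third) + 1, 3)
    if clip_top - strip <= y < clip_top:
        return "content", rating          # click on the top strip
    if clip_top + clip_height <= y < clip_top + clip_height + strip:
        return "engagement", rating       # click on the bottom strip
    return None  # click inside the clip or elsewhere on screen
```

For a clip at (50, 100) that is 300x200 pixels, a click at (100, 90) lands on the left third of the top strip, i.e. content rating 1.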

While the example above works best for a 3-point rating system, you can set up a more granular system using the left and right sides of the box. However, keep in mind that you want to keep it simple, and that adding too much tape around the clip may mask too much of the screen. Also think about how accurately you will be able to log the clicks as the regions get smaller.

What you need:
  1. 1" 3M™ Scotch® 230 drafting tape: it sticks to the screen but is easy to peel off. You can get it at any office supply store.
  2. Ultra or extra fine tip Sharpies: I use blue for the content strip, black for the engagement strip and red for the numbers. (Avoid using red and green for labels, because they carry inherent associations with bad (red) and good (green), which may confuse the user.)
  3. Small scissors (to cut the tape nicely)
  4. Lens cleaner solution to wipe the screen after peeling off the tape
Keep in mind
  1. Don't be sloppy: cut the strips with scissors. If you have to tear the tape, fold about 1/2" on each side to give the strip straight edges.
  2. Apply the tape as horizontally as you can (a level is not needed...)
  3. Demonstrate to the user how you want them to act during the recording, and make sure they are comfortable with the mouse going 'under' the tape while they click it.

3. What's next?
Once you capture and tag the sessions, it is possible to translate the data into valuable information. There are many interesting ways to slice and dice the data, well beyond the scope of this document. However, as you can see in the graphs below, aggregating session data makes a compelling story about response to content and level of engagement with existing or proposed streaming media, helps present important analysis to stakeholders, and helps develop strategies.
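As one simple example of slicing the data, ratings pooled across sessions can be averaged per time slice of the clip, which is the basis for time-series graphs like these. A minimal sketch; the bin size and sample data are purely illustrative:

```python
from collections import defaultdict

def aggregate(ratings, bin_seconds=15):
    """Average rating per time bin, pooled across all sessions.

    ratings: iterable of (timestamp_seconds, value) pairs collected
    from every participant's logged markers or clicks.
    Returns {bin_start_seconds: mean_rating}.
    """
    bins = defaultdict(list)
    for t, value in ratings:
        bins[int(t // bin_seconds)].append(value)
    return {b * bin_seconds: sum(v) / len(v)
            for b, v in sorted(bins.items())}
```

Feeding the result to any charting tool gives an engagement (or content) curve over the length of the clip, making it easy to spot the sections where interest drops.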

------------------------------------------------------------------------
* Can be used with Morae 2 and 3.