Guo, Kim, & Rubin (2014)

Guo, P. J., Kim, J., & Rubin, R. (2014, March). How video production affects student engagement: An empirical study of MOOC videos. Paper presented at the ACM Conference on Learning at Scale, Atlanta, GA, pp. 41-50.

In this article, the authors seek to identify which kinds of videos are most engaging to students. They aptly point out that online courses rely heavily on video as a means of communicating content, but these videos package and present that content in a variety of ways. Consequently, they choose to explore this question: “Which kinds of videos lead to the best student learning outcomes in a MOOC?” They focused their study on MOOCs because of the large amount of data available. In fact, they claim that their study is the largest-scale study of video engagement to date, as they are able to quantitatively analyze data from 6.9 million video watching sessions. They pair this data with qualitative analysis of interviews with six staff members who were involved in the video production. A “video watching session” consists of a single instance of a student watching a video, and the authors measured engagement by (1) the length of time a student spends on a video and (2) whether or not the student attempted the post-video assessment.

While this article doesn’t delve deep into theory or offer an extensive review of the literature, it does provide accessible, practical, research-based applications. This can be invaluable to the designer or instructor who is looking for guidance on developing the best possible instructional videos for their course. Their recommendations are as follows:

  1. Shorter videos are more engaging. In fact, “video length was by far the most significant indicator of engagement” (4). Students rarely watched videos longer than 9 minutes to completion, so the authors recommend keeping videos to 6 minutes or less.
  2. Talking heads are more engaging than PowerPoint-style presentations. Students tend to respond well to content that is “personalized.”
  3. High production value might not matter. The authors found that students preferred a natural, informal video to a polished, formal lecture-style video. Again, the “personalization” element can prove powerful.
  4. Khan-style tutorials are more engaging compared to PowerPoint-style presentations or typed explanations. According to the authors, the “natural motion of human handwriting can be more engaging than static computer-rendered fonts” (6).
  5. Pre-production improves engagement. Even when recording live classroom lectures, pre-production helps create more engaging videos.
  6. Speaking rate affects engagement. Interestingly, students tended to engage better with faster speakers. The authors hypothesize that this reflects the speaker’s enthusiasm rather than the speed itself.
  7. Students engage differently with lectures versus tutorials. Students experience lectures as a continuous watching experience, while with tutorials, they tend to re-watch, skip around, etc.

As you can see, each of these findings lends directly to a specific recommendation for video development. This chart summarizes each finding and the associated recommendations:

The authors themselves point out that this study has a few significant limitations. First, since they examine only MOOCs, the participants are more likely to be self-motivated and comfortable with educational technology, which means the findings will not necessarily apply to all online learners.

The most significant limitation, in my opinion, is that their proxies for engagement (length of time spent watching a video and attempts at the post-video assessment) might not actually measure true engagement. (To be fair, the authors point this out as well.) Similarly, while engagement is an important component of student learning, I don’t think we can necessarily say that an engaged student is also a student who is able to achieve the learning outcomes. (In other words, performance and engagement are linked but not the same thing.)

A related element that is missing from this study is qualitative data that captures the student experience. While the interviews with the six staff members involved in the video production can provide some insight, I think this study would be more robust if it also included qualitative data from the stakeholders at the center of this question: the students themselves. This could also perhaps address a few important hypotheses that the authors bring up but aren’t able to answer, such as (1) their hypothesis that shorter videos are more engaging because the content has to be more meticulously planned, and thus is of higher quality, or (2) their hypothesis that it is the level of enthusiasm, not the actual talking speed, that leads to greater student engagement.

The recommendations for practice provided in this article are easy to understand, practical, and grounded in research. The authors close with this excellent point: “To maximize student engagement, instructors must plan their lessons specifically for an online video format. Presentation styles that have worked well for centuries in traditional in-person lectures do not necessarily make for effective online educational videos” (10). As we design videos for online learning, we will do well to take this comment to heart and consider their recommendations for practice.