The Team
Publications
Improving Student Creativity With Gamification And Virtual Worlds. In Proceedings Gamification 2013.
Improving Participation And Learning With Gamification. In Proceedings Gamification 2013.
Engaging Engineering Students With Gamification. In Proceedings VS-Games 2013. Best Full Paper Award!
So Fun It Hurts: Gamifying an Engineering Course. In Proceedings HCII 2013.
Melhorando o Ensino Universitário Com a Gamificação. In Proceedings of the 5th Portuguese Conference on Human-Machine Interaction (in Portuguese).
Funding
Funded in part by AAL PAELife Project AAL/0014/2009 and FCT individual grant SFRH/BD/72735/2010
Let’s Game It: Improving College Learning with Gamification
Well-designed games are good motivators by nature, as they imbue players with clear goals and a sense of reward and fulfillment, encouraging them to persist in their quests. Recently, this motivational power has started to be applied to non-game contexts, a practice known as gamification. It adds game elements to non-game processes, motivating users to adopt new behaviors, such as improving their physical condition, working more, or learning something new. In this study we explore how gamification can be used in college-level education to improve the students' learning experience. We are performing a series of iterative experiments with an engineering course, in which a gamified version of the course is deployed every year, and the resulting student feedback is used to fine-tune the experience for the following year. Our goals are threefold: 1) assess the direct impact of the employed gamification elements on several student performance measures, such as online participation and final grade; 2) identify different types of students, what strategies they use, and how they adapt to changes in the course; and 3) evaluate quantitative and qualitative student feedback to derive meaningful design guidelines for future gamified learning experiences.
Problem
Gamification consists of adding game elements to non-game contexts to engage users and encourage them to adopt specific behaviors. It draws on the motivational characteristics of good games which, unlike traditional learning materials, can deliver information on demand and within context, balancing challenge difficulty against each player's abilities. Some gamified systems focus on keeping users engaged while they learn new techniques and tools. Prominent examples are Microsoft Ribbon Hero, which encourages users to explore Microsoft Office tools, and Adobe LevelUp, which does the same for Photoshop. Other services, like Khan Academy and Codecademy, let online students learn about diverse topics by watching videos and then completing exercises, with progress tracked through badges and points. However, the major benefits of gamification for students' learning experience, when applied to education, remain largely unexplored. Early work shows promising results, with students displaying signs of increased engagement and performance. However, little is known about how different students react and adapt to gamified learning, and how gamification can effectively be used to improve their experience. Our experiments aim to answer these questions and enrich the body of knowledge on the suitability of gamification for improving college learning.
The Gamified Course
MCP is an annual semester-long MSc course in Information Systems and Computer Engineering at Instituto Superior Técnico, University of Lisbon. The school has two campuses, Alameda and Taguspark, where the course runs simultaneously. The course follows a blended learning program: students attend live theoretical lectures (3 hrs per week) and work on projects in laboratory classes (1 hr per week), but they also engage in discussions and complete assignments in the course's virtual learning environment (Moodle). The theoretical lectures cover several multimedia concepts, ranging from capture, editing and production techniques to file formats, multimedia standards, copyright and Digital Rights Management. In laboratory classes, students are introduced to diverse concepts and tools for image, audio and video manipulation.
Prior to gamification, course evaluation consisted of several theoretical quizzes, a multimedia presentation, lab classes, a final exam, online participation and class attendance, with the final grade represented as a value between 0 and 20. By analyzing student performance data from non-gamified years, we found room to improve student engagement, as shown by low online participation on Moodle, low attendance rates and a lack of interest in the reference material (low number of downloads). To make the course more engaging, fun and interesting, we decided to gamify it by adding experience points (XP), levels, leaderboards, challenges and badges, which seem to be among the most consensual game elements used in gamification. So far we have had three consecutive gamified years, in which several game elements have been tested.
The First Experiment (2010-2011)
In the first gamified year, students were awarded XP as they performed course activities, which provided direct feedback on how they were doing and motivated them through instant gratification. Every 900 XP corresponded to one level, and each level mapped directly to a grade point on the traditional 20-point scale. For example, a student with 1800 XP would be at level 2, meaning her grade so far was 2. To prevent rounding problems, students were given a head-start bonus of half a level (450 XP) for enrolling in the course.
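As a concrete illustration, the XP-to-grade mapping above can be sketched in a few lines of code. This is a hypothetical reconstruction; the constant and function names are ours, not the actual course system's:

```python
# Illustrative sketch of the first experiment's scoring rules (names are
# our own invention, not the real implementation).

XP_PER_LEVEL = 900   # one level (= one grade point) per 900 XP
HEAD_START = 450     # half-level enrollment bonus, to prevent rounding problems

def level_for(earned_xp: int) -> int:
    """Map a student's earned XP (plus the head-start bonus) to a level,
    which corresponds directly to a grade on the 0-20 scale."""
    return min((earned_xp + HEAD_START) // XP_PER_LEVEL, 20)

print(level_for(1800))  # a student with 1800 earned XP sits at level 2
```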
The leaderboard webpage provided an entry point to the gamified experience and was publicly available from the forums. It listed all enrolled students, sorted in descending order of level and XP. Each row showed the player's rank, photo and name, campus, XP, level and achievements, the latter awarded as collectible badges for completing course activities such as attending lectures, finding resources related to class subjects, finding bugs in class materials or completing challenges. Clicking a row revealed that player's achievement history, which made progression transparent and allowed students to learn by watching others.
Challenges were tasks students had to complete to be granted XP and achievements. There were two main Challenge types. Theoretical Challenges were activities presented to students over the semester, at the end of some lectures. These consisted of small creative and time-limited tasks, designed to explore multimedia types and materials taught in those lectures. A special case of Theoretical Challenges were the Online Quests, which were more demanding, not particularly bound to any lecture, and with a longer deadline. Online Quests were the only challenges that already existed in the previous non-gamified course versions. The second type was the Lab Challenges, which were assigned during the first month of classes, before other lab evaluations started. These were meant to be fun and expressive, by allowing students to make creative content using multimedia tools introduced in lab classes.
Challenges were formally issued by faculty via posts to the course forums. Achievements and graded activities were logged based on students' posts, except for attendance, lab grading, quizzes and exams, which were recorded manually. The main purpose of challenges and achievements was to shape course activities into meaningful deeds while giving students the autonomy to choose which tasks and achievements to pursue. Achievements also provided feedback on how proficient students had become in specific subjects. Achievements could be single-level (a task performed once) or multi-level (performed several times with increasing difficulty). The whole scoring process was done manually: data from lectures and lab classes were logged by faculty, and activity logs from Moodle were downloaded daily. Two to three times a day, a script was run by hand to process all log files and generate the updated leaderboard webpage.
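The manual pipeline just described — aggregate each student's XP from the day's logs, then regenerate the ranked page — can be roughly sketched as follows. The log format, function and variable names are all our assumptions, not the actual script:

```python
# Hypothetical sketch of the daily scoring script: sum per-student XP from
# combined faculty/Moodle log entries, then rank by level and XP descending.

from collections import defaultdict

XP_PER_LEVEL = 900

def build_leaderboard(log_entries):
    """log_entries: iterable of (student, xp_awarded) tuples.
    Returns (name, total_xp, level) rows in descending (level, xp) order."""
    totals = defaultdict(int)
    for student, xp in log_entries:
        totals[student] += xp
    rows = [(name, xp, xp // XP_PER_LEVEL) for name, xp in totals.items()]
    rows.sort(key=lambda r: (r[2], r[1]), reverse=True)
    return rows

logs = [("alice", 500), ("bob", 1200), ("alice", 700), ("carol", 300)]
for name, xp, level in build_leaderboard(logs):
    print(f"{name}: level {level}, {xp} XP")
```

In the real system each row also carried rank, photo, campus and badge data; the ranking logic above is the essential part.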
There seems to be a strong bond between how intrinsically motivated students are and how well they perform in learning. Self-Determination Theory (SDT) identifies three innate needs of intrinsic motivation: Autonomy, a sense of volition or willingness when doing a task; Competence, referring to a need for challenge and feelings of effectance; and Relatedness, experienced when a person feels connected with others. We tried to align the gamified course’s goals with those of students, in order to improve the experience’s intrinsic value. The rationale behind the selection and integration of game elements with the course was thus based on SDT: Autonomy was promoted by providing options on what challenges to pursue and which achievements to level up. We tried to boost competence by displaying positive feedback and progress via points, levels and badges. We aimed to improve relatedness by providing means of competition, cooperation and online interaction among players.
In this study we wanted to evaluate how gamification would affect students of our course, and for this we analyzed data from two consecutive years. In the first year, a non-gamified version of the course was used, similarly to any of the previous years. In the second year, we deployed our gamified system.
Data Collection and Results
We collected data on many aspects of student performance, such as the number of lecture-slide downloads, the number of posts in forums, class attendance and grades. These elements have previously been used as informal measures of student engagement. We compared values from this year with those of the previous non-gamified year and assessed the main changes between them. We also explored student feedback from the satisfaction surveys issued at the end of the course. Since the data did not appear to follow a normal distribution, all statistical differences between groups were checked using a non-parametric Mann-Whitney U test. The results of the experiment were very encouraging, showing notable gains in attendance, participation and material downloads, which suggests improved engagement and diligence.
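For readers who want to reproduce this kind of comparison, a two-sided Mann-Whitney U test on per-student counts takes only a few lines. The use of scipy and the data below are our assumptions for illustration; the study does not name its statistics package:

```python
# Between-year comparison of per-student forum posts with a two-sided
# Mann-Whitney U test (illustrative, made-up counts).

from scipy.stats import mannwhitneyu

posts_non_gamified = [0, 1, 0, 2, 1, 0, 3, 1]
posts_gamified     = [4, 7, 2, 9, 5, 6, 3, 8]

u_stat, p_value = mannwhitneyu(posts_non_gamified, posts_gamified,
                               alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")
```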
Engagement, Satisfaction and Performance
The average attendance for regular lectures significantly increased by 11% from the non-gamified to the gamified year, suggesting that students had greater interest in attending classes. The number of posts made by students presented the largest growth, with 511% more replies in the gamified year and 845% more initiated threads. This suggests not only more participation but also a massive increase in proactivity, reflecting a greater willingness to engage in discussions. Notably, even when excluding the new evaluation component, the number of posts in the second year was still 281% larger than in the non-gamified year. Furthermore, 47% of all posts in the second year related to challenges students had to complete, which awarded them XP and badges. This suggests that challenges are indeed both good engagers and behavior drivers, as supported by recent studies (Chen, 2012), and that the desire to collect may also be a powerful motivator.
The significant increases in the number of downloads, posts and attendance indicate that students were more engaged with this new gamified learning experience than with the previous non-gamified one. Students were generally satisfied and found the MCP course more motivating and interesting than other courses, even though it required more work. This is an interesting finding: students often complain about heavy workloads, yet with gamification they did not mind spending more time working on the course and were more satisfied. While this suggests that students were motivated to work, it is not clear whether this motivation was extrinsic or intrinsic; this is a topic for future research. Even though we have no evidence that their marks were significantly affected by the game-like experience, we know that they spent more time working and enjoyed MCP more than other courses. Notwithstanding, it would be of interest to identify a correlation between student grades and gamified learning.
Design Implications
We learned that challenges have to be carefully crafted in order to add meaning and interest to a gamified learning experience. Students preferred challenges like “Proficient Tool User”, “Rise to the Challenge” and “Amphitheatre Lover”, whose main goal was to encourage them to attend lectures, explore topics taught in lectures and, above all, to create and be expressive, something not usually promoted in traditional engineering courses. These goals were not only in line with students' personal objective of passing the course, but also creative and fun, which is why students posted multiple replies to some challenges even when only one would count for grading. This might have made those challenges more meaningful which, as seen in the literature, is of major importance. It also explains why other challenges, such as “Attentive Student” (find typos in class materials), were less popular: they were perceived as meaningless.
While points, levels and multi-level achievements seem to have worked well to convey competence and mastery to students, it was much harder to convey autonomy, due to the course's evaluation constraints. Still, students had the freedom to be creative in challenges, and to choose which tasks and achievements to pursue and which were worth leveling up. As suggested by students, this could be further improved by adding achievement trees, in which some achievements would only be unlocked once the preceding ones were accomplished.
We also found that the balance between how hard and how gratifying a challenge is matters for keeping students engaged, which can be related to flow theory. This explains why the “Bug Squasher” achievement was the least popular: fixing source-code bugs and recompiling the course's prototype for only 40 extra XP was not worth the effort. Furthermore, we found that challenges should be spread throughout the term to avoid periods without appealing goals, during which students become bored and demotivated.
Students suggested that the gaming experience could be improved by adding items with interesting effects, achievement trees and by representing each player with a customizable Avatar, which would allow students to develop online identity, reputation and become more committed. It was also suggested that there should be more direct competition among students, such as rewarding those that accomplish an achievement before anyone else. However, students also saw room for cooperation, and suggested adding group achievements, like rewarding everybody if every student reached at least 80% of the maximum grade. This could yield interesting effects in the whole class dynamics.
Furthermore, by interviewing faculty and observing students' behavior, we qualitatively identified the following concerns to take into consideration when designing gamified courses:
- Lighten the pace. The perceived workload must be carefully managed. The intervals between tasks should be carefully chosen to better balance this facet of the game.
- Careful comparisons. Consider other leaderboard types that do not make direct comparison between students of widely different ratings so easy (e.g., displaying only the immediate neighbors, or having leaderboards for different “leagues”).
- Reward quality. Estimate the quality of each student’s participation and award XP accordingly. This will increase the amount of work done by the professors but is a requirement for the perceived fairness of the course.
- Make them participate as soon as possible. Many students only want to start playing when it is too late. Tailoring the game experience so that they are compelled to participate (and see meaningful rewards) early on will yield better results.
- Give them the chance to make up for lost time. While some tasks and challenges will always be time-bound, whenever possible it should be allowed for students to address the different challenges in a more unconstrained way.
- Provide means for cooperation. These should not be completely decoupled from competition. Find mechanisms where several students can work together towards a common goal while still letting each student show off their own work.
- Make it all about the game. Several students thought they could neglect the game because some traditional evaluation components (e.g., the exam) were still in place. Reducing their importance (or getting rid of them altogether) will dispel this illusion.
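One way to realize the “careful comparisons” guideline above is a windowed leaderboard that shows each student only their immediate neighbors instead of the full ranking. This is purely a hypothetical sketch of that idea; the names are ours:

```python
# Hypothetical neighbor-window leaderboard: each student sees only the
# players immediately around them in the ranking.

def neighbor_view(ranked_names, student, radius=1):
    """Return the slice of the ranked list centered on `student`,
    extending `radius` places up and down (clipped at the top)."""
    i = ranked_names.index(student)
    lo = max(0, i - radius)
    return ranked_names[lo:i + radius + 1]

board = ["ana", "bruno", "carla", "diogo", "eva"]  # best to worst
print(neighbor_view(board, "carla"))  # middle of the pack: one above, one below
print(neighbor_view(board, "ana"))    # top of the board: no one above
```

The same slicing idea extends naturally to per-“league” boards by partitioning the ranked list before windowing.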
We acknowledge that additional improvements could be made to this gamified setup to further engage students, such as rewarding oral participation and granting more XP at higher achievement levels, as the current setup might make top levels less appealing. The interface could also issue notifications to communicate progress, and the leaderboard could allow students to perform direct comparisons with others. The quality of student posts should also be taken into account in grading, to discourage pointless posts and promote fairness.
The Second Experiment (2011-2012)
In the first gamified experiment we saw significant improvements in lecture attendance and in the number of downloaded lecture slides, with the largest gains in both initiated threads and reply posts on the forums. This suggested that students were more participative and proactive, enjoyed attending classes more and paid more attention to support materials, which seems to reflect a deeper engagement with the course. However, no significant gains were seen in final grades, which raises the question of whether this approach affects learning outcomes. With our second experiment we tried to verify our previous findings and assess the impact of our approach on student grades.
In the academic year of 2011-2012 we performed another experiment. Our gamified course was improved based on student feedback from the previous year, with two new achievements rewarding cooperation in the labs and one rewarding oral participation. Additionally, a new achievement rewarded students for timely responses to challenges, and another rewarded students for compiling challenge results. We had also received critiques that the achievements were too much trouble for only 10% of the total grade, which led us to re-weight the course so that quizzes were worth 10% less and achievements 10% more. Since achievements were now worth more, we created six new challenges, four theoretical and two lab challenges, with the intent of spreading the workload more evenly over the semester, which had also been criticized. For cosmetic purposes, we changed the amount of XP per level from 900 to 1200.
Data Collection and Results
We analyzed data from five academic years, beginning in 2007-2008 and ending with 2011-2012, with the last two being gamified years. We collected data on student lecture attendance, the number of lecture-slide downloads and the number of posts, as well as grades from quizzes, lab evaluations, the multimedia presentation, the exam, and the final grade. Students had similar backgrounds in all years, with the majority having finished their computer science undergraduate degree in the previous year, and a minority composed of one to three exchange students. For the five years we had 52, 62, 41, 35 and 52 students respectively, excluding those that dropped out or enrolled mid-semester and could not complete the course. The faculty staff for the theoretical lectures consisted of two professors and remained the same across all years. The lab classes were taught by one or two instructors, who changed from one year to the next. We had 18 lectures every year, except for 2009-2010, in which we had 19.
Student data in both experiments was collected in an uncontrolled environment. Variables like the composition of taught subjects, support materials and faculty staff could not be manipulated, as some of these changed and evolved on a five-year time span. Since our data did not appear to follow a normal distribution, all statistical differences between groups were checked using the non-parametric Kruskal-Wallis and post-hoc Mann-Whitney’s U tests with Bonferroni correction. Correlations between variables were calculated using the Spearman's rank correlation coefficient. We have also collected qualitative feedback from students with a questionnaire by the end of each gamified experiment, which we will also present.
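In practice, the pipeline above can be run with a few calls to a standard statistics library. The use of scipy and the data below are our assumptions for illustration; the study does not name its tooling:

```python
# Sketch of the five-year analysis: Kruskal-Wallis omnibus test across
# years, pairwise post-hoc Mann-Whitney U tests with Bonferroni
# correction, and a Spearman correlation between two per-student measures.
# All numbers here are invented, not the study's data.

from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu, spearmanr

years = {
    "2009-10": [1, 0, 2, 1, 0, 1],     # posts per student (made up)
    "2010-11": [4, 6, 3, 5, 7, 4],
    "2011-12": [8, 9, 7, 10, 6, 9],
}

h_stat, p_omnibus = kruskal(*years.values())
pairs = list(combinations(years, 2))
alpha = 0.05 / len(pairs)               # Bonferroni-corrected threshold
for a, b in pairs:
    _, p = mannwhitneyu(years[a], years[b], alternative="two-sided")
    print(f"{a} vs {b}: p = {p:.4f} ({'sig' if p < alpha else 'n.s.'})")

# e.g. challenge posts vs final grade, per student (also made up):
rho, p_rho = spearmanr([3, 5, 2, 8, 6], [10, 13, 9, 18, 14])
print(f"Spearman rho = {rho:.2f}")
```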
Our results allowed us to answer our three main research questions:
Does our new data support our previous findings?
Contrary to our previous findings, we did not observe a consistent increase in student attendance across both gamified years. We saw a significant increase in the first year, but the same did not occur in the second, which presented attendance levels similar to the non-gamified years. Our approach thus seems to have no consistent effect on attendance, which might be explained by most of the gamification occurring around online content. This subject requires further study.
The number of normalized downloads per lecture grew consistently in both gamified years, reaching 1.5 to 3 times the number of downloads observed in the non-gamified ones. This suggests that students might have been more motivated to download lecture slides, as seen in our previous study. However, while the number of downloads per student increased significantly in the first gamified year, in the second it returned to values similar to the non-gamified years. This might be due to variation in the course materials, which could have affected student interest and the number of downloadable items. It is hard to draw conclusions here but, ultimately, it suggests that students can be engaged to pay attention to course material as long as doing so is rewarded, like many other aspects of the experience.
Compared to the non-gamified years, the number of posts per student grew significantly: 4 to 6 times in the first gamified year, and 6 to 10 times in the second. This derives from significant increases in both reply posts and initiated threads. These results support our previous findings, suggesting that the gamified course can engage students into participating and being more proactive in the forums.
Student feedback was consistent across experiments: students found the gamified course more motivating and interesting than other regular courses, and thought the gamified experience should be extended to other courses. MCP was perceived as being as easy to learn from as other courses, and as promoting study of similar quality but more continuous in nature. Students only mildly felt they were playing a game, which suggests the game-like feeling still has room for improvement. They also felt achievements contributed to their learning but should be better rewarded, especially in the second year, which is in line with their increased perception that the course required more work.
How did the gamified experiment affect the grades?
Although our second gamified year presented the highest final grades to date, we lack statistical evidence to support a significant increase. However, we observed both the highest minimum grade and the largest number of students ever reaching the top grade, which suggests an improvement in learning outcomes. This implies a decreased disparity in the grade distribution, which seems to have benefited all students, including those that would otherwise score poorly. We hypothesize that this happened because there were more, and better rewarded, challenges, which gave students more opportunities to succeed. The significant increase in both challenge posts per student and posts per challenge thread between gamified years, and the growing strong correlation between challenge posts and final grade, seem to support this hypothesis. It may also explain, in part, the differences in learning outcomes between experiments.
We found that in both gamified years there was a strong correlation between theoretical challenge results and final grades, and also a moderate correlation with quiz grades. Given that both quizzes and theoretical challenges are forms of continuous assessment, that both cover broadly the same topics, and that quiz grades slightly improved from the first to the second experiment, we hypothesize that theoretical challenges might help students study and obtain better grades in continuous assessment components, like the quizzes. In the first gamified year, due to the small number of challenges and their uneven distribution, students felt that the second half of the term had fewer activities to do. This seems to support the benefits of using challenges to boost both student engagement and learning outcomes.
How was engagement affected by Gamified MCP 2.0?
Both experiments suggest that students were more proactive and participative in our gamified course, and that attention to reference materials might be positively influenced, which points to a deeper engagement. This is corroborated by the students' opinion that the course was more motivating and interesting than other, non-gamified courses. The second experiment, however, brought additional evidence that gamification can indeed enhance student engagement.
In the second year we increased the number of challenges, improved their distribution over time, and increased their reward. As a result, we saw a significant 66% increase in student posts over the previous year, with 73% of these made in challenge threads, reflecting a large improvement in student participation. One might argue that students posted more simply because they had more challenges to attempt, but the data shows they also posted 55% more per challenge thread. The question of whether they posted more because challenges were more rewarding might arise, but challenges were worth less than 5% of the maximum grade. Also, we found a moderate correlation between the number of challenge posts and non-rewarded posts, and this correlation grew from the first to the second experiment. This might suggest that the more students participate in challenges, the more they participate in other, non-rewarded activities, and that we might improve overall participation and engagement by creating better challenges. In the first experiment we hypothesized that the uneven distribution of challenges over the semester might have rendered the course less engaging, due to long periods without interesting activities to perform. Although we do not have strong evidence, our data leads us to believe that by evening out the challenge distribution we can not only improve student engagement but also improve performance in other forms of continuous evaluation and, therefore, the final grade. This is an interesting topic for future research.
Success of the New Achievements
The new achievements had limited success. Those targeted at promoting cooperation were underused, and well-performing groups often blamed lower-performing ones for the XP they failed to earn. As compiling challenge results was too much trouble for only 100 XP, only one student undertook this task. The achievement for timely responses to challenges was highly criticized for promoting fast responses over meaningful posts. The oral participation badges were earned by 23 students, though we received a few critiques from students who felt pressured to talk in class and resented others being rewarded for doing so.
Ongoing Work
Currently we are analyzing data from a third experiment, and a fourth one is ongoing. In the third experiment we made a few modifications to improve the student experience, such as allowing students to post up to three times in the same challenge, to make up for lack of participation in other, less appealing challenges. This encouraged them to do more of what they like. We also deployed three new game components: the MCP Quest, the Skill Tree, and AvatarWorld.
The MCP Quest was an online riddle (in the style of Notpron) where students started from a webpage with multimedia content, which they had to edit and manipulate to find the URL leading to the next level of the riddle. The amount of XP awarded was proportional to the quest level reached. To encourage every student to participate, each student was required to contribute at least once in order to be awarded the XP, and no student could post twice in a row. The main goal of the MCP Quest is to improve the game feeling and to encourage students to collaborate.
The Skill Tree consists of a precedence tree in which each node represents a thematic task that earns students XP on completion. There are 6 base nodes that are unlocked at the beginning of the course, and subsequent nodes are unlocked when two of their preceding nodes are completed. Students can earn the maximum amount of XP allocated to the Skill Tree through different paths. Each branch of the tree represents a theme, and students can either go all the way up to a top-level node or just complete base ones, according to their liking. The main objective of this component is to improve the students' sense of autonomy.
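The unlock rule just described (base nodes start open; any other node opens once two of its predecessors are done) can be sketched as follows. The tree contents and names below are invented for illustration:

```python
# Sketch of the Skill Tree unlock rule. The node names and prerequisite
# structure are hypothetical, not the course's actual tree.

PREREQS = {
    "image-basics": [], "audio-basics": [], "video-basics": [],
    "capture": [], "editing": [], "formats": [],          # 6 base nodes
    "compositing": ["image-basics", "editing", "capture"],
    "mastering": ["audio-basics", "editing", "formats"],
}

def unlocked(node, completed):
    """Base nodes are always unlocked; other nodes need at least two of
    their preceding nodes to be in the `completed` set."""
    prereqs = PREREQS[node]
    if not prereqs:
        return True
    return sum(p in completed for p in prereqs) >= 2

done = {"image-basics", "editing"}
print(unlocked("compositing", done))  # two of its prerequisites are met
print(unlocked("mastering", done))    # only one prerequisite is met
```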
AvatarWorld consists of a pixel-art 2.5D virtual world that evolves and grows with the students. The world starts as a small village but, as students are awarded XP, it expands and new characters, buildings and areas emerge. Students are represented by an avatar that can be used to explore the world. Avatar equipment can be customized with clothing and handheld objects, which students unlock by earning certain course badges. Students can also create custom content for the game, like buildings and equipment, using tools and techniques taught in class. Submissions are made via posts on Moodle and are then graded by faculty based on their creativity and technical correctness; those scoring 50% or above are introduced into the world. The main goal of this element is to improve the game feeling and to allow students to be more creative by making custom content.
We are also performing cluster analysis on student data to identify behavior patterns and, consequently, different types of students. So far we have identified three types of students, common to both gamified experiments, each with different strategies and approaches towards gamified learning. However, a fourth cluster seems to have emerged in the second gamified year…
Do you want to know more about this? Stay tuned, we’ll have fresh results soon.