York College CIHP to Present at CUNY IT Conference

Members of the York College Center for Interdisciplinary Health Practice will present at the 11th annual CUNY IT Conference on November 29th and 30th. The talk will describe recent work to create simulations for students in the School of Health and Behavioral Sciences. The project used Second Life to build scenarios for students to role-play. Digital technology allows students to simulate activities that would otherwise be expensive or dangerous to perform in real life. Students could play the role of doctor, patient, caregiver, occupational therapist, and more. The simulations could also be viewed passively to spark discussion in the classroom. The vignettes were designed to be useful to students in a variety of disciplines, including but not limited to nursing, physician assistant studies, occupational therapy, social work, and psychology.

CUNY Games Network to Present at CUNY IT Conference

Members of the CUNY Games Network will also present at the CUNY IT Conference on November 29th and 30th. The advisory board will present "Gaming Across the Curriculum," an introduction to game-based learning with examples of games designed and tested by CUNY faculty. The presentation will include "What's Your Game Plan?," a game created by faculty member Joe Bisz to help educators design games for any classroom. The hands-on workshop will give novices and experts alike an opportunity to explore the fundamentals of game-based learning.

A second talk will review College Quest, a new learning management system at BMCC. The project, led by Joe Bisz and Francesco Crocco in conjunction with Neuronic Games, incorporates game mechanics into a college-wide LMS: students design avatars, earn points and badges, work through levels, collaborate in a social network, receive push notifications for deadlines, and much more.
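The post does not describe how College Quest is implemented, but the short Python sketch below shows one way such mechanics might be modeled. The class, field names, and level-up threshold are hypothetical and are not taken from College Quest or Neuronic Games.

from dataclasses import dataclass, field

# Hypothetical sketch of a gamified student profile. The field names and the
# level-up rule are illustrative only, not the actual College Quest schema.
@dataclass
class StudentProfile:
    avatar: str
    points: int = 0
    level: int = 1
    badges: set = field(default_factory=set)

    def award(self, points, badge=None):
        """Add points (e.g., for an on-time submission) and, optionally, a badge."""
        self.points += points
        if badge:
            self.badges.add(badge)
        # Level up every 100 points (an arbitrary threshold for illustration).
        self.level = 1 + self.points // 100

# Example: a student finishes two assignments and earns a badge for the second.
student = StudentProfile(avatar="knight")
student.award(40)
student.award(75, badge="On-Time Submitter")
print(student.points, student.level, sorted(student.badges))  # 115 2 ['On-Time Submitter']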

Teen Angst, Epilogue

Figure 1

This is the last in a series of posts about rapid prototyping for game development with high school students. I will use one of our games, Teen Angst, as a case study for what to do if things don't go according to plan.

Teen Angst had the broadest scope of any of our games. The plan was to use game mechanics to shape decision making about three topics that most interested teenagers: relationships, substance abuse, and nutrition. Players answered a series of questions based on scenarios that were presented via PowerPoint (Figure 1). Points for four resources (Health, IQ, Friends, and Money) were gained or lost depending on the most likely consequences of the decision.
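In code, the core mechanic amounts to a lookup from each decision to a set of resource deltas. Here is a minimal sketch in Python; the four resource names come from the game, but the example scenario, options, and point values are invented for illustration.

# Sketch of the Teen Angst scoring loop. The resources are from the game; the
# scenario text, options, and point values below are made up for illustration.
RESOURCES = ("Health", "IQ", "Friends", "Money")

scenario = {
    "prompt": "A friend offers you a cigarette at a party.",
    "options": {
        "accept": {"Health": -2, "Friends": +1},
        "decline": {"Health": +1, "Friends": -1},
    },
}

def apply_choice(totals, deltas):
    """Add the consequence deltas for the chosen option to the running totals."""
    for resource, delta in deltas.items():
        totals[resource] = totals.get(resource, 0) + delta
    return totals

totals = {r: 0 for r in RESOURCES}
totals = apply_choice(totals, scenario["options"]["decline"])
print(totals)  # {'Health': 1, 'IQ': 0, 'Friends': -1, 'Money': 0}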

There were several design flaws in the game. First, there was no means of controlling flow: task difficulty was not adjusted based on performance, a mismatch that typically results in either boredom or frustration for the player. Second, we adopted a linear narrative that limited choices to only a few options. While the psychology literature indicates that too many choices can paralyze a person with indecision, the game world is full of examples where presenting players with more choices increases their engagement with and enjoyment of the game. Having a choice means the player is in control of the outcome, and players who feel in control are more likely to engage with the system. As educators we should capitalize on this phenomenon and reconcile it with what we already know – ownership of the learning experience is critical to learning outcomes. Third, the game was unbalanced. In a perfectly balanced game, opponents have an equal opportunity to win; tic-tac-toe and rock-paper-scissors are classic examples (both are also zero-sum games, in which one player's gain is exactly the other's loss). The reward-punishment contingencies in a game, the player resources, or other factors that affect the final outcome can also be out of balance. In our game, the reward-punishment contingencies were not evenly distributed across resources. Even though the number of questions pertaining to each topic and resource was balanced, the point allocation for each topic-resource combination was not. For example, players had more opportunities to earn points for Health than for the other resources.
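One way to catch the third flaw before play-testing is to tally the points at stake for every topic-resource combination across the full question set. The Python sketch below does this; the example questions are invented placeholders, and in a well-balanced design the totals would come out roughly equal across cells.

from collections import defaultdict

# Each question is tagged with a topic and the resource deltas its options can
# produce. These questions are invented, not the real Teen Angst question set.
questions = [
    {"topic": "Relationships",   "deltas": {"Friends": 2, "Health": 1}},
    {"topic": "Substance abuse", "deltas": {"Health": 3, "Money": -1}},
    {"topic": "Nutrition",       "deltas": {"Health": 2, "IQ": 1}},
]

points_at_stake = defaultdict(int)
for q in questions:
    for resource, delta in q["deltas"].items():
        points_at_stake[(q["topic"], resource)] += abs(delta)

# A cell or column with far more points than the others (here, Health) flags
# the kind of imbalance described above.
for (topic, resource), pts in sorted(points_at_stake.items()):
    print(f"{topic:15s} {resource:8s} {pts}")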

We were aware of all of these design issues going into data collection, but we didn't realize how strongly they would affect the data. Six high school students participated in the experiment, but the data for one were removed because that student did not follow the instructions properly. Subjects played the game and recorded their responses on answer sheets. As I mentioned in a previous post, an error in creating the answer sheets made it difficult to match individual answers to their corresponding questions. However, we were still able to compare the points earned in the first half of the game to the points earned in the second half. The prediction was that players would earn more points in the second half because of practice effects.

Interestingly, performance exceeded chance levels during the first half of the game (Figure 3), which suggests that subjects were attentive and understood the rules. Nevertheless, contrary to our expectations, performance decreased during the second half of the game (Figure 2). Data were combined across all subjects and all categories (i.e., Relationships, Drugs, and Nutrition). For each resource (i.e., Health, IQ, Friends, and Money), performance during the second half of the game was worse than during the first half, χ²(1, N = 157) = 80.067, p < 0.0001. Similarly, performance during the second half was worse when data were combined across resources (Figure 3), χ²(1, N = 157) = 25.28, p < 0.0001. This decrease in performance might be attributed to fatigue, but it is more likely the result of an imperfect game design. The data were consistent with post-game interviews in which the players reported being bored.
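For readers who want to run the same kind of comparison, the test reported above can be computed from a 2 x 2 table of correct versus incorrect responses in each half of the game. Below is a short sketch using scipy; the counts are placeholders that sum to the reported N of 157, not the study's actual data.

from scipy.stats import chi2_contingency

# 2 x 2 contingency table: rows = game half, columns = correct / incorrect.
# These counts are placeholders for illustration, not the actual Teen Angst data.
table = [
    [60, 18],   # first half:  correct, incorrect
    [35, 44],   # second half: correct, incorrect
]

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square({dof}, N = {sum(map(sum, table))}) = {chi2:.2f}, p = {p:.4g}")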

After six weeks of hard work, a result like this could be devastating to a student. At worst, the student might doubt the scientific method and lose interest in science. It is critical to spend as much time with the student as possible to confirm that they understand the value of impartiality, learn from failure, and persist in their quest for truth. I found it useful to recount my personal experiences with failed experiments as well as examples from famous scientists. Shifting the focus to improving the game was also helpful. Most interesting, though, the student took some solace in knowing that her results were important because they provided evidence for the lab's overarching hypothesis, namely, that properly employed game mechanics are useful for education. In her case, an imperfect design produced a baseline against which future iterations of the game will be compared. We learned a lot from each other, and the student is sure to benefit from this experience in the future.

Figure 2

 

Figure 3

Learning by design