So on to part two. Apart from attending a workshop, I also presented a poster at CHI: Do Games Attract or Sustain Engagement in Citizen Science? A Study of Volunteer Motivations (see below). The poster is based on some work being carried out as part of the Citizen Cyberlab project that Charlene Jennett and Anna Cox are involved with. The paper reports on the findings of a set of pilot interviews that Cassandra Cornish-Trestail carried out with people who play citizen science games - in this case, Foldit and Eyewire. The answer to the question in the title is no: game mechanics didn't seem to attract volunteers but, alongside tools such as chat facilities and forums, they do help to sustain involvement over time. Essentially, the people who play these games are already interested in science; they aren't gamers. What game mechanics do allow for is greater participation in a range of social interactions, while also providing ways to recognise volunteer achievements as meaningful. I really quite enjoyed chatting about the poster and luckily there were quite a few interested people to chat to :-)
I got to meet Elaine Massung, a researcher from Bristol who was involved in the Close the Door project, where they were investigating motivations around crowdsourcing to support forms of environmental activism. Interestingly, their work suggests that game mechanics such as points can actually decrease motivation for some people. I also met Anne Bowser, a PhD student from the University of Maryland, who presented the PLACE (Prototyping Location, Activities and Collective Experience - see below) framework for designing location-based apps and games earlier on in the conference. I enjoyed hearing about Anne's work on floracaching (a form of geocaching) and how they developed the Biotracker app - a serious geocaching game for citizen science that encourages players to gather plant phenology data. I'm hoping to be able to use it at some point in the UK too!
Anne presented at the session on game design, where I also got to hear about Pejman Mirza-Babaei's work on biometric storyboards. Unfortunately, Pejman couldn't make the conference, but his supervisor Lennart Nacke was there to present the paper. I first became aware of Pejman's work during my PhD and it was really nice to see how far it had come. I'm not a big fan of biometrics - I didn't find the raw data I collected to be useful for identifying game-play breakdowns and breakthroughs within my case studies - but the tool that was presented during this talk was pretty cool. It allows designers to consider what they want the player experience to be (see below) and provides a neat visualisation of the GSR (galvanic skin response) and EMG (electromyography) data, which can then be compared with what was intended. The fact that Pejman also compared using this tool with a classic user testing approach (along with a control group) was great too, and the results indicated that the BioST approach did lead to higher game-play quality. However, I do have some questions about the work carried out, even after reading the paper. The main thing I'm not sure about is whether the BioST approach took more time than the standard games user research approach. This is important, as I know from visiting Playable games that there isn't always a lot of time to gather feedback and provide suggestions to designers. There weren't actually that many differences between the BioST and classic UT approaches - is the former really worth it if it takes a lot longer? I was also unsure about how the tool dealt with artefacts such as movement - does the researcher have to manually clean these up, and how long does this take? Finally, I noticed that the BioST tool allowed for player annotations, where it looks like players were asked to review a recording of the game-play session and add their comments, but I'm pretty sure the classic UT condition didn't also do this...
Considering this is what I asked my participants to do, and I got a lot of rich data from it, I wondered whether the conditions really were a fair comparison - could the player reviews have been helpful without the biometric data? Nevertheless, I do like that the tool presented does not consider biometric data alone, as I think it's important to give players a voice too. Also, I think the way in which the biometric data was visualised provided designers with a powerful tool for interpreting play experiences, so I'd be keen to see more research like this.
Later on I attended the Gamification@Work panel, which had a really interesting mix of people including Sebastian Deterding and a number of people from industry. I particularly liked Sebastian's emphasis on ensuring that autonomy isn't taken away from people when using gameful approaches at work. He also provided us with some quotes from games journalists which clearly indicated how, when you have to do something for work - even playing games - the activity can lose its appeal. I took a lot of notes in this session as it got me thinking about how I would design a game (or gamify a task), but I'm still mulling these over. The people from industry also had some insightful contributions to make, but I couldn't help coming away from the session a little concerned about how game mechanics can be used to track performance and manipulate people into behaving in different ways. Why does this make me uncomfortable in relation to work, but less so in relation to education or promoting health? Some interesting questions were also raised at the end, and while measurements may be important for showing improvement (or lack of it), it's important to remember that not everything can be reduced to metrics.
Other highlights from the latter part of the conference include the student game competition - the quality of the games was seriously impressive and I'd really quite like to check out a few of them, including Machineers (a lovely looking puzzle adventure for children that stealthily teaches logical thinking, problem solving and procedural literacy), ATUM (an innovative multi-layer point and click game) and Squidge (a really cute game controller that monitors player heart rate - see below); the Women's Representations in Technology panel - again a seriously interesting mix of people and perspectives, which got me thinking about feminism and how gender isn't necessarily binary; Razvan Rughinis' paper on badges in education, where he discussed badge architectures and how they can be used to chart learning routes; and finally Bruno Latour's keynote - I have to be honest and say I did not find this the easiest talk to follow, but I'm sure it got my brain working! There are definitely other people who have a better handle on it than I do (e.g. J. Nathan Matias).
It was a huge conference and, in addition to the other talks I haven't mentioned, there are also a few sessions I didn't get to go to, so I've added papers on persuasive games and behaviour change to my reading list. In general though, the conference gave me lots to think about, especially in terms of how I want my own research to continue and how games might relate to my work on CHI+MED, which there may be more to say about later on...