
Beat the Cheat: Stop Gaming the Gamification


Michael Wu, Ph.D. is Lithium's Principal Scientist of Analytics, digging into the complex dynamics of social interaction and group behavior in online communities and social networks.


Michael was voted a 2010 Influential Leader by CRM Magazine for his work on predictive social analytics and its application to Social CRM. He's a regular blogger on the Lithosphere's Building Community blog and previously wrote in the Analytic Science blog. You can follow him on Twitter or Google+.



The recent business success of social games has put the spotlight firmly on this subject. The hype often blinds people to the fact that gamification is, in reality, very hard to get right. Behind every successful gamification effort, there are probably hundreds that fail. In fact, if we view our lives in the context of a big game, then school and work are great examples of failed gamification that have produced many bored students and dispassionate employees. Today, I’d like to talk about an unintended side effect of gamification that could undermine its success.


Justice for All is Essential

Commercial applications of gamification often use material rewards or cash prizes as motivators to drive certain consumer actions. Because these motivators are tangible, visible, and often highly desirable, people tend to fixate on them. One side effect of this fixation is that people may start gaming the gamification system. Two examples come to mind:

  1. On Foursquare, I know quite a few people who will check in to any shop or restaurant they happen to walk by without actually visiting it. They are checking in on Foursquare for the points and badges rather than the serendipity of finding friends in the vicinity.
  2. Some online communities reward their members with points when they post content, and some members will game the system by posting random junk that is useless to the community. Again, these members are posting for the sake of points rather than contributing value to the community.


According to Prof. Byron Reeves, author of Total Engagement and renowned game researcher at Stanford University, “competition under explicitly enforced rules” is a critical ingredient of a good game. When “cheaters” game the system, they break these rules and make the game less enjoyable for everyone else, because the game is no longer fair. It is important to realize, then, that it only takes a small fraction of cheats to ruin the experience for the majority of players. When that happens, people will eventually stop participating, and the whole gamification strategy fails. Not only is this a waste of money and time, it can create a detrimental backlash that may be very difficult to repair or overcome.


Defeat the Cheats

So how can we prevent cheaters from gaming the system? The answer to this question is both good news and bad news. The bad news is that you can’t stop cheating. No system, regardless of its sophistication, is completely immune to being gamed. For example, despite all of Google’s efforts to design a fair PageRank algorithm that is resistant to gaming (e.g. link farming, link baiting, etc.), it can still be gamed. The gaming is just disguised under a different name (i.e. search engine optimization, SEO).


We can certainly follow Google’s approach and continue to tackle this problem like engineers, by making our algorithms harder to game. This works, but ultimately you will reach a point of diminishing returns.


Since cheating is a human problem, we can also tackle it like psychologists or economists. That is the good news! It turns out that we don’t need to make the gamification system completely bulletproof; we just need to make it hard enough to game. So, how hard is enough? Strictly from an economics perspective, we only need to make the system robust enough that the effort required to game it is greater than the perceived value people can gain from gaming it. Naturally, most people won’t bother spending the time and effort to game the system, because the reward they get is far outweighed by their effort. In theory they could still game the system; most people simply won’t.


This is actually harder to achieve than it sounds, but there are two ways to do this:

  1. Decrease the perceived value of the reward
  2. Increase the effort required to game the system

Lowering the perceived value of the rewards is the easier of the two, but you can’t make the perceived value too low; otherwise, people would not be motivated enough to do the gamified task in the first place. The real challenge is finding the right balance between these two levers.
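Purely as an illustration (the functions and numbers below are my own assumptions, not anything from a real gamification platform), the economic argument can be sketched as a pair of decision rules: a cheater acts only while the perceived reward exceeds the cheating effort, and an honest player acts only while the reward exceeds the legitimate task’s effort. The balancing act is keeping the task effort below the perceived reward, which in turn must sit below the cheating effort:

```python
def will_cheat(perceived_reward: float, cheat_effort: float) -> bool:
    """A rational player games the system only when the perceived
    value of the reward exceeds the effort required to cheat."""
    return perceived_reward > cheat_effort

def will_play(perceived_reward: float, task_effort: float) -> bool:
    """An honest player participates only when the reward is still
    worth the effort of the real, gamified task."""
    return perceived_reward > task_effort

# Baseline: cheating is cheap relative to the reward, so it pays to cheat.
assert will_cheat(perceived_reward=10, cheat_effort=3)

# Lever 1: decrease the perceived value of the reward...
assert not will_cheat(perceived_reward=2, cheat_effort=3)
# ...but lower it too far and honest players stop playing too.
assert not will_play(perceived_reward=2, task_effort=4)

# Lever 2: increase the effort required to game the system, while the
# honest task stays cheap: task_effort < perceived_reward < cheat_effort.
assert not will_cheat(perceived_reward=10, cheat_effort=15)
assert will_play(perceived_reward=10, task_effort=4)
```

The toy model makes the trade-off explicit: lever 1 risks shrinking the reward below the honest task’s effort, while lever 2 leaves honest players untouched but is harder to engineer.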



Since I promised to try to keep my posts shorter, I will stop now, and we’ll talk about how to control these two levers next time. For now, let me summarize what we’ve learned:

  1. Effective gamification must be fair and relatively immune to gaming (i.e. cheating), but gaming of commercial gamification seems inevitable, especially when the rewards are big
  2. It is nearly impossible to make a gamification system completely immune to gaming, but we don’t need a bulletproof gamification scheme to stop the cheaters
  3. We only need to make the gamification system hard enough to cheat that the reward is not worth the cheaters’ efforts
  4. There are two ways to do this. We can either:
        (a) decrease the perceived value of the reward
        (b) or increase the effort required to game the system

Next time, I will show you practical things that you can do to lower the perceived value of reward and metrics that you can use to make cheating harder.


Finally, I will be on the road again for Lithium’s “Likes to Love” World Tour in Amsterdam and London in early October. And prior to that, I will be taking a one-week vacation around the Netherlands during the last week of September. So my post frequency may drop a bit while I am traveling, but I will still respond if you comment.  🙂  If you are in the Netherlands or the UK, drop me a line; maybe we can meet up for a bike ride. In the meantime, I welcome any discussion. Stay tuned and see you next time.



About the Author
Dr. Michael Wu was the Chief Scientist at Lithium Technologies from 2008 until 2018, where he applied data-driven methodologies to investigate and understand the social web. Michael developed many predictive social analytics with actionable insights. His R&D work won him recognition as a 2010 Influential Leader by CRM Magazine. His insights are made accessible through “The Science of Social” and “The Science of Social 2”, two easy-reading e-books for business audiences. Prior to industry, Michael received his Ph.D. from UC Berkeley’s Biophysics program, where he also received a triple-major undergraduate degree in Applied Math, Physics, and Molecular & Cell Biology.

This article was worth it just for the reference to SEO as a cheating tactic!!

David Oh

While certainly I agree with increasing the effort to game the system, nobody wants to decrease the perceived value of the reward. In fact, you'd only want to increase this. The perceived value of the reward is what makes the game fun and involving. For example, your example of "people posting useless content" could be solved not by making the rewards less, but by tweaking the game rules to encourage useful content, for example, by scaling up the rewards earned if the content proves to be useful. Quora is a great example of this, and Facebook's EdgeRank is another.


Hi Michael

Hope you are enjoying the sunny Netherlands.

I wonder whether the right way to look at cheating in games is as a mixed population of cheaters and fair players in a game-theory Nash equilibrium.

If you take a leaf out of evolutionary game theory, you can see cheaters as hawks and fair players as doves. Other than pure strategies of either everyone cheating (unlikely) or everyone playing fair (equally unlikely), there is probably an evolutionarily stable mixed strategy (Nash equilibrium) with a small number of cheaters (who get more benefit, but have to put in more energy to get it) and a much larger number of fair players.

As long as the benefits to the much larger number of fair players are not reduced significantly by the small number of cheats, there isn't much of a real problem, only a psychological problem of wanting to punish cheats. You must be careful that the two strategies you propose don't end up hurting the fair players in your rush to punish cheaters.

Maybe gamification could benefit from a bit more game theory.
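As a rough numeric sketch of that mixed equilibrium, using the standard hawk-dove payoff matrix (the values of V and C below are illustrative assumptions, not measurements from any real community):

```python
# Hawk-dove sketch: V = value of the contested reward,
# C = cost of a cheater-vs-cheater clash, with C > V.
V, C = 4.0, 10.0

def payoff_cheater(p: float) -> float:
    """Expected payoff to a cheater ("hawk") when a fraction p
    of the population cheats."""
    return p * (V - C) / 2 + (1 - p) * V

def payoff_fair(p: float) -> float:
    """Expected payoff to a fair player ("dove")."""
    return (1 - p) * V / 2

# Evolutionarily stable mix: p* = V / C cheaters. At that fraction
# both strategies earn the same payoff, so neither can invade.
p_star = V / C  # 0.4 here: a stable minority of cheaters
assert abs(payoff_cheater(p_star) - payoff_fair(p_star)) < 1e-12
```

With these payoffs, a small population of cheaters can invade an all-fair population (cheating pays when rare) but fairness invades an all-cheater population (clashes are costly), so the stable state is exactly the mixed population described above.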

Graham Hill
Customer-centric Innovator

Lithium Alumni (Retired)

Hello Bruce,


Thank you for the comment. I'm glad you found this article worthwhile.

Stay tuned for the next piece, with more practical and detailed descriptions of tactics that you can use.





Lithium Alumni (Retired)

Hello David,


Thank you for the comment.


I totally agree with you. The point you brought up (i.e. "nobody wants to decrease the perceived value of the reward") is precisely why there is so much cheating going on. Most people are only interested in getting people to engage, act, and play; they are not thinking about cheat prevention.


So it really depends on what you want. If you want to reduce cheating, then there are two things you can do:

    (a) decrease the perceived value of the reward
    (b) or increase the effort required to game the system

If you don't care about cheating and think you can deal with the cheaters, then obviously you won't care about decreasing the perceived value. And most people are not interested in beating the cheats. As a result, like you said, people don't think about reducing the perceived value of the reward system. But of course, the side effect is that many people end up gaming the system and cheating.


The examples you gave are great examples of making the system harder to game. In fact, that is an approach I will talk about next time. But it is more sophisticated than what you described, because what you described is still gameable.


But there are situations where reducing the perceived value would work. And as I've said, it is a balancing act: you CANNOT lower the perceived value too much, otherwise people will not be interested in performing the gamified action you want them to do in the first place. Yet there are some practical tactics and examples of how this can work well, and I will show you some of them next time.


Thanks again for the comment. Hope to see you next time.


Lithium Alumni (Retired)

Hello Graham,


Very nice to see you here, and thank you for taking the time to comment on my post amidst your busy schedule.


Yes, totally... gamification can definitely benefit from game theory. That is why this branch of mathematics was created in the first place (i.e. to analyze the effectiveness of strategies in games). The trouble is that most people don't understand what "game theory" really is, and confuse game theory with terms like game mechanics and game dynamics, etc. (see Gamification from a Company of Pro Gamers).


What you've described is precisely what I'm trying to convey in terms of game theory. You want to make the cheater put in more effort (or as you said, energy) in order to cheat. And the optimal balance is where there most people do not feel the reward is worth the effort and therefore don't cheat. And only a few is willing to spend the extra energy to cheat and gain the rewards.


In most gamification applications, the actual benefit to the fair players is not reduced. The cheaters simply get further ahead, so the fair players only perceive a relative reduction in benefit. In gamification, it is NOT advisable to use punitive measures; rather, we advise withholding rewards instead. We simply want to make sure the cheaters don't feel they can gain much from cheating, rather than punishing them. So things should work out.


Gamification is a sophisticated subject. To do it well requires both careful analysis of the behavioral economics of the incentive system and an understanding of the psychology of all the players.


Thanks again for commenting.

I'm flying to Amsterdam tomorrow. See you there.


Joost Kokke

Hi Michael,


I saw your presentation in Amsterdam and I loved it. Here's a tip: there is a book (maybe you know of it), Histories of Social Media by Jonathan Salem Baskin. Maybe it is a nice addition to your presentation.

Here is also a link to an interview about the book with Mitch Joel of the Six Pixels of Separation podcast.


Greetings Joost Kokke


Lithium Alumni (Retired)

Hello Joost,


Thank you for the nice comment. I'm glad you liked my presentation at our Likes to Love Tour in Amsterdam.


FYI: if you'd like to get the deck for my part of the presentation, a PDF (obviously without all the animations) is available for download at SlideShare.


Yeah, I know some of Jonathan Salem Baskin's work; he is an author I have much respect for. The challenge at the Likes to Love tour is that I only get 30 minutes, and each of the topics I skim over so quickly (i.e. influence, gamification, cyber anthropology, social CRM, social analytics, the science of social, etc.) is really a one-hour talk on its own. I normally talk about each of those topics separately, in much greater depth than the time slot allows.


And I learned of Mitch Joel's Six Pixels of Separation when I was teaching a Social CRM course at the Rotman Centre of CRM Excellence at the University of Toronto. But I have not read it yet. Maybe that will be my next bedtime reading.


Thanks for the rec, and thanks for the comment. Hope to see you next time (virtually on the Lithosphere, or in person again in Amsterdam).