Scott DeJong

Designing a Disinformation Game: Similar mechanics and approaches

What happens if we look at the wild world of disinformation as play? There are more similarities than you might think.


When we evoke the word game, people generally have an idea in mind. It might be a boardgame, a deck of cards, or a videogame we spend way too many hours immersed in. It's playful, fun, and leisurely. Yet we can also see games and play in the world around us. Think about your favourite social media platform. Do you get “points” or likes for what you post? It has rules, gives us a bit of a dopamine rush, and asks us to keep engaging. So am I so far off in wondering whether those structures work for something like disinformation?


[Image: a playbook]

Arguably, many people were played in the lead-up to the 2024 US election results. There was a playbook, a long game of building up alternative media, polarizing the public, and pushing particular people to react emotionally and build their identities around ideologies. Between a crumbling legacy media, an ever-growing dismissiveness toward traditional knowledge structures, and the institutionalization of particular ideologies, the rules of our game suddenly become eerily serious and dangerous.


Play does not need to be fun. Think about a bully and their victim, a predator toying with their prey - here play can be understood as something dark, dangerous, and deeply serious. We just like to forget that. It can be one-sided, where some are playing while others are not, and the gamespace of social media makes it so much easier for people to play with the public. Or at least play with how information gets to the public.


Consider this a thought experiment, not one meant to downplay the seriousness of false information, but one meant to recognize the ludic, or gamey, parts of disinformation as we understand it. If we are going to design games about disinformation, perhaps we should examine the similarities that already exist.


Gaming Literacy and Disinformation Literacy


In game studies we have this term “gaming literacy”, which refers to the skills and knowledge needed to play games themselves. Every game teaches us something. We learn how to move and interact, and if you get really good, you are able to push the boundaries of a game to its limits. Games have rules that define what we can and cannot do. We might bend them, challenge them, or outright make up our own, but games are generally understood as setting up a system and structure for play. It is through play that we understand how the game works and, in some cases, what it is telling us.


In many ways, disinformation also has a literacy. That might sound odd, given we define it as intentionally false content meant to cause harm, but in terms of reading and comprehension we have two players in this game.


On the one side we have the audience. That is probably you and me, the people reading the false content. We have to:

A) Be able to read it, and

B) Be able to interpret it as it relates (or does not relate) to our own lives

This is often where media or information literacy comes in. It tries to give us the skills to decode the message, knowledge to understand the mediums behind it, and criticality to investigate the claims made. 


On the other side, we have the makers and spreaders. In case you didn’t know, people are hired to make false content. Entire factories exist, and if you pay them enough, they will make and spread the content for you (please don’t do this). Here we have a different type of literacy. These actors understand how to use social media platforms to target users. They use data to determine the particularities of online communities and try to exploit those in their messaging. It isn’t always effective, but they are definitely trying.

Disinformation involves those who are playing and those who are played. It is active and targeted. There is a reason behind the content meant to drive you mad, encourage you to respond, make you feel sad, or see the world a certain way. Will you let them?

Of course, even some of these makers might be unaware of what they are doing. Technology and the internet, while great at connecting us to friends and family, are also really good at separating us from our actions and their consequences. It's why some theorize that people are more likely to yell or post horrible things online, and the reason we see entire workforces built up around tiny tasks that remove us from the larger project. For disinformation makers, research shows that many did not even see their work as disinformation production because it wasn’t aimed at them or their families. However, when that content did matter to their local contexts, many found themselves in a horrible ethical dilemma, with some even quitting their jobs.


Of course, we also have other disinformation actors. There are trolls: malicious actors looking to get a rise out of you. Take, for example, What We Do in the Shadows, where one character actively seeks to annoy people. We might also talk about imposters, bots, and conspiracy theorists. In short, everyone is vying to be heard, or at least to have you hear them.


Whether or not we want to participate, we are stuck as players in disinformation’s game. We might not be making content, but as users we are part of this game, which leaves us with the question: how might we respond? I leave that for you to think about (or for me to post about later); this is a design blog, after all. For now, the best I can do is better paint the similarities between disinformation and games. By doing so, I want to improve the media literacy tools out there. Of course, doing so will involve simplifying ideas and abstracting concepts, but for anyone interested in making a game that actually reflects disinformation, I encourage you to engage in this thought experiment with me.


Similar Goals:


Let’s start with the core parts of what makes up disinformation. Perhaps even in a definition we can see a link to games. Often it is described as intentionally false information meant to cause harm. We can debate that all day, but scholars Dean Freelon and Chris Wells give us three criteria: deception, potential for harm, and intent to harm.


How does this play out? This intentional, deceptive harm points to some clear goals:

  1. Disinformation works to build up a fiction.

    We are fed an idea that the world is collapsing, that so-and-so is a traitor, or that someone was justified in their actions. It's a narrative, a fiction that then gets reaffirmed again and again.

  2. Disinformation inherently manipulates information.

    To make that fiction, it twists the truth. It might not be 100% incorrect; in fact, the most believable disinformation carries kernels of truth, which make it easier to accept.

  3. Disinformation is used to gain power and/or control.

    We don’t lie for no reason. Deception has a particular goal. In war it is used to mislead opponents and help you win; on social media it can be used to build support for an idea. Ever heard of “controlling the narrative” during a disaster? Well, what if you just make the narrative yourself? Talk about power.

  4. Disinformation is intentional.

    Of course, behind all of that there is intent: a goal driving what spreaders and makers are doing. They think about whom to target, what to make, and which ideas to push. Is the goal to get people upset, spread an idea, or twist a narrative? No matter what, disinformation is driven by something.


So the question I am asking is: can we find these things in games? Short answer: yes. Long answer: let me break it down for you.


  1. Games require some form of fiction or suspension of reality.

    In game design, this is sometimes called the magic circle: the space created by the game. We see it when animals play-fight, where the lines of what is okay and what is not are established by the play itself. Any game asks us to suspend our reality just a little bit. From pushing social boundaries in Cards Against Humanity to pretending we are business moguls in the grind of Monopoly, the game asks us to adjust our perspective. This fiction is almost required to play. And while it can (and often does) bleed into our own lives, it is how games bring us to their topics and themes.

  2. Games constantly engage in manipulation.

    From moving pieces to lying, games ask us to manipulate something. Games are full of information, and what we do with it impacts everyone else. From deceiving friends in Among Us to taking an action that someone else needs to respond to, manipulation is at the core of our play. So, can we use these mechanics to reflect disinformation?

  3. Games are often built around power and control.

    Across so many games we level up, accrue points, or gain new abilities. It is inherent to almost every game: fundamental to their design is the ability to gain more power and more control. We even see this in gamification models that reward our behaviour with points.

  4. Games require intent.

    While they can be filled with randomness, most games provide some amount of agency and intent for their players. We choose an action to take, a card to play, or a button to click. We think about a strategy. Whether the goal is to win or just to look the cutest, there is intentionality to almost every game.


So far we have stayed pretty general, resting in the obvious, but all of that can shape how we think about our design. This is a starting point, and we can definitely get more specific.
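
To make that starting point a little more concrete, here is a minimal sketch of how the four shared goals could become the skeleton of a prototype. Everything in it (the Claim card, the plausibility and influence_gain values, the toy resolution rule) is my own illustrative assumption rather than an existing design; the point is simply to show fiction, manipulation, power, and intent sitting together in one structure.

```python
# Hypothetical sketch: the four shared goals as a data model for a
# disinformation-themed tabletop prototype. Field names are assumptions.
from dataclasses import dataclass


@dataclass
class Claim:
    """One piece of content a 'spreader' player could put into play."""
    fiction: str          # goal 1: the narrative the claim builds up
    kernel_of_truth: str  # goal 2: the real detail being twisted...
    spin: str             # ...and how it gets distorted (manipulation)
    target: str           # goal 4: intent, who the claim is aimed at
    plausibility: int     # how convincing the spin is (truth kernels help)
    influence_gain: int   # goal 3: power/control earned if it is believed


def resolve(claim: Claim, audience_scepticism: int) -> int:
    """Toy resolution rule: the claim only pays off if its spin is more
    convincing than the audience is sceptical."""
    return claim.influence_gain if claim.plausibility > audience_scepticism else 0


# Example turn: a spreader plays a claim against a moderately sceptical audience.
card = Claim(
    fiction="The city is collapsing",
    kernel_of_truth="One neighbourhood reported a rise in petty theft",
    spin="Crime is out of control everywhere",
    target="Residents already worried about safety",
    plausibility=4,
    influence_gain=3,
)
print(resolve(card, audience_scepticism=2))  # -> 3: the fiction takes hold
```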


How does Disinformation work?


If we are talking mechanics (the actions of a game), maybe our time is best spent thinking through how disinformation works and how that relates to games themselves. Disinformation is not random; it involves an array of tactics. Here are a few:


  1. Targeting communities or systems.

    Fun fact: disinformation does not impact us all evenly. Some communities are seen as more vulnerable to particular types of messaging. We saw this during COVID-19, for example, where anti-vaccine messaging contributed to greater vaccine hesitancy among Indigenous communities in Canada. Given a legacy of stereotypes and colonization, this hesitancy makes complete sense. More broadly, politicians do this all the time, targeting messages to particular bases even when those messages aren't true. These tactics are used to rally support or mobilize communities even if the promises aren't fulfilled.

  2. Using emotional messaging.

    We have seen an increase in emotional messaging, especially in headlines. Feelings are extremely powerful at motivating audiences. Even if we can’t fully understand a situation, we can have feelings about it. Emotions drive us. Unfortunately, disinformation campaigns can prey on these emotions, using tactics like fear and anger to sow dissent and polarize audiences. So many nations are apparently “crumbling”, and while there are serious concerns to be had, such language can easily strike fear and hopelessness into audiences. As we get overwhelmed with information, we can become apathetic, thinking that no matter what we do, nothing will happen. You can do something, but unfortunately disinformation is really good at making us think there is no point.

  3. Exploiting influencers to spread messages.

    Remember how I said disinformation isn’t equal? Well, part of that is because some people spread far more false information than others. Influencers or opinion leaders are really good at gathering an audience, which they then “sell”. When it comes to information, that makes them powerful forces for spreading falsehoods. Even smaller influencers matter; many become “information bridges” across larger communities by carrying ideas between them.


So where is the link to games?


  1. Games Target.

    Mechanically, we target things all the time. In a game like Call of Duty you target your enemies; in a card game we might try to target the player who is winning, or be very intentional with what we say. We might plan to move somewhere on the board, complete a certain quest, or mess with an opponent. We have a goal and we make it happen. The specificity of whom we target matters to our gameplay and is fundamental to so many strategies in games.

  2. Games use Emotion.

    Games are emotional experiences. They can make you laugh, cry, and get frustrated. We might throw a controller, sit in silence as the story brings us to tears, or laugh hysterically when our character glitches and does something ridiculous. Emotions are a core part of what brings us to games and keeps us playing. From the feelings we get when we make a particular play, to the satisfaction of solving a puzzle, games are deeply tied to our feelings.

  3. Games have Influencers.

    If you have ever played a role-playing or adventure game, you have seen the power of influencers at the core of driving a narrative. Non-Player Characters with exclamation marks tell us of a quest, and fantasy stories are filled with key leaders who hold influence and power over how the rest of the game sees us. Sometimes we are the influencers, using the knowledge our characters find to spread a message and “save the world”. Information spreads through key people, objects, or places, which can reflect these influencers and the “bridges” around them.
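
To make that “information bridge” idea a bit more tangible, here is a rough sketch of a spread mechanic one could prototype. The graph, the node names, and the spread rule are all hypothetical assumptions on my part, not a published model; the point is simply that removing a single bridge node keeps a message locked inside its original community.

```python
# Hypothetical sketch: a message spreading through a small follower graph
# where one "influencer" node bridges two otherwise separate communities.
from collections import deque

# Two small communities (a*, b*) joined by a single bridge account.
GRAPH = {
    "a1": ["a2", "a3"], "a2": ["a1", "influencer"], "a3": ["a1"],
    "influencer": ["a2", "b1"],              # the bridge between communities
    "b1": ["influencer", "b2"], "b2": ["b1"],
}


def spread(seed, blocked=frozenset()):
    """Breadth-first spread of a message from `seed`, skipping blocked accounts."""
    reached, queue = {seed}, deque([seed])
    while queue:
        node = queue.popleft()
        for neighbour in GRAPH[node]:
            if neighbour not in reached and neighbour not in blocked:
                reached.add(neighbour)
                queue.append(neighbour)
    return reached


print(sorted(spread("a1")))                          # the message crosses into community b
print(sorted(spread("a1", blocked={"influencer"})))  # without the bridge, it stays local
```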


Review

The list keeps going, and I plan to flesh it out into an article (so stay tuned). I am doing this not to say that disinformation is a game, or that disinformation is play, but that disinformation has similarities to games and play. Why? Because by knowing how games and disinformation are related, we can design games that actually show the actions of disinformation. Disinformation is more than a story; it is a set of tactics and content that relates to events and moments in our lives to build a worldview. It isn’t just a fake news article, but the deliberate spreading of ideas through an array of content, from memes to videos, satire, and podcasts. Thinking this way, maybe we can make games that actually reflect disinformation’s processes, so that our audiences come to understand how it works.


Doing so can help us make learning experiences that encourage audiences to peel back these layers and reflect on the fact that disinformation is deeply involved in systems, tactics, and approaches that drive how we come to believe and understand the world. It isn’t someone randomly posting a lie; it is created over long-term trajectories meant to influence us in ways we do not always see. So stay critical, stay sharp, and don’t be played.
