
Problem Scoping
Developers patch glitches. Players mourn them.
Picture this: you discover a funny exploit in your favorite game that lets you skip a tedious section. You use it for a week — then a patch arrives and it's gone. The developer removed a bug. You lost a feature.
This tension is more common than it sounds. Research from my prior study, The Impact of Glitches in Games, showed that many players find genuine enjoyment in glitches — using them to customize difficulty, speed-run content, or simply have fun. Yet developers almost universally treat glitches as problems to be eliminated, without evaluating whether a given glitch is actually causing harm.
Research
Understanding who seeks out glitches — and why
Before designing anything, I needed to understand player motivations from the ground up. I combined three methods: semi-structured interviews with players, secondary research (academic literature on game design and player psychology), and a social media dive into speedrunning and glitch-hunting communities on Reddit and YouTube.
My research questions stayed focused on behavior rather than opinion — I wasn't asking "do you like glitches?" I was asking what players actually do when they find one, and what drives that behavior.

Finding Commonalities Between Glitch Exploitation and Emergent Gameplay
While doing thematic analysis, I found that the motivations for emergent gameplay and exploiting glitches overlap.
Glitch exploitation and emergent gameplay share the same motivations
Players exploit glitches for the same reasons they invent their own rules: customization, skipping tedious sections, or just for fun. These aren't fringe behaviors; they're natural extensions of play.
Communities turn individual discoveries into shared knowledge
Speedrunners and glitch hunters document and share findings publicly. Each discovery becomes a foundation for the next, creating a collaborative, almost scientific process built entirely by players.
Player "type" strongly predicts glitch behavior
Using Bartle's four player archetypes, I found that Explorers and Achievers (players focused on mastering game systems) are far more likely to seek out and exploit glitches.
Player types overlap and combine
No player fits neatly into one category. People have primary types but shift depending on context. Any tool for developers would need to reflect this nuance to be genuinely useful.
Co-design
Letting players define what goes in the tool
I knew from research which kinds of glitch behavior existed — but not which specific glitches players would find most interesting or useful to explore. Rather than guessing, I ran a co-design session in Miro, where participants completed two activities.
1: Existing glitches
Participants described a glitch they'd encountered in a real game and explained how they felt about it: positive, negative, or somewhere in between.
2: "Ideal" glitches
Building on that foundation, participants then brainstormed what kinds of glitch-like features they'd actually want in a game: a generative leap from "what exists" to "what could exist."
Combining co-design output with secondary research, I identified categories of glitches to include in the final prototype — grounded in real player preferences, not assumptions.
Key Takeaway
Co-design is especially powerful when you need to understand not just current behavior but latent desires. Moving from "a glitch I've seen" to "a glitch I'd want" is a classic divergent prompt. It gives participants permission to imagine beyond what already exists.
Design Challenges
The format pivot that saved the project
The original plan was to build a simple digital game with intentional glitch-like features — a "show, don't tell" demonstration. Two problems killed that idea quickly: deliberately engineering a glitch that behaves in a controlled, non-crash-inducing way is genuinely hard, and building a working game from scratch wasn't feasible in four months.
The next instinct was to write developer guidelines. A professor with game development experience shut that down fast: "Why would you move away from a game to guidelines? That's boring." And they were right. Guidelines tell developers what to do. They don't change how developers think.
Scope Shift
That reframing opened the door to a physical card deck: a format that's low-friction to prototype, inherently playful, and well-suited to generative ideation sessions. It lets developers explore glitch potential on their own terms without being told what conclusions to reach.
Changing someone's mental model is a fundamentally different goal than informing them or teaching them a skill. Tools designed to shift perspective need to be exploratory by nature, not prescriptive. When you catch yourself writing rules, ask whether what you actually need to do is facilitate discovery.
The Solution
GameBreaker: a card deck for exploring play from the inside out
GameBreaker is a physical card deck built around one core interaction: roleplaying as different player types while constructing gameplay scenarios that involve glitch-like mechanics. The tool works in two modes — exploring how existing discovered glitches might be reframed as features, or generating entirely new game ideas from scratch before a line of code is written.
The deck has two card types. First, Player Cards, which correspond to Bartle's four player archetypes — Explorer, Socializer, Achiever, and Killer. Users pick one (or combine several) and adopt that player's perspective for the session, essentially roleplaying that player like a character in D&D. Then, Game Component Cards, which include Game Mechanics, Glitch Types, Game Genre, and Game Rules. These can be chosen intentionally if a specific scenario is being explored, or drawn at random for open-ended ideation.

Explorer
Continuously seeks hidden knowledge and in-game secrets. Most likely to find and exploit glitches through systematic experimentation.

Achiever
Motivated by goals and completion. Will use glitches strategically to unlock achievements or optimize performance.

Killer
Seeks competitive dominance. Will exploit glitches to gain advantage over others — making their perspective essential for multiplayer balance testing.

Socializer
Cares more about other players than game mechanics. May share discovered glitches as social currency or use them for group laughs.
Every card includes a representational image and guiding prompts — questions that push users to think deeply about how a specific player type would experience a particular mechanic or glitch. This is where the roleplay element comes in: developers aren't just thinking about their game abstractly, they're inhabiting a player's mindset and generating scenarios from that perspective.
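For readers who think in code, the draw mechanic described above can be sketched in a few lines of Python. This is a minimal illustration, not the actual deck: the card names under each component category are hypothetical placeholders I've invented for the example, while the four player archetypes and four component categories come from the deck itself.

```python
import random

# Player Cards: Bartle's four archetypes, as used in GameBreaker.
PLAYER_CARDS = ["Explorer", "Socializer", "Achiever", "Killer"]

# Game Component Cards, one list per category. The individual card
# contents here are illustrative placeholders, not the real deck.
COMPONENT_CARDS = {
    "Game Mechanic": ["crafting", "fast travel", "stealth"],
    "Glitch Type": ["clipping", "item duplication", "sequence break"],
    "Game Genre": ["platformer", "RPG", "shooter"],
    "Game Rule": ["permadeath", "timed run", "no-damage challenge"],
}

def draw_scenario(rng=random):
    """Open-ended ideation mode: draw one player perspective to
    roleplay, plus one card from each component category."""
    player = rng.choice(PLAYER_CARDS)
    components = {category: rng.choice(cards)
                  for category, cards in COMPONENT_CARDS.items()}
    return player, components

player, components = draw_scenario()
print(f"Roleplay as: {player}")
for category, card in components.items():
    print(f"  {category}: {card}")
```

The intentional mode simply replaces the random choices with hand-picked cards; the structure — one perspective plus one card per category — stays the same.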
Testing
Testing with game developers, not just gamers
I evaluated GameBreaker with game development students: the actual target audience. Each session involved walking through the tool, explaining how it works, and having participants build an example scenario using a player type card. The goal was to understand whether the tool was understandable and whether it actually facilitated the kind of exploratory thinking it was designed to spark.
Built-in instructions made onboarding smooth
Participants found the included instruction cards sufficient to get started without additional explanation, a key signal that the tool can work independently of the designer.
The tool genuinely broke creative blocks
Several participants noted that drawing random component cards forced them to consider combinations they never would have explored intentionally, which is exactly the point. (And the inspiration for the name.)
The "Developer" persona was removed
The original deck included a Developer card alongside the player types. A participant pointed out the obvious: why would developers roleplay as themselves? Removing it was an easy call, but one that only surfaced through testing, not solo review.
Blank cards were requested
Participants wanted a way to add custom player types or mechanics not covered by the existing cards. This is a common signal in card game design; it means the core structure is sound but the content feels limiting.
Example use cases in the instructions helped significantly
Abstract instructions are harder to act on than concrete examples. Adding a sample scenario walkthrough to the instruction card gave participants a mental model for how to use the tool before they had to use it themselves.
Reflection
What this project taught me
Designing to change a mental model is different from designing to solve a problem
Most UX work is about removing friction or closing a gap. This project was about shifting how a group of people fundamentally categorize something. That required a completely different design approach: instead of building a better workflow, I built a space for reframing. Understanding the difference between these goals is what unlocked the right solution.
Knowing when to give up control
My first instinct was to tell developers how to handle glitches. I had research to back it up. But that would have been telling, not enabling. Glitches are too varied and context-dependent for a one-size-fits-all approach. The best I could do was create conditions where developers could reach their own conclusions. That's harder to design, and more valuable when it works.
Sample size shapes what claims you can make
One question from the final presentation stuck with me: "Why did you use Bartle's player types instead of creating your own?" The answer was sample size. I didn't have enough interview data to develop new typologies from scratch, so I used existing validated research as a foundation. That's a legitimate methodological choice, but it's one worth naming explicitly rather than glossing over.
Next Steps
The highest-priority next step would be testing with game developers at studios of different sizes, from indie solo developers to larger teams, to see whether the prompt content on the cards holds up across very different contexts. Beyond that: a digital version of the deck (building card interactions natively is a compelling UX challenge in itself), and enough data to develop original player type research rather than leaning on Bartle's 1996 framework.