One of the things I have always wondered is why software doesn’t do playtesting the way gaming does. Having worked in both fields, I’ve found that user experience research in games and software is quite different:
In software, testing usually happens in production, either during official or beta releases. These tests are highly data-driven, large in scale, and often invisible to the participants. Take A/B testing, for example: a method that compares two (or more) design variations rolled out to hundreds of thousands of live users, measuring key outcome metrics such as conversion and click-through rates to surface any statistically significant differences. Afterward, a product manager analyzes the data and turns it into insights, assumptions about potential behavior changes, and actionable decisions and tasks.

While these tests are great at revealing user preferences, they often fail to provide the larger design context, they require a sizable audience to begin with, and they can stifle innovative ideas, because user preferences can simply be wrong. Take another form of testing common in software: user testing. Had we run one on smartphone users back in 2007, asking whether they preferred BlackBerry’s physical keyboard or the newly created iOS on-screen keyboard, odds are the physical keyboard would have won by a landslide, given its familiarity and track record. But had those early tests dictated the product direction, the modern software keyboard might never have taken off.
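To make the mechanics concrete, here is a minimal sketch of how a two-variant test might be scored for statistical significance. The conversion counts and the two-proportion z-test shown are illustrative assumptions, not a description of any particular product’s analytics pipeline:

```python
# A minimal sketch of evaluating an A/B test: a two-proportion z-test on
# conversion counts. The sample numbers below are made up for illustration.
from math import sqrt
from statistics import NormalDist

def ab_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Return the conversion rates of A and B plus a two-sided p-value."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference).
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return rate_a, rate_b, p_value

# Hypothetical rollout: 100,000 live users per variant.
rate_a, rate_b, p = ab_test(4_120, 100_000, 4_350, 100_000)
print(f"A: {rate_a:.2%}  B: {rate_b:.2%}  p-value: {p:.4f}")
```

A small p-value tells you variant B “won,” but notice how little it tells you about *why*, which is exactly the gap playtesting fills.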
Which is why I believe that playtesting—the favorite methodology of game designers—should be adopted in software development as well. The goal of playtesting is to provide designers with an empirical understanding of their product. Playtesting begins as soon as you start building and continues throughout the entire development and design process.
These tests are conducted on a very small scale, often involving colleagues, friends, and family members. The rules are straightforward:
1. Think Aloud: Testers need to verbalize their thoughts while using the product.
2. Minimal Guidance: You refrain from explaining or guiding them throughout the process, unless they become truly stuck.
3. Inquire About Decisions: Ask them what they were thinking and why they clicked one button instead of another.
4. Regular Sessions: Conduct playtesting sessions regularly and frequently—weekly, every other day, or even daily.
5. Open to Feedback: Listen to what testers say, but be willing to disagree with it.
6. Team Involvement: Ensure that you and everyone on your team are present to observe the testing.
That’s all there is to it!
If you’re immersed in data-driven design, it can be difficult to wrap your head around why playtesting is an effective tool without actually trying it. However, I can share why I believe it’s invaluable:
1. Identify Design Flaws: Watching a user struggle with your design makes the issues painfully obvious.
2. Discover New Ideas: You often uncover new ideas by observing how others interact with your product.
3. Enhance Team Engagement: Your team becomes more engaged with the product through the playtesting process. I can’t tell you how motivating it is to watch users respond to your work. And it lights people’s asses on fire when they realize the product has major flaws.
Simply put, playtesting tells you which designs need more work and motivates you to do it. The three points above are just what I’ve personally found valuable. If you want to learn more, I recommend this video by Game Maker’s Toolkit: