
Telemetry in the QA Process

Phillip Derosa

Global Director of QA at OneSpan


Problem

When people play games nowadays, we measure everything: the characters they select, how many times they've died, all of that stuff. This enables us to make better decisions as we conduct playtests -- we can home in on a level where players are dying more frequently, to name one example. Maybe the boss is too hard. There are a million possible problems.

I was at an industry show and saw some things that got me thinking a lot about telemetry in gaming. This was before telemetry was really established in our industry, and some people were unsure about it at first.

Actions taken

I ended up asking two of my QA folks to play through one of our games and keep track of various stats, such as how many times they died throughout a level. Instead of capturing events and dumping them out to a log, I asked them to count each tally by hand in an Excel sheet.

They played the whole game through and counted how many conversations and combat encounters they had, among other things. We put together a report that outlined all of this information and sent it to the Director of Design.

They were ecstatic; they described it as a window into the game that they had never had before. It was objective data they could rely on, and they asked for more.

It was not practical to have human beings with stopwatches documenting all of these things, and that was when introducing telemetry into the fold really started to interest me. Building the technology into our game gave us so much more to work with, and our developers were excited about what we were showing them.

The guy who got me into telemetry at the conference was actually a programmer himself; he ended up working with us as we implemented it into our process. Two or three of our own engineers took these insights back to their desks, and after a couple of days they had come up with a prototype for a new system.

The technology was cool, and it got everybody hungry for more. Within a few weeks, we had all of this extra data at our fingertips. This new source of information taught us so much about our own gameplay and gave us something concrete to refer back to later on.
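To make the idea concrete: at its core, a system like this just captures gameplay events and writes them somewhere you can analyze later, rather than having testers tally by hand. The sketch below is a minimal, hypothetical Python illustration of that kind of event capture; the class name, event names, and fields are assumptions for the example, not the actual prototype our engineers built.

# Minimal sketch of an in-game telemetry event logger: append one JSON line
# per gameplay event to a log file. All names and fields here are illustrative
# assumptions, not the system described in the article.
import json
import time

class GameTelemetry:
    def __init__(self, session_id, log_path="telemetry.log"):
        self.session_id = session_id
        self.log_path = log_path

    def record(self, event_name, **fields):
        """Append one gameplay event (e.g. a death or combat encounter) to the log."""
        event = {
            "session": self.session_id,
            "timestamp": time.time(),
            "event": event_name,
            **fields,
        }
        with open(self.log_path, "a") as f:
            f.write(json.dumps(event) + "\n")

# Example: the counts the QA testers tallied by hand become events the game emits.
telemetry = GameTelemetry(session_id="qa-playthrough-01")
telemetry.record("player_death", level="level_3", cause="boss")
telemetry.record("combat_encounter", level="level_3", enemies=4)
telemetry.record("conversation", npc="merchant")

Once events are flowing into a log like this, the same reports the QA testers built manually can be generated from the data, and they stay up to date with every playthrough.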

Lessons learned

  • Once I was able to show our Director of Design the value of data like this, there was no turning back. Some people rely on what their gut tells them when deciding whether or not they’re going in the right direction. It’s good to have this type of intuition, but nothing beats real data.
  • When workshopping toys based on kids' shows, R&D teams will do something similar. They have kids watch a show in a room full of toys, and when the kids lose interest in the show and start playing with the toys, they note the time so they can review that part of the show's content and adjust it. It's low-tech, but it's still first-hand feedback from a real user of the product.
  • Sometimes, you get a wall put in front of you. The goal for me is to make the best game possible, which is where I find the energy to continuously push forward. Nothing should be “just a QA thing”; everybody should feel some responsibility.
