Case Study: Educational Gaming


My duties at this early-stage educational gaming startup included leading the design of new games, improving the user experience of both the proctoring platform and the students’ gaming experience (on web, Android, and iOS), and overseeing all of the company’s promotional efforts.

I quickly realized the platform’s user experience presented a number of challenges for both students and teachers.

Step One: Identifying the Problem

The founders of the company were also the developers and designers of the software, and, brilliant as they were as scientists and economists, they were somewhat blind to the fact that their platform was not as intuitive to use as one would hope.

It was clear to me that a presentation about basic heuristics was not going to change their minds. (Though this only became clear after I gave a presentation about heuristics.)

Step Two: Prove the Problem Exists

With the support of both marketing and design, we convinced our CEO to come to Caltech, where we corralled an economics professor who was unfamiliar with our platform and had him go through our standard onboarding process. After repeatedly preventing the CEO from giving helpful advice during the experience, we were finally able to open his eyes to the usability issues that our platform presented.

At the end of the day, it WAS simple to understand how to proctor games… as long as someone was there to show you how to do it. If we hoped to become successful, however, we needed to make this process scalable and easy to understand for both students and instructors without the presence of a brilliant economist.

Step Three: Gather Data

Our company was in a period of rapid growth, so we were tasked with growing our customer base, radically increasing the number of games we provided to instructors, and expanding the functionality of our existing platform at the same time that I was attempting to improve it.
Consequently, all improvements to existing functionality had to occur piecemeal over many months, and since we had a limited staff, the founders needed assurance that any given improvement was absolutely necessary, as addressing those issues would take time away from further expansion.

We gathered data to identify pain points in a number of ways:

  • Surveys sent to our existing users (over 2,500)
  • Student focus groups organized and filmed at local universities
  • User testing conducted separately with students and instructors
  • Extensive classroom observation of our platform in use
  • Input from our world-class economics advisors, including Nobel laureate Al Roth
  • Platform analytics

Step Four: Use the Data to Inspire Change

We gathered a massive amount of constructive data.
This information was distilled into long- and short-term action items to improve our platform for proctoring games, the console that the students used, and the games themselves.

Data Driven Improvements

It was surprising to me how much could be learned by simply sitting in a classroom and listening.
For instance, the split between students on mobile and students on the web turned out to be hugely important. Once we realized that many students didn’t want to download yet another app, and were consequently using a web browser on their smartphones to participate in our gaming system, we recognized that repeatedly downloading web resources was a big drain on a classroom’s Wi-Fi. Encouraging students to install the app could therefore dramatically improve the user experience of the entire classroom.

Finalizing the Design

We had many productive discussions among the entire team about how best to improve our platform without losing the rigorous intellectual framework that made it appealing to graduate-level instructors. These discussions were some of the most exciting of my career, because the results would have real-world impact on the lives of thousands of students.