Case Studies

Project: Testing a Concept in Beta

New class content beta tested with users prior to launch.

Deliverables: 

Class content, metrics, comparisons, and final recommendations before full launch to paying customers.

Project Overview

Students were not completing an advanced introductory class, and even those who did lacked comprehension. This impacted several of our KPIs, including churn, Net Promoter Score (NPS), and students’ ability to land a job. We decided to fully rewrite the class, but we had already tried this approach three times in the past. Determined to have a different outcome, we set out to beta test the content before releasing it to all customers.

Problems

Progress data told us that students were starting the class but few were finishing. Those who did finish gave the class low NPS ratings and left numerous comments saying they didn’t understand the material or didn’t feel confident. This was a big blocker to our ultimate goal of students landing jobs. We needed to solve two problems: unhappiness with the class content and low NPS ratings.
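For reference, NPS comes from a 0–10 “how likely are you to recommend” question: the percentage of promoters (scores of 9–10) minus the percentage of detractors (0–6). Here is a minimal sketch of that calculation in Python, using illustrative scores rather than our actual data:

    # Net Promoter Score from 0-10 "how likely to recommend" responses.
    def nps(scores):
        promoters = sum(1 for s in scores if s >= 9)   # scores of 9-10
        detractors = sum(1 for s in scores if s <= 6)  # scores of 0-6
        return 100 * (promoters - detractors) / len(scores)

    # Illustrative only: 2 promoters, 4 detractors, 2 passives -> -25.0
    print(nps([10, 9, 7, 6, 5, 3, 8, 6]))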

Our routine approach to curriculum development wasn’t working for this more complex topic. We needed to completely rethink how to deliver the content on time and on budget while still meeting student needs.

As the project progressed, we also noticed a problem with our data collection. Progress was no longer a clear metric for comprehension and success: students would complete the lessons to mark them done, but weren’t able to solve simple logic problems on their own. We decided to track skill confidence at the start and end of the class, along with the number of students who landed a job after successfully completing it.
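As a minimal sketch, this is how that pre/post tracking could be tabulated, assuming self-reported confidence on a 1–5 scale (all names and values below are hypothetical, not our actual data):

    from statistics import mean

    # Hypothetical records: (tester, confidence at start, confidence at end, landed job)
    testers = [
        ("A", 1, 4, True),
        ("B", 2, 3, False),
        ("C", 1, 5, True),
    ]

    avg_gain = mean(end - start for _, start, end, _ in testers)
    placement = sum(hired for *_, hired in testers) / len(testers)

    print(f"Average confidence gain: {avg_gain:+.1f} points")  # +2.7 points
    print(f"Placement rate: {placement:.0%}")                  # 67%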

Process

To begin the test, we planned to work closely with a small set of beta testers during class production so we could quickly iterate on real feedback.

Setup:

    • Testing time frame: November 9, 2020 – January 29, 2021
    • Testers: 10 students with varying previous exposure to the topic
    • Feedback channels: one feedback form per lesson, a private Slack channel for testers, and personalized 1:1 Zoom sessions between each tester and an instructor.

We released two lessons at a time to beta testers and followed their progress closely. Each lesson included a feedback form where we gathered impressions of the content and asked questions about delivery methods (video, written, or challenge). Testers also met with the class instructor in 1:1 sessions to ask questions and give feedback. We conducted these as user tests (asking no leading questions) combined with general Q&As.

After testers completed each batch of two lessons, we compiled the feedback and reviewed it with the Instructional Designer and subject matter expert (SME). We iterated on the lesson content quickly before moving to the next concept, which allowed us to adjust the class outline as needed and spend more time on complex topics.

We also implemented Happiness Tracking Surveys (HaTS) for a brand-new lesson type. This allowed us to gather first impressions of the look and feel of a challenge before rolling it out to other classes.

Challenges

The test took place over the holidays, so we lost some momentum just as the concepts were getting more difficult. This affected beta testers’ comprehension, and we needed to run some refreshers to get things back on track. Keeping testers active and engaged over a time frame of several months was difficult, but we incentivized completion of the test to combat that issue.

Also, because this was our first true beta test, it was difficult to decide which feedback to implement and which reflected only the experience of a few people. We needed to think long and hard about what was essential knowledge and what was “nice to know but not necessary.”

Finally, confidence proved to be a difficult metric. It is subjective, so how can you measure it accurately? We tied it to successful outcomes, which helped give it more weight.

Conclusion

The beta test ran for three months and concluded roughly five months before we released the class to paying customers. Beta test feedback led to a number of improvements:

    1. Lots and lots of practice! We always thought we gave students enough small practice projects to nail down concepts. What we learned is that to build confidence, students needed about 3x the practice we normally built into the process.
    2. Complex topics typically needed to be broken down across 2-3 lessons rather than made into a single, longer lesson. Subconsciously, this helped students feel like things were bite-sized, despite seeing the same amount of lesson content.
    3. Finally, an overwhelming number of suggestions asked us to show walkthroughs of a developer solving a problem. We had originally planned to do this for 2-3 examples throughout the class but ended up doing 11 walkthroughs. It really helps a brand-new developer to see how a more senior person works, and this did wonders for confidence.

Results