case study
Research that grew a roadmap and gave the next team a foundation to build it from.
Browse had existed before. Users liked it. Then a previous decision-maker took it down, and the requests to bring it back never stopped.
The scale made this complex. More than 450,000 students across primary schools to junior colleges. Over 40 subjects, many offered across 2 to 3 curriculum bands and multiple year levels. The information architecture had to hold across all of that.
Meanwhile, curriculum specialists across more than 15 subject areas were building workarounds. Websites. Excel sheets. PDFs full of hyperlinks. Each one a different team solving the same problem the product hadn't.
My discovery quickly surfaced something the product team hadn't fully mapped: content maps, owned by the curriculum division, were the information architecture central to how Browse would need to work.
I built out that picture proactively. It gave the team, including my product leads, enough shared understanding to start conversations with curriculum about updating them. That meant joining curriculum team meetings, understanding how they were already tackling discoverability, and framing our work in terms that made sense to them.
I ran usability testing and surveys with students, teachers, curriculum specialists, content map owners, and members of the original team who built the first Browse feature.
Only 2 out of 9 students made it to the right subject catalogue and category during testing. Some of them didn't understand what MOE Library was for in the first place. This was concerning for the team, since MOE Library was built for students to find and study resources on their own.
This card component made sense elsewhere in the design system, but here it wasn't visually communicating that this was a place to browse for modules.
"Can you make it look more fun?"
— 8 year old usability test participant
"People kept asking for Browse... I didn't expect they would be so lukewarm about this."
— Product Teammate
A bigger question surfaced: even when Browse works, does it fit into how students actually revise? This was a workflow problem we hadn't anticipated, since the avid advocacy for Browse had masked it, but my research surfaced it.
Before my research, shipping Browse was assumed to close the problem. After it, there was alignment that more effort was needed. The roadmap made room for further iteration.
The next team carved out dedicated time for in-depth usability testing with students around revision habits, follow-up research that is rare on this team.
My discovery work into the metadata backbone of these modules also kickstarted crucial conversations with our curriculum colleagues. That collaboration was essential to giving students a positive browsing experience.
Starting with 8 weeks for design left us with a tight turnaround for testing, learning, and iterating. It also meant less time to plan for collaborating with other key service partners, such as our curriculum colleagues. Next time, I'd push for a realistic timeline before work starts, so we can solve these problems upstream.