Requirement gathering and discovery research
Ideation and solution design
Prototyping
Usability testing
Feasibility considerations
Development collaboration, i.e. specs & snagging
Figma
Notion
Dscout
Canva
Our existing content categorisation structure did not adequately support users in intuitively browsing categories to discover what to watch. Navigation was often challenging, which limited users' ability to explore and uncover the available content efficiently.
This created a gap in content discovery, resulting in reduced engagement and underutilization of the platform. In short, users were not watching as much as they could have, simply because discoverability was poor.
This project set out to investigate how categorisation could be improved to better align with user expectations. The primary goal was to enhance content discovery by making categories intuitive, easy to navigate, and structured in a way that ensures every type of content is easily discoverable. This initiative was a cross-platform effort, spanning web, mobile apps, OTT, and connected TV (CTV). For the purpose of this case study, the focus will be on the web experience for coherence and depth of exploration.
The major goal of this project was to improve content discoverability through a more intuitive categorisation system, enabling users to easily find and engage with content that matched their interests. By enhancing navigation and reducing the friction users faced when deciding what to watch, the aim was to drive higher content consumption and improve retention across platforms, ultimately supporting revenue growth by keeping users engaged and entertained.
This project followed a user-centred design approach guided by the Double Diamond framework. The framework provided a structured process that ensured I deeply understood user needs while shaping effective solutions. I began by discovering the problems users faced with content categorisation through research and insights.
From there, I defined the core challenges by synthesizing findings into a clear problem statement. In the next phase, I developed potential solutions, testing and refining them to enhance content discoverability. Finally, I delivered the chosen solution and validated it to ensure it met both user needs and business goals.
Diving into the project, I began with secondary research to gain a deeper understanding of content categorization within the industry. I reviewed a range of credible sources, including publications from the Nielsen Norman Group, academic papers on HCI and UX, as well as studies on genre rating analysis and other relevant topics. From this research, I identified standard criteria and their underlying justifications, drawing on established patterns and best practices to inform the approach.
I evaluated our existing category pages to understand how categories were being mapped and presented to users. This involved closely examining the structure, hierarchy, and labeling of categories to assess how intuitive they were for browsing and discovery.
This helped to uncover potential issues that might create friction for users, such as unclear naming, inconsistent organization, or gaps in categorization. By identifying these pain points, I was able to establish a foundation for improving the overall navigation and making content easier to explore.
Benchmarking was conducted against 11 competitors using criteria defined through the secondary research. Each criterion was scored on a two-point scale: two points for full fulfilment, one point for partial fulfilment, and zero where the criterion was not met. STV Player was then evaluated against these competitors to determine its relative position and uncover areas for improvement.
When the scores were tallied and competitors ranked, STV Player placed last. While this result clearly highlighted that there was work to be done, it also provided valuable insight. Many of the industry’s leading platforms ranked at the top, reinforcing that they were applying effective strategies we could learn from and adapt. However, what I always find most exciting about this kind of analysis is that it not only reveals where we need to improve but also uncovers opportunities for us to innovate and set new standards that others in the industry could follow.
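As a rough illustration of the scoring approach described above, the tally and ranking can be sketched in a few lines of Python. The platform names, criteria, and scores below are invented placeholders, not the actual benchmarking data.

```python
# Each criterion is scored 0 (not met), 1 (partial), or 2 (fully met),
# and platforms are ranked by their total score, highest first.
# All names and numbers here are illustrative placeholders.

def rank_platforms(scores: dict[str, list[int]]) -> list[tuple[str, int]]:
    """Rank platforms by their summed criterion scores, highest first."""
    totals = {platform: sum(points) for platform, points in scores.items()}
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)

example = {
    "Platform A": [2, 2, 1, 2],  # fulfils most criteria fully
    "Platform B": [1, 2, 0, 1],
    "Platform C": [0, 1, 0, 1],  # trails on most criteria
}
ranking = rank_platforms(example)
```

The same tallying works for any number of competitors and criteria; only the score lists change.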
I proceeded to the primary research phase, which offered a more direct and user-centered perspective. While secondary research provided valuable background insights, conducting primary research was especially beneficial as it kept the users’ needs and experiences at the forefront. During this stage, the categories feature and functionality were tested to assess how effectively the categorisation system supported content discovery. This evaluation focused not only on how clear and intuitive the categorisation appeared to users but also on how it influenced their ability to navigate and engage with content efficiently.
The behavioural testing provided valuable insights into several key areas:
Why users perceived certain categories as unintuitive.
How effectively content metadata helped users recognise programme types and understand their relevance to each category.
How easily users were able to locate content aligned with their specific preferences within categories.
After completing the discovery phase, I broke down the findings into the most significant insights that highlighted where users were facing challenges. This presented clear and actionable areas that would guide the next steps in addressing the problems.
I transformed the insights into user stories to create a human-centred framing, keeping the focus on the needs of actual users. This approach naturally sparked different ideas for how the solutions could be approached and explored. From there, I introduced How Might We (HMW) statements as a way to reframe the challenges into opportunities, asking, "How might we provide users with what they need or want?"
Before moving into design, it was essential to determine how to meaningfully consolidate categories while still retaining the breadth of available content. The number of top-level categories was reduced from 31 to 8, with some categories reorganized as subcategories—for example, all crime-related content now sits under a single Crime category. Others, such as Audio Described, Recently Added, and Most Popular, were identified as better suited for filtering and sorting options rather than standalone categories.
This approach aligns with Hick's Law in user psychology, which states that the time it takes to make a decision increases with the number and complexity of choices. Reducing the number of categories therefore lowers cognitive load: users can make faster, more confident decisions and ultimately find something to watch more quickly and with less effort.
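Hick's Law is commonly formalised as T = b · log₂(n + 1), where n is the number of equally likely choices and b is an empirically fitted constant. A minimal sketch of how the reduction from 31 to 8 categories plays into that formula (with an arbitrary placeholder value for b):

```python
import math

def hick_decision_time(n_choices: int, b: float = 1.0) -> float:
    """Hick's Law: T = b * log2(n + 1).

    b depends on the user and interface; b = 1.0 is an arbitrary
    placeholder, so only the relative change is meaningful here.
    """
    return b * math.log2(n_choices + 1)

before = hick_decision_time(31)  # 31 top-level categories
after = hick_decision_time(8)    # consolidated to 8
```

Because the relationship is logarithmic, cutting categories from 31 to 8 reduces the modelled decision time, though not in direct proportion to the cut.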
A detailed user journey was also developed to map out new pathways a user could take when exploring and discovering content within the category section. This process helped visualize the touchpoints, decisions, and interactions that users would experience, from their initial entry point to the moment they find desired content. This understanding served as a foundation for the high-fidelity explorations, guiding layout and interaction decisions from page to page.
After establishing an ideal navigational structure, I explored multiple ways to present the user interface and its functionalities while maintaining consistency with our design language and reusable components. This stage involved ideating on key interactions such as how categories would be displayed for selection, how filtering options would appear, and how users could seamlessly browse through content. After numerous brainstorming sessions (and plenty of coffee), the refined concepts were consolidated into signed-off versions that were then taken forward for testing.
The goal of the usability test was to validate that the redesign effectively addressed the identified pain points: ensuring that categories were intuitive to navigate and that content within each category was easy to discover, allowing users to find what they wanted quickly and confidently. The testing focused on several key areas:
Navigation and filtering behaviour
Category simplicity and naming clarity
Subcategorisation and content grouping
Overall experience
The usability test demonstrated that the new categorisation system successfully addressed the previously identified issues. Participants showed a high task success rate and significantly faster completion times across all test scenarios. In addition, users gave exceptionally positive feedback, rating the navigation, simplicity, clarity, and subcategorisation highly, which reinforced the effectiveness of the redesign in enhancing overall usability and content discoverability.
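For reference, headline metrics like task success rate and completion time can be computed from raw session records along these lines. The field names and sample data below are illustrative, not the actual study results.

```python
# Illustrative sketch: each session record notes whether the participant
# completed the task and how long they took. Field names are assumptions.

def task_success_rate(sessions: list[dict]) -> float:
    """Share of sessions in which the participant completed the task."""
    completed = sum(1 for s in sessions if s["completed"])
    return completed / len(sessions)

def mean_completion_time(sessions: list[dict]) -> float:
    """Average completion time (seconds) over successful sessions only."""
    times = [s["seconds"] for s in sessions if s["completed"]]
    return sum(times) / len(times)

sample = [
    {"completed": True, "seconds": 34},
    {"completed": True, "seconds": 41},
    {"completed": False, "seconds": 120},
    {"completed": True, "seconds": 28},
]
```

Averaging completion time over successful sessions only is a deliberate choice here; abandoned attempts would otherwise skew the figure.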
During the usability tests, two key areas for improvement were identified and subsequently addressed through iterative design refinements.
When asked to sum up their overall experience with the categories section in one word, these were the most common responses:
Accessibility considerations were a key focus to ensure the design was inclusive and usable for all users. The following measures were implemented to enhance accessibility:
Accessibility filters – to help users easily find content with subtitles, audio descriptions, and visual signing.
High-contrast design – applied to banners, cards, and other graphical elements to improve legibility and comply with WCAG (Web Content Accessibility Guidelines) standards.
Alphabetical arrangement of categories – to establish a predictable, logical order that supports easier navigation, particularly for users relying on screen readers.
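To illustrate how accessibility filters of this kind might work under the hood, here is a rough sketch: programmes carry accessibility flags, and the filter keeps only those matching every flag the user has switched on. The data shape and field names are assumptions for illustration, not the production implementation.

```python
# Hypothetical filter: keep programmes whose accessibility flags
# include every feature the user has requested.

def filter_by_accessibility(programmes: list[dict], required: set[str]) -> list[dict]:
    """Return programmes that carry all of the required accessibility flags."""
    return [p for p in programmes if required <= set(p.get("accessibility", []))]

catalogue = [
    {"title": "Drama A", "accessibility": ["subtitles", "audio_description"]},
    {"title": "Comedy B", "accessibility": ["subtitles"]},
    {"title": "News C", "accessibility": []},
]
with_ad = filter_by_accessibility(catalogue, {"audio_description"})
```

Requiring all flags (rather than any) matches how a user narrowing by, say, audio description would expect the results to behave.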
These are the core areas of the categories pages on web following the design iterations. In addition to enhancing usability, I identified an opportunity to boost advertising revenue by introducing sponsorship options across different categories and utilising banner spaces for targeted ad placements. This approach was very well received by stakeholders as a strategic and scalable solution.
As this project was delivered across multiple platforms, I have also highlighted key designs for the mobile and TV experiences.
Design QA with developers was conducted after the design handover. I particularly enjoyed this stage as it ensured that what reached users truly reflected the intended design. This process involved continuous collaboration with developers throughout the sprints and was essential for several reasons:
Ensured that all design elements, including colours, fonts, spacing, and layout, aligned with the original specifications for a cohesive look and feel.
Verified that navigation and interactions were implemented smoothly and intuitively, maintaining the intended user experience.
Confirmed that the design remained consistent and functional across different screen sizes, resolutions, and devices, enhancing responsiveness and adaptability.
Real-world user interactions will be closely monitored to determine whether further iterations are needed based on actual usage patterns. Integration of additional features across multiple platforms will also be considered according to our prioritisation frameworks. For example, making the genre tags interactive on a programme page was identified as a valuable enhancement but was deemed out of scope for the current phase.