StreamCharts: UX/UI Redesign of Streaming Analytics

This case study illustrates the redesign of the StreamCharts Overview page, which stakeholders had initially deemed ineffective. User research and competitor analysis informed adjustments to the information architecture, graph visuals, and layout, leading to an improved user experience, as evidenced by user testing results.

Type

UX/UI Design

Client

StreamCharts

Duration

Sep - Oct 2023

Tags

Web

Data Visualization


Overview

In the fall of 2023, I took on an exciting project with StreamCharts to improve their digital platform. At the initial meeting, the company’s CTO expressed concerns about the Platforms Overview page, noting its lack of user feedback, both positive and negative, which was in stark contrast to the other pages that were actively receiving user responses.

The key objective was to identify why the Overview page was unsuccessful in engaging users and propose comprehensive redesign strategies to address these issues.

My Team / 👩🏼 👩🏼 👩🏼‍🦰 👩🏻 👩🏻

As part of a skilled team of five UX designers, my role was multifaceted and integral to the project’s success.

My Responsibilities Included

Facilitated and observed user interviews for direct user insights.

Conducted competitor analysis for benchmarking and innovation opportunities.

Participated in tree testing to assess the information architecture and user navigation challenges.

Engaged in developing user personas and Customer Journey Maps (CJMs) as a team.

Contributed to brainstorming sessions for ideation and problem-solving.

Redesigned the “Growth by Week” section.

Design Process

Double Diamond methodology

We followed a structured yet dynamic process.

We started with competitor analysis to establish industry benchmarks. User interviews followed, providing insights into user needs, which were essential for creating accurate user personas and Customer Journey Maps (CJMs). Tree testing was conducted to validate our hypothesis about simplifying the information architecture.

Brainstorming sessions and ‘how might we’ statements helped focus our design direction. Finally, we developed low and high-fidelity prototypes, rigorously testing them and incorporating user feedback to ensure our solution was user-centric.

Research

11

Competitors

We analyzed 11 competitors, identifying similarities, differences, and UI patterns, and conducted SWOT analyses.

8

User Interviews

We also conducted 8 user interviews with streamers and businesses to learn how they use streaming platforms in their everyday lives.

83

Participants in Tree Testing

Our hypothesis was that the existing information architecture (IA) needed to be simplified. Testing with 83 participants showed that the new IA outperformed the initial one.

What we learned about the users

From our interviews we developed 2 main personas: John, a streamer, and Anna, a business professional. John seeks platforms for audience growth, while Anna looks for platforms to find streamers.

A key gap we found was the lack of an efficient platform comparison feature, particularly critical for business users like Anna. This tool is essential for making informed choices about streamer partnerships.

We also noted that both John and Anna need to understand complex graphs and data easily. They often struggle with information overload.

Persona: John (Streamer)

John's Customer Journey Map

Persona: Anna (Business)

Anna's Customer Journey Map

Problem Definition

We closely examined user feedback and analytics data to pinpoint key areas for improvement. A brainstorming session helped narrow down critical problems, leading to collaborative solution development. Here’s a summary of the main issues we identified:

Complex Information Architecture (IA): Through tree testing, we discovered that users found the platform’s navigation complex and unintuitive, struggling to find relevant information quickly.

Hard-to-Compare Platforms: A significant gap was identified in the platform’s ability to allow users to compare different streaming services effectively.

Graph Complexity: Users found the data representation on graphs to be overly complex and difficult to interpret.

Ideation

Next, we brainstormed and documented various ideas and then voted to select the best ones. We chose ideas that were comprehensive yet easy to implement, avoiding those that required significant system changes.

The ideas with the highest votes moved into the prototyping stage for further development and refinement.

We used stickers and annotations on the prototype to highlight areas for improvement. This collaborative review helped us pinpoint and simplify aspects of the graphs.

We also iterated through multiple variations of each improvement until we achieved the desired outcome.

Idea Generation

Refinements

Considering Options

Outcomes

Platform Comparison

"It takes a long time to compare platforms"

Guided by the needs of our Anna persona, we introduced a dedicated page for side-by-side platform comparison to address this challenge.

Platform Overview Feature

The initial platform overview page only allowed users to view platforms one at a time, not offering a true, collective overview. To address this, we designed a feature that lets users compare all platforms side by side, providing a clearer, more comprehensive overview.

We suggested two versions of the Platform Overview section, each with its own advantages and disadvantages.

1

E-commerce Style

Pros

User-preferred method
Intuitive and direct comparison process

Cons

Could be more complex to implement

2

Ribbon with Dropdowns

Pros

Cost-effective
Uses existing features

Cons

Some participants overlooked it during user testing sessions

Redesign Overview

Takeaways

The project greatly enhanced my understanding of how a team works effectively. It highlighted key skills like delegating tasks and trusting team members to complete them.

I learned about the power of persuasive communication. The project stressed the importance of clearly expressing and defending ideas within a team.

It reinforced the value of a systematic and steady approach in design. We built each stage of the project on top of the previous one. This required us to continually move forward and revisit past stages to refine and improve them.

A significant part of the project involved adapting to limitations. We did not have the opportunity for extensive usability testing; instead, we relied on rigorous user testing of our prototypes, which gave us essential insights and validated our design decisions in real-world situations.

Our adaptability highlighted the dynamic nature of the project and showed our ability to change direction and find solutions when facing challenges.

Overall, the project was an immensely educational experience. It deepened my appreciation for collaborative problem-solving, taught me the importance of agile design thinking, and emphasized continuous improvement through user feedback.