NexGen Video Test

A video test focused on metrics that appear only in video streaming. Video addicts, get ready!

Client: Opensignal
Services: App Design | UX Research

My role

User interviews | User Analysis | Market Analysis | Feature analysis | User flows | Rapid Sketching | Wireframing | Prototyping | Animating | Usability testing | Concept Validation | Agile Framework

The challenge

The year: 2017. My task: design an app that would allow users to test popular video streaming services and let us, Opensignal, showcase our technology.

Our data scientists were still deep in experimentation with the new video metric*. Consequently, I had a free hand to lead design experiments as well. The idea of an app showcasing the methodology came from our client, and internally we liked it, but we didn’t know much about how useful a product like this would be from the consumer’s point of view.

*Opensignal provides data-driven scientific analysis of mobile connectivity. At the time, we were striving to be the first company in the telecom industry to deliver a metric focused specifically on video streaming quality. We succeeded.

First Interviews

Who needs video on a critical basis? Who uploads live streams? Who downloads them?

I worked together with Teresa, the product manager I introduced in the Opensignal App case study.

There is no wide consumer use case for a video-specific quality test. Apart from techies and film enthusiasts, we couldn’t identify audiences interested in measuring the quality of video streaming. That’s why, at the beginning of the process, Teresa and I decided to reach out to potentially interested groups:

– connected industries (video producers, agencies)
– video service providers
– video-heavy services hosted on their own servers
– infrastructure providers + QA
– industry experts (video-quality focused)
– international clients of video agencies
– consumers (VoD + social media).

By conducting one-to-one interviews, we learned that people who stream videos for entertainment, and professionals who make a living from video production, don’t concern themselves with the quality of video streaming.

Specialists use the default presets for processing their videos – standard encoders – and trust services like YouTube or Vimeo.

They are not interested in measuring whether their video is presented well to every end user, whether it is available everywhere around the globe, or what the quality of experience is when streaming it.

The one category of professionals* interested in this kind of information is video-quality-focused industry experts.

But the goal was to deliver a fun tool for consumers, not a professional measurement tool heavy with difficult-to-understand indicators, so I focused on information that could interest regular consumers and dropped the complex indicators dedicated to experts.

*For example, the Streaming Video Alliance, whose mission is to create best practices for scaling capacity and operations to support high-quality streaming for events and VOD content. On their website they list key metrics for video streaming: video startup time (latency), re-buffering ratio, jitter, video start failure (y/n), average bitrate (bits per second), and more.
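Most of these metrics can be derived from a handful of playback timestamps. A minimal sketch in Python, assuming a hypothetical event-log format (this is an illustration, not Opensignal’s actual methodology):

```python
# Sketch: deriving key streaming metrics from a playback event log.
# The event format and numbers are hypothetical, for illustration only.

def compute_metrics(events, total_bits):
    """events: list of (timestamp_s, kind) where kind is one of
    'play_requested', 'first_frame', 'stall_start', 'stall_end', 'ended'."""
    times = {kind: t for t, kind in events
             if kind in ('play_requested', 'first_frame', 'ended')}
    # Video startup time (latency): play request -> first rendered frame.
    startup = times['first_frame'] - times['play_requested']
    # Re-buffering ratio: time spent stalled / total session time.
    stalls = [t for t, kind in events if kind in ('stall_start', 'stall_end')]
    stalled = sum(end - start for start, end in zip(stalls[::2], stalls[1::2]))
    session = times['ended'] - times['play_requested']
    rebuffer_ratio = stalled / session
    # Average bitrate: bits delivered over the time video actually played.
    playing = session - startup - stalled
    avg_bitrate = total_bits / playing
    return {'startup_s': startup,
            'rebuffer_ratio': round(rebuffer_ratio, 3),
            'avg_bitrate_bps': round(avg_bitrate)}

events = [(0.0, 'play_requested'), (1.2, 'first_frame'),
          (10.0, 'stall_start'), (12.0, 'stall_end'), (61.2, 'ended')]
print(compute_metrics(events, total_bits=150_000_000))
```

For the sample session this reports a 1.2 s startup, a ~3% re-buffering ratio and an average bitrate of roughly 2.6 Mbps; the consumer-facing challenge was translating numbers like these into something approachable.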

The potential problem was that without personas and use cases (which help build an understanding of someone’s journey), we could develop a product that wouldn’t be engaging and therefore wouldn’t get users’ attention.

Apps of Key Competitors

Approach

During the interviews, we didn’t find any definite use cases that would let us clearly understand what direction to take in presenting our new metric. Knowing what we didn’t know, we decided to validate our assumptions during prototyping.

I spent several weeks on heavy prototyping and multivariate user testing to reveal which functionalities from our list were most exciting to users. This helped define the MVP feature list. I also examined ways of displaying them.

Main features we considered during testing:

Video test for different platforms (like YouTube or Netflix) | Display recognised Key Network Delivery Metrics | Explain the result | Details of the test | Video test map | Service/server map | Bandwidth speed test for people who need these metrics | History of results | Result comparison with people in the neighbourhood

User Stories & User Flows

Three core user stories formed the base for my sketching:

As a user,
I want to run a video speed test for a particular service, like YouTube or Netflix.
As a user,
I want to learn what the test results mean.
As a user,
I want to learn about the overall video streaming capacity of my network.

Below are some user flows, each representing the path a user follows through the app interface to complete a task.

Rapid sketching

From the user flows, I moved on to rapid sketches. These help to quickly develop different solutions for particular flows and can be a good base for first quick paper prototypes and testing.

Prototyping & Testing

Through testing and iteration, I defined how to display each feature.

I repeated the iterative process and conducted user testing before handing over high-fidelity deliverables. Some screens needed many iterations before they started working for testers, but whenever we try to create something new, there are no ready answers; it’s all about experimenting. 🤯 🤓

As my role covered both user experience and visual design, I was able to examine not only dry functionality and the options introduced but also – at a later stage – excitement about the upcoming product, even before any development!

Medium-fidelity wireframes

…with some splash of colour

I also tried to define a consistent and appropriate tone of voice and branding. All these levels help to build a beautiful product.

My deliverables included work-in-progress presentations, a clickable prototype, a proposed MVP feature list for the video streaming tester, a final presentation and a short video presenting the app vision.

Although the results were well received, sadly the app never went into production.

Delivery of the video metric took more time than expected, and during this period the company’s strategy changed. On one occasion, our CTO told me: “Malgo, you’ve become the victim of rapid changes the company comes through.”

As much as I’d love to see this app in the store, the project was never released. Instead, it was decided to add the video metric to one of our existing apps, Opensignal.

Branding

The youngest from the Opensignal stable. This is a completely new test: young, edgy and engaging! A dominant charcoal blue mixed with accent colours: energetic blue and rose pink, plus drops of white and snow-grey shadows.

What is NexGen?

A video speed test, focused on metrics that appear only in video streaming, like video start time or buffering.

Choose platform

Test your favourite streaming platforms, like YouTube or Netflix

Run video test

Find out how the quality of your internet connection will affect your video streaming.

Understand events

Discover events that occur during streaming and their positive or negative impact on it.

Results

See the details of the test results, explained in an approachable way.

Best Streaming Times

Identify the best times of the day and the week to stream videos.
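One plausible way to drive a view like this is to bucket historical test results by weekday and hour and rank the averages. A sketch under assumed inputs (the data shape and scoring are made up, not the app’s real model):

```python
# Sketch: ranking "best streaming times" from a history of test results.
# Input shape (ISO timestamp, quality score) is hypothetical.
from collections import defaultdict
from datetime import datetime

def best_slots(results, top_n=3):
    """results: list of (iso_timestamp, score); higher score = better streaming."""
    buckets = defaultdict(list)
    for ts, score in results:
        dt = datetime.fromisoformat(ts)
        # Group scores into (weekday name, hour-of-day) buckets.
        buckets[(dt.strftime('%A'), dt.hour)].append(score)
    averages = {slot: sum(s) / len(s) for slot, s in buckets.items()}
    # Best slots first: sort buckets by their average score, descending.
    return sorted(averages, key=averages.get, reverse=True)[:top_n]

history = [('2017-06-05T21:30', 42), ('2017-06-12T21:10', 48),
           ('2017-06-06T19:00', 80), ('2017-06-13T19:45', 76),
           ('2017-06-07T08:15', 65)]
print(best_slots(history, top_n=2))
```

With more history, the same buckets could feed a simple weekly heat map, which is one obvious way to present this feature to consumers.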