Intro
This is the project I brought to Opensignal as my answer to a design challenge that was part of my recruitment process. Several weeks later, we started working on it together.
As I write this case study, over two years after development, Meteor on Android has reached more than 4.5 million downloads with an excellent 4.8-star rating ⭐️⭐️⭐️⭐️⭐️
These results have blown away any expectations we had as a team. 🤯
What was the Meteor team like? Simply AWESOME!
The challenge
I was asked to create a concept for a new speed test app, based on the existing technology but presented in an entirely new way. Music to my ears!
Generative Research
I conducted several one-to-one interviews, which helped me identify three main types of potential users:
Non-technical users
They have a basic awareness of issues related to Wi-Fi speed and connection quality. They want their Internet to be fast and judge this through direct experience. If they run a test, the results don't tell them much.
Tech-savvy users
More demanding and likely to measure their connection speed regularly from different devices. They require more detailed information; for example, they might ask about tower IDs and available frequencies.
Industry professionals
Industry professionals have extensive knowledge of Internet services, measurement and problem-solving methods. However, this is a narrow group of people interested in professional tools rather than a consumer app.
Why do users measure their mobile/Wi-Fi connection?
To diagnose problems with their Internet connection.
To check whether their provider is delivering the speed it promises.
To compare their connection with friends or neighbours.
Related insights
Average users don't know how to interpret the results of a measurement. Adding context would help explain whether the speed they got matches their actual needs.
They test on both mobile and desktop devices, but the majority rely on web-based tests.
“I’d like to understand if my Internet speed is good just for browsing articles or streaming. I don’t know too much about it.”
Qian
one of my interviewees
Competitive landscape
I reviewed many mobile and web-based apps, as I wanted to see the broad range of design and data-representation approaches. I included web apps because, for many users, they are the most common way to measure Internet speed. Here are some outcomes:
Persona Profile / User Scenarios / User Stories
From the interview results, I built a persona profile. John represents a broad group of people who are willing to learn more about their connection speed and want some guidance about it.
So what's the story? What are the user's goals and motivations when testing their Internet speed?
John
29 years old, Video Editor. He works remotely with many agencies around Europe and has to send and receive large files a few times a day. Lives in London. Loves gigs and watching movies.
Needs
Finding information about the quality of his Internet connection at work and at home. Understanding whether his Internet speed is suitable for his particular requirements.
Pain points
Usually, speed tests do not give him a clear picture of the real source of an issue. He doesn't know whether a slow download is caused by his network's speed limits or by external servers.
Story 1
As a user,
I want to be able to run a speed test so that I know what my actual Internet speed is.
Story 2
As a user,
I want to be able to understand the test results clearly, so I know if my Internet speed is adequate for my needs.
Ideation. Rapid Sketches
I moved information from the user flows into quick drawings. The result looked remarkably simple, but that's the beauty of low-fidelity sketches.
I also drew a few visualisations of the stats. They could be represented by almost any shape, depending on the effect we wanted to achieve: professional, fun, simple?
Eureka! After a few attempts, I found a way to represent the data both originally and consistently, and I felt ready to move on to a digital prototype.
Evaluative research. Digital prototype & iterations
I tested my hypotheses and design solutions using a medium-fidelity prototype. Feedback about the clarity of the functionality helped me define the key flows and UI elements.
Medium-fidelity wireframes
User testing
I repeated the iterative process many times before getting to the high-fidelity final deliverables. Once the feedback turned positive, I added some magic at high fidelity, worked more on the content and styled the app.
In the beginning I started with the why; now we moved on to the how. The project went through almost three weeks of user testing, over two months of development by a 4.5-person team, and more improvements along the way. The final results can be seen below.
New feature prioritisation (later improvements)
Audience validation – user survey
Results two years after development: a 4.8-star rating with almost 5 million downloads, the highest retention of all Opensignal apps, as well as the highest organic growth. See it on Google Play or the App Store.
Branding
Here it is, Meteor! Neon blue, bright turquoise and goldfinch contrasting with clouded dark blue. I chose this palette to bring the freshness of something new while keeping the feel of proven, reliable methods. This is science, but not rocket science!