Inspiring students to study abroad

Making core pages of Mastersportal more digestible and visually appealing

Before

Before Mastersportal redesign

After

After Mastersportal redesign

My role

I led the UXD team at Studyportals that redesigned the core pages of Mastersportal. The core team consisted of four designers and one product owner. Together, we did competitor benchmarking, defined the design direction, and worked on visual explorations and information architecture. Additionally, I focused on user research and on hiring a visual designer for our team.

Many people from the other teams participated in the planning, A/B testing, visual design, implementation and rollout of this project.

Context

Studyportals is a leading education choice platform, visited by over 44M visitors in 2020 alone. The platform lists over 160K courses from more than 5K educational institutions across six portals, including Mastersportal, Bachelorsportal and PhDportal.

Studyportals' mission is to make sure that no student misses out on an education opportunity due to a lack of information. Studyportals has helped at least 485,000 students find their education.

Challenge

We wanted our platform's design to reflect the excitement of studying abroad. Prior user research and expert reviews had taught us that our product was text-heavy, uninspiring and patchy. We decided to address these issues by redesigning our three main pages so that the entire experience on our website becomes digestible and visually appealing.

We picked the home, search and programme pages as a starting point. These pages are quite different from each other and contain most of the UI components that can be reused on the other pages. Additionally, they make up a big part of our visitors' user journey.

User research

We compiled relevant prior user research, expert evaluations and feedback from various colleagues in one place. As it usually goes, some decisions were easy to base on the knowledge we had, while others required additional research and validation.

Prior user research

When it comes to visual design and inspiration, there is a lot of subjectivity. We wanted to make our results more objective, so we decided to evaluate our explorations using multiple research methods. There are a lot of methods out there; some are lengthy, others unreliable. For this project, we wanted to measure the trustworthiness and appeal of the new version.

We started with these questions:

Baseline measurement

We wanted to go further than NPS. At that time, we had already experimented for a few years with different ways to measure user experience, but none of them covered all the dimensions we wanted. On top of that, the scores fluctuated too much, so it was hard to see how our changes affected the product.

Our choice fell on the Standardised User Experience Percentile Rank Questionnaire (SUPR-Q). This questionnaire can be completed very quickly, as it contains only eight Likert items. Best of all, it focuses on usability, trustworthiness, loyalty and appearance.

We also added one qualitative question at the end, asking about the main reason for the scores given. Before the release, we managed to collect an incredible number of quantitative and qualitative responses.
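For the curious, here is a minimal sketch of how a raw SUPR-Q score can be computed per respondent. It assumes the published scoring scheme, in which seven items use a 1–5 scale and the likelihood-to-recommend item uses a 0–10 scale and is halved; the sample responses are made up.

```python
from statistics import mean

def raw_supr_q(responses):
    """Raw SUPR-Q score for one respondent.

    Assumes items 1-7 are rated on a 1-5 scale and item 8
    (likelihood to recommend) on a 0-10 scale, halved so that
    all items share a comparable range.
    """
    assert len(responses) == 8
    return mean(responses[:7] + [responses[7] / 2])

# Hypothetical respondent: seven Likert ratings plus one 0-10 rating.
print(round(raw_supr_q([4, 5, 4, 4, 3, 5, 4, 9]), 2))  # -> 4.19
```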

8589
Quantitative responses

2056
Qualitative responses

Visual appeal

It doesn't take long to form an opinion about the visual design of a website. People who land on a new website from Google spend mere seconds assessing its credibility and trustworthiness. Therefore, we wanted to compare the first impressions our visitors get.

First impression

We used a 20-second test on UsabilityHub to show the existing website and the proposed design to two separate groups of respondents. We asked them to describe the look and feel of the website in three words.

We collected 137 words that characterise the look and feel of the website. Each word was evaluated independently by three UX designers using the scale below:

After that, we averaged the scores from the UX designers and summed them up per variation. The conclusive scores turned out to be 2.33 and 77.56 for the existing and proposed designs, respectively.
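In code, the scoring boils down to the sketch below. The words, ratings and the −2 ("very negative") to +2 ("very positive") scale are made-up stand-ins for the actual data.

```python
from statistics import mean

# Illustrative subset of the collected words; each word carries the
# three independent designer ratings (values are assumptions).
existing = [("cluttered", [-2, -1, -2]), ("plain", [-1, -1, 0]), ("modern", [1, 2, 2])]
proposed = [("modern", [2, 2, 1]), ("clean", [2, 1, 2]), ("friendly", [1, 2, 2])]

def conclusive_score(words):
    # Average the three designer ratings per word, then sum the
    # averages across all words collected for the variation.
    return sum(mean(ratings) for _, ratings in words)

print(round(conclusive_score(existing), 2))  # -> -0.67
print(round(conclusive_score(proposed), 2))  # -> 5.0
```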

Preference test

Using the same platform, we ran a quick preference test with 25 respondents. We asked them to pick the design they preferred and explain their choice in a few words. 21 out of 25 respondents preferred the proposed design.
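As a sanity check (not part of the original analysis), an exact binomial test shows that such a split is very unlikely if respondents had no real preference:

```python
from scipy.stats import binomtest

# 21 of 25 respondents preferred the proposed design. Test the
# split against chance (p = 0.5, i.e. no preference either way).
result = binomtest(21, n=25, p=0.5)
print(f"p = {result.pvalue:.4f}")  # ~0.0009, far below 0.05
```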

Visual Aesthetics

The Visual Aesthetics of Websites Inventory (VisAWI) survey shows how users subjectively perceive a UI. Respondents were presented with 18 statements that they needed to rate from 1 to 7. These statements focus on four core aspects of website aesthetics from the respondents' point of view: simplicity, diversity, colourfulness, and craftsmanship.

Each version was reviewed by 12 different students. The variation consistently outperformed the control, on average and across all four core aspects. This pattern was already noticeable after each version had received its first six responses.

Old vs new design scores in VisAWI

To avoid order bias in the responses, half of the students received the survey for the existing website first and the variation second. The other half received the variation first and the existing website second.
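For reference, facet scores can be aggregated roughly as in the sketch below. The item-to-facet mapping is an assumption based on the published inventory (five items each for simplicity and diversity, four each for colourfulness and craftsmanship), and reverse-coded items are assumed to be recoded beforehand.

```python
from statistics import mean

# Assumed mapping of the 18 statements (indices 0-17) onto facets.
FACET_ITEMS = {
    "simplicity":    range(0, 5),
    "diversity":     range(5, 10),
    "colourfulness": range(10, 14),
    "craftsmanship": range(14, 18),
}

def visawi_scores(responses):
    """Mean 1-7 rating per facet, plus the overall mean."""
    scores = {facet: mean(responses[i] for i in items)
              for facet, items in FACET_ITEMS.items()}
    scores["overall"] = mean(responses)
    return scores

# One hypothetical respondent's 18 ratings:
print(visawi_scores([5, 6, 5, 4, 6, 5, 4, 5, 6, 5, 6, 6, 5, 4, 6, 5, 5, 6]))
```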

Findability

During the content inventory, we mapped all the elements on the programme pages to see which are essential and which can be removed. We wanted to compare how fast people could find the essential elements in the existing and the proposed designs. For that, we ran two remote unmoderated usability tests with six participants in each group.

The majority of the respondents had no issues finding most of the essential elements in either variation. The proposed design performed better, with less than half the number of failed tasks and confident mistakes compared with the existing one. However, the new design made wishlisting and university ratings less obvious, which we corrected in the next iteration.

We got useful qualitative feedback, but to reliably measure task completion times we should have recruited more participants: 13–18 per variation instead of six.
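The back-of-the-envelope reasoning: task times are commonly modelled as log-normal, so the confidence interval around the geometric mean shrinks with the square root of the sample size. The variability and margin figures below are assumptions for illustration, not our measured values.

```python
import math
from scipy import stats

sd_log = 0.35            # assumed SD of log task times (e.g. from a pilot)
target = math.log(1.2)   # aim for a +/-20% multiplicative margin of error

# Smallest n whose 95% CI half-width on the log scale fits the target.
for n in range(6, 31):
    half_width = stats.t.ppf(0.975, df=n - 1) * sd_log / math.sqrt(n)
    if half_width <= target:
        print(f"~{n} participants per variation")  # -> ~17 with these inputs
        break
```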

To test our information architecture, we did a digital card sorting session with 39 first-year students based on the content inventory done earlier.

Card sorting session

Goal completion

To reliably measure goal completion and conversion, we ran a set of A/B tests. With each test, we exposed a larger share of our visitors to the new design. To see how our visitors scroll and interact with the pages, we analysed hundreds of visitor recordings and 15 heatmaps in Hotjar.
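A sketch of how such a staged rollout is typically bucketed; the exposure percentages and hashing details are assumptions, not our production setup.

```python
import hashlib

# Assumed exposure schedule: each A/B test iteration shows the new
# design to a larger share of visitors.
ROLLOUT_STEPS = [0.05, 0.10, 0.25, 0.50, 1.00]

def sees_new_design(visitor_id: str, step: int) -> bool:
    # Deterministic bucketing: the same visitor keeps seeing the
    # same version for as long as a given step runs.
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF
    return bucket < ROLLOUT_STEPS[step]

print(sees_new_design("visitor-42", step=2))
```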

Sample sizes

In the visualisation below, you can see how many participants we had in each research session.

Visual appeal

50
First impression

25
Preference test

24
VisAWI survey

Findability

12
Usability test

39
Card sorting

Goal completion

5
A/B test iterations

15
Heatmaps

100s
Screen recordings

Visual design

To be less constrained by our product's existing looks, we started the visual explorations from scratch, following the Double Diamond method: first diverging from the current design, then converging our variations into one proposal.

Inspiration

We looked at direct and indirect competition to define how we wanted to position our brand and product.

Benchmarking and inspiration

To play around with different styles and elements, we each made style tiles individually. Later, we consolidated them into one style tile.

Style tiles

Page design

We mapped the content to the page structure and made wireframes of the pages. After all the visual explorations described above, we started styling the wireframes and iterating.

The visual design track was validated in stages using the methods described above, which led to additional iterations.

During the high fidelity iterations, we also started working on interactions and animations. Check the early animation exploration by Gabriel Duarte below.

Additional experiments

At Studyportals, we are able to bring more value to students if they have an account on our website. Before the rollout, we ran a series of A/B tests that improved registrations by 12% (desktop) and 56% (phone) among engaged users.

Registration improvements

Results

The design of the other pages is still being rolled out. The project turned into a moonshot that elevated every part of our practice, from visual design and user research to technical implementation and SEO.

To set our design vision, we explored how the new design would look on the other parts of our portals. In some cases, we made adjustments that made the overall design system stronger.

For a smooth rollout of the style on the other pages, we collaborated closely with front-end developers. One artefact used in the base style rollout is presented below.

Base styling for roll out

Impact

The average monthly ratio of positive to negative comments improved by 21.3%. The same ratio for comments about the UI improved by 14.8%.

Additional experiments to improve the registration rate led to improvements of 12% on desktop and 56% on phone for engaged users.

Improvements to the UI kit and better alignment between designers and front-enders led to large improvements in consistency and design efficiency, and reduced the cost of maintenance.

Reflection

Additional references

  1. NNG: How to Test Visual Design
  2. Common questions about five-second testing
  3. How to measure visual design — a practical guide
  4. Conducting Qualitative, Comparative Usability Testing
  5. The visual preference testing cheat sheet
  6. Rapid Desirability Testing: A Case Study
  7. Scannability: Principle and Practice
