User research & recruitment setup

Results

Despite being a small team of UX designers, we involved an impressive number of respondents in our user research sessions and surveys to make sure we were building the right things.

13.9k
Quantitative responses

  • First impression and preference tests
  • Visual appeal surveys
  • Card sorting
  • SUPR-Q scores
  • A/B tests
  • Heatmaps and recordings

5.2k
Qualitative responses

  • 470+ user tests and interviews
  • 12 design sprints
  • Product-Market fit surveys
  • Custom intercept surveys
  • SUPR-Q qualitative comments

As you can see above, we used a wide range of user research methods, from usability testing and interviews to A/B tests and card sorting.

In most cases, we combined findings from the different research methods to get more reliable results. A good example of this was the research we conducted during the redesign of Mastersportal.

My role

I led the UXD team at Studyportals, with which we built a robust user research practice. I contributed to the research setup, interviewed respondents, and analyzed research findings to draw conclusions. Additionally, I focused on automating user recruitment and on better ways to present and document research findings.

Context

Studyportals is a leading education choice platform, visited by over 44M visitors in 2020 alone. The platform lists over 160K courses from more than 5K educational institutions on six portals such as Mastersportal, Bachelorsportal, and PhDportal.

Studyportals' mission is to make sure that no student misses out on an education opportunity due to a lack of information. Studyportals has helped at least 485,000 students find their education.

Challenge

The rapid growth of the Engineering and UXD teams posed a range of challenges. The biggest one was staying ahead of the development teams when it came to user research. We already ran bi-weekly user tests, but we mainly tested what was already released, which risked developing features our visitors didn't need.

Below you can see a simplified version of the user research flow we had at the time, with approximate sample-size guidance for each research method.

Old user research flow and tools
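
To illustrate why each method in the flow came with its own sample-size guidance, here is a minimal sketch (my own illustration, not a tool we used) of how the margin of error of a survey proportion shrinks as the sample grows:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate margin of error for a survey proportion.

    n: sample size; p: observed proportion (0.5 is the worst case);
    z: z-score for the confidence level (1.96 corresponds to ~95%).
    """
    return z * math.sqrt(p * (1 - p) / n)

# Quantitative methods need larger samples to narrow the error band:
for n in (100, 400, 1000):
    print(f"n = {n}: ±{margin_of_error(n):.1%}")
# n = 100: ±9.8%   n = 400: ±4.9%   n = 1000: ±3.1%
```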

The bi-weekly user test sessions were a great way to improve our products and the design maturity of the organisation. However, we didn't always have enough topics to test every two weeks, and the cadence put a significant strain on the team.

We wanted to find a sustainable way to integrate a diverse range of user research methods into the different stages of product development.

Setup

Our qualitative research activities fell into two, at times overlapping, categories:

To reduce the subjectivity of our findings, we combined qualitative and quantitative methods to see what students do and what they say.

Intercept surveys

As our websites are visited by millions of students each month, it was easy to get fresh input about experiences with our product. Our intercept surveys often combined quantitative and qualitative questions. A few notable examples were:

Product-Market fit survey

In March 2021, we asked our visitors how they would feel if they could no longer use our product. 82% of the respondents stated they would be very disappointed, well above the 40% benchmark.

Follow-up questions helped confirm our findings from the qualitative feedback we gathered through SUPR-Q.
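
For reference, the product-market fit score is simply the share of respondents who choose "very disappointed", compared against the common 40% benchmark. A minimal sketch with made-up counts that mirror the result above:

```python
from collections import Counter

# Hypothetical answers to "How would you feel if you could no longer
# use our product?" -- the counts are illustrative, not our raw data.
answers = (["Very disappointed"] * 82
           + ["Somewhat disappointed"] * 13
           + ["Not disappointed"] * 5)

counts = Counter(answers)
pmf_score = counts["Very disappointed"] / len(answers)

# A score of 40%+ is commonly read as a signal of product-market fit.
print(f"PMF score: {pmf_score:.0%} (benchmark: 40%)")  # PMF score: 82% (benchmark: 40%)
```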

Design sprints

Design sprints proved to be a fast way to understand and test complex ideas with the added bonus of involving stakeholders in the design process.

We ran our first design sprint in 2016, a few months after Jake Knapp and John Zeratsky published the book about their process.

That same year, I presented my experience facilitating design sprints at three companies and two conferences. In 2017, we ran five design sprints in parallel.

Five parallel design sprints. What could possibly go wrong?

Screen recording viewing sessions

Another way to engage the engineering team and stakeholders in user research was two-hour Hotjar recording viewing sessions. The main goals of those sessions were to find out:

User recruitment

User recruitment was a shared task in the UXD team. It often required a lot of effort and delivered unpredictable results, which is why we automated the majority of the recruitment tasks. Check out my presentation about the user recruitment automation setup we used to get non-professional respondents for our research sessions.

User recruitment automation flow

We usually started the recruitment process 2–3 weeks before a session to have enough participants. To reach non-professional respondents, we used Facebook ads, Hotjar polls, and our email newsletter.

Different research sessions required participants with different backgrounds, so we used personas to construct our screening surveys. To save respondents' time, we placed the critical questions first. Only after the consent form was signed would we ask for personal details.
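
As a rough sketch of how persona-based screening can work, the snippet below filters a respondent pool against a persona's criteria; the field names, persona, and criteria are hypothetical, not our actual screener questions:

```python
from dataclasses import dataclass

@dataclass
class Respondent:
    study_level: str   # e.g. "masters" or "bachelors"
    stage: str         # e.g. "orienting", "applying", "enrolled"
    consent_given: bool

def matches_persona(r: Respondent, persona: dict) -> bool:
    """Keep only respondents whose answers fit the session's persona."""
    return r.study_level in persona["study_levels"] and r.stage in persona["stages"]

# Hypothetical persona for a session about master's programme search.
prospective_master = {"study_levels": {"masters"}, "stages": {"orienting", "applying"}}

pool = [
    Respondent("masters", "applying", consent_given=True),
    Respondent("bachelors", "enrolled", consent_given=True),
    Respondent("masters", "orienting", consent_given=False),
]

# Personal details are only collected after consent, so consent is the
# final filter before anyone is invited to a session.
invitees = [r for r in pool if matches_persona(r, prospective_master) and r.consent_given]
print(len(invitees))  # 1
```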

In recent years, we mostly performed remote user research. The main reason was that it replicates the environment in which our users actually use our products better than in-person sessions do. Additionally, it allowed us to reach far more students around the world. And with help from Calendly, we saved a lot of time on session scheduling and reminders.

Impact

Just doing more user research was never the final goal of the UXD team. Instead, we wanted to build a design culture that ensures that we help students find their dream education.

Measuring design maturity is tricky and often relies on self-reported surveys. In our case, we had a few more objective achievements:

Reflection

Additional references

  1. When to Use Which User-Experience Research Methods
  2. Triangulation: Get Better Research Results by Using Multiple UX Methods
  3. Leading and Lagging Measures in UX
  4. Net Promoter Score Considered Harmful (and What UX Professionals Can Do About It)
  5. Best practices for graphing & displaying data
  6. Quantifying The User Experience: Practical Statistics For User Research
  7. An overview of the various questionnaires that measure software and website quality (pages 70–72)
