E-Learning Platform for ESL students

Online test services for TOEFL/IELTS preparation

UX / UI

Due to a non-disclosure agreement, I will not reveal the company’s real name in this case study and will refer to it as “Company X” from here on.

Problem Space

Company X is an E-Learning platform that provides online TOEFL/IELTS test preparation services for students in India. They had successfully launched an MVP the month before I joined the team, yet saw a low conversion rate for their online mock test feature: many users didn’t complete the tests they started or sign up for the platform, for reasons unknown.

Goal

Identify usability issues that might cause the low conversion rate and design solutions to retain users.

Role

I was on a team with three other UX designers, mainly in charge of conducting UX research and redesigning the test screens.

Time

Jan–March 2023

Identify the problem

The first thing my colleagues and I did was facilitate an interview with the company stakeholders to identify their needs and wants. We were curious to learn about the challenges the company was facing as well as their long-term and short-term goals. By asking a list of prepared questions and having a free discussion afterwards, we identified a few things the company needed our help with, organized in the hierarchy below.

(Short-term goals)

To understand why users were withdrawing from the tests, I first conducted a heuristic evaluation of the company’s current mock test feature. Through this investigation, I found several usability issues in the test process that could hinder the user experience and cause users to leave the test page halfway through.

Issue 1: Lack of user freedom and control

Throughout the test, there is no exit button and no way to navigate to other parts of the website; there is no help button either. As a result, users who decide to switch to another test, explore the platform a bit more before proceeding, or who have questions they cannot get answered are very likely to simply close the tab altogether.

With the current test setup, users also cannot choose to take only a specific section of the test. Someone who just wants to try out a certain portion, perhaps to compare it with another website’s format before choosing between the two, may be frustrated that there is no lower-cost way to do so.

Issue 2: Excessive cognitive load

The test instruction page presents a large amount of information at once, which can contribute to cognitive overload and leave users feeling overwhelmed. This may lead them to skip reading the instructions altogether.

Issue 3: Accessibility issues

The text throughout the test is relatively small with tight line spacing, especially in the reading section, which can greatly affect the user’s comprehension of the tasks. While this might not be an issue for everyone, it can undermine usability for some users and make the overall experience less pleasant for most.

In the image above, the text in the reference image for the writing section is extremely hard to read, yet crucial to understanding the prompt. In this case, one can expect users to have difficulty completing the task because of the accessibility issue.

Issue 4: Inconsistency and lack of standards

Throughout the test, users encounter buttons of different sizes and styles, as well as similar colors in different shades. Although this might not directly cause frustration or withdrawal, it can undermine the company's credibility and users' satisfaction with the overall experience.

See full company analysis >

Explore opportunities

After uncovering these usability issues, I explored other UX practices the company could adopt to improve the product and stay ahead of the competition. To do so, my team and I conducted a competitive analysis of five industry leaders to identify strengths and weaknesses we could learn from. After completing a thorough evaluation of each competitor, I compiled a feature comparison table to show where Company X currently stands in the competition.

Because the other products have a much longer history of development, it’s no surprise that they are more mature and comprehensive in terms of functionality and UX/UI design.

However, the defining characteristic that sets us apart is the modern, high-tech look and feel of our product, which is not often seen in educational products. In the next iteration, we can build on this strength by creating a universal style guide and a robust design system for greater consistency and higher standards across the platform.

Based on the competitive research and the heuristic evaluation of Company X’s product, we identified a list of high- and low-priority features to incorporate in the second launch of the product.

High-priority features to be added:

1. Help button. Provide a help button available throughout the test in case the user needs technical assistance or wants to refer back to the test guidelines.

(ETS test screen)

2. "Save and exit" function. The “SAVE & EXIT” button allows the user to save their test and continue later at any time. This helps if the user is short on time or wants to just check out the test without committing to the full time. It also increases user freedom and control, which is lacking in the current version.

(Testglider test screen)

3. Immediate test report. Provide a basic test report immediately after the user finishes the test and before they decide to sign up for the platform. Using a value-in-advance strategy can increase the conversion rate and give the user a more satisfying experience. Consider providing basic test results (e.g. the number of correct, incorrect, and unanswered questions) to all guest users, and reserving the more in-depth insights (e.g. estimated score, detailed test analysis) for registered members.

(Barron’s test screen)
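To make the “Save and exit” idea more concrete, here is a minimal front-end sketch of how in-progress answers could be persisted. This is an illustrative assumption on my part, not Company X’s actual implementation; the `TestProgress` fields, storage key, and dashboard route are all hypothetical names.

```typescript
// Illustrative sketch only: the data shape, storage key, and route below are
// assumptions, not Company X's real data model.
interface TestProgress {
  testId: string;
  sectionIndex: number;
  questionIndex: number;
  answers: Record<string, string>; // questionId -> chosen answer
  remainingSeconds: number;
  savedAt: string; // ISO timestamp
}

const STORAGE_KEY = "mockTestProgress"; // hypothetical key

// Called when the user clicks "SAVE & EXIT": persist progress locally,
// then send the user back to a safe landing page.
function saveAndExit(progress: TestProgress): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(progress));
  window.location.assign("/dashboard"); // hypothetical route
}

// Called when the user returns to the test page: restore saved progress, if any.
function resumeProgress(): TestProgress | null {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as TestProgress) : null;
}
```

For guest users, local storage keeps the saved test on the same device; registered members would presumably need progress synced to a server instead.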

Low-priority features to be considered:

1. Review previous questions. Allow users to review their previous answers before submitting and proceeding to the next section.

2. Hide/show timer and word count. A ticking clock in front of one’s face can add stress for test takers, and the same goes for the word count when writing an essay. Providing the option to turn these features on and off shows consideration for users.

3. Text size customization. For basic accessibility, consider a feature that lets users adjust the font size for better readability (a minimal sketch follows this list).
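As a rough sketch of how the hide/show timer and text size options might work on the front end (the element id, CSS variable, and scale steps below are hypothetical, not taken from the actual product):

```typescript
// Illustrative sketch only: names and values are assumptions.
const FONT_SCALES = [1, 1.15, 1.3, 1.5]; // hypothetical scale steps

// Apply a font scale by updating a CSS custom property that the test screens'
// text styles would reference, e.g. font-size: calc(1rem * var(--font-scale)).
function setFontScale(stepIndex: number): void {
  const clamped = Math.min(Math.max(stepIndex, 0), FONT_SCALES.length - 1);
  document.documentElement.style.setProperty("--font-scale", String(FONT_SCALES[clamped]));
  localStorage.setItem("fontScaleStep", String(clamped)); // remember the preference
}

// Toggle the timer's visibility without pausing it, so hiding the clock
// reduces stress but does not change the actual time limit.
function setTimerVisible(visible: boolean): void {
  const timer = document.getElementById("test-timer"); // hypothetical element id
  if (timer) timer.style.visibility = visible ? "visible" : "hidden";
}
```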

See full competitive analysis >

Redesign

Based on the previous research, I iterated on several key screens and created a new version to improve usability and incorporate the new features. The main areas of focus were:

1. Reduce cognitive load.

2. Increase user freedom and control.

3. Better visual and text balance.

4. More concise and informative UX writing.

Another major issue during the redesign process came from the fact that the company didn’t have a universal style guide, which caused severe design inconsistency across the platform. Therefore, in addition to redesigning the test screens, my team also created a design system to guide all of the company’s future designs.
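To illustrate what the design system provides, a shared set of design tokens is one common way to enforce consistency; the names and values below are placeholders rather than the actual style guide we delivered.

```typescript
// Placeholder tokens for illustration; not the delivered style guide.
export const tokens = {
  color: {
    primary: "#2F6FED",
    primaryDark: "#1F4DB0",
    textBody: "#1C1C1E",
    surface: "#FFFFFF",
  },
  font: {
    body: "16px/1.6 'Inter', sans-serif",
    heading: "600 24px/1.3 'Inter', sans-serif",
  },
  spacing: { xs: 4, sm: 8, md: 16, lg: 24 },
  radius: { button: 8, card: 12 },
} as const;

// Every button and card pulls its size, shade, and radius from these tokens
// instead of hard-coding values, which is what caused the inconsistency.
```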

Next step

After updating the product website to the new design, the next step is to conduct user tests to measure effectiveness and evaluate impact. However, our consulting engagement ended before we had the chance to implement this step, so we suggested that the company test the solutions once the new version is launched.

See full project in Figma >