MARIAN: A web-based testing platform for cognitive assessment.

UX CAPSTONE (University of Michigan/Coursera)

Challenge: Conduct a multi-stage user experience project to design a product from scratch.

Activities: User Research, Prototyping, UX/UI Design, Wireframes

Result: Gained experience with a realistic UX design project and with conducting user experience research.

 

Design Problem

Without access to batteries of digital tests for cognitive assessment, psychologists and cognitive researchers must rely on the paper-and-pencil format, which is complicated to administer and evaluate, and which makes it difficult to store all the data in one place.

 

Target Users

The target users are psychologists and cognitive researchers: people who need a battery of digital tests for the cognitive evaluation of their clients or research participants.

 

Application Design

MARIAN is a web-based battery of digital tests for cognitive assessment that offers the user the following benefits: easy access to the test battery; agile and efficient management of test administration, evaluation, and data storage in one place; and completely automated tests, which means the psychologist may ask the participant to complete them at home, not necessarily in the laboratory. The platform is simple to use because it is designed to be intuitive for psychologists and their clients, and the tests themselves are easy to follow because they are designed interactively, like “games,” for easy understanding.

 

With the battery of digital tests for cognitive assessment called MARIAN, the user can do the following (a rough data model is sketched after this list):

  • Have an account to manage the tests.

  • Create “Files” for clients and remove them.

  • Choose the tests that are closest to their objectives from the catalog.

  • Each client will have an account (an associated email) where the results will be stored. This will allow the participant to perform the tests remotely.

  • Manage the clients and see their progress.

  • View the status of clients, as well as statistics, favorite tests, and other important documents on the dashboard screen.
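
To make these features concrete, the sketch below shows the entities the list implies, written in TypeScript. All names and fields are hypothetical illustrations, not MARIAN's actual schema.

// Hypothetical data model implied by the feature list above.
interface ClientFile {
  clientEmail: string;     // account where the client's results are stored
  assignedTests: string[]; // tests chosen from the catalog
  progress: number;        // fraction of assigned tests completed
}

interface PsychologistAccount {
  email: string;
  clientFiles: ClientFile[]; // "Files" created (and removable) per client
  favoriteTests: string[];   // surfaced on the dashboard screen
}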

Research Methods

To better research the product, interviews were first conducted to identify the needs of the target users. The interviews were conducted with 4 participants. These participants were psychologists recruited according to the following criteria:

  • Currently working as a psychologist.

  • Have knowledge of batteries of cognitive assessment tests.

  • Have experience in handling a digital test battery platform.

Some of the overarching questions in the interview were: How does the psychologist currently do cognitive assessment? What problems do psychologists face when trying to accomplish cognitive assessment? Why do psychologists do cognitive assessment the way they do?

After identifying the users' needs and building a low-fidelity prototype of the platform, a usability test was carried out with 3 users (one user with experience in digital batteries and 2 users without experience). This usability test was conducted to find improvements for the interface of the web-based testing platform MARIAN. The test method chosen was the “think aloud” method, in which users carry out tasks while talking through the steps they are taking to accomplish each task. During the test, they were observed by the moderator.

The high-level goals of this user test were to answer the following questions:

  • How satisfied are psychologists with this web-based testing platform?

  • How easy is it to use this web-based testing platform?

  • Did the web-based testing platform help psychologists achieve their goals?

User Needs Analysis

Thanks to the needs-finding study, it was found that users need:

  • An account to facilitate the management of the research from one place.

  • A platform that keeps the list of participants and assigns them a code to keep the identity of the participant private.

  • To review the progress and results of their own assessment.

  • A platform that keeps track of the clients, allowing them to check the progress of the tests.

I also identified some of the users' current practices. For example: users rarely get access to batteries of computerized tests developed at other universities or companies, and when they do have access to these platforms, it is for a limited time. They also use other platforms for the evaluation of cognitive processes; however, these are expensive or difficult to use. To use these platforms, the user must pay for licenses, which is very expensive if the psychologist attends to many clients. When computerized test batteries are not available, users must administer tests in paper-and-pencil format.

On other occasions, users build a battery of similar tests using programs such as PowerPoint and Word. However, this is often difficult to manage, since users must also keep notes on the duration of the test, reaction time, number of hits, etc.

For the results, they must store each participant's information in an Excel spreadsheet. This process is tedious and long. Users also use tests found on the internet, but these are not very reliable.

Competitive Analysis

In the interviews, users were asked about digital batteries or other tools used for cognitive evaluation. Users mentioned that they had access to paid digital batteries such as Cognifit, Cogstate, and TEA Ediciones. Users also commented that they were satisfied with these platforms due to the variety of tests they provide; however, they would like the option of doing the tests remotely, that is, for clients to have access to the tests and be able to perform them from their homes.

This is not possible, since these platforms require a license that is granted only to the user, in this case the psychologist. Another disadvantage of these platforms is that they do not provide a file for the clients.

They also commented that when they do not have access to these platforms, they use PowerPoint or Word to make some illustrations to later use in the test, but this is complicated and time-consuming, so they rarely use these tools. Another tool mentioned was Excel, used to control and manage clients.

The users interviewed do not know of a platform that stores everything in one place, that is, one where they can carry out an evaluation, keep track of participants, choose from a catalog of tests, and verify the participants' progress through the evaluation.

 

Design Goals

The goals the design seeks to address are:

  • Create a simple but elegant web-based testing platform that users can use confidently.

  • Satisfy the needs of users through the functionalities of the web-based testing platform.

  • Improve the experience of users when conducting cognitive assessments.

 

Some functional requirements describe what the user can do (a sketch of the client-file flow follows this list):

  • The user will have an account that facilitates the management of the assessment from one place.

  • With this account, the psychologist is allowed to choose the tests from the catalog that are closest to the objectives of his or her cognitive assessment. The MARIAN platform will provide several cognitive evaluation tests.

  • The user has the option to create a file for each client. The platform will send an email with a code to the client. This code will allow the participant to perform the tests remotely.
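
As an illustration only, the following sketch shows how the file-plus-emailed-code flow could work in TypeScript; createClientFile, sendEmail, and the code format are assumptions made for this example, not MARIAN's actual implementation.

import { randomBytes } from "crypto";

// Stand-in for whatever mail service the real platform would use.
const sendEmail = (to: string, body: string): void =>
  console.log(`email to ${to}: ${body}`);

// Creates a client "file" and emails the client an access code that
// lets them take their assigned tests remotely.
function createClientFile(clientEmail: string): { clientEmail: string; accessCode: string } {
  const accessCode = randomBytes(4).toString("hex"); // e.g. "9f2c01ab"
  sendEmail(clientEmail, `Your MARIAN access code: ${accessCode}`);
  return { clientEmail, accessCode };
}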

High-Fidelity Wireframe

Mockup

 

Final Usability Test

Goals

The goal of this user test was to answer the following questions:

  • How satisfied are users with this web-based testing platform?

  • How easy is it to use this web-based testing platform?

  • Did the web-based testing platform help the psychologists achieve their goals?

 

Participants

The principal audience for this test was psychologists with little experience using a digital test battery platform. So, I defined the recruiting criteria as:

  • Currently working as a psychologist.

  • Have knowledge of batteries of cognitive assessment tests.

  • Have some experience in handling a digital test battery platform.

Two psychologists who work in private practice and one psychologist who works in a hospital participated in this test. These users were contacted via email and the test was carried out remotely. They have experience with digital batteries and conduct cognitive assessments in their jobs.

Process

The test process was carried out remotely (due to the coronavirus pandemic). Zoom was used to observe the users as they performed the test. Users were asked to share their screen and were asked for permission to record the session (through the informed consent). The link to the prototype was sent through the Zoom session chat.

For this usability test, 3 participants were recruited. Each session began with a brief presentation and a description of the objective of the test. The informed consent was then presented in .pdf format, sent via the Zoom session chat. The participant had to read it, sign it, take a photo of it, and send it to the researcher's email, which was provided in the informed consent.

Once the participant had signed the informed consent, the next step was to ask them about their work in cognitive assessment and their experiences with digital test batteries.

Once this pre-questionnaire was finished, the participant was given a series of instructions to follow during the test, such as indicating and explaining each step they took (think-aloud method).

The participants performed the following tasks:

  • Task 1: Access the platform.

  • Task 2: Change their profile.

  • Task 3: Create a file.

  • Task 4: Choose a battery of tests to evaluate short-term memory.

  • Task 5: Remove the file.

  • Task 6: Sign out of the account.

At the end of the usability test, the participant was asked about their experience with the prototype (difficulties, preferences, changes, and understanding). To confirm this experience, the SUS questionnaire was administered; it was also sent through the Zoom session chat. The participant had to print it, answer it, take a photo of it, and send it to the researcher's email.

During the usability test, notes were taken on what the participants did and said. The recordings were then reviewed and notes were taken on behaviors and comments in order to identify critical usability problems, of which there were very few for this prototype.

For this evaluation, I used the heuristics developed by Jakob Nielsen (2005) in his article “10 Usability Heuristics for User Interface Design”. Short titles were added to the heuristics for clarification, taken from “6 Tips for a Great Flex UX: Part 5” (Neil, n.d.):

  1. FEEDBACK: Visibility of system status

  2. METAPHOR: Match between system and the real world

  3. NAVIGATION: User control and freedom

  4. CONSISTENCY: Consistency and standards

  5. PREVENTION: Error prevention

  6. MEMORY: Recognition rather than recall

  7. EFFICIENCY: Flexibility and efficiency of use

  8. DESIGN: Aesthetic and minimalist design

  9. RECOVERY: Help users recognize, diagnose, and recover from errors

  10. HELP: Help and documentation

Each issue found was categorized by the heuristic from the list above, and the severity of the violation was judged on a five-point rating scale taken from Nielsen's “Heuristic Evaluation” chapter, Table 2.3 (1994); a sketch of how such findings can be recorded follows the scale:

  • 0. I don’t agree that this is a usability problem at all.

  • 1. Cosmetic problem only – need not be fixed unless extra time is available on project.

  • 2. Minor usability problem – fixing this should be given low priority.

  • 3. Major usability problem – important to fix, so should be given high priority.

  • 4. Usability catastrophe – imperative to fix this before product can be released.
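
As a hypothetical illustration of how such findings can be logged (these types are my own sketch, not a project deliverable), each issue can be recorded with its heuristic label and severity; the two entries correspond to the issues reported under Key findings below.

type Heuristic =
  | "FEEDBACK" | "METAPHOR" | "NAVIGATION" | "CONSISTENCY" | "PREVENTION"
  | "MEMORY" | "EFFICIENCY" | "DESIGN" | "RECOVERY" | "HELP";

interface Finding {
  description: string;
  heuristic: Heuristic;
  severity: 0 | 1 | 2 | 3 | 4; // Nielsen's five-point rating scale
}

const findings: Finding[] = [
  { description: "No option lists for the Job and Workplace fields",
    heuristic: "EFFICIENCY", severity: 3 },
  { description: "Unnecessary device information on the Dashboard",
    heuristic: "EFFICIENCY", severity: 2 },
];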

Through the heuristic evaluation, I found some cases where MARIAN (the web-based testing platform) violated Nielsen's heuristics. The heuristics violated in the system were Flexibility and efficiency of use and Aesthetic and minimalist design.

 

Results

All 3 participants completed the tasks with 100% success; however, during the interviews they commented that in Task 2 they had some difficulty understanding the “Patient Visit by Skill” graphic on the Dashboard screen.

To evaluate the usability of the platform, the SUS questionnaire was used. The resulting scores were 90, 77.5, and 72.5. These results are very good, since all three are above the commonly cited SUS average of 68. This reflects that the users were able to achieve the tasks successfully and efficiently, and that their experience with the platform was satisfactory.
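
For reference, SUS scores are computed with Brooke's standard formula: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the 0-40 raw total is multiplied by 2.5 to give a 0-100 score. A minimal TypeScript sketch follows; the example response pattern is hypothetical, not a participant's actual answers.

function susScore(responses: number[]): number {
  if (responses.length !== 10) throw new Error("SUS has exactly 10 items");
  const raw = responses.reduce(
    (sum, r, i) => sum + (i % 2 === 0 ? r - 1 : 5 - r), // odd items: r - 1; even items: 5 - r
    0
  );
  return raw * 2.5; // scale the 0-40 raw total to 0-100
}

// Example: this hypothetical response pattern yields a score of 90.
console.log(susScore([5, 2, 5, 1, 4, 1, 5, 2, 4, 1]));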

Based on what the participants commented and the results of the SUS questionnaire, we can answer the following questions:

  • How satisfied are users with this web-based testing platform? Participants indicated that they were very satisfied with the platform and wanted more information about the product. They commented that the platform is a great idea and could quite possibly help many psychologists and cognitive researchers.

  • How easy is it to use this web-based testing platform? Participants said the platform was easy to use. They commented that the instructions were very easy to understand, that the platform itself helps you complete the required steps, and that it is intuitive for the user.

  • Did the web-based testing platform help the psychologists achieve their goals? The participants mentioned that the platform covers a need that psychologists and other professionals have long had: being able to carry out evaluations remotely. Due to the pandemic, they have had to resort to other tools, but none of them resemble MARIAN.

Key findings

Finding 1: List of options for the “Job” field on the profile screen.

Severity: 3/4

Heuristic Violated: Flexibility and efficiency of use

When the user enters the “My Profile” screen, they must fill in the “Job” and “Workplace” fields; however, users would prefer that a list of options be displayed whenever these fields are filled in.

For the “Job” field, users would like a list with the following options: clinical psychology, cognitive psychology, cognitive science, neuropsychology, neuroscience, experimental psychology, and others. For the “Workplace” field, users would like the platform to detect the user's location and display a list of the closest places or companies.

Recommendations: Add drop-down menus to the “Job” and “Workplace” fields on the “My Profile” screen.

Finding 2: Remove the device-used information from the status screen (Dashboard).

Severity: 2/4

Heuristic Violated: Flexibility and efficiency of use

Users are able to see which device each client used to perform the tests; this was included so that the tests could be better optimized. However, users found this information unnecessary.

Recommendations: Remove the device-used information from the status screen (Dashboard).

References

Neil, T. (n.d.). 6 Tips for a Great Flex UX: Part 5. Designing Web Interfaces. Retrieved from http://designingwebinterfaces.com/6tipsforagreatflexuxpart5.

Nielsen, J. (1994). Heuristic Evaluation. In J. Nielsen. & R. L. Mack (Eds.) Usability Inspection Methods. New York, NY: John Wiley & Sons.

Nielsen, J. (2005). 10 Usability Heuristics for User Interface Design. Nielsen Norman Group. Retrieved from https://www.nngroup.com/articles/ten-usability-heuristics/
