User Testing Report: Expedia, a trip booking website.
Expedia (Expedia.com) is an online trip booking website that offers a wide variety of travel options. Users can book airline tickets, hotels, rental cars, cruises, vacation packages, and amusement park tickets through the web, the mobile app, or by telephone. Expedia was developed and designed to streamline trip planning, with the goal of giving users a good, easy, and fast booking experience. Users can also look up general information about the flight, hotel, or destination they plan to visit.
I hope to provide insights into how to improve the usability of Expedia’s trip planning and the booking experience for users. Through this study, I expect to answer the following questions:
Can experienced users of online travel booking sites use Expedia to plan their trips?
What problems do users encounter when trying to use the site to plan trips?
User testing is a method for identifying usability issues in a user interface. I expect to identify some critical issues, provide useful recommendations, and also share findings about areas where Expedia is doing well.
Methods
The principal audience for this test is experienced users of online travel booking sites. Three participants took part in the study (two women and one man), whom I refer to as A1, A2, and M3. I defined the recruiting criteria as:
A person who has bought a ticket online in the past year.
A person who has used the site before.
I recruited participants who differed along two dimensions:
- Flight preferences:
Standard (no special accommodations)
Complex needs (dietary restrictions, travel w/ infant, special needs, etc.)
- Frequency of online bookings:
1-2 trips per year booked online
3 or more trips per year booked online
The scope of this evaluation includes the booking experience and the errors that could appear using Expedia, as well as the functionality of the homepage of the site. Severity is judged based on a five-point rating scale taken from Nielsen’s “Heuristic Evaluation” chapter, Table 2.3 (1994):
0. I do not agree that this is a usability problem at all.
1. Cosmetic problem only – need not be fixed unless extra time is available on project.
2. Minor usability problem – fixing this should be given low priority.
3. Major usability problem – important to fix, so should be given high priority.
4. Usability catastrophe – imperative to fix this before product can be released.
The instruments used in this study were questionnaires and task descriptions. I chose specific tasks within the flight and stay options for users to complete (the full task descriptions can be found in the appendix):
Plan a round trip with specific dates and destination.
Email the itinerary.
Find the cheapest total price for the trip.
Evaluate the flight options.
Find the top-rated hotel (with certain amenities).
At the beginning of the study, the participants answered a pre-test interview. Its objective was to characterize the participants’ experience with the Expedia website and to identify their travel experience and booking preferences. It also allowed me to learn about each participant’s past experiences booking and buying a flight on a website.
Another user testing method I used was think-aloud, which consists of asking users to think out loud as they perform a task: they say whatever they are looking at, thinking, doing, or even feeling at each moment. The objective of this method was to identify aspects of the website that were confusing to them.
After completing the tasks, the users answered a post-test interview about the problems or difficulties they encountered during the tasks. They then completed a post-test questionnaire, the standard SUS (System Usability Scale) questionnaire, which was administered on paper.
For the analysis, I used logging sheets to structure the notes taken. The logging sheet listed each task’s success criteria, so that when a participant finished a task I could check off whether it was successful. Audio recording was also used to capture accurate information about the user’s experience with the Expedia website. I asked the users for permission to record and presented them with the informed consent form, which all of them signed.
Summary Results
Through the user testing evaluation of the Expedia website, I identified several aspects that, from the users’ perspective, could be improved. One is the lack of a counter showing how many flight options were found. Another complaint is that users cannot choose a seat when they buy a ticket. The website sometimes takes a long time to search for flights, and the search bar lacks an advanced search option. The users also mentioned that the website needs icons for hotel and flight details so they can scan the information and find the right option more easily. As the reader can see, some of these issues need only a bit of fine-tuning; others require design and programming work, but none demands a great deal of effort.
The task completion rate was 100%: all participants completed their tasks successfully. None of the participants made errors while performing the tasks, so the error rate was zero. Regarding task timings, participants averaged 4 minutes per task. Finally, the post-test questionnaire (SUS) scores ranged from 67.5 to 82.5, which sits at or above the commonly cited SUS average of 68 and suggests acceptable perceived usability.
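For transparency, the SUS scores reported above were derived with the standard SUS scoring rule: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0–100 score. The sketch below illustrates that rule; the example responses are hypothetical, not the participants’ actual answers.

```python
def sus_score(responses):
    """Compute a System Usability Scale (SUS) score from ten 1-5 Likert
    responses, ordered as items 1 through 10.

    Odd-numbered items (1, 3, ...) contribute (response - 1);
    even-numbered items (2, 4, ...) contribute (5 - response).
    The summed contributions are scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5


# Hypothetical example: strong agreement on positive (odd) items,
# strong disagreement on negative (even) items yields the maximum score.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # → 100.0
print(sus_score([3] * 10))                        # neutral answers → 50.0
```

A score is only comparable across studies because this fixed rule is applied; reporting raw item averages instead of the scaled score is a common SUS mistake.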
The user testing evaluation of the Expedia website revealed several changes to the design that could improve the user’s experience:
Add an advanced search option to the search bar.
Display the total number of flights found.
Add a "request a seat" option.
Add icons that represent each flight's amenities (Wi-Fi, TV, power and USB outlets, food, baggage, etc.).
It is important to note that most of these issues are not severe at the level of system operation; they are mostly cosmetic. However, these small design changes could still improve the user’s experience.
References
Nielsen, J. (1994). Heuristic evaluation. In J. Nielsen & R. L. Mack (Eds.), Usability inspection methods. New York, NY: John Wiley & Sons.