During the process of designing a product, it is important to understand how users interact with the interface. Because most developers are very different from their end users, they have to rely on usability testing for insight into where users struggle and how those issues might be fixed. However valuable this testing is, it can be expensive and time consuming. Many teams therefore turn to heuristic evaluations to get the insight they need as quickly and cheaply as possible. This form of testing involves giving testers a series of tasks to complete using the product. Testers then apply predetermined “rules of thumb,” or heuristics, to diagnose the problems they encounter.
As a part of the EH 542-442 Usability Studies course at UAH, I was tasked with conducting a heuristic evaluation of a website or a program to test its user-friendliness. Almost immediately, I recalled my frustrations with various freeware programs. These programs are often created as inexpensively as possible, which means that adequate user testing was probably never conducted. One in particular, the Free Picture Resize (FPR) software on my computer, was an older freeware program riddled with bugs. It was the perfect candidate for practicing a heuristic evaluation.
To start, I needed testers who were familiar with resizing digital images, so I turned to my classmates in ARS 432: Senior Project Management, my web design class. Three of them completed the tasks I gave them and recorded comments on each problem they encountered (and the problems were numerous), classifying each according to Nielsen’s 10 usability heuristics. The following are the tasks I gave each tester:
- Task 1: Reduce the size of an image. Click “Open Image” to reach a folder containing “test_image.jpg,” a test image provided for your use.
- Task 2: Return the image to its original size.
- Task 3: Crop the image.
- Task 4: Save the edited image as a separate file.
After collecting my data and analyzing the results, I found a noticeable trend: the testers struggled most with Tasks 3 and 4. A bug in the FPR software causes the crop operation to fail on the first attempt. The program simply returns to the main window with no sign that anything has changed and no error message letting users know that the fault was the program’s, not theirs. This confused and frustrated the testers. It also violates Nielsen’s ninth heuristic, which says that error messages must be provided whenever a problem occurs and must offer concrete direction on how the user can recover.
Probably the most catastrophic problem involved Task 4. The program only features a “Save” button, which creates confusion when the user wants to save the image as a separate file. One tester refused to save the image at all for fear of overwriting the original. Even though the problem may seem self-evident in hindsight, it apparently never occurred to the program’s developers that this might cause trouble.
Both problems have a simple solution: more effective communication. With a few tweaks to the program’s messaging and a new “Save As” button, the developers could dramatically improve the usability of their product (hypothetically, of course, since the developers are probably no longer updating it). Still, if nothing else, this exercise taught me to recognize the usability problems I encounter in everyday life (and maybe to be a little more forgiving of those who left the problems there in the first place).