Identifying usability issues through inspection methods and techniques to enhance the user experience of the University of Advancing Technology's website.
Website Overview
The University of Advancing Technology (UAT) in Tempe, Arizona, offers on-campus and online technology degree programs. Its website (https://www.uat.edu) serves as a central hub for prospective students, faculty, alumni, and potential employees, providing information on academics, admissions, events, and the university's culture. Visitors can apply for admission, explore job opportunities, schedule tours, and register for events.
Problem
A heuristic evaluation and cognitive walkthrough were conducted on the University of Advancing Technology’s website to identify usability issues and assess learnability for new users. The evaluation used Jakob Nielsen’s “10 Usability Heuristics for Interaction Design” from the Nielsen Norman Group. It revealed 41 heuristic violations, with severity ratings ranging from 1 to 3 on Nielsen’s “Severity Rating Scale for Usability Problems.” Of the 41 violations, 15 were related to “aesthetic and minimalist design”, and 12 to “consistency and standards”.
The “aesthetic and minimalist design” violations result from irrelevant content that distracts users and creates cognitive overload, making it difficult to prioritize essential information. This, in turn, may hinder task performance and completion. The “consistency and standards” violations stem from external inconsistencies, where the website fails to adhere to industry standards for functionality. This lack of consistency reduces learnability, as user expectations may not align with the website’s functionality. As a result, users may experience confusion when trying to navigate and interact with the site.
For the cognitive walkthrough, three core tasks were evaluated for performance. The walkthrough identified issues with two of the tasks, with severity ratings ranging from 2 to 3. These issues aligned with the “consistency and standards” violations found in the heuristic evaluation.
Building on the findings from the heuristic evaluation and cognitive walkthrough, a summative usability test was conducted to validate the issues identified and assess their impact on task performance and user satisfaction. This provided valuable insights, which informed actionable recommendations for improving the University of Advancing Technology’s website.
My Role
• Conducted usability inspections
• Developed usability test plan
• Conducted test sessions
• Analyzed test data
• Compiled results into final report/presentation
Project Type
Student Work
Timeline
10 Weeks
Tools
• Zoom
• Qualtrics
• Freeform
• Figma
• Google Docs
Heuristic Evaluation
Cognitive Walkthrough
Research Questions
Usability Evaluation
Recommendations
I conducted a heuristic evaluation of the University of Advancing Technology's website to gain deeper insight into its functionality. The goal of the evaluation was to assess usability and identify potential usability issues. This process was guided by Jakob Nielsen’s “10 Usability Heuristics for Interaction Design”. Each violation was assigned a severity rating using Nielsen’s “Severity Rating Scale for Usability Problems”:
0 = I don't agree that this is a usability problem at all
1 = Cosmetic problem only: need not be fixed unless extra time is available on project
2 = Minor usability problem: fixing this should be given low priority
3 = Major usability problem: important to fix, so should be given high priority
4 = Usability catastrophe: imperative to fix this before product can be released
41 heuristic violations were found
15 violations of "Aesthetic and Minimalist Design"
12 violations of "Consistency and Standards"
12 violations had a severity rating of 2
8 violations had a severity rating of 3
I conducted a cognitive walkthrough of the University of Advancing Technology's website to assess its learnability. This process provided valuable insights into how users interpret visual cues, follow instructions, and perform tasks. Through this analysis, I identified potential pain points and areas where users might experience confusion. Leveraging this methodology enabled me to offer recommendations for improvements that better align with user expectations and mental models. The cognitive walkthrough was structured around four key questions, designed to evaluate the user story and the progression towards successfully completing tasks:
• Will users be trying to produce whatever effect the action has?
• Will the user see the control (button, menu, switch, etc.) for the action?
• Once users find the control, will they recognize that it produces the effect they want?
• After the action is taken, will users understand the feedback they get, so they can go on to the next action with confidence?
The cognitive walkthrough focused on three core tasks, developed around the information users are likely to access most frequently on the website.
Find information about the Cyber Security master's degree program.
View requirements for graduate student admission.
View graduate tuition.
Issues with a severity rating of 3 were identified in Task 1
No issues were identified with Task 2
Issues with a severity rating of 2 were identified in Task 3
Based on the heuristic evaluation and cognitive walkthrough, the following potential usability issues were identified:
Industry Standards and Consistency: The website’s design and functionality do not align with industry standards for UX design.
Use of Extraneous Content: There is unnecessary content throughout the website that does not provide value to users and could distract them from their intended goals.
Information Architecture: Inconsistencies were found in the organization of the website’s content.
The following research questions were formulated to guide the usability test:
1. Do users encounter difficulties finding the website's main menu?
2. Do users have trouble navigating the website?
3. Does the website's extraneous content distract users or affect task completion?
4. Does the website's information architecture allow users to easily locate important information?
5. How would users rate their overall satisfaction with the website?
Based on findings from the previous evaluations, a summative usability test was conducted on the University of Advancing Technology’s website to validate the identified issues and assess their impact on task performance and user satisfaction.
The usability evaluation was conducted with one user group, prospective students. The “prospective student” profile allowed for easy recruitment of participants who represented the target audience of users for the website. To fit the inclusion criteria, participants must:
• be 18 years or older
• be able to read and understand the English language
• possess knowledge of computer and internet operation
From the responses received, seven participants who fit the criteria were randomly selected.
The usability evaluation required participants to complete four task-based scenarios, which were designed around the usability issues identified during previous evaluations. These scenarios were crafted to expose participants to the potential issues in different contexts, helping to validate their impact.
Find information about the Cyber Security master's degree program.
Scenario: Imagine you’re interested in pursuing a master’s in cyber security at the University of Advancing Technology. How would you use the website to find information about their cyber security master’s degree program?
View graduate tuition.
Scenario: Imagine you’re interested in pursuing a master’s in cyber security at the University of Advancing Technology and want to know the cost of tuition. How would you use the website to find this information?
Apply for an internship opportunity.
Scenario: Imagine you’re a student at the University of Advancing Technology and you’re interested in applying for an internship opportunity. How would you use the website to find and apply for an internship?
Find move-in dates for the Fall 2024 semester.
Scenario: Imagine you’ve been accepted to the University of Advancing Technology for the fall semester and want to know the specific move-in dates for Fall 2024. How would you use the website to find this information?
Summative
Moderated
• DePaul University
• Personal Network
Remote via Zoom
Task-Based Scenarios
Questionnaires, Screen Recordings, and Notes
While completing tasks, users’ actions, behaviors, and performance were observed. Both quantitative and qualitative metrics were collected to meet the objectives of the usability test.
Quantitative Metrics
• Task completion time
• Number of errors encountered
• Time to recover from errors
• Task completion rate
• Accurate interpretation of navigation
• Deviation from the "happy path"
Qualitative Metrics
• Single Ease Question (SEQ)
• After-Scenario Questionnaire (ASQ)
• System Usability Scale (SUS)
• Distraction Level Question
• User comments and feedback
Q1: Do users encounter difficulties finding the website's main menu?
(Do users immediately find the website's main menu? What are users' thoughts on the location of the main menu?)
4 out of 7 users immediately accessed the main menu during Task 1.
"The menu would be better at the top of the page instead of a slide-out menu from an icon. The current location requires users to perform an extra step to access information."
Some users struggled to find the website’s main navigation. While over half of participants easily accessed the “Menu” icon during Task 1, others expressed confusion or uncertainty. These users felt the menu's location was hard to find and required an unnecessary extra step. Despite these difficulties, the menu’s location did not impact task completion rates. Once users found the main menu, they consistently accessed it throughout the test.
Q2: Do users have trouble navigating the website?
(Does the user’s path correspond to the “happy path”? How many errors does the user encounter while performing tasks and how quickly do they recover from errors?)
Q3: Does the website's extraneous content distract users or negatively impact task completion?
(Is the extraneous content distracting to users? Does the extraneous content affect task completion?)
Participants were asked, "How distracting did you find the extraneous content during task completion?"
7 out of 7 users completed Task 1.
7 out of 7 users completed Task 2.
5 out of 7 users completed Task 3.
0 users completed Task 4.
Distraction levels varied across tasks, with most participants rating Tasks 1 and 3 as more distracting than Tasks 2 and 4. These distractions, however, did not appear to significantly impact task completion rates. Participants cited several distractions throughout the test. The moving imagery on the homepage was particularly distracting for some, while others found the homepage’s horizontal scrolling, the main menu, and the large number of menu items distracting. Although these distractions didn’t prevent task completion, they negatively affected the overall user experience and participants' impressions of the website.
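The task completion counts above reduce to simple per-task rates; a quick Python sketch, using only the counts reported in this test, shows the calculation:

```python
# Completion counts reported in the usability test (7 participants total)
completions = {"Task 1": 7, "Task 2": 7, "Task 3": 5, "Task 4": 0}
participants = 7

# Completion rate per task, as a percentage rounded to one decimal place
rates = {task: round(100 * done / participants, 1)
         for task, done in completions.items()}

print(rates)  # Task 3 works out to roughly 71.4%; Task 4 to 0.0%
```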
Q4: Does the website's information architecture allow users to effectively locate important information?
(Do users accurately interpret the navigation labels? Does the navigation easily guide users along the “happy path” when completing tasks?)
"I wasn't sure what the modules were all about. I think I would have liked a step-by-step guide for what I was doing and why I was doing it."
7 out of 7 users accurately interpreted the navigation
3 out of 7 users deviated from the "happy path"
7 out of 7 users successfully completed the task
"I feel like I have to navigate through too many pages just to find information."
5 out of 7 users accurately interpreted the navigation
4 out of 7 users deviated from the "happy path"
7 out of 7 users successfully completed the task
"I wouldn't have clicked this arrow on the homepage because I don't know what it's telling me. I'm just assuming it would show more graphics. It's not really labeled and I wouldn't expect there to be other pages."
4 out of 7 users accurately interpreted the navigation
7 out of 7 users deviated from the "happy path"
5 out of 7 users successfully completed the task
"I wouldn't ever think to look under "Costs" for move-in dates. They just don't seem related."
6 out of 7 users accurately interpreted the navigation
7 out of 7 users deviated from the "happy path"
No users completed the task
The usability test results showed that most participants accurately interpreted the navigation labels while completing tasks. Although many deviated from the ideal path, most users navigated effectively using alternate routes, demonstrating a correct understanding of the labels. For Tasks 1, 2, and 3, the navigation successfully guided most participants to the information. However, for Task 4, inconsistencies in the content organization on the “Student Housing” page hindered users’ ability to find the information, leading to no participants completing the task. For Tasks 2 and 3, some participants used the search function to find information. While these users successfully completed the tasks, their reliance on the search option indicated a misunderstanding of the navigation labels, as they did not use the main menu as intended.
Q5: How would users rate their overall satisfaction with the website?
ASQ (After Scenario Questionnaire)
SEQ (Single Ease Question)
After completing each task, participants answered the SEQ (Single Ease Question): "Overall, how difficult or easy was this task to complete?"
SUS (System Usability Scale)
After the usability test, participants completed a System Usability Scale (SUS), providing insights into users' perceptions of the system's learnability, efficiency, and satisfaction. The SUS consists of ten statements, with participants rating each on a scale from "Strongly Disagree" to "Strongly Agree".
1. I think that I would like to use this website frequently.
2. I found the website unnecessarily complex.
3. I thought the website was easy to use.
4. I think that I would need the support of a technical person to be able to use this website.
5. I found the various functions in this website were well integrated.
6. I thought there was too much inconsistency in this website.
7. I would imagine that most people would learn to use this website very quickly.
8. I found the website very cumbersome to use.
9. I felt very confident using the website.
10. I needed to learn a lot of things before I could get going with this website.
Participant responses were converted to SUS scores using the calculator on UIUX Trend’s website (https://uiuxtrend.com/sus-calculator/). Scores are presented on a scale from 0 to 100, with higher scores indicating better perceived usability. The SUS scale rates the website's usability from "Worst Imaginable" to "Best Imaginable" and includes an "Acceptability" rating and a corresponding grade.
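For reference, the standard SUS scoring procedure that such calculators implement (odd-numbered items contribute rating − 1, even-numbered items contribute 5 − rating, and the sum is multiplied by 2.5) can be sketched in Python; the function name here is my own:

```python
def sus_score(responses):
    """Convert ten SUS item ratings (1-5) into a 0-100 score.

    Odd-numbered items are positively worded: they contribute (rating - 1).
    Even-numbered items are negatively worded: they contribute (5 - rating).
    The summed contributions are scaled by 2.5 to reach the 0-100 range.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten item ratings")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# An all-neutral response (all 3s) yields the midpoint score:
print(sus_score([3] * 10))  # 50.0
```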
6 out of 7 users felt the website was cumbersome to use
5 out of 7 users felt the various functions in the website weren't well integrated
4 out of 7 users felt the website was overly complex, requiring too many aspects to learn before getting started
"I hope students can find information on this website better than I could."
Tasks requiring participants to access information from pages with inconsistent content organization and navigation labels led to decreased satisfaction and a poor user experience. Many participants found the website distracting, overly complex, inconsistent, and misaligned with their expectations. Some also noted that the website's functionality and certain design elements were particularly distracting.
The results of the usability test on the University of Advancing Technology’s website confirmed several issues identified in previous evaluations. These are my recommendations for improvement.
Move the main menu to the top of the homepage, ensuring it is prominently visible and easily accessible for users.
Assess the website to eliminate extraneous content. Key information that guides users toward their goals should be prominently displayed and easily accessible.
Evaluate content organization and use clear, unambiguous navigation labels that align with user expectations for navigating a university or college website.
Assess the website’s information structure and organization to ensure content and navigation labels are consistent and meet user expectations. Additionally, optimize the user flow for quick and easy access to information.
Evaluate the website for external consistency, ensuring it adheres to industry standards for functionality, usability, efficiency, satisfaction, and learnability to enhance the overall user experience.
Conduct further research to measure the impact of the recommendations on the user experience.
Use a larger sample size to re-assess usability and task performance.
Extend the evaluation to identify other potential issues with the website.