For my senior capstone project, I worked with three other students to redesign the University of Michigan Center for Entrepreneurship’s (CFE) website. The site’s owners were concerned about its usability, specifically navigation and ease of use.

My Role
Project Manager
UX Designer
UX Researcher
Timing
8 Months
Tools
Figma
Team
4 Students
About the Project

Goal
To improve the CFE website’s architecture and design to provide ease of use and organic discovery of resources.

Process
Heuristic Evaluation
Surveys
Card Sorting
Usability Testing
User Research
Sketches
Wireframes
Low, Mid, and High Fidelity Mockups
Prototypes
Iterative Design
User Testing
Data Analysis
Validation
User Research
Research Questions
How well do users access resources from the CFE website in order to meet their needs?
Can users find the resources they are searching for and, if so, how easily are they found?
Heuristic Evaluation
Purpose
To find faults and determine what could be improved in terms of design and user experience.
Methodology
We evaluated the site against five of Nielsen's heuristics: consistency, relevancy, visibility, clarity, and error prevention. Findings for each principle were rated on a priority scale to determine their importance to the overall experience of the site.
Key Findings
• Inconsistency among headers and page structure made it difficult for users to navigate
• There was a significant lack of signaling that some links took the user to a completely different site
• The variation in containers across the site was confusing and made them difficult to click
Surveys
Purpose
To collect data from all user groups about how they use the website, along with limited information about their struggles and successes with it.
Methodology
The survey contained 10 questions and was shared through our client's email list, receiving 1,803 responses.
Key Findings
• Participants consisted mostly of students, faculty/researchers, and partners
• Most users are “Almost Always” able to find what they are looking for on the website
• Most users find the website “Fairly easy to navigate”
Card Sorting
Purpose
To understand what forms of information architecture are most intuitive to users.
Methodology
Participants (4) answered questions before and after sorting the cards. The sorting tool provided instructions and help while participants completed the sort. We used open card sorting so that we could gain a sense of how participants think the site should be organized and what categories they expect to see.
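One common way to analyze open sorts like these is to count how often participants place each pair of cards in the same pile; pairs that co-occur frequently suggest content that belongs under the same category. The sketch below illustrates the idea in Python; the card names and groupings are placeholders, not the actual cards or data from our study.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical open card sort results: each participant's piles of cards.
# Card names are placeholders, not the actual cards from our study.
sorts = [
    [["Pitch Competition", "Startup Career Fair"], ["Entrepreneurship Minor", "Graduate Certificate"]],
    [["Pitch Competition", "Graduate Certificate"], ["Startup Career Fair", "Entrepreneurship Minor"]],
    [["Pitch Competition", "Startup Career Fair"], ["Entrepreneurship Minor", "Graduate Certificate"]],
]

# Count how often each pair of cards is placed in the same pile.
co_occurrence = defaultdict(int)
for participant in sorts:
    for pile in participant:
        for a, b in combinations(sorted(pile), 2):
            co_occurrence[(a, b)] += 1

# Pairs grouped together most often suggest cards that belong under
# the same top-level category in the site's architecture.
for (a, b), count in sorted(co_occurrence.items(), key=lambda kv: -kv[1]):
    print(f"{a} + {b}: grouped together by {count} of {len(sorts)} participants")
```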
Key Findings
Participants converged on four top-level categories, shown below: Events/Opportunities, Student Resources, About Us, and External Partners.



Usability Testing
Purpose
To determine how successful users are in completing pre-determined tasks. This test was repeated with the redesign for validation purposes.
Methodology
Our team designed a set of tasks and asked users (4) to complete them while being observed. The tasks were performed on the current website to determine how easily the site can be navigated, and participants thought aloud as they worked. Usability tests helped us understand the thought process users go through while performing tasks and identify the pain points they encounter.
Key Findings
• Participants 1 and 3 had no trouble finding courses offered, graduate certificate program information, or information about the entrepreneurship minor
• Participants had the hardest time finding opportunities offered: UM Biomedical Venture Fund and May Mobility TechLab
• On average, participants were able to find information about the graduate certificate program the fastest.
Usability Testing Result Averages
Design
Process

Requirements
User Requirements
• Navigation must be clear for users; it must be logical and understandable.
• Users need assurance that where they are navigating on the website is where they are actually intending to go (that the resources/information they seek are there).
• Language used on the website must be clear and organized logically for users.
Business Requirements
• The majority of users must rate the site above average on the System Usability Scale (SUS) (see the scoring sketch after this list).
• The solution should be inexpensive.
• The solution should be easy for web development novices to implement and maintain.
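For reference, SUS produces a 0-100 score from ten 1-5 Likert items: odd-numbered (positively worded) items contribute their response minus 1, even-numbered (negatively worded) items contribute 5 minus the response, and the sum is multiplied by 2.5. A commonly cited benchmark for "above average" is a score of roughly 68. The Python sketch below shows this standard scoring; the sample responses are hypothetical.

```python
def sus_score(responses):
    """Score a single participant's ten SUS items (each answered 1-5).

    Odd-numbered items are positively worded and contribute (response - 1);
    even-numbered items are negatively worded and contribute (5 - response).
    The summed contributions are multiplied by 2.5 to give a 0-100 score.
    """
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Hypothetical response set for one participant.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```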
Wireframes
After brainstorming solutions and ideas, we sketched our wireframes. Features sketched included drop-down menus and side navigation.
Low Fidelity Prototype
We then transformed our wireframes into a low-fidelity prototype for testing. We found that while users considered the navigation features useful, it was difficult to maneuver through the prototype without more content included.


Mid-Fidelity Prototype
The mid-fidelity prototype is similar to the low-fidelity prototype but brings the design closer to how it would look on the finished site.
We included a "Get Involved" tab in the header instead of having it as a section for each user group (students, faculty/researchers, and partners). This feature is also to promote involvement with the Center.
We found that the users understood that the boxes were clickable in this iteration and they want the dropdowns to appear when hovering over the navigation tab. Users were able to return to the landing page by clicking the CFE logo in the top left easily and did not have any struggles with other navigation features.
High Fidelity Prototype
In this prototype, we changed the drop-down menus to appear on hover and refined their appearance. Content from the Center for Entrepreneurship’s current website was added to the prototype to minimize confusion caused by missing content.
Final Design
Rationale
The final design incorporates feedback from users during design iterations. It also fulfills the user requirements that were used to direct the design phase.
Requirement 1: Navigation must be clear for users; it must be logical and understandable.
Fulfilled by: the addition of drop-down menus that appear when hovering over the navigation headers.
Requirement 2: Users need assurance that where they are navigating on the website is where they are actually intending to go (that the resources/information they seek are there).
Fulfilled by: the addition of the menu, as it creates greater clarity in navigating the website.
Requirement 3: Language used on the website must be clear and organized logically for users.
Fulfilled by: research into the terminology used in the site’s navigation (the website architecture), which showed that the current architecture worked well and needed only minor adjustments.
Key Screens and Design Elements
Home Screen: The original design had an image as the home page with information below. Many users found this confusing and were unaware that the home page had content besides a picture. The home page was updated to welcome users to the website and signal to the user that there is information on the page.
Drop-Down Menu: During testing, users became frustrated because they did not know what information each tab contained. Drop-down menus were added to reduce this confusion.
Get Involved Page: Users could not tell who could get involved in each of the resources offered by the Center for Entrepreneurship. This made finding resources difficult. We created a Get Involved page with a filter function that allows users to find the appropriate resources.
Validation
Study Procedure
Materials
A/B tests were conducted with the newly designed website and the original website (to minimize differences in look and functionality, the original CFE site was replicated in Figma). The tasks for each group were identical. We recorded the number of clicks used and time spent.
Procedure
Each test began with an introduction to the study, and participants were asked to sign a consent form for recording the session. They then completed a series of tasks, rating the difficulty of each task after completing it. This per-task difficulty rating is a Single Ease Question (SEQ) and was used to gauge each participant’s perception of usability. Lastly, participants answered a series of follow-up questions.
Analysis
Below are charts showing the averages for the 8 tasks: time spent, clicks used, and perceived difficulty (rated on a scale of 1 to 10, where 1 is easiest).
Unpaired t-tests were run to compare the two groups, as they were made up of different participants.
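As an illustration, the snippet below runs an unpaired (independent-samples) t-test on two sets of task times using scipy; the measurements shown are placeholder values, not the study’s recorded data.

```python
from scipy import stats

# Placeholder task times (one task, eight participants per group);
# these are not the study's recorded values.
group_a_original = [92, 110, 75, 130, 88, 101, 95, 118]   # original website
group_b_redesign = [60, 72, 55, 80, 66, 70, 58, 75]       # redesigned website

# Unpaired (independent-samples) t-test, appropriate because the two
# groups contained different participants.
t_stat, p_value = stats.ttest_ind(group_a_original, group_b_redesign)
print(f"t = {t_stat:.4f}, two-tailed p = {p_value:.4f}")
```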
Charts based on averages compare Group A (Original Website) with Group B (Newly Designed Website).
T-Test Results
For Time Spent:
Two-tailed P value = 0.0118; by conventional criteria, this difference is statistically significant.
Mean of Group A - Group B = 29.51025; 95% confidence interval from 7.64027 to 51.38023.
Intermediate values: t = 2.8941, df = 14, standard error of difference = 10.197.

For Clicks Used:
Two-tailed P value = 0.0006; by conventional criteria, this difference is extremely statistically significant.
Mean of Group A - Group B = 1.675; 95% confidence interval from 0.865 to 2.485.
Intermediate values: t = 4.4330, df = 14, standard error of difference = 0.378.

For Single Ease Question:
Two-tailed P value = 0.0175; by conventional criteria, this difference is statistically significant.
Mean of Group A - Group B = 1.475; 95% confidence interval from 0.300 to 2.650.
Intermediate values: t = 2.6918, df = 14, standard error of difference = 0.548.
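As a sanity check, the reported confidence intervals follow from the mean differences, standard errors, and the two-tailed 95% critical value of the t distribution with 14 degrees of freedom (about 2.145). The sketch below reproduces them from the summary values above.

```python
from scipy import stats

# Reported summaries: (mean of Group A - Group B, standard error of the
# difference, degrees of freedom) for each measure.
summaries = {
    "Time Spent":           (29.51025, 10.197, 14),
    "Clicks Used":          (1.675,    0.378,  14),
    "Single Ease Question": (1.475,    0.548,  14),
}

for measure, (mean_diff, se, df) in summaries.items():
    t_crit = stats.t.ppf(0.975, df)  # two-tailed 95% critical value (~2.145 for df = 14)
    lower = mean_diff - t_crit * se
    upper = mean_diff + t_crit * se
    print(f"{measure}: 95% CI [{lower:.3f}, {upper:.3f}]")
```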