The Center for Communication Excellence (CCE) is Iowa State University's go-to website for graduate academic support. Housing everything from event schedules to appointment bookings means that intuitive, user-centered navigation is an absolute necessity!
UX Project
December 2019 – Present
Usability Testing, Think-aloud Protocols, Semi-Structured Interviews, Wireframing, Prototyping
User Researcher, Front-End Developer, UI Designer
I set out to find the root causes of users' navigation struggles and to revamp the website based on their feedback. The current redesign has seen an increase in positive sentiment, user satisfaction, and link click-through rates. While the website is going through a second round of iteration, please check out the current CCE website here.
The redesign of the website revolved around 3 main user insights from the first round of iteration: the lack of information organization, the need to access services quickly, and frustration with redundancy. The current design thus features a navigation bar at the top of the page, highlights frequently used services on the homepage, and uses composition and typography to visually group information.
Drag the slider to check out the before and after. We love a good makeover.
I first interviewed the director and staff of the Center for Communication Excellence (CCE) to understand stakeholder needs on the administrative side. Then I interviewed 4 graduate students who were frequent users of the CCE website. Following that, I conducted a usability test by having them complete 3 tasks on the CCE website while using the think-aloud protocol. After each task, each participant scored the task on the SUS scale.
CCE stakeholders were asked 5 open-ended questions centered on administrative needs and success measurement. CCE website users were asked 10 open-ended questions about their needs, behaviors, and sentiment.
Each student was given 3 tasks to complete and asked to verbalize what they were doing and thinking as they worked through them. All tasks were reported as key priorities by the stakeholders:
All 4 users failed the first task, which was to accurately identify all of the services and programs provided by the CCE. Students had a hard time understanding how the CCE's services and programs were categorized. Observations included frequent backtracking and attempts to parse out differences between categories. The think-aloud protocol also surfaced comments like, "Wait, why is this here?" and "Huh? Oh it's the other one."
Poor information organization also hindered search. All 4 users took a long time to complete the third task, which was to find an event they were interested in and sign up for it. Observations revealed indicators of stress whenever they encountered a page with a wall of text. They would also say things like, "Why do I have to read through all of this to get to what I need?" and "I'm scrolling and scrolling but where is it?"
All users took a long time to complete the second task, which was to book an appointment. Observations revealed that the quickest path required 5 clicks and that the links were not labeled explicitly. This prompted comments like, "I thought this was a header until I hovered over it."
Affinity diagramming workflow based on observations and interviews
"I've never been able to find the booking system on the first try. Why is there so much redundant information?"
User #4
Always Lost
"This looks like it was made in the 90s. Everything is cluttered and hard to read. Oh wow. Yeah, I'm not going to read all of that."
User #2
90s Website, not in a good way
How might we:
10 wireframe sketches with a focus on orienting users were tested with 2 students and 2 CCE staff members.
2 high-fidelity mockups were created in Adobe XD and tested with 4 students. Qualitative responses were collected.
A final design was chosen based on A/B testing for navigability and organization. Iterative design changes were made based on user comments. I then proceeded with the front-end development of the final design, making sure to adhere to brand and accessibility guidelines.
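To give a flavor of the accessibility work, here is a minimal sketch of the kind of navigation markup I aimed for: a semantic nav landmark with an accessible name and explicitly labeled links. The React-style component and the menu labels are illustrative assumptions, not the CCE's actual stack or categories.

```tsx
import React from "react";

// Illustrative top-level menu; labels are placeholders, not the CCE's actual categories.
const NAV_ITEMS = [
  { label: "Appointments", href: "/appointments" },
  { label: "Workshops & Events", href: "/events" },
  { label: "Programs", href: "/programs" },
];

// A semantic <nav> landmark with an accessible name lets screen-reader users
// jump straight to the primary navigation instead of scanning the whole page.
export function TopNav() {
  return (
    <nav aria-label="Primary">
      <ul>
        {NAV_ITEMS.map((item) => (
          <li key={item.href}>
            <a href={item.href}>{item.label}</a>
          </li>
        ))}
      </ul>
    </nav>
  );
}
```

Explicit link labels and a named landmark address two of the original complaints directly: users not knowing where a link will take them, and not being able to find the navigation at all.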
Another usability study was conducted with 4 new users on the new interface, using the same 3 tasks. Users were once again asked to think aloud as they completed their tasks. The findings are detailed below.
A/B testing with 4 users compared the usability of the new website against the old one. The new interface received a SUS score of 67, a 32-point increase over the previous design.
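For context on where these numbers come from, the standard SUS questionnaire is scored by shifting each of the 10 responses onto a 0–4 scale (odd-numbered items: response − 1; even-numbered items: 5 − response) and multiplying the sum by 2.5 to get a 0–100 score. A minimal sketch of that scoring, with hypothetical responses (the exact questionnaire administration here is my assumption):

```ts
// Standard SUS scoring (Brooke, 1996): 10 items rated 1-5.
// Odd-numbered items contribute (response - 1); even-numbered items contribute (5 - response).
// The summed contributions are scaled by 2.5 to yield a 0-100 score.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS expects exactly 10 item responses on a 1-5 scale.");
  }
  const sum = responses.reduce(
    (acc, r, i) => acc + (i % 2 === 0 ? r - 1 : 5 - r), // indices 0, 2, ... are odd-numbered items
    0
  );
  return sum * 2.5;
}

// Example: one participant's hypothetical responses.
console.log(susScore([4, 2, 4, 2, 3, 2, 4, 1, 4, 2])); // 75
```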
One month after the redesign launched, the click-through rate of the website's links was examined. Link clicks had risen from 1.4% the previous month to 6%, a 4.6 percentage point increase, indicating follow-through from website visits.
All 4 users still reported confusion about the organization of the website. Users particularly struggled with how the navigation bar was set up and how programs were grouped:
“The homepage is really not well organized.”
“I think the links are not clear enough, I don’t know if it’s what I want or where it will take me.”
Users were confused by what was featured on the front page. In particular, they did not like the "success story" section and disliked the number of drop-downs:
“[Counting]. There are 8 drop downs…why so many? I think 3 is enough.”
“Half of the homepage is the success story, but I don’t really care about it.”
“What is this AcComp tab? Is it meant for most students?”