We asked university librarians and staff volunteers to evaluate the homepage of U of T's HR Service Centre in a series of interviews.

The HR Service Centre is an internal portal that consolidates Human Resource services, support, and information for active employees.

During development, my primary focus was implementing the division's public website, but I still supported my colleagues on the HRSC implementation team by combining the ServiceNow logo with U of T iconography to create a brand mark. I also helped our content strategy team upload content, including media assets like icons and imagery, for the first iteration of the homepage.
Since its launch, I've supported product testing and feature development for the HR Service Centre (HRSC). After working with colleagues in Enterprise Applications (a unit within U of T Information Technology Services) to improve case status notification emails, I began this research as a second product sustainment initiative.
This study identified opportunities to improve the HR Service Centre's user interface and produced recommendations to build users' understanding of site functionality. Participants attended interviews conducted over MS Teams, where I recorded their commentary for qualitative analysis.
Each interview was conducted in two parts:
1. An expectations prompt to evoke functional predictions of the HRSC;
2. Evaluative assessment according to four heuristic prompts.
Functional Predictions
Before showing the interface, I asked participants to identify and describe the user flows of the HRSC in their own words. 
Without any visual reference, participants were encouraged to list as many functions as possible while we recorded how many of the actual functions they recalled:
Case Management:
•    Submit a Request
•    Update an existing Request
Service Catalogue:
•    Use the Get Help Form
•    Access form from Catalogue
Self-serve Knowledge Base:
•    Browse Knowledge Base Categories
•    Search the Knowledge Base
Portal Behaviour:
•    Access linked U of T website
•    Access social media account
“Remembering times when you’ve used the HR Service Centre in the past, can you list all the different things you can do on that website that come to mind?”

This open-ended icebreaker question helped to situate participants, get them talking, and establish rapport through active listening.
Evaluative Assessment
Our participants didn't know it, but we were asking them to rate the site based on the first four of Nielsen's 10 Heuristics. After being shown a visual reference to the Portal, we asked them to score each heuristic between 1 (poor) and 7 (excellent) and then rationalize their rating. 
1.    Visibility of system status
“Where 1 is extremely difficult and 7 is effortless and intuitive: how would you describe your ability to tell the status of a service request?”

2.    Match between system and the real world
“Where 1 is unexpected and confusing and 7 is intuitive and clear: how would you describe the language used to describe the functions of the HRSC?”

3.    User control and freedom
“Recall the answer you gave to the first question regarding the functions of the HRSC. Now that you’ve had a closer look at this page, how would you describe the similarity between your understanding of the HRSC and the content on this page? In this question a 1 represents no similarity, and a 7 represents all of your expectations being fulfilled by the webpage.”

4.    Consistency and standards
“In terms of function and aesthetic, how would you describe the similarity between the HRSC homepage and other web applications you’re familiar with? 1 represents absolute inconsistency and 7 represents total familiarity.”
I visualized findings for this portion of the research using the charts below. Each chart stacks or ranks heuristic scores to compare performance and prioritize action items during sustained design iterations.
The first and second heuristics polled lowest, which informs our prioritization of related interface design changes: emphasizing process status and employing more accessible, familiar language.
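The prioritization step above can be sketched as a simple aggregation: average each heuristic's 1–7 ratings, then sort ascending so the weakest heuristics surface first. The ratings below are illustrative placeholders, not the actual study data.

```python
from statistics import mean

# Hypothetical 1-7 ratings per heuristic (illustrative values only,
# not the study's real scores).
ratings = {
    "Visibility of system status": [3, 4, 2, 3, 4],
    "Match between system and the real world": [4, 3, 3, 4, 3],
    "User control and freedom": [5, 6, 5, 6, 5],
    "Consistency and standards": [6, 5, 6, 6, 5],
}

# Average each heuristic, then sort ascending so the lowest-polling
# heuristics appear first as design priorities.
priorities = sorted((mean(scores), name) for name, scores in ratings.items())

for avg, name in priorities:
    print(f"{avg:.1f}  {name}")
```

With these placeholder values, the first two heuristics rank at the top of the priority list, mirroring the study's outcome.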
Qualitative Analysis
A frequency count of participant comments found 64 unique codes, including:
•    Identified the Requests function and associated it with case management status (8 participants)
•    Shopping cart icon is ill-fitting (4)
•    Familiar with linked U of T web platforms but new hires may not be (4)
•    Vague Knowledge Base description - what does "my employment" really mean? (4)
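A frequency count like this can be sketched as a tally over coded comments. The code labels below are hypothetical shorthand, not the study's actual codebook.

```python
from collections import Counter

# Each interview comment is tagged with one or more codes during
# qualitative analysis; these codes are hypothetical examples.
coded_comments = [
    ["requests-function", "case-status"],
    ["cart-icon-ill-fitting"],
    ["requests-function"],
    ["vague-kb-description", "cart-icon-ill-fitting"],
]

# Flatten the tags and count how many comments carry each code.
frequency = Counter(code for comment in coded_comments for code in comment)

for code, count in frequency.most_common():
    print(f"{count}  {code}")
```

Sorting by `most_common()` puts the most frequently raised issues at the top, which is how the headline findings above were surfaced.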

I created an affinity map which clusters related comments.
These 14 groups include comments regarding:
1.    Platform context
Some users had never used the platform and had difficulty identifying its purpose or value from the on-screen context, such as the site tagline.
2.    Clarity of menu items
To-dos and Surveys were curiosities.
3.    Site Acquisition
Participants expect to find content using Google search and with bookmarks.
4.    Outbound links
Participants thought there might be more outbound links including to divisional memoranda and U of T platforms like Acorn and Quercus.
5.    Personalization
Participants recognized and appreciated personalized content like their name and their initials. Some described this as a humanizing recognition of their role in the “bigger picture”.
6.    Clarity of function distinction
Participants were unsure what distinguished the Knowledge Base from the Catalogue and from ESS. Others remarked on an unclear understanding of the distinction between Catalogue and Get Help.
7.    Absence of reference to Equity, Diversity, Inclusion (EDI)
Participants remarked on the absence of an explicit EDI reference.
8.    HR Request status
All participants identified My Requests when prompted to find their “HR Service Request status” but found it very small relative to its importance.
9.    Expectations of Get Help function
Most participants expressed an understanding of the Get Help function and some expect to find support contact details in that process.
10.    Service Catalogue label and description
The shopping cart icon was criticized by half of our participants. The Catalogue metaphor is also not clear to participants whose frame of reference tended to concern forms and life events.
11.    Course catalogue content
Some participants expected to find a course catalogue on the platform, with one even misinterpreting the Service Catalogue as such because of the similarity in name.
12.    Knowledge Base label and description
Participants found that the Knowledge Base label was clear but its description received criticism for being too vague. Particular attention was paid to the language of "your employment."
13.    Knowledge Base content expectations
Participants identified several topics that they expect to find in the Knowledge Base, such as Health & Safety materials and Collective Agreements.
14.    Manager Knowledge Base: Content permission
Participants expressed an expectation of materials to support management responsibilities. One participant, a manager, was also concerned that tiered permissions would cause “suspicion” of inaccuracy and omission and could complicate grievances and disputes where available content differs.

Next steps
With this summative evaluation complete, work is set to begin in Q1 2022 with a series of prototypes for user testing. As you'll see below, I've already begun to explore and experiment with how we might adapt existing UI elements. In another portfolio page, I've written more about these changes and how they fit into our sustainment strategy for the HR Service Centre.
Present state
Exploration: Rows versus Columns
Exploration: Search priority
Exploration: Shortcuts