Hey there, I'm Malia!
Senior UX Researcher with multidisciplinary experience in tech, non-profits, and learning design.
As Lead UX Researcher, I facilitated stakeholder discussions that enabled usability testing with customers to guide portal re-design efforts. This effort not only strengthened working relations between the UX Design and Development teams, but also produced customer journey maps, informed the Marketing and Partner teams on how to better communicate with customers, and supported contract negotiations between the organization and a vendor.
Results of UX Research
- Renewed communications between UX Design and Development teams
- Customer feedback collected on a portal for the first time
- Findings and recommendations delivered to Product, Marketing and Channels teams
- Development of product-specific personas, customer and buyer journey maps
- Personas & journeys uploaded and shared with all employees in a "Solution Hub"
- Results of portal findings used in contract renewal negotiations
Project Details
Role
Lead UX Researcher
Stakeholders
- UX Designer
- Product developers
- Marketing team
- Sales team
- Channels team
- Product leadership
Project Length
One month
Deliverables
- Research plan
- Usability report & recommendations
- Customer needs & priorities report
- 4 personas
- 4 buyer journey maps
- 4 customer journey maps
Process
Worked with UX Designer to understand the context of the portal design and identify the stakeholders involved.
Developed a research plan that included goals, objectives, hypotheses, methodology, and usability session script.
Reviewed the research plan with stakeholders to ensure the project captured the appropriate data. Gained buy-in from stakeholders who were previously disengaged from the process.
Recruited participants for the usability sessions by screening and filtering ~1000 existing customers. Also worked with account managers to ensure that customers' contact information and statuses were accurate.
Conducted hour-long remote usability sessions with the nine scheduled participants.
Stakeholders were invited to observe the sessions as well.
Analyzed the collected data for themes and common findings. The data was also used to create 3 reports – usability findings and recommendations, customers' perceptions of and experiences with the existing product, and customer journey maps.
All reports were peer reviewed by other Research team members.
Developed customer journey maps based on four different personas from the data collected in the usability sessions.
Findings and recommendations were presented to all stakeholders. This also facilitated greater collaboration between the UX Designer and Developers on the portal re-design efforts.
Findings were also presented to the Marketing and Partner teams, who used the data to craft more targeted messaging and communications with customers.
Data from the sessions was shared with decision-makers, who used it to support their negotiations with a vendor.
A portal was undergoing re-design efforts, but communications had stalled between stakeholders
A fellow UX Designer had created mockups for the development team to help with the re-design of a particular portal. The re-design aimed to add new features for customers and to adhere to newly developed design guidelines. Although neither the UX Designer nor the Development team had requested it, this was an opportunity to collect customer feedback on the portal for the first time and integrate it into the design process. Adjacent projects and stakeholders would also benefit from hearing from this group of customers.
However, after the initial mockups were handed off to the Developers, the UX Designer stopped hearing from the development team, and the status of the effort remained unknown for several weeks.
I restarted communications with the development team by inviting them to review a research plan for testing the portal mockups. The review brought the development team into the process, and they gave their support for usability testing.
Usability sessions were eye-opening, refuting assumptions we had previously made
I was able to schedule 9 sessions with existing customers after screening ~1000 potential participants. During the sessions, we wanted to test several hypotheses, including the following:
- Hypothesis: Customers are not utilizing the automated reports because they are unaware they exist.
During the sessions, we uncovered that customers did in fact know of the automated reports. However, they avoided the reports due to one of two issues:
- Finding 1: Customers are not utilizing the automated reports because they are unaware they exist.
- Finding 2: Customers encountered reports that showed 0 alerts (meaning no content), but found content within the reports anyway.
Customers were either turned off from viewing the reports or felt that they couldn't rely on them at all. Customers instead turned to third-party tools or wrote their own scripts to generate reports, as they felt more assured that those tools could provide data they could trust. That customers found the data in their portal untrustworthy was an integral finding that informed design and content development efforts related to the automated reports.
Usability sessions uncovered blind spots that none of us had known about
On the landing page for the portal, customers were presented with a visual of the proportion of their devices that were up to date on patches and protected. The intention behind the visual was to communicate the value of the organization's services through the high rates of device protection.
Sample of visual available on landing page
During the usability sessions, we instead uncovered that customers did not interpret the visual as showing impressive levels of service, but as hiding useful information. Rather than seeing 84% of devices protected, customers were far more interested in the 16% of devices that were not protected and what needed to be done about them.
Devices that require action:
Device 1 [click for more details]
Device 2 [click for more details]
Device 3 [click for more details]
Sample of what customers preferred to see on the landing page
In the original iteration of the re-design, customers could see only the visual of 84% protected devices. Additional effort was required to determine which devices needed action.
Customers were baffled that they had to infer which devices were not protected and then determine what needed to be done to correct the issue. They perceived the extra effort of uncovering the exposed devices as hiding critical information and as a serious flaw in the design.
Interestingly enough, the motivation for presenting this particular visual was earlier customer feedback, in which customers reported needing to know "the value of the company." Customers needed to see what the organization had done to validate the cost of investing in its services.
Devices that require action:
Device 1 [click for more details]
Device 2 [click for more details]
Device 3 [click for more details]
Sample solution that put critical content first, followed by content showing the value of the organization
The solution came down to reconciling the different motivations customers may have when using the portal. The customers most interested in seeing a list of exposed devices first were those responsible for their company's day-to-day technical operations; their priority was to first resolve any issue that could impact operations. We found that by presenting the critical information first, customers were then more receptive to seeing other content.
These case studies represent only a small fraction of my work.