Brand: National Healthcare & Insurance
As the Head of Platform Design, I took this challenge as an opportunity to push for a culture shift. The team had long operated in a reactive mode, executing direct UI requirements from business partners who valued speed above all else. These partners, deeply invested in protecting users’ time, were convinced they knew exactly what users needed. Meanwhile, users, accustomed to outdated systems, passively accepted each update, even as frustrations persisted. Despite adding new features, the experience remained cumbersome, and the default response was always to introduce more features rather than addressing the underlying usability challenges. This project was our first pilot for our Empowered Teams transformation initiative.
While raw data holds immense potential, its true power emerges only when transformed into meaningful, actionable insights that guide users toward clear understanding and informed actions. Our reporting application redesign focused on translating complex data into intuitive, visual stories that inform and empower users to make confident, data-driven decisions. By reimagining how information is structured, visualized, and presented, we turned data from an overwhelming flood of information into a strategic asset that speaks directly to the user's most critical business questions.
This application was originally built by a resourceful development team with minimal user research and heavy reliance on written direction from business liaisons and stakeholders. Recognizing the need for a complete overhaul, we went straight to the source—our users.
Elevated User Research & Continuous Learning
The team had been focused on executing UI requirements quickly without questioning the underlying problem. We dove deep into understanding our users' experiences by conducting interviews with several key users representing our primary personas. Through these conversations, we uncovered a critical insight: while our product technically checked all the boxes and met baseline requirements, it fundamentally missed the mark on delivering an experience aligned with users' mental models and expectations.
Built Trust with Business Partners
Business partners were confident they already knew what users needed. We implemented co-creation sessions, bringing business partners, designers, and researchers together to analyze user insights so we could shift their focus to problem-solving rather than requesting solutions. We used data-driven research to challenge assumptions, sharing usability test findings and survey results. We reframed UX discussions around business impact, showing how improved usability reduced support costs, increased efficiency, and drove adoption.
Collaborative User Journey Mapping
Once we had a clear understanding of our personas and their needs, we created affinity diagrams and journey maps to help the team organize their efforts. This allowed us to have more meaningful conversations with our immediate team, stakeholders, and business partners.
Through our discovery process, we identified several opportunities for improvement:
Data Visualization: Improve data visualization methods to make it easier for users to understand and gain insights from the information.
Filtering Capabilities: Enhance filtering so users can more precisely target the data they need, saving time and effort.
Contextual Features: Implement features like customized team views with utilization data to better equip users to complete tasks like report generation effectively.
Information Architecture: Establish clear, consistent information architecture and plain language in filters and reports to make it easier for users to find the data they need.
Customization and Personalization: Provide robust customization options so users can tailor the platform to their specific workflows, leading to increased efficiency.
By providing a strategic UX north star to the product and leadership teams, we gained approval and alignment to move the project forward.
As part of this initiative we transformed our platform's success measurement approach by implementing a more comprehensive evaluation framework. Previously the platform measured success primarily by the number of features released, focusing on output rather than outcomes. We introduced the System Usability Scale (SUS) as a standardized benchmark to quantitatively track user satisfaction across releases. We established a cadence of moderated usability testing sessions with real users, incorporating task success rates, time-on-task measurements, and error analysis. By documenting and socializing these insights through easily digestible scorecards, we shifted stakeholder conversations from purely feature-driven discussions to evidence-based decision making centered on actual user experience.
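For readers unfamiliar with how SUS produces its 0–100 benchmark: each of the ten statements is rated 1 (strongly disagree) to 5 (strongly agree); odd-numbered (positively worded) items contribute the response minus 1, even-numbered (negatively worded) items contribute 5 minus the response, and the sum is multiplied by 2.5. A minimal sketch of that standard scoring rule (the sample responses below are hypothetical, not from our study):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten
    Likert responses (1-5), given in questionnaire order."""
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses, each from 1 to 5")
    total = 0
    for i, r in enumerate(responses):
        # Odd-numbered items (index 0, 2, ...) are positively worded:
        # contribute (response - 1). Even-numbered items are negatively
        # worded: contribute (5 - response).
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

# Hypothetical participant who mostly agrees with the positive items
# and disagrees with the negative ones:
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 3]))  # → 77.5
```

Averaging per-participant scores across a release gives a single trackable number, which is what made SUS useful for the scorecards we socialized with stakeholders.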