Increasing task success for medication alerts by 66%

UX Research

UX Design

Web

Health-tech


Context


CarePlanner had just launched a beta of a new feature, electronic Medication Administration Records (eMARS), when I joined the organisation. However, despite initial engagement, feedback from beta testers was starting to dwindle and customer-facing teams reported that customers were losing interest.


In particular, the medication alerts function, which alerted care managers to medication events recorded by carers in real time, had not gained as much interest as expected. This caused some concern as much of the sales and marketing messaging planned for the product launch centred the value proposition on this feature.


Challenges


  • Sales teams who helped onboard customers onto the beta programme said they didn’t usually demo the alerts feature because it ‘overwhelmed’ customers or they ‘weren’t ready for it yet’.


  • Customer-facing teams reported customers rarely spoke to them about medication alerts when giving feedback about their beta experience.


  • There was a lack of product data to quantify user adoption or engagement with the feature.




Results

01.

66% increase in task success

Before: 17%

After: 83%

02.

6.5 fewer user errors per session

Before: 7 p/session

After: 0.5 p/session

03.

01:19 mins saved to complete task

Before: 03:03 mins

After: 01:44 mins

'Too complex for me'…

User Research

The product team and I began investigating the current user experience of the alerts feature through 1:1 interviews with 6 care managers.


Interview takeaways:

  • Customers felt setting up medication records was a much bigger task than they had expected. Despite having the data on paper records, correctly entering medication records was intense and time-consuming.


  • All customers were aware of the alerts feature, but only 4 of the 6 had used it on occasion, and none had consistent alert ‘coverage’ for all their clients.

  • The perceived complexity of setting up an alert and lack of time were consistent themes from all customers we spoke to.



“My sole focus is getting all the medication paperwork set up…it’s a lot of admin. It’s not just a quick fill-in-a-form job, this is people's medication records! The details have to be right… Alerts look a bit too complex for me to think about right now”

Observations.

Unfortunately, observing users interact with the feature wasn't any more encouraging. Using the same participants from the interviews, I asked users to demonstrate how they would 'set up an alert to receive if a chosen client refused a specific medication', and observed their interactions.




17% task success

in setting up an alert.

1 user didn't attempt the task at all.


07 user errors

made on average per session.

03:03 mins to success

for fastest user

Using an example record that already had an alert set up.

Defining success.


Upon discussing these findings with internal stakeholders, it was felt improvements needed to prioritise the user’s perception of complexity and time investment.


With this in mind, and given the limited amount of user data available, we considered the following measures to guide and quantify the success of any design enhancements we implemented.

01.

Increase task success

At least 5/6 (83%) customers should be able to successfully create a medication alert.

02.

Decrease user errors

Users should be able to create an alert while making fewer than 5 errors per session.

03.

Reduce time to success

Users should be able to create an alert in less than 2 minutes.

Simplifying two distinct tasks.

Ideation

I led a design workshop with various internal teams to review the existing user journey for creating a medication alert. The individual steps in the journey were represented on post-it notes on a whiteboard, and as ideas were explored we moved, added, or removed items to help visualise and articulate ideas.


In line with the customer feedback, steps 4 and 5 in the user journey, which required users to identify administration responses as ‘desired’ or ‘undesired’, were discussed at length. The use of language and the placement of the alerts feature also drew a lot of attention, and it was suggested that positioning the alerts feature at the bottom of a medication record, which was already a lengthy form, contributed to user fatigue.


With various ideas for solutions, I went away and wireframed the most promising option, which separated the alert creation capability onto another page. In this flow, the user could create an alert in medication settings, and then decide what actions would ‘trigger’ the alert on the original medication record page. Consequently, I was also able to remove the concepts of ‘desired’ and ‘undesired’ outcomes, which had been a source of friction in the previous designs.


To review, I brought the stakeholders back and presented the wireframes. Stakeholders were enthusiastic – the updated designs would allow users to create one alert and reuse it several times. However, there were some concerns about whether users would understand how to ‘apply’ an alert properly. To test the designs, I developed a prototype and conducted some usability testing.

Significantly improved user understanding.

Testing and Iteration

To test the suggested design, I developed a simple click-through prototype in Marvel and tested its usability using Maze.



Task 1: In the new eMARS settings page create a medication alert to notify a chosen staff member of a medication issue, via your preferred method.

Result: 83% of testers completed task 1, spending an average of 1.5 seconds on each step in the flow.




Task 2: Select a client you want to receive a medication alert about and set up a medication alert for when they refuse a specific medication.


Result: 83% of testers completed the task, spending an average of 2.7 seconds on each step. The fastest user correctly completed task 1 (create alert) and task 2 (apply alert) in 23 seconds.




83% task success

in setting up an alert. (across both screens)

From 17% in original design


05 user errors

in total across all 12 testing sessions

From 7 p/session in original design

00:23 secs to success

for fastest user


From 03:03 mins in original design

Whilst the initial results were promising, I also reviewed the screen recordings of the prototype sessions for further opportunities. In task 2, where users had to ‘apply’ the alert they had previously created (or create a new one), I noticed a 33% error rate: recordings showed that 4/12 users missed the 'enable to show in-app' step. Unfortunately, technical constraints meant this step could not be removed from the journey, so I proposed that the designs include some default ‘show in-app’ options for each medication. I spoke to customers to identify the 3 most common outcomes they had created, to inform the default options.


I also proposed adding a default to the ‘create an alert’ page, so users could immediately use the feature ‘out of the box’. This would help address the needs of the 83% of users who were setting up the same alert, for the same people, time and time again.




'Much better for the way we work'.

Impact

Before implementing the proposed changes, we considered the results of our prototype test in the context of the measures of success we had previously defined. We were pleased to see that many of these measures were not only met but exceeded. For example:


  • In both task 1 and task 2 we saw an 83% success rate with the new designs, meaning 10/12 users could successfully create and trigger an alert (even across the 2 different pages). This met the 83% measure of success we set ourselves at the beginning of the project.


  • We also saw a drastic decrease in errors: despite aiming for fewer than 5 errors per session, these prototypes produced only 5 user errors in total across all sessions. This far surpassed our target, and we hoped the iteration to include default options could improve this measure even further.


  • Our final measure of success was time-related, addressing the user’s perception of the time needed to set up an alert. Earlier in the project we decided that even saving a minute per user would be a significant improvement in usability. The fastest user far exceeded this, correctly completing both tasks in 23 seconds.


+66% increase in task success

Old Design: 17%

New Flow: 83%

Measure of Success: 83%

-6.5 user errors per session

Old Design: 7 p/session (avg)

New Flow: 0.5 p/session (avg)

Measure of Success: 5 p/session

-01:19 mins time to success

Old Design: 03:03 (avg)

New Flow: 01:44 (avg)

Measure of Success: 2 minutes to success


“Being able to reuse the same alert multiple times is much better for me…and I can still customise and create one off alerts to account for those more trickier scenarios…this will be much better for how we work”


Next steps.


  • The final designs were planned and road-mapped for development.


  • Work to track feature adoption through user tracking data was prioritised and brought forward.

  • The customer-facing teams revamped the onboarding experience for new beta testers, and collaboration with the marketing and sales teams was strengthened to align messaging around the medication feature.

Jo Laycy 2024 ©

