Co-design: The secret sauce of digital health engagement

April 11, 2021
11 min read
Written by Gabe Strauss, Elise Ogle, and Xin Koepsell

As we’ve written about before, engagement is a primary challenge faced by the digital health industry today. Our industry has created interventions that are clinically effective and safe. Yet, these same interventions often struggle to sufficiently engage end-users, and consequently, fail to achieve their real-world potential.

Researchers and practitioners are increasingly turning to co-design as an essential strategy for improving engagement. Co-design (sometimes referred to as co-creation or participatory design) is a structured process for engaging end-users and stakeholders as creative partners in the design process in order to understand their needs and fine-tune product features to be as engaging and impactful as possible.

The reason for incorporating co-design is simple: We, as product developers, are not our users. Because of differences in age, culture, life experience, and cognition, what engages us is often different from what engages our users. For example, at Limbix, we are building an intervention for adolescent depression. However, no one on the core team is an adolescent, and few of us have lived experience with depression. It is therefore critical to involve prospective end-users as active participants in our design process.

Yet, while there is widespread agreement that co-design is essential for creating engaging digital health interventions, there is limited guidance on how it can optimally be undertaken. Unfortunately, we cannot simply draw upon generic guidance on co-design because of differences between digital health interventions and the products of other industries.

At Limbix, we have invested heavily in co-design as a core pillar of our engagement strategy. We have developed a five-step process for co-design that is grounded in the broader literature but fine-tuned to the unique demands of digital health intervention development. We apply this process iteratively throughout the product development lifecycle.

  1. Recruit participants who are as similar as possible to your target users.
  2. Prioritize the problem based on impact and risk.
  3. Select the right tool based on the problem you are trying to solve and the phase of product development.
  4. Run the session, taking into account the sensitive nature of the material.
  5. Validate and iterate, building on past feedback for continual improvement.

Step 1: Recruit participants

Recruiting participants is an essential but time-consuming step of co-design. Here are a few principles to keep in mind to make this critical step as time-efficient and effective as possible.

  1. Recruit participants who are representative of your target users. You cannot assume that what’s engaging for one group will have the same effect on another population. It’s critical to ensure that your participants adequately represent the varied characteristics of your target users across dimensions such as diagnosis, symptom severity, age, sexual and gender identity, and cultural background, all of which can impact engagement.
  2. Build a council. While time-consuming upfront, building a stable cohort of participants to provide regular product input can be a tremendous time-saver over the long run. It will ultimately allow you to spend less time on recruitment and more time on designing. Plus, we’ve found that participants love the opportunity to play a long-term role in developing a product that will help others! At Limbix, we call this group our Teen Advisory Council, and it includes 12 adolescents with experience of depression.¹
  3. Ensure safety and ethics up-front. You will often collect sensitive and personal information during co-design, so it’s essential to protect participant privacy by gathering informed consent, creating clear protocols for data storage and protection, and evaluating third-party vendors and suppliers for data security. When developing a regulated medical device, you may also need institutional review board (IRB) approval, which ensures that your research protocol meets high ethical standards for the protection of human subjects. At Limbix, we’re building a medical device, so our design research is conducted under the purview of an IRB protocol and subject to Good Clinical Practice (GCP) guidelines.

Step 2: Prioritize the problem

You have limited time and resources, so it’s important to prioritize your co-design efforts. We recommend prioritizing them along two dimensions: impact and risk.

Impact refers to how likely a feature is to impact users’ clinical (health) outcomes. Digital health interventions are like medicine in that they have active ingredients (clinical content), and those active ingredients are only effective if users sufficiently engage with them. The more a new feature impacts users’ interactions with an active ingredient, the more important it is to incorporate co-design.²

For example, imagine that you’ve identified that in-app exercises are an ‘active ingredient’ because your data shows that completion of in-app exercises causally impacts users’ clinical outcomes. If you’re designing a feature that directly impacts exercise completion rate, then you should incorporate co-design into the feature’s development. In contrast, you wouldn’t spend much time — if any at all — involving users in developing a login screen, since the login experience is unlikely to impact engagement with your active ingredients.

Risk has two components: 1) how confident you are that you can design for your users’ needs without their direct input (the less confident you are, the higher the risk), and 2) the level of effort required to build the feature.

Turning back to the example of the login experience, login flows are well-established, low-effort features that can be well-designed without user input. By researching established flows instead, we can save time and resources that can be redirected to seeking user input on other features that we have lower confidence in. In contrast, it would be very important to get user input on a complex, high development-effort feature like a reward system, which can be implemented in a variety of ways and has a well-known history of unintended negative effects when implemented poorly.

Prioritize co-designing features that are high impact and high risk

Let’s bring this all together with an example. Say you’ve identified the completion of in-app exercises as your target engagement metric because it causally impacts clinical outcomes.

Based on your understanding of engagement techniques and prior user feedback, you believe that a reward system could be an effective way to drive exercise completion. You want to build a reward system into your next release and need to determine whether to incorporate co-design techniques into its development.

You score the reward system as high impact because it touches a clinically impactful area of your intervention. You also score it as high risk, because it is a high-effort feature and poorly implemented reward systems can have unintended negative effects. The reward system is therefore an ideal candidate for co-design.
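
To make this decision rule concrete, here’s a minimal sketch in Python. The 1-5 scales, the threshold, and the example scores are illustrative assumptions for this post, not a formal rubric we use:

```python
from dataclasses import dataclass

# Illustrative sketch of the impact/risk prioritization described above.
# The 1-5 scales and the threshold are assumptions for this example,
# not a formal scoring rubric.

@dataclass
class Feature:
    name: str
    impact: int  # 1-5: how much the feature affects engagement with active ingredients
    risk: int    # 1-5: design uncertainty plus the effort required to build it

def should_co_design(feature: Feature, threshold: int = 4) -> bool:
    """Prioritize co-design for features that score high on both impact and risk."""
    return feature.impact >= threshold and feature.risk >= threshold

features = [
    Feature("reward system", impact=5, risk=5),
    Feature("login screen", impact=1, risk=1),
]

for f in features:
    print(f.name, "->", "co-design" if should_co_design(f) else "standard design process")
```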

Step 3: Select the right tool for the job

Co-design is an umbrella term for a range of tools used to engage end-users in the product development process. It’s critical to choose the right tool for the design problem and the phase of product development.

Here are a few of the tools we use most:

  • Problem interviews — sometimes referred to as scoping interviews — are often used at the start of product development to identify user needs and how users are currently addressing those needs. The outputs from these sessions feed the problem ‘backlog’ that forms the basis of prioritized roadmaps.
  • Co-design is, somewhat confusingly, the name for a specific technique, as well as the name for the overarching category of techniques. The co-design technique, itself, involves working with participants to generate potential solutions to a design problem. This often involves a design studio with Crazy-8s, in which participants sketch out potential solutions, share them with each other, and then iterate. We usually start a feature’s development with a co-design session.
  • Group critiques and brainstorming — sometimes referred to as solution interviews — are a lightweight version of co-design. We’ll brief participants on the design goal and show them early designs. We’ll then ask participants for feedback and ask them to brainstorm improvements together. We tend to use group critiques for lower-risk features, where we have a good sense of the solution but want to quickly calibrate it before moving forward.
  • Usability tests involve users completing defined tasks within a prototype while they think aloud. The goal of these tests is to assess how easy the intervention is to use and understand. Comprehension is particularly important for many behavioral health interventions because clinical improvement often relies upon the learning of core psychological concepts.
  • User interviews involve collecting feedback from users after they complete the intervention. At Limbix, as part of every release, we interview a sample of users, asking them a series of structured questions about their experience. This feedback is essential, and many of our best feature ideas have come directly from these interviews.
  • Diary studies involve asking users to document their experiences and thoughts while completing the intervention. The problem with post-study user interviews is that memories are notoriously biased. Diary studies enable us to get in-the-moment feedback that more accurately represents users’ experiences. We tend to complete diary studies on minor releases that occur in between our larger clinical trial releases.

It’s important to use the right tool for the phase of product development
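
As a rough summary of the list above, here’s an illustrative sketch in Python that maps development phase to the tools we tend to reach for. The phase names are shorthand invented for this sketch; in practice, the choice is a judgment call rather than a lookup:

```python
# Rough mapping of development phase to co-design tools, summarizing the
# descriptions above. The phase names are shorthand for this sketch only.
TOOLS_BY_PHASE = {
    "problem discovery": ["problem (scoping) interviews"],
    "early feature ideation": ["co-design sessions (e.g., design studio with Crazy-8s)"],
    "solution calibration (lower-risk features)": ["group critiques and brainstorming"],
    "prototype evaluation": ["usability tests"],
    "minor releases": ["diary studies"],
    "post-intervention": ["user interviews"],
}

def suggest_tools(phase: str) -> list[str]:
    """Return the tools we typically consider for a given phase."""
    return TOOLS_BY_PHASE.get(phase, [])

print(suggest_tools("prototype evaluation"))  # ['usability tests']
```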

Let’s now turn back to the example and choose a technique for developing the reward system. You’ll want to involve users as early as possible in the process, so you might choose to start by running a co-design session, in which you brief users on the design objective and then generate some initial solutions together. By the end of the session, you’ll aim to have a range of ideas, some far-fetched, others more realistic, and have a good sense of what will be engaging to your users.

Step 4: Run the session

Everything leading up to this point has just been preparation. Now you get to the fun part! The specifics of each session will depend on the tool that you’re using. However, here are some overarching principles for running effective sessions when developing digital health interventions.

Keep participants engaged during sessions. You’ll need to keep participants engaged to get the most out of sessions. Here are a few tips that are especially important for co-designing digital health interventions:

  • Use icebreakers. Oftentimes, sessions can dive into personal topics related to a participant’s health. It is important to help people feel comfortable and warmed up before starting to discuss sensitive topics.
  • Use inclusive language. Never assume to know what someone’s experiences are, especially when you are discussing mental health and wellness. It’s also important to remind people that there are no wrong answers and that everyone’s experience is valid and valued.
  • Keep the diagnosis in mind. There may be special considerations as you interact with users who have a particular diagnosis. For example, at Limbix, we often work with adolescents with depressive symptoms, and they may occasionally miss a session due to their symptoms. In such cases, it’s important to be patient and understanding.
  • Keep it short. Participants with certain conditions may lack the motivation or stamina for longer sessions. It’s therefore better to structure sessions into short, concrete exercises that take less than an hour.
  • Fairly compensate participants. Compensating participants goes a long way toward keeping them engaged. However, it’s also important not to over-compensate them, as excessive compensation can unduly, and unethically, influence people to take part in research they would otherwise be uncomfortable participating in.

Use different participants for ideation vs prototype feedback. It’s important to be able to ideate with one group and then get feedback from another. If the same group ideates on initial concepts and then gives feedback on prototypes, the feedback can be biased because it was based on their own ideas! For this reason, we generally run initial ideation sessions with around 5 participants, and then work with a different set of 5 participants for prototype feedback.

Make the most of remote sessions. Due to the COVID-19 pandemic, we had to move all of our co-design sessions online. While initially challenging, remote research ultimately became a huge advantage. We previously only recruited participants who could physically get to our office. Now we recruit participants from all over the US, which enables us to involve people with more diverse backgrounds and perspectives.

Output from an early co-design session at Limbix

Step 5: Validate and iterate

A single research session is just the beginning of a much larger process. After each session, you’ll need to synthesize and implement your findings, and then consider whether or not to run follow-up sessions using other tools.

Turning back to our reward system example, after running the initial co-design session, you might build some initial prototypes, and then run a group critique & brainstorming session to gain additional feedback. Then you might create a high-fidelity clickable prototype and run a usability test, followed by a diary study that simulates real-world usage of the feature when integrated into the full intervention. Throughout this process, you should also be getting regular feedback from clinical experts to ensure clinical efficacy and safety (which can be viewed as a parallel, but interrelated, cycle of co-design).

Finally, you’ll launch the feature as part of a wider product release and can measure how the introduction of the new feature impacts your quantitative engagement metrics and clinical efficacy. You might also run in-depth post-intervention user interviews to understand how the introduction of the feature impacted the users’ experiences, and generate suggestions for future improvement.
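
As one illustration of what such a comparison might look like, here’s a minimal sketch in Python, assuming you log exercise completion per user and tag each user with a release cohort. The records, field names, and cohort labels are hypothetical, and a real analysis would also control for confounders and use appropriate statistical tests:

```python
# Minimal sketch of comparing an engagement metric (in-app exercise
# completion rate) across release cohorts. All records and field names
# below are hypothetical.
records = [
    {"user_id": "u1", "cohort": "pre_rewards", "exercises_assigned": 10, "exercises_completed": 4},
    {"user_id": "u2", "cohort": "pre_rewards", "exercises_assigned": 10, "exercises_completed": 6},
    {"user_id": "u3", "cohort": "post_rewards", "exercises_assigned": 10, "exercises_completed": 8},
    {"user_id": "u4", "cohort": "post_rewards", "exercises_assigned": 10, "exercises_completed": 7},
]

def completion_rate(cohort: str) -> float:
    """Total exercises completed divided by total assigned, for one cohort."""
    rows = [r for r in records if r["cohort"] == cohort]
    assigned = sum(r["exercises_assigned"] for r in rows)
    completed = sum(r["exercises_completed"] for r in rows)
    return completed / assigned if assigned else 0.0

for cohort in ("pre_rewards", "post_rewards"):
    print(cohort, f"{completion_rate(cohort):.0%}")
```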

The development of a complex feature often involves the iterative application of multiple co-design techniques.

Conclusion

Co-design is one of the most important strategies available for driving engagement in digital health interventions. At Limbix, we incorporate co-design into all phases of our product development and firmly believe that it has played an outsized role in the strong engagement we see in our products today. While there is no ‘magic bullet’ engagement strategy, it is hard to overstate the importance of co-design.

We’ve shared what we believe to be our most important insights into co-designing for digital health engagement. However, there’s still a lot more that we could have addressed, so feel free to reach out if you have any questions. Nothing excites us more than discussing this important topic with others who share our passion!

Thanks to: Judy Liu, Jessica Lake, Jim Liu, Freddy Tang, Tom Hallam, Lindsay Lee, Shikha Nalla, and Chris Sowers for insightful feedback on drafts. Thank you also to Stella Kim for shaping our thinking on this topic.

Footnotes

[1] One important caveat is that users may become biased as they become increasingly familiar with your product, and therefore be less representative of first-time users. It’s therefore worth periodically testing your product with naive participants in addition to your council.

[2] For a detailed explanation of this concept, read the article on meaningful engagement metrics.