Engagement techniques for digital health interventions

January 19, 2021
19 min read
Written by Gabe Strauss

User engagement with digital health interventions is a primary challenge faced by the industry today. A growing body of evidence has demonstrated that digital health interventions are effective, safe, and affordable treatments. However, many patients do not sufficiently engage with the interventions once they are prescribed. Just as medicine is only effective if patients take the minimum effective dose, so too, digital health interventions are only effective if patients sufficiently engage with them.

Engagement can be thought of as a factor that strengthens or weakens the relationship between a digital health intervention and the probability of clinical outcomes (diagram based on Perski and colleagues’ 2017 review)

At Limbix, our vision is a world where all adolescents have access to effective, affordable behavioral healthcare, and we believe one of the best ways to achieve this vision is through accessible digital interventions. Engagement is one of the primary challenges to achieving this vision, and to this end, our product team invests heavily in understanding how to drive engagement.

Just as medicine is only effective if patients take the minimum effective dose, so too, digital health interventions are only effective if patients sufficiently engage with them

We follow a three-step process grounded in the well-known build-measure-learn framework, with an emphasis on theory, co-design, and quantitative & qualitative feedback:

  1. Identify engagement drivers, with an emphasis on theory. In order to build engaging products, we must first understand the theoretical underpinnings of digital health engagement. Drawing on psychological and human-computer interaction theory, as well as published studies, we develop hypotheses to test in the subsequent steps.
  2. Fine-tune with user testing and co-design. User-centered design is the ‘secret sauce’ of engagement. In order to create an engaging intervention, it’s critical to co-create and validate features with the people who will use them.
  3. Measure and iterate with both quantitative and qualitative feedback. Once we’ve shipped a feature, we measure its impact on engagement and iterate based on both quantitative usage data and in-depth qualitative interviews.

This post is the first in a series in which we’ll share our process for building engaging digital health interventions. In this post, we explore the first step of our process: the theoretical underpinnings of engagement in digital health interventions.

The elements of digital health engagement

The academic literature is an excellent starting place for understanding what drives engagement. Researchers have conducted several comprehensive reviews, and between them, they have identified upwards of 60 different factors that are hypothesized or known to influence digital health engagement.

In this post, we’ve outlined the factors that have most influenced our thinking. For each, we explain what the engagement factor is, summarize research about its effectiveness, and provide examples of how it’s been implemented. To assist with implementation, we’ve divided the techniques into three groups:

  1. Foundational. These are the essential elements that you should ensure every digital intervention has.
  2. Straightforward with strong empirical support. After implementing the foundational elements, spend your efforts on implementing these relatively straightforward and well-established engagement techniques.
  3. Experimental or complex. Finally, consider implementing techniques that either have limited, but promising empirical support, or are well-established but require careful implementation.

There is no single magic-bullet engagement factor.

It’s important to point out that there is no single magic-bullet engagement factor. The techniques have synergistic effects and work best when implemented together. All of the factors mentioned in this article have good theoretical and/or empirical support for their effectiveness, and the differences in impact between them are small. Even the most well-established factors have only a relatively minor impact in isolation (e.g., one study estimated that reminders increase engagement by roughly 4%). When multiple factors are used together, however, the effect on engagement can be substantial.

Foundational techniques

Usability

In the context of digital health interventions, usability refers to how natural and frictionless an intervention is to use. Usability is best understood as a potential barrier to engagement rather than something that increases engagement on its own. Recent studies comparing engagement across multiple web-based digital health apps found either no relationship or a negative relationship between usability and engagement. The authors explained that simplistic, content-thin apps can score highly on usability yet ultimately be unengaging, and concluded that usability is necessary but insufficient for engagement. Take the example of interactivity, which improves engagement by immersing users in the digital content. A buggy or counterintuitive interface (poor usability) pulls users out of the experience and thereby diminishes the effect of interactivity on engagement.

Usability is best understood as a potential barrier to engagement rather than something that increases engagement on its own.

Usability is particularly important in digital health interventions for conditions in which users’ cognitive or physical abilities may differ from the general population. For example, users with depression often struggle with focus and motivation, and in such cases, intuitive design is critical to sustaining engagement.

Visual design

Visual design refers to the look and feel of the intervention’s interface, and is thought to improve engagement by attracting the attention of the user, stimulating curiosity, and giving the user a sense of delight. Researchers have suggested that aesthetic appeal can increase the persuasiveness of messages by leading to greater cognitive absorption and processing of the information. This is particularly important in digital health interventions, where comprehension of content is often critical to the effectiveness of the intervention. Visual design can also impact engagement through the well-established effect of color on emotion.

A 2018 study found that better visual design was positively associated with usage metrics for mobile but not web-based apps. The authors did not discuss the reasons why this occurred, but it may be that visual design is more impactful in some delivery formats (e.g. mobile apps) than others.

Similar to usability, good visual design is a foundational element of strong engagement. Poor usability and visual design plagued early digital health interventions. However, as digital health interventions have gained the attention of the wider tech world, both usability and visual design have improved dramatically.

Good usability and visual design are necessary but insufficient for engagement

Straightforward techniques with strong empirical support

Reminders

Reminders (also referred to as prompts or triggers) involve prompting users to engage in target behaviors. Common examples are push notifications, emails, and text messages. They form a core component of the well-known Fogg Behavior Model. According to the model, three elements must converge for behavior change to occur: motivation, ability, and a prompt. That is, even if a person has sufficient motivation and ability, they still need to be prompted to engage in the target behavior. For this reason, incorporating reminders is often considered the low-hanging fruit of behavior change.

Reminders are the low-hanging fruit of behavior change.
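To make the model’s ‘convergence’ idea concrete, here is a minimal sketch in Python. It withholds a prompt when estimated motivation or ability falls below a floor; the scores, thresholds, and function names are illustrative assumptions rather than a description of any particular product’s logic.

```python
from dataclasses import dataclass

@dataclass
class UserState:
    motivation: float  # 0.0-1.0, e.g., inferred from recent mood or self-reported interest
    ability: float     # 0.0-1.0, e.g., estimated time and capacity available right now

def should_prompt(state: UserState,
                  motivation_floor: float = 0.3,
                  ability_floor: float = 0.3) -> bool:
    """Fogg's model: behavior occurs when motivation, ability, and a prompt converge.
    A prompt sent when either factor is too low is likely to be ignored, so this
    sketch withholds it below illustrative thresholds."""
    return state.motivation >= motivation_floor and state.ability >= ability_floor

# A user with moderate motivation but no capacity right now is not prompted.
print(should_prompt(UserState(motivation=0.6, ability=0.1)))  # False
```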

Studies have generally found that notifications positively impact engagement and efficacy. A 2009 study found that notifications improved efficacy in the majority (57%) of interventions reviewed. A more recent 2018 study found that notifications increased engagement in the 24 hours following the notification by a statistically significant but relatively small 3.9%. Another 2018 study found that calls to action (of which reminders are one type) were among the factors most highly correlated with increased engagement. Data from our own clinical trials have shown substantial increases in engagement after users receive notifications.

Care must be taken, however, not to overuse notifications. Smartphone users receive upwards of 50 push notifications a day, and sending too many can drive users away. The question of when and how to send reminders is a complex topic, and a whole field of ‘interruptibility research’ has emerged to explore, among other things, the optimal time and method for delivering notifications. Limbix users have been surprisingly responsive to notifications, and have consistently asked for more rather than fewer. The key principle, in our experience, is to ensure that notifications are relevant and personalized. As with all engagement techniques, many factors influence success; user testing and co-design are the best ways to fine-tune implementation, something we will write more about in a future article.
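As a rough illustration of keeping notifications relevant rather than overwhelming, the hypothetical helper below caps reminders per day and avoids quiet hours. The cap and time window are arbitrary example values, not recommendations from the interruptibility literature.

```python
from datetime import datetime

def can_send_reminder(now: datetime, sent_today: int,
                      daily_cap: int = 2,
                      quiet_start_hour: int = 21, quiet_end_hour: int = 8) -> bool:
    """Rate-limit reminders and skip quiet hours (illustrative values only)."""
    in_quiet_hours = now.hour >= quiet_start_hour or now.hour < quiet_end_hour
    return sent_today < daily_cap and not in_quiet_hours

print(can_send_reminder(datetime(2021, 1, 19, 22, 0), sent_today=0))  # False: quiet hours
print(can_send_reminder(datetime(2021, 1, 19, 10, 0), sent_today=1))  # True
```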

Goal setting

Goal setting is one of the most commonly used behavior change techniques in both face-to-face and digital health interventions. A 2017 review found that goal setting had a statistically significant but small additive effect on behavior change. A 2018 review found that goal setting was among the features most highly correlated with engagement. As with many of the techniques mentioned in this article, goal setting is thought to be more effective when paired with other behavior change techniques (e.g., self-monitoring or accountability).

As with many of the techniques mentioned in this article, goal setting is thought to be more effective when paired with other behavior change techniques

Goal setting increases engagement by improving motivation. Specifically, it motivates users to try harder at intervention tasks, for longer periods of time, and with less distraction. According to goal setting theory, goal setting is likely to be more effective in improving engagement when clear instruction is given on the goal-setting process, goals are chosen by the user, and goals are appropriately matched to the user’s abilities. While professional coaches can be used to guide the goal-setting process, standalone interventions can leverage algorithms to automate the process of tailoring goals to users' interests and abilities.
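As a sketch of what automated goal tailoring might look like, the hypothetical function below adjusts a weekly goal based on how often the user has recently met it. The step-up and step-down rules are purely illustrative.

```python
def suggest_next_goal(recent_weekly_counts: list[int], current_goal: int) -> int:
    """Adjust a weekly activity goal to the user's demonstrated ability.
    Illustrative rule: step up after consistent success, step down after
    repeated misses, otherwise hold steady."""
    if not recent_weekly_counts:
        return current_goal
    hit_rate = sum(count >= current_goal for count in recent_weekly_counts) / len(recent_weekly_counts)
    if hit_rate >= 0.8:
        return current_goal + 1
    if hit_rate <= 0.3:
        return max(1, current_goal - 1)
    return current_goal

# A user who met a 3-session weekly goal in 4 of the last 5 weeks gets a slightly harder goal.
print(suggest_next_goal([3, 4, 3, 2, 3], current_goal=3))  # 4
```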

Self-monitoring

Self-monitoring is one of the most commonly used behavior change techniques in digital health interventions. It is based on the premise that when users monitor a behavior, they are more likely to make positive changes to that behavior. In other words, we ‘manage what we measure’. Self-monitoring is often used in conjunction with goal setting, whereby the user monitors their behavior relative to a goal. Many wearable devices (e.g., Fitbit or Apple Watch) and nutrition apps (e.g., MyFitnessPal) are centered on the combined behavior change techniques of self-monitoring and goal setting.
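A minimal sketch of self-monitoring paired with goal setting might look like the following: the user logs daily check-ins and receives feedback relative to a weekly goal. The class structure, field names, and messages are hypothetical.

```python
from datetime import date

class CheckInLog:
    """Minimal self-monitoring store: the user logs a daily value and sees
    progress relative to a weekly check-in goal (illustrative structure)."""

    def __init__(self, weekly_goal: int):
        self.weekly_goal = weekly_goal
        self.entries: dict[date, int] = {}

    def log(self, day: date, mood_rating: int) -> None:
        self.entries[day] = mood_rating

    def progress_message(self) -> str:
        logged = len(self.entries)
        if logged >= self.weekly_goal:
            return f"Goal met: {logged}/{self.weekly_goal} check-ins this week."
        return f"{logged}/{self.weekly_goal} check-ins so far. Keep it up."

log = CheckInLog(weekly_goal=5)
log.log(date(2021, 1, 18), 6)
log.log(date(2021, 1, 19), 7)
print(log.progress_message())  # 2/5 check-ins so far. Keep it up.
```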

A recent study found that self-monitoring was significantly associated with increased engagement. Another study found that ‘ongoing feedback’, which is similar to self-monitoring, had a small, positive correlation with engagement.

Wearable activity trackers like the Apple Watch utilize self-monitoring combined with goal setting to drive engagement and health outcomes. (Image source: imore.com)

Interactivity

Interactivity refers to the two-way flow of information between the user and the digital intervention, and is considered a core component of engagement. There is a range of ways that interactivity can be incorporated into interventions, from moderate-effort strategies like sliders, drags, and mouseovers to high-effort features like algorithmic branching, conversational interfaces, and immersive game design. A recent study found that interactive content was both more engaging and more persuasive than non-interactive content. Increased persuasiveness is particularly important in digital health interventions, where content is delivered in order to change attitudes and behaviors. A 2018 review found a small, positive association between interactivity and engagement.
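Algorithmic branching, for instance, can be as simple as routing users to different content based on an in-lesson answer. The response keys and module names below are hypothetical.

```python
def next_module(response: str) -> str:
    """Illustrative algorithmic branching: pick the next module based on
    the user's answer to an in-lesson question."""
    branches = {
        "trouble_sleeping": "sleep_hygiene_module",
        "low_motivation": "behavioral_activation_module",
        "negative_thoughts": "cognitive_restructuring_module",
    }
    return branches.get(response, "core_psychoeducation_module")

print(next_module("low_motivation"))  # behavioral_activation_module
```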

Akili’s Endeavor Rx is a highly interactive video game for the treatment of ADHD (Image Source: Akili)

Narrative

Narratives improve engagement by sparking curiosity and making content more memorable and relatable. The engaging power of narrative is not surprising, given that humans have a well-established propensity to make sense of the world through storytelling. As Yuval Harari writes in his best-selling book Sapiens:

“Storytelling is our specialty. It’s the basis for everything we do as a species.”

Many of the most successful video games incorporate narratives as a core component of their experience. Gamification has become an increasingly common tool to improve engagement, and studies have found that narratives are an important driver of engagement within gamified digital health interventions. Many of our users have commented that our game-like features (including the use of narrative) were among their favorite features of the program.

Accountability

Accountability occurs when a person is called to ‘account’ for their behavior. To be effective, three elements must co-occur: a person must set clear goals, they must monitor their behavior relative to those goals, and they must be called upon to justify their actions if they do not adhere to them. Great care must be taken in the third step (justifying non-adherence) to avoid inadvertently discouraging users and thereby decreasing engagement. To address this risk, many digital interventions incorporate human coaches who help clarify goals, monitor progress, and reflect on performance. Standalone interventions (those without human support) can also facilitate accountability through self-guided goal setting and monitoring, and by automatically alerting users when deviations occur. Using an empathetic and encouraging tone, and reiterating the ‘why’ behind the desired behavior, is particularly important for maintaining users’ motivation through inevitable setbacks.
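As a sketch of automated accountability, the hypothetical check below flags a shortfall against the user’s own plan and responds with an encouraging message that restates the ‘why’. The threshold and wording are illustrative only.

```python
from typing import Optional

def adherence_check(sessions_completed: int, sessions_planned: int, goal_reason: str) -> Optional[str]:
    """If the user has fallen behind their own plan, return an encouraging
    accountability message that restates the reason behind the goal;
    otherwise return None. Tone and wording are illustrative."""
    if sessions_completed >= sessions_planned:
        return None
    return (
        f"You planned {sessions_planned} sessions this week and finished {sessions_completed}. "
        f"Setbacks are normal. Remember, you set this goal because you wanted to {goal_reason}. "
        f"A short session today still counts."
    )

print(adherence_check(1, 3, "feel less anxious before school"))
```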

Prescription digital therapeutics provide another avenue for accountability in standalone interventions. In the prescription digital therapeutic model, clinicians prescribe a standalone intervention and then meet with the patient at regular intervals for follow-up appointments (much as when a medication is prescribed). Patients’ expectations that they will need to justify non-adherence to their health care provider can provide an accountability boost. In support of this, a recent study found that clinic-referred patients showed 3x greater intervention adherence than patients who self-referred to the intervention.

Physicians can provide accountability in prescription standalone digital health interventions (Image source: Shutterstock)

Personalization

Personalization (often referred to as tailoring) involves customizing digital interventions to users’ personalities, attitudes, knowledge, goals, and other characteristics. Personalization enhances engagement by increasing the perceived personal relevance of intervention content. Personalization can take many forms, from low effort features like including a user’s name, to more complex features like customized avatars or real-time adaptive content.
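A minimal sketch of tailoring, combining a low-effort element (the user’s name) with a simple rule for content framing, might look like this. The profile fields and rules are assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    name: str
    top_goal: str        # e.g., "sleep better" or "worry less"
    prefers_stories: bool

def tailor_intro(profile: Profile) -> str:
    """Combine low-effort personalization (the user's name) with simple
    rule-based tailoring of how content is framed (illustrative rules only)."""
    framing = "a short story" if profile.prefers_stories else "a quick exercise"
    return (
        f"Hi {profile.name}! Today's session is about how to {profile.top_goal}. "
        f"We'll start with {framing}."
    )

print(tailor_intro(Profile(name="Sam", top_goal="sleep better", prefers_stories=True)))
```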

A 2013 review found that tailored web-based health interventions led to significantly greater improvement in health outcomes as compared to non-tailored interventions. The authors suggested that the improved health outcomes were likely due to better engagement in the tailored interventions.

One reason why human-delivered interventions may be more engaging than standalone digital interventions is that humans can better personalize the interventions to the needs of patients. However, as technology improves, so will personalization. Just as powerful algorithms in many consumer products (e.g., Facebook or TikTok) provide highly impactful personalization at scale, we envision a future where standalone digital health interventions are also highly personalized. Many interventions also leverage algorithms to help professional supporters personalize interventions better and more efficiently. We expect ‘AI-augmented’ interventions to play an increasingly important role in scaling healthcare access.

Credibility and Treatment Expectancy

An intervention is credible when users believe that its developers had the expertise needed to create an effective program. Closely related is the concept of treatment expectancy, which refers to whether a user believes that the intervention will work for them. Researchers generally view these as separate but closely related constructs, with credibility reflecting a more logical thought process and treatment expectancy a more emotional one, akin to hope or faith.

There are many ways to establish credibility and treatment expectancy, including empirical validation (e.g. randomized clinical trials), regulatory approval (e.g. FDA clearance), development by clinical experts, co-designing with the target population, and having the intervention be recommended or prescribed by a clinician. Good visual design and usability can also positively impact perceived credibility and treatment expectancy.

The relationships between credibility, treatment expectancy, and engagement are intuitive. When users perceive a treatment as credible or expect it to work, they are more likely to invest time and effort in completing the intervention. This relationship is supported in the literature: a recent review found that credibility and treatment expectancy were among the most important predictors of adherence in digital mental health interventions.

In line with this, researchers are exploring the emerging concept of the digital placebo effect. The placebo effect has historically been viewed as something simply to control for in randomized controlled trials. However, treatment expectancy (an active ingredient in the placebo effect) has a very real, positive impact on engagement and clinical outcomes. Digital health interventions present an exciting opportunity to utilize adaptive tailoring to optimize messaging based on users' attitudes and personalities, and thereby produce greater treatment expectancy and better outcomes. While developers should, of course, be required to prove that their digital health interventions have clinical effects beyond that of a placebo, the power of positive treatment expectancy can be leveraged to achieve an even greater clinical effect than would be achievable with clinically active ingredients alone.

FDA cleared, prescription digital therapeutics offer a gold-standard level of credibility (Image source: Shutterstock)

Complex or experimental techniques

The following techniques are either well-established but complex (requiring careful implementation), or they are promising but still experimental.

User Control

User control refers to how much autonomy and choice users have when navigating interventions. The effect of user control on engagement is complex and has been theorized to both increase and decrease engagement.

On the one hand, user control can increase engagement by facilitating a greater sense of autonomy. According to self-determination theory, the human need for autonomy is a primary driver of motivation. For this reason, academics have predicted that greater user control and autonomy should lead to greater engagement.

On the other hand, several studies found that increased user control actually decreases engagement. A 2013 study found that users spent significantly more time engaging with an intervention that had a prescriptive, linear path, than with a version where all content was available at once. Another study found similar results, with linear navigation leading to greater consumption and retention of intervention content. This inverse relationship between user control and engagement brings to mind a famous study, titled “When choice is demotivating”, which found that consumers are less likely to purchase products when they are presented with a greater number of options. Consumers can become overwhelmed by too much choice and end up not choosing anything at all!

This may be an example of people’s preferences deviating from their behaviors. A 2012 study found that while users preferred having more control over their navigation through the content, they ended up engaging with the intervention less and retaining less information than when they were given less control.

There are two guiding principles to draw from this research. The first is that while some control is good, too much control has adverse effects. Great care must be taken to identify the level of control that maximizes engagement. Second, when it comes to the complex realm of digital health interventions, it is often better to give users the feeling of control, while closely guiding them towards an optimal path.

It is often better to give users the feeling of control, while closely guiding them towards an optimal path.
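One way to give users a feeling of control while still guiding them along a path is to keep completed modules open for review while unlocking only the next one. The gating rule below is an illustrative sketch, not a prescription.

```python
def available_modules(all_modules: list[str], completed: set[str]) -> list[str]:
    """Guided navigation sketch: completed modules stay open for review,
    plus exactly one 'next' module, so users have some choice without
    facing the full library at once (illustrative gating rule)."""
    unlocked = []
    for module in all_modules:
        unlocked.append(module)
        if module not in completed:
            break  # stop after the first incomplete module
    return unlocked

modules = ["psychoeducation", "goal_setting", "behavioral_activation", "relapse_prevention"]
print(available_modules(modules, completed={"psychoeducation"}))
# ['psychoeducation', 'goal_setting']
```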

Reward systems

Since the early days of behaviorism, researchers have established rewards as one of the most effective ways to influence behavior. Rewards can be broadly divided into two types: extrinsic and intrinsic. Intrinsic rewards relate to the positive consequences of doing something that is inherently enjoyable (e.g., feeling a state of flow), whereas extrinsic rewards refer to external incentives for engaging in a behavior (e.g., money or praise).

Reward systems within digital health interventions generally incentivize target behaviors by providing extrinsic rewards such as badges, points, or level progression. Rewards can be provided for the target behavior itself (e.g., going for a run), for effort toward the target behavior (e.g., scheduling a run), or for approximations of the target behavior (e.g., going for a walk).
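As an illustration, a basic reward system might assign points to the target behavior, to effort toward it, and to approximations of it, awarding a badge once a threshold is crossed. The point values and threshold below are arbitrary examples.

```python
# Illustrative point values: reward the target behavior most, but also
# credit effort and approximations so early momentum is reinforced.
POINTS = {
    "completed_run": 10,   # target behavior
    "scheduled_run": 3,    # effort toward the behavior
    "went_for_walk": 5,    # approximation of the behavior
}

def award(event: str, total: int, badge_threshold: int = 50) -> tuple[int, bool]:
    """Return the new point total and whether a badge was just earned."""
    new_total = total + POINTS.get(event, 0)
    earned_badge = total < badge_threshold <= new_total
    return new_total, earned_badge

print(award("completed_run", total=45))  # (55, True): crossing 50 points earns a badge
```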

A recent study found a small positive correlation between extrinsic rewards and engagement. Rewards are often grouped into the larger category of gamification elements, which are designed to provide extrinsic motivation to engage with the intervention. While there is general consensus that gamification elements improve engagement in digital health interventions, there are very few studies examining the precise impact of rewards by comparing the same intervention with and without rewards or other gamification elements.

While the effect of extrinsic rewards on behavior change is well established, rewards also run the risk of unintended negative effects on long term behavior change. Studies have shown that extrinsically rewarding someone for a behavior that they are intrinsically motivated to do decreases the likelihood that they will continue the behavior once the reward is no longer provided.

Although rewards are clearly a powerful force for influencing behavior, it remains unclear how best to implement them in digital health interventions so as to avoid unintended consequences. It is for this reason that we consider reward systems a complex, high-effort engagement technique.

Social support

Social support features motivate users by leveraging social influence. The Persuasive Systems Design model outlines seven different types of social support, including social learning, competition, and cooperation.

Adapted from Table 5 of Oinas-Kukkonen & Harjumaa’s Persuasive Systems Design

Social support features should not follow a one-size-fits-all approach, and developers must take care to identify the appropriate type of social support for each intervention. For example, encouraging users to share intervention accomplishments with their real-world peers might be appropriate for fitness interventions. Such an approach may be less effective for mental health interventions because participants may be less willing to share accomplishments due to significant stigma. Instead, facilitating the giving and receiving of support among intervention participants could be more effective. Moreover, certain social support techniques are likely to be more effective for certain people based on their personality type and other attributes. This is another example of where multiple engagement techniques (personalization and social support) can be combined to yield greater improvements to engagement.

A 2012 review of studies incorporating social support found that social support features had a positive but statistically non-significant correlation with adherence. The review did not take into account the quality of implementation, and the authors suggested that social support likely does increase adherence when implemented well. This is an important insight that applies to every technique mentioned in this article: there are substantial differences in impact between techniques that are implemented well and those that are implemented poorly.

Professional support

Professional support (e.g., from coaches or clinicians) is arguably a component of social support but warrants its own section. Many digital health interventions incorporate professional support, but there is conflicting evidence about its impact on engagement. Several studies have indicated that professional support increases engagement. A 2019 review found that dropout from clinical trials was 12% for interventions that included professional support, compared with almost 34% for standalone interventions. A 2014 review similarly found that professional support led to significantly higher intervention completion rates.

However, other reviews indicate that professional support does not significantly improve engagement. A 2020 study found that professional support did not increase adherence relative to automated notifications. A 2017 study similarly found that human support did not significantly impact intervention usage, and a 2016 review found that professional support did not significantly increase adherence. On the extreme end, a 2017 review found that digital health interventions for depression that included professional support had non-significant clinical effects, whereas interventions without professional support had significant effects.

So what do we make of all of this? As already mentioned, there are often substantial differences in the quality of implementation of a technique across studies, and this is further compounded by the wide range of coaching methodologies that are available. The need for human support likely also depends on the clinical severity, the target diagnosis, the type of treatment, and the individual characteristics of the patient. These factors could account for the conflicting findings. However, the impact of professional support on engagement is still likely smaller than previously thought.

It’s also important to take into account the costs and benefits of human support. Digital health interventions could enable a future where many treatments are infinitely scalable and can thereby address the formidable healthcare access issues throughout the world. However, this vision is more feasible for interventions that consist of standalone software. Human-in-the-loop features increase logistical complexity and cost, and constrain the ability of an intervention to scale.

We foresee standalone interventions playing an important role in increasing access to effective and affordable care.

It has also been suggested that as interventions and associated technology become more advanced, the need for human support could diminish. Perhaps we can draw a comparison to the self-driving car industry. Cars can already drive autonomously in certain circumstances, but human drivers are still needed in many situations. While we believe that professional support will always play an integral role in health interventions, we foresee standalone interventions playing an important role in increasing access to effective and affordable care.

Many digital health interventions, like Omada’s, include professional support (Image source: Omada Health)

Digital therapeutic alliance

Therapeutic alliance refers to the quality of the working relationship between a therapist and patient. Adherence is a major issue in face-to-face therapy (around 50% of patients drop out of treatment prematurely), and therapeutic alliance is consistently shown to be the single biggest predictor of adherence.

Given the importance of therapeutic alliance for engagement in face-to-face therapy, there is growing interest in how therapeutic alliance translates to digital interventions. This construct has been termed the digital therapeutic alliance. A 2018 survey among national mental health stakeholders in the UK identified digital therapeutic alliance as a top 10 research priority in digital mental health.

A recently established measure of digital therapeutic alliance includes five dimensions:

This table is based on Berry and colleagues’ (2018) mARM questionnaire

While digital therapeutic alliance is still a new and emerging concept, early data are promising. A 2018 study found that participants’ ratings of digital therapeutic alliance were positively associated with program log-ins, frequency of self-monitoring, and number of treatment modules completed. Given the strong theoretical rationale for the relationship between digital therapeutic alliance and engagement, we expect it to emerge as an impactful driver of engagement in digital health interventions.

Conclusion

In this article, we’ve outlined techniques that have shaped our thinking on how to build engaging digital health interventions. For those building interventions, we recommend starting with the foundations of usability and visual design. However, these are necessary but insufficient for engagement, and you should quickly move on to the straightforward and well-established techniques that form the backbone of engaging digital health interventions. Finally, consider integrating one or more of the more complex or experimental techniques. If you do, you will be well on your way to creating an engaging and impactful intervention.

We’d love to hear about what techniques have worked best for you. What insights have you gained that are not captured here?

We’d love to hear about what techniques have worked best for you. What insights have you gained that are not captured here? What would you like to hear more about from us? By sharing what’s worked, we hope to assist others that are tackling similar problems and thereby accelerate learning across the industry. By working together, we can build a world where all those in need have access to effective, affordable health care.

Further reading

The following are excellent reviews of engagement factors and behavior change techniques.

Acknowledgments

Thank you to our Head of Content Elise Vierra, Product Designer Xin Koepsell, UX Researcher Stella Kim, and Director of Research Jessica Lake for contributing to this article. Thank you also to Jim Liu and Shikha Nalla for insightful feedback.