
How Digital Behaviour Change Interventions Work

Contents

I Introduction
II Discovering the psychology of persuasive digital interventions
III Paradigm shift on behaviour change technology
IV The persuasive communication model
V Conclusion
VI Resources
VII References

-- Submitted by Brian Cugelman, PhD

I Introduction

In this article, I show how digital behaviour change interventions work and discuss the art and science used to build them. I spent three years studying these technologies, and to explain how they work, I also need to explain my personal struggle to use, reject, and reformulate existing guidelines and concepts into a new model for building digital health behaviour change interventions and campaigns.

This article describes a three-year study of the persuasive psychology of behaviour change websites, the solutions developed to overcome the limits of existing frameworks, and an overview of the persuasive communication model.

II Discovering the psychology of persuasive digital interventions

During my doctorate, I researched the psychology of websites that can influence how people think and behave, in order to learn how to design digital interventions that could be scaled to the population level, potentially improving the wellbeing of millions and the health of our planet.
    
One key challenge was to find a list of persuasive tactics and therapies that I could use to analyse existing health behaviour change technologies. Although there are many taxonomies that should have worked, almost all systems were incapable of describing the persuasive psychology and therapy employed by the websites reported in the scientific literature.

During the study, I assessed behaviour change systems and taxonomies from social marketing, [1, 2] captology, [3, 4] the stages of change, [5, 6] Cialdini's principles, [7] and finally, evidence-based behavioural medicine. [8, 9, 10] These systems were either too limited, packaged individual tactics into groupings that rarely occurred in practice, or were organized around concepts incompatible with digital interventions and campaigns. The best-fitting taxonomies came from evidence-based behavioural medicine, especially the systems by Abraham and Michie (2008) and Michie et al. (2008). However, both systems omitted a large number of persuasive techniques that are common in health behaviour change campaigns, rendering them good but incomplete.

To develop a comprehensive taxonomy, we identified all influence components (the tactics believed to influence how people think and behave) and reorganized them within a communication theory framework. [11] Only at this point could we use our revised coding system to describe digital interventions. However, our model also revealed that a large number of influence components are rarely reported by researchers, showing the need for the eHealth field to integrate knowledge from the persuasive technology field. It also suggested that the entire field of eHealth may lack appropriate theories and taxonomies to guide the analysis and design of health behaviour change technologies.

With this revised theory and taxonomy, we used statistical meta-analysis to quantify how much each influence component was capable of influencing users’ behaviours. The study was first published in January 2010, and then published in the Journal of Medical Internet Research a year later. Read the study here: http://www.jmir.org/2011/1/e17/.
    
During the course of the study, I deepened my understanding of the model as I learned which factors made digital interventions work, and I formulated a new view of the relationship between communication and influence. Since publishing the original paper, I have revised the taxonomy by extending and regrouping all influence components to better fit the theory, and have extended our initial findings by building a comprehensive list of evidence-based factors of digital behaviour change design. However, without research funding, this work has remained unpublished, and the following section provides only a basic overview of the revised model.

III Paradigm shift on behaviour change technology

One of the key steps in developing a model to describe the persuasive qualities of digital interventions was to stop thinking about them as technologies, and instead to start thinking about them as people. One of my key eureka moments came while reading a textbook on social psychology, when I realized that the qualities of persuasive leaders were the same as the qualities of persuasive websites. Common qualities included strong credibility (expertise and trustworthiness), charisma, similarity to the audience, and attractiveness.
 
Although this may sound counterintuitive, there is overwhelming research that shows we interact with technology in ways similar to how we interact with other people. [12, 13] One of our earlier studies confirmed this, where we found that models of human source credibility could predict if a citizen engagement website was more likely to inspire users' trust and behavioural intentions. [14]

Then our meta-analysis on health behaviour change websites also confirmed this, showing that the most effective digital interventions were like motivational coaches. These digital coaches informed users about the consequences of their behaviour, prompted them to set goals, taught them new skills, then encouraged them to track their progress toward those goals while providing feedback on their performance and using incentives to reward their progress.

This paradigm shift implies that to understand how interactive technology can help people adopt healthier habits, you need to understand how a talented coach helps clients achieve their goals. It also implies that behaviour change interventions can be modelled on the same two-way process of communication and interaction that guides therapist-patient relationships; by using our model, it is possible to optimize the communication process to maximize the probability of positive outcomes.

IV The persuasive communication model

Ultimately, digital interventions are communication products that engage people in relationships, and each part of the communication process can strengthen or undermine an intervention’s efficacy. When designing interpersonal digital interventions and also mass-media outreach campaigns, it is important to understand how each part of the communication process shapes outcomes.

Another key eureka moment came after realizing that each discrete influence component (persuasion or therapy tactic) could be organized within a communication framework, which allowed us to group the influence components within a model suitable for analysing and designing both digital interventions and campaigns.

Figure 1: Persuasive Communication Model

Figure 1 presents the persuasive communication model, and Table 1 describes each sphere and its influence components. The model is based on a circular communication system in which a source and audience exchange messages back and forth. For one-way exchanges from source to audience, the model simply omits audience feedback.

Spheres are the main groupings in the model, while influence components make the spheres more or less persuasive. For instance, the source is the person or organization disseminating a message to an audience. Within the source sphere, the intervention can be made more persuasive by modifying these influence components: the source's perceived expertise, authority, trustworthiness, likeability, attractiveness, and similarity to the audience. Moreover, the source's persuasiveness is also affected by first impressions (the halo effect) and the sleeper effect.

To demonstrate the model, consider a personal trainer (the source) who is helping a new client (the audience) develop a workout routine at a gym. Meeting in person, the trainer's first message (source message) asks the client to complete a survey about their goals, reasons for joining the gym, health, activity levels and lifestyle. The client encodes their information into the survey (audience feedback), which is handed to the trainer. The trainer reads the survey (decoding it) and uses the information to design a motivating action plan that highlights the client's desired benefits (audience influence components) and offers a tailored exercise routine (feedback message) that will help them achieve their goals. Next, the trainer writes down the client's goals and action plan (encoding the message in a medium) and hands it to the client, who negotiates it with the trainer until both are happy with the tailored plan (feedback message). The trainer decides to leverage some social pressure to increase the odds that the client sticks to the program, asking the client to make a public declaration to family and friends and committing to review progress each month (social context).

Although this example describes a person, the same principles apply to digital interventions, as described below.
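To make the model's structure concrete, here is an illustrative sketch that maps each sphere to example influence components named in this article. The groupings and identifiers are a simplification for demonstration, not the full published taxonomy.

```python
# Illustrative sketch of the persuasive communication model's spheres,
# each mapped to example influence components mentioned in the article.
persuasive_communication_model = {
    "source": ["expertise", "trustworthiness", "likeability",
               "authority", "attractiveness", "similarity"],
    "message_encoding_decoding": ["usability", "gestalt_layout",
                                  "clear_language", "framing"],
    "media_channel": ["text", "audio", "video", "multimedia"],
    "intervention_message": ["calls_to_action", "component_selection"],
    "audience": ["beliefs", "emotions", "intentions", "self_efficacy"],
    "feedback_encoding_decoding": ["questionnaires", "sensors", "usage_logs"],
    "feedback_message": ["personalization", "tailoring", "action_plans"],
    "social_context": ["peer_support", "social_comparison", "modelling"],
}

# One-way campaigns omit the audience-feedback spheres; two-way
# interventions use all eight.
one_way = {sphere: components
           for sphere, components in persuasive_communication_model.items()
           if not sphere.startswith("feedback")}

print(len(persuasive_communication_model), "spheres;",
      len(one_way), "in a one-way campaign")
```

This mirrors the circular system described above: drop the two feedback spheres and the model degrades gracefully into a one-way source-to-audience campaign.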

Source interpreter

Spheres: A source is anything that can be trusted: the person, organization, network, or brand that stands behind an intervention. At the same time, it is the intervention itself, as people interact with technology in ways similar to how they interact with people.

Influence components: To make interventions more persuasive, it is critical to establish source credibility, which means stressing the source's expertise and trustworthiness.

Other key source factors include boosting likeability, authority, attractiveness and trying to make a good first impression.

Message encoding/decoding

Spheres: This sphere describes how a message is constructed so that it is easily understood and more engaging.

The way an idea is expressed can sometimes be more impactful than what is expressed. The influence components in this sphere are based on the premise that style sometimes trumps substance.

This is the sphere governed by the oft-cited claim that only 20% of human communication is spoken, while 80% of what is conveyed comes through body language, tone of voice, and attitude. It's the sphere where the rules of human perception dictate how messages are interpreted, which provides guidance on how messages should be expressed.

Influence components: All design features that make systems faster and less frustrating will improve usage and intervention efficacy.

There is a need to focus on interface design, which means developing intuitive interfaces, taxonomies, and overall information architectures.

To help users focus on critical information, use structural elements to convey meaning: graphic design and layout based on Gestalt principles of visual perception, and primacy effects that place higher-priority content where people routinely look, based on eye-tracking studies.

Time is also a key factor, as an intervention can be encoded as a single session or a long-term program. Research shows that shorter interventions are more effective; however, the ideal duration will vary according to the nature of the behaviour and the support required.

To improve intervention efficacy, interventions can make use of language, employing clear and simple writing, framing issues according to audience needs, employing rhetorical tactics, and expressing factual information through more engaging formats such as personal stories or artistic expressions.

Media channel

Spheres: This sphere describes the different media used in an intervention, such as text, audio, photos, video, and multimedia such as CD-ROMs or teleconferencing.

Influence components: There are pros and cons associated with each type of media, so it's important to pick the media that each target audience will find easiest to engage with.

There is evidence that multi-media can be more effective than single media, employing audio and video as opposed to just text or images alone.

Intervention message

Spheres: The intervention message is what the source conveys to the audience. Digital interventions range from brief single-session interventions to long-term programs. However, regardless of length, audiences are likely to interact with systems multiple times. Thus, intervention messages should be considered a sequence of messages expressed over the course of multiple interactions, which together comprise a relationship.

Influence components: Effective messages will integrate influence components from all spheres in a way that is sensitive to a person's motivation (pros and cons), their capacity (ability and efficacy), and their trust in the source and proposed course of action. Also, messages need to include propositions: either overt calls to action or implied proposals to believe something.

When selecting influence components for a message, too few may be insufficient to influence audiences and too many may be overwhelming. There is a need to craft messages that use the fewest, but most persuasive influence components.
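One way to picture this trade-off is as a selection problem: pick the highest-impact components without exceeding the audience's tolerance. The sketch below is purely illustrative; the component names, scores, and "load" costs are hypothetical, not figures from the study.

```python
# Illustrative sketch: choosing the fewest, most persuasive influence
# components for a message. Scores and costs are hypothetical.
def select_components(components, budget):
    """Greedily pick the highest-scoring components within a load budget.

    components: list of (name, persuasion_score, cognitive_load) tuples.
    budget: maximum total cognitive load the message may impose.
    """
    chosen = []
    load = 0
    # Consider the most persuasive components first.
    for name, score, cost in sorted(components, key=lambda c: -c[1]):
        if load + cost <= budget:
            chosen.append(name)
            load += cost
    return chosen

candidates = [
    ("goal_setting", 0.9, 2),
    ("feedback_on_performance", 0.8, 2),
    ("fear_arousal", 0.4, 3),
    ("social_comparison", 0.6, 1),
]
print(select_components(candidates, budget=5))
```

The budget acts as the "too many is overwhelming" constraint: a low-scoring, high-cost tactic such as the hypothetical fear_arousal entry is dropped once the message is full.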

Audience interpreter

Spheres: This is the person, group or organization that the source is helping to change. This is the target audience. This sphere describes their demographic background, particular traits, habits, skills and psychological disposition.

Influence components: The influence components in this sphere describe how audience disposition impacts behaviour. Many elements from the classic behaviour change theories are grouped in this sphere, with popular constructs including beliefs about consequences, emotions, behavioural intentions, motivational goals, self-efficacy, memory and demographics.

In our study, key influence components included increasing users’ knowledge of the health issue and the consequences of their behaviour. Successful interventions focused on users’ motivation, primarily using positive goals, though fear arousal was one negative emotional tactic that worked.

Based on action plans (discussed under feedback message), the systems provided instructions to help people develop new skills. Surprisingly, self-efficacy did not prove to be a key success factor, perhaps because digital programs appear to work for people who are already motivated to change and who may already believe they have the skills to do it.

Demographics proved to be relevant, with younger, female-dominated, and more highly educated groups achieving the largest outcomes.

Feedback encoding/decoding

Spheres: This sphere describes the different ways that digital systems capture user data for use in the intervention. For instance, web-based interventions often use questionnaires, mobile phones use GPS devices to encode data, and others use accelerometers to capture data.

If an intervention does not collect audience data, it is impossible to deploy audience feedback influence components. Without user data an intervention will be static and probably irrelevant to the majority of users.

Influence components: In digital interventions, data can be captured and encoded through website usage patterns, sensors (such as accelerometers or pedometers), and online questionnaires and polls.

To be more effective, data capture and encoding mechanisms need to be minimally invasive and require as little effort as possible.

Where possible, it is good practice to collect data over a long timeframe rather than overwhelming users with time-consuming requests. This can be implemented through several small data requests over time, incentivized where possible.
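As a minimal sketch of this practice, a long intake questionnaire can be split into short batches delivered across sessions. The question names below are hypothetical placeholders.

```python
# Illustrative sketch (hypothetical question set): spreading a long
# intake questionnaire across several short sessions instead of one
# long, effortful request.
def schedule_requests(questions, per_session):
    """Split a question list into small batches, one batch per session."""
    return [questions[i:i + per_session]
            for i in range(0, len(questions), per_session)]

intake = ["goals", "health", "activity", "lifestyle", "barriers"]
print(schedule_requests(intake, per_session=2))
```

Here five questions become three brief sessions, trading one invasive request for a gentler stream of small ones.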

Feedback message

Spheres: The feedback message is the specific information that the audience shares with the source. Some of the most powerful behaviour change techniques can only happen with audience feedback, rendering this one of the most important spheres, where relationships and personal support can prosper.

While feedback encoding/decoding describes the data capture mechanisms, feedback messages are the tactics that can be deployed once feedback has been obtained.

Influence components: Feedback enables tailoring and personalization, both of which are highly effective. Personalization is the use of personal information, such as a person's name and city. Tailoring goes further, to offer interventions specifically adapted to a person's particular needs and tastes.

Through the feedback message, interventions can develop personal action plans for users and identify barriers that may need to be removed or worked around.

Through feedback messages, users can learn how they are performing against their goals. Depending on their performance, the system can use shaping to reinforce efforts towards the goals with rewards and punishments (though we did not find evidence that punishment tactics worked).
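A toy version of this logic shows personalization (using the name) and tailoring (adapting the message to performance) working together. The thresholds, messages, and reward are hypothetical illustrations, not findings from the study.

```python
# Illustrative sketch: tailored progress feedback with reinforcement.
# Thresholds, wording, and the reward are hypothetical.
def progress_feedback(name, goal, achieved):
    """Personalize (use the name) and tailor (adapt to performance)."""
    ratio = achieved / goal
    if ratio >= 1.0:
        # Reinforce goal attainment with a reward.
        return f"{name}, goal reached -- you've earned this week's reward!"
    if ratio >= 0.5:
        # Feedback on partial progress encourages persistence.
        return f"{name}, you're {achieved}/{goal} of the way there. Keep going!"
    # Low progress triggers relapse-style support, not punishment.
    return f"{name}, let's revisit your action plan and remove some barriers."

print(progress_feedback("Alex", goal=4, achieved=2))
```

Note that the lowest branch pivots to support rather than punishment, consistent with the finding above that punishment tactics lacked evidence of effect.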

In cases where a user has fallen off the wagon, feedback messages allow systems to initiate relapse support to get people back on track.

In cases where people are dropping out of the program, feedback can be used to initiate adherence systems that pull people back into the program.

Social context

Spheres: The majority of interaction happens within a social and physical context. This context is relevant to the success of an intervention and with the rise of social media, there are many opportunities to leverage social forces to make interventions more effective.

Influence components: In our study, social influence proved to be a very powerful sphere. Effective programs routinely provided time with a counsellor, tried to involve users’ friends and family, or provided options to connect with peer support networks.

Modelling behaviour is very effective, especially when combined with social comparisons, where a person compares their behaviour to others' or is simply told about others' behaviour.

V Conclusion

Science shows that digital interventions can help people achieve behavioural goals, ranging from simple habits to changes that are more difficult to attain. Perhaps one of the biggest lessons is that effective offline programs can be adapted to digital environments and modelled on two-way interaction, forming long-term relationships with users.

The persuasive communication model has proven to be a valuable tool for understanding how digital interventions work. The upcoming revision has potential to help fix existing programs and aid the design of new digital behaviour change interventions.

About the author

Dr. Brian Cugelman has had an extensive career as a social campaigner, digital strategist, researcher, and program evaluator. He has worked with grass-roots organizations and leading non-profit organizations, and has spent over seven years working with several United Nations agencies.

He works as a consultant for AlterSpark Consulting (http://www.alterspark.com), where he provides social research, helps organizations measure their impact, and designs online campaigns.

VI Resources

Online Interventions for Social Marketing Health Behavior Change Campaigns: A Meta-Analysis of Psychological Architectures and Adherence Factors. Published in the Journal of Medical Internet Research, the findings demonstrate that online interventions have the capacity to influence voluntary behaviors, such as those routinely targeted by social marketing campaigns: http://www.jmir.org/2011/1/e17/.

The Dimensions of Web Site Credibility and Their Relation to Active Trust and Behavioural Impact: http://wlv.openrepository.com/wlv/bitstream/2436/85974/4/Cugelman_2009_w....

The psychology behind websites that can change people's health behaviours: http://www.cugelman.com/online-psychology/psychology-websites-change-hea....

Presentation on the Social Psychology of Social Media: http://www.cugelman.com/online-psychology/social-psychology-social-media....

VII References

1. Kotler, P. and Roberto, E. 1989. Social Marketing. New York: The Free Press.
2. McKenzie-Mohr, D. and Smith, W. 1999. Fostering Sustainable Behavior: An Introduction to Community-Based Social Marketing. Gabriola Island, Canada: New Society Publishers.
3. Fogg, B. J. 2003. Persuasive Technology: Using Computers to Change What We Think and Do. San Francisco: Morgan Kaufmann Publishers.
4. Oinas-Kukkonen, H. and Harjumaa, M. 2008. A Systematic Framework for Designing and Evaluating Persuasive Systems. In Lecture Notes in Computer Science. Berlin/Heidelberg: Springer. p. 164.
5. Prochaska, J. and Norcross, J. 2001. Stages of Change. Psychotherapy. 38(4): p. 443-448.
6. Prochaska, J., Norcross, J. and DiClemente, C. 1995. Changing for Good: A Revolutionary Six-Stage Program for Overcoming Bad Habits and Moving Your Life Positively Forward. Collins.
7. Cialdini, R. 2008. Influence: Science and Practice. 5th ed. Boston: Pearson/Allyn and Bacon.
8. Davidson, K., Goldstein, M., Kaplan, R., Kaufmann, P., Knatterud, G., Orleans, C., Spring, B., Trudeau, K. and Whitlock, E. 2003. Evidence-Based Behavioral Medicine: What Is It and How Do We Achieve It? Annals of Behavioral Medicine. 26(3): p. 161-171.
9. Abraham, C. and Michie, S. 2008. A taxonomy of behavior change techniques used in interventions. Health Psychology. 27(3): p. 379-387.
10. Michie, S., Johnston, M., Francis, J., Hardeman, W. and Eccles, M. 2008. From Theory to Intervention: Mapping Theoretically Derived Behavioural Determinants to Behaviour Change Techniques. Applied Psychology. 57(4): p. 660-680.
11. Cugelman, B., Thelwall, M. and Dawes, P. 2009. Communication-Based Influence Components Model. Paper presented at Persuasive 2009, Claremont.
12. Reeves, B. and Nass, C. 2003. The Media Equation: How People Treat Computers, Television and New Media Like Real People and Places. University of Chicago Press.
13. Fogg, B. J. 2003. Persuasive Technology: Using Computers to Change What We Think and Do. San Francisco: Morgan Kaufmann Publishers.
14. Cugelman, B., Thelwall, M. and Dawes, P. 2009. The Dimensions of Web Site Credibility and Their Relation to Active Trust and Behavioural Impact. Communications of the Association for Information Systems. 24: p. 455-472.