The Person-Based Approach to Quantitative Process Analysis

Process analyses explore how an intervention worked by collecting data on how it is used (usage analysis) and assessing whether usage is related to outcomes or user characteristics (process analysis). We have developed two key concepts to support usage analyses:

  1. Effective engagement is the level of user engagement needed to bring about intended outcomes. Identifying the minimum usage necessary for an intervention to be effective permits the design of ‘lean’ interventions that quickly achieve users’ goals and cost less to maintain and update. 
  2. Meaningful measures of usage capture what people engage with, not just how much. Most usage data analyses are based on quantity of engagement (number of logins, time spent, pages viewed), but often what matters most is which elements people engage with. 

We use the intervention’s programme theory to identify the intervention elements that we predict will be important to engage with (of course, this may be different for different groups of users). Then we test the theory by evaluating whether engagement with those elements does relate to better outcomes.
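As a minimal illustration of testing that prediction, the sketch below compares outcomes between users who did and did not engage with a theory-specified element. All data, field names, and values are hypothetical; a real analysis would use appropriate statistical models on the full dataset.

```python
# Sketch: compare outcome change between users who did and did not complete
# a theory-specified element (hypothetical data for illustration only).
from statistics import mean

# Each record: did the user complete the goal-setting element, and what
# was their change in the outcome measure (e.g. a handwashing score)?
users = [
    {"completed_goal_setting": True,  "outcome_change": 2.1},
    {"completed_goal_setting": True,  "outcome_change": 1.8},
    {"completed_goal_setting": False, "outcome_change": 0.4},
    {"completed_goal_setting": False, "outcome_change": 0.9},
]

engaged = [u["outcome_change"] for u in users if u["completed_goal_setting"]]
not_engaged = [u["outcome_change"] for u in users if not u["completed_goal_setting"]]

print(f"mean change, engaged:     {mean(engaged):.2f}")
print(f"mean change, not engaged: {mean(not_engaged):.2f}")
```

A larger difference in mean change for the engaged group would be consistent with the programme theory, though a formal analysis would also adjust for user characteristics.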

The Germ Defence usage analysis found that users' handwashing increased sufficiently after they had viewed just the first of four sessions, but only if they completed the goal-setting element of that session. Based on this analysis, when Germ Defence needed to be adapted for the COVID-19 pandemic we were able to quickly redesign it as a single session with goal-setting, making it more acceptable to the public and easier to disseminate.

Additional data and analyses

It is useful to collect additional quantitative data that may explain how the intervention worked. By collecting data about user characteristics (for example, age, gender, ethnicity, education, preferences) we can identify whether sub-groups of people used the intervention differently or had different outcomes.  

Often, we ask users about their beliefs about carrying out the target behaviour, for example: whether they feel they are able to do the behaviour; and if they believe doing it will help them. By comparing their answers at the beginning and end of the study, we can understand what changed and whether this relates to better outcomes. 
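The before-and-after comparison above amounts to computing change scores for each belief item. The sketch below shows this with hypothetical item names and values.

```python
# Sketch: compute change scores for belief items measured at baseline and
# follow-up (item names and scores are hypothetical).
baseline = {"self_efficacy": 3.0, "outcome_expectancy": 2.5}
follow_up = {"self_efficacy": 4.5, "outcome_expectancy": 4.0}

# Change score = follow-up minus baseline, per item.
change = {item: follow_up[item] - baseline[item] for item in baseline}
print(change)
```

Change scores computed this way can then be related to outcomes to explore whether shifts in beliefs accompanied behaviour change.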

It can be difficult to decide in advance what data will need to be collected when implementing and evaluating an intervention. Our framework for Analysing and Measuring Usage and Engagement Data (AMUsED) can help you to plan and carry out quantitative process evaluations more efficiently and systematically. 

The AMUsED Framework

The AMUsED framework helps you plan your analysis in advance, ensuring that all the data you need will be captured in a format that is easy to analyse during the optimisation and evaluation phases. 

The framework is made up of three stages, each containing a checklist of generic questions that guide the intervention development team in formulating questions specific to their intervention. 

Stage 1: Familiarisation with the intervention

Supports in-depth/rigorous knowledge of the intervention:

  • Intervention structure, content and theory-base (e.g. number of sessions, behaviour change techniques)
  • What data will be captured and when (e.g. self-report measures at baseline and outcome, log-data)
  • Context around the intervention (e.g. recruitment methods, findings from qualitative research)

Stage 2: Selecting usage variables and generating research questions

Supports planning the analysis:

  • What are the meaningful measures of usage (i.e. variables that examine engagement with the intervention’s structure, theory-based content, and active ingredients)?
  • Who used the intervention (e.g. do user characteristics predict meaningful usage)?
  • How was the intervention effective (e.g. what is the relationship between meaningful usage and outcomes)?
  • What type or level of usage is necessary for effective engagement?
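Meaningful usage variables such as those in Stage 2 are typically derived from raw log data. The sketch below derives one such variable, whether each user reached a goal-setting page, from hypothetical page-view events (the event names and log format are invented for illustration).

```python
# Sketch: derive a meaningful usage variable from raw page-view logs,
# e.g. whether each user reached the goal-setting page in session 1.
# (Event names and log structure are hypothetical.)
log_events = [
    {"user": "u1", "page": "session1_intro"},
    {"user": "u1", "page": "session1_goal_setting"},
    {"user": "u2", "page": "session1_intro"},
]

user_ids = {e["user"] for e in log_events}
completed_goal_setting = {
    uid: any(e["user"] == uid and e["page"] == "session1_goal_setting"
             for e in log_events)
    for uid in user_ids
}
print(completed_goal_setting)
```

Deriving such theory-based flags, rather than relying only on counts of logins or page views, is what makes the usage measure "meaningful" in the sense described above.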

Stage 3: Preparation for analysis

Supports preparation of data analysis:

  • Necessary resources
  • Analytical software selection and compatibility with collected data
  • Data preparation (e.g. combining multiple data sets from different sources)
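The data-preparation step often means joining log data to self-report data by user ID. A minimal sketch, with hypothetical IDs and field names:

```python
# Sketch: combine log data and self-report data sets keyed by user ID,
# as in Stage 3 data preparation (all IDs and fields are hypothetical).
log_data = {"u1": {"logins": 4}, "u2": {"logins": 1}}
self_report = {
    "u1": {"baseline_score": 10, "outcome_score": 14},
    "u2": {"baseline_score": 12, "outcome_score": 12},
}

# Merge the two sources into one record per user.
merged = {
    uid: {**log_data.get(uid, {}), **self_report.get(uid, {})}
    for uid in set(log_data) | set(self_report)
}
print(merged["u1"])
```

In practice a data-frame library and an explicit join strategy (e.g. how to handle users present in only one source) would be chosen at this stage, before analysis begins.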

A full explanation of the application and development of the AMUsED framework is available here:

Miller S, Ainsworth B, Yardley L, Milton A, Weal M, Smith P, Morrison L. A Framework for Analyzing and Measuring Usage and Engagement Data (AMUsED) in Digital Interventions: Viewpoint. J Med Internet Res 2019;21(2):e10966. doi: 10.2196/10966

Examples of process evaluations where the AMUsED framework has been applied:

Miller S, Ainsworth B, Weal M, Smith P, Little P, Yardley L, Morrison L. A Web-Based Intervention (Germ Defence) to Increase Handwashing during a Pandemic: Process Evaluations of a Randomized Controlled Trial and Public Dissemination. J Med Internet Res. 2021 Oct 5;23(10):e26104. doi: 10.2196/26104. PMID: 34519661.

Miller, S., Yardley, L., Smith, P., Weal, M., Milton, A., Stuart, B., Little, P., & Morrison, L. (2020). How digital interventions support self-care of minor ailments: A process evaluation of the Internet Dr intervention for respiratory tract infections. Journal of Medical Internet Research Preprints, 24239. doi.org/10.2196/preprints.24239

More information on effective engagement:

Ainsworth, B., Steele, M., Stuart, B., Joseph, J., Miller, S., Morrison, L., Little, P., & Yardley, L. (2017). Using an analysis of behavior change to inform effective digital intervention design: How did the PRIMIT website change hand hygiene behavior across 8993 users? Annals of Behavioral Medicine, 51(3), 423-431. doi.org/10.1007/s12160-016-9866-9

Yardley, L., Spring, B. J., Riper, H., Morrison, L. G., Crane, D. H., Curtis, K., Merchant, G. C., Naughton, F., & Blandford, A. (2016). Understanding and Promoting Effective Engagement With Digital Behavior Change Interventions. American Journal of Preventive Medicine, 51(5), 833–842. doi.org/10.1016/j.amepre.2016.06.015