bi.team

How to scale interventions (and avoid the voltage effect)

A major barrier to scaling interventions is the voltage effect: when an approach that works in a small trial fails to deliver the same results at scale.

This is a fundamental challenge for the development of evidence-based policy. One of the reasons the voltage effect happens is that it is challenging to implement interventions well at a bigger scale. Often interventions are led by small teams that don’t have the capacity to scale their intervention effectively.

So, how do we implement interventions well at scale?

Let’s take a look at an example from the education sector.

In an EEF-funded efficacy trial in 89 primary schools, a neuroscience-based intervention aimed to improve science and maths attainment for Year 3 and Year 5 pupils. “Stop and Think” used short, computer-based learning activities designed to help children pause and reflect before answering, rather than relying on their first instinct. It was found to significantly boost science attainment by the equivalent of two additional months’ progress.

We worked with EEF and the developers to implement this intervention at a larger scale. This involved visiting 170 schools to deliver in-person teacher training and providing a helpline during the intervention period.

This larger-scale “effectiveness” trial was designed to simulate more “real-world” conditions. The independent evaluator (NatCen) stipulated that, in contrast to the efficacy trial, we should not proactively contact teachers who fell behind with intervention delivery (other than through automated reminders).

Even at this larger scale, and under these more real-world conditions, the intervention achieved similar positive outcomes in science for all pupils – successfully overcoming the voltage effect.

How did this trial overcome the voltage effect?

Collaboratively planning a structured implementation process

We assembled a team of Project Champions to train teachers and actively involved them in shaping the implementation process. After learning about the intervention from its developers, they helped finalise their own handbook, develop training materials for teachers, create school communications, organise school visits and set up the monitoring processes.

This not only improved the implementation design but also boosted engagement. As EEF’s implementation guidance report states, “People, ultimately, value what they feel a part of,” and involving implementers increased their enthusiasm and commitment.

Building relationships, and involving all stakeholders at all levels and stages

With Project Champions working remotely and travelling long hours, we fostered team identity through collaboration, meetings and shared experiences to keep them motivated. When recruiting schools we engaged staff at all levels, from leaders to class teachers.

A key focus of school visits was building strong connections with teachers, encouraging them to reach out to our helpline with any issues. This approach worked – teachers frequently contacted us, and the independent evaluation highlighted their positive feedback: “found the trainers friendly, supportive and engaging in their explanations.”

Developing behaviourally-informed materials for teachers

Our main priority was to make it easy for the teachers to deliver the intervention.

We produced a clear training script, and a simple teacher handbook with only essential information, which we shared during school visits. After each visit, we emailed teachers a digital copy of the handbook, game access details and a refresher video.

Just before the intervention began, we re-shared the video and handbook as a timely reminder. Teacher testimonials in the independent evaluation confirmed our success in keeping things straightforward.

“Teachers also expressed that the trainers had explained the programme well. Furthermore, teachers in interviews found the training materials, especially the Stop and Think handbook, useful in familiarising themselves with the game.”

“Training was described as easy to enrol on, unproblematic to access, and as having high-quality accompanying materials.”

During the intervention period, teachers received an automated email on a Friday morning if they had not yet completed the intended three sessions that week, encouraging them to complete another session that day and, if necessary, four sessions instead of three the following week.

The emails were behaviourally-informed in that they:

Provided a timely reminder.

Were sent to the individual teacher specifically, and were personalised with the number of sessions they had completed that week and the name of their class – this was designed to attract attention and make the message feel personal rather than automated.

Appeared to come from the email address their Project Champion had used to follow up with them after visiting their school, thereby utilising the behavioural insights of the messenger effect and reciprocity.

Had a clear call to action in the subject line.

Provided a reminder about the helpline.

Were designed so that the teacher could reach our helpline simply by replying to the automated reminder.
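The reminder logic above can be sketched in a few lines of code. This is a minimal illustration, not BIT’s actual system: the field names, the helpline address, and the “extra session next week” arithmetic are all assumptions for the sake of the example.

```python
from dataclasses import dataclass
from typing import Optional

TARGET_SESSIONS_PER_WEEK = 3  # intended "Stop and Think" dosage


@dataclass
class TeacherRecord:
    # Field names are illustrative, not taken from any real system.
    name: str
    class_name: str
    sessions_this_week: int
    champion_email: str  # the address the teacher already knows


def build_friday_reminder(teacher: TeacherRecord) -> Optional[dict]:
    """Return a reminder email if the teacher is behind target, else None."""
    remaining = TARGET_SESSIONS_PER_WEEK - teacher.sessions_this_week
    if remaining <= 0:
        return None  # on track: no reminder needed

    # Personalised body: the session count and class name attract attention
    # and make the message feel less automated.
    body = (
        f"Hi {teacher.name}, your class {teacher.class_name} has completed "
        f"{teacher.sessions_this_week} of {TARGET_SESSIONS_PER_WEEK} "
        f"Stop and Think sessions this week. Could you fit in another "
        f"session today? If not, please aim for "
        f"{TARGET_SESSIONS_PER_WEEK + remaining} sessions next week.\n\n"
        "Any problems? Just reply to this email to reach our helpline."
    )
    return {
        # Messenger effect: appears to come from the familiar Project Champion.
        "from": teacher.champion_email,
        # Replies route straight to the helpline (placeholder address).
        "reply_to": "helpline@example.org",
        # Clear call to action in the subject line.
        "subject": "Please complete a Stop and Think session today",
        "body": body,
    }
```

In a real deployment this function would run on a Friday-morning schedule against usage data, and only teachers returning a non-`None` result would be emailed.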

Monitoring implementation

Project Champions kept a log of their school visits, which they shared and reflected on in a weekly meeting with their manager. This was a way of ensuring that whenever a school postponed a visit it was rescheduled, and that any issues were identified and resolved.

Implementation was tracked through usage data. However, the terms of this effectiveness trial were that we could not proactively reach out to teachers on the basis of low usage. Instead, we had to rely on schools contacting our helpline.

Being highly responsive to participants

We were committed to being highly responsive on the helpline, as this gave us an opportunity to troubleshoot any issues and get implementation back on track.

Teacher testimonials identified this as a key factor in the success of implementation:

“Teachers expressed that BIT were responsive when it came to their queries…In interviews, teachers and school leads who accessed technical support, particularly by email, reported that they received replies within the same day and, occasionally, within a few minutes.”

“Teachers discussed a number of factors that made the programme easy to implement. The first was the quality and timing of the training, and responsiveness of delivery support. Teachers who discussed this remarked favourably on communication being clear and support being “readily available”.”

The EEF is discussing next steps for Stop and Think and has included it on their Promising Programmes page, where schools can express interest in accessing the Stop and Think game.

It is important to note that successful implementation in this trial was not just down to our approach; as a whole-class game-based computer programme, this intervention is easier to scale than most.

However, we feel that these learnings could be applied by delivery teams at BIT and beyond to improve the implementation of trials at scale.

At BIT we use a deep understanding of human behaviour to design interventions that solve your real-world problems. Find out more about our work and get in touch to discuss how we can work with you.
