## Notes from 09 February 2026
[[2026-02-08|← Previous note]] ┃ [[2026-02-10|Next note →]]
I came across this [short case study](https://www.nesta.org.uk/case-study/how-can-evidence-based-education-programmes-be-effectively-scaled-up-in-england/) from the _Behavioural Insights Team_ (BIT, part of [[Nesta]]) about scaling an evidence-based teacher professional development program in England, and the concept that stuck with me was “learning partner”. In the project, BIT worked with the [Education Endowment Foundation](https://educationendowmentfoundation.org.uk/) (EEF, a major UK funder of education evidence) and SSAT (the Schools, Students and Teachers Network, which developed and delivered the program) to support the scale-up of [Embedding Formative Assessment (EFA)](https://www.ssatuk.co.uk/cpd/teaching-and-learning/embedding-formative-assessment/).
What “learning partner” means here is that BIT wasn’t only evaluating the program from the outside and reporting results at the end, but also feeding insights back during delivery (regular feedback, practical recommendations, and capacity building) so the implementation could be adjusted in real time while the program expanded across schools. The case study makes a broader point I found a bit obvious but still useful: even when an intervention has solid evidence behind it, scaling depends on the mechanics of delivery (monitoring, incentives, organizational capacity), not just on whether the program “works” in a trial.
It also reminded me of the [[Arnold Ventures#State Partnerships for Proven Programs|State Partnerships for Proven Programs]] initiative from [[Arnold Ventures]] in the US.