I have just finished reading the long-awaited “Impact” report of the Home Office’s Blueprint Drug Education project. As the then Chair of the Drug Education Forum and one of the Forum’s representatives on the project’s Advisory Group, I am disappointed, frustrated…
The project was promoted from the outset as the most important drugs education research project ever to be undertaken in the UK. For those, like myself, who continue to believe in the importance of early intervention at school, family and community levels with young people, Blueprint was supposed to show us whether providing drugs education by Blueprint’s methods, based on effectiveness research from elsewhere, could have significant impact in a UK context.
It is difficult to overstate how important this project was presented as being within the UK drugs education field.
I am not a statistician, and I had always believed that the design had been properly validated by experts. Across 23 schools, we expected to be able to assess the differences in young people’s knowledge, skills, attitudes and behaviour when they benefited from the multi-component Blueprint programme, compared with students in 6 schools who did not receive it.
Unfortunately, I am sure that the authors’ conclusions in this final report are valid. They state that, because of a flaw in the research design, comparisons cannot be drawn between the two groups. The report states:
“The original design of the Blueprint evaluation was not sufficiently robust to allow an evaluation of impact and outcomes and consequently the report cannot draw any conclusions on the efficacy of Blueprint in comparison to existing drug education programmes.”
Some important questions need to be answered. These include:
How could this happen with such an important, high profile and costly project?
At what stage were the flaws in the research design detected, making it clear that comparisons between the two groups would not be valid? What decisions were then made, why, and by whom?
What, if any, reasonable conclusions about impact on knowledge, skills, attitudes and behaviour can we draw from the study of the students who received the Blueprint intervention?
What implications, if any, do the findings have for future drug education practice and policy guidance?
What further research needs to take place?