Senior Research and Evaluation Officer, Diana Parkinson, reflects on the process of evaluating the CSA Practice Leads Programme in a social work context and how this has helped to shape our future work in this area.

I joined the research and evaluation team at the CSA Centre last September, back when we were still living in ‘normal’ times. One of my first projects was to undertake the evaluation of the CSA Practice Leads Programme in social work, and to produce a report describing the implementation and outcomes of the programme.

The CSA Practice Leads Programme in social work is an intensive programme of training and development for social workers, which seeks to build participants’ understanding and confidence in identifying and responding to child sexual abuse (CSA), and support them to cascade their learning across services. The programme was based on an earlier programme designed and delivered in East Sussex by Anna Glinski, the programme facilitator, who was at the time an advanced social work practitioner and is now Deputy Director for Knowledge and Practice Development at the CSA Centre.

The pilot programme comprised 10 days of small-group learning sessions, held over 10 months, which included reflective case discussions connecting evidence to ‘on the ground’ practice experiences in three local authorities between October 2018 and January 2020.

The evaluation

Although the programme was nearing its end as I started work, one of the first things I did was to work with Anna on developing a Theory of Change for the programme and, from that, to build an evaluation framework setting out what we wanted to assess and how we would do this.

Fortunately, Anna had put in place various monitoring systems when she set up the programme. These included questionnaires to be completed by Practice Leads at the beginning and end of the programme, and forms to record the support they provided to their colleagues during the programme. This meant that we were already gathering a considerable amount of data, which was great, but much of it was quantitative.

To supplement this data, and to gain deeper insight into the impacts and effectiveness of the programme, we ran focus groups with Practice Leads in each of the local authorities, which provided an opportunity to learn directly from the participants about their experiences of the programme. It also proved valuable to interview some of their line managers, as well as the Principal Social Workers in each local authority, and Anna herself. As you can imagine, all this provided a rich body of data to draw from in producing the final evaluation report.

Key findings

My analysis of the evaluation data revealed strong evidence of the programme’s impact on participants’ knowledge, skills and confidence in identifying and responding to CSA concerns, which, in turn, had enhanced their practice and enabled them to develop as specialists within their teams and wider organisations.

It also highlighted the quality of the programme’s delivery; participants had valued the way in which such a sensitive and complex subject had been approached in a manner that felt both positive and safe.

The analysis also showed how, by the time the programme ended, participants were disseminating their learning across their teams by sharing resources and delivering presentations. They were also starting to support their colleagues to overcome the fear and uncertainty that surrounds concerns of CSA, and, at times, were challenging them to ask direct questions and not let CSA concerns be put aside owing to lack of proof.

Once I had completed my analysis and written up the findings, I sought Anna’s perspective on how we could take the learning from the evaluation and draw out recommendations that would be helpful in developing the programme further.

Given the highly positive nature of the feedback from participants and stakeholders and the evidence of the programme’s immediate impact, we concluded that the CSA Centre should continue to offer this programme for social workers within other local authority areas. We also felt it would be useful to test the programme in multi-agency settings (e.g. involving police, education and health as well as social work) to explore the benefits and challenges of bringing together practitioners from different sectors, and the wider impact this might have. In addition, we suggested offering participating local authorities follow-on support to embed the programme – for example, through consultation on cases, facilitated group learning sessions or sharing of new research as it is published – which is now starting to happen.

Maximising the learning

After all the hard work of getting the report publication-ready, it might seem that seeing the report actually published would be the most satisfying moment of the work. In reality, the most satisfying and – I would even say, exciting – moment came somewhat later, when I had an opportunity to work with Anna to take the recommendations from the evaluation report and apply them to the planning of the next iteration of the CSA Practice Leads Programme. For example, we decided to re-design the registration form for agencies, and to create a new form to be completed by participants and their line managers, with the aim that both these documents would help to ensure a clearer shared understanding, from the outset, of the programme’s aims and its potential impact.

Evaluators rarely get to see their work being used, so it was rather wonderful to have the chance to ‘round the evaluation circle’ and take the learning from one programme into the design of the next.

As we look to the future development of the CSA Practice Leads Programme, we will consider how best to support CSA Practice Leads to continue disseminating their learning, and how to assess the longer-term impact the programme has on local authorities’ responses to concerns of CSA.

Blog posts give the views of the author, and are not necessarily those of the Centre of expertise on child sexual abuse.