Evidence 2016 Training workshop on government evaluations graphic recording
A shift over time
Fifteen years ago, in 2001, I had the privilege of working on the HIVSA project – a dynamic six weeks of participatory training workshops with policy-makers, health practitioners and researchers from eight SADC countries. These workshops aimed to support evidence-informed decision-making for those involved in designing, implementing and/or evaluating education programmes for HIV prevention in Southern Africa.
Our workshop participants considered the challenges of using evidence to inform decision-making to be great: a lack of relevant research, challenging IT infrastructures, a need to fulfil the expectations of external funders, and gaps in understanding between key stakeholders. Nevertheless, there was real interest among our participants in harnessing the methods we were sharing.
I was struck at Evidence 2016 by how far things had shifted in those 15 years. The increase in enthusiasm for, and understanding of, evidence-informed decision-making was exciting. The expertise in every session – from the speakers as well as the participants asking questions – was both humbling and inspiring. I know that in the UK, where I work, there has been a growing trend towards evidence understanding and use: the shift in the education sector has been notable in this regard. What I hadn't realised until I arrived at this conference was how far many African countries had embraced this movement, and with such enthusiasm!
Workshop learning – a two-way street
The training workshop I led at Evidence 2016 on government evaluations was a great forum for sharing. I learned a number of things from the participants in the workshop. We had interesting discussions about the timing of evaluations, the ways in which potential sustainability can be evaluated, and the challenges faced in trying to ensure rigour within evaluations.
I made the assertion that good relationships between funders and evaluators prior to the start of evaluations can improve the quality and outputs of evaluations. This point chimed with points made throughout the conference about relationship building. I learned, though, that some of my assertions about discussing possible changes to evaluations in advance, to ensure realistic and appropriate designs, were out of sync with the strict tendering rules for government evaluations in some countries, including South Africa. More thinking is required about how best to ensure the development of good relationships between funders and evaluators within different funding contexts.
It was useful to have realistic discussions about the extent to which evidence – from local and international sources – can currently be applied in a policy context. Both in the plenary sessions and the training workshops, it was helpful to hear the realities that policy-makers are faced with in their decision-making, but also the willingness to consider ways to break down current barriers to evidence use.
Creativity and evidence – strange bedfellows?
My favourite innovation at Evidence 2016 was the inclusion of an artist summarising sessions. It turned out to be surprisingly easy to forget that an artist was drawing the session, even though she was only a few feet away from me. (Much easier than ignoring a camera!) It was remarkable to see how she captured aspects of the presentation and discussion and turned them so immediately into a true piece of art that also had contextual meaning. I was transfixed when I sat behind the artist during one of the plenary sessions, watching how she worked. Her addition was innovative and exciting, and a real talking point of Evidence 2016; the picture of my session is something I will always keep to remember the conference. It will be my own personal piece of evidence from an engaging and impactful conference.