We recently co-hosted a workshop to explore how evaluation can support design and how design can support evaluation.
In 2019, we co-hosted the sold-out Design and Evaluation for Impact workshops at the Converge conference with our evaluation partners Clear Horizon in Melbourne, as well as our good friends at the Auckland Co-Design Lab and The Southern Initiative, all supported by the Australian Evaluation Society.
During the workshops, we explored how evaluation can support design and how design can support evaluation, all in the name of more rigorous social innovation.
Whilst design and evaluation may seem like unnatural bedfellows, the truth is that they are both evaluative processes concerned with impact. And the rigour and safety of evaluation provide a great complement to the creative and unconventional nature of design.
At the day-one workshop, we named four ways design and evaluation can come together; by the end of the conference it was up to five; and with a little more reflection here I've stretched it to eight. And only one of these is evaluating pilots for outcomes.
How design and evaluation can work together for impact (a work-in-progress reflection):
1. Process evaluation can hold social innovation processes accountable to a set of principles for practice. This seems particularly helpful if co-design, self-determination, or other cultural considerations are an important part of your process.
2. Theory of change gives the social innovation process a way to model pathways to outcomes. This is particularly important because design-based processes, with their commercial origins, don't have a native way to model outcomes; theory of change fills that gap.
3. Prototypes can be evaluated – pre-pilot and at a small scale – through simple experiments. Evaluation can bring more rigour to late-stage prototyping.
4. Evaluation can be designed into services and processes, and in such a way that it is useful (even delightful) for end users and commissioners.
5. Developmental evaluation approaches that run alongside social innovation processes can help de-risk what is often a new process by providing another set of eyes on what's working and what's not.
6. Growth in social innovation or design capability can be evaluated. This is often of interest, given that so many organisations are seeking to build their capability in this area.
7. Evaluation can bring rigour to the design of staged and gated innovation pathways or innovation challenges, supporting tricky judgement calls on early-stage ideas or opportunities.
8. And evaluation can, of course, evaluate pilots for outcomes: the core use of evaluation, but too often the only one we think of.