Systematising the Steps from Evidence to Impact: Amplifying the Royal Commission’s Recommendation for an Early Intervention Research Directorate

Part of a 5-post series on a social innovation perspective on the Nyland Royal Commission into Child Protection Systems.

The Nyland Royal Commission has recommended the development of a ‘cross-departmental Early Intervention Research Directorate’ (EIRD) to inform the development of preventative and early intervention services. In a previous post in this series, we explored what it might take to do prevention well; in this post we explore what model of ‘research directorate’ would have the best chance of creating impact for children and parents in South Australia.

The directorate could become something that actually builds new and effective solutions in South Australia and enables the system itself to become a genuine learning organisation. However, we think that if the ‘research directorate’ focusses exclusively on research, we’ll see limited impact on intergenerational well-being outcomes for children and families.

Nyland’s recommendation is that the directorate focus on services for the prenatal period, young and first-time parents, and care leavers who become parents. The EIRD has real potential to bring a new type of rigour to the selection of early intervention and preventative service offerings. Nyland recommends it should:

– Create a five-yearly whole-of-government prevention and early intervention strategy

– Guide funding priorities for better service coordination

– Evaluate innovative service models in other jurisdictions to see if they’re suited for South Australia

– Require evidence-based evaluations for program funding.

These are exactly the kinds of efforts that have the potential to build the service landscape to stop crises — and even risks — before they start. Eventually a functioning set of preventative measures could start to reduce the number of children needing child protective services in the future, and over the long term start to reduce the number of young people needing out-of-home care.

Research is essential, but research capability alone is insufficient to move from evidence that something works in one situation to impact in South Australia, let alone a fully functioning system here. The opportunity is to extend the capability of the directorate so that:

1. Where there is evidence of effective interventions in other contexts, they can be effectively adapted and spread within South Australia

2. Where there are gaps in evidence (and there are many in this sector), new knowledge can be generated through experimentation in South Australia

3. Where there are interventions already working in South Australia, they can be detected early, nurtured and spread (and ineffective or harmful solutions can be stopped).

1. What does it take to adopt and adapt a solution that is proven in another context?

Evaluations of existing solutions are based in the past and nearly always in other contexts: an evaluation of a great program in Canada that came out last year…something from the UK from four years ago…a new approach in New Zealand. This, of course, doesn’t tell you if the same solution will work in South Australia, where we have different demographics, cultures, service options and infrastructures, funding arrangements and a different service system entirely.

Evidence of effective interventions can be used to accelerate the development of new services and solutions, but we still need to undertake a rigorous process of adaptation, and that process needs to involve more than research.

While at a high level, problems and situations within child protection are not unique, at the local level they are. A program designed for one context may not be readily applicable to another, and something that ‘works’ in one context may be entirely unsuitable for another. This is particularly relevant to social programs, where what makes programs and services ‘work’ can be very nuanced and intentionally designed for unique community needs. Seemingly minor factors such as branding, key messages, referral sources, or staff training time and techniques can determine success.

When program elements that need to be adapted are not, and components that shouldn’t be altered are, the potential for success is undermined: researchers at the University of South Australia found that when evidence-based interventions are implemented in new contexts, they are often “quickly adapted or changed, resulting in interventions potentially losing the key ingredients that were critical for effectiveness.”*

We see six key stages to a good adopt/adapt process:

[Figure: the six key stages of the adopt/adapt process]

We saw an example of this in Ireland, where the National College of Ireland took an early childhood development program from the US, the Parent Child Home Program, and adapted it for a Dublin context. The program introduces new parents to reading and play activities to do with their children, which has long-term effects on children’s engagement in school and on parenting capability, and reduces high-school drop-out rates.

When adapting it, coordinator Beth Fagan knew that fidelity mattered: some things could change, but only as long as the program “maintained the core.” In the US, the program used retired social workers, but in Dublin those individuals didn’t want to go into the high-deprivation areas of the docklands. The adaptation was to instead up-skill local women to deliver the ‘play and learning training’ to parents, and it has been critical to the program’s success. Being part of the community gave workers respect and helped them recruit parents more easily. There was even an unforeseen bonus: the workers began practising the techniques on their own children, improving their own parenting and outcomes for their own kids.

2. What does it take to develop effective programs where there are gaps in evidence?

Where there are existing programs proven to deliver genuine outcomes, it makes complete sense to build on them and adapt them for South Australia. But there are many places where there are gaps in evidence, or where the evidence itself is inconclusive — what if the answer is not out there?

For example, in conducting a literature review on trends in restoration (reunification), we found few overarching conclusions about who restoration works for and who it doesn’t. This is because existing research covers a mix of demographics, localities and methods — you can’t conclude that what led one individual to a successful reunification in one context will hold for other individuals without knowing just what contributed to the success. Was it the person’s characteristics? Was it the system’s behaviour? Was it the intervention? A combination of all three, or none of them?

So when existing research is patchy or nonexistent, what do you do?

Our work in restoration (reunification) follows a co-design innovation process, where desk-based research is one part of our ‘discovery’ stage. In parallel with reviewing the literature and existing programs, we conduct ‘design research’: spending time with people who have experienced restoration first-hand or are working toward it.

By speaking with families and looking at their whole experience of engaging with child protective services, we built a layer of understanding about what helped in some cases and hindered in others. The sample was not statistically significant or representative, but by the end we had a set of hunches about what made restoration work and what got in the way. These insights, informed both by learning from people and by the existing evidence, shed light on questions that rigorous academic research alone couldn’t answer.

Much more sophisticated learning will occur in the next stage of this work as we test assumptions through designing-with and experimentation. The Design and Trial phases of innovation are rigorous components of our learning process, but they are not usually understood to be ‘research’.

[Figure: TACSI’s ‘bowtie’ innovation process: discover, design, trial, spread]

In 2010, TACSI identified a gap in family preservation and early intervention: services offered to the whole family, by someone families could trust. Parents going through tough times were looking for a kind of ‘helpful’ help they didn’t have opt-in access to — something that didn’t feel like a service. This discover, design, trial, spread approach is the process TACSI used to develop Family by Family. We identified what families wanted and needed, co-designed models with families, tested them out, and finally spread the program to six sites across Australia, with interest in adapting it globally.

At each stage we experimented and tested out uncertainties, modifying the program with families and practice experts until TACSI was confident it would work — and by work we mean improve the lives of families going through tough times.

This process limits the guesswork when trying to start something new; it helps us reduce our ‘unknowns’ and increase our ‘knowns’ over time through controlled experimentation in the real world.

Nyland’s recommendations are explicit in stating that the Directorate needs the capability to do research in the discovery stage and evaluation in the trial stage. But if the EIRD is to adopt and adapt interventions, and create new evidence where it doesn’t exist, it will need the capability to conduct design research, to prototype solutions at a small scale and, once solutions have been trialled, to implement them across the system. It needs capability beyond the Nyland recommendations.

3. What does it take to identify promising practice and stop ineffective practice in South Australia?

An effective EIRD should also be designed to recognise effective practice in our own systems; there’s research to be done in our own context. Small-scale investigations can surface ways that people in the system are already innovating — at times working around procedural constraints to do great things for kids and families.

It’s good that the directorate will require evidence-based evaluations of programs; however, evaluation takes time and carries considerable costs. As well as evaluating programs, the directorate should engage in ongoing monitoring and in on-the-ground scouting for effective practice and interventions.

In TACSI’s restoration (reunification) work, we’ve seen great programs die because those controlling resources weren’t aware of the programs’ value, didn’t know they existed, were looking at other metrics and indicators of success, or didn’t help initiatives spread to become the new status quo.

For example, we discovered a team that flipped restoration (reunification) outcomes from a 70% failure rate to an 85% success rate. Unfortunately, it was overlooked by senior leadership and disbanded despite its innovative design and efficacy. While this kind of crossed wires between competing priorities is not uncommon, it doesn’t have to be the norm.

The EIRD has a great opportunity to foster a commitment to outcomes, and the conditions for innovation, so that we can leverage the good work that already exists and ensure that quality innovations are supported to become better.

And if the EIRD seems like a great idea for early intervention and prevention, why limit it to one part of the abuse and neglect spectrum? Should we consider a similar effort for out-of-home care and the rest of the statutory child protection system too? What’s more, the EIRD as described would focus primarily on parts of the design and evaluation, or ‘Trial’, stages of an innovation process; however, there are more steps between evidence and impact.

What would it take to build a learning system that is actively addressing its own weaknesses and building on its strengths?

In working across human service systems throughout Australia, it’s surprised us how laissez-faire systems are when it comes to understanding what works, stopping things that don’t, and spreading those that do. In some cases, we’ve seen biscuit companies with more rigorous and evidence-based processes for developing new products than we use to develop new services for the families and children who need the most support. It’s time we brought the creativity and rigour we see in the best parts of the private sector to the development of services and systems in the public sector (more about that here).

With the EIRD, South Australia has the opportunity to demonstrate what it means to build an adaptive system, a learning system. That’s a big deal, because whilst most governments want to be adaptive, they’re not — and systems theory says that it’s only by being adaptive that you can tackle complex problems. All governments need to be adaptive, and practically we think that means building a system that closely connects research, design, implementation, monitoring and evaluation capability.

An effective learning system of the kind described here is the best defence against another Royal Commission. We need systems that can identify their own strengths and weaknesses and help themselves, rather than repeatedly requiring external intervention to promote change.

We may need to build the self-sufficiency of families so they can thrive, but first we’ve got to build the self-sufficiency of the systems that serve them.


* Fiona Arney, Kerry Lewig, Robyn Mildon, Aron Shlonsky, Christine Gibson and Leah Bromfield, “Spreading and implementing promising approaches in child and family services”.