Lou-Davina Stouffs and Teo Firpo from Nesta report back from our roundtable event
Last week we were pleased to host a roundtable event to discuss the What Works Centre for Local Economic Growth’s latest evidence review focusing on Innovation Policy. The strong turnout of local and national policymakers, researchers, and practitioners was a real testament to the growing interest and need to find out what works and what doesn’t in the area.
Evidence reviews like these are no small feat: our colleagues at the What Works Centre combed through around 1,700 studies until they were left with 41 that fit the review in terms of relevance [1] and robustness of results [2]. While the full evidence review will be published on their website soon, this blog covers some aspects that particularly struck us.
Four things we learned from the evidence review
- Validating the previous theoretical literature, the evaluations indicate that R&D subsidies have a stronger impact on SMEs than on large firms.
- It looks like these programmes have more of an effect on employment than on productivity. This could mean that more spend goes towards hiring new employees rather than funding innovative activities.
- Surprisingly, the evidence suggests targeted programmes might not be as effective as general measures.
- Despite popular belief, public funding doesn’t seem to be crowding out private investment - but the findings are still a bit mixed.
The debate that followed covered a wide variety of topics, and three points really stood out for us.
Our three take-aways from the discussion
- The debate on what evidence we should be gathering is far from over, as there are many questions still open. Do rigorous evaluations make us lose sight of the local context? Can the metrics we use truly capture the complexity of innovation processes? How should we be tackling publication and commissioning biases?
- Despite these questions, there is a consensus on the fact that some programmes can actually have negative effects - and robust evidence is needed to identify them and roll them back.
- Many of those present stressed the need to develop new approaches to support innovation, while recognising that we are likely to continue to use many of the existing programmes. When it comes to the latter, we can ensure their impact is maximised by experimenting more.
Even though we work in this space, what still struck us hardest about the outcomes of the evidence review was how small - and inconclusive - the body of evidence around innovation policy is.
The Innovation Growth Lab (IGL), which we launched last October, was created precisely to grow this knowledge base, developing and testing different approaches to supporting innovation, entrepreneurship and business growth through randomised controlled trials (RCTs). We aim to do this by bringing together the different communities working in this field, and providing them with tools to make testing the impact of different programmes simple and accessible.
An important part of our work will be the dissemination and translation of the evidence to those who can make a difference. In this way, we want to play an active role in bringing out the best in programmes that work and finding new solutions for those that don’t.
As one of the first steps, we will be launching a database of all the RCTs in innovation, entrepreneurship and growth in the coming weeks. Our hope is that this database will be a tool for policymakers, researchers and practitioners alike. Visit our website to read more about our work or to join our mailing list.
1. The What Works Centre for Local Economic Growth (WWCLEG) concentrated on evaluations of R&D grants, subsidies and loans in this particular review. Of course these do not encompass the entire toolbox innovation policymakers have at their disposal. This review is part of a wider set of reviews that the WWCLEG will be publishing on innovation policy. They have previously reviewed evidence on Access to Finance.
2. The WWCLEG uses the Maryland Scientific Methods Scale to grade the evaluations they review. Any evaluations that score under 3 are omitted from the evidence review.
This blog was originally published on the Nesta website; you can read it here.