The review considered almost 700 policy evaluations and evidence reviews from the UK and other OECD countries.
It found 23 impact evaluations that met the Centre’s minimum standards. This is a smaller evidence base than for our first review (on employment training), although it may still be larger than the evidence base for many other local economic growth policies. It is, however, very small relative to the evidence available in some other policy areas (e.g. medicine, aspects of international development, education and social policy).
Overall, of the 23 evaluations reviewed, 17 found positive programme impacts on at least one business outcome. Four evaluations found that business advice did not work (i.e. had no statistically significant effects), and two found that business advice might be harmful.
Once we dig into the detail, some key messages emerge. As LEPs roll out their Growth Fund allocations, it will be important to keep these in mind.
First, we found that business advice programmes show consistently better results for productivity and output than they do for employment. This may be because it’s genuinely easier to help firms raise their productivity. It’s possible, however, that productivity increases come first with employment gains seen in the longer run. Either way, this suggests that in the short run business advice programmes are more likely to increase productivity than employment.
Second, we also found some evidence that programmes which used a hands-on, ‘managed brokerage’ approach may perform better than those using a light touch approach (although this conclusion is based on only one direct comparison study). Taken at face value, this suggests that a strong relationship and a high level of trust between advisor and client may be important to the delivery of positive programme outcomes. It is not clear, however, which of these two approaches is more cost-effective.
Third, there are many debates on which the evidence is inconclusive. In most cases, programmes had vague or multiple objectives, which makes measuring success difficult. We found no evidence to suggest that one level of delivery – national or local – is more effective than the other. It is similarly difficult to reach any conclusions about the effectiveness of public-led versus private-led delivery.
As should be clear from this review, there are many things that we still do not know about the effectiveness of business support. Much of the policy debate focuses on very broad questions about the institutional structures that are put in place to support businesses. Yet overall the evidence provides no clear steer on whether one particular type of delivery model (public / private; national / local) is more effective.
To help improve business advice programmes, we would like to see far more focus on robustly evaluating the impact of particular aspects of advice programmes and comparing their cost-effectiveness. For example, the costs of light touch versus more intensive support vary dramatically, yet we found only one evaluation that directly compared the effectiveness of these two types of support. Similarly, only five of the 23 shortlisted studies included cost-benefit analysis that assessed cost-effectiveness, and not all of these used measures that are comparable across studies. There is a clear need for more, and more consistent, cost-benefit analysis in business advice impact evaluations.
We believe that further evaluations of this kind, involving, for example, the provision of different types of advice to similar firms, should be a priority for improving our understanding of what works in business advice. Local flexibility that allows for greater experimentation provides an ideal opportunity to undertake such evaluations. We are starting conversations with a number of LEPs and local authorities about this. If you are interested in helping us experiment in this area and improve our understanding of what works, please get in touch.