COVID-19: The presence and pitfalls of evidence-based policy

UCL Public Policy
Sep 13, 2023


This piece was originally published on the European Institute’s blog.

Jonathan Kamkhaji, Postdoctoral Research Fellow at the University of Exeter, and Claudio Radaelli, Professor of Public Policy at UCL, argue that the surprising nature of the coronavirus crisis means governments cannot base their response on scientific evidence alone.

Across Europe, governments have responded to the spread of the coronavirus in different ways, both in the timing of their regulatory responses and in their content. Most countries are trying to follow a public health model as the foundation of their regulatory choices. The UK and Sweden have arguably followed an initial response inspired by risk-based approaches.

These different regulatory choices, however, have one thing in common: they are presented to public opinion as grounded in scientific evidence. For all that we have heard about populism, attacks on science, and the death of expertise, this is a unique moment in which elected politicians communicate their intention to follow the template of evidence-based (science-based and risk-analysis-based) policy.

Has Covid-19 allowed European democracies to turn the tide against evidence-free policy? Do we now live in evidence-based times, then?

In the public debate on how to respond to the spread of Covid-19 there is a full range of sources of scientific evidence, including behavioural evidence and data on the capacity of public health systems. In turn, this scientific evidence comes from different strands of expertise and diverse disciplinary fields, such as immunology, microbiology and epidemiology.

Thus, we have to be clear on what we are saying.

We do not argue that political leaders are insincere about their intention to be evidence-based. Neither do we say that some governments are better or worse at gathering, handling and learning from the evidence they have: ours is not a comparative judgement. We do not know if the various responses of European governments are dictated by the rhythm and timing of the spread across countries, the pressure of business, the public opinion climate, the capacity of national health structures, or the legacy of coping with previous crises.

Yet, there is a particular aspect of this crisis that strikes us. It is this: we argue that under crisis conditions of surprise and radical uncertainty, policy responses cannot be based on lessons, principles or pathways inferred from evidence. This is the same as saying that there are moments when the model of inferential learning does not work. And this happens notwithstanding the continuous claims and genuine efforts of governments to resort to the best scientific evidence.

Our claim goes beyond the common observation that during a crisis heuristics matter, and that public decisions are affected by different types of heuristics-induced bias. Rather, we argue that the regulatory choices of these weeks are driven by fast-paced mechanisms of contingent learning that associate a range of stimuli with responses.

Contingent learning is not inferential learning — the latter being learning from the observation of evidence, a mental activity of evidence-based reflection and change of beliefs and priors.

In our previous work on the crisis of the Euro we demonstrated that the correct causal identification of fast-paced crises puts behavioural change before inferential learning, that is:

Crisis -> causes behavioural change without change of priors or beliefs -> feedback -> inferential learning

Instead of the classic:

Crisis -> triggers inferential learning based on evidence -> learning causes change

We argue this on the basis of the knowledge we have accumulated in the disciplines of psychology and behavioural economics. In change-or-die situations, individuals (and by extension, governments) have ‘to do something’. This behavioural change, however, is not a result of learning from evidence — be it evidence about scientific facts, policy capacity, or compliance with risk management decisions. Even if there is a lot of evidence on the table, it’s the fast-paced process of cue-outcome associations that drives the changes — not inferential learning.

Responses have to be produced quickly. Some work, and the regulatory choice is therefore confirmed. Some do not work, and the response to the challenge is altered or modified in its intensity, strictness or regulatory foundation. Once a sufficient number of episodes over time show that the regulatory response is correct, and feedback is sufficiently robust, knowledge is codified in a set of protocols, ways of doing things and, ultimately, experiential lesson-drawing. Only at that moment do we see the change of beliefs and priors associated with inferential learning, or learning from evidence.

Evidence-based learning then comes at the end of the causal chain. What we see in the earlier phases of the chain is much more contingent. There is no time to distil the lesson that really matters from competing, even abundant, sources and types of evidence. The world today is a laboratory in which governments probe regulatory responses. The key is to accumulate solid evidence on the feedback process quickly, to discriminate between different feedback signals, and to identify the conditions under which a given response works (confirmatory feedback).
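To make the mechanism concrete, here is a deliberately simple toy sketch in Python. It is ours alone, not a model of any government: the response labels, success probabilities and thresholds are entirely hypothetical. A response is kept while feedback confirms it, switched when it does not, and beliefs are updated only after enough confirmatory episodes have accumulated.

```python
import random

# Toy "win-stay / lose-shift" contingent-learning loop.
# All names, probabilities and thresholds below are hypothetical.

RESPONSES = ["strict_suppression", "risk_based_mitigation"]
TRUE_SUCCESS_RATE = {"strict_suppression": 0.7, "risk_based_mitigation": 0.4}  # unknown to the learner

def episode(response: str) -> bool:
    """One policy episode: True if feedback confirms the response."""
    return random.random() < TRUE_SUCCESS_RATE[response]

def contingent_learning(n_episodes: int = 20, codify_after: int = 5):
    response = random.choice(RESPONSES)    # act first: "do something"
    confirmations = 0
    beliefs_updated = False                # priors change only at the end of the chain
    for t in range(n_episodes):
        if episode(response):
            confirmations += 1             # confirmatory feedback: stay with the response
        else:
            confirmations = 0              # disconfirming feedback: alter the response
            response = next(r for r in RESPONSES if r != response)
        if confirmations >= codify_after and not beliefs_updated:
            beliefs_updated = True         # only now: codification and evidence-based lesson-drawing
            print(f"episode {t}: '{response}' codified; beliefs updated after behaviour changed")
    return response, beliefs_updated

random.seed(1)
print(contingent_learning())
```

The point of the sketch is only the ordering: behaviour changes with every disconfirming episode, while beliefs (the codification step) change once, late, and only if feedback has been sufficiently robust.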

It is a bit like throwing a child into the sea for the first time without any prior swimming lessons. The child will try something. By associating certain movements with certain outcomes (staying afloat or not), the child will select the right response and keep going with it. There is no time to ask for a swimming lesson or a handbook of different swimming styles. If nothing works, the child will call for help, something not available with Covid-19, where the deus ex machina does not exist. Also, the sea may become calmer or rougher, which will destabilize the learning process about the right movements.

It is exactly the lack of experience of the relationship between a given stimulus (the virus) and an outcome that generates what we would call surprise. Resilient belief systems may not be a major hindrance to change. Surprise may trump priors (narrow considerations about elections, political strategy, interests) via self-reinforcing mechanisms of association between new stimuli and outcomes.

Today, and soon tomorrow with their 'hammer and dance' strategies, European governments are trying a range of responses in a laboratory called Europe, with high interdependency among actors, structural and strategic uncertainty, fragmented information and (so far) poor feedback, because it takes time to establish that feedback has confirmed that choice A is correct and choice B is not. Of course, the way governments read feedback and learn is coloured by political preferences, institutional legacies and heuristics. But we expect that evidence-based inferential learning will only take place after change, provided that feedback is sufficiently strong to anchor the lesson learned.
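As a back-of-the-envelope illustration of what 'sufficiently strong', non-random feedback might mean in practice, one could ask how likely a response's record of confirmatory episodes would be under pure chance. The sketch below is ours alone, with hypothetical numbers and a simple one-sided binomial tail; it is not a claim about how any government actually reads feedback.

```python
from math import comb

def p_value_at_least(k: int, n: int, p0: float = 0.5) -> float:
    """One-sided binomial tail: probability of k or more confirmatory episodes
    out of n under pure chance (chance success probability p0)."""
    return sum(comb(n, i) * p0**i * (1 - p0)**(n - i) for i in range(k, n + 1))

# Hypothetical feedback record: 9 confirmatory episodes out of 12.
k, n = 9, 12
p = p_value_at_least(k, n)
print(f"{k}/{n} confirmatory episodes; probability of at least {k} by chance = {p:.3f}")
if p < 0.05:
    print("Feedback looks non-random: provisionally treat the response as confirmed.")
else:
    print("Feedback could still be noise: keep probing.")
```

Even an apparently good record (here, nine confirmations out of twelve) can be consistent with chance, which is one reason why feedback takes time to become a usable anchor for learning.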

What are the implications for the situation we are in? We should in any case rightly insist on public access to the evidence and science on which regulatory decisions are taken: this is our right to know in democratic political systems. But our attention should be turned to identifying, correctly and as early as possible, the responses that seem to be confirmed by non-random feedback. We can gain usable knowledge by identifying and validating the scope conditions under which the crisis-response-feedback-learning chain works, as well as by rejecting the probes that are not validated in our European laboratory. In the meantime, we hope that at least one child will stay afloat, because no friend or relative can or will come to the rescue.

______________________________________________________________

More about the Authors

Claudio M. Radaelli is a Professor of Public Policy at UCL and a Fellow of the European Institute.

Jonathan C. Kamkhaji is a Postdoctoral Research Fellow at the University of Exeter.

Note: The views expressed in this post are those of the authors, and not of the UCL European Institute, nor of UCL.
