I have pointed out in this series that there is an increasing level of complexity involved in making policy decisions. As Anthony Giddens pointed out, authorities operating with incomplete information are always in a tension between being seen as scaremongers if they push an issue too hard, or being accused of cover-up if they fail to push soon enough and hard enough. I have also attempted to show that there are powerful stakeholders on all sides of scientific issues. The idea of scientists as the objective and impartial arbiters of truth is a myth. I have also sought to demonstrate that the idea of consensus as a guiding force is illusory. There are two more concerns I have about how we deal with policy on these complex issues.
First, economist Arnold Kling published an article last month called "Two Strategies for Avoiding the Truth." Kling lays out what he calls a low-investment strategy and a high-investment strategy for avoiding truth. He writes concerning the low-investment strategy:
The general public follows what I would call a "low-investment" strategy for avoiding the truth. They do not know the names of their representatives. They do not know the difference between a Sunni and a Shia. They do not know the approximate size of the Budget deficit or its outlook. And so on.
Ilya Somin, in his contribution to the Critical Review volume, points out that there is no particular reason for citizens to make a large investment in learning facts or forming coherent beliefs about political issues. The low probability that your vote will make a difference makes for an adverse cost-benefit calculation from obtaining information.
Let us be clear here. Kling is not necessarily faulting the public. It is rational to invest your time and energy in those matters you can influence, since you cannot be knowledgeable about everything and influence every issue. To some degree, this reality underpins our republican-democratic political system. We theoretically elect representatives to more closely scrutinize public policy issues and represent our interests accordingly.
The high investment strategy for avoiding the truth is a strategy that involves putting “considerable effort into emphasizing facts and arguments that support [an] overall position, while ignoring conflicting evidence.” Kling uses Rush Limbaugh and Paul Krugman as opposing examples of this strategy.
They know the facts about the structure of the American political system and the identities of major office-holders. They understand the connections between various beliefs. They maintain consistent positions, and their opinions are highly predictable, unlike the unstable, random positions that show up in polling of the mass public.
Limbaugh and Krugman may not necessarily be wrong (although it is hard for both of them to be right). However, both follow strategies that are designed to reinforce prior beliefs of conservatives and liberals, respectively. They highlight information and arguments that support their prior beliefs. When they encounter contrary evidence, they engage in "motivated skepticism," seeking to undermine the credibility or minimize the significance of the adverse information.
In fact, one could argue that Limbaugh and Krugman do not have wisdom that exceeds that of the ignorant public. However, while the typical individual's rationalizations of his or her beliefs are illogical and ill-informed, Limbaugh's and Krugman's rationalizations are clever and erudite.
Clearly, this strategy can be effectively executed only by a small number of elites with the time and resources to devote to it.
We only learn according to what we already know, and these elites have identified a core set of beliefs among their followers. They have then built a coherent package of policy positions that allows the common citizen to hold a coherent view and "feel" informed. Very often this "package" has not been arrived at by a citizen wrestling with the intricacies of the issues. Policy positions are adopted merely because they seem to "hang together," and the elite leader reinforces that perception. As a consequence, positions on some issues that might once have been fungible or more nuanced are made paramount to keeping the package together. This increases the number of issues upon which politicians and the public become divided. The phenomena of low-investment and high-investment strategies are not unique to complex issues involving scientific data, but they do shape the lens through which people are willing to view and consider data.
My second concern is with "Bulverism." Bulverism is a term invented by C. S. Lewis. Lewis claims that Ezekiel Bulver (a fictional character) was the founder of our modern form of dialog and debate. When Ezekiel Bulver was a small boy, he overheard an argument between his father and mother. His father was arguing that the length of two sides of a triangle added together would always be longer than the length of the remaining side. Finally his mother exclaimed, "You just say that because you're a man!" That ended the argument. The young Bulver learned that it was not necessary to refute or address an opponent's facts or reasons. One merely had to assume one's opponent was silly, concoct a plausible explanation for how the opponent became so silly, and then make that the subject of debate. These "plausible explanations of silliness" usually fall into one of two categories: A) the opponent is not quite mentally right or, more typically, B) the opponent has a concealed agenda (frequently malicious) that causes him or her to act so silly.
While it is unavoidable for us as human beings to make assumptions about the motives of opponents, debating motives rarely leads to productive insights about a policy issue. I suspect that Bulverism most often comes into play when those of us employing low-investment strategies find ourselves without sufficient factual data to protect what we believe to be essential positions. It also tends to be a common countermeasure when we have personally been "Bulverized." In any case, Bulverism, as ubiquitous as it is, does not lead us to sound policy considerations.
So what might help us to make more sound policy decisions in a context of complexity?