On Trusting Facts, Arguments, Experts

For one reason or another, the year 2014 saw the series of tubes known as “the internet” confront me with numerous articles questioning the efficacy of persuading people by facts and rational arguments. For example, a left-wing columnist for a major German online publication claimed (based on personal experience, without empirical evidence) that “perhaps two percent of humanity can recognize the limits of their own mental capacities” and can thus be convinced by somebody else’s points to change their mind on something. And blogger David Friedman linked to a paper asserting (with empirical evidence) that if you group people by their political (or religious) views, those whose views predispose them to believe in a scientific claim become more likely to do so the more scientifically educated they are, while those whose views predispose them to disbelieve that claim become more likely to reject it the more knowledgeable they are. Regardless of where you stand on some specific debate (the examples given are creationism and global warming), the most obvious – and depressing – explanation is that people do not use their increased knowledge of a subject to find out the truth, but rather to come up with better justifications for what they wanted to be true in the first place. My third example is a widely reported study finding that “anti-vaxxers”, who spread the (false) claim that vaccines cause autism in children and seek “natural” alternatives, can become more convinced of their position after being confronted with evidence to the contrary.

This would seem to imply that it is a waste of time to try and argue a point to people who are already convinced of the opposite. Instead, like a savvy politician, someone who wants to convince others of their ideas should appeal exclusively to undecideds and to their own “base”. Some people may also take it to confirm their longstanding suspicion that their opponents are fools and fanatics, while their own side of an argument is so hard to convince because it is, in fact, right. The conclusion this might lead some of them (or, judging by the current state of the internet, a great many of them) to is to abandon rational discourse entirely and to see public debate as a struggle for power, where discrediting opponents and generating outrage are the winning strategies. This would also mean that if a majority of society is wrong about something, and perhaps deeply convinced of that wrong something, it is close to impossible for them to recognize it.

While that kind of cynicism is certainly true to a degree, I still don’t quite buy it. My beliefs (which I currently hold without empirical evidence, so I am just mouthing off wild conjectures here) are that a) the people who are reluctant to engage with facts contradicting their views (i.e., most of us) may be irrational, but not quite as irrational as it seems, and b) rational argument may not be able to convince someone of the wrongness of their entire worldview within a week, but it may achieve more modest goals, provided it addresses the reasons people are so unwilling to change their minds. Let me work through both of these points in turn:

a) Imagine that one weird friend of yours gave you a lengthy talk about how the moon landing was faked, some Vatican bank scandal 12 years ago was orchestrated by aliens, the white sauces in restaurants are manufactured from sperm, or any other conspiracy theory. Said friend might cite an impressive number of technical facts, contradictions in official reasoning, world-renowned experts who support one or another of his claims, etc. You might not be able to refute them; in fact, if you decide to fact-check him, you might find several of them to be correct. Yet, if you have what is referred to as “common sense”, you will not usually jump straight to believing in some crazy Illuminati-Nazi-zombie-and-white-cat-controlled plot that secretly runs the globe. But this means that you are doing something that, on the face of it, would seem irrational: you disbelieve something in spite of a preponderance of all the evidence and rational arguments you are familiar with suggesting it is true, and you will probably continue to regard this as a reasonable position. How can that be? Well, perhaps because the argument has some hidden premises:

  • You need to trust your friend to give a balanced and correct presentation of the issue.
  • You need to trust any expert/journalist/whatever cited to give an accurate and well-reasoned assessment of the facts.
  • In potentially accepting some conspiracy theory, you need to trust yourself not to overlook something crucial, like a fatal flaw in the technical argument for why the light reflections on the moon capsule must have been staged.

And these “trust issues” are, presumably, a significant part of the problem you would have with accepting your friend’s view. Moreover, you were already alive before his attempt to enlighten you about the dark side of the world, and thus are likely to have a sense of how probable or improbable certain events are, such as a massive plot involving a minimum of thousands of people still somehow managing to remain secret. And you might agree with the notion that “extraordinary claims require extraordinary evidence”, which immediately implies that the amount of evidence you would consider sufficient for an “ordinary” claim would not suffice to convince you of a more outlandish one. Just compare how likely you would regard a friend’s story of recently having met 1) another acquaintance who lives in the same city, 2) a famous person of worldwide celebrity status or 3) a magical fairy, based on the same evidence of said friend telling you that story in each case. Your Bayesian priors for these three possibilities are simply different.
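To see how much those priors matter, here is a minimal sketch in Python (every probability below is a made-up, purely illustrative assumption, not data from anywhere): it applies Bayes’ rule to the same piece of evidence – your friend telling the story – under three very different priors.

    # Bayes' rule: P(claim | story) = P(story | claim) * P(claim) / P(story),
    # where P(story) = P(story | claim) * P(claim)
    #                + P(story | not claim) * P(not claim).
    def posterior(prior, p_story_if_true, p_story_if_false):
        """Probability the claim is true after hearing the friend's story."""
        p_story = p_story_if_true * prior + p_story_if_false * (1 - prior)
        return p_story_if_true * prior / p_story

    # The evidence is identical in all three cases. Assumed (illustrative)
    # numbers: the friend would tell the story with probability 0.9 if it
    # happened, and with probability 0.01 if it did not.
    for claim, prior in [("met a local acquaintance", 0.1),
                         ("met a world-famous celebrity", 1e-4),
                         ("met a magical fairy", 1e-12)]:
        print(f"{claim}: prior {prior:g} -> "
              f"posterior {posterior(prior, 0.9, 0.01):.3g}")

Under these assumed numbers, the identical report makes the acquaintance story near-certain (posterior ≈ 0.91), leaves the celebrity story merely plausible (≈ 0.009), and leaves the fairy hypothesis about as unbelievable as before (≈ 9 × 10⁻¹¹); only evidence with a far larger likelihood ratio could rescue the extraordinary claim.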

Now note that these two problems apply not just to paranoid conspiracy theories, but to any conceivable topic of discussion. People already go in with preconceived ideas of which answers are probable and which aren’t – which is, to an extent, a feature rather than a bug of human thinking, and, to that same extent, a perfectly rational thing. And people have to make decisions on whom to trust and whom not to. These two phenomena are not quite separate, but I single out the trust problem because I regard it as more crucial to many hot-button issues than any other part of someone’s rational or irrational prejudices. This is particularly true when expert opinions are cited, a point I want to focus on here because it is the rule rather than the exception that our real or imagined factual knowledge is mediated by some kind of experts or academic scholars. (E.g., how do you know about the theory of evolution? The answer will be something like “from my 10th-grade biology textbook, written by professional biologists, selected by professionals working for a more or less renowned publisher and approved by other professionals working in the educational system”.)

To drive this point home, just look at the three examples cited in the first paragraph: in debates about creationism, global warming scepticism and anti-vaxxing, opponents of these three viewpoints frequently cite the “overwhelming consensus” of the relevant scientists against them as a refutation. But that betrays an almost stunning inability to understand ideas other than one’s own: a person simply wouldn’t be a creationist, global warming sceptic or anti-vaxxer if she didn’t believe she had sufficient grounds for distrusting the expert opinion in the first place, and quite possibly the integrity of the scientific or scholarly process in an entire field. The point here isn’t whether any of these three positions is true (for what it’s worth, I personally think only one of them has even the slightest chance of being correct, and it’s not the anti-vaccine one); it’s that this sort of objection rests on unexamined premises the addressee may not share, and it is thus absolutely no wonder that it frequently falls on deaf ears.

It is important to point out here that this kind of distrust is not necessarily stupid or without merit. Most facts that reach us pass through filters similar to those of the aforementioned conspiracy theorist’s arguments, and your belief in them requires your trust in each successive one. Now, even in the “hardest” of sciences, e.g. physics, professional consensus can be wrong, possibly distorted by groupthink and laziness. There can be pervasive political biases in a field, and it may be possible to cherry-pick from an overall inconclusive state of research. And these are just (some) reasons why statements like “The majority of experts agree on…” or “This renowned scholar widely respected in his field believes…” may in fact warrant some legitimate scepticism. It goes downhill from there: the media may present someone to you as an “expert” who is actually a crank or charlatan, academic degree notwithstanding, or who can legitimately be suspected of having an axe to grind. (Ironically, the latter point brings us back to vaccine scepticism.) More subtly, someone who has studied a particular subject does not automatically know all of its subfields very well and might himself have significant misconceptions about them. And even when there legitimately is agreement among experts on something, some interested parties may overstate the degree of that agreement.

b) This defense of “rational stubbornness” is of course not to say that the above reasons should provide an excuse for defending an untenable position in the face of arbitrarily strong evidence to the contrary, or that people only refuse to change their opinion on something out of completely reasonable motives. Certainly, one reason why someone might be impervious to new arguments or facts is that their preconceived notions do something for them: they provide meaning and stability in their life (which includes all cases where someone looks back on years of activism, academic work, or even a paid job dependent on holding a certain view), or they are useful or even necessary for acceptance in their social group (ask yourself: which of your views couldn’t you change without losing half of your friends?). And there may be less high-minded causes, like losing profits if an environmental regulation is implemented, but I submit that most motives are more sympathetic than that. Still, just like the above reasons for distrusting experts can be misconstrued to provide a justification for completely irrational insistence on one’s own wishful thinking, the recognition that there are “ulterior motives” can be, and frequently is, misused by asserting that, yes, people with viewpoints other than one’s own have such reasons, while one holds one’s own views on exactly the grounds one says one does. It is the hallmark of the political ideologue to even have a specialized vocabulary designed to dismiss his or her opponents, and everything they might have to say. I, on the other hand, would maintain that being “good at reading people” and “knowledge of human nature” are imaginary friends, and not just in the kind of cases discussed here.

Thus, whether the reasons why people stand by their opinions when presented with evidence to the contrary are rational or irrational, they are always at least “clandestinely” part of the argument: either they remain unstated and unexamined, or one side alleges that the other has hidden motives of the irrational kind in order to discredit it. If we want a healthier way of doing things, we could demand that all of us a) think about conceivable reasons why others might distrust any facts we cite in support of our own opinion, and b) think about conceivable reasons why someone would regard our viewpoint as intrinsically unlikely (in fancier language: might have other Bayesian priors than we do). Then, we should consider the possibility (very real, as we saw above) that there is some merit to these reasons before we try to c) directly address them while d) also trying to show that we take them seriously. That would mean that, e.g., no argument against anti-vaxxers is complete without some specific discussion of the grounds on which these people distrust the medical establishment. Furthermore, we might demand of ourselves and others to e) be very explicit that we are talking about possible reasons (and can’t actually presume to know what motivates others) whenever we try to address conceivable personal and social reasons why they might be unwilling to change their minds on an issue. And I certainly do agree that talking about these kinds of motives is important. As a personal example, I had known about sceptical arguments against religion throughout my teenage years, and I knew that some of them were very hard to refute. But it was only after I had gotten over the notion that without belief in a deity, life wouldn’t make sense anymore that I was ready to seriously consider agnostic or atheist views.

But most importantly, f) one should remember that people do not typically change their views in the course of a single conversation, or after reading a single clever article attempting to refute them. A statement that may seem self-evident, but is also frequently ignored. Yet, weeks, months or even years after person A has presented person B with a lot of information and statistics showing that we need more hamsters in public life, and person B has just shaken his head and refused to see how strong A’s case was, B may finally reach a state where he is ready to approach the issue with an open mind, and then A’s words could easily come back to him. But even initially, A has the possibility to g) encourage B to learn something from her points and refine his position, without necessarily accepting her basic point of view.

However, all these would be very challenging demands indeed, and so far “we” (i.e., I and my imagined readership) have mostly been presumptuous enough to see ourselves in the position of the perfectly reasonable person desperately using facts, statistics and arguments to stem the tide of other people’s obstinacy. Needless to say, this picture is likely wrong, and few people will be truly immune to unfairly dismissing evidence that contradicts their beliefs. That raises the question of what people can do to keep a more open mind themselves, and many people have thought about it (full disclosure: I have never gotten around to reading much “Less Wrong”, but I have heard it’s interesting). Yet to apply such strategies stringently, and to always and consistently follow the above suggestions whenever you argue against other views, you would have to be some kind of rationalist saint. I will try my best, but I myself should probably not promise to be one in all future blog posts. Even so, I think it is worth remembering that, in principle, we can use such strategies and perhaps (once again, I am guessing without evidence here) make progress with them if we just occasionally give them a try. And if we don’t use them, maybe we should be more careful about believing that people are stupid, pigheaded or evil for not buying what we are pointing out to them, even when it seems to us that we are arguing something pretty obvious.

And here is a panda, obviously the greatest animal in the universe:

(besides cats)
