(This is the third part – of three – of the talk ‘What I Know and Why I Know It’).
“When the facts change, I change my mind. What do you do, sir?” – John Maynard Keynes (not)
In ‘The Forest of Neurons’ analogy, young brains acquire new knowledge more readily than their elders’ because new connections are unimpeded by existing growth (or, in the abstract sense, existing knowledge). This makes the young brain agile but impressionable, compared with a mature brain that can be wise yet stubborn.
And this did not take into account the fact that the number of synapses varies greatly during development, which accentuates the difference. A neonate’s brain is estimated to have an average of 2,500 synapses per cortical neuron. This number then varies throughout development, eventually settling at around 8,000 synapses per neuron, having been at roughly twice that number at some stages in between.
This difference leads to a ‘cognitive bias’ – that information presented to us at an early age is weighted more than information presented in later life, and this then skews our judgement.
But the same skewing happens over a much shorter timescale as well, and was named the ‘anchoring and adjustment’ heuristic by Amos Tversky and Daniel Kahneman in their 1974 paper ‘Judgment under Uncertainty: Heuristics and Biases’. A classic example they gave was an experiment in which two groups of students were each given 5 seconds to estimate the result of a numerical expression.
- One group were given the expression 1 x 2 x 3 x 4 x 5 x 6 x 7 x 8 and they produced a median result of 512.
- The other group were given the expression 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1 and they produced a median result of 2,250.
The actual answer, of course, is the same for both groups (it is 8! = 40,320), but each group’s estimate was skewed towards the information presented to it first.
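Both expressions multiply out to the same number, whatever order the factors arrive in; a quick check:

```python
import math
from functools import reduce

ascending = reduce(lambda a, b: a * b, range(1, 9))       # 1 x 2 x ... x 8
descending = reduce(lambda a, b: a * b, range(8, 0, -1))  # 8 x 7 x ... x 1

# Multiplication is commutative, so both orderings give 8! exactly.
print(ascending, descending, math.factorial(8))  # 40320 40320 40320
```

Only the estimates differed; the anchoring came from the reader, not the arithmetic.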
This skewing from cognitive psychology is also present in bio-inspired artificial neural networks, where the ordering of the information in the training set can substantially alter the learned behaviour.
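To illustrate, here is a minimal sketch (my own, not from any particular study) of an online perceptron. Because the weights are updated one example at a time, presenting the same training set in a different order can leave the network with different final weights – an ‘anchoring’ of sorts on the early examples:

```python
def train_perceptron(samples, epochs=1):
    """Online perceptron: weights are updated one sample at a time,
    so the final weights depend on the order of presentation."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in samples:                    # y is +1 or -1
            activation = w[0] * x[0] + w[1] * x[1] + b
            if y * activation <= 0:             # misclassified: update
                w[0] += y * x[0]
                w[1] += y * x[1]
                b += y
    return w, b

# A tiny, linearly separable toy dataset (illustrative values).
data = [((2.0, 1.0), +1), ((1.0, 3.0), +1),
        ((-1.0, -2.0), -1), ((-3.0, -1.0), -1)]

w_fwd, b_fwd = train_perceptron(data)
w_rev, b_rev = train_perceptron(list(reversed(data)))

print(w_fwd, b_fwd)  # [2.0, 1.0] 1.0
print(w_rev, b_rev)  # [3.0, 1.0] -1.0
```

Both runs end up classifying the toy data correctly, but they settle on different weights purely because of the presentation order – the first example seen does disproportionate work.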
18. The Adaptive Toolbox
Kahneman and Tversky’s notion of ‘cognitive bias’ implies that human thought is deficient compared with a rule-based formal logic. The psychologist Gerd Gigerenzer has been a fierce critic of this view. He argues instead that humans use an ‘adaptive toolbox’ – a repertoire of ‘rules of thumb’ – to make good-enough decisions about the world with limited time, effort and information. Computer simulations have shown that these heuristics can outperform traditional optimization methods.
An example of a ‘rule of thumb’ is the ‘recognition heuristic’. In ‘Models of ecological rationality: the recognition heuristic’ (2002), Goldstein and Gigerenzer tested students on their knowledge of the populations of cities. When presented with the names of two cities, students needed to state which city had the greater population (it’s like facemash for Geography Bees). Surprisingly, American students scored higher on German cities and vice versa. Before saying which city had the greater population, the respondents needed to state which of the two they recognized. This allowed the experimenters to identify the questions in which only one of the cities was recognized. Goldstein and Gigerenzer surmised that the students were employing a ‘recognition heuristic’ – presume the city you recognize has the larger population, because you recognize it. This extremely simple strategy worked very well. Here is a case of ‘less is more’ – being less knowledgeable about the foreign cities actually helped. It may be using less knowledge, but it is using the information available more intelligently. As Goldstein and Gigerenzer say: ‘missing knowledge can be used to make intelligent inferences’. The heuristic is valuable because it is ‘fast and frugal’ – it doesn’t take much effort to come up with a reasonably good result.
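The decision rule itself is simple enough to sketch in a few lines. The city names and the recognition set below are purely illustrative, not Goldstein and Gigerenzer’s materials:

```python
def recognition_heuristic(city_a, city_b, recognized):
    """If exactly one of the two cities is recognized, infer that it
    has the larger population; otherwise the heuristic gives no answer."""
    a_known = city_a in recognized
    b_known = city_b in recognized
    if a_known and not b_known:
        return city_a
    if b_known and not a_known:
        return city_b
    return None  # recognition alone cannot discriminate

# A hypothetical recognition set for an American student:
recognized = {"Berlin", "Munich", "Hamburg"}

print(recognition_heuristic("Munich", "Bielefeld", recognized))  # Munich
print(recognition_heuristic("Berlin", "Munich", recognized))     # None
```

Note the ‘less is more’ effect falls out of the structure: a student who recognizes every city (or none) gets `None` on every pair and must fall back on other knowledge; partial ignorance is precisely what makes the heuristic applicable.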
This heuristic has been used to good effect in predicting the outcomes of Wimbledon tennis matches – where it compares well against the official ATP ranking system and the seeding of players by so-called experts.
Gerd Gigerenzer, former banjo player.
Kahneman and Gigerenzer’s arguments may be seen as complementary rather than just contradictory. Kahneman’s position is consistent with the view that our thinking falls short of the logic required by the 21st-century world (particularly in economics), whereas Gigerenzer’s position is consistent with the view that our thinking has evolved efficiently to cope with surviving in the Holocene.
19. Hedgehogs and Foxes
So, amateurs can outperform experts. And computers can too. Philip Tetlock’s 2005 book “Expert Political Judgment: How Good Is It? How Can We Know?” describes how, over a 20-year period, he analysed the predictions of a large number of political experts and then compared these with how things actually turned out. The experts fared poorly: worse than some fairly straightforward statistical computer algorithms. Among the many ways he looked at demarcating his group of experts, to see what could be done to improve their efforts, Tetlock split them into ‘hedgehogs’ and ‘foxes’, following Isaiah Berlin’s metaphor of ‘The Hedgehog and the Fox’ (1953). This is based on a text fragment attributed to the ancient Greek poet Archilochus (c. 650 BCE):
The fox knows many things, but the hedgehog knows one big thing.
Tetlock found that the ‘fox’ personality type was a better predictor than the ‘hedgehog’ type. (See also “Why Foxes Are Better Forecasters Than Hedgehogs”, 2007)
In the first part of the talk, I linked neuroscience to epistemology by equating a physicalist description of how the brain works in terms of the simultaneous combined effects of:
- The horizontal ‘pushing-together’ process in order to formulate an opinion, and
- The vertical ‘pulling-together’ process across the many levels of hierarchy, linking predictive models to the outside environment.
with the ‘foundherentist’ position within epistemology, that being an ongoing process of the combination of:
- a (horizontal) coherence with other knowledge
- a (vertical) correspondence with evidence
It is the simultaneous horizontal/vertical pushing/pulling that builds up a coherent view that corresponds to the external environment.
It’s a balancing act. If we steer too much towards coherence, we risk formulating ideologies that bear no resemblance to reality. If we steer too much towards correspondence, we risk our knowledge being a jumbled, incoherent mess. This looks rather like the distinction between ‘hedgehogs’ and ‘foxes’. The ‘hedgehogs’ rely too much on coherence. I can’t go as far as saying that ‘foxes’ rely too much on correspondence, only that they are less biased towards coherent ideologies and this allows them to make (slightly) better predictions.
20. Changing your Mind
So from the above, ‘Hedgehog’ personality types can be viewed as valuing increasing coherence at the expense of correspondence. The world of Academia can be seen as an ‘array’ of hedgehogs (this being the collective noun for them) as, frequently, academic originality is to be preferred over truth. Note that this individual disregard for truth may actually be a good thing. Whilst a solitary ‘fox’ might outperform a single ‘hedgehog’, this might well not scale to groups. An array of ‘hedgehogs’ is likely to throw up a more diverse range of ideas to explore than a ‘skulk’ of ‘foxes’ will, which will serve the group better.
There is something within us that wants to pull our ideas together to create some greater understanding, some greater ‘truth’. And in doing so, it will distance itself from the views of others that have had different experiences, particularly in their ‘formative years’.
In an epistemological as well as biological sense, we are living beings. Knowledge within us grows. But this necessarily entails long-term cognitive biases. And cognitive biases are apparent even in short time scales (‘anchoring’). When we are presented with new information that does not fit with our previous experience, we will tend to reject it. It is incommensurate. And this is exacerbated by the reduced plasticity of the brain as we age. As individuals, we have limits on how much we can change our mind.
We have to accept that we are far from perfectly rational beings (in an idealized/mathematical/logical sense). Why should we be? If we accept we are evolved, we must accept that we only need to have been effective (and for humans, we need only be effective in groups). Each of us is a walking, talking toolbox of pragmatic, cognitive tricks, many of them acting below our consciousness. We were not designed to be homo economicus; we evolved to react appropriately to immediate circumstances.
There is the well-known quote, normally attributed to John Maynard Keynes but almost certainly not originating from him:
“When the facts change, I change my mind. What do you do, sir?”
This sounds so reasonable! Why would we do otherwise? Of course, the difficulty is in accepting that the facts have changed; that the new facts are valid.
21. Between White and White
A better quotation is called for, to reflect the real problem.
The tragedy of human communication is that what I say is not what you hear.
What I say has meaning bound up in both the explicit words of what I say and the tacit knowledge on which it is founded. The words you hear get interpreted in terms of the tacit knowledge on which they are founded in your head.
Your received meaning is not the same as my transmitted meaning.
What I want to convey:

meaning_me = explicit + tacit_me

…and your interpretation of what I say:

meaning_you = explicit + tacit_you

Since:

tacit_me ≠ tacit_you

it follows that:

meaning_me ≠ meaning_you
This is different from the engineered applications of Shannon’s communication theory, such as mobile-phone or internet communications, where there is no ambiguity in the meaning of the message and hence no difference between the meaning transmitted and the meaning received. And the ambiguity in human communication isn’t just because we are frequently too brief, using too few words to ensure that the communicated message has been received correctly.
(This reminds me of the networking geek joke… “A TCP/IP packet walks into a bar and says ‘I want a beer’. The barman says ‘you want a beer?’ and the TCP/IP packet says ‘yes, a beer’.”)
This human tragedy is best summarized by Jacques Rancière (in ‘Disagreement: Politics and Philosophy’, 1998):
“Disagreement is not the conflict between one who says white and another who says black. It is the conflict between one who says white and another who also says white but does not understand the same thing by it.”