… especially if you are dealing with abstract or complex problems. We do have something of a tendency to fall into misconceptions and cognitive errors through “conscious theorizing.” That’s quite a mouthful, so let me explain…
On a cold winter day many years ago, I made a long drive with a colleague, a Ph.D. mathematician. Our discussion turned to the human tendency to fall into various cognitive pitfalls. As we drove along, an example I used was how people can get confused between the outside temperature and the subjective experience of hot or cold. He wasn’t sure what I was getting at, so I asked him whether he thought the thermometer on a moving car would show a lower temperature than if the car were stationary. To my surprise, he was quite convinced that it would be colder if the car were driving fast. In this case, my friend clearly confused the objective reality of the outside world with his own subjective experience of that reality. We feel colder in the wind because the flow of cold air removes heat from our bodies faster. But what surprised me the most was how strongly he clung to the idea that wind somehow lowered the ambient temperature. After much discussion, he allowed that he might be mistaken, but still wasn’t convinced by my various jabs at explaining the misconception. That exchange highlighted the fact that we are all liable to misconstrue certain aspects of objective reality. Generally, the more intangible or abstract the subject matter, the more ways we have to reach wrong conclusions.
In his bestseller, “How the Mind Works,” Steven Pinker cites research that shows just how easily we err when conceptualizing certain types of problems. For example, psychologists McCloskey, Caramazza, and Green asked college students to describe the trajectory of a ball shot out of a curved tube. A “depressingly large minority” of students, including many who studied physics, guessed that the ball would continue in a curving path, and were even quite prepared to provide the “scientific” explanation for this [1]. Dennis Proffitt and David Gilden asked people simple questions about the motion of spinning tops, wheels rolling down ramps, colliding balls, or solid objects displacing water. They found that even physics professors often got their answers wrong unless they were allowed to fiddle with equations on paper. Pinker points out that errors tend to arise from what he calls conscious theorizing. When respondents were shown animated illustrations of their answers, they instantly recognized their errors, usually with a burst of laughter [2].
But if conscious theorizing can get us lost in problems as simple as the motion of objects in the physical world, how confident should we be about our comprehension of more complex problems? In “The Language Instinct,” Pinker provides an illuminating example from the field of early artificial intelligence research [3]. In the 1970s and 1980s, scientists at some of the leading American universities spent tens of millions of dollars attempting to solve the mystery of language and to enable computers to speak. They based their solutions on the notion that language is a discrete combinatorial system (a finite number of words and a finite number of rules about how to form sentences), and advanced the concept of the word chain device. Word chain devices would construct sentences by selecting words from different lists (nouns, verbs, prepositions…) based on a set of rules for going from list to list. At the time, some psychologists believed that all human language arises from a huge word chain stored in the brain. In their efforts to generate language artificially, scientists painstakingly calculated the probabilities that certain words would follow certain other words in the English language, and they built huge databases of words and transition probabilities. The following “sentence” is an actual example of what they got out of all that hard work:
“House to ask for is to earn our living by working towards a goal for his team in old New-York was a wonderful place wasn’t it even pleasant to talk about and laugh hard when he tells lies he should not tell me the reason why you are is evident.” [4]
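To make the mechanics concrete, here is a minimal sketch of a word-chain generator in Python. It is only an illustration of the idea: the toy corpus, the single-word context (note [4] indicates the actual model conditioned on four-word sequences) and the helper names are illustrative assumptions, not a reconstruction of the original research systems.

```python
import random
from collections import defaultdict

# Toy corpus; the real systems were built from huge databases of words
# and transition probabilities.
corpus = (
    "house to ask for is to earn our living by working towards "
    "a goal for his team in old new york was a wonderful place"
).split()

# Record which words follow which. Keeping duplicates preserves the
# frequencies, so random.choice below samples in proportion to them.
transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)

def generate(start_word, length=12):
    """Grow a 'sentence' word by word, choosing each successor at random
    from the words that followed the current word in the corpus."""
    words = [start_word]
    for _ in range(length - 1):
        followers = transitions.get(words[-1])
        if not followers:
            break
        words.append(random.choice(followers))
    return " ".join(words)

print(generate("house"))
```

Each word is locally plausible given the one before it, but nothing ties the string as a whole to any meaning; that is precisely the failure mode of the quoted “sentence” above.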
The whole magic ingredient of meaning never made it into these clever models. It is easy for us to recognize the gibberish flowing out of word chains because our brains were designed to process language and detect the meaning it conveys. What our minds were not designed to do is process reams of quantitative data about complex superstructures like economies and markets. We could say the same about other complex systems that affect our lives, including the climate. In such domains, we are not equipped to easily discern sense from nonsense, and this can lead us to countless intellectual errors. For example, we are susceptible to confusing correlation with causation. If some event B follows event A 90% of the time, we tend to assume that there’s a 90% probability that B will follow the next occurrence of A. In complex domains, this is often not the case. We also have difficulties interpreting probabilities, and this includes the supposed experts. Consider the following example [5]: at Harvard Medical School, researchers posed a problem to 60 students and members of the faculty. The problem read as follows: a test to detect a disease that afflicts one person in a thousand has a 5% false positive rate. What is the probability that a person found to be positive actually has the disease? The correct answer to this problem is approximately 0.02. The most popular answer was 0.95, and the average answer was 0.56. Among the experts in this group, fewer than one in five got the right answer [6]. It’s this manner of confusion that led Stephen Jay Gould to state that “misunderstanding of probability may be the greatest of all impediments to scientific literacy.”
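For readers who want to see where 0.02 comes from, the calculation below spells it out with Bayes’ rule. This is a minimal sketch assuming, as the puzzle is conventionally read, that the test catches every true case of the disease; the problem statement does not say so explicitly.

```python
# Figures taken from the problem as stated above.
prevalence = 1 / 1000        # one person in a thousand has the disease
false_positive_rate = 0.05   # 5% of healthy people test positive
sensitivity = 1.0            # assumption: the test misses no true cases

# Bayes' rule: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))
p_disease_given_positive = sensitivity * prevalence / p_positive

print(round(p_disease_given_positive, 3))  # prints 0.02
```

In a population of 1,000 people the test flags roughly one true positive and about fifty false positives, so only about one positive result in fifty actually signals the disease. Framed that way, it is easy to see why 0.95 is the intuitive but wrong answer.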
The human mind at investing
This is all highly relevant to trading and investing. In these activities, as in other domains of human endeavor, our brain is the foundation of all that we can comprehend about the outside world. But our brain’s “cerebral organization is primarily devised so that it secures survival of the individual in its natural surroundings [7].” Given that modern humans have walked the Earth for some 200,000 years and that financial markets did not exist during more than 99% of that time, it is clear that our cerebral organization could not have evolved the faculties to make the individual successful at investment speculation. Indirect proof of this is the well-documented fact that the vast majority of investors underperform market benchmarks. So, too, are the frequent trends and bubbles that distort asset prices far away from rationally justifiable levels.
This is not to suggest that we should all give up and accept that we’re doomed to perpetual underperformance. It is true that our brains haven’t evolved to comprehend modern markets and economics. But it is also true that they never evolved the faculty to paint masterpieces, write novels or compose symphonies, and yet some humans have accomplished those feats with astounding mastery and inspiration. What we know about such individuals is that their accomplishments took shape on the back of years of concerted effort and near-total devotion to their craft. This is what we, as investment managers, aspire to: building up our intellectual capital through experience, constant research, clear thinking, and disciplined adherence to our strategic objectives. At times, this can be far more difficult than it looks. I can think of no more pertinent illustration of this than Stanley Druckenmiller’s experience through the dot-com bubble.
Here’s a reminder of that story: while working for George Soros in 1999, Druckenmiller accumulated a significant short position in internet stocks he believed were stupidly overvalued. He was right, of course, but Nasdaq’s meteoric rise eventually made him blink, cover his shorts and join the bulls on the long side. Shortly thereafter, the dot-com bubble burst and 75% of the internet stocks Druckenmiller had shorted eventually went to zero. The rest of them fell between 90% and 99% [8]. Instead of making an absolute killing in 2000, Druckenmiller ended up losing 20%. Now, here’s how Stanley Druckenmiller himself described this experience. Answering a question about what he thought was the biggest mistake of his career and what he’d learned from it, he said [9]:
“… in 1999 after Yahoo and America Online had already gone up like tenfold, I got the bright idea at Soros to short internet stocks. And I put 200 million in them in about February and by mid-March the 200 million short I had lost $600 million on, gotten completely beat up and was down like 15 percent on the year. And I was very proud of the fact that I never had a down year, and I thought well, I’m finished.
So the next thing that happens is I can’t remember whether I went to Silicon Valley or I talked to some 22-year old with Asperger’s. But whoever it was, they convinced me about this new tech boom that was going to take place. So I went and hired a couple of gun slingers because we only knew about IBM and Hewlett-Packard. I needed Veritas and Verisign. … So, we hired this guy and we end up on the year – we had been down 15 and we ended up like 35 percent on the year. And the Nasdaq’s gone up 400 percent. So I’ll never forget it. January of 2000 I go into Soros’s office and I say I’m selling all the tech stocks, selling everything. This is crazy. [unintelligible] This is nuts. Just kind of as I explained earlier, we’re going to step aside, wait for the next fat pitch. I didn’t fire the two gun slingers. They didn’t have enough money to really hurt the fund, but they started making 3 percent a day and I’m out. It is driving me nuts. I mean their little account is like up 50 percent on the year. I think Quantum was up seven. It’s just sitting there.
So like around March I could feel it coming. I just – I had to play. I couldn’t help myself. And three times during the same week I pick up a – don’t do it. Don’t do it. Anyway, I pick up the phone finally. I think I missed the top by an hour. I bought $6 billion worth of tech stocks and in six weeks I had left Soros and I had lost $3 billion in that one play. You asked me what I learned. I didn’t learn anything. I already knew that I wasn’t supposed to do that. I was just an emotional basket case and couldn’t help myself. So, maybe I learned not to do it again, but I already knew that.”
Two months after he told George Soros that tech stock valuations were crazy and announced that he wanted to sell all the tech stocks, Druckenmiller went and bought $6 billion worth of those stocks – at much higher valuations. That might seem like a mystery. Day after day, he watched technology stocks skyrocket and his younger and much less experienced colleagues make huge returns while his fund was just treading water. What they were doing seemed to be working, and what he was doing wasn’t. Day after day the markets were telling him that his “gunslingers” were right and he was wrong; that they were smart and he was stupid. Eventually he abandoned his discipline and joined the herd even though in his rational judgment he knew he was doing the wrong thing. “I was just an emotional basket case and couldn’t help myself,” says Druckenmiller. Any and every investor should ponder those words, because what happened to him can happen to any speculator.
It is difficult to maintain your discipline when your investment strategy is unpopular and the popular “strategies” seem to reward investors handsomely. It is hard to keep a steady course when the markets are telling you that the herd is right and you are wrong; when, in addition to uncertainty, emotion can also sway your judgment. At such times especially, we must heed experiences like Druckenmiller’s. If a strategy seems correct to the best of our understanding, our job is not to get disoriented chasing returns or swept up in market bubbles. On the contrary, our job is to cultivate clear thinking and strategic discipline. That should be at the core of an asset manager’s religion, and this is why yours truly is entirely partial to a quantitative, systematic approach to investment management: even if we can’t fully comprehend the complexities of our world or predict what may happen tomorrow, we do know that time is ultimately never kind to herds. As bubbles reach their bursting point, the herds chasing after returns always run themselves off a cliff. This is why it is essential to remain guided by clear thinking and disciplined adherence to chosen strategic goals.
Alex Krainer is an author and hedge fund manager based in Monaco. He recently published the book “Mastering Uncertainty in Commodities Trading.”
Notes:
[1] “The object acquires a ‘force’ or ‘momentum’ which propels it along the curve until the momentum gets used up and the trajectory straightens out.”
[2] Pinker, Steven, “How the Mind Works,” W. W. Norton and Company, New York, 1997 (pp. 319-320).
[3] Pinker, Steven, “The Language Instinct,” Harper Perennial, New York, 1995.
[4] This word-chain model worked by estimating the most likely word to follow after each four-word sequence, growing the “sentence” word by word.
[5] Pinker, Steven, “How the Mind Works,” W. W. Norton and Company, New York, 1997 (p. 344).
[6] To be fair, we tend to do much better when problems are presented in terms of relative frequencies rather than mathematical probabilities. As many as 92% of respondents gave the correct answer when the problem was formulated as follows: in a given population, one person in a thousand has a disease and 50 of 1000 test positive. How many who test positive actually have the disease? The difference between the two formulations is subtle, but it goes to show that we may often fail to grasp the substance of quantitative problems and that even experts aren’t immune to misconstruing mathematical probabilities and arriving at wrong solutions.
[7] Partial quote from neuroscientist and Nobel laureate Walter Hess.
[8] Price, Tim, “The Emotional Investor,” PFP Wealth Management Newsletter, December 2013 (also citing research by fund manager David McCreadie).
[9] Armour, Timothy D., “Stanley Druckenmiller, Lost Tree Club, 1-18-2015,” transcript, 12 February 2015.