As career paths of professional investors go, Katherine Collins, CFA, certainly has a diverse one. Formerly a portfolio manager and head of US equity research at Fidelity Management & Research Company, Collins later attended Harvard Divinity School before launching her own biomimicry-based research firm, Honeybee Capital. In The Nature of Investing: Resilient Investment Strategies through Biomimicry, Collins examines how a better understanding of the natural world can lead to optimal decision making. In this interview, Collins discusses why honeybees are such good decision makers, the mechanization of the investment industry, and how preparing for uncertainty is different from preparing for risk. ... Biomimicry is the conscious emulation of natural wisdom in our products, processes, and designs. Many people think if you’re just using something from nature, that’s biomimicry. That’s not quite it. It’s the process of looking to nature as a model and a measure of our own endeavors, interwoven in every step of the process.
Humans are social and generally want to be part of the crowd. Studies of social conformity suggest that the group’s view may shape how we perceive a situation. Those individuals who remain independent show activity in a part of the brain associated with fear. ... We are natural pattern seekers and see patterns even where none exist. Our brains are keen to make causal inferences, which can lead to faulty conclusions. ... Standard economic theory assumes that one discount rate allows us to translate value in the future to value in the present, and vice versa. Yet humans often use a high discount rate in the short term and a low one in the long term. This may be because different parts of the brain mediate short- and long-term decisions. ... We feel losses more acutely than we enjoy gains of comparable size. But the magnitude of loss aversion varies across the population and even for each individual based on recent experience. As a result, we sometimes forgo attractive opportunities because the fear of loss looms too large.
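The contrast between the textbook single discount rate and the "high rate now, low rate later" pattern described above can be made concrete with a small sketch. The functions and parameter values below are illustrative assumptions, not from the excerpt: a standard exponential model with one constant rate, against a simple hyperbolic model that discounts steeply at short horizons and flattens at long ones.

```python
# Sketch (illustrative, not from the excerpt): a constant "exponential"
# discount rate versus hyperbolic discounting, which behaves like a high
# rate in the short term and a low rate in the long term.

def exponential_discount(t, r=0.10):
    """Standard model: one rate r applies at every horizon t (years)."""
    return 1.0 / (1.0 + r) ** t

def hyperbolic_discount(t, k=0.30):
    """Hyperbolic model: value drops steeply at first, then flattens."""
    return 1.0 / (1.0 + k * t)

for t in (1, 5, 20):
    print(t, round(exponential_discount(t), 3), round(hyperbolic_discount(t), 3))
```

At one year the hyperbolic factor is well below the exponential one (steep near-term discounting), but by year twenty the two are close, which is the long-term patience the passage describes.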
Science is not a ‘body of knowledge’ – it’s a dynamic, ongoing reconfiguration of knowledge and must be free to change ... each scientific discipline is governed by an accepted set of theories and metaphysical assumptions, within which normal science operates. Periodically, when this rather humdrum ‘puzzle solving’ leads to results that are inconsistent with the regnant perspective, there follows a disruptive, exciting period of ‘scientific revolution’, after which a new paradigm is instituted and normal science can operate once more. ... When Newton said: ‘If I have seen farther, it is by standing on the shoulders of giants’, he wasn’t merely being modest; rather he was emphasising the extent to which science is cumulative, mostly building on past achievements rather than making quantum leaps. ... the accumulation process generates not just something more, but often something altogether new. Sometimes the new involves the literal discovery of something which hadn’t previously been known (electrons, general relativity, Homo naledi). At least as important, however, are conceptual novelties, changes in the ways that people understand – and often misunderstand – the material world: their operating paradigms. ... The world’s factual details are in continual Heraclitean flux, but the basic rules and patterns underlying these changes in the physical and biological world are themselves constant. ... Our insights, however, are always ‘evolving’. ... Science is a process, which, unlike ideology, is distinguished by intellectual flexibility, by a graceful, grateful (albeit sometimes grudging) acceptance of the need to change our minds, as our understanding of the world evolves. Most people aren’t revolutionaries, scientific or otherwise. 
But anyone aspiring to be well-informed needs to understand not only the most important scientific findings, but also their provisional nature, and the need to avoid hardening of the categories: to know when it is time to lose an existing paradigm and replace it with a new one. ... Holding still is exactly what science won’t do.
In the 1980s, two ecologists, Jim Brown at the University of New Mexico and Brian Maurer at Brigham Young University, coined the term macroecology, which gave a name and intellectual home to researchers searching for emergent patterns in nature. Frustrated by the small scale of many ecological studies, macroecologists were looking for patterns and theories that could allow them to describe nature broadly in time and space. ... Brown and Maurer had been influenced heavily by regularities in many ecological phenomena. One of these, called the species-area curve, was discovered back in the 19th century, and formalized in 1921. That curve emerged when naturalists counted the number of species (of plants, insects, mammals, and so on) found in plots laid out in backyards, savannahs, and forests. They discovered that the number of species increased with the area of the plot, as expected. But as the plot size kept increasing, the rate of increase in the number of species began to plateau. Even more remarkable, the same basic species-area curve was found regardless of the species or habitat. To put it mathematically, the curve followed a power law, in which the number of species increased in proportion to the fourth root of the area — the square root of the square root. ... Power laws are common in science, and are the defining feature of universality in physics. They describe the strength of magnets as temperature increases, earthquake frequency versus size, and city productivity as a function of population.
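The species-area relationship described above can be sketched as the power law S = c · A^z with z ≈ 0.25 (the fourth root). The constant c and the sample areas below are made-up inputs for illustration; the point is how slowly species counts grow with area.

```python
# Sketch of the species-area power law: S = c * A**z, with z ~ 0.25.
# The constant c and the areas are illustrative, not from the article.

def species_count(area, c=10.0, z=0.25):
    """Expected number of species in a plot of the given area."""
    return c * area ** z

# Multiplying the area by 256 only multiplies the species count by 4,
# because 256**0.25 == 4 -- the plateau the naturalists observed:
for area in (1, 16, 256):
    print(area, round(species_count(area), 2))
```

With c = 10 the counts come out 10, 20, and 40: each sixteen-fold jump in area only doubles the species count.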
To be an active investor, you must believe in market inefficiency to get opportunities and in market efficiency for those opportunities to turn into profits. ... The Mr. Market metaphor is very powerful because it makes an abstract idea concrete, encouraging an appropriate way to think about markets. ... One way to animate Mr. Market is to consider the wisdom of crowds. What’s key is that crowds are wise under some conditions and mad when any of those conditions are violated. ... Diversity breakdowns, which can happen for sociological as well as technical reasons, lead to extremes. ... Look for cases where uniform belief has led to a mispricing of expectations and hence a way to make money.
Should we make decisions based on intuition and emotion, or should we make decisions more rationally, with data, analytics, and numbers? The best process for making decisions under pressure is to use the data and numbers to inform our intuition. In addition, leaders must recognize and avoid falling prey to a number of mind tricks and biases. Power dynamics can also lead to poor decisions, and leaders do best to pursue an inquiry-based—rather than advocacy-based—approach. ... When making decisions under pressure, there are four tensions. Any decision in an organization generally has an ethical issue, a strategic issue, a financial issue, and a legal issue. Sometimes, there is tension among those issues. What makes perfect sense strategically might not make sense legally, or what makes the best sense financially might not make sense ethically. Part of the decision-making process is having the ability to recognize and manage the fundamental tensions that exist in most of the decisions we face. ... The way to do that is by answering three questions. First, how do I motivate and encourage the people and the organization to be aligned with what we are trying to achieve? Second, operationally, when we are under threat, how do I make sure that the business will be able to continue during these threatening circumstances? Third, how do I communicate the decision that I am about to make?
I believe my lack of business education was an asset because it encouraged me to ask a lot of questions and to think from first principles. I recall going to an equity research morning call and hearing the utility industry analyst suggest the slow-growing companies under his coverage deserved price-earnings (P/E) multiples in the high teens and the tobacco industry analyst imply that his fast-growing companies should trade at P/Es in the mid-teens. How does that make sense? I was dropped into a world of rules of thumb, old wives’ tales, and intuitions. ... My first breakthrough occurred when a classmate in my training program handed me a copy of Creating Shareholder Value by Alfred Rappaport. Reading that book was a professional epiphany. Rappaport made three points that immediately became the centerpiece of my thinking. The first is that the ability of accounting numbers to represent economic value is severely limited. Next, he emphasized that competitive strategy analysis and valuation should be joined at the hip. The litmus test of a successful strategy is that it creates value, and you can’t properly value a company without a thoughtful assessment of its competitive position. ... The final point is that stock prices reflect a set of expectations for future financial performance. A company’s stock doesn’t generate excess returns solely by the company creating value. The company’s results have to exceed the expectations embedded in the stock market.
1. Be numerate (and understand accounting).
2. Understand value (the present value of free cash flow).
3. Properly assess strategy (or how a business makes money).
4. Compare effectively (expectations versus fundamentals).
5. Think probabilistically (there are few sure things).
6. Update your views effectively (beliefs are hypotheses to be tested, not treasures to be protected).
7. Beware of behavioral biases (minimizing constraints to good thinking).
8. Know the difference between information and influence.
9. Size positions well (maximizing the payoff from edge).
10. Read (and keep an open mind).
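Item 2 in the list above, value as the present value of free cash flow, can be sketched in a few lines. Everything here is an illustrative assumption — the cash flows, the 8% discount rate, the 2% terminal growth — not figures from the source; the structure is a plain two-stage discounted cash flow.

```python
# Sketch of "value = the present value of free cash flow".
# All inputs are made-up illustrations, not from the source text.

def present_value(cash_flows, rate):
    """Discount a list of yearly free cash flows back to today."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

def terminal_value(last_cf, rate, growth):
    """Gordon-growth value of all cash flows beyond the forecast period."""
    return last_cf * (1 + growth) / (rate - growth)

flows = [100, 110, 121]   # explicit three-year forecast
r, g = 0.08, 0.02         # discount rate and terminal growth (assumed)

# Total value: forecast-period PV plus the discounted terminal value.
value = present_value(flows, r) + terminal_value(flows[-1], r, g) / (1 + r) ** len(flows)
print(round(value, 1))
```

Note how much of the total sits in the terminal value; that sensitivity is one reason the list pairs valuation with assessing strategy and expectations rather than treating the arithmetic as the whole job.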
Professionals in many organizations are assigned arbitrarily to cases: appraisers in credit-rating agencies, physicians in emergency rooms, underwriters of loans and insurance, and others. Organizations expect consistency from these professionals: Identical cases should be treated similarly, if not identically. The problem is that humans are unreliable decision makers; their judgments are strongly influenced by irrelevant factors, such as their current mood, the time since their last meal, and the weather. We call the chance variability of judgments noise. It is an invisible tax on the bottom line of many companies. ... The prevalence of noise has been demonstrated in several studies. Academic researchers have repeatedly confirmed that professionals often contradict their own prior judgments when given the same data on different occasions. ... The unavoidable conclusion is that professionals often make decisions that deviate significantly from those of their peers, from their own prior decisions, and from rules that they themselves claim to follow. ... It has long been known that predictions and decisions generated by simple statistical algorithms are often more accurate than those made by experts, even when the experts have access to more information than the formulas use. It is less well known that the key advantage of algorithms is that they are noise-free: Unlike humans, a formula will always return the same output for any given input. Superior consistency allows even simple and imperfect algorithms to achieve greater accuracy than human professionals. ... One reason the problem of noise is invisible is that people do not go through life imagining plausible alternatives to every judgment they make. ... The bottom line here is that if you plan to use an algorithm to reduce noise, you need not wait for outcome data. You can reap most of the benefits by using common sense to select variables and the simplest possible rule to combine them.
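The closing point above — pick common-sense variables and "the simplest possible rule to combine them" — can be sketched with a toy equal-weight rule. The loan-scoring variables and 0–10 scale are hypothetical, invented only to show the property the passage emphasizes: a formula returns the same output for the same input every time.

```python
# Sketch of a "simplest possible rule": equal weights on a few
# common-sense variables. The variables and scale are hypothetical.

def loan_score(income_ratio, years_employed, on_time_payments):
    """Average three 0-10 ratings with equal weights. Unlike a human
    judge, the rule is noise-free: identical cases always get
    identical scores, regardless of mood, hunger, or weather."""
    return (income_ratio + years_employed + on_time_payments) / 3

# Identical cases always receive identical scores:
a = loan_score(7, 5, 9)
b = loan_score(7, 5, 9)
print(a == b, a)   # prints: True 7.0
```

Nothing about the rule requires outcome data or fitted weights — consistency alone is where most of the accuracy gain over noisy human judgment comes from.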
Sustainable value creation has two dimensions: the magnitude of the spread between a company’s return on invested capital and the cost of capital and how long it can maintain a positive spread. Both dimensions are of prime interest to investors and corporate executives. ... Sustainable value creation as the result solely of managerial skill is rare. Competitive forces and endogenous variance drive returns toward the cost of capital. Investors should be careful about how much they pay for future value creation. ... Economic moats are almost never stable. Because of competition, they are getting a little bit wider or narrower every day. This report develops a systematic framework to determine the size of a company’s moat.
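The two dimensions named above — the size of the ROIC-minus-cost-of-capital spread and how long it persists — can be put into a small sketch. The numbers are illustrative, and the yearly "fade" parameter is a hypothetical stand-in for the competitive forces that narrow moats; it is not a formula from the report.

```python
# Sketch of the two dimensions of sustainable value creation:
# spread (ROIC - WACC) and its persistence. All inputs are illustrative.

def economic_profit(invested_capital, roic, wacc):
    """Dollar value created in one year from the spread."""
    return invested_capital * (roic - wacc)

def value_created(invested_capital, roic, wacc, years, fade=0.0):
    """Undiscounted sum of economic profit over the horizon, with the
    spread optionally fading toward zero each year -- a crude stand-in
    for competition driving returns toward the cost of capital."""
    total, spread = 0.0, roic - wacc
    for _ in range(years):
        total += invested_capital * spread
        spread *= (1 - fade)   # the moat narrows a little every year
    return total

print(round(economic_profit(1000, 0.15, 0.08), 2))   # 70.0 per year
print(round(value_created(1000, 0.15, 0.08, 10, fade=0.2), 1))
```

A 20% yearly fade cuts the ten-year total well below ten flat years of spread — which is the report's caution about paying up for future value creation.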
When I returned to addiction, it was as a scientist studying the addicted brain. The data were indisputable: brains change with addiction. I wanted to understand how – and why. I wanted to understand addiction with fastidious objectivity, but I didn’t want to lose touch with its subjectivity – how it feels, how hard it is – in the process. ... One explanation is that addiction is a brain disease. The United States National Institute on Drug Abuse, the American Society of Addiction Medicine, and the American Medical Association ubiquitously define addiction as a ‘chronic disease of brain reward, motivation, memory and related circuitry’ ... If only the disease model worked. Yet, more and more, we find that it doesn’t. First of all, brain change alone isn’t evidence for brain disease. Brains are designed to change. ... we now know that drugs don’t cause addiction. ... One idea is that addicts voluntarily choose to remain addicted: if they don’t quit, it’s because they don’t want to. ... The view that addiction arises through learning, in the context of environmental forces, appears to be gathering momentum.
Biological systems don’t defy physical laws, of course — but neither do they seem to be predicted by them. Rather, they are goal-directed: they aim to survive and reproduce. We can say that they have a purpose — or what philosophers have traditionally called a teleology — that guides their behavior. ... By the same token, physics now lets us predict, starting from the state of the universe a billionth of a second after the Big Bang, what it looks like today. But no one imagines that the appearance of the first primitive cells on Earth led predictably to the human race. Laws do not, it seems, dictate the course of evolution. ... Animals are drawn to water not by some magnetic attraction, but because of their instinct, their intention, to survive. Legs serve the purpose of, among other things, taking us to the water. ... there appears to be a kind of physics of things doing stuff, and evolving to do stuff. Meaning and intention — thought to be the defining characteristics of living systems — may then emerge naturally through the laws of thermodynamics and statistical mechanics.
Investment rules shouldn’t be static. Investors should adapt their rules to the environment they are in. From experience, I can confirm that those who don’t adapt usually get into trouble sooner or later. My first and most important rule when investing is therefore a rule that defines the rules I should adhere to. ... What exactly do I mean by that? How can I possibly have a rule about rules? Allow me to explain. As I see things, there are rules and then there are rules. The most important ones always apply; those are my first frontier rules. There are not many of them, but they are all critically important. ... The second layer of rules – the second frontier – are strictly speaking not rules but principles. I treat them as rules, though, because I follow them almost whatever happens.
In his new book, Adaptive Markets: Financial Evolution at the Speed of Thought, M.I.T. finance professor Andrew Lo attempts to account for the messier, more feeling realities of human behavior. A key premise is that markets evolve, like species, but much faster: “evolution at the speed of thought.” And that this evolution happens in fits and starts, in response to changes in the environment—hence, what he calls the “adaptive” markets hypothesis. It’s during these times of change that human emotions play their biggest role. Lo believes we are in one of those times now and, in his book, he applies biology, psychology, neuroscience, and history toward the goal of improving on the efficient markets hypothesis—which, Lo says, is not only flawed but is becoming increasingly so as the financial environment continues to change. ... The efficient markets hypothesis is a special case of adaptive markets. Markets are efficient if the environment is stable and investors interact with each other and natural selection operates over a long period of time.
Above a certain temperature, a cell will collapse and die. One of the most straightforward explanations for this lack of heat hardiness is that the proteins essential to life — the ones that extract energy from food or sunlight, fend off invaders, destroy waste products and so on — often have beautifully precise shapes. They start as long strands, then fold into helixes, hairpins and other configurations, as dictated by the sequence of their components. These shapes play a huge role in what they do. Yet when things start to heat up, the bonds that keep protein structures together break: first the weaker ones, and then, as the temperature mounts, the stronger ones. It makes sense that a pervasive loss of protein structure would be lethal, but until recently, the details of how, or if, this kills overheated cells were unknown. ... One of the clearest observations was that in each species, the proteins did not unfold en masse with a temperature boost. Instead, “we saw that only a small subset of proteins collapses very early,” Picotti said, “and these are key proteins.” ... This paradox — that some of the most important proteins seem to be the most delicate — may reflect how evolution has shaped them to do their jobs. ... The more copies the cell made, they reported, the more heat it took to break a protein down.
How do experts go wrong? There are several kinds of expert failure. The most innocent and most common are what we might think of as the ordinary failures of science. Individuals, or even entire professions, get important questions wrong because of error or because of the limitations of a field itself. They observe a phenomenon or examine a problem, come up with theories and solutions, and then test them. Sometimes they’re right, and sometimes they’re wrong. ... Science is learning by doing. Laypeople are uncomfortable with ambiguity, and they prefer answers rather than caveats. But science is a process, not a conclusion. Science subjects itself to constant testing by a set of careful rules under which theories can be displaced only by other theories. Laypeople cannot expect experts to never be wrong; if they were capable of such accuracy, they wouldn’t need to do research and run experiments in the first place. If policy experts were clairvoyant or omniscient, governments would never run deficits, and wars would break out only at the instigation of madmen. ... The most important point is that failed predictions do not mean very much in terms of judging expertise. Experts usually cover their predictions (and an important part of their anatomy) with caveats, because the world is full of unforeseeable accidents that can have major ripple effects down the line. ... The goal of expert advice and prediction is not to win a coin toss, it is to help guide decisions about possible futures.