Rise of the Machines

“Music, this complex and mysterious act, precise as algebra and vague as a dream, this art made out of mathematics and air, is simply the result of the strange properties of a little membrane. If that membrane did not exist, sound would not exist either, since in itself it is merely vibration. Would we be able to detect music without the ear? Of course not. Well, we are surrounded by things whose existence we never suspect, because we lack the organs that would reveal them to us.”
– Guy de Maupassant

“I call our world Flatland, not because we call it so, but to make its nature clearer to you, my happy readers, who are privileged to live in Space. … Distress not yourself if you cannot at first understand the deeper mysteries of Spaceland. By degrees they will dawn upon you.”
– Edwin A. Abbott, “Flatland: A Romance of Many Dimensions”

“I wanted to be a psychological engineer, but we lacked the facilities, so I did the next best thing – I went into politics. It’s practically the same thing.”
– Salvor Hardin (“Foundation”, by Isaac Asimov)

“It is vital to remember that information – in the sense of raw data – is not knowledge, that knowledge is not wisdom, and that wisdom is not foresight. But information is the first essential step to all of these.”
– Arthur C. Clarke

“Any sufficiently advanced technology is indistinguishable from magic.”
– Arthur C. Clarke

“Just what do you think you’re doing, Dave?”
– HAL (“2001: A Space Odyssey” by Arthur C. Clarke)

I thought it was appropriate in a note focused on the evolution of machine intelligence to start with some quotes by three of the all-time great science fiction writers – Abbott, Asimov, and Clarke – and something by the father of the short story, de Maupassant, as well. All four were fascinated by the intersection of human psychology and technology, and all four were able to communicate a non-human perspective (or at least a non-traditional human perspective) in their writing – which is both incredibly difficult and completely necessary in order to understand how machines “see” the world. Asimov in particular is a special favorite of mine, as his concept of psycho-history is at the heart of Epsilon Theory. If you’ve never read the Foundation Trilogy and you don’t know who Hari Seldon and the Mule are … well, you’re missing something very special.

All of these authors succeed in portraying non-human intelligence in terms of the inevitable gulf in meaning and perception that must exist between it and human intelligence. Hollywood, on the other hand, almost always represents non-human intelligence as decidedly human in its preference and utility functions, just with a mechanical exoskeleton and scary eyes. Thus the Daleks, the original Cylons, the Terminators, the Borg, etc., etc.

[Images: robots]

At least the most recent version of Battlestar Galactica recognized that a non-human intelligence forced to interact with humans would perhaps choose a less menacing representational form.

[Image: robot evolution]

The way to think about machine intelligence is not in terms of a mechanical version of human intelligence, but in terms of a thermostat and an insect’s compound eye.

[Images: a thermostat and an insect]

What I mean by this is that a thermostat is a prime example of a cybernetic system – a collection of sensors and processors and controllers that represents a closed signaling loop. It might seem strange to think of the thermostat as “making a decision” every time it turns on the heat in your house in response to the environmental temperature falling below a certain level, but this is exactly what it is doing. The thermostat’s decision to turn on the heat follows, from an Information Theory perspective, precisely the same process as your decision to buy 100 shares of Apple, just simpler and better defined. The human brain is the functional equivalent of a really complex thermostat, with millions of sensors and processors and controllers. But that also means that a really complex thermostat is the functional equivalent of a human brain.
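
To make the analogy concrete, the thermostat’s entire “decision process” fits in a few lines of code. A minimal sketch – the setpoint, hysteresis band, and temperature readings below are all invented for illustration:

```python
def thermostat_step(current_temp_f, setpoint_f=68.0, hysteresis_f=1.0, heat_on=False):
    """One pass through the closed signaling loop: sense -> decide -> act.

    The hysteresis band keeps the heater from chattering on and off
    right at the setpoint.
    """
    if current_temp_f < setpoint_f - hysteresis_f:
        return True     # too cold: turn (or keep) the heat on
    if current_temp_f > setpoint_f + hysteresis_f:
        return False    # warm enough: turn (or keep) the heat off
    return heat_on      # inside the band: hold the current decision

# Structurally, "buy 100 shares of Apple" is the same loop: a signal
# crosses a threshold, and a controller acts on the environment.
heat = False
for temp in [70.2, 68.4, 66.8, 66.1, 67.5, 69.3]:
    heat = thermostat_step(temp, heat_on=heat)
    print(f"{temp:5.1f}F -> heater {'ON' if heat else 'OFF'}")
```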

The human brain has one big advantage over a thermostat, and that is the evolutionary development of a high degree of self-awareness or consciousness. There’s nothing mystical or supernatural about consciousness, nor is it somehow external or separate from the human brain. Consciousness is simply an emergent property of the human cybernetic system, just like Adam Smith’s Invisible Hand is an emergent property of the market cybernetic system. It is an incredibly useful property, however, allowing both the construction of thought experiments that radically accelerate learning by freeing us from the ponderously slow if-then laboratory that Nature and evolution provide non-self-aware animals, and the construction of belief systems that radically promote and stabilize joint utility functions of human communities. Our proficiency as both a tool-using animal and a social animal stems entirely from the development of consciousness, and we are an incredibly robust and successful species as a result.

On the other hand, a thermostat has one big advantage over the human brain in its decision-making process, and that’s the lack of evolutionary and social constraints. As phenomenally efficient as carbon-based nerve cells and chemical neurotransmitters might be, they can’t compete on a fundamental level with silicon-based transistors and electrons. As effective as social constructs such as language and belief systems might be in creating intra-group human utility, there is no inherent tension or meaning gap or ecological divide in communications between thermostats. The concept of music is a wonderful thing, but as de Maupassant points out it is entirely dependent on “the strange properties of a little membrane.” How many other wonderful concepts are we entirely ignorant of because we haven’t evolved a sensory organ to perceive them with? Just as the two-dimensional inhabitants of Flatland find it essentially impossible to imagine a third dimension, so are we conceptual prisoners of Spaceland. At best we can imagine a fourth dimension of Time in the construction of a helix or a hypercube, but anything beyond this is as difficult as storing more than 10 digits in our short-term memory. Machines have no such evolutionary limitations, and decision-making in terms of twelve dimensions is as “natural” to them as decision-making in terms of three.
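
The dimensionality point can be made concrete. To a machine, a decision rule in twelve dimensions is literally the same line of code as a decision rule in three; only the shape of the data changes. A minimal sketch with invented random data:

```python
import numpy as np

rng = np.random.default_rng(0)

def nearest_neighbor(points, query):
    """Index of the point closest to the query, in however many dimensions."""
    return int(np.argmin(np.linalg.norm(points - query, axis=1)))

for dims in (3, 12):
    points = rng.normal(size=(1000, dims))   # 1,000 observations
    query = rng.normal(size=dims)            # one new observation to classify
    print(f"{dims:2d} dimensions: closest point is #{nearest_neighbor(points, query)}")
# The code path is identical either way -- the machine has no "Spaceland" bias.
```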

This is why it’s useful to think of machine intelligence in terms of the compound eye of an insect. Not only are most compound eyes able to sense electromagnetic radiation that is invisible to the camera eyes of most vertebrate animals, particularly in the ultraviolet end of the spectrum, but there is a multi-dimensionality to insect vision that is utterly alien to humans. It’s not that insect vision is super-human, any more than machine intelligence is super-human. In fact, in terms of image resolution or location of an object within a tight 3-dimensional field, the camera eye is enormously superior to the compound eye, which is why it evolved in the first place. But for a wide field of vision and the simultaneous detection of movements within that field, the compound eye has no equal. It’s that simultaneity of movement detection that is so similar to the parallel information processing approach of most machine intelligences and is so hard to describe in human information processing terms.

[Image: compound eye]

Because the compound eye associates a separate lens with each photo-receptor, creating a perceptive unit called an ommatidium, there is no composite 3-dimensional visual image formed as with twin camera eyes. Instead there are hundreds or thousands of separate 2-dimensional visual images processed simultaneously by insects, each of which is driven by separate signals. It’s customary to describe insect vision as a mosaic, but that’s actually misleading because the human brain sees a mosaic as a single image made up of individually discrete pieces. To an insect, there is no such thing as a single visual image. Reality to an insect is hundreds of visual images processed simultaneously and separately, and there is no analogue to this in the human cybernetic system. To a thermostat, though, with no evolutionary baggage to contend with … no problem. As a result, if a functional task is best achieved by seeing the world as an insect does – through simultaneous views of multiple individual fields – a machine intelligence can outperform a human intelligence by a god-like margin.
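
Here is a toy version of that architecture of simultaneity – hundreds of independent “ommatidia,” each watching only its own patch, with no composite image ever assembled. All the numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# 400 "ommatidia", each watching its own 16-pixel patch. No composite
# image is ever assembled; each field is its own separate signal.
n_fields, field_size = 400, 16
before = rng.normal(size=(n_fields, field_size))
after = before.copy()
after[37] += 3.0   # movement occurs in field 37 only (illustrative)

# One vectorized pass "looks through" all 400 fields simultaneously.
movement = np.abs(after - before).sum(axis=1)
print("movement detected in fields:", np.flatnonzero(movement > 1.0))   # -> [37]
```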

Over the past five to ten years, there have been three critical advances in computer science that have created extremely powerful machine intelligences utilizing a compound eye architecture.

First, information storage technology developed the capacity to store enormous amounts of data and complex data structures “in-memory”, where the data can be accessed for processing without the need to search for it on magnetic media storage devices. Again, this is a really hard concept to find a human analogy for. The best I can come up with is to envision the ability to just know – immediately and without any effort at “remembering” – the names, addresses, and phone numbers of everyone you’ve ever known in your life. Even that doesn’t really do the technology justice … it’s more like knowing the names and phone numbers of everyone in New York City, simultaneously and without any attempt to recall the information. Your knowledge vanishes the moment electrons stop powering your memory chip, so there’s still a place in the world for permanent magnetic media storage, but that place is shrinking every day.
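
A toy contrast makes the point, though it says nothing about how a real in-memory database engine works internally – searching a storage medium for a record and simply “knowing” it in memory are fundamentally different operations:

```python
import os, tempfile, time

# An illustrative "phone book" with a million entries.
entries = {f"person_{i}": f"555-{i:07d}" for i in range(1_000_000)}

# Write it to disk so we can contrast the two access patterns.
with tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt") as f:
    for name, phone in entries.items():
        f.write(f"{name},{phone}\n")
    path = f.name

def disk_lookup(path, name):
    """Query-style access: search the storage medium for the record."""
    with open(path) as f:
        for line in f:
            key, phone = line.rstrip("\n").split(",")
            if key == name:
                return phone

t0 = time.perf_counter(); disk_lookup(path, "person_999999")
t1 = time.perf_counter(); entries["person_999999"]   # in-memory: just "know" it
t2 = time.perf_counter()
print(f"disk scan: {t1 - t0:.3f}s   in-memory: {t2 - t1:.1e}s")
os.remove(path)
```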

The company that commercialized this technology first, best, and most widely is SAP, in a product they call HANA. I’ve been following its development for about three years now, and it’s changing the world. Does Oracle have a version of this technology? Yes. But if you’ve built a $150 billion market cap company on the back of selling periodic upgrades for a vast installed base of traditional relational database management software applications that query (search) a vast installed base of traditional data storage resources … hmm, how to put this in a nice way … you’re probably not going to be very excited about ripping apart that installed base and re-inventing your lucrative business model. SAP had a lot less to lose and a lot more to gain, so they’ve re-invented themselves around HANA. I have no idea whether SAP the stock is a good investment or not. But SAP the company has a phenomenal asset in HANA.

Second, advances in microprocessor technology, network connectivity, and system control software created the ability to separate physical computing resources from functional computing resources. This phenomenon goes by many names and takes multiple forms, from virtualization to distributed computing to cloud computing, but the core concept is to find enormous efficiencies in information processing outcomes by rationalizing information processing resources. Sometimes this means using hardware to do something that was previously done by software; sometimes this means using software to do something that was previously done by hardware. The point is to stop thinking in terms of “hardware” and “software”. The point is to re-conceptualize a cybernetic system into fundamental terms reflecting efficient informational throughput and functionality, as opposed to traditional terms reflecting the way that humans happened to instantiate that functionality in the past. When I write about re-conceptualizing common investment practices in terms of the more fundamental language of Information, whether it’s technical analysis (“The Music of the Spheres”) or bottom-up portfolio construction (“The Tao of Portfolio Management”), I’m not pulling the idea out of thin air.  There has been just this sort of revolutionary shift in the way people think and talk about IT systems over the past decade, with incredible efficiency gains as a result, and I believe that the same sea change is possible in the investment world.

One of the most powerful aspects of this re-conceptualization of machine cybernetic systems is the ability to create the functional equivalent of an insect’s ommatidia – thousands of individual signal processors working in parallel under a common direction to complete a task that lends itself well to the architecture of a compound eye. This architecture of simultaneity is more commonly referred to as a cluster, and the most prominent technology associated with clusters is an open-source software platform called Hadoop. There are three pieces to Hadoop – a software kernel, a distributed file system (like a library catalog), and a processing framework called MapReduce (like a traffic cop) – an architecture modeled directly on systems that Google built internally and described in a pair of influential papers. While Hadoop is in the public domain under an open-source license, I would estimate that Google is at least two generations ahead of any other entity (and that includes the NSA) in understanding and implementing the architecture of simultaneity. Obviously enough, search is a prime example of the sort of task that lends itself well to a machine intelligence organized along these lines, but there are many, many others. No one understands or directs machine intelligence better than Google, and this is why it is the most important company in the world.
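
The MapReduce idea itself fits in a few lines of plain code, no cluster required; the real framework distributes these same three phases – map, shuffle, reduce – across thousands of machines. The canonical word-count toy, sketched here in ordinary Python rather than Hadoop’s actual API:

```python
from collections import defaultdict
from itertools import chain

documents = ["to be or not to be", "be here now", "not now"]

# Map: each "node" independently turns its shard of data into (key, value) pairs.
mapped = [[(word, 1) for word in doc.split()] for doc in documents]

# Shuffle: route every pair with the same key to the same place.
groups = defaultdict(list)
for key, value in chain.from_iterable(mapped):
    groups[key].append(value)

# Reduce: collapse each key's values to a result, independently per key.
counts = {word: sum(values) for word, values in groups.items()}
print(counts)   # {'to': 2, 'be': 3, 'or': 1, 'not': 2, 'here': 1, 'now': 2}
```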

Third, methodological advances in statistical inference and their expression in software applications have created the ability to utilize more fully these advances in memory, microprocessors, connectivity, and IT architecture. The range of these methodological tools is pretty staggering, so I will only highlight one that is of particular interest to the Epsilon Theory perspective. Last week I wrote about the problem of the ecological divide in every aspect of modern mass society (“The Tao of Portfolio Management”) and how humans are poor calculators of both aggregate characteristics derived from individual signals and individual characteristics derived from aggregate signals. Over the past 15 years, Gary King at Harvard University has pioneered the development of unifying methods of statistical inference based on fundamental concepts such as likelihood and information. I may be biased because Gary was a mentor and dissertation advisor, but I think his solutions to the problem of ecological inference can fundamentally change portfolio construction and risk management practices, especially now that there are such powerful cybernetic “engines” for these solutions to direct.

As described in “The Market of Babel”, these advanced machine intelligences based on the compound eye’s architecture of simultaneity have effectively taken over one particular aspect of modern markets and the financial services industry – the provision of liquidity. Understanding and predicting the patterns of liquidity demand are tailor-made for the massively parallel capabilities of these cybernetic systems, and there is no liquidity operation in modern markets – from high-frequency traders trying to skin a limit order book to asset managers trying to shift a multi-billion dollar exposure in the dark to bulge-bracket market-makers trying to post yet another quarter of zero days with a trading loss – that is not completely controlled by these extremely complex and powerful thermostats.

This is a problem for human investors in two respects.

The first is a small but constant problem. Whenever you take liquidity (i.e., whenever you create an exposure) in anything other than a “natural” transaction with a human seller of that exact same exposure, you are going to pay a tax of anywhere from 1/2 to 5 cents per share to the machine intelligences that have divined your liquidity intentions within 50 milliseconds of hitting the Enter button. I’m sorry, but you are, and it’s a tax you can only mitigate, not avoid. The problem is worse the more you use a limit order book and the more you use VWAP, but then again, no active manager ever got fired for “showing price discipline” with a limit and no trader ever got fired for filling an order at VWAP.
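
That tax compounds with turnover, so the arithmetic is worth spelling out. A back-of-the-envelope sketch – the portfolio size, average price, and turnover are invented; only the 1/2-to-5-cent range comes from the discussion above:

```python
# Back-of-the-envelope liquidity tax. The 1/2-to-5-cent range is from the
# discussion above; the portfolio size, price, and turnover are invented.
portfolio = 500_000_000        # a $500mm book
avg_price = 50.0               # $50 average share price
turnover = 1.0                 # 100% annual turnover (each side)

shares_traded = portfolio * turnover * 2 / avg_price   # buys plus sells
for tax_per_share in (0.005, 0.02, 0.05):
    drag_bps = shares_traded * tax_per_share / portfolio * 10_000
    print(f"{tax_per_share * 100:.1f} cents/share -> {drag_bps:.0f} bps of annual drag")
```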

The second is a giant but rare problem. All of these machine intelligences designed to optimize liquidity operations are based on the same historical data patterns of human market participation. As those patterns change – particularly if the patterns change in such a way that machine-to-machine transactions dominate or are confused for human-to-machine transactions – it creates a non-trivial chance that an event causing what would otherwise be a small liquidity shock can snowball into a market-wide liquidity seizure as the machine-to-machine transactions disappear in the blink of an eye. This is what happened in the 2010 Flash Crash, and the proportion of machine-to-machine transactions in liquidity provision is, if anything, even greater today. Moreover, the owners of these machine intelligences, especially in the HFT world, are suffering much thinner margins than in 2010, and, I suspect, are taking much larger risks and operating with much itchier trigger fingers on the off switch. I have no idea when the liquidity train wreck is going to happen, but you can clearly see how the tracks are broken, and the train whistle sure sounds like it’s getting closer.

The solution to this second and more troubling problem is not to somehow dislodge machine intelligences from market liquidity operations. It can’t be done. Nor do I have much confidence in regulatory “solutions” such as Liquidity Replenishment Points and the like (read anything by Sal Arnuk and Joe Saluzzi at Themis Trading for a much more comprehensive assessment of these issues). What we need is a resurgence in “real” trading with human liquidity-takers on at least one side of the trade.

Unfortunately, I suspect that we won’t see a return to normal levels of human market activity until the Fed begins to back down from monetary policies designed explicitly to prop up market prices. You might not sell what you own with a Fed put firmly in place, but a healthy market needs buying AND selling; it needs active disagreement on whether the price of a security is cheap or dear. Markets work best and markets work more when investors venture farther out onto the risk curve of their own volition, not when they are dragged out there kicking and screaming by ZIRP and QE.

I don’t know when the Fed will stand down enough to allow normal risk-taking to return to markets, but at some point this, too, shall pass. The trick is how to protect yourself in the current investing environment AND set yourself up to do well in the investing environment to come. Now there are a thousand facets to both aspects of pulling that trick off, and anyone who tells you that he has THE answer for this puzzle is selling snake oil. But I think that part of the answer is to bring machine intelligences out of the liquidity provision shadows and into the light of portfolio construction, risk management, and trading.

Your ability to manage the risk of a liquidity-driven market crash is improved simply by recognizing the current dynamics of liquidity provision and speaking, however haltingly or humanly accented, the machine language of Liquidity. Imagine how much further that ability could be improved if you had access to a machine intelligence designed specifically for the purpose of measuring these liquidity risks as opposed to being another machine intelligence participating in liquidity operations. I am certain that it is possible to create such a liquidity-monitoring machine intelligence, just as I am certain that it is possible to create a correlation-monitoring machine intelligence, and just as I am certain that it is possible to create a portfolio-optimizing machine intelligence. These technologies are not to be feared simply because they are as alien to us as an insect’s eye. They should be embraced because they can help us see the market as it is, rather than as we wish it were or as we thought it was.


The Tao of Portfolio Management

“Shape clay into a vessel;
It is the space within that makes it useful.
Cut doors and windows for a room;
It is the holes which make it useful.
Therefore benefit comes from what is there;
Usefulness from what is not there.”
– Lao Tzu

“The limits of my language mean the limits of my world.”
– Ludwig Wittgenstein

“The question is not what you look at, but what you see.”
– Henry David Thoreau

“A European says: I can’t understand this. What’s wrong with me?
An American says: I can’t understand this. What’s wrong with him?”
– Terry Pratchett

I want to start this note in a manner that’s sure to annoy some readers, and that’s to reference the George Zimmerman trial. If gold is the third rail of financial commentary (“How Gold Lost its Luster”), then the Zimmerman trial must be the Death Star planet-destroying laser beam of such notes. But the shaping of the post-trial Zimmerman Narrative is a precise example of the behavioral phenomenon that I want to examine this week. So with considerable trepidation, here goes …

As discussed in last week’s note (“The Market of Babel”), groups speaking different languages – whether it’s an everyday language like English or Japanese, or an investing language like Value or Growth – have both a translation friction to overcome in inter-group communications as well as a potential dislocation of meaning in vocabulary and grammar. This latter problem is far more insidious and injurious to joint utility functions than the former, and the post-trial “conversation” between groups that support the Zimmerman verdict and groups that are appalled by the Zimmerman verdict is a perfect example of the problem of meaning. In fact, there is no conversation possible here at all, because each group is seeing the same observable data points through very different perceptual lenses. The chasm of meaning between these two groups is formed by an ecological divide, which is also a common source of meaning disparity in market communications and languages. As such, it is well worth our attention in Epsilon Theory.

An ecological divide is a difference in perception of useful signal aggregation. In the Zimmerman case, those appalled by the verdict are seeing the broad social context of the available information. How is it possible, they ask, for a society to allow an unarmed black minor walking home from a store to be shot dead with no legal sanction against his killer? Those supportive of the verdict, on the other hand, are seeing the individual instantiation of the available information in this particular case. How is it possible, they ask, to evaluate the specific evidence presented in this specific trial against the specific legal charges and fail to deliver a Not Guilty verdict?

The Western system of jurisprudence is based on liberal notions (that’s John Stuart Mill liberalism, not Walter Mondale liberalism) of the primacy of individual rights, as opposed to communitarian notions of aggregate social utility. What this means is that the rules of the trial-by-jury game, from jury instructions to allowed evidence, are set up to focus attention on specific fact patterns relevant to a specific defendant. And as a result, it makes a lot of sense (to me, anyway) that Zimmerman was found Not Guilty by virtue of reasonable doubt regarding the specific charges levied against him. On the other hand, the rules of the game for the Western system of political representation do not give a whit about individual rights, but favor the ability to mobilize like-minded groups of citizens on the basis of widely-held social grievances. So it also makes a lot of sense to me that the political dynamics outside the courtroom treat Zimmerman-like actions (and Zimmerman individually as a member of the Zimmerman-like set) as unjust and the object of sanction.

Each perspective is entirely valid within its relevant sphere of aggregation, and each perspective is extremely problematic in the other sphere. To deny the existence of racial bias in the aggregate data regarding crime and punishment in the United States – the application of the death penalty, for example – is, in my opinion, like denying that the Earth goes around the sun. However, this does NOT mean that ANY individual death penalty case, much less every death penalty case, is necessarily racially biased or that racial bias was a meaningful cause of any death penalty decision. I know this seems counter-intuitive … how can a population have a meaningful attribute in the aggregate, but no individual member of that population demonstrate that attribute in a meaningful way? … but it’s the truth. Or rather, it’s the inescapable conclusion of a consistent application of statistical logic.

Systems that demonstrate this sort of ecological divide are much more common than you might think, and are at the heart of any tool or technology that utilizes large numbers of small signals – each of which is inconsequential in its own right – to create or observe a meaningful signal in the aggregate. For example, the gamma knife technology used to shrink inoperable cancerous tumors works in this manner, by focusing hundreds or thousands of weak radiation beams from multiple directions on a cluster of cells. No single beam is meaningfully dangerous in and of itself – it must not be, or the healthy cells that each beam passes through on its way to the target would be injured – but the combination of many of these beams converging on one point is deadly to a cell. Each individual beam of radiation is “Not Guilty” of causing irreparable harm to any individual cell, and no individual beam is biased/targeted specifically to any type of cell. But the overall system is a superb killer of cells subject to the bias/targeting of the system. The effectiveness of the gamma knife technology is entirely based on statistical assessments of probabilistic outcomes of cellular damage when exposed to a burst of radiation, both at the individual and aggregate levels. Because the radiation bursts can be reduced to multiple rays with very small individual impacts (probabilistically speaking), an ecological threshold can be calculated and implemented to create a potent cancer treatment therapy.

[Image: gamma knife]
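
The ecological threshold at the heart of the gamma knife can be simulated in a few lines. Every number below is invented for illustration; the point is the asymmetry between any single beam and the converging sum:

```python
import numpy as np

rng = np.random.default_rng(2)

beams = 200                # number of weak beams (illustrative)
dose_per_beam = 0.6        # mean dose each beam deposits, in toy units
lethal_threshold = 100.0   # cumulative dose that kills a cell

# A healthy cell crossed by one beam takes a single, harmless dose.
one_beam = rng.normal(dose_per_beam, 0.1)
# The target cell, where all beams converge, takes the sum of all of them.
target = rng.normal(dose_per_beam, 0.1, size=beams).sum()

print(f"single-beam dose: {one_beam:6.2f}   ('Not Guilty' individually)")
print(f"target-cell dose: {target:6.1f}   lethal: {target > lethal_threshold}")
```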

It’s no accident that technologies like the gamma knife are largely computer-driven, because humans are remarkably poor calculators of ecological thresholds. The human brain has evolved over millions of years and we have trained ourselves for hundreds of thousands of years to be very effective social animals making ecological inferences on a scale that makes sense for small group survival on an African savannah, not the smooth functioning of a mass society that spans a continent and has hundreds of millions of members. As a result, we are hard-wired to underestimate the cumulative impact of massive numbers of small signals that form part of a consistent system, and we consistently overestimate the contribution of any single signal when we focus on the aggregate outcome. That latter decision-making mistake, where individual characteristics are improperly inferred from aggregate characteristics, has a name. It’s an ecological fallacy, and it’s an inherent problem for every aspect of human society in the modern age of massive aggregation, from the effective operation of a system of justice to the effective operation of a system of market exchange.

In the case of a justice system, the meaning of Trayvon Martin’s death is different when seen through the lens of individual rights at trial than when seen through the lens of social utility at large. What happened in Sanford was an instantiation of what I believe is a demonstrably unjust and racially biased system, and it deserves political action to recalibrate the societal gamma knife machine that ends up killing black cells preferentially over white cells. But that doesn’t mean that Zimmerman the individual was necessarily guilty of any crime, and to conclude that he is racially biased to a criminal degree because his actions form part of an unjust and racially biased system is an ecological fallacy. Such a conclusion is natural and all too human, but it is also illogical and unjust. It’s also a difficult point to fit into a soundbite for Fox or MSNBC, so I imagine that the demonization of both sides and the further polarization of American society will proceed with all deliberate speed.

In the case of a system of market exchange, I want to make two points about the impact of ecological divides and the hard-wired human tendency to make poor decisions under the influence of an ecological fallacy. The first, which I’ll only note briefly today but will describe in much more detail in subsequent weeks, is that it’s crucial for any investor to understand the basics of computer-driven methodologies of ecological inference. These methodologies, which fall under the rubric of Big Data, are driving revolutionary applications in fields as diverse as medicine, oil and gas exploration, and national security (this is the technology that underpins the recently revealed NSA monitoring program of mobile telephone meta-data). The technology has made some inroads within the financial services industry, particularly in the liquidity operations of market-makers (see “The Market of Babel”), but is surprisingly absent in risk management and security selection applications. It’s coming. And when these technologies do arrive, their impact on investing and money management will be as significant as that of the telegraph or the semiconductor. My hope is that Epsilon Theory will play some role in that arrival, both as a herald and as a provider.

The second point is that there is a huge ecological divide between investors, based on – as with all ecological divides – the perceived level of useful signal aggregation. When market participants describe themselves as bottom-up or fundamental investors, they typically mean that they base their decisions on signals pertaining to individual securities. When market participants describe themselves as top-down or macro investors, they typically mean that they base their decisions on signals pertaining to an aggregated set of securities, perhaps an entire asset class of securities. For both bottom-up investors and top-down investors the English language uses the same word – “portfolio” – to describe the collection of securities that they own. But there is an enormous difference in meaning between a collection of securities that is seen and understood as an aggregate collection of securities versus a collection of securities that is seen and understood in terms of the individual members of that collection. The meaning of portfolio construction and risk management is very different when seen through the lens of a bottom-up stock-picking strategy than when seen through the lens of a top-down macro strategy, and the impact of this difference is underappreciated by investors, managers, allocators, and service providers.

To a top-down investor the portfolio IS the unit of analysis. A portfolio of securities is created for the express purpose of creating some set of characteristics in the aggregate. A top-down investor is trying to make a tasty stew, and the individual components of a portfolio are nothing more than ingredients that are intended to be blended together according to whatever recipe the portfolio manager is following. Securities are chosen solely for their contribution to the overall portfolio, and their usefulness is defined solely by that contribution. Individual securities have no meaning in and of themselves to a top-down investor, as it is the portfolio itself which is vested with meaning and is the object of the investor’s behavior.

To a bottom-up investor it is tempting to think of the portfolio as the unit of analysis, because it’s the performance of the portfolio that generates a manager’s compensation. But it’s not. To a bottom-up investor a portfolio is a collection of individually-analyzed exposures, where all the meaning resides in the individual exposures. It’s a “portfolio” simply because the bottom-up investor owns several individual exposures and that’s the word the English language gives to the owning of several individual exposures, not because there was any attempt to create or achieve some set of aggregate characteristics by owning several individual exposures. To use the imagery of Lao Tzu, a portfolio is a clay vessel to a fundamental investor, a provider of empty space that holds what is meaningful, rather than something that is meaningful in and of itself. The existence of a portfolio is an epiphenomenon to the behavior of a fundamental investor, not the object of that behavior, and to treat it as more than that or differently from that is a mistake.

Okay, Ben … that’s a very poetic metaphor. But what’s the problem here in concrete terms?

Both the bottom-up and top-down perspectives are demonstrably valid and effective within their own spheres. But when those spheres blur within investment decision-making you’ve got a problem. For a top-down portfolio manager this usually takes the form of imbuing meaning to an individual security (“Hmm … I think I will choose this stock to express the characteristic I want to have in my portfolio because I heard that it might be the target of a proxy fight. It’s like a free call option, right?”), and for a bottom-up portfolio manager this usually takes the form of tinkering with individual exposures in order to adjust or mitigate some portfolio-level attribute (“Hmm … I’m 40% net beta long and I’m really worried about this market. I better cut some of my high beta longs, maybe add some S&P puts. Gotta manage risk, right?”). Both of these behaviors fall into the chasm of the ecological divide, and the latter in particular is an expression of an ecological fallacy, no different in its logical inconsistency than believing that Zimmerman the individual should have been found guilty because he is part of a large set of individuals and actions that bear responsibility in the aggregate for a significant social iniquity.

The ecological fallacy expressed by tinkering with the individual exposures of a bottom-up, stock-picking portfolio happens all the time, in large part because these portfolios are typically judged and evaluated with the same tools and the same criteria used for top-down portfolios. A bottom-up portfolio manager is absolutely inundated with signals of all sorts about the aggregate characteristics of his portfolio … scenario analyses, volatilities, betas, correlations, active weights, gross and net exposures, etc. … and everyone knows that it’s critical to manage your exposure to this factor and that factor, that you should seriously consider a “trading overlay” or a “volatility hedge” to your portfolio. Or so we are told. And so we believe, because every institutional investor asks the same questions and collects the same performance and exposure data based on aggregate portfolio characteristics. We believe that everyone knows that everyone knows that it’s critical to manage your exposure to this factor or that factor, and thus it becomes Common Knowledge. And once it becomes Common Knowledge, then even if a fundamental investor privately believes that this is all hokum for the way he manages money, it doesn’t matter. The dynamics of the game are such that the rational choice is to go along with the Common Knowledge, else you are the odd man out. The Common Knowledge game is rampant in the business of money management, in exactly the same way that it is rampant in the intrinsic market activities of managing money.

The best stock-picking portfolio managers I know ignore 99% of the portfolio level data they are bombarded with, and good for them! A logically consistent bottom-up portfolio manager does not “manage to” some target Volatility or Sharpe Ratio or any other aggregate portfolio characteristic, because it makes no sense given what a portfolio means to a logically consistent fundamental investor. Again to refer to Lao Tzu, portfolio and risk management tools for the fundamental investor are more useful if they cut out measures and algorithms that do not make sense for the purpose or meaning of “portfolio” in the context of investing in individual securities.

But does that mean that fundamental investors are destined to fly by the seat of their pants through what is a decidedly foggy and stormy environment? Are there no effective instruments or tools that can help allocators and managers understand what makes one stock-picking portfolio different from or better than another? I think that there are – or rather, could be – but these instruments need to be designed on the basis of what a portfolio means to a bottom-up investor, not what a portfolio means to a top-down investor. Unfortunately, every portfolio risk management tool or concept on the market today (to my knowledge) is based on the top-down investor’s perspective of portfolio-as-tasty-stew, as the direct object of analysis for the risk management tool, rather than the bottom-up investor’s perspective of portfolio-as-clay-vessel, as the indirect object of analysis for the risk management tool.

So what is a useful way of evaluating a portfolio-as-clay-vessel? To answer that question we need to ask why a fundamental investor has a portfolio at all. Why not just have three or four very large positions in your highest conviction stock-picking investment ideas and call it a day? One answer, of course, is that this approach doesn’t scale very well. If you’re managing more than a hundred million dollars, much less several billion dollars, finding sufficient liquidity depth in your best ideas is at least as difficult a task as identifying the best ideas in the first place. But let’s leave this aside for now as a practical challenge to a highly concentrated portfolio, not a fundamental flaw.

The fundamental flaw with concentrating investment decisions in a handful of exposures is that any investment is an exercise in decision-making under uncertainty. All fundamental investors “know their companies” as well as they possibly can, but in this business you’re wrong about something every single day. And that’s fine. In fact, it’s perfectly fine to be wrong more often than you’re right, provided that you have a highly asymmetric risk/reward payoff associated with being right or wrong with your fundamental analysis. In the same way that you would think about your bets at a horse track in terms of the expected pay-off odds AND your assessment of the expected race outcome, so are the exposures within a bottom-up portfolio based on a joint view of the likelihood of being right about future events AND the pay-off for being right. Different managers have different business models and views about the types of bets and the time frames of bets that are right for them, but this is the common language for all bottom-up investment strategies.

Thinking in terms of this joint probability function reveals why a bottom-up investor owns more than three or four exposures. Your best investment idea may not be (and in fact rarely is) the one where you are simply the most confident of the horse winning the race. It’s the one where you are most confident of the horse winning the race relative to the expected pay-off for winning the race and the expected loss for losing the race. Your best investment idea may well be (and in fact often is) based on a low probability event with a very high pay-off if you’re right and a reasonably low cost if you’re wrong, but you would be a fool to have a highly concentrated portfolio based solely on low probability events because the odds are too high that you will run into a streak of bad luck where none of the low probability events occur. Instead, you want your investment ideas to be sized in a way that maximizes the total of the expected returns from all of these individual joint probability calculations, but within a framework that won’t let a run of bad luck at the individual level put you out of business. That’s what a portfolio means to a bottom-up investor.

The language I just described – assessing risk and reward as a function of the probability of various informational outcomes and the pay-off associated with those outcomes – is called Expected Utility. It is the language of both Game Theory and Information Theory, and it is the language of the Epsilon Theory algorithms. In the same way that we can describe the informational surface of a security (see “Through the Looking Glass” and “The Music of the Spheres”), where price forms an equilibrium “trough” and the height of the “walls” around that trough represent the informational strength of the signal required to move the price outcome to a new equilibrium level, so can we describe the informational value of a specific portfolio exposure, where the vector (weight and direction) of that exposure versus the informational surface of the security represents the risk/reward asymmetry of that particular exposure from an Information Theory perspective. These individual informational values can be arrayed against probability distributions of new information coming into play for each individual security, and Monte Carlo simulations can then generate the optimal exposure weights for each individual security within the context of an overall business tolerance for bad luck. The resulting portfolio should be, by definition, the perfectly sized clay vessel to hold the securities chosen by the manager for their individual characteristics within a specified framework of business risk. The portfolio is the byproduct of the risk/reward attributes of the individual securities, not a directly constructed entity, and its own attributes (while measurable by traditional top-down tools if you care to do so) are relegated to the background where they belong.
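
Here is a deliberately stripped-down sketch of that sizing logic – two hypothetical bets, one high-probability with a modest payoff and one low-probability with a large payoff, sized by brute-force simulation against a stated tolerance for bad luck. Every number is invented, and a real implementation would be far richer:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two hypothetical "bets": (probability of being right, payoff if right, payoff if wrong)
ideas = [(0.60, 0.15, -0.10),   # high-probability, modest payoff
         (0.20, 1.00, -0.15)]   # low-probability, high payoff

def simulate(weights, n_trials=20_000, horizon=12):
    """Terminal returns over `horizon` independent realizations of each bet."""
    results = np.zeros(n_trials)
    for w, (p, win, lose) in zip(weights, ideas):
        outcomes = np.where(rng.random((n_trials, horizon)) < p, win, lose)
        results += w * outcomes.sum(axis=1)
    return results

# Grid-search the sizing that maximizes expected return, subject to a
# "business tolerance for bad luck": the 5th-percentile outcome may not
# lose more than 20%. Weights sum to 1 (fully allocated, for simplicity).
best = None
for step in range(21):
    w = (round(step * 0.05, 2), round(1.0 - step * 0.05, 2))
    r = simulate(w)
    if np.percentile(r, 5) > -0.20 and (best is None or r.mean() > best[1]):
        best = (w, r.mean())
print(f"best weights within risk tolerance: {best[0]}, expected return {best[1]:.1%}")
```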

I recognize that the paragraph before that sketch is quite a mouthful, and the language is foreign to most readers, in particular most bottom-up investors. I mean … very few bottom-up investors read up on Simpson’s Paradox or the latest applications of negative binomial stochastic distributions in their spare time. A stock-picker reads 10-Q’s and bond covenants in his spare time. A stock-picker is fluent in the written language of financial statements and the body language of management one-on-one’s, not the mathematical language of causal inference. But unfortunately there’s no getting around the mathematical language of statistical logic and causal inference whenever you start to aggregate complex things into complex collections of things, particularly when trillions of dollars are sloshing around in these complex aggregations. Without the structure and guard rails of mathematical tools and constructs, human decision-makers tend to fall into ecological chasms whenever they turn their focus from the individual level to the aggregate level to the individual level again.

The problem is that bottom-up investors have been ill-served by those who ARE fluent in these statistical languages. The available tools for portfolio construction and risk management aren’t guard rails at all to a bottom-up investor, but actually serve to encourage ecological fallacies and poor portfolio management. That’s because these tools were all designed from a top-down investment perspective, not out of malice or spite, but out of the intellectual hegemony that Modern Portfolio Theory exercises far and wide. It’s time for a re-conceptualization of these tools, one based on the truly fundamental language of Information and a recognition of the validity of different investment perspectives. That’s what I’m trying to achieve with Epsilon Theory.


The Market of Babel

“But Achilles, weeping, quickly slipping away from his companions, sat on the shore of the gray salt sea, and looked out to the wine-dark sea.”
– Homer, “The Iliad”

The story of the Tower of Babel in the Book of Genesis, from whence we get the word “babble”, has always struck me as one of the most interesting Biblical origin myths. After the Flood, mankind is united and strong, speaking a single language. They build a great city and an even greater tower in the land of Shinar, which attracts God’s attention. God comes down from Heaven to see what Man is up to, notes that as a people with one language nothing Man sought would be out of reach, decides that this simply won’t do, and “confounds” their speech so that they no longer understand each other.


The Tower of Babel before (Pieter Bruegel the Elder) …


and the Tower of Babel after (Gustave Dore)

Construction on the tower stops, life in the city becomes untenable, the various linguistic groups scatter to the far corners of the globe, and a jealous God is safe once more from those uppity humans.

As described in a prior note (“Through the Looking Glass”), language is the quintessential example of Common Knowledge (usually called Convention in linguistic studies) in human behavior. This is what language IS … the belief that everyone knows that everyone knows a long-eared rodent that jumps around a lot is called a “rabbit” and not a “gavagai”, and the behavior that stems from that belief. If your group does not share the Common Knowledge or Conventions of another group when it comes to communicating about how to hunt long-eared rodents that jump around a lot, that’s a problem.

But as Jehovah knew all too well (and Quine rediscovered in 1960), the problem with people having different languages is not just the inconvenience of having to translate from one word that describes a long-eared rodent that jumps around a lot to another word that describes the same thing. If that were the only issue, then construction on the Tower could have proceeded, just at a slower pace and under the friction of translation. No, the lack of a shared language places a much more formidable obstacle in the path of human communication – the problem of meaning. Humans possessed of one set of Conventions, such as language, interpret and act on the world differently from humans possessed of another set of Conventions. The observed “facts” of the world will mean something different – sometimes slightly different and sometimes very different – to people possessed of different Conventions, and that difference in meaning is often entirely unbridgeable.

For example, consider another great classical text, the Iliad of Homer. One of the most famous phrases in that epic is “the wine-dark sea” that brooding Achilles contemplates after Agamemnon takes Briseis away from him, a strangely evocative image of the ocean that Homer uses several more times in his tale. But here’s the thing … throughout the Iliad and the Odyssey, Homer never describes the sea as blue. He never describes the sky as blue. He never describes anything as blue. His only use of the Greek word that would later come to mean what we think of as “blue” – kuáneos – is in describing the dark sheen of Hector’s hair and Zeus’s eyebrows. How can the greatest epic poet in human history fail to see the ocean or the sky as blue?

Caroline Alexander has a wonderful essay (“A Winelike Sea”) in the most recent issue of Lapham’s Quarterly (Vol. VI, Num. 3, Summer 2013) that examines this mystery. As she notes, the answer to this conundrum for both Goethe in “Theory of Colors” and William Gladstone (yes, the four-time British Prime Minister was also an acclaimed classicist) was simple: Homer and all the ancient Greeks were color-blind. No, really. The greatest minds of the 19th century (well, Goethe qualifies at least) concluded that most Greeks must have been color-blind until the fifth or sixth century BC, when a poet named Simonides used the word kuáneos in a way that might mean “dark blue”. Other analysts came to the conclusion that, well … if Homer wasn’t color-blind, then that must mean that ancient Greek wine wasn’t red or purple, but was often blue! Right.

As Alexander points out, Homer may not have had the same words as we do today for color, but he had many more than we do today for light and the way it interacts with the world – so that the color white is never simply white, but is “glancing white” or “flashing white” or “gleaming white” or “shimmering white” depending on how the light strikes it. And when you start to read Homer’s phrasing through the lens of light and not the lens of color, it makes a big difference in how you understand the text. Unfortunately, no matter how skilled the translator (and this is not Alexander’s conclusion, as she is, after all, a very skilled translator), this means that it is ultimately impossible for us to read the Iliad as Homer intended us to read the Iliad. Homer saw the world very differently than you or I do – not because he was visually impaired or because the water was so alkaline that he had to drink blue wine – but because he and his contemporaries shared a different set of Conventions regarding how to interpret the world. And no matter how much we would like to see the ocean and sky as Homer did, as a quality of the light, we can’t stop seeing the ocean and the sky as blue. I defy anyone in the modern world to look at the picture of Santorini below and NOT use the concept of “blue” in any description of the scene.

[Image: Santorini, Greece]

Homer could. We can’t. The difference in our perception of the world and Homer’s perception is incommensurable and ultimately unbridgeable. Such is the power of language and Convention. Such is the power of Common Knowledge.

Okay, Ben, that’s very interesting and all … but how does this help us become better investors?

First, we have to realize that the two great languages of investing – Value (along with its grammar, Reversion-to-the-Mean) and Growth (along with its grammar, Extrapolation) – are just that … languages. Neither of these sets of Conventions is timeless or universal, and each conditions its speakers to interpret the observed facts of the world differently from the other. Not more truthfully. Just differently. Like any language, the primary usefulness of a shared set of Conventions is not found in inter-tribe communications, where both the friction of translation and the problem of meaning raise their ugly heads, but in intra-tribe communications. And like any language, the larger the tribe that shares the particular set of Conventions, the greater the utility for each individual member of the tribe. Calling a long-eared rodent that jumps around a lot a “rabbit” is much more useful to me if everyone I come into contact with shares the same vocabulary, grammar, and meaning for the word than if a sizable group speaks another language. In the latter case we will inevitably, to some degree, talk past each other whenever we try to communicate about long-eared rodents that jump around a lot, and that creates, by definition, a less efficient behavioral outcome for all of us.

The languages of Value and Growth are always useful to some degree in markets because the tribes that speak these languages are a significant enough proportion of pretty much any investment game to allow for meaningful intra-tribe communication. But the relative proportion of these tribes within any given market for any given security is extremely influential in shaping market game-playing, and the transition and inflection points of this relative proportion are predictive of transition and inflection points in security prices. There are consistent behavioral patterns, as expressed in security prices, associated with the waxing and waning of investment language population proportions. I have found the tools of linguistic evolution, as found in (among other places) the work of Brian Skyrms, particularly Signals: Evolution, Learning, and Information (Oxford University Press: 2010), to be very useful in understanding how the languages of Value/Reversion-to-the-Mean and Growth/Extrapolation wax and wane in their proportion of the overall population of investors for a particular security, and hence their importance in driving market outcomes. These are game theoretic tools, and they are at the core of the Epsilon Theory methodology.

For example, technology stocks tend to be much more driven by a Growth Narrative than by a Value Narrative. This is particularly true in large-cap tech stocks because the impact of Narrative in general is greater in large-cap stocks. Why? Because an informational “edge” is much harder to come by with large-cap stocks than small-cap or even mid-cap stocks, and as a result game-playing as driven by this Narrative or that is much more prevalent. Unless you are breaking the law, there is no possible way that you will know something about the fundamentals of, say, Apple that no one else knows and that is sufficient to move the stock. You either have a Growth language to speak with other Growth tribe members about Apple, or you have a Value language to speak with other Value tribe members about Apple. There are enough fellow tribe members that you will never be alone or seriously doubt your belief, but the Growth tribe is, historically speaking, a much more “enthusiastic” owner of tech stocks like Apple than the Value tribe.

Put differently, the day the dominant Apple Narrative shifted from “it’s expensive, but …” (a Growth tenet) to “it’s actually really cheap” (a Value tenet) is the day the stock stopped working, and the stock is unlikely to work again – regardless of how big a dividend Apple pays or whether it issues preferred stock (all Value tenets) – until Growth tenets reclaim control of the Apple Narrative. Evaluating how market opinion leaders talk about Apple is more important than what market opinion leaders say about Apple because it reflects the relative proportion and strength of one tribe of Apple owners, with a particular vision of what that ownership signifies and what behavior it entails, versus another tribe of Apple owners with a different vision.

Second, it is critical to recognize that there is a third language of investing in the world today, the language of Liquidity, and it’s not a human language at all. It is the language of Big Data, of computer-driven statistical inference, and if you try to “speak” the language of Value/Reversion-to-the-Mean or the language of Growth/Extrapolation to a computer on the other side of the trade, you are going to lose. Not a lot, but you are going to pay a tax whenever you take liquidity from a computer program. Why? Because algorithms, like Homer, see the world differently than you and I do.

The Conventions and the “biology” of modern computing systems make them very effective pattern recognizers of highly distributed and disparate data signals on a micro-second time horizon. They can “see” Liquidity signals in a way that is as alien to the human brain as the visual signals perceived by insects with compound eyes. Not only are human patterns of liquidity demand completely transparent to a modern liquidity provision algorithm, but also the typical effort made to hide liquidity demand – which is always some variation of chopping up a large order into smaller pieces and then injecting those pieces into the market according to a schedule determined by a sub-algorithm – only creates another sub-pattern or code that is in turn inevitably cracked by the liquidity provision algorithm. If the processing power available to crack these codes were limited to the human brain, then any of these chopping-up sub-algorithms would be sufficient to hide the pattern created by a modified VWAP order or one of its time-delineated kin. But with the processing power available to even the more modest liquidity provision algorithms, there is no hope – absolutely no hope – of creating any trading pattern that is somehow invisible or untraceable. As a result, algorithms dominate the liquidity operations of the modern market and have a significant trading advantage anytime a human decision-maker decides to create an exposure and take liquidity without another human simultaneously making a decision to provide liquidity.
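
To see why the chopped-up order is a code to be cracked rather than camouflage, consider how even a naive detector – nothing remotely like a production liquidity algorithm – finds a scheduled slicing pattern hiding in random trade flow. All numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(4)

# One hour of toy trade flow: 300 "natural" trades at random times, plus a
# parent order sliced into ~900-share child orders every ~30 seconds.
t_noise = np.sort(rng.uniform(0, 3600, 300))
t_child = np.arange(15, 3600, 30) + rng.normal(0, 1.0, 120)
times = np.concatenate([t_noise, t_child])
sizes = np.concatenate([rng.lognormal(5, 1, 300), rng.normal(900, 25, 120)])
order = np.argsort(times)
times, sizes = times[order], sizes[order]

# Naive detector: find a size bucket whose inter-arrival times are
# suspiciously regular. A Poisson (random) stream has a coefficient of
# variation near 1.0; a scheduled algorithm sits far below that.
bucket = (sizes > 850) & (sizes < 950)
gaps = np.diff(times[bucket])
cv = gaps.std() / gaps.mean()
print(f"{bucket.sum()} trades near 900 shares, inter-arrival CV = {cv:.2f}")
```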

In the same way that the relative proportion of Value-speakers to Growth-speakers makes a big difference in the medium to long-term price trends of certain securities, so does the relative proportion of human liquidity takers and providers to non-human liquidity operators make a big difference in the short-term price movements of certain securities. There are tools available to gauge this proportion (Hurst coefficients, for example), and even a cursory awareness of the language of Liquidity can help a portfolio manager anticipate the risk of pernicious reflexive price action (see “The Music of the Spheres”), particularly within an unstable informational framework.
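
For readers who want to experiment, here is one crude way to estimate a Hurst exponent from a price series, via the scaling of lagged differences rather than the classical rescaled-range calculation. Illustrative code, not a production estimator:

```python
import numpy as np

rng = np.random.default_rng(5)

def hurst(series, lags=range(2, 100)):
    """Crude Hurst exponent via the scaling of lagged differences.

    H near 0.5: random walk. H > 0.5: trending. H < 0.5: mean-reverting,
    the signature often associated with heavy algorithmic liquidity provision.
    """
    tau = [np.std(series[lag:] - series[:-lag]) for lag in lags]
    slope, _ = np.polyfit(np.log(list(lags)), np.log(tau), 1)
    return slope

random_walk = np.cumsum(rng.normal(size=5000))
mean_reverting = np.zeros(5000)          # toy mean-reverting series (AR(1))
for i in range(1, 5000):
    mean_reverting[i] = 0.9 * mean_reverting[i - 1] + rng.normal()

print(f"random walk:    H = {hurst(random_walk):.2f}")    # about 0.5
print(f"mean-reverting: H = {hurst(mean_reverting):.2f}")  # well below 0.5
```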

Third, the implementation of any investment strategy can be improved by considering the common language that underpins the languages of Value, Growth, and Liquidity – the language of Information. I use the word “implementation” intentionally, because these insights of Epsilon Theory are less useful if you are buying a security, closing your eyes for three years, and then hoping to wake up and sell for a 30% gain. Epsilon Theory is most useful for investors for whom the path matters. If it matters to you whether this security goes down 30% before it ultimately ends up 30% higher, if you allow for the possibility that you might change your mind about the wisdom of holding this security at this point in time versus that point in time, then you should think about your investing in terms of Information. A concern with strategy implementation is a concern with the risk/return efficiency of exposures over time, and this is where an understanding of the common language of Information is so useful. As described in prior notes (“The Music of the Spheres” and “Through the Looking Glass”), understanding a security in terms of its informational surface (akin to a volatility surface) allows Value and Growth and Liquidity signals to be treated in a unified analytical framework. I’m not saying that those who speak the fundamental language of Information will see “nothing withheld from them which they purpose to do”. But if the power of a common language was enough to frighten God Almighty, well … that sounds like it should at least be good for a 2.0 Sharpe Ratio. Anyone else care for a bite of this apple?

PDF Download (Paid Membership Required):

http://www.epsilontheory.com/download/15752/

The Music of the Spheres and the Alchemy of Finance

“You say that we go round the sun. If we went round the moon it would not make a pennyworth of difference to me or to my work.”
– Sherlock Holmes (from “A Study in Scarlet” by Arthur Conan Doyle)

“It doesn’t matter if the cat is black or white, as long as it catches mice.”
– Deng Xiaoping

“I could float off this floor like a soap bubble if I wish to. I do not wish to, because the Party does not wish it. You must get rid of those nineteenth-century ideas about the laws of Nature. We make the laws of Nature.”
O’Brien (from “1984” by George Orwell)

A few million years ago – the blink of an eye in evolutionary terms – our ancestors were roaming around some African savannah in a small band. We are still that social hunter-gatherer, for better or worse, with all the advantages and disadvantages our evolutionary heritage provides. Advantages include opposable thumbs, big eyes, and lots of neurons devoted to pattern recognition … attributes that, among other things, make our species very competent at driving cars and playing video games. Disadvantages include relatively few neurons and no sensory organs for interpreting really large numbers or physical laws that are foreign to an African savannah … attributes that, among other things, make our species poor theoretical astronomers.

We are excellent observers and pattern recognizers. For thousands of years, no astronomical event visible to the naked eye, no matter how minor, has escaped our attention and overwhelming need to find its pattern. But if understanding why these celestial patterns occur as they do requires a belief that the sun is an incomprehensibly large ball of hydrogen plasma 93 million miles away that warps the time/space continuum with its gravitational force … well, it’s pretty easy to understand why a heliocentric theory wasn’t humanity’s first choice.

For thousands of years, then, Common Knowledge – what everyone knows that everyone knows – of humanity’s place in the universe was dominated by this geocentric view, supported in the distant past by various origin myths and since 384 BC and the birth of Aristotle by the Narrative of Classical Science. Like all successful Narratives, Classical Science and Aristotelian geocentrism had a ring of truth to it (“truthiness”, as Stephen Colbert would say) and worked in concert with the interests of the most powerful political and economic entities of the day, from the Alexandrian Empire to the pre-Reformation Catholic Church. For almost 2,000 years the status quo entities of the West – whether explicitly religious such as the Catholic Church or dynastic and quasi-religious such as the Rashidun Caliphate, the Byzantine Empire, and the Holy Roman Empire – were based on a geocentric origin myth. The Narrative of Classical Science was extremely useful in efforts to maintain this myth because it allowed these political institutions to present geocentrism within the “modern” and compelling framework of Greek culture and learning, as opposed to the rather grim and ancient oral traditions of a nomadic desert tribe. Charlemagne may have famously used the sword to convert entire tribes to Christianity, but over a longer period of time Aristotle proved even more effective.

Unfortunately, however, there was a problem with the Aristotelian geocentric view of all the heavenly bodies circling the Earth in unison, creating a perfect and timeless “music of the spheres” … the data didn’t fit the theory. Mars, for example, goes back and forth in the sky with a retrograde motion during certain periods of the year, as do all of the planets to one degree or another, as opposed to a steady sweep across the sky as would be the case with a regular orbit around the Earth. Why? Because in truth both Earth and Mars go around the sun, and since Earth’s orbit is inside that of Mars, the position of Mars relative to Earth takes on a retrograde pattern when seen from Earth.

epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-mars-retrograde

Fortunately for the Narrative of Classical Science, however, around 140 AD an Alexandrian Greek named Claudius Ptolemy figured out how to reconcile the observed astronomical patterns with Aristotelian theory by devising the notion of “epicycles” and “deferent points”.

epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-aristotle epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-earth

Claudius Ptolemy (c. 90 – 168 AD)

In the Ptolemaic system, planets don’t orbit around the Earth directly. Instead, they orbit around a point in space (the epicycle center) that orbits around another point in space right next to Earth (the deferent center). The result is a flower-like orbit around the Earth for every planet, generating periods of retrograde movement as seen from the Earth.

If you ever had a Spirograph as a child (my favorite toy ever!), you’ll immediately understand Ptolemy’s theory. Basically he re-conceptualized the solar system as a giant, complex Spirograph, and through that brilliant insight the Narrative of Classical Science was saved.

epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-spirograph

Almost all of the observed data fit the Ptolemaic model well, and the theory was effective at predicting future astronomical events like eclipses and transits. Not perfect, but effective. For more than 1,000 years after his death in 168 AD, Ptolemy was the first and last word in everything to do with astronomy and astrology in both the Christian and Islamic worlds. Now that’s a useful Narrative!

So what went wrong with the Spirograph model of the universe? In school we learn that Copernicus “discovered” the heliocentric theory of the solar system and published a book to that effect in 1543, thus launching the Copernican Revolution. The popular implication is that it was the strength of the new ideas themselves that won the day for Truth and Reason against the narrow-minded intellectual tyranny of the Church. Yeah, right. In fact, it wasn’t until 60 years after Copernicus died that the Church got around to condemning his book and his theory. It took that long for his ideas to become dangerous to Rome because it took that long for his ideas to become useful to political and economic entities in Northern Europe. It also took that long because the world had to wait for Kepler and Galileo to improve on Copernicus so that his theory fit the observed data more comprehensively AND more effectively than Ptolemy.

epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-copernicus epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-kepler epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-galilei
Nicolaus Copernicus (1473-1543)
Johannes Kepler (1571-1630)
Galileo Galilei (1564-1642)

I want to focus on that last point for a minute. The original heliocentric model that Copernicus developed was a lot simpler than the Ptolemaic model, but it didn’t work very well … if you wanted to predict an eclipse or the date that Easter would occur in some future year, you were still better off using the good old Ptolemaic tables. To make the observed data fit his model, Copernicus ultimately had to take the same Spirograph approach that Ptolemy had used 1,400 years earlier, complicating his original ideas enormously by introducing epicycles and the like. The problem for Copernicus was that he was hooked on the idea of circular orbits. It wasn’t until Kepler modified the heliocentric model with the idea of elliptical planetary orbits in 1609 that everything fell into place with a single simple theoretical structure. And it wasn’t until Galileo made his telescopic observations of the phases of Venus in 1610 that the Copernican model accounted for observed facts that the geocentric model could not possibly support.

In the history of Ptolemy and Copernicus we see the three necessary and sufficient conditions for a “paradigm shift”, which is just another term for an abrupt change in the Common Knowledge surrounding some socially-constructed phenomenon:

1) new data observations that fit the new paradigm better than the old paradigm;

2) new ideas that create a simpler and more fundamental structure for the new paradigm relative to the old paradigm;

3) political and economic entities that come to see their self-interests better supported by the new paradigm than by the old paradigm.

I think it’s possible that we are on the cusp of just such a paradigm shift within the investment world, away from a narrow-minded faith in the power of Modern Portfolio Theory and its econometric foundations, and towards a more inclusive view of markets that incorporates an appreciation of historically-conditioned behavior as well as patterns of strategic decision-making under uncertainty.

Maybe that’s just wishful thinking on my part, but the necessary and sufficient conditions for change are present, including the realization by powerful political and economic entities that the current system … well, it just ain’t working.  Structural changes in markets (see “How Gold Lost Its Luster”) are eroding business models left and right. The collapse in trading volumes is poison to anyone who worships at the altar of Flow, like bulge bracket sell-side firms, and rampant disintermediation is death to gatekeepers like fund-of-funds and consultants. I mean, is your view on whether to buy or sell Apple really going to be influenced by the umpteenth sell-side model of Apple’s gross margins? Do you really need a consultant to tell you how to buy market exposure through ETF’s?

Of course, the same thing happened the last time we suffered through a multi-year investing environment of alpha scarcity and beta dominance, back in the 1930’s. Market makers and investment intermediaries dropped like flies throughout the decade, a process that – like today – was accelerated by sharp shifts in the regulatory environment (Glass-Steagall in 1933, Dodd-Frank in 2010). In fact, it really wasn’t until the mid-1950’s that the financial services industry began to grow dramatically again, not coincidentally with the introduction of Modern Portfolio Theory in 1952 and the expansion of retail brokerages (especially “the thundering herd” of Merrill Lynch) onto every Main Street in the U.S. These twin Narratives of the 1950’s – Everyman Stock Ownership and Modern Portfolio Theory – drove an investment paradigm shift that created modern Wall Street.

I don’t know for sure what the 21st-century equivalents of Everyman Stock Ownership and Modern Portfolio Theory will be for Wall Street, or how and when the associated Narratives will develop. But there is no more important question that Wall Street needs to answer in order to reinvent itself (again), and if I had an hour of Gary Cohn’s time this is what I’d want to talk about. I think that Epsilon Theory is a good place to start in evolving Modern Portfolio Theory into something more useful, and I think that there is already an established set of people and practices that can push paradigmatic change forward: traders and technical analysis.

Like many investors trained with a fundamental bias, for years I pooh-poohed the very idea that technical analysis had anything to offer a “true” investor. Technical analysis was the province of traders, and to call someone a “trader” was about the worst insult you could deliver in these circles. It was an entirely pejorative term, and it meant that you weren’t as serious or as thoughtful as a praiseworthy “long-term investor”. I was foolish to hold this bias, and I was a less effective portfolio manager for it. My job as a portfolio manager was to make money for my clients – to catch mice, as Deng Xiaoping would put it – not to make money in a theoretically pure way. If technical analysis could have helped me catch more mice – and it could – then I should have embraced it, not dismissed it out of hand.

Technical analysis is, at its heart, behavioral analysis, and as such is prime real estate to build a new investment paradigm that incorporates game theoretic behaviors. 

Now don’t get me wrong … there are HUGE problems with the application of technical analysis today. Technical analysis requires a Copernican Revolution. By that I mean it needs to be re-conceptualized away from the Spirograph models and the naïve empiricism that currently dominate the effort. Not because the current conceptualization is a failure, any more than the Ptolemaic conception of the solar system was a failure. The world functioned quite well for 1,400 years using Ptolemaic tables to predict the position of planets and stars, thank you very much. But technical analysis could accomplish so much more, both in terms of accuracy and of scope, if it were put on a more solid theoretical foundation. If you try to launch a rocket while believing in a geocentric model of the universe, you’re going to fail. I think that technical analysis could, in fact, “launch a rocket” in that it could drive a paradigm shift in how we think about and operate effectively within modern markets, but only if we change the conceptual center of the trading universe from Price to Information. 

When you talk with experienced options traders, it seems as if they can “see” securities in terms of volatility space, not the two-dimensional price-over-time matrix that most investors and traders use as their lens. What I want to suggest is to see ALL securities in terms of what I will call “information space”. In prior work (“Through the Looking Glass”) I laid out this methodology in detail, so I won’t repeat that here. The basic idea, though, is to describe a point in time for a security in terms of the information required to move that security’s price from its current equilibrium to a higher or lower equilibrium.

epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-equilibrium

Above, for example, is a simplified two-dimensional informational scenario for a broad market, say the S&P 500, with the black ball representing the current equilibrium price level for that market and the height of the trough walls representing the informational strength of the current equilibrium level. This is the informational surface. To make the ball “roll” to a new higher equilibrium level requires a strong enough signal (represented by the green arrow) to get over the right-hand trough wall, and vice versa for the market to go down. The informational surface plus the new information signal combine to create an informational scenario.
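Here’s a minimal sketch of that mechanic in code, with every number hypothetical: the surface is just a pair of wall heights denominated in generic bits, and the equilibrium shifts only when a signal’s informational strength clears the relevant wall.

```python
from dataclasses import dataclass

@dataclass
class InformationalSurface:
    """Toy informational surface: one equilibrium, two trough walls.

    Wall heights are denominated in generic bits of informational
    strength; every number here is hypothetical.
    """
    equilibrium: float   # current equilibrium price level
    up_wall: float       # bits required to roll to a higher equilibrium
    down_wall: float     # bits required to roll to a lower equilibrium

    def apply_signal(self, bits, direction, new_level):
        """Play out one informational scenario: surface plus signal."""
        wall = self.up_wall if direction == "up" else self.down_wall
        if bits > wall:
            self.equilibrium = new_level   # the ball rolls into a new trough
        return self.equilibrium

# Hypothetical S&P 500 scenario: a strong upside wall, a weak downside wall.
surface = InformationalSurface(equilibrium=1600, up_wall=1.8, down_wall=0.6)
print(surface.apply_signal(bits=1.2, direction="up", new_level=1650))    # 1600: blocked
print(surface.apply_signal(bits=1.2, direction="down", new_level=1550))  # 1550: rolls
```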

epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-time

This change in price equilibrium levels within an informational scenario can be mapped against a traditional price-over-time chart as shown above, but the depiction in informational space is much more useful than the depiction in price space. Why? Because any given price outcome can be generated by multiple informational scenarios, but any given informational scenario will generate one and only one price outcome. Just knowing the price outcome gives you little idea of how or why that price outcome arose. But if you know the informational scenario you know both the unique price outcome as well as how it came to be. An informational scenario can predict price, but a price can neither predict nor explain afterwards an informational scenario.

Seeing the market in terms of information space will NOT tell you whether the market is going up or down. It shows you how the market is likely to react to new information, and it gives you tools for evaluating the potential market impact of new information. Epsilon Theory is both a methodology (a toolbox for evaluating observed data) and a theory (a conceptualization of observed data). Methodologies and theories are neither true nor false, only more or less useful than alternative toolboxes and conceptualizations, and I have no doubt that Epsilon Theory is not terribly useful for some investment strategies. Sherlock Holmes didn’t care whether the Earth went around the sun or the other way around because it made “not a pennyworth of difference” to his life or his work, and the same is probably true for Epsilon Theory and, say, private equity investing.

But here’s an example of how a common dialog between traders and portfolio managers can be much more useful when reformulated in informational terms.

When a trader tells a portfolio manager that there is “resistance” at a particular price level, and “support” at another, he is making a statement about informational structures created by historical patterns of price movements. This is true regardless of the specific methodology used to determine these resistance and support levels – Bollinger Bands, Fibonacci Series, MACD, whatever. For example, here’s a MACD price chart for Apple that I’ve marked with hypothetical resistance and support levels (NB: I have no idea whether these levels are methodologically accurate, and it really doesn’t matter for the point I’m trying to make):

epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-bloomberg

And here’s that same chart expressed as an informational surface:

epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-informational-surface

The advantage of the informational surface expression over a price chart is that it is (potentially) more accurate, more comprehensive, and more understandable without being more complex.

An informational model is potentially more accurate because the resistance and support levels are not binary or categorical thresholds (resistance vs. support, strong vs. weak) but are variable representations of their inductively derived informational strength or weakness. Rather than simply saying, “there’s resistance at $446” it’s possible to say “it will require 0.6 generic bits of information to get over the resistance at $446 and 1.8 generic bits of information to get over the resistance at $492.” And because there are tools to measure the informational strength of new information, it’s possible to estimate the likelihood that this piece of new information will be sufficient to pierce the $446 resistance but that piece of information will not.

An informational model is potentially more comprehensive because non-price informational barriers can be incorporated directly into the analysis.  One of the biggest weaknesses of technical analysis as it is currently constituted is that it only “sees” informational signals based on historical price outcomes. By re-conceptualizing price as information, other important types of information, such as public statements and macro data announcements, can be plugged into the same inductive analytic framework. There’s an enormous amount of intellectual firepower embedded in technical analysis that is underutilized because it is only applied to price data. Information theory provides a common language for every type of signal, and by “translating” all sorts of signals into the language of information we can significantly expand the scope of powerful inferential tools that fall under the rubric of Big Data.

An informational model is potentially more understandable because the dimension of time can be incorporated more easily, or at least more intuitively, into the analysis. All forms of technical analysis are based on some flavor of time series regressions, so it’s not that time is ignored. But by including it graphically as an additional dimension, you can create the equivalent of a volatility surface, which makes it much easier to “see” the informational dynamics at work.

epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-informational-dynamic

epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-informational-dynamic-3d

What I’m suggesting – to treat patterns of price data as an important informational signal within a broader theory of behavioral finance – is not original to me. The Copernicus of this story is George Soros, and the application of game theory to markets has its first (and in many ways still best) expression in his magisterial 1987 book, The Alchemy of Finance. I am not going to attempt any sort of summary of the book here, because it defies easy summary. It is, as Paul Volcker (!) writes in his Foreword to the 2003 edition, “an honest struggle by an independent and searching mind to break through a stale orthodoxy with new and meaningful insights into financial and human behavior”, which is just about the highest praise an author can receive. I’ll just add that even though Soros does not frame his core ideas such as “reflexivity” in terms of formal game theory, there is no doubt that this is his intellectual home. Everything that Soros writes about the behavior of markets can be expressed, sometimes more effectively but usually less, as a game theoretic construct.

Reflexivity is the best known of Soros’s core concepts, but also tends to be misunderstood. Rather than repeat Soros’s own words or define it with the language of game theory, let me give you an example of reflexivity as a conversation that happens in one form or another hundreds of times a day, every day, all around the world.

PM:                Hey, why is XYZ down 3% all of a sudden?

Trader:         I don’t know. There’s nothing on Bloomberg. Let me ask around.

[2 minutes later]

Trader:         Nothing on chat. All the desks are calling trying to find out what’s going on.

PM:                Is there a conference or something where management is talking?

Analyst:        I don’t think so. I tried calling IR, left a message.

PM:                Well, somebody must know something. I hate this stock … it could go down 10%. Sell half the position and put a tight stop on the rest. Gotta manage the risk. Let me know if either of you hear anything.

[30 minutes later]

Trader:         We got stopped out. You’re flat.

[2 days later when XYZ has fully recovered to its original price]

PM:                Hey, false alarm, let’s start putting XYZ back on. I really like that stock.

This is reflexivity. It’s a bitch.

As with Copernicus, though, there are some problems with the concept of reflexivity as expressed in The Alchemy of Finance. As written (and for all I know, Soros has kept his best work secret in order to build a fortune), reflexivity is more of a heuristic – a rule of thumb or a way of looking at data – than a practical methodology that can be incorporated into a rigorous evaluation. There’s no language to reflexivity other than the language of price, and that’s a problem for Soros in the same way that it’s a problem for technical analysis … it limits the enterprise in both scope and accuracy. But in the same way that elliptical planetary orbits provided the key for translating the central insights of the original Copernican theory into an extremely powerful (i.e., useful) heliocentric model of the solar system, I believe that information theory can translate the central insights of reflexivity into an extremely powerful (i.e., useful) behavioral model of markets.

Here’s the basic idea of how to describe reflexivity in informational terms. Let’s say you have an unstable equilibrium, meaning that the informational barriers for the current price equilibrium to start moving in one direction or the other are quite low. For whatever reason, maybe just the chance result of a large number of Sell orders clustering in time, the equilibrium starts to “roll” to the left.

epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-roll

That price action in and of itself creates a new informational signal.

epsilon-theory-the-music-of-the-spheres-and-the-alchemy-of-finance-july-7-2013-signal

And so on and so on. In retrospect it always seems obvious that the market just “had a mind of its own” and that there was nothing “real” to make the price go down. But when you’re in the middle of one of these episodes it’s not obvious at all. We are biologically hard-wired to pay attention to these signals and to interpret them as part of a larger, more meaningful pattern. These price action signals are entirely real as we experience them, and I think it’s critical to have an investing perspective that treats them as entirely real. That’s what Information Theory provides. By looking at the phenomenon of reflexivity through the lens of Information Theory, we can “see” its dynamics more clearly than if we’re just looking at price, and as a result we have the potential to anticipate and/or react more appropriately when these events occur.
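A toy simulation of that loop – all parameters made up – shows why these episodes feed on themselves: each leg of price action emits a new signal, and each break leaves a weaker wall behind it.

```python
def reflexive_cascade(wall, signal, feedback=0.6, wall_floor=0.05, max_legs=20):
    """Toy reflexivity loop: price action begets signal begets price action.

    Each time a signal clears the (weakening) wall, the resulting move
    emits a new signal proportional to the last one, and the break
    leaves a weaker wall behind it. Parameters are purely illustrative.
    """
    legs = 0
    while signal > wall and legs < max_legs:
        legs += 1
        signal *= feedback                  # next signal, born of price action itself
        wall = max(wall_floor, wall * 0.5)  # each break erodes the equilibrium's wall
    return legs

# An unstable equilibrium (low wall) turns a modest shove into a multi-leg slide...
print(reflexive_cascade(wall=0.10, signal=0.30))  # -> 4 self-feeding legs
# ... while a stable one absorbs the same shove without moving at all.
print(reflexive_cascade(wall=0.60, signal=0.30))  # -> 0 legs
```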

This, then, is the goal of Epsilon Theory – to develop a practical methodology to identify the securities prone to game-playing behaviors like reflexivity, and the conditions under which game-playing behavior is more or less likely to occur. By building on the insights of thinkers like George Soros, E.O. Wilson, and Brian Skyrms I think it’s a very achievable goal. Whether that ultimately sparks a new investment paradigm … who knows? But I’m pretty sure it can help us catch more mice.

PDF Download (Paid Membership Required):

http://www.epsilontheory.com/download/15744/

How Gold Lost Its Luster, How the All-Weather Fund Got Wet, and Other Just-So Stories

“Gold is money. Everything else is credit.”
– John Pierpont Morgan

“The relationships of asset performance to growth and inflation are reliable – indeed, timeless and universal – and knowable, rooted in the durations and sources of variability of the assets’ cash flows.”
– Bob Prince, Co-Chief Investment Officer, Bridgewater Associates

Like every middle-aged white guy I know, I am a big fan of Rudyard Kipling. I grew up on his Just-So Stories and as an adult found that his novels and poems spoke to me, as they did to my father and his father before that. Kipling writes simply, directly, and evocatively. Whether it’s a poem, a short story, or a novel, the man knows how to tell a story. He was the youngest winner of the Nobel Prize in Literature (as well as the first English-language recipient), and after a too-long period of disfavor in the academy he now enjoys a well-deserved renaissance of interest and acclaim.

epsilon-theory-how-gold-lost-its-luster-june-30-2013-kipling

But there is also no doubt that what Kipling wrote was used by political and economic entities of his day to support their own self-interests. As George Orwell said, with wildly popular poems like “The White Man’s Burden” he was the “prophet of British imperialism.” Was he a simplistic rah-rah tout for the rewards of Empire? Not in the least. There is tension, nuance, and respect for the human condition in everything Kipling wrote, at least that I’m aware of. And this is exactly why he was such an effective prophet, such an effective Narrator for mainstream British policy in the first three decades of the 20th century. Kipling’s skill as an author allowed British citizens to feel good about themselves and to support their government’s policies without requiring them to check their brains or their scruples at the door.

These are the hallmarks of effective Narratives – they have an intrinsic ring of truth (“truthiness”, to use Stephen Colbert’s wonderful phrase) that speaks to us on an intellectual and emotional level AND they coincide with the goals and preferences of powerful political and economic entities. Neither of these qualities is inherently a bad thing, whatever “bad” means.  Nor is the content of a Narrative necessarily less truthful because it helps serve broader interests, whatever “truthful” means. Questions of truth and falsehood, good and bad, are impossible to assess from the informational content of the Narrative itself and are only meaningful in the broader context of human society at some given point in time. Kipling’s work gave a voice to the orthodoxy of foreign policy Common Knowledge in 1905 and the anti-orthodoxy of foreign policy Common Knowledge in 1965, even though Kipling’s words themselves never changed. In both eras, the Narrative of Imperialism – pro in 1905, anti in 1965 – was highly relevant to political and economic entities, which gave the world a public lens to interpret Kipling’s work. Today no one cares about the Narrative of Imperialism. It is a dead Narrative, like Manifest Destiny or Cultural Revolution. What Kipling wrote 100 years ago is largely irrelevant for the Narratives that shape our world today, which is probably what allows his work to be better appreciated for how it moves us on a personal level.

The Narrative of Gold was relevant 100 years ago and, unlike the Narrative of Imperialism, it remains relevant today. But like the Imperialism Narrative in 1965, it has morphed from a centerpiece of Common Knowledge orthodoxy into the foil or antithesis for a more modern, ascendant Narrative. Just as the Narrative of Imperialism was supplanted by the Narrative of Self-Determination, so has the Narrative of Gold been supplanted by the Narrative of Central Banker Omnipotence.

So long as the Narrative of Self-Determination was useful to powerful political and economic entities, the Narrative of Imperialism was relevant as well. It’s much easier to make an argument for something when you have something to argue against. But when the Narrative of Self-Determination lost its usefulness (not coincidentally with the end of the Cold War), so did the Narrative of Imperialism fade away. Today the Narrative of Central Banker Omnipotence is extremely useful to powerful political and economic entities, which means that the Narrative of Gold is important, too. Gold is just not important in the same way that it was important 100 years ago, and that shift in meaning makes all the difference in understanding the price of gold.

What “Jupiter” Morgan said about the primacy of gold above all other stores of value rang true to almost everyone when he said it. Did his public statements in support of the gold standard also serve his own self-interest? Absolutely. The US Treasury bought 3.5 million ounces of gold in 1895 from the House of Rothschild and … J.P. Morgan, using funds from a massive (for the time) 30-year bond issue syndicated by … J.P. Morgan. In a highly unusual (i.e. unconstitutional) move, this bond issue was carried out by the White House without any Congressional approval, under the authority of a forgotten Civil War era statute that was identified by … J.P. Morgan.

epsilon-theory-how-gold-lost-its-luster-june-30-2013-jp-morgan

But while there’s no question that the bond sale and subsequent gold purchase lined Morgan’s pockets and served the interests of the rich and powerful considerably, there is also no question that these moves effectively ended the Panic of 1893. Why? Because everyone knew that everyone knew that gold was money. And now the US Treasury had lots of it. Huzzah! Confidence is restored as the Republic is saved by Grover Cleveland and Jupiter Morgan.

To be sure, the Narrative of Gold as told by J.P. Morgan was not accepted at the time by everyone as good or wise policy, any more than the Narrative of Imperialism as told by Kipling was accepted by everyone as good or wise policy at the time. Political oratory being what it was back then, in fact, William Jennings Bryan famously described the imposition of the gold standard as the equivalent of “crucifying mankind on a cross of gold” at the Democratic national convention of 1896 and was nominated by acclamation as his party’s Presidential candidate. This despite Bryan being only 36 years old (the youngest Presidential candidate in American history) and despite Grover Cleveland, the outgoing President who bought the gold from Morgan and Rothschild, being a Democrat and the standard-bearer of the party. Clearly that must have been one hell of a speech!

epsilon-theory-how-gold-lost-its-luster-june-30-2013-speech epsilon-theory-how-gold-lost-its-luster-june-30-2013-bryan

But Bryan and his Free Silver Democrats weren’t opposed to the gold standard because they disagreed with the notion that gold was money; they just thought that silver should be money, too. Whatever you thought about the policy implications, the Common Knowledge about the meaning of gold in 1895 was clear: it was money, and the behavior of market participants in buying and selling gold reflected this meaning.

Now imagine if the current head of the House of Morgan, Jamie Dimon, made the same statement as Jupiter Morgan did, equating gold with money. People would think it was a joke. Everyone knows that everyone knows that gold does not mean the same thing to Jamie Dimon that it did to J.P. Morgan.

To market participants in 2013 gold means lack of confidence in money, and their behavior in buying and selling gold similarly reflects this meaning. Buying gold today is a statement that you believe that global economic events may spiral out of the control of Central Bankers. It is insurance against some sort of massive monetary policy mistake that cannot be fixed without re-conceptualizing the global economic regime – hyperinflation in a developed nation, the collapse of the Euro, something like that – not an expression of a commonly shared belief in some inherent value of gold.

The source of gold’s meaning, whether you are a market participant in 1895 or 2013, comes from the Common Knowledge regarding gold. J.P. Morgan said that gold is money, and he was right, but only because at the time he said it everyone believed that everyone believed that gold is money. Today that same statement is wrong, but only because no one believes that everyone believes that gold is money.

You may privately believe that J.P. Morgan is still right, that gold has meaning as a store of value. But if you participate in the market on the basis of that belief, then you will buy and sell gold in an incredibly inefficient manner. You would be a smart gold investor in 1895, but a poor gold investor today. Or let’s say that you privately believe gold to be a “barbarous relic” and that it’s ridiculous for gold to have any ascribed value at all other than what jewelry demand would bring. You, too, will buy and sell gold in an incredibly inefficient manner. In fact, you would be a poor gold investor in both 1895 and today.

In some periods of history gold is money. In other periods of history gold is not. But gold is always something, and that something is defined by the Common Knowledge of the day. To be an efficient gold investor in any period, I believe it’s crucial to identify and measure the relevant Narrative that is driving the Common Knowledge regarding gold. Only then can one construct an informational surface that predicts how the equilibrium price of gold will respond to new information (see “Through the Looking Glass” for a description of this game theoretic methodology).

There is no stand-alone Narrative regarding gold today, as there was in 1895. Today gold is understood from a Common Knowledge perspective only as a shadow or reflection of a powerful stand-alone Narrative regarding central banks, particularly the Fed … what I will call the Narrative of Central Banker Omnipotence. Like all effective Narratives it’s simple: central bank policy WILL determine market outcomes. There is no political or fundamental economic issue impacting markets that cannot be addressed by central banks. Not only are central banks the ultimate back-stop for market stability (although that is an entirely separate Narrative), but also they are the immediate arbiters of market outcomes. Whether the market goes up or down depends on whether central bank policy is positive or negative for markets. The Narrative of Central Banker Omnipotence does NOT imply that the market will always go up or that central bank policy will always support the market. It connotes that whatever the central bank policy might be, it will drive a market outcome; whatever the market outcome, it was driven by a central bank policy.

Like all effective Narratives it has a great deal of “truthiness” … it rings true to our intellect even as it appeals to our emotions. How comforting to believe that there is a reason why markets go up and go down, and that this reason is clearly identifiable and attributable to the decisions of a few Wise Men and Women, as opposed to the much scarier notion that the world (and markets) are adrift on a sea of chaotic events and hidden currents. And like all effective Narratives it serves the interests of the world’s most powerful political and economic entities … not that there’s anything wrong with that.

The strength of the Narrative of Central Banker Omnipotence has nothing to do with whether people believe that central bank policy is wise or foolish, good or bad. To predict market behavior it really doesn’t matter if QE is the balm of Gilead or the work of the Devil, any more than it mattered in 1895 whether the gold standard saved the Republic or crucified mankind. The only thing that matters from an Epsilon Theory perspective is whether everyone believes that everyone believes that central bank policy determines market outcomes.

The stronger the Narrative of Central Banker Omnipotence, the more likely it is that the price of gold goes down. The weaker the Narrative – the less established the Common Knowledge that central bank policy determines market outcomes – the more likely it is that the price of gold will go up.  In other words, it’s not central bank policy per se that makes the price of gold go up or down, it’s Common Knowledge regarding the ability of central banks to control economic outcomes that makes the price of gold go up or down.

Look below at the price chart for gold over the past year. Gold peaked in late September and early October 2012, immediately after the Fed announced its open-ended QE program (red line). From the perspective of traditional macroeconomics, this makes no sense at all. The Fed had just announced its most aggressive monetary easing policy in history. Not only were they announcing yet another balance sheet expansion, but this time they were telling you that they weren’t going to stop at any pre-determined level, but were going to keep going for as long as it took to satisfy their full employment mandate. This is an inflation engine, pure and simple, and gold should go up, up, and away in price if the standard macroeconomic correlation between the price of gold and monetary easing held true.

epsilon-theory-how-gold-lost-its-luster-june-30-2013-bloomberg

London Gold Market Fixing Price, June 30, 2012 – June 30, 2013 (Source: Bloomberg L.P.)

But what was more relevant for the price of gold was the strengthening of the Narrative of Central Banker Omnipotence after the open-ended QE announcement. When you examine the public statements in major media outlets in the weeks following this announcement through a Common Knowledge methodology – both direct statements from Fed governors and “analysis” statements from prominent journalists, investors, and politicians – there was remarkably little opinion-leading or Narrative effort devoted to the direct economic or market implications of the new QE program. There was a one-day spike in inflation expectations and a few public comments to quell the “Oh my God, this means rampant inflation” crowd in the first day or so, but very little else. Instead, the focus of the mainstream Narrative effort moved almost entirely towards what open-ended QE signaled for the Fed’s ability and resolve to create a self-sustaining economic recovery in the US. And it won’t surprise you to learn that this Narrative effort was overwhelmingly supportive of the notion that the Fed could and would succeed in this effort, that the Fed’s policies had proven their effectiveness at lifting the stock market and would now prove their effectiveness at repairing the labor market. Huzzah for the Fed!

Within a week or so, however, opinion-leading voices from other prominent journalists, investors, and politicians joined the fray to say that this congratulatory viewpoint of the Fed’s new policy was entirely misplaced. There was absolutely no evidence showing that further expansion of the Fed’s balance sheet would have any impact whatsoever on US labor conditions, and that to claim otherwise was simply magical thinking. Moreover, according to this counter-argument, there was clearly a declining economic utility to more and more QE, so this latest program was a bridge too far.

But here’s the crucial point … whether these opinion-leaders and Narrative creators thought open-ended QE was a wonderful thing or a terrible thing, they ALL agreed that Fed policy had been responsible for the current stock market level. It was J.P. Morgan and William Jennings Bryan all over again, just arguing the merits of more QE versus less QE instead of the merits of the gold standard versus the gold + silver standard. But just as the debate over Free Silver only intensified the Common Knowledge that gold was money in 1896, so did this debate over the merits of open-ended QE only intensify the Common Knowledge that Fed policy was responsible for market outcomes in 2012. This was a positive informational inflection point in the Narrative of Central Banker Omnipotence, and as a result the price of gold has not had a good day since.

Gold is the most pronounced example of an asset with a mutable behavioral foundation because, for all intents and purposes, there is no practical use for gold. It’s pretty and shiny and relatively rare, but so are a lot of things. For gold, at least, there is no “timeless and universal” relationship between it and economic constructs like inflation and growth, or monetary policy constructs like easing and tightening. There is a relationship, to be sure, but the nature of that relationship changes over time as the Common Knowledge regarding the meaning of gold changes.

The same is true of every other symbolized asset, which is to say every cash flow or fractional ownership interest or thing that is securitized and traded. 

Not to the same extent as gold … there’s a continuum to this perspective, with securities representing gold and other precious metals at one end, then securities representing foreign exchange, then securities representing industrial metals and other commodities, then securities representing publicly traded stocks and bonds, and finally securities representing privately traded equity and debt at the other end of the spectrum. Within each of these categories, the more symbolic the security, the more fragile the correlation between it and real-world economic factors (so, for example, an aggregation of stock symbols via an ETF is more prone to game-playing than an individual stock). Put slightly differently, the more clearly identifiable and directly attributable the cash flow foundation of an asset, the less the impact of the Common Knowledge game. Still, the assignment of value to any symbolized asset is inherently a social construction and will inevitably change over time, occasionally in sharp and traumatic fashion.

The notion that the preference function of market participants may change over time has been around for a long time, particularly in the study of commodity markets. Ben Inker at GMO recently wrote an excellent paper on this topic (“We Have Met the Enemy, and He Is Us”) that I highly recommend if you want to dig into the gory details, but here’s the basic idea:

Back in the 1930’s Keynes proposed an idea called “normal backwardation” to explain how commodity futures markets could support a profit for traders who specialized in those markets. In this theory, commodity producers like farmers were typically risk averse when it came to market risk, and so would be willing to accept a forward contract guaranteeing a lower future price for their crop than a straightforward projection of the current spot price would suggest. The difference between this agreed upon futures price and the projected futures price was the risk premium required by the specialized commodity trader to take the other side of the trade. As Keynes pointed out, the risk premium would have to be pretty high for the commodity trader to engage in this trade (and thus push the futures price down) because, obviously, the commodity trader’s entire livelihood was based on making a profit on these trades.
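In modern textbook notation (my gloss, not Keynes’s own formulation), normal backwardation is simply the claim that the agreed futures price sits below the expected future spot price by the specialist’s risk premium:

```latex
% Normal backwardation, in modern textbook form: the risk-averse hedger
% concedes a risk premium \pi to the specialized trader, so the agreed
% futures price sits below the expected future spot price.
F_{t,T} = \mathbb{E}_t\!\left[ S_T \right] - \pi, \qquad \pi > 0
% Illustrative numbers: if the projected harvest-time spot is 105 and the
% trader demands \pi = 5, the farmer locks in F_{t,T} = 100 today.
```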

But now let’s fast-forward 80 years to a world where anyone can be a commodity trader, or rather, anyone can make commodity trades without being a specialized commodity trader. In fact, the notion of an entire market being made by specialists seems terribly quaint today. More importantly, the meaning of a commodity futures contract has changed since Keynes proposed his theory, in the same way that the meaning of gold has changed since J.P. Morgan smoked his last cigar. Pretend you’re a giant pension fund with several hundred billion dollars’ worth of current assets and future liabilities. Do you think about owning a commodity futures contract because you’re interested in making a small profit in the difference between a farmer’s hedge and a projected forward spot price? Are you agonizing over a few basis points like a specialized commodity trader? Of course not. The only reason you are interested in owning a commodity futures contract is because you’re worried about inflation within the context of your portfolio of assets and liabilities. It’s your preference function regarding inflation that will drive your behavior, not a preference function regarding the intricacies and competitive risk premium associated with this particular commodity.

Whatever historical correlations and patterns existed in this commodity market when it was limited to specialty traders have to be tossed out the window when the pension funds and other enormous asset managers get involved. It’s like playing poker at a table with five penny-pinching off-duty Vegas dealers, and then moving to a table with five rich doctors in town for the weekend. If you don’t change the way you play your cards, even if you’re dealt exactly the same cards from one table to another, then you’re a fool.

This transformation in the composition and goals of market participants is by no means limited to commodity markets. Over the past decade there has been a sea change in the structure of global debt and equity markets, as well. Multiple papers by Simon Emrich and Charles Crow at Morgan Stanley lay out the structural transformation in equity markets in fantastic detail, most recently “Trading Strategies for 2013 – Optimal Responses to Current Market Structure” (March 18, 2013), but here are the two most striking findings from a game theoretic perspective:

1) Over the past 10 years, institutional management of equity portfolios has increased from 54% to 81%.

2) Over the same period, the share of what Emrich and Crow call “real institutional trading” has declined from 47% of trading volume to 29%.

There are far fewer market participants today than just ten years ago, managing much larger portfolios across more asset classes, and using much less trading. In future letters I’ll lay out in detail how this structural shift has large and specific consequences for the nature of game-playing in markets, but for the balance of this letter I just want to make a simple, and I hope obvious, point: structural change in any social environment wreaks havoc on historically observed correlations and patterns within that environment.

Unfortunately this sort of structural change is effectively invisible to econometric modeling of portfolios, and as a result such modeling understates the risks inherent in portfolios that rely heavily on historical correlation patterns. In a market undergoing structural change, all of the “timeless and universal” relationships that form the backbone of Risk Parity funds like Bridgewater’s All-Weather Fund and similar offerings by Invesco and AQR are much less certain than their econometric justifications would suggest. The underperformance of these strategies in recent weeks and months (“Fashionable ‘Risk Parity’ Funds Hit Hard”, Wall Street Journal, June 27, 2013) takes on new meaning when seen in this light.

There’s nothing wrong with the math of the correlation exercises that underpin Risk Parity funds, any more than there was anything wrong with the math of the correlation exercises that ratings agencies like Moody’s and S&P used to grade Residential Mortgage-Backed Securities (RMBS). But in both cases there is an assumption about market behavior – the relationship of asset performance under varying conditions of growth and inflation for Risk Parity funds; the role of geographical diversity in mitigating the risk profile of mortgage portfolios for RMBS ratings – that is exogenous to the calculation of the projected returns. In both cases, the standard portfolio model of y = α + β + Ɛ, where Epsilon is treated as an error term and the preference functions of market participants are assumed, gives a very compelling result: Risk Parity funds demonstrate an excellent risk-adjusted return profile, and trillions of dollars worth of RMBS deserve a AAA rating. But if you are wrong in your exogenous assumptions – if, for example, there is a nation-wide decline in US home prices for the first time since the 1930’s and geographical diversity provides no protection for a mortgage portfolio – then all the Gaussian copulas and other econometric legerdemain in the world won’t save your AAA-rated security.
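Here’s a minimal synthetic illustration of the exogenous-assumption problem: calibrate a correlation on one regime, then watch the estimate mislead you when the structure shifts. Every number is invented; the fragility is the point.

```python
import numpy as np

rng = np.random.default_rng(42)

def correlated_pair(rho, n):
    """Draw two return series with (population) correlation rho."""
    a = rng.standard_normal(n)
    b = rho * a + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
    return a, b

# Regime 1: the historical sample, where the two assets reliably offset.
hist_a, hist_b = correlated_pair(-0.4, 2500)
# Regime 2: structural change flips the relationship after the model is built.
new_a, new_b = correlated_pair(+0.6, 500)

backtest_rho = np.corrcoef(hist_a, hist_b)[0, 1]  # what the model was calibrated on
realized_rho = np.corrcoef(new_a, new_b)[0, 1]    # what the portfolio actually lives
print(f"calibrated: {backtest_rho:+.2f}   realized after the shift: {realized_rho:+.2f}")
```

The math of the calibration is impeccable in both regimes; it is the assumption that the regime persists that does the damage.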

Risk Parity funds are a more broadly conceived, less levered version of Long-Term Capital Management. I mean that as a compliment, because it was the narrow conception and over-use of leverage at LTCM that ruined a solid investment premise and made it impossible for that firm to survive even a small disruption in patterns of market participant preferences – in LTCM’s case, the strong historical preference of major sovereign nations not to default on their debt obligations and the strong historical preference of major bond investors not to pay non-economic prices for the safety of US sovereign debt. The investment premise of LTCM was to identify small arbitrage opportunities between securities on the basis of historical correlations and to lever up those opportunities to generate nice returns. If you can take that premise and improve it significantly by expanding the scope and depth of the arbitrage opportunities and by shrinking the leverage turns required for acceptable returns … well, that seems like a really great idea to me. And I have zero doubt that investment giants like Ray Dalio, Bob Prince, and Cliff Asness can design a complex levered bond portfolio that is both safer and more rewarding than a simple unlevered stock and bond portfolio under most conditions. But I get VERY nervous when I am told that the reason these complex levered bond portfolios work so well is that a socially constructed behavior such as the assignment of value to highly symbolic securities is “timeless and universal”, particularly when the composition and preference functions of major market participants are clearly shifting, particularly when monetary policy is both massively sized and highly experimental, particularly when political fragmentation is rampant within and between every nation on earth.

Timing is everything with levered bond arbitrage, just ask Jon Corzine. Five years ago, this is a guy for whom there was a plausible path to become President of the United States. This is why he left the US Senate to become Governor of New Jersey, so that he could more easily and more effectively run for President. Losing his re-election bid in 2009 to Chris Christie closed that door, but only temporarily, as F. Scott Fitzgerald’s line that “there are no second acts in American lives” was probably wrong when he wrote it and is clearly not applicable to American society today.

So MF Global came along after the gubernatorial defeat and gave him a place to hang his hat for a few years, maybe refill the personal coffers that had been depleted by his phenomenally expensive campaigns. Within a year of taking the MF Global reins in March 2010, Corzine transformed the firm from a poorly managed commodities brokerage plagued by rogue traders and seemingly constant regulatory fines into a significant capital markets and prop trading player under his direct control. The culmination of this transformation was MF Global’s approval by the New York Fed as a primary dealer in February 2011, allowing the firm to fund itself as cheaply as any major investment bank. From that moment on Corzine – who started his career at Goldman Sachs as a sovereign bond trader – began to build a levered position in distressed European peripheral sovereign debt. By levering MF Global’s capital to the hilt in order to borrow dollars at historically low rates and buy, say, Portuguese 5-year paper with a 10% current yield in April 2011, Corzine stood to make an absolute killing as soon as the Europeans got their act together. After all, this is the sovereign nation of Portugal we’re talking about here, a full-fledged member of the European Union with a currency backstopped by the ECB and Germany, and it’s trading like a distressed corporate bond? Time to back up the truck. In fact, why don’t we put a little bit of duration risk into the mix to juice the returns even more. What could possibly go wrong?

epsilon-theory-how-gold-lost-its-luster-june-30-2013-corzine

We all know the rest of the story. One day you’re at the pinnacle of business and politics, poised to make a billion or two on a killer trade; the next day you’re testifying before Congress about misuse of client funds and considering taking the Fifth; tomorrow there may be a perp walk. The irony, of course, is that if Corzine had put this trade on 6 to 9 months later, we would today be talking about the brilliance of Jon Corzine, Lion of Wall Street, and how he had created a new Goldman Sachs.

The lesson? Pride goeth before a fall, but so do leverage, bad timing, and poorly examined assumptions. A dislocation in the price of gold may be the least of our worries in a market and world undergoing structural change.

PDF Download (Paid Membership Required):

http://www.epsilontheory.com/download/15736/

2 Fast 2 Furious

“We are all impaled on the crook of conditioning.”
– James Dean (1931 – 1955)

This note is a sequel to my letter from two weeks ago, What We’ve Got Here Is … Failure to Communicate, a sequel made necessary by the market fall-out from the FOMC announcement on Wednesday. The Fed’s communications to the market are clearly not having the effect intended by Bernanke et al., and the problem remains that the Fed is clueless about the game-playing that dominates this market. The car-driving analogy used by Bernanke in Wednesday’s press conference (to paraphrase, “we are not putting our foot on the brake, we are taking our foot off the accelerator”), intended to soothe and placate, is a perfect example of the Fed’s tone-deafness. From a game-playing perspective, taking your foot off the accelerator is more important than putting your foot on the brake. It is an informational inflection point that absolutely changes game-playing behavior in potentially extreme ways.

epsilon-theory-2-fast-2-furious-june-23-2013-james-dean

Rebel Without a Cause (1955)

We’re all familiar with the classic game of Chicken as depicted in popular narrative: two hot-headed teenagers race their cars toward a cliff’s edge; the first to brake or swerve is the Chicken who loses the game and the girl. Cue Natalie Wood to drop the white handkerchief and start the race …

Now put yourself in the shoes of one of the drivers. Let’s assume that you want to win the game but you also don’t want to die. How do you play this game?

There are two well-known strategies to win a game of Chicken. The first is to signal your opponent convincingly that you really don’t care about living past this race, that you prefer to die young and leave a pretty corpse. The second is to signal convincingly that you have no control over your ability to stop the race or swerve out of the way once you begin … rip the steering wheel out of your car or something like that. The problem with these strategies is that they require effective signaling prior to the race’s start. It really doesn’t do you much good if you remove your steering wheel and pre-commit yourself to driving off the cliff if your opponent doesn’t see you do it! Also, if you make these signals and your opponent still goes forward with the race, then you’ve already lost. Why? Well, if you’re signaling strongly that you’d rather die than lose the race, but your opponent decides to race anyway, what does that signal about him? In poker terms, your all-in bluff was just called.

Here’s why Chicken is so hard to play if the race begins and you’re heading towards the cliff … given the extreme consequences of going off the cliff, your rational decision is to stop your car and let the other guy win. But that logic applies to your opponent, too, and you know it. The rational decision for your opponent is to stop his car and let you win. Both of you want to stop your car, but both of you know that both of you want to stop your car. Why shouldn’t he stop his car first instead of you? Of course, he’s thinking the same thing, and the clock is ticking on both of you going off the cliff.

In formal terms, the game of Chicken has two pure strategy equilibria, and that’s what makes for its extreme instability. Below is a classic two-player game of Chicken with cardinal expected utility pay-offs as per a customary 2×2 matrix representation. Both you and James Dean have only two decision choices – Stop and Drive – with the joint pay-off structures shown as (you , James Dean) and the twin equilibrium outcomes (Drive , Stop) and (Stop , Drive) shaded in light blue. With this informational structure, there is absolutely no way to predict which equilibrium will end up occurring, or whether any equilibrium will result.

epsilon-theory-2-fast-2-furious-june-23-2013-chicken
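
If it helps to see the knife’s edge in miniature, here is a minimal sketch that enumerates the pure strategy equilibria of a Chicken pay-off matrix. The pay-off numbers are illustrative stand-ins of my own, not the ones in the matrix above, but the structure – two mirror-image equilibria and no way to choose between them – is the same:

```python
from itertools import product

# A game of Chicken with illustrative pay-offs as (you, James Dean).
payoff = {
    ("Drive", "Drive"): (-10, -10),  # both cars off the cliff
    ("Drive", "Stop"):  (  5,  -5),  # you win, he's the Chicken
    ("Stop",  "Drive"): ( -5,   5),  # he wins, you're the Chicken
    ("Stop",  "Stop"):  (  0,   0),  # mutual loss of face, nobody dies
}
moves = ("Drive", "Stop")

def is_equilibrium(mine, his):
    """Pure strategy Nash equilibrium: neither player gains by deviating alone."""
    i_stay = all(payoff[(mine, his)][0] >= payoff[(m, his)][0] for m in moves)
    he_stays = all(payoff[(mine, his)][1] >= payoff[(mine, h)][1] for h in moves)
    return i_stay and he_stays

print([cell for cell in product(moves, moves) if is_equilibrium(*cell)])
# [('Drive', 'Stop'), ('Stop', 'Drive')] -- two equilibria, and nothing in
# the structure of the game tells you which one you'll end up in
```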

So you’re still in the race, you’ve still got the pedal to the metal, and you’re starting to freak out. Should you stop the race and let James Dean win? That cliff edge is looming closer and closer, and you are suddenly struck by the realization that your corpse will not be so pretty when pulled out of the wreckage. But then you notice something … your car is starting to pull ahead of James Dean’s car. You know that your car isn’t faster than his, so the only explanation is that James Dean has let up on the accelerator. He’s not putting on the brake, but the informational value of this reduction in acceleration is HUGE. Now you know that James Dean is wavering more than you are. And you know that James Dean knows he is wavering more than you are. Once an unstable game like Chicken tips towards one equilibrium or the other, it moves inexorably towards that equilibrium, faster and faster. Once James Dean starts to waver, both of you know that the next move is for him to waver more and you to waver less. You have won this game, and both of you know it, well before James Dean actually puts his foot on the brake.

Now to be clear, I’m not saying that the game-playing that occurs in markets is a straightforward corollary of Chicken. It’s much more aptly described as a Common Knowledge game. But what the current Common Knowledge market game shares with Chicken is that it is extremely unstable (see Through the Looking Glass, or … This is the Red Pill). And in unstable games, a change in the change of a critical data function – what’s called the second derivative – is incredibly influential on game-playing behavior.

The critical data function in your Chicken Run with James Dean is the position of the two cars. The speed of the cars (change in position over time) is the first derivative of this function, and the acceleration of the cars (change over time in the change in position over time) is the second derivative. If you were to draw a line on a graph to mark the position of the cars over time, the speed of the cars is the slope of that line at any given point in time (more speed = steeper slope = more distance per unit of time) and the acceleration of the cars is the curvature of that line (the slope of the slope). When acceleration stops, the slope of the line is still steep (the cars are still going really fast) but it’s no longer curving upwards. This is what’s called a negative inflection point. It’s the point where the marginal change in your position over time stops getting better. And that’s a really big deal for any rational decision-maker in any strategic environment, whether it’s high school in the 1950’s or the market in 2013.
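
If you prefer code to graphs, here is a minimal sketch of the derivative chain with a made-up position series; the only point is that the slope stays steep while the curvature drops to zero:

```python
# Position of a car sampled once per second (made-up numbers in which
# acceleration stops at t = 4 while speed stays high).
position = [0, 10, 24, 42, 64, 86, 108, 130]

speed = [b - a for a, b in zip(position, position[1:])]  # first derivative
accel = [b - a for a, b in zip(speed, speed[1:])]        # second derivative

print(speed)  # [10, 14, 18, 22, 22, 22, 22] -- steep slope throughout
print(accel)  # [4, 4, 4, 0, 0, 0]           -- curvature goes flat at t = 4

# The game-relevant signal is not the speed but the moment acceleration stops:
inflection = next(t for t, a in enumerate(accel, start=1) if a == 0)
print(f"negative inflection point at t = {inflection}")
```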

For all the reasons I’ve laid out in prior work (What We’ve Got Here Is … Failure to Communicate), the critical data function in the market’s expectations of future Fed policy is the unemployment rate, and what the Fed said on Wednesday is that the unemployment rate is improving faster than they previously thought it would. Everyone knows that everyone knows that the unemployment rate is going down (the equivalent of our cars moving forward at a nice speed). The new information from the Fed is that this improvement is accelerating. The second derivative of the unemployment rate – the curvature of the unemployment rate over time – is changing in a positive direction for the economy.

So why isn’t that a good thing for the market? Isn’t this good news for the fundamental health of the US economy, and thus good news for corporate earnings and revenue growth? Hasn’t the Fed been crystal-clear that it has no intention of actually tightening (putting its foot on the brake), but is going to remain historically accommodative (the car will continue to go fast) even as conditions improve?

Unfortunately, the Fed has also been crystal-clear that it is taking its foot off the accelerator. They announced an inflection point, which has enormous repercussions for game-playing behavior in an unstable game. This is the point where the marginal improvement in the Fed’s support for the market stops getting better and starts getting worse. And that shift in Fed policy is more than enough to trump whatever organic improvement we are seeing in the US economy.

What we are witnessing today is the opposite of the “green shoots” Narrative of 2009. On Sunday March 15, 2009 Bernanke created the “green shoots” Narrative with a “60 Minutes” television interview (his first) and announced a positive inflection point: from this moment onwards, the marginal improvement in the Fed’s support for the market would increase. This was, of course, accompanied by the Fed’s first Large Scale Asset Purchase (LSAP) program, also known as QE1, and the rest is history. Here’s a chart of the S&P 500 from March 16, 2009 through the rest of the year – an inexorable march upwards for a 48% increase in this broad market index, one of the most ferocious rallies in history – even as the fundamental health of the US economy remained (to be charitable) challenged.

epsilon-theory-2-fast-2-furious-june-23-2013-bloomberg

S&P 500, March 16, 2009 – December 31, 2009 (source: Bloomberg)

In exactly the same way that the market went up sharply in 2009 even as the real economy got worse, so today can the market go down sharply even as the real economy gets better.

How far down? I have no idea. It all depends on how the Narrative is shaped from here. Narrative formation can be a tricky thing, and we will see over the coming days and weeks how the Powers That Be and talking heads respond with their public statements to support the market.

This past Friday, for example, at 6:08 AM St. Louis Fed President James Bullard released a statement detailing why he dissented from Wednesday’s FOMC decision on dovish grounds. You can find the full text on the St. Louis Fed’s website (http://www.stlouisfed.org/newsroom/displayNews.cfm?article=1829) and judge for yourself, but what’s notable to me is the act of publishing a formal dissent as well as the stridency of Bullard’s language within the usually staid context of Fed-speak. He “found much to disagree with in this decision” and even invoked the C-word – credibility – in his criticism. To suggest that the FOMC might have a problem in its efforts “to maintain credibility” with Wednesday’s announcement is the Fed-speak equivalent of going nuclear.

Bloomberg picked up the release of Bullard’s statement (Bullard being the only FOMC member to make a statement since Wednesday) but didn’t give it much attention. The Wall Street Journal gave it even less. But then the market went down from the opening bell, continuing the losses from the prior two days. So much for the “Rebound in Stocks” promised by the Wall Street Journal before the open, a story which became “Stocks Give Back Gains” soon afterwards.

epsilon-theory-2-fast-2-furious-june-23-2013-bloomberg-2

S&P 500 intraday chart, June 21, 2013 (source: Bloomberg)

The response from major financial print media:

  • At 10:15 AM the Financial Times published an article about Bullard’s statement titled: “Bernanke decision ‘inappropriately timed’, says St. Louis Fed”.
  • At 11:20 AM Bullard gave a telephone interview to Bloomberg. Naturally, Bloomberg gave this interview a lot more space than the earlier statement, and kept a story titled “Bullard Says Fed May Need to Boost Asset Buying If Inflation Slows Further” on its Top Stories list throughout the rest of the day.
  • At 11:57 AM the Wall Street Journal published an article titled “Bullard’s Unusual Dissent” in its MoneyBeat section.

The market at least stopped going down after Bloomberg and the Wall Street Journal trumpeted the Bullard interview, hitting its lows for the day at 11:31 AM, but the march up didn’t begin until about 12:45 PM in anticipation of an influential Wall Street Journal article (the Wall Street Journal typically publishes its major Opinion-Leading-Masquerading-As-Analysis pieces at 1 PM).

Sure enough, at 1:01 PM the Wall Street Journal published an article by Jon Hilsenrath titled “Analysis: Markets Might Be Misreading Fed’s Messages” and the market completed its resuscitation immediately after this article came out. “Stocks Give Back Gains” becomes “Stocks Try to Regain Footing.”

How can a Wall Street Journal writer move the market so much more than the St. Louis Fed President? Because everyone knows that everyone knows that Hilsenrath is the Fed’s favorite print media mouthpiece. This is the market’s Common Knowledge about how Fed intentions are revealed. In the Bizarro-market that we must all endure, divining Fed intentions third-hand through Hilsenrath’s “analysis” is more informationally influential than hearing the St. Louis Fed President’s beliefs directly!

But it’s not easy to reshape a Narrative as firmly entrenched as “the Fed will reduce monetary accommodation proportionally to the decline in the unemployment rate.” For more than four years now, the market has been trained (and by “the market” I mean both human investors and trading algorithms) to take Bernanke communications as the single most influential signal in determining investment decisions. As James Dean said, “We are all impaled on the crook of conditioning,” and no group of individuals or computer programs is more intensively conditioned than market participants.

The only person with enough informational “juice” to undo the inflection point that occurred on Wednesday is Bernanke himself. The only other signal emitters that even come close in their informational influence are Draghi and Merkel. Everyone else – and that includes Obama, much less other FOMC members or any journalist – is an order of magnitude less important from an Information Theory perspective. The Hilsenraths and the Bullards of the world can stop the bleeding for a day or two, but they can’t change the underlying Common Knowledge structure.

Even someone as informationally powerful as Bernanke is not omnipotent. He must operate under both the institutional constraints of being a lame-duck Fed Chairman and the personal constraints of wanting to cement a legacy … both of which are very powerful and inherently risk-averse forces. To convince the market that the Wednesday announcement meant something different from its plain-faced interpretation would require a wholesale dismantling of prior communications linking unemployment thresholds to QE tapering, and that’s something Bernanke will be extremely loath to do unless he has the formal backing of the FOMC and/or things get a whole lot worse.

Keep in mind, too, that Bernanke’s signals are not communicated to us directly, but are mediated by a host of self-serving entities, from political institutions to individuals (including Bernanke himself) to corporations large and small. In the absence of Bernanke making a public mea culpa on tying Fed monetary accommodation to the unemployment rate, the best thing for diminishing this market-negative Common Knowledge informational structure would be for these signal mediators to reduce the attention and meaning attached to the unemployment rate. But that ain’t happening.

The “good news” of a declining unemployment rate serves too many institutional and personal self-interests for this Narrative to weaken, no matter how weak the broader measures of US labor conditions might be.

For example, listen to what David Axelrod says about the unemployment rate in a panel discussion organized by Axelrod’s Institute of Politics, “Campaign Strategists: 2012 Explained”. It’s a long video, but for anyone interested in US politics it’s a must-see. Why did Obama win in November? Because the unemployment rate went down in the months leading up to the election. It wasn’t the Obama campaign’s use of Big Data. It wasn’t any failing in the Romney message or strategy. The economy got better, as evidenced and interpreted by the unemployment rate, and that swung a lot of undecided voters. That’s what won the election.

Or look at the ratings for CNBC on Jobs Friday versus any other day of the month … it’s not even close. Nuanced discussions of US labor conditions are for Charlie Rose, not Jim Cramer, which is why the former is seen by a handful of people on PBS and the latter is laughing all the way to the bank.

Everyone knows that viewing US labor conditions solely through this single constructed number is simplistic and kinda stupid. But so what? Everyone also knows that everyone knows that this number moves the market. Unless Bernanke reverses course and tells us otherwise, everyone knows that everyone knows that this is a crucial number for the Fed. And all signal mediators – from the White House to CNBC to everyone in-between – have a vested interest in keeping the Narrative of a “healing US labor market” intact. As a result, from an informational perspective it is now easier for this market to go down than to go up. Be careful out there.

PDF Download (Paid Membership Required):

http://www.epsilontheory.com/download/15725/

Through the Looking Glass, or … This is the Red Pill

There are unwritten rules for almost all social phenomena, from investing to writing epic poetry. Not that these weekly notes aspire to Homeric levels (although they do get pretty lengthy), but Epsilon Theory does follow one unwritten rule taken from the Iliad, the Aeneid, etc. in that I began this story in medias res – in the middle of things. Last week’s note looked at two very current issues – the jobs report on Friday June 7th and the Fed’s market communication policy – through the lens of game theory to develop what I think are some non-intuitive results … namely that the market’s Narrative around US labor conditions is fundamentally at odds with the Fed’s communications, creating a major source of instability for global markets. And if you look at the archived notes I’ve posted, they are almost all focused on some specific event or issue.

But at some point it’s important to step back and show in a more general way how game theory works to shape markets. This may strike some readers as too academic, but c’est la vie. We will return to our regularly scheduled entertainment next week. This week I want to lift the hood a bit on Epsilon Theory and show you how the engine works, or at least that there is an engine, so that you will trust me enough to get in the car and let me drive you to this or that destination. Epsilon Theory is not a collection of musings, and it’s not a blog. There’s significant intellectual property here, and you deserve to be convinced of that before you invest more of your time reading what I’ve got to say.

Along the way, though, there will be payoffs for the patient reader.  I’ll show you the difference between volatility and instability, why it’s the latter we are suffering through today, and why the difference is critically important for your portfolio. Also, I can give you an answer to the lead story in the JP Morgan Market Intelligence note from this Wednesday, June 12th:

Market Update – “why are the futures up?” – as has been the case on most mornings for the last few weeks, the futures are making a large move for no apparent reason. On the whole it was a quiet night as far as incremental news is concerned and as a result people had a lot of time to contemplate some of the big recent themes driving trading (bond weakness, EM rout, Fed tapering, whether stocks are whistling past the graveyard, etc).

A game theoretic perspective reveals the all too real dynamics behind the “no apparent reason” for these market swings, and there’s nothing academic about that.

An unwritten rule is also called a Convention, and both are just alternative names for Common Knowledge. The best example of a Convention is … language. There’s no inherent reason why we should call a rabbit by the name “rabbit” instead of some other word; it’s just a behavioral Convention that English-speaking people have developed over centuries in order to improve their mutual lot in life. There was no Saxon chieftain who commanded people to start calling the long-eared critter that jumps around a lot a “rabbit” instead of a “gavagai”, to use Quine’s famous example. Instead, over time it somehow became clear to this group of people that everyone knows that everyone knows that a long-eared critter that jumps around a lot is called a “rabbit”. A behavioral equilibrium to call a rabbit a “rabbit” developed without coercion, and as a result hunting long-eared critters that jump around a lot got a whole lot easier for the group of people who shared this Convention. If you lived in this group but didn’t share the Convention – if you insisted on calling a rabbit a “gavagai” and had your own words for lots of other things – well, you probably didn’t last very long in this group. Similarly, if you’re an investor and you don’t share the Conventions of the market (“don’t fight the Fed” and its like) – well, you’re probably not going to last very long, either.

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-quine

WVO Quine, photograph by Steve Pyke (1990)

The best game theoretic work on Common Knowledge comes from philosophers of language and signaling like Brian Skyrms and evolutionary biologists like Edward O. Wilson, not economists. There’s an enormous intellectual depth to these fields that I can’t do justice to here, but for our purposes of applying Common Knowledge game theory to markets I want to highlight a few ideas that underpin modern linguistic and biological studies of Convention.

Conventions evolve over time – whether we are talking about Conventions governing language or market behavior or any other social behavior – which is why it’s so important to know something about history in order to understand behavior. There is nothing eternal or written in stone about any behavioral Convention, and even the most socially entrenched Convention can change with amazing speed. For example, dueling and slavery as state-sponsored Conventions were considered part of the “natural order of things” for thousands of years; within a span of about 80 years in the 19th century they were wiped out globally. That said, it’s very hard to see Conventions changing when you’re living in one. John Lennon wrote that “it’s easy if you try” to imagine a future with alternative sets of Conventions, but for most of us it is nearly impossible. Of course, that doesn’t mean that change isn’t coming. It always does.

Conventions are formed by communication. Language is an obvious form of communication, but so is buying 100 shares of Apple. Any behavior, if made publicly, is a communication of sorts, whether communication was intended or not. It is a signal.

As social animals our brains are hard-wired to look constantly for communication signals and respond to them. As social animals we train each other from birth to look constantly for communication signals and respond to them. We can no more ignore a speech delivered by Bernanke than ants can ignore a pheromone emitted by their queen. At first blush this might seem like a weakness, as something to be avoided or at least mitigated. But it is precisely this heightened sensitivity to signals that makes us, like ants, such a successful species! Human behavior in response to signals – what is more commonly called decision-making – is not chaotic or illogical or counterproductive. On the contrary, it’s the finely honed product of millions of years of biological evolution and hundreds of thousands of years of social reinforcement. It’s why there are 7 billion of us on the planet today.

The insight of evolutionary studies of linguistic Convention is that because we have been socially organized as a certain type of social animal for millennia and because the wiring of our brains for social success hasn’t changed in a lot longer than that, there is an identifiable pattern to our behavior around signals. There is an underlying behavioral logic at work in humans. The parameters of that behavior – the Convention itself – may be socially constructed and constantly changing (i.e., there’s no natural reason to call a rabbit a “rabbit”, or to value gold more highly than peacock feathers), but the logic and pattern of strategic human decision-making are constant over time.

If you can measure the signals that investors are biologically and socially wired to respond to, and if you can map out the likely behavioral pattern of those responses … then you can predict how markets will respond to new signals.

That’s what I’m trying to do with Epsilon Theory.

There is a methodology for measuring and analyzing signals. It’s called Information Theory. To conceptualize how signals and patterns of strategic decision-making work together to create predictable market outcomes, I have developed what I believe is a novel way of depicting the informational structure of markets. This is the intellectual heart of Epsilon Theory. And not to get all Matrix-y, but once you start to see the market in terms of its informational structure – to see that the market IS an informational structure, nothing more and nothing less – you will have a very difficult time going back to seeing it as you once did.

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-matrix

Defining the strength of a signal as the degree to which it changes assessments of future states of the world dates back to Claude Shannon’s seminal work in 1948, and in a fundamental way back to the work of Thomas Bayes in the 1700’s.  Here’s the central insight of this work: information is measured by how much it changes your mind. In fact, if a signal doesn’t make you see the world differently, then it has zero information. As a corollary, the more confident you are in a certain view of the world, the more new information is required to make you have the opposite view of the world and the less information is required to confirm your initial view. There’s no inherent “truth” to any signal, no need to make a distinction between (or even think of) this signal as having true information and that signal as having false information. Information is neither true nor false. It is only more or less useful in our decision-making, and that’s a function of how much it makes us see the world differently. As a result, the informational strength of any signal is relative. The same signal may make a big difference in my assessment of the future but a tiny difference in yours. In that case, we are hearing the same message, but it has a lot of information to me and very little to you.

Let’s say that you are thinking about Apple stock but you are totally up in the air about whether the stock is going up or down over whatever your investment horizon might be, say 1 year. Your initial estimation of the future price of Apple stock is a coin toss … 50% likelihood to be higher a year from now, 50% likelihood to be lower a year from now. So you do nothing. But you start reading analyst reports about Apple or you build a cash-flow model … whatever it is that you typically do to gather information about a potential investment decision.

The graph below shows how Information Theory would represent the amount of signal information (generically represented as bits) required to change your initial assessment of a 50% likelihood of Apple stock going up over the next year to a post-signaling assessment of some new percentage likelihood. These are logarithmic curves, so even relatively small amounts of information (a small fraction of a generic bit) will change your mind about Apple pretty significantly, but more and more information is required to move your assessment closer and closer to certainty (either a 0% or a 100% perceived likelihood of the stock going up).

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-spx

Of course, your assessment of Apple is not a single event and does not take place at a single point in time. As an investor you are constantly updating your opinion about every potential investment decision, and you are constantly taking in new signals. Each new update becomes the starting point for the next, ad infinitum, and as a result all of your prior assessments become part of the current assessment and influence the informational impact of any new signal.

Let’s say that your initial signals regarding Apple were mildly positive, enough to give you a new view that the likelihood of Apple stock going up in the next year is 60%. The graph below shows how Information Theory represents the amount of information required to change your mind from here. The curves are still logarithmic, but because your starting point is different it now requires only about 80% as much information as before to get you to 100% certainty that Apple stock will go up in the next year (0.8 generic bits versus 1.0 generic bits with a 50% starting estimation). Conversely, it requires almost 140% as much negative information as before to move you to certainty that Apple stock is going down.

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-aapl

What these graphs are showing is the information surface of your non-strategic (i.e., without consideration of others) decision-making regarding Apple stock at any given point in time.  Your current assessment is the lowest point on the curve, the bottom of the informational “trough”, and the height of each trough “wall” is proportional to the information required to move you to a new assessment of the future probabilities. The higher the wall, the more information required in any given signal to get you to change your mind in a big way about Apple.
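
For readers who want a concrete handle on these generic bits, here is a minimal sketch that quantifies a change of mind as the Kullback-Leibler divergence between the old and new assessments. That is one standard Information Theory formalization, not a reconstruction of the exact curves in the graphs above, so the numbers below are close to but not identical to the ones just quoted:

```python
import math

def bits_to_move(prior: float, posterior: float) -> float:
    """KL divergence D(posterior || prior) in bits: how much information a
    signal must carry to move your assessment from prior to posterior."""
    def term(q, p):
        return 0.0 if q == 0.0 else q * math.log2(q / p)
    return term(posterior, prior) + term(1.0 - posterior, 1.0 - prior)

print(bits_to_move(0.50, 1.00))  # 1.00 bit: coin-toss view to upside certainty
print(bits_to_move(0.60, 1.00))  # ~0.74 bits: confirming a mildly bullish view is cheaper
print(bits_to_move(0.60, 0.00))  # ~1.32 bits: reversing it all the way down costs more
```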

Now let’s marry Information Theory with Game Theory. What does an information surface look like for strategic decision-making, where your estimations of the future state of the world are contingent on the decisions you think others will make, and where everyone knows that everyone is being strategic?

I’m assuming we’re all familiar with the basic play of the Prisoner’s Dilemma, and if you’re not just watch any episode of Law and Order. Two criminals are placed in separate rooms for questioning by the police, and while they are both better off if they both keep silent, each is individually much better off if he rats his partner out while the partner remains silent. Unfortunately, in this scenario the silent partner takes the fall all by himself, resulting in what is called the “sucker pay-off”. Because both players know that this pay-off structure exists (and are always told that it exists by the police), the logical behavior for each player is to rat out his buddy for fear of being the sucker.

Below on the left is a classic two-player Prisoner’s Dilemma game with cardinal expected utility pay-offs as per a customary 2×2 matrix representation. Both the Row player and the Column player have only two decision choices – Rat and Silence – with the joint pay-off structures shown as (Row , Column) and the equilibrium outcome (Rat , Rat) shaded in light blue.

The same equilibrium outcome is shown below on the right as an informational surface, where both the Row and the Column player face an expected utility hurdle of 5 units to move from a decision of Rat to a decision of Silence. For a move to occur, new information must change the current Rat pay-off and/or the potential Silence pay-off for either the Row or the Column player in order to eliminate or overcome the hurdle. The shape of the informational surface indicates the relative stability of the equilibrium as the depth of the equilibrium trough, or conversely the height of the informational walls that comprise the trough, is a direct representation of the informational content required to change the conditional pay-offs of the game and allow the ball (the initial decision point) to “roll” to a new equilibrium position. In this case we have a deep informational trough, reflecting the stability of the (Rat , Rat) equilibrium in a Prisoner’s Dilemma game.

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-silence-rat

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-row-column

Now let’s imagine that new information is presented to the Row player such that it improves the expected utility pay-off of a future (Silence, Rat) position from -10 to -6. Maybe he hears that prison isn’t all that bad so long as he’s not a Rat. As a result the informational hurdle required by the Row player to change decisions from Rat to Silence is reduced from +5 to +1.

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-silence-rat-2

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-rat-rat

The (Rat , Rat) outcome is still an equilibrium outcome because neither player believes that there is a higher pay-off associated with changing his mind, but this is a much less stable equilibrium from the Row player’s perspective (and thus for the overall game) than the original equilibrium.

With this less stable equilibrium framework, even relatively weak new information that changes the Row player’s assessment of the current position utility may be enough to move the decision outcome to a new equilibrium. Below, new information of 2 units changes the perceived utility of the current Rat decision for the Row player from -5 to -7. Maybe he hears from his lawyer that the Mob intends to break his legs if he stays a Rat. This is the equivalent of “pushing” the decision outcome over the +1 informational hurdle on the Row player’s side of the (Rat , Rat) trough, and it is reflected in both representations as a new equilibrium outcome of (Silence , Rat).

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-pushing

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-trend

This new (Silence , Rat) outcome is an equilibrium because neither the Row player nor the Column player perceives a higher expected utility outcome by changing decisions. It is still a weak equilibrium because the informational hurdle to return to (Rat , Rat) is only 1 informational unit, but all the same it generates a new behavior by the Row player: instead of ratting out his partner, he now keeps his mouth shut.

The Column player never changed decisions, but moving from a (Rat , Rat) equilibrium to a (Silence , Rat) equilibrium in this two time-period example resulted in an increase of utility from -5 to +10 (and for the Row Player a decrease from -5 to -6). This change in utility pay-offs over time can be mapped as:

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-column-utility

Replace the words “Column Utility” with “AAPL stock price” and you’ll see what I’m going for. The Column player bought the police interrogation at -5 and sold it at +10. By mapping horizontal movement on a game’s informational surface to utility outcomes over time we can link game theoretic market behavior to market price level changes.
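
Here is the hurdle arithmetic of this example in code. The pay-off numbers come straight from the matrices above; the hurdle helper is my own illustrative framing of the informational wall:

```python
# Row player's expected utilities, conditional on Column playing Rat:
# (Rat, Rat) = -5 and (Silence, Rat) = -10, as in the first matrix above.
utility = {"Rat": -5.0, "Silence": -10.0}
current = "Rat"

def hurdle(utility, current):
    """Height of the informational wall: the utility a new signal must add
    before the alternative decision beats the current one."""
    alternative = "Silence" if current == "Rat" else "Rat"
    return utility[current] - utility[alternative]

print(hurdle(utility, current))  # 5.0 -- a deep, stable trough

# Signal 1: prison isn't so bad if you're not a Rat.
utility["Silence"] = -6.0        # (Silence, Rat) improves from -10 to -6
print(hurdle(utility, current))  # 1.0 -- still an equilibrium, but fragile

# Signal 2: the Mob will break your legs if you stay a Rat.
utility["Rat"] = -7.0            # current pay-off falls from -5 to -7
if hurdle(utility, current) < 0: # the ball rolls over the wall
    current = "Silence"
print(current)                   # new equilibrium: (Silence, Rat)
```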

Below are two generic examples of a symmetric informational structure for the S&P 500 and a new positive signal hitting the market. New signals will “push” any decision outcome in the direction of the new information. But only if the new signal is sufficiently large (whatever that means in the context of a specific game) will the decision outcome move to a new equilibrium and result in stable behavioral change.

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-equilibrium

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-spx

In the first structure, there is enough informational strength to the signal to overcome the upside informational wall and push the market to a higher and stable price equilibrium.

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-signal

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-signal-2

In the second structure, while the signal moves the market price higher briefly, there is not enough strength to the signal to change the minds of market participants to a degree that a new stable equilibrium behavior emerges.

All market behaviors – from “Risk-On/Risk-Off” to “climbing a wall of worry” to “buying the effin’ dip” to “going up on bad news” – can be described with this informational structure methodology. 

For example, here’s how “going up on bad news” works. First, the market receives a negative Event signal – a poor Manufacturing ISM report, for example – that is bad enough to move the market down but not so terrible as to change everyone’s mind about what everyone knows that everyone knows about the health of the US economy and thus move the market index to a new, lower equilibrium level.

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-equilibrium-level

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-negative-event

Following this negative event, however, the market then receives a set of public media signals – a Narrative – asserting that in response to this bad ISM number the Fed is more likely to launch additional easing measures. This Narrative signal is repeated widely enough and credibly enough that it changes Common Knowledge about future Fed policy and moves the market to a new, higher, and stable level.

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-positive

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-stable

So what is the current informational structure for the S&P 500? Well, it looks something like this:

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-s-and-p

The market equilibrium today is like a marble sitting on a glass table. It is an extremely unstable equilibrium because the informational barriers that keep the marble from rolling a long way in either direction are as low as they have been in the past five years. Even a very weak signal is enough to push the marble a long way in one direction, only to have another weak signal push it right back. This is how you get big price movements “for no apparent reason”.

Why are the informational barriers to equilibrium shifts so low today? Because levels of Common Knowledge regarding future central bank policy decisions are so low today. The Narratives on both sides of the collective decision to buy or sell this market are extremely weak. What does everyone know that everyone knows about Abenomics? Very little. What does everyone know that everyone knows about Fed tapering? Very little. What does everyone know that everyone knows about the current state of global growth? Very little. I’m not saying that there’s a lack of communication on these subjects or that there’s a lack of opinion about these subjects or that there’s a lack of knowledge about these subjects. I’m saying that there’s a lack of Common Knowledge on these subjects, and that’s what determines the informational structure of a market.

The unstable market informational structure today is NOT a volatile informational structure, at least not as “volatility” is defined and measured by today’s market Conventions. Here’s what a volatile market structure looks like:

epsilon-theory-through-the-looking-glass-or-this-is-the-red-pill-june-16-2013-volatility

This is an asymmetric informational structure where the signal barriers for the market to go down are much lower than the signal barriers for the market to go up. This structure does not mean that the market will definitely go down; it simply means that the market can go down, and will go down, with “ordinary” bad news on either increased macroeconomic stress or reduced policy support. The market could still go up, but it would take extremely positive signals on either the macro or policy front to overcome the high informational barrier. Given anything close to a normally distributed set of new market signals, a market with this informational structure is much more likely to go down than to go up, which will be reflected by higher market volatility measures.

I’m a big believer in calling things by their proper names. Why? Because if you make the mistake of conflating instability with volatility, and then you try to hedge your portfolio today with volatility “protection” – VIX futures, one of the VIX ETF’s, S&P 500 puts, etc. – you are throwing your money away. You are buying insurance for a danger that doesn’t exist right now, and you are leaving yourself unprotected against the danger that is staring you in the face. 

So if you’re reading Epsilon Theory for specific trade ideas, here’s one: short VXX every time it pops its head up on a big down day as investors rush to buy “volatility protection”. Hedge it out with some sort of long straddle on the S&P 500 or short positions on underlying stocks if you want to be cautious, but you really don’t need to. So long as most investors mistake instability for volatility (and unless Epsilon Theory gets a LOT more distribution I think it’s safe to say that will persist for a looooong time) this is an archetypal behavioral trade. I’ll let you know if the informational structure shifts so that volatility really does raise its ugly head.

On that note … I need to ‘fess up about something. The informational structures I’m showing in this note are a rudimentary version of what I’m using in my current research, in at least four ways.

  • There’s a fractal structure to game-playing, where the same patterns occur on different time frames and different market aggregation levels. Read anything by Benoit Mandelbrot if you don’t know what I’m talking about.
  • There’s a meaningful distinction between backward-looking Event signals, like the release of macroeconomic data, and forward-looking Narrative signals, like the communications of central bankers and politicians.
  • There’s also a Market signal – what George Soros calls “reflexivity” – that plays an important role in market game-playing. If you’ve ever watched a stock drop violently without any news showing up on Bloomberg or your traders hearing anything on chat, and then you’ve sold the stock because “somebody must know something” … that’s reflexivity.
  • There’s a dimension of time to all this, so that an information surface is three-dimensional (much like a volatility surface in options trading), not two-dimensional as shown in this note. Actually, the information surface is four-dimensional when you add uncertainty and what game theorists call “the shadow of the future” into the algorithms.

And you’ll notice I’m not saying anything about the methodology for actually measuring any of this.

I mention all this, not to be coy, but because I want to make clear that there is a depth to Epsilon Theory beyond some interesting but abstract perspective on markets. I mean for Epsilon Theory to have a strong practical application to active investment management, and to that end, I think that I’m pretty far along in developing the necessary tools and instruments.

Of course, I also mean to signal that there’s a lot more to Epsilon Theory than what I am distributing publicly! I don’t want Epsilon Theory to be a black box, not because I have anything against black boxes, but because I think the Convention of trusting a black box had been dying for a long time before Madoff put the final nail into that coffin. I’m not sure what the black box Convention has evolved into, but I’m trying to find out.

PDF Download (Paid Membership Required):

http://www.epsilontheory.com/download/15708/

What We’ve Got Here Is … Failure to Communicate

epsilon-theory-what-weve-got-here-is-a-failure-to-communicate-june-9-2013-cool-hand-luke

From the classic Paul Newman movie, Cool Hand Luke, as the Captain administers Luke’s punishment in the prison yard for yet another escape attempt:

 Captain: You gonna get used to wearing those chains after a while, Luke. Don’t you never stop listening to them clinking, ‘cause they gonna remind you what I been saying for your own good.

Luke: I wish you’d stop being so good to me, Cap’n.

Captain: Don’t you ever talk that way to me. NEVER! NEVER! [Captain hits Luke, who rolls down the hill to the other prisoners] What we’ve got here is … failure to communicate. Some men you just can’t reach. So you get what we had here last week, which is the way he wants it. Well, he gets it. I don’t like it any more than you men.

There are plenty of great cinematic scenes of the Common Knowledge game in action, but this is one of my favorites. The “failed” communication of the Captain to Luke is the basis for the successful communication of the Captain to the prisoners: subvert my rules and you will be crushed. The brutal message is made in public, not so that all the prisoners can see what happens to Luke, but so that all the prisoners can see all the prisoners seeing what happens to Luke.

In environments like prisons (and capital markets!), behavioral decisions based on private information (“I saw Luke beaten down for breaking the rules. If I break the rules I might get beaten, too.”) are almost always weaker than behavioral decisions based on Common Knowledge (“Everyone knows that if you break the rules like Luke you will be beaten down. Why would I even think about breaking the rules?”). The latter is a more stable equilibrium because, in effect, the prisoners themselves end up enforcing the warden’s rules. Even if you privately believe that you and your fellow prisoners could make a break for it, so long as you believe that “everyone knows” that you will be punished for breaking the rules, then you do not believe that you will receive any support from your fellow prisoners. It is irrational to even raise the subject with your fellow prisoners, as you will mark yourself as someone who is either too stupid or too dangerous not to recognize what everyone else knows that everyone else knows. And because everyone is making a similar calculation, no one ever makes an escape attempt and the Common Knowledge grows stronger over time, as does the no-escaping behavioral equilibrium. This is why the Captain goes to such lengths not just to punish Luke for his escape attempts, but to break Luke, and not just to break Luke, but to break Luke as publicly as possible.

Because of the Common Knowledge game, there is enormous power in making a Public Spectacle out of information, which is why coronations and executions alike have traditionally been carried out in front of large crowds. This lesson in behavioral influence – the crowd doesn’t just need to see the event, the crowd needs to see the crowd seeing the event – is why so many of our modern social institutions – from political campaigns to American Idol – are staged in front of live audiences. When you sit in front of your TV set and watch, say, a national political convention, you are infinitely more engaged with the event when you see a crowd than when you don’t. You can’t help yourself. It doesn’t even matter if the live audience is faked and we know that the audience is faked … have you ever listened to a sitcom without a laugh track? It’s just not as funny. The fact is that humans are social animals. We are hard-wired to look for and respond to Common Knowledge, and smart people – from political leaders to religious leaders to business leaders – have taken advantage of this for thousands of years.

So with that as introduction … ladies and gentlemen, I give you Jobs Friday™, brought to you by your friends at CNBC and Bloomberg and CNN and MSNBC and Fox and the WSJ and the FT and the NYT and every other financial media outlet.

There is an enormous difference between an unemployment number released in the context of Jobs Friday™ versus that same number released in the absence of Public Spectacle. The employment data today is imbued with so much more Meaning than it was even a few years ago … far too much Meaning … more than the numbers themselves can bear. And all of us, including the media creators of Jobs Friday™, know this to be true.

We all know that whether the unemployment rate is 7.5% today or 7.4% or 7.7% really makes no difference whatsoever for Fed policy decisions. We all know that whether there were 175,000 jobs added last month or 165,000 jobs added or whether it beat or missed expectations by 3,000 jobs makes even less difference. We all know that whatever the numbers are this month they will be revised next month. We all know that we are collectively paying too much attention to these numbers, but because we believe that everyone else is paying a lot of attention to these numbers, because the Fed has told us how important these numbers are, because we see the crowd of people participating in breathless commentary over the importance of Jobs Friday™, we can’t help ourselves.

We play the Common Knowledge game around the monthly jobs number, even though none of us believe privately that the numbers from month to month are very important. In fact, it would be irrational not to play the Common Knowledge game under the circumstances of Jobs Friday™. The jobs report, as mediated and narrated by talking heads through public statements, moves the market a lot. That’s an incontrovertible fact. So burying your head in the sand and refusing to play the game is not really an option for most active investment managers.

But apart from the frustration of investors ill-prepared to play a game that they are forced to play, here’s the bigger problem. While one monthly jobs number versus another may not make any difference in shifting Fed policy, the game-playing around the jobs numbers certainly does. The Fed’s communication policy is fundamentally flawed because it does not take into account game theory. This is a huge problem for the Fed, and as a result, it’s a huge problem for the stability of global capital markets.

The problem for the Fed is that their interpretation of US labor conditions does not match the market Narrative. The Fed, as you would expect, looks at lots of data in determining the “true” state of US labor conditions. The Narrative does not.

The Narrative around US labor conditions is formed almost exclusively around two numbers: the unemployment rate and the month-over-month change in non-farm payrolls. Both of these numbers are social constructions, which at first blush may sound weird. These numbers are based on actual counting of actual people. There may be mistakes made in the counting, but in what sense are they “constructed”? They are constructions because the government’s choice of what people to count and how to define employment is only one choice among many as to how to represent this social phenomenon. They are constructions because the choice of which systematic mistakes in the collection of labor data are adjusted for and which are not is enough to make a difference in the construction and perception of the US labor Narrative. I’ve written in the past about the ways in which the construction of the weekly unemployment claims number has been molded to support a particular Narrative (“10-29-12 Jack Welch Was Right”), so you can take a look at that piece if you’re interested, but that’s not what I want to explore today.

The point today is that the limitation of the US labor Narrative to the unemployment rate and the non-farm payroll number – a limitation that is encouraged, promulgated, and reinforced by the mediators of Jobs Friday™ and creates the informational structure of market game-playing around labor announcements – undermines the Fed’s market communications in a predictable and significant manner. Here’s the crux of the problem: the numbers that go into the labor Narrative are good, but the broader labor numbers that the Fed looks at aren’t so good. Unfortunately, the Fed has explicitly communicated their QE policy in terms of the Narrative, not the broader conditions, which means that Fed communications are increasingly at odds with Fed intentions.

The reason that the unemployment rate and the non-farm payrolls number don’t mean what they used to mean in terms of a broad view of US labor conditions, even though they are the sole sources of Meaning for the Narrative, is that the US labor market has changed structurally in terms of trend employment growth. Trend employment growth is the number of jobs that need to be created each month to keep the unemployment rate flat. Anything above-trend will push the unemployment rate down over time, and conversely, below-trend job creation will push the unemployment rate up. Back in the 1980’s and 1990’s it took close to 200,000 new jobs every month just to keep the unemployment rate flat. Today trend employment growth is only 80,000 jobs, and that number will continue to go down for the foreseeable future. As a result, even if we only continue to average, say, 160,000 new jobs per month, which historically speaking is no great feat, the unemployment rate will inexorably fall. It will continue to go down even if US growth remains anemic. It will continue to go down well past the 6.5% threshold communicated by the Fed – not because of open-ended QE – but because above-trend employment growth is so easy to achieve today.
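
A back-of-the-envelope sketch of that arithmetic, holding the labor force constant for simplicity (the 155 million labor force figure and the 7.6% starting rate are my illustrative assumptions; the 80,000 trend figure is from the Chicago Fed work cited below):

```python
# Each month of above-trend job creation shaves (jobs_added - trend) off
# the number of unemployed, and so off the unemployment rate.
labor_force = 155_000_000  # assumed, roughly the 2013 US labor force
trend = 80_000             # jobs/month needed to keep the rate flat
jobs_added = 160_000       # an unspectacular month, historically speaking

rate = 7.6                 # assumed starting unemployment rate, in percent
for month in range(12):
    rate -= 100 * (jobs_added - trend) / labor_force

print(round(rate, 2))      # ~6.98 -- down more than half a point in a year
```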

This is not some fringe view; nor is it original to me (which is often, but not always, a fringe view). This is the view of the Chicago Fed. Here’s the link to the paper: (“Chicago Fed Letter, July 2013”). It’s an important piece of work and you should read it. I would only add that I think the authors are too conservative in their views of how quickly the unemployment rate will fall, because in addition to structural change in trend employment growth we are impacted by three policy-driven changes in measurements of unemployment: elimination of unemployment benefits after 99 weeks (moving more people from the unemployed category to the no-longer-looking-for-work category), expansion of student loans (full-time students do not count as unemployed), and hiring at shorter work-weeks to avoid Federal health insurance mandates (so hiring more people to do the same amount of aggregate work).

Even a cursory look at the unemployment rate over the past five years shows an unmistakable trend downwards. I’ve marked the open-ended QE decision with a green line just to show how immaterial this Fed policy is to the unemployment rate trend, even though its scope and its very existence are tied directly to the unemployment rate. In some future note I’ll discuss the concept of linkage from a game theoretic perspective, particularly the strategy of linking things that aren’t really meant to be linked together. For now let me just say that this is a very, very risky thing to do.

Epsilon Theory Manifesto

Our times require an investment and risk management perspective that is fluent in econometrics but is equally grounded in game theory, history, and behavioral analysis. Epsilon Theory is my attempt to lay the foundation for such a perspective.

The name comes from the fundamental regression equation of modern portfolio management: y = α + βx + ε, where the return of a security (y) is equal to its idiosyncratic factors (alpha) plus its co-movement with relevant market indices (beta, applied to the market return x) plus everything else (epsilon).
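
As a minimal sketch of that decomposition (on simulated return series, not real data), ordinary least squares recovers alpha and beta, and epsilon is simply whatever is left over:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0005, 0.01, 250)                   # market index daily returns
y = 0.0002 + 1.3 * x + rng.normal(0.0, 0.005, 250)  # a security's daily returns

beta, alpha = np.polyfit(x, y, 1)  # OLS: slope (beta) and intercept (alpha)
epsilon = y - (alpha + beta * x)   # everything the alpha and beta terms miss

print(f"alpha = {alpha:.5f}, beta = {beta:.2f}")
print(f"epsilon std = {epsilon.std():.4f}")  # the 'error' Epsilon Theory mines
```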

The language of professional investment is dominated by this simple econometric formulation, and the most fundamental questions regarding active portfolio management – does an investment strategy work? how does an investment strategy work? – are now entirely framed in terms of alpha and beta, even if these words are not used explicitly. When investors ask a portfolio manager “what’s your edge?” they are asking about the set of alpha factors that can differentiate the performance of an actively managed portfolio from a passively managed portfolio. Even a response as non-systematic as “I know everything about the semiconductor industry and I have a keen sense of when these stocks are over-valued or under-valued” is really a statement about alpha factors. It is a claim that there is a historical pattern to security price movements in the semiconductor industry, that these movements are linked to certain characteristics of semiconductor companies, and that the manager can predict the future state of security prices in this industry better than by chance alone by recognizing and extrapolating this historical pattern.

This notion – that observed characteristics of securities, companies, or the world determine to a large extent the future prices of securities – is so ingrained in the active investment management consciousness that it is hard to imagine an alternative. If the observed characteristics of a security or its underlying economic entity or its relevant events are not responsible for making the price of the security go up or down, then what is? If market co-movement (beta) is the answer, then the passive investment crowd is right and we should just put our equity allocations into broad market ETF’s and call it a day.

Unfortunately for the active management community, alpha factors have not done terribly well in recent years, regardless of asset class, investment strategy, geography, etc. I’m not saying that active managers as a group have not had positive performance. I’m saying that on both a risk-adjusted and non-risk-adjusted basis the population of active managers today has underperformed prior populations of active managers to a significant degree. Are there individual exceptions to this general observation? Of course. I am making an observation about a population, not any individual member of the population. But I don’t think it’s a particularly contentious statement to say that the enterprise of active investment management has been challenged over the past five years.

One way to improve the efficacy of active management is to do better with alpha identification … to identify new historical patterns (including the pattern of pattern change), to measure the characteristics of securities and companies and events more accurately, etc. It seems to me, though, that this sort of effort, where we seek to add one more term to the list of alpha variables or improve the list we already have, is inevitably an exercise in diminishing returns, and a crowded exercise at that. I think it is ultimately a dead end, particularly in an era of Big Data technology and strong regulatory proscriptions against private information regarding public companies.

Instead, I think we should be looking outside the confines of factor-based investment analysis. We can’t squeeze much more juice out of the alpha fruit, and we know that beta gives no sustenance to the active investor. But what about epsilon? What about ε? We pejoratively call this an “error” term, and the goal in any applied econometric exercise is to make this term as small and inconsequential as possible. That makes perfect sense if we are trying to predict the future states of, say, decaying nuclear particles, where it seems unlikely that there is any agency or causal process outside of the particles themselves (i.e., outside the physical universe). But it makes no sense at all if we are investigating a social phenomenon such as a financial market, where strategic human behavior and decision making play a crucial role, but a role largely exogenous to the observed characteristics of the financial universe.

It’s the epsilon term that I want to explore, because it includes anything that cannot be expressed easily in econometric terms – things like strategic decision-making and shifting behavioral preferences. Modern portfolio theory ignores these dynamic behavioral characteristics by assumption, as the epsilon term is defined as residual and random information from the perspective of the static factors defined within the alpha and beta terms. Because decision-making and behavioral characteristics cannot easily be expressed in the language of factors and regression, they are essentially invisible to the econometric eye.

epsilon-theory-manifesto-june-1-2013-cartoon

Roy Lichtenstein, “I Can See The Whole Room …” (1961)

Fortunately, there is both a language and a lens available to analyze these human behavioral patterns in a rigorous fashion: game theory. Game theory and its close cousin, information theory, allow us to extract non-random, non-residual information from the epsilon term, which in turn allows us to predict or understand the likely return of a security more accurately.

To take a timely example, consider the recent plunge in the price of gold. Here’s how Alen Mattich “explained” the sharp drop in the MoneyBeat column of the Wall Street Journal on May 20th:

But just as there wasn’t any real logic needed to keep prices advancing when everyone was caught up in the euphoria, there’s no need for logic to intrude in the fall either. These things take on a life of their own. Especially when there’s so little rational basis on which to price these assets in the first place.

If you’re looking solely through the lens of factor analysis, Mattich is right: what’s happened recently in gold prices makes no sense. But through the lens of game theory, there is absolutely a logic and a rational basis for this market behavior. Game theory is explicitly designed to help explain events that otherwise appear to have “taken on a life of their own”, and my goal in Epsilon Theory is to elucidate and communicate that explanatory perspective to as broad an audience as possible.

I believe that we are witnessing a structural change in markets, brought on by a witches’ brew of global debt crisis, new technology, and new regulatory regimes. By structural change I mean a fundamental shift in the market’s relationship to society and politics, as well as a sea change in the behavioral preferences of market participants. Modern portfolio theory takes both of these terms – market rules and market participant preferences – as constants, and as a result it is impossible to see the impact of structural change by looking solely through the lens of alpha and beta factor analysis. We need another lens.

Here’s an easy example of what I mean … in modern portfolio theory Risk-On/Risk-Off does not exist. We all know that it’s out there, and we can even see some of its impact on measurable alpha and beta factors, sort of a Risk-On/Risk-Off effect by proxy. But there is nothing in any alpha or beta factor that explains or predicts Risk-On/Risk-Off. It’s like trying to see Dark Matter with a telescope. We know that Dark Matter is out there in the universe, but a telescope detects photons, which is fine for most astronomical tasks but useless for seeing something that doesn’t interact with light at all. Why can’t factor analysis “see” Risk-On/Risk-Off? Because Risk-On/Risk-Off is neither an attribute of a security nor a discrete event; it is a behavior that emerges from a strategic decision-making structure, and factor analysis simply cannot detect behaviors.
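
To make the Dark Matter point concrete, here is a purely illustrative sketch (simulated data, not a claim about any real market): strip each security’s beta exposure out, then check whether the leftover epsilon terms still move together. When they do, something the factor model cannot see – a shared behavior – is coordinating them:

```python
# Illustrative sketch (not a trading model): if a Risk-On/Risk-Off regime
# is real, the residuals left after removing beta still move together.
import numpy as np

rng = np.random.default_rng(1)
n_days, n_assets = 500, 20
market = rng.normal(0, 0.01, n_days)
regime = np.where(np.arange(n_days) % 100 < 50, 1.0, 0.0)  # toy on/off regime
common = regime * rng.normal(0, 0.01, n_days)              # shared non-factor shock

betas = rng.uniform(0.5, 1.5, n_assets)
noise = rng.normal(0, 0.005, (n_days, n_assets))
returns = np.outer(market, betas) + common[:, None] + noise

# Strip out beta exposure asset by asset, then look at what's left over.
resid = np.empty_like(returns)
for j in range(n_assets):
    b, a = np.polyfit(market, returns[:, j], 1)
    resid[:, j] = returns[:, j] - (a + b * market)

corr = np.corrcoef(resid.T)
avg_pairwise = (corr.sum() - n_assets) / (n_assets * (n_assets - 1))
print(f"average pairwise residual correlation: {avg_pairwise:.2f}")
```

No alpha or beta factor produced that residual co-movement, yet there it is, plain as day once you look for it.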

My intent is not to rain on the econometric parade. My intent is to show its limitations and suggest an additional methodology for improving the efficacy of active investment management. From an econometric perspective, strategic human behavior and decision-making may reside in epsilon, the “error term”, where it is, by definition, largely impervious to econometric tools. But that does not mean that these strategic human behaviors are unpredictable or unknowable. It simply means that we need an entirely different tool kit, and that’s what game theory is.

Game theory is only useful for social phenomena. It is a methodology for understanding strategic decision making within informational constraints. I say “strategic” because, like the tango, it takes (at least) two decision makers to play a game, and each player’s decisions are made in the context of expectations regarding the other player’s decision-making process. Game theory does not see the world in terms of factors and historical correlations. It sees the world in terms of equilibria – decision-making balance points where strategically-aware players have no incentive to make alternative decisions. Movement from one equilibrium to another is determined entirely by changes in the perceived pay-offs of the possible decisions, which is another way of saying that behavioral change is determined entirely by a change in the information available to the players regarding the probabilities of future states of the world.
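
A toy example may help. The sketch below – payoff numbers invented purely for illustration – finds the pure-strategy equilibria of a two-player game by checking, cell by cell, that neither player can do better by deviating alone:

```python
# Toy sketch: find pure-strategy Nash equilibria of a 2-player game by
# checking, at each cell, that neither player gains by deviating alone.
import numpy as np

# Invented payoffs: rows = player 1's choices, cols = player 2's choices.
# Think of each player's two moves as "Risk-On" and "Risk-Off".
p1 = np.array([[3, 0],
               [1, 2]])
p2 = np.array([[3, 1],
               [0, 2]])

equilibria = []
for i in range(2):
    for j in range(2):
        best_for_1 = p1[i, j] >= p1[:, j].max()  # player 1 can't improve in column j
        best_for_2 = p2[i, j] >= p2[i, :].max()  # player 2 can't improve in row i
        if best_for_1 and best_for_2:
            equilibria.append((i, j))

print("pure-strategy equilibria:", equilibria)   # [(0, 0), (1, 1)]
```

With these payoffs the game has two equilibria, and which one prevails depends entirely on what each player expects the other to do – that is, on information. That is why a change in public information can move a market from one behavioral balance point to another.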

The game of poker provides an instructive analogy for evaluating the relative strengths of game theoretic and econometric analysis. Econometric or factor analysis is the equivalent of “playing the cards”, where decisions are based on the odds of this card or that card appearing relative to the revealed strengths of other players’ hands and the potential stakes to be won or lost. Game theoretic analysis, on the other hand, is the equivalent of “playing the player”, where decisions are based on a strategic assessment of the likely behavior of other players relative to the informational signals provided by their bets. If you want a tool kit to evaluate the static factors that describe the structure of a poker game or a capital market, then econometrics is the right choice. Game theory, on the other hand, is the right choice if you want to evaluate the dynamic interactions that emerge from the structure of a poker game or a capital market.

epsilon-theory-manifesto-june-1-2013-dog-poker

Cassius Coolidge, “A Friend in Need” (ca. 1908)

The need for this combined perspective has never been greater. As Mohamed El-Erian writes with his customary clarity:

The recent volatility [in gold] speaks to a dynamic that has played out elsewhere and, more importantly, underpins the gradually widening phenomenon of western market-based systems that have been operating with artificial pricing for an unusually prolonged period.

… think of the underlying dynamic as one of a powerful brand where valuation has become completely divorced from the intrinsic attributes of the product – thus rendering it vulnerable to any change in conventional wisdom (or what economists would characterize as a stable disequilibrium).

… Essentially, today’s global economy is in the midst of its own stable disequilibrium; and markets have outpaced fundamentals on the expectation that western central banks, together with a more functional political system, will deliver higher growth.

– Financial Times, “Markets Insight: We should listen to what gold is really telling us,” May 20, 2013

What El-Erian calls “change in conventional wisdom” is exactly the phenomenon that game theory perceives well. Conversely, it is exactly the phenomenon that factor analysis perceives poorly.

My only quibble with El-Erian is that the disjuncture between security prices of all sorts and fundamentals is not only a function of central bank policy designed specifically to create that disjuncture, but is also a function of new technology and regulatory regime shifts. Simply reversing the extraordinary measures taken by the Fed and its acolytes won’t put the Big Data genie back in the bottle, or reverse the impact of Reg FD and Reg NMS. I’ll have a lot more to say about the structural impact of technology and regulatory policy change in future letters, but here’s the bottom line: all of these changes create significant challenges for active investment management. The new policy regimes make it much more difficult for any investment firm to acquire private information about a public company legally, and the new technology ensures that any investment advantage gleaned from public information will be arbitraged away almost immediately. The result is alpha scarcity and beta dominance, a poisonous environment for active investment managers of all stripes, as well as the Wall Street firms with business models designed to support active investment management. Welcome to the New Normal.

But here’s the thing … this has all happened before. The New Normal turns out to be an Old Normal, or at least an Intermittent Normal, and history provides a crucial lesson for active investors seeking to ride out the current storm. Risk-On/Risk-Off behavior is nothing new, you just have to look back before World War II. Risk-On/Risk-Off was an accepted fact of market life in the U.S. for 100 years, from at least the 1850’s all the way through the 1940’s, because the conditions that create a structure where Risk-On/Risk-Off behavior emerges – global financial crisis + new technology + regulatory regime change – were so commonplace. We think that the Internet has changed the way we make investment decisions … imagine what the telegraph and the telephone did. We worry about central bank decisions to expand their balance sheets … imagine the concern over the creation of fiat currency and the outlawing of gold ownership. The New York Stock Exchange survived a Civil War and two World Wars quite nicely, thank you, and there were actual human investors who thrived during these decades, all without the benefit of Modern Portfolio Theory. It might behoove us to learn a thing or two from these men.

Here’s the big lesson I’ve gleaned from reading first-hand accounts of pre-World War II investors – they were all game players. Understanding, evaluating, and anticipating the investment decisions of other investors was at least as important to investment success as understanding, evaluating, and anticipating the future cash flows of corporations. To men like Andrew Carnegie, Jay Gould, and Cornelius Vanderbilt – just to name three of the more famous investors of this time – the notion that they would make any investment without strategically considering the decision-making process of other investors would be laughable. In fact, most of their public investments were driven by the strategic calculus of “corners”, “bulges”, and “points”. These men played the player, not the cards, in almost everything they did.

epsilon-theory-manifesto-june-1-2013-carnegie

Andrew Carnegie

epsilon-theory-manifesto-june-1-2013-gould

Jay Gould

epsilon-theory-manifesto-june-1-2013-vanderbilt

Cornelius Vanderbilt

To take one of literally hundreds of examples, it wasn’t some great secret that Jay Gould and James Fisk were trying to corner the gold market in the summer of 1869. They were buying in the open market and clearly communicating their intentions to every market participant, big and small. What they didn’t communicate was that they had a mole in the Grant Administration, someone who would tip them off to any government gold sale. Some investors figured out Gould’s game and avoided the original Black Friday, September 24, 1869, when the Grant Administration sold $4 million worth of gold in the open market and crushed the corner. Other investors (including some in Gould’s inner circle) were themselves crushed. The point is that everyone involved in the capital markets in 1869 was trying to figure out the behavioral intentions of a few very public figures, and investment success or failure in any security depended mightily on this strategic assessment. There was no hand-wringing and moaning about the “divergence of prices from their fundamentals”. It was just an accepted fact of life that yes, fundamentals mattered, but game-playing mattered a lot, too, and often it was the only thing that really mattered.

Are the subjects of game-playing in markets different in 2013 than they were in 1913? Sure. The days of “corners” are largely over, or at least illegal, just as the days of cozying up to management for non-public fundamental information are now largely over, or at least illegal. But the nature of game-playing hasn’t changed, and the centrality of game-playing to successful investment, particularly during periods of global economic stress, hasn’t changed at all.

The secret of effective market game-playing, whether you were an investor 100 years ago or you are an investor today, is to recognize that the market game hinges on the Narrative, on the strength of the public statements that create Common Knowledge. These are the core concepts of Epsilon Theory.

epsilon-theory-manifesto-june-1-2013-pipe

René Magritte, “The Treachery of Images” (1929)

The concept of Narrative is a thoroughly post-modern idea. What I mean by this is that Narrative is a social construction, a malleable public representation of malleable public statements that lacks any inherent Truth with a capital T. In fact, the public statements that go into the construction of a Narrative are often intentionally untrue.

As Jean-Claude Juncker (far left in photo), Luxembourg PM and former Eurogroup President, famously said in reference to market communications, “When it becomes serious, you have to lie.”

epsilon-theory-manifesto-june-1-2013-juncker

And even if the information behind a Narrative is not intentionally a lie, it may have zero causal or correlative relationship to the Narrative. Randall Munroe captured this idea well in a cartoon on the statistical analysis (or lack thereof) underlying the business of sports commentary, and precisely the same critique can be made of the business of financial market commentary.

The financial news media have to say something, and they have to be saying something all the time. So they will.

epsilon-theory-manifesto-june-1-2013-sports-commentary

There’s nothing evil or immoral about this. It is what it is. But it’s critical to recognize a Narrative for what it is and not imbue it with superfluous attributes, such as Truth. To be effective, a Narrative need only sound truthful (this is what Stephen Colbert calls “truthiness”, which is actually a very interesting concept, not to mention a great word); it need not be truthful. A Narrative may in fact be quite truthful, but this is an accident, neither a necessary nor a sufficient condition of its existence.

My goal with Epsilon Theory is not to somehow expose a Narrative for being demonstrably untrue or disconnected from facts (although sometimes I just can’t help myself). And while it can be personally satisfying to indulge one’s righteous indignation by asking cui bono – who benefits? – from some particularly egregious representation of the world, that, too, is really neither here nor there. Demanding some arbitrary degree of truthfulness from a Narrative is a categorization error, pure and simple, and something of a conceit, to boot. No, I want to use a proper conception of Narrative, one that has no inherent notion of truthfulness and is simply a public representation of a set of public statements made by influential people about the world, because I think that this can help me predict market behaviors that are not easily predictable by factor-based or econometric analysis. To that end, my goal with Epsilon Theory is to identify Narratives, measure their strength, and assess their likely impact on security prices through an application of game theory and information theory.

epsilon-theory-manifesto-june-1-2013-canyon

Mark Tansey, “Constructing the Grand Canyon” (1990)

Human history is littered with the corpses of dead Narratives, from the ancient myths of Greece or Rome to more modern concepts such as Manifest Destiny or Cultural Revolution. By definition, the verdict of history (which itself is a socially constructed representation of actual historical facts) has not been kind to dead Narratives, in that we see them now as myths, which is to say, as Narratives whose constituent public statements have lost whatever power they once had to move us.

epsilon-theory-manifesto-june-1-2013-american-progress

John Gast, “American Progress” (1872)

epsilon-theory-manifesto-june-1-2013-zedong

Mao Zedong Thought poster (ca. 1970)

It’s all too easy either to chuckle or raise a disapproving eyebrow at these more modern myths, wondering how anyone could be swayed or motivated by what seems to be obvious propaganda, which is to say, public media messages that no longer create Common Knowledge. But back in their respective days, these Narratives were powerful, indeed.

Many older Narratives have kept their potency, some for centuries. For example, the Narrative of the American Founding Fathers is just as powerful today as it was 100 years ago, maybe more so. There is no inherent expiration date on a Narrative, and it can survive as a meaningful driver of behavior so long as it regenerates itself by sparking influential public statements that create widespread Common Knowledge. This is certainly the case with the ongoing representations of public statements made by Washington, Madison, Jefferson, etc. over 200 years ago. Not only are their public statements still retransmitted and remediated in a positive light, but they are widely referenced by current influential speakers with new public statements on a daily basis.

It’s the new Narratives, though, that I am most interested in. How do they emerge? How do they sustain themselves? How do they manifest themselves in predictable patterns of behavior?

epsilon-theory-manifesto-june-1-2013-bernanke

Time, December 16, 2009

For example, the current Narrative associated with Federal Reserve policy is just as powerful and just as real as any historical Narrative I am aware of, including the Narratives of global religions and major nationalities. Fifty years from now, will we look back on Central Bank Omnipotence as a dead myth, as something akin to Manifest Destiny, or will it continue to shape our expectations and behaviors as the Founding Fathers Narrative does? The answer to this question will almost certainly not depend on the actual efficacy of Federal Reserve policy! Narratives tend to die with a whimper, not a bang, and even successful Narratives from a policy perspective (as Manifest Destiny surely was) can wither as they are supplanted by new interpretations and representations of the world that better serve the interests of the economic and political entities that promulgate Narratives.

For a current Battle Royale between two competing Narratives, look no further than Europe. On the one hand, we have the Narrative of European Union, which is a potent and vibrant public representation of an active set of public statements by extremely influential people advocating shared notions of identity and sovereignty across national European borders. This Narrative serves the interests of a large mandarin class of bureaucrats, as well as the economic interests of most European companies.

epsilon-theory-manifesto-june-1-2013-quarto-reich

il Giornale, August 3, 2012

And on the other hand, we have the Narrative of German Hegemonic Desires, which advocates political resistance to Germany’s imposition of its preferred economic policies through EU mechanisms. This Narrative serves the interests of Opposition political parties and is particularly strong in Italy under the aegis of Berlusconi’s media empire. Neither of these competing European Narratives is going away anytime soon, if ever. But the waxing and waning of one versus the other has investable consequences for market behaviors, and it is in this assessment of the Narrative battlefield, if you will, that the Epsilon Theory perspective can provide direct benefits.

The link between Narrative and behavior is Common Knowledge, which is defined as what everyone knows that everyone knows. This is actually a trickier concept than it might appear at first blush, because as investors we are very accustomed to evaluating the consensus (what everyone knows), and it’s easy to fall into the trap of conflating the two concepts, or believing that Common Knowledge is somehow related to your private evaluation of the consensus. It’s not. Your opinion of whether the consensus view is right or wrong has absolutely nothing to do with Common Knowledge, and the consensus view, no matter how accurately measured or widely surveyed, is never the same thing as Common Knowledge.

Common Knowledge is, in effect, a second-order consensus (the consensus view of the consensus view), and it is extremely difficult to measure by traditional means. You might think that if a survey measures a consensus, then all we need to do is have a survey about the survey to measure a consensus view of the consensus view and hence Common Knowledge, but you would be wrong. What would the second survey ask? Whether or not the second-survey individuals agree with the first-survey individuals? Common Knowledge has nothing to do with whether the second-survey individuals think the original consensus view is right or wrong … that would just be an adjustment of the original survey. What you’re trying to figure out is the degree to which everyone believes that everyone else is relying on the original survey as an accurate view of the world, which has nothing to do with whether the original survey does in fact have an accurate view of the world. It has everything to do, however, with how widely promulgated that original survey was. It has everything to do with how many influential people – famous investors, famous journalists, politicians, etc. – made a public statement in support of the original survey. It has everything to do with the strength and scope of the Narrative around that original survey, and this is what you need to evaluate in order to infer the level of Common Knowledge in play regarding the original survey.

Now obviously it’s unlikely for a powerful Narrative like Central Bank Omnipotence to emerge around a survey, but replace the words “original survey” with “consensus view that the Fed has got your back” and you’ll see how this works.
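
The distinction is easy to see in a toy simulation (every number below is invented for illustration): each agent holds a private view, and separately holds a belief about what everyone else believes. A survey measures the first; the Common Knowledge Game is played on the second, and the two can sit far apart:

```python
# Toy illustration: first-order consensus (what everyone thinks) versus
# second-order consensus (what everyone thinks everyone thinks).
import numpy as np

rng = np.random.default_rng(2)
n = 1000

# Each agent's own view of, say, Fed efficacy (private views center at 0).
private_view = rng.normal(0.0, 1.0, n)

# A widely promulgated public statement anchors each agent's belief about
# *others'* views at +1.0, regardless of that agent's private view.
belief_about_others = rng.normal(1.0, 0.2, n)

print(f"consensus (mean private view): {private_view.mean():+.2f}")
print(f"second-order consensus (mean belief about the consensus): "
      f"{belief_about_others.mean():+.2f}")
```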

The more Common Knowledge in play at any given time, the more market behaviors will be driven by the rules and logic of the Common Knowledge Game rather than by fundamentals or traditional factors. I’ve written extensively about the CK Game in my prior letters, so I won’t repeat all of that here (for a collection of this work see the Epsilon Theory Archives). Suffice it to say that you’ll be reading a lot more about specific applications of the CK Game in future letters. You’ll also be reading a lot more about pre-World War II investing, as I find that matching examples of successful game-playing in the past with opportunities for game-playing today is a very effective way of communicating the power and relevance of Epsilon Theory.

On that note, I want to conclude with an extended passage from John Maynard Keynes, writing in the 1930’s about the game-playing he saw and experienced on a daily basis with his personal investing.

epsilon-theory-manifesto-june-1-2013-maynard

Thus the professional investor is forced to concern himself with the anticipation of impending changes, in the news or in the atmosphere, of the kind by which experience shows that the mass psychology of the market is most influenced.

This battle of wits to anticipate the basis of conventional valuation a few months hence, rather than the prospective yield of an investment over a long term of years, does not even require gulls amongst the public to feed the maws of the professional; it can be played by professionals amongst themselves. Nor is it necessary that anyone should keep his simple faith in the conventional basis of valuation having any genuine long-term validity. For it is, so to speak, a game of Snap, of Old Maid, of Musical Chairs – a pastime in which he is victor who says Snap neither too soon nor too late, who passes the Old Maid to his neighbour before the game is over, who secures a chair for himself when the music stops. …

Or, to change the metaphor slightly, professional investment may be likened to those newspaper competitions in which the competitors have to pick out the six prettiest faces from a hundred photographs, the prize being awarded to the competitor whose choice most nearly corresponds to the average preference of the competitors as a whole; so that each competitor has to pick, not those faces which he himself finds prettiest, but those which he thinks likeliest to catch the fancy of the other competitors, all of whom are looking at the problem from the same point of view. It is not a case of choosing those which, to the best of one’s judgment, are really the prettiest, nor even those which average opinion genuinely thinks the prettiest.

We have reached the third degree where we devote our intelligences to anticipating what average opinion expects the average opinion to be. And there are some, I believe, who practise the fourth, fifth and higher degrees. – The General Theory of Employment, Interest, and Money (1936)
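
Keynes’s “degrees” of reasoning can be made concrete in the modern experimental version of his beauty contest – the guess-2/3-of-the-average game. A minimal sketch (the naive starting guess of 50 and the 2/3 multiplier are the standard experimental conventions, not anything from Keynes’s text):

```python
# Keynes's "degrees" in the guess-2/3-of-the-average beauty contest.
# A zero-degree player guesses 50; each additional degree best-responds
# to everyone else reasoning one degree less.
def guess(degree: int, naive: float = 50.0, p: float = 2 / 3) -> float:
    g = naive
    for _ in range(degree):
        g *= p              # best response to an average of g is p * g
    return g

for d in (0, 1, 2, 3, 4, 10):
    print(f"degree {d:2d}: guess {guess(d):6.2f}")
# Deeper reasoning drives the guess toward 0, the equilibrium of the game.
```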

Any investment manager who has watched market indices tick-by-tick after an FOMC announcement knows the truth of what Keynes wrote 80 years ago. It clearly doesn’t matter what you think about the Fed statement itself. And you quickly learn that it doesn’t matter what you think about whether expectations of the Fed statement were met or not, because as often as not the market will go in the opposite direction from the one you surmised.

What you want to know is what everyone thinks that everyone thinks about the Fed statement, and you can’t find that in the Fed statement, nor in any private information or belief. You can only find it in the Narrative that emerges after the Fed statement is released. So you wait for the talking heads and famous economists and famous investors to tell you how to interpret the Fed statement, but not because you can’t do the interpreting yourself and not because you think the talking heads are smarter than you are. You wait because you know that everyone else is also waiting. You are playing a game, in the formal sense of the word. You wait because it is the act of making public statements that creates Common Knowledge, and until those public statements are made you don’t know what move to make in the game.

As Keynes wrote, you are devoting your intelligence to anticipating what average opinion expects the average opinion to be. And there is nothing – absolutely nothing – in the standard model of modern portfolio theory or the fundamentals of the market or any alpha or beta factor that can help you with this effort. It’s not that the standard model is wrong. It’s just incomplete, both on its own terms (we have precious little alpha or beta factor data from prior periods of global financial crisis) and, more importantly, in that it was never intended to answer questions of strategic behavior. You need an additional tool kit, one designed from the outset to answer these questions. That’s what Epsilon Theory is intended to be, and I hope you will join me in its development.

PDF Download (Paid Membership Required): http://www.epsilontheory.com/download/15632/