Axioms of Asset Valuation


My intent is that this post become a living document which houses my own personal magnum opus on asset valuation. Herein and throughout I will posit certain axioms of asset valuation that I believe to be relevant for distinguishing between a thing’s market versus true value. Upon review, one might (correctly) deduce that none of these axioms are my original ideas.

The axioms are summarized through the following postulates:

Postulate (1): The market price is usually the right price.

  1. More verbosely stated, the right price for an asset can be any of a range of prices which does not result in arbitrage (i.e., a free lunch). The no-arbitrage principle is the pillar of the efficient market hypothesis (EMH) and the fundamental theorem of asset pricing (FTAP). Complete markets are required for strong forms of no-arbitrage to hold; weaker (i.e., statistical) forms of the principle carry no such hard stipulation.
  2. That the market is fairly valued is practically a tautology if one uses market-based determinants for asset fair valuation. The fair value of a generic asset, V_t, is reflexively its discounted net present value (NPV) which, in turn, is defined as the integrated value of internally generated free cash flows, C_t:

    (eq 1) V_{t-\Delta t} = \sum_t^T C_t\frac{1}{(1+\frac{r}{n})^{nt}} \approx \int_t^T C_t e^{-rt} \, dt

    in which, in the limit of continuous geometric compounding of the short (i.e., risk-free) rate, r, the discrete compounding factor becomes exponential:

    \lim_{n \to \infty} (1 + \frac{r}{n})^{nt} = e^{rt}

    For a cash flow which grows exponentially at a fixed rate g from an initial level C, this becomes the well-known growing annuity formula:

    (eq 2) V_{t} = \int_t^T C e^{gs} e^{-rs} \, ds = C \frac{e^{(g-r)t}-e^{(g-r)T}}{r-g}

    As T \to \infty (assuming r > g), this becomes the perpetuity equation of the Gordon Growth Model ubiquitous in equity valuation:

    (eq 3) V_{t=0} = \frac{C}{r-g}

    If the discount rate, r, is defined as the cost of equity, r_e, implied by aggregate market values (M), then the fair value of the market is a thing which reflexively defines itself:

    if: r_e = \frac{\sum_i C_i }{\sum_i M_i}

    and: \sum_i V_i = \frac{\sum_i C_i}{r_e},

    then: \frac{\sum_i M_i}{\sum_i V_i} = 1.

    As long as assumptions regarding fair value are held internally consistent, the market is the capital-weighted reflection of market participants’ consensus regarding the expected values of determinant pricing variables under a broad range of possible scenarios. (A toy sketch of this reflexivity appears after this list.)

  3. In even a weakly efficient market, common “valuation” ratios tell us little about value, but do tell us a lot about the expectations of market participants. If we take seriously the other participants in a market (which we should), there is a justification for their every purchase and sale. As a result, security prices which appear “cheap” — relative to ubiquitous price multiples to earnings, book, and other fundamental factors — reflect expectations about the future performance of the underlying company (and, to some degree perhaps, expectations regarding the stock’s future performance independent of the underlying company’s future performance). A stock which appears “cheap” is not necessarily “a value” — these are not the same thing, but have been conflated according to the canonical value investor’s worldview. More often than not, an apparently cheap stock actually deserves to be cheap. These value traps are increasingly common as markets become increasingly efficient due to the increasing dispersion of financial data and of knowledge surrounding the nature of mis-pricings and risk premia among market participants. Yet, the misconception that “cheap = value” persists because apparently “cheap” stocks have in aggregate out-performed the averages over the past century. This out-performance, however, was not necessarily due to a systemic “value” phenomenon. Rather, asymmetric upside and margin of safety were made possible because prices reflected overwrought pessimism and/or capriciousness. Indeed, the cross-section of “cheap” equities confirms that nearly all excess returns are due to a few outliers which exceeded the market’s low expectations. And of these outliers, the majority resided within the dusty corners of the market.
  4. Exceeding low expectations is much easier and more likely than exceeding high expectations. From this premise, explanations for market behavior rooted in behavioral economics allow for the relative predictability of some returns without breaking the EMH. Note that while it is possible for individual securities to be mis-priced within a broadly efficient marketplace, abounding inefficiency should not become our base assumption, lest we become overly enthralled with any number of value traps.
  5. True value investing is a lot more involved than buying cheap.
  6. Fundamentals which are expected to improve offer information which is distinct from trailing valuation ratios. This effect may be objectively captured, albeit also using trailing data, within the Piotroski Effect. Another possible, but not mutually exclusive, explanation is that Piotroski factors are mistakenly interpreted as value-centric when in reality they are just conflated momentum factors.
  7. The veracity of the EMH precludes almost any possibility that easily discerned patterns in widely dispersed data, such as price and volume, have any real and/or sustained predictive power over future price paths. An efficient market is in direct contravention of the causal underpinnings of nearly all forms of technical analysis. Proponents of technical analysis suppose that observed market prices discount all information related to supply and demand. This is not controversial. Proponents further claim that price action results in repeated and predictable patterns which provide signals regarding future areas of supply (i.e., resistance) and demand (i.e., support). That support and resistance exist is also not controversial. However, the ability of past price fluctuations to predict future paths is extremely controversial since it implies a free lunch. Even if a certain pattern did repeat itself in a predictable manner, the barriers to entry for arbitrage based on widely dispersed data are virtually non-existent. Alert to the potential for free money, arbitrageurs would front-run the expectation, thereby changing the pattern altogether. Although certain technical patterns may contain information about the future, the nature of efficient markets requires that profiting by these patterns rely on proprietary knowledge, data, technology, and/or execution techniques.
  8. An efficient market does not rule out the possibility that constituent securities within that market may experience mis-pricings. True mis-pricings result as a consequence of investors’ errors regarding a security’s true value. Moreover, the very factors that contribute to errors also tend to be those that relate to trading costs and whatever risk premia may be present. The Fama-French (FF) factors work most strongly within the dusty corners of the market (i.e., within securities which are hard to short, illiquid, under-covered and/or under-followed, of under-capitalized and/or tightly-held companies, et cetera…). As a rule, the relative predictability of these securities’ returns due to errors is not a strong violation of the no-arbitrage principle if we permit that these same factors are also explicable barriers to arbitrage. Stated plainly, investor errors are most likely to persist in areas of the market in which: a) smart money (which generally equals big money) cannot or does not care to scale; b) unique information is more likely to exist; and/or, c) regulators are less likely to notice or care about asymmetry. An edge-seeking small investor should (law-abidingly) look in these places first.
  9. Moreover, the relative predictability of some asset returns due to the presence of risk premia is anticipated by the EMH.
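
A minimal sketch of the reflexivity described in point (2) above, assuming a toy market of three firms valued as zero-growth perpetuities (equation 3 with g = 0); all figures are hypothetical:

```python
# Toy market: back out the market-implied cost of equity from aggregates,
# then re-value each firm at that same rate. All numbers are fabricated.
cash_flows = [5.0, 12.0, 3.0]      # internally generated free cash flows, C_i
market_caps = [80.0, 150.0, 40.0]  # market values, M_i

# Market-implied cost of equity: r_e = sum(C_i) / sum(M_i)
r_e = sum(cash_flows) / sum(market_caps)

# Fair value of each firm as a perpetuity at the market-implied rate (eq 3, g = 0)
fair_values = [c / r_e for c in cash_flows]

print(sum(market_caps) / sum(fair_values))  # -> 1.0, by construction
```

Note that only the aggregate identity holds by construction; the individual V_i need not equal the individual M_i, which is precisely where cross-sectional expectations (and errors) live.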

Postulate (2): The presence of momentum observed within many marketplaces is undoubtedly the most serious affront to Postulate (1).

  1. The presence of a predictable return spread which is not related to any kind of risk premia or trading costs looks like a classical free lunch. Momentum, if it truly does exist, utterly violates classical thinking on market efficiency and the no-arbitrage principle of asset pricing. Its presence was popularized in a 1997 paper by Mark Carhart, whereupon it was begrudgingly added to the classical FF Factor Model. So inconvenient is momentum to the EMH that Eugene Fama has called its mere existence “the biggest embarrassment to the theory“. Anyhow, momentum has since been left out of the FF factor models in favor of more explicable return spreads. Explicable within the FF framework, by the way, simply means that observed excess return spreads do not really exist after costs and risk premia are factored in. Otherwise, it would be an arbitrage whose mere discovery makes it *poof* disappear. It’s like the old economics joke:

    Two economists are eating lunch together and one of them points out to the other what appears to be a $100 bill lying on the ground. The other responds, “don’t worry — if that were a real $100 bill, someone would have already picked it up by now.”

  2. Kenneth French, regardless of whether he eschews the core logic of their existence, continues to track anomalies related to portfolios constructed from prior returns within his data library, which includes short-term mean reversion in addition to longer-term momentum (a toy construction of such a portfolio appears after this list). The presence of short-term mean-reversion is less problematic because its return spreads are at least partially explicable through first-mover advantages and trading costs. The presence of mean-reversion also implies that price deviations due to short-term perturbations (e.g., supply-demand imbalances) are quickly identified and corrected as a thing returns to its equilibrium value. For example, if I am compelled to sell a thing (e.g., due to a margin call; in order to pay taxes or put money down on a house; et cetera) and thereby drive down the price below its equilibrium, taking the other side of that trade should result in a positive expectancy. Short-term mean reversion is therefore not very problematic from the standpoint of the EMH. Momentum, however… economists just can’t be having any free lunches.
  3. The relatively new field of behavioral economics, conceived by Daniel Kahneman and Amos Tversky, has barely begun to unravel the crux of the human condition which compels people to “buy high and buy higher”, which is presumably at the root of the momentum anomaly. As a general aside, the field of behavioral economics is still a blue sky. A graduate student could probably learn most everything that has been written on the topic within a two-year program. On the other hand, students in other general fields of finance and economics could at best only scratch the surface of the massive corpus of literature — thus specialization.
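
A toy sketch of how prior-return portfolios of the sort tracked in French’s data library are typically formed; the 12-2 look-back (skipping the most recent month, which tends to mean-revert) is the conventional choice, and the tickers and returns below are fabricated:

```python
# Rank a (fabricated) universe by prior 12-2 month return; go long the prior
# winners and short the prior losers. A real implementation would use deciles,
# monthly rebalancing, and value- or equal-weighting.

def momentum_signal(monthly_returns):
    """Cumulative return over months t-12 through t-2, skipping month t-1."""
    growth = 1.0
    for r in monthly_returns[-12:-1]:
        growth *= 1.0 + r
    return growth - 1.0

universe = {  # {ticker: trailing 12 monthly returns}, fabricated
    "AAA": [0.02, 0.01, 0.03, 0.02, 0.01, 0.04, 0.02, 0.03, 0.01, 0.02, 0.05, -0.10],
    "BBB": [-0.01, 0.00, -0.02, 0.01, -0.03, 0.00, -0.01, -0.02, 0.01, -0.01, -0.04, 0.08],
    "CCC": [0.01, 0.02, 0.00, 0.01, 0.02, 0.01, 0.00, 0.02, 0.01, 0.00, 0.01, 0.00],
}

ranked = sorted(universe, key=lambda t: momentum_signal(universe[t]), reverse=True)
print(f"long {ranked[0]}, short {ranked[-1]}")  # long prior winner, short prior loser
```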

Postulate (3): The veracity of Postulate (1) does not imply that the market is usually right.

  1. That the market price is usually the right (i.e., no-arbitrage) price simply means that it is very difficult to beat the market. It does not state that the current market price is an accurate estimate of the future market price — this is a common mis-reading. There exist inherent elements of uncertainty in asset prices, virtually indistinguishable from random walks, which will more than likely result in the following inequality:

    (eq 4) P_t \ne P_0 e^{rt}

    Rather, changes in asset prices can be thought of as taking a random walk — as in a Wiener process, W_t, which drives Geometric Brownian Motion — in which changes in prices over \Delta t are the result of time-value (r), drift (\mu), variance (\sigma^2), and a standard normal random variable (Z_t).

    (eq 5) P_t = P_0e^{(r+\mu - \frac{\sigma^2}{2})t+\sigma W_t}

    (eq 6) = P_{t - \Delta t}e^{(r+\mu - \frac{\sigma^2}{2})\Delta t +Z_t \sigma \sqrt{\Delta t}}

    In this proposed random walk environment, in which discounted asset prices are virtually indistinguishable from martingales, the current price is the probabilistic best estimate for future prices. Therefore, under the risk-neutral expectation, \mathbb{Q}, given by the FTAP — which requires that at least one equivalent measure exist under which discounted prices are martingales in order for no free lunches to exist — the current price provides the best probabilistic estimate for future prices, i.e.:

    (eq 7) \mathbb{E^Q}[P_t] = P_{t - \Delta t} e^{r \Delta t}
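
A minimal Monte Carlo check of equation (7), assuming GBM dynamics under \mathbb{Q} (i.e., \mu = 0); the parameter values are arbitrary:

```python
# Verify that E^Q[P_t] = P_0 * e^(r*t) when prices follow eq (5) under the
# risk-neutral measure (mu = 0). Note the -sigma^2/2 correction in the
# exponent, which offsets the lognormal convexity.
import math
import random

P0, r, sigma, t = 100.0, 0.03, 0.25, 1.0
n_paths = 200_000

random.seed(42)
total = 0.0
for _ in range(n_paths):
    Z = random.gauss(0.0, 1.0)
    total += P0 * math.exp((r - 0.5 * sigma**2) * t + sigma * math.sqrt(t) * Z)

print(total / n_paths)       # ~103.05, within Monte Carlo error of...
print(P0 * math.exp(r * t))  # ...the no-arbitrage forward value, 103.05
```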


Postulate (4): In cases in which Postulate (1) is false, there may exist some form of arbitrage.

  1. Strong forms of arbitrage demand a risk-free return which, in turn, depends on mis-pricings which are independent of assumptions regarding future price paths and the nature of their randomness. Strong forms of arbitrage are usually fleeting. Most investors will never definitively discover a true risk-free arbitrage opportunity in their lifetimes.
  2. However, if/when an investor is able to deduce that the nature of uncertainty is not completely random, he/she might infer the presence of a statistical arbitrage — i.e., an expectation for profit above the risk-free rate of return. Statistical arbitrages may abound especially when and where there exists informational asymmetry. An investor has a statistical arbitrage when/where he/she determines that expectations implied by current market prices are unlikely to reflect revised expectations once other market participants become aware of either their previous mis-judgements and/or the presence of contradictory information. Statistical arbitrages do not require complete markets and do not present a clear violation of the EMH.
  3. In reality, success in the market is the result of passively riding the wave, luck, skill, or compensation for assuming (perceived) risk. A 2017 paper entitled Do Stocks Outperform Treasury Bills? on the cross-section of equity returns since 1926 elegantly explains why most active investors under-perform their self-selected benchmark — because nearly all excess equity returns are attributable to the top-performing quintile, under-diversified portfolios (i.e., most actively-managed portfolios) tend, in aggregate, to under-perform the passive index. And for any actively-managed portfolio which does indeed out-perform, it is exceedingly difficult to discern whether the result was due to luck or skill. See Warren Buffett’s essay on The Superinvestors for a demonstration of the luck-versus-skill paradox involving “a national coin-flipping competition”. If the market is efficient, there is no such thing as a good coin-flipper (i.e., “active investor”) who is not also an arbitrageur. However, it is reasonable for an average investor to expect an excess rate of return over the long-run by receiving premia for the assumption of perceived risk in a sufficiently diversified portfolio. The veracity of this so-called smart beta approach indicates that the bankable difference between a skilled investor/arbitrageur and a skilled risk-taker is semantic. In the real world, it is irrelevant whether returns above the risk-free or market rates of return are due to alpha (i.e., returns in excess of beta) or beta (i.e., correlated volatility among other factors).

Postulate (5): Postulate (4) depends on the singular premise that price seeks value.

  1. The assertion that “price seeks value” is a philosophical one but is also supported by the EMH and lots of data. The EMH supposes that a market’s sole purpose is as a price discovery mechanism. The implication is that current prices reflect the current capital-weighted consensus. As new information regarding the present and future become known, prices will seek a new efficient equilibrium.
  2. Instances of excess return spreads which do not depend on the momentum anomaly (which may be interpreted as a form of massive cognitive dissonance) depend almost exclusively on violations of the time-value of money (TVM) principle. If the expected net present value of internally generated cash flows of any generic asset can be shown to not equal its market price, then a statistical arbitrage may exist.

    If: \mathbb{E}^{\mathbb{Q}}[P_t] \ne P_{t - \Delta t} e^{r \Delta t},

    then a purchase or sale carries a positive expected profit as price converges toward value: P_{t - \Delta t} \to_{\Pi \gg 0} \mathbb{E}^{\mathbb{Q}}[P_t] over the period, \Delta t.

    The expected profit received from identifying violations of TVM reflect the wisdom of the famous Warren Buffett axiom: “Price is what you pay. Value is what you get”. But then again, usually, “you get what you pay for”.

  3. Instances in which prices diverge from value are more readily arbitrage-able when there are instruments to trade both \mathbb{E}^{\mathbb{Q}}[P_t] and P_{t - \Delta t}, such as (in the weak case) constructing factor-based long-short portfolios and/or (in the strong case) dynamically hedging replicating pay-off portfolios. A one-sided arbitrage trade is still risky even if the relationship between the expectation and the market holds; i.e., \Pi_{loss} \gg 0 when transacting one side of the expectation while \Pi_{loss} \approx 0 in an (efficiently and cheaply) hedged position.
  4. Forms of statistical arbitrage which rely on TVM support the use of discounted cash flow analyses (DCF). While nearly all professional security analysts rely on some form of DCF, most fail to incorporate objective measures of uncertainty in their analyses. Furthermore, many analysts set discount rates equal to the weighted average cost of capital (WACC) as implied by the Capital Asset Pricing Model (CAPM). WACC as a proxy for opportunity cost inherent in TVM is not controversial. The utility of CAPM, however, is controversial.
  5. Stochastic pricing models (e.g., the Black-Scholes derivation for the no-arbitrage value of European-style put and call options) which model the expected value of a terminal, discrete cash flow based on the expected terminal value of an underlying stochastic process have made great progress in dealing with empirical uncertainty. However, this author is not yet aware of a closed-form model which handles the generic case of continuous stochastic cash flows which derive their time-dependent values from an underlying price process which is itself stochastic. In essence, the general formula for a stochastic annuity marries equation (1) on the time-value of an annuity with equation (6) on the expected measure of a random walk:

    (eq 8) \mathbb{E}^{\mathbb{Q}}_t[V_{t}] = \int_t^T \int_{-\infty}^{\infty} \varphi\left(\frac{Z_t \sigma \sqrt{\Delta t}+\frac{\sigma^2}{2} \Delta t}{\sigma\sqrt{\Delta t}}\right) \cdot \left({\mathbb{E}_t^\mathbb{Q}[ C_{t}e^{Z_t\sigma\sqrt{\Delta t}+r \Delta t}]} e^{-r \Delta t}\right)\,dZ\,dt

    where \varphi(Z) is the standard normal (i.e., Gaussian) probability density function defined by:

    \varphi(Z) = \frac{1}{\sqrt{2\pi}} e^{\frac{-Z^2}{2}}

    Note the analogy between equation (8) and the generic form of a double-integrated volumetric equation (i.e., a stochastic annuity is like the integral of pricing models which estimate the no-arbitrage value of a single, terminal pay-off). Also, the dynamics of C_t may not be log-normally distributed as equation (8) implies and may therefore have to be expressed as an expectation of a time-dependent pay-off condition contingent upon a random process. The generic problem of determining the no-arbitrage fair value of a stochastic annuity holds great promise and should make a honking good graduate thesis. (A Monte Carlo sketch of one reading of the problem follows.)
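
Absent a closed form, the stochastic annuity can at least be brute-forced. Below is a Monte Carlo sketch under one possible reading of equation (8): cash flows C_t follow a geometric Brownian motion, and each path’s flows are discounted at the short rate r; all parameter values are hypothetical.

```python
# Simulate GBM cash-flow paths and discount each path's flows at r.
import numpy as np

C0, r, g, sigma, T = 1.0, 0.05, 0.02, 0.30, 10.0
steps, n_paths = 520, 20_000
dt = T / steps

rng = np.random.default_rng(7)
Z = rng.standard_normal((n_paths, steps))
# GBM: C_{t+dt} = C_t * exp((g - sigma^2/2)*dt + sigma*sqrt(dt)*Z)
log_growth = (g - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * Z
C = C0 * np.exp(np.cumsum(log_growth, axis=1))

t = np.arange(1, steps + 1) * dt
pv = (C * np.exp(-r * t) * dt).sum(axis=1)  # discounted flows along each path

# For this linear pay-off, the expectation collapses to the growing annuity of
# eq (2), since E[C_t] = C0 * e^(g*t); the two should agree within error.
closed_form = C0 * (1 - np.exp((g - r) * T)) / (r - g)
print(pv.mean(), closed_form)
```

The collapse to equation (2) is itself instructive: by linearity of expectation, a plain stochastic annuity prices like its deterministic counterpart, so the genuinely hard (and thesis-worthy) cases are those in which the pay-off is an option-like, non-linear function of C_t.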


Postulate (6): Postulate (5) depends on the expectation that markets are rational over the short and long-run.

  1. If markets are indeed rational, inefficiently priced assets will eventually become efficiently priced. However, as Keynes noted in days of yore, “the market can remain irrational longer than you can remain solvent”. Extended bouts of massive cognitive dissonance are well-documented. But only in hindsight do these delusions of the masses become apparent to the masses.
  2. Apparent irrationality at the group level may actually be rational from the perspective of individuals when such behavior is sanctioned and supported by an artifice of regulatory and/or monetary distortions. Even so, the resultant bubble is still no less a bubble.

Postulate (7): The possibility that Postulate (6) is false indicates that risk-management may be needed to avert ruination.

  1. The possibility that markets can deviate from rational behavior from time to time means that even the most skilled investors must practice risk-management in order to avert major, even total, loss. The possibility that extended bouts of irrationality may prevail over the short and long-term is especially relevant to investors who attempt to earn an excess rate of return by holding a portfolio which is more focused than the market portfolio. Active investors who hold very broad portfolios are in essence closet indexers.
  2. Hedging is a great idea! However, hedging a concentrated portfolio of equities in which it is difficult (nay, impossible) to trade the underlying assets is problematic. Hedges which utilize options (i.e., derivatives of equities; i.e., derivatives of derivatives) are good tools for risk management. However, the use of derivatives to manage risk is usually costly to put on and costly to adjust in relation to the expected profit. Moreover, self-financing positions (e.g., credit spreads) typically are not well-suited to hedging risk since the credit received is compensation for risk assumed. In practice, the loss-aversion provided through dynamic hedging is not a free lunch. Rather, the efficient compounding of capital over the long-run simply demands a minimization of the likelihood of large, debilitating losses.
  3. The central tendency of large numbers of bets to converge on their expectation — as in making a large number of individual bets within a given time-frame — is a generally undervalued risk-management approach which does not sacrifice essential return provided that: a) direct and implied transaction costs are small in relation to the expectancy; and, b) the net expectation for each bet is positive.
  4. Markowitz’s Modern Portfolio Theory (MPT) demonstrates that diversification averts extreme losses within a mean-variance setting. However, the presence of steady-state covariances in equity markets has not been proven, even if a more generic form of co-movement is allowed to persist. The CAPM, an extension of MPT, possesses little utilitarian value aside from demonstrating the benefits of diversification.
  5. Most practical applications of the CAPM are, contrary to canonical interpretation, not supported by the veracity of the Modigliani-Miller Theorem (MM) on the irrelevance of capital structure to firm value. Although Theorem II of MM differentiates between levered and unlevered equity beta, this original sense of beta was in terms of the firm’s assets-to-equity ratio. The CAPM (also called the Sharpe-Lintner-Black model) married the intuitions of arbitrage pricing theory (APT) with an extension of the MM theorem by positing that equity beta could be directly observed by regressing stock returns against benchmark returns. It is this author’s opinion, however, that stock market returns contain little information on firm-specific risk factors. While the CAPM implies that a more volatile (i.e., “risky”) stock has a relatively greater expected return on equity, there is little empirical evidence which corroborates the intuition that volatility (i.e., standard deviation) = risk or that index correlation × relative volatility = risk. FF (1993) found that when portfolios are adjusted for size, market beta’s explanatory value falls to zero (and that its utility is not likely salvageable through the remediation of analytical errors). Moreover, the opposite effect has been observed, in which low-volatility and low-beta portfolios have out-performed the averages.
  6. FF’s entire existence is predicated on disproving the CAPM’s central intuition that the risk factors which drive expected return can be directly observed from the single-factor beta coefficient (i.e., slope) of asset price returns versus an arbitrary benchmark. Still, the FF Three- and Five-Factor Models are merely more sophisticated implementations of APT. The linear dependence on explanatory variables implied by APT is not necessarily indicative of market participants’ views.
  7. The Kelly criterion, conceived by John Larry Kelly Jr. at Bell Labs in 1956, is a promising avenue for the construction of efficient portfolios in ways that mitigate the risk of ruin while preserving essential return. Professional gamblers have long employed this so-called Fortune’s Formula for sizing bets in relation to their bank-rolls. Proper full-Kelly betting minimizes the expected number of bets required to double one’s bank-roll; it also maximizes the long-run expected logarithmic growth rate of the bank-roll. This particular convergence between the Kelly criterion and the normative discounting (i.e., logarithmic utility) methods used in canonical forms of TVM is indeed striking, but perhaps not coincidental. The Kelly criterion also anticipated the well-documented “volatility drag” phenomenon whereby the variance (\sigma^2_X) of returns literally exerts a drag on the expected logarithmic growth rate (\mu_X) relative to the arithmetic growth rate (m_X):

    (eq 9) \mathbb{E}[\mu_X] \approx \mathbb{E}[m_X] - \frac{\sigma_X^2}{2}

  8. Although this form of drag disappears if one assumes returns are continuous and logarithmic, real-world betting is discrete. In discrete scenarios with limited bank-rolls, betting in excess of full-Kelly always carries a probability of ruin \gg 0. In other words, excessive leverage always results in an expectation of eventual and inevitable wealth destruction. This wealth destruction effect, in fact, explains the dramatic long-run under-performance of leveraged ETFs and multiple financial crises, among other things.
  9. In order to exploit the flip-side of variance, professional gamblers often employ fractional-Kelly betting in order to further minimize the risk of ruin while preserving an attractive rate of return. For example, half-Kelly betting halves the expected volatility of the bankroll while only decreasing its expected rate of return by 25%. Legendary investor and MIT professor Edward Thorp is a proponent of the Kelly betting system and an editor of the Kelly Capital Growth Investment Criterion, which seeks to identify how investors can utilize Kelly criteria to size bets within a complex investment universe consisting of an arbitrary number of continuous semi-martingales. Of note, this special case of the Kelly investment criterion converges with the findings of Stochastic Portfolio Theory (SPT).
  10. The field of finance which extends the continuous Kelly investment criterion and/or SPT in order to optimize expected logarithmic utility under a general case of asset co-movement (or even the specific case of co-variance) has not been invented yet to my knowledge. This would make a honking good graduate thesis! (A toy demonstration of Kelly sizing follows.)
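
A toy sketch of Kelly sizing for a repeated binary bet, illustrating both the growth-optimality of full-Kelly and the drag of over-betting; the 60/40 even-money coin is hypothetical:

```python
# Realized log-growth per bet at various fractions of full-Kelly.
import math
import random

p, b = 0.60, 1.0                 # win probability; net odds (even money)
f_star = (p * (b + 1) - 1) / b   # full-Kelly fraction: 0.20 here

def log_growth(f, n_bets=100_000, seed=1):
    """Realized log-growth per bet of wealth when risking fraction f."""
    random.seed(seed)
    total = 0.0
    for _ in range(n_bets):
        win = random.random() < p
        total += math.log(1 + (b * f if win else -f))
    return total / n_bets

for f in (0.5 * f_star, f_star, 2 * f_star, 4 * f_star):
    print(f"f = {f:.2f}: log-growth per bet ~ {log_growth(f):+.4f}")
# Half-Kelly earns ~75% of full-Kelly's growth at half the volatility;
# 2x Kelly grows ~0; beyond that, expected log-growth turns negative (ruin).
```

Note how the output traces equation (9): as the betted fraction (and hence variance) rises past full-Kelly, the variance drag overwhelms the arithmetic edge and the expected logarithmic growth rate goes negative.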

Postulate (8): Increasing information dispersion increases the likelihood that Postulate (1) is true (i.e., contributes to the obsolescence of all other postulates).

  1. The supposition that markets are trending towards increasing levels of efficiency is increasingly relevant in the information age. Transaction costs — barriers to arbitrage — are far lower now than in the past. Previously exclusive data is widely dispersed and usually much less expensive. The computing power to crunch that data has also become exponentially cheaper and more available.
  2. The information regarding how to exploit that data is also now widely available. Quantpedia hosts a compendium of many supposed asset pricing factors and anomalies (i.e., those things which worked in the past). If we believe in even the weakest form of EMH, we would presume that dispersion of such knowledge leads to its obsolescence.
  3. Many supposed asset pricing anomalies are the expected result of over-fitting. A May 2017 NBER study tests the significance of 447 known asset pricing anomalies (57, 68, 38, 79, 103, and 102 variables from the momentum, value-versus-growth, investment, profitability, intangibles, and trading frictions categories, respectively), finding that as many as 64% may be insignificant at the 5% significance level. Of these, many are expected to be spurious (a simulated illustration of how easily this happens appears after this list). But perhaps more importantly, the authors cite how anomalies’ predictive powers tend to diminish over time, especially following publication. The paper further cites Schwert (2003), which shows that “after anomalies are documented in the academic literature, they often seem to disappear, reverse, or weaken”, and a similar study by McLean and Pontiff (2016) which shows that anomaly-based return spreads decline post-publication. Following the publication of Carhart (1997), return spreads related to prior asset returns (e.g., momentum and mean-reversion) have also declined.
  4. Evidence and logic overwhelmingly converge: the mere knowledge of an asset pricing inefficiency is its death knell — information dispersion is the alpha-killer. As a result, many historically relevant correlations are expected to become irrelevant going forward. Others share this position. In a recent post, Jason Zweig noted the impending hazard whereby known historical inefficiencies are expected to become future efficiencies once the knowledge thereof becomes widely dispersed, especially in light of the growing popularity of factor investing. Tammer Kamel noted incisively in his post on the Unbearable Transience of Alpha that “Professional allocators will not pay hedge fund fees for the execution of strategies that are on the first year curriculum of any Masters of Finance program.”
  5. Yet, some asset pricing anomalies appear to be robust despite the widely dispersed knowledge thereof. There are at least four non-exclusive explanations for the persistence of certain factors: a) many apparent mis-pricings are actually compensation for risk; b) many apparent mis-pricings do not result in arbitrage once costs are considered; c) many anomalies are fleeting and/or have limited capacity; and/or d) anomalies rooted in human psychology anticipate investor error. None of these explanations is very problematic to the idea that “the market is hard to beat”. Only the persistence of probabilistic arbitrage due to (d) may be seen as a clear affront to the EMH. Anomalies related to investor errors should persist as long as qualitative factors influence asset prices, but may increasingly diminish as humans increasingly abdicate reflexive decisions to methodical machine automation.
  6. In order to assess whether an anomaly is a true mis-pricing or merely a spurious correlation, it is helpful to return to bedrock principle — in this case, Postulate (1)’s definition of fair value. The following questions may help to frame such attempts:
    1. How does any given anomaly articulate within the concept of fair present value? I.e., how can it be used to estimate the present value of future money flows, or at least gauge the market’s expectations?
    2. How likely is a given factor to uncover errors in estimation? Even if a factor does not directly articulate with a mathematically convenient expression of present value, some anomalies may contain information regarding the persistence of investor errors and/or the presence of non-dispersed information.
    3. How likely is an apparent bargain to turn into a value trap? I.e., what is the likelihood that prices discount better information than I currently possess? Many times, apparently cheap things deserve to be cheap. That said, low expectations are easier to surpass than high ones.
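
A minimal simulation of the multiple-testing problem behind spurious anomalies: test 447 pure-noise “factors” at the 5% level and count how many look significant by luck alone (everything below is simulated noise, not real data):

```python
import numpy as np

rng = np.random.default_rng(0)
n_factors, n_months = 447, 600  # 50 years of monthly long-short spreads

# Each factor's monthly "return spread" is noise with a true mean of zero
spreads = rng.normal(loc=0.0, scale=0.02, size=(n_factors, n_months))

t_stats = spreads.mean(axis=1) / (spreads.std(axis=1, ddof=1) / np.sqrt(n_months))
discoveries = int((np.abs(t_stats) > 1.96).sum())

print(f"{discoveries} of {n_factors} noise factors appear 'significant' at 5%")
# ~22 false discoveries are expected (5% of 447): data-mine enough candidate
# factors and "anomalies" materialize out of nothing.
```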

Upon review, one might (correctly) deduce that I offer no original ideas and that I stole everything from the Classical, Austrian, Chicago, and Behavioralist Schools of Economics… in that order, too. I believe that the Classicists laid the foundations; the Austrians showed that economic canon (i.e., the Keynesians) needed more precision than was then possible; the Chicagoans brought rigor to the Austrians; and, now, the Behavioralists are showing the limitations of the Chicagoan efficient market hypothesis with evidence that human beings are NOT mathematically-convenient, rational, utility-seeking agents. Then there are the ecclesiastical Keynesians — who by their very natures tend to gravitate to positions of authority — always stirring up the possibility that, in the age of certainty, it might finally be possible to anticipate the precise future ramifications of (de-)regulation and of fiscal and monetary policy. We’ll see about that…

  • Yuval Taylor

    If the “true value” or the “right price” of an asset is the amount of money people are willing to pay for it, then postulate 1 is a truism. If the “true value” or the “right price” is defined in another way, then you have to tell us how you’re defining it in order for us to evaluate the postulates.

    • Good point, Yuval. Are you able to drill-down under Postulate (1)? I (somewhat haphazardly) define the fair value of an asset as the discounted net present value of its internally generated free cash flows. But I think you are correct in anticipating how this is a narrow definition.

      • Yuval Taylor

        Ah, I didn’t see the drill-down thing. All I saw were the postulates. Now I see that you have defined it quite well. It will be fun reading all your drill-down points under the various postulates.

        My reasoning is different. I think the “right price” is very different from the “true value,” as defined by the discounted net present value, though you seem to use them in the same sense. For me, the “right price” of an asset is the amount for which I can sell the asset at some future point plus the income from the asset which I would receive in the meantime. To estimate the right price, one must pay more attention to the behavior of market participants than to hoary formulae involving free cash flows, etc.

        What I don’t know is how this might fit in with your various thoughts on your postulates, so I’m going to read some more soon.

        • Yuval Taylor

          Now I’ve read it all and found it fascinating. It seems like a very cogent examination of the problem!

          • Thank you so much for reading!

            I am going to take some liberties with your words. Can your definition of “right price” as a function of future selling price plus the income stream be loosely interpreted as “free cash flows” in order to fit into this framework?

          • Yuval Taylor

            I don’t think so. I think it’s a very different dynamic at work. You mention it under postulate 3.

            I wrote an article about this last summer: you can read it here: http://backland.typepad.com/investigations/2016/08/the-expectations-game.html
