Last week, the Harvard Business Review blog presented some Harvard faculty responses to the newly signed Dodd-Frank financial reform bill. The Business School Professors’ sentiments range from Robert Steven Kaplan’s “cautious optimism” to Robert C. Pozen’s assertion that the act “misses the main cause of the crisis,” which in his view was Fannie Mae and Freddie Mac. And while David A. Moss believes that the bill takes important steps to rein in “too big to fail” banks, Clayton S. Rose says that “little has been done” to defuse the systemic risks of such institutions.

None of the Professors focus on what Joseph Fuller, co-founder of Monitor Group, has called “The Terminator” of modern financial markets: computer-based modeling and trading programs. In a 2009 piece in The American Scholar, Mr. Fuller argues that the work of “quants” worsened the financial crisis. He also describes regulatory steps that could help dampen the volatility produced by automated trading programs.

Neither the Dodd-Frank Act nor Harvard’s Professors assign high importance to quant-driven volatility, but Mr. Fuller’s argument suggests that they should. Automated models drive hair-trigger, lockstep responses to market signals. “The Terminator” has also discouraged the sort of qualitative historical analysis that many investors, including those who consider environmental, social and governance (ESG) factors, believe is the key to long-term value creation.

LTCM Failure “An Omen, Not an Anomaly”

In his American Scholar article, Mr. Fuller defines quant modeling and then describes its role in both the 2008 crisis and the 1998 failure of the Long-Term Capital Management (LTCM) hedge fund. He believes that regulators drew the wrong lessons from the LTCM collapse, which resulted in a Fed-organized bailout to prevent a domino effect across the financial markets.

“Wall Street and its regulators did not pick up the echo from LTCM as soon as they should have,” he writes. “They failed to perceive the vulnerability to volatility of trading models such as those that LTCM employed.” Mr. Fuller writes that regulators should have “seen LTCM as an omen instead of an anomaly,” and then describes the “three inherent problems” of computer modeling:

Computer models have three inherent problems. The first problem is that those who created the models don’t understand the markets. Modelers are experts in math, computer science, or physics. They are not generally experts in stocks, bonds, markets, or psychology. Modelers like to think of markets as efficient abstractions, but these abstractions can never fully account for the messy and irrational actions that humans take for emotional reasons. Moreover, as we have seen, they construct their models or programs based on a study of historical market data. They test them by showing how well the model would have performed in a given historical situation. Because their programs must have some parameters, modelers necessarily have to exclude unprecedented circumstances like the current simultaneous volatility in global debt, equity, currency, and commodity markets. [They have also typically neglected ESG performance metrics. – Ed.]

The second problem is that managers don’t understand the modelers. … Because they are unable to speak the same language as the people creating the models, the managers have difficulty framing the questions necessary to comprehend how the models might respond to different situations. …

The third problem is that the models don’t “understand” each other. Each model executes its own strategy based on its calculus for maximizing value in a given market. But individual models are not able to take into account the role other models play in driving the markets. As a result, each program reacts almost in real time to the actions of other programs, potentially compounding volatility and leading to wild market swings. As we have seen, this happened recently when a set of models analyzing market data led their respective firms to liquidate assets and maximize their cash positions. The cumulative effect intensified the resulting selloff.
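
Fuller’s third problem is the easiest to see in miniature. The toy simulation below is purely illustrative, with invented prices, triggers, and market-impact figures rather than anything from Fuller’s article or a real trading system: each automated program sells once the price falls through its own threshold, and each forced sale pushes the price down far enough to trip the next program.

```python
# Toy illustration of the third problem: programs that don't "understand"
# each other turn a modest dip into a cascade. All numbers are invented.

def simulate_cascade(price, triggers, price_impact=0.02, shock=0.03):
    """Each program liquidates when the price falls below its trigger;
    every forced sale pushes the price down further."""
    price *= (1 - shock)                      # an initial, modest shock
    sold = set()
    selling = True
    while selling:
        selling = False
        for i, trigger in enumerate(triggers):
            if i not in sold and price < trigger:
                sold.add(i)                   # program i dumps its position
                price *= (1 - price_impact)   # its selling moves the market...
                selling = True                # ...which may trip other programs
    return price, len(sold)

# Five automated programs with stop-loss triggers just below the current price.
final_price, tripped = simulate_cascade(price=100.0,
                                        triggers=[97.5, 96.0, 94.5, 93.0, 91.5])
print(f"price after cascade: {final_price:.2f} ({tripped} programs tripped)")
```

In this contrived setup, a 3 percent dip ends as a decline of more than 12 percent, because every program’s selling becomes another program’s sell signal – the compounding of volatility that Fuller describes.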

“Real Time” Reaction to the Wrong Signals

A model that responds in “real time” to relevant indicators would seem ideal. But what are the relevant indicators for a model of the modern financial market? Model-driven strategies have missed not only “unprecedented circumstances” but also precedent, such as that of LTCM.
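
Fuller’s first problem – calibration to history alone – can be made just as concrete. The sketch below is a minimal, hypothetical backtest; the return series, the leverage, and the volatility threshold are all invented. A leveraged rule tuned on a calm historical window scores well precisely because that window contains none of the correlated losses that later undo it, a pattern reminiscent of LTCM.

```python
# Minimal, hypothetical backtest: a model is judged only on the history it
# was shown, so scenarios absent from that history do not exist for it.
import random

random.seed(1)

def backtest(daily_returns, leverage=4.0, vol_limit=0.02):
    """Hold a leveraged position, de-levering only after a large move has
    already happened -- 'large' meaning larger than anything the rule was
    tuned to expect."""
    wealth, invested = 1.0, True
    for r in daily_returns:
        if invested:
            wealth *= (1 + leverage * r)
        invested = abs(r) < vol_limit          # the model's only risk signal
    return wealth

calm_history = [random.gauss(0.0005, 0.008) for _ in range(750)]   # in-sample
crisis_days = [-0.015, -0.018, -0.019, -0.05, -0.07, -0.04]        # never seen

print(f"wealth over the calibration window: {backtest(calm_history):.2f}")
print(f"same rule through a crisis:         {backtest(calm_history + crisis_days):.2f}")
```

The backtest over the calibration window is the only number the modeler ever sees; the short run of correlated losses, which that window never contained, erases roughly a third of the leveraged portfolio in a handful of days.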

The quants’ conventional wisdom built a system that ignored key systemic risk factors, and most models neglect non-financial indicators as well. No model predicted the disaster at a Massey coal mine, the suicides of Foxconn factory workers, or the explosion of a BP oil well.

But this does not mean that investors, regulators or citizens can resort to what Yves Smith has derided as the “Whocouldanode?” defense. No one could have known that Massey Energy, Foxconn, or BP would bring the world such tragedy in 2010, but these firms’ health and safety records did raise flags for some investors. For example, their risky ESG practices kept them off KLD sustainability indices long before those practices had tragic consequences.
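
The screening logic behind such exclusions is not exotic. The sketch below is a deliberately simplified, hypothetical negative screen, with invented company records and thresholds; it is not KLD’s actual methodology, only an illustration of how a health-and-safety record can keep a firm out of an index universe long before anything goes wrong.

```python
# Hypothetical negative ESG screen (invented data and thresholds; not KLD's
# actual methodology): firms with poor safety records are excluded up front.

companies = [
    {"name": "Firm A", "safety_violations_3yr": 2,   "fatalities_3yr": 0},
    {"name": "Firm B", "safety_violations_3yr": 57,  "fatalities_3yr": 3},
    {"name": "Firm C", "safety_violations_3yr": 610, "fatalities_3yr": 9},
]

def passes_screen(record, max_violations=25, max_fatalities=0):
    """A firm is eligible only if its record stays under both tolerances."""
    return (record["safety_violations_3yr"] <= max_violations
            and record["fatalities_3yr"] <= max_fatalities)

index_universe = [c["name"] for c in companies if passes_screen(c)]
print("eligible for the index:", index_universe)   # -> ['Firm A']
```

The point is not the particular thresholds but that the screen reads the company’s record directly, rather than inferring risk from price history alone.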

To paraphrase Joseph Fuller, a company’s historical record of ESG performance is an omen, not an anomaly. An analysis that studies companies, not just models, may reveal what “The Terminator” could never see.
