“Beware of geeks bearing formulas,” said Warren Buffett. Was it formulas, complex structures, and the technology that turns such structures into reality that led to the blow-up of LTCM or subprime debt? No: we say it was human greed and ignorance.
Hendershott et al. (Journal of Finance, Vol. 66, 2011) found that algorithmic trading in large asset blocks narrowed spreads, reduced adverse selection, and cut the time taken for price discovery. They concluded that algorithms improved liquidity and enhanced asset management. There are, however, detractors who question technology’s gift. The FAO suggested (July 2011) that technology and deregulated commodity futures had widened the scope for price speculation, thereby excluding genuine commercial actors.
Understanding the markets and the argument
Nor was the benefit confined to the front office: support staff, risk and compliance officers could, and should, keep a watchful eye on internal risk metrics and on the relative level of exposure to any given issuer or counterparty. For the global operator, the seamless passing of a trading book around the world, with risk departments kept in contact, could work smoothly. With backup sites at remote locations, even an event as dreadful as 9/11 would not halt trading. Technology protected the system.
If ever one hears a critic of the technological financial architecture declare that there is a crisis and quant methods are to blame, one should challenge that opinion immediately. Systemic fault lies not with technology but with market agents uneducated in what they trade or in what losses leveraged exposure could amount to. Writing for “The Daily Markets” in 2010, Peter Cohen suggested that one of the largest risks to the integrity of the global economy was the rapid growth of derivatives. Using 2008 data, outstanding futures and OTC contracts amounted to $696Tn against global GDP of $61.26Tn (source: BIS and World Bank) — a ratio of 11.36 : 1. However, Stephen Pope of the think tank ‘Spotlight Ideas’ says: “Do not be naive and ask how this scale factor can exist. ‘Derivative’ indicates an instrument has no independent value; its notional worth is ‘derived’, and leveraged, from an underlying asset’s spot price. The issue with OTC derivatives is that they are unregulated. If blame is to be apportioned, it has to fall on the shoulders of authorities that allowed self-regulation and still fail to reach a universal agreement.”
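The scale factor Pope defends is simple arithmetic on the BIS and World Bank figures quoted above. A quick sketch (all figures in trillions of US dollars, as stated in the text):

```python
# Notional derivatives outstanding vs. world GDP, using the 2008 figures
# quoted above (BIS for derivatives, World Bank for GDP), in $Tn.
derivatives_notional = 696.0   # outstanding futures + OTC contracts
world_gdp = 61.26              # global GDP

ratio = derivatives_notional / world_gdp
print(f"Notional-to-GDP ratio: {ratio:.2f} : 1")  # prints 11.36 : 1
```

As Pope's own point implies, notional value is not value at risk, so the headline ratio measures the scale of leverage rather than the sum actually staked.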
It is easy to lash out in all directions after Bear and Lehman, but where serious losses occurred it was because of a dereliction of duty on the trading and risk management desks. American International Group (AIG) lost over $18Bn on Credit Default Swaps (CDS). As Pope exclaims: “AIG was driven by a lust for the fees earned from CDS underwriting. Insufficient IT due diligence was applied in stress testing and scaling back exposures.” Recent examples show that risk officers and supervisory trading managers learned nothing from the $1.2Bn equity-derivatives loss run up by Nick Leeson in 1995, which broke Barings. In January 2008, Société Générale was almost bankrupted by Jérôme Kerviel, who lost €4.9Bn misusing futures contracts. Just this year, UBS lost $2Bn through Kweku Adoboli, a director of the Global Synthetic Equities Trading team, “Delta One”.
Every once in a while protest rises against agents who operate in ways the layman does not understand. Hedge funds and naked short sellers are felt to be evils that force institutions to the brink. Let us be clear: no short seller acts on a whim. Shorts are positioned because the numbers offered by politicians or CEOs do not stack up. To ease pressure on French banks, a short selling ban was introduced on 12th August 2011. From then to 23rd November 2011, the CAC 40 fell 11.9%, against 33.4% for BNP Paribas and 36.4% for Société Générale; clearly banning shorts has not worked. The same argument applies to sovereign debt. It is neither technology nor the market leading Greece to bankruptcy; it is the inaccuracy of national data and the stupidity of banks that lent heavily while chasing yields that were too good to be true.
The need for speed
IT is by no means a silver bullet. The very nature of technologists is to innovate and push boundaries, and this will always throw up new challenges. If the recent debate about High Frequency Trading (HFT) were not enough, Field Programmable Gate Arrays (FPGAs) — reconfigurable chips that execute logic directly in hardware, shifting the emphasis from general-purpose processing to analytics — are set to take the debate to the next stage. FPGAs will mean traders are no longer constrained by the limits of traditional Central Processing Units (CPUs); they will be able to process and analyse vast amounts of data in real time.
This shift away from process towards analysis and data flow has been happening for some time with technologies such as Complex Event Processing (CEP) and the ubiquitous ‘big data’. Chris Dutta of The Piccadilly Group, a company that specialises in the reliability of trading systems, says: “To be sure that these technologies perform and deliver as expected, independent verification and agreement of standards will be key. A provider may make a claim about near-zero latency or built-in risk checks to control and report on such systems, but companies must have a way of independently determining that such claims are accurate, given the speed and volumes of data involved and the potential for catastrophic impact.”

Have new initiatives such as MiFID helped, given that IT is central to any such roll-out? Recent surveys show the response is no ringing endorsement. It is claimed the European Code of Conduct for Clearing and Settlement, designed to foster interoperability between Europe’s post-trade services, “has not yet fulfilled its purpose.” Indeed, considerable industry uncertainty remains over the raft of forthcoming legislation, including MiFID II, and it remains to be seen whether any of it will fulfil its purpose either.
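Dutta’s point about independently verifying a vendor’s latency claims can be illustrated with a deliberately simple harness: rather than accepting a “near-zero latency” figure, measure it yourself and report tail percentiles. This is a toy sketch, not any firm’s methodology; the handler is a hypothetical stand-in for a vendor-supplied processing step.

```python
import time
import statistics

def handler(message):
    # Hypothetical stand-in for a vendor-supplied processing step
    # whose latency claim we want to verify independently.
    return message * 2

def measure_latency(fn, payload, runs=10_000):
    """Time each call; return (median, p99) latency in microseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn(payload)
        samples.append((time.perf_counter() - start) * 1e6)
    samples.sort()
    p50 = statistics.median(samples)
    p99 = samples[int(len(samples) * 0.99)]
    return p50, p99

p50, p99 = measure_latency(handler, 42)
print(f"median: {p50:.2f} us, p99: {p99:.2f} us")
```

The design choice matters: averages flatter a system, whereas percentiles expose it. A latency claim that holds at the median can still fail catastrophically at the 99th percentile, which is precisely where the “potential for catastrophic impact” lives.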
Co-written with Stephen Pope from Spotlight Ideas