

Did an Excel error bring down the London Whale?

When JP Morgan Chase announced in May 2012 that it had lost more than $2 billion on the capital markets, many pointed to the actions of rogue trader Bruno Iksil as the cause. But was the "London Whale" (the nickname other traders gave him for his outsized positions) the victim not of hubris, but of a simple spreadsheet error? James Kwak, associate professor at the University of Connecticut School of Law and co-founder of the Baseline Scenario blog, noted some interesting facts in JP Morgan Chase's post-mortem investigation of the losses. Specifically, the Value at Risk (VaR) model that underpinned the hedging strategy "operated through a series of Excel spreadsheets, which had to be completed manually, by a process of copying and pasting data from one spreadsheet to another", and although the review concluded "that it should be automated", it never was.

This is a surprisingly common practice: through accretion and incremental additions, an important statistical calculation somehow ends up implemented as a convoluted series of Excel worksheets, connected by hundreds (or even thousands) of cell-reference formulas, all driven by input parameters that must be entered manually. Not only does this introduce the risk of errors when cutting and pasting the inputs, it also makes the workbook extremely fragile. As anyone who's built a budget in Excel knows, it's all too easy when editing a spreadsheet to find that formulas no longer extend to their expected ranges (ever missed the bottom row from a formula when adding new data?), or point to the wrong data entirely.

And then there's the possibility of errors in the formulas themselves, which seems to have been an issue here as well: "After subtracting the old rate from the new rate, the spreadsheet divided by their sum instead of their average, as the modeler had intended. This error likely had the effect of muting volatility by a factor of two and of lowering the VaR . . ."

Excel is an excellent tool for many applications, but its intertwined cross-references between formulas make errors like this hard to detect, and hard to correct even when discovered. That's why a programming language designed for data analysis, such as R, is a better platform for building the computational engines behind VaR models and other financial systems. Not only can it automate the process of importing data and inputs from other systems (and thus eliminate cut-and-paste errors), it also provides a structured, maintainable environment for the computational logic, within a framework that promotes code review and unit testing to detect errors. Excel may still be the preferred vehicle for delivering the results, but use R to generate the analytics computations (or embed real-time R computations in Excel) rather than risking an implementation in Excel formulas. A short R sketch of the factor-of-two error, and of the kind of unit test that would catch it, appears at the end of this post.

Read also:
- How Validus Re uses Revolution R Enterprise for risk management
- The Baseline Scenario: The Importance of Excel (via Ben Lorica)
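To make the factor-of-two effect concrete, here is a minimal R sketch of the arithmetic described in the report. The rate values are invented purely for illustration; they are not figures from JP Morgan's model.

```r
# Made-up rates for illustration; not values from the actual VaR model
old_rate <- 0.025
new_rate <- 0.031

# Intended calculation: change divided by the average of the two rates
intended <- (new_rate - old_rate) / mean(c(old_rate, new_rate))

# The reported error: change divided by the sum instead of the average
buggy <- (new_rate - old_rate) / (old_rate + new_rate)

intended / buggy
#> [1] 2
```

Because the sum of two numbers is exactly twice their average, the erroneous formula understates every change by a factor of two regardless of the inputs, consistent with the report's observation that volatility was muted by a factor of two. And here is the kind of unit test that a structured R codebase makes routine; testthat is a widely used R testing package, and rel_change is a hypothetical implementation of the intended calculation, not code from the actual model.

```r
library(testthat)

# Hypothetical implementation of the intended calculation
rel_change <- function(old, new) (new - old) / mean(c(old, new))

test_that("rate changes are normalized by the average, not the sum", {
  # (3 - 1) / mean(1, 3) = 2 / 2 = 1; the sum-based bug would return 0.5
  expect_equal(rel_change(1, 3), 1)
})
```

A test like this would fail immediately on the divide-by-sum version, whereas a formula buried in one cell of a large workbook can go unnoticed for months.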


More Stories By David Smith

David Smith is Vice President of Marketing and Community at Revolution Analytics. He has a long history with the R and statistics communities. After graduating with a degree in Statistics from the University of Adelaide, South Australia, he spent four years researching statistical methodology at Lancaster University in the United Kingdom, where he also developed a number of packages for the S-PLUS statistical modeling environment. He continued his association with S-PLUS at Insightful (now TIBCO Spotfire), overseeing the product management of S-PLUS and other statistical and data mining products.

David Smith is the co-author (with Bill Venables) of the popular tutorial manual An Introduction to R, and one of the originating developers of the ESS (Emacs Speaks Statistics) project. Today, he leads marketing for Revolution R, supports R communities worldwide, and is responsible for the Revolutions blog. Prior to joining Revolution Analytics, he served as vice president of product management at Zynchros, Inc. Follow him on Twitter at @RevoDavid.