“With City trading, everyone is running very similar algorithms,” he said. “They all follow each other, meaning you get results such as the flash crash. They use them to speed up the process and to break up big trades to disguise them from competitors when a big investment is being made. They will run new algorithms for a few days to test them, before letting them loose with real money. In currency trading, an algorithm lasts for about two weeks before it is surpassed by a new one. In equities, which is a less complicated market, they will run for a few months before a new one replaces them. It takes a day or two to write a currency algorithm. It’s hard to find out information about them because, for understandable reasons, they don’t like to advertise when they are successful. Goldman Sachs, though, has a strong reputation for having a brilliant team of algorithm scientists. PhD students in this field will usually be employed within a few months by an investment bank.”
The idea that the world’s financial markets — and, hence, the well-being of our pensions, shareholdings, savings, etc — are now largely determined by algorithmic vagaries is unsettling enough for some. However, as the NSA revelations have shown, the bigger questions surrounding algorithms center on governance and privacy.
How are they being used to access and interpret “our” data, and by whom?
Ian Brown, associate director of the University of Oxford’s Cyber Security Centre, says we all urgently need to consider the implications of allowing commercial interests and governments to use algorithms to analyze our habits.
“Most of us assume that ‘big data’ is munificent. The laws in the US and UK say that much of this [the NSA revelations] is allowed, it’s just that most people don’t realize yet, but there is a big question about oversight. We now spend so much of our time online that we are creating huge data-mining opportunities,” Brown said.
Brown says algorithms are now programmed to look for “indirect, non-obvious” correlations in data.
“For example, in the US, healthcare companies can now make assessments about a good or bad insurance risk based, in part, on the distance you commute to work,” he said. “They will identify the low-risk people and market their policies at them. Over time, this creates or exacerbates societal divides.”
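The targeting Brown describes can be sketched in a few lines. Everything below — the data, the weighting, the threshold — is invented for illustration; the point is only that a single "indirect, non-obvious" proxy such as commute distance can silently segment a market.

```python
# Hypothetical sketch: scoring applicants on a proxy variable
# (commute distance) and marketing only to the low-risk segment.
# Data and weights are invented for illustration.

applicants = [
    {"name": "A", "commute_km": 5},
    {"name": "B", "commute_km": 45},
    {"name": "C", "commute_km": 12},
]

RISK_THRESHOLD = 0.5  # arbitrary cut-off for this toy model

def risk_score(commute_km: float) -> float:
    """Toy proxy model: longer commutes map to higher assumed risk."""
    return min(commute_km / 60.0, 1.0)

# Only low-risk applicants are targeted with policy offers.
targeted = [a["name"] for a in applicants
            if risk_score(a["commute_km"]) < RISK_THRESHOLD]
print(targeted)  # → ['A', 'C']
```

Applicant B, with the long commute, is simply never offered the product — the kind of quiet sorting Brown argues "creates or exacerbates societal divides."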
University of Pennsylvania professor Oscar Gandy has done research into “secondary racial discrimination,” whereby credit and health insurance, which relies greatly on zip codes, can discriminate against racial groups because they happen to live very close to other racial groups that score badly.
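Gandy's "secondary" discrimination is a proxy effect: the model never sees race, but because residence is racially patterned, a zip-code-only rule still produces outcomes that differ by group. A minimal sketch, with all figures invented for illustration:

```python
# Hypothetical sketch of secondary discrimination via zip codes.
# The decision rule uses only the zip-level default rate; the
# demographic mapping is never shown to the model, yet outcomes
# still split along group lines. All data is invented.

zip_default_rate = {"10001": 0.02, "10456": 0.09}          # model input
zip_majority_group = {"10001": "group_x", "10456": "group_y"}  # unseen by model

def approve(zip_code: str) -> bool:
    """Race-blind rule: approve credit if the zip's default rate is low."""
    return zip_default_rate[zip_code] < 0.05

outcomes_by_group = {zip_majority_group[z]: approve(z)
                     for z in zip_default_rate}
print(outcomes_by_group)  # → {'group_x': True, 'group_y': False}
```

The rule is formally neutral, but anyone living near a badly scoring area inherits its score — which is exactly the mechanism Gandy's research describes.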
Brown harbors similar concerns over the use of algorithms to aid policing, as seen in Memphis, where CRUSH’s algorithms have reportedly linked some racial groups to particular crimes.
“If you have a group that is disproportionately stopped by the police, such tactics could just magnify the perception they have of being targeted,” Brown said.
Viktor Mayer-Schönberger, professor of internet governance and regulation at the Oxford Internet Institute, also warns against humans seeing causation when an algorithm identifies a correlation in vast swaths of data.