Tue, Jul 24, 2007 - Page 9

Forecasting how humans will behave carries enormous risks

By Christine Evans-Pughe / THE GUARDIAN, LONDON

Can a statistical model reliably predict that you will buy the latest Harry Potter book, or add organic brie to your virtual shopping cart this week? What about whether you might become violent in the next 15 years, or whether your unborn child will grow up to be a delinquent?

The growing use of computerized techniques to forecast what we might buy or do, on the basis of how our data matches up to some statistical model, would suggest that such techniques are well proven. But a landmark paper recently published in the British Journal of Psychiatry has cast doubt on whether they should be used for making decisions about anything beyond the trivial.

The personalized recommendations and special offers that pop up when you order books or groceries online, and even the specific sequence of questions an insurance call center asks about your claim, are all generated by computerized predictive algorithms derived from analysing patterns, links and associations in large sets of data.
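
To give a rough sense of how such algorithms can work, the sketch below scores pairs of items by how often they have appeared together in past shopping baskets -- the simplest form of "customers who bought this also bought" analysis. The baskets and item names are invented for illustration; commercial systems are far more elaborate.

```python
# A minimal sketch of the pattern-mining behind "you may also like"
# recommendations: count how often item pairs co-occur in past baskets.
# All data here is invented for illustration.
from collections import Counter
from itertools import combinations

baskets = [
    {"harry_potter_7", "organic_brie", "olives"},
    {"harry_potter_7", "organic_brie"},
    {"harry_potter_7", "crackers"},
    {"organic_brie", "crackers", "olives"},
]

# Count each unordered pair of items appearing in the same basket.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

def recommend(item, top_n=3):
    """Return the items most often bought alongside `item`."""
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if a == item:
            scores[b] = count
        elif b == item:
            scores[a] = count
    return [other for other, _ in scores.most_common(top_n)]

print(recommend("harry_potter_7"))  # e.g. ['organic_brie', 'crackers', 'olives']
```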

By classifying types of people and their behaviors on this basis, shops try to increase their profits by automatically targeting those of us in their databases who seem most likely to buy certain items. Insurance companies use similar methods to reduce fraud by investigating the claims of those who the software decides are most likely to be lying.
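
A hedged sketch of that classification step: the model below is trained on past insurance claims labelled fraudulent or genuine, then ranks new claims by how likely they are to be lies. The features, figures and labels are all invented, and scikit-learn's logistic regression merely stands in for whatever proprietary models insurers actually use.

```python
from sklearn.linear_model import LogisticRegression

# Each row: [claim amount (thousands of GBP), days since policy start,
# number of prior claims] -- invented training data.
past_claims = [
    [12.0, 20, 3],
    [0.8, 400, 0],
    [9.5, 35, 2],
    [1.5, 700, 1],
    [20.0, 10, 4],
    [0.6, 900, 0],
]
was_fraud = [1, 0, 1, 0, 1, 0]  # outcomes of past investigations

model = LogisticRegression(max_iter=1000).fit(past_claims, was_fraud)

new_claims = [[15.0, 15, 2], [1.2, 500, 0]]
fraud_scores = model.predict_proba(new_claims)[:, 1]  # P(fraud) per claim

# Investigate the claims the model scores as most suspicious first.
for claim, score in sorted(zip(new_claims, fraud_scores), key=lambda t: -t[1]):
    print(claim, round(float(score), 2))
```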

serious matters

But the British government is adopting such techniques for more serious matters. Software at the Department for Work and Pensions, for instance, is beginning to try to detect fraudsters by analysing the voices of people who ring its call centers -- so if you ask the wrong kind of questions, or perhaps ask the right kind of questions in the wrong way, the software could decide you're not strictly kosher.

The British government's Action Plan on Social Exclusion has risk prediction as its first guiding principle. The idea is to predict life outcomes and trigger early human interventions before things go wrong -- in the case of the Nurse Family Partnership scheme, even before birth. In this scheme, a pregnant woman's unborn child might be categorized as at high risk of future criminality based on factors such as her age, poor educational achievement, drug use and family background. The mother is then visited regularly at home by a nurse and helped with parenting.
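
The mechanics behind such screening can be as simple as totting up a weighted checklist of risk factors. The sketch below is purely illustrative -- the factors, weights and cut-offs are invented and are not those of the actual scheme.

```python
# Hypothetical risk factors and weights -- not the scheme's real criteria.
RISK_WEIGHTS = {
    "mother_under_20": 2,
    "left_school_early": 1,
    "drug_use": 3,
    "family_history_of_offending": 2,
}

def risk_category(case):
    """Sum the weights of the factors present and bucket the total."""
    score = sum(weight for factor, weight in RISK_WEIGHTS.items()
                if case.get(factor))
    if score >= 4:
        return "high"
    if score >= 2:
        return "medium"
    return "low"

# A case with two flagged factors lands in the high-risk bucket.
print(risk_category({"mother_under_20": True, "drug_use": True}))  # "high"
```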

In the criminal justice system, too, risk-prediction instruments assess the probability that adults and young people will re-offend, and a battery of other actuarial tests is used to predict future sexual and violent crime. Such techniques, which are not automated in these cases, also play a central role in evaluations of whether a person should be committed indefinitely as dangerous with a severe personality disorder, and of whether, once committed, they are ready for release.

The UK Department of Health has even developed a series of predictive algorithms for scoring which patients with long-term conditions are most at risk of rehospitalization. The idea is to intervene early to minimize admissions.

The Surveillance Society report from the Information Commissioner's Office outlined worries that predictive social sorting could amount to discrimination and create new underclasses, and that, by totting up negative indicators from health, school and other records, a predictive model could make its own worst predictions come true.
