Before Ahmad Khan Rahami planted bombs in New York and New Jersey, he bought bomb-making materials on eBay, linked to jihad-related videos from his public social-media account and was investigated by law enforcement agents, according to the FBI.
If only the authorities had connected the dots.
That challenge — mining billions of bits of information and crunching the data to find crucial clues — is behind a push by US intelligence and law enforcement agencies to harness “big data” to predict crimes, terrorist acts and social upheaval before they happen. The market for such “predictive analytics” technology is estimated to reach US$9.2 billion by 2020, up from US$3 billion last year, according to research firm MarketsandMarkets.
It is the stuff of a science-fiction movie such as Minority Report, in which Tom Cruise played a Washington cop who used technology to arrest people before they carried out crimes. It is also a red flag for privacy advocates already fighting US spy programs exposed by Edward Snowden and the FBI’s demands that Apple help it hack into encrypted mobile phones.
The idea is to make sense of the vast and disparate streams of data from sources including social media, GPS devices, video feeds from street cameras and license-plate readers, travel and credit-card records and the news media, as well as government and proprietary systems.
“Data is going to be the fundamental fuel for national security in this century,” said William Roper, director of the US Department of Defense’s strategic capabilities office, at a conference in Washington last month.
For the first time, the White House on Wednesday last week released a strategic plan to advance research and development of artificial intelligence technology, including its use to predict incidents that might endanger public safety.
Weeks before Rahami allegedly carried out the attacks last month, he bought circuit boards, electric igniters and ball bearings — all of which are known bomb-making materials, according to charging documents from the FBI.
In previous years, he was flagged by US Customs and Border Protection and the FBI after he made trips to Pakistan and after his father told police he was a terrorist, a remark the father later recanted.
Law enforcement agents could have been tipped off that Rahami was moving toward an attack had all of those data points been pulled together in one place, said Mark Testoni, chief executive officer and president of SAP National Security Services, a US-based subsidiary of German software company SAP SE.
“This is a big data world now,” Testoni said.
He said his firm has developed a computer platform for doing predictive analytics that is being used in a limited way by a US Department of Defense agency and a US national security agency, declining to name the government customers or specify what they are doing.
The technology to predict events is only in its infancy, he said.
National security and law enforcement agencies also have different rules when it comes to obtaining and using data, meaning there are walls between what can be accessed and shared, he said.
For example, US law enforcement agencies need a court warrant to access most data.
Privacy advocates express concern about the “big brother” implications of such massive data-gathering, calling for more information and public debate about how predictive technology will be used.
“There’s often very little transparency into what’s being brought into the systems or how it’s being crunched and used,” said Rachel Levinson-Waldman, senior counsel to the National Security Program at the Brennan Center for Justice at New York University School of Law. “That also makes it very hard to go back and challenge information that might be incorrect.”
Computer algorithms also fail to understand the context of data, such as whether someone commenting on social media is joking or serious, Levinson-Waldman said.
Testoni’s company and others such as Intel Corp and PredPol are among a handful of firms pioneering the use of predictive analytics and artificial intelligence for clients from local police departments to US national security agencies.
More than 60 US local police departments have started making use of a service sold by PredPol, which calls itself “The Predictive Policing Company,” to forecast where crimes might occur based on past patterns, cofounder Jeff Brantingham said.
Its system, developed in collaboration with the Los Angeles Police Department, uses only three types of data: what type of crime occurred, when and where, he said.
Then, a software algorithm generates the probability of crime occurring in different locations, presented as 152m by 152m squares on a computer display or a printed map.
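PredPol does not publish the details of its production model, but the general approach described here can be illustrated with a minimal sketch: bin past incidents of (type, time, place) into 152m grid cells and rank cells by a recency-weighted incident count. The decay scheme, half-life and sample data below are illustrative assumptions, not PredPol's actual method.

# Illustrative sketch only: PredPol's real algorithm is proprietary. This toy
# example bins historical incidents into 152m grid cells and ranks the cells
# by a recency-weighted count of past incidents.
from collections import defaultdict
from datetime import datetime, timedelta

CELL_METERS = 152  # grid resolution mentioned in the article

def cell_for(x_meters, y_meters):
    """Map a projected (x, y) coordinate in meters to a grid cell index."""
    return (int(x_meters // CELL_METERS), int(y_meters // CELL_METERS))

def rank_cells(incidents, now, half_life_days=30.0):
    """Score each cell by a sum of exponentially decaying incident weights.

    incidents: list of (crime_type, timestamp, x_meters, y_meters)
    Returns cells sorted from highest to lowest estimated risk.
    """
    scores = defaultdict(float)
    for _crime_type, when, x, y in incidents:
        age_days = (now - when).total_seconds() / 86400.0
        weight = 0.5 ** (age_days / half_life_days)  # older incidents count less
        scores[cell_for(x, y)] += weight
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Example: two recent burglaries fall in the same cell; an old one does not.
now = datetime(2016, 10, 1)
history = [
    ("burglary", now - timedelta(days=2), 310.0, 450.0),
    ("burglary", now - timedelta(days=5), 330.0, 430.0),
    ("burglary", now - timedelta(days=90), 1200.0, 80.0),
]
for cell, score in rank_cells(history, now):
    print(cell, round(score, 3))

The ranked cells correspond to the squares an analyst would see highlighted on a map, which is the output police act on, as described next.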
With that insight, police then can make decisions about how best to apply their resources, such as sending officers to a high-risk area, or which security cameras to monitor, Brantingham said.
PredPol’s system does not make predictions about who will commit a crime, so it stops short of a system that might identify a terrorist in the making.
“Interdicting places is, by and large, an approach that is more in line with protecting civil liberties than interdicting people,” Brantingham said.
Even with such limits, privacy and civil liberties groups oppose the use of predictive policing technology as a threat to the US constitution's promises of equal protection and due process.
“This is fortune-teller policing that uses deeply flawed and biased data and relies on vendors that shroud their products in secrecy,” said Wade Henderson, president and CEO of the Leadership Conference on Civil and Human Rights. “Instead of using predictive technology to correct dysfunctional law enforcement, departments are using these tools to supercharge discrimination and exacerbate the worst problems in our criminal justice system.”
Vast databases that companies have created for online commerce and communications could help law enforcement and national security agencies build predictive systems if they are allowed to tap into them. Technology firms have terms of service that set out how much personal information can be kept and sold to outside companies such as advertisers, and most resist handing over such data to the government unless a court orders them to do so.
Predictive analytics are already being used by companies such as eBay, Amazon.com and Netflix to crunch their users’ Internet activity to forecast what they might be interested in. Companies such as Facebook and Twitter have access to more than a billion social-media accounts.
The storehouse of data on Americans will only grow with digital feeds from Internet-connected appliances and wearable devices.
In particular, social media is a valuable tool in tracking potential terrorist attacks, said Eric Feinberg, founding member of the Global Intellectual Property Enforcement Center, a private firm.
The firm has patented technology that can scan for hashtags across different social media platforms and in different languages for communications that indicate terrorist planning.
“Our software is about pattern analysis,” Feinberg said. “We focus on the communications stream.”
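The article does not describe how the patented system works internally. As a rough, hypothetical sketch of cross-platform hashtag scanning, one might match posts against a multilingual watchlist and flag accounts that match repeatedly. The watchlist terms, the threshold and the flag_accounts helper below are illustrative inventions, not Feinberg's software.

# Illustrative sketch only: scans a stream of posts from different platforms
# for hashtags on a watchlist and flags accounts with repeated matches.
import re
from collections import Counter

# Hypothetical watchlist; a real system would curate and translate terms.
WATCHLIST = {"#examplehashtag", "#anotherterm"}

HASHTAG_RE = re.compile(r"#\w+", re.UNICODE)  # \w also matches non-Latin scripts

def extract_hashtags(text):
    """Return the lowercased hashtags found in a post."""
    return {tag.lower() for tag in HASHTAG_RE.findall(text)}

def flag_accounts(posts, threshold=2):
    """Count watchlist hits per account across platforms; flag repeat matches.

    posts: iterable of (platform, account, text) tuples
    """
    hits = Counter()
    for _platform, account, text in posts:
        if extract_hashtags(text) & WATCHLIST:
            hits[account] += 1
    return [account for account, n in hits.items() if n >= threshold]

posts = [
    ("twitter", "user_a", "some text #ExampleHashtag"),
    ("facebook", "user_a", "more text #anotherterm"),
    ("twitter", "user_b", "unrelated post #weather"),
]
print(flag_accounts(posts))  # ['user_a']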
The US government is working on initial efforts to gain insight into global social and political trends.
A program under the US intelligence community’s research arm called Mercury seeks to develop methods for continuous and automated analysis of intercepted electronic communications “in order to anticipate and/or detect political crises, disease outbreaks, terrorist activity and military actions,” said Charles Carithers, spokesman for the Intelligence Advanced Research Projects Activity.
The agency also previously funded the Open Source Indicators program, which “developed methods for continuous, automated analysis of publicly available data in order to anticipate and/or detect significant societal events,” such as mass violence and riots, mass migrations, disease outbreaks and economic instability, Carithers said.
The CIA draws a distinction between using technology to anticipate events and using it to predict them. The agency is using sophisticated algorithms and advanced analytics, along with publicly available data, to forecast events. The initial coverage focuses on the Middle East and Latin America.
“We have, in some instances, been able to improve our forecast to the point of being able to anticipate the development of social unrest and societal instability to within three to five days out,” CIA Deputy Director for Digital Innovation Andrew Hallman said.
In its annual report in June, the US Defense Science Board said: “Imagine if national leaders had sufficient time to act in emerging regional hot spots to safeguard US interests using interpretation of massive data including social media and rapidly generate strategic options.”
“Such a capability may soon be achievable,” the board said. “Massive data sets are increasingly abundant and could contain predictive clues — especially social media and open-source intelligence.”
If US intelligence agencies develop an advanced system to predict terrorist acts, they might call it "Total Information Awareness." Except that name has already been used, with unhappy results.
Retired US Admiral John Poindexter created the Total Information Awareness program for the US Pentagon’s Defense Advanced Research Projects Agency in 2002 to find and monitor terrorists and other national security threats using data and technology.
The program became so controversial, especially over concerns that privacy rights would be violated, that the US Congress canceled funding for Poindexter’s office in 2003.
Having been there and done that, Poindexter now says predicting terrorism is possible, but would require a lot of data, such as banking information, analysis of social media, travel records and classified material.
The system also has to include strong privacy protections that the public can review, said Poindexter, who said he was working on such a “privacy protection application” when his program was canceled.
“You have to develop public trust in the way this is going to work,” said Poindexter, who continued developing the technology after leaving government through Saffron Technology, a cognitive computing firm that Intel bought last year for an undisclosed price.
Intel declined to comment.
“The government’s priorities should be to solve the privacy issue and start ingesting massive amounts of data into memory bases,” Poindexter said. “You have to get the public on board with the idea that we can collect and search information on terrorist planning that doesn’t have an adverse impact on innocent people.”