It is a rare criticism of elite American university students that they do not think big enough. But that is exactly the complaint from some of the largest technology companies and the federal government.
At the heart of this criticism is data. Researchers and workers in fields as diverse as biotechnology, astronomy and computer science will soon find themselves overwhelmed with information. Better telescopes and genome sequencers are as much to blame for this data glut as are faster computers and bigger hard drives.
While consumers are just starting to comprehend the idea of buying external hard drives for the home capable of storing a terabyte of data, computer scientists need to grapple with data sets thousands of times as large and growing ever larger. (A single terabyte equals 1,000 gigabytes and could store about 1,000 copies of the Encyclopedia Britannica.)
The next generation of computer scientists has to think in terms of what could be described as Internet scale. Facebook, for example, uses more than 1 petabyte of storage space to manage its users’ 40 billion photos. (A petabyte is about 1,000 times as large as a terabyte, and could store about 500 billion pages of text.)
It was not long ago that the notion of one company having anything close to 40 billion photos would have seemed tough to fathom. Google, meanwhile, churns through 20 times that amount of information every single day just running data analysis jobs. In short order, DNA sequencing systems too will generate many petabytes of information a year.
“It sounds like science fiction, but soon enough, you’ll hand a machine a strand of hair, and a DNA sequence will come out the other side,” said Jimmy Lin, an associate professor at the University of Maryland, during a technology conference last week.
The big question is whether the person on the other side of that machine will have the wherewithal to do something interesting with an almost limitless supply of genetic information.
At the moment, companies like IBM and Google have their doubts.
For the most part, university students have used rather modest computing systems to support their studies. They are learning to collect and manipulate information on personal computers or what are known as clusters, where computer servers are cabled together to form a larger computer. But even these machines fail to churn through enough data to really challenge and train a young mind meant to ponder the mega-scale problems of tomorrow.
“If they imprint on these small systems, that becomes their frame of reference and what they’re always thinking about,” said Jim Spohrer, a director at IBM’s Almaden Research Center.
Two years ago, IBM and Google set out to change the mindset at universities by giving students broad access to some of the largest computers on the planet. The companies then outfitted the computers with software that Internet companies use to tackle their toughest data analysis jobs.
And, rather than building a big computer at each university, the companies created a system that let students and researchers tap into giant computers over the Internet.
This year, the National Science Foundation, a federal government agency, issued a vote of confidence for the project by splitting US$5 million among 14 universities that want to teach their students how to grapple with big data questions.
The types of projects the 14 universities have already tackled veer into the mind-bending.
For example, Andrew Connolly, an associate professor at the University of Washington, has turned to the high-powered computers to aid his work on the evolution of galaxies. Connolly works with data gathered by large telescopes that inch their way across the sky taking pictures of various objects.
The largest public database of such images available today comes from the Sloan Digital Sky Survey, which has about 80 terabytes of data, Connolly said. A new system called the Large Synoptic Survey Telescope is set to take more detailed images of larger chunks of the sky and produce about 30 terabytes of data each night. Connolly’s graduate students have been set to work trying to figure out ways of coping with this much information.
Purdue is looking to carry techniques used to map the interactions between people in social networks over into the biological realm. Researchers are creating complex diagrams that illuminate the links between chemical reactions taking place in cells.
A similar effort at the University of California, Santa Barbara, centers on making a simple software interface — akin to the Google search bar — that will let researchers examine huge biological data sets for answers to specific queries.
Lin has encouraged his students to illuminate data with the help of Hadoop, an open-source software package that companies like Facebook and Yahoo use to split vast amounts of information into more manageable chunks.
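The divide-and-merge pattern behind Hadoop can be illustrated with a toy example. The sketch below is plain Python, not the Hadoop API itself: it splits a body of text into chunks, counts words in each chunk independently (the "map" step, which Hadoop would farm out to separate machines), then merges the partial tallies (the "reduce" step).

```python
from collections import Counter

def map_chunk(chunk):
    # "Map" step: process one chunk independently, as a
    # separate Hadoop worker node would.
    return Counter(chunk.lower().split())

def reduce_counts(partials):
    # "Reduce" step: merge the per-chunk partial counts
    # into a single result.
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

text = "big data needs big ideas and big machines"
words = text.split()
# Naive split into two chunks for illustration only.
chunks = [" ".join(words[:4]), " ".join(words[4:])]

result = reduce_counts(map_chunk(c) for c in chunks)
print(result["big"])  # "big" appears three times across both chunks
```

Because each chunk is counted in isolation, the map step parallelizes trivially; that independence, plus a merge step at the end, is what lets Hadoop spread a job over thousands of machines.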
One of these projects included a deep dive into the reams of documents released after the government’s probe into Enron, to create an analysis system that could identify how one employee’s internal communications were connected to those of other employees and trace where a specific decision had originated.
Lin shares the opinion of numerous other researchers that learning these types of analysis techniques will be vital for students in the coming years.
“Science these days has basically turned into a data-management problem,” Lin said.
By donating their computing wares to the universities, Google and IBM hope to train a new breed of engineers and scientists to think at Internet scale.
Of course, it’s not all good will backing these gestures. IBM is looking for big data experts who can complement its consulting in areas like health care and financial services. It has already started working with customers to put together analytics systems built on top of Hadoop. Meanwhile, Google promotes just about anything that creates more information to index and search.
Nonetheless, the universities and the government benefit from IBM and Google providing access to big data sets for experiments, simpler software and their computing wares.
“Historically, it has been tough to get the type of data these researchers need out of industry,” said James French, a research director at the National Science Foundation.
“But we’re at this point where a biologist needs to see these types of volumes of information to begin to think about what is possible in terms of commercial applications,” he said.