Connoisseurs of the 1967 film The Graduate, in which Dustin Hoffman starred as Benjamin Braddock, a bewildered 21-year-old, treasure the moment when a ponderous family friend comes up to the young man, intent on imparting some advice.
“I just want to say one word to you,” he intones, “just one word — ‘plastics.’”
For the last two decades, the computing and media industries have been intoning their own version of “plastics.” They call it “convergence.” It’s all based on the realization that, since the 1990s, most media — print, audio, video, graphics — have been reduced to the lowest common denominator: bits, the ones and zeroes of binary arithmetic. If you examine the contents of your computer’s hard drive or your iPod’s flash memory, all you will see are bits, and there’s no immediate way of knowing whether a particular bitstream is a portion of an e-mail message, a photograph, a song or a video.
Finally, after these two decades of promise, we might now have reached a tipping point in this convergence story. (Though one is always tempted to quote Sam Goldwyn on such matters and say: “I’ll give you a definite maybe.”) Among other developments, there are perhaps two key events. One is YouTube’s Original Channels — dedicated “professional” niches as opposed to YouTube’s signature amateur clips — rolling out this year, which could result in viewers no longer distinguishing between made-for-TV and made-for-YouTube. Complementing this is the emergence on the market of “smart” — that is, Internet-enabled — TVs. We might now ask: Where does TV end and the Internet begin, and do we care?
During the shifts of recent years, it’s not surprising that the various industries involved have predicted that all bitstreams would eventually converge on a particular device. Touchingly, each one predicted convergence on to its favorite piece of equipment. Microsoft founder Bill Gates and company thought that the PC would become the center of the digital universe. The TV people assumed everything would converge on the television set in the living room, while the mobile phone industry argued confidently that the mobile handset would be the key to everything.
The funny thing is that while all this prediction was going on, nobody seemed to notice that convergence had already happened. All those bitstreams had converged in one place — on to the Internet — and henceforth the determining factor for the future of any device was the quality of the window on to the Net that it provided.
This fact is particularly traumatic for the TV industry, which was shaped in an era when broadcast (few-to-many) organizations were the dominant beasts in our media jungle. It’s not all that long ago that a few TV networks could attract up to 90 percent of the available primetime audience.
Between 1960 and 1990, broadcasters really were masters of the universe. They shaped people’s viewing habits, changed our politics and determined how we spent much of our leisure time.
However, once cable TV and, later, digital technology arrived, that began to change. Broadcasting became less broad as cable and satellite channels proliferated. And in that “narrowcasting” world, the audience became increasingly fragmented into more specialized segments. There were still moments when the broadcasting model came into its own: major sporting events, Big Brother, The X-Factor and the like, where mass attention is focused on a single, unique event. In general, though, the huge mass audience that was routinely characteristic of broadcast TV in its heyday looks like becoming an endangered media species.