THE NEW ESTABLISHMENT
October 2014 Issue

The Great Connectors

Just as the Industrial Revolution was driven by combining the steam engine with ingenious machinery, the Digital Revolution has been driven by two great innovations: the personal computer and the Internet. The relationship between the two was standoffish at first, and it was only after their development became intertwined that the digital economy began to transform our lives. The result was a shift in influence from an old establishment led by bankers, wise men, and a corporate elite to a new establishment led by the pioneers of technology, information, and entertainment. This digital inflection point occurred in 1994, the year that this magazine began publishing its New Establishment list.

The first personal computers had sprouted two decades earlier, led by the Altair, a solder-it-yourself kit created by Ed Roberts, an engineer and hobbyist in Albuquerque. When one was shown off at a meeting of the Homebrew Computer Club, a collection of hackers and geeks who met monthly in Silicon Valley, a college dropout named Steve Wozniak got so excited that he devised an even better circuit board that he integrated with a keyboard and monitor. He was so proud of it that he gave away the specifications until his best friend from down the street, Steve Jobs, convinced him that they should assemble the boards in the Jobs family’s garage and sell them. Thus was born the Apple I.

At that time, a new set of standards was being adopted for sending packets of information through digital networks. They were dubbed “Internet protocols” by their creators, Vint Cerf and Bob Kahn, and you generally had to be affiliated with a university or research institution to jack in.

It took a series of innovations to make it possible for the personal computer and Internet to meld into the combustible mix that fueled a revolution. One was creating a simple, consumer-friendly version of a modem, which could modulate and demodulate (hence the name) phone signals so they could carry digital data. That allowed electronic pioneers, such as the provocateur Stewart Brand and the marketer Steve Case, to create the WELL and AOL and other dial-up services that offered home-computer users e-mail, bulletin boards, chat rooms, and information. The wall between the P.C. and the Internet was breached by 1994, after AOL began allowing its members direct access to the Internet.

As ordinary folks began flooding onto the Internet, in 1994, another phenomenon exploded: the World Wide Web, a set of protocols that allowed people to post and access pages embedded with hypertext links, words, pictures, and eventually audio and video. It was created by Tim Berners-Lee, an Oxford-educated engineer who took a job at the CERN laboratory, in Switzerland, and was looking for ways to keep track of projects and foster collaboration. He was inspired by his favorite book as a child, a Victorian almanac crammed with random wisdom titled Enquire Within upon Everything, and he set out to create a web of links that would allow users of a network to do just that. His web spread rapidly after Marc Andreessen, an undergraduate at the University of Illinois, created an easy-to-install browser that allowed personal-computer users to call up Web sites. In January 1994, there were 700 Web sites in the world. By the end of that year there were 10,000. The combination of personal computers and the Internet had spawned something amazing: anyone could get content from anywhere, distribute new creations everywhere, and enquire within upon everything.

As a result, 1994 also witnessed the birth of a whole new medium. Justin Hall, a freshman at Swarthmore, created a beguiling Web site that included a running log of his personal activities, random thoughts, deep beliefs, and intimate encounters. His Web log featured poems about his father’s suicide, spiky musings about his diverse sexual desires, pictures of his penis, edgy stories about his stepfather, and other effusions that darted back and forth across the line of Too Much Information. By linking to others with similar logs he fostered a sense of community. Soon the phrase “Web log” had been shortened to “blog,” and Justin Hall had become a founding scamp of the first wholly new form of content to be created for, and to take advantage of, personal-computer networks.

In the 20 years since then, new platforms, services, and social networks have increasingly enabled fresh opportunities for individual imagination and collaborative creativity. Four of these innovations were especially transformative.

The first was the concept of search, pioneered most successfully by Google. Larry Page and Sergey Brin harnessed the collective wisdom embedded in the billions of links humans had placed on their Web sites and combined it with the power of a recursive algorithm that could rank each page by calculating the number of links pointing to it and the relative importance of each of those links based on the rank of the pages that originated them. In doing so they created a world in which humans, computers, and networks were intimately linked.
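To make the idea concrete, here is a minimal sketch, in Python, of the kind of recursive link-based ranking described above. The toy link graph, damping factor, and iteration count are illustrative assumptions, not Google’s actual implementation.

```python
# A minimal, illustrative sketch of recursive link-based ranking in the spirit
# of PageRank. The toy graph, damping factor, and iteration count are
# assumptions for demonstration only.

def rank_pages(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start every page with equal rank

    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            # A page passes its own rank, split evenly, to the pages it links to,
            # so links from highly ranked pages count for more.
            share = rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

# Toy example: three pages linking to one another.
toy_web = {
    "home": ["news", "about"],
    "news": ["home"],
    "about": ["home", "news"],
}
print(rank_pages(toy_web))
```

Run on the toy graph, the scores settle after a few dozen passes, and the page that attracts the most incoming rank ("home" here) ends up on top.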

The second was the idea of crowd-sourced collaboration, which found full expression in Wikipedia. Jimmy Wales, a serial entrepreneur from Huntsville, Alabama, took a piece of software that allowed readers to edit a Web page and applied it to his effort to create an encyclopedia. Anyone could edit a page, and the results would show up instantly. Sure, that meant vandals could mess up pages. So could idiots or ideologues. But the software kept track of every version. If a bad edit appeared, others in the community could simply get rid of it by clicking on a “revert” link. That sometimes led to disputes. Wars have been fought with less intensity than the reversion battles on Wikipedia. But, somewhat amazingly, the forces of reason regularly triumphed. Wikipedia thus pioneered a Web 2.0 that was a place for open, peer-to-peer collaboration and crowd-sourced content.
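As a rough illustration of the mechanism described above, here is a minimal sketch, in Python, of a page that keeps every version and lets the community revert a bad edit. It stands in for the idea only; it is not MediaWiki’s actual code.

```python
# Illustrative sketch of edit history with revert, in the spirit of the wiki
# software the article describes; not MediaWiki's actual implementation.

class WikiPage:
    def __init__(self, text=""):
        self.versions = [text]  # every saved version is kept, oldest first

    @property
    def current(self):
        return self.versions[-1]

    def edit(self, new_text):
        """Anyone can edit; the change shows up instantly as the latest version."""
        self.versions.append(new_text)

    def revert(self, to_version):
        """Undo a bad edit by restoring an earlier version as the newest one."""
        self.versions.append(self.versions[to_version])


page = WikiPage("The Altair was an early personal computer.")
page.edit("The Altair was a toaster.")  # vandalism
page.revert(0)                          # the community clicks "revert"
print(page.current)                     # back to the accurate text
```

Because nothing is ever deleted, undoing vandalism is as cheap as committing it, which is part of why the forces of reason could keep winning the reversion battles.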

The third seminal innovation was the migration to mobile. Steve Jobs paved the way by launching the iPhone in 2007. It included little pieces of application software, known as apps, for surfing the Web and other tasks. But initially Jobs wouldn’t allow outsiders to create apps for the iPhone, because he thought they might infect it with viruses or pollute its integrity. After pressure from his inner circle, Jobs figured out a way to have the best of both worlds. He would permit outsiders to write apps, but they would have to be approved by Apple and sold through the iTunes Store. The result was a new app economy, with 1.2 million apps available from Apple and a comparable number for Android systems.

And finally there was the creation of social networks. Early versions emerged around the time of the seminal year of 1994, with the advent of GeoCities, theGlobe.com, and Tripod. But the real wave began 10 years later with the rise of Friendster, MySpace, and LinkedIn, followed by Facebook, Foursquare, Tumblr, and Twitter.

The next phase of the Digital Revolution will bring a fuller fusion of technology with the creative industries, such as media, fashion, music, entertainment, education, literature, and the arts. Much of the early innovation involved pouring old wine—books, newspapers, opinion pieces, journals, songs, television shows, movies—into new, digital bottles. But now completely new forms of expression and media formats are emerging. Role-playing games and interactive plays are merging with collaborative modes of storytelling and augmented realities. People are creating multi-media books that can be crowd-sourced and wikified but also curated. Instead of pursuing mere artificial intelligence, people are finding ways to partner the power of the computer with that of the human mind.

In this new era, the primary role for humans will be the same as it was 20 years ago. Human entrepreneurs and innovators will supply the imagination, the creativity, and the ability, as Steve Jobs would say, to think different. The people who succeed will be the ones who can link beauty to engineering, humanity to technology, and poetry to processors. In other words, success will come from creators like those on this year’s list, who can flourish where the arts intersect with the sciences and who have a rebellious sense of wonder that opens them to the beauty of both.