A. How the Web was Won
1. The world wide web
Do you know the difference between the internet and the world wide web? The internet is a hardware network of cables and routers – the physical infrastructure of computer connections. Originally government property, it has been built up gradually since the 1960s. The world wide web is a software component, and it was whipped up in short order by one man.
In the late 1980s, CERN laboratories in Switzerland hired British computer scientist Timothy Berners-Lee to help CERN scientists collaborate on massive, complex physics projects. Berners-Lee coded the World Wide Web 1 as an “information system” to address that challenge, and he was virtually its sole keeper and promoter for the first few years. Berners-Lee and collaborator Robert Cailliau set up the first web server, a web browser, and a few web pages by late 1990. A web browser is an “app” that enables any computer to access the web. Fittingly, the very first webpages described the World Wide Web itself. 3
Berners-Lee’s innovation was to make hypertext work across the internet. Hypertext, which has been around since the 1960s, refers to the placement of links between computer files. It enables a computer user to jump from one paragraph to another, or even from one computer file to another. Berners-Lee’s web extended hypertext so that links could point across an entire network of computers, even the global internet. He also expanded the concept beyond text to links between all kinds of files, such as images, audio, and video; today these connections are broadly called hyperlinks. The World Wide Web gave us the seven letters http://www. The “http” stands for hypertext transfer protocol, the new standard for transmitting information from one computer to another.
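Those seven letters still map directly onto the anatomy of a web address. A minimal sketch in Python, purely for illustration (the address is a placeholder, and the request shown is the simple early HTTP/1.0 form, not what a modern browser sends):

```python
from urllib.parse import urlparse

# Break a web address into the parts Berners-Lee's scheme defined.
url = "http://www.example.com/index.html"
parts = urlparse(url)
print(parts.scheme)  # "http" -- the transfer protocol
print(parts.netloc)  # "www.example.com" -- which computer to contact
print(parts.path)    # "/index.html" -- which file on that computer

# The browser then sends a plain-text request like this one:
request = (
    f"GET {parts.path} HTTP/1.0\r\n"
    f"Host: {parts.netloc}\r\n"
    "\r\n"
)
print(request)
```

The server's reply is equally plain text: a status line, some headers, and then the requested file.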
The web became available for public use in 1991, and Berners-Lee encouraged programmers to help develop resources for it. A University of Illinois (UI) team made the first mass-market web browser, Mosaic. It ran on popular home computers, and it was the first browser to display images inline with text. This gave the web a quick surge in popularity. However, when UI started charging license fees for others to use Mosaic, Berners-Lee became concerned that this kind of profit motive would restrict access to the web. 4 Meanwhile, he saw a competing internet system, Gopher, struggling under an effort to monetize. 5
Berners-Lee and CERN made a bold move shortly after Mosaic’s 1993 release. They released the www code to the public, relinquished their rights to profit from it, and declared that the web should be free for everyone. 6 This decision unleashed the beast, making the world wide web larger than any gatekeeper. It surged again and quickly became the internet standard. Berners-Lee’s historic decision delayed his own fortune by years. By now, he is deservedly wealthy as the head of several internet- and web-related organizations.
1995 was the next landmark year for the internet and the web. Before that, the American segment of the internet was still under strict government control, and commercial activity was expressly forbidden. 7 As part of the government’s divestiture of its network, those commercial restrictions on internet traffic were dropped on 1/01/1995. Just a few months later, the new web browser Netscape enjoyed an explosive initial public offering on the stock market, making the first dot-com overnight millionaires. Businesses and startups immediately realized that they had a goldmine of potential customers. Amazon, eBay, Yahoo!, and Craigslist were some of the largest and most enduring websites to launch by 1995. The release of Microsoft’s Windows 95 included Internet Explorer.
As the web grew, indexing it became an absolute necessity. 2 A browser such as Netscape or Internet Explorer could follow a command to go to a particular website, but it did not provide guidance if the user did not know a specific URL. The next major development was the search engine. A user could indicate what content he or she was looking for, and the search engine would check pages throughout the web for matching terms. The earliest major search engines were WebCrawler and Lycos in 1994. Google appeared in 1998 and, through use of its superior algorithms for indexing and ranking search results, quickly rose to prominence.
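Under the hood, a search engine’s central data structure is an inverted index: a map from each word to the set of pages that contain it. A toy sketch (the pages and their contents are invented for illustration; real engines add crawling, ranking, and far more):

```python
# Toy inverted index: map each word to the set of pages containing it.
pages = {
    "page1": "the world wide web",
    "page2": "web browser history",
    "page3": "world history",
}

index = {}
for name, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(name)

# A query returns the pages containing every search term.
def search(query):
    results = [index.get(word, set()) for word in query.split()]
    return set.intersection(*results) if results else set()

print(search("world history"))  # {'page3'}
```

Building the index ahead of time is what makes lookups instant: the engine never scans the web at query time, only its own map.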
The late 1990s saw a flurry of activity as corporations new and old rushed to get online. Web technology is credited with allowing the global economy to grow at hitherto impossible rates for several years. On the flipside, it also led to overly zealous investing in the high-tech bubble of 1995 – 2001. When most e-commerce corporations failed to show profits by the early 2000s, this bubble burst, contributing to the recession of 2001.
The websites of the 21st century are much more interactive than the static webpages of the 1990s. Websites such as Facebook, YouTube, Wikipedia, Yelp, and millions of blogs are often described as Web 2.0. The online population is still growing exponentially. As recently as 1990, a majority of Americans had never used computers. 8 Now, more than half the human population accesses the internet, 9 including a quarter of Americans online “almost constantly”. 10
Simultaneously with the development of the web, information technology has become increasingly lightweight, wireless, mobile, and personalized. “Laptop” computers 3 were fairly well-known by 1990. Digital (2G) cell phones and SMS text messages were in place by 1993. Wi-Fi – a wireless connection for use within a home or office, e.g. to allow printers and other smart appliances to communicate with a central computer – was commonplace by the late ’90s.
A few devices integrated cell phone and internet capability as far back as the mid-1990s. The first hand-held computer to really make a splash, though, was the iPhone, released in 2007. Typically for Apple, it was not the iPhone’s functionality that was new, but its user-friendly design. It replaced a physical keyboard with a “soft” keyboard on the screen. The iPhone also made it very easy for users to download apps. These software and hardware features combined to make the iPhone a truly general-purpose computer. Smartphones are not as versatile as larger computers, but they don’t need to be. Most users have very low-grade computing needs like texting, web browsing, and photo-sharing. More demanding processing or storage is now offered easily on the “cloud” of remote servers. For a growing number of young users, a smartphone is the only computer they need. 11
In the 2010s, tech companies focused on dynamic apps and personalization. Rather than directing all users to one universal webpage, computers now bring up-to-the-minute information to each user according to location, demographic, and browsing history. If you like Thai food, your phone might start alerting you to nearby Thai restaurants (or even catering trucks) at lunchtime. A smart car can tell you how much traffic is up ahead, and how to avoid it. In fact, real-time traffic information is obtained by tracking all phones or GPS units on the road. This intense gathering and processing of information is a typical application of big data.
4. Transistor breakthroughs
How does all this technology work? Most people seem to take their computers for granted, although today’s technology would certainly have been called “science fiction” decades ago and “magic” centuries before that. Behind the scenes, though, is a gigantic industry applying some of history’s most ridiculously advanced engineering toward one overarching goal: to make transistors increasingly smaller and faster. Transistors have shrunk from 10⁻⁶ meter in 1990 to 10⁻⁸ meter (100 atoms wide!) in 2020. Every reduction poses new challenges and necessitates breakthroughs. It’s beyond the scope of this book to retrace all these feats (many of them are trade secrets anyway). Some of the most important 21st century techniques have names like anti-reflective coating 12, strained silicon 13, atomic layer deposition, and immersion lithography. Whether you’ve heard of them or not, these are the methods that make your smartphone and multiplayer online video games possible.
Moore’s Law continues to describe the pace of advancement fairly accurately – integrated circuits double in processing power about every two years. This makes for a thousand-fold increase in computing capabilities every two decades, a trillion-fold in a lifetime. The computing power that I carry around with me today would have served a major city at the time I was born.
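The arithmetic behind those figures is just compound doubling, easy to verify:

```python
# Moore's Law as compound doubling: power doubles roughly every 2 years.
def growth_factor(years, doubling_period=2):
    """How many times more powerful chips get over a span of years."""
    return 2 ** (years / doubling_period)

print(growth_factor(20))  # 2^10 = 1024: about a thousand-fold per two decades
print(growth_factor(80))  # 2^40, about 1.1 trillion-fold over an 80-year lifetime
```

Ten doublings fit in twenty years, and each factor of ~1,000 stacks on the last: three of them already reach a billion-fold.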
Almost everyone living today understands how the telecommunication revolution has changed our lives and our world. I am only in my 40s as I write this chapter, yet some periods of my past already feel impossibly old-fashioned. It doesn’t seem very long ago that “research” meant trips to the library, rooms full of microfilm, or maybe thumbing through the 30-volume encyclopedia at home. Now I am able to research the contents of this entire book from my desk, or out at a café if I prefer. In the 2010s, every child grows up taking for granted that information, music, text messages, pictures, and videos are instantly accessible with the touch of a button.
The computer revolution has done more than bring us numerous toys and conveniences. The world is fundamentally different now that information can flow freely and instantly. It seems that discussing internet philosophy is impossible without using terms like “double-edged.” The internet is inherently neither good nor bad, but it is an effective tool for enabling both positive and negative behavior. Even more than television, the web has the potential to expand everyone’s perspective. We are much more apt to sympathize with people around the world when we can watch their YouTube videos and read their blogs, yet it is also easier for extremists to find each other and form virtual communities. Submitting our personal information to websites allows for a wonderfully tailored experience (“Oh look, there’s a pair of shoes just like the one I looked at last month, on sale only two miles away!”). At the same time, big data divulges an awful lot of private information to parties both known and unknown. Web 2.0 put us back in touch with long-lost friends. It also brought “comments sections” into our homes, often worse than a bathroom wall.
Knowledge is power. The information age has gone a long way toward democratizing information. Shoppers can easily look up prices, and body cameras take us right into police action. Yet ironically our very access to information is in the hands of a high-tech oligopoly. Corporations like Facebook, Amazon, and Google are de facto monopolies (especially in developing countries) 14. Economic arguments could be made that their industries are natural monopolies, but Berners-Lee recently expressed great concern about their size and power. 15
In this global village, 16 “everyone knows everyone,” or at least we all have fairly easy access to one another. This has created a loop back to ancient modes of moral enforcement. In the first prehistoric communities, transgressions were easily punished by ridicule and ostracism. As villages grew larger and some people got lost in the crowd, judgment passed to all-knowing, punishing gods. In this millennium, online reputation has the power to play the role of omniscient judge. The shady businessman who scams clients faces the possibility of negative reviews that put him out of business. Parents can open up neighborhood maps with icons locating nearby sex offenders. That sounds like a recipe for justice, but public opinion is touchy and inconsistent, especially in the pluralistic online world. Disagreements and mistakes sometimes fall to mob rule and public shaming beyond the scope of the offense. 17
Life is 90% beautiful, 9% imperfect, and 1% terrible. Updating the technology can’t change that balance. These new facts of life have cropped up just in our generation, but the way things look, the world will be online for a long time to come. We’d better appreciate our blessings and the solutions to many old problems, while learning to deal with the new ones.
- Image: © Tyler Olson, licensed to Scot Fagerland, https://www.dreamstime.com/stock-photos-women-looking-cell-phones-female-friends-sitting-dinning-table-image36393593 ↩
- Image of Timothy Berners-Lee by John S. and James L. Knight Foundation, photo by Scott Henrichsen / CC BY (https://creativecommons.org/licenses/by/2.0), https://commons.wikimedia.org/wiki/File:Berners-Lee_announcing_W3F.jpg (accessed and saved 6/28/15, archived 5/30/20). ↩
- E.g. World Wide Web, World Wide Web Consortium, http://www.w3.org/History/19921103-hypertext/hypertext/WWW/TheProject.html ↩
- Mark Fischetti, “The World Wide Web Became Free 20 Years Ago Today”, Scientific American (4/30/2013), https://blogs.scientificamerican.com/observations/the-world-wide-web-became-free-20-years-ago-today/ (accessed and saved 4/08/19). ↩
- Johnny Ryan, A history of the internet and the digital future, Reaktion Books (London, 2010), p. 110. ↩
- W. Hoogland and H. Weber, “Statement concerning CERN W3 software release into public domain” (4/30/1993), http://cds.cern.ch/record/1164399 (accessed and saved 4/09/19). ↩
- “Acceptable Use Policy”, National Science Foundation (c. 1992), provisions (10) and (11), http://www.cybertelecom.org/notes/nsfnet.htm#aup (accessed and saved 2/09/19). ↩
- Susannah Fox and Lee Rainie, “The Web at 25 in the U.S. Part 1: How the internet has woven itself into American life”, Pew Research Center (2/27/2014), http://www.pewinternet.org/2014/02/27/part-1-how-the-internet-has-woven-itself-into-american-life/ (accessed and saved 2/09/19). ↩
- “For the first time, more than half of the world’s population is using the internet”, International Telecommunication Union (12/07/2018), https://www.itu.int/en/mediacentre/Pages/2018-PR40.aspx (accessed and saved 2/09/19). ↩
- Andrew Perrin and Jingjing Jiang, “About a quarter of U.S. adults say they are ‘almost constantly’ online”, Pew Research Center (3/14/2018), http://www.pewresearch.org/fact-tank/2018/03/14/about-a-quarter-of-americans-report-going-online-almost-constantly/ (accessed and saved 2/09/19). ↩
- Comscore Whitepaper (3/30/2016), https://www.comscore.com/Insights/Presentations-and-Whitepapers/2016/2016-US-Cross-Platform-Future-in-Focus as quoted by David Murphy, “ComScore: Desktop Browsing on the Decline”, PCMag (4/16/2016), https://www.pcmag.com/news/343784/comscore-desktop-browsing-on-the-decline (accessed and saved 2/10/19). ↩
- Douglas J. Guerrero et al., “Anti-reflective coating for multipatterning lithography”, Proc. of SPIE Vol. 6923 (2008), https://www.brewerscience.com/uploads/publications/2008/Guerreroetal2008_6923Web.pdf (accessed and saved 2/10/19). ↩
- Els Parton and Peter Verheyen, “Strained silicon – the key to sub-45 nm CMOS”, III-Vs Review 19(3):28-31 (April, 2006), https://www.sciencedirect.com/science/article/pii/S0961129006715903 (accessed and saved 2/10/19). ↩
- Leo Mirani, “Millions of Facebook users have no idea they’re using the internet”, Quartz (2/09/2015), https://qz.com/333313/milliions-of-facebook-users-have-no-idea-theyre-using-the-internet/ (accessed and saved 2/11/19). ↩
- Guy Faulconbridge and Paul Sandle, “Father of Web says tech giants may have to be split up”, Reuters (10/31/2018), https://www.reuters.com/article/us-technology-www/father-of-web-says-tech-giants-may-have-to-be-split-up-idUSKCN1N63MV (accessed and saved 2/11/19). ↩
- Marshall McLuhan, The Gutenberg Galaxy: The Making of Typographic Man, University of Toronto Press (1962). McLuhan coined the phrase global village to describe the effect of instant electronic communication on the effective shrinking of the world. With amazing prescience, he practically predicted the internet in the early 1960’s. ↩
- Todd Leopold, “The price of public shaming in the Internet age”, CNN (4/16/2015), https://www.cnn.com/2015/04/16/living/feat-public-shaming-ronson/index.html (accessed and saved 2/11/19). ↩