A. How the Web was Won
1. The world wide web
Do you know the difference between the internet and the world wide web? The internet is a hardware network of cables and routers – the physical infrastructure of computer connections. Originally government property, it has been built up gradually since the 1960s. The world wide web is a software component, and it was whipped up in short order by one man.
In the late 1980s, CERN laboratories in Switzerland hired British computer scientist Timothy Berners-Lee to help CERN scientists collaborate on massive, complex physics projects. Berners-Lee coded the World Wide Web 1 as an “information system” to address that challenge, and he was virtually its sole keeper and promoter for the first few years. Berners-Lee and collaborator Robert Cailliau set up the first web server, a web browser, and a few web pages by late 1990. A web browser is an “app” that lets any computer access and display web pages. The first webpages served to describe the World Wide Web itself. 3
Berners-Lee’s innovation was to make hypertext work across the internet. Hypertext, which had been around since the 1960s, refers to the placement of links between computer files. It enables a computer user to jump from one paragraph to another, or even from one computer file to another. Berners-Lee’s web extended hypertext so that links could point across an entire network of computers, even the global internet. He also expanded the concept to links between all kinds of files besides text, such as images, audio, and video. Today, the web is more broadly described as a system of hyperlinks. The World Wide Web gave us the seven letters http://www. The “http” stands for hypertext transfer protocol, the new standard for transmitting information from one computer to another. 2
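Under the hood, the hypertext transfer protocol is remarkably simple: following a link means sending a short block of plain text to another computer. A minimal sketch (my own illustration, not Berners-Lee’s code) of the request a browser would compose for a given URL:

```python
# Build the raw HTTP/1.1 request text a browser would send to fetch a URL.
from urllib.parse import urlparse

def http_get_request(url: str) -> str:
    """Return the plain-text GET request for fetching `url`."""
    parts = urlparse(url)
    path = parts.path or "/"        # an empty path means the site's front page
    return (
        f"GET {path} HTTP/1.1\r\n"  # which file we want
        f"Host: {parts.netloc}\r\n" # which server we are asking
        "Connection: close\r\n"
        "\r\n"                      # blank line ends the request
    )

print(http_get_request("http://www.w3.org/History/"))
```

The server answers with an equally plain text response carrying the requested page, whatever kind of file it may be.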
The web became available for public use in 1991, and Berners-Lee encouraged programmers to help develop resources for it. A University of Illinois (UI) team made the first mass-market web browser, Mosaic. It operated on popular home computers, and it was the first browser to combine graphics with text. This gave the web a quick surge in popularity. However, when UI started charging license fees for others to use Mosaic, Berners-Lee became concerned that this kind of profit motive would restrict access to the web. 4 Meanwhile, he saw a competing internet system, Gopher, struggling under an effort to monetize. 5
Berners-Lee and CERN made a bold move shortly after Mosaic’s 1993 release. They released the www code to the public, relinquished their rights to profit from it, and declared that the web should be free for everyone. 6 This decision unleashed the beast, making the world wide web larger than any gatekeeper. It surged again and quickly became the internet standard. Berners-Lee’s historic decision delayed his own fortune by years. By now, he is deservedly wealthy as the head of several internet- and web-related organizations.
1995 was the next landmark year for the internet and the web. Before that, the American segment of the internet was still under strict government control, and commercial activity was expressly forbidden. 7 As part of the government’s divestiture of its network, those commercial restrictions on internet traffic were dropped on 1/01/1995. Just a few months later, the new web browser Netscape enjoyed an explosive initial public offering on the stock market, making the first dot-com overnight millionaires. Businesses and startups immediately realized that they had a goldmine of potential customers. Amazon, eBay, Yahoo!, and Craigslist were some of the largest and most enduring websites to launch by 1995. The release of Microsoft’s Windows 95 included Internet Explorer.
As the web grew, indexing it became an absolute necessity. 3 A browser such as Netscape or Internet Explorer could follow a command to go to a particular website, but it provided no guidance if the user did not know a specific URL. The next major development was the search engine. A user could indicate what content he or she was looking for, and the search engine would check pages throughout the web for matching terms. The earliest major search engines were WebCrawler and Lycos in 1994. Google appeared in 1998 and, through its superior algorithms for indexing and ranking search results, quickly rose to prominence.
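The core trick behind term matching is the inverted index: rather than scanning every page at query time, the engine records in advance which pages contain each word. A toy sketch (the page texts are hypothetical, and real engines layer ranking schemes such as PageRank on top):

```python
# A primitive search engine: map each word to the set of pages containing it,
# then intersect those sets for a multi-word query.
from collections import defaultdict

pages = {  # hypothetical crawled pages
    "home.html": "welcome to the world wide web",
    "physics.html": "cern physics projects on the web",
    "cats.html": "pictures of cats",
}

index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(query: str) -> set:
    """Return the pages containing every word of the query."""
    words = query.split()
    results = index[words[0]].copy()
    for word in words[1:]:
        results &= index[word]   # keep only pages matching all terms so far
    return results

print(search("web physics"))  # → {'physics.html'}
```

Building the index is the slow part; answering a query is then just a few fast set intersections, which is what made web-scale search practical.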
The late 1990s saw a flurry of activity as corporations new and old rushed to get online. Web technology is credited with allowing the global economy to grow at hitherto impossible rates for several years. On the flipside, it also led to overzealous investing in the high-tech bubble of 1995–2001. When most e-commerce corporations failed to show profits by the early 2000s, this bubble burst, contributing to the recession of 2001.
The websites of the 21st century are much more interactive than the static webpages of the 1990s. Websites such as Facebook, YouTube, Wikipedia, Yelp, and millions of blogs are often described as Web 2.0. The online population is still growing exponentially. As recently as 1990, a majority of Americans had never used computers. 8 Now, more than half the human population accesses the internet, 9 including a quarter of Americans online “almost constantly”. 10
Simultaneously with the development of the web, information technology has become increasingly lightweight, wireless, mobile, and personalized. “Laptop” computers 4 were fairly well-known by 1990. Digital (2G) cell phones and SMS text messages were in place by 1993. Wi-Fi arrived in the late ’90s: a wireless connection for use within a home or office, allowing printers and other smart appliances to communicate with a central computer.
A few devices integrated cell phone and internet capability as far back as the mid-1990s. The first hand-held computer to really make a splash, though, was the iPhone, released in 2007. Typically for Apple, it was not the iPhone’s functionality that was new, but its user-friendly design. It replaced a physical keyboard with a “soft” keyboard on the screen. The iPhone also made it easy for users to download apps. These software and hardware features combined to make the iPhone a truly general-purpose computer. Smartphones are not as versatile as larger computers, but they don’t need to be. Most users have low-grade computing needs like texting, web browsing, and photo-sharing. More demanding processing or storage is now offered easily on the “cloud” of remote servers. For a growing number of young users, a smartphone is the only computer they need. 11
In the 2010s, tech companies focused on dynamic apps and personalization. Rather than directing all users to one universal webpage, computers now bring up-to-the-minute information to each user according to location, demographic, and browsing history. If you like Thai food, your phone might start alerting you to nearby Thai restaurants (or even catering trucks) at lunchtime. A smart car can tell you how much traffic is up ahead, and how to avoid it. In fact, real-time traffic information is obtained by tracking all phones or GPS units on the road. This intense gathering and processing of information is a typical application of big data.
4. Transistor breakthroughs
How does all this technology work? Most people seem to take their computers for granted, although today’s technology would certainly have been called “science fiction” decades ago and “magic” centuries before that. Behind the scenes, though, is a gigantic industry applying some of history’s most insanely advanced engineering toward one overarching goal: to make transistors increasingly smaller and faster. Transistors have shrunk from 10⁻⁶ meter in 1990 to 10⁻⁸ meter (100 atoms wide!) in 2020. Every reduction poses new challenges and necessitates breakthroughs. Some of the most important 21st century techniques have names like anti-reflective coating 12, strained silicon 13, atomic layer deposition, and immersion lithography. Whether you’ve heard of them or not, these are the methods that make your smartphone and multiplayer online video games possible.
Moore’s Law continues to describe the pace of advancement fairly accurately: integrated circuits double in processing power about every two years. This makes for a thousand-fold increase in computing capabilities every two decades, a trillion-fold in a lifetime. The computing power that I carry around with me today would have served a major city at the time I was born.
I don’t need to tell you that the world wide web was a game-changer. Middle-aged folks like me appreciate firsthand the impact that telecommuting and Cyber Mondays have had on our daily lives. Younger generations can’t imagine a world without video chatting or songs on demand. But the webolution is even bigger than its obvious catering to our lifestyle. It has altered the human experience in almost every way, including the foundations of sociology, politics, economics, epistemology, and religion.
What is the human experience, after all? Let’s start with one grandiose objective reality as perceived subjectively from eight billion points of view. People then communicate in networks to reconstruct their own objective models of reality (religion, education, culture). Next, we view those models from eight billion points of view and repeat endlessly. Information technology amplifies every phase of that cycle, for better or for worse. For example, it is much easier to swallow the idea of a single human race when we can play video games with people from around the world. On the flipside, falsehoods and propaganda spread more readily than ever. As geographic communities become more heterogeneous, cyberspace is an easy place to find a homogeneous “us” against a villainized “them”.
Knowledge is power. In Chapter 2, we saw the world’s power structure shift from church-state to consumers-corporations-state. Today, internet access and big data are central resources in that balance. The web has gone a long way toward democratizing information. Shoppers can easily look up prices, and body cameras take us right into police action. Authoritarian governments must work fervently to control communications critical of themselves. 14 Yet our access to information is gated by a high-tech oligopoly. Corporations like Facebook, Amazon, and Google are de facto monopolies (especially in developing countries) 15. Economic arguments could be made that their industries are natural monopolies, but Berners-Lee recently expressed great concern about their outsized influence. 16
Big data raises some of history’s most salient questions of privacy vs. public interest. Information about consumers’ browsing patterns and social media behavior is gold for advertisers. It is available in super-massive databases and crunched through algorithms that decide what ads we see. Corporations profit greatly from consumer information; should some of that profit redound to us consumers? 17 Databases also harbor information of great public interest, from social distancing patterns to criminal evidence. Should governments gather such information, or have access to privately gathered databases? Many big data algorithms, including those that assign us credit scores, are protected as trade secrets. 18 Moreover, algorithms “evolve” as old ones are replaced by new ones. The selective pressure guiding this evolution is not accuracy or the common good; it is profit for the corporations that use them. 19
In this global village, 20 “everyone knows everyone,” at least theoretically. This network effect has created a loop back to ancient modes of moral enforcement. In the first prehistoric communities, transgressions were easily punished by ridicule and ostracism. As villages grew larger and people got lost in the crowd, judgment passed to omniscient moralizing gods. Now, surveillance and online reputation are secularizing these godly roles once again. With IP tracking, DNA databases, and facial recognition, it is becoming damned difficult to commit any sin without leaving a trace. The shady businesswoman who scams clients gets negative reviews that can put her out of business. Parents can open neighborhood maps pinpointing nearby sex offenders. That sounds like a recipe for justice, but public opinion is touchy and inconsistent. Disagreements and mistakes sometimes fall to mob rule and public shaming beyond the scope of the offense. 21
- Image: © Tyler Olson, licensed to Scot Fagerland, https://www.dreamstime.com/stock-photos-women-looking-cell-phones-female-friends-sitting-dinning-table-image36393593 ↩
- Image of Timothy Berners-Lee by John S. and James L. Knight Foundation, photo by Scott Henrichsen / CC BY (https://creativecommons.org/licenses/by/2.0), https://commons.wikimedia.org/wiki/File:Berners-Lee_announcing_W3F.jpg (accessed and saved 6/28/15, archived 5/30/20). ↩
- E.g. World Wide Web, World Wide Web Consortium, http://www.w3.org/History/19921103-hypertext/hypertext/WWW/TheProject.html ↩
- Mark Fischetti, “The World Wide Web Became Free 20 Years Ago Today”, Scientific American (4/30/2013), https://blogs.scientificamerican.com/observations/the-world-wide-web-became-free-20-years-ago-today/ (accessed and saved 4/08/19). ↩
- Johnny Ryan, A history of the internet and the digital future, Reaktion Books (London, 2010), p. 110. ↩
- W. Hoogland and H. Weber, “Statement concerning CERN W3 software release into public domain” (4/30/1993), http://cds.cern.ch/record/1164399 (accessed and saved 4/09/19). ↩
- “Acceptable Use Policy”, National Science Foundation (c. 1992), provisions (10) and (11), http://www.cybertelecom.org/notes/nsfnet.htm#aup (accessed and saved 2/09/19). ↩
- Susannah Fox and Lee Rainie, “The Web at 25 in the U.S. Part 1: How the internet has woven itself into American life”, Pew Research Center (2/27/2014), http://www.pewinternet.org/2014/02/27/part-1-how-the-internet-has-woven-itself-into-american-life/ (accessed and saved 2/09/19). ↩
- “For the first time, more than half of the world’s population is using the internet”, International Telecommunication Union (12/07/2018), https://www.itu.int/en/mediacentre/Pages/2018-PR40.aspx (accessed and saved 2/09/19). ↩
- Andrew Perrin and Jingjing Jiang, “About a quarter of U.S. adults say they are ‘almost constantly’ online”, Pew Research Center (3/14/2018), http://www.pewresearch.org/fact-tank/2018/03/14/about-a-quarter-of-americans-report-going-online-almost-constantly/ (accessed and saved 2/09/19). ↩
- Comscore Whitepaper (3/30/2016), https://www.comscore.com/Insights/Presentations-and-Whitepapers/2016/2016-US-Cross-Platform-Future-in-Focus as quoted by David Murphy, “ComScore: Desktop Browsing on the Decline”, PCMag (4/16/2016), https://www.pcmag.com/news/343784/comscore-desktop-browsing-on-the-decline (accessed and saved 2/10/19). ↩
- Douglas J. Guerrero et al., “Anti-reflective coating for multipatterning lithography”, Proc. of SPIE Vol. 6923 (2008), https://www.brewerscience.com/uploads/publications/2008/Guerreroetal2008_6923Web.pdf (accessed and saved 2/10/19). ↩
- Els Parton and Peter Verheyen, “Strained silicon – the key to sub-45 nm CMOS”, III-Vs Review 19(3):28-31 (April, 2006), https://www.sciencedirect.com/science/article/pii/S0961129006715903 (accessed and saved 2/10/19). ↩
- The Chinese government relies on a workforce of over 2,000,000 internet police. Staff writer, “China employs two million microblog monitors state media say”, BBC (10/04/2013), https://www.bbc.com/news/world-asia-china-24396957 (accessed, saved, and archived 5/02/21). ↩
- Leo Mirani, “Millions of Facebook users have no idea they’re using the internet”, Quartz (2/09/2015), https://qz.com/333313/milliions-of-facebook-users-have-no-idea-theyre-using-the-internet/ (accessed and saved 2/11/19). ↩
- Guy Faulconbridge and Paul Sandle, “Father of Web says tech giants may have to be split up”, Reuters (10/31/2018), https://www.reuters.com/article/us-technology-www/father-of-web-says-tech-giants-may-have-to-be-split-up-idUSKCN1N63MV (accessed and saved 2/11/19). ↩
- Mobile Ecosystem Forum, “Understanding the Personal Data Economy: The Emergence of a New Data Value-Exchange” (c. 2017), https://mobileecosystemforum.com/wp-content/uploads/2016/11/Understanding-the-Personal-Data-Economy-Whitepaper.pdf (accessed and saved 5/02/21). ↩
- Jesse Campbell, “The Secret, Magical Math of Credit Scores”, Personal Capital (3/12/2015), https://www.personalcapital.com/blog/family-life/secret-magical-math-credit-scores/ (accessed, saved, and archived 5/02/21). ↩
- Cathy O’Neil, Weapons of Math Destruction, Broadway Books (Amazon Kindle e-book edition, 2017), p. 12. ↩
- Marshall McLuhan, The Gutenberg Galaxy: The Making of Typographic Man, University of Toronto Press (1962). McLuhan coined the phrase global village to describe the effect of instant electronic communication on the effective shrinking of the world. With amazing prescience, he practically predicted the internet in the early 1960’s. ↩
- Todd Leopold, “The price of public shaming in the Internet age”, CNN (4/16/2015), https://www.cnn.com/2015/04/16/living/feat-public-shaming-ronson/index.html (accessed and saved 2/11/19). ↩