-- John Dewey, The Public and Its Problems, 1927
(D)ecisions about the development and exploitation of computer technology must be made not only "in the public interest" but in the interest of giving the public itself the means to enter into the decision-making processes that will shape their future.
-- J. C. R. Licklider, "Computers and Government", 1979
Can a plural understanding of society lay the foundation for social transformations as dramatic as those that fields like quantum mechanics and ecology have brought to natural sciences, physical technology and our relationship to nature? Liberal democracies often celebrate themselves as pluralistic societies, which would seem to indicate they have already drawn the available lessons from plural social science. Yet despite this formal commitment to pluralism and democracy, almost every country has been forced by the limits of available information systems to homogenize and simplify social institutions in a monist atomist mold that runs into direct conflict with its values. The great hope of plural social science and plural technology (Plurality) built on top of it is to use the potential of information technology to begin to overcome these limitations.
Private property. Individual identity and rights. Nation state democracy. These are the foundations of most modern liberal democracies. Yet they rest on fundamentally monist atomist foundations. Individuals are the atoms; the nation state is the whole that connects them. Every citizen is seen as equal and exchangeable in the eyes of the whole, rather than part of a network of relationships that forms the fabric of society and in which any state is just one social grouping. State institutions seek direct, unmediated relationships with free and equal individuals, though in some cases federal and other subsidiary (e.g. city, religious or family) institutions intercede.
Three foundational institutions of modern social organization represent this structure most sharply: property, identity and voting. We will illustrate how this works in each context and then turn to the ways that pluralist social science has challenged and offers ways past the limits of atomist monism.
Simple and familiar forms of private property, with most restrictions on that right imposed by governments, are the most common form of ownership in liberal democracies around the world. Most homes are owned by a single individual or family or by a single landlord who rents to another individual or family. Most non-governmental collective ownership takes the form of a standard joint stock company governed by the principle of one-share-one-vote and the maximization of shareholder value. While there are significant restrictions on the rights of private property owners based on community interests, these overwhelmingly take the form of regulations by a small number of governmental levels, such as national, provincial/state and local/city.
These practices are in sharp contrast to the property regimes that have prevailed in most human societies throughout most of history.
In deep time, individuals were born into families rooted in kin-based institutions: institutions that provided everything (livelihood, sustenance, meaning) and that were for the most part inescapable. No "official documents" were needed; they would have made no sense, because you were born into one place in a social universe and remained there, interacting with people you knew and who knew you your entire life. These kin-based institutions began to be broken up in Europe starting around 500 with the imposition by the Catholic Church of its Marriage and Family Program, which banned cousin marriage. This is the event Joseph Henrich identifies in his book The WEIRDest People in the World: How the West Became Psychologically Peculiar and Particularly Prosperous as germinal in creating the West as we know it today.
By 1100, new types of voluntary associations (monasteries, universities, charter towns, guilds) began to emerge in the void left by the dissolution of kin-based institutions. The plague, in which roughly a third of the population died, also did much to disrupt the old social order. These new social forms gave rise to a new psychology in which people saw themselves as "individuals" who could leave their family entirely to move to a faraway charter town or join a monastery. Impersonal pro-sociality emerged and became the norm because people were primarily interacting with non-kin. These new institutions and extensive interactions with non-kin also led to the emergence of pre-capitalist markets, early contract law and governance processes rooted in abstract rules.
Who you were and where you fit was no longer "obvious" from your kin relations. So, as people began to move around more, new institutions formed paper-based systems to document who belonged to them: who was baptized by the church, who was a resident of a town, who was a member of a guild, who was a soldier in the army, who was a patient at the hospital, and so on. Identity systems in liberal democratic countries are rooted in historic practices originating with the church custom of keeping baptismal records in a log book. Beginning in the 1500s, there was a shift over several centuries toward state-run systems in which births were registered and birth certificates issued to parents. This document, the birth certificate, is still the root "breeder" document from which all other state-issued identity documents (driver's license, national ID, tax/pension number, passport) are derived.
It is worth noting that universal birth registration is a very recent phenomenon; it was achieved in the US only in 1940. Universal registration for Social Security Numbers did not begin until 1987, when Enumeration at Birth was instituted at the federal level in collaboration with the county-level governments where births are registered. It also coincided with a new tax law requiring that children claimed by their parents as tax deductions have an SSN. Many countries around the world still do not have universal birth registration.
These documentary practices meant that some aspect of identity could be rooted outside direct personal relations, in a new abstract relationship with the state based on primary registration at birth and secondary registration for subsequent documents, usually issued in early adulthood. These state-issued documents serve as foundational trust anchors for many other types of institutions that request them when people register or enroll: children's sports teams (to determine age), religious institutions (to background-check people before they work with children), employers (to confirm name, tax number and eligibility to work), educational institutions (to confirm name and birthdate), medical care providers (to confirm name and birthdate), and officers at border checkpoints (to confirm name, birthdate and citizenship). These documents are abstract representations of people, but they are universal, and they have enabled people to navigate the world not based on "who they know" or "where they fit" in a tight, narrow social universe bounded by kin relations, but based on who they are in an abstracted, universal sense relative to the state.
These structures differ sharply from those that prevailed in most places over most of human history, when one was born into a large extended family and could essentially never leave that family or the context it was embedded in. The innovation of WEIRD societies meant that, for the first time in human history, people could defect from their context, moving from one university or town to another, or leaving one guild to train in a new one, as a normal part of life.
We have an opportunity to extend these documentary practices of state and other formal institutions to peer-based and networked institutions. Work is being done now in the [open recognition community](https://docs.google.com/presentation/d/153qXiNr9xTUWKKev8HytXw84YgLTttzomYdOqvOL8Xc/edit#slide=id.p) to develop common ways to support communities documenting membership and recognition in peer-based ways.
In most liberal democracies, the principle of "one-person-one-vote" is viewed as a sacred core of the democratic process. Of course, various schemes of representation (multi-member proportional representation or single-member districts), checks-and-balances (multi- v. unicameral legislatures, parliamentary v. presidential) and degrees of federalism vary and recombine in diverse ways. However, both in popular imagination and in formal rules, the idea that numerical majorities (or in some cases supermajorities) should prevail regardless of the social composition of groups is at the core of how democracy is typically understood.
There are, of course, limited exceptions that in many ways prove the rule. The two most notable examples are "degressive proportionality" and "consociationalism". Many federal systems (e.g. the United States) apply the principle of degressive proportionality, to which we will return later: namely, that smaller sub-units (e.g. provinces in national voting) are over-represented relative to their population. Some countries also have consociational structures in which designated social groups (e.g. religions or political parties) agree to share power in some specified fashion, ensuring that even if one group's vote share declines it retains something of its historical power. Yet these counterexamples are few and far between, and usually the subject of ongoing controversy, with significant political pressure to "reform" them toward a standard one-person-one-vote design.
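The arithmetic of degressive proportionality can be made concrete with a minimal sketch. The population figures below are invented round numbers (not official data), chosen to show how equal seat counts over-represent smaller units per capita:

```python
# Degressive proportionality: smaller sub-units get more representatives
# per capita than larger ones. Populations are hypothetical round numbers.
units = {
    "large_state": {"population": 40_000_000, "seats": 2},
    "small_state": {"population": 1_000_000, "seats": 2},
}

def seats_per_million(unit):
    """Representatives per million residents."""
    return unit["seats"] / (unit["population"] / 1_000_000)

# Equal seat counts regardless of population leave the small state's
# residents 40x over-represented relative to the large state's.
ratio = (seats_per_million(units["small_state"])
         / seats_per_million(units["large_state"]))
```

A strictly proportional system would hold `seats_per_million` constant across units; degressive proportionality deliberately lets it rise as population falls.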
Again this contrasts with decision-making structures throughout most of the world and most of history, including ones that involved widespread and diverse representation.
Thus, in contrast to most of human history and experience, the standard form of public administration in most liberal democracies expects property to exist primarily as either individual (or family) holdings or profit-seeking commercial ventures, with most checks and controls on these two being imposed by governments. This regime began to develop during the Renaissance and Enlightenment, when traditional, commons-based property systems, community-based identity and multi-sectoral representation were swept away for the "rationality" and "modernity" of what became the modern state. This system solidified and literally conquered the world during the industrial and colonial nineteenth century and was canonized in the work of Max Weber. And it reached its ultimate expression in the "high modernism" of the mid-twentieth century, when properties were further rationalized into regular shapes and sizes, identity documents were reinforced with biometrics and one-person-one-vote systems spread to a broad range of organizations.
Governments and organizations around the world adopted these systems for some good reasons. They were simple and thus scalable; they allowed people from very different backgrounds to quickly understand each other and thus interact productively. Where once commons-based property systems inhibited innovation because outsiders and industrialists found it impossible to navigate a thicket of local customs, private property cleared a path to development and trade by reducing the number of parties who could inhibit change. Administrators of the social welfare schemes that transformed government in the twentieth century would have struggled to provide broad access to pensions and unemployment benefits without a single, flat, clear database of entitlements. And requiring subtle compromises like those that went into the US Constitution, much less ones rich enough to keep up with the complexity of the modern world, would likely have undermined the possibility of democratic government spreading.
In fact these institutions were core to what allowed modern, wealthy, liberal democracies to rise, flourish and rule, just as the insights of Newtonian mechanics and Euclidean geometry gave those civilizations the physical power to sweep the earth. Yet just as the Euclidean-Newtonian worldview turned out to be severely limited and naïve, plural social science was born by highlighting the limits of these atomist monist social systems.
Perhaps the clearest sign of the limits of the modern order is that one need not look to colonial discontent or the social margins to see them. In fact, some of the most prominent and celebrated founding figures were also searing critics and re-imaginers of these foundational institutions. Henry George, author of the best-selling and most influential book on economics in American and perhaps world history, made his career as a searing critic of private property. Georg Simmel, one of the founders of sociology, originated the idea of the "web" as a critique of the individualist concept of identity. John Dewey, widely considered the greatest philosopher of American democracy, argued that the standard national and state institutions that instantiated democracy hardly scratched the surface of what it required. Norbert Wiener coined the term "cybernetics" for the study of such rich interactive systems. Through the limits they highlighted in the very box they helped construct, we can learn to imagine a social world outside it.
We remember Karl Marx and Adam Smith more sharply, but the social thinker who may have had the greatest influence during and immediately following his lifetime was Henry George. Author of Progress and Poverty, for years the best-selling book in English other than the Bible, George inspired or arguably founded many of the most successful political movements and even cultural artifacts of the early twentieth century, including:
- the American center-left, as a nearly-successful United Labor candidate for Mayor of New York City;
- the Progressive and social gospel movements, which both traced their names to his work;
- the Chinese Nationalist movement, whose founder Sun Yat-Sen drew his "Three Principles of the People" from George much as Lenin and Mao drew on Marx, leading to the continuing reverence for George in today's Taiwan;
- and the game Monopoly, which originated as an educational device "The Landlord's Game", to illustrate how an alternate set of rules could avoid monopoly and enable common prosperity.
George wrote on many topics, helping originate, for example, the idea of the secret ballot. But he became most famous for advocating a "single tax" on land, whose value he argued could never properly belong to an individual owner. His most famous illustration asked readers to imagine an open savannah full of beautiful but homogeneous land on which a settler arrives, claiming some arbitrarily chosen large plot for her family. When future settlers arrive, they choose to settle close to the first, so as to enjoy company, divide labor and enjoy shared facilities like schools and wells. As more settlers arrive, they continue to choose to cluster and the value of land rises. In a few generations, the descendants of the first settler find themselves landlords of much of the center of a bustling metropolis, rich beyond imagining, through little effort of their own, simply because a great city was built around them.
The value of their land, George insisted, could not justly belong to that family: it was a collective product that should be taxed away. Such a tax was not only just, it was crucial for economic development, as highlighted especially by later economists including one of the authors of this book. Taxes of this sort, especially when carefully designed as they were in Taiwan, ensure property owners must use their land productively or allow others to do so. The revenue they raise can support shared infrastructure (like those schools and wells) that gives value to the land, an idea called the "Henry George Theorem".
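George's savannah parable reduces to a toy calculation. The sketch below (all figures invented for illustration) separates a plot's small intrinsic value from the much larger increment created by the community that grows up around it, which is the portion a Georgist land tax targets:

```python
# Hypothetical numeric sketch of George's savannah story: a plot's value
# comes almost entirely from the settlement around it, not the owner's
# effort. All figures are invented for illustration.

def land_value(neighbors, base_value=1_000, value_per_neighbor=500):
    """Value of a plot as settlers cluster around it."""
    return base_value + value_per_neighbor * neighbors

lonely = land_value(0)        # value of the plot with no one around
metropolis = land_value(998)  # value once a city has grown around it

# George's "single tax" targets only this socially created increment,
# leaving the base value (and any improvements) untaxed.
unearned_increment = metropolis - lonely
```

Because the increment is created by neighbors rather than by the owner, taxing it away does not, in George's argument, discourage any productive effort by the owner.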
Yet, as attractive as this argument has proven to politicians and intellectuals from Leo Tolstoy to Albert Einstein, in practice it has raised many more questions than it has answered. Simply saying that land does not belong to an individual owner says nothing about who or what it does belong to. The city? The nation state? The world?
Given this is a book about technology, an elegant illustration is the San Francisco Bay Area, where both authors and George himself lived parts of their lives and which has some of the most expensive land in the world. To whom does the enormous value of this land belong?
- Certainly not to the homeowners who simply had the good fortune of seeing the computer industry grow up around them. Then perhaps to the cities in the region? Many reformers have argued that these cities, which are in any case fragmented and tend to block development, can hardly take credit for the miraculous increase in land values.
- Perhaps Stanford University and the University of California at Berkeley, to which various scholars have attributed much of the dynamism of Silicon Valley? Certainly these played some role, but it would be strange to attribute the full value of Bay Area land to two universities, especially when these universities succeeded with the financial support of the US government and the collaboration of other universities across the country.
- Perhaps the State of California? Arguably the national defense and research complex that created the internet (as we discuss below) and national political institutions played a far greater role than anything at the state level.
- Then to the US? But of course the software industry and internet are global phenomena.
- Then to the world in general? Beyond the essential non-existence of a world government that could meaningfully receive and distribute the value of such land, abstracting all land value to such heights is a bit of an abdication: clearly many of the entities above are more relevant than simply "the entire world" to the value of the software industry; if we followed that path, global government would end up managing everything simply by default.
To make matters yet more complex, the revenue earned on the property is but one piece of what it means to own. Legal scholars typically describe property as a bundle of rights: of "usus" (to access the land), "abusus" (to build on or dispose of it) and "fructus" (to profit from it). Who should be able to access the land of the Bay Area under what circumstances? Who should be allowed to build what on it, or to sell exclusive rights to do so to others? Most of these questions were hardly even considered in George's writing, much less settled. In this sense, his work is more a helpful invitation to step beyond the easy answers private property offers, which is perhaps why his enormously influential ideas have only been partly implemented in a small number of (admittedly highly successful) places like Estonia and Taiwan.
The world George invites us to reflect on and imagine how to design for is one of networked value, one where a variety of entities, localized at different scales (universities, municipalities, nation states, etc.) all contribute to differing degrees to create value, just as networks of waves and neurons contribute to differing degrees to the probabilities of particles being found in various positions or thoughts occurring in a mind. And for both justice and productivity, property and value should belong, in differing degrees, to these networks. In this sense, George was a founder of plural social science.
But if network thinking was implicit in George's work, it took another thinker, across the Atlantic, to make it explicit and, accidentally, give it a name: Georg Simmel, a German sociologist of the turn of the twentieth century who pioneered the idea of social networks, and the mistranslation of whose work as focused on a “web” eventually went “worldwide”. In his 1955 translation of Simmel’s classic 1908 Soziologie, Reinhard Bendix chose to render Simmel’s idea as a “web of social affiliations” rather than what he described as the “almost meaningless” direct translation, “intersection of social circles”1. Had he made the opposite choice, perhaps one of the leading technologies of our era would have exchanged names with one of its leading social movements, and we would speak of the “intersecting circles of the internet” and of a “web of oppression”.
Simmel’s “intersectional” theory of identity offered an alternative to both the traditional individualist/atomist account (characteristic at the time of the sociology of Max Weber and deeply influential on ES) and the collectivist account (characteristic at the time of the sociology of Karl Marx and deeply influential on AT). He saw both as extreme reductions/projections of a richer underlying theory.
In his view, humans are inherently social creatures and thus there is no original and separate individual identity. Humans gain their sense of self, their goals, and their meaning through participation in social, linguistic, and solidaristic groups. In simple societies (e.g., isolated, rural, or tribal), people spend most of their life interacting with the same group of others or, as he called it, the same “social circle”. This circle comes to (primarily) define their identity collectively, which is why most scholars of simple societies (for example, anthropologist Marshall Sahlins) tend to favor methodological collectivism.
However, in more complex/urban/modern societies, social circles are more diverse. People work with one circle, worship with another, support political causes with a third, recreate with a fourth, cheer for a sports team with a fifth, identify as discriminated against along with a sixth, and so on. These diverse identifications together constitute a person’s identity. The more numerous and diverse these affiliations become, the less likely it is that anyone else shares precisely the same intersection of affiliations.
As this occurs, people come to have, on average, less of their full sense of self in common with those around them at any time; they begin to feel “unique” (to put a positive spin on it) and “isolated/misunderstood” (to put a negative spin on it). This creates a sense of “individuality” that helps explain why social scientists focused on complex urban settings (such as economists) tend to favor methodological individualism. However, ironically, as Simmel points out, such “individuation” occurs precisely because, and to the extent that, the “individual” becomes divided among many loyalties and thus dividual. Thus, while methodological individualism takes the “(in)dividual” as the irreducible element of social analysis, Simmel instead suggests that individuals become possible as an emergent property of the complexity and dynamism of modern, urban societies.
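Simmel's claim that individuation grows with the number and diversity of affiliations can be illustrated with a toy probabilistic model (ours, not Simmel's, with arbitrary parameters): if each person independently belongs to each of k circles with probability one half, the expected number of others sharing your exact combination of affiliations halves with every additional circle.

```python
# Toy model: each person independently joins each social circle with
# probability p_member. Two people "match" on a circle if both are in it
# or both are out of it. All parameters are invented for illustration.

def expected_profile_sharers(population, num_circles, p_member=0.5):
    """Expected number of OTHER people with your exact affiliation profile."""
    p_same_status = p_member**2 + (1 - p_member)**2  # both in, or both out
    return (population - 1) * p_same_status**num_circles

# In a village with 3 salient circles, identity is largely collective...
village = expected_profile_sharers(population=500, num_circles=3)
# ...in a metropolis with 30 circles, virtually no one shares your
# exact intersection of affiliations.
city = expected_profile_sharers(population=1_000_000, num_circles=30)
```

In the toy village you expect dozens of people with your exact profile; in the toy metropolis you expect essentially none, which is Simmel's modern individual: "unique" and yet "isolated/misunderstood".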
The individual that national identity systems seek to strip away from the shackles of community thus actually emerges from the proliferation and intersection of communities. Just as a truly just and efficient property regime would recognize and account for such networked interdependence, identity systems that truly empower and support modern life would need to mirror its intersectional, networked structure.
If (in)dividual identity is so fluid and dynamic, surely so too must be the social circles that intersect to constitute it. As Simmel highlights, new social groups are constantly forming while older ones decline. Three examples he highlights are the (for his time) still recent formation of cross-sectoral “working men’s associations” that represented the general interest of labor, the just-then-emerging feminist associations, and cross-sectoral employers’ interest groups. The critical pathway to creating such new circles was the establishment of places (e.g. workmen’s halls) or publications (e.g. working men’s newspapers) where this new group could come to know one another and understand themselves, and thus have things in common that they did not share with others in the broader society. Such bonds were strengthened by secrecy, as shared secrets allowed for a distinctive identity and culture, as well as coordination in a common interest in ways unrecognizable by outsiders2. Developing this shared but hidden knowledge allows the emerging social circle to act as a collective agent.
In his 1927 book, The Public and Its Problems, John Dewey considered the political implications and dynamics of these “emergent publics”, as he called them3. Father of "progressive education" and perhaps the most celebrated American philosopher of democracy, Dewey was a devoted follower of George. He led the "democratic" wing of progressive politics in his era and engaged in a famous series of debates with leading left-wing technocrat Walter Lippmann, whose 1922 book Public Opinion Dewey considered "the most effective indictment of democracy as currently conceived". In the debate, Dewey sought to redeem democracy while fully embracing Lippmann's critique of existing institutions as ill-suited to an increasingly complex and dynamic world.
While he acknowledged a range of forces for social dynamism, Dewey focused specifically on the role of technology in creating new forms of interdependence that created the necessity for new publics. Railroads connected people commercially and socially who would never have met. Radio created shared political understanding and action across thousands of miles. Pollution from industry was affecting rivers and urban air. All these technologies resulted from research whose benefits spread with little regard for local and national boundaries. The social challenges arising from these forms of interdependence (e.g. governance of railway tariffs, safety standards and disease propagation; fairness in access to scarce radio spectrum) are poorly managed by both capitalist markets and preexisting “democratic” governance structures.
Markets fail because these technologies create market power, pervasive externalities, and more generally exhibit “supermodularity” (sometimes called “increasing returns”), where the whole (e.g. a railroad network) is greater than the sum of its parts. In the technology industry, the most famous example is so-called "network effects", where use of a system by some raises its value to others. Capitalist enterprises cannot account for all the relevant “spillovers”, and to the extent they do, they accumulate market power, raise prices and exclude participants, undermining the value created by increasing returns. Leaving these interdependencies “to the market” thus exacerbates their risks and harms while failing to leverage their potential.
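Supermodularity can be sketched with the classic pairwise-connection model of network value (our illustration, with arbitrary user counts): if every pair of users who can reach each other creates one unit of value, combining two networks yields more value than the sum of the parts.

```python
# Sketch of "supermodularity" via network effects: value grows with the
# number of possible user pairs, so a merged network is worth more than
# its parts combined. User counts are arbitrary illustrative numbers.

def network_value(n_users):
    """One unit of value per possible pair of connected users."""
    return n_users * (n_users - 1) // 2

separate = network_value(100) + network_value(100)  # two isolated networks
merged = network_value(200)                         # one combined network

# Value that exists only because of the combination: the whole exceeds
# the sum of its parts.
surplus = merged - separate
```

The surplus is exactly the value of the 100 x 100 new cross-network pairs: the "spillover" that, in Dewey's terms, neither capitalist enterprise can fully account for without accumulating market power.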
Dewey revered democracy as the most fundamental principle of his career; barely a paragraph can pass without him harkening back to it. He firmly believed that democratic action could address the failings of markets. Yet he saw the limits of existing “democratic” institutions just as severely as those of capitalism. The problem is that existing democratic institutions are not, in Dewey’s view, truly democratic with regards to the emergent challenges created by technology.
In particular, what it means to say an institution is “democratic” is not just that it involves participation and voting. Many oligarchies had these forms, but did not include most citizens and thus were not democratic. Nor would, in Dewey’s mind, a global “democracy” directly managing the affairs of a village count as democratic. Core to true democracy is the idea that the “relevant public”, the set of people whose lives are actually shaped by the phenomenon in question, manage that challenge. Because technology is constantly throwing up new forms of interdependence, which will almost never correspond precisely to existing political boundaries, true democracy requires new publics to constantly emerge and reshape existing jurisdictions.
Furthermore, because new forms of interdependence are not easily perceived by most individuals in their everyday lives, Dewey saw a critical role for what he termed “social science experts” but we might, with no more abuse of terminology, call “entrepreneurs”, “leaders”, “founders”, “pioneers” or, as we prefer, “mirrors”. Just as George Washington's leadership helped the United States perceive itself as a nation, and one that had to democratically choose its fate after his term in office, the role of such mirrors is to perceive a new form of interdependence (e.g. solidarity among workers, the carbon-to-global-warming chain), explain it to those involved by both word and deed, and thereby empower a new public to come into existence. Historical examples include union leaders, founders of rural electricity cooperatives, and the leaders who founded the United Nations. Once this emergent public is understood, recognized, and empowered to govern the new interdependence, the role of the mirror fades away, just as Washington returned to Mount Vernon.
Thus, as the mirror image of Simmel’s philosophy of (in)dividual identity, Dewey’s conception of democracy and emergent publics is at once profoundly democratic and yet challenges and even overturns our usual conception of democracy. Democracy, in this conception, is not the static system of representation of a nation-state with fixed borders. It is a process of invention, even more dynamic than a market, led by a diverse range of entrepreneurial mirrors, who draw upon the ways they are themselves intersections of unresolved social tensions to renew and reimagine social institutions. Standard institutions of nation state-based voting are as pale a shadow of such a process as Newtonian mechanics is of the underlying quantum and relativistic reality. True democracy must be networked, plural and constantly evolving.
All of these critiques and directions of thought are suggestive, but none seems to offer clear paths to action and further scientific development. Could the understanding of the plural, networked nature of social organization be turned into a scientific engine of new forms of social organization? This hypothesis was the seed from which Norbert Wiener sprouted the modern field of "cybernetics", from which come all the uses of "cyber" to describe digital technology and, many would argue, the later name "computer science" given to similar work. Wiener defined cybernetics as "the science of control and communication in (complex systems like) the animal and machine", but perhaps the most broadly accepted meaning is something like the "science of communication within, and governance of, by and for, networks". The word was drawn from a Greek analogy of a ship steered by the inputs of its many oarsmen.
Wiener's scientific work focused almost exclusively on physical, biological and information systems, investigating the ways that organs and machines can obtain and preserve homeostasis, quantifying information transmission channels and the role they play in achieving such equilibrium, and so on. Personally and politically, he was a pacifist, a severe critic of capitalism as failing basic principles of cybernetic stabilization and the creation of homeostasis, and an advocate of radically more responsible use and deployment of technology. He despaired that without profound social reform his scientific work would come to worse than nothing, writing in the introduction to Cybernetics, "there are those who hope that the good of a better understanding of man and society which is offered by this new field of work may anticipate and outweigh the incidental contribution we are making to the concentration of power (which is always concentrated, by its very conditions of existence, in the hands of the most unscrupulous). I write in 1947, and I am compelled to say that it is a very slight hope." It is thus unsurprising that Wiener befriended many social scientists and reformers who vested "considerable...hopes...for the social efficacy of whatever new ways of thinking this book may contain."
Yet while he shared these convictions, he believed such hopes to be mostly "false". While he judged such a program as "necessary", he was unable to "believe it possible". He argued that quantum physics had shown the impossibility of precision at the level of particles, and that the success of science arose from the fact that we live far above the level of particles; because we live within societies rather than far above them, the same principles made social science inherently infeasible. Thus, as much as he hoped to offer scientific foundations on which the work of George, Simmel and Dewey could rest, he was skeptical of "exaggerated expectations of their possibilities".
It thus fell to that younger generation, with a more human/social scientific background, to experiment with his vision and build the technologies that would define the information era. A blip moving across the sky in October 1957 proved to be the opportunity they needed.
The launch by the Soviet Union of the first orbital satellite was followed a month later by the Gaither Committee report, claiming that the US had fallen behind the Soviets in missile production. The ensuing moral panic forced the Eisenhower administration into emergency action to reassure the public of American strategic superiority. Yet despite, or perhaps because of, his own martial background, Eisenhower deeply distrusted what he labeled America's "military-industrial complex", while having boundless admiration for scientists. He thus aimed to channel the passions of the Cold War into a national strategy to improve scientific research and education.
While that strategy had many prongs, a central one was the establishment, within the Department of Defense, of a quasi-independent, scientifically administered Advanced Research Projects Agency (ARPA) that would harness expertise from universities to accelerate ambitious and potentially transformative scientific projects with potential defense applications.
While ARPA began with many aims, some of which were soon assigned to other newly formed agencies, such as the National Aeronautics and Space Administration (NASA), it quickly found a niche as the government's most daring supporter of ambitious, "far out" projects under its second director, Jack Ruina. One area was to prove particularly representative of this risk-taking style: the Information Processing Techniques Office led by Joseph Carl Robnett (JCR) Licklider.
Licklider hailed from yet another field beyond the political economy of George, the sociology of Simmel, the political philosophy of Dewey and the mathematics of Wiener: "Lick", as he was commonly known, received his PhD in 1942 in the field of psychoacoustics. After spending his early career developing applications to human performance in high-stakes interactions with technology (especially aviation), his attention increasingly turned to the possibility of human interaction with the fastest growing form of machinery: the "computing machine". He joined the Massachusetts Institute of Technology (MIT) to help found Lincoln Laboratory and the psychology program. He then moved to the private sector as Vice President of Bolt, Beranek and Newman (BBN), one of the first MIT spin-off research start-ups.
Having persuaded BBN's leadership to shift their attention towards computing devices, Lick began to develop an alternative technological vision to the then-emerging field of Artificial Intelligence, one that drew on his psychological background to propose "Man-Computer Symbiosis", as his path-breaking 1960 paper was titled. Lick hypothesized that while "in due course...'machines' will outdo the human brain in most of the functions we now consider exclusively within its province...(t)here will...be a fairly long interim during which the main advances will be made by men and computers working together...those years should be intellectually the most creative and exciting in the history of mankind." CITE
This vision arrived at precisely the right moment for ARPA, which was in search of bold missions with which it could secure its place in the rapidly coalescing national science administration landscape. Ruina appointed Lick to lead the newly-formed Information Processing Techniques Office (IPTO). Lick harnessed the opportunity to build and shape much of the structure of what became the field of Computer Science.
While Lick spent only two years at ARPA, those years laid the groundwork for much of what followed in the next forty years of the field. He seeded a network of "time sharing" projects around the US that would enable several individual users to directly interact with previously monolithic large-scale computing machines, taking a first step towards the age of personal computing. The five universities thus supported (Stanford, MIT, UC Berkeley, UCLA and Carnegie Mellon) went on to become the core of the emerging academic field of computer science.
Beyond establishing the computational and scientific backbone of modern computing, Lick was particularly focused on the "human factors" in which he specialized. He aimed to make the network represent these ambitions in two ways that paralleled the social and personal aspects of humanity. On the one hand, he gave particular attention and support to projects he believed could bring computing closer to the lives of more people, integrating with the functioning of human minds. The leading example of this was the Augmentation Research Center established by Douglas Engelbart at Stanford. On the other hand, he dubbed the network of collaboration between these hubs, with his usual tongue-in-cheek, the "Intergalactic Computer Network", and hoped it would provide a model of computer-mediated collaboration and co-governance. CITE
This project bore fruit in a variety of ways, both immediately and longer-term. Engelbart quickly invented many foundational elements of personal computing, including the mouse, a bitmapped screen that was a core precursor to the graphical user interface and hypertext; his demonstration of this work, six short years after Lick's initial funding, as the "oNLine system" (NLS) is remembered as "the mother of all demos" and a defining moment in the development of personal computers. CITE This in turn helped persuade Xerox Corporation to establish their Palo Alto Research Center (PARC), which in turn pioneered much of modern personal computing. US News and World Report lists four of the five departments Lick funded as the top four computer science departments in the country. Most importantly, after Lick's departure to the private sector, the Intergalactic Computer Network developed into something less fanciful and more profound under the leadership of his collaborator, Robert W. Taylor.
Taylor and Lick were natural colleagues. While Taylor never completed his PhD, his research field was also psychoacoustics, and he served as Lick's counterpart at NASA, which had just split from ARPA, during Lick's leadership at IPTO. Shortly following Lick's departure (in 1965), Taylor moved to IPTO to help develop Lick's networking vision under the leadership of Ivan Sutherland, who then returned to academia, leaving Taylor in charge of IPTO and the network that he more modestly labeled the ARPANET. He used his authority to commission Lick's former home of BBN to build the first working prototype of the ARPANET backbone. With momentum growing through Engelbart's demonstration of personal computing and ARPANET's first successful trials, Lick and Taylor articulated their vision for the future possibilities of personal and social computing in their 1968 article "The Computer as a Communication Device", describing much of what would become the culture of personal computing, the internet and even smartphones several decades later.
By 1969, Taylor felt the mission of the ARPANET was on track to success and moved on to Xerox PARC, where he led the Computer Science Laboratory in developing much of this vision into working prototypes. These in turn became the core of the modern personal computer that Steve Jobs famously "stole" from Xerox to build the Macintosh, while ARPANET evolved into the modern internet. In short, the technological revolutions of the 1980s and 1990s trace clearly back to this quite small group of innovators in the 1960s. While we will turn to these more broadly-known later developments shortly, it is worth lingering on the core of the research program that made them possible.
At the core of the development of what became the internet was replacing centralized, linear and atomized structures with networked relationships and governance. This happened at three levels that eventually converged in the early 1990s as the World Wide Web:
- packet switching to replace centralized switchboards,
- hypertext to replace linear text,
- and open standard-setting processes to replace both government and corporate top-down decision-making.

All three ideas had their seeds at the edges of the early community Lick formed and grew into core features of the ARPANET community.
While the concepts of networking, redundancy and sharing permeate Lick's original vision, it was Paul Baran's 1964 report "On Distributed Communications" that clearly articulated how and why communications networks should strive for a plural rather than centralized structure. Baran argued that while centralized switchboards achieved high reliability at low cost under normal conditions, they were fragile to disruptions. On the other hand, networks with many centers could be built with cheap and unreliable components and still withstand even quite devastating attacks by "routing around damage", taking a dynamic path through the network based on availability rather than prespecified planning. While Baran received support and encouragement from scientists at Bell Labs, his ideas were roundly dismissed by AT&T, the national telephone monopoly in whose culture high-quality centralized dedicated machinery was deeply entrenched.
But despite the apparent threat it posed to that private interest, packet switching caught the positive attention of another organization that owed its genesis to the threat of devastating attacks: ARPA. At a 1967 conference, ARPANET's first program manager, Lawrence Roberts, learned of packet switching through a presentation by Donald Davies, who had concurrently and independently developed the same idea as Baran; Roberts then drew on Baran's arguments, which he soon learned of, to sell the concept to the team.
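Baran's core argument about "routing around damage" can be sketched in a few lines of code: given a mesh of unreliable links, a message can still reach its destination by dynamically searching for any surviving path, rather than relying on a single preplanned route. The small network below is our own illustration, not drawn from the report itself.

```python
from collections import deque

def route(links, src, dst, failed=frozenset()):
    """Find any surviving path from src to dst by breadth-first
    search, skipping nodes that have been knocked out."""
    frontier = deque([[src]])
    seen = {src}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in links.get(node, ()):
            if nxt not in seen and nxt not in failed:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None  # destination unreachable

# A small illustrative mesh: each node has redundant links.
mesh = {
    "A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"],
    "D": ["B", "C", "E"], "E": ["D"],
}
print(route(mesh, "A", "E"))                # ['A', 'B', 'D', 'E']
print(route(mesh, "A", "E", failed={"B"}))  # routes around damage: ['A', 'C', 'D', 'E']
```

Knocking out node "B" does not sever the connection; the message simply takes the surviving path through "C", which is precisely the resilience property that centralized switchboards lack.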
If one path to networked thinking was thus motivated by technical resilience, another was motivated by creative expression. Trained as a sociologist and honed as an artist, Ted Nelson devoted his life beginning in his early 20s to the development of "Project Xanadu", which aimed to create a revolutionary human-centered interface for computer networks. While Xanadu had so many components that Nelson considered indispensable that it was not fully released until the 2010s, its core idea, co-developed with Engelbart, was "hypertext", as Nelson labeled it.
Nelson imagined hypertext as a way to liberate communication from the tyranny of a linear interpretation imposed by an original author, empowering a "pluralism" (as he labeled it) of paths through material via a network of (bidirectional) links connecting material in a variety of sequences. This "choose your own adventure" quality is most familiar today to internet users in their browsing experiences but showed up earlier in commercial products in the 1980s (such as computer games based on HyperCard). Nelson imagined that such ease of navigation and recombination would enable the formation of new cultures and narratives at unprecedented speed and scope. The power of this approach became apparent to the broader world when Tim Berners-Lee made it central to his "World Wide Web" approach to navigation in the early 1990s, ushering in the era of broad adoption of the internet.
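The key structural difference between Nelson's vision and the web we got can be sketched simply: in his design every link is bidirectional, so a document always knows both what it points to and what points to it, letting readers traverse the network along many sequences. The class and method names below are our own illustration, not Xanadu's actual design.

```python
class Hypertext:
    """A toy network of documents with bidirectional links,
    in the spirit of Nelson's hypertext (names are illustrative)."""
    def __init__(self):
        self.links_out = {}  # doc -> set of docs it links to
        self.links_in = {}   # doc -> set of docs linking to it

    def link(self, src, dst):
        # Every link is visible from both ends: the destination
        # always knows its citers, unlike the one-way links of the WWW.
        self.links_out.setdefault(src, set()).add(dst)
        self.links_in.setdefault(dst, set()).add(src)

    def paths_from(self, doc):
        return sorted(self.links_out.get(doc, set()))

    def cited_by(self, doc):
        return sorted(self.links_in.get(doc, set()))

web = Hypertext()
web.link("essay", "source-a")
web.link("essay", "source-b")
web.link("reply", "essay")
print(web.paths_from("essay"))  # ['source-a', 'source-b']
print(web.cited_by("essay"))    # ['reply']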
While Engelbart and Nelson were lifelong friends and shared many similar visions, they took very different paths to realizing them, each of which (as we will see) held an important seed of truth. Engelbart, while also a visionary, was a consummate pragmatist and a smooth political operator, and went on to be recognized as the pioneer of personal computing. Nelson was an artistic purist whose relentless pursuit over decades of a software system ("Project Xanadu") that instantiated all of his seventeen enumerated principles buried his career.
As an active participant in Lick's network, Engelbart tempered his ambition with the need to persuade other network nodes to support, adopt or at least inter-operate with his approach. As different user interfaces and networking protocols proliferated, retreated in his pursuit of perfection. Engelbart, and even more his colleagues across the project, instead began to develop a culture of collegiality, facilitated by the communication network the were building, across the often competing universities they worked at. The physical separation made tight coordination of networks impossible, but work to ensure minimal inter-operation and spreading of clear best practices became a core characteristic of the ARPANET community.
This culture manifested in the development of the "Request for Comments" (RFC) process by Steve Crocker, arguably one of the first "wiki"-like processes of informal and mostly additive collaboration across many geographically and sectorally (governmental, corporate, university) dispersed collaborators. This in turn contributed to the common Network Control Protocol and, eventually, Transmission Control and Internet Protocols (TCP/IP) under the famously mission-driven but inclusive and responsive leadership of Vint Cerf and Bob Kahn between 1974 when TCP was first circulated as RFC 675 and 1983 when they became the official ARPANET protocols. At the core of the approach was the vision of a "network of networks" that gave the "internet" its name: that many diverse and local networks (at universities, corporations and government agencies) could interoperate sufficiently to permit the near-seamless communication across long distances, in contrast to centralized networks (such as France's concurrent Minitel) that were standardized from the top down by a government. Together these three dimensions of networking (of technical communication protocols, communicative content and governance of standards) converged to create the internet we know today.
Much of what resulted from this project is so broadly known it hardly bears repeating here. During the 1970's, Taylor's Xerox PARC produced a series of expensive, and thus commercially unsuccessful, but revolutionary "personal workstations" that incorporated much of what became the personal computer of the 1990s. At the same time, as computer components were made available to broader populations, businesses like Apple and Microsoft began to make cheaper and less user-friendly machines available broadly. Struggling to commercialize its inventions, Xerox allowed Steve Jobs access to its technology in exchange for a stake, resulting in the Macintosh's ushering in of modern personal computing and Microsoft's subsequent mass scaling through their Windows operating system. By 2000, a majority of Americans had a personal computer in their homes.
And much as it had developed in parallel from the start, the internet grew to connect those personal computers. During the late 1960s and early 1970s, a variety of networks grew up in parallel to the largest ARPANET, including at universities, governments outside the United States, international standards bodies and inside corporations like BBN and Xerox. Under the leadership of Kahn and Cerf and with support from ARPA (now renamed DARPA to emphasize its "defense" focus), these networks began to harness the TCP/IP protocol to inter-operate. As this network scaled, DARPA looked for another agency to maintain it, given the limits of its advanced technology mission. While many US government agencies took their hand, the National Science Foundation had the widest group of scientific participants and their NSFNET quickly grew to be the largest network, leading ARPANET to be decommissioned in 1990. At the same time, NSFNET began to spread interconnect with networks in other wealthy countries.
One of those was the United Kingdom, where researcher Tim Berners-Lee in 1989 proposed a "web browser", "web server" and a Hypertext Mark-Up Language (HTML) that fully connected hypertext to packet-switching and made internet content far more available to a broad set of end users. From the launch of Berners-Lee's World Wide Web (WWW) in 1991, internet usage grew from roughly 4 million people (mostly in North America) to over 400 million (mostly around the world) by the end of the millennium. With internet start-ups booming in Silicon Valley and life for many beginning its migration online though the computers many now had in their home, the era of networked personal computing (of "The Computer as a Communication Device") had arrived. CITE https://ourworldindata.org/internet
In the boom and bust euphoria of the turn of the millennium, few people in the tech world paid attention to the specter haunting the industry, the long-forgotten Ted Nelson. Stuck on his decades-long quest for the ideal networking and communication system, Nelson ceaselessly warned of the insecurity, exploitative structure and inhumane features of the emerging WWW design. Without secure identity systems (Xanadu Principles 1 and 3), a mixture of anarchy and land-grabs by nation states and corporate actors would be inevitable. Without embedded protocols for commerce (Xanadu Principles 9 and 15), online work would become devalued or the financial system controlled by monopolies. Without better structures for secure information share and control (Xanadu Principles 8 and 16), both surveillance and information siloing would be pervasive. Whatever its apparently success, the WWW-Internet was doomed to end badly.
While Nelson was something of an oddball, his concerns were surprisingly broadly shared among even the mainstream internet pioneers who would seem to have ever reason to celebrate their success. As early as 1979, while TCP/IP was coalescing, Lick penned a foresaw of "two scenarios" (one good, the other bad) for the future of computing: it could be dominated and its potential stifled by monopolistic corporate control or there could be a full societal mobilization that made computing serve and support democracy. In the former scenario, Lick projected all kinds of social ills, one that might make the advent of the information age a net detractor to democratic social flourishing. These included:
- Pervasive surveillance and public distrust of government.
- Paralysis of government's ability to regulate or enforce laws, as they fall behind the dominant technologies citizens use.
- Debasement of creative professions.
- Monopolization and corporate exploitation.
- Pervasive digital misinformation.
- Siloing of information that undermines much of the potential of networking.
- Government data and statistics becoming increasingly inaccurate and irrelevant.
- Control by private entities of the fundamental platforms for speech and public discourse.
The wider internet adoption spread, the more irrelevant the less relevant such complaints appeared. Sure, government did not end up playing as central of a role as he imagined, but surely by 2000 we were on the path of Lick's scenario 2, assumed most of the few commentators who were even aware of his warnings. Yet in a few places, concern was growing by late in the first decade of the new millennium. Virtual reality pioneer Jaron Lanier sounded the alarm in two books You are Not a Gadget and Who Owns The Future?, highlighting Nelson's and his own version of Lick's concerns about the future of the internet and information technology. While these initially appeared simply an amplification of Nelson's fringe ideas, a series of world events that we discuss in the Introduction eventually brought much of the world around to seeing the limitations of the internet economy and society that had developed. These patterns bore a striking resemblance to Lick and Nelson's warnings. The victory of the internet may have been far more Pyrrhic than it at first seemed.
How did we fall into a trap clearly described by the founders of hypertext and the internet? After having led the development of the internet, why did government and the universities not rise to the challenge of the information age following the 1970s?
It was the warning signs that motivated Lick to put pen to paper in 1979 as the focus of ARPA (now DARPA) shifted away from support for networking protocols towards more directly weapons-oriented research. Lick saw this resulting from two forces on opposite ends of the political spectrum. On the one hand, with the rise of "small government conservatism" that would later be labeled "neoliberalism", government was retreating from proactively funding and shaping industry and technology. On the other hand, the Vietnam War turned much of the left against the role of the defense establishment in shaping research, leading to the Mansfield Amendment of 1973 that prohibited ARPA from funding any research not directly related to the "defense function". Together these were redirecting DARPA's focus to technologies like cryptography and artificial intelligence that were seen as directly supporting military objectives.
Yet even if the attention of the US government had not shifted, the internet was quickly growing out of its purview and control. As it became an increasingly global network, there was (as Dewey predicted) no clear public authority to make the investments needed to deal with the socio-technical challenges needed to make a network society a broader success. To quote Lick
From the point of view of computer technology itself, export...fosters computer research and development" but that "(f)rom the point of view of mankind...the important thing would...be a wise rather than a rapid...development...Such crucial issues as security, privacy, preparedness, participation, and brittleness must be properly resolved before one can conclude that computerization and programmation are good for the individual and society...Although I do not have total confidence in the ability of the United States to resolve those issues wisely, I think it is more likely than any other country to do so. That makes me doubt whether export of computer technology will do as much for mankind as a vigorous effort by the United States to figure out what kind of future it really wants and then to develop the technology needed to realize it.
The declining role of public and social sector investment left core functions/layers that leaders like Lick and Nelson saw for the internet (e.g. identity, privacy/security, asset sharing, commerce) to which we return below absent. While there were tremendous advances to come in both applications running on top of the internet and in the WWW, much of the fundamental investment in protocols was wrapping up by the time of Lick's writing. The role of the public and social sectors in defining and innovating the network of networks was soon eclipsed.
Into the great hole left stepped the ever private sector, flushed with the success of the personal computer and inflated by the stirring celebrations of Reagan and Thatcher. While the International Business Machines (IBM) that Lick feared would dominate and hamper the internet's development proved unable to key pace with technological change, it found many willing and able successors. A small group of telecommunications companies took over the internet backbone that the NSF freely relinquished. Web portals, like America Online and Prodigy came to dominate most American's interactions with the web, as Netscape and Microsoft vied to dominate web browsing. The neglected identity functions were filled by the rise of Google and Facebook. Absent digital payments were filled in by PayPal and Stripe. Absent the protocols for sharing data, computational power and storage that motivated work on the Intergalactic Computer Network in the first place, private infrastructures (often called "cloud providers") that empowered such sharing (such as Amazon Web Services and Microsoft Azure) became the platforms for building applications.
While the internet backbone continued to improve in limited ways, adding security layers and some encryption, the basic features Lick and Nelson saw as essential were never integrated. Public financial support for the networking protocols largely dried up, with remaining open source development largely consisting of volunteer work or work supported by private corporations. As the world woke to the Age of the Internet, the dreams of its founders faded.
Yet faded dreams have a stubborn persistence, nagging throughout a day. While Lick passed away in 1990, many of the early internet pioneers lived to see their triumph and tragedy. Ted Nelson and many other pioneers in Project Xanadu continue to carry their complaints about and reforms to the internet forward to this day. Engelbart, until his death in 2013, continued to speak, organize and write about his vision of "boosting Collective IQ". These activities included organizing, along with Terrence Winograd (PhD advisor to the Google founders), a community around Online Deliberation based at Stanford University. While none of these efforts met with the direct successes of their earlier years, they played critical roles as inspiration and in some case even incubation for a new generation of plural innovators, who have helped revive and articulate the dream of Plurality.
While, as we highlighted in the introduction, the dominant thrust of technology has developed in directions that put it on a collision course with democracy, this new generation of leaders has formed a contrasting pattern, a scattered clearly discernible nodes of light that together give hope that with renewed common action, Plurality could one day animate technology writ large. Perhaps the most vivid example for the average internet user is Wikipedia.
This open, non-profit collaborative project has become the leading global resource for reference and broadly shared factual information. In contrast to the informational fragmentation and conflict that pervades much of the digital sphere that we highlighted in the introduction, Wikipedia has become a widely accepted source of shared understanding. It has done through harnessing large-scale, open, collaborative self-governance. Many aspects of this success are idiosyncratic and attempts to directly extend the model have had mixed success; trying to make such approaches more systematic and pervasive is much of our focus below. But the scale of the success is quite remarkable. Recent analysis suggests that most web searches lead to results that prominently include Wikipedia entries. For all the celebration of the commercial internet, this one public, deliberative, participatory, and roughly consensual resource is perhaps its most common endpoint.
Yet while Wikipedia is the most commonly visible illustration of this ethos, it is one that pervades the foundations of the online world. Open source software (OSS) is the ethos from which Wikipedia sprang and exemplifies the significant impact of participatory, networked, transnational self-governance. OSS, embodied most notably in the Linux operating system, underpins most public cloud infrastructures and intersects with the digital lives of many via platforms like GitHub, which claims over 100 million contributors. The Android OS, which powers over 70% of all smartphones, is an OSS project, despite being primarily maintained by Google.
OSS serves as a counter-reaction to the commercialization and secretive nature of the industry that emerged in the 1970s. The free and open development approach of the early days of ARPANET was sustained even after the withdrawal of public funding, thanks to a global volunteer workforce. Richard Stallman, opposing the closed nature of the Unix OS, led the "free software movement", promoting the “GNU General Public License” that allowed users to run, study, share, and modify the source code. This was eventually rebranded as OSS, with a goal to replace Unix with an open-source alternative, Linux, led by Linus Torvalds.
OSS has expanded across various internet and computing sectors, even earning support from formerly hostile companies like Microsoft, now owner of GitHub. This represents the practice of Plurality on a large scale; emergent, collective co-creation of shared global resources. Communities form around shared interests, freely build on each other’s work, vet contributions through unpaid maintainers, and "fork" projects into parallel versions in case of irreconcilable differences. The protocol “git” supports collaborative tracking of changes, with platforms like GitHub and GitLab facilitating millions of developers' participation. This book is a product of such collaboration.
However, OSS faces challenges such as chronic financial support shortage due to the withdrawal of public funding, as explored by Nadia Eghbal (now Asparouhova) in her book Working in Public. Maintainers are often unrewarded and the community's growth increases the burden on them. Nonetheless, these challenges are addressable, and OSS, despite its business model limitations, exemplifies the continuance of the open collaboration ethos (the lost dao) that Plurality aims to support. Hence, OSS projects will be central examples in this book.
Another contrasting reaction to the shift away from public investment in communication networking was exemplified by the work of Jaron Lanier, whom we mentioned above and who followed more in the shoes of cultural designers like Engelbart and Nelson than those of organizational leaders like Cerf and Crocker. A student and critic of AI pioneer Marvin Minsky, he sought to develop a technological program of the same ambition as AI, but centered around human experience and communication. Seeing existing forms of communication as being constrained by symbols that can be processed by the ears and eyes like words and pictures, he aspired to empower deeper sharing of and empathy for experiences only expressible by sense like touch and proprioception (the internal sense). Through his research and entrepreneurship during the 1980s, this developed into the field of "virtual reality", one that has been a continual source of innovation in user interaction since, from the GLOVE to Apple's release of the Vision Pro.
Yet, as we highlighted above, Lanier carried forward not only the cultural vision of the computer as a communication device; he also championed Nelson's critique of the gaps and failings of what became the internet. In books like You are Not a Gadget and Who Owns the Future?, he particularly emphasized the lack of base layer protocols supporting payments, secure data sharing and provenance and financial support for OSS. This advocacy was central to inspiring a wave of on these topics in and around the "Web 3 community" that harnesses cryptography and blockchains to create shared understanding of provenance and value. While many projects in the space have been influenced by ES and hyper financialization, the enduring connection to original aspirations of the internet, especially under the leadership of Vitalik Buterin (who founded Ethereum, the largest smart contract platform), has inspired a number of projects, like GitCoin and decentralized identity, that are central inspirations for Plurality today.
Other pioneers on these issues focused more on layers of communication and association, rather than provenance and value. Calling their work the "Decentralized Web" or the "Fediverse", they built protocols like Christine Lemmer Webber's Activity Pub that became the basis for non-commercial, community based alternatives to mainstream social media, ranging from Mastodon to Twitter's now-independent Blue Sky initiative. This space has also produced many of the most creative ideas for reimagining identity and privacy with a foundation in social and community relationships.
Finally and perhaps most closely connected to our own paths to Plurality have been the movements to revive the public and multisectoral spirit and ideals of the early internet by strengthening the digital participation of governments and democratic civil society. These "GovTech" and "Civic Tech" movements have harnessed OSS-style development practices to improve the delivery of government services and bring the public into the process in a more diverse range of ways. Leaders in the US include Jennifer Pahlka, founder of GovTech pioneer Code4America, and Beth Simone Noveck, Founder of GovLab. Yet despite these important impacts, the most dramatic illustrations of these movements are in countries where they managed to transform the national character overall, so that we can see their potential played out on the stage of nations.
After all, as inspiring globally distributed examples are, most of us still think of examples of societies in terms of nation states and thus look to countries as exemplary models. While many societies around the world have exemplified the ideals and employed the tools of Plurality, two exemplars stand out. Both Estonia and Taiwan share governments influenced by plural political thought, especially the ideas of Henry George, and facing dire and persistent threats from aggressive authoritarian neighbors. Drawing on this history and spurred by these challenges, they have pioneered the application of plural technology to shape democracy and the public sector.
Estonia paragraph
Yet while Estonia pioneered the use of Plurality to transform a national government, both its size and its very early development limited what it could achieve. While it took longer to fully develop, in the last decade a small and mountainous island became the world's clearest example of a different path. Its story animates our next chapter.