BOOK REVIEW: Drowning in Information, Starving for Knowledge

April 10, 2018

Information Overload Paradox: Drowning in Information, Starving for Knowledge.
By L. V. Orman. Seattle, WA: CreateSpace Independent Publishing Platform, 2016, 190 pages.

 

Information overload is not a new phenomenon; it is part and parcel of modern life. In this vein, Georg Simmel suggested as early as The Metropolis and Mental Life [1] that overwhelming stimuli transform the psyche of urban individuals and lead them to develop a blasé attitude. Social scientists have sought to understand “information overload,” its determinants, consequences, and remedies ever since. A keyword search for “information overload” in Google Books today yields more than 300,000 hits! So is there anything new in Orman’s book?

Orman poses “information overload” as a paradox and carries out the daunting task of drawing on a large body of literature to give us three mechanisms through which the paradox arises. The paradox is that technologies help us know more, but in the process, we know less. In a Simmelian world, this is not an entirely novel proposition. However, Orman’s simplification of the problem, along with pertinent evidence, makes the mechanisms a compelling narrative. As we move rapidly toward “ubiquitous computing,” Orman’s effort is timely. In summary, Orman’s mechanisms are as follows:

  1. Substitution: As people substitute “cheap for expensive,” “complex for simple,” and “formal for informal,” a large quantity of information drives out high-quality information.
  2. Obsolescence: Changes in technologies often require organizational adaptations and specialization. Consequently, old but useful information, methods, and practices get lost.
  3. Competition: Information overload makes information a competitive weapon. Social actors competing for limited resources might mislead each other through deliberate misinformation.

Although it is not fully clear how the three mechanisms fit together or where the boundary of their explanatory power lies, Orman does a fine job of illustrating them individually. He also makes the problem of information overload appear manageable and solvable. When it comes to solutions, however, Orman leaves us with some contradictions and sidesteps some existing solutions as well. We examine these solutions below and then propose considering the interaction of the three mechanisms.

First, he prescribes “liberalism” and “protectionism” at the same time without addressing why apparently inefficient organizations may monopolize social life. According to Orman, we need to allow institutions to compete so that we do not end up with quick, irreversible substitutions; whereas, to prevent obsolescence, we need to practice “cultural protectionism.” He argues that organizations such as the state, family, and church are monopolies that serve as breeding grounds for irreversible substitutions: even if these organizations adopt inefficient practices, the practices become almost impossible to change. So we need competition, such that practices come about through small-scale experimentation. While Orman’s illustrations are appealing, he does not explain why such organizations survive despite being inefficient monopolies. The causes are, of course, multifaceted and have long been subject to debate. For instance, in their provocative book Why Nations Fail, Acemoglu and Robinson show that states are inefficient because their extractive political institutions give some people an unequal opportunity to usurp resources and power [2]. So, if Orman is to prevent states or nations from adopting inefficient, irreversible practices, he should address the causes (e.g., Acemoglu and Robinson’s extractive political institutions).

Second, Orman suggests that the use of “trust partners” can save people from misinformation, but he does not elaborate on the downside of such intermediaries. Here, trust partners are various information intermediaries that are rated and trusted by their users on an ongoing basis. We already have many trust partners around us: websites such as TripAdvisor and Expedia are partners for hotel information; credit rating agencies are partners for credit information; auditors are partners for reliable financial statements. In a simple version of Orman’s framework, any important information flows through such trust partners, and in turn, users rate the partners. Thus, individuals would eventually have trust partners in every walk of life for reliable information. Even though such a social design may have apparent merits, it is not without adverse consequences. One simple reason is that trust partners compete among themselves. Given that information quality is difficult to ascertain, trust partners may adopt misinformation to remain competitive. Indeed, the financial crisis of 2008 showed that such competition among credit rating agencies facilitated the large-scale commercialization of sub-prime mortgages [3]. Moreover, given that information products have large economies of scale, what will prevent trust partners from becoming monopolies?

Finally, Orman underemphasizes the role of existing social relations and institutions (e.g., families, friends, group memberships, kinship, and status) in addressing issues of information quality and trust. In his pioneering work on the role of religion in capitalism, Max Weber showed that sect membership, a form of voluntary relation, became a stamp of trust for borrowing and lending transactions in U.S. financial markets (see “Churches and Sects” in [4]). Over the last five decades, a multitude of evidence has substantiated various aspects of this point. Thus, a pertinent follow-up question for Orman would be whether the “information overload paradox” changes the nature of these relations and institutions. For example, “has information overload changed the way individuals form and trust family, friends, and groups?”

As discussed above, Orman lays out a simple, elegant framework for understanding the paradox; however, interested research communities may find it useful to examine the ways in which the mechanisms (“substitution,” “obsolescence,” and “competition”) interact. An example illustrates this point. Assume an individual has developed a trust relationship with an online, interactive news community as a reliable source for interpreting social events. This apparently saves the individual from substituting low-quality information for high-quality information, i.e., the “substitution” problem. But the danger lies in the potential for repeated interaction among community members to give rise to a clan-like culture, leading the whole community to think and act alike. This phenomenon has often been described as the “echo-chamber effect” [5]. Moreover, there is ample evidence that human social lives (ideologies, views, living conditions) correlate with those of similar others, fortifying the echo-chamber effect [6]–[9]. Formed for a narrow issue or purpose, trust partners or communities could thus develop tunnel vision among their members, leading to a loss of knowledge and practices in various other areas of life. As a result, trust communities subject to the echo-chamber effect may become a source of “obsolescence” as well. Furthermore, echo chambers can develop an identity of their own and become a source of fierce “competition” and misinformation. Based on this reasoning, we suggest considering Orman’s three mechanisms simultaneously in order to understand the information overload paradox and its full consequences.

Reviewer Information
Abdullah Shahid is a Ph.D. student in the Department of Sociology at Cornell University, Ithaca, NY. Email: ais58@cornell.edu.
Ningzi Li is a Ph.D. student in the Department of Sociology at Cornell University, Ithaca, NY. Email: nl323@cornell.edu.

 
