Information Paradox: Drowning in Information, Starving for Knowledge

By on June 29th, 2017 in Editorial & Opinion

We are living in an age of information. Staggering amounts of information are collected, stored, and widely disseminated. Yet, we may be less informed and less knowledgeable than ever. This paradox of increasing information, yet decreasing knowledge and insight, has many possible causes, some of which are subtle and difficult to identify, and even more difficult to remedy. The fundamental issue is quantity crowding out quality, leading to an abundance of poor-quality information which may not be a good substitute for scarce but high-quality information. Information is not unique in exhibiting this paradox. There are many other goods that exhibit this unusual characteristic of more being worse than less. Those who eat the most food are rarely the healthiest people, and they may actually be severely deficient in some nutrients. Those who have the most Facebook friends are often the loneliest people. Those who are the busiest are not the most productive. Those who read books and watch television the most are sometimes the least knowledgeable. All of these examples point to the pervasiveness of this paradox, but it is most insidious with information, critical in an information economy, and most difficult to overcome in a modern society dominated by communication technologies. In an information economy, we may be shipwrecked, surrounded by an ocean of water, yet dying of thirst!

There are three fundamental reasons why quantity may crowd out quality. The most obvious is the production cost problem, where the emphasis on quantity shifts emphasis and resources away from quality. It is costly to produce quality information, and it is difficult to deliver both quality and quantity. When quality does not pay in proportion to its high cost, quantity wins out. This is also the most common explanation for the non-information examples, but the explanation for information products involves two other reasons. The second reason is the obsolescence problem. Information is not neutral with respect to the physical world; it is an agent of change. Information is useful precisely because it is used to change the environment and subjugate nature and society to our purposes. But as information is used to change the environment to take advantage of new opportunities, our existing information about the environment becomes obsolete, leading to a loss of information. The net effect may be positive or negative, but, as we will show, it is increasingly negative in a fast-changing, information-intensive society. The third reason is the competition problem, where information is used as a competitive weapon against others, to mislead and confuse them, leading to a loss of knowledge on their part. Information is power, because it can be used to control and exploit others by controlling their information sources, and consequently their behavior. But replacing reliable information with distorted and misleading information leads to a net information loss on their part. More importantly, if everybody uses the same tactics, leading to an information war, everybody may end up worse off, with a significant loss of knowledge and insight by all. We describe all three reasons in detail in the following sections.

Information Cost

The American writer Henry David Thoreau, upon observing the excruciatingly hard work of laying telegraph lines near his home, famously lamented: “We are in great haste to build telegraph lines from Maine to Texas, but do we have anything important enough to communicate to justify this kind of cost and effort? We are eager to tunnel under the Atlantic Ocean to bring the old world to the new, but the first news that will arrive at the American ears will probably be that Princess Adelaide has the whooping cough!” He probably would have made the same comment about the Internet [14].

Information is costly to produce, especially quality information, but cheap to disseminate once the infrastructure has been built. Modern communication networks have reduced the cost of dissemination drastically, but producing quality information remains costly. As a result, the incentive is to emphasize cheap dissemination rather than expensive production: produce large quantities of low-quality information, but disseminate it widely. This explains the celebrity culture, where some people succeed on the basis of name recognition alone, without any particular accomplishment, while those with extraordinary accomplishments often remain obscure, poor, and unappreciated. Consider musicians. Quality music is very costly to create, and there are very few composers and songwriters who produce high-quality music. They often live in obscurity, because high-quality music rarely has the mass appeal needed for large-scale distribution, and in an era of cheap distribution, mass appeal dominates financial decisions. Mediocre music with better financial prospects then crowds out high-quality music. Similar arguments apply to other information products [6].

Economists have known for some time that low quality drives out high quality when it is difficult to distinguish between them, a phenomenon called the “lemons problem.” When the marketplace cannot distinguish quality easily, consumers tend to buy the cheaper alternative to reduce their risk. Those who produce high-quality, expensive products then cannot compete, and they leave the market [9]. Information products fit well into this model, because they are notoriously difficult to distinguish in quality; there are no obvious markers of quality information. Examples abound. People with the most Facebook friends are among the loneliest, because Facebook provides large numbers of low-quality, superficial friendships, but takes away time from building high-quality but time-consuming friendships. When it is difficult to distinguish between the two types of friendship, and to see why one may not be a good substitute for the other, the incentives favor developing the easy friendships with minimal cost. Lemons problem extraordinaire! People who eat the most food are not the healthiest, and in fact they may have serious nutritional deficiencies. That is because they tend to eat low-quality foods, since high-quality foods tend to be expensive and difficult to get in large quantities, and low-quality foods may not be a good source of nutrients in any quantity. Yet when quality is difficult to distinguish, and low-quality foods look and taste similar to their high-quality alternatives, the incentives favor filling up on cheap food. People who are the busiest may not be the most productive, because they fill their time with low-quality, insignificant activities. High-quality activities take planning, organization, and money, and when the distinction is not obvious, the incentive is to fill one’s time with easy but insignificant activities.
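The lemons dynamic described above can be sketched numerically. The following is a toy iteration of Akerlof's adverse-selection logic (my own illustration, not from the article; all numbers are assumptions): sellers know their product's quality, buyers only know the average quality on offer, and the price buyers are willing to pay keeps falling until the market unravels.

```python
def surviving_market_price(value_premium=1.5, steps=20):
    """Iterate the adverse-selection dynamic of a lemons market.

    Product quality is uniform on [0, 1]; a buyer values a product at
    value_premium times its quality, but can only observe the average
    quality of what is offered for sale.
    """
    price = 1.0  # buyers start by offering top value
    for _ in range(steps):
        # only sellers whose quality <= price are willing to sell,
        # so the average quality on offer is price / 2
        avg_offered_quality = price / 2
        # buyers then bid what that average quality is worth to them
        price = value_premium * avg_offered_quality
    return price

print(round(surviving_market_price(), 4))  # -> 0.0032: the market unravels
```

If buyers valued quality at twice the sellers' reservation price or more (`value_premium >= 2`), the toy price would no longer decay, which is the sense in which markets with credible quality signals can survive.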
Those who read the most and watch the most television are not always the best informed, because they read and watch low-effort, low-quality books and programs. High-quality books are hard to read, and when the distinction is not obvious, the incentives favor entertaining your family with easy-to-read books and easy-to-watch programs about celebrities and sports, not complex social and philosophical issues.


Consider academic research. Academic articles all look very similar in style and presentation, but high-quality articles are much more costly to produce. If quality is difficult to ascertain, then the likely outcome is large numbers of low-quality articles, and a proliferation of low-quality fields. The peer review system, although revered by the universities, does not solve this problem, because the peers of low-quality producers are likely to be low-quality producers themselves, and may not even recognize high-quality work. More importantly, peers have no incentive to pay the high price, in time and effort, of evaluating quality. This may explain the proliferation of new fields of study, and the increasing numbers of Ph.D.’s and academic publications, where large quantity invariably crowds out high quality. Harvard economist John Kenneth Galbraith once lamented that the shifting emphasis in economic research from qualitative analysis to empirical data analysis had impoverished the field of economics. That is because the emphasis had shifted from domain knowledge and insight to learning and properly using statistical tools. Qualitative research could not compete with empirical work, not because it was of lower quality, but because empirical work was more easily produced and evaluated, and hence less risky. Ph.D. students could count on finishing on time, and faculty could count the number of publications for evaluation purposes. Qualitative work was risky, subjective to evaluate, and its success difficult to predict. As a result, empirical work crowded out qualitative work, and academic evaluations were reduced to counting the number of publications per year [11].

Similarly, UCLA computer scientist Judea Pearl criticizes social science research as overly concerned with statistical tools, and not enough with insight and policy recommendations. In fact, he argued that the emphasis on statistical tools made social science research less relevant to understanding and solving real social problems. He colorfully pronounced that in every empirical research paper there is one section that is completely nonscientific, and it is titled “conclusions”; that is where the authors often draw policy implications. But policy implications require an understanding of causality, so that one can manipulate causes to change effects. Yet statistical data analysis without experiments rarely provides insight into causality, only correlation, and the policy conclusions drawn are merely opinions with little scientific basis. That is why almost every research paper ends with an urgent plea to do more research to justify its conclusions, and almost never ends a debate conclusively. Moreover, one can potentially find some statistical evidence to support any hypothesis, because a 95% significance test is rather weak when many researchers repeat the same tests again and again with different data sets, ignore the failures, and publish the rare successes. The inability to replicate most research findings is a troubling consequence of this research environment. As a result, the explosion of research activity does not provide better insight into any of the hypotheses tested. This is how quantity crowds out quality even in scientific research [19].
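The weakness of a 95% threshold under repeated testing and selective publication can be seen in a small simulation. This is my own sketch, not from the article; the sample size and the number of research teams are arbitrary assumptions.

```python
import random

random.seed(42)

def null_study(n=100):
    """One study of a hypothesis that is actually false: compare two
    samples drawn from the SAME N(0, 1) distribution at the 95% level."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    diff = abs(sum(a) / n - sum(b) / n)
    se = (2 / n) ** 0.5          # standard error of the difference in means
    return diff / se > 1.96      # "statistically significant" at 95%?

# 1,000 teams test the same false hypothesis on fresh data; failures go
# unpublished, and only the rare "significant" results reach print.
published = sum(null_study() for _ in range(1000))
print(f"{published} of 1000 studies of a false hypothesis were publishable")
```

Roughly 5% of the studies come out “significant” by chance alone, so with enough independent attempts and no record of the failures, virtually any false hypothesis acquires published support.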

Consider a highly publicized research article by Cornell University psychologist Daryl Bem, in which he found statistically significant evidence that humans can predict purely random future events [5]. Such clairvoyance contradicts the laws of physics as we know them. Consequently, this research presents a dilemma: either reject the laws of physics governing time and randomness, or reject the rules of statistical significance. Either option requires invalidating a large amount of existing human knowledge. There is evidence that either choice is possible. The laws of physics may differ at very small and very large scales from the ones that operate at human scale, so human neurobiology may not be governed by the familiar rules of physics; or the statistical significance tests may be too weak, rendering most knowledge in the social sciences unreliable.

Education also suffers from the effects of quantity crowding out quality. The typical American now goes through 15 years of education before being eligible for a reasonable full-time job. Online education is starting to challenge the monopoly of universities, their high cost, and their operations, but not the content of education or its length. There seems to be a near-consensus that more education is always better, and increasingly essential for a high-tech society, irrespective of the content of that education. But most of our education is not about the skills necessary for a high-tech, complex society; it is increasingly about intellectual debates over social and philosophical issues. Such debates dominate liberal arts education, and they are justified as building skills of critical thinking, as opposed to disseminating known and undisputed facts and skills for jobs. These debates generate a lot of information, yet provide no obvious solutions, or even methodologies for arriving at solutions, so their quality is suspect. Every student is expected to think for themselves and reach their own conclusions. The education merely provides some frameworks to clarify and classify the problem and possibly make some cogent arguments, with no obvious resolution of the issues.

One such loud and aggressive debate is between science and religion about the origins of the universe. In fact, neither has anything useful to say about the origins of the universe, and ironically that only encourages debate and generates a large quantity of information. The scientific discoveries of the Big Bang and the expanding universe, as interesting as they are, say nothing about the origin of the universe, unless one can take an arbitrary moment in the history of the universe and call it the beginning. The Big Bang theory requires the existence of mass before it, and the existence of anything before the beginning suggests that it is not the beginning. Religions suffer from the same logical folly. They explain the universe in terms of a creator, but the existence of a creator before the beginning suggests that it was not the beginning, but an arbitrary moment in its history. In fact, the concept of a “beginning” is not explainable using our existing human constructs, yet the amount of information generated by debates between alternative theories fills our libraries and school curricula. A debate between alternative theories often obscures the shortcomings common to all the proposed alternatives, while emphasizing the information that distinguishes between them. Shortcomings common to all are often much more useful to understanding, yet they are driven out by the quantity of information produced by debate in the name of critical thinking [24].

Similar debates dominate the theories explaining the beginning of life, where life is defined as self-replicating organisms. Science and religion are once again adversaries, and neither has a satisfactory explanation. The scientific theory of evolution says nothing about the beginning of life, since evolution requires the existence of self-replicating organisms and as such cannot explain how they came into existence. There are attempts to explain the beginning of life in terms of accidental chemical reactions. But that explanation relies on intrinsic characteristics of the elements comprising the earth, and since we do not know how that matter was created, it merely explains one unknown in terms of another. Accident theories simply defer the explanation of biology to the chemistry of elements, whose origins are also unknown. Religions likewise invoke a creator to explain life, but the creator is typically imagined as a life form, and if life forms existed before the beginning of life, then it was not the beginning. All such theories explain one unknown in terms of other unknowns, and create huge quantities of information without answering any fundamental questions [24].

Information Obsolescence

French poet Paul Valery wrote in 1895: “Western cultures worship information as if it were an omnipotent beast and place no limits on what they seek to know. The Chinese by contrast do not wish to know too much, because they understand that knowledge must not increase endlessly. If it continues to expand, it causes endless change, and creates a need to adjust and abandon age-old traditions and wisdom. You are better off ignorant than stricken with the European disease of constant invention and change, and the debauchery of endless confusion from new ideas” [1]. Paul Valery may have discovered a fundamental paradox of human existence, which goes well beyond the European culture. Humans throughout history appear to have constantly sought to learn more about nature, and use that knowledge to exploit nature to serve human needs. But such information is costly to produce, since nature does not reveal its secrets easily. More importantly, information is dangerous because once acquired, it cannot be ignored or discarded easily, and can have unintended consequences. Most critically, it can lead to the exact opposite of the intended effect, and can end up reducing our knowledge about the world around us. That is the paradox!

Information is power, but only to the extent that it is used to exploit nature, other species, or other humans. For example, scientific discoveries increase our knowledge, but they don’t yield power until they are used to exploit nature by creating new tools and technologies. But exploitation of nature invariably changes the world around us, and makes some of our existing information about that world obsolete. In other words, science increases our knowledge; but science is not terribly useful without technology; technology makes science useful, but then decreases our knowledge of the world by changing the world. Consequently, the net effect of scientific discoveries may be an increase or a decrease in our total information.

This is the fundamental dilemma of obsolescence. In a fast-changing technological society, the rate of obsolescence and the resulting loss of information may overwhelm the rate of new scientific discovery and additional information, leading to a continuing loss of knowledge and insight. This is why older people are increasingly considered ignorant about the world, rather than a source of knowledge and wisdom as they once were. But young people who may be knowledgeable about modern tools and technologies have no access to the wisdom and learning generated over millennia. In a fast-changing technological society, there is no mechanism to acquire the wisdom and knowledge that can only be gained by trial and error over millennia. Because of that loss of information transfer from one generation to the next, we may be more ignorant about the world around us, as the world changes faster with more sophisticated technologies [23].

Consider the agricultural revolution in early human history. Agriculture is information intensive relative to the earlier hunter-gatherer paradigm. It requires a great deal of information about climate, crops, soil conditions, seeds, tilling, fertilizing, and fencing. But acquiring and using all of that information to practice agriculture made thousands of years of accumulated knowledge about our environment obsolete. Agriculture changed the environment, and the ecology of plants and animals in that environment. By adopting agriculture, humans suddenly found that their understanding of the new environment was minimal, and learning had to start from scratch. Agriculture, as information intensive as it is, probably led to the biggest loss of accumulated knowledge in human history, leaving humans largely ignorant about the new environment they had created. We may be paying a price for that ignorance even to this day, because although new information has short-term and immediate advantages, the price paid for the loss of information can be delayed and persistent for long periods of time, as the environment is modified slowly and incrementally.

Information lost is often considerably more valuable than the new information generated, since it takes a long time to generate high-quality information. Loss is sudden, yet learning is very slow. When Europeans colonized Africa, they brought their knowledge of sophisticated transportation technologies with them, and changed the landscape by building roads, ports, and warehouses. They were surprised that the natives always built their villages on top of hills, away from the waterways, and had to carry all the goods arriving by boat a long way up the hill to their villages. The inefficiency of the design amazed them, compared to the efficiency of European settlements built by the water. To the Europeans, it was just one more piece of evidence of native incompetence, lowering even further their already very poor regard for the natives’ intelligence. They quickly forced the natives to relocate their villages to the edge of the water, near the ports, roads, and warehouses, and created huge efficiencies. But within a few years, malaria killed most of them, both native and European, because the water’s edge was where the mosquitos lived! European technologies, as information intensive as they are, had changed the environment and the lifestyles of the locals, and had made a large body of their knowledge of the environment obsolete, leaving them all ignorant about the local diseases and pests [1].

Modern European civilization has been especially disruptive to the environment because of its accelerated generation of new information, and increasing emphasis on using that information to create new tools and technologies to exploit nature. What made modern Europeans especially successful during the past several hundred years also made them especially ignorant about the new environment they were creating. This is where one can see a difference among cultures, as some were more information intensive than others. Many indigenous cultures changed slowly and learned by trial and error. As inefficient as that sounds to the modern scientific mind, that type of slow generation of new information also reduced the rate of obsolescence and information loss, and allowed human societies to balance the loss with gain.

It was also difficult for modern Europeans to appreciate the value of indigenous cultures and their knowledge. In a fast-changing, information-intensive environment, Europeans emphasized logging and recording what they learned, and explaining why and how. In slow-changing environments, indigenous peoples incorporated new information into rituals, myths, and stories. Explanations were not critically important as long as the information was useful and the lessons were right; trial and error does not produce explanations. When Europeans asked why native Africans would live on top of the hills, the answer was often tradition, or some myth about monsters and other dangers in the water. The poor explanations made it impossible for others with different traditions and myths to appreciate and learn from the accumulated wisdom, leading Europeans to ridicule and dismiss indigenous knowledge. But in a slow-changing environment explanations are not critical, as there is time to adapt to the information. Learning by trial and error does not provide explanations; it simply works [1]!

Consider religion. All religions encapsulate millennia of knowledge acquired by trial and error, and incorporate it into their rituals and myths. The explanations are often factually wrong, and the reliance on supernatural forces and myths is often unsatisfactory to the modern scientific mind. But that does not make the accumulated knowledge any less relevant or any less useful for its time. Of course, in a fast-changing technological society that accumulated information becomes obsolete faster and faster, and its relevance grows increasingly dubious. But that loss of accumulated information, not adequately replaced by modern science, is potentially a tragedy in the making. Science cannot replace the knowledge lost when technology development, and the consequent loss of knowledge due to environmental change, outpaces new scientific discoveries [16].

History does not repeat itself in a fast-changing, information-intensive society, and no learning from history takes place. Knowledge of history is not helpful for making predictions or providing guidance when society transforms itself at increasing speed, with ever faster adoption of new tools and technologies.

Yet we continue to teach history to all school children, and consider it critical for literacy, as if it were still relevant to our daily lives, even though history increasingly reads like science fiction, with implausible plots and unlikely characters. Military strategists were shocked by the new realities of warfare in World War I, dominated by static trench warfare, compared to the dynamic battlefield maneuvers by infantry in previous wars. World War II was nothing like World War I, with trench warfare replaced by aerial bombardment of infrastructure. Modern wars are fought from a distance with missiles and drones.

Is there anything a military leader can learn from Alexander the Great and his conquest of Persia that may be relevant to the wars in Iraq, Syria, and Afghanistan? And more importantly, is that inability to learn from history significant and harmful? The new industrial battlefield is drastically different from the ancient battlefields. Ancient battlefields gave options to the warriors by relying on age-old traditions of warfare. The options included flight, surrender, and mutiny, in addition to fighting, because of the high level of autonomy accorded to individuals and small units. Industrial battlefields, with high levels of mechanization, coordination, and man-machine symbiosis, took away those options, leaving no alternative to enduring the horrors of war to its bitter end. Any careful observer of the 1991 Gulf War between the U.S. and Iraq could not help but note the slaughter of the retreating Iraqi army by a mechanical war machine from which there was no escape: no surrender, since there were no visible humans, and no possibility of mutiny, since machines dictated the movements on both sides.

The mechanical slaughter engendered by the trench warfare of World War I was the first experience for individual soldiers of a situation where all options other than the unending slaughter were taken away, since their mobility, vision, and perception of the battlefield were greatly restricted by the mechanical war machine. That war undermined for the first time the assumptions about the rationality of war and the self-determination of individual soldiers. It also undermined assumptions about the rationality of European culture, its science- and technology-based superiority over all other cultures, and its civilizing influence on the rest of the world. The staggering human losses, disproportionate to any political and military gains, made the war appear to be waged by an irrational civilization that had lost its bearings. The premier achievement of Europeans, in science and technology, also appeared to be the primary cause of such a catastrophe of all-out war. The world could see for the first time that along with all the advantages brought about by science and technology, there was also a very high price to be paid in a technological society for the loss of historical perspective and the loss of accumulated information about the rules of warfare and conflict resolution [1], [23].

With the advent of nuclear, biological, and chemical weapons, potentially delivered from long distances or even from space, we face the possibility of annihilating entire nations, permanently damaging ecologies, or making large areas of the world unlivable. Such a risk has never existed before in man-made form, and those who claim that civilization has reduced human violence do not appreciate the concept of risk [20]. Civilization has reduced the ongoing low-level violence that was endemic in earlier societies, but it has raised the risk of catastrophic violence that rarely occurs. Such catastrophic events are notoriously difficult to predict, and notoriously difficult to analyze by collecting statistics about past events, because of their rarity, especially when the risk is rising over time. This is typical of information-intensive societies, where large amounts of low-quality information raise the quality of life in the short run and very quickly. Yet the loss of high-quality long-term information raises the risk of catastrophic losses in the long term, because of the increasing manipulation of the environment and the consequent loss of information acquired over millennia about sustainable lifestyles.
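Why statistics about past events fail for rare catastrophes can be illustrated with a short simulation. This is my own sketch, and the 1-in-500 annual risk is a made-up number for illustration: even a full century of clean records says very little about a risk of that size.

```python
import random

random.seed(1)

p_true = 1 / 500        # assumed annual probability of catastrophe
years = 100             # length of the historical record
trials = 10_000         # simulated alternate 100-year histories

# count how many simulated centuries pass with no catastrophe at all
quiet = sum(
    all(random.random() > p_true for _ in range(years))
    for _ in range(trials)
)
share_quiet = quiet / trials
print(f"{share_quiet:.0%} of 100-year records contain zero catastrophes")
```

Roughly four out of five simulated centuries are completely quiet, so an empirical frequency computed from the record would most often be exactly zero; and if the true risk drifts upward over time, the historical record understates it further still.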

As extensive as the information content of modern wars is, where every detail of the opposing military’s movements is known to both parties, one would expect conflicts to end quickly, as possible solutions and compromises are rapidly identified. Yet the conflicts rage on endlessly, even when there are obvious solutions. With the wide availability of explosives, and the ability to mobilize large populations all over the world, wars never end. Instead they turn into civilian insurgencies, attacks on civilian targets, volunteer fighters arriving from distant lands, sabotage of infrastructure, and even cyber warfare. There are obvious solutions to the wars in Palestine, Iraq, Syria, and Afghanistan, and most people agree on what the final solutions would basically look like. Yet the wars rage on endlessly, with great information intensity in both intelligence and propaganda, because the high-quality information about the rules of war and how to end them, the conventions of victory and defeat, and the traditions of reconciliation has been lost as the environment of war has changed, with new weapons, delivery systems, and new communication and transportation technologies. When history has no educational value, it can be a tragedy for all involved [13].

Warfare is not the only victim of information loss. Business suffers similar consequences. The industrial revolution of the 19th century was a technological marvel, based on the scientific developments of the previous hundred years. It elevated Europeans to global power and hegemony. Yet the industrial revolution also wreaked havoc on the social organization of European societies, and later on the whole world. Mental illness became an epidemic in the U.K. soon after the industrial revolution started. Schizophrenia rates rose by an order of magnitude, so much so that it was called “the English malady.” The curse of mental illness, in the form of schizophrenia, depression, and suicide, spread to the rest of the world along with the industrial revolution. The most likely explanation is that industrialization destroyed craft-based communities and extended families, since family members had to acquire specialized skills and travel long distances to the large centralized factories where they were needed. Suddenly, work was separated from family and community, and people were expected to maintain fractured, disconnected personas, relating to work and family separately. It is not surprising, then, that fractured-personality disorders such as schizophrenia, and social-isolation disorders such as loneliness and depression, skyrocketed. The new technologies led to a loss of age-old, valuable information about community building and mental-health support through the integration of work and family structures [12].

Information Competition

Information is not neutral. It does not merely inform, it guides our decisions and actions. As such, there are incentives to control others’ information, and to distort and corrupt it, to change others’ behavior to serve your interests. In extreme cases, complete control of information means complete control of behavior. Cults and militaries often isolate their members from the general public and control their information. In a very short time, sufficient influence can be exerted to get members to sacrifice themselves for the common cause.

Stockholm Syndrome is another example: a kidnap victim, isolated from the outside world, quickly identifies with her abductors, loves and respects them, and may even defy and resist her would-be rescuers in order to stay with them. An American teenager named Elizabeth Smart was abducted in 2002 from her bedroom at knifepoint by a couple. After a period of isolation, she identified with her captors completely, changed her religion and appearance, and made no attempt to escape although she was left alone for long periods of time [21].

The world is a messy place. Competition for survival often leads to information wars in which misleading and deceiving others to serve your own purposes is a common technique. Much of our existing information is distorted, or even downright wrong, because it serves the competitive purposes of others. A great deal of information is produced to influence rather than to inform, and such flooding of misinformation actually reduces our knowledge and insight, and ironically can leave everybody worse off. Consider spam email. So much of it is produced because spam pays: the receivers bear the cost of sorting through large numbers of irrelevant and even fraudulent messages. Economists call this an externality: others pay part of the cost of your economic benefit. But when everybody sends spam to everybody else, everybody is worse off, because email becomes unusable, with a great deal of low-quality information overwhelming receivers and blocking high-quality information.
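The externality arithmetic behind the spam example can be sketched with hypothetical numbers: when everyone spams, each person's small gain as a sender is swamped by their sorting cost as a receiver.

```python
# Toy externality arithmetic (all numbers hypothetical).
msgs_per_person = 100   # spam messages each person sends
gain_per_msg = 0.01     # sender's profit per message sent
cost_per_msg = 0.05     # receiver's sorting cost per message received

# When everyone spams, each person also receives about
# msgs_per_person messages on average, and pays to sort them.
net_per_person = msgs_per_person * (gain_per_msg - cost_per_msg)
print(round(net_per_person, 2))   # -4.0: spamming pays individually, yet all end up worse off
```

Sending remains individually profitable (the gain is the sender's, the cost is everyone else's), which is exactly why the equilibrium leaves everyone worse off.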

Information may merely inform and give options, but economics often creates necessities and forces action. Information spreads easily and becomes available to your competitors, so if it confers any short-term advantage, it forces action. Consider the nuclear race during and after World War II. The possibility of nuclear bombs, and the rudimentary knowledge needed to build them, was developed during the war by both sides. Neither side could ignore that information and refuse to develop the bomb, knowing that the other might do so. In the process, the information eventually left everybody worse off: some targeted or threatened by the weapons, others with nuclear arsenals aimed at each other.
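The trap described here has the structure of a prisoner's dilemma. A small sketch with hypothetical payoffs shows why neither side can refrain: "build" is the best reply to either choice by the other side, yet mutual building leaves both worse off than mutual restraint.

```python
# Hypothetical payoffs (to "me") in the arms-race dilemma.
payoff = {
    ("build",   "build"):   -10,  # both armed, aimed at each other
    ("build",   "refrain"):   5,  # I gain a short-term advantage
    ("refrain", "build"):   -20,  # I am threatened or targeted
    ("refrain", "refrain"):   0,  # mutual restraint
}

for theirs in ("build", "refrain"):
    best = max(("build", "refrain"), key=lambda mine: payoff[(mine, theirs)])
    print(f"if they {theirs}, my best reply is to {best}")
# "build" dominates, yet (build, build) at -10 is worse for both
# than (refrain, refrain) at 0.
```

The dominant strategy is individually rational but collectively ruinous, which is the pattern the article attributes to every information war.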

Consider agriculture, which may have changed what it means to be human. Archeologists have discovered prehistoric skeletons in Dickson Mounds, IL, showing that the shift from hunting and gathering to agriculture may have had serious negative consequences for humans, such as a 50% increase in malnutrition and a 30% increase in infectious diseases. Stone-age people may have lived healthier lives than the agricultural people who came after them [22]. Why, then, would they make the switch? They may have had no choice once the information was acquired, because agriculture confers a short-term competitive advantage but leaves everyone worse off eventually, much as with weapons systems: agriculture allows people to control other people’s access to food and exploit other people’s labor. The race to exploit others, and to avoid being exploited, cannot easily be opted out of. In earlier hunter-gatherer societies, food was distributed widely, not concentrated geographically, so it was not possible to control others’ food supply. Agriculture concentrated the food supply and fixed it geographically, so people became dependent on a specific plot of land once they had invested their time and labor. Such concentration and immobility encouraged the exploitation of those dependent on a plot of land by those with ownership rights to it.

Warfare followed invariably, to establish ownership rights. In fact, when primatologist Jane Goodall began feeding otherwise peaceful chimpanzees from a central provisioning station, rather than letting them forage across their natural habitat, incessant violence broke out among them almost immediately over access to the limited but centralized food source [22].

Overall human health and longevity may have taken a severe hit from agriculture. The typical human diet went from extreme variety and nutritional richness to just a few types of grain, with occasional meat and dairy. In addition to the reduced nutritional value of the agricultural diet, the diseases deadliest to our species began their rampage with agriculture: high-density populations stewing in their own filth; domesticated animals in close proximity, adding their excrement, viruses, and parasites to the mix; and extended trade routes facilitating the movement of contagious pathogens. The Waorani Indians of Ecuador had no hypertension, heart disease, cancer, anemia, or common cold; no parasites, polio, pneumonia, smallpox, chicken pox, typhus, typhoid, syphilis, tuberculosis, malaria, or hepatitis. Most of these diseases either originated in domesticated animals or depend on a high-density population for transmission [8], [22].

Similar arguments apply to modern, information-intensive conflict and warfare. It, too, leaves everyone worse off eventually, after protracted wars, insurgencies, revolutions, and terrorism. Yet we cannot get away from it, because of the great short-term advantages it confers on the winners. Israel would have been better off paying the Palestinians for their land and resettling them, instead of driving them into refugee camps and instigating sixty years of warfare and terrorism. The U.S. would have been better off in the long run without African slavery, avoiding a bloody civil war and the 200-year struggle to remedy the damage done by slavery. But there is a tremendous impulse to do things cheaply in the short run and derive quick benefits. There is a great deal of simple information about the short run, and it is easy to use that information to derive quick benefits; long-run planning, by contrast, requires rare high-quality information, insight, and wisdom [27].

Information about international politics is mostly about glorifying our own position and demonizing the other side. None of that information leads to insight about compromises and solutions. We glorify the generals who win wars and fill history books with their exploits, but we dismiss as weak the civilian leaders who compromise. There are obvious compromises available in most international conflicts and domestic policy disputes, but there is no glory in advocating them, because that advocacy requires de-escalating the information war and asking difficult questions about the interests of the other parties. It is easy to see that the current problem in Iraq and Syria is not religious extremism, although that is the low-quality information that is easy to disseminate and debate; it is the existence of large oppressed populations in those countries. ISIS is just the violent face of the large angry populations supporting it. Killing the violent face will not solve the underlying problem. Permanent solutions require solving the resource allocation problems [13].

Domestic policy issues similarly suffer from large amounts of misinformation. The gun control debate in the U.S. has obvious solutions, but one first needs to set aside the arguments about constitutional rights, background checks, and gun safety, and ask more fundamental questions, such as why rural populations are overwhelmingly supportive of unlimited gun rights at the expense of urban populations, which suffer most of the deadly consequences of gun violence. Urban-rural conflicts are common both in the U.S. and elsewhere. Rural populations have been economically devastated by the industrialization of agriculture, yet the urban population was never terribly sympathetic to their plight, and simply urged them to accept the economic consequences of modern technology. Here, then, is a technology with the opposite consequences: guns are devastating to the urban population, yet advantageous to rural folk. Is it any wonder that the rural population is unsympathetic to the gun problem, which has deadly consequences for congested and crime-ridden urban areas, yet is barely a nuisance in sparsely populated rural areas devoted to hunting and fishing? Improving economic conditions in rural America would go a long way toward alleviating rural hostility toward the urban population, and rural voters’ unconditional support for gun rights merely because those rights give them an advantage over the urban population [17].

Similarly, there is an obvious solution to the abortion debate, but one has to discard the avalanche of information and debate about when life starts, when the fetus is viable, and the morality of destroying fetuses, and focus on a fundamental political question: why do homemakers and their partners overwhelmingly oppose abortion, while professional women and their partners overwhelmingly support it? One can conclude from that observation that the abortion debate is a political struggle between professional women, who benefit greatly from controlled pregnancy in order to focus on their careers, and homemakers, whose economic livelihood rests on childbirth and child rearing rather than on abortion. It also helps to observe that it has become increasingly difficult to make a living as a homemaker without taking on extra jobs outside the home, which has fueled traditional homemakers’ hostility toward professional women, who made huge economic gains at their expense. The appropriate solution to the abortion debate, then, would be to improve the economic livelihood of homemakers, for example by professionalizing homemaking, rather than debating morality and the biology of reproduction [17].

Numbers and statistical reasoning can make matters worse by creating even more low-quality, unreliable, and misleading information. Numbers oversimplify facts, obscure complexity, and embed hidden choices, and as such they rarely illuminate. Consider the statistics showing that human life expectancy has been rising steadily since the invention of agriculture. Here is a typical misleading story involving numbers. Two cavemen are chatting, and one tells the other: “Something isn’t right. Our air is clean; our water is pure; we get plenty of exercise; everything we eat is organic and free range – yet nobody lives past 30.” Something is wrong all right, and it is the statistics. Life expectancy at birth may have been 30, but that is very misleading: life expectancy at 45 was another 23 years of life, bringing it to 68, close to current levels. The reason for the low life expectancy at birth was high rates of infant mortality, and also infanticide: babies were not considered fully human, and disabled ones were probably killed. Average life expectancy across diverse cultures is not very meaningful, because the definition of a human life varies. If you include sperm in the computation, life expectancy drops to seconds; if you include people who can be kept in a vegetative state for decades, it rises dramatically, as it has more recently [22].
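The averaging effect is easy to verify with a toy cohort (the proportions below are illustrative, not from the article): heavy infant mortality pulls mean age at death down to about 30 even when survivors routinely reach their late sixties.

```python
# Illustrative cohort of 100 people (hypothetical proportions).
infant_deaths   = 57   # die around age 1
adult_survivors = 43   # live to about age 68
total = infant_deaths + adult_survivors

mean_age_at_death = (infant_deaths * 1 + adult_survivors * 68) / total
print(round(mean_age_at_death, 1))   # about 30, despite long adult lives

# Conditional life expectancy for someone who reaches 45:
print(68 - 45)   # 23 more years, as in the text
```

The "life expectancy of 30" is an artifact of the weighting, not a statement about how long adults lived.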


Similarly, it has been argued that our modern lives are much less violent than at any other time in history [20]. Those statistics are misleading. They certainly do not include the psychological violence we prefer to inflict on people such as prisoners, the overweight, and the unattractive. Even the level of physical violence varies greatly across geographical locations and time periods. Dresden and Hiroshima during World War II were probably more violent than anything our ancestors ever experienced or even imagined. Averaging those out with the wealthy and peaceful suburbs of America gives a distorted view of what people actually experience; violence in the modern world is localized to war zones and poor urban slums. More importantly, a large world population distorts the violence statistics: one cannot simply look at percentages and gain insight. Otherwise, the Biblical account of Cain killing Abel, when the world population was 4, amounts to a violent death rate of 25%, equivalent to nearly 2 billion deaths today. The amount of violence is better measured by absolute numbers in specific places and times, to reflect what the people there actually experienced. Statistics in the form of averages are a main source of the low-quality information that prevents insight and understanding [26].
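The Cain-and-Abel arithmetic can be checked directly (the world population figure below is an approximate 2017 value): applying a 25% violent-death rate to today's population yields a toll near 2 billion.

```python
# One violent death when the world held 4 people.
rate = 1 / 4
world_population = 7.5e9          # approximate 2017 figure
equivalent_deaths = rate * world_population
print(f"{equivalent_deaths:,.0f}")   # 1,875,000,000 -- nearly 2 billion
```

The same *rate* corresponds to a vastly different *amount* of violence as population grows, which is the distortion the paragraph objects to.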

Economic statistics similarly hide important insights. GDP measures the total value of the goods and services produced by a nation, and it is often used as a measure of wealth. By that measure, primitive and more traditional societies are often deemed poor, because they rely partly on non-cash economies. For the same reason, development always seems to increase wealth, despite the immense poverty and suffering it creates in some parts of the world. By this measure of wealth, if the river is clean and everybody freely drinks the river water, that contributes nothing to the wealth of the nation – but if the river is polluted and everybody has to buy bottled water for cash, that is included in the wealth of the nation and counted as growth. If everybody is mentally healthy, with vibrant communities and extended families, that does not contribute to the wealth of the nation and is counted as poverty; but when the mental health of a community declines, and people pay cash for help from professionals, that contributes to the cash economy and is counted as growth and wealth. These are perverse measures of wealth and growth, and they seem almost designed to make modernity look more desirable in the ongoing competition between modernity and tradition. After all, some benefit immensely from modern development and the destruction of traditional societies [11].
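The river example reduces to a toy accounting identity (all figures hypothetical): only cash transactions enter the measure, so pollution that forces purchases registers as growth.

```python
# Toy GDP: the sum of cash transactions, and nothing else.
def gdp(cash_transactions):
    return sum(cash_transactions)

# Clean river: 1000 people drink free river water -- no transactions.
clean = gdp([])
# Polluted river: the same 1000 people each buy $2.00 of bottled water.
polluted = gdp([2.0] * 1000)

print(clean, polluted)   # 0 vs 2000.0 -- welfare is no better, yet GDP "grows"
```

The measurement choice, not any change in well-being, produces the apparent wealth.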

Businesses also generate huge amounts of information that is designed to mislead, not to inform. Advertising is a fundamentally faulty model that relies on sellers to provide information about their own products, although sellers cannot be expected to be an unbiased and reliable source of information about their own products. In fact, advertising is generally expected to be misleading, and consumers go to great lengths to avoid it. This leads to an information war, with sellers spending ever larger amounts on advertising to overwhelm the defensive avoidance tactics of consumers, and consumers spending ever more effort protecting themselves from unwanted advertising. Like all information wars, it becomes a vicious cycle, with constantly diminishing returns to effort and investment, which leaves everyone worse off. All parties have an interest in reliable, unbiased information about products, but advertising by sellers cannot provide it by simply producing large quantities of unreliable and biased information [4], [27].

The fundamental claim is that if all competitors advertise freely, consumers can sort through everyone’s exaggerated claims and find the truth in the middle. That claim is likely to be false, and the same argument applies to many political, economic, legal, and social debates. Such adversarial systems generate a great deal of information supporting various points of view, all of it biased in one direction or another. The truth may indeed lie somewhere in the middle, but it cannot be discovered easily when nobody has an incentive to argue for the actual truth, and every party has an incentive to distort the truth in its own direction. It is difficult to find the truth in a court of law, or in a political election, when everybody argues for their side by distorting the truth.


The information content of a society determines the social problems it faces. Previous changes in the human information repository created fundamental social problems. The knowledge of agriculture replaced hunter-gatherer societies with fixed communities with permanent borders. That allowed the exploitation of others’ labor, and created permanent social classes, because people could not easily leave their physical location. The knowledge of industrialization replaced blood-based communities with work-based communities. That created merit-based societies, with conditional acceptance and rejection of those who could not perform, leading to work as the basis of identity and personal value, and to a decreasing emphasis on family and child rearing. It created the constant risk of losing one’s community identification when one cannot perform adequately, leading to loneliness, depression, and anxiety-related illnesses. A digital economy is now replacing work-based communities with interest- and lifestyle-based virtual communities connected by communication technologies. That makes the individual the social and economic unit, with memberships in many ephemeral communities. This is likely to be the age of pretense and opportunism, with no sense of permanence or enduring relationships. Individuals are likely to assume many identities with different personalities during their lifetimes, and even simultaneously. These fractured personalities are likely to lead to a crisis of trust and reliability, and consequently to a plethora of cognitive illnesses such as attention deficit disorder, paranoia, and schizophrenia [12].

The solutions require enhancing virtual communities with physical qualities, such as economic cooperation; the sharing of resources such as housing, food, and sex; cooperative child rearing; and joint vacations. Enforcing community principles and lifestyles may create a sense of identity and permanence that transcends the virtual world and spills over into the physical world [2].


Levent Orman