Convenience is the most underestimated and least understood force in the world today. … In the developed nations of the 21st century, convenience — that is, more efficient and easier ways of doing personal tasks — has emerged as perhaps the most powerful force shaping our individual lives and our economies. This is particularly true in America, where despite all the paeans to freedom and individuality, one sometimes wonders whether convenience is in fact the supreme value [1].
The Jewel Wasp
According to William E. Rees, “[a] parasite is an organism that gains its utility by sapping the vitality of its host” [2]. A vivid illustration can be found in WIRED science writer Matt Simon’s latest book, Plight of the Living Dead: What Real-Life Zombies Reveal about Our World – and Ourselves. Simon fastens on the startling machinations of “real” parasites and how they turn their hosts into agents of their own propagation. In his opening chapter, Simon zeroes in on Ampulex compressa, the jewel wasp, its “hypnotizing beauty — with big eyes and a precious green sheen to its body — [belying] its belligerence” [3]. The unlucky target of her design is a roach. In her first move, the wasp injects her venom-loaded stinger between the roach’s front legs. Included in the venom’s brew of more than two hundred compounds is a weaponized, central-nervous-system-inhibiting neurotransmitter, gamma-aminobutyric acid (GABA). For five minutes, GABA paralyzes the roach’s front legs, leaving it unable to block the wasp’s next move. The unobstructed wasp withdraws her stinger and redeploys it, penetrating her now defenseless victim’s neck and, from there, its brain. The wasp, for now at least, is not trying to kill her prey as she feels around for the right spot [actually two spots] in the brain to inject her mind-control potion [3]. These precise locations govern the roach’s means of movement. After discharging her venom, the wasp again withdraws her stinger and, lo and behold, in a few minutes the roach, instead of being flooded with fear, begins grooming itself as though it hadn’t a care in the world.
Ten minutes after the brain-penetrating injection, while the roach continues its grooming, the wasp begins phase two of her modus operandi, chopping off her victim’s antennae and refreshing herself with its blood. Next comes phase three. Though the once again fully mobile roach could certainly walk away from its impending doom, it doesn’t. The wasp grabs and yanks the base of one of the roach’s severed antennae and, thanks to the venom in its brain, the roach follows her into her den, whereupon she lays her egg on its belly. She then plugs up the den’s entrance to keep “the neighborhood opportunists from consuming her prisoner” [3, p. 5].
Parents have no idea that lurking behind their kids’ screens and phones are a multitude of psychologists, neuroscientists, and social science experts, who use their knowledge of psychological vulnerabilities to devise products that capture kids’ attention for the sake of industry profit.
Unlike the brute-force stinging of an enraged bee, Simon observes, the jewel wasp’s stinger “is far more precise, a rapier to the honeybee’s claymore sword” [3, p. 10]. The chemical that turns the victim into a docile dupe is the same chemical that gives us humans pleasure: dopamine. Interestingly, as an added plus, by stinging the roach’s brain in precisely the right spots the wasp also desensitizes it, putting it into a sleep-like state that allows her to do her work while getting no complaints from her drowsy prey.
Though Simon, like most of us, has no love for roaches, watching one’s step-by-step evisceration, while still alive, made him feel a touch of sadness. “It’s just how the parasite,” he writes, “pounces and methodically drives its stinger into the roach’s brain and lingers, with the victim’s head bent in a silly way, staring at [him as if] asking for help” [3, p. 5].
The key component in the wasp’s brain-swaying venom injected straight into the roach’s central nervous system — dopamine — is, interestingly, also at work in our ongoing affairs with technology, applying especially, but not only, to the slot-machine-in-hand, head-bending hooks of the smartphone. Though the manipulation of us humans seems far removed from that of a parasitic wasp on a roach (the jewel wasp harbors no pretense of doing good by her host), it is self-similar, self-similarity being the key ingredient in the fractal functioning of the universe.
Self-Similarity
In “Nature Adores Self-Similarity,” cosmologist Robert L. Oldershaw writes that “were it not for [the elegant design strategy of self-similarity], we would be in dire straits… We would be unable to breathe without the critical self-similar architecture inside our lungs… We couldn’t digest food, but that wouldn’t matter; there would be little to eat because, in the absence of self-similarity, the Earth would be virtually devoid of vegetation.” Then again, digestion and breathing wouldn’t matter because, with neither body nor brain, we wouldn’t exist [4].
Self-similarity, as Oldershaw proposes, applies up and down the scales, from the origins and evolution of life to the affairs of living organisms and their biological and social organizations. The range of ways in which we, that is all of life, share self-similar patterns, though unfathomably vast, includes the means by which parasites massage their prey into serving their interests. One of those patterns, arguably the pattern behind all patterns, the self-similarity pulling the strings backstage of all self-similarities, is the second law of thermodynamics. Australian astrophysicists Charles H. Lineweaver and Chas A. Egan cogently express this in their paper, “Life, gravity and the second law of thermodynamics.” “Entropy,” they claim, “is the unifying concept of life because the second law is universal; it applies to everything. Man, [woman], machine, microbe, or the entire cosmos — there is no scale or material to which the second law does not apply” [5]. As physicist Carlo Rovelli puts it in his recent book, The Order of Time, “It is entropy, not energy, that drives the world” [6].
The Second Law insists that no matter what happens in the universe, the sum total of drained possibility, entropy, always increases. If entropy shrinks in a particular system, or system of systems,1 as it is so exponentially doing in the onrushing juggernaut of technology, it must, and will, increase all the more in the rising system’s environment. What or who is the environment of technology? Answer: as bite-back of our collective parasitic sapping of the planet’s vitality — the clear-cut, burned, and consumed rainforests, the “sixth extinction”-scale consumed fellow species, the heating-up, plastic-consumed oceans, the pollution-consumed air, the tailings-and-lead-consumed fresh water, the fossil-fuel-consumed glaciers… — we are the hooked-on-dopamine, intermittently rewarded environment of cutting-edge technical advance.
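In standard thermodynamic shorthand (a minimal formulation added here for orientation, not drawn from the cited papers), the bookkeeping reads:

$$\Delta S_{\mathrm{total}} = \Delta S_{\mathrm{system}} + \Delta S_{\mathrm{environment}} \geq 0,$$

so a local decrease, $\Delta S_{\mathrm{system}} < 0$, is permitted only if the environment absorbs an offsetting increase, $\Delta S_{\mathrm{environment}} \geq |\Delta S_{\mathrm{system}}|$.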
The first step in getting a grip on what our ever-deepening tech dependency is doing to us is asking why, if total entropy always increases, the Second Law allows entropy to shrink at all. The answer is that by allowing the local self-organization of matter and energy that ultimately led to life, the universal sum of entropy more effectively, more efficiently increases. As American evolutionary chemist Jeffrey Wicken (1942–2002) put it in “Evolution and Thermodynamics: The New Paradigm,” “dissipation through structuring is the strategy of life” ([7]; see also [8], [9]). It is, in the words of American marine geologist and ecological thermodynamicist Eric Schneider and ecological scientist and policy-maker James Kay (1954–2004), “order emerging from disorder in the service of causing even more disorder” ([10]; see also [11]). Lineweaver and Egan second the motion, claiming that “life and the second law are allies… the maintenance of a highly ordered structure increases the disorder in the universe more than would be the case without the structure” [5].
How does this link to the claim that we are the being-dissipated environment of cutting-edge technics, parasitically feeding on its human host? One possible tie-in is that elite-driven scientific and technical advance self-similarly parallels the incoming high-power (low-entropy) energy from the sun, which causes matter in jackpot goldilocks contexts to self-organize into the stepping stones to life. The parallel is the high-stakes, all-out, racing-to-win drive to congeal the order in software and hardware into more and more powerful configurations that remove, for their consumers, more and more of the need for exertion. Since effort is what keeps our brains, bodies, and social engagement skills in shape and, for children especially, growing, the systematically escalating removal of the need for mental, physical, and face-to-face social bonding exertion mirrors the waste heat being discharged into the environment of resonant self-organizing matter.2 Waste heat represents spread-out, full-of-entropy energy eviscerated of its transformative power. In the feeding-on-itself removal of the need for human effort, technology injects the self-similar equivalent of waste heat into its human environment, systematically consuming our present and future transformative powers.
Case in point: dependency on GPS turn-by-turn navigation eliminates the cognitive maps in the human hippocampus that would have been made had the traveler navigated by more attention-demanding means: maps, signs, landmarks, the direction of the sun, a compass, sensing when something doesn’t look right and/or you’ve gone much further than you expected, or just getting out of the car to ask a local. The extremely high order, as radically lowered entropy, in the entire GPS satellite-enabled system is being compensated for by the massive uptick in mental entropy in the millions of drivers who have become navigationally clueless without GPS to get them around.
Still, if it were just GPS-driven mental entropy, we could handle that. The problem is that it’s not just GPS. It’s increasingly, faster and faster, across the board, five million and counting apps for everything. “‘Society is geared in many ways toward shrinking the hippocampus,’ says neuroscientist Véronique Bohbot. ‘In the next twenty years, [she thinks] we’re going to see dementia occurring earlier and earlier’” ([15]; see also [16]). The soaring power, as shriveling entropy, at the cutting edges of artificial intelligence is being paid for by the mental dissipation of mega-millions who ask Amazon’s Alexa/Apple’s Siri/Google’s Duplex… to do the thinking, attending, answering, ordering, remembering, deciding, directing, recommending, reminding, connecting, advising, guiding, alerting, relieving, reserving, amusing, and solving, while surreptitiously imprinting the expectations and behavior of children.
(A New Jersey mother was surprised to catch her son, who knows his first-grade-level arithmetic, asking Alexa “what’s five minus three.” The boy thanked Alexa after she supplied the answer. “He was just being lazy. Taking a shortcut,” said the mother, who will probably “pull the plug on Alexa to make sure her son doesn’t do it again” [17].)
Conventional Wisdom (Not)
The conventional wisdom, what most people auto-believe, is that technical progress is human progress. Good technology is good. Faster technology is better. Easy is good. Easier is better. Convenience is good. Seamless convenience is better. One need only look at the ads to see that this is the background assumption. Internet providers like Verizon tout their download and upload speeds. Since we hate to wait, faster is automatically better. Even at Qualcomm’s initially claimed median speed of about 1.4 gigabits per second (demonstrated maximum, 4.5), 5G is automatically better than current 4G because it is 20 times faster. That means at median speed it would take 17 seconds to download a typical movie, compared to six minutes with 4G. According to Samsung senior vice president Justin Denison, “Rather than remembering to download a season of a favorite TV show before heading to the airport… you could do it while in line to board a plane” [18]. Car companies boast of the latest tech being incorporated in their vehicles. Get too close to the car ahead of you and the car will brake for you. Drift out of your lane for a second and your car will alert you or automatically shift you back in. Have trouble parallel parking? Your car will do the parking for you. Tired of having to pay attention en route? Your just-about-self-driving car will do the attending for you. Don’t know how to get from A to B? GPS will, turn by turn, get you where you want to go. Need some entertainment to pass the time on your commute? There’s TV on your smartphone, and Candy Crush. Need to turn on a light? Ask Alexa. Not sure if you need more milk? Ask Alexa. Don’t feel like getting up off the couch to turn on the upstairs air conditioner? Ask Alexa. Want the answer to five minus three? Ask Alexa.
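The 17-seconds-versus-six-minutes comparison is easy to sanity-check. Here is a minimal back-of-the-envelope sketch, assuming a roughly 3-GB movie and a 4G link running at one-twentieth of the quoted 5G median (neither figure appears in the original):

```python
# Rough check of the quoted 5G vs. 4G download times.
# Assumed, not from the article: a ~3 GB "typical movie" and 4G at 1/20 the 5G median rate.

MOVIE_GBITS = 3.0 * 8               # ~3 gigabytes expressed in gigabits

speed_5g = 1.4                      # Qualcomm-claimed 5G median, in Gbit/s
speed_4g = speed_5g / 20            # "20 times faster" implies ~0.07 Gbit/s for 4G

t_5g = MOVIE_GBITS / speed_5g       # download time in seconds over 5G
t_4g = MOVIE_GBITS / speed_4g       # download time in seconds over 4G

print(f"5G: {t_5g:.0f} s")          # ~17 s
print(f"4G: {t_4g / 60:.1f} min")   # ~5.7 min, i.e., roughly six minutes
```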
(According to Voicebot.ai editor and publisher Bret Kinsella, as of September 2, 2018, Amazon Alexa had 50 000 “skills” worldwide, worked with 20 000 devices, and was used by 3500 brands. Her skills are growing at a rate of 10 000 every 121 days, and there are hundreds of thousands of Alexa skill developers in 180 countries, with developer support growing at least 14-fold in about 20 months [19].)
No Sorcerer for Us
The flattering power of advancing technology, doing more and more for us, makes us feel like the immovable mover gods of old. Seduced by algorithm-vacuumed taps and scrolls, we morph into the apprentice who dons the sorcerer’s magic hat and conjures a broom into relieving him of the chore of transferring water from one fountain to another. While the broom happily scoops, marches, and pours, back and forth, back and forth, the mesmerized apprentice flops on a chair and nods off. He dreams of conducting the oceans, storms, and stars until rudely awakened by falling off his chair into the rising flood. In desperation, he chases and smashes the broom to pieces, to no avail, as each piece self-assembles into a marching army of fetching and pouring brooms. Spinning round and round on the sorcerer’s book of magic, the apprentice hopelessly searches for the turn-off command. As he’s about to drown, the sorcerer, awakened by all the commotion, arrives and sweeps away the flood [20].3
There’s no sorcerer who can arm-wave away the deluge we’ve set in motion.
Epidemic
In The Tipping Point, Malcolm Gladwell wrote that “the best way to understand the dramatic transformation of unknown books into bestsellers, or the rise of teenage smoking [cum-vaping-cum-smoking [22]], or the phenomenon of word-of-mouth, [or the overnight blitz of legal online sports betting ‘with potentially huge implications for how sports are watched, and even played in America’ [23]] or any other mysterious changes that mark everyday life is to think of them as epidemics. Ideas and products and messages and behaviors spread just like viruses do” [24].
Based on a recent DNA study, our species, Homo sapiens, has been around for as long as 350 000 years [25]. The iPhone came on the scene in 2007, 12 years ago. The proportion of our existence as a distinct species that includes iPhones is 0.00003. In other words, for 99.997 percent of Homo sapiens’ existence on Earth, there was no such thing as an iPhone. And yet here we are, nearly everyone thumbing away, head down, texting, tapping, scrolling, flicking, oblivious to their surroundings and everyone in them: their kids, their parents, their friends at dinner.
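Those fractions follow directly from the two quoted numbers; a quick check (assuming the article’s 2019 vantage point):

```python
# Fraction of Homo sapiens' existence that has included the iPhone.
species_years = 350_000            # species age per the cited DNA study
iphone_years = 2019 - 2007         # iPhone era at the time of writing: 12 years

fraction = iphone_years / species_years
print(f"with iPhones: {fraction:.5f}")           # 0.00003
print(f"without: {(1 - fraction) * 100:.3f}%")   # 99.997%
```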
Though we generally don’t see ourselves as the environment of high-tech products like the smartphone, we are. Just as the order and power in GPS technology releases entropic effluents into its human environment as the waste-heat equivalent of shriveling hippocampi, with their downstream potential for early dementia, including Alzheimer’s [26], so too, on the scale of individual human engagement, the entropy being discharged into people, including, or especially, young people, can be seen in smartphone and social media addiction, fine-tuned by design, by the ethically challenged, to capture and latch.
Kelly
“‘We called the police because she wrecked her room and hit her mom…all because we took her phone,’ Kelly’s father explained. He said that when the police arrived that evening, Kelly was distraught and told an officer that she wanted to kill herself. So an ambulance was called, and the 15-year-old was strapped to a gurney, taken to a psychiatric hospital, and monitored for safety before being released.”
Days after being hospitalized, Kelly was brought to the office of child and adolescent psychologist Richard Freed. Freed “asked Kelly… to help [him] understand her perspective on that evening. She didn’t respond and instead glared at her parents. But then, surprising everyone in the room, she cried, ‘They took my f***ing phone!’”
Freed points out that Kelly’s parents are by no means outliers. “Even though [they] were loving and involved, her mom felt that they’d done something terribly wrong that led to their daughter’s problems… What none of these parents understand is that their children’s and teens’ destructive obsession with technology is the predictable consequence of a virtually unrecognized merger between the tech industry and psychology. This alliance pairs the consumer tech industry’s immense wealth with the most sophisticated psychological research, making it possible to develop social media, video games, and phones with drug-like power to seduce young users… These parents have no idea that lurking behind their kids’ screens and phones are a multitude of psychologists, neuroscientists, and social science experts who use their knowledge of psychological vulnerabilities to devise products that capture kids’ attention for the sake of industry profit” [27].
In her new book, The Age of Surveillance Capitalism, Shoshana Zuboff puts it this way: “Facebook’s marketing director openly boasts that its precision tools craft a medium in which users ‘never have to look away,’ but the corporation has been far more circumspect about the design practices that eventually make users, especially young users, incapable of looking away” [28].
From Manipulating Behavior to Addiction by Design
Technology is not, as the general view has it, confined to its products. Technology is all forms of systematized, purpose-driven organization, including human organizations like the for-profit corporation, a technology whose prime purpose is maximizing the making of large sums of money. In pursuit of that goal, creating addiction as an engine of profit is the name of too many games, well captured in the world of slot machines by the title and content of Natasha Dow Schüll’s Addiction by Design [29]. As with slots, as with the epidemic of online sports betting, legal and not, so in the context of smartphones, social media, and apps: maximizing profit takes the form of hooking and latching targets. Forwarding that goal is B. J. Fogg, founder of the Stanford Persuasive Technology Lab, whose aim is to alter human thoughts and behaviors via digital machines and apps. According to Fogg, “‘We can now create machines that can change what people think and what people do, and the machines can do that autonomously’” [27]. Just as the jewel wasp’s exquisitely targeted stinger releases dopamine into her prey’s brain, triggering oblivious-to-looming-danger grooming, “Social-media apps plumb one of our deepest wells of motivation. The human brain releases pleasurable, habit-forming chemicals in response to social interactions, even to mere simulacra of them, and the hottest triggers are other people: you and your friends or followers are constantly prompting each other to use the service for longer” [30].
Tap and Go
The sheer lure of the promise of on-demand, tap-and-go/click-and-go/grab-and-go/scan-and-go/say-and-go/look-and-go/listen-and-go/point-and-go, of not having to wait for anything, anytime, anywhere, is a, if not the, prime driver of the out-of-control sorcerer’s apprentice dilemma. The more technology does for us, the more it serves us, the more convenient it is, the faster it responds to our every string-pulled desire, “seamlessly” — the favorite term deployed to cheerlead technology’s friction-free effortlessness — the better. If you can talk to your smartphone in lieu of making the effort of typing, that’s good. If the machine knows you so well that it can warn you to leave 15 minutes early because it knows there is heavy traffic en route to your child’s soccer game — knowing you have a child, knowing that child plays soccer, knowing what time you take him or her to the game, knowing the route you take — that’s good. If you can simply point your phone at a restaurant to get reviews, that’s, of course, automatically better than having to look up reviews, even if you can just speak the request [31].
In their new book, Re-Engineering Humanity, Brett Frischmann and Evan Selinger claim “[they’re] not interested in the engineering of intelligent machines; [they’re] interested in the engineering of unintelligent humans” [32]. They ask:
“What’s the harm in technology companies making shopping easier for us? Or making it easier for us to get valuable information like directions for how to get to a meeting across town during rush hour traffic? These all seem like good things that enhance our lives. That’s why it would feel catastrophic to lose the technological services that we’ve grown accustomed to. At the same time, however, we’re being sold a misleading vision of cyberservants and digital assistants. These tools don’t just do our bidding. They’re also smart enough to get us to do theirs…We are being conditioned to obey. More precisely, [as with the venom zombified roach obediently following the jewel wasp into her den] we’re being conditioned to want to obey” [32].
The aim of the Stanford Persuasive Technology Lab is to alter human thoughts and behaviors via digital machines and apps.
Ants
From yet another angle, to see what such doing-more-and-more-for-us, always-listening techno-slaves like Amazon’s Alexa and its exploding Internet of Things (IoT) consortium, or the vast satellite-enabled GPS system, are doing to us, consider the only other kind of creature — actually no fewer than 35 species — that enslaves its own kind: ants, and what such enslavement does to the master ants.
The famed biologist, myrmecologist, and prolific author Edward O. Wilson conducted experiments on master/slave ant species. The particular species Wilson was studying was not an advanced slave maker. When their slaves were taken away, the master ants still retained a latent capacity for work that was reactivated, “rapidly taking over most of the tasks formerly carried out by the slaves… [This] latent capacity for working is a capacity that is totally lacking in more advanced species of slave making ants.” But there is a caveat. “The [master] workers that had lost their slaves did not, however, perform their tasks well” and, in fact, would have starved were the slaves not returned [33].
The takeaway: no, we’re not ants, but just as the slave-making ants lost the ability and willingness to do for themselves, even to the extent of perishing from hunger, so do we face the increasing prospect of a losing-it-for-not-using-it future.
According to Wilson, “The evolution of social parasitism in ants works like a ratchet, allowing a species to slip further down in parasitic dependence but not back up toward its original free-living existence” [33]. Though, in Wilson’s view, the master ants are the parasites, in a perverse twist that bears on our master/slave engagement with technology, as per the subtitle of Edward Tenner’s Why Things Bite Back: Technology and the Revenge of Unintended Consequences, the ant slaves are taking revenge by rendering their masters helpless without them. Again drawing on Rees’ characterization of parasitism, the ant slaves, thanks to the ratchet of master-ant dependency they create, parasitically drain the life-preserving vitality of the slave makers. The self-similarity to our own master/slave engagement with technology is plain. On the surface, technology is our slave, and we are its master. We command Alexa. Alexa, coupled to its virally multiplying “skills,” obeys and does our bidding [34]. As with a master/slave ant species, Alexa becomes the parasite, rendering us more and more helpless, dependent, vulnerable [35]. Our human ends, as political theorist Langdon Winner observed in Autonomous Technology: Technics-out-of-Control as a Theme in Political Thought, are being reverse adapted “to match the character of the available means” [36].
Return of the Jewel Wasp
It’s time now to revisit Matt Simon’s tale of the parasitoid jewel wasp and her doomed, zombified dupe, the larva-eviscerated cockroach, with its far from identical, but nonetheless revealing, self-similar parallels to our own ongoing affairs with the techno-juggernaut. The vital difference between the wasp’s zombifying designs on her larva-nurturing-at-its-expense victim and technology’s designs on us is that, mirroring the rise of entropy in the universe itself, the degrading of human order is not uniform. Technology is indeed a double-edged sword. It really does, symbiotically, do things for us, freeing us up to do and be and create what we could not do and be and create otherwise. But as with today’s soaring wealth inequality, technology’s symbiotic gains concentrate in the elite sliver of humanity who are rising with the returns of advancing technology, perching themselves on the high end of the benefit-to-harm distribution because they put the technical order to work amplifying their own purposeful, empowering effort while the multitudes just let Alexa do the work. The result is that technology’s accelerating returns are increasingly going to the already rich, creating the ever-escalating cumulative advantage known as “The Matthew Effect” [37].4 The rich are getting richer across the board of power, access, grasp, skill, income, and net worth at the increasing expense of the vast bulk of humanity, who only think that their smartphone, always at and in hand, is doing it all for them.
The successfully evolved jewel wasp has found the means of ensuring the survival and propagation of its species. Its technique turns the roach into an agent of the wasp’s propagation at the roach’s expense, thus fulfilling the definition of a parasite. It paralyzes its victim’s front legs. It taps into just the right spots in the roach’s brain to release dopamine that turns its zombified dupe into a drowsy, not-a-care-in-the-world grooming machine, while sweeping aside its instinct for self-preservation as an added bonus. Advancing technology represents the concentrating of order that grants its possessor power. That order, that power, is negentropic, shrinking entropy. But since the Second Law won’t allow the sum total of entropy in the universe to diminish, something — some environment of the powering-up systems of technical advance — must pay the dissipative price. My claim is that it is us — just not (for now) all of us.
Will it be AI or IA?
In the goldilocks conditions of the Earth, our home, resonant solar-energy-driven matter self-organized into life as a more efficient means for the Second Law to maximize entropy — the measure of waste heat, of spread-out, rendered-useless energy, of eviscerated possibility. Thanks to our remarkably self-evolved brains, we humans, not necessarily as individuals but as a species, represent the supreme culmination of the 4.5-billion-year evolution of efficiency in entropy production. But the process doesn’t stop with us. It carries on in the accelerating concentration of power as escalating exergy (see footnote 1) — “negative entropy,” as the Nobel laureate physicist Erwin Schrödinger coined it [44] — in advancing technology, with its supreme manifestation in cutting-edge artificial intelligence.
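For readers who want the textbook shorthand behind “exergy” (a standard definition added here for orientation, not taken from the article or its references): the exergy of a closed system relative to an environment at temperature $T_0$ and pressure $p_0$ is

$$X = (U - U_0) + p_0 (V - V_0) - T_0 (S - S_0),$$

the maximum useful work extractable as the system relaxes to equilibrium with that environment; the $-T_0 S$ term is why low entropy, Schrödinger’s “negative entropy,” translates into work-producing potential.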
Two recent articles in IEEE Technology and Society Magazine, “Efficiency Versus Creativity as Organizing Principles of Socio-Technical Systems: Why Do We Build (Intelligent) Systems?” by Ada Diaconescu [38] and “Assessing Artificial Intelligence for Humanity” by Andrzej Nowak, Paul Lukowicz, and Pawel Horodecki (NLH) [39], fasten on the growing concern that the feeding-on-itself advance of AI and its cutting-edge sidekicks (machine learning, deep learning, neural networks, robotics…) poses an increasing threat to our human future unless corralled into serving humanity rather than serving humanity up on a plate.
Both Diaconescu’s and NLH’s essays zero in on the nature of technology as a double-edged sword. Our human future keys on whether AI — with its emphasis on efficiency above all else and adverse human consequences as the unfortunate externality — or IA, “intelligence amplification,” NLH’s “Human Centric AI,” dominates in our future. En route, they make recommendations aimed at realizing the latter. But what is the likelihood of AI enhancing rather than extinguishing us? In their essays, the authors keenly capture the dark side of the sword in many of its facets: the valuing of efficiency as an end in itself rather than a means to a better human end, and the by-and-large unrecognized slippery slope towards the parasitic sapping of human mental, physical, and social order as technology increasingly eliminates the need to use brain, body, and communication skills. Towards remedy, Diaconescu recommends paying keen attention when the language of efficiency is used to justify whatever is being promoted, marketed, and sold as a not-to-be-questioned value in itself, regardless of suitability, targets, and beneficiaries, case by case. Efficiency for what? Efficiency for whom? “[What] are the potential side effects?” “Are the benefits worth the potential risks?” “[What] does an ‘efficient education’, ‘efficient research?’, or ‘efficient healthcare’ even mean?” “[When] patients must be released as soon as possible lest they become negative statistics for a hospital’s performance evaluation, what kind of efficiency is being optimized, and for whom — certainly not for the patients; nor for the community.” Pay attention, she warns, to the kinds of technological devices we develop and/or use. Ask not only what a device can do for us, but also what, along the way, it is doing to us. Use technology “selectively, for specific well-suited tasks rather than by default for everything… [Do] we have to use a smart phone [cum Alexa] to control the lights? Are we giving away valuable data or knowledge? Who benefits? What are the impacts?” In two words, pay attention.
In what they call a “Function-Oriented AI” future, where humanity increasingly looks to AI to do the work, NLH zero in on the losing-it-for-not-using-it equivalent of waste heat in the downslide of human brains, bodies, and societal bonds. They write that “Any skill that is not used decays. As an increasing range of tasks is delegated to AI, humans will lose the knowledge and skills to perform these tasks and will become helpless without AI… In this scenario, human information processing is delegated to AI, and humans just get answers. They don’t gain understanding of the knowledge and processing rules that led to the solutions… In contrast, the concept of Human Centric AI” (IA) “envisions future AI technology that will synergistically work with humans for the benefit of humans and human society.” Towards that goal they stress enhancement, not replacement; empowering people rather than prescriptive systems telling them what to do; transparent, validated, and trustworthy, not “black box,” systems; and human values, ethics, and privacy embedded in core AI system and app design considerations.
While Diaconescu’s recommendations are crucial to any chance we have of steering the juggernaut towards uplifting human ends, the likelihood of their real-world, sustainable-on-a-species-scale implementation is slim to nonexistent for all but a sliver of humanity on the top end of the benefit-to-harm scale. The same goes for NLH’s recommendations for realizing Human Centric AI. The reason is that technology is carrying on the self-similar path that led matter to self-organize into life in the first place. Technology is replacing humans as a still more efficient means for the Second Law to pump out entropy, to degrade any and all gradients, the differences that can make a difference, not excluding our individual and collective human differences that could make an augmenting human difference.
The reason that either article’s necessary, but not sufficient, recommendations for tilting the scales towards IA rather than its current prescriptive, just-give-us-the-answers AI are unlikely to take hold has to do with the technology-corrupted urge to take whatever path to a goal minimizes food-energy-consuming exertion — the supremely powerful, once vital-for-survival draw codified as “The Principle of Least Effort” [40]. The need to pay attention to the qualifiers of efficiency — for whom, for what, with what side effects — demands more effort than just poke and go. This knee-jerk urge to take least-effort paths to goals tilts the scale heavily in the direction of across-the-board human replacement, of AI, not human-enhancing IA.
Epilogue
The key to getting a handle on the parasitic extraction and transfer of human mental, physical, and social order into the technosphere lies in popping the sorcerer’s apprentice fantasy by recognizing that advancing technology is the leading edge of the thermodynamic imperative to dissipate gradients. That awareness is crucial to not giving the Second Law the maximum entropy production that it wants, to tilting the scales towards a more just distribution of technology’s freeing-up fruits. Recognizing our common future conundrum, including the string-pulling role of the Second Law of Thermodynamics, tracing back to the origins of order in the universe, including life, along with the driven-out-of-whack primal urge to preserve food energy by letting technology do more and more of the work, is a key move in the direction of not serving as entropy’s puppets (Figure 1). That awareness must include the sobering caveat that the always-waiting-for-a-slip-up Second Law is a stealthy, slippery-sloping, cliff-for-a-recidivist, tipping-point-precipitating, tragedy-of-the-commons, fine-print-redacting, whack-a-mole foe.
On a real-reality scale, locking and latching a techno-literate, efficiency-for-what?, efficiency-for-whom?, enhancement-not-replacement, human-centric IA agenda — one that adds significant friction into the gears of the parasitic modus operandi — will encounter huge resistance. Since we are, claims Shoshana Zuboff, “objects of a technologically advanced and increasingly inescapable raw material extraction operation” [28, p. 10], there are, not unexpectedly, enormously powerful vested interests that do not want any critical-thinking engagement threatening their ongoing behavior-mining operations; they do not want their users, they do not want children, they do not want adolescents, they do not want parents, they do not want teachers asking what the things they’re being sold are doing to them and to their charges behind the efficient, profit-driven, seamless masquerade of symbiosis. Yet, without scaled-up critical-brain, “System 2,” recognition overruling sucker-for-convenience “System 1,” the sapping of brain, body, and bond will carry on faster and faster to no good human end [41].5 As Matt Simon summarily puts it, “[while] the human brain is vastly more complicated than that of an ant… all brains, no matter their complexity, follow the same physical laws of nature… The [parasites’] victims get their minds snatched because something figured out the code — something put a key in a lock and turned it, complexity of the brain be damned. Which means the zombifiers can conquer us too” [3, pp. 193–194]. In other words, “If we’re not careful,” writes Yuval Noah Harari in his latest book, 21 Lessons for the 21st Century, “we will end up with downgraded humans misusing upgraded computers to wreak havoc on themselves and on the world” [43].
So then, are we doomed? Is the dark side of human extinction to be our fate? Or even the grey side, as NLH put it: the boiling-frog scenario of ever-increasing helplessness sans increasingly powerful techno-props? Will our brains, bodies, and face-to-face social relationship skills be the tragedy-of-the-commons-eviscerated resource, the result of players’ all-out racing to gain a competitive edge in the AI game? Stay tuned.
Acknowledgment
An earlier version of this article was presented at the 2018 IEEE International Symposium on Technology and Society [42].
Author Information
Jeff Robbins is with Rutgers University, New Brunswick, NJ. Email: jeff.robbins@rutgers.edu