Augmented or diminished humans?
Researchers are now trying to help a paraplegic patient walk using a chip implanted in the brain. Digital intelligence will soon be able to partially or fully restore sight to blind people, after helping deaf people hear. Rather than using skin grafts, surgeons can now grow new skin from stem cells. These are just a few of the promises of science fiction that are fast becoming reality thanks to new technology applied to medicine. Who could complain about miracles like that? Since Descartes, science has always sought to repair living beings. And that is no longer a utopia, or at least far less so than it once was. The Carmat artificial heart is one example: despite a series of failed attempts, a body sustained by durable mechanical repairs now seems achievable. Is the cyborg, a combination of man and machine, finally within reach?
Future technology seems to want to go beyond repair in order to “augment humanity.” That is the goal of transhumanism, a very American philosophy with a number of different facets. It aims to modify humankind - to “improve” it, in its adherents’ words - within the next two decades, and ultimately to bring about total symbiosis between biological man and artificial intelligence. Before we get there, we will need to define a few terms: it will take several steps to reach the virtually immortal augmented human of certain Silicon Valley researchers’ dreams.
The duty of medicine is, first and foremost, to keep people in good health, and to care for them and heal them. Jean-Noël Missa, a doctor at the Université libre de Bruxelles, explains that, “in contemporary medicine, the breakdown of the boundaries between traditional therapeutics and improvement-focused medicine is one of the main features of 21st-century biomedicine. New medications and therapeutic technologies can be used not only to treat the sick but also to transform certain human capacities. This shift is changing the paradigm of medical practice.”
Research in biotechnology, synthetic biology, nanotechnology and cognitive science currently affects four key areas: embryo genetic selection and modification, physical performance improvement, life extension, and cognitive function modification.
From fixing ears to redesigning man
Cochlear implants are a crucial advance for the hard of hearing. Danish company Oticon now offers the Opn, a hearing aid built on MSAT (Multiple Speaker Access Technology), which can restore full auditory capacity. This futuristic achievement comes thanks to a one-centimetre device which carries out one billion operations per second, enabling it to scan the full 360° sound environment more than 500 times per second. With this 2,000-euro device, wearers can hear their surroundings without the cognitive overload that normally occurs when several people talk at once. The hearing aid is also connected to the internet, which in theory offers the option to gather biological or device-usage data for post-use advice, operational improvements, or even health alerts.
Ears are currently easier to fix than eyes. But hearing aids are doubtless serving as the model for the optical prosthetics of the future. These will be essential to treat the diseases which come with increased longevity. One of those diseases is Age-related Macular Degeneration (ARMD), which causes incurable blindness. To treat it, bionic eyes are now being developed: a microcamera implanted in the retina should offer good vision by 2025. In this case, a device is simply attached to the body. Another, even more impressive option also exists. In April 2011, a Japanese team announced in the journal Nature that they had produced retinas from embryonic stem cells. Experimental therapies of this kind have already partially restored the vision of patients with a rare form of incurable blindness.
Genome manipulation heralds the transition from repair-focused medicine to augmentation-focused medicine by modifying organisms at the cellular level. Doctor and entrepreneur Laurent Alexandre sees this as an essentially natural shift, though he somewhat understates the challenges, difficulties and negative consequences involved in any genome modification. As Alexandre puts it, “don’t forget that vaccinated man is already an augmented man”: a vaccinated body produces antibodies that it would not have produced on its own. Some think the next step is profound modification of living beings...
Are augmented humans already waiting in a test tube?
On 19 February 2015, the journal Current Biology announced a new experiment billed as revolutionary: a Japanese team had increased the intellectual abilities of mice by modifying their DNA with segments of human chromosomes. If animals can be transformed this way, the same could, in theory, one day be done to humans. The possibilities are staggering.
In their novel Adrian, humain 2.0, David Angevin and Laurent Alexandre imagine what the life of a genetically augmented human might be like: increased memory, a radically higher IQ, enhanced athletic and sexual performance, lizard-like reflexes. But the potential existence of this superhero raises questions. What would an individual born to be superior - and aware of it - actually think? The authors chose to make him a serial killer rather than a Zarathustra. This Übermensch, worthy of a crazed Nietzsche, drips with scorn for the non-transformed “biologicals” he sees as inferiors, and with hatred for his equals, whom he views as potential rivals. The authors see genetic augmentation as creating a lonely, sad and hateful individual. It is an artistic stance, but one which highlights a question that cannot be ignored: how would such a being, created through genetic modification to drive immense physical and mental progress for the human species, relate to others?
The augmented brain
The promoters of these radical technologies, first conceived long ago in science fiction, present them as inevitable. They see them as part of the new future of humanity, the “technological singularity,” a concept developed in the early 1990s by Vernor Vinge, a novelist and professor of mathematics and computer science. Beyond genetic augmentation, transhumanists are also selling the idea of a definitive marriage of man and machine. Ray Kurzweil, an American futurologist and Director of Engineering at Google, champions this vision of the future in his book Humanity 2.0. He argues that the brain could be amplified, or even fully copied, onto digital circuits, a vision widely criticized in the scientific community. This would trigger the Singularity, an evolutionary point of no return: humans would become the alter ego of artificial intelligence, as in the 2014 film Transcendence, in which Johnny Depp’s character is ultimately absorbed and amplified by an AI made up of nano-robots.
Our increasingly close coexistence with smartphones is, in a way, the very beginning of this shift. Looking beyond all of the desirable or dreadful future possibilities, and all of the ethical questions surrounding transhumanist ideals, our bodies’ relationship with technology already raises very practical concerns. Objects which are already a part of our daily lives, like smartphones, show that the human-machine symbiosis is not without physical consequences.
How digital technology affects our brains
The brain appears to be the first collateral victim of the increasingly close relationship between humans and machines. Some researchers are already concerned about intensive use of digital devices. A 2015 US study, published in The Journal of Clinical Psychiatry, showed that 11% of American adolescents have been diagnosed with attention deficit disorders. Michael Pietrus, a psychologist at the University of Chicago, believes that excessive use of digital devices “affects behavior in a way which intensifies underlying concentration problems,” and the reasons are medically identifiable: subjects diagnosed with these disorders have a below-average number of dopamine receptors, which means that activities of “ordinary” interest tend to bore them quickly. People with these disorders therefore turn to novelty to trigger pleasure and dopamine secretion, a process that digital usage feeds directly.
Furthermore, 51% of web users who use more than one screen at once, and 62% of intensive social media users, show concentration deficits at work; these deficits force 37% of people surveyed, on average, to work outside their working hours. While an average of 44% of subjects say that staying highly concentrated requires intense effort, that figure rises to 67% for intensive social media users. Multi-task, multi-screen behaviours, which are increasing sharply, worry specialists because they reduce selective attention, the type of attention essential to maintaining perspective when dealing with competing stimuli. Multi-tasking imposes significant switching costs on the brain. Researcher Maggie Jackson describes the phenomenon: “the brain takes time to change goals, remember the rules needed for the new task, and block out cognitive interference from the previous, still vivid activity.”
Impact on memory
Digital technology seems to be a fantastic way to communicate and archive information. But is it “too” useful? After all, why memorise information when it is available online? Human memory now finds itself competing with digital memory. Francis Eustache, a neuropsychiatrist and memory specialist, is worried: “The pace imposed by technology can affect our ability to form memories.” When we too often look things up with no effort, we do less to exercise our long-term memory, the type of memory that enables the brain to categorise facts and develop concepts, ultimately organising information into coherent conceptual structures. Failing to do this mental work means we risk becoming mere data consumers. Eustache also notes a failure to do the work of remembering on “the network of intimacy, the construction of the self. It is vital to protect that, because it is what enables us to organise our knowledge, to synthesise it, and make it our own. If all knowledge is delegated to Google, our convictions and our own richness are no longer anchored within us.”
The example of GPS and spatial location
Manipulating information stored elsewhere thus breeds cerebral laziness. Philosopher and psychoanalyst Miguel Benasayag says that “the brain is replacing complex functions with a simple on-off switch.” We are already seeing that shift in the use of GPS devices when driving. Benasayag cites a study of taxi drivers who drive with and without GPS, which he finds particularly telling. The drivers’ brains were studied using PET scans, which show cerebral activity. The first group, the drivers “on GPS,” showed an artificial dyslexia caused by a loss of space-time references: digital information enters the consciousness as coded information, with no need for the body and brain to undergo the physical experience of the trip. In the second group, which drove without GPS, the scans showed intense activity in the hippocampus, a part of the brain which plays a vital role in spatial memory. In the first group, by contrast, the hippocampus did not “light up”: the zones recruited by the effort to spatialise and memorise a route remained dormant. “This experiment showed that while machines can make travel more fluid, the brain loses some of its adaptive capacity,” says Benasayag, whose reasoning echoes Rousseau’s in Émile, or On Education: “The human species can’t really progress, because what you gain on one side you lose on the other.”
A more active but more chaotic brain
Should we all be as pessimistic as this proponent of our natural state about our changing relationship with machines? Thierry Baccino, a professor of cognitive psychology of digital technologies at Université Paris 8, says that “while reading a book always activates the same areas of the brain, whether you read a digital or a paper copy, reading on a screen to find information engages the frontal zones of the brain, those involved in functions like decision-making.” In other words, while some zones are underused, other previously dormant zones are activated. This observation puts the alarming results of the study of brains “on GPS” into perspective. Better yet, the increased activity in the frontal and prefrontal zones, which is particularly marked in the very young (ages 12-24), “leads to increased visual attention capacity, as well as accelerated decision-making.”
However, according to Olivier Houdé, director of the CNRS child development and educational psychology laboratory, this acceleration “comes at the expense of another, slower function of the zone: taking a step back, drawing personal conclusions, and cognitive resistance [...] which enables impulse control.” He therefore believes it is imperative to teach cognitive control, helping young people learn to rein in their impulsive responses, resist automatic reactions, and develop the perspective needed to think independently. If we are to avoid the negative consequences of these digital changes, we will have to learn to use them the right way.
Education... and moderation in all things
Schools naturally have a key role to play. It is essential to strengthen their role in teaching behaviours that ensure proper brain development, something which requires major societal decisions. For instance, just as the French public education system increasingly promises to teach typing at the expense of cursive, we are discovering that cursive writing activates specific neuronal networks in the brain, changing their size and their synapses - in other words, their capacity to form connections.
Miguel Benasayag says that “these movements produce specific traces in the memory. Writing by hand is a practice which physically anchors our thoughts. It sculpts and strengthens the brain,” implying that the brain can change itself. What we do, or don’t do, with our brains calls for awareness, good habits and training, something which society as a whole, from parents to companies, must undertake.
Ultimately, somewhere between the near-transhumanist optimism of someone like Laurent Alexandre, a urologist and president of the genome-sequencing company DNAVision, and the opposite extreme of Miguel Benasayag and his book Cerveau augmenté, homme diminué (Augmented brain, diminished man), can we find a middle way which is neither obsessed with nor fearful of technology? Can we accept the promise of gene therapy and the artificial heart, which Laurent Alexandre argues already make us cyborgs or augmented humans, without succumbing to the delusions and dangers of the all-digital and all-technological, denounced by Miguel Benasayag and many other researchers? We’re betting that we can.