The Doomsday Handbook
Parallel to that in the past half-century has been the stupendous growth in information and digital technologies, resulting in our networked world. Today, everything you want to know is at your fingertips.
These twin tracks of knowledge intersect at many points—computing is crucial in basic biological research or in operating medical equipment—but could they be integrated further to make a better type of human? Future generations could have infinite memories thanks to computer implants. Their networked minds could bypass the fingertips to access all of the world’s information. And as yet undreamed of medical breakthroughs will enable these improved humans to live for hundreds (perhaps thousands) of years. It all sounds so positive—what could possibly go wrong?
The transhumanist manifesto
Humanists believe that people matter; that even though we are not perfect as a species, things can be improved by freedom, tolerance, rational thinking and, above all, concern for other humans. People who call themselves “transhumanists” agree with all of that, but also want to emphasize the potential that humans have to develop beyond our natural limits. “Just as we use rational means to improve the human condition and the external world, we can also use such means to improve ourselves, the human organism,” says the philosopher Nick Bostrom, a proponent of the transhumanist way of thinking. “In doing so, we are not limited to traditional humanistic methods, such as education and cultural development. We can also use technological means that will eventually enable us to move beyond what some would think of as ‘human.’”
The future that Bostrom alludes to is known to transhumanists as “the era of the posthuman being.” This does not mean that humans will not exist any more; rather that the beings that will exist at that time will have basic capacities and desires so radically surpassing anything familiar to modern humans that they will be unrecognizable to us.
Transhumanists yearn to reach intellectual heights way above anything we know of today—as different from modern humans as humans are from other primates. They want unlimited youth and healthy life, and they want to be able to control their own desires and moods, so that they can avoid feeling tired or irritable, but can ramp up feelings of pleasure, love and artistic appreciation. They also want to experience states of consciousness inaccessible to modern human brains. “It seems likely that the simple fact of living an indefinitely long, healthy, active life would take anyone to posthumanity if they went on accumulating memories, skills, and intelligence,” says Bostrom, who heads up the Future of Humanity Institute at the University of Oxford.
This posthuman world could end up being populated by very non-human-looking people. It could be a world of, say, artificial intelligences based on the thoughts and memories of humans who uploaded themselves into a computer and exist only as digital information on superfast computer networks. Their physical bodies might be gone, but they would be able to access and store endless information in milliseconds, and share their thoughts and feelings immediately and unambiguously with other digital humans.
A more recognizable posthuman might simply be a modern human who has been enhanced with genetic modification or drugs to slow aging or mental decline. They might have neural interfaces, memory-enhancing prosthetics or be part cyborg to improve fitness and strength.
The rise of human modification
All of these scenarios might sound far-fetched, but technically, none of them is impossible. The accelerating rate of technological advance will soon lead us into uncharted waters, and given that most of our modern world was developed only in the past few decades, who knows where we will be in a century or two. Existing technologies, in their infancy now, already have the power to change our species.
An artificial hand, developed at the Forschungszentrum Karlsruhe in Germany, that uses hydraulic fluid actuators to move and grasp. It could be used as a prosthetic to improve normal human capabilities.
Artificial intelligence, for example, is bound to happen in some form, allowing computers and robots to do the kind of thinking that has so far been the sole preserve of humans. No doubt the machines will eventually think faster and more creatively than humans—why not incorporate them into our physiology to make humans better? Brain–computer interfaces are already being trialed in patients with severe disabilities, allowing them to move computer cursors with the power of thought alone. Blind people have had electrodes fitted into their retinas to help them see for the first time in years.
The futurist Ray Kurzweil thinks that by 2035, human brains and computers will begin to merge. Tiny nanobots could be used to improve both how and how much we can think, extending our intelligence. “By 2020, $1,000 (£581) worth of computer will equal the processing power of the human brain,” he told the Guardian newspaper in 2005. “By the late 2020s, we’ll have reverse-engineered human brains ... By 2030, we will have achieved machinery that equals and exceeds human intelligence but we’re going to combine with these machines rather than just competing with them. These machines will be inserted into our bodies, via nanotechnology. They’ll go inside our brains through the capillaries and enlarge human intelligence.” Kurzweil takes hundreds of dietary supplements each day and spends time every week at clinics having other health-giving compounds administered intravenously—all of it intended specifically to keep him alive until humans have worked out how to reprogramme their own biology through nanotechnology.
What about more prosaic ideas of enhancement? In 2000, the first draft sequence of the human genome was published by a consortium of international scientists. Researchers had been identifying and modifying genes for decades before that, but it had been a laborious and difficult process. The draft genome, combined with freefalling prices for sequencing technology, meant that the first decade of the 21st century was a glorious time for molecular biology, with an exponential growth in research that examined and manipulated genes.
At the same time, stem-cell biologists have been working on ways to understand disease and grow replacement tissue using the body’s own master cells. In the future, reprogrammed stem cells will allow doctors to treat everything from heart failure and diabetes to neurodegenerative diseases such as Parkinson’s and Alzheimer’s.
* * *
By 2030, we will have achieved machinery that equals and exceeds human intelligence but we’re going to combine with these machines rather than just competing with them.
* * *
Both genetic modification and stem-cell research are in their earliest stages, and are destined to be used at first in trials with people who are suffering from a disease. But transhumanists argue that the technology should not stop there—why not use it to enhance healthy people too, extending lifespan or cognitive ability?
The potential dangers
Technologies of all stripes tend to get cheaper as they become more entrenched, tested and commercialized. But for their first decades—however many that turns out to be—all the methods of enhancing human beings—stem cells, genetics, nanotechnology and computer intelligence—will no doubt be too costly for most people. This means that any advantages in lifespan or intelligence bestowed by these technologies will go disproportionately to the rich. This is nothing new—well-off people today already make more money and send their children to better schools—but these technologies will only serve to widen that divide.
Humans have always been at risk from each other, and to counteract this, we have created laws and institutions that act to prevent one group from suppressing another. But what if one group of humans was radically more capable than the rest? Laws and institutions might not restrain them, or stop them from taking over the world, enslaving or killing everyone else.
Take uploads, for example, where a person manages to transfer their mind from their brain into a computer that can emulate all biological processes. “Suppose uploads come before human-level artificial intelligence,” wrote Bostrom in an essay for the Journal of Evolution and Technology. “A successful uploading process would preserve the original mind’s memories, skills, values, and consciousness. Uploading a mind will make it much easier to enhance its intelligence, by running it faster, adding additional computational resources, or streamlining its architecture. One could imagine that enhancing an upload beyond a certain point will result in a positive feedback loop, where the enhanced upload is able to figure out ways of making itself even smarter; and the smarter successor version is in turn even better at designing an improved version of itself, and so on.”
If this runaway process happens quickly, it could result in one upload reaching superhuman levels of intelligence while everybody else remains at a normal human level. Such enormous intellectual superiority may give that person enormous power, allowing them to invent new technologies. If they were bent on domination, they might prevent others from getting the opportunity to upload. “The posthuman world may then be a reflection of one particular egoistical upload’s preferences (which in a worst-case scenario would be worse than worthless),” said Bostrom. “Such a world may well be a realization of only a tiny part of what would have been possible and desirable.”
Future technology and our increasingly connected society could give very small groups of people, or even just a single person, the ability to control a huge amount from a small base. Today, our world and the people in it are separate enough that no single entity could do enough damage quickly enough to destroy all of it—if someone wanted to take over, at least some of us would survive. But what if the humans of the 27th century were all networked minds living mostly in computer networks?
The dangers are real, but, like the sharp sticks on that African plain thousands of years ago, progress should not be stopped unduly. “You can’t just relinquish these technologies,” says Kurzweil. “And you can’t ban them. It would deprive humanity of profound benefits and it wouldn’t work. In fact it would make the dangers worse by driving the technologies underground, where they would be even less controlled.”
Death of the Bees
* * *
Could you live without fruit or vegetables? What about clothes made from cotton? How would you feel if the world’s meadows had no flowers?
* * *
All of these plants, so crucial to the continuing economic and aesthetic success of our lives, survive through the generations thanks to insects that are attracted by their flowers and their smells. They come to eat nutritious nectar and inadvertently carry pollen from one plant to another. A third of everything we eat depends upon pollination by bees, moths and hoverflies, which means that these creatures contribute some $42 billion to the global economy. If all of the UK’s insect pollinators were wiped out, the drop in crop production would cost the economy up to £440 million a year, equivalent to around 13 percent of the country’s income from farming.
The king of this world is the bee. There are hundreds of species of bees, all of them critical in the life cycle of the various plants around the world, ranging from apples, carrots, oranges and onions to broccoli, melons, strawberries, peaches and avocados.
But there is something amiss: bees are disappearing, and fast. Since the problem was first identified in Britain in the 1950s, numerous studies have documented long-term declines in bee species. In fact, all pollinating insects declined seriously around the world in the latter half of the 20th century, a result of disease, changing habitats around cities, and increasing use of pesticides.
A study by Sydney Cameron, an entomologist at the University of Illinois, illustrates how fast it is happening. She looked at the genetic diversity and pathogens in eight species of bumblebees in the US. Her results, published in 2011 in the Proceedings of the National Academy of Sciences, showed that numbers of four common species of bumblebee in the US have collapsed by 96 percent in just the past few decades.
By comparing modern census data about the insects with those in museum records, she also found that the geographical spread of four of the bee populations she studied (Bombus occidentalis, B. pennsylvanicus, B. affinis and B. terricola) had contracted by between 23 percent and 87 percent, some within the past two decades.
The findings reflected those of similar studies across the world. According to the Centre for Ecology and Hydrology in the UK, three of the 25 British species of bumblebee are already extinct, and half of the remainder have shown a serious fall in number, often up to 70 percent, since around the 1970s. In addition, around 75 percent of all butterfly species in the UK have been shown to be in decline.
* * *
Bees in general pollinate some 90 percent of the world’s commercial plants.
* * *
Canada, Brazil, India and China, as well as most of western Europe, are suffering similar declines in their bee populations. The US National Research Council warns that bees could be extinct in North America by 2035.
The insect economy
Bumblebees are important pollinators of wild plants and agricultural crops around the world thanks to their large body size, long tongues, and high-frequency buzzing, which helps release pollen from flowers.
Bees in general pollinate some 90 percent of the world’s commercial plants, including most fruits, vegetables and nuts. Coffee, soya beans and cotton are all dependent on pollination by bees to increase yields. It is also the start of a food chain that sustains wild birds and animals.
Pollinators are crucial for the quality of fruits and vegetables. Perfectly shaped strawberries, for example, are created only if every single ovary has been pollinated by an insect. And the number of seeds in a pumpkin depends on the number of species of insect that have pollinated the plants. “If you’ve got 10 pollinators, you’ll get more seeds in the pumpkin than you would have got if you’ve just got one pollinator,” says Giles Budge of the UK Food and Environment Research Agency. “It is important to have that diversity in a pollinating population.”
Causes of decline
So what’s causing the crash? Scientists think it is a combination of disease and changing agricultural practices. “The spread of industrial farming, increased use of pesticides, and loss of habitat led to declines in the role of wild insect populations such that they are now reported to account for just 15% of global crop pollination,” says Elliott Cannell, coordinator of the Pesticides Action Network, Europe. “In response farmers started to hire in honeybees to pollinate their fields, thus creating a market for pollination. Demand soon spawned an industry which today sees honeybees overexploited, plagued by parasites, exposed to pesticides, and ill adapted to the conditions they work in.”
In her study on American bees, Sydney Cameron pointed to two causes: a pathogen called Nosema bombi, and an overall reduction in the genetic diversity of the remaining bee populations. The pathogen, common in bumblebees throughout Europe, reduces the lifespans of individual bees and also results in smaller colony sizes. Reduced genetic diversity means that the smaller populations are less able to fight off any new pathogens or resist pollution or predators.
Another problem for bees is the blood-sucking varroa mite. This creature has been endemic in the honeybee populations of Asia for thousands of years, living in balance with the local Apis cerana population. The completion of the Trans-Siberian Railway in 1916, and the movement of trade and people along its length, kicked off an unintended problem for Western honeybees, which had never been exposed to the mite and therefore had no natural defenses against it.
* * *
DECLINE IN US BEES
96% in the past 20 years
* * *
By the 1950s, varroa had entered the Soviet Union, and two decades later had spread to eastern Europe and South America, thanks to the movement of bee populations by people. Today, Australia is the only continent free of varroa. Billions of honeybees around the world have died as a result of the mite, which spreads viruses deadly to the insects.
But virus epidemics alone are not enough to explain the massive insect decline. By far the biggest danger to bees in the past few decades has been our increased use of pesticides, specifically those called neonicotinoids.
* * *
Demand soon spawned an industry which today sees honeybees overexploited, plagued by parasites, exposed to pesticides, and ill adapted to the conditions they work in.
* * *
One winter in the early 1990s, French beekeepers noticed a sudden fall in their insect population. They quickly pointed to a best-selling pesticide called imidacloprid, which had been used for the first time the year before. The US Environmental Protection Agency classes it as “highly toxic” to honeybees. Following the mass die-off, France banned imidacloprid.
In 2008, Germany suspended three neonicotinoid pesticides after reports from beekeepers in the Baden-Württemberg region that two thirds of their bees had died around places that had used a pesticide called clothianidin. The chemical had been applied to the seeds of sweetcorn planted along the Rhine.
The problem is widespread. A survey carried out in 2008 by researchers at Pennsylvania State University showed evidence of 70 pesticides or breakdown products in pollen and bees. All the bees they looked at had traces of at least one pesticide, while each pollen sample had around six pesticides, with as many as 31 in one case.
Research on the effects of pesticides shows that the chemicals damage the brains of the bees, blocking the electrical and chemical signals between neurons. According to bee experts, only subtle changes would be required to produce serious brain disorders in the insects. The results would include making it harder for the insects to get back to their hives after foraging trips; or interfering with their ability to communicate with nest-mates using the “waggle dance,” where bees come back to their hive and spread information about the food sources they have found.