Annihilation from Within

  Chastened by a quarter century of wars following the French Revolution, the world’s leading powers—all European—created an international order in 1815 that avoided wars or kept them localized. They succeeded in maintaining this order for a hundred years. Yet after this long period of relative peace, there were two enormously destructive global wars within three decades. This phase in turn was followed by more than half a century during which wars again were averted or kept limited and localized. Manifesting a somewhat similar rhythm of gain and loss, the beginnings of constitutional democratic government in the world in the late eighteenth and early nineteenth centuries were followed, a hundred years later, by new tyrannies in Russia, Germany, and other nations. And then, after the Second World War and the end of the Cold War, these tyrannies were followed by a restoration and expansion of democracy.

  Even the political enfranchisement of women, a seemingly irreversible development that began in the nineteenth century, has not followed a steady course. While it has spread all over the globe, it has suffered significant reversals recently in nations where a change of government led to the imposition of Islamic restrictions—the worst example being provided by the former Taliban regime in Afghanistan. In fact, the ups and downs of religious tolerance have been an important reason for the erratic patterns of political development around the world. In some countries, diverse religious groups have lived together peacefully for generations, only to become suddenly enmeshed in violence and civil war. In Bosnia, for instance, Muslims, Orthodox Christians, and Roman Catholics had enjoyed neighborly relations in the same towns and villages for generations. Then abruptly, in the 1990s, after Yugoslavia had broken up and autocratic rule was displaced by political turmoil, they started to persecute and kill each other. Something similar happened in Indonesia, where Muslims and Christians had lived peacefully in the same communities until Suharto’s autocracy was replaced by democratically elected governments, at which point religious violence erupted in various parts of the country.

  Economic policy and performance have followed a similar zigzag course. In the early nineteenth century, mercantilist economies steadily gave way to free market systems. Then, in the early twentieth century, the free market was partially displaced by socialist command economies. But at the end of that century, the command economies were themselves displaced by a successful global movement to restore free market economies. The course of free trade has been equally wobbly. Free trade prevailed throughout most of Europe during the last third of the nineteenth century until it ended in 1914, to be revived with the Common Market and free trade agreements forty years later.

  My interpretation of the divergence between the two cultural spheres—the scientific one and the ethical-religious one—is shared by John Gray, Professor of European Thought at the London School of Economics: “The belief that scientific advance engenders social progress suggests that science and ethics are alike, when in fact they are very different. Once it has been acquired and disseminated, scientific knowledge cannot now be lost; but there is no ethical or political advance that cannot be reversed.”10

  A fundamental difference between the two modes of human activity—and a source of tension between them—resides in their contrasting aspirations. In the scientific-industrial mode, human activity is devoted to a continuing progression of advances without a final goal. Science, properly pursued, does not seek to establish a definitive doctrine but to expand man’s never-completed comprehension and conquest of nature. But in the societal-political sphere, people have essentially finite goals. In Marxist thought, for instance, the communist society was not envisaged as a goal to be overtaken by some new form of capitalism. The Islamic movements that have established their theocratic states are not prepared to welcome “progress” that might lead to a religious reformation. America’s policy of promoting democracy and human rights is not intended to build a transitory political order to be replaced by new autocracies.

  The encyclical Fides et Ratio, released by Pope John Paul II in 1998, addresses this cultural divergence. The “profound unity” in Medieval thought, the encyclical notes (§45), was

  producing knowledge capable of reaching the highest form of speculation, [yet] was destroyed by systems which espoused the cause of rational knowledge sundered from faith and meant to take the place of faith.

  The encyclical conveys the Pope’s appeal (§48),

  that faith and philosophy recover the profound unity which allows them to stand in harmony with their nature without compromising their mutual autonomy.

  At the present time, this “profound unity” does not seem within reach. Many democracies are painfully torn between the secular and the religious; and disparate faiths in the world remain deeply divided, within nations as well as between them. These irreconcilable disagreements often lead to violent conflicts which reveal—it is sad to say—the extraordinary savagery and cruelty of religious wars.

  Since the eighteenth century, political leaders in Western democracies have sought to overcome these conflicts by instituting the separation of Church and State. Prominent thinkers in the nineteenth century even anticipated that the influence of religion in society would vanish. Auguste Comte, Karl Marx, Herbert Spencer, and others portrayed the importance of religion as merely a passing phase in the evolution of mankind, to be supplanted by some universal, rationalist philosophy. Several European nations have evolved along this path. But in other regions religion became more influential and even spawned fiercely militant groups. Religious resurgence has spread through Africa, lit fires throughout the Muslim world, and can be observed in the United States.

  It is mankind’s destiny that its cultural split will become wider, unless the calamity of annihilation from within (see chapters 4 and 5) forces societies to close the chasm between the two modes of human activity. Short of such an upheaval, the societal and religious modes of human activity cannot catch up with the ceaseless momentum of science. This widening chasm is ominous. It might impair the social cohesion of societies, and of nations, by drawing the human psyche in two directions: to the personal and national identity that resides in acquired beliefs, memories, and traditions of the past; and to the promise of greater wealth and power offered by untrammeled technological progress. Also, mankind’s two cultural spheres are driven further apart by emotions—that basic ingredient in all human activity. In the scientific sphere, we are neither emotionally tied to our cultural and religious heritage, nor pining for a final redemption. But when animated by the world’s old soul, we seek to protect our identity by clinging to ancient artifacts from our ancestors and hallowed legends from the distant past.

  In the 1990s, Hindus and Muslims in India killed each other in a dispute about the Babri Mosque, which is said to have displaced a Hindu temple half a millennium ago. In Jerusalem, Muslims and Jews kill and die for the Temple Mount, whose veneration by both faiths is based on even older legends. Perhaps it is our fear of mortality, that inescapable end of our earthly existence, which induces us to find solace by killing for a patrimony that vanished long ago. A haunting thought, one that Macaulay captured:

  And how can a man die better
  Than facing fearful odds,
  For the ashes of his fathers,
  And the temples of his gods?

  2

  SCIENCE PUSHES US OVER THE BRINK

  Men will acquire the power to alter themselves, and will inevitably use this power.

  —BERTRAND RUSSELL (1931)

  WE ARE AWESTRUCK by the continuing advances of science, yet often ambivalent about their impact on our world. Rightly so, because the nuclear age taught us the difficulty, nay the impossibility, of reining in threatening consequences of scientific breakthroughs. Today, it is the life sciences that keep producing successive stunners: a steady increase in longevity; therapeutic uses of stem cells; cloning of a human chimera or even a human being; new bioengineered weapons that can unleash a global pandemic; new applications of biotechnology and neurophysiology to discover the inner workings of the human brain; and eventually the ultimate leap—the construction of a superhuman intelligence—our twenty-first-century Tower of Babel.

  Further progress in the life sciences seems guaranteed because the evident health benefits will assure unstinting public support. Yet the dark side of this research is now impossible to ignore. Advanced biological weapons are now seen as a major threat, and justifiably so. But two emerging problems of the life sciences also deserve more scrutiny. One is the outsized costs and ethical dilemmas of expanding longevity. We can already discern the outlines of a cramped new world in which scores of millions in developed countries are in varying stages of senescence, but could be kept alive with indefinite life-prolonging interventions that would bankrupt national budgets.

  The other threatening phenomenon is harder to define with precision, though potentially far more revolutionary. It arises from the accelerating pace of research in both neuroscience and computer science, and the gradually expanding joint projects of those disciplines. Slowly coming into focus is a merger of the two versions of “intelligence”—the human brain’s unique thinking ability and the computer’s vast search-memory-and-computation powers. Superhuman intelligence looms on the horizon.

  The Deconstruction of Death

  From antiquity until yesterday, old age was not thought of as a curable affliction. Senectus insanabilis morbus est—old age is an incurable disease—wrote the Roman philosopher Seneca. The outer bounds of old age were understood to be the seventies and eighties, and we learn from biographies of famous people that a fair number lived that long. The Roman consul and orator Cato died at 86 in 149 B.C.; Saint Augustine died at 75, shortly after he finished writing The City of God; Michelangelo was creative almost until his death, a few months shy of 89; Voltaire lived to the age of 83; Benjamin Franklin lived to 84. What is new today is that a far larger proportion of people live to their eighties and often continue to participate actively in society. During the last three hundred years, most countries have experienced dramatic reductions in mortality rates. The average length of life almost everywhere has more than doubled, and in the developed world has tripled, rising from a norm of twenty to thirty years which had prevailed through most of recorded history.

  Beyond this genuine progress lie more audacious goals. Competent experts in biotechnology now propose to transform old age into a curable disease. The “cure” they have in mind would both extend the life span and improve the quality of life in the later years. Scientists have predicted that it will become possible to extend people’s active life span by twenty years or more, probably long before the end of this century. Less clear is how many of those living longer will spend decades in senility. But even with massive uncertainties about this critical aspect, demand for the new life-prolonging treatments will be irresistible, and will confront democratic governance with agonizing choices.

  A threshold question is how decisions will be made about prolonging individual lives. Physicians and hospitals play a key role in any such decisions, and medical ethics assigns a high priority to prolonging life. Even when terminally ill patients convey a preference for ending life-support measures, their wishes are often overridden. Litigation over responding to the surmised wishes of comatose patients can drag on for years, in the United States as well as in many European countries. Recent disputes offer a warning of things to come. Consider the case of Terri Schiavo, a comatose woman in Florida kept alive on a feeding tube for fifteen years, until the final court decision in 2005 gave her husband permission to have the feeding tube removed. This closure was preceded by three years of court decisions and appeals, as well as legislative interventions culminating in a special bill passed hurriedly by the U.S. Congress.

  Some jurisdictions have sought to clear a path through this thicket. The Netherlands legalized doctor-assisted suicide. So has the State of Oregon, but with more restrictive rules. Such laws tend to meet with strong religious opposition, although most religions make allowance for countervailing considerations. In a letter rich in beautiful passages about aging and death, Pope John Paul II reaffirmed that “the moral law allows the rejection of ‘aggressive’ medical treatment.”1 But under what conditions is a specific medical treatment “aggressive”? In the Terri Schiavo case, Catholic theologians cited a more recent statement by Pope John Paul II, asserting that providing food and water was “morally obligatory.” For those guided by this precept, most of the future life-prolonging treatments might be considered “morally obligatory,” especially if they turn out to be less onerous than the currently used stomach tubes and other invasive techniques.

  Such treatments could also find warm support among doctors and might be eagerly requested by elderly patients. Before long, budgetary pressures would then force policymakers to limit the exploding expense of life-prolongation. It is not hard to imagine the ensuing debates. Well-intentioned theologians and ethicists would fiercely oppose any such limitations, seeing them as a “slippery slope” leading to government-imposed euthanasia. And in cases of hospital deaths now attributed simply to old age, family members, supported by ethicists, would accuse the hospitals of euthanasia and file innumerable lawsuits. Others would argue that the billions spent on life extension for centenarians ought to be used elsewhere, say, to protect the lives of children killed in crime-ridden neighborhoods by cross-fire among drug dealers (at least one child per day in U.S. cities), or to reduce deaths from automobile accidents (more than a hundred per day in the United States).

  Still others will point to a different slippery slope. If death could be indefinitely forestalled by ever more sophisticated interventions, humanity would lose all sense of a natural endpoint of human life. An indefinite postponement of death would create a profound challenge for some of the major religions. The “natural” beginning and end of our earthly existence have been viewed as boundaries drawn by God—boundaries linked to the sacredness of human life. For the Christian creed, in particular, a fading of these demarcations could be more damaging than the epochal discoveries of Galileo Galilei and Charles Darwin. Although their discoveries contradicted hallowed doctrine and centuries of teaching, they could be accommodated by adjusting peripheral aspects of doctrine. A science-driven deconstruction of the “natural” boundaries of human life would diminish a more central aspect of Christianity (as well as Islam). Also, throughout past centuries, the proximity of death for people in the prime of life has nourished the deepest sentiments of religious faiths.2 Theologians have been rather silent on these profound problems that lie ahead.

  A continuing postponement of death can also have serious fiscal implications. If democratic governments cannot raise the retirement age to compensate for our increasing longevity, they will be unable to manage the continuing growth in health care costs and retirement payments, save by cutting other expenses. Given current trends in the United States, federal spending on the elderly (Social Security, Medicare, and Medicaid) will rise from 8 percent of GDP to about 13 percent by 2030. Unless democracies can raise the retirement age, they will have to cut defense expenditures, a process that began in Europe some time ago. But given the political will to raise the retirement age gradually from 65 to 75, the ratio of workers to retirees could be stabilized or even improved—at least for the next few decades. Any serious politician knows that the retirement age will have to be raised substantially, but also knows that any such adjustment will trigger fierce opposition. In Germany, France, Italy, Israel, and other democracies, labor unions have organized massive demonstrations against proposals for delayed retirement, often leading to street battles.3
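  To make the arithmetic behind this claim concrete, here is a minimal sketch in Python of how shifting the retirement age from 65 to 75 moves the worker-to-retiree ratio. The population figures are entirely hypothetical assumptions chosen only to make the arithmetic visible; they are not figures from this book.

# Illustrative sketch only: hypothetical population (in millions) by age
# bracket. These numbers are assumptions, not data from the text.
population = {
    "20-64": 180.0,  # assumed working-age pool when retirement is at 65
    "65-74": 35.0,   # bracket that changes sides if retirement moves to 75
    "75+": 20.0,
}

def worker_retiree_ratio(retire_at):
    """Workers per retiree under a retirement age of 65 or 75."""
    if retire_at == 65:
        workers = population["20-64"]
        retirees = population["65-74"] + population["75+"]
    elif retire_at == 75:
        workers = population["20-64"] + population["65-74"]
        retirees = population["75+"]
    else:
        raise ValueError("sketch handles only retirement at 65 or 75")
    return workers / retirees

print(round(worker_retiree_ratio(65), 1))  # 3.3 workers per retiree
print(round(worker_retiree_ratio(75), 1))  # 10.8 workers per retiree

  Under these assumed numbers, a ten-year shift in the retirement age more than triples the ratio, because a single age bracket is simultaneously added to the workforce and removed from the retired population; the direction of the effect, though not its size, holds for any plausible figures.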

  Policy adjustments, when at last they are carried out, can reduce or delay a deleterious impact of technological change. But, as Yehezkel Dror pointed out, without the stimulus of a major crisis, it is difficult for democracies to make badly needed policy adjustments that are unpopular with their voters or powerful constituencies.4 The U.S. Government passed its first major child labor law only in 1938, and the struggle to curtail injurious child labor practices in the poorer countries has barely begun. Technological change leaves societies in a state of maladjustment, a manifestation of mankind’s cultural split. But this predicament does not preclude governments from taking steps, even if belatedly, to mitigate these maladjustments. To elucidate these delayed adjustments, American sociologists—notably Thorstein Veblen and William F. Ogburn—introduced the concept of “cultural lag.”5 Karl Marx’s idea that the bourgeoisie would produce its own gravediggers assumed it would be incapable of closing the cultural lag with regard to organized labor. By accepting labor unions, however, the “bourgeoisie” did survive the “revolutionary unification” of labor that Marx and Engels had correctly predicted. Thus the “bourgeoisie” in the industrializing nations—except Czarist Russia—averted the defeat that Marx had in mind for them.

  Increased longevity will not only lead to fiscal problems for many democracies but could also prolong the rule of tyrants. Most dictators cling to power as long as possible and can command for themselves the best medical treatments. Stalin comes to mind as a despot who would not have volunteered to retire had his doctors been able to keep him active and fit to age 120, and with tomorrow’s medical technology he might well have ruled his Evil Empire until 1999. Had modern biotechnology offered Mao Zedong the same extended life span, Deng Xiaoping would still be waiting for an opportunity to implement his reforms in China. Cuban exiles in Florida who are pining and planning for a post-Castro Cuba might have to wait a long time, as Fidel Castro intends to stay in power as an octogenarian.

  A mistier question relates to changes in the human spirit. During the natural life cycle that we have grown used to, our emotional experience—of the world around us as well as of ourselves—moves through changing seasons. In the springtime of youth, the emotional landscape is mottled with subtle and flickering colors, playfully blending tones of gaiety with quickly passing shadows of sadness. Sexual drives and feelings shimmer throughout this delicate composition and frequently burst forth like a thunderstorm. In maturity, our emotional experience dwells longer in a single mood, and lights and shadows have harsher edges, as on a dry hot summer day. In the autumn of our lives, our feelings and sentiments become more subdued, yet are also enriched by joys and sorrows recalled from the past. And for those who can reach the last season in fair health, the sentiment of yearning—the mind’s strongest grip on life—is becalmed, and eventually fades.6 But what if biotechnology enables people to live in fairly good health much longer? What then would be the emotive melody of our journey through life? Surely, such a change in the emotions of our passage through life would alter the disposition of society as a whole. These are large questions, with elusive answers.