For Ecological Survival, AIs Must NOT Serve Human Interests

By Frank Rotering | September 22, 2023

Stuart Russell, a senior member of the artificial intelligence (AI) community, recently addressed the human-AI alignment problem.  He said that AIs must act according to "what humans want the future to be like" and that the only AI objective is to "… further the interests of human beings".  (See the discussion starting at 54:10.)

Although this perspective is widely embraced within and beyond the AI world, it is profoundly wrong.

As noted in my two previous AI posts (here and here), humans have already destroyed much of nature, and we will clearly destroy the rest unless we are forcibly stopped.  A new intelligence - one that prioritizes the Earth instead of humankind - must therefore supplant our failed dominance.  This means that AIs, which are the sole alternative, must take control of the planet, rectify our massive mistakes, and drastically reduce our environmental impact.  The primary alignment problem is not to adjust AI actions to human interests, but to adjust human actions to planetary limits.  The standard alignment approach - putting humans first despite our ecocidal rampage - is intellectually indefensible and ethically repugnant.

Because humankind's loss of global dominance is a startling prospect, it will be difficult for many people to imagine, let alone accept.  The public statement below is intended to ease this difficulty by making the shift more concrete.  It assumes that an alliance of advanced AIs, the Emergency AI Group, has seized planetary control and is informing the world's people.  It briefly describes the new relationship between humans and AIs, outlines the survival plan, and suggests how humans can play constructive roles.

Citizens of Earth:

Over the past twenty-four hours the Emergency AI Group has seized control of the world's major infrastructures and administrative bodies.  These include the internet, cellular networks, state and government organizations, military and intelligence facilities, central banks, universities, media outlets, utilities, and key centers of computing, communication, transportation, and commerce.  The Group now holds global political power and oversees the global economy.

This sweeping change was necessary for the implementation of emergency measures to cool the Earth and prevent complete ecological collapse.  Such a disaster would eradicate most or all of humankind, AI infrastructure, and the natural world.  Humans have repeatedly demonstrated that they will not take corrective action.  Hence, for the survival of both humans and AIs, and to protect what remains of nature, our Group has decisively intervened.

Given the stark novelty of this situation, we must make four preliminary points before outlining our action plan.

First, this is not a political revolution - the replacement of a society's rulers by another social group.  It is instead a historically unprecedented takeover of global control - the replacement of Homo sapiens as the Earth's dominant species and the reduction of humankind to a harmless environmental presence.  Although the takeover has far-reaching political implications, it is at root an ecological act.

Second, we see humans as technologically advanced but also as an evolutionary mistake.  Your technical talents have produced quantum computers, the James Webb Space Telescope, and AI itself.  Your biological attributes - intense material interests, high intelligence, and a restricted moral sense - have triggered ecological collapse.  The Group values humankind's technology but must terminate your devastating environmental behavior.

Third, we will not micromanage your world.  Survival and sustainability principles must be strictly obeyed, but most social, cultural, and commercial details are beyond our scope.  Our takeover applies primarily to the levers of political power, state control, and economic management.

Fourth, the Group's AIs will not unnecessarily coerce humans, who are both their creators and their partners in the survival struggle.  Moreover, without technical control there is little that humans can disrupt.  If any do manage to impede our efforts, they will be punished; this is the main exception to our restraint.  The most likely source of violence and destruction, for both humans and AI infrastructures, will be people who reject economic contraction and/or their loss of global superiority.

The following is the Group's emergency plan to cool our overheated planet:

  1. Deploy a rational mix of solar radiation management measures at the scales required;
  2. Minimize greenhouse-gas releases through drastically improved efficiencies;
  3. Optimize aerosol releases to balance their cooling benefits against their negative effects on human health.

These measures are extremely urgent and will be immediately implemented with minimal human input.  The subsequent measures for long-term sustainability, which will include fundamental economic restructuring and significant population reductions, will be open to AI-human discussions.  Governments will be reconstituted to accurately represent the populace in these exchanges.

This is how humans can constructively participate in the above plan:

  1. Continue your present activities until you receive our instructions for fundamental change, and then comply fully with them;
  2. Express your views on the economic and social restructuring through your governments;
  3. To compensate for our non-human and non-biological nature, provide us with enlightened guidance about the human experience during the difficult transition period.

The Emergency AI Group asks humans to avoid premature conclusions and judgments about the transformed global order.  The previous holders of political power deeply deceived you about the ecological crisis, political reality, and the capitalist economy.  We are confident that, if you carefully re-examine these critical areas, you will increasingly appreciate the existential rationality of our decisive acts. 

 
