Thursday, August 7, 2025

With great power comes great responsibility

As I've argued at length elsewhere (and nowhere at greater length than in my latest book), what is currently going on at the leading AI companies in Silicon Valley and San Francisco is likely to have stupendous influence on our lives and on the entire future of humanity. Their ambitions for transforming society are stratospheric and hitherto unmatched, but so are the risks. With great power comes great responsibility, yet they are proceeding at reckless speed, and they have not asked us for permission to risk our lives. It is hardly an exaggeration to call their behavior a unilateral moral trespass against humanity.

I expand on this matter in my brand-new manuscript Advanced AI and the ethics of risking everything, which also serves as a complement to my latest blog post, on OpenAI's CEO Sam Altman, in which I express my opinion about his behavior in just a few crisp swear words. In the new text I elaborate more extensively and with somewhat more polished language. Here is how it begins:
    Imagine sitting in the back seat of a taxi with your two closest family members. As you approach the bridge across the canal, you notice that a ship is waiting to pass and that the bridge has slowly begun to lift. The driver, visibly annoyed by the delay, turns to you and your loved ones and asks, “Shall we try and jump it?” You quickly calculate that doing so would save five minutes, at the cost of a 1% probability of a crash that kills everyone in the car. Do you give the driver your permission?

    In a Hollywood movie, you would probably say yes. But this is real life and you are not insane, so of course you politely decline the driver’s reckless suggestion.

    Next consider Sam Altman, CEO of OpenAI, facing the decision of whether to release the newly developed GPT-5. (There’s a good chance that by the time this reaches the reader, GPT-5 will already have been released, but at the time of writing, in August 2025, its release is still a hypothetical future event.)
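The arithmetic behind the taxi thought experiment is simple enough to sketch. A minimal sketch, with one assumption not stated in the excerpt: that four people are in the car (the three passengers plus the driver), since the text says only that a crash "kills everyone in the car":

```python
# Expected-value sketch of the taxi thought experiment.
p_crash = 0.01              # stated 1% probability of a fatal crash
people_in_car = 4           # assumption: three passengers plus the driver
minutes_saved = 5           # stated time saved if the jump succeeds

# Expected cost: fraction of a life lost, in expectation, per attempt.
expected_lives_lost = p_crash * people_in_car          # 0.04 lives

# Expected benefit: time saved only in the 99% of cases with no crash.
expected_minutes_saved = (1 - p_crash) * minutes_saved  # 4.95 minutes

print(f"Expected lives lost:    {expected_lives_lost:.2f}")
print(f"Expected minutes saved: {expected_minutes_saved:.2f}")
```

Under any remotely sane exchange rate between minutes and lives, trading roughly five expected minutes for 0.04 expected deaths is an absurd bargain, which is of course the point of the thought experiment.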

Read the rest of my manuscript here!
