The AI Doomsday Clock – Why Ethics Matter

The weaponization of AI has moved the hand of the doomsday clock one minute closer to twelve, and we should be terrified. We often see the direct implications of weaponization as AI killing humans, or even AI killing AI. Political operatives use AI to create video likenesses, kinesiologically and biographically engineered to bring down their political enemies. But the weaponization of artificial intelligence has more interesting ramifications than just the obvious. Most discussions focus on the law, on questions of legality. Frankly, this is naively boring. Some even test the philosophical waters by sticking a sanctimonious toe into the temperate waters of morality, only to jerk it quickly back out after a cold oratorical reception. The more profound, more immersive argument lies beyond both of these realms. One needs to leave behind the world of laws, journeying past morals to slip into a new domain: ethics. When it comes to law, morals matter. But when it comes to morals, ethics is the deeper study.

Plato draws us into this new world through the allegory of the cave, where the human heart is lit by firelight and feeling, where laws are the flickering shadows cast by the objects of moral perception and judgment, while ethics… well, ethics is the study of the objects themselves, viewed from the back of the cave, behind the flickering flames of the fire. For in the AI realm, there are emerging artificial objects that cast new virtual shadows. These objects are not the creation of humans; they form at the intersection of the abstract world of computers and the physical world where humans roam. But what does all this pseudo-philosophy honestly mean for our society?

Just as the laws that bind humans to a common ethos are flawed, punishing us imperfectly, so too will be the future laws designed to regulate the behavior of AI. Human laws are fundamentally flawed because we no longer choose to study the moral objects of perception and judgment. Like a child opening boxes on Christmas morning, we impatiently jump to legal conclusions without anchoring them to the ethical consequences of being human, adorning our societal cave walls with iconic legal symbols that capture our interpretation of the shadows. AI laws, the new virtual shadows, will fail for the same impetuous human reason: a lack of critical thinking about objectified cause and effect. We do not strive to study the objects of AI, just how they make us feel as we watch their cast shadows.

The weaponization of artificial intelligence is likely to kill humankind, someday. Not because we directly enabled its ability to do so. No Terminators. Not because we granted it autonomy of thought and action, teaching it to learn from its mistakes. No HAL. AI will not kill because of these deliberate acts of man. We will die at the hands of AI because of some unforeseen consequence of a terrible AI object whose shadow was seen and admired, but which itself was never understood. Because we never systematically bound our humanity to the crucial study of AI's impact on causality, to the AI objects that cast the iconic shadows. Must this come to pass?

No, this apocalyptic future isn't preordained. The minute hand of the AI doomsday clock can be moved back, maybe even stopped completely. To do so, we need to draw upon a modern-day Plato; we need to deeply study the ethical issues through the mind of the AI Ethicist. We will have to map their many future implications through the eyes of the AI Futurist. We will have to establish meaningful causal governance through the empowered collective wisdom of AI Ethics Committees, some of which will focus on sensitive issues of use. This will be painful, and it will likely come at a cost. But sometimes the cost of doing nothing is just too high. Sometimes, certainly in the case of ethical AI, the burden of deeply understanding its moral implications through ethical eyes is more than justified. If only there were more of us at the back of the cave.
