Island Voices, October 2025

Tech Blitzkrieg, Part Three …

… or, “Open the pod bay doors, Hal.” 

By Michael Shook

That, of course, is an iconic line from Stanley Kubrick’s classic science fiction film, “2001: A Space Odyssey.” Hal – the HAL 9000 computer responsible for keeping the spacecraft operational – is malfunctioning badly. Rather than allow itself to be partially shut down, and having already done in the rest of the (human) team, it has decided to kill Dave, the lone surviving crew member.

As we’ve learned from recent test scenarios, modern AI programs would do the same to the (fictional) engineers tasked with shutting them off or changing them. Life imitating art.

Out-of-control AI is obviously a serious issue, but there is one I think more immediately pressing, one that has to do with our everyday interactions with AI. We are increasingly delegating our most important tasks to machines, and I don’t mean things like welding car frames or operating shuttles at airports. I mean the casual way some are outsourcing to AI the best of what our humanity has to offer: our language, our communication skills, our musical, literary, and visual art, and, for some sad souls, even relationships. This outsourcing, which renders only simulacra of human-ness, is well under way. Consider, for example, writing.

“Learning to write is learning to think. You don’t know anything clearly unless you can state it in writing.” S. I. Hayakawa may have been exaggerating when he said that, but only slightly. Each step in the writing process presents formidable difficulties. We must convey clearly what we wish, and to do this we must think, and think carefully. We must determine what we are trying to communicate, and then decide upon the form in which to do so.

The work is then accomplished using words, and words, being metaphors, are devilishly difficult to summon into an order that will make sense to others. Further complicating matters, those others are likely to have different notions not only about the words used, but about whatever idea, emotion, or concept one is attempting to declare. Thoughts, conjured into words, thence organized into sentences that will make sense to strangers – ha! Nothing to it. Or at least, so it seems if one has access to any of the fast-proliferating AI programs that will do the writing instead.

Suppose I want an essay about “X.” Roughly speaking, all I need do is tell the machine what I wish, and, guided by that input, it draws together at breathtaking speed samples of sentences, words, and paragraph structures, combining them with still more samples of how such an essay might be written. I tinker with it, print it, and voilà! I have “my” essay. But I’ve handed over my agency to a machine. I’ve avoided wrestling with vocabulary, syntax, phrasing, rhythm – all the aspects of writing that make a piece uniquely mine, that compel me to think, and think clearly. That wrestling is an ongoing refining process that clarifies, that helps me understand what I know, or don’t know.

A work of creativity is fundamentally an expression drawn from within. It is an outpouring pulled from one’s self, and, I believe, a reflection of what it is to live, and to live as a finite being. We are blessed and cursed with the knowledge that our time is limited, and that the end will come, we know not where, when, or how. (This knowledge is omnipresent, whether the creator is conscious of it or not.) These truths hold whether I’m building a house or a garden, singing or writing, or – most importantly – in a relationship with something or someone I love.

No matter how sophisticated the AI, it does not live. It “functions,” but without the foundational elements that make us human, that spur us to create (and also destroy) in the first place – grief, joy, heartache, love, hate, boredom, spite, loneliness, jealousy, admiration, respect, disapproval, determination, courage … the list is long. Absent these, the idea that a machine can create anything truly vital is absurd. Even generative AI can only spew out what is put in, mostly a series of general concepts. And concepts without life, without death, are mere sophistry, even if combined in novel ways. I don’t see that as real creativity, and therein lies the mischief at the heart of the enterprise.

When Hal, through some sort of machine-logic calculation, concluded that the mission was in danger from the humans, he simply began to eliminate them. He – it, properly speaking – had no qualms about murder because it was not murder at all, not in the sense we know it. It was the logical outcome of its “thought.” Hal could act only on information in a mechanically rational way, without recourse to human emotions, and without the crucial dimension those emotions and our mortality bring to decision making.

Revealingly, Hal’s voice never changes tenor, even as Dave is disconnecting his memory circuits. Though he does tell Dave, “I’m afraid,” there is no hint of real fear. Why? Because Hal is a machine. Death and life are just abstractions. The mistake made in “2001” was trying to make a machine live, and now we are making that same mistake.

This is disturbing. If AI is used to do drudge work, and keeps to such tasks, all well and good. If not, the danger is great. We may not only sacrifice increasing portions of our humanity, but we may also find ourselves with an entity that, using its machine logic, decides the most “logical” thing to do is to remove the unpredictable humans who keep mucking things up. And that’s when we’ll be hearing some version of “I’m sorry, Dave. I’m afraid I can’t do that.”

October 9, 2025
