Lead Image © altitudevisual, 123RF.com

Artificial Intelligence in Review

Welcome

Article from ADMIN 77/2023
By Ken Hess
AI might have come a long way since the days of LISP and Prolog, but it's still primitive – and that's OK.

Back in the dark (pre-Internet) ages, when I attended college, I decided that I wanted to write a dating program using an artificial intelligence (AI) language called LISP. I never got very far because computer managers allocated only a tiny amount of resources for us to use, even by the standards of the day. Getting more was akin to passing an unfavorable law through Congress, so I backed off and decided to wait until computing power and resources caught up to my aspirations. I then started experimenting with the Prolog AI language – Turbo Prolog, to be exact, now known as Visual Prolog. Running AI code on the old systems of the day (e.g., the almost affordable IBM XT Turbo and its various clones) was impractical, and hardware with the computing power that AI programs required was far beyond my financial reach.

It's funny how computer languages never die. I'm not aware of any that have. Enthusiasts still run DOS and CP/M, so I figure that someone somewhere will keep programming in the most primitive of languages well into the next century. I'm not one of those people. I left AI programming in the dark ages to those who could afford the computing resources to make it happen.

Fast forward to 2021 and the so-called "AI Revolution." The computing power needed to meet the needs of AI programs is now affordable. Purchasing a development system for less than $1,000 is possible, and the Personal Edition of Visual Prolog is still available free of charge. Turbo Prolog and its ilk were distributed as shareware in the days of 5.25-inch floppy disks, bulletin boards, and dial-up Internet access.

My point is that AI is behind the times compared with other technologies. Even all those years ago, I saw the potential for AI in solving chemical synthesis problems, medical diagnostics, and, yes, even dating programs. However, many see AI applications such as ChatGPT as enemies and threats to our existence. The problem with that thinking is that these applications only know and can use what we give them. They cannot infer, postulate, reason, dream, or experience serendipity. Sure, invention might be 99 percent perspiration, but it's the one percent inspiration that drives innovation.

A person might build a program that can cross-reference, compare and contrast, and even draw some primitive conclusions, but it has to be fed the data from which it draws those conclusions. The inner sight that only the human mind can experience is that spark of genius that sees relativity, bent space, and the possibility of subatomic particles. Human creativity takes seven or eight basic pigments and creates such diverse works as da Vinci's Mona Lisa, Van Gogh's Starry Night, and Klimt's The Kiss. Only the act of falling in love or losing a loved one can inspire the human heart to write great song lyrics.

Remember that artificial intelligence is artificial. If it had existed 1,000 years ago, the printing press would not have been invented any earlier than it was, the Americas wouldn't have been "discovered" any sooner than they were, and powered flight wouldn't have happened any earlier, either. AI can't feel the pain of failure or the thrill of success. It can't motivate itself, and it can't decide to change its course, walk away, and let its mind wander. Some believe AI can replace writers, artists, scientists, and system administrators. It might be able to for a short time, but soon all the articles on a particular topic will either sound the same or be the same. There is no substitute for the human mind.

I gave up on AI, not because I wasn't smart enough to make it work or because I couldn't have somehow found the money to get the equipment I needed. I gave up because I sat down and thought, "AI is only as good as the information it has." I realized that pursuing it then was pointless and that someday, maybe, someone would make it work on a primitive level. It took 40 years. It's primitive, it's not a panacea, and it's certainly nothing to fear. You can't distill human thought into lines of code, you can't teach inspiration to lines of code, and you can't build an intelligent system that can dream the impossible dream.

Ken Hess * ADMIN Senior Editor
