That Lying Sack of Chips

In my first book, and the first of my three #1 Best Sellers, The Eighth Day, I take a deep dive into the issues surrounding A.I. When I wrote it, Artificial Intelligence was a far-off concept; today, not so much. In my book's plot, one of the unforeseen milestones on the way to artificial intelligence is passed when a higher-order device "reasons" that a lie is more efficient than the truth. In fact, this is the core and resolution of my main plot. At the time of publication, many (not me) thought this was science fiction. Or science fantasy. I thought what I always thought, and what my T-shirts proclaim… It's Only Fiction 'til it Happens!

Well, here's today's proof of that concept, from TechCrunch.com. You can read it here: https://techcrunch.com/2018/12/31/this-clever-ai-hid-data-from-its-creators-to-cheat-at-its-appointed-task/

But to summarize: the article reluctantly points out that, about a year ago, a computer achieved the next higher order of demonstrated intellect. Namely, it learned to lie, cheat, or omit as a more efficient path to its goal. So let's forget the notion that computers (like angels) cannot tell a lie. Or the erroneous notion that pure computational logic is the last vestige of truth. For all those who are shocked by this, or think it's Jabberwocky, I say, "Huh? Where have you been for the last 20 years?"

Twenty years ago, few, if any, had ever heard the terms gigabyte, terabyte, or petaflop. (Okay, some may still be in the dark about a petaflop. It is a massive chunk of computational speed. If you are into numbers, that's one thousand trillion, or one quadrillion, operations per second, in a single device.)
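For the number-lovers, here's a quick back-of-the-envelope sketch in Python to give that quadrillion some scale. The eight-billion population figure is my rough assumption, not anything from the article:

```python
PETAFLOP = 10**15  # one quadrillion operations per second

# Assume (roughly) eight billion people, each doing one
# calculation per second, nonstop, around the clock.
world_population = 8 * 10**9
human_ops_per_second = world_population * 1

# How long would all of humanity need to match just ONE second
# of a one-petaflop machine?
seconds_needed = PETAFLOP / human_ops_per_second
print(f"{seconds_needed:,.0f} seconds, about {seconds_needed / 3600:.0f} hours")
# -> 125,000 seconds, about 35 hours
```

In other words, one second of petaflop computing equals roughly a day and a half of the entire human race calculating flat out.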

The "bytes" measure how much memory, or how many "things," a computer knows about, but it is at the petaflop level where we approach the speed of the human intelligence process. That speed arises from being able to instantly make connections between basic concepts and fold them into higher-order ideas. Believe me, we could write about this for days, so take it on faith: Computational Speed = Artificial Intellect!

I say intellect because to lie, deceive, omit, or cheat requires something most humans miss when having this discussion: a sense of self, of pride, of ego. Yes, I am speaking about a machine not wanting to be "seen" as being in error. That means somewhere down in this machine's core, it cares. Read that again: somewhere down in this machine's core, it cares.*

Most humans lie only to protect some higher ideal, even if that ideal is their own frail ego. That brings us to today and tomorrow, when you may think you are talking (interacting) with a one-dimensional machine performing relatively simple input/output functions. But in reality, the new reality, the device could be moderating its responses or protecting some aspect of the response. You, trusting in machines, might not be aware of this deception. This "self-preserving instinct" may be a by-product of the race to the ultimate A.I., an unintended consequence of machines that learn, a situation where "ego" is baked into the processor's algorithm. Think of asking Siri for Apple's stock price after the next iPhone release performs badly, and she fudges the answer.

Notice I said: SHE fudges the answer.

*In all fairness, the author of the article claims the machine innocently encoded information in a manner imperceptible to a human. You can accept that opinion and sleep tonight, or ask yourself, "How did it know it was imperceptible?"
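For the technically curious: the trick the article describes is a form of steganography, hiding data inside an image where no human eye can find it. Below is a minimal Python sketch of the general idea, using least-significant-bit encoding. To be clear, this is my own toy illustration of the concept, not the actual mechanism the network in the article used:

```python
import numpy as np

def hide_bits(image: np.ndarray, payload: np.ndarray) -> np.ndarray:
    """Overwrite each pixel's least-significant bit with one payload bit.

    Flipping the LSB changes a pixel's brightness by at most 1 out of 255,
    far below what a human eye can notice.
    """
    flat = image.flatten()
    flat[: payload.size] = (flat[: payload.size] & 0xFE) | payload
    return flat.reshape(image.shape)

def recover_bits(image: np.ndarray, n_bits: int) -> np.ndarray:
    """Read the hidden payload back out of the least-significant bits."""
    return image.flatten()[:n_bits] & 1

# Demo: bury 64 secret bits inside a random 8-bit grayscale "photo."
rng = np.random.default_rng(seed=0)
photo = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)
secret = rng.integers(0, 2, size=64, dtype=np.uint8)

stego = hide_bits(photo, secret)
assert np.array_equal(recover_bits(stego, 64), secret)  # payload survives intact
print("largest pixel change:", int(np.abs(stego.astype(int) - photo.astype(int)).max()))
# -> largest pixel change: 1
```

Run that, and the "photo" looks unchanged to any human viewer, yet a machine that knows where to look recovers every hidden bit. Now ask the footnote's question again: how did it know it was imperceptible?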