The ghost in the machine


“The computer wants you to click this button.”

“It thinks you asked for something else.”

“He’s mad at you.”

Thousands of generations ago, we evolved our way into a magnificent hack. It turns out that we can more safely navigate the world by imagining that other people have a little voice in their heads just as we have one in ours.

By projecting that narrative voice onto others, we avoided fights that could have been fatal. It’s a powerful shorthand that lets us use limited brain processing power to navigate complicated cultural situations.

It worked so well that we began applying it to dogs, to lizards, and even to the weather. This is where many bad decisions and superstitions originate.

The truth, of course, is that your cat doesn’t have a voice in her head. But we still act like she does. And that cloud doesn’t really have an angry face in it, a bug so common that we even gave it a name: pareidolia. The mistake is almost universal.

And now, AI chat is putting this common-sense shortcut to the test. We know it’s nothing but a code base, and yet within minutes, most normal humans are happily chatting away, bringing the very emotions to the computer that we’d bring to another person. We rarely do this with elevators or door handles, but once a device gets much more complicated than that, we start to imagine the ghost inside the machine.

If it’s working, keep at it.

The problems arise when the hack stops working. When we start making up stories about the narrative intent of complex systems. Sooner or later, we end up with conspiracies, misunderstandings about public health, and missed opportunities in the financial markets.

Emergent systems (like the economy, computers, and the natural world) aren’t conscious.

It’s hard to say, “I know I’m making up a human-centric story to explain systemic phenomena, but it’s a shortcut I use… do you think the shortcut is helpful here?”




