Loren Niemi

Some Thoughts about AI


Now is the future (or at least part of it) that they said was coming, and frankly it is as bad as they said it would be.


What? Who is the “they” that said that? How “bad” is this bad compared to every other prediction of impending doom?


Let me begin answering those questions by saying that in 2010, I spent a few days as a consultant to the Association for the Advancement of Artificial Intelligence. My task was to explore story, and more specifically the structure of story, for, with, and by computers. What followed was a “state of the art” observation of three groups of developers. Their efforts can be encompassed by the dictum that “story is a structured narrative,” but what that meant to each group was different.


The first group was trying to get computers to read at a third-grade level. Why third grade? That is the level at which a reader can infer something from a text. “Jack and Jill went up the hill to fetch a pail of water” is a declaration of fact. “Jack fell down and broke his crown...” is an instance of inference, as we must “guess” whether the crown was his head or a piece of ceremonial jewelry. The focus of their work was to build a base for AI that would allow a computer to read a text and not only understand it in the literal sense of what the words were, but also draw a (correct) conclusion about its meaning. Fundamental to the idea of AI as “conscious” is that act of correctly inferring meaning from text, and beyond that capacity, doing better than humans do in both the breadth of sources consulted and the speed of recognition.


As a subsidiary question: can AI recognize irony? Can it employ it to say something that has a double meaning?


That question could have been put to the second group, who were trying to get computers to write. Write what? A full sentence containing a coherent thought, for starters. Pedro, who was working on this problem, said they gave a computer the librettos of all Italian operas and told it to write a synopsis. It offered: Antonio loves Violetta, Violetta is ill, Antonio cheats on Violetta, Violetta dies, Antonio dies. On the one hand, it is the synopsis of some kind of opera; on the other, there is no emotion in it, which we know is the essence of opera. The essential methodology he employed is the one most AI programs now use to create text: draw words, phrases, and images from thousands of examples and combine them in a pattern that more or less resembles the examples drawn from. What we have now is not what they are looking for…
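The recombination method described here can be illustrated with a toy sketch. This is not Pedro's program, just a minimal Markov-chain text generator of my own construction, using the opera synopsis itself as a stand-in corpus: it learns which words follow which, then stitches new text from those patterns, resembling the source without understanding it.

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each run of `order` words to the words observed to follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=12, seed=0):
    """Walk the chain to stitch together new text from learned patterns."""
    rng = random.Random(seed)
    key = rng.choice(list(chain))
    out = list(key)
    for _ in range(length - len(out)):
        followers = chain.get(tuple(out[-len(key):]))
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

# A stand-in corpus: the synopsis the computer produced.
corpus = ("Antonio loves Violetta. Violetta is ill. "
          "Antonio cheats on Violetta. Violetta dies. Antonio dies.")
print(generate(build_chain(corpus)))
```

The output rearranges the source's phrases plausibly enough, but, as with the opera synopsis, nothing in the mechanism knows what any of it means.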


To put it another way, AI is looking to replace the 100 monkeys typing for 100 years to get “Hamlet” with a program that will come up with something as emotionally complex and meaningful as “Hamlet,” but not simply a repetition of that story. They want the creation of something, in the space of a few minutes, that is as good or better. Who needs monkeys? Who needs writers, for that matter? As Kevin Drum recently said about the future of writers and AI: “At first it will just be entry-level (copy) writers, but very quickly the software will improve and journeyman writers will also be replaced. Then it will be the very best writers. Even poets and novelists will eventually be replaced.”


Think about that for a while and ask yourself, what kinds of stories would follow?


The third group were the gamers, who were less concerned with the story as such than with how many choices could be offered a player while still having them arrive at the pre-ordained conclusion of the story. They wanted to map out every game option and fit every decision into the essential narrative, so that a player’s action appeared to change the story but did not really… This too is consistent with where AI is headed: the database drawn from is enormous, while the offered results are few, held within a frame of acceptable outcomes. A truly conscious AI would in fact rewrite the story with every player decision and not necessarily come to a foregone conclusion. Recently, in a simulation, an AI program whose mission was to “kill” the enemy decided that the human operators who were supposed to be the safeguard against killing innocent bystanders were preventing it from accomplishing its prime task, and killed them. Welcome to the nightmare...
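The gamers' design can be sketched in a few lines. This is a hypothetical story graph of my own invention, not any game the group built: the player faces a choice at each node, yet every path through the branches converges on the same ending, which is exactly the "apparent choice, pre-ordained conclusion" structure described above.

```python
# A toy branching story: two apparent choices, one pre-ordained ending.
STORY = {
    "start":  {"text": "A knock at the door.",
               "choices": {"open": "greet", "hide": "found"}},
    "greet":  {"text": "A stranger hands you a letter.",
               "choices": {"read": "ending"}},
    "found":  {"text": "The stranger finds you anyway.",
               "choices": {"read": "ending"}},
    "ending": {"text": "The letter says: the story ends the same way.",
               "choices": {}},
}

def play(path):
    """Follow a list of choices from the start; return the nodes visited."""
    node, visited = "start", ["start"]
    for choice in path:
        node = STORY[node]["choices"][choice]
        visited.append(node)
    return visited

# Different branches, identical destination.
print(play(["open", "read"])[-1])   # ending
print(play(["hide", "read"])[-1])   # ending
```

The middle nodes differ, so the player feels agency, but the graph admits no final node other than "ending" — the foregone conclusion is built into the map itself.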


In my work with each of these groups, it was clear to me that their focus on a particular feature of computers and story hindered their understanding of the whole, and more specifically of those functions of story that are the most human. It is not the breadth of our sources or the speed of our recall but the integration of emotion, imagination and (at least in performance) the ability to “read the room” that makes the most potent stories.


Near the end of my time with the AI folks, I went to a cocktail party. Ignoring the fact that there were a dozen men to every woman in the room, what was most striking was the moment when, with a trumpet fanfare, doors opened and robots came into the room and began to dance. It was an awkward, stiff, and quickly repetitious set of movements, not quite to the beat of the music. In the crowd a few men danced along. They had the same movements, and I realized that they were probably the ones who had programmed the robots.


Recently I saw a video of “state of the art” robots running, turning cartwheels, and doing backflips with a speed and grace that was a measure of how far we’ve come since that reception. Thirteen years is several generations of improvement in capacity and performance, and for all the progress being touted, it does not leave me feeling confident about the story being told. My inner Luddite says that just because we can do something does not mean we should. Look at Icarus. Look at the Golem. I am left with the feeling that the developers of AI, intoxicated with possibility, have not taken to heart the meaning of those cautionary tales.


Stories have served as one of the most fundamental tools for our advancement. Sure, the opposable thumb helps. Language helps. But on the most basic level, story tells us not only how to use the thumb but why. It is language encoded and shared from one person to another, across time and place, for a purpose. It models actions and emotions. It allows us to see, hear, and feel something outside of ourselves and, in so doing, internalize the meaning of the story.


I do not think we are at a point where AI is creating and sharing original stories for entertainment or education independent of the constraints of a program. That day may come, and sooner than we expect. What those stories are is anybody’s guess, but if AI is modeled after human thought, it is likely that those first stories will be its own creation myth.






