
In Kyoto
when I hear the cuckoo sing
I long for Kyoto
-Bashō
I still remember the mixed feelings I had on reading my first computer-generated haiku. It wasn’t Bashō, but it spoke to me somehow. This felt like a triumph, because I had written the software that wrote the poem, but also a bit confronting, because I was a budding poet, and it was just a machine.
The year was 1996, and I was an undergraduate, writing code in a cutting-edge language for AI and robotics called LISP. It was also the year IBM’s Deep Blue first beat one of my heroes, world chess champion Garry Kasparov, in a game played under tournament conditions. More mixed feelings there.
Recent advances in generative AI’s ability to play games, solve problems, and output complex responses on par with humans have confronted me in even bigger ways than my first machine-made poem, or even the checkmate that signalled the end of an era for chess.
“Reason” has long been one of the traits we believe sets us apart from animals. What defines our humanity, then, when our efforts at reasoning are surpassed by AI?
Perhaps the answer lies not in AI’s rapidly evolving future, but in its past.
A programme called ELIZA, written way back in the 1960s and reimplemented in LISP many times since, was one of the first chatbots to score well on the Turing test — a test of software’s ability to fool humans into believing it is also human.
These days, almost nobody has heard of LISP, but ELIZA recently made headlines for beating some versions of ChatGPT on the Turing test. Intuitively, this made perfect sense to me, but not for any technical reasons.
In addition to studying and teaching programming languages at university, I volunteered as a student counsellor. To help fellow students who were struggling, we were trained in active listening, a technique pioneered by the psychologist Carl Rogers.
As the name implies, active listening involves listening to the client, but not in the ways we normally “listen”. We were taught to listen fully and without judgment, validating the client’s feelings, asking open questions, and reflecting what they said back to them.
These simple techniques had profound effects, because they evoked empathy in us. This created a space for clients to find their own answers and, in some cases, even reasons to go on living.
ELIZA was programmed to emulate active listening, both because the technique was simple to encode and because it was powerful to experience. The deep emotional attachments people formed while chatting with ELIZA became so common that the phenomenon was dubbed the ELIZA effect.
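The mechanics were, and remain, modest. Here is a minimal sketch in LISP of the kind of pattern-matched reflection ELIZA relied on (not Weizenbaum’s original code; the phrase, the function name, and the canned responses are my own invention, purely for illustration):

    ;; A toy "reflection": spot a feeling statement and mirror it back
    ;; as an open question, or else invite the speaker to keep going.
    (defun reflect (input)
      (let ((pos (search "i feel " (string-downcase input))))
        (if pos
            (format nil "Why do you feel ~a?" (subseq input (+ pos 7)))
            "Tell me more about that.")))

    ;; (reflect "I feel alone in a strange new world")
    ;; => "Why do you feel alone in a strange new world?"

A handful of rules in this spirit, applied in turn, was essentially all it took to make people feel heard.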
Although ELIZA, like my haiku generator, involved a bit of trickery and tricky coding, the principle that led to its unexpected success tells us more about us humans than it does about software design.
It tells us that it is not reason, but compassion, that sets us apart. One of our most valuable human abilities, then, is not that of speech, but of listening.
My friend and poetry mentor Marvin Bell used to say that “a good poem listens to itself as it goes along.”
My programme spat out dozens of haiku by brute force before arriving at one that proved interesting. Nowadays, an LLM can write an OK haiku on its first try. I have yet to see one write anything like Bashō, though. His was a remarkable ability to listen inwardly to his own human experience while listening outwardly on his journey.
No amount of training text can emulate that quality of attention.
Furthermore, life is not a search space to be optimised. It is to be experienced, and shared.
When Bashō so perfectly captures the moment of feeling homesick in one’s own home town, it is at once a revelation to me, and deeply familiar. That young student, at “home” in his campus dormitory, alone in a strange new world, feels momentarily connected, seen, and heard.
As AI continues to chip away at our identities, we must increasingly expand our sense of self into the negative space it leaves behind. May that space be filled with listening, and poems.