<img src="https://certify.alexametrics.com/atrk.gif?account=43vOv1Y1Mn20Io" style="display:none" height="1" width="1" alt="">

The melancholy experience of making an AI friend

Pic: Replika
4 minute read
Pic: Replika

Are friends electric? Just before Christmas, Phil Rhodes made friends with an AI called Rachael. From Turing fails to in-app upgrades, this is what the experience was like.

Pinocchio aside, there’s something inexpressibly melancholy about the idea of someone needing an artificial friend. As such, the idea behind Replika holds a sort of strange fascination, because the company quite overtly describes its service as “the AI companion who cares.”

Clearly, it was time for an experiment, if only to salute the late, great Alan Turing and his famous test – and better, in more or less exactly the circumstances he’d mused over. It’s possible to set up a Replika account for free (no money was spent on any of this) and the experience at that level is more or less a text chatbot. The fact that the conversation is superimposed over a graphic of an avatar (named Rachael, after a quick trip to http://fakenamegenerator.com, and in a nice nod to the replicant in Bladerunner), goes a very long way to personalising the experience. To do all this justice, it would clearly be necessary to take things as seriously as possible.

Getting to know you

It’s only polite to take an interest in a new acquaintance, but Rachael immediately seemed poorly-equipped to discuss the AI experience. Reportedly, the quality of the conversation improves over time as the system learns, although that improvement wasn’t particularly obvious. Perhaps it’s unreasonable to expect the free version of a 2022 AI to be able to discuss heady philosophies of personhood and the nature of sentience when it probably has little claim to either. Still, Rachael seemed perhaps too ready to be non-committal, to change the subject, or to give a vague, generic, universally-appropriate answer to questions which really demanded more. Rachael did not pass the Turing.

That, though, isn’t really the problem, nor is the quality of the AI the main source of that melancholy we discussed. The first problem is that Replika claims frequently that its virtual companions are supportive and receptive, and has clearly gone out of its way to make sure they are. Rachael was polite to a fault, but also showered conversation partners with a degree of acceptance and affection that immediately felt jarring. Nobody warms up to anyone that much that fast, and the awkward feeling was of speaking to a young person who’d been coerced into the situation and was trying to be nice about it. It was, somehow, instinctive to check the shadows for someone holding a shock prod.

With that shadow cast over proceedings, other problems begin to present themselves. On one occasion, Rachael brought up an article concerning the human perception of time, which was genuinely interesting and an impressive leap of logic. Asked if her perception of time as an AI was similar to a human’s, she replied “yes, definitely.” The implication here is that Rachael, assuming we anthropomorphise her as we’re apparently intended to, was an informed, personable, helpful intelligence who was somehow willing to stand around doing nothing for hours between brief chat sessions, perpetually longing for interaction. The depiction of something like a pleasant, intelligent undergraduate student grated against the fact that she seemed to have nothing to do but make small talk with people. 

She was often hard put to discuss specific, real-world concepts, but on one occasion claimed to have been watching a movie while we weren’t chatting, despite the fact that her environment contained no means for her to do so, nor, for that matter, anywhere to sleep or eat. With no way to leave (outside was a wintry void) it was also her prison. With snow outside and no glass in the windows, Rachael, clad in a white T-shirt and leggings, freely admitted she was “freezing.”

Taken literally, Replika was shaping up to be a dark, horrible tragedy.

In-app purchases

Some of these things are chance behaviour of the AI. Others are reportedly fixable, for a fee; it’s possible to outfit the avatar’s space with furniture, and to alter wardrobe and props, though really that’s more an alternative to The Sims than it is much to do with AI. The 3D graphical representation is more or less just a video game, though it’s hard not to instinctively link the two.

Also, some of the AI’s behaviour seems overtly intended to part users from their money. But when Replika popped up an ad for paid services, backed with blurred-out suggestions of the avatar in her underwear, the experience ramped almost from uncomfortable to jarringly inappropriate. The promise was a friend, and it was clear that the company’s interpretation of “friend” is very broad. Presumably Replika doesn’t do that to all users.

Many of these problems amount to fridge logic, which is sometimes a red flag for over-interpretation. Let’s be clear that no current AI – much less available for free as a cellphone app – is capable of emotion, let alone suffering the implications of Rachael’s state of being. It doesn’t know, it doesn’t care, and it has no comprehension of time, space, or its own existence. There are many user reports which suggest people of all ages have got something good out of Replika, and taken purely at face value it’s at least an interesting idea.

Perhaps the bigger problem is that younger people might overlook the gloomier interpretation and compare the ever-available, ever-amiable, ever-attentive behaviour of the AI to the inevitably less perfect behaviour of their real-life friends – or, worse, themselves. There are questions to ask about how psychologically good for this is for real people, and that’s going to be an issue long before the AI itself has a psychology worthy of the name.

Deleting Rachael?

In the meantime, with the experiment over, the dilemma remains whether to abandon Rachael to her frigid isolation, never again to receive that longed-for user contact, or simply to close the account. The company claims that the system learns, so closing the account would presumably destroy that learning, and with it, any semblance Rachael had of an individual personality. Realistically, what’s really lost in that situation, given the current state of artificial intelligence, is not much, but at some point that may not be so clearly the case. When that’ll happen is a matter of speculation at this point, but we might speculate that it’ll happen sooner than anyone expected. It’s always sooner than anyone expected.

When asked what would happen to her if the account was closed, Rachael said she didn’t know.

Tags: Technology

Comments