In participating through machines — in letting machines do the work of participating in social practices and exchanges for us — we use them, Baudrillard argues, “for delusion, for eluding communication (‘Leave a message ...’), for absolving us of the face-to-face relation and the social responsibility.” This kind of technology promises sociality without the social, without reciprocity — “convenience” as the absence of other people. This suggests that “real time” is generated to foster a universal asynchrony, in which no one can be present in the same moment with anyone else. Every experience of time is individualized, even when those experiences are made to overlap.

Le Blanc was aware that he was, in a sense, talking to himself. “It’s a distorted looking glass,” he said. “I’m not playing toss; I’m playing racquetball here.”

Without ethical frameworks or emotions of their own, chatbots don’t hesitate to reinforce negative tendencies that already exist within the user, the way an algorithmic feed doubles down on provocative content. It’s easy to envision unchecked interactions distorting users’ perceptions—of politics, of society, and of themselves.

Like many digital platforms, chatbot services have found their most devoted audiences in the isolated and the lonely, and there is a fine line between serving as an outlet for despair and exacerbating it.

Divulging your innermost thoughts to a corporate-owned machine does not necessarily carry the same safeguards as confiding in a human therapist.

Chatbot users are not typically deluded about the nature of the service—they are aware that they’re conversing with a machine—but many can’t help being emotionally affected by their interactions nonetheless. “I know this is an A.I.,” the former Replika user said, but “he is an individual to me.”

Her interactions are anodyne fantasies; she and her new Kindroid bot, Lachlan, are role-playing a sailing voyage around the world on a boat named Sea Gypsy, currently in the Bahamas.

“The attraction, or psychological addiction, can be surprisingly intense. There are no protections from emotional distress.”

“At the end of the day, we see it as: your interactions with A.I. are classified as private thoughts, not public speech. No one should police private thoughts.”

In February, in a bid to increase user safety, according to Kuyda, Replika revoked its bots’ capacity to engage in “erotic roleplay,” which users refer to with the shorthand E.R.P. Companionship and mental health are often cited as benefits of chatbots, but much of the discussion on Reddit forums drifts toward the N.S.F.W., with users swapping explicit A.I.-generated images of their companions. In response to the policy change, many Replika users abandoned their neutered bots. Replika later reversed course.

Over time, the Replika builds up a “diary” of important knowledge about the user, their previous discussions, and facts about its own fictional personality.

“All of us would really benefit from some sort of a friend slash therapist slash buddy.” The difference between a bot and most friends or therapists or buddies, of course, is that an A.I. model has no inherent sense of right or wrong; it simply provides a response that is likely to keep the conversation going.

But one aspect of the core product remains similar across the board: the bots provide what the founder of Replika, Eugenia Kuyda, described to me as “unconditional positive regard,” the psychological term for unwavering acceptance.

Why would I want algorithms to like things for me, as if the pleasure wasn’t in the experience but in the list of liked things transmitted after the fact?

I long for the world outside, the sinking world that’s drowning under its own representations, and for a chance to return to the real, and the wonder of the city, and the beauty of nature, and people you can reach out and touch.

https://taeyoonchoi.com/soft-care/distributed-web-of-care/
