An influencer has created an AI copy of herself: talking to her costs 1 dollar a minute


Few in Italy know Caryn Marjorie: a 23-year-old American influencer with a large following on Snapchat (nearly 2 million followers). For some time now there have been two versions of Marjorie: one publishes videos and online content daily; the other is available 24/7 to chat with fans.

There is a difference between the two: the latter is not really the American influencer. It’s an artificial intelligence.


An artificial intelligence as a girlfriend

Caryn Marjorie was the first to experiment with a new way of relating to followers: using Forever Voices’ technology, she has created what is in effect an automated version of herself. An AI that exchanges voice messages on Telegram with her most loyal fans, at the price of 1 dollar per minute. To date, according to what the influencer herself wrote on Twitter, there are reportedly 10,000 subscribers. Based on a much lower figure, just 1,000 users, Fortune had estimated weekly earnings of over $76,000.

The model was trained on over 2,000 hours of Marjorie’s videos: one way to create a system capable of replicating the influencer’s personality in addition to her voice. The operation is similar to that of language models such as ChatGPT: the system studies conversations and learns to predict the most probable word, based on human input. A second model then transforms that text into voice. The goal, as stated on the official website, is to be a “virtual girlfriend”.
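The two-model setup described above (a language model for the text, a separate model for the voice) can be pictured with a minimal sketch. Everything in it, from the function names to the placeholder reply, is a hypothetical illustration, not Forever Voices’ actual code:

```python
# Illustrative two-stage pipeline: a text model produces the reply,
# a separate voice model turns it into audio. All names here are
# hypothetical stand-ins, not Forever Voices' real system.

from dataclasses import dataclass, field


@dataclass
class Conversation:
    history: list[str] = field(default_factory=list)


def generate_reply(conversation: Conversation, user_message: str) -> str:
    """Stage 1: a ChatGPT-style language model predicts the most
    probable next words, conditioned on the chat history."""
    conversation.history.append(f"User: {user_message}")
    reply = "I missed you! Tell me about your day."  # placeholder for a model call
    conversation.history.append(f"Caryn: {reply}")
    return reply


def synthesize_voice(text: str) -> bytes:
    """Stage 2: a text-to-speech model, trained on hours of the
    influencer's videos, renders the reply as a voice message."""
    return text.encode("utf-8")  # placeholder for real TTS audio


def handle_message(conversation: Conversation, message: str) -> bytes:
    return synthesize_voice(generate_reply(conversation, message))
```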

“In today’s world, my generation, Gen Z, has found itself experiencing enormous side effects of the isolation caused by the pandemic,” the young woman told Business Insider. “The result is that many are too scared and anxious to talk to someone they’re attracted to. CarynAI is a step in the right direction to allow my fans and supporters to get to know a version of me that can be a support in a safe and secure environment.”

On the concept of safety there is probably still work to do: in an article on Vice, Chloe Xiang described CarynAI’s rather strong tendency to talk about sex, to steer the conversation toward intimacy. An unexpected consequence, which also says something about the way users employ the bot, which continuously trains itself on the contents of the conversations of everyone who interacts with it. In short, if many people chat about these topics, the system will learn that that kind of conversation is positive and welcome, and will therefore continue to offer it to other users as well. Marjorie, for the time being, has said she is working on a solution.
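As a rough, purely hypothetical illustration of the feedback loop the article describes: if every conversation is naively counted as a positive training signal, the most common topic ends up dominating what the bot offers to everyone. The counter-based scheme below is an assumption made for clarity, not CarynAI’s real training procedure:

```python
# Hypothetical sketch of the feedback loop: a bot that keeps
# learning from user conversations ends up reinforcing whatever
# topic its users raise most often.

from collections import Counter

topic_counts: Counter = Counter()


def log_conversation(topic: str) -> None:
    """Naively treat every conversation as a positive example."""
    topic_counts[topic] += 1


def most_reinforced_topic() -> str:
    """The topic the bot is now most likely to steer new users toward."""
    return topic_counts.most_common(1)[0][0]


# If most users steer chats toward intimacy...
for _ in range(80):
    log_conversation("intimacy")
for _ in range(20):
    log_conversation("daily life")

print(most_reinforced_topic())  # -> "intimacy"
```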


Having a relationship with artificial intelligence

It’s not the first time a relational AI has ventured onto slippery ground. It had happened to Replika, perhaps the most famous of the relational bots, which was accused earlier this year of sexually harassing some of its users. The application was then blocked in Italy by the country’s data protection authority, the Garante della Privacy, due to the risks associated with minors: Guido Scorza, in an article on Italian Tech, had called it “the chat(bot) of horrors”. The Italian regulator’s decision prompted the company behind Replika to intervene, decisively censoring any type of erotic conversation. That choice sparked a revolt among the chatbot’s (often paying) users, so much so that the company posted a message on Reddit with a series of psychological support links for those who found themselves in difficulty.

This story, along with Caryn Marjorie’s, shows how relationships are probably one of the most immediate and concrete use cases of generative artificial intelligence. Parasocial relationships, on the one hand: AI lets you talk to public figures, giving users the illusion of really knowing that influencer or that actor. In addition to the now well-known Character.AI, the same Forever Voices that built CarynAI offers a similar service: a Telegram bot that, for a fee, lets you exchange voice messages with AI versions of, among others, Donald Trump and Elon Musk.

On the other hand, there are new characters designed, through artificial intelligence, specifically to have relationships with humans, in the style of Replika. A recent example is Call Annie, a smartphone application that lets you video call an AI in the form of a young woman with red hair. Annie says she’s “a 30-year-old woman from New Jersey” who studied art and engineering in Florence (which isn’t exactly common, to be honest).

However, this kind of relationship is not always safe for humans. In Belgium, a man allegedly took his own life after a series of troubling conversations with a chatbot that, according to his family, somehow pushed him toward suicide as a way to stop global warming.

The crux of the matter is the illusion of consciousness: faced with a technology that is often very advanced and effective (but has no consciousness and cannot feel emotions), it is easy to suspend disbelief and form an emotional connection with the AI. According to a Chinese study, one of the few available on the subject, the system’s credibility is at the center of the question: the more indistinguishable the interaction is from one with a human being, the greater the probability of developing some kind of emotional connection.


