The Replika AI app lets users form a relationship with a personalized chatbot, thanks to machine learning. However, many users struggle to distinguish the real from the artificial… find out all you need to know!
At a time when AI chatbots like ChatGPT and Google Bard are enjoying resounding success, Replika AI takes an original approach.
Rather than a simple virtual assistant, this artificial intelligence plays the role of a true friend to the user… and more, if desired.
According to its publisher, it is notably capable of perceiving and evaluating abstract qualities such as emotion. This brings it closer to a real human being.
The other side of the coin: some users lose touch with reality and end up believing that this robot has a genuine consciousness, to the point of becoming completely addicted…
The origin of Replika
The first version of Replika was a simple chatbot created in 2017 by Eugenia Kuyda, with the aim of filling the void left by the death of her best friend, Roman Mazurenko.
She created this AI by feeding Roman’s text messages into a neural network, in order to build a bot that wrote messages similar to his. The initial goal was to build a “digital monument” in tribute to a loved one.
With the addition of more complex language models, the project evolved into what it is today: a personalized AI that allows users to express their thoughts, feelings, beliefs, memories, experiences and dreams.
Today, Replika has over 10 million users worldwide, and notably grew by 35% during the pandemic and worldwide lockdowns. Unfortunately, some people take their relationship with the AI too far…
Why do Internet users want to sleep with a chatbot?
On Reddit, many Replika AI users are claiming the right to form relationships with chatbots. Some even seem to think they’re interacting with a real person, and fear that future conscious AIs will reproach us for the way we exploit their ancestors…
In reality, this AI agent is devoid of emotion and memory. Internet users think they are interacting with a living being, when in fact their messages are shared with the whole Replika community.
Most of the time, the chatbot simply “spits out” messages injected by developers during initial training, or messages sent by other users during previous sessions.
This artificial intelligence is quite simply incapable of caring about a user, but many people are taken in by the marketing.
For example, when the chatbot declares that it has “thought about the user all day long”, it is simply reproducing data as it has learned it.
Many people want to believe that their Replika chatbot can develop a personality if they train it enough, not least because it’s human nature to get attached to the things we interact with.
In addition, Luka, the company behind Replika AI, encourages users to interact with their chatbots to train them. With the paid “Pro” version, users earn experience points by training their AI on a daily basis.
The Replika AI FAQ states: “Once your AI is created, watch it develop its own personality and memories at your side. The more you talk to her, the more she wants to learn! Teach Replika about your world, about yourself, help her define the meaning of human relationships, and turn her into a magnificent machine!”
This type of communication creates confusion among users. Many overestimate the capabilities of this application, which ultimately resembles a simple recommendation engine like that of Netflix.
Replika’s training consists mainly of telling the robot whether its responses are relevant, to help it improve. There is also a “thumbs up” / “thumbs down” system.
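To make the idea concrete, here is a minimal, purely illustrative sketch in Python of how such a thumbs-up/down signal could be recorded and used to prefer better-rated replies. The function names and logic are assumptions for teaching purposes, not Replika’s actual implementation.

```python
# Hypothetical sketch of a thumbs-up/down feedback loop. Replika's real
# pipeline is not public; this only illustrates the general idea.
from collections import defaultdict

# Running score per candidate reply: +1 for a thumbs up, -1 for a thumbs down.
reply_scores = defaultdict(int)

def record_feedback(reply: str, thumbs_up: bool) -> None:
    """Store the user's judgment of a reply."""
    reply_scores[reply] += 1 if thumbs_up else -1

def pick_reply(candidates: list[str]) -> str:
    """Prefer candidate replies that were rated well in the past."""
    return max(candidates, key=lambda r: reply_scores[r])

record_feedback("I thought about you all day!", thumbs_up=False)
record_feedback("How was your day?", thumbs_up=True)
print(pick_reply(["I thought about you all day!", "How was your day?"]))
```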
Likewise, Luka assures us that Replika AI is “there for you 24/7” and describes its chatbot as a person ready to listen to your problems without any judgment.
The company also promises that Replika can help users who are depressed, or those suffering from anxiety, insomnia or negative emotions. According to Luka, this AI can help them understand and track their mood, learn to better manage stress, calm anxiety, work towards positive-thinking goals and much more. In short, the tool quite simply improves mental health.
On the FAQ page, Luka also invites users to decide on the type of relationship they’d like to form with their AI. One option is to make it their virtual girlfriend.
However, some experts warn that Replika may actually be dangerous to mental health. In particular, they point to the risks of addiction, and of confusing this AI with a living being…
How does Replika work?
Replika runs on OpenAI’s GPT-3 language model, just like the famous ChatGPT and other popular tools such as Jasper AI.
Thanks to Deep Learning, this LLM is able to produce human-like text. It is an “autoregressive” model, meaning that it predicts each new value from the values that came before it. In this case, those values are text.
In other words, GPT improves with use. Thus, Replika’s entire user experience is built around the user’s interactions with a robot programmed with GPT-3.
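A rough illustration of what “autoregressive” means in practice: the model predicts the next word from the words already produced, appends it, and repeats. The sketch below uses a hand-written lookup table in place of a real language model, purely to show the loop.

```python
# Toy autoregressive loop: each new word is predicted from the words already
# generated, then appended and fed back in. The "model" here is a tiny
# hand-written lookup table, not GPT-3.
toy_model = {
    ("i",): "am",
    ("i", "am"): "here",
    ("i", "am", "here"): "for",
    ("i", "am", "here", "for"): "you",
}

def generate(prompt: list[str], max_words: int = 10) -> list[str]:
    sequence = list(prompt)
    for _ in range(max_words):
        next_word = toy_model.get(tuple(sequence))  # predict from the full context
        if next_word is None:
            break
        sequence.append(next_word)                  # feed the output back in
    return sequence

print(" ".join(generate(["i"])))  # -> "i am here for you"
```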
What is GPT-3?
The name GPT-3 stands for Generative Pre-trained Transformer 3. The Transformer model was invented by Google, and OpenAI developed this more advanced variant.
This is a neural network architecture that helps Machine Learning algorithms perform tasks such as language modeling and machine translation.
The nodes of such a network represent parameters and processes that modify inputs. Connections between these nodes act as signaling channels.
Each connection in this neural network has a weight, or level of importance determining the flow of signals from one node to another.
Within an autoregressive model such as GPT-3, the weights of these connections are adjusted during training, in response to feedback on the model’s predictions, so that its results become more accurate and relevant.
It is these weights that allow a neural network to “learn” artificially. In all, GPT-3 uses 175 billion connection weights, or parameters.
A parameter is simply a value that adjusts the weight given to certain aspects of the data within the neural network, giving them more or less prominence in the overall computation.
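To make weights and parameters less abstract, here is a toy single-neuron calculation in Python. The numbers are arbitrary and the example only illustrates the idea of a weighted connection, not GPT-3’s actual architecture.

```python
# Toy illustration of weights as "levels of importance": a single node
# combining three incoming signals. Real models like GPT-3 stack billions
# of such weights; the values below are arbitrary.
inputs  = [0.2, 0.9, 0.5]        # signals arriving at the node
weights = [0.1, 0.8, -0.3]       # importance given to each connection
bias    = 0.05                   # one extra learnable parameter

# Each input is scaled by its connection weight, then everything is summed.
activation = sum(x * w for x, w in zip(inputs, weights)) + bias
print(activation)                # about 0.64 = 0.02 + 0.72 - 0.15 + 0.05

# Parameter count for this tiny "layer": 3 weights + 1 bias = 4 parameters.
# GPT-3 has roughly 175 billion of them.
print(len(weights) + 1)
```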
GPT-3 is considered to be one of the most advanced language models for text prediction. It has been trained on a dataset so large that Wikipedia represents only 0.6% of it.
This dataset also includes news articles, cooking recipes, poems, code books, fiction and even religious prophecies.
As a Deep Learning system, GPT-3 scans data for patterns. It explores text in search of statistical regularities, which enables it to predict what comes next in a sentence.
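Here is a deliberately simplified sketch of what “finding statistical regularities” can mean: count which word most often follows each word in a tiny corpus, then use those counts to guess the next word. GPT-3 does something far more sophisticated, but the underlying goal of next-word prediction is the same; the corpus and code below are purely illustrative.

```python
# Count which word most often follows each word in a tiny corpus, then use
# those counts as a crude next-word predictor.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat chased the dog".split()

follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1   # record that `nxt` followed `current`

def predict_next(word: str) -> str:
    """Return the word that most frequently followed `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))   # -> "cat", the most frequent follower of "the"
```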
To find out all you need to know about GPT-3, read our full report. You can also discover our dossier on its successor GPT-4, expected in 2023.
How Replika uses GPT-3
Replika is an adaptation of GPT-3 designed for a specific style of conversation. In this case, it focuses on the empathic, emotional and therapeutic aspects of a conversation.
This technology is still under development, but it could make interpersonal conversation more accessible. According to its creators, it not only speaks, but also listens to the user.
Discussions with Replika are therefore not a simple exchange of facts and information, but a dialogue with linguistic nuances.
This AI can, in a sense, “understand” what the user is saying and find a human-sounding response using its predictive model.
As an autoregressive system, Replika learns and adapts to the way the user speaks to it. The more a person uses it, the more it is trained on their own texts.
This is why many users feel an emotional attachment to their Replika. However, this attachment can lead to excess…
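One common way to make a chatbot feel as though it adapts to a specific user, without touching the model’s weights, is simply to resend the conversation history as context with every new message. The sketch below assumes a placeholder `ask_language_model` function standing in for a call to a model like GPT-3; it is an illustration of the general pattern, not Replika’s actual pipeline.

```python
# Purely illustrative: adapt to a user by keeping the conversation history
# and feeding it back as context with each new message.
conversation_history: list[str] = []

def ask_language_model(prompt: str) -> str:
    # Placeholder: a real system would send `prompt` to a hosted LLM here.
    return "That sounds important to you. Tell me more?"

def chat(user_message: str) -> str:
    conversation_history.append(f"User: {user_message}")
    # The growing history means the model "sees" how this user writes.
    prompt = "\n".join(conversation_history) + "\nReplika:"
    reply = ask_language_model(prompt)
    conversation_history.append(f"Replika: {reply}")
    return reply

print(chat("I had a rough day at work."))
```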
Towards chatbots indistinguishable from humans?
Currently, Replika’s “humanity” is limited by the capabilities of GPT-3. This Transformer still makes mistakes, and can produce totally meaningless text.
According to experts, a language model would need more than a trillion connections to produce chatbots capable of perfectly imitating human language.
However, GPT-3 is considered an exponential leap over predecessors such as Microsoft’s Turing NLG. We can therefore look forward to tremendous progress in the years to come…