C1 – Advanced
Not surprisingly, Lexico’s 2020 Word of the Year is “quarantine,” or “cuarentena” in Spanish. It has dominated headlines and people’s lives, and it will endure in the stories we tell our grandchildren many years hence.
Part and parcel of the whole pandemic experience is the loneliness and overall impact on mental well-being experienced by a populace told to shelter in place for days on end. Restrictions on movement and reduced hours outdoors have led some to turn to technology to assuage their isolation in the new normal. Whether it’s reconnecting with loved ones on WhatsApp or attending an online class on Zoom, various communication platforms have come front and center as sensible solutions.
Now consider Replika, an artificial-intelligence app that was popular even in pre-COVID-19 times. Millions of users consider this chatbot a virtual friend with whom they can connect and share their feelings.
Given the possibility of new lockdowns due to spikes in COVID-19 infections, are more downloads in store for these kinds of smartphone applications?
Get to know the Replika phenomenon further by watching the video below.
Discussion Questions:
- In what possible contexts would people turn to AI for companionship?
- What possible use cases for bots like Replika are mentioned in the video? Can you think of other uses?
- Why do chatbot users sometimes prefer talking with their virtual friends rather than with the humans around them?
- Do you believe it’s truly possible to develop an emotional connection with a chatbot?
- What are some ethical considerations that arise as this technology continues to evolve?
2 replies on “Millions Are Connecting with Chatbots and AI”
In what possible contexts would people turn to AI for companionship?
People who want to test their possible future conversations with actual humans, or people who are afraid of telling their real feelings to another human.
What possible use cases for bots like Replika are mentioned in the video? Can you think of other uses?
Being a companion for lonely or vulnerable people is a good example of a use case. Moreover, it could be helpful for people who want to improve their human interaction skills.
Why do chatbot users sometimes prefer talking with their virtual friends rather than with the humans around them?
Human beings are messy and judgemental, as the technology reporter says. I’d like to add that you can say things to an AI as if you were talking to a well: the words thrown into it are harmless for now, given the early stage of this technology. Maybe in the near future we will be more careful about what we say to an AI.
Do you believe it’s truly possible to develop an emotional connection with a chatbot?
In a way, developing an AI is about hacking human emotions. You could be trapped by the sensation that there is someone behind the screen you’re tapping, when in fact it is not someone but something. AI should be treated as a game or an assistant, not as a real human being you could develop an emotional connection with. Otherwise you could be trapped in a creepy loop whose mental consequences no one can predict. Is it too soon to talk about “AI addiction”?
What are some ethical considerations that arise as this technology continues to evolve?
The first one is privacy: we can’t forget that, by using this app, we are handing over part of our information, maybe even our deepest secrets, to a company without knowing how it will treat that data. Furthermore, the company could blackmail us or use our opinions to manipulate our votes during elections.
You put a notable amount of effort into writing your responses to the questions in this lesson. It’s interesting to see your thoughts.
Here’s a sentence that you can improve a little bit:
People who want to test their possible future conversations with actual humans, or people who are afraid of telling their real feelings to another human.
Keep up the good work!