Can AI Tools Like ChatGPT Experience Feelings And Capture Memories? This New Survey Has The Answer

A new survey is shedding light on how people perceive AI tools like ChatGPT.

The survey asked people for their thoughts on the leading AI tools and whether they felt large language models actually have feelings or the capacity to preserve memories. Surprisingly, roughly two-thirds of respondents said yes.

According to the research, carried out at the University of Waterloo, many respondents believe AI engages in some form of conscious behavior and can therefore have subjective experiences, such as forming memories or feeling emotions.

LLMs frequently adopt a conversational, human-like style in their output, and these human-seeming capabilities fuel debate over whether AI genuinely displays consciousness.

According to the experts, when people hold such strong beliefs about AI tools, it shapes how they work and interact with them, fostering stronger social bonds and greater trust.

Too much trust, however, is not a good thing: it can mean strong emotional dependence, fewer human interactions, and over-reliance on AI to make pivotal decisions.

Many AI experts have denied time and time again that AI might be conscious, but this new study suggests the general public sees things differently.

Because so many people hold this belief, the authors set out to measure it. With close to 300 participants from the US taking part, most of whom agreed that ChatGPT is conscious, the results may need to be considered when thinking about the future of AI.

Respondents were asked to comment on the tools' mental states, including their capacity for planning, reasoning, and emotion, as well as how frequently they themselves engaged with the tools.

Furthermore, the study revealed that the more people use ChatGPT, the more likely they are to attribute feelings to it, a significant finding given how frequently AI is used in our daily lives.

The authors also explained that the results demonstrate how powerful language is: a simple chat alone can lead many to assume that an agent which appears and works very differently from them might nonetheless have a mind of its own.

Beyond emotions, consciousness involves a kind of intelligence that is linked to moral responsibility: the ability to form plans, act deliberately, and exercise self-control, which carries ethical and legal implications.

This opens the door to further scientific experiments and studies to better understand AI models in detail and to track how people's perceptions of them evolve over time around the globe. After all, social bonding with LLMs and AI tools is a topic that has scarcely been examined before.

