Using AI in everyday life is fast becoming the norm, and some people now go so far as to use it to craft messages for friends and loved ones.
But a new study suggests you should think twice before doing so, especially if the recipient ends up finding out.
The study's authors found that fictional friends who relied on AI to help write their texts were seen as putting less effort into the friendship than those who wrote their messages themselves.
And according to the study's lead researcher, the effects of that perception reach beyond the text itself.
He says AI-assisted messages may be quick and convenient to produce, but they leave friends and loved ones feeling less satisfied, more uncertain about where they stand in the relationship, and doubtful about the role they play in each other's lives.
The study was conducted at The Ohio State University, and in fairness to AI, it wasn't the only thing that put people off.
Participants also reacted with resentment when they learned a friend had gotten help from another person rather than putting in the effort to express their own feelings.
No one likes discovering that a message came from an outside source, artificial or human, instead of straight from the friend's own heart or mind.
The study was recently published in the Journal of Social and Personal Relationships.
As AI chatbots such as ChatGPT grow more popular and easier to use, the authors argue, questions about the social costs of relying on them become increasingly relevant.
The research involved 208 adults who took part online. Each was asked to imagine having a close friend named Taylor, a friendship that had supposedly lasted for years.
Participants were then placed in one of three scenarios: experiencing burnout and needing support, having a conflict with a friend and needing advice, or having a friend's birthday coming up.
Each participant wrote a short message to Taylor describing their situation on a computer screen.
Everyone then received a reply from Taylor and was shown an initial draft of it. One group was told Taylor had used AI to edit the message and strike the right tone, another was told Taylor had gotten help from another person to make revisions, and a third was told Taylor had made the edits entirely on her own.
Although the final message was described as thoughtful, reactions differed sharply. Many participants felt Taylor's use of AI was wrong and her reply unjustifiable, while those told Taylor wrote the message herself were pleased and felt she had handled the situation the right way.
The takeaway is clear: using AI to express your thoughts leaves people feeling less happy and satisfied than a message written without it.
Participants also reported greater uncertainty about the relationship and less contentment with the friendship when AI was involved.
Read next: The Tech Industry's Thirst as the Water Crisis Predating ChatGPT