Recent findings from The Ohio State University indicate that people feel less satisfied with their friendships when they learn that a friend used AI, or another person, to help write a message to them. The research highlights the importance of personal effort in sustaining friendships and suggests that turning to AI or human assistance is perceived as taking a shortcut.
According to the study, when people know that AI played a role in crafting a message they receive, they view their friend as less sincere. That perception extends beyond the words of the message itself, as emphasized by Bingjie Liu, the lead researcher and an Assistant Professor of Communication at The Ohio State University.
After receiving an AI-assisted message, participants reported lower satisfaction with the friendship and greater uncertainty about where they stood in the relationship, Liu explained.
It is crucial to note, however, that it is not merely the technological component that leads to these negative outcomes. The study also reported unfavorable consequences when participants discovered that their friend had enlisted another person’s help in composing the message.
The central sentiment among study participants was a preference for friends to write their own messages, without help from AI or from other people, Liu pointed out. The research was recently published online in the Journal of Social and Personal Relationships.
As AI conversational systems like ChatGPT become progressively ubiquitous, ethical and practical considerations surrounding their use are likely to grow increasingly intricate, Liu stated.
The research involved 208 adults recruited online, each presented with one of three life scenarios. Participants were then asked to compose a brief message to a fictional friend named Taylor. They received one of three versions of Taylor's reply, differing in the level of AI or human assistance used to edit the message.
Participants' perceptions of these edited messages revealed consistent patterns. Those who received AI-assisted replies rated them as less appropriate and were less satisfied with their relationship with Taylor. They also felt more uncertain about statements such as "Taylor considers me a close friend."
The study suggests that the aversion to AI-assisted responses may stem from a societal belief that technological means are inferior to human effort in generating personal communications. However, the results indicated that aversion was equally strong when the assistance came from another human being.
Essentially, the study concluded that the perceived level of effort Taylor invested in the relationship—by opting for external help in message composition—directly impacted the satisfaction and certainty participants felt about their friendship with Taylor.
Liu stresses that genuine effort is crucial in relationships. The rise of services like ChatGPT may prompt individuals to perform a mental Turing Test, wherein they assess whether a message could have AI components, potentially jeopardizing relationships. Liu’s advice is unequivocal: sincerity and authenticity should be paramount in interpersonal relationships.
The study, titled “Artificial Intelligence and Perceived Effort in Relationship Maintenance: Effects on Relationship Satisfaction and Uncertainty,” was conducted in collaboration with Jin Kang of Carleton University, Canada, and Lewen Wei of the University of New South Wales, Australia. The paper has been officially cited with DOI: 10.1177/02654075231189899.
Frequently Asked Questions (FAQs) about relationship satisfaction
What is the main focus of the study conducted by The Ohio State University?
The primary focus of the study is to investigate how the use of AI tools like ChatGPT in crafting personal messages affects the quality of interpersonal relationships.
Who conducted the study and where was it published?
The study was led by Bingjie Liu, an Assistant Professor of Communication at The Ohio State University. It was published online in the Journal of Social and Personal Relationships and conducted in collaboration with researchers from Carleton University, Canada, and the University of New South Wales, Australia.
What were the primary findings of the study?
The key findings indicate that individuals feel less content and more uncertain about their relationships when they discover that a friend has used AI or another person to help craft a message to them.
Was the study limited to AI technology?
No, the study also explored the effects of messages that were crafted with the assistance of another human being and found that they too negatively impacted relationship satisfaction and certainty.
What do people think about friends who use AI for messaging?
According to the study, people perceive friends who utilize AI for crafting messages as insincere and less invested in the relationship, which diminishes relationship satisfaction.
How did the study measure relationship satisfaction?
Participants were asked to rate the perceived appropriateness of the message they received and how satisfied they felt about their relationship after receiving an AI-assisted or human-assisted message.
What advice does Bingjie Liu offer based on the study?
Bingjie Liu advises that sincerity and authenticity should be paramount in maintaining interpersonal relationships, recommending against the use of AI or external human assistance in personal messaging.
Could this study have broader implications?
Yes, as AI conversational systems like ChatGPT become increasingly popular, ethical and practical considerations surrounding their use are likely to become more complex.
Is there a concern people might start conducting a ‘Turing Test’ mentally?
Yes, the study suggests that as AI communication tools gain popularity, individuals may mentally assess whether a message they receive from a friend has been AI-assisted, potentially straining the relationship.
What is the cited DOI of the study?
The study has been officially cited with DOI: 10.1177/02654075231189899.
More about relationship satisfaction
- The Ohio State University
- Journal of Social and Personal Relationships
- Artificial Intelligence and Perceived Effort in Relationship Maintenance: Effects on Relationship Satisfaction and Uncertainty
- Carleton University
- University of New South Wales
- Turing Test Explanation
- ChatGPT by OpenAI
7 comments
so effort is the key huh. Makes sense, relationships are about investing time and being genuine. Why would you outsource that?
Wow, this is eye-opening. Who would’ve thought that using AI could actually hurt friendships? Gotta think twice before using these tools, I guess.
Is it just me or does this feel a bit like we’re overthinking? I get the point but still, it’s just a message. If the friendship is strong, should it really matter?
kinda makes sense, we all want our friends to be real with us. If you’re using AI to craft a msg, what else you’re not sincere about?
Didn’t see that coming. But i think its more about trust than technology. if you trust someone, does it matter how they send a message?
Interesting study but I wonder how it’ll change as AI gets better. I mean, what if it gets so good, you can’t tell the difference?
This is a real issue. With how busy we are, it’s tempting to use AI. But looks like its not worth risking the relationship.