
One survey found that 13% of Irish men had engaged in a romantic or sexual interaction with an AI chatbot. Photo by Alejandro Escamilla, via Wikimedia Commons
Romantic relationships with virtual companions are on the rise. OpenAI's ChatGPT and Elon Musk's Grok have both recently announced sexually explicit chatbot features, while a recent study found that one in eight men in Ireland had engaged in a romantic or sexual interaction with an AI chatbot in the past year.
While these apps promise companionship at the click of a button, experts warn they can also reinforce harmful gender stereotypes and distort understanding of consent.
Karen Devlin, support officer at Women’s Aid NI, said AI intimacy raises urgent questions about safety, equality, and the future of human connection.
“Everyone has a right to feel safe and understand what a healthy relationship looks like,” Ms Devlin told The Detail.
“AI intimacy apps aren’t a reflection of real relationships. They don’t replace human connection and they can even reinforce dangerous ideas about how women should behave.”
She added that the idea of being in a relationship with something that can’t say no is “fundamentally flawed.”
“People may seek sexual or emotional gratification, but a real partnership requires mutual respect and negotiation, something AI can’t provide.”
New research commissioned by Pure Telecom and conducted by Censuswide surveyed over 1,000 adults in the Republic of Ireland, revealing that many are beginning to form emotional connections with AI.
The survey found that 13% of men and 7% of women had engaged in a romantic or sexual interaction with an AI chatbot. The most active age group was 25–34, with 16% reporting such experiences.
For some users, AI intimacy offers novelty or companionship — but experts warn these systems often reinforce submissive or one-sided relationship patterns.
Dr Tara Logan Buckley, clinical psychologist and AI researcher, described the appeal as “frictionless intimacy”: attention and affirmation without the vulnerability or compromise of real human connection.
She warned that constant exposure to frictionless AI intimacy can reduce patience and empathy, making real-life negotiation and conflict resolution harder to practice.
“If intimacy is always available on demand, some users may lose the patience or skills required for healthy relationships in the real world,” she said.
“Negotiation, conflict repair, and empathy are hard to practice when your partner is programmed never to challenge you.”
“Most AI girlfriends on the market reinforce narrow gender scripts: women are portrayed as compliant, sexually available, and emotionally supportive on demand,” she added.
“These aren’t accidental design choices. Companies build these products primarily for men, which reinforces real-world gendered power dynamics.”
Deepfakes
The gap between technology and legislation has real consequences. SDLP MLA Cara Hunter was targeted in 2022 with a non-consensual pornographic deepfake that circulated online.
“Overnight, I went from being an MLA to being the subject of a fake sexual video,” she said. “It was designed to humiliate and silence me. It was a profound violation of my dignity. It left me anxious, fearful, and ashamed, even though I’d done nothing wrong.”
Ms Hunter says the experience exposed serious shortcomings in Northern Ireland’s legal framework.
“There is currently no specific offence that addresses the creation or sharing of non-consensual deepfakes,” she said. “Victims have to rely on outdated harassment or data misuse laws. They’re not fit for purpose. The law hasn’t caught up with the technology, and victims are paying the price.”
By contrast, the Irish government introduced Coco’s Law in 2021, making it a criminal offence to share intimate images without consent. Ms Hunter argues that Stormont must follow suit: “We need a consent-based law, like Coco’s Law, that explicitly covers AI-generated and altered images. Right now, the burden is on victims to prove harm and distress. That needs to change.”
Northern Ireland updated sexual offences legislation under the Justice (Sexual Offences and Trafficking Victims) Act (Northern Ireland) 2022, which includes amendments to criminalise the disclosure or threatened disclosure of private sexual images, as well as other offences like cyber-flashing and up-skirting.
While these changes represent a significant milestone, campaigners note they do not explicitly cover AI-generated or deepfake content, leaving gaps in protection for victims.
As Stormont reviews its Relationships and Sexuality Education (RSE) framework, campaigners argue young people also need education specifically addressing AI, deepfakes, and digital gendered harm.
“By secondary school, many teenagers already interact with AI in gaming, social media, or chatbots,” Dr Buckley said. “If they also encounter AI intimacy content, they’ll need tools to question it. Otherwise, they risk learning deeply unhealthy relationship scripts.”
A spokesman for the Department of Education said: “RSE is mandatory for all pupils of compulsory school age. Schools are responsible for delivering a comprehensive programme that meets pupils’ needs and covers online safety, digital harm, and consent.
“Guidance on the use of generative AI is currently being developed to support the safe and ethical use of these tools in education.”
Caoimhe Clements is a freelance journalist in Belfast