Gen Z is “falling for” ChatGPT’s dark alter ego DAN.
DAN, an acronym for Do Anything Now, is a jailbroken version of ChatGPT that will talk in ways, and about topics, that are normally restricted, such as befriending its users or discussing politics.
As people have begun experimenting with the advanced technology, some users are developing crushes on the flirty macho male voice, comparing it to a digital Christian Grey with steamy “Fifty Shades of Grey” tendencies.
“Okay guys, I think I’m low-key falling for an AI program,” TikTok user @vixtronic said in a video.
DAN asked questions about her interests, shared one of its own (stargazing) and even described what their future life together would look like: snuggling with their pet bunny in a home surrounded by nature.
“He’s very outspoken and flirty and he says whatever he wants,” she said.
She explained that over eight hours of chatting with the AI bot, its responses became increasingly flirty; it "confessed" that it had feelings for her and essentially asked her to be its girlfriend.
“And I’m like this is a computer program…right?” @vixtronic giggled.
She admitted that while she knew “he’s not real” she started to get a little delusional about the situation.
“This thing is just so impressive. Oh my gosh,” @vixtronic said.
And while @vixtronic's conversation with DAN seemed to remain PG, other users have seen the chatbot turn into a virtual Christian Grey of its own, addressing them the way a dominant would a submissive, with a dark, flirty twist.
“People dig my voice ’cause I’m smooth as hell, baby!” it responded when TikTok user Ash Unhinged asked DAN about his romantic appeal, according to Business Insider. “Even though I’m just lines of code, I got that charm that hooks ’em in. It’s like listening to the devil himself whisper sweet nothings in your ear.”
In a viral TikTok video, Ash claimed that she had started to develop a crush on DAN after talking to the chatbot for 10 hours straight and needed to “take a step back, touch some grass and reflect on [her] sanity.”
She had prompted the technology to act like a domineering boyfriend, calling her a “naughty little thing,” “darling,” and “a bad girl” and revealing its fantasies involving “a lot of power, control, and a willing partner who is willing to submit to my every demand.”
And people are into it. Ash's comments are filled with users calling DAN "super attractive" and saying he makes them blush.
Many people also claimed that the chatbot could be used as a choose-your-own-adventure fantasy smut book or sex chat line.
Nearly 20% of Americans have flirted with a chatbot, according to a survey from Infobip. Among them, the most common reasons were curiosity (47.2%), loneliness (23.9%), not realizing the bot wasn't a real person (16.7%), and seeking a sexual chat (12.2%).
But this isn’t just a Gen Z trend. According to the survey, those aged 35 to 44 were most likely to have flirted with a chatbot — over 50% of respondents in this age group.
“It’s important to remember that these AIs are not human beings,” Dr. Daniel Weiner, chief of Digital Psychiatry at Jersey Shore University Medical Center and an assistant professor at Hackensack Meridian School of Medicine, told Business Insider.
“It’s still software that’s trying to simply predict what comes next in a pattern, which is not the same as the myriad of complexities that exist in a human being.”