It seems as though we’ve arrived at the moment in the AI hype cycle where no idea is too bonkers to launch. This week’s eyebrow-raising AI project is a new twist on the romantic chatbot—a mobile app called AngryGF, which offers its users the uniquely unpleasant experience of getting yelled at via messages from a fake person. Or, as cofounder Emilia Aviles explained in her original pitch: “It simulates scenarios where female partners are angry, prompting users to comfort their angry AI partners” through a “gamified approach.” The idea is to teach communication skills by simulating arguments that the user can either win or lose depending on whether they can appease their fuming girlfriend.
The central appeal of relationship-simulating chatbots, I’ve always assumed, is that they’re easier to interact with than real-life humans. They have no needs or desires of their own. There’s no chance they’ll reject you or mock you. They exist as a sort of emotional security blanket. So the premise of AngryGF amused me. You get some of the downsides of a real-life girlfriend—she’s furious!!—but none of the upsides. Who would voluntarily use this?
Obviously, I downloaded AngryGF immediately. (It’s available, for those who dare, on both the Apple App Store and Google Play.) The app offers a variety of situations where a girlfriend might ostensibly be mad and need “comfort.” They include “You put your savings into the stock market and lose 50 percent of it. Your girlfriend finds out and gets angry” and “During a conversation with your girlfriend, you unconsciously praise a female friend by mentioning that she is beautiful and talented. Your girlfriend becomes jealous and angry.”
The app sets an initial “forgiveness level” anywhere between 0 and 100 percent. You have 10 tries to say soothing things that tilt the forgiveness meter back to 100. I chose the beguilingly vague scenario called “Angry for no reason,” in which the girlfriend is, uh, angry for no reason. The forgiveness meter was initially set to a measly 30 percent, indicating I had a hard road ahead of me.
Reader: I failed. Despite my best efforts to placate my livid faux girlfriend, she kept misreading my intent and accusing me of neglect. A simple “How are you doing today?” drew an almost instant, curt “Oh, now you care about how I’m doing?” My apologies only seemed to upset her further. When I proposed a dinner date, she said it wasn’t enough and demanded we go “somewhere nice.”
The exchange grew so exasperating that I snapped at my digital nag, telling her she was exhausting. Unsurprisingly, she responded, “Great to know that my feelings are such a bother to you.” When I tried to restart the conversation hours later, the app politely informed me that I’d have to upgrade to the paid version to unlock more scenarios. The cost? $6.99 a week. I declined.
At this point, I began to wonder what the app was actually for. Was it some elaborate stunt? What woman would actually want her partner to download this? The idea that my own partner might consider me so volatile that he’d need to practice his mollifying techniques on a digital shrew of an AI is outrageous. AngryGF may hold more appeal than the AI girlfriend apps meant to replace genuine human relationships, but an app that tries to help men communicate with women by fashioning an irritable bot could well be worse.
I reached out to Aviles, the app’s cofounder, to try to make sense of AngryGF. A social media marketer based in Chicago, she said the app grew out of her own past relationships, in which her partners’ communication skills had left her unimpressed. Her dedication to the product was apparent. “You know men,” she pitched. “They listen, but then they don’t follow through.”
Aviles describes herself as the app’s cofounder but isn’t particularly well-versed in the nuts and bolts of its creation. (She says a team of “between 10 and 20” people work on the app but that she is the only founder willing to put her name on the product.) She was able to specify that the app is built on top of OpenAI’s GPT-4 and wasn’t made with any additional custom training data like actual text messages between significant others.
“We didn’t really directly consult with a relationship therapist or anything like that,” she says. No kidding.