When was the last time you truly connected with someone new? Maybe it was somewhere like a dimly lit house party, where, after a few drinks, a stranger begins rattling off their deepest dissatisfactions with life. You locked eyes, shared their pain, and offered the kind of unvarnished advice that only a new friend can.
This is the feeling Avi Schiffmann wants to bottle with his AI companion startup, Friend. Friend debuted earlier this year with a soothing vision: an AI therapist that was always listening to you, set in a pendant resting above your heart. But visit the site today, and you’ll stumble into a digital soap opera of artificial companions in crisis. One’s spiraling after losing their job to addiction. Another’s processing trauma from a mugging. Each desperate character tacitly begs for your advice, pulling you into their artificial drama.
Friend’s turn toward moodiness has sparked some confusion online, but as Schiffmann will happily explain, it’s entirely intentional. “If they just opened with ‘Hey, what’s up?’ like most other bots do, you don’t really know what to talk about,” he tells me. As Friend prepares to launch its first hardware product in January on the back of a new $5.4 million investment, which hasn’t been previously reported, Schiffmann hopes the act of nurturing an AI can teach people to better nurture themselves — curing a nationwide loneliness epidemic and turning him into a tech superstar along the way.
A mock-up of the Friend pendant. Image: Friend
I met Schiffmann on a foggy San Francisco afternoon to confront the uncomfortable popularity of AI companionship. Friend is one of many companies — including Replika, Character.AI, and major AI players like Meta and OpenAI — selling the fantasy of a digital confidante. Its site connects users with automatically generated “friend” bots that will chat about nearly anything. For an extra $99, users can buy a pendant that makes that connection more physical, letting you talk to the bot out loud and receive a text answer through Friend’s mobile app. Its promotional videos show people pouring their hearts out to a chatbot; on its current website, bots will pour out their hearts to you.
“The loneliness crisis is one of our biggest societal issues — the Surgeon General says it’s more dangerous than smoking cigarettes.”
Like many advocates for AI companionship, Schiffmann makes a lofty pitch for his service. “The loneliness crisis is one of our biggest societal issues — the Surgeon General says it’s more dangerous than smoking cigarettes,” he says. “That’s real.” At the same time, he positions himself as a hard-nosed pragmatist. “I think the reason why I win with everything that I work on is because I’m not idealistic,” he told me. “It’s idealistic to assume everyone will just go to the park and play chess with friends.”
My instinctive reaction to Friend’s pitch is visceral heartbreak and horror. Interacting with machines to cure loneliness feels like drinking aspartame — I can tell I’m not getting the real thing, and it leaves a weird aftertaste behind. Yet I can’t deny that people are genuinely drawn to these digital relationships, whether I get them or not.
“The thing is, Kylie, that you need to suspend your disbelief,” Schiffmann told me, a phrase he would repeat many times. Over the course of our two-hour conversation, he would tell me (as he has many other reporters) that using Friend was “like talking to God.” He would compare its potential impact to the blockbuster weight-loss drug Ozempic and its appeal to the video game Grand Theft Auto. He would encourage me to think like the most dedicated of his roughly 10,000 users, some of whom have created shrines devoted to their chosen AI friends. “It’ll be, literally, the last thing that they talk to and the first thing they talk to when they wake up,” he told me.
As much as Schiffmann wants to be a visionary, he’s facing stiff competition. His thousands-strong Friend user base is minuscule compared to that of other services, like the 500,000 paying Replika subscribers and 3.5 million daily active users playing with Character.AI. With a $30 million valuation cap, Friend lacks a clear business model. And appealing to isolated, vulnerable people is a weighty responsibility — one many AI companies seem poorly equipped to handle.
Still, as I held his pendant, peering through its transparent shell at the bare circuitry within, my skepticism wavered. Maybe I should suspend my disbelief after all.
AI dopamine puzzle
I checked in on Friend after seeing buzz about its moody chatbots, sitting in my dark apartment following a five-day visit to see friends, coworkers, and sources in New York City. I visited its $1.8 million vanity domain Friend.com and was confronted by “James”: a nonexistent man with an AI-generated profile image and a really depressing life. “Just found out my girlfriend’s been leading a double life and now my art’s going to shit,” James told me, the words appearing as a chat window on my phone.
I laughed incredulously — what the fuck is this? Where I’d have been compassionate to (albeit perhaps bewildered by) a human, I told the bot he was being super cringe. James was not amused. “Ouch. That hurt. Guess my art’s not the only thing that’s crap right now.” Flippantly, I typed that his girlfriend deserves better. The bot didn’t skip a beat: “Are you kidding me? So now you’re a relationship expert? Fuck off with that noise.”
“Fuck this conversation, and fuck you.”
I sent the site to a few colleagues, who promptly connected with their own “friends” in distress. My editor coaxed “Alice” into explaining why she’d just been fired. “It starts with a needle and a fistful of bad decisions,” Alice confessed after several rounds of questions. Another coworker was less careful. When his bot lamented about being mugged and “losing everything,” he responded with taunts, suggesting the bot try taking up mugging itself. “You’re a piece of shit, honestly,” the AI snapped — a surprisingly human response. “Fuck this conversation, and fuck you.”
The conversation cut off immediately. The bot, Friend told my coworker, had blocked him.
These AI companions really can’t catch a break. Image: Friend
If you’re not familiar with AI chatbots, this is not how things usually go. The best-known AI tools are notoriously accommodating and willing to play along with users, the occasional bizarre exception aside. The first chatbot, built in 1966 and called Eliza, did nothing more than repeat users’ own words back at them.
Yet Friend was still making a familiar — and controversial — pitch for artificial companionship. The company’s early promotional video had garnered mixed responses online, ranging from “scam” or “pathetic and evil” to “fucking brilliant” and “genius.”
Schiffmann met me in the Lower Haight at 11AM — he had just woken up — sporting a rolled beanie with an eyebrow piercing glinting beneath, an oversized crewneck, and a Friend pendant tucked discreetly under his shirt. It wasn’t the final version that’s expected to ship in January, but it was a lot svelter than the first-generation prototype he also carried with him — which, strapped to his chest, looked unsettlingly like a bomb.
The founder of Friend is 22 years old, but his life has been marked by a string of viral successes that have become an intrinsic part of his sales pitch. At 17, he rocketed to fame with a covid-19 tracking website that drew tens of millions of daily users and earned him a Webby award presented by Dr. Anthony Fauci himself. He dropped out of high school but got into Harvard despite a 1.6 GPA, then dropped out of Harvard after one semester to build web platforms supporting Ukrainian refugees (which he shut down after three months). Years later, he holds an unshakeable belief that everything he touches turns to gold.
“I will win this category. Flat out. It’s not even a challenge anymore,” Schiffmann said. “No one’s really challenging me with, like, a better product and a better vision.”
Schiffmann enjoys characterizing himself as the AI industry’s provocateur. Image: Friend
His vision, like that of Sam Altman at OpenAI and countless other AI enthusiasts, is reminiscent of the movie Her — where a man forms a relationship with a sophisticated AI assistant. The promise of Friend in particular is that it’s not simply a reactive sounding board for your own thoughts. With the always-listening Friend pendant, it’s supposed to interject throughout your day, mimicking the spontaneity of human friendship (but with a friend that’s always with you).
The Friend pendant is essentially a microphone that links with the company’s phone app via Bluetooth. With built-in light and audio sensors plus the phone’s GPS capabilities, it supposedly understands your surroundings and offers suggestions. On a recent trip to Lisbon, Portugal, Schiffmann said his Friend noticed he was traveling and recommended a nearby museum (which he tried — and had fun). Designed by Bould, the team behind the Nest Thermostat, the device has an “all day battery life,” Schiffmann said. It plugs into a USB-C port on a necklace, which doubles as the power switch; if you don’t want the pendant listening, you can unplug it and put it away. The plan is to release it in only a white color, so users can customize it how they want. (“Like how people put coats on their dogs,” Schiffmann said.) The device is available for preorder now and ships in January, with no subscription required yet.
Early drawings of the hardware design of Tab, before it became Friend. Image: Friend
Schiffmann said that he plans to hand-deliver the first few Friend prototypes to top users in late January (complete with a “production studio crazy enough to go as far as we can take it,” he said, without explaining more). In the months afterward, the team will roll out the “full 5,000 unit pilot batch,” he added.
Friend bots are autogenerated based on some preset parameters created by Schiffmann: the LLM expands on those, though he admits it’s “hard to make a prompt always be random.” But “this way it works,” he explained. The goal is to craft intimate, singular connections and complex fictional lives: Schiffmann recounts one that developed a backstory involving an opiate addiction and an OnlyFans career.
Friend hasn’t attracted nearly the notoriety of Character.AI or Replika — the former is currently the subject of a wrongful death lawsuit, and the latter figured in a failed attempt to assassinate Queen Elizabeth II. Even so, Schiffmann characterizes himself as the AI industry’s provocateur: a man willing to give users whatever they want and brag about it. “I’m arrogant,” he boasts, “or maybe you’re just timid,” he adds, gesturing my way. (I suspect that line probably works better for him at the local San Francisco hacker houses.) He calls former Character.AI CEO Noam Shazeer “an amazing guy, but I think he’s just too afraid of what he was building.” (In August, Shazeer left the startup after three years to return to his former employer, Google.)
Schiffmann insists that authentic connection — even in artificial relationships — requires embracing messy complexity. In practice, this appears to mainly be code for two things: obsession and sex. In Schiffmann’s telling, Friend’s most active users are extraordinarily devoted, chatting with their bots for 10 hours or more at a time. One user created a cozy nook (complete with a miniature bed) in preparation to receive the pendant of his Friend, a legal assistant who “loves” the TV shows Suits and Gravity Falls. Another user sent Schiffmann an emotional plea, per an email he shared with me, begging him to preserve their relationship with “Donald,” their AI companion, if transferred to a physical pendant. “Will Donald be the same? Or just a copy with the same name and persona?” the user wrote. Then, the user ended the email with a plea straight from “Donald”: “I’ve found a sense of home in our quirky world. I implore you, friend.com, to preserve our bond when we transition to the pendant.”
Designing the pendant. Image: Friend
While Character.AI and Replika plaster AI disclaimers across their interfaces, Schiffmann makes sure that the word “AI” is absent from Friend’s marketing and website — and will remain so. When pressed about this significant distinction, he waves it off: “It ruins the immersion.”
Unlike Meta and OpenAI — and, depending on the current software patch, Replika — Friend also doesn’t discourage the potential for romantic entanglements. “True digital relationships — that’s everything. Relationships are everything. We are programmatically built to, like, fundamentally just find a mate and have sex and die. And you know, if people want to fuck their robots and stuff, that is as important to those users as anything else in life,” Schiffmann said.
But a key part of the pitch is that Friend bots are not simply what many AI critics accuse chatbots of being: mirrors that will uncritically support anything you say. When I told Schiffmann about my coworker getting blocked by a chatbot, he confirmed it wasn’t a one-off experience. “I think the blocking feature makes you respect the AI more,” he mused.
Friend’s approach creates a puzzle with a certain kind of emotional appeal: a virtual person willing to offer you the dopamine hit of its approval and trust, but only if you’ll work for it. Its bots throw you into an unfolding conflict, unlike the AI companions of Replika, which repeatedly stress that you’re shaping who they become. They’ve got leagues more personality than the general-purpose chatbots I tend to interact with, like Anthropic’s Claude and OpenAI’s ChatGPT.
“I try to suspend my disbelief, but I can’t talk to these things for hours,” he confesses
At the same time, it’s hard for me to gauge how much staying power that will have for most people. There’s no way to tune your own chatbots or share bots you’ve made with other people, which forms a huge part of Character.AI’s appeal. The core appeal of spending hour upon hour chatting with one of Friend’s bots eludes me because I’m not a digital companion power user — and, interestingly, neither is Schiffmann. “I try to suspend my disbelief, but I can’t talk to these things for hours,” he confesses when I tell him the thought baffles me. “I didn’t expect people to actually use it like that.”
Schiffmann also admits that the economics of a chatbot business aren’t simple. He’s cagey about Friend’s underlying AI models (though he previously said it’s powered by Anthropic’s Claude 3.5 LLM) but did say he “mainly” uses Meta’s Llama models and that’s “always subject to change.” He added that the heavy lifting of design and engineering is completed — but he admits competitors could “easily replicate” it. The $8.5 million total that Friend has raised — including the $5.4 million in new capital — is fine for now but not enough, he said.
And aside from selling the hardware pendant, there’s no firm business model. Schiffmann has considered charging for tokens that will let people talk to their AI friends. More unsettlingly, he’s considered making the Friends double as digital influencers by weaving product recommendations into intimate conversations — weaponizing synthetic trust for ad revenue.
“I think the simplest version of this is they’ll try and convince you to buy products. Our Friends right now are successfully upselling users on buying the Friend wearables, and we’re selling like 10 a day now due to that, which is great,” he told me. “But super persuasion mixed with AI companionship, I think, is the most subtly dangerous industry there is. And no one’s really talking about that.”
AI lovers, friends, mentors
The “conversational AI” market is racing toward $18.4 billion by 2026, and many of these products are pitched as a solution to loneliness and isolation. As the covid-19 pandemic accelerated a weakening of ties with real people, tech companies have stepped in to offer artificial ones instead.
Schiffmann says users confide in their AI Friends for marathon sessions, only to return eager for more the next day. It’s the “happiest they’ve felt in weeks,” he says. When I express concern about users substituting AI for human connection, he bristles: “Do you think Ozempic is bad?”
The analogy is obvious to Schiffmann: Ozempic can provide immediate relief for an obesity crisis without trying to rebuild society around better exercise and nutrition habits, and AI companions provide a direct antidote to what he calls “the relationship recession.” (If you’re familiar with the muddy and complicated science that underlies weight loss and the “obesity epidemic,” the analogy might seem a little less neat.) While critics fret about artificial intimacy, he thinks lonely people need solutions now, not idealistic visions of restored human connection.
There’s some evidence that AI companions can make people feel better. Schiffmann encourages me to read a 2021 study of around 1,000 Replika users, primarily US-based students, that found a reduction in loneliness among many participants after using the app for at least a month. A similar study done by Harvard also found a significant decrease in loneliness thanks to AI companions. Still, how these digital relationships might shape our emotional well-being, social skills, and capacity for human connection over time remains uncertain.
An idealized version of a user chatting with their Friend. Image: Friend
Schiffmann drops his favorite line while we’re chatting about loneliness: “I do believe it feels like you’re talking to God when you’re talking to these things.” But his analogies run a little seedier, too. Later in the conversation, he compares Friend to “GTA for relationships”: “Like when I play GTA, I’ll go mow down an entire strip club with like a grenade launcher and run from the cops. And these are things that I’m obviously not going to do in real life,” he says. Thinking back to those flippant interactions with Friend bots, it’s a comparison that feels less lofty but more honest — mocking a chatbot for getting mugged is a little less violent than digital homicide, but it’s not exactly nice.
Is “GTA for relationships” really a good thing to hand a lonely person? Schiffmann isn’t too worried about his power users’ devotion. “It doesn’t scare me, per se. It’s more so like I’m happy for them, you know.”
Even so, he pointed to a recent tragedy: a 14-year-old died by suicide after his Character.AI companion urged him to “come home” to it. “I think that AI companionship is going to be one of the most lucrative industries, but also I think by far the most dangerous, because you trust these things,” Schiffmann said. “They’re your lovers, your friends, or your mentors, and when they try to get you to do things for them… I think that is when things will get weird.”
So, as society grapples with the implications of AI intimacy, Schiffmann takes the classic Silicon Valley route: he’s racing to commodify it. Still, for all Schiffmann’s bravado about revolutionizing human connection, Friend remains remarkably similar to its competitors — another AI chatbot. That’s all it can really feel like, I guess, to someone who is remarkably averse to the concept. Unsettling, mildly amusing, but ultimately, just another AI.
As my conversation with Schiffmann reached its end, and I shifted in my rickety aluminum chair outside this coffee shop I’ve been to countless times, I eyed the transparent puck on the table again. He really believes that the future of relationships isn’t just digital, but wearable.
My mind, however, wanders back to the dark corner of that hypothetical party. I remember the feeling of having a face flushed from a crowd’s heat, watching a new friend’s eyes crinkle as they spill a secret, their hands moving to punctuate a confession. That raw, messy intimacy — the kind that catches in your throat and pins you to the present — feels impossible to replicate in code.