Speaking with Fortune, Braidwood described the various factors that influenced his decision to shut down the app, including the technical approaches the startup pursued to ensure the product was safe, and why he felt they fell short.
Yara AI was very much an early-stage startup, largely bootstrapped with less than $1 million in funding and "low thousands" of users. The company hadn't yet made a significant dent in the landscape, with many of its potential users relying on popular general-purpose chatbots like ChatGPT. Braidwood admits there were also business headwinds, which were in many ways compounded by the safety concerns and AI unknowns. For example, despite the company running out of money in July, he was reluctant to pitch an interested VC fund because, he said, he couldn't in good conscience make the pitch while harboring these concerns.
“I think there’s an industrial problem and an existential problem here,” he told Fortune. “Do we feel that using models that are trained on all the slop of the internet, but then post-trained to behave a certain way, is the right structure for something that ultimately could co-opt in either us becoming our best selves or our worst selves? That’s a big problem, and it was just too big for a small startup to tackle on its own.”
The most profound finding from the team's year running Yara AI, according to Braidwood, is that there's a crucial distinction between wellness and clinical care that isn't well-defined. There's a big difference between someone looking for support around everyday stress and someone working through trauma or more significant mental health struggles. What's more, not everyone who is struggling on a deeper level is fully aware of their mental state, and anyone can be thrust into a more fragile emotional place at any time. There is no clear line, and that's exactly where these situations become especially tricky — and risky.
“We had to sort of write our own definition, inspired in part by Illinois’ new law. And if someone is in crisis, if they’re in a position where their faculties are not what you would consider to be normal, reasonable faculties, then you have to stop. But you don’t have to just stop; you have to really try to push them in the direction of health,” Braidwood said.
“I’m playing a long game here,” he said. “Our mission was to make the ability to flourish as a human an accessible concept that anyone could afford, and that’s one of my missions in life. That doesn’t stop with one entity.”