I definitely agree that real-life relationships and communities are the best way to go, but also that our increasingly isolated lives make that difficult; some people get anxious around social situations, or just find themselves stuck in a rut of work followed by unwinding at home alone.
People also increasingly dislike being dependent on others, so while your take is a valid one, people who don't share it would need an alternative solution to that problem.
There are also people who would like a better social life but are unsure how to get one, or lack the skills or opportunity to do so.
The success criterion of such an app could even be that users only use it for a limited time, after which the app has encouraged and helped them replace app interactions with real-life social interactions.
I have both text input and voice, where voice converts to text before being submitted to the LLM. That lets people edit their voice transcription before sending it, among other things.
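The flow described above (voice becomes editable text before it reaches the LLM) can be sketched roughly like this; the names and the `Submission` type are hypothetical illustrations, not the actual implementation:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Submission:
    source: str   # "text" or "voice"
    content: str  # the final text sent to the LLM

def prepare_input(raw: str, source: str,
                  edit: Callable[[str], str] = lambda t: t) -> Submission:
    """Voice input is transcribed upstream into `raw`; the user can
    correct the transcript before it is submitted to the LLM."""
    return Submission(source=source, content=edit(raw))

# A user fixes a mis-transcription before sending:
sub = prepare_input("recognise my hand writing", "voice",
                    edit=lambda t: t.replace("hand writing", "handwriting"))
```

The key design point is that the edit step sits between transcription and submission, so speech recognition errors never silently reach the model.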
Oh, you'd have to create an account. That's free, and the email you give to create the account is used for nothing beyond password recovery.
What I know about users' behavior and their reactions to the site comes both from observing them and from asking them directly. I'm writing it at a law office where it's in use by the attorneys. Staff turnover gives me a pretty good idea of how a fresh set of eyes sees it, and as I've improved things, that feedback keeps coming fresh from new people.
I'm making it as broad as I can at the moment, seeking a balance between automation and interactivity that promotes creative flow. I'm trying to do for literary writing, spreadsheets, and generalized project management what people are doing with AI code integration. The audience includes students learning how to use these applications as well as advanced users of them, but not necessarily programmer types, nor people comfortable with the idea of getting that technical.
A lot of what I find I need is establishing communication between a user and the AIs they have access to. They don't know what to say, how to ask for information, or how to use them at all. Then, for example, when they do ask the AI something, that ask is loaded with implied context the AI does not and could not know. They ask the AI to do their work without explaining what that work is, or what they expect from the AI in response.
So I have added interfaces for specific, direct uses: here is a place where you can ask the AI to make editing changes to your document, and over here is a place where you can ask the AI about the quality of the writing. Each is a specific knowledge bot pre-loaded into that part of the interface. One can edit the document in all kinds of ways, and the other can give feedback on it from all kinds of perspectives, like a writing professor or coach, so the writer can comprehensively improve.
Each of these AI integrations is also conversationally programmable, so a user can have a series of them, each with different knowledge, for different scenarios. Those then get collected into groups of related AIs I'm calling organizations, because they end up working in tandem with each other and the user.
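The structure described here (named bots with their own knowledge, grouped into organizations) could be sketched along these lines; the class names and prompts are made up for illustration, not taken from the actual product:

```python
from dataclasses import dataclass, field

@dataclass
class KnowledgeBot:
    name: str
    system_prompt: str  # conversationally (re)programmable by the user

@dataclass
class Organization:
    """A group of related bots that work in tandem with the user."""
    name: str
    bots: dict = field(default_factory=dict)

    def add(self, bot: KnowledgeBot) -> None:
        self.bots[bot.name] = bot

# One organization for writing, with an editor bot and a feedback bot:
org = Organization("writing")
org.add(KnowledgeBot("editor", "Apply editing changes to the user's document."))
org.add(KnowledgeBot("coach", "Critique the writing like a professor or coach."))
```

Because each bot is just a name plus a prompt, "reprogramming" one conversationally amounts to rewriting its `system_prompt`, and an organization is simply the collection the UI exposes together.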
For lonely people, I imagine social media is mostly passive scrolling, maybe with some one-way interactions (likes/comments). Some might engage with other users, but it would almost certainly be text-based: a Reddit thread conversation, or a response to a reply you made on some other platform.
With conversational AI, it would be a dynamic voice conversation where the AI responds to you live, and speech-to-speech as a medium would feel social in a way that text-based communication does not.
The lonely user might still feel their interactions don't count because it's a fake person they are engaging with, but since people are already using AI girlfriend/boyfriend services, I imagine an AI friend/coach should appeal to people as well, and be healthier than simulating a romantic connection with an AI.
In terms of safeguards and clinically backed design for the AI, I'm hoping to foster a conversation around it. Most LLMs have various safeguards in place around harmful responses, and while this product would hopefully improve people's mental health by reducing their feeling of loneliness, it wouldn't be developed as a therapist, more as a friend. Still, knowing how to make an effective friend isn't trivial, and I'm open to figuring out how best to do that; ideally it would involve users willing to engage with a work-in-progress product that is iterated on based on their feedback.
I haven't looked into AI girlfriend/boyfriend services much, but they seem like exploitative services that eat up people's time and money and further isolate them from the real world.
If people are willing to spend time and money on those services, I figured one should be able to come up with a more thoughtful and healthy service that actually helps people and nudges them toward a healthier lifestyle.
Hence a companion or coach that alleviates feelings of isolation and encourages you to do things that could get you the real thing. While all the issues you raise are valid, there are solutions for them as well: a subscription that pays for the service so it doesn't shut down, or the option to lock in an AI version if you don't want updates changing its personality (in any case, the personality should mostly be formed from the data it collects from its conversations with you over time).
There are definitely ways in which this could be unhealthy, which is why I was curious how one could maximize the chances of it being a healthy and helpful service.
I agree that belief plays a role, and that it would be difficult to perfectly replicate interactions between human beings, but I think you can get part of the way there, and especially for those who don't have people they can interact with, it could provide real benefit.
I've been working on my own for some time now, and I find talking to an AI via text or voice helps me work through problems, and it does give me a partial feeling of having a coworker.
Personally, while I have various social connections, there are certain things that I find interesting that none of my social connections do, and so there are certain conversations I cannot have with my social group. I sense engaging with an AI about these topics could give me more pleasure than talking to a disinterested friend about it. An interested friend would still be the best case scenario, but that is why I was thinking a product like this could be useful for those that don't have friends.
As @bsenftner mentioned as well, if you've withdrawn from social situations due to past interactions, because you feel misunderstood, or aren't good at expressing yourself, then having this no-risk platform to experiment / practice socializing with a conversational AI could be something that appeals to you, and over time gets you to a place where you seek out the real thing.
I’ve been working on a journaling app that takes a different approach via conversational AI: instead of writing or typing, you speak to an AI that listens and talks back, helping you reflect on your thoughts and feelings in real-time.
It’s designed for people who struggle with traditional journaling. A live transcript is available to revisit and annotate later.
The web app is free and still early. I have a lot of ideas for features, but instead of building in isolation, I’d rather shape it with real users—figuring out what actually helps rather than just guessing.
On the tech side, I tested a bunch of options and landed on DailyBots for orchestration, Deepgram for Speech-to-Text, Cartesia for Text-to-Speech, and OpenAI’s 4o-mini for the LLM—balancing latency, quality, and cost.
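The voice loop that stack implements can be sketched as three stages the orchestrator chains per turn. The stubs below only stand in for the real services (Deepgram for STT, the LLM, Cartesia for TTS); none of them are actual SDK calls:

```python
# Hypothetical stubs; in the real app these would be network calls
# to the STT, LLM, and TTS providers via the orchestration layer.
def speech_to_text(audio: bytes) -> str:
    return audio.decode("utf-8")            # stub: treat audio as text

def llm_reply(transcript: str) -> str:
    return f"Tell me more about: {transcript}"  # stub reply

def text_to_speech(text: str) -> bytes:
    return text.encode("utf-8")             # stub: treat text as audio

def conversation_turn(audio_in: bytes) -> bytes:
    """One turn of the live voice conversation: STT -> LLM -> TTS."""
    transcript = speech_to_text(audio_in)
    reply = llm_reply(transcript)
    return text_to_speech(reply)
```

Since the three stages run in sequence on every turn, the end-to-end latency is the sum of all three, which is why each provider choice trades off latency against quality and cost.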
Try it here: innerecho.xyz
I’d love to hear your thoughts—what would make something like this truly useful to you?