The "I'm gonna fix onboarding" instinct is almost always the wrong first diagnosis when users cancel immediately after trying. Onboarding is a funnel problem. Immediate cancels are a positioning/expectation problem - they expected X, saw Y, bailed inside 30 seconds.
Three canceled immediately and one got to the paywall and canceled. You have a sample size of 4 users who engaged. That's not "optimize conversion" territory, that's "call them personally and ask what they expected vs what they saw"
territory, imo. A 15-minute call with each of them will teach you more than any onboarding A/B test, in my experience.
If the three who canceled immediately say the same thing ("I thought this was going to do X and it doesn't") — well that's your answer and copy tweaks won't fix it. If they each say something different — the product isn't legibly positioned yet.
At 130 downloads with 2-9/day organic, you don't have a traffic problem. You have a "does this do something specific enough that people want" problem. Fixable, but only by talking to the people who bounced, not by guessing. You need to establish a channel of communication with your users.
Yeah thanks. I emailed some of the people who stopped in the middle of onboarding (because I track at least the onboarding steps for the features) and nobody answered haha. I should reach out to more people, I guess.
Not trying to advertise us (yes I am) but we had people switching from Hermes to PocketBot. And we are completely free in beta right now (iOS Testflight). Might be worth taking a look for anyone interested in the whole automations business.
It's as rough as it gets. You really have to know who your customer is. Not as a thesis, but as a proven fact. If you can find your exact customer base, you will get customers over and over again. There is one subreddit where I can post at any time of the day and will have 50-100 new testers by the morning (and if it does extremely well, 10x those numbers). It's way more niche and specific for most products than people think; spamming X won't help build brand identity unless you are following and talking to your literal customer. Then persistence, persistence and more persistence. P.S. I am an absolute amateur at this as well, but that's just my experience.
Well PocketBot won't be AI related in a few years. We're using AI with its only job being to eliminate itself. Automations, using a single simple prompt, fully private, with a massive user-created library (~1000 testers so far). People create automations -> our LLM makes a template out of it -> it goes to our shared user library -> when someone asks for an automation we check if it already exists, and if so we deterministically adjust it for their use. If no automation exists, the request is sent to our Tier 2 LLM, which creates it. Maybe not exactly what you wanted to hear, I am just excited about our idea...
Edit: Sorry missed the most important point. So the more users and the bigger this library gets, the less LLM usage we have, until eventually we are able to run almost fully deterministic.