
@ App publishers: How do you evaluate your ASO efforts?

Hi all,

For all the indie app publishers here: How do you evaluate your ASO efforts? For example, how do you analyze the results of changing your app's keywords?

Are you using App Store Connect or another analytics vendor (e.g. Appfigures) for this? In what way are you taking into account filters (e.g. App Store Search, territory, etc.)?

Do you keep track of your ASO-related changes (per app version) and if yes, using what tool?

Niels

  1. 1

    ASO for indie hackers is very difficult to measure. I used TheTool for a couple of months to track my app's rankings & ASO, but $40/month is an absurd fee for what they provide. I have also tried App Radar's free account - it doesn't help either.

    Google Play Console has some data about search terms and conversions, but most of the terms are grouped under "others", so it's useless as well.

    Unless you have 25K-50K daily visitors / searches, none of the tools provide any actionable data.

    1. 1

      Thanks for your reply.

      Why aren't you evaluating using the data provided by App Store Connect / Google Play Console? It doesn't include ranking-related information, but provides information such as impressions, product page views and app units (downloads). In the end, increasing those is what counts, right?

      Hence, you can experiment with both visible (screenshots, app icon, etc.) and invisible (e.g. keywords) metadata and see how those changes affect the aforementioned metrics.

      I'm bringing up this topic because I feel current ASO services make it too hard to run experiments and validate their results. Besides that, I feel they are too expensive. I'm looking at introducing something simpler and less expensive, likely targeted towards indie app developers.

      ASO improved my app's MRR drastically, and I believe it could do the same for other developers' apps. I have a feeling they simply lack the knowledge and the right tooling for this. I would like to validate this before diving into another project.

      1. 2

        but provides information such as impressions, product page views and app units (downloads). In the end, increasing those is what counts, right?

        This data is not the best way to know whether your changes have an impact.

        When you run experiments, Google Play shows daily installs and retention for the base and the variant, but that does not tell us whether search rank has changed. Similarly, when you run experiments on the title / description, we don't know for sure whether the keywords in the variant are being considered in ranking. There's no data that says so.

        In the end, increasing those is what counts, right?

        Sure, but it's difficult to measure whether your change has an impact on it or whether there are external factors, e.g. an article on AndroidPolice / AndroidAuthority can lift downloads for a couple of weeks.

        I have a feeling they simply lack the knowledge and the right tooling for this. I would like to validate this before diving into another project.

        This is true in my case. All the ASO tools have the same generic articles about how to improve ASO - add long-tail keywords, track your keywords, etc. - but the change has never been significant enough for me to know whether what I did worked or not.

        1. 1

          Sure, but it's difficult to measure whether your change has an impact on it or whether there are external factors.

          In App Store Connect you can apply an "App Store Search" filter that ensures only impressions, product page views and app units resulting from a user search are shown. Other sources, such as (web) referrals, are thereby excluded. This way, seeing an increase in impressions after performing ASO very likely confirms your change(s) had a positive impact.
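
          As a rough illustration, that before/after comparison could be scripted against a daily export of the search-filtered metrics. This is only a sketch: the CSV layout (`date` and `impressions` columns), the file name and the function names are my own assumptions, not an actual App Store Connect export format.

```python
import csv
from datetime import date, timedelta

def mean_impressions(rows, start, end):
    """Mean daily impressions for rows with start <= date < end."""
    values = [int(r["impressions"]) for r in rows
              if start <= date.fromisoformat(r["date"]) < end]
    return sum(values) / len(values) if values else 0.0

def search_uplift(csv_path, change_date, window_days=14):
    """Compare mean daily search impressions in equal-sized windows
    just before and just after an ASO change shipped on change_date.

    Assumes a hypothetical CSV export with 'date' (ISO format)
    and 'impressions' columns, already filtered to search traffic.
    """
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    before = mean_impressions(
        rows, change_date - timedelta(days=window_days), change_date)
    after = mean_impressions(
        rows, change_date, change_date + timedelta(days=window_days))
    return before, after
```

          Comparing equally sized windows on either side of the change date at least controls for the length of the observation period, though not for external factors such as press coverage.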

          Is there something similar on Google Play Console? And would this change your opinion about measuring impressions vs. ranking?

          Add long-tail keywords, track your keywords, etc. - but the change has never been significant enough for me to know whether what I did worked or not.

          Interesting. For my app it meant a lot. Any idea why it didn't for you?
