
From early adopters to regular users - learning what golfers need

I started Ace Trace https://acetrace.app as a niche product for disc golfers. It was fun to play the sport during the pandemic, and I always watched tournaments on YouTube. I was fascinated by the shot highlights in professionally produced videos - if someone made a really good shot, the editors would show a trace on the replay. The trace looks magical since disc golf trajectories can be very tricky and interesting. If an athlete manages to curve around one tree and then skip to the basket, avoiding sinking the disc into the water, that's a big deal, and it looks just beautiful!

I wanted the app to trace the trajectory automatically, and I spent two months trying to make it work. First, I tried existing OpenCV object-tracking algorithms, and they could not do it. Then I tried some ML-based algorithms as well. However, I didn't have a dataset to train anything, so I decided to just release an MVP where users could build the trajectory manually. They had to mark several key points, and the app connected the points into one smooth curve. The line looked pretty good, and despite the manual effort, it was super easy to create traces on the phone.
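The "connect key points into one smooth curve" step can be sketched with an interpolating spline. The app's actual interpolation method is not specified, so this is a minimal illustration assuming a Catmull-Rom spline, which passes exactly through every user-tapped point:

```python
def catmull_rom(points, samples_per_segment=16):
    """Interpolate a smooth curve through user-tapped key points.

    points: list of (x, y) tuples in screen coordinates.
    Returns a denser list of (x, y) samples along the curve.
    """
    if len(points) < 2:
        return list(points)
    # Duplicate endpoints so the curve passes through the first/last point.
    pts = [points[0]] + list(points) + [points[-1]]
    curve = []
    for i in range(1, len(pts) - 2):
        p0, p1, p2, p3 = pts[i - 1], pts[i], pts[i + 1], pts[i + 2]
        for step in range(samples_per_segment):
            t = step / samples_per_segment
            t2, t3 = t * t, t * t * t
            # Standard Catmull-Rom basis, applied per coordinate.
            x = 0.5 * ((2 * p1[0]) + (-p0[0] + p2[0]) * t
                       + (2 * p0[0] - 5 * p1[0] + 4 * p2[0] - p3[0]) * t2
                       + (-p0[0] + 3 * p1[0] - 3 * p2[0] + p3[0]) * t3)
            y = 0.5 * ((2 * p1[1]) + (-p0[1] + p2[1]) * t
                       + (2 * p0[1] - 5 * p1[1] + 4 * p2[1] - p3[1]) * t2
                       + (-p0[1] + 3 * p1[1] - 3 * p2[1] + p3[1]) * t3)
            curve.append((x, y))
    curve.append(pts[-2])  # include the final key point
    return curve
```

A handful of taps is enough: three marked points on a 200-pixel-wide shot expand into 33 curve samples that still pass through each tap, which is why the manual mode could look smooth with so little user effort.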

Manual tracking editing mode in the Ace Trace app

Here we go: the first users started to download the app! I got a bunch of feedback, and early adopters clearly liked it. There were also a lot of people who didn't want to pay for apps, and I had to learn that this is normal. The app is not for everyone; however, when the app is for you, you will pay for it. My feedback form was hot for a while, and I started fixing bugs and learning what my users wanted. The app had subscription-based monetization from the start, and people started buying it right away. After three months, I had a good number of paying customers at a subscription price of $12 per year.

It was time to build for Android. The MVP was iOS-only, and the app had to be native - I could not do all those low-level computations and video processing with React Native or another hybrid technology. Besides, the app looks and feels much better when it's native. Five months later, I released the first simple version for Android, and man, was it harder to develop for that platform. Video processing specifically is so complex - there is no high-level API for it on Android at all. So I had to roll up my sleeves and do the hard work with MediaCodec and raw buffers. The app was still pretty bad - it crashed, the graphics were worse, and it was slow. Well, at least all those users waiting for the Android version could finally start tracing their shots.

The feedback was constant, and the most common request was to make the app trace the trajectory automatically. So I went back to that problem and restarted the research, this time with a more scientific approach. I created a benchmark to compare the tracking results of different algorithms, and I collected a fair amount of relevant videos for the dataset - some of my users opted to share their videos with me to help improve the algorithm. After a few months of research, there was a winner: an ML-based computer vision tracking algorithm. Once I trained it on disc golf videos, it showed decent accuracy.
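A tracking benchmark like this usually scores each candidate algorithm against hand-annotated ground-truth boxes. My harness isn't published, so here is a hedged sketch of the standard approach: intersection-over-union (IoU) per frame, aggregated into a success rate. The `tracker` callable and `(frames, boxes)` video format are illustrative assumptions:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x, y, w, h)."""
    ax1, ay1, ax2, ay2 = a[0], a[1], a[0] + a[2], a[1] + a[3]
    bx1, by1, bx2, by2 = b[0], b[1], b[0] + b[2], b[1] + b[3]
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def benchmark(tracker, videos, iou_threshold=0.5):
    """Score a tracker over annotated videos.

    tracker(frames, init_box) -> one predicted box per frame.
    videos: list of (frames, ground_truth_boxes) pairs.
    Returns the fraction of frames where IoU >= threshold (success rate).
    """
    hits = total = 0
    for frames, gt in videos:
        preds = tracker(frames, gt[0])  # initialize on first annotated box
        for pred, truth in zip(preds, gt):
            hits += iou(pred, truth) >= iou_threshold
            total += 1
    return hits / total if total else 0.0
```

With a fixed dataset and a single number per algorithm, comparing OpenCV trackers against ML-based ones becomes an apples-to-apples decision rather than eyeballing videos.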

I published the first version of it to see what users thought. The quality was not good enough - many experienced users chose to go back to the manual editing mode. At the same time, I noticed a change in the vibe of my user base: there were fewer and fewer excited people supporting what I was building, and more and more frustrated users who expected the app to do everything for them, and for free. The project had transitioned into a mature stage. A good-quality tracking algorithm became more of a priority.

At the same time, many ball golf players started to discover the app organically and use it for golf shot tracing. The app was not ready for them since it was designed for disc golf. Still, the golf audience grew daily, and I started seeing videos on social media made with my app for golf shots. I guess I had just gotten myself early adopters who play traditional golf. These users had their own way of expressing opinions - negative reviews on the App Store and Google Play - so I had to make the app work for them as quickly as possible. Automatic tracking for disc golf became less of a priority.
Golf shot trace

I decided to hire an ML professional to develop an algorithm for tracking golf balls. It was clear that I was not qualified for the task myself - I could barely trace frisbees, and golf balls are so much smaller that they are hardly visible. I hired one very talented engineer on contract; however, he only wanted to hand me some quick, bad solutions just to get rid of me. I had to fire him and hire another one. The second person really tried to make automatic tracking for golf balls work; however, he failed miserably.

While I outsourced the ML research to professionals, I quickly came up with a simple manual interface for golfers. The research would obviously take significant time, so the manual mode was my task number one - and, as it turned out, the contractors could not deliver the automatic tracking algorithm anyway. This way, I had an immediate solution for my growing audience, and it let me raise the subscription price, which I did.

In the end, it was me again - I went online and started studying ML for computer vision from the beginning. I watched some Stanford lectures on YouTube, which I highly recommend, and then started researching contemporary ML techniques for tracking small objects. There were enormous challenges: most computer vision algorithms use convolution layers to decrease the image resolution while extracting object features. I didn't have that luxury - I needed every pixel of my original object. Another problem was that a golf ball moves rapidly between frames, making it impossible to use algorithms based on optical flow or 3D CNNs. It was hard to find a secret sauce for these difficulties, so I combined several algorithms from research papers to overcome the obstacles. Eight months later, I had engineered a neural network for tracking golf balls, trained once again on my user videos. It finally worked.
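To make the large-inter-frame-motion problem concrete: when appearance tracking and optical flow break down, one common workaround is to detect ball candidates independently in every frame and then link them into a trajectory with a simple motion model. This is not the network described above, just an illustrative sketch of the linking idea using constant-velocity prediction and nearest-neighbor gating (the `max_jump` gate is a made-up parameter):

```python
def link_detections(detections_per_frame, max_jump=80.0):
    """Link per-frame ball detections into one trajectory.

    Because the ball moves many pixels between frames, frame-to-frame
    appearance tracking fails; instead, predict the next position with a
    constant-velocity model and pick the nearest detection within a gate.

    detections_per_frame: list of lists of (x, y) candidate detections.
    Returns the chosen (x, y) per frame (None where no candidate fit).
    """
    track, velocity = [], (0.0, 0.0)
    for candidates in detections_per_frame:
        if not track:
            track.append(candidates[0] if candidates else None)
            continue
        last = track[-1]
        if last is None:
            track.append(None)
            continue
        pred = (last[0] + velocity[0], last[1] + velocity[1])
        best, best_d = None, max_jump
        for c in candidates:
            d = ((c[0] - pred[0]) ** 2 + (c[1] - pred[1]) ** 2) ** 0.5
            if d < best_d:
                best, best_d = c, d
        if best is not None:
            velocity = (best[0] - last[0], best[1] - last[1])
        track.append(best)
    return track
```

The motion model is what rejects false positives: a spurious detection far from the predicted position falls outside the gate and is ignored, even if the per-frame detector fires on it.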
Automatic tracking of a golf ball with the neural network

After all this hard work, I finally saw what my users thought about it. One thing it did for sure: it stopped the complaints about the app not having an automatic tracking feature. Now I had a new kind of complaint - automatic tracking did not always work, and people were having issues with that. Well, that's a whole new level of problems; we had moved up a level in solving the problem for golf players. After a few months of tweaking the algorithm, I could return to other essential feature requests and my background projects. The golf season started, and I saw a tremendous increase in app installations and subscription purchases compared to the previous year. I could afford to live off my app earnings. That felt very good.

Now that I have mature users who expect excellent app quality, I need to make the tracking algorithm much more reliable. My intern and I are working on combining my model with several other ML modules that should increase tracking resilience dramatically. We already used those modules when we implemented the original tracking for disc golf, and once we figure out how to deal with tiny objects, we can merge everything into one powerful tool!

Working hard to satisfy users

Is it worth working so hard to potentially satisfy a customer? The early adopters used the manual mode with no problems, and I invested a lot of time in an ML algorithm that initially looked much simpler than it turned out to be. I wonder if I would do it again, knowing how hard it was to build. How do you know when to stop trying and move on to the next thing? When is your solution good enough?

Interested in learning more about Ace Trace? Check out the website https://acetrace.app. I also write about my entrepreneurial journey on my blog: https://ivkin.dev

on September 2, 2023

    Congratulations on having the tenacity to keep trying different approaches. An alternative to manual input is always appreciated by users, even if it's not 100% accurate. You did the right thing by releasing the CV-based tracking option to customers to make the experience easier and collecting their feedback. You could consider marking this feature as "beta" simply because its accuracy is still a work in progress, if that's still the case.


      Good idea about marking the feature as beta. I did that for the initial version of the golf manual mode. After a month of fixing bugs and improving the interface, I removed the "beta" label.
      Do you think having it marked as beta helps with negative feedback from non-early-adopters?


        I think it does. It helps to set expectations between your app and its users. "Beta" signifies that the feature is not fully developed but is functional while refinements are made based on real-world usage.
