The latest release (1.1.0) includes a small neural net that lets you find an icon by drawing it with your finger.
You can see it in action here:
https://indieweb.social/@juliu/110061062142807019
The app is built in React Native, and I wanted the neural network to be embedded in it. I did some research and was surprised by how few good options there were. A lot of neural-network libraries haven't seen much love lately, so I ended up with only two options.
Either use TensorFlow.js or build something myself. The TensorFlow option led me down a path of yak shaving (https://en.wiktionary.org/wiki/yak_shaving), and in the end I didn't even have the extra GB of space on my Windows laptop to get it working. I also feel TensorFlow is slowly losing out to PyTorch, and I'm not excited to pin a core feature of my app on a library that might not be maintained in a year or two.
So I decided to use micrograd (https://www.youtube.com/watch?v=VMj-3S1tku0). It's a tiny neural network library created by Karpathy to explain how neural networks work.
Upside: I fully understand what is going on and it's relatively simple.
Downside: performance is bad and it can only do fully connected layers. And it's in Python.
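To give an idea of what "fully connected layers" amounts to here: a training step against micrograd's stock API looks roughly like this. The input size, hidden-layer size and squared-error loss are simplifications for illustration, not necessarily what the app uses.

```python
# Minimal sketch of training an icon classifier with micrograd.
# Assumptions: drawings are flattened to a fixed-length vector of 64 numbers
# and there are ~200 icon classes, as mentioned later in the post.
from micrograd.nn import MLP

N_INPUTS = 64
N_CLASSES = 200

model = MLP(N_INPUTS, [64, N_CLASSES])   # one hidden layer, fully connected

def train_step(x, label, lr=0.01):
    """One gradient step on a single (drawing, icon-index) example."""
    scores = model(x)                    # list of N_CLASSES Value objects
    # Squared-error loss against a one-hot target (kept simple on purpose).
    loss = sum((s - (1.0 if i == label else 0.0)) ** 2
               for i, s in enumerate(scores))
    model.zero_grad()
    loss.backward()
    for p in model.parameters():
        p.data -= lr * p.grad
    return loss.data
```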
The other main part was training data. I made a 'training mode' for my app, where I created 100 simple drawings of each of the ~200 icons.
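Each drawing has to become a fixed-length input vector somehow. One simple way to do that, shown here purely to illustrate the idea (the app's actual encoding may differ), is to rasterize the stroke points onto a small grid and flatten it:

```python
# Hypothetical encoding: rasterize normalized finger-stroke points to an 8x8 grid.
def rasterize(points, grid=8):
    """points: list of (x, y) in [0, 1] normalized screen coordinates."""
    cells = [0.0] * (grid * grid)
    for x, y in points:
        col = min(int(x * grid), grid - 1)
        row = min(int(y * grid), grid - 1)
        cells[row * grid + col] = 1.0
    return cells

x = rasterize([(0.1, 0.1), (0.5, 0.5), (0.9, 0.9)])  # a rough diagonal stroke
```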
The search results show the top 6 icons in view, so the neural net doesn't need to be perfect. In the end it works quite decently.
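Showing the top 6 boils down to ranking the raw scores and keeping the best few, along these lines (simplified):

```python
# Pick the k highest-scoring icon indices from the model's output Values.
def top_icons(scores, k=6):
    ranked = sorted(range(len(scores)), key=lambda i: scores[i].data, reverse=True)
    return ranked[:k]
```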
For now I want to focus on other features of the TinyUX app again, but I'll return to this topic later and evaluate whether I want to build on top of this solution or come up with something else.