Over the last few weeks I've been building a small SaaS called Skillentis.
The original idea was simple: analyze the evolution of a developer’s GitHub repositories and generate a structured “technical maturity index”.
But while working on it, I realized something uncomfortable.
The hardest part isn’t building the scoring model.
The hardest part is defining what actually signals engineering maturity in a repository.
Most quick GitHub reviews still rely on surface metrics:
• stars
• commit counts
• contribution graphs
But those signals are easy to game without reflecting real engineering quality.
Someone can push commits every day and still have very little structural depth in their projects.
The signals that seem more meaningful are things like:
• how repositories evolve over time
• architectural structure
• documentation quality
• consistency across projects
• evidence of real problem-solving
But those signals are slower and harder to evaluate.
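Even so, a rough proxy for one of them can be sketched. As a minimal illustration (the metric and function name are my own invention for this post, not Skillentis' actual model), here is one way to score "how a repository evolves over time" from nothing but commit dates:

```python
from datetime import date

def activity_consistency(commit_dates: list[date]) -> float:
    """Fraction of months between the first and last commit with any activity.

    A crude proxy for repository evolution: commits spread across the
    project's lifespan score higher than a single burst of activity.
    (Hypothetical metric, purely illustrative.)
    """
    if not commit_dates:
        return 0.0
    active_months = {(d.year, d.month) for d in commit_dates}
    first, last = min(commit_dates), max(commit_dates)
    lifespan_months = (last.year - first.year) * 12 + (last.month - first.month) + 1
    return len(active_months) / lifespan_months

# A one-month burst plus a single late commit, vs. steady monthly activity:
bursty = [date(2024, 1, d) for d in range(1, 20)] + [date(2024, 12, 1)]
steady = [date(2024, m, 1) for m in range(1, 13)]
print(activity_consistency(bursty))  # ≈ 0.17 (2 active months over a 12-month span)
print(activity_consistency(steady))  # 1.0
```

The point isn't this particular formula; it's that the "slow" signals still reduce to computable features, just ones that need history rather than a snapshot.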
Which creates a tension in technical hiring:
Recruiters need fast signals.
Real engineering maturity is slow to observe.
So now I'm starting to question something deeper.
Is the real problem building better evaluation tools?
Or helping the industry recognize better signals?
Curious to hear from people here who hire engineers:
What’s the first thing you actually look for when opening someone’s GitHub profile?