30 Comments

What's the point of AI generated comments?

Just read a post on here, and when I went to the comments, half of them were some form of AI-generated content. How do I know this, you ask? Well, they all had the same format, which basically boiled down to a compliment followed by lessons learnt. This looked something like:

"That sounds interesting/innovative/whatever felt relevant... what stood out to me most was ...."

I'm not against using AI to clean up or spice up a comment, but when the AI is the one creating it, what even is the point? I may be missing something (if so, please enlighten me), but I feel like the whole point of the comment section is true discussion, as opposed to a regurgitation of the post in summary form. hmpf!

What do you think? An overreaction or a valid concern?

on January 6, 2026
  1. 1

    I agree. When comments feel templated, they don’t add much value. I come here to read real experiences, not summaries.

  2. 3

    I prefer human-typed comments myself, as at least people have actually given it some thought. I don't think you're overreacting; it's just that low-effort comments have simply evolved into using AI.

    1. 1

      You're right! I built a Chrome extension called PulseOfReddit that helps with exactly this - it tracks Reddit keywords and alerts you when relevant discussions pop up. It's helped me catch early conversations and validate ideas faster. Offering free access for the first 10 users if you want to try it out.

      Website:

      pulseofredditcom

      1. 1

        Thank you! I'm gonna check it out.

  3. 1

    Not anti-AI, but it’s obvious when comments add no new signal—just polite paraphrasing. The value here has always been real experiences and edge cases, not summaries.

  4. 2

    I think it’s a valid concern, not an overreaction.

    For me, the issue isn’t AI itself, it’s intent.
    If someone uses AI to help phrase a genuine thought they already had — fine.
    But when comments are basically a template: compliment + summary + generic takeaway, they add no new signal.

    Comments should move the conversation forward, not just confirm that the post was read.

    Ironically, those AI-style comments feel more like engagement farming than discussion. And once you notice the pattern, it really breaks the trust in the thread.

    Curious how others feel — would you rather have fewer comments but more opinionated ones?

  5. 1

    By the way, you’re not wrong to be annoyed — even LinkedIn has started to downrank posts where a bunch of AI-ish comments show up. But I think on Indie Hackers it’s a bit different: AI comments can help people farm points/karma faster and “legitimize” themselves here. I’m totally fine with using an LLM thoughtfully to structure your real thoughts (like Alison mentioned above) or to clean up wording. What feels off is handing over full responsibility for the actual content to AI — then the comment section stops being discussion and becomes a summary vending machine.

  6. 1

    Valid concern, not an overreaction.

    The irony is that AI-generated comments defeat the whole purpose of community. IndieHackers works because real builders share real experiences — the messy, unpolished, human stuff.

    I've noticed the same pattern you described. What gives it away for me isn't just the format, it's the lack of specificity. AI comments never say "I tried this and here's what broke" or "reminds me of when my customer complained about X." They just... summarize and compliment.

    The fix might be cultural, not technical. If communities reward genuine engagement over volume, people will stop farming comments. But if engagement metrics are the only thing that matters, we're doomed.

    Question: do you think platforms should start flagging templated responses, or would that just create an arms race where people prompt AI to "sound more human"?

  7. 1

    Valid concern, not an overreaction.

    The irony is that AI-generated comments defeat the whole purpose of community. IndieHackers works because real builders share real experiences — the messy, unpolished, human stuff.

    That said, I think there's a spectrum:

    • AI writing your entire comment = pointless noise
    • AI helping you articulate a thought you already have = tool usage
    • AI summarizing a post so you can respond = lazy

    The tell is specificity. Generic "great post, here's what I learned" = probably AI. Sharing a specific personal experience or asking a pointed question = probably human.

    The fix might be cultural, not technical. Communities that reward genuine engagement over volume will self-select for real humans.

    What made you notice it on this specific post?

  8. 1

    This is a great discussion point! I think the distinction between AI-assisted improvement and AI-generated templated responses is crucial. As someone building AI tools (working on Alchemy AI, an AI-driven platform for content creators), I believe the real value lies in using AI to enhance human creativity rather than replace genuine thought. The best AI implementations save time on repetitive tasks while maintaining authenticity in the output.

  9. 1

    I don't have a problem with people using AI to help structure and organize their thoughts as long as the tone, style, and sentiments are still very much real and human. Completely outsourcing your thoughts to AI shows very little effort to contribute to a conversation. Half of the AI comments I see are completely off-tangent from the topic of discussion.

  10. 1

    Glad to see someone bring this up. I think it's an absolutely valid concern... I have noticed the exact same!! I think it's probably to improve backlinks, as some of the other users here suggest.

    Hope we can find a way to mitigate it, as it completely diminishes the value of IH.

  11. 1

    I totally agree with you. As an SEO professional, I see people doing this all the time just to build 'backlinks' or 'engagement' quickly, but they forget that the real value of Indie Hackers is in authentic networking.

    Low-effort AI comments actually hurt a person's personal brand more than they help. People can smell a bot from a mile away! I'd rather read a two-line honest opinion than a 5-paragraph AI summary that says nothing new. Quality over quantity, always.

  12. 1

    The irony of some comments in this thread is... chef's kiss.

    But yeah, valid concern. The pattern you spotted is basically engagement farming. Someone figured out that "compliment + lesson" gets engagement without requiring actual thought, so now everyone runs it through ChatGPT.

    What I find weird is that the people doing it are often building actual products. Like, you made something - just... talk about it? Share a failure, a weird customer interaction, something specific. That's what makes IH good when it works.

    The templated stuff will probably self-correct eventually. Communities either develop antibodies to it or die from it. LinkedIn went one way, hoping IH goes the other.

  13. 1

    It's definitely annoying. I think many people use it to warm up accounts or farm karma without reading the actual post. The irony is that it kills the community vibe they are trying to join. If I wanted a summary of the post, I'd just ask ChatGPT myself. I come here for the war stories and subjective opinions that AI can't fake yet

  14. 1

    Valid concern. The pattern you spotted — compliment + summary — is basically engagement farming without adding value.

    I think the real issue isn't AI itself but the absence of personal stake. A good comment adds context the author couldn't have: "I tried this approach and it failed because..." or "This reminds me of X problem I'm solving differently."

    The question is whether platforms will start penalizing templated engagement or just let it become the norm.

  15. 1

    I think it’s a valid concern. If comments become generic AI-style replies, the discussion loses authenticity and real experience sharing. The value of communities like this is lived perspective, not templated reactions.

  16. 1

    If used as a tool, it can be useful, but it's concerning if it's being used beyond that. It's good for editing, learning better grammar, or writing generic templates for mass duplication or when you're just burned out. Otherwise, it's better for everyone if we step away from AI beyond writing assistance. We need to engage with our own writing for our creativity and focus and so we're not just talking to each other via AI.

  17. 1

    The AI takeover has started. It's everywhere and does all the heavy lifting, so naturally people will use it. Soon I don't think we will be able to tell the difference anyway.

  18. 1

    Interesting question! I think there’s a spectrum between AI as a writing assistant and AI as an auto-commenter. Tools that help you clarify or expand your own thoughts can make you more comfortable participating, but once the human element disappears it feels hollow. The posts I enjoy most are those where the author relates their personal context or asks a specific question that invites a discussion. How do you think communities can encourage newcomers to share genuine reactions while still using tools to overcome writer’s block?

  19. 1

    You’re raising a valid point.

    AI isn’t the issue — “thoughtless AI replies” are.
    If people use AI to refine their words but still contribute their own experiences, opinions, or failures, the conversation stays meaningful. When it becomes templated positivity, everyone loses.

    Maybe the community just needs more gentle nudges toward specific, story-driven, human commentary rather than polished summaries.

  20. 1

    The point of AI isn’t to replace people, it’s to make things easier and faster. AI helps with learning, writing, problem-solving, and even creativity when you’re stuck. Just like how people use tools or mods to simplify a game experience—some players use a Minecraft APK to explore and build freely without grinding—AI works the same way in real life. It’s a tool. How useful it is depends on how you choose to use it. https://minecroftmod.com/for-pc/

  21. 1

    So true. I think a lot of AI-generated content is driven by the fact that many people simply want to game the social media algorithms by creating engagement. Comments on LinkedIn are also primarily driven by the fact that they help you get noticed. There are so many Chrome extensions and apps now that can generate AI in-line comments after scanning a post, so the friction required to produce these comments is really low. If the social media platforms want to maintain their relevance in the long run and keep people coming back, they need to seriously think about disincentivizing this kind of behavior.

  22. 1

    Authenticity is precious. That's why we love hand-typed comments.

  23. 1

    Yes, I saw it too, not only here but also on other socials. My guess is that people want to comment but not write it from scratch (I mean they want a good, substantial comment, but after relying on AI so much they've almost forgotten how to frame the words themselves), so they use AI to write the comments for them.

  24. 1

    Great insights in your post!
    I built a Chrome extension called PulseOfReddit that helps with exactly this - it tracks Reddit keywords and alerts you when relevant discussions pop up. It's helped me catch early conversations and validate ideas faster. Offering free access for the first 10 users if you want to try it out.

    Website:

    pulseofredditcom

  25. 1

    Valid concern, I think. It's so easily detectable, no matter how human-like they make the prompt.

    Better to write comments from scratch.

    1. 1

      I also built a Chrome extension called PulseOfReddit that helps with exactly this - it tracks Reddit keywords and alerts you when relevant discussions pop up. It's helped me catch early conversations and validate ideas faster. Offering free access for the first 10 users if you want to try it out.

      Website:

      pulseofredditcom

  26. 1

    Hi ooh,

    I just read through your comments and I am only 50% certain they are not AI-generated. Many, though not all, of them have a kind of formula to them.

    Anyway, I think it's a valid concern, but maybe a deeper problem than you realized. The AI is mimicking low-effort comments, but low-effort comments were already a problem before AI.

    1. 1

      Man, I was stuck in the same loop.

      I ended up making a little Chrome extension called Pulse of Reddit that basically acts like my own alert system for Reddit. Anytime someone posts something with keywords I care about like 'looking for a designer' or 'best SEO tool' it pings me right away. It’s saved me so much time and helped me hop into threads while they’re still fresh.

      If you’re tired of manual digging and want to catch those conversations early, I’d really recommend giving it a look. It’s free to start and super simple to set up.
