
Show IH: We built the first real-time multiplayer AI workspace (Humans + Agents together). This is not a wrapper.

Hey Indie Hackers,

We are the team at Blankline. Today we’re sharing Dropstone—a new desktop IDE that changes how humans and AI collaborate.

Most of us are using tools like Cursor, Claude Code, or Copilot. They are incredible, but they all share the same limitation: they are single-player. It’s just you and a chatbot. If you close the window, the context is often lost. If you want to bring a co-founder in to see what the AI built, you have to screen-share or push to Git.

We built Dropstone to be multiplayer by default. It is a shared workspace where humans and AI agents work side-by-side on the same project state in real-time.

Here is the demo (Share Chat just dropped in v3.0.5): https://www.youtube.com/watch?v=RqHS6_vOyH4

The Problem: The "70% Wall"
We’ve all seen the new wave of "text-to-app" tools (Lovable, Bolt, etc.). They are great for getting 70% of the way there. But when the AI hits a complexity wall, introduces a bug, or hallucinates, non-technical founders are stranded.

We designed Dropstone to solve this:

Business Owner describes the feature in chat.

AI Agents build the feature in the background.

Developer reviews the code and fixes the hard 30% in the same editor.

Everything syncs in real-time.

What makes this different (Technical Deep Dive)
We didn't just wrap the OpenAI API. We are a research lab (Blankline) and we built our own runtime infrastructure from the ground up to support this.

1. Real-Time Multiplayer (Humans + Agents)
Unlike Cursor, Dropstone supports multiple users and multiple agents in one session.

Share Chat: As of v3.0.5, you can generate a link to your workspace. Anyone can join via web or desktop to collaborate instantly.

Agent Coordination: We don't just have one LLM answering prompts. We use Horizon Mode, where multiple agents coordinate. One agent explores a solution, shares state with a second agent, who critiques or tests it. They communicate without you needing to prompt them.
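To make the coordination idea concrete, here is a toy sketch of an explore-and-critique loop between two agents. Everything here is hypothetical: `call_model` is a stub standing in for any LLM backend, and none of these names are Dropstone's actual API. The point is the shape of the protocol, where agents pass state to each other without the user re-prompting.

```python
# Hypothetical sketch of two coordinated agents: an explorer proposes a
# solution, a critic reviews it, and they loop until the critic approves.
# `call_model` is a stub so the coordination logic itself is runnable.

def call_model(role: str, prompt: str) -> str:
    """Stub LLM call; a real system would hit a model API here."""
    if role == "explorer":
        return f"PROPOSAL: {prompt[:60]}"
    # Critic role: approve anything that doesn't contain prior rejections.
    return "APPROVE" if "critic said" not in prompt else "REJECT: address feedback"

def coordinate(task: str, max_rounds: int = 3) -> str:
    """Explorer drafts; critic reviews; loop until approved or rounds run out."""
    feedback = ""
    proposal = ""
    for _ in range(max_rounds):
        proposal = call_model("explorer", task + feedback)
        verdict = call_model("critic", proposal)
        if verdict.startswith("APPROVE"):
            return proposal
        feedback = f" (critic said: {verdict})"
    return proposal  # best effort after max_rounds

print(coordinate("add pagination to the /users endpoint"))
```

With real model calls behind `call_model`, the same loop lets one agent test or veto another's work autonomously.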

2. Infinite Context (D3 Engine)
We don't rely solely on massive context windows (which get expensive and slow). We built the D3 Engine (Dynamic Distillation & Deployment).

It virtualizes context using logic-regularized compression.

It achieves 50:1 compression while preserving 100% of the logic gates and variable definitions.

This means Dropstone remembers your architecture, your chaotic "todo" comments, and your API signatures indefinitely, without forgetting old instructions.
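For intuition on what "distilling" context can mean (this is a generic illustration, not the D3 Engine's actual algorithm), here's a toy compressor that keeps signatures and definitions while discarding function bodies, using Python's standard `ast` module:

```python
# Toy context distillation: compress source code down to a skeleton that
# preserves module-level definitions (signatures, class names, assignments)
# while dropping implementation bodies. Illustrative only.

import ast

def distill(source: str) -> str:
    """Return a definition-preserving skeleton of `source`."""
    tree = ast.parse(source)
    lines = []
    for node in tree.body:
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            args = ", ".join(a.arg for a in node.args.args)
            lines.append(f"def {node.name}({args}): ...")
        elif isinstance(node, ast.ClassDef):
            lines.append(f"class {node.name}: ...")
        elif isinstance(node, ast.Assign):
            targets = ", ".join(ast.unparse(t) for t in node.targets)
            lines.append(f"{targets} = ...")
    return "\n".join(lines)

code = '''
API_URL = "https://example.com"

def fetch(path, timeout):
    # implementation detail the agent rarely needs verbatim
    return path

class Client:
    pass
'''
print(distill(code))
```

A real system would be far more sophisticated, but the principle is the same: the agent keeps the names and shapes it must never forget, and reconstitutes detail on demand.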

3. Workspace Memory
Dropstone builds a structured brain for your project. It remembers why you made that weird architectural decision three weeks ago. It remembers that your team prefers functional programming over OOP. This persists across sessions—you don't start from zero every time you open the app.
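To show what session-persistent project memory looks like in the simplest possible form, here's a hypothetical sketch: a local JSON store of decisions and preferences. The class name, file path, and methods are all illustrative, not Dropstone's real implementation.

```python
# Hypothetical sketch of workspace memory: decisions and preferences stored
# in a local JSON file so they survive across editor sessions.
# `WorkspaceMemory` and ".workspace_memory.json" are made-up names.

import json
from pathlib import Path

class WorkspaceMemory:
    def __init__(self, path: str = ".workspace_memory.json"):
        self.path = Path(path)
        if self.path.exists():
            self.data = json.loads(self.path.read_text())
        else:
            self.data = {"decisions": [], "preferences": {}}

    def remember_decision(self, what: str, why: str) -> None:
        """Record an architectural decision and its rationale."""
        self.data["decisions"].append({"what": what, "why": why})
        self._save()

    def set_preference(self, key: str, value: str) -> None:
        """Record a team convention the agents should respect."""
        self.data["preferences"][key] = value
        self._save()

    def _save(self) -> None:
        self.path.write_text(json.dumps(self.data, indent=2))

mem = WorkspaceMemory()
mem.remember_decision("kept the monolith", "deploy simplicity until we hit scale")
mem.set_preference("style", "functional over OOP")
```

A production system would layer retrieval and summarization on top, but even this much is enough that a fresh session doesn't start from zero.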

We are not a Wrapper (We publish our Research)
There are too many "AI wrappers" right now. We believe in transparency. We conduct original research in mathematics, physical infrastructure, and agentic systems, and we publish peer-reviewed papers on our tech.

If you want to see the math behind our engine:

Tensor Rank Optimization: We published research on 3x3 Matrix Multiplication barriers (DOI: 10.5281/zenodo.18443297).

Memory Architecture: Our paper on "D3 Adaptive Memory" details how we solved the Temporal Event Horizon problem (88.7% recall versus a 12.4% baseline).

Agent Swarms: Our research on Recursive Swarm Architecture explains how we keep agents coherent for 24+ hour tasks.

You can read all our papers here: https://www.blankline.org/research

Local-First & Model Agnostic
We know many of you care about privacy and vendor lock-in.

Local-First: The D3 Engine and memory run locally on your machine.

Offline Capable: Full support for Ollama. You can run Llama 3 or DeepSeek locally and use Dropstone’s multiplayer features without data leaving your machine.

Model Agnostic: Plug in Claude, GPT-4, Gemini, or your local models. We provide the infrastructure; you choose the brain.
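If you haven't used Ollama before, the local setup is genuinely simple. The endpoint and JSON shape below follow Ollama's documented `/api/generate` route; how Dropstone wires into it internally is not shown here, so treat this as a standalone sketch of talking to a local model:

```python
# Minimal sketch of calling a local Ollama server (run `ollama pull llama3`
# and start Ollama first). Only Python's standard library is used.

import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for Ollama's HTTP API."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )

req = build_request("llama3", "Explain this stack trace.")

# Uncomment with a local Ollama running; nothing leaves your machine:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

Swapping `"llama3"` for a DeepSeek tag (or pointing at a hosted API instead) is the whole story of being model agnostic: the infrastructure stays the same, only the brain changes.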

Try it out
We have a generous Free Tier that includes local Ollama models and the full editor.

Download: https://www.dropstone.io/downloads

Docs: https://docs.dropstone.io

We’d love to hear your feedback on the "Share Chat" workflow. Are you finding it easier to collaborate with non-technical team members?

Let us know what you think!

on February 7, 2026