In this Trends.vc Report, we talk about open LLM leaderboards, controversial tasks, unfiltered creativity and more.
Censorship lowers leverage. Privacy limitations lower trust.
Local AI shifts control from OpenAI, Microsoft and Google to the people.
You pay for centralized AI tools that tell you what you can and cannot do, while they save your documents and innermost thoughts on their servers.
Local AI gives you more control over your data and usage.
It becomes your advisor, not a supervisor.
Local AI Models
- WriteUp locks privacy behind a paid plan. It collects data from free users only.
- Pieces is a local-first coding assistant that protects your codebase. It uses your local resources to give code suggestions.
- Pieter Levels grew TherapistAI to $2,000/mo. He says local LLMs are perfect for sensitive use cases and plans to turn it into a client-side chatbot.
- LM Studio lets you build, run and chat with local LLMs (see the sketch after this list).
- WebLLM is an in-browser AI engine for using local LLMs.
- TypingMind lets you self-host local LLMs on your own infrastructure.
- Matthew Berman shows how to run any AI model with LM Studio.
- Zoltan C. Toth teaches The Local LLM Crash Course. He’s got 2,769 students.
- Sam Witteveen made a series of tutorials on running local AI models with Ollama.
- Eden Marco teaches how to build LLM apps with LangChain. He’s got 56,404 students.
- Sharath Raju teaches how to use LangChain with Llama 2 and HuggingFace. He’s got 10,657 students.
- Camel lets you use open-source AI models to build role-playing AI agents.
- OpenAGI lets you use local models to build collaborative AI teams. Here’s an example of an AI team that writes blogs.
- MetaGPT lets you build a collaborative entity for complex tasks. It works best with commercial models, but you can use open-source AI too.
- ChatDev uses several AI agents with different roles to build software. They collaborate by “attending” specialized seminars on design, coding, testing and more.
- Flowise lets you build custom LLM flows and AI agents.
- Langflow offers a visual interface for building AI-powered apps.
- Obviously AI lets you build production-ready AI apps without code.
- Venice is a privacy-first chatbot that stores chats in your browser.
- xAI is an AI lab led by Elon Musk. It released Grok-1, an open-source and uncensored alternative to OpenAI’s models.
- Perplexity made uncensored AI models that outperformed GPT-3.5 and Llama 2. Paired with browser access, they went too far. Since they weren’t open-source, they were taken down in 6 months.
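As referenced above, LM Studio can also run a local server that speaks the OpenAI API format, so existing OpenAI client code can point at it. Here’s a minimal sketch, assuming the server is running on its default port with a model already loaded; the model identifier and API key are placeholders.

```python
# Minimal sketch: talk to an LM Studio local server with the OpenAI Python client.
# Assumes LM Studio's server is running on its default port with a model loaded.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's local, OpenAI-compatible endpoint
    api_key="lm-studio",                  # placeholder; the local server doesn't check keys
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; use the identifier of the model you loaded
    messages=[{"role": "user", "content": "What are the benefits of running LLMs locally?"}],
)
print(response.choices[0].message.content)
```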
“I need more expensive and powerful hardware to run local AI models.”
This is the main tradeoff for local AI at the moment. But local models are becoming more efficient. See how llama.cpp lets you run them on consumer devices and how Apple is doing this at scale. This may be an inflection point for hardware and local AI.
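For a sense of how little ceremony this takes, here’s a minimal sketch using the llama-cpp-python bindings to run a quantized GGUF model on a laptop. The model filename and thread count are assumptions; substitute whatever build your hardware can handle.

```python
# Minimal sketch: run a quantized GGUF model on consumer hardware with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-3-8b-instruct.Q4_K_M.gguf",  # hypothetical local file; download any GGUF build
    n_ctx=4096,   # context window
    n_threads=8,  # CPU threads; tune for your machine
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize why local AI matters in one sentence."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```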
“Setup and onboarding are hard. I can’t just visit a URL.”
User experience with local AI is a solvable problem. We’re getting there with open-source tools that make setup easier. Ollama lets you set up Llama 3 in 10 minutes.
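As a rough sketch of how simple that setup is, this assumes the Ollama daemon is running and you’ve already pulled the model with `ollama pull llama3`:

```python
# Minimal sketch: chat with a locally running Llama 3 via the Ollama Python client.
import ollama

response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Explain local LLMs in two sentences."}],
)
print(response["message"]["content"])
```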
“Local AI models perform worse than AI models made by tech giants.”
Open-source AI models can be a little worse, but they’re a lot more private and less censored. Depending on your use case, it can be wise to sacrifice a little quality rather than give up your privacy.
“Providing support for models running locally sounds impossible.”
This is another tradeoff of local LLMs. Unless the model becomes completely unusable, users can use one AI model to debug another. This guy uses local AI models as copilots for coding copilots.
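Not the exact workflow linked above, but an illustrative sketch of the idea: one locally pulled model drafts code and a second one reviews it, both through Ollama. The model names are assumptions; use whatever you’ve pulled.

```python
# Illustrative sketch: one local model reviews another's code suggestion via Ollama.
import ollama

task = "Write a Python function that parses an ISO 8601 date string."

draft = ollama.chat(
    model="codellama",  # assumed "coder" model, pulled locally
    messages=[{"role": "user", "content": task}],
)["message"]["content"]

review = ollama.chat(
    model="llama3",  # second local model acts as the reviewer
    messages=[{"role": "user", "content": f"Review this code for bugs:\n\n{draft}"}],
)["message"]["content"]

print(review)
```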
“Local AI models aren't a panacea for AI-related data privacy issues.”
This is the risk of storing data in digital form. “Private”, local AI may not protect your data if your computer is compromised.
Become a Trends Pro Member to get the full report on Local AI.
Do you want to get the next free Trends.vc Report? Join 60,000+ founders discovering new markets and ideas.
If you're tired of the endless back-and-forth with your Ollama (Local AI) client just to repeat the same task over and over, I suggest checking out my repository on GitHub (github/albertocubeddu/extensionOS).