robots.txt isn't working. Some sites have turned to tarpits like Nepenthes or Quixotic to poison models, but that's a lose-lose approach.
DarkForest is open-source middleware (Express/Next.js/Vite) that lets any site establish clear boundaries with AI systems. One line of code redirects known AI crawlers to a page explaining your content policy - no poisoning required.
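To make the mechanics concrete, here's a minimal sketch of the general approach - not the package's actual API. The function name, the pattern list, and the /ai-policy path are all illustrative; the idea is simply to match known AI crawler user agents and redirect them before your routes run:

```typescript
import express, { Request, Response, NextFunction } from "express";

// Illustrative subset of AI crawler user-agent patterns.
// The real package maintains its own, fuller list.
const AI_CRAWLER_PATTERNS = [/GPTBot/i, /ClaudeBot/i, /CCBot/i, /PerplexityBot/i];

// Hypothetical middleware: redirect matching crawlers to a boundary page.
function blockAiCrawlers(redirectPath = "/ai-policy") {
  return (req: Request, res: Response, next: NextFunction) => {
    const ua = req.get("user-agent") ?? "";
    if (AI_CRAWLER_PATTERNS.some((p) => p.test(ua))) {
      return res.redirect(302, redirectPath);
    }
    next();
  };
}

const app = express();
app.use(blockAiCrawlers()); // the "one line" integration point
app.get("/ai-policy", (_req, res) => {
  res.send("This site asks AI systems to respect its content boundaries.");
});
```

The point of the sketch is the integration cost: because the matching lives in middleware, adopting it is a single app.use() call rather than a rewrite of your routes.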
This is just the start. The long-term vision is to build: (1) a collective-intelligence network where sites share crawler patterns, and (2) an internet-scale information marketplace for powering AI. Think Google Ads, but for AI systems that need real-time access to your content.
I believe we need infrastructure for fair, automated value exchange at internet scale, not an endless arms race of litigation and blocking. Would love feedback, especially from web hosts dealing with AI bot problems, devs who could implement, contribute, or critique, and anyone with thoughts on the marketplace vision.
Source: https://github.com/darkforest-protocol/darkforest-blocker
Discord: https://discord.gg/JMBhkhUx