Open source devs are fighting AI crawlers with cleverness and vengeance | TechCrunch

Summary

Sites hosting free and open source software (FOSS) projects share more of their infrastructure publicly, which leaves them especially exposed to AI crawler traffic.

Many AI crawler bots don't honor the Robots Exclusion Protocol's robots.txt file, the mechanism that tells bots what not to crawl, originally created for search engine bots.
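To illustrate what honoring the protocol looks like, here is a minimal sketch using Python's standard-library robots.txt parser. The bot names and rules are hypothetical; a well-behaved crawler would run a check like this before every fetch, which is exactly the step many AI crawlers skip.

```python
# A well-behaved crawler consults robots.txt before fetching a URL.
# Bot names and rules below are hypothetical examples.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: ExampleAIBot
Disallow: /

User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The hypothetical AI bot is barred from the whole site...
print(parser.can_fetch("ExampleAIBot", "/docs/index.html"))  # False
# ...while other agents may fetch public pages, but not /private/.
print(parser.can_fetch("OtherBot", "/docs/index.html"))      # True
print(parser.can_fetch("OtherBot", "/private/data.html"))    # False
```

The protocol is purely advisory: nothing in it technically prevents a crawler from ignoring the result of `can_fetch`, which is why the tools described below exist.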

Developer Xe Iaso built a tool called Anubis to block bots but let through browsers operated by humans.

Anubis is a reverse proxy that imposes a proof-of-work check, which must be passed before a request is allowed to hit a Git server.
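The idea behind a proof-of-work gate can be sketched in a few lines. This is an illustrative toy, not Anubis's actual challenge format: the client brute-forces a nonce whose SHA-256 hash has enough leading zero bits, which is cheap for one human-driven browser but expensive at crawler scale, while the proxy verifies with a single hash.

```python
# Toy proof-of-work check in the spirit of Anubis (assumption: the real
# challenge format and difficulty tuning differ). The client searches for
# a nonce; the proxy verifies it with one cheap hash.
import hashlib

DIFFICULTY_BITS = 8  # hypothetical; real deployments tune this


def leading_zero_bits(digest: bytes) -> int:
    """Count leading zero bits of a hash digest."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()
        break
    return bits


def solve(challenge: str) -> int:
    """Client side: brute-force a nonce that meets the difficulty."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
        if leading_zero_bits(digest) >= DIFFICULTY_BITS:
            return nonce
        nonce += 1


def verify(challenge: str, nonce: int) -> bool:
    """Proxy side: a single hash confirms the work was done."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
    return leading_zero_bits(digest) >= DIFFICULTY_BITS


nonce = solve("example-challenge")
print(verify("example-challenge", nonce))  # True
```

The asymmetry is the point: verification costs one hash, solving costs on the order of 2^DIFFICULTY_BITS hashes, so mass scraping becomes disproportionately expensive.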

SourceHut's Drew DeVault told TechCrunch that “Nepenthes has a satisfying sense of justice to it, since it feeds nonsense to the crawlers and poisons their wells, but ultimately Anubis is the solution that worked” for his site.
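A tarpit in the style of Nepenthes can be sketched as a page generator that never runs dry. This is an illustrative assumption about the approach, not Nepenthes's real implementation (which generates more convincing Markov-chain text): every URL deterministically yields word salad plus links deeper into the maze, so a crawler that follows links crawls forever.

```python
# Toy tarpit sketch (hypothetical; not Nepenthes's actual code): each path
# deterministically maps to a page of nonsense with links to deeper paths,
# so a link-following crawler never reaches the end.
import hashlib

WORDS = ["lorem", "ipsum", "dolor", "sit", "amet", "consectetur"]


def page(path: str, n_words: int = 30, n_links: int = 3) -> str:
    # Seed everything from the path: the same URL always yields the
    # same page, which makes the maze look like static content.
    seed = hashlib.sha256(path.encode()).digest()
    words = [WORDS[b % len(WORDS)] for b in seed[:n_words]]
    links = [f'<a href="{path}/{b}">more</a>' for b in seed[:n_links]]
    return "<p>{}</p>\n{}".format(" ".join(words), "\n".join(links))


print(page("/trap"))
```

Because the output is deterministic gibberish rather than an error page, a crawler has no easy signal to stop, and any model trained on the harvested text ingests noise.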

Cloudflare, perhaps the biggest commercial player offering several tools to fend off AI crawlers, last week released a similar tool called AI Labyrinth.