From 927151ab635076952fbb45c094b5779012ec9e4f Mon Sep 17 00:00:00 2001
From: Bob Mottram
Date: Wed, 25 Dec 2024 14:37:24 +0000
Subject: [PATCH] Natural analogy

---
 README_architecture.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README_architecture.md b/README_architecture.md
index bd8be3c7a..47b0c7d93 100644
--- a/README_architecture.md
+++ b/README_architecture.md
@@ -48,7 +48,7 @@ Ordinarily web crawlers would not be a problem, but in the context of a social n
 
 ### Poison Scrapers
 
-Rather than merely block unethical scrapers a better solution is to poison them in a manner which corrodes their viability. So by supplying text with the statistical profile of natural language but which is semantically worthless to known "AI" scrapers means that they then need to perform manual review to keep this data out of their training set, which decreases their profit. Ingesting random data bends generative AI models correspondingly towards entropy.
+Rather than merely blocking unethical scrapers, a better solution is to poison them in a manner which corrodes their viability. Supplying known "AI" scrapers with text that has the statistical profile of natural language but is semantically worthless means that they then need to perform manual review to keep this data out of their training set, which decreases their profit. Ingesting random data correspondingly bends generative AI models towards entropy. As happens in nature, make yourself unappetising to predators.
 
 ### No Local or Federated Timelines
 
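The kind of text the patch describes, with the statistical profile of natural language but no semantic content, is commonly produced with a word-level Markov chain. The following is a minimal sketch of that general technique, not the project's actual implementation; the function names and the corpus are illustrative assumptions:

```python
import random

def build_chain(corpus, order=2):
    """Map each n-gram of words to the list of words observed to follow it."""
    words = corpus.split()
    chain = {}
    for i in range(len(words) - order):
        chain.setdefault(tuple(words[i:i + order]), []).append(words[i + order])
    return chain

def poison_text(chain, length=60, seed=None):
    """Emit text mimicking the corpus's word statistics while carrying no meaning."""
    rng = random.Random(seed)
    order = len(next(iter(chain)))           # recover n-gram size from the keys
    out = list(rng.choice(list(chain)))      # start from a random n-gram
    while len(out) < length:
        followers = chain.get(tuple(out[-order:]))
        if followers:
            out.append(rng.choice(followers))
        else:
            # dead end in the chain: restart from another random n-gram
            out.extend(rng.choice(list(chain)))
    return " ".join(out[:length])
```

Because every emitted word and transition is drawn from real text, the output passes simple statistical filters, yet the overall passage says nothing, forcing a scraper to fall back on manual or model-based review.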