An idea: [Wayback/Archive] Jeroen Wiert Pluimers: “@ruurd @mcc … Maybe place useful content below 500 KiB and serve a file at least 1 GiB size?…” – Mastodon
@ruurd @mcc probably not, although Google Search limits them to 500 KiB.
https://developers.google.com/search/docs/crawling-indexing/robots/robots_txt#file-format
“Google currently enforces a robots.txt file size limit of 500 kibibytes (KiB). Content which is after the maximum file size is ignored. You can reduce the size of the robots.txt file by consolidating rules that would result in an oversized robots.txt file. For example, place excluded material in a separate directory.”
Maybe place the useful content below 500 KiB and serve a file of at least 1 GiB in size?
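A minimal sketch of how that could look, assuming you generate the file up front rather than stream it from the web server; the example rules, the padding comment and the exact 1 GiB target below are just placeholders:

```python
#!/usr/bin/env python3
# Sketch of the idea above (illustrative only): write the real rules first,
# so a parser that stops at 500 KiB, as Googlebot's does per the quoted
# documentation, still sees them, then pad the file to roughly 1 GiB with
# comment lines that robots.txt parsers ignore.

TARGET_SIZE = 1 * 1024 ** 3           # 1 GiB total, the size from the toot
PADDING = b"# padding\n" * 65536      # ~640 KiB of harmless comment lines

# Placeholder rules; substitute your own directives here.
USEFUL_RULES = b"User-agent: *\nDisallow: /private/\n"

with open("robots.txt", "wb") as f:
    f.write(USEFUL_RULES)
    written = len(USEFUL_RULES)
    while written < TARGET_SIZE:
        chunk = PADDING[: TARGET_SIZE - written]   # never overshoot the target
        f.write(chunk)
        written += len(chunk)
```

In theory, crawlers that honour the 500 KiB cap only ever parse the rules at the top, while anything that insists on downloading the whole file pulls down a gigabyte of comments.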
It was in response to these earlier toots (which quote some very interesting links on when cookies are (dis)allowed – TL;DR: it depends on local regulations):