A new bot behavior I'm seeing on my site: my wiki software lets you fetch a feed for every page. Depending on the page, the feed contains either updates to that page or the pages it links to. It's for humans. Of course some shit engineer decided it was a good idea to scan the web for all the feeds out there (so rare! so precious!) and to download them all, forever (uncover the darknet! serve our customers!). Now I have to block IP ranges, add user agents to robots.txt files (not all bots send one), or block user agents outright (not all of them send a useful one). I block and block and block (for the environment! to avoid +2.0°C and the end of human civilization). Knowing that all these shit requests exist out there – a hundred thousand requests or more per day, burning CO₂ for nothing – makes me sad.
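For what it's worth, the block-and-block-and-block routine looks roughly like this. A minimal sketch: the bot name and IP range are made-up placeholders (`ExampleFeedBot`, the 192.0.2.0/24 documentation range), and the server config assumes nginx, which may or may not match the actual setup.

```nginx
# robots.txt — only helps against bots that send a user agent AND honor the file
User-agent: ExampleFeedBot
Disallow: /

# nginx sketch (http context): flag a made-up user agent…
map $http_user_agent $bad_bot {
    default          0;
    ~*ExampleFeedBot 1;
}

# …and in the server block, refuse it, plus a made-up IP range
server {
    if ($bad_bot) { return 403; }
    deny 192.0.2.0/24;  # RFC 5737 documentation range, stands in for a real offender
}
```

The three layers cover the three failure modes from above: robots.txt for the polite bots, the user-agent check for the impolite ones that at least identify themselves, and the IP deny for the ones that send nothing useful at all.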
#ButlerianJihad
And so the load on my tiny little server droppeth from 80+ to 0.5 again. What fuckery.