Alex Schroeder

A new bot behavior I'm seeing on my site: my wiki software lets you fetch a feed for every page. Depending on the page, the feed contains either updates to that page or the pages it links to. It's for humans. Of course some shit engineer decided it was a good idea to scan the web for all the feeds out there (so rare! so precious!) and to download them all, forever (uncover the darknet! serve our customers!). Now I have to block IP ranges, add user agents to robots.txt files (not all bots send one), or block user agents outright (not all of them send a useful one). I block and block and block (for the environment! to avoid +2.0°C and the end of human civilization). Knowing that all these shit requests are out there – a hundred thousand or more per day, generating CO₂ for nothing – makes me sad.
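For anyone fighting the same fight: robots.txt only helps against bots that send a user agent and honor the file; everything else gets stopped at the web server. A minimal sketch, assuming nginx is the server in front of the wiki (the agent names are placeholders, not my actual block list):

    # nginx sketch: refuse requests from feed-scraping user agents.
    # The agent patterns are placeholders, not a real block list.
    # The map goes in the http {} context of nginx.conf.
    map $http_user_agent $blocked_agent {
        default          0;
        ~*ExampleFeedBot 1;
        ~*SomeCrawler    1;
        ""               1;  # no user agent at all is suspicious too
    }

    server {
        listen 80;
        server_name example.org;

        # Return 403 for anything on the block list
        # before the request ever reaches the wiki.
        if ($blocked_agent) {
            return 403;
        }

        # ... normal wiki configuration ...
    }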
#ButlerianJihad

2 comments
Alex Schroeder

And so the load on my tiny little server droppeth from 80+ to 0.5 again. What fuckery.

Alex Schroeder

I wonder whether there is a witty phrase one can extract from this. "Do not make requests on the Internet if a human does not need it right now. If you fetch things now because somebody might need it someday, you are a planet killer."
