@leah I have added several lines to my .htaccess file and also adjusted the firewall on the server where my website is hosted. That's it.
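Roughly something along these lines (just a sketch, not my exact rules; GPTBot, CCBot, ClaudeBot, Bytespider and PerplexityBot are examples of known AI-crawler user agents, add whatever else you want to keep out):

# Refuse requests from known AI-scraper user agents (example list)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (GPTBot|CCBot|ClaudeBot|Bytespider|PerplexityBot) [NC]
RewriteRule .* - [F,L]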
First of all, robots.txt is not legally binding; it's an agreement with search-engine companies. So if they ignore it, they stay within the boundaries of the law, even though they're assholes.
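For comparison, this is all robots.txt can do, and it only works if the crawler chooses to honor it (again, the bot names are just examples):

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /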
I think it is better to do what I did, and if you have the option to go even further, like blocking at the OS level of the server, you're good for now.
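By OS-level blocking I mean something like this with iptables (the IP range here is only a placeholder; you'd have to look up the scraper's published address ranges yourself):

# Drop all traffic from a scraper's network range
iptables -A INPUT -s 203.0.113.0/24 -j DROP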
@leah
I think web designers, coders and hosting companies should work together to do something about this pest. It completely ruins the internet, uses tonnes of resources and so does even more damage to the climate than we're already doing.
We should take back the internet, the way the fediverse does.
Most mainstream social media and AI scrapers, if not all, are a virus: not just for the infrastructure and content of the internet, but also for society as a whole.