@shalf @kittylyst @KathleenC @petrosyan @publictorsten I understand where you're coming from, and you are right to be upset. But this was not an act of malice; it was handled professionally and responsibly by management once they became aware of the situation, and - based solely on the original post, since that's all the context I have - the injured party appears satisfied with the outcome.
If we fired people every time they failed to notice something an IT tool did behind their back, there would literally be no one left working in IT. No one in their right mind expects an "expand image" plugin to sexualize the image's contents; would you punish someone for printing flyers that carry hidden tracking metadata from the printer (https://www.snopes.com/fact-check/household-printers-tracking-code/)?
The correct course of action, imo, is to start exactly as they've done - remove the offending image and apologize to the injured party - and then treat it as an educational moment so that it doesn't happen a second time.
Now, if it DOES happen a second time, fuck 'em. Unless/until, though, let people learn from their mistakes.
@dave_cochran Alright. Give the SoMe worker a solid warning and better training. Fire their manager.
To be clearer: reputation and legal risk are badly underestimated in this industry, and that has to change. Yes, that means people at the responsibility and strategy level - the ones who think AI is harmless and allow or promote its unsupervised use, mostly for cost reasons while ignoring the externalities - should lose their jobs.