@publictorsten Not only did she not consent to this edit, additionally her photo was uploaded to an AI tool without her consent. I really don't know what's worse.
@KathleenC @petrosyan @publictorsten The point is that no human did this - the AI tool that the social media person used did it "automatically". It's an example of the harmful biases ("pictures of women should be sexually suggestive") encoded into every one of these damn things, and why none of them are fit for purpose.

@kittylyst @KathleenC @petrosyan @publictorsten The social media person was left on their own, not trained enough, not supervised enough. Fire them, and most of all their manager.

@dave_cochran Alright. Better train the social media worker after a solid warning. Fire their manager.

@dave_cochran "Absolutely no reason to think it would cause harm", in 2024, in a design-oriented communications and events community and business. Alright. Let me elaborate: it was the manager's job to a. have heard of and learned from Monteiro, e.g. 11 years ago (see Webstock 2013, iirc), and b. have heard of and learned from e.g. Gebru et al.'s stories and work. Defending workers is right, hence my leaning towards your point. Managers are paid to know better. About time for some accountability in those fields.

@dave_cochran To be honest and fair, if left in charge, I would also fire producers of indoor conferences that have neither mask mandates nor best-in-class air-renewal and filtering infrastructure and practices, for being stupid and having learned nothing; that is to explain how I approach taking care of people attending events, or producing/volunteering/speaking there. YMMV.

@dave_cochran Then you cannot fail to see the issue with a woman speaker being sexualized through the automated editing of a picture she provided but never consented to have edited, never being shown the edited version, with nothing internally preventing it from a. happening, b. not being thrown away in due time, or c. ever being published. How is that taking good care of your speaker community? How can it not show failure at a high level of core business competencies? How can someone in charge not be fired?

@dave_cochran @shalf @kittylyst @petrosyan @publictorsten Nope. No excuses for creating and posting a SEXUALIZED photo of this individual without permission. No excuses. This isn't "not noticing every single piddly thing IT tools ever do," you straw man you; this is altering the professional photo of a woman to SEXUALIZE IT. Wake up.

@dave_cochran @shalf @kittylyst @petrosyan @publictorsten Yeah, because "hidden metadata" is exactly the same as showing a faked image they pretended were her BREASTS.

@kittylyst @KathleenC @petrosyan @publictorsten The change itself is such an example, yes, but the mistake is using the tool in the first place.

@kittylyst @KathleenC @petrosyan @publictorsten A human did do it. A human decided to use the shit "AI" tool, and uncritically accepted the result that came out of it for use in the material they were posting. What YOU are doing is exactly the reason bad people love "AI". You are failing to account for the human agency of how the tool was used, and laundering the blame onto a tool that can't be held accountable.

@kittylyst @KathleenC @petrosyan @publictorsten A human chose to use the tool. You can't wash away bad decisions by saying "an AI tool did it."

@KathleenC @petrosyan @publictorsten It was done by A.I. There is no one to fire. I guess you could fire a random person who ordered the A.I. to modify the photo to fit the space, or the team that programmed the A.I. I am not sure that would help anything, though.

@Jon_Kramer If you start holding people accountable for using AI and causing harm with it, I guarantee you that will "help".

@Jon_Kramer @KathleenC @petrosyan @publictorsten C&P'ing the exact same reply I gave to someone else with the same bad argument: A human did do it. A human decided to use the shit "AI" tool, and uncritically accepted the result that came out of it for use in the material they were posting. What YOU are doing is exactly the reason bad people love "AI". You are failing to account for the human agency of how the tool was used, and laundering the blame onto a tool that can't be held accountable.

@Jon_Kramer @KathleenC @petrosyan @publictorsten Sure, but someone used the AI tool and would have (or should have) reviewed its output. That person should be held accountable. It's one thing to use an AI tool, but you can't just shrug when it does something in your name.

@RenewedRebecca @KathleenC @petrosyan @publictorsten Have YOU looked at the output? When you do, compare it to the input and tell me if you can figure out what is wrong. Remember, you can't refer to the original uncropped photo. Just the cropped photo and the AI-enhanced photo.

@Jon_Kramer @KathleenC @petrosyan @publictorsten Did the person involved have both the cropped photo and the enhanced one? Did the cropped photo have the invented details? Why is this hard for you?

@RenewedRebecca @KathleenC @petrosyan @publictorsten She had the cropped photo. The artist, through the use of A.I., created the 'enhanced' version. These details are not in dispute, or in any way 'hard'.

@Jon_Kramer @KathleenC @petrosyan @publictorsten Here's the thing… She was responsible for the photo. If using AI makes it impossible to verify the details in it, AI shouldn't be used.

@RenewedRebecca @KathleenC @petrosyan @publictorsten I'm guessing you have not seen the photo either.

@Jon_Kramer The photo should not have been uploaded to any AI tool (I seriously doubt it was done on premises) without consent. This is where the wrongdoing started. @KathleenC @petrosyan @publictorsten

@MSugarhill @KathleenC @petrosyan @publictorsten Why not? The photo was publicly posted.

@Jon_Kramer @KathleenC @petrosyan @publictorsten The person who used the tool is responsible for its output. This is how it has always worked. When I was making videos for corporates I wouldn't be able to get away with using random stock footage, much of which is highly sexualised. I couldn't just say "this is what the Shutterstock search returned." That is obviously absurd.

@quinsibell @KathleenC @petrosyan @publictorsten LOOK at the output as a standalone piece, and tell me what is objectionable about it. The phrase "highly sexualized" indicates to me that you have not seen the photo. I think many people commenting on this thread have not seen the photo, and therefore fail to understand what this discussion IS. This leap to vilify a woman who was just attempting to do her job the best way she could is... petty. And I am done with it.

@Jon_Kramer The woman whose face it was objected to it. At the very least, edited photos require approval. Like I said, I couldn't get away with this conduct prior to AI image generators. Someone selected and posted the image. They are responsible.

@quinsibell Great. Now go look at the picture and tell me what is objectionable. Don't get back to me; I'm done with this idiotic conversation about nothing.

@Jon_Kramer It doesn't matter what I consider objectionable. A photo was edited and distributed, in a way the person in the photo objected to, without consultation.
@petrosyan @publictorsten Sexualizing her photo was especially egregious. Honestly, it's essential to fire the individual who did this.