@fl0_id @publictorsten The underlying problem isn't that unusual: Person A receives the photo for the website, crops it to a given format, and either doesn't file the original at all or files it in an impenetrable structure. Person B, who handles social media, never gets to see the original. These are organizational problems you will find in most larger organizations.

@publictorsten Two years ago: "Folks, who put the full Pornhub backup on the drive with the model training data?"

@publictorsten Pretty poor that, in examining the photo, she didn't notice this and think: oops, not like that.

@publictorsten What is so shocking about this is *not* that the genAI did exactly what it is designed to do, namely create the most likely-looking image, but that the person who used it switched off their brain at the sight of the words 'AI'. Before AI, if a social media person had had to do a major edit of a photo which left them deciding 'bra or no bra', they would have asked the sitter. Now that they use genAI to be 'more productive', they have stopped doing their own job properly.

@tomstoneham @publictorsten I am noticing this a lot: GenAI being used as an excuse/accelerator for a massive erosion of professionalism, under the banner of "increased productivity" ("getting things done").

@klmr @tomstoneham @publictorsten That seems to be what it's made for: productivity without responsibility.

@Kraemer_HB @klmr @tomstoneham @publictorsten The purpose of a system is what it does! The Unaccountability Machine is a must-read. https://books.google.com/books/about/The_Unaccountability_Machine.html?id=M4bSEAAAQBAJ

@klmr @tomstoneham @publictorsten This is in keeping with the TechBro mantra, "Move fast and break things," which is how we have quickly gotten to a world full of broken things.
@klmr @tomstoneham @publictorsten I would personally not take for granted that, given a professional shot of a woman as its input, "the most likely-looking image" is one that sexualizes her. There are built-in biases, and the blame does not fall solely at the feet of whoever is in charge of socials. The people who built the machine to do that, and marketed it as a magic tool everyone should use, bear a huge responsibility.

@gallais @publictorsten Good points. 1) I was assuming that the genAI was picking up on images you see everywhere of a style of female business attire we might call 'smart sexy', a distant echo of sex-positive feminism. It is a personal choice to present like that, made by both men and women (e.g. Obama with two shirt buttons undone). 2) I agree absolutely. The purveyors of this crap are trying to make people worse at their jobs in order to make the AI indispensable.

@publictorsten Not only did she not consent to this edit; her photo was also uploaded to an AI tool without her consent. I really don't know which is worse.

@petrosyan @publictorsten Sexualizing her photo like that is beyond the pale. Honestly, the individual who did this should be fired.

@KathleenC @petrosyan @publictorsten The point is that no human did this: the AI tool that the social media person used did it "automatically". It's an example of the harmful biases ("pictures of women should be sexually suggestive") encoded into every one of these damn things, and why none of them are fit for purpose.

@kittylyst @KathleenC @petrosyan @publictorsten The social media person was left on their own: not trained enough, not supervised enough. Fire them, and above all their manager.

@dave_cochran Alright. Give the social media worker a solid warning and better training. Fire their manager.

@dave_cochran "Absolutely no reason to think it would cause harm", in 2024, in a design-oriented communications and events community and business? Alright. Let me elaborate: it was the manager's job to a. have heard and learned from Monteiro, e.g. 11 years ago (see Webstock 2013, iirc), and b. have heard and learned from e.g. the stories and work of Gebru et al. Defending workers is right, hence my leaning towards your point. Managers are paid to know better. It's about time for some accountability in those fields.

@dave_cochran To be honest and fair: if I were in charge, I would also fire producers of indoor conferences that have neither mask mandates nor best-in-class air renewal and filtering infrastructure and practices, for being stupid and having learned nothing. That's to explain how I approach taking care of the people attending events, or producing, volunteering, or speaking there. YMMV.

@dave_cochran Then you cannot fail to see the issue with a woman speaker being sexualized through the automated editing of a picture she provided but never consented to have edited, never being shown the edited version, and nothing internally preventing that edit from a. happening, b. not being thrown away in due time, and c. ever being published. How is that taking good care of your speaker community? How can it not show failure at a high level of core business competencies? How can someone in charge not be fired?

@dave_cochran @shalf @kittylyst @petrosyan @publictorsten Nope. No excuses for creating and posting a SEXUALIZED photo of this individual without permission. No excuses. This isn't "not noticing every single piddly thing IT tools ever do," you straw man you; this is altering the professional photo of a woman to SEXUALIZE IT. Wake up.

@dave_cochran @shalf @kittylyst @petrosyan @publictorsten Yeah, because "hidden metadata" is exactly the same as showing a faked image of what they pretended were her BREASTS.

@kittylyst @KathleenC @petrosyan @publictorsten The change itself is such an example, yes, but the mistake is using the tool in the first place.

@kittylyst @KathleenC @petrosyan @publictorsten A human did do it. A human decided to use the shit "AI" tool, and uncritically accepted the result that came out of it for use in the material they were posting. What YOU are doing is exactly why bad people love "AI". You are failing to account for the human agency in how the tool was used, and laundering the blame onto a tool that can't be held accountable.

@kittylyst @KathleenC @petrosyan @publictorsten A human chose to use the tool. You can't wash away bad decisions by saying "an AI tool did it."

@KathleenC @petrosyan @publictorsten It was done by A.I. There is no one to fire. I guess you could fire a random person who ordered the A.I. to modify the photo to fit the space, or the team that programmed the A.I. I am not sure that would help anything, though.

@Jon_Kramer If you start holding people accountable for using AI and causing harm with it, I guarantee you that will "help".

@Jon_Kramer @KathleenC @petrosyan @publictorsten C&P'ing the exact same reply I gave to someone else with the same bad argument: A human did do it. A human decided to use the shit "AI" tool, and uncritically accepted the result that came out of it for use in the material they were posting. What YOU are doing is exactly why bad people love "AI". You are failing to account for the human agency in how the tool was used, and laundering the blame onto a tool that can't be held accountable.

@Jon_Kramer @KathleenC @petrosyan @publictorsten Sure, but someone used the AI tool and would have (or should have) reviewed its output. That person should be held accountable. It's one thing to use an AI tool, but you can't just shrug when it does something in your name.

@RenewedRebecca @KathleenC @petrosyan @publictorsten Have YOU looked at the output? When you do, compare it to the input and tell me if you can figure out what is wrong. Remember, you can't refer to the original uncropped photo: just the cropped photo and the AI-enhanced photo.
@Jon_Kramer @KathleenC @petrosyan @publictorsten Did the person involved have both the cropped photo and the enhanced one? Did the cropped photo have the invented details? Why is this hard for you?

@RenewedRebecca @KathleenC @petrosyan @publictorsten She had the cropped photo. The artist created the 'enhanced' version using A.I. These details are not in dispute, or in any way 'hard'.

@Jon_Kramer @KathleenC @petrosyan @publictorsten Here's the thing: she was responsible for the photo. If using AI makes it impossible to verify the details in it, AI shouldn't be used.

@RenewedRebecca @KathleenC @petrosyan @publictorsten I'm guessing you have not seen the photo either.

@Jon_Kramer The photo should not have been uploaded to any AI tool (and I seriously doubt it was done on premises) without consent. That is where the wrongdoing started. @KathleenC @petrosyan @publictorsten

@MSugarhill @KathleenC @petrosyan @publictorsten Why not? The photo was publicly posted.

@Jon_Kramer @KathleenC @petrosyan @publictorsten The person who used the tool is responsible for its output. This is how it has always worked. When I was making videos for corporate clients, I couldn't get away with using random stock footage, much of which is highly sexualized. I couldn't just say "this is what the Shutterstock search returned". That would be obviously absurd.

@quinsibell @KathleenC @petrosyan @publictorsten LOOK at the output as a standalone piece and tell me what is objectionable about it. The phrase "highly sexualized" indicates to me that you have not seen the photo. I think many people commenting on this thread have not seen the photo, and therefore fail to understand what this discussion IS. This leap to vilify a woman who was just attempting to do her job the best way she could is... petty. And I am done with it.

@Jon_Kramer The woman whose face it was objected to it. At the very least, edited photos require approval. Like I said, I couldn't get away with this conduct prior to AI image generators. Someone selected and posted the image. They are responsible.

@quinsibell Great. Now go look at the picture and tell me what is objectionable. Don't get back to me; I'm done with this idiotic conversation about nothing.

@Jon_Kramer It doesn't matter what I consider objectionable. A photo was edited and distributed, in a way the person in the photo objected to, without consultation.

@texttheater He's not a childless cat man. And now we know what that can communicate about men.

@Kraemer_HB @texttheater This is a version of the Republicans' recent idea that a woman is useless if she didn't bear her own children. Somehow having children bestows magical qualities on the parents that are unavailable to everyone else. @publictorsten

@Hunko @publictorsten But then you'd have to pay for a good graphics designer.
It is almost assuredly no coincidence that the kind of information that ML model was trained on also happens to reflect the viewpoints of the many toxic people who are (organizational) 'leaders' in this area.

@publictorsten That's... certainly an informative presentation on UX + AI!
@publictorsten @EricLawton What the actual fuck?!? I'll add that one of my proudest moments was when Elizabeth spoke at Fluxible, and did so with her recently-born baby in a carrier strapped to her front. As someone who has organized UX events, it's profoundly disappointing to see that this photo shit happened to such a wonderful person.

@hellomiakoda @publictorsten @Kishi Jesus. The expression in your profile pic really summarizes what should be the general response to this news. 😒
A side note to the story: the original photo actually had the desired framing already. I suspect the image was cropped in one place, and then in another place a different aspect ratio was suddenly required.
That is pointless, and probably just what the predominant future use of AI will look like: creating unnecessary tasks and hoping that nothing goes too badly wrong.