mcc

2008, me: I love the idea of cryptocurrency

BITCOIN: The word "cryptocurrency" now means "financial scams based on inefficient write-only ledgers"

2018, me: I love the idea of the metaverse

FACEBOOK: The word "metaverse" now means "proprietary 3D chat programs with no soul"

2022, me: I love the idea of procedurally generated content

OPENAI: From now on people will associate that only with big corporations plagiarizing small artists and turning their work into ugly content slurry

Leonora

@mcc I love the idea of the fediverse

datarama

@mcc I feel like an asshole when I say I enjoy (and used to make) "generative art" now.

mcc

RONALD LACEY: Again we see, Ms. McClure, there is nothing you can possess which I cannot take away.

mcc

I'm really concerned about the effect "generative AI" is going to have on the attempt to build a copyleft/commons.

As artists/coders, we saw that copyright constrains us. So we decided to make a fenced-off area where we could make copyright work for us in a limited way, with permissions for derivative works within the commons according to clear rules set out in licenses.

Now OpenAI has made a world where rules and licenses don't apply to any company with a valuation over $N billion.

mcc

(The exact value of "N" is not known yet; I assume it will be solidly fixed by some upcoming court case.)

mcc

In a world where copyleft licenses turn out to restrict only the small actors they were meant to empower, and don't apply to big bad-actor "AI" companies, what is the incentive to put your work out under a license that will only serve to make it a target for "AI" scraping?

With NFTs, we saw people taking their work private because putting something behind a clickwall/paywall was the only way to keep it from being stolen for NFTs. I assume the same process will accelerate in an "AI" world.

pinkdrunkenelephants

@mcc They should just make a license that explicitly bans AI usage then.

mcc

@pinkdrunkenelephants The licenses already bar this because they govern derivative works. But if they can make a derivative work non-derivative just by calling it "AI", then adding a nonsense clause banning "AI" accomplishes nothing: the AI companies can simply rename "AI" to "floopleflorp" and say "Ah, but your license only bans 'AI', not 'floopleflorp'!"

pinkdrunkenelephants

@mcc They can rename it Clancy for all it matters. AI is still AI and actions don't just lose meaning because of evil people playing with language.

mcc

@pinkdrunkenelephants But AI is not AI. The things that they're calling "AI" are just some machine learning statistical models. Ten years ago this wouldn't have been considered "AI".

pinkdrunkenelephants

@mcc Doesn't matter, what matters is the definition behind the word. That is what licenses ought to ban outright.

It's like saying rape is perfectly legal so long as we call it forced sex. Who would believe that, who wasn't already predisposed to rape?

Don't fall for other people's manipulative mind games.

mcc

@pinkdrunkenelephants The definition behind the law is, again, decided by humans, who are capable of inconsistency or poor decisions. Rape is legal in New York because rape there is legally defined by the use of certain specific genitals. See E. Jean Carroll v. Donald J. Trump.

pinkdrunkenelephants replied to mcc

@mcc And no one accepts that because of what I'm saying. A rose by any other name would smell as sweet. People need to start recognizing that fact. That's the only way things will change.

mcc replied to pinkdrunkenelephants

@pinkdrunkenelephants Well, per my belief as to the meaning of words, ML statistical models are derivative works like any other, and my licenses which place restrictions on derivative works already apply to the ML statistical models

datarama

@pinkdrunkenelephants @mcc That doesn't work if copyright *itself* doesn't apply to AI training, which is what all those court cases are about. Licenses start from the assumption that the copyright holder reserves all rights, and then the license explicitly waives some of those rights under a set of given conditions.

But with AI, it's up in the air whether a copyright holder has any rights at all.

pinkdrunkenelephants

@datarama @mcc I don't see how it would be up in the air. Humans feed that data into AI and use the churned remains so it's still a human violating the copyright.

mcc

@pinkdrunkenelephants @datarama Because humans also are the ones who interpret and enforce laws, and if the government does not enforce copyright against companies which market their products as "AI", then copyright does not apply to those companies.

pinkdrunkenelephants

@mcc @datarama I guess that's more of a bribery problem than a legal precedent one, then.

datarama

@pinkdrunkenelephants @mcc In the EU, there actually is some legislation. Copyright explicitly *doesn't* protect works from being used in machine learning for academic research, but ML training for commercial products must respect a "machine-readable opt-out".

But that's easy enough to get around. That's why e.g. Stability funded an "independent research lab" that did the actual data gathering for them.

mcc replied to datarama

@datarama I consider this illegitimate and fundamentally unfair because I have already released large amounts of work under Creative Commons/open source licenses. I can't retroactively add terms to some of them just because the plain language somehow no longer applies. If I add such opt-outs now, it would be like I'm admitting the licenses previously didn't apply to statistics-based derivative works.

datarama replied to mcc

@mcc I consider it illegitimate and fundamentally unfair because it's opt-out.

pinkdrunkenelephants replied to datarama

@datarama @mcc I wonder why it is people don't just revolt and destroy their servers then. Or drag them into jail.

Why do people delude themselves into accepting atrocities?

datarama replied to pinkdrunkenelephants

@pinkdrunkenelephants @mcc I think if there was a simple clear-cut answer to that, the world would be a *very* different place.

datarama

@mcc There is no such incentive. There is a very, very strong incentive (namely, not wanting to empower the worst scumbags in tech) to *not* share your work publicly anymore.

This, to me, is the most harmful effect so far of generative AI.

Graham Spookyland🎃/Polynomial

@mcc it's kinda gross that the only (current) way to meaningfully and tangibly refuse to be exploited by the mass commercialised theft of the commons is to, well, commercialise the commons.

Graham Spookyland🎃/Polynomial

@mcc although if there's an angstrom-thick silver lining to this whole thing, it's that it has proved incontrovertibly that copyright law was only ever intended to be used as a cudgel by the wealthy and powerful, and never to protect the rights of the individual artist.

Hugo Mills

@gsuberland @mcc The artists occasionally tried using the cudgel, but the opponents brought an AK47 to the courtroom...

Markus Hofer

@mcc maybe we've all been wrong about NFTs and it's the future after all? 😉

margot

@mcc im wondering if the broader art and design worlds will end up in a similar situation to where industries like fashion and jewelry already are, where plagiarism is essentially an expectation for the designers working there

margot

@mcc i guess this is less a solution or an endgame than a window into an area where copyright has been hurting small creators while being completely flouted by others worth multi-billions

past oral no mad

@mcc Literally zero. I have a thing I've been hacking on for a while, niche shit, probably not interesting to many others. I was planning on releasing it, but once I realized it'd probably have 0-1 other human users, but end up in every LLM training set, I decided not to.

JP

@mcc "the legal system is ultimately a weapon wielded by those with more capital against those with less" is of course the punchline after every movement that has tried to use legal mechanisms like licenses to enact social change. it'd be nice if there were some deep pan-institutional awareness of and correction for this.

mcc

Did you see this? The whole thing with "the stack".

post.lurk.org/@emenel/11211101

Some jerks did mass scraping of open source projects, putting them in a collection called "the stack" which they specifically recommend other people use as machine learning sources. If you look at their "Github opt-out repository" you'll find just page after page of people asking to have their stuff removed:

github.com/bigcode-project/opt

(1/2)

mcc

…but wait! If you look at what they actually did (correct me if I'm wrong), they aren't actually doing any machine learning in the "stack" repo itself. The "stack" just collects zillions of repos in one place. Mirroring my content as part of a corpus of open source software, torrenting it, putting it on microfilm in a seedbank is the kind of thing I want to encourage. The problem becomes that they then *suggest* people create derivative works of those repos in contravention of the license. (2/2)

mcc

So… what is happening here? All these people are opting out of having their content recorded as part of a corpus of open source code. And I'll probably do the same, because "The Stack" is falsely implying people have permission to use it for ML training. But this means "The Stack" has put a knife in the heart of publicly archiving open source code at all. Future attempts to preserve OSS code will, if they base themselves on "the stack", not have any of those opted-out repositories to draw from.

mcc

Like, heck, how am I *supposed* to rely on my code getting preserved after I lose interest, I die, BitBucket deletes every bit of Mercurial-hosted content it ever hosted, etc? Am I supposed to rely on *Microsoft* to responsibly preserve my work? Holy crud no.

We *want* people to want their code widely mirrored and distributed. That was the reason for the licenses. That was the social contract. But if machine learning means the social contract is dead, why would people want their code mirrored?

Aedius Filmania ⚙️🎮🖊️

@mcc

Please don't opt out all your repositories; leave the ones that didn't work or didn't compile or are full of security holes.

josh

@mcc i feel like we need llm opt out considerations in foss licenses tbh, then host code off github and nothing changes? Hard to enforce idk unlikely politicians will get it right, maybe the ftc will get lucky?

datarama

@mcc That's also basically how LAION made the dataset for Stable Diffusion. They collected a bunch of links to images with descriptive alt-text.

(Are you taking time to write good alt-text because you respect disabled people? Congratulations, your good work is being exploited by the worst assholes in tech. Silicon Valley never lets a good deed go unpunished.)

Morten Hilker-Skaaning

@mcc you'd think big companies need both copyright and IP ownership. Otherwise they'd just keep stealing each other's work once the free content was exhausted...

Carlos Solís

@mcc I've already seen the general public considering the concepts of copyleft and open-source as "failed", and clamoring to return to a world where every commit and transfer of source code is approved by, and paid to, the original author.

JimmyChezPants

@mcc

*that should be Belloq, not whatever that guy's name is.

#PopCultureReplyGuy

margot

@mcc we have GOTTA track down whichever wizard cursed you

Boba Yaga

@mcc this monkey's paw is bound to run out of fingers eventually

datarama

@bobayaga @mcc Have you seen how diffusion models draw fingers?

Emily S

@bobayaga @mcc

* A finger curls on the paw and splits open like a fractal. Why are there so many fingers? Why do I keep looking at it? Oh goddess why?!

lachlan but spooky

@mcc Sodding tech companies ruining everything and painting it a slightly off-green shade of homogenised beige

Nazo

@mcc The sickening thing is, each of these does have potential good in them. But naturally, greedy people twisted them to the most evil possible course.

Cryptocurrency could have created an independent system worldwide with less volatility.

Metaverse could have meant transcending current limitations. But FB exists.

LLMs could have allowed neurodivergent people such as myself who cannot create to finally be able to do so. Now we can't do anything with it without being hated.

Susanna

@mcc Do you think you could love the idea of Elon Musk?

Scott Michaud

@mcc At some point you'll stop loving things, and then we'll have nice things.

And the cycle will continue.

Negative12DollarBill

@mcc

2010 ME: I love the idea of electric cars!

Carlos Solís

@negative12dollarbill @mcc The very same dude that brought you "the platform formerly known as Twitter": yeah, let's make cars as locked down and extortive as legally possible.

Glitch

@mcc Ugly content slurry, oh my gods that phrasing is perfect.

Lot⁴⁹

@mcc "Content slurry" is [chef's 💋]

Jacqueline Merritt

@mcc at least we still have Minecraft and No Man's Sky as popular examples of the last one, so with any luck, once the ML scam dies, people will forget about it and only remember it being used for good

Eckes :mastodon:

@mcc you really need to stop liking things and ruining it for everybody

Alex Feinman

@mcc clearly the problem is you loving cool things???

Wyatt (🏳️‍⚧️♀?)

@mcc Again, commercialism ruins everything it touches tech-wise

disky

@mcc Snow Crash was not an instruction manual.

Ike

@mcc Yes. This is me. I love the underlying brilliance of so many of these items but it seems like every technology exploits something. Whether the environment, natural resources, people's attention, or whatever - there is no free lunch. No leverage without externalities.

Even the most idealistic form of energy capture, the Dyson sphere, would be ethically dubious to construct.

The1goit #RedInstead

@mcc honestly AI is the most popular and fastest-selling snake oil yet.

AI used to be a legitimate piece of tech used for telling NPCs where the player was and how to walk around an environment. Now it's just snake oil.
Generative art used to be art. Now it's snake oil.
Talking to a support agent used to be talking to a guy on tech support. Now it's talking to snake oil sold by the richest people in the world. Worst part is, it actually sells, but just like snakes, it'll eventually bite them.

Justin Pot

@mcc And then the companies and people who cynically exploit our belief in the future to promote these scams have the audacity to tell us that we’re pessimistic about technology. We’re not! We love tech! We hate the bullshit, the scams.

HyperSoop :spinny_cat_aroace: :spinny_fox_agender:

@mcc
18XX, Karl Marx: I love the idea of communism

USSR: From now on people will associate that only with murderous totalitarian regimes
