Simon Willison

@matt @forrestbrazeal oh I absolutely went through these 5 stages of grief, but I've been firmly in the 5th step for over a year at this point

When ChatGPT first came out, I think a lot of us went through about 5 stages of grief:

1. Shock (how is a language model this good? what the actual ***?)

2. Denial (it'll never be as good at writing code / words / etc as me. look at these hallucinations! silly model)

3. Anger (how dare they train their models on my creative work?)

4. Depression (what's left for me? Who even am I?)

5. Acceptance (this is the new world, AI isn't going away, let's figure out what that means for us)
Matt Campbell

@simon @forrestbrazeal I don't think I ever actually felt the anger. And I feel kind of guilty about that, because of course some people have decided to stop there and fight against AI, and they have such moral certainty about it.

Simon Willison

@matt @forrestbrazeal I've not felt the anger personally, because I've been releasing open source code for 20+ years so I already default to "I want people to be able to reuse my work as much as possible" - but I absolutely understand the people who ARE angry

If I were an artist and someone trained Stable Diffusion on my work without my permission and then started competing with me for commissions in my own personal style, I think I'd feel very differently about all this

Glyph

@simon @matt @forrestbrazeal I don’t feel like I am stuck in step 4, but I can’t get to step 5 because I do very much believe that (at least in their current incarnation) AI is very much going away. I am worried about the collateral damage that this particular dead tree is going to cause when it smashes into the rest of the ecosystem. E.g.: nvidia currently has a market cap of 11% of GDP, which I do not think is sustainable. But who on earth is going to get a positive ROI on ChatGPT at $20/mo?

Simon Willison

@glyph @matt @forrestbrazeal I expect there's going to be a substantial AI crash, but I don't think (most of) the tools I'm using right now will become unavailable to me - especially since I can run Llama 70B on my own laptop now

Matt Campbell

@simon About running models locally, I've experimented with that, but large context windows take up lots of RAM, right? Like, isn't it O(n^2) where n is the number of tokens? Or do you not depend on large context windows?

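(As context on the question above: in a standard transformer, it's the attention *compute* that scales quadratically with context length; the KV-cache *memory* grows linearly with the number of tokens, so it can be estimated directly. A minimal sketch, using illustrative layer/head counts for a hypothetical 70B-class model with grouped-query attention — these numbers are assumptions for illustration, not taken from the thread:)

```python
def kv_cache_bytes(n_tokens, n_layers=80, n_kv_heads=8, head_dim=128, bytes_per_val=2):
    """Estimate transformer KV-cache memory.

    Per layer, the cache holds 2 tensors (K and V), each of shape
    (n_tokens, n_kv_heads, head_dim), stored here at fp16 (2 bytes/value).
    Memory is linear in n_tokens; only attention compute is O(n^2).
    """
    return 2 * n_layers * n_tokens * n_kv_heads * head_dim * bytes_per_val

# An 8k-token context under these illustrative settings:
print(kv_cache_bytes(8192) / 2**30, "GiB")  # 2.5 GiB
```

(Doubling the context doubles this figure, which is why long contexts are felt mostly as RAM pressure rather than as a quadratic blow-up in memory.)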
Simon Willison

@matt @glyph the models I can run on my laptop today are leagues ahead of the models I ran on the exact same hardware a year ago - improvements in that space have been significant

This new trick from Microsoft looks like it could be a huge leap forward too - I've not dug into it properly yet though github.com/microsoft/BitNet
