@matt @forrestbrazeal oh hot damn I've been working on a blog entry which tries to say what's in this song but in 10x more words and 1/10th as good https://www.youtube.com/watch?v=hrfEUZ0UvRo
@matt @forrestbrazeal oh I absolutely went through these 5 stages of grief, but I've been firmly in the 5th step for over a year at this point

@simon @forrestbrazeal I don't think I ever actually felt the anger. And I feel kind of guilty about that, because of course some people have decided to stop there and fight against AI, and they have such moral certainty about it.

@matt @forrestbrazeal I've not felt the anger personally, because I've been releasing open source code for 20+ years, so I already default to "I want people to be able to reuse my work as much as possible" - but I absolutely understand the people who ARE angry

If I were an artist and someone trained Stable Diffusion on my work without my permission and then started competing with me for commissions in my own personal style, I think I'd feel very differently about all this

@simon @matt @forrestbrazeal I don't feel like I'm stuck in step 4, but I can't get to step 5, because I very much believe that (at least in its current incarnation) AI is going away. I'm worried about the collateral damage this particular dead tree is going to cause when it smashes into the rest of the ecosystem. E.g. Nvidia currently has a market cap of 11% of GDP, which I do not think is sustainable. And who on earth is going to get a positive ROI on ChatGPT at $20/mo?

@glyph @matt @forrestbrazeal I expect there's going to be a substantial AI crash, but I don't think (most of) the tools I'm using right now will become unavailable to me - especially since I can run Llama 70B on my own laptop now

@simon About running models locally: I've experimented with that, but large context windows take up lots of RAM, right? Isn't it O(n^2) where n is the number of tokens? Or do you not depend on large context windows?
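(On the O(n^2) question: attention *compute* does scale quadratically with context length, but the KV cache that dominates inference RAM grows linearly, since each layer stores one key and one value vector per token per KV head. A back-of-envelope sketch, using an assumed Llama-70B-like configuration with grouped-query attention - the exact figures are illustrative, not measured:)

```python
# Back-of-envelope KV-cache memory estimate for transformer inference.
# Attention compute is O(n^2) in context length n, but KV-cache RAM is
# O(n): per token, each layer stores one key and one value vector per
# KV head. Config numbers below are illustrative assumptions.

def kv_cache_bytes(n_tokens, n_layers, n_kv_heads, head_dim, bytes_per_elem=2):
    # factor of 2 covers keys + values; fp16 = 2 bytes per element
    return 2 * n_tokens * n_layers * n_kv_heads * head_dim * bytes_per_elem

# Hypothetical 70B-class config: 80 layers, 8 KV heads (grouped-query
# attention), head dimension 128, fp16 cache.
gib = kv_cache_bytes(8192, 80, 8, 128) / 2**30
print(f"KV cache for an 8k context: {gib:.1f} GiB")  # linear, not quadratic
```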
@matt @glyph the models I can run on my laptop today are leagues ahead of the models I ran on the exact same hardware a year ago - improvements in that space have been significant

This new trick from Microsoft looks like it could be a huge leap forward too - I've not dug into it properly yet though: https://github.com/microsoft/BitNet
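(The BitNet idea is ternary, roughly "1.58-bit" weights instead of 16-bit floats, which is where the laptop-scale memory savings come from. A rough weight-memory comparison - these numbers are my own illustrative arithmetic, not benchmarks of the linked repo:)

```python
# Rough weight-memory comparison for low-bit quantization, the idea
# behind BitNet-style ternary models. log2(3) ~ 1.58 bits encodes a
# weight taking one of three values {-1, 0, +1}. Illustrative only.
import math

def weight_gib(n_params, bits_per_weight):
    return n_params * bits_per_weight / 8 / 2**30

params_70b = 70e9
print(f"fp16:    {weight_gib(params_70b, 16):.0f} GiB")
print(f"ternary: {weight_gib(params_70b, math.log2(3)):.0f} GiB")
```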
@simon You sure you linked to the right song? Have you gone through the kind of existential crisis portrayed in that song? I always saw you as cautiously optimistic about AI - seeing it as a useful tool if used well, not something that will make us redundant.