Glyph

@simon @matt @forrestbrazeal I don’t feel like I am stuck in step 4, but I can’t get to step 5 because I do very much believe that (at least in their current incarnation) AI is very much going away. I am worried about the collateral damage that this particular dead tree is going to cause when it smashes into the rest of the ecosystem. E.g.: nvidia currently has a market cap of 11% of GDP, which I do not think is sustainable. But who on earth is going to get a positive ROI on ChatGPT at $20/mo?

Simon Willison

@glyph @matt @forrestbrazeal I expect there's going to be a substantial AI crash, but I don't think (most of) the tools I'm using right now will become unavailable to me - especially since I can run Llama 70B on my own laptop now

Matt Campbell

@simon @glyph About running models locally, I've experimented with that, but large context windows take up lots of RAM, right? Like, isn't it O(n^2) where n is the number of tokens? Or do you not depend on large context windows?
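
A quick way to see why the memory question matters: the KV cache a decoder keeps grows linearly with the number of tokens, while it is the attention score matrix that scales as O(n^2) (and modern kernels usually avoid materializing it in full). A rough sketch, assuming an approximate Llama-2-70B-style configuration (80 layers, 8 KV heads under grouped-query attention, head dim 128, fp16); these numbers are illustrative and not taken from the thread:

```python
# Back-of-envelope KV-cache memory for a decoder-only transformer.
# Assumed, approximate Llama-2-70B-style configuration (illustrative only):
layers = 80         # transformer blocks
kv_heads = 8        # grouped-query attention: far fewer KV heads than query heads
head_dim = 128      # dimension per head
bytes_per_elem = 2  # fp16/bf16

def kv_cache_bytes(num_tokens: int) -> int:
    # 2x for keys and values, cached once per layer per token
    return 2 * layers * kv_heads * head_dim * bytes_per_elem * num_tokens

for n in (2_048, 8_192, 32_768):
    print(f"{n:>6} tokens -> {kv_cache_bytes(n) / 2**30:.1f} GiB of KV cache")
```

Under those assumptions the cache costs roughly 0.3 MiB per token, so even a 32k context is on the order of 10 GiB on top of the weights, growing linearly rather than quadratically.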

Simon Willison

@matt @glyph the models I can run on my laptop today are leagues ahead of the models I ran on the exact same hardware a year ago - improvements in that space have been significant

This new trick from Microsoft looks like it could be a huge leap forward too - I've not dug into it properly yet though: github.com/microsoft/BitNet
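
For scale, the appeal of low-bit schemes like the ternary (roughly 1.58-bit) weights that BitNet is associated with is mostly about weight memory. A rough sketch of how a 70B-parameter model's weight footprint shrinks with precision; the arithmetic below is illustrative only and not taken from the thread or the repo:

```python
# Rough weight-memory estimate for a 70B-parameter model at different precisions.
# Ignores the KV cache, activations, and runtime overhead; purely illustrative.
params = 70e9

for label, bits in [("fp16", 16), ("8-bit", 8), ("4-bit", 4), ("ternary ~1.58-bit", 1.58)]:
    gib = params * bits / 8 / 2**30
    print(f"{label:>18}: ~{gib:,.0f} GiB just for the weights")
```

That is the difference between roughly 130 GiB at fp16 and something in the low tens of GiB at very low bit widths, which is what makes running a 70B-class model on laptop hardware plausible at all.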
