Simon Willison

@glyph @matt @forrestbrazeal I expect there's going to be a substantial AI crash, but I don't think (most of) the tools I'm using right now will become unavailable to me - especially since I can run Llama 70B on my own laptop now

Matt Campbell

@simon About running models locally, I've experimented with that, but large context windows take up lots of RAM, right? Like, isn't it O(n^2) where n is the number of tokens? Or do you not depend on large context windows?
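For a standard decoder-only transformer, the memory for the KV cache actually grows linearly (O(n)) with the number of tokens; it is the naive attention *compute* that is O(n^2). A rough back-of-the-envelope sketch, using assumed dimensions loosely modeled on a 70B-class model with grouped-query attention (80 layers, 8 KV heads, head dim 128, fp16) rather than any real model's published specs:

```python
# Rough KV-cache memory estimate for a decoder-only transformer.
# All dimensions below are illustrative assumptions, not real model specs.

def kv_cache_bytes(n_tokens, n_layers=80, n_kv_heads=8, head_dim=128,
                   bytes_per_value=2):
    """Memory for cached keys and values: grows linearly (O(n)) in tokens.
    The quadratic O(n^2) cost is in attention compute, not the cache."""
    # 2 tensors (K and V) per layer, each n_tokens x n_kv_heads x head_dim
    return 2 * n_layers * n_tokens * n_kv_heads * head_dim * bytes_per_value

# Example: an 8k-token context under these assumed dimensions
gb = kv_cache_bytes(8192) / 2**30
print(f"{gb:.1f} GiB")  # → 2.5 GiB
```

So a long context does cost real RAM on top of the weights themselves, but it scales linearly, and quantizing the cache or using fewer KV heads (as grouped-query attention does) shrinks it further.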

Simon Willison

@matt @glyph the models I can run on my laptop today are leagues ahead of the models I ran on the exact same hardware a year ago - improvements in that space have been significant

This new trick from Microsoft looks like it could be a huge leap forward too - I've not dug into it properly yet though github.com/microsoft/BitNet
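The core idea behind BitNet b1.58 is constraining weights to the ternary set {-1, 0, +1} (about 1.58 bits per weight, log2 of 3 states) with a per-tensor scale. A toy sketch of the "absmean" quantization described in the paper, purely illustrative and not Microsoft's actual implementation:

```python
# Toy sketch of "absmean" ternary weight quantization (BitNet b1.58 style).
# Illustrative only - not the actual microsoft/BitNet implementation.

def absmean_quantize(weights):
    # Per-tensor scale: mean absolute value of the weights
    scale = sum(abs(w) for w in weights) / len(weights)
    # Round each scaled weight to the nearest of {-1, 0, +1}
    ternary = [max(-1, min(1, round(w / scale))) for w in weights]
    return ternary, scale

def dequantize(ternary, scale):
    # Each ternary weight needs ~1.58 bits instead of 16, so matrix
    # multiplies reduce to additions/subtractions plus one scale.
    return [t * scale for t in ternary]

w = [0.9, -0.05, -1.2, 0.4]
q, s = absmean_quantize(w)
print(q)  # → [1, 0, -1, 1]
```

The big win is that multiplications by {-1, 0, +1} become additions and subtractions, which is why it promises fast CPU inference.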
