@stux I mean, if it was a local model that didn't need to send anything to the cloud, maybe, but we're not talking about those.
@qkslvrwolf @stux why not? They already exist, especially for PC. Making 1B or 2B LLMs reliable for tools that can come in handy on mobile is not that far off.