凉拌茶叶

@yogthos Any sources that ChatGPT actually has 175B parameters? The InstructGPT paper shows that it gets better results with only 1B parameters. Also, they published the model as a service; I can't imagine how they could run the service with 175B parameters.

arxiv.org/abs/2203.02155
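The "how could they serve 175B parameters" doubt can be made concrete with a back-of-the-envelope memory estimate. This is a rough sketch, not OpenAI's actual serving setup; the parameter counts and byte widths are illustrative assumptions:

```python
# Rough estimate: memory needed just to hold model weights,
# ignoring activations, KV caches, and optimizer state.
def weight_memory_gb(n_params: float, bytes_per_param: int) -> float:
    """Return the weight storage in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bytes_per_param / 1e9

# Assumed sizes from the thread: 175B (GPT-3 scale) vs ~1B (small InstructGPT).
print(weight_memory_gb(175e9, 2))  # fp16 weights: 350.0 GB
print(weight_memory_gb(1e9, 2))    # fp16 weights: 2.0 GB
```

At fp16, a 175B-parameter model needs ~350 GB just for weights, so serving it requires sharding across several accelerators, while a ~1B model fits comfortably on a single GPU. That asymmetry is what makes the question in the thread a reasonable one.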

2 comments
Yogthos

@leo_song I took the title from the Lemmy post, but haven't actually followed up to investigate lemmy.ml/post/747098

Yogthos

@leo_song it could be that 175B parameters are used during training but not at inference time
