VessOnSecurity

@gerrymcgovern The article confuses two very different things - the process of training an LLM, which is computationally intensive and uses a lot of energy but is done once, and the process of using the trained LLM, which isn't.

thestrangelet :fedora:

@bontchev @gerrymcgovern No, it doesn't; it calls out both, but it doesn't confuse the two.

Corporation 9592

@bontchev @gerrymcgovern

No one trains their LLM once.
Unless they want it to be obsolete in a few months.
