grahamsz

@FeralRobots @NewtonMark You can definitely fine-tune a pre-trained GPT model pretty cost-efficiently. Considering my mid-size company spends 5 figures a year on Slack, I expect they can afford it. Though I suspect if it's any good there will be a hefty upcharge for it.

FeralRobots

@grahamsz @NewtonMark
Right, but
a) is fine-tuning enough? We've been told before that tuning would keep stuff from showing up that nevertheless keeps showing up.
b) how much will that surcharge be?
c) will that surcharge actually end up covering cost? I.e., are Slack setting themselves up for a fall in a couple of years?

grahamsz

@FeralRobots @NewtonMark Fine-tuning will definitely stop stuff coming up in other people's results, because you train an extra set of weights on top of the existing model, and that extra layer can be kept separate for each customer.
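A minimal sketch of what I mean, using the Hugging Face transformers + peft libraries (model name and LoRA settings here are just placeholders, not anything Slack has announced):

    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import LoraConfig, get_peft_model

    # Base model stays frozen; only the small adapter gets new weights.
    base = AutoModelForCausalLM.from_pretrained("gpt2")      # stand-in base model
    tokenizer = AutoTokenizer.from_pretrained("gpt2")

    config = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05,
                        target_modules=["c_attn"])           # gpt2's attention projection
    model = get_peft_model(base, config)
    model.print_trainable_parameters()  # only the adapter weights are trainable

    # ...train on one customer's Slack data, then save just that adapter,
    # so their data never touches the shared base model:
    model.save_pretrained("acme-slack-adapter")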

You also maybe wouldn't even need that: you could use an embedding model to place each conversation into a high-dimensional space, and then when you ask a question of the model it searches all relevant conversations to build a better prompt. For a lot of use cases I think that could work just fine.
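Rough sketch of that embed-and-retrieve idea (not Slack's actual design; the embedding model, the example conversations, and the prompt format are all my assumptions):

    import numpy as np
    from sentence_transformers import SentenceTransformer

    embedder = SentenceTransformer("all-MiniLM-L6-v2")

    # Embed every conversation once, up front.
    conversations = [
        "#ops: the Tuesday deploy failed because of a missing env var",
        "#sales: Q3 pipeline review notes",
        "#eng: we agreed to move the cron jobs to 02:00 UTC",
    ]
    conv_vecs = embedder.encode(conversations, normalize_embeddings=True)

    def build_prompt(question, top_k=2):
        # Embed the question and find the most similar conversations.
        q_vec = embedder.encode([question], normalize_embeddings=True)[0]
        scores = conv_vecs @ q_vec               # cosine similarity (vectors are normalized)
        best = np.argsort(scores)[::-1][:top_k]
        context = "\n".join(conversations[i] for i in best)
        # Stuff the retrieved conversations into the prompt for the LLM.
        return f"Using these Slack conversations:\n{context}\n\nAnswer: {question}"

    print(build_prompt("When do the cron jobs run now?"))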
