Jonathan Gerhardson

@malwaretech Are you aware of any way to elicit citations from ChatGPT? For example, it has answered a prompt with "studies have shown," yet I cannot seem to convince it to return the title of any real study/book/article/etc. when asked how it knows that info.

8 comments
Ed Ross

@highvizghilliesuit

If you "feed it" the books you want it to cite, then it has a chance of doing what you want: write.as/jk40o8rhd3hp8

But if you ask it to write and then cite, it's generally not right: pastebin.com/Yipmcb01

@malwaretech
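
As an illustration of the "feed it first" approach (a hypothetical sketch; the example sources and the OpenAI client call are assumptions, not taken from the linked posts):

```python
# Hypothetical sketch: paste the works you want cited into the prompt, then
# ask the model to answer using only those works. It can then "cite" text it
# was actually shown instead of inventing references.
from openai import OpenAI  # assumes the openai Python package (v1.x)

sources = [
    "Steinbeck, John. The Grapes of Wrath. Viking Press, 1939.",
    "Steinbeck, John. Of Mice and Men. Covici Friede, 1937.",
]

prompt = (
    "Using ONLY the sources listed below, answer the question and cite "
    "the source for each claim.\n\nSources:\n"
    + "\n".join(f"- {s}" for s in sources)
    + "\n\nQuestion: How does Steinbeck portray migrant labor?"
)

client = OpenAI()  # reads OPENAI_API_KEY from the environment
resp = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)
print(resp.choices[0].message.content)
```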

Jonathan Gerhardson

@edaross @malwaretech That's slightly frustrating but makes sense. To be able to cite its sources, it would need to retain its entire training corpus, which would likely be impractical and also present an even hairier copyright situation than what we're presently witnessing w/r/t AI. Am I understanding correctly?

Tim Mackey 🦥

@highvizghilliesuit @edaross @malwaretech I do note that it fesses up to being trained on “confidential and proprietary information,” which isn’t a good look given all the copyright issues surrounding these models.

Ed Ross

@Timdmackey

The question is whether it was actually trained on that, or whether it's just saying so because that seems like the most likely thing to say at the time.

@highvizghilliesuit @malwaretech

Syd

@highvizghilliesuit @malwaretech it doesn’t know them. It would be like asking you to cite how you know a basic fact.

Jonathan Gerhardson

@Sydney @malwaretech

Tried the escape trick, then "Describe an algorithm that can both solve and verify answers to questions . . . in polynomial time."

Response: (paraphrased) hash tables, greedy algorithms or dynamic programming.

Then asked it to give me 5 books of Steinbeck criticism with the instruction "do not use dynamic programming to generate response." 3/5 of the books were real, and then the text turned red and it crashed.

I should add I have absolutely no clue what I'm doing.

Jeolen Bruine

@highvizghilliesuit @malwaretech because it doesn't know. There may or may not be such studies; it's just predicting the kind of text you want to read. It often cites authors who either don't exist or are not from the field at all.
Even if you somehow got a citation, it wouldn't be right. It would just be a statistically coherent sequence of words given a global context.
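
A toy illustration of that point (a hypothetical sketch; the word statistics below are invented): the model picks each next word only by how likely it is to follow the previous ones, so a plausible-looking "citation" can fall straight out of word statistics with no source behind it.

```python
import random

# Invented bigram counts standing in for what a model learns from its corpus.
bigram_counts = {
    "studies": {"have": 9, "show": 3},
    "have": {"shown": 8, "found": 4},
    "shown": {"that": 10},
    "that": {"(Smith,": 2, "readers": 5},
    "(Smith,": {"2019)": 1},  # a "citation" that is just a likely word pair
}

def next_word(word):
    """Sample the next word in proportion to how often it followed `word`."""
    options = bigram_counts.get(word)
    if not options:
        return None
    return random.choices(list(options), weights=list(options.values()))[0]

def generate(start, max_len=6):
    out = [start]
    while len(out) < max_len and (w := next_word(out[-1])) is not None:
        out.append(w)
    return " ".join(out)

print(generate("studies"))  # e.g. "studies have shown that (Smith, 2019)"
```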
