shrimp eating mammal 🦐

@jonny OK but you don't know that till after you've observed it a while, and I guess what I'm wondering is where one would get the idea in the first place? From someone else? From the internet? There's a stunningly wide gap in understanding if a person thinks ChatGPT is evaluating code, both of ChatGPT and of coding! How does that happen to a person? I'm fully failing to understand 😭

jonny (good kind)

@walruslifestyle
Their mental model is "I can talk to this thing; when I give it some code, it knows what that code is and can tell me about it in the same way it seems to tell me about lots of things." They are not so naïve, in my opinion, because products like Copilot do advertise themselves as understanding code, so thinking the LLM is actually parsing and reasoning about it, rather than generating plausible text from some seed vector in its latent space, is reasonable enough to me.

jonny (good kind)

@walruslifestyle
I don't disagree; if you know a little bit about how these things work, it's ridiculous, but he is just following everything he's been told about what they can do!

Bornach

@jonny @walruslifestyle
Especially given the flood of YouTube videos demonstrating ChatGPT solving coding problems in minutes
youtu.be/jwpja9fcqaM

The implied assumption is that humans, with their very limited ability to memorize answers, would have to understand code in order to arrive at the correct answer. We apply that same assumption to LLMs at our own peril. Surely it couldn't have simply memorized all the answers and be applying pattern matching to generate them, right?

Captain Janegay πŸ«–

@jonny @walruslifestyle Right, the ultimate problem here is one of misleading marketing. LLMs actually have a bunch of really useful applications, but those applications, I guess, are not the ones that the companies developing them thought would sell.
