@vees I'm not at all doubting that it gets access to it.
I'm just curious about how and when it gets selected, and speculating about whether they might selectively insert bits depending on the request vs. always include it.
That doesn't change the important fact that it's clearly capable of leaking data, and that MS is clearly, one way or another, making it give wrong answers about what it has access to.
@vidar Could be something like that: https://platform.openai.com/docs/guides/function-calling
The user location is not in the normal chat context, but when you ask for the weather it calls out to a function which knows your location the same way search engines do (e.g. by IP geolocation; no location permission in the browser needed). The LLM then rephrases the function result into an answer which includes the location, roughly as in the sketch below.
It seems they have tried to make it forget this information again by suggesting it's hypothetical...
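Roughly what I mean, as a minimal sketch against the OpenAI function-calling API from that link. The tool name, the geolocation/weather helpers, and the model name are purely my own illustrative assumptions, not anything known about what MS actually runs:

```python
# Sketch of a tool-calling flow where the location never sits in the chat
# context up front, but gets injected via the tool result. Helpers and model
# name are hypothetical stand-ins.
from openai import OpenAI

client = OpenAI()

def lookup_city_from_ip(ip: str) -> str:
    # Stand-in for server-side IP geolocation (no browser permission involved).
    return "Oslo"

def fetch_weather_for(city: str) -> str:
    # Stand-in for a real weather API call.
    return "4 degC, light rain"

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Current weather at the user's location",
        "parameters": {"type": "object", "properties": {}},  # model passes no location
    },
}]

messages = [{"role": "user", "content": "What's the weather like?"}]
resp = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
call = resp.choices[0].message.tool_calls[0]

if call.function.name == "get_weather":
    # The backend resolves the location itself, e.g. from the request IP;
    # the model never saw it in the conversation so far.
    city = lookup_city_from_ip("203.0.113.7")
    messages.append(resp.choices[0].message)
    messages.append({"role": "tool", "tool_call_id": call.id,
                     "content": f"{fetch_weather_for(city)} in {city}"})
    # The model now rephrases the tool result, city included, into its answer,
    # which is how the location ends up in the reply.
    final = client.chat.completions.create(model="gpt-4o", messages=messages, tools=tools)
    print(final.choices[0].message.content)
```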