@simon I always understood that the one driving the keyboard wasn't the one doing the coding: the coder had to verbalise what they wanted, and not having to type freed them to think more.
Which seems the other way around from how an LLM works right now. But it does make me wonder whether a fast enough speech processor and enough IDE context could allow you to lead an LLM like that...
@alexhudson I use LLMs like that quite often: I’ll describe the code I want written through typing or via voice (I write quite a lot of code while out walking the dog, speaking to ChatGPT through an AirPod) and the LLM churns out Python or JavaScript for me
Here’s a real example from a walk a few months ago: https://chat.openai.com/share/77996768-66ed-474a-8e33-c7ddcc4c18ff
More notes on that here: https://simonwillison.net/2024/Mar/23/building-c-extensions-for-sqlite-with-chatgpt-code-interpreter/#bonus-haversine
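For a sense of what that walk produced: the linked post ends with a haversine distance function. A minimal sketch of that kind of function in Python (the names, docstring, and Earth radius here are my assumptions, not necessarily what ChatGPT emitted in the shared chat):

```python
# Hypothetical sketch of a haversine great-circle distance function,
# the sort of code described in the linked chat transcript.
from math import radians, sin, cos, asin, sqrt

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # assumed mean Earth radius in km
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

# London to Paris comes out around 344 km
print(round(haversine(51.5074, -0.1278, 48.8566, 2.3522)))
```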