Simon Willison

I've released a new reusable Django app - django-http-debug - which makes it easy to quickly set up a debugging HTTP endpoint that returns a canned response and logs full details of any incoming requests. It's great for the initial stages of implementing things like OAuth or incoming webhooks.

Most of the code was written for me by Claude 3.5 Sonnet - full details here: simonwillison.net/2024/Aug/8/d

Django admin screenshot: adding a debug endpoint. Path is set to hello-world, status code is 200, content-type is text/plain; charset=utf-8, headers is {"x-hello": "world"}, content is Hello world, the "Is base 64" checkbox is unchecked and the "Logging enabled" checkbox is checked.
Django admin screenshot showing a list of three logged requests to the hello-world endpoint, all three have a timestamp, method and query string - the method is GET for them all but the query string is blank for one, a=b for another and c=d for a third.
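For illustration, here's a rough sketch of the general pattern - this is an assumption-laden sketch, not the actual django-http-debug code: a plain Django view that logs the full incoming request and returns a canned response matching the admin configuration described above.

# Illustrative sketch only - not the actual django-http-debug implementation.
# Logs full request details, then returns a canned response matching the
# admin configuration shown in the screenshot above.
import logging

from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt

logger = logging.getLogger("http_debug")

@csrf_exempt
def debug_endpoint(request):
    # Record method, path, query string, headers and body for later inspection
    logger.info(
        "method=%s path=%s query=%s headers=%s body=%r",
        request.method,
        request.path,
        request.META.get("QUERY_STRING", ""),
        dict(request.headers),
        request.body,
    )
    # Canned response: 200, text/plain, custom header, "Hello world" body
    response = HttpResponse("Hello world", content_type="text/plain; charset=utf-8")
    response["x-hello"] = "world"
    return response

# urls.py would then map it with: path("hello-world", debug_endpoint)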
Simon Willison

As part of working on this I figured out (with more help from Claude) a good pattern for writing automated tests for a reusable Django app like this - tests that live in the same repository and spin up a minimal Django project, just enough for them to run. I wrote that up in detail as this TIL: til.simonwillison.net/django/p
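As a rough idea of what that looks like, here's a minimal sketch of one common way to do it - the specific settings here are assumed for illustration, not taken from the TIL: a conftest.py that calls settings.configure() so the app's tests can run without a full Django project checked into the repo.

# conftest.py - a minimal sketch of one common approach (settings are
# assumed, not necessarily what the TIL describes): configure just enough
# Django for the reusable app's tests to run, with no full project.
import django
from django.conf import settings


def pytest_configure():
    settings.configure(
        DEBUG=True,
        DATABASES={
            "default": {
                "ENGINE": "django.db.backends.sqlite3",
                "NAME": ":memory:",
            }
        },
        INSTALLED_APPS=[
            "django.contrib.contenttypes",
            "django.contrib.auth",
            "django_http_debug",  # the reusable app under test (module name assumed)
        ],
        USE_TZ=True,
    )
    django.setup()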

Glyph

@simon does this level of LLM “authorship” give you concerns about its provenance?

Simon Willison

@glyph not at all, it really is exactly what I would have written if I’d spent the extra time on it

My current take on provenance is that if it doesn’t spit out eg a carbon copy of John Carmack’s fast inverse square root I’m not personally morally bothered by it

And legally, most LLM vendors have some kind of copyright indemnity these days - here’s Anthropic’s: anthropic.com/news/expanded-le

Glyph

@simon this only covers you for the legal costs of defending the copyright lawsuit and potential damages though, which is not super relevant for something that goes up on PyPI. It doesn’t cover the reputation damage or the engineering effort required for your users to rip out the dependency, if they’re not paying you for it. It would make me feel pretty comfortable shipping something in a product but open sourcing it seems riskier.

antrix

@simon do you use any IDE integrations to work with LLMs?

antrix

@simon most (all?) of your posts about coding with LLMs seem to involve the web interfaces (ChatGPT, etc). I do the same. Haven't found a way to do this style of assisted development as effectively from within VS Code.

Simon Willison

@antrix I’ve also recently started using the VS Code feature where you can select a bunch of lines and click the little sparkle icon and give it a prompt telling it how to modify the selected code
