Here are my custom instructions which I'm using as part of a Claude Project, but I expect they'll work the same way with other LLMs too
I have a similar set of custom instructions I use with Claude Artifacts to get it to produce mobile-friendly single page HTML apps that run without a build step

@evan love Web Components - I often prompt it to create those directly, but I still want them all in a single file so I can easily copy and paste the whole lot out of the LLM at once

@simon So, you don't use an IDE with integrated AI, like VSCode with Copilot or Cursor or Zed or whatever?

@evan I use GitHub Copilot but I get a ton of work done directly in Claude (with Artifacts) and ChatGPT (with Code Interpreter), pasting code back and forth

@simon last question! Do you ever use a local open source model for code generation, and if so, which one?

@evan only on planes! Best I've tried are Qwen2.5-Coder-32B and Llama 3.3 70B

@evan both of them use so much memory that I have to shut down a bunch of Firefox tabs and VS Code windows first - plus they're noticeably slower than the best hosted models

@prem_k with Claude Artifacts you have to or it defaults to writing you a React component every time!

@simon true that ... I say some version of "native JavaScript without libraries as much as possible"
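As a rough sketch of the single-file, no-build-step pattern being described here - the greeting-card element and its contents are invented for illustration, not taken from the actual custom instructions - a page like this defines its Web Component inline, so the whole thing can be copied out of the LLM in one go and opened directly in a browser:

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <!-- the viewport meta tag keeps the page usable on mobile -->
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <title>Single-file Web Component sketch</title>
    </head>
    <body>
      <greeting-card name="world"></greeting-card>

      <script>
        // A tiny custom element, defined inline so the whole app stays in one file
        class GreetingCard extends HTMLElement {
          connectedCallback() {
            const name = this.getAttribute("name") || "there";
            this.innerHTML = `<p>Hello, ${name}!</p>`;
          }
        }
        customElements.define("greeting-card", GreetingCard);
      </script>
    </body>
    </html>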
@simon I am happy to see you are using this. I wasn't doing as many one-shot attempts with Projects when we talked at DCUS, but sometimes that's how the results end up.
I have been playing with MCP and it has a ton of potential, but both the UI and the quality took a huge drop after two days of near perfection. Not sure if you've tried that out yet, but I haven't tried it in the last two weeks. (I suspect it was an obvious bug.)
Happy to compare notes sometime.