@simon last question! Do you ever use a local Open Source model for code generation, and if so, which one?
@evan only on planes! Best I've tried are Qwen2.5-Coder-32B and Llama 3.3 70B
@evan both of them use so much memory that I have to shut down a bunch of Firefox tabs and VS Code windows first - plus they're noticeably slower than the best hosted models
https://simonwillison.net/2024/Nov/12/qwen25-coder/
https://simonwillison.net/2024/Dec/9/llama-33-70b/