@glyph Agreed. We're using Tab9 (https://www.tabnine.com/), which trains only on the code in your repository and doesn't treat your code like a publicly exploitable commodity.
Also, I think it produces vastly more useful, if less audacious in some respects, results.
I've found it saves me around 30–40 minutes a day in boilerplate I don't have to type.
@feoh @glyph I seriously doubt that it only trains on the code already in the repo… these LLM-style networks need an absolute ton of training data.
In fact, their privacy page explicitly says that it *does not* use your code for training.
It seems essentially identical to Copilot in terms of copyright ramifications.