@philpax Without Python, ML frameworks would not have taken off so easily, and there'd have been no ML revolution IMO.
And IMO Python is never a bottleneck in the productization of ML models.
If you really want to deploy something, especially optimally or on lower-end devices, you typically need to rewrite inference anyway (for example in OpenCL or CUDA).
@BartWronski I agree on Python making it easy for R&D; it's hard to argue with the results.
That being said, my primary interest is in deploying Stable Diffusion and supporting models to the desktop (especially as part of games) with minimal dependencies, and as far as I can tell this is still a pretty major headache.
I don't want my users to have to install Python/PyTorch/Conda/CUDA: I want it to Just Work™.
There's some interesting work happening here and there, though, like https://github.com/webonnx/wonnx
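For a sense of what "minimal dependencies" could look like in practice, here's a rough sketch of running an ONNX model through wonnx from Rust. The Session::from_path/run calls and the pollster executor follow the wonnx README examples from memory, so the exact API may differ between versions; the model path and input name are placeholders.

```rust
// Minimal sketch, assuming the wonnx API as shown in its README examples
// (Session::from_path, Session::run) and the pollster crate as a tiny async
// executor. Treat this as illustrative, not the definitive interface.
use std::collections::HashMap;

async fn run() -> Result<(), Box<dyn std::error::Error>> {
    // Load an ONNX model and execute it on the GPU via WebGPU.
    // No Python, PyTorch, Conda, or CUDA toolkit on the user's machine.
    let session = wonnx::Session::from_path("model.onnx").await?;

    // Inputs are passed as a map from the model's input names to tensor data.
    let input: Vec<f32> = vec![0.0; 3 * 224 * 224]; // placeholder input
    let mut inputs = HashMap::new();
    inputs.insert("input".to_string(), input.as_slice().into());

    // Outputs come back keyed by the model's output names.
    let outputs = session.run(&inputs).await?;
    println!("got {} output tensor(s)", outputs.len());
    Ok(())
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    pollster::block_on(run())
}
```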