Ollama 0.2 now supports concurrency: a single loaded model can serve parallel requests, and multiple models can be loaded and run simultaneously. This enables use cases such as handling several chat sessions at once or running multiple agents side by side, with Ollama managing memory and GPU allocation across the loaded models.
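As a minimal sketch, the concurrency limits can be tuned through environment variables before starting the server. `OLLAMA_NUM_PARALLEL` and `OLLAMA_MAX_LOADED_MODELS` are Ollama's documented knobs; the values below are illustrative, not recommendations:

```shell
# Allow each loaded model to serve up to 4 requests in parallel
export OLLAMA_NUM_PARALLEL=4
# Keep up to 2 models loaded in memory at the same time
export OLLAMA_MAX_LOADED_MODELS=2
# Start the server with these settings (requires a local Ollama install):
#   ollama serve
echo "parallel=$OLLAMA_NUM_PARALLEL models=$OLLAMA_MAX_LOADED_MODELS"
```

With defaults, Ollama picks these limits automatically based on available memory; setting them explicitly is only needed when tuning for a specific workload.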
alternativeto.net/news/2024/7/