@juliana
@luminesce
Yes totally^ it depends on the application re: energy use. Chaining a bunch of models together would probably use more energy, but Google et al. want to use smaller models hooked up to knowledge graphs to make runtime inference feasible as a consumer product too, so that kind of application would be designed to use less.
The medical case has its own set of fun complications 💓
https://jon-e.net/surveillance-graphs/#nih-the-biomedical-translator
@jonny bestie how sad are you that xanadu never happened lol