@david @HistoPol @markearnest @paul @fediversereport @fediversenews @Bluedepth @darren @juneussell @davetroy There's distributed control to some extent. Mastodon has *some* safety measures built in, for example automatically rate-limiting access from a single source, though I don't know their extent or details. If there are too many requests, errors are returned, so there's an incentive beyond latency to cache some data locally.

(Some local caching is most likely also preferable just in terms of operating cost. Fetching all that remote information live on every request increases latency, which increases the number of parallel in-flight requests, which in turn significantly increases resource requirements.)
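To illustrate the idea, here's a minimal sketch of time-based local caching. This is not Mastodon's actual implementation; the names, the fake fetch function, and the 60-second TTL are all made up for illustration:

```python
import time

class TTLCache:
    """Minimal time-based cache: entries expire after ttl_seconds."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (expiry_timestamp, value)

    def get(self, key, fetch):
        """Return a cached value, calling fetch() only on a miss or expiry."""
        now = time.monotonic()
        entry = self.store.get(key)
        if entry is not None and entry[0] > now:
            return entry[1]          # fresh hit: no remote request made
        value = fetch()              # miss/expired: one remote fetch, then cache
        self.store[key] = (now + self.ttl, value)
        return value

# Hypothetical usage: count how often the "remote" fetch actually runs.
calls = 0
def fake_remote_fetch():
    global calls
    calls += 1
    return {"display_name": "example"}

cache = TTLCache(ttl_seconds=60)
for _ in range(100):
    profile = cache.get("https://example.social/users/alice", fake_remote_fetch)

print(calls)  # 100 lookups, but only 1 remote request
```

One cached lookup per TTL window is what keeps both latency and the remote server's request load down, at the cost of serving data up to 60 seconds stale.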