marius

Can someone with more #JavaScript experience tell me what's an idiomatic way of solving the problem of an application that must fetch multiple URLs?

I currently have the issue that a lot of those URLs are actually the same, so I end up with 40 requests for the same resource.

Is there an elegant way to solve this with minimal overhead over the native fetch API?

#frontend #javascript #noframework #webdev

7 comments
diabhoil

@mariusor have something like a global dictionary (object) of type <string, Promise>. If the URL is already a key in the dictionary, just return the existing promise; otherwise add it to the dictionary first.
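
A minimal sketch of that idea (the names and URL are mine; it caches the Promise of the parsed JSON rather than the Response object, since a Response body can only be read once):

    const cache = new Map();

    function cachedJson(url) {
      if (!cache.has(url)) {
        // Store the Promise before it resolves, so concurrent callers
        // share the same in-flight request.
        cache.set(url, fetch(url).then((res) => res.json()));
      }
      return cache.get(url);
    }

    // All three calls below trigger a single network request.
    const [a, b, c] = await Promise.all([
      cachedJson("/api/item/1"),
      cachedJson("/api/item/1"),
      cachedJson("/api/item/1"),
    ]);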

marius

@diabhoil yes, that's the way I was leaning too, but I wasn't sure it was very idiomatic. Do you have an example of how you use that?

diabhoil

@mariusor No, not really. But you could have a wrapper function like:

    function cachedFetch(url) {
      if (!dictionary[url]) dictionary[url] = fetch(url);
      return dictionary[url];
    }

Don't know if the syntax is correct :D And it's not tested... but something like that was what I had in mind ^^

marius

@diabhoil yes, that's exactly what I started with; however, all the fetches happen before the promises resolve, so the check fails for the URL. Maybe I just need to stick the promises in there.
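
For illustration (this is my reading of the failure mode described above, not code from the thread): caching the resolved value after an await leaves a window in which concurrent calls all miss the cache, while caching the pending Promise itself closes that window.

    const resolvedCache = {};
    const promiseCache = {};

    // Racy: the cache entry is only written once the first request
    // resolves, so parallel calls each start their own fetch.
    async function racyCachedFetch(url) {
      if (!resolvedCache[url]) {
        resolvedCache[url] = await fetch(url);
      }
      return resolvedCache[url];
    }

    // Fixed: the pending Promise is stored synchronously, so every
    // concurrent caller gets the same in-flight request.
    function promiseCachedFetch(url) {
      if (!promiseCache[url]) {
        promiseCache[url] = fetch(url);
      }
      return promiseCache[url];
    }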

diabhoil

@mariusor just tested with this:

    const cachedFetch = (url) => {
      if (!dictionary[url]) {
        console.log("add to cache");
        dictionary[url] = fetch(url);
      } else {
        console.log("just use from cache!");
      }
      return dictionary[url];
    };

and it worked as expected when calling "await cachedFetch(url)".

marius

@diabhoil then maybe I'm getting tripped up by some awaits I have in the code. Thank you.

marius

In the end this turned out to be as easy as modifying the options we passed to the fetch() function. Afterwards, the browser's caching mechanism was enough to avoid slamming the server.

The changes made (a sketch of the resulting call follows the list):
* the fetch request was made with a slightly broken Accept header, which made the server always return HTML instead of a JSON document
* the fetch request was using the 'no-cors' mode, which strips the Accept header (the MDN docs say it should be kept, but my tests suggest otherwise)
* the fetch request now uses the 'force-cache' option, so the browser reuses a cached response, even a stale one, instead of making more requests to the server.
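
A minimal sketch of the resulting call, based on the list above (the URL and the exact Accept value are placeholders; the post doesn't say which mode replaced 'no-cors', so none is set here):

    // A well-formed Accept header so the server returns JSON,
    // no 'no-cors' mode, and 'force-cache' so the browser serves
    // a cached response (even a stale one) instead of re-requesting.
    const response = await fetch(url, {
      headers: { Accept: "application/json" },
      cache: "force-cache",
    });
    const data = await response.json();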

TY:
@tjcrowdertech @diabhoil @hattifattener @yasser


[Screenshot: the Firefox developer tools network tab, showing a large number of identical requests to the same URL. All of them are shown as coming from the local browser cache, which was the desired goal in the first place.]