
Data caching required to reduce API calls #196

Open
VinVorteX opened this issue Oct 5, 2024 · 6 comments

Comments

@VinVorteX

Implement caching to reduce the number of API calls, especially if the user requests the same location multiple times.
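
A minimal sketch of what such a cache could look like, keyed by location with a short TTL (the function names, the `fetch` callback, and the 10-minute TTL are illustrative assumptions, not taken from this repository):

```python
# Hypothetical in-memory cache: memoize API responses per location with a TTL.
import time

_cache: dict[str, tuple[float, dict]] = {}
TTL_SECONDS = 600  # assumption: weather data is "fresh enough" for 10 minutes

def get_weather(location: str, fetch) -> dict:
    """Return cached data for `location` if still fresh, otherwise call `fetch`."""
    now = time.time()
    entry = _cache.get(location)
    if entry and now - entry[0] < TTL_SECONDS:
        return entry[1]
    data = fetch(location)          # the real API call
    _cache[location] = (now, data)  # store timestamp + payload
    return data
```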

@VinVorteX
Author

VinVorteX commented Oct 5, 2024

I would like to work on this feature.

@kordianbruck
Collaborator

I think the question is: are any of the API vendors actually complaining, or are we doing this for users because they are running into quota issues?

@kordianbruck
Collaborator

Also: why don't we persist the cache into /tmp or to disk at least? That seems more useful as a long-term solution than just an in-memory cache.
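
A minimal sketch of that idea, assuming one JSON file per location under the system temp directory and reusing the same hypothetical TTL as above (paths and names are illustrative, not project decisions):

```python
# Hypothetical disk-backed cache so results survive across runs of a one-shot CLI.
import json, os, tempfile, time

CACHE_DIR = os.path.join(tempfile.gettempdir(), "weather-cli-cache")
TTL_SECONDS = 600

def cached_fetch(location: str, fetch) -> dict:
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, location.lower().replace(" ", "_") + ".json")
    # Serve from disk if a cache file exists and is younger than the TTL.
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < TTL_SECONDS:
        with open(path) as fh:
            return json.load(fh)
    data = fetch(location)  # the real API call
    with open(path, "w") as fh:
        json.dump(data, fh)
    return data
```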

@VinVorteX
Author

As for the API vendors, I think we're mainly doing this for users who might run into quota issues. We want to help them avoid hitting those limits, even if the vendors aren't complaining right now.

As for caching, using in-memory caching was a quick and simple solution for speed. It’s great for short-term use since weather data changes often. But I totally agree that persisting the cache in /tmp or on disk would be a better long-term solution. It would help users keep data across sessions and reduce loading times. I can definitely look into adding that feature in the future!

@kordianbruck
Collaborator

For a one-shot terminal CLI, an in-memory cache does nothing: once the program terminates, the cache is gone. So, as implemented, the code accomplishes nothing as far as I can tell. Without file persistence this makes no sense to me.

@marcofeltmann

marcofeltmann commented Nov 10, 2024

Interesting task. Yet which timeout should that cache have for invalidation?
If you ask the same API for the same location over and over again, you might receive two different results within a second.

The thing with weather is: It constantly changes. Hence the thing with weather forecasts is: They do, too.

Shouldn't each API have its own "rate limiting" solution that the backends have to respect?

Like: first request the HEAD of the API, have a look at the "Last-Modified" HTTP header, and decide whether you really want to download the body content.

Or have some "lastUpdated" endpoint for a location to get that same information.

You'll still call the endpoint every time the user wants to, yet save some bandwidth by only downloading larger amounts of data when things have changed.

And: how does an in-memory cache work in an application that is started, downloads all the data, displays it, and exits?
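
A rough sketch of the HEAD / Last-Modified approach described above, assuming the vendor's API actually sets that header (the URL handling and the use of `requests` are illustrative, not the project's code):

```python
# Hypothetical conditional fetch: skip the body download when Last-Modified is unchanged.
import requests

def fetch_if_changed(url: str, last_seen: str | None) -> tuple[dict | None, str | None]:
    head = requests.head(url, timeout=10)
    modified = head.headers.get("Last-Modified")
    if modified and modified == last_seen:
        return None, last_seen  # body unchanged since last run, skip the download
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    return resp.json(), modified
```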
