Use new `KeyValueCache` and friends from `@apollo/utils.keyvaluecache` #6522

Conversation
// lru-cache npm module, whose maxAge feature is based on `Date.now()`
// (no setTimeout or anything like that). So we want to use fake timers
// just for Date. (Faking all the timer methods messes up things like a
// setImmediate in ApolloServerPluginDrainHttpServer.)
clock = FakeTimers.install({ toFake: ['Date'] });
Can we just stop using this then?
No, because the plugin actually uses `Date` to calculate the `age` header, so this is still useful (and required for testing that).
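As a rough, hypothetical sketch (not the actual test code): faking only `Date` lets a test advance wall-clock time deterministically, which is what an `age`-header calculation based on `Date.now()` needs, while real timer functions like `setImmediate` keep working.

```ts
import FakeTimers from '@sinonjs/fake-timers';

// Fake only Date; setTimeout/setImmediate remain real.
const clock = FakeTimers.install({ toFake: ['Date'] });
try {
  const storedAt = Date.now(); // time the response was "cached"
  clock.tick(5000); // advance the fake Date by 5 seconds
  const ageSeconds = Math.round((Date.now() - storedAt) / 1000);
  console.log(ageSeconds); // 5 -- the kind of value an age header would report
} finally {
  clock.uninstall();
}
```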
Ah. Then update the comment instead of deleting it...
I do wish we understood why lru-cache doesn't seem to work with fake-timers. I do worry there's an actual bug there.
We looked into this for a while. It seems like the combination of lru-cache + fake-timers + Jest is messed up... something maybe about Jest's fake globals combined with lru-cache hanging on to a copy of `performance` early on, perhaps...
advanceTime(ms: number) {
  [...this.cache.values()].forEach(
    (value) => value.ttl && (value.ttl -= ms),
  );
}
Heh, cute.
Maybe too cute, I'll put a comment here 😄
Removed cuteness 8d1d84a
@@ -14,7 +14,7 @@ import resolvable, { Resolvable } from '@josephg/resolvable';
import {
  InMemoryLRUCache,
So you allude to this in the PR description but... isn't this a backwards-incompatible change, from an unbounded to a bounded default cache?
Could fix that by using the default (Map-backed) keyv. Or by putting an extraordinarily large max/maxSize that is still finite...
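A minimal sketch of the first option (hedged: it assumes the `KeyvAdapter` wrapper from `@apollo/utils.keyvadapter`; the PR may wire this up differently):

```ts
import Keyv from 'keyv';
import { KeyvAdapter } from '@apollo/utils.keyvadapter';

// A bare `new Keyv()` is backed by a plain in-memory Map, so it is
// effectively unbounded; KeyvAdapter exposes it as a KeyValueCache.
const defaultCache = new KeyvAdapter(new Keyv());
```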
Agree, I brought back keyv since it's actually unbounded and we weren't even leveraging size calculation in the default case. Let me know what you think: 97dcb80
My only hesitation is introducing the keyv dependency, but in this case we're just a consumer and not implementing our own so maybe less concerning.
Hmm. You know, your test TTL cache made it kinda clear to me that it would be very easy to just implement our own ttl-obeying unbounded cache. Just store the value and the "deadline" (now + TTL) in a map, check time at read time. Maybe that beats the dependency on keyv. Though at least it's a purely internal dependency on keyv rather than something part of the API...
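A minimal sketch of that idea, assuming the `get`/`set`/`delete` shape of `KeyValueCache` from `@apollo/utils.keyvaluecache` with a `ttl` option in seconds (hypothetical, not the implementation the PR ships):

```ts
import type { KeyValueCache } from '@apollo/utils.keyvaluecache';

// Unbounded, TTL-obeying in-memory cache: each entry stores the value
// plus an absolute deadline (now + ttl); expiry is checked lazily at
// read time, so no timers are involved.
class UnboundedTTLCache implements KeyValueCache<string> {
  private entries = new Map<string, { value: string; deadline?: number }>();

  async get(key: string): Promise<string | undefined> {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (entry.deadline !== undefined && Date.now() > entry.deadline) {
      this.entries.delete(key); // expired: clean up lazily
      return undefined;
    }
    return entry.value;
  }

  async set(key: string, value: string, options?: { ttl?: number | null }) {
    const deadline =
      options?.ttl != null ? Date.now() + options.ttl * 1000 : undefined;
    this.entries.set(key, { value, deadline });
  }

  async delete(key: string): Promise<boolean> {
    return this.entries.delete(key);
  }
}
```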
I think this is mergeable if you want, though I think we could also implement our own unbounded cache (maybe just directly in `@apollo/server` rather than the utils package, and not exporting it) rather than keyv without much work.
Merging into release branch with intention to follow up on @glasser's last comment about implementing our own unbounded cache.
In the same spirit as #6488
Note that this brings the internal version of `lru-cache` to v9 (which forces a bounded cache, where we previously used infinite). This version has also proven to be troublesome to test with mock timers, hence the introduction of the `TTLTestingCache`.
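A rough sketch of what a TTL-aware testing cache could look like (hypothetical; the actual `TTLTestingCache` in this PR may differ), using an `advanceTime` helper like the one discussed above instead of mock timers:

```ts
// Testing cache: entries keep a remaining ttl in milliseconds, and tests
// age them explicitly via advanceTime() rather than faking timers.
class TTLTestingCache {
  private cache = new Map<string, { value: string; ttl?: number }>();

  async set(key: string, value: string, options?: { ttl?: number }) {
    this.cache.set(key, {
      value,
      // ttl option assumed to be in seconds; stored as milliseconds
      ttl: options?.ttl != null ? options.ttl * 1000 : undefined,
    });
  }

  async get(key: string): Promise<string | undefined> {
    const entry = this.cache.get(key);
    if (!entry) return undefined;
    if (entry.ttl !== undefined && entry.ttl <= 0) {
      this.cache.delete(key); // expired
      return undefined;
    }
    return entry.value;
  }

  async delete(key: string): Promise<boolean> {
    return this.cache.delete(key);
  }

  // Age every entry by `ms`; expired entries are dropped on the next get().
  advanceTime(ms: number) {
    for (const entry of this.cache.values()) {
      if (entry.ttl !== undefined) entry.ttl -= ms;
    }
  }
}
```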