Comments (5)
BTW, I think the following should technically work:
delete ctx.cash;
But I would prefer not to touch the context variables used by this lib, if possible.
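For illustration, here is a minimal self-contained simulation of that pattern. The `store`, `writeToCacheIfMarked`, and `handle` names below are stand-ins invented for this sketch, not cash's real internals; the point is just that a response is only written to the cache if ctx.cash is still present once the stack unwinds.

```javascript
// Simulation of the koa-cash flow (illustrative stand-ins, not cash's real code):
// downstream marks the response for caching via ctx.cash; anything further up
// the stack can cancel the pending write by deleting ctx.cash.
const store = new Map();

// Stand-in for the cache write cash performs once the response is ready.
function writeToCacheIfMarked(ctx) {
  if (ctx.cash) store.set(ctx.url, ctx.body);
}

function handle(ctx, cancelCaching) {
  ctx.cash = { maxAge: 15 * 60 * 1000 }; // caching scheduled downstream
  ctx.body = { ok: true };
  if (cancelCaching) delete ctx.cash;    // the `delete ctx.cash` move
  writeToCacheIfMarked(ctx);
}

handle({ url: '/kept' }, false);   // '/kept' ends up in the store
handle({ url: '/dropped' }, true); // '/dropped' is never stored
```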
from cash.
@edorivai your issue contains two very different feature requests.
But sometimes I figure out on the way back through the middleware stack that I don't want it cached for this specific request.
This could be easily solved with a simple helper function like ctx.cash.expire, or by deleting ctx.cash, as you proposed yourself. But I can't see any real application for it.
The whole point of caching is to skip any actual work by serving content from the cache.
After the polling has completed, we cache those results for, say, 15 minutes. During this period, we want any requests to this specific route to be served from the cache.
And this seems like a very useful feature, though it can't be solved simply by deleting ctx.cash somewhere in your middleware stack. To do so, you (or cash) should somehow mark that a new request is already running and serve all current requests from the cache, thus postponing the actual cache expiration until new data is fetched.
I'm not sure if I'm understanding you correctly. I'll elaborate a bit with examples.
We have a route, let's say GET /poll?query=something, which has two response modes.
The first: a polling process is running, so a partial result is returned, but the client should request the same resource again after a short interval (a couple of seconds). Example response:
{
data: { ...some partial data },
pollingComplete: false
}
The second response mode is when the polling process has completed, all results have been fetched, and the returned data is complete:
{
data: { ...complete data set },
pollingComplete: true
}
So a request sequence could be:
GET /poll - pollingComplete=false
GET /poll - pollingComplete=false
GET /poll - pollingComplete=true
Now for caching: we want a completed response to be cached for a specified time frame, but we obviously don't want the partial responses to be cached. The tricky part is that we serve both the partial and the completed content over the same route, and that we only know whether a response is partial or complete on the way back through the middleware, which is after ctx.cached() has already been called.
One way to see it: I tell koa-cash to cache the response to the current request on the way down the middleware stack, and once the response is coming back up I realize that this specific response should not be cached after all, so I want to cancel the currently scheduled caching operation.
We are already using the method I described earlier:
delete ctx.cash
The only reason I opened this issue is that deleting ctx.cash feels kinda hacky, and I thought it might be worth discussing whether this is functionality that is more broadly useful. If that's the case, then we could perhaps implement something like ctx.cash.cancel and document it properly.
On the other hand, I would understand if you argued that my use case is too specific and not worth officially supporting. In that case I'd be okay keeping the delete hack in place.
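For the record, a hypothetical ctx.cash.cancel() could be as thin as a wrapper around that delete. Everything below is a sketch of the proposed API, not anything cash actually ships; attachCash is an invented stand-in for where cash populates the context.

```javascript
// Hypothetical ctx.cash.cancel() (proposed in this thread; NOT part of cash):
// cancel() simply removes the pending cache marker from the context.
function attachCash(ctx, maxAge) {
  ctx.cash = {
    maxAge,
    cancel() { delete ctx.cash; }, // documented equivalent of `delete ctx.cash`
  };
}

const ctx = {};
attachCash(ctx, 15 * 60 * 1000);
ctx.body = { data: {}, pollingComplete: false };
if (!ctx.body.pollingComplete) ctx.cash.cancel(); // partial result: don't cache
```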
from cash.
@edorivai canceling response caching after the ctx.cached() call makes sense in your multiple-response-modes example, though it doesn't look like a common scenario.
In my opinion, the following two scenarios are more likely to be useful:
- Queue all incoming requests for any expired uri to ensure that no more than one data fetching process per uri is running at the same time.
- Serve slightly expired data while new data is processing.
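The first scenario can be sketched without touching cash's internals: keep one in-flight promise per key and hand the same promise to every concurrent caller. All names below (inFlight, fetchOnce) are illustrative, not part of cash's API.

```javascript
// Coalesce concurrent fetches per uri: at most one data-fetching process
// runs per key; callers arriving meanwhile await the same pending promise.
const inFlight = new Map();

function fetchOnce(key, fetcher) {
  if (!inFlight.has(key)) {
    const p = fetcher().finally(() => inFlight.delete(key));
    inFlight.set(key, p);
  }
  return inFlight.get(key);
}

let calls = 0;
const fetcher = async () => { calls += 1; return 'data'; };
Promise.all([fetchOnce('/poll', fetcher), fetchOnce('/poll', fetcher)])
  .then(() => console.log(calls)); // the fetcher ran only once: calls === 1
```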
But if you think ctx.cash.cancel() should be a part of the cash API, you could submit a PR and I'll accept and publish it.
@lbeschastny Thanks for the input.
If I'm the only one running into this issue, I'd refrain from implementing it, since it would increase the API surface and maintenance burden without providing enough utility. If there are more people who would like to see this feature implemented, please chime in here.
Closing for now, thanks again.