HttpWebRequest.CachePolicy: Caching Questions

If I set HttpWebRequest.CachePolicy as follows:

    var webRequest = (HttpWebRequest) WebRequest.Create(url);
    var policy = new HttpRequestCachePolicy(
        HttpCacheAgeControl.MaxAge,
        TimeSpan.FromMinutes(1)
    );
    webRequest.CachePolicy = policy;

and then make two asynchronous requests for the same URL at the same moment, what happens to the second request? Is the second one served from the cache populated by the first, or will two requests be sent, since at the moment they were issued there was nothing in the cache yet?
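For concreteness, here is a minimal sketch of the scenario I mean (two separate HttpWebRequest instances for the same url, using GetResponseAsync, which assumes .NET 4.5 or later):

    // using System; using System.Net; using System.Net.Cache; using System.Threading.Tasks;
    static async Task MakeTwoRequestsAsync(string url)
    {
        var policy = new HttpRequestCachePolicy(HttpCacheAgeControl.MaxAge, TimeSpan.FromMinutes(1));

        // Two separate requests for the same URL, both with the same cache policy.
        var first = (HttpWebRequest) WebRequest.Create(url);
        first.CachePolicy = policy;
        var second = (HttpWebRequest) WebRequest.Create(url);
        second.CachePolicy = policy;

        // Start both at (roughly) the same moment.
        Task<WebResponse> t1 = first.GetResponseAsync();
        Task<WebResponse> t2 = second.GetResponseAsync();

        using (var r1 = await t1)
        using (var r2 = await t2)
        {
            // Is the second response served from the cache, or did both hit the server?
        }
    }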

Also, in this context, what is the cache? Where does it live? Do we have any more control over it?

+6
c# caching
2 answers

Two such requests in .NET code will trigger two HTTP requests, which can easily be verified by writing something that does this, running it, and then checking what arrives at the server.

This is appropriate, because the two requests may well receive different responses, especially when you consider that one of them may suffer an error while the other does not. There are other reasons too (for example, the server may send a response that differs each time, along with instructions that it must not be cached).

However, there can be an exception. The number of requests that will be sent simultaneously to the same domain is limited; the limit is configurable, but defaults to two (this is often complained about, because for some uses that is unacceptable; however, two concurrent requests to a server give the greatest overall throughput in most cases, so the default is there for a good reason).

Because of this, it is entirely possible that one of the two requests will be delayed, having been queued under this rule. Even if the default limit has been raised, it is possible that it has still been exceeded.
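If that per-host limit of two matters for your scenario, it is configurable. A minimal sketch, assuming the classic .NET Framework ServicePointManager API, set before the requests are created:

    // using System.Net;
    // Raise the per-host connection limit from the default of 2.
    // This applies to ServicePoints created afterwards; the service point of an
    // existing request can also be adjusted via webRequest.ServicePoint.ConnectionLimit.
    ServicePointManager.DefaultConnectionLimit = 10;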

Now, as Mark notes, the response can be cached once its response stream has been read to the end*, and this can have happened by the time the second request begins, which would lead to the cached response being used, if applicable (the response was cacheable and there were no errors during the download).

So, on balance, we would expect two separate downloads, and we should be glad of that (in case one of them hits an error), but conditions are possible under which there is only one download, since two "simultaneous" requests are never truly forced to happen at exactly the same time.

* In fact, although the documentation states that the stream must be read to the end, it must really be read to the end and also closed, either manually, by disposing it (for example, with using), or by its finalizer running.
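A minimal sketch of what "read to the end and closed" looks like in practice, assuming webRequest is the request from the question:

    // using System.IO; using System.Net;
    using (var response = (HttpWebResponse) webRequest.GetResponse())
    using (var stream = response.GetResponseStream())
    using (var reader = new StreamReader(stream))
    {
        // Reading the body to the end and then disposing the reader, stream and
        // response (via using) is what allows the response to enter the cache.
        string body = reader.ReadToEnd();
    }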

+5

Firstly, HttpWebRequest is documented as only being able to execute one asynchronous request at a time (raising an InvalidOperationException otherwise), so you would need two such request objects. If you issued two such requests at the same time, I would fully expect both to go to the server; there would be no reason not to. In particular:

A copy of the resource is only added to the cache if the response stream for the resource is retrieved and read to the end of the stream. So another request for the same resource could use a cached copy, depending on the cache policy level for this request.

So at the time of the second request, we must assume (from the question) that the data has not yet been read to the end, so nothing will have been cached locally.

Assuming both requests are going to the same server, the server may queue the requests and could be configured to cache the result, but in all likelihood the server will simply process everything twice here.
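Going back to the first point, about one asynchronous request per HttpWebRequest instance, here is a minimal sketch of that constraint (url is assumed to be defined):

    var request = (HttpWebRequest) WebRequest.Create(url);
    request.BeginGetResponse(ar => { }, null);

    // A second concurrent call on the same instance throws InvalidOperationException,
    // which is why two separate HttpWebRequest objects are needed to make two
    // simultaneous requests.
    request.BeginGetResponse(ar => { }, null);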

+2
