Issuing two such requests in .NET code will trigger two separate HTTP requests, which is easy to verify: write a small program that does exactly that, run it, and watch what arrives on the server.
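A minimal sketch of such a check (the URL is hypothetical; each call below normally produces its own request on the wire, which you can confirm in the server's access log):

```csharp
using System;
using System.Net;

class TwoRequests
{
    static void Main()
    {
        // Hypothetical endpoint; substitute a server whose logs you can inspect.
        string url = "http://example.com/data";

        using (var clientA = new WebClient())
        using (var clientB = new WebClient())
        {
            string first = clientA.DownloadString(url);
            string second = clientB.DownloadString(url);
            Console.WriteLine(first == second
                ? "Responses matched this time"
                : "Responses differed");
        }
    }
}
```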
This is appropriate, because the two requests may receive different answers; in particular, one of them may fail while the other succeeds. There are other reasons as well (for example, the server may send a different response each time, or send instructions that the response must not be cached).
However, there may be an exception. There is a configurable limit on the number of simultaneous connections to the same domain, and it defaults to two (people often complain about this because in some scenarios it is unacceptable, but two concurrent connections per server do give the greatest overall throughput in most cases, so the default is there for a good reason).
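If that default is too low for a given workload, it can be raised, either globally or per service point. A sketch, assuming the classic `System.Net` stack (the URL is hypothetical):

```csharp
using System.Net;

class RaiseConnectionLimit
{
    static void Main()
    {
        // Global default for all new service points; set this before
        // the first request is created. The built-in default is 2.
        ServicePointManager.DefaultConnectionLimit = 10;

        // Or adjust the limit for one host only, via its service point.
        var request = (HttpWebRequest)WebRequest.Create("http://example.com/");
        request.ServicePoint.ConnectionLimit = 10;
    }
}
```

Raising the limit lets more requests run truly in parallel, at the cost of more sockets per host.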
Because of this, it is entirely possible that one of the two requests will be delayed, queued behind the other by this rule. And even if the default limit is raised, the new limit may still be exceeded.
Now, as Mark notes, a response can be cached once its response stream has been read to the end*, and that can happen before the second request begins. In that case, if caching applies, the second request will be served from the cached response (assuming the response was cacheable and the download completed without error).
So, on balance, we should expect two separate downloads, and we should be glad of that (in case one of them fails), but conditions are possible under which there will be only one download, since two "simultaneous" requests are not actually forced to run at the same time.
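Whether the second request may be satisfied from the cache can also be controlled explicitly through the request's cache policy. A sketch, assuming the `System.Net.Cache` types (the URL is hypothetical):

```csharp
using System.Net;
using System.Net.Cache;

class CachePolicyExample
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("http://example.com/data");

        // Allow a previously cached copy to be used if one is available;
        // use HttpRequestCacheLevel.NoCacheNoStore to force a fresh download.
        request.CachePolicy = new HttpRequestCachePolicy(
            HttpRequestCacheLevel.CacheIfAvailable);

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            // response.IsFromCache reports whether the cache was actually used.
            bool cached = response.IsFromCache;
        }
    }
}
```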
* In fact, although the documentation states only that the stream should be read to the end, it really should be read to the end and then closed explicitly, by disposing it (for example, with `using`) or, failing that, from the finalizer.
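The pattern the footnote describes can be sketched like this (the URL is hypothetical):

```csharp
using System.IO;
using System.Net;

class ReadAndDispose
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("http://example.com/data");

        // Dispose both the response and its stream deterministically, so the
        // connection is returned to the pool promptly and the fully-read
        // response becomes eligible for caching.
        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            string body = reader.ReadToEnd(); // read to the end, then dispose
        }
    }
}
```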