You can change the default limit of 4 MB for a directory and its children by dropping a special web.config into that directory. That way, you don't have to make the whole site accept huge uploads (which can expose you to certain types of attacks).
Here is an example from a production application. maxRequestLength is measured in kilobytes, so 200000 works out to roughly 200 MB:
    <?xml version="1.0"?>
    <configuration>
      <system.web>
        <httpRuntime maxRequestLength="200000" />
      </system.web>
    </configuration>
Nesting web.configs has no negative side effects: your existing root web.config settings still apply site-wide, and these httpRuntime settings are inherited by this directory and everything under it.
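As a side note, if you would rather keep a single configuration file, the same per-directory scoping can be expressed with a <location> element in the root web.config. A minimal sketch, assuming the upload page lives in a folder called "Uploads" (the path name is just a placeholder):

    <?xml version="1.0"?>
    <configuration>
      <!-- Only the Uploads folder gets the bigger limit; the rest of the site keeps the 4 MB default -->
      <location path="Uploads">
        <system.web>
          <httpRuntime maxRequestLength="200000" />
        </system.web>
      </location>
    </configuration>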
On the page that handles the upload, make sure you set a higher-than-usual ScriptTimeout. The value is in seconds:
    Server.ScriptTimeout = 1200;  // 1200 seconds = 20 minutes
I like to set this on the page rather than in web.config, because it confines the increased timeout to this one page.
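For illustration, here is a minimal sketch of the upload page's code-behind, assuming a standard Web Forms FileUpload control named FileUploadControl and a hypothetical ~/Uploads folder (names are mine, not from the original):

    using System;
    using System.IO;
    using System.Web.UI;

    public partial class UploadPage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Only this page gets the longer timeout: 1200 seconds = 20 minutes.
            Server.ScriptTimeout = 1200;
        }

        protected void UploadButton_Click(object sender, EventArgs e)
        {
            if (FileUploadControl.HasFile)
            {
                // Save to disk here; the bytes could just as well go to the database instead.
                string fileName = Path.GetFileName(FileUploadControl.FileName);
                string path = Path.Combine(Server.MapPath("~/Uploads"), fileName);
                FileUploadControl.SaveAs(path);
            }
        }
    }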
On the SQL side, varbinary(max) columns can hold up to 2 GB, so you are covered there.
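If you are writing the bytes to SQL Server, a minimal ADO.NET sketch might look like the following; the table and column names are hypothetical, and the Contents column is assumed to be varbinary(max):

    using System.Data;
    using System.Data.SqlClient;

    public static class FileStore
    {
        public static void SaveFileToDatabase(string connectionString, string fileName, byte[] fileBytes)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "INSERT INTO UploadedFiles (FileName, Contents) VALUES (@name, @contents)", connection))
            {
                command.Parameters.Add("@name", SqlDbType.NVarChar, 260).Value = fileName;
                // Size -1 maps the parameter to varbinary(max), which holds up to 2 GB.
                command.Parameters.Add("@contents", SqlDbType.VarBinary, -1).Value = fileBytes;
                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }

Getting the byte array from the Web Forms control is just FileUploadControl.FileBytes (or FileUploadControl.FileContent if you prefer a stream).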
Update: I have used this approach with 50+ MB files, but it is possible that somewhere in the 100+ MB range the standard approach starts to break down. There are plenty of third-party upload controls that can take over from there; I would probably look at Telerik RadUpload first, since they are a quality name and it looks very polished: RadUpload
Regarding your update about reading the file in chunks: if you really want to write that yourself, Bobby D pointed to this article: Downloading large files using HttpModule
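If you do end up rolling it yourself, the core idea is to push the file through a small buffer instead of loading it all into memory. A rough sketch along those lines (buffer size and content type are arbitrary choices of mine, not taken from the article):

    using System.IO;
    using System.Web;

    public static class ChunkedDownload
    {
        public static void StreamFile(HttpResponse response, string filePath)
        {
            const int bufferSize = 64 * 1024;           // send 64 KB at a time
            byte[] buffer = new byte[bufferSize];

            response.Clear();
            response.ContentType = "application/octet-stream";
            response.AddHeader("Content-Disposition",
                "attachment; filename=\"" + Path.GetFileName(filePath) + "\"");
            response.BufferOutput = false;              // don't buffer the whole file server-side

            using (FileStream stream = File.OpenRead(filePath))
            {
                int bytesRead;
                while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0
                       && response.IsClientConnected)
                {
                    response.OutputStream.Write(buffer, 0, bytesRead);
                    response.Flush();                   // hand this chunk to the client before reading the next
                }
            }
        }
    }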