Suggestions for sending a huge string to a web service

Below is my array; each entry has this shape:

    child.Cars1 = { name: null, operation: 0, selected: false }

Now, in the array above, the selected property represents the checked/unchecked status of a checkbox, and I post the above array to a WCF web service as a string using JSON.stringify.

The array contains 2,000 - 4,000 entries, and the user can check/uncheck the boxes.

Now suppose the array has 4,000 entries, of which 2,000 are checked and 2,000 are unchecked. In my web service I only process the checked entries; I delete entries whose selected value is false.

Now, because of the 4,000 entries, it is a huge JSON string, and because of this I get an error from the web service:

    Error: (413) Request Entity Too Large

The reason I am not filtering out the entries whose selected value is false on the client is that it would create a lot of overhead in the client's browser and might even hang it, so right now I am doing this on the server side.

So my question is: should I filter out the entries whose selected value is false on the client side and post only the 2,000 checked entries, or is what I am doing right?

My concern is that posting such a huge JSON string will take a long time, but filtering out the unselected entries will also add a lot of overhead in the browser.

So I'm not sure what I'm doing wrong or right.

Can someone please point me in the right direction?

+7
javascript, c#, web-services, asp.net-mvc, wcf
8 answers

Generally, when you start running into request-size problems like this, it is a good opportunity to look at optimization techniques rather than workarounds. Many answers suggest ways to get around the problem, but few suggest improving the design so that the problem goes away.

Here are a few possibilities:

Have you considered paging the request(s)? This lets you load data on the client asynchronously as needed, preventing overly large requests, improving the responsiveness of the site, and reducing memory pressure on both the client and the server. You can preload data as the user scrolls and, if the process takes too long, give the user some feedback so they know more data is loading.
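As a minimal sketch of the paging idea, splitting the entries into fixed-size pages means no single request grows too large. The page size and the "/api/cars" endpoint here are illustrative assumptions, not part of the original question:

```javascript
// Split an array into fixed-size pages so each request stays small.
function chunk(items, size) {
  var pages = [];
  for (var i = 0; i < items.length; i += size) {
    pages.push(items.slice(i, i + size));
  }
  return pages;
}

// Usage: post each page as its own request instead of one giant payload.
// chunk(cars, 500).forEach(function (page, i) {
//   postJson("/api/cars?page=" + i, JSON.stringify(page));
// });
```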

Have you considered shortening the property names to reduce the size of the serialized object itself? For example:

Current Model:

{ name: null, operation: 0, selected: false }

Simplified Model:

{ n: null, o: 0, s: false }

This approach makes the JSON itself harder to read, but JSON is meant for serializing data, not only for reading by people; you can compensate by documenting your model. This technique can reduce the amount of data sent by up to 30%.
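A sketch of that renaming step; the toShortModel helper and the sample data are illustrative, not part of the original code:

```javascript
// Map each entry to the short-key model before serializing.
// Document the key mapping so the server can translate it back:
// n = name, o = operation, s = selected.
function toShortModel(car) {
  return { n: car.name, o: car.operation, s: car.selected };
}

var cars = [
  { name: "sedan", operation: 1, selected: true },
  { name: "coupe", operation: 2, selected: false }
];

var longJson = JSON.stringify(cars);
var shortJson = JSON.stringify(cars.map(toShortModel));
// For large arrays, the short-key payload is noticeably smaller.
```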

I can't give a complete solution, because you have to ask yourself many hard questions about what you are trying to achieve, who will consume the data, and how best to get there.

Beyond that, I would think hard about why the process requires the user to interact with 2,000+ records at once. I am not trying to criticize, but you should look critically at the business process behind what you are trying to achieve, because there may be serious issues with repetitiveness and user fatigue that significantly affect how effective and useful your application is for the end user. For example, can the task be broken into smaller, less tedious chunks, so that the end user is not staring at 4,000 checkboxes for two hours?

This may not be the answer you are looking for, since it opens up many additional questions, but I hope it helps you start formulating the questions that will lead to the final answer.

+3

A quick fix is to increase the allowed content length on the server. It would look something like this:

    <configuration>
      <system.web>
        <httpRuntime maxRequestLength="2147483647" />
      </system.web>
      <system.webServer>
        <security>
          <requestFiltering>
            <requestLimits maxAllowedContentLength="2147483647" />
          </requestFiltering>
        </security>
      </system.webServer>
    </configuration>
+3

It is preferable to filter the data on the client side as long as you do not block the UI thread (lock up the browser). Posting too much data can cause problems depending on what you are trying to do and the end user's network speed. Sometimes, however, you simply need to post large data. I'm not sure how big these 4,000 entries are, but if it's just text, it shouldn't be too big.

Since your problem is that the WCF service is responding with a 413, that is where your maximum-size problem lies. According to the WCF documentation, the maximum message size that can be received is 65,536 bytes by default. That is clearly below what you are trying to send.

This is what you need to update in your WCF service. The example below uses 10 MB, but raise it to whatever makes sense for your data.

    <system.serviceModel>
      <bindings>
        <basicHttpBinding>
          <!-- Measured in bytes -->
          <binding maxReceivedMessageSize="10485760"> <!-- 10 MB -->
            <readerQuotas ... />
          </binding>
        </basicHttpBinding>
      </bindings>
    </system.serviceModel>

If you then start getting HTTP status code 404.13, you also need to update your web.config to allow larger request sizes; again, set it to whatever size makes the most sense for your application.

    <system.webServer>
      <!-- Add this section for request size -->
      <security>
        <requestFiltering>
          <!-- Measured in bytes -->
          <requestLimits maxAllowedContentLength="1073741824" /> <!-- 1 GB -->
        </requestFiltering>
      </security>
    </system.webServer>
    <system.web>
      <!-- Measured in kilobytes -->
      <httpRuntime maxRequestLength="1048576" /> <!-- 1 GB -->
    </system.web>
+3
  • Save the JSON value as a file
  • Upload the JSON file (giving it a unique file name)
  • Call the WCF method with the file name instead of the JSON value
  • Read the data from the file whose name was passed to the method
+1

You should filter on the client, since you do not know how slow the connection to the server may be. Sending unnecessary data is likely to be slower than filtering it out in the first place.

If the filtering hangs the browser, you can work around that with a Web Worker.
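A sketch of moving the filtering off the UI thread; the file name "filter-worker.js" is a hypothetical example. The pure filtering step (selectChecked) runs anywhere, while the Worker wiring is browser-only and therefore guarded:

```javascript
// Pure filtering step: keep only the checked entries.
function selectChecked(cars) {
  return cars.filter(function (car) { return car.selected; });
}

if (typeof Worker !== "undefined") {
  // Main thread: hand the raw array to the worker and post the filtered
  // result to the server once it comes back.
  var worker = new Worker("filter-worker.js");
  worker.onmessage = function (e) {
    // e.data is the filtered array, ready for JSON.stringify and POST
  };
  // worker.postMessage(cars);
}

// filter-worker.js would contain something like:
// self.onmessage = function (e) {
//   self.postMessage(e.data.filter(function (car) { return car.selected; }));
// };
```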

+1

If you are using the GET method, it fails because a URL is limited to roughly 2,083 characters (in Internet Explorer). Use the POST method instead and send the JSON object in the request body.
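A sketch of sending the payload in a POST body; "/Service.svc/SaveCars" is a placeholder endpoint, not from the original question:

```javascript
// The body of a POST request is not subject to the URL length limit,
// so the whole array can be serialized and sent as JSON.
function buildRequestBody(cars) {
  return JSON.stringify(cars);
}

// Usage in the browser:
// var xhr = new XMLHttpRequest();
// xhr.open("POST", "/Service.svc/SaveCars");
// xhr.setRequestHeader("Content-Type", "application/json");
// xhr.send(buildRequestBody(cars));
```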

+1

You can set maxRequestLength in the config. These settings worked for me to upload 750 MB.

    <system.web>
      <httpRuntime maxRequestLength="2097151" />
    </system.web>
    <system.webServer>
      <security>
        <requestFiltering>
          <requestLimits maxAllowedContentLength="2147483648" />
        </requestFiltering>
      </security>
    </system.webServer>

The maximum for maxRequestLength is 2097151; setting it higher causes an error. And MSDN says:

The default size is 4096 KB (4 MB).

+1

Client-side filtering should not create huge overhead, and it should keep the resulting JSON string within the request size limits.

With underscore.js it is quite simple to filter only the selected/checked items and send only the required operations:

    var filteredCars = _.where(cars, { selected: true });

    // an assumption that operation is a unique car id
    var operations = _.map(filteredCars, function (car) {
      return { id: car.operation };
    });

Please see this JSFiddle for filtering with an array of similar size.
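For reference, the same filtering works without underscore.js using the built-in Array methods; the sample data is illustrative, and the "operation is a unique id" assumption is carried over from above:

```javascript
var cars = [
  { name: "a", operation: 1, selected: true },
  { name: "b", operation: 2, selected: false },
  { name: "c", operation: 3, selected: true }
];

// Keep only the checked entries, then project down to the id to send.
var operations = cars
  .filter(function (car) { return car.selected; })
  .map(function (car) { return { id: car.operation }; });
// operations is [{ id: 1 }, { id: 3 }]
```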

That said, I doubt that presenting 4,000 checkboxes on a single screen is user-friendly.

+1
