I want to send a large base64 string (multi-megabyte) from a JavaScript client to our MVC server. We currently use Ajax and send the string as follows:
    Upload = function (aUrl, aFormData, aOnProgress, aOnSuccess, aOnTimeout, aOnError) {
        var settings = {
            url: aUrl,
            type: "POST",
            contentType: "application/json; charset=utf-8",
            data: JSON.stringify(aFormData), // { aData: longBase64String }
            success: function (resultObject, textStatus) {
                if (textStatus == "success" && aOnSuccess)
                    aOnSuccess(resultObject);
            },
            error: function (jqXHR, textStatus, errorThrown) {
                if (textStatus == "timeout" && aOnTimeout)
                    aOnTimeout();
                else if (textStatus == "error" && aOnError)
                    aOnError();
            }
        };
        settings.xhr = function () {
            var req = new XMLHttpRequest();                                   // Create the XMLHttpRequest ourselves.
            if (req.upload)
                req.upload.addEventListener('progress', aOnProgress, false);  // Upload progress fires on req.upload, not on req.
            return req;
        };
        return $.ajax(settings);
    };
aFormData is just a JSON object holding the base64 string: { aData: myString }.
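For reference, a call site looks roughly like this (the URL, the payload variable, and the callback bodies are illustrative placeholders, not our exact code):

    // Illustrative only: longBase64String stands in for the multi-megabyte payload,
    // e.g. the part of canvas.toDataURL(...) after the "data:...;base64," prefix.
    var longBase64String = "...";

    Upload(
        "/Image/SetImage",                    // assumed route for the SetImage action
        { aData: longBase64String },          // matches the String aData parameter on the controller
        function (e) {                        // aOnProgress: upload progress events
            if (e.lengthComputable)
                console.log(Math.round(100 * e.loaded / e.total) + "% uploaded");
        },
        function (result) {                   // aOnSuccess
            console.log("Upload finished", result);
        },
        function () { console.log("Upload timed out"); },   // aOnTimeout
        function () { console.log("Upload failed"); }       // aOnError
    );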
On the MVC side, we have this code for the controller:
    [Authorize, RequireHttp, HttpPost]
    public ActionResult SetImage(String aData)
    {
        // ...
    }
Now, getting the data onto the server seems to work in our localhost test environment. However, I am worried that the upload is not really "streaming" in a nice asynchronous, buffered way, but instead blocks everything until the whole payload has been sent. I cannot see the real effect on the client because everything runs against localhost, and I just don't know enough about what is going on under the hood.
I want the user to be able to keep using the web application while the upload runs, without anything blocking. Should my content type be different, or my options on the MVC side? I am hoping the data gets sent in chunks.
Even if my approach is correct, can someone enlighten me a little about what is going on behind the scenes? I have seen several posts that use a content type containing "multipart". What is that, and should I use it here?
Thanks.