Uploading a file in parts using XMLHttpRequest and digest auth

I am trying to upload a local mp4/movie file to a server with XMLHttpRequest(). This is written in React Native, and in the final version the user should be able to upload large files from their iOS/Android device to the server.

My test file is about 1 GB (real files could range from 100 MB to 10 GB).

With files below 70 MB I could easily use the code below to load the file as a blob, slice it, and PUT the parts on the server using https://github.com/inorganik/digest-auth-request:

sendRequest = new digestAuthRequest('PUT', uri, usernameVal, passwordVal);
sendRequest.request(function (data) {
  let oReq = new XMLHttpRequest();
  oReq.open("GET", obj['path'], true);
  oReq.responseType = "blob";
  oReq.onload = function (oEvent) {
    let blob = oReq.response;
    // ... slice the blob and upload the parts (see sendBlobPart below)
  };
  oReq.send();
}, mainThis.fileUploadErrorHandler);

Here is the sendBlobPart function, stripped of all callback handling (not important here):

function sendBlobPart() {
  sendRequest = new digestAuthRequest('PUT', uri, usernameVal, passwordVal);
  sendRequest.request(function (data) {
    // success handling stripped
  }, mainThis.fileUploadErrorHandler, blob.slice(startB, endB), true);
}
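The startB/endB values used above can be computed with a small helper (a hypothetical sketch, not part of the original code) that splits a file size into [start, end) byte ranges for blob.slice:

```javascript
// Hypothetical helper: compute the [start, end) byte ranges that
// blob.slice(startB, endB) is called with, for a given file size.
function chunkRanges(fileSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, fileSize)]);
  }
  return ranges;
}

// e.g. a 1 MiB file in 256 KiB chunks:
// chunkRanges(1048576, 262144)
// -> [[0,262144],[262144,524288],[524288,786432],[786432,1048576]]
```

The last range is clamped to the file size, so the final chunk may be shorter than the rest.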

I am reconstructing the file on the server from all the parts, and this works just fine with the blob created by XMLHttpRequest(). However, XMLHttpRequest seems to load the whole file into memory, so when I try to load a 1 GB file as a blob I get an out-of-memory error. I looked for several solutions, but nothing has worked so far. I then found a promising feature of https://github.com/joltup/react-native-fetch-blob:

Creating a Blob from the 1 GB file now takes about 100 ms and doesn’t consume memory:

const Blob = RNFetchBlob.polyfill.Blob;
let newBlob = new Blob(RNFetchBlob.wrap(obj['path']), { type: obj['mime'] });

This, however, has a challenge: the new blob has a different structure than the previous one (the one from XMLHttpRequest), where I was able to simply pass the sliced blob as the request data and it worked fine. Now, no matter what I do with the new blob, I cannot send any data to the server: all requests appear to succeed, but 0 bytes are received on the server end.


Are there any hints/solutions or ideas on how to better approach this problem?

Below is the structure of both blobs; I can see they are quite different. (I used a 1.2 MB file for this example so the XMLHttpRequest version works and doesn’t crash as it does with my 1 GB target file.)

[Image: structure of both blobs]

While I thought about using readStream and similar solutions, I couldn’t find a way to make the stream pause while an upload finishes, or to start reading from, say, 50% of the file.

I have a couple of ideas. What happens if you wait for the blob slice to be created before making the upload call?

const firstBlob = await new Promise(resolve => {
  newBlob.slice(0, 262144).onCreated(resolve);
});

// send request with firstBlob

const secondBlob = await new Promise(resolve => {
  newBlob.slice(262144, 262144 * 2).onCreated(resolve);
});

// etc.
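The pattern above can be generalized into a loop that uploads each slice only after the previous one has finished. Here is a sketch with the slice-creation and upload steps injected as functions; `createSlice` and `upload` are placeholder names (not part of any library) that would wrap `newBlob.slice(...).onCreated(...)` and the digest-auth PUT respectively:

```javascript
const CHUNK = 262144; // 256 KiB per part

// Sequentially upload a file of `total` bytes in CHUNK-sized parts.
// `createSlice(start, end)` and `upload(part, start, end)` are injected
// placeholders for the react-native specific calls.
async function uploadSequentially(total, createSlice, upload) {
  for (let start = 0; start < total; start += CHUNK) {
    const end = Math.min(start + CHUNK, total);
    const part = await createSlice(start, end); // e.g. wait for onCreated
    await upload(part, start, end);             // e.g. digest-auth PUT
  }
}
```

Because each `await` completes before the next slice is created, only one chunk is in flight (and in memory) at a time.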

It looks like the rn-fetch-blob package has an XHR polyfill that honors the onCreated callback. Using that XHR instead could also work:

const XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;

I’m a little confused by your code.

As you said, ‘User would be able to upload any big files from his/her IOS/Android device to the server’. In that situation, you are supposed to get the File instance from a component like <input type="file"> rather than fetching the file from a server, storing it in a blob, and then uploading it. A Blob loads all of its data into memory at once when the data comes from a server, which is what causes the memory error.

If the big file the user wants to upload must first be downloaded from a server, you should download it chunk by chunk (this requires some changes to the backend download code) and upload it the same way.
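Chunked download can be done with standard HTTP Range requests, assuming the server supports them; the URL and chunk boundaries below are illustrative:

```javascript
// Build the HTTP Range header for bytes [start, end). The Range header
// is inclusive on both ends, so the last byte index is end - 1.
function rangeHeader(start, end) {
  return 'bytes=' + start + '-' + (end - 1);
}

// Sketch: fetch one chunk of a remote file. A server that honors the
// Range header answers with status 206 (Partial Content).
async function downloadChunk(url, start, end) {
  const res = await fetch(url, { headers: { Range: rangeHeader(start, end) } });
  if (res.status !== 206) throw new Error('server ignored Range request');
  return res.arrayBuffer();
}
```

Repeating `downloadChunk` over successive ranges gives the chunk-by-chunk download, and each chunk can be uploaded before the next is fetched.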


If you get the file from <input type="file"> and use File.slice to obtain chunks, it won’t cause memory errors, so do it the way you already did.
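File inherits slice from Blob, and slice returns a lightweight view rather than a copy, which is why this approach is memory-friendly. A minimal sketch of iterating a picked file in 256 KiB slices (`blobSlices` is a hypothetical helper name):

```javascript
// Sketch: iterate over a File/Blob in fixed-size slices.
// slice() returns a view of the byte range; no bytes are copied up front.
function* blobSlices(file, chunkSize = 262144) {
  for (let start = 0; start < file.size; start += chunkSize) {
    yield file.slice(start, Math.min(start + chunkSize, file.size));
  }
}
```

Each yielded slice can then be handed to the upload request one at a time.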

