Python requests: upload a large file with additional form data

I was looking for a way to upload a large file together with additional form data, but there seems to be no built-in solution. To upload the file I used this code, and it worked perfectly with small files:

    with open("my_file.csv", "rb") as f:
        files = {"documents": ("my_file.csv", f, "application/octet-stream")}
        data = {"composite": "NONE"}
        headers = {"Prefer": "respond-async"}
        resp = session.post("my/url", headers=headers, data=data, files=files)

The problem is that requests reads the whole file into memory before sending it, so I run into a MemoryError with large files. I looked around, and the way to stream the data seems to be:

 resp = session.post("my/url", headers=headers, data=f) 

but I need to add {"composite": "NONE"} to the form data as well, otherwise the server does not recognize the file.

1 answer

You can use requests-toolbelt to do this:

    import requests
    from requests_toolbelt.multipart import encoder

    session = requests.Session()
    with open('my_file.csv', 'rb') as f:
        form = encoder.MultipartEncoder({
            "documents": ("my_file.csv", f, "application/octet-stream"),
            "composite": "NONE",
        })
        headers = {"Prefer": "respond-async", "Content-Type": form.content_type}
        resp = session.post(url, headers=headers, data=form)

This makes requests stream the upload as multipart/form-data instead of loading the whole file into memory first.

