I have a large XML file that I want to load into MySQL. It's about 20 GB uncompressed, but I think I can compress it to about 25% of its original size and then load it into a compressed table.
I know that I can compress the data in the database itself, but can MySQL read a compressed file during the bulk upload process?
Edit: by compressed, I don't mean a .tar.gz file or anything like that. I mean that when I create the file in Java or C++, I write it out through a gzip stream, so the file is still a .csv or .xml and keeps the correct structure, but its contents (each line) are compressed.
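For illustration, this is roughly how I produce such a file in Java (the file name and column values here are just placeholders):

```java
import java.io.BufferedWriter;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class GzipCsvWriter {
    public static void main(String[] args) throws IOException {
        // The file on disk is gzip-compressed, but logically it is still a CSV:
        // decompressing it gives back a normal .csv, one row per line.
        try (BufferedWriter out = new BufferedWriter(new OutputStreamWriter(
                new GZIPOutputStream(new FileOutputStream("data.csv.gz")),
                StandardCharsets.UTF_8))) {
            out.write("id,name,value");
            out.newLine();
            out.write("1,foo,42");
            out.newLine();
            // ... millions more rows ...
        }
    }
}
```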
If that is not possible, can I still do something like a bulk upload, but somehow filter the file through a program that decompresses the contents? My idea was to open the file in C and decompress it while loading it into MySQL. The problem is that I want to do this as a bulk insert, not as millions of individual inserts.
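Something along these lines is what I have in mind, shown here as a rough Java sketch rather than C (the connection URL, credentials, table and file names are placeholders, and it assumes MySQL Connector/J is on the classpath and LOCAL INFILE is allowed on the server):

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.util.zip.GZIPInputStream;

public class GzipBulkLoad {
    public static void main(String[] args) throws Exception {
        // 1. Stream-decompress the gzip'd CSV into a temporary plain file.
        Path plainCsv = Files.createTempFile("bulkload", ".csv");
        try (InputStream in = new GZIPInputStream(
                Files.newInputStream(Paths.get("data.csv.gz")))) {
            Files.copy(in, plainCsv, StandardCopyOption.REPLACE_EXISTING);
        }

        // 2. Hand the plain file to MySQL in a single bulk statement.
        String url = "jdbc:mysql://localhost:3306/test?allowLoadLocalInfile=true";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement()) {
            stmt.execute("LOAD DATA LOCAL INFILE '"
                    + plainCsv.toAbsolutePath().toString().replace("\\", "/")
                    + "' INTO TABLE my_table"
                    + " FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'");
        } finally {
            Files.deleteIfExists(plainCsv);
        }
    }
}
```

The load itself stays a single bulk statement, but this way I still need the full 20 GB of temporary disk space for the decompressed copy, which is what I was hoping to avoid.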