Handling large files in C

I need to analyze a file whose size may be large. I would like to do this in C. Can anyone suggest any methods for this?

The file I need to open and parse is a hard drive dump taken from my Mac's hard drive, but I plan to run my program on 64-bit Ubuntu 10.04. Also, given the large file size, the more optimized the method, the better.

+5
5 answers

Both *nix and Windows have extensions to their I/O routines that support file sizes larger than 2 GB or 4 GB. Naturally, the underlying file system must also support large files: on Windows, NTFS does, but FAT, for example, does not. This is commonly known as "large file support".

The two routines that are most important for this are fseek() and ftell(), so that you can do random access anywhere in the file. Otherwise, the ordinary fopen(), fread(), and friends can access a file of any size sequentially, as long as the underlying OS and stdio implementation support large files.
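
As a rough sketch of that idea (not the answer's exact code), the snippet below seeks around a large dump using the POSIX fseeko()/ftello() variants, which take an off_t instead of a long; the file name "drive.img" and the 5 GiB offset are placeholders:

    /* Sketch: random access into a large file with the off_t-based
     * fseeko()/ftello() (POSIX).  _FILE_OFFSET_BITS=64 makes off_t
     * 64 bits even on a 32-bit build (see the answer further down).
     * The file name is a placeholder. */
    #define _FILE_OFFSET_BITS 64
    #include <stdio.h>
    #include <sys/types.h>

    int main(void)
    {
        FILE *f = fopen("drive.img", "rb");   /* hypothetical dump file */
        if (!f) { perror("fopen"); return 1; }

        /* Find the total size: seek to the end and ask for the offset. */
        if (fseeko(f, 0, SEEK_END) != 0) { perror("fseeko"); return 1; }
        off_t size = ftello(f);
        printf("file size: %lld bytes\n", (long long)size);

        /* Jump to an arbitrary position beyond 4 GB and read one block. */
        unsigned char block[512];
        if (fseeko(f, (off_t)5 * 1024 * 1024 * 1024, SEEK_SET) == 0 &&
            fread(block, 1, sizeof block, f) == sizeof block) {
            /* ... parse the block ... */
        }

        fclose(f);
        return 0;
    }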

+4

If you're on 64-bit linux/bsd/mac/notwindows (and really, why wouldn't you be?), mmap is the highest-performance way to access the file. The kernel takes care of paging the data in/out of memory as needed.

Note that you really do want a 64-bit machine for this: a 64-bit address space is ABSOLUTELY required, since a 32-bit process only has ~4 GB of address space to map the file into.
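
A minimal sketch of this approach, assuming a POSIX system with mmap(2); the file name "drive.img" is a placeholder, and the checksum loop just stands in for whatever parsing you actually do:

    /* Sketch: map a large dump file read-only with mmap(2) on a
     * 64-bit POSIX system.  The file name is a placeholder, and an
     * empty file is not handled. */
    #include <stdio.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <sys/types.h>

    int main(void)
    {
        int fd = open("drive.img", O_RDONLY);   /* hypothetical dump file */
        if (fd < 0) { perror("open"); return 1; }

        struct stat st;
        if (fstat(fd, &st) != 0) { perror("fstat"); return 1; }

        /* Map the whole file; the kernel pages it in and out on demand. */
        const unsigned char *data =
            mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        if (data == MAP_FAILED) { perror("mmap"); return 1; }

        /* The file now looks like one big array of st.st_size bytes. */
        unsigned long long checksum = 0;
        for (off_t i = 0; i < st.st_size; i++)
            checksum += data[i];
        printf("checksum: %llu\n", checksum);

        munmap((void *)data, st.st_size);
        close(fd);
        return 0;
    }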

+1

RBerteig, replying to Matt:

On a 64-bit system (with a 64-bit build of the program), the file APIs already use types that are wide enough, so this largely just works; off_t is already 64 bits there.

Also, if you're writing C99, don't assume that int or long can hold a file size/offset; use int64_t (or int_fast64_t, etc.).
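
For illustration only, a tiny sketch of keeping sizes/offsets in explicit 64-bit types with C99's <stdint.h> and <inttypes.h>; the struct region type and its values are hypothetical:

    /* Sketch: carry file sizes/offsets in explicit 64-bit types
     * instead of assuming int or long is wide enough. */
    #include <stdint.h>
    #include <inttypes.h>
    #include <stdio.h>

    /* Hypothetical record locator inside the dump. */
    struct region {
        int64_t offset;   /* byte offset into the dump */
        int64_t length;   /* region length in bytes    */
    };

    int main(void)
    {
        struct region r = { INT64_C(6) * 1024 * 1024 * 1024, 4096 };
        printf("region at %" PRId64 ", %" PRId64 " bytes\n",
               r.offset, r.length);
        return 0;
    }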

+1

Compile with -D_FILE_OFFSET_BITS=64, or #define _FILE_OFFSET_BITS 64 before including any system headers (see the man pages for details). This makes off_t a 64-bit type and transparently switches the file API over to the 64-bit versions.
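
A minimal sketch of the second form, assuming glibc on Linux; the printf is only there to show that off_t ends up 8 bytes wide:

    /* Sketch: enabling glibc large-file support.  Either pass
     * -D_FILE_OFFSET_BITS=64 to the compiler, or define the macro
     * before the first system header, as done here. */
    #define _FILE_OFFSET_BITS 64

    #include <stdio.h>
    #include <sys/types.h>

    int main(void)
    {
        /* With the macro set, off_t is 64 bits even on a 32-bit build,
         * and the stdio/POSIX file calls use their 64-bit variants. */
        printf("sizeof(off_t) = %zu\n", sizeof(off_t));
        return 0;
    }

On a 32-bit build this prints 8 instead of 4; on 64-bit Ubuntu, off_t is 64 bits either way, but defining the macro does no harm.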

+1

. , , , , "" .

Do you want all the data in memory at the same time?
If not, one way is to write parts of the file back to disk in temporary files while they are not in use. A simple fread()/fwrite() scheme and some clever on-demand bookkeeping can do this.
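
A rough sketch of the chunk-at-a-time reading part of that idea, assuming a placeholder file name and an arbitrary 1 MiB chunk size; the temporary-file juggling is left out:

    /* Sketch: stream through a large dump in fixed-size chunks with
     * fread(), so only one small buffer is resident at a time.
     * File name and chunk size are placeholders. */
    #include <stdio.h>
    #include <stdlib.h>

    enum { CHUNK = 1 << 20 };   /* 1 MiB per read */

    int main(void)
    {
        FILE *f = fopen("drive.img", "rb");   /* hypothetical dump file */
        if (!f) { perror("fopen"); return 1; }

        unsigned char *buf = malloc(CHUNK);
        if (!buf) { fclose(f); return 1; }

        size_t n;
        unsigned long long total = 0;
        while ((n = fread(buf, 1, CHUNK, f)) > 0) {
            /* ... parse the n bytes in buf here ... */
            total += n;
        }
        printf("processed %llu bytes\n", total);

        free(buf);
        fclose(f);
        return 0;
    }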

0
