C++: loading a lot of data at compile time

I have a C++ object that takes a huge amount of data to create. For instance:

 class object {
 public:
     object() {
         double a[] = { /* array with 1 million double elements */ };
         // rest of code here
     }
 private:
     // code here
 };

Now the data (i.e. the 1 million doubles) lives in a separate text file. The question is: how can I put it after "double a[] =" in an efficient way and then compile the code? I do not want to read the data from a file at runtime; I want it compiled into the object. What could the solution be? Ideally, the data would stay in a separate text file, as it is now, and the code would still contain an assignment like "double a[] = ..." as above.

Is it possible? Thanks in advance!

+4
3 answers

Sort of:

 class object {
 public:
     object() {
         double a[] = {
             #include "file.h"
         };
         // rest of code here
     }
 private:
     // code here
 };

The file must be formatted correctly, i.e. contain something like:

 // file.h
 23, 24, 40, 5, 1.1,

In general, you can use #include directives to insert content into files. I have seen virtual method implementations included this way when they were common to most derived classes. Personally, I do not like the technique.
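
For what it is worth, here is a minimal compilable sketch of the same trick applied at namespace scope ("data.inc" is an assumed name for the generated comma-separated file); this also keeps the big array out of the constructor's stack frame:

 #include <iostream>

 // A namespace-scope static const array is baked into the binary instead
 // of being rebuilt on the stack every time a constructor runs.
 static const double a[] = {
     #include "data.inc"   // assumed name; contains lines like "23, 24, 40,"
 };

 int main() {
     std::cout << "loaded " << sizeof(a) / sizeof(a[0]) << " doubles\n";
 }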

+9

Luchian Grigore's answer is quite correct. But the compiler may have a restriction on the maximum length of a source line; see https://stackoverflow.com/questions/235919/source-line-length-limit

So test this with your compiler. But I am afraid the simpler solution to your problem is to read the huge data set from a file at runtime.
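
For comparison, a sketch of the runtime-loading alternative (the whitespace-separated text format and the function name are assumptions):

 #include <fstream>
 #include <iterator>
 #include <vector>

 // Reads whitespace-separated doubles from a text file into a vector.
 std::vector<double> load_doubles(const char* path) {
     std::ifstream in(path);
     return std::vector<double>(std::istream_iterator<double>(in),
                                std::istream_iterator<double>());
 }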

0

One big problem with this design is that 1 million ints on the stack is likely to blow the stack. What you most likely want is to put the data in a data segment, or in some resource that is stored in your binary and can be loaded at runtime. If you need more than one copy of the data, duplicate it at runtime into a std::vector so that you know the data lives in free storage (the heap). Maybe even use a shared_ptr to a std::array to reduce the likelihood of unneeded accidental duplication (or a unique_ptr to reduce the likelihood of duplicate references).
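
A sketch of that last idea (the size constant and all names are illustrative assumptions):

 #include <array>
 #include <cstddef>
 #include <memory>
 #include <utility>

 constexpr std::size_t N = 1000000;        // illustrative size
 using Table = std::array<double, N>;

 // One heap-allocated table, shared by every object that needs it.
 std::shared_ptr<const Table> make_table() {
     auto t = std::make_shared<Table>();   // free store, not the stack
     // ... fill *t from the compiled-in data ...
     return t;
 }

 class object {
 public:
     explicit object(std::shared_ptr<const Table> t) : table_(std::move(t)) {}
 private:
     std::shared_ptr<const Table> table_;  // shared, never silently copied
 };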

4 MB of data is not going to play all that nicely, is all I am saying. And the locality of reference between the 4 MB array and your other variables will not be your biggest problem.

Depending on your compiler, target platform and framework, there will be ways to pack this kind of data into a binary resource. I have never done it for a multi-megabyte file, but here is the Visual Studio help on resource files: http://msdn.microsoft.com/en-us/library/7zxb70x7%28v=vs.80%29.aspx
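
On Windows, reading such an embedded resource back might look roughly like this (the "BINDATA" resource type and the IDR_DOUBLES ID are assumptions for the sketch, and the doubles would have to be embedded in binary form):

 #include <windows.h>

 #define IDR_DOUBLES 101  // hypothetical resource ID, normally in resource.h

 // Returns a pointer to doubles embedded as a custom "BINDATA" resource,
 // or nullptr on failure; *count receives the number of doubles.
 const double* load_embedded(DWORD* count) {
     HRSRC res = FindResource(nullptr, MAKEINTRESOURCE(IDR_DOUBLES), TEXT("BINDATA"));
     if (!res) return nullptr;
     HGLOBAL handle = LoadResource(nullptr, res);
     if (!handle) return nullptr;
     *count = SizeofResource(nullptr, res) / sizeof(double);
     return static_cast<const double*>(LockResource(handle));
 }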

Note that putting the "data in the code" does not significantly speed up loading (except perhaps saving one trip through the file system to find the file). The OS still has to load the binary, larger binaries take longer to load, and a large array of values takes up roughly as much space inside the binary as it does in a separate file. The real advantage is that the data is not a separate file that can go "missing" relative to your executable, though resource-fork style mechanisms can handle that as well.

As noted in the comments below, static const data (and global data) is typically placed in a data segment, which is distinct from both the heap (aka free store) and the stack (aka automatic storage). I forget what the standard calls it. I do know that a local static variable in a function behaves differently from a static or global non-local variable with respect to initialization order: global (static or not) data is fully initialized before main() starts, while a local static is initialized the first time the function is called, if I remember correctly.
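
To illustrate that initialization-order difference (a sketch; the names are assumptions):

 #include <vector>

 // Namespace-scope data: initialized before main() begins.
 static const std::vector<double> global_table = { 1.0, 2.0, 3.0 };

 // Function-local static: initialized the first time the function is
 // called (and thread-safe since C++11).
 const std::vector<double>& lazy_table() {
     static const std::vector<double> t = { 1.0, 2.0, 3.0 };
     return t;
 }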

0
