Practical limits on the amount of constexpr computation

As an experiment, I just put together code to generate a std::array<uint32_t, 256> at compile time. The contents are a fairly typical CRC lookup table; the novelty is using constexpr functions to compute the entries, rather than pasting an externally generated magic table directly into the source code.
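
Roughly, the kind of code I mean (a minimal sketch, assuming C++17 so that std::array's non-const operator[] is usable in a constexpr function):

#include <array>
#include <cstdint>

// Build the standard CRC-32 table (polynomial 0xEDB88320) at compile time.
constexpr std::array<std::uint32_t, 256> make_crc32_table() {
    std::array<std::uint32_t, 256> table{};
    for (std::uint32_t i = 0; i < 256; ++i) {
        std::uint32_t crc = i;
        for (int bit = 0; bit < 8; ++bit)
            crc = (crc & 1u) ? (crc >> 1) ^ 0xEDB88320u : (crc >> 1);
        table[i] = crc;
    }
    return table;
}

// Declaring the variable constexpr forces evaluation during compilation.
constexpr auto crc32_table = make_crc32_table();
static_assert(crc32_table[1] == 0x77073096u, "spot-check against known value");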

In any case, this exercise piqued my curiosity: are there any practical limits on the amount of computation a compiler is willing to do when evaluating a constexpr function or variable definition at compile time? For example, something like GCC's -ftemplate-depth parameter, which places a practical limit on how much template metaprogramming evaluation can happen. (I also wonder whether there are practical limits on the length of a parameter pack, which would cap the size of a compile-time std::array built via an intermediate std::integer_sequence; a sketch of that approach follows.)
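
For completeness, a hypothetical C++14 sketch of the std::integer_sequence route; the 256-element pack expanded from std::make_index_sequence<256> is exactly the kind of thing a pack-length limit would constrain:

#include <array>
#include <cstdint>
#include <utility>

// Computes one CRC-32 table entry (polynomial 0xEDB88320).
constexpr std::uint32_t crc32_entry(std::uint32_t i) {
    std::uint32_t crc = i;
    for (int bit = 0; bit < 8; ++bit)
        crc = (crc & 1u) ? (crc >> 1) ^ 0xEDB88320u : (crc >> 1);
    return crc;
}

// Expands the index pack Is... into the array's initializer list.
template <std::size_t... Is>
constexpr std::array<std::uint32_t, sizeof...(Is)>
make_table(std::index_sequence<Is...>) {
    return {{ crc32_entry(Is)... }};
}

constexpr auto crc32_table = make_table(std::make_index_sequence<256>{});
static_assert(crc32_table[1] == 0x77073096u, "spot-check against known value");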

1 answer

Recommendations for these can be found in [implimits] ¶2:

(2.35) - Recursive constexpr function invocations [512]

(2.36) - Full-expressions evaluated within a core constant expression [1,048,576]

GCC and Clang let you adjust the recursion depth via -fconstexpr-depth (which is the flag you were looking for).
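
For example, a deliberately deep recursion trips the default limit and compiles once the limit is raised (a hypothetical demo; the flag and the 512 default are real, the rest is illustrative):

// demo.cpp: 600 nested calls exceed the default limit of 512 recursive
// constexpr calls, so compilation fails until the limit is raised, e.g.:
//   g++     -std=c++14 -fconstexpr-depth=1024 demo.cpp
//   clang++ -std=c++14 -fconstexpr-depth=1024 demo.cpp
constexpr int count_down(int n) {
    return n == 0 ? 0 : count_down(n - 1);
}

constexpr int r = count_down(600);  // 600 levels of constexpr recursion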

Constant expression evaluation is effectively performed in a sandbox, because undefined behavior must be caught by the implementation. With that in mind, I don't see why an implementation couldn't use all of the host machine's resources. Then again, I wouldn't recommend writing programs whose compilation requires a gigabyte of memory or other unreasonable resources...
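
To illustrate the sandboxing (an illustrative example of mine): signed integer overflow is undefined behavior, so an implementation has to reject it when it occurs during constant evaluation:

#include <climits>

constexpr int increment(int x) { return x + 1; }

constexpr int fine = increment(41);          // OK: evaluates to 42
// constexpr int bad = increment(INT_MAX);   // ill-formed: the overflow is UB,
                                             // so this is not a constant
                                             // expression and must be rejected
static_assert(fine == 42, "");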
