Recently, I noticed a strange case that I would like to check:
According to SUS, for %n in the format string, the corresponding int argument is set to the number of bytes written to the output so far. Also, for snprintf(dest, 3, "abcd"), dest will end up containing "ab\0". Why? Because no more than n (here n = 3) bytes may be written to the output (the dest buffer).
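For example, this is what I mean by the truncation (a minimal sketch of my own, not taken from the specification):

    #include <stdio.h>

    int main(void)
    {
        char dest[3];
        /* At most sizeof dest - 1 = 2 characters fit, plus the terminating '\0'. */
        int ret = snprintf(dest, sizeof dest, "abcd");
        /* Expected: dest = "ab", ret = 4 (the length the full output would have had). */
        printf("dest = \"%s\", return value = %d\n", dest, ret);
        return 0;
    }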
I therefore expected that, for the code:
    int written;
    char dest[3];
    snprintf(dest, 3, "abcde%n", &written);
written would be set to 2 (the terminating null byte is not counted). But in the test I ran with GCC 4.8.1, written was set to 5. Am I misinterpreting the standard? Is this a bug? Is this behavior undefined?
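For reference, a self-contained version of my test looks roughly like this:

    #include <stdio.h>

    int main(void)
    {
        int written = -1;
        char dest[3];
        /* The format would produce "abcde" (5 characters) before %n,
           but only 2 characters plus '\0' fit into dest. */
        snprintf(dest, sizeof dest, "abcde%n", &written);
        /* With GCC 4.8.1 this prints: dest = "ab", written = 5 */
        printf("dest = \"%s\", written = %d\n", dest, written);
        return 0;
    }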
Edit:
@wildplasser said:
... the behavior of %n in a format string may be undefined or implementation-defined ...
and
... the implementation should simulate the processing of the full format string (including %n) ...
@par said:
written is 5 because that is the number of characters that would have been written at the point where %n occurs. This is the correct behavior. snprintf copies at most size characters, minus the trailing null ...
and
Another way to look at this is that %n would never even be reached if processing stopped after two characters, so you could not expect written to hold a valid value at all ...
and
... the entire format string is processed according to printf() rules, and then the maximum length is applied ...
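If that reading is correct, I would expect the value stored through %n to match snprintf's return value (the number of characters that would have been written had the buffer been big enough). A quick sketch of that check:

    #include <stdio.h>

    int main(void)
    {
        int written = -1;
        char dest[3];
        /* If the whole format string is processed first and the size limit only
           restricts what is copied into dest, then the return value and the
           value stored through %n should both be 5. */
        int ret = snprintf(dest, sizeof dest, "abcde%n", &written);
        printf("return value = %d, written = %d\n", ret, written);
        return 0;
    }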
Can anyone confirm whether this is the standard behavior, with a reference to the standard or some other official source?
c gcc c99