When we talk about the C programming language, we should note that there is an ISO standard for it (originally ANSI C). Here is the latest public draft of C99 (ISO/IEC 9899:1999): www.open-std.org/jtc1/sc22/wg14/www/docs/n1124.pdf
It has a section called "5.1.1.2 Translation phases" that describes how a C program is analyzed and translated. The phases are:
... a few steps to handle multiple bytes, trigraphs, and backslashes ...
3). The source file is decomposed into preprocessing tokens and sequences of white-space characters (including comments).
This is the lexical analysis for preprocessing. Only preprocessing-level entities are recognized here: preprocessor directives, punctuators, string literals, identifiers, comments, and so on.
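As an illustration (my example, not text from the standard), here is how phase 3 decomposes one source line into preprocessing tokens:

```c
/* Phase 3 sketch: the #define line below is decomposed into the
   preprocessing tokens listed in the comment.  At this phase "define"
   and "MAX" are both plain identifier pp-tokens; keywords do not
   exist yet. */
#define MAX(a, b) ((a) > (b) ? (a) : (b))
/* pp-tokens: '#' 'define' 'MAX' '(' 'a' ',' 'b' ')'
   '(' '(' 'a' ')' '>' '(' 'b' ')' '?' '(' 'a' ')' ':' '(' 'b' ')' ')' */

int m = MAX(2, 3);   /* uses the macro so the file compiles on its own */
```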
4). Preprocessing directives are executed and macro invocations are expanded.
This is preprocessing proper. This phase also inserts the files named by #include, and then removes the preprocessing directives themselves (#define, #ifdef, and so on).
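A minimal sketch of what phase 4 produces (the names BUFFER_SIZE and SQUARE are mine; you can inspect this phase's output with `gcc -E` or `cc -E`):

```c
#define BUFFER_SIZE 64
#define SQUARE(x) ((x) * (x))

int buf[BUFFER_SIZE];   /* after phase 4: int buf[64];            */
int nine = SQUARE(3);   /* after phase 4: int nine = ((3) * (3)); */
/* The #define lines themselves are consumed in this phase; a #include
   directive would likewise be replaced here by the contents of the
   named file. */
```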
... handling string literals ...
7). White-space characters separating tokens are no longer significant. Each preprocessing token is converted into a token. The resulting tokens are syntactically and semantically analyzed and translated as a translation unit.
Converting a preprocessing token into a token means, among other things, recognizing keywords and classifying constants. This is the final lexical-analysis step, followed by syntactic and semantic analysis.
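A classic example (not from the standard's text, but a consequence of it) of why the distinction between preprocessing tokens and tokens matters: the pp-number grammar greedily absorbs 'E' followed by a sign, so `0xE+2` is a single preprocessing token that cannot be converted into any valid token in phase 7:

```c
int main(void)
{
    /* int bad = 0xE+2; */   /* would not compile: "0xE+2" is ONE
                                pp-number, and no valid number token
                                matches it in phase 7 */
    int good = 0xE + 2;      /* fine: three tokens 0xE, +, 2 -> 16 */
    return good;
}
```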
So your question is:
Is preprocessing done after lexical analysis and parsing?
To perform preprocessing, some lexical analysis is required, so the order is: lexical_for_preprocessor, preprocessing, true_lexical, other_analysis.
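A small demonstration (my own sketch) of why the preprocessor needs that first lexical pass: it must already recognize string literals and comments so that it does not expand macro names inside them:

```c
#include <stdio.h>

#define WORLD "planet"

int main(void)
{
    /* WORLD is not expanded in this comment: phase 3 already lexed
       the comment away before macro expansion ran in phase 4. */
    printf("hello WORLD\n");     /* prints "hello WORLD": the name inside
                                    the string literal is one string-literal
                                    pp-token, so phase 4 leaves it alone  */
    printf("hello %s\n", WORLD); /* prints "hello planet": here WORLD is
                                    an identifier pp-token and expands    */
    return 0;
}
```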
P.S. A real C compiler may be organized somewhat differently, but it must behave as if it followed the phases described in the standard.