I looked at the size of the Brainfuck compiler (~240 bytes in its original form), and I doubt you will get smaller than that; it was deliberately designed to be the smallest possible compiler (admittedly, many years ago).
However, from Wikipedia:
Except for its two I/O commands, brainfuck is a minor variation of the formal programming language P′′ created by Corrado Böhm in 1964. In fact, using six symbols equivalent to the respective brainfuck commands +, -, <, >, [, ], Böhm provided an explicit program for each of the basic functions that together serve to compute any computable function. So, in a very real sense, the first "brainfuck" programs appear in Böhm's 1964 paper, and they were programs sufficient to prove Turing-completeness.
From the P′′ page:
P′′ was the first GOTO-less imperative structured programming language proved to be Turing-complete.
Thus, a compiler for P′′, or for an equivalently stripped-down variant of brainfuck, would be even smaller and still Turing-complete.
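To make the idea concrete, here is a rough sketch (nowhere near the ~240-byte original, and purely illustrative) of how small a brainfuck compiler can be: each of the eight commands translates to one fixed C fragment, so the whole "compiler" is little more than a lookup table.

```python
# Illustrative sketch: a brainfuck-to-C translator.
# Each brainfuck command maps to one C statement; everything else is a comment.
OPS = {
    '+': '(*p)++;',      '-': '(*p)--;',
    '>': 'p++;',         '<': 'p--;',
    '.': 'putchar(*p);', ',': '*p=getchar();',
    '[': 'while(*p){',   ']': '}',
}

def bf_to_c(src):
    """Translate brainfuck source into a complete C program (as a string)."""
    body = ''.join(OPS[c] for c in src if c in OPS)
    return ('#include <stdio.h>\n'
            'char t[30000], *p = t;\n'
            'int main(void){' + body + 'return 0;}')
```

Feeding the resulting string to a C compiler finishes the job; the translator itself stays tiny precisely because brainfuck has no syntax beyond matching brackets.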
However, if I ignore the spirit of the question: the machine's own instruction set is already Turing-complete. An assembler would probably be too large, but you could write the opcode values directly, either into an executable file or into a text file that is then "compiled" into an executable. That "compiler" would probably be smaller still. It is not a compiler in any real sense of the word, though, so again it doesn't follow the spirit of the question.
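Such a "compiler" amounts to nothing more than turning hex text into raw bytes. A hedged sketch (the file format here, whitespace-separated hex byte values, is my own assumption, not anything from the question):

```python
# Sketch of the "compiler that isn't really a compiler":
# read whitespace-separated hex byte values and emit them verbatim as binary.
def assemble(text):
    """Convert e.g. 'C3 90' into the bytes b'\\xc3\\x90'."""
    return bytes(int(tok, 16) for tok in text.split())

# Usage: write the result straight to an output file, e.g.
#   open('a.out', 'wb').write(assemble(open('prog.hex').read()))
# (making it executable is up to the OS and is omitted here).
```

All the actual work, choosing valid opcodes, is done by the human writing the text file, which is exactly why this doesn't deserve the name "compiler".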
Is this a real-world question? If you don't have room for a compiler, where will your sources and binaries go?
Related question: What is, *conceptually*, the smallest *compiler* that can compile itself?