In the spirit of “maybe it's not too late to add my 2 cents”, and along the lines of @Alvin’s answer, here’s what I think: if your application is meant to live for several years, it is going to see a lot of changes in how applications and systems work.
For example, suppose you had thought about this 10 years ago. I was watching Dexter back then, but you probably have your own memories of what it was like. From what I can tell, multithreading wasn't much of a concern for developers in 2000, and now it is, because Moore's law (in the single-core sense) broke down on them. Before that, people hadn't even bothered to think about what would happen at Y2K.
Speaking of Moore’s law, processors really are getting quite fast, so some optimizations may simply stop being necessary. And the range of hardware optimizations may get much broader: some processors are gaining acceleration for server-style workloads (XML, cryptography, compression and even regular expressions! I'm surprised such things can be done on the chip), and they also use less energy (which probably matters a lot for mobile hardware...).
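To make that concrete, here is a minimal sketch (my own illustration, not something from the question) of letting the program ask the chip what it can do at runtime instead of baking today's hardware into the build. The GCC/Clang builtins `__builtin_cpu_init` and `__builtin_cpu_supports` are real, x86-only; the file name and the printed messages stand in for choosing an accelerated path versus a portable fallback:

```c
/* cpu_features.c -- hypothetical sketch: pick a code path at runtime
 * based on what the CPU offers. GCC/Clang on x86 only. */
#include <stdio.h>

int main(void)
{
    __builtin_cpu_init();  /* initialize the CPU-detection support */

    /* AES-NI: hardware-accelerated AES rounds */
    if (__builtin_cpu_supports("aes"))
        puts("AES-NI available: use the hardware crypto path");
    else
        puts("No AES-NI: fall back to a portable software implementation");

    /* SSE4.2: string-comparison instructions (handy for parsers/scanners) */
    if (__builtin_cpu_supports("sse4.2"))
        puts("SSE4.2 available: use the accelerated string-scanning path");
    else
        puts("No SSE4.2: use the plain C scanner");

    return 0;
}
```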
My point is that building on exactly what exists today as the platform for tomorrow is not a great idea. Make it work today and it will almost certainly work tomorrow (backward compatibility is something Microsoft values especially highly, Apple isn't bad at it either, and Linux is liberal enough that you can make it work the way you want).
There is one thing you can do, though: tie your technology to something that (probably) just won't die, like JavaScript. I'm serious, JavaScript VMs are getting frighteningly efficient these days and they keep improving, plus everyone loves the language, so it's not going to suddenly disappear. If you need more performance or features, perhaps target the CLR or the JVM?
I also believe multithreading is going to become more and more of an issue. I have a feeling the number of processor cores will follow a Moore's law of its own. And architectures are likely to change with the rise of cloud computing.
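As a rough sketch of what "more cores every year" means for code, here is a small POSIX example (Linux-style system assumed; the `worker` function and the 256-thread cap are just placeholders) that sizes its worker pool from whatever core count it finds at runtime, so the same binary keeps using new hardware as core counts grow:

```c
/* scale_with_cores.c -- minimal sketch (POSIX assumed): size the worker
 * pool from the core count discovered at runtime. Compile with -pthread. */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

static void *worker(void *arg)
{
    long id = (long)arg;
    printf("worker %ld running\n", id);
    return NULL;
}

int main(void)
{
    long cores = sysconf(_SC_NPROCESSORS_ONLN);  /* cores visible right now */
    if (cores < 1)
        cores = 1;
    if (cores > 256)                             /* arbitrary cap for the demo */
        cores = 256;

    pthread_t threads[256];
    for (long i = 0; i < cores; i++)
        pthread_create(&threads[i], NULL, worker, (void *)i);
    for (long i = 0; i < cores; i++)
        pthread_join(threads[i], NULL);

    printf("ran %ld workers, one per core\n", cores);
    return 0;
}
```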
PS: In any case, I believe most of the C optimizations of the past are still quite applicable with modern compilers!
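For instance, here is one old-school optimization that still pays off, sketched with made-up function names: telling the compiler that buffers don't alias (C99 `restrict`), or hoisting a load out of the loop by hand the way we used to:

```c
/* old_school_opt.c -- a sketch of one "old" C optimization that still
 * matters with modern compilers. */
#include <stddef.h>

/* Without restrict, the compiler must assume dst may overlap k,
 * so it reloads *k after every store. */
void scale_naive(float *dst, const float *src, const float *k, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i] * (*k);
}

/* With restrict (or the pre-C99 trick of hoisting *k into a local),
 * the factor is loaded once and the loop vectorizes cleanly. */
void scale_restrict(float *restrict dst, const float *restrict src,
                    const float *restrict k, size_t n)
{
    float factor = *k;  /* the classic manual hoist still works too */
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i] * factor;
}
```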