You're looking in entirely the wrong place. The "overhead" of the document/view architecture is in the nanosecond range (basically, the cost of accessing data through a pointer).
By comparison, the absolute maximum rate at which you can meaningfully refresh the screen is the monitor's refresh rate, usually 60 Hz (i.e., once every 16.67 milliseconds).
For updates at that rate to even be meaningful, you can't change very much in any one screen update; if you try to change too much, the user won't be able to follow what's happening.
As for threading, the easiest approach is to do all the actual window updates in a single thread and use other threads for the computations that produce the data for those updates. As long as you make sure the display thread doesn't have to do much computation itself, keeping the window updated as fast as is useful is pretty straightforward.
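To make that concrete, here is a minimal sketch of that division of labour, assuming a hypothetical Quote struct and a console print in place of real drawing (it isn't tied to any GUI framework): a worker thread generates the data, and a single display thread wakes roughly once per 16.67 ms refresh interval, copies the latest snapshot under a lock, and does nothing but paint it.

```cpp
// Sketch only: one worker thread computes data, one "UI" thread repaints
// at ~60 Hz. std::cout stands in for whatever drawing call you'd really use.
#include <atomic>
#include <chrono>
#include <iostream>
#include <mutex>
#include <thread>

struct Quote {                // hypothetical payload produced by the worker
    double price  = 0.0;
    long   volume = 0;
};

int main() {
    std::mutex        mtx;
    Quote             latest;          // newest snapshot, guarded by mtx
    std::atomic<bool> running{true};

    // Worker thread: does the computation / data generation.
    std::thread worker([&] {
        double p = 100.0;
        long   v = 0;
        while (running) {
            p += 0.01;                 // stand-in for real computation
            v += 100;
            {
                std::lock_guard<std::mutex> lock(mtx);
                latest.price  = p;     // publish only the latest values
                latest.volume = v;
            }
            std::this_thread::sleep_for(std::chrono::milliseconds(1));
        }
    });

    // Display thread (main thread here): wakes once per refresh interval,
    // copies the snapshot, and does nothing else but "paint" it.
    using namespace std::chrono;
    for (int frame = 0; frame < 120; ++frame) {             // ~2 seconds' worth
        auto next = steady_clock::now() + microseconds(16667); // ~1/60 s
        Quote snapshot;
        {
            std::lock_guard<std::mutex> lock(mtx);
            snapshot = latest;         // cheap copy; no heavy work under the lock
        }
        std::cout << "price=" << snapshot.price
                  << " volume=" << snapshot.volume << '\n';
        std::this_thread::sleep_until(next);
    }

    running = false;
    worker.join();
    return 0;
}
```

In a real GUI application you would typically post or dispatch the update to the UI thread through the framework's own mechanism rather than polling on a timer, but the division of labour is the same: the display thread only copies and paints, and everything expensive happens elsewhere.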
Edit: As far as C++ versus C# goes, it depends. I have no doubt that you can get entirely adequate display performance out of either one. The real question is how much computation you're doing behind those displays. What you mentioned displaying was mostly pretty close to raw data (price, volume, etc.). For that, C# will probably be fine. My guess is that the people you've talked to are doing considerably more computation than that, and that is the real Achilles heel of .NET (or nearly anything else that runs on a virtual machine). From what I've seen, for really heavy-duty computation C# isn't very competitive.
Just as an example, in another answer a while back I mentioned an application I had originally written in C++ that another team rewrote in C#, and which ran about three times slower. After posting that, I got curious and talked with them a bit more about it, asking whether they could get its speed at least close to the C++ version with a little extra work.
Their answer was, in effect, that they had already put in that extra work, and it was not just "a little". The C# rewrite took somewhere around 3 1/2 to 4 months. Of that time, duplicating the features of the original took less than a month; all the rest was spent (trying) to make it fast enough to be usable.
I hasten to add that 1) this is only a single data point, and 2) I have no idea how close it is to anything you might do. Still, it gives some idea of the kind of slowdown you can run into when (and if) you start doing real computation rather than just shoveling data from the network to the screen. At the same time, a quick glance suggests it's broadly in line with the results on the Computer Language Shootout site, though keep in mind those results are for Mono rather than Microsoft's implementation.
At least to me, the real question comes down to this: is your concern about performance actually justified? For something like 90% of applications, what matters is that the code does what you want; execution speed matters little as long as it isn't drastically slower (say, hundreds or thousands of times slower). If your code falls into that (large) category, C# may well be a good choice. If you really do have good reason to care about execution speed, then choosing C# looks a good deal more questionable to me.