Background
The client asked me to find out why their C# application (we'll call it XXX, delivered by a consultant who has since left the scene) is so flaky, and to fix it. The application controls a measuring device over a serial connection. Sometimes the device streams continuous readings (which are displayed on screen), and sometimes the application must stop the continuous measurements and switch the device into command/response mode.
How NOT to do it
For continuous measurements, XXX uses a System.Timers.Timer to process incoming serial data in the background. When the timer fires, C# runs the timer's ElapsedEventHandler on a thread from the thread pool. XXX's event handler does a blocking commPort.ReadLine() with a multi-second timeout, then invokes a delegate when a useful measurement arrives on the serial port. This part works fine. However...
When it's time to stop the real-time measurements and make the device do something else, the application tries to pause the background processing from the GUI thread by setting the timer's Enabled = false . Of course, this merely sets a flag that prevents further timer events; the background thread, already blocked waiting on serial input, keeps waiting. The GUI thread then sends a command to the device and tries to read the response - but the response is consumed by the background thread. Now the background thread gets confused because that isn't the measurement it expected, and meanwhile the GUI thread gets confused because it never receives the expected command response. Now we know why XXX is so flaky.
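A minimal sketch of the problematic pattern described above (class and member names are illustrative, not from XXX's actual source):

```csharp
using System;
using System.IO.Ports;
using System.Timers;

class FlakyReader
{
    SerialPort commPort = new SerialPort("COM1", 9600);
    System.Timers.Timer timer = new System.Timers.Timer(100);

    public void Start()
    {
        commPort.ReadTimeout = 3000;      // multi-second timeout
        commPort.Open();
        timer.Elapsed += OnTimerElapsed;  // handler runs on a thread-pool thread
        timer.Enabled = true;
    }

    void OnTimerElapsed(object sender, ElapsedEventArgs e)
    {
        try
        {
            string measurement = commPort.ReadLine();  // blocks here
            // ... invoke delegate with the measurement ...
        }
        catch (TimeoutException) { /* nothing arrived this time around */ }
    }

    // Called from the GUI thread.
    public string StopAndQuery()
    {
        timer.Enabled = false;  // only prevents FUTURE events; a handler
                                // already blocked in ReadLine() keeps waiting
        commPort.WriteLine("STOP");            // GUI thread sends the command...
        return commPort.ReadLine();            // ...but the still-blocked handler
                                               // may steal the reply. Race condition.
    }
}
```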
Possible Method 1
In another, similar application, I used a System.ComponentModel.BackgroundWorker thread for unattended measurements. To pause the background processing, I did two things from the GUI thread:
- call CancelAsync on the worker, and
- call commPort.DiscardInBuffer() , which causes the pending (blocked) comport read in the background thread to throw System.IO.IOException "The I/O operation has been aborted because of either a thread exit or an application request.\r\n" .
In the background thread, I catch this exception, clean up quickly, and everything works as intended. Unfortunately, this behavior of DiscardInBuffer - aborting a blocked read on another thread with an exception - is not documented anywhere I can find, and I hate relying on undocumented behavior. It works because internally DiscardInBuffer calls the Win32 API PurgeComm, which does abort a blocked read (documented behavior).
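A sketch of that arrangement, under the assumption of a loop-style DoWork handler (names are illustrative; the IOException-on-DiscardInBuffer behavior it relies on is, as noted, undocumented):

```csharp
using System;
using System.ComponentModel;
using System.IO;
using System.IO.Ports;

class MeasurementWorker
{
    SerialPort commPort;  // assumed opened elsewhere
    BackgroundWorker worker =
        new BackgroundWorker { WorkerSupportsCancellation = true };

    public void Start()
    {
        worker.DoWork += (s, e) =>
        {
            while (!worker.CancellationPending)
            {
                try
                {
                    string line = commPort.ReadLine();  // blocking read
                    // ... report the measurement ...
                }
                catch (TimeoutException) { /* keep looping */ }
                catch (IOException)
                {
                    // Raised when the GUI thread calls DiscardInBuffer():
                    // "The I/O operation has been aborted because of either
                    //  a thread exit or an application request."
                    break;  // clean up and exit the loop
                }
            }
            e.Cancel = worker.CancellationPending;
        };
        worker.RunWorkerAsync();
    }

    // Called from the GUI thread to pause background processing.
    public void Pause()
    {
        worker.CancelAsync();        // 1) ask the worker to stop
        commPort.DiscardInBuffer();  // 2) aborts the blocked ReadLine
                                     //    (undocumented; works via PurgeComm)
    }
}
```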
Possible Method 2
Use the base-class Stream.ReadAsync method directly with a CancellationToken - i.e., the supported way to cancel background I/O.
Since the number of characters to receive varies (lines are newline-terminated) and there is no ReadLineAsync method on SerialPort in the framework, I don't know whether this is feasible. I could process each character individually, but that could hurt performance (it may not keep up on slow machines) - unless, of course, line-termination handling is already implemented somewhere inside the framework.
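One way around the missing ReadLineAsync is to build it yourself on top of Stream.ReadAsync, accumulating bytes until the newline. A sketch against a generic Stream (so it would be used with commPort.BaseStream; note that how promptly SerialPort's BaseStream honors cancellation mid-read is its own question, and the one-byte-at-a-time reads are exactly the performance concern mentioned above):

```csharp
using System;
using System.IO;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

static class SerialLineReader
{
    // Reads bytes one at a time until '\n', honoring the cancellation token.
    // Works on any Stream, e.g. serialPort.BaseStream.
    public static async Task<string> ReadLineAsync(
        Stream stream, CancellationToken token)
    {
        var sb = new StringBuilder();
        var buf = new byte[1];
        while (true)
        {
            int n = await stream.ReadAsync(buf, 0, 1, token);
            if (n == 0) break;                // end of stream
            if (buf[0] == (byte)'\n') break;  // line terminator: stop here
            sb.Append((char)buf[0]);
        }
        return sb.ToString();
    }

    static async Task Main()
    {
        // Demonstration with a MemoryStream standing in for the serial port.
        var data = new MemoryStream(Encoding.ASCII.GetBytes("12.34\n56.78\n"));
        using var cts = new CancellationTokenSource();
        Console.WriteLine(await ReadLineAsync(data, cts.Token)); // prints "12.34"
        Console.WriteLine(await ReadLineAsync(data, cts.Token)); // prints "56.78"
    }
}
```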
Possible Method 3
Create an "I own the serial port" lock. Nobody reads from, writes to, or discards input from the port without holding the lock (including each iteration of the blocking read loop in the background thread). Change the read timeout in the background thread to 1/4 second, for acceptable GUI responsiveness without too much overhead.
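A sketch of that arbitration scheme, with illustrative names (the key point is that the lock is released at every 250 ms read timeout, giving the GUI thread a chance to grab the port):

```csharp
using System;
using System.IO.Ports;
using System.Threading;

class PortArbiter
{
    readonly object portLock = new object();  // the "I own the serial port" lock
    SerialPort commPort;                      // assumed opened elsewhere
    volatile bool measuring = true;

    // Background thread: repeated short blocking reads, each under the lock.
    public void ReadLoop()
    {
        commPort.ReadTimeout = 250;  // 1/4 s: responsive GUI, modest overhead
        while (true)
        {
            if (!measuring) { Thread.Sleep(250); continue; }
            lock (portLock)
            {
                try
                {
                    string line = commPort.ReadLine();
                    // ... report the measurement ...
                }
                catch (TimeoutException)
                {
                    // Lock is released on exit; the GUI thread gets a turn.
                }
            }
        }
    }

    // GUI thread: hold the lock so no background read is in flight
    // while we talk to the device in command/response mode.
    public string SendCommand(string cmd)
    {
        measuring = false;
        lock (portLock)
        {
            commPort.DiscardInBuffer();  // drop any stale measurement data
            commPort.WriteLine(cmd);
            return commPort.ReadLine();  // this thread owns the response
        }
    }
}
```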
Question
Does anyone have a proven solution to this problem? How can I cleanly stop background processing of a serial port? I've googled and read dozens of articles bemoaning the C# SerialPort class, but haven't found a good solution.
Thanks in advance!