C# threading without blocking the producer or the consumer

TL;DR version of the main questions:

  • When working with threads, is it safe to have one thread read the contents of a list while another writes to it, provided nothing is ever removed from the list (no reordering) and a new object is only read after it has been completely added?

  • While an int is being updated from the old value to the new value by one thread, is there a risk that another thread reading this int gets back something that is neither the old value nor the new value?

  • Is it possible for a thread to skip a critical region if it is busy, rather than sleeping and waiting for the region to be released?

I have two pieces of code running in separate threads, and I want one of them to act as a producer for the other. I do not want either of them to sleep while waiting for access; instead, each should skip ahead in its own code if the other thread is currently accessing the shared data.

My initial plan was to exchange data using the approach below (and, once the counter got high enough, to jump to a secondary list to avoid overflow).


Pseudo-code of the threads as I originally intended them:

Producer {
    Int counterProducer;
    bufferedObject newlyProducedObject;
    List<buffered_Object> objectsProducer;

    while (true) {
        <Do stuff until a new product is created and stored in newlyProducedObject>;
        objectsProducer.add(newlyProducedObject);
        counterProducer++;
    }
}

Consumer {
    Int counterConsumer;
    Producer objectProducer;   // contains a reference to the Producer class
    List<buffered_Object> personalQueue;

    while (true) {
        <Do useful work, such as working on the personal queue, and polish nails if the personal queue is empty>

        // get all outstanding items and move them to the personal queue
        while (counterConsumer < objectProducer.GetCounterProducer()) {
            personalQueue.add(objectProducer.GetItem(counterConsumer + 1));
            counterConsumer++;
        }
    }
}

Looking at it, everything seemed fine at first glance. I knew I would never pull a half-finished product out of the queue, so the state of the list should not be a problem, even if a thread switch occurs while the Producer is adding a new object. Is this assumption correct, or can there be problems here? (My reasoning is that the consumer requests a specific index in the list, new objects are only added to the end, and objects are never deleted, so this should not be an issue.)

What did catch my attention is a possible problem with counterProducer: is its value undefined while counterProducer++ is in progress? Could a read during the increment return a temporary zero or some other unknown value? Would this be a potential problem?

My goal is for neither of the two threads to block while waiting for a mutex; I want them to keep looping instead, which is why I wrote it as above with no locking at all.

If using the list causes problems, my workaround would be to implement a linked list and share it between the two classes, still using counters to see whether a new job has been added and to keep track of the last position, while moving new items into personalQueue. The producer would add new nodes, and the consumer would read them and remove the previous ones. (There is no counter inside the list, only external counters to keep track of how many items were added and removed.)


Alternative pseudo-code to avoid the counterProducer++ read risk (I need help with this):

Producer {
    Int publicCounterProducer;
    Int privateCounterProducer;
    bufferedObject newlyProducedObject;
    List<buffered_Object> objectsProducer;

    while (true) {
        <Do stuff until a new product is created and stored in newlyProducedObject>;
        objectsProducer.add(newlyProducedObject);
        privateCounterProducer++;
        <Need help: some code that updates publicCounterProducer to privateCounterProducer
         if that variable is not locked, and otherwise skips ahead; the counter will get
         updated on the next pass. At some point the consumer must be done reading, and
         new items will already be prepared.>
    }
}

Consumer {
    Int counterConsumer;
    Int readProducerCounter;
    Producer objectProducer;   // contains a reference to the Producer class
    List<buffered_Object> personalQueue;

    while (true) {
        <Do useful work, such as working on the personal queue, and polish nails if the personal queue is empty>

        // get all outstanding items and move them to the personal queue
        <Need help: try to read publicCounterProducer and set readProducerCounter to it, else skip this code>
        while (counterConsumer < readProducerCounter) {
            personalQueue.add(objectProducer.GetItem(counterConsumer + 1));
            counterConsumer++;
        }
    }
}

So the goal in the second piece of code, and the part I could not figure out how to write, is for neither class to wait on the other while the other is in the "critical region" that updates publicCounterProducer. If I understand the locking functionality correctly, threads sleep while waiting for a release, which I don't want. I may ultimately have to use it anyway, but in that case the first pseudo-code would do the job just as well by simply taking a lock around reading the value.
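To show roughly what I am imagining for the two <Need help> spots (only a sketch of the intent, not code I know to be correct; Monitor.TryEnter and the counterLock field are just my guesses at how a "try the lock, else skip" attempt might look):

    using System.Threading;

    class Producer
    {
        private readonly object counterLock = new object();
        private int publicCounterProducer;
        private int privateCounterProducer;

        // Producer side: publish the private counter if the lock is free, otherwise skip.
        private void TryPublishCounter()
        {
            // TryEnter without a timeout returns immediately: true if we got the lock.
            if (Monitor.TryEnter(counterLock))
            {
                try { publicCounterProducer = privateCounterProducer; }
                finally { Monitor.Exit(counterLock); }
            }
            // else: skip ahead; the counter gets published on a later pass
        }

        // Consumer side: returns false (and no value) if the lock was busy.
        public bool TryReadCounter(out int value)
        {
            value = 0;
            if (!Monitor.TryEnter(counterLock))
                return false;
            try
            {
                value = publicCounterProducer;
                return true;
            }
            finally
            {
                Monitor.Exit(counterLock);
            }
        }
    }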

I hope you can help me with my many questions.

2 answers
  • No, this is not safe. A context switch may occur inside List<T>.Add after the list has stored the object but before it has finished updating its internal data structures.

  • If it is an Int32, or if it is an Int64 and you are running in an x64 process, then there is no risk of seeing a torn value. But if in doubt, use the Interlocked class.

  • Yes. You can use a Semaphore, and when it is time to enter the critical region, call the WaitOne overload that takes a timeout and pass a timeout of 0. If WaitOne returns true, you have acquired the semaphore and can enter; if it returns false, you have not acquired it and should skip ahead (see the sketch after this list).
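A minimal sketch of that pattern, assuming a Semaphore with a single slot; the class and method names are only illustrative:

    using System.Threading;

    class NonBlockingRegion
    {
        // One slot, so it behaves like a mutex, but WaitOne(0) lets us probe it without sleeping.
        private static readonly Semaphore gate = new Semaphore(1, 1);

        static void TryDoCriticalWork()
        {
            // WaitOne(0) returns immediately: true if we acquired the slot, false if it was taken.
            if (!gate.WaitOne(0))
                return; // busy: skip this pass and try again on the next loop iteration

            try
            {
                // ... the critical region: touch the shared counter or list here ...
            }
            finally
            {
                gate.Release();
            }
        }
    }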

You really should look at the System.Collections.Concurrent namespace. In particular, look at BlockingCollection<T>. It has a group of Try* methods that you can use to add and remove items from the collection without blocking.
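For example, a non-blocking hand-off could look roughly like this (only a sketch; the bounded capacity of 100 and the int item type are arbitrary choices for illustration):

    using System.Collections.Concurrent;

    class HandoffSketch
    {
        // A bounded collection: TryAdd can fail when it is full, TryTake when it is empty.
        // Neither call below ever blocks.
        private static readonly BlockingCollection<int> queue =
            new BlockingCollection<int>(boundedCapacity: 100);

        static void ProducerPass(int item)
        {
            if (!queue.TryAdd(item))
            {
                // collection is full: skip and retry on a later pass
            }
        }

        static void ConsumerPass()
        {
            int item;
            while (queue.TryTake(out item))
            {
                // ... move the item to the personal queue / process it ...
            }
            // TryTake returned false: nothing is waiting, go do other useful work
        }
    }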


When working with threads, is it safe to have one thread read the contents of a list while another writes to it, provided nothing is ever removed from the list (no reordering) and a new object is only read after it has been completely added?

No, it is not. A side effect of adding an item to the list can be a reallocation of its underlying array. Current implementations of List<T> update the internal reference before copying the old data into the new array, so another thread can observe a list that reports the required size but does not yet contain the data.
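If you want a collection that already supports one thread writing while another reads, without any locking of your own, a sketch with ConcurrentQueue<T> (my suggestion here, not something from the question) could look like this:

    using System.Collections.Concurrent;
    using System.Collections.Generic;

    class Handoff<T>
    {
        // ConcurrentQueue<T> is safe for concurrent Enqueue/TryDequeue and never blocks.
        private readonly ConcurrentQueue<T> queue = new ConcurrentQueue<T>();

        // Producer side: publish a finished object.
        public void Publish(T item)
        {
            queue.Enqueue(item);
        }

        // Consumer side: drain whatever is currently available into the personal queue.
        public void Drain(List<T> personalQueue)
        {
            T item;
            while (queue.TryDequeue(out item))
            {
                personalQueue.Add(item);
            }
        }
    }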

While an int is being updated from the old value to the new value by one thread, is there a risk that another thread reading this int gets back a value that is neither the old value nor the new value?

No, int writes are atomic, so a reader will always see either the old or the new value. But if two threads increment counterProducer at the same time, updates can be lost; you should use Interlocked.Increment() for the increment.
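A sketch of what that looks like; the names are illustrative, and Volatile.Read assumes .NET 4.5 or later (older frameworks have Thread.VolatileRead instead):

    using System.Threading;

    class Counters
    {
        private int counterProducer;

        // Producer: atomic read-modify-write, safe even if several threads increment.
        public void OnItemProduced()
        {
            Interlocked.Increment(ref counterProducer);
        }

        // Consumer: a volatile read ensures we see the most recently published value
        // rather than a stale, cached copy.
        public int ReadCounter()
        {
            return Volatile.Read(ref counterProducer);
        }
    }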

Is it possible for a thread to skip the critical region if it is busy, rather than falling asleep and waiting for the region to be released?

Not by itself, but you can use (for example) WaitHandle.WaitOne(int millisecondsTimeout) with a timeout of 0 to test whether the handle can be acquired without waiting, and react accordingly. WaitHandle is the base class of several synchronization classes, such as ManualResetEvent.
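As an illustration, a hedged sketch of using a ManualResetEvent as a "new items available" flag that the consumer polls without sleeping (the names are mine):

    using System.Threading;

    class NewItemsSignal
    {
        // Starts unsignalled: nothing to consume yet.
        private static readonly ManualResetEvent itemsAvailable = new ManualResetEvent(false);

        // Producer calls this after publishing an item.
        public static void Signal()
        {
            itemsAvailable.Set();
        }

        // Consumer calls this once per loop; WaitOne(0) returns immediately.
        public static bool CheckAndClear()
        {
            if (!itemsAvailable.WaitOne(0))
                return false;       // nothing new, carry on with other work

            itemsAvailable.Reset(); // clear the flag before draining the shared queue
            return true;
        }
    }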

By the way, is there a reason you are not using the built-in producer/consumer classes, such as BlockingCollection<T>? BlockingCollection is easy to use (once you read the documentation!), and I would recommend it instead.
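A minimal end-to-end sketch of the usual BlockingCollection<T> pattern; note that GetConsumingEnumerable can block, so combine it with the Try* overloads from the other answer if the consumer really must never sleep:

    using System;
    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    class ProducerConsumerDemo
    {
        static void Main()
        {
            var queue = new BlockingCollection<int>();

            var producer = Task.Run(() =>
            {
                for (int i = 0; i < 10; i++)
                {
                    queue.Add(i);           // publish a finished item
                }
                queue.CompleteAdding();     // signal that no more items will come
            });

            var consumer = Task.Run(() =>
            {
                // Yields items as they arrive and finishes once CompleteAdding has been
                // called and the collection is drained.
                foreach (var item in queue.GetConsumingEnumerable())
                {
                    Console.WriteLine(item);
                }
            });

            Task.WaitAll(producer, consumer);
        }
    }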

