Is this use of a generic List thread-safe?

I have a System.Collections.Generic.List<T> to which I only add items in a timer callback. The timer restarts only after the operation is completed.

I have a System.Collections.Concurrent.ConcurrentQueue<T> which stores the indices of the added elements in the above list. This storage operation is also always performed in the same timer callback described above.

Is a read operation that iterates through the queue and accesses the corresponding items in the list thread-safe?

Code example:

    private List<Object> items;
    private ConcurrentQueue<int> queue;
    private Timer timer;

    private void callback(object state)
    {
        int index = items.Count;
        items.Add(new object());
        if (true) // some condition here
            queue.Enqueue(index);

        timer.Change(TimeSpan.FromMilliseconds(500), TimeSpan.FromMilliseconds(-1));
    }

    // This can be called from any thread
    public IEnumerable<object> AccessItems()
    {
        foreach (var index in queue)
        {
            yield return items[index];
        }
    }

My understanding: even if the size of the list changes while I am indexing into it, I only ever access an element that already exists, so it does not matter whether it is read from the old backing array or the new one. Therefore it must be thread-safe.

4 answers

Is a read operation that iterates through the queue and accesses the corresponding items in the list thread-safe?

Is it documented as being thread-safe?

If not, then it is foolish to treat it as thread-safe, even if it happens to be thread-safe in this particular implementation by accident. Thread safety must be designed in.

Sharing memory across threads is a bad idea in the first place; if you don't do it, then you don't have to ask whether an operation is thread-safe.

If you must do it, then use a collection designed for shared memory access.
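For instance, a minimal sketch of that option (not from the answer; the Producer class and its member names are invented for illustration): store the items themselves in a ConcurrentQueue<object>, so readers never touch a plain List at all.

    using System.Collections.Concurrent;
    using System.Collections.Generic;

    class Producer
    {
        // the concurrent collection replaces the List<T> + index-queue pair
        private readonly ConcurrentQueue<object> sharedItems = new ConcurrentQueue<object>();

        // called only from the timer callback (the single writer)
        public void Produce(object item)
        {
            sharedItems.Enqueue(item); // documented as thread-safe
        }

        // can be called from any thread; ToArray takes a moment-in-time snapshot
        public IEnumerable<object> Snapshot()
        {
            return sharedItems.ToArray();
        }
    }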

If you can't do that, then use a lock. Locks are cheap if they are not contended.
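A minimal lock-based sketch of the code from the question (the lockObj field is added here for illustration; the other names mirror the question):

    private readonly object lockObj = new object();
    private List<object> items = new List<object>();
    private Queue<int> queue = new Queue<int>(); // a plain Queue is enough under a lock
    private Timer timer;

    private void callback(object state)
    {
        lock (lockObj)
        {
            int index = items.Count;
            items.Add(new object());
            if (true) // some condition here
                queue.Enqueue(index);
        }
        timer.Change(TimeSpan.FromMilliseconds(500), TimeSpan.FromMilliseconds(-1));
    }

    // This can be called from any thread; the snapshot is built under the lock
    // so callers can enumerate it without holding the lock.
    public IEnumerable<object> AccessItems()
    {
        lock (lockObj)
        {
            var results = new List<object>();
            foreach (var index in queue)
                results.Add(items[index]);
            return results;
        }
    }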

If you have a performance problem because your locks are constantly contended, then fix that problem by changing your threading architecture rather than trying to do something dangerous and foolish like writing low-lock code. Nobody but a handful of experts writes low-lock code correctly. (I'm not one of them; I don't write low-lock code either.)

Even if the size of the list changes while indexing, I only ever access an element that already exists, so it does not matter whether it is read from the old backing array or the new one.

This is the wrong way to think about it. The right way to think about this is:

If the list is modified, then the list's internal data structures are being mutated. The internal data structure may be in an inconsistent state halfway through the mutation and will only be consistent again once the mutation is complete. A reader on another thread can therefore observe that inconsistent state, which makes the behaviour of my entire program unpredictable. It could crash, it could loop forever, it could corrupt other data structures; I don't know, because I am running code that assumes a consistent state in a world where that state is not consistent.
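As a hedged illustration of that last point (this demo is mine, not the answer's): enumerating a plain List<T> while another thread adds to it usually fails fast, precisely because the reader observes the list's internal state mid-mutation.

    using System;
    using System.Collections.Generic;
    using System.Threading.Tasks;

    class UnsafeListDemo
    {
        static void Main()
        {
            var list = new List<int>();

            // writer: keeps mutating the list's internal array and count
            var writer = Task.Run(() =>
            {
                for (int i = 0; i < 1000000; i++)
                    list.Add(i);
            });

            try
            {
                // reader: enumerates while the writer mutates
                while (!writer.IsCompleted)
                    foreach (var x in list) { }
            }
            catch (Exception ex)
            {
                // typically an InvalidOperationException:
                // "Collection was modified; enumeration operation may not execute."
                Console.WriteLine(ex.GetType().Name + ": " + ex.Message);
            }
        }
    }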


Big edit:

ConcurrentQueue is only safe with regard to the Enqueue(T) and T Dequeue() operations. You are doing a foreach over it, and that does not get synchronized at the required level. The biggest problem in your particular case is that enumerating the queue (which is what your foreach does) may result in the well-known "Collection was modified" exception. Why is this the biggest problem? Because you add things to the queue after you have added the corresponding objects to the list (synchronizing access to the list is badly needed there too, but that and the "biggest problem" can be solved with a single bullet). A collection that is being enumerated does not take kindly to being modified by another thread, even if at the micro level each individual modification is safe, which is exactly what ConcurrentQueue provides.

Therefore you absolutely need to synchronize access to the queues (and to the central list, while you're at it) with some other synchronization mechanism (by which I mean you can even forget about ConcurrentQueue and use a plain Queue, or even a List, as long as you never access them outside that synchronization).
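The snippets below reference fields that the answer never declares; a plausible sketch of those declarations (a plain Queue<int> is sufficient here because every access goes through the lock):

    private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();
    private readonly List<object> list = new List<object>();
    private readonly Queue<int> queue1 = new Queue<int>();
    private readonly Queue<int> queue2 = new Queue<int>();
    private readonly Queue<int> queue3 = new Queue<int>();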

So just do something like:

    public void Writer(object toWrite)
    {
        this.rwLock.EnterWriteLock();
        try
        {
            int tailIndex = this.list.Count;
            this.list.Add(toWrite);

            if (..condition1..)
                this.queue1.Enqueue(tailIndex);
            if (..condition2..)
                this.queue2.Enqueue(tailIndex);
            if (..condition3..)
                this.queue3.Enqueue(tailIndex);
            ..etc..
        }
        finally
        {
            this.rwLock.ExitWriteLock();
        }
    }

and in AccessItems:

    public IEnumerable<object> AccessItems(int queueIndex)
    {
        Queue<int> whichQueue = null;
        switch (queueIndex)
        {
            case 1: whichQueue = this.queue1; break;
            case 2: whichQueue = this.queue2; break;
            case 3: whichQueue = this.queue3; break;
            ..etc..
            default: throw new NotSupportedException("Invalid queue disambiguating params");
        }

        List<object> results = new List<object>();

        this.rwLock.EnterReadLock();
        try
        {
            foreach (var index in whichQueue)
                results.Add(this.list[index]);
        }
        finally
        {
            this.rwLock.ExitReadLock();
        }

        return results;
    }

And, based on my understanding of how your application accesses the list and the various queues, this should be 100% safe.
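A usage sketch under the same assumptions (the timer callback and the consuming method shown here are illustrative, not part of the answer):

    // Writer side: the timer callback funnels every mutation through Writer(...)
    private void callback(object state)
    {
        Writer(new object());
        timer.Change(TimeSpan.FromMilliseconds(500), TimeSpan.FromMilliseconds(-1));
    }

    // Reader side: any other thread can take a consistent snapshot of, say, queue 2
    public void DumpQueue2()
    {
        foreach (var item in AccessItems(2))
            Console.WriteLine(item);
    }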

End of big edit.

First of all: what do you mean by thread-safe? (See Eric Lippert's post on the subject.)

In your particular case, I think the answer is no.

Not in the sense that inconsistencies might occur in the global context (the actual list), but rather in the sense that the readers themselves (which may very well race against the single writer) can end up inconsistent within their own context (their own stacks, meaning: the local variables of all their methods, their parameters, and their logically isolated portion of the heap).

The mere possibility of such "inconsistencies" for a reader thread (the Nth thread asks the List for its element count and gets back 39404999 even though you have in fact only added 3 values) is enough to say that, generally speaking, the architecture is not thread-safe (even though the readers never actually modify the globally accessible List; they merely read it at the wrong moment).

I suggest you use the ReaderWriterLockSlim class. I think you will find it well suited to your needs:

    private ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim(LockRecursionPolicy.SupportsRecursion);
    private List<Object> items;
    private ConcurrentQueue<int> queue;
    private Timer timer;

    private void callback(object state)
    {
        this.rwLock.EnterWriteLock();
        try
        {
            // in this place, right here, there can be only ONE writer,
            // and while the writer is between EnterWriteLock and ExitWriteLock
            // there can exist no readers in the following method
            // (between EnterReadLock and ExitReadLock)

            // we add the item to the List
            // AND do the enqueue "atomically" (as loose a term as "thread-safe")
            int index = items.Count;
            items.Add(new object());
            if (true) // some condition here
                queue.Enqueue(index);
        }
        finally
        {
            this.rwLock.ExitWriteLock();
        }

        timer.Change(TimeSpan.FromMilliseconds(500), TimeSpan.FromMilliseconds(-1));
    }

    // This can be called from any thread
    public IEnumerable<object> AccessItems()
    {
        List<object> results = new List<object>();

        this.rwLock.EnterReadLock();
        try
        {
            // in this place there can exist a thousand readers
            // (doing these actions right here, between EnterReadLock and ExitReadLock)
            // all at the same time, but NO writers
            foreach (var index in queue)
            {
                results.Add(this.items[index]);
            }
        }
        finally
        {
            this.rwLock.ExitReadLock();
        }

        return results; // or yield return them one by one, if you like that better :)
    }

No, because you are both reading from and writing to the same object. It is not documented to be thread-safe, so you cannot be sure that it is safe. Do not do that.

Whether it is actually safe on .NET 4.0 means nothing, by the way. Even if Reflector showed you that it is safe today, that can change at any time. You cannot rely on the current implementation to predict future versions.

Do not try to get away with such tricks. Why not just do it in an obviously safe way?

As a side note: two timer callbacks can run simultaneously, so your code is broken twice over (multiple writers). Do not try to pull off tricks with threads.
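If overlapping callbacks are a real possibility in your setup, a simple reentrancy guard can enforce a single writer; this is a sketch of one common pattern (the busy field is invented for illustration, the rest mirrors the question's code):

    private int busy; // 0 = idle, 1 = a callback is currently running

    private void callback(object state)
    {
        // if another callback is still running, skip this tick entirely
        if (Interlocked.CompareExchange(ref busy, 1, 0) != 0)
            return;

        try
        {
            int index = items.Count;
            items.Add(new object());
            if (true) // some condition here
                queue.Enqueue(index);
        }
        finally
        {
            Interlocked.Exchange(ref busy, 0);
            timer.Change(TimeSpan.FromMilliseconds(500), TimeSpan.FromMilliseconds(-1));
        }
    }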


It is thread-safe. The foreach statement uses the ConcurrentQueue<T>.GetEnumerator() method, which promises:

The enumeration represents a moment-in-time snapshot of the contents of the queue. It does not reflect any updates to the collection after GetEnumerator was called. The enumerator is safe to use concurrently with reads from and writes to the queue.

Which is another way of saying that your program won't randomly blow up with an inscrutable exception message, like the kind you get when you use the Queue class. Beware of the consequences, though: implicit in this guarantee is that you may well be looking at a stale version of the queue. Your loop will not see any elements that were added by another thread after the loop started executing. That kind of magic does not exist and cannot be implemented in a consistent way. Whether or not that makes your program misbehave is something you have to think about; it cannot be guessed from the question. It is pretty rare that you can completely ignore it.
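A small sketch demonstrating the snapshot behaviour described above (my example, not from the answer):

    using System;
    using System.Collections.Concurrent;

    class SnapshotDemo
    {
        static void Main()
        {
            var queue = new ConcurrentQueue<int>();
            queue.Enqueue(1);
            queue.Enqueue(2);

            foreach (var item in queue)
            {
                // items enqueued during the enumeration are not seen by this loop
                queue.Enqueue(item + 100);
                Console.WriteLine(item); // prints 1, then 2, and stops
            }

            Console.WriteLine(queue.Count); // 4: the queue itself did grow
        }
    }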

Your use of the List<>, however, is utterly unsafe.

