If I guarantee that two threads will never run in parallel, do I still need to make the list variable volatile?

Imagine that I have this code where, inside a Windows Forms timer, I can spawn some threads - but I guarantee that ONLY one thread works at a time, using the following approach (as indicated in one of the answers here, by Matt Johnson):

NB: let's assume for now that this _executing approach works and that I'm not using BackgroundWorker etc.

    private volatile bool _executing;

    private void TimerElapsed(object state)
    {
        if (_executing) return;
        _executing = true;

        if (smth)
        {
            Thread myThread = new Thread(MainThread1);
            myThread.IsBackground = true;
            myThread.Start();
        }
        else
        {
            Thread myThread = new Thread(MainThread2);
            myThread.IsBackground = true;
            myThread.Start();
        }
    }

    public void MainThread1()
    {
        try
        {
            methodWhichAddelementTomyList(); // e.g. calls list.Add() internally
        }
        finally { _executing = false; }
    }

    public void MainThread2()
    {
        try
        {
            methodWhichAddelementTomyList(); // e.g. calls list.Add() internally
        }
        finally { _executing = false; }
    }

Now, I also have the List instance variable that you see, which I access from MainThread1 and MainThread2. Since, with the logic above, I guarantee that MainThread1 and MainThread2 will never run in parallel, does the List still need to be volatile? Can I run into problems with the list variable being cached?

EDIT: And does this approach protect me from running these threads in parallel? (The answer in the related question is slightly different - it starts the work inside the timer itself - so I want to double-check.)

EDIT2: Honestly, there is no consensus below on whether to use the volatile keyword on my List variable or not. This state of affairs bothers me, so a well-documented answer is still welcome.

3 answers

Let me restate your question:

If I guarantee that two threads will never execute in parallel, do I still need to make my list variable volatile?

You do not have two threads, you have three: the thread that launches the other two. It always runs in parallel with the other threads, and it uses a shared flag to communicate with them. Given that, in the code you posted it is not necessary to mark the list as volatile.


But in the case of two threads and two threads only, which somehow run one after the other without interference from a third (i.e. without a third thread reading the shared variable), making the list volatile would be enough to ensure that the two threads always see the same data.

For two threads that do not run simultaneously to see the list in a consistent (in other words, up-to-date) state, they must always work with the latest version of what is in memory. This means that when a thread starts using the list, its reads of the list must happen after all previous writes have completed.

This implies memory barriers. The thread needs an acquire barrier before using the list, and a release barrier after it is done with it. With Thread.MemoryBarrier you cannot control the barrier semantics precisely - you always get full barriers (release plus acquire, which is stronger than we need) - but the end result is the same.

So, if you can guarantee that the threads never run in parallel, the C# memory model guarantees that the following works as expected:

    private List<int> _list;

    public void Process()
    {
        try
        {
            Thread.MemoryBarrier(); // Release + acquire. We only need the acquire.
            _list.Add(42);
        }
        finally
        {
            Thread.MemoryBarrier(); // Release + acquire. We only need the release.
        }
    }

Note that the list is not volatile. It doesn't need to be: the barriers do the job.
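A small aside (my addition, not part of the original answer): since .NET 4.5, the System.Threading.Volatile class exposes exactly the acquire and release halves, so you don't have to pay for a full fence. A minimal sketch, with a hypothetical Worker class standing in for the code above:

```csharp
using System.Collections.Generic;
using System.Threading;

class Worker
{
    private readonly List<int> _list = new List<int>();
    private bool _ready;

    public void Produce()
    {
        _list.Add(42);
        // Release: every write above is visible before _ready reads as true.
        Volatile.Write(ref _ready, true);
    }

    public bool TryConsume(out int value)
    {
        // Acquire: if we observe _ready == true, we also observe the Add above.
        if (Volatile.Read(ref _ready))
        {
            value = _list[_list.Count - 1];
            return true;
        }
        value = 0;
        return false;
    }
}
```

Volatile.Read pairs with the later reads of the protected data, and Volatile.Write pairs with the earlier writes, which is exactly the acquire/release pattern this answer describes.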

Now the point is that the ECMA C # Language Specification says (emphasis mine):

17.4.3 Volatile fields

  • A read of a volatile field is called a volatile read. A volatile read has "acquire semantics"; that is, it is guaranteed to occur prior to any references to memory that occur after it in the instruction sequence.

  • A write of a volatile field is called a volatile write. A volatile write has "release semantics"; that is, it is guaranteed to happen after any memory references prior to the write instruction in the instruction sequence.

(Thanks to R. Martinho Fernandes for finding the relevant paragraph in the standard!)

In other words, reading from a volatile field has the same semantics as an acquire barrier, and writing to a volatile field has the same semantics as a release barrier. This means that, given your premise, the following snippet of code behaves identically (1) to the previous one:

    private volatile List<int> _list;

    public void Process()
    {
        try
        {
            // This is an acquire, because we're *reading* from a volatile field.
            _list.Add(42);
        }
        finally
        {
            // This is a release, because we're *writing* to a volatile field.
            _list = _list;
        }
    }

And this is enough to ensure that, as long as the two threads never execute in parallel, they will always see the list in a consistent state.

(1): The two examples are not strictly identical - the first offers stronger guarantees - but in this particular case those stronger guarantees are not required.


Marking the list variable volatile does nothing for the list object itself. It affects the guarantees you get when reading from and assigning to that variable.

You cannot sprinkle volatile everywhere and expect it to magically make a non-thread-safe data structure thread-safe. If it were that easy, thread safety would be trivial: just mark everything volatile. It does not work that way.

From the code and the description it is clear that you only ever access the list from one thread at a time. That does not require synchronization. Note, though, that even merely reading the list from a second thread would be unsafe: if there is at least one writer, there must be no other concurrent access - not even reads.

Here's a simpler approach:

    Task.Run(() => Process(smth));

    ...

    public void Process(bool smth)
    {
        try
        {
            if (smth)
                methodWhichAddelementTomyList();
            else
                otherThing();
        }
        finally { _executing = false; }
    }

There are no "two threads" anymore. That was a confusing concept to begin with.
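One caveat worth adding (mine, not part of this answer): the question's `if (_executing) return; _executing = true;` is a check-then-act on a volatile flag, so two timer callbacks that overlap could in principle both pass the check before either sets the flag. Interlocked.CompareExchange makes the test-and-set a single atomic step. A sketch, with hypothetical names:

```csharp
using System.Threading;

class Guard
{
    // 0 = idle, 1 = executing. Interlocked works on int rather than bool.
    private int _executing;

    // Returns true only for the single caller that flips 0 -> 1.
    public bool TryEnter()
    {
        return Interlocked.CompareExchange(ref _executing, 1, 0) == 0;
    }

    public void Exit()
    {
        Volatile.Write(ref _executing, 0);
    }
}
```

The timer callback would call TryEnter and bail out on false, and the worker would call Exit in its finally block, mirroring the structure in the question.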


There seem to be 2 questions here:

  • If you use two threads but they never run concurrently, then why have two threads at all? Just organize your methods appropriately, i.e. stick to one thread.

  • However, if two threads are somehow a requirement (for example, to let one thread continue processing / remain unblocked while the other performs some other task): even if you have coded this so that the two threads can never access the list at the same time, to be safe I would add a lock construct, since List<T> is not thread-safe. To me, that is the simplest option.

Alternatively, you can use a thread-safe collection, for example one of the collections in System.Collections.Concurrent. Otherwise you will need to synchronize all access to the list (i.e. put every Add call in a lock).
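To make the thread-safe-collection route concrete (a minimal sketch of my own; the method name just mirrors the one in the question):

```csharp
using System.Collections.Concurrent;

class Holder
{
    // ConcurrentQueue is safe for Enqueue/TryDequeue from any number of threads.
    private readonly ConcurrentQueue<int> _items = new ConcurrentQueue<int>();

    public void methodWhichAddelementTomyList(int item)
    {
        _items.Enqueue(item); // no lock needed
    }

    public int Count
    {
        get { return _items.Count; }
    }
}
```

With this, neither the timer callback nor the worker threads need any explicit synchronization to add items.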

I personally avoid using volatile. Albahari has a good explanation of why: the assumption that "the volatile keyword ensures that the most up-to-date value is always read from the field" is not true, because, as he shows, a write followed by a read can be reordered.

Volatile merely ensures that the two threads see the same data at the same time. It does not stop them at all from interleaving their read and write operations.
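To illustrate that interleaving point (my example, not from this answer): x++ on a volatile field is still three steps - read, add, write - so two threads can interleave and lose updates, while Interlocked.Increment performs the whole read-modify-write atomically. The class name here is hypothetical:

```csharp
using System.Threading;

class Counter
{
    private volatile int _unsafeCount; // volatile does NOT make ++ atomic
    private int _safeCount;

    public void UnsafeIncrement()
    {
        _unsafeCount++; // read, add, write: another thread can slip in between
    }

    public void SafeIncrement()
    {
        Interlocked.Increment(ref _safeCount); // atomic read-modify-write
    }

    public int SafeCount
    {
        get { return Volatile.Read(ref _safeCount); }
    }
}
```

Hammer UnsafeIncrement from several threads and the final count can come up short; SafeIncrement always lands on the exact total.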

E.g.: declare a synchronization object:

 private static Object _objectLock = new Object(); 

and use it, for example in your methodWhichAddelementTomyList (and anywhere else the list is modified), to serialize access to the resource across threads:

    lock (_objectLock)
    {
        list.Add(item); // "item" rather than "object", which is a reserved keyword
    }
