Are threads waiting on a lock expected to acquire it in FIFO order?

Say I have the following code

    static class MyClass // class name is just a placeholder
    {
        static object myobj = new object();

        static void mymethod()
        {
            lock (myobj)
            {
                // my code....
            }
        }
    }

Now let's say that while thread1 is inside mymethod and holds the lock, thread2 calls mymethod. Will thread2 wait for the lock to be released, or will an exception be thrown?

If it waits, is the waiting order preserved, i.e. if additional threads arrive, do they acquire the lock in FIFO order?
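To make the scenario concrete, here is a minimal runnable sketch of what I mean (the class name and the sleeps are purely illustrative): thread1 enters mymethod and holds the lock while thread2 calls the same method.

    using System;
    using System.Threading;

    static class LockDemo // illustrative name only
    {
        static readonly object myobj = new object();

        static void mymethod(string name)
        {
            Console.WriteLine("{0}: trying to enter", name);
            lock (myobj)
            {
                Console.WriteLine("{0}: entered", name);
                Thread.Sleep(2000);   // pretend to do some work while holding the lock
                Console.WriteLine("{0}: leaving", name);
            }
        }

        static void Main()
        {
            var thread1 = new Thread(() => mymethod("thread1"));
            var thread2 = new Thread(() => mymethod("thread2"));
            thread1.Start();
            Thread.Sleep(100);        // give thread1 time to take the lock first
            thread2.Start();
            thread1.Join();
            thread2.Join();
        }
    }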

+6
multithreading, c#
5 answers

Updated my answer: they are queued, but the order is not guaranteed to be FIFO.

Check out this link: http://www.albahari.com/threading/part2.aspx

+8

It is not clear from your code how myobj becomes visible inside mymethod. It looks as if myobj is a local stack variable at the declaration site (declared with var). In that case each thread may get its own instance, and mymethod will not actually be mutually exclusive.
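A minimal sketch of the difference being pointed out here (the class and method names are mine, purely illustrative): locking on an object created per call gives every thread its own lock and hence no mutual exclusion, while locking on a single shared field serializes the callers.

    using System;
    using System.Threading;

    class LockScopeDemo
    {
        // One shared lock object: all callers contend on the same instance.
        private static readonly object sharedLock = new object();

        static void SerializedWork()
        {
            lock (sharedLock)              // real mutual exclusion
            {
                Console.WriteLine("serialized, thread {0}", Thread.CurrentThread.ManagedThreadId);
                Thread.Sleep(500);
            }
        }

        static void UnserializedWork()
        {
            var localLock = new object();  // every call creates its own object...
            lock (localLock)               // ...so this lock never blocks anybody
            {
                Console.WriteLine("not serialized, thread {0}", Thread.CurrentThread.ManagedThreadId);
                Thread.Sleep(500);
            }
        }

        static void Main()
        {
            for (int i = 0; i < 3; i++)
            {
                new Thread(SerializedWork).Start();    // these print roughly 500 ms apart
                new Thread(UnserializedWork).Start();  // these all print almost immediately
            }
        }
    }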

Update

With respect to the whole FIFO argument, some background information is needed: the CLR does not provide synchronization itself. It is the CLR host that provides it as a service to the CLR. The host implements IHostSyncManager and related interfaces and supplies the various synchronization primitives. This may seem unimportant, since the most common host is the default application host (i.e. you compile to an exe), which delegates all synchronization to the OS (your old Petzold-book primitives in the Win32 API). However, there are at least two other major hosts: ASP.NET (I'm not sure what it does) and SQL Server. What I can say for sure is that SQL Server provides all the primitives on top of SOS (which is basically a user-mode operating system), never touching the OS primitives, and the SOS primitives are unfair by design in order to avoid lock convoys (i.e. they do not honor FIFO order). As the link already given in another answer points out, the OS primitives have also moved to unfair behavior, for the same reason of avoiding lock convoys.

For more information on lock convoys, you should read Rick Vicik's articles in Designing Applications for High Performance:

Lock convoy

FIFO locks guarantee fairness and forward progress at the cost of causing lock convoys. The term originally meant several threads executing the same part of the code as a group, resulting in higher collision rates than if they were randomly distributed throughout the code (much like cars being grouped into packets by traffic lights). The particular phenomenon I'm talking about is worse because, once it forms, the implicit handoff of lock ownership keeps the threads in lock-step.

To illustrate, consider the example of a thread that holds a lock and gets preempted while holding it. The result is that all the other threads pile up on the waiting list for that lock. When the preempted thread (the lock owner at this point) gets to run again and releases the lock, it automatically hands ownership of the lock to the first thread on the waiting list. That thread may not get to run for a while, but the "timeout" clock is already ticking. The previous owner usually requests the lock again before the waiting list is drained, perpetuating the convoy.
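To make that handoff mechanism concrete, here is a minimal sketch (my own illustration, not code from the article) of a strictly FIFO lock that transfers ownership directly to the oldest waiter; this explicit handoff is exactly what lets a convoy persist once it has formed.

    using System;
    using System.Collections.Generic;
    using System.Threading;

    // A deliberately "fair" lock that hands ownership to the first waiter,
    // the behavior the article blames for lock convoys.
    class FairLock
    {
        private readonly object gate = new object();
        private readonly Queue<ManualResetEventSlim> waiters = new Queue<ManualResetEventSlim>();
        private bool held;

        public void Enter()
        {
            ManualResetEventSlim myTurn;
            lock (gate)
            {
                if (!held)
                {
                    held = true;               // lock was free: take it immediately
                    return;
                }
                myTurn = new ManualResetEventSlim(false);
                waiters.Enqueue(myTurn);       // otherwise join the FIFO waiting list
            }
            myTurn.Wait();                     // ownership is handed to us explicitly
            myTurn.Dispose();
        }

        public void Exit()
        {
            lock (gate)
            {
                if (waiters.Count > 0)
                {
                    // Hand the lock directly to the oldest waiter. If that thread is
                    // not scheduled for a while, everyone else keeps queuing behind
                    // it; this implicit handoff is what perpetuates a convoy.
                    waiters.Dequeue().Set();
                }
                else
                {
                    held = false;
                }
            }
        }
    }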

+3

A simple example shows that the order is not guaranteed to be FIFO:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;
    using System.Threading;
    using System.Diagnostics;

    namespace ConsoleApplication
    {
        class Program
        {
            private static Info info = new Info();

            static void Main(string[] args)
            {
                Thread[] t1 = new Thread[5];
                for (int i = 0; i < 5; i++)
                {
                    t1[i] = new Thread(info.DoWork);
                }

                Thread[] t2 = new Thread[5];
                for (int i = 0; i < 5; i++)
                {
                    t2[i] = new Thread(info.Process);
                }

                for (int i = 0; i < 5; i++)
                {
                    t1[i].Start();
                    t2[i].Start();
                }

                Console.ReadKey();
            }
        }

        class Info
        {
            public object SynObject = new object();

            public void DoWork()
            {
                Debug.Print("DoWork Lock Reached: {0}", Thread.CurrentThread.ManagedThreadId);
                lock (this.SynObject)
                {
                    Debug.Print("Thread Lock Enter: {0}", Thread.CurrentThread.ManagedThreadId);
                    Thread.Sleep(5000);
                    Debug.Print("Thread Lock Exit: {0}", Thread.CurrentThread.ManagedThreadId);
                }
            }

            public void Process()
            {
                Debug.Print("Process Lock Reached: {0}", Thread.CurrentThread.ManagedThreadId);
                lock (this.SynObject)
                {
                    Debug.Print("Process Lock Enter: {0}", Thread.CurrentThread.ManagedThreadId);
                    Thread.Sleep(5000);
                    Debug.Print("Process Lock Exit: {0}", Thread.CurrentThread.ManagedThreadId);
                }
            }
        }
    }

Execution produces output something like this:

    Process Lock Reached: 15
    Process Lock Enter: 15
    DoWork Lock Reached: 12
    Process Lock Reached: 17
    DoWork Lock Reached: 11
    DoWork Lock Reached: 10
    DoWork Lock Reached: 13
    DoWork Lock Reached: 9
    Process Lock Reached: 18
    Process Lock Reached: 14
    Process Lock Reached: 16
    Process Lock Exit: 15
    Thread Lock Enter: 9
    Thread Lock Exit: 9
    Process Lock Enter: 14
    Process Lock Exit: 14
    Thread Lock Enter: 10
    Thread Lock Exit: 10
    Thread Lock Enter: 11
    Thread Lock Exit: 11
    Process Lock Enter: 16
    Process Lock Exit: 16
    Thread Lock Enter: 12
    Thread Lock Exit: 12
    Process Lock Enter: 17
    Process Lock Exit: 17
    Thread Lock Enter: 13
    Thread Lock Exit: 13
    Process Lock Enter: 18
    Process Lock Exit: 18

As you can see, the order in which threads acquire the lock differs from the order in which they started waiting for it.

+1

Windows and the CLR try to ensure fairness (FIFO order) of waits. However, there are certain scenarios in which the order of threads waiting on a lock can change, mostly revolving around alertable waits, and all blocking done by the CLR puts the thread into an alertable wait state.

For all practical purposes, you can assume that the order will be FIFO; however, keep this issue in mind.

0

It will wait, and the threads will NOT necessarily acquire the lock in the order they arrived.

Depending on your needs, you may get better performance by looking at something like ReaderWriterLock, or at some other primitive instead of a plain lock.
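For illustration, here is a minimal sketch of that idea using ReaderWriterLockSlim, the lighter-weight successor to ReaderWriterLock (the cache class and its data are made up for the example): many readers can proceed in parallel, and only writers take the lock exclusively.

    using System;
    using System.Collections.Generic;
    using System.Threading;

    class Cache
    {
        private static readonly ReaderWriterLockSlim rw = new ReaderWriterLockSlim();
        private static readonly Dictionary<string, string> data = new Dictionary<string, string>();

        public static string Read(string key)
        {
            rw.EnterReadLock();              // many readers may hold this at once
            try
            {
                string value;
                return data.TryGetValue(key, out value) ? value : null;
            }
            finally
            {
                rw.ExitReadLock();
            }
        }

        public static void Write(string key, string value)
        {
            rw.EnterWriteLock();             // writers get exclusive access
            try
            {
                data[key] = value;
            }
            finally
            {
                rw.ExitWriteLock();
            }
        }
    }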

-1
