Marcelo Cantos gave a pretty good explanation, but I think it can be put a little more precisely.
A type of thing is composable when several instances of it can be combined in a certain way to produce the same type of thing.
Composability of control structures. Languages like C draw a distinction between expressions, which can be composed using operators to form new expressions, and statements, which can be composed using control structures such as if, for, and the "sequence" control structure that simply executes statements in order. The point is that these two categories are not on an equal footing: many control structures make use of expressions (for example, the expression that if evaluates to choose which branch to execute), but expressions cannot make use of control structures (for example, you can't return a for loop).

Although it might seem crazy or pointless to want to "return a for loop", the general idea of treating control structures as first-class objects that can be stored and passed around is not only possible but useful. In lazy functional languages such as Haskell, control structures like if and for can be represented as ordinary functions, which can be manipulated in expressions like any other term. This enables things like functions that "build" other functions according to the parameters they are passed, and return them to the caller. So while C (for example) divides "the things a programmer might want to do" into two categories and restricts how objects from those categories may be combined, Haskell (for example) has just one category and imposes no such restrictions, so in this sense it affords more composability.
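To make this concrete, here is a minimal Haskell sketch (the names if' and timesDo are made up for illustration): laziness lets if be written as an ordinary function, and a function can build a loop-like control structure from its parameters and return it to the caller.

```haskell
-- Because Haskell is lazy, "if" needs no special status: it is just a
-- function that selects one of two (possibly unevaluated) arguments.
if' :: Bool -> a -> a -> a
if' True  t _ = t
if' False _ e = e

-- A function that *builds* a control structure: repeat a step n times.
-- The result is an ordinary value that can be stored, passed, or returned.
timesDo :: Int -> (a -> a) -> (a -> a)
timesDo n f = foldr (.) id (replicate n f)

main :: IO ()
main = do
  print (if' (3 > 2) "yes" "no")   -- prints "yes"
  print (timesDo 5 (+ 2) 0)        -- applies (+ 2) five times: prints 10
```

Because timesDo 5 (+ 2) is just a value, it can be stored in a data structure or handed to another function, which is exactly what a C-style statement cannot be.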
Composability of threads. I presume, as Marcelo Cantos did, that you are really asking about the composability of threads that use locks/mutexes. This is a slightly trickier point, because on the face of it we can have threads that use multiple locks; the point is that we can't have threads that use multiple locks while keeping the guarantees those locks ought to come with.
We can define a lock as a type of thing that supports certain operations, which come with certain guarantees. One guarantee: suppose there is a lock object x; then, provided that every process that calls lock(x) eventually calls unlock(x), any call to lock(x) will eventually return successfully with x locked by the current thread/process. This guarantee makes it much easier to reason about program behaviour.
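As a sketch of this interface, Haskell's MVar can play the role of such a lock; the names Lock, newLock, lock and unlock below are made up for illustration, while newMVar, takeMVar and putMVar are the real library operations.

```haskell
import Control.Concurrent.MVar

-- A lock is an MVar holding a unit value:
-- "full" means free, "empty" means currently held.
type Lock = MVar ()

newLock :: IO Lock
newLock = newMVar ()

-- lock blocks until the lock is free, then takes it. As long as every
-- caller of lock eventually calls unlock, every lock call eventually returns.
lock :: Lock -> IO ()
lock = takeMVar

unlock :: Lock -> IO ()
unlock l = putMVar l ()
```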
Unfortunately, if there is more than one lock in the world, this is no longer true. If thread A calls lock(x); lock(y); and thread B calls lock(y); lock(x); then it is possible that A grabs lock x and B grabs lock y, and they will each wait indefinitely for the other thread to release the other lock: deadlock. So locks are not composable, because when you use more than one, you cannot simply assert that this important guarantee still holds, at least not without analysing the code in detail to see how it manages its locks. In other words, you can no longer afford to treat functions as "black boxes".
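Here is a self-contained sketch of that scenario, again using the MVar-based lock; the threadDelay calls merely make the unlucky interleaving likely rather than guaranteed.

```haskell
import Control.Concurrent  -- forkIO, threadDelay, and the MVar operations

type Lock = MVar ()

lock :: Lock -> IO ()
lock = takeMVar

unlock :: Lock -> IO ()
unlock l = putMVar l ()

main :: IO ()
main = do
  x <- newMVar ()
  y <- newMVar ()
  -- Thread A takes x then y; thread B takes y then x.
  _ <- forkIO $ do
         lock x
         threadDelay 1000            -- give B time to grab y
         lock y                      -- blocks forever: B holds y
         putStrLn "A got both locks" -- never reached
         unlock y >> unlock x
  _ <- forkIO $ do
         lock y
         threadDelay 1000            -- give A time to grab x
         lock x                      -- blocks forever: A holds x
         putStrLn "B got both locks" -- never reached
         unlock x >> unlock y
  threadDelay 2000000
  putStrLn "deadlock: neither thread ever got both locks"
```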
Things that are composable are good because they enable abstraction: they let us reason about code without caring about all the details, and that reduces the cognitive load on the programmer.