A few answers here seem to focus on lock contention, but locks are not the only resource on which contention can arise. Contention is simply two threads trying to access the same resource, or related resources, in such a way that at least one of the contending threads runs more slowly than it would if the others were not running.
The most obvious example of contention is a lock. If thread A holds a lock and thread B wants to acquire the same lock, thread B will have to wait until thread A releases it.
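Here is a minimal C++ sketch of that waiting, assuming a plain std::mutex; the 100 ms hold time and the thread names are just for illustration, and which thread grabs the lock first depends on scheduling:

```cpp
#include <chrono>
#include <iostream>
#include <mutex>
#include <thread>

std::mutex m;

void worker(const char* name) {
    auto start = std::chrono::steady_clock::now();
    std::lock_guard<std::mutex> lock(m);   // the second thread blocks here
    auto waited = std::chrono::steady_clock::now() - start;
    std::cout << name << " waited "
              << std::chrono::duration_cast<std::chrono::milliseconds>(waited).count()
              << " ms for the lock\n";
    std::this_thread::sleep_for(std::chrono::milliseconds(100)); // hold the lock for a while
}

int main() {
    std::thread a(worker, "A");
    std::thread b(worker, "B");
    a.join();
    b.join();
}
```

Whichever thread acquires the lock second reports roughly the 100 ms it spent waiting; the other reports close to zero.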
Now, this is platform-specific, but a thread may experience slowdowns even if it never has to wait for another thread to release the lock! This is because a lock protects some kind of data, and the data itself will often be contended.
For example, consider a thread that acquires a lock, modifies an object, then releases the lock and does some other work. If two threads do this, then even if they never fight over the lock itself, the threads may run much slower than a single thread would.
Why? Say each thread runs on its own core of a modern x86 CPU, and the cores do not share an L2 cache. With just one thread, the object can stay in that thread's L2 cache most of the time. With both threads running, each time one thread modifies the object, the other thread will find that the data is not in its L2 cache, because the other CPU invalidated the cache line. On a Pentium D, for example, this causes the code to run at FSB speed, which is much slower than L2 cache speed.
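A rough C++ sketch of that pattern, with arbitrary iteration counts and an arbitrary amount of "other work" outside the lock: each thread briefly takes the lock, bumps a shared counter, then does unrelated work, so the threads rarely wait on the lock itself. The exact slowdown depends on the hardware, but on a typical multicore machine the two-thread run takes noticeably longer than the one-thread run even though each thread does the same amount of work per thread:

```cpp
#include <chrono>
#include <functional>
#include <iostream>
#include <mutex>
#include <thread>

struct Shared {
    std::mutex m;
    long counter = 0;   // the "object" protected by the lock
};

// Each iteration briefly holds the lock, then does unrelated work outside it.
// The slowdown with two threads comes mostly from the cache line holding
// `counter` (and the mutex) bouncing between the cores.
void work(Shared& s, long iterations) {
    volatile long local = 0;                     // "other work" not touching shared data
    for (long i = 0; i < iterations; ++i) {
        {
            std::lock_guard<std::mutex> lock(s.m);
            ++s.counter;                         // dirties the shared cache line
        }
        for (int j = 0; j < 50; ++j) local += j;
    }
}

int main() {
    const long iters = 5'000'000;
    Shared s;

    auto time_run = [&](int threads) {
        s.counter = 0;
        auto start = std::chrono::steady_clock::now();
        std::thread t1(work, std::ref(s), iters);
        std::thread t2;
        if (threads == 2) t2 = std::thread(work, std::ref(s), iters);
        t1.join();
        if (t2.joinable()) t2.join();
        return std::chrono::duration<double>(std::chrono::steady_clock::now() - start).count();
    };

    std::cout << "1 thread:  " << time_run(1) << " s\n";
    std::cout << "2 threads: " << time_run(2) << " s\n";
}
```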
Since contention can occur even when the lock itself is not contended, contention can also occur when there is no lock at all. For example, say your CPU supports an atomic increment of a 32-bit variable. If one thread keeps incrementing and decrementing that variable, the variable will be hot in its cache much of the time. If two threads do it, their caches will contend for ownership of the memory holding that variable, and many of the accesses will be slower as the cache coherency protocol works to secure each core's ownership of the cache line.
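A sketch of that lock-free case using std::atomic; the iteration count is arbitrary and the exact ratio is hardware-dependent, but two threads hammering the same atomic variable typically take much longer than the single-thread time rather than roughly the same time:

```cpp
#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>
#include <vector>

std::atomic<int> counter{0};

// No lock is involved, yet the cores contend for exclusive ownership of the
// cache line holding `counter`, so each atomic increment costs far more than
// it would with a single thread.
void hammer(long iterations) {
    for (long i = 0; i < iterations; ++i)
        counter.fetch_add(1, std::memory_order_relaxed);
}

double run(int nthreads, long iterations) {
    counter = 0;
    auto start = std::chrono::steady_clock::now();
    std::vector<std::thread> threads;
    for (int i = 0; i < nthreads; ++i)
        threads.emplace_back(hammer, iterations);
    for (auto& t : threads) t.join();
    return std::chrono::duration<double>(std::chrono::steady_clock::now() - start).count();
}

int main() {
    const long iters = 50'000'000;
    std::cout << "1 thread:  " << run(1, iters) << " s\n";
    std::cout << "2 threads: " << run(2, iters) << " s\n";
}
```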
Ironically, locks tend to reduce contention. Why? Because without a lock, two threads could operate on the same object or collection and cause lots of contention (there are lock-free queues, for example). Locks tend to deschedule contending threads, allowing non-contending threads to run instead. If thread A holds a lock and thread B wants the same lock, the implementation can run thread C instead. If thread C does not need that lock, then future contention between threads A and B can be avoided for a while. (Of course, this assumes there are other threads that could run. It won't help if the only way the system as a whole can make useful progress is by running the threads that contend.)