How to formulate the difference between asynchronous and parallel programming?

Many platforms promote asynchrony and parallelism as a means of improving responsiveness. I understand the difference in general, but I often find it hard to articulate, both to myself and to others.

I am a working programmer, and I use async calls and callbacks all the time; parallelism feels exotic to me.

But I feel that the two blur together easily, especially at the language-design level. I would like a clear description of how they relate (or don't), and of the classes of programs where each is best applied.

+87
multithreading asynchronous parallel-processing
May 26 '11 at 4:11
10 answers

When you run something asynchronously, it means it is non-blocking: you start it without waiting for it to complete and carry on with other things. Parallelism means running multiple things at the same time. Parallelism works well when you can separate tasks into independent pieces of work.
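
A minimal sketch of that non-blocking idea in Python; the use of `asyncio` and the function names are my own illustration, not part of the answer:

```python
import asyncio

async def fetch_data():
    # Simulate a slow, non-blocking operation (e.g. a network call).
    await asyncio.sleep(0.1)
    return "result"

async def main():
    # Start the task without waiting for it to complete...
    task = asyncio.create_task(fetch_data())
    # ...and keep doing other things in the meantime.
    other_work = sum(range(10))
    # Block only at the point where the result is actually needed.
    result = await task
    print(result, other_work)  # result 45

asyncio.run(main())
```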

Take, for example, rendering frames of a 3D animation. Rendering an animation takes a long time, so if you launch that render from inside your animation editing software, you want to make sure it runs asynchronously so that it does not lock up your UI and you can continue doing other things. Now, each frame of that animation can also be considered an individual task. If we have multiple processors/cores or multiple machines available, we can render several frames in parallel to speed up the overall workload.
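
The frame-rendering example might be sketched like this with Python's `multiprocessing`; `render_frame` is a hypothetical stand-in for a real renderer:

```python
from multiprocessing import Pool

def render_frame(frame_number):
    # Stand-in for an expensive per-frame computation;
    # a real renderer would do the actual ray casting here.
    return f"frame-{frame_number:04d}.png"

if __name__ == "__main__":
    # Each frame is an independent task, so a pool of worker
    # processes can render several frames at once.
    with Pool(processes=4) as pool:
        files = pool.map(render_frame, range(8))
    print(files[0], files[-1])  # frame-0000.png frame-0007.png
```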

+62
May 26 '11 at 4:39

I think the main distinction is between concurrency and parallelism.

Async and callbacks are usually a way (a tool or mechanism) of expressing concurrency, that is, a set of entities that may talk to each other and share resources. In the async or callback case, communication is implicit, while sharing of resources is optional (consider RMI, where results are computed on a remote machine). As correctly noted, this is usually done with responsiveness in mind: not waiting for long-latency events.

Parallel programming usually has throughput as its main goal, while latency, i.e. the completion time for a single element, may be worse than in an equivalent sequential program.

To better understand the distinction between concurrency and parallelism, I will quote from Probabilistic models for concurrency by Daniele Varacca, which is a good set of notes on concurrency theory:

A model of computation is a model for concurrency when it is able to represent systems as composed of independent autonomous components, possibly communicating with each other. The notion of concurrency should not be confused with the notion of parallelism. Parallel computations usually involve a central control that distributes the work among several processors. In concurrency we stress the independence of the components, and the fact that they communicate with each other. Parallelism is like ancient Egypt, where the pharaoh decides and the slaves work. Concurrency is like modern Italy, where everybody does what they want, and everybody uses mobile phones.

In conclusion, parallel programming is a special case of concurrency, where separate entities collaborate to obtain high performance and throughput (usually).

Async and callbacks are just a mechanism that allows the programmer to express concurrency. Note that well-known parallel programming design patterns, such as master/worker or map/reduce, are implemented by frameworks that use such lower-level mechanisms (async) to implement more complex centralized interactions.
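
As a rough illustration of the master/worker pattern mentioned above, here is a sketch using Python's `concurrent.futures`; the chunking scheme and the names are my own assumptions:

```python
from concurrent.futures import ThreadPoolExecutor

def worker(chunk):
    # Each worker handles one independent piece of the job.
    return sum(chunk)

def master(data, n_workers=4):
    # The master splits the work, hands the chunks to the workers,
    # and combines their partial results at the end.
    chunks = [data[i::n_workers] for i in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = pool.map(worker, chunks)
    return sum(partials)

print(master(list(range(100))))  # 4950
```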

+43
Jul 20

This article explains it very well: http://urda.cc/blog/2010/10/04/asynchronous-versus-parallel-programming

It says this about asynchronous programming:

Asynchronous calls are used to prevent "blocking" within an application. [Such a] call will spin off in an already existing thread (such as an I/O thread) and perform its task when it can.

and this about parallel programming:

In parallel programming you still break up work or tasks, but the key difference is that you spin up new threads for each chunk of work.

and this, in brief:

Asynchronous calls will use threads that are already in use by the system, while parallel programming requires the developer to break the work up, and to spin up and tear down the threads needed.
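
The "spin up and tear down threads" step that this summary describes could look roughly like this in Python; a toy sketch, with the task itself chosen arbitrarily:

```python
import threading

results = {}

def work(name):
    # Each piece of work runs on its own explicitly created thread.
    results[name] = sum(range(10_000))

# Parallel style: the developer splits the work and spins threads up...
threads = [threading.Thread(target=work, args=(f"task-{i}",)) for i in range(3)]
for t in threads:
    t.start()
# ...then tears them down (joins them) once the work is done.
for t in threads:
    t.join()

print(sorted(results))  # ['task-0', 'task-1', 'task-2']
```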

+19
May 26 '11 at 4:14

My basic understanding:

Asynchronous programming addresses the problem of waiting around for an expensive operation to complete before you can do anything else. If you can get other things done while you are waiting for the operation to complete, that is a win. Example: keeping a UI responsive while fetching more data from a web service.

Parallel programming is related, but is more concerned with breaking a large task into smaller chunks that can be computed at the same time. The results of the smaller chunks can then be combined to produce the overall result. Example: ray tracing, where the color of each individual pixel is essentially independent.
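
The ray-tracing example can be sketched as follows; `shade_pixel` is a toy stand-in for a real per-pixel trace, chosen only to show that the pixels can be computed independently and combined afterwards:

```python
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT = 4, 3

def shade_pixel(coords):
    # Toy stand-in for a per-pixel ray trace: the color of each
    # pixel depends only on its own coordinates, not on its neighbours.
    x, y = coords
    return (x * 32) % 256, (y * 32) % 256, 128

if __name__ == "__main__":
    pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    # Because the pixels are independent, they can be shaded in parallel
    # and the partial results combined into the final image afterwards.
    with ProcessPoolExecutor() as pool:
        image = list(pool.map(shade_pixel, pixels))
    print(len(image), image[0])  # 12 (0, 0, 128)
```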

There is most likely more to it than that, but that is the main difference to my mind.

+14
May 26 '11 at 4:33

I tend to think about the difference in these terms:

Asynchronous: Go away and do this task. When you're done, come back, tell me, and bring the results. In the meantime, I will get on with other things.

Parallel: I want you to do this task. If it makes it easier, get some people to help. This is urgent, though, so I'll wait here until you come back with the results. I can't do anything else until you return.

Of course an asynchronous execution can use parallelism, but the distinction (to my mind, at least) is whether you get on with other things while the operation is in progress, or whether you stop everything completely until the results are in.

+12
Nov 23 '12 at 22:45

This is a matter of order of execution.

If A is asynchronous with B, then I cannot predict in advance when sub-parts of A will happen relative to sub-parts of B.

If A is parallel with B, then things in A happen at the same time as things in B. However, an order of execution may still be defined.

Perhaps the difficulty is that the word asynchronous is ambiguous.

I perform a task asynchronously when I tell my butler to run to the store for wine and cheese, then forget about him and work on my novel until he knocks on the study door again. Parallelism is happening here, but the butler and I are engaged in fundamentally different tasks and of different social station, so we don't apply that label here.

My team of maids is working in parallel when each of them is washing a different window.

My race car support team is asynchronously parallel in that each crew member works on a different tire, and they don't need to communicate with each other or manage shared resources while doing their jobs.

My football team (soccer, that is) does parallel work: each player independently processes information about the field and moves around on it, but they are not fully asynchronous, because they must communicate and respond to the communication of others.

My marching band is also parallel: each player reads the music and plays their own instrument, but they are highly synchronous: they play and march in time with each other.

A cammed machine gun could be called parallel, but everything is 100% synchronous, so it is as though one process is moving forward.

+7
Nov 23 '12 at 23:23

Why is asynchronous?

Applications today are becoming more and more connected, with more potentially long-running tasks and blocking operations such as network I/O or database access. It is therefore very important to hide the latency of these operations by starting them in the background and returning to the user interface as quickly as possible. This is where asynchrony comes into the picture: responsiveness.

Why parallel programming?

Data sets today are growing and computations are becoming more complex. It is therefore very important to reduce the execution time of these CPU-bound operations; in this case, we divide the workload into chunks and then execute those chunks simultaneously. We can call that "parallel". Obviously this gives our application high performance.
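
A small sketch of the "divide the workload into chunks" step; the splitting scheme below is just one plausible choice of mine:

```python
def split_into_chunks(workload, n_pieces):
    # Divide a large workload into roughly equal, independent pieces
    # that can then be processed at the same time.
    size, extra = divmod(len(workload), n_pieces)
    chunks, start = [], 0
    for i in range(n_pieces):
        end = start + size + (1 if i < extra else 0)
        chunks.append(workload[start:end])
        start = end
    return chunks

print(split_into_chunks(list(range(10)), 3))  # [[0, 1, 2, 3], [4, 5, 6], [7, 8, 9]]
```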

+6
Oct 8 '14 at 3:55

Asynchronous: run a method or task in the background, without blocking. It will not necessarily run on a separate thread. It relies on context switching / time slicing.

Parallel tasks: each task genuinely runs at the same time. No context switching / time slicing is involved.

+3
May 13 '14 at 10:24

I arrived here fairly comfortable with these two concepts, but with something about them still unclear to me.

After reading through some of the answers, I think I have a correct and helpful metaphor to describe the difference.

If you think of your individual lines of code as separate but ordered playing cards (stop me if you are old enough to remember how punch cards worked), then for each separate procedure you write, you will have a unique stack of cards (don't copy and paste!). The difference between what happens when code runs normally and when it runs asynchronously depends on whether you care or not.

When you run your code, you hand the OS a set of single operations (which your compiler or interpreter has broken your "higher-level" code into) to pass along to the processor. With one processor, only one line of code can execute at any given time. So, to achieve the illusion of running multiple processes at the same time, the OS uses a technique in which it sends the processor only a few lines of a given process at a time, switching between all the processes as it sees fit. The result is multiple processes that appear to the end user to make progress at the same time.

For our metaphor, the relationship is that the OS always shuffles the card stacks before sending them to the processor. If your stack of cards does not depend on another stack, you won't notice that your stack stopped being drawn from while another stack became active. So if you don't care, it doesn't matter.

However, if you do care (for example, there are several processes, or card stacks, that depend on each other), then the OS's shuffling will ruin your results.

Writing asynchronous code means handling the dependencies between execution orderings, regardless of what that ordering turns out to be. This is why constructs such as callbacks are used. They say to the processor: "The next thing to do is to tell the other stack what we did." By using such tools, you can be sure that the other stack gets notified before allowing the OS to run any more of its instructions. ("If call_back == false: send (no_operation)"; I'm not sure that's actually how it's implemented, but logically I believe it is consistent.)
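
The callback idea described above might be sketched like this; the names and the notification scheme are my own illustration, not a description of any real implementation:

```python
import threading

log = []

def dependent_step(result):
    # Callback: the dependent "stack" runs only after it is told that
    # the first task finished, regardless of how the OS scheduled things.
    log.append(f"second stack saw: {result}")

def first_task(on_done):
    log.append("first stack running")
    value = sum(range(5))
    on_done(value)  # "tell the other stack what we did"

t = threading.Thread(target=first_task, args=(dependent_step,))
t.start()
t.join()
print(log)  # ['first stack running', 'second stack saw: 10']
```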

For parallel processing, the difference is that you have two stacks that don't care about each other and two workers to process them. At the end of the day, you may need to combine the results from the two stacks, which is then a matter of synchronicity, but for the execution itself you don't care again.

Not sure if this helps, but I always find multiple explanations helpful. Also note that asynchronous execution is not constrained to an individual computer and its processors. Generally speaking, it deals with time, or (even more generally) with an order of events. So if you send dependent stack A to network node X and its coupled stack B to Y, correct asynchronous code should be able to account for that situation just as if it were running locally on your laptop.

+3
Jul 24 '14 at 11:33

async: Do this by yourself somewhere else and notify me when you're done (callback). In the meantime I can continue doing my own thing.


parallel: Hire as many guys (threads) as you wish and split the task among them so it finishes faster, and let me know (callback) when you're all done. In the meantime I can continue with my other stuff.


The main difference is that parallelism mostly depends on the hardware.
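
One way to express the "split the task and call me back when finished" idea from this answer, using Python's `concurrent.futures` (my own assumption, not the answer's code):

```python
from concurrent.futures import ThreadPoolExecutor

def whole_task(data):
    # The task handed off to the workers, split into pieces internally.
    return sum(n * n for n in data)

collected = []

with ThreadPoolExecutor(max_workers=2) as pool:
    future = pool.submit(whole_task, range(5))
    # "Let me know (callback) when you're finished."
    future.add_done_callback(lambda f: collected.append(f.result()))
    # Meanwhile the main thread can continue with other things.

print(collected)  # [30]
```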

+3
Feb 27 '16 at 21:49
