What is the shortest perceived application response delay?

A delay will always occur between user action and application response.

It is well known that the shorter the response time, the more instantaneous the application feels. It is also widely known that a delay of up to 100 ms usually goes unnoticed. But what about a delay of 110 ms?

What is the shortest application response delay that can be perceived?

I am interested in any convincing evidence, general thoughts and opinions.

+70
user-interface perception response-time
Feb 11 '09 at 10:46
12 answers

What I remember is that any latency of more than 1/10 of a second (100 ms) between typing a letter and seeing it appear starts to degrade performance (you instinctively slow down, less confident that you typed correctly, for example), but below that latency level performance is essentially flat.

Given that, latencies under 100 ms may still be perceptible as not quite instantaneous (trained baseball umpires, for example, can probably resolve the order of two events even closer together than 100 ms), but they are fast enough to be treated as immediate feedback as far as performance is concerned. A delay of 100 ms or more is definitely perceived, even if it is still fast.

This is for visual feedback that a specific input has been received. The requested operation then has its own response standard. If, within 100 ms of clicking a form button, you get visual feedback of the click (for example, the button takes on a "depressed" look), that still feels perfect, but after that you expect something more to happen. If nothing happens within a second or two, as others have said, you genuinely wonder whether the click registered or was ignored; hence the standard of showing some kind of "working..." indicator whenever an operation may take more than a second before showing a clear effect (for example, while waiting for a new window to open).
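
A minimal sketch of that pattern (the 1-second threshold follows the answer above; the `showBusyIndicator`/`hideBusyIndicator` helpers are illustrative, not from any particular framework):

```typescript
// Hypothetical UI hooks; swap in your framework's spinner.
function showBusyIndicator(): void { document.body.style.cursor = "wait"; }
function hideBusyIndicator(): void { document.body.style.cursor = ""; }

// Run an async operation: the click itself is acknowledged immediately
// elsewhere; a "working..." indicator appears only if the operation
// exceeds the threshold.
async function runWithBusyIndicator<T>(
  operation: () => Promise<T>,
  thresholdMs = 1000,
): Promise<T> {
  const timer = setTimeout(showBusyIndicator, thresholdMs);
  try {
    return await operation();
  } finally {
    clearTimeout(timer); // cancel if we finished before the threshold fired
    hideBusyIndicator();
  }
}
```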

+28
Feb 11 '09 at 23:52

The 100 ms threshold was established over 30 years ago. See:

Card, S. K., Robertson, G. G., and Mackinlay, J. D. (1991). The information visualizer: An information workspace. Proc. ACM CHI'91 Conf. (New Orleans, LA, April 28 - May 2), 181-188.

Miller, R. B. (1968). Response time in man-computer conversational transactions. Proc. AFIPS Fall Joint Computer Conference, Vol. 33, 267-277.

Myers, B. A. (1985). The importance of percent-done progress indicators for computer-human interfaces. Proc. ACM CHI'85 Conf. (San Francisco, CA, April 14-18), 11-17.

+64
Mar 30 '10 at 19:03

A newer study, from January 2014:

http://newsoffice.mit.edu/2014/in-the-blink-of-an-eye-0116

... a team of neuroscientists from the Massachusetts Institute of Technology found that the human brain can process entire images that the eye sees in as little as 13 milliseconds ... far faster than the 100 milliseconds suggested by previous studies ...

+8
Jul 23 '14 at 21:34

I do not think anecdotes or opinions are really valid as answers here. This question touches on the psychology of user experience and the subconscious mind. The human brain is powerful and fast, and milliseconds really do count and register. I am not an expert, but I know there is a lot of science behind this, such as what Matt Jacobsen mentioned. Check out Google's study here http://code.google.com/speed/files/delayexp.pdf for how it can affect site traffic.

Here's another study, from Akamai, on 2-second response times: http://www.akamai.com/html/about/press/releases/2009/press_091409.html (from https://ux.stackexchange.com/questions/5529/once-apon-a-time-there-was-a-10-seconds-to-load-a-page-rule-what-is-it-nowa )

Does anyone have any other research to share?

+7
Nov 07 '18

Persistence of vision is about 100 ms, so that should be a reasonable visual feedback delay. 110 ms should not make any difference, since 100 ms is an approximation anyway. In practice, you will not notice a delay below 200 ms.

From memory, studies have shown that users lose patience and retry an operation after about two seconds of inactivity (absence of feedback), e.g. after clicking a confirmation or action button. So plan to show some kind of animation if an action takes more than 1 second.

+6
Feb 11 '09 at 10:56

At the San Francisco Opera, we routinely fine-tune a precise delay setting for each of our speakers. We can detect changes of 5 milliseconds in a speaker's delay time. Making such subtle changes shifts the apparent sound source. Often we want the sound to seem to come from somewhere other than the speakers, and fine-tuning the delay makes that possible. Delay changes of 15 milliseconds are very obvious even to untrained ears, because they radically shift the apparent source. A simple test to prove this: play sound through several speakers and have the subject close their eyes and point to where the sound is coming from. Now change the delay time of one of the speakers by just a few milliseconds and ask the person again to point to where the sound is coming from. Changing delay times is acoustically very similar to physically moving the speakers.
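
For intuition about why a few milliseconds matter here: sound travels at roughly 343 m/s in air, so a 5 ms delay moves the apparent source about 1.7 m. A back-of-the-envelope sketch (my own illustration, not the opera's actual tooling):

```typescript
// Speed of sound in air at ~20 °C, in meters per second.
const SPEED_OF_SOUND_M_PER_S = 343;

// Delay (ms) that makes a speaker sound `extraMeters` farther away.
function delayMsForExtraDistance(extraMeters: number): number {
  return (extraMeters / SPEED_OF_SOUND_M_PER_S) * 1000;
}

console.log(delayMsForExtraDistance(1.7).toFixed(1)); // ≈ 5.0 ms
console.log(delayMsForExtraDistance(5.1).toFixed(1)); // ≈ 14.9 ms, the "very obvious" shift
```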

+6
Sep 26 '15 at 4:49

I worked on an application that had an explicit business requirement to be blazingly fast, and we had a maximum allowable server time of 150 ms to render a full web page.
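
A minimal sketch of how such a budget might be checked in practice (assuming Node.js; `renderPage` and the 150 ms figure here are stand-ins for the real handler and requirement):

```typescript
import { performance } from "node:perf_hooks";

const BUDGET_MS = 150; // maximum allowable server time per page

// Hypothetical page renderer; stands in for the real request handler.
async function renderPage(path: string): Promise<string> {
  return `<html><body>${path}</body></html>`;
}

// Render the page and warn whenever the budget is blown.
async function timedRender(path: string): Promise<string> {
  const start = performance.now();
  const html = await renderPage(path);
  const elapsed = performance.now() - start;
  if (elapsed > BUDGET_MS) {
    console.warn(`Budget exceeded: ${path} took ${elapsed.toFixed(1)} ms`);
  }
  return html;
}
```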

+4
Feb 11 '09 at 10:56

There is no convincing evidence for this, but for our own application we allow a maximum of one second between user action and feedback. If it takes longer, a "please wait" window must be shown.

The user should see "something" within a second of triggering an action.

+2
Feb 11 '09 at 10:54

100 ms is far too high. You can prove it yourself with your fingers, a desk, and a clock with visible seconds. Synchronizing with the clock's seconds, tap the desk at a steady rate of 16 taps per second. I chose 16 because it is natural to tap in multiples of two, so it feels like four strong taps with three weak taps in between. Adjacent taps are clearly distinguishable by sound. The taps are separated by about 60 ms (1000 ms / 16 ≈ 62 ms), so even 60 ms is still too high. The threshold is therefore well below 100 ms, especially where sound is concerned.

For example, a drum or keyboard application needs latency of no more than about 30 ms; otherwise it becomes very annoying, because you hear the physical button/pad/key click well before the sound comes out of the speakers. Software such as ASIO and JACK was designed specifically to solve this problem, so there is no excuse. If your drum app has 100 ms latency, I will hate you.
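
The latency floor of an audio pipeline is largely set by its buffer size, which is what ASIO and JACK let you shrink. A rough sketch of the arithmetic (the buffer sizes here are illustrative):

```typescript
// One buffer's worth of audio latency, in milliseconds.
function bufferLatencyMs(bufferSamples: number, sampleRateHz: number): number {
  return (bufferSamples / sampleRateHz) * 1000;
}

console.log(bufferLatencyMs(128, 44100).toFixed(1));  // ≈ 2.9 ms, comfortably under 30 ms
console.log(bufferLatencyMs(4096, 44100).toFixed(1)); // ≈ 92.9 ms, audibly laggy for drumming
```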

The situation in VoIP and fast-paced games is actually worse, because you have to react to events in real time, whereas in music you at least plan ahead a little. Against an average human reaction time of 200 ms, an extra 100 ms of delay is a huge penalty. It noticeably changes the flow of a VoIP conversation. In games, a 200 ms response time is long, especially for well-practiced players.

+1
Jul 22 '18 at 5:21

For a reasonably relevant scientific article, try "How Much Faster is Fast Enough? User Perception of Latency & Latency Improvements in Direct and Indirect Touch" (PDF). While its focus is the just-noticeable difference (JND) in latency, it offers good insight into the perception of absolute latency as well, and the second experiment confirms and accounts for 60 Hz monitors (16.7 ms redraw time).

0
Nov 30 '18 at 16:15

For web applications, 200 ms is considered an imperceptible delay, and 500 ms is acceptable.

-1
Feb 11 '09 at 10:54

I am a cognitive neuroscientist who studies visual perception and cognition.

Mary Potter's article mentioned above deals with the minimum time required to categorize a visual stimulus. Bear in mind, though, that this was under laboratory conditions, in the absence of any other visual stimuli, which of course would not be the case in a real user experience.

A typical benchmark for stimulus-response or input-detection interactions, i.e. the average minimum time for a person to respond to or detect an input, is ~200 ms. To ensure there are no distinguishable differences, that threshold can be lowered to approximately 100 ms. Below that, the temporal dynamics of your cognitive processes take longer to compute the event than the event itself lasts, so there is practically no way to detect or differentiate it. You could go lower, to say 50 ms, but that really would be unnecessary. At 10 ms you have moved into overkill territory.

-1
Oct 11 '16 at 19:23


