General Music Instrument Sound Quality
This is what anyone who makes an iOS instrument needs to shout from the rooftops:
If you use MIDI, you will struggle much less with these problems (whoever built the synth on the other end should have dealt with most of this), but every instrument must deal with these requirements:
What Musicians Expect
The first thing to do is keep the buffers used by AudioQueue as small as possible. The default on the iPad is 1024 samples, which at 44.1 kHz is a very long 23 milliseconds. With a buffer that size, a sound triggered now starts playing anywhere from 0 to 23 ms later, even if the rest of the machine were infinitely fast. If the buffer is larger than 256 samples, you cannot build a great instrument on top of it. In practice you also pay up to about 30 ms of overhead on top of this (touch handling, graphics, etc.).
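For illustration, here is a minimal sketch of requesting a smaller buffer through the modern AVAudioSession API (the era-appropriate equivalent was the C call AudioSessionSetProperty with kAudioSessionProperty_PreferredHardwareIOBufferDuration); the category and the numbers are my assumptions, and the hardware treats the request only as a hint:

#import <AVFoundation/AVFoundation.h>

NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback error:&error];
// Ask for ~256 samples at 44.1 kHz: 256 / 44100 ≈ 5.8 ms per buffer.
[session setPreferredIOBufferDuration:(256.0 / 44100.0) error:&error];
[session setActive:YES error:&error];
// The hardware may not honor the request exactly; check what you got.
NSLog(@"actual IO buffer: %f s", session.IOBufferDuration);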
Latency - Pro musicians expect finger-to-ear latency of around 8 milliseconds. That seems like a fantasy on the iPad to me, but it is the goal. For a quick sanity check, think of 120 beats per minute: a quarter note is 500 ms, so subdividing by 4 makes a 16th note 125 ms, and 100 ms of latency is almost a full 16th note late. Even musicians who don't play fast get annoyed by high latency and will stop using an otherwise great instrument that exceeds 100 ms (!!!!). Anything above about 30 ms and, unfortunately, the pros go back to their hardware instruments, or record your output and time-shift it back a few ms in the editor to fix your nonsense. The best applications in the store reach about 30 ms. Some that are really good are still acceptable at around 50 ms.
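To make that arithmetic concrete, here is a tiny helper (my own illustration, not from the original answer) that converts a tempo into the length of a subdivision:

static double msPerSubdivision(double bpm, int notesPerQuarter)
{
    // One quarter note at the given tempo, in milliseconds.
    double quarterMs = 60000.0 / bpm;
    // e.g. notesPerQuarter = 4 gives 16th notes.
    return quarterMs / notesPerQuarter;
}
// msPerSubdivision(120.0, 4) == 125.0: at 120 bpm a 16th note lasts
// 125 ms, so 100 ms of latency is nearly a full 16th note late.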
Measure the latency by placing a microphone against the surface of the iPad and running it into the left channel of a mixer. Run the iPad's audio output into the right channel. Record in stereo, then open the recording in a sound editor that can display stereo waveforms. You can see the moment the finger taps the touch screen (picked up by the mic) and the moment the electronic sound starts. That distance is the latency. The spread in that distance is the jitter.
Jitter - the variation in latency. More than a few milliseconds of jitter makes the rhythm sound wrong; I got down to about 5 ms. It is directly tied to the AudioQueue buffer size. An example with a 1024-sample buffer: a finger lands at a random point during the buffer currently being filled, so the note-on waits on average 512 samples before it can be rendered. The wait therefore varies from 0 to 1024 samples, averaging 512, i.e. roughly 12 ms of waiting in the average case before latency from any other source is added.
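A sketch of that buffer-granularity math (the 44.1 kHz sample rate is assumed):

static double avgBufferWaitMs(int bufferSamples, double sampleRate)
{
    // A touch lands at a uniformly random point inside the buffer
    // being filled, so the mean wait is half the buffer.
    return (bufferSamples / 2.0) / sampleRate * 1000.0;
}
// avgBufferWaitMs(1024, 44100.0) ≈ 11.6 ms (worst case ≈ 23.2 ms)
// avgBufferWaitMs(256,  44100.0) ≈  2.9 ms (worst case ≈  5.8 ms)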
Touch Latency - The most important thing is that touchesBegan is serviced as fast as possible, even if that means touchesMoved, touchesEnded, and touchesCancelled take a little longer; you can deliberately make the workload asymmetric to achieve this. This was a confusing revelation at first, because the profiler told me no time was being spent in touch handling, yet postponing some work until the touch ends can make a big difference.
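A minimal sketch of that asymmetry, assuming a hypothetical synth object with a cheap note-on entry point (the method names are mine, not a real API):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // The fast path: map position to pitch and trigger immediately.
    // No allocation, no layout, no logging here.
    for (UITouch *t in touches) {
        CGPoint p = [t locationInView:self];
        [self.synth noteOnAtX:p.x y:p.y];    // hypothetical API
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *t in touches) {
        [self.synth noteOffForTouch:t];      // hypothetical API
    }
    // Deferred bookkeeping: a few extra ms here are far less
    // audible than the same delay in touchesBegan.
    [self recycleFinishedVoices];            // hypothetical helper
}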
OpenGL - You should at least consider it. I am not sure it is easy to hit these latency goals when drawing with UIKit directly; I use OpenGL in my main application.
Real-time clock - For things that are not exactly instruments but have strict beat-synchronization requirements, such as drum machines, you can track what time it actually is, make the samples track that clock, and assume drift has accumulated whenever you haven't checked the clock recently. But I got tired of fighting this battle on my old MIDI instruments... if you have MIDI, let MIDI provide the clock; it may be the only thing that works for synchronizing several electronic instruments together.
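One way to sketch the "track what time it really is" idea, using mach_absolute_time (the scheduling policy around it is my assumption, not the author's code):

#include <mach/mach_time.h>

static double hostTimeSeconds(void)
{
    // Convert mach ticks to seconds once the timebase is known.
    static mach_timebase_info_data_t tb;
    if (tb.denom == 0) mach_timebase_info(&tb);
    return (double)mach_absolute_time() * tb.numer / tb.denom / 1e9;
}

// In the render callback: instead of counting buffers (which drifts),
// compare hostTimeSeconds() with the moment the next beat is due and
// start the drum sample at the matching offset inside this buffer.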
Sound skips - Everything else in your application competes with the sound engine for CPU time, and once you have made the engine as efficient as you can, smaller buffers mean it uses more of the processor. You may need to trade back a little latency for better continuity. My bias: cut the stupid eye candy and live within your means so the sound engine gets those cycles back.
Timers - Use CADisplayLink. I didn't have to think much about it; this is what I have:
- (NSInteger)animationFrameInterval
{
    return animationFrameInterval;
}

- (void)setAnimationFrameInterval:(NSInteger)frameInterval
{
    // A frame interval of 1 means the display link fires on every
    // vsync (60 Hz); reject values below 1.
    if (frameInterval >= 1) {
        animationFrameInterval = frameInterval;
        if (animating) {
            [self stopAnimation];
            [self startAnimation];
        }
    }
}

- (void)startAnimation
{
    if (!animating) {
        // NSClassFromString keeps this loadable on OS versions
        // that lack CADisplayLink.
        displayLink = [NSClassFromString(@"CADisplayLink")
            displayLinkWithTarget:self selector:@selector(drawView:)];
        [displayLink setFrameInterval:animationFrameInterval];
        [displayLink addToRunLoop:[NSRunLoop currentRunLoop]
                          forMode:NSDefaultRunLoopMode];
        animating = TRUE;
    }
}

- (void)stopAnimation
{
    if (animating) {
        [displayLink invalidate];
        displayLink = nil;
        animating = FALSE;
    }
}
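For completeness, a hedged stub of the drawView: selector the display link fires; the render call is a placeholder for whatever drawing you actually do:

- (void)drawView:(CADisplayLink *)link
{
    // Pull the latest synth state (levels, active voices) and redraw.
    // Never block here on a lock shared with the audio callback, and
    // keep the work light so the sound engine keeps its CPU headroom.
    [self render];    // hypothetical per-frame GL render
}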
I am not exaggerating. Please do not put "musical instruments" in the store until you solve these problems. Jordan Rudess, Leon Gruenbaum, Chris Dudley and other professional musicians have repeatedly raised these issues when discussing iPad/iPhone apps. The situation seems to have improved somewhat, but I don't think most developers know what numbers real musicians expect.