Noticed that Plugin (or any) Delay Compensation isn't applied when monitoring live input (audio or MIDI). This made me think that PDC in Logic might not be done at the 'online'/'audio engine runtime' level, but simply by adjusting the read timing of each channel's source data, so that the channel outputs eventually end up in sync with each other at the summing/output stages, but only after going through their entire signal chains/routes.
This would be more of a design flaw than a bug, and fixing it would probably require a rewrite of the Logic audio engine itself, at least for situations where signals need to be in sync before the final output.
While monitoring live inputs through Logic, it seems that Logic just doesn't have the required compensation mechanism: the signals simply pass through, as Logic can't adjust the read timing of a live input. If individual channels and/or their routes have different latencies, the output will then be out of sync. In this respect, auxes and sidechain inputs are also just monitoring their sources 'live'.
The 'proper' way to deal with this would probably be (instead of compensating the raw data read timing) to hold the data that needs to be PDC'd/delayed in an extra memory buffer during audio engine 'runtime'. The data in the memory buffer can then be accessed in sync at any 'latency stage' during the engine runtime, by adjusting the read timing from this buffer at the sidechain input/monitoring end. The trade-off is increased memory use and a lower maximum track count: if 100 channel inputs need to be monitored in sync, and one of them has a latency-inducing plugin, then the 99 other channels need to be held in a memory buffer until the channel with latency has finished filtering/processing.
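The buffered approach described above could be sketched roughly like this. This is a toy Python model under my own assumptions (class and method names are made up, and real engines work on blocks, not single samples), not anything from Logic's actual code: every channel writes its processed output into a per-channel history buffer, and any listener (main out, sidechain tap, input monitor) reads from that buffer delayed by (max latency - that channel's own latency), so all reads line up no matter where in the chain they happen.

```python
# Hypothetical sketch of buffer-based delay compensation (NOT Logic's code).
class CompChannel:
    def __init__(self, plugin_latency, buffer_len=4096):
        self.latency = plugin_latency          # samples of latency this chain adds
        self.buf = [0.0] * buffer_len          # per-channel history/delay buffer
        self.write_pos = 0

    def push(self, sample):
        """Store one processed output sample into the ring buffer."""
        self.buf[self.write_pos % len(self.buf)] = sample
        self.write_pos += 1

    def read_compensated(self, max_latency):
        """Read the sample aligned with the most latent channel."""
        delay = max_latency - self.latency     # extra delay this channel needs
        idx = (self.write_pos - 1 - delay) % len(self.buf)
        return self.buf[idx]

# Two channels with identical source material: channel A has a plugin with
# 3 samples of latency, channel B has none.
src = list(range(32))
a = CompChannel(plugin_latency=3)
b = CompChannel(plugin_latency=0)
for n, s in enumerate(src):
    a.push(src[n - 3] if n >= 3 else 0.0)      # A's output arrives 3 samples late
    b.push(s)                                  # B passes straight through

max_lat = max(a.latency, b.latency)
# Both compensated reads return the same source sample -> in sync.
print(a.read_compensated(max_lat), b.read_compensated(max_lat))  # 28 28
```

The point of the sketch: because the alignment happens at the read from the buffer, any tap (not just the final summing stage) can be kept in sync, at the cost of keeping that history in memory for every channel.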
But as it stands, Delay Compensation seems to work only at a 'visual' level in Logic: 'what you see on the Main window tracks is what you get at the final output stage'. The signals flowing through their routes may not always be in absolute sync, and when that is required (sidechaining), Logic doesn't seem to have any mechanism to derive the amount of compensation needed at different points ('nodes') from its own signal chains/routes.
A simple test on video:
2 channels/audio tracks.
Both with identical content.
Put a compressor on channel 1, and sidechain it to channel 2.
Set the compressor to Platinum 1:1.
Put a Linear Phase EQ on channel 1, after(!) the compressor.
Listen to sidechain input on compressor, while comparing it to channel 2.
It is out of sync, even though the latency is introduced after(!) the compressor and its sidechain input, and the tracks seem to be in perfect visual sync on the timeline. This would be because, instead of delaying the post-fader output ('write') of channel 2 to match channel 1's latency (which would require memory buffering), Logic apparently just delays the absolute input ('read') of channel 2's data to match the latency.
In this case, when listening to the sidechain input on the channel that later induces latency, the already delayed-at-read sidechain input signal goes through the Linear Phase EQ and gets delayed by its processing latency, resulting in a sync offset between channels 1 & 2.
In other words, the sidechain input signal is already compensated for the Linear Phase EQ latency, even though it arrives at the compressor before the Linear Phase EQ.
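The 'delay-at-read' behaviour described above can be modelled with a few lines of Python. This is a hypothetical toy model of my interpretation, not Logic's actual code: the EQ latency is L samples, channel 2 (which has no plugins of its own) is pre-delayed at read by L so its final output lines up with channel 1's, and the sidechain-listen path taps channel 2's already-delayed signal at the compressor but still travels through the EQ afterwards.

```python
# Toy model (hypothetical) of 'delay-at-read' PDC in the sidechain test.
L = 5                      # Linear Phase EQ latency, in samples
src = list(range(20))      # identical content on both tracks

def read_delayed(data, n, delay):
    """Delay-at-read: fetch the sample 'delay' samples earlier (0 before that)."""
    return data[n - delay] if n >= delay else 0

out_ch2 = []               # what you hear from channel 2 directly
out_sc = []                # what you hear when monitoring the sidechain input
for n in range(len(src)):
    # Channel 2 has no own latency, so its read is pre-delayed by L
    # so that its *final output* lines up with channel 1's output.
    out_ch2.append(read_delayed(src, n, L))
    # Sidechain-listen taps channel 2's (already delayed) signal at the
    # compressor, but the monitored signal then still passes through the
    # EQ sitting after the compressor and picks up another L samples.
    out_sc.append(out_ch2[n - L] if n >= L else 0)

# The two monitored signals end up offset by exactly L samples:
print(out_ch2[10], out_sc[10])  # 5 0
```

At sample 10, channel 2's direct output is L samples behind the source, while the monitored sidechain is 2L behind, i.e. a sync offset of exactly one EQ latency, which is what the listening test seems to reveal.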
Move the compressor behind the Linear Phase EQs and repeat the experiment. Notice that now they are in sync, because the monitored sidechain input no longer goes through the Linear Phase EQs, and is the same delayed input signal heard from channel 2. In this situation both signals are delayed to compensate for the Linear Phase EQ latency, but neither of them goes through the LPE, so they are heard in sync, but with unnecessary delay compensation.
Mute channel 2, turn off sidechain input listening, enable the sidechain input, and experiment with some compression settings. Then move the compressor back in front of the Linear Phase EQs. The sound changes: in the video the compressor now misses the kick attack completely, because the timing of the compressor's input signal depends on the latency of the Linear Phase EQs, while the sidechain input timing is fixed based on the needed total latency compensation amount.
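The missed kick attack can be illustrated with the same toy 'delay-at-read' model (again, a hypothetical sketch of my interpretation, not Logic's code). With the compressor before the EQ, channel 1's main path is read without delay (its EQ latency L comes later in the chain), while the sidechain from channel 2 is pre-delayed by L, so the detector reacts L samples after the kick has already passed the compressor's main input.

```python
# Toy model (hypothetical): why the compressor misses the kick when it
# sits *before* the latency-inducing EQ under delay-at-read compensation.
L = 5                      # EQ latency in samples
kick_at = 8
src = [1.0 if n == kick_at else 0.0 for n in range(20)]  # a lone kick transient

main_hit = sc_hit = None
for n in range(len(src)):
    main_in = src[n]                         # channel 1, tapped before the EQ (no read delay)
    sc_in = src[n - L] if n >= L else 0.0    # channel 2 sidechain, delayed at read by L
    if main_in > 0.5:
        main_hit = n                         # kick passes the compressor's main input
    if sc_in > 0.5:
        sc_hit = n                           # detector finally sees the kick

print(main_hit, sc_hit)  # 8 13 -> detector fires L samples too late
```

Under this model the gain reduction is triggered one EQ latency after the transient it was supposed to catch, which would explain why the compressor in the video misses the kick attack entirely.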
A user doesn't even need to set up complex routings with auxes and sidechaining to suffer from this. Simply reordering sidechained plugins can 'break' the sidechain input sync.
Repeat the experiment in Ableton Live, and notice that it works perfectly (later part of the video).
Last edited by en5ca on Sat Jun 26, 2021 7:45 pm, edited 1 time in total.