Audio not in sync


Kim Prima


OK...

 

AUDIO INTERFACE LATENCY

When you record external, analog audio into your computer, the signal has to be converted to digital data and sent to Logic (via the interface's driver software). This doesn't happen instantaneously. It's not like recording onto analog tape. The amount of time it takes to do all this digitizing and transferring of data can range from the duration of only a few samples to many milliseconds, sometimes even tens of milliseconds or more. Let's call this "recording latency" for now (or just "RL").
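To put those figures in perspective, here's a small sketch (Python, purely illustrative; the sample counts are made-up examples, not measurements from any particular interface) converting a latency measured in samples to milliseconds:

```python
# Illustrative only: converting a recording latency (RL) measured in samples
# to milliseconds at a given sample rate. The sample counts are hypothetical.

def samples_to_ms(samples, sample_rate):
    """Duration of `samples` samples, in milliseconds."""
    return samples / sample_rate * 1000.0

# A few dozen samples of latency is well under a millisecond...
print(round(samples_to_ms(32, 44100), 3))    # ~0.726 ms

# ...while a couple thousand samples is tens of milliseconds.
print(round(samples_to_ms(2048, 44100), 3))  # ~46.44 ms
```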

 

RL has nothing to do with I/O buffer size. It's a separate issue.

 

Logic does NOT have any way of automatically calculating RL. And the amount of RL will be different for every audio system. So the only way to tell Logic how much RL exists is to do a loopback test (see page 1 of this thread). The test CANNOT be done using MIDI or an audio instrument (plug-in) as the source. The test must be done ONLY with an audio track as the sound source. The reason is that MIDI playback is inherently unstable; audio playback is stable.

 

MIDI DELAY

External MIDI devices do not act instantaneously to MIDI data. There will always, always, always be at least a 1 millisecond delay from the time that Logic outputs a MIDI note until the time you hear it. Sometimes the delay is more, sometimes less. There are too many technical factors behind why this is the case to explain briefly, so I won't.

 

How "tight" a quantized MIDI part plays against a click is subjective. If it sounds good, it is good. But be aware that there is NO standard or requirement for how quickly an external MIDI device is supposed to react to MIDI messages. So some devices are going to sound as if they're late against the click, some will sound like they're right on time. Sometimes, a sound will seem late against the click because of the nature of the sound itself --- a kick with a woofy attack, or a "loose" handclap sample. But if you played a tightly truncated handclap sample out of that same device, maybe it would sound in time.

 

So... since all MIDI devices react differently to MIDI messages, how do you set a standard for MIDI timing? That's a damn good question, and this should be addressed in a separate post.


Thank you Ski :wink:

 

In summary, playing hardware with MIDI notes in Logic will never be in perfect sync.

I did the loopback test, and it always works; thanks for this tip.

 

To have perfect sync, must I play directly on the synth keyboard, without MIDI notes in Logic? Or maybe use a hardware sequencer and record the audio directly into Logic?


Please note that at some point I will move this to another thread...

 

To have perfect sync, must I play directly on the synth keyboard, without MIDI notes in Logic? Or maybe use a hardware sequencer and record the audio directly into Logic?

 

No.

 

There are three things to consider when it comes to MIDI. The first thing is that MIDI timing isn't perfect. This has nothing to do with Logic. You can have this simple setup...

 

keyboard---->MIDI synth

 

...and even if you played one key over and over with perfect timing, the MIDI synth would not sound in perfect time with your playing. This is true for any external MIDI synth. There will always be a little timing variation.

 

Sometimes this timing variation is very slight, and most people won't notice it in musical context. Not even the musician playing the sound.

 

Next: Logic's ability to record "realtime, unquantized" MIDI data.

 

There is no such thing as recording and then playing back "unquantized" MIDI data. This has nothing to do with Logic. Every sequencer has a limited amount of resolution (ppqn, pulses per quarter note). Logic has 3840 ppqn resolution. This means that there are "only" 3840 possible places for a note to be recorded within a bar of 4/4. So if you played a note in realtime at the equivalent position of pulse # 3820.9634, and your quantization is set to "off", that note will still be quantized to the position of pulse # 3821.
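The rounding described above can be sketched in a few lines (Python; `snap_to_pulse` is a hypothetical helper for illustration, not anything inside Logic):

```python
# Sketch of the ppqn rounding described above: with quantize set to "off",
# a note's recorded position still lands on the nearest whole pulse.
PPQN = 3840  # Logic's resolution, per the text: 3840 pulses per quarter note

def snap_to_pulse(pulse_position):
    """Round a fractional pulse position to the nearest whole pulse."""
    return round(pulse_position)

# The example from the text: a note played at pulse 3820.9634...
print(snap_to_pulse(3820.9634))  # 3821
```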

 

Even if Logic (or any DAW) had resolution based on the sample rate (which is significantly higher than 3840 ppqn at any tempo), if you played a note in realtime halfway through sample #44,050, it'll probably be quantized to sample #44,051 (and that's with quant=off).

 

Given these limitations, you cannot truly record an unquantized performance in any sequencer. There will be small timing errors introduced into the playback of MIDI data, and the timing errors are more audible at lower tempi (the number of pulses per quarter note stays the same regardless of tempo!).
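To see why the errors grow at lower tempi, consider how long one pulse lasts in milliseconds (a back-of-the-envelope illustration, not anything from Logic's internals):

```python
# One pulse of a 3840-ppqn grid lasts longer at a slow tempo than at a fast
# one, so the worst-case rounding error (half a pulse) is bigger in absolute
# time at lower tempi.
PPQN = 3840

def pulse_duration_ms(bpm):
    """Length of one pulse in milliseconds at the given tempo."""
    quarter_note_ms = 60000.0 / bpm  # one quarter note in ms
    return quarter_note_ms / PPQN

print(round(pulse_duration_ms(60), 4))   # slow tempo: ~0.2604 ms per pulse
print(round(pulse_duration_ms(180), 4))  # fast tempo: ~0.0868 ms per pulse
```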

 

There's more to this story, so, "to be continued..."


...continued...

 

So yes, if you want to record a truly realtime, unquantized performance that reflects most accurately what you played, you would have to play your MIDI parts in real time and record them as audio. But the small timing discrepancies of MIDI and ppqn limitations -- as explained above -- aren't usually that noticeable, if at all. If these small timing errors were glaringly obvious all the time, MIDI wouldn't have lasted this long. But it has lasted 25 years. Still, there might be situations where timing discrepancies bother you. But it's beyond the scope of what I'm trying to say here to discuss remedies for that.

 

Finally... how do you make sure that Logic is transmitting MIDI data in time? This is a difficult question. But I'll try to make it easy...

 

Let's say you do a lot of dance music and your kick comes from a MIDI synth (let's say it's an Emu Orbit). It's "your" kick sound and everything has to sync around it.

 

You record a 4-on-the-floor kick, quantized to 1/4 notes and loop it. Then you listen to that track against Logic's metronome and it sounds late.

 

Hmmm... OK, mute that track and record a HH coming from a Yamaha RM-50 drum module. 16th notes, quantized, looped. You listen to that against the click and it sounds late too.

 

Hmmm... OK, mute that and record a snare coming from a Triton. Two and four, quantized. Play it back against the click and it sounds late too.

 

So what do we have here? We have synths by three totally different manufacturers, all sounding late. The chances are pretty slim that all three synths have a slow response to MIDI. And if all three parts play back sounding in time with each other, this would indicate that there's something wrong with the way MIDI is being transmitted to your MIDI synths. It could be the fault of the MIDI interface or something else. Who knows. But what you do know is that all of your instruments sound late against the click.

 

In a case like this, where everything is late, you might want to try adjusting the All Output MIDI parameter:

 

http://www.score2picture.com/logicpix/mididelay.jpg

 

Find the setting that brings your kick in time with the click. Note that this parameter is calibrated in milliseconds, and that it's possible to attain fractional values (1.1 ms, 4.3 ms, etc.).

 

Next, add the HH and see how it feels. Then bring in the snare. If it's all feeling right, then what you've done is compensated for a MIDI delay that seems to be system wide.

 

BUT...

 

By setting a value for the All Output MIDI parameter for those three synths, you might have gotten lucky. Remember, all MIDI synths react with different speeds to incoming MIDI data. So now let's say you hook up a Jupiter 8 (with MIDI) to get some kind of synthesized handclap sound. On the two and four, quantized, and it sounds late! At this point, the best approach is to use the realtime MIDI delay parameter on the handclap region. Slide it back (negative) a few ticks to get it to play in time with the snare (also on 2 and 4).

 

I hope that what I've explained shows the difference between how to set the All Output MIDI delay and the realtime delay on individual regions.

 

One final note: not all systems need to have any adjustment made to the All Output MIDI parameter. You should adjust this only if you're experiencing MIDI playback of external instruments to be either late or early.

Edited by ski

Hi ski,

 

Yes, I worked with the All Output delay in the MIDI Sync preferences. I set it to 2 ms, but when I set it to 2.3 ms and close the preferences, it has reverted to 2 ms when I come back. It's impossible to save it at 2.3 ms, 3.2 ms, etc.

 

A solution ?

 

Cheers,

Vincent

 

Wow, that sucks. I have a 0.0 ms setting on my system, so I never thought to test it out to see if it would save non-integer values.

 

On the surface this seems like a bug to me. What version of Logic are you on?


Ok Ski, first of all, you have to make this a sticky post. :shock:

I'm really disappointed that you can't find the correct information in the Logic manuals, but THIS makes a lot clear. Respect, Ski! 8)

 

You are damn right about the midi timing:

If I write music (score or matrix input) into Logic, play it through the MIDI output to an instrument (a synth, for example), and then receive the analog audio back into Logic (NOT recording, just listening to the monitored channel with record enabled on that audio track), it will not be in sync with the metronome, even when you record it.

 

The way I correct this is by setting the MIDI delay to an amount of -10 ms.

It sounds OK when you listen carefully to the clicks from the metronome and the drum computer (kicks, etc.) in real time.

 

But when recording the audio described above, I have to set this amount back to 0 ms, because otherwise the recorded track(s) sound totally out of sync. I always used to set the recording delay to the same value as my I/O buffer size -- yes, I know now that's wrong, but it really seemed to work :s

It's weird, certainly because I don't know too much about it :P

 

Anyway, I will test your method ASAP!

 

Again, thanks !!! 8)


Hmm, I can't get a perfect cancellation on both tracks.

I get close to it (16 dB on the output), but the sound isn't completely gone.

I also tried to calibrate the inputs and then normalize my recorded track; it still won't work. I'm doing something wrong/stupid :P

 

Edit: I don't have to move the anchor point to the right, because the sound just gets louder and louder; leaving it at the default sounds the best. (Hardware: RME Fireface 800.)


 

Yeah, let us know how that works out! Then I'll try it LOL!

 

Well, sorry for the delay getting back on this one. Been going in circles. I think the culprit is that when I loop the Duet output to input, somehow it picks up a gain offset from the original. Very minor, but just enough to make it not cancel fully. I've moved the anchor to the right and the left, and can never get it to zero out on the output. Almost, but never 100%. A visual inspection of the waveforms shows that some of the peaks are different, but it seems to be random as to which is higher (original vs. recording) -- sometimes the original, sometimes the recording. Maybe there is something going on with the Duet / Logic internals that tries to adjust the recording delay on the fly and is hosing up the output level? :?: :?: :?:

 

Will try the method I thought up with the Arrange window tonight and report back the results... Strange stuff indeed...


Well, sorry for the delay getting back on this one. Been going in circles. I think the culprit is that when I loop the Duet output to input, somehow it picks up a gain offset from the original. Very minor, but just enough to make it not cancel fully. I've moved the anchor to the right and the left, and can never get it to zero out on the output. Almost, but never 100%. A visual inspection of the waveforms shows that some of the peaks are different, but it seems to be random as to which is higher (original vs. recording) -- sometimes the original, sometimes the recording. Maybe there is something going on with the Duet / Logic internals that tries to adjust the recording delay on the fly and is hosing up the output level? :?: :?: :?:

Will try the method I thought up with the Arrange window tonight and report back the results... Strange stuff indeed...

I have the same problem :(


Well, like I described above: I'm going in circles too.

Because I can't set the recording volume to exactly the same level as my source volume, you will still hear some sound.

If the fader resolution were more precise, I'd certainly get a better result.

I could be wrong, but this is my experience so far 8)


My newly recorded signal is out of phase with the original. This is not with the Gain plug-in; this is just routing a signal out of Logic and back in. The new signal is completely out of phase! I've tried different hardware, different cables, mono files and stereo files, all with the same result. Is anyone else noticing this?

A couple of things... (and lots to reply to, so if I missed anyone's particular point please post back).

 

First, let's review the real meat-and-potatoes point of this test --- it's to make sure that live recordings play back precisely in time with how they were recorded against your track. Even if you have no track recorded, but you were playing to a click, you want your live recording to be in time with how you played it to the click. Or maybe you were scoring to picture and playing your guitar live against picture. You want your track to play back with exactly the timing and feel of how you reacted to the picture.

 

Even though there are a lot of steps to the procedure, the premise is fairly simple: the test seeks to determine whether or not Logic can play back its own track (X) and record it back into itself so that when you play back this re-recorded track (Y), it's exactly in time with the original.

 

With this test you are pretending that Logic is a live musician. It's playing against itself. If X and Y don't play back in time, imagine how a real musician's part is going to sound against the Logic track.

 

So, back to X and Y. The reason phase cancellation is a good way to test sync is because if X and Y are exactly the same (exactly in time with one another) then they cancel. So if you hear nothing, you know you've achieved the desired result.
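The cancellation logic can be modeled with a toy signal (a Python sketch; a real loopback passes through D/A and A/D conversion, which this deliberately ignores):

```python
# Toy model of the null test: if Y is a perfect, sample-aligned clone of X,
# summing X with the polarity-flipped Y (i.e. X - Y) leaves exact silence;
# even a one-sample offset leaves a residual.
import math

SAMPLE_RATE = 48000
x = [math.sin(2 * math.pi * 100 * n / SAMPLE_RATE) for n in range(480)]

y_aligned = list(x)      # perfectly time-aligned loopback
y_late = [0.0] + x[:-1]  # same signal, one sample late

residual_aligned = max(abs(a - b) for a, b in zip(x, y_aligned))
residual_late = max(abs(a - b) for a, b in zip(x, y_late))

print(residual_aligned)       # 0.0 -- perfect cancellation
print(residual_late > 0.001)  # True -- misalignment leaves residue
```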

 

But if you do hear a little bit of something (close but not quite 100% phase cancellation), it might not be the end of the world -- something I'll explain in a moment. Especially if you found that your recording delay had to be set to 30, 50, 100, 150 samples or more, even if you don't get perfect cancellation, you'll at least have calibrated your system to be that much more accurate in terms of playback of live recorded parts.

 

Personally, I find that a delay of 30 samples is noticeable. Maybe not "audible", but it will affect the feel of a live-played part. But even if you feel that 30 samples (0.6 milliseconds at 48K) is reasonable, there's no reason that your DAW should be shifting audio recordings from where they should be playing back. MIDI is a whole other subject, as there's always at least a bit of timing tolerance in the playback of any note. But it's my feeling that at the very least, your audio recordings should be exactly representative of where in time they were played against other tracks, or the click, or whatever was being used as a reference.

 

There are other reasons to have as-close-to-perfect alignment of live recorded tracks as well, but this post is already getting lengthy so I'll save that for another day.

 

Now, to address cases of imperfect cancellation... There are indeed circumstances under which perfect cancellation won't be possible.

 

First of all, keep in mind that X (the original) is being converted from digital to analog (on the output) and going back in (and being converted from analog to digital again). The quality of the converters, the type of anti-aliasing filtering in the converters, and various other factors may result in a Y (loopback) that's not a 100% clone of the original. It might be 99.999% the same, but there can be a margin of error.

 

Another potential for margin of error has to do with levels. There's always the possibility that the inputs or outputs of your audio interface aren't perfectly calibrated (this is especially true on units that have level controls on the inputs). Perfect cancellation is achieved only if X and Y are perfect clones. So if the level of Y isn't exactly the same as X then you won't be able to achieve perfect cancellation.

 

One way to try and eliminate the level discrepancy situation is to first normalize X. Then, after Y is recorded, normalize it too. Then move your anchor point to find the offset.
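As a sketch of the idea, peak normalization just scales each take so its loudest sample hits the same value, removing the level mismatch before the comparison (Python, with illustrative toy data rather than real audio):

```python
# Peak-normalize a take: scale it so the largest sample magnitude equals the
# target peak. With the level difference gone, only timing differences remain
# to spoil the cancellation.

def normalize(samples, peak=1.0):
    """Return `samples` scaled so max(|sample|) == peak."""
    current_peak = max(abs(s) for s in samples)
    if current_peak == 0:
        return list(samples)  # silence stays silence
    gain = peak / current_peak
    return [s * gain for s in samples]

x = [0.1, -0.5, 0.25]  # original take at one level
y = [0.08, -0.4, 0.2]  # loopback: same shape, recorded quieter

# After normalization, both takes have (nearly) identical sample values.
print(normalize(x))
print(normalize(y))
```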

 

Then, to test the offset, record Z and normalize it before you play it back out-of-phase with X.

 

BUT...

 

If you're able to achieve very close to perfect cancellation, that might be good enough. Remember, the main objective for setting the recording delay parameter is time alignment. So if you can get really close to cancellation with just a little "wisp" of sound left over, it's probably safe to assume that you've found the correct offset amount.


One way to try and eliminate the level discrepancy situation is to first normalize X. Then, after Y is recorded, normalize it too. Then move your anchor point to find the offset.

 

Then, to test the offset, record Z and normalize it before you play it back out-of-phase with X.

 

BUT...

 

If you're able to achieve very close to perfect cancellation, that might be good enough. Remember, the main objective for setting the recording delay parameter is time alignment. So if you can get really close to cancellation with just a little "wisp" of sound left over, it's probably safe to assume that you've found the correct offset amount.

 

Gotcha... I think there must be some sort of auto-adjustment going on with the Duet. After many attempts at moving the anchor forward and back, it always came back to the original starting point being the quietest. I also tried the above with normalization to try to get rid of level differences, but still couldn't get all the way to full cancellation. To help "see" what was happening, I zoomed all the way in to the sample level, and it appears that the timing of the recording is spot on. Sure, there are some minor differences in the curves, but I'm sure that's mostly due to the A->D then D->A for the recording. So, in conclusion, I'm leaving things at 0 when using the Duet :lol:

 

In the attachment, original audio on top, recording on bottom...

logic_test.jpg.69209f21f698abfd7851dd0af2ec652c.jpg


Open the sample editor for "Y" and zoom into the anchor point. Now, the good part ---- while holding down OPT, drag the start point to the left a fair amount. You'll see that your anchor point doesn't move. That's what you want: more space on the left to drag the anchor into.

 

Now follow the procedure about moving the anchor point towards the left until you achieve null.

So far I'm following your instructions as closely as possible, and I'm having trouble getting it to work. As should be obvious (due to the post that I quoted), my loopback recording (Y) is coming back a bit earlier than the original (X). I guess the problem I'm running into is when to click and hold on the anchor to get the numbers so that my math comes out correctly. I've tried it a few different ways, and nothing seems to be working out right. (IOW, am I supposed to get my first number before I lengthen the region, or after? I would guess that after would be correct, but I'm not sure.) Another problem is that when I click and hold on the anchor, I end up with left and right numbers (as opposed to top and bottom), and I wanted to be sure that 'right' was the right one to be using. I can post screenshots if need be, so just holler if that's the case.

 

Also worthy of note is that I'm not getting complete cancellation... not as much of a problem as your latest posts discuss. I even tried it with a higher-quality 1/4" cable, as I thought that capacitance could be the culprit, but the results seemed pretty similar. [FYI: the outputs on my interface are 1/4" unbalanced, and I'm coming back in the same way. On the first test I used a 6' run-of-the-mill instrument cable, and on the next test I used a 21' Monster cable. Perhaps the capacitance added by the length of the Monster cable canceled out the fact that it's (supposedly) a lower-capacitance cable. Also, I'm doing my test in mono.]

 

I'd be willing to do an iChat screen-share session if you have a few minutes to help me out, ski. Hopefully it doesn't come to that. After all is said and done, perhaps we can collaborate on making a few different versions of your instructions (one for people doing it in mono, one for Logic 8, etc.). Maybe that's not necessary and I'm just overcomplicating things. Am I the only one that's having problems?

 

I was just starting to feel confident with my understanding of lp8, and now I feel like a newb all over again. :oops:


m-m-m: I was having the same problem. I think it's something that was updated in .0.2, as I don't get the pop-up screen either. I just changed the view to "Samples" in the editor and looked at the numbers on the top right side of the editor window. I believe this is the same thing that was in the pop-up. I would just click and hold, look at the number, move, click and hold, and notice the number changed. It was much simpler once I changed the view to Samples and zoomed way in -- that way I could move the anchor 1 sample, then look and see which number changed by 1. Kind of roundabout, but it allowed me to get the correct math. Also, I think you would do this after lengthening the region, as my numbers changed before and after...

 

But as you can see above, my setting ended up being best at 0, so it was all for naught :) Although it will definitely come in handy if / when I get a new interface.

Not a Logic expert by any stretch of the imagination, but I think this is what you were asking about...

Edited by SirReel
Share on other sites

I'm seeing how all this can get confusing. It's a lengthy thread and there's information scattered all over the place.

 

What I hope to do (soon, too) is consolidate all of this information in one place and make it either a sticky, or, put it in the tips/tricks section of the forum.

 

Meanwhile, if your audio is appearing early, I posted this previously:

 

http://www.logicprohelp.com/viewtopic.php?p=135376#135376


9. Click/hold on the anchor point, being careful not to move it. You will now see two numbers in the upper left-hand corner of the window. Write down the bottom number.

 

This is the part that I think m-m-m and I are both talking about in the most recent posts...

 

When I do this, there are no two numbers in the upper left-hand corner of the window... Where exactly do you mean?

 

The only thing I see change is this

 

Not Clicking / Holding

 

http://derrickm.homeip.net/~d/images/logic_no_click.jpg

 

While Clicking / Holding

 

 

http://derrickm.homeip.net/~d/images/logic_click.jpg

 

I think that is the part that has us both confused. I'm almost certain one of the recent updates did this, because I distinctly remember trying this before and everything matched as you stated, but I pulled a bonehead move and left the click on during record, and it hosed it up. When I started trying again, which was fairly recently, I got the results above: no numbers on top and bottom. But I did notice the right-hand number changing, and in Sample view in the editor it appeared to be doing basically the same thing as what you describe as the bottom number in your original instructions...


Like MattCooper, my test with the Digi002R seemed to show that the recorded track Y was out of phase. Enabling the phase invert function on the Gain plug-in made the tracks come back to full volume from the tiny sound I was hearing without inverting the phase. Does this mean that everything I record is phase-inverted, or only audio that gets sent out and re-recorded? How do I test this? Any workarounds or suggestions?

 

Incidentally, the test seemed to show track Y to be 443 samples behind track X. I recorded at 96k.

 

Thanks.


Does this mean that everything I record is phase inverted or only audio that gets sent out and rerecorded? How do I test this?

 

Create an instrument channel. Load up a mono Test Oscillator in the input slot. Set the oscillator to Needle Pulse, 100 Hz. Bounce a bar of it to a mono file.

 

This will give you a file of "positive only" peaks that are only 3 samples in length and 10 ms apart (see screenshot). Use this in your test and you'll know for sure if your hardware is polarity-swapping anywhere along the way.

Needle.jpg.98cf19085e062d84f927552595998bd0.jpg
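In case it helps to see the idea in code, here's a rough sketch (Python, not Logic's actual Test Oscillator) of what such a needle-pulse file looks like as sample data, and why a polarity flip is unmistakable:

```python
# Build a "needle pulse" train: positive-only spikes 3 samples long, spaced
# 10 ms apart, following the description above. A polarity-swapping signal
# path would return these spikes pointing negative.

SAMPLE_RATE = 44100
SPACING = int(0.010 * SAMPLE_RATE)  # 10 ms between pulse onsets (441 samples)

def needle_pulse(length):
    """Return `length` samples containing 3-sample positive spikes every 10 ms."""
    out = [0.0] * length
    for start in range(0, length, SPACING):
        for i in range(start, min(start + 3, length)):
            out[i] = 1.0
    return out

signal = needle_pulse(SAMPLE_RATE)  # one second of pulses
flipped = [-s for s in signal]      # what a polarity-swapped path would yield

print(min(signal) >= 0.0)  # True: the original never goes negative
print(min(flipped) < 0.0)  # True: a flip shows up immediately
```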


The wave I recorded on track Y is the opposite of the wave of the oscillator on track X: track Y is pointing down instead of up. I guess this means my device is polarity-swapping? What are the implications of this? Is there a way to use the Gain plug-in's phase invert to correct the problem?

 

I appreciate your help. Thanks.

