Matt Mayfield Posted June 2, 2010

I've been thinking a lot about digital aliasing. When running at 44.1/48 kHz, many of Logic's distortion plugins seem to alias at least a bit; it's obvious when running a sine sweep through them, and it can be both heard and seen on a spectrum analyzer.

That got me thinking: some third-party plugins sound better, probably because their algorithms have better anti-aliasing. They likely use an upsample -> process -> downsample structure. But if you stack a whole bunch of these plugins for their sound, it seems like a waste of processing power to have every single plugin instance do its own up/down sampling.

Why not add a mode to Logic where, even though the audio is *recorded* at the original sample rate (44.1/48 kHz, since with good converters that's all you need for a clean unprocessed recording), the *plugins* all run at, say, 4x the recording rate? That would centralize the resampling: upsample once at the start of each channel strip, downsample once at the output. It would save processing power and disk space, and bring down aliasing artifacts.

In my quick experimentation, aliasing artifacts in the 20 Hz-20 kHz range from a Test Oscillator sine wave fed into the Clip Distortion plugin were brought down by more than 20 dB just by doubling the sample rate.
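The upsample -> distort -> downsample idea is easy to demonstrate outside of Logic. Here's a minimal Python sketch (numpy/scipy, nothing to do with Logic's actual internals): a hard clipper, a deliberately alias-prone nonlinearity standing in for something like Clip Distortion, is applied to a 5 kHz sine once at the base 48 kHz rate and once at 4x that rate, and the energy landing at non-harmonic frequencies is compared. All rates, levels, and the clipping curve are illustrative choices, not anything measured from Logic.

```python
import numpy as np
from scipy.signal import resample_poly
from scipy.signal.windows import blackmanharris

FS = 48_000          # base (recording) sample rate
OS = 4               # oversampling factor for the "plugin"
F0 = 5_000           # test tone, like a Test Oscillator sine
N = FS               # one second of audio

def distort(x):
    # Hard clip: an alias-prone nonlinearity (stand-in for a distortion plugin).
    return np.clip(2.5 * x, -1.0, 1.0)

def alias_energy_db(x, fs, f0):
    """Energy (dB) in bins NOT near a harmonic of f0 -- i.e. aliasing products."""
    x = x[2000:-2000]                      # drop resampling-filter edge transients
    w = blackmanharris(len(x))             # low-sidelobe window to limit leakage
    spec = np.abs(np.fft.rfft(x * w)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    guard = 8 * fs / len(x)                # ~8 bins of guard band around each harmonic
    near_harmonic = (freqs % f0 < guard) | (f0 - (freqs % f0) < guard)
    near_harmonic[:20] = True              # also exclude the DC region
    return 10 * np.log10(spec[~near_harmonic].sum())

t = np.arange(N) / FS
sine = 0.8 * np.sin(2 * np.pi * F0 * t)

# 1) Naive: distort at the base rate; harmonics above Nyquist fold back in-band.
naive = distort(sine)

# 2) Oversampled: upsample -> distort -> downsample (anti-alias filtered).
up = resample_poly(sine, OS, 1)
oversampled = resample_poly(distort(up), 1, OS)

improvement = alias_energy_db(naive, FS, F0) - alias_energy_db(oversampled, FS, F0)
print(f"aliasing reduced by {improvement:.1f} dB")
```

With a hard clip, the 5th harmonic (25 kHz) alone folds to 23 kHz at full strength in the naive path, while in the oversampled path it exists legitimately at the 192 kHz internal rate and is filtered out on the way back down, which is where most of the measured improvement comes from.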