
m. Anodyne


  1. So, there have been a number of times when (for non-professional reasons) I've done smart tempo analysis on a commercial track, and Logic just got it wrong. For example, Haddaway's "What Is Love" (radio release). Logic came up with something around 80bpm at 4/4. The actual bpm is around 120, and the downbeat/bar skew is wrong in a way that matches that. I don't expect Logic to always get it right, but there are only the 2x and 1/2x buttons to adjust - no 2/3 or 3/2 or whatever. Is there a way to numerically adjust the mis-analysis? The tempo actually varies across the song, so attempting to drag by scale or something doesn't seem likely to succeed.
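For what it's worth, the missing 3/2 button is just arithmetic: every tempo point in the mis-detected map gets multiplied by the same rational factor. A minimal sketch of that correction (the tempo map values here are made up for illustration):

```python
from fractions import Fraction

def rescale_tempo(points, ratio):
    """Multiply every detected tempo by a fixed ratio, e.g. 3/2 to turn
    a mis-detected ~80 bpm map into the actual ~120 bpm map, while
    preserving the per-bar tempo drift the analysis found."""
    return [(pos, bpm * float(ratio)) for pos, bpm in points]

# Hypothetical (bar, bpm) pairs from a mis-analysis:
detected = [(1, 79.8), (17, 80.4), (33, 80.1)]
corrected = rescale_tempo(detected, Fraction(3, 2))
# every point now lands near 120 bpm
```

The annoying part isn't the math, of course - it's that Logic doesn't expose its tempo points to this kind of bulk numeric edit in the first place.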
  2. Well, after much experimentation, it would seem that AN answer would be to use Reaper, which natively supports doing this sort of thing. See section 19.9 "Parameter Modulation Under Parameter Control" in their user guide. There are also some simple examples of what you might use this sort of thing for. Reaper isn't initially appealing to me - I'm pretty fond of Logic, but I'm not at all interested in getting into DAW comparison discussions. OSC looked interesting, generally, but most of the recent comments related to it seem to say "it seems broken in Logic now", and point people to the proprietary control surface SDK that's available from Apple (under NDA, I'm assuming). I wasn't able to find a way to wire plugin parameters together the way I would want to in any of the other solutions that function inside Logic, and have timed out on my investigation phase. Thanks y'all for helping on my exploration...
  3. It's quite easy to demonstrate that this statement is false. Automate Center Freq with FAP on a channel with a synth on it, enable Latch, hit Play, start playing a few low notes... wait a sec, play some high notes. Hit Stop. Logic will show automation values like 511, 518, ..., 2995, 2999, etc. Automation values for this AU and others are clearly recorded in Logic as something besides 7-bit integers - presumably they match the floating point values that the AU SDK presents to AU hosting apps. These statements are incorrect - see above. As previously stated, my goal was to use data that was neither MIDI nor an audio stream, but other derived data (like the derived center frequency) that could be communicated with sufficient precision from one plugin to another so as to be useful for my needs. In exploring this, it became apparent that Logic's event communication mechanism was the limiting factor. This is why the discussions of other AU hosting mechanisms that might offer a higher-precision event pipe were relevant. Writing a meta-AU whose only purpose is to supply that event pipe is a possibility. Writing an AU that does all the processing itself is, indeed, a possibility, but would mark this whole approach as a dead end. I'm used to being able to string together existing pieces of functionality to accomplish my goals, rather than reinventing the wheel several times over.
  4. No it's not, it is Side-Chaining but using MIDI not AUDIO. Sorry, I guess that was too tongue-in-cheek for the internet. Did you follow the URL instructions that you posted, but using the values specific to your setup, not the same values that are in the pictures? I did and it's working. In fact, there are two solutions posted in that link - did you try both of them? Can you post the same exact pictures but with your setup values using both solutions? I was unable to find a way to get any values with a precision greater than 7 bits (0-127) to be passed along or manipulated via Logic between two plugins via this mechanism. Obviously using a MIDI CC value event (the second solution proposed in the linked article) limits things to 7 bits (I didn't see an MSB/LSB solution like the 14-bit MIDI pitch bend controller), and as we've discussed here, it appears that the non-MIDI Logic Fader events (the first solution) as exposed only have 7 bits of resolution available. It'd be awesome if I'm just missing something! But, as it stands, using the built-in routed messages (Fader or MIDI CC) does not appear to provide sufficient resolution for frequency values in particular, and that's an essential part of what I want to experiment with. The value parameters themselves appear to be double-precision floating point (which is, what, 52 bits of mantissa? something like that...), at least to the Audio Units and code dealing directly with them, so the issue isn't the bit resolution of the parameters, just how they're packaged and routed through this part of Logic's visible machinery.
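To put numbers on the 7-bit-vs-14-bit point: assuming a frequency parameter mapped logarithmically over 20 Hz-20 kHz (a common mapping, but an assumption - the actual curve isn't documented), you can compute what each resolution does to a 440 Hz value:

```python
import math

def quantize_freq(freq_hz, bits, lo=20.0, hi=20000.0):
    """Normalize a frequency on an assumed log scale over lo..hi, round
    to an n-bit integer (what a 7-bit Fader/CC pipe carries), then map
    it back to Hz to see how much precision survived."""
    steps = (1 << bits) - 1
    norm = math.log(freq_hz / lo) / math.log(hi / lo)
    return lo * (hi / lo) ** (round(norm * steps) / steps)

# 7 bits: adjacent steps near 440 Hz are ~24 Hz apart (almost a semitone).
# 14 bits: adjacent steps are a fraction of a Hz - "fine-ish" territory.
seven = quantize_freq(440.0, 7)
fourteen = quantize_freq(440.0, 14)
```

So for frequency, 7 bits is nearly semitone-sized steps across the whole audible range, which is exactly why it's a non-starter here.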
  5. Thanks - downloaded that, taking a look.
  6. The Environment is definitely Morlock territory. I described it to my girlfriend as being like going from a shiny and new looking office building down to the basement where all the leaky steam pipes and sharp corners are. I mean, especially compared with some other basements, it's pretty nice, but still. Anyways, I'm not at all wedded to using Environment as a solution for this - I just saw it in the BC docs and gave it a go. If Environment machinery/cabling does quantize down to 7 bit (and it's not just a display/interpretation issue) then bleh. I'm just trying to avoid reinventing the wheel - if I can use the high resolution outputs of one plugin to drive the inputs of another, that's all I really care about. I have some ideas sonically I'd like to try that would require a LOT of work if I'm not able to do something like this - basically I'd have to put together a plugin that had capabilities similar to ALL of the plugins I'm interested in, and pass parameters in between those chunks of code. Re-wiring, say, a frequency analysis output to one type of EQ, and then switching to another EQ type would be orders of magnitude simpler than finding a working C++ implementation of EQ type A and then finding an implementation of type B, etc. Spending the $90 on PatchWork would definitely be worth it if their "parameters mapping capabilities" let me do this. I just watched a bunch of the videos, though, and it looks like there are major chunks of what I would need, but not exactly a "wire these two knobs together" facility that would let me connect an OUT value from an analysis plugin to an input control on another. I'm guessing that I should write to them directly after I sift through the docs for Plug'n Script.
  7. FWIW, I do routing/summing auxes all the time, for the reasons already brought up. Nested summing stacks would be a UI convenience for a variety of reasons, but until/if Logic supports those, I just set up the aux equivalent - I could use VCAs too, but I don't. But my question, in this context, was mostly to figure out which problem you're running into. The extra surprising thing here, now, is that you're describing a problem where the audio on the SOURCE bus is delayed if you send it to another bus? So, if you have (just for purposes of simplification), an audio track with recorded drums on it. You have a send from that track to bus 130 (that has your reverb on it), and bus 129 (that has your compression on it). Both 129 and 130 have output to Stereo Out. Everything's great. Then you add a send from 129 to 130. Now the output from bus 129 is messed up. You turn off the send from 129 to 130 (by turning off the power button for the send), and the output from bus 129 is still hosed. But if you remove the send from 129 to 130 (by clicking on the send to get the pop-up, and choose No Send), then 129 works correctly again. That's my understanding of your situation - please correct me if I missed something. So, if you disable/remove all the plugins on 129 does the bad behavior go away? Same question for 130? And for both 129 and 130? Or does the problem only show up if the source track is something more complicated than a raw audio track? Sorry if this seems overly involved - I understand you're in the middle of trying to get s#!+ done. I'm a programmer, though, and divide and conquer is how we try to find out exactly what's going on.
  8. Hopefully others respond as well, but this surprises me... Sending one signal to another without any plugins involved would, I would think, introduce zero latency. If you plop an audio file into a track, then send it through 8 plugin-less aux channels, convert the last aux channel to a Track and then bounce the track, is the waveform in the bounced track really offset in time by any perceptually significant amount? As for aux channels with plugins on them, is Preferences->Audio->General->Plugin Latency Compensation set to "All"?
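If you want to check the offset question outside of Logic's UI entirely: bounce both versions, pull the raw samples out, and cross-correlate them. A pure-Python sketch of the measurement (the synthetic click below is a stand-in for your exported audio; in practice you'd read the samples from the bounced files):

```python
def measure_offset(reference, bounced, max_lag=64):
    """Brute-force cross-correlation: return the lag (in samples) at
    which the bounced waveform best lines up with the reference."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(r * bounced[i + lag]
                    for i, r in enumerate(reference)
                    if 0 <= i + lag < len(bounced))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Synthetic check: a click delayed by 12 samples reports a lag of 12.
click = [0.0] * 100
click[10] = 1.0
delayed = [0.0] * 12 + click[:-12]
print(measure_offset(click, delayed))  # 12
```

Anything under a few dozen samples at 44.1 kHz is well below a millisecond, i.e. not perceptually significant.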
  9. Yep, as I was following the BC tutorial I did stick in multiple monitors to watch what was going on. I also stumbled upon the "Automation Events List" window (control-Command-E). It would appear that Automation Fader events are synthetic - that is, if you move automation keyframes, the list of automation events changes. Manual edits of the automation events are temporary until you change automation keyframes and the whole list is regenerated. I agree that BC's usage of treating output data as fake inputs so as to make them automatable is... novel, and a clever abuse of how things are "supposed" to work. As I mentioned about the copy/paste from one automation value to another, you can manually "move" this output data to a different automation value (like moving the frequency output from BC to an EQ frequency control) - unfortunately, there's a weird scaling issue between the two. Something that shows up as 443 (pretty close to 440 for that A...) in frequency output shows up as 24Hz when it gets moved to the EQ automation value. This scaling, and the quantization to 7 bits (0-127) in terms of the Fader events, makes them less useful, at least with the knowledge I currently have. I don't know how to apply a scalar to all automation keys for a particular value in a track, for example. I can use the UI to "Trim" the values (which seems to scale, rather than offset, the values) by hand, but that's obviously more error-prone than providing a numeric value. Basically, I want better access to the raw automation values, and it seems that (outside of plugins) I can't currently even get them in a 14-bit form - just the synthetic 7-bit form. I didn't get to reading the AU docs yesterday, so I'm still ignorant about what's available in there. I suppose for the relatively hardcore users, you could provide an AU that offers an input automation value, a transform matrix UI, and an output automation value to alleviate any precision issues by dealing with the values "natively."
Just thinking out loud, and still on my first cup of joe... I'm missing what's preventing anyone from translating back and forth between MIDI and Fader data. The Transformer node in the Environment lets you translate Fader events (Status=Fader) to MIDI (e.g., Control, Thru, Fix=7, Range=4,110), or the other way around, and cable it off wherever you want. Wasn't familiar with BC PatchWork... interesting that it would let you host VSTs inside Logic, or so it would appear.
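On the 443 Hz to 24 Hz mystery: that's exactly the kind of number you get when a source plugin's normalized 0..1 automation value is reinterpreted through a different curve by the destination plugin. The two mappings below are guesses (neither plugin's curve is published), but a linear source scale read back through a log 20 Hz-20 kHz scale turns 443 Hz into roughly 23 Hz, which at least matches the symptom:

```python
import math

LO, HI = 20.0, 20000.0   # assumed parameter range for both plugins

def linear_norm(freq):   # hypothetical source curve: linear in Hz
    return (freq - LO) / (HI - LO)

def log_denorm(norm):    # hypothetical EQ curve: logarithmic in Hz
    return LO * (HI / LO) ** norm

def log_norm(freq):
    """The value you'd need to write into the EQ's lane so the EQ
    actually displays `freq` - i.e., invert the EQ's own curve."""
    return math.log(freq / LO) / math.log(HI / LO)

mangled = log_denorm(linear_norm(443.0))  # roughly 23 Hz: the symptom
fixed = log_denorm(log_norm(443.0))       # 443 Hz on the nose
```

Which suggests the copy/paste isn't scaling the values so much as copying normalized positions verbatim between two parameters that disagree about what those positions mean.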
  10. Some interesting follow-up: If you open up Automation for a track with a BC analysis tool, it shows "OUT: xxx" for a bunch of the automation channel values. So, I picked "FreqAnalyst: OUT:Center Freq(1)", set it to Latch, and hit Record... I have the Classic Electric Piano instrument on that track, so that's what the signal source was. Logic recorded the changing values (not weirdly quantized or anything, either) that FA was spitting out. Then I found this: If you hold down Command when you pick a different automation value (like "Channel EQ: Peak 2 Frequency"), Logic will let you copy the automation values over to that. Obviously, this would be a clunky AF workflow to actually use, but I thought it was progress, and I hadn't ever seen the automation value copy/paste stuff mentioned before.
  11. Does anyone have any extensive references to plugin "Fader" usage? For example, Blue Cat mentions using it here: https://www.bluecataudio.com/Tutorials/Tutorial_SideChain_LogicPro/ (the URL is a lie... this doesn't have anything to do with side chains as we normally know them) Specifically, you can tell their plugin to output certain values as "Automation Output" which, apparently, Logic treats sorta like, but not exactly like, MIDI controller values. You can also tell their plugin to spit out actual MIDI controller values as a separate option. Now, for something like a volume level, 127 values is probably plenty of resolution. But for frequency values? Not so much. 14 bits would probably be fine-ish. A 32-bit float would definitely do the trick with room to spare. Anyways, for the sake of example, if I were to take the "Center Frequency" output from their "FreqAnalyst Pro" plugin and connect that to the input for Logic Pro's Channel EQ "Peak 2 Frequency" I would think that I could have a simple setup that would (basically) isolate the root frequency of a simple source signal. Burning the output of the plugin into an automation channel's values would also be a possibility. I'm comfortable mucking about with the Environment, writing script, whatever. Getting an Audio Unit compiling is a bit of a pain in the ass, but also do-able. But the documentation for this part of Logic (the Fader values) feels pretty thin on the user side. I'll go spelunking in the AU dev docs now, but I figured I'd ask if this is familiar ground for anyone. Why I would want to do this, or anything similar, is a different subject, entirely. Basically, I want to be able to connect existing and highly functional tools together in Logic to achieve results that aren't standard fare. Yes, I have a *nix background, lol. FWIW, I'm currently using the demo version of that plugin to see if it fits my needs. It definitely does spit data out.
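As an aside on the 14-bit option: paired controllers (CC n carrying the MSB, CC n+32 the LSB, per the MIDI 1.0 convention) would give you 14 bits if both ends honored the pairing. The packing itself is trivial:

```python
def to_cc_pair(value14):
    """Split a 0..16383 value into the (MSB, LSB) 7-bit bytes that a
    paired CC (e.g. CC1 + CC33) conventionally carries."""
    if not 0 <= value14 <= 16383:
        raise ValueError("14-bit range is 0..16383")
    return (value14 >> 7) & 0x7F, value14 & 0x7F

def from_cc_pair(msb, lsb):
    """Reassemble the 14-bit value on the receiving end."""
    return ((msb & 0x7F) << 7) | (lsb & 0x7F)

msb, lsb = to_cc_pair(12345)   # (96, 57)
assert from_cc_pair(msb, lsb) == 12345
```

The catch, as discussed above, is that neither the Fader pipe nor the plugins on either end appear to speak this pairing, so the extra 7 bits have nowhere to go.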
  12. Wouldn't it be best to only kill any non-zero-velocity note on events iff the channel is muted, and leave everything else alone? That way, pitch bend, etc., goes through. Kudos for writing the code, though.
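To make the suggestion concrete, here's a sketch of that filter (function name and event representation are made up; in a Logic Scripter context it'd be JavaScript, but the logic is identical):

```python
NOTE_ON = 0x90  # status nibble for note-on

def filter_event(status, data1, data2, muted):
    """While muted, swallow only real note-ons (velocity > 0).
    Note-offs, velocity-0 note-ons, pitch bend, CCs, etc. all pass
    through, so held notes still release and controller state stays
    coherent. Returns None to drop the event, else the event itself."""
    is_real_note_on = (status & 0xF0) == NOTE_ON and data2 > 0
    if muted and is_real_note_on:
        return None
    return (status, data1, data2)
```

The payoff of passing note-offs through is that a note sounding at the moment you mute doesn't hang forever.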
  13. There's also starting Logic in "safe mode" as a diagnostic help. https://support.apple.com/en-us/HT203231 Usually if a program is crashing, it's possible to figure out what the issue is by going through the crash dump - I'm a programmer, so reading that stuff is what I do, but it looks mostly like gibberish to normal people. It is possible that one of the misbehaving plugins caused Logic to get into a bad state internally without actually crashing, and then some garbage got saved into a project file that is causing problems when you reload it. If you can quickly re-create the project without too much pain, that might be the path of least resistance?
  14. Well, either Logic itself is getting overloaded, or something else (like Spotlight indexing) is happening at the same time and interfering. Take a look at the Activity Monitor utility and use Google to help understand what all is running on your machine.
  15. The old fashioned version of backing everything up to the cloud was to have two physical drives, and never have both of them in the same place. Like, take one to work and leave it in a drawer, then swap the one at home and the one at work every week or something. That way if your house burns down or something equally awful, you still have your data somewhere else. But, yeah, just use Time Machine.