
bhuether

Member
  • Posts

    116
  • Joined

  • Last visited



  1. Ok, thanks! I will definitely see if Logic's existing group functions can come close to what I am trying to do.
  2. Hi, I think folders don't really make sense for what I am trying to do, because at one moment I will have some collection of regions whose volume I want to control together, and at another moment a different set of regions, where the regions are already part of folder sum tracks. For instance, I have sum folders for Strings, Woodwinds, etc. At some moment I want to control the volume of Violins 1 (in one folder) together with Flute 1 (in another folder), and ideally do this in a way where their volume automation is clearly visible. I think what I am hoping to do is not readily doable in Logic; it would require some new abstraction object. Maybe call it Dynamic Groups. It could also have a track lane, so over time you would see the Violins 1/Flute 1 grouping (and its automation), likewise with other groupings over time. This would be really great. It would also be great for these groupings to show up by color. Maybe in general my strings are orange and woodwinds yellow, but at the point where Violins 1 and Flute 1 are grouped as a melodic unison, I would see them colored how I want. That way I could easily visualize how a melody or countermelody is passed around throughout a piece. Yeah, this would be awesome! I suspect a lot of orchestral arrangers would love such a feature in Logic.
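The "Dynamic Groups" idea above could be modeled as time-ranged groupings that share one color and one gain curve. A minimal Python sketch under that assumption (every name here is hypothetical; this is not a Logic Pro API):

```python
# Hypothetical model of a "Dynamic Group": a time-ranged set of tracks
# sharing one color and one volume automation curve. Not a Logic API.
from dataclasses import dataclass, field

@dataclass
class DynamicGroup:
    name: str                 # e.g. "Vln1+Fl1 unison"
    color: str                # display color for all member regions
    start_bar: int            # active from start_bar ...
    end_bar: int              # ... up to (but not including) end_bar
    tracks: set = field(default_factory=set)
    gain_curve: list = field(default_factory=list)  # (bar, dB) points

    def contains(self, bar: int) -> bool:
        return self.start_bar <= bar < self.end_bar

def active_groups(groups, track, bar):
    """Which groups govern a given track at a given bar?"""
    return [g for g in groups if track in g.tracks and g.contains(bar)]

groups = [
    DynamicGroup("melody A", "red", 9, 17, {"Violins 1", "Flute 1"},
                 [(9, -6.0), (13, 0.0), (17, -6.0)]),
    DynamicGroup("melody B", "blue", 17, 25, {"Horns", "Trombone"}),
]
print([g.name for g in active_groups(groups, "Flute 1", 12)])  # ['melody A']
```

The point of the model is that membership is a function of time, not a fixed mixer group, which is exactly what the post says Logic's existing groups lack.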
  3. Hi, I am realizing lately that I am not properly using Logic's various volume control functionality. For an orchestral piece I need a more efficient way to control the volume of parts that are meant to move dynamically together. For instance, for a few bars horns and trombone need to rise in volume together; then a few bars later horns and flutes, then oboe and clarinet, etc. As it stands now I have been doing everything track by track, drawing individual volume curves, but it is getting out of control. So what is the proper way to do this sensibly? thanks
  4. Hi, I have a project on Catalina that I am trying to work on in Ventura. The way East West licensing works, I can only have a license active on one computer at a time. No problem: I can deactivate on Ventura to get access again on Catalina. I am first trying to open the project again on Catalina to verify that I have everything I need for Ventura. So here is the question: do I need to do anything in Logic on Catalina to prep for opening the project in Ventura? If I open it in Ventura, will all instances of Play and Opus load the same way as they did in Catalina, with the same settings, or do I need to manually save patches in every instance of Play and Opus in order for them to load properly in Ventura? thanks
  5. After hours of experimenting it seems my problem wasn't with Neutron and Ozone but with a version of Relay that was too old. So when I installed Ozone and then Neutron, I think Ozone replaced Relay and Logic couldn't validate it, but when I reinstalled Neutron with Relay things seemed to work. Or vice versa; I already forget the order I did this in... Strange that a company that reputable would have such nightmares with version control/compatibility.
  6. But what are the exact version numbers you are using? 3.x.x 9.x.x
  7. Upgraded from Catalina to Ventura. I see Logic is failing scans of iZotope plugins such as Ozone 9.11, Neutron 3.8, and RX 7. On the iZotope website I see that Ventura is not supported with these plugin versions. What do I do with old projects that use the unsupported plugins? If I upgrade to, say, Neutron 4, will I have to manually replace every instance of Neutron 3 in every project? Just trying to figure out a painless way to go about this. I am not going back to Catalina because I really need Ventura. thanks
  8. That function did indeed create tracks. What I had done was chop up regions, reposition them, and resize them. I didn't realize this behavior existed. Any advice on how to avoid it? The thing is, I don't actually see the regions overlapping. If I drag away a region that is exhibiting this behavior, there is no other region underneath. Maybe there was previously, but then I resized regions so that I had all these non-overlapping clips. Maybe Logic is not paying attention to all the resizing and is incorrectly seeing overlap when in fact there isn't. Thanks!
  9. I guess I just expect scissors to work the way I am used to in other DAWs. That is, apply a cut, and the only result is two regions, maybe with a crossfade at the cut position; I don't expect the lengths of the regions to be altered. If a region is 10 seconds and I cut in the middle, I would expect only the resulting regions to the left and right of the cut, each now 5 seconds long. I wouldn't expect the new region to the right of the cut to be shortened, as is happening here. I would expect the right edge to stay unmoved, as I don't see any intuitive reason for a cut to alter a region's length so drastically.
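The expected behavior described above can be stated precisely: a cut at time t partitions a region without moving either outer edge. A tiny Python sketch of that invariant (a model of the expectation, not of Logic's actual scissors tool):

```python
# Model of a length-preserving scissors cut: splitting a region at time
# `cut` yields two regions whose lengths sum to the original, with the
# right edge unmoved. Illustrates the expected DAW behavior only.
def split_region(start, end, cut):
    """Split [start, end) at cut; returns ((start, cut), (cut, end))."""
    if not (start < cut < end):
        raise ValueError("cut must fall inside the region")
    return (start, cut), (cut, end)

left, right = split_region(0.0, 10.0, 5.0)
print(left, right)  # (0.0, 5.0) (5.0, 10.0) -- right edge stays at 10.0
```

The behavior reported in the thread violates the second invariant: the right region's end moves left after the cut.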
  10. Hi, I suspect there is some plugin that might be severely interfering and causing these spikes. Sometimes I see a message about Logic losing audio/MIDI sync, stating that the reported sample rate was some odd number like 25528 or something similarly strange. In the project I am using some free plugins, maybe some that cause havoc. As far as latency-inducing plugins go, nothing crazy as far as I know; nothing like iZotope De-click. I finished the project, so I probably won't dig deeper. During the project I noticed other weird problems, though maybe they are normal behavior. For instance, I have audio takes that I chopped up and placed in various places. If I use scissors on one of those audio regions, it shortens the right edge of the region with each successive scissors divide. Nothing odd in the snap settings, and I haven't checked whether I can reproduce this in another project. Anyway, maybe it is time to upgrade Logic and see if things are better. Made a new post about it: thanks
  11. Hi, I am noticing something really odd. I recorded some audio, then chopped it up and placed the chopped regions all over the place. Now when I use scissors on a region, the right edge of the region is shortened after using the scissors, as shown in the attached screen recording: https://drive.google.com/file/d/1CNNNUyWngS2BckOJX3tmkWwv927cEd1O/view?usp=sharing Any idea what could cause this? I have tried changing snap and editing settings, but so far nothing works. thanks,
  12. The automation I was doing was indeed quite vertical. When I eased in the automation I still noticed the problem. Also, the CPU load is higher than I thought; I noticed it was spiking during automation. Even then, I am still not sure how played-back audio could even make it to the audio output driver stage with automation being ignored, unless perhaps during high CPU use Logic drops some of the automation to reduce workload. Thanks for the idea about just cutting at those moments and using crossfades. I hadn't thought about that.
  13. Hi, I am just automating the volume. It is a track whose output is set to a bus.
  14. Hi, I am noticing that playback in Logic often does not match what is going on with automation. For instance, in the screenshot I show a plosive in audio that I use automation on. I set the volume at that point in the audio to null just to test; nonetheless I hear the plosive during playback. I have tried every imaginable setting that deals with plugins, latency, and compensation. Nothing works. This happens, for instance, on this soloed track that has one EQ instance, going to a bus with one EQ and one compressor. The card is an RME Babyface Pro, buffer 1024. No other programs are using the disk appreciably. During playback the CPU meter goes to about 50%. I suppose even when a track is soloed all loaded plugins in the project still drain CPU, but I would still think playback would be in time with automation in most cases, since it seems to me audio is rendered after automation is applied. So I am not sure how, during playback, it is even possible to hear audio that hasn't had automation rendered... Ideas?
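The reasoning in that post — that a null volume at the plosive should silence it because gain is applied before audio reaches the output — can be illustrated with a toy render loop. This is a sketch of the general principle of per-sample automation, not Logic's engine:

```python
# Toy illustration of volume automation applied during rendering: if
# the gain curve is 0.0 (null) at the plosive's samples, those samples
# come out silent. Hypothetical model, not Logic Pro internals.
def render(samples, gain_at):
    """Multiply each sample by the automation gain at its index."""
    return [s * gain_at(i) for i, s in enumerate(samples)]

audio = [0.2, 0.9, 0.9, 0.2]   # samples 1-2 stand in for the plosive
gain = lambda i: 0.0 if i in (1, 2) else 1.0
print(render(audio, gain))  # [0.2, 0.0, 0.0, 0.2]
```

If playback nonetheless contains the plosive, the gain being applied at render time evidently differed from the curve shown in the editor, which is consistent with the timing/CPU explanations discussed in the thread.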
  15. Hi, this is sort of related to my post here: Wow, this topic is far more complicated than it would seem! So here is my scenario: I recorded a video of me playing guitar, which resulted in Track 1. It is a solo guitar performance played in 3/4 time, but sometimes I let notes be held and the tempo drifts. Then I recorded the same guitar part at studio quality, played the same way, but again with the tempo drifting. I want to replace the video audio with this high-quality audio and have the audio synced to my actual playing. Yes, this is how I prefer it; most people say to do the opposite, but my goal is to have the quality audio in good sync with the video of me playing. The recordings aren't that far off time-wise, so I am not worried about artifacts.
  Where I am at:
  - I used the video audio file and applied Smart Tempo. I don't need this file to stretch; it is basically the master track for establishing the grid. In the Smart Tempo project settings I wasn't sure about choosing "On", "Bars", or "Bars and Beats". In resolution I also wasn't sure about Smooth versus Beats; there is very little info about these nuances... In the control bar I chose "Adapt".
  - I then set things to 3/4, edited the bar/beat locations, chose "Apply Region Tempo to Project" in the Smart Tempo editor, and kept the default box checked in the pop-up window.
  - Next I imported the quality audio and used Smart Tempo on it, with "Keep" selected in the control bar. I edited in the Smart Tempo editor so that the bars/beats were identified.
  What I need:
  1) Conform the quality audio so that its bars/beats match the video audio. But now what?? In the screenshots you see the Smart Tempo editor result for both tracks. If I turn on Flex for the quality audio track, it is going to apply the project tempo changes. I don't want that: how the video audio varied in tempo is not related to how the quality track varies in tempo. I just need the quality version to have its bars/beats stretched so that they match the project bars/beats that were established when I used Smart Tempo on the video audio. Maybe Smart Tempo is not what I need for the quality audio track? Maybe it is a quantize function of some sort? If so, how can I tell Logic to use the bars/beats for that track that I already established?
  2) Have a tempo map ready so that MIDI tracks can be added and be in sync. I suppose I get this result automatically from the first Smart Tempo run I did on the video audio.
  Anyway, really hoping someone can shine some light here. thanks
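Mathematically, what the last question asks for is a piecewise-linear warp: stretch the quality audio so each of its detected beat times lands on the corresponding project beat time established from the video audio. A rough Python sketch of that mapping, with made-up beat lists (illustrative only; not a Smart Tempo API):

```python
# Sketch of conforming one recording's beats to another's grid: map a
# time t in the quality audio onto the project timeline by linear
# interpolation between 1:1-paired beats. All values are hypothetical.
def conform(t, src_beats, dst_beats):
    """Warp time t from the src beat grid onto the dst beat grid."""
    for i in range(len(src_beats) - 1):
        a, b = src_beats[i], src_beats[i + 1]
        if a <= t <= b:
            frac = (t - a) / (b - a)
            return dst_beats[i] + frac * (dst_beats[i + 1] - dst_beats[i])
    raise ValueError("t outside analyzed range")

quality_beats = [0.00, 0.52, 1.01, 1.55]   # detected in quality audio
project_beats = [0.00, 0.50, 1.00, 1.50]   # grid from the video audio
print(conform(0.765, quality_beats, project_beats))  # about 0.75
```

This is exactly what a flex/quantize operation would do under the hood if it used the track's own detected beats as the source grid, which is why the question of how to feed Logic those beats is the crux.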