Routing MIDI FX across multiple tracks [Fixed in 10.7.2]


WaxTrax

UPDATE: This seems to be completely resolved as of v10.7.2, tested on an M1 Max MBP with macOS v12.1. It now works like I would expect it to, with each channel remaining independent of the others, no scripts or messing around with MIDI busses and channels necessary -- it just works!

 

I modified this post with step-by-step instructions after my issue was solved to hopefully help anyone else trying to do something similar.

 

My goal was to use third-party AU MIDI FX (such as W.A. Production InstaComposer or the Reason Studios Reason Rack Plugin (RRP), which is used throughout this example) to generate MIDI notes and have them play through various software instruments, each on their own separate track, within Logic Pro X. Logic does not support a logical :-) workflow for this as of v10.7, at least not compared to a couple of other DAWs (such as Reason or Reaper).

 

After tremendous help from David Nahmani, Dewdman42, and especially ValliSoftware, I got it to work and their help was greatly appreciated. Here are the steps you can take to make this work.

 

===== Prerequisites:

1. Create an IAC Driver bus. This serves as an internal MIDI device for channel routing. I am pretty sure you will need one bus per 16 channels of MIDI for the upcoming steps. https://support.apple.com/guide/audio-midi-setup/transfer-midi-information-between-apps-ams1013/mac

 

2. Obtain a copy of ValliSoftware's "Allow MIDI Channels" script. This is the key to getting this to work. After you have a copy of the script, you can either load it as a file from the Logic Scripter MIDI FX plugin, or you can save it to ~/Music/Audio Music Apps/Plug-In Settings/Scripter/ so that it shows up as a preset. viewtopic.php?f=45&t=159112
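For reference, the core idea behind a channel-filtering Scripter script can be sketched in a few lines of JavaScript (Scripter plugins are written in JavaScript). To be clear, this is a simplified illustration of the concept, not ValliSoftware's actual script -- his preset exposes the allowed channels as checkboxes, whereas this sketch hard-codes them:

```javascript
// Simplified sketch of an "allow MIDI channels" filter for Logic's Scripter.
// NOT the actual "Allow MIDI Channels" script -- just the core idea.
// Scripter calls HandleMIDI() once for every incoming MIDI event; calling
// event.send() forwards the event, and returning without sending drops it.

// Channels (1-16) this track should accept. The real preset drives this
// from checkbox parameters instead of a hard-coded array.
var allowedChannels = [1];

function HandleMIDI(event) {
  // Forward only events whose channel is on the allow-list.
  if (allowedChannels.indexOf(event.channel) !== -1) {
    event.send();
  }
  // Events on all other channels are silently discarded.
}
```

With a copy of this on each instrument track (each with its own allow-list), every track only hears the channels it was told to accept.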

 

===== Project Setup:

The following instructions were created from a freshly-initialized/reset Logic configuration on version 10.7, and mostly mirror the instructions presented in ValliSoftware's video. If you are a more visual person, I highly recommend viewing the video. viewtopic.php?f=45&t=160763&p=843955#p843955

 

1. Inside a project, create a new "Software Instrument" track with instrument "Empty Channel Strip". Create the number of tracks that you want to have instruments for, plus one extra. For these instructions, I will create 3 tracks --- two will be for instruments, one will host the MIDI FX plugin (I will be using the RRP).

 

2. Select the tracks that will be hosting the individual instruments ("Inst 2" and "Inst 3" if you created 3 tracks in the last step). Right-click one of the tracks, then select "Create Track Stack" and create a "Summing Stack".

 

3. Select the "extra" track ("Inst 1" in this example). Click "Instrument" in the channel strip and select "Utility > External Instrument > Stereo" (mono or stereo does not matter here since this track will be used for MIDI data only, not audio).

 

4. In the window that appears, select "IAC Driver Bus 1" (or whatever you named it in Prerequisite step 1) for the "MIDI Destination". You can close this window now if you wish.

 

5. Select the "extra" track again (if it is not selected) and click "MIDI FX" in the channel strip. Load the plugin of your choice. The rest of these steps will be using the RRP.

 

6. In the RRP, drag one Player device into the rack for each track you want to generate notes for (two in this example). I will be using two instances of "Pattern Mutator". As a side note, for Pattern Mutator, turn off the MIDI Transpose option toward the upper-left of the player.

 

7. Inside the RRP, set the MIDI channels for each of the "MIDI Out" devices. I am setting these to 1 and 2, which will correspond to tracks "Inst 2" and "Inst 3", respectively.

 

8. For each track that will be hosting a software instrument ("Inst 2" and "Inst 3" in this example), click "MIDI FX" in the channel strip for that channel, and select "Scripter". Click "Factory Default" and either select the "Allow MIDI Channels" script if you saved it as a preset, or select "Load" to find and load the script file. Then, click the check box next to "Accept MIDI Channel" for the MIDI channel this track is to receive. Inst 2 receives channel 1, and Inst 3 receives channel 2 in this example. Repeat this for all tracks for which you set up RRP Player devices (and set their channels) in the previous two steps.

 

9. For each of the tracks modified in the last step ("Inst 2" and "Inst 3"), click "Instrument" in the channel strip and load your software instrument. For this example, it does not matter what you pick, just choose each instrument and patch, if applicable. You could even load a new instance of the RRP and use those instruments if you want (though it might not make sense to go through this whole process just to do that -- my goal was to use the RRP Players with Logic's instruments).

 

If your software instrument does not have the ability to self-audition patches, you will need some method of input that allows you to change MIDI channels, such as an external MIDI keyboard. You can also follow the remaining steps and choose patches or make adjustments to the instruments later. As ValliSoftware mentions in his video, another alternative is to record-enable and temporarily disable the script plugin for the channel you wish to audition. Just remember to re-enable the script when you are done.

 

10. Select the "extra" external instrument track ("Inst 1" in this example). Right/ctrl-click the timeline for the track and select "Create MIDI Region". Stretch/expand the region to the length it takes for your slowest RRP Player device to complete one repetition. I will use 8 bars in this example.

 

11. Select the track stack ("Sum 1") and set it to record-enabled.

 

12. Press the spacebar to start playback on the Logic transport. Your RRP Player devices are now active and can be adjusted as necessary. I recommend enabling playback looping on the transport and setting the loop range to match the MIDI region.

 

For the RRP, you do not need to create any notes inside the MIDI region for this to work. This may be required for other plugins, though. However, these instructions are just a starting point to demonstrate a working example. Nearly every step can be adjusted and tweaked for new creative possibilities.

 

I have included a copy of the Logic 10.7 project file I created following these steps so that you have a tweakable starting point if you wish. The project uses v12 of the Reason Rack Plugin with the Pattern Mutator player device.

Routing MIDI FX RRP Demo.zip

 

Special thanks again to ValliSoftware because this doesn't work without the script that he created.

 

 

Original post:

 

Hello everyone, first post here. Let me apologize up front if this question has already been answered --- I have spent several hours both searching for answers and trying to figure it out on my own, but I am not getting anywhere.

 

What I would like to do is have two or more individual softsynth tracks with their own MIDI FX for the ultimate purpose of creating generative music. I am fairly new to Logic, and I am using v10.7 on an M1 MacBook Air (though I also tried this on my older Intel MacBook Pro with v10.7 and had the same results).

 

I can use any instrument plugin as the sound source, but only the built-in MIDI FX are working as desired, where each track is completely independent. Third-party MIDI FX plugins aren't working correctly.

 

What I would like to do is use multiple instances of the Reason Rack Plugin (one per track/softsynth) and use the Player modules inside the RRP to generate notes inside Logic. If I begin with a single track, this works as you would expect it to with the Player device in the RRP generating the notes to be played by the softsynth plugin in Logic. As soon as I add a second track and assign an instance of the RRP as MIDI FX on the second track, the first track no longer generates sound. Additionally, the RRP assigned to the first track now generates notes for the second track. If I have two instances of the RRP (each assigned to their respective tracks' MIDI FX) both will generate sound simultaneously, but the notes from both RRP instances are sent to the second track.

 

I have tried different combinations of changing the MIDI channels, and creating multiple IAC MIDI busses to test, but nothing changes this behavior. I also tried W.A. Production InstaComposer in place of the RRP (and even one instance of InstaComposer and one instance of the RRP), but it exhibits the exact same behavior. This makes me wonder if there is a bug or limitation with Logic, or if I am just not setting up the tracks correctly (since I am somewhat new to Logic). But once again, it works exactly as I would expect it to (each track remaining completely independent) when using the built-in Logic MIDI FX.

 

One of my favorite aspects of Reason is using the Player devices to create generative music, and my ultimate goal is to be able to do the same thing from within Logic. There are so many other features of Logic that I think are great; if I could get this one aspect to work I think I might be a convert :-) Thank you for reading!



Sorry for the programming geek talk...

The members in your class are shared across multiple instances of your MIDIFX.

If you don't want that, the members have to be defined with @property/@synthesize.

 

I know this because I ran into the same issue with my MIDIFX program.

I set a value from the UI in one instance, and then all the other instances saw that change.

Once I implemented certain members with @property/@synthesize, then when I set a value from one instance of a loaded MIDIFX, only that one instance was set.
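Objective-C specifics aside, the pitfall being described can be sketched in any language (JavaScript here, since that is what Scripter uses): state stored at the class level is shared by every instance, while true per-instance members are not. This is an illustrative analogy only, not the actual AU plugin code:

```javascript
// Analogy for shared vs. per-instance state (illustration only, not the
// actual plugin code). A class-level property behaves like the shared
// members described above: changing it from one instance changes it for all.

function MidiFx() {
  this.channel = 1; // per-instance member: each instance has its own copy
}
MidiFx.sharedChannel = 1; // class-level state: one copy shared by everyone

var fxA = new MidiFx();
var fxB = new MidiFx();

fxA.channel = 5;          // only fxA sees this change
MidiFx.sharedChannel = 5; // every instance "sees" this change
```

In the shared case, adjusting a setting in one loaded plugin instance would silently adjust it in all of them -- exactly the cross-instance bleed described above.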

 

As far as InstaComposer is concerned, I've implemented that so I can take the MIDI output from InstaComposer and use Logic's MIDIFX to further modify the data.


In your video, you are generating sound for multiple tracks from just a single instance of InstaComposer, is that correct?

Yes.

InstaComposer can only use 5 MIDI channels, and if I set the MIDI channel numbers in Logic to match the MIDI channel numbers that are defined in InstaComposer, then the track will sound in Logic.

This is done easily using a Track Stack in Logic.


InstaComposer can only use 5 MIDI channels, and if I set the MIDI channel numbers in Logic to match the MIDI channel numbers that are defined in InstaComposer, then the track will sound in Logic.

This is done easily using a Track Stack in Logic.

 

I spent more time trying to figure this out, and trying to decode what you said in your original reply. Are you saying that you are using custom MIDI FX scripts in addition to multiple instances of MIDI FX plugins in order to somehow keep the instances separated? I also tried working with a summed track stack, but I made no progress on my issue. I tried following along with your video as well, but unfortunately it did not lead me toward any answers. In your video, though, you do seem to be very close to what I am trying to do.

 

Would you mind sharing a project file that implements this with InstaComposer? If I can get this to work the way I am desiring, I can reply to this thread with a step-by-step instruction set for anyone else who is trying to do this same thing.

 

To recap, what I am trying to do is:

 

Track 1:

Third-party MIDI FX plugin (InstaComposer, Reason Rack Plugin, etc) that generates MIDI notes on its own ---> software instrument (Retro Synth, AU plugin, etc)

 

Track 2:

Different third-party MIDI FX plugin (InstaComposer, Reason Rack Plugin, etc) that generates MIDI notes on its own ---> different software instrument (Retro Synth, AU plugin, etc)

 

Track X: another independent non-overlapping third-party MIDI FX --> softsynth

 

Just as examples: this concept works flawlessly with a single track per MIDI effect/softsynth via plugin stacking in Reaper, and it works across two MIDI tracks (1 MIDI effect, 1 softsynth) in Ableton Live. It doesn't matter to me if I need to do this across multiple plugins, tracks, channels, or busses; I just wish I could figure out how to make it work, because then Logic might become my primary DAW :-)

 

Thanks for your time, I appreciate it!


Thanks for the reply. The behavior is the same across multiple third-party MIDI FX, it's not specific to Reason. I tried with RRP and InstaComposer and the behavior is identical. This is regardless of any MIDI channel settings that I set anywhere within Logic or the plugins themselves. This is working as intended in other DAWs, just not in Logic, which is leading me to believe either Logic won't support this, or I am just not doing it correctly.

 

ValliSoftware seems to be really close to what I am trying to do, I have just been unable to replicate his setup as of yet.


Secondly, why are you using IAC or talking about that now? Perhaps you can elaborate a little bit more on your exact setup. It's not entirely clear in your first post...but what I can say is that all of my MIDI FX...running in MIDI plugin slots...are sandboxed to that track only...that includes both built-in and 3rd party, they are the same. As to why Reason would route to another track, I have no idea...other than either that is a "feature" of Reason...or you have it configured to send the MIDI over IAC...which is then routing it back around into the input of Logic Pro again...and that's why it's going to the other track. Probably you can configure Reason (and InstaComposer, it would seem) not to do that.

The IAC was just something I was trying out to see if it would solve the problem, but it didn't.

 

I'll go through step by step where I am at. I reset all my preferences to default.

 

1. New Project > 1 track > software instrument > built-in Retro Synth (the softsynth plugin used for this part does not matter)

2. Track 1 MIDI FX > Reason Rack Plugin (I also tried InstaComposer and Kushview Element)

2a. Drag a Player device into the rack (Pattern Mutator for example). This creates a default setting of MIDI Out on channel 1 inside Reason. For other plugins like InstaComposer, I am generating notes for MIDI channel 1 only.

3. Turn on transport cycle in Logic and press play -- notes from the RRP Player device successfully play using the softsynth assigned to Track 1 (Retro Synth).

4. Press Stop on transport.

5. Add new software instrument track in Logic

6. Track 2 MIDI FX > Reason Rack Plugin (this creates a new instantiation)

7. Drag a Player device into the rack. Still default MIDI Out channel 1.

8. Press Play on the Logic transport. Both RRP Player devices generate notes for Track 2, while Track 1 remains silent.

 

At this point I have tried all different combinations of channels both inside Logic and inside the Reason Rack Plugin (and Instacomposer and Kushview Element) and nothing has altered this behavior.

 

In other DAWs, following a similar workflow, the tracks remain independent. If I use the built-in MIDI FX like Arpeggiator, the tracks remain independent (though it doesn't self-generate notes like I'm trying to do).

 

If this only happened in Reason, I would completely agree that it's just Reason. But because the behavior is identical across multiple plugins, that has to mean this is either a Logic limitation or most likely I am just not setting this up correctly.


I just tried a test with InstaComposer; it works exactly as expected: separate MIDI, separate tracks, totally independent.

 

But one thing: the way InstaComposer is designed, you have to hit a key on your MIDI keyboard to trigger its playback...so when you hit a key on your keyboard, it's probably only triggering the one on the currently selected track.

 

I am still back to saying there is something about your Reason config.


I tried record-enabling both tracks. There was no change in output (both RRP Player devices send the MIDI output to Track 2 only), but there was a slight change in behavior.

 

Following the steps from before, I had to press play on the transport in order for the two player devices to work at all. I could not just hit the "Run" button inside the respective Player devices to trigger the notes -- they only generated notes when the Logic transport was running.

 

After record-enabling both tracks, I can now press the Run button independently (and simultaneously) in each RRP Player device and the pattern runs without the Logic transport running. However, the output is still such that the generated notes are only going to Track 2.

 

At this point, if I create a third software instrument track and repeat the process, the three separate RRP Player devices now trigger notes just for the third track, and tracks 1 and 2 are now silent.

 

The common thing I noticed though is that when an individual track is record-enabled, then I am able to press the "Run" button within the RRP Player device to generate notes without pressing play on the Logic transport.

 

The RRP (and InstaComposer) are MIDI channel capable on the output side. Is there some setting inside Logic that will force each track to accept MIDI FX only on a specific channel? Messing around with the track MIDI In Port, MIDI In Channel, and MIDI Out Channel has produced no results for me (again, unless I am just doing it wrong).

 

Thank you for the replies, I appreciate it!


I don't know what to tell you then; it's working properly for me. But I suspect your problem is how you are using things, yes...regarding the INPUT side.

 

LogicPro by default sends all input midi to the currently selected track (which automatically becomes record-enabled) and any other tracks that are also record-enabled.

 

If you want to use different midi channel inputs then you need to look into demix mode, which further complicates things. How do you have that project setting set by the way?

 

You should probably look into LogicPro 10.7 if you are going to mess with demix mode.

 

good luck


I am using v10.7. I'll look into the demix mode to see if I can get it working. I'm not afraid of complexity, I just want it to work :-) Once I can get it working I'll have no problem understanding how it works, I just have to get there first! :lol: Being new-ish to Logic and not understanding some of the internals is most likely what is hindering me here.

If you're using 10.7 then you have to check out the MIDI input port and MIDI input channel values in your project on each track. Don't use demix mode. I don't think it's even an option in 10.7 anymore.

 

Now, all that being said, I don't have 10.7 because I am on Catalina, so I can't verify that MIDI FX are still working properly...it's quite possible Apple broke something in the new version while implementing the new port/channel routing features.

Edited by Dewdman42

Are you saying that you are using custom MIDI FX scripts in addition to multiple instances of MIDI FX plugins in order to somehow keep the instances separated?

Yes on the script, and yes on the multiple instances of InstaComposer, but I play them one after the other; I don't play them both at the same time.

If I didn't use a customized script, all instruments in the Track Stack would receive and play the 5 tracks from InstaComposer.

To stop that, on each individual instrument in the Track Stack I use a customized script, which I posted here.

You would put that as the first script, then check the checkbox for which MIDI channel you want the instrument to play. In an easy configuration you'd want each Track Stack instrument to play an individual channel, 1-5. To experiment, let's say on the first instrument you check MIDI channels 1 and 2; the first instrument will then play what's generated from InstaComposer's channels 1 and 2.

If you happen to install a multitimbral instrument in the Track Stack, and since the bass in InstaComposer defaults to channel 3, you can have the Track Stack instrument receive MIDI channel 3; but notice in my script, you can reassign the MIDI channel number to whatever number the bass is on in your multitimbral instrument.
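The channel reassignment described above can be sketched as a small variation of a channel filter in Scripter-style JavaScript (again, a simplified illustration of the idea, not the actual "Allow MIDI Channels" script): accept one incoming channel and rewrite it to a different channel before forwarding.

```javascript
// Simplified sketch of channel remapping in a Scripter-style filter
// (illustration only, not the actual "Allow MIDI Channels" script).
// Example use: accept InstaComposer's bass on channel 3 and forward it
// on whatever channel the bass occupies in a multitimbral instrument.

var acceptChannel = 3; // incoming channel to accept
var remapTo = 4;       // channel to rewrite it to before forwarding

function HandleMIDI(event) {
  if (event.channel === acceptChannel) {
    event.channel = remapTo; // reassign the channel number
    event.send();
  }
  // Events on any other channel are dropped.
}
```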

 

I'll create a video that will show how to set up InstaComposer in Logic using a Track Stack. I'll make sure to only use what's in Logic and that customized script that I made available to everyone.

Now, while you can use the Environment and cable things, using a Track Stack is easier.

Plus, you can create another Track Stack with other instruments, and just by selecting a different Track Stack, you'll hear the MIDI from InstaComposer being played with a different set of instruments.

 

I'll post that video shortly.

BTW - This video shows that I loaded 2 InstaComposers, but I use that to extend InstaComposer.


I posted the setup over here

Thank you so much for the reply and the setup video. I viewed the video and will try it out as soon as I get a chance. I can now understand the workflow and I believe this will enable me to do what I am trying to do. In my case, using the Reason Rack Plugin, I will be using a single instance (just like you're using a single instance of InstaComposer in the setup video), and then within RRP have multiple Player devices, each assigned to a separate channel.

 

After I get a chance to try it out, if I get it to work I'll post back with step by step instructions for future reference for anyone else looking to do this.


After I get a chance to try it out, if I get it to work I'll post back with step by step instructions for future reference for anyone else looking to do this.

For sure, that'll just help everyone out. :mrgreen:

 

One other thing about that setup: once you're happy with what InstaComposer is sending you as MIDI data, just hit RECORD in Logic and that Track Stack will have a MIDI region with all the MIDI channels' worth of MIDI events. You can leave it on the Track Stack track, or split it by MIDI channel and Logic will put each channel on its own track.


Okay with the help of your script and video I finally got it to work! Just a quick question before I post a step-by-step: you wrote this script, correct? I just want to make sure I give proper credit. Thanks again!

Yes, I wrote the Allow MIDI Channels script, and I'm glad that helped you out, because it helped me out for the crazy stuff I do here. :mrgreen:


I just read the OP and feel like I understand. I just want to make sure this will help me in the way that I am thinking.

 

I have Stutter Edit 2 MIDI FX usually sidechained to a sub mix bus, but I have to open up multiple instances of this MIDI FX plugin because of the lack of flexibility in routing in Logic. I.e., I need one instance of Stutter Edit 2 MIDI FX for synths, one instance for bass, and another instance for drums, etc. But with the solution provided in the OP, am I able to use just one instance of Stutter Edit 2 for all 3 sub mix groups while maintaining my submix bus routes?

 

Edit: I only got as far as step 5, because I need to load Stutter Edit 2 as an AU MIDI-controlled FX plugin, not an FX plugin. I cannot insert Stutter Edit 2 into the MIDI FX slot; it needs to be in the input slot right where the external IAC bus plugin is located.


More info is needed about what you are trying to do. Where is the MIDI originating from that is going to control Stutter Edit?

 

The way Stutter Edit works is that, as you said, it sits in an instrument plugin slot as an AU MIDI-controlled FX that gets its audio from the side chain. You should be able to route all three submixes to that one side chain input via a bus. Then you just control the one instance of Stutter Edit. Maybe I'm misunderstanding your problem...?

