
SCRIPT: Channelizer



This script works by channelizing each articulation INSTEAD OF sending keyswitches. You will need to set up your instruments with each articulation listening on a different MIDI channel; no keyswitches need to be received.

 

Unfortunately, Logic Pro strips the articulation ID from events if you configure anything in the output section of the Articulation Set, so you cannot use this script to channelize while also sending keyswitches from the Articulation Set. Generally, though, keyswitches should not be necessary when using the channelizing approach.


no.

 

The MIDI events will be channelized (i.e., sent on a MIDI channel) corresponding to the articulation ID assigned to each event (presuming the articulation ID has not been stripped off by the Articulation Set's output section).
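The channel mapping itself is simple arithmetic. Below is a minimal sketch of the idea in Scripter-style JavaScript; the function name and the exact formula are my illustration of the behavior described in this thread, not the actual Channelizer source:

```javascript
// Sketch only: assumes the mapping described above, where articulation
// ID 1 plays on the source channel itself and each higher ID moves one
// channel up. Channels are 1-16; articulation IDs start at 1.
function destinationChannel(sourceChannel, articulationID) {
  if (articulationID == null) return sourceChannel; // no ID: leave as-is
  return sourceChannel + articulationID - 1;
}

// In Logic's Scripter this logic would live inside HandleMIDI, roughly:
function HandleMIDI(event) {
  event.channel = destinationChannel(event.channel, event.articulationID);
  event.send();
}
```

So a note on source channel 1 with articulation ID 3 goes out on channel 3, while an event with no articulation ID passes through untouched.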

 

Some sample libraries do not actually provide key switching at all, so the method has previously been to put each articulation on its own track, with a different MIDI channel assigned to each track. With this script, you can use one source track per instrument and assign articulation IDs to each note... and then as it plays back, each note is automatically channelized to the right MIDI channel (and thus played by the right instrument with the right articulation sample). No keyswitches needed, and still just one source track for the entire instrument.

 

There are some other advantages to this approach over keyswitches. For example, you can easily adjust the volume of each articulation relative to the others, since each articulation is essentially a separate instrument with its own level control.



Thanks a lot for this script. I'm trying to integrate it into a template with Play instruments, and now that I've got it working, it's great. But I spent quite a bit of time running into problems. It seems that I have to actually disable the lowest channelized port to get it to work correctly. Since this wasn't the default setting, I'm guessing this is a bug, or I completely misunderstand the use of this parameter. I thought it was a way to filter out lower MIDI channels that I might want to use for keyswitching instruments on a single channel; in other words, channels below this one would be immune to the channel switching that the script uses to select articulations. This doesn't seem to be the case. I made a test instrument with 3 articulations on 3 different MIDI channels: 1, 2, and 3. I set up the articulation map with those three articulations pointing to their respective channels. I'm playing from a MIDI controller that's sending on MIDI channel 1. I can hear the articulation on MIDI channel 1, but as soon as I select the articulation on channel 2 or 3, I get silence and this error message:

 

WARNING: articulationID [2] channelizing exceeds range of 1 channels

WARNING: Muting channel overlap

 

This message seems to imply that the port number actually sets the number of allowed channels? I'm not sure; all I know is that with the default settings I can only get one articulation. Have I missed a setup step?


In the meantime, I will explain the error message you got. That message appears when you send notes from tracks whose MIDI channels overlap with the channel ranges that other source channels are channelizing into.

 

For example, let's say you have a source track on MIDI channel 1 and a source track on MIDI channel 2, and then on channel 1 you have a note with articulation ID 2. Channelizer will detect that as an error, because it is one: that note would be channelized onto channel 2, colliding with the second source track. You would need your source tracks to be on channels 1 and 3 to avoid that problem.

 

From the sound of your explanation, I think the problem is that you are doing something with your Articulation Set or your source channels that does not make sense with Channelizer. Do not assign any MIDI channels in the Articulation Set; let Channelizer do that. And your source channels need to stay out of each other's way.

 

So for example, if you have a source track on channel 1 with articulations 1-3, that track will use up channels 1-3 in Play. You should not have any other source tracks using channels 2-3; the next track should start at channel 4. Otherwise you will get the channel-overlap error message you mentioned.
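That spacing rule can be checked mechanically. Here is a hypothetical helper (the names and the data shape are mine, not from the script) that flags source tracks whose channelized ranges collide:

```javascript
// Hypothetical sketch of the overlap rule: each source track occupies a
// contiguous block of destination channels, starting at its source
// channel and spanning one channel per articulation. Returns the index
// pairs of tracks whose blocks collide.
function findOverlaps(tracks) {
  // tracks: [{ sourceChannel, articulationCount }, ...]
  const overlaps = [];
  for (let i = 0; i < tracks.length; i++) {
    for (let j = i + 1; j < tracks.length; j++) {
      const a = tracks[i], b = tracks[j];
      const aEnd = a.sourceChannel + a.articulationCount - 1;
      const bEnd = b.sourceChannel + b.articulationCount - 1;
      if (a.sourceChannel <= bEnd && b.sourceChannel <= aEnd) {
        overlaps.push([i, j]);
      }
    }
  }
  return overlaps;
}
```

For example, a track on channel 1 with 3 articulations collides with a second track starting on channel 2, but starting the second track at channel 4 is clean.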


There's only one track active in the entire project. It has a total of 10 notes: the first 6 use articulation 1 on MIDI channel 1, and the last 4 use articulation 3 on MIDI channel 3. My controller stays on MIDI channel 1, but I get the error message if I try to use articulation 2 or 3, and there's no sound from either the track itself or the controller, even though the MIDI In indicator shows the same activity no matter which articulation I pick. It isn't really a big deal, because I can simply pick (Disabled) for the port and everything works as advertised. In the meantime, I've attached the project and the articulation set so you can take a look. I could easily have screwed up something!

 

What is a big deal in this workflow has nothing to do with your excellent script. The problem I see is that these instruments use a lot of controllers and when the articulation changes involve channel changes, the CC data gets thrown into different lanes so there's no way to see the continuity of the controller data. Plus, you can easily wind up with 9 or 10 lanes of controllers that are actually only 3 controllers that have been divided up all over the place. I suppose I could view multiple lanes in the main window, but I don't think it's possible in the piano roll. Honestly, it sort of ruins the beauty of the articulation concept and could be easily fixed if EW made it possible to build custom key switch instruments.

 

Now that I've messed around with the articulation sets, I'd really like to be able to use them, but I'm not sold on this workaround yet. But you guys are much more experienced with this than I am. How do you deal with this?

 

script test.zip


I will look at your project later tonight. Why do you have one track with more than one MIDI channel being used? That should NOT matter, though. But in that scenario, if you try to use articulation 3 on MIDI channel 1, then the error message is to be expected.

 

You can change the behavior so that overlapping notes are passed through rather than muted (see the GUI options), but really you should identify the overlaps and correct what you are doing so you don't have overlaps.

 

Channelizer is designed to be very automatic. The GUI options in Scripter are extremely limited, so this is the preferable alternative. As notes come into Channelizer on various MIDI channels, it detects those as known source channels. In this single-track case you are using two source channels, 1 and 3, presuming you aren't also changing the channels with the Articulation Set.

 

Once Channelizer detects those two source channels, it will attempt to channelize based on articulation ID, and if it detects overlaps, then by default it will mute them. That is a good default, since it makes it painful to have an overlap by accident. If you try to use articulation 3 on channel 1, then that overlaps with the other detected source channel, 3.

 

When I get home I will take a look at your project and should know the exact answer for you.

 

The lowest-port GUI option is meant to pass through lower ports without channelizing them. So if you raise that value, you're defeating the purpose of Channelizer.

 

You can certainly use the Articulation Set to channelize your notes INSTEAD OF my script, but the Articulation Set will not forward your CC events to the destination channels, so that becomes a problem. That is the whole reason for using Channelizer instead of the Articulation Set.


OK, it's becoming painfully clear to me that I'm assuming a workflow that's basically wrong for this script. I'm sure I'm being way too simple-minded.

 

Here's what I thought was supposed to happen:

 

1. The articulation set in this case simply indicates to Logic which articulations are on which channels. There is no setup for the output because the script will take care of that.

 

2. The script knows which articulation is on which channel because of the articulation set file and automatically switches the channel based on the chosen articulation.

 

3. In addition, in the Piano Roll, the script changes the channel of the selected notes and controllers already recorded when a new articulation is selected.

 

That's all I thought it did. I thought it essentially worked the same way that a key switch based instrument would work but would use MIDI channels instead. When you ask why I have one track with more than one MIDI channel I'd have to answer - I thought that was the entire point of this script! Are you saying that I'm supposed to have all the different articulations on different tracks? That's what I'm trying to avoid.

 

Whenever I see a demo of articulation sets, the big advantage is that we can pick any note on a single track and easily change the articulation of that particular note or set of notes after we've already recorded. In the case of EW instruments, this unfortunately means changing MIDI channels also, unless we use one of the KS instruments that they already assembled. I did an articulation set for one of the KS instruments and it really works wonderfully. That's exactly the way I want to work with your script, but I'm getting the impression that this isn't your intention because of course there's going to be more than one MIDI channel speaking on a given track if I change the articulation for a series of notes. It actually works fine as long as I set the port to (Disabled), but I can see that this is somehow not the desired working method.

 

So, obviously my assumption about how this is all supposed to work is wrong. That's fine and explains the confusion. How should I set this up?


Articulation Sets do a number of different things (some of which you can still use in combination with the Channelizer script, which is what I recommend for PLAY).

 

  1. You can name your articulations, associating names with numbers, so that in the piano roll and other places you see a named articulation rather than trying to keep track of which ID means what.
     
     
  2. You can use INPUT keyswitches, which let you change the articulation on the fly as you're playing in the parts. Essentially, when you hit an input keyswitch, Logic Pro determines that you want to use, say, articulation 2, and then automatically assigns articulation ID 2 to all the MIDI events you play on your keyboard until told otherwise... and they will be recorded to the track with that articulation ID. The input keyswitches do not have to have any similarity to the OUTPUT section; they simply indicate which articulation ID should be assigned to the notes you are playing on your keyboard... and probably recording to a track region.
     
     
  3. The output section is where you can indicate keyswitches to be sent based on the articulation IDs found in the note events (this particular feature cannot be used in combination with my Channelizer script).
     
     
  4. The output section is also where notes can be re-channelized based on articulation ID (this feature likewise cannot be used in combination with the Channelizer script).
     
     
  5. Articulation Sets do not do anything with CC, PC, pitch bend, or aftertouch messages that don't have articulation IDs assigned to them, which normally they don't. So the big problem is that while Articulation Sets are useful, if you are using channelizing to accomplish your articulations, such as with PLAY, and you apply a CC curve to your source track, the CC events will only go to the same channel as your source track, and articulations listening on other channels won't see those CC messages. Thus Channelizer was born.
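A rough sketch of the forwarding idea (my reconstruction of the behavior described above, not the script's actual code): remember which destination channels currently have notes sustaining, and copy each incoming CC to all of them:

```javascript
// Assumed logic for CC forwarding: track how many notes are currently
// held on each destination channel, and forward each incoming CC to
// every channel that has at least one note sustaining.
const activeNotes = new Map(); // channel -> count of held notes

function noteOn(channel) {
  activeNotes.set(channel, (activeNotes.get(channel) || 0) + 1);
}

function noteOff(channel) {
  const n = (activeNotes.get(channel) || 0) - 1;
  if (n > 0) activeNotes.set(channel, n);
  else activeNotes.delete(channel);
}

// Returns the list of channels an incoming CC should be copied to.
function ccTargets() {
  return [...activeNotes.keys()].sort((a, b) => a - b);
}
```

This is exactly what an Articulation Set cannot do for you: a CC recorded on the source channel would otherwise never reach the articulation channels where notes are actually sounding.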

 

So, to use Articulation Sets together with Channelizer: set up your Articulation Set with names for each articulation, and set up INPUT keyswitches as desired. You should not set any channel values anywhere in the Articulation Set, and you should leave the OUTPUT section completely empty.

 

Then basically you have to know how many channels you're going to use for each source instrument track. Let's say you have an instrument on channel 1, port 1, and you have 16 articulations in PLAY using MIDI channels 1-16. Great. Create the Articulation Set with your 16 articulations by name, output section empty. The next source track should be on port 2, channel 1, no lower than that. You will have MIDI events on that first source track with articulation IDs 1-16. Channelizer will set the channel for those articulations automatically, so each goes to the right channel in PLAY. It will also forward CC, PC, pitch bend, and aftertouch events to any of the PLAY channels that have notes sustaining.
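One way to reason about the port/channel bookkeeping is to flatten each (port, channel) pair into a single absolute channel index, 16 channels per port. This is a hypothetical helper for thinking about spacing, not code from the script:

```javascript
// Hypothetical bookkeeping sketch: treat (port, channel) as one
// absolute index, 16 channels per port, so range spacing across ports
// can be reasoned about with plain arithmetic.
function absoluteChannel(port, channel) {
  return (port - 1) * 16 + channel; // port 1 ch 1 -> 1, port 2 ch 1 -> 17
}

// A source track on port 1, channel 1 with 16 articulations occupies
// absolute channels 1-16, so the next source track (port 2, channel 1,
// absolute 17) stays clear of it.
const firstTrackLast = absoluteChannel(1, 1) + 16 - 1;
const nextTrackFirst = absoluteChannel(2, 1);
```

Under this view, "the next source track should be on port 2, channel 1" is just the first slot past the 16 channels the first track consumes.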

 

I just got home, I will look at your zip file now.


I don't have Hollywood Gold, so I guess I can't load your PLAY sounds anyway... but it doesn't matter.

 

You have three problems...

 

  1. You have your MIDI events hard-coded to MIDI channels in your tracks. Don't do that.
     
    [Screenshot: event list showing events hard-coded to channels 1 and 3]
     
     
    Leave them all set to channel 1, and use the track inspector to determine what channel the events should actually be sent on. Right now you have the track set to ALL, with a mixture of channel 1 and channel 3 events in the track region. Don't do that. Leave them all on the same MIDI channel they were recorded on (probably channel 1) and use the track inspector to assign them all to whatever single channel you want. Don't worry about articulation channels here.
     
    [Screenshot: event list with all events on the same channel]
     
     
  2. You are also trying to assign channels in the Articulation Set. Don't do that.
     
    [Screenshot: Articulation Set output section with channels assigned]
     
     
    Just leave all the channel fields blank or "-".
     
    [Screenshot: Articulation Set with all channel fields left blank]
     
     
  3. While you had the Scripter window open, you accidentally added some characters to the end of the script, which might be preventing it from running at all. See the 11 and 2 at the end? You added those by accident; probably the Scripter window had focus when you hit those keys. Delete those characters and hit the "Run Script" button at the top. Close the Scripter window whenever possible; you only need it open to debug, which you were probably trying to do, but just clarifying.
     
     
    [Screenshot: stray "11" and "2" characters at the end of the script]

 

Alright, so after setting your source track to channel 1, you will see that the Channelizer script does the channelizing for you, based on articulation ID. And presuming you don't have any source-channel overlaps, it should all behave as expected.


In answer to your previous statements about how you thought articulation set was supposed to work:

 

 

1. The articulation set in this case simply indicates to Logic which articulations are on which channels. There is no setup for the output because the script will take care of that.

 

If you don't want to use Channelizer, then yes. If you want to use Channelizer, then you should specifically NOT use the Articulation Set to set any channels. Channelizer will do that work for you, based on articulation ID. Read my original post again.

 

 

2. The script knows which articulation is on which channel because of the articulation set file and automatically switches the channel based on the chosen articulation.

 

Nope, the script doesn't know anything about the Articulation Set. It only knows what channel the incoming note is on and whether it has an articulation ID. For this script to work properly, you should not be using the Articulation Set to change the channel; the script assumes it needs to set the MIDI channel and port for each event based on the articulation ID. You should also not use the OUTPUT section of the Articulation Set for keyswitches, because Logic Pro will strip the articulation ID from the notes before sending them through the script to the instrument, and the script depends on being able to see the articulation ID. So do not set the channel in the Articulation Set, and do not use the output section for anything.

 

3. In addition, in the Piano Roll, the script changes the channel of the selected notes and controllers already recorded when a new articulation is selected.

 

As noted in the original docs, the script will increment the channel of each note based on its articulation ID. If your original track or the Articulation Set is changing the channel before the note hits the script, then the script will increment the channel even further, resulting in something you probably don't want. If the incremented channels overlap with other detected source channels, you get the error message, as expected.
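To make the double-shift pitfall concrete, here is a tiny illustration; the `destinationChannel` formula is my assumption of the mapping described in this thread (destination = source channel + articulation ID - 1), not the script's actual code:

```javascript
// Assumed mapping, per the thread's description.
function destinationChannel(sourceChannel, articulationID) {
  return sourceChannel + articulationID - 1;
}

// A note recorded on channel 1 with articulation ID 3 lands on
// channel 3, as intended.
const correct = destinationChannel(1, 3);

// But if the Articulation Set has already moved that note to channel 3
// before Scripter sees it, the script shifts it again, to channel 5 --
// not what you wanted, and possibly colliding with another track.
const doubleShifted = destinationChannel(3, 3);
```

This is why the channel must be left alone everywhere upstream of the script: only one thing should be channelizing.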

 

That's all I thought it did. I thought it essentially worked the same way that a key switch based instrument would work but would use MIDI channels instead.

 

Yes, it can, but you need to make sure not to set the channel per articulation in your track or in the Articulation Set. Let the script do it. Just set the articulation ID of each note.

 

When you ask why I have one track with more than one MIDI channel I'd have to answer - I thought that was the entire point of this script! Are you saying that I'm supposed to have all the different articulations on different tracks? That's what I'm trying to avoid

 

No, the point of the script is to allow you to have one source track per instrument, regardless of how many articulations. But the script needs to set the channel of each note based on articulation ID.

 

Whenever I see a demo of articulation sets, the big advantage is that we can pick any note on a single track and easily change the articulation of that particular note or set of notes after we've already recorded.

 

Definitely. That's exactly why you should not have to worry about also setting the channel of each note there.

 

In the case of EW instruments, this unfortunately means changing MIDI channels also,

 

No. The script will set the channel for you automatically during playback.

 

It is explained pretty well in the original post of this thread. Try reading that again after reading all these comments and try what I suggested for fixing your project.

 

Just remember, the script is doing the channel changing. All you have to do is make sure the notes have the articulation ID you want and let the script change channels during playback. Then you have to make sure that your source channels do not overlap their ranges. For example, if you have two instruments on two tracks with 8 articulations each, the first one would be on channel 1 and the second one needs to be no lower than channel 9.


OK. I completely blew the whole thing when I assigned the MIDI channels on the articulation page of the set. I thought that was the only way Logic could know where to find each articulation, and I see now that I blew through your documentation a little too quickly and thought the ID numbers were channel numbers. Doh! Because of this initial mistake, the channel changes were just automatically being made whenever I changed articulations in the piano roll; I didn't actually ever change channels manually.

 

I think I understand now. Let me go back and try this again and follow instructions better. This sounds like it actually is going to do exactly what I hoped. Thanks a lot for helping me with this and sorry about the newbie error! I'll keep you posted...



Hi Dewdman - This is an amazing solution you've scripted. I'd love to incorporate it. I recently set up a whole template with articulation sets for every instrument in the strings, woodwinds, and brass; most have between 8 and 15 articulations loaded, each added with its MIDI channel incremented in PLAY. I no longer set up multis in LPX; I just load my saved instrument and everything's there, and I have one track with all notes for that instrument. I change articulations with the dropdown I've created via the articulation set. I don't assign or use keyswitches or outputs.

 

Works fine, but then I need 2-3 lanes of CCs for each articulation, which gets messy and seems unnecessary when only one note is playing at a time; a single source should just be sending a CC value to whatever note is playing. I should be able to write CC as a performance (edited afterward, of course, but the general arc should already be accurate) to give holistic control of this monophonic line, and then each articulation will respond in its own way to whatever value is present. Right? My reading of this thread suggests that's exactly what your script does. I've read and understood the instructions (I will un-channelize my sets, very easy), but the source channel/ports thing is a bit unclear. Do I need to increment the range with EACH track? Or does each track start fresh (= source channel 1)? The latter would be easier; in my case, I'd never have to worry about range. Each track in the Tracks area corresponds to only one instrument and that instrument's articulations, and I don't go above 16, at least for now. I've also seen that LPX only writes real-time CC to MIDI channel 1.

 

Also, this means I have to instantiate this in Scripter via the MIDI FX dropdown on each track, right? Suggesting again that range overlap wouldn't be something I'd need to worry about with my setup as described.



I've read and understood the instructions (I will un-channelize my sets, very easy), but the source channel/ports thing is a bit unclear. Do I need to increment the range with EACH track? Or does each track start fresh (= source channel 1)? The latter would be easier; in my case, I'd never have to worry about range. Each track in the Tracks area corresponds to only one instrument and that instrument's articulations, and I don't go above 16, at least for now. I've also seen that LPX only writes real-time CC to MIDI channel 1.

 

Also, this means I have to instantiate this in Scripter via the MIDI FX dropdown on each track, right? Suggesting again that range overlap wouldn't be something I'd need to worry about with my setup as described.

 

So I will assume for now that you are NOT using VePro. You're using PLAY, which means up to 16 MIDI channels per PLAY instance.

 

The next question is whether you ever try to cram more than one orchestral instrument into one PLAY instance. In other words, do you use channels 1-8 for 8 articulations of one orchestral instrument, and then channels 9-16 in that same PLAY instance for a completely separate instrument with 8 articulations? Or do you keep each instance of PLAY dedicated to one single orchestral instrument, with up to 16 articulations?

 

If the answer is the second (one orchestral instrument per PLAY instance), then it's pretty simple. Yes, every track in Logic Pro should be set to MIDI channel 1. Each track in Logic Pro will point to a single mixer channel with an instance of PLAY, and on that same mixer channel you should also put Scripter with Channelizer. Each Channelizer script will handle just one orchestral instrument, with up to 16 articulations, based on articulation IDs 1-16.

 

I think that is what you're trying to do and probably will be the solution you need.

 

You may be confused by some of the options, which are geared towards facilitating VePro and/or situations where you put more than one orchestral instrument into a single instance of PLAY, originating from two or more tracks in Logic Pro. In that case, you have to be a bit more careful about how you space out the MIDI channels of the source tracks. In the example above, with two orchestral instruments in one PLAY instance, the first one would use MIDI channel 1 and the second one would use MIDI channel 9. They would both use articulation IDs 1-8, and Channelizer does the rest. In this example you would have only a single instance of PLAY, with a single instance of Scripter handling the channelizing for both orchestral instruments into that PLAY instance.

 

I think without VePro, though, it's just easier to isolate each orchestral instrument in its own PLAY instance and keep it simple; then all tracks use MIDI channel 1 and articulation IDs 1-16. When you use VePro, you could literally have hundreds of tracks feeding a single VePro plugin through a single Channelizer script, and then you have to be more careful about how you set up the MIDI channels on your tracks, so that they are funneled through that script the right way and directed to the right instance of PLAY inside VePro, etc.

Link to comment
Share on other sites
