Dewdman42

Member
  • Posts

    4,118
  • Joined

  • Days Won

    2

Dewdman42 last won the day on November 3 2022

Dewdman42 had the most liked content!

About Dewdman42

  • Birthday May 10

Personal Information

  • Occupation
    Retired

Dewdman42's Achievements

  1. Channelizer simply looks at the articulationID of the note itself to offset the MIDI channel. Please read the Channelizer wiki page at GitLab.
  2. The main thing is to make sure there is NO actual channelizing happening in the Switches tab. I can't remember if it's even there or not. I vaguely recall that a LogicPro artset forces MIDI events to be recorded with both the articulationID and the channel assignment from the middle tab, which is a flaw IMHO. Also note that generally you don't want CC events to be recorded with an articulationID either! If you are recording both notes and CC1 events in one pass, then both the CC and note events will be encoded with articulationIDs. That becomes a problem if and when you move notes around or change the articulation of notes: you then have to find all the CC events lining up with that note and change their articulationID too. The workaround for that is, of course, a script set in pre-record mode that strips the articulationID off CC events. But in any case, if you just use Channelizer instead, it won't matter whether the CCs get encoded with an articulationID, because during playback Channelizer only looks at notes to detect the articulationID and channelizes those notes accordingly; any CCs are channelized along with whatever notes are currently sustaining. It will just work as desired. You only need the artset to assign a friendly name to each articulationID, and so that during record you can use keyswitches to change articulations on the fly while playing the part in.
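That pre-record strip of articulationIDs from CC events could be sketched roughly like this. This is a minimal sketch, not Dewdman42's actual script; it assumes Scripter's ControlChange class and the event.articulationID property, and splits the logic into a plain helper just to keep it visible:

```javascript
// Sketch: clear articulationID on CC events so only notes carry it into
// the recorded region. Assumes Scripter's ControlChange class and that
// events expose an articulationID property (0 = no articulation).

// Plain helper: clears the ID when the event is a CC.
function clearArticulationIfCC(event, isCC) {
    if (isCC) event.articulationID = 0;
    return event;
}

function HandleMIDI(event) {
    clearArticulationIfCC(event, event instanceof ControlChange);
    event.send();
}
```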
  3. There are different ways to look at it. In some ways it can be more complicated to try to use the powers of articulation sets and only massage the results minimally with Scripter, as opposed to doing all or most of the heavy lifting with Scripter. Articulation sets have some peculiar behavior in some cases that, honestly, I feel Apple got wrong. With Scripter you have complete control to make it do exactly what you want, but if you are trying to let the artset do some of it and only massage the results, it can sometimes get tricky. So I prefer to keep articulation sets as minimal as possible, sometimes doing absolutely nothing other than assigning articulationIDs to notes during record and having names associated with each articulationID. After that I prefer to use Scripter to handle all keyswitching, channelizing, etc. based on the assigned IDs. I have absolute power and control that way. I think the psychology you refer to is about trust and faith in Apple and Logic Pro vs trusting some "script" off the internet to work reliably. But Scripter is very reliable if scripted well. That is the psychology. I predict that when I have a JUCE version coded in C++ and not free, there will be more trust and confidence in it compared to Scripter. There is no rational reason for that, but psychology is not rational. I'm away from my Mac this week, so I can't look at the artset editor to check whether the input tab has something about MIDI channel.
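A minimal sketch of that "Scripter does all the channelizing based on the assigned IDs" approach might look like this. It assumes Scripter's Note base class (which covers NoteOn and NoteOff) and event.articulationID; the 1:1 ID-to-channel mapping is a hypothetical example, not Channelizer's actual logic:

```javascript
// Sketch: channelize notes based on their assigned articulationID.
// Hypothetical 1:1 mapping: articulationID 1 -> channel 1, ... 16 -> 16;
// anything else (0, undefined, out of range) falls back to channel 1.
function channelForArticulation(artID) {
    return (artID >= 1 && artID <= 16) ? artID : 1;
}

function HandleMIDI(event) {
    if (event instanceof Note) {   // NoteOn and NoteOff in Scripter
        event.channel = channelForArticulation(event.articulationID);
    }
    event.send();
}
```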
  4. Yea, I think the problem you are running into is that during record, the input section of the articulation set is channelizing your incoming notes and CC data based on the keyswitch you use while playing it in. You don't want that. You want all recorded notes and CCs to be on channel 1, but encoded with the appropriate articulationID while recording the track. It's been a while since I used Logic Pro, so I can't remember the exact articulation set details that would record that way, but I do know a workaround in case you want to use event chasing. Of course, using Channelizer instead also works, since you turn off all channelizing by the articulation set and let Channelizer do it, avoiding the problem. But anyway, the trick to using event chasing is to make sure that during record the articulation set will not set the channel of notes and CCs. Kind of annoying that it does. Channelizer had a few advantages, but most people tended to be put off by it for some reason, preferring to let the artset do the channelizing. I have not updated it in a while, and it could maybe use some TLC. I am working on a JUCE-based alternative, which won't be free. One workaround for the artset problem would be to use another script pre-record which rechannelizes everything back to channel 1 at that time, or at least the CCs. Anyway, glad to hear Channelizer is getting some use.
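That pre-record workaround (rechannelizing at least the CCs back to channel 1) could be sketched like this. A minimal sketch, assuming Scripter's ControlChange class; it is not a script from this thread:

```javascript
// Sketch: force CC events back to channel 1 before they hit the record
// path, so the artset's input channelizing doesn't get baked into the
// recorded CC data.
function rechannelize(event, isCC) {
    if (isCC) event.channel = 1;
    return event;
}

function HandleMIDI(event) {
    rechannelize(event, event instanceof ControlChange);
    event.send();
}
```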
  5. I haven't done much of this, so forgive me if I'm wrong, but yea, I believe unfortunately you have to do the learn thing to assign Scripter parameters to plugin parameters within the channel strip. I'm not sure if you can assign some number in advance that might be correct. In the environment there are fader numbers, and basically they line up with the plugins as long as you don't change the plugins. But I don't know if you can do that with Scripter; I think you have to use the learn feature at least once. Once you set it up on the track, though, it should be remembered when you close and open the project after that.
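For reference, Scripter's "target" parameter type is the mechanism behind that pick/learn step. Here is a minimal sketch using it; the parameter name "Destination" and the choice of CC1 as the source are just example assumptions:

```javascript
// Sketch: route CC1 to a learned/selected plug-in parameter via a
// "target" PluginParameter and a TargetEvent (both part of Scripter's API).
var PluginParameters = [{
    name: "Destination",   // example name; pick or learn the target in the GUI
    type: "target"
}];

// Plain helper: TargetEvent.value expects 0.0-1.0, CC values are 0-127.
function ccToTargetValue(ccValue) {
    return ccValue / 127;
}

function HandleMIDI(event) {
    if (event instanceof ControlChange && event.number == 1) {
        var t = new TargetEvent();
        t.target = "Destination";
        t.value = ccToTargetValue(event.value);
        t.send();
    } else {
        event.send();
    }
}
```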
  6. function HandleMIDI(event) {
         let chan = EXT.GetParameter(0);
         if (event.channel == chan || chan == 17) {
             event.send();
         }
     }

     var PluginParameters = [];
     PluginParameters.push({
         type: "menu",
         name: "Channel",
         valueStrings: ["OFF","1","2","3","4","5","6","7","8",
                        "9","10","11","12","13","14","15","16","OMNI"],
         defaultValue: 1,
         disableAutomation: true,
         hidden: false
     });

     var EXT = {
         data: [],
         SetParameter: function (id, val) {
             if (typeof id != "string") id = PluginParameters[id].name;
             this.data[id] = val;
         },
         GetParameter: function (id) {
             if (typeof id != "string") id = PluginParameters[id].name;
             if (this.data[id] == undefined) {
                 this.data[id] = GetParameter(id);
             }
             return this.data[id];
         }
     };

     function ParameterChanged(id, val) {
         EXT.SetParameter(id, val);
     }

     Above is a script; put it on each instrument channel and use the Script GUI to set which MIDI channel it should listen to. Then set up a summing track stack and set all of the tracks to MIDI channel ALL. Make sure to collapse the stack, and possibly hide the sub-tracks so they won't be accidentally selected; then you should be able to open the project and hear it work. No environment. I still recommend MainStage though.
  7. In which case, use a Scripter script to isolate the MIDI channels, and just set all the tracks to ALL. I can share a script momentarily that I have lying around that can do this. The point is not having to select a track or click on the R button; I think he just wants to load the project and know that it will be playing back sound from MIDI immediately without having to do anything. The point of limiting it to a single collapsed track stack is that nothing has to be done to enable it to work: load the project and that single track stack will be selected by default and record-enabled. But honestly my recommendation is still to use MainStage. This is what MainStage is designed for, and it's not an expensive program.
  8. I think the suggestion of using a summing track stack would be the right way to go. You just have to make sure that you keep the folder collapsed in the track header list, so that only one row is showing; if that is the only row in the main arrange window, it will always be selected and always sending MIDI through to the underlying sub-tracks. Set each of them with different MIDI in/out assignments, and set the MIDI channel of the summing stack to ALL. It works. But keep that summing stack collapsed to ensure it's the only selectable track, and you should be able to load the project and it will always be ready without having to select the track or enable record. If you really want to be safe, you could actually HIDE the underlying sub-tracks of that summing stack. What you are really wanting is a "rack" of instruments that is not funneled through the sequencer. If you are careful about setting up the track stack, that should work; but if you really want to make sure that you don't have to worry about the track header being selected, you want more of a direct wire of MIDI input to the "rack", so to speak, and I can't really think of any way to do that which doesn't involve going into the environment and cabling around to bypass the sequencer object. MainStage should really be on your short list for this, because that is exactly what MainStage is: it doesn't have a sequencer, and all MIDI input goes directly to the mixer, where instruments are hosted in instrument channels. MainStage has been coded with emphasis on live performance, and allegedly it may have lower latency and perhaps better use of CPU for live use compared to Logic Pro, which is optimized for sequencer playback. Also, in MainStage you can set up sets of patches with a set list, and each patch can be a completely unique mixer setup with instruments and FX however you want. So if you are keen on running your sequencer from another computer, MainStage will honestly be a much better tool for this.
It has all the Logic Pro internal instruments, if you care about that. And it can respond to program change for switching which patch is currently enabled, so your other sequencer machine could fire off program change messages to select which mixer setup you want. Honestly, that is just going to be a way better solution for you. No environment, and you don't have to worry about making sure any track header is selected or record-enabled.
  9. As was already suggested MainStage would work for you.
  10. Since you're not using the Logic sequencer, you can cable the input ports directly to the channel strips in the environment, and then they will receive all the MIDI regardless of any track channel settings. However, since you are using different MIDI channels, you will need to filter out the undesired MIDI channels from each channel strip. That could be done in the environment as well, or with a simple Scripter script that determines which MIDI channel to allow through in each case.
  11. Sorry, dumb me... here is the correction:

      function HandleMIDI(event) {
          if (event instanceof NoteOn) {
              // random value between 0-20 as int
              var offset = Math.floor(Math.random() * 20);
              event.velocity = event.velocity + (offset - 10);
              // should also make sure it's a valid velocity
              if (event.velocity > 127) event.velocity = 127;
              if (event.velocity < 1) event.velocity = 1;
          }
          event.send();
      }
  12. That assumes a known input velocity. If you want it to be relative to whatever is coming in, then something like this:

      function HandleMIDI(event) {
          if (event instanceof NoteOn) {
              // random value between 0-20 as int
              var offset = Math.floor(Math.random() * 20);
              event.velocity = event.velocity + (offset - 10);
              // should also make sure its valid velocity
              if (event.velocity > 127) event.velocity = 127;
              if (event.velocity < 1) event.velocity = 1;
              event.send();
          }
      }