
Please help me control "MIDI In" on/off in automation with a Logic script!!


Dmitry S.

Hello, everyone! I have a song where, in certain parts, two instruments alternate (4 bars with one instrument, the next 4 bars with the other, and repeat). The notes stay the same; only the sound changes. Does anyone here know Logic's script language well enough to advise me how to turn "MIDI In" on and off for a specific track with a script? At the moment I use automation to switch the instrument plugins themselves on and off while I perform live. The one problem, in a few parts of the song, is that every time the active instrument is turned off, the natural sustain of its last note is cut off as well. If I could switch "MIDI In" on and off per track instead of turning the plugins themselves off and on, that would do the trick and help me a lot!! Thank you!

P.S. It would be even better if the script could be assigned to a MIDI controller, e.g. a MIDI pedal, but that's not essential, since this is an alternative to the automation. I hope there is a solution!

 

Automation with plugins on:off.png


Are you on Logic 10.7.x already? In that case your problem can easily and elegantly be solved in the MIDI environment. If you're on Logic 10.6.3 or earlier, I have to think a bit, but the MIDI environment is probably the best way in that case as well.


MacBook Pro 16'' 2019, 2,4 GHz 8-Core Intel Core i9 64 GB 2667 MHz DDR4 / macOS Monterey 12.4 / Logic 10.7.4


Made a quick project in 10.7.4; see the attached GIF for it in action. The button I'm switching with the mouse could easily be controlled by an external footswitch.


MacBook Pro 16'' 2019, 2,4 GHz 8-Core Intel Core i9 64 GB 2667 MHz DDR4 / macOS Monterey 12.4 / Logic 10.7.4


Scripter lesson time....

What you can do with Scripter is control whether MIDI is echoed through or not.

The default script for Scripter looks like this:

function HandleMIDI(event) {
    event.send(); // echo the incoming event straight through
}

That default script simply echoes incoming MIDI input to the output. If you were to comment out the send line, or remove the HandleMIDI function entirely, then Scripter would block all MIDI by not echoing it through.
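
For illustration, the opposite extreme - blocking everything - would be a HandleMIDI that never calls send():

function HandleMIDI(event) {
    // no event.send() here, so every incoming event is swallowed
}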

So... you just need an automatable parameter that determines whether to echo the MIDI through.

So, in the simplest example, something like this provides an automatable parameter that turns MIDI on and off:

var PluginParameters = [];
PluginParameters.push({
    name: "MIDI enable",
    type: "checkbox",
    defaultValue: 1,
    disableAutomation: false,
    hidden: false
});

function HandleMIDI(event) {
    // only echo events through while the checkbox is on
    if (GetParameter("MIDI enable") == 1) {
        event.send();
    }
}

There is a little more to think about, though. You have to worry about hung notes if you turn the MIDI off while a note is sustaining; in that case the NoteOff would not make it through. A simple way around that is something like this:

var PluginParameters = [];
PluginParameters.push({
    name: "MIDI enable",
    type: "checkbox",
    defaultValue: 1,
    disableAutomation: false,
    hidden: false
});

function HandleMIDI(event) {
    // only send NoteOn if ON; a NoteOn with velocity 0 acts as a
    // note-off on many devices, so always let those through
    if (event instanceof NoteOn) {
        if (GetParameter("MIDI enable") == 1 || event.velocity == 0) {
            event.send();
        }
    }
    // everything else (including NoteOffs) sends, so sustaining notes
    // can end naturally after the switch
    else {
        event.send();
    }
}

 

One other concern is that the GetParameter function is somewhat hard on the CPU, so I would recommend handling it more like this, caching the value in ParameterChanged:

var PluginParameters = [];
PluginParameters.push({
    name: "MIDI enable",
    type: "checkbox",
    defaultValue: 1,
    disableAutomation: false,
    hidden: false
});

var ON = 1;

// called whenever a parameter changes; caching the value here means
// HandleMIDI doesn't have to call GetParameter on every event
function ParameterChanged(idx, val) {
    ON = val;
}

function HandleMIDI(event) {
    // only send NoteOn if ON (velocity 0 counts as a note-off, so it passes)
    if (event instanceof NoteOn) {
        if (ON == 1 || event.velocity == 0) {
            event.send();
        }
    }
    // everything else sends
    else {
        event.send();
    }
}

 


OSX 12.x (Monterey) on OpenCore - Logic Pro 10.7.4, VePro7, Mainstage3 - 5,1 MacPro 3.46ghz x 12 96gb ram


Posted (edited)

Just to scramble everyone's brain... here is a tricky way to do it, for those wishing to explore, simply for the sake of learning...

(But the above script works fine too)

 

var PluginParameters = [];
PluginParameters.push({
    name: "MIDI enable",
    type: "checkbox",
    defaultValue: 1,
    disableAutomation: false,
    hidden: false
});

var ON = 1;

function ParameterChanged(idx, val) {
    ON = val;
}

// default behavior for all event types: pass straight through
Event.prototype.midiThru = function() {
    this.send();
};
// override for NoteOns: gate them on the cached parameter value
NoteOn.prototype.midiThru = function() {
    if (ON == 1 || this.velocity == 0) {
        this.send();
    }
};

function HandleMIDI(event) {
    // each event picks the right midiThru via prototype dispatch
    event.midiThru();
}

 

Edited by Dewdman42

OSX 12.x (Monterey) on OpenCore - Logic Pro 10.7.4, VePro7, Mainstage3 - 5,1 MacPro 3.46ghz x 12 96gb ram


LOL, I guess I should dive a bit deeper into Scripter's possibilities. Of course the Scripter solution is way better than an Environment-only solution, as the latter struggles with Note Offs and is therefore prone to causing hanging notes! Thanks @Dewdman42 for the examples indeed (where can I find the best documentation on the functions you used?). May I add that, if you want to switch those MIDI enable buttons live (not via played-back automation), it might still be necessary to add a few little environment objects, like in the attached GIF? Or can this be implemented as a Scripter-only solution as well?

toggle scripts.gif


MacBook Pro 16'' 2019, 2,4 GHz 8-Core Intel Core i9 64 GB 2667 MHz DDR4 / macOS Monterey 12.4 / Logic 10.7.4


29 minutes ago, David Nahmani said:

Definitely. Same for me. I've played around with the basics, but I need to set some time aside to really dive into it; it's insanely powerful and very elegant to use once you know what you're doing.

It does have its limits (currently?), though - e.g., no processing of MIDI data before the Sequencer Input (unless you use the External Instrument and the IAC driver, I guess) and no distribution of events over more than one channel strip.


MacBook Pro 16'' 2019, 2,4 GHz 8-Core Intel Core i9 64 GB 2667 MHz DDR4 / macOS Monterey 12.4 / Logic 10.7.4


7 minutes ago, polanoid said:

It does have its limits (currently?), though - e.g., no processing of MIDI data before the Sequencer Input (unless you use the External Instrument and the IAC driver, I guess) and no distribution of events over more than one channel strip.

Totally. A MIDI Environment object that can host MIDI FX plug-ins would be amazing. Meanwhile it can be done with a workaround using an IAC bus: 

image.png


My new Logic Pro Book is out!


3 minutes ago, David Nahmani said:

Meanwhile it can be done with a workaround using an IAC bus: 

That's what I wrote, yes 😉 but thanks for the explanatory screenshot!


MacBook Pro 16'' 2019, 2,4 GHz 8-Core Intel Core i9 64 GB 2667 MHz DDR4 / macOS Monterey 12.4 / Logic 10.7.4


Posted (edited)

Additionally, it is possible to set up a Smart Control to control the parameter, which can then be tied to MIDI input if you want to use MIDI to switch it on and off.
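
For example, here's a minimal Scripter-only sketch of that idea, combining the gate script from earlier in the thread with a CC listener (the CC number is just an assumption - match it to whatever your pedal sends; note that flipping the internal flag this way won't update the checkbox in the plugin header):

var PluginParameters = [];
PluginParameters.push({
    name: "MIDI enable",
    type: "checkbox",
    defaultValue: 1
});

var TOGGLE_CC = 80; // assumption: your pedal/controller sends CC 80
var ON = 1;

function ParameterChanged(idx, val) {
    ON = val;
}

function HandleMIDI(event) {
    // intercept the pedal CC and use it to flip the gate
    if (event instanceof ControlChange && event.number == TOGGLE_CC) {
        ON = (event.value > 63) ? 1 : 0;
        return; // consume the CC so the instrument never sees it
    }
    if (event instanceof NoteOn) {
        if (ON == 1 || event.velocity == 0) {
            event.send();
        }
    } else {
        event.send();
    }
}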

But yes, the environment still has its uses; it would be an excellent way to toggle which channel is currently ON, as polanoid demonstrated above.

 

Edited by Dewdman42

OSX 12.x (Monterey) on OpenCore - Logic Pro 10.7.4, VePro7, Mainstage3 - 5,1 MacPro 3.46ghz x 12 96gb ram


Posted (edited)

Having an environment object that processes its input via a script would be the bomb 🙂

Edited by polanoid

MacBook Pro 16'' 2019, 2,4 GHz 8-Core Intel Core i9 64 GB 2667 MHz DDR4 / macOS Monterey 12.4 / Logic 10.7.4


I just noticed we never heard back from the OP, but nevertheless this has become quite an interesting thread 🙂

MacBook Pro 16'' 2019, 2,4 GHz 8-Core Intel Core i9 64 GB 2667 MHz DDR4 / macOS Monterey 12.4 / Logic 10.7.4


Posted (edited)

So, as my first Scripter exercise, a project

- in which you can switch between 16 Instrument channels

- which handles event distribution via MIDI channel (so this will even work when recording, either on the main track of the summing stack if you record-enable that one, or on the individual tracks if you record-enable the 16 instrument tracks instead)

- and which will also handle note-offs correctly, even if they arrive after the instrument channel has been switched

This is real fun! :) 

(Logic 10.7.x only, BTW)

 

16 Inst Switcher 1.2.logicx.zip
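
For readers who don't want to open the project: the core of the note-off trick might look roughly like this in Scripter (a sketch under assumptions, not the actual project code - it remembers which channel each note-on went to, so the matching note-off follows it there even after switching):

var PluginParameters = [];
PluginParameters.push({
    name: "Instrument",
    type: "lin",
    minValue: 1,
    maxValue: 16,
    numberOfSteps: 15,
    defaultValue: 1
});

var current = 1;
var noteChannel = {}; // pitch -> channel its NoteOn was routed to

function ParameterChanged(idx, val) {
    current = val;
}

function HandleMIDI(event) {
    var isNoteOff = (event instanceof NoteOff) ||
                    (event instanceof NoteOn && event.velocity == 0);
    if (isNoteOff) {
        // route the note-off to wherever its note-on went,
        // even if the instrument has been switched since
        event.channel = noteChannel[event.pitch] || current;
    } else {
        event.channel = current;
        if (event instanceof NoteOn) {
            noteChannel[event.pitch] = current;
        }
    }
    event.send();
}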

Edited by polanoid

MacBook Pro 16'' 2019, 2,4 GHz 8-Core Intel Core i9 64 GB 2667 MHz DDR4 / macOS Monterey 12.4 / Logic 10.7.4


It's very, very unlikely we'll see any kind of plugin or script language in the environment, because the environment does not make use of the large buffer used by plugins in the mixer. It processes events in near real time. It is almost certainly coded in assembly language, and part of the reason it is developing bugs, in my opinion, is that this code is extremely optimized in an old-school kind of way, with limitations on what can be done there. Scripter and JavaScript are WAY too slow to operate in that part of the MIDI signal flow. Just my opinion, but it's very unlikely that they will do that. If anything, I think they will do away with the environment and make it more of a black box, like MainStage.

Logic's plugin-hosting mixer uses a buffer large enough that plugins can do all kinds of things, even in JavaScript, and still be sample accurate.


OSX 12.x (Monterey) on OpenCore - Logic Pro 10.7.4, VePro7, Mainstage3 - 5,1 MacPro 3.46ghz x 12 96gb ram


Posted (edited)

I don't see why the MIDI environment should be coded in assembly language. Keep in mind it is dealing with MIDI/event data only, no audio data at all, so the bandwidth needed is lower by orders of magnitude. C++ (which I assume the "old school" part of Logic is coded in, given the time it was created) is definitely "fast" enough.

Also, try my example project two posts above: it basically already integrates the Scripter into the environment, and it can definitely be played in real time.

Edited by polanoid

MacBook Pro 16'' 2019, 2,4 GHz 8-Core Intel Core i9 64 GB 2667 MHz DDR4 / macOS Monterey 12.4 / Logic 10.7.4


Posted (edited)

Yep, pretty sure it's C++. I think Gerhard and co. were quite happy to move away from the low-level assembly that Notator used for MIDI event handling as computers and tools got more powerful. This code originated back in the CodeWarrior days on the Mac, but yes, it operates on MIDI data flowing through it just like "real" MIDI data flowing down a cable, and doesn't use MIDI timestamping or any of the newer technologies brought to macOS over the years. So it's definitely from an earlier generation of MIDI-handling code (also, macOS buffers incoming MIDI events, of course).

Even the concepts, which were once a natural fit (it's just like connecting up MIDI devices), are only really meaningful to those of us who came through the MIDI hardware days. Modern users' idea of MIDI is mostly connecting a USB cable to their computer, so connecting up processing units along MIDI chains is not such a useful idiom...

That's not to say some of those improvements *couldn't* be done, of course, but it's low priority, and it probably requires re-architecting some important foundations of Logic's core, which is not something to be taken lightly in development terms! That's one of the reasons the environment has been left alone for so long - I don't think any of the devs really want to touch it...! ;)

Edited by des99

mu:zines | music magazine archive | difficultAudio | Legacy Logic Project Conversion | Logic 10.7.4, MBP 16" M1pro


The Environment has never been a go-to for the vast majority of users, and likely will not be in the foreseeable future…

In the meantime, for advanced users, there is more to explore in what is already there. Debugging what is already there should be the priority, rather than adding new features.

However, I agree that it would definitely be the bomb, as polanoid stated!

My 2 cents…


LogicPro 10.7.4, MainStage 3.6,
MBPro 17", Core2Duo, 8G, OSX 10.12.6, MacPro, Xeon 6Cores, 64GB, OSX 10.16.1,
ULN8, MOTU MIDI TP-AV, C4, MCU Pro, KorgNano, Novation SLMkII, Several vintage gear
AAS, NI, Celemony, Spectrasonics, Korg, Arturia, etc..., PC, iPadPro 5th gen 12.9”(Duet D., V-Control & LogicRemote), AtariST(Notator SL),


Posted (edited)

It's just my theory, of course; none of us have access to the code to find out. You may also be perfectly right that extremely tight C/C++ code long ago replaced the performance-critical areas that were originally assembly.

But the point I was trying to make is not fundamentally about that. It is that the environment operates without the large buffer that plugins have. This is obvious: a buffer there would add more latency. The environment fundamentally must execute extremely tight and fast operations, which need to be heavily optimized, limited in what they can do, and tightly controlled by Logic Pro's internal code. The environment does not provide, for example, anything like a for loop (right?). It is basically just a way to create a series of filters and transforms, through switches, etc. These can get complex, but internally it is possible to optimize them so that each MIDI event only needs a couple of fast math operations en route to being stored in the sequencer.

Plugins, on the other hand, use the host mixer, which provides a lot of time thanks to buffer processing. Scripter, JavaScript, and many other plugins need this extra buffer, which equates to latency, to do the work they do, and the DAW has no control over the efficiency of plugins; they can do all manner of complicated things and even report their own additional latency when they need to look ahead.

If the environment were to inherit a similar buffer in front of the sequencer, that would mean adding more latency.

Other DAWs are the same way. Cubase, for example, has a few MIDI filtering options, most of which are simple filters, not unlike the environment but a bit less flexible. Meanwhile, in Cubase you can chain tracks together in order to process through MIDI plugins, but you will notice that when you do, it's almost the same as using IAC in Logic Pro: every track chained together carries over into the next buffer.

Anyway, the main point is: as much as I would love to see it, I do not think we will ever see AU MIDI FX plugins in the environment or in front of the sequencer. And I don't think we will ever see Scripter there either; the code is not fast enough.

But no other DAWs are doing that either.

Don't get me wrong, I would love it too, but it's extremely unlikely to happen.

Also, sending over IAC is not the end of the world, especially now that tracks can specify the input port and channel to use. As in other DAWs, this will perform about the same as chaining Cubase tracks together, for example.

Edited by Dewdman42

OSX 12.x (Monterey) on OpenCore - Logic Pro 10.7.4, VePro7, Mainstage3 - 5,1 MacPro 3.46ghz x 12 96gb ram


I'm not sure your theories are correct with regard to MIDI buffer handling of incoming events. I've done a fair bit of low-level MIDI handling code on macOS, at the lowest C++ level the Core MIDI system APIs let you get at, and accessing MIDI events all works the same way on macOS. They are system functions, and there are a few ways of hooking them up - you either check the incoming buffer frequently, or set up a callback that executes every time an event comes in (which is what I generally do), which you then handle as requirements dictate.

Yes, the environment works like regular MIDI does - it sees an event, it does whatever processing on that event it needs to, and passes it on, and there are no real-time priorities for it - just like MIDI, the event gets passed on as fast as it can.

The differences are more pronounced though on *output*. If you output a MIDI event from a recorded region through the environment and out via MIDI that way, it's handled the same way - the event is sent at some known time, and the environment code sends it on as fast as it can.

However, if you output MIDI events *bypassing* the environment - for example, using the External Instrument plugin, those events are not passed through that environment code, but are output via a different method (still Core MIDI system calls of course), and these events are timestamped with the macOS system timestamping method, which means device drivers have a more precise way of scheduling and queuing those events - and if you benchmark the differences (which we did over on GS), the timing and jitter performance is *much* better (assuming you are using MIDI drivers that correctly support MIDI timestamps, of course.)

This is the main difference between the two methods - the environment is really simulating a regular old MIDI cable, whereas other non-environment methods of transmitting or processing data can be more powerful (eg, MIDI timestamping support). But there isn't some magic extra buffer processing that Scripter can use, when just passing incoming MIDI bytes through the system, which makes MIDI input handling fundamentally different.

Playing back MIDI regions of course is different to passing live incoming MIDI data through, as Logic already knows in advance what events it needs to output and can make allowances accordingly - but either way, Logic sends a MIDI byte to Core MIDI which outputs it, environment, or not...

mu:zines | music magazine archive | difficultAudio | Legacy Logic Project Conversion | Logic 10.7.4, MBP 16" M1pro


 

38 minutes ago, des99 said:

I've done a fair bit of low-level MIDI handling code on macOS, at the lowest C++ level the Core MIDI system APIs let you get at, and accessing MIDI events all works the same way on macOS. They are system functions, and there are a few ways of hooking them up - you either check the incoming buffer frequently, or set up a callback that executes every time an event comes in (which is what I generally do), which you then handle as requirements dictate.

 

What you are describing is how MIDI events are accessed from Core MIDI queues, which has nothing at all to do with plugin processing such as Scripter, which requires a buffer. It also most likely has nothing to do with the way the environment works. If the environment used entirely CoreMidi functionality and nothing else, it would not have all these bugs, heheh. Core MIDI is how MIDI events are provided to an app, and how the app can send MIDI events on to other apps, through IAC, or back out to a MIDI device.

Core Audio does provide AU APIs which can be used to build plugins that also process queues of MIDI events, in a buffer... but yes, plugins are beholden to a buffer, called a process block.

 

38 minutes ago, des99 said:

Yes, the environment works like regular MIDI does - it sees an event, it does whatever processing on that event it needs to, and passes it on, and there are no real-time priorities for it - just like MIDI, the event gets passed on as fast as it can.

 

Correct, but plugins don't work that way.  That is the point.  Scripter can't work that way either.

Also, the environment is doing a good deal more than what CoreMIDI is providing through simple queues.  The whole point of the environment is to have an extremely flexible way for users to configure how midi events will be processed.  Because it does not actually have a large buffer that would create latency, there is a limit to what the environment can be used to do.

 

38 minutes ago, des99 said:

The differences are more pronounced though on *output*. If you output a MIDI event from a recorded region through the environment and out via MIDI that way, it's handled the same way - the event is sent at some known time, and the environment code sends it on as fast as it can.

 

Correct, but again, that is not how plugins work. As you say, it's sent out as fast as it can be. Plugins do not work that way at all. Plugins require a *LOT* more processing than the environment or Core MIDI does... and that is why they will most likely never exist inside the environment.

 

38 minutes ago, des99 said:

However, if you output MIDI events *bypassing* the environment - for example, using the External Instrument plugin, those events are not passed through that environment code, but are output via a different method (still Core MIDI system calls of course), and these events are timestamped with the macOS system timestamping method, which means device drivers have a more precise way of scheduling and queuing those events - and if you benchmark the differences (which we did over on GS), the timing and jitter performance is *much* better (assuming you are using MIDI drivers that correctly support MIDI timestamps, of course.)

 

Again, now you are talking about basic MIDI queues. That is what IAC is too, for example. BUT THAT IS NOT HOW PLUGINS WORK.

 

38 minutes ago, des99 said:

This is the main difference between the two methods - the environment is really simulating a regular old MIDI cable, whereas other non-environment methods of transmitting or processing data can be more powerful (eg, MIDI timestamping support).

 

The environment could just as easily support timestamping if they wanted it to. And it might already pass timestamps through; we just don't have access to see them or manipulate them in the environment. The environment can't see or manipulate articulationID either, for example.

 

38 minutes ago, des99 said:

But there isn't some magic extra buffer processing that Scripter can use, when just passing incoming MIDI bytes through the system, which makes MIDI input handling fundamentally different.

 

This is where you are wrong. Scripter is a plugin, and all AU plugins operate on a buffer. This is fundamental to understanding how to program Scripter properly, by the way. You can write simple scripts without understanding this, but if you get into any time-based scripts, you will need to understand buffer processing in order to make them work correctly and sample-accurately. Yes, Scripter is entirely sample accurate.

It is for this reason that Scripter will probably not ever be in the environment.

 

38 minutes ago, des99 said:

Playing back MIDI regions of course is different to passing live incoming MIDI data through, as Logic already knows in advance what events it needs to output and can make allowances accordingly - but either way, Logic sends a MIDI byte to Core MIDI which outputs it, environment, or not...

 

What we don't know about playing back regions is whether Logic Pro actually creates even BIGGER buffers internally to process things ahead of time as much as possible... even more so than it does when processing live MIDI and audio input. It is entirely possible that it does!

Plugins do not process anything in real time. They process one buffer load at a time, called a process block. Each buffer represents a length of time, and basically each plugin can scribble all over that buffer, for that length of time, any way it sees fit. All the plugins on a track get an opportunity to do that, one after the other, and hopefully all of them have completed the task before the buffer's length of time has expired (if not, you get dropped notes or audio dropouts). Finally, the buffer gets flushed. All plugins work this way, including Scripter.

Generally, if you have a software instrument on a track, the AU MIDI FX plugins take their turns processing the MIDI for that process block (one buffer length of time). Whatever is left in the MIDI buffer at the end of that is then provided to the instrument plugin, which, during that same buffer of time, will attempt to render audio into the buffer, scribbling all over the audio buffer however it sees fit.

Then that buffer is passed through each of the AU FX plugins one at a time, each one having an opportunity to look at the buffer so far and overwrite it however it sees fit, the buffer representing a length of time.

Finally, the audio is flushed through Logic Pro's mixing engine and eventually to the sound card, when it's time for that buffer to actually be played through your speakers. If the plugins aren't finished with all the work they needed to do during that period of time (the process block), you end up with a partially filled audio buffer... a.k.a. dropouts.

All plugins work this way, including Scripter.  Scripter does not process midi in real time.  
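
To make that concrete, here is a minimal sketch using Scripter's documented timing API: ProcessMIDI runs once per process block, and sendAtBeat() timestamps events within that block, which is what makes the output sample accurate:

var NeedsTimingInfo = true; // must be set for GetTimingInfo() to work

function ProcessMIDI() {
    var info = GetTimingInfo(); // describes the current process block
    if (!info.playing) return;

    // fire a C3 on every beat that falls inside this block
    var beat = Math.ceil(info.blockStartBeat);
    while (beat < info.blockEndBeat) {
        var on = new NoteOn();
        on.pitch = 60;
        on.velocity = 100;
        on.sendAtBeat(beat); // scheduled at an exact beat within the buffer
        var off = new NoteOff(on);
        off.sendAtBeat(beat + 0.5);
        beat += 1;
    }
}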

 

 

OSX 12.x (Monterey) on OpenCore - Logic Pro 10.7.4, VePro7, Mainstage3 - 5,1 MacPro 3.46ghz x 12 96gb ram


7 hours ago, Dewdman42 said:

If the environment used entirely CoreMidi functionality and nothing else, it would not have all these bugs.

Can you elaborate on "these bugs" a bit more? Is there a compiled list of environment bugs somewhere? I don't really experience the Environment as a buggier area than other parts of Logic

MacBook Pro 16'' 2019, 2,4 GHz 8-Core Intel Core i9 64 GB 2667 MHz DDR4 / macOS Monterey 12.4 / Logic 10.7.4


Posted (edited)
11 hours ago, Dewdman42 said:

Also, the environment is doing a good deal more than what CoreMIDI is providing through simple queues.  The whole point of the environment is to have an extremely flexible way for users to configure how midi events will be processed.

Yes, of course. The environment has tools that let you process MIDI events - we were simply talking about how MIDI events come in, which is what Core MIDI handles; the rest of the processing and manipulation is up to the app. I thought that was clear, and I wasn't suggesting that Core MIDI handles environment processing, if that's what you thought. Given that I said I'd done a certain amount of Core MIDI development, I would have thought you'd give me the benefit of the doubt that I have an idea of what the purpose and scope of the framework is.

Anyway - events also come into Logic exactly the same way, regardless of whether your MIDI keyboard is controlling a plugin or being routed through the environment.

I understand what you mean about the plugin process block. I'm only really familiar with the audio processing side of this; I don't currently have any technical insight into how MIDI events are forwarded to plugins - whether they are interspersed in those audio buffers or kept separate, etc. - as it's not something I've dealt with development-wise. Presumably, from what you say, the incoming MIDI events are wrapped up and handed off to the plugins after that happens. What I thought you were saying is that incoming MIDI events are somehow handled entirely differently, which isn't the case - but how the events are routed, processed, and buffered obviously differs between the older environment stuff and the newer plugin/MIDI FX stuff (which is your main point, and which I understand). As far as we know, *all* incoming events (bar the ones intercepted by controller assignments) actually go through the environment, because they hit the input object and are passed to the sequencer - so that's presumably where the process buffering starts to come into play.

And no, the environment doesn't pass timestamps through, from my testing; this is easily verifiable by inspecting the MIDI events (and benchmarking the timing performance). Basically, the environment just passes regular MIDI events on a single 16-channel cable; it's very similar to a real MIDI cable and was designed that way in terms of event processing. Yes, they extended the messages a bit to cope with things like meta events, but for all intents and purposes it's still just forwarding un-timestamped events around, just like it did in the OS 9/Atari days. I doubt they did the work to upgrade the environment to handle macOS timestamping just to never use it. Having said that, when they developed AMT in the OS 9 days, that was a form of timestamping, and back then *all* MIDI events went through the environment - though that technology was obviously deprecated and never made it to OS X.

The only evidence I see of timestamped MIDI events is on output from recorded MIDI regions, in the case where those events go out the "plugin window" via, e.g., the External Instrument route. I can't remember offhand without going to look, but I think if you play in real time through the External Instrument object, the timestamps are present but set to null values.

11 hours ago, Dewdman42 said:

Scripter does not process midi in real time.  

I understand, and wasn't arguing that it did... Someone else said it would be nice to have a version of this as an environment object, and it would - but it would have limited functionality if implemented like that, and in that case it would have to work like the rest of the environment does, on a real-time stream basis. It's a moot point anyway, because it doesn't exist, whether technically possible or not.

BTW, you don't need to shout at me; we're all adults here, and I hope we can have technical discussions, exchange views, and clarify misunderstandings in a respectful way. It's not the first time you've expressed your views in quite an aggressive manner. Last time I just didn't bother to continue the discussion, as I have no desire to get into arguments here. I'm happy to learn, and I'm happy to correct my understanding of things - as I hope we all are - but please keep it respectful. Thank you.

Edited by des99

mu:zines | music magazine archive | difficultAudio | Legacy Logic Project Conversion | Logic 10.7.4, MBP 16" M1pro


Posted (edited)

I have been respectful.  You decided to debate my point of view and I am simply pointing out where your points are wrong. Capital letters are for emphasis, not shouting, for example.

You are kind of talking in circles now, bringing up irrelevant information and directing your comments at me in a personalized way, so I will check out of the thread now. I will just state again: Scripter and plugin hosting will not ever appear in the environment, nor anywhere else in front of the Logic Pro sequencer, for the reasons I have already stated. Cheers.

Edited by Dewdman42

OSX 12.x (Monterey) on OpenCore - Logic Pro 10.7.4, VePro7, Mainstage3 - 5,1 MacPro 3.46ghz x 12 96gb ram


Posted (edited)
7 hours ago, polanoid said:

Can you elaborate on "these bugs" a bit more? Is there a compiled list of environment bugs somewhere? I don't really experience the Environment as a buggier area than other parts of Logic

No, I can't provide a list, but you can search the forum and find threads on the topic. It has been getting buggier in recent releases. In some cases there are workarounds.

Edited by Dewdman42

OSX 12.x (Monterey) on OpenCore - Logic Pro 10.7.4, VePro7, Mainstage3 - 5,1 MacPro 3.46ghz x 12 96gb ram


5 minutes ago, Dewdman42 said:

I have been respectful.  You decided to debate my point of view and I am simply pointing out where your points are wrong. Capital letters are for emphasis, not shouting, for example.

All caps is often perceived as shouting on the internet; I suppose that's what des99 was referring to.

I don't think there were any personal attacks or aggression on either side; if anything, I see it as frustration at not being understood, which is understandable - but let's not let that take over the core of the discussion, which is what we're interested in here.

We have a great team of talented, knowledgeable, experienced, and helpful Logic experts here, including (but not limited to) you two, des99 and Dewdman42, so let's be grateful for that and continue to focus on the root of the subject.

Thanks! 


My new Logic Pro Book is out!

