
Undocumented Scripter features


Dewdman42


Two callback functions get called periodically and you can put any logic you want in there.

 

Idle() and ProcessMIDI()

 

There are several ways to look at the Date.now() time in milliseconds, or the beat counter, to get the same result as a timer.


So you could do something more or less like this to get simple timer functionality:

 

var wait = 100;     // delay in milliseconds
var timer = 0;      // 0 means the timer is not armed
var stash = [];     // events waiting to be sent

function HandleMIDI(event) {
   // (re)arm the timer and stash the event for later
   timer = Date.now() + wait;
   stash.push(event);
}

function ProcessMIDI() {
   if (timer != 0 && Date.now() >= timer) {
       // timer expired: send the stashed events (NoteOffs included) and disarm
       for (var i = 0; i < stash.length; i++) {
           stash[i].send();
       }
       stash = [];
       timer = 0;
   }
}

 

But as I think about it, this is not precise enough to have your notes play at an exact time in milliseconds; you could only get close with it. The reason is a couple of things. For one, Date.now() gives you the current time in milliseconds, but that is the time when the script is actually executing the callback...not the exact time of the note being played live or coming from a region, nor the point in time represented by the process block. So how much slop is that? I'm not sure...probably something within the length of your sample buffer. You also have to keep track of the NoteOffs so that you can schedule those too!

 

You could get a lot more complicated with beatPos calculations in order to be more precise...but if you have cycle looping or tempo changes, or if the transport isn't playing, it could get even more complicated.


Whoops, for some reason ProcessMIDI slipped my mind; I hadn't used it in a script in a while. I would think Idle isn't suited for anything time-sensitive at all, since it sounds like it's low priority and can often be skipped.

 

Some further info on ProcessMIDI: it seems to get called once per process buffer, i.e., every (buffer size ÷ sample rate) seconds. With my sample buffer at 128 and sample rate at 44.1 kHz, ProcessMIDI seems to be called a little under every 3 milliseconds (from looking at Trace outputs). Doing the math:

 

(1 sec / 44100) * 128 ≈ 0.0029025 sec ≈ 2.9 ms

 

which lines up nicely. Changing the buffer size yields the expected change to the call rate. I guess this is what the vague line in the manual, about the call being dependent on the sample rate and buffer size, actually means. The call interval is simply your buffer delay time. Why didn't they just say that?

 

So, I would think a timer in ProcessMIDI should be off by, at most, the length of your sample buffer at your sample rate. At Logic's largest buffer of 1024 samples and lowest sample rate of 44.1 kHz, that gives a worst case of being a little over 23 ms off (1024 / 44100 ≈ 23.2 ms). Which is pretty terrible for a millisecond timer, but I doubt many people are running Logic at those settings anyway. At more realistic settings it seems decent enough, being in the single digits of milliseconds.


Pretty much, yea...but also keep in mind that both HandleMIDI() and ProcessMIDI() do not get called "truly" in real time. They are called ahead of time, and not necessarily in the same way.

 

ProcessMIDI() gets called as javascript code to execute ahead of time. LPX allows plugins and itself to operate on a buffer full of data in whatever way they need to, ahead of time. Midi events are not actually "sent" when you call event.send(); they are "scheduled". There is a period of time where LPX and plugins are operating on the audio data in the buffer, churning and modifying the audio, taking into account midi events scheduled there so that software instruments can use those midi events to modify the audio buffer, etc. Finally, when it's time for that buffer to be played, the buffer, along with any midi events that need to be sent externally, is sent out the hardware interfaces. ProcessMIDI() is how you can schedule midi events to be processed during that audio process block. How far ahead of time will this javascript get called to schedule the midi events? We don't know.
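To make the "scheduled, not sent" idea concrete, here is a minimal sketch of my own (not from the manual; it assumes NeedsTimingInfo is enabled so GetTimingInfo() is available) that schedules a note on each whole beat that falls inside the current process block:

var NeedsTimingInfo = true;   // required so GetTimingInfo() works

function ProcessMIDI() {
   var info = GetTimingInfo();
   var target = Math.ceil(info.blockStartBeat);   // next whole beat
   if (info.playing && target < info.blockEndBeat) {
       var on = new NoteOn();
       on.pitch = 60;
       on.velocity = 100;
       on.sendAtBeat(target);        // scheduled into the block, not sent "now"
       var off = new NoteOff(on);    // copy pitch/channel from the NoteOn
       off.sendAtBeat(target + 0.5);
   }
}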

 

So the javascript code itself isn't actually running in true real time...it's operating ahead of schedule by some unknown amount of time, and scheduling midi events.

 

But the Date.now() function in javascript IS real time, as of when the script is executing. See what I mean? The ms timer we're trying to use here is not really lined up with the buffer in terms of when the actual midi notes are going to be played; it's lined up with the scripting execution phase of processing, which runs ahead of schedule.

 

HandleMIDI(), on the other hand, responds to incoming midi events. Midi events might be played live from a keyboard or might be played from a region. When they are played from a region, it's possible for LPX to call the HandleMIDI() callback as far ahead of time as it feels like; we don't really know how and when it will be called, or how early relative to when the scheduled midi will actually play. Live midi played from the keyboard can't be early: HandleMIDI() can only be called when the midi event is actually received. So what does LPX do in those two situations? We don't really know, and we don't really know how far ahead of the actual midi playback the execution of that code will be...but again, the call to Date.now() is real time...

 

I don't know if that is making sense, but basically I'm saying there is going to be some slop factor and you'll have to experiment to see if it will be close enough for you.

 

The only way to be more accurate and reduce or eliminate the slop is to get into some math in your script to compute beat positions instead of using a ms timer. Then you can compare your timer against the beat clock and line events up exactly where you want them, sample-accurately. If you don't have any tempo changes or cycle looping it wouldn't be that hard, but it can get complicated.
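For instance, a bare-bones sketch of that approach (my illustration; it assumes a fixed tempo and no cycle looping) converts a millisecond delay into beats and schedules against the beat clock instead of the wall clock:

var NeedsTimingInfo = true;

function HandleMIDI(event) {
   var info = GetTimingInfo();
   var delayMs = 100;                                      // desired delay
   var delayBeats = (delayMs / 1000) * (info.tempo / 60);  // ms -> beats at current tempo
   event.sendAtBeat(event.beatPos + delayBeats);           // sample-accurate scheduling
}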


  • 2 weeks later...

This thread is much appreciated, guys! I have 4 more questions about Scripter and I am wondering if you guys know: is it possible to have some kind of text input in Scripter? I would like to be able to rename parameters and modify the values of menus, etc.

 

And also, about the eventTypes.js file: would it be possible to read values from another file stored locally on my computer? What I am thinking is that I would like the menus in my Scripter plugin to reflect the names of my articulation IDs. Would something like this be possible? Basically I want my Scripter to behave differently based on the articulation ID, so it would be nice if the menus in Scripter could update whenever I save my articulation IDs.

 

Also, would it be possible to read and learn from the original js code used for the scripter?

 

And last, why don't I see my MIDI messages that I create if I cable my instrument into a monitor?


To your other question you added later: there is currently no known way to read files in Scripter, or include files, or anything like that. The EventTypes.js file is the only one, and it's executed by Scripter at some point before executing your code. It's not really a good place to have your users messing with things.

  • 4 weeks later...

Not sure what you mean by that exactly. What are you thinking? Scripter really is somewhat sandboxed; it can only affect midi events within the channel it's hosted on.

 

The only other thing it can do, a little bit: you can make rudimentary GUI controls for Scripter, and those controls can be exposed as automatable items. They can also be exposed via Smart Controls, which makes it possible to direct some stuff from the environment to those controls, but not the other way around.
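For illustration, a minimal sketch of such a control (my own example; the parameter name is arbitrary): declaring a PluginParameters array gives you an automatable slider in Scripter's UI:

var PluginParameters = [
   {name: "Transpose", type: "lin", minValue: -12, maxValue: 12,
    numberOfSteps: 24, defaultValue: 0}
];

function HandleMIDI(event) {
   if (event instanceof Note)
       event.pitch += GetParameter("Transpose");   // shift notes by the slider value
   event.send();
}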

 

You can also use Scripter to control automatable items on other plugins within the same channel.
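A sketch of how that can look (my example; "Plugin Param" is just a placeholder name), using Scripter's "target" parameter type and a TargetEvent: the user assigns a destination parameter in Scripter's UI, and the script drives it from incoming mod wheel data:

var PluginParameters = [
   {name: "Plugin Param", type: "target"}   // user assigns the destination in the UI
];

function HandleMIDI(event) {
   if (event instanceof ControlChange && event.number == 1) {
       var t = new TargetEvent();
       t.target = "Plugin Param";    // refers to the "target" parameter above
       t.value = event.value / 127;  // normalize the CC value to 0..1
       t.send();
   } else {
       event.send();
   }
}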

 

Aside from that, Scripter can really just affect midi events happening within the channel. It can create midi events, drop midi events, change midi events, schedule them in the future, etc...but it can't reach outside the channel.


PS: what are you trying to do exactly? There are other solutions out there that might get you where you want to be. For example, the free MIDIPipe utility will let you associate AppleScript with midi events. So you could potentially do something over IAC and have AppleScript driving Keyboard Maestro or something crazy like that, which could then do more things to Logic in a very indirect way. That's too complicated for my blood, but it would be interesting to see what's possible.

 

Logic simply does not have the kind of access that Reaper has for script customization.

 

Another possibility is to write an AU plugin. In an AU plugin you could communicate with any outside process or other plugin instances you want and do all manner of complicated stuff, including sending keystrokes to LPX if you wanted, or perhaps sending MCU commands, etc. Many possibilities. Scripter probably isn't going to get you there though, as it's pretty sandboxed.


PS, the only thing is, Scripter gives you access to the articulation ID, and as far as I know AU plugins can't detect articulation IDs. So if you're wanting to do something related to articulation ID, it's a bit of a challenge. What you'd have to do is turn the articulation ID into some kind of midi event or automation that an AU plugin could then intercept and do whatever you want with.
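Something along these lines would be a starting point (a rough sketch of mine; the choice of CC 32 to carry the ID is arbitrary): mirror each note's articulation ID onto a CC so a downstream plugin that can't see articulation IDs can still react to it:

function HandleMIDI(event) {
   if (event instanceof NoteOn && event.articulationID != undefined) {
       var cc = new ControlChange();
       cc.number = 32;                   // arbitrary CC chosen to carry the articulation
       cc.value = event.articulationID;  // articulation IDs are small integers
       cc.send();
   }
   event.send();
}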

Not sure what you mean by that exactly. What are you thinking? Scripter really is somewhat sandboxed; it can only affect midi events within the channel it's hosted on.

 

...

 

 

As always, thanks for the detailed answer! What I would like personally would be to have all of my parameters and values within Scripter saved to a file external to Logic, in e.g. JSON format.


The best you could do is send the JSON to the console as logging, then copy and paste it later.
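Something like this minimal sketch (the state object here is just a made-up example):

var state = {wait: 100, mode: "chase"};   // whatever you want to persist
Trace(JSON.stringify(state));             // copy it out of the console by hand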

 

Alright, thanks again Dewdman42! Then I'm gonna try to figure out a way around this problem. Follow-up question: do you know if there is a location on the Mac where I can find all of the plugin parameters, in a plist file maybe?

I am building a Scripter plugin which depends on which plugins I am using further down the signal chain, and therefore it would be nice if I could collect all plugin parameters from e.g. a plist and pipe them into the EventTypes.js file, so that I can select which plugin and parameter I am targeting from a drop-down… :D


  • 1 year later...

I just found another undocumented feature. At least I haven't seen it documented. Stumbled upon this.

 

You can declare a callback function called Initialize(), which will be called, I think, once before any other callbacks, but after the whole script has been scanned and all function definitions interpreted. For example:

 

var note;

function Initialize() {
   note = new NoteOn();   // runs once, before any other callback fires
}

 

This is an actual callback function. I stumbled on it by accident and noticed Scripter highlighting the function name in red.

 

You might ask the question: why bother to put initialization code inside this callback function? Why not just put it at the global level? This particularly comes up if you are trying to use a class that is defined later on, in the guts of the script, while keeping some simple setup at the top. That fails in plain javascript: although function declarations are hoisted, statements like prototype assignments only execute when they are reached, so code at the top of the script can't use them yet. But if you place this kind of code in an Initialize() callback, it can be located at the top, and Scripter will automatically call it for you after the entire script body has run...

 

So for example, this won't work:

 

var hello = new myClass();   // OK: the function declaration is hoisted

hello.trace();    // ERROR: trace hasn't been attached to the prototype yet

//=========================================
// OOP Class stuff below here
//
function myClass() {
   this.name = "hello world";
}

myClass.prototype.trace = function() {
   Trace(this.name);
};

 

This, however, does work:

 

var hello = new myClass();

function Initialize() {
   hello.trace();    // This works, because Initialize() is called later
}

//=========================================
// OOP Class stuff below here
//
function myClass() {
   this.name = "hello world";
}

myClass.prototype.trace = function() {
   Trace(this.name);
};

 

I was hoping that GetParameter() could also be called inside Initialize(), since it can't really be used at the global level (the parameters aren't fully initialized yet at that point). But it appears they are also not initialized yet during the Initialize() callback, unfortunately. That would have been nice, Apple, if you're listening.
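One workaround sketch (my own, relying on the ParameterChanged() callback, which Scripter calls for each parameter once the UI is ready and again on every change): cache the values there instead of reading them at init time:

var settings = {};

function ParameterChanged(param, value) {
   // param is an index into the PluginParameters array; cache the value by name
   settings[PluginParameters[param].name] = value;
}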

 

Mainly I think this is just a convenient way to put clean, user-editable init code at the top of the script that may depend on all kinds of ugly, scary, complicated stuff further down. This way the ugly stuff can run first, before Initialize() is called.

 

For example, this won't work

 


channelmap[5][3] = "Favorite Channel"; // ERROR: channelmap is still undefined here

//============================================
// DO NOT EDIT BELOW HERE
//============================================

// init a two-dimensional array
var channelmap = [];
for (var port = 1; port <= 8; port++) {
   channelmap[port] = [];
}

 

But this does!

 

function Initialize() {
   channelmap[5][3] = "Favorite Channel"; // this works!
}

//============================================
// DO NOT EDIT BELOW HERE
//============================================

// init a two-dimensional array
var channelmap = [];
for (var port = 1; port <= 8; port++) {
   channelmap[port] = [];
}

  • 2 months later...

Just came across another undocumented feature in Scripter.

 

Scripter is able to schedule midi events at a resolution that far exceeds 960 ppqn. The LogicPro UI is set up with a midi tick resolution of 960 ppqn, so we can edit midi events in the piano roll and event list to that resolution. When we record midi tracks, it's not clear to me whether they are recorded at a higher resolution than that and then displayed in the UI at 960 (until we move them around), or whether recordings are quantized down to 960 as we record. I have a feeling it does not quantize, so recorded performances may also have this higher resolution hidden from us in the UI.

 

But make no mistake, midi events are stored internally at a much higher resolution, I think possibly at the current sample rate, which is roughly 20x the precision of LogicPro's 960 ppqn midi (depending on the tempo). Once we move things around in the piano roll, that extra resolution is gone, since everything there is quantized to 960 ppqn.

 

Then comes Scripter.

 

When you use beatPos in Scripter to schedule midi events, they will be played with precision that exceeds 960 ppqn. Instrument plugins generally don't know anything about midi ticks anyway; they are programmed to think in terms of sample position for all rendering, both as events are stored internally and as that information is passed to the plugin.

 

With Scripter you can use fractional beatPos values and achieve playback of midi events at any sample position you want. I have tested this thoroughly and it's true: with Scripter you can nudge notes to a beatPos that is a fraction of one midi tick. I haven't thought of a use case for this, but I still found it interesting to know.

 

For example:

 

var NeedsTimingInfo = true;

function HandleMIDI(event) {
   var samplerate = 48000;                        // adjust to your project's sample rate
   var info = GetTimingInfo();
   var oneSample = info.tempo / 60 / samplerate;  // beats per sample at the current tempo

   event.sendAtBeat(event.beatPos + oneSample);   // play the event exactly one sample later
}

 

The above example moves midi events one sample later in time. By the way, in this example at a tempo of 120 BPM there are 25 samples per midi tick. So the above example takes all incoming midi events and plays them delayed by 1/25th of a midi tick (at 120 BPM).
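To spell out that arithmetic: at 120 BPM one beat lasts 0.5 seconds, so one midi tick is 0.5 / 960 ≈ 0.00052 seconds, while one sample at 48 kHz is 1 / 48000 ≈ 0.0000208 seconds. Dividing the two gives exactly 25 samples per tick (48000 × 0.5 / 960 = 25).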

 

I tested this with several different plugins and indeed the bounced audio is moved one sample later.


  • 1 month later...
I just found another undocumented feature. At least I haven't seen it documented. Stumbled upon this.

 

...

 


 

I also stumbled upon it...and THEN I found this post... ;)

 

In any case, moving all of my UI startup schtuff from the global space into Initialize() really cleaned up a lot of UI bugginess/spaghetti I had been living with, fwiw...

 

This thread is so great, btw!!

 

I do have a question. I am unable to locate EventTypes.js on my system (macOS 10.15.5 and Logic 10.5) where others have indicated it should be (/Applications/Logic Pro X.app/Contents/Frameworks/MADSP.framework/Resources/). Has it moved, changed names, or simply gone away?? Don't really need it, just curious to take a look...


EventTypes.js went away in 10.4.5. Most of what was in there was moved internally into LogicPro, i.e. the event type definitions, etc., probably implemented in Swift now. The MIDI functions are still viewable as javascript, but there is nothing very interesting about them. I can't remember the name of the file, but look in the same path where you were looking for EventTypes.js; there is another js file there.

  • 3 years later...
On 8/16/2017 at 2:54 PM, Dewdman42 said:

It should be noted that the beatPos floating point value that is provided is not always exactly what the outgoing event will be because all output in LPX is rounded to the nearest midi tick (1/960 beat). 

I found an error in the original first post, which I cannot edit, so note that point #7 of the original post (quoted above) is wrong. The beat position float value is the actual sample-accurate timestamp of the midi event. LogicPro rounds timestamps to 960 ppqn for display purposes only.

