Notes dropping when playing through many transformer objects


saxmand


Hey.

I'm working on a big orchestra template in Logic X where I have a number of transformers (around 10, plus a cable switcher) on every channel. (I've attached a picture so you'll get an idea.)

I'm working with VSL Dimension Strings, so the string section itself is 29 tracks. I also have some master tracks and chord splitters so I can play several groups at the same time. I never have any problems whatsoever on playback. But if I'm playing more instruments (around 13-14) at once from my keyboard, not all Note Ons are sent through, and even fewer Note Offs...

 

I know it's the transformers, because I can easily play 30 channels if I avoid all the transformers. Has anyone else found a solution? Otherwise I just wanted to share the "problem".

[Attached: screenshot of the Environment setup]


Try to use fewer transformer objects, or use regular faders instead. Using one transformer object per MIDI parameter change is probably too much, especially when you have as many as in your example; a regular fader object could probably do that job in many instances. Depending on the routing complexity in the Environment, some delay should be expected and mitigated in the design. Designing your routing as serially as possible could help in that regard.

Thanks Atlas. So you're saying faders take less "power" than transformers!?

Haven't been able to find any documentation for this.

In that case, would I also prefer 2 faders rather than one transformer (in the case of creating notes from CC)?

 

I'm constantly trying to simplify the environment and make it smoother, and I can see that I can definitely implement some of the functions with fader objects instead. Do you have any measurement or rule of thumb for how big/complex an Environment setup has to be before it starts producing delay?


I have the feeling that your cable switchers might be the culprit here too, though it's really not possible to offer specific advice unless you describe precisely what's going on in this environment scheme. Or maybe it would be easier to describe not the process itself but what your actual goal is here. Chances are you can simplify this setup, but more info is needed.

Amazing: bounce your problems off others and you'll find new solutions :)

I just realized I could get by without the cable switcher, and I've narrowed my transformers down to 5.

Another thing I realized is that my transformer setup isn't needed for live playing, so routing around my transformers kinda solved it all.

 

But anyway, I'll just explain what I'm doing, since it might also be interesting for others :)

 

It's specifically for controlling VSL.

Nothing affects channel 1, so it can be played freely. On channels above 1, these things apply:

- Note velocity is transformed to CC1 for controlling the y-axis.

[Screenshot: velocity-to-CC1 transformer]

- The CC2 (breath) value is used as the note velocity instead.

[Screenshots: CC2-to-velocity transformers]

- Notes on channels 2-13 control the x-axis, i.e. key-switches C0-B0.

[Screenshot: channel-to-keyswitch transformer]

After this you of course have to assign the right channel to the instrument before sending to the VSL ensemble.

[Screenshot: re-channeling transformer]
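To make the chain a bit more concrete, here's a rough toy model of the three rules in Python. This is only my own illustration of the logic, not anything Logic-specific: the event dicts, the `make_chain` name, and the assumption that C0 is MIDI note 24 are all made up for the sketch (octave numbering varies between hosts), and Note Offs are omitted for brevity.

```python
# Toy model of the transformer chain described above (illustration only,
# not Logic objects). Events are plain dicts; Note Offs are omitted.

KEYSWITCH_C0 = 24  # assumed MIDI number for C0; octave numbering varies


def make_chain():
    state = {"breath": 100}  # last seen CC2 value, reused as note velocity

    def process(event):
        # Channel 1 passes through untouched and can be played freely.
        if event.get("channel") == 1:
            return [event]
        # CC2 (breath) is consumed and remembered for later notes.
        if event["type"] == "cc" and event["number"] == 2:
            state["breath"] = event["value"]
            return []
        out = []
        if event["type"] == "note_on":
            ch = event["channel"]
            # Rule 1: note velocity becomes CC1 (y-axis control).
            out.append({"type": "cc", "number": 1,
                        "value": event["velocity"], "channel": 1})
            # Rule 3: channels 2-13 select the x-axis keyswitch C0..B0.
            if 2 <= ch <= 13:
                out.append({"type": "note_on", "channel": 1,
                            "note": KEYSWITCH_C0 + (ch - 2), "velocity": 1})
            # Rule 2: the note itself, re-channelled, with breath as velocity.
            out.append({"type": "note_on", "channel": 1,
                        "note": event["note"], "velocity": state["breath"]})
        return out

    return process
```

So playing a note on channel 3 would emit a CC1 carrying the original velocity, the keyswitch C#0, and the note itself carrying the last breath value.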

 

Why?

- Because all key-switches stay connected to the note itself, even if I move it.

- My notes' velocity will be the same whether I use Vel.XF or not. (When using Vel.XF you sometimes have more velocity layers playing; this is especially a problem for dynamic patches.)


So I did a bit more research on my original subject (which only concerns playing live from your MIDI keyboard, since Logic reacts differently during playback). I would like to have a better understanding of this...

 

I chained together as many objects as possible to see when/if the MIDI signal gets stopped (because that's obviously what happened with my template). Here are my observations:

- Sending from 1 input, the MIDI signal stops after 397 objects (this includes the input object).

- This holds for all types of MIDI messages (notes, CCs, etc.).

- All object types I tried act the same. I've tried Transformer, MIDI instrument, Fader, and Monitor objects.

- If I send from 2 inputs but limit the first input to 160 objects, the second input's signal stops after 196 objects.

- Logic prioritizes the input that comes first in the Arrange window.

- The same happens with 3 inputs when limiting the first two, but it seems to me that the input object counts as 2.

- So playing with 4 inputs, limiting the first 3, I can send through a total of 393 objects.

- Channel splitters, and splitting via the channel condition in your transformer, ARE re-routing the signal.
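As a side note on testing for this: when checking whether events survive the trip, it helps to diff what was sent against what came back. Here's a minimal, self-contained Python sketch of that bookkeeping. It's my own illustration (the event labels are made up), and actually routing through an IAC bus, e.g. with the `mido` package, is machine-specific and not shown.

```python
from collections import Counter


def dropped_events(sent, received):
    """Return how many of each sent event never came back."""
    return dict(Counter(sent) - Counter(received))


# The kind of loss described above: Note Ons arrive, some Note Offs
# don't, which is what leaves hanging notes.
sent     = ["on:60", "off:60", "on:64", "off:64"]
received = ["on:60", "off:60", "on:64"]  # "off:64" was dropped
print(dropped_events(sent, received))    # prints {'off:64': 1}
```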

 

The next thing I guess is to check how many inputs can be used live in the environment at the same time.


Yes, there are limitations, and they can crop up when passing MIDI through far fewer objects than 397. Sometimes the limit can be as few as 12! It all depends on what you're doing. I've built very complex environments that pass MIDI through a fairly high number of objects without dropping anything, so really the approach you take determines "how far you can go".

 

Anyway, it's great that you've been able to reduce the number of environment objects. But I have the feeling you could do some of what you're trying to achieve in an easier way. (That's just a feeling because, of course, I can't see exactly what you're doing, and I don't have your environment to examine in person.) And if you're trying to "replace" velocity on the fly, that could be another reason why you're losing MIDI: some of the messages you're generating may end up incomplete. Just a guess...


I'll make it clearer. Actually, my solutions are working well at the moment, but anyway...

 

This is what I want:

1) Making a big orchestra template with an environment that gives me better control of keyswitches.

2) Playing different instrument groups with this chord splitter solution:

http://www.vi-control.net/forum/viewtopic.php?t=25222&postdays=0&postorder=asc&start=0

 

This is what happened:

1) I've tried many different environments. Sometimes they become buggier, even when I feel I'm optimizing them.

This is the reason for my attempts at figuring out some of the Environment's limitations and behaviors.

2) When using the chord splitter I got hanging notes (because of the limitations). I've solved this by avoiding all transformers when using the chord splitter and going straight to the instruments.

 

If you're interested I could share the Environment, and maybe you'll have suggestions for better solutions.

I already tried to see if I could make the routing more serial. I haven't found a solution yet, but my experiments with the Environment kinda showed that it takes the same power from the Environment to have two instruments playing through the same objects as to have each playing through its own...


When you say "better control of keyswitches", what are you referring to? Channel = keyswitch = articulation? Unifying keyswitch definitions? Something else? Curious to know...

 

In regards to chord splitters, are you talking about an auto-divisi function? AFAICT, creating that in the environment would be near impossible (considering how much time it would take to devise a complex scheme to do note tracking).

 

But all that aside, I'd be interested in seeing your environment. I may have some solutions for you, but I think it would help if you could clarify even more exactly what you're after.


As described above: It's specifically for controlling VSL.

Nothing affects channel 1, so it can be played freely. On channels above 1, these things apply:

- Note velocity is transformed to CC1 for controlling the y-axis.

- The CC2 (breath) value is used as the note velocity instead.

- Notes on channels 2-13 control the x-axis, i.e. key-switches C0-B0.

This means these controls are added after recording, not live.

 

For the chord splitter I use the Kontakt script in the link above. It works surprisingly well. The concept is to find colors rather than having your MIDI play back through it. For that I would split my MIDI to the right channels.
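For anyone curious what "splitting to the right channels" could look like, here's a tiny Python sketch of the idea: the notes of a chord are dealt out to successive MIDI channels so each voice can reach its own instrument. This is only my own minimal illustration, not the Kontakt script from the link; the channel range 2-13 is assumed from the setup above.

```python
def split_chord(notes, first_channel=2, last_channel=13):
    """Deal the notes of a chord (low to high) out to their own channels."""
    channels = range(first_channel, last_channel + 1)
    return [(note, ch) for note, ch in zip(sorted(notes), channels)]


# A C major triad played at once lands on channels 2, 3 and 4:
print(split_chord([64, 60, 67]))  # prints [(60, 2), (64, 3), (67, 4)]
```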

 

Okay, so I'm not sure if I should clean up the session and remove unimportant information and transformers..!?

I think it's best to clean it up a bit... But maybe still leave some MIDI regions in the project so you can see how I use it together with the MIDI programming? It'll probably make even more sense if you have some VSL products so you can hear the functionality. Do you have that?


Clean it up as best you can so that there aren't any unused, non-functional, or "mystery objects" that would take up someone's time just figuring out that they don't do anything.

 

Leave a region or two in there for sure to show the result.

 

As for me, I have Vienna Special Edition, so with most of those sounds I'll have at least two Y-axis sounds to test that functionality, though no A/B stuff. Still, MIDI is MIDI, shouldn't be hard to figure out what's going on.


Sure. My concept of orchestral programming, though, relies on the range of recorded samples/effects, so I also use a lot of dynamic patches. I've made my own matrices so I have all articulations at hand.

 

I'll make a file later, because I'm working on a deadline I need to finish first!

[Screenshot: my piccolo matrix, for reference of the matrix setup]


Well, here's a Logic X file: https://www.dropbox.com/s/t30gm8f7u6tid24/Saxmand%20Environment%204v1.zip

I've cleaned it up quite a bit. The MIDI regions might not be that interesting, since they're not the most active melodies.

 

- The idea is to have an environment template. That's the layer called "1MIDI control" (in this session it's 2x16 channels).

On this layer there's also the routing for the chord splitter, but you'll be missing two IAC ports called "To Kontakt" & "From Kontakt".

- In the Instrument layer you add instruments and then route them to the right channels on 1MIDI control. Since I use Dimension Strings, where I have control of every string player, I also have them playable from a collective instrument.

 

I hope it makes sense. I guess I could provide some better MIDI regions. Also, it's not the latest environment, but this one works really well for me, so it's the one I use in projects until I have the newer one solid (composing and being technical at the same time often stops the creative workflow...). Curious about your thoughts, ski, and whether the setup makes any sense to you :) !


Just a couple of observations related to your concern about running MIDI through transformers and how far those messages can then travel...

 

Within your macro you're using a transformer as an "input", (correctly) labeled "Macro In" as it needs to be. However, the xformer itself doesn't do anything, so right there you've got an unneeded xformer. I suggest that instead of a xformer you use an ornament (labeled "Macro In"); that'll reduce your xformer count by 1 in each macro.

 

The same is true of your macro's "Macro Out" object. Again, it's a transformer that does nothing other than provide the necessary "Macro Out" label. I suggest you replace those with ornaments as well.

 

Of course, if your scheme is working out, there's no need to change anything. And to be honest, I don't know whether "passive transformers" like these have any impact whatsoever on the number of objects MIDI can travel through. Still, I've made it a good practice to use the most processor-benign object possible as a Macro In or Out connection point, and ornaments fit that bill.

 

Ornaments can be used as input & output connection points (such as described above) as well as summing and distribution points. Multiple MIDI sources can be connected to the input of an ornament, and multiple outputs can be connected from it to various destinations.

 

When I have more time I'll look a little deeper into your environment. But for now, there are some programming pointers for you! :)

