
The future of Logic Environment


ivicam

Recommended Posts

Hi everyone,

I've been following this fantastic forum and site for quite a while and decided to register recently. This is my very first post here. 🙂

As we know, Apple has officially deprecated the environment, I think in Logic 10.7.

As a macOS and iOS software developer, I can guess what this means: the feature will be removed in the future (at least its user-accessible part), and its use is discouraged for new projects, although it is still entirely possible.

But the deprecation and removal of a technology usually comes with a complete or almost complete replacement. So I am wondering: which additional features does Logic need so that everything possible in the environment could be done without it?

The most obvious one is internal MIDI routing, something similar to what Ableton Live has. Every track should be able to act as the input or output of another track. Audio tracks should also be able to receive MIDI from MIDI tracks in order to control effects with MIDI.

Now, most of the routing can already be done in Logic with some combination of summing stacks, MFX plugins, IAC, the External Instrument plugin, sidechaining, Quick Sampler, etc. But a more elegant and sample-accurate solution for MIDI routing would certainly be a big plus.

Would this be enough, in combination with various MIDI FX plugins and effects, to completely replace the environment, or would something else be necessary?
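To make the routing idea concrete, here is a rough sketch in plain JavaScript (the track names and event shape are made up) of the per-track routing table this would imply: any track, audio or MIDI, can be registered as a MIDI destination of any other track.

```javascript
// Hypothetical sketch of an internal MIDI routing table, where any track
// (including an audio track) can receive MIDI from any other track.
const routes = new Map(); // sourceTrack -> [destinationTracks]

function routeMidi(source, destination) {
  if (!routes.has(source)) routes.set(source, []);
  routes.get(source).push(destination);
}

// Deliver one event from a source track to all of its destinations.
function deliver(source, event, receive) {
  for (const dest of routes.get(source) ?? []) {
    receive(dest, event); // the host would hand the event to dest's input
  }
}

// Example: one MIDI track driving both an instrument track and an
// audio track (e.g. to drive a MIDI-controlled effect).
routeMidi("MIDI 1", "Inst 1");
routeMidi("MIDI 1", "Audio 1");
const received = [];
deliver("MIDI 1", { type: "noteOn", pitch: 60, velocity: 100 },
        (dest, ev) => received.push(dest));
// received is now ["Inst 1", "Audio 1"]
```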

Edited by ivicam
  • Like 2

Welcome to LPH!

I'm not sure they will completely remove it for the lifetime of LP 10 - it would potentially mean that you couldn't load old projects that used the environment, or import older projects. I think, like many of Logic's legacy plugins and instruments, it will stay hidden as a "Legacy" item, but remain intact and essentially neglected.

There are already many things you can do with the MIDI plugins and Scripter, which are really intended as more accessible replacements for implementing MIDI processing, delays, arpeggiations and so on. In short, they've been replacing those environment functions with more accessible, front-facing features for quite some time. Layering instruments can already be done via the tracks, and more niche things like setting up editing environments for MIDI hardware just aren't the big use case they were in the old days, now that people can use other apps, iPads etc. with much better interfaces.
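For anyone who hasn't tried it, Scripter's per-track processing really is this accessible. A minimal sketch (it uses Scripter's real HandleMIDI entry point and event.send(), but the example itself is made up) looks something like this:

```javascript
// Minimal Scripter-style MIDI effect: transpose all notes up an octave.
// In Logic's Scripter, HandleMIDI is called for each incoming event and
// event.send() passes the (possibly modified) event on down the chain.
function HandleMIDI(event) {
  if (event.pitch !== undefined) {
    event.pitch += 12; // note events carry a pitch; CCs etc. pass through
  }
  event.send();
}
```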

I think a better way of looking at it is to ask: what are the use cases that Logic users would want to set up that can't currently be done easily, or can only be done with an environment solution? If there are good cases, I'd suggest them to Apple, so they can design newer features to accomplish those goals more easily.

(And yes, the MIDI-controlled FX plugins are a bit of a clunky implementation, imo.)

For me, the main thing that the environment can't be replaced for is for handling MIDI hardware with program/bank changes selectable by name, as the External Instrument plugin does not do a good job here.

Also, having MIDI processing on the input before the sequencer is very useful (eg, modifying or filtering MIDI input streams to Logic), and that's something else that can't be achieved with any other means than the environment.

I agree on the MIDI routing thing, but this needs an overhaul anyway to support the output of MIDI from plugins and a more general solution - it's possible this is in process given the bug we saw in 10.8.0 regarding this. Because otherwise, it's pointless to set a track's MIDI out to another track, as plugins on those tracks can't currently output MIDI data at all, and if you just wanted to distribute the current incoming MIDI data to multiple tracks, that's easily done already. My guess is that it would be easy enough to use current environment functionality in disguise and create and route environment objects directly from the tracks window to accommodate some of those routing tasks.

What other use cases would you want to cater for?

  • Like 5

5 hours ago, ivicam said:

As we know, Apple has officially deprecated the environment, I think in Logic 10.7.

As a macOS and iOS software developer, I can guess what this means: the feature will be removed in the future (at least its user-accessible part), and its use is discouraged for new projects, although it is still entirely possible.

I don't disagree. As already pointed out, it's not really removed yet, just pushed to the back of the room. But I do think it's a sign of what's to come, too.

5 hours ago, ivicam said:

Would this be enough, in combination with various MIDI FX plugins and effects, to completely replace the environment, or would something else be necessary?

We can consider this conversation hypothetical; since none of us has a crystal ball, we don't know what Apple is likely to do. Anything is possible. But hypothetically... what does the environment bring to the table that would be essential enough to require an alternative replacement? Yea?

First, consider that the environment is more than just a MIDI routing place. The basic MIDI routing capabilities of the environment could EASILY be replaced by something much simpler in the main GUI; that would not be hard to do, and reading older projects is not a problem either, it's just a matter of programming. Other DAWs generally do have more MIDI routing. LogicPro has already been adding some of those kinds of details little by little, for example the ability to specify the input MIDI port and channel directly from the track inspector on the arrange page. It would not take much to add a few more MIDI routing capabilities directly in the GUI.

The environment has also been used for MIDI processing, with transformers and other tricks, most of which can now be done using MIDI FX, including Scripter; third-party AUmfx can be used as well.

LogicPro recently added the ability to place these MIDI FX in front of recording to the region as well, which used to be a reason to use the environment sometimes, but that need can now be eliminated.

There are a couple more MIDI routing things which could be better represented outside the environment. For example, multi-instrument objects can be very useful, but it's probably true that less than 1% of users are actually using those anymore. Occasionally they are still helpful due to other limitations in LogicPro, such as only 127 channels per multi-instrument, or only 1000 instrument channels, etc. Those limitations could also be easily eliminated, which would basically eliminate the need to bother with the environment for so-called "workarounds".

If you look at MainStage, it has considerably more GUI-based features than LogicPro does related to MIDI routing, velocity scaling and all manner of typical things that people have used the environment to manage for years in LogicPro. In MainStage there is no "exposed" environment, but there are significantly more powerful ways to route the MIDI, massage the MIDI, provide fancy Smart Controls on the main performance view, etc., way beyond what LogicPro can do without diving into the environment. If I were Apple, I would be trying to make LogicPro more like that, covering all the bases and basically removing the need, in 99.99% of all cases, to ever get under the covers in the environment.

Another thing to keep in mind is that the environment is not just about MIDI routing. Internally, LogicPro has some notion of so-called "objects", such as channel strip objects, which historically users had to wire together in the environment, without any GUI to do it at all, before being able to make any music. They considered it a value-add to have that power. Apple attempted to hide all that stuff behind a better GUI, and covered maybe 90% of it, but they seem to be trying to hide it more and more. At least for now, the GUI only indirectly creates the related environment objects under the covers; it's still the same environment paradigm for managing those objects in some way. But they are trying to eliminate the need to expose that internal access to the end user. How it's all implemented in actual software under the covers is entirely up to them; whether they continue to use some internal environment object paradigm or not would not matter at all unless we are still demanding some kind of low-level access to it.

I personally think it's pretty close to being covered already, but every once in a while I'm glad I can work around a shortcoming related to MIDI routing generally. If they remove a few of the limitations based on the 127 and 1000 limits, add a few more routing options on par with MainStage, perhaps the ability to create GUI panels more sophisticated than Smart Controls, perhaps a few more MIDI FX... I think 99.99% of users will not care about the environment anymore, and they could safely hide it permanently. And once it's hidden, Apple can choose internally to change the guts of LP into something completely different than cabled environment objects, or not, at their own choosing. Some of those limitations mentioned exist precisely because of legacy environment code going back a long time; if I were them, I would want to get rid of it and replace it with something more modern and less limited in every way, albeit without exposing a user-controllable environment any longer. Personally, that is the direction I think they are going, but nobody has a crystal ball.

  • Like 4

MIDI 2.0 features are likely to eliminate things like manually setting up program/bank changes, automatic hardware parameter mapping, and so on.
Communication of this sort of info between devices/apps reduces the need for many of the things the environment brings to the table.
That said, the backward compatibility of 40 years of MIDI devices can't be ignored, so the ability to manually handle things like program changes/bank select, etc. will need to be retained.
Whether that means the environment or different dialog windows is the mystery.

  • Like 1

2 hours ago, des99 said:

I agree on the MIDI routing thing, but this needs an overhaul anyway to support the output of MIDI from plugins and a more general solution - it's possible this is in process given the bug we saw in 10.8.0 regarding this. Because otherwise, it's pointless to set a track's MIDI out to another track, as plugins on those tracks can't currently output MIDI data at all, and if you just wanted to distribute the current incoming MIDI data to multiple tracks, that's easily done already.

I agree. The only thing that would make sense currently is to route Modulator or Scripter output to audio tracks in order to modulate plug-in parameters on those tracks. But that can already be achieved without the environment by doing the opposite: routing audio to a MIDI track through sidechaining or Quick Sampler.

By the way, are there any technical reasons why AUs and Logic don't fully support MIDI in and out for everything, and why even the VST3 standard has a weird way of doing it?

2 hours ago, des99 said:

What other use cases would you want to cater for?

Well, apart from internal MIDI routing and proper AU MIDI in-out support, nothing really. I rarely need to transform MIDI before the sequencer and if I ever do, I use something outside Logic to intercept MIDI from a controller, transform it and send the transformed MIDI through Logic Virtual In.

What I would like to have, though, is a Scripter that is not only a single-track MIDI effect, but also a global Logic scripting tool with access to most of the features of Logic. Something like Max for Live. But that has nothing to do with the environment, so it's another topic. 🙂


1 hour ago, ivicam said:

I agree. The only thing that would make sense currently is to route Modulator or Scripter output to audio tracks in order to modulate plug-in parameters on those tracks. But that can already be achieved without the environment by doing the opposite: routing audio to a MIDI track through sidechaining or Quick Sampler.

The way to control audio FX is by using the MIDI-controlled FX, which sit in the instrument slot of an instrument channel. Then you can put Modulator on the same channel to control it and any other plugins on that same channel. Audio comes in through the side chain from another bus.

1 hour ago, ivicam said:

By the way, are there any technical reasons why AUs and Logic don't fully support MIDI in and out for everything, and why even the VST3 standard has a weird way of doing it?

Apple's spec for AU2 does not support MIDI out. Any AU plugins that try to do it are relying on undocumented behavior. LogicPro will never support that, despite all the wishes. I'm not sure whether AU3 officially supports MIDI out or not. The environment has nothing to do with this situation. The channel strip itself does not capture MIDI out from AUs in the instrument slot. It's a MIDI sandbox.

Note that Steinberg doesn't really like to support MIDI out either; they have been vocal about that, and VST3 barely supports it properly. It happens to be there because of the way VST works, so many people have exploited it, but Steinberg considers VST to be an audio plugin system in which the only reason for MIDI is to feed instrument plugins. With VST3 they even wanted to eliminate MIDI from that; unfortunately they had to concede to some minimal MIDI handling in VST3 in the end, but it's a bit broken compared to VST2.

If Steinberg truly supported the concept of VST MIDI plugins, they would put VST plugin slots in front of the instrument plugin slots on their instrument tracks so that users could use VST MIDI plugins the same way LogicPro does with AUmfx. But they don't. Which means in Cubase you have to daisy-chain tracks in order to utilize these underground VST MIDI plugins, which loses sample accuracy, by the way.

LogicPro is actually the only DAW that I know of which specifically puts AUmfx slots in front of the instrument slot on instrument channels, so that you can use one or more AUmfx plugins (which ARE officially MIDI plugins, by the way) and get sample-accurate rendering of MIDI events by the instrument plugin. Apple actually did it 100% right! All these concerns about AUinst not outputting MIDI exist only because so many plugin developers have exploited the unsupported VST feature to make MIDI plugins and then sometimes converted them to AU, and all the users get frustrated that they can't use those in LogicPro. The same developers should be releasing AUmfx versions for LogicPro; that is the Apple way, and indeed the BEST way to handle MIDI plugins.

AUinst is not meant for MIDI output, and it's very unlikely Apple will add that. And if they did, you may not realize that the output from an AUinst would have nowhere to go but to loop around to the next process buffer block, with latency and no sample accuracy. But people would simply be satisfied that they could finally daisy-chain tracks like Cubase and some others; really, though, Apple's way is the superior way of doing it.
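The buffer-block point can be illustrated with a toy model (block size and numbers are arbitrary, and the two functions are purely illustrative, not real host code): a MIDI FX in front of the instrument renders in the same block at the exact sample offset, while a looped-back event can only be picked up when the next block is processed.

```javascript
// Toy model: in-front MIDI FX vs. loop-around routing.
// Hypothetical figures: 48 kHz sample rate, 256-sample process blocks.
const blockSize = 256;
const sampleRate = 48000;

// In-front MIDI FX: the generated event is rendered by the instrument
// inside the same block, at its exact sample offset.
function renderWithMfx(block, offset) {
  return { renderedInBlock: block, sampleOffset: offset };
}

// Loop-around (e.g. IAC or track daisy-chaining): an event produced while
// rendering block N is only picked up when block N+1 is processed,
// and its within-block offset is lost.
function renderWithLoopback(block, offset) {
  return { renderedInBlock: block + 1, sampleOffset: 0 };
}

const direct = renderWithMfx(10, 37);      // block 10, sample 37
const looped = renderWithLoopback(10, 37); // block 11, quantized to block start

// The penalty is one block: 256 samples, roughly 5.3 ms at 48 kHz.
const penaltyMs = (blockSize / sampleRate) * 1000;
```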

1 hour ago, ivicam said:

What I would like to have, though, is a Scripter that is not only a single-track MIDI effect, but also a global Logic scripting tool with access to most of the features of Logic. Something like Max for Live. But that has nothing to do with the environment, so it's another topic. 🙂

I hear you. Probably won't happen. That's more like a macro facility of some kind. If Apple were going to do that, they would probably try to integrate it with AppleScript or Automator, etc. That is the Apple way. Your best bet for that today is to work with Keyboard Maestro to create some macros that automate LogicPro in various ways. You want Reaper for that kind of thing. Apple won't expose it that way.

  • Like 2

The case I could make for some better MIDI routing options might be this: let's say I want to use an AUmfx MIDI plugin that generates a bunch of MIDI that I want sent to multiple instruments sitting on multiple instrument channels.

Right now there is no way to do that without using some third-party utility plugins, or looping around over IAC (which, by the way, is essentially what Cubase, S1 and other DAWs do). I think the right Apple way would be to support putting AUmfx plugins at the top level of a summing stack, where each instrument channel inside the summing stack can listen on different MIDI channels, etc. (essentially making it more like a "multi-stack", or a stacked-track version of a virtual multi-instrument).
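As a toy sketch of that "multi-stack" idea (all names are hypothetical), the channel-based dispatch inside the stack would amount to something like this:

```javascript
// Hypothetical "multi-stack": an AUmfx at the top of a summing stack
// generates MIDI, and each instrument channel inside the stack listens
// on its own MIDI channel.
const stack = [
  { name: "Bass", channel: 1 },
  { name: "Pad",  channel: 2 },
  { name: "Lead", channel: 3 },
];

// Distribute one generated event to whichever stack members match its channel.
function dispatch(event) {
  return stack.filter(t => t.channel === event.channel).map(t => t.name);
}

const hit = dispatch({ type: "noteOn", channel: 2, pitch: 48 });
// hit is ["Pad"]
```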

Another situation where people want to use an AUinst with MIDI out is when it's a plugin that outputs both audio and MIDI. There honestly aren't very many of those. Those same plugin developers should really be splitting that behavior between two plugins, one in the AUmfx slot and one in the instrument slot. That would be proper adherence to Apple's Core Audio spec, but many developers unfortunately tend to use VST as their baseline, or JUCE, and then wrap it to AU as an afterthought.

Edited by Dewdman42
  • Like 3

6 minutes ago, Dewdman42 said:

LogicPro is actually the only DAW that I know of which specifically puts AUmfx slots in front of the instrument slot on instrument channels, so that you can use one or more AUmfx plugins (which ARE officially MIDI plugins, by the way) and get sample-accurate rendering of MIDI events by the instrument plugin. Apple actually did it 100% right! All these concerns about AUinst not outputting MIDI exist only because so many plugin developers have exploited the unsupported VST feature to make MIDI plugins and then sometimes converted them to AU, and all the users get frustrated that they can't use those in LogicPro. The same developers should be releasing AUmfx versions for LogicPro; that is the Apple way, and indeed the BEST way to handle MIDI plugins.

AUinst is not meant for MIDI output, and it's very unlikely Apple will add that. And if they did, you may not realize that the output from an AUinst would have nowhere to go but to loop around to the next process buffer block, with latency and no sample accuracy. But people would simply be satisfied that they could finally daisy-chain tracks like Cubase and some others; really, though, Apple's way is the superior way of doing it.

I agree in principle, but in practice, many standalone software instruments have built-in MIDI transforming features such as sequencers, arpeggiators, strummers, etc. It would be redundant and weird from a UX standpoint to build a separate AUmfx for use inside Logic. Sometimes those transformations wouldn't even make much sense as standalone MIDI plugins, as they are highly specific to that single instrument.


11 minutes ago, ivicam said:

I agree in principle, but in practice, many standalone software instruments have built-in MIDI transforming features such as sequencers, arpeggiators, strummers, etc.

I agree; well, as I said, the Apple way is to separate that functionality for sample-accurate rendering. For all those instruments you are thinking of, when the MIDI is routed to another track in Cubase, it will not be sample accurate; it will be delayed into the next process buffer block.

Getting back to LogicPro, you can handle that situation, sample-accurately, by using third-party utility plugins such as Kushview Element, Plogue Bidule, DDMF MetaPlugin and Unify. These will let you host a chain of MIDI plugins leading to an instrument plugin, all hosted inside LogicPro's instrument slot, all sample accurate, as long as none of those plugins are built-in LogicPro plugins, that is. Also, AUinst still does not officially support MIDI out, so you'd have to use the VST version of the instrument to get both audio and MIDI out of the one plugin.

So we could say it would be interesting if Apple provided this utility chaining as a built-in Apple plugin, so that people could essentially cram that chain into the instrument slot; problem solved. Ask for it! Except Apple won't support VST plugins and they probably won't support MIDI out from AU, but hey, you can ask. I doubt they will do it now.

As I said, the way Cubase is doing it is not officially Steinberg's idea of how things should work, is not sample accurate either, and only happens because of a fluke of the original VST API that allowed developers to exploit it.

 

11 minutes ago, ivicam said:

It would be redundant and weird from a UX standpoint to build a separate AUmfx for use inside Logic. Sometimes those transformations wouldn't even make much sense as standalone MIDI plugins, as they are highly specific to that single instrument.

I hear you and don't disagree. Nonetheless, you still have the issue of sample accuracy in that situation. If you want to do that, do it the way I suggested above. If you want to loop it out through IAC, that is also possible, which would be more similar to what Cubase and other DAWs do when chaining MIDI tracks.

Edited by Dewdman42

26 minutes ago, Dewdman42 said:

I hear you. Probably won't happen. That's more like a macro facility of some kind. If Apple were going to do that, they would probably try to integrate it with AppleScript or Automator, etc. That is the Apple way. Your best bet for that today is to work with Keyboard Maestro to create some macros that automate LogicPro in various ways. You want Reaper for that kind of thing. Apple won't expose it that way.

I had some success with using MIDI mappings for automating Logic. Almost everything in Logic can be mapped to a MIDI message. Those messages can then be programmatically sent to Logic through IAC or Logic Virtual In, or even from something on an iPad (e.g. Swift Playgrounds) through a physical port.

Interestingly, I was able to control Logic even from Scripter, by sending MIDI messages to IAC. Just map CC 22 coming in through IAC to the Play or Stop key command, make an External Instrument track that outputs to IAC, and try this Scripter code on that track:

// One momentary button in the Scripter UI, labelled "Play or Stop".
var PluginParameters = [{
    name: ":",
    type: "momentary",
    valueStrings: ["Play or Stop"]
}];

function ParameterChanged(param, value) {
    if (param == 0) {
        // Send CC 22 at full value out of this External Instrument track
        // (routed to IAC); the controller assignment mapped to CC 22
        // then triggers Logic's Play or Stop key command.
        var cc = new ControlChange;
        cc.number = 22;
        cc.value = 127;
        cc.send();
    }
}

A problem for some use cases is that this is one-way communication. There is no way to get feedback from Logic when something happens inside it. I guess control surface Lua scripts could be used for that scenario, but the API is not available even to registered Apple developers. Will MIDI 2.0 solve this type of two-way communication?


I am trying to script and automate recording and playing live loops in many different ways, inspired by Ableton Live's Session View features. And it works better than I expected. You can do things like "play the first scene for n bars, then choose a random scene and play it for another m bars, and at some moment start recording into a particular cell", etc.

What would be great is to also get feedback when a MIDI-mapped command is performed inside Logic. For example, when a cell is played manually in Logic, it would be useful for the scripting code to get that feedback and adapt accordingly. That kind of feedback is possible via the Lua scripts that supported MIDI controllers use for auto-mapping but, as I said, Apple doesn't make the API publicly available. I think they only provide it to chosen partners.

Edited by ivicam

11 minutes ago, ivicam said:

What would be great is to also get feedback when a MIDI-mapped command is performed inside Logic. For example, when a certain cell is played manually in Logic, it would be useful for the scripting code to get that feedback and adapt accordingly.

I agree, and good feedback is frustratingly hard to get. In my apps, I have to monitor the MCU communication and infer state from the things Logic sends to the MCU, which is quite limited and doesn't always give you what you want.

I think the way Logic Remote interfaces with Logic is much more complete, but good luck reverse-engineering what's going on there...

(If you do, I want in on it! 😉)

11 minutes ago, ivicam said:

That kind of feedback is possible via Lua scripts that supported midi controllers use for auto-mapping but, as I said, Apple doesn't make the API publicly available.

I haven't dug into the Lua scripting stuff, but you can get a good idea of its capabilities by studying the included Lua scripts for various controllers. For example, you can get them to create custom controller assignments, but only in a limited fashion.

I think, though, that the functionality is narrow and targeted specifically at handling controllers, not at providing an extensive entry point into, and view of, Logic's internals...

  • Like 2

1 hour ago, ivicam said:

I am trying to script and automate recording and playing live loops in many different ways inspired by Ableton Live's Session View features.

If you have a Push 2 you might wanna look at this:
https://github.com/zurie/LogicX-Push2

You can also run Logic/Live alongside each other and use IAC etc. to stream stuff between them... essentially using Live/Push to "drive" Logic instruments/FX.

Edited by oscwilde
  • Like 2

On 12/2/2023 at 1:39 AM, Dewdman42 said:

If you want to loop it out through IAC, that is also possible, which would be more similar to what Cubase and other DAWs do when chaining MIDI tracks.

Out of curiosity, can you think of a practical use scenario in which a non-sample-accurate approach would be a problem?

It has never happened to me that the MIDI latency introduced by IAC or other virtual ports became a problem. All the other latencies in a project are usually significantly higher than the MIDI latency.


On 12/2/2023 at 2:15 AM, oscwilde said:

If you have a Push 2 you might wanna look at this:
https://github.com/zurie/LogicX-Push2

I don’t have Push 2, but I have Novation Launchpad Pro MK3.

On 12/2/2023 at 2:15 AM, oscwilde said:

You can also run Logic/Live alongside each other and use IAC etc. to stream stuff between them... essentially using Live/Push to "drive" Logic instruments/FX.

I've already tried this setup and it works great. It's just that I am trying to push Logic itself to the limit to see what's possible without using Live at all. Partly for fun, and partly because I would like to switch to Logic completely. Music is my hobby and I like to mix guitar and piano with electronic music. Live is definitely better and easier for those kinds of things, no question, but I think Logic is almost there (for my personal needs) with a bit more effort. 🙂


5 hours ago, ivicam said:

Out of curiosity, can you think of a practical use scenario in which a non-sample-accurate approach would be a problem?

It has never happened to me that the MIDI latency introduced by IAC or other virtual ports became a problem. All the other latencies in a project are usually significantly higher than the MIDI latency.

Most people will not notice it, I agree.

But anyway, the other latencies are not an issue due to plugin delay compensation; IAC latency is not compensated. Still, as you said, people won't notice, so you should be fine; there are many workarounds in LogicPro using IAC. I was only trying to explain why the AU spec is better than VST for MIDI plugin support.
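To put rough numbers on the compensation point (all figures are hypothetical; JavaScript is used just for the arithmetic): the host pads every mixer path up to the longest reported plugin latency so everything stays aligned, but an IAC loop reports no latency at all, so nothing compensates for it.

```javascript
// Two mixer paths, each reporting its plugin latency in samples (made-up numbers).
const paths = { drums: 512, vocals: 128 };

// Plugin delay compensation: the host delays every path up to the longest
// reported latency, so all paths stay time-aligned with each other.
const maxLatency = Math.max(...Object.values(paths));
const padding = Object.fromEntries(
  Object.entries(paths).map(([name, lat]) => [name, maxLatency - lat]));
// padding is { drums: 0, vocals: 384 }: both paths now land 512 samples late, together.

// An IAC loop adds roughly a buffer of real delay but reports zero latency,
// so the host has nothing to compensate against:
const iacReportedLatency = 0;   // what PDC sees
const iacActualApprox = 256;    // what roughly happens (about one buffer)
const uncompensated = iacActualApprox - iacReportedLatency; // 256 samples adrift
```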

  • Like 1

On 12/2/2023 at 1:39 AM, Dewdman42 said:

Getting back to LogicPro, you can handle that situation, sample-accurately, by using third-party utility plugins such as Kushview Element, Plogue Bidule, DDMF MetaPlugin and Unify. These will let you host a chain of MIDI plugins leading to an instrument plugin, all hosted inside LogicPro's instrument slot, all sample accurate, as long as none of those plugins are built-in LogicPro plugins, that is. Also, AUinst still does not officially support MIDI out, so you'd have to use the VST version of the instrument to get both audio and MIDI out of the one plugin.

I was curious about the bolded part, given the 10.8 double-notes bug, and I found something interesting.

If you host the AU version (not VST) of Arturia Pigments (or I guess anything else known to generate MIDI) in something like JUCE AudioPluginHost or Max, you can see that the AU does in fact output MIDI. And you can send that MIDI to Logic. It's just that Logic itself strictly follows AU specs and ignores MIDI generated from AU instruments, except in 10.8.


6 minutes ago, ivicam said:

you can see that the AU does in fact output MIDI.

Using what mechanism? Plugins have always been able to do this indirectly, by opening a virtual MIDI port, or sending MIDI out to IAC, so it can go out of the plugin and back into Logic. But are you saying they output MIDI back to the host directly, using some different mechanism?


Just now, des99 said:

Using what mechanism? Plugins have always been able to do this indirectly, by opening a virtual MIDI port, or sending MIDI out to IAC, so it can go out of the plugin and back into Logic. But are you saying they output MIDI back to the host directly, using some different mechanism?

They output MIDI to the host directly. In the example below, I use the MIDI output from one AU instrument to play another AU instrument.

 

 

[Attached screenshot: Screenshot 2023-12-04 at 00.12.56.png]

  • Like 1

Just to clarify the way hosts and VST/AU plugins work: there is a buffer that is passed to the plugin, which has audio and MIDI on it. The plugin can then write all over that buffer however it wants, and the host will see whatever is in the buffer, if it cares to look at it. In the case of AU, it's possible for a plugin to write MIDI data into that buffer, regardless of the fact that the AU spec doesn't recognize it as something desirable. There are even some hosts that check the buffer just in case it might have some MIDI data there, even though the spec doesn't specify it; Blue Cat Audio's PatchWork, for example, does. Apple just doesn't recognize it as a desirable thing, and LogicPro doesn't do it; some other hosts also don't look for it. DP used to, but doesn't any longer.

As it turns out, AUmfx plugins are written very much the same way as AU instrument plugins. You can actually take an AUinst plugin, change one line of code and rebuild it as an AUmfx; a host such as LogicPro would in that case ignore the audio output in the buffer, just as it ignores the MIDI from an AUinst. Basically, the same mechanism by which AUmfx plugins pass MIDI out along to the next plugin in the chain is how an AUinst can do it... if the host decides to ignore Apple's restrictions and look for it anyway.

So the buffer is always there, really; it's just a question of whether the plugin and the host pay attention to it for certain plugin types.
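That handshake can be sketched in a few lines of JavaScript (a toy model, not real AU code; every name here is made up): the host hands the plugin a block, the plugin may scribble MIDI into it whether or not the spec blesses that, and the host only sees it if it chooses to look.

```javascript
// Toy model of the host/plugin buffer handshake described above.
function renderBlock(plugin, hostChecksMidi) {
  const buffer = { audio: new Float32Array(256), midi: [] };
  plugin.process(buffer); // plugin writes audio, and maybe MIDI, into the buffer
  return hostChecksMidi ? buffer.midi : []; // a spec-strict host never looks
}

// An instrument that writes MIDI into the buffer despite the spec.
const renegadeInst = {
  process(buf) {
    buf.audio.fill(0.1);                            // normal audio output
    buf.midi.push({ type: "noteOn", pitch: 64 });   // undocumented MIDI out
  },
};

const seenByJuceStyleHost = renderBlock(renegadeInst, true);  // one event
const seenByStrictAuHost  = renderBlock(renegadeInst, false); // nothing
```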

Anyway, this topic has been discussed to death for decades. We are not covering anything new here. Apple didn't spec it. It's extremely unlikely to ever show up in AUinst. If you want to talk about workarounds, you can search this forum and find some ways to deal with it. If you just want to rant against Apple, then have at it, but it will get you nowhere. Apple doesn't agree with you. At least they do provide a proper MIDI plugin format, which Steinberg does not.

  • Like 2

The reason JUCE hosts can do it, by the way, is that JUCE is designed so you code once and build projects as both VST and AU. So the basic default is to recognize and use MIDI output if it's there, since that is what VST can do. If you build your host for AUs, it is possible to also check for that buffer, regardless of whether Apple condones it. If the plugin put something there, then it will work. Most plugins don't try to put anything into the AU MIDI output anymore. There are a few, but usually it's just an oversight, or they based it on JUCE and it's just there for free without them doing anything special. But as I said, for the most part, hosts which are AU-compliant will not look for it anyway.

  • Like 1

1 minute ago, Dewdman42 said:

In the case of AU, it's possible for a plugin to write MIDI data into that buffer, regardless of the fact that the AU spec doesn't recognize it as something desirable.

Yep, I was just having a mooch about in the JUCE forum for code examples of how people were doing this in AUs, and from a quick reading it seems it's basically this - using the JUCE buffer to pass back MIDI data...


It's really the job of the host to look for whatever the plugin puts in there. If the host doesn't look, then it might as well not be there. JUCE-based hosts look for it, because they are built with VST intentions. If someone builds a host with the AU APIs directly, it probably won't look for it, though it's still possible they could go out of their way to look for it, depending on how they use the Core Audio API. Core Audio has a lot of higher-level APIs for setting up signal chains and such, and most likely, if a host was using those, it would just ignore any MIDI output that might be sitting in that buffer.


But like I said, you can take an AUinst, change one line and rebuild it as an AUmfx; then suddenly a proper AU host would look for the buffer and make use of it.

It's really just a matter of Apple's specifications and who is being compliant. LogicPro is being compliant. So is DP. Some of the smaller hosts, like PatchWork, maybe the JUCE freebie host, possibly Kushview Element, use shared code between VST and AU and will just look for the MIDI data and use it regardless, but it's all under the table in terms of the Apple spec.

Also, in the case of LogicPro, even if it did look for the AUinst MIDI output, what would it do with it? It has a huge environment infrastructure under the covers and basically treats the channel strip like a MIDI sandbox: no way out and nowhere to go. They'd have to deal with that.

What I would welcome in LogicPro, which would be easy, is a checkbox to look for renegade MIDI output from the AUinst and send it to IAC. Problem solved for a lot of people, and easy to do.

 

  • Like 1

12 minutes ago, Dewdman42 said:

If you just want to rant against Apple, then have at it, but it will get you nowhere.

I am not ranting against Apple or anyone else; where did you see that? I am just trying to understand how all this works. And I am not even disagreeing with you, as you obviously know much more than I do about the topic.

