The future of Logic Environment


ivicam


1 hour ago, Dewdman42 said:

the frustrations you are expressing about this issue have been expressed by many many people, including me, for literally decades.  Apple hasn't budged on the issue.

This also applied to other things, e.g. the MIDI input bottleneck, until it finally got addressed. Never say never! 🙂


I'll use the word as I wish and you can correct me later if I'm wrong.  😇

In this case, Apple has an actual specification with intended reasons. Referring to MIDI input bottlenecks or other shortcomings that needed correction and took a long time to get it is not the same; those were not intentionally spec'd things. The non-existence of AUinst MIDI out actually is spec'd that way. You are spitting into the wind with that one. Sorry to say, but that is how I see it. I'd love to see them add it too, as I have already said, but you have really a 0.0001% chance of it happening. So good luck with that.

Edited by Dewdman42

Just now, Dewdman42 said:

You are spitting into the wind with that one.

Just take it for how it was intended - a little light hearted, well-intentioned fun. (If you missed it, there's an extra little yellow clue as to how it was meant to be received.) I wasn't saying one thing is exactly the same as the other in quite as literal a fashion as you've read it. No one knows what Apple will choose to implement/fix/improve in the future - I made a simple, mildly optimistic analogy, that's all.

No one here is trying to score points. Remember this is a friendly community for the most part.


Now, this is interesting. WWDC 2017 session 501 says:

Quote

AU can emit MIDI output synchronized with its audio output

Host sets a block on the AU to be called every render cycle

Host can record/edit both MIDI performance and audio output from the AU

So AUv3 can output both MIDI and audio. The question is how this will affect Logic for macOS in the future, given that we now have Logic for iPad, which works only with AUv3 plugins.


And given that an AUv3 host can host AUv2 units transparently, I am really wondering whether this means that hosts can now use AU instrument MIDI output without violating the AU specs. I mean, we are not required to check whether the underlying unit is AUv3 before using midiOutputNames and midiOutputEventBlock.

Not implying what Logic should or will do; I am talking about third-party hosts in general.
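To make it concrete, here is a minimal host-side sketch in Swift, assuming an already-instantiated AUAudioUnit (instantiation and error handling omitted). At this level the v2/v3 distinction is invisible:

```swift
import AudioToolbox

// `unit` is an AUAudioUnit the host already instantiated; the v2/v3
// distinction is hidden behind the bridge at this point.
func installMIDITap(on unit: AUAudioUnit) {
    // An AU that emits MIDI advertises at least one output name.
    guard !unit.midiOutputNames.isEmpty else { return }

    // The AU calls this block every render cycle, synchronized with its audio.
    unit.midiOutputEventBlock = { sampleTime, cable, length, bytes in
        let data = [UInt8](UnsafeBufferPointer(start: bytes, count: length))
        print("MIDI out @ \(sampleTime), cable \(cable): \(data)")  // a real host would enqueue, not print
        return noErr
    }
}
```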


If you really want this feature make sure to submit a feature request to Apple about it.  

I don't think AUv3 has any relevance here.  LogicPro is not fully compliant with AUv3 yet in numerous ways.  They consider it beta support, and about the only significant plugin that uses v3 is VEPro's AU3, which is considered beta for this reason.  The only reason VSL even bothered with it was to get multiple MIDI input ports in LogicPro.  And as it turns out, there are still limitations due to a lot of legacy code in LogicPro's environment.

LogicPro is about the only host I know of today that even supports hosting AUv3 plugins on macOS.  But it's possible that will change if and when JUCE rolls completely with AUv3; then a bunch of JUCE-based hosts and plugins would suddenly start playing in the AUv3 space.  But as of now, that space is mostly related to iOS, and on macOS it has mostly been ignored.  Most plugin developers will not bother with it, and likewise the hosts won't either, since there are scarcely any AUv3 plugins to host and most of them also support VST.

There are only 2-3 AUv3 plugins even in existence today for macOS.  They are much more prevalent on iOS.  It's not clear to me at all that AUv3 supports the kind of direct MIDI out you have been asking about.  Both v2 and v3 have always been able to create virtual MIDI output from the plugin if they wish; for example, some Toontrack products do that today.  You can also do that with Kushview Element and Plogue Bidule; it's one way to deal with what you want to do right now, today, using the current version of LogicPro.  But that is not the direct MIDI out we have been talking about here that AUinst does not support.  I don't actually know what the AUv3 spec says about that; if you know some good links about it please share.

 


37 minutes ago, Dewdman42 said:

But that is not the direct MIDI out we have been talking about here that AUinst does not support.  I don't actually know what the AUv3 spec says about that; if you know some good links about it please share.

I don't think there are official specs as such, but I posted the WWDC link from 2017:

Quote

AU can emit MIDI output synchronized with its audio output

Host sets a block on the AU to be called every render cycle

Host can record/edit both MIDI performance and audio output from the AU

And here is the transcript for that particular slide:

Quote

Now, let's see the second main new feature we have, which is the support for MIDI output in an audio unit extension.

We have now support for an AU to emit MIDI output synchronized with its audio output.

And this is mainly useful if the host wants to record and edit both the MIDI performance, as well as the audio output, from the AU.

So, the host installs a MIDI output event block on the AU, and the AU should call this block every render cycle and provide the MIDI output for that particular render cycle.

Source
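From the plugin's side, the mechanism described in the transcript would look roughly like this in an AUAudioUnit subclass. This is my own hedged sketch, not Apple's code; a real class would also implement buses, audio rendering and so on:

```swift
import AudioToolbox

class MIDIEmittingInstrument: AUAudioUnit {
    // Holder object so the render block doesn't capture self (render-thread safety).
    private final class MIDIOutHolder { var block: AUMIDIOutputEventBlock? }
    private let midiOut = MIDIOutHolder()

    override func allocateRenderResources() throws {
        try super.allocateRenderResources()
        // The host has installed its block by now; cache it for the render thread.
        midiOut.block = midiOutputEventBlock
    }

    override var internalRenderBlock: AUInternalRenderBlock {
        let midiOut = self.midiOut   // capture the holder, not self
        return { _, timestamp, _, _, _, _, _ in
            // ... render audio for this cycle ...
            // Then report whatever MIDI this cycle produced back to the host.
            let noteOn: [UInt8] = [0x90, 60, 100]   // hypothetical event: note-on, middle C
            noteOn.withUnsafeBufferPointer { buf in
                _ = midiOut.block?(AUEventSampleTime(timestamp.pointee.mSampleTime),
                                   0, buf.count, buf.baseAddress!)
            }
            return noErr
        }
    }
}
```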


2 hours ago, Dewdman42 said:

But that is not the direct MIDI out we have been talking about here that AUinst does not support.

And why, then, does Apple have sample code from 2012 demonstrating how to build exactly that - an instrument with direct MIDI out?

https://developer.apple.com/library/archive/samplecode/SinSynth/Listings/SinSynthWithMidi_cpp.html

Quote

 

This is a subclass of SinSynth that demonstrates how to use the midi output properties kAudioUnitProperty_MIDIOutputCallbackInfo, and kAudioUnitProperty_MIDIOutputCallback

defined in AudioUnitProperties.h. Using these properties, the SinSynthWithMidi simply passes through the midi data it receives. Use of these properties requires host support.

 

 

Edited by ivicam

Hey @ivicam, I'm not really interested in arguing with you.  You're new here; I don't know who you are, but it seems like you just came here to dig up controversy with your first thread.  Sounds like you have it all figured out already.

Nobody here has the exact answers you are seeking, or I thought you were seeking.  Good luck, hope you can figure it out.  As to why that old source from 11 years ago exists is anyone's guess; I certainly don't know the answer, but congratulations on digging it up.  You still have to just accept that Apple doesn't support MIDI out from AUinst.  DP also removed it from their app.  I rather recall Studio One is the same.  Reaper might.  Hope you can find what you are looking for.

Edited by Dewdman42

52 minutes ago, ivicam said:

kAudioUnitProperty_MIDIOutputCallbackInfo, and kAudioUnitProperty_MIDIOutputCallback

The properties seen in that code have been around for a long time (already discussed here at LPH even years before that sample code was published), and are still there today in the official docs:

kAudioUnitProperty_MIDIOutputCallback | Apple Developer Documentation

kAudioUnitProperty_MIDIOutputCallbackInfo | Apple Developer Documentation

It seems to me, based on the evidence I've seen over the years, that the reason we have not seen proper MIDI out from Audio Units in Logic (of course, prior to AUmfx (aumi), aka Audio Units MIDI FX) is more to do with Audio Units hosts (including Logic) not fully supporting it, and not many Audio Units developers implementing it in their plugins.
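For reference, the v2 flavour is a plain C property pair that can be driven from Swift as well. A rough, untested sketch (the exact imported callback signature may differ slightly):

```swift
import AudioToolbox

// `audioUnit` is a v2 AudioUnit the host created the old-fashioned way.
// kAudioUnitProperty_MIDIOutputCallbackInfo returns the output names;
// kAudioUnitProperty_MIDIOutputCallback installs the per-render-cycle callback.
func installV2MIDICallback(on audioUnit: AudioUnit) {
    var info = AUMIDIOutputCallbackStruct(
        midiOutputCallback: { _, _, midiOutNum, packetList in
            // Called once per render cycle with the MIDI the AU produced.
            print("v2 MIDI out \(midiOutNum): \(packetList.pointee.numPackets) packet(s)")
            return noErr
        },
        userData: nil)
    AudioUnitSetProperty(audioUnit,
                         kAudioUnitProperty_MIDIOutputCallback,
                         kAudioUnitScope_Global, 0,
                         &info, UInt32(MemoryLayout<AUMIDIOutputCallbackStruct>.size))
}
```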

Some interesting feedback on this topic from Blue Cat Audio's dev here:

AU MIDI out on Mac - DSP and Plugin Development Forum - KVR Audio

J.

 


Thanks for digging up that documentation!  But we also have to ask: if it were so easy to host MIDI out of AUinst, then why hasn't Apple added it in some form a long, long time ago?  The properties you referenced are base-class properties and methods related to all AU plugins, yes?  They could have put them there for some reason, but still with the intention that AUinst is not "appropriate" to use them.  I'm just theorizing now; we don't really know why.  Logic, MainStage, GarageBand - none of them support MIDI out from AUinst.

The fact that it is in the base class is exactly why some plugins are able to provide it despite the fact that LogicPro ignores it.  But that doesn't necessarily mean that Apple WANTS it to be used that way.  Anyway, we don't know what Apple wants; what we know is that it's possible for hosts to host it and it's possible for plugins to provide it... but Apple doesn't provide that host support in LogicPro, GarageBand, MainStage or anything else that I'm aware of.  Neither does DP, though I think it used to.  Last time I tried it with S1v4 it didn't work there either.

Anyway, I would love to see them add this support, but I stand by my earlier comments... they probably won't.  People have been talking about this off and on for years and years.  Bottom line.

To repeat again, the best way to work around this in 2023 is to use one of the 3rd-party utility plugins such as Kushview Element to host the VST version of the plugin; inside there you can capture the MIDI out and do whatever you want with it.


Well, in addition to whatever theory about AUinst "appropriate" use, there is the problem that they simply didn't have a proper route through the environment to deal with it.  It's not as simple as just throwing it back into the environment directly, because there are factors related to how the process buffer is processed, etc.

I think the easiest thing they could have added a very long time ago, with only a few lines of code, would be a checkbox somewhere that, when checked, lets you use an IAC port to send the output from the AUinst.  Essentially do what some plugins like Toontrack's have done to handle the deficiency.  That at least provides a way to get the MIDI out from the AU and bring it back into LogicPro, but with a checkbox that is off by default.  The accidental bug in 10.8 was always on and created havoc for people.  It should be optional, and you should be able to choose which IAC port to use so that you can further go into the environment and catch it or deal with it in some predictable way.  That would literally be a very small amount of coding, using exactly the API you pointed out.  It could have been done years ago very easily, without digging into the environment to figure out all the implications of taking the MIDI out from AUinst and doing something with it directly in the environment.
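The plumbing for that kind of workaround is tiny, by the way. A sketch of the virtual-source variant in Swift, which is roughly what the Toontrack-style plugins do (names are mine, untested):

```swift
import CoreMIDI

// Publish a virtual source; it shows up in LogicPro like an IAC port.
var client = MIDIClientRef()
MIDIClientCreate("AUinstTap" as CFString, nil, nil, &client)

var source = MIDIEndpointRef()
MIDISourceCreate(client, "AUinst MIDI Out" as CFString, &source)
// (Newer code would use MIDISourceCreateWithProtocol instead.)

// Then, from the AU's MIDI output callback, republish each packet list:
//     MIDIReceived(source, packetListPointer)
// LogicPro sees it as ordinary incoming MIDI on the next process block.
```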

In the environment there is currently no way to, say, cable the MIDI output from one channel strip to another channel strip.  You can cable it, but what is passed along is the same input; it doesn't pass along the MIDI that was generated inside each channel strip.  For plugins you have to worry about buffered process-block processing, so if you cable it from one channel strip to another, there are implications: sometimes the data would need to be handled on the next process block because the destination channel strip was already processed in this process block, etc.  Well, multiply that complexity by all the possible cable-routing variations and it can get complicated quickly.

On the other hand, just send it to IAC and it comes back on the next process block as if it's new MIDI... fine.  You probably lose about 1 ms to delay that way, but you basically avoid the internal complexities of how the channel strips and the plugins within them are processed, and of what to do with generated MIDI that you are trying to cable around to other channel strips in the environment, etc.  It would just be a much larger problem to solve, and with some old legacy code too.


@Dewdman42

 

I am not arguing with you at a personal level. Sorry you perceived it that way; it may be due to the fact that English is not my native language.

I agree about Logic itself and I can see why they've never added capturing MIDI out from AU instruments and why they may never do it. It would be nice if they started supporting it at some point, in whichever way they find works best for Logic, and it's not the end of the world if they don't.

What I am trying to say here, based on reading both old and the most recent Apple documentation, sample code etc., is the following:

 

1. AU instruments can output both audio and MIDI directly, without any hacks, by using officially documented native APIs

2. Hosts can catch and record that MIDI output without any hacks, by using officially documented native APIs

3. This has been technically possible for many years with AUv2 as well, and Apple even showed how to do it

4. The most recent and officially recommended API for writing both AUv2 and AUv3 hosts doesn't even distinguish between the two types of AU units. Everything is bridged transparently between AUv2 and AUv3. If you use that API to catch MIDI out directly from AU instruments, it works in exactly the same way for AUv2 and AUv3, without any hacks; it's how it is designed to work. You don't have to do anything special to support AUv2 or AUv3 - hosting both of them just works out of the box (a small illustration follows below)
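As that illustration, enumerating and tapping every instrument on the system goes through one uniform API. A sketch from memory, error handling omitted:

```swift
import AVFoundation

// Ask for every instrument (type 'aumu') on the system; v2 and v3 come back together.
let anyInstrument = AudioComponentDescription(componentType: kAudioUnitType_MusicDevice,
                                              componentSubType: 0, componentManufacturer: 0,
                                              componentFlags: 0, componentFlagsMask: 0)
for component in AVAudioUnitComponentManager.shared().components(matching: anyInstrument) {
    // The same AUAudioUnit API wraps both generations; v2 units are bridged transparently.
    AUAudioUnit.instantiate(with: component.audioComponentDescription, options: []) { au, _ in
        guard let au = au else { return }
        print("\(component.name): MIDI outputs = \(au.midiOutputNames)")
    }
}
```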

 

That's all. I'm also finishing the discussion on this specific topic here, as we mostly agree about Logic Pro itself. I just wanted to clarify these more general technical aspects of the current state of affairs in Apple's dev ecosystem.

 

Edited by ivicam

Very, very interesting conversation . . . For me, the biggest asset related to the Environment was using it as a replacement for Emagic's Sound Diver. Bearing in mind that new hardware like the Yamaha Montage plans to introduce its own DAW plugin for managing configurations, patches and sound banks, and that the Access Virus TI has done this already, is there another alternative for managing external hardware keyboards and modules presently in LPX?


5 hours ago, synth_hero said:

Very, very interesting conversation . . . For me, the biggest asset related to the Environment was using it as a replacement for Emagic's Sound Diver.

I was personally very sad to see Sound Diver go away.  I have never tried to use the environment in this way, but I know some clever users have.  I have always been curious about it.

5 hours ago, synth_hero said:

is there another alternative for managing external hardware keyboards and modules presently in LPX? 

The main advantage of the environment would be being able to use faders and knobs, etc.  While I have no experience with this specific task, I would love to see some working examples if you have any.

Today there are a lot of 3rd-party systems, some of them even free, and some will run as plugins inside your DAW of choice too, such as this one:

https://github.com/RomanKubiak/ctrlr

I've seen people put together sysex panels using some other tools out there too, like Plogue Bidule (which is like the environment on steroids), Loomer Architect and some others.  Basically, there are a number of ways to do it.

 


6 hours ago, ivicam said:

 

I agree about Logic itself and I can see why they've never added capturing MIDI out from AU instruments and why they may never do it. It would be nice if they started supporting it at some point, in whichever way they find works best for Logic, and it's not the end of the world if they don't.

That's all. I'm also finishing the discussion on this specific topic here, as we mostly agree about Logic Pro itself. I just wanted to clarify these more general technical aspects of the current state of affairs in Apple's dev ecosystem.

You can submit your discontent about LogicPro's and some other DAWs' unwillingness to provide MIDI out using the documented API here: https://www.apple.com/feedback/logic-pro/

regards


6 hours ago, ivicam said:

The most recent and officially recommended API for writing both AUv2 and AUv3 hosts doesn't even distinguish between the two types of AU units. Everything is bridged transparently between AUv2 and AUv3. If you use that API to catch MIDI out directly from AU instruments, it works in exactly the same way for AUv2 and AUv3, without any hacks; it's how it is designed to work. You don't have to do anything special to support AUv2 or AUv3 - hosting both of them just works out of the box

For those who are interested, I've just tested this using Apple's sample code for writing modern AU hosts. The sample code is about one year old, written in Swift. 

The example app lists all the effect and instrument units installed on the system. The user can play a hard-coded audio file through the effect units and hard-coded MIDI notes through the instrument units.

The app can be compiled for both iOS/iPadOS and macOS. On iOS it will list only AUv3 units; on macOS it will list everything available. The MIDI callback is already set on the selected audio unit in the sample code; I've just added a print statement to test it with the Arturia Pigments AUv2 instrument. And as you can see in the attached screenshot, it receives MIDI out from Pigments naturally, without any modifications, out of the box.

How can this be used in the current Logic? Well, I think that one could easily write a MIDI effect AU that simply wraps an AU instrument unit and forwards the instrument's AUMIDIOutputEventBlock to its own AUScheduleMIDIEventBlock. This would allow capturing AU instruments' MIDI output via Logic's "Record MIDI to Track Here" option without any tricks and hacks. I will try to do it when I have time.

 

[attached screenshot: aumu-midi-out.jpg]
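The forwarding glue for that wrapper idea would only be a few lines; a hypothetical sketch (all identifiers are mine):

```swift
import AudioToolbox

// Inside the wrapper AU: `inner` is the sub-hosted instrument. Whatever MIDI it
// emits is handed straight to the block the host installed on the wrapper itself.
func bridgeMIDIOut(from inner: AUAudioUnit, through wrapper: AUAudioUnit) {
    inner.midiOutputEventBlock = { [weak wrapper] sampleTime, cable, length, bytes in
        wrapper?.midiOutputEventBlock?(sampleTime, cable, length, bytes) ?? noErr
    }
}
```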


10 minutes ago, ivicam said:

This would allow capturing AU instruments' MIDI output via Logic's "Record MIDI to Track Here" option without any tricks and hacks. I will try to do it when I have time.

The way that option works is that you have to move the "orange line" down below the last MIDI FX you want to capture.  LogicPro doesn't provide a way to drag that line below the AUinst.  Or maybe it does now?  I haven't tried it in a while.

But you should also know that it will only capture MIDI coming from input, not MIDI coming from the region.  It has a poor feature name; they should have called it something like "pre-record MIDI FX" or something along those lines.  Anyway, I don't think that is going to do what you want.  There are some AUs floating around that already expose MIDI out... you don't even need to write your own AUinst to test it.  Komplete Kontrol, for example, I believe.

There are already some AUmfx which can wrap around AUinst plugins and be used the way you suggested... Blue Cat Audio PatchWork, for example.  Try that one.  Kushview Element is free and might do it.  But then you won't be capturing the audio from the AUinst.

Edited by Dewdman42

@ivicam if you want a good product suggestion, since you appear to be a developer of some kind: to work around LogicPro, make a cool wrapper that sits in both slots as two plugins that talk to each other and present one unified GUI of the sub-hosted plugin to the user... so that the MIDI goes to the MIDI FX slot, the audio goes to the AUinst slot, and the user just uses one GUI for both places.  You'd essentially need the MIDI FX plugin to be just something that receives data from the other AUinst plugin in order to send the MIDI along.  Make sure it can host VST plugins also, since the vast majority of AUinst plugins don't output MIDI, while their VST versions do.

I think that would be useful and people would buy it just for this use case.

No, actually, scratch that idea; it wouldn't work, because the AUmfx processing happens before the AUinst.

The best workaround continues to be to send AUinst MIDI out to IAC.  That can be done with Kushview Element for sure, and there are other 3rd-party ways to do it.  Would be nice if Apple would build it in.

Edited by Dewdman42

44 minutes ago, synth_hero said:

Maybe these plugins could create a new trend of integrating hardware into DAWs

Arturia already do this with the MiniFreak, and Roland with their System-1 and System-8 Plug-Outs, along with the various engines for the Juno, JX, Jupiter, drum machines, etc. that run on hardware units' DSPs and computer CPUs.
The concept is actually well-worn (despite Yamaha's marketing hyperbole), with Yamaha themselves (and Ensoniq and Roland) having had (PC/Mac) card-based synths... along with Creamware, Nord, and others a couple of decades back, with dedicated DSP hardware/control/software combos.
Everything old is new again 😉
 

Edited by oscwilde

17 hours ago, synth_hero said:

Here is an old screenshot of the Environment that I used for the Yamaha CS1x

Yes, giving you the option to build an automatable GUI for some old hardware synths was one of the Environment's strengths, like here:

[attached image: d110.png]

and here:
[attached image: dss.png]

Forgive the garish and clipped colours and the overlapping text; to say that Apple's messing with the color palette and fonts has been ruthless would be an understatement.

Since most synths have either come (back) into the box or have gotten their own GUI software, this is a less important aspect nowadays.

But the ability to simply build your own MIDI processor which does precisely what is needed for any specific situation is something which an object-based paradigm like the Environment still excels at. Many other software developers outside music have recognized this and built it into their software, like here, in a video editing software:

[attached image: nodesresolve.png]

or here, in a 3D-program:

[attached image: nodesc4d.png]

...going so far that you can first define the number and names of the inputs and outputs of a node, then put some Python code inside the node and let the magic happen.

Here's what can be done with this concept in Logic (lo-res pic due to size):

[attached image: eequalsmcsquareklein.jpg]

What this does has been outlined here:

https://www.logicprohelp.com/forums/topic/123204-i-hope-logic-could-catch-up-at-namm/?do=findComment&comment=694074

I'm frustrated that Apple seems determined to let such a promising structure die. This is an open and efficient concept, and worthy of the tag 'Pro' in the software's name. But Apple names the software Pro, yet clearly aims it at the consumer, which is where this whole dilemma comes from.

Edited by fuzzfilth

2 hours ago, fuzzfilth said:

But the ability to simply build your own MIDI processor which does precisely what is needed for any specific situation is something which an object-based paradigm like the Environment still excels at. Many other software developers outside music have recognized this and built it into their software, like here, in a video editing software:

Thanks for articulating this Logic Pro asset. 100%

 

2 hours ago, fuzzfilth said:

I'm frustrated that Apple seems determined to let such a promising structure die. This is an open and efficient concept, and worthy of the tag 'Pro' in the software's name.

Absolutely! The Environment could certainly be developed further with modern graphic enhancements, given its enormous potential for customizing workflow, MIDI callbacks included.


2 minutes ago, synth_hero said:

Absolutely! The Environment could certainly be developed further with modern graphic enhancements, given its enormous potential for customizing workflow, MIDI callbacks included.

My best guess is that the Environment's internal implementation is so old that it's virtually impossible to make it work with Logic's new features.

But I would definitely love to see some modern equivalent, even more powerful.


I think modular environments are cool, and it would be cool for LogicPro to continue to have some kind of modular environment.

However, I feel the way LogicPro is wired up right now, with the environment being the actual underpinning of the entire track-channel infrastructure, is LogicPro's Achilles' heel.  I am looking forward to them getting rid of it in that capacity.  I know some of you environment warriors will not like me saying that, but I truly think that numerous bugs and limitations of LogicPro compared to other DAWs come down to the fact that it's built on top of that decades-old legacy underpinning.

Multi-instruments are a PITA in LogicPro - they always have been and continue to be - while other DAWs have left it in the dust.  How many times do we hear about limitations of 127 or 1000?  Frequently.  I blame the environment for that.  Track names and channel names doing weird automagical things that are sometimes right and sometimes wrong... that is down to the loose connection between the GUI Apple made and, again, that legacy environment underpinning.  Other DAWs let you rearrange the channel strips at will or create viewsets of particular tracks and so forth - all kinds of productive workflow things which LogicPro is lacking.  Again, I blame that on the fragile architecture: the old environment underpinning with Apple's new glossy GUI sitting on top of it, which they try to keep synchronized as best they can... but sometimes it just doesn't work right, and I think it also prevents them from adding modern features such as I just mentioned.

I would much rather see Apple completely reengineer the underpinnings.  Do NOT try to make it user-accessible with a modular environment - that is complete over-engineering, overkill for most people, and it just makes the system more fragile and difficult to roll forward.  But it could be modernized from the ground up to include everything needed for infinite tracks, infinite channels, everything working as you would expect (no weird glitchy things), complete AUv3 functionality, everything in place to handle MIDI 2.0 data, etc., etc.

Now, could they also include something ON TOP OF all that in order to give people modular tools for building fader panels for their synths or whatever?  Absolutely!  And it could be done way better than the current environment too, which is honestly just so 1990.  There are numerous other modular environments out there using more modern tech that make the environment look old and crusty and difficult to deal with.  Plogue Bidule can probably do everything you ever wanted to do with the LogicPro environment, and it can run as a plugin inside LogicPro.  Kushview Element is coming along.  Loomer Architect is amazing.  There are others.  For the 1% of users that need that kind of functionality, fine... get a plugin... or maybe Apple will consider building it in, but ON TOP OF a much better-architected underpinning for the basic track-channel architecture... IMHO.

 

