
Synchronization of sound and playhead position bug


Logicno8


One strange thing started happening in my Logic 9 today. I'm working on a funky club house song for a guy and I have 53 channel strips in my Arrange window so far. What I did is I made some grooves, drum loops, melodies, a brass section... a ton of things with Kontakt 4, Kore and some of Logic's own plug-ins, and I bounced all of that to audio tracks. I know this is, kind of, a destructive way of doing things (since I lose the flexibility that comes with MIDI, and I deleted the MIDI regions), but I had to do it since I wouldn't be able to run countless instances of plugins and sequencers without my session crashing. So, I set the buffer size to 1024 (I don't record any audio, just sequencing, editing, mixing) and with some freezing my session runs pretty smoothly. BUT... for the first time I see that Logic is not in sync with what I hear. I can see Logic's playhead going over snares, kicks, etc. and I hear them half a second or a whole second later, which gives me a hard time when editing. I need everything tight and neat on the grid here, so I'm trying to compare what I hear with what I see. These are really fine nuances and I expected to see and hear everything in sync in order to get it as tight as possible.

 

So, can anyone tell me why Logic would behave like that? Is it due to the number of channels (although they contain, mostly, just audio regions, no plugins)?

A synchronization bug in Logic's engine? Or...

 

Any idea ?

 

Thanks.


What I did is I made some grooves, drum loops, melodies, a brass section... a ton of things with Kontakt 4, Kore and some of Logic's own plug-ins, and I bounced all of that to audio tracks. I know this is, kind of, a destructive way of doing things (since I lose the flexibility that comes with MIDI, and I deleted the MIDI regions), but I had to do it since I wouldn't be able to run countless instances of plugins and sequencers without my session crashing.

 

There's nothing destructive about printing your MIDI tracks, but there's really no reason to delete them either. You can always keep your virtual instrument tracks, even if you just put them all in a folder and mute the folder so that they don't play.

 

Moving on...

 

So, I set the buffer size to 1024 (I don't record any audio, just sequencing, editing, mixing) and with some freezing my session runs pretty smoothly. BUT... for the first time I see that Logic is not in sync with what I hear. I can see Logic's playhead going over snares, kicks, etc. and I hear them half a second or a whole second later, which gives me a hard time when editing.

 

The large buffer size might cause this if you're sequencing (i.e., MIDI tracks). Please confirm whether what you mean by "sequencing" involves new MIDI tracks.

 

Also, do you have any latency-inducing plugins on your output 1/2?


Hi ski... thanks for the quick help.

First, about printing MIDI... you are ABSOLUTELY right. I was too lazy to make an empty track and keep my MIDI regions there. So there is nothing destructive about printing MIDI to audio if the MIDI regions are kept safe. That was my mistake, I admit.

Second... "sequencing", as I use it, means creating new instances of sequencers such as Kontakt 4 or Stylus RMX, making a new MIDI arrangement and then printing it to an audio track. Yes, I do have some MIDI tracks running with sequencers at their input, and their notes are delayed as well.

Third... indeed I DO have plugins on Stereo Out 1-2: Waves L316 Multimaximizer, Logic's Exciter, Logic's Adaptive Limiter, and Logic's MultiMeter as the last link in the chain.


Thanks for the detailed answers.

 

The Maximizer and the AdLimiter are both latency-inducing plugs which, being on the stereo output, can't be delay-compensated. Therefore the audio you're hearing is going to be late with respect to the playhead position.

 

And your most recently recorded MIDI tracks are definitely not going to play back in time at a buffer setting of 1024 because of a flaw in Logic's playback engine: MIDI timing is only going to be accurate at buffer sizes of 128 or lower. A setting of 32 will give you the most MIDI timing accuracy, 64 is a little worse, and 128 is a little worse than 64. Still, all of those settings produce acceptable MIDI timing. Anything higher and pfffffffzt!
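
For anyone wondering how much raw time one buffer represents, here's a quick back-of-the-envelope sketch (a 44.1 kHz sample rate is my assumption; substitute your own):

# Rough latency of one audio I/O buffer, in milliseconds.
# Assumes a 44.1 kHz sample rate (my assumption -- substitute your own).
SAMPLE_RATE = 44100

def buffer_latency_ms(buffer_size_samples, sample_rate=SAMPLE_RATE):
    """Time covered by a single buffer of audio, in milliseconds."""
    return buffer_size_samples / sample_rate * 1000.0

for size in (32, 64, 128, 256, 512, 1024):
    print(f"{size:>5} samples -> {buffer_latency_ms(size):6.1f} ms")

# 1024 samples works out to roughly 23 ms per buffer, which is in the
# ballpark of the MIDI timing slop being described at that setting.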


Nice explanation, doctor, so will my Logic be OK? :D

No, really, THANK YOU for clarifying this for me, ski.

So, in a nutshell... the playhead goes across my waveforms, but it takes time from the moment the playhead physically passes over a region to the moment I can hear what it just passed over... BECAUSE... Logic needs extra time to process the signal of all the regions on the channel strips, summed through the Stereo Out with very demanding plugins on it, and those plugins are therefore responsible for the latency, i.e. the mismatch between the playhead position and what I hear...? Uhhh... :wink:

 

And the other thing, MIDI... I didn't know that MIDI basically needs a buffer size similar to the one we need to record audio without latency (or at least without noticeable latency). I usually record my audio at 256 or 128, but I go lower sometimes. I haven't recorded any MIDI so far; I mostly use the pencil tool and automation later. It is less natural, but I have no MIDI controller :(

Is that low buffer size of 32/64 good just for recording MIDI, or should I keep it that low even if I want to mix a project where 90 percent of the tracks are MIDI-based?

 

Thanks a lot once again.


You're welcome!

 

Real quick reply...

 

• Even if you pencil in your MIDI data, it will play back in a funky way if the buffer size is too large.

 

• The reason you're seeing the playhead pass over transients but not hearing the sound immediately is that, as already mentioned, those latency-inducing plugins make the audio sound later than the graphic representation of the audio in the arrangement. In other words, there is no latency compensation for the movement of the playhead, the way the counters update, and so on. What would be kind of cool is if Logic could detect the latency of plugins on your stereo output and delay the movement of the playhead/counters accordingly. In that scenario you wouldn't notice a lag.


Clear and understood, ski, thanks. I guess that would be a great feature (Logic being able to compensate for delay on the Stereo Out), but even working like this I never experienced any synchronization problems till now.

And I guess this can only happen if my Stereo Out is populated with demanding plugins, but it won't happen if I have a lot of plugins on individual audio and MIDI tracks.


I guess that would be a great feature (Logic being able to compensate for delay on the Stereo Out)

Logic does compensate for output and aux channels... but it accomplishes this by delaying all other audio streams by the appropriate amount. I think this is why we get the playhead lag.

 

From page 1201 of the L9 Manual:

"If latency-inducing plugins are inserted into aux or output channels (or ReWire channels, if used), Logic delays all other audio streams by an appropriate amount."


This is where things get confusing for me... if latency-inducing plugs are put on the main output, they inherently cause the sound to be delayed. Now, per the section of the manual you outlined, I'm not sure I understand what good it would do to actually delay the audio any more.

Thanks Tom! :mrgreen:

 

I still don't get it though... :shock:

 

If a plugin needs X amount of time to process audio, delaying the audio streams feeding into it only makes them later. So now we have lateness on top of lateness. And just to be totally clear, my definition of delay is "to make late", as in, "attention all passengers, all flights to Baden Baden are delayed indefinitely due to bad weather in Kiev".

 

Advancing the audio would make it earlier. So what I'm thinking is that by 'delaying', the manual really means 'advancing'?


Maybe I'm not getting what you're not getting, but say outputs 1 & 2 have 50ms of plug-in latency; then (with PDC set to All) all the other outputs (i.e. 3 & 4, 5 & 6, etc.) will have a 50ms delay (or latency) added to them, to put them in sync with outputs 1 & 2.

 

Outputs 1 & 2 will have no extra latency added to them ... so it's not lateness on top of lateness, but lateness next to lateness, if you like.

 

Audio is only ever advanced to compensate for latency occurring on Arrange Page channel strips. Latency occurring further down the chain, on auxes and outputs, is compensated for by delaying all other signals to match the most latent one.
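
If it helps to see it in black and white, here's a tiny sketch of that rule (just an illustration of the idea with made-up numbers, not Logic's actual internals):

# Output-side PDC, illustrated: every output is delayed by the difference
# between its own insert latency and the worst-case latency, so all
# outputs leave the interface in sync. Numbers are invented.
output_latency = {      # total plug-in latency per output pair, in samples
    "Out 1-2": 2205,    # e.g. a look-ahead limiter
    "Out 3-4": 0,
    "Out 5-6": 512,
}

worst = max(output_latency.values())

for name, latency in output_latency.items():
    print(f"{name}: own latency {latency}, PDC adds {worst - latency} samples")

# Note that nothing gets advanced here: the whole mix simply arrives
# 'worst' samples after the playhead, which is the lag we're discussing.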

 

... any clearer? :?


Tom,

 

Thanks for your reply. Time to take a ride. Here we go! And just to reprise the manual quoted above (for reference):

 

"If latency-inducing plugins are inserted into aux or output channels (or ReWire channels, if used), Logic delays all other audio streams by an appropriate amount."

 

The phrase "all other audio streams" is where my confusion lies; its definition is clear as mud if you go by the manual's description. Nowhere does it say that "all other audio streams" means the other outputs. It's unbelievably vague.

 

Now... I understand what you're saying, e.g., you have two outputs (say, 1/2 and 3/4). If 1/2 has a latency-inducing plug and 3/4 has none, then 3/4 will be delayed by the amount of 1/2's plugin latency. Lateness next to lateness, as you said. Fine. That's a kind of latency compensation. But it's not the whole picture IMO...

 

Here's where I was coming from with my talk about advancing tracks vs. delaying them: the very typical scenario of using a single stereo output with latency-inducing plugs. To me, meaningful latency compensation on an output would be to advance the entire arrangement so that what comes out of 1/2 is in sync with where it used to be prior to using those plugs.

 

Case in point: a cue sync'd to picture. It's perfect! That is, until I insert (say) a brick wall limiter plugin on my stereo out that induces a 2-frame delay. As I watch picture, the music is now out of sync. And if I bounce that track and lay it back up in the arrange page, even if I use the function to place that region at its originally recorded position, it's going to be 2 frames out of sync.

 

So my conclusion is that latency compensation doesn't actually exist for outputs with latency-inducing plugs on them, other than to sync them up when using more than one.


"If latency-inducing plugins are inserted into aux or output channels (or ReWire channels, if used), Logic delays all other audio streams by an appropriate amount."

 

The phrase "all other audio streams" is where my confusion lies; its definition is clear as mud if you go by the manual's description. Nowhere does it say that "all other audio streams" means the other outputs. It's unbelievably vague.

Not really, if you consider that an "audio stream" is defined as anything that's leaving Logic and getting passed on to the interface driver. Those streams HAVE to pass through an aux and/or output. It's easier to think about it that way.

 

As nosebagger describes, there are really two separate PDC systems in Logic: one that advances the tracks and instruments that have the plug-ins inserted, and one that delays auxes and outputs that don't have plug-ins inserted, so that they come out at the same time as the ones that do.


Yup, it's all clear now.

 

So then there's still one thing missing: actual latency compensation for outputs so that the net effect of latency-inducing plugs inserted in them is zero. Essential stuff, I'd think, especially when working to picture.

 

There's another thing missing: a clear-as-day, even-a-monkey-can-understand-it explanation in the (current) sorry excuse that is the Logic manual.

 

Off the soapbox now.


2 years later...
So then there's still one thing missing: actual latency compensation for outputs so that the net effect of latency-inducing plugs inserted in them is zero. Essential stuff, I'd think, especially when working to picture.

 

 

For me, if I put look-ahead or latency-inducing plug-ins on an output, I re-ping to determine the number of extra samples of latency. I then adjust my Recording Delay by that amount to compensate. Contrary to the manual, I believe the Recording Delay is not something you can just set and forget, at least for my setup, and I know some others have aired that view.

 

Logic always seems to line up audio (relative to other streams), but it doesn't always fully compensate. For instrument/audio channel strips and routings destined for a bus (where that bus DOES NOT ultimately end up being routed to a physical output), Logic always lines up the streams and compensates by moving the streams earlier in time. Here Logic is fully compensating: it lines up the streams relative to each other and sends them ahead of time, sending the longest-latency tracks first. Full compensation means the audio is lined up AND doesn't appear late when recorded in real time (i.e. not bounced).

 

However, for outputs with high-latency inserts, and bus/aux routings DESTINED FOR A PHYSICAL OUTPUT, Logic only provides partial compensation. Here audio is lined up (relative to other streams), but all lower-latency streams (and fully compensated instrument/audio channel strips) are delayed so that they line up with the highest-latency routing at the outputs. Because the lower-latency streams are delayed, they will appear late (but lined up) when recorded via analogue loopback. Hence the possible need to change the Recording Delay when you add or remove latency-inducing plug-ins to/from an output, or a bus/aux routing whose ultimate destination is a physical output.

 

  • Full compensation: streams are lined up and recorded in the correct position. Audio/instrument channel strips get full compensation, as do aux/bus routings not destined for an output. Here the audio sounds right AND is recorded in the correct position.
  • Partial compensation: streams are lined up relative to each other (so they sound right), but are recorded late. The streams' relative timings are preserved, but Logic doesn't send the high-latency streams earlier, which results in real-time recordings that are late relative to the original source audio. At least, this is the case for look-ahead type plug-ins and the I/O plug-in.
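
Here's the same distinction boiled down to a couple of lines of Python (my own mental model of the behaviour described above, with invented numbers, nothing official):

# "Full" vs "partial" compensation, as described above. routing_latency is
# the total plug-in latency downstream of the channel strip, in samples.

def recorded_offset(routing_latency, ends_at_physical_output):
    # Full compensation: Logic streams the audio early, so the net offset
    # of a real-time recording from that routing is zero.
    # Partial compensation: the streams stay mutually aligned but nothing
    # is advanced, so everything lands routing_latency samples late.
    return routing_latency if ends_at_physical_output else 0

print(recorded_offset(2500, ends_at_physical_output=False))  # 0    -> in place
print(recorded_offset(2500, ends_at_physical_output=True))   # 2500 -> late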

 

So, other than re-adjusting the Recording Delay, the other option is to record directly from buses, which are fully compensated; again, so long as the ultimate destination of that bus is not an output.

 

I have no idea whether or not any of this will fix the playhead position problem. Also, when I talk about recording I'm not referring to any kind of bouncing.

 

:D


Not really, if you consider that an "audio stream" is defined as anything that's leaving Logic and getting passed on to the interface driver. Those streams HAVE to pass through an aux and/or output. It's easier to think about it that way.

 

 

Surely audio that's output/routed to a bus which is not ultimately routed to an output is also an "audio stream"? I'm pretty sure that Core Audio doesn't make any distinction between streams based on the ultimate destination of the audio (stream).

 

As I said in my post above, if you send audio to a bus or bus/aux whose final destination is a bus, Logic fully compensates for latency by sending the audio early. This appears to happen immediately after the last audio/instrument channel inserts have been processed. Auxes/buses are potentially only delayed if the final destination is a physical output.

 

It's only when a bus/aux audio stream is finally routed to an output that Logic delays its audio to match the timing of the longest-latency post-channel-insert routing.

 

:D


Audio is only ever advanced to compensate for latency occurring on Arrange Page channel strips. Latency occurring further down the chain, on auxes and outputs, is compensated for by delaying all other signals to match the most latent one.

 

I don't think this is always true of ALL aux/bus routings. If the aux/bus is routed to another bus, and not a physical output, Logic sends the audio early.

 

You can see this by setting up an audio track with its input set to come from a bus. Create a click track and send the click via a send to an aux/bus (say Bus 10), and set the output of that aux to go to another bus (say Bus 11). Also, at the same time, set the output of the click track to go directly to an output with a latency-inducing or I/O plug-in inserted, and cable that output to an appropriate input for loopback recording.

 

Then set up two audio tracks, one with its input from Bus 11, and the other with its input from your loopback cable. Simultaneously record the click to both audio tracks. You will see that the audio track whose input is from the bus has its click in the expected position; it hasn't been delayed. The audio track that was recorded via the latency-inducing analogue loopback will show that the click is late. Adjusting the Recording Delay is the only remedy I have found for the latter routing.

 

I've used two buses here to illustrate that if you now insert an I/O plug-in on bus/aux 10, Logic will fully compensate for the latency, because the final destination of this routing (click-track send > bus/aux 10 > bus 11) is not a physical output.

 

If you want to be sure your audio is recorded without any additional delay (say, to print FX via a hardware insert) AND hear that audio at the same time, send the audio to a bus as before. Again, record from that bus onto an audio track and set up an additional aux (with its input set to the same bus, e.g. Bus 11) to route the audio to a physical output simultaneously. The recorded audio will line up with Logic's grid correctly, and you'll hear perfectly timed (but delayed) audio via the additional aux > output routing.

 

 

So, to sum up, aux/bus routings aren't delayed if the final destination is a bus. The key here is whether or not a "routing" ends up going to an output.

 

  • Aux/bus -> aux/bus -> aux/bus -> bus = audio sent early
  • Aux/bus -> aux/bus -> aux/bus -> output = audio delayed for all but the longest aux/bus routing

:D


Quick reply... the recording delay parameter is used to ensure that the POSITION of audio files recorded into Logic ends up being where it should.

 

Example:

 

With a recording delay value of zero (0)... you have a stereo audio recording of a metronome click ("A"). You route that signal out of your interface and patch it back into a pair of inputs, recording it on another track (recording "B"). When you look at the waveforms, you notice that "B" is 500 samples late with respect to the source track. So you compensate by setting the recording delay to -500. Now you repeat the experiment and you'll find that your new recording of "B" lines up with "A" perfectly, as it should. Now your system is properly calibrated.
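
In number form, that calibration is nothing more than this (using the 500-sample figure from the example):

# Loopback calibration arithmetic from the example above.
# measured_offset_samples is how late recording "B" lands relative to
# source "A" with the recording delay set to zero.

def recording_delay_value(measured_offset_samples):
    return -measured_offset_samples  # the value to enter in the preference

print(recording_delay_value(500))    # -> -500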

 

Thus...

 

The recording delay parameter is used to compensate for latencies in your system to ensure the correct placement of live recorded audio. And after calibrating it (using a loopback test as described above, or another method) it should not, under normal circumstances, need to be touched ever again. However, the test should be repeated once in a while to ensure that nothing about your system has changed.


I don't agree, with all due respect.

 

Put a latency-inducing plugin on your output, then do a loopback recording.

 

The click will be late, even taking your correctly calibrated Recording Delay (Prefs -> Audio) into account.

 

Put another latency-inducing plugin on that same output.

 

The click will be even later.

 

My point is that latency-inducing plugins on outputs directly affect your round-trip recording latency (and therefore your recording delay/offset). Please, show me another way to correct this.

 

I believe the manual is wrong in this regard.

 

:D


Let me clarify...

 

I'm talking about compensating for the placement of live audio recording, which can also mean a looped-back recording from Logic (that's "live" too).

 

Yes, the manual is patently incorrect in stating that the recording delay value should not need to be touched. In fact, anyone running Logic and recording audio should calibrate it, something I've researched and written about extensively here on the forum. But I don't believe that the recording delay parameter was ever intended to compensate for the latency induced by any one or more random plugins installed on the output(s). Recording delay is meant to correct the "fixed latency" induced by the A/D delay of your interface + audio system "driver" software.


I'm also talking about live audio recording, in my case outputs 1-2 via a 2-bus compressor into inputs 1-2.

 

I understand what you're saying about Logic's intended use, but that doesn't help when my recorded audio is late because of, say, an Adaptive Limiter on outputs 1-2.

 

Like I said, this causes audio to be recorded late, by 2500 samples the last time I checked with the settings I used. I stand by what I said: adding a latency-inducing plugin on an output effectively increases your round-trip recording latency because it increases your output latency.

 

The only fix I've found is to increase my Recording Delay (from -178 samples) to, say, -2678 samples. I've tried using a latency-fixer plugin, but that causes all other post-insert aux/bus routings to be delayed even further.
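
For anyone following along, the arithmetic behind that adjustment is just the calibrated figure plus the measured extra lateness (using the numbers quoted in this post):

# Recording Delay adjustment using the numbers from this post (in samples).
calibrated_delay = -178        # from a loopback test with a clean output
extra_output_latency = 2500    # additional lateness measured after inserting
                               # latency-inducing plug-ins on outputs 1-2

adjusted_delay = calibrated_delay - extra_output_latency
print(adjusted_delay)          # -> -2678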

 

If anyone can offer an alternative solution for lining up my recorded audio in this situation, I'll kiss whatever they tell me to :D

 

:D


The only fix I've found is to increase my Recording Delay (from -178 samples) to, say, -2678 samples.

 

Then you've found your fix! :D And this approach -- pain in the @ss though it is -- utilizes the recording delay in keeping with its function, though in your case you're kind of amending its use to include the delay caused by your latency-inducing plugins on the output.

 

In a perfect world, latency-inducing pluggies on the output would report their latency to Logic and, in turn, Logic would automatically (and temporarily) compensate for the additional delay. But alas...

 

If anyone can offer an alternative solution for lining up my recorded audio in this situation, I'll kiss whatever they tell me to :D

 

:D

 

LOL! In this case I'd say that you'll have to kiss yourself because I think you've found your solution.

 



Yeah, it is truly a "pain in the @ss". Every time I add/remove an aux/bus (I/O plug-in) and/or a latency-inducing plugin on an output, I have to re-ping my latency. Not hard, but something I'd rather not have to think about.

 

You're right, Logic should compensate for latency all the way through, from audio/instrument channel strips, to auxes/buses, to outputs. I also think each plugin should display its current latency in samples so you can add them up when trying to resolve latency issues. AUs have to report this to Logic, so why not make that info available to users?

 

Tried kissing myself. A bit salty!

 

:D


Tried kissing myself. A bit salty!

 

LOLOL!!!

 

(TMI, but LOL all the same!) :D

 

What you've come up with here is, IMO, a really good use for the recording delay outside of its more workaday function of compensating for driver latency, etc. I mean, hey, why not use it to compensate for any live audio being recorded in Logic -- even if it's the output of Logic itself running through latency-inducing pluggies?

 

I'm definitely gonna keep your approach in mind as opposed to what I'd otherwise do -- slide latency-ridden 2-mixes around in the arrange page until they line up with the original.

 

:mrgreen:


Hi RedBaron,

 

As long as Plugin Delay Compensation is set to All (and the Recording Delay is set to offset the regular system roundtrip latency), the following methods all result in a perfectly placed loopback recording or bounce on my system, so hopefully at least some will work on your system too:

 

Method 1

 

- Disable Software Monitoring

- Record the loopback

 

Method 2

 

- Leave Software Monitoring on, but enable Low Latency Mode

- In the Arrange Window, select your record track and make sure that no MIDI tracks are inadvertently record-enabled

- Record the loopback

 

Method 3

 

- Realtime bounce (good if using external FX)

 

Method 4

 

- Offline bounce (no good if you are using external FX)

 

Method 5

 

- Set the output of all tracks and auxes currently feeding the Stereo Output to a spare bus/aux

- Name the automatically created aux "2Bus" or something similar

- Move any plugins (e.g. AdLimiter, etc) from the Stereo Output to "2Bus"

- Create a send to a spare bus on "2Bus" and set it to 0dB

- Delete the aux that was automatically created by the send

- Create a new stereo audio track in the Arrange Window and set its input to the bus you are sending to from "2Bus"

- Mute the new audio track (or remove its output assignment) and put it in record

- If you like this method, then rename the bus that "2Bus" is feeding to PRINT (or similar) in the mixer under Options>IO Labels. This makes it easier to follow the routing and allows you to quickly print anything in your session by sending it to the "PRINT bus".

 

I've tested these methods with five AdLimiters on the output, each set to a look-ahead of 200ms (so that's 1 second of latency), as well as using them in real-world situations, so I hope they give you some joy too.
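
For reference, the maths on that stress test works out like this (a 44.1 kHz session is my assumption; the sample rate isn't stated in the thread):

# Total look-ahead latency in the stress test above, at an assumed 44.1 kHz.
num_plugins = 5
lookahead_ms = 200
sample_rate = 44100

total_ms = num_plugins * lookahead_ms
total_samples = total_ms * sample_rate // 1000
print(total_ms, "ms =", total_samples, "samples")   # 1000 ms = 44100 samples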

 

Tom


... Also, just to clarify the whole "what gets delayed where" business of Plugin Delay Compensation as it relates to bus/auxes and outputs:

 

In All mode, differences in latency are reckoned and compensated for whenever two audio streams are summed.

 

For example, if two auxes feed a common bus/aux, then any difference in latency between the source auxes will be compensated for by delaying the output of the least latent aux by the appropriate amount - so that both audio streams are in sync when they reach the summing bus/aux. This will happen whether or not the summing bus/aux is routed to an output, another bus/aux or even to nowhere.

 

So the key to when PDC is applied to an audio stream isn't whether or not it is eventually routed to an output, but whether or not it is being summed with a more latent signal.

 

On top of this, the stereo output channel strips (in a multi-output system) are also appropriately delayed to put the parallel audio streams that they carry in sync with each other. (And as mentioned in earlier posts, audio and software instrument channel strips are latency-compensated by negatively delaying the regions that feed them, so that each outputs its audio signal at the same time.)
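
As a sketch of that summing-point rule (again, just an illustration with made-up numbers, not Logic's internals):

# PDC at a summing point: the less latent stream is delayed to match the
# more latent one, regardless of where the summing bus goes afterwards.

def pdc_delays(stream_latencies_samples):
    worst = max(stream_latencies_samples.values())
    return {name: worst - lat for name, lat in stream_latencies_samples.items()}

# Two auxes feeding a common bus: Aux 1 carries a 1024-sample look-ahead
# plug-in, Aux 2 carries none.
print(pdc_delays({"Aux 1": 1024, "Aux 2": 0}))
# -> {'Aux 1': 0, 'Aux 2': 1024}: Aux 2 is delayed so both streams arrive
#    at the summing bus together.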

 

Hope that helps,

 

Tom


So the simple solution is to disable all plugs on Output 1-2? (in addition to setting the buffer to the suggested sizes)

 

No: disabling or bypassing plugins will not get rid of the introduced latency; you'll have to uninstantiate them (choose no plugin).

 

Thx. Worked like a charm.

 

Although, as mentioned earlier in the thread, it would be nice if they could sort out PDC on Output 1-2 so we didn't have to have so many workarounds.

