
Purchasing an iMac Pro vs an iMac Dilemma



Also, Ploki,

 

With regards to SSD and Ram...

"The new iMac Pro has roughly 3000MB/s throughput! That's like 6 (six) SATA 3 SSDs (they cap at 500MB/s). Most M.2 and NVMe SSDs cap at 1000MB/s...."
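The comparison in that quote is simple arithmetic; here is a back-of-envelope sketch using the rough throughput caps mentioned in the post (they are the poster's approximations, not benchmark figures):

```python
# Rough drive-speed figures quoted in the post above (approximate caps, not benchmarks)
IMAC_PRO_SSD_MBPS = 3000   # internal SSD throughput of the iMac Pro, per the quote
SATA3_SSD_MBPS = 500       # practical cap of a single SATA 3 SSD

# How many SATA 3 SSDs, striped together, would match the internal drive
equivalent_sata_drives = IMAC_PRO_SSD_MBPS / SATA3_SSD_MBPS
print(f"One internal SSD is roughly {equivalent_sata_drives:.0f} SATA 3 SSDs worth of throughput")
```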

 

I was always told to keep my sample library data, as well as audio, on different drives so as not to tax the main drive that runs my applications. With SSDs now here, and given the specs you quoted, does this theory no longer hold true? Can I now keep sample library data on the application drive without it bogging down Logic? If so, then it seems that getting a larger internal SSD is almost more important than having a lot of DDR4. I ask because I use a lot of Kontakt libraries.

 

Thank You.


well... 8-core is better than quad-core. :)

On my quad-core MBP I usually don't run into single-core overloads, but I do max out most of my cores. (Experience here differs, and I'm sure someone will suggest single-core performance is more important than more cores; I personally would opt for more cores rather than faster single-core performance.)

For $1,300 more, you also get faster storage, faster RAM, a faster GPU, and more cores. In my opinion a significantly better computer, but it really depends on your usage. If you run into single-core overloads but otherwise don't task your CPU that much, the iMac will be better.

If you run out of CPU when most cores are maxed, the iMac Pro will be better.

 

If you go for the iMac, I'd strongly encourage the i7.

 

I run crucial sample libraries (around 250GB worth, mostly Kontakt) from my 500MB/s SSD (I have a 2012 Retina) and it works splendidly; in fact, everything is much snappier than when I run it from external drives.

 

I don't feel my drive is taxed by running samples, or that Logic is bogged down, and it's a 5-year-old computer with a much slower SSD.

However, if you have external SSDs connected via Thunderbolt, they will still be blazing fast. (Not as fast as the internal SSD, though; the new Macs' SSD speeds are stupid insane.)

 

A good blog on the topic:

https://diglloyd.com/blog/2017/20171214_0823-iMacPro2017-released.html

smart dude, but he got a little salty a few years back.


Hello all,

 

I was always told not to place virtual instrument data on the same drive as my applications, so as not to bog down or slow Logic.

 

With the latest iMac offering up to 2TB of SSD and 64GB of 2400MHz DDR4, and the iMac Pro starting with a 3.2GHz 8-core Intel Xeon W processor and offering up to 4TB of SSD and 128GB of 2666MHz DDR4 ECC memory, does one still need to stream virtual instrument data from USB 3-connected SSDs, or can one now load 2TB to 4TB of virtual instrument data on the same startup drive that runs the Logic application without running into congestion problems?

 

Thank you for any thoughts you might have.


Why not call Apple and ask to speak to someone knowledgeable about Logic and its demands on the new hardware? I know those people exist, and they might give you better insight into your specific needs.

 

I gave up on Apple support when it comes to asking for this type of advice. The people on the phones simply have no clue about what we, as composers, are actually doing and what our requirements are. I had one guy tell me a MacBook Air could run all of my VIs while scoring to picture... clueless.

 

I think waiting for the Mac Pro is a good option.


I gave up on Apple support when it comes to asking for this type of advice. The people on the phones simply have no clue about what we, as composers, are actually doing and what our requirements are. I had one guy tell me a MacBook Air could run all of my VIs while scoring to picture... clueless.

 

I had to insist on talking to an Apple pro specialist. I agree, you can't rely on the first-level techs or sales personnel, but once you ask for a specialist they can provide good information. It can't hurt to try.


I disagree.

 

Logic can be used for a myriad of things, and even CPU usage differs greatly between them.

 

For example, Reverb can refer to:

- a low-CPU algorithm like SilverVerb

- medium-CPU convolution like Space Designer

- a high-CPU algorithm like Lexicon or 2CAudio B2

 

Mixing can refer to:

- a band recording, 20-30 tracks max

- a full-blown pop production of 100 tracks with mixed VSTs and recordings and Melodyne and whatnot

 

Composing can refer to:

- composing with s#!+ sounds, without mockups

- using high-quality sounds for mockups

- using high-quality MODELLED sounds (Sample Modeling, WIVI, Pianoteq) that are heavy on CPU but very easy on RAM/drive usage

 

Unless you give them a project and a full plugin and library list, they literally CAN'T give you any meaningful advice.

 

My experience with level-2 experts on the support line and Logic Pro is meh. These forums are usually, if not always, better.

Last time I had an issue, I called literally 12 hours before the new update rolled out (the one that fixed my problem!), but the expert guy didn't even know a new update was rolling out; he told me I could try doing a full-blown reinstall of my whole system to see if that fixed my issue.

 

Hello all,

 

I was always told not to place virtual instrument data on the same drive as my applications, so as not to bog down or slow Logic.

With the latest iMac offering up to 2TB of SSD and 64GB of 2400MHz DDR4, and the iMac Pro starting with a 3.2GHz 8-core Intel Xeon W processor and offering up to 4TB of SSD and 128GB of 2666MHz DDR4 ECC memory, does one still need to stream virtual instrument data from USB 3-connected SSDs, or can one now load 2TB to 4TB of virtual instrument data on the same startup drive that runs the Logic application without running into congestion problems?

 

Thank you for any thoughts you might have.

 

I was told that too, and it was true in the age of spinning drives, because throughput was low and latency high (the drive's read head had to jump all over the place). I don't think this is relevant anymore if you use SSDs.

Unfortunately, without testing in a real-life scenario, it's hard to tell which would work better.

 

For my type of work, running from an external USB 3.0 drive (even an SSD) is much slower than running from the internal drive, so I got a bigger internal drive and run the things I use often from it. (I only use external libraries because of space constraints.)

 

The thing is, getting the internal drive's performance via Thunderbolt 3 is going to be extremely expensive, and the only viable option is PCI Express SSDs (most external enclosures are SATA 3, not even M.2 or NVMe). With a slower external drive you will likely have to increase Kontakt's preload buffer size. (Kontakt's preload buffer is the amount of data per sample it keeps in RAM for quick access: with a faster drive you can use smaller buffers; with a slower drive you need larger ones.)
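The RAM cost of raising the preload buffer is easy to estimate: it is one buffer per loaded sample. A minimal sketch, where the sample count and per-sample buffer sizes are illustrative assumptions and not Kontakt defaults:

```python
def preload_ram_gb(num_samples, buffer_kb_per_sample):
    """RAM consumed by preload buffers: one buffer is held per loaded sample."""
    return num_samples * buffer_kb_per_sample / 1024 / 1024  # KB -> GB

# Hypothetical library of 100,000 loaded samples; buffer sizes are illustrative
fast_drive = preload_ram_gb(100_000, 60)    # small buffer, fast internal SSD
slow_drive = preload_ram_gb(100_000, 240)   # large buffer, slow external drive
print(f"fast drive: {fast_drive:.1f} GB, slow drive: {slow_drive:.1f} GB")
```

The point of the sketch is the scaling: quadrupling the buffer quadruples the RAM eaten by the same library, which is exactly the trade-off described above.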

 

So running from an external drive might force you to raise the preload buffer to get acceptable performance, consequently using up more of your RAM, which might also result in less system responsiveness.

 

My bet is that using the internal drive for everything will result in a more responsive workflow, because the drive can handle it... but I might be wrong.

What I do know is that most practices from the spinning-drive era (partitioning, defragmenting, separate drives, etc.) do not apply to SSDs at all.


Does anybody have any thoughts on whether it is a good or bad idea to place sample data on a RAID array?

I would think this would be a much faster streaming option, but for some reason I suspect it might work well for video but not audio.


Something else to think about – which might help get you down off the horns of this "dilemma" – is that you can, in fact, actually find a way to "work within the constraints of" any hardware that you have – even if that's "24-track (or less!) ... magnetic tape."

 

No matter what piece of hardware you might have, you can always find a new and creative way to "max it out," if you stubbornly demand that it must produce, "in real time," more steam than its boiler can actually produce.

 

But you can still find a perfectly-acceptable way to get the job done. Logic readily provides tools such as Bounce in Place which allow you to do some of the digital processing off-line, ahead of time, producing an audio track that can then be used in the downstream mixing stage.

 

If you're patient and if you plan ahead, you will find that you can produce perfectly-serviceable music from ... (koff, koff ... "these kids today ...") ... "yesterday's™" machines. :D

 

(And, if I may add, "maybe this sort of thing is actually a better process," at least in some ways, "because it is more deliberate." For the very reason that it obliges you to approach your finished-product in stages, I think that it actually opens up your awareness of many important creative possibilities. Possibilities that "just push the button and wait for what happens next" ... if you have the raw horsepower to do it ... might actually conceal from you. Sometimes it actually pays to take "just a little more time" to get to wherever you might be going ...)


I have them on a RAID 0 of spinning drives (I back everything up weekly to a second drive),

 

and it works faster than a single drive or RAID 1.
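The reason RAID 0 helps here can be sketched with a simple, idealized throughput model (the drive speed is an illustrative figure, and real controllers add overhead; some RAID 1 controllers can also interleave reads, which this sketch ignores):

```python
def raid0_read_mbps(single_drive_mbps, num_drives):
    """RAID 0 stripes data across drives, so sequential reads scale with
    drive count (ideal case; ignores controller overhead)."""
    return single_drive_mbps * num_drives

def raid1_read_mbps(single_drive_mbps, num_drives):
    """A simple RAID 1 mirror stores identical copies: redundancy, not speed.
    Reads come from one drive, regardless of how many mirrors exist."""
    return single_drive_mbps

SPINNER_MBPS = 150  # illustrative sequential speed of a 7200rpm drive
print(raid0_read_mbps(SPINNER_MBPS, 2))  # two striped drives
print(raid1_read_mbps(SPINNER_MBPS, 2))  # two mirrored drives
```

So a two-drive RAID 0 of spinners roughly doubles streaming throughput over a single drive, at the cost of losing everything if either drive fails – hence the weekly backup.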

 

 

No matter what piece of hardware you might have, you can always find a new and creative way to "max it out," if you stubbornly demand that it must produce, "in real time," more steam than its boiler can actually produce.

 

But you can still find a perfectly-acceptable way to get the job done. Logic readily-provides tools such as Bounce in Place which allow you to do some of the digital processing off-line, ahead of time, producing an audio-track that can then be used in down-stream mixing stage.

 

This is so good – I recently hit a wall with my projects on a quad-core.

Then I figured I had so much sh!t crammed into my projects that it was a wonder they even still opened.

Dozens of flexed tracks, old data...

I ended up streamlining my workflow: each song consists of 4 projects for different part groups, which I export into the main "mix" project, which breathes and takes roughly ~350% of my CPU (track count: 100).

