
Why can't RAM be expanded externally?


Recommended Posts

RAM (random access memory) sits as physically close to the processor as possible so it can move data to and from the processor at the highest possible speed.

All other connections to the processor (buses like Thunderbolt, USB, and other I/O buses) run at a fraction of the speed of the memory bus.

From Intel's page (just for reference – it's the one I found quickly):

Quote

For example:
For DDR4 2933 the memory supported in some core-x -series is (1466.67 X 2) X 8 (# of bytes of width) X 4 (# of channels) = 93,866.88 MB/s bandwidth, or 94 GB/s.

For reference:
Thunderbolt 4 has 40 Gbit/s (note that's gigabits, while the Intel example is in gigabytes – a factor of 8 in the units alone). 40 Gbit/s works out to 5 GB/s, so the memory bus is nearly 20× faster.
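Just to sanity-check those numbers, here is the arithmetic spelled out (a quick sketch, simply redoing the figures quoted above):

```python
# Back-of-the-envelope: quad-channel DDR4-2933 vs. Thunderbolt 4
transfers_per_sec = 1466.67e6 * 2        # DDR = two transfers per clock
bytes_per_transfer = 8                   # 64-bit wide channel = 8 bytes
channels = 4
memory_gb_s = transfers_per_sec * bytes_per_transfer * channels / 1e9

thunderbolt_gb_s = 40 / 8                # 40 Gbit/s -> 5 GB/s

print(f"DDR4-2933, 4 channels: {memory_gb_s:.1f} GB/s")       # ~93.9 GB/s
print(f"Thunderbolt 4:         {thunderbolt_gb_s:.1f} GB/s")  # 5.0 GB/s
print(f"Memory is roughly {memory_gb_s / thunderbolt_gb_s:.0f}x faster")
```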

The connections available for additional memory are also limited by the motherboard and processor design, so there is always an upper limit.

On Apple Silicon systems: theoretically someone could desolder the existing memory and solder larger memory chips onto the processor package – I think someone has already done it (check MacRumors!) – but this is nothing for normal people. It's more for real hardware freaks to show off that they are in fact capable of doing it.

That's how I understand the technology.

Edited by wonshu
  • Like 2

As others have alluded to, Apple Silicon is based on a consolidated chip design where the RAM is integrated into the same package as the CPU, which avoids a lot of the data movement that is typical of older systems. This is a huge part of Apple Silicon's performance advantage. If you were able (and you're not) to add more memory outside that package, it would harm performance substantially.

With Apple Silicon you need to decide ahead of time how much memory you need and buy it in that configuration, at whatever price Apple dictates. That is just how it is with this architecture. There are huge performance advantages to it, particularly for graphics, since the GPU is also integrated into the same package.

Someday, if Apple were to ship an Apple Silicon chip with a pro-level audio interface built into the package as well, it would hypothetically provide something close to zero latency... but that is extremely unlikely. For this reason it's somewhat debatable to me whether Apple Silicon provides as much of a performance advantage for music production as it does for video and graphics producers, gamers, etc. But that is a much larger market. We all have to move to Apple Silicon sooner or later, but in my view there is no rush for music production. If you do, buy the model with the amount of RAM you will actually need.

Another thing to consider is that you may not actually need as much RAM as in the past, because virtual memory paging has become far more efficient on the Apple Silicon architecture thanks to extremely fast SSD speeds. Basically, you can get away with less memory than on older designs (knock on wood). For audio production, if you need to load a lot of samples, you may be able to lean on SSD streaming and virtual memory and not need as much RAM as was typically required before.

 

  • Like 5

  • 2 weeks later...

Although "SSD-backed virtual memory paging" is much faster than before, because there are no "physical latencies" – read/write head movement nor disk rotation – associated with this technology, it is still "paging."  An I/O operation is still taking place: possibly two ... one "out" to make room, then one "in" to get the data you need.

So, when I buy a computer, I reflexively buy it with a large if not the largest amount of RAM that is available for that model.  This is actually of far more interest to me than the CPU type or "number of cores."  Because, it does not matter that you are driving a Porsche if you are stuck on a two-lane road behind a Yugo. Logic is a real-time application which cannot tolerate any delay however slight ... equals "system overload."

The "system" isn't being "overloaded" because the CPU cores are being maxed-out: it's being "overloaded" because the data didn't arrive in time.

Edited by MikeRobinson

38 minutes ago, MikeRobinson said:

Although "SSD-backed virtual memory paging" is much faster than before, because there are no "physical latencies" – read/write head movement nor disk rotation – associated with this technology, it is still "paging."  An I/O operation is still taking place: possibly two ... one "out" to make room, then one "in" to get the data you need.

So, when I buy a computer, I reflexively buy it with a large if not the largest amount of RAM that is available for that model.  This is actually of far more interest to me than the CPU type or "number of cores."  Because, it does not matter that you are driving a Porsche if you are stuck on a two-lane road behind a Yugo. Logic is a real-time application which cannot tolerate any delay however slight ... equals "system overload."

The "system" isn't being "overloaded" because the CPU cores are being maxed-out: it's being "overloaded" because the data didn't arrive in time.

 

Things have changed since the 1990s. 

Given the "freeze" function, it's much easier to work around the problems you describe in logic than in, say, MainStage, which is actually a real-time system. 

Logic DOES depend upon audio being delivered in time for playback, that is true. But in contrast to hard drives, bandwidth on a typical SSD is high enough for several THOUSAND channels of audio at 192kHz and 32 bits. 

A single track of 192kHz/32-bit audio has a bitrate of just over 6 megabits/sec. A typical SSD is at 2 or 3 gigabytes/sec. 
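For the curious, here is that arithmetic spelled out (a quick sketch; the ~2.5 GB/s figure is just an assumed typical internal-SSD speed):

```python
# How many 192 kHz / 32-bit mono audio streams fit in typical SSD bandwidth?
sample_rate = 192_000                            # samples per second
bit_depth = 32                                   # bits per sample
track_bits_per_sec = sample_rate * bit_depth     # 6,144,000 bit/s ~= 6.1 Mbit/s
track_bytes_per_sec = track_bits_per_sec / 8     # ~= 0.77 MB/s per track

ssd_bytes_per_sec = 2.5e9                        # assumption: ~2.5 GB/s internal SSD

print(f"One track: {track_bits_per_sec / 1e6:.1f} Mbit/s")
print(f"Tracks sustainable: ~{ssd_bytes_per_sec / track_bytes_per_sec:.0f}")  # ~3200
```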

When Logic craps out, it is invariably because of CPU overload. 

Your Logic is not crapping out because of paging. It's because those Abbey Road Waves plugins are clogging CPU lanes. 

  • Like 1

None of these software programs are actually real time; they all operate on buffers. DAWs and everything else – they are not real time.
 

The computer system, with its CPU and other components, operates on one buffer at a time, not in a real-time fashion. It goes as fast as it can until the buffer is full; when the audio card is ready for the next buffer-load it takes it, and if the computer couldn't process everything for that buffer in time, a partially filled buffer goes out to audio and you hear a click or pop.
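A minimal sketch of that pull model (all names here are made up for illustration – this is not Core Audio's actual API):

```python
import time

SAMPLE_RATE = 48_000
BUFFER_SIZE = 256                              # samples per callback
DEADLINE = BUFFER_SIZE / SAMPLE_RATE           # ~5.3 ms to produce each buffer

def render_buffer(n_samples):
    """Stand-in for everything the DAW does per cycle: plugins, mixing, streaming."""
    return [0.0] * n_samples                   # pretend we rendered audio

def audio_callback():
    """Called whenever the audio device wants the next buffer."""
    start = time.perf_counter()
    buffer = render_buffer(BUFFER_SIZE)
    if time.perf_counter() - start > DEADLINE:
        print("Buffer wasn't ready in time -> click/pop (overload)")
    return buffer
```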

During the course of processing that buffer in a non-real-time manner, the computer may move a lot of data around, including between the CPU and memory, across various buses such as USB, and to or from storage. The software can also prefetch data into other caches and buffers and do all manner of complex things. Yes, it's a lot faster to move data between RAM and the CPU than to read it from an SSD, even a very fast modern one, but paging or streaming from the SSD may still be perfectly fast enough to keep the buffer filled in time, as newer SSDs and the related buses are much faster than in years past.

In addition to much faster paging, you can change all your sample instruments to use much smaller prefetch sizes so they stream a lot more rather than requiring a ton of RAM. Even on much older computers with much slower SSDs we have been streaming from SSD – and even from HDD – in Kontakt. Computers have been able to fill the buffer in time for many years already, and now the SSDs are far faster still.

Yes, if money is no object then get all the RAM you can, I agree. But there is a good possibility of ending up with far more memory than you actually need, and with Apple Silicon that becomes a costly decision.

Activity Monitor has a way to monitor your memory use; you can see how much paging is taking place on a graph, and it will even indicate with different colors whether there is cause for concern about memory paging. Your computer already pages memory a lot more than you probably think it does. This is perfectly normal. When it becomes excessive it can slow things down, and eventually you may reach a point where your DAW can no longer keep up with the buffer. But the current generation of SSDs is so fast that this limit is much harder to hit than it was in past years.
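If you want the raw numbers behind that graph, the Terminal can show them too (a quick sketch wrapping macOS's built-in vm_stat; the filtering is just for illustration):

```python
import subprocess

# vm_stat prints paging counters such as "Pageins:" and "Pageouts:";
# the page size is shown in its header line.
out = subprocess.run(["vm_stat"], capture_output=True, text=True, check=True).stdout
for line in out.splitlines():
    if line.startswith(("Pageins", "Pageouts", "Swapins", "Swapouts")):
        print(line)
```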

Many people running M1 machines seem to find they run better with less memory than previous generations, and that this is due to more efficient virtual memory paging.

We still need to hear a lot more actual reports and experiences from audio production users about what they can do with different amounts of RAM before we can arrive at firm conclusions and practical advice, but these machines are certainly capable of more efficient memory paging and SSD streaming than before, so theoretically the same tasks can be accomplished with less RAM. 128 GB is likely to be overkill for the vast majority of people.

  • Like 2

Not to quibble on terminology 🙂 but I'm not sure I agree with the view that a system isn't a real-time system if it uses buffers. Buffers are generally used for performance/optimisation reasons, as we know, but if I'm playing a live digital instrument – standalone synth, software synth, whatever – there are buffers involved, yet it's still running in real time. What that means is that it has to respond right there and then, rather than requiring more processing than can be done in the time available.

Some Intel-defined characteristics of real-time systems include:

Quote

What Is a Real-Time System?

The term “real-time system” refers to any information processing system with hardware and software components that perform real-time application functions and can respond to events within predictable and specific time constraints. Common examples of real-time systems include air traffic control systems, process control systems, and autonomous driving systems.

 

Quote

- A real-time system is characterized by its ability to produce the expected result within a defined deadline (timeliness) and to coordinate independent clocks and operate together in unison (time synchronization).

- A hard real-time system has absolute deadlines, and if those allotted time spans are missed, a system failure will occur. In soft real-time systems, the system continues to function even if missing a deadline, but with undesirable lower quality of output.

- A real-time system’s capability is “measured” on the basis of two requirements: latency and compute jitter.

By those measures, and what I remember of my computer science degree, Logic would qualify as a real-time system in my view (in addition to its other, more offline tasks, of course). So would something like an air-traffic or airplane control system, and you can be sure there are buffers involved there.
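To put numbers on that "defined deadline" as it applies to a DAW, here's a quick sketch at 48 kHz (other sample rates scale proportionally):

```python
# Soft real-time deadline per audio buffer: buffer_size / sample_rate
SAMPLE_RATE = 48_000
for buffer_size in (32, 64, 128, 256, 512, 1024):
    deadline_ms = buffer_size / SAMPLE_RATE * 1000
    print(f"{buffer_size:>5} samples -> {deadline_ms:5.2f} ms to render each buffer")
```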

That aside, I basically agree with everything else. 32GB on a Mac is probably enough for typical people; only those who are demanding, explicitly pushing the limits, or really using heavyweight tools – big 3D modelling/rendering, game design, very large movie-scoring sessions – will find they need more wiggle room in terms of resources.

I've done, and continue to do, an incredible amount with 16GB of RAM for many years, across a fairly wide range of work in development, photo/video/audio and other areas. I'm not sure, if I had 64GB of RAM, how much of it I'd actually use, so it's hard (for me) to justify the expense of having that resource in reserve. If money were no object, then sure... but it seldom is...

Edited by des99

OK, you don't agree. Then go read up about real-time computer systems and you will realize that your Mac is not a real-time computer system. Period. The audio card itself is the only real-time component. The overall system is not even remotely close to real time. NOT EVEN REMOTELY CLOSE. Yes, that was all caps for emphasis; I'm not shouting.

There are highly specialized real time computer systems out there, but personal computers are not one of them.

Nothing happens computationally in real time in your Mac.  

DAWs provide an ILLUSION of real time. And that illusion is broken quickly when you hear an audio dropout from the genuinely real-time audio card.

Edited by Dewdman42

The reason I make this distinction is that modern musicians often mistakenly conflate computer systems with analog electronic music gear, where the electrical signal flows through components and wires continuously, in real time, at essentially the speed of light.

And our DAW software and really any digital processing works nothing like that at all.  Everything happening behind the curtain is NOT in real time.

Edited by Dewdman42

I am fully aware of buffering. I've been doing this digital sound thingamajiggy bit for a few decades. 😉 

I was arguing from the point of user interaction, not the technical architecture. 

The reason I used the term "real-time" is that in MainStage, I press a key and need to hear the sound NOW. Obviously there's a processing latency and an audio buffer, but that's the basic point of the program, from a user perspective. 

On a DAW, it is completely irrelevant if it spends a second or twelve pre-fetching stuff and caching FX processing or whatever, as long as everything is aligned properly once playback/recording begins. Recording latency can be adjusted (automatically or by hand) after the fact. 

In either case, the speed of the SSD is pretty much irrelevant to the stability of the program. We're way past the age of bandwidth concerns for audio production, at least with internal SSDs. 

Also in both cases, if the machine craps out due to overload, it's going to be due to CPU overload on one or more cores. 

Edited by analogika

46 minutes ago, Dewdman42 said:

then go read up about

I know how DAWs and computers work, thanks. I have plenty of software engineering experience, and have indeed studied real-time systems as part of those qualifications. I was illustrating what defines a real-time system and why I consider that Logic generally falls within that definition – when playing instruments in real time, or during playback, where it needs to process events within a predictable time or else fail with a CPU overload because the required processing can't be performed in the time needed for smooth, uninterrupted playback. Note the definition above does not say events have to be responded to "instantly", just within specific, predictable time constraints.

I didn't say a Mac is a real-time system generally; I would say that under the scope of the definition I used as an example, I would consider Logic a real-time system – but your definitions may be different. That's OK. You don't need to resort to shouting emphasising every time someone disagrees with you; it's fine to say, reasonably, "I don't see it that way and I disagree with your viewpoint." It's all part of presenting and discussing our viewpoints, which makes the forum a richer place.

I don't think I've ever seen anyone confuse a computer with analog gear in terms of processing speed, unless they simply don't understand how any of that stuff works, in which case you can't really blame them... I don't know how a carburettor works – I just barely know how to spell it 🙂 – so I could make many assumptions about its functionality that would be incorrect.

Anyway, I've no desire to argue, I just don't agree on that point, and you're free to think I'm wrong, it's all fine by me...

Edited by des99

Everything happening inside the DAW or MainStage happens at a rate that is faster than real time, and also in a start/stop fashion... it's not happening at a continuous, consistent speed the way real time does. Many of the components, including the CPU, can be sitting around doing nothing some of the time, even while sound is still coming out of the sound card in a real-time fashion.

That's the only way it keeps up with the sound card. If the SSD and the other components interacting with each other in this faster-than-real-time fashion can all keep up with that rate, then you get the low-latency real-time illusion from the sound card.

Modern SSDs are way faster than in years prior... that's all I am saying. And people are saying that virtual memory swapping is much more efficient on the new M1/M2 computers – to the point that they are not noticing any problems, and systems with less memory are performing as well as systems with more.

Plus... don't forget that the reason audio production needs anything more than about 16GB usually has more to do with large numbers of sample-instrument tracks, and I suspect that DFD streaming is much more efficient now too.

Anyway, over the next year or two we will get past the theoretical and hear from many users about what they are doing with 16GB or 32GB or whatever, and we will form practical wisdom about how much memory people actually need to produce music on an Apple Silicon Mac. I suspect it will be less than in years past.

 

Edited by Dewdman42
  • Like 1

Let's keep in mind that the post that re-booted this thread was this one: 

4 hours ago, MikeRobinson said:

Although "SSD-backed virtual memory paging" is much faster than before, because there are no "physical latencies" – read/write head movement nor disk rotation – associated with this technology, it is still "paging."  An I/O operation is still taking place: possibly two ... one "out" to make room, then one "in" to get the data you need.

[incorrectly explained advice snipped]

The "system" isn't being "overloaded" because the CPU cores are being maxed-out: it's being "overloaded" because the data didn't arrive in time.

My point was that this is pretty much completely incorrect on just about any Mac Apple has shipped within the last six to eight years. (Not the technical description of paging itself, of course. That happens.)

Slow disk access simply is NOT a factor in DAW system overloads anymore (unless you're doing something stupid like running a mechanical hard disk off a USB hub with an external USB mouse attached or so). 

Edited by analogika
  • Like 2

16 minutes ago, analogika said:

On a DAW, it is completely irrelevant if it spends a second or twelve pre-fetching stuff and caching FX processing or whatever, as long as everything is aligned properly once playback/recording begins. Recording latency can be adjusted (automatically or by hand) after the fact. 

In either case, the speed of the SSD is pretty much irrelevant to the stability of the program. We're way past the age of bandwidth concerns for audio production, at least with internal SSDs. 

Also in both cases, if the machine craps out due to overload, it's going to be due to CPU overload on one or more cores. 

More or less yes, but the SSD is not completely irrelevant. You will notice that your sound can have audio dropouts many times while your CPU meters are not showing anything remotely close to 100%. It would be the CPUs crapping out if they actually hit a full 100% – they almost never do. So there are other bottlenecks that slow down the behind-the-curtain processing that needs to take place to keep the buffer filled. Yes, CPU speed can be a bottleneck sometimes, but sometimes it is other stuff, including the SSD. The M1's SSD is way faster now, though.

 


2 minutes ago, analogika said:

Slow disk access simply is NOT a factor in DAW system overloads anymore (unless you're doing something stupid like running a mechanical hard disk off a USB hub with an external USB mouse attached or so). 

Or running projects off an iMac system Fusion drive... which is... not great. 😉

  • Like 1

Also, when there are many tracks to process, having more cores can create more parallelism, which theoretically means fewer threads waiting around... but all those cores still have to funnel data through other components behind the curtain too, so it's not that the SSD is irrelevant. It's still very relevant. It's just quite a bit more efficient now compared to years past, due not only to the SSD speed but to a faster bus... and faster memory bus speeds as well.

Edited by Dewdman42

1 minute ago, Dewdman42 said:

More or less yes, but the SSD is not completely irrelevant. You will notice that your sound can have audio dropouts many times while your CPU meters are not showing anything remotely close to 100%. It would be the CPUs crapping out if they actually hit a full 100% – they almost never do. So there are other bottlenecks that slow down the behind-the-curtain processing that needs to take place to keep the buffer filled. Yes, CPU speed can be a bottleneck sometimes, but sometimes it is other stuff, including the SSD. The M1's SSD is way faster now, though.

The 2016 13" MacBooks Pro have SSD speeds of 1.3 GBytes/sec at MINIMUM. That's way fast enough for over a thousand channels of 192 kHz/32-bit audio. The 15" are around twice that. 

SSD throughput and latency do make a difference to perceived performance, of course, but at least the internal drives really haven't been anything to be concerned about in our business for many years now.


OK, but again, you're reverting to a real-time mental model... nothing happens in real time, as I said before. The data doesn't flow through at that kind of sustained rate.

Things happen in spurts, way faster than real time, with time-sharing between different cores, etc. And virtual memory swapping is a different animal from streaming too... for swapping it's much more critical that the SSD be fast enough not to become noticeable. But the good news is that the M1's is actually quite a bit faster.


7 minutes ago, Dewdman42 said:

Also, when there are many tracks to process, having more cores can create more parallelism, which theoretically means fewer threads waiting around... but all those cores still have to funnel data through other components behind the curtain too, so it's not that the SSD is irrelevant. It's still very relevant. It's just quite a bit more efficient now compared to years past, due not only to the SSD speed but to a faster bus... and faster memory bus speeds as well.

You're telling me that the SSD is "still very relevant" because the system runs a bunch of stuff through other components…that aren't the SSD. 

What I'm saying is this:

We're in audio. Whatever is going to be causing issues in our workflow — it's not going to be the SSD.*

*unless you run out of space


6 minutes ago, Dewdman42 said:

OK, but again, you're reverting to a real-time mental model... nothing happens in real time, as I said before. The data doesn't flow through at that kind of sustained rate.

Things happen in spurts, way faster than real time, with time-sharing between different cores, etc. And virtual memory swapping is a different animal from streaming too... for swapping it's much more critical that the SSD be fast enough not to become noticeable. But the good news is that the M1's is actually quite a bit faster.

We're not talking about "noticeable". Obviously it's going to be "noticeable". 

 

MikeRobinson explicitly stated that "overloads" in Logic usage would happen due to the relative slowness of SSDs. 

I call bollocks on that claim. 

Everything else you write is a given. 

Edited by analogika

Mike Robinson was not completely wrong about virtual memory paging. It is a valid concern. He was just overlooking the fact that the M1 has a much faster SSD and faster bus speeds for both the SSD and memory, such that virtual memory swapping is much more efficient than pre-M1.

Anyone can open up Activity Monitor and observe information about swapping. Here's mine – well, I have a lot of RAM and not much going on right now, so no swapping for me – and note the green "Memory Pressure" graph. That is Apple telling me that I have more than enough RAM for my current tasks.

[Screenshot: Activity Monitor's Memory tab, showing a green Memory Pressure graph]

But anyway, if I load more and more programs, all running at the same time, eventually I will start to see some virtual memory paging. That is totally normal – OSes have done it for years – but if they do it too much the computer may start to slow down noticeably. In the case of a DAW, that would eventually slow it down to the point that it could not keep the buffer filled, and you'd start getting audio dropouts.

The new M1 has such improvements in SSD efficiency that it can allegedly handle more virtual memory swapping before causing a problem, compared to older Mac models. That is what I have been reading.
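If you'd rather see the same swap numbers in the Terminal, macOS exposes them via sysctl (a quick sketch, wrapped in Python purely for illustration):

```python
import subprocess

# Prints something like:
# vm.swapusage: total = 2048.00M  used = 131.25M  free = 1916.75M  (encrypted)
swap = subprocess.run(
    ["sysctl", "vm.swapusage"], capture_output=True, text=True, check=True
).stdout.strip()
print(swap)
```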

 


5 hours ago, MikeRobinson said:

Logic is a real-time application which cannot tolerate any delay however slight ... equals "system overload."

This is the incorrect statement, which I feel is related to a misunderstanding of how Logic Pro and other DAWs work internally on a very much non-real-time computer system. Not able to "tolerate ANY delay"? No, that is incorrect. There are already tons of delays happening all over the place behind the curtain; nothing is happening in real time. There are many different scenarios that can result in a functioning system, with or without delays in various components – it just depends on a lot of complicated factors.

 

5 hours ago, MikeRobinson said:

The "system" isn't being "overloaded" because the CPU cores are being maxed-out: it's being "overloaded" because the data didn't arrive in time.

This is definitely still a concern. But it's less of a concern with the M1 than it was with previous Mac models.

Edited by Dewdman42

16 minutes ago, Dewdman42 said:

This is the incorrect statement, which I feel is related to a misunderstanding of how Logic Pro and other DAWs work internally on a very much non-real-time computer system.

It's actually quite correct, you've just misinterpreted the statement slightly. In this case, no delay in his statement means "maintain uninterrupted playback" - you can't wait for a while before continuing the next bunch of samples - not that Logic can't handle PDC.

You seem to be under a misunderstanding of what a real-time system is. My impression from your posts is that you think that, for example, a sample coming in should be immediately processed and output at that exact same point in time for the system to be real-time – i.e., the event has to be handled at exactly the instant it was generated, and even one sample late means we're now "not real-time". And if a buffer of a bunch of samples is involved, then we're definitely not real-time.

This is not what a real-time system means - go back and check the definition above. In the context of Logic, the real-time constraint that Logic must serve is uninterrupted playback of the audio/song. As long as all the processing can serve that predictable time-constraint, everything is good. When Logic *can't* fulfill that constraint, for example the plugin processing hasn't finished before the next audio buffer is delivered for playback due to too many plugins to process in real-time, playback halts, and you get the CPU overload (or disk overload, in the case of disk throughput constraints).

Logic has to of course do a bunch of stuff behind the scenes to fulfil that constraint - start streaming audio files, process plugins, handle PDC and shuffle audio streams about, handle automation/parameter changes etc, but as long as all that can be done within the buffer cycle to maintain real-time uninterrupted playback, it's all good. That's what I mean by Logic being a real-time system.

Edited by des99
  • Like 1

15 minutes ago, des99 said:

It's actually quite correct, you've just misinterpreted the statement slightly. In this case, no delay in his statement means "maintain uninterrupted playback" - you can't wait for a while before continuing the next bunch of samples - not that Logic can't handle PDC.

 

Yes, you can. There are already lots of delays internally. He was talking specifically about the SSD and paging. As usual, Des, you are just making needless arguments with me. You have not contributed to better understanding for everyone here, only to burning it down.

The Mac is not a real-time system. The SSDs have breathing room to start and stop and to introduce delays in the processing chain. Too many delays would of course be a problem, but it's not true to say that it can't have "ANY" delays in the SSD.

The new Macs have even more breathing room to do this because they have faster components.

Lastly Des, I am not interested in arguing with you about what you think you know or what you think I know.  Take that kind of rhetoric elsewhere.  I also have a BSCS by the way, not that it matters.

 

Edited by Dewdman42

3 hours ago, Dewdman42 said:

Mike Robinson was not completely wrong about virtual memory paging. It is a valid concern. He was just overlooking the fact that the M1 has a much faster SSD and faster bus speeds for both the SSD and memory, such that virtual memory swapping is much more efficient than pre-M1.

Anyone can open up Activity Monitor and observe information about swapping. Here's mine – well, I have a lot of RAM and not much going on right now, so no swapping for me – and note the green "Memory Pressure" graph. That is Apple telling me that I have more than enough RAM for my current tasks.

[Screenshot: Activity Monitor's Memory tab, showing a green Memory Pressure graph]

But anyway, if I load more and more programs, all running at the same time, eventually I will start to see some virtual memory paging. That is totally normal – OSes have done it for years – but if they do it too much the computer may start to slow down noticeably. In the case of a DAW, that would eventually slow it down to the point that it could not keep the buffer filled, and you'd start getting audio dropouts.

 

The CPU overhead from having to compress and page on the fly will cause the CPU to overload LONG before SSD bandwidth is saturated by paging, or before data delivery ever becomes a concern, on any system sold in the past eight years or so.

 

So…yes, having DRAMATICALLY too little RAM will contribute to system overload, but it won't do so because the SSD is too slow and delivery delays cause dropouts. 

