Mac RAM is Cheap
Apple has a well-deserved reputation for overcharging for simple things: $500 for a set of wheels, another $500 for a height-adjustable stand for the Studio Display, $79 for an iPad cover, $1200 to upgrade a MacBook Pro to a 4TB SSD. Some of this added cost is because Apple doesn’t sell crap, or really anything less than the top of the line. The Mac Studio’s $1200 4TB SSD is several times faster than the $200 4TB SSD you picked up in the bargain bin at MicroCenter. Apple’s $79 iPad case actually holds up an iPad, unlike the $20 knockoffs on AliExpress. The Apple Pencil lasts more than a month, unlike the $27 Vistaike I stupidly bought from Amazon.
Nowhere is Apple excoriated more for overpricing than on RAM upgrades for Macs. But when you compare Apples to, well, not Apples, it rapidly becomes apparent that Apple’s RAM isn’t just more expensive. It is a *lot* better than the third-party alternatives in PCs and Linux computers, in ways that really matter. And when you look at why that’s so, it turns out that getting RAM that performs anywhere near as well as Apple’s will cost you more than buying a Mac, and you still won’t get as much.
Whether you notice that Apple’s RAM is better depends on what you’re doing. For Photoshop, video editing, web browsing, word processing, and coding, you probably won’t notice the difference. The minimum 16GB base configuration is plenty for anyone who just wants to browse the Web and write some documents, and I notice a lot of folks happily doing this with only 8GB of RAM. For most users, the cost of the upgrade just doesn’t matter because they simply don’t need more RAM, and wouldn’t know what to do with it if they had it. Photo and video editors can often fill up 24 or even 32GB of RAM, but you have to be doing some very heavy video editing before you can use more than that.
But there is one software category that can easily use more RAM than this, and it’s the one where ARM Macs and their unified memory smoke the competition: LLMs.
LLMs need RAM way beyond anything we’ve seen before. As a very rough rule of thumb, you need a little more than 1GB of RAM per billion parameters. Except no, you don’t. RAM doesn’t cut it. If you store your model in RAM, then it runs on the CPU, which is slow for LLMs. Performance requires LLMs to run massively parallel on the GPU, and that means the model needs to fit in *VRAM*, not in RAM. An NVidia GPU with 8GB of VRAM costs about $1800. Bump that to 16GB and you’re looking at more like $3000. For 96GB of VRAM you need an NVidia RTX PRO 6000, which will set you back about $7000, if you can find one, which you can’t.
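If you want to sanity-check that rule of thumb for a specific model, the arithmetic is simple enough to sketch. The bytes-per-parameter figures and the 20% overhead factor below are illustrative assumptions, not measured values; real usage also depends on quantization, context length, and KV-cache size:

```python
# Rough (V)RAM estimate for holding an LLM's weights, plus some working room.
# Assumptions: bytes per parameter by quantization level, and a flat 20%
# overhead for activations and KV cache. Ballpark figures, not specs.

BYTES_PER_PARAM = {
    "fp16": 2.0,  # 16-bit weights
    "q8": 1.0,    # 8-bit quantization
    "q4": 0.5,    # 4-bit quantization
}

def vram_needed_gb(params_billions: float, quant: str = "q8",
                   overhead: float = 1.2) -> float:
    """Approximate GB needed to run a model with the given parameter count."""
    weight_bytes = params_billions * 1e9 * BYTES_PER_PARAM[quant]
    return weight_bytes * overhead / 1e9

for size in (8, 70, 405):
    print(f"{size}B parameters at 8-bit: ~{vram_needed_gb(size):.0f} GB")
```

At 8-bit that works out to roughly 10GB for an 8B model, 84GB for a 70B model, and nearly 500GB for a 405B model, which is why the memory ceiling matters so much.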
That’s more than I paid for the 128GB MacBook Pro I’m typing on right now. And here’s the kicker: thanks to the unified memory architecture, the Mac can use 96GB of that RAM as super-high-bandwidth VRAM without even rebooting. It is by far the cheapest VRAM you can buy today. On a Mac Studio I can get 512GB of RAM, almost all of which is available to the GPU, for around $10,000. That’s not cheap by any means, but it’s a lot cheaper than four NVidia RTX PRO 6000s and the nuclear-grade power supply that can run all of them at once.
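To put those numbers side by side, here’s a quick back-of-the-envelope comparison using the prices quoted above. The 75% figure for how much Mac unified memory the GPU can use is my assumption, extrapolated from the 96GB-of-128GB case; the actual limit varies and can be raised:

```python
# Dollars per GB of GPU-accessible memory, using the prices quoted above.
# The 0.75 usable fraction for Mac unified memory is an assumption based
# on the 96GB-of-128GB example; treat it as a ballpark, not a spec.

rtx_pro_6000 = 7000 / 96            # 96GB VRAM card
mac_studio = 10000 / (512 * 0.75)   # 512GB unified memory, ~75% GPU-usable

print(f"RTX PRO 6000: ~${rtx_pro_6000:.0f} per GB of VRAM")
print(f"Mac Studio:   ~${mac_studio:.0f} per GB of GPU-usable RAM")
```

Even with that conservative usable-memory assumption, the Mac Studio comes in at roughly a third of the per-gigabyte cost, before you count the extra cards and power supply a multi-GPU build would need.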
If you’re VRAM limited, as LLMs tend to be, the Mac is your cheapest path to running larger models faster.
Apple got lucky here. I don’t think they imagined local LLMs as an important use case when they decided to use unified memory in their M-series chips 5+ years ago. I don’t know why they went with unified memory, but whatever the reason, it paid off, big time, for Apple and for consumers. This reminds me not a little of how NVidia got lucky that their GPUs turned out to be worth a lot more when they were used for crypto-mining and LLMs than for the video games they built them for.
Macs might not outperform price-is-no-object Threadripper multi-GPU systems, but they dominate everything anywhere close to their price point. If you run LLMs locally (which is the only way you should run them, for reasons I’ll be writing about soon), a Mac with 64GB or more of “overpriced” RAM is the obvious and cheapest choice.