Apple just can’t resist making ridiculous margins from their customers, even when their devices do allow for upgrades to the default configuration.
For instance, with a Mac Pro, you have to pay an extra $800 to go from 64GB to 128GB of memory. For $800, you could get about 384GB of RAM in 64GB sticks from a different vendor.
Wait, it’ll actually let you use local LLMs?
That would legitimately help me out. I use LLMs a lot for simple data restructuring, or for rewording explanations when I’m reading through certain sources. I was worried they would just do a simple ChatGPT API integration and call it a day, but maybe this will end up being something I’d actually use.