What are you using all that power for? Gaming? Not likely on a Mac. Machine learning? Also not likely with that GPU… Maybe a Photoshop machine? Enjoy that non-expandable RAM.
As a nice dev machine I get it: good battery life and a screen for watching Netflix, but it's not like you can't get the same performance for the same or lower price with a Dell/ThinkPad running Linux…
That’s a rather narrow set of use cases. For example, they are audio and video editing powerhouses. Audio in particular is exceptional because of Core Audio in macOS.
And upgradable components aren’t something 95% of the population is worried about. Max out what you need when you buy it. My last Mac lasted 8 years with no trouble. And by the time I was ready to upgrade, the bottleneck was mainly the CPU; after 8 years that means a new motherboard, and at that point you might as well replace the whole computer, since the standards have changed and updated.
Apple silicon has a pretty decent on-board ML subsystem; you can get LLMs to output a respectable number of tokens per second on it if you have the memory for them. I’m honestly shocked that they haven’t built a little LLM to power Siri.
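For anyone curious why unified memory matters here: single-stream LLM decoding is typically memory-bandwidth bound, so a rough ceiling on tokens/sec is just bandwidth divided by the bytes read per token (roughly the model's size in memory). A minimal sketch, with illustrative numbers (the ~400 GB/s bandwidth and ~4 GB quantized 7B model are assumptions for the example, not specs of any particular chip):

```python
# Back-of-envelope: decode speed ceiling for a bandwidth-bound dense model.
# Each generated token requires reading roughly the whole model from memory.

def est_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper bound on decode rate: memory bandwidth / bytes read per token."""
    return bandwidth_gb_s / model_size_gb

# Hypothetical numbers: ~400 GB/s unified memory, 7B model quantized to ~4 GB.
print(est_tokens_per_sec(400.0, 4.0))  # -> 100.0 tokens/sec ceiling
```

Real throughput lands below this ceiling (attention cache reads, compute overhead), but it shows why a big, fast unified memory pool is exactly what local inference wants.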
With that software support…?