M1 Max



I think the notch, being in the menu bar, is a great way to add the bigger camera sensor. That space is usually wasted anyway. (We’ll see how it handles lots of menu bar items.)


I meant Pro actually.


I’m actually thinking about buying the M1 Pro while it’s available so I can live without the damn notch for a few years. I do not need the new pro specs anyway.


Actually I read the Max is more powerful and higher spec’d than the Pro. So I did mean Max after all!

@benwiggy thanks for pointing out that they all share the same instruction set.

Ok it’s good to know that it already runs well on M1 and will be native with Dorico 4. My 2015 MBP is starting to do some very scary things and so I’ve been waiting for these new MBPs eagerly.

The Max actually has the same number (10) of CPU cores as the Pro. The only difference is that the Max can address more memory, and has more GPU cores.

Here’s a helpful chart thanks to cnet:
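In case the chart doesn’t come through, the headline differences can be sketched in a few lines (figures from Apple’s published top-configuration specs, as I recall them):

```python
# Top configurations of the M1 Pro and M1 Max, per Apple's spec pages.
specs = {
    "M1 Pro": {"cpu_cores": 10, "gpu_cores": 16, "max_memory_gb": 32, "bandwidth_gbps": 200},
    "M1 Max": {"cpu_cores": 10, "gpu_cores": 32, "max_memory_gb": 64, "bandwidth_gbps": 400},
}

# Same CPU core count; the Max doubles GPU cores, memory ceiling, and bandwidth.
for key in ("gpu_cores", "max_memory_gb", "bandwidth_gbps"):
    ratio = specs["M1 Max"][key] / specs["M1 Pro"][key]
    print(f"{key}: {ratio:.0f}x")  # prints 2x for each
```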


Just some additional trivia. The CPU & GPU cores used in the M1-series processors are the same as the ones in the A14 chip used in the iPhone 12.

And to think: the computer that put the astronauts on the moon was the size of a car and only had as much computational power as a Casio watch from the 1980’s. My, how far we’ve come!


Don’t forget: the power of the M1s isn’t the performance, it’s the power savings. The performance is typical for a scaled-up embedded platform. For comparison, at an equivalent price I just built a 24-core Threadripper at 4GHz (on all cores) with 128GB RAM and a 3090, which is at least an order of magnitude greater performance (I’m actually not sure they can be compared; the 3090 has 24GB of VRAM and ~11k CUDA cores, compared to 32 GPU cores, I think, on the Max). The real power of Macs is the software ecosystem.


Well, try slipping that Threadripper machine into a standard size briefcase… :thinking: :rofl:


There’s no point in a price comparison between a desktop box and laptop. But it might be interesting to see some benchmark scores to see how close a mere ‘scaled up embedded platform’ gets to your Threadripper.

This latest Linux laptop has an Intel 8-core i7-11800H and (for 16GB RAM and 1TB storage) comes in a couple of hundred cheaper than a 16" MBP, but you’re still missing the HiDPI display, higher-spec RAM and storage, USB-C charging, and lots more besides.

An HP ZBook is $3000 for 32GB RAM and 1TB storage, with a 4K 15" display and an Intel i9-10885H. Geekbench scores are c. 1100 single-core and 7200 multi-core.

The Intel i7-11800H scores c. 1500 / 9400 (single-core / multi-core) on Geekbench, compared to the single reported 1700 / 11500 of the M1 Max. (24-core Threadrippers come in at around 1300 / 24000.)
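Putting those quoted scores side by side (these are the rough figures cited above, not fresh benchmarks), the ratios work out as follows:

```python
# Approximate Geekbench 5 scores quoted above: (single-core, multi-core).
scores = {
    "M1 Max":             (1700, 11500),
    "Intel i7-11800H":    (1500, 9400),
    "24-core Threadripper": (1300, 24000),
}

m1_single, m1_multi = scores["M1 Max"]
for name, (single, multi) in scores.items():
    print(f"M1 Max vs {name}: "
          f"{m1_single / single:.2f}x single-core, "
          f"{m1_multi / multi:.2f}x multi-core")
```

So on these numbers the M1 Max leads everything on single-core, while the Threadripper’s sheer core count still wins the multi-core race by roughly 2x.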

And by all accounts, the GPUs are insanely powerful, too. Dismissing the power of these things as ‘typical’ is absurd.

I agree on the power front; the M1 SoC runs at about 15 watts. The new M1 Max reputedly has as much graphics oomph as a 3090, and that runs at hundreds of watts by itself, not including the rest of the system required to drive it. What Apple’s engineers have achieved is truly stellar (at least in this case). I have an M1 Mac mini at work now, and everything is instantaneous. I can export some of my YouTube videos from Camtasia in under a minute; on my old machine I had to start the render and go to lunch.

I confess I was very disappointed they didn’t release a souped-up mini; as soon as they make one with 64GB of RAM, that will be my new computer for home.

Possibly too far - I had a blazing row with my Google smart speaker this morning!


This made me chuckle out loud.

Much like my comment in another thread where I admitted I’m not on social media anymore, I also refuse to have smart speakers anymore. We had a few Alexa devices, including a nice “Show” model that had a screen and played videos. After I found my four-year-old asking for music while we were in the other room, and Alexa misinterpreting the request and delivering content that should never have flowed into our home, I was absolutely done. Also, I don’t like making it any easier for the spies than it already is; the number of data leaks and hacks is alarming. I refuse to have a smart baby monitor for the same reason. But I digress.


Many a four-year-old has been observed in a waiting room with one of the inevitable pile of magazines on their lap, futilely trying to select and swipe, obviously wondering why the magazine is ‘broken’.
In a related vein, after playing from my iPad and then switching to printed sheet music, I find myself tapping the paper and experiencing momentary confusion when the page doesn’t turn.
That’s how far we’ve come.


It won’t be long now before the men in white coats come to take us away… :fearful:


On the bolded part, you took the words right out of my mouth. I believe what they did is optimize the GUI stack so well that it seems instantaneous, which nobody bothered to do elsewhere. For example, in Windows, if you go into an obscure settings window and turn all the animations off, things like switching virtual desktops, changing windows, minimizing, and maximizing become truly instantaneous. Win10 bundles all these whizzy animations which, of course, are designed to take time. But your machine will feel ten times faster without them.
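For anyone who wants to try this without digging through the settings UI, a rough sketch of the relevant per-user registry tweaks (standard keys, but exact behavior can vary by Windows build; some changes only take effect after signing out and back in):

```shell
:: Disable the minimize/maximize window animation
reg add "HKCU\Control Panel\Desktop\WindowMetrics" /v MinAnimate /t REG_SZ /d 0 /f

:: Remove the artificial delay before menus open
reg add "HKCU\Control Panel\Desktop" /v MenuShowDelay /t REG_SZ /d 0 /f

:: Or do it through the GUI: run SystemPropertiesPerformance.exe
:: and pick "Adjust for best performance" to turn off all visual effects at once.
```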

A different example is the right-click menu on a file: on all machines that’s slow. I forget the reason, but it does some silly things (a file-path lookup or something) that take forever no matter the machine. So the machine seems slow perceptually, even though it isn’t.

Anyhow, on the first part: the engineering here is nothing special; they just looked at the low-hanging fruit of what takes time (perceptually) and optimized the hell out of it. Not hard or special, it’s just that most places don’t spend the effort. The real genius is the marketing.

It so happened that the morning I wrote that post I was in a software meeting about our ARM SoC. The engineer working on it talked about the new “Unified Memory Space” that’s sweeping the SoC industry. Without going into technicalities, the only people who care about it are… well, not even us software guys who program this stuff. It just makes life a bit easier for some I/O hardware engineers.

But go to the M1 Max Apple page and what do you see? “Unified Memory Architecture” …. Ooooooooohhhhhhh. LOL, sorry; it’s amazing how they turned such an obscure non-feature into a feature. I wish we could do that.

Yeah, you can’t scale like that; there are too many other variables, such as internal clocks, bus speeds, addressing, etc. And more importantly, the software stack: out of the gate, Apple is so obsessed with power consumption that they underdrive the whole system. Take ray tracing: it’s taken over the industry (games, visualization, etc.), yet neither the software nor the hardware is available on mobile, and I doubt it ever will be.

Fortunately for music people, the hardware demands of music software (even virtual instruments) have long been addressed by Moore’s Law.