So… in the world as I understand it, ask anything (like a CPU) to do more work, and it consumes more power and generates more heat.
I think Ben briefly mentioned cooling in one of the performance threads, but I thought of it as just a “be careful” kind of thing. Well, I got educated.
It turns out (depending on the chip) that it’s not so much that running multiple cores is inherently slower than a single-core task. But you ARE multiplying the CPU’s heat and power requirements by the number of active cores - and those are the limits you hit that trigger the chip’s speed-throttling behavior.
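A toy sketch of that mechanism (all numbers here are made up round figures, not real chip data, and real silicon is messier because voltage also rises with clock speed): the package has a fixed power budget, each active core spends a slice of it, so the sustainable all-core clock falls as more cores light up.

```python
# Crude model of package power limiting. Dynamic CPU power scales
# roughly as P = C * V^2 * f per core; here it's simplified to
# "watts per core per GHz". All constants are illustrative only.

def max_sustainable_ghz(active_cores, power_cap_w=125.0,
                        watts_per_core_per_ghz=8.0, boost_ghz=5.0):
    """Sustainable clock = power budget split across active cores,
    capped at the chip's maximum boost clock."""
    budget_ghz = power_cap_w / (active_cores * watts_per_core_per_ghz)
    return min(boost_ghz, budget_ghz)

for n in (1, 2, 4, 8):
    print(f"{n} active core(s) -> ~{max_sustainable_ghz(n):.2f} GHz")
```

With these made-up numbers, one or two cores can hold the full boost clock, but at eight active cores the same power cap only buys about 2 GHz - which is why the poster needed a bigger power supply and cooler to sustain high all-core speeds.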
I eventually listened to my son, the build enthusiast. First, a “K” variant of an 11th gen Intel chip, which means it was selected from a batch of similar chips for its stability at higher speeds and is “unlocked” for overclocking. Not the absolute fastest Intel chip (that would be 12th gen), but maybe in the “second fastest” tier at a not-quite-so-crazy price point. The boost speed is about 56% faster than an M1 Pro, provided the build can sustain it.
That required an enthusiast-class motherboard, a substantially bigger power supply, and an enormous cooling tower that looks like the cylinder of a Harley-Davidson. As Daniel mentioned in one thread, it’s like a gamer system, except I only spent money where it would benefit the most.
I took it to a place with an orchestra’s worth of sample libraries where my laptop would normally crumble. It yawned at about 18% load. I got a SUSTAINED 4.9-5.0 GHz clock speed. I’m not claiming best or fastest - just that I was dumb to take so long to listen to my son.
I more or less tripled the orchestra with additional mics and sample libraries, added some heavy synths, doubled the sample rate… No issues. It ran no higher than 50% load, with individual core temps (the major key, I think) staying around 39-43 °C.
The objective result for me is that I’m no longer letting the machine influence my choices in the composition.
It’s a marvelous feeling. I recently upgraded to the last Intel iMac with 8 cores and 96 GB RAM. It cuts through whatever I throw at it like butter. And it’s so fun to be able to run everything at 96 kHz, 24-bit and literally load 70 GB of sample set data into RAM like it’s no big deal.
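The sample-rate side of that is simple arithmetic; a quick sketch (raw uncompressed audio only, ignoring how any particular sampler actually manages its memory) shows why doubling the rate doubles the data to stream or hold in RAM:

```python
# Raw data rate of uncompressed PCM audio:
# bytes/s = sample rate * (bit depth / 8) * channel count.

def bytes_per_second(sample_rate_hz, bit_depth, channels):
    return sample_rate_hz * (bit_depth // 8) * channels

stereo_96k = bytes_per_second(96_000, 24, 2)  # 576,000 bytes/s
stereo_48k = bytes_per_second(48_000, 24, 2)  # 288,000 bytes/s
print(stereo_96k / stereo_48k)  # -> 2.0
```

Multiply that per-channel rate by dozens of mic positions and articulations and it’s clear why big RAM (and fast disks) matter more than raw clock speed for large templates.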
I remember how excited I was when our second desktop computer had 256 MB of RAM and a 50 GB hard drive, lol. Ahhh, ’96. Those were the days…
256 MB of RAM in 1996? Blimey. I’m pretty sure that was the year I first typed something into Sibelius 7, on an Acorn with 4 MB RAM and a 200 MB hard drive.
LOL. I remember installing an entire OS with floppy disks.
I sense in 20 years time we’ll wonder how we ever made do with what we have today.
For my first computer, I bought an external hard drive of 20 megabytes. It was HUGE. Yes, MEGABYTES, not gigabytes. It never got even half full. Well, who cares about these things nowadays…?
And what we wouldn’t give to sell that equipment to a collector nowadays…
I still have my first computer, an Atari 1040ST - and I later bought the 50 MB hard drive, which was enormous (for the day). I think it all still works; I just haven’t hooked it up in years.
Oh, you kids. I did my first programming on an IBM 1620, using punch cards for both input and output. If memory serves, it was the Advanced Storage Unit Model 2, which could handle 60,000 decimal digits in memory.
And get off my lawn . . . .
You guys are spring chickens. This is me using my first computational device in the Middle Ages.
Speaking of huge: in the late ’80s, working at a software/hardware company, we threw out an old 10 MB hard drive. It took two of us two trips to carry it downstairs.
My first computer was a Dragon 32 - this beast of a machine had a huge 32 KB (yes, kilobytes) of RAM. My current DAW has 2 million times that! Loved it!
32 KB! You don’t know how lucky you were - my ZX81 had 1 KB!
That is incredibly interesting - 15 numbers flashed on a screen in 1.5 seconds, then added together mentally. That’s amazing.
This is what memory looked like - used until the seventies. Hand-woven ferrite rings: each ring was a bit, and nine bits (eight data bits plus a parity bit) made one byte. No wonder it was limited and expensive.
Not only that: the color scheme is terrible!
Linus Tech Tips has a great video from a year or two ago where he tours a NASA facility with one of the lead engineers from the Apollo missions. At one point they discuss these old memory modules in great detail. It’s very interesting.
we all learn, eventually…
Do I post this, or not? Sure, honor your father (and mother).
My Dad was an electrical engineer working for a large NASA contractor starting with the Mercury program (he brought home short film productions and showed them on a home projector when my siblings and I were little), then Gemini, then Apollo, then the early Space Shuttles. He specialized in batteries in the environment of space. He used to fly out to all the key NASA sites.
When I went through his boxes after he died I discovered a special pin with a Snoopy on it. Did some research and found out it was a very special industry award. He never told us about it.
When I got into my teens, he did his best to explain to me and my brother what he did, all about electronics and computation (thus the connection to this thread), but we were never able to fathom half of what he would tell us.
A good man and in my eyes a great man. Sorry for the captive audience, but doggone, a guy has to take the opportunity to publicly praise his Dad.
Glad you posted this.
These discussions often degenerate into tribal debates about Macs versus PCs, but for a lot of people the main consideration should be off-the-shelf computers versus custom builds.