DAW sound engine: Logic / Cubase / Pro Tools

Give it up dude, these kinds of tests are utterly useless. You cannot know how the differences came to be - maybe the compression algorithm on that video source has more impact on the Cubase mix, or the person creating it doesn’t know what he’s doing, or purposely made one program look better. We cannot know - waste of time.

And… here’s the same crazy stupid discussion @ gearslutz :unamused:

superfluous bullsh*t…
Can anyone write something in that forum?
I don’t have an account at gearslutz.

OK, I’m convinced now… Cubase is the best for my needs… Logic is great too… Pro Tools?? Well!!!
Look at the attachment :wink:

I use Cubase and Pro Tools. If there is any difference, which I can’t tell to be honest, I mix for the sound engine, I guess.

I used Logic too, for years, and the same thing applies.

I used different consoles and different tape machines too, and the same applies.


Fix your ears or close your eyes…

bah bah bah bah bah bah!!!



“plastic highs”

LOL! As soon as I read that in the first post, I closed the browser window. WTF are “plastic highs”? ROFLMAO :laughing:

Dumb ass rookies.

Get in the habit of eating plastic soda bottles, beverage and all, and you’ll get both plastic highs and sugar highs. Really reminds me of the sound of Cubase, btw! :unamused: :laughing:

I hear more red in Cubase, and a bit bluer in Logic. Pro Tools always seemed kinda greenish. If you are color blind, that isn’t my fault.

:laughing:

DAW engines don’t have a sound of their own, but these tests usually have operator errors of their own: mismatched levels (fader readouts aren’t necessarily calibrated the same across programs), forgetting about pan law (see the little sketch below), you name it.
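
A minimal sketch of just that one gotcha, pan law. The law names and values below are purely illustrative assumptions - the actual default depends on the DAW and its project settings - but a center-panned mono track alone can come out several dB apart between two programs:

```python
# Illustrative only: how much a center-panned mono track is attenuated under a
# few common pan laws. Which DAW uses which default is an assumption, not a spec.
pan_laws_db = {"0 dB law": 0.0, "-3 dB law": -3.0, "-4.5 dB law": -4.5, "-6 dB law": -6.0}

for name, law_db in pan_laws_db.items():
    linear_gain = 10 ** (law_db / 20)          # dB -> linear amplitude factor
    print(f"{name}: center-panned track scaled by x{linear_gain:.3f} ({law_db} dB)")
```

Match the pan law (and the fader calibration) in both programs before comparing, or the “difference” you hear is just level.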

So far, whenever I did or witnessed a professional test like these, the usual result was that people were able to cancel out the two mixes to a complete null eventually - usually only after checking lots of little details (is dithering active here and inactive there, does the mix include any plugins with unsynced LFOs or randomization in their algorithm, like modulation effects or reverb, do we dither the result to a linear (fixed-point) format or do we stay float)… whatever. Point being: if you manage to cancel out a mix completely, that proves the basic sound is the same, and all the examples where it doesn’t null are just examples of people not doing these tests right. A rough sketch of such a null test is below.
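
For what it’s worth, here is that null test sketched in Python, assuming both DAWs rendered the same project to 32-bit float WAVs of the same length (the file names are made up):

```python
import numpy as np
from scipy.io import wavfile

# Hypothetical file names; any two renders of the same project will do.
rate_a, a = wavfile.read("mix_cubase.wav")
rate_b, b = wavfile.read("mix_protools.wav")
assert rate_a == rate_b, "sample rates must match"

n = min(len(a), len(b))                        # tolerate a stray sample at the tail
residual = a[:n].astype(np.float64) - b[:n].astype(np.float64)

peak = np.max(np.abs(residual))
if peak == 0.0:
    print("Complete null - the two renders are bit-identical.")
else:
    # dBFS reading assumes float files with full scale at 1.0
    print(f"Residual peak: {20 * np.log10(peak):.1f} dBFS"
          " - check levels, pan law, dithering, unsynced LFOs...")
```

If the residual doesn’t reach a complete null, that’s the cue to go hunting for the level, pan-law, dithering and random-LFO details mentioned above.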

That, and then there’s the science behind it. It’s math. There might be little details here and there where it won’t be exactly the same (say, linear busses vs. floating point, resolution issues, automation jitter or whatever), but the BASIC sound difference - there is none. It’s math. The toy example below gives an idea of how small those “little details” actually are.

(I hate math, btw)
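
To put a rough number on those “little details”: a toy sketch, assuming a mix bus that simply sums 24 float32 tracks either with a 32-bit or a 64-bit accumulator (the track material is just generated noise). The rounding differences it prints sit far below anything you could hear, which is the whole point.

```python
import numpy as np

rng = np.random.default_rng(1)
# 24 fake tracks of one second at 48 kHz, roughly -20 dBFS noise
tracks = rng.standard_normal((24, 48000)).astype(np.float32) * np.float32(0.1)

bus32 = np.zeros(48000, dtype=np.float32)
bus64 = np.zeros(48000, dtype=np.float64)
for t in tracks:
    bus32 += t                      # engine rounding to 32-bit after every add
    bus64 += t.astype(np.float64)   # engine keeping a 64-bit accumulator

diff = np.abs(bus32.astype(np.float64) - bus64)
peak_mix = np.max(np.abs(bus64))
peak_diff = np.max(diff)
print(f"mix peak {peak_mix:.3f}, worst difference {peak_diff:.2e}")
if peak_diff > 0:
    print(f"that is {20 * np.log10(peak_diff / peak_mix):.0f} dB below the mix peak")
```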