Hi All. I’ve seen reference to this but don’t know if it’s fixed or if it’s related to what we see on screen. Anyway, I think I need simple answers to basic questions I’ve been taking for granted:
What is MIDI recorded to? Milliseconds? Ticks? What affects it? I can't find any specific references to this.
What we see is determined by the MIDI Display Resolution pref, but this doesn't affect what we hear. Or does it?
The most accurate representation of what we have recorded is the version inside Cubase itself; no export will ever be quite the same.
The tick seems to be the resolution unit for editing and viewing; the highest value is 960 PPQ (pulses per quarter note).
The overall resolution for recording would depend on this, I would think.
Experiment: set the lowest allowed resolution, 24 PPQ, record some MIDI, and see whether the recorded notes fall on that grid; then redo it at 960 PPQ and see if the same holds.
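For illustration, here's a minimal Python sketch of what that experiment would be measuring, assuming the sequencer timestamps incoming notes in seconds and then snaps them to the nearest tick at the chosen PPQ. Whether Cubase actually works that way is exactly what the experiment is meant to reveal; the note times and tempo below are invented.

```python
# Rough sketch of the 24 vs 960 PPQ comparison, in plain Python.
# Assumption (not a statement about Cubase): incoming notes are
# timestamped in seconds and snapped to the nearest tick at the
# chosen PPQ. The numbers below are made up for illustration.

TEMPO_BPM = 120.0                       # hypothetical project tempo
SECONDS_PER_QUARTER = 60.0 / TEMPO_BPM  # one quarter note = 0.5 s at 120 BPM

def snap_to_grid(time_s: float, ppq: int) -> float:
    """Snap a time (in seconds) to the nearest tick at the given PPQ."""
    tick_len = SECONDS_PER_QUARTER / ppq  # duration of one tick in seconds
    return round(time_s / tick_len) * tick_len

# A few 'recorded' note-on times in seconds (invented values).
recorded = [0.000, 0.2631, 0.5108, 0.7494, 1.0023]

for ppq in (24, 960):
    snapped = [snap_to_grid(t, ppq) for t in recorded]
    worst_error_ms = max(abs(s - t) for s, t in zip(snapped, recorded)) * 1000
    print(f"{ppq:>4} PPQ: worst rounding error ~ {worst_error_ms:.2f} ms")

# At 120 BPM one tick is ~20.8 ms at 24 PPQ but only ~0.52 ms at 960 PPQ,
# so the coarse grid audibly shifts notes while the fine grid barely does.
```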
The project tempo would also define the resolution, along with the tick PPQ?
To see whether this is true, you could try the same experiment at a very slow tempo and a very fast one.
I would try it but I’m not at my computer.
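The arithmetic alone already suggests how tempo and PPQ act together, though: at a fixed PPQ, one tick covers more real time at a slow tempo than at a fast one. A quick sketch (pure arithmetic, nothing Cubase-specific; the tempos are arbitrary examples):

```python
# How long is one tick in milliseconds? It depends on both tempo and PPQ:
#   tick_ms = 60000 / (BPM * PPQ)
# 960 PPQ is the Cubase figure mentioned above, 24 PPQ the lowest setting;
# the tempos are arbitrary examples.

def tick_ms(bpm: float, ppq: int) -> float:
    return 60_000.0 / (bpm * ppq)

for bpm in (40.0, 120.0, 240.0):   # very slow, moderate, very fast
    for ppq in (24, 960):
        print(f"{bpm:>5.0f} BPM @ {ppq:>3} PPQ -> one tick = {tick_ms(bpm, ppq):.3f} ms")

# e.g. 40 BPM @ 960 PPQ -> ~1.563 ms per tick, 240 BPM @ 960 PPQ -> ~0.260 ms,
# so the same tick grid is coarser in real time at slow tempos.
```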
Of course, is MIDI even that accurate in the first place? Talking about external MIDI, that is!
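On that point: the old 5-pin DIN transport runs at 31,250 baud with 10 bits per transmitted byte, so a three-byte Note On takes close to a millisecond on the wire regardless of what resolution the sequencer records at. Rough numbers (standard MIDI 1.0 figures, nothing Cubase-specific):

```python
# Serial MIDI 1.0 (5-pin DIN) timing: 31,250 bits/s, 10 bits per byte
# (1 start + 8 data + 1 stop). These are the standard MIDI 1.0 figures.

BAUD = 31_250
BITS_PER_BYTE = 10

byte_ms = 1000.0 * BITS_PER_BYTE / BAUD   # ~0.32 ms per byte
note_on_ms = 3 * byte_ms                  # status + note + velocity

print(f"one byte:        {byte_ms:.3f} ms")
print(f"3-byte Note On:  {note_on_ms:.3f} ms")

# ~0.96 ms per Note On means a dense chord sent over a single DIN cable
# gets smeared by roughly a millisecond per note, whatever the internal PPQ.
```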
It shouldn’t matter in most cases, as PPQN is sometimes handier for scoring purposes. Any difference in appearance between the two types of resolution should not, in musical terms, affect what you “see”.
Rarely, it may matter if you are doing some detailed synthesis or mixing two sounds (one audio and one MIDI, say) and you need to know why a certain artifact is happening, like an interference pattern (growliness in parts of what should sound like a smooth pad?).
Hmmm… I detect a bit of guesswork here, boys, tut-tut. Some fair points too, of course, but I’m not overly concerned with repercussions. This is just an enquiry for dull facts. I’d just like to know what I’ve got in the bag, so to speak.
Thanks so far, but the fat lady is still fixing her make-up…