I am new to Cubase and have been test recording some tracks. When recording MIDI, if I hold down a chord on strings and use the sustain pedal, after a few bars it will go crazy. If I hit reset it stops. I tried a piano track and it does the same thing about a verse into the song if I hold the sustain pedal. It is a Yamaha MX88 and I'm using Cubase LE AI Elements 10.
What does this mean? Can you provide a step-by-step?
All the notes hang and will not stop, and sometimes it's like a big thud… it only happens if I hold down chords on strings for a few bars, or hold down the sustain pedal playing grand piano.
The problem ‘might’ be that you have your synth in Local Mode, while MIDI Thru is active in Cubase.
What I mean by this…
In Local Mode, when you hit a key on the synth it triggers a sound locally. It will ALSO send a MIDI event. If MIDI Thru is active in Cubase, that event echoes back into the synth and tries to trigger the same sound and/or continuous controller a second time. A kind of ‘feedback’.
In Controller Mode (Local OFF), tapping a key doesn't generate a sound unless the event is looped back into the synth via MIDI.
If you like to use Local Mode, then disable MIDI Thru in Cubase.
If you prefer Controller Mode (Local OFF), then you'll need MIDI Thru enabled in Cubase.
Most DAW users will want to have Local OFF on the MX88, with MIDI Thru ON in Cubase. That makes the keyboard something of an ‘independent, soundless, MIDI controller’. Meanwhile the DAW can still use the ‘tone generator’ portion of the synth by directing output from MIDI or Instrument tracks back into the instrument.
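To make the feedback idea concrete, here's a tiny Python model of the event flow (purely illustrative; `local_on` and `daw_midi_thru` are my own stand-in names, not real settings or API flags):

```python
# Toy model of the Local Mode / MIDI Thru feedback described above.
# (Illustrative only -- no real MIDI library involved.)

def sounds_triggered(local_on: bool, daw_midi_thru: bool) -> int:
    """How many times one key press triggers the synth's tone generator."""
    triggers = 0
    if local_on:
        triggers += 1   # the synth sounds the key itself
    if daw_midi_thru:
        triggers += 1   # the DAW echoes the MIDI event back to the synth
    return triggers

# Local ON + MIDI Thru ON: every key press fires twice -- the 'feedback' case.
print(sounds_triggered(local_on=True, daw_midi_thru=True))    # 2
# Recommended setup: Local OFF on the synth, MIDI Thru ON in Cubase.
print(sounds_triggered(local_on=False, daw_midi_thru=True))   # 1
```

In the doubled case, every note and every pedal event hits the tone generator twice, which is consistent with the chaos building up over a few sustained bars.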
Check out page 23 in the MX88 Manual for more info on use with Cubase.
Another thing that ‘might’ be happening…
If you have more than one MIDI track in your DAW set to output to the MX88 over the same MIDI channel, and both are armed for Recording or Monitoring at the same time, striking a key/pedal might lead to cloned events, thus confusing the synth. The solution would be to only arm one of these tracks at a time when recording/monitoring, or set them to different MIDI channels. If you do need multiple tracks using the same MIDI channel for some reason (not uncommon when you're using cycle mode to do different ‘takes’, or maybe you have a program change event to use different sounds at different points in the song and like to keep the different ‘sounds’ in their own tracks, etc.), be careful that the different tracks aren't trying to play the same note(s) and/or use the same controller(s) at the same time.
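As a toy model of the ‘cloned events’ idea (my own sketch, nothing Cubase-specific): each armed/monitoring track on the same channel echoes the incoming event once, so two armed tracks double everything the synth receives:

```python
# Toy model: every armed/monitored track set to the same MIDI channel
# echoes each incoming event back to the synth once.
def echoed_stream(events, armed_tracks):
    out = []
    for ev in events:
        out.extend([ev] * armed_tracks)  # one copy per armed track
    return out

played = ["note_on C4", "note_off C4"]
print(echoed_stream(played, 2))
# ['note_on C4', 'note_on C4', 'note_off C4', 'note_off C4']
```

Two note-ons for a single key press can stack or retrigger voices on some synths, which is one way hung notes and ‘thuds’ can show up.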
I did reference that in the manual and also found a YouTube video of the same thing. I just forgot to update it in the forum. You are spot on. One thing you said about MIDI Thru in Cubase I need to look at, because the sounds in the MX are awesome. I guess you would need to record them as audio tracks. I tried that with a piano sound and it's out of sync with the metronome. More homework!
Yeah, your audio card may have some ‘latency’, while MIDI, and your keyboard sounding off on its own, is almost instant (depending on how you have it wired for monitoring).
If you pipe the keyboard's audio into your audio interface, and then into Cubase through an audio track or input channels (perhaps with VST effects on the channels as well), and in turn ‘monitor’ the overall output THROUGH the DAW, then your playing should already be ‘closer’ to the metronome, but might still be a bit off.
Next, depending on your interface and the processing power of your computer, you might be able to get the audio latency down some by adjusting the ASIO buffer size (done in the driver software of your audio interface). 512 samples is usually a good medium that gives headroom to run plugins, yet ends up with a latency under, say, 40ms (detectable, but not so bad that it's impossible to play along live with the DAW tracks). If you're not trying to run tons of VSTi plugins and stuff, or your PC is pretty beefy/modern, at moderate sample rates (44.1 kHz or 48 kHz) it might even be possible to get latency down to 10 or 12ms (buffer sizes of 128, 64, or 32 samples, or maybe even smaller).
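The relationship between buffer size and latency is simple arithmetic. This little sketch shows the one-way delay that a single buffer adds (the latency your interface actually reports will be higher, since a round trip includes input and output buffers plus driver overhead):

```python
# One-way latency contributed by a single ASIO buffer:
# (buffer size in samples) / (sample rate in Hz), converted to milliseconds.
def buffer_latency_ms(buffer_samples, sample_rate_hz):
    return buffer_samples / sample_rate_hz * 1000.0

for buf in (512, 256, 128, 64):
    print(f"{buf:>4} samples @ 48 kHz -> {buffer_latency_ms(buf, 48000):.2f} ms")
```

So halving the buffer halves that part of the delay, at the cost of giving the CPU less time per buffer to run plugins.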
Check out the menu “Studio/Studio Setup/Audio System” in Cubase. This is where you'll deal with most latency issues. While you're in this dialog, notice the ‘Adjust for Record Latency’ option… you can use that to get your keyboard sounding more in sync with the DAW transport.
You’ll see a few latency times reported near the top of the dialog.
Input and Output latency refer to what you should be getting when a track is ‘armed’ for recording or monitoring, and the DAW transport is idle.
ASIO-Guard Latency is the latency for ‘everything’, I think, when the DAW is playing. Armed tracks do get some priority, but may still have increased latency above the raw input/output times.
It's also possible to disable ASIO-Guard altogether and attempt to force everything to happen as close to real time as possible. Unless you have a specific reason, and know what you're doing and why, it's typically recommended to just leave ASIO-Guard on and use the DAW. Start with ‘normal’ and go from there if you have ‘issues’ later.
I.e., if you have a slower computer and start to hear ‘glitches’ in the audio as more plugins are added… you might try giving ASIO-Guard higher or lower priority. If that doesn't fix it, try a larger ASIO buffer size (you'd do that in your audio interface driver software). Larger buffers give the system ‘more time’ to process things (thus the latency delay of larger buffers).
Do you ‘need’ a really low latency? Probably not… at least not during the ‘compositional and doodling’ phases of building a project. It'd come in handy at times when you need fine precision with ‘real time’ mixing using remote controls and stuff like that. It can be helpful if you're trying to do vocal tracks and stuff where everyone needs to hear each other, and all the parts, with tight precision (mix VSTi plugins down to pure audio tracks first in those cases where possible… it frees up processor resources).
Since most home studio folks will simply be using the mouse to ‘draw in’ automation… we rarely notice ‘latency issues’ except for maybe when we're trying to ‘play along’ with the tracks. 40ms or so is tolerable for most folks (and music types).
Don't freak out too much if the stuff you play in isn't right on the grid. If it sounds good, that's what matters. Also be aware that most versions of Cubase have pretty good features to perfectly quantize, or even humanize, those played events. There are even ‘groove’ quantizers (make a groove track, and get other tracks to quantize with it rather than to the grid).
In Cubase Pro for sure (not sure about the little brothers), there are also real time MIDI inserts that can push note on events closer to, or right on the grid precisely as you play in real time (adjust the strength of pull towards the grid with a percentage). Quantize in real time while playing.
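The ‘strength’ idea is just pulling each note-on part of the way toward the nearest grid line. A minimal sketch of that principle (the function and parameter names are mine, not Cubase's):

```python
# Strength-based quantize: pull a note-on time toward the nearest
# grid line by a percentage (100% = snap fully, 0% = leave alone).
def quantize(time_ticks, grid_ticks, strength_pct):
    nearest = round(time_ticks / grid_ticks) * grid_ticks
    return time_ticks + (nearest - time_ticks) * (strength_pct / 100.0)

# A note played 10 ticks late against a 120-tick grid:
print(quantize(130.0, 120.0, 100.0))  # 120.0  (snapped fully to the grid)
print(quantize(130.0, 120.0, 50.0))   # 125.0  (pulled halfway there)
```

A moderate strength keeps some of the human ‘feel’ while tightening things up, which is why the percentage control exists at all.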
When you ‘arm’ a track for recording or monitoring, it should get some priority if ASIO-Guard is enabled.
There are also features in Cubase to set latency offsets for a given track…thus forcing a desired realtime delay in MIDI input so things ‘match up’ better.
From there, each track can be independently toggled to apply that latency offset, or not.
With some audio interfaces, it’s also possible to wire things so you hear your synth directly through zero latency ‘monitor’ channels. I.E. You hear those sounds straight from the audio interface (usually some sort of software mixer comes with the device and can mix all the inputs into a live ‘Stereo Mix’ or something similar). If you’re hearing your synth this way, then it could easily be ‘ahead’ of stuff playing ‘thru’ the DAW.
With some audio devices, and versions of Cubase, Cubase can make use of that ‘monitor’ channel from inside the DAW in interesting and useful ways. I.E. For sending a zero latency monitor to the earphones of some singers.