The MIDI 1.0 specifications... and no one followed it

I’ve been reading through the MIDI v1.0 specifications (found on midi.org) and after just a little while, it dawned on me—no one actually followed it!

If you’ve ever used MIDI in modern days, at one point or another you’ve probably felt frustration over the 7-bit limitation of MIDI CCs. In today’s world, where VST automation offers practically unlimited resolution for controlling things like a volume fader or a filter cutoff knob, it’s a sad affair that a MIDI CC controller can only send integer values between 0 and 127. Or is it?

MIDI v1.0 specifies a number of control change numbers for specific uses, such as CC #7 for Volume, CC #1 for Modulation, CC #10 for Pan and so on. Some of these became ubiquitous (such as Modulation or Sustain Pedal) while some fell by the wayside. (When was the last time you used MIDI CC #71 for “Harmonic Intensity”?) Often in today’s MIDI-equipped devices or software, all (or most) of the 128 CCs are treated equally as 7-bit values, and this is where everyone got it wrong. The specification explicitly states that custom or proprietary parameters should be controlled either with CC #16-19, 80-83 or by using NRPN. It also states that if one needs finer resolution for some of the predefined CCs, such as Volume or Pan for example, you should use the LSB counterpart of these CCs (CC #39 in the case of Volume).

For those who are not familiar with MSB (Most Significant Byte) and LSB (Least Significant Byte), it is in essence a way of composing a larger number by combining two or more smaller ones. In the world of MIDI, it turns a 7-bit value into a 14-bit value. Controlling Volume together with its LSB counterpart increases the resolution from 0-127 to 0-16,383!
It gets crazier: NRPN (Non-Registered Parameter Number), which I mentioned earlier, allows for 16,384 freely assignable controllers, all with 14-bit values.
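To make the MSB/LSB pairing concrete, here is a minimal sketch (the helper names are my own, not from the spec) of how two 7-bit data bytes combine into one 14-bit value:

```python
def combine_14bit(msb: int, lsb: int) -> int:
    """Combine two 7-bit MIDI data bytes into one 14-bit value (0-16383)."""
    assert 0 <= msb <= 127 and 0 <= lsb <= 127
    return (msb << 7) | lsb

def split_14bit(value: int) -> tuple[int, int]:
    """Split a 14-bit value back into its MSB and LSB data bytes."""
    assert 0 <= value <= 16383
    return (value >> 7) & 0x7F, value & 0x7F

# Full-scale Volume would be CC #7 (MSB) at 127 plus CC #39 (LSB) at 127:
print(combine_14bit(127, 127))  # 16383
print(split_14bit(16383))       # (127, 127)
```

Each MSB step is worth 128 LSB steps, which is why a receiver can still make sense of the MSB alone.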

So why didn’t this catch on? I don’t know. The number of instruments and devices that utilize NRPNs, following the specification’s recommendation, is small. Not even Cubase does it (properly). The only reason I can think of is that at the time MIDI gained a stronghold in the market, most manufacturers were fine with low-resolution parameters, single CC values became the norm, and 14-bit CCs were forgotten about.

3 Likes

Hi,

Probably the users didn’t need it. Nowadays the need seems to be more present, therefore MIDI 2.0 is here.

1 Like

I needed it 20 years ago and I don’t think I was alone.
Also, MIDI 2.0 is almost here. :wink:

Still a bit puzzled why Steinberg, that was/is so big on the MIDI market, never properly implemented RPN/NRPN.

1 Like

It’s not like it didn’t catch on; I think some manufacturers just didn’t think they needed it.

Korg, for example, used NRPNs for everything. My Prophet Rev2 can use either CCs or NRPNs, and it does make use of some dual-CC controllers to make ranges larger. Sequential themselves recommend you set it to NRPN instead. The Bass Station 2 does as well. I can’t figure out how to make proper device panels for the BS2 or the Rev2, because there’s no way to use the dual-CC setup that I can find in there.

Roland used a lot of NRPN and MSB/LSB as well. I have to pull out their chart every time I want to switch around to patches on one of the expansion cards in my XV-3080. Hell sometimes I need to just to switch around to the internal banks. Waldorf used MSB/LSB for some things (namely bank switching) as well.

All comes down to the manufacturer really.

1 Like

Well, the way I read it, NRPN should be the norm for device/instrument parameters, not regular CCs.
I know some manufacturers used it to some extent, but in the grand scheme of things, I still think it is accurate to say it didn’t catch on. I don’t know a single MIDI sequencer (or DAW) that can handle MSB/LSB values properly, for example.

Hi,

Another problem is that even if a company did use 14-bit controllers, MIDI 1.0 transmits data serially at a rate of 31,250 bits per second (baud), and each MIDI message consists of 8-bit bytes.

So you could overload the system quickly.

2 Likes

Perhaps. Depending on how many devices you would have on a single port.

A MIDI CC message is 3 bytes in total. That means on a single MIDI port, you can send at least 1,000 MIDI CC messages per second. A full NRPN message comprises 4 MIDI CC messages, so that gives you 250+ NRPN messages per second. Not bad.
In practice it could be even more if you’re sending multiple values for the same parameter: the spec states that you don’t need to retransmit the Parameter Number for subsequent values to the same parameter, and you don’t need to transmit the MSB if it isn’t needed.
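The arithmetic behind those figures, assuming the usual 10 bits on the wire per MIDI byte (a start bit, 8 data bits, and a stop bit), works out like this:

```python
# Back-of-the-envelope MIDI 1.0 (DIN) throughput.
BAUD = 31250        # bits per second on the wire
BITS_PER_BYTE = 10  # start bit + 8 data bits + stop bit

bytes_per_second = BAUD / BITS_PER_BYTE           # 3125 bytes/s
cc_messages_per_second = bytes_per_second / 3     # a CC message is 3 bytes
nrpn_messages_per_second = bytes_per_second / 12  # full NRPN = 4 CC messages

print(bytes_per_second)         # 3125.0
print(cc_messages_per_second)   # ~1041.7, i.e. "at least 1,000"
print(nrpn_messages_per_second) # ~260.4, i.e. "250+"
```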

As a programmer who had a career in tight, embedded systems, this would be a huge consideration.

Another point is that with MSB/LSB, the 14-bit values are basically optional, since the MSB does all the heavy lifting, value-wise. Add to that the fact that most folks wouldn’t really notice the difference with a high-resolution value, especially given that sampling was basically new at the time. Had MIDI 1.0 required those to be transmitted as a single 14-bit datum, it would have forced the issue. A trade-off was wisely made.

3 Likes

Not just perhaps - it was a real and widely reported problem back then and we had to manage our MIDI chains very carefully.

MIDI was originally widely used with hardware sequencers that had a single MIDI input and output, and we had to string multiple MIDI devices in series using their MIDI THRU ports, with MIDI channels effectively serving as the destination address in such chains.

So my first MIDI network was built around my first sequencer, the Roland MSQ-700, driving a Juno-60, a DX7, a Juno-106, and a TR-707.

All on a single chain via liberal use of the “MIDI THRU” sockets on each keyboard, and each device listening on a different MIDI channel.

And it was very easy to overload the chain, such that MIDI latency became a problem. i.e. the last device might play its notes audibly too late.

While I never did that myself, that era also saw the arrival of “one-man bands”, using not audio backing tracks but lots of MIDI devices and sequencers to get the full band sound. And when they went into mod-wheel-heavy “guitar” solos, you could frequently hear the MIDI latency in their “band”.

So MIDI latency was a very big deal, and you’d be careful about using the mod and pitch bend wheels, since they sent lots of data.

And even with the Atari 1040 soon becoming my sequencing platform, we still only had one MIDI OUT port that had to handle all of the traffic for all of the target devices.

Only once we got multiple MIDI ports coming out of the computer via the parallel or serial port (many of us used MOTUs first generation MIDI Time Piece) did we get some MIDI congestion relief.

But once the Roland MT-32 arrived, being one of the first highly popular multi instrument sound sources, we started having to worry about MIDI congestion again. And it only got worse with later devices like my JV-1080, since it could handle 16 instruments.

So yes, 14 bit was the friend of avoiding zipper noises, but the enemy of MIDI latency.


p.s. I’m still not enamoured with 14-bit MIDI, since a competent MIDI target that creates audio should be smart enough to avoid zipper noise via intelligent programming. MIDI is just an instruction to a target; it shouldn’t be mistaken for a literal jump instruction. And our human ears can’t tell apart more than 127 different levels or strengths of many of the audio results being controlled by MIDI.

So I consider that high resolution movement for MIDI more of a tech-bro fetish designed to sell more gear than something that is universally necessary.

To use an analogy: When the captain of a ship instructs the person at the wheel to steer the ship on a course 90 degrees different, it’s not instantaneous, but gradual. The instruction is interpreted as “get me to the target as fast as can be done without negative side effects”.

So I’ve long considered MIDI zipper noise to be a problem with the receiver of the MIDI instruction, not with the protocol.

MIDI destinations should (and most modern good hardware and software already does that) take the instructions in such a way: “Get me to the target as fast as possible without creating undesirable side effects.”

3 Likes

Great trip down memory lane there!
I do remember latency issues with long MIDI chains. (My first sequencer was an Atari 1040 running Pro24.) The root of the problem was in fact, as you seem to have observed yourself, too many devices on a single port and potentially the use of overly long MIDI cables. The solution was to use MIDI interfaces with multiple MIDI Out ports. (Yes, those existed even back in the day for the Atari.)

I have never seen that done successfully. It requires a time factor being applied to the value change. A fixed time factor would just introduce another type of issue, and a variable time factor would have to be manipulated depending on the difference between two received values and the time between them. (That last type of implementation has been used where the “slope” time is set with a CC message.)
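For what it’s worth, the simplest form of that idea is a fixed-rate slew limiter running at the receiver; as noted above, a fixed rate brings its own issues, so treat this as an illustrative sketch only (names are hypothetical):

```python
def slew(current: float, target: float, max_step: float) -> float:
    """Move `current` toward `target` by at most `max_step` per tick."""
    delta = target - current
    if abs(delta) <= max_step:
        return target
    return current + max_step * (1 if delta > 0 else -1)

# A received CC jump from 0 to 100 becomes several control-rate ticks,
# smoothing out what would otherwise be one audible "zipper" step.
value = 0.0
steps = []
while value != 100.0:
    value = slew(value, 100.0, 15.0)
    steps.append(value)
print(steps)  # [15.0, 30.0, 45.0, 60.0, 75.0, 90.0, 100.0]
```

A variable-rate version would scale `max_step` from the size of the jump and the time since the last received value, which is exactly the manipulation described above.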

Not sure exactly what you’re referring to, but whatever it is, it’s beside the point. Stepping artifacts are very much real with certain parameters controlled by a 7-bit value.

Lol!

I’m skeptical. Do you have any examples of devices, software or hardware, that employ such magical algorithms?

An important subject.

I know hardware synthesisers still exist, as does outboard equipment in general, but it’s the implementation at the time that we are talking about, and it’s up to manufacturers, or else developers/programmers, to implement the specification in a way they see fit and which suits their customers.

I only use VST3 instruments (and effects), so my automation requirements are met by the DAW.

What I am waiting for is for SB to implement MIDI 2.0 controllers as automation, so that they can be edited graphically, as in the Key Editor, rather than only in the project page.

Until C14, Steinberg kept MIDI and VST automation nicely separated. However, with the new Pattern Editor they combined both in the same editor for the first time. This seems to have passed unnoticed by most people, but for me it was quite jaw-dropping.
Before that, I’d have replied to your request with “ain’t going to happen”. Now I am actually excited to see if they are going to expand on this.

Over DIN, the format is 8-n-1, which results in 10 bits per byte (start bit, 8 bits of data, 1 bit to stop, no parity bits).

It takes just under 1 ms to transmit something like a Note On message (30 bits on DIN), assuming no other messages are happening on any of the other 16 channels on that cable. Things like system real-time messages, for example, have to be worked into that stream.

Over USB, each individual MIDI 1.0 message ends up as a 32-bit USB packet. In theory, USB could run at its maximum speed, but in reality, most USB MIDI 1.0 devices constrain themselves to the 31,250 speed due to other attached devices, etc.
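Those DIN timing figures can be checked with a quick calculation, assuming the 8-N-1 framing described above:

```python
# Transmission time of a 3-byte Note On message over DIN MIDI.
BAUD = 31250        # bits per second on the wire
BITS_PER_BYTE = 10  # 8-N-1: start bit + 8 data bits + stop bit

note_on_bits = 3 * BITS_PER_BYTE  # status + note number + velocity = 30 bits
seconds = note_on_bits / BAUD
print(f"{seconds * 1000:.2f} ms")  # 0.96 ms, i.e. "just under 1 ms"
```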

Pete

3 Likes

It will be nothing short of amazing, I am sure, and will bring the Steinberg DAW back to its roots, MIDI, but now combined with its most powerful VST and audio capabilities, with ARA for example. Add graphical (and hopefully list-based) editing of automation curves, and stem creation just got a whole lot easier.

14-bit values are commonly used for one thing: pitch bend. Pitch is very different from most other parameters of audio signals.

If you have a 7-bit representation, then there’s about a 1% change between successive values. With a 14-bit representation there’s a 0.006% change between successive values. Most people will not perceive the difference between a 1% change in two velocity levels, attack time, decay time, LFO rate, etc. However, the ear is much more sensitive to changes in pitch, and 1% is noticeable, so there is a benefit to 14 bits, and this has been the standard way of representing pitch bend for 40 years.
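A quick back-of-the-envelope check of those step sizes:

```python
# Relative change between successive values at each resolution.
step_7bit = 1 / 127     # ~0.8%, i.e. roughly the "about 1%" above
step_14bit = 1 / 16383  # ~0.006%

print(f"7-bit step:  {step_7bit:.4%}")
print(f"14-bit step: {step_14bit:.4%}")
```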

I guess for other controllers, in general users haven’t needed or demanded it, but the users of more unconventional instruments have wanted the extra resolution, which led to the development of MPE and hence MIDI 2.0.

From a software point of view, there is a bit of a problem with the 14-bit representation, in that one value is spread over two separate MIDI messages. So if your device responds to CC7 volume as well as the LSB value on CC39, then if you show the CC7 lane in the DAW, what does it show? Does it show the actual CC7 value, or does it combine it behind the scenes with the LSB value?

1 Like

Although I find the discussion about 7 vs 14 bit values interesting, I think the main point of my original post was poorly conveyed by me.
The interesting part of the spec, I found, was this:

A manufacturer wishing to control a number of device-specific parameters over MIDI should use nonregistered parameter numbers and the Data Entry controllers (Data Entry Slider, Increment, and Decrement messages) as opposed to a large number of controllers. This alleviates possible conflict with devices responding to the same control numbers unpredictably.

I know I focused on 14-bit values, but as @SuperG pointed out, NRPN values don’t have to use the LSB part and can, as such, stay in the 7-bit realm.

That’s a good question and perhaps one of the reasons CC32-63 was never(?) implemented according to the specs.
RPN and NRPN could be represented on a single lane, however, couldn’t they?

Yes, quite possibly, in the same way that the DAW hides the two separate parts of MIDI pitch bend messages in a single ‘pitch bend’ lane. It’s just arguably a bit weird if you have a CC7 lane that actually edits CC39 too. If you have a lane called ‘MIDI Volume’ or ‘MIDI Volume (CC7+39)’ then that’s less surprising because the DAW has abstracted the individual events into a single controllable value.

1 Like

Thanks Paul, but I was specifically talking about RPN/NRPN messages: those that carry a Parameter Number, via CC98 and CC99 for the LSB and MSB respectively in the case of NRPN.
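For reference, a full NRPN write is just four Control Change messages in sequence: CC #99 and CC #98 select the parameter (MSB then LSB), and CC #6/#38 (Data Entry MSB/LSB) carry the 14-bit value. A sketch of that sequence (the function name is mine):

```python
def nrpn_messages(channel: int, param: int, value: int) -> list[tuple[int, int, int]]:
    """Return the four (status, cc, data) byte triples for one NRPN write."""
    assert 0 <= channel <= 15 and 0 <= param <= 16383 and 0 <= value <= 16383
    status = 0xB0 | channel  # Control Change status on the given channel
    return [
        (status, 99, (param >> 7) & 0x7F),  # NRPN parameter number MSB
        (status, 98, param & 0x7F),         # NRPN parameter number LSB
        (status, 6,  (value >> 7) & 0x7F),  # Data Entry MSB
        (status, 38, value & 0x7F),         # Data Entry LSB
    ]

# Write value 10000 to (hypothetical) NRPN parameter 1234 on channel 1:
for msg in nrpn_messages(0, 1234, 10000):
    print([hex(b) for b in msg])
```

As discussed earlier in the thread, subsequent values to the same parameter can skip the two parameter-number messages, and the Data Entry LSB can be skipped when 7-bit resolution is enough.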

Cubase can do sysex, 14-bit relative MSB/LSB, and/or set/enable RPN/NRPN+CC+disable from a single lane, or at least it once could. (I haven’t tested how much of this legacy support still works properly in Cubase 14, but AFAIK it’s all still there.)

MIDI Device Panels have been around in Cubase for ages. I did it on Atari Cubase versions, and it was present in Cubase 7 when I finally got into the PC realm.

The trick is to use a MIDI Device Panel to build special ‘automation lanes’ for Cubase. There’s an extra supplemental manual with more details on building ‘MIDI device panels’.

They show up as fresh new automation lanes in the main Arrange View.
In this case I was just using a simple device panel to build sysex controls for setting a few essentials when building “General MIDI 2” compliant files (set GM2 mode, pick and set up the two AUX-BUS-style effects).

These MIDI device lanes act like VST automation lanes for the most part. I can enable/disable them for read/write operations, draw in events, bind remote controls to them, etc.

In this case, I’m just sending a few one-time messages at the start of a sequence, but such panels can indeed have “faders/pots” for “continuous”-style controllers: sysex, RPN, NRPN, 14-bit MSB/LSB stuff, and probably even more if your device does something non-standard.

It also includes some proprietary bits for auto-calculation of stuff like Roland-style checksums in real time, too.

Very cool! I did not know that. We were just talking about MIDI Devices and Device Panels in another thread just the other day, but I never dove deep into them.
Thank you for sharing!

PS. I just gave it a shot and it seems to work as expected with NRPN Parameters!

EDIT:
Setting up and controlling exotic and/or device-specific MIDI messages with MIDI Devices/Panels is probably a much better idea than trying to massage them into the Key Editor controller lanes. Hats off to Steinberg!

1 Like