Simple efficiency.
As a serial protocol, MIDI was designed around the simple serial chips (UARTs) of the time, which would accept 8 bits of data and transmit them as a stream of bits on a single serial pin at a fixed speed. In MIDI's case that speed was 31,250 baud. A start bit and a stop bit were added to frame each byte so that all the data could travel along one wire. It was designed to be cheap and simple, and that simplicity carried over into the data format.
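As a rough illustration of what that framing costs (this arithmetic is my own, not from the spec): with 1 start bit, 8 data bits, and 1 stop bit, each byte takes 10 bit-times on the wire.

```python
# Back-of-the-envelope timing for MIDI's serial framing,
# assuming the 10-bits-per-byte framing described above.
BAUD = 31250            # MIDI bit rate, bits per second
BITS_PER_BYTE = 10      # 1 start bit + 8 data bits + 1 stop bit

bytes_per_second = BAUD / BITS_PER_BYTE
message_ms = 3 * BITS_PER_BYTE / BAUD * 1000  # a typical 3-byte message

print(bytes_per_second)  # 3125.0
print(message_ms)        # 0.96
```

So a typical 3-byte channel message takes just under a millisecond to send, which is why fast chords can smear slightly on dense MIDI streams.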
The most significant bit of each 8-bit byte is used to signal whether the byte is a command (status) byte or a data byte. So, to send a Note On message for Middle C on channel 1 with a velocity of 56, the command byte is sent first: Note On is the top 4 bits of the command, 1001 (a 1 in the most significant bit), followed by the channel identifier, 0000 for channel 1 (computers prefer to start counting from 0):
10010000, or 128 + 16 = 144
This is followed by the actual note data:
72 for Middle C, or 01001000
and then the velocity data, again in the range 0-127 with the most significant bit clear:
56 in our case
00111000. So what goes down the wire (ignoring the start, stop, and synchronization bits) is:
144, 72, 56
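The assembly above can be sketched in a few lines of Python. This is a minimal illustration, not any official MIDI library API; the function name is my own.

```python
# Sketch of assembling the Note On message walked through above.
def note_on(channel, note, velocity):
    """Return the three bytes of a MIDI Note On message.

    `channel` is 1-16 as musicians count it; on the wire it is
    stored as 0-15 in the low 4 bits of the status byte.
    """
    status = 0b1001_0000 | (channel - 1)   # 1001 = Note On, MSB set
    return [status, note & 0x7F, velocity & 0x7F]

print(note_on(1, 72, 56))  # [144, 72, 56]
```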
For the fairly underpowered microprocessors of the time inside electronic keyboards, the ability to separate a command from data simply by looking at the first bit was a winner.
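That receiver-side trick is a one-liner; here is a hedged sketch (again my own code, not a real library's) of classifying each incoming byte by its top bit:

```python
# Classify each incoming byte by testing its most significant bit.
def is_command(byte):
    return (byte & 0x80) != 0     # 0x80 = 10000000, the top bit

stream = [144, 72, 56]            # the Note On message from above
print([is_command(b) for b in stream])  # [True, False, False]
```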
As noted, the 0-127 note range covers almost any Western keyboard you'd care to mention. So the protocol's thoroughly sensible design, and its survival long after many other serial protocols have vanished into obscurity, is a great tribute to Dave Smith (http://en.wikipedia.org/wiki/Dave_Smith_(engineer)) of Sequential Circuits, who started the discussions with other manufacturers to put it all in place.
Modern music and composition would be very different without it.
Enjoy it!
wyleu