MIDI Start Bit & Stop Bit


Solved by MikeRobinson


Hello,

I'm trying to understand more about MIDI.

I learned that MIDI is an asynchronous transmission, so even several notes played at exactly the same time are actually sent serially, one after another.

A start bit and a stop bit are sent with every data byte. Are these start bits and stop bits the reason the MIDI receiving end – a DAW, for example – can know when the MIDI events originally happened? Are MIDI Clock messages involved as well?


  • Solution

I detect a "conflation" of several unrelated ideas here. Let me try to sort it all out ...

 

For a discussion of the MIDI data format and some of its history, consider: https://cecm.indiana.edu/etext/MIDI/chapter3_MIDI3.shtml

 

MIDI was originally a serial data transmission sent at a fixed 31,250 bits per second over a 5-pin DIN current-loop cable – a cheap, sturdy connection that could be counted on between the recording room and the control room. In classic serial communication – which is rarely used directly anymore – the actual transmission is, of course, a stream of bits, not bytes. Therefore, the convention was to precede each byte with a throw-away "start bit" and to end it with a "stop bit"; MIDI uses the common 8-N-1 framing of one start bit, eight data bits, and one stop bit, so each byte occupies ten bit-times on the wire. (This dates all the way back to the earliest electromechanical Teletype machines, which used synchronous motors and relays.) Thankfully, devices and software do not have to be concerned with this anymore: these days we have a nifty hardware device called a UART – "Universal Asynchronous Receiver/Transmitter" – which deals with the bits and very nicely hands us bytes. Note that the start and stop bits only delimit bytes on the wire; they carry no musical timing information.
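To make the framing concrete, here is a minimal sketch (in Python, purely for illustration) of what the UART does with one MIDI byte under the standard 8-N-1 convention described above – a start bit, eight data bits sent least-significant-bit first, then a stop bit:

```python
# Sketch: 8-N-1 framing of one MIDI byte, as a UART would put it on the wire.
# The start/stop bits exist only at this level; software never sees them.

def frame_byte(b: int) -> list[int]:
    """Return the 10 line-level bits: start (0), 8 data bits LSB-first, stop (1)."""
    data_bits = [(b >> i) & 1 for i in range(8)]  # LSB is transmitted first
    return [0] + data_bits + [1]

def unframe(bits: list[int]) -> int:
    """Recover the byte, assuming we are aligned on the start bit."""
    assert bits[0] == 0 and bits[9] == 1, "bad framing"
    return sum(bit << i for i, bit in enumerate(bits[1:9]))

wire = frame_byte(0x90)       # 0x90 = Note On status byte, channel 1
assert unframe(wire) == 0x90  # round-trips back to the same byte
```

Ten bit-times per byte at 31,250 baud works out to 3,125 bytes per second – which is why a dense chord still goes out one note at a time.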

 

Our MIDI stream, then, is simply a sequence of eight-bit bytes. These days it might be captured in a file, or transmitted over a wireless network connection, or via USB.
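As a quick illustration of what those bytes look like, here is one complete Note On event (the note numbers and channel here are just example values): a status byte with its high bit set, followed by two 7-bit data bytes.

```python
# Sketch: the three bytes of a single MIDI Note On event.
# Status bytes have the high bit set (0x80-0xFF); data bytes do not (0x00-0x7F).
NOTE_ON_CH1 = 0x90   # Note On status; low nibble 0 = channel 1
MIDDLE_C = 60        # note number (data byte)
VELOCITY = 100       # how hard the key was struck (data byte)

message = bytes([NOTE_ON_CH1, MIDDLE_C, VELOCITY])
assert message == b"\x90\x3c\x64"
```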

 

Although we have vastly changed the tools that we use to handle it ... and even though many devices these days still have a 5-pin DIN port, if only for nostalgia ... the MIDI data format continues to be as useful as any well-designed standard might be expected to be. ("They really got it [almost ...] right the first time.") It has survived for many decades with very little change – instead, it has expanded into other applications such as the control of theatrical lights. Like any good locomotive, it moves the freight. It's simple, flexible, easy to implement at very low cost, and it works.

 

There are various interesting aspects of such a standard. For example: if you have tapped into a byte stream at some unknown point, where any byte looks exactly like any other, how do you recognize "the start of a message"? (On the bit level, that's why we have start/stop bits.) How do you support the transmission of arbitrary, vendor-specific ("system-exclusive") messages without restricting their contents and without breaking everything else? Believe it or not, they found answers that have stood the test of time, in spite of the primitive hardware that "way back then" (koff, koff ...) they were obliged to use.

- - -

 

Meanwhile: (and switching subjects entirely here ...)

 

"MIDI Clock" messages (https://en.wikipedia.org/wiki/MIDI_timecode) are part of the (byte-level) data stream that is received by the device. The MIDI standard provided for so-called "system-exclusive" messages, capable of carrying arbitrary byte-content, and this has been leveraged here. The purpose of these messages is to communicate time so that the various devices can synchronize with one another. (This is especially important in motion-picture work ... SMPTE ... where musical events must be synchronized with the movement of film.)


Thank you Mike! That's great information.
