Kevin Daoust

From MIDI to MIDI 2.0: A 40-Year Journey Towards AI-Powered Music

September 26, 2023

If you hang around music makers long enough, you’ll eventually hear the term MIDI (short for Musical Instrument Digital Interface). MIDI was revolutionary when it came out, and it has enabled musicians to connect digital instruments for 40 years (at the time of writing). While still very much useful, an update has allowed MIDI to adapt to some interesting emerging technologies. It also opens the door to further possibilities for modern MIDI makers/generators and a new generation of AI instruments. I’ll get into this a bit more below.

What is MIDI exactly?


MIDI was created in the early 1980s when there was a need for a standardized way for electronic music devices (keyboards, sound modules, etc.) to talk and synchronize with each other. One cool aspect of MIDI is that no one owns it. Instead, several organizations came together to create and maintain the standard.

At its core, MIDI is essentially a computer language that lets digital music devices talk to each other and work together to make music. These devices could include a Yamaha DX7 keyboard, a Nord Rack (which generates sound but has no keyboard of its own), digital audio workstations (or DAWs, like Logic Pro), MIDI controllers (such as an M-Audio Keystation or the newer Roland A-88 MKII) and software instruments (such as Native Instruments Kontakt or any of the instruments in Arturia’s V Collection).

MIDI is not audio; MIDI itself does not make a sound. Instead, it is a set of instructions telling a device that does make sound (such as the DX7 or Kontakt): “Play this pitch, at this time, at this volume, for this long.”
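To make that concrete, here’s a minimal sketch in Python of the raw bytes a keyboard sends when you press and release middle C. No particular MIDI library is assumed; these are just the three-byte messages defined by MIDI 1.0.

```python
# Note On and Note Off: the two messages behind "play this pitch at this
# volume for this long". Each is three bytes: status, note number, velocity.

NOTE_ON  = 0x90  # status byte: Note On, channel 1
NOTE_OFF = 0x80  # status byte: Note Off, channel 1

middle_c = 60    # MIDI note number for middle C
velocity = 100   # how hard the key was struck, 0-127 in MIDI 1.0

note_on_msg  = bytes([NOTE_ON, middle_c, velocity])
note_off_msg = bytes([NOTE_OFF, middle_c, 0])

print(note_on_msg.hex(" "))   # 90 3c 64 -> start the note
print(note_off_msg.hex(" "))  # 80 3c 00 -> end it ("for this long")
```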

This MIDI data can be recorded into a DAW through a controller (such as the M-Audio Keystation) or an instrument that has a keyboard (such as the DX7). The DAW can then either play the recorded part back through that device’s built-in sounds (if it has any) or route it to any other virtual instrument that can generate sound from MIDI. This allows a user to suddenly have a virtual symphony or rock band in their room, without having to purchase several different instruments.

An upgrade after 40 years

MIDI 1.0 has been the standard for a long time, and with the arrival of MIDI 2.0, it’s important to note that MIDI 1.0 is not being replaced. What we now have is a series of upgrades to the protocol that improve connectivity between devices and give the performer a deeper level of expression.

With many MIDI-enabled instruments and much hardware still in use today, it would be a shame for it all to be rendered obsolete by this change. Luckily, MIDI 1.0 is not going anywhere: if two devices cannot communicate via MIDI 2.0, they will automatically fall back to MIDI 1.0.

Connections made easy


With MIDI 1.0, the flow of data only went one way, from the MIDI Out side to the MIDI In side. So if you wanted to use your DX7 with Logic Pro, you would need to connect the MIDI Out of the DX7 to the MIDI In of your audio interface (a Focusrite Clarett, say).

However, if you wanted Logic Pro to send information back to the DX7 (perhaps to use its built-in sounds), you needed a separate connection from the MIDI Out of the Clarett to the MIDI In of the DX7. These setups got more complicated as more devices were involved, requiring multiple cables as well as a bevy of configuration steps on each device to get everything working the way you wanted.

MIDI 2.0 solves that issue through a two-way stream of data over a single connection. The protocol allows for different transport options (such as Cat5, Thunderbolt, Bluetooth, etc.), though USB will likely be the most common. One advantage is that certain devices (like the M-Audio Keystation) can be bus-powered, negating the need for a wall plug.

A dialogue between devices


The two-way stream of data also allows devices to share information about each other, through a protocol called MIDI Capability Inquiry (MIDI-CI). It lets devices check whether they can communicate in MIDI 2.0, exchange information about their capabilities and features, and even auto-configure themselves to a common set of parameters.
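To give a feel for what this dialogue looks like on the wire, here’s a rough Python sketch of the Universal System Exclusive framing that a MIDI-CI Discovery message rides on. The payload fields (MUIDs, device identity, supported categories) are left out, so treat this as an outline of the envelope rather than a spec-complete message.

```python
# MIDI-CI travels inside Universal (non-realtime) System Exclusive messages.
# The constants below follow the MIDI-CI framing; the payload is omitted.

SYSEX_START   = 0xF0  # start of a System Exclusive message
UNIVERSAL_NRT = 0x7E  # universal non-realtime SysEx ID
WHOLE_PORT    = 0x7F  # device ID: address the whole MIDI port
MIDI_CI       = 0x0D  # sub-ID #1: this SysEx carries MIDI-CI
DISCOVERY     = 0x70  # sub-ID #2: Discovery ("who is out there, what can you do?")
SYSEX_END     = 0xF7  # end of the SysEx message

payload = b""  # MUIDs, manufacturer/model identity, capabilities... omitted here

discovery_msg = (bytes([SYSEX_START, UNIVERSAL_NRT, WHOLE_PORT, MIDI_CI, DISCOVERY])
                 + payload
                 + bytes([SYSEX_END]))

print(discovery_msg.hex(" "))  # f0 7e 7f 0d 70 f7 (envelope only)
```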

For example, let’s say you’re using a controller that has a keyboard and a set of sliders, connected to your computer, and you open a drawbar organ virtual instrument. If both the controller and the software are MIDI 2.0 compliant, and the controller exposes the right controls, the virtual instrument can configure the controller automatically. The drawbars get assigned to the sliders without the player needing to set up anything, as they would have had to do with MIDI 1.0.

More steps for more expression

Due to the technology available at the time of its creation, MIDI 1.0 offered what appeared to be a wide range of values across its various controls, though its resolution was only 7 bits. This limited any parameter to 128 discrete values (0 to 127).

For example, depending on how hard you hit a key on a keyboard, the velocity (or volume) would be assigned a value between 0 and 127, 0 being the quietest a note would sound and 127 the loudest. While this may have been fine for certain parameters, it had limitations with controls such as pitch bending. (MIDI 1.0 pitch bend could technically use 14 bits, but many devices only honored the coarse 7-bit portion.) Depending on the instrument, you could hear hard stops between the discrete values instead of a smooth glissando, resulting in a zipper effect.

This feels different from an old analogue synthesizer, where an analogue potentiometer gives you an effectively continuous range between the lowest and highest possible values.
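Here’s a quick Python illustration of where the zipper effect comes from, assuming a parameter normalized to a smooth 0.0–1.0 sweep:

```python
# Quantizing a smooth sweep to MIDI 1.0's 7-bit range: no matter how finely
# the player moves, only 128 distinct positions survive.

def to_7bit(x: float) -> int:
    """Quantize a normalized 0.0-1.0 value to MIDI 1.0's 7-bit range."""
    return round(x * 127)

sweep = [i / 1000 for i in range(1001)]   # 1001 points of a near-continuous sweep
distinct = {to_7bit(x) for x in sweep}
print(len(distinct))  # 128 -- every in-between value collapses onto these steps
```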

MIDI 2.0 improves on this with a jump in resolution from 7 bits to as many as 32 (velocity moves to 16 bits; controllers and pitch bend to 32 bits). Instead of 128 steps, a control can now take billions of distinct values. This brings an almost “analogue” feel, providing the performer with a much larger range of expression.
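The MIDI 2.0 specification defines its own rules for translating values between the old and new ranges; the simple proportional mapping below is only a sketch of the jump in granularity, not the spec’s translation algorithm.

```python
# Comparing the old 7-bit grid with a 32-bit grid via a naive proportional map.

def scale_7_to_32(v7: int) -> int:
    """Proportionally map a 7-bit value onto the full 32-bit range."""
    return v7 * 0xFFFFFFFF // 127  # 0 -> 0, 127 -> 4,294,967,295

print(2 ** 7)    # 128 values in MIDI 1.0
print(2 ** 32)   # 4,294,967,296 values at 32-bit resolution
print(scale_7_to_32(64) - scale_7_to_32(63))  # ~33.8 million fine steps between two old ones
```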

Future AI applications

With MIDI-CI’s information-sharing abilities, MIDI makers and generators (which are increasingly commonplace online) now have access to far more information about the instrument they’re creating a part for.

Imagine a scenario where you have an AI-based MIDI generator paired with a MIDI 2.0-compliant software instrument within Logic Pro. You want it to generate a part for a drawbar organ. You load up a drawbar organ patch on the software instrument, which automatically feeds information to the generator through MIDI-CI, such as the parameters for the rotary speaker speed and the various drawbars.

The user then tells the MIDI generator what they want it to generate: an organ part that would make Booker T. Jones proud. The generator would produce not only the chords and the rhythm but also when to speed up and slow down the rotary speaker, when to move the drawbars, and so on.

With this information, the MIDI generator creates the part with notes, rhythm and actions for the rotary speaker and drawbars. This information is then inserted into a MIDI track in Logic Pro, where the end user can see the moves made and make further edits if desired.
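In code, that flow might look something like the sketch below. Every name in it (exposed_params, generate_organ_part, the controller numbers) is invented for illustration; this is not any real product’s API.

```python
# Hypothetical sketch: the generator receives the parameter map the organ
# exposed via MIDI-CI, then emits timed note events plus controller moves.

exposed_params = {        # what the organ patch might report via MIDI-CI
    "rotary_speed": 70,   # controller numbers here are made up for the example
    "drawbar_1": 71,
    "drawbar_2": 72,
}

def generate_organ_part(params: dict) -> list[dict]:
    """Return timed MIDI-style events: notes plus controller moves."""
    return [
        {"t": 0.0, "type": "note_on",  "note": 60, "velocity": 100},
        {"t": 0.5, "type": "cc", "cc": params["rotary_speed"], "value": 127},
        {"t": 1.0, "type": "note_off", "note": 60, "velocity": 0},
    ]

for event in generate_organ_part(exposed_params):
    print(event)  # a DAW would place these on a MIDI track for further editing
```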

And given MIDI 2.0’s 32-bit resolution, the MIDI generator could even render the subtlest moves a human player would make, going far beyond the original protocol’s 128 steps.

An eventual arrival


MIDI 2.0 has slowly been making its way into everyday music creation. Controllers such as the Roland A-88 MKII are now available, and support is gradually arriving in various DAWs (Logic Pro already supports it, with other programs likely to follow suit). Experts seem to agree that this will eventually become the standard, so it’s only a matter of time before it becomes fully integrated into modern music-making.



Kevin Daoust - Guitarist, Guitar Educator, Writer

Kevin Daoust is a guitarist, guitar educator, and writer based in Gatineau, Quebec, Canada.

When not tracking guitars for artists around the world, or writing music-related articles around the internet, he can be seen on stage with Accordion-Funk legends Hey, Wow, the acoustic duo Chanté et Kev, the funky Sh-Boom, and as a hired-gun guitarist around Quebec and Ontario. He holds a Bachelor of Music in Guitar Performance from Carleton University in Ottawa, Ontario, Canada.

© Staccato AI Inc. 2023