The Java Sound MIDI API
|"Java application programmers can now bring almost 20 years of development work into the Java environment and use it to create very rich musical experiences. "|
In my previous article, Introduction to the Java Sound API, we examined the packages in Java Sound. The javax.sound.sampled and javax.sound.midi packages provide interfaces supporting digital audio and MIDI (Musical Instrument Digital Interface) sequencing and synthesis, while the javax.sound.sampled.spi and javax.sound.midi.spi packages provide service providers with abstract interfaces to enable the installation of custom components. This article will focus on javax.sound.midi, which provides interfaces for MIDI synthesis, sequencing, and event transport.
What Is MIDI?
An audio file is a digital representation of an acoustic event and is used to control an electronic loudspeaker in order to reproduce that event as accurately as possible. A MIDI file, by contrast, is a recording of performance events that is used to control a sound-producing device such as a synthesizer or sampler. Typical MIDI events recorded in a file consist of actions on a musical keyboard, such as note number, note velocity, and the timing of notes, along with the actions of knobs and pedals, and more. Typically, MIDI files are read by programs called sequencers that are used to control synthesizers.
|"The Java Sound API implements a basic level of MIDI functionality and allows for the creation of more sophisticated services via the use of the Service Provider Interfaces. "|
The MIDI specification is split into two parts: MIDI 1.0, also known as the MIDI wire protocol, and Standard MIDI Files. The MIDI wire protocol deals with the streaming of MIDI data; the Standard MIDI File specification deals primarily with the timing of events stored in a file.
For a more complete discussion of MIDI, see the website of the MIDI Manufacturers Association.
Java Sound and the Handling of MIDI Data
The two most basic elements of MIDI data are Messages and Events:
The most basic element of MIDI in Java Sound is the class MidiMessage, an abstract class that represents a "raw" MIDI message: data that corresponds to the MIDI wire protocol and contains no timing information. Such a message can come from an input device or a file. MidiMessage is divided into three subclasses:
- ShortMessages, such as Note On and Note Off, are the most common messages and have at most two data bytes following the status byte.
- SysexMessages contain system-exclusive MIDI messages. Sysex messages are usually manufacturer exclusive and can be very large. They are used to configure devices, and transfer digital information such as sample files between devices.
- MetaMessages are part of MIDI files, but not part of the MIDI wire protocol. Meta messages contain data, such as lyrics or tempo settings, that can be used by sequencers.
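As a small illustration of the first of these subclasses, the following sketch builds a raw Note On message. The note number, velocity, and channel values here are arbitrary choices for the example:

```java
import javax.sound.midi.InvalidMidiDataException;
import javax.sound.midi.ShortMessage;

public class MessageDemo {
    public static void main(String[] args) throws InvalidMidiDataException {
        // A Note On for middle C (note 60) at velocity 93 on channel 0.
        ShortMessage noteOn = new ShortMessage();
        noteOn.setMessage(ShortMessage.NOTE_ON, 0, 60, 93);

        // The raw bytes correspond directly to the MIDI wire protocol:
        // status byte 0x90 (Note On, channel 0), then two data bytes.
        System.out.println("status=0x" + Integer.toHexString(noteOn.getStatus())
                + " length=" + noteOn.getLength());  // prints "status=0x90 length=3"
    }
}
```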
An instance of MidiEvent represents the way a MIDI event might be stored in a Standard MIDI file with its timing information. MidiEvent contains methods for setting and getting an event's timing value as well as a method to retrieve its raw MIDI message.
Java Sound organizes MIDI data into three parts:
MidiEvents are contained in Tracks and Tracks are contained in Sequences. This directly corresponds to the structure of Standard MIDI files. Sequences can be read from Standard MIDI files or created by combining tracks made up of events.
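This events-in-tracks-in-sequences structure can be sketched in a few lines. The tick resolution and note values below are arbitrary choices for the example:

```java
import javax.sound.midi.InvalidMidiDataException;
import javax.sound.midi.MidiEvent;
import javax.sound.midi.Sequence;
import javax.sound.midi.ShortMessage;
import javax.sound.midi.Track;

public class SequenceDemo {
    public static void main(String[] args) throws InvalidMidiDataException {
        // A Sequence with tempo-based timing: 480 ticks per quarter note.
        Sequence sequence = new Sequence(Sequence.PPQ, 480);
        Track track = sequence.createTrack();

        // Wrap a Note On message in a MidiEvent at tick 0, then a
        // matching Note Off one quarter note (480 ticks) later.
        ShortMessage on = new ShortMessage();
        on.setMessage(ShortMessage.NOTE_ON, 0, 60, 93);
        track.add(new MidiEvent(on, 0));

        ShortMessage off = new ShortMessage();
        off.setMessage(ShortMessage.NOTE_OFF, 0, 60, 0);
        track.add(new MidiEvent(off, 480));

        // Note: a new Track already contains an end-of-track MetaMessage,
        // so the track holds one more event than we added ourselves.
        System.out.println("events in track: " + track.size());
    }
}
```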
Java Sound and MIDI Devices
We now come to the part of the specification that addresses MIDI's original intention: the delivery of messages from one device to another. Most often, messages are delivered from a hardware device like a keyboard or a software device like a sequencer, through a MIDI interface with input and output ports to a sound producing device like a synthesizer or sampler.
The MidiDevice interface functions similarly to a hardware MIDI interface in that it can send or receive MIDI messages. The base MidiDevice contains all the functionality needed for input and output and can be used to implement a purely software-based device or it can act as an interface to hardware, such as a sound card's MIDI ports or to an external MIDI interface that connects via a serial port. MidiDevice contains an API for opening and closing devices. MidiDevice also has two subinterfaces, Synthesizer and Sequencer, used for implementing synthesizers and sequencers. There is also an inner class,
MidiDevice.Info, that provides textual information about specific devices, such as their name, vendor, and description.
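The devices installed on a system can be enumerated through the MidiSystem class, which hands back the descriptive Info objects mentioned above:

```java
import javax.sound.midi.MidiDevice;
import javax.sound.midi.MidiSystem;

public class DeviceList {
    public static void main(String[] args) {
        // MidiSystem reports every installed device -- hardware ports,
        // software sequencers, and synthesizers -- as a MidiDevice.Info.
        for (MidiDevice.Info info : MidiSystem.getMidiDeviceInfo()) {
            System.out.println(info.getName() + ": " + info.getDescription()
                    + " (" + info.getVendor() + ")");
        }
    }
}
```

The list printed depends entirely on the sound hardware and software installed on the machine running the program.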
Transmitters and Receivers
Most physical MIDI devices are capable of sending and receiving MIDI messages through their hardware MIDI in and out ports. Similarly, Java MIDI devices can send and receive MidiMessages through transmitter and receiver objects that they own. These objects are implemented through the Transmitter and Receiver interfaces within MidiDevice. Transmitters and Receivers can connect with only one device at a time at each end. Devices that can transmit to or receive from multiple devices need to use multiple instances of the transmitter and receiver objects.
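Connecting one device's transmitter to another's receiver is a single method call. The sketch below, which assumes default devices are available, patches a sequencer's output into a synthesizer:

```java
import javax.sound.midi.MidiSystem;
import javax.sound.midi.MidiUnavailableException;
import javax.sound.midi.Receiver;
import javax.sound.midi.Sequencer;
import javax.sound.midi.Synthesizer;
import javax.sound.midi.Transmitter;

public class WireDemo {
    public static void main(String[] args) {
        try {
            // Ask for the default sequencer without an automatic connection,
            // then wire its transmitter into the synthesizer's receiver.
            Sequencer sequencer = MidiSystem.getSequencer(false);
            Synthesizer synth = MidiSystem.getSynthesizer();
            sequencer.open();
            synth.open();

            Transmitter transmitter = sequencer.getTransmitter();
            Receiver receiver = synth.getReceiver();
            transmitter.setReceiver(receiver);   // one-to-one connection

            System.out.println("sequencer now feeds "
                    + synth.getDeviceInfo().getName());
            sequencer.close();
            synth.close();
        } catch (MidiUnavailableException e) {
            // No MIDI support on this system (e.g. a headless server).
            System.out.println("MIDI unavailable: " + e.getMessage());
        }
    }
}
```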
A sequencer is a device used to capture MIDI streams from another device and store them as MIDI sequences, and to play back MIDI sequences to another MIDI device. The most common container for these sequences is a Standard MIDI file. Sequencers use Receivers to capture data and Transmitters to send it. The Sequencer interface contains methods for basic MIDI sequencing. It can load a sequence from a Standard MIDI file, query and set tempo, and synchronize with other devices.
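Loading and playing a Standard MIDI file takes only a few calls. In this sketch, "song.mid" is a placeholder path; substitute any Standard MIDI file:

```java
import java.io.File;
import javax.sound.midi.MidiSystem;
import javax.sound.midi.Sequence;
import javax.sound.midi.Sequencer;

public class PlayFile {
    public static void main(String[] args) throws Exception {
        // Parse a Standard MIDI file into a Sequence.
        Sequence sequence = MidiSystem.getSequence(new File("song.mid"));

        // The default sequencer comes connected to the default synthesizer.
        Sequencer sequencer = MidiSystem.getSequencer();
        sequencer.open();
        sequencer.setSequence(sequence);
        sequencer.setTempoInBPM(120);   // query/set tempo, as described above
        sequencer.start();              // playback runs asynchronously

        // Wait for playback to finish, then release the device.
        while (sequencer.isRunning()) {
            Thread.sleep(250);
        }
        sequencer.close();
    }
}
```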
A Synthesizer is the object that the javax.sound.midi package uses to generate sound. A synthesizer controls a set of MidiChannel objects. The MIDI specification calls for 16 channels, though more or fewer can be used; in any case, a synthesizer has at least one MidiChannel object. Synthesizers that implement more than one channel are referred to as "multitimbral." An application can invoke the MidiChannel interface directly to generate sound, but it's more common to send MIDI data from a MIDI port or a sequencer to one of the synthesizer's receivers. The synthesizer then routes the incoming message to the appropriate MidiChannel object according to the channel number specified in the event.
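Invoking a MidiChannel directly looks like this. The note and velocity values are arbitrary choices for the example, and the sketch assumes a default synthesizer is available:

```java
import javax.sound.midi.MidiChannel;
import javax.sound.midi.MidiSystem;
import javax.sound.midi.MidiUnavailableException;
import javax.sound.midi.Synthesizer;

public class ChannelDemo {
    public static void main(String[] args) throws Exception {
        try {
            Synthesizer synth = MidiSystem.getSynthesizer();
            synth.open();

            // The synthesizer exposes its channels directly; a null entry
            // means that channel number is unsupported.
            MidiChannel[] channels = synth.getChannels();
            MidiChannel channel = channels[0];
            channel.noteOn(60, 93);     // middle C, velocity 93
            Thread.sleep(1000);         // let the note sound for a second
            channel.noteOff(60);

            synth.close();
        } catch (MidiUnavailableException e) {
            System.out.println("No synthesizer available: " + e.getMessage());
        }
    }
}
```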
The specific sound played by the MidiChannel object is represented by an Instrument. An Instrument contains the precise instructions on how to create the audio signal for each incoming noteOn message. In Java Sound, Instruments are organized into soundbanks, banks, and programs, with a program representing a specific instrument. A soundbank contains 128 banks and a bank contains 128 programs. The Patch object encapsulates the combination of a bank number and a program number and is used to select a specific instrument. A Soundbank object is obtained by reading a soundbank file.
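The sketch below walks that organization: it asks the synthesizer for its default soundbank (most implementations ship with one, though none is guaranteed), reads an Instrument's Patch, loads the instrument, and selects it on a channel:

```java
import javax.sound.midi.Instrument;
import javax.sound.midi.MidiSystem;
import javax.sound.midi.MidiUnavailableException;
import javax.sound.midi.Patch;
import javax.sound.midi.Soundbank;
import javax.sound.midi.Synthesizer;

public class InstrumentDemo {
    public static void main(String[] args) {
        try {
            Synthesizer synth = MidiSystem.getSynthesizer();
            synth.open();

            Soundbank bank = synth.getDefaultSoundbank();
            if (bank != null) {
                Instrument first = bank.getInstruments()[0];

                // A Patch pairs the bank number with the program number.
                Patch patch = first.getPatch();
                System.out.println(first.getName() + " at bank "
                        + patch.getBank() + ", program " + patch.getProgram());

                // Load the instrument and select it on channel 0.
                synth.loadInstrument(first);
                synth.getChannels()[0].programChange(patch.getBank(),
                        patch.getProgram());
            }
            synth.close();
        } catch (MidiUnavailableException e) {
            System.out.println("No synthesizer available: " + e.getMessage());
        }
    }
}
```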
Another important capability of a synthesizer is the number of voices it can play simultaneously. A voice refers to a single note; polyphony refers to multiple voices sounding simultaneously. A synthesizer has a maximum limit on its polyphony; in Java Sound, that limit is reported through the Synthesizer interface's getMaxPolyphony method.
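Querying that limit is a one-liner on an opened synthesizer, assuming a default synthesizer is installed:

```java
import javax.sound.midi.MidiSystem;
import javax.sound.midi.MidiUnavailableException;
import javax.sound.midi.Synthesizer;

public class PolyphonyDemo {
    public static void main(String[] args) {
        try {
            Synthesizer synth = MidiSystem.getSynthesizer();
            synth.open();
            // Maximum number of simultaneous voices the device supports.
            System.out.println("max voices: " + synth.getMaxPolyphony());
            synth.close();
        } catch (MidiUnavailableException e) {
            System.out.println("No synthesizer available: " + e.getMessage());
        }
    }
}
```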
The Java Sound API implements a basic level of MIDI functionality and allows for the creation of more sophisticated services via the use of the Service Provider Interfaces (SPI). The new services can be integrated in the same way as the existing services and can function transparently to the application.
MIDI was first introduced in 1984, as a simple protocol to allow devices to stream information to each other. Since the introduction of the Standard MIDI file portion of the specification, it has remained basically unchanged, even though application developers have found more and more ways to use it. Sun's decision to bring such a widely used and stable environment into Java is a good one. Java application programmers can now bring almost 20 years of development work into the Java environment and use it to create very rich musical experiences.
About the Author
John Maxwell Hobbs is a musician and has been working with computer multimedia for over fifteen years. He is currently head of Creative Development at Ericsson CyberLab NY. His interactive compositions "Web Phases" and "Ripple" can be found at Cinema Volta and his CDs are available on MP3.com. He is on the board of directors of Vanguard Visions, an organization dedicated to fostering the work of artists experimenting with technology. He is the former producing director for The Kitchen, in New York City.