javax.sound.midi.ControllerEventListener

The ControllerEventListener interface should be implemented by classes whose instances need to be notified when a Sequencer has processed a requested type of MIDI control-change event. To register a ControllerEventListener object to receive such notifications, invoke the addControllerEventListener method of Sequencer, specifying the types of MIDI controllers about which you are interested in getting control-change notifications.
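As a sketch of this pattern (the PedalListener class and the choice of controller 64, the damper pedal, are illustrative, not part of the API), a listener can count control-change events and be registered for just the controllers it cares about. Obtaining a Sequencer can throw MidiUnavailableException on systems without one, so the registration is guarded:

```java
import javax.sound.midi.*;

// Counts control-change events for the damper pedal (controller 64).
public class PedalListener implements ControllerEventListener {
    private int count = 0;

    @Override
    public void controlChange(ShortMessage event) {
        // event.getData1() is the controller number, event.getData2() its value
        count++;
    }

    public int getCount() {
        return count;
    }

    public static void main(String[] args) {
        PedalListener listener = new PedalListener();
        try {
            Sequencer sequencer = MidiSystem.getSequencer();
            // Ask to be notified only about controller 64; the returned
            // array lists the controllers that were actually registered.
            int[] registered = sequencer.addControllerEventListener(listener, new int[] {64});
            System.out.println("registered " + registered.length + " controller(s)");
        } catch (MidiUnavailableException e) {
            System.out.println("no sequencer available: " + e.getMessage());
        }
    }
}
```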

javax.sound.midi.core

No vars found in this namespace.

javax.sound.midi.Instrument

An instrument is a sound-synthesis algorithm with certain parameter settings, usually designed to emulate a specific real-world musical instrument or to achieve a specific sort of sound effect. Instruments are typically stored in collections called soundbanks. Before the instrument can be used to play notes, it must first be loaded onto a synthesizer, and then it must be selected for use on one or more channels, via a program-change command. MIDI notes that are subsequently received on those channels will be played using the sound of the selected instrument.
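A minimal sketch of that lifecycle, assuming the default synthesizer and its default soundbank are available (either can be absent, so the code degrades gracefully): load an instrument, then select it on a channel with a program change.

```java
import javax.sound.midi.*;

public class LoadInstrumentDemo {
    public static void main(String[] args) {
        try (Synthesizer synth = MidiSystem.getSynthesizer()) {
            synth.open();
            Soundbank bank = synth.getDefaultSoundbank();
            if (bank == null || bank.getInstruments().length == 0) {
                System.out.println("no default soundbank");
                return;
            }
            // Load the first instrument onto the synthesizer...
            Instrument inst = bank.getInstruments()[0];
            synth.loadInstrument(inst);
            // ...then select it on channel 0 via a program change, using the
            // bank and program from the instrument's own Patch.
            Patch patch = inst.getPatch();
            synth.getChannels()[0].programChange(patch.getBank(), patch.getProgram());
            System.out.println("selected " + inst.getName());
        } catch (MidiUnavailableException e) {
            System.out.println("no synthesizer available");
        }
    }
}
```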

javax.sound.midi.InvalidMidiDataException

An InvalidMidiDataException indicates that inappropriate MIDI data was encountered. This often means that the data is invalid in and of itself, from the perspective of the MIDI specification. An example would be an undefined status byte. However, the exception might simply mean that the data was invalid in the context it was used, or that the object to which the data was given was unable to parse or use it. For example, a file reader might not be able to parse a Type 2 MIDI file, even though that format is defined in the MIDI specification.
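For instance, MIDI data bytes must be in the range 0–127, so constructing a message with an out-of-range data byte raises this exception (the helper method here is illustrative):

```java
import javax.sound.midi.*;

public class InvalidDataDemo {
    // Returns true if the given command/channel/data combination is rejected
    // by ShortMessage's validation with an InvalidMidiDataException.
    public static boolean rejects(int command, int channel, int data1, int data2) {
        try {
            new ShortMessage(command, channel, data1, data2);
            return false;
        } catch (InvalidMidiDataException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        // 200 is not a valid MIDI data byte (valid range is 0..127).
        System.out.println(rejects(ShortMessage.NOTE_ON, 0, 200, 64));  // prints "true"
    }
}
```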

javax.sound.midi.MetaEventListener

The MetaEventListener interface should be implemented by classes whose instances need to be notified when a Sequencer has processed a MetaMessage. To register a MetaEventListener object to receive such notifications, pass it as the argument to the addMetaEventListener method of Sequencer.
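As a sketch (the listener class is illustrative), a common use is watching for the end-of-track meta-event, type 0x2F in the Standard MIDI Files specification, which a Sequencer reports when playback of a track finishes:

```java
import javax.sound.midi.*;

// Watches for the end-of-track meta-event (type 0x2F).
public class EndOfTrackListener implements MetaEventListener {
    public static final int END_OF_TRACK = 0x2F;
    private boolean finished = false;

    @Override
    public void meta(MetaMessage message) {
        if (message.getType() == END_OF_TRACK) {
            finished = true;
        }
    }

    public boolean isFinished() {
        return finished;
    }

    public static void main(String[] args) {
        EndOfTrackListener listener = new EndOfTrackListener();
        try {
            Sequencer sequencer = MidiSystem.getSequencer();
            // Returns true if the sequencer supports meta-event notification.
            sequencer.addMetaEventListener(listener);
        } catch (MidiUnavailableException e) {
            System.out.println("no sequencer available");
        }
    }
}
```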

javax.sound.midi.MetaMessage

A MetaMessage is a MidiMessage that is not meaningful to synthesizers, but that can be stored in a MIDI file and interpreted by a sequencer program. (See the discussion in the MidiMessage class description.) The Standard MIDI Files specification defines various types of meta-events, such as sequence number, lyric, cue point, and set tempo, as well as meta-events for copyrights, tempo indications, time and key signatures, markers, and other information. For more information, see the Standard MIDI Files 1.0 specification, which is part of the Complete MIDI 1.0 Detailed Specification published by the MIDI Manufacturer's Association (http://www.midi.org).

When data is being transported using MIDI wire protocol, a ShortMessage with the status value 0xFF represents a system reset message. In MIDI files, this same status value denotes a MetaMessage. The types of meta-message are distinguished from each other by the first byte that follows the status byte 0xFF. The subsequent bytes are data bytes. As with system exclusive messages, there are an arbitrary number of data bytes, depending on the type of MetaMessage.
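To make the layout concrete, here is a sketch that builds a set-tempo meta-event (type 0x51, as defined in the Standard MIDI Files specification); its three data bytes give the tempo in microseconds per quarter note. The helper method name is our own:

```java
import javax.sound.midi.*;

public class SetTempoDemo {
    // Builds a set-tempo meta-event (type 0x51): three data bytes encode
    // the tempo in microseconds per quarter note, big-endian.
    public static MetaMessage setTempo(int microsPerQuarter) throws InvalidMidiDataException {
        byte[] data = {
            (byte) ((microsPerQuarter >> 16) & 0xFF),
            (byte) ((microsPerQuarter >> 8) & 0xFF),
            (byte) (microsPerQuarter & 0xFF)
        };
        return new MetaMessage(0x51, data, data.length);
    }

    public static void main(String[] args) throws Exception {
        MetaMessage m = setTempo(500_000);  // 500,000 us per quarter note = 120 BPM
        // The status byte of every MetaMessage is 0xFF; the type byte follows it.
        System.out.println("status=0x" + Integer.toHexString(m.getStatus())
                + " type=0x" + Integer.toHexString(m.getType()));
    }
}
```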

javax.sound.midi.MidiChannel

A MidiChannel object represents a single MIDI channel. Generally, each MidiChannel method processes a like-named MIDI "channel voice" or "channel mode" message as defined by the MIDI specification. However, MidiChannel adds some "get" methods that retrieve the value most recently set by one of the standard MIDI channel messages. Similarly, methods for per-channel solo and mute have been added.

A Synthesizer object has a collection of MidiChannels, usually one for each of the 16 channels prescribed by the MIDI 1.0 specification. The Synthesizer generates sound when its MidiChannels receive noteOn messages.

See the MIDI 1.0 Specification for more information about the prescribed behavior of the MIDI channel messages, which are not exhaustively documented here. The specification is titled MIDI Reference: The Complete MIDI 1.0 Detailed Specification, and is published by the MIDI Manufacturer's Association (http://www.midi.org).

MIDI was originally a protocol for reporting the gestures of a keyboard musician. This genesis is visible in the MidiChannel API, which preserves such MIDI concepts as key number, key velocity, and key pressure. It should be understood that the MIDI data does not necessarily originate with a keyboard player (the source could be a different kind of musician, or software). Some devices might generate constant values for velocity and pressure, regardless of how the note was performed. Also, the MIDI specification often leaves it up to the synthesizer to use the data in the way the implementor sees fit. For example, velocity data need not always be mapped to volume and/or brightness.
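A minimal sketch of those concepts, assuming the default synthesizer is available: key numbers and velocities are 7-bit MIDI data bytes (0–127), and middle C is key 60. The validity helper is our own, added for illustration:

```java
import javax.sound.midi.*;

public class NoteDemo {
    // MIDI data bytes (key numbers, velocities, pressures) are 7-bit values.
    public static boolean isValidDataByte(int value) {
        return value >= 0 && value <= 127;
    }

    public static void main(String[] args) throws Exception {
        try (Synthesizer synth = MidiSystem.getSynthesizer()) {
            synth.open();
            MidiChannel channel = synth.getChannels()[0];
            channel.noteOn(60, 93);   // key 60 = middle C, velocity 93
            Thread.sleep(500);        // let the note sound briefly
            channel.noteOff(60);
        } catch (MidiUnavailableException e) {
            System.out.println("no synthesizer available");
        }
    }
}
```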

javax.sound.midi.MidiDevice

MidiDevice is the base interface for all MIDI devices. Common devices include synthesizers, sequencers, MIDI input ports, and MIDI output ports.

A MidiDevice can be a transmitter or a receiver of MIDI events, or both. Therefore, it can provide Transmitter or Receiver instances (or both). Typically, MIDI IN ports provide transmitters, MIDI OUT ports and synthesizers provide receivers. A Sequencer typically provides transmitters for playback and receivers for recording.

A MidiDevice can be opened and closed explicitly as well as implicitly. Explicit opening is accomplished by calling open(); explicit closing is done by calling close() on the MidiDevice instance. If an application opens a MidiDevice explicitly, it has to close it explicitly to free system resources and enable the application to exit cleanly. Implicit opening is done by calling MidiSystem.getReceiver and MidiSystem.getTransmitter. The MidiDevice used by MidiSystem.getReceiver and MidiSystem.getTransmitter is implementation-dependent unless the properties javax.sound.midi.Receiver and javax.sound.midi.Transmitter are used (see the description of properties to select default providers in MidiSystem). A MidiDevice that was opened implicitly is closed implicitly by closing the Receiver or Transmitter that resulted in opening it. If the application obtained more than one implicitly opening Receiver or Transmitter, the device is closed after the last such Receiver or Transmitter has been closed. On the other hand, calling getReceiver or getTransmitter on the device instance directly does not open the device implicitly, and closing these Transmitters and Receivers does not close the device implicitly. To use a device with Receivers or Transmitters obtained this way, the device has to be opened and closed explicitly.

If implicit and explicit opening and closing are mixed on the same MidiDevice instance, the following rules apply:

After an explicit open (either before or after implicit opens), the device will not be closed by implicit closing. The only way to close an explicitly opened device is an explicit close.

An explicit close always closes the device, even if it also has been opened implicitly. A subsequent implicit close has no further effect.
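The instanceof test shown below can be combined with MidiSystem.getMidiDeviceInfo() to enumerate the installed hardware ports; a sketch (the class and method names are our own):

```java
import javax.sound.midi.*;

public class ListPortsDemo {
    // Counts installed devices that are hardware MIDI ports, i.e. devices
    // that are neither a Sequencer nor a Synthesizer.
    public static int countPorts() throws MidiUnavailableException {
        int ports = 0;
        for (MidiDevice.Info info : MidiSystem.getMidiDeviceInfo()) {
            MidiDevice device = MidiSystem.getMidiDevice(info);
            if (!(device instanceof Sequencer) && !(device instanceof Synthesizer)) {
                ports++;
            }
        }
        return ports;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("hardware MIDI ports: " + countPorts());
    }
}
```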

To detect if a MidiDevice represents a hardware MIDI port, the following programming technique can be used:

MidiDevice device = ...;
if ( ! (device instanceof Sequencer) && ! (device instanceof Synthesizer)) {
  // we're now sure that device represents a MIDI port
  // ...
}

A MidiDevice includes a MidiDevice.Info object to provide manufacturer information and so on.

javax.sound.midi.MidiDevice$Info

A MidiDevice.Info object contains assorted data about a MidiDevice, including its name, the company who created it, and descriptive text.

javax.sound.midi.MidiDeviceReceiver

MidiDeviceReceiver is a Receiver which represents a MIDI input connector of a MidiDevice (see MidiDevice.getReceiver()).

javax.sound.midi.MidiDeviceTransmitter

MidiDeviceTransmitter is a Transmitter which represents a MIDI input connector of a MidiDevice (see MidiDevice.getTransmitter()).

javax.sound.midi.MidiEvent

MIDI events contain a MIDI message and a corresponding time-stamp expressed in ticks, and can represent the MIDI event information stored in a MIDI file or a Sequence object. The duration of a tick is specified by the timing information contained in the MIDI file or Sequence object.

In Java Sound, MidiEvent objects are typically contained in a Track, and Tracks are likewise contained in a Sequence.
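A sketch of pairing a message with a tick stamp (the helper method is our own): a note-on ShortMessage wrapped in a MidiEvent at tick 480.

```java
import javax.sound.midi.*;

public class EventDemo {
    // Wraps a note-on message in a MidiEvent stamped at the given tick.
    public static MidiEvent noteOnAt(int key, int velocity, long tick)
            throws InvalidMidiDataException {
        ShortMessage msg = new ShortMessage(ShortMessage.NOTE_ON, 0, key, velocity);
        return new MidiEvent(msg, tick);
    }

    public static void main(String[] args) throws Exception {
        MidiEvent event = noteOnAt(60, 100, 480);
        // A channel-voice note-on occupies three bytes: status + two data bytes.
        System.out.println("tick=" + event.getTick()
                + " length=" + event.getMessage().getLength());
    }
}
```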

javax.sound.midi.MidiFileFormat

A MidiFileFormat object encapsulates a MIDI file's type, as well as its length and timing information.

A MidiFileFormat object can include a set of properties. A property is a pair of key and value: the key is of type String; the associated property value is an arbitrary object. Properties specify additional informational metadata (such as author or copyright). Properties are optional information, and file reader and file writer implementations are not required to provide or recognize properties.

The following table lists some common properties that should be used in implementations:

MIDI File Format Properties

  Property key   Value type   Description
  "author"       String       name of the author of this file
  "title"        String       title of this file
  "copyright"    String       copyright message
  "date"         Date         date of the recording or release
  "comment"      String       an arbitrary text
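These properties can be supplied through the Map-taking MidiFileFormat constructor and read back with getProperty; a sketch (the author and title values are invented for illustration):

```java
import javax.sound.midi.*;
import java.util.Map;

public class FileFormatDemo {
    public static MidiFileFormat makeFormat() {
        // A type 1 file with PPQ timing at 480 ticks per quarter note; the
        // byte and microsecond lengths are unknown. The map carries the
        // optional informational properties.
        Map<String, Object> props = Map.of(
                "author", "Jane Doe",
                "title", "Example Song");
        return new MidiFileFormat(1, Sequence.PPQ, 480,
                MidiFileFormat.UNKNOWN_LENGTH, MidiFileFormat.UNKNOWN_LENGTH, props);
    }

    public static void main(String[] args) {
        MidiFileFormat format = makeFormat();
        System.out.println(format.getProperty("title"));
    }
}
```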

javax.sound.midi.MidiMessage

MidiMessage is the base class for MIDI messages. They include not only the standard MIDI messages that a synthesizer can respond to, but also "meta-events" that can be used by sequencer programs. There are meta-events for such information as lyrics, copyrights, tempo indications, time and key signatures, markers, etc. For more information, see the Standard MIDI Files 1.0 specification, which is part of the Complete MIDI 1.0 Detailed Specification published by the MIDI Manufacturer's Association (http://www.midi.org).

The base MidiMessage class provides access to three types of information about a MIDI message:

The message's status byte
The total length of the message in bytes (the status byte plus any data bytes)
A byte array containing the complete message

MidiMessage includes methods to get, but not set, these values. Setting them is a subclass responsibility.

The MIDI standard expresses MIDI data in bytes. However, because Java™ uses signed bytes, the Java Sound API uses integers instead of bytes when expressing MIDI data. For example, the getStatus() method of MidiMessage returns MIDI status bytes as integers. If you are processing MIDI data that originated outside Java Sound and now is encoded as signed bytes, the bytes can be converted to integers using this conversion: int i = (int)(byte & 0xFF)

If you simply need to pass a known MIDI byte value as a method parameter, it can be expressed directly as an integer, using (for example) decimal or hexadecimal notation. For instance, to pass the "active sensing" status byte as the first argument to ShortMessage's setMessage(int) method, you can express it as 254 or 0xFE.
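The conversion in the previous paragraph can be seen in miniature with the active-sensing status byte:

```java
public class ByteConversionDemo {
    public static void main(String[] args) {
        // A status byte as it might arrive from outside Java Sound:
        // 0xFE ("active sensing") stored in a signed byte is -2.
        byte raw = (byte) 0xFE;
        // Masking with 0xFF recovers the unsigned MIDI value.
        int status = raw & 0xFF;
        System.out.println(raw + " -> " + status);  // prints "-2 -> 254"
    }
}
```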

javax.sound.midi.MidiSystem

The MidiSystem class provides access to the installed MIDI system resources, including devices such as synthesizers, sequencers, and MIDI input and output ports. A typical simple MIDI application might begin by invoking one or more MidiSystem methods to learn what devices are installed and to obtain the ones needed in that application.

The class also has methods for reading files, streams, and URLs that contain standard MIDI file data or soundbanks. You can query the MidiSystem for the format of a specified MIDI file.

You cannot instantiate a MidiSystem; all the methods are static.

Properties can be used to specify default MIDI devices. Both system properties and a properties file are considered. The sound.properties properties file is read from an implementation-specific location (typically it is the lib directory in the Java installation directory). If a property exists both as a system property and in the properties file, the system property takes precedence. If none is specified, a suitable default is chosen among the available devices. The syntax of the properties file is specified in Properties.load. The following table lists the available property keys and which methods consider them:

MIDI System Property Keys

  Property Key                  Interface     Affected Method
  javax.sound.midi.Receiver     Receiver      getReceiver()
  javax.sound.midi.Sequencer    Sequencer     getSequencer()
  javax.sound.midi.Synthesizer  Synthesizer   getSynthesizer()
  javax.sound.midi.Transmitter  Transmitter   getTransmitter()

The property value consists of the provider class name and the device name, separated by the hash mark ("#"). The provider class name is the fully qualified name of a concrete MIDI device provider class. The device name is matched against the String returned by the getName method of MidiDevice.Info. Either the class name or the device name may be omitted. If only the class name is specified, the trailing hash mark is optional.

If the provider class is specified, and it can be successfully retrieved from the installed providers, the list of MidiDevice.Info objects is retrieved from the provider. Otherwise, or when these devices do not provide a subsequent match, the list is retrieved from getMidiDeviceInfo() to contain all available MidiDevice.Info objects.

If a device name is specified, the resulting list of MidiDevice.Info objects is searched: the first one with a matching name, and whose MidiDevice implements the respective interface, will be returned. If no matching MidiDevice.Info object is found, or the device name is not specified, the first suitable device from the resulting list will be returned. For Sequencer and Synthesizer, a device is suitable if it implements the respective interface; whereas for Receiver and Transmitter, a device is suitable if it implements neither Sequencer nor Synthesizer and provides at least one Receiver or Transmitter, respectively.

For example, the property javax.sound.midi.Receiver with a value "com.sun.media.sound.MidiProvider#SunMIDI1" will have the following consequences when getReceiver is called: if the class com.sun.media.sound.MidiProvider exists in the list of installed MIDI device providers, the first Receiver device with name "SunMIDI1" will be returned. If it cannot be found, the first Receiver from that provider will be returned, regardless of name. If there is none, the first Receiver with name "SunMIDI1" in the list of all devices (as returned by getMidiDeviceInfo) will be returned, or, if not found, the first Receiver that can be found in the list of all devices is returned. If that fails, too, a MidiUnavailableException is thrown.
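The "provider#device" value format described above can be illustrated with a small parser. This helper is entirely our own, not part of the MidiSystem API; it simply mirrors the rule that a value without a hash mark names a provider class:

```java
public class PropertyValueDemo {
    // Splits a default-device property value of the form
    // "providerClassName#deviceName". Either part may be omitted; a value
    // with no hash mark is treated as a provider class name.
    public static String[] parse(String value) {
        int hash = value.indexOf('#');
        if (hash < 0) {
            return new String[] { value, "" };
        }
        return new String[] { value.substring(0, hash), value.substring(hash + 1) };
    }

    public static void main(String[] args) {
        String[] parts = parse("com.sun.media.sound.MidiProvider#SunMIDI1");
        System.out.println("provider=" + parts[0] + " device=" + parts[1]);
    }
}
```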

javax.sound.midi.MidiUnavailableException

A MidiUnavailableException is thrown when a requested MIDI component cannot be opened or created because it is unavailable. This often occurs when a device is in use by another application. More generally, it can occur when there is a finite number of a certain kind of resource that can be used for some purpose, and all of them are already in use (perhaps all by this application). For an example of the latter case, see the setReceiver method of Transmitter.

javax.sound.midi.Patch

A Patch object represents a location, on a MIDI synthesizer, into which a single instrument is stored (loaded). Every Instrument object has its own Patch object that specifies the memory location into which that instrument should be loaded. The location is specified abstractly by a bank index and a program number (not by any scheme that directly refers to a specific address or offset in RAM). This is a hierarchical indexing scheme: MIDI provides for up to 16384 banks, each of which contains up to 128 program locations. For example, a minimal sort of synthesizer might have only one bank of instruments, and only 32 instruments (programs) in that bank.

To select what instrument should play the notes on a particular MIDI channel, two kinds of MIDI message are used that specify a patch location: a bank-select command, and a program-change channel command. The Java Sound equivalent is the programChange(int, int) method of MidiChannel.
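A small sketch of the addressing scheme (the bank and program values, and the General MIDI reading of program 12 as a marimba, are illustrative):

```java
import javax.sound.midi.*;

public class PatchDemo {
    public static void main(String[] args) {
        // A patch location: bank 0, program 12 (in General MIDI, a marimba).
        Patch patch = new Patch(0, 12);
        System.out.println("bank=" + patch.getBank()
                + " program=" + patch.getProgram());
        // Selecting it on a channel would use the Java Sound equivalent of
        // bank-select plus program-change:
        //   channel.programChange(patch.getBank(), patch.getProgram());
    }
}
```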

javax.sound.midi.Receiver

A Receiver receives MidiEvent objects and typically does something useful in response, such as interpreting them to generate sound or raw MIDI output. Common MIDI receivers include synthesizers and MIDI Out ports.
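A custom Receiver needs only the send and close methods. As a minimal sketch, the anonymous implementation below (with an arbitrary note number and velocity) simply records every message it is handed:

```java
import javax.sound.midi.*;
import java.util.ArrayList;
import java.util.List;

// A minimal Receiver that collects incoming messages instead of playing them.
List<MidiMessage> received = new ArrayList<>();
Receiver logger = new Receiver() {
    public void send(MidiMessage message, long timeStamp) {
        received.add(message);
    }
    public void close() { }
};

// Hand the receiver a note-on: channel 0, note 60 (middle C), velocity 93.
ShortMessage noteOn = new ShortMessage(ShortMessage.NOTE_ON, 0, 60, 93);
logger.send(noteOn, -1);  // a time stamp of -1 means "no time stamp"
```

In practice such a Receiver would be wired to a Transmitter via setReceiver rather than called directly.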

javax.sound.midi.Sequence

A Sequence is a data structure containing musical information (often an entire song or composition) that can be played back by a Sequencer object. Specifically, the Sequence contains timing information and one or more tracks. Each track consists of a series of MIDI events (such as note-ons, note-offs, program changes, and meta-events). The sequence's timing information specifies the type of unit that is used to time-stamp the events in the sequence.

A Sequence can be created from a MIDI file by reading the file into an input stream and invoking one of the getSequence methods of MidiSystem. A sequence can also be built from scratch by adding new Tracks to an empty Sequence, and adding MidiEvent objects to these Tracks.
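Building from scratch can be sketched as follows; the resolution of 480 ticks per quarter note and the note values are arbitrary choices:

```java
import javax.sound.midi.*;

// An empty Sequence with tempo-based (PPQ) timing,
// 480 ticks per quarter note.
Sequence sequence = new Sequence(Sequence.PPQ, 480);
Track track = sequence.createTrack();

// One quarter note: note-on at tick 0, note-off at tick 480.
track.add(new MidiEvent(new ShortMessage(ShortMessage.NOTE_ON, 0, 60, 93), 0));
track.add(new MidiEvent(new ShortMessage(ShortMessage.NOTE_OFF, 0, 60, 0), 480));
```

The resulting Sequence could then be handed to a Sequencer via setSequence, or written out with MidiSystem.write.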

javax.sound.midi.Sequencer

A hardware or software device that plays back a MIDI sequence is known as a sequencer. A MIDI sequence contains lists of time-stamped MIDI data, such as might be read from a standard MIDI file. Most sequencers also provide functions for creating and editing sequences.

The Sequencer interface includes methods for the following basic MIDI sequencer operations:

obtaining a sequence from MIDI file data
starting and stopping playback
moving to an arbitrary position in the sequence
changing the tempo (speed) of playback
synchronizing playback to an internal clock or to received MIDI messages
controlling the timing of another device

In addition, the following operations are supported, either directly, or indirectly through objects that the Sequencer has access to:

editing the data by adding or deleting individual MIDI events or entire tracks
muting or soloing individual tracks in the sequence
notifying listener objects about any meta-events or control-change events encountered while playing back the sequence.

javax.sound.midi.Sequencer$SyncMode

A SyncMode object represents one of the ways in which a MIDI sequencer's notion of time can be synchronized with a master or slave device. If the sequencer is being synchronized to a master, the sequencer revises its current time in response to messages from the master. If the sequencer has a slave, the sequencer similarly sends messages to control the slave's timing.

There are three predefined modes that specify possible masters for a sequencer: INTERNAL_CLOCK, MIDI_SYNC, and MIDI_TIME_CODE. The latter two work if the sequencer receives MIDI messages from another device. In these two modes, the sequencer's time gets reset based on system real-time timing clock messages or MIDI time code (MTC) messages, respectively. These two modes can also be used as slave modes, in which case the sequencer sends the corresponding types of MIDI messages to its receiver (whether or not the sequencer is also receiving them from a master). A fourth mode, NO_SYNC, is used to indicate that the sequencer should not control its receiver's timing.

javax.sound.midi.ShortMessage

A ShortMessage contains a MIDI message that has at most two data bytes following its status byte. The types of MIDI message that satisfy this criterion are channel voice, channel mode, system common, and system real-time--in other words, everything except system exclusive and meta-events. The ShortMessage class provides methods for getting and setting the contents of the MIDI message.

A number of ShortMessage methods have integer parameters by which you specify a MIDI status or data byte. If you know the numeric value, you can express it directly. For system common and system real-time messages, you can often use the corresponding fields of ShortMessage, such as SYSTEM_RESET. For channel messages, the upper four bits of the status byte are specified by a command value and the lower four bits are specified by a MIDI channel number. To convert incoming MIDI data bytes that are in the form of Java's signed bytes, you can use the conversion code given in the MidiMessage class description.
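For example, the command/channel split can be seen by constructing a note-on message; the channel, note, and velocity values below are arbitrary:

```java
import javax.sound.midi.ShortMessage;

// A channel voice message: NOTE_ON (command 0x90) on channel 0,
// note number 60 (middle C), velocity 93.
ShortMessage msg = new ShortMessage(ShortMessage.NOTE_ON, 0, 60, 93);

int command = msg.getCommand();  // upper four bits of the status byte: 0x90
int channel = msg.getChannel();  // lower four bits of the status byte: 0
int status  = msg.getStatus();   // command | channel = 0x90
```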

javax.sound.midi.Soundbank

A Soundbank contains a set of Instruments that can be loaded into a Synthesizer. Note that a Java Sound Soundbank is different from a MIDI bank. MIDI permits up to 16384 banks, each containing up to 128 instruments (also sometimes called programs, patches, or timbres). However, a Soundbank can contain 16384 times 128 instruments, because the instruments within a Soundbank are indexed by both a MIDI program number and a MIDI bank number (via a Patch object). Thus, a Soundbank can be thought of as a collection of MIDI banks.

Soundbank includes methods that return String objects containing the sound bank's name, manufacturer, version number, and description. The precise content and format of these strings is left to the implementor.

Different synthesizers use a variety of synthesis techniques. A common one is wavetable synthesis, in which a segment of recorded sound is played back, often with looping and pitch change. The Downloadable Sound (DLS) format uses segments of recorded sound, as does the Headspace Engine. Soundbanks and Instruments that are based on wavetable synthesis (or other uses of stored sound recordings) should typically implement the getResources() method to provide access to these recorded segments. This is optional, however; the method can return a zero-length array if the synthesis technique doesn't use sampled sound (FM synthesis and physical modeling are examples of such techniques), or if it does but the implementor chooses not to make the samples accessible.

javax.sound.midi.SoundbankResource

A SoundbankResource represents any audio resource stored in a Soundbank. Common soundbank resources include:

Instruments. An instrument may be specified in a variety of ways. However, all soundbanks have some mechanism for defining instruments. In doing so, they may reference other resources stored in the soundbank. Each instrument has a Patch which specifies the MIDI program and bank by which it may be referenced in MIDI messages. Instrument information may be stored in Instrument objects.
Audio samples. A sample is typically a sampled audio waveform containing a short sound recording whose duration is a fraction of a second, or at most a few seconds. These audio samples may be used by a Synthesizer to synthesize sound in response to MIDI commands, or extracted for use by an application. (The terminology reflects musicians' use of the word "sample" to refer collectively to a series of contiguous audio samples or frames, rather than to a single, instantaneous sample.) The data class for an audio sample will be an object that encapsulates the audio sample data itself and information about how to interpret it (the format of the audio data), such as an AudioInputStream.
Embedded sequences. A sound bank may contain built-in song data stored in a data object such as a Sequence.

Synthesizers that use wavetable synthesis or related techniques play back the audio in a sample when synthesizing notes, often when emulating the real-world instrument that was originally recorded. However, there is not necessarily a one-to-one correspondence between the Instruments and samples in a Soundbank. A single Instrument can use multiple SoundbankResources (typically for notes of dissimilar pitch or brightness). Also, more than one Instrument can use the same sample.

javax.sound.midi.spi.core

No vars found in this namespace.

javax.sound.midi.spi.MidiDeviceProvider

A MidiDeviceProvider is a factory or provider for a particular type of MIDI device. This mechanism allows the implementation to determine how resources are managed in the creation and management of a device.

javax.sound.midi.spi.MidiFileReader

A MidiFileReader supplies MIDI file-reading services. Classes implementing this interface can parse the format information from one or more types of MIDI file, and can produce a Sequence object from files of these types.

javax.sound.midi.spi.MidiFileWriter

A MidiFileWriter supplies MIDI file-writing services. Classes that implement this interface can write one or more types of MIDI file from a Sequence object.

javax.sound.midi.spi.SoundbankReader

A SoundbankReader supplies soundbank file-reading services. Concrete subclasses of SoundbankReader parse a given soundbank file, producing a Soundbank object that can be loaded into a Synthesizer.

javax.sound.midi.Synthesizer

A Synthesizer generates sound. This usually happens when one of the Synthesizer's MidiChannel objects receives a noteOn message, either directly or via the Synthesizer object. Many Synthesizers support Receivers, through which MIDI events can be delivered to the Synthesizer. In such cases, the Synthesizer typically responds by sending a corresponding message to the appropriate MidiChannel, or by processing the event itself if the event isn't one of the MIDI channel messages.

The Synthesizer interface includes methods for loading and unloading instruments from soundbanks. An instrument is a specification for synthesizing a certain type of sound, whether that sound emulates a traditional instrument or is some kind of sound effect or other imaginary sound. A soundbank is a collection of instruments, organized by bank and program number (via the instrument's Patch object). Different Synthesizer classes might implement different sound-synthesis techniques, meaning that some instruments and not others might be compatible with a given synthesizer. Also, synthesizers may have a limited amount of memory for instruments, meaning that not every soundbank and instrument can be used by every synthesizer, even if the synthesis technique is compatible. To see whether the instruments from a certain soundbank can be played by a given synthesizer, invoke the isSoundbankSupported method of Synthesizer.

"Loading" an instrument means that the instrument becomes available for synthesizing notes. The instrument is loaded into the bank and program location specified by its Patch object. Loading does not necessarily mean that subsequently played notes will immediately have the sound of this newly loaded instrument. For the instrument to play notes, one of the synthesizer's MidiChannel objects must receive (or have received) a program-change message that causes that particular instrument's bank and program number to be selected.

javax.sound.midi.SysexMessage

A SysexMessage object represents a MIDI system exclusive message.

When a system exclusive message is read from a MIDI file, it always has a defined length. Data from a system exclusive message from a MIDI file should be stored in the data array of a SysexMessage as follows: the system exclusive message status byte (0xF0 or 0xF7), all message data bytes, and finally the end-of-exclusive flag (0xF7). The length reported by the SysexMessage object is therefore the length of the system exclusive data plus two: one byte for the status byte and one for the end-of-exclusive flag.

As dictated by the Standard MIDI Files specification, two status byte values are legal for a SysexMessage read from a MIDI file:

0xF0: System Exclusive message (same as in MIDI wire protocol)
0xF7: Special System Exclusive message

When Java Sound is used to handle system exclusive data that is being received using MIDI wire protocol, it should place the data in one or more SysexMessages. In this case, the length of the system exclusive data is not known in advance; the end of the system exclusive data is marked by an end-of-exclusive flag (0xF7) in the MIDI wire byte stream.

0xF0: System Exclusive message (same as in MIDI wire protocol)
0xF7: End of Exclusive (EOX)

The first SysexMessage object containing data for a particular system exclusive message should have the status value 0xF0. If this message contains all the system exclusive data for the message, it should end with the status byte 0xF7 (EOX). Otherwise, additional system exclusive data should be sent in one or more SysexMessages with a status value of 0xF7. The SysexMessage containing the last of the data for the system exclusive message should end with the value 0xF7 (EOX) to mark the end of the system exclusive message.

If system exclusive data from SysexMessages objects is being transmitted using MIDI wire protocol, only the initial 0xF0 status byte, the system exclusive data itself, and the final 0xF7 (EOX) byte should be propagated; any 0xF7 status bytes used to indicate that a SysexMessage contains continuing system exclusive data should not be propagated via MIDI wire protocol.
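The file-style layout described above (status byte, data bytes, EOX flag) can be sketched as follows; the data bytes happen to form the General MIDI "GM System On" message, used here purely as an example:

```java
import javax.sound.midi.SysexMessage;

// A complete system exclusive message as stored from a MIDI file:
// status byte 0xF0, four data bytes, end-of-exclusive flag 0xF7.
byte[] sysex = {(byte) 0xF0, 0x7E, 0x7F, 0x09, 0x01, (byte) 0xF7};
SysexMessage msg = new SysexMessage(sysex, sysex.length);

// Length is the system exclusive data plus two:
// one byte for the status byte and one for the EOX flag.
int length = msg.getLength();
```

Note that getData() returns only the bytes following the status byte, so here it yields five bytes ending in 0xF7.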

javax.sound.midi.Track

A MIDI track is an independent stream of MIDI events (time-stamped MIDI data) that can be stored along with other tracks in a standard MIDI file. The MIDI specification allows only 16 channels of MIDI data, but tracks are a way to get around this limitation. A MIDI file can contain any number of tracks, each containing its own stream of up to 16 channels of MIDI data.

A Track occupies a middle level in the hierarchy of data played by a Sequencer: sequencers play sequences, which contain tracks, which contain MIDI events. A sequencer may provide controls that mute or solo individual tracks.

The timing information and resolution for a track is controlled by and stored in the sequence containing the track. A given Track is considered to belong to the particular Sequence that maintains its timing. For this reason, a new (empty) track is created by calling the Sequence.createTrack() method, rather than by directly invoking a Track constructor.

The Track class provides methods to edit the track by adding or removing MidiEvent objects from it. These operations keep the event list in the correct time order. Methods are also included to obtain the track's size, in terms of either the number of events it contains or its duration in ticks.
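A minimal sketch of creating and editing a track (the resolution and note values are arbitrary); note that events can be added in any order:

```java
import javax.sound.midi.*;

// A Track is obtained from its owning Sequence, never constructed directly.
Sequence seq = new Sequence(Sequence.PPQ, 480);
Track track = seq.createTrack();

// A freshly created track already holds one event:
// the end-of-track meta-event, so its size starts at 1.
int initialSize = track.size();

// add keeps the event list in time order regardless of insertion order,
// so the note-off at tick 480 may be added before the note-on at tick 0.
track.add(new MidiEvent(new ShortMessage(ShortMessage.NOTE_OFF, 0, 60, 0), 480));
track.add(new MidiEvent(new ShortMessage(ShortMessage.NOTE_ON, 0, 60, 93), 0));
```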

javax.sound.midi.Transmitter

A Transmitter sends MidiEvent objects to one or more Receivers. Common MIDI transmitters include sequencers and MIDI input ports.

javax.sound.midi.VoiceStatus

A VoiceStatus object contains information about the current status of one of the voices produced by a Synthesizer.

MIDI synthesizers are generally capable of producing some maximum number of simultaneous notes, also referred to as voices. A voice is a stream of successive single notes, and the process of assigning incoming MIDI notes to specific voices is known as voice allocation. However, the voice-allocation algorithm and the contents of each voice are normally internal to a MIDI synthesizer and hidden from outside view. One can, of course, learn from MIDI messages which notes the synthesizer is playing, and one might be able to deduce something about the assignment of notes to voices. But MIDI itself does not provide a means to report which notes a synthesizer has assigned to which voice, nor even to report how many voices the synthesizer is capable of synthesizing.

In Java Sound, however, a Synthesizer class can expose the contents of its voices through its getVoiceStatus() method. This behavior is recommended but optional; synthesizers that don't expose their voice allocation simply return a zero-length array. A Synthesizer that does report its voice status should maintain this information at all times for all of its voices, whether they are currently sounding or not. In other words, a given type of Synthesizer always has a fixed number of voices, equal to the maximum number of simultaneous notes it is capable of sounding.

If the voice is not currently processing a MIDI note, it is considered inactive. A voice is inactive when it has been given no note-on commands, or when every note-on command received has been terminated by a corresponding note-off (or by an "all notes off" message). For example, this happens when a synthesizer capable of playing 16 simultaneous notes is told to play a four-note chord; only four voices are active in this case (assuming no earlier notes are still playing). Usually, a voice whose status is reported as active is producing audible sound, but this is not always true; it depends on the details of the instrument (that is, the synthesis algorithm) and how long the note has been going on. For example, a voice may be synthesizing the sound of a single hand-clap. Because this sound dies away so quickly, it may become inaudible before a note-off message is received. In such a situation, the voice is still considered active even though no sound is currently being produced.

Besides its active or inactive status, the VoiceStatus class provides fields that reveal the voice's current MIDI channel, bank and program number, MIDI note number, and MIDI volume. All of these can change during the course of a voice. While the voice is inactive, each of these fields has an unspecified value, so you should check the active field first.
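Since the fields are unspecified while a voice is inactive, code reading a VoiceStatus should test active first, as in this sketch (a directly constructed status stands in for one returned by getVoiceStatus()):

```java
import javax.sound.midi.VoiceStatus;

// VoiceStatus is a plain data holder with public fields; a voice that is
// not currently processing a note reports active == false, and its other
// fields (channel, bank, program, note, volume) are then unspecified.
VoiceStatus voice = new VoiceStatus();
boolean sounding = voice.active;  // false for a newly created status

if (sounding) {
    // Only meaningful while the voice is active.
    int note = voice.note;
}
```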
