Microsoft DirectX 9.0

Audiopaths and Buffers

Each DirectMusic segment plays on an audiopath that controls the flow of sounds from the performance to the synthesizer, then through DirectSound buffers where effects can be applied, and finally into the primary buffer, where the final output is mixed.

Note   The buffers referred to here are used for streaming and processing PCM data after it has left the synthesizer, and these buffers support the IDirectSoundBuffer8 interface. Another kind of buffer, represented by the IDirectMusicBuffer8 interface, is used for sequencing message data to the synthesizer. Most applications do not need access to the second kind of buffer, which is managed by the DirectMusic performance.

Applications can create standard audiopaths and then play segments on them. For example, an application could create one audiopath for playing MIDI files to a buffer with musical reverb and another for playing WAV files to a buffer with 3-D control.
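A minimal sketch of this pattern in C++ follows. It assumes a performance (pPerformance) that has already been initialized with InitAudio and two segments (pMidiSegment, pWaveSegment) that have been loaded and downloaded; the function and variable names are illustrative, and error handling is abbreviated.

    // Sketch: create two standard audiopaths and play a segment on each.
    #include <dmusici.h>

    HRESULT PlayOnStandardPaths(IDirectMusicPerformance8* pPerformance,
                                IDirectMusicSegment8*     pMidiSegment,
                                IDirectMusicSegment8*     pWaveSegment)
    {
        IDirectMusicAudioPath8* pMusicPath = NULL;  // buffer with music reverb
        IDirectMusicAudioPath8* p3DPath    = NULL;  // buffer with 3-D control

        // Standard audiopath for MIDI content: shared buffer with music reverb.
        HRESULT hr = pPerformance->CreateStandardAudioPath(
            DMUS_APATH_SHARED_STEREOPLUSREVERB, 64, TRUE, &pMusicPath);
        if (FAILED(hr)) return hr;

        // Standard audiopath for wave content: dynamic buffer with 3-D control.
        hr = pPerformance->CreateStandardAudioPath(
            DMUS_APATH_DYNAMIC_3D, 64, TRUE, &p3DPath);
        if (FAILED(hr)) { pMusicPath->Release(); return hr; }

        // Play each segment on its own audiopath.
        pPerformance->PlaySegmentEx(pMidiSegment, NULL, NULL, DMUS_SEGF_SECONDARY,
                                    0, NULL, NULL, pMusicPath);
        pPerformance->PlaySegmentEx(pWaveSegment, NULL, NULL, DMUS_SEGF_SECONDARY,
                                    0, NULL, NULL, p3DPath);

        // Release the local references; the performance holds its own.
        pMusicPath->Release();
        p3DPath->Release();
        return S_OK;
    }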

More sophisticated audiopath configurations can be authored into a segment in DirectMusic Producer. For example, a nonstandard configuration might direct parts in a segment through different DirectSound buffers to apply different effects to them.
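Assuming such a configuration has been authored into the segment, a sketch along the following lines retrieves it and builds an audiopath from it before playing the segment; the function name is illustrative and error handling is abbreviated.

    // Sketch: play a segment on an audiopath built from its embedded configuration.
    #include <dmusici.h>

    HRESULT PlayOnAuthoredPath(IDirectMusicPerformance8* pPerformance,
                               IDirectMusicSegment8*     pSegment)
    {
        IUnknown*               pConfig = NULL;
        IDirectMusicAudioPath8* pPath   = NULL;

        // Retrieve the audiopath configuration embedded in the segment, if any.
        HRESULT hr = pSegment->GetAudioPathConfig(&pConfig);
        if (FAILED(hr)) return hr;   // no embedded configuration

        // Create and activate an audiopath from that configuration.
        hr = pPerformance->CreateAudioPath(pConfig, TRUE, &pPath);
        pConfig->Release();
        if (FAILED(hr)) return hr;

        // Play the segment on the audiopath authored for it.
        hr = pPerformance->PlaySegmentEx(pSegment, NULL, NULL, DMUS_SEGF_SECONDARY,
                                         0, NULL, NULL, pPath);
        pPath->Release();
        return hr;
    }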

An audiopath can be seen as a chain of objects through which data is streamed. An application can gain access to any of these objects. For example, you might retrieve a buffer object to set 3-D properties of a sound source, or an effect DMO (DirectX Media Object) to change the parameters of the effect.
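As a sketch of this, the following function retrieves the 3-D interface of the buffer in an audiopath created with DMUS_APATH_DYNAMIC_3D and repositions the sound source. The names are illustrative and error handling is abbreviated; an effect DMO at the DMUS_PATH_BUFFER_DMO stage can be retrieved the same way by requesting the appropriate effect interface.

    // Sketch: retrieve the 3-D buffer interface from an audiopath stage.
    #include <dmusici.h>
    #include <dsound.h>

    HRESULT MoveSoundSource(IDirectMusicAudioPath8* p3DPath)
    {
        IDirectSound3DBuffer8* p3DBuffer = NULL;
        HRESULT hr = p3DPath->GetObjectInPath(
            DMUS_PCHANNEL_ALL,          // all performance channels
            DMUS_PATH_BUFFER,           // stage: the DirectSound buffer
            0,                          // first buffer in the chain
            GUID_NULL,                  // object class not used at this stage
            0,                          // first object in the stage
            IID_IDirectSound3DBuffer,   // interface to retrieve
            (void**)&p3DBuffer);
        if (SUCCEEDED(hr))
        {
            // Move the sound source to a new position in 3-D space.
            p3DBuffer->SetPosition(2.0f, 0.0f, 1.0f, DS3D_IMMEDIATE);
            p3DBuffer->Release();
        }
        return hr;
    }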
