I have found that when playing some 5.1 audio (in particular, an AIFF file with 5.1 discrete surround channels), the channel mixer module "simple.c" is called (SOURCES_simple_channel_mixer in modules/audio_filter/channel_mixer).

This is a wonderful thing. For research purposes, I have made some, aah, peculiar adaptations to the downmix algorithm, and VLC would seem such a wonderful place to do this research.

The problem: when I play a 5.1 movie file with AC3 audio, this downmix is no longer invoked in the processing chain. If I trace it through a bit, I see the audio data read in through a52.c as opaque data at that point, and then a52float.c invoked with 2 input and 2 output channels. The "simple.c" downmix algorithm is never reached. My best guess is that the downmixing is happening inside the a52 library itself.

I could take the approach of hunting down every place in VLC where a surround downmix is performed, but given the above, that does not seem like an efficient approach.

Is there a way to force VLC to behave like this: "Whenever the data has >2 discrete channels to output from, always run it through the 'simple.c' downmix module"?

Thanks,

Walt Horat
---
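
P.S. For context, what I am tinkering with is the per-sample mixing loop. Here is only a rough sketch of a conventional 5.1-to-stereo fold-down, with a made-up function name and textbook coefficients; it is not the actual simple.c code, just the shape of the operation I have been adapting:

/* Hypothetical illustration only -- not the actual simple.c code.
 * Assumes interleaved 5.1 channel order: L, R, C, LFE, Ls, Rs. */
static void downmix_51_to_stereo(const float *in, float *out, int frames)
{
    const float c = 0.7071f; /* centre folded in at -3 dB    */
    const float s = 0.7071f; /* surrounds folded in at -3 dB */

    for (int i = 0; i < frames; i++, in += 6, out += 2)
    {
        out[0] = in[0] + c * in[2] + s * in[4]; /* left  = L + C + Ls */
        out[1] = in[1] + c * in[2] + s * in[5]; /* right = R + C + Rs */
    }
}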
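
As for my guess that the fold-down happens inside the a52 library: liba52's public API lets the caller request the output channel layout per frame, and the library mixes down internally. A minimal sketch of that usage, assuming liba52's documented entry points (a52_frame, a52_block, a52_samples); I have not verified exactly how VLC's a52 code drives it:

#include <a52dec/a52.h>
#include <stdint.h>

/* Sketch: decode one AC3 frame to stereo. state comes from a52_init()
 * and buf holds one whole frame (as located with a52_syncinfo()).
 * Because A52_STEREO is requested, liba52 performs the 5.1 -> 2.0
 * fold-down internally, so downstream code only ever sees 2 channels. */
static int decode_frame_stereo(a52_state_t *state, uint8_t *buf)
{
    int flags = A52_STEREO | A52_ADJUST_LEVEL;
    sample_t level = 1.0, bias = 0.0;

    if (a52_frame(state, buf, &flags, &level, bias))
        return -1;

    for (int blk = 0; blk < 6; blk++) /* 6 blocks x 256 samples each */
    {
        if (a52_block(state))
            return -1;
        sample_t *samples = a52_samples(state); /* 256 left, then 256 right */
        (void)samples; /* ... hand the stereo samples downstream ... */
    }
    return 0;
}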
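
And to pin down the behaviour I am asking about, here it is as a hypothetical selection rule in rough C (aout_FormatNbChannels is the real VLC helper; the function and the placement around it are my own invention):

#include <vlc_common.h>
#include <vlc_aout.h> /* audio_output.h in older trees */

/* Hypothetical rule, not actual VLC code: whenever more than two
 * discrete channels come out of the decoder and stereo is wanted,
 * insert simple.c instead of letting the decoder fold down. */
static bool WantSimpleMixer(const audio_sample_format_t *in,
                            const audio_sample_format_t *out)
{
    return aout_FormatNbChannels(in) > 2
        && aout_FormatNbChannels(out) == 2;
}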