(2017-11-20, 17:10)Martijn Wrote: No and there won't be one
The sad thing for me is that getting rid of the "raw" output option does NOT fix the "noise blast" of undecoded Dolby Digital for ALL decoders. For example, my high-end ribbon speaker system with all-analog preamp/amp equipment has a Technics SH-AC500D DD decoder added to it so I can get surround without having to give up my Carver analog-only preamp (with Sonic Holography, among other things). That 1990s-era decoder simply isn't fast enough to lock onto a Dolby Digital signal instantly, so if I leave it set to "keep audio device alive," it will still pass through a momentary speaker noise blast since the processor can't lock on fast enough to prevent it. I can set KODI to shut the audio device off instead to prevent it (the delay gives it time to lock on), but then I lose the first couple of seconds of audio with music every single time. Both options basically suck as-is. But I can't really upgrade the decoder without ditching the all-analog setup for records, etc., since "separates" decoders are largely a thing of the past.
What KODI would need to fix this problem with slower decoders is a "disable audio device only before starting a movie" option that leaves the audio device on all the time for music and non-encoded signals, but shuts it off before starting an encoded signal (i.e. a movie), so that older receivers/decoders don't play the undecoded stream. As it is, with my upstairs system, I have to choose between losing the start of songs or potentially getting a blast of noise each time. The only way to get both currently is to manually change the audio device setting every single time I switch between movies and music. In short, ditching the "shitty" code (using packaging instead of a RAW output) doesn't completely solve the problem it was supposed to fix on all systems. The noise blast is still there with the Technics decoder even with packaging.
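The requested option boils down to a simple decision rule. Here's a minimal sketch of that logic in Python; the function and format names are purely illustrative and are NOT part of any actual Kodi API:

```python
# Hypothetical sketch of the requested "disable audio device only before
# starting a movie" behavior. All names here are illustrative assumptions,
# not real Kodi code. The idea: close (and later reopen) the audio device
# only when the next stream is a bitstreamed/encoded format, giving slow
# external decoders time to lock on, while keeping the device alive for
# PCM/music so song intros aren't cut off.

ENCODED_FORMATS = {"AC3", "EAC3", "DTS", "DTSHD", "TRUEHD"}

def should_close_device_before_start(stream_format: str,
                                     option_enabled: bool = True) -> bool:
    """Return True if the audio device should be closed before playback
    of this stream begins (encoded passthrough), False if it should be
    kept alive (PCM/music)."""
    return option_enabled and stream_format.upper() in ENCODED_FORMATS
```

With a rule like this, a DTS movie would trigger a device close/reopen (avoiding the noise blast on the Technics decoder), while a FLAC or MP3 track would leave the device alive (no clipped song intros).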
Similarly, the "shitty" version doesn't cause a noise blast on all receivers (e.g. my Yamaha receiver from 10 years ago will not play a noise blast no matter what setting I use on the shitty version, because it can detect/lock the signal almost instantly, so no noise is ever played regardless). It'd be nice if the option were left up to the user instead of someone deciding what works for him (when it doesn't, as the above example shows). Instead, I will probably be forced to switch to MrMC at some point in the future for my home theater system downstairs, since it is much more desirable to have true DTS playback instead of conversion (e.g. true DTS decoding will play 6.1 discrete "ES" movies (I have a couple dozen of these, including all of the first six Star Wars movies), whereas converted Dolby Digital will reduce them to 5.1 channels on my receiver, which uses TOSLINK).
THEATER: 11.1.10 Atmos, Epson 3100 3D Projector, DaLite 92" screen, Mixed Dialog Lift - PSB Speakers; Sources: PS4, LG UP875 UHD, Nvidia Shield (KODI), ATV4K, Zidoo X9S (ZDMC), LD, GameCube