Contributor • 32 Messages
Why is Xfinity X1 XG1v4 still not processing Dolby Digital Surround correctly?
I first posted this problem over six months ago, never received anything approaching an adequate response, and it's still unchanged. The issue is that my 4K XG1v4 cable box falsely labels all channels (live, cloud, and IP) as being in DD+ 5.1 format, even when the original source material obviously wasn't created that way. For example, I just turned on an original Star Trek episode from the 1960s. The signal the cable box sends to my receiver is supposedly DD+ 5.1, when the show itself was obviously produced and broadcast in mono. Even vintage black-and-white movies from the '30s and '40s show up as being broadcast in 5.1-channel surround. You can literally go up to the speakers and put your ear against them to confirm there is no content anywhere other than the left and right front mains (for both stereo and mono sources).
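For whoever at Xfinity ends up looking into this: you don't have to take the ear-against-the-speaker test on faith. If someone captures a few seconds of the box's decoded output (for instance through an HDMI audio extractor), the per-channel levels can be measured directly. Here's a minimal sketch of that measurement; the filename, the capture method, and the SMPTE channel order are just my assumptions about how such a capture might be made.

```python
import numpy as np
import soundfile as sf

# Hypothetical 5.1 capture of the box's decoded output (e.g. a short WAV
# recorded from an HDMI audio extractor). Channel order assumed to be
# SMPTE: FL, FR, FC, LFE, SL, SR -- adjust for your capture tool.
LABELS = ["FL", "FR", "FC", "LFE", "SL", "SR"]

data, rate = sf.read("capture_5_1.wav")          # shape: (frames, channels)
rms = np.sqrt(np.mean(np.square(data), axis=0))  # per-channel RMS, linear
rms_db = 20 * np.log10(np.maximum(rms, 1e-10))   # convert to dBFS

for label, level in zip(LABELS, rms_db):
    status = "effectively silent" if level < -60 else "active"
    print(f"{label:>3}: {level:6.1f} dBFS  ({status})")
```

On the programs I'm describing, I'd expect only FL and FR to show any meaningful level, even though the stream is flagged as carrying all six channels.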
The fact that the source material is only mono or stereo is fine, but because the cable box wraps that material in a header telling your receiver it's a full 5.1 Dolby Digital signal, the receiver takes it at face value and tries to route a discrete signal to every speaker you have--which means there is nothing coming from your center channel, where the majority of the dialogue should be. If the cable box sent out the audio from a stereo program correctly labeled as a stereo signal, the receiver could apply something like Dolby Pro Logic to synthesize the appropriate center and surround information, and you would again have a version of surround sound in all channels (as receivers have been doing for more than 20 years now). I can confirm this all day long by looking at the format of the incoming signal, which my receiver helpfully shows both on screen and in its control app. In every single case the box claims the audio you're sending out has 5.1 channels, no matter that 3.1 of those channels are silent much of the time.
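To spell out the mechanism for your engineers: the receiver's processing decision is driven entirely by the channel layout declared in the bitstream header, not by what the channels actually contain. Roughly speaking, and this is purely illustrative rather than Onkyo's (or anyone's) actual firmware logic, the decision looks like this:

```python
def route_audio(flagged_channels: int) -> str:
    """Illustrative sketch of flag-driven audio routing in a receiver."""
    if flagged_channels >= 6:
        # Flagged as discrete 5.1: trust the bitstream and send each channel
        # to its own speaker. If the center/surrounds carry silence, those
        # speakers simply stay quiet -- no upmixing is attempted.
        return "discrete 5.1 pass-through (silent channels stay silent)"
    if flagged_channels == 2:
        # Flagged as 2.0: the receiver is free to apply a matrix upmixer
        # (e.g. Dolby Pro Logic II / Dolby Surround), steering dialogue
        # to the center channel and ambience to the surrounds.
        return "stereo upmixed to all speakers (dialogue in the center)"
    return "mono routed to the center (or mirrored to all speakers)"


print(route_audio(6))  # what the mislabeled streams trigger today
print(route_audio(2))  # what a correctly labeled stereo stream would allow
```

That difference is exactly why the wrong flag matters: the content is the same either way, but the mislabel locks the receiver out of doing anything useful with it.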
In addition to happening with vintage programming, or niche channels (where Xfinity seems to be trying to save bandwidth by only sending two channels of audio), it even happens with current movies on demand and other cloud/IP-based content. Here's another example: I can watch a newer movie live and get the correct full DD+ audio, which my Onkyo TX-RZ50 receiver handles perfectly. If I DVR it and watch it back, it's also fine. But if I push the button in the guide to "restart" it (thus switching to the cloud stream), the audio is now only stereo, yet still tagged as though it's DD+, wrecking the surround processing. The same thing happens if I rent a new movie on demand. Again, I have to assume that all of this cloud and VOD streaming content is only being sent in stereo so you can save bandwidth. If that's not the reason, then you need to examine what in your chain is stripping out the other audio channels.
I know that customers who only use a soundbar, or even their internal TV speakers, probably can't even tell the difference. But for customers who have invested in a proper surround-sound setup with discrete speakers, it's borderline unlistenable. I can open any of the apps on X1 for Hulu/Disney+/HBO Max/Apple TV/etc. and none of them exhibit this behavior. I can stream anything from them all day, and the audio is correctly flagged for however many channels the original source material contains.
Why is this still an issue? After the last time I posted, I heard from other users who said it seemed to have started with a firmware update you installed on the 4K boxes over a year ago. I wasn't using the XG1v4 then, but I can say that the standard HD box I was using at the time never had this behavior. Given that nothing else in my setup has changed, it's pretty clear where the problem is. Oh, and before you ask: I've toggled between "auto-detect" and "expert mode" in the audio settings, and it makes zero difference.
Please pass this on to whatever tech reps or engineers actually understand how your system does (and should) handle audio processing. If it started with a firmware update, it certainly should be reversible.
Thank you!
user_89ccc8 • Contributor • 32 Messages • 3 years ago
So, no response then?
user_89ccc8 • Contributor • 32 Messages • 3 years ago
Do I win a prize for stumping the Xfinity experts?
user_89ccc8 • Contributor • 32 Messages • 3 years ago
Um, did you even read my post?? As I stated, I first reported this to you approximately six months ago, when I got the XG1v4. Please reread my post and forward the information to someone (probably one of your engineers) who is familiar with audio signal encoding on your platform. If you'd like, I'd be more than happy to go back and find all of the other people who have reported this same behavior. As I also stated, it's reported to have started about a year ago, when you upgraded the firmware on your 4K boxes.
Not trying to be rude, but if you had read even the first line of my original post, you would have answered your own question.