I just found this explanation for the problem. I'm wondering if another DVR, maybe a TiVo, would do a better job of upscaling than the Xfinity boxes. Has anyone done a comparison?
Explanation regarding the HD quality that I found:
Quote: Originally Posted by gman1962: "When I hit the display button on my Sony TV it shows 1080p. Isn't this supposed to represent the signal that's being input, and not upscaled?"
Yes, I would be surprised if it were a true 1080p signal. The Sony TV is displaying the format of the signal that it, the TV, is receiving.
At this time there are no broadcast or cable networks that provide a 1080p signal; they are 1080i or 720p (or 480i for standard definition, or 480p for enhanced definition). About 75% to 80% of HD cable and broadcast networks broadcast in 1080i; the other 20% to 25% are 720p; none are 1080p. (See the Wikipedia page "List of Current American High-Definition Channels.") There may be some pay-per-view and video-on-demand content that is 1080p, but that is the exception, not the norm. (If I recall correctly, I came across a few Encore video-on-demand movies that were 1080p24.)
With some of the Comcast non-X1 set-top boxes, one can set the box to output "NATIVE", that is, the same video format the box received from Comcast. For a while I had the HD DVR I am renting from Comcast doing that, and none of the linear channels (broadcast or cable) were 1080p, but rather a mixture of 720p (ABC, Disney, ESPN, and a smattering of others), 1080i (most of the other major networks), and 480i (multicast networks like MeTV, Get.TV, and Antenna TV, as well as the SD-only feeds and the second video channel of our PBS affiliate). When I pressed the "Info" button on the TV remote, I would see 1080i, 720p, or 480p (480p rather than 480i because I had configured the HD DVR to deinterlace 480i by telling it the TV does not understand 480i, a workaround for a bug in the TV). Those are the formats the DVR receives from the Comcast headend (where Comcast puts the signals on the fibers that feed the node on our block). Everywhere I checked, the video format reported by the TV matched what the Wikipedia page "List of Current American High-Definition Channels" said about how each network formats its video signal.
My HD DVR also has the option to set a specific output format, converting (if necessary) each incoming signal to that output format. I currently have the HD DVR outputting 1080p. Guess what the TV says it is receiving on HDMI port 1, no matter what channel the DVR is tuned to or what recording it is playing? 1080p. In my case, it is the HD DVR that is upscaling or deinterlacing as necessary to generate the 1080p signal.
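To make the two modes concrete, here is a minimal sketch in Python (purely illustrative; the box_output function and channel list are my own invention, not anything from the actual firmware) of how "NATIVE" versus a fixed output format determines what the TV reports:

```python
# Illustrative only: models the two output modes described above.
# "NATIVE" passes each channel's format through untouched; a fixed
# setting makes the box convert everything to that one format.

def box_output(input_format: str, output_setting: str) -> str:
    """Return the format the TV will see, given the box's setting."""
    if output_setting == "NATIVE":
        return input_format      # TV's Info button shows the real format
    return output_setting        # box deinterlaces/rescales to this format

channels = [("ABC", "720p"), ("CBS", "1080i"), ("MeTV", "480i")]

for name, fmt in channels:
    print(name, "NATIVE ->", box_output(fmt, "NATIVE"))   # 720p, 1080i, 480i

for name, fmt in channels:
    print(name, "fixed  ->", box_output(fmt, "1080p"))    # always 1080p
```

This is why, with a fixed 1080p output, the TV's Info display is telling you about the box's setting, not about the channel.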
Now, with the HD DVR upscaling and/or deinterlacing, is it making the picture sharper? My 62-year-old eyes tell me "No". When I watch a Get.TV (480i) movie or a Svengoolie episode (MeTV, 480i), I can see the locks of hair, but each lock is just a blonde, brown, or black blob. If I see similar content on Syfy, Encore, or Starz (the HD feeds of these channels), I see structure in those locks, clumps a few strands thick, and on a close-up I can make out individual strands in a lock of hair. So upscaling is a way of filling in the missing pixels, but it cannot add the missing detail. And that is true whether the cable box is doing the rescaling or the TV is rescaling the signal to its native resolution. (Modern LCD and plasma TVs have a fixed number of pixels, so any signal that doesn't match that grid has to be rescaled to map onto those pixels. For a 1080p TV, one can either have the set-top box generate a 1080p signal, in which case the set-top box does the rescaling and/or deinterlacing and the TV takes the resulting signal as-is; or the set-top box, if set to NATIVE, passes the received format to the TV, and the TV then has to rescale/deinterlace as needed to display the video image on the screen's 1920x1080 pixels.)
In the Comcast X1 world there is no "NATIVE" option, so if one is feeding a 1080p TV, one would generally pick 1080p for the output format so that one doesn't lose resolution on 1080i channels and doesn't add motion blur to 720p channels. The set-top box then does the rescaling and deinterlacing, and the TV just presents the signal in the same format it received from the set-top box. The TV doesn't know what massaging of the video format has taken place, just that it is now receiving 1080p through its HDMI (or component) port. But a human with good eyes, not sitting too far back from the TV, would likely be able to tell whether the content was originally SD (480i) or HD (720p or 1080i) from the level of detail one can make out: hair, facial stubble, blades of grass, or the individual stones of a gravel road are examples where one is likely to see the difference between SD and HD. The upscaling provides the additional pixels, but not the additional detail. In really simplistic terms, upscaling interpolates the missing pixel values from the pixel values it has, perhaps with a more complicated algorithm than a weighted average of neighboring pixels, but ultimately it cannot produce what just isn't in the video signal.
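To illustrate the "weighted average of neighboring pixels" idea, here is a toy bilinear upscaler in Python (a minimal sketch; real scalers in set-top boxes and TVs use more sophisticated kernels, but the limitation is the same):

```python
# Toy bilinear upscaler: each output pixel is a weighted average of the
# four nearest source pixels. It can only blend values that are already
# there; it cannot recover detail the source never carried.

def bilinear_upscale(src, out_h, out_w):
    in_h, in_w = len(src), len(src[0])
    out = [[0.0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            # Map the output pixel back into source coordinates.
            fy = y * (in_h - 1) / (out_h - 1)
            fx = x * (in_w - 1) / (out_w - 1)
            y0, x0 = int(fy), int(fx)
            y1, x1 = min(y0 + 1, in_h - 1), min(x0 + 1, in_w - 1)
            wy, wx = fy - y0, fx - x0
            # Weighted average of the four surrounding source pixels.
            out[y][x] = (src[y0][x0] * (1 - wy) * (1 - wx)
                         + src[y0][x1] * (1 - wy) * wx
                         + src[y1][x0] * wy * (1 - wx)
                         + src[y1][x1] * wy * wx)
    return out

# A hard black-to-white edge in a 2x2 patch becomes a smooth gray ramp
# when upscaled to 4x4: new pixels, but no new detail.
patch = [[0.0, 1.0],
         [0.0, 1.0]]
for row in bilinear_upscale(patch, 4, 4):
    print([round(v, 2) for v in row])
```

The hard edge in the source comes out as 0.0, 0.33, 0.67, 1.0 across each row: more pixels, but the same information.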
So, am I receiving a "true 1080p" signal on KATU 2.3 (Comet TV)? The TV says I am, but KATU broadcasts it as 480i. Comcast sends a further-compressed Comet TV signal to the HD DVR I am renting; the HD DVR decompresses the video, deinterlaces and upscales that 480i image (approximately 704x480 pixels, interlaced scanning) to 1920x1080 progressive pixels, and the TV then believes it has received a 1080p signal, so pressing Info on the TV remote shows I am watching 1080p content. But the lack of fine detail in the picture reveals that it isn't high definition after all.
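To put rough numbers on that (back-of-the-envelope only, and glossing over the fact that each 480i field carries half the lines):

```python
# Rough arithmetic for the KATU 2.3 example: a ~704x480 source frame
# stretched onto a 1920x1080 progressive grid.
src_pixels = 704 * 480      # 337,920 pixels in the (deinterlaced) source
out_pixels = 1920 * 1080    # 2,073,600 pixels in a 1080p frame
print(f"{1 - src_pixels / out_pixels:.0%} of each output frame is interpolated")
# -> roughly 84%
```

In other words, more than four fifths of what reaches the screen is the scaler's guesswork, which is exactly why the picture reads as soft.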
I am new to Xfinity. Everything was connected a week ago. I had trouble with the internet, but that is now fixed, so I have just started checking out the cable TV. I have the XG1v4 and an Xi3 for my bedroom box. I am running the XG1v4 with a Samsung 55-inch 1080p TV, with the DVR set to the 16:9 1080p setting. I immediately noticed that all of the faces have the appearance of the "beauty" setting when taking selfies on my Samsung phone. Very disappointing! I had expected good detail and texture, but everything is quite smoothed out; I had expected a crisper picture for HD. I have played around with all sorts of settings, but nothing really improved the quality. I also have an Arris SB8200 DOCSIS 3.1 modem (3.1 down) that was just activated a couple of days ago.

Has anyone else with the XG1v4 noticed this soft version of HD? I learned on other threads that Xfinity has apparently "rezzed" down to 720p HD rather than 1080, apparently to allow them to push more channels through the bandwidth, and that they have removed the 1080i native option from the device settings on this new box. I called yesterday and, of course, the customer service agent knew nothing of the 720p issue; he just suggested a reboot and a tech visit. I have nothing to compare my picture quality to, except that when I stream movies and shows I get crystal-clear HD with great detail and texture, so it is not my TV. I have never used one of the earlier XG1 DVR models, so I even thought it might be an issue with the newest box. I do like the flexibility and other features, but picture quality is important.

We just bought a Samsung UN49MU8000 for the bedroom and were thinking about upgrading our living room TV to a larger, higher-quality set, but now I am wondering what the point would be if Xfinity is not transmitting the highest quality. Does anyone know how they plan to deliver 4K? If it depends on Netflix, then it seems it will be streaming, so we should get a true 4K experience, correct? At this point, I am wondering if it is worth paying additional fees for HD that doesn't seem much better than the SD I had with DIRECTV.
Has anyone compared the XG1v4 box with a TiVo product, or some other DVR workaround?