
Comcast downgrading all 1080i HD channels to 720p

Posted by
Regular Contributor

Message 101 of 210
4,197 Views

artm_boston wrote:

Below is a reply I received from Comcast ECare after filing a complaint. Interesting that he calls the switch a "regression"!

 

 

 

reply from Comcast ECare:

====================


I do apologize that you disagree with our decision to go to 720p. We did
this as 720p is progressive whereas 1080i is interlaced. With 1080i and
its interlaced functionality we can only run shows at 30 frames per
second and this causes gaps in data for fast moving scenes. With 720p
we can provide the same quality picture faster at 60 frames per second.
With this change we can provide a better quality service and resolve the
flaws created with 1080i.


 

The correct thing to do would have been to use 1080p if they wanted the benefits of a progressive picture vs. interlaced.  Not to downscale to 720p so that our displays can turn around and upscale it back to 1080p (which all 1080p displays will do).
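For anyone who wants to sanity-check the numbers being argued here, a rough back-of-envelope comparison of the raw pixel throughput of the three formats (illustrative only; actual cable bandwidth depends on the MPEG encoder, not on raw pixel counts):

```python
# Rough comparison of raw (uncompressed) pixel throughput for the formats
# being argued about. Illustrative only.

def pixels_per_second(width, height, rate, interlaced=False):
    """Raw pixels delivered per second; an interlaced format sends
    half-height fields at the field rate."""
    lines = height // 2 if interlaced else height
    return width * lines * rate

formats = {
    "1080i60 (60 fields/s)": pixels_per_second(1920, 1080, 60, interlaced=True),
    "720p60":                pixels_per_second(1280, 720, 60),
    "1080p60":               pixels_per_second(1920, 1080, 60),
}

for name, pps in formats.items():
    print(f"{name:24s} {pps / 1e6:6.1f} Mpixels/s")

# Typical output:
# 1080i60 (60 fields/s)     62.2 Mpixels/s
# 720p60                    55.3 Mpixels/s
# 1080p60                   124.4 Mpixels/s
```

1080i60 delivers 60 half-height fields per second, so its raw throughput is actually a bit higher than 720p60, while 1080p60 roughly doubles both, which is the bandwidth problem the thread keeps circling back to.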

Posted by
Regular Contributor

Message 102 of 210
4,173 Views

jkozlow3 wrote:

The correct thing to do would have been to use 1080p if they wanted the benefits of a progressive picture vs. interlaced.  Not to downscale to 720p so that our displays can turn around and upscale it back to 1080p (which all 1080p displays will do).


IMO, as a former stockholder over the last 5 years who just sold out at a nice profit and is no longer a fan (since the 720p regression attempt has failed), the reason those people (Comcast) are not doing the right thing regarding PQ for their larger-screen home clients (again, in my opinion) is that they are also trying to squeeze more channels into a limited bandwidth space to plan for a growing percentage of smaller-screen mobile viewers, while using just a minimal amount of that $10/mo HD Technology Fee most clients presently pay. Doing the right thing, as you recommend, would cost those people more money, reduce profit margins, possibly upset their nice publicly traded stock price run from 25 to 72 over the last 4 or 5 years (I was lucky, I admit it), and limit their potential to do deals or spin-offs.

 

Nothing personal, just simple Business 101 .... (Note: I am not the client pictured or discussed in the article from the Oct 2016 link below):

 

http://investorplace.com/2016/10/cmcsa-comcast-corporation-stock-rolling-over/

Posted by
Regular Contributor

Message 103 of 210
4,157 Views

eecon1 wrote:

IMO, as a former stockholder over the last 5 years who just sold out at a nice profit and is no longer a fan (since the 720p regression attempt has failed), the reason those people (Comcast) are not doing the right thing regarding PQ for their larger-screen home clients (again, in my opinion) is that they are also trying to squeeze more channels into a limited bandwidth space to plan for a growing percentage of smaller-screen mobile viewers, while using just a minimal amount of that $10/mo HD Technology Fee most clients presently pay. Doing the right thing, as you recommend, would cost those people more money, reduce profit margins, possibly upset their nice publicly traded stock price run from 25 to 72 over the last 4 or 5 years (I was lucky, I admit it), and limit their potential to do deals or spin-offs.

 

Nothing personal, just simple Business 101 .... (Note: I am not the client pictured or discussed in the article from the Oct 2016 link below):

 

http://investorplace.com/2016/10/cmcsa-comcast-corporation-stock-rolling-over/


 

Why don't they just get rid of the SD channels if they need extra bandwidth?  I don't know a single person who hasn't had an HDTV for at least the past 10 years.  Seriously, not one single person.  I bought my first HDTV in 2002.  15 years ago.

 

Why is SD still a thing??

Posted by
Regular Contributor

Message 104 of 210
4,155 Views

jkozlow3 wrote:

Why don't they just get rid of the SD channels if they need extra bandwidth?  I don't know a single person who hasn't had an HDTV for at least the past 10 years.  Seriously, not one single person.  I bought my first HDTV in 2002.  15 years ago.

 

Why is SD still a thing??


Because many folks, including myself, do not need HD for every single program (which may prematurely wear out the DVR's HDD heads).

 

Example:

 

When not leisure traveling, I usually set my main non-X1 2-tuner AnyRoom DCX3400M HDTV DVR in my office/den on the SD business channels CNBC and Bloomberg all night long, 3 to 4 days a week, from the time most European markets open (around midnight to 1am PT), while I prepare my simple next-day trading algorithms for the U.S. stock market, which opens 6.5 hours later at 6:30am PT. I leave the DVR set on SD, accruing long 2+ hour SD buffers of that day's U.S. market action, with it set to record the U.S. market wrap-up shows, which end around 4pm PT.

 

I then normally sleep from about 7am PT to 3pm PT on those same 3 or 4 trading weekdays and let my computers make the decisions during the U.S. trading day (which for me typically tends to work more profitably and faster since my human emotional trading tendencies are removed from the equation).

 

Most all evenings for me and my wife are typically for in-home HDTV viewing entertainment, dining out, etc.

 

Computers, HSI, and reliable DVRs can be tools for both entertainment and profit if utilized properly.

Posted by
Frequent Visitor

Message 105 of 210
4,074 Views

Teds:

 

Excellent.... your post worked for me. I recently was required to change HD boxes and could not figure out why some shows were showing in 720 and others were in 1080 (the same box in my bedroom showed ALL my normal shows in 1080). I did what you suggested (turn on the TV and turn off the box) and found where I needed to enable "show all shows in 1080". Sure enough it worked; now all shows are coming in at 1080i. I would never have figured it out without coming across this post. This should especially help those who have changed boxes, or those who never had it set right in the first place. Great, thanks much.

.

patrick

Posted by
Problem Solver

Message 106 of 210
4,070 Views

Dennis:

 

Nice to see you have the box set up for 1080i.

 

However, only the over-the-air 1080i networks/stations are still being shown on cable as (1920 x 1080) / 60i. One exception is On Demand: some episodes that originally aired on an OTA network in 1080i are shown at least one day later on On Demand as (1920 x 1080) / 60i. If you happened to watch the original airing on a linear cable channel other than an OTA channel (History Channel, etc.), that channel only outputs (1280 x 720) / 60p when first shown.
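If you want to check for yourself what a given recording actually carries, something along these lines works, assuming ffprobe (from FFmpeg) is installed; the filename is just a placeholder:

```python
# Quick check of what a captured transport stream actually contains.
# Assumes ffprobe (part of FFmpeg) is on the PATH; "recording.ts" is a
# placeholder for whatever file your TiVo/HDHomeRun/etc. produced.
import subprocess

def video_stream_info(path):
    cmd = [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=codec_name,width,height,field_order",
        "-of", "default=noprint_wrappers=1",
        path,
    ]
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

print(video_stream_info("recording.ts"))
# A post-migration 720p MPEG-4 channel typically reports something like:
#   codec_name=h264
#   width=1280
#   height=720
#   field_order=progressive
```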

Posted by
Regular Contributor

Message 107 of 210
4,047 Views

dennis427 wrote:

Teds:

 

Excellent.... your post worked for me. I recently was required to change HD boxes and could not figure out why some shows were showing in 720 and others were in 1080 (the same box in my bedroom showed ALL my normal shows in 1080). I did what you suggested (turn on the TV and turn off the box) and found where I needed to enable "show all shows in 1080". Sure enough it worked; now all shows are coming in at 1080i. I would never have figured it out without coming across this post. This should especially help those who have changed boxes, or those who never had it set right in the first place. Great, thanks much.

.

patrick


Keep in mind that the way Comcast is doing HD now on non-local channels is to convert them at the local plant to 720p along with the MPEG-4 codec and then send them out over coax, so your box gets the 720p channel signal. Even though you may set the box to output 1080i (or 1080p), it is still not really true 1080 quality due to all the conversion (first from 1080 to 720 by Comcast, and then from 720 to 1080 by the box). Local stations at 1080 do not have all this extra conversion going on and look much better.

If your box has "Native" output, I would suggest that. That way the Box will output the channel resolution as-is, and then leave it to the TV to convert it to its own internal resolution.

Posted by
Contributor

Message 108 of 210
4,000 Views

X1 DVRs have no native setting--for me, one of the biggest drawbacks to parting with my DCX3400.  I have no confidence in the video processors built into STBs (the X1 included) and I have no desire to change resolution manually depending on the source.  So, to get as close to "native" as possible for the most channels, I previously set my X1 box to output 1080i.  To maintain the highest level of "nativeness" for the most channels, it sounds like the proper setting for X1 boxes is 720p, now that Comcast has converted all but local broadcasts to 720p.  Is that right?

 

This change to the source signal is more than a small pain in the posterior.  I was about to pull the trigger on a Vizio P75-C1, now that prices have dropped below $3K.  But the scalers on the Vizio do a relatively poor job on input sources at or below 720p.  Perfect!  That's Comcast's new and "improved" source resolution.  So, I would have no choice but to use an external video processor if I went with this TV and stayed with Comcast.  No way would I plug directly into the TV and rely on the STB to upconvert.  Ugh.

Posted by
Regular Contributor

Message 109 of 210
3,987 Views
720p may be correct for most channels, but some of the local channels still broadcast in 1080i. So outputting 720p would be downscaling in some instances.

Since the X1 box doesn't support native resolution, I output 1080p from the X1 since my TV would have to convert it to 1080p anyway.
Posted by
Contributor

Message 110 of 210
3,985 Views

jkozlow3 wrote:
720p may be correct for most channels, but some of the local channels still broadcast in 1080i. So outputting 720p would be downscaling in some instances.

Since the X1 box doesn't support native resolution, I output 1080p from the X1 since my TV would have to convert it to 1080p anyway.

You might want to at least consider switching to 720p. With few exceptions, like maybe the Vizio I referenced above, an STB will do a considerably poorer job of upscaling 720p to 1080p than the video display's internal processor would. The video processors in STBs are notorious for their low-bidder-driven poor quality. I haven't heard anything to the contrary about the video processors in the Arris and Pace X1 boxes. That's precisely why the native function was so popular with videophiles--the box would only descramble the signal and then pass it on unprocessed.

 

You're right about the penalty of letting the STB deinterlace and downscale to 720p those channels still received natively in 1080i. But it sounds like Comcast's recent change to the vast majority of its channels makes that a concern only for a few local broadcasts. And even some of those local affiliates (ABC and Fox, for example) already broadcast natively in 720p, making 1080i-to-720p conversions even more rare. But, no question, it'll be bad when it does occur. I'm just trying to figure out which setting will cause the fewest problems, since none will eliminate them.

Posted by
Regular Contributor

Message 111 of 210
3,968 Views
Yeah, I hear you. It's my understanding that SCALING is easy but DE-INTERLACING is difficult to do well. So I'm not sure there'd be much of a difference in 720p vs. 1080p. But who knows.
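For anyone curious why deinterlacing is considered the hard part, here is a minimal sketch of the two classic approaches; real deinterlacers blend them adaptively based on motion, which is exactly where the quality differences come from. The arrays and sizes are just stand-ins:

```python
# Minimal sketch of the two classic deinterlacing strategies: "weave" keeps
# full resolution but combs on motion, "bob" avoids combing but throws away
# half the vertical resolution. Frames are plain numpy arrays (height x width).
import numpy as np

def weave(top_field, bottom_field):
    """Interleave the two fields back into one full-height frame."""
    h, w = top_field.shape
    frame = np.empty((h * 2, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines from the top field
    frame[1::2] = bottom_field   # odd lines from the bottom field
    return frame                 # perfect for static scenes, "combs" if anything moved between fields

def bob(field):
    """Line-double a single field into a full-height frame."""
    return np.repeat(field, 2, axis=0)   # no combing, but only half the vertical detail

# Two 540-line fields from a 1080i source (random data as a stand-in):
top = np.random.rand(540, 1920)
bottom = np.random.rand(540, 1920)
print(weave(top, bottom).shape, bob(top).shape)   # (1080, 1920) (1080, 1920)
```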
Posted by
Contributor

Message 112 of 210
3,955 Views

jkozlow3 wrote:
Yeah, I hear you. It's my understanding that SCALING is easy but DE-INTERLACING is difficult to do well. So I'm not sure there'd be much of a difference in 720p vs. 1080p. But who knows.

Good point. But it does confirm that my original output resolution--1080i--is likely to be the worst possible alternative most of the time, now that Comcast will deliver very few channels in that format. 

Posted by
Frequent Visitor

Message 113 of 210
3,897 Views

People can argue the merits of 720p vs. 1080i, but there are points that have to be kept in mind:

 

* 1080i can have deinterlace problems.  But a decent contemporary deinterlacer can do a good job.  However, mobile devices don't tend to have good capability here, so the progressive format works better if you want to play on mobile devices, or try to least-common-denominator your streams to work on both mobile and TV devices.

 

* 1080p would be the holy grail. However, the bandwidth required for cable companies to do 1080p on current plants is not practical (without implementing SDV, or dumping traditional TV delivery for something like IPTV).

 

* 720p does, of course, drop about 55% of the pixels compared to 1080(i/p). If you have a native 1080 or 2160 (4K) display, those pixels have to be interpolated (synthesized). The exception would be a display and STB that use the 720 signal directly... but even then the pixel mapping still has to happen.

 

* The critical issue... is less the 720p part.  720p can look good, and upscale well - IF the source encoding is very good.  Comcast is overcompressing the signal during the MPEG4 conversion, and reducing the bitrates on the program to just barely good enough.  MPEG4 compression works based on a quality/compression ratio - the more quality you can sacrifice (e.g. fine detail), the more bandwidth you can save.   Comcast has compressed the signal to death to cram more channels together in a QAM.   The loss of fine detail, and in many cases the dithering you can see during scene transitions, I believe are rooted in this.

 

That last part was not publicized as part of the MPEG4 migration (actually, few details were).  But it's the part that causes customer dissatisfaction.   Other systems (like DTV) do MPEG4 as well - but they don't bitrate-starve the result so badly, and the picture looks better.
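To put very rough numbers on the bitrate-starvation point: a 256-QAM slot in a 6 MHz channel carries about 38.8 Mbps of payload, so the number of HD programs squeezed into one slot directly sets how many bits the encoder gets per pixel. The per-slot program counts below are illustrative, not Comcast's actual muxing:

```python
# Rough bits-per-pixel arithmetic for the "bitrate starvation" point.
# 38.8 Mbps is the usual payload of a 256-QAM slot in a 6 MHz channel;
# the program counts below are illustrative guesses, NOT measured Comcast numbers.
QAM_MBPS = 38.8
PIXELS_PER_SEC_720P60 = 1280 * 720 * 60      # ~55.3 million

for programs_per_qam in (2, 4, 6, 8):
    mbps_each = QAM_MBPS / programs_per_qam
    bits_per_pixel = mbps_each * 1e6 / PIXELS_PER_SEC_720P60
    print(f"{programs_per_qam} HD programs/QAM -> {mbps_each:4.1f} Mbps each, "
          f"{bits_per_pixel:.3f} bits/pixel before decoding")
```

The fewer bits per pixel the encoder has to work with, the more fine detail it has to throw away, which lines up with the softness and dithering described above.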

Posted by
Contributor

Message 114 of 210
3,877 Views

webminster wrote:

People can argue the merits of 720p vs. 1080i, but there are points that have to be kept in mind:

 

* 1080i can have deinterlace problems.  But a decent contemporary deinterlacer can do a good job.  However, mobile devices don't tend to have good capability here, so the progressive format works better if you want to play on mobile devices, or try to least-common-denominator your streams to work on both mobile and TV devices.

 

* 1080p would be the holy grail. However, the bandwidth required for cable companies to do 1080p on current plants is not practical (without implementing SDV, or dumping traditional TV delivery for something like IPTV).

 

* 720p does, of course, drop about 55% of the pixels compared to 1080(i/p). If you have a native 1080 or 2160 (4K) display, those pixels have to be interpolated (synthesized). The exception would be a display and STB that use the 720 signal directly... but even then the pixel mapping still has to happen.

 

* The critical issue... is less the 720p part.  720p can look good, and upscale well - IF the source encoding is very good.  Comcast is overcompressing the signal during the MPEG4 conversion, and reducing the bitrates on the program to just barely good enough.  MPEG4 compression works based on a quality/compression ratio - the more quality you can sacrifice (e.g. fine detail), the more bandwidth you can save.   Comcast has compressed the signal to death to cram more channels together in a QAM.   The loss of fine detail, and in many cases the dithering you can see during scene transitions, I believe are rooted in this.

 

That last part was not publicized as part of the MPEG4 migration (actually, few details were).  But it's the part that causes customer dissatisfaction.   Other systems (like DTV) do MPEG4 as well - but they don't bitrate-starve the result so badly, and the picture looks better.


This is really interesting.  Maybe you can answer one additional question.  I read that consumer electronics frequently don't deinterlace well and that even the high-end stuff produces inconsistent results. https://en.m.wikipedia.org/wiki/Deinterlacing.  If that is true, and I don't know for sure that it is (I recognize my source is dubious), is a 720p signal as Comcast now distributes it on balance "better" than the 1080i signal that used to be Comcast's dominant format?  You gain professional deinterlacing at the distribution source--a difficult and not always successful process for consumer electronics--but you pay a penalty in the form of reduced pixels and overcompression after deinterlacing has been performed and just prior to distribution.  Are we better or worse off with these changes?

Posted by
Frequent Visitor

Message 115 of 210
3,864 Views

There's no end of arguments from Comcast and supporters that we're better off. Personally, with my 65" OLED set and decent equipment, I disagree, since I can see the difference. I'd say your premise could be true (the "professional-quality deinterlacing"), if we knew that to be the case. The only really good option here is 720p from the source, not Comcast re-encoding the signal.

 

Still stand behind the last premise though... As long as Comcast is going to drain all the bitrate out of the signal, overcompress it, the game ended before it started.  No amount of upscaling or other CP-end signal processing can restore the lost detail.

Posted by
Regular Contributor

Message 116 of 210
3,861 Views

webminster wrote:

As long as Comcast is going to drain all the bitrate out of the signal, overcompress it, the game ended before it started.  No amount of upscaling or other CP-end signal processing can restore the lost detail.


Agree.

 

Perhaps if Comcast would announce plans to discontinue broadcasting SD (like DirecTV has announced), they wouldn't have to overcompress their HD channels.  I don't understand why Comcast can't just compress the heck out of SD (to the greatest amount physically possible) to allow for more bandwidth and less compression of HD.  Obviously people that still have SD televisions in 2017 don't care about picture quality.  I don't know a single person who hasn't had a 100% HDTV household for at least 8-10 years.  I haven't used a SDTV since ~2002.

Posted by
Frequent Visitor

Message 117 of 210
3,852 Views

I've seen discussions elsewhere about SD, dropping it or compressing it more... my understanding is that SD bandwidth usage is already pretty low (and the quality already bad enough) that more compression won't gain much. I'd accept dropping the SD channels - if there were HD replacements. Unfortunately there are a lot of SD channels (take TVLand) that Comcast doesn't carry in HD (at least here).

 

I've seen arguments that Comcast could drop SD channels where an HD equivalent exists, and have the customer-premises equipment downscale and convert the signal for old TVs. Not sure how practical that is in practice...
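Some ballpark arithmetic on that point, using commonly cited rule-of-thumb bitrates rather than Comcast's actual encoder settings:

```python
# Ballpark of how many program streams fit in one 256-QAM slot (~38.8 Mbps)
# at commonly cited bitrates. These per-channel rates are industry rules of
# thumb, not Comcast's actual settings.
QAM_MBPS = 38.8

typical_mbps = {
    "MPEG-2 SD":  3.0,
    "MPEG-2 HD": 13.0,
    "MPEG-4 HD":  7.0,
}

for fmt, mbps in typical_mbps.items():
    print(f"{fmt}: ~{int(QAM_MBPS // mbps)} channels per QAM at {mbps} Mbps each")

# Dropping a QAM's worth of SD channels frees room for only a handful of extra
# HD streams, which is roughly the trade being argued about in this thread.
```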

 

Posted by
Contributor

Message 118 of 210
3,815 Views

webminster wrote:

There's no end of arguments from Comcast and supporters that we're better off. Personally, with my 65" OLED set and decent equipment, I disagree, since I can see the difference. I'd say your premise could be true (the "professional-quality deinterlacing"), if we knew that to be the case. The only really good option here is 720p from the source, not Comcast re-encoding the signal.

 

Still stand behind the last premise though... As long as Comcast is going to drain all the bitrate out of the signal, overcompress it, the game ended before it started.  No amount of upscaling or other CP-end signal processing can restore the lost detail.


Your answer is a little unclear, so let me take another stab at formulating a better question.  I actually offered two foundational premises, both of which I would like you to assume as correct for pedagogical purposes:  Premise (1) from the article I cited ("Since consumer electronics equipment is typically far cheaper, has considerably less processing power and uses simpler algorithms compared to professional deinterlacing equipment, the quality of deinterlacing may vary broadly and typical results are often poor even on high-end equipment."); and Premise (2) that Comcast is in fact using professional grade equipment that most consumers do not have access to and is using it correctly to perform competent deinterlacing.  I fully recognize that Premise (1) does not apply to you personally, because you have professional-grade deinterlacing equipment; and Premise (2) is very difficult to accept, and with good reason.  But I'd like you to accept both premises anyway for the sake of argument.

 

Are you saying that, for most consumers, and starting with the typical broadcast 1080i source signal, the positive benefits of a professional-quality job of deinterlacing are fully negated by the negative effects of downscaling to 720p, bitstarving and overcompression to produce on balance an inferior signal quality?  Again, I emphasize most consumers, not you personally.  I know you have excellent equipment and I accept your counter premise that you don't need Comcast's help with deinterlacing.

 

I'm not angling for a particular answer here.  I'm just trying to figure out what matters more.  I was under the impression that bad deinterlacing is deadly (and very common, given the quality of even some high-grade consumer equipment) because it can produce all kinds of really bad things, like dropped frames and artifacts.  So, if you can't have everything (and, practically speaking, you can't) and you had to pick one thing to do really well, deinterlacing is a pretty good place to start because it's so hard for most consumers to do correctly.  If that impression is wrong, I'd like to know that.

 

 

Posted by
Frequent Visitor

Message 119 of 210
3,789 Views

So first I'm no expert on the subject, only an A/V amateur who watches way too much video.  And bought some good mid-scale (not "professional grade") equipment because of that.  And, needs a decent quality signal to make the investment worth it...

 

That said, part of the deal is subjective. Comcast alludes to that in their releases, saying they had feedback from many focus groups to arrive at their strategy. I'm sensitive to video issues: dithering, macroblocking, pixelation, what-not. My wife, on the other hand, not so much... I see something and say "what was that?"... she says, "what?". So much of what Comcast has done revolves around the majority of customers who can't, for whatever reason, see the difference.

 

Speaking from my view only, the amount of discomfort, if any, that I've had in the past from deinterlacing artifacts on the various monitors and STBs I've used is minor compared to the post-migration changes. Right after the migration, I could see the loss of detail and the dithering. Call that what you will, but from my (subjective) perspective, the "cure" for any perceived deinterlacing issues is far worse. Many people on other threads (like at DSLReports) also feel the same, FWIW.

 

One positive - before the migration, a lot of channels were encoded by Comcast with a switching interlaced/progressive format (the signal rapidly changed back and forth). In Windows Media Center circles, we referred to the syndrome as "29/59" and it could drive a video card nuts (only some video cards did a decent job coping with it). Even then, sometimes you could see some "shimmering" on channel ID bugs. Since the migration, those channels are now stable since they're re-encoded as progressive, no switching. But it's a 1-forward-1-back situation - the bugs are stable, but the picture is softer and you can see dithering on some channels on occasion. My choice? Hard, but I'd take the former.

 

I suspect the whole progressive/deinterlacing question still matters more for the streaming end. Is Comcast trying to unify their signal so they don't have a mobile-optimized and a home-TV-optimized delivery, just one? Since deinterlacing on phones and tablets is much worse than on consumer TV equipment, perhaps. Comcast keeps me out of the loop on their strategic plans.
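One way to check a capture for that kind of interlaced/progressive switching is to count how individual frames are flagged; this assumes ffprobe is installed and uses a placeholder filename:

```python
# Count how many decoded frames in a capture are flagged interlaced vs.
# progressive. Assumes ffprobe is on the PATH; "capture.ts" is a placeholder.
import json
import subprocess
from collections import Counter

def field_flags(path, packets=500):
    """Tally interlaced vs. progressive flags over roughly the first N packets."""
    cmd = [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-read_intervals", f"%+#{packets}",   # stop early so long captures aren't scanned in full
        "-show_entries", "frame=interlaced_frame",
        "-of", "json",
        path,
    ]
    out = json.loads(subprocess.run(cmd, capture_output=True, text=True, check=True).stdout)
    return Counter("interlaced" if f.get("interlaced_frame") else "progressive"
                   for f in out["frames"])

print(field_flags("capture.ts"))
# A stream that flips between formats will show a mix of both flags;
# a clean post-migration 720p channel should be all "progressive".
```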

Posted by
Contributor

Message 120 of 210
3,748 Views

webminster wrote:

So first I'm no expert on the subject, only an A/V amateur who watches way too much video.  And bought some good mid-scale (not "professional grade") equipment because of that.  And, needs a decent quality signal to make the investment worth it...

 

That said, part of the deal is subjective. Comcast alludes to that in their releases, saying they had feedback from many focus groups to arrive at their strategy. I'm sensitive to video issues: dithering, macroblocking, pixelation, what-not. My wife, on the other hand, not so much... I see something and say "what was that?"... she says, "what?". So much of what Comcast has done revolves around the majority of customers who can't, for whatever reason, see the difference.

 

Speaking from my view only, the amount of discomfort, if any, that I've had in the past from deinterlacing artifacts on the various monitors and STBs I've used is minor compared to the post-migration changes. Right after the migration, I could see the loss of detail and the dithering. Call that what you will, but from my (subjective) perspective, the "cure" for any perceived deinterlacing issues is far worse. Many people on other threads (like at DSLReports) also feel the same, FWIW.

 

One positive - before the migration, a lot of channels were encoded by Comcast with a switching interlaced/progressive format (the signal rapidly changed back and forth). In Windows Media Center circles, we referred to the syndrome as "29/59" and it could drive a video card nuts (only some video cards did a decent job coping with it). Even then, sometimes you could see some "shimmering" on channel ID bugs. Since the migration, those channels are now stable since they're re-encoded as progressive, no switching. But it's a 1-forward-1-back situation - the bugs are stable, but the picture is softer and you can see dithering on some channels on occasion. My choice? Hard, but I'd take the former.

 

I suspect the whole progressive/deinterlacing question still matters more for the streaming end. Is Comcast trying to unify their signal so they don't have a mobile-optimized and a home-TV-optimized delivery, just one? Since deinterlacing on phones and tablets is much worse than on consumer TV equipment, perhaps. Comcast keeps me out of the loop on their strategic plans.


Ok.  I think I'm beginning to understand where you are coming from.  Let me ask one more follow-up question.  Would you be satisfied with a 720p signal if it were possible to persuade Comcast to cut out (or at least cut back) the post-conversion shenanigans, like overcompressing and bit-starving the transmission signal?  I ask not because I'm trying to paint you into a corner, but because I'm trying to isolate what's really causing irreparable damage to the image.  That is where we should be focusing our efforts to get Comcast to do something.  Is it the conversion to 720p itself or the post-conversion processing creating the mischief?

 

I read somewhere about a petition demanding that Comcast return to the status quo and restore the 1080i signal.  But before I pick up a pitchfork and join the villagers in the town square for the corporate officer weenie roast, I'd really like to know if that alone would make much of a positive difference, on balance.  I personally believe that professional deinterlacing is very beneficial because many (maybe most) of us can't perform that function very well with our consumer-grade equipment.  And, there are only two ways that Comcast can do the deinterlacing for us starting from a 1080i base signal: go up to 1080p or down to 720p.  1080p isn't going to happen.  Comcast won't do it because it requires too much bandwidth.  It strikes me that 720p would actually be a very good alternative if upscaling isn't nearly as problematic for us as deinterlacing.  With a conversion to 720p, you'd get professional equipment applied to the deinterlacing function, leaving consumers with only the relatively easy task of upscaling to 1080p (or higher) using consumer-grade equipment generally adequate to the task.  If this is correct, then the fight really ought to be targeting the post-conversion processing (compression and bit rate reductions), not demanding that Comcast restore the 1080i signal.  Unless I'm misreading the tea leaves, it sounds like we could work effectively with a 720p signal, so long as Comcast otherwise leaves it alone.  I think you agree with me on that, unless I've misunderstood.  See Message 113 ("The critical issue... is less the 720p part.  720p can look good, and upscale well - IF the source encoding is very good.  Comcast is overcompressing the signal during the MPEG4 conversion, and reducing the bitrates on the program to just barely good enough.)" 

Posted by
Regular Contributor

Message 121 of 210
3,669 Views

Following this closely. Maybe we will see PQ improvements if you use an app as a cable box. If so, I would love to see this app on AppleTV or even a cable box that runs the app. I guess that would be IPTV, correct?

 

http://www.avsforum.com/xfinity-tv-beta-app-comes-roku-live-tv-demand-dvr/

Posted by
Contributor

Message 122 of 210
3,659 Views

charissamz wrote:

Following this closely. Maybe we will see PQ improvements if you use an app as a cable box. If so, I would love to see this app on AppleTV or even a cable box that runs the app. I guess that would be IPTV, correct?

 

http://www.avsforum.com/xfinity-tv-beta-app-comes-roku-live-tv-demand-dvr/


Very hard to say. The conversion to new and different hardware and platforms is still in its infancy. To the extent that devices like the Roku function as set-top box alternatives and nothing more, I think in the short term the damage to the image occurring at the post-conversion encoding stage would probably just flow through the new hardware, leaving you more or less in the same place as far as picture quality goes. But if the long-term plan is eventually to convert everything over to an internet-based TV system and no longer support a duplicative traditional cable delivery system running in parallel, I could see that freeing up a lot of bandwidth that could be put to better use than transmitting the same channels twice. Improved picture quality may very well be one of those uses.

Posted by
Regular Contributor

Message 123 of 210
3,658 Views

JEB11 wrote:


Ok.  I think I'm beginning to understand where you are coming from.  Let me ask one more follow-up question.  Would you be satisfied with a 720p signal if it were possible to persuade Comcast to cut out (or at least cut back) the post-conversion shenanigans, like overcompressing and bit-starving the transmission signal?  I ask not because I'm trying to paint you into a corner, but because I'm trying to isolate what's really causing irreparable damage to the image.  That is where we should be focusing our efforts to get Comcast to do something.  Is it the conversion to 720p itself or the post-conversion processing creating the mischief?

 

I read somewhere about a petition demanding that Comcast return to the status quo and restore the 1080i signal.  But before I pick up a pitchfork and join the villagers in the town square for the corporate officer weenie roast, I'd really like to know if that alone would make much of a positive difference, on balance.  I personally believe that professional deinterlacing is very beneficial because many (maybe most) of us can't perform that function very well with our consumer-grade equipment.  And, there are only two ways that Comcast can do the deinterlacing for us starting from a 1080i base signal: go up to 1080p or down to 720p.  1080p isn't going to happen.  Comcast won't do it because it requires too much bandwidth.  It strikes me that 720p would actually be a very good alternative if upscaling isn't nearly as problematic for us as deinterlacing.  With a conversion to 720p, you'd get professional equipment applied to the deinterlacing function, leaving consumers with only the relatively easy task of upscaling to 1080p (or higher) using consumer-grade equipment generally adequate to the task.  If this is correct, then the fight really ought to be targeting the post-conversion processing (compression and bit rate reductions), not demanding that Comcast restore the 1080i signal.  Unless I'm misreading the tea leaves, it sounds like we could work effectively with a 720p signal, so long as Comcast otherwise leaves it alone.  I think you agree with me on that, unless I've misunderstood.  See Message 113 ("The critical issue... is less the 720p part.  720p can look good, and upscale well - IF the source encoding is very good.  Comcast is overcompressing the signal during the MPEG4 conversion, and reducing the bitrates on the program to just barely good enough.)" 


 

At this point, we're simply speculating, are we not?  We don't know 100% WHY the picture quality is worse, we just know it is.

 

Yes, perhaps 720p would look fine if they didn't overcompress it.  But I don't know that we're benefiting as much from "professional deinterlacing" as you think.  Maybe we are, maybe we're not.  I'd personally like to see side-by-side comparisons.  I've always output 1080p from my X1 box, allowing the X1 box to do the deinterlacing.  When I set the X1 to output 1080i (allowing my TV to do the deinterlacing instead), I couldn't really tell a difference.  I personally think both devices handled the deinterlacing well enough.

 

Yes, deinterlacing has been an issue in the past (it was a BIG issue many years ago with consumer grade equipment such as when HDTVs were first becoming mainstream), but I'm not sure that it's still a big issue for most material.  I suspect that most consumer grade equipment deinterlaces good enough these days.  I remember testing 2 blu-ray players side-by-side many years ago (8+) with SD DVDs (480i).  I found a movie (American Beauty) that had some serious issues when allowing the blu-ray player to deinterlace the signal (it didn't matter if I output 480p or 1080p - the conversion from interlaced to progressive was the issue).  There were several scenes that had a lot of artifacts.  I borrowed another copy of this movie from a friend and hooked up 2 different brands of blu-ray players side-by-side into 2 different inputs on my AVR so that I could quickly switch back and forth.  The difference was astounding and 1 player handled the deinterlacing without any visible issues.

 

So, back to Comcast.  I don't know exactly what the issue is.  All I know is that the picture quality is noticeably worse now than it used to be.  Could 720p suffice if they didn't overcompress it?  Perhaps.  I'm not demanding 1080i, but I *would* like a better picture - however they achieve it.  I wish Comcast would have tested on larger TVs (65" and larger) and taken picture quality into account before choosing to do whatever they've done.  It looks bad.

Posted by
Contributor

Message 124 of 210
3,654 Views

jkozlow3 wrote:

At this point, we're simply speculating, are we not?  We don't know 100% WHY the picture quality is worse, we just know it is.

 

Yes, perhaps 720p would look fine if they didn't overcompress it.  But I don't know that we're benefiting as much from "professional deinterlacing" as you think.  Maybe we are, maybe we're not.  I'd personally like to see side-by-side comparisons.  I've always output 1080p from my X1 box, allowing the X1 box to do the deinterlacing.  When I set the X1 to output 1080i (allowing my TV to do the deinterlacing instead), I couldn't really tell a difference.  I personally think both devices handled the deinterlacing well enough.

 

Yes, deinterlacing has been an issue in the past (it was a BIG issue many years ago with consumer grade equipment such as when HDTVs were first becoming mainstream), but I'm not sure that it's still a big issue for most material.  I suspect that most consumer grade equipment deinterlaces good enough these days.  I remember testing 2 blu-ray players side-by-side many years ago (8+) with SD DVDs (480i).  I found a movie (American Beauty) that had some serious issues when allowing the blu-ray player to deinterlace the signal (it didn't matter if I output 480p or 1080p - the conversion from interlaced to progressive was the issue).  There were several scenes that had a lot of artifacts.  I borrowed another copy of this movie from a friend and hooked up 2 different brands of blu-ray players side-by-side into 2 different inputs on my AVR so that I could quickly switch back and forth.  The difference was astounding and 1 player handled the deinterlacing without any visible issues.

 

So, back to Comcast.  I don't know exactly what the issue is.  All I know is that the picture quality is noticeably worse now than it used to be.  Could 720p suffice if they didn't overcompress it?  Perhaps.  I'm not demanding 1080i, but I *would* like a better picture - however they achieve it.  I wish Comcast would have tested on larger TVs (65" and larger) and taken picture quality into account before choosing to do whatever they've done.  It looks bad.


I couldn't agree with you more. We are indeed speculating, and that is precisely what I take issue with. I was hoping that someone with far more expertise in the AV sciences than I could chime in and identify the culprit. Is it the conversion to 720p that's causing the problem? The post-conversion encoding? Both? I certainly don't know, so I asked the question. But the "I-don't-like-it-change-it-back" approach for me is neither productive nor compelling because it doesn't advance the ball in any systematic way. If Comcast granted the wish and went back to 1080i but continued to apply the same encoding scheme to the 1080i signal as it currently does to the 720p signal, would we be better off? Shouldn't we know before stomping our feet and demanding that Comcast do it? It strikes me as a wasted effort if we get what we ask for, and the picture still stinks because we didn't ask for the right thing.

Posted by
Service Expert

Message 125 of 210
3,605 Views

JEB11 wrote:

I couldn't agree with you more. We are indeed speculating, and that is precisely what I take issue with. I was hoping that someone with far more expertise in the AV sciences than I could chime in and identify the culprit. Is it the conversion to 720p that's causing the problem? The post-conversion encoding? Both? I certainly don't know, so I asked the question. But the "I-don't-like-it-change-it-back" approach for me is neither productive nor compelling because it doesn't advance the ball in any systematic way. If Comcast granted the wish and went back to 1080i but continued to apply the same encoding scheme to the 1080i signal as it currently does to the 720p signal, would we be better off? Shouldn't we know before stomping our feet and demanding that Comcast do it? It strikes me as a wasted effort if we get what we ask for, and the picture still stinks because we didn't ask for the right thing.


That's not how it works. The signal from the satellite will eventually be a master version in a 4K (and later 8K) progressive format that is received by a 'recoder', which provides the 8K/4K/HDR, HD 720p60, and SD versions all from the single copy sent via the satellite using next-generation HEVC security. The recoder puts out the signals in MPEG-4 or MPEG-2 format for each streamed output.

 

As an example, you get TV Land normally in SD, but go to TV Land on TV Go and it is in HD. It's all the same satellite feed, just recoded. That is why the conversion to an all-progressive format is being made for satellite-delivered signals.
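A hypothetical sketch of the "one master feed, many renditions" model being described; every name, resolution, codec, and bitrate below is made up for illustration and is not Comcast's actual ladder:

```python
# Hypothetical sketch of the "one master feed, many renditions" model.
# Rendition names, resolutions, codecs and bitrates are illustrative only.
from dataclasses import dataclass

@dataclass
class Rendition:
    name: str
    width: int
    height: int
    progressive: bool
    codec: str
    mbps: float

MASTER = Rendition("mezzanine 4K master", 3840, 2160, True, "HEVC", 60.0)

LADDER = [
    Rendition("UHD", 3840, 2160, True, "HEVC", 18.0),
    Rendition("HD",  1280,  720, True, "MPEG-4", 6.5),
    Rendition("SD",   720,  480, True, "MPEG-2", 2.5),
]

def recode(master, ladder):
    """Pretend 'recoder': every output rendition is derived from the one master."""
    for out in ladder:
        print(f"{master.name} -> {out.name}: "
              f"{out.width}x{out.height}{'p' if out.progressive else 'i'} "
              f"{out.codec} @ {out.mbps} Mbps")

recode(MASTER, LADDER)
```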




I am not a Comcast employee, I am a paying customer just like you!
I am an XFINITY Forum Expert and I am here to help.
We ask that you post publicly so people with similar questions may benefit from the conversation.
Was your question answered? Mark it as an accepted solution!

Posted by
Problem Solver

Message 126 of 210
3,592 Views

Thank you Rusty:

 

That is the way I thought it would work. Eventually we should get a 4K cable box and at least one 4K linear channel. It has been hard for any business to plan long term over the last few years.

 

 

Posted by
Contributor

Message 127 of 210
3,581 Views

Rustyben wrote:
That's not how it works. The signal from the satellite will eventually be a master version in a 4K (and later 8K) progressive format that is received by a 'recoder', which provides the 8K/4K/HDR, HD 720p60, and SD versions all from the single copy sent via the satellite using next-generation HEVC security. The recoder puts out the signals in MPEG-4 or MPEG-2 format for each streamed output.

As an example, you get TV Land normally in SD, but go to TV Land on TV Go and it is in HD. It's all the same satellite feed, just recoded. That is why the conversion to an all-progressive format is being made for satellite-delivered signals.



I don't have quite the technical sophistication to follow all of this.  Is the punchline that the conversion of all channels to 720p is more of an interim step to delivery of higher resolutions via satellite feed and it's nothing to be concerned about at this point?  If so, you have answered my question.

Posted by
Service Expert

Message 128 of 210
3,550 Views
Posted by
New Poster

Message 129 of 210
2,748 Views

All I know is that the image quality seems to be worse than it used to be and once I've noticed it I can't unsee it. 

 

I primarily use two TiVo boxes with CableCARDs and they are capable of 1080p, but I can't find a single channel on Xfinity that's 1080p. Not one. I am very disappointed in the direction Comcast is heading. With 4K on the horizon already and 1080p well entrenched after being available for about a decade, they seem to be heading backwards.

 

My only other options would be to go with DirecTV or to cut the cord, because there are no other cable providers where I live. I really don't like either of those options, so I'm stuck.

Posted by
Regular Contributor

Message 130 of 210
2,736 Views

TheKurtster wrote:

All I know is that the image quality seems to be worse than it used to be and once I've noticed it I can't unsee it. 

 

I primarily use two TiVo boxes with CableCARDs and they are capable of 1080p, but I can't find a single channel on Xfinity that's 1080p. Not one. I am very disappointed in the direction Comcast is heading. With 4K on the horizon already and 1080p well entrenched after being available for about a decade, they seem to be heading backwards.

 

My only other options would be to go with DirecTV or to cut the cord, because there are no other cable providers where I live. I really don't like either of those options, so I'm stuck.


There will be a lot more cord-cutting options in the near future.  Hulu, Verizon, etc. are all coming out with streaming live-tv services soon!

Posted by
Service Expert

Message 131 of 210
2,725 Views

TheKurtster wrote:

All I know is that the image quality seems to be worse than it used to be and once I've noticed it I can't unsee it. 

 

I primarily use two TiVo boxes with CableCARDs and they are capable of 1080p, but I can't find a single channel on Xfinity that's 1080p. Not one. I am very disappointed in the direction Comcast is heading. With 4K on the horizon already and 1080p well entrenched after being available for about a decade, they seem to be heading backwards.

 

My only other options would be to go with DirecTV or to cut the cord, because there are no other cable providers where I live. I really don't like either of those options, so I'm stuck.


1080p is not a broadcast or cable-provided format for programming. For HD, the normal formats are 720p and 1080i. Note that the "i" means interlaced, which is a CRT-era transmission protocol/technology that is being deprecated in favor of 720p (progressive). Google 720p vs 1080i for more information.




I am not a Comcast employee, I am a paying customer just like you!
I am an XFINITY Forum Expert and I am here to help.
We ask that you post publicly so people with similar questions may benefit from the conversation.
Was your question answered? Mark it as an accepted solution!

Posted by
Contributor

Message 132 of 210
2,717 Views

TheKurtster wrote:

All I know is that the image quality seems to be worse than it used to be and once I've noticed it I can't unsee it. 

 

I primarily use two TiVo boxes with CableCARDs and they are capable of 1080p, but I can't find a single channel on Xfinity that's 1080p. Not one. I am very disappointed in the direction Comcast is heading. With 4K on the horizon already and 1080p well entrenched after being available for about a decade, they seem to be heading backwards.

 

My only other options would be to go with DirecTV or to cut the cord, because there are no other cable providers where I live. I really don't like either of those options, so I'm stuck.


You've answered your own question: DirecTV is the far better option when it comes to picture quality. Comcast has decided that 720p is the future when all evidence points to the contrary.

 

Make no mistake, Comcast moving to 720p for all their channels has absolutely nothing to do with improving picture quality and everything to do with saving bandwidth to enable their move to a fully streaming IP delivery system and increased HSI speeds. 

 

If you don't want to suffer through crappy-quality video from Comcast while they struggle with their limited bandwidth problems, your best option is to move to DirecTV, especially if you value good picture quality! Comcast is in a transition stage; don't pay them while they reduce the quality of the product to benefit their internal infrastructure changes. Actually, they should be offering discounts for the reduced quality. 

 

Another tip: don't listen to what Rustyben says about broadcast formats; he's trying to confuse the issue and avoid the real question, since as we all know, modern displays deinterlace those 1080i signals to display at 1080p. Additionally, his comments about 1080p not being a broadcast format are patently false; the ATSC spec actually includes 1080p30 and 1080p24 among its broadcast formats.

 

Again, Comcast's claim that the reduction in image resolution was made to improve picture quality is flat-out false; it was done for their benefit and not for the benefit of the viewer.

Posted by
Problem Solver

Message 133 of 210
2,708 Views

KeenanSR is correct. Countries like Japan, S. Korea, and others are transmitting OTA, cable, satellite, and broadband in UHD (4K), which also includes 1080p in MPEG4. NHK in Japan makes all the transmission equipment for 4K and the upcoming 8K. Information on NHK can be found with a YouTube search.

 

Rusty would be correct about the U.S.A. (a third-world country for television technology), which has no such OTA or linear-cable transmissions yet.

Posted by
Contributor

Message 134 of 210
2,702 Views

MNtundraRET wrote:

KeenanSR is correct. Countries like Japan, S. Korea, and others are transmitting OTA, cable, satellite, and broadband in UHD (4K), which also includes 1080p in MPEG4. NHK in Japan makes all the transmission equipment for 4K and the upcoming 8K. Information on NHK can be found with a YouTube search.

 

Rusty would be correct about the U.S.A. (a third-world country for television technology), which has no such OTA or linear-cable transmissions yet.


Right, the takeaway here being that Comcast has claimed this reduction to 720p is to "improve" picture quality, but the fact is that it was done to save bandwidth. Comcast could very easily have sent the 1080i channels in their native resolution in MPEG4 (DirecTV and others, including networks, have been doing it for years), just as they are doing with the 720p channels, but they wanted to squeeze out that last little bit of bandwidth to facilitate ongoing infrastructure changes, not to improve picture quality.
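To put rough numbers on the bandwidth argument, here is a sketch using assumed typical bitrates (generic industry ballpark figures, not Comcast's actual encoder settings):

# HD streams per 256-QAM cable channel, using assumed typical bitrates.
# Real systems use statistical multiplexing, so the counts vary.
QAM256_MBPS = 38.8  # approximate payload of one 256-QAM channel

typical_stream_mbps = {
    "1080i MPEG2": 14.0,   # assumed legacy bitrate
    "1080i MPEG4": 7.5,    # assumed AVC bitrate at the same resolution
    "720p MPEG4":  6.0,    # assumed AVC bitrate after the downscale
}

for fmt, mbps in typical_stream_mbps.items():
    print(f"{fmt}: ~{int(QAM256_MBPS // mbps)} streams per QAM channel")

On numbers like these, the MPEG2-to-MPEG4 switch alone roughly doubles the streams per QAM, and the 1080i-to-720p downscale only squeezes out a little more, which is the point being made above.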

Posted by
Regular Contributor

Message 135 of 210
2,701 Views

KeenanSR wrote:

TheKurtster wrote:

All I know is that the image quality seems to be worse than it used to be and once I've noticed it I can't unsee it. 

 

I primarily use two TiVo boxes with CableCARDs. They are capable of 1080p, but I can't find a single channel on Xfinity that's 1080p. Not one. I am very disappointed in the direction Comcast is heading. With 4K already on the horizon and 1080p well entrenched after being available for about a decade, they seem to be heading backwards. 

 

My only other options would be to go with DirecTV or to cut the cord, because there are no other cable providers where I live. I really don't like either of those options, so I'm stuck.


You've answered your own question: DirecTV is the far better option when it comes to picture quality. Comcast has decided that 720p is the future when all evidence points to the contrary.

 

Make no mistake, Comcast moving to 720p for all their channels has absolutely nothing to do with improving picture quality and everything to do with saving bandwidth to enable their move to a fully streaming IP delivery system and increased HSI speeds. 

 

If you don't want to suffer through crappy-quality video from Comcast while they struggle with their limited bandwidth problems, your best option is to move to DirecTV, especially if you value good picture quality! Comcast is in a transition stage; don't pay them while they reduce the quality of the product to benefit their internal infrastructure changes. Actually, they should be offering discounts for the reduced quality. 

 

Another tip: don't listen to what Rustyben says about broadcast formats; he's trying to confuse the issue and avoid the real question, since as we all know, modern displays deinterlace those 1080i signals to display at 1080p. Additionally, his comments about 1080p not being a broadcast format are patently false; the ATSC spec actually includes 1080p30 and 1080p24 among its broadcast formats.

 

Again, Comcast's claim that the reduction in image resolution was made to improve picture quality is flat-out false; it was done for their benefit and not for the benefit of the viewer.


THIS!

 

Due to a contract I am still with Comcast, but the lack of some HD channels is the reason I will go back after the contract is up. However, I do prefer the X1 UI over DirecTV's; X1 just seems simpler. This whole "upgrade" changed my viewing habits. When I can watch a show via an app on my AppleTV, such as HBO Go, AMC, etc., that is my 1st choice. My 2nd choice is On Demand, as Comcast has yet to degrade that quality. The 3rd is live/recorded, which gives me the worst picture.

 

The quality of live sports on channels such as TNT, TBS, etc. is night and day compared to when I used to watch them on DirecTV, and I am not even talking night and day for a videophile. My wife even asks, "Is this the HD channel?" My response is always: yes, X1 now switches to it automatically; it is just Comcast's new upgraded picture.

 

HBO, AMC, etc. all look awful, especially for movies and shows with dark scenes.

Posted by
Contributor

Message 136 of 210
2,683 Views

Glad I'm not the only one using my AppleTV more. I'll see what I recorded on my DVR, then go find it on my AppleTV to watch, then delete it off my DVR. The DVR serves mainly as a bookmark, and as an archive for all the shows I recorded before the 720p transition.

Posted by
Service Expert

Message 137 of 210
2,663 Views

charissamz wrote:

KeenanSR wrote:

TheKurtster wrote:

All I know is that the image quality seems to be worse than it used to be and once I've noticed it I can't unsee it. 

 

I primarily use two TiVo boxes with CableCARDs. They are capable of 1080p, but I can't find a single channel on Xfinity that's 1080p. Not one. I am very disappointed in the direction Comcast is heading. With 4K already on the horizon and 1080p well entrenched after being available for about a decade, they seem to be heading backwards. 

 

My only other options would be to go with DirecTV or to cut the cord, because there are no other cable providers where I live. I really don't like either of those options, so I'm stuck.


You've answered your own question: DirecTV is the far better option when it comes to picture quality. Comcast has decided that 720p is the future when all evidence points to the contrary.

 

Make no mistake, Comcast moving to 720p for all their channels has absolutely nothing to do with improving picture quality and everything to do with saving bandwidth to enable their move to a fully streaming IP delivery system and increased HSI speeds. 

 

If you don't want to suffer through crappy-quality video from Comcast while they struggle with their limited bandwidth problems, your best option is to move to DirecTV, especially if you value good picture quality! Comcast is in a transition stage; don't pay them while they reduce the quality of the product to benefit their internal infrastructure changes. Actually, they should be offering discounts for the reduced quality. 

 

Another tip: don't listen to what Rustyben says about broadcast formats; he's trying to confuse the issue and avoid the real question, since as we all know, modern displays deinterlace those 1080i signals to display at 1080p. Additionally, his comments about 1080p not being a broadcast format are patently false; the ATSC spec actually includes 1080p30 and 1080p24 among its broadcast formats.

 

Again, Comcast's claim that the reduction in image resolution was made to improve picture quality is flat-out false; it was done for their benefit and not for the benefit of the viewer.


THIS!

 

Due to a contract I am still with Comcast, but the lack of some HD channels is the reason I will go back after the contract is up. However, I do prefer the X1 UI over DirecTV's; X1 just seems simpler. This whole "upgrade" changed my viewing habits. When I can watch a show via an app on my AppleTV, such as HBO Go, AMC, etc., that is my 1st choice. My 2nd choice is On Demand, as Comcast has yet to degrade that quality. The 3rd is live/recorded, which gives me the worst picture.

 

The quality of live sports on channels such as TNT, TBS, etc. is night and day compared to when I used to watch them on DirecTV, and I am not even talking night and day for a videophile. My wife even asks, "Is this the HD channel?" My response is always: yes, X1 now switches to it automatically; it is just Comcast's new upgraded picture.

 

HBO, AMC, etc. all look awful, especially for movies and shows with dark scenes.


Usually Comcast will not charge an ETF if you keep one line of service, for example internet.




I am not a Comcast employee, I am a paying customer just like you! I am an XFINITY Forum Expert and I am here to help.

Posted by
Service Expert

Message 138 of 210
2,658 Views

KeenanSR wrote:

MNtundraRET wrote:

KeenanSR is correct. Countries like Japan, S. Korea, and others are transmitting OTA, cable, satellite, and broadband in UHD (4K), which also includes 1080p in MPEG4. NHK in Japan makes all the transmission equipment for 4K and the upcoming 8K. Information on NHK can be found with a YouTube search.

 

Rusty would be correct about the U.S.A. (a third-world country for television technology), which has no such OTA or linear-cable transmissions yet.


Right, the takeaway here being that Comcast has claimed this reduction to 720p is to "improve" picture quality, but the fact is that it was done to save bandwidth. Comcast could very easily have sent the 1080i channels in their native resolution in MPEG4 (DirecTV and others, including networks, have been doing it for years), just as they are doing with the 720p channels, but they wanted to squeeze out that last little bit of bandwidth to facilitate ongoing infrastructure changes, not to improve picture quality.


I'm a customer like you guys. I do know that the cable companies get 'a' channel feed and use a transcoder to produce all the needed varieties for linear SD, HD, device streaming, and web-site streaming. No modern display device uses the CRT interlaced format, because the signal has to be approximated and reassembled from half-images that lie over each other, like shredding strips of paper from two different images and pasting them back together electronically.

 

"progressive" video is a stream of whole (or changes in the whole) pictures (imagine stills). with mpeg 4 there are fill in for changes that are 'boxes' 'rectangles' and even 'slices' (very small areas).  The point is that interlace would never have been used if progressive devices had been available when the technology of TV first began. There are people that still want NTSC just because they don't like change. No 1080i can 'make' 1080p the detail is not magically created out of air only approximated. 




I am not a Comcast employee, I am a paying customer just like you! I am an XFINITY Forum Expert and I am here to help.

Posted by
Problem Solver

Message 139 of 210
2,644 Views

This Saturday, 4/8/17, CBS will be using the UHD cameras they first tested (internal use only) at the Super Bowl in 2016. The cameras are set up at "Amen Corner" at the Masters in Georgia. The feed will be available to the few DirecTV customers who have 4K televisions and the needed 4K satellite equipment; those customers will see only the 4K cameras used there, "live" on Saturday and Sunday. It is most likely the same equipment used for some Notre Dame football games last fall.

 

This is somewhat similar to an experiment run by NBC last year at the PGA tournament. NBC created a special 4K PGA app for those of us with Samsung UHD televisions. I downloaded the app and was able to watch 4K cameras follow a choice of two groups of players "live" on the first three days, or watch replays supplied on the app in 4K. The live action was rather poor, considering people were just getting used to the equipment; pictures would break up when they walked with the cameras on some fairways. The replays were all good. This experiment was done using broadband Internet for transmission.

 

What this means is that when Comcast gets their Xi6 (4K) cable box, it may be possible for them to pick up 4K, or 1080p, signals off the bird for early 4K experiments from CBS or NBC in the future. MPEG4 is used by NHK for their 4K and 8K equipment.

Posted by
Regular Contributor

Message 140 of 210
2,638 Views

Cydeweyz wrote:

Glad I'm not the only one using my AppleTV more. I'll see what I recorded on my DVR, then go find it on my AppleTV to watch, then delete it off my DVR. The DVR serves mainly as a bookmark, and as an archive for all the shows I recorded before the 720p transition.


I do the same. If it isn't on AppleTV, I use my recorded queue to click on it, choose "ways to watch", and pick the On Demand version over the recorded version. The On Demand version is better than the live feed. It honestly is not as good as DirecTV, but it isn't bad. If you really analyze it, you can see that On Demand is worse than, say, an AppleTV stream and obviously Blu-ray, but it is watchable without ruining the experience. I wish the live feed were that way; if it were, I could easily look past it.

 

My hope is that, as a previous poster and others have mentioned, this is all to save bandwidth to roll out new stuff. If you look into it, Comcast is experimenting on Roku right now with their "Stream TV" app, like the one you can get on your iPad, etc., basically making your streaming device a cable box. I hope that the quality through that can equal a stream you get through an app such as HBO Go. We shall see. 

Posted by
Problem Solver

Message 141 of 210
2,622 Views

I just found a download for the Masters Golf 4K app on my Samsung UHD TV covering "Amen Corner" for all 4 days. Times are 11:45 am to 6:00 pm EDT each day. It is live television delivered over streaming Internet broadband. DirecTV has the same broadcast being done live on satellite.

 

The live signal is excellent this time around. This would be done with CBS 4K equipment.

Posted by
New Poster

Message 142 of 210
2,139 Views

" There are people that still want NTSC just because they don't like change. No 1080i can 'make' 1080p the detail is not magically created out of air only approximated. "

It's not magic. Deinterlacing (1080i 24fps --> 1080i 60fps) is trivial in modern TVs and set-top boxes.
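For what it's worth, the simplest thing a TV or set-top box can do with 60 fields per second is "bob" deinterlacing, where each field is scaled up into its own frame. A minimal line-doubling sketch (toy data, not any vendor's actual algorithm):

import numpy as np

def bob(field):
    # Line-double one 540-line field into a 1080-line frame. Each field
    # becomes its own output frame, so 60 fields/s -> 60 frames/s.
    # Plain repetition halves vertical detail; real hardware interpolates
    # and blends with a weave result when there is no motion.
    return np.repeat(field, 2, axis=0)

field = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)  # toy luma field
print(bob(field).shape)  # (1080, 1920)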

Posted by
Service Expert

Message 143 of 210
2,129 Views

locomo wrote:

" There are people that still want NTSC just because they don't like change. No 1080i can 'make' 1080p the detail is not magically created out of air only approximated. "

It's not magic. Deinterlacing (1080i 24fps --> 1080i 60fps) is trivial in modern TVs and set-top boxes.


I would suggest a typo fix, but that doesn't mean anything (just a frame rate change?).

 

Progressive (whole pictures) vs. mixing of two separate images (interlace) makes a lot of difference.




I am not a Comcast employee, I am a paying customer just like you! I am an XFINITY Forum Expert and I am here to help.

Posted by
New Poster

Message 144 of 210
2,120 Views

Comcast has hit the Boston area with the 720p downgrade. TNT used to be 1080i with 448kbps audio; now it is 720p with 384kbps. <Edited for violating forum guidelines: "Language">! Why? It also broke my CableCARD, so I can't watch the cable networks now.

Posted by
Service Expert

Message 145 of 210
2,114 Views

erickrohto wrote:

Comcast has hit the Boston area with the 720p downgrade. TNT used to be 1080i with 448kbps audio; now it is 720p with 384kbps. Why? It also broke my CableCARD, so I can't watch the cable networks now.


You will need to open a ticket to resolve the CableCARD issue with the device it is plugged into. The notice about converting from the higher-bitrate MPEG2 streams to MPEG4 went out many months ago. There are many articles available about interlaced 1080i versus progressive 720p60 and quality. Comcast is moving toward an all-progressive-image video network.




I am not a Comcast employee, I am a paying customer just like you! I am an XFINITY Forum Expert and I am here to help.

Posted by
Regular Contributor

Message 146 of 210
2,099 Views

erickrohto wrote:

Comcast has hit the Boston area with the 720p downgrade. TNT used to be 1080i with 448kbps audio; now it is 720p with 384kbps. Why? It also broke my CableCARD, so I can't watch the cable networks now.


But it's an upgrade! I can't believe the downgrade in picture quality. Some people around these forums like to say the picture quality drop isn't bad, or that it's your TV, your settings, etc. I came from DirecTV to this, so it is very noticeable. I am curious to see what someone who knows about this and just went through the "upgrade" thinks about the picture quality, especially in dark scenes in movies on channels like HBO.

 

Posted by
Service Expert

Message 147 of 210
2,092 Views

charissamz wrote:

erickrohto wrote:

Comcast has hit the Boston area with the 720p downgrade. TNT used to be 1080i with 448kbps audio; now it is 720p with 384kbps. <Edited per forum guidelines> Comcast! Why? It also broke my CableCARD, so I can't watch the cable networks now.


But it's an upgrade! I can't believe the downgrade in picture quality. Some people around these forums like to say the picture quality drop isn't bad, or that it's your TV, your settings, etc. I came from DirecTV to this, so it is very noticeable. I am curious to see what someone who knows about this and just went through the "upgrade" thinks about the picture quality, especially in dark scenes in movies on channels like HBO.


Look at ABC: do you see any degradation on that channel? ABC and FOX are native 720p, so there should be no difference. If you set your Comcast equipment to output 720p, your home equipment can then upconvert to 1080p if you like, with better-designed equipment.
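Upconversion in the box or the display is just spatial interpolation. A minimal sketch using Pillow (assumed available) shows the idea, with the caveat that no amount of interpolation restores detail that was discarded in the 720p downscale:

import numpy as np
from PIL import Image

# Resize a toy 1280x720 RGB frame to 1920x1080 with bilinear interpolation.
frame_720 = Image.fromarray(
    np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
)
frame_1080 = frame_720.resize((1920, 1080), resample=Image.BILINEAR)
print(frame_1080.size)  # (1920, 1080)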




I am not a Comcast employee, I am a paying customer just like you! I am an XFINITY Forum Expert and I am here to help.

Posted by
Frequent Visitor

Message 148 of 210
1,881 Views

This needs to be reported to the FCC and the local police.

 

Posted by
Service Expert

Message 149 of 210
1,875 Views

BigDave1927 wrote:

This needs to be reported to the FCC and the local police. 


pardon?




I am not a Comcast employee, I am a paying customer just like you! I am an XFINITY Forum Expert and I am here to help.

Posted by
Frequent Visitor

Message 150 of 210
1,858 Views

BigDave1927 wrote:

This needs to be reported to the FCC and the local police.

 


"Hello, mr. occifer sir. Umm, I'd like to report a theft. Comcast has stolen my pixels.".

 

I'm curious how that would turn out.