Welcome to Comcast Help & Support Forums
Find solutions, share knowledge, and get answers from customers and experts


Comcast downgrading all 1080i HD channels to 720p

Posted by
Service Expert

Message 51 of 182
2,884 Views

jlind wrote:

Rustyben wrote:

information only, if you don't appreciate tech talk, please skip this post.

 <snip>

The result is 720p MPEG4 mush. It's not High Definition, it's Horrid Definition.

 

John


Your post seems to be confusing QAM containers, which carry several channels and are never crowded nor bit-starved, with MPEG4 containers. The MPEG4 container is far more efficient at preventing loss of picture detail during encoding/decoding. MPEG4 also has open-ended metadata streams, allowing the text information needed for new technology like Atmos.




I am not a Comcast employee, I am a paying customer just like you! I am an XFINITY Forum Expert and I am here to help.

Posted by
Service Expert

Message 52 of 182
2,882 Views

KeenanSR wrote:

Rustyben wrote:

charissamz wrote:
I hope you get something from comcast for your posts... I really do. It's the only explanation I have for your responses at this point. While I and others fully understand your comments, the fact remains, Comcast is the only one doing this.
<snip>
We understand that you're trying to help out on the forum but please stop parroting the nonsense that Comcast has been stating elsewhere in this forum, it's disingenuous and insulting to the intelligence of those that know better.

OK, you seem to be pushing for a return to MPEG2 (inefficient compared to MPEG4) for some reason. The idea is to get the best image in the smallest package, which is why Comcast and others are spending many millions to convert to MPEG4. Sure, Comcast could have kept its money in the bank and not used the HD technology fees it collects to improve the delivered picture quality, but instead it committed to a massive upgrade in technology.

 

You should be quite glad that out-of-home viewing is in MPEG4, as it uses approximately half the data for the same image quality. I'm hoping that Comcast soon decides to let you adjust your amount of cloud storage at will and maybe remove the physical DVR from the home entirely.

 

And for what it is worth, I'm a customer just like you. I just happen to have been a TV tech since the old black-and-white days: I regunned CRTs and dealt with technology changes like round RCA color tubes giving way to rectangular color, and hand-wired chassis giving way to modern black-box epoxied boards, ... (etc.)




I am not a Comcast employee, I am a paying customer just like you! I am an XFINITY Forum Expert and I am here to help.

Posted by
Regular Contributor

Message 53 of 182
2,873 Views

RustyBen, let's be real. It is simple: Comcast is just trying to squeeze all they can out of what they have without spending too much on infrastructure. They know most people won't notice the difference and/or won't care enough to leave. They also know traditional cable TV is changing; I won't say dying, but it is changing. Their internet is their true bread and butter; even if the world cut the cord they know they have that. And if net neutrality changes, well, look out, they really have that! Decrease this to allow more bandwidth for internet.

 

My hope is that this is one step back to go two steps forward, and that by increasing bandwidth we see IPTV and the Xfinity TV app on devices like Apple TV that allow a good stream, such as what you get from the TV channel apps like HBO Go and AMC, or from Netflix. Or even an IPTV Comcast cable box.

 

Like the previous poster said, please stop. You may know more about the technology behind the scenes than I do, but you aren't talking to people asking which end of the HDMI cable plugs in where or how to program their remotes. This is a business decision by Comcast, not an "upgrade".

 

I only watch things via On Demand or apps on Apple TV now because of this, until my contract is up and I go back to DirecTV. When I watch live or recorded TV, I step back to when I first saw a 480p widescreen DVD on my new rear-projection Toshiba TW40X81 40" Theater Wide HDTV-Ready Projection TV. The only difference is that it was a great picture then, before, you know, HDTV.

Posted by
Contributor

Message 54 of 182
2,868 Views

Rustyben wrote:

<snip>


Never, ever have I said I would like a return to MPEG2, so please cease with that nonsense.

 

MPEG4 is a wonderful codec when it's used properly, and, clearly, Comcast has a long way to go in getting things right. What's annoying to me and many others is the total lack of any response to that issue. It's always some blurb about this being what's best after surveying our customers (I won't even get into who those customers are, and if Comcast is so confident of those survey results, why not show us the data...) instead of looking at hard data like I have posted above. The line is always to ignore any facts contrary to the corporate position.

 

I am very pleased to see that you have stated clearly that this is primarily about optimizing the video for customers using cell phones and tablets, where high-resolution video is not a high priority. The problem is that a large portion of us have displays far larger than 5" cell phone screens, and the product that you yourself have just said is targeting mobile users looks terrible when viewed in a home video setting where display sizes reach 65" and larger.

 

I held back this image because I knew someone would try the "you want to stay at MPEG2" nonsense. Well, the below image is also MPEG4. Note the file size, 1.69 GB, and the bitrate, 5200 Kbps. I grabbed this from a file-sharing site to compare with the pre-conversion and post-conversion samples that I've posted above, and you know what, this sample looks better than either the MPEG2 or the MPEG4 version Comcast has supplied.

 

 

[image: VanHelsing112516P]

 

So yes, MPEG4 (AVC) is definitely a quality video codec and I have zero issues with Comcast converting to it, but they need to use it properly, and currently they don't seem to care about doing that. You've pretty much confirmed the reason why: Comcast is focused on the mobile video audience and on creating more room for HSI and video delivery via IP.

 

So, you can continue to spew out your nonsense about how what they've done is better but I won't be engaging with you anymore on the subject as you clearly are opposed to listening to facts and logic.

Posted by
Service Expert

Message 55 of 182
2,857 Views

KeenanSR wrote:

<snip>


It is rare to be so totally gaslighted; you fully misrepresent my points.

 

Only one file may be needed, and that is at maximum resolution (for example, for streaming on demand). The industry direction is to take that one file and deliver what is needed using just-in-time transcoding for whatever device it is being delivered to at the moment. Comcast and other cable providers will no longer have to maintain many different files to present when needed.
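The just-in-time idea can be sketched in a few lines. This is only an illustration under my own assumptions (the device profiles, bitrates and file names are made up; the ffmpeg flags are standard), not Comcast's actual pipeline:

```python
# Hypothetical sketch: pick an output profile per device and build an ffmpeg
# command that transcodes one high-resolution mezzanine file on demand.
# Profile names and numbers are illustrative assumptions only.

PROFILES = {
    "tv_4k":  {"height": 2160, "v_bitrate": "15000k"},
    "tv_hd":  {"height": 1080, "v_bitrate": "8000k"},
    "tablet": {"height": 720,  "v_bitrate": "4000k"},
    "phone":  {"height": 480,  "v_bitrate": "1500k"},
}

def jit_transcode_cmd(mezzanine: str, device: str, out_path: str) -> list[str]:
    p = PROFILES[device]
    return [
        "ffmpeg", "-i", mezzanine,
        "-vf", f"scale=-2:{p['height']}",   # keep aspect ratio, set output height
        "-c:v", "libx264", "-b:v", p["v_bitrate"],
        "-c:a", "copy",                      # pass the audio through untouched
        out_path,
    ]

# Example with hypothetical file names:
print(" ".join(jit_transcode_cmd("movie_2160p.ts", "tablet", "out_720p.ts")))
```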

 

The point is that the technology has moved on to what comes after interlaced video content.




I am not a Comcast employee, I am a paying customer just like you! I am an XFINITY Forum Expert and I am here to help.

Posted by
Frequent Visitor

Message 56 of 182
2,842 Views

KeenanSR wrote:

Rustyben wrote:

charissamz wrote:
I hope you get something from comcast for your posts... I really do. It's the only explanation I have for your responses at this point. While I and others fully understand your comments, the fact remains, Comcast is the only one doing this.

Plus, ESPN 720p looks fine on Comcast - Comcast's 1080i to 720p channels look awful. No math needed, we can all see it with our own eyes. I can agree with everything you say about 720p but Comcast's bandwidth starved 1080i to 720p conversion simply looks like DVD quality.

I'm not sure why you believe any channel is bandwidth starved; MPEG4 encoding is state of the art. If you mean that DVD channel streams embedded in a QAM channel with other channels (usually SD) cause bit-starving (is that a real term?), they just don't. Multiplexing multiple live data streams in the same frequency band has been around for decades (FM radio, for example). Lasers do it by mixing different wavelengths (basically colors) on the same fiber transmission line, and the wavelengths don't interfere with one another.


(The below has been lifted from a post I made at another forum, it has some syntax edits to provide better context here.) 

 

You say they are not starved. Then why is the below happening? The first image is from an episode before the conversion was done, the second from an episode after the conversion was done.

 

There are several shows on Syfy where the resulting bitrate and file size are identical (1.65 GB @ 3.94 Mbps per kmttg) for over 20 episodes since the conversion, which tells me that there's a hard bandwidth allocation setting in the stat-muxing, that the amount simply isn't enough, and that these shows are banging up against it, resulting in very poor-quality images. If I were recording more shows on Syfy, I have no doubt they would all have the same bandwidth and bitrate.


[graphics snipped out]
 
Note what looks like a "hard ceiling" on the post-conversion MPEG4 version and how the pre-conversion MPEG2 version has a much more dynamic flow of data peaks and valleys; the content is getting some "room" to express itself. The post-conversion file is pretty much locked into pounding a ceiling at around 3200kbps; it's "screaming out" for more bandwidth.
 
Additionally, note how every single episode of Van Helsing post-MPEG4 conversion is exactly the same file size and bitrate while the episode recorded prior to the conversion is not. If these episodes were not being bit-starved then they would not all have the same bandwidth and data totals being used.
 
[another graphic snipped out]
 
We understand that you're trying to help out on the forum but please stop parroting the nonsense that Comcast has been stating elsewhere in this forum, it's disingenuous and insulting to the intelligence of those that know better.

kmttg . . . I recognize the software ;-)

 

You are showing graphically with the Bitrate Viewer exactly what I was finding during playback on a TV that shows the bitrate in real time. Last Friday night's 1080i/30 MPEG2 MacGyver episode (#11, "Scissors") from the local CBS station's channel on Comcast was varying between 7 and 16 Mbps on the video stream. That's considerable compression, about half that of a Blu-ray 1080p/24 AVC transport stream, but still respectable. kmttg, which I have but don't use (preferring TD and its server instead), reported 10.9 Mbps. By comparison, the 2008 remake of "The Day the Earth Stood Still" recorded in 720p/60 MPEG4 was a miserable 3.85 Mbps, which falls right in line with what I saw on the TV screen, varying from 2.8 to 5 Mbps. In addition, I watched a whitewater rafting portion of "The River Wild" (1994) on Encore-HD last night for a few minutes without recording it. The whitecaps and distant water detail in the rapids were complete mush with ZERO fine detail or texture. I recorded Inception last night as I've got it on Blu-ray, and the studio did a stellar job on its transfer with amazing detail, crisp edges and fine texture. I'm going to compare the same scene frames from each, and I already know what the answer will be, as I saw absolutely, positively **ZERO** skin texture in closeup shots of Leonardo DiCaprio's face. It looked as if it had been fashioned out of clay! In a scene where he's lying on the beach with the waves washing in over him, it was similar to "The River Wild", with ZERO detail in the whitecaps.
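For anyone who wants to run the same kind of check on their own recordings, here is a minimal sketch that reads the container-level numbers with ffprobe (assuming ffprobe is installed; the file name is hypothetical):

```python
import json
import subprocess

def container_stats(path: str) -> dict:
    """Read container duration, size and overall bit rate with ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries",
         "format=duration,size,bit_rate", "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    fmt = json.loads(out)["format"]
    return {
        "duration_s": float(fmt["duration"]),
        "size_bytes": int(fmt["size"]),
        "avg_mbps": int(fmt["bit_rate"]) / 1e6,
    }

# Example (hypothetical recording name):
# print(container_stats("macgyver_s01e11.ts"))
```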

 

You are exactly right. The postings defending the 720p/60 high-compression, bit-starved MPEG4 transport streams are disingenuous and insulting. Rustyben may be able to BS the fans and spectators with all the voodoo technobabble smoke and mirrors, but he cannot BS the players who have the tools to measure and demonstrate the effect.

 

And he wants to talk about interlaced? I haven't even gotten to the 3:2 pulldown required to get from 24 fps cinema to 60 fps progressive, and what that does to severely compressed video (also required for 30 fps interlaced, but it's mitigated some if done properly with a sufficiently high bit rate). The 3:2 pulldown is something the average Consumer Joe doesn't even know about, he only sees the effect of it. BTW, the last I looked at a Comcast 1080i/30 transport stream in detail a number of years ago, it wasn't pure interlaced, but a mix of interlaced and progressive frames, which TVs can handle without choking and gagging; there's some video editing and transcoding software that doesn't like it though. I'm putting discussion of 3:2 pulldown in its own posting.

 

John

Posted by
Frequent Visitor

Message 57 of 182
2,830 Views

Rustyben wrote:

<snip>


Yet more smoke and mirrors with a huge red herring tossed in for good measure. Not once did I make ANY mention of or even HINT at QAM!

 

For those that do not know what QAM is, it's the modulation method used to deliver the digital transport streams on RF carriers on North American cable TV systems. OTA broadcast TV uses ATSC, aka 8VSB, to modulate the transmitter RF carrier with its MPEG2 and H.264 MPEG4 transport streams. Cable TV systems used the same analog modulation method as broadcast TV, known as NTSC, during the analog TV era. When digital TV was implemented, OTA broadcast and cable went their own ways. This is why you cannot connect a cable box to a TV aerial and receive OTA digital TV, nor can you connect a cable TV cable to the TV's OTA antenna connector and receive any of the unencrypted cable channels -- unless the TV has both an ATSC tuner and a QAM tuner. If the TV only has an ATSC tuner you will get nothing from cable, and if the cable STB only has a QAM tuner, you will get nothing from an OTA antenna.

QAM is the acronym for Quadrature Amplitude Modulation, and it's a means of putting an analog or digital signal onto an RF carrier, just as FM (Frequency Modulation), AM (Amplitude Modulation), or PCM (Pulse Code Modulation) can, to mention some other examples, or as 8VSB does for OTA broadcast HDTV. It's called "quadrature" because the result (for digital content) is a combination of PSK (Phase Shift Keying) and ASK (Amplitude Shift Keying), which is robust for cable use and can put more information into a narrower RF bandwidth than some other modulation methods (e.g., PSK or PCM alone). 64-QAM is typically used for digital SD TV (aka 480i, the digital equivalent of analog NTSC TV and analog DVDs). 256-QAM is used for the HD content.

The "channels" on your STB are virtual channels representing specific MPEG2 and MPEG4 transport streams, not the actual RF channels the cable company divides its cable RF bandwidth into; that division is a form of Frequency Division Multiplexing (FDM). Several virtual channels (transport streams) can be put into a single QAM RF channel if the bit rates of the transport streams will fit in the QAM channel's RF bandwidth. Think of QAM as a pipe that can carry one or more MPEG2 or MPEG4 video transport streams. I don't know if Comcast is still using 64-QAM or not. Some years ago the digital side of things was a combination of the two, with 480i SD on 64-QAM and 1080i and 720p on 256-QAM (some local broadcast stations were using 720p and Comcast simply used their transport streams as-is).

When you select a "channel" to watch on your STB, you're really selecting a specific transport stream. The STB magically translates that into the actual QAM RF channel carrying the transport stream you selected, sets its internal RF QAM tuner to that channel's center frequency, demodulates the QAM RF signal, picks out the MPEG (2 or 4) transport stream you selected, decodes it using the proper CODEC, and delivers that to the TV. Transport streams are not "demodulated"; they're "decoded" using the appropriate CODECs.

 

All my discussion about the substantial degradation of picture quality with the 720p/60 resolution downgrade and the horribly compressed H.264 MPEG4 transport stream bit rates, as compared to the 1080i/30 H.262 MPEG2 transport stream bit rates, is about the transport streams. It has absolutely, positively NOTHING to do with the QAM that carries and delivers the transport streams to the STB on an RF carrier, its modulation method, or Comcast's actual RF QAM channelization on their cable system. The claim that I'm discussing Comcast's QAM is a red herring. QAM is NOT a "container" like MPEG; it's an RF carrier modulation method.
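To put rough numbers on the "pipe" analogy, here is a small worked example. The symbol rates and the ~38.8 Mbps usable payload are commonly cited approximate figures for 256-QAM cable channels, so treat them as ballpark assumptions rather than Comcast's actual numbers:

```python
import math

def qam_throughput_mbps(order: int, symbol_rate_msps: float) -> float:
    """Raw bit rate of a QAM channel: bits per symbol times symbol rate."""
    return math.log2(order) * symbol_rate_msps

raw_256 = qam_throughput_mbps(256, 5.36)   # ~42.9 Mbps raw (8 bits/symbol)
raw_64  = qam_throughput_mbps(64, 5.06)    # ~30.3 Mbps raw (6 bits/symbol)

payload_256 = 38.8   # approx. usable payload after FEC/overhead (assumption)
print(f"256-QAM raw: {raw_256:.1f} Mbps, usable ~{payload_256} Mbps")

# How many transport streams fit in one 256-QAM "pipe"?
print("MPEG2 HD streams at ~17 Mbps:", int(payload_256 // 17))    # ~2
print("H.264 HD streams at ~3.9 Mbps:", int(payload_256 // 3.9))  # ~9
```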

 

You might be able to BS the fans and spectators, but you CANNOT BS the players.

 

John

Posted by
Service Expert

Message 58 of 182
2,794 Views

jlind wrote:

KeenanSR wrote:

Rustyben wrote:

charissamz wrote:

 <SNIP>

You are exactly right. The postings defending the 720p/60 high-compression, bit-starved MPEG4 transport streams are disingenuous and insulting. Rustyben may be able to BS the fans and spectators with all the voodoo technobabble smoke and mirrors, but he cannot BS the players who have the tools to measure and demonstrate the effect.

 

And he wants to talk about interlaced? I haven't even gotten to the 3:2 pulldown required to get from 24 fps cinema to 60 fps progressive, and what that does to severely compressed video (also required for 30 fps interlaced, but it's mitigated some if done properly with a sufficiently high bit rate). The 3:2 pulldown is something the average Consumer Joe doesn't even know about, he only sees the effect of it. BTW, the last I looked at a Comcast 1080i/30 transport stream in detail a number of years ago, it wasn't pure interlaced, but a mix of interlaced and progressive frames, which TVs can handle without choking and gagging; there's some video editing and transcoding software that doesn't like it though. I'm putting discussion of 3:2 pulldown in its own posting.

 

John


mixture of interlace and progressive in the stream, eh? 

 

Why are you deflecting to the conversion of 24-frames-per-second filmed productions to the nearly 30 frames per second of now-defunct interlaced NTSC technology? When you need 30 or 60 frames per second and you have only 24, yes, you have to reuse the same frames or create new ones using math/approximation (aka distortion) by 'averaging' adjacent frames into new frames. Yes, it is a problem, but it is much better than the old kinescope method of recording off a monitor to get a higher rate.




I am not a Comcast employee, I am a paying customer just like you! I am an XFINITY Forum Expert and I am here to help.

Posted by
Service Expert

Message 59 of 182
2,788 Views

jlind wrote:

Rustyben wrote:

jlind wrote:

Rustyben wrote:

information only, if you don't appreciate tech talk, please skip this post.

 <snip>

The result is 720p MPEG4 mush. It's not High Definition, it's Horrid Definition.

 

John

 <snip>

You might be able to BS the fans and spectators, but you CANNOT BS the players.

 

John


http://www.radio-electronics.com/info/rf-technology-design/quadrature-amplitude-modulation-qam/what-... gives details on the actual technology of QAM (it is not PSK or ASK); it is just like FM quadrature modulation except that this is digital information. The rate of MPEG4 depends on what changes between frames, and it can get overloaded (a white-snow blizzard at noon). The QAM256 carries (as a container) the MPEG4 streams.

 

The point is that the 'need' for interlace is gone, so you must expect the providers to move off interlace and onto progressive.




I am not a Comcast employee, I am a paying customer just like you! I am an XFINITY Forum Expert and I am here to help.

Posted by
Visitor

Message 60 of 182
2,761 Views

We can argue the finer technical details of MPEG4 vs MPEG2 and interlaced vs progressive, but at the end of the day, on my 4K TV, Comcast has looked like garbage since the MPEG4/720p switch.

 

If you take a look at Hulu, it is still 720p but looks significantly better than the murky, over-compressed mess Comcast is producing.

Posted by
Frequent Visitor

Message 61 of 182
2,759 Views

Rustyben wrote:

<snip>


mixture of interlace and progressive in the stream, eh? 

 

Why are you deflecting to the conversion of 24-frames-per-second filmed productions to the nearly 30 frames per second of now-defunct interlaced NTSC technology? When you need 30 or 60 frames per second and you have only 24, yes, you have to reuse the same frames or create new ones using math/approximation (aka distortion) by 'averaging' adjacent frames into new frames. Yes, it is a problem, but it is much better than the old kinescope method of recording off a monitor to get a higher rate.


Now you're showing extreme ignorance.

 

24 fps cinema is ***NOT*** transformed into 60 fps progressive by "averaging" adjacent frames!! It is done by using 3:2 pulldown. A 60 fps stream has 2-1/2 frames for every 24 fps cinema frame. To accomplish this, in terms of cinema frames, four frames of 24 fps cinema become 10 frames at 60 fps by repeating the first cinema frame thrice, the second cinema frame twice, the third cinema frame thrice and the fourth cinema frame twice, as follows (by cinema frame number):

1-1-1-2-2-3-3-3-4-4 . . . .(repeat to end of video)

The result is characterized as a judder or stutter in the visual effect because motion is no longer fluid, and it accentuates compression artifacts. To get from 24 fps to 30 fps progressive requires five frames at 30 fps for every four frames at 24 fps, by repeating every fourth frame:

1-2-3-4-4-1-2-3-4-4 . . . (repeat to end of video)

It likewise introduces a stuttering into the visual effect.

 

To get from 24 fps cinema frames into 30 fps interlaced, each cinema frame is split into two fields, an odd and an even for the interlacing (in some jargon it's top and bottom which is a misnomer). For interlacing, the screen is divided into its odd numbered and even numbered horizontal lines. Each frame is divided into two halves with one half using the odd numbered lines and the other using the even numbered ones. In terms of interlace frame pairs of odd and even lines, the 3:2 telecine pulldown by 24 fps frame numbers is as follows with the cinema frames split into pairs of (odd, even) interlaced line sets:

(1,1) - (2,2) - (2,3) - (3,4) - (4,4) . . . repeat to end of video, if the odd lines come before the even ones or

(1,1) - (2,2) - (3,2) - (4,3) - (4,4) . . . repeat to end of video if the even lines come before the odd numbered ones

It creates two frames mixed with the odd lines of one and the even lines of an adjacent frame out of every five. Which is first and which is second depends on whether odd-numbered lines lead the even ones in the interlace, or vice versa. There are a couple of other schemes to get to five frames for every four cinema frames, but this is by far the most common. It also results in a judder or stuttering effect.
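To make the two cadences above concrete, here is a tiny sketch that generates them (the field-pair version assumes the odd lines lead, matching the first of the two sequences):

```python
def pulldown_24_to_60p(frames):
    """Repeat cinema frames in a 3-2-3-2 cadence: 4 frames become 10."""
    out = []
    for i, f in enumerate(frames):
        out += [f] * (3 if i % 2 == 0 else 2)
    return out

def pulldown_24_to_30i(frames):
    """Split frames into (odd, even) field pairs: 4 frames become 5 pairs."""
    pairs = []
    for a, b, c, d in zip(*[iter(frames)] * 4):  # take frames 4 at a time
        pairs += [(a, a), (b, b), (b, c), (c, d), (d, d)]
    return pairs

print(pulldown_24_to_60p([1, 2, 3, 4]))   # [1,1,1,2,2,3,3,3,4,4]
print(pulldown_24_to_30i([1, 2, 3, 4]))   # [(1,1),(2,2),(2,3),(3,4),(4,4)]
```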

 

John

Posted by
Service Expert

Message 62 of 182
2,750 Views

jlind wrote:

<snip>


Guess you missed the word "OR". The 3:2 pulldown was used (in the past) for building new 'frames' using interlaced fields from adjacent frames, mixed together to create a new 'fake' frame out of whole cloth.

 

Progressive video naturally has no interlace fields to join together. MPEG4 uses I, P, and B frames and slices to create the images and save the room needed for storing a recording.




I am not a Comcast employee, I am a paying customer just like you! I am an XFINITY Forum Expert and I am here to help.

Posted by
Frequent Visitor

Message 63 of 182
2,735 Views

Rustyben wrote:

<snip>


Guess you missed the word "OR". The 3:2 pulldown was used (in the past) for building new 'frames' using interlaced fields from adjacent frames, mixed together to create a new 'fake' frame out of whole cloth.

 

Progressive video naturally has no interlace fields to join together. MPEG4 uses I, P, and B frames and slices to create the images and save the room needed for storing a recording.


3:2 pulldown to 60 fps progressive from 24 fps cinema, which is inherently non-interlaced, requires ZERO interlacing. I suggest you reread the first paragraph of what you just quoted. It's done by frame repetition and has NOTHING to do with H.264 encoding, which, if it's used (versus H.262 or some other encoding), occurs AFTER the 3:2 pulldown is performed.

Posted by
Frequent Visitor

Message 64 of 182
2,706 Views

samj01 wrote:

We can argue the finer technical details of MPEG4 vs MPEG2 and interlaced vs progressive, but at the end of the day, on my 4K TV, Comcast has looked like garbage since the MPEG4/720p switch.

 

If you take a look at Hulu, it is still 720p but looks significantly better than the murky, over-compressed mess Comcast is producing.


Exactly!

 

Theatrical release from the Inception Blu-ray (m2ts file):

1080p/24 VC-1 video track

384 kbps 5.1 AC3 audio track

2:28:07 run time

26.4 GB total file size

 

Inception transport stream from the IFC channel,  without commercials (m2ts container):

720p/60 H.264 video track

384 kbps 5.1 AC3 audio track

2:20:50 run time

3.77 GB total file size, 1/7th the size of the Blu-ray m2ts

Note: 0:07:17 shorter run time due to typical IFC end credit editing.

 

It's not H.262 MPEG2 vs H.264 MPEG4 or interlaced vs progressive. It's the fundamental resolution downsizing and, more importantly, the massively increased compression that is causing the degradation. The data above alone would tell you there's going to be enormous degradation visible, and there is, with the muddy mush that's readily visible in these lossless frame captures from the first few minutes at the beginning of the movie. I'm not surprised at all that you're readily seeing this on a 4K UHD TV when I'm clearly seeing it on a 1080p . . .
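A back-of-the-envelope check on those two files, using the sizes and run times listed above (taking the sizes as decimal GB; the averages shift slightly if they are really GiB):

```python
def avg_mbps(size_gb: float, h: int, m: int, s: int) -> float:
    """Average total bit rate from file size (decimal GB) and run time."""
    seconds = h * 3600 + m * 60 + s
    return size_gb * 1e9 * 8 / seconds / 1e6

bluray = avg_mbps(26.4, 2, 28, 7)   # ~23.8 Mbps average for the Blu-ray m2ts
ifc    = avg_mbps(3.77, 2, 20, 50)  # ~3.6 Mbps average for the IFC recording
print(f"Blu-ray ~{bluray:.1f} Mbps, IFC ~{ifc:.1f} Mbps, ratio ~{bluray / ifc:.1f}:1")
```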

 

[Frame capture: Comcast 720p/60, IFC Channel "Inception"]

 

 

[Frame capture: Comcast 720p/60, IFC Channel "Inception"]

 

Posted by
Regular Contributor

Message 65 of 182
2,675 Views

samj01 wrote:

We can argue the finer technical details of MPEG4 vs MPEG2 and interlaced vs progressive, but at the end of the day, on my 4K TV, Comcast has looked like garbage since the MPEG4/720p switch.

 

If you take a look at Hulu, it is still 720p but looks significantly better than the murky, over-compressed mess Comcast is producing.


Exactly - it's that simple. Notice how ESPN still looks good? It has been 720p forever. Whatever Comcast is doing is garbage.

Posted by
Silver Problem Solver

Message 66 of 182
2,673 Views

charissamz wrote:

<snip>


Exactly - it's that simple. Notice how ESPN still looks good? It has been 720p forever. Whatever Comcast is doing is garbage.


ESPN looks good because it may still be MPEG2 and still running at a near-full bandwidth of 18 Mbit/s or so. Even when Comcast was moving to 3:1 and 4:1 HD multiplexing, the main sports channels were still 2:1.

Posted by
Regular Contributor

Message 67 of 182
2,664 Views

andyross wrote:

<snip>


Right, that is my point with regard to what RustyBen and a few others have said when they compare 720p to 1080i. When done right, 720p isn't that different visually and can even be better, especially for sports. A few posters keep saying that there isn't much difference, but Comcast is doing something far different with these 1080i-to-720p conversions.

Posted by
Frequent Visitor

Message 68 of 182
2,624 Views

charissamz wrote:

Right, that is my point with regard to what RustyBen and a few others have said when they compare 720p to 1080i. When done right, 720p isn't that different visually and can even be better, especially for sports. A few posters keep saying that there isn't much difference, but Comcast is doing something far different with these 1080i-to-720p conversions.


You're right, there is something much different going on . . .

 

I might debate whether 720p/60 displays as well visually as 1080i/30, as many STBs and TVs do not perform upscaling very well. Regardless, 720p/60 is about 10% fewer pixels per second for a CODEC and the resulting transport stream to handle compared to 1080i, presuming constant frame rates and a 1080i stream using continuous interlacing (not a mix of interlaced and progressive). That gets a modest gain in bandwidth reduction, all else being equal. H.264, which is a variant of MPEG4, is an improvement in several respects over H.262 (a variant of MPEG2). Commercial Blu-ray originally used MPEG2 m2ts files for the main feature and usually most of the extras. They've switched to VC-1 and H.264, as both are superior to H.262 in disk space consumed and in several other aspects of picture quality such as chroma, color space, etc. Both have the potential, emphasis on potential, for encoding better picture quality.
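The pixel-rate claim is easy to verify with straight arithmetic; the exact figure comes out to roughly 11% fewer pixels per second, close to the "about 10%" above:

```python
def pixels_per_second(width: int, height: int, frames_per_s: int) -> int:
    return width * height * frames_per_s

p720p60  = pixels_per_second(1280, 720, 60)    # 55,296,000
p1080i30 = pixels_per_second(1920, 1080, 30)   # 62,208,000 (30 full frames of paired fields)
print(f"720p60 / 1080i30 = {p720p60 / p1080i30:.3f}")  # ~0.889, i.e. ~11% fewer pixels/s
```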

 

The primary issue dominating everything is the very excessive compression level being used, with the resulting transport streams suffering from extremely low bit rates to the point of severe picture degradation. I had the TV on AMC-HD, which was showing Ghostbusters Monday afternoon and evening, while working on installing a new STB and setting up the transfer to move all the auto-record settings and stored recordings from the old one to the new one. It was horrid looking. I tried changing the upscaling from the STB to the TV and back again, to no avail. Complete rubbish, with color banding, improper color rendition with palette smearing, visible 16x16-pixel macroblocks (if you see small square blocks, that's what they are), and closeups of faces with zero skin texture making them look like wax or clay. And all Rustyben can talk about is progressive, progressive, progressive, ad nauseam.

 

John

Posted by
Visitor

Message 69 of 182
2,390 Views

So, so glad I ran across this thread. I was seriously considering moving from DTV to Comcast X1. There is not one single chance in Hades that I'm going to watch poorly encoded 720p video on a 4K projector. Going from the CBS NFL broadcast to the Fox broadcast is annoying enough to me; the loss of apparent resolution is very noticeable. Thanks again for saving me from this trash.

Posted by
Regular Contributor

Message 70 of 182
2,366 Views
Cornereduser, it's worse than that. Fox looks good compared to the way these "upgraded" channels look now. My kids were watching a recent movie on TBS last night and it looked like they had put a scratched DVD in. Flip the channel to NBC or even Fox and it improves. Whatever Comcast did is awful. I don't recommend Comcast to anyone now and will be going back to DirecTV as soon as I can.
Posted by
Service Expert

Message 71 of 182
2,361 Views

charissamz wrote:
Cornereduser, it's worse than that. Fox looks good compared to the way these "upgraded" channels look now. My kids were watching a recent movie on TBS last night and it looked like they had put a scratched DVD in. Flip the channel to NBC or even Fox and it improves. Whatever Comcast did is awful. I don't recommend Comcast to anyone now and will be going back to DirecTV as soon as I can.

TBS is currently 1080i while Fox is 720p. NBC is currently 1080i. The industry direction seems to be settling on 720p60, which is what Comcast is moving toward now. There are several threads about this in the Comcast forums, and YouTube has some tech information videos that explain 720p vs 1080i. The takeaway is that 720p60 gives you a clearer image on motion-intensive video.

 

That scratched-DVD reference is probably referring to film-scratch/damage types of effects?




I am not a Comcast employee, I am a paying customer just like you! I am an XFINITY Forum Expert and I am here to help.

Posted by
Service Expert

Message 72 of 182
2,344 Views
Posted by
Contributor

Message 73 of 182
2,341 Views

Here we go again, the same old nonsense being spewed out by what must be a Comcast fanboy.

 

The fact is that roughly 75% of television networks broadcast in 1080i while the rest use 720p. And you say "the industry" is moving to 720p? What industry is that? Certainly not the television broadcast industry. Can you name me one network that has switched from 1080i to 720p???

 

 


Rustyben wrote:

<snip>

Posted by
Regular Visitor

Message 74 of 182
2,315 Views

Except that what he is referring to is how TBS (broadcast in 1080i, being butchered to 720p) compares to NBC (broadcast in 1080i, locally supplied and not subject to butchering) and Fox (broadcast in 720p from the source). NBC and Fox both look superior because they are in their original transmission mode. TBS (and the other 1080i-to-720p conversions) is being fed through numerous filters and re-encoded at low bit rates (despite the gains from MPEG-4); the result is a muddy, color-banding mess easily recognizable on any reasonably sized calibrated display.

Posted by
Regular Contributor

Message 75 of 182
2,280 Views

Picture quality is much worse on most channels vs. a few months ago. Local channels are the only ones that still look good, as I assume they remain untouched. I can't believe Comcast chose to downscale everything to 720p. On a 65" screen, the picture is very noticeably worse than it used to be. Comcast must have tested the picture quality on much smaller TVs before they decided to switch to 720p.

 

It's sad when a streaming service like DirecTV Now (which I'm trying out) has better picture quality than your cable company.  

 

I'd suggest that anyone that has a larger TV and cares about PQ start investigating other options.

 

 

Posted by
Regular Contributor

Message 76 of 182
2,259 Views

jkozlow3 wrote:

<snip>


My bet is that the downscaling is simply because they are planning for most of their future customers to be watching HDTV on small mobile screens, based on changing demographics and statistical data, so why not grab some extra bandwidth in the process.

 

The PQ degradation you describe is accurate for anything larger than, let's say, a 32" to 37" HDTV. I'm fortunate in that my 10-year-old 720p (768p actually) 32" Panny still works as well as when it was new, so I don't see too much of the lower PQ on the converted 1080i-to-720p60 channels unless I get real close, like 2 to 3 feet (I usually watch from 6 feet). One more thing to note is that I notice much less (or no) PQ degradation up close on channels that were always 720p, like ESPN. I have the DVR set to native output and let the old 32" Panny HDTV do the scaling from 720p60 (or 1080i) to 768p.

 

I notice the same thing on a fairly new 24" kitchen HDTV whose resolution is 1080p.

 

I suspect that the smaller the screen, the less customers will notice or care (which is simple Business 101), just like the food industry shrinking content weight an ounce or two while holding pricing the same.  I'm old enough to remember when one-pound ground coffee cans were a full 16 ounces; today they are 13 ounces or less (with so-called manufacturer claims touting that 13 ounces today produces the same amount of brewed coffee as 16 ounces).  Also, fewer and fewer people brew their coffee at home, so in the end who really notices, akin to the ongoing migration to smaller mobile screens for watching TV, right?  It's the same business strategy in most product sales industries, except maybe for fuel suppliers, who are still delivering a full gallon (or liter if you are outside the U.S.) due to governmental regulatory metering and taxing requirements.

 

That old 32" Panny Viera seems to couple up nicely with a fairly new Panny Viera Smart Blu-ray player (connected via ethernet cable directly to my nearby router) and so the old Panny now also works like a Smart TV.  BTW, Netflix, Amazon Prime and Hulu all look as good as the current untouched local channels like our local NBC channel or as good as all the former 1080i channels of yester-year.

 

My wife only watches the cooking and travel channels in the kitchen, otherwise she prefers reading books at home.

 

All this is why I've put off buying a larger screen to replace the 32" Panny for the time being, until I see what the other content delivery competitors do. Plus, a very thin bezel 40" is about as large a TV as I could fit in my office/den, where I watch most of my HDTV on the old 32" large-bezel Panny Viera (mostly the business channels and sports, along with an occasional movie from the three subscription streaming providers).

 

My sister and her husband are very proud of their new 55" high-quality Sony HDTV, and the newly converted 720p channels do look unacceptable on it from any distance, but I was too polite to say anything because they are new to Hi-Def TV viewing and don't really know any better regarding PQ, which is a shame.  The same applies, IMO, to many of today's youngsters (soon to become subscription HDTV customers on their own) who have never watched HDTV on anything but small mobile devices; they too don't really know that there may be better PQ available for larger screens, nor do they really care, especially if they never plan to purchase a very large screen HDTV.

 

I also have FIOS available in my area (with good reviews from neighbors, though they are elderly and some have poor eyesight). However, the FIOS competitor could also foul things up for larger-screen viewers in the future, regardless of whatever their salespeople presently claim, even when they try to tempt me with first-year reduced rates plus a $400 gift card to switch to their FIOS triple play.

 

Lastly, we occasionally go out to real theaters if we plan to watch a new release on a really big screen (the old fashioned way, I guess).

Posted by
Regular Contributor

Message 77 of 182
2,199 Views

boochalouche wrote:

Except what he is referring to is how TBS (broadcast in 1080i, being butchered to 720p) compares to NBC (broadcast in 1080i, locally supplied and not subject to butchering) and Fox (broadcast in 720p from the source). NBC and Fox both look superior because they are in their original transmission mode. TBS (and the other 1080i-to-720p conversions) is being fed through numerous filters and re-encoded at low bit rates (despite the gains from MPEG-4); the result is a muddy, color-banding mess easily recognizable on any reasonably sized calibrated display.


Thank you. My reply was deleted for "flaming," although all I did was explain myself, so it was nice to see others say the same thing.

Posted by
Regular Contributor

Message 78 of 182
2,176 Views
Please stop. Converting 30 frames per second to 60 frames per second does not negate losing half the resolution.
Posted by
Regular Contributor

Message 79 of 182
2,171 Views

2themax wrote:
Please stop. Converting 30 frames per second to 60 frames per second does not negate losing half the resolution.

Most networks broadcast natively in 1080i (with the exception of ABC and Fox I believe).

 

Most TVs scale the picture to 1080p for output.

 

So Comcast is taking the native 1080i feed and downscaling it to 720p, so that the TV or X1 box can then turn around and upscale it to 1080p.

 

What do you suppose all of this does to the resolution???
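
Putting rough numbers on it (an illustrative Python sketch; the frame sizes are just the nominal formats, nothing measured):

# How much of the original picture survives the downscale/upscale round trip
source     = 1920 * 1080   # full 1080-line frame (after deinterlacing)
downscaled = 1280 * 720    # what gets delivered after the 720p conversion
print(f"Pixels actually carried: {downscaled:,} of {source:,} "
      f"({downscaled / source:.0%})")   # ~44%
# The TV's upscale back to 1080p has to interpolate the missing ~56%;
# detail thrown away in the downscale is never recovered.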

Posted by
Service Expert

Message 80 of 182
2,155 Views

the announcement post was http://forums.xfinity.com/t5/Non-X1-Service/1080i-channels-are-being-changed-to-720p60-channels/m-p/...

 

Modern displays (TVs, smart devices, desk/laptop screens, etc.) require a progressive format for their digital panels. CRTs and related projectors are aging out. The old analog NTSC signal was dropped in the conversion to digital transmission, and the BetaMax format was beaten out in the industry by VHS (VCRs). Progress in technology marches on and older technology drops away. This happens most often in computers and consumer electronics because the tech is improving so quickly and prices keep coming down for the new technology. It would be great if the technology could support broadcast in 1080p, but that has not happened. The next step down in clarity while remaining progressive is 720p, and Comcast is using 60 images per second (720p60) instead of 720p30. 
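
For anyone curious what converting an interlaced signal for a progressive panel actually involves, here is a minimal sketch of one common approach, a simple "bob" deinterlace that line-doubles each field into a full frame (illustrative Python only; real broadcast encoders use far more sophisticated motion-adaptive methods):

import numpy as np

def bob_deinterlace(interlaced_frame):
    """Split one interlaced frame (two woven fields) into two progressive
    frames by line-doubling each field. Illustrative only."""
    top_field = interlaced_frame[0::2]      # even-numbered lines
    bottom_field = interlaced_frame[1::2]   # odd-numbered lines
    frame_a = np.repeat(top_field, 2, axis=0)     # double lines back to full height
    frame_b = np.repeat(bottom_field, 2, axis=0)
    return frame_a, frame_b

# A dummy 1080i luma frame: 1080 lines x 1920 pixels
frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)
a, b = bob_deinterlace(frame)
print(a.shape, b.shape)   # (1080, 1920) (1080, 1920): two progressive frames per interlaced frame

One way or another, a 1080i source ends up as 60 progressive images per second; the argument in this thread is really about where that conversion happens and how much detail survives it.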




Community Icon
I am not a Comcast employee, I am a paying customer just like you!
I am an XFINITY Forum Expert and I am here to help. For information on the program click here.
We ask that you post publicly so people with similar questions may benefit from the conversation.
Was your question answered? Mark it as an accepted solution!solution Icon

Posted by
Service Expert

Message 81 of 182
2,130 Views
Posted by
Problem Solver

Message 82 of 182
2,122 Views

I'm sorry, but I don't see the big deal. When I was 10-30 years old I watched TV on an SD set.

 

I'm not a stickler about PQ, maybe because it wasn't a big deal growing up.

 

I have an HDTV now and the picture looks nice. Yes, 1080i MPEG-4 was beautiful, but it's not that big of a deal to me.

Posted by
Regular Contributor

Message 83 of 182
2,115 Views

Rustyben wrote:
baiting, You can review the Forum Guidelines here: http://forums.xfinity.com/t5/Forum-Community/Forums-Policy-and-Guidelines/td-p/2618379

I've always been curious about these posts that are moved... was the original post by Rustyben, or was Rustyben the forum member who moved the post?

Posted by
Service Expert

Message 84 of 182
2,070 Views
Posted by
Regular Contributor

Message 85 of 182
2,100 Views

KeenanSR wrote:

It was Rustyben that removed my post, he didn't like what I said about what Comcast is doing. You have to keep in mind, this is their playground so if something they don't like gets posted they'll just remove it.


Okay, thanks. The way they write it makes it look like Rustyben was doing the baiting, which is not a good thing for Rustyben. Maybe they could be more clear about that in the future and hold out the real baiter's handle for shaming, to warn other forum users.

Posted by
Regular Contributor

Message 86 of 182
2,048 Views

Rustyben wrote:

 

It would be great if the technology could support broadcast in 1080p, but that has not happened. The next step down in clarity while remaining progressive is 720p, and Comcast is using 60 images per second (720p60) instead of 720p30. 


Are you referring to CableLabs specifications?

Posted by
Regular Contributor

Message 87 of 182
2,037 Views

Rustyben wrote:

the announcement post was http://forums.xfinity.com/t5/Non-X1-Service/1080i-channels-are-being-changed-to-720p60-channels/m-p/...

 

Modern displays (TVs, smart devices, desk/laptop screens, etc.) require a progressive format for their digital panels. CRTs and related projectors are aging out. The old analog NTSC signal was dropped in the conversion to digital transmission, and the BetaMax format was beaten out in the industry by VHS (VCRs). Progress in technology marches on and older technology drops away. This happens most often in computers and consumer electronics because the tech is improving so quickly and prices keep coming down for the new technology. It would be great if the technology could support broadcast in 1080p, but that has not happened. The next step down in clarity while remaining progressive is 720p, and Comcast is using 60 images per second (720p60) instead of 720p30. 


I'm old enough to remember the old VHS vs BETA wars where only one combatant was left standing.

 

I was just wondering if content providers (TV channel/network owners, etc.) will ever feasibly switch from 1080i to native 720p, like ABC, FOX and ESPN, so that Comcast and other deliverers of content don't have to mess with converting 1080i signals to 720p60 to deal with the increasing trend toward smaller mobile-screen viewing, while at the same time keeping larger home-screen HDTV owners satisfied with their viewing experience.

 

Or would that be financially and technically unreasonable?  I would think that NBC and its sister channels might be the first to switch to 720p natively (rather than converting from 1080i) if feasible, being that they are part of the Comcast family... right?

 

The original native 720p channels still look fine on larger 1080p screens even now that they are delivered via MPEG-4, so the PQ degradation seen on larger screens for former 1080i channels converted to 720p60 may not necessarily be related to the MPEG-4 conversion from MPEG-2... No/Yes?

Posted by
Problem Solver

Message 88 of 182
2,030 Views

Only the cable company is doing this. Not the over-the-air stations transmitting in 1080i, or the satellite companies.

Posted by
Contributor

Message 89 of 182
2,020 Views

MNtundraRET wrote:

Only the cable company is doing this. Not the over-the-air stations transmitting in 1080i, or the satellite companies.


Or the 1080i cable television networks. I asked Rusty to name a single TV network that had switched from 1080i to 720p and all he did was to delete my post.

 

Rusty, I ask again: name one single network that has switched from 1080i to 720p. You can't, because it hasn't happened. Roughly 75% of all networks transmit in 1080i; your comment about the industry moving to 720p is a complete fabrication and a flat-out lie.

 

This is being done for Comcast's benefit and it has nothing to do with improving the customer experience.

Posted by
Regular Contributor

Message 90 of 182
2,012 Views

Calm down, everybody. I just would like to know who decides whether to create content in 720p or 1080i and why, and whether it is feasible and reasonable for an entire network family of channels like CBS or NBC to change over from 1080i to 720p, like ABC, FOX, ESPN, etc.   If I'm not mistaken, many of the other channels we get are also affiliated with or owned by the major networks, so that means about half of the Comcast Starter line-up.

Posted by
Regular Contributor

Message 91 of 182
1,991 Views

So over the past week, the conversion has been going on in my area of Deerfield/Pompano Beach, FL.

Each day, more HD channels that were 1080i are being converted to 720p. Sunday, when I recorded "The Young Pope" on HBO, it was at 1080i; now, tonight, all HBO and Starz HD channels are 720p.

The first thing to notice is that when an HD channel is using MPEG-4 instead of MPEG-2, the Live TV Buffer is about 85 min instead of 30-45 min.

Freeform HD (formerly ABC Family) has always been 720p, and is still 720p, but is now using the MPEG-4 codec. I see no difference in the picture quality.

But on other HD channels that were 1080i MPEG-2 and are now 720p MPEG-4, I can see a difference: the image is not as sharp (more soft), fast motion actually looks like frames are missing, and color isn't as accurate. I was able to compare "The Young Pope," episode 2, which I recorded Sunday night on HBO at 1080i, with today's airing at 720p. I've also looked at Batman Begins and the first X-Men, and all of these don't look as good as they did when broadcast last week (they are shown all the time).

 

I personally don't know (right now) if this is worth changing my TV provider from Comcast to another provider (U-Verse, DTV, or Dish), but it is truly a shame that Comcast is moving away from moderately good HD picture quality to this downconverted lower quality. I'm not sure what (expensive) equipment they are using to do this at the local plant in Pompano Beach, but they must be using a low bit-rate setting (probably under 10 Mbps) during the 1080i-to-720p conversion.

 

With more 4K UHD sets coming out each year, Comcast's goal should be to improve the PQ, not to make it worse just to fit more channels into one frequency. Do you really think this is the best way to deliver good PQ (original 1080i HD downconverted to 720p by Comcast, sent over coax, decoded by the box, then upconverted by the TV to its native resolution of 1080p or 4K)?
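
On the "fit more channels into one frequency" point, the arithmetic looks roughly like this (illustrative Python; the ~38.8 Mbps figure is the nominal payload of a 256-QAM carrier on a 6 MHz channel, and the per-channel bitrates are my assumptions, not published Comcast numbers):

QAM256_PAYLOAD_MBPS = 38.8   # nominal payload of one 6 MHz 256-QAM carrier

# Assumed average per-channel video bitrates (illustrative only)
MPEG2_1080I_MBPS = 12.0
MPEG4_720P_MBPS  = 6.0

print("MPEG-2 1080i channels per QAM:", int(QAM256_PAYLOAD_MBPS // MPEG2_1080I_MBPS))  # 3
print("MPEG-4 720p  channels per QAM:", int(QAM256_PAYLOAD_MBPS // MPEG4_720P_MBPS))   # 6

Roughly doubling the channel count per carrier is the bandwidth win; whether the per-channel bitrate they picked is high enough to preserve PQ is exactly what is in dispute in this thread.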

Posted by
Regular Contributor

Message 92 of 182
1,967 Views

FAUguy wrote:

So over the past week, the conversion has been going on in my area of Deerfield/Pompano Beach, FL.

<snip>


 

FAUguy, how do you know which channels are in 720p vs. 1080i?  I know my picture is a lot softer in general now, but I am not aware of how to determine each channel's resolution, since the X1 box cannot output the native resolution of each channel and forces output at a fixed resolution (e.g., 1080p).


Thanks!

Posted by
Regular Contributor

Message 93 of 182
1,953 Views

jkozlow3 wrote:

FAUguy wrote:

<snip>


FAUguy, how do you know which channels are in 720p vs. 1080i?  I know my picture is a lot softer in general now, but I am not aware of how to determine each channel's resolution, since the X1 box cannot output the native resolution of each channel and forces output at a fixed resolution (e.g., 1080p).


Thanks!


I do not have an X1 box. I am using the Motorola DCX3400M, since I have a 2TB Expander HDD attached to it via eSATA. The DCX3400M allows you to select the output resolution or have it set to Native (which is what I use). So that is how I can tell which HD channels are now being output at 720p instead of 1080i.

 

You can also tell by the length of the Live TV Buffer. Before, when watching an HD channel (either 720p or 1080i), the live TV Buffer was between 25-45 minutes (depending on the channel) because the MPEG-2 codec was used. Now those same channels have an 85-minute live buffer since they are using MPEG-4. So check to see if any of your HD channels have a longer live TV Buffer than they used to. If it is over an hour, chances are that HD channel is using MPEG-4.
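
A rough way to see why the buffer length tracks the codec (illustrative Python; the buffer size and bitrates are assumptions I picked to land near the observed numbers, since Comcast doesn't publish the DVR's live-buffer allocation):

BUFFER_GB  = 4.0    # assumed fixed live-buffer allocation on the DVR
MPEG2_MBPS = 14.0   # assumed 1080i MPEG-2 stream bitrate
MPEG4_MBPS = 6.5    # assumed 720p MPEG-4 stream bitrate

def buffer_minutes(buffer_gb, stream_mbps):
    # gigabytes -> megabits, divided by megabits per second, then to minutes
    return (buffer_gb * 8 * 1000) / stream_mbps / 60

print(f"MPEG-2: ~{buffer_minutes(BUFFER_GB, MPEG2_MBPS):.0f} min")   # ~38
print(f"MPEG-4: ~{buffer_minutes(BUFFER_GB, MPEG4_MBPS):.0f} min")   # ~82

A fixed-size buffer simply holds more minutes when the stream bitrate is lower, so a buffer that jumps from the 30-45 minute range to ~85 minutes is a decent hint the channel moved to the more efficient codec.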

Posted by
Service Expert

Message 94 of 182
1,819 Views
Posted by
Regular Contributor

Message 95 of 182
1,922 Views

If anyone wants to do an A/B comparison, just watch the show live/DVR vs. On Demand.  On Demand still delivers it the way it was, at least in my area.

 

My TV watching habits have shifted because of all this. I still record my shows, but I use the DVR as a list of what I have to watch, like a playlist. I then go watch it via AppleTV if possible, which gives the best quality; an example is HBO Go. If that isn't available, I bring up the recorded show, click Episodes, and see if it is On Demand. For whatever reason, even when it says I can't fast forward, I can, so that isn't an issue.  The last resort is watching the recording, which gives me this new "upgraded" mess. Unfortunately, sports is where I suffer; games on TNT and others look awful. I am glad they haven't touched some of the others yet, like Comcast SportsNet Chicago, which is odd considering it is Comcast owned. 

 

Once my contract is up I will be leaving to go back to DirecTV, unless Comcast comes through with what I believe to be the real reason for this: freeing up bandwidth on the backs of all those who don't care, and then offering IPTV-type services. I can see them adding an app like they have on mobile to devices such as AppleTV, Roku, etc., and maybe even releasing a standalone IPTV box.  If the quality that comes from this looks like HBO GO and the other apps, I will be happy. The real reason for all this is to increase their internet bandwidth, because without that they are in trouble. Just my thoughts.

Posted by
Regular Contributor

Message 96 of 182
1,897 Views

charissamz wrote:

If anyone wants to do an A/B comparison, just watch the show live/DVR vs. On Demand.  On Demand still delivers it the way it was, at least in my area.

<snip>


Yes, I have done that over the weekend, by looking at what was recorded (at 720p) and watching it On Demand at 1080i. Even though the Fast Forward doesn't work, I can still do the 5-minute Skip Ahead and then a quick rewind to get past the ads. Personally, I think the FCC should not allow any provider of TV service (cable or satellite company) to alter the resolution of any channel. Technically, you are paying for a channel in HD 1080i, but you are only being provided it in 720p because the provider decided to change the resolution and lower its quality. That would be like leasing a car with a V6 engine and having the dealership alter the engine down to a V4 because they are still the owner.

Posted by
Regular Contributor

Message 97 of 182
1,884 Views

FAUguy wrote:

jkozlow3 wrote:

<snip>


You can also tell by the length of the Live TV Buffer. Before, when watching an HD channel (either 720p or 1080i), the live TV Buffer was between 25-45 minutes (depending on the channel) because the MPEG-2 codec was used. Now those same channels have an 85-minute live buffer since they are using MPEG-4. So check to see if any of your HD channels have a longer live TV Buffer than they used to. If it is over an hour, chances are that HD channel is using MPEG-4.


 

Good idea - I'll try that and test out a few channels.

Posted by
Regular Contributor

Message 98 of 182
1,879 Views


<snip> Personally, I think the FCC should not allow any provider of TV service (cable or satellite company) to alter the resolution of any channel. <snip>


If you are curious, my FCC and Comcast complaints are mentioned in the thread below. I filed a complaint with the FCC. It won't do much, and they treated me as if I was ignorant, but it is what I expected.

http://forums.xfinity.com/t5/Channels-and-Programming/Bring-back-1080i-VIDEO-for-HBO-Hallmark-AMC-Br...

Posted by
Regular Visitor

Message 99 of 182
1,808 Views

It amazes me that the obvious is lost on "someone": the picture quality VISIBLY suffers from this Comcast upgrade on normal-sized televisions. So we "normal" viewers are being sacrificed for those watching TV on smaller devices? Pathetic.

 

After putting up with Comcast's games for years - specifically, BEIN Sports: broadcast in 1080, Comcast pushes it at 640x480 (!) - this will be the last straw for me. Here in Boston, FIOS is finally coming and I cannot wait to switch, if for no other reason than to leave Comcast. After 30 years with them, I am paying the most I've ever paid, for a lower-tier package and now VISIBLY lower picture quality to come.

Posted by
Regular Visitor

Message 100 of 182
1,780 Views

Below is a reply I received from Comcast ECare after filing a complaint. Interesting that he calls the switch a "regression"!

 

I use a Silicondust Prime and Windows Media Center. Flipping through many channels, I see none that are 1080/30 - the tuner shows 1080 content as ~60fps. So, where is Matthew getting this 1080/30 from?

 

 

reply from Comcast ECare:

====================

My name is Matthew with the Office of Tom Karinshak. Please rest easy
knowing you have reached the right place. We will do everything we can
to address your concerns regarding the regression from 1080i to 720p.
I would like to sincerely apologize for the inconvenience and
frustration you have experienced. I am a consumer as well and truly
dislike hearing and seeing that my service company is getting away from
a picture quality that I feel is better than what the company is
reverting to.
I do apologize that you disagree with our decision to go to 720p. We did
this as 720p is progressive whereas 1080i is interlaced. With 1080i and
its interlaced functionality we can only run shows at 30 frames per
second and this causes gaps in data for fast moving scenes. With 720p
we can provide the same quality picture faster at 60 frames per second.
With this change we can provide a better quality service and resolve the
flaws created with 1080i.