HD broadcasts

ProdigalHorn

Question for anyone who knows - when a show is shot in HD, for example if it's shot in 1080, does the network automatically broadcast it at that level, or could it be downgraded by a local affiliate?

Reason I ask is I'm watching Lost and per my TiVo, it's in 720p, and the picture sucks (well, relatively). So I'm wondering if ABC just went cheap and didn't shoot in 1080i, or if they did shoot it in 1080i and the local affiliate just doesn't have the capacity to broadcast at that level.
 
Well, there's 720 HD and 1080, and it's a pretty noticeable difference. But considering they release the seasons on Blu-ray, it seems unlikely they weren't shooting in pure HD. So basically I'm left with the conclusion that my ABC affiliate sucks.
 
In all fairness, my new apartment's smaller, so I'm only about 9 feet away. But I can definitely tell the difference. Of course I may just not be "average".


I'm curious though - because that quote says you can't tell the difference between a 720p set and a 1080p set showing the same Blu-ray feed. That I believe. But it doesn't say that you can't tell the difference between a 720p feed and a 1080p feed on a 1080p TV. And you absolutely can tell the difference - otherwise why bother?
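Just as back-of-the-envelope math for why people bother, here's the raw pixel count for the two standard broadcast formats (quick sketch, nothing assumed beyond the standard resolutions):

# Raw pixel counts for the common HD broadcast formats.
formats = {"720p": (1280, 720), "1080i/1080p": (1920, 1080)}

for name, (w, h) in formats.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels")

# 720p:        1280x720  =   921,600 pixels
# 1080i/1080p: 1920x1080 = 2,073,600 pixels - exactly 2.25x as many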

Note: found this article, which explains it as well:
The Link

I think you probably had a crappy feed or something else going on.

I don't think the difference between 720p and 1080i is that big.
 
This post from AVS Forum should answer all questions: The Link

HDTV programs are broadcast at either 720p or 1080i. Most channels are in 1080i because of the higher resolution, but a few channels like ESPN use 720p for the fast action in sports. All HDTVs, LCD or plasma, will display that 720p or 1080i picture at their OWN NATIVE RESOLUTION. For example, a "720p" plasma HDTV is normally 768p! So no matter what, the 720p picture is scaled up, or the 1080i picture is scaled down and de-interlaced. On a 1080p HDTV, the 720p picture is scaled up to 1080p and the 1080i picture is just de-interlaced. Plasma and LCD displays are PROGRESSIVE; they CAN'T display an interlaced picture. A CRT (tube) HDTV, on the other hand, can, which is why CRTs are 1080i - or, in the case of an SDTV, 480i - and when watching a 1080i picture they don't have to do anything. Broadcasting a 1080p picture would take way too much bandwidth.
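To make that scale-up / scale-down / de-interlace logic concrete, here's a minimal sketch (the display_steps helper is hypothetical; the panel line counts are the common ones from the quote above):

# Hypothetical sketch: what a fixed-pixel display (LCD/plasma) has to do
# to show each broadcast format at its native resolution.

def display_steps(input_fmt: str, native_lines: int) -> list[str]:
    """Return the processing steps for an input like '720p' or '1080i'."""
    lines = int(input_fmt[:-1])           # vertical resolution, e.g. 1080
    interlaced = input_fmt.endswith("i")  # 'i' = interlaced, 'p' = progressive

    steps = []
    if interlaced:
        steps.append("de-interlace")      # fixed-pixel panels are progressive
    if lines < native_lines:
        steps.append(f"scale up {lines} -> {native_lines}")
    elif lines > native_lines:
        steps.append(f"scale down {lines} -> {native_lines}")
    return steps or ["display as-is"]

# A "720p" plasma is usually a 768-line panel:
print(display_steps("720p", 768))    # ['scale up 720 -> 768']
print(display_steps("1080i", 768))   # ['de-interlace', 'scale down 1080 -> 768']
# A true 1080p set:
print(display_steps("1080i", 1080))  # ['de-interlace']
print(display_steps("720p", 1080))   # ['scale up 720 -> 1080']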

1080i and 1080p are the EXACT same resolution, 1920x1080. Most TVs in the U.S. show 60 frames per second, so with 1080p you can get a full 60 frames per second, because each pass on the screen shows the FULL progressive picture. For a true interlaced SDTV or HDTV, it takes two passes - an odd-line pass and then an even-line pass - to make a complete picture, so this really gives you 30 frames per second. Most MOVIES on Blu-ray are 1080p/24, or 24 frames per second, because that's how they're shown in the theater. To get 60 frames per second out of the 24 frames per second on Blu-ray (and DVD also), they use 2:3 pull-down: the first frame is shown 2 times, the second frame 3 times, the third frame 2 times, and so on - 11222334445566677888. This gets you 60 frames out of 24 frames!! Because some people notice the juddering effect this causes, they came out with 120Hz HDTVs (and other speeds) to even it out. A 120Hz HDTV shows 120 frames per second, so you're now doing 5:5 pull-down - 1111122222333334444455555 - and you get 120 frames from 24 frames, but since every frame repeats evenly, the judder effect is gone.
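Here's a quick sketch of that repeat pattern (plain Python; the pulldown helper is made up for illustration):

# Repeat source frames in a cycling pattern to hit a target display rate.
from itertools import cycle

def pulldown(frames, pattern):
    """Yield each frame repeated per the pattern, e.g. (2, 3) for 2:3."""
    out = []
    repeats = cycle(pattern)
    for frame in frames:
        out.extend([frame] * next(repeats))
    return out

film = list(range(1, 9))  # 8 film frames from a 24 fps source
print("".join(map(str, pulldown(film, (2, 3)))))  # 11222334445566677888
print("".join(map(str, pulldown(film, (5,)))))    # 1111122222333334444455555...

# Over a full second: 24 frames * (2+3)/2 = 60 displays (2:3 pull-down),
# and 24 * 5 = 120 displays (5:5 pull-down on a 120Hz set).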

Point being, Blu-ray is showing fewer frames at 1080p/24 than a 1080i/30 or 1080p/60 source. I've seen a few places advertising a "1080i HDTV" plasma, for example. This is B.S. - no such thing. In fact, when you look at the resolution, it's 768p. They just allow a 1080i input signal, which the TV once again de-interlaces and scales down to 768p. The HDTV will display whatever its source input is, but no matter what, it's doing whatever it needs to in order to show it on its screen at its native resolution. Hell, the early 1080p HDTVs didn't even allow a 1080p input, only up to 1080i - there was nothing out there to send a 1080p signal to the HDTV until, really, the PS3 and Blu-ray. I'd also say HD DVD, but most of those players were 1080i early on, which is generally just as good unless the HDTV's de-interlacer wasn't the best. Later on, the Xbox 360 was updated to support 1080p as well; when it was first released, it could only go up to 1080i.

So Time Warner, Comcast and all the others, or your antenna, give you 720p or 1080i. If you have a 1080p HDTV and are watching a 1080i channel, there's generally no scaling going on, just de-interlacing. That means taking the 540-line odd field and then the 540-line even field and combining the two into one frame shown all at once, which of course also means showing that complete frame twice to get to 60 frames per second. With the few 720p sources, you're already getting 60 frames per second, so your HDTV just scales the picture up to 1080p. It's just not simple like the old NTSC SDTV days, when everything was 480i/60. Or, in other countries using PAL SDTV, 576i/50Hz - 50 fields per second.
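A minimal sketch of that combine step - a simple "weave" de-interlace on toy data (real de-interlacers also compensate for motion between fields, which this ignores):

# Toy "weave" de-interlace: interleave a 540-line odd field and a
# 540-line even field back into one 1080-line progressive frame.

def weave(odd_field, even_field):
    """odd_field holds lines 1,3,5,...; even_field holds lines 2,4,6,..."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)
        frame.append(even_line)
    return frame

# 4-line toy frame instead of 1080 lines:
odd = ["line1", "line3"]
even = ["line2", "line4"]
print(weave(odd, even))  # ['line1', 'line2', 'line3', 'line4']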
 
