I am very confused. I was planning to get one but I guess I need to study the technology before shelling out the money
Yes, they give you that resolution, but they are not Full HD. 1080p is the highest resolution available at this point; anything lower is still HD, just not the highest-quality HD.
Go read about the difference between interlaced and progressive video. That should clear it up for you.
http://blog.hometheatermag.com/geoff...061080iv1080p/
"When it comes to movies (as in HD DVD and Blu-ray) there will be no visible difference between the 1080i signal and the 1080p signal, as long as your TV correctly de-interlaces 1080i. So even if you could input 1080p, you wouldn't see a difference (because there is none)."
So the consensus is that 1080i is high definition?
Yes, it is, but it's not "Full HD." In other words, anything under 1080p is HD, just not Full HD (the label used for 1080p).
It's all HD. What we are talking about are two variables that combine into four formats:
-Progressive vs. Interlaced
-720 vs. 1080
Here's how that breaks down:
"Progressive" is a full frame. Think of it like a motion picture camera where one complete image is captured 24 times a second (24p, 24fps, however you want to say it). This is what the "p" stands for in 1080p and 720p.
"Interlaced" are two fields that are scanned and then combined to make a full frame, hence interlace (need a visual? Interlock your fingers on each hand so that the tips are touching the webbing on the opposite). This means that if you are watching a broadcast at 30fps (broadcast standard) that 60 "scans" are done in one second.
Now, sports are always broadcast interlaced, because interlaced shooting produces less artifacting and is more responsive to fast motion. You'll often hear about "motion trails" in advertisements for HDTVs. That has nothing to do with interlaced vs. progressive and everything to do with the kind of TV you buy (DLP = fastest motion response, Plasma = slowest, though the technology has progressed to the point that this motion issue is pretty moot).
Movies are almost always progressive, to achieve the "filmic" look.
So back to the HD discussion:
720p, 1080i, and 1080p are all considered HD formats. What you have to understand is that "HD" is really just a label for the newer technology; what we are actually talking about is frame size. Standard-definition formats are 720 x 480, 640 x 480, 720 x 534, etc.
High-def formats (for the purposes of this discussion, leaving aside 2K/4K scan technology and "Ultra High-Def") are 1280 x 720 pixels or 1920 x 1080 pixels. To put it in layman's terms, anything with a vertical resolution of 720 lines or more is generally deemed high-def. You might see some derivations of these numbers on actual TVs, but those are the standards.
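To put some numbers on those frame sizes (plain arithmetic in Python, nothing more):

# Total pixel counts for the formats discussed above.
sd      = 720 * 480    # 345,600 pixels (a common standard-def frame)
hd_720  = 1280 * 720   # 921,600 pixels
hd_1080 = 1920 * 1080  # 2,073,600 pixels

print(hd_720 / sd)       # ~2.67 -- 720-line HD has well over twice the pixels of SD
print(hd_1080 / hd_720)  # 2.25  -- 1080-line HD has 2.25x the pixels of 720-line HD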
To my original point:
Broadcast standards (both satellite and cable) are 1080i and 720p. This is due to bandwidth restrictions and hardware capabilities on the PROVIDER's end. They simply do not have a cost-effective way to broadcast "full progressive" high-def signals at 1920 x 1080, BUT they can broadcast at 1920 x 1080 if the signal is INTERLACED. So while sports might come in at 1920 x 1080 INTERLACED, movies and some TV shows will most likely be at 720p.
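Here's a rough back-of-the-envelope for why interlacing helps the providers. This deliberately ignores compression (real broadcasts are heavily compressed), so treat it purely as an illustration of the raw 2:1 savings, with my own made-up assumption of 3 bytes per pixel:

# Crude model: 3 bytes per pixel (8 bits each for three color channels).
bytes_per_line = 1920 * 3

# 1080p at 60 full frames per second: all 1080 lines, 60 times a second.
rate_1080p60 = bytes_per_line * 1080 * 60

# 1080i at 60 fields per second: each field carries only 540 of the 1080 lines.
rate_1080i60 = bytes_per_line * 540 * 60

print(rate_1080p60 / rate_1080i60)  # 2.0 -- interlacing halves the raw data rate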
So what does that mean to a consumer? Well, if you're buying an HDTV and you want the "full" high-definition experience, you have to buy a TV whose "native resolution" is 1920 x 1080. That resolution covers every format currently available at the consumer level: if you have DirecTV you will be enjoying football at 1920 x 1080 INTERLACED, and if you have a Blu-ray player you will be enjoying your favorite flick at 1920 x 1080 PROGRESSIVE, aka FULL FRAME.
Last thing: some people claim you can't see a difference between a lot of 720p and 1080p signals/broadcasts. That's true for some viewers and not for others; as with anything, it's a matter of taste and perception. Most folks who have 1080p (1920 x 1080) sets are either purists or people with money to blow who got sold one by a home-theater consultant, because the 1920 x 1080 native sets are generally more expensive. Also, don't confuse yourselves: if you see a TV that says something like 720p/1080i and NOT 1080p, all that means is that it can accept a 1080i signal and downscale it to its own pixel resolution. That DOES NOT mean you are watching the signal at its full resolution of 1920 x 1080; it means you are watching it at your TV's full resolution, whatever that may be.
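On that last point about a 720p/1080i set displaying a 1080 signal at its own resolution, here's a crude nearest-neighbor sketch of the downscale (real TVs use much smarter scalers, and the 768-line panel height is just an assumption based on common "720p" LCD panels):

# Map a 1080-line source onto a 768-line panel by picking the nearest source line.
src_lines, panel_lines = 1080, 768

mapping = [round(i * (src_lines - 1) / (panel_lines - 1)) for i in range(panel_lines)]

print(mapping[:5])                 # [0, 1, 3, 4, 6] -- some source lines get skipped
print(len(set(mapping)), "of", src_lines, "source lines survive the downscale")

The point is simply that roughly a third of the 1080 source lines never reach the screen, which is why a "1080i-compatible" set isn't the same thing as a 1080-native set.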
Whew! Hope that clears some things up.
Just a side note: this is absolutely untrue. I don't know what the guy in that article was trying to say, but there is DEFINITELY a difference between 1080i and 1080p, both from a look-and-feel standpoint and a bandwidth standpoint.