
Thread: 720 and 1080p movies/shows downloaded.

  1. #1
    Poster BT Rep: +1
    Join Date
    Oct 2006
    Posts
    55
    I have a question. My PC is connected to my AV receiver, and I watch movies and TV shows this way. However, whenever I watch something in 720p or even 1080p, it isn't nearly as crisp as HD from my cable receiver. Is it because the rips are usually compressed a bit? Or is it something to do with my PC settings or graphics card?
    An example is The Walking Dead:
    The.Walking.Dead.S01E02.720p.HDTV.x264-CTU
    My graphics card is a bit older, GeForce 8500GT, but would it be from that?

    Here's a screenshot from Walking Dead:
    http://img80.imageshack.us/img80/5861/walkingd.jpg

    And Elf 1080p
    http://img220.imageshack.us/img220/6586/elfz.jpg

    I guess the Elf one looks about normal?
    Last edited by doogie88; 12-02-2010 at 07:35 AM.

  2. #2
    sandman_1's Avatar Poster
    Join Date
    Aug 2010
    Location
    Somewhere
    Posts
    519
    I have my vid card hooked up to my 50" plasma, dvi/hdmi adaptor to HDMI on tv, and the video looks good.

    What player do you use and how is your av receiver hooked up to your PC? HDMI/DVI adaptor to HDMI?

    I use MPC-HC and use these settings for video:

    1. EVR Custom Pres.
    2. Bicubic A=-1.00 (PS 2.0)
    3. EVR Buffers 6

  3. #3
    Poster BT Rep: +1
    Join Date
    Oct 2006
    Posts
    55
    I have Bilinear selected and EVR Buffers 5.

    What is the difference between bilinear and bicubic?

  4. #4
    sandman_1's Avatar Poster
    Join Date
    Aug 2010
    Location
    Somewhere
    Posts
    519
    Quote Originally Posted by doogie88 View Post

    What is the difference between the bilinear and bicubic?
    BILINEAR INTERPOLATION:

    Bilinear interpolation considers the closest 2x2 neighborhood of known pixel values surrounding the unknown pixel. It then takes a weighted average of these 4 pixels to arrive at its final interpolated value. This results in much smoother looking images than nearest neighbor.

    In the simplest case, when the unknown pixel is equidistant from all four known pixels, the interpolated value is simply their sum divided by four.
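    To make the weighted average concrete, here's a minimal sketch of one bilinear interpolation step in Python (my own illustration, not from MPC-HC; `p00..p11` are the four known neighbours and `fx`, `fy` are the fractional distances of the unknown pixel):

    ```python
    def bilerp(p00, p10, p01, p11, fx, fy):
        """Bilinear interpolation: weighted average of the 2x2
        neighbourhood. fx, fy in [0, 1] are the fractional
        distances from the top-left known pixel."""
        top = p00 * (1 - fx) + p10 * fx  # blend along the top row
        bot = p01 * (1 - fx) + p11 * fx  # blend along the bottom row
        return top * (1 - fy) + bot * fy  # blend the two rows

    # Exactly in the middle (fx = fy = 0.5) it reduces to the
    # plain average of the four neighbours:
    print(bilerp(10, 20, 30, 40, 0.5, 0.5))  # 25.0
    ```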


    BICUBIC INTERPOLATION:

    Bicubic goes one step beyond bilinear by considering the closest 4x4 neighborhood of known pixels, for a total of 16 pixels. Since these are at various distances from the unknown pixel, closer pixels are given a higher weighting in the calculation. Bicubic produces noticeably sharper images than the previous two methods, and is perhaps the ideal combination of processing time and output quality. For this reason it is a standard in many image editing programs (including Adobe Photoshop), printer drivers and in-camera interpolation.


    Source: http://www.cambridgeincolour.com/tut...erpolation.htm
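    For comparison, here's a rough sketch of how a bicubic sample over a 4x4 patch can be computed (my own illustration using the common Keys cubic convolution kernel; the `a` parameter is the same kind of sharpness control MPC-HC exposes as "A=-1.00" in its bicubic option, though the default here is -0.5):

    ```python
    def cubic_weight(x, a=-0.5):
        """Keys cubic convolution kernel. a controls sharpness;
        a = -0.5 is the common Catmull-Rom choice."""
        x = abs(x)
        if x < 1:
            return (a + 2) * x**3 - (a + 3) * x**2 + 1
        if x < 2:
            return a * x**3 - 5 * a * x**2 + 8 * a * x - 4 * a
        return 0.0

    def bicubic(patch, fx, fy, a=-0.5):
        """Interpolate inside a 4x4 patch of known pixels
        (patch[row][col]). (fx, fy) is the fractional position
        between the two centre samples; closer pixels get
        higher weights, distant ones can even go negative,
        which is where the extra sharpness comes from."""
        total = 0.0
        for j in range(4):
            for i in range(4):
                w = cubic_weight(fx - (i - 1), a) * cubic_weight(fy - (j - 1), a)
                total += patch[j][i] * w
        return total
    ```

    A quick sanity check: on a flat patch the weights sum to 1, so `bicubic([[5.0]*4]*4, 0.5, 0.5)` comes back as 5.0.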

    Also, you have to have a video card that supports Pixel Shader 2.0 (PS 2.0) to use bicubic interpolation. I don't know if you had Bilinear or Bilinear PS 2.0 checked in the options.

  5. #5
    It's most likely related to your PC settings, not just the files you're watching. If your PC output has a higher brightness setting than your TV and/or HD receiver, that alone could make artifacts more visible. If not, it's very likely the software and settings you're using, like what's discussed above. Keep in mind that Blu-ray is higher quality than HDTV broadcasts, so even a good compressed Blu-ray-sourced rip should still look a little better than the HD stream a TV broadcast sends to your HD receiver.

  6. #6
    sandman_1's Avatar Poster
    Join Date
    Aug 2010
    Location
    Somewhere
    Posts
    519
    Forgot about this:

    Check your HDMI black level on your TV, if that is how it is hooked up. Non-PC sources use an output range of 16-235, while PCs output 0-255, and MPC-HC defaults to 0-255. If your TV is set for 16-235, then obviously there are going to be problems with the picture, such as banding and crushed or washed-out blacks.
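    The mismatch is easy to see in numbers. A sketch of the standard limited-to-full range expansion for 8-bit luma (my own illustration of the math, not code from any player):

    ```python
    def limited_to_full(y):
        """Expand studio/limited-range 8-bit luma (16-235) to
        full PC range (0-255), clamping out-of-range values.
        16 maps to 0 (black), 235 maps to 255 (white)."""
        y = max(16, min(235, y))
        return round((y - 16) * 255 / 219)

    print(limited_to_full(16))   # 0   -> video black becomes PC black
    print(limited_to_full(235))  # 255 -> video white becomes PC white
    ```

    If a 0-255 signal is displayed by a TV expecting 16-235, everything below 16 is crushed to pure black and everything above 235 clips to white, which is exactly the kind of banding problem described above.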

    To change with MPC-HC:

    MPC-HC>>right click on screen>>select "Renderer Settings">>select "Output Range"

    You might have to check which setting is which on your TV. For instance, on my Samsung plasma, Normal is 0-255 while Low is 16-235, which seems backwards to me.

  7. #7
    Poster BT Rep: +1
    Join Date
    Oct 2006
    Posts
    55
    Thanks guys.
    The first change actually made a big difference. At first it looked a bit too sharp, with a bit of 'static', and I forgot I had changed it; then I watched an episode of the same show and it was a lot better. I probably still have a few more things to tweak, since jkl49 said a compressed Blu-ray rip should look better than my HDTV, because even the 1080p movies don't seem as good as regular HDTV.

  8. #8
    sandman_1's Avatar Poster
    Join Date
    Aug 2010
    Location
    Somewhere
    Posts
    519
    Is your TV tweaked as close as it can be to the best settings? Go to TweakTV and see if they did your model. Their recommendations will get you as close as you can get to the best picture without having a professional come into your home and do it through the service menu.

    http://www.tweaktv.com/

  9. #9
    Interesting site. Thanks for the link, that definitely looks worth checking out.

    Quote Originally Posted by doogie88 View Post
    I probably still have a few more things to tweak to get it since jkl49 said a compressed bluray should look better than my HDTV
    Well, maybe it's arguable, and it probably depends on the quality of the rip too. I've seen compressed Blu-ray 720p rips that are about 1gb per 45min, and I've seen others that are 2.2gb per 45min, and most of them probably fall somewhere in between. The compressed Blu-ray might look better in some ways, and maybe the HDTV broadcast could look better in others. Mostly what I meant is that Blu-ray sources are noticeably clearer and sharper, and it's usually very obvious that they come from a better source than a TV broadcast, especially if you're watching a show you often watch on TV that was later released on Blu-ray.
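    Those file sizes translate into a pretty wide bitrate spread. A quick back-of-the-envelope calculation (my own sketch; it assumes 1 GB = 1024^3 bytes and ignores the small audio/container overhead):

    ```python
    def avg_mbps(size_gb, minutes):
        """Approximate average bitrate in Mbit/s from file size
        and running time."""
        bits = size_gb * 1024**3 * 8        # file size in bits
        return bits / (minutes * 60) / 1_000_000

    print(round(avg_mbps(1.0, 45), 1))   # 1gb/45min  -> about 3.2 Mbit/s
    print(round(avg_mbps(2.2, 45), 1))   # 2.2gb/45min -> about 7.0 Mbit/s
    ```

    So the fatter rips carry more than twice the bitrate of the lean ones, which goes a long way toward explaining why some look so much better than others.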

    I think a lot of it comes down to what you're using to play each source. If they're coming from different equipment and different connections, that can be a factor. If you were using the same hardware and software to play back each one, then it'd be easier to compare. Maybe you could try one of those hardware HD media players from a store like Best Buy with a very good return policy, and see how the bluray rips look when played from that, or I think a PS3 can do the same thing if you remux the file to a new container first. Or if you have cable tv, you could try out a QAM tuner for the PC, and watch/record the network HD broadcast channels on that instead of always through your HD receiver.
