DH Article: Avivo HD Vs Purevideo HD

Discussion in 'Reviews & Articles Discussion' started by HardwareHeaven, Apr 16, 2008.

Thread Status:
Not open for further replies.
  1. Stuart_Davidson

    Stuart_Davidson Well-Known Member

    Joined:
    Nov 20, 2002
    Messages:
    5,843
    Likes Received:
    188
    Trophy Points:
    73
    Thanks for the feedback.

    With regard to the software rendering, we will consider that for the next article. I can see it making some interesting comparisons.

    At this time we don't have any plans for other video formats; if anything, I would be thinking about .ts or .mkv performance rather than Xvid/MPEG etc. But once again, we will keep it in mind for the next video article we do.
     
  2. pouyoux

    pouyoux New Member

    Joined:
    Apr 18, 2008
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    0
    As almost all MKV files use an x264 stream, I think you already have results in the "Casino Royale" part of your article.
    But how those cards behave with SD (standard definition) content like DVDs and Xvid files would be interesting.
    Another interesting thing would be to do the same article under Linux, as Linux is a valuable platform for building a home theater PC.
     
  3. Stuart_Davidson

    Stuart_Davidson Well-Known Member

    Joined:
    Nov 20, 2002
    Messages:
    5,843
    Likes Received:
    188
    Trophy Points:
    73
    At this point with ATI/NV I would take nothing for granted about where the results would land; the performance impact on the system would also be interesting, in my opinion.
     
  4. pouyoux

    pouyoux New Member

    Joined:
    Apr 18, 2008
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    0
    As Casino Royale is H.264 encoded, I don't see what difference an H.264 stream inside an MKV would make. But you can show me I'm wrong ;)

    Since you said in your conclusion that none of the NV/ATI decoders is perfect, I am eager to see whether software rendering gets it right.
     
  5. Teme

    Teme Super Moderator

    Joined:
    Dec 22, 2004
    Messages:
    8,496
    Likes Received:
    174
    Trophy Points:
    73
    btw. As far as I know there is only DivX acceleration on ATI cards, and Xvid decoding is mostly done by the CPU, so there isn't much to test in my opinion...
     
  6. Judas

    Judas Obvious Closet Brony Pony

    Joined:
    May 13, 2002
    Messages:
    38,691
    Likes Received:
    982
    Trophy Points:
    138

    That's irrelevant: when XP had been out for just over a year, the number of users still on Windows 98 was roughly the same, and the same claims were made (SP1 would have been either just recently available or becoming available soon).

    Everything you've stated about Vista is, in fact, completely false/inaccurate/laughable.

    There's simply nothing to be discussed. Vista is the OS to use; there isn't any need for XP any more.
     
  7. Vikingod

    Vikingod Int'l Fish Liaison

    Joined:
    Jul 6, 2004
    Messages:
    16,226
    Likes Received:
    117
    Trophy Points:
    88
    Let's keep the discussion to the article at hand, please. Irrelevant replies should be ignored; there is no need to justify them, as their merit is obvious to everyone reading the thread.

    Great article Allan and Stu, it was very informative.
     
  8. skibum5000

    skibum5000 New Member

    Joined:
    Apr 18, 2008
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    0
    Thank you for the report; there is much valuable and useful information in it. However, I fear that a fairly basic but very significant error was made in the IQ comparisons.

    Doesn't NVIDIA currently auto-expand levels while ATI does not?
    Edit: ATI apparently does too with recent drivers for HD. Maybe your capture method used screen captures from older drivers, or, more likely judging by the colour values, you black-crushed the NVIDIA output by doing something that made it carry out levels expansion twice.

    What follows still applies, only it is now likely that everything I said needs to be shifted by another 5%, with NVIDIA getting black-crushed and ATI arriving at the proper expanded levels (which makes sense, since the ATI output didn't look as faded to me as unexpanded 16-235 would have).
    So perhaps what is described below is what happened, but it seems more like you had ATI's 16-235 expanded to 0-255 and NVIDIA's already expanded and then expanded again.
    Did you set something odd somewhere? I can watch that Blu-ray on my NVIDIA card and not get double expansion (I have all sliders at their neutral defaults). It is fine on my system.

    No wonder you found the mysterious ~5% deeper blacks on the NVIDIA card: guess what 5% of 255 is (about 13), and guess what the black offset for levels expansion is (16). That is all it is. NVIDIA, as of the last few months of drivers, automatically stretches 16-235 to 0-255, so you should view those images with the TV set to treat 0 as black and 255 as white; with ATI you should set the TV to expect 16 as black and 235 as white.

    The ATI way does give you a little leeway to adjust things yourself and allows for a slightly blacker-than-black and whiter-than-white signal. Although technically beyond spec, there can sometimes be some information there, and it is up to you whether you prefer a little lost contrast and more noise, or a little more detail in the blacks and whites; strictly speaking, all of that data was meant to be cut off if you follow the standard exactly.

    If you watch the ATI output as if 0 is black, you get a great loss in contrast, saturation, and image pop, plus muted black levels. If you watch the NVIDIA output with 16 as black, you get lots of black and white shades clipped off. Viewed the other way around, both look as they should.
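
    To put numbers on that, here is a minimal Python sketch (illustrative code only, not the article's actual capture pipeline) of the 16-235 to 0-255 expansion and of why applying it twice crushes shadow detail:

    def expand_levels(x):
        """Map a limited-range (16-235) code value to full range (0-255)."""
        y = (x - 16) * 255.0 / (235 - 16)
        return max(0, min(255, round(y)))  # clip anything out of range

    # A correct single expansion lands video black/white on PC black/white.
    print(expand_levels(16), expand_levels(235))   # -> 0 255

    # Double expansion: a dark shadow tone at code 30 maps to 16 on the
    # first pass and is crushed to 0 on the second - the detail is gone.
    print(expand_levels(expand_levels(30)))        # -> 0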

    If you do the test properly (by, for instance, setting the HDMI input's black level to normal when viewing the NVIDIA images and to low when viewing the ATI ones, or by stretching 16-235 to 0-255 in CS3 for the ATI images), the luminance is pretty much the same on both cards! There is no detail loss on NVIDIA; they look pretty much the same. And the ATI output is not mysteriously missing pop and saturation; again, it's pretty much the same.

    The colour balance is different, though, and that is a strange and important finding (note that the level balancing enhances the apparent difference and increases the apparent red strength in the expanded versions, but it is still very much there).
     
    Last edited: Apr 18, 2008
  9. skibum5000

    skibum5000 New Member

    Joined:
    Apr 18, 2008
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    0
    But what if NVIDIA happened to have expanded 16-235 to 0-255 before your capture while ATI had not? This appears to be the case. Then everything regarding saturation, brightness, and black and white detail will be wrong. Maybe you accidentally had things set so as to trigger double levels expansion on NVIDIA while ATI was correct. (EDIT: this is what happened for sure; you are viewing ATI at the proper expanded levels and NVIDIA with a double expansion applied, due to who knows what, but it does not do that on my machine with any of three different drivers, including the latest beta, the latest certified, and an older one.) Anyway, something was messed up. The two cards, in fact, put out pretty much the same levels, if not the same colour balance.
     
    Last edited: Apr 18, 2008
  10. arfster

    arfster New Member

    Joined:
    Apr 18, 2008
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    0
    Yup. Downloading and looking at the screenshots, this is exactly what's happened. The ATI images have been expanded to 0-255, as their drivers always do with HD, while the NVIDIA ones are way beyond that. The most likely explanation is a double expansion by the NVIDIA drivers, which I have heard of happening occasionally.

    IMO the NVIDIA images are unwatchable, even on a PC display. All dark detail is totally wiped out, like watching a badly encoded Xvid.
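
    For anyone who wants to check this themselves, here is a quick sketch (Python with Pillow; the file names are placeholders for whichever captures you download) that reports the luma range of each screenshot. An unexpanded capture should sit roughly within 16-235, while an expanded one should span close to 0-255:

    from PIL import Image

    def luma_range(path):
        """Return the (min, max) luma of an image; PIL's 'L' mode uses Rec.601 weights."""
        return Image.open(path).convert("L").getextrema()

    for name in ("ati_capture.png", "nvidia_capture.png"):  # hypothetical file names
        print(name, luma_range(name))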
     
  11. Tigre Marino

    Tigre Marino New Member

    Joined:
    Apr 17, 2008
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    0
    Only if an Xvid codec is installed. DivX can take over any generic MPEG-4 decoding if there is no competing codec installed in the system, so if the video card and driver support DXVA (DirectX Video Acceleration), DivX can use that support to enhance the experience. :wtf:
     
  12. Tigre Marino

    Tigre Marino New Member

    Joined:
    Apr 17, 2008
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    0
    The same happened with Windows Millennium Edition... and I've seen tons of people "downgrading" their new Vista laptops/desktops to Windows XP, which is more of an upgrade, btw. Even I did so with my new Gateway MX6947M notebook, using the license from my previous, dead laptop.

    I would rather use Ubuntu or Mac OS X if there is no option to use XP.
     
  13. arfster

    arfster New Member

    Joined:
    Apr 18, 2008
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    0
    Vddobrev over at AVS Forum has a suggestion. Apparently NVIDIA's 174.74 drivers expand levels just like ATI's normally do, but.....

    "Edit: I think I know how they triggered double expansion. In the nVidia control panel, there is a setting "Select HDMI color format" - RGB or YCbCr444. If they selected YCbCr444, then this is how they got it."

    Pretty nasty bug.
     
  14. Stuart_Davidson

    Stuart_Davidson Well-Known Member

    Joined:
    Nov 20, 2002
    Messages:
    5,843
    Likes Received:
    188
    Trophy Points:
    73
    I'm going to ignore most of the recent comments above; we have explained the answers to these several times now.

    As for the YCbCr comment: when I was starting the testing I compared the output of that option to RGB and don't remember seeing any real difference in IQ in the scenes I tried. I will go back and check again over the next day or two and confirm the results using the two settings. However, if there is a bug in the NV driver, it doesn't invalidate the results/testing. The images still reflect the output of the NVIDIA card, and if all NV need to do to improve things is fix that bug, then great.

    Once again, please remember: the point was not to get perfect IQ through our own configuration. It was to test what each card is currently outputting, bugs or no bugs. These images are what consumers will see.
     
  15. HardwareHeaven

    HardwareHeaven Administrator Staff Member

    Joined:
    May 6, 2002
    Messages:
    32,274
    Likes Received:
    163
    Trophy Points:
    88
    As Stuart said, if there are NVIDIA or ATI driver bugs, the output is exactly what the end user will see, and we mentioned everything we noticed. So it is relevant. We have answered many of these questions earlier in the thread: which driver settings were used, why they were selected, and the input we have received from NVIDIA. Changes are already being made by them for future driver revisions.

    Perhaps we could follow the article up with our own "optimised" conditions, making registry changes, changes to the driver and so on, to get multiple results from both sets of hardware. As it stands, however, our testing shows what the end user will see with out-of-the-box settings, as well as with the settings NVIDIA asked us to use (ATI said default was their best). People in this thread who are attempting quality comparisons via possibly uncalibrated or incorrectly calibrated screens are wasting their time; we could have done that eons ago. Instead we analysed the colour breakdown of raw captures taken directly from the hardware, with nothing else in the mix (panel, calibration, ambient light, etc.) to taint the results.

    While we appreciate the interest in the article, much of this discussion appears to be going in circles, with many people debating the supposed merits of outdated operating systems and other such irrelevancies. If you have any questions which weren't addressed in the article or in this thread, drop us an email; if they have already been answered, we really won't be making the effort to reply to them. So please make sure to read everything beforehand.
     