DH Article: Avivo HD vs PureVideo HD

Discussion in 'Reviews & Articles Discussion' started by HardwareHeaven, Apr 16, 2008.

Thread Status:
Not open for further replies.
  1. Stuart_Davidson

    Stuart_Davidson Well-Known Member

    Joined:
    Nov 20, 2002
    Messages:
    5,843
    Likes Received:
    188
    Trophy Points:
    73
    I would have to agree with the readers above: XP is no longer a product which should really be receiving new features. ATI, Nvidia and most other companies should be concentrating on ensuring the most recent OS has the best features and experience... it's how we move forward.

    Having said that, this page seems to indicate Nvidia have acceleration, including dual stream on recent cards, working: GeForce Release 174
     
  2. ianken

    ianken New Member

    Joined:
    Oct 25, 2002
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    0
    I find this comparison highly subjective.

    I see you employed the noise reduction features. Why? NR for HD-DVD and BRD is pointless; it merely replaces random noise with deterministic noise and destroys detail. In a film like "300", where film grain was added in post production, you'll actually change the intended image.

    To determine if color is rendered accurately: color bars and a vector plot. Using Photoshop and film content is not going to do it. It reminds me of the guy who calibrated his TV using "The Matrix" and, when he got rid of the "too much green", wondered why all his other stuff looked weird.

    To determine if black levels are right: ramp pattern and waveform plot.

    The difference in brightness between the images is due to differences in how the two cards handle nominal range of video content. You need to deal with it differently depending on the display device in use. Nominal black for video content is "16." Depending on how the display or decoder is handling it this can look "too dark" or "washed out." It is obvious to me that the AMD and NV gear is handling nominal range differently.

    Anyway, I just don't trust image quality analysis that does not employ test patterns. Video is very standards-based. There are very well defined methods of determining whether a video system is accurate, and I didn't see those employed here.
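    To illustrate the ramp/waveform idea in rough code (this is only my own sketch of the principle, not anything the article used): generate a grey ramp at studio levels, push it through whatever processing path you care about, and check whether the black and white points moved.

    import numpy as np

    # Rough sketch only: a grey ramp at video nominal levels (black = 16, white = 235)
    def make_ramp(width=220, height=64):
        row = np.linspace(16, 235, width).round().astype(np.uint8)
        return np.tile(row, (height, 1))

    # Example processing step a decoder/driver might apply: expand 16-235 to 0-255
    def expand_to_pc_levels(frame):
        out = (frame.astype(np.float32) - 16.0) * 255.0 / 219.0
        return np.clip(out, 0, 255).round().astype(np.uint8)

    ramp = make_ramp()
    processed = expand_to_pc_levels(ramp)
    for name, img in (("source", ramp), ("processed", processed)):
        print(name, "black point:", img.min(), "white point:", img.max())
    # Different black/white points between two playback paths mean the nominal
    # range is being handled differently -- which is exactly what shows up on a
    # display as "too dark" versus "washed out".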

    My $.02
     
  3. Supersparky

    Supersparky New Member

    Joined:
    Apr 17, 2008
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    0
    I noticed that no consideration was made about calibration of the video card with the display (or perhaps I missed it). Comments about color rendering cannot be made unless both the device and the display have been properly calibrated. Perhaps Nvidia's darker rendering is only a result of miscalibration. This is why both manufacturers have adjustments on the card. Nevertheless, no big deal in my opinion.

    However, there was one thing that stood out like a giant wart in the "expert's" review on "noise". I had to almost laugh. Movies, unless we're talking about a high-budget sci-fi flick from George Lucas, are FILMED; not many are made with high-def digital cameras yet. Film has grain. The highlighted portions in the review (blue water) where "noise" was flagged were actually film grain and not noise. Since film grain is part of the original frame or image, I would give the rendering of such grain a better review than one that merely smears over it thinking it's "noise". Sure, one can argue that grain is physical "noise", but it's not real noise; real noise is artificially generated artifacts in the electronic process.

    This was one of the complaints (from production companies) about movies in high-def and films made on cheap film stock. The fear was that people would complain about the image being "dirty" because the high resolution of HDTV is able to render film grain. The older the movie, the more grain is shown, as film technology has improved over the years. Standard NTSC wouldn't be able to show grain, and thus the movie would be rendered "cleaner" in a side-by-side comparison with the hi-def version.

    Treating grain as noise in the rendering process has the potential of smearing out small details. It all depends on the algorithm used. Nevertheless, the best algorithm is patented and is what the industry uses to "digitally restore" old movies. This technology is not in your video card, nor in its driver software. It's quite an amazing technology too; it actually makes the result look better than the original. Explaining how it works would take too much space here. The technique is so advanced and requires so much computing horsepower that it takes a cluster of servers to run.

    So-called "noise-reduction" in consumer video should be avoided. I turn that stuff off and my image is better than an image with the small details removed or smeared together. Even though film grain moves around like noise, it still shows small details amongst the grain that noise-reducing algorithms take out or smear over all for the sake of a "clean" picture.

    Nine times out of ten, that "noise" is probably film grain. Removing it generally means removing the small details.
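    To show what I mean in a crude way (this is a toy illustration only, nothing like what any shipping driver actually does): run a flat patch containing one bright single-pixel detail through a naive 3x3 box filter, the simplest possible spatial noise reducer, and watch the detail collapse.

    import numpy as np

    # Toy spatial "noise reduction": a plain 3x3 box blur (illustration only)
    def box_blur_3x3(img):
        padded = np.pad(img.astype(np.float32), 1, mode="edge")
        out = np.zeros(img.shape, dtype=np.float32)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                out += padded[1 + dy:1 + dy + img.shape[0], 1 + dx:1 + dx + img.shape[1]]
        return out / 9.0

    # Flat grey patch with one bright single-pixel detail (could just as easily be grain)
    patch = np.full((9, 9), 100.0)
    patch[4, 4] = 180.0

    blurred = box_blur_3x3(patch)
    print("detail amplitude before:", patch[4, 4] - 100.0)             # 80.0
    print("detail amplitude after: ", round(blurred[4, 4] - 100.0, 1)) # roughly 8.9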
     
  4. HardwareHeaven

    HardwareHeaven Administrator Staff Member

    Joined:
    May 6, 2002
    Messages:
    32,274
    Likes Received:
    163
    Trophy Points:
    88
    Totally incorrect with regards to the technical analysis. The frames are captured directly from the GFX card to a file on a HD. These files are then analysed via direct pixel breakdown. There is nothing in this equation which relates to the monitor being used, as the figures are read from the raw output of the files. If you don't comprehend this then I can go into more detail.
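    For anyone wondering what a direct pixel breakdown of captured files can look like, here is a minimal sketch of the idea (an illustration only, with made-up file names, not the actual tooling used for the article): load the two lossless captures, difference them per pixel, and report the numbers, with no monitor anywhere in the chain.

    import numpy as np
    from PIL import Image

    # Illustration only: compare two lossless captures of the card output, pixel by pixel.
    def load_frame(path):
        return np.asarray(Image.open(path).convert("RGB")).astype(np.int16)

    ati = load_frame("ati_capture.png")   # hypothetical file names
    nv = load_frame("nv_capture.png")

    diff = np.abs(ati - nv)               # per-channel absolute difference
    print("mean abs difference per channel:", diff.mean(axis=(0, 1)))
    print("max difference:", diff.max())
    print("pixels differing at all: %.2f%%" % (100.0 * np.any(diff > 0, axis=2).mean()))

    # A global offset in the mean values points to a levels/brightness difference;
    # localised clusters of large differences point to artifacting.
    print("mean level:", ati.mean(), "vs", nv.mean())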

    Actually there is noise in every film, even pure digital CGI to disc. Noise will even come into the equation at a hardware level on the output. I am glad you are laughing, because I sure had a chuckle at your George Lucas commentary. I am sure we could fire up one of the last Star Wars movies and analyse that, however there will be noise in those as well (and they also suck!). There will be noise in the transfers and equipment used. We aren't analysing recording techniques or various production values; we are studying how products DEAL with identical situations for consumers in a real world environment.


    The artifacting in that particular scene is partially caused by the enhanced Nvidia algorithm (this is NOT grain - it's heavy artifacting due to issues with a multiscan correction algorithm - grain with a modern recording technique never appears in 20-250 pixel random deformations). This issue has actually been verified by our sources and there are improvements being made to the process. The process for multisampling the colour swatches and linear bands will be improved.

    The rest of your comments are absolutely irrelevant to the majority of our testing. It reads more to me like a beginner's guide to noise techniques.
     
  5. Stuart_Davidson

    Stuart_Davidson Well-Known Member

    Joined:
    Nov 20, 2002
    Messages:
    5,843
    Likes Received:
    188
    Trophy Points:
    73
    It is very clear that either you haven't read the whole article or you just don't understand the overall point of it.

    There were 2 aspects.

    1) Compare the image quality which end users will see when they buy each card, install it and the drivers, and then fire up an HD-DVD/Blu-ray. This is also known as default quality.

    ATI employ noise reduction by default, regardless of the disc used, so that's what was tested.

    2) Ask ATI/Nvidia what enhancements they recommend turning on/off and retest. ATI asked us not to change anything; to quote them, "Default is fine". Nvidia suggest trying the Contrast/Colour enhancements as well as noise reduction, starting from a figure of 60 and experimenting. These are the settings that both companies asked us to use, also known as "best IQ". (I already explained this earlier but clearly you didn't read that either.)


    So to clarify...
    1) Default
    2) Best

    Then we discussed our opinion on the results both subjective and technical, summarised and then allowed YOU to download each image and form your own opinion on what is the best IQ, and what each company needs to do to fix their image. (I'm sure you agree neither is perfect).

    Screen calibration, test patterns... neither is relevant to the technical analysis (read Z's post), and our screens are calibrated with professional equipment for the subjective analysis (which is never perfect but as close as possible, and significantly better than the average enthusiast's setup). The images are true digital captures of the card output at the documented settings.
     
  6. Alex

    Alex Driverheaven Lover

    Joined:
    Aug 8, 2005
    Messages:
    241
    Likes Received:
    9
    Trophy Points:
    0
    Where on earth did these two idiots come from? Clearly they didn't even read the testing methods. I don't see any better way to test images than grabbing directly from the card then analysing the raw files. It's the purest analysis I can imagine possible.

    Great job Z and V3. Probably some losers from another site, jealous of the work you guys put into this.
     
  7. brutusmaximus

    brutusmaximus New Member

    Joined:
    Apr 24, 2005
    Messages:
    520
    Likes Received:
    17
    Trophy Points:
    0
    Well said. I think the problem here is that there are really very few people on the net capable of going into such image detail as Z can, and as such it's confusing because it's such a groundbreaking idea.

    I mean, whoever thought of grabbing the output from the cards, direct to file, then breaking apart the raw files? There is no monitor in the mix, no calibration to cock things up, and the signal is as pure as possible from card to file.

    I mean it's great we have people coming over here to debate it, but at least make some sense. What I'm reading is clearly people interested in video but quoting information which isn't even relevant, as they figure the whole thing was handled by someone looking at a bloody calibrated (or not) screen to get the figures.

    Take some time to read the thing right, guys, it's embarrassing :uhoh:
     
  8. humonous

    humonous New Member

    Joined:
    Jun 27, 2005
    Messages:
    157
    Likes Received:
    7
    Trophy Points:
    0
    Moral of the story: don't take on Z with regards to image work. He will pwn everyone; the guy is an expert in the field. Totally. You ought to have seen some of his work before, when he had his own design sites. Amazing stuff. Respect needs to be shown. End of story.
     
  9. floppychops

    floppychops New Member

    Joined:
    Apr 17, 2008
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    0
    Yeah, I had to join up to post something yesterday and I'm amazed how knowledgeable and intelligent Veridian3 and Zardon are. Some of their hardware reviews are incredible (the Skulltrail and recent SLI reviews for example).

    The methodology is sound; I think people just don't comprehend it, as everyone is used to reviewers giving purely personal opinions from looking at images on a screen. This article is the total opposite of that. I learned a lot from it and I know my friends did on the other forums I visit. It's a brilliant article and it's good to read that Nvidia are using the article to help guide them a little in the future. The company seem very proactive in improving their products lately.

    I think I'll hang around here, but the articles take me some time to read and absorb, there is so much in them. I had to read this one three times to work a lot of it out; maybe more people should read them more slowly if they are getting lost, which is clearly the case.
     
  10. OmegaRED

    OmegaRED Relapsed Gamer

    Joined:
    Oct 18, 2002
    Messages:
    5,704
    Likes Received:
    186
    Trophy Points:
    73
    Don't care that much about the issue since I don't have HD, but damn, that was some great analysis. It must have taken a long time to examine the different photos, knowing full well there'd be a lot of heat if you made a mistake. We appreciate the effort!
     
  11. Tigre Marino

    Tigre Marino New Member

    Joined:
    Apr 17, 2008
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    0
    If making up 85% of the Windows ecosystem, as XP SP2 does now, isn't relevant... hello!!! I don't know what is. Face it: Windows Vista is a mediocre, bug-ridden, slow, sluggish and DRM-infested operating system. There's nothing you can do on Vista that SP2 can't (except playing games with DX10, which has been overhyped both in IQ and performance). Vista is like a fuel-thirsty SUV that rolls over without apparent cause (like the Ford Explorer). Now we know why the Japanese, Indian and Korean auto industries are crushing the American ones, if they build cars the way operating systems like Vista are programmed these days. If Windows 7 isn't as good as Windows XP, I'm moving to MacOS or Linux, because Windows Vista is like Windows Millennium Edition 2.
     
  12. arfster

    arfster New Member

    Joined:
    Apr 18, 2008
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    0
    One key fact explains 99% of the differences observed in the article: by default, ATI 2xxx/3xxx series cards expand the colour space to PC levels (i.e. 0-255), presumably because they think most people will be viewing on a PC monitor. Note this doesn't apply to SD - ATI bizarrely expand HD but not SD, and you have to apply registry tweaks to get them to behave consistently (I've been complaining to their tech support for 9 months about this, but no resolution).

    For a fairer comparison with Nvidia, I'd suggest reversing the expansion via brightness 16 / contrast 86 (this uses the same BT.709 method as the original expansion, so it's near lossless).
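    For what it's worth, here is the arithmetic behind that suggestion as I understand it (a rough sketch of my own, not ATI's or Nvidia's actual code): the default expansion maps studio levels 16-235 onto 0-255, and a contrast of roughly 86% (219/255) plus a +16 brightness offset maps them back.

    import numpy as np

    # Rough sketch of the levels arithmetic (illustration only)
    def expand_studio_to_pc(y):
        # what the driver does by default for HD, per the post above
        return np.clip((y - 16.0) * 255.0 / 219.0, 0.0, 255.0)

    def reverse_expansion(y_pc, contrast=219.0 / 255.0, brightness=16.0):
        # the suggested "brightness 16 / contrast ~86" counter-adjustment
        return y_pc * contrast + brightness

    studio = np.array([16.0, 100.0, 235.0])   # nominal black, a mid grey, nominal white
    pc = expand_studio_to_pc(studio)
    print("expanded:", pc.round(1))                       # [  0.   97.8 255. ]
    print("reversed:", reverse_expansion(pc).round(1))    # back to ~[ 16. 100. 235.]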
     
    Last edited: Apr 18, 2008
  13. arfster

    arfster New Member

    Joined:
    Apr 18, 2008
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    0
    ATI's denoise only works for interlaced material, and only with the more advanced modes of deinterlacing selected (motion-adaptive & vector-adaptive). It's also disabled for all HD in recent drivers, along with sharpening - whether by design or not I've no idea. Their drivers are very buggy; things regularly get broken/fixed/rebroken every few driver cycles.
     
    Last edited: Apr 18, 2008
  14. osho_gg

    osho_gg New Member

    Joined:
    Jan 15, 2003
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    0
    Inverse telecine?

    Thanks for a very thorough review.

    I noticed in the Nvidia Control Panel screenshots that inverse telecine was unchecked. I understand that the default is to enable it. I was wondering: did you notice any difference in your tests with this setting enabled/disabled? And if so, what?

    Osho
     
  15. Eeastcoasthandle

    Eeastcoasthandle New Member

    Joined:
    Nov 8, 2006
    Messages:
    158
    Likes Received:
    1
    Trophy Points:
    0
    I completely disagree with the conclusion. IMO, based on reading the article, the IQ & performance weren't properly weighed out. The fact that attempts were made to adjust video or color settings on the Nvidia card to make it as good as ATI should have been the first indication that ATI's IQ is better. The second is that the tweaked settings for the Nvidia card had negative side effects, which should also have weighed heavily into the conclusion. When most people haven't the slightest clue how to tweak those settings, the nudge should have gone to ATI. So what if Nvidia offers playback with Aero; I have no use for it. The direction of this review is about IQ and performance, not which offers more features. If ATI clearly demonstrates that the IQ and performance are better without playback using Aero, it should have won with a cliff, not a statement that in your opinion it's a feature worth adding.
     
    Last edited: Apr 18, 2008
  16. HardwareHeaven

    HardwareHeaven Administrator Staff Member

    Joined:
    May 6, 2002
    Messages:
    32,274
    Likes Received:
    163
    Trophy Points:
    88
    And?


    This is the last time I will dignify these questions with a response. This has already been answered in the thread (and in the article itself): it was stated that these are the settings ATI said were the best. Obviously, as they design the software and hardware, we would assume they would know, not yourself. Also, if you feel this article wasn't "properly weighed out" then I suggest you test them yourself, publish it and post in here so I can see how you can improve on capturing raw data from the cards and then analysing the raw pixel breakdown.

    Rather a sweeping statement, isn't it? We mentioned that certain aspects of the IQ are better, but not all. I can therefore already assume you have either (a) glanced over it or (b) not bothered to read the whole editorial correctly.

    Why? They can be turned off and aren't the default out-of-the-box settings.

    Well, I guess you just answered your own question. If people don't know how to tweak the settings, then the enhanced Nvidia settings which AREN'T default wouldn't be used.

    Well as difficult a concept as it might seem, you are just one person. Are we to start reviewing products and omitting bugs and issues because some people might not care? If this is a condemning point for you then I suggest you perhaps start frequenting another site which omits issues based on loyalties to specific manufacturers because DH always tells it like it is, regardless of the company involved.

    Really? I wasn't aware you were in charge of handling editorials and the direction of the content on DriverHeaven, I thought that was my job.

    This article was ascertaining IQ; however, during this process if we find issues relating to the playback of this content we will mention it. It seems to me that, quite honestly, you have no interest in an unbiased article but are more concerned that we should be ignoring issues if they relate to ATI or their products.

    Well, that's the point, it didn't. Quite what a cliff has to do with anything, however .......

    Incidentally if I see these same questions repeated again and again in this thread they will be removed. We have no time to be readdressing the same mindless questions which have already been answered in both the editorial and this thread.
     
  17. HardwareHeaven

    HardwareHeaven Administrator Staff Member

    Joined:
    May 6, 2002
    Messages:
    32,274
    Likes Received:
    163
    Trophy Points:
    88
    We are actually aware of this; however, this is a real world article and we are detailing the relevant image quality with basic out-of-the-box settings, or settings that either ATI or Nvidia have recommended we use. (There is a reason you have had no resolution from your reports to ATI: they don't feel it's a change people should be making.)

    Like everything, whether it is hardware or software, there are always tweaks and (possibly subjective) improvements an educated enthusiast can make to have something run better (BIOS, registry, etc.). However, for the majority of end users this will be irrelevant as they will never know about a tweak like this, therefore it is pointless to test. The fact that ATI told us not to alter registry settings or panel settings beyond what we used means this is the only feasible way we can analyse the products. We know some tweaks for Nvidia also, but we didn't use them either, for exactly the same reasons.

    If ATI feel the settings they provide are the ones that people should be using then this is the way we test it, and the same with Nvidia. If we were to analyse something we had tweaked significantly, to the point that only 1% or less of the populace would make the change, then this becomes a totally irrelevant article.
     
  18. Alex

    Alex Driverheaven Lover

    Joined:
    Aug 8, 2005
    Messages:
    241
    Likes Received:
    9
    Trophy Points:
    0
    Yeah, their drivers are very buggy; I've noticed things working or breaking on a regular basis between different driver revisions. Must be a nightmare for reviewers to get results when the drivers are really so poor at the end of the day.
     
  19. Stuart_Davidson

    Stuart_Davidson Well-Known Member

    Joined:
    Nov 20, 2002
    Messages:
    5,843
    Likes Received:
    188
    Trophy Points:
    73
    On our test system the default was unchecked. As for the differences when used, for the tests we were doing the option has no impact on the results.
     
  20. pouyoux

    pouyoux New Member

    Joined:
    Apr 18, 2008
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    0
    I find your article very, very relevant to what lots of people want to know about video cards.

    I'm totally astonished at the lack of transparency in the process of decompressing HD material...

    I've got 2 questions:
    - Why not include in this article a third contender: the picture you get in 100% software rendering mode?
    - Do you plan to do the same thing with other video formats (XviD, MPEG, etc.), to verify whether ATI and Nvidia have the same footprint when decoding those files?
     