Driverheaven Source Benchmark

Discussion in 'Overclocking, Benching & Modding' started by Stuart_Davidson, Aug 19, 2004.

Thread Status:
Not open for further replies.
  1. dpagan

    dpagan New Member

    Joined:
    Apr 5, 2004
    Messages:
    249
    Likes Received:
    0
    Trophy Points:
    0
    System:
    Abit NF7-S rev 2.0
    AMD Athlon XP 2500+ @ 2.1 GHz
    1 GB Corsair
    2x 80 GB HDD
    BFG 6800 GT (stock)
    Win XP
    Same settings, reflect all, yada yada; I get 53 FPS.
     
  2. Surfer2374

    Surfer2374 New Member

    Joined:
    Aug 19, 2004
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    0
    Here you go guys, here's a more accurate view of the picture. ATI is still in the lead, but the GT and Pro are much closer. They also paired the Ultra with the XT and not the XT PE. If you want to pair the cards up, the only fair way is by suggested retail price:


    X800 XT PE vs Ultra Extreme
    X800 XT vs Ultra
    X800 Pro vs GT
    For the high end, anyway. Nvidia's previous generation (i.e. the FX series) isn't really worth mentioning, as the R300 pretty much killed it.

    The link: http://www.gamers-depot.com/hardware/video_cards/source/002.htm
     
  3. Malus

    Malus BSD SMASH!

    Joined:
    May 13, 2002
    Messages:
    1,170
    Likes Received:
    3
    Trophy Points:
    0
    Very interesting. Not sure what to make of it though.

    I'll probably buy an Nvidia card anyway because of their *BSD/Linux support. ATI is sorely lacking in that department, so I probably won't be spending a lot of money on them anytime soon.
     
  4. Silent Buddha

    Silent Buddha New Member

    Joined:
    Aug 18, 2004
    Messages:
    19
    Likes Received:
    0
    Trophy Points:
    0
    Here's something interesting I just noticed in both Veridian3's and S-Extreme's benchmarks...

    Both reports show the XTPE faster at 6xFSAA and 16xAniso than the 6800U at 4xFSAA and 8xAniso when every detail setting in the game is cranked up.

    I'm also disregarding the Gamers-Depot benchmark, as they don't disclose at all what detail settings they're using in the benchmark. Looking at their numbers, it doesn't look like they have everything cranked up to the max.

    However, it does let people know roughly how their card will perform IF you don't crank up all the visual eye-candy in the benchmark.

    And this quote sounds suspiciously like what many NV "fans" derided ATI "fans" for saying about bringing performance up to par in Doom 3.

    "AntiAliasing and Anisotropic Filtering performance belongs to ATI for sure, especially at the higher resolutions. NVIDIA can, and most likely will, release a new driver right before or shortly after Half Life 2's release which we're confident will help bring up some of the delta we see here. "

    ATI "fans" - We're sure with an OpenGL rewrite, that performance will close the gap with the 6800 series.

    Nvidia "fans" - We're sure with Direct3D optimizations that performance will close the gap or surpass the X800 series.

    Now assuming you aren't a "fan" (in other words a fanatic) of either company, you'll see that you will get great gameplay in pretty much any game with either card.

    Myself, even if the promised OpenGL rewrite doesn't close the "performance" gap with the 6800 series in OpenGL, I can still run every single game at 1920x1200 at playable framerates. The same goes for D3D.

    I'm also sure that if I had a 6800 series, I'd be able to make the exact same claim.

    Now, if you live and die by your benchmark scores and actually playing the games is only secondary, then yeah, feel free to keep arguing about which is faster and which company is "cheating" and what website is biased, etc...

    For me, I'll be playing my games...

    Silent Buddha
     
  5. No_Style

    No_Style Styleless Wonder

    Joined:
    Jun 18, 2002
    Messages:
    6,027
    Likes Received:
    0
    Trophy Points:
    0
    The silent one has spoken, and spoken well. :) I just hope these 'promises' of gap closures become more evident before Q2 of 2005. I'm confident they will, but time will tell.
     
  6. fish99

    fish99 New Member

    Joined:
    Aug 19, 2004
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    0
    Clearly says XT Platinum there. Those scores are radically different from the DH and FiringSquad ones, though. I'd love to know what's going on :confused:

    One other thing, since everyone is so keen on fairness: comparing against the GT OC isn't exactly fair, since it's not running at stock speeds.
     
    Last edited: Aug 20, 2004
  7. The_Neon_Cowboy

    The_Neon_Cowboy Well-Known Member

    Joined:
    Dec 18, 2002
    Messages:
    16,076
    Likes Received:
    28
    Trophy Points:
    73
    I'm surprised; I don't think anyone has mentioned this, but I just breezed over the posts. Did anyone notice a couple of the IQ differences?



    Save both images to your HDD, then use the slide show feature in the Windows picture viewer. Flip back and forth and you'll see it.



    [image]


    There is a texture/detail difference on the far wall, and also a noticeable color difference on the bars that are up high; take a look... (BTW, sorry for monkeying with a DH image.)
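    If flipping back and forth by eye is awkward, a difference image makes the changed areas jump out. Here's a minimal Python sketch, assuming Pillow is installed and the two shots were saved as nv.png and ati.png (hypothetical filenames):

        # Build a per-pixel difference image of the two screenshots.
        # Filenames are placeholders; use whatever you saved the shots as.
        from PIL import Image, ImageChops

        nv = Image.open("nv.png").convert("RGB")
        ati = Image.open("ati.png").convert("RGB")

        diff = ImageChops.difference(nv, ati)        # per-channel absolute difference
        print("Differing region:", diff.getbbox())   # bounding box of non-zero pixels
        diff.save("diff.png")                        # bright areas = where the cards disagree

    Anything that lights up in diff.png (the far wall, the high bars) is a genuine rendering difference rather than your eyes playing tricks between flips.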
     
  8. Sk0t

    Sk0t New Member

    Joined:
    Aug 19, 2004
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    0
    Tech Report has a benchmark out; it can be seen here:
    http://www.techreport.com/onearticle.x/7221

    Their results are closer to Gamers-Depot's than to DH's/FiringSquad's (a quick sketch putting numbers on the spread follows the table below).

    However, they had "water detail" set to "reflect world"; yours (DH) was set to "reflect all", right? Is that simply the difference between the DH/FS and GD/TR results?

    Site           CPU      X800XT   6800U   6800GT   X800Pro
    Driverheaven   FX-53      72       50      -        -
    Gamers-Depot   P4 3.4     69       55      50       51
    FiringSquad    3800+      80       63      57       66
    TechReport     3800+      70       56      51       51
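    To put a number on how far the sites disagree, here's a minimal Python sketch using only the fps figures from the table above; nothing below comes from the articles beyond those numbers:

        # Per-card spread across the four sites, from the table above.
        # Missing entries ("-") are simply left out of the dict.
        results = {
            "Driverheaven": {"X800XT": 72, "6800U": 50},
            "Gamers-Depot": {"X800XT": 69, "6800U": 55, "6800GT": 50, "X800Pro": 51},
            "FiringSquad":  {"X800XT": 80, "6800U": 63, "6800GT": 57, "X800Pro": 66},
            "TechReport":   {"X800XT": 70, "6800U": 56, "6800GT": 51, "X800Pro": 51},
        }

        for card in ["X800XT", "6800U", "6800GT", "X800Pro"]:
            scores = [fps[card] for fps in results.values() if card in fps]
            spread = (max(scores) - min(scores)) / min(scores) * 100
            print(f"{card}: {min(scores)}-{max(scores)} fps, spread {spread:.0f}%")

    On these figures the 6800 GT swings by roughly 14% between sites, which lines up with the 10-15% gap fish99 points out further down.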

    Also, they mentioned something that might explain the strange difference in results:
     
  9. HawK

    HawK Banned

    Joined:
    May 13, 2002
    Messages:
    2,092
    Likes Received:
    0
    Trophy Points:
    0
    Yes, I did post it; see post #8.

    Look at the surroundings of the steel door:
    NV:
    [image]
    ATI:
    [image]
     
  10. fish99

    fish99 New Member

    Joined:
    Aug 19, 2004
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    0
  11. flow``

    flow`` New Member

    Joined:
    Jul 12, 2002
    Messages:
    35
    Likes Received:
    0
    Trophy Points:
    0
    Pretty interesting stuff... can't wait to see the game released, plus an updated driver release or two, and see how the story changes.

    I couldn't care less in the end which company is better, since competition only makes the products better, but it's a shame Valve didn't put a little more work into fixing or working around the massive performance difference between the cards, because that's really unfair and unfortunate for a lot of gamers with Nvidia cards.

    The Doom 3 benchmarks looked a lot better than these; even though there were some fairly small performance differences there, these HL2 benchmarks are just ridiculous. That a company would put out such a biased product... shame on Valve.
     
  12. Silent Buddha

    Silent Buddha New Member

    Joined:
    Aug 18, 2004
    Messages:
    19
    Likes Received:
    0
    Trophy Points:
    0
    Uh...

    Someone correct me if I'm wrong, but didn't Valve themselves say they spent 3 times the amount of time trying to get performance up on Nvidia hardware as they did in trying to get performance up on ATI hardware?

    And, considering the number of games that use extensive DX9 pixel shaders, I'm not at all surprised by this. One whole game that I can think of: Far Cry. Tomb Raider only uses token pixel shaders.

    Just to beat a dead horse but ATI has quite a lead in any game that uses Pixel Shaders extensively. Just take a look at the performance difference in Far Cry.

    Nvidia NEEDED PS 3.0 to get performance similar to ATI's with base PS 2.0. However, when PS 2.0b is used, ATI cards see a similar gain thanks to the longer instruction limits in pixel shaders. All things being equal, ATI has better pixel shader performance than the equivalent Nvidia cards.

    The good news though is that at least with the 6800, Nvidia is closing the gap. At least this time around HL2 won't have to fall back to DX8.1 on Nvidia's hardware to have playable performance. I'd call that a big step up and a good thing for EVERYONE. I'd hate it very much if ATI or Nvidia didn't have good competition.

    Usually, I don't care one whit whether Nvidia's or ATI's graphics solutions perform better than the other's. However, if people are going to start slandering a company's reputation even though previous-generation games tend to back up what is happening, that is just not right.

    If you're going to slander Valve just because ATI hardware happens to perform better in Half-Life 2, then you should also give equal time to slandering id Software because their game just happens to perform better on Nvidia hardware.

    However, if people could actually step back and stop being graphics card/software company "fans" (i.e. fanatics), they'd see that neither Valve nor id did anything to purposely "screw over" one graphics company or the other.

    For me I'll always go with whatever solution gives me enough performance to play the games that I want to play with the best image quality available.

    Up to now that has meant: Voodoo1, Voodoo2, GeForce 256, GeForce3, Voodoo5 5500, GeForce4 Ti 4600, Radeon 9700 Pro, and now the X800 XT PE.

    I also tried out the Riva TNT but couldn't stand it. And the Voodoo5 was preferred at the time, as its image quality was far superior to the GF3's even though the GF3 was faster. The GF4 was just too much of a speed increase to keep using the V5, even though the V5 still had better IQ, IMO.

    IF the 6800s had come out on the market before the X800s had, there's a good chance that I'd be using a 6800 class card right now. And I'd probably be just as happy with that as I am with my X800.

    Although I must say that OpenGL performance is a distant second to me compared to the importance of D3D performance.

    Silent Buddha
     
  13. HawK

    HawK Banned

    Joined:
    May 13, 2002
    Messages:
    2,092
    Likes Received:
    0
    Trophy Points:
    0
    Mmm, but of course you can turn that reasoning around: why did id not spend more time making sure D3 would run better on ATI hardware, as Valve did to make HL2 run better on NV? Fair is fair, right?
     
  14. Demigod

    Demigod New Member

    Joined:
    Nov 6, 2003
    Messages:
    102
    Likes Received:
    0
    Trophy Points:
    0
    The 5x more development time Valve put in for Nvidia was for the FX series, which, if you look at the benchmarks, paid off, as the FX scores very well.
    Bottom line: both developers have optimised for both vendors' cards. id works more closely with Nvidia, Valve with ATI, so what? Both card makers' cards get playable framerates in both games.
     
  15. fish99

    fish99 New Member

    Joined:
    Aug 19, 2004
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    0
    Isn't that 'steel door' actually the floor? I think that shot is looking down. Anyway, I think from the full-size images the 'door' has a layer of water over it, which is the main cause of the difference between the two images: just the rippling of the water. The left-hand rope supporting the box centre-screen has gone green for some reason in the ATI image, though :wtf:

    The only thing I don't understand about all these benchmarks is how the X800 Pro can demolish the 6800 GT on some sites, yet the 6800 GT is a touch ahead on others. There's clearly a 10-15% gap between the Nvidia results depending on which site you get your benchmarks from.

    I trust all these hardware sites to be as dedicated to accuracy as possible, so I find the whole thing very odd. Maybe, as Sk0t and Tech Report said, it's something to do with changing resolution without restarting the benchmark tool. Might be worth Veridian3 looking into that.

    Quote from Tech Report:
    no matter which card we tried, we'd see pixel shader corruption problems and skewed benchmark results if we didn't exit the game and restart it after each video mode change. This problem was simple to work around, of course, but it's something to note.
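    In other words, the workaround is to treat each resolution as its own fresh run rather than switching video modes inside one session. A minimal Python sketch of what such a wrapper could look like; the executable name, flags and demo name are hypothetical placeholders, not the benchmark's actual command line:

        # Launch the benchmark fresh for every video mode instead of changing
        # resolution in-game, per Tech Report's workaround. All names/flags
        # below are placeholders; substitute whatever your benchmark build uses.
        import subprocess

        RESOLUTIONS = [(1024, 768), (1280, 1024), (1600, 1200)]

        for width, height in RESOLUTIONS:
            cmd = [
                "source_benchmark.exe",            # placeholder executable
                "-width", str(width),
                "-height", str(height),
                "+timedemo", "benchmark_demo",     # placeholder demo name
            ]
            print("Running:", " ".join(cmd))
            # Each run is a separate process, so the game exits and restarts
            # cleanly between video mode changes.
            subprocess.run(cmd, check=True)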
     
  16. petri1

    petri1 New Member

    Joined:
    Jul 13, 2004
    Messages:
    746
    Likes Received:
    0
    Trophy Points:
    0
    So all this is very impressive, but how will the Radeon 9xxx series do in those benches?

    Will I, for one, get as smooth a framerate in Valve's hyper-hyped upcoming game as in Far Cry, or even better? Better in Valve's game than in Crytek's?
     
  17. Demigod

    Demigod New Member

    Joined:
    Nov 6, 2003
    Messages:
    102
    Likes Received:
    0
    Trophy Points:
    0
    It would be nice to have some results for the 9xxx series as I have one in a comp next to me.
     
  18. Sk0t

    Sk0t New Member

    Joined:
    Aug 19, 2004
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    0
    The 9600 XT, 9800 Pro and 9800 XT are all in the Tech Report test:
    http://www.techreport.com/onearticle.x/7221
     
  19. Demigod

    Demigod New Member

    Joined:
    Nov 6, 2003
    Messages:
    102
    Likes Received:
    0
    Trophy Points:
    0
    Cool, it's playable with 4xAA/8xAF @ 1024x768 on a 9800 (over 30 fps), good enough for me. I can't see anyone having a bad time FPS-wise with this game. :lol:
     
  20. fish99

    fish99 New Member

    Joined:
    Aug 19, 2004
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    0
    Remember this is just a graphics benchmark with (as far as I'm aware) no physics or AI. The full game surely won't reach these sorts of numbers.
     