Discussion in 'Reviews & Articles Discussion' started by craig5320, Oct 22, 2010.
Cool, what made you choose Gigabyte over the others?
I'd love to see a test with PowerDVD 10 and a Full HD TV with active shutter glasses... both because that's the major 3D approach for TVs and because that's what I'm struggling with now myself.
Why couldn't ATI research a little longer and bring out something like PhysX instead? I see that at times the GTX 470 outperforms the 6870. Is it because of the drivers? I was expecting a little more out of the 6000 series (like matching the performance of the GTX 480) and better CFX efficiency. There are still plenty of optimizations to be made. Nice to see they are still maintaining low power and noise levels. That's a plus point.
By the same logic, why couldn't Nvidia research longer to bring out their own PhysX? Don't forget that Nvidia acquired the company behind PhysX.
I already have a "Do Not Disturb" XFX Door Hanger.
Actually, my original intent was to get another XFX brand card. I like the "Lifetime Limited Warranty" they offer. That's what my HD 5770 is. But they are asking a bit more for the XFX card, and I really don't see any logic in paying that extra amount. The other cards pretty much offer a 3-year warranty and, let's be honest, I won't still be using this card three years from now.
And, with the packaging being almost identical to the MSI version, there is also the price differential to consider.
And, FWIW, I really kind of like the looks of that Gigabyte card over the others. Since it all pretty much boils down to 9 feet one way; 3 yards another, it just was a matter of pricing and funds available for this particular upgrade.
One of my brothers-in-law is buying my current HD 5770 at half the current shelf price. It's been used now for about a year so I'm happy with that...and he's thrilled to death that he will be able to utilize three monitors.
BTW, I already got my UPS Tracking Number. So, this card should be here by the first of the week for sure.
EDIT: Here's a link to an Overclocking Study of the Gigabyte HD6870. I hope I've copy/pasted the translated link for you...
Two questions for anyone who can answer them.
1. I heard rumors that an overclocked 6850 can, depending on the model, reach the 6870's performance. True?
2. The 6850 is faster than a 4870, correct?
3. DS, congrats on your new GPU.
RE: 3: Thanks! I'm very eager to see just how it compares to my current HD 5770!
As for the other questions, you may be interested to know that, according to reviews I've read, the HD6850 actually can give the HD5850 a run for its money. Check out the PC World comments here:
AMD Radeon HD 6850 Graphics Card Review - PCWorld
Since I plan to use the new card for at least two years, I felt compelled to get the better of the two at this time. I have absolutely NO intention of spending anywhere near $300+ (USD) for any video card -- regardless of how well it performs. I prefer to pay less than $200. But, since I went with Eyefinity several months ago -- and really, really do enjoy it! -- I felt going from the limitations of the 128-bit BUS of the HD5770 to the 256-bit BUS of the HD6870 was worth the investment.
I plan to run some default benchmarks before I remove the HD 5770 and then do them again at the default settings of the HD 6870 to see just how they compare performance-wise on my system. I've got several of the benchmark programs and want to give them all a good work out this coming week.
Oh...and for those who may be interested, here's the image of the Gigabyte card....(this is from the manufacturer's site...not my own card.)
1) It would depend on the overclock really, but generally no, not quite that performance. The only results which show an overclocked 6850 beating the 6870 were probably performed on the "faulty" Sapphire 6850 which was sent out with all the stream processors enabled... basically an engineering sample of the Barts core. It was actually us who alerted Sapphire that this had happened (at least for the UK), and by the time we did, other sites had already gone live with their (incorrect) performance figures.
Thankfully we actually pay attention to what we are doing and test the cards thoroughly, so we notice issues like that before publishing results, allowing us to make sure our testing is accurate. Any launch-day articles of the Sapphire 6850 you see that show GPU-Z reading 960 cores were done on one of those engineering-sample cards.
2) Yes, by some way. And it obviously has a whole load of extra features including DX11 support.
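To put some rough numbers on point 1, here's a back-of-envelope sketch (the function name and the "throughput" metric are my own simplification; the stream processor counts and reference clocks are the published specs for these two cards, and real game performance depends on far more than raw shader rate):

```python
# Back-of-envelope only: why even a big overclock struggles to close the
# 6850 -> 6870 gap. Raw shader throughput = stream processors x core clock.

def shader_throughput(stream_processors, core_mhz):
    """Relative ALU throughput in arbitrary units (SPs x clock)."""
    return stream_processors * core_mhz

hd6870 = shader_throughput(1120, 900)  # stock HD 6870: 1120 SPs @ 900 MHz
hd6850 = shader_throughput(960, 775)   # stock HD 6850: 960 SPs @ 775 MHz

# Clock a 6850 would need just to MATCH the 6870's raw shader rate:
required_mhz = 1120 * 900 / 960  # = 1050 MHz, roughly a 35% overclock

print(f"Stock 6850 is {hd6850 / hd6870:.0%} of a 6870's shader rate")
print(f"It would need about {required_mhz:.0f} MHz to match on paper")
```

So on paper a 6850 would need around 1050 MHz on the core just to equal a stock 6870's shader throughput, which lines up with the answer above: most overclocks get close, but not quite there.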
Yes, both the 6850 and 6870 are fine products even when compared with the 5800 series.
We will be waiting for your update
Very interesting. While I was told about the 6850 reaching the 6870, I was still not ready to accept it as fact unless I heard it here. Thank you for the clarification.
Good news too about it being faster than the 4870. At least I won't lose way too much performance if I manage to get the 6850!
Wonder how much that DP hub will cost... probably way too bloody much.
Still would like to know how well the cards work with passive DP adapters, and if the 2x DVI + HDMI can be used at the same time in surround view.
1) Fine, had no problems in our testing using the mini-DP>DVI passive adapter.
2) I'll check that for you later.
Even in games with good Xfire scaling, you might not lose much at all.
If I remember rightly, the 4870X2 performed slightly under the GTX295, its main rival. The 6850 appears to perform a little under a GTX470. And the GTX470 performs similarly to a GTX295. See what I did here?
Ballpark estimations, but you get the idea
Good logic there Dr Watson, but for one thing. I was talking about the 6850, not the 6870!
I meant that
Original post edited!
2x DVI + HDMI = only 2 screens available (can be any 2 of the 3)
This is the same as what we have now on the HD5xxx cards, right? I wouldn't actually expect this configuration to be any different than what we've had previously. So, in order to use anything above two monitors, the DisplayPort/s have to come into play.
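The rule stated above can be sketched as a toy check (the function and connector names are mine, purely illustrative; the usual explanation for the limit is that these GPUs only have enough legacy clock sources for two non-DisplayPort outputs):

```python
# Toy model of the output rule discussed above: at most two of the legacy
# outputs (DVI/HDMI/VGA) can be active at once; any additional screens have
# to come off DisplayPort.

LEGACY_OUTPUTS = {"DVI", "HDMI", "VGA"}

def config_is_valid(outputs):
    """outputs: list of connector types in use, e.g. ["DVI", "DVI", "DP"]."""
    legacy_count = sum(1 for o in outputs if o in LEGACY_OUTPUTS)
    return legacy_count <= 2  # beyond two screens, DisplayPort must be used

print(config_is_valid(["DVI", "DVI", "HDMI"]))  # False: three legacy outputs
print(config_is_valid(["DVI", "DVI", "DP"]))    # True: third screen on DP
```

It's only a sketch of the constraint, but it makes clear why the third Eyefinity screen always ends up on the DisplayPort connector one way or another.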
What I am eager to find out is whether the adapters for DVI have to be...HAVE to be... active as opposed to passive. And, if they do have to be active, is the dependability and stability improved over what was happening on the HD5xxx cards?
I tried the ACCELL Active adapter and had to return it because I was constantly losing signal and having to unplug/replug the USB power in order to restore signal. The passive DP->VGA has worked almost flawlessly for several months now. AND, is much, much cheaper than the active adapter!
So, some more information about how well the Eyefinity is working and what changes/improvements were made would be very welcome.
Passive works fine; I used the mini-DP to DVI passive adapter for our Eyefinity testing. Having said that, the active adapter we have used in the past works well with LG and Asus screens, so that wouldn't have been an issue either, I'd imagine.
This is very welcome news! Although the VGA display is actually excellent, I'd really prefer to be able to use the DVI input on my third monitor as opposed to VGA.
Here's hoping that UPS gets my new card here in the next day or two. Estimated delivery, however, per their tracking, is the 27th.
EDIT: Holy Smokes!!! After writing the above, I got an updated status on the shipment and it's already been scanned in the main HUB here in Metro Atlanta. Could it be that it will arrive today??? Here's hoping with fingers and toes crossed!
Anyone know why the Blu-ray 3D CPU usage is so high with the 6000 series? Kinda weird.
Minor driver/software issue... I have a new build already which should fix this.