
AMD Radeon R9 Fury X Review

AMD Radeon R9 Fury X Review – Packaging and the Fiji GPU

AMD Radeon R9 Fury X Review – The Fury X

As far as the key specs go, this is a DirectX 12 compatible 28nm GPU with 4096 stream processors, 64 compute units and 256 texture units. There are 64 ROPs, and the memory interface is listed as 4096-bit. With a core speed of 1050MHz and 4GB of memory at 500MHz, that gives us a bandwidth of 512GB/s.
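Those headline numbers can be sanity-checked from the raw specs. Below is a minimal sketch (Python), assuming HBM transfers data on both clock edges and that each GCN stream processor retires 2 FLOPs per clock via fused multiply-add; both are standard assumptions rather than anything confirmed in this review:

```python
# Back-of-envelope check of the Fury X headline figures.

bus_width_bits = 4096      # HBM memory interface
mem_clock_hz = 500e6       # 500MHz memory clock
core_clock_hz = 1050e6     # 1050MHz core clock
stream_processors = 4096

# Bandwidth = memory clock x 2 (double data rate) x bus width in bytes
bandwidth_gbs = mem_clock_hz * 2 * (bus_width_bits / 8) / 1e9
print(f"Memory bandwidth: {bandwidth_gbs:.0f} GB/s")   # -> 512 GB/s

# Peak FP32 = stream processors x 2 FLOPs (FMA) x core clock
tflops = stream_processors * 2 * core_clock_hz / 1e12
print(f"Peak single precision: {tflops:.1f} TFLOPS")   # -> 8.6 TFLOPS
```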

[Image: GPU-Z screenshot of the Fury X]

AMD Radeon R9 Fury X Review – Performance

Testing was performed on an Intel Core i7-5960X running on an X99 board with 16GB of DDR4 and a Samsung 850 Pro SSD. Windows 8.1 was the OS, and the OS and all games were fully patched.

All testing was performed on a BenQ BL3201PT 4K display.

[Image: BenQ BL3201PT display]

NVIDIA Driver: 353.30
AMD Driver: 15.15.1004 Beta

[Chart: 3DMark results]

[Charts: temperatures and power consumption]

[Charts: Battlefield Hardline, Project Cars, Dota 2, GTA V, The Witcher 3 and Total War: Attila at 4K and 1440p]

[Chart: Batman: Arkham Knight at 4K]

GPU Compute (SiSoft Sandra):

[Chart: GPU compute results]

Overclocking:

[Chart: overclocking results]

AMD Radeon R9 Fury X Review – Conclusion

So, that's Fiji… or more specifically, Fury X. Starting with the design of the product, we love it. The aesthetics are great, with the mix of LEDs, metal and soft-touch surfaces. We would have liked to see the soft touch repeated on the back of the card, but that is a really minor issue. The size of the card is fantastic too; at just 19.4cm it is the smallest high-end GPU we have ever tested and significantly smaller than the likes of the GTX 980 Ti.

As far as outputs go, some may be disappointed that there is no DVI connector; however, adapters are available, so it's not a huge issue. We also have no problem with the X variant of the card using an external fan/radiator… the target market won't have an issue with that, and little touches like the nicely integrated wiring for the fan, along with braided cables, help keep the overall product quality at a high level.

When it comes to cooling and noise levels, the card is very competitive on the sound front with current high-end cards using low-noise custom coolers. On thermals, it excels. Our sample ran at 28°C when idle and peaked at 52°C when gaming. That is fantastic for a high-end GPU running at full speed… of course, with FRTC (Frame Rate Target Control) we could look to reduce this further, but few will need to. Power use is in line with other high-end cards when gaming too. A sustained 100% GPU load could vary that a little, but for real-world gaming the Fury X just edged out the GTX 980 Ti.

Where things do get a bit more difficult for the Fury X is framerates. There are a couple of ways you can look at these… the first is that generally, in the games we tested, the GTX 980 Ti is faster; in some cases, like Project Cars or Arkham Knight, significantly so, while in others, such as GTA V, it's very close between the two cards. The flip side is that the Fury X is a significant leap in performance over the last-generation 290X, so a nice upgrade for AMD users. Additionally the framerates, while not the fastest, allow us to game in pretty much every recent title at 4K with high detail levels and smooth performance.

So that brings us to pricing. In advance of launch, AMD tell us that the card will retail for £509.99 in the UK, which compares with the 980 Ti at £539.99. On a mainstream card £30 would be a big chunk, but it becomes less significant at £500+. NVIDIA's bundle of the moment is Arkham Knight; AMD are offering Dirt Rally.

Summary: If retailers stick to the £509 price point, or even manage to hit £499, then the Fury X offers a competitive alternative to the GTX 980 Ti. The competition does offer some higher framerates, but the Fury X is more than just raw numbers… it's a really lovely-looking card in an unheard-of form factor for a high-end GPU, and it runs quiet and at the lowest temperatures you will find for a card in its class. A very well balanced product indeed.

Gold Award

AMD Radeon R9 Fury X Review – Screenshot Gallery

[Gallery: AMD Radeon R9 Fury X 4K screenshots]

About Author

Stuart Davidson

Stuart Davidson is Senior Editor at HardwareHeaven having joined the site in 2002.

24 Comments

  1. IvanV

    Okay, but I think that the new features deserved a spot in the review, as in at least a quick test of VSR's effects and performance costs. Also, the cooler probably deserved some more space. Its thermal capacity is nearly twice what the chip dissipates, AMD gave the card more PSU connectors than necessary, the reviewer takes note of that and doesn't perform a single OC test? Come on guys, don't just tease us like that! 🙂 Or is that bit still under NDA?!

  2. Jac

    I don't think it looks well built; it looks homemade to me. Plus it's slower than the 980 Ti, costs slightly less but uses more power. This isn't the card that is going to turn around AMD's fortunes.

    • IvanV

      The Radeon uses 6W more when idle, but 5W less under load. My guess is that the increased idle consumption comes from the water pump. I'm not sure if AMD partners will be allowed to ship this card with a traditional cooler, but if they do, the numbers will shift in AMD's favour (it's certainly doable; it doesn't dissipate more heat than the previous generation, and there were some fairly cool-running non-reference coolers for those cards).

  3. Hmmm. I hope AMD can optimize those framerates with their drivers, because they are trailing behind in too many games. Disappointing.

  4. Jac….. how does it use more power?

    There are some clear and obvious flaws that shouldn’t have been left out:

    1: The card lacks an HDMI 2.0 port, and this is a big issue… BIG… bigger than I think AMD realizes, and one a number of reviewers don't seem to even remotely want to touch on, which is simply, IMO, bad form. Considering the touting of 4K support, something greatly desired by a number of people looking at high-end cards for their refresh builds, this card falls massively short, as a great many of them… I'd even go out on a limb and say a solid 30%, or more likely MORE than 30%… are looking to hook their PCs up to their TVs. The HTPC market is bustling, and there are plenty of people out there wanting 4K support out of the box; not having HDMI 2.0 compliance instantly kills that, because among all the 4K TVs there is one bloody TV I'm aware of, from Panasonic, that has DisplayPort 1.2, and the rest require HDMI 2.0 (see the bandwidth sketch after this list). So unless AMD has some push up their sleeve to get TV manufacturers shipping 4K TVs with at least DisplayPort 1.2, the moment a pile of potential buyers see HDMI 1.4 listed… they'll just move along back to NVIDIA, who at least provide such functionality out of the box.

    2: DisplayPort… I'm actually excited to see the removal of DVI… super thrilled, and no, that's not sarcasm. 3x full-sized DisplayPorts… fantastic. However, with AMD having been directly tied to the DisplayPort standard… I don't understand how it's version 1.2, when 1.2a was finalized what, two years ago, and DP 1.3 has also been standardized as well. So this begs the question: why?

    The other question is… if DisplayPort 1.2 is supported, is HDCP 2.2 also tied into it this round? If so… does AMD or any manufacturer have a plan, or the capability, to provide a DP to HDMI 2.0 adapter? Were there any DP to DVI adapters provided with the card? (I suspect not, since it wasn't mentioned beyond being able to use adapters.)

    3: Adaptive-Sync/FreeSync… with this, IMO, better alternative to G-Sync recently hitting the stores in monitor form (the cards having supported it for a while), I would have thought this would be an opportune time to refresh on it, but perhaps not. There are a LOT of people that are completely unaware of what this feature is or what it does, and just mentioning it doesn't do justice to what it is really capable of. Having personally sat down and played a number of games on a 120 or 144Hz display with powerful cards capable of hitting framerates that high (though with modern games that's actually difficult), and then sat down with a 60Hz panel with FreeSync enabled… it's like night and day, and I would definitely take a FreeSync display over any high-refresh model on the planet. Truly, the experience is really just that damn awesome. No matter how high a framerate you get on a non-FreeSync display… it just won't have the pure fluidity that FreeSync displays have, even at 60Hz. No contest… I really can't say enough about it, and no one can explain it accurately enough to give a good idea of what it's like, short of seeing it in person, which the vast majority of users never will. Even after suggesting such a display, there are still tons of people who think there is no way a 60Hz FreeSync display can look/play smoother than a high-Hz display, but well, sorry… they do.

    4: Many of us know that reviews of just-launched products, especially direct from AMD without any 3rd-party cards to test, usually have a very, VERY limited time to be done. Additionally, we should all know that the initial package is a reference design, usually slower and less "efficient" than the later 3rd-party iterations that will come. Granted, one wonders whether any 3rd party will be able to make a cooler-running, faster card at all, since the memory now being stacked onto the GPU itself leaves very little room for changes that could bring improvements, but we'll have to see.

    5: OpenCL and GPU computational power… I'm going to go out on a limb and say this is likely down to my #4 point above… lack of time? This seems like a fairly serious thing to neglect or not include. Among the things AMD truly excels at is compute power; even their older, hotter, more power-hungry GPUs would usually significantly outperform NVIDIA not only on calculation speed but on power consumption and heat, while NVIDIA suffers significantly bigger power-draw increases and heat production when computing in comparison. Considering the TFLOP rating of the card, it should do extremely well; it would have been nice to see.

    6: I hate to go back to the video output options, but considering the grand size and capability of the card, not to mention the amount of unused space for additional connectors, what's stopping AMD from simply giving us, say, 6 DisplayPorts and 2 HDMI ports without the need for an external hub?
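    As a rough illustration of the HDMI 2.0 point in #1 above, here is a back-of-envelope pixel-clock sketch (Python). It assumes the standard CTA-861 4K timing (4400×2250 total pixels including blanking) and 8-bit RGB, and uses the published link limits: roughly 10.2Gbps of TMDS bandwidth for HDMI 1.4 versus 18Gbps for HDMI 2.0 and 21.6Gbps for DisplayPort 1.2:

    ```python
    # Why HDMI 1.4 can't drive a 4K TV at 60Hz: rough TMDS bit-rate check.
    # Assumes CTA-861 4K timing (4400x2250 total pixels) and 8bpc RGB.

    total_w, total_h = 4400, 2250   # 3840x2160 active + blanking
    bits_per_pixel = 24             # 8 bits per channel, RGB

    def tmds_gbps(refresh_hz):
        """Raw TMDS bit rate including 8b/10b encoding overhead (x10/8)."""
        pixel_clock = total_w * total_h * refresh_hz
        return pixel_clock * bits_per_pixel * 10 / 8 / 1e9

    print(f"4K30: {tmds_gbps(30):.1f} Gbps")  # ~8.9  -> fits HDMI 1.4's 10.2
    print(f"4K60: {tmds_gbps(60):.1f} Gbps")  # ~17.8 -> needs HDMI 2.0 / DP 1.2
    ```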

    • Hey Judas, love the enthusiasm as always.

      Picking up on a couple of points. Compute… I used to go out of my way to add those results in at launch, but no-one seemed to care about them, so when we need to prioritise, that's one thing that goes. It was Batman or compute, and that wasn't a hard decision (in fact we only just got the OC numbers in at the very last second). I'm OK with the number of DisplayPorts; given the performance with one 4K display, I can't see many people connecting several unless it's for niche uses, and those consumers likely won't mind a hub too much.

      Where I am definitely in agreement with you is HDMI 2.0. I would very much like to have seen that, and I do hope "custom" Fury cards can find a way to bolt it on, even at an extra cost, because I would imagine the few that need it would choose a brand that offers it over one that does not. (My gut feeling on why it's not included is AMD trying to remain as competitive as possible on pricing; 2.0, and DP 1.3 etc., would have added development and manufacturing costs… not a lot, but they all count.)

      • Nice to see some compute scores… interesting to see how well the NVIDIA 980 Ti does in single-float shaders, just edging out the AMD card… I'm guessing the slightly lower clocks are the root of that. But the double and quad-float results… those are pretty amazing numbers in comparison, especially the quad, I mean that's completely destroying it. Then again, synthetic tests usually do show some pretty impressive numbers. Now it's just a question of how those translate to real-world tasks that use the GPU, such as encoding, folding, Photoshop/Illustrator or countless other heavily compute-related workloads.

        Guess we'll find out when they give you more time, or when the 3rd parties start launching them en masse, giving you significantly more time to properly sit down and not have to rush through it all.

        As for the multiple outputs… as I said, I'm ecstatic about what they've finally done… I was just mostly saying that for a high-end GPU it seems like something they could add relatively easily. Granted, it's definitely a niche thing, yes, but it would make a flagship more, how should I say, eye-catching, while filling the needs of that niche market. I also know those hubs can be a fairly big pain sometimes; just one more thing to go wrong versus connecting everything directly.

        I hope 3rd parties can manage to figure out how to bolt it all together and make HDMI 2.0 compliance fully work without adapters or other nonsense… I just find it wildly insane to have such a 4K-touted card, with the specs it has, slapped with HDMI 1.4a. It seems like an insane oversight… I can't see it requiring all that much more investment, especially when it's going to HAVE to be done at some point, otherwise they WILL become obsolete.

  5. Hey all, I've thrown in a GPU compute comparison for you… nothing fancy, just a nice simple run within SiSoft Sandra.

  6. Nick

    WHAT GOLD AWARD??? According to your own tests the 980 Ti is still faster in most games. I do not think that the Fury deserves a gold award for being a slower solution.

    • As the review states, there is more to a card than just being the fastest. The rating is for the overall product, not just FPS… add to that we have an award level above gold, and really there isn't an issue. Will the custom 980 Tis I'm benching currently get that higher award? Time will tell…

  7. WoW! Significantly slower in a number of tests and never faster. Water cooling? Seriously? The review bent over backwards to be kind in its summary, but this is a pretty big disappointment.

    • What's wrong with watercooling? As for bending over backwards, hardly; just stating the facts. A) GeForce is faster. B) Radeon is slower. C) Power is the same. D) Temps and size are lower/smaller on the Radeon. E) Both can play at 4K. Choose which suits your needs best, because high-end GPUs are more than just FPS.

      Seems pretty balanced to me.

  8. I have to say I seriously expected more from big red. The way AMD talked about this card 6 months ago made me think it was going to blow the doors off anything out there that Nvidia had. I guess I am just a sucker; I thought the same thing about Bulldozer CPUs, Piledriver CPUs, APUs… man. And I have that same disappointed sucker feeling about this card and the 390X. They're good, but not fantastic. I want AMD fantastic!!! AMD has a track record of disappointing their loyal fans with just "good" or "ok" products. They've got to produce something incredibly new and innovative (which they may do on occasion) but FAST and POWERFUL as well. Let's not even go down the CPU rabbit hole; Intel has such a dominant position over AMD it's ridiculous at this point. And guess what, I was a hard-core AMD fanboy back in the day. I made the switch to Intel at Sandy Bridge, and have actually used about an equal number of cards from ATI/Nvidia over the past ten years or so.

    Here are my thoughts on the water cooling of the card: not really an issue for me or most folks. You can get a CLC on a 980 Ti if you want, but air cooling is available as well; it's not a necessity there. I have a feeling that on this card it was a necessity, or it would go up in flames. The only real problem I see is for people with smaller cases (who would normally be super excited about this smaller form-factor card) but may not be able to fit the radiator/fan inside, and CrossFire would be a mess with multiple CLCs in your case. And "only" 4GB of VRAM? Really? Even the rebadged 390X has 8GB. I know, I know, HBM limitations, but promoting this card as a 4K card when it only has 4GB? Not good. More fuel for the flame wars from Nvidia.

    At this point, if I were looking for a top-tier card I would have no hesitation going with a GTX 980 Ti, probably the MSI Gaming version with 6GB of VRAM.

  9. I'd like to see a screenshot of Firestrike Extreme, to see what settings were used.

  10. In the last graphic, re overclocking: the overclocked Fury X is 10MHz lower in clock than the stock card, yet has the higher figure in Firestrike Ultra. Is this an error?