Thoughts about new expandable build.

Discussion in 'Hardware Discussion & Support' started by Liqourice, Feb 16, 2018.

  1. Liqourice

    Liqourice Well-Known Member

    Joined:
    Nov 16, 2005
    Messages:
    3,837
    Likes Received:
    339
    Trophy Points:
    93
    After the release of the new AMD APUs I started thinking about a way to build a new rig, initially saving a lot of money and then expanding as I can afford to.

    The new Ryzen 5 2400G with Vega 11 graphics is very capable on the CPU side, at least compared to my current Phenom II, which is nearing 10 years old. In some benchmarks it's more than double the speed.

    Also, the Vega 11 doesn't seem to be all that much slower than my old HD 5870, at least in some operations. In some it's even a bit faster.

    So, to get going, I thought about getting a decent, expandable motherboard, the 2400G and some good RAM, and using my old case and PSU to begin with. I might even put the 5870 in if the built-in GPU doesn't do well enough.

    Then, when I can afford to, I'll get a new graphics card, either a 580 or a newer, better AMD card if it doesn't cost that much more, and eventually I'll upgrade to a faster standalone Ryzen CPU.

    In the long run it'll probably cost me a bit more in total, but I'm really starting to feel the need for a new rig because this old one is struggling with some newer games.

    Do you guys think this is a good idea?
     
  2. Judas

    Judas Obvious Closet Brony Pony

    Joined:
    May 13, 2002
    Messages:
    38,071
    Likes Received:
    808
    Trophy Points:
    138
    Having built a 2400G system already, I can certainly tell you that it's a beast. We should be able to expect improvements down the road as well.

    Initially it was thought that the HDMI ports on the motherboards would still be hard-limited to HDMI 1.4, but as more and more people adopt the Ryzen APUs, it appears this isn't the case. As I had hoped, HDMI 2.0 functionality (even partial HDMI 2.1) works, FreeSync included.

    The 2400G combined with 3200 MHz low-latency DDR4 RAM (ideally 2x dual-rank sticks in dual channel for the most effective graphics performance) performs as well as, if not better than, an Nvidia GT 1030, and in plenty of cases as well as an RX 550. That's sufficient for many games at 720p with relatively high quality, or at 1080p on high or lower settings depending on the graphical demands. There are even a few cases in which 1440p and 4K are playable, though many people target 50-60 fps minimums, so they tend to use built-in scaling to get better performance.
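    To illustrate why dual channel matters so much here: an APU's integrated GPU shares system RAM, so its graphics performance is often limited by memory bandwidth. A rough back-of-the-envelope sketch of the peak theoretical DDR4 numbers (the function name and interface are mine for illustration; real-world throughput is lower due to overhead):

    ```python
    def ddr_bandwidth_gb_s(transfer_rate_mt_s: int, channels: int,
                           bus_width_bits: int = 64) -> float:
        """Peak theoretical DDR bandwidth in GB/s.

        DDR transfers data on both clock edges, so the MT/s rating
        already counts transfers per second; each 64-bit channel
        moves 8 bytes per transfer.
        """
        bytes_per_transfer = bus_width_bits // 8  # 64-bit channel -> 8 bytes
        return transfer_rate_mt_s * 1e6 * bytes_per_transfer * channels / 1e9

    # DDR4-3200 in single vs. dual channel:
    print(ddr_bandwidth_gb_s(3200, channels=1))  # 25.6 GB/s
    print(ddr_bandwidth_gb_s(3200, channels=2))  # 51.2 GB/s
    ```

    Doubling the channels doubles the bandwidth available to the Vega 11, which is why a single stick can noticeably hold back integrated graphics even when the CPU side barely notices.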

    I know a game like Overwatch at 1080p with relatively good quality settings performs VERY well on the 2400G.

    I've several more machines to build, but I'll be waiting for the 400-series boards to arrive, as that's where we'll see significant improvements in board options and designs that make far greater sense.


    Nearly all my upcoming business/home machines are almost exclusively going to be the Ryzen 4-core/8-thread solution with the Vega 11 GPU. The upgrade options available down the road will be plenty for games if those users venture in that direction.

    If I were building a gaming machine, however, the Ryzen 1600, or rather the upcoming 2600, would be a minimum. The number of games really pouring on the multi-core/multi-thread requirements is expanding VERY quickly now that AMD has massively changed the computing world, forcing Intel's hand to drive core/thread counts up and letting developers jump on it now that high core counts have become mainstream. Of course, this isn't viable for those on a tight budget.

    I'm looking forward to Ryzen 2 (not to be confused with the Ryzen 2000-series CPUs/APUs, which are technically Ryzen+). More and more information keeps pointing toward a 6-core CCX, which would mean 12-core Ryzen 7s, and 6-core APUs with potentially a 15-CU Vega, putting them in the realm of being as fast as, if not faster than, a 1050/1050 Ti. We aren't likely to see that happen until 2019 though.
     
    Last edited: Feb 17, 2018
  3. Liqourice

    Liqourice Well-Known Member

    Joined:
    Nov 16, 2005
    Messages:
    3,837
    Likes Received:
    339
    Trophy Points:
    93
    My biggest concern here is really which motherboard to choose to be as future-proof as possible while still not spending all too much money.
     
  4. Judas

    Judas Obvious Closet Brony Pony

    Joined:
    May 13, 2002
    Messages:
    38,071
    Likes Received:
    808
    Trophy Points:
    138
    Again, if you can wait, then wait for the ideal boards to arrive, with better quality and options than the current 300-series...
     
  5. Liqourice

    Liqourice Well-Known Member

    Joined:
    Nov 16, 2005
    Messages:
    3,837
    Likes Received:
    339
    Trophy Points:
    93
    Oh, I have to wait anyway; I need to find a new job first. I'm quite sure I want a motherboard that will fully support Ryzen 2, and there are none out there yet. Also, since I'll be using the 2400G to begin with and my monitor only has DVI, I need a motherboard that supports Raven Ridge and has a DVI port. No such motherboard is out yet.
     
  6. Liqourice

    Liqourice Well-Known Member

    Joined:
    Nov 16, 2005
    Messages:
    3,837
    Likes Received:
    339
    Trophy Points:
    93
    Of course I could use an HDMI-to-DVI adapter, but adapters tend to decrease the quality a bit, so I'd prefer to avoid that.
     
  7. Judas

    Judas Obvious Closet Brony Pony

    Joined:
    May 13, 2002
    Messages:
    38,071
    Likes Received:
    808
    Trophy Points:
    138
    HDMI to DVI decreases the quality...

    come again?

    HDMI carries exactly the same digital data as DVI. (In fact, HDMI can provide better quality; DVI is limited because its aging standard is sliding into obsolescence.)

    If you had said something about analog, I would have made an exception, but even then, HDMI/DVI and even DP to analog VGA is unchanged for the most part. The only exception is converting from VGA to some other analog standard.

    Using HDMI-to-DVI cables doesn't actually adapt or change anything; the port provides a universal digital standard and can feed whatever the monitor desires. DP can even do this passively, provided a free clock generator is available on the card; an active adapter is only required when those clock generators have been exhausted by already supplying other displays.

    You'll find it rather hard to get a motherboard or even newer graphics cards with DVI ports. They are being phased out quickly, and why not, when HDMI provides the necessary backward compatibility for the vast majority. As VR continues to grow, the need for 2x HDMI ports is overtaking the one DVI port that takes up the space two HDMI ports could use.
     
  8. Liqourice

    Liqourice Well-Known Member

    Joined:
    Nov 16, 2005
    Messages:
    3,837
    Likes Received:
    339
    Trophy Points:
    93
    Just reflections from old experience. I do know that digital signals aren't affected in the same way as analog; I've just had bad experiences with adapters in general. Though I guess I won't have much choice in this matter anyway. Buying a new monitor isn't an option atm.
     
  9. Judas

    Judas Obvious Closet Brony Pony

    Joined:
    May 13, 2002
    Messages:
    38,071
    Likes Received:
    808
    Trophy Points:
    138
    I just don't think you'll have a problem. Even with old DVI displays, things have come a LONG way in terms of display drivers and HDMI-to-DVI connectivity.

    At a few businesses I actually run 50 or 100 ft HDMI cables to displays that only accept DVI or VGA, using just a generic adapter like the DVI-to-HDMI ones provided in some older AMD/Nvidia retail boxes; you can use the same adapter to go from HDMI to DVI to attach to monitors. These displays have been running for years and years now without failure, and they're probably the worst potential circumstances, mostly due to cable length, but also because of the generic LCD monitors they insist on recycling for these purposes. With DVI often being the only port on those, and most everything including laptops using HDMI output only (many no longer include a VGA port), I've moved quite a few short and long HDMI-to-DVI adapters and cables. Most prefer a cable with a female DVI end on one side and a male HDMI on the other; no additional adapters needed, and it's seamless.
     
  10. Liqourice

    Liqourice Well-Known Member

    Joined:
    Nov 16, 2005
    Messages:
    3,837
    Likes Received:
    339
    Trophy Points:
    93
    One question I can't quite find reliable information about: will my old Corsair H50 fit on a Ryzen, and are the mountings on the motherboard compatible?

    I'd prefer to reuse the H50 instead of the stock cooler that comes with the APU, if possible.
     
  11. Judas

    Judas Obvious Closet Brony Pony

    Joined:
    May 13, 2002
    Messages:
    38,071
    Likes Received:
    808
    Trophy Points:
    138
    I couldn't say for certain. However, most people seem to have good luck getting a low-cost or even free bracket adapter for AM4/TR4 socket boards, in the event they need one, by calling Corsair's local support department and inquiring.
     
