Post Your Computer Build

Discussion in 'Geek Cave: Computers, Tablets, HT, Phones, Games' started by The Alchemist, Oct 8, 2015.

  1. Walderstorn

    Walderstorn Friend

    Pyrate
    Joined:
    Mar 20, 2016
    Likes Received:
    1,905
    Trophy Points:
    113
    Location:
    Europe
    It's from last year but why not:

    [build photos]
     
  2. FallingObjects

    FallingObjects Pay It Forward

    Pyrate
    Joined:
    Oct 15, 2016
    Likes Received:
    2,235
    Trophy Points:
    93
    @OJneg In that case, you're pretty much gonna be stuck with AMD's processors and X570 motherboards for the moment, as those are the only boards that support the new PCIe 4.0 lanes (which will be the new standard going forward).

    Not that much of anything is currently capable of taking advantage of 4.0, but if you're futureproofing as much as possible, it's nice not to have to swap out the entire motherboard once cards come out that finally surpass what 3.0 has available.
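
    Back-of-the-envelope math on what 4.0 actually buys you, using the published per-lane rates (a rough Python sketch, nothing measured on my end):

    # Approximate usable PCIe bandwidth per direction, assuming the
    # standard line rates and 128b/130b encoding for Gen3/Gen4.
    def pcie_bandwidth_gbs(gen, lanes):
        line_rate_gt = {3: 8.0, 4: 16.0}[gen]       # giga-transfers/s per lane
        encoding = 128 / 130                        # 128b/130b overhead
        return line_rate_gt * encoding * lanes / 8  # bits -> bytes

    for gen in (3, 4):
        print(f"PCIe {gen}.0 x16: ~{pcie_bandwidth_gbs(gen, 16):.1f} GB/s")
    # PCIe 3.0 x16: ~15.8 GB/s
    # PCIe 4.0 x16: ~31.5 GB/s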

    The 3700X is probably fine for a 5-year time horizon. But I'd say a 3900X is worth it considering that this is the last time the AM4 socket will be used by AMD (a future CPU upgrade would require a new motherboard), and it's the best processor currently out for it until the 3950X hits shelves. On the 3950X I have no comment until third-party reviews land; I suspect the extra cores (16, with 32 threads!) will just create waste heat for gaming purposes, even 5 years down the road.

    As @Riotvan pointed out, the CPU matters less if you're not trying to hit 144 Hz. I strongly suspect your 4k monitor is only 60 Hz, so even 5-6 years down the road, your GPU should still be the biggest bottleneck.

    Overclocking RAM matters more than overclocking your processor for the new AMD Zen chips, so if you go the AMD route make sure you're aware of that.
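
    The gist, as I understand it: on Zen 2 the Infinity Fabric clock ideally runs 1:1 with the memory clock, so faster RAM directly speeds up the fabric. Rough sketch of that relationship (illustrative numbers, double-check for your own kit):

    # Rough Zen 2 memory/fabric relationship as I understand it:
    # MCLK is half the DDR4 transfer rate, and FCLK (Infinity Fabric)
    # ideally runs 1:1 with MCLK.
    def zen2_clocks(ddr4_rate):
        mclk = ddr4_rate / 2   # e.g. DDR4-3600 -> 1800 MHz memory clock
        fclk = mclk            # 1:1 ratio is the sweet spot on Zen 2
        return mclk, fclk

    for rate in (2666, 3200, 3600):
        mclk, fclk = zen2_clocks(rate)
        print(f"DDR4-{rate}: MCLK {mclk:.0f} MHz, FCLK {fclk:.0f} MHz (1:1)")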
     
  3. netforce

    netforce MOT: Headphones.com

    Pyrate
    Joined:
    Aug 1, 2016
    Likes Received:
    3,119
    Trophy Points:
    93
    4K gaming hasn't caught on in the same way high refresh rates have. The 2080 should be great; the 2080 Ti will be better, but for more money. If I were stuck choosing between a high refresh rate and a higher resolution, I would go for the higher refresh rate.
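
    For a rough sense of the trade-off, the pixel throughput of the two options ends up in the same ballpark (napkin math for some hypothetical monitor targets, not a benchmark):

    # Pixels per second for a few common monitor targets.
    # Purely napkin math; actual GPU load depends on the game and settings.
    targets = {
        "1080p @ 144 Hz": (1920, 1080, 144),
        "1440p @ 144 Hz": (2560, 1440, 144),
        "4K @ 60 Hz": (3840, 2160, 60),
    }
    for name, (w, h, hz) in targets.items():
        print(f"{name}: {w * h * hz / 1e6:.0f} million pixels/s")
    # 1080p @ 144 Hz: 299 million pixels/s
    # 1440p @ 144 Hz: 531 million pixels/s
    # 4K @ 60 Hz: 498 million pixels/s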
     
  4. BenjaminBore

    BenjaminBore Friend

    Pyrate
    Joined:
    May 23, 2016
    Likes Received:
    2,842
    Trophy Points:
    93
    Location:
    London, UK
    AMD were fibbing. PCIe 4.0 will work on some older-gen boards, presumably because the PCIe controller lives on the CPU package. Historically speaking, the PCIe generation really hasn't had any impact on gaming, at least not for an incredibly long time. My system is still on 2.0.
     
    Last edited: Jul 23, 2019
  5. fraggler

    fraggler A Happy & Busy Life

    Pyrate
    Joined:
    Oct 1, 2015
    Likes Received:
    5,116
    Trophy Points:
    113
    Location:
    Chicago, IL
  6. Thad E Ginathom

    Thad E Ginathom Friend

    Pyrate
    Joined:
    Sep 27, 2015
    Likes Received:
    14,238
    Trophy Points:
    113
    Location:
    India
    My favourite thing about PCs as opposed to laptops... ease of repair/replacement/upgrade.
     
  7. FallingObjects

    FallingObjects Pay It Forward

    Pyrate
    Joined:
    Oct 15, 2016
    Likes Received:
    2,235
    Trophy Points:
    93
    Huh, go figure.

    Maybe invest in a bigass M.2 SSD and pick one of the cheaper X470 boards with confirmed 4.0 unlocks through BIOS updates then, haha.
     
  8. Syzygy

    Syzygy Friend

    Pyrate
    Joined:
    Jun 13, 2018
    Likes Received:
    2,144
    Trophy Points:
    93
    Location:
    DFW, Texas
    Also, are you really gonna game in 4k? Is there a reason to do that, as opposed to 1080p (even on a 4k monitor)?

    I don't have a dog in the hunt, but I've just seen a couple of YouTube videos questioning why.



    Edit: Uh, ok possibly already covered. Missed the whole new page of posts before the reply.
     
  9. OJneg

    OJneg The Most Insufferable

    Pyrate BWC
    Joined:
    Sep 25, 2015
    Likes Received:
    3,923
    Trophy Points:
    113
    Location:
    Grand Rapids, MI
    I was unaware that anyone doubted the advantages of 4k (2160p) vs 1080p, and I'm not familiar with the exact argument for why it would be overkill for gaming. My only experience with 4k gaming is playing Civ VI at the moment, so I'm curious to watch that video and read a bit more.

    But given that this is the same doofus who listens to an M50 + O2 amp and calls it indistinguishable from high-end audio equipment, I get the feeling I know what's going to be said.
     
  10. Riotvan

    Riotvan Snoofer in the Woofer

    Pyrate
    Joined:
    Sep 27, 2015
    Likes Received:
    4,198
    Trophy Points:
    113
    Location:
    The Netherlands
    f**k him, I've been gaming at 4k since 2015. You might need to play with settings and disable AA, which I've seen no need for anyway. Not on my 32" at least.

    As I said before, don't buy an 8GB card. Even the 2080 Ti is a bit iffy on future support due to its 11GB of VRAM; that's how Nvidia plans future obsolescence. They f**k you on VRAM.

    I have a Radeon VII, and with its 16GB I have come close to the VRAM limit on some titles. I happened to get a good sample and put a better cooler on it, and it overclocks and undervolts like a champ. The stock cooler is too noisy though; AMD reference designs usually have shit coolers. But for the price I paid I'm happy with it. The other reason I went for AMD is Linux support: sure, Nvidia works, but this card is plug and play. Also f**k Nvidia.
     
  11. BenjaminBore

    BenjaminBore Friend

    Pyrate
    Joined:
    May 23, 2016
    Likes Received:
    2,842
    Trophy Points:
    93
    Location:
    London, UK
    It's a good point, and something that benchmark-centric reviews don't really account for. Extra VRAM, assuming enough performance to make use of it, can aid useful longevity, particularly for open-world games. Though there's also the question of whether all that VRAM is being actively used and really of benefit even if it shows as filled, especially with target specs etc. That AMD card does use HBM2 though, which has shown some small advantage in specific scenarios when running ultra-high resolutions, due to its enormous bandwidth.
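
    For reference, the bandwidth gap is easy to ballpark from the spec-sheet bus widths and memory speeds (nothing measured, just the published figures):

    # Ballpark memory bandwidth from bus width and per-pin data rate
    # (spec-sheet figures for the two cards being discussed).
    def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
        return bus_width_bits * data_rate_gbps / 8  # bits -> bytes

    print(f"Radeon VII (HBM2, 4096-bit @ 2.0 Gbps): ~{mem_bandwidth_gbs(4096, 2.0):.0f} GB/s")
    print(f"RTX 2080 Ti (GDDR6, 352-bit @ 14 Gbps): ~{mem_bandwidth_gbs(352, 14.0):.0f} GB/s")
    # ~1024 GB/s vs ~616 GB/s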

    @Riotvan How quirky have you found the drivers? AMD stuff always seems more prone to issues, requiring higher CPU overhead in DX11 games. They also don't support their cards for as long as NVIDIA does.
     
    Last edited: Jul 24, 2019
  12. OJneg

    OJneg The Most Insufferable

    Pyrate BWC
    Joined:
    Sep 25, 2015
    Likes Received:
    3,923
    Trophy Points:
    113
    Location:
    Grand Rapids, MI
    I'm brand agnostic at this point but I've always used AMD GPUs and never had problems with AMD drivers, except for a few of the Total War games which I could never get to work.
     
  13. Riotvan

    Riotvan Snoofer in the Woofer

    Pyrate
    Joined:
    Sep 27, 2015
    Likes Received:
    4,198
    Trophy Points:
    113
    Location:
    The Netherlands
    I have a problem with OC profiles resetting at startup, but the rest of my system is due for an upgrade and a reinstall anyway.
    Other than that, games usually work fine, though I haven't been trying the latest games or drivers lately because it's been too f'ing hot for gaming atm.
    I'm waiting on some titles, and I have a backlog I'll get back to after summer.

    As for driver support after a while, I can't really comment; my last AMD card was an R9 290X and I sold it after 2 years, I think. But I think they get a bad rep because of the past; their control panel is much better than that ancient Nvidia shit.
     
  14. Lurker

    Lurker Facebook Friend

    Joined:
    Oct 10, 2015
    Likes Received:
    132
    Trophy Points:
    33
    Hey gamers. Recently decided to sell my HD 800 to build a new pee cee because my old one (FX-6300, GTX 660 Ti, ...) from 2013 started freezing whenever the CPU went above half its clock speed...
    Here's the new one:
    [build photo]

    Specs:
    • CPU: Ryzen 5 2600
    • GPU: Sapphire RX Vega 56 Pulse
    • Mainboard: Gigabyte B450 AORUS M
    • RAM: 16GB Corsair Vengeance LPX DDR4-2666 CL16
    • Drives: 500GB Crucial P1 NVMe as the boot drive and a 1TB SanDisk Plus SATA SSD as a data grave
    • Case: CM MasterBox Q300L
    • Power Supply: 500W be quiet! Pure Power 11 CM
    • Monitors: Dell 2408WFPb, Fujitsu P24W-7 LED
    Buying the hardware right before the Ryzen 3000/Navi launch wasn't actually too stupid, because since then the prices for the Vega 56 and the 2600 haven't changed much, if at all.
    For 125€ the 2600 is still plenty fast, and the Vega at 250€ is a really good deal as well, considering that decent GTX 1660 Tis still cost around 280€.
    Although I did run into some issues with the Pulse, mainly the fans. At least in my case it can't always keep the temps under 55°C, which makes the 0 RPM mode useless, and AMD's driver doesn't allow fan speeds below 29%, which is very audible. So I'm controlling the fans with Afterburner to hold them at a constant 14% when under 50°C. It's another application running in the background, but at least the card stays silent when the load is insignificant.
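
    The curve I set in Afterburner is basically just this shape; the 14% floor under 50°C is the part I actually use, and the higher-temperature points are placeholder values for illustration:

    # Rough shape of my custom fan curve (the <50 C floor is what I run;
    # the values above that are illustrative placeholders, tune per card/case).
    def fan_percent(gpu_temp_c):
        if gpu_temp_c < 50:
            return 14.0    # near-silent floor instead of the driver's 29% minimum
        elif gpu_temp_c < 70:
            # ramp linearly from 14% at 50 C to 60% at 70 C
            return 14.0 + (gpu_temp_c - 50) * (60.0 - 14.0) / 20
        else:
            return 80.0    # heavy-load fallback

    for t in (40, 55, 65, 75):
        print(f"{t} C -> {fan_percent(t):.0f}% fan")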

    The main reason I chose that specific mainboard was that the top slot is x16.
    On a lot of µATX boards the top slot position is taken by the M.2 slot, which is a problem with big 2.5-slot graphics cards like the Pulse (the card ends up choked against the PSU and blocks the remaining expansion slots).
    This board has room for the Pulse and for something like a Wi-Fi card in the bottom x16 slot.

    Glad I invested in the SSDs; the system feels really snappy and it eliminates another noise source.
    Also a lot of be quiet! stuff, because here in Germany their products are quite reasonably priced.
    The case is cute, but the quality is very meh.

    Entire system cost was around 800€.
     
  15. Thad E Ginathom

    Thad E Ginathom Friend

    Pyrate
    Joined:
    Sep 27, 2015
    Likes Received:
    14,238
    Trophy Points:
    113
    Location:
    India
    That's changed now, hasn't it?

    I have a bottom-of-the-line Nvidia card, and I use Nvidia's own drivers, not the open-source alternative.

    By the way... No my decade-old rig does not have 4k, and neither does my monitor. And no, I don't need it. But would I like to have it? Oh, I'm sure I would. :cool:

    And if I had a monitor that supported it, I'd feel short-changed with hardware that didn't.
     
  16. Riotvan

    Riotvan Snoofer in the Woofer

    Pyrate
    Joined:
    Sep 27, 2015
    Likes Received:
    4,198
    Trophy Points:
    113
    Location:
    The Netherlands
    Yeah, the closed-source drivers work, but they can be a hassle with kernel upgrades. The reason the open-source Nvidia drivers are shit is that Nvidia refuses to provide any help from their end. That, and their business practices in general leave a bad taste. They are pretty much anti open source: everything proprietary and locked down.
     
  17. OJneg

    OJneg The Most Insufferable

    Pyrate BWC
    Joined:
    Sep 25, 2015
    Likes Received:
    3,923
    Trophy Points:
    113
    Location:
    Grand Rapids, MI
    Bad trade IMO. HD800 will likely still be Plankton King in a decade. Pee Cee for gayming will be recycling material!
     
  18. Lurker

    Lurker Facebook Friend

    Joined:
    Oct 10, 2015
    Likes Received:
    132
    Trophy Points:
    33
    The 800 was nice, but towards the end I was babying them a lot and barely used them. Unlike my beater 650, which I still have.
    I spend a lot of time in front of the screen, and an FX-6300 at 1.7 GHz with a packed HDD is not at all fun to use.
    Also, I can always buy another 800. Used prices right now are hilarious, with the average being 550€, and it's not looking like prices are going up.
     
  19. Thad E Ginathom

    Thad E Ginathom Friend

    Pyrate
    Joined:
    Sep 27, 2015
    Likes Received:
    14,238
    Trophy Points:
    113
    Location:
    India
    OK, so still f**k you Nvidia, then.

    Thanks for the update.
     
  20. HeadFoneDude64

    HeadFoneDude64 Facebook Friend

    Joined:
    Jul 25, 2016
    Likes Received:
    192
    Trophy Points:
    43
    My 2x Gigabyte Vega 64 and PowerColor Vega 64 Red Devil say 'Hi!'.....and they agree wholeheartedly!
     
