Post Your Computer Build

Discussion in 'Geek Cave: Computers, Tablets, HT, Phones, Games' started by The Alchemist, Oct 8, 2015.

  1. Dash

    Dash Friend

    Pyrate
    Joined:
    Nov 7, 2015
    Likes Received:
    279
    Trophy Points:
    63
    Location:
    Florida
    I would personally wait a bit to see how AMD fleshes out the Navi lineup. I would think they could increase the compute units from the measly 40 in the current 5700XT and really challenge Nvidia on the upper end.
     
  2. zerodeefex

    zerodeefex SBAF's Imelda Marcos

    Staff Member Pyrate BWC
    Joined:
    Sep 25, 2015
    Likes Received:
    14,051
    Trophy Points:
    113
    You can always speculate and wait. Or you can get a 2080ti and know you have the single best card you can get today.
     
  3. Syzygy

    Syzygy Friend

    Pyrate
    Joined:
    Jun 13, 2018
    Likes Received:
    2,144
    Trophy Points:
    93
    Location:
    DFW, Texas
    Just "upgraded" my 3.2GHz Skylake i5-6500 (4 cores; 4 threads total), 24GB RAM, integrated HD Graphics 530 Hackintosh for a used 2010 Mac Pro dual 2.4GHz Xeon 4-core (8 cores; 16 threads total), 32GB RAM, and Sapphire RX590 Nitro+. The same Sabrent NVMe as above, but 512GB. Total cost ~ $650.

    I don't do much gaming, but I edit photos and compile software, so the extra hardware threads are awesome for my workload. That's why I chose a video card a generation removed from current…it does great with editing photos on a 4k monitor on Mojave.

    Now the old box is free to become the FreeNAS server I've been needing at home.

    Those Ryzen 3's look awesome though. If either (1) Apple supported them or (2) Capture One were available for Linux, I'd have gone that way.

    But honestly most of my workflow is the computer waiting for me to type something in a code editor, then compiling the changes. Computers (CPUs) have been fast enough for me for at least the last 5 years. I'm happy that AMD has finally busted the artificial core/thread limits that Intel has been putting on their CPUs for the last 8 or so years.
     
    Last edited: Jul 22, 2019
  4. FallingObjects

    FallingObjects Pay It Forward

    Pyrate
    Joined:
    Oct 15, 2016
    Likes Received:
    2,235
    Trophy Points:
    93
    @OJneg I'll offer a dissenting opinion and recommend holding out at least until the 2080 Super lands later this year.

    Yes, the 2080 Ti is without a competitor for 'best' enthusiast card - but it's also gross overkill (the 2080 is capable of capping monitor display rates, depending on the game and what monitor you have) and not very price-efficient as an upgrade (50% more expensive for a 20-30% increase in FPS, a margin that will shrink a bit once the 2080 Super launches).
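
    Back-of-the-envelope, with made-up round numbers (assumed for illustration, not benchmarks: ~$700 for a 2080, 50% more for the Ti, a 25% FPS uplift):

        # Cost per frame-per-second at 4K. Prices and FPS are illustrative
        # assumptions based on the rough figures above, not measurements.
        cards = {
            "RTX 2080":    {"price": 700,  "fps_4k": 60},
            "RTX 2080 Ti": {"price": 1050, "fps_4k": 60 * 1.25},  # +50% price, +25% FPS
        }
        for name, c in cards.items():
            print(f"{name}: ${c['price'] / c['fps_4k']:.2f} per frame/s")

    Under those assumptions the Ti costs about 20% more per frame delivered, which is the "not price efficient" part in numbers.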

    Unless you already have a 1440p/144Hz monitor with G-SYNC, I don't think I'd recommend flat-out buying a 2080 Ti right now unless you find one on sale, especially considering that ray-tracing improvements within the next 2-3 years will probably gut its resale value.
     
  5. zerodeefex

    zerodeefex SBAF's Imelda Marcos

    Staff Member Pyrate BWC
    Joined:
    Sep 25, 2015
    Likes Received:
    14,051
    Trophy Points:
    113
    Except he clearly doesn't want to upgrade for a long time. If that's the case, you buy the best card available and use the f**k out of it.
     
  6. BenjaminBore

    BenjaminBore Friend

    Pyrate
    Joined:
    May 23, 2016
    Likes Received:
    2,842
    Trophy Points:
    93
    Location:
    London, UK
    @OJneg Everything orbits console-target specs, and new consoles are due at the end of next year. You’ll need something that meets their on-paper spec at a minimum, but preferably double that performance-wise to account for PC inefficiencies and framerate headroom.
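
    To put "double the on-paper spec" in numbers (the console TFLOPs figure is a pure guess for illustration; the GPU numbers are approximate FP32 ratings):

        # Hypothetical next-gen console GPU vs current cards, rough FP32 TFLOPs.
        console_tflops = 10.0            # assumed for illustration, nobody knows yet
        target = console_tflops * 2      # headroom for PC overhead + framerate
        gpus = {"RTX 2070": 7.5, "RTX 2080": 10.1, "RTX 2080 Ti": 13.4}
        for name, tf in gpus.items():
            verdict = "meets" if tf >= console_tflops else "misses"
            print(f"{name}: {tf} TFLOPs, {verdict} console spec, "
                  f"{tf / target:.0%} of the 2x target")

    Even a 2080 Ti only gets about two-thirds of the way to that 2x target under these assumptions.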

    You could take a punt on a top-end card like the 2080 Ti Super, which I would estimate will somewhat exceed next-gen consoles in raw performance. But it won’t really have the kind of headroom you’d want for the long term, and who knows how those first-gen raytracing cores will hold up in a few years’ time. Alternatively, and assuming you’re running at around 1920x1080, find a deal on a 970 or 1060, which are about twice as powerful as the current-gen console base spec and can run everything very comfortably. Then wait and see how the next-gen consoles affect things before putting down the big bucks.

    None of this accounts for the increase in CPU power or the faster-than-anything-available-on-PC-today SSDs they will utilize. Most people are going to need a complete system overhaul in a few years’ time, not just a GPU upgrade.
     
    Last edited: Jul 23, 2019
  7. OJneg

    OJneg The Most Insufferable

    Pyrate BWC
    Joined:
    Sep 25, 2015
    Likes Received:
    3,923
    Trophy Points:
    113
    Location:
    Grand Rapids, MI
    I have a 4k monitor if it makes a difference. Samsung something. I'll have to GooGoo this raytracing business.

    Adding: do people still go for dual cards or is that over?
     
    Last edited: Jul 23, 2019
  8. LetMeBeFrank

    LetMeBeFrank Won't tell anyone my name is actually Francis

    Pyrate
    Joined:
    Aug 4, 2017
    Likes Received:
    3,758
    Trophy Points:
    93
    Location:
    Jackson, Mi
    SLI and CrossFire are basically dead. Most new games these days don't officially support either technology.
     
  9. Thad E Ginathom

    Thad E Ginathom Friend

    Pyrate
    Joined:
    Sep 27, 2015
    Likes Received:
    14,134
    Trophy Points:
    113
    Location:
    India
    Still using an AMD Phenom 2 X4 955 based system built in November 2010. A year or so ago, I added more memory and a low-end Nvidia graphics card.

    I guess the system was first built using a CRT monitor that dated back to my office days, probably pre-2000! It lived a lot longer than the replacement 22" LCD monitor, which, in its turn, died and was replaced by a Dell UltraSharp 24". No 4k, so I don't need processor/graphics to cope with that. Nor do I need the 4k itself, so all is good enough, and likely to remain so as long as it lives.

    Best upgrade ever, for me, is the Ergotron monitor arm that I added a year or two ago, which accommodates my dreadful feet-on-desk posture and makes life easy for my eyes.

    Thing is, I don't much do video (except for YouTube) and I am not a gamer. The most processor-intensive stuff I ever do is photography-associated: GIMP and RawTherapee. I don't take video, just stills.

    If it lives (me too!), my system will do me fine for another decade! But if I started taking/editing video, especially 4k, I'm sure I'd be needing more very quickly.

    Oh... Linux probably also helps with having a middle-aged system.
     
  10. zerodeefex

    zerodeefex SBAF's Imelda Marcos

    Staff Member Pyrate BWC
    Joined:
    Sep 25, 2015
    Likes Received:
    14,051
    Trophy Points:
    113
    Dude is running a 4k monitor. Stop recommending he buy some piece of shit card. For the last 20 years people have argued that you should "wait for the next gen."

    Consoles will get some variation of this gen Navi and Zen 2. I wouldn't worry about being on that train.
     
  11. LetMeBeFrank

    LetMeBeFrank Won't tell anyone my name is actually Francis

    Pyrate
    Joined:
    Aug 4, 2017
    Likes Received:
    3,758
    Trophy Points:
    93
    Location:
    Jackson, Mi
    If he has no interest in ray tracing then I agree with you. Just buy a 2080ti and call it good.

    Right now we are in a similar situation with ray tracing to when hardware rasterization first came out. The first-gen cards to support it were OK; second-gen were significantly better. I think ray tracing is going to be the next big thing, with basically every AAA game announced at E3 supporting it in some way. The first-gen RTX cards are great at 1080p or 1440p, but even the 2080 Ti can barely push 60fps at 4K with DXR on in a fully ray-traced game like Metro.

    Still interested to see where AMD lands with ray tracing, as they have said Navi can do it, and next gen consoles will support it using Navi, but we haven't seen it yet.
     
  12. BenjaminBore

    BenjaminBore Friend

    Pyrate
    Joined:
    May 23, 2016
    Likes Received:
    2,842
    Trophy Points:
    93
    Location:
    London, UK
    I was referring to waiting to see what the next-gen console base spec will be, not the hamster wheel of waiting for the next best GPU architecture. I wasn’t aware of the 4k monitor until OJ’s last post. I’m less versed in 4k requirements, but looking at it now it appears that at least a 2070 is needed, with the more power thrown at it the better. Just don’t expect 5+ years out of it.

    Is there a 2080 Ti Super coming, or did I just assume they were Super-ing everything? Probably not, I suppose, at least until the high-end RDNA cards come out, as it currently has no competitors.

    Raytracing is currently next to useless, as it is just layered on top of games that were designed and engineered without it. Unfortunately it is highly doubtful that the new consoles will have enough horsepower in that department to make significant use of it (though DLSS-like tech would have some impact there). Even if they did, we’re at or past the point of diminishing returns in graphics; it’s nice, but it won’t be revelatory. If there is enough horsepower to negate the baking-in of lighting effects, however, it will have a significant and positive effect on game development. Microsoft and Sony have each so far only mentioned raytracing in passing. Their biggest bullet point was the SSD, followed by the stronger CPU. We’ll have to wait and see what AMD has in store for us with RDNA 2.

    On Zen 2: it looks pretty damn good, but all the benchmarks I’d seen hadn’t done enough to remove other bottlenecks to show real-world performance, and benchmarks do not tell all. Digital Foundry found that Intel’s offerings show more consistent performance, not dipping as hard as Zen 2 does when things get heavy, along with a few other inconsistencies, some of which may be addressed in time through software. Unless you’re going for the top end, or Intel reduces their prices, this is all academic.
    https://www.eurogamer.net/articles/digitalfoundry-2019-amd-ryzen-7-3700x-review
     
    Last edited: Jul 23, 2019
  13. OJneg

    OJneg The Most Insufferable

    Pyrate BWC
    Joined:
    Sep 25, 2015
    Likes Received:
    3,923
    Trophy Points:
    113
    Location:
    Grand Rapids, MI
    Hasn't 4k been a thing for a while now? How come only the very TOTL cards can handle it?
     
  14. BenjaminBore

    BenjaminBore Friend

    Pyrate
    Joined:
    May 23, 2016
    Likes Received:
    2,842
    Trophy Points:
    93
    Location:
    London, UK
    It has, but there were compromises, and SLI was initially needed. Also, some games are more demanding than others, and requirements sneak up over time. Looking over recent benchmarks, the 2070 looks to be a good catch-all single-GPU place to start. https://www.anandtech.com/show/14663/the-nvidia-geforce-rtx-2080-super-review/7
    https://www.eurogamer.net/articles/...3-nvidia-geforce-rtx-2080-super-review?page=3

    You don’t really want your framerate to ever dip below 30fps if you’re after solid, consistent performance, or below 60fps if that’s what you’re targeting.
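
    In frame-time terms, which is what the dips actually are, the conversion is simple:

        # Frame-time budget per target framerate: a "dip" means a frame
        # took longer than this many milliseconds to render.
        for target_fps in (30, 60, 120, 144):
            print(f"{target_fps} fps -> {1000 / target_fps:.1f} ms per frame")

    30fps gives you a 33.3ms budget per frame, 60fps only 16.7ms, which is why a solid 60 at 4K is so much harder to hold than a solid 30.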


    Regarding taking console specs into account, all one has to do is look at the PC offerings available 18 months prior to the current and last console generations to see how that worked out in practical terms.
     
    Last edited: Jul 23, 2019
  15. m17xr2b

    m17xr2b Friend

    Pyrate Banned
    Joined:
    Mar 26, 2016
    Likes Received:
    3,988
    Trophy Points:
    93
    Location:
    United kingdomland of fish and chips
    4K Mario will work, 4K Crysis may work, 4K AAA games sometimes do, 4K AAA games with max anti-aliasing don't really work, and 4K 120fps is still not a thing except in lower-tier games.
    Also, there's no point in getting the 2080 Ti if your CPU can't handle it. Which is...?

    Just saying, 4K is too broad a requirement; you have to narrow it down a bit. No video card will last 5 years and still be good unless you're like me and play Disciples 2 on a 1070.
     
  16. Riotvan

    Riotvan Snoofer in the Woofer

    Pyrate
    Joined:
    Sep 27, 2015
    Likes Received:
    4,171
    Trophy Points:
    113
    Location:
    The Netherlands
    The higher you go with resolution, the less CPU speed matters. Just get 6 cores and you should be set; maybe 8 will be better once the new consoles come out, but who knows.
    As for picking a GPU for 4K, don't buy an 8GB card. I reckon a used 1080 Ti is still a great deal for budget 4K gaming, or wait and see what AMD does with the bigger Navi GPUs.
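
    The pixel math behind "the higher the res, the less CPU matters": CPU work per frame (game logic, draw calls) barely changes with resolution, while GPU work scales with the pixel count:

        # Pixels per frame at common resolutions, relative to 1080p.
        resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
        base = 1920 * 1080
        for name, (w, h) in resolutions.items():
            print(f"{name}: {w * h:,} pixels ({w * h / base:.1f}x 1080p)")

    4K is 4x the pixels of 1080p, which is also why VRAM for bigger framebuffers and textures matters more there.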
     
  17. OJneg

    OJneg The Most Insufferable

    Pyrate BWC
    Joined:
    Sep 25, 2015
    Likes Received:
    3,923
    Trophy Points:
    113
    Location:
    Grand Rapids, MI
    Ok, I'm getting the sense that I'd be happy with a 2080 card for my purposes. I'm not someone who's afraid to turn down the settings if the game becomes a slideshow.
     
  18. BenjaminBore

    BenjaminBore Friend

    Pyrate
    Joined:
    May 23, 2016
    Likes Received:
    2,842
    Trophy Points:
    93
    Location:
    London, UK
    All most people need at the moment is any old half-decent quad-core, ideally somewhere a bit over 3GHz. The PS5/XBOX4, however, will have at least 8 Zen 2 cores, which have shown hale and hearty IPC performance, unlike their console predecessors. It’s just a question of how their power and cooling limitations hinder clockspeed. Anyone on a quad-core will have to upgrade. If they manage to get it past 3GHz, then that’ll go for hexa-core, and lower-clocked octo-core, owners too.

    Saying all this, it could well be a year into the new console cycle before we see anything that really starts to utilize all the new tech, or isn’t just a cross-gen title.
     
    Last edited: Jul 23, 2019
  19. OJneg

    OJneg The Most Insufferable

    Pyrate BWC
    Joined:
    Sep 25, 2015
    Likes Received:
    3,923
    Trophy Points:
    113
    Location:
    Grand Rapids, MI
    My favorite thing about PC building is the ability to upgrade the GPU workhorse at any point in time. My 10-year-old machine has a 6-year-old (mid-grade at the time) GPU, and I've been happily using it for most any game up until the last 2 years or so.
     
  20. BenjaminBore

    BenjaminBore Friend

    Pyrate
    Joined:
    May 23, 2016
    Likes Received:
    2,842
    Trophy Points:
    93
    Location:
    London, UK
    Absolutely. I haven’t used it a great deal in the last few years, but my overclocked i7 920 is still going strong. I upgraded it with some RAM and a used factory-overclocked GTX 670 a good while back; the GPU was meant as a stopgap and I just never quite needed to change it.

    Wasn’t always like this of course. Prior to the 360/PS3 era it was barely possible to run a fixed system for even two years.
     
    Last edited: Jul 23, 2019
