
What's a good future-proof PC in October 2019?



Not to mention, you always fall back on Forza. Try Metro at 4K, or any game with RT, and see what happens. You will lose every time.

 

That's like saying my card is better than yours because we can achieve the same settings at 1080p60, but my card was cheaper.

  • Upvote 1

26 minutes ago, The Mother Fucker said:

Really? 'Cause the CPU simulation seems to tell a different story.  :comicbookguy:

 

20191017_204702.jpg

 

:mickey:

What does that matter?  My GPU tells a completely different story from yours, but all you cared about was what you ACHIEVED.

 

Get fucked.  My PC with my CPU bottleneck beats yours with your GPU bottleneck in this game that is so AMD favored... and the difference is my CPU bottleneck goes away when I play at actual resolutions people with good PC hardware use.  Yours doesn't. :tom: 

 

This year you've spent around $1200 on that PC.  You had a Threadripper and Vega 56s... and the $1200 you spent has gotten you barely anything.  My advice to you would be to stop buying so many weak-ass PCs and instead build a good one with a good 4K monitor and an NVIDIA GPU. :trump: 

Edited by Remij_
13 minutes ago, Remij_ said:

What does that matter?  My GPU tells a completely different story from yours, but all you cared about was what you ACHIEVED.

 

Get fucked.  My PC with my CPU bottleneck beats yours with your GPU bottleneck in this game that is so AMD favored... and the difference is my CPU bottleneck goes away when I play at actual resolutions people with good PC hardware use.  Yours doesn't. :tom: 

 

This year you've spent around $1200 on that PC.  You had a Threadripper and Vega 56s... and the $1200 you spent has gotten you barely anything.  My advice to you would be to stop buying so many weak-ass PCs and instead build a good one with a good 4K monitor and an NVIDIA GPU. :trump: 

RX 5700 XT = $449, Ryzen 7 3700X = $329. That ≠ $1200.

 

I bought that after selling my two GTX 980 Tis on eBay and returning two M.2 FPGA Acorns that I got back $680 for in Bitcoin, before the brief Bitcoin boom that happened in June.

 

I made out pretty well this year. :reg:

 

 

 

And the 2990WX Threadripper and X399 mobo I bought last year still hold their value, as the price is still the same.  Well, actually, they cost more now than they did when I bought them.  :rofls:

 

 

 

 

Edited by The Mother Fucker
2 minutes ago, The Mother Fucker said:

RX 5700 XT = $449, Ryzen 7 3700X = $329. That ≠ $1200.

 

I bought that after selling my two GTX 980 Tis on eBay and returning two M.2 FPGA Acorns that I got back $680 for in Bitcoin, before the brief Bitcoin boom that happened in June.

 

I made out pretty well this year. :reg:

 

 

 

And the 2990WX Threadripper and X399 mobo I bought last year still hold their value, as the price is still the same.  Well, actually, they cost more now than they did when I bought them.  :rofls:

 

 

 

 

No... 5700 XT Anniversary = $500, 3700X = $330, RAM = $170, case = $100, PSU = $70, and there are taxes... etc., etc.

 

Yea.. no.  The fact is.. you spent money.  And you don't think I sell my old PC components as well? :reg: 

 

And you spent a shit ton on that 2990wx... So don't talk to me about what I spent on my GPU :bena:

3 hours ago, Remij_ said:

No... 5700 XT Anniversary = $500, 3700X = $330, RAM = $170, case = $100, PSU = $70, and there are taxes... etc., etc.

 

Yea.. no.  The fact is.. you spent money.  And you don't think I sell my old PC components as well? :reg: 

 

And you spent a shit ton on that 2990wx... So don't talk to me about what I spent on my GPU :bena:

Go to AMD.com, they sell the 5700XT AE, it's $449, it's been that way since launch.   Jackass.  :kaz:  

 

I have 4 desktops, technically 5 if you count the one that's a server.  I can speak about your $1200 GPU getting sweated by the $449 GPU in one of my desktops any day of the week.  :mj2:

Edited by The Mother Fucker
1 minute ago, The Mother Fucker said:

I have 4 desktops, technically 5 if you count the one that's a server.  I can speak about your $1200 GPU getting sweated by the $449 GPU in one of my desktops any day of the week.  :mj2:

You really can't.  Because you already posted the proof in this thread that it's not even close.  The only other thing proven is that despite my CPU bottleneck... I still beat you :rofls: 

 

So 5 useless gaming PCs.. Gotcha :rofls: 

 

1 minute ago, Remij_ said:

You really can't.  Because you already posted the proof in this thread that it's not even close.  The only other thing proven is that despite my CPU bottleneck... I still beat you :rofls: 

 

So 5 useless gaming PCs.. Gotcha :rofls: 

 

You're an idiot if you think a 2990wx Threadripper is for gaming.

or that a Server is for gaming. 

 

 

5 hours ago, Hot Sauce said:

Yeah, just do this. I had to go this route on my laptop for years and it's not a problem at all.

 

I'm really curious what the ports are, though. SS labeling is just an alternative to the blue receptacle that traditionally denotes a 3.0 port and the port itself is backwards compatible with 2.0 and 1.1 devices. Type C is technically SS+, but you'd never list it as that.

 

Do you know the motherboard that's in your PC? @TLHBO


Got this one after good reviews.


https://www.asus.com/uk/Motherboards/TUF-GAMING-X570-PLUS/
 

 

Reading up on it, it seems the standard-looking USB ports are USB 3.2 Gen 2 and these things are USB 3.2 Gen 1

https://www.zdnet.com/article/get-ready-for-usb-3-2-super-fast-20gbps-transfer-speeds-but-tons-of-confusion/

 

So from what I gather they keep changing the name, but basically Gen 1 is 5 Gbps and Gen 2 is 10 Gbps.

 

Either speed is fine, but the standard USB cables are a little too big for these Gen 1 ports. At least on the motherboard I have.

12 hours ago, TLHBO said:

It's an awesome motherboard. I was choosing between that one and my Aorus Elite and only went with my Aorus Elite because it went back in stock faster. :happysad:

 

12 hours ago, TLHBO said:

Reading up on it, it seems the standard-looking USB ports are USB 3.2 Gen 2 and these things are USB 3.2 Gen 1

https://www.zdnet.com/article/get-ready-for-usb-3-2-super-fast-20gbps-transfer-speeds-but-tons-of-confusion/

 

So from what I gather they keep changing the name, but basically Gen 1 is 5 Gbps and Gen 2 is 10 Gbps.

 

Either speed is fine, but the standard USB cables are a little too big for these Gen 1 ports. At least on the motherboard I have.

dK6iRTG.png

 

That's the I/O panel in the manual for the board.

 

The 4 USB ports for (2) are USB 3.2 Gen 1, which is just essentially USB 3.0. Bandwidth is 5 Gbps.

 

The 2 USB ports for (7) are USB 3.2 Gen 2, which is just essentially USB 3.1. Bandwidth is 10 Gbps.

 

The USB port for (9) is also USB 3.2 Gen 2, but uses a type-C connector rather than a regular type-A connector.

 

The USB 3.2 naming is just needlessly dumb. All USB 3.2 adds is dual-lane data transfer to USB. Prior to USB 3.2, this is how modern USB worked:

 

8b/10b encoding? 5 Gbps -> USB 3.0

128b/132b encoding? 10 Gbps -> USB 3.1

 

With dual lanes now you've got:

 

8b/10b encoding with a single lane? 5 Gbps -> USB 3.2 1x1 -> USB 3.2 Gen 1

8b/10b encoding with two lanes? 10 Gbps -> USB 3.2 1x2 -> USB 3.2 ???

128b/132b encoding with a single lane? 10 Gbps -> USB 3.2 2x1 -> USB 3.2 Gen 2

128b/132b encoding with two lanes? 20 Gbps -> USB 3.2 2x2 -> USB 3.2 Gen 2x2
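
A rough sketch of how those encoding/lane combinations translate into usable throughput; this is my own illustration rather than anything from the post or the USB spec text above, and the only inputs are the line-code overheads and signaling rates listed in the table:

```python
# Illustrative mapping of USB 3.2 modes: encoding + lane count -> signaling
# rate, marketing name, and approximate payload throughput after line-code
# overhead. Numbers mirror the breakdown above; protocol overhead is ignored.

ENCODING_EFFICIENCY = {
    "8b/10b": 8 / 10,        # Gen 1 line code: 20% overhead
    "128b/132b": 128 / 132,  # Gen 2 line code: ~3% overhead
}

# (encoding, lanes) -> (signaling rate in Gbps, marketing name)
USB_MODES = {
    ("8b/10b", 1):    (5,  "USB 3.2 Gen 1x1 (was USB 3.0)"),
    ("8b/10b", 2):    (10, "USB 3.2 Gen 1x2"),
    ("128b/132b", 1): (10, "USB 3.2 Gen 2x1 (was USB 3.1)"),
    ("128b/132b", 2): (20, "USB 3.2 Gen 2x2"),
}

def usable_mb_per_s(encoding: str, lanes: int) -> float:
    """Approximate payload throughput in MB/s for a given mode."""
    gbps, _name = USB_MODES[(encoding, lanes)]
    return gbps * ENCODING_EFFICIENCY[encoding] * 1000 / 8  # Gbit/s -> MB/s

if __name__ == "__main__":
    for (encoding, lanes), (gbps, name) in USB_MODES.items():
        print(f"{name}: {gbps} Gbps signaling, ~{usable_mb_per_s(encoding, lanes):.0f} MB/s usable")
```

So the old "5 Gbps" ports top out around 500 MB/s of actual data and the 10 Gbps ports around 1.2 GB/s, which in practice mostly matters for fast external SSDs.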

 

It's not even that complicated, but the naming convention they went with is just dumb. Like if you're just going to ignore the existence of USB 3.2 1x2 anyway, just refer to USB 3.2 2x2 as Gen 3. But nah, gotta go with Gen 2x2 because technically Gen 3 is two different transfer rates and people might get confused. :grimaceright: 

 

If the transfer rates don't matter, then the naming convention for them doesn't matter either, and all that does matter is the connector type. (2) and (7) on your motherboard are all type-A, while (9) is type-C. The ports are backwards and forwards compatible with devices of the same connector type, so you shouldn't be having any problems getting your older type-A devices to fit.

 

All I can say is double-check for anything obstructing the ports, and for any bends on the I/O panel that might be preventing you from using those ports with your old devices. Just definitely don't replace your stuff with type-C versions, both because of the alternative @Twinblade pointed out and because they won't fit those type-A ports either.

 

You also have 4 USB 2.0 ports on your motherboard you can connect to the front of your PC case, assuming your case has USB ports there.

Edited by Hot Sauce
  • Upvote 1
15 hours ago, The Mother Fucker said:

You're an idiot if you think a 2990wx Threadripper is for gaming.

or that a Server is for gaming. 

 

 

Of course a Threadripper isn't for gaming...  That didn't stop you from bragging about your 3dmark score using it though did it?  The point is I don't care what it's for... you spent $1800 for a CPU alone...  lmao you're probably just running a basic home network as well... you have no need for that shit :rofl: 

 

If people think $1200 for a GPU is a waste.. that's fine... but they'll also likely think $1800 for a CPU is a waste as well.

6 hours ago, Hot Sauce said:

It's an awesome motherboard. I was choosing between that one and my Aorus Elite and only went with my Aorus Elite because it went back in stock faster. :happysad:

 

dK6iRTG.png

 

That's the I/O panel in the manual for the board.

 

The 4 USB ports for (2) are USB 3.2 Gen 1, which is just essentially USB 3.0. Bandwidth is 5 Gbps.

 

The 2 USB ports for (7) are USB 3.2 Gen 2, which is just essentially USB 3.1. Bandwidth is 10 Gbps.

 

The USB port for (9) is also USB 3.2 Gen 2, but uses a type-C connector rather than a regular type-A connector.

 

The USB 3.2 naming is just needlessly dumb. All USB 3.2 adds is dual-lane data transfer to USB. Prior to USB 3.2, this is how modern USB worked:

 

8b/10b encoding? 5 Gbps -> USB 3.0

128b/132b encoding? 10 Gbps -> USB 3.1

 

With dual lanes now you've got:

 

8b/10b encoding with a single lane? 5 Gbps -> USB 3.2 1x1 -> USB 3.2 Gen 1

8b/10b encoding with two lanes? 10 Gbps -> USB 3.2 1x2 -> USB 3.2 ???

128b/132b encoding with a single lane? 10 Gbps -> USB 3.2 2x1 -> USB 3.2 Gen 2

128b/132b encoding with two lanes? 20 Gbps -> USB 3.2 2x2 -> USB 3.2 Gen 2x2

 

It's not even that complicated, but the naming convention they went with is just dumb. Like if you're just going to ignore the existence of USB 3.2 1x2 anyway, just refer to USB 3.2 2x2 as Gen 3. But nah, gotta go with Gen 2x2 because technically Gen 3 is two different transfer rates and people might get confused. :grimaceright: 

 

If the transfer rates don't matter, then the naming convention for them doesn't matter either, and all that does matter is the connector type. (2) and (7) on your motherboard are all type-A, while (9) is type-C. The ports are backwards and forwards compatible with devices of the same connector type, so you shouldn't be having any problems getting your older type-A devices to fit.

 

All I can say is double-check for anything obstructing the ports, and for any bends on the I/O panel that might be preventing you from using those ports with your old devices. Just definitely don't replace your stuff with type-C versions, both because of the alternative @Twinblade pointed out and because they won't fit those type-A ports either.

 

You also have 4 USB 2.0 ports on your motherboard you can connect to the front of your PC case, assuming your case has USB ports there.

Yeah I had a quick look at it last night and figured I had the wrong end type.

 

Took another look today when I got home from work, and it turns out the fucking motherboard is misaligned :samj:

 

The bottom USB ports next to the audio ports work, but further towards the top the motherboard slants, so those top ones are out of alignment. That's why nothing fits.

 

I got this one pre-made because it didn't cost much more and I was too lazy to do the cable management. Fucking waste of time that was...

 

Now I'm gonna have to open it up tomorrow and see what I can do. Unless it's just how it is; I've done a bit of googling and it seems there's a chance it might just be how the cookie crumbles, given the case size and standoff screw size?

Edited by TLHBO
11 hours ago, Remij_ said:

Of course a Threadripper isn't for gaming...  That didn't stop you from bragging about your 3dmark score using it though did it?  The point is I don't care what it's for... you spent $1800 for a CPU alone...  lmao you're probably just running a basic home network as well... you have no need for that shit :rofl: 

 

If people think $1200 for a GPU is a waste.. that's fine... but they'll also likely think $1800 for a CPU is a waste as well.

What are you talking about?  A simple router can be a basic home network.  

 

My TR system does everything I throw at it, which is what I built it for: to be my ultimate build machine.  I've been at it since 2000, building custom PCs, and it just so happened that in 2018 I realized my last builds were from 2012 and were dated Phenoms, so I got caught up and decided to build my most powerful creation.  Compile code? Done.  Cryptomine? Done.  Re-encode 1000s of HD videos that I captured over the decade to the H.265 codec? Done.  Test the many modes of Ryzen Master to see how far this system can go in numerous benchmarks? Done.
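
For anyone curious what that kind of batch re-encode looks like, here is a minimal sketch; it assumes ffmpeg built with libx265 on the PATH and a made-up folder layout, since the poster doesn't say what tool or settings they actually used:

```python
# Hypothetical batch re-encode of HD captures to H.265/HEVC via ffmpeg + libx265.
# Assumptions: ffmpeg is on PATH with libx265 support; folder names are made up.
import subprocess
from pathlib import Path

SRC = Path("captures")        # original HD videos (assumed location)
DST = Path("captures_h265")   # re-encoded output
DST.mkdir(exist_ok=True)

for video in sorted(SRC.glob("*.mp4")):
    out = DST / video.name
    subprocess.run(
        [
            "ffmpeg", "-n",      # -n: never overwrite an existing output file
            "-i", str(video),
            "-c:v", "libx265",   # HEVC video
            "-crf", "28",        # quality target; lower = better quality, bigger file
            "-preset", "medium",
            "-c:a", "copy",      # keep the original audio track untouched
            str(out),
        ],
        check=True,
    )
    print(f"re-encoded {video.name}")
```

On a 32-core part like the 2990WX you would typically get better total throughput by running several encodes like this in parallel, since a single libx265 instance sees diminishing returns past a handful of cores.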

 

Anyway, when it comes to the GeForce RTX 2080 Ti, the price personally means nothing to me, because if I wanted one I could easily get one; that's the thing, I don't want one. When I built my ultimate TR rig I preferred that it contain hardware representative of what AMD offered, and therefore opted for an AMD GPU and decided on two RX Vega 56s, as those cards seemed the most efficient.

 

:facep: So speak for yourself remij.  A 2990wx TR probably is a waste for you, since all you do and care about is gaming when it comes to PCs.  

  • Upvote 1
13 hours ago, TLHBO said:

Yeah I had a quick look at it last night and figured I had the wrong end type.

 

Took another look today when I got home from work, and it turns out the fucking motherboard is misaligned :samj:

 

The bottom USB ports next to the audio ports work, but further towards the top the motherboard slants, so those top ones are out of alignment. That's why nothing fits.

 

I got this one pre-made because it didn't cost much more and I was too lazy to do the cable management. Fucking waste of time that was...

 

Now I'm gonna have to open it up tomorrow and see what I can do. Unless it's just how it is; I've done a bit of googling and it seems there's a chance it might just be how the cookie crumbles, given the case size and standoff screw size?

I'd guess the problem is with the I/O shield being installed incorrectly. It doesn't come pre-installed on that board and whoever put it together probably just cares that they get video out of it for testing purposes.

5 hours ago, The Mother Fucker said:

What are you talking about?  A simple router can be a basic home network.  

 

My TR system does everything I throw at it, which is what I built it for: to be my ultimate build machine.  I've been at it since 2000, building custom PCs, and it just so happened that in 2018 I realized my last builds were from 2012 and were dated Phenoms, so I got caught up and decided to build my most powerful creation.  Compile code? Done.  Cryptomine? Done.  Re-encode 1000s of HD videos that I captured over the decade to the H.265 codec? Done.  Test the many modes of Ryzen Master to see how far this system can go in numerous benchmarks? Done.

 

Anyway, when it comes to the GeForce RTX 2080 Ti, the price personally means nothing to me, because if I wanted one I could easily get one; that's the thing, I don't want one. When I built my ultimate TR rig I preferred that it contain hardware representative of what AMD offered, and therefore opted for an AMD GPU and decided on two RX Vega 56s, as those cards seemed the most efficient.

 

:facep: So speak for yourself remij.  A 2990wx TR probably is a waste for you, since all you do and care about is gaming when it comes to PCs.  

Ok... and?

 

Thing is... I didn't say a damn thing about how much you spent vs what I spent until you started bringing up that I spent $1200 on my GPU as a reason to gloat about your shit while obviously glossing over the facts.  The point was that YOU YOURSELF have spent tons of money on a single component to have "the best"... So don't come at me with the bullshit of spending some money on something I wanted as well.  I don't need a 2080 Ti to game... just like you don't need a Threadripper to compile code, cryptomine, or re-encode videos.  We've both spent a lot of money to have the fastest stuff... so why don't you just cut the shit out about the cost?

 

When I pushed my FH4 score past yours, and called out my CPU beating yours... you wanted to look at the simulation numbers instead of the achieved numbers.  But when we were talking about the GPU... no.. you were all about that achieved number.

 

And in the same breath you tell me to speak for myself, you make an assumption that all I do and care about is gaming when it comes to PCs. :roll:  This is a gaming site... nobody is talking about anything other than gaming and politics...

11 minutes ago, Remij_ said:

Ok... and?

 

Thing is... I didn't say a damn thing about how much you spent vs what I spent until you started bringing up that I spent $1200 on my GPU as a reason to gloat about your shit while obviously glossing over the facts.  The point was that YOU YOURSELF have spent tons of money on a single component to have "the best"... So don't come at me with the bullshit of spending some money on something I wanted as well.  I don't need a 2080 Ti to game... just like you don't need a Threadripper to compile code, cryptomine, or re-encode videos.  We've both spent a lot of money to have the fastest stuff... so why don't you just cut the shit out about the cost?

 

When I pushed my FH4 score past yours, and called out my CPU beating yours... you wanted to look at the simulation numbers instead of the achieved numbers.  But when we were talking about the GPU... no.. you were all about that achieved number.

 

And in the same breath you tell me to speak for myself, you make an assumption that all I do and care about is gaming when it comes to PCs. :roll:  This is a gaming site... nobody is talking about anything other than gaming and politics...

You lie; you made it about that when you brought up my TR.  This thread was about RX 5700 XT GPUs being able to compete with NVIDIA GPUs. I showed a graph of the RX 5700 XT, which costs a third of the price ($399), beating an RTX 2080 Ti ($1200) in Forza Horizon 4 in a 1080p benchmark, and that's when you got triggered and came at me with your updated 1440p driver graphs in videos, touting that the RX 5700 XT can't even beat a 2070 Super. After I showed you that it could, I egged you into putting your own 2080 Ti hardware up for the challenge.

 

I'll admit I did switch gears on which area I was looking at, away from the FPS count, but you played along, so?

My TR rig is a non-factor in this discussion; this is about the $449 RX 5700 XT AE making you sweat for a victory with your $1200 RTX 2080 Ti.

   :comicbookguy:

Edited by The Mother Fucker
32 minutes ago, The Mother Fucker said:

You lie; you made it about that when you brought up my TR.  This thread was about RX 5700 XT GPUs being able to compete with NVIDIA GPUs. I showed a graph of the RX 5700 XT, which costs a third of the price ($399), beating an RTX 2080 Ti ($1200) in Forza Horizon 4 in a 1080p benchmark, and that's when you got triggered and came at me with your updated 1440p driver graphs in videos, touting that the RX 5700 XT can't even beat a 2070 Super. After I showed you that it could, I egged you into putting your own 2080 Ti hardware up for the challenge.

 

I'll admit I did switch gears on which area I was looking at, away from the FPS count, but you played along, so?

My TR rig is a non-factor in this discussion; this is about the $449 RX 5700 XT AE making you sweat for a victory with your $1200 RTX 2080 Ti.

   :comicbookguy:

So you admit to showing an old, outdated graph to push your little agenda claiming that a 5700 XT could beat a 2080 Ti.  I came in and corrected you... and showed you benches where the 2070S was as fast... Then you admitted that what you showed to begin with was outdated and that you didn't know Nvidia had a performance patch for that game (which was a fucking lie), and then you started pitting your own GPU against it... with your faster CPU.  And that's the reason why you only want to benchmark at 1080p... since the Nvidia GPUs are CPU bottlenecked at 1080p.

 

So then you asked me to post my own scores beating it (remember your original claim was that the 5700XT beat the 2080ti) and then I slapped you down while explaining to you that I'm severely bottlenecked at that resolution.  Which you KNOW is true.  So you kept going on and on about stupid shit, ignoring the fact that the benchmark itself clearly shows what I explained.

 

In the end.  The first graph you showed was wrong... and then we put it to the test in this very thread, and I beat your ass black and blue... even at your own shitty 1080p resolution, which immensely favored your CPU.

 

There was no sweating to victory.  Every benchmark posted in this thread showed that my GPU smokes yours.  This wasn't a GPU comparison like you were claiming... this was a CPU vs CPU comparison... and I pushed my old 4-core past your brand-new 8-core in effective output. 161 vs 156 :hehe: 
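
To make the bottleneck argument here concrete, a toy model follows; the numbers are purely illustrative assumptions, not anyone's actual benchmark results. The achieved frame rate is roughly the lower of what the CPU can feed and what the GPU can render at a given resolution, which is why a fast GPU looks capped at 1080p but pulls away at 4K:

```python
# Toy CPU-vs-GPU bottleneck model (illustrative numbers only, not real benchmarks).
# Achieved fps is roughly min(CPU-side fps, GPU-side fps at that resolution).

CPU_FPS_LIMIT = 160        # how fast the CPU can prepare frames (assumed)
GPU_FPS_AT_1080P = 240     # what the GPU could render if never CPU-limited (assumed)

# Pixel counts relative to 1080p; GPU-side fps scales down roughly with pixel load.
RESOLUTION_SCALE = {"1080p": 1.0, "1440p": 1.78, "4K": 4.0}

for res, scale in RESOLUTION_SCALE.items():
    gpu_fps = GPU_FPS_AT_1080P / scale
    achieved = min(CPU_FPS_LIMIT, gpu_fps)
    limiter = "CPU" if gpu_fps > CPU_FPS_LIMIT else "GPU"
    print(f"{res}: GPU could do ~{gpu_fps:.0f} fps, achieved ~{achieved:.0f} fps ({limiter}-bound)")
```

With these made-up numbers the 1080p result is pinned at the CPU limit while the GPU sits partly idle, and at 1440p and 4K the GPU becomes the limiter, which is the shape of the argument both sides are making above.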

Edited by Remij_
1 hour ago, Remij_ said:

So you admit to showing an old, outdated graph to push your little agenda claiming that a 5700 XT could beat a 2080 Ti.  I came in and corrected you... and showed you benches where the 2070S was as fast... Then you admitted that what you showed to begin with was outdated and that you didn't know Nvidia had a performance patch for that game (which was a fucking lie), and then you started pitting your own GPU against it... with your faster CPU.  And that's the reason why you only want to benchmark at 1080p... since the Nvidia GPUs are CPU bottlenecked at 1080p.

 

So then you asked me to post my own scores beating it (remember your original claim was that the 5700XT beat the 2080ti) and then I slapped you down while explaining to you that I'm severely bottlenecked at that resolution.  Which you KNOW is true.  So you kept going on and on about stupid shit, ignoring the fact that the benchmark itself clearly shows what I explained.

 

In the end.  The first graph you showed was wrong... and then we put it to the test in this very thread, and I beat your ass black and blue... even at your own shitty 1080p resolution, which immensely favored your CPU.

 

There was no sweating to victory.  Every benchmark posted in this thread showed that my GPU smokes yours.  This wasn't a GPU comparison like you were claiming... this was a CPU vs CPU comparison... and I pushed my old 4-core past your brand-new 8-core in effective output. 161 vs 156 :hehe: 

Actually, the last benchmark I ran that night I got a 157 :smoke:   So be happy you managed to squeeze out a mere 4 fps ahead.  You're trying to claim this is CPU vs CPU in a benchmark of a game that is heavily GPU dependent. If this were a CPU war, you would lose, which is why I pointed to the CPU simulations that show my system blowing yours away in that category.

 

 

Enough of this squabble, let's settle this in a game of Mortal Kombat 11.   :obama:

Just now, The Mother Fucker said:

Actually, the last benchmark I ran that night I got a 157 :smoke:   So be happy you managed to squeeze out a mere 4 fps ahead.  You're trying to claim this is CPU vs CPU in a benchmark of a game that is heavily GPU dependent. If this were a CPU war, you would lose, which is why I pointed to the CPU simulations that show my system blowing yours away in that category.

 

 

Enough of this squabble, let's settle this in a game of Mortal Kombat 11.   :obama:

This is CPU vs CPU... the game isn't heavily GPU dependent because you only want to test at 1080p... :drake: 

And I already showed my GPU min/avg/max fps destroying yours... while being completely 100% CPU bound...  with only ~75% of my GPU being utilized.. :drake: 

 

You did it to yourself. :shrug: 

