Hardware Discussion Thread

Monsieur Tirets

Well-known member
Joined:
Nov 8, 2012
Posts:
8,682
Liked Posts:
4,314
further into last light now and while it is more of the same for the most part everything is more polished and detailed. so far its shaping up to be a better game than its predecessor.

and to talk about gpus again, 1070 reviews have been released. it pretty much performs on par with the 980ti, with the ti coming out on top slightly in some games and the 1070 in others. and this makes the current nvidia lineup rather odd imo. the 1080, while delivering a bit more performance, costs over 200 bucks more, and unless you are going to spend thousands to sli and game at 4k, or really need to game at 144hz 1080p, i really dont see a reason to buy the 1080 besides simply wanting the newest most expensive shit. meanwhile prices on 980 tis should come way down with the 1070 offering similar performance for 380 bucks. then on top of all that theres still the question of whether or not nvidia gpus are going to handle dx12 anywhere near as well as amds. so, while polaris will simply offer 390 performance and is not meant to compete with pascal, vega, coming out late this year or early next year, will. and itll bring hbm2 with it and possibly far better dx12/vulkan performance as well.

a part of me was thinking maybe i should swoop in when 980 ti prices drop, but now im thinking with the 390 still being more than enough for 1080p i should just wait until vega is released and see what happens. even if i dont want to go with one of the current cards on the market, used 980 tis will still be out there, or i could always see how cheaply i can crossfire 390s.
 

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
Wait for the aftermarket cards and see how they perform. They are all getting released this week. I saw asus is releasing a 1080 strix that has a custom board for $640


As far as the pricing... the x80 cards have always had a poor performance-to-dollar ratio. x70 cards have usually had the best performance-to-dollar ratio. Its also why the 980ti was so popular. Excellent performance-to-dollar ratio, especially for a flagship.
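the x70-vs-x80 value argument is easy to sanity-check with a quick perf-per-dollar calculation. the relative performance figures below are rough assumptions for illustration (980 Ti normalized to 1.00, launch-era USD prices), not measured numbers:

```python
# Rough performance-per-dollar comparison at launch-era pricing.
# "perf" values are assumed relative numbers (980 Ti = 1.00),
# not benchmark results.
cards = {
    "GTX 1070": {"perf": 1.00, "price": 380},   # ~980 Ti performance
    "GTX 1080": {"perf": 1.25, "price": 600},   # assumed ~25% faster
    "GTX 980 Ti": {"perf": 1.00, "price": 650},
}

for name, c in cards.items():
    # units of relative performance per $100 spent
    value = c["perf"] / c["price"] * 100
    print(f"{name}: {value:.3f} perf per $100")
```

with these assumed numbers the 1070 comes out well ahead of both the 1080 and the 980 Ti on value, which is the point being made: the x70 tier buys the most performance per dollar even when the x80 is absolutely faster.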
 
Last edited:

Monsieur Tirets

Well-known member
Joined:
Nov 8, 2012
Posts:
8,682
Liked Posts:
4,314
Wait for the aftermarket cards and see how they perform. They are all getting released this week. I saw asus is releasing a 1080 strix that has a custom board for $640


As far as the pricing... the x80 cards have always had a poor performance-to-dollar ratio. x70 cards have usually had the best performance-to-dollar ratio. Its also why the 980ti was so popular. Excellent performance-to-dollar ratio, especially for a flagship.

i have to admit, while im happy with my 390 and know you cant keep waiting for the next big thing because youll never buy anything, im kind of annoyed that suddenly, just a few months after building my pc, a card delivering 980ti performance is available for 65 bucks more than my 390. theres always going to be new cards offering better performance, but i didnt expect 980ti performance in this price range. i figured a jump like every other generation, like even this one with the 1080 offering a bit more performance than the 980ti for the same price. but the 1070 is a big jump from the 970 and is closer to the 1080 for hundreds less.

hell, maybe rumors of a 300 dollar amd card offering 980ti performance are true. which is why ill wait. lol my luck, even if i went and got a 1070, amd would turn around and immediately offer something better and cheaper.
 

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
Yeah. Well, for me, the next big purchase is a new monitor. Really no sense in upgrading my card before that. I wish I could try 1440 tho. I am not convinced the resolution jump is worth losing the frames.

The acer predator z35 is an ultrawide 1080 monitor that hits 200 hz and has gsync
 

Monsieur Tirets

Well-known member
Joined:
Nov 8, 2012
Posts:
8,682
Liked Posts:
4,314
Yeah. Well, for me, the next big purchase is a new monitor. Really no sense in upgrading my card before that. I wish I could try 1440 tho. I am not convinced the resolution jump is worth losing the frames.

The acer predator z35 is an ultrawide 1080 monitor that hits 200 hz and has gsync

yeah, as far as resolution goes, even the gtx 1080 cant deliver a constant 60+ fps in some games at 1440p. and all i know is if i spent 600 plus dollars on a gpu you better damn well believe i want to max everything out while never seeing fps drop below 60... ever. even if that means 1080p. thats why i dont get why people call cards like the 980ti overkill at 1080p, its really not. even the 1080 isnt if you want a constant 120+ fps.
 

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
yeah, as far as resolution goes, even the gtx 1080 cant deliver a constant 60+ fps in some games at 1440p. and all i know is if i spent 600 plus dollars on a gpu you better damn well believe i want to max everything out while never seeing fps drop below 60... ever. even if that means 1080p. thats why i dont get why people call cards like the 980ti overkill at 1080p, its really not. even the 1080 isnt if you want a constant 120+ fps.

1080 will stay above 60fps on 1440. How many times I gotta tell you to stop looking at the reference bs.

Aftermarket 1080s are out and they are up in the 90-100s on 1440 and at or above 60 in 4k.

Further, you don't need AA anymore in 1440 or above and a lot of benchmarks have it turned on. And there is other barely noticeable shit you can turn down for more frames too.

Well, OK. I just talked myself into 1440. If nothing else then for the screen real estate. 1080p is pretty much dead I guess.

If I can get 1440 at 90+ Hz/fps with gsync, I am stoked. Time to start saving up I guess.
 
Last edited:

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
Look at this. And this card doesn't even have a custom board, just a better cooler (1080 was def thermal throttling btw). Just wait for those aftermarket boards with better VRMs and more power pins.

[video=youtube;22ePLtROm6I]https://www.youtube.com/watch?v=22ePLtROm6I[/video]
 

Monsieur Tirets

Well-known member
Joined:
Nov 8, 2012
Posts:
8,682
Liked Posts:
4,314
doom is extremely well optimized, and while good looking, isnt that demanding. its not a good example to use. the witcher 3, GTA5, and rise of the tomb raider all fall in the 30s (average; minimums are far lower, often single digit) at 4k with the 1080.

and the 1070 delivers minimum frames in the 30s in a number of games at 1440p. you cant just look at averages (which is all a number of benches like to show), you have to look at minimum fps, and the 1070 simply isnt capable of delivering fps above 60 at all times at that resolution. its essentially on par with the 980 ti. it seems to fall a little short overall actually, so custom cards might push it so that its even or a little better, but even turning off aa and running a custom pcb/cooler is not going to raise fps by 30+. its just not going to happen. with that said, if youre willing to deal with a 60fps average with some fps dips and/or start lowering settings then the 1070 is a decent card for 1440. youre definitely not going to get 100fps or anywhere near it consistently though. not even the 1080 is going to give you 100fps at 1440. shit, the average fps in most games isnt near that. if you want 100fps you need to stick to 1080p or go sli. its really that simple.

anyway, with the 1070 delivering near 980ti performance at 380 bucks i think amd is poised to fire back. polaris is rumored to deliver 390 performance at 200-250 bucks, so i think amd will release their 390 successor at the same 300 dollar or so price point while offering 980 ti performance as well. im likely going to have to eat crow, i seriously thought that rumor was bs, but it probably wasnt. it makes sense now: if nvidia was able to make that jump going from 28nm to 16, amd should be able to offer even more performance at an even lower cost going to 14nm. hell, with the 1080 costing 600+ the 1080ti is going to be expensive as hell, which means amd might be able to undercut them and offer equal performance at a lower price bracket. which could explain why amd is releasing the mainstream cards first with the 280 successor and is waiting on vega. 390 performance below 250 bucks will sell a shit ton of cards, far more than either company will sell of their 300+ dollar cards, regardless of performance. i definitely want to see what happens once both players have shown their cards. while i was annoyed with a card like the 1070 being released so soon after building my pc, ive realized im in no rush to upgrade anyway. the 390 delivers 60+fps without having to skimp on eye candy, and considering i have a 60hz monitor i dont make use of any extra frames anyway.

though if you do upgrade and are willing to make me an offer i couldnt refuse on that 980ti... you know what? youre right, i was wrong, youll get 100+ fps, you should upgrade.
 

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
AMD's Polaris is a versatile architecture targeting platform potential, not top of the price-segment gaming. Although they will price accordingly anyway.

IMO, it's going to wind up becoming the candidate for PS4.x, XBOne (point five or whatever) and the NX. Now, I'm actually happy with my Fury, and I'm constantly making good use of the HBM over GDDR5 in a lot of CAD projects, but AMD is still leaving Linux proprietary drivers without support for major features in the kernel and xserver. If they are focusing software on consolemakers to make the most of the GPU and focusing the platform on more users than just a traditional desktop PC, why the fuck should I support them? nVidia has been great with drivers in the last 2 years, after being fairly rough for a number of years before that. nVidia could have HBM2 at the same time as AMD, or sooner. And nVidia might be forced to adopt more VESA standards, and phase out gsync for adaptive sync or freesync just to be compliant with a standards organization that they carry huge influence with.

Really, AMD has one thing going for them in the power desktop PC/gamer market. Their open source drivers are years ahead of nVidia's. nVidia's proprietary drivers are still better than AMD's proprietary drivers, so that's really not enough reason to go AMD in 2016.

Why is that important? Because if you can't support Vulkan for shit (even if you pretty much invented the damn thing!!) then benchmarks off review-sites start to mean shit. Your hardware will be fast but hardly usable.
 

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
doom is extremely well optimized, and while good looking, isnt that demanding. its not a good example to use. the witcher 3, GTA5, and rise of the tomb raider all fall in the 30s (average; minimums are far lower, often single digit) at 4k with the 1080.

and the 1070 delivers minimum frames in the 30s in a number of games at 1440p. you cant just look at averages (which is all a number of benches like to show), you have to look at minimum fps, and the 1070 simply isnt capable of delivering fps above 60 at all times at that resolution. its essentially on par with the 980 ti. it seems to fall a little short overall actually, so custom cards might push it so that its even or a little better, but even turning off aa and running a custom pcb/cooler is not going to raise fps by 30+. its just not going to happen. with that said, if youre willing to deal with a 60fps average with some fps dips and/or start lowering settings then the 1070 is a decent card for 1440. youre definitely not going to get 100fps or anywhere near it consistently though. not even the 1080 is going to give you 100fps at 1440. shit, the average fps in most games isnt near that. if you want 100fps you need to stick to 1080p or go sli. its really that simple.

anyway, with the 1070 delivering near 980ti performance at 380 bucks i think amd is poised to fire back. polaris is rumored to deliver 390 performance at 200-250 bucks, so i think amd will release their 390 successor at the same 300 dollar or so price point while offering 980 ti performance as well. im likely going to have to eat crow, i seriously thought that rumor was bs, but it probably wasnt. it makes sense now: if nvidia was able to make that jump going from 28nm to 16, amd should be able to offer even more performance at an even lower cost going to 14nm. hell, with the 1080 costing 600+ the 1080ti is going to be expensive as hell, which means amd might be able to undercut them and offer equal performance at a lower price bracket. which could explain why amd is releasing the mainstream cards first with the 280 successor and is waiting on vega. 390 performance below 250 bucks will sell a shit ton of cards, far more than either company will sell of their 300+ dollar cards, regardless of performance. i definitely want to see what happens once both players have shown their cards. while i was annoyed with a card like the 1070 being released so soon after building my pc, ive realized im in no rush to upgrade anyway. the 390 delivers 60+fps without having to skimp on eye candy, and considering i have a 60hz monitor i dont make use of any extra frames anyway.

though if you do upgrade and are willing to make me an offer i couldnt refuse on that 980ti... you know what? youre right, i was wrong, youll get 100+ fps, you should upgrade.

I don't know where you are getting all this. The 980 ti already gets 60-100 fps in 1440. And yes, aftermarket cards are reporting overclocks in the 2.5 GHz range, which is almost a 70% improvement over the founders (including the boost clock on the founders).

If you are saying the min fps has to be in the 90s range, thats absurd. Min fps tells you almost nothing since there is no way to tell how long or how often it dipped that low. Plus, thats the point of gsync - to remove stutter or tearing from frame dips. Its pretty clear that you can get an excellent experience at 1440 using a 980 ti, not to mention a 1080.

[video=youtube;BHAIe-fOTyM]https://www.youtube.com/watch?v=BHAIe-fOTyM[/video]
 
Last edited:

Monsieur Tirets

Well-known member
Joined:
Nov 8, 2012
Posts:
8,682
Liked Posts:
4,314
im absolutely talking minimum fps, and to say minimum fps tells you almost nothing is ludicrous. minimums tell you everything, max and averages tell you jack shit on their own. but regardless, at 1440p none of these cards are even getting anywhere near a max 100fps, let alone an average 100fps, in most games. whats the point of a 100+hz monitor if youre rarely hitting those fps? all you have to do is watch some benchmarks, including the one you posted, though again the lack of minimums taints the results, and youll see that none of these cards are near 100fps the majority of the time at 1440p. the games that do hit 100fps are the exception, and usually older. theres a big difference between averaging 60-75 and averaging 100.

[video=youtube_share;Wf6lpFmCKGQ]https://youtu.be/Wf6lpFmCKGQ[/video]

reference or not, which by the way people have overclocked to over 2ghz without throttling, aftermarket cards are not going to give you dozens of extra fps. shit, even in the vid you posted the 980ti kingpin only delivers a few more fps than the reference. dont get me wrong, the 1070 is a hell of a deal and the 1080 is a strong 1440p card capable of delivering 1440p 60fps performance across the board, but they are not 2k 100+fps cards. at least not singly. wait, no, yes they are.
 

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
Min fps doesn't tell you anything because you have no idea how long the fps was that low. Does that make sense? If you load into a new area and the frames drop for a few seconds while it renders, that shows up as your min.

Measuring a card's capabilities by its min fps is absurd.

And anyway, thats the whole point of gsync. So you don't even notice the spikes.

And if you are averaging 90+ fps, lol yes a 100hz monitor matters.

Finally, this is all still ignoring that you can turn down just a few settings (and turn AA all the way off) to get big fps improvements.
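the whole min-vs-average argument comes down to how a frametime log gets summarized. a quick sketch using made-up frametime data (all numbers hypothetical) shows why a lone minimum can mislead, and why percentile "lows" are a more robust middle ground than either min or average alone:

```python
# Illustrates min fps vs average fps vs "1% low" on fake frametime
# data (milliseconds per frame). All numbers are invented.
def fps_stats(frametimes_ms):
    """Summarize a frametime log: average fps, absolute minimum fps,
    and the 1% low (average fps over the slowest 1% of frames)."""
    fps = sorted(1000.0 / ft for ft in frametimes_ms)  # ascending
    avg = len(frametimes_ms) * 1000.0 / sum(frametimes_ms)
    worst = fps[: max(1, len(fps) // 100)]  # slowest 1% of frames
    return {
        "avg": round(avg, 1),
        "min": round(fps[0], 1),
        "low_1pct": round(sum(worst) / len(worst), 1),
    }

# A steady ~70 fps run with one 100 ms loading hitch: the min
# collapses to 10 fps, but the 1% low shows it was a one-off.
steady = [1000 / 70] * 999 + [100.0]
# A run that constantly swings between 90 and 45 fps: better min,
# worse experience.
choppy = [1000 / 90] * 500 + [1000 / 45] * 500
print(fps_stats(steady))   # min looks terrible, 1% low is fine
print(fps_stats(choppy))   # min looks fine, lows are sustained
```

the steady run reports a far worse minimum (10 vs 45) despite being the smoother experience, which is the "min tells you nothing about duration" point; the 1% low captures the difference that a bare minimum hides.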
 
Last edited:

Monsieur Tirets

Well-known member
Joined:
Nov 8, 2012
Posts:
8,682
Liked Posts:
4,314
Min fps doesn't tell you anything because you have no idea how long the fps was that low. Does that make sense? If you load into a new area and the frames drop for a few seconds while it renders, that shows up as your min.

Measuring a card's capabilities by its min fps is absurd.

And anyway, thats the whole point of gsync. So you don't even notice the spikes.

And if you are averaging 90+ fps, lol yes a 100hz monitor matters.

Finally, this is all still ignoring that you can turn down just a few settings (and turn AA all the way off) to get big fps improvements.

thats why you need to watch actual benchmarks with footage as well, not just graphs. any reviewer worth a damn will tell you a benchmark without mins is worthless. watch the digital foundry vid i posted if you havent, they probably do the best benchmark/performance vids, and as you can see in that vid the 1080 is nowhere near averaging 90fps in demanding games. youd have to drop a number of settings to get from a 60-70fps average to 90-100 without major dips. gsync or not, 60fps is still 60fps and 100 is still 100. shit man, if you want to pick up a 1080 and a 1440p monitor, more power to you. i would if i had the cash, i simply wouldnt expect to game at anything much more than 60fps most of the time.

heres a 1080 round up. the asus card is looking pretty beastly.

http://wccftech.com/nvidia-geforce-gtx-1080-custom-model-round-up/

oh, and apparently news has trickled out of computex and the first polaris gpu will cost 200 bucks and deliver 390x/980 performance. if true, AMDs 300 dollar 980 ti beater is looking more and more like a reality.

anyway, back to games. im a lot further into last light, probably a bit more than halfway. its good. it took everything from the original and simply improved upon it. the mood and atmosphere are thicker, the horror elements are executed with more craft, and the metro underworld feels more alive, with the stations a bit more fleshed out, as well as the factions and the conflicts between them.
 

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
If AMD has a card that performs at or slightly above a 980 ti, that would put it in the neighborhood of a 1070. If they release at $300, that would be crazy. It seems really unlikely, but shit, would be great for the industry if true.
 

Ares

CCS Hall of Fame
Donator
CCS Hall of Fame '19
Joined:
Aug 21, 2012
Posts:
41,424
Liked Posts:
39,625
So I am in the middle of Watchdogs and I also still need to complete the last Assassin's Creed game and Shadow of Mordor is still in my queue and I just picked up that The War for the Underworld game which is pretty sweet.

What are you guys playing right now?

:troll:
 

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
Oh, and as far as monitors, I have been thinking about it and totally forgot maybe the most important perspective - namely that monitors aren't something I really have an upgrade cycle for. I have had the same monitor since I built my first computer, maybe 6 years ago. When I buy a new one, I will probably have it for around another 6 years minimum. Probably longer.

So in that sense, 1080p probably really is dead as far as making new purchases. It would be crazy to buy a brand new 1080p monitor and expect to be satisfied with it for that long.

The flip side is that monitor tech has had crazy advances over the last year with a lot more new stuff on the immediate horizon. If I am gonna have the thing for upwards of 10 years, it might be worth waiting to see how that shit matures (not that I'm going to pay $1200 anyway)
 

Monsieur Tirets

Well-known member
Joined:
Nov 8, 2012
Posts:
8,682
Liked Posts:
4,314
If AMD has a card that performs at or slightly above a 980 ti, that would put it in the neighborhood of a 1070. If they release at $300, that would be crazy. It seems really unlikely, but shit, would be great for the industry if true.

a few weeks ago when that rumor first surfaced i called it bs, if you remember, but now with the 1070 having an msrp of $380, polaris delivering the performance it does at $200, and the fact that AMD is on 14nm, it all adds up to suggest AMD will be able to do just that... deliver 980ti performance at around $300. thats why even if i had the cash to upgrade already, i think id wait. not only is it likely going to be a month or two before third party nvidia cards are readily available at reasonable prices, but by then we should also have more insight as to what amd is doing. plus, its looking like a used 980 ti down the road might actually offer the best bang for your buck.

as for monitors, i got a new one with this pc. its 1080p but only cost a little over $100, and its an upgrade over the 1024x768 monitor i had for the previous 10 years. lol besides, im good with 1080p for a while. and like you said, waiting makes sense. once 4k becomes the standard, prices should come way down.
 

ytsejam

CCS Donator
Donator
Joined:
May 31, 2010
Posts:
5,947
Liked Posts:
5,621
I can't say enough how much I like my new monitor. I knew it would be an upgrade but didn't expect that big of a jump.

Acer XB270HU
2560 x 1440, 144hz, g-sync

Rawr!
 

Monsieur Tirets

Well-known member
Joined:
Nov 8, 2012
Posts:
8,682
Liked Posts:
4,314
I can't say enough how much I like my new monitor. I knew it would be an upgrade but didn't expect that big of a jump.

Acer XB270HU
2560 x 1440, 144hz, g-sync

Rawr!

that reminds me, i just came across an article, and apparently gsync adds 2-3 hundred dollars to the production cost of a monitor, while freesync costs nothing. which might explain why theres talk that nvidia might drop gsync and the industry might adopt freesync as the standard.
 

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
that reminds me, i just came across an article, and apparently gsync adds 2-3 hundred dollars to the production cost of a monitor, while freesync costs nothing. which might explain why theres talk that nvidia might drop gsync and the industry might adopt freesync as the standard.

This would be best for everyone. Using as many VESA standards as possible makes life easier for all parties involved, except the one that might take a hit on R&D costs, but over time even they can still do better with an open standard. PC part makers need to be careful not to IP-troll the world so hard, because the added costs will push a lot of newer-generation PC users to simply stick to a mobile device if they are on the fence.
 
