gtx 1080 or wait for vega?

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
For gaming, there are usually two reasons people go SLI:

1. People trying to "game the system" - some people look at benchmarks of two cheaper cards in SLI (the RX 480, for example) and see performance numbers that beat cards more than twice as expensive. However, synthetic benchmarks usually leverage SLI better than games do, and users rarely see the same performance increase in games.

2. People with an old card who find a second one cheap, as a band-aid to postpone a full-on upgrade. This is probably the best argument for SLI, though it depends on getting your second card for real cheap.
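To put rough numbers on point 1 (all figures hypothetical, just to illustrate the scaling gap):

```python
def dual_card_fps(single_fps, scaling):
    """Estimate dual-card fps from one card's fps and an SLI scaling
    factor between 0 and 1 (1.0 = perfect doubling, which never happens)."""
    return single_fps * (1 + scaling)

single = 60.0                       # hypothetical single-card average
print(dual_card_fps(single, 0.9))   # synthetic benchmark, ~90% scaling
print(dual_card_fps(single, 0.3))   # typical game, ~30% scaling
```

The benchmark numbers look like a near-double; the in-game numbers don't.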
 

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
Lots of uses for multiple video cards. Just not so much for gaming, because the software layers for SLI/CFX are some of the last to be ironed out and optimized (if ever), as well as high-particle simulation APIs being integrated into core render engines (i.e., when Ageia made PPUs, you used an accelerator, or for a few years after the nVidia acquisition, you could use a low-end nVidia card on the side just for PhysX).

Now, what's the point unless you use multiple GPUs for multiple-tasked applications?

I had two quadfire (not actually run in bridged crossfire mode) setups back when I mined crypto (4850s and 6870s with the existing card), a hybrid setup with a GF 320 GT and a Radeon HD 6870, SLI BFG GeForce 6800 GTs, a few Diamond Voodoo2 12MB cards in SLI, a Voodoo3 3500 with an ATI All-in-Wonder 128 Pro, a Matrox Mystique, and an original Voodoo accelerator. Most recently dual Radeon 7850s, but that was used strictly for CAD, and it was replaced seamlessly with a Fury non-X (a system getting a complete overhaul near the end of this year).
 

Ares

CCS Hall of Fame
Donator
CCS Hall of Fame '19
Joined:
Aug 21, 2012
Posts:
41,437
Liked Posts:
39,639
Lots of uses for multiple video cards. Just not so much for gaming, because the software layers for SLI/CFX are some of the last to be ironed out and optimized (if ever), as well as high-particle simulation APIs being integrated into core render engines (i.e., when Ageia made PPUs, you used an accelerator, or for a few years after the nVidia acquisition, you could use a low-end nVidia card on the side just for PhysX).

Now, what's the point unless you use multiple GPUs for multiple-tasked applications?

I had two quadfire (not actually run in bridged crossfire mode) setups back when I mined crypto (4850s and 6870s with the existing card), a hybrid setup with a GF 320 GT and a Radeon HD 6870, SLI BFG GeForce 6800 GTs, a few Diamond Voodoo2 12MB cards in SLI, a Voodoo3 3500 with an ATI All-in-Wonder 128 Pro, a Matrox Mystique, and an original Voodoo accelerator. Most recently dual Radeon 7850s, but that was used strictly for CAD, and it was replaced seamlessly with a Fury non-X (a system getting a complete overhaul near the end of this year).

So your use for it is CAD.... I'm curious, what other use cases do you know of?

I am trying to think of cases where dual GPUs are helpful and I am struggling to find many.

Can it help if, say, you were running a box that was processing video streams to send out? Are those GPU-intensive?
 

xer0h0ur

HS Referee HoF
Donator
Joined:
Aug 20, 2012
Posts:
22,260
Liked Posts:
17,824
Location:
Chicago, IL.
My favorite teams
  1. Chicago White Sox
  1. Chicago Bulls
  1. Chicago Bears
  1. Chicago Blackhawks
As someone who has used up to three GPUs at once, don't ever fucking do it. It's a waste of time, money and sanity to ever get it working properly, and that is assuming the support is even there to begin with. Oftentimes you're left having to find workarounds or community-made fixes. I have tried SLI and Crossfire. It's never worth it either way. I will forever stick to buying the top-dog video card at the moment I upgrade. I am still sticking with my GTX 1080 overclocked @ 2100MHz for the time being. I game at 1440p on a 144Hz monitor, so I routinely still run into games that will kick my card in the nuts. The crude reality is that oftentimes new games are still severely unoptimized and even @ 1080p can still be taxing. Playerunknown's Battlegrounds is one of those early-access games that kicks my rig in the dick @ 1440p.
 

xer0h0ur

HS Referee HoF
Donator
Joined:
Aug 20, 2012
Posts:
22,260
Liked Posts:
17,824
Location:
Chicago, IL.
My favorite teams
  1. Chicago White Sox
  1. Chicago Bulls
  1. Chicago Bears
  1. Chicago Blackhawks
So your use for it is CAD.... I'm curious, what other use cases do you know of?

I am trying to think of cases where dual GPUs are helpful and I am struggling to find many.

Can it help if, say, you were running a box that was processing video streams to send out? Are those GPU-intensive?

Cryptocurrency mining and folding were things I was doing when I had multi-GPU setups. I don't fuck with any of that anymore.
 

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
I have friends that work in a lab and they run analyses on HUGE data sets (using R mostly) and they run multiple GPUs so they can delegate separate analyses to different GPUs and have them running at the same time.
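A minimal sketch of that kind of delegation in Python (the device IDs and the analyze() body are made up for illustration; the same pattern applies from R via its GPU packages):

```python
import os
from multiprocessing import Pool

def analyze(args):
    """Hypothetical worker: pin this process to one GPU, then run one analysis."""
    gpu_id, dataset = args
    # Setting this before any CUDA-aware library initializes makes the worker
    # see only "its" GPU, so each analysis runs on a different card.
    os.environ["CUDA_VISIBLE_DEVICES"] = str(gpu_id)
    return f"dataset {dataset!r} -> GPU {gpu_id}"

if __name__ == "__main__":
    jobs = [(0, "trial_A"), (1, "trial_B")]  # hypothetical datasets
    with Pool(processes=len(jobs)) as pool:
        for line in pool.map(analyze, jobs):
            print(line)
```

Each worker process gets its own device, so the analyses genuinely run at the same time instead of queuing on one card.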
 

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
So your use for it is CAD.... I'm curious, what other use cases do you know of?

I am trying to think of cases where dual GPUs are helpful and I am struggling to find many.

Can it help if, say, you were running a box that was processing video streams to send out? Are those GPU-intensive?

You can find uses for multiple GPUs: mining, folding, encryption cracking, rendering, container and virtualization passthrough, multi-display multitasking; yes, games do run faster, and they make a nice space heater.

That doesn't make them sensible, as others mention. The only experience I have ever had where it wasn't much hassle was the Voodoo2, and computers were slower then, so any bit of time put into configs was recouped quickly by doing the extra shit. As computers became quicker, that recoup time was gone around... I would say 2008ish. But people kept building multi-GPU systems out of habit, and others would build them just because they always wanted one. Or, like bot said, they extended the life of a machine by adding a second card for dirt cheap.

Let me throw the common scenario out. The main issue is that the games you want to play with SLI/CFX aren't fully optimized when you build it. A user buys two $300 cards, gets 65fps avg with one card, but 80fps with two (diminishing returns). In 10 months, a $300 card comes out that gets 82fps avg in that game. Then your budget for upgrades starts to make no fucking sense, because you went SLI/CFX instead of setting aside the funds for a real upgrade path. So if you have money to blow and the time to tinker, you're set.
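In dollars-per-frame terms (same made-up numbers as above):

```python
def dollars_per_fps(cost, fps):
    # Lower is better: how much you paid for each average frame per second.
    return round(cost / fps, 2)

print(dollars_per_fps(600, 80))  # two $300 cards in SLI/CFX
print(dollars_per_fps(300, 65))  # one of those cards alone
print(dollars_per_fps(300, 82))  # the new $300 card 10 months later
```

The pair costs the most per frame on day one, and a single new card beats it outright less than a year later.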

So as far as rendering and streaming, multiple GPUs can help a lot, but there is no benefit in bridged modes (SLI/CFX). Also, you're not going to buy two nVidia/AMD cards to do it. More like one or the other, and buy an application-specific accelerator, like from AVerMedia, Hauppauge, or Elgato. Some of those are external options only, therefore you're going to put more into a motherboard/expansion card with a good chipset for reliable data transfer to those areas for better optimization.
 

Ares

CCS Hall of Fame
Donator
CCS Hall of Fame '19
Joined:
Aug 21, 2012
Posts:
41,437
Liked Posts:
39,639
You can find uses for multiple GPUs: mining, folding, encryption cracking, rendering, container and virtualization passthrough, multi-display multitasking; yes, games do run faster, and they make a nice space heater.

That doesn't make them sensible, as others mention. The only experience I have ever had where it wasn't much hassle was the Voodoo2, and computers were slower then, so any bit of time put into configs was recouped quickly by doing the extra shit. As computers became quicker, that recoup time was gone around... I would say 2008ish. But people kept building multi-GPU systems out of habit, and others would build them just because they always wanted one. Or, like bot said, they extended the life of a machine by adding a second card for dirt cheap.

Let me throw the common scenario out. The main issue is that the games you want to play with SLI/CFX aren't fully optimized when you build it. A user buys two $300 cards, gets 65fps avg with one card, but 80fps with two (diminishing returns). In 10 months, a $300 card comes out that gets 82fps avg in that game. Then your budget for upgrades starts to make no fucking sense, because you went SLI/CFX instead of setting aside the funds for a real upgrade path. So if you have money to blow and the time to tinker, you're set.

So as far as rendering and streaming, multiple GPUs can help a lot, but there is no benefit in bridged modes (SLI/CFX). Also, you're not going to buy two nVidia/AMD cards to do it. More like one or the other, and buy an application-specific accelerator, like from AVerMedia, Hauppauge, or Elgato. Some of those are external options only, therefore you're going to put more into a motherboard/expansion card with a good chipset for reliable data transfer to those areas for better optimization.

Idk why but the space heater comment made me lmao....

But back to the topic... many of the tasks you mention sound like they could generically benefit from more processing power, so why not just get more CPU?

Is it limitations in builds? You already have the best CPU and the most cores you can get on your build, and multiple GPUs add more processing power even if it's not graphics-specific processing? Or are there processing tasks in the things you mention where a GPU is needed over a CPU?

This is all interesting stuff, I had not considered multiple video cards as anything but overkill for gaming honestly.
 

xer0h0ur

HS Referee HoF
Donator
Joined:
Aug 20, 2012
Posts:
22,260
Liked Posts:
17,824
Location:
Chicago, IL.
My favorite teams
  1. Chicago White Sox
  1. Chicago Bulls
  1. Chicago Bears
  1. Chicago Blackhawks
Dude, he isn't kidding. When I was running three AMD GPUs, a 295X2 + 290X, it was a legitimate space heater. I could have the heater not running in the house and the computer alone would heat that entire space. Sometimes it was even enough that I needed to crack open a window, during winter.
 

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
Idk why but the space heater comment made me lmao....

But back to the topic... many of the tasks you mention sound like they could generically benefit from more processing power, so why not just get more CPU?

Is it limitations in builds? You already have the best CPU and the most cores you can get on your build, and multiple GPUs add more processing power even if it's not graphics-specific processing? Or are there processing tasks in the things you mention where a GPU is needed over a CPU?

This is all interesting stuff, I had not considered multiple video cards as anything but overkill for gaming honestly.

GPUs are more powerful than CPUs for focused execution. CPUs are like a series of streets, roads, expressway junctions and overpasses with a lot of traffic, where the average speed is 30mph between everything. The GPU is a series of race tracks where a lot of traffic still exists, but the average speed is 180mph. But in order to use the race tracks, you still need all the logistics in between. That's why CPUs are extremely complex and need to communicate with more devices, resulting in less end-point performance. And it should be noted that all those things the CPU communicates with make the system faster as well.
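A toy model of the street/racetrack idea (lane counts and per-op times are arbitrary, just to show why lane count wins for big batches of identical, independent work):

```python
import math

def batch_time(n_ops, lanes, time_per_op):
    """Time units to finish n_ops independent, identical ops
    spread evenly across `lanes` parallel lanes."""
    return math.ceil(n_ops / lanes) * time_per_op

ops = 1_000_000
cpu = batch_time(ops, lanes=8, time_per_op=1)     # a few fast, flexible lanes
gpu = batch_time(ops, lanes=2048, time_per_op=4)  # many slower, simple lanes
print(cpu, gpu)  # the GPU finishes the batch far sooner despite slower lanes
```

Flip it around with branchy, serial work (one op depending on the last) and the CPU's fast lanes win instead, which is why you still need both.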

Then if you want to get rid of everything and be a purist in one task, there are ASICs just as well. ASICs have replaced most everything else for cryptographic hashing, cipher analysis, encode/decode specific to the codecs, and compression acceleration. ASICs would be like airports, FPGAs would be like the Bonneville salts, and direct circuits would be like light traveling, before we get into Star Trek shit, for which I have no analogy.
 
Last edited:

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
When I was running three AMD GPUs, a 295X2 + 290X, it was a legitimate space heater.

Those dual-GPU boards are cray. I have yet to do one; I came close with the GF 9800 GX2. I think that single 295X2 card could heat a room LOL
 

Ares

CCS Hall of Fame
Donator
CCS Hall of Fame '19
Joined:
Aug 21, 2012
Posts:
41,437
Liked Posts:
39,639
GPUs are more powerful than CPUs for focused execution. CPUs are like a series of streets, roads, expressway junctions and overpasses with a lot of traffic, where the average speed is 30mph between everything. The GPU is a series of race tracks where a lot of traffic still exists, but the average speed is 180mph. But in order to use the race tracks, you still need all the logistics in between. That's why CPUs are extremely complex and need to communicate with more devices, resulting in less end-point performance. And it should be noted that all those things the CPU communicates with make the system faster as well.

Then if you want to get rid of everything and be a purist in one task, there are ASICs just as well. ASICs have replaced most everything else for cryptographic hashing, cipher analysis, encode/decode specific to the codecs, and compression acceleration. ASICs would be like airports, FPGAs would be like the Bonneville salts, and direct circuits would be like light traveling, before we get into Star Trek shit, for which I have no analogy.

Interesting stuff... I imagine, based on the analogy.... only certain tasks can be offloaded to the GPU.... otherwise we wouldn't bother having any CPUs, at least at this iteration of software/hardware.

Ok last branch off.... Quantum Computing... is that on your list in the analogy... I always kind of presumed it was a next level of computing power.
 

Ares

CCS Hall of Fame
Donator
CCS Hall of Fame '19
Joined:
Aug 21, 2012
Posts:
41,437
Liked Posts:
39,639
Dude, he isn't kidding. When I was running three AMD GPUs, a 295X2 + 290X, it was a legitimate space heater. I could have the heater not running in the house and the computer alone would heat that entire space. Sometimes it was even enough that I needed to crack open a window, during winter.

Those dual-GPU boards are cray. I have yet to do one; I came close with the GF 9800 GX2. I think that single 295X2 card could heat a room LOL

Oh I have no doubts, I've had machines that pumped heat out like crazy.... just the phrasing Crys used caught my funny bone.
 

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
Interesting stuff... I imagine, based on the analogy.... only certain tasks can be offloaded to the GPU.... otherwise we wouldn't bother having any CPUs, at least at this iteration of software/hardware.

Ok last branch off.... Quantum Computing... is that on your list in the analogy... I always kind of presumed it was a next level of computing power.

Of course, there is a lot more to it than how I explained it. You can use a GPU with a controller chip and just blast it with calculations, without a traditional CPU.
The beauty of the GPU design, and the origins of demand for it, is that so many wildcard-type calculations exist in games and graphics that have overlap with other tasks.
OpenCL makes a lot of this happen. OpenCL works on hardware from a few hundred chipmakers, which include the handful of FPGA makers, nearly all of the CPUs like ARM/x86/MIPS/Power/Zilog and Freescale types, etc. nVidia uses OpenCL as well, but it also has CUDA; CUDA is not open and is a licensed tech. Plus it's many years behind OpenCL for these kinds of applications that can eliminate the need for a traditional CPU.
Quantum computing has been very disappointing. When more of those problems are solved and those systems are consumer-available, the whole list of analogies will change, no doubt. Because of OpenCL, a lot of what you're pondering gains the flexibility for transition between platforms. Stuff that doesn't exist today, thanks to layers upon layers of protectionism for the big players who know how to game the legal system against new competition.

So quantum computing... Fun topic. You'll know someone who understands this topic from someone who just repeats what they see/hear just by getting past the explanation of qubits. :D
There is also the chance that quantum core(s) will be heterogeneously combined into existing architectures until the legacy execution can be minimized to either software or some kind of FPGA. Which is why the OpenCL point is important. I think we'll realistically see CNT (carbon nanotube) transistor-accelerated processors as the next big thing. One of the next truly ground-up CPU architectures will use CNT layering, which will be a ~40% performance improvement with lower power consumption at the same lithography and die size. Quantum computing will be a dud until we all forget about it, then all of a sudden a breakthrough will make it mass-producible, and by that point all the other technologies around it will have also improved such that QC will be a small but necessary step up. Kind of like how ARM and x86 compete today.
 

xer0h0ur

HS Referee HoF
Donator
Joined:
Aug 20, 2012
Posts:
22,260
Liked Posts:
17,824
Location:
Chicago, IL.
My favorite teams
  1. Chicago White Sox
  1. Chicago Bulls
  1. Chicago Bears
  1. Chicago Blackhawks
I've been hearing for ages that x86 will go the way of the dodo bird in consumer computing but have yet to see it happen. Just how close do you see it happening Crys?
 

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
I've been hearing for ages that x86 will go the way of the dodo bird in consumer computing but have yet to see it happen. Just how close do you see it happening Crys?

Some technicalities/semantics here, because x86 has evolved to the point that it is fairly unrecognizable. For example, I barely run anything that is x86. Everything is mostly ARM or AMD64. I'd imagine x86 in modern development is dead, and the only x86 code run (i386) is legacy code that likely works on multiple platforms anyways. i686 code was so new that nearly all of it has been replaced by now by some cross-platform evolution.

So I don't see it dying for a long time, just evolving. Intel has to protect that IP, and they are deploying tactics to do so. Therefore, what constitutes x86 in their eyes is strictly language to meet legal specifications. If you're interested, study how Transmeta made an x86-code-compatible CPU without actually being x86 in architecture.
 

Monsieur Tirets

Well-known member
Joined:
Nov 8, 2012
Posts:
8,682
Liked Posts:
4,314
well i went ahead and ordered the 1080. i figure even if vega comes out soon itll be sold out for months and likely price gouged. plus if they offer a model with more performance than the 1080 itll likely cost more than the 460 i got the 1080 for anyway.

and like i mentioned, im kind of done with amd. ive bought 2 gpus in my life... one was a 9800 pro (though ati wasnt owned by amd yet) way back in the day, and it was a beast. it was awesome maxing out HL2, doom 3, far cry, etc without thinking twice about it, but even back then i always had little nagging issues and eventually the card died. and i remember thinking back then, im going to give nvidia a try next time. but fast forward many years and the 390 was simply the better deal at my price point. and its a good card, but theres always small issues, often because more games than not are gameworks titles. but the fact remains, nvidia dominates the market. the real kicker though has been how the past 6 months of crimson drivers have been totally useless for the vast majority of 390 users. newer drivers will not allow me to adjust voltage, deliver worse performance, and worst of all cause artifacting at stock clocks. so this time nvidia finally got my money.

on a side note did you know nvidia quietly "refreshed" the 10 series? it simply bumps memory speed. pretty much irrelevant. thats probably why there was so little news about it.
 


czman

Well-known member
Joined:
May 7, 2013
Posts:
2,195
Liked Posts:
551
i can get a good deal on a 1080 right now and im really considering pulling the trigger. but vega is right around the corner and while i usually advise people against waiting, with vega so close i cant help but wonder if amd will offer anything worthwhile at the same price point. but then im kind of ready to go with nvidia rather than amd since amds handling of their drivers recently has pissed me off. any drivers newer than 6 months are pretty much useless for most 390 owners.

anyway, what do you guys think? should i pull the trigger on a 1080?

The 1080 at 460 bucks is about as good a price/performance as you can hope for in a high-end card. Also, the 1080 is a great card. Obviously the luck of the fab will be a factor. I think you should jump on the deal. The Ti in most games only gives about a 10% increase in FPS, and for most people it won't matter much at all, especially if you are gaming at 1080p.

The only reason to wait for Vega is if you are looking to upgrade your monitor, since FreeSync is cheaper and more likely to become the standard. That has little to do with which card will be the better value.
 

Monsieur Tirets

Well-known member
Joined:
Nov 8, 2012
Posts:
8,682
Liked Posts:
4,314
The 1080 at 460 bucks is about as good a price/performance as you can hope for in a high-end card. Also, the 1080 is a great card. Obviously the luck of the fab will be a factor. I think you should jump on the deal. The Ti in most games only gives about a 10% increase in FPS, and for most people it won't matter much at all, especially if you are gaming at 1080p.

The only reason to wait for Vega is if you are looking to upgrade your monitor, since FreeSync is cheaper and more likely to become the standard. That has little to do with which card will be the better value.

well, in the end, after a rebate and a couple gift cards i received after the purchase, it turns out i technically got the card for $415. cant ask for much better than that when it comes to price vs performance when talking about a brand new card.
 
