Hardware Discussion Thread

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
Yeah, the thing is that gsync requires an additional hardware module and also performs over a much wider fps range than freesync.
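To illustrate the range point: adaptive sync only works while the game's frame rate sits inside the panel's variable refresh window, and the gsync module papers over low frame rates by repeating each frame until the refresh rate lands back inside that window (freesync later got the same trick as LFC). Here's a rough Python sketch of the idea; the 30-144 and 48-75 windows are made-up examples, not any particular monitor's specs:

# Simplified model of a variable refresh window plus frame repetition.
# The ranges used below are hypothetical, and real drivers only enable
# repetition when vrr_max is a big enough multiple of vrr_min; this
# sketch ignores that requirement.

def effective_refresh(fps, vrr_min, vrr_max):
    """Return (panel refresh in Hz, repeats per frame), or None if the rate can't be handled."""
    if vrr_min <= fps <= vrr_max:
        return fps, 1                     # normal adaptive sync: one refresh per rendered frame
    if fps < vrr_min:
        repeats = 2
        while fps * repeats < vrr_min:    # repeat each frame until we're back above the floor
            repeats += 1
        if fps * repeats <= vrr_max:
            return fps * repeats, repeats
    return None                           # outside the window: falls back to plain vsync on/off behaviour

# Hypothetical wide 30-144 Hz window vs a narrow 48-75 Hz one.
for fps in (25, 40, 60):
    print(fps, effective_refresh(fps, 30, 144), effective_refresh(fps, 48, 75))

The 40 fps case is the interesting one: on the narrow window it falls into a gap where neither direct adaptive sync nor frame doubling lands inside the supported range, which is the sort of thing people mean when they say gsync covers a wider range.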

Agree tho that eventually manufacturers will push for an open industry standard. But no way is gsync gonna just be flat out dropped in favor of freesync. That's just not going to happen.

Also, I don't buy the $200-300 gsync cost. Not when you can get good gsync monitors for $400.
 

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
I can't say enough how much I like my new monitor. I knew it would be an upgrade but didn't expect that big of a jump.

Acer XB270HU
2560 x 1440, 144Hz, g-sync

Rawr!

Yeah, I keep hearing from friends that moving to a high refresh monitor is the biggest change in their gaming and general PC experience since SSDs became a thing.
 

Monsieur Tirets

Well-known member
Joined:
Nov 8, 2012
Posts:
8,682
Liked Posts:
4,314
Yeah, the thing is that gsync requires an additional hardware module and also performs over a much wider fps range than freesync.

Agree tho that eventually manufacturers will push for an open industry standard. But no way is gsync gonna just be flat out dropped in favor of freesync. That's just not going to happen.

Also, I don't buy the $200-300 gsync cost. Not when you can get good gsync monitors for $400.

the article may have been old, but the point was that gsync adds additional cost while freesync doesn't. i only bring it up because we discussed this recently. and as for nvidia dropping gsync... it wouldn't be the first time a company drops its tech due to losing a battle against a competing technology. if the industry adopts freesync, nvidia won't be able to do anything about it. monitor manufacturers would much rather build one monitor instead of 2, at a lower cost per monitor to boot. plus it'll be better for the consumer. nvidia with all their proprietary bs gets annoying. personally i wouldn't buy a freesync or a gsync monitor. like you said, a monitor usually lasts a long time, the span of numerous gpus, so why the fuck would i buy a monitor that locks me into only one brand's gpus? i want to be able to buy whichever gpu on the market offers the best value while still being able to take advantage of my monitor's features. the sooner a unified standard is in place, the better.
 

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
Yeah, there is a hardware cost and I think a licensing fee too. But I think it's the manufacturers themselves that do the gsync-related gouging.

nvidia doesn't want to gouge on gsync. They want the widest adoption possible because it locks people into nvidia cards for the life of the monitor, which can be a very long time.

Also, the industry is not going to adopt freesync cause it doesn't perform as well. The best hope is a third party option emerging.
 
Last edited:

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
Also, the industry is not going to adopt freesync cause it doesn't perform as well. The best hope is a third party option emerging.

We're already seeing improvements from various manufacturers going upstream and getting used. The pull requests into the newest xserver for Kaby Lake from intel show freesync support, so the rumors that intel will support FreeSync are true. Future Adreno, Mali-T series, and PowerVR chips have also adopted support. So even if AMD's shitty drivers are a hair behind when it comes to vsync-disabled comparisons, the actual freesync technology and platform is vastly superior. nVidia will not be able to compete with the entire industry to force their standard. This is also why more FreeSync-supported monitor models are available and still coming out, despite it being the younger of the two standards. gsync is like rambus vs DDR in the late 90s. Rambus cost more and was slightly better, but it wound up dying because it was not an industry-adoptable standard.

It's okay though, not the first time AMD has released an industry standard. Hell, they were among the charter designers of the PCI-SIG that created PCI-e, which took down AGP. Same reason: one was a closed standard, the other was VESA supported. Their 64-bit x86 implementation is what most of us here are using, as intel had to drop theirs. The list of modern open standards that originated from AMD is just unreal relative to their size in the marketplace. They keep good technology moving and keep the marketplace honest and affordable. No third option is going to emerge. This is it, and AMD's track record with open standards has been impressive. ESPECIALLY post-Alpha team, none of that 3DNow BS.
 

Ares

CCS Hall of Fame
Donator
CCS Hall of Fame '19
Joined:
Aug 21, 2012
Posts:
41,432
Liked Posts:
39,633
We're already seeing improvements from various manufacturers going upstream and getting used. The pull requests into the newest xserver for Kaby Lake from intel show freesync support, so the rumors that intel will support FreeSync are true. Future Adreno, Mali-T series, and PowerVR chips have also adopted support. So even if AMD's shitty drivers are a hair behind when it comes to vsync-disabled comparisons, the actual freesync technology and platform is vastly superior. nVidia will not be able to compete with the entire industry to force their standard. This is also why more FreeSync-supported monitor models are available and still coming out, despite it being the younger of the two standards. gsync is like rambus vs DDR in the late 90s. Rambus cost more and was slightly better, but it wound up dying because it was not an industry-adoptable standard.

It's okay though, not the first time AMD has released an industry standard. Hell, they were among the charter designers of the PCI-SIG that created PCI-e, which took down AGP. Same reason: one was a closed standard, the other was VESA supported. Their 64-bit x86 implementation is what most of us here are using, as intel had to drop theirs. The list of modern open standards that originated from AMD is just unreal relative to their size in the marketplace. They keep good technology moving and keep the marketplace honest and affordable. No third option is going to emerge. This is it, and AMD's track record with open standards has been impressive. ESPECIALLY post-Alpha team, none of that 3DNow BS.

My first graphics card was an AGP 8x.... I think it had like 32MB of RAM.... godamn dat was a long time ago.
 

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
My first graphics card was an AGP 8x.... I think it had like 32MB of RAM.... godamn dat was a long time ago.

HAY, are we going to talk about old hardware or wut?
 

Schmidtaki

Just your everyday fail.
Donator
Joined:
Aug 21, 2012
Posts:
3,087
Liked Posts:
2,103
Location:
Lost OMW to the Point
My first graphics card was an AGP 8x.... I think it had like 32MB of RAM.... godamn dat was a long time ago.

Mine was a Voodoo 2 PCI video card. Man I had a friend who had two of those, linked together. We used to play Carmageddon back then.
 

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
Should I start an Old Af Hardware Discussion thread?

Then I can say, my first expansion graphics were S100s and I don't even know which chipset was used, only that it allowed me to use a cheap RF display.
 

Monsieur Tirets

Well-known member
Joined:
Nov 8, 2012
Posts:
8,682
Liked Posts:
4,314
i don't remember what they were, but i remember as a kid having a couple of all-in-one pcs, and i remember when the 486 was a big deal. i think we had a guy who built pcs put together a 486 rig for us. i was a kid but i actually researched the parts and chose what went into it. that was decades ago so i don't remember much about it. after that i stopped following the pc scene for a long time, until the early 2000s when i ordered a dell. yeah, it was a dell, but i made sure it had whatever the best intel cpu was that they offered at the time and that it had a 9800 pro. i was maxing out far cry, doom 3 and half life 2. then years later i tried to run crysis and the card blew out. it was still under warranty, and since by that time the only cards dell had weren't compatible with my motherboard, they just replaced the whole pc. the cpu was an upgrade with a core duo, but the gpu was a shitty 2400 pro, though by that time it pretty much offered the same performance as the 9800 pro anyway. that was the pc i stuck with for like ten years, and again i fell out of the pc scene until recently building this one. goes without saying that i was relegated to retro gaming for some time. lol
 

ijustposthere

Message Board Hero
Donator
CCS Hall of Fame '20
Joined:
Aug 20, 2012
Posts:
33,374
Liked Posts:
27,841
Location:
Any-Town, USA
My favorite teams
  1. Chicago Cubs
  1. Chicago Bulls
  1. Chicago Bears
  1. Chicago Blackhawks
  1. Michigan Wolverines
  2. Purdue Boilermakers
(image: 1980s computer ad)
 

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
Here is an excellent video from Gamers Nexus on how graphics cards are (and should be) rated and measured by reviewers. I highly recommend it.

The channel overall is very good imo and cuts through the bullshit with actual expertise.

The tl;dr version of the vid is:

- Average FPS is sometimes a good measure of a card, sometimes a bullshit measure, and sometimes somewhere in between.
- Min and max FPS are nearly always worthless measures of a card.

- A mix of average FPS and what Gamers Nexus calls 1% lows and 0.1% lows is the best consumer-level measure of a card.
- The very best metric is a mix of average FPS, 1% lows, 0.1% lows, and frametimes in milliseconds (rough sketch of how the lows are computed just below).
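If anyone wants to see what those lows actually are, here's a rough Python sketch of how they can be pulled out of a frametime log. The fps_metrics function and the sample frametimes are my own illustration (averaging the slowest 1% / 0.1% of frames), not necessarily GN's exact method:

# Rough sketch: average FPS, 1% lows and 0.1% lows from a frametime log
# (milliseconds per frame). The sample data below is made up to show why
# averages hide stutter; this is an illustration, not GN's exact method.

def fps_metrics(frametimes_ms):
    """Return (average FPS, 1% low FPS, 0.1% low FPS) for one benchmark run."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)

    slowest_first = sorted(frametimes_ms, reverse=True)   # worst frames first

    def low_fps(fraction):
        count = max(1, int(n * fraction))                 # e.g. the slowest 1% of frames
        worst = slowest_first[:count]
        return 1000.0 * count / sum(worst)                 # average FPS over just that slice

    return avg_fps, low_fps(0.01), low_fps(0.001)

# Made-up run: mostly ~16.7 ms frames (60 FPS) with a handful of big spikes.
frametimes = [16.7] * 990 + [50.0] * 9 + [100.0]
avg, low1, low01 = fps_metrics(frametimes)
print(f"avg {avg:.1f} FPS, 1% low {low1:.1f} FPS, 0.1% low {low01:.1f} FPS")

With those made-up numbers you get about 58.5 FPS average but only ~18 FPS 1% lows and 10 FPS 0.1% lows, which is exactly the kind of stutter an average-only bar chart hides.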

Check the video out:

https://www.youtube.com/watch?v=uXepIWi4SgM
 

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
Not trying to sound like an ass. I didn't realize that many people didn't know how to decipher benchmarks. The ones with attention span issues aren't going to learn no matter what learning aids are thrown their way.

Not only that, but comparing 1:1 in driver and chipset features is often overlooked.
 

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls

I've been hunting for a genuine IMSAI 8080 PC for a long time. A friend gave me his old one in the late 80s and I chucked it after a few months. One of those things you have and regret binning.
 

botfly10

CCS Donator
Donator
Joined:
Jun 19, 2011
Posts:
32,868
Liked Posts:
26,844
Not trying to sound like an ass. I didn't realize that many people didn't know how to decipher benchmarks. The ones with attention span issues aren't going to learn no matter what learning aids are thrown their way.

Not only that, but comparing 1:1 in driver and chipset features is often overlooked.

Sorry for making an effort to learn stuff. Did you watch the video?
 

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
Sorry for making an effort to learn stuff. Did you watch the video?

Sorry, I was harsher than usual. Yes, I did watch it. IDK, it just seems like something that would be common sense at this point. But then again, we still have HP/TQ understanding issues with automobiles along the same lines, except with cars it's about the whole power band and gearing.
 

Crystallas

Three if by air
Staff member
Donator
Joined:
Jun 25, 2010
Posts:
19,890
Liked Posts:
9,618
Location:
Next to the beef gristle mill
My favorite teams
  1. Chicago Bulls
Legit, yes. Reflective of real-world performance? No. You want a real-world benchmark to get real-world results, so find the applications/games you use most; many of those have some form of benchmark and results database.

That being said, passmark can be an okay starting point. With so many synthetic benchmark tools out there, you'll see a lot of hardware pump out great numbers yet still lose to a significantly cheaper competing product for a specific use case. Hardware makers know people buy into this trap, build hardware to wow on synthetic benchmarks, and then owners complain to developers about slow software on their new, expensive hardware. So in a way, even when passmark scores are bunk, the gaming of the scores themselves forces developers to pay more attention to certain hardware.
 
