
NVIDIA Introduces G-SYNC Technology for Gaming Monitors

Joined
Jun 11, 2013
Messages
353 (0.08/day)
Processor Core i5-3350P @3.5GHz
Motherboard MSI Z77MA-G45 (uATX)
Cooling Stock Intel
Memory 2x4GB Crucial Ballistix Tactical DDR3 1600
Video Card(s) |Ξ \/ G /\ GeForce GTX 670 FTW+ 4GB w/Backplate, Part Number: 04G-P4-3673-KR, ASIC 68.5%
Storage some cheap seagates and one wd green
Display(s) Dell UltraSharp U2412M
Case some cheap old eurocase, black, with integrated cheap lit lcd for basic monitoring
Audio Device(s) Realtek ALC892
Power Supply Enermax Triathlor 550W ETA550AWT bronze, non-modular, airflow audible over 300W power draw
Mouse PMSG1G
Keyboard oldschool membrane Keytronic 104 Key PS/2 (big enter, right part of right shift broken into "\" key)
Fair enough.
Not useless, but walled garden ... it's basically just a pile of bones.

If NV did this in the range of 140-85Hz, I'd call that a gamer's benefit :toast:
But dropping frames to what? 5fps@5Hz and calling it a holy grail????? :confused::twitch:
Gimme a f*cking break! :banghead:

I'm too old to jump at this bait, so I'm kinda happy I can't afford an NVIDIA g-card :laugh:
 
Joined
Apr 30, 2012
Messages
3,881 (0.85/day)
Can I upgrade it to a 780M?

 
Joined
Sep 28, 2012
Messages
980 (0.22/day)
System Name Poor Man's PC
Processor waiting for 9800X3D...
Motherboard MSI B650M Mortar WiFi
Cooling Thermalright Phantom Spirit 120 with Arctic P12 Max fan
Memory 32GB GSkill Flare X5 DDR5 6000Mhz
Video Card(s) XFX Merc 310 Radeon RX 7900 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Xiaomi G Pro 27i MiniLED + AOC 22BH2M2
Case Asus A21 Case
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?



Interesting... so G-Sync could be NVIDIA's mobile version of a lowest-end SKU graphics card that doesn't sell, or it might be a defective Tegra 4 from the NVIDIA Shield. /sarcasm
 
Joined
Dec 16, 2010
Messages
1,668 (0.33/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, Samsung PM981a 1TB, 4 x 4TB + 1 x 10TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
As a red team guy, the hatred in this thread is ridiculous (on a related note, AMD setting up a 290X demo across the street from NV's event made me uneasy).

Name non-proprietary NV tech? Someone was blind on a videocardz thread as well. PhysX on the CPU is multiplatform and in many games and consoles; the new FCAT method runs anywhere; FXAA (although I want to say that's one guy); and how about helping devs add DX10 or DX11 support? Not all NV-sponsored titles have locks on them, and AMD has done the same in helping Codemasters games be good-looking and efficient.

Sure, GPU PhysX and CUDA are very annoying, not doubting that, and it's not "good guy NVIDIA", but many things start out proprietary to show that they work.

We should be pressuring VESA/DisplayPort/HDMI for an adaptive refresh technique like this :respect:

I don't get why NV doesn't license things out so that everyone can enjoy them instead of locking them down to their platform only (look at Blu-ray: it's a standard, just a royalty is paid, and it's not locked to Sony products).

Just because we didn't end up with a perfect or open world doesn't mean we should destroy it; this is still better than deciding between Matrox + Rendition + 3dfx + whoever else all at once if you wanted to play various games in the '90s :banghead:

Thank you for this post. It seems like for every one neutral post on any AMD, NVIDIA or Intel press release there are two from people who read only the first sentence of the article and immediately claim the technology is evil based on some argument that would be disproved if they had read the entire article. If only more people approached these topics from a neutral point of view and took the time to understand what they were criticizing, this forum would be full of intelligent discussion. I guess this is the internet and you can't expect more... (I always wished there was a "No Thanks" button on TPU to mark irrelevant posts, but I can imagine how it would be abused.)

I support technological advancement, whether it comes from NVIDIA, AMD, Intel, or some other company, and G-Sync looks promising. This is the kind of disruptive technology, like AMD's Eyefinity, that no one saw coming, and it will take a few years before everyone else catches up.
 

ItsKlausMF

New Member
Joined
Oct 19, 2013
Messages
1 (0.00/day)
Seriously, AMD is the one suffering from a colossal crap load of dropped frames, tearing, frame interleaving and micro-stuttering in their CF systems; that's why they are trying to mitigate the issue with CF over PCIe.

TrueAudio doesn't mean squat, and the R9 290X is not really the answer to Titan at all; you will see that when the review comes out. The only real advantage for AMD now is Mantle, but it remains to be seen whether they will really get it to shine or not.

On the other hand, NVIDIA has always been the brute force of advancement in the PC space; AMD just plays catch-up, and today's announcements just cement that idea.

NVIDIA was the first to introduce the GPU, SLI, frame pacing (FCAT), GPU Boost and Adaptive V-Sync; first to reach a unified shader architecture on PCs; first with GPGPU, CUDA, PhysX and OptiX (for real-time ray tracing); first with 3D gaming (3D Vision); first with Optimus for mobile flexibility; first with the GeForce Experience program, SLI profiles, ShadowPlay (for recording) and game streaming.

They had better support with CSAA, FXAA, TXAA, driver-side Ambient Occlusion and HBAO+, the TWIMTBP program, better Linux and OpenGL support... and now G-SYNC! All of these innovations are sustained and built upon to this day.

And even when AMD beat them to a certain invention, like Eyefinity, NVIDIA didn't stop until they topped it with more features, so they answered with Surround, then did 3D Surround and now 4K Surround.

NVIDIA was at the forefront in developing all of these technologies and they continue to sustain and expand them; AMD just follows suit. They fight back with stuff that they don't really sustain, so it ends up forgotten and abandoned, even generating more trouble than it's worth. Just look at the pathetic state of their own Eyefinity and all of its CF frame problems.

In short, AMD is the one feeling the heat, NOT NVIDIA. Heck, NVIDIA now fights two generations of AMD cards with only one generation of their own: Fermi held off the HD 5870 and 6970, Kepler held off the 7970 and 290X! And who knows about Maxwell!

Audio does matter, better audio = better experience.

Yeah, NV was first with SLI, but not the first to do it "right". FCAT was just an anti-AMD tool to bottleneck AMD sales. PhysX wasn't even NVIDIA's technology; they just bought it to market better than ATI. Better Linux support? Read this and cry. OpenGL and OpenCL are pretty much AMD territory at the moment.

Then what do you mean by NVIDIA fighting two generations of AMD with only one generation? GF10x (gen 1), GF11x (gen 2). Maxwell is far, FAR away (late 2014), and by then AMD will be hitting their next generation as well.
 
Joined
Dec 16, 2010
Messages
1,668 (0.33/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, Samsung PM981a 1TB, 4 x 4TB + 1 x 10TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
Audio does matter, better audio = better experience.

No argument there. How much TrueAudio actually improves audio is still up for debate until AMD releases drivers for it.

Yeah, NV was first with SLI, but not the first to do it "right".

What in the world does this even mean?

FCAT was just an anti-AMD tool to bottleneck AMD sales.

So you're arguing that we shouldn't have any way to measure frame pacing just because AMD couldn't do it properly?

PhysX wasn't even NVIDIA's technology; they just bought it to market better than ATI.

PhysX was doomed to fail even before NVIDIA took it over. There was no way dedicated physics accelerator cards were going to take off, just as dedicated sound cards had died out in the years prior.

Better Linux support? Read this and cry. OpenGL and OpenCL are pretty much AMD territory at the moment.

I don't know enough about these to make a comment.

Then what do you mean by NVIDIA fighting two generations of AMD with only one generation? GF10x (gen 1), GF11x (gen 2). Maxwell is far, FAR away (late 2014), and by then AMD will be hitting their next generation as well.

Maxwell is speculated to launch on 28nm in early 2014 and then move to 20nm in late 2014, according to reports. And what does it matter how many generations each manufacturer puts out? All that matters is performance per price, which is why I don't care one bit about the rebrands that both sides do, as long as the rebrands push pricing down.
 
Joined
Apr 30, 2012
Messages
3,881 (0.85/day)
Interesting... so G-Sync could be NVIDIA's mobile version of a lowest-end SKU graphics card that doesn't sell, or it might be a defective Tegra 4 from the NVIDIA Shield. /sarcasm

I actually wasn't kidding when I said

I was wondering what Nvidia was going to do with all the unsold Tegra 4 chips

Besides, Nvidia users don't have stuttering or tearing... right guys???


It reminded me of the Tegra 3 short board modules. If they were full boards, they'd be almost twins.





NVIDIA GeForce GT 630M


It's at the lower end for sure, size-wise.
 
Last edited:
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
You clearly have no idea what the point of Adaptive V-sync is. It's to keep your frame rate from reducing to 30 if it ever drops below 60 (which is what it does, intervals of 30). Basically it was a way to get the best of both worlds.

Clearly neither do you, then. Adaptive V-Sync is there to:
a) remove image tearing, and
b) avoid roughly halving the framerate when FPS drops below a certain level, like normal V-Sync does

So, why do we need G-Sync to remove image tearing again?
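
To put the difference in plain terms, here's a rough sketch (my own simplification of double-buffered V-Sync on an assumed 60Hz panel, not NVIDIA's actual code; the function names and numbers are just mine):

```python
# Rough sketch (my simplification, not NVIDIA's code) of how classic
# double-buffered V-Sync quantizes the displayed frame rate on a 60 Hz
# panel, while Adaptive V-Sync switches sync off below the refresh rate.

REFRESH_HZ = 60

def vsync_fps(render_fps: float) -> float:
    """Classic V-Sync: every frame waits for a refresh tick, so the
    displayed rate snaps to 60, 30, 20, 15, ... (refresh / N)."""
    if render_fps >= REFRESH_HZ:
        return float(REFRESH_HZ)
    n = 2
    while REFRESH_HZ / n > render_fps:
        n += 1
    return REFRESH_HZ / n

def adaptive_vsync_fps(render_fps: float):
    """Adaptive V-Sync: sync on at/above 60 fps (no tearing),
    sync off below it (full render rate, but tearing can return)."""
    if render_fps >= REFRESH_HZ:
        return float(REFRESH_HZ), False   # capped, tear-free
    return render_fps, True               # uncapped, may tear

for fps in (75, 59, 45, 31):
    print(f"{fps} fps rendered -> {vsync_fps(fps):.0f} fps with V-Sync, "
          f"{adaptive_vsync_fps(fps)} with Adaptive V-Sync")
```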

Please link me to where it says it won't, cause I'm reaaaaaly curious just from where the BS spring blows. And even if it wouldn't, AMD is in the position to slipstream DX11 HLSL port pretty much every multi-platform game coming out in the next half-decade or so, so having 75% chance that, after a certain point, multi-platform next-generation games can and will be written in Mantle pretty much cements them into API relevancy, w/ or w/o a leading share in the market.

Name one nVidia branded tech that ISN'T proprietary.

Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware-specific. That would be like expecting ATI, S3 and NVIDIA to support Glide natively back in its day.

High-level APIs work that way (but are slower, e.g. Direct3D); low-level ones don't (and are faster as a result, e.g. Mantle or Glide).
 
Joined
Apr 30, 2012
Messages
3,881 (0.85/day)
Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware-specific. That would be like expecting ATI, S3 and NVIDIA to support Glide natively back in its day.

High-level APIs work that way (but are slower, e.g. Direct3D); low-level ones don't (and are faster as a result, e.g. Mantle or Glide).

Anyone remember Glide wrappers?

Back then it was possible, but nowadays it'd be a lawsuit frenzy. That's why there are no CUDA/PhysX wrappers. Slower and buggier, but at least you weren't locked out of the API entirely.

Nvidia G-Sync FAQ

Q: Does NVIDIA G-SYNC work for all games?

A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.

Boo!!!


Q: Does G-SYNC work with FCAT?

A: FCAT requires a video capture card to catch the output graphics stream going to a monitor. Since G-SYNC is DP only and G-SYNC manipulates DP in new ways, it is very unlikely that existing capture cards will work. Fortunately, FRAPS is now an accurate reflection of the performance of a G-SYNC enabled system, and FRAPS output can be directly read by FCAT for comparative processing.

:rolleyes:

Q: How much more does G-SYNC add to the cost of a monitor?

A: The NVIDIA G-SYNC Do-it-yourself kit will cost approximately $175.
 
Last edited:
Joined
Sep 25, 2010
Messages
536 (0.10/day)
Location
Bahrain
System Name Lazyman 9000+
Processor AMD Ryzen 7 3800X
Motherboard X570 Aorus Ultra
Cooling Arctic LF II 280
Memory 2x8GB Crucial Ballistix 3600 MHz DDR4
Video Card(s) PowerColor Red Devil 5700XT
Storage Never Enough
Display(s) ASUS VG248QE+ Dell E2420H
Case LIAN LI Lancool II Mesh RGB
Audio Device(s) Sennheiser HD518 / Logitech Z623
Power Supply FSP AURUM PT 850W
Mouse Logitech G600
Keyboard Logitech G810 Orion Spectrum
Software Win 11
Benchmark Scores All over the bench
This looked like an interesting thing to try as I just bought the VG248QE a while ago, but it looks like it'll cost a kidney and might not be available for me :ohwell: Will see when it's released,
and it looks like I might need a new system too, with a Kepler card :banghead:
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
@Xzibit
Glide wrapper means emulation. Emulation means slow. How exactly does that solve anything? Glide wrappers exist because you could play games that would then look better. Besides, being able to emulate old Voodoo cards doesn't mean you can emulate modern stuff today. Stuff was rather simple back then... Vertex shaders had software emulation, but it was slower. Pixel shaders were not even possible to emulate without getting 1 frame per minute... and neither was available on any Voodoo card...

Mantle is a performance-only thing, and if you negate the boost with emulation, wouldn't it be easier to just stay with Direct3D 11 and remain at square one?
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,550 (2.86/day)
Location
Piteå
System Name White DJ in Detroit
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
You guys make my hatred for booth babes and Bethsoft seem rational and levelheaded (which it totally is, btw). :laugh:

Anyway, wasn't qubit going on about something like this some time ago?
 
Last edited:
Joined
May 13, 2010
Messages
6,065 (1.14/day)
System Name RemixedBeast-NX
Processor Intel Xeon E5-2690 @ 2.9Ghz (8C/16T)
Motherboard Dell Inc. 08HPGT (CPU 1)
Cooling Dell Standard
Memory 24GB ECC
Video Card(s) Gigabyte Nvidia RTX2060 6GB
Storage 2TB Samsung 860 EVO SSD//2TB WD Black HDD
Display(s) Samsung SyncMaster P2350 23in @ 1920x1080 + Dell E2013H 20 in @1600x900
Case Dell Precision T3600 Chassis
Audio Device(s) Beyerdynamic DT770 Pro 80 // Fiio E7 Amp/DAC
Power Supply 630w Dell T3600 PSU
Mouse Logitech G700s/G502
Keyboard Logitech K740
Software Linux Mint 20
Benchmark Scores Network: APs: Cisco Meraki MR32, Ubiquiti Unifi AP-AC-LR and Lite Router/Sw:Meraki MX64 MS220-8P
OK, what the hell does this have to do with Glide wrappers?

Explain it like you are teaching it to people in a class, not like you are rabid fandogs or salespeople.
 
Joined
Oct 9, 2009
Messages
716 (0.13/day)
Location
Finland
System Name RGB-PC v2.0
Processor AMD Ryzen 7950X
Motherboard Asus Crosshair X670E Extreme
Cooling Corsair iCUE H150i RGB PRO XT
Memory 4x16GB DDR5-5200 CL36 G.SKILL Trident Z5 NEO RGB
Video Card(s) Asus Strix RTX 2080 Ti
Storage 2x2TB Samsung 980 PRO
Display(s) Acer Nitro XV273K 27" 4K 120Hz (G-SYNC compatible)
Case Lian Li O11 Dynamic EVO
Audio Device(s) Audioquest Dragon Red + Sennheiser HD 650
Power Supply Asus Thor II 1000W + Cablemod ModMesh Pro sleeved cables
Mouse Logitech G500s
Keyboard Corsair K70 RGB with low profile red cherrys
Software Windows 11 Pro 64-bit
Instead of braving 60+ fps, they aim for 30fps@30Hz hahahahahaha :roll: :slap: :nutkick:


Well, don't try that with a CRT monitor or you'll end up in epileptic convulsions in no time :rockout:

This is coming to 144Hz monitors too, meaning they aim to sync everything between 30 and 144fps at up to 144Hz (the refresh rate varies with fps). You obviously didn't read a thing about this, and instead came in raging like any typical NVIDIA hater would.

No matter what this company releases, it is always going to get this same hatred. Doesn't matter what the greatest minds of game development think.

The G-SYNC board will support LightBoost too.
http://www.neogaf.com/forum/showpost.php?p=86572603&postcount=539

edit:
30Hz is the minimum; it's variable between 30Hz and 144Hz (the obvious max limit)

https://twitter.com/ID_AA_Carmack/status/391300283672051713
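
Roughly, what that means in practice (my own sketch, assuming the 30Hz floor and 144Hz ceiling from the links above; the function name and numbers are just mine):

```python
# My own sketch of the claim above: with G-SYNC the refresh rate follows
# the render rate, clamped to an assumed 30 Hz floor and 144 Hz ceiling.

MIN_HZ, MAX_HZ = 30.0, 144.0

def panel_refresh_hz(render_fps: float) -> float:
    """Refresh rate the panel would run at for a given render rate."""
    return max(MIN_HZ, min(render_fps, MAX_HZ))

for fps in (200, 144, 90, 47, 30, 12):
    print(f"{fps:>3} fps rendered -> panel refreshes at {panel_refresh_hz(fps):.0f} Hz")
```

As far as has been described, below the 30Hz floor the previous frame just gets scanned out again rather than the panel being driven any slower.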
 
Last edited:
Joined
Sep 6, 2013
Messages
3,328 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
I am full with AMD and I hate Nvidia with their locks, but this really looks interesting and I believe AMD could follow in the future with something similar (who doesn't like to sell more hardware?).
 

Am*

Joined
Nov 1, 2011
Messages
332 (0.07/day)
System Name 3D Vision & Sound Blaster
Processor Intel Core i5 2500K @ 4.5GHz (stock voltage)
Motherboard Gigabyte P67A-D3-B3
Cooling Thermalright Silver Arrow SB-E Special Edition (with 3x 140mm Black Thermalright fans)
Memory Crucial Ballistix Tactical Tracer 16GB (2x8GB 1600MHz CL8)
Video Card(s) Nvidia GTX TITAN X 12288MB Maxwell @1350MHz
Storage 6TB of Samsung SSDs + 12TB of HDDs
Display(s) LG C1 48 + LG 38UC99 + Samsung S34E790C + BenQ XL2420T + PHILIPS 231C5TJKFU
Case Fractal Design Define R4 Windowed with 6x 140mm Corsair AFs
Audio Device(s) Creative SoundBlaster Z SE + Z906 5.1 speakers/DT 990 PRO
Power Supply Seasonic Focus PX 650W 80+ Platinum
Mouse Logitech G700s
Keyboard CHERRY MX-Board 1.0 Backlit Silent Red Keyboard
Software Windows 7 Pro (RIP) + Winbloat 10 Pro
Benchmark Scores 2fast4u,bro...
This is coming to 144Hz monitors too, meaning they aim to sync everything between 30 and 144fps at up to 144Hz (the refresh rate varies with fps). You obviously didn't read a thing about this, and instead came in raging like any typical NVIDIA hater would.

No matter what this company releases, it is always going to get this same hatred. Doesn't matter what the greatest minds of game development think.

https://twitter.com/ID_AA_Carmack/status/391300283672051713

No, it doesn't, especially since they're in Nvidia's pockets being paid to spread testimonials about useless "tech" like this, if you can even call it that.

Really? You think bilateral communication with the monitor, with the ability to control the frame rate delivery of the GPU so it's completely in sync with the monitor, can just be easily implemented via a firmware update?

The official white paper hasn't even been released yet and you have the gall to make such an inaccurate and ballsy statement as that. What standard do you think it can already use? There isn't a display protocol that is capable of doing that. From what I've read on Anandtech, it's a modification of the DisplayPort standard - meaning non-standard - meaning it can't be implemented on ANY of today's monitors without that specific hardware bundle.

Give me one reason why it can't work the way I stated. The only thing Nvidia would be required to do is modify Adaptive V-Sync to keep in step with the monitor's refresh rate, and there would be nothing stopping AMD from developing a similar method. Fifty bucks says DisplayPort or one of the other display standards bodies comes up with a non-proprietary way of doing this in the next 5 years.

Nvidia G-Sync FAQ

Q: How much more does G-SYNC add to the cost of a monitor?

A: The NVIDIA G-SYNC Do-it-yourself kit will cost approximately $175.

Q: Does NVIDIA G-SYNC work for all games?

A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.



Wait, what... $175 for more useless bullshit with a support list of 10-20 games that will never even exceed the refresh rate of a 120Hz monitor unless they run at settings lower than a console's garbage settings, and with 95% of the time needing to turn off the "exclusive" shit you've paid out the ass for. :banghead: And people are supporting this shit... if Nvidia (not directed at you BTW) doesn't have the most braindead and loyal fanboys, I don't know what other company does (other than Apple).

Just because we didn't end up with a perfect or open world doesn't mean we should destroy it; this is still better than deciding between Matrox + Rendition + 3dfx + whoever else all at once if you wanted to play various games in the '90s :banghead:

I would take those days over the price-fixing bullshit Nvidia is playing now any day -- at least competition was fierce back then, and you still had various ways to run proprietary tech yourself (with Glide wrappers etc.).
 
Last edited:
Joined
Sep 28, 2012
Messages
980 (0.22/day)
System Name Poor Man's PC
Processor waiting for 9800X3D...
Motherboard MSI B650M Mortar WiFi
Cooling Thermalright Phantom Spirit 120 with Arctic P12 Max fan
Memory 32GB GSkill Flare X5 DDR5 6000Mhz
Video Card(s) XFX Merc 310 Radeon RX 7900 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Xiaomi G Pro 27i MiniLED + AOC 22BH2M2
Case Asus A21 Case
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?
I support technological advancement, whether it comes from NVIDIA, AMD, Intel, or some other company, and G-Sync looks promising. This is the kind of disruptive technology, like AMD's Eyefinity, that no one saw coming, and it will take a few years before everyone else catches up.

Do you remember the monitor-related technology nVidia introduced around 2010? Yep... 3D Vision, which required very expensive 3D shutter glasses, an emitter and a monitor approved by nVidia. I got one pair of glasses, one emitter and a Samsung 223RZ 22-inch LCD for the price of $800, not to mention another $350 for a GTX 470. As far as I remember, 3D only "worked" on specific nVidia cards and specific monitors. Later I learned nVidia was just adopting 3D FPR from LG and bringing it to the desktop. For the same $1200 I switched to an HD 5850 + a 42-inch LG 240Hz and got the same effect. Meanwhile, a 3D Vision kit plus an Asus VG236H will cost you $650 and only works with a high-end GTX, or you can grab a $250 LG D2343P-BN and pair it with any graphics card out there. Where are those "3D gaming is the future" claims now?

Personally, I don't hate nVidia for their "innovation" or "breakthrough" technology. They push gaming to a new level, creating a better experience and better enjoyment. I just hate their prices and the so-called "proprietary" tech.

Audio does matter, better audio = better experience.
Yeah, NV was first with SLI, but not the first to do it "right". FCAT was just an anti-AMD tool to bottleneck AMD sales.

A better DSP doesn't translate to better audio on its own; it's only a small part of the chain. You also need a better amplifier that can handle the DSP's digital signal and better speakers to reproduce the analog signal from the amplifier.
FCAT was surely anti-AMD, because AMD uses AFR rather than nVidia's SFR. Look at the bright side: AMD is now working on better drivers to address the frame pacing issues.
 
Joined
Sep 19, 2012
Messages
615 (0.14/day)
System Name [WIP]
Processor Intel Pentium G3420 [i7-4790K SOON(tm)]
Motherboard MSI Z87-GD65 Gaming
Cooling [Corsair H100i]
Memory G.Skill TridentX 2x8GB-2400-CL10 DDR3
Video Card(s) [MSI AMD Radeon R9-290 Gaming]
Storage Seagate 2TB Desktop SSHD / [Samsung 256GB 840 PRO]
Display(s) [BenQ XL2420Z]
Case [Corsair Obsidian 750D]
Power Supply Corsair RM750
Software Windows 8.1 x64 Pro / Linux Mint 15 / SteamOS
I don't know enough about these to make a comment

Ha! Probably the smartest comment in this thread so far. You deserve a medal.
I know how hard I try to shut up about things I don't know much about yet.


Well, AMD does say you need GCN-powered hardware to use it, and NVIDIA doesn't have that at all. And since it's a low-level API, it is very hardware-specific. That would be like expecting ATI, S3 and NVIDIA to support Glide natively back in its day.

I know what you're saying but:

1. Kepler might not support it, but Maxwell or Volta at the very least could. GPUs are way more complex and adaptable creatures than they used to be back then... and even then:

2. You could use wrappers to run Glide on non-3dfx hardware...

Ah, I still remember how I finally got Carmageddon 2 to work with a Glide wrapper on nV hardware; a world of difference in graphics...
 
Joined
Aug 16, 2004
Messages
3,285 (0.44/day)
Location
Sunny California
Processor AMD Ryzen 7 9800X3D
Motherboard Gigabyte Aorus X870E Elite
Cooling Asus Ryujin II 360 EVA Edition
Memory 4x16GBs DDR5 6000MHz Corsair Vengeance
Video Card(s) Zotac RTX 4090 AMP Extreme Airo
Storage 2TB Samsung 990 Pro OS - 4TB Nextorage G Series Games - 8TBs WD Black Storage
Display(s) LG C2 OLED 42" 4K 120Hz HDR G-Sync enabled TV
Case Asus ROG Helios EVA Edition
Audio Device(s) Denon AVR-S910W - 7.1 Klipsch Dolby ATMOS Speaker Setup - Audeze Maxwell
Power Supply beQuiet Straight Power 12 1500W
Mouse Asus ROG Keris EVA Edition - Asus ROG Scabbard II EVA Edition
Keyboard Asus ROG Strix Scope EVA Edition
VR HMD Samsung Odyssey VR
Software Windows 11 Pro 64bit
I really hope nVidia releases a version of this tech that works with existing monitors, something like an HDMI dongle between the monitor and the graphics card, and makes it hardware agnostic please (though I don't see that last part happening...)

Most PC gamers have invested a lot of moola in their monitors (myself included), and if nVidia really wants to see this tech take off, so to speak, they must make it available to as many customers as they can, not just people who invest in new monitors starting next year...
 
Joined
May 29, 2012
Messages
531 (0.12/day)
System Name CUBE_NXT
Processor i9 12900K @ 5.0Ghz all P-cores with E-cores enabled
Motherboard Gigabyte Z690 Aorus Master
Cooling EK AIO Elite Cooler w/ 3 Phanteks T30 fans
Memory 64GB DDR5 @ 5600Mhz
Video Card(s) EVGA 3090Ti Ultra Hybrid Gaming w/ 3 Phanteks T30 fans
Storage 1 x SK Hynix P41 Platinum 1TB, 1 x 2TB, 1 x WD_BLACK SN850 2TB, 1 x WD_RED SN700 4TB
Display(s) Alienware AW3418DW
Case Lian-Li O11 Dynamic Evo w/ 3 Phanteks T30 fans
Power Supply Seasonic PRIME 1000W Titanium
Software Windows 11 Pro 64-bit
If NV did this in the range of 140-85Hz, I'd call that a gamer's benefit :toast:
But dropping frames to what? 5fps@5Hz and calling it a holy grail????? :confused::twitch:
Gimme a f*cking break! :banghead:

I'm too old to jump at this bait, so I'm kinda happy I can't afford an NVIDIA g-card :laugh:

It won't drop below 30, you dunce. You could have at least spent 5 minutes reading what it does. Clearly, none of you have.

Give me one reason why it can't work the way I stated. The only thing Nvidia would be required to do is modify Adaptive V-Sync to keep in step with the monitor's refresh rate, and there would be nothing stopping AMD from developing a similar method. Fifty bucks says DisplayPort or one of the other display standards bodies comes up with a non-proprietary way of doing this in the next 5 years.

Are you, like, intentionally playing dumb, or are you just not getting this? The refresh rate of the monitor will always be 60Hz; there is no way right now to modulate it dynamically, on the fly. Period. End of story. V-Sync attempts to lock your frames to 60Hz so it doesn't induce screen tearing, because the monitor operates at 60Hz. You can change it, but not dynamically. It doesn't work that way. But you can still get some screen tearing and, most importantly, you can get input lag because the GPU is rendering frames faster than the monitor can handle.

Adaptive V-Sync simply removes the restriction on frame rates when they drop below 60. So instead of hard-locking your game to either 60FPS or 30FPS, if it drops below, it simply reacts as if V-Sync weren't enabled, so it can run at 45FPS instead of being locked to intervals of 30. Again, this has nothing to do with the monitor and cannot control it. Why do you even think Adaptive V-Sync can change the monitor's refresh rate? How do you expect Adaptive V-Sync to change the monitor's refresh rate - on the fly - to 45Hz? It doesn't, and it physically can't. There is no standard that allows that to happen. There is no protocol that allows that to happen.

That is what G-Sync is. Maybe one of the consortiums can come up with a hardware-agnostic standard... at some point. We don't know when, or if any of the consortiums even care. So please, for the love of god, stop making baseless (and wholly inaccurate and ignorant) assumptions. There isn't a single freaking monitor these days that works the way you describe, and no firmware update is going to change that. Ever.
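
If it helps, here's a toy simulation (my own numbers and assumptions, not NVIDIA's) of the one thing that actually changes: how long a finished frame sits around before the panel shows it, with V-Sync on a fixed 60Hz display versus a display whose refresh follows the GPU (assuming it can be driven no faster than 144Hz, and made-up 45-90fps frame times):

```python
# Toy simulation (my own numbers, not NVIDIA's) of display latency only:
# how long a finished frame waits before scan-out on a fixed 60 Hz panel
# versus a panel whose refresh follows the GPU (assumed 144 Hz ceiling).
# It ignores the GPU being throttled by the swap chain; it only models
# the gap between "frame done" and "frame shown".

import random

TICK = 1 / 60            # fixed-refresh interval, seconds
MIN_INTERVAL = 1 / 144   # fastest the variable panel can be driven

def wait_for_next_tick(finish_time: float) -> float:
    """Seconds a frame sits in the buffer until the next 60 Hz tick."""
    next_tick = (int(finish_time / TICK) + 1) * TICK
    return next_tick - finish_time

random.seed(0)
t, last_scan, fixed_waits, vrr_waits = 0.0, 0.0, [], []
for _ in range(1000):
    t += random.uniform(1 / 90, 1 / 45)          # irregular 45-90 fps frame times
    fixed_waits.append(wait_for_next_tick(t))
    scan = max(t, last_scan + MIN_INTERVAL)      # variable refresh: show when ready
    vrr_waits.append(scan - t)
    last_scan = scan

print(f"average wait, fixed 60 Hz     : {1000 * sum(fixed_waits) / len(fixed_waits):.2f} ms")
print(f"average wait, variable refresh: {1000 * sum(vrr_waits) / len(vrr_waits):.2f} ms")
```

With these made-up numbers the fixed display averages roughly half a refresh interval of waiting, while the variable-refresh panel shows frames essentially as soon as they're done.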
 
Last edited:
Joined
Jun 11, 2013
Messages
353 (0.08/day)
Processor Core i5-3350P @3.5GHz
Motherboard MSI Z77MA-G45 (uATX)
Cooling Stock Intel
Memory 2x4GB Crucial Ballistix Tactical DDR3 1600
Video Card(s) |Ξ \/ G /\ GeForce GTX 670 FTW+ 4GB w/Backplate, Part Number: 04G-P4-3673-KR, ASIC 68.5%
Storage some cheap seagates and one wd green
Display(s) Dell UltraSharp U2412M
Case some cheap old eurocase, black, with integrated cheap lit lcd for basic monitoring
Audio Device(s) Realtek ALC892
Power Supply Enermax Triathlor 550W ETA550AWT bronze, non-modular, airflow audible over 300W power draw
Mouse PMSG1G
Keyboard oldschool membrane Keytronic 104 Key PS/2 (big enter, right part of right shift broken into "\" key)
edit: 30Hz is the minimum; it's variable between 30Hz and 144Hz (the obvious max limit)
https://twitter.com/ID_AA_Carmack/status/391300283672051713

Imagine a group of players playing at 120fps and a group struggling at 30-50fps... who's going to get more frags? The first group can react after 1/120th of a second, whereas the other only after 1/30th to 1/50th... their reactions will be twice as slow. Where's the benefit for gamers? And let me repeat the crucial thing: nvidia wouldn't dare to present this on a CRT monitor. And as for the Carmack link you posted: if you move your view so that the pixels need to be refreshed faster than once every 1/30th of a second, it sends a duplicate frame which doesn't contain updated information about the scene, e.g. a player leaning out from a corner. This can lead to up to four identical frames sent to you, which makes for up to 4*1/30 ≈ 133 milliseconds of delayed response on your side versus the 120fps guy. Is that clearer now? This technology is nvidia's sorry excuse for being too lazy to make graphics cards that are able to render 60+ fps at 1600p+ resolution. Nothing more, nothing less. So now you see why I'm upset. And they are able to sell this crap even to you so smoothly. Unbelievable.
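
To spell out the arithmetic I'm using there (my own worst-case assumptions: a 30Hz floor and up to four repeated frames in a row):

```python
# Just the arithmetic from my post, under my own worst-case assumptions
# (a 30 Hz refresh floor and up to four repeated frames in a row).

floor_hz = 30
repeated_frames = 4

stale_time_ms = repeated_frames * (1 / floor_hz) * 1000   # 4 * 1/30 s
frame_time_120fps_ms = 1 / 120 * 1000

print(f"up to {stale_time_ms:.0f} ms showing an unchanged image")   # ~133 ms
print(f"one frame at 120 fps is {frame_time_120fps_ms:.1f} ms")     # ~8.3 ms
```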



It won't drop below 30, you dunce. You could have at least spent 5 minutes reading what it does. Clearly, none of you have.

Thanks. As soon as I realized what this technology is (and I realized it as soon as they started talking about lowering the refresh frequency of the monitor), I wasn't exactly in the mood to start searching for where the lowest acceptable limit is for those guys... I'm inclined to think that they really have no limit if it boils down to getting money from you, lol!
 
Last edited:
Joined
May 29, 2012
Messages
531 (0.12/day)
System Name CUBE_NXT
Processor i9 12900K @ 5.0Ghz all P-cores with E-cores enabled
Motherboard Gigabyte Z690 Aorus Master
Cooling EK AIO Elite Cooler w/ 3 Phanteks T30 fans
Memory 64GB DDR5 @ 5600Mhz
Video Card(s) EVGA 3090Ti Ultra Hybrid Gaming w/ 3 Phanteks T30 fans
Storage 1 x SK Hynix P41 Platinum 1TB, 1 x 2TB, 1 x WD_BLACK SN850 2TB, 1 x WD_RED SN700 4TB
Display(s) Alienware AW3418DW
Case Lian-Li O11 Dynamic Evo w/ 3 Phanteks T30 fans
Power Supply Seasonic PRIME 1000W Titanium
Software Windows 11 Pro 64-bit
Then maybe you shouldn't have made a hyperbolic inaccurate statement on something you knew nothing about. You know, like most people would.

I've never set up a WC loop, so you don't see me going into sub-forums and threads about setting up WC loops and throwing around BS beliefs on something I clearly know nothing about.
 
Joined
Aug 16, 2004
Messages
3,285 (0.44/day)
Location
Sunny California
Processor AMD Ryzen 7 9800X3D
Motherboard Gigabyte Aorus X870E Elite
Cooling Asus Ryujin II 360 EVA Edition
Memory 4x16GBs DDR5 6000MHz Corsair Vengeance
Video Card(s) Zotac RTX 4090 AMP Extreme Airo
Storage 2TB Samsung 990 Pro OS - 4TB Nextorage G Series Games - 8TBs WD Black Storage
Display(s) LG C2 OLED 42" 4K 120Hz HDR G-Sync enabled TV
Case Asus ROG Helios EVA Edition
Audio Device(s) Denon AVR-S910W - 7.1 Klipsch Dolby ATMOS Speaker Setup - Audeze Maxwell
Power Supply beQuiet Straight Power 12 1500W
Mouse Asus ROG Keris EVA Edition - Asus ROG Scabbard II EVA Edition
Keyboard Asus ROG Strix Scope EVA Edition
VR HMD Samsung Odyssey VR
Software Windows 11 Pro 64bit
Imagine a group of players playing at 120fps and a group struggling at 30-50fps... who's going to get more frags? The first group can react after 1/120th of a second, whereas the other only after 1/30th to 1/50th... their reactions will be twice as slow. Where's the benefit for gamers? And let me repeat the crucial thing: nvidia wouldn't dare to present this on a CRT monitor. And as for the Carmack link you posted: if you move your view so that the pixels need to be refreshed faster than once every 1/30th of a second, it sends a duplicate frame which doesn't contain updated information about the scene, e.g. a player leaning out from a corner. This can lead to up to four identical frames sent to you, which makes for up to 4*1/30 ≈ 133 milliseconds of delayed response on your side versus the 120fps guy. Is that clearer now? This technology is nvidia's sorry excuse for being too lazy to make graphics cards that are able to render 60+ fps at 1600p+ resolution. Nothing more, nothing less. So now you see why I'm upset. And they are able to sell this crap even to you so smoothly. Unbelievable.

Yes! How can they offer more options to gamers?! What impertinence! Let's make sure that everyone can only buy graphics cards and monitors that are capped at 30FPS so no one can have an unfair advantage; who needs more alternatives anyway?!!

Who do these people think they are, offering innovation in this field?? Let's boycott Nvidia and burn all their engineers at the stake! And then to have the audacity to sell this technology and make a profit! How dare they?!!

/S :rolleyes:
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,550 (2.86/day)
Location
Piteå
System Name White DJ in Detroit
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
@haswrong: Isn't that how it is now though?
 
Joined
Jun 11, 2013
Messages
353 (0.08/day)
Processor Core i5-3350P @3.5GHz
Motherboard MSI Z77MA-G45 (uATX)
Cooling Stock Intel
Memory 2x4GB Crucial Ballistix Tactical DDR3 1600
Video Card(s) |Ξ \/ G /\ GeForce GTX 670 FTW+ 4GB w/Backplate, Part Number: 04G-P4-3673-KR, ASIC 68.5%
Storage some cheap seagates and one wd green
Display(s) Dell UltraSharp U2412M
Case some cheap old eurocase, black, with integrated cheap lit lcd for basic monitoring
Audio Device(s) Realtek ALC892
Power Supply Enermax Triathlor 550W ETA550AWT bronze, non-modular, airflow audible over 300W power draw
Mouse PMSG1G
Keyboard oldschool membrane Keytronic 104 Key PS/2 (big enter, right part of right shift broken into "\" key)
@haswrong: Isn't that how it is now though?

Nvidia said that gamers will love it... this sounds more like notebook gamers will love it... Are you ready to ditch your $1300 monitor for an even more expensive one with a variable refresh rate, which doesn't make your response any faster?

OK, I have to give you one thing: you can now watch your ass being fragged without stutter :D



...the audacity to sell this technology and make a profit! How dare they?!!
/S :rolleyes:

Exactly... if I decide to please someone, I do it for free...

Remember when your momma asked you to take out the trash back then? Did you ask for like $270 for the service? The whole world will soon be eaten by the false religion that is the economy, you just wait.
 