
NVIDIA Sacrifices VESA Adaptive Sync Tech to Rake in G-SYNC Royalties

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,230 (7.55/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
NVIDIA's G-SYNC technology is rivaled by AMD's Project FreeSync, which is based on Adaptive-Sync, a technology standardized by the Video Electronics Standards Association (VESA). The technology lets GPUs and monitors keep the display's refresh rate in sync with the GPU's frame rate, so the resulting output appears fluid. VESA's technology requires no special hardware inside standards-compliant monitors and is royalty-free. NVIDIA G-SYNC, by contrast, relies on specialized hardware that display makers must source from NVIDIA, which amounts to a de facto royalty.

When asked by Chinese publication Expreview whether NVIDIA GPUs will support VESA Adaptive-Sync, the company said it wants to focus on G-SYNC. A case in point is the display connector loadout of the recently launched GeForce GTX 980 and GTX 970. According to specifications listed on NVIDIA's website, the two feature DisplayPort 1.2 connectors, not DisplayPort 1.2a, which VESA's new technology requires. AMD's year-old Radeon R9 and R7 GPUs, on the other hand, support DisplayPort 1.2a, casting suspicion on NVIDIA's choice of connectors. Interestingly, the GTX 980 and GTX 970 feature HDMI 2.0, so it's not as if NVIDIA is slow to catch up with new standards. Did NVIDIA leave out DisplayPort 1.2a in a deliberate attempt to check Adaptive-Sync?
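The difference being argued over can be illustrated with a toy model. This is only a sketch under simplified assumptions (the function names, refresh rates, and panel limits here are hypothetical, not any vendor's implementation): with a fixed-refresh display and V-Sync, a frame that misses a refresh tick waits for the next one, while with adaptive sync the panel refreshes as soon as the frame is ready, within its supported range.

```python
def fixed_vsync_display_time(frame_ready_ms, refresh_hz=60):
    """Fixed refresh + V-Sync: the frame appears at the next refresh tick
    at or after the moment it is ready."""
    interval = 1000.0 / refresh_hz
    ticks = -(-frame_ready_ms // interval)  # ceiling division
    return ticks * interval

def adaptive_sync_display_time(frame_ready_ms, min_hz=30, max_hz=144):
    """Adaptive sync: the frame appears when it is ready, clamped to the
    panel's supported refresh window."""
    min_interval = 1000.0 / max_hz  # can't refresh faster than max_hz
    max_interval = 1000.0 / min_hz  # must refresh before min_hz elapses
    return min(max(frame_ready_ms, min_interval), max_interval)

# A frame ready at 20 ms (50 fps) misses the 16.7 ms tick on a fixed 60 Hz
# display and waits until the 33.3 ms tick; with adaptive sync it shows at 20 ms.
print(round(fixed_vsync_display_time(20), 1))   # 33.3
print(adaptive_sync_display_time(20))           # 20
```

The stutter both technologies target comes from exactly this quantization: on a fixed-refresh panel a 20 ms frame is displayed no differently from a 33 ms frame.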



View at TechPowerUp Main Site
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,042 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX~’850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64


:shadedshu: :rolleyes:
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,995 (0.34/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA) / NVIDIA RTX 4090 Founder's Edition
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case HYTE Hakos Baelz Y60
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Wooting 60HE+ / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Occulus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.4317
From the article, it sounds like NVIDIA is actively blocking FreeSync, when in fact the issue is simply that the 900 series lacks DisplayPort 1.2a support, which FreeSync requires.
 

CookieMonsta

New Member
Joined
Jun 19, 2014
Messages
11 (0.00/day)
Utterly stupid move by NVIDIA if it proves to be true. History has not been kind to proprietary technology; NVIDIA does not want its competitors to unite over a common standard, lest it become marginalized by the LCD manufacturers.
 
Joined
Dec 16, 2010
Messages
1,668 (0.33/day)
Location
State College, PA, US
System Name My Surround PC
Processor AMD Ryzen 9 7950X3D
Motherboard ASUS STRIX X670E-F
Cooling Swiftech MCP35X / EK Quantum CPU / Alphacool GPU / XSPC 480mm w/ Corsair Fans
Memory 96GB (2 x 48 GB) G.Skill DDR5-6000 CL30
Video Card(s) MSI NVIDIA GeForce RTX 4090 Suprim X 24GB
Storage WD SN850 2TB, Samsung PM981a 1TB, 4 x 4TB + 1 x 10TB HGST NAS HDD for Windows Storage Spaces
Display(s) 2 x Viotek GFI27QXA 27" 4K 120Hz + LG UH850 4K 60Hz + HMD
Case NZXT Source 530
Audio Device(s) Sony MDR-7506 / Logitech Z-5500 5.1
Power Supply Corsair RM1000x 1 kW
Mouse Patriot Viper V560
Keyboard Corsair K100
VR HMD HP Reverb G2
Software Windows 11 Pro x64
Benchmark Scores Mellanox ConnectX-3 10 Gb/s Fiber Network Card
I wouldn't jump to conclusions until there actually are monitors supporting adaptive refresh in the market. It's completely normal for companies to avoid mentioning and outright deny upcoming features/products in order to avoid cannibalizing sales of current products.
 
Joined
Apr 17, 2014
Messages
231 (0.06/day)
System Name 14900KF
Processor i9-14900KF
Motherboard ROG Z790-Apex
Cooling Custom water loop: D5
Memory G-SKill 7200 DDR5
Video Card(s) RTX 4080
Storage M.2 and Sata SSD's
Display(s) LG 4K OLED GSYNC compatible
Case Fractal Mesh
Audio Device(s) sound blaster Z
Power Supply Corsair 1200i
Mouse Logitech HERO G502
Keyboard Corsair K70R cherry red
Software Win11
Benchmark Scores bench score are for people who don't game.
G-Sync works,

G-Sync is awesome

G-Sync is here now.


Nvidia is pushing technology forward with their ingenuity and innovations.

AMD needs to start making some leaps and strides if they expect to survive.

I wish AMD was pushing technology more; then there would be actual competition between red/green, pushing performance and technology at a faster rate instead of the snail-like pace of the past 6+ years.
 
Joined
Jan 31, 2012
Messages
162 (0.03/day)
The fact that they support HDMI 2.0 (the first GPU to support that standard, which is the latest HDMI revision), yet refuse to implement the latest DisplayPort standard, makes the whole thing stink. Mother****ers!
 
Joined
Jun 13, 2012
Messages
1,388 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
I read somewhere that at a closed-door meeting it was said the 900 series can do 1.2a; all it needs is a software update.
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
We change graphics cards way more often than monitors. So being stuck with a single graphics card brand because the monitor supports only G-Sync is dumb. But if you have a FreeSync-enabled monitor, you are free to choose whichever graphics card brand suits the price/performance best. G-Sync monitor users are stuck with NVIDIA whether they like it or not (if they want HW adaptive sync).
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
I read somewhere that at a closed-door meeting it was said the 900 series can do 1.2a; all it needs is a software update.

Sure, and all I need is a software update to convert my VGA port into DisplayPort... Things don't work that way with connectors and the standards associated with them.
 
Joined
Mar 6, 2012
Messages
569 (0.12/day)
Processor i5 4670K - @ 4.8GHZ core
Motherboard MSI Z87 G43
Cooling Thermalright Ultra-120 *(Modded to fit on this motherboard)
Memory 16GB 2400MHZ
Video Card(s) HD7970 GHZ edition Sapphire
Storage Samsung 120GB 850 EVO & 4X 2TB HDD (Seagate)
Display(s) 42" Panasonice LED TV @120Hz
Case Corsair 200R
Audio Device(s) Xfi Xtreme Music with Hyper X Core
Power Supply Cooler Master 700 Watts
G-Sync works,

G-Sync is awesome

G-Sync is here now.


Nvidia is pushing technology forward with their ingenuity and innovations.

AMD needs to start making some leaps and strides if they expect to survive.

I wish AMD was pushing technology more; then there would be actual competition between red/green, pushing performance and technology at a faster rate instead of the snail-like pace of the past 6+ years.

I totally agree, AMD are light years behind Nvidia and should kiss Nvidia's feet for coming up with G-Sync. Nvidia has every right to charge everyone for this tech, and btw, how dare AMD copy G-Sync and call it FreeSync; such dumb asses. /If I have to put a sarcasm tag here, I seriously feel bad for you.
 

dansergiu

New Member
Joined
Jan 16, 2014
Messages
2 (0.00/day)
This doesn't say that Nvidia decided not to support Adaptive Sync. It simply says that they are focusing on G-Sync. Also, it's unlikely that they will not support it, since it's part of the standard; most likely they will market G-Sync as the better solution, at least for a while.

Anyway, my point is that the title of the article is slightly misleading; it looks more like clickbait from a tabloid than an article in a tech publication, to be honest.
 
Joined
Jan 13, 2011
Messages
221 (0.04/day)
This doesn't say that Nvidia decided not to support Adaptive Sync. It simply says that they are focusing on G-Sync. Also, it's unlikely that they will not support it, since it's part of the standard; most likely they will market G-Sync as the better solution, at least for a while.

Anyway, my point is that the title of the article is slightly misleading; it looks more like clickbait from a tabloid than an article in a tech publication, to be honest.
G-Sync is currently the better product. It's an actually buyable product, it can even do QHD at 144 Hz right now, and it works pretty reliably. These are all things we'll have to assess with FreeSync: they are not the same technology, they are two different approaches, and we have yet to see FreeSync in the erratic frame-rate environment of games. That said, it's perfect for watching video, which is unlikely to see the frame-rate drops and rises a game would.
 
Joined
Feb 8, 2008
Messages
2,667 (0.43/day)
Location
Switzerland
Processor i9 9900KS ( 5 Ghz all the time )
Motherboard Asus Maximus XI Hero Z390
Cooling EK Velocity + EK D5 pump + Alphacool full copper silver 360mm radiator
Memory 16GB Corsair Dominator GT ROG Edition 3333 Mhz
Video Card(s) ASUS TUF RTX 3080 Ti 12GB OC
Storage M.2 Samsung NVMe 970 Evo Plus 250 GB + 1TB 970 Evo Plus
Display(s) Asus PG279 IPS 1440p 165Hz G-sync
Case Cooler Master H500
Power Supply Asus ROG Thor 850W
Mouse Razer Deathadder Chroma
Keyboard Rapoo
Software Win 10 64 Bit
First of all, we must see if FreeSync works the same as or better than G-Sync; then we can talk...

For now, G-Sync is already a reality and is proven to work flawlessly.
 
Joined
Jun 13, 2012
Messages
1,388 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
Sure, and all I need is a software update to convert my VGA port into DisplayPort... Things don't work that way with connectors and the standards associated with them.

It could be just as simple as that. Why say you support Adaptive Sync when there are zero monitors out and, according to the press release, there won't be until at least Q1 for some?
"Today, AMD announced collaborations with scaler vendors MStar, Novatek and Realtek to build scaler units ready with DisplayPort™ Adaptive-Sync and AMD's Project FreeSync by year end."
http://ir.amd.com/phoenix.zhtml?c=74093&p=irol-newsArticle&ID=1969277

That is from AMD's press release. If they say end of year, it sounds like no monitor until near the end of Q1. As others said, there is no reason to say you support something that isn't out when you have a working product now, just to kill off your own sales, when we don't even know if Adaptive Sync will give any benefit to games or if that is limited to AMD proprietary code.


First of all, we must see if FreeSync works the same as or better than G-Sync; then we can talk...

For now, G-Sync is already a reality and is proven to work flawlessly.

I agree with that.

G-Sync is currently the better product. It's an actually buyable product, it can even do QHD at 144 Hz right now, and it works pretty reliably. These are all things we'll have to assess with FreeSync: they are not the same technology, they are two different approaches, and we have yet to see FreeSync in the erratic frame-rate environment of games. That said, it's perfect for watching video, which is unlikely to see the frame-rate drops and rises a game would.

They are two different approaches; we know well that G-Sync works, but we have yet to see how AMD's works. I said this before: I don't take AMD's claims at face value. "Prove it works like you say," and then I will give them the credit they are due.
 
Last edited:
Joined
Sep 6, 2013
Messages
3,328 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
When I posted about this one week ago, I was either ignored or treated as someone talking conspiracy theories from a very negative perspective. Whether we like or hate Nvidia, we know that they stick with their proprietary standards, especially when they have the upper hand in the market. There is nothing strange about them implementing 1.2 and not 1.2a; it's what they have been doing for years. OpenCL is another example, with even the 900 series cards supporting only OpenCL 1.1, if I am not mistaken, when AMD had been supporting 1.2 for years. OpenCL 1.2 is 3 years old.

It's not negative posting, or bashing, or conspiracy theories, and there is nothing strange here. It is business as usual for Nvidia.
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,047 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
It's very simple. Nvidia have worked on G-Sync; whether you want it or not, they have a hardware implementation to keep frame delivery in step from GPU to monitor.
If you want it, you buy Nvidia's product. If you don't want it, you buy AMD's product. Nvidia have pushed a business model to increase profit, to please shareholders.
Nvidia is a business, not a charity; it has zero obligation to work along 'free' business models. It has arguably spent a lot on R&D and builds a very capable vanilla card.
If you do not like what they do, you have no need to buy their products. Buy AMD instead.
By all review accounts, on the whole, G-Sync works like a dream. Why, as a private company, would they support a free or cheaper version?

If people start seeing Nvidia and AMD as businesses and not charities, a lot of arguments and misplaced anger venting could be avoided.
 
Joined
Jul 16, 2014
Messages
26 (0.01/day)
Location
Sydney, Australia
System Name Da Swift
Processor Intel i7 4770K
Motherboard ASUS Maximus VI Formula
Cooling Corsair H80i
Memory Kingston 16GB 1600Mhz
Video Card(s) 2x MSI R9 290X Gaming 4G OC (Crossfire)
Storage Samsung 840 EVO 500 GB (OS) Rapid, Samsung 840 EVO 256GB
Display(s) ASUS Swift PG278Q
Case Antec 900 II V3
Audio Device(s) Astro A50 Headset, Logitech Speakers
Power Supply Seasonic 1000w 80 Plus Platnum
Software Windows 8.1 64bit (Classic Shell - Windows 7 Theme)
Benchmark Scores 3DMark Fire Strike - 16353 http://www.3dmark.com/fs/2911875
G-Sync works,

G-Sync is awesome

G-Sync is here now.


Nvidia is pushing technology forward with their ingenuity and innovations.

AMD needs to start making some leaps and strides if they expect to survive.

I wish AMD was pushing technology more; then there would be actual competition between red/green, pushing performance and technology at a faster rate instead of the snail-like pace of the past 6+ years.

What do you think they are doing with Mantle? Thanks to AMD, Microsoft started working on DX12 after they had announced they were focusing on other areas; Mantle forced their hand on the issue.

Nvidia is forcing people to buy monitors with G-Sync; if they shipped DP 1.2a, then sales of G-Sync monitors would collapse once FreeSync became available. I'm not loyal or a fanboy. I was going to get 2x 780 Tis, but Mantle's 290X CrossFire performance in BF4 won me over; the 780 Tis would sit at 75% each, wasting GPU performance. When DX12 comes out and they can run at 100% all the time and use max performance, I might switch over, but moves like this make you think twice.

AMD has planned to make Mantle open, so we will wait and see whether that happens when it gets released at the end of this year.
 
Joined
Jul 16, 2014
Messages
26 (0.01/day)
Location
Sydney, Australia
System Name Da Swift
Processor Intel i7 4770K
Motherboard ASUS Maximus VI Formula
Cooling Corsair H80i
Memory Kingston 16GB 1600Mhz
Video Card(s) 2x MSI R9 290X Gaming 4G OC (Crossfire)
Storage Samsung 840 EVO 500 GB (OS) Rapid, Samsung 840 EVO 256GB
Display(s) ASUS Swift PG278Q
Case Antec 900 II V3
Audio Device(s) Astro A50 Headset, Logitech Speakers
Power Supply Seasonic 1000w 80 Plus Platnum
Software Windows 8.1 64bit (Classic Shell - Windows 7 Theme)
Benchmark Scores 3DMark Fire Strike - 16353 http://www.3dmark.com/fs/2911875
Gsync is currently a better product. Given it's an actually buyable product, it can even do QHD at 144hz right now and it works pretty reliably. These are all things we'll have to assess with freesync they are not the same technology they are two different approaches and we have yet to see Freesync in the erratic frame environment of games. That being it's perfect for watching video which is unlikely to see frame rate drops and rises like a game would.

You'd better hope it's not better than G-Sync, since if you have that card you're shit out of luck. Being on AMD, I don't need to care about spending an extra $150 on something you probably won't even notice, lol; I'd rather have it for free. I don't see the jittering on my 120 Hz display that they show in those demos... lol, as they say, a sucker is born every day. Those tests are probably worst-case scenarios, if not purposely exaggerated.
 
Joined
Apr 19, 2012
Messages
12,062 (2.62/day)
Location
Gypsyland, UK
System Name HP Omen 17
Processor i7 7700HQ
Memory 16GB 2400Mhz DDR4
Video Card(s) GTX 1060
Storage Samsung SM961 256GB + HGST 1TB
Display(s) 1080p IPS G-SYNC 75Hz
Audio Device(s) Bang & Olufsen
Power Supply 230W
Mouse Roccat Kone XTD+
Software Win 10 Pro
The real crime here is that I don't care about FreeSync or G-Sync, and have no intention of buying into either.

Jesus, gamers these days think they deserve to get everything for free.
 
Joined
Jul 16, 2014
Messages
26 (0.01/day)
Location
Sydney, Australia
System Name Da Swift
Processor Intel i7 4770K
Motherboard ASUS Maximus VI Formula
Cooling Corsair H80i
Memory Kingston 16GB 1600Mhz
Video Card(s) 2x MSI R9 290X Gaming 4G OC (Crossfire)
Storage Samsung 840 EVO 500 GB (OS) Rapid, Samsung 840 EVO 256GB
Display(s) ASUS Swift PG278Q
Case Antec 900 II V3
Audio Device(s) Astro A50 Headset, Logitech Speakers
Power Supply Seasonic 1000w 80 Plus Platnum
Software Windows 8.1 64bit (Classic Shell - Windows 7 Theme)
Benchmark Scores 3DMark Fire Strike - 16353 http://www.3dmark.com/fs/2911875
The real crime here is that I don't care about FreeSync or G-Sync, and have no intention of buying into either.

Jesus, gamers these days think they deserve to get everything for free.
Me neither, and I guess we can thank Nvidia for the extra cost of the Asus Swift. I hope there is a version without G-Sync; I doubt it will be that noticeable. If you limit your FPS to your monitor's refresh rate using RTSS or maxvariable in BF4, it's not needed IMO. Like I said, my monitor plays smooth at 120 Hz thanks to my 2x 290Xs. Those demos are laughable; marketing 101 on display, that is all.
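For what it's worth, the frame-capping trick mentioned here boils down to a simple pacing loop: render, then sleep away whatever is left of the per-frame time budget. This is a hypothetical sketch of the idea only; real limiters like RTSS work at the driver level with high-resolution timers and busy-waits, and `run_capped` is an invented name:

```python
import time

def run_capped(render_frame, target_fps, n_frames):
    """Pace calls to render_frame() so each frame takes at least
    1/target_fps seconds: render, then sleep off the remaining budget."""
    budget = 1.0 / target_fps
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)

# Capping a trivially fast "renderer" at 100 fps: 10 frames take ~0.1 s,
# so the effective frame rate never exceeds the cap.
t0 = time.perf_counter()
run_capped(lambda: None, target_fps=100, n_frames=10)
print(time.perf_counter() - t0 >= 0.09)  # True: the budget was enforced
```

The cap removes frame rates above the refresh rate (and thus tearing from overshooting it), but unlike adaptive sync it does nothing for frames that come in slower than the budget.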
 
Joined
Apr 19, 2012
Messages
12,062 (2.62/day)
Location
Gypsyland, UK
System Name HP Omen 17
Processor i7 7700HQ
Memory 16GB 2400Mhz DDR4
Video Card(s) GTX 1060
Storage Samsung SM961 256GB + HGST 1TB
Display(s) 1080p IPS G-SYNC 75Hz
Audio Device(s) Bang & Olufsen
Power Supply 230W
Mouse Roccat Kone XTD+
Software Win 10 Pro
Me neither, and I guess we can thank Nvidia for the extra cost of the Asus Swift. I hope there is a version without G-Sync; I doubt it will be that noticeable. If you limit your FPS to your monitor's refresh rate using RTSS or maxvariable in BF4, it's not needed IMO. Like I said, my monitor plays smooth at 120 Hz thanks to my 2x 290Xs. Those demos are laughable; marketing 101 on display, that is all.

Maybe we're bad examples: we have decent systems and don't see sub-45 FPS instances. I imagine G-Sync and FreeSync are more important for people with low-end systems, or midrange systems at 4K ridiculousness. I would assume that's where the sync magic comes in handy, for the low FPS ranges and dips. Either way, I don't know why people with high-end systems care; they wouldn't see much improvement with G-Sync or FreeSync anyway.
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,047 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
Me neither, and I guess we can thank Nvidia for the extra cost of the Asus Swift. I hope there is a version without G-Sync; I doubt it will be that noticeable. If you limit your FPS to your monitor's refresh rate using RTSS or maxvariable in BF4, it's not needed IMO. Like I said, my monitor plays smooth at 120 Hz thanks to my 2x 290Xs. Those demos are laughable; marketing 101 on display, that is all.

The problem is you need to experience the sensation of G-Sync. You cannot gauge it through a video or YouTube. The reviews are exceptionally positive about it, for the most part.
I too don't care about either product, but if FreeSync works as well as Mantle, Nvidia might adapt their business model to compete. It might just remain peripheral technology, though, much like Mantle.
FTR, one of my BF4 mates went Mantle with a 290 and truly appreciated the smoothness, though he did come from a 2 GB GTX 680. For my part, I only game on one 780 Ti, but our perf.render display outputs were identical. But I do use a decent CPU, so Mantle is limited in purpose for my GPU needs.
 
Joined
Sep 3, 2013
Messages
75 (0.02/day)
Processor Intel i7 6700k
Motherboard Gigabyte Z170X-Gaming 7
Cooling Nocuta nh-d14
Memory Patriot 16GB
Video Card(s) Gibabyte GeForce GTX 1080Ti
Storage (system) Corsair 120GB Force GT, (games) 500GB 840, (data) Seagate 1TB Barracuda + WD 1TB Black
Display(s) AOC AG271QG
Case NZXT Phantom
Power Supply Corsair ax1200i
Mouse g502
Keyboard G710+
Software Win10
What do you think they are doing with Mantle? Thanks to AMD, Microsoft started working on DX12 after they had announced they were focusing on other areas; Mantle forced their hand on the issue.

Nvidia is forcing people to buy monitors with G-Sync; if they shipped DP 1.2a, then sales of G-Sync monitors would collapse once FreeSync became available. I'm not loyal or a fanboy. I was going to get 2x 780 Tis, but Mantle's 290X CrossFire performance in BF4 won me over; the 780 Tis would sit at 75% each, wasting GPU performance. When DX12 comes out and they can run at 100% all the time and use max performance, I might switch over, but moves like this make you think twice.

AMD has planned to make Mantle open, so we will wait and see whether that happens when it gets released at the end of this year.

Well, following your line of thought here: isn't AMD "forcing" you to buy new Microsoft software, i.e. Windows 9?

I agree with the posters above:

Nvidia is pushing technology forward with their ingenuity and innovations.

For now, G-Sync is already a reality and is proven to work flawlessly.

Exactly. People need a solution right here, right now. I am having a blast using it; it is incredible and fun! As a consumer I am not going to wait for "I don't know how long in a distant future" for something that hasn't even been implemented or tested yet.

I also read that it's using 1.2a and all it needs is a software update, so we are talking about a simple software solution to be able to use "FreeSync" on Nvidia cards. Why all this doom and gloom and ranting?

Nvidia has every right to charge everyone for this tech

Umm, yes they do, if there are people willing to pay for it, duuuh???
 