
NVIDIA CEO Jensen Huang on Radeon VII: "Underwhelming (...) the Performance is Lousy"; "Freesync Doesn't Work"

Joined
Feb 5, 2016
Messages
171 (0.05/day)
Location
Sthlm
System Name Vamos
Processor Ryzen 5 3600
Motherboard Gigabyte B450 I Aorus Pro WiFi
Cooling Dark Rock Slim
Memory 2x8GB XPG 3600MHz
Video Card(s) RX 5700 XT PowerColor
Storage WD SN550 1TB, Seagate 2TB HDD
Display(s) AOC CQ32G1
Case Jonsbo V8
Audio Device(s) Logitech G633, Presonus E3.5
Power Supply Fractal Design Ion SFX-L 650W Gold
Mouse Logitech G502
Keyboard Logitech G513 Carbon
Software Win 10 Home
AFAIK, G-SYNC is the technically superior solution...

Well, for such a premium price difference it should be. But it ain't worth as much.
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.44/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Someone obviously knows that AMD is about to kick them really hard with Navi
Mum's the word on Navi. I don't think it's imminent. PS5 releases anywhere between Christmas 2019 and 2021. I think Christmas 2019 is probably the soonest we'll see a consumer Navi graphics card. Makes one wonder what AMD is going to release between now and then. Polaris 40 on 7nm?

I don't think Radeon VII really concerns Huang so much as the road ahead for NVIDIA is rough. AI isn't taking off like they banked on it doing. RTX reception was poor. 7nm transition for NVIDIA is going to be very costly. Cryptocurrency mining on GPUs is mostly dead. FreeSync integration is rougher than he expected. The sky looks cloudy to Huang where the sky over Su looks sunny.
 
Joined
Aug 14, 2017
Messages
74 (0.03/day)
Well, I don't think Huang is scared. No way.

The truth is that AMD's brand-new flagship GPU can't challenge even Nvidia's third-fastest GPU, not even close. So AMD compares it to the 2080... their flagship against Nvidia's third-fastest card, and hypes THAT!! Come on! THINK!! I remember the same issue when Vega 64 released: AMD compared it to the GTX 1080, NOT the GTX 1080 Ti, while hyping Vega 64 sky-high.

The truth is that without the 7nm shrink the Radeon VII would be melting; that's a fact. It needs three fans on a REFERENCE card, which has never happened before in GPU history, just like Vega 64 had to use watercooling!

And even though it's made on 7nm, the Radeon VII still eats more than 300W. I'm sure the real figure is somewhere around 350W, with 400W peaks. That's a terrible result!

People don't understand what a big handicap this shows for the Radeon VII, because it's built on a 7nm process... THINK!! That's almost half the feature size of the RTX 2000 series' 12nm, so it should eat much less power than it does now. Something is very wrong, and it's that AMD can't build an efficient GPU.

For example, the RTX 2080 eats 220W in average gaming; the Radeon VII's score is over 300W, EVEN though it's made on 7nm!!!!!!!!

It's a lousy GPU like its little brother Vega 64, and I call it Vega 64 II.

The RTX 2060 is ten times the GPU. If we're counting the fastest GPUs, that's the RTX 2080, RTX 2080 Ti, and of course the RTX Titan.

It looks like the Radeon VII only just avoids watercooling; let's see how it overclocks. If it does, power draw will go absolutely sky-high, even though it's a 7nm GPU.

A lousy GPU again, and Lisa knows it and Huang knows it.

We'll see soon, or should I say in 12 months, when Nvidia releases its 7nm GPUs. I promise: under 250W, and performance at least 40% above the 2080 Ti.

Don't lie to yourself; look the truth in the eye. AMD wants everyone to have a 1kW PSU, or AMD doesn't care. AMD should thank TSMC on bended knee for bringing it 7nm; without that help I don't think AMD could even have released the Radeon VII.

A lousy GPU. Shame on AMD. And people, wake up: AMD doesn't deserve any sympathy. It's a power-guzzling GPU, slow, and pricey for its performance.
 
Joined
Nov 24, 2018
Messages
2,252 (1.01/day)
Location
south wales uk
System Name 1.FortySe7en VR rig 2. intel teliscope rig 3.MSI GP72MVR Leopard Pro .E-52699, Xeon play thing
Processor 1.3900x @stock 2. 3700x . 3. i7 7700hq
Motherboard 1.aorus x570 ultra 2. Rog b450 f,4 MR9A PRO ATX X99
Cooling 1.Hard tube loop, cpu and gpu 2. Hard loop cpu and gpu 4 360 AIO
Memory 1.Gskill neo @3600 32gb 2.corsair ven 32gb @3200 3. 16gb hyperx @2400 4 64GB 2133 in quad channel
Video Card(s) 1.GIGABYTE RTX 3080 WaterForce WB 2. Aorus RTX2080 3. 1060 3gb. 4 Arc 770LE 16 gb
Storage 1 M.2 500gb , 2 3tb HDs 2. 256gb ssd, 3tbHD 3. 256 m.2. 1tb ssd 4. 2gb ssd
Display(s) 1.LG 50" UHD , 2 MSI Optix MAG342C UWHD. 3.17" 120 hz display 4. Acer Preditor 144hz 32inch.z
Case 1. Thermaltake P5 2. Thermaltake P3 4. some cheapo case that should not be named.
Audio Device(s) 1 Onboard 2 Onboard 3 Onboard 4. onboard.
Power Supply 1.seasonic gx 850w 2. seasonic gx 750w. 4 RM850w
Mouse 1 ROG Gladius 2 Corsair m65 pro
Keyboard 1. ROG Strix Flare 2. Corsair F75 RBG 3. steelseries RBG
VR HMD rift and rift S and Quest 2.
Software 1. win11 pro 2. win11 pro 3, win11 home 4 win11 pro
Benchmark Scores 1.7821 cb20 ,cb15 3442 1c 204 cpu-z 1c 539 12c 8847
gamerman
You can get tablets for Nvidia addiction :) Sounds like you have the same condition as poor Mr. Huang, matey; times are a-changing... you know I'm joking, right?
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
gamerman
You can get tablets for Nvidia addiction :) Sounds like you have the same condition as poor Mr. Huang, matey; times are a-changing... you know I'm joking, right?
I believe the tablet for Nvidia addiction is called Nintendo Switch.



... I'll see myself out.
 
Valantar
lol, how did I miss that? Come back in, I like your humour.
 
Joined
Aug 2, 2011
Messages
1,458 (0.30/day)
Processor Ryzen 9 7950X3D
Motherboard MSI X670E MPG Carbon Wifi
Cooling Custom loop, 2x360mm radiator,Lian Li UNI, EK XRes140,EK Velocity2
Memory 2x16GB G.Skill DDR5-6400 @ 6400MHz C32
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra OC Scanner core +750 mem
Storage MP600 Pro 2TB,960 EVO 1TB,XPG SX8200 Pro 1TB,Micron 1100 2TB,1.5TB Caviar Green
Display(s) Alienware AW3423DWF, Acer XB270HU
Case LianLi O11 Dynamic White
Audio Device(s) Logitech G-Pro X Wireless
Power Supply EVGA P3 1200W
Mouse Logitech G502X Lightspeed
Keyboard Logitech G512 Carbon w/ GX Brown
VR HMD HP Reverb G2 (V2)
Software Win 11
AMD fanbois will be so angry when, in a couple of years, RTX becomes standard and AMD is forced to adopt it, effectively increasing the prices of their hot plastic cards even more.
You know, that is a good point. They invested in RTX, and it may or may not play out well for them. "Standard" is also a good question; since DX12 introduced the RTX API in Win 10, it's still very young. It's all about the ecosystem, meaning how many developers are going to invest time into implementing it when there's only a handful of cards (or maybe just one right now... the 2080 Ti) that can really handle it.

It's not a very compelling story at the moment.

Guys, you realize that currently the only implementation of RTRT on desktop is Windows 10 with the 1809 update, using DXR as the library to display said technology? You know, the library that anyone can use to accelerate ray tracing? It's not an "RTX API". It's Microsoft's DXR API, which is part of DirectX. Once AMD implements Radeon Rays, they'll be using the same library.

I've just got a 55" Samsung TV with FreeSync built in, 120Hz, with my R9 290X, only at 1080p, but it works for me. WITH MY LEATHER JACKET ON LOL.....

Loving my Q6 so far. Been a great TV

No halving, thanks to DLSS.

Man, you guys' reliance on DLSS as an end-all solution is bewildering. Personally, I'd never rely on a scaling solution for my gaming; I'd rather just run at native res, for better or worse. Beyond that, we're already getting 1080p 60fps in the only RTX-supported title so far, with cards lower in the stack than the range-topping RTX Titan and RTX 2080 Ti.



Beyond all that, the Radeon VII does look good. So what if it's just a shrunk and higher-clocked Vega? It has twice the ROPs of its predecessor, a huge increase in an area that hamstrung the first Vega, plus faster HBM, and twice as much of it as last time.

It's $700 because NVIDIA allowed them to price it at that; I can hardly blame AMD for taking advantage of NVIDIA's absofuckinglutely insane pricing.
 
Guys, you realize that currently the only implementation of RTRT on desktop is Windows 10 with the 1809 update, using DXR as the library to display said technology? You know, the library that anyone can use to accelerate ray tracing? It's not an "RTX API". It's Microsoft's DXR API, which is part of DirectX. Once AMD implements Radeon Rays, they'll be using the same library.



Loving my Q6 so far. Been a great TV



Man, you guys' reliance on DLSS as an end-all solution is bewildering. Personally, I'd never rely on a scaling solution for my gaming; I'd rather just run at native res, for better or worse. Beyond that, we're already getting 1080p 60fps in the only RTX-supported title so far, with cards lower in the stack than the range-topping RTX Titan and RTX 2080 Ti.



Beyond all that, the Radeon VII does look good. So what if it's just a shrunk and higher-clocked Vega? It has twice the ROPs of its predecessor, a huge increase in an area that hamstrung the first Vega, plus faster HBM, and twice as much of it as last time.

It's $700 because NVIDIA allowed them to price it at that; I can hardly blame AMD for taking advantage of NVIDIA's absofuckinglutely insane pricing.
The main difference here: AMD's range tops out at $700. Regardless of relative and absolute performance, that's within reason, I'd say. I paid that much for my Fury X, and I'm happy with it, but it would take a serious improvement to make me pay as much again, at least for a couple of years yet. Nvidia's range, on the other hand, tops out at either $1200 or $2500, depending on whether you include the Titan. I don't think it counts (it's not GeForce), but apparently rabid fanboys such as the above example disagree (edit: to clarify, not the post quoted above, but the one you've all noticed if you've read the last page of posts). That is well beyond "within reason". People were pissed when Nvidia pushed Titan pricing to $1200, yet now they're eating up the 2080 Ti at the same price. It just shows how easily one gets accustomed to insanity.
 
Joined
Jun 2, 2017
Messages
9,374 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I bought a water block for my Sapphire Vega 64 card. I seriously hope the layout is the same as when I ran Fire Strike this morning.
The main difference here: AMD's range tops out at $700. Regardless of relative and absolute performance, that's within reason, I'd say. I paid that much for my Fury X, and I'm happy with it, but it would take a serious improvement to make me pay as much again, at least for a couple of years yet. Nvidia's range, on the other hand, tops out at either $1200 or $2500, depending on whether you include the Titan. I don't think it counts (it's not GeForce), but apparently rabid fanboys such as the above example disagree. That is well beyond "within reason". People were pissed when Nvidia pushed Titan pricing to $1200, yet now they're eating up the 2080 Ti at the same price. It just shows how easily one gets accustomed to insanity.


Exactly. And people are talking about DLSS and ray tracing while forgetting PhysX, Hairworks, and SLI, which Nvidia took over and hogged for themselves, only to see the technology go unused because of the way they do things, versus AMD with Mantle, which led to Vulkan and DX12, or FreeSync, which Nvidia is suddenly supporting. Of course with a comment from Jensen making it seem like FreeSync is only good on Nvidia cards.
 
Joined
Dec 6, 2005
Messages
10,885 (1.56/day)
Location
Manchester, NH
System Name Senile
Processor I7-4790K@4.8 GHz 24/7
Motherboard MSI Z97-G45 Gaming
Cooling Be Quiet Pure Rock Air
Memory 16GB 4x4 G.Skill CAS9 2133 Sniper
Video Card(s) GIGABYTE Vega 64
Storage Samsung EVO 500GB / 8 Different WDs / QNAP TS-253 8GB NAS with 2x10Tb WD Blue
Display(s) 34" LG 34CB88-P 21:9 Curved UltraWide QHD (3440*1440) *FREE_SYNC*
Case Rosewill
Audio Device(s) Onboard + HD HDMI
Power Supply Corsair HX750
Mouse Logitech G5
Keyboard Corsair Strafe RGB & G610 Orion Red
Software Win 10
It's Microsoft's DXR API, which is part of DirectX. Once AMD implements Radeon Rays, they'll be using the same library.

Yes, yes and yes. It's a component of DX in the latest Windows 10 version that everyone has access to.

The catch is that game developers have to write code to implement it, and the hardware has to be able to process it as well. Currently, these games are in the works to support it, to one degree or another:
  • Assetto Corsa Competizione from Kunos Simulazioni/505 Games.
  • Atomic Heart from Mundfish.
  • Battlefield V from EA/DICE.
  • Control from Remedy Entertainment/505 Games.
  • Enlisted from Gaijin Entertainment/Darkflow Software.
  • Justice from NetEase.
 
Joined
Apr 2, 2014
Messages
7 (0.00/day)
What, so screen tearing, which FreeSync was made to fix, wouldn't give the game away if it's not working? People would notice.

The screen tearing is fixed with FreeSync; people aren't disputing that. What Nvidia is claiming is that many FreeSync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to G-Sync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed-refresh-rate performance. It is completely plausible to me that problems have gone unnoticed.

Actually no, there are in-depth YouTube vids about just about everything, and when there are doubts, some Reddit, YouTube, or Twitch channel will explode, because there's a new daily shitstorm to click on.

There have been problems noted. This PCPerspective article notes excessive ghosting on early FreeSync monitors.
https://www.pcper.com/reviews/Displ...hnical-Discussion/Gaming-Experience-FreeSync-
They released a follow up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung c24fg70 had overshoot artifacts with freesync at 100hz, and flickering on the desktop with freesync on.
https://pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as gpu or cpu reviews, and these things can easily go unnoticed by the community. Those overshoot and ghosting issues would be solved by dynamic overdrive, something nvidia requires for all gsync monitors.
 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
The screen tearing is fixed with FreeSync; people aren't disputing that. What Nvidia is claiming is that many FreeSync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to G-Sync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed-refresh-rate performance. It is completely plausible to me that problems have gone unnoticed.



There have been problems noted. This PCPerspective article notes excessive ghosting on early FreeSync monitors.
https://www.pcper.com/reviews/Displ...hnical-Discussion/Gaming-Experience-FreeSync-

They released a follow up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung c24fg70 had overshoot artifacts with freesync at 100hz, and flickering on the desktop with freesync on.

https://pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as gpu or cpu reviews, and these things can easily go unnoticed by the community.

So you have about 450 more examples (of different monitors) to find before Huang is right; crack on.
He did say they're ALL broken.

And you are here trying to back his words. Have you seen this issue in the flesh?

Anyone with some sense would expect the odd monitor, or even the odd line of monitors, to have issues, but to say they're all broken... well.

My Samsung works without artefacts etc., but to be fair it's a 45/75Hz one, so I can see it's not that model.
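For context on why ranges like 45-75Hz matter: AMD's Low Framerate Compensation (LFC) only engages when a panel's maximum refresh rate is at least roughly twice its minimum, so a 45/75Hz panel never gets frame multiplication below its floor. A rough sketch of that rule of thumb (illustrative only, not AMD's actual driver logic):

```python
def supports_lfc(min_hz: float, max_hz: float) -> bool:
    """LFC rule of thumb: the VRR window must span at least 2x."""
    return max_hz >= 2 * min_hz

def effective_refresh(fps: float, min_hz: float, max_hz: float) -> float:
    """Refresh rate the panel would run at for a given frame rate.

    Inside the VRR window the panel tracks the frame rate; below it,
    LFC (if available) multiplies frames back into the window,
    otherwise the driver falls back to the panel's minimum.
    """
    if fps > max_hz:
        return max_hz                  # capped at the panel maximum
    if fps >= min_hz:
        return fps                     # inside the VRR window
    if supports_lfc(min_hz, max_hz):
        multiple = fps
        while multiple < min_hz:
            multiple += fps            # show each frame 2x, 3x, ...
        return multiple
    return min_hz                      # no LFC: VRR disengages

# A 48-144Hz panel handles 30fps via LFC (30 -> 60Hz);
# a 45-75Hz panel cannot, and drops out of VRR below 45fps.
```

This is also why AMD's VRR-range listings are worth checking before buying: two "FreeSync" monitors with the same maximum refresh can behave very differently at low frame rates.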
 

Deleted member 177333

Guest
"Underwhelming. The performance is lousy."

Kinda funny since that's exactly what I thought when reading the Turing reviews.
 
Joined
Jun 3, 2010
Messages
2,540 (0.48/day)
The screen tearing is fixed with FreeSync; people aren't disputing that. What Nvidia is claiming is that many FreeSync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to G-Sync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed-refresh-rate performance. It is completely plausible to me that problems have gone unnoticed.



There have been problems noted. This PCPerspective article notes excessive ghosting on early FreeSync monitors.
https://www.pcper.com/reviews/Displ...hnical-Discussion/Gaming-Experience-FreeSync-
They released a follow up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung c24fg70 had overshoot artifacts with freesync at 100hz, and flickering on the desktop with freesync on.
https://pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as gpu or cpu reviews, and these things can easily go unnoticed by the community. Those overshoot and ghosting issues would be solved by dynamic overdrive, something nvidia requires for all gsync monitors.
There is no need to rationalize all that much after the debate; here you go:
 
Joined
Oct 14, 2017
Messages
210 (0.08/day)
System Name Lightning
Processor 4790K
Motherboard asrock z87 extreme 3
Cooling hwlabs black ice 20 fpi radiator, cpu mosfet blocks, MCW60 cpu block, full cover on 780Ti's
Memory corsair dominator platinum 2400C10, 32 giga, DDR3
Video Card(s) 2x780Ti
Storage intel S3700 400GB, samsung 850 pro 120 GB, a cheep intel MLC 120GB, an another even cheeper 120GB
Display(s) eizo foris fg2421
Case 700D
Audio Device(s) ESI Juli@
Power Supply seasonic platinum 1000
Mouse mx518
Software Lightning v2.0a
Yup, like RTX and DLSS …


Smartcom

loool

You weren't there when the first T&L GPU came out, eh!

Maybe he wasn't, but it looks like you weren't there when the first shader GPU (R300) came out.

- and this was 3 years ago... they only got worse.

AMD doesn't lock features behind hardware, they make open-source platforms.

They don't do it for you, or should I say us; they do it because it helps them save money on software development expenses. It's also the reason they have shit GPU drivers.
 
Joined
Sep 17, 2014
Messages
22,681 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
The screen tearing is fixed with FreeSync; people aren't disputing that. What Nvidia is claiming is that many FreeSync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to G-Sync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed-refresh-rate performance. It is completely plausible to me that problems have gone unnoticed.



There have been problems noted. This PCPerspective article notes excessive ghosting on early FreeSync monitors.
https://www.pcper.com/reviews/Displ...hnical-Discussion/Gaming-Experience-FreeSync-
They released a follow up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung c24fg70 had overshoot artifacts with freesync at 100hz, and flickering on the desktop with freesync on.
https://pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as gpu or cpu reviews, and these things can easily go unnoticed by the community. Those overshoot and ghosting issues would be solved by dynamic overdrive, something nvidia requires for all gsync monitors.

Sorry, but no. There are terrible monitors with G-Sync too. It has nothing to do with VRR and everything to do with the panel and board behind it.

TFT Central has it all, in case you're curious. Not once has FreeSync been found to be the cause of ghosting or artifacting; the only gripe is limited ranges.
 
While I agree that Vega as an architecture is far better suited for professional applications (we've seen this proven plenty of times), apparently AMD feels confident enough in the gaming-related improvements when doubling the memory bandwidth and doubling the number of ROPs to give it a go. I suggest we wait for reviews.

Vega IS far better suited for (professional) compute applications.
This is not about what AMD feels.
Memory bandwidth gives nothing to gaming performance on Vega.
It is not double the ROPs; they are still 64.
But guess what? I also suggest we wait for reviews.

Ultimately this is good news: finally Vega can compete with the 1080 Ti. Let's wait for the reviews and see if TSMC's 7nm is the silver bullet many had hoped for.

It's not really 7nm; the actual feature size is closer to Intel's 10nm.

I think I know why he says FreeSync doesn't work "at all" on "all" monitors: I suspect he means that a monitor doesn't run the same hertz range with FreeSync enabled as it does without it.
Which makes him right about that, since it's a lie sold by the spec sheets that makes people believe FreeSync covers the full range. That says something to the many people here who say AMD is not as dirty as Nvidia...

Sasqui, Assetto Corsa Competizione has shit graphics; ray tracing is not going to do it any good.
 
Vega IS far better suited for (professional) compute applications.
Yes, that's what I said. Kind of weird to try to make a point against me by repeating what I said like that. Is a snarky tone supposed to make it mean something different? This is a fact. It doesn't mean that it can't work for gaming. I was very surprised to see Vega 20 arrive as a gaming card, but apparently it performs well at this too. Nothing weird about that.
This is not about what AMD feels.
Considering they're the only ones with any knowledge of how this performs in various workloads, it very much is. Launching it is also 100% AMD's decision, so, again this comes down to how AMD feels about it.
Memory bandwidth gives nothing to gaming performance on Vega.
Source?
It is not double the ROPs; they are still 64.
That seems to be true, given the coverage of how this was an error in initial reports. My post was made before this was published, though.

It's not really 7nm; the actual feature size is closer to Intel's 10nm.
Node names are largely unrelated to feature size in recent years - there's nothing 10nm in Intel's 10nm either - but this is still a full node shrink from 14/12nm. Intel being slightly more conservative with node naming doesn't change anything about this.

I think I know why he says FreeSync doesn't work "at all" on "all" monitors: I suspect he means that a monitor doesn't run the same hertz range with FreeSync enabled as it does without it.
If that's the case, he should look up the definition of "work". At best, that's stretching the truth significantly, but I'd say it's an outright lie. What you're describing there is not "not working".
Which makes him right about that, since it's a lie sold by the spec sheets that makes people believe FreeSync covers the full range. That says something to the many people here who say AMD is not as dirty as Nvidia...
Why is that? AMD helped create an open standard. How this is implemented is not up to them, as long as the rules of the standard are adhered to (and even then, it's VESA that owns the standard, not AMD). AMD has no power to police how monitor manufacturers implement Freesync - which is how it should be, and not Nvidia's cost-adding gatekeeping. AMD has never promised more than tear-free gaming (unless you're talking FS2, which is another thing entirely), which they've delivered. AMD even has a very helpful page dedicated to giving accurate information on the VRR range of every single FreeSync monitor.

Also, your standard for "as dirty as" seems... uneven, to say the least. On the one hand, we have "created an open standard and didn't include strict quality requirements or enforce this as a condition for use of the brand name", while on the other we have (for example) "attempted to use their dominant market share to strong-arm production partners out of using their most famous/liked/respected/recognized brand names for products from their main competitor". Do those look equal to you?
 
You mean "feels" as in "it's good enough to go to market"? Yeah, sure it can work in gaming; the first Vegas proved that :). I'm saying it like that because it's the wrong chip for the wrong purpose; maybe it would have been better to augment Polaris?
You're right that AMD is better because of their open-standards approach; that's not the problem. For me the problem is that they're not doing it for good reasons, for us; they're doing it for themselves...
Thanks man, that was a very elaborate response; you cleared up a few things :) I didn't know the FreeSync problems come from the monitors, not from AMD or the FreeSync system itself.
Everyone's lying about nm these days, eh?
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.79/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
I have a Vega 64, and I assure you that memory bandwidth isn't what's holding back gaming performance. A 20% overclock on the HBM yields no tangible performance benefit on mine. These cards are constrained by power consumption, not memory bandwidth.
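A quick way to see why extra bandwidth can do nothing: the standard roofline model says attainable throughput is the minimum of peak compute and bandwidth times arithmetic intensity. If the compute (or power) ceiling is the binding one, raising bandwidth moves nothing. A sketch with illustrative numbers (not measured Vega data):

```python
def roofline(peak_tflops: float, bandwidth_gbs: float,
             intensity_flops_per_byte: float) -> float:
    """Attainable TFLOP/s under the simple roofline model:
    min(compute ceiling, bandwidth ceiling)."""
    bandwidth_tflops = bandwidth_gbs * intensity_flops_per_byte / 1000.0
    return min(peak_tflops, bandwidth_tflops)

# Illustrative numbers: ~12.7 TFLOP/s peak, 484 GB/s stock HBM2.
# At 40 FLOP/byte the compute ceiling binds, so +20% memory clock
# changes nothing:
stock = roofline(12.7, 484, 40)        # compute-bound
oced  = roofline(12.7, 484 * 1.2, 40)  # still compute-bound, same result
```

Only a strongly bandwidth-bound workload (low arithmetic intensity) would see the HBM overclock show up in the result, which matches the observation above.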
 

HTC

Joined
Apr 1, 2008
Messages
4,664 (0.76/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
This strikes me more that Huang is somewhat scared of what's coming down the pipeline.

Nvidia got forced out of its lucrative G-Sync business because otherwise it wouldn't be able to certify against HDMI 2.1.

Nvidia is again pushed out of the console space (yes, we know there's the Switch, but the Switch won't be defining GPU trends), which will continue to see Nvidia technology only bolted on, rather than core to any game experience. In 6-12 months we will see two new consoles built on AMD Navi GPUs, and corresponding cards for desktop. This is new, as traditionally console GPUs have lagged behind desktop tech, and it will invariably benefit AMD for optimisation.

And lastly, with Intel entering the GPU space, Nvidia is left out in the cold with a lack of x86 ability, and only so much of that can be countered with its investment in RISC.

Full disclosure: I own a 2080 Ti, but to me this comes across as a temper tantrum rather than anything meaningful.

Have not heard of this: source link, please?
 
Have not heard of this: source link, please?
The HDMI 2.1 spec includes VRR support as a requirement, which would prevent Nvidia from certifying its cards against that spec if it still limited VRR support to G-Sync. I guess a "middle of the road" option would be possible by supporting VRR on non-G-Sync displays only if they are HDMI 2.1 displays, but that would just lead to (yet another) consumer uproar against Nvidia. Let alone the fact that (at least according to all reports I've seen) the HDMI 2.1 implementation of VRR is technically very, very close to AMD's existing FreeSync-over-HDMI extension of VESA Adaptive-Sync. Nvidia would have no technical excuse whatsoever not to support FreeSync.
 