
NVIDIA Shares Details About Ampere Founders Edition Cooling & Power Design - 12-pin Confirmed

Joined
Apr 24, 2020
Messages
2,723 (1.60/day)
This actually shows that it is you who does not understand how global illumination works. The amount and location of light and shadow also depend on the materials. You cannot compare the reflection of a corner in some random dude's house with the one in the Nvidia demo, because you have no way of knowing whether the materials are even remotely similar, with similar luminance etc.

Look, I know you're getting egged on by some other users right now. So I'll try to cut you some slack here.

Let me just give you a few links on this issue:

* https://docs.blender.org/manual/en/2.79/render/blender_render/world/ambient_occlusion.html

Ambient Occlusion is a sophisticated ray-tracing calculation which simulates soft global illumination shadows by faking darkness perceived in corners and at mesh intersections, creases, and cracks, where ambient light is occluded, or blocked.

There is no such thing as AO in the real world; AO is a specific not-physically-accurate (but generally nice-looking) rendering trick. It basically samples a hemisphere around each point on the face, sees what proportion of that hemisphere is occluded by other geometry, and shades the pixel accordingly (a toy sketch of that sampling loop is at the end of this post).

* https://rmanwiki.pixar.com/display/REN/PxrOcclusion

PxrOcclusion is a non-photorealistic integrator that can be used to render ambient occlusion, among other effects.

* https://docs.arnoldrenderer.com/display/A5AFMUG/Ambient+Occlusion

Ambient occlusion is an approximation of global illumination that emulates the complex interactions between the diffuse inter-reflections of objects. While not physically accurate (for that use full global illumination), this shader is fast and produces a realistic effect.

All three 3d programs above are Raytracers, implementing raytraced ambient occlusion. All three claim that the effect is "fake" to some degree. No one who knows what they're talking about would ever claim that ambient occlusion is physically accurate.
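For the curious, that hemisphere-sampling idea boils down to something like the toy Monte Carlo sketch below. The sphere occluders and every number in it are made up for illustration; it is not code from any of the renderers linked above.

```python
# Toy raytraced ambient occlusion for a single shading point.
# The sphere "scene" is invented purely for this example.
import math, random

def ray_hits_sphere(origin, direction, center, radius, max_dist):
    # Standard ray/sphere intersection, limited to max_dist (the AO radius).
    oc = [origin[i] - center[i] for i in range(3)]
    b = 2.0 * sum(direction[i] * oc[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / 2.0
    return 0.001 < t < max_dist

def sample_hemisphere(normal):
    # Uniformly pick a direction in the hemisphere around the surface normal.
    while True:
        d = [random.uniform(-1.0, 1.0) for _ in range(3)]
        length = math.sqrt(sum(x * x for x in d))
        if 0.0 < length <= 1.0:
            d = [x / length for x in d]
            if sum(d[i] * normal[i] for i in range(3)) < 0.0:
                d = [-x for x in d]
            return d

def ambient_occlusion(point, normal, occluders, samples=256, ao_radius=1.0):
    # Fraction of the hemisphere NOT blocked by nearby geometry:
    # 1.0 = fully open (bright), 0.0 = fully occluded (dark corner).
    unblocked = 0
    for _ in range(samples):
        d = sample_hemisphere(normal)
        if not any(ray_hits_sphere(point, d, c, r, ao_radius) for c, r in occluders):
            unblocked += 1
    return unblocked / samples

# A point on a floor next to a unit sphere resting nearby: partially occluded.
print(ambient_occlusion((0.0, 0.0, 0.0), (0.0, 1.0, 0.0),
                        occluders=[((1.5, 1.0, 0.0), 1.0)]))
```

Note that there is no light source anywhere in that loop, which is exactly why it's "fake": it only asks "how much geometry is nearby?", never "how much light actually arrives here?".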
 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
So are you saying quantity > quality? Like the amount of posts you make is actually more important than WHAT'S IN THOSE POSTS? Cute. FYI I've been in tech for a very long time. But this is the internet. Anyone can say anything, be it the truth or completely made up. Believe me at your own peril. For the same reason I do not believe you, as the quality of your posts does not support your claims.
I pointed out that you're talking about roughly the same process with different terminology; perspectives differ, and you're just wrong.

The human consciousness makes up 68% of what you see while your eyes dart around like mad, focusing on the next most important thing scanned subconsciously by your peripheral senses...

So what you see is mostly what you want to.

And all the methods devised thus far are fake representations of the real world, none exclusively so.
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
Look, I know you're getting egged on by some other users right now. So I'll try to cut you some slack here.

Let me just give you a few links on this issue:

* https://docs.blender.org/manual/en/2.79/render/blender_render/world/ambient_occlusion.html



* https://rmanwiki.pixar.com/display/REN/PxrOcclusion



* https://docs.arnoldrenderer.com/display/A5AFMUG/Ambient+Occlusion



All three 3d programs above are Raytracers, implementing raytraced ambient occlusion. All three claim that the effect is "fake" to some degree. No one who knows what they're talking about would ever claim that ambient occlusion is physically accurate.
We are talking about games here. Ambient occlusion in games is based on rasterization - in fact on the depth information and normals of the surfaces being occluded.
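For reference, the screen-space flavour really does need nothing more than those buffers. A rough depth-only sketch (numpy, synthetic depth buffer; real SSAO implementations also use the normals, as you say, and a proper sample kernel):

```python
# Rough SSAO-style sketch: darken a pixel when nearby pixels in the depth
# buffer are closer to the camera than it is. Depth-only, for brevity.
import numpy as np

def ssao(depth, sample_offsets, radius_px=8, bias=0.02):
    # depth: 2D array of view-space depths (smaller = closer to the camera).
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    occluded = np.zeros((h, w), dtype=np.float32)
    for dx, dy in sample_offsets:
        sx = np.clip(xs + int(dx * radius_px), 0, w - 1)
        sy = np.clip(ys + int(dy * radius_px), 0, h - 1)
        # A neighbour noticeably closer than this pixel counts as an occluder.
        occluded += (depth[sy, sx] < depth - bias).astype(np.float32)
    return np.clip(1.0 - occluded / len(sample_offsets), 0.0, 1.0)

# Synthetic depth buffer: a flat wall at depth 1.0 with a closer box on it.
depth = np.full((64, 64), 1.0, dtype=np.float32)
depth[20:44, 20:44] = 0.5
offsets = [(np.cos(a), np.sin(a)) for a in np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)]
ao = ssao(depth, offsets)
print(ao[22, 18], ao[5, 5])  # darker right next to the box edge, fully open far away
```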

Let me just give you a few links on this issue: ...

From your own source:
"Ambient occlusion is an approximation of global illumination that emulates the complex interactions between the diffuse inter-reflections of objects. While not physically accurate (for that use full global illumination), this shader is fast and produces a realistic effect."

That is exactly what I was talking about. And that is why Nvidia is talking about global illumination all the time. You can get raytraced ambient occlusion (as in "the effect you used to get via SSAO", not the actual technique) using pure raytraced output. I still don't understand why you even brought Ambient Occlusion into this discussion. Nobody suggests that approach in raytraced games.
 
Joined
Apr 24, 2020
Messages
2,723 (1.60/day)
I still don't understand why you even brought Ambient Occlusion into this discussion.

You're welcome to review the post where I first brought up AO. The one you responded to here just a few hours ago:


Ambient Occlusion is another funny biased-rendering technique. It's completely fake. Corners do NOT absorb light into invisible black holes. But we use AO techniques because they make shadows look deeper and higher-contrast, which aids the video game player significantly.

------

My point is that Chrispy_'s discussion point on "fake" raytracing techniques is accurate. There's a lot of fakery going on in today's video games (and even movies). The fakery gets better every year, but if you train your eye to see how these 3d simulations are "fake", it becomes pretty easy to pick out the inaccuracies. Yes, even against raytracers (even movie-class raytracers like Arnold). AO happens to be one fake technique that I'm able to personally pick up on somewhat easily. Yes, even in the NVidia demo you linked on the last page.

Raytracing can be fake (see Raytraced AO as a perfect example). Don't assume something is physically accurate just because some marketing material in a slick youtube video tells you so.
 
Joined
Feb 20, 2019
Messages
8,340 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
That is a complete lie. A "library of tile-based approximation" is completely made up. There are denoisers at work, which you are obviously unable or unwilling to comprehend. The noisy ground truth output you posted is exactly the noisy ground truth that you can see in this video:

There is nothing fake about it. There is no tile-based whatever thing you made up used to process that. It uses denoisers.
So this is photo mode (static scene, static lighting, 8000 samples) run through an equalised histogram to expand the contrast range and draw out the tiling patterns that I mentioned were clearly visible in dark scenes. This is a single image, but the patterns are far more obvious in motion: you can control their angle by moving the camera, and your persistence of vision adds a level of temporal blur that helps pick these repeating patterns out of the truly random ray samples, which (without a denoiser) manifest as static noise like an old untuned TV set.

Important notes:
  • No denoise filter
  • No temporal AA
  • The highlighted surface of the railgun is textureless and smooth. It should not have patterns on it, especially not after 8000 samples.

[Attached image: contrast-boosted screenshot of the highlighted railgun surface]


These are repeating chunks of ordered noise - tiles, as I have called them - that cannot be true raytracing. They don't match the Gaussian/quantisation noise you'd get from a non-infinite number of rays, and they have no place being there.
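If anyone wants to reproduce the contrast stretch, the "equalised histogram" step is nothing exotic - roughly the sketch below (numpy + Pillow; the filenames are placeholders, not the actual attachment):

```python
# Histogram-equalise a screenshot so that low-contrast structure in dark
# areas (banding, repeating noise) becomes obvious. Quick sketch, nothing fancy.
import numpy as np
from PIL import Image

def equalise(path_in, path_out):
    img = np.asarray(Image.open(path_in).convert("L"), dtype=np.uint8)
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    cdf = hist.cumsum().astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())   # normalise to 0..1
    lut = np.round(cdf * 255.0).astype(np.uint8)        # remap grey levels
    Image.fromarray(lut[img]).save(path_out)

equalise("screenshot.png", "screenshot_equalised.png")  # placeholder filenames
```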

Nvidia may claim that they didn't cheat, but that's not a quote you can take out of context and apply to everything about raytracing - it's specific to the context of how they did the lighting - with raytracing methods rather than pre-baked static lightmaps or screen-space occlusion via shaders.

The way they did the raytracing is overflowing with cheats and shortcuts, because it has to be. We need 2-3 orders of magnitude more compute power to achieve cheat-free realtime raytracing at current resolutions. If you can't accept that then I don't know what else to say or how to explain it.
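Rough back-of-envelope on that, with every input below being an assumption for illustration (the ~10 gigarays/s figure is Nvidia's own Turing marketing number, and these sample counts are on the low side for a clean path-traced frame):

```python
# How far off is "cheat-free" realtime path tracing? All inputs are
# illustrative assumptions, not measured figures.
import math

pixels      = 2560 * 1440   # one 1440p frame
samples_px  = 1000          # samples per pixel for a reasonably clean image
rays_sample = 8             # primary ray plus a few bounce and shadow rays
fps         = 60

rays_needed = pixels * samples_px * rays_sample * fps
turing_rays = 10e9          # ~10 gigarays/s, Nvidia's Turing marketing claim

shortfall = rays_needed / turing_rays
print(f"rays/s needed: {rays_needed:.2e}")
print(f"shortfall:     ~{shortfall:.0f}x (~{math.log10(shortfall):.1f} orders of magnitude)")
```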
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
You're welcome to review the post where I first brought up AO. The one you responded to here just a few hours ago:
So basically you brought ambient occlusion into the discussion, even though no one mentioned it before and no one suggests actually using it in raytraced games, because you read somewhere that it's fake and that it's available in some professional renderers that also do raytracing - even though they themselves clearly state it's fake and that if you want the real thing you should use global illumination. Is that it? Yes, ambient occlusion is fake. I said that myself. That's why Nvidia demonstrates how to get rid of it with global illumination. Are we done here?

My point is that Chrispy_'s discussion point on "fake" raytracing techniques is accurate.

No, it is not. And yes, there is a lot of fakery in today's games. That's why we're replacing it with physically based rendering - raytracing - to get rid of the fake. A person who tried to compare screenshots of reflections of two different materials is not going to convince me otherwise.

So this is photo mode (static scene, static lighting, 8000 samples) run through an equalised histogram to expand the contrast range and draw out the tiling patterns that I mentioned were clearly visible in dark scenes. This is a single image, but the patterns are far more obvious in motion: you can control their angle by moving the camera, and your persistence of vision adds a level of temporal blur that helps pick these repeating patterns out of the truly random ray samples, which (without a denoiser) manifest as static noise like an old untuned TV set.

Important notes:
  • No denoise filter
  • No temporal AA
  • The highlighted surface of the railgun is textureless and smooth. It should not have patterns on it, especially not after 8000 samples.


These are repeating chunks of ordered noise
Those are not patterns. You are trying to conjure up things where there are none.
 
Joined
Apr 24, 2020
Messages
2,723 (1.60/day)
no one suggests actually using it in raytraced games

Really? No one? Not a single company you can think of that's pushing for Raytraced Ambient Occlusion?

* NVidia:
* Unity: https://docs.unity3d.com/Packages/c...@7.1/manual/Ray-Traced-Ambient-Occlusion.html
* Unreal: https://docs.unrealengine.com/en-US/Engine/Rendering/RayTracing/index.html

------

I'm not even "against" AO. It looks cool. It improves contrast, it helps video game characters stick out. But its a fake effect for sure. And its no surprise: true unbiased global illumination is far beyond the capabilities of modern computers. I'm not even talking about "Realtime", I'm talking about movies who spend 8+ hours per frame on supercomputer clusters. Ground-truth global illumination is simply too expensive to actually calculate.

So AO, a "cheaty fake" shadow system, will remain. It's the best we've got at our current level of computing power.
 
Joined
Feb 20, 2019
Messages
8,340 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Those are not patterns. You are trying to conjure up things where there are none.
The denial is strong here. I can make the contrast even more obvious just by repeating the process in a darker area, but if you can't see those patterns then it's time to get your eyes tested. Hell, fire up Quake II for yourself and disable all the filters to see them in motion; motion makes them 10x more obvious.

Or, continue the denial; I don't gain anything from your acceptance and it doesn't reflect on me.
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
Really? No one? Not a single company you can think of that's pushing for Raytraced Ambient Occlusion?
OMG. I said it before; it seems I need to repeat it - they are referring to raytraced ambient occlusion because the whole gaming industry has been referring to the "darkened corners effect" as ambient occlusion.

ON THE FIRST SLIDE of the Nvidia video it says:
"Physically correct ambient occlusion"

Which should have hinted to you why they are using that term. They are replacing ambient occlusion with a physically correct effect. It's even under the video in the description: "SSAO (Screen Space Ambient Occlusion) is a popular but limited process being used in contemporary games. Ray tracing provides better results."

I feel like I'm talking to a wall. Please actually watch the video. They are not suggesting anything like the sphere-based approximations some of the renderers you mentioned are using. They are simply casting rays. What they are talking about in the slides is how the DENOISER was modified to work well, saying that some areas need more samples per pixel to produce correct results when using raytracing. Nothing else. Nothing faked.

Please next time at least watch the video before you post it.

The denial is strong here. I can make the contrast even more obvious just by repeating the process in a darker area, but if you can't see those patterns then it's time to get your eyes tested. Hell, fire up Quake II for yourself and disable all the filters to see them in motion; motion makes them 10x more obvious.

Or, continue the denial; I don't gain anything from your acceptance and it doesn't reflect on me.
The imagination is strong on your part.


I hope we are done here.
 
Joined
Mar 21, 2016
Messages
2,508 (0.78/day)
The thing to do would be to use two blower fans and have some cut-out inlets in the middle. The inlets could either simply draw air through, with a blower on the bottom rear of the PCB pushing it toward the middle and another fan on the top front of the card expelling all the heat it draws in out the back of the case; or you could use two blowers on the rear of the PCB with inlets woven between cut-out holes in the PCB for heat pipes to pass through. The big advantages are that two blower fans could definitely expel heat outside the case more quickly, and that by utilising both the top and bottom of the PCB you get more area for heatsink cooling. This isn't the first time I've mentioned the concept of pairing blower fans with active cooling on both the top and bottom of a GPU in a 3-slot cooler design. Something I hadn't thought of in the past is adding cut-out inlets to weave heat pipes through from the bottom to the top of the PCB, which I think would be great - or even just as air holes to draw the hot air up and out rather than leaving it trapped against the PCB under load and heating it up in the process, which is far from ideal.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.36/day)
Nothing beats deshrouding a non-reference cooler and fitting custom 100 mm or bigger fans that cover the shape of the heatsink perfectly. 20°C lower at 20 dB less. But those geniuses are far from perfecting this thing; give or take another 10 years and they may eventually get there. They have to go through all the possible sketchy designs first.
 
Joined
Jul 9, 2015
Messages
3,413 (0.99/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
That's not traditional rasterization; that demo in fact uses some form of tracing for its global illumination system.

Epic was asked "so, were you using RT?" and the answer was "nope".
Let's talk about "traditional" and "non-traditional" rasterization, shall we?
 

Dux

Joined
May 17, 2016
Messages
511 (0.16/day)
When you consider that the Xbox Series X RDNA 2 GPU has 12 TFLOPS of performance (on par with the RTX 2080 Super), a high-end RDNA 2 graphics card will surely be faster than the 2080 Ti. Probably on par with the 3080. So Nvidia had to create this 3090 monstrosity to make sure it remains the leader with the fastest discrete GPU.
 
Joined
Jan 11, 2005
Messages
1,491 (0.20/day)
Location
66 feet from the ground
System Name 2nd AMD puppy
Processor FX-8350 vishera
Motherboard Gigabyte GA-970A-UD3
Cooling Cooler Master Hyper TX2
Memory 16 Gb DDR3:8GB Kingston HyperX Beast + 8Gb G.Skill Sniper(by courtesy of tabascosauz &TPU)
Video Card(s) Sapphire RX 580 Nitro+;1450/2000 Mhz
Storage SSD :840 pro 128 Gb;Iridium pro 240Gb ; HDD 2xWD-1Tb
Display(s) Benq XL2730Z 144 Hz freesync
Case NZXT 820 PHANTOM
Audio Device(s) Audigy SE with Logitech Z-5500
Power Supply Riotoro Enigma G2 850W
Mouse Razer copperhead / Gamdias zeus (by courtesy of sneekypeet & TPU)
Keyboard MS Sidewinder x4
Software win10 64bit ltsc
Benchmark Scores irrelevant for me
The two 8-pin connections of the 12-pin cable go into the PSU; this is different from the 150W 8-pin you plug into the GPU. These are the sockets that normally power your 2x8-pin cable, each rated up to 300W. So theoretically the 12-pin is good for up to 600W.

This doesn't necessarily hint at anything about Ampere's power draw, but it could mean that even the Founders Edition is going to be able to pull over 375W. Most likely not at the reference TDP, but after raising the power target to the cap. Theoretically there would be no need for dual 8-pin if it were capped at 320W like the 2080 Ti. Using two sockets on the PSU instead of one certainly creates some kind of compatibility concern; they wouldn't go for it if it wasn't needed. I wonder if there is going to be a single 8-pin version for lower-end cards like the 3070, assuming they get the 12-pin too.

Wrong; the two 8-pin connectors of the 12-pin cable (provided by the PSU maker) use the same PSU sockets as the current 8-pin cables do, one connector per cable, so it's the same wattage per connector. Each PSU maker will have to make these new cables available, as connector/socket types at the PSU vary by brand...

In addition, for those who don't have modular PSUs but do have the two 8-pin PCIe cables, there will be adapters - two female 8-pin to one male 12-pin. This adapter can also be used with a modular PSU by connecting the existing PCIe cables, so you won't have to buy the special cable mentioned above, which may be expensive compared to the adapter, btw...
 
Joined
Feb 20, 2019
Messages
8,340 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Nothing beats deshrouding a non-reference cooler and fitting custom 100 mm or bigger fans that cover the shape of the heatsink perfectly. 20°C lower at 20 dB less. But those geniuses are far from perfecting this thing; give or take another 10 years and they may eventually get there. They have to go through all the possible sketchy designs first.
AMD and Nvidia are unwilling to design a card that looks industrial, and - much like motherboard manufacturers - are obsessed with functional compromises for the sake of aesthetic design cues. Most of the time the aesthetic choices are tons of decorative plastic, lighting strips, and airflow obstructions that exist only to hold a brand logo or name plate.

The obvious solution is much like you say - a full-coverage vapour chamber to deal with the GPU, VRAM, and VRMs, all connected to a 280mm heatsink that gets optimum cooling from a pair of regular 140mm fans controlled via a PWM header on the GPU board. Third-party solutions exist in the form of huge aftermarket air coolers, but they are all poor compromises designed to fit a wide range of cards and can never be as specific to one card as the cooling solution an OEM designs for that one exact board layout.
 
Joined
Jul 5, 2008
Messages
337 (0.06/day)
System Name Roxy
Processor i7 5930K @ 4.5GHz (167x27 1.35V)
Motherboard X99-A/USB3.1
Cooling Barrow Infinity Mirror, EK 45x420mm, EK X-Res w 10W DDC
Memory 2x16GB Patriot Viper 3600 @3333 16-20-20-38
Video Card(s) XFX 5700 XT Thicc III Ultra
Storage Sabrent Rocket 2TB, 4TB WD Mechanical
Display(s) Acer XZ321Q (144Mhz Freesync Curved 32" 1080p)
Case Modded Cosmos-S Red, Tempered Glass Window, Full Frontal Mesh, Black interior
Audio Device(s) Soundblaster Z
Power Supply Corsair RM 850x White
Mouse Logitech G403
Keyboard CM Storm QuickFire TK
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/e5uz5f
It's right there at ~2:25. Also, no crazy airflow apocalypse in the case as some suggested. The back fan is in a pull config. Edit: Or they revised the design and put the second fan on the front like a sane person would do...
[Attached screenshot from the video]

Look at all that innovation:

My old XFX Fury (2015???) had a triple-fan cooler with a short PCB like that, so the third fan blew upwards; it resulted in lower CPU temperatures under combined load vs just CPU load in some situations.

I suspect a similar approach will be taken on many AIB cards. Does this mean the 3090 is going with HBM? Surely you can't squash 20+ GB of GDDR6 onto a short PCB like that?

Got a triple slot cooler on your new top end card? Welcome to 2008, you're going to love it!

Got a vapour chamber on your GPU? Welcome to 2006, you're going to love it.
 
Joined
Feb 20, 2019
Messages
8,340 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Look at all that innovation:

My old XFX Fury (2015???) had a triple-fan cooler with a short PCB like that, so the third fan blew upwards; it resulted in lower CPU temperatures under combined load vs just CPU load in some situations.

I suspect a similar approach will be taken on many AIB cards. Does this mean the 3090 is going with HBM? Surely you can't squash 20+ GB of GDDR6 onto a short PCB like that?

Got a triple slot cooler on your new top end card? Welcome to 2008, you're going to love it!

Got a vapour chamber on your GPU? Welcome to 2006, you're going to love it.
Nvidia's just taking the 'Apple stance' of re-using an existing idea and claiming they thought of the innovation themselves, and then charging extra for the privilege.
Fanboys will take them at their word because they're in love, or something like that.
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
Nvidia's just taking the 'Apple stance' of re-using an existing idea and claiming they thought of the innovation themselves, and then charging extra for the privilege.
Fanboys will take them at their word because they're in love, or something like that.
Nowhere in the video are they claiming anything like that. In typical AMD fanboy fashion you are making things up, just like in our previous discussion. The only place they mention something that hasn't been done before relates to the springs and backplate attachment - not the actual airflow, the cut-out board, or anything else you mentioned. I urge whoever is reading this to actually watch the video.
 
Joined
Apr 24, 2020
Messages
2,723 (1.60/day)
In typical AMD fanboy fashion you are making things up

The dude was literally posting screenshots of a 2060 doing raytracing in Quake on the last page. He has a freaking NVidia GPU. He's willing to spend large amounts of time figuring out how it works.

Raytracing a scene fully on my 2060S at native resolution still takes 20 seconds to get a single, decent-quality frame
 
Joined
Aug 31, 2016
Messages
104 (0.03/day)
Wrong; the two 8-pin connectors of the 12-pin cable (provided by the PSU maker) use the same PSU sockets as the current 8-pin cables do, one connector per cable, so it's the same wattage per connector. Each PSU maker will have to make these new cables available, as connector/socket types at the PSU vary by brand...

In addition, for those who don't have modular PSUs but do have the two 8-pin PCIe cables, there will be adapters - two female 8-pin to one male 12-pin. This adapter can also be used with a modular PSU by connecting the existing PCIe cables, so you won't have to buy the special cable mentioned above, which may be expensive compared to the adapter, btw...

Your 2x8-pin cable connects to the PSU with one 8-pin connector, not two. If the 12-pin cable uses two, then it should be able to pull 600W, at least in theory. But if that's not the case, then what is all the crying about power draw for, if the card can't possibly pull more than your regular 2x8-pin one? A 2080 Ti already pulls more than that after OC, since it is not entirely satisfied with a 380W power limit - and that's with 11GB of memory, not 24. If the 3090 can deliver a 50% performance uplift at the same power and with 24GB of memory, then it will be very efficient.
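Putting rough numbers on it (the 75 W slot and 150 W 8-pin figures are the usual PCIe limits; the 300 W per PSU-side socket is the assumption from the posts above, not an official spec):

```python
# Back-of-envelope GPU power budgets. PSU_SOCKET_W is an assumption taken
# from this thread, not a number from any specification.
PCIE_SLOT_W  = 75    # PCIe x16 slot
EIGHT_PIN_W  = 150   # one 8-pin PCIe connector at the card
PSU_SOCKET_W = 300   # assumed rating of one PSU-side 8-pin socket

classic_2x8pin    = PCIE_SLOT_W + 2 * EIGHT_PIN_W   # 375 W, today's usual ceiling
twelve_pin_1_sock = PCIE_SLOT_W + PSU_SOCKET_W      # 375 W if the 12-pin uses one socket
twelve_pin_2_sock = PCIE_SLOT_W + 2 * PSU_SOCKET_W  # 675 W if it really uses two

print(classic_2x8pin, twelve_pin_1_sock, twelve_pin_2_sock)
```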
 
Joined
Apr 24, 2020
Messages
2,723 (1.60/day)
When you consider that the Xbox Series X RDNA 2 GPU has 12 TFLOPS of performance (on par with the RTX 2080 Super), a high-end RDNA 2 graphics card will surely be faster than the 2080 Ti. Probably on par with the 3080. So Nvidia had to create this 3090 monstrosity to make sure it remains the leader with the fastest discrete GPU.

I don't know if we should be comparing flops-for-flops across architectures. Case in point: Vega64 had 12 TFlops (single precision) of performance.

NVidia chips have always had fewer TFlops than AMD chips, and yet NVidia delivers higher FPS when it actually comes to games. I'm sure a lot of that is down to the PTX compiler and/or other parts of the driver.
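For reference, those TFLOPS figures are just shader count × 2 FMA ops × clock; the clocks below are approximate boost figures, purely for illustration:

```python
# Theoretical FP32 throughput = shaders x 2 ops (fused multiply-add) x clock.
# Clock values are approximate boost clocks, for illustration only.
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0

print(f"Vega 64  : {tflops(4096, 1.55):.1f} TFLOPS")  # ~12.7
print(f"GTX 1080 : {tflops(2560, 1.73):.1f} TFLOPS")  # ~8.9, yet similar gaming FPS
```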

With that being said: RDNA has made advancements in efficiency. And XBox Series X / PS5 seem to have high-powered raytracing cores (Ray-box and Ray-triangle).
 
Joined
Feb 20, 2019
Messages
8,340 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Nowhere in the video are they claiming anything like that. In typical AMD fanboy fashion you are making things up, just like in our previous discussion. The only place they mention something that hasn't been done before relates to the springs and backplate attachment - not the actual airflow, the cut-out board, or anything else you mentioned. I urge whoever is reading this to actually watch the video.
AMD fanboy. LOL, I think you've earned yourself an ignore.
A discussion with you is much like arguing with a flat-earther; Facts are irrelevant and you're in denial of real evidence.
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
AMD fanboy. LOL, I think you've earned yourself an ignore.
A discussion with you is much like arguing with a flat-earther; Facts are irrelevant and you're in denial of real evidence.
So where in the video is Nvidia saying they invented all this? Could you give us a timestamp where we can see and hear that, since you're so full of relevant facts? ;)

The dude was literally posting screenshots of a 2060 doing raytracing in Quake on the last page. He has a freaking NVidia GPU. He's willing to spend large amounts of time figuring out how it works.

Yes, of course he is making things up. FYI I do have Quake II RTX - in fact the full game including all expansions, not just the demo - and I played through the whole thing, including experimenting, on my RTX 2080 Ti. There is nothing like what he is claiming happening in the game.

[Attached screenshot from Quake II RTX]


He is actually artificially increasing any shimmer in the picture when he "run[s it] through an equalised histogram to expand the contrast range and draw out the tiling patterns". Only, what he is trying to convince us are tiling patterns is nothing more than the effect of multiple other phenomena, including but not limited to noise-induced order and the fact that floating-point numbers in computer software are in reality discrete, not continuous.

"One distinguishing feature that separates traditional computer science from scientific computing is its use of discrete mathematics (0s and 1s) instead of continuous mathematics and calculus. Transitioning from integers to real numbers is more than a cosmetic change. Digital computers cannot represent all real numbers exactly, so we face new challenges when designing computer algorithms for real numbers. Now, in addition to analyzing the running time and memory footprint, we must be concerned with the "correctness" of the resulting solutions. This challenging problem is further exacerbated since many important scientific algorithms make additional approximations to accommodate a discrete computer. Just as we discovered that some discrete algorithms are inherently too slow (polynomial vs. exponential), we will see that some floating point algorithms are too inaccurate (stable vs. unstable)."

I hope Princeton as a source is good enough. :D
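The discreteness part is easy to see for yourself (plain Python; math.ulp needs Python 3.9+):

```python
# Floating-point numbers are a discrete grid, not the continuous real line.
import math

print(0.1 + 0.2 == 0.3)    # False: neither side is exactly representable
print(math.ulp(1.0))       # gap to the next representable double above 1.0
print(math.ulp(1e16))      # the grid gets coarser as the values grow (2.0 here)
print(1e16 + 1.0 == 1e16)  # True: adding 1 disappears below the grid spacing
```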

He expanded the contrast to the extreme, chasing ghosts. Nothing else.
 
Joined
Apr 24, 2020
Messages
2,723 (1.60/day)
Let's see. On the one hand, there's Chrispy_, a dude who has been discussing things reasonably over at TechReport for over a decade and has proven himself to me (multiple times) to have a sharp mind.

On the other hand, there's Jinxed, with ~50 posts in his history - and literally every single one of them is about Nvidia-vs-AMD bullshit I don't care about. Someone who gets flustered over the simple mention of ambient occlusion and... floating point numbers.

Clean up your posting history, Jinxed. Start posting about other topics, and prove yourself to me if you expect me to take you seriously.
 