
Intel Arc B580

Joined
Dec 6, 2022
Messages
516 (0.67/day)
Location
NYC
System Name GameStation
Processor AMD R5 5600X
Motherboard Gigabyte B550
Cooling Arctic Freezer II 120
Memory 16 GB
Video Card(s) Sapphire Pulse 7900 XTX
Storage 2 TB SSD
Case Cooler Master Elite 120
TBH, visually games have barely gotten better looking over the past 4 years while being noticeably heavier in computing requirements. We've reached a point where more details and higher poly count are also becoming gimmicky because you only notice the difference when you stop and stare.
Oh yes, agreed.

We reached that point a little while ago, and I lament that many devs continue going for the eye candy but seem to have forgotten that gameplay is just as important.

RT could bring something new in the eye-candy department, though not necessarily in gameplay, but so far I haven't seen anything that makes me want to ignore the performance hit.

Funnily enough, going by that, it does help justify GPUs like this one, given its MSRP and performance.
 
Joined
Jul 20, 2020
Messages
1,166 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
Witcher 3 results are still not comparable to any other reviewer's and give the appearance that this card is faster than it actually is.

W1zz tests The Witcher 3 in DX11 where most/all other places test it in the newer DX12 implementation. Both are valid but I prefer DX11 here to represent performance in older titles, many of which I'm still playing.
 
Joined
Feb 1, 2019
Messages
85 (0.04/day)
Location
Larvik, Norway
I still can't get my head around this:
  • No support for DLSS (yes I know it's an NV exclusive, still doesn't change the fact that you can have it on one option and not on others)
I thought XeSS was about as good as DLSS. Why would one care about DLSS if the card supports a very similar version of it? Also, the last TPU front-page poll found that only 2.8% of TPU readers care about upscaling and frame generation. I would almost say these technologies are all cons, as we are being forced to pay for the development of something the vast majority of us don't want.

As long as NV GPUs being reviewed get the equivalent comment, "Cons: No support for XeSS 2," it all works out in the end. However, rewarding manufacturers who make use of proprietary tech by penalizing those who do not requires a spoon-fed explanation if I am to make any sense of it.
 
Joined
Dec 9, 2024
Messages
95 (2.97/day)
Location
Missouri
System Name Don't do thermal paste, kids
Processor Ryzen 7 5800X
Motherboard ASUS PRIME B550-PLUS AC-HES
Cooling Thermalright Peerless Assassin 120 SE
Memory Silicon Power 32GB (2 x 16GB) DDR4 3200
Video Card(s) RTX 2080 Super Founders Edition
Display(s) Gigabyte G27Q
Case SAMA SV01
Power Supply Firehazard in the making
Mouse Corsair Nightsword
Keyboard Steelseries Apex Pro
I thought XeSS was about as good as DLSS. Why would one care about DLSS if the card supports a very similar version of it? Also, the last TPU front-page poll found that only 2.8% of TPU readers care about upscaling and frame generation. I would almost say these technologies are all cons, as we are being forced to pay for the development of something the vast majority of us don't want.
It isn't, but neither is FSR, to be fair (I'd still like to see both reach that point, though). Never take a poll at face value, even TPU's. There are pros and cons to upscaling and frame gen; I think the positives outweigh the negatives as long as developers don't rely on them, but it seems we're seeing an increased reliance on upscaling, frame gen, etc.
 
Joined
Dec 28, 2012
Messages
4,032 (0.92/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
As long as NV GPUs being reviewed get the equivalent comment, "Cons: No support for XeSS 2," it all works out in the end. However, rewarding manufacturers who make use of proprietary tech by penalizing those who do not requires a spoon-fed explanation if I am to make any sense of it.
When there are swaths of games that support XeSS but not DLSS, that will make sense. As it stands, DLSS is widespread while the other two are spotty to missing, so not having it is a major downside on non-Nvidia GPUs.

Sucks, but the real world is not fair and open. Never has been.
 
Joined
Dec 25, 2020
Messages
7,215 (4.88/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
When there are swaths of games that support XeSS but not DLSS, that will make sense. As it stands, DLSS is widespread while the other two are spotty to missing, so not having it is a major downside on non-Nvidia GPUs.

XeSS SR is all that's necessary; the other components (hardware-assisted frame generation and low latency) are already available on NV hardware. You can even use NVIDIA Reflex low latency in conjunction with a XeSS 1.3 or FSR 3 context; they're independent of one another.
 
Joined
Jul 24, 2024
Messages
316 (1.86/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
Having a problem with a company does not justify such childish behaviour.
What is and what is not considered childish behavior is relative.

PhysX left many gamers pissed, which is why it's "PissX." One of the reasons has already been mentioned in this thread, but there are others, too.

To me, the childish one is Nvidia, who lets their kids play only with their own toys and forbids the kids from sharing them. This is one of the fundamental differences between AMD and Nvidia: Nvidia is a corporation that keeps its technologies locked up and charges money for them, while AMD's inventions are often released as open source. I get it, it's all for money. However, sharing such things can radically speed up the development of other things and might result in much better things/inventions. To me, sharing seems more adult than childish. Even Intel does open-sourcing. Sometimes Nvidia product owners get f*cked over when Nvidia says a technology won't be available on previous-gen cards, because... god only knows why, as all required components (computing units) are present on previous-gen products as well.

Being greedy and not wanting to share toys is also typical of kids at a very young age, mostly ages 2-5.

Btw, have you read how Jensen avoided paying $8 billion in taxes? One definition of greed is never being satisfied with the assets one already possesses, always thinking one can have more and more. An honest businessman with lots of money would not give a f* about paying taxes, but someone deliberately abusing a hole in the law is not honest, that's all I'm saying. And US tax laws are a joke, that's also all I'm saying.

So, Nvidia does this charity thing called the "GeForce Fund," which helps poor souls in San Francisco, but god only knows how much it really helps and how much money these poor souls see in the end, because US law does not require disclosing how this money is spent (or how much of it). :kookoo:
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,991 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
However, sharing such things can radically speed up the development of other things and might result in much better things/inventions
Has it? Have you seen any meaningful returns from AMD's open source initiatives? I only see NVIDIA advancing technology in major ways, and others just follow and are like "o shit, how can we do this with our own tech?" .. the CES press conferences will be interesting
 
Joined
Jul 24, 2024
Messages
316 (1.86/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
Has it? Have you seen any meaningful returns from AMD's open source initiatives? I only see NVIDIA advancing technology in major ways, and others just follow and are like "o shit, how can we do this with our own tech?" .. the CES press conferences will be interesting
Yes. AMD developed Mantle and open sourced it. This paved the path for Vulkan. Thanks to Vulkan, we can play games without emulation on Linux. Many games run smoother on Linux, but TPU only tests Windows. It would be good to have at least a few games tested on Linux, so that we know how much performance we are losing to Windows.

Nvidia advances technologies that aim to alter the original image in order to increase fps. That's not a proper way to justify the ever-rising complexity and prices of their hardware. But that's my humble opinion.
 
Joined
May 25, 2022
Messages
137 (0.14/day)
Looks like they can still pull more from driver improvements, because it still has the performance "flaws" Alchemist has, namely that it's weak at lower resolutions. I had hoped that the change in architecture and increased utilization would have improved that, but instead of enhancing it at 1080p, it makes the card relatively more competitive at 1440p and 4K. At the higher resolutions the card is rather formidable. It also fixes some issues, such as Alchemist falling really far behind in Assassin's Creed games; the near-50% improvement makes it a strong contender there.

Perhaps a 10% general boost at 1080p is possible. The card is more powerful than Alchemist and thus runs into bottlenecks more often. Also, we haven't gotten the driver that would give a general boost to DX11 yet; they are still targeting games individually. It's also weaker in Unreal Engine 5 games, which is a surprise, since features like ExecuteIndirect are supposed to especially benefit engines like UE5.

Despite still being behind in performance per area, I estimate it delivers 20-30% more performance ISO-process compared to Alchemist. They also managed to improve power efficiency despite clocking 20% or so higher (which is a typical node gain), so the power-efficiency improvements are significantly attributable to architecture, not node. ISO-process perf/W is probably 35-45% better than Alchemist.

A 10% general boost at 1080p via drivers might be a nice surprise to counter the RDNA 4 and RTX 5000 series competition.

Based on the A580 vs. A770 results, a hypothetical B770 "G31" might just be able to reach the 4070 Super at 4K, as it would have 60% more shaders and fill rate, and the clock gap is supposed to be greater than between the A580 and A770, possibly in the 3.2-3.3 GHz range. Lower resolutions will depend on the driver and will probably land around the regular 4070.
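Purely as a back-of-the-envelope illustration of that estimate, here is a rough sketch in Python. The shader counts, clocks, and scaling-efficiency factor are all assumptions (rumored or eyeballed values), not measured data:

```python
# Rough scaling estimate for a hypothetical B770 ("G31") vs. the B580.
# All inputs are assumptions for illustration, not measurements.

B580_SHADERS = 2560        # 20 Xe2 cores
B770_SHADERS = 4096        # rumored 32 Xe2 cores (hypothetical, +60%)
B580_CLOCK_GHZ = 2.85      # assumed typical gaming clock
B770_CLOCK_GHZ = 3.25      # midpoint of the 3.2-3.3 GHz guess above
SCALING_EFFICIENCY = 0.8   # assumed: wider GPUs rarely scale linearly

raw_ratio = (B770_SHADERS / B580_SHADERS) * (B770_CLOCK_GHZ / B580_CLOCK_GHZ)
est_ratio = 1 + (raw_ratio - 1) * SCALING_EFFICIENCY

print(f"Raw throughput ratio:     {raw_ratio:.2f}x")           # ~1.82x
print(f"Estimated 4K performance: {est_ratio:.2f}x the B580")  # ~1.66x
```

Whether roughly 1.66x a B580 actually lands at 4070 Super level depends entirely on how well the drivers keep the wider chip fed, which is exactly the open question at lower resolutions.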

Solid card, for now.
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,991 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Yes. AMD developed Mantle and open sourced it. This paved the path for Vulkan
Ah yes, good old times when they were competitive. Over 10 years ago now .. Mantle was never open sourced though afaik, just the API spec was basically carried forward. and still Vulkan games make up less than 1% of the market. it's the only reason I kept Doom Eternal in my test suite.

Yes, before anyone asks, I'll be adding Indiana Jones in next rebench

Space Marine 2 has been retested on all cards and readded to the charts
 
Joined
Nov 13, 2024
Messages
96 (1.66/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
Has it? Have you seen any meaningful returns from AMD's open source initiatives? I only see NVIDIA advancing technology in major ways, and others just follow and are like "o shit, how can we do this with our own tech?" .. the CES press conferences will be interesting
FreeSync also wasn't bad... it's on almost any monitor nowadays. BUT it doesn't earn AMD anything, and it's also advertised as "G-Sync Compatible," so most users think it's an Nvidia thing.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,991 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
FreeSync also wasn't bad... it's on almost any monitor nowadays. BUT it doesn't earn AMD anything, and it's also advertised as "G-Sync Compatible," so most users think it's an Nvidia thing.
That's a nice example indeed. NV invented G-Sync, AMD was "o shit", but luckily they figured out how to run it without $$ dedicated hardware in the monitors and gave it to VESA, who made it a standard
 
Joined
Nov 13, 2024
Messages
96 (1.66/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
There was talk earlier about differing control panels; this video shows off the Intel side of things.
Looks pretty clean. Let's see if they can keep it that way ^^
 
Joined
Mar 18, 2015
Messages
182 (0.05/day)
Ah yes, good old times when they were competitive. Over 10 years ago now .. Mantle was never open sourced though afaik, just the API spec was basically carried forward. and still Vulkan games make up less than 1% of the market. it's the only reason I kept Doom Eternal in my test suite.
Not everything is about the latest AAA slopfest. People use GPUs for other things. Vulkan is the beating heart of the emulation scene, along with gaming on Linux as a whole. Devices like the Steam Deck simply wouldn't exist without it. Not to mention Android, where Vulkan is the standard graphics API, having replaced the now-deprecated OpenGL ES. The impact of Vulkan on the world has been gigantic, and only an ignorant person would try to downplay it.

As for Mantle being open sourced, AMD donated it to the Khronos Group as a base to develop Vulkan. If you want to see the Mantle source as it existed at the time development on it ceased, just grab Vulkan 1.0 from 2016. Obviously modern Vulkan has been hugely extended and expanded upon since then.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,991 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The impact of Vulkan on the world has been gigantic, and only an ignorant person would try to downplay it.
No doubt, but in the context of modern commercial AAA game development on PC it barely exists. I wish it was different

If you want to see the Mantle source as it existed at the time development on it ceased, just grab Vulkan 1.0 from 2016
I don't think that is an accurate statement. AFAIK Mantle is not compatible with Vulkan, not even 1.0
 
Joined
Jul 5, 2013
Messages
28,464 (6.77/day)
The impact of Vulkan on the world has been gigantic, and only an ignorant person would try to downplay it.
Wait, was that a slam on W1z? Are you kidding?

Vulkan is not as popular on Windows, which is why it's not focused on much by anyone reviewing hardware.

I don't think that is an accurate statement. AFAIK Mantle is not compatible with Vulkan, not even 1.0
I would agree. Vulkan is a rework and recompile of Mantle, and as such there are differences. It's not a one-to-one transition.
 
Joined
Jul 24, 2024
Messages
316 (1.86/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
Ah yes, good old times when they were competitive. Over 10 years ago now .. Mantle was never open sourced though afaik, just the API spec was basically carried forward. and still Vulkan games make up less than 1% of the market. it's the only reason I kept Doom Eternal in my test suite.
Was your previous question somehow time-limited? Anyway, there are other games that support Vulkan as an alternative API, AAA games too (RDR2, Detroit: Become Human, Rainbow Six Siege). Many games and applications ported to Android leverage Vulkan because it is OS-independent and cross-platform. Many gamers hate Windows 11 and would go for Linux the next day if there were support for their games and hardware. I think you underestimate what an OS-independent API means. Valve has big plans for it, and Valve surely is a company that can give the gaming scene a twist.

Mantle was given to Khronos by AMD knowing Khronos' intent to make a new open API based on it. Surely, if AMD did not want to make it open, they would have specified that in the terms they agreed upon with Khronos.

That's a nice example indeed. NV invented G-Sync, AMD was "o shit", but luckily they figured out how to run it without $$ dedicated hardware in the monitors and gave it to VESA, who made it a standard
DisplayPort Adaptive-Sync was an optional part of VESA's DisplayPort 1.2a standard, released in January 2013. Nvidia released G-Sync in October 2013, which is a bit later. AMD did not give anything to VESA; Adaptive-Sync was there first, and AMD FreeSync was then "made" on top of this already-existing part of VESA's DisplayPort VRR technology. VESA later renamed the technology at AMD's request. AMD showed the first prototype of a FreeSync monitor in 2014, a year and a few months after the DP 1.2a specification was released. Nvidia invented something that had already been invented (and invented without requiring an additional device). The foundation for adaptive-sync technology has been part of the eDP specification since 2009, as this article on TPU states:
Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA's embedded DisplayPort (eDP) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.
It was already there for eDP; they added it for external DP in 2013 in the 1.2a revision.

While fighting for Nvidia's G-Sync, please don't forget to mention the 14 W permanent power draw of the monitor's G-Sync module, even while the monitor is in standby mode.

Now I clearly see why Nvidia-related news on TPU gets much more attention than news related to anything else.
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,991 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
DisplayPort Adaptive-Sync was an optional part of VESA's DisplayPort 1.2a standard, released in January 2013. Nvidia released G-Sync in October 2013, which is a bit later. AMD did not give anything to VESA; Adaptive-Sync was there first, and AMD FreeSync was then "made" on top of this already-existing part of VESA's DisplayPort VRR technology. VESA later renamed the technology at AMD's request. AMD showed the first prototype of a FreeSync monitor in 2014, a year and a few months after the DP 1.2a specification was released. Nvidia invented something that had already been invented (and invented without requiring an additional device). The foundation for adaptive-sync technology has been part of the eDP specification since 2009, as this article on TPU states:
I must be an idiot, because I was at the G-SYNC press event in London, and nobody had ever seen something like it. A while later I was at AMD's event, where they showed Adaptive Sync running on their GPUs and everybody was like "ah, so finally they have G-SYNC"
 
Joined
Jul 24, 2024
Messages
316 (1.86/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
I must be an idiot, because I was at the G-SYNC press event in London, and nobody had ever seen something like it. A while later I was at AMD's event, where they showed Adaptive Sync running on their GPUs and everybody was like "ah, so finally they have G-SYNC"
I'd say you reviewers don't give a damn about 500-page specifications. And who does? When there is a generational change, people mostly talk about bandwidth. I was surprised by that fact from 2009 too; I had thought of 2011 or 2012 as the beginning of VRR, but I was wrong. I felt the need to let you know that AMD did not give VESA anything to standardize into an AMD G-Sync-alike; VRR technology was already specified nine months before Nvidia came out with G-Sync, and as it turns out, VESA itself states an even earlier date for the embedded DP standard. AMD did make a proposal to VESA, but that was a bit later than when the DP 1.2a spec was released.

Thanks for fixing the Space Marine 2 section; I could not believe my eyes back then when I saw the B580 beating the RX 6800.

Also, while I don't want to sound rude or unappreciative of your work, maybe it'd be wise to expand GPU efficiency testing to more than just Cyberpunk to make the results more accurate. The B580 does very well in CP2077, but on average the RTX 4060 is the real efficiency king across the board.
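As a minimal sketch of what multi-game efficiency aggregation could look like (all numbers below are made-up placeholders, not real review data), a geometric mean of fps-per-watt across titles would prevent a single outlier like CP2077 from dominating the result:

```python
from math import prod

# Hypothetical per-game measurements: average fps and board power in watts.
# Placeholder values for illustration only.
results = {
    "Cyberpunk 2077": {"fps": 62.0, "watts": 185.0},
    "Game B":         {"fps": 55.0, "watts": 178.0},
    "Game C":         {"fps": 98.0, "watts": 160.0},
}

# Efficiency per title, then a geometric mean so no single game dominates.
eff = [g["fps"] / g["watts"] for g in results.values()]
geomean = prod(eff) ** (1 / len(eff))

for name, g in results.items():
    print(f"{name:15s} {g['fps'] / g['watts']:.3f} fps/W")
print(f"{'Geometric mean':15s} {geomean:.3f} fps/W")
```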
 
Joined
Jun 25, 2018
Messages
34 (0.01/day)
Ah yes, good old times when they were competitive. Over 10 years ago now .. Mantle was never open sourced though afaik, just the API spec was basically carried forward. and still Vulkan games make up less than 1% of the market. it's the only reason I kept Doom Eternal in my test suite.

Yes, before anyone asks, I'll be adding Indiana Jones in next rebench

Space Marine 2 has been retested on all cards and readded to the charts
Mantle 1.0 was open source; then Khronos picked it up and it became Vulkan.
The beginning of Mantle came from the betrayal in the Microsoft-AMD Temash deal, and that's why we got a better version of DX12, a low-level API same as Mantle.
The Radeon team has been behind the development of DirectX for a long time now.
 
Joined
Jul 5, 2013
Messages
28,464 (6.77/day)
While fighting for Nvidia's G-Sync, please don't forget to mention the 14 W permanent power draw of the monitor's G-Sync module, even while the monitor is in standby mode.
That depends on what you mean by "standby." If you're talking about a low-power sleep mode during an active OS session, then maybe. If you're talking about when the display is not receiving an active signal because the connected system is powered off, then no. In "powered off" mode, the display will draw less than 2.5 W (IIRC) as per US regulations, which became mandatory in the US a number of years ago and were adopted worldwide soon thereafter.
Thus. Read, learn.
Now I clearly see why Nvidia-related news on TPU gets much more attention than news related to anything else.
Good grief! And people call ME condescending?
 
Joined
Jun 2, 2017
Messages
9,531 (3.43/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Has it? Have you seen any meaningful returns from AMD's open source initiatives? I only see NVIDIA advancing technology in major ways, and others just follow and are like "o shit, how can we do this with our own tech?" .. the CES press conferences will be interesting
FreeSync, or if you like, VRR.
 
Joined
Nov 13, 2024
Messages
96 (1.66/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
DisplayPort Adaptive-Sync was an optional part of VESA's DisplayPort 1.2a standard, released in January 2013. Nvidia released G-Sync in October 2013, which is a bit later. AMD did not give anything to VESA; Adaptive-Sync was there first, and AMD FreeSync was then "made" on top of this already-existing part of VESA's DisplayPort VRR technology. VESA later renamed the technology at AMD's request. AMD showed the first prototype of a FreeSync monitor in 2014, a year and a few months after the DP 1.2a specification was released. Nvidia invented something that had already been invented (and invented without requiring an additional device). The foundation for adaptive-sync technology has been part of the eDP specification since 2009, as this article on TPU states:

It was already there for eDP; they added it for external DP in 2013 in the 1.2a revision.

While fighting for Nvidia's G-Sync, please don't forget to mention the 14 W permanent power draw of the monitor's G-Sync module, even while the monitor is in standby mode.

Now I clearly see why Nvidia-related news on TPU gets much more attention than news related to anything else.
I'm a bit confused here; maybe I misunderstood W1zzard's message. But why does it matter who owns the tech/software now or who implemented it as a standard? AMD found a way to introduce something cheap that did almost the same thing. Why argue about something this pointless? It's a win for AMD (and us) in my book, at least.
FreeSync, or if you like, VRR.
That was already mentioned by me in #262 (FreeSync).
 
Last edited: