
Editorial: AMD Actively Promoting Vulkan Beyond GPUOpen

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,535 (6.67/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
That's the most hilarious shit, when you call someone an AMD fanboy and they own an NVIDIA card... Epic fail.

FNGs-smh
 
Joined
Nov 13, 2007
Messages
10,826 (1.73/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Hence why I just bought a Fury for $310 - it beats the 1070 in TODAY's games. That's just stupid.


What on earth are you talking about... the 780 was 15-20% faster than the 7970 GHz Edition, so even if you hit 1200MHz clocks, as soon as someone OC'ed a 780 it would still soundly beat that card.

I get that you like your cards, and that's cool, but benchmarks are benchmarks; the Fury is nowhere near beating the 1070...

And it won't. In maybe 2-3 titles it will match the 1070 or beat it by very little, but it will lose in the large majority of the rest... that's what I mean by false hope. It's just not true.

And AMD markets it, and people like you believe it and go out buying $310 Furies thinking they're beating 1070s... when they're not. This is my problem with these types of PR campaigns.

(attached image: upload_2016-9-17_3-6-7.png)
 
Last edited:

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,105 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
Uhhh have you not looked at benchmarks for the past year? I genuinely encourage you to go read them and then come back.


Ok you back? Good.

1) The 7970 overclocks better than anything that has been released since then. My 7970 ran at 1220/1840. My brother's 7950 ran at 1120/1800, and all of my crypto-mining 7950's ran at 1100+/1800+. Those are 40% overclocks lmao! My 7970 benches as well as a 980 in Deus Ex: MD and BF1. So drop that argument here.

2) 2-3 generations? You completely missed what I was saying. I said that within a year of the 7970's launch it was ALREADY beating the 680 by 10-20% on average. Most people keep their cards for 2-3 years in my experience.

Furthermore, just because it is 1-2 generations newer doesn't make a difference. Everyone CONSTANTLY complains about AMD's recent trend of re-branding old GPUs. I will admit that I think it is stupid too, but can you blame them? Radeon is a fraction of Nvidia's size. If they can sell the 7970 two years later and have it compete with the 970, they will, lmao. Hence why I just bought a Fury for $310 - it beats the 1070 in TODAY's games. That's just stupid.

You're overstating the 680. The 680 did enough to stay ahead of the initial 7970, but AMD re-released the 7970 as the GHz Edition, which performed on par if not better.
Given the driver optimisations AMD perform, it gained performance over the years. The 680 didn't, because Nvidia get their DX11 driver optimisations in place very quickly.
The question is, are you buying a card for now or for two years down the line? You seem to carry a hugely AMD-slanted bias. I've seen your posts on other forums and it's clear you're a hater; you lack balance. For all your precise arguments, like any hater, you tend to use slanted evidence or ignore standard business practice.
I own a 980ti and it makes me chuckle to see it perform on par with a Fury X. Mind you, that's in a DX11 version of a Gaming Evolved title. Sure, in DX12 the Fury X will gain some frames, but why should I cry? I played the game at 1440p with very high settings (some maxed) at about 60fps. It was excellent.
In Doom Vulkan I was on far higher fps. Yes a Fury X would have got more but I've got a 60hz monitor. My gaming experiences have been great.
I bought the card a year ago. It hasn't let me down.
Going forward, I am no fan of Nvidia. I won't spend the money on a 1080, 1080ti or above because Vega is only 6 months away (hopefully, at most). If Vega has better perf/watt than Polaris, and it's a far bigger chip, it should match the 1080 in DX11 and absolutely own the bare-metal APIs. So Nvidia won't see my money again until Volta, and even then, only if it performs.

So if Vega is twice as good as a 480, I'm on that next. But the reason I have no reason to move from my 980ti is that it still performs very, very well. If I can play AAA Gaming Evolved titles with butter-smooth frames, I have nothing to feel cheated about. Only children get upset because someone else's card plays it faster than theirs.

Oh, and one more thing: my PowerColor LCS 7970 clocked to the 1300 Catalyst maximum. My MSI version (under an EKWB block) only managed 1225.

I had more fun overclocking my original Titan using the voltage soft mod. I can't remember the clocks but they were scary for such a card. Two of those cards got recycled in TPU.
 
Joined
Apr 18, 2015
Messages
234 (0.07/day)
I want a push to Vulkan for the simple reason that it works on other operating systems too (Linux/Mac).

I second that; a graphics API should not be tied to the platform. Nowadays, the main reason to purchase Windows at home, and the main reason there is no other serious competitor on PC, is that so many applications are tied to Windows APIs.
 
Joined
Jan 13, 2009
Messages
424 (0.07/day)
nVidia is a member of Khronos too. They can add optimizations to Vulkan for their hardware too. Of course, then it would be open source, though.
 
Joined
Jun 19, 2010
Messages
409 (0.08/day)
Location
Germany
Processor Ryzen 5600X
Motherboard MSI A520
Cooling Thermalright ARO-M14 orange
Memory 2x 8GB 3200
Video Card(s) RTX 3050 (ROG Strix Bios)
Storage SATA SSD
Display(s) UltraHD TV
Case Sharkoon AM5 Window red
Audio Device(s) Headset
Power Supply beQuiet 400W
Mouse Mountain Makalu 67
Keyboard MS Sidewinder X4
Software Windows, Vivaldi, Thunderbird, LibreOffice, Games, etc.
- DX12 fails to bring CPU pressure down in games
- Often there are no FPS gains for slower CPUs
- Sadly, most DX11-to-DX12 ports stay focused on strong single-thread performance, like every DirectX before

Vulkan, which grew out of Mantle, does everything better on weak hardware. Logically so, because the consoles use 8 weak cores, so the work needs to be balanced as well as possible across ALL available hardware.



This chart is old, but I can't show it often enough:
Fact: the 290X is theoretically much stronger than a GTX 780 GHz Edition.

Feeding the GTX in DX11 takes less CPU load than feeding the 290X, resulting in higher FPS on every CPU in DX11 (you get the maximum performance out of it more easily).

- The gains the 290X gets in Mantle come mostly from the CPU feeding it better
- The impressive Mantle leads over Nvidia's DX11 only show up between the "lowest 4-thread CPU" and the "highest non-K i5"

On the two strongest CPUs, the 290X can close the gap to DX11 because its GPU utilization was poor in DX11, but it can't beat the GTX,
- because DX11 workloads can't be balanced across a GCN GPU well enough: GCN is very compute-oriented and needs compute shaders etc. to get fully utilized, and DX11 mostly isn't used like that.
____________________________________________________________________________________________________________

Sadly, DX12 doesn't seem to focus on giving us more FPS with cheaper CPUs; it just makes new games on $1500+ rigs look better.

I'm the kind of person who would be satisfied with, for example, a GTA without temporal aliasing, flickering edges, tearing, and input lag,
on a CPU+GPU budget of up to $350 ... for example an i3-6100 with an RX 470 4GB.

That could be done with DX12 or Vulkan.
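The claim about balancing work across all cores has a concrete mechanism behind it: Vulkan lets every CPU thread record draw commands into its own command buffer, while DX11 funnels submission through a single driver thread. Below is a minimal sketch of that pattern, assuming an already-created VkDevice, render pass, and framebuffer; the names RecordJob and record_in_parallel are illustrative, and error handling is omitted:

```c
/* Minimal sketch: per-thread command recording, the pattern that lets
 * Vulkan spread CPU work across many weak cores. Illustrative only. */
#include <pthread.h>
#include <vulkan/vulkan.h>

#define NUM_THREADS 8 /* one recording thread per console-style core */

typedef struct {
    VkDevice        device;
    uint32_t        queueFamily;
    VkCommandBuffer cmd; /* out: the recorded secondary command buffer */
} RecordJob;

static void *record_commands(void *arg)
{
    RecordJob *job = arg;

    /* Command pools are externally synchronized, so each thread owns its
     * own pool: no locks anywhere on the recording path. */
    VkCommandPoolCreateInfo poolInfo = {
        .sType = VK_STRUCTURE_TYPE_COMMAND_POOL_CREATE_INFO,
        .queueFamilyIndex = job->queueFamily,
    };
    VkCommandPool pool;
    vkCreateCommandPool(job->device, &poolInfo, NULL, &pool);

    VkCommandBufferAllocateInfo allocInfo = {
        .sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_ALLOCATE_INFO,
        .commandPool = pool,
        .level = VK_COMMAND_BUFFER_LEVEL_SECONDARY,
        .commandBufferCount = 1,
    };
    vkAllocateCommandBuffers(job->device, &allocInfo, &job->cmd);

    VkCommandBufferInheritanceInfo inherit = {
        .sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_INHERITANCE_INFO,
        /* the main render pass / framebuffer would be referenced here */
    };
    VkCommandBufferBeginInfo beginInfo = {
        .sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO,
        .flags = VK_COMMAND_BUFFER_USAGE_RENDER_PASS_CONTINUE_BIT,
        .pInheritanceInfo = &inherit,
    };
    vkBeginCommandBuffer(job->cmd, &beginInfo);
    /* ... vkCmdBindPipeline / vkCmdDraw calls for this thread's slice ... */
    vkEndCommandBuffer(job->cmd);
    return NULL;
}

/* Spawn one recorder per core; the main thread later replays all results
 * with a single vkCmdExecuteCommands() inside its primary command buffer.
 * Under DX11 all of this work would serialize onto one core. */
void record_in_parallel(VkDevice dev, uint32_t family, VkCommandBuffer out[NUM_THREADS])
{
    pthread_t threads[NUM_THREADS];
    RecordJob jobs[NUM_THREADS];
    for (int i = 0; i < NUM_THREADS; i++) {
        jobs[i] = (RecordJob){ .device = dev, .queueFamily = family };
        pthread_create(&threads[i], NULL, record_commands, &jobs[i]);
    }
    for (int i = 0; i < NUM_THREADS; i++) {
        pthread_join(threads[i], NULL);
        out[i] = jobs[i].cmd;
    }
}
```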
 

bug

Joined
May 22, 2015
Messages
13,836 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
LOL can you read? I said IN VULKAN.
In Vulkan what? The whole architecture suddenly becomes more energy efficient?
Also, are you basing your Vulkan performance evaluation on something more than just Doom?

Edit: Mind you, while everybody was quick to point out how much more Polaris benefits from Vulkan in Doom, nobody was equally quick to measure the power consumption at the same time.
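For anyone who wants to close that gap themselves: on Linux, a reasonably recent kernel with the amdgpu driver exposes the card's average board power through hwmon, so a trivial logger can run alongside the Doom benchmark. A rough sketch; the hwmon index is an assumption you'd adjust per machine:

```c
/* Rough sketch: sample a Radeon's power draw once a second while a
 * benchmark runs. Assumes Linux with the amdgpu driver, which exposes
 * power1_average (in microwatts) via hwmon; hwmon0 is a guess, so check
 * /sys/class/hwmon/*/name to find the amdgpu entry on your box. */
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    const char *path = "/sys/class/hwmon/hwmon0/power1_average";
    for (;;) {
        FILE *f = fopen(path, "r");
        if (!f) { perror(path); return 1; }
        long uw = 0;
        if (fscanf(f, "%ld", &uw) != 1) uw = 0;
        fclose(f);
        printf("%.1f W\n", uw / 1e6); /* microwatts -> watts */
        sleep(1);
    }
}
```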
 
Last edited:
Joined
Jun 10, 2014
Messages
2,987 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
nVidia is a member of Khronos too. They can add optimizations to Vulkan for their hardware too. Of course, then it would be open source, though.
The APIs are not optimized for any hardware; that's up to the driver implementations.

Nvidia is not only a member, their president Neil Trevett is an Nvidia employee. Nvidia started OpenGL ES, and have been the major contributor to OpenGL (post 2.0), OpenCL, and also Vulkan. There is no doubt that they are dedicated to adding support and evolving the APIs.
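That split is easy to see in code: an application speaks one vendor-neutral API, and the Vulkan loader dispatches each call to whichever installed driver owns the GPU. A minimal probe using only core Vulkan 1.0 calls (error handling trimmed):

```c
/* Minimal probe: the same vendor-neutral API runs on any installed
 * Vulkan driver; optimization lives in each vendor's implementation.
 * Build against the Vulkan SDK, e.g.: cc probe.c -lvulkan */
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void)
{
    VkInstanceCreateInfo ci = { .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
    VkInstance instance;
    if (vkCreateInstance(&ci, NULL, &instance) != VK_SUCCESS)
        return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, NULL);
    VkPhysicalDevice gpus[8];
    if (count > 8) count = 8;
    vkEnumeratePhysicalDevices(instance, &count, gpus);

    for (uint32_t i = 0; i < count; i++) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpus[i], &props);
        /* vendorID 0x10DE = NVIDIA, 0x1002 = AMD, 0x8086 = Intel; the calls
         * above are identical regardless of whose driver answers them. */
        printf("%s (vendor 0x%04X, driver %u)\n",
               props.deviceName, (unsigned)props.vendorID,
               (unsigned)props.driverVersion);
    }
    vkDestroyInstance(instance, NULL);
    return 0;
}
```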
 
Joined
Jun 21, 2016
Messages
1,125 (0.36/day)
System Name Team Crimson
Processor AMD FX 8320
Motherboard Gigabyte GA-78LMT-USB3
Cooling Corsair H80i
Memory DDR3 16GB Crucial Ballistix Sport
Video Card(s) Sapphire Nitro+ RX 480 8GB RAM
Storage Samsung 850 EVO 250GB / Crucial MX300 275GB
Display(s) AOC 2752H 27" 1080p
Case NZXT Source 220 Windowed
Power Supply Antec Earthworks 650W
Mouse Logitech M510/ AGPTEK T-90 Zelotes
Keyboard Logitech K360/ iBUYPOWER TTC RED Switch Mechanical
Software Windows 8.1 64 Bit
How on earth is AMD's efficiency fine when the RX480 eats as much power as the GTX1070?

We're still talking pennies per month on the electric bill?

Every time I see the point brought up, I think of a parrot; "Power Consumption, the Power Consumption... caw caw rawwwk!"
 
Joined
Oct 2, 2004
Messages
13,791 (1.87/day)
People who go after 3 frames of difference and declare kings of the hill based on that are idiots. The fact is, if you buy a Fury X even today, you can be assured you'll enjoy all the latest games in the highest details and resolutions. You really have to be running the most demanding game in 4K with max settings to even drag the framerate down to 30fps. What does that tell you? Reality is, graphics cards from either camp are about the same within a similar price range. It doesn't even matter what you pick in the end; it's tiny things that make your decision. Someone who buys a new graphics card every year won't care about future-proofing. Someone who doesn't, might, because it means the particular graphics card will last longer. For some, V-Sync modes are more important, and for others it's the RGB lighting on the graphics card cover or the sticker. If people always made rational decisions, they'd sell exactly ZERO Titan cards. And yet that's not the case. So stop bitching over a few frames per second of lead or loss; when you draw a line, they are basically the same. In the end, the emotional factor plays a larger role than actual performance.

@Ungari
People are funny. They bitch over the power consumption of graphics cards where it's like a 50W difference. But when it comes to home appliances like fridges or tumble dryers, they don't care even about a 100kWh difference per year. Like you said, it's literally pennies even for such massive differences; that 50W difference is nothing. And it also doesn't reflect as dramatically in terms of thermals. It helps if it's lower, but people tend to blow this stuff way out of proportion.
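To put rough numbers on "pennies", here is a quick back-of-the-envelope check, where the four hours of gaming per day and the $0.12/kWh rate are assumptions to swap for your own:

```c
/* Back-of-envelope check of the "pennies" claim: monthly cost of a 50 W
 * gap between two cards. Hours/day and $/kWh are assumed values. */
#include <stdio.h>

int main(void)
{
    double watts = 50.0;         /* extra draw while gaming */
    double hours_per_day = 4.0;  /* assumed gaming time */
    double price_per_kwh = 0.12; /* assumed electricity price, USD */

    double kwh_month = watts / 1000.0 * hours_per_day * 30.0; /* = 6 kWh */
    printf("extra cost: $%.2f/month\n", kwh_month * price_per_kwh); /* ~$0.72 */
    return 0;
}
```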
 
Joined
Jun 21, 2016
Messages
1,125 (0.36/day)
System Name Team Crimson
Processor AMD FX 8320
Motherboard Gigabyte GA-78LMT-USB3
Cooling Corsair H80i
Memory DDR3 16GB Crucial Ballistix Sport
Video Card(s) Sapphire Nitro+ RX 480 8GB RAM
Storage Samsung 850 EVO 250GB / Crucial MX300 275GB
Display(s) AOC 2752H 27" 1080p
Case NZXT Source 220 Windowed
Power Supply Antec Earthworks 650W
Mouse Logitech M510/ AGPTEK T-90 Zelotes
Keyboard Logitech K360/ iBUYPOWER TTC RED Switch Mechanical
Software Windows 8.1 64 Bit
I'm really curious where this almost immediate obsolescence is occurring, because my 980Ti still performs superbly, as does the 980 before it that is now used by my better half. Indeed, it continues to perform at its original standard, probably better than at release, thanks to driver optimizations. In that respect, it is no better or worse for its category than my R9 380X, which has matured in its mid-priced category.

It all depends on what games and tasks you are doing, and what you are looking to do in the near future. There are many popular low-spec games, and even new sprite-based games just released, but there are also those that push the limits of our cards if you want all the eye candy. Your card's VRAM can already be maxed out; shortly after the 980Ti was released, a 6GB HD texture pack came out for Rainbow Six Siege. This trend of games utilizing more and more VRAM is increasing.
I'm not suggesting that your card is obsolete right now, since Pascal is super-clocked Maxwell, but the lack of Async Compute will certainly be an issue if you ever decide to play games built on these new APIs.

People are funny. They bitch over the power consumption of graphics cards where it's like a 50W difference. But when it comes to home appliances like fridges or tumble dryers, they don't care even about a 100kWh difference per year. Like you said, it's literally pennies even for such massive differences; that 50W difference is nothing. And it also doesn't reflect as dramatically in terms of thermals. It helps if it's lower, but people tend to blow this stuff way out of proportion.

What is sad is that just because Nvidia uses it as a selling point in their advertising, tech enthusiasts actually buy into it as an important criterion for evaluating performance.
You would think that of all people, tech junkies would know better.
 
Last edited by a moderator:
Joined
Jul 14, 2008
Messages
872 (0.15/day)
Location
Copenhagen, Denmark
System Name Ryzen/Laptop/htpc
Processor R9 3900X/i7 6700HQ/i7 2600
Motherboard AsRock X470 Taichi/Acer/ Gigabyte H77M
Cooling Corsair H115i pro with 2 Noctua NF-A14 chromax/OEM/Noctua NH-L12i
Memory G.Skill Trident Z 32GB @3200/16GB DDR4 2666 HyperX impact/24GB
Video Card(s) TUL Red Dragon Vega 56/Intel HD 530 - GTX 950m/ 970 GTX
Storage 970pro NVMe 512GB,Samsung 860evo 1TB, 3x4TB WD gold/Transcend 830s, 1TB Toshiba/Adata 256GB + 1TB WD
Display(s) Philips FTV 32 inch + Dell 2407WFP-HC/OEM/Sony KDL-42W828B
Case Phanteks Enthoo Luxe/Acer Barebone/Enermax
Audio Device(s) SoundBlasterX AE-5 (Dell A525)(HyperX Cloud Alpha)/mojo/soundblaster xfi gamer
Power Supply Seasonic focus+ 850 platinum (SSR-850PX)/165 Watt power brick/Enermax 650W
Mouse G502 Hero/M705 Marathon/G305 Hero Lightspeed
Keyboard G19/oem/Steelseries Apex 300
Software Win10 pro 64bit
People who go after 3 frames of difference and declare kings of the hill based on that are idiots. The fact is, if you buy a Fury X even today, you can be assured you'll enjoy all the latest games in the highest details and resolutions. You really have to be running the most demanding game in 4K with max settings to even drag the framerate down to 30fps. What does that tell you? Reality is, graphics cards from either camp are about the same within a similar price range. It doesn't even matter what you pick in the end; it's tiny things that make your decision. Someone who buys a new graphics card every year won't care about future-proofing. Someone who doesn't, might, because it means the particular graphics card will last longer. For some, V-Sync modes are more important, and for others it's the RGB lighting on the graphics card cover or the sticker. If people always made rational decisions, they'd sell exactly ZERO Titan cards. And yet that's not the case. So stop bitching over a few frames per second of lead or loss; when you draw a line, they are basically the same. In the end, the emotional factor plays a larger role than actual performance.

@Ungari
People are funny. They bitch over the power consumption of graphics cards where it's like a 50W difference. But when it comes to home appliances like fridges or tumble dryers, they don't care even about a 100kWh difference per year. Like you said, it's literally pennies even for such massive differences; that 50W difference is nothing. And it also doesn't reflect as dramatically in terms of thermals. It helps if it's lower, but people tend to blow this stuff way out of proportion.

preach it brother!

What is sad is that just because Nvidia uses it as a selling point in their advertising, tech enthusiasts actually buy into it as an important criterion for evaluating performance.
You would think that of all people, tech junkies would know better.

No, they don't, for the most part. Just like @RejZoR said, most choices are made based on "feelings", not rationality.
 
Joined
Dec 22, 2011
Messages
3,890 (0.82/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Even sadder was when AMD created a video mocking nVidia's power consumption.
 
Joined
Sep 15, 2007
Messages
3,946 (0.63/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
In Vulkan what? The whole architecture suddenly becomes more energy efficient?
Also, are you basing your Vulkan performance evaluation on something more than just Doom?

Edit: Mind you, while everybody was quick to point out how much more Polaris benefits from Vulkan in Doom, nobody was equally quick to measure the power consumption at the same time.

Bullshit. I saw the reviews. Power usage went up a few watts. Quit trying to spin it.
 
Joined
Mar 24, 2012
Messages
533 (0.11/day)
I know my throat is getting hoarse from saying this, but that's simply not true. This is more evidence that no one here really understands what words/phrases like "low-level" and "close to the metal" mean.

You don't optimize hardware to a low-level API; you optimize software to the hardware exposed by a low-level API.

At this moment, people are using the parts exposed by DX12 to better optimize for AMD because, frankly, there's a lot of optimizing to do compared to their DX11 renderer. There is some valid argument that async compute IS better supported on AMD's side, but it's not a valid argument the way you are using it, as NVIDIA also supports several things AMD doesn't:


This. That's why I wonder whether the people saying Nvidia needs to build their architecture better to take advantage of DX12 really understand what going low-level actually means.
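To make the quoted point concrete: "optimizing software to the hardware the API exposes" usually starts with a capability query. Here is a sketch of how an engine might check whether a GPU offers a dedicated compute queue family before scheduling async compute; find_async_compute_family is an illustrative name, and the fixed-size array is a simplification:

```c
/* Sketch: query the hardware the API exposes before relying on async
 * compute. `gpu` is an already-enumerated VkPhysicalDevice. */
#include <stdint.h>
#include <vulkan/vulkan.h>

/* Returns the index of a dedicated compute queue family, or -1 if the
 * hardware only offers compute on the graphics queue (no async gain). */
int find_async_compute_family(VkPhysicalDevice gpu)
{
    uint32_t n = 0;
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &n, NULL);
    VkQueueFamilyProperties fams[16];
    if (n > 16) n = 16;
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &n, fams);

    for (uint32_t i = 0; i < n; i++) {
        VkQueueFlags f = fams[i].queueFlags;
        if ((f & VK_QUEUE_COMPUTE_BIT) && !(f & VK_QUEUE_GRAPHICS_BIT))
            return (int)i; /* GCN-style dedicated compute queue */
    }
    return -1; /* fall back to submitting compute on the graphics queue */
}
```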

Anyone else remember that both AMD and Nvidia are bidding to supply the graphics in Samsung's next Smartphone APU's?

By making Vulkan the standard API of Android, AMD may have just secured a massive advantage in their bidding....

It doesn't matter if AMD has a massive advantage in Vulkan. The only thing that matters is whether the hardware supports Vulkan or not. Samsung is building a phone, not the ultimate gaming machine, so whoever can give the better deal will probably win. Though what Samsung is discussing with AMD and Nvidia is mostly not about making a GPU for Samsung.

Tegra chips are absolute garbage in terms of efficiency. Their powerful chips hog 25-50W (far more than a phone can take), and their 5W variants fail to beat their Qualcomm/Apple competition.


AMD's efficiency is totally fine, 14nm just isn't mature for big chips yet. Furthermore, you should look at AMD's efficiency in Vulkan. Their far cheaper-to-produce 480 is roughly as efficient as the 1070 (like a 10% difference).

And you think AMD can hold a candle in the SoC space with GCN? You just look at the desktop parts and are confident that AMD will do fine in mobile SoCs? That is outright delusional.
 
Last edited by a moderator:
Joined
Sep 15, 2007
Messages
3,946 (0.63/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
And you think AMD can hold a candle in the SoC space with GCN? You just look at the desktop parts and are confident that AMD will do fine in mobile SoCs? That is outright delusional.

Low clocks are different (if indeed they would be low). With proper fabbing, maybe.
 
Joined
Mar 24, 2012
Messages
533 (0.11/day)
/thread.

Though it is great news that Android will use it, that doesn't mean the PC market will adopt it. I certainly hope it does, more competition never hurts. I won't hold my breath though.
But I seriously believe most Android games will end up using OpenGL ES 2.0 only, lol.
 
Joined
Sep 15, 2011
Messages
6,759 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Quick question.
Are the current gen consoles also able to support Vulkan? How about PS4-Pro?
 

INSTG8R

Vanguard Beta Tester
Joined
Nov 26, 2004
Messages
8,047 (1.10/day)
Location
Canuck in Norway
System Name Hellbox 5.1(same case new guts)
Processor Ryzen 7 5800X3D
Motherboard MSI X570S MAG Torpedo Max
Cooling TT Kandalf L.C.S.(Water/Air)EK Velocity CPU Block/Noctua EK Quantum DDC Pump/Res
Memory 2x16GB Gskill Trident Neo Z 3600 CL16
Video Card(s) Powercolor Hellhound 7900XTX
Storage 970 Evo Plus 500GB 2xSamsung 850 Evo 500GB RAID 0 1TB WD Blue Corsair MP600 Core 2TB
Display(s) Alienware QD-OLED 34” 3440x1440 144hz 10Bit VESA HDR 400
Case TT Kandalf L.C.S.
Audio Device(s) Soundblaster ZX/Logitech Z906 5.1
Power Supply Seasonic TX-850 Platinum
Mouse G502 Hero
Keyboard G19s
VR HMD Oculus Quest 3
Software Win 11 Pro x64
Quick question.
Are the current gen consoles also able to support Vulkan? How about PS4-Pro?
Well it would make sense for them to go that route seeing as they have been using OGL anyway.
 
Joined
Jun 10, 2014
Messages
2,987 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Quick question.
Are the current gen consoles also able to support Vulkan? How about PS4-Pro?
The PS4 and Xbox One, including their updated revisions, have full hardware support for Vulkan 1.0; it's just a matter of drivers.
 
Joined
Dec 30, 2010
Messages
2,199 (0.43/day)
If DX12 functionality is limited to W10, how does Mantle fare on an older OS? For example, does it offer all the hardware features that would normally be available in DX12 on W10?
 
Joined
Jun 21, 2016
Messages
1,125 (0.36/day)
System Name Team Crimson
Processor AMD FX 8320
Motherboard Gigabyte GA-78LMT-USB3
Cooling Corsair H80i
Memory DDR3 16GB Crucial Ballistix Sport
Video Card(s) Sapphire Nitro+ RX 480 8GB RAM
Storage Samsung 850 EVO 250GB / Crucial MX300 275GB
Display(s) AOC 2752H 27" 1080p
Case NZXT Source 220 Windowed
Power Supply Antec Earthworks 650W
Mouse Logitech M510/ AGPTEK T-90 Zelotes
Keyboard Logitech K360/ iBUYPOWER TTC RED Switch Mechanical
Software Windows 8.1 64 Bit

bug

Joined
May 22, 2015
Messages
13,836 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
We're still talking pennies per month on the electric bill?

Every time I see the point brought up, I think of a parrot; "Power Consumption, the Power Consumption... caw caw rawwwk!"
Well, you may say that on the desktop, but in this case we're talking mobiles where every watt counts.
 
Joined
Jun 10, 2014
Messages
2,987 (0.78/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
If DX12 functionality is limited to W10, how does Mantle fare on an older OS? For example, does it offer all the hardware features that would normally be available in DX12 on W10?
It has nothing to do with the features in Direct3D 12; it's just that Microsoft wants to keep it to Windows 10. Hardware features have little to do with OSes ;)

I think Microsoft only allows DirectX on Xbox.
Yes, but we are talking about Vulkan ;)
 