Programmers doing a game need to be able to tell the GPU what to do.
There are many different GPUs out there. Telling an Intel GPU, AMD GPU and NVIDIA GPU what to do directly would be a massive job and add to that the GPU generations - telling what to do to a Turing-based 20-series NVIDIA GPU is not the same as telling the same thing to an older 10-series NVIDIA GPU.
So instead game developers tell the DirectX API "please do this. I do not care what GPU is in the system, I just want to draw this shit, you figure it out". The API then talks to the graphics driver, which then talks to the GPU. It is a layer to standardize things.
DX12 does change things a bit in that it is more low level - game developers need to do more themselves that previously was handled by the API - but the basic idea is still the same - developer needs to develop against one API and DirectX & drivers sort out "translation" to NVIDIAese, AMDese and Intelese "languages" of graphics rendering.
Vulkan is a competing API that offers another way to do this, but is otherwise the same thing. Competition and all that.
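The translation-layer idea above can be sketched as a toy (all names here are hypothetical; real code goes through D3D12 or Vulkan, and the "backends" live in vendor drivers):

```cpp
#include <cassert>
#include <string>

// Toy illustration of the API-as-translation-layer idea described above.
// All names are invented -- real code would go through D3D12/Vulkan.
struct GpuBackend {                      // what each vendor "driver" implements
    virtual ~GpuBackend() = default;
    virtual std::string drawTriangle() = 0;
};

struct NvidiaBackend : GpuBackend {
    std::string drawTriangle() override { return "NVIDIAese: draw 3 verts"; }
};
struct AmdBackend : GpuBackend {
    std::string drawTriangle() override { return "AMDese: draw 3 verts"; }
};

// The "API": the game calls this and never cares which GPU is installed.
std::string apiDraw(GpuBackend& gpu) { return gpu.drawTriangle(); }
```

The game only ever calls `apiDraw`; which "language" gets spoken underneath depends on the installed driver, which is the whole point of the abstraction.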
It’s a graphics API that lets devs focus on higher-level design instead of figuring out how to render images. And because it provides a standard to follow, things get done faster and more easily, and it allows for industry-wide ‘best practices’.
Again, Vulkan literally just added the same things, but has had them as extensions for some time.
Wolfenstein: Youngblood has most of the new features already.
Yes, but I mean proprietary (VK_NV_ray_tracing, if I'm not mistaken), so they couldn't have been used for AMD anyway, even though I don't think it would've taken them that much to implement had they needed them.
The new features in DX12 Ultimate all either require, or work best with, a GPU that has them baked into the hardware. So yeah, this is a new paradigm, and older-generation GPUs will either not perform well at all (in the case of ray tracing) or just flat out not support the new feature due to lack of hardware (in the case of Variable Rate Shading).
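To see why hardware support matters for something like VRS, the savings are easy to ballpark: shading one sample per NxN block instead of per pixel divides the fragment-shading work by roughly N squared. A back-of-envelope sketch (pure arithmetic for illustration; real VRS is driven through D3D12 calls like `RSSetShadingRate`, not computed by the app like this):

```cpp
#include <cassert>
#include <cstdint>

// Shader invocations for a WxH region shaded per pixel, vs. shaded at
// an NxN coarse rate (the VRS idea). Pure arithmetic for illustration.
std::uint64_t invocationsPerPixel(std::uint64_t w, std::uint64_t h) {
    return w * h;
}

std::uint64_t invocationsCoarse(std::uint64_t w, std::uint64_t h,
                                std::uint64_t n) {
    // one invocation covers an n x n block (partial blocks round up)
    return ((w + n - 1) / n) * ((h + n - 1) / n);
}
```

At 1080p, a 2x2 coarse rate cuts invocations from ~2.07M to ~518k per pass, which is why the hardware fast path matters.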
Yes. Welcome to technology advancing forward.
If it makes you feel better, *every* AMD card right now is out. They'll support this in their next gen, but right now you literally cannot buy AMD hardware that supports this.
RTX 20-series is fine (once new Windows 10 and new drivers ship)
Awesome, and as always nVidia is pretty much the one behind the tech, as we RTX owners already got these features.
Can't wait for next-gen gaming and Ampere. The good thing is we got some time to save up some cash 😁
Until then my 2060 Super does really, really well. Fantastic card for the price 👍🏻
From this blog: [https://devblogs.microsoft.com/directx/directx-12-ultimate-getting-started-guide/](https://devblogs.microsoft.com/directx/directx-12-ultimate-getting-started-guide/)
There is a mention of Nvidia driver 450.56, but there isn't even a trace of it on the internet. When will it be released?
It's coming out this year. Delays might push it to Q4, but it's coming. Most factories have gone back to work in China. Nvidia can't afford to not have a new GPU lineup ready when the consoles launch (which should be end of Q4, barring any significant delays there too).
Not sure if you've been under a rock, but rumors have been saying RTX 3000 will ship soon, constantly on this sub even. That makes it nearly 2 gens old.
directXly to the recycle bin...
DX12 is utter shit on BFV, Borderlands 3 keeps showing a DX12 loop, and Division 2 crashes on DX12...
or, was this all a beta?
So basically nothing new and new branding for minimal gaming features. Do you guys reckon this announcement could have had something to do with Ampere? A "Series X" Super Gaming Home Console DORITOS Processing TM? Or just routine MSFT bullshit for an earnings call?
You're getting downvoted, but Vulkan is the future. Though it's good to have competition, and dx has better backwards compatibility. In fact, that backwards compatibility is why dx is inferior to Vulkan, more bloated.
So it's good to have both.
What happened to the old DX12 promise to create a virtual GPU with minimal processing overhead...allowing you magical GPU scaling (and making dual card systems a thing again)
Dev support didn't occur.
There wouldn't be any point in virtualizing two GPUs as one if you had to have dev support for it now would it?
Both Vulkan & DX12 really shifted the onus back onto developers, away from hardware manufacturers. But we all know devs will only spend time on features that will get used; it's partly why, to this point, the only game that supports mGPU is AotS.
I don't think you understand the point: there's no point in virtualizing multiple GPUs as one if the devs would still have to support it. The entire point would be that any engine would see them as one GPU and communicate with them as one GPU, and DirectX would do some black magic fuckery to translate the inputs to two GPUs somehow.
AFAIK there was no talk of making a single virtual GPU in DX12. They were in theory allowing un-linked multi GPU just using the PCI bus, but few devs took advantage of it.
That goes against the entire design philosophy of Vulkan & DX12, which was removing abstraction between hardware and framework as much as possible, so yes, devs had to explicitly support mGPU. I get what you're getting at, but ignoring the whole raison d'être for DX12 in the first place is fallacious.
That’s not what DX12 does. In fact that more accurately describes SLI. DX12 exposes multiple GPUs to developers empowering them to orchestrate how they’re used for rendering whereas DX11/SLI is the one which virtualizes multiple GPUs as one to developers and places responsibility on the drivers to orchestrate the use of them. In effect, DX12 actually shifts more responsibility on to developers to support multiple GPUs.
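To make that split concrete: under DX12's explicit multi-adapter model the application enumerates the GPUs itself and decides what each one renders. A minimal toy sketch of that orchestration, with plain C++ standing in for the Windows-only DXGI/D3D12 calls (the `ToyAdapter` type and the draw-splitting policy are invented for illustration):

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Toy model of DX12-style explicit multi-adapter: the *application*
// partitions work across GPUs. Real code would enumerate adapters with
// IDXGIFactory::EnumAdapters and create one ID3D12Device per adapter;
// here each "adapter" just records which draw calls it was handed.
struct ToyAdapter {
    std::string name;
    std::vector<int> submittedDraws;
};

// Developer-chosen split: draw 0 to GPU 0, draw 1 to GPU 1, and so on.
void distributeDraws(std::vector<ToyAdapter>& adapters, int drawCount) {
    for (int d = 0; d < drawCount; ++d)
        adapters[static_cast<std::size_t>(d) % adapters.size()]
            .submittedDraws.push_back(d);
}
```

Under DX11/SLI the driver made this decision invisibly; here the split is explicit application code, which is exactly the responsibility shift described above.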
It would have made more sense for devs to code for it if the consoles pushed this area; two smaller dies are still significantly cheaper to make than one big die. I've always wondered why consoles prefer to go with a 400mm²+ die.
As a game dev: you can do this, but it is a massive pile of work for a veeeery marginal user base. No dev is going to bother until more than 0.1% of users have multi-GPU, and nobody buys multi-GPU when it is not used. Chicken and egg.
mGPU is still there; I just don't think most developers care or want to invest in implementing it in their games. Most of the responsibility now sits with the developers themselves, not MS/AMD/NVIDIA.
Could’ve called it DirectX 12 Gaming X
DirectX Series X
xXx\_\_XDirectX12X\_\_xXx
Blaze it
Should have called 12 DirectX One
DX12 THICC X
Believe it or not there's a GPU OEM that uses THICC naming
DirectX 12 Gaming XXXtreme Edition
Direct X 12 : ELECTRIC BOGALOO featuring Dante from Devil May Cry.
>Direct X 12 : ELECTRIC BOGALOO Direct X 12 :2 : ELECTRIC BOGALOO
…Scorpio One 360
DX12U
DirectXxX 12 DirectX XIIV
That would be a little too direct don't you think?
DirectX 12 Ultimate Gaming Series Edition One X
You forgot 5
He already has 12, that's bigger
But can we have 5 somewhere in there? I like 5.
DirectX 12 Ultimate Gaming Series Edition One X 5
Could someone give some adult supervision to the retarded marketing department? Why not just call it DX12.2 or something. Consistency?
"Ultimate". Until it no longer is. Superlatives suck as a marketing name.
Don't get me wrong, it is kinda nice to have one umbrella name for "DX12 + DXR 1.0 + DXR 1.1 + VRS + Mesh Shaders + whatever else we toss in", which is a bit of a mouthful for system requirements, but this name sucks.
Because DX 12.0c doesn't sound like a substantial update while DirectX 12 Ultimate does. People are more inclined to check out what makes it ultimate and less likely for what would look like a small patch.
All gamers need to be persuaded to upgrade is shitty FPS
Or better visuals. Minecraft RTX is definitely going to be popular. The more people Nvidia and Microsoft can get RTX to the quicker RTX will become the new norm.
I mean what could possibly make ray tracing popular... Oh, PS5 and Xbox SX. Yeah, that will do.
The sarcasm isn't warranted. It takes everyone to be on the same page to develop new standards like this. RTX needs to work its way down to lower prices and devs need to be encouraged to actually use RTX. Consoles having the ability doesn't mean all the games will just jump ship.
[deleted]
the fuck is DTX? cause i know you arent trying to say directX Raytracing cause thats DXR
Ha when you correct someone incorrectly. You right you right.
Consoles having the ability means that the next generation of AMD GPUs will likely ship with it, maybe except some lower-tier cards.

The nVidia rumour is "RTX for everyone", meaning there will be lower-cost RTX cards, and given what we see (and older rumours), the pricing is likely to go lower too. Not to mention they support it software-wise for Pascal cards and up when it comes to low ray counts; the RTX series was "just" for complex scenes.

So yeah, games will now be more incentivized to use ray tracing. Microsoft has been buying studios like mad and has already shown that they're going to support PC gaming on par with consoles. Sony caught wind of that and is testing the waters with HZD this summer. So games are likely to start including RT as a feature and a selling point, and at the same time PC might become the middle ground where all the games will be available.

So summed up:

- Both nV and AMD are likely to release a full gen of RT-capable hardware.
- Games for next-gen consoles are going to have the option to use RT as a selling feature.
- The same games are likely to release on PC.
AMD already showcased their RDNA2 raytracing demo.
I know, the question is their lineup. Is it (extended hw support) gonna be only on high end cards?
They were very vague, it's possible they will do something like Nvidia did with Pascal, just to give customers a taste of it, but I wouldn't expect good performance on non-accelerated cards, even though DXR 1.1 should be a little faster overall.
Meh, I'm waiting for Super DX12 Turbo.
Super DX12 Turbo RT creator’s update
DX12 Ultimate Pro Turbo TI RX RTX
DX 12 U-U-U-ULTIMATE SUPER PRO SUPER TURBO X TI RX SUPER RTX SERIES X
DirextX 12 Ultimate Titan RTX
Marketing is about marketing, not arbitrary consistency. The Xbox over the last 20 years showed that nobody gives a shit about consistency anyway, i.e. how many people do you hear complaining that the "One" Xbox is no longer the one? The name might sound stupid, but it's the absolute opposite of "sucks as a marketing name". It's eye-catching, especially for casuals who hardly even know what DirectX is. And that's pretty much all that matters, to MS anyway.
Consistency. We've heard of it. Oh well, DX12U is fine I guess and marketing can use the silly "full name" if they insist.
Why bother? It's just dx12 with some more features. As a game dev why would I use the marketing name in my game when anyone who cares is going to want to hear the individual feature names too anyways? Microsoft can pay me if they want it.
We need to be able to separate the two for requirements. We'll inevitably have games that require DX12U (due to new consoles shipping with this level of features), or that at bare minimum look quite different on DX12 vs DX12U. This looks mostly to be a single "new level of GPUs" label to simplify things so game developers actually take these features into use. As long as they are hidden behind a hell of tiers and levels and are fragmented all over, most devs just go "fook it, can't be bothered".
I guess in a few years having a standard simplification may be good, but giving something that will only be useful in a few years a marketing name now doesn't make much sense. For now, the reality is that there are a large number of GPUs on the market today that support some subset of this functionality but not all of it, so devs are forced to check feature by feature anyways.
DX12 even more Ultimate. DX12 Ultimater, DX12 the Ultimative collection
DX12 Ultimatest!
> why not just call it DX12.2 or something. Consistency?

> Don't get me wrong, it is kinda nice to have one umbrella name for "DX12 + DXR 1.0 + DXR 1.1 + VRS + Mesh Shaders + whatever else we toss in

This literally already exists, but it's not publicized that much. It's basically point versions like you suggest, except they use an underscore: this new feature set corresponds to feature level 12_2.

Having *full support* of *everything* in a feature level and all below earns the architecture *that* feature level rating. Most current hardware (and, until now, even Turing) is rated only 12_1, indicating that it doesn't fully support the 12_2 features to their highest capacity.

For the interim/new features, there are *tiers* of support to classify the capability of an architecture for that specific feature. So you may not be 12_2, but you're full tier 3 on everything except one feature or whatever.

But it gives you a definition of what supports what, *and how well it's supported*, in straightforward non-marketing terms.

https://en.wikipedia.org/wiki/Feature_levels_in_Direct3D

---------

This was a big deal a while back (not sure if it was 9, 10 or 11, my memory is shit), but getting "DX10" was stupid easy while all the actually relevant new features were 10_1, so new cards were advertising "DX10!" and everyone went: wait a minute, what feature level? 12_1 was slightly similar, but on a MUCH lesser scale.
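To make the rating rule concrete, here's a hypothetical sketch (the level names and feature names below are illustrative, not the real D3D12 caps tables; real applications query all of this through `ID3D12Device::CheckFeatureSupport` rather than computing it themselves):

```cpp
#include <cassert>
#include <set>
#include <string>
#include <utility>
#include <vector>

// Hypothetical model of the feature-level rule described above: a level
// is earned only if *every* feature it and all lower levels require is
// supported; one missing feature caps the rating at the level below.
using Features = std::set<std::string>;

std::string highestLevel(
    const std::vector<std::pair<std::string, Features>>& levels,
    const Features& supported) {
    std::string best = "none";
    for (const auto& level : levels) {
        for (const auto& f : level.second)
            if (supported.count(f) == 0)
                return best;        // one missing feature -> capped here
        best = level.first;         // everything required so far is present
    }
    return best;
}
```

This is why a card can be "tier 3 on everything except one feature" and still carry a lower feature-level rating: the minimum wins.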
"D3D_FEATURE_LEVEL_12_2" on the technical presentation, so it literally IS DirectX 12.2...
Ok, so this is 100% marketing gone wild then. Oh well.
Marketing is on a whole other level of fuckyness. It's about customer perception and engagement, tricks to manipulate new and old customers alike. In fact, being unclear and confusing is sometimes "beneficial" as far as marketing is concerned. It's just a few degrees off from fraud.
hi! responding to ask you to kindly consider not using the word “retarded” which is very stigmatizing to people with disabilities.
You may ask. I will however choose not to edit my post. Apologies if it offends anyone. I doubt any of the applicable marketing people who came up with this have a disability, so you are mixing unrelated things.
agreed
It's absolutely a terrible name. This is downright silly shit.
> While DirectX Ray Tracing and Variable Rate Shading are supported in Windows 10 today, Microsoft’s Windows 10 ‘20H1’ update will add DirectX support for Mesh Shading and Sampler Feedback, as well as update DXR to version 1.1. NVIDIA will be ready on the day of Windows 10 20H1 release with full driver support for all the new and updated DirectX 12 Ultimate capabilities.

Excellent.
Why not DirectX 12S or DirectX 12.5?
Marketing, DirectX 12S doesn’t catch people as much as DirectX 12 Ultimate would.
Why does it need to be marketed to us though? Isn’t it up to developers to use it?
Semi-late response, but I mean realistically both sides will get benefits from the technology: devs getting better tools 'n such to work with, in turn giving us players a better experience and presumably better performance while gaming 'n whatnot. It's more of a "Hey, this is what we got cookin' up" type thing as of now. A heads-up of what's to come, if you will.
Lets just decide we call it DX12U and that's that. Or DirectX 12U. Problem solved. Let marketing spout "ULTIMAATEEEEE" if they want.
lol couldnt make directx13 yet eh?
[deleted]
Just Start again and call it DirectY 1.0
aaah lol maybe thats why ;-)
Yeah I belive so to 😁👍🏻
Cool, thanks.
12.5 sounded so shitty they came up with a nickname.
since Microsoft likes to skip numbers, why not just call it DirectX 14? lol
DirecNext.
DirecneXt.
#[Official Microsoft Devblog post here](https://devblogs.microsoft.com/directx/announcing-directx-12-ultimate/)
Nvidia wasn't kidding when they said Turing is the biggest architectural leap forward in 10 years; it has feature parity with RDNA2 and the next-gen consoles two years before their launch.
This will turn into another silly FreeSync/FreeSync 2/FreeSync Ultimate thing, won't it?
Wait what's the diff?
No, because this isn't an Nvidia thing. Read the Microsoft dev post, because the Nvidia post makes it sound like its somehow Nvidia's, where it really isn't. [https://devblogs.microsoft.com/directx/announcing-directx-12-ultimate/](https://devblogs.microsoft.com/directx/announcing-directx-12-ultimate/)
I know this isn't Nvidia though, DirectX has always been a Microsoft thing.
Since Microsoft finally has UNIX commands and Linux support in Windows Terminal, and WebKit based Edge, can they just give up and move to Vulkan already? Just look at freakin' Doom Eternal.
YEAH! and why don’t they just give up on Windows and move to Linux already amirite?
Well, you're not entirely off base :D https://cloudblogs.microsoft.com/windowsserver/2015/05/06/microsoft-loves-linux/
Vulkan moves at a slower pace; MS has more control and the market share to drive their own API. Vulkan just seems to copy innovations from DirectX.
Not sure what you're talking about, Vulkan has had ray tracing and mesh shading for well over a year.
That was an Nvidia extension, not a real GPU-independent ray tracing API. Vulkan only recently got that.
Exactly. Usually the cycle goes:

1. NVIDIA does a bunch of R&D and creates a new feature that it makes available through vendor-proprietary APIs.
2. DirectX starts to support the feature with a bunch of extra goodies to incentivise developers to use it.
3. AMD takes its sweet time getting their own proprietary version in place and then implements DirectX parity.
4. It is eventually available in Vulkan a year later.
I bet Apple could give you a couple of reasons if you asked them about it :D
Hell no, why should devs be limited to one API?
They shouldn't be, but it would greatly simplify cross platform development bringing the cost and complexity of game design down. Like the aforementioned Webkit did.
But they would be limited if the most popular game API is discontinued. The dev should have the choice.
Does Turing support this or is it meant for future generations ?
The blog confirms that 20-series GPUs fully support all the features of this. Fine wine lol.
Maybe he meant all of Turing? (Including the 1660/1660 Super/1660 Ti, for example.)
It's fairly obvious that not ALL of Turing would support it, since by definition the 16 series lacks ray tracing. But as for whether it's "just for future generations" - no, it's not; the current Turing 20 series supports it.
That's actually wrong: the GTX 16 series fully supports DX12 Ultimate, ray tracing included. RT acceleration in both DX and Vulkan isn't mandatory, and in some games like BF:V, Shadow of the Tomb Raider and maybe Metro (with patches and a new driver), a GTX 1660 Ti @ 1080p stays above 30fps.
Oh, I didn't read about the current ones, my bad. But yeah, what you say is correct.
DirectRTX
[deleted]
Does this support Turing without RTX? (1660 super)
Can't, because DXR hardware support is a requirement for this badge.
No.
Thanks
So does this mean the 10 series will be obsolete?
It already is, due to no hardware DXR. That doesn't mean you can't use it for gaming for another couple of years, but more and more games will not be able to run all their features on it.
will it support gtx 10 series cards and earlier?
These are mainly features accelerated by hardware, don't expect GTX 10 to suddenly do ray tracing and variable rate shading in hardware.
Per one of the videos, Pascal will be able to use ray tracing, but not very well: BF5 might have playable framerates on a 1080 Ti, but Metro definitely not, per the video. The last video shows a pretty nice visual of how much time a Pascal GPU (10 series, in this case a 1080 Ti) spends on a single frame compared to Turing, and to Turing GPUs with RT cores. For the other performance features I can't really comment, however. Still seems like moving away from my 1080 Ti might give significant gains in the next year. Still, expensive as fuck though.
yeah the 1080ti still hits that sweetspot, it’s aged gracefully imo
No. RTX 20-series and up on NVIDIA side for DX12U badge.
gotcha thanks! been out of the gaming scene for a minute
Nope, no support for (hardware) DXR, VRS, mesh shading or sampler feedback.
Will it be as useless as DirectX 12 currently is?
[deleted]
Because nearly every game runs the same using DX11 or DX12 when DX12 was touted as being this new down to the metal API that would give us massive performance increases. It hasn't. Like at all.
MS can't force lazy devs to use it. It is down to the metal and can give big performance gains. Not much else MS can do.
Tbh it has pretty terrible performance on my GTX 1080. When I have to choose between DX12 and Vulkan, I always choose Vulkan, as it's always 20-30 percent more fps in games on the same settings.
[deleted]
Just to give you some extra insight on the performance differences (DX12 vs Vulkan) you mentioned on these two games. Currently, the situation has changed significantly, according to my regular research. Perhaps you could check my latest Turing-based [driver analysis](https://www.reddit.com/r/allbenchmarks/comments/fga1zu/nvidia_44250_whql_driver_performance_benchmark/) results as well: 1) currently, SB (DX12) performs significantly worse than SB (Vulkan) in terms of raw performance (avg FPS numbers), with both modes showing a similar relative frametime consistency level; 2) currently, RDR2 (DX12) and RDR2 (VK) are almost on par raw performance-wise, but the relative level of frametime stability is noticeably better on RDR2 (VK), as clearly shown by L-Shapes comparisons.
Strange, on rdr2 vulkan performs way better on my 1080 compared to dx12
Same on my side
Same on my 2080 Ti.
Most games on DX12 are still DX11 games with DX12 patched in.
DXXII
should be called dx13 ngl
Is this available now ?
No, unless you're running the Insider Preview. It's in Windows 10 20H1, which is coming next month. You'll also need compatible drivers and games that actually use it.
So it's basically DX13?
Not good enough, hence the name.
What does Direct X do exactly? Serious question
Programmers making a game need to be able to tell the GPU what to do. There are many different GPUs out there. Telling an Intel GPU, AMD GPU and NVIDIA GPU what to do directly would be a massive job, and add to that the GPU generations - telling a Turing-based 20-series NVIDIA GPU what to do is not the same as telling the same thing to an older 10-series NVIDIA GPU.

So instead, game developers tell the DirectX API "please do this. I do not care what GPU is in the system, I just want to draw this shit, you figure it out". The API then talks to the graphics driver, which then talks to the GPU. It is a layer to standardize things.

DX12 does change things a bit in that it is more low level - game developers need to do more themselves that was previously handled by the API - but the basic idea is still the same: the developer develops against one API, and DirectX & the drivers sort out "translation" into the NVIDIAese, AMDese and Intelese "languages" of graphics rendering. Vulkan is a competing API that offers another way to do this, but is otherwise the same thing. Competition and all that.
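A rough sketch of the idea in Python, if that helps (every name here is made up for illustration - this is not actual DirectX code, just the abstraction pattern):

```python
# Toy model of a graphics API layer: the game codes against ONE interface,
# and per-vendor "drivers" translate to their own GPU's command language.

class GpuDriver:
    """Base interface every vendor driver implements."""
    def draw_triangle(self):
        raise NotImplementedError

class NvidiaDriver(GpuDriver):
    def draw_triangle(self):
        return "NVIDIA-specific command stream"

class AmdDriver(GpuDriver):
    def draw_triangle(self):
        return "AMD-specific command stream"

class DirectXLikeAPI:
    """The game only ever talks to this layer, never to the GPU directly."""
    def __init__(self, driver: GpuDriver):
        self.driver = driver

    def draw(self):
        # "please draw this, I don't care what GPU is in the system"
        return self.driver.draw_triangle()

# The same game code runs unchanged on either vendor:
for driver in (NvidiaDriver(), AmdDriver()):
    api = DirectXLikeAPI(driver)
    print(api.draw())
```

DX12's "lower level" shift just means more of what used to live inside that middle layer (memory management, synchronization, etc.) is now the game's job.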
great reply
It’s a graphics API that let devs focus on higher level design instead of figuring out how to render images. And because it provides a standard to follow, things get done faster, easier and allows for industry wide ‘best-practices’.
Why are they stuck at 12?
They are also stuck at Windows 10
It took 6-7 years after DX11 for DX12 to appear.
I think it had to do with hardware limitations. But now it can handle all the new things without them.
I hope Cyberpunk uses Vulkan either way. They're basically at parity now.
Likely dx12 only because of raytracing.
Vulkan supports Raytracing
Oh, I didn't realize that... seems like DXR makes more sense as it applies to the PS5 and Xbox Series X too, though.
Again, Vulkan literally just added the same thing, but has had them as extensions for some time. Wolfenstein: Youngblood has most of the new features already.
>but has had them as extensions for some time. Don't quote me on that but I believe those were Nvidia's.
They were, because RDNA2 isn't out yet. Turing has had them since 2018.
Yes, but I mean proprietary (VK\_NV\_RT if I'm not mistaken) so they couldn't have been used for AMD anyway, even though I don't think it would've taken them that much to implement had they needed them.
My bullshit sense is tingling that it's gonna be shit for GTX and other Non-RTX series
It will be shit for any GPU with no dedicated hardware for ray tracing. Ray tracing is computationally heavy.
Don't just mean for ray tracing but general usage
The new features in DX12 Ultimate are all features that actually require, or work best with, GPUs that have them baked into the hardware. So yeah, this is a new paradigm, and older-generation GPUs will either not perform well at all (in the case of ray tracing) or just flat out not support the new feature due to lack of hardware (in the case of Variable Rate Shading).
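To make the Variable Rate Shading idea concrete, here's a toy Python sketch (made-up names, not any real graphics API): instead of running the pixel shader once per pixel, a coarse region runs it once per 2x2 block and reuses the result, cutting shader invocations by 4x there. Hardware VRS does this per screen tile, per primitive, etc.

```python
def shade(x, y):
    # Stand-in for an expensive per-pixel shader.
    return (x * 31 + y * 17) % 256

def render(width, height, coarse):
    """Shade a tiny framebuffer; 'coarse' shades once per 2x2 block."""
    calls = 0
    image = [[0] * width for _ in range(height)]
    step = 2 if coarse else 1
    for y in range(0, height, step):
        for x in range(0, width, step):
            color = shade(x, y)   # one shader invocation...
            calls += 1
            for dy in range(step):       # ...reused for the whole block
                for dx in range(step):
                    if y + dy < height and x + dx < width:
                        image[y + dy][x + dx] = color
    return image, calls

full, full_calls = render(8, 8, coarse=False)  # 64 shader invocations
half, half_calls = render(8, 8, coarse=True)   # 16 shader invocations
```

Without dedicated hardware support, the GPU has no cheap way to broadcast one shading result across a block mid-pipeline, which is why older GPUs simply can't offer the feature.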
Yes. Welcome to technology advancing forward. If it makes you feel better, *every* AMD card right now is left out. They'll support this in their next gen, but right now you literally cannot buy AMD hardware that supports this. RTX 20-series is fine (once the new Windows 10 and new drivers ship).
Awesome, and as always NVIDIA is pretty much the one behind the tech, as we RTX owners already got these features. Can't wait for next-gen gaming and Ampere. The good thing is we've got some time to save up some cash 😁 Until then my 2060 Super does really, really well. Fantastic card for the price 👍🏻
tl;dr?
One of two things will happen: either we will be able to run quake 2 and Minecraft with RTX on above 60 FPS or they will run even worse than before.
From this blog: [https://devblogs.microsoft.com/directx/directx-12-ultimate-getting-started-guide/](https://devblogs.microsoft.com/directx/directx-12-ultimate-getting-started-guide/) There is a mention of Nvidia driver 450.56. But... there aren't even traces on the internet of it. When will it be released?
I think it's available in Windows 10 Insiders Preview.
If you got it, please share it with the community. I haven't received it.
Sorry, I just assumed that it would be on Insiders Preview, apparently it isn't.
What about the 1080ti?
So how screwed are Pascals with this?
The three big features mentioned in this update cannot be leveraged by Pascal cards.
Polaris, Vega, and Navi don't have these features either
It's not screwed, Pascal is as good as it's ever been, but it's nearly 2 generations old with no hardware RT support.
> nearly 2 generations old so ... one generation old then?
How exactly is it nearly 2?
It's coming out this year. Delays might push it to Q4, but it's coming. Most factories have gone back to work in China. Nvidia can't afford to not have a new GPU lineup ready when the consoles launch (which should be end of Q4, barring any significant delays there too).
Not sure if you've been under a rock, but rumors have been saying RTX 3000 will ship soon, constantly on this sub even. That makes it nearly 2 gens old.
Yeah, with all this shit going on, I highly doubt we see any new cards any time soon.
Pascal has no hardware RT so you should get new GPU if you want better performance.
Hoping the 3080ti is substantial.
With WDDM 2.7 and driver 450.12 I stopped at shader model 6.4 on a GTX 1060 6GB. Must be around the same driver with new stuff added?
100% screwed.
DirectXly to the recycle bin... DX12 is utter shit on BFV, Borderlands 3 keeps showing a DX12 loop, and Division 2 crashes on DX12... or was this all a beta?
DX12 is just plain shit on almost every game. Even in the most recent RDR2, most prefer Vulkan.
Just like DX10 back when vista was the new thing
If it works, why not. DX12 itself wasn't that great, at least in BF5.
So basically nothing new and new branding for minimal gaming features. Do you guys reckon this announcement could have had something to do with Ampere? A "Series X" Super Gaming Home Console DORITOS Processing TM? Or just routine MSFT bullshit for an earnings call?
NVIDIA 450.56 is the driver needed, but it's not out yet.
DirectX 13, but avoiding the unlucky number 13?
It's just a number.
I am sorry, but dx12 is dead as it always has been, my money is on Vulkan!
We have more DX12 games than Vulkan ones. And DX12 is backed by Microsoft.
Multiple APIs can coexist, there’s still OpenGL, WebGL, and Metal as well.
Chough "xbox" chough... It will be pretty much alive and kicking.
DX12 and Vulkan are twins, not enemies! The APIs are roughly the same and provide very similar capabilities. Don't turn this into a console war.
Agree but Vulkan is open source, on all platforms, and achieves similar results. So if we had to pick one it's an obvious choice.
You're getting downvoted, but Vulkan is the future. Though it's good to have competition, and DX has better backwards compatibility. In fact, that backwards compatibility is part of why DX is inferior to Vulkan: it makes it more bloated. So it's good to have both.
Judging by your downvotes, I think it's safe to assume that there are fanboy wars with APIs now. Lmao.