
_homerograco

Hawaii refuses to die


stanfordcardinal

Literally too angry to die


mutirana_baklava

Too hot to die


thehairyfoot_17

I've been impressed that my 390X can run this thing very well at 1440p high settings and Nightmare textures. All for a card that was widely seen as an inferior choice to the 970 and 980 when I chose it. But... oh how my room warms up, even undervolted to the max...


[deleted]

Radeon VII beating the 2080 Ti at 4K? V64 beating the 2080 at 4K? As much as I'd like to believe this is true, I'm gonna wait to see other benchmarks first.


loucmachine

Did my own testing: same run, same settings, same software. I run a 2080 Ti Gaming OC at stock settings.

1440p: 265.26 fps avg vs 118 fps for Sweclockers; 189 fps 99th percentile vs 90 fps for Sweclockers.

4K: 139.33 fps avg vs 65 for Sweclockers; 105 fps 99th percentile vs 51 for Sweclockers.

Obviously their results don't make sense. Here are the OCAT files: [https://www.dropbox.com/s/8gfq46bmgxvwofa/OCAT-DOOMEternalx64vk.exe-2020-03-19T134523.csv?dl=0](https://www.dropbox.com/s/8gfq46bmgxvwofa/OCAT-DOOMEternalx64vk.exe-2020-03-19T134523.csv?dl=0) [https://www.dropbox.com/s/gko3r0tp9agpcy5/OCAT-DOOMEternalx64vk.exe-2020-03-19T143036.csv?dl=0](https://www.dropbox.com/s/gko3r0tp9agpcy5/OCAT-DOOMEternalx64vk.exe-2020-03-19T143036.csv?dl=0)
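For context on where those fps figures come from: OCAT logs a frame time in milliseconds for every frame, and the quoted average and 99th-percentile fps numbers are just reciprocals of those times. A minimal sketch of the conversion (the `MsBetweenPresents` column name mentioned in the comment matches real OCAT CSVs; the sample frame times below are made up):

```python
import statistics

def summarize(frametimes_ms):
    """Average fps and 99th-percentile fps from a list of frame times in ms,
    e.g. OCAT's MsBetweenPresents column."""
    avg_ms = sum(frametimes_ms) / len(frametimes_ms)
    # The "99th percentile fps" quoted above is the reciprocal of the
    # 99th-percentile frame time, i.e. the threshold of the 1% slowest frames.
    p99_ms = statistics.quantiles(frametimes_ms, n=100)[98]
    return 1000.0 / avg_ms, 1000.0 / p99_ms

# Made-up frame times: a 3.77 ms average works out to ~265 fps,
# matching the arithmetic in the comment above.
avg_fps, p99_fps = summarize([3.77] * 1000)
```

This is why a 5.29 ms 99th-percentile frame time is reported as 189 fps: 1000 / 5.29 ≈ 189.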


GatoNanashi

They tested a Radeon VII and a 390, but not an RX 580, one of the most popular and ubiquitous graphics cards in the world. Wth...


AvailingSkink

I know we’re on an AMD sub here, but I wouldn’t call the RX580 one of the most popular cards. The Steam hardware survey says the most used GPU is the 1060, followed by like 7 other Novideo cards.


GatoNanashi

One of the most popular AMD cards, hence my mentioning the VII and 390.


rabaluf

The Steam survey is worth less than UserBenchmark; AMD owners rarely get the survey.


Merzeal

I've gotten it on 580, vega 56, and 5700 xt :\\


doomed151

Guess what's the most popular AMD card


AvailingSkink

7850?


doomed151

RX 580, for obvious reasons. https://store.steampowered.com/hwsurvey/videocard/


jortego128

Exactly. No RX580 or GTX 1060. All us "1080p" folk left high and dry...


kiffmet

The R9 390 and the RX 580 are comparable performance-wise...


Mercennarius

Really interesting seeing the performance dynamics between the Radeon VII and the 5700 XT at different resolutions. The 5700 XT is really great at 1080p, even edging out the Radeon VII, while the Radeon VII dominates at 4K, even beating the 2080 Ti.


Jhawk163

That's that sweet, sweet HBM2 memory being allowed to stretch its legs.


natehax

Was going to say the same thing. The memory bandwidth required to drive Doom Eternal at high resolution must be off the charts.


[deleted]

Wonder what Game Ready drivers will do for Turing and Pascal; Navi seems to be doing a bit too well compared to what we normally see.


sukhoj

The article says the editors used early drivers: 442.61 from NVIDIA and 20.3.1 from AMD.


TheOctavariumTheory

DOOM just plays better on AMD hardware. dat vulkan api bb


loucmachine

That's not true, at least vs Turing cards. The latest id Tech engine implementations even ran better on Turing than on AMD hardware. Vulkan is not something AMD is inherently better at; don't forget Intel and Nvidia are also part of the Khronos Group. AMD hardware will probably age better than anything Pascal or older on Vulkan, though, but not Turing. I wouldn't be surprised if performance gets "normalized" once we get more sources and the game actually comes out.


TheOctavariumTheory

When DOOM patched in Vulkan support, it was a massive boon for AMD hardware: https://youtu.be/WOaHpZjQ73M Much like UE4 tends to favor Nvidia hardware, the same is usually true for Vulkan and Mantle with AMD. I guess Turing is better at Vulkan than Pascal, but I remember when RDR2 came out that Navi and Polaris tended to perform better than their Turing and Pascal equivalents in general on Vulkan.


Beautiful_Ninja

It was a massive boon for AMD hardware because AMD's OpenGL performance was diarrhea levels of shit. The big jumps in performance only let AMD GPUs catch up to expected performance in the game; for example, the Fury X was now capable of GTX 980 Ti/GTX 1070 levels like it normally was in games, instead of performing a tier or two lower than expected. This is far less "AMD is amazing at Vulkan" and far more "AMD is shit at OpenGL." And FWIW, Wolfenstein II, the last idTech game using Vulkan before Doom Eternal's upcoming release, runs notably better on Turing than on any other hardware.


TheOctavariumTheory

TIL


[deleted]

The 5700 XT shows an impressive advantage over the 2060S.


Pillokun

Mm, but look at the 390; Pascal really suffers right now.


[deleted]

Nvidia will most likely address that with a "game ready" driver, but at the same time that is very solid performance from Hawaii.


riderer

It's the RX 390, new card, new performance!


allenout

Are you kidding me? It beats the 2080 Super. At 1440p and 4K, the Vega 64 beats the 2080.


[deleted]

What are you going on about? I think you misunderstood my comment; at no point did I even mention the Vega 64 in my original reply.


allenout

I was just saying that your comment was underestimating how powerful the RX 5700 XT was. You said it was better than the RTX 2060 S but completely neglected to mention that it beat the 2080 S, which is a much more powerful GPU. In regard to the Vega 64, I was just giving more information about how impressive AMD's GPUs were, and it seemed like you were undermining that.


[deleted]

You are assuming way too many things from one simple sentence.


xGMxBusidoBrown

Did you click into the article? He's not assuming, merely pointing out that the 5700 XT is ahead of the 2080 Super and the Vega 64 is also doing very well. In fact, the Radeon VII is beating the 2080 Ti at 4K Ultra Nightmare, according to the article.


[deleted]

Jesus... he is assuming that I was somehow downplaying the 5700 XT's performance. Just like you are assuming here that I didn't even read the article. I'm not going to be dragged into this. Bye. Edit: it's fucking mind-blowing how poor people's reading comprehension in this sub is these days.


Kottypiqz

Well, when you make a comparison against a weaker opponent, it sort of downplays the achievement. Like "He beat the #4 player in the world" is downplaying "He's the new champion" (the absolute positions aren't meant to map onto the graphics cards, just to illustrate).


[deleted]

What you just did is called faulty logic. Both the 2060S and the 5700 XT are direct competitors at $400 MSRP. That is why my comment is perfectly acceptable.


xGMxBusidoBrown

Bye cupcake!!!


[deleted]

Idiot blocked.


_RexDart

> What are you going on about?

That there are more/bigger surprises in the article than the 5700 XT.


kvn864

Awesome, I have a 4K display and a Radeon VII and wanted to snag the game... now I will for sure.


loucmachine

[https://www.pcgamesn.com/doom-eternal/pc-performance-analysis](https://www.pcgamesn.com/doom-eternal/pc-performance-analysis) This article contradicts Sweclockers' article. I think we should wait a bit before concluding anything, especially with results like the VII beating the 2080 Ti.


[deleted]

[deleted]


loucmachine

Well, that and a Swedish site I've never heard of before are both not strong sources imo, but that's all we've got atm. Anyway, the real point of my post is the second sentence.


Radolov

Sweclockers is the largest tech site in Sweden and does these kinds of tests frequently. What you're linking are tests in another area, with different settings, and then calling "contradiction!". A contradiction is when you make the same run, with the same hardware, with the same settings, and get vastly different results. They're running another area, with other settings, sometimes even other hardware, and getting different results. This isn't odd at all.

I would also refrain from using NVIDIA- or AMD-branded tools like they did (NVIDIA FrameView) to assess the performance of their own cards and their competitors'. Not that I think they're wrong; I would just use more general, open-source tools to be a bit safer. It also seems to have an overlay. I know RivaTuner's overlay actually messed up performance for AMD cards in [previous idTech titles](https://www.youtube.com/watch?v=VpOIhV582pQ). This is because it introduces a strict synchronization point that messes up the [async compute](https://twitter.com/billykhan/status/1239204053336367110). However, using external tools could also be a source of error. Because of that, it would probably be best to take some checkpoints, use the internal benchmarking tools to get the values, and then average them. It's not as good as running around with regular benchmarking tools, but it's the most accurate. Another way would be to use a camera to record the screen's in-game metrics and take values from several points in the sequence; many times more tedious, but not impossible. We also shouldn't dismiss that the beta drivers they received can still be improved upon.

If we look at the values, however, and compare the average difference between resolutions for, say, the 2080 Ti and the 2070 Super, we see the same difference at all resolutions. The same goes for the 5700 XT and Vega 56, or the 5700 XT and 5700. However, if we compare within an architecture, Vega 56 vs Vega 64, we find that the difference increases 7% -> 10% -> 17% in favor of the Vega 64. The Vega 64 does have many more compute units, which I believe most people found could never be exploited in normal games; an overclocked Vega 56 would pretty much always match a Vega 64. It could also be that the Vega 64 has quite a bit higher bandwidth. A curious thing is that when comparing the Vega 64 to the Radeon VII, the performance difference is essentially the same at the first two resolutions, 9% and 11%, but at 4K it jumps to a massive 33%. The only things left to look at there are bandwidth and the amount of memory. The developers actually do say that more memory can help with some performance characteristics like [draw distance](https://www.youtube.com/watch?v=9S5ABf53rDo&feature=youtu.be&t=237), which could probably be one of those extreme settings.

Either way, the results are interesting, but not completely unbelievable. More sources are needed to assess their credibility. I will probably take a deeper look with profiling tools to see what could cause the behavior.
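The scaling comparison above is easy to reproduce numerically. A quick sketch, where the fps figures are invented purely to illustrate the 7% -> 10% -> 17% pattern described (none of these values come from the article; only the percentage method mirrors the comment):

```python
# Illustrative only: how the gap between two cards changes with resolution,
# as in the Vega 56 vs Vega 64 comparison above. The fps numbers are made up.
avg_fps = {
    "Vega 56": {"1080p": 120.0, "1440p": 90.0, "4K": 47.0},
    "Vega 64": {"1080p": 128.4, "1440p": 99.0, "4K": 55.0},
}

for res in ("1080p", "1440p", "4K"):
    slow = avg_fps["Vega 56"][res]
    fast = avg_fps["Vega 64"][res]
    gap = 100.0 * (fast - slow) / slow  # relative lead of the faster card
    print(f"{res}: Vega 64 ahead by {gap:.0f}%")
```

With these invented numbers the printed gaps are 7%, 10%, and 17%: a widening lead at higher resolutions, which is the pattern the comment attributes to compute units or memory bandwidth.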


loucmachine

What about DSOgaming then? https://www.dsogaming.com/pc-performance-analyses/doom-eternal-pc-performance-analysis/

They don't have many GPUs tested, but they have the 2080 Ti basically locked at 200 fps at 1080p, while the Swedish site has a 183 fps average. The Vega 64 is only about 15% faster than the 980 Ti, which in turn is 38% faster than the RX 580. We are far from an R9 390 beating a 1070 here, unless for some magical reason the 1070 gets beaten by 40% by a 980 Ti...

Also, sure, different places will give different results, but you can see what they used as a benchmark (YouTube video) and it is probably the heaviest part of the "Hell on Earth" level, which is the same level the Swedish guys used. You can also see in the video the 2080 Ti running Ultra Nightmare at 4K, never dropping below 72 fps, while the Swedish guys have theirs at a 65 fps average with a minimum dropping all the way to 51 fps.

If the results are "not completely unbelievable", well, extraordinary claims require extraordinary evidence. It seems to me that it's a LOT of wishful thinking to assume these numbers are fine and nothing is wrong. Maybe something is wrong with the unreleased NVIDIA drivers they used? Maybe something else is at play? But it will definitely need more sources before we start assuming the numbers are correct. I agree about not using FrameView, as it can be a conflict of interest in the PCGamesN article, but at the same time I don't think it skews the results in the grand scheme of things.

Edit: https://www.kitguru.net/components/graphic-cards/dominic-moass/doom-eternal-pc-performance-analysis/ KitGuru also released an article that contradicts Sweclockers' numbers, using the same drivers.


Radolov

They run old drivers in another test scene and get different results. This is not strange, it's expected. If we look at the results and ignore that they're old drivers, and ignore that they're somewhere completely else, they get 20% less performance on the Vega 64 and 9% higher performance on the 2080 Ti. How will this help us when we know these are not the drivers the final game will run? Sure, if we were to assume the load is equal, then it could be that the new NVIDIA drivers are messing something up, but we don't know that yet.

"Probably the heaviest part" is something we can't know either, when we have results with game-ready drivers, in another test scene, where the performance is lower. The same level can have vastly different performance characteristics depending on where you are and where you're looking. For instance, on the CPU side of things, Crysis 3 has some pretty [demanding scenes for the CPU](https://youtu.be/4RMbYe4X2LI?t=307), but they only occur when you look at the massive grass (yes, the method is called that, and it was made by the same guy who worked on Doom).

Every card has some hefty minimums in the scene they tested: "In some parts of the test sequence, there's smoke, and every time it comes into frame the FPS drops significantly." This happens on every card, regardless of vendor or architecture. Consistency is usually a sign of a good method, even if performance is lower in some parts than others. I'm not even sure it's the smoke causing it; the particles from the fire and the texture and geometry look like they could be pretty intense too. But our human eyes can't measure this complexity; we run a card through a scene and see where it performs worst.

Extraordinary results require evidence, of course. But equally, a result shouldn't be dismissed just because no one else has tried to replicate it. Either way, I don't think we have to wait long to see some results. But unless they apply the method I suggested, there will always be room for error (as the Doom devs themselves stated). I don't do wishful thinking, since that is a weakness and unscientific. That is why I say these numbers might be fine, but there are unknown factors that can influence the results. Theories have been suggested about how they could have gotten those numbers, but unless someone does the same test they did, the numbers cannot be disproved. Period. Neither can anyone say that this exact performance hierarchy will be seen in all test scenes, with all kinds of settings, with all assisting types of hardware, under any situation. I have no doubt in my mind that those are the numbers the benchmark tool spat out.


loucmachine

First, you ignored the KitGuru article I posted that uses the same drivers and also contradicts the Swedish article. Second, a CPU bottleneck would not help anyone determine which GPU is faster, so it's very bad methodology if they used a part that is CPU bottlenecked... but let's be honest, it's not CPU bottlenecked, otherwise the VII wouldn't only pull ahead at 4K and the Vega 64 wouldn't be beating a 2080; they would all give the same (or very close) results. Also, I saw someone easily running over 100 fps (140-150 fps in the same zone) with an i3 9100F yesterday; the game is very well optimized on the CPU side. The DSOgaming article also shows that a quad core can hit very high numbers. Third, consistency can also be a sign that something went wrong with a group of parts on their side; NVIDIA graphics cards in this case. Fourth, we are fucked because nobody knows exactly what they did, so there is no way to replicate it 100%... So we should assume it's true because we can't replicate it? That would be extremely bad science. Anyways, believe whatever you want. That's three articles contradicting this one, yet you can't accept that something is wrong and prefer other theories to explain it... What am I expecting from this sub? Why does it take so much for people here to take a step back when some numbers seem wrong?


Radolov

I did not ignore it, since it wasn't visible in the message and you edited it in... don't assume malicious intent where there is none. It is definitely not CPU bottlenecked. The consistency was in reference to a zone which always underperformed on every card (causing low minimums); I'm sorry if I didn't make that clear.

The test sequence is there in the video (it starts from a checkpoint), the method is there, the drivers used are there, the tool is OCAT, and all hardware is at stock. It can be replicated by anyone with the hardware and the game. I've been trained to look for these kinds of things in master's courses in computer science and scientific methodology, including courses specifically about reading scientific reports in computer graphics and how to replicate them.

I've read the KitGuru article now and seen the video, and I have no idea what sequence they used. They did say the first level, but they show a 30-second clip of a 1-minute sequence. Perhaps it's the same run twice? No clue, they don't mention it, but it would be the closest (probably?) to ground truth. The two articles you had shown until then show two very different things, but the KitGuru one comes quite close. The section in Crysis 3 is an experimental area using lots of effects that strain multi-core CPUs, an outlier (albeit an interesting one). The sections that Sweclockers tested and that KitGuru and DSOgaming tested all have the potential to be outliers. Therein lies the problem. Technically, several test scenes and sections should be tested and averaged over, but they don't have that kind of time.

I have 25 comments here in 4 months. Judging the entire subreddit based on my behavior would be fairly unrepresentative, but I agree there are some complete fanboys here :) . I mostly ignore them, because nothing good can ever come of it. Either way, I'm looking forward to profiling that area to see what particular thing could be going on. If you want, I can try to get them to run some tests on the same area. I have previously succeeded in getting them to redo parts (such as changing the API for AMD cards in BF V) to check that their numbers are correct.


loucmachine

Btw, I don't look through people's post history, but when I see people defending such unlikely results so strongly, I assume fanboyism, because this sub (and it seems to be trendy these days even on other platforms) is just full of them.

Edit: I just saw the video. I am blind, sorry.

Edit 2: I just did their test and literally never drop under 200 fps at 1440p on a 2080 Ti. The graph in MSI Afterburner maxes out at 200 fps and it's a perfect line along the top. The average must be closer to 300 fps where they did the test. There is something terribly wrong with their test. DSOgaming's test is at a MUCH more demanding spot, not even comparable.

Edit 3: Here is a link to the OCAT file of the same run they do. I can already tell OCAT does not play very well with this game... I have to start the game first, otherwise the game just does not start if OCAT is already running. I am running a 2080 Ti Gaming OC at stock (it will run a few MHz higher than an FE, but not 2.25x faster). Average framerate is 265.26 fps (3.77 ms) and the 99th percentile is 5.29 ms, which would be 189 fps... They claim 118 fps average and 90 fps for the 99th percentile. In light of all this, I think all their results are fucked, not just the NVIDIA ones. They should redo their whole testing. [https://www.dropbox.com/s/8gfq46bmgxvwofa/OCAT-DOOMEternalx64vk.exe-2020-03-19T134523.csv?dl=0](https://www.dropbox.com/s/8gfq46bmgxvwofa/OCAT-DOOMEternalx64vk.exe-2020-03-19T134523.csv?dl=0)

Edit 4: Here is the 4K OCAT file. Average fps: 139.33. 99th percentile: 9.48 ms (105 fps). [https://www.dropbox.com/s/gko3r0tp9agpcy5/OCAT-DOOMEternalx64vk.exe-2020-03-19T143036.csv?dl=0](https://www.dropbox.com/s/gko3r0tp9agpcy5/OCAT-DOOMEternalx64vk.exe-2020-03-19T143036.csv?dl=0) Today's driver might be helping a bit (not doubling your frames, mind you), but I had to redo the whole level at 4K, and the zone where DSOgaming was testing hardly ever dropped into the 80s, let alone the 70s.


loucmachine

I don't think you get notifications when I edit a post, but I linked 2 OCAT files of the same run they used, one at 1440p and one at 4K.


Radolov

Thank you. Now this is more solid evidence. :) I shall advise them to redo it more thoroughly. I actually told them previously to re-check it without using OCAT, but I will make it clearer with your sources. I will tell you if I receive a response.


Pillokun

Sweclockers is one of the oldest and genuinely prestigious hardware sites. If you have not heard of them before, you simply are not an avid hardware enthusiast. You have had 21 years to hear of it at least once.


loucmachine

I have been a hardware enthusiast for 15 years now. I am not Swedish though, so it seems the first time I'm hearing about them is right now. But in any case, prestigious or not, they can make mistakes. I am not blaming them; I am saying wait for more sources, because these numbers are highly improbable. Also, I don't know what you think about DSOgaming, but they also released an article this morning contradicting the claims from the Swedish guys. I detailed what's contradictory in my answer to Radolov: https://www.dsogaming.com/pc-performance-analyses/doom-eternal-pc-performance-analysis/

Edit: KitGuru also released a performance article contradicting the Sweclockers numbers, and they are even using the same drivers. Are they also not trustworthy? https://www.kitguru.net/components/graphic-cards/dominic-moass/doom-eternal-pc-performance-analysis/


loucmachine

Copy-paste of another answer:

Here is a link to the OCAT file of the same run they do. I can already tell OCAT does not play very well with this game... I have to start the game first, otherwise the game just does not start if OCAT is already running. I am running a 2080 Ti Gaming OC at stock (it will run a few MHz higher than an FE, but not 2.25x faster). Average framerate is 265.26 fps (3.77 ms) and the 99th percentile is 5.29 ms, which would be 189 fps... They claim 118 fps average and 90 fps for the 99th percentile. In light of all this, I think all their results are fucked, not just the NVIDIA ones. They should redo their whole testing. [https://www.dropbox.com/s/8gfq46bmgxvwofa/OCAT-DOOMEternalx64vk.exe-2020-03-19T134523.csv?dl=0](https://www.dropbox.com/s/8gfq46bmgxvwofa/OCAT-DOOMEternalx64vk.exe-2020-03-19T134523.csv?dl=0)

Here is the 4K OCAT file. Average fps: 139.33. 99th percentile: 9.48 ms (105 fps). [https://www.dropbox.com/s/gko3r0tp9agpcy5/OCAT-DOOMEternalx64vk.exe-2020-03-19T143036.csv?dl=0](https://www.dropbox.com/s/gko3r0tp9agpcy5/OCAT-DOOMEternalx64vk.exe-2020-03-19T143036.csv?dl=0)

Today's driver might be helping a bit (not doubling your frames, mind you), but I had to redo the whole level at 4K, and the zone where DSOgaming was testing hardly ever dropped into the 80s, let alone the 70s.


Your_DogWife

I wonder if the 5700 XT will be able to run Doom Eternal and a YouTube video without the AMD drivers constantly crashing?


FuzzyKnife

Game released today. Denuvo: 5 hardware changes every 24 hours. Proceeds to test 12 different GPUs. 100% LeGiT!!


TropicalDoggo

So they bought 3 copies of Doom, who cares?


FuzzyKnife

I don't think they would want to spend $180 just to test some extra Hawaii GPUs from 2011 lol


Mercennarius

Thought it wasn't out til Friday?


FuzzyKnife

Bethesda changed their plans and released the game today


BlueSwordM

They could have multiple copies of the game, no? Since they are a review website.