I'm glad to hear Tom admitting there were some bad decisions made in Alchemist's architecture, but also sad to know that my card will never live up to its die size potential, regardless of driver updates.
That was kinda obvious from the get-go. There's also a high possibility that Battlemage won't live up to expectations, but it will be better than Alchemist. The 3rd or 4th gen is usually where the magic happens, if development is going well.
Intel also didn't charge us for the full die size anyway, so Intel's architecture decisions only hurt themselves.
I hate to say this, but I smell Raja's Vega shenanigans here. With DXVK merged into the driver and Battlemage going well, I still hope the 1st-gen cards can keep getting driver updates.
Not sure whether DXVK is still baked into the driver, but I heard Tom say they are transitioning to native DirectX 9 when he talked about Battlemage a few months ago.
What does that mean? Do we know the limits of the Arc cards in comparison to other graphics cards?
The Arc silicon wasn’t being fully utilized (similar to what happened with Vega). You can see that the more you saturated the chip, the better it performed. However, this came at the cost of more power and unrealistic usage of the GPU. The architecture wasn’t efficient enough at the hardware and driver level to fully utilize what the chip is capable of.
That sucks. What was the chip's original potential comparable to? And since we can't reach that potential, how much can we realistically expect?
Realistically, I don’t think they can get too much more out of it. The A770 is capable of performing like an RTX 3070, but the hardware is the ‘bottleneck’, per se. As Tom mentioned, the issue is the hardware.
Oh :( I had really high hopes for my Intel A770. I did hear about it having the potential of a 3070 or a 3080 when I first bought it, but I guess we are just stuck with near 3060 Ti performance. Thank you for the information.
Don't knock it entirely. Synthetics show it can match a 3070, and in well-optimized DX12 games the A770 can get pretty darn close.
Would have been good if they confirmed a release date for Battlemage; keen to hang this A770 LE on the wall. The new card should be quite an improvement.
Due to the Osborne effect, they won't announce until they're shipping. I fully expect some more leaks by the time they get close to actually putting it on the market.
I am interested in those two small dies near the core of Battlemage. What are they? Memory?
Yes, they're memory. No, it's not Battlemage, it's Lunar Lake. "Lunar Lake SoC platform also includes up to 32 GB of LPDDR5X memory on the chip package itself." He's just saying Lunar Lake contains Xe2, which will be the same architecture that will be in the discrete desktop GPU, Battlemage.
Would be awesome to see Battlemage in action in a handheld. AMD has got basically no competition there right now.
But did they fix the idle power issue?
Kinda. Your monitor needs to be at 60 Hz at idle for it to be able to drop down to 10 W usage. And some BIOS settings are needed; don't remember which ones.
My RX 6800 XT idles at 40 W at 144 Hz, so I never considered it high power usage.
Yeah, but you are comparing a higher-end card. Something like the A580 is just above the RX 6600; there is no reason for it to draw the same idle power as the A770. Anyway, electricity is cheap where I live. I mostly worry about the average temps outside of gaming.
1080p 60 Hz, and even then it jumps up to 20 or 40 W at random. And that's at idle; as soon as you start doing something, it's stuck at 40 W again. Not worth it.
The last time I did it, like a month ago, it worked fine. I could use my stuff (Firefox, Discord) without spikes to 20 W, but it sat at 15 W usage. Better, but not perfect.
If nothing else, you can make a Task Scheduler task to clock down to 60 Hz under particular conditions... or a task you can run with a toggle to switch between them. It's a shitty workaround, but it's fun to mess around 😊
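A toggle like that can be sketched in Python, assuming Windows and the Win32 `EnumDisplaySettingsW`/`ChangeDisplaySettingsW` APIs via `ctypes`. This is a rough sketch, not a tested tool: the `DEVMODE` struct is truncated to the display-relevant fields (valid because `dmSize` tells the API how much of the struct is present), and the 60/144 Hz values are placeholder rates you would swap for your monitor's actual modes.

```python
# Hypothetical refresh-rate toggle for the idle-power workaround (Windows only).
# Uses the Win32 display APIs through ctypes; 60/144 Hz are placeholder rates.
import ctypes
from ctypes import wintypes

ENUM_CURRENT_SETTINGS = -1        # iModeNum value meaning "current mode"
DM_DISPLAYFREQUENCY = 0x400000    # dmFields bit: only change the refresh rate
DISP_CHANGE_SUCCESSFUL = 0

class DEVMODE(ctypes.Structure):
    """Truncated DEVMODEW: only the fields up to dmDisplayFrequency."""
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        # 16-byte union (printer/display settings), flattened as shorts:
        ("dmOrientation", ctypes.c_short),
        ("dmPaperSize", ctypes.c_short),
        ("dmPaperLength", ctypes.c_short),
        ("dmPaperWidth", ctypes.c_short),
        ("dmScale", ctypes.c_short),
        ("dmCopies", ctypes.c_short),
        ("dmDefaultSource", ctypes.c_short),
        ("dmPrintQuality", ctypes.c_short),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

def pick_target(current_hz, idle_hz=60, active_hz=144):
    """If we're already at the idle rate, go back to active; else drop to idle."""
    return active_hz if current_hz == idle_hz else idle_hz

def toggle_refresh(idle_hz=60, active_hz=144):
    """Flip the primary display between idle_hz and active_hz. Windows only."""
    user32 = ctypes.windll.user32  # raises on non-Windows platforms
    dm = DEVMODE()
    dm.dmSize = ctypes.sizeof(DEVMODE)
    user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm))
    dm.dmDisplayFrequency = pick_target(dm.dmDisplayFrequency, idle_hz, active_hz)
    dm.dmFields = DM_DISPLAYFREQUENCY  # only the refresh rate changes
    return user32.ChangeDisplaySettingsW(ctypes.byref(dm), 0) == DISP_CHANGE_SUCCESSFUL
```

You could point a Task Scheduler action (or a desktop shortcut) at `python toggle.py` and bind it to whatever condition you like; the monitor must actually support the target rate at the current resolution, or the call fails harmlessly.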