Just wanted to share about UMA_Game_Optimized because I had no clue this setting existed and happened to stumble across it today. Not sure if it's adaptive to how much RAM you have, but after enabling it the GPU now shows 4 GB of RAM.
It's anecdotal, but (on Linux at least) Steam in 4K locks up and the driver appears to crash with Auto, while it does just fine with the reserved 4 GB.
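If you want to confirm what the driver actually reports after changing the BIOS setting, the amdgpu kernel driver exposes the carve-out size via sysfs. A minimal sketch (the `card0` index is an assumption and may differ on your system):

```python
from pathlib import Path

def format_bytes(n: int) -> str:
    """Format a byte count as GiB with one decimal place."""
    return f"{n / 2**30:.1f} GiB"

def reported_vram(card: str = "card0") -> str:
    # amdgpu reports the dedicated (UMA carve-out) VRAM size here;
    # the card index is an assumption and may vary per machine.
    p = Path(f"/sys/class/drm/{card}/device/mem_info_vram_total")
    if not p.exists():
        return "no amdgpu sysfs entry found"
    return format_bytes(int(p.read_text().strip()))

# A 4 GiB carve-out would be reported as 4294967296 bytes:
print(format_bytes(4 * 2**30))
```

Running `reported_vram()` on a machine where the setting took effect should print something close to "4.0 GiB".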
I'd think you'd have a very hard time getting reports other than anecdotal ones on this specific topic, so that's totally fine. As I have no AMD iGPU available to test, I have no idea how common that kind of problem is.
But it sounds a lot like either the AMD Linux driver or the Linux Steam Big Picture implementation is doing something wrong. I believe I explained it in other posts: I would not expect this setting to directly impact performance, but any program that treats the iGPU like a dGPU and tries to manage memory at a low level will probably handle things very wrong when only a tiny amount of memory is allocated.
And while I have no deep knowledge of GPU APIs, which API is used should greatly influence how likely it is for an application to bake in assumptions that only hold for dGPUs or high UMA allocations.
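To make the failure mode concrete, here's a toy model of that kind of baked-in assumption. It's purely illustrative (no real Vulkan calls; the 80% figure and heap sizes are made up): a renderer that budgets textures as a fraction of the device-local heap works fine on a dGPU, but ends up with a uselessly small budget when the Auto carve-out is tiny.

```python
def pick_texture_budget(device_local_heap: int) -> int:
    # Naive dGPU-style logic: reserve 80% of device-local memory for
    # textures, assuming the heap is real dedicated VRAM. The 80% is
    # an invented number for illustration.
    return int(device_local_heap * 0.8)

dgpu_heap = 8 * 2**30        # typical 8 GiB dGPU
uma_auto_heap = 512 * 2**20  # small "Auto" UMA carve-out (assumed 512 MiB)

print(pick_texture_budget(dgpu_heap) // 2**20, "MiB")      # plenty for 4K
print(pick_texture_budget(uma_auto_heap) // 2**20, "MiB")  # far too little
```

With the reserved 4 GB the same logic would again land in a workable range, which fits the anecdote above that the fixed carve-out behaves while Auto crashes.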