I have a Ryzen APU, so I was curious. I tried fiddling with it yesterday and managed to bump the “VRAM” allocation to 16 GB. But xformers and flash-attention aren’t officially supported on iGPUs, and I couldn’t get anything past PyTorch itself to install. It’s a step forward for sure, but it still needs a lot of work.
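For anyone wanting to try the same thing, a minimal sketch of the setup I mean; the ROCm version in the wheel index URL and the GFX override value are assumptions that depend on your APU generation, so check the PyTorch site for the current index:

```shell
# Install the ROCm build of PyTorch (CUDA wheels won't see an AMD iGPU).
# The rocm version in the URL is an assumption; pick the current one from pytorch.org.
pip install torch --index-url https://download.pytorch.org/whl/rocm6.0

# Most APUs are not on ROCm's official support list, so spoofing a supported
# GFX target is the usual workaround (10.3.0 here assumes an RDNA2 iGPU):
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Sanity check: ROCm builds expose the GPU through the torch.cuda API.
python -c "import torch; print(torch.cuda.is_available())"
```

If that prints `False`, the iGPU isn’t being picked up and anything built on top (xformers, flash-attention) won’t work either.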
It had lots of bugs and crashes back last year, but it has improved considerably. Backups and restores are flawless on A13. It’s almost a replacement for Titanium now.