If you are so keen on correctness, please don’t say “LLMs are lying”. Lying is a conscious act of deception, and LLMs are not capable of that. That’s exactly the problem: they don’t think, they just assemble text probabilistically. If they could lie, they could also produce real answers.
What I find weird about Tumbleweed is that updating is not integrated into YaST or another UI. You have to use the command line to keep your system up to date. That makes it exactly as inconvenient as Arch for newcomers, but Arch has a whole philosophy behind this, while SuSE is typically very GUI-oriented. It’s weird.
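For anyone wondering what that command-line step actually looks like on Tumbleweed, it’s essentially this (a distribution upgrade via zypper, which is the recommended way to update a rolling release; the exact flags you prefer may vary):

```shell
# Refresh repository metadata
sudo zypper refresh

# Tumbleweed is a rolling release, so you use dist-upgrade
# ("dup"), not plain "zypper update", to pull in vendor and
# package changes correctly
sudo zypper dup
```

So it’s a single command in practice, but you do have to know it exists, which is exactly the newcomer problem described above.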