Let’s be honest: I only use Java for Minecraft, so that’s the only thing I’ve debugged this with. But every version, server or client, and every launcher: all of them use double (or more) the RAM. In-game the correctly allocated amount shows as used, but on my system double or more is actually taken. So my other apps don’t get enough memory and crash, while the game suffers as well.

I’m not wise enough to know what logs or versions or whatever I should post here as a cry for help, but I’ll update this with anything that helps, just tell me. I have no idea how to approach the problem. One idea I have is to run a non-Minecraft Java application, but who has (or knows about) one of those?
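Something like this would probably do as a non-Minecraft test (a minimal sketch; the class name and the ~900 MB figure are arbitrary). It fills most of a ~1 GB heap and then idles, so the heap the JVM itself reports can be compared with what btop or KDE’s monitor shows for the java process:

```java
// MemTest.java - fill most of the heap, report it, then idle for inspection.
// Run with the same flags as the game, e.g.:
//   java -Xms512m -Xmx1096m MemTest.java   (Java 11+ runs single source files directly)
import java.util.ArrayList;
import java.util.List;

public class MemTest {
    public static void main(String[] args) throws InterruptedException {
        List<byte[]> blocks = new ArrayList<>();
        // Allocate ~900 MB in 1 MB chunks, staying just under the -Xmx1096m limit.
        for (int i = 0; i < 900; i++) {
            blocks.add(new byte[1024 * 1024]);
        }
        Runtime rt = Runtime.getRuntime();
        System.out.printf("Heap used as seen by the JVM: %d MB%n",
                (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024));
        // Keep the process alive so its resident memory can be watched from outside.
        Thread.sleep(Long.MAX_VALUE);
    }
}
```

If the system monitor shows the process using far more resident memory than the printed heap number, the overhead comes from the JVM/native side rather than from Minecraft itself.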

@jrgd@lemm.ee’s request:

launch arguments: [-Xms512m, -Xmx1096m, -Duser.language=en] (it’s this little so that the difference shows clearly; I have a modpack that I give 8 GB to and it uses way more as well, iirc around 12)

game version: 1.18.2

total system memory: 32 GB

memory used by the game: I’m using KDE’s default system monitor, but here’s btop as well:

This test was at max render distance with 1 GB of RAM. It crashed, of course, but only once the process had hit almost 4 GB. What the hell, that’s four times as much!
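Something I could try to narrow this down (a sketch, assuming the game’s Java runtime is a HotSpot-based JDK so Native Memory Tracking and jcmd are available; <pid> is just a placeholder): the JVM can break down its own memory use beyond the heap into thread stacks, metaspace, GC and code cache, which should show where some of the extra gigabytes live.

```
# Add to the launch arguments (debugging only, it has a small overhead):
-XX:NativeMemoryTracking=summary

# While the game is running, ask the JVM for the breakdown
# (replace <pid> with the java process's PID, e.g. from `pidof java`):
jcmd <pid> VM.native_memory summary
```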

I’m on arch (btw) (sry)

  • aksdb@lemmy.world
    3 months ago

    glibc’s malloc creates per-thread memory arenas, and the maximum number of arenas scales with the number of CPU cores you have. The JVM might spawn a huge number of threads, which can increase memory usage outside of the JVM’s heap considerably. You could try to run the JVM with tcmalloc (which will replace the malloc calls for the spawned process). Also, different JVM builds bundle different memory allocators; I think Zulu could also improve the situation out of the box. tcmalloc might still help additionally.
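    If it helps, a rough sketch of how both ideas could be tried from a terminal (the libtcmalloc path is what Arch’s gperftools package installs, so treat it as an assumption and adjust for your setup):

    ```
    # Option 1: cap glibc's per-thread arenas (no extra packages needed).
    export MALLOC_ARENA_MAX=2

    # Option 2: preload tcmalloc so it replaces glibc's malloc for the JVM.
    # Install gperftools first; the .so path below is Arch's and may differ.
    export LD_PRELOAD=/usr/lib/libtcmalloc.so

    # Then start the game from the same shell with the usual arguments, e.g.:
    java -Xms512m -Xmx1096m -Duser.language=en -jar <game/launcher jar>
    ```

    If the game is started through a launcher rather than a terminal, the same environment variables still have to reach the java process; how to do that depends on the launcher.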