Users of lemmy.today are reporting that outgoing federation of posts and comments stopped working after the update to 0.19.1 about 19 hours ago.

A restart of the Lemmy software seems to have made it work again for now, but I'm not sure for how long.
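
For reference, one way to check whether outgoing federation is actually catching up again after a restart is to look at the per-instance send state in Postgres. This is only a sketch: it assumes the 0.19 persistent queue is exposed in a `federation_queue_state` table that references the `instance` table, so verify the table and column names against your own schema (e.g. with `\d` in psql) before relying on it.

```sql
-- Rough check of outgoing federation progress per remote instance.
-- Assumes the 0.19 persistent queue lives in federation_queue_state and
-- that it references the instance table; verify names against your schema.
SELECT i.domain,
       q.last_successful_id,   -- highest activity id confirmed as sent
       q.fail_count,           -- consecutive send failures for this instance
       q.last_retry            -- when the last retry was attempted
FROM federation_queue_state q
JOIN instance i ON i.id = q.instance_id
ORDER BY q.fail_count DESC, q.last_successful_id ASC
LIMIT 20;
```

If `last_successful_id` keeps advancing and `fail_count` stays low for the big instances, outgoing federation is probably flowing again; rows with a growing `fail_count` would point at where it is stuck.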

This version also produces regular CPU spikes. I assume it's the new federation queue, which takes more CPU in exchange for being more reliable. But I also see a lot of Postgres UPDATE queries that did not occur in the previous version, and sometimes a ROLLBACK, which I assume should not be happening.
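
If anyone wants to look at what those statements are, `pg_stat_activity` shows the queries currently running against the Lemmy database. The database name `lemmy` below is an assumption; substitute whatever name your instance uses.

```sql
-- List currently running statements against the Lemmy database,
-- longest-running first, to see what the UPDATEs / ROLLBACKs are doing.
-- 'lemmy' is an assumed database name; adjust for your setup.
SELECT pid,
       state,
       now() - query_start AS runtime,
       left(query, 120)    AS query_snippet
FROM pg_stat_activity
WHERE datname = 'lemmy'
  AND state <> 'idle'
ORDER BY query_start;
```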

Is anyone else having similar issues with 0.19.1?

Relevant thread: https://lemmy.today/post/4382768

  • tal@lemmy.today · 11 months ago

    This post, from a day before yours on the !lemmy_support@lemmy.ml community, describes similar behavior: high CPU usage at start (at least on the first boot; it's not clear from the text whether that is a one-off from the migration) and then federation problems with 0.19.1:

    https://lemmy.ml/post/9563852?scrollToComments=true

    After upgrading Lemmy from 0.18.5 to 0.19.1, the lemmy_server process is taking up 200-350+% of my CPU…It seems like my instance isn’t federating properly now tho.

    • mrmanager@lemmy.today (OP) · 11 months ago

      Yeah, that's what we are seeing here too: huge CPU usage and issues with outgoing federation. Sounds spot on.

    • Loulou@lemmy.mindoki.com · 11 months ago

      I got some weird behaviour with 0.19.0, like nothing worked very well, so I updated to 0.19.1, and after quite a while (small instance, beefy PC) it seems to be working again.

      Except I now appear as a moderator everywhere. Maybe it doesn't actually work; I won't try it unless asked, so could someone make a post that I can sticky or delete, just to see what's happening?

      Cheers!