• 0 Posts
  • 17 Comments
Joined 1 year ago
Cake day: June 14th, 2023

  • Yeah, I reckon splitting the frontend and the backend results in about half the complexity in each. If you have multiple frontends, you can upgrade whichever one is least important first to see if there are any problems.

    I didn’t really answer your original question.

    When I was using NUCs I was running Linux Mint, which uses Cinnamon as the desktop environment by default. Originally I changed it to some really minimal window manager like twm, but at some point it became practical to not use one at all and just run Kodi directly on X ( roughly like the snippet at the end of this comment ).

    If I were going back to a Linux frontend I’d probably evaluate LibreELEC, as it has a lot of the sharp edges sorted out.
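
    For reference, running Kodi with no window manager was just a minimal X session that execs Kodi. Something roughly like this, from memory ( package and script names vary a bit by distro, so treat it as a sketch ):

    # ~/.xinitrc - start Kodi as the only X client, no window manager at all
    exec kodi-standalone

    Then a startx from the console ( or an autologin that runs it ) brings it straight up.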


  • I used to run Kodi on Linux on Intel NUCs connected to all our TVs a while ago. I don’t remember it being particularly unreliable. The issue that made me change that setup was hardware decoding support in 4K for newer codecs.

    What I’ve had doing that frontend function ( Kodi, Jellyfin, Disney Plus, Netflix etc ) for the last few years is three Nvidia Shield TV Pros, which have been absolutely awesome. They are an old product now and I suspect Nvidia are too busy making money to work on a newer generation of them.

    The biggest surprise improvement was how easy it was to configure their remotes to send power on / off and volume up / down IR codes to the TV or AV amp they were connected to, so you only need a single remote.

    Separating the backend function out from the frontend in the lounge has drastically reduced the broken mess that happens around OS upgrades.


  • I just read the update to the post saying that the issue has been narrowed down to the NTFS driver. I haven’t used NTFS on Linux since the NTFS FUSE driver was brand new and still wonky as hell, something like 15 years ago, so I don’t know much about it.

    However, it sounds like the in-kernel driver was still pretty fresh in 5.15, so doing as you have suggested and trying out a 6.5 kernel instead is a pretty good call.


  • If you haven’t already, try running hdparm on your drives to get an idea of whether they are at least doing large raw reads straight off the disk at a reasonable speed.

    This is output from the little NUC I’m using right now:

    # lsblk
    NAME   MAJ:MIN RM   SIZE RO TYPE MOUNTPOINTS
    sda      8:0    0 465.8G  0 disk 
    ├─sda1   8:1    0   512M  0 part /boot/efi
    ├─sda2   8:2    0 464.3G  0 part /
    └─sda3   8:3    0   976M  0 part [SWAP]
    
    # hdparm -i /dev/sda
    
    /dev/sda:
    
     Model=Samsung SSD 860 EVO 500GB, FwRev=RVT02B6Q, SerialNo=S3YANB0KB24583B
    ...
    
    # hdparm -t /dev/sda
    
    /dev/sda:
     Timing buffered disk reads: 1526 MB in  3.00 seconds = 508.21 MB/sec
    
    

    If your results are really poor for this test then it points more at the drive / cable / controller / Linux controller driver.

    If the results are okay, then the issue is probably something more like a logical partitioning / filesystem driver issue.

    I’m not sure what a good current benchmark for Linux that also exercises the filesystem layer would be, other than bonnie++, which has been around forever. Someone else might have a more up to date suggestion.
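
    If you do reach for bonnie++, the invocation I remember is something along these lines ( the mount point and size are placeholders; the file size wants to be roughly double your RAM so the page cache doesn’t hide the result ):

    # point it at a directory on the filesystem under test, run as a normal user
    bonnie++ -d /mnt/test -s 32g -n 0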


  • It might help for the folks here to know which brand and model of SSDs you have, what sort of SATA controllers the SATA ones are plugged into, and what sort of CPU and motherboard the NVMe one is connected to ( the commands at the end of this comment will dig most of that out ).

    What I can say is that Ubuntu 22.04 doesn’t have some mystery problem with SSDs. I work in a place where we have on the order of 100 Ubuntu 22.04 installs running with SSDs, all either older Intel ones or newer Samsung ones. They go great.
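
    Something like this will pull out the hardware details ( smartctl is in the smartmontools package; adjust the device names to suit ):

    # drive model, size and how each drive is attached ( sata / nvme )
    lsblk -o NAME,MODEL,SIZE,TRAN,ROTA
    # the SATA / NVMe controllers the drives hang off
    lspci | grep -iE 'sata|nvme'
    # per-drive identity and firmware details
    smartctl -i /dev/sda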



  • Appreciate the reply. Which desktop environment are you using?

    My only experience with Wayland is also with KDE, whereas for the 27-ish years before that I used all sorts of stuff with X.

    I’ve scripted the machine that drives the frontend for our video surveillance system to place windows exactly where I want them when it comes up.

    I use a couple of dbus triggers that make the TV on the wall in my garage go to sleep from the shell, though I may not have tested them via ssh. They were pretty well the functional equivalent of some xset dpms commands that I used to use; not sure if that is what you were meaning. I think I also had something working that disabled the output altogether, but that was pretty clunky as it used some sort of screen ID that would occasionally change. Sorry I’m hazy on the details, I’m old, but I’ve jotted down what I can remember below.

    I’ll try it all out when I get home, I’ve got to find some old serial crap for a coworker in the garage anyway.
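
    From memory the bits I use look something like this ( the kglobalaccel shortcut name can differ between Plasma versions, and the output name is whatever kscreen-doctor -o reports, so treat it as a sketch ):

    # blank / sleep the displays from a shell on KDE ( Powerdevil's shortcut )
    qdbus org.kde.kglobalaccel /component/org_kde_powerdevil invokeShortcut "Turn Off Screen"
    # the old X equivalent I used before
    xset dpms force off
    # disabling an output entirely under Plasma Wayland
    kscreen-doctor output.HDMI-A-1.disable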


  • Which workflows? Asking because I’d like to experiment with some edge case stuff.

    I’m running KDE with Wayland on multiple machines of different vintages with AMD and Intel graphics, and it would take a lot for me to go back to the depressing old mess that was X.

    The biggest improvement in recent times was absolutely pulling out all my Nvidia cards and putting in second-hand Radeon cards, but switching to Wayland fixed all the dumb interactions between VRR ( and HDR ) capable monitors of mixed refresh rates.

    Even the little NUC that drives the three 4K TVs for the security cameras at work is a little happier with Wayland, running for weeks now with hardware decoding, rather than X crashing pretty well every few days.



  • I have an A1502 MacBook that I have been using for work since it was new in 2014. It triple boots Windows, Linux and OS X, but I only really use Linux.

    Mine has the same CPU, an i5-4308U, but 16GB of memory; I think it was a custom order at the time.

    If I recall correctly I did the regular Boot Camp process you would do to install Windows, installed Windows on a subset of the free space, and put Linux on the rest.

    I’ve got Linux Mint 21 on it currently, but I have had vanilla Ubuntu on it at different times. I can’t think of anything on it that doesn’t just work, offhand.


  • deadbeef@lemmy.nz to Linux@lemmy.ml · I tried, I really did

    If you go back a bit further, multi-monitor support was just fine. Our office in about 2002 was full of folks running dual ( 19 inch tube! ) monitors off Matrox G400s with Xinerama on Red Hat 6.2 ( might have been 7.0 ). I can’t recall that being much trouble at all.

    There were even a bunch of good years of the proprietary Nvidia drivers; the poor quality is something that I’ve only really noticed in the last three or so years.


  • deadbeef@lemmy.nz to Linux@lemmy.ml · I tried, I really did

    The support in KDE for larger numbers of monitors, mixed resolutions and odd layouts vastly improved in the Ubuntu 23.04 release. I wouldn’t install anything other than the latest LTS release for a server ( and generally a desktop ), but KDE was so much better that it was worth running something newer with short-term support on my desktops.

    We aren’t too far off the next LTS that will include that work anyway, I guess. I’m probably going to be making the move to Debian rather than trying that one out, though.


  • I have two AMD Radeon cards for Linux that I’m pretty happy with, which replaced a couple of Nvidia cards. They are an RX6800 and an RX6700XT. They were both ex-mining cards that I bought when the miners were dumping their Ethereum rigs, so they were pretty cheap.

    If I had to buy a new card to fill that gap, I’d probably get a 7800XT, but if you don’t game on them you could get a much lower-end model like an RX7600.


  • Sorry to hear about that mess.

    I posted here https://lemmy.nz/comment/1784981 a while back about what I went through with the Nvidia driver on Linux.

    From what I can tell, people who think Linux works fine on Nvidia probably only have one monitor or maybe two that happen to be the same model ( with unique EDID serials FWIW ). My experience with a whole bunch of mixed monitors / refresh rates was absolutely awful.

    If you happen to give it another go, get yourself an AMD card. Perhaps you can carry on using the Nvidia card for the language modelling, just don’t plug your monitors into it.


  • I’ve been using Linux for something like 27 years, I wouldn’t say evangelical or particularly obsessed.

    I started using it because some of the guys showing up to my late 90’s LAN parties were dual booting Slackware, and it had cool looking boot up messages compared to DOS or Windows at the time. The whole idea of dual booting operating systems was pretty damn wild to me back then too.

    After a while it became obvious to me that Slackware '96 was way more reliable than DOS or Windows 95 at the time. A web browser like Netscape could take out the whole system pretty easily on Windows, but when Netscape crashed on Linux you opened up a shell, killed off whatever was left of it, and started a new one.

    I had machines that stayed up for years in the late 90’s and that was pretty well impossible on Windows.



  • I’ve been running Linux for 100% of my productive work since about 1995. I used to compile every kernel release and run it for the hell of it from about 1998 until something like 2002, and worked for a company that sold and supported Linux servers as firewalls, file servers and so on.

    I had used ET4000s, S3 968s and Trio64s, the original i740, Matrox G400s with dual CRT monitors, and tons of different Nvidia GPUs over the years, and hadn’t had a whole lot of trouble.

    The Nvidia Linux driver made me despair for desktop Linux for the last few years. Not enough to actually run anything different, but it did seem like things were on a downward slide.

    I had weird flashing of sections of other windows when dragging a window around. Individual screens that would just start flashing sometimes. Chunky slideshow window dragging when playing video on another screen. Screens re-arranging themselves in baffling orientations after the machine came back from the screen being locked. I had crap with the animation rate running at 60Hz on three 170Hz monitors because I also had a TV connected to display network graphs ( that update once a minute ). I must have set the panels up again on Cinnamon, or later on KDE, a hundred times because they would move to another monitor, sometimes underneath a different one, or just disappear altogether when I unlocked the screen. My desktop environment at home would sometimes just freeze up if the screen was DPMS blanked for more than a couple of hours, requiring me to log in from another machine and restart X. I had two different 6GB 1060’s and a 1080 Ti in different machines that would all have different combinations of these issues.

    I fixed maybe half of the issues that I had. Loaded custom EDIDs on specific monitors to avoid KDE swapping them around ( roughly the approach sketched at the end of this comment ), did wacky stuff with environment variables to change the sync behaviour, and used a totally different machine ( a little NUC ) to drive the graphs on the TV on the wall.

    Because I had got bitten pretty hard by the Radeon driver being a piece of trash back in something like 2012, I had the dated opinion that the proprietary Nvidia driver was better than the Radeon driver. It wasn’t until I saw multiple other folks adamant that the current amdgpu driver is pretty good that I bought some ex-mining AMD cards to try out on my desktop machines. I found out that most of the bugs that were driving me nuts were just Nvidia bugs rather than bugs in Xorg or any other Linux component. KDE also did a bunch of awesome work on multi-monitor support, which meant I could stop all the hackery with custom EDIDs.

    A little after that I built a whole new work desktop PC with an AMD GPU ( and CPU FWIW ). It has been great. I’m down from about 15 annoying bugs to none that I can think of offhand running KDE. It all feels pretty fluid and tight now without any real work from a fresh install.
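
    For anyone curious, the custom EDID hackery was along these lines; paths and connector names here are from memory, so treat it as a sketch:

    # grab a known-good EDID from sysfs while the monitor is detected correctly
    cp /sys/class/drm/card0-DP-1/edid /lib/firmware/edid/my-monitor.bin
    # then force it on that connector via the kernel command line
    # ( and rebuild the initramfs so the blob is available at boot )
    drm.edid_firmware=DP-1:edid/my-monitor.bin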