Hiker, software engineer (primarily C++, Java, and Python), Minecraft modder, hunter (of the Hunt Showdown variety), biker, adoptive Akronite, and general doer of assorted things.

  • 0 Posts
  • 44 Comments
Joined 1 year ago
Cake day: August 10th, 2023

  • I agree honestly. I’ve gone through the exact process OP has described several times now, and flew even more before the process started.

    I think the lines are worse than they used to be, and I have not seen the scanners improve anything. The only time things have moved slower for me was recently, when the agent hadn’t done an opt-out in a while and had to remember how to do it. The TSA PreCheck line I was in did not have opt-out signs posted. I just politely asked anyway (and they did comply).

    I had a similar situation happen with customs where the guy was confused by my request and hadn’t done one the old way in a while, but he honored it without argument.



  • Can’t comment on the DOCSIS, I don’t know enough about it to not be making stuff up.

    Regarding WiFi though… The simple answer is: if you’re not having trouble accessing the WiFi in the places you use it, and you’re getting the full speed you’re paying for, there’s not a ton of reason to upgrade the router.

    The exception to this is that most routers only get a few years of security updates like most phones… That can potentially leave your network more vulnerable as the router might not properly block unsolicited traffic from making it to your devices. There’s a solid argument that you should just have your devices secured via their own firewalls though.

    The Google routers are nice for the average Joe because they just kind of work and keep themselves updated (and Google tends to keep the hardware they sell under the Nest name receiving security updates for a VERY long time compared to the competition). Netgear has been my go-to for years, but their update mechanism is … fairly manual in my experience.

    I’ve since moved to having a pfSense box for the firewall and routing side of things and using my old Netgear router in access point mode (I’m much less concerned about this setup).

    I’ll add that I don’t recommend WiFi for gaming… And that most people have more download speed than they really “need.” Files really haven’t gone up in size much (sure, games have) but everything else… meh. Video streaming is more popular, but unless you’ve got a lot of people in your home or you stream at 4K, the difference between 30 Mbps and 1 Gbps really isn’t going to be that noticeable. So like, by all means, if you don’t want to spend money, don’t… you’re probably fine.
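    To put rough numbers on that (the per-stream bitrates here are my ballpark assumptions, roughly in line with streaming services’ published recommendations, not exact figures):

```python
# Rough check: how many simultaneous video streams fit in a given link?
# Per-stream bitrates are ballpark assumptions: ~5 Mbps for 1080p,
# ~25 Mbps for 4K.
HD_MBPS, UHD_MBPS = 5, 25

def streams_that_fit(link_mbps, per_stream_mbps):
    # Integer division: whole streams that fit in the link's capacity.
    return link_mbps // per_stream_mbps

for link in (30, 1000):  # a 30 Mbps plan vs a 1 Gbps plan
    print(f"{link} Mbps: ~{streams_that_fit(link, HD_MBPS)} HD streams, "
          f"~{streams_that_fit(link, UHD_MBPS)} 4K streams")
```

    Even the 30 Mbps plan covers a handful of HD streams; the gap only starts to matter with several 4K viewers at once.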



  • Honestly a huge portion of the problem is asshole drivers that just don’t turn off their brights and their fog lights or that tailgate the vehicle right in front of them while their headlights are mirror level.

    I’ve seen brand new trucks with LEDs that were easy on my eyes, and then I’ve seen the exact same model of truck only via the rearview mirror after I passed it, because the lights were beyond blinding.

    They need to enforce maximum luminosity laws with an iron fist; it’s ridiculous that people get away with this stuff.



  • So, the web uses a system called a chain of trust. There are public keys stored in your system or browser that are used to validate the public keys given to you by various websites.

    Both Let’s Encrypt and traditional SSL providers work because their root certificates are already on your system in the appropriate place, which is what makes the certificates they sign trustworthy.

    All that to say, you’re always trusting a certificate authority on some level unless you’re doing self-signed certificates… And then nobody trusts you.

    The main advantage of a paid certificate authority is a bit more flexibility and a fancier certificate for your website, one that can also include the business name.

    Realistically… There’s not much of a benefit for the average website or even small business.
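    To make the “keys stored in your system” part concrete, here’s a minimal Python sketch that loads the same OS trust store a TLS client would use (the exact count depends on your operating system and installed CA bundle):

```python
import ssl

# Build a client context the way most TLS clients do: it loads the
# operating system's trusted root CAs, which anchor the chain of trust.
ctx = ssl.create_default_context()
roots = ctx.get_ca_certs()
print(f"{len(roots)} trusted root CAs loaded")

# A server's certificate is only accepted if it chains (possibly via
# intermediates) up to one of these roots.
```

    Let’s Encrypt’s root (ISRG Root X1) sits in that same store next to the traditional paid CAs, which is why certificates from either validate identically.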



    So the local machine doesn’t really need the firewall; it definitely doesn’t hurt, but your router should be covering this via port forwarding (IPv4) or just straight-up firewall rules (IPv6).

    You can basically go two routes to reasonably harden the system IMO. You can either just set up a user without administrative privileges and use something like a systemd system-level service to start the server as that user and provide control over it from other users … OR … if you’re really paranoid, use a virtual machine and forward the port from the host machine into the VM.

    A lot of what you’re doing is … fine stuff to do, but it’s not really going to help much (e.g. building system packages with hardening flags is good, but it only helps if those packages are actually part of the attack surface, i.e. exposed to remote users in some way).

    Your biggest risk is going to be plugins that aren’t vetted doing bad things (and really only the VM or using the dedicated user account provides an insulation layer there – the VM really only adds protection against privilege escalation which is pretty hard to pull off on a patched system).

    My advice for most people:

    • Make a new user on the system to run each game you want to run
    • Run the game using systemd and that user
    • Use something like kopia + the root user’s crontab (easier than systemd timers, but systemd timers also work) to back up the files on disk
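    A minimal unit for the systemd-plus-dedicated-user setup might look like this (service name, user, paths, and Java flags are all placeholders for your own install):

```ini
# /etc/systemd/system/minecraft.service (illustrative; adjust for your setup)
[Unit]
Description=Minecraft server
After=network.target

[Service]
User=minecraft
WorkingDirectory=/srv/minecraft
ExecStart=/usr/bin/java -Xmx4G -jar server.jar nogui
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

    Then `systemctl start minecraft` works from any admin account while the game itself never runs with elevated privileges.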

    For Minecraft in particular, to properly back things up on a busy server you need to disable auto save, manually force save, do the backup and then enable auto save again after your backup. Kopia can issue commands to talk to the server to do that, but you need a plugin that can react to those commands running on the server (or possibly to use the server console via stdin). Realistically though, that’s overkill and you’ll be just fine backing up the files exactly as they are periodically.

    Kopia in particular will do well here because of its deduplication of backed-up data + a chunking algorithm that breaks up files. That has saved me a crazy amount of storage vs other solutions I’ve tried. Kopia-level compression isn’t needed because the Minecraft region files themselves are already highly compressed.
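    To see why deduplication saves so much on world files, here’s a toy fixed-size-chunk version of the idea (Kopia actually uses content-defined chunking and splitting, which handles shifted data far better; this is just the gist):

```python
import hashlib

def dedup_store(data: bytes, chunk_size: int = 4) -> dict:
    """Store each distinct chunk only once, keyed by its hash."""
    store = {}
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        store[hashlib.sha256(chunk).hexdigest()] = chunk
    return store

# Four chunk slots, but only two distinct chunks actually get stored.
data = b"ABCDABCDABCDWXYZ"
print(len(dedup_store(data)))  # -> 2
```

    Region files that barely change between snapshots mostly hash to chunks the repository already has, so each new snapshot costs almost nothing.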



  • They do have versioning: https://docs.syncthing.net/v1.27.7/users/versioning

    Of course, you actually have to use that, it has to work, and you have to have a strategy for reverting the state (I don’t know if they have an easy way to do that – I’ve never used the versioned side of things).

    I have had some situations where Syncthing seems to get confused and doesn’t do its job right. I ran into this particularly with trying to sync RuneLite configurations and music. There were a few times I had to “force push” … and I vaguely recall one time where I was fighting gigs of “out of sync” in both directions on something and just destroyed the sync and rebuilt it to stop … whatever it was doing.

    Don’t get me wrong, it’s a great tool for syncing things between computers; but I would not rely on it for backup (I prefer having a backup solution on top of the synced directories). There are real backup tools out there that are far better suited to this sort of thing. With Kopia, which I suggested, you should get some integrity checking via its builtin sync (it won’t be able to figure out what to sync if your origin is corrupted); you won’t get that with a straight-up rsync or Syncthing, because they’re not application-aware enough to know they’re about to screw you over.

    Restic has a similar feature, but I’ve always found Restic’s approach much more frustrating and not at all friendly for anyone less than a veteran in systems administration. Kopia keeps configuration in the repository itself, has a GUI for desktop use that runs jobs for you automatically, automatically uses the secrets manager appropriate for your operating system, etc. … With Restic you kind of have to DIY a lot of basic things, and the “quick start tutorial” just kinda ignores these various concerns.

    Even if you plan to just use cron jobs, Kopia will do sane things with maintenance. With Restic, last I checked, you still need to manually run maintenance tasks, and if any job (maintenance or otherwise) fails, you need to make sure to unlock the repository (and if you haven’t set up notifications … well, now you’ve got a silent backup failure and your backups aren’t running).

    I just kept running into a sea of “oh this could be bad” footguns with Restic that made me uncomfortable trusting it as my primary backup. I’m sure Restic can be a great tool in expert hands with everything appropriately set up; but nobody tells you how to do that … and I get the feeling a lot of people are unaware of what they’re getting into.

    The folks making Kopia … they seem like they really know what they’re doing, and I’ve been very happy with it. We’re moving from rsnapshot to Kopia at work now as well (rsnapshot is also fairly good if you’ve got a bunch of friends with NASes that support hard links and SSH, but it’s CHATTY and has no deduplication or encryption, and data integrity verification is basically left to the file system – so you’d better be running ZFS – etc).

    Duplicati’s developer is back too, so that might be something to keep an eye on … but as it stands, the project has been bit rotting for a while and AFAIK still has some pretty significant performance issues when restoring data.


  • You could use kopia for this (but you would need to schedule cron jobs or something similar to do it).

    The way this works with kopia… You configure your backups to a particular location, then in-between runs there’s a sync command you can use to copy the backup repository to other locations.

    Kopia also has the ability to check a repository for bad blobs via its verify function (so you can make sure the backups stored are actually at least X% viable).

    Using ZeroTier or Tailscale (for this, probably Tailscale because of the multithreading) would let you create a virtual network between the devices that lets them talk directly to each other. That would allow you to use kopia’s sync functionality with devices in their homes.
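    For reference, the two Kopia features I’m describing look roughly like this on the command line (the path and percentage are placeholders; check `kopia --help` on your version for the exact flags):

```
# Copy the connected repository to a second location between backup runs:
kopia repository sync-to filesystem --path /mnt/offsite-copy

# Spot-check that a percentage of backed-up file data is actually readable:
kopia snapshot verify --verify-files-percent=10
```

    Over a Tailscale network, that sync target could just as well be a path or SFTP destination on a friend’s machine.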