Software Engineer, Linux Enthusiast, OpenRGB Developer, and Gamer

Lemmy.world Profile: https://lemmy.world/u/CalcProgrammer1

  • 0 Posts
  • 44 Comments
Joined 3 years ago
Cake day: June 9th, 2021

  • I’m not familiar with KDE’s new feature yet, but if it only supports sysfs LEDs then it won’t control 99% of keyboards. Few RGB keyboards have drivers that expose this interface. Most RGB keyboards are controlled from userspace by their official software on Windows, and that’s also how most Linux projects that control RGB devices, including my OpenRGB project, do it. I wonder if it would be possible to write an OpenRGB plugin/script that exposes a virtual /sys/class/leds/openrgb device that KDE could talk to, then translates that into OpenRGB calls to set the color on all available devices. It doesn’t sound too difficult.
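
    As a rough sketch of that idea: the kernel’s uleds driver (CONFIG_LEDS_USER, /dev/uleds) lets a userspace process register a virtual LED class device that then shows up under /sys/class/leds/. Something like the C below could sit in an OpenRGB plugin; the device name and the forwarding step are placeholders, and a real plugin would call into the OpenRGB SDK instead of printing.

    ```c
    /* Hedged sketch: create a virtual LED via the kernel's uleds driver and
     * watch brightness writes coming from sysfs (e.g. from KDE). Assumes
     * /dev/uleds exists; the name "openrgb::kbd_backlight" is illustrative. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <linux/uleds.h>

    int main(void)
    {
        struct uleds_user_dev dev;
        int fd, brightness;

        memset(&dev, 0, sizeof(dev));
        strncpy(dev.name, "openrgb::kbd_backlight", LED_MAX_NAME_SIZE - 1);
        dev.max_brightness = 255;

        fd = open("/dev/uleds", O_RDWR);
        if (fd < 0 || write(fd, &dev, sizeof(dev)) != sizeof(dev)) {
            perror("uleds");
            return 1;
        }

        /* Each read blocks until something writes a new value to
         * /sys/class/leds/openrgb::kbd_backlight/brightness. */
        while (read(fd, &brightness, sizeof(brightness)) == sizeof(brightness)) {
            printf("brightness=%d\n", brightness);
            /* Hypothetical: forward the value to the OpenRGB SDK here. */
        }
        close(fd);
        return 0;
    }
    ```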



  • Same. I started really using Linux with Ubuntu 6.06 and was drawn in by its “Linux for human beings” goals - the Ubuntu homepage of the era really pushed the ideals of community and openness. Canonical sat in the background paying to send you free CDs in the mail. It was such an idealistic thing back then.

    And then it all changed around 2010. The color scheme shifted to a shitty MacOS lookalike, the human elements were dropped, the logo was reworked, it got bundled with a paid music store, then came the Amazon ads in the search, and it’s been on a downward spiral ever since. I switched to Debian not long after the initial enshittification in the early 2010s and have not looked back, though I moved most of my systems to Arch a few years back because I like life in the fast rolling-release lane and Debian wouldn’t support my new GPUs.



  • I would say we’re beyond the era of PC referencing the classic “x86 IBM Personal Computer compatible” definition. PC could reasonably be considered to include many ARM systems, considering there are now Windows laptops shipping with ARM processors that can run “PC” software. Besides, most new x86 PCs aren’t IBM PC compatible anyway, as legacy BIOS support has been dropped by a lot of UEFI implementations. I would consider any device that runs a desktop-style OS (be it Windows, Linux, or even MacOS) a PC. The distinction in my mind is specifically mobile vs. desktop. Android and iOS are not PCs: they’re primarily touch driven, and apps are restricted to a certain format with a centralized app store where you are expected to get all of your apps. Windows/Linux/MacOS are primarily keyboard and mouse driven and you have a lot more flexibility in acquiring new apps, with their forms of “sideloading” and “rooting/jailbreaking” being normal and accepted rather than workarounds/hacks to break out of a walled garden. I would even go as far as to say a smartphone can be a PC if you run a PC-like OS on it, such as the mobile Linux OSes that let you run desktop applications.


  • Squeekboard is where it’s at. By far my favorite onscreen keyboard for Linux and mainly because you can easily create your own layouts using .yaml files. I’m tired of virtual keyboards that omit keys needed for development and terminal use or shove them off to separate tabs. My custom Squeekboard layout fits my needs exactly and I’m pretty fast at typing on it (typing this on it now). I wish it were usable outside of Phosh, though tbf I haven’t tried. Between GNOME Mobile, KDE Plasma Mobile, and Phosh (Squeekboard), I choose Phosh primarily because of how much I like Squeekboard.


  • Except that in the case of VGA (and DVI, HDMI, and DisplayPort) the i2c interface is intended for use over the cable. All of those ports have a pair of i2c pins and corresponding wires in their cables. The i2c interface is used for DDC/EDID, which is how the computer identifies the capabilities and specifications of the attached display. DDC even provides some rarely-used control functionality, probably the most useful of which is being able to control the brightness of the display from software (a rough sketch of that write is below). I use the ddcci module on Linux and it lets me control my desktop monitor brightness the same way a laptop would, which is great. I have no idea why this isn’t widely used.

    Edit:

    This i2c interface is widely used to control the lighting on modern graphics cards that have RGB lighting. We’ve spent a lot of time reverse engineering these chips and their i2c protocols for OpenRGB. GPU chips usually have more i2c buses than the cards have display connectors, so the RGB chip is wired to one of the unused buses. I think AMD GPUs tend to have 8 separate i2c buses but most cards only use 4 or 5 of them for display connectors. There is also an i2c interface present on RAM slots normally used for reading the SPD chip that stores RAM module specifications, timings, etc. This interface is also used for RAM modules with controllable RGB lighting.
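
    For the curious, the raw DDC/CI transaction behind that brightness control is small enough to show. This is a hedged sketch using the Linux i2c-dev interface; the bus number /dev/i2c-4 and the 50% value are just examples, and in practice ddcutil or the ddcci kernel module do this for you (with retries and read-back omitted here).

    ```c
    /* Hedged sketch: DDC/CI "Set VCP feature" write to set monitor brightness
     * (VCP code 0x10) over i2c-dev. Monitors listen at i2c address 0x37. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <unistd.h>
    #include <linux/i2c-dev.h>

    int main(void)
    {
        int fd = open("/dev/i2c-4", O_RDWR);            /* example bus number   */
        if (fd < 0 || ioctl(fd, I2C_SLAVE, 0x37) < 0) { /* DDC/CI slave address */
            perror("i2c");
            return 1;
        }

        unsigned char vcp = 0x10, value = 50;           /* brightness, usually 0-100 */
        unsigned char msg[] = { 0x51, 0x84, 0x03, vcp, 0x00, value, 0x00 };
        unsigned char chk = 0x6E;                       /* checksum includes dest addr 0x6E */
        for (unsigned int i = 0; i + 1 < sizeof(msg); i++)
            chk ^= msg[i];
        msg[sizeof(msg) - 1] = chk;

        if (write(fd, msg, sizeof(msg)) != sizeof(msg)) {
            perror("write");
            return 1;
        }
        close(fd);
        return 0;
    }
    ```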


  • Yeah, the lack of proper discoverability on i2c truly sucks. You have to just poke random addresses and hope for the best to see if an i2c device exists on the bus. It’s a great standard but I wish it would get updated with some sort of plug-and-play autodetection feature. A standardized device VID/PID system like USB and PCI have would be acceptable, or a standardized register that returns a part string. Anything other than blindly poking registers and hoping you’re not accidentally overvolting the CPU or whatever, because a register on your expected device overlaps with the “overvolt the CPU” register at the same address on a different device.
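
    For reference, “discovery” on i2c today amounts to something like the sketch below, which is roughly what i2cdetect does: walk the address range and see if anything ACKs a read. The bus path is just an example, and even this one-byte read can have side effects on some devices, which is exactly the problem.

    ```c
    /* Hedged sketch: blind-probe an i2c bus via i2c-dev, roughly what
     * i2cdetect does. Tells you an address ACKs, not what the device is. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <unistd.h>
    #include <linux/i2c-dev.h>

    int main(void)
    {
        int fd = open("/dev/i2c-1", O_RDWR);    /* example bus */
        if (fd < 0) {
            perror("open");
            return 1;
        }

        /* Walk the valid 7-bit address range and attempt a 1-byte read at
         * each address. Addresses already claimed by a kernel driver fail
         * the I2C_SLAVE ioctl and get skipped. */
        for (int addr = 0x08; addr < 0x78; addr++) {
            unsigned char byte;
            if (ioctl(fd, I2C_SLAVE, addr) < 0)
                continue;
            if (read(fd, &byte, 1) == 1)
                printf("something ACKed at 0x%02x\n", addr);
        }
        close(fd);
        return 0;
    }
    ```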





  • Most gaming laptops these days don’t do GPU switching anyway. They do render offloading, where the laptop display is permanently connected to the integrated GPU only. When you want to use the discrete GPU to play a game, it renders the game frames into a framebuffer on the discrete GPU and then copies the completed frame over PCIe into a framebuffer on the iGPU, which outputs it to the display. On Linux (Mesa), this feature is known as PRIME. If you have two GPUs and you run DRI_PRIME=1 <command>, it will run the command on the second GPU, at least for OpenGL applications. Vulkan seems to default to the discrete GPU no matter what. My laptop has an AMD iGPU and an NVIDIA dGPU and I’ve been testing the new NVK Mesa driver. Render offloading seems to work as expected. I would assume the AMD Mesa driver would work just as well for render offloading in a dual-AMD situation.
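
    To illustrate the Vulkan side of that: unlike OpenGL, where DRI_PRIME chooses the device for you, a Vulkan application enumerates every physical device itself and typically just prefers the first discrete GPU it finds. A minimal sketch, assuming the Vulkan headers and loader are installed (build with cc gpus.c -lvulkan):

    ```c
    /* Hedged sketch: list Vulkan physical devices and their types, which is
     * why Vulkan apps tend to land on the dGPU regardless of DRI_PRIME. */
    #include <stdio.h>
    #include <vulkan/vulkan.h>

    int main(void)
    {
        VkInstanceCreateInfo ici = { .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
        VkInstance inst;
        if (vkCreateInstance(&ici, NULL, &inst) != VK_SUCCESS)
            return 1;

        uint32_t count = 0;
        vkEnumeratePhysicalDevices(inst, &count, NULL);
        VkPhysicalDevice devs[16];
        if (count > 16)
            count = 16;
        vkEnumeratePhysicalDevices(inst, &count, devs);

        for (uint32_t i = 0; i < count; i++) {
            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(devs[i], &props);
            /* deviceType: 1 = integrated GPU, 2 = discrete GPU */
            printf("%u: %s (type %d)\n", i, props.deviceName, (int)props.deviceType);
        }
        vkDestroyInstance(inst, NULL);
        return 0;
    }
    ```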


  • Same. I put together a knock-off Prusa i3 kit in 2014 that cost me like $600, and most of the parts were themselves 3D printed. Not a bad thing, but they were really rough prints. It printed OK for the time but was an endless source of annoyance. In comparison, the Ender 3 Pro basically just worked out of the box with minor bed leveling tweaks, and everything since has just been minor quality-of-life improvements. It’s great.


  • The AMD radv driver is best for gaming at the moment IMO. If you’re stuck with NVIDIA hardware then yes, the proprietary driver is the best for gaming as the open source driver is quite slow, but the good news is that this is rapidly changing after being stagnant for 5+ years. NVK is the new open source NVIDIA Vulkan driver in Mesa and it just recently left experimental status to be included officially in the next Mesa release. Also, NVIDIA’s GSP firmware changes mean that the open source nouveau kernel driver can finally reclock NVIDIA GPUs to high performance clocks/power states, so it could achieve performance parity with the proprietary driver with enough optimization. On my RTX 3070 laptop it is still significantly slower and some games don’t work yet, but it has none of the flickering or tearing that I experience with the proprietary driver. Unfortunately for GTX 10 series users, those cards do not use GSP firmware and still have no means of reclocking, so they will be stuck using only proprietary drivers for the foreseeable future.


  • I’m happy with my Ender 3 Pro. I paid $200 but you can get it cheaper; there is sometimes a $100 coupon at Micro Center. It’s common enough that replacement parts and upgrades (both printed and purchased) are readily available. It runs open source Marlin firmware and the design itself is also open source. Print quality is good for the price, and I’m OK with it not having a direct drive extruder or dual Z axis; adding auto bed leveling is easy enough with a BLTouch or CRTouch.



  • AOSP and even factory kernel source tend to be only mildly useful for proper Linux phone use. Android phones tend to ship with old kernel revisions that the chip maker forked a long time ago and developed their chip drivers on, without following accepted kernel conventions or submitting any code to the actual kernel maintainers for proper review and integration into the up-to-date “mainline” kernel. Because of this, and the fact that phone makers need to constantly ship new products out the door, the quality of the code added onto the old kernel is often garbage: poorly commented, with no documentation, and usually no git history either.

    There are other teams of people trying to clean up and/or rewrite these drivers from scratch in a way that is reviewable and acceptable in mainline. Only a small handful of the vast number of phone chips have such support, so proper Linux phone support is limited to a small selection of hardware. The designed-for-Linux Librem 5 and PinePhone models intentionally chose old chipsets because those chipsets had good mainline support and thus could receive actual kernel updates rather than being stuck forever on an ancient kernel release from the manufacturer that has long since been abandoned.

    Lately the Qualcomm Snapdragon SDM845 chip has been seeing growing mainline Linux support and is quickly becoming one of the most viable chips for mobile Linux that isn’t a complete dinosaur in terms of performance and power draw. The OnePlus 6 and 6T, which both use the SDM845 chip, have become quite popular as Linux phones now despite not yet having VoLTE and thus being useless for calls. I carry a OnePlus 6T as a secondary non-phone pocket PC because the Linux experience is very good other than the lack of phone and camera functionality. It’s fast and can do all my terminal and coding stuff as well as run full-fledged web browsers well.


  • CalcProgrammer1@lemmy.ml to Linux@lemmy.ml · Librem 5: A Practical Review · 4 months ago

    Nice review. I agree with others here that this phone is a borderline scam for the price and with all the delays people had in receiving them. Performance seems on par with the $200 original PinePhone, which I had a similar experience with.

    The one good thing that came out of Purism/Librem 5 is Phosh. It’s a pretty good phone shell/UI for other, more capable Linux phones to use. I particularly like Phosh for its on-screen keyboard, Squeekboard, which allows for custom keymaps.


  • I’ve had an A770 Limited Edition since its release in late 2022. Overall, I’m happy with it. The drivers were a mess at launch but now everything works as expected. Performance is decent in the games I play, though I have a 144Hz 4K monitor and it’s not really capable of that resolution and refresh rate except in the lightest esports titles, so I use FSR in most games. My most played game is Overwatch, and it hits 144Hz with dynamic resolution scaling on and medium settings. I want to buy a higher end GPU eventually to really push this monitor, but I’m waiting to see what happens with the next generation of Intel and AMD cards (NVIDIA is not even in the running unless NVK suddenly reaches performance parity with the proprietary drivers).


  • GitLab used to be awesome when it was the place to go after MS bought out GitHub. They had premium access for all public projects under a FOSS license and top-tier CI. Then, as time went on, they began pulling support for various functions in a very Microsoftian EEE sort of way: first requiring credit cards for new users to access the CI, then taking away the CI almost entirely except for a practically useless monthly allotment, then taking away the premium access for public FOSS-licensed projects. If I were migrating today I would not choose GitLab, but it is where I settled after leaving GitHub and my projects have grown to depend on GitLab CI, even if I’m now forced to run my own runners due to the extreme nerfs they’ve done to the hosted CI. I mirrored OpenRGB to Codeberg, but since the CI pipelines depend on GitLab I don’t see Codeberg becoming the main hub anytime soon unless they can execute GitLab CI configs. It’s sad to see how far GitLab has fallen; it is unrecognizable from what it used to be as far as support for FOSS projects goes, especially given how GitLab itself started as a FOSS project.