I’ve been playing with the largest models I can get running and have been using Librewolf or Firefox, but these use several gigabytes of system memory. What options exist with less overhead? I’m mostly trying to maximize the memory left over for model training while I learn. The obvious answer is Python in a terminal, but I need a hiking trail, not free solo rock climbing.

  • Bloody Harry@feddit.de · 11 months ago

    Not what you’re asking for, but how about putting the web browser and the page rendering on a different machine? This way your main machine can focus on calculating.

    Edit: If the pages are super simple, there are “web browsers” that run on the command line (lynx and w3m, for example) and can render simple pages in a very crude way; a rough Python take on the same idea is sketched below.
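    For illustration, here is a minimal sketch of that kind of crude page-to-text rendering in Python. It assumes the third-party `requests` and `html2text` packages, which are my choice here, not something from the thread:

    ```python
    # Crude terminal "rendering": fetch a page and dump it as plain text.
    # Assumes `pip install requests html2text` (package choice is mine).
    import requests
    import html2text

    resp = requests.get("https://example.com")  # placeholder URL
    converter = html2text.HTML2Text()
    converter.ignore_links = False  # keep links as markdown-style references
    print(converter.handle(resp.text))
    ```

    This obviously only works for mostly static pages; anything JavaScript-heavy still needs a real browser.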

    • railsdev@programming.dev · 11 months ago

      This is kind of what I was thinking. I have a $5/mo VPS running Selenoid. All it does is take incoming requests to simulate a real user clicking on some stuff.

      Basically I run a website in Ruby on Rails that has to talk to some APIs. Unfortunately the industry that app works in is way behind on tech, so I make do with simulating a user visiting some portals in lieu of actual API calls. It’s great because the resource-constrained containers don’t have to power up an entire web browser for their background jobs, though running these as long-running jobs presents other issues. A rough sketch of the setup is below.
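      For anyone curious, this is roughly what driving a remote Selenoid box looks like from Python with Selenium’s Remote WebDriver (my app is actually Rails; the hostname and portal URL here are made up, and I’m assuming Selenoid’s default endpoint):

      ```python
      # Drive a browser running on a remote Selenoid VPS instead of locally.
      # Hostname and portal URL are hypothetical; Selenoid listens on
      # :4444/wd/hub by default (an assumption about this setup).
      from selenium import webdriver
      from selenium.webdriver.chrome.options import Options
      from selenium.webdriver.common.by import By

      options = Options()  # Selenoid picks a browser image from these capabilities
      driver = webdriver.Remote(
          command_executor="http://my-vps.example.com:4444/wd/hub",
          options=options,
      )
      try:
          driver.get("https://portal.example.com/login")  # hypothetical portal
          driver.find_element(By.NAME, "username").send_keys("user")
          driver.find_element(By.NAME, "password").send_keys("secret")
          driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()
      finally:
          driver.quit()  # always release the remote browser session
      ```

      The nice part is that the container running this only needs the Selenium client library; the heavy browser process lives on the VPS.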