• 1 Post
  • 96 Comments
Joined 1 year ago
Cake day: June 14th, 2023

  • My first couple of computers ran AmigaOS, and even from the start Windows felt like complete garbage in comparison, but eventually I had to buy a PC to keep up with the times. After that I kept looking for alternative OSes and tried dual booting Linux, but I kept going back to Windows since all the programs and hardware I needed required it. When I finally went full-time Linux, some time between 2005 and 2010, it was because I felt like I was just wasting my life in front of the computer every day. With Windows it was too easy to fire up some game when I had nothing else to do, and at that time there were barely any games for Linux, so switching removed that temptation. That has of course changed now, and pretty much all Windows games run equally well on Linux :)

  • I think a 650 W PSU should be enough for a workload of 490 W idle. Please correct me if I am wrong.

    You mean 490W under load, right? One would hope that your computer uses less than 100W idle, otherwise it’s going to get toasty in your room :) I would say this depends on how much cheaper the 650W PSU is, and how likely it is that you’ll upgrade your GPU. It really sucks saving up for a ridiculously expensive new GPU and then realizing you also need to fork out an additional €150 to replace your fully functional PSU. On the other hand, going from 650W to 850W might double the cost of the PSU, and that would be a waste of money if you never buy a high-end GPU in the future. For PSU recommendations, check out https://cultists.network/140/psu-tier-list/. If you’re buying a decent quality unit I wouldn’t worry about efficiency loss from running at a lower percentage of its rated maximum wattage; I doubt it would be noticeable on your power bill. A rough way to sanity-check the sizing is sketched below.

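    A minimal back-of-envelope sketch in Python, in case it helps with sizing. All the wattages here are hypothetical example values, not measurements; substitute the figures from your own parts’ spec sheets, and keep in mind that modern GPUs can transiently spike well above their rated board power, which is one reason to leave some headroom.

        # Rough PSU headroom estimate. All wattages are example values;
        # substitute the numbers from your own components' spec sheets.
        gpu_tdp = 320   # hypothetical GPU board power (W)
        cpu_tdp = 105   # CPU package power (W)
        rest = 75       # motherboard, RAM, drives, fans (W)

        peak_draw = gpu_tdp + cpu_tdp + rest
        for psu_watts in (650, 850):
            headroom = psu_watts - peak_draw
            print(f"{psu_watts} W PSU: estimated peak draw {peak_draw} W, "
                  f"headroom {headroom} W ({headroom / psu_watts:.0%})")
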
    I’ve always had Nvidia GPUs and they’ve worked great for me, though I’ve stayed with X11 and never bothered with Wayland. If you’re conscious about power usage, many cards can be power limited and then overclocked to compensate (see the sketch below). For example, I could limit my old RTX3080 to 200W (it draws up to 350W with stock settings), and with some clock speed adjustments I would only lose about 10% fps in games, which isn’t really noticeable if you’re still hitting 120+ fps. My current RTX3090 can’t go below 300W (stock is 370W) without significant performance loss though.

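    The power limiting itself is just a couple of nvidia-smi calls; here’s a minimal sketch that wraps them in Python. It needs root, and the 200W target is just my example figure, so check your own card’s supported range with nvidia-smi -q -d POWER first. The clock offset tuning I mentioned is separate (e.g. via nvidia-settings under X11) and not shown here.

        # Sketch: set an Nvidia GPU power limit by shelling out to nvidia-smi.
        import subprocess

        subprocess.run(["nvidia-smi", "-pm", "1"], check=True)    # persistence mode on
        subprocess.run(["nvidia-smi", "-pl", "200"], check=True)  # power limit: 200 W

        # Read back the current draw and limit to verify.
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw,power.limit",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
        print(out.stdout.strip())
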
    If you have any interest in running AI stuff, especially LLMs (text generation / chat), then get as much VRAM as you possibly can. Unfortunately I discovered local LLMs just after buying the 3080, which was great for games, and then realized that 12GB of VRAM is not that much (the sketch below shows why). CUDA (i.e. Nvidia GPUs) is still dominant in AI, but ROCm (AMD) is getting more support, so you might be able to run some things at least.

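    The weights alone take roughly (parameters × bits per weight ÷ 8) bytes, before the KV cache and runtime overhead. A rough sketch; the 20% overhead factor is my own guess, not a measured figure:

        # Back-of-envelope VRAM estimate for LLM weights. Actual usage also
        # grows with context length (KV cache), so the 1.2 factor is a guess.
        def vram_gb(params_billion, bits_per_weight, overhead=1.2):
            weights_gb = params_billion * bits_per_weight / 8
            return weights_gb * overhead

        for params in (7, 13, 34, 70):
            for bits in (16, 8, 4):
                print(f"{params}B @ {bits}-bit: ~{vram_gb(params, bits):.1f} GB")
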
    Another mistake I made when speccing my PC was to buy 2×16GB of RAM. It sounded like a lot at the time, but once again, when dealing with LLMs there are models larger than 32GB that I would like to run with partial offloading (splitting the work between GPU and CPU, though usually quite slow; see the sketch below). Turns out that DDR5 is quite unstable with all slots populated, and I don’t know if it’s my motherboard or the Ryzen CPU that’s to blame, but I can’t just add 2 more sticks. I.e. there are 4 slots, but with all of them filled it would run at 3800MHz instead of the 6200MHz that the individual sticks are rated for. I don’t know if Intel mobos can run 4x DDR5 sticks at full speed.

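    For reference, this is roughly what partial offloading looks like with llama-cpp-python. A minimal sketch assuming a CUDA build of the library and a quantized GGUF model already on disk; the file path and layer count are placeholders, not recommendations:

        # Partial offloading: put n_gpu_layers layers in VRAM, run the rest
        # on the CPU from system RAM (pip install llama-cpp-python).
        from llama_cpp import Llama

        llm = Llama(
            model_path="./models/example.Q4_K_M.gguf",  # hypothetical file
            n_gpu_layers=30,  # as many layers as fit in your VRAM
            n_ctx=4096,       # context window
        )
        out = llm("Q: Why is partial offloading slow?\nA:", max_tokens=64)
        print(out["choices"][0]["text"])
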
    And a piece of general advice, in case this isn’t common knowledge at this point: be wary when trying to find buying advice using search engines. Most of the time they’ll only give you low-quality “reviews” written solely to convince readers to click on affiliate links :( There are still a few sites that actually test the components rather than just AI-generating articles. Personally I look for tier lists compiled by users (like this one for mobos), and when it comes to reviews I tend to trust those that get very technical, with component analyses, measurements and multiple benchmarks.


  • It’s not that bad. Of course I’ve had a few games that didn’t work, like CoD:MW2, but nearly all the multiplayer games my friends play also work on Linux. The last couple of years we’ve been playing Apex Legends, Overwatch, WoWs, Dota 2, Helldivers 2, Diablo 4, BF1, BFV, Hell Let Loose, Payday 3, Darktide, Isonzo, Ready or Not and Hunt: Showdown, to name a few.


  • For LLMs it entirely depends on what size models you want to run and how fast you want them to run. Since there are diminishing returns to increasing model size, i.e. a 14B model isn’t twice as good as a 7B model, the best bang for the buck will be achieved with the smallest model you think has acceptable quality. And if you think generation speeds of around 1 token/second are acceptable, you’ll probably get more value for money using partial offloading (the sketch below shows why speed drops so sharply once a model spills out of VRAM).

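    As a rule of thumb, with batch size 1 each generated token has to stream (roughly) the entire model through memory, so tokens/second is bounded by memory bandwidth divided by model size. A quick sketch; the bandwidth figures are approximate published specs and the model size is an arbitrary example:

        # Why offloading is slow: token rate is roughly bandwidth / model size.
        model_gb = 20  # e.g. a ~34B model at 4-5 bit quantization

        for device, bw_gbs in {
            "RTX3090 VRAM (~936 GB/s)": 936,
            "DDR5 dual channel (~80 GB/s)": 80,
        }.items():
            print(f"{device}: ~{bw_gbs / model_gb:.0f} tokens/s upper bound")
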
    If your answer is “I don’t know what models I want to run”, then a second-hand RTX3090 is probably your best bet. If you want to run larger models, building a rig with multiple (used) RTX3090s is probably still the cheapest way to do it.