That’s not a meme, that’s REALITY!

  • leisesprecher@feddit.org · 7 days ago

    I honestly don’t understand why people keep buying these. It’s not like games look that much better compared to cheaper models.

    And what’s really surprising to me: who exactly pays those scalper prices? Who can justify shelling out 1-2000 currency units for a (part of a) toy?

    The entire industry has lost its direction. It’s all about making some artificial numbers go up, and no one even tries to come up with a reason for those numbers.

    • crawancon@lemm.ee · 7 days ago

      I understand your point, but you’d have to admit that A. gaming is competitive, and B. hitting high FPS at 4K+ requires thoroughbred hardware. For those of ample means, buying the latest just means staying ahead of those who can’t.
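
      As a rough illustration of point B (the numbers below are just pixel counts; actual GPU load scales with shading work per pixel, so treat this as a lower bound, not a benchmark):

      ```python
      # Raw pixels per second a GPU must produce at each target.
      targets = [
          ("1080p @ 60",  1920, 1080, 60),
          ("1440p @ 144", 2560, 1440, 144),
          ("4K @ 144",    3840, 2160, 144),
      ]
      for name, w, h, fps in targets:
          print(f"{name}: {w * h * fps / 1e9:.2f} Gpx/s")
      # 4K @ 144 is ~9.6x the pixel throughput of 1080p @ 60.
      ```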

      Games can absolutely look better at the highest resolution, with all textures and options enabled, etc.

      That’s just for overzealous gamers, not touching the coin-mining industry, rendering, or the other capabilities the 5 series has (and will grow over time) over the 4 series.

      • leisesprecher@feddit.org · 7 days ago

        What is competitive about textures? Is eSports an arts contest now? You can’t seriously suggest that having better textures makes a gamer better at playing Counter-Strike.

        4K at high FPS is exactly the kind of useless numbers game I’m talking about. There’s a point past which humans can’t see the difference anyway. And especially in those highly competitive games, the movement is so fast-paced that you can’t see texture quality in the first place.
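
        For a back-of-envelope sense of where that point lies, here is a sketch using the common rule of thumb that 20/20 vision resolves about 1 arcminute (~60 pixels per degree); the monitor size and viewing distance are illustrative assumptions:

        ```python
        import math

        # Angular pixel density of a 16:9 monitor viewed head-on.
        def pixels_per_degree(diag_in, h_px, distance_in):
            width_in = diag_in * 16 / math.hypot(16, 9)   # panel width
            px_per_inch = h_px / width_in
            # inches spanned by one degree of visual angle at the eye
            inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
            return px_per_inch * inches_per_degree

        for name, h_px in [("1440p", 2560), ("4K", 3840)]:
            ppd = pixels_per_degree(27, h_px, 30)  # 27" monitor, 30" away
            print(f"{name}: {ppd:.0f} px/degree (~60 is the 20/20 limit)")
        ```

        Under those assumptions, 4K already sits past the ~60 px/degree that 20/20 vision resolves at a typical desk distance, which is the point being made.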

        Yes, compared in a lab environment the newer card will probably produce a better image, but we’re pretty close to “virgin blood infused speaker wires for audiophiles” territory.

        • Lag@lemmy.world · 7 days ago

          Games and hardware have been passing the ball back and forth, and I think games have the ball right now. It’s only a matter of time until we need more power again, so why slow down? Besides, the people who don’t want realism already benefit from it through cheaper and more efficient parts.

    • Chef_Boyardee@lemm.ee · 7 days ago · edited

      I have a graphics degree that I never use, but I know damn well that graphics advancements are not smoke and mirrors. The endgame has been the same for decades: all of the new graphics features existed in movies and TV for a long time, and now we are finally getting those features to render in real time. It’s so damn amazing.
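
      To put numbers on what “render in real time” demands (the offline figure is a general ballpark for film CGI, not any specific production’s number):

      ```python
      # Frame-time budget implied by each real-time target.
      for fps in (30, 60, 120):
          print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")
      # Offline film renders can spend minutes to hours on a single
      # frame; real-time rendering gets the milliseconds above.
      ```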

      I game on a 65" OLED from about 2 feet away. So yeah, there is a difference when one cranks the settings.

      But then again, I’m apparently crazy because I’m a cloud gamer. But having the equivalent of a $3,000 PC for $16/month on a $700 laptop is a no-brainer. And the claim that the picture and lag are terrible is simply not true.
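
      Taking the comment’s own figures at face value (and assuming the laptop would be bought either way), the break-even arithmetic looks like this:

      ```python
      # Months of cloud gaming bought with the money not spent on a PC.
      pc_cost = 3000      # equivalent gaming PC ($), commenter's figure
      laptop_cost = 700   # laptop used as the client ($), commenter's figure
      monthly_fee = 16    # cloud subscription ($/month), commenter's figure

      months = (pc_cost - laptop_cost) / monthly_fee
      print(f"Break-even after {months:.0f} months (~{months / 12:.0f} years)")
      # -> ~144 months, i.e. several GPU generations of service
      ```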

      With that said, I’m happy that it doesn’t matter to you, because that saves you from needing cutting-edge equipment.

    • snooggums@lemmy.world · 7 days ago

      People with excess money. There doesn’t need to be a lot of them, just enough to justify the scalping.

      Past generations were mostly scalped for bitcoin mining, and this one will probably be scalped for AI enthusiasts. Both have potential revenue streams used to justify the cost.

      The gamers were the smallest part, and probably will be this generation too. There are a few (poorly optimized) games where the cards do improve the experience, but yeah, the last few gens have had diminishing returns even in top-end games. Certainly not enough to justify the massive price increases, in my opinion.

      • leisesprecher@feddit.org · 7 days ago

        “Both have potential revenue streams used to justify the cost.”

        Are you sure about that? AI in any semi-professional environment uses proper AI accelerators with loads of RAM, not overpriced consumer cards. And that ignores the fact that AI has never produced any relevant revenue stream.

        And miners, do they even exist anymore? Bitcoin is ASIC land, and Ethereum switched to proof of stake.

        • KingRandomGuy@lemmy.world · 7 days ago

          I work in CV, and a lot of the labs I’ve worked with use consumer cards for workstations. If you don’t need the full 40+ GB of VRAM, you save a ton of money compared to the datacenter or workstation cards. A 4090 is approximately $1,600, compared to $5,000+ for an equivalently performing L40 (though the 4090 has half the VRAM, obviously). The x090 series cards may be overpriced for gaming, but they’re actually excellent in terms of bang per buck compared to the alternatives for DL tasks.
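
          To make the bang-per-buck point concrete, a minimal sketch using the comment’s figures (prices and VRAM sizes are the commenter’s numbers, not current quotes; the PyTorch call just reports whatever GPU is installed):

          ```python
          import torch

          # Price per GB of VRAM, using the comment's figures.
          cards = {
              "RTX 4090 (consumer)": {"vram_gb": 24, "price_usd": 1600},
              "L40 (workstation)":   {"vram_gb": 48, "price_usd": 5000},
          }
          for name, c in cards.items():
              print(f"{name}: ${c['price_usd'] / c['vram_gb']:.0f}/GB of VRAM")

          # On a real machine: does the installed GPU have enough memory
          # for the workload (weights + activations + optimizer state)?
          if torch.cuda.is_available():
              total = torch.cuda.get_device_properties(0).total_memory
              print(f"Installed GPU: {total / 1024**3:.0f} GB VRAM")
          ```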

          AI has certainly produced revenue streams. Don’t forget that AI is not just generative AI: the computer vision in high-end digital cameras is all deep-learning based, and it gets people to buy the latest cameras, for example.

    • GBU_28@lemm.ee · 7 days ago

      I’m with you, but the meme does say “GPU enthusiast”, which sets the stage that they’re interested in the top of the line for no reason other than personal interest.

    • Hotdog Salesman@programming.dev · 7 days ago

      The target demographic of the 90 tier falls into two groups:

      1. Enthusiasts who just kinda want the power for the power’s sake
      2. Professionals, mostly 3D artists, but video editors and a few others too. High-fidelity renders can take days, so shaving a few hours off can add up, especially if you need to redo a render (see the sketch below)
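
      On the second point, a rough sketch of how the hours add up (the render length and speed-up are illustrative assumptions, not benchmarks of any particular card):

      ```python
      # Time saved on long offline renders by a modestly faster GPU.
      render_hours = 48   # a multi-day high-fidelity render (assumed)
      speedup = 1.25      # ~25% faster card, gen over gen (assumed)
      redos = 3           # times the render must be redone (assumed)

      saved_per_pass = render_hours - render_hours / speedup
      print(f"Saved per pass: {saved_per_pass:.1f} h")
      print(f"Across {redos} redos: {saved_per_pass * redos:.1f} h")
      # -> 9.6 h per pass, 28.8 h across three redos
      ```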