• reev@sh.itjust.works · 4 months ago

    The point of software like DLSS is to let games run better on computers with worse specs than what you’d normally need to run a game at that quality. There’s plenty of AI tech that can actually improve experiences, and saying that Skyrim graphics are the absolute max we as humanity “need” or “should want” is a weird take ¯\_(ツ)_/¯
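
    For context, a rough sketch of what an upscaler like DLSS buys you: the game renders at a reduced internal resolution and the upscaler reconstructs the target output, so shading cost drops roughly with the square of the scale factor. The ratios below are the commonly cited quality-mode values, used here for illustration rather than taken from NVIDIA’s SDK:

    ```cpp
    #include <cstdio>

    int main() {
        const int outW = 3840, outH = 2160;  // target output: 4K
        struct Mode { const char* name; double scale; };
        const Mode modes[] = {
            {"Quality",     0.667},
            {"Balanced",    0.580},
            {"Performance", 0.500},
        };
        for (const Mode& m : modes) {
            int w = static_cast<int>(outW * m.scale);
            int h = static_cast<int>(outH * m.scale);
            // Shading cost scales with pixel count, i.e. with scale^2.
            std::printf("%-12s renders %dx%d (~%.0f%% of native shading work)\n",
                        m.name, w, h, m.scale * m.scale * 100.0);
        }
        return 0;
    }
    ```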

    • warm@kbin.earth · 4 months ago

      The quality of games has dropped a lot; they make them fast, and as long as a game can just about reach 60fps at 720p, they release it. Hardware is insane these days, the games mostly look the same as they did 10 years ago (Skyrim never looked amazing for 2011; BF3, Crysis 2, Forza, Arkham City etc. came out then too), but their performance has dropped significantly.

      I don’t want DLSS, and I refuse to buy a game that relies on upscaling to have any meaningful performance. Everything should be over 120fps at this point, way over. But people accept the shit and buy the games anyway, so nothing is going to change.
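
      To put numbers on that demand, the frame-time budget shrinks fast as the FPS target rises; the whole frame (game logic plus rendering) has to fit inside it:

      ```cpp
      #include <cstdio>

      int main() {
          // Time budget per frame for common FPS targets.
          const int targets[] = {30, 60, 120, 240};
          for (int fps : targets)
              std::printf("%3d fps -> %5.2f ms per frame\n", fps, 1000.0 / fps);
          return 0;
      }
      ```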

      The point is, we would rather have games that look like Skyrim with great performance than ‘4K RTX real-time raytracing ultra AI realistic graphics wow!’ at 60fps.

      • NekuSoul@lemmy.nekusoul.de · 4 months ago

        The quality of games has dropped a lot; they make them fast

        Isn’t the public opinion that games take way too long to make nowadays? They certainly don’t make them fast anymore.

        As for the rest, I also can’t really agree. IMO, graphics have taken a huge jump in recent years, even outside of RT. Lighting, texture quality, shaders, as well as object density and variety have all gotten a noticeable bump. Other than the occasional dud and the awful shader-compilation stutter that has plagued many PC games over the last few years (though it’s getting more awareness now), I’d argue that performance is pretty good for most games right now.
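
        For context on that stutter: it happens when a shader/pipeline variant is compiled the first time it’s needed, in the middle of a frame. A hypothetical sketch (names invented for illustration) of the usual mitigation - pre-warming the cache during a load screen instead of paying the cost on first draw:

        ```cpp
        #include <string>
        #include <unordered_map>
        #include <vector>

        struct Pipeline { /* compiled GPU state would live here */ };

        std::unordered_map<std::string, Pipeline> cache;

        Pipeline compile(const std::string& variant) {
            // Real compilation can take tens of milliseconds - several whole
            // frames at 120fps - hence the visible hitch when done mid-game.
            return Pipeline{};
        }

        Pipeline& getPipeline(const std::string& variant) {
            auto it = cache.find(variant);
            if (it == cache.end())  // first use: this is where the stutter hits...
                it = cache.emplace(variant, compile(variant)).first;
            return it->second;
        }

        void prewarmDuringLoadScreen(const std::vector<std::string>& variants) {
            for (const auto& v : variants)
                getPipeline(v);  // ...unless every variant was compiled up front
        }
        ```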

        That’s why I see techniques like DLSS/FSR/XeSS/TSR not as a crutch, but just as one of the dozen other rendering shortcuts game engines have accumulated over the years. That said, it’s not often we see a new technique deliver such a big performance boost while having almost no visual impact.

        Also, who decided that ‘we’ would rather have games looking like Skyrim? While I do like high FPS very much, I also like shiny graphics with all the bells and whistles. A game like ‘The Talos Principle 2’, for example, hammers the GPU quite a bit on its highest settings, but it certainly delivers in the graphics department. So much so that I’ve probably spent as much time admiring the highly detailed environments as I did actually solving the puzzles.

        • warm@kbin.earth · 4 months ago

          Isn’t the public opinion that games take way too long to make nowadays? They certainly don’t make them fast anymore.

          I think the problem here is that they announce them way too early, so people are waiting like 2-3 years for it. It’s better if they are developed behind the scenes and ‘surprise’ announced a few months prior to launch.

          Graphics have advanced, of course, but it’s become diminishing returns, and now a lot of games resort to spamming post-processing effects and cramming in as much foliage and fog as possible to try and make the game look better. I always bring Destiny 2 up in this conversation, because the game looks great, runs great and the graphical fidelity is amazing - no blur but no rough edges. Compare that to almost any UE game with terrible TAA: if you disable it, everything is jagged and aliased.
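
          For reference, that blur/jaggies trade-off comes from TAA’s core mechanic: each frame’s jittered sample is blended into a reprojected history buffer. A minimal sketch of just that blend (reprojection and neighborhood clamping omitted, alpha chosen for illustration):

          ```cpp
          struct Color { float r, g, b; };

          // Blend the current jittered sample into the history buffer. A low
          // alpha keeps more history: smoother edges, but more ghosting/blur
          // on fast motion - exactly the artifact complained about above.
          Color taaResolve(Color history, Color current, float alpha = 0.1f) {
              auto lerp = [](float a, float b, float t) { return a + (b - a) * t; };
              return { lerp(history.r, current.r, alpha),
                       lerp(history.g, current.g, alpha),
                       lerp(history.b, current.b, alpha) };
          }
          ```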

          DLSS etc. are defo a crutch, and they were designed as one (originally for real-time raytracing), hence the better versions requiring new hardware. Games shouldn’t be relying on them and their trade-offs are not worth it if you have average modern hardware, where games should just run well natively.

          It’s not so much that we want specifically Skyrim (maybe that one guy), it’s just an extreme example to put the point across. It’s obviously all subjective; making things shiny attracts people’s eyes during marketing.

          • NekuSoul@lemmy.nekusoul.de · 4 months ago

            I see. That I can mostly agree with. I really don’t like the temporal artifacts that come with TAA either, though it’s not a deal-breaker for me if the game hides it well.

            A few tidbits I’d like to note though:

            they announce them way too early, so people are waiting like 2-3 years for it.

            Agree. It’s kind of insane how early some games are announced. That said, 2-3 years back then was the time it took for a game to get a sequel. Nowadays you often have to wait an entire console cycle for a sequel to come out, instead of getting a trilogy of games during one.

            Games shouldn’t be relying on them and their trade-offs are not worth it

            Which trade-offs are you alluding to? Assuming a halfway decent implementation, DLSS 2+ in particular often yields better image quality than even native resolution, with no visible artifacts, so I turn it on even if my GPU can handle a game just fine, if only to save a few watts.

            • warm@kbin.earth · 4 months ago

              Which trade-offs are you alluding to? Assuming a halfway decent implementation, DLSS 2+ in particular often yields better image quality than even native resolution, with no visible artifacts, so I turn it on even if my GPU can handle a game just fine, if only to save a few watts.

              The trade-offs being the artifacts: while not that noticeable to most, I did try it, and anything in fast motion does suffer. Another is the hardware requirement. I don’t mind it existing, I just don’t think mid-to-high-end setups should ever have to enable it for a good experience (well, what I personally consider a good experience :D).