• DarkThoughts@fedia.io · 6 months ago

    Game logic runs independently from what your monitor can display. So it’s really just a question of what effect it has on the player. Maybe for VR there’s an argument to be made, although 1000 Hz still sounds like complete overkill even in that area. But I’m gonna call bullshit on people who claim to be able to tell the difference at such high rates.

    • burgersc12@mander.xyz · 6 months ago

      Game logic does not always run independently of the framerate. Look at Fallout 4: if you run it at more than 60 fps, the dialogue literally overlaps itself.

      • Belgdore@lemm.ee · 6 months ago

        That’s because Bethesda is bad at making games, not because there is an intrinsic need for game logic to be tied to frame rate.
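
        For what it’s worth, the standard fix is a fixed-timestep loop: the simulation steps at a constant rate and rendering runs as fast as the monitor/GPU allows. A minimal Python sketch of the pattern (names are illustrative, not from any particular engine):

        ```python
        # Fixed-timestep loop: game state advances in constant 1/60 s steps
        # no matter how fast the render loop spins.
        import time

        TICK = 1.0 / 60.0              # simulation step: always 60 updates/s

        def update(state, dt):
            state["x"] += 100.0 * dt   # move 100 units/second, fps-independent
            return state

        def render(state):
            pass                       # draw at whatever rate the GPU allows

        state = {"x": 0.0}
        accumulator, prev = 0.0, time.perf_counter()
        while state["x"] < 100.0:      # run for ~1 simulated second
            now = time.perf_counter()
            accumulator += now - prev
            prev = now
            while accumulator >= TICK: # consume elapsed time in fixed steps
                state = update(state, TICK)
                accumulator -= TICK
            render(state)              # render rate is decoupled from logic
        ```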

        • burgersc12@mander.xyz · 6 months ago

          It used to be very common, and even some Switch games today lock the framerate to 30/60 fps, because otherwise the game runs at twice the speed it should.
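
          The failure mode is usually logic that advances a fixed amount per rendered frame, so doubling the framerate doubles the game speed. A toy Python sketch of the bug (purely illustrative):

          ```python
          # Framerate-locked logic: movement is "per frame", so game speed
          # scales with fps instead of with real time.
          def simulate(fps, seconds=1.0):
              x = 0.0
              for _ in range(int(fps * seconds)):
                  x += 1.0          # "move 1 unit per frame" -- the bug
              return x

          print(simulate(30))       # 30 units in one second
          print(simulate(60))       # 60 units in the same second: 2x speed
          ```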

      • DarkThoughts@fedia.io · 6 months ago

        I didn’t say framerate; I said what your monitor can display. FPS and Hz are not synonymous.

          • DarkThoughts@fedia.io · 6 months ago

            You’ve got it the wrong way around. People play at very high FPS on (comparatively) lower-Hz monitors. This has been common practice in competitive PvP shooters for decades.

            • burgersc12@mander.xyz · 6 months ago

              This is my point. A 1000 Hz screen would most likely be driven at as close to 1000 fps as possible. I’m not sure why you think I have it the wrong way around when it’s you who does.

              • DarkThoughts@fedia.io · 6 months ago

                Your point was that game logic doesn’t run independently from your framerate, offered to refute my comment, which said that game logic runs independently from your monitor. You’re clearly confused about the topic at hand.

                • burgersc12@mander.xyz · 6 months ago

                  I wasn’t trying to refute anything. I just gave an example of game logic running slower than the screen, and asked why you wouldn’t try to equalize fps and Hz. How am I confused again?

                  • DarkThoughts@fedia.io · 6 months ago

                    You gave an example of game logic being tied to framerate, which, again, is a completely different matter. And why would you ask me why you wouldn’t equalize them, when you claim that the reason I gave was the point you were making in the first place, even though it’s a completely different kind of example? You make no sense at all.

    • tal@lemmy.today · 6 months ago

      For rendered stuff, it typically does make for smoother motion, even at rates much higher than the eye can see, because of motion blur.

      So, recorded video works fine at relatively low framerates…but the camera is also set up to record a relatively long exposure, something like a thirtieth of a second, and you see the scene averaged over that time. Your brain can see that motion blur and interpret it usefully, to know that motion is happening.

      Rendered 3D game images typically do not work like that. You see a series of perfectly sharp images at instants in time. So your brain doesn’t get the nice smooth motion blur to work with.

      But if your computer renders and displays the intermediate images, your eye can integrate them into that nice smooth blur itself.

      It’s probably possible to compute motion blur more cheaply than by rendering a lot of intermediate frames, getting at least some kind of approximation of true motion blur, and some games do that. But brute-force rendering of more frames is simple for a developer and accurate. Plus, any game that can support a high frame rate gets the benefit, even if it doesn’t have some kind of faux motion-blur approximation.
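
      As a rough illustration of the cheaper approach: instead of rendering extra frames, smear each frame along the motion direction. A toy NumPy sketch assuming a uniform horizontal camera pan (real engines do this per pixel with velocity buffers; this is just the idea, not any game’s actual implementation):

      ```python
      # Post-process motion blur for a uniform horizontal pan: average
      # several copies of the frame shifted along the motion vector.
      import numpy as np

      def motion_blur(frame, velocity_px, taps=8):
          """Average `taps` copies of the frame spread along the motion path."""
          acc = np.zeros_like(frame, dtype=np.float64)
          for i in range(taps):
              shift = int(round(velocity_px * i / taps))
              acc += np.roll(frame, shift, axis=1)   # shift along x
          return acc / taps

      frame = np.zeros((4, 16))
      frame[:, 4] = 1.0                 # a sharp 1-pixel vertical line
      print(motion_blur(frame, velocity_px=4)[0])
      # the line is now smeared along its 4 px motion path
      ```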

      I have a 165Hz monitor. When moving my mouse cursor around, I can definitely see independent images of the cursor.

      EDIT: That being said, you could probably get a pretty good approximation by rendering and combining multiple frames on the card and only pushing a lower frame rate out to the monitor – that is, you only really need beefy rendering hardware, not a fancy monitor or cable, to get pretty close. I suppose that in theory, a compositor could do that. I don’t know if someone’s already done that or not.
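
      Something like the following, as a toy sketch: render K sharp sub-frames per displayed frame and average them before scan-out, so a 60 Hz output frame carries the blur of 60 × K Hz rendering. All names and numbers here are made up for illustration:

      ```python
      # Accumulation idea: average K sharp sub-frames into each displayed
      # frame, so a 60 Hz output approximates the blur of 960 Hz rendering.
      import numpy as np

      WIDTH, K = 32, 16                  # 16 sub-frames per displayed frame

      def render_subframe(t):
          """Render a sharp 1-pixel dot moving at 960 px/s, at time t (s)."""
          row = np.zeros(WIDTH)
          row[int(960 * t) % WIDTH] = 1.0
          return row

      def displayed_frame(n, out_hz=60):
          """Average K sub-frames spread across one 1/out_hz interval."""
          t0 = n / out_hz
          subs = [render_subframe(t0 + i / (out_hz * K)) for i in range(K)]
          return np.mean(subs, axis=0)   # only this frame goes to the screen

      print(displayed_frame(0))          # the dot becomes a smooth streak
      ```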