lol. Has anyone found ways to optimize Starfield for their PC, like reducing stuttering, FPS drops, etc.?

  • circuitfarmer@lemmy.sdf.org · 1 year ago

    Thing is, I did upgrade my PC. Starfield runs acceptably, but not to the level it should given my hardware.

    I’d much rather hear that they’re working on it in a patch than be gaslit into thinking it already runs well.

    • Freeman@lemmy.pub · 1 year ago

      I would agree. They should acknowledge it’s not well optimized and say they’re working on fixing it, especially on Nvidia cards. It rubs me wrong that they’re in denial here, especially given their rocky release history.

      Heck, I think that’s 50% of the reason they didn’t even want co-op or any netcode. FO76 was a nightmare on release largely because of that.

      • hydrospanner@lemmy.world · 1 year ago

        While I’m probably in a small demographic here, I’m sitting with a perfectly capable PC, not a gaming supercomputer or anything but a healthy upgrade from standard, and when I started hearing about Starfield I got really excited.

        …then I saw all this stuff about compatibility and performance, and when I tried to see where my equipment stood, I was left with the impression that figuring it out would be hard unless I was a PC-building enthusiast, and that unless my computer was pretty high-end and new, it was probably going to struggle.

        And now hearing more about the performance issues, I’ve basically given up even considering the game based on possible performance and compatibility issues.

        • Freeman@lemmy.pub · 1 year ago

          What do you play on? The reality is it’s pretty serviceable, and it’s one of those games where FPS != performance or experience.

          I’ve played most of the game at 27-35 fps. It’s been mostly fine as long as I’m not obsessing about the fps counter. I frankly just turn it off unless something bad starts happening.

          I’ve even figured out ways to get it mostly at 30fps on a pretty low power card.

          • frezik@midwest.social · 1 year ago

            At those settings, you might be seeing a consistent 30fps. You tend to get used to that if that’s what you see all the time.

            What people on somewhat higher-tier hardware are seeing is an average above 60 fps, but with sudden dips below 35 fps. That inconsistency is very noticeable as stutters. This seems to happen even all the way up to an RTX 4090 running at 1080p. The game seems to hit some bad CPU bottlenecks, and no CPU on the market can fix it.
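            To put rough numbers on why that inconsistency feels worse than a flat 30 fps, here’s a quick sketch with made-up frame times (not a real capture): the average can look fine while the 1% lows, which are what you actually feel as stutter, tell a different story.

            ```python
            # Made-up frame times in milliseconds, purely to illustrate the point.
            spiky = [16.7] * 95 + [40.0] * 5   # mostly 60 fps with a few ~25 fps hitches
            flat = [33.3] * 100                # a locked 30 fps

            def summarize(frame_times_ms):
                avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
                worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
                one_percent_low_fps = 1000 * len(worst) / sum(worst)
                return round(avg_fps), round(one_percent_low_fps)

            print(summarize(spiky))  # (56, 25): decent average, ugly lows -> reads as stutter
            print(summarize(flat))   # (30, 30): lower average but consistent -> feels smoother
            ```

            (That’s basically the “1% low” figure benchmark sites report.)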

            • Freeman@lemmy.pub · 1 year ago

              No doubt, I agree. I can definitely get the game to do something similar with KB+M. The response time, sensitivity, and precision of a mouse for camera movement is much faster and more accurate than a gamepad.

              Honestly, one of the biggest things I did was just use a controller. It smoothed the game out quite a bit. It’s then using motion blur (slightly) and the stick acceleration to smooth out the frametime, and the input delays make it much less noticeable. I’ve become convinced that’s the primary way Bethesda tests their games. I started doing this with Fallout 76 for the exact same reasons.

              Those sudden movements with a KB+M seem to force the system to render or re-render and recalculate parts of the world faster, so the dips and stutters become more noticeable.
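              As a toy illustration of that idea (made-up constants, not Bethesda’s actual input code): a stick deflection goes through a response curve and a hard cap on camera speed, while a raw mouse delta can demand a huge rotation in a single frame, forcing the engine to bring much more of the scene into view at once.

              ```python
              # Toy sketch of the general idea; constants are assumptions, not Bethesda's values.
              MAX_STICK_DEG_PER_SEC = 180.0   # assumed cap on controller camera speed
              MOUSE_DEG_PER_COUNT = 0.05      # assumed mouse sensitivity

              def stick_camera_step(stick_x, dt):
                  """Yaw change this frame from a stick deflection in [-1, 1]."""
                  stick_x = max(-1.0, min(1.0, stick_x))
                  curved = stick_x * abs(stick_x)             # quadratic response curve
                  return curved * MAX_STICK_DEG_PER_SEC * dt  # velocity is capped

              def mouse_camera_step(mouse_dx_counts):
                  """Yaw change this frame from a raw mouse delta (no cap)."""
                  return mouse_dx_counts * MOUSE_DEG_PER_COUNT

              dt = 1 / 60
              print(stick_camera_step(1.0, dt))  # 3.0 degrees max in one 60 fps frame
              print(mouse_camera_step(800))      # a quick flick: 40.0 degrees in a single frame
              ```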

              I’m not excusing Bethesda here. I think it’s bullshit. I think they should optimize their code. At the least they should goddamn acknowledge the issue and not act like this is normal. It’s not. I’m merely trying to describe a way you can play and enjoy this game without totally raging out in frustration because Bethesda can’t really do their job, assuming this is a title you wanted to see and enjoy and you’re willing to “support” a company with such a rich history of putting out products like this.

              It’s not really new, either. Skyrim on PC had CTD issues on release, and so did Fallout: New Vegas, along with a crash once RAM usage exceeded 4GB because of the 32-bit barrier. Modders had to fix that shit first. Heck, Fallout 4 was heralded as a success because it was at least playable on day 1.

          • veng@lemmy.world · 1 year ago

            27-35fps is low enough for me to wait 5-10 years and brute force it with future hardware, if it’s on sale cheap enough. It shouldn’t run this badly for how little it impresses graphically.

            • Freeman@lemmy.pub · 1 year ago

              That’s what I’m saying, though. If you don’t have an FPS counter up, the way this game runs at 27-35 is still smoother than 60-75 on other titles (e.g., RDR2).

              I mean, you do you, but raw FPS numbers aren’t necessarily an accurate depiction of how the game runs. It’s like they fuck with frametimes and the like, which makes sense considering that if you unlocked or removed vsync in their old titles, the physics would get wacky, the lockpicking minigames would run super fast, etc.
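              That vsync thing is the classic symptom of game logic advancing a fixed amount per rendered frame instead of per unit of real time. A guess at the pattern (not actual Creation Engine code):

              ```python
              # Guess at the classic pattern, not actual Creation Engine code.
              SWEEP_PER_FRAME = 0.5    # degrees per frame  (framerate-dependent)
              SWEEP_PER_SECOND = 30.0  # degrees per second (framerate-independent)

              def sweep_after_one_second(fps, tied_to_frames):
                  dt = 1.0 / fps
                  total = 0.0
                  for _ in range(fps):                    # simulate one second of frames
                      if tied_to_frames:
                          total += SWEEP_PER_FRAME        # same increment every frame
                      else:
                          total += SWEEP_PER_SECOND * dt  # scaled by elapsed time
                  return total

              print(sweep_after_one_second(60, True))    # 30.0  -> intended speed at 60 fps
              print(sweep_after_one_second(144, True))   # 72.0  -> 2.4x faster once uncapped
              print(sweep_after_one_second(144, False))  # ~30.0 -> unaffected by framerate
              ```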

              That said, I have noticed with a number of Bethesda releases that certain aspects run smoother on a controller. With Fallout 4/3/NV I was able to brute-force performance to run fine on KB+M in 90% of areas, though it would still get choppy downtown, etc. It was when 76 came out that I tried playing just on a controller. Something about the stick acceleration when moving the camera was much smoother, and it made the overall experience better. The same applies here. As soon as I moved to a controller, it became really pretty enjoyable. It doesn’t seem to be a “fast twitch” style game like, say, CS:GO or Battlefield.

              Though I would totally understand if some folks aren’t going to make such concessions. It just seems to me Bethesda is one of those studios that really only playtests/develops for controller-based play despite “supporting” alternative inputs.

              FWIW I already have a controller for other games (like Elite: Dangerous or other flight games), so it’s no biggie for me to change up.

      • Spiritreader@lemmy.world · 1 year ago

        This seems to be the new normal though unfortunately.

        It’s normal that a 5-10 percent generational improvement on a product (6800 XT -> 7800 XT) is celebrated and seen as a good product, when in reality it’s just a little less audacious than what we had to deal with before, and than what Nvidia is doing.

        It’s normal that publishers and game studios don’t care how their games run; even on console things aren’t 100 percent smooth, and the internal rendering resolution is often laughably bad.

        The entire state of gaming at the moment is in a really bad place, and it’s just title after title after title that runs like garbage.

        I hope it’ll eventually stabilize, but I’m not willing to spend thousands every year just to barely be able to play a stuttery game at 60 fps on 1440p. Money doesn’t grow on trees, though AAA game studios certainly seem to think it does.

        Yes, GI and baked-in RT/PT are expensive, but there need to be alternatives for people with less powerful hardware, at least until accelerators become powerful enough and common enough in lower-end cards to make it a non-issue.

      • frezik@midwest.social · 1 year ago

        The Nvidia issue might be more Nvidia’s fault. All three major GPU companies (including Intel now) work with major publishers on making games run well with their products. This time around, though, Nvidia has been focused on AI workloads in the datacenter; this is why their stock has been skyrocketing while their gaming cards below the 4080 (i.e., the cards people can actually afford) have been a joke. Nvidia didn’t work much with Bethesda on this one, and now we see the result.

        Nvidia is destroying what’s left of their reputation with gamers to get on the AI hype train.

        • Pieisawesome@lemmy.world · 1 year ago

          Not entirely true.

          Starfield was an AMD exclusive, so Intel didn’t get the game (or support it) until early access.

          Nvidia probably didn’t get as much time to optimize the game or work with Bethesda as they normally would.

          • Freeman@lemmy.pub · 1 year ago

            “Nvidia didn’t work much with Bethesda on this one, and now we see the result.”

            Is there a source for this? Arguably the game is an AMD exclusive, which is a deal Beth/MS made, so it sounds like maybe they weren’t the inclusive ones in that relationship.