A user on Reddit, u/Yoraxx, posted on the Starfield subreddit about a rendering problem in Starfield on AMD Radeon GPUs. The issue is simple: the game won’t render the star in any solar system when you are on the dayside of a moon or any other planetary body. The problem only occurs on AMD Radeon GPUs, and users with Radeon RX 7000 and RX 6000 series cards are reporting the same thing.

The dayside of any planetary body or moon needs a light source to illuminate it. That source is the system’s star: on any non-AMD GPU, you will see the star/sun in the skybox, lighting the surface. On AMD cards, the star/sun simply isn’t drawn, while the planet/moon remains illuminated with no visible light source.
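This behavior suggests that the surface lighting and the visible sun disc are computed by separate render paths, so one can fail while the other keeps working. Below is a minimal, hypothetical Python sketch of that decoupling; the pass structure, function names, and vendor check are assumptions for illustration only, not anything from Starfield’s actual renderer:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    surface_lit: bool = False
    sun_visible: bool = False

def apply_directional_light(frame: Frame) -> None:
    # Pass 1: shade the planet/moon surface from the star's direction
    # and intensity. Per the bug report, this path works on all GPUs.
    frame.surface_lit = True

def draw_sun_disc(frame: Frame, vendor: str) -> None:
    # Pass 2: draw the star itself into the skybox. The reported bug
    # behaves as if this pass silently produces nothing on Radeon
    # cards (hypothetical vendor check, purely for illustration).
    if vendor == "amd":
        return  # sun disc never makes it into the skybox
    frame.sun_visible = True

frame = Frame()
apply_directional_light(frame)
draw_sun_disc(frame, vendor="amd")
print(frame)  # Frame(surface_lit=True, sun_visible=False): lit dayside, no star
```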

Original Reddit post by u/Yoraxx

  • Pxtl@lemmy.ca · 1 year ago

    Ugh. A part of me wants to give AMD a chance for my next upgrade and push back against Nvidia’s near-monopoly of GPUs but I really don’t want to deal with how everything kinda-sorta works on Radeons.

    • ruckblack@sh.itjust.works · 1 year ago

      I’ve been exclusively on AMD since like 2015, and my GPUs “kinda-sorta working” has not been my experience at all lol. I’ve literally never had AMD-specific problems. The only brand-specific issues I’ve had were trying to get my laptop with an Nvidia GPU to work properly under Linux.

    • JJROKCZ@lemmy.world · 1 year ago

      I’ve been red-only in my rig for over a decade, and the only problems I’ve had are that I play the same games as everyone else perfectly fine and have more money in my wallet from not spending as much on parts. That, and the Bulldozer-generation CPUs heated my house like crazy; there’s no denying that lol

      • XTornado@lemmy.ml · 1 year ago

        Ugh… is the last part still happening? Like, are the new CPUs also that hot, or whatever you’d call it?

        I’m tempted to build a new all-AMD PC for cost alone, although AM4 probably won’t last as long as AM3 did, sadly. But summer is already terrible with my Intel… no need for more heat.

        • JJROKCZ@lemmy.world · 1 year ago

          No, Bulldozer chips have been gone for like 6-7 years. The last two Ryzen generations have been far more energy- and heat-efficient than Intel. Ryzen is the better choice by far right now.

        • ninjan@lemmy.mildgrim.com · 1 year ago

          Current Intel runs hotter than current AMD on the CPU side, and Nvidia currently runs cooler than AMD on the GPU side. Also, we’re on AM5 now. AM4 lived a relatively long time, and there’s no indication AM5 won’t be a long runner as well. Intel changes sockets more often, so for longevity AMD is almost always the best, except at the tail end of a socket.

          • Resolved3874@lemdro.id · 1 year ago

            Huh. Didn’t even know they replaced AM4 until this comment 😂 My AM4 Ryzen 5 paired with an RX 6700 XT still does everything I want it to do. And if it starts slacking, I have plenty of upgrade headroom left.

    • Frog-Brawler@kbin.social · 1 year ago

      I’ve exclusively used AMD GPUs since building my first PC 27 years ago. I’m not aware of things “kinda-sorta” not working.

    • Sharkwellington@lemmy.one · 1 year ago

      I suspect that developers do less testing, optimization, and bugfixing for AMD cards because of their smaller market share, and that’s why more of these brand-specific coding errors slip through. It’s unfortunate, but I can’t deny I’ve seen some weird bugs in my time.

      • darkeox@kbin.social · 1 year ago

        How can an AMD-sponsored game, one that literally runs better on AMD GPUs than on their NVIDIA counterparts and doesn’t ship any tech that might disadvantage AMD GPUs, be less QA’d on AMD GPUs because of market share?

        This game IS better optimized on AMD. It has FSR2 enabled by default on all graphics presets. That particular take especially doesn’t work for this game.

      • MycoPete@sh.itjust.works · 1 year ago

        Some games are built specifically for AMD from the ground up and have been optimized like crazy. It depends on the game and the devs mostly. And let’s not forget that if devs want a game to run well on PS5 and Xbox Series X/S, they’d better have good AMD optimization.

      • Pxtl@lemmy.ca · 1 year ago

        Oh, of course. I don’t actually blame AMD for those kinds of bugs. But it’s the reality as a user, at least in my experience… though it’s been a stupidly long time since I’ve used a machine with an AMD card.