• BananaTrifleViolin@lemmy.world · 10 months ago

      Yeah, this kind of misunderstands what Debian is. If you wanted newer, bleeding-edge stuff, you wouldn’t be using Debian. Debian is all about stability.

      That said, Debian Sid or testing (the bleeding-edge branches that 13 will come from) may well move to 6. Debian 12 was last year, so 13 should land in 2025, which makes it likely 6 will make its way into those bleeding-edge versions if people really want it. But for most end users there are better options than running the test versions of major distros.
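
      If you want to check where things stand for yourself, one rough way is to ask the archive which version of a package each suite currently carries. Below is a minimal sketch, not anything official: it assumes the rmadison tool (from the devscripts package) is installed and that you have network access, and the “linux” kernel source package is only an example argument.

        # Rough sketch: list which version of a package each Debian suite carries,
        # by shelling out to rmadison (from the devscripts package).
        # Assumes rmadison is installed and network access is available.
        import subprocess
        import sys

        package = sys.argv[1] if len(sys.argv) > 1 else "linux"  # example source package

        out = subprocess.run(["rmadison", package],
                             capture_output=True, text=True, check=True).stdout

        # rmadison prints one "package | version | suite | architectures" row per line
        for line in out.splitlines():
            fields = [f.strip() for f in line.split("|")]
            if len(fields) >= 3:
                print(f"{fields[2]:<20} {fields[1]}")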

      • bisby@lemmy.world · 10 months ago

        Debian is not all about “stability” in the sense of “doesn’t crash”. Debian is all about consistency. The platform doesn’t change. That means if there is a bug that crashes the system for you… it’s going to consistently be there.

        For me, it was when stable was on kernel 3.16, 3.18 was in testing, but the latest kernel was 3.19. And this was an era when AMD’s drivers were not yet fully OpenGL compliant, which meant games would crash. Knowing “this game will always crash until three years from now, when we finally get a newer kernel” was enough to chase me off.

        Debian’s neovim package is 0.7.2. Sid is 0.7.2. Experimental is 0.9.5… If there are any bugfixes between 0.7.2 and 0.9.5 that are critical for your workflow… too bad. If it’s not a “security” release, it’s not getting updated. You get to live with the known bug.
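
        For what it’s worth, you can see exactly which suite each available version would come from on your own box. A minimal sketch, assuming the python3-apt bindings are installed and that the suites you care about (sid, experimental, etc.) are already in your apt sources:

          # Rough sketch using the python3-apt bindings (python3-apt package).
          # Assumes suites like sid/experimental are already in your apt sources
          # and `apt update` has been run.
          import apt

          cache = apt.Cache()
          pkg = cache["neovim"]

          for version in pkg.versions:
              # each available version can be offered by one or more configured suites
              suites = sorted({origin.archive for origin in version.origins if origin.archive})
              print(version.version, "from", ", ".join(suites) or "local only")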

        “Never change anything, stick to known-good versions” only works if you know 100% that the “known-good version” is actually bug-free. No code is bug-free, so the locked-down versions in Debian will inevitably still have some flaws (and Debian doesn’t backport bugfixes, they only backport SECURITY fixes). For most use cases, the flaws will be minor enough not to matter. But if a flaw exists, it inevitably affects SOMEONE.

        If you actually want to do any sort of complicated computing, Debian is not a great choice. If you want an unchanging base so you can run a web browser and a word processor, I’m sure it’s great.

        • hemko@lemmy.dbzer0.com · 10 months ago

          Debian is not all about “stability” in the sense of “doesn’t crash”. Debian is all about consistency. The platform doesn’t change.

          Yes, that’s what ‘stable’ means.

          • bisby@lemmy.world · 10 months ago

            Most people use “stable” to refer to something that doesn’t crash or cause issues. Something you might call “rock solid”, which implies it’s not going to fall over. Something to put on your server because you’ll get great uptime without issues.

            Debian is one of the few places where stable might crash more than unstable, because fixes for known bugs don’t get backported unless they address security issues.

            I use Debian on my servers because “some testing” is nice, and the only thing I run on them is Docker. And ironically, I have to pull Docker itself from an external repo.

            So for me, it’s a stable enough base OS, but it’s “too stable” for anything that actually runs on the servers.

      • Possibly linux@lemmy.zip · 10 months ago

        If you want cutting edge, you should use Fedora. Debian does have an unstable branch, but it isn’t really tested.

        • zaphod@lemmy.ca · 10 months ago

          That’s seriously overstating things. I’ve been running testing or Sid for years and years, and I can only remember a handful of times when anything meaningfully broke. And typically it’s dependency breakage, not actual software breakage.

          • Possibly linux@lemmy.zip · 10 months ago

            Testing is different from unstable. Testing should be fairly stable, but it is missing security support, so keep that in mind.

            • zaphod@lemmy.ca · 10 months ago

              Yes, I’m aware of the security tradeoffs with testing, which is why I’ve started refraining from mentioning it as an option: pedants like to pop out of the woodwork and bring up this exact issue every damn time.

              Also, testing absolutely gets “security support”; the issue is that security fixes don’t land in testing immediately, so there can be some delay. As per the FAQ:

              Security for testing benefits from the security efforts of the entire project for unstable. However, there is a minimum two-day migration delay, and sometimes security fixes can be held up by transitions. The Security Team helps to move along those transitions holding back important security uploads, but this is not always possible and delays may occur.
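
              If you ever want to see whether a particular fix has made that hop yet, one quick check is to compare the version sitting in unstable against the one in testing. A rough sketch, again assuming rmadison (from the devscripts package) is available; “openssl” is only a placeholder package name:

                # Rough sketch: has the version in unstable migrated to testing yet?
                # Assumes rmadison (from devscripts) is installed; the package name
                # is just a placeholder.
                import subprocess

                package = "openssl"  # substitute the package you actually care about
                out = subprocess.run(["rmadison", package],
                                     capture_output=True, text=True, check=True).stdout

                versions = {}  # suite -> version
                for line in out.splitlines():
                    fields = [f.strip() for f in line.split("|")]
                    if len(fields) >= 3:
                        versions[fields[2]] = fields[1]
                        print(f"{fields[2]:<20} {fields[1]}")

                # Depending on the madison backend, the suite column may show aliases
                # ("testing", "unstable") or codenames, so only compare when both are present.
                if "testing" in versions and "unstable" in versions:
                    if versions["testing"] == versions["unstable"]:
                        print("testing has caught up with unstable")
                    else:
                        print("testing is still behind unstable (migration delay)")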