• NaibofTabr@infosec.pub · 8 months ago

    The problem with this is that even if a machine is conscious, there’s no reason it would be conscious like us. I fully agree that consciousness could take many forms, probably infinite forms - and there’s no reason to expect that one form would be functionally or technically compatible with another.

    What does the idea “exact copy of our brain” mean to you? Would it involve emulating the physical structure of a human brain? Would it attempt to abstract the brain’s operations from the physical structure? Would it be a collection of electrical potentials? Simulations of the behavior of specific neurochemicals? What would it be in practice, that would not be hand-wavy fantasy?

    • Sombyr@lemmy.zip · 8 months ago

      I suppose I was overly vague about what I meant by “exact copy.” I mean uploading all of our knowledge and memories, along with an exact map of the state of our neurons at the moment of upload, to a computer, and then simulating the brain’s functions from there. Many people believe that even if we could simulate it so perfectly that it matched a human brain’s functions exactly, it still wouldn’t be conscious, because it’s still not a real human brain. That’s the point I was arguing against. My argument was that if we could mimic human brain function closely enough, there’s no reason to believe the brain is so special that a simulation could not achieve consciousness too.
      And you’re right, it may not be conscious in the same way. We have no reason to believe either way that it would or wouldn’t be, because the only thing any of us can actually verify is conscious is ourself. Not humans in general, just you, individually. How conscious something is is therefore more of a philosophical question than a scientific one, because we simply cannot test it. We couldn’t even test whether it was conscious at all, and my point wasn’t that it would be; my point is that we have no reason to believe it’s possible or impossible.

      • intensely_human@lemm.ee · 8 months ago

        Unfortunately, the physical processes underlying brain function are chaotic systems, meaning infinite (or “maximum”) precision is required to ensure two systems evolve to the same later states.

        That level of precision cannot be achieved in measuring the state, without altering the state into something unknown after the moment of measurement.

        Nothing quantum is necessary for this inability to determine state. Consider the problem of trying to map out where the eight ball is on a pool table, but you can’t see the eight ball. All you can do is throw other balls at it and observe how their velocities change. Now imagine you can’t see those balls either, because the sensing mechanism you’re using is composed of balls of equal or greater size.

        Unsolvable problem. Like a box trying to contain itself.
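        To make the precision point concrete, here’s a toy sketch in Python using the logistic map, a standard textbook chaotic system (not a brain model, just an illustration of sensitive dependence on initial conditions). The starting value and perturbation size are arbitrary choices for the demo:

        ```python
        # Sensitive dependence on initial conditions in the logistic map
        # x -> r*x*(1-x) with r=4, a fully chaotic regime.

        def logistic_trajectory(x0, r=4.0, steps=50):
            """Iterate the logistic map from x0 and return the full trajectory."""
            xs = [x0]
            for _ in range(steps):
                xs.append(r * xs[-1] * (1.0 - xs[-1]))
            return xs

        a = logistic_trajectory(0.4)           # the "original" system
        b = logistic_trajectory(0.4 + 1e-12)   # a "copy" with a tiny measurement error

        # The gap grows roughly exponentially until the trajectories
        # bear no resemblance to each other.
        for step in (0, 10, 20, 30, 40):
            print(step, abs(a[step] - b[step]))
        ```

        A measurement error of one part in a trillion still blows up to order-one differences within a few dozen iterations, which is the sense in which “maximum precision” is required to keep two copies in sync.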

          • Blue_Morpho@lemmy.world · 8 months ago

          Chaos comes into play as a state changes. The poster above you talks about copying the state. Once copied, the two states will diverge because of chaos, but that doesn’t preclude consciousness. It just means the copy will soon have different thoughts.

    • intensely_human@lemm.ee · 8 months ago

      We make a giant theme park where people can interact with androids. Then we make a practically infinite number of copies of this theme park. We put androids in the copies and keep providing feedback to alter their behavior until they behave exactly like the people in the theme park.