• keepcarrot [she/her]@hexbear.net · 2 months ago

    Is there a reason for the convention other than that’s how most people count? (Which is a perfectly fine reason, I’m just curious)

    • micnd90 [he/him,any]@hexbear.net · 2 months ago

      When you say the first element of a matrix, “first” implies one, not zero. That is how linear algebra was invented (on paper, by human mathematicians), taught, and passed down.

      Starting indexes at zero stems from the lineage of C programming and the binary nature of computers. For example:

      With N address bits a computer can address 2^N memory cells, numbered 0 through 2^N − 1. If we started counting at 1, the highest address would be 2^N itself, which needs N+1 bits; the extra bit exists only to reach that single address (with N = 3, address 8 is 1000 in binary). The other way out would be to keep N address lines and leave one address inaccessible.
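      A quick sketch of that arithmetic (the N = 3 case is the “1000” mentioned above; everything else is just illustration):

      ```c
      #include <assert.h>
      #include <stdio.h>

      int main(void) {
          int N = 3;
          int cells = 1 << N;  /* 2^3 = 8 addressable cells */

          /* Counting from 0, the highest address is 7 (binary 111): N bits suffice. */
          printf("last 0-based address: %d\n", cells - 1);

          /* Counting from 1, the highest address is 8 (binary 1000): N+1 bits. */
          printf("last 1-based address: %d\n", cells);

          assert(cells - 1 == 7);
          assert(cells == 8);
          return 0;
      }
      ```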

      This is why math and physics people who learn linear algebra and matrix calculus index from 1 (on paper), while computer programmers index from 0.
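      To make the two conventions concrete, a minimal sketch (the matrix values are made up for illustration):

      ```c
      #include <assert.h>

      int main(void) {
          /* On paper, the top-left entry of A is written a_{1,1};
             in C the very same entry is A[0][0]. */
          double A[2][2] = {{1.0, 2.0},
                            {3.0, 4.0}};
          assert(A[0][0] == 1.0);  /* a_{1,1} in math notation */
          assert(A[1][0] == 3.0);  /* a_{2,1} in math notation */
          return 0;
      }
      ```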

      • keepcarrot [she/her]@hexbear.net · 2 months ago

        Is linear algebra older than 0? Hang on (no, it is not; it was formalised in the 17th century).

        In my CS course, at least, it was treated as “engineering”, so we did both linear algebra and C programming. For everyone, counting from 1 was more natural, and the C convention had to be re-taught a few times throughout the course (we started with Java loops, and Java wasn’t used for malloc; OOP was probably the first unit anyone did for CS). As a habit, 0-based indexing tended to stick even where we didn’t really need it, or in languages that don’t use it (e.g. Lua), given how gruelling C programming was and how many of the other languages were downstream of it.

        I guess you could analogise it: saying “17th century” for 1600–1699 (the first century being 0001–0099, I guess) is like CS counting from the very start of a thing (how many apple-widths to get to the first apple), versus the more common counting of how many apples you’ve picked up once you have the first one. Or something, idk.

        I’m drunk and avoiding housework, sorry
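        The century analogy above can be put as arithmetic (following the comment’s loose “17th century = 1600–1699” convention, not the strict 1601–1700 one):

        ```c
        #include <assert.h>

        /* Ordinal, 1-based: which century a year falls in (loose convention). */
        int century(int year) { return year / 100 + 1; }

        /* Cardinal, 0-based: how many full centuries have elapsed. */
        int centuries_elapsed(int year) { return year / 100; }

        int main(void) {
            assert(century(1600) == 17);
            assert(century(1699) == 17);
            assert(centuries_elapsed(1600) == 16);
            return 0;
        }
        ```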

    • DannyBoy@sh.itjust.works · 2 months ago

      It’s from when arrays were just a block of memory and the index was the offset. You’d start at pointer x and read memory from there; x + i was your memory location. So you’d read your first element at x + 0, and x + 1 would be the location in memory of the second element.
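      A minimal sketch of that offset arithmetic (the array contents are invented for illustration):

      ```c
      #include <assert.h>
      #include <stdio.h>

      int main(void) {
          int data[4] = {10, 20, 30, 40};
          int *x = data;  /* x points at the start of the block */

          /* x[i] is defined as *(x + i): the index is an offset from x. */
          assert(*(x + 0) == 10);  /* first element: 0 elements past x */
          assert(*(x + 1) == 20);  /* second element: 1 element past x */

          printf("first: %d, second: %d\n", x[0], x[1]);
          return 0;
      }
      ```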