• ReallyKinda@kbin.social · 56 up · 1 year ago

    I know a couple of teachers (college level) who have caught several GPT papers over the summer. It’s a great cheating tool, but as with all cheating in the past, you still have to basically learn the material (at least for narrative papers) to proofread GPT properly. It doesn’t get jargon right, it makes things up, it makes no attempt to adhere to reason when it’s making an argument.

    Using translation tools is extra obvious - have a native speaker proofread your paper if you attempt to use an AI translator on a paper for credit!!

    • SpikesOtherDog@ani.social · 13 up · 1 year ago

      it makes things up, it makes no attempt to adhere to reason when it’s making an argument.

      It hardly understands logic at all. I’m using it to generate content, and it will continually assert information in ways that don’t make sense, relate things that aren’t connected, and forget facts that don’t flow into the response.

      • mayonaise_met@feddit.nl · 10 up · 1 year ago (edited)

        As a layman who uses GPT-4 quite a lot to generate code and formulas, my understanding is that it doesn’t understand logic at all. Afaik, there is currently no rational process which considers whether what it’s about to say makes sense and is correct.

        It just sort of bullshits its way to an answer based on whether words seem likely according to its model.

        That’s why you can point it in the right direction and it will sometimes appear to apply reasoning and correct itself. But you can just as easily point it in the wrong direction and it will do that just as confidently too.

        • Aceticon@lemmy.world · 7 up · 1 year ago

          It has no notion of logic at all.

          It roughly works by piecing together sentences based on the probability of the various elements (mainly words, but also more complex structures) appearing in various relations to each other, with the “probability curves” (not quite probability curves, but a good enough analogy) having been derived from the very large language training sets used to train them (hence LLM - Large Language Model).

          This is why you might get pieces of argumentation which are internally consistent (or merely familiar segments from actual human posts where people are making an argument) but not consistent with each other: the thing is not building an argument by following a logical thread, it’s just putting together language tokens in common ways which, in its training set, were found to be associated with each other and with language token structures similar to those in your question.
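
          To make that concrete, here’s a minimal toy sketch in Python (my own illustrative example - a word-level bigram model, nothing like a real LLM, which works on sub-word tokens and far richer context): text is generated by repeatedly sampling whichever word tended to follow the previous one in the training text, and nowhere is there a step that checks logic or truth.

          ```python
          # Toy bigram "language model" (illustration only): generation is just
          # sampling the next word from co-occurrence statistics learned from
          # the training text - no step anywhere checks logic or truth.
          import random
          from collections import Counter, defaultdict

          training_text = (
              "the model predicts the next word and the next word follows the last word"
          )

          # Count which word tends to follow which (a crude stand-in for the
          # "probability curves" a real LLM derives from a huge training set).
          follows = defaultdict(Counter)
          words = training_text.split()
          for current, nxt in zip(words, words[1:]):
              follows[current][nxt] += 1

          def generate(start, length=10):
              """Chain words together purely by weighted random choice of the next word."""
              out = [start]
              for _ in range(length):
                  options = follows.get(out[-1])
                  if not options:
                      break  # nothing ever followed this word in training
                  choices, weights = zip(*options.items())
                  out.append(random.choices(choices, weights=weights)[0])
              return " ".join(out)

          print(generate("the"))  # fluent-looking but meaning-free output
          ```

          Scaled up enormously and conditioned on much more than the single previous word, that same sampling loop is essentially what produces the fluent-but-unreasoned output described above.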

    • pc_admin@aussie.zone · 6 up / 32 down · 1 year ago (edited)

      Any teacher still issuing out-of-class homework or assignments is doing a disservice IMO.

      Of course people will just GPT it… you need to get them off the computer and into an exam room.

      • SmoothLiquidation@lemmy.world · 45 up / 6 down · 1 year ago

        GPT is a tool that students will have access to for their entire professional lives. It should be treated as such and worked into the curriculum.

        Forbidding it would be like saying you can’t use Photoshop in a photography class.

        • Neve8028@lemm.ee · 24 up / 1 down · 1 year ago

          It can definitely be a good tool for studying or for organizing your thoughts, but it’s also easily abused. School is there to teach you how to take in and analyze information, and chat AIs can basically do that for you (whether or not their analysis is correct is another story). I’ve heard a lot of people compare it to the advent of the calculator, but I think that’s wrong. A calculator spits out an objective truth and will always say the same thing. ChatGPT can take your input and add analysis and context in a way that circumvents the point of the assignment, which is to figure out what you personally learned.

          • Benj1B@sh.itjust.works · 2 up / 10 down · 1 year ago

            Where it gets really challenging is that LLMs can take the assignment input and generate an answer that is actually more educational for the student than what they learned in class. A good education system would instruct students in how to structure their prompts in a way that helps them learn the material. Because LLMs can construct virtually limitless examples and analogies and write in any kind of style, you can tailor them to each student with the correct prompts and get a level of engagement equal to a private tutor for every student.

            So the act of using the tool to generate an assignment response could, if done correctly and with guidance, be more educational than anything the student picked up in class - but if it’s not monitored, and students don’t use the tool the right way, it is just going to be seen as a shortcut for answers. The education system needs to move quickly to adapt to the new tech, but I don’t have a lot of hope - some individual teachers will do great as they always have, others will be shitty, and the education departments will lag behind a decade or two as usual.

            • Neve8028@lemm.ee · 5 up · 1 year ago

              Where it gets really challenging is that LLMs can take the assignment input and generate an answer that is actually more educational for the student than what they learned in class.

              That’s if the LLM is right. If you don’t know the material, you have no idea if what it’s spitting out is correct or not. That’s especially dangerous once you get to undergrad level when learning about more specialized subjects. Also, how can reading a paper be more informative than doing research and reading relevant sources? The paper is just the summary of the research.

              and get a level of engagement equal to a private tutor for every student.

              Eh. Even assuming it’s always 100% correct, there’s so much more value in talking to a knowledgeable human being about the subject. There’s so much more nuance to an in-person conversation than to an exchange with an AI.

              Look, again, I do think that LLMs can be great resources and should be taken advantage of. Where we disagree is that I think the point of the assignment is to gain the skills to do research, analysis, and generally think critically about the material. You seem to think that the goal is to hand something in.

        • MrMcGasion@lemmy.world · 10 up · 1 year ago

          I’ve been in photography classes where Photoshop wasn’t allowed, although it was pretty easily enforced because we were required to use school-provided film cameras. Half the semester was 35mm film, and the other half was 3x5 graphic press cameras, where we were allowed to do some editing - provided we could do the edits while developing our own film and prints in the lab. It was a great way to learn the fundamentals and to learn to take better pictures in the first place. There were plenty of other classes where Photoshop was allowed, but sometimes restricting which tools can be used can help push us to be better.

        • ReallyKinda@kbin.social · 6 up · 1 year ago

          Depends on how it’s used, of course. Using it to help brainstorm phrasing is very useful. Asking it to write a paper and then editing it and turning it in is no different from regular plagiarism imo. Bans will apply to the latter case, and the former case should be undetectable.

        • Muffi@programming.dev · 1 up · 1 year ago

          I studied engineering. Most classes were split into 2 hours of theory, followed by 2 hours of practical assignments. Both within the official class hours, so teachers could assist with the assignments. The best college-class structure by far imo.