• Neve8028@lemm.ee · 1 year ago

    It can definitely be a good tool for studying or for organizing your thoughts, but it's also easily abused. School is there to teach you how to take in and analyze information, and chat AIs can basically do that for you (whether or not their analysis is correct is another story). I've heard a lot of people compare it to the advent of the calculator, but I think that's wrong. A calculator spits out an objective truth and will always give the same answer. ChatGPT can take your input and add analysis and context in a way that circumvents the point of the assignment, which is to figure out what you personally learned.

    • Benj1B@sh.itjust.works · 1 year ago

      Where it gets really challenging is that LLMs can take the assignment input and generate an answer that is actually more educational for the student than what they learned in class. A good education system would teach students how to structure their prompts in a way that helps them learn the material. Because LLMs can construct virtually limitless examples and analogies and write in any style, you can tailor them to each student with the right prompts and get a level of engagement equal to a private tutor for every student.

      So the act of using the tool to generate an assignment response could, if done correctly and with guidance, be more educational than anything the student picked up in class. But if it's not monitored, and students don't use the tool the right way, it will just be seen as a shortcut for answers. The education system needs to move quickly to adapt to the new tech, but I don't have a lot of hope: some individual teachers will do great, as they always have, others will be shitty, and the education departments will lag a decade or two behind, as usual.

      • Neve8028@lemm.ee · 1 year ago

        Where it gets really challenging is that LLMs can take the assignment input and generate an answer that is actually more educational for the student than what they learned in class.

        That's only if the LLM is right. If you don't know the material, you have no idea whether what it's spitting out is correct. That's especially dangerous once you get to the undergrad level and start learning more specialized subjects. Also, how can reading a paper be more informative than doing the research and reading the relevant sources yourself? The paper is just a summary of the research.

        and get a level of engagement equal to a private tutor for every student.

        Eh. Even assuming it's always 100% correct, there's so much more value in talking to a knowledgeable human being about the subject. There's far more nuance to an in-person conversation than to speaking with an AI.

        Look, again, I do think that LLMs can be great resources and should be taken advantage of. Where we disagree is that I think the point of the assignment is to gain the skills to do research and analysis and to think critically about the material. You seem to think that the goal is just to hand something in.