• VirtualOdour@sh.itjust.works
    3 months ago

    Yeah, it's weird how many people are stuck far off on either side of the sensible opinion - some seem to think their ten minutes with ChatGPT 3.5 was enough to demonstrate that the whole concept of LLMs is stupid, while others think it shows a godlike technology incapable of error.

    The reality is there are a lot of great ways AI, and especially LLMs, can help education right now, but we're far from it being ready to replace human teachers. In five years it'll probably be a standard part of most education systems, much like online homework portals and study guides have become since my own time in school. Maybe by 2035 we'll have moved to systems where AI education tools are in every school and providing higher quality education than most human teachers, and possibly by 2040-50 home schooling via AI tools will be a more common option than other forms of education.

    Though I wouldn't be too shocked if it happened sooner, it will require new developments in AI that aren't yet even in the development stage.

    That all said, I bet this article is exaggerating reality and they'll have human teachers involved at every step, overseeing activities.

    • Carrolade@lemmy.world
      3 months ago

      The only major problem with this line of thought is that it underestimates the challenges of teaching.

      Teaching is about more than just providing the material; if that were enough, we could have automated teaching a long time ago. A teacher has to be able to understand and diagnose the source of a student's confusion and compose a solution. That is a very complex problem, given how much individual people vary in their thinking, experiences, and knowledge base.

      • fartsparkles@sh.itjust.works
        3 months ago

        Such as: can an LLM tell, from the bruises on a child or the sunken shoulders where the day before they were bright and cheerful, that they are being abused at home, and that no amount of tailored teaching plans will help that child without a keen and perceptive teacher who spots what's really going on?

        And will that abused child feel cared about by a school that thinks an AI and a computer monitor are superior to a human being with empathy?

    • snooggums@midwest.social
      3 months ago

      possibly by 2040-50 home schooling via AI tools will be a more common option than other forms of education.

      Taught by nuclear-powered robots, and we will get around in flying cars, have transporter technology, predict the future through quantum computing, and someone will have figured out how to heat up a Hot Pocket so it doesn't burn the roof of your mouth.

      • VirtualOdour@sh.itjust.works
        2 months ago

        Living in this rapidly changing world and pretending tech development doesn’t happen is so bizarre I don’t even know where to begin.

        You're the type that was saying online shopping wouldn't catch on in 2010 when it was already huge, or that computers were a fad in 1995.

        You're going to live in a world with increasingly good AI, and the more you pretend it's not happening, the sillier you'll look.

        • snooggums@midwest.social
          2 months ago

          When I was young I lived on science fiction from authors like Isaac Asimov that regularly featured tons of technology, including robotic teachers and general artificial intelligence. That led to an interest in computers, and in 1995 I enrolled in a computer science program at a state college; I currently work with data collections that come from student information systems. I also have a kid in high school right now.

          My dad worked at a university, and we had an electric motorcycle and car prototypes in the late '80s/early '90s that used racks of car batteries to drive their electric motors. Man, that bike was bulky, but it was quiet, and the potential was obvious once the battery bulk was addressed. It has taken decades to get there, but now we have tons of electric vehicles, and that is awesome! Hell, I didn't expect the miniaturization of computers to reach the current form factor in my lifetime.

          But the thing is, having the idea and the possibility of something involves multiple parts. Electric vehicles wouldn't be possible at scale until the battery bulk problem was solved. Working towards it was good, because there was an end goal that could be met if only we could solve the form factor. The problem with AI as a realistic replacement for teachers is that there is far more to it than a computer; it will require hardware advancements that are pie in the sky at the moment, in addition to the actual process of machine learning. Science fiction modeled advanced computers after the neural networks of the brain, such as Asimov's positronic brains. For AI to be at least as smart as the average person, it will need a network of similar complexity. We are currently using massive amounts of energy to regurgitate jokes from reddit as facts. To have something like a teacher, we will need energy requirements similar to a brain, in a form factor similar to a brain, with the ability to learn and share with other fake brains to improve over time.

          But then we get to the reality part, which is that nobody is going to do this work at scale to make pretend teachers of any quality any time soon. Student software systems used by schools are pretty rough because the money isn't there, and those are just tools for people to use, not a replacement for people! There is far more to being a teacher than the subjects they teach. Teachers need to motivate students and act as human beings in real meat space, and even then a lot of teachers are mediocre but trying, and some are even terrible, because teaching is hard.

          Now you might say that AI home schooling would be done by a motivated kid, but a motivated kid doesn't need AI to learn on their own. They do that already! We already have tons of available content like Khan Academy that motivated kids can use to learn. But Khan Academy can't motivate a child to learn, and AI won't either, because it is a thing, not a person with life experience and a complex biology beyond just a brain, both of which improve success in teaching. For AI to be used to teach more than 50% of students, it would need to be at least as good as the better teachers, including the ability to motivate students.

          Between my wide-eyed youth and today I have seen a ton of technology touted as the next big thing; some are hits and some are obvious misses. There are tons of technologies promoted for specific uses that are clearly bullshit, either because of the technology itself or because it is being sold as a magical cure-all by snake oil salesmen. AI as a magical cure-all is snake oil. It has a lot of uses, but this is not one of them.

          Extra credit: Home schooling is fine for some kids, but the vast majority of kids need to share physical space with other children outside their home as part of their development, because we are biological creatures. Even in 1000 years, the majority of students learning exclusively alone at home from any kind of teacher is a pipe dream.