Does AI actually help students learn? A recent experiment in a high school provides a cautionary tale.
Researchers at the University of Pennsylvania found that Turkish high school students who had access to ChatGPT while doing practice math problems did worse on a math test compared with students who didn’t have access to ChatGPT. Those with ChatGPT solved 48 percent more of the practice problems correctly, but they ultimately scored 17 percent worse on a test of the topic that the students were learning.
A third group of students had access to a revised version of ChatGPT that functioned more like a tutor. This chatbot was programmed to provide hints without directly divulging the answer. The students who used it did spectacularly better on the practice problems, solving 127 percent more of them correctly compared with students who did their practice work without any high-tech aids. But on a test afterwards, these AI-tutored students did no better. Students who just did their practice problems the old-fashioned way — on their own — matched their test scores.
I’ve found AI helpful when asking it to explain stuff. Why is the problem solved like this, why did you use this and not that, could you put it in simpler terms, and so on. Much like you might ask a teacher.
I think this works great if the student is interested in the subject, but if you’re just trying to work through a bunch of problems so you can stop working through a bunch of problems, it ain’t gonna help you.
I have personally learned so much from LLMs (you can’t really take anything at face value and have to look things up independently, but they give you a great starting place), but it comes from a genuine interest in the questions I’m asking and the things I dig at.
No offense, but that’s also what the article highlights: students, even good ones, believe they learned. Once it’s time to pass a test designed to evaluate whether they actually did, the results aren’t so positive.
I mean…
Do you want to have a conversation with me in Japanese? 😅
At the end of the day, I feel like it’s about how you use the tool. “if you’re just trying to work through a bunch of problems so you can stop working through a bunch of problems, it ain’t gonna help you.” How do you think a bunch of kids are going to use it on schoolwork they’re required to finish but aren’t actually interested in?
To an extent, but it’s often just wrong about stuff.
It’s been a good second step for things I have questions about that I can’t immediately find good search results for. I don’t wanna get off topic but I have major beef with Stack Overflow and posting questions there makes me anxious as hell because I’ll do so much diligence to make sure it is clear, reproducible, and not a duplicate only for my questions to still get closed. It’s a major fucking waste of my time. Why put all that effort in when it’s still going to get closed?? Anyways – ChatGPT never gets mad at me. Sure, it’s often wrong as hell but it never berates me or makes me feel stupid for asking a question. It generally gets me close enough on topics that I can search for other terms in search engines and get different results that are more helpful.
Yep. My first interaction with GPT pro lasted 36 hours and I nearly changed my religion.
AI is the best thing to come to learning, ever. If you are a curious person, this is bigger than Gutenberg, IMO.
That sounds like a manic episode