Sal Khan's AI revolution hasn't happened yet
Posted by the-mitr 1 day ago
Comments
Comment by MostlyStable 1 day ago
I was among those who, when Khanmigo was first announced, were pretty excited about its potential. I then waited for data on the results... and kept waiting... and kept waiting. And now, four years later, this is apparently what we are going to get. I think that this is enough for me to decide that Khanmigo, regardless of whether or not a student actually engages with it, doesn't make much learning difference. At some point, the absence of (reported) data becomes data in itself.
I still believe, in principle, that AI tutors could be massively helpful for learning. But apparently we haven't yet figured out how to take that principle and turn it into reality.
Comment by galaxyLogic 1 day ago
Comment by mrdevlar 1 day ago
AI is perfectly capable of teaching you quantum mechanics if you understand music theory. However, unless you have a full understanding of music theory, you'll need to explain to the machine what you know, and that takes trial and error that most students won't bother with.
Comment by christkv 1 day ago
Comment by 6stringmerc 1 day ago
Honest question: how many of you tech bros have used this platform with your own children? If you won’t dog food it, quit claiming it’ll help the disadvantaged. Please.
Comment by tanvach 1 day ago
It’s going to be quite hard to motivate students to learn now that they know answering can be automated.
Comment by drivebyhooting 1 day ago
Comment by dirkc 1 day ago
Comment by roncesvalles 1 day ago
I still remember when Khan Academy first came out, there was talk that teachers would go obsolete because teaching would become centralized and delivered over video.
Khan Academy to me is still just a YouTube channel trying very hard to be something more.
Comment by 440bx 1 day ago
The thing is people want more than material. They want the material to be accredited and examined. Otherwise there is no demonstrable credibility from doing it.
And there's a whole world out there of higher-quality material that has that accreditation and examination structure around it. And it existed, sometimes for decades in the case of The Open University, before Khan Academy appeared. But it costs money.
Comment by utopiah 1 day ago
Well, in practice it's still about the amount of time a pupil trains with the right oversight, and that is precisely the bottleneck that hasn't been alleviated.
Comment by the-mitr 1 day ago
Comment by utopiah 1 day ago
amazing in theory with the perfect user in the perfect use case,
misused in practice with terrible consequences for society at large.
Sure, the one student who already excels, is motivated, understands what the concept to learn is, and knows that actually completing exercises helps them learn might, possibly, thrive. All other students, the vast majority, will try to "game" the (terrible) evaluation system to get good grades by cheating WHILE avoiding the very challenge that makes the learning possible. Who could have guessed?
Comment by 18al 1 day ago
A paragraph from [0] makes it seem that students understand that LLM use doesn't lead to learning, but they still do it. Do they not see the effort put into learning as worthwhile?
> A few months ago, I overheard some college students talking about their classes. One was complaining about an assignment they needed to do that night, and another incredulously asked why they wouldn’t just have ChatGPT do it. The first replied, “This is my major, I actually need to learn stuff in this class. I use AI for my other classes.”
I myself use LLMs for learning (using ChatGPT's study mode, for instance - r.i.p.) and can see that there's a right way to use it: you reach for it when you hit a wall, not to avoid the friction of developing an understanding. From what I understand, though, most LLM use for learning is just the LLM used as a tool for cheating. Even tfa mentions something of the sort:
> few of Musall’s most advanced students have taken advantage of AI to learn new topics. But, as far as she can tell, more students are using it to just find answers
The article attributes part of the problem to a _skill issue_, but how much of it is a motivation or awareness issue? How do you make students realize that learning is worth it?

[0] https://arstechnica.com/science/2026/04/to-teach-in-the-time...
Comment by JimsonYang 1 day ago
Comment by UncleMeat 1 day ago
But "students will use the cheating machine to cheat" was obvious from the release of ChatGPT. There was never some period of time where AI looked like it was a net positive for students, only to be revealed to have an unexpected harm.
Even from the folks who claim to use LLMs to learn rather than cheat or avoid work, I've seen so many people admit that they are actually using it to harm themselves. "Oh, I only ask ChatGPT for the answer for really hard problems." Yeah man, doing the hard problems is how you learn.
Comment by sigmoid10 1 day ago
>Unlike other AI tools such as ChatGPT, Khanmigo doesn’t just give answers. Instead, with limitless patience, it guides learners to find the answer themselves. In addition, Khanmigo is the only AI tool that is incorporated with Khan Academy’s world-class content library that covers math, humanities, coding, social studies, and more.
The first differentiation is literally just prompting (if at all). Nowadays you can tell any chatbot to behave that way. The second one may have been an edge before tool use was widely common, but with all chatbots now having access to the internet and code execution, it seems like this has also become a dud. This product was a nice idea on paper, but the fast technical evolution of the field has largely left it in the dust.
Comment by vasco 1 day ago
They had really cool math videos and got given too much money, that's about the story.
Comment by xtiansimon 1 day ago
I thought Sal's revolution was the idea of flipping the script on primary school learning: in-class homework & at-home video lessons.
I'm not surprised. Students are not rewarded when they ask _curious_ questions--rather, they're admonished for not paying sufficient attention.
Personally, my first use of ChatGPT was to ask tangential questions on JavaScript while taking a LinkedIn learning course on VueJS. I found ChatAI an excellent substitute for Reddit and StackOverflow, which is how I would have followed these inquiries before. Of course, I'm not a primary-school-age learner. I had to learn _How To Learn_ from experience.
Comment by augment_me 1 day ago
The issue in my country is that we equate education with getting a safe job. 20 years ago, you needed a high-school degree in social science to get a government job. 10 years ago you needed a bachelor's in social sciences to get the same job. 5 years ago you needed a bachelor's in economics/engineering to get the same job. Now, because of recessions, this is stretching to master's degrees.
You can't expect people who just want a job and a comfortable life, and NEED to go to uni for this, to be curious and want to learn.
Comment by Ekaros 1 day ago
Comment by utopiah 1 day ago
Feels like whatever tool they'd be given, they'd be ahead anyway. What's more worrying IMHO is whether the remaining 85% are faring even worse than they would have before because they are learning even less, not just more slowly than the 15% learning faster. Namely, is the gain for the few a loss for the majority?
Comment by augment_me 1 day ago
As for the other question, it's mixed. I think about 20% of students understand that they are fucked if they just delegate it all to LLMs; they still go through the motions and show up to class, but do the minimum. However, most are off the deep end to varying degrees. I have seen students with 5 different 3000-line files for 5 questions in the same lab, where each file differs by 3 lines of code. This never happened even when students cheated by accessing old labs online or plagiarizing before.
I believe what will happen (because universities move really slowly on policy and education around LLM use) is this: pre-LLM, a university produced a normal distribution of skills upon graduation. A company could trust that someone with a degree knew X and Y. Now, however, you have more of a bimodal distribution - some know nothing and some know it all - so trust in universities deteriorates. I think we will see many more IQ-test-style/practical tests in hiring processes as trust that a degree equals something falters.
Comment by uhoh-itsmaciek 1 day ago
Ignoring whether or not this is a good idea in the first place, what about inverting the loop? Have the robot drive the interaction.
Comment by ericd 1 day ago
It's been fascinating to watch - my kids are really into Slay the Spire, and it had a discussion with them about a decision tree they use when fighting one of the enemies, and then it used that to bridge into writing some Python code and walking them through it. Another time, with dinosaurs, it went with them through the K-Pg extinction event and what really killed the dinosaurs - the kids thought it was the explosion - and walked them toward the sun dimming, and why food getting scarcer filtered for small mammals, our ancestors, and smaller dinosaurs.
Comment by npodbielski 1 day ago
On the other hand, I was playing a lot of Slay the Spire a few years back, and I would love to talk about it with my kids while they play. Following that logic, is it not the job of the parent to explain why the dinosaurs went extinct?
Comment by ericd 1 day ago
Comment by croes 1 day ago
If you can’t articulate what you want it becomes a guessing game
Comment by uhoh-itsmaciek 1 day ago
Comment by themafia 1 day ago
How about completing the loop? Pose subject matter questions to them throughout the day, maybe via something like mobile push, collect their answers, immediately grade their results, and then actively reward them for performance.
All of the things brick and mortar schools are uniquely bad at.
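A minimal sketch of that pose/grade/reward loop, with an entirely hypothetical question bank and grading function (in a real system the questions would arrive via push notifications; nothing here is from an actual product):

```python
# Minimal sketch of the "complete the loop" idea: scheduled questions,
# immediate grading, and a running reward tally. All names are illustrative.

QUESTION_BANK = [
    {"prompt": "What is 7 * 8?", "answer": "56", "points": 1},
    {"prompt": "What gas do plants absorb?", "answer": "carbon dioxide", "points": 1},
]

def grade(question, response):
    """Immediately grade a response; return the points earned."""
    correct = response.strip().lower() == question["answer"]
    return question["points"] if correct else 0

def run_day(responses):
    """One 'day' of the loop: pose each question, grade it, accumulate reward."""
    return sum(grade(q, r) for q, r in zip(QUESTION_BANK, responses))

print(run_day(["56", "oxygen"]))  # one right, one wrong -> prints 1
```

The interesting part isn't the grading, which is trivial here, but closing the feedback loop the same day - something schools structurally can't do.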
Comment by barrenko 1 day ago
As other people have noted, asking, a.k.a <i>typing</i>, questions - especially math-type ones - is fatiguing, and there's no substitute for pen and paper and thinking hard.
KA would be better off using AI on the supply side (but heavily curated) to have more assignments, or better assignments in some sections.
But it's important to recognize KA for what it is, and it's an excellent way to have some sort of a basic curriculum, especially when self-studying, and all of the instructors have great teaching personalities, as far as I can deduce from the approach in the videos.
Comment by gcanyon 1 day ago
An AI-based education system should have embedded in it: "I am here to teach this person Geometry. Here is a list of the topics to cover, with a breakdown of steps for each, including an intro section, a study section, a test section, and the meta material to go along with it."
That would work.
Comment by brabel 1 day ago
Comment by Peritract 1 day ago
The people who work in education don't have this issue; the people who work in tech and assume that gives them expertise in education do.
Comment by brabel 1 day ago
Comment by Peritract 1 day ago
Most teachers I know would be delighted if tech companies and management stopped trying to push tools on them that aren't fit for purpose.
Comment by Ekaros 1 day ago
Comment by ericd 1 day ago
Comment by titannet 1 day ago
Comment by waynesonfire 1 day ago
The poor engagement with the KA bot becomes clear - a teaching technique is not an education system.
Comment by JimsonYang 1 day ago
To give an example, I have a friend who learned system design through Claude in order to get a job interview (and he got really good at system design), while I have another friend who copies and pastes ChatGPT responses in order to get a B on a reflection assignment.
This highlights that there is a legitimate use case for personalized learning and growth via AI - but these are the people who seek knowledge with or without AI. Whereas the majority of students actively try to do as little as possible on assignments, even if they get 0 value out of it.
Comment by cs_throwaway 1 day ago
Modern AI has made me a more productive teacher: I produce higher-quality material and have more time for research.
But the impact on most students is negative. It is another thing to engage with, which they won’t unless forced. The only way to learn is to do the work yourself. An AI tutor can get you unstuck faster, but that’s typically bad. Learning to be productively stuck on something for days without making visible progress is an important skill that most people never learn.
Comment by RealityVoid 1 day ago
I like struggling with interesting problems. But spinning your wheels is not progress. So, IMO, getting you unstuck is a generally good thing.
Comment by flexagoon 1 day ago
Comment by utopiah 1 day ago
So... yeah, it got old quick. Genuinely cool for a bit but now "we" as users just want good UX. Now give me the FAQ that I can search through then an email if it's not in there.
PS: FWIW, I do believe that, in a long-tail fashion, for the few users who are not into scripting and might not be developers (or believe they could become one), it could help with a very few, very niche use cases.
Comment by globalnode 1 day ago
Comment by kevlened 1 day ago
His hottest take is we're already close to the optimal process for learning, so technology isn't going to improve it. Learning takes work, and no technology can do the work for you.
Comment by LaFolle 1 day ago
AI is great for the curious. But it's not yet at the point where it can proactively engage with students to generate interest.
Comment by BoneShard 1 day ago
Comment by vasco 1 day ago
Comment by usrnm 1 day ago
Comment by gyomu 1 day ago
I remember an educator ranting to me a long time ago that the only data-proven ways to meaningfully improve educational outcomes were to reduce classroom size and make sure kids got enough sleep and were fed well enough; everything else was just a waste of time.
Comment by vasco 1 day ago
Comment by Espressosaurus 1 day ago
Ask teachers who have been teaching for 10 years. Ask the professors how today's kids are different from the ones of yesteryear.
The move to de-tech the classroom will eventually help out I expect, but keeping kids (and adults!!!) from using cognitive shortcuts so they can develop their own sense of what's reasonable instead of taking information from a bought-and-paid-for oracle is going to remain a problem.
Comment by suttontom 1 day ago
Dear Lord, how is this any different from Microsoft sticking Copilot or Google sticking Gemini in every single offering? They're literally saying that people aren't using the chat bot enough so they're going to force it on people inside the product.
Comment by Ozzie_osman 1 day ago
Comment by BrenBarn 1 day ago
That is a warning sign if ever there was one.
Comment by ericd 1 day ago
The biggest thing is motivation. First off, if Khanmigo requires them to type and read everything, that's going to get tiring fast for most kids. But I don't know how you could do voice in a school setting - mine uses STT/TTS, but with 20 kids in a room, it'd be chaos - STT accuracy and diarization with 2 is already really challenging.
Motivation is helped a bit by following their interest, but it seems like KA is having trouble guiding the kids when they prompt it that way. That was a pretty big issue with mine early on - the kids would talk to it for an hour about whatever topic they were interested in at the time, but it would never branch into something new.
The tutor I'm working on solves it by having a concept graph that covers a lot of learning, from the basics like math, dinosaurs, etc to other developmental topics like 6 year old boundary-pushing humor, and two LLM threads - one that handles the conversational turns, and another one in the background that strategizes and steers the conversational thread by looking at the concept graph connections and considering how ready they are for each, and then injecting steering notes into the conversational thread. Basically system 1 and system 2 thinking. And after sessions, it'll make a basic plan of where to start next time, and what might be interesting to offer up.
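A rough sketch of that two-thread idea, with a toy concept graph and the two LLM calls stubbed out as plain functions (every name here is my own illustrative assumption, not the actual implementation):

```python
# "System 1 / system 2" tutor sketch: a background strategist inspects a
# concept graph and injects steering notes into the foreground conversation.

# Concept graph: topic -> (prerequisites, related topics to bridge into)
CONCEPT_GRAPH = {
    "slay_the_spire": {"prereqs": [], "bridges": ["probability", "decision_trees"]},
    "probability":    {"prereqs": ["arithmetic"], "bridges": ["decision_trees"]},
    "decision_trees": {"prereqs": ["probability"], "bridges": ["python_basics"]},
    "arithmetic":     {"prereqs": [], "bridges": ["probability"]},
    "python_basics":  {"prereqs": [], "bridges": []},
}

def strategize(current_topic, mastered):
    """Background 'system 2' pass: pick a bridge topic whose prerequisites
    are already mastered, and emit a steering note (or None to stay put)."""
    for topic in CONCEPT_GRAPH[current_topic]["bridges"]:
        if all(p in mastered for p in CONCEPT_GRAPH[topic]["prereqs"]):
            return f"steer conversation toward {topic}"
    return None

def converse(user_msg, steering_note):
    """Foreground 'system 1' turn: in a real system, an LLM call that sees
    the steering note in its context; stubbed here as string assembly."""
    reply = f"responding to: {user_msg!r}"
    if steering_note:
        reply += f" [{steering_note}]"
    return reply

mastered = {"arithmetic"}
note = strategize("slay_the_spire", mastered)
print(converse("how do I beat this boss?", note))
```

The point of the split is that the conversational turn stays fast and reactive while the strategist, off the critical path, decides where the session should drift next.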
I mentioned this in another comment, but I've been really pleasantly surprised at the quality of the tutoring, especially when it bridges into new topics. One of my sons is really into Slay the Spire, and at different times it's used that as a launching-off point into probabilities, decision trees, Python code for the algorithms he thinks about as he's facing different enemies, and general strategies on different facets. My other son was really into sharks, which it has bridged into extinct sharks like megalodon, how scientists work out what it looked like given cartilage's lower propensity to fossilize, dinosaurs and their fossils, the K-Pg extinction event, and how food scarcity filtered for smaller animals like the ancestors of birds and our small mammalian ancestors. And a whole bunch of other topics.
It's been pretty great in that way, but my biggest open question at the moment is how to get them to engage with it on their own on a more regular basis - they go to it occasionally for random questions, but to get good coverage of that huge knowledge graph would take much more. And fundamentally, I think that human engagement still just has a number of important aspects to it that it's lacking, and I'm not sure if it's possible to replace those well enough.
Comment by anilgulecha 1 day ago
Comment by waynesonfire 1 day ago
Which explains the poor engagement you observed. To me it seems like a _technique_ I'd expect a skilled educator to deploy sparingly, in narrow use cases, when it's necessary to probe a student's interest.
Comment by ericd 1 day ago
Open to suggestions for how to improve it!
Comment by 10keane 1 day ago
I think what should be taught is the metacognitive ability - how to retrieve knowledge, how to ask the right questions toward a certain goal. Knowledge itself is easily accessible with AI. Now the difficult part is the ability to discern actual knowledge from LLM hallucination BS, and the ability to retrieve the required knowledge given a scenario.
This still requires some foundational grounding - you can't detect bullshit with zero context. But the balance shifts from memorization to retrieval, iteration, verification. Honestly, I think it is more about critical thinking and philosophy.
Comment by Peritract 1 day ago
1. It isn't
2. As you acknowledge, you need some 'foundational grounding', but the amount needed is quite a lot
3. The best way to teach metacognitive (and all other) skills is within a context
> the balance shifts from memorization to retrieval, iteration, verification
This has been trumpeted with every poorly-thought-out educational change, and it's a marker of unfamiliarity with the space. Memorisation hasn't been the focus ever; it's always about the other skills, and (some) memorisation is useful as part of that.
Comment by yabutlivnWoods 1 day ago
> “Students aren’t great at asking questions well.”
In my interactions with my kids' public school and their teachers, their goal is to ram content down students' throats and test for retention, not to foster an environment open to questions.
I had a teacher claim straight up that they don't believe the system works and are just in teaching for the benefits and summer vacation.
IMO Sal Khan's revolution hasn't happened because the adults in charge right now are ignorant and inept but incredibly vain nonetheless
Comment by hebsu 1 day ago
Comment by lancebeet 1 day ago
Is that actually true though? Average American students (especially those in the public school system) are not excellent test takers, and they're even worse at rote memorization. If this is actually the goal they're not achieving that either.
Comment by Peritract 1 day ago
Comment by yabutlivnWoods 1 day ago
Comment by tanvach 1 day ago
Comment by yabutlivnWoods 1 day ago
They could quit and free up the slot for someone who does care
Comment by krainboltgreene 1 day ago
That said, I do think it's particularly hilarious that KA's response to students not wanting to use the product is to make the product more integral to the experience.
Comment by croes 1 day ago
Who would have thought?
Comment by 6stringmerc 1 day ago
…sounds a lot like Investors versus those who actually perform “work” as it’s defined in research literature.
But I’m sure a shoe company pivoting to AI isn’t a sign of a bubble about to burst, nope.