Discussion about this post

Claudio Calligaris:

For the vast majority of students, I think that this is exactly where AI tutoring breaks down…

"The structure and discipline of in-person classrooms are important, and online platforms lack this structure. So even if they are full of brilliant content and sound pedagogical principles, they may not be as effective as in-person teaching."

dakkster:

The most obvious reason why LLMs should never be used to deliver facts to anyone is that hallucinations are mathematically impossible to avoid. The entire model is predicated on hallucination as a sort of release valve for garbage probability calculations. Reportedly, around 25% of GPT-5's answers contain hallucinations. So if you have a person who is far from an expert on a subject, which is pretty much every student, that student has no way of knowing when the LLM is hallucinating, which renders the entire concept meaningless.

8 more comments...
