things that are tried and true and set in stone.
There isn’t much new under the sun in middle school algebra/biology/earth science.
I would be interested to see an example of these kinds of basic math and science problems being answered incorrectly by GPT.
I have seen plenty of issues where people ask it to do specific computations. Its inability to do arithmetic is understandable: it is a language model, not a math model. But once it has learned a class of problems, it is plenty good at assessing whether an answer to a problem like one it has already seen is correct. And even for the arithmetic it gets wrong, you can simply ask it to solve the problem with Python code instead of answering directly, and it will succeed.
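To illustrate (the prompt and response here are hypothetical, just a sketch of the pattern): rather than trusting the model's in-context arithmetic, you ask it to emit the computation as code and run that instead.

```python
# Prompt: "What is 123456 * 789? Answer with runnable Python, not prose."
# A model asked this way would typically produce something like:
result = 123456 * 789
print(result)  # the interpreter, not the LLM, does the arithmetic
```

The point is that generating this code is a pattern-matching task the model is good at, while the actual number-crunching is delegated to an interpreter that cannot get it wrong.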
As I said, it will take some training to teach a kid how to use it correctly as a tutor, but these tools are only going to get better. Teach kids to use them and to spot their mistakes now, and they will be well positioned for the ubiquitous tools these models are becoming.