George and others. Of course, the so-called hallucinations are problems that must be kept to an absolute minimum. Unlike some people, I have a lot of patience for putting up with obvious errors and finding a way to get at the truth. How do I know the difference? I don't always, but fortunately hallucinations don't seem to correlate with difficulty! Sometimes I can ask a competing AI in a different context. Sometimes I ask another for a proof while passing it off as my own naive idea. Sometimes I do a little research. Sometimes the hallucination just makes no sense!
As far as "reasoning limitations" go, such as proving something clearly provable that I've never seen written out and am too lazy to work out on my own, giving the Notebook assistant hints from my best reasoning as I go, and asking for a step or two at a time, seems to work well so far.
Here is an example where we both just had to hold each other's hand to scratch out a proof that was sufficient for me: