I would like to create a new LLM prompt using a prompt resource definition notebook. The idea is to set it up so that the AI "assistant" is told to act as an AI Tutor and is given some teaching tools. These tools would be implemented as functions (LLM Tools in Wolfram terminology). They would let the AI tutor augment the LLM's text-only experience with all the rich visualizations Wolfram Language can produce, and also reduce hallucinations by feeding in curated data. A rough sketch of what I have in mind is below.
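For illustration only, here is a minimal sketch of the kind of setup I mean, using `LLMTool`, `LLMConfiguration`, and a programmatic `ChatObject`. The tool names, prompt text, and parameter choices are all made up for the example, and I haven't been able to verify that this works end to end in 13.3:

```wolfram
(* Hypothetical tools for an "AI Tutor": one for plotting, one for curated data *)

plotTool = LLMTool[
   {"plot_expression", "Plot a mathematical expression to show the student"},
   {"expression" -> <|"Interpreter" -> "String",
      "Help" -> "A Wolfram Language expression in x, e.g. Sin[x]^2"|>},
   Function[Plot[ToExpression[#expression], {x, -10, 10}]]
   ];

dataTool = LLMTool[
   {"country_population", "Look up curated population data instead of guessing"},
   {"country" -> <|"Interpreter" -> "Country",
      "Help" -> "Name of the country to look up"|>},
   Function[ToString@EntityValue[#country, "Population"]]
   ];

(* Configuration combining a tutor prompt with the tools *)
tutorConfig = LLMConfiguration[<|
    "Prompts" -> {"You are a patient AI tutor. Use the available tools to \
show plots and to look up real data rather than relying on memory."},
    "Tools" -> {plotTool, dataTool}
    |>];

(* Programmatic chat using the configuration *)
chat = ChatObject[LLMEvaluator -> tutorConfig];
chat = ChatEvaluate[chat, "Show me what Sin[x]^2 looks like."]
```

The open question for me is how to wire something like this into a prompt resource definition notebook / chat notebook, rather than a purely programmatic session.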
I couldn't get LLM Tools and LLM Configuration options working so far. From watching Wolfram U videos and similar material, I got the feeling this is a half-baked feature that Wolfram R&D is still actively working on. What's going on? Can someone please provide an update or an ETA? The whole chat-notebook experience in 13.3 is wonderful on one hand but not robust on the other; menu items sometimes appear and sometimes disappear.