Dear Joshua,
Thank you for your comment. You are absolutely right: there is a lot of functionality built into the Wolfram Language and ServiceConnect. This post is meant to be one of several on different LLMs. By now I have implemented functions for Llama, Mistral, and Gemini as well.
I really like the functionality within the Wolfram Language, but sometimes the "manual" method offers some advantages. When new features roll out, direct calls usually let us use them right away, including new endpoints and parameters, for example a seed so that we get consistent answers, or TTS on the day it came out. It is also quite easy to switch between API keys, say a private one and a work-related one.
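For what it's worth, the "manual" method is essentially just an HTTP POST. Here is a minimal Python sketch of building such a request with the seed parameter; the endpoint and field names follow the public OpenAI Chat Completions documentation, while the model name and seed value are placeholders you would adapt:

```python
import json

# Hypothetical sketch: a direct Chat Completions request that sets the
# "seed" parameter for (mostly) reproducible answers. The API key would
# come from whichever account (private or work) you want to use.
API_URL = "https://api.openai.com/v1/chat/completions"

def build_request(prompt, api_key, seed=42, model="gpt-4o-mini"):
    """Return (headers, body) for a direct POST to the chat endpoint."""
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "seed": seed,  # same seed -> consistent answers across calls
    })
    return headers, body

# headers, body = build_request("Hello!", my_key)
# ...then POST with urllib.request, requests, or URLRead in the Wolfram Language.
```

Swapping keys or trying a brand-new parameter then only means changing one argument or adding one field to the body, rather than waiting for built-in support.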
Also, it appears to me that the available LLMs and their features currently change so quickly that it is basically impossible to add all of that functionality into the core Wolfram Language immediately. For my part, I often cannot wait to play around with the new functionality.
I found that this is all really easy to do now, as GPT writes the functions if you show it the API documentation of a new LLM; I wanted to test how much of that can be automated simply by talking to GPT.
I do agree, though, that if you only want to use GPT, and for most users, the built-in functions will be more than enough.
Cheers,
Marco