[...] There are, however, many alternative API servers available on the internet. One of them is Mistral AI. Its models are fully open, and you can, in principle, run them locally on your own hardware. I have chosen to use them via their APIs. If you sign up, you are placed on a waiting list, but in my experience you will gain access soon. [...]
Thank you! After reading your statements above, I submitted my application to Mistral AI's waiting list and was granted access after 4-5 days. Here is a Raku package that streamlines access to Mistral AI: "WWW::MistralAI".
In many ways, Mistral AI's interface is similar to OpenAI's, so it was not that difficult to come up with the code and tests for accessing it. It was also not difficult to integrate with "LLM::Functions" and "Jupyter::Chatbook".