
Local LLMs with Llama and Wolfram Language

POSTED BY: Daniel Carvalho
3 Replies

Thanks for posting the code and examples of hooking up LLaMA with WL!

Since the LLaMA documentation says that its completions API adheres to OpenAI's ChatGPT completions API, how easy is it (or would it be) to make LLaMA access configurations for use in WL's LLM* functions (like LLMSynthesize and LLMFunction)?
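
Since the server speaks the OpenAI-style chat completions protocol, it can at least be called directly from WL while the LLM* configuration question stays open. Here is a rough sketch, assuming a hypothetical local server at http://localhost:8080 exposing /v1/chat/completions (the URL, port, and model name are assumptions, not from the original post):

    (* Sketch: call a local llama.cpp-style server's OpenAI-compatible
       chat completions endpoint directly from WL. *)
    localLlamaChat[prompt_String] := Module[{response},
      response = URLExecute[
        HTTPRequest[
          "http://localhost:8080/v1/chat/completions",
          <|
            "Method" -> "POST",
            "ContentType" -> "application/json",
            "Body" -> ExportString[
              <|
                "model" -> "local-llama",
                "messages" -> {<|"role" -> "user", "content" -> prompt|>}
              |>, "JSON"]
          |>
        ],
        "RawJSON"
      ];
      (* Extract the assistant reply from the OpenAI-style response *)
      response["choices"][[1]]["message"]["content"]
    ]

    (* Example: localLlamaChat["Summarize the Wolfram Language in one sentence."] *)

Whether the same endpoint can be registered so that LLMSynthesize and LLMFunction use it transparently is exactly the open question here.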

POSTED BY: Anton Antonov

You have earned the Featured Contributor Badge! Your exceptional post has been selected for our editorial column Staff Picks http://wolfr.am/StaffPicks and your profile is now distinguished by a Featured Contributor Badge and is displayed on the Featured Contributor Board. Thank you!

POSTED BY: Moderation Team

I am not totally sure yet how to do the integration, that is, how to tie the Mathematica & Wolfram Language notebook GUI to a local LLaMA. Those two local models can process images too; I can show an example in a future post. Wolfram technology usually integrates very well with other languages, databases, and so on. I expect that in real-world projects we will use LLMs from different sources to address different use cases.

POSTED BY: Daniel Carvalho
