Local LLMs with Llama and Wolfram Language

Posted 5 months ago

POSTED BY: Daniel Carvalho
4 Replies

I am not totally sure yet how to integrate, that is, how to tie the Mathematica / Wolfram Language notebook GUI to a local LLaMA. Those two local models can also process images; I can show an example in a follow-up post. Wolfram technology usually integrates very well with other languages, databases, and so on. I expect that in real-world projects we will use LLMs from different sources to serve different use cases.
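One minimal way to make the connection is to call the local server directly over HTTP. The sketch below assumes a llamafile or llama.cpp server is already running on localhost:8080 and exposes the OpenAI-compatible `/v1/chat/completions` endpoint; the helper name `llamaChat` and the port are my assumptions, not anything official:

```
(* Sketch: query a local llamafile/llama.cpp server from a notebook.
   Assumes the server is running at localhost:8080 with the
   OpenAI-compatible chat endpoint enabled. *)
llamaChat[prompt_String] := Module[{resp, body},
  resp = URLRead @ HTTPRequest[
    "http://localhost:8080/v1/chat/completions",
    <|
      "Method" -> "POST",
      "ContentType" -> "application/json",
      "Body" -> ExportString[
        <|"messages" -> {<|"role" -> "user", "content" -> prompt|>}|>,
        "JSON"]
    |>];
  body = ImportString[resp["Body"], "RawJSON"];
  body[["choices", 1, "message", "content"]]
]

(* Usage: llamaChat["Summarize the idea of a Wolfram notebook in one sentence."] *)
```

This keeps everything inside the notebook, so the result can be post-processed with ordinary WL functions.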

POSTED BY: Daniel Carvalho

Congratulations! You have earned a Featured Contributor Badge. Your exceptional post has been selected for our editorial column Staff Picks http://wolfr.am/StaffPicks and your profile is now distinguished by a Featured Contributor Badge and is displayed on the Featured Contributor Board. Thank you!

POSTED BY: EDITORIAL BOARD

Thanks for posting the code and examples of hooking up LLaMA with WL!

Since the LLaMA manuals say that its completions API adheres to the completions API of OpenAI's ChatGPT, how easy is it (or would it be) to make LLaMA access configurations for use in WL's LLM* functions (like LLMSynthesize and LLMFunction)?

POSTED BY: Anton Antonov
Posted 4 months ago

I have this same question. It would be very useful to use the existing LLM “plumbing” with a local llamafile.
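As a stopgap while native support is unclear, one can mimic the shape of LLMFunction on top of the local endpoint. Everything in this sketch is an assumption on my part (the name `localLLMFunction`, the port 8080, and the use of StringTemplate backquote slots); it is not the actual WL LLM plumbing:

```
(* Hypothetical stand-in for LLMFunction that posts to a local
   llamafile server's OpenAI-compatible endpoint on port 8080.
   Not WL's real LLM framework -- just a sketch of the idea. *)
localLLMFunction[template_String][args___] := Module[{prompt, resp, body},
  prompt = TemplateApply[StringTemplate[template], {args}];
  resp = URLRead @ HTTPRequest[
    "http://localhost:8080/v1/chat/completions",
    <|
      "Method" -> "POST",
      "ContentType" -> "application/json",
      "Body" -> ExportString[
        <|"messages" -> {<|"role" -> "user", "content" -> prompt|>}|>,
        "JSON"]
    |>];
  body = ImportString[resp["Body"], "RawJSON"];
  body[["choices", 1, "message", "content"]]
]

(* Usage, analogous to LLMFunction:
   f = localLLMFunction["Translate `` to French"];
   f["hello"] *)
```

The template-then-post pattern is the same thing LLMFunction does conceptually, so swapping in a real configuration later should be a small change at the call site.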

POSTED BY: Jacob Evans
