
Chat Notebooks bring the power of Notebooks to LLMs

Join us for an exclusive livestream with our special guest Theo Gray. Theo will present chat-driven Notebooks with integrated LLM functionality tomorrow, June 14, at 11 AM CST on Wolfram R&D livestreams (Twitch or YouTube)!

Theo Gray is a co-founder of Wolfram Research, science author, and co-founder of app developer Touch Press. He is the architect of the Notebook user interface for Mathematica and led the user interface group for over 20 years.

Follow our YouTube channel for the latest developments in R&D at Wolfram.


POSTED BY: Keren Garcia
6 Replies

Except that they do not work. It is deeply misleading to advertise this feature when it does not work without a paid API account. It is dishonest to hide the extra costs from users in order to generate hype for your product.

Perhaps instead of spending time on hype, Wolfram could actually provide usable documentation for its users.

And by the way, the updated interface of 13.3 is much worse than in previous versions. Either Wolfram is now targeting high-school students as its main users, or it has no idea what users need.

POSTED BY: Volodymyr Babich

I totally agree about the way the cost of using LLMs is hidden. Even though I could easily afford it (and many cannot), I am not willing to pony up real money for technology that is of absolutely no use to me -- other than as a fancy code assistant.

I also agree that there is a lot more chrome in the UI for Mathematica. Some of the functionality is useful, of course, and I am happy to have a lot of the improvements made over time. And, I guess one person's chrome is another person's essential service.

So far, it seems that most of the fancy UI stuff can be turned off, so that is good. I would rather see some of the bugs getting fixed, and some functionality that would be generally useful, such as split windows, implemented. Given limited developer resources, in my opinion, there are better things to do than replacing monospaced code text with something else.

Yay Theo!

POSTED BY: Kathryn Cramer
Posted 11 months ago

Viewing the video today.

It's great that Theo Gray is back doing development for Wolfram Research! He's clearly a natural for enhancing the Wolfram Notebook implementation with an LLM interface.

After watching the video, the biggest unanswered question: can the LLM be run on my local PC? Do Apple Silicon computers have sufficient GPU and Neural Engine computational resources to run the AI? How much pattern data is required to have the AI do its stuff? Running the LLM in a cloud computer is attractive, but running it standalone on your PC is attractive for completely different reasons.

POSTED BY: Phil Earnhardt

No. You need to connect to an LLM that runs elsewhere, and you probably need a paid account. Current models are too large to run on your computer.

POSTED BY: Kathryn Cramer

I have been wondering the same thing myself. As LLMs are currently constituted, they are almost certainly well beyond the means of even a maxed-out 'home' computer.

However, I can't help wondering whether it would be possible to create an LLM using the Wolfram Language that could run on a laptop. My current MacBook Pro has 64 GB of RAM and 2 TB of storage, and current models can have more of both, to say nothing of the Mac Studio. I first ran Mathematica (version 1.1) on an SE/30 that had 8 MB of RAM (the minimum was 2.5), and that extra RAM almost doubled the cost of the computer -- but it was well worth it. In my opinion, current hardware capacity is being underutilized by Mathematica.

Creating an LLM would be beyond what could be done without linked servers, but running one should be possible. (?)

A useful first step would be to make the current neural net functionality use Apple's hardware instead of requiring an NVIDIA GPU. Next, applying some of the multicomputational ideas from Stephen's physics project might work better than the current approach, which seems more 'brute force' than necessary.

I would not mind if the LLM 'specialized' in Wolfram Language code. Such specialization might require fewer resources and would avoid a lot of the issues with current LLMs. (In tests I've done, the LLM seems to make things up at least 50% of the time, which renders it useless for many use cases.)
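For what it's worth, the LLM functions introduced in 13.3 already let you steer a general-purpose model toward Wolfram Language output by prompting -- though this still calls an external paid service rather than a smaller specialized model. A minimal sketch, assuming an LLM service (e.g. OpenAI) is already configured; the function name `wlCoder` and the prompt wording are my own illustration:

```wolfram
(* Sketch: an LLMFunction template that asks the model for WL code only.
   The `` slot is filled by the argument when the function is applied. *)
wlCoder = LLMFunction[
  "Write only Wolfram Language code that ``. Return the code with no explanation."
];

wlCoder["sorts a list of strings by length"]
```

Even with a prompt like this, the reply still needs checking by hand, for exactly the reason above: the model can confidently return code that does not work.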

