Message Boards

Export a smaller Classify or Predict model to use in the cloud?

Posted 2 years ago
2768 Views | 9 Replies | 4 Total Likes

For LogisticRegression it is easy; I can just use the "Function" property. But in general, for both Classify and Predict, I'd like just the output function. The trained object is usually about the same size as the training data, which can be huge.

A common use case would be to train locally to create a model, then upload that model to the cloud through an API that should be quick and small and easy to use.

9 Replies

Hi Philip,

You might have luck using Compress and Uncompress to convert your ClassifierFunction to something copy-pasteable:

trainingset = {1 -> "A", 2 -> "A", 3.5 -> "B", 4 -> "B"};
c = Classify[trainingset]

c[2.3]

data = Compress[c]

d = Uncompress[data];
d[2.3]
Posted 2 years ago

I'll give it a shot, but that's not going to do better than, say, 20% compression, and if I have many gigabytes of data, the result is still too slow and too big to be useful in a cloud API function. In any case, the model size, whether it's a neural network or whatever, shouldn't be proportional to the data size. There should be some way to delete the training data from the Classify or Predict object and leave just the mechanism needed for prediction, no?

The ClassifierFunction (in an efficient form) should be much smaller than the input data. The whole idea of a classifier is to build a 'model' of the data, and a model is by definition simpler than the original... It could be that Mathematica also stores the original input data in the ClassifierFunction ;-)
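One way to test that speculation is to compare the size of the training data with the size of the resulting classifier, e.g. (toy data, purely illustrative):

data = Table[x -> If[x < 2.5, "A", "B"], {x, 0., 5., 0.001}];
c = Classify[data];
{ByteCount[data], ByteCount[c]}

If ByteCount[c] grows in proportion to the amount of training data, the classifier is likely carrying the data (or something proportional to it) along.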

Posted 2 years ago

Here is a short related SE discussion from a year or two ago:

How can I export my learned ClassifierFunction and PredictorFunction?

I am totally happy with exporting something to the cloud API; I just don't want it to be multiple gigabytes when the model per se should presumably be far smaller.

I used Compress and loaded a large model to the cloud. APIFunction could access it, but the results were too slow for production use; sometimes it took longer than 2 seconds to respond. Smaller models work fine, but they are far less accurate.
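For context, the pattern I used was roughly the following sketch (toy model; the With is the usual idiom for injecting the compressed string into the deployed code, and note that Uncompress then runs on every request, which may explain the slow responses):

c = Classify[{1 -> "A", 2 -> "A", 3.5 -> "B", 4 -> "B"}];
With[{d = Compress[c]},
 CloudDeploy[
  APIFunction[{"x" -> "Number"}, Uncompress[d][#x] &],
  Permissions -> "Public"]]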

Hi Philip,

The ClassifierFunction already has the minimal amount of data needed for prediction. The problem is that some models can be very large (e.g. NearestNeighbors or RandomForest).

The first thing you should do is set the option PerformanceGoal -> "Memory" at training time. Then you can experiment with the Method option to see if you can find a model that fits your needs.
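For example (method names and toy data purely illustrative; sizes and accuracies will differ on real data):

trainingset = {1 -> "A", 2 -> "A", 3.5 -> "B", 4 -> "B"};
small = Classify[trainingset, PerformanceGoal -> "Memory", Method -> "LogisticRegression"];
big = Classify[trainingset, Method -> "NearestNeighbors"];
ByteCount /@ {small, big}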

Thanks, Etienne

I already did that; it's not accurate enough. I split-tested all the Wolfram algorithms in Classify and Predict. The only one good enough is a neural network, which took about 30 hours to train. In the notebook the performance is acceptable; in the cloud it is way too slow.

Posted 3 months ago

It's been two years and there still doesn't seem to be a way to have a locally trained neural network model evaluated in the cloud. Is that correct?

Here's some simple sample code, everything but CloudDeploy is straight from the docs:

net = NetChain[{LinearLayer[], LogisticSigmoid}];
trained = NetTrain[net, {1->False, 2->False, 3->True, 4->True}]

trained[3.5] (* True *)

CloudDeploy[FormPage[{"n" -> "Number"}, trained[#n] &], Permissions -> "Public"] // SystemOpen

When you run it and enter 3.5 into the form, it returns errorCode "no-wolfram-engine-response", statusCode 503, and errorDetails "Unable to evaluate".

1) Am I doing something wrong? 2) Is there an ETA for when it will be possible? 3) Is there a workaround?

Thanks

Posted 3 months ago

Oh snap! This workaround by b3m2a1, using CloudExport and CloudImport, works really well: https://mathematica.stackexchange.com/a/156125
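The gist, as a rough sketch (file name and details are mine, not from the linked answer): export the trained net to a cloud object once, then import it inside the deployed function instead of embedding it:

With[{obj = CloudExport[trained, "MX", "trainedNet.mx"]},
 CloudDeploy[
  APIFunction[{"n" -> "Number"}, CloudImport[obj][#n] &],
  Permissions -> "Public"]]

CloudImport runs on every request, so for production use you'd probably want to cache the imported net.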
