I tested Classify's built-in methods on a large dataset of text-message conversations between users and business IoT devices. The neural network was by far the most accurate.
It took a long time to run, which is fine. The problem I am facing is:
After uploading the model to the cloud and creating API access, the API is painfully slow and even times out. However, when I classify samples in the notebook, it is lightning fast as always.
My code:
(* "BtNeuralNetwork" is the cloud object holding the trained classifier *)
Quiet[CloudDeploy[
  APIFunction[{"text" -> "String" -> "Please turn off the lights at 10pm."},
   Get["BtNeuralNetwork"][#text] &],
  Permissions -> "Public"]]
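For reference, I test the endpoint roughly like this (a sketch rather than my exact test code; api is just a name for the CloudObject that the CloudDeploy above returns):

(* "api" is the CloudObject returned by the CloudDeploy call above *)
api = Quiet[CloudDeploy[
    APIFunction[{"text" -> "String" -> "Please turn off the lights at 10pm."},
     Get["BtNeuralNetwork"][#text] &],
    Permissions -> "Public"]];

(* one full HTTP round trip against the deployed API; this is the call that is slow or times out *)
URLExecute[api, {"text" -> "Please turn off the lights at 10pm."}]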
I used "CloudGet" and the result was even slower. Not sure how that makes any sense.
I even tried compressing the net before deploying it, using:
(* c holds the trained classifier in my notebook session; the compressed copy is
   sent to the cloud, uncompressed there, and stored as "BtNeuralNetwork" *)
With[{c = Compress[c]},
 CloudEvaluate[Put[Uncompress[c], "BtNeuralNetwork"]]]
Nothing really makes a difference; the API is still slow to respond. When I was using the Markov method, it replied instantly in both the notebook and the cloud. The neural net is just as fast in the notebook, but not in the cloud. (Obviously, "BtNeuralNetwork" is just the arbitrary name I gave the model.) The fact that it responds correctly when it does respond tells me the model itself is working; I just can't figure out how to speed up the response time.
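To try to pin down where the time goes, I figure I can time the cloud-side Get on its own against a full API round trip, something like this (rough sketch using the names from above; api is the CloudObject from the deployment):

(* time just loading the stored net inside a cloud kernel *)
CloudEvaluate[First@AbsoluteTiming[Get["BtNeuralNetwork"];]]

(* ...versus one full API round trip from the notebook *)
First@AbsoluteTiming[URLExecute[api, {"text" -> "Please turn off the lights at 10pm."}]]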