# Convert Wolfram Language Neural Network to Keras

Suppose you want to convert a Wolfram Language neural network into a Keras one. How would you do it? With a few lines of code we can convert the simple net elements shown below (support for other layer types can easily be added).

Let's start with a function that converts a Mathematica array into a Python list literal:

```mathematica
PythonArray[L_List] :=
 Map[ToString, L, {ArrayDepth@L}] //.
  List[x__String] :> "[" <> StringRiffle[{x}, ", "] <> "]"
```

As an example:

```mathematica
PythonArray[{{1, 2}, {3, 4}}] == "[[1, 2], [3, 4]]"
```

We can also wrap it into a NumPy array constructor:

```mathematica
PythonNumPyArray[L_List] := StringTemplate["np.array(`1`)"][PythonArray@L]
```

Let's create a very simple net to test:

```mathematica
SeedRandom[5]; (* seed the generator so the initialized weights are reproducible *)
net = NetInitialize@NetChain[{5, Ramp, 13, Tanh}, "Input" -> 1];
```

which is composed of only LinearLayer and ElementwiseLayer layers.

The Keras counterpart of a LinearLayer is a Dense layer, written as:

```python
model = Sequential()
model.add(Dense(5, input_shape=(1,)))
model.layers[0].set_weights(...)
```

Hence we can parse a LinearLayer as:

```mathematica
NetParseInput[layer_LinearLayer] :=
 StringTemplate["model.add( Dense(`1`, input_shape=(`2`,)) )"][
  NetExtract[layer, "Output"], NetExtract[layer, "Input"]]

NetParseWeight[layer_LinearLayer, i_Integer] :=
 StringTemplate["model.layers[`1`].set_weights(`2`)"][i,
  PythonArray@{PythonNumPyArray@Transpose@NetExtract[layer, "Weights"],
    PythonNumPyArray@NetExtract[layer, "Biases"]}]
```

(The Transpose is needed because Keras stores the Dense kernel as input × output, while the Wolfram LinearLayer stores its weights as output × input.)

And an ElementwiseLayer as:

```mathematica
NetParseInput[layer_ElementwiseLayer] :=
 StringTemplate["model.add(Activation('`1`'))"][
  Switch[NetExtract[layer, "Function"],
   Ramp, "relu",
   Tanh, "tanh",
   _, "ERROR"]]

NetParseWeight[layer_ElementwiseLayer, ___] := Nothing
```

The code is pretty self-explanatory.
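As a quick sanity check from the Python side, the bracketed strings that `PythonArray` emits are valid Python list literals, so they can be parsed back with the standard library alone (a minimal sketch, independent of Keras; the string below is the one `PythonArray[{{1, 2}, {3, 4}}]` produces):

```python
import ast

# String produced by PythonArray[{{1, 2}, {3, 4}}] in the Wolfram code above
generated = "[[1, 2], [3, 4]]"

# ast.literal_eval safely parses the literal back into a nested Python list
parsed = ast.literal_eval(generated)

print(parsed)        # [[1, 2], [3, 4]]
print(parsed[1][0])  # 3
```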
Now we can create a function to parse the whole net:

```mathematica
NetParse[net_NetChain] := Block[{model, layer},
  model = Table[
      layer = NetExtract[net, i];
      StringRiffle[{NetParseInput@layer, NetParseWeight[layer, i - 1]}, "\n"],
      {i, Length@net}] // StringRiffle[#, "\n\n"] &;
  model = "model = Sequential()\n\n" <> model
 ]
```

Applied to the example net:

```mathematica
NetParse@net
```

it outputs:

```python
model = Sequential()

model.add( Dense(5, input_shape=(1,)) )
model.layers[0].set_weights([np.array([[-1.41198, -1.23031, 1.38467, 1.32588, -0.846079]]), np.array([0., 0., 0., 0., 0.])])

model.add(Activation('relu'))

model.add( Dense(13, input_shape=(5,)) )
model.layers[2].set_weights([np.array([[0.368951, -0.465557, 0.0588311, 0.353987, 0.415121, -0.273476, -0.493994, 0.226572, 0.189246, 0.413253, -0.0264303, 0.218976, -0.202013], [-0.4737, 0.056413, -0.509581, -0.0702224, -0.297756, -0.562088, -0.45776, -0.517487, 0.0414736, 0.446953, 0.272512, -0.181571, 0.449515], [0.543519, -0.424297, 0.531857, -0.413055, 0.395872, -0.153824, -0.0818212, -0.552472, -0.0703974, -0.36928, -0.0741213, -0.365046, 0.228673], [-0.311873, -0.309076, -0.463902, -0.54538, 0.0629959, -0.478365, -0.0507208, -0.101537, -0.267402, 0.28598, -0.542745, 0.497146, -0.0757304], [0.129907, 0.318853, -0.537684, -0.485585, 0.255933, 0.0802962, -0.288363, -0.103131, -0.230148, -0.0018353, -0.0192109, 0.184851, 0.0072636]]), np.array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])])

model.add(Activation('tanh'))
```

Load this into Python, making sure to import the required packages first:

```python
from keras.models import Sequential
from keras.layers import Dense, Activation
import numpy as np
```

We can then compare the two models. In Python:

```python
>>> model.predict([1])
array([[ 0.3266632 , -0.76046175,  0.12077561, -0.86044425,  0.55920595,
        -0.68963015, -0.17860858, -0.71611154, -0.42355815, -0.13139175,
        -0.67629176,  0.15248899,  0.21291924]], dtype=float32)
```

while in Mathematica:

```mathematica
net[1]
(* {0.326664, -0.760461, 0.120777, -0.860444, 0.559206, -0.689629,
    -0.178608, -0.716111, -0.423557, -0.131392, -0.67629, 0.152486, 0.21292} *)
```

which gives the same result (up to single-precision rounding). Other layers can easily be added following the same rationale.
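To make explicit what the generated Keras code computes, the forward pass of the converted chain (Dense, relu, Dense, tanh) can be sketched in plain Python with no dependencies. This is an illustration only: the tiny weights below are made up for the example, not the ones extracted from the net above.

```python
import math

def dense(x, kernel, bias):
    # Keras-style Dense: kernel has shape (inputs, units), which is why the
    # Wolfram LinearLayer weights are transposed before being exported
    return [sum(xi * kernel[i][j] for i, xi in enumerate(x)) + bias[j]
            for j in range(len(bias))]

def relu(x):
    return [max(0.0, v) for v in x]

def tanh(x):
    return [math.tanh(v) for v in x]

# Made-up weights for a 1 -> 2 -> 2 chain, mirroring the Dense/Activation layers
k1, b1 = [[1.0, -2.0]], [0.0, 0.5]              # Dense(2, input_shape=(1,))
k2, b2 = [[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0]   # Dense(2), identity kernel

x = [1.0]
out = tanh(dense(relu(dense(x, k1, b1)), k2, b2))
print(out)  # ≈ [0.7616, 0.0]
```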