
Help with "NetEncoder[{Scalar,...}]" deprecation warning in neural network model (Titanic data)

Posted 1 month ago

Hi everyone,

I'm developing a neural network model using the Titanic dataset, and I recently encountered the following warning:

"NetEncoder[{Scalar, ...}]" setting has been deprecated. Use "Real" or "Integer" bare port specifications instead.

In response, I tried changing the specification to "age" -> "Real", but the warning persists.

Any suggestions on how to fix or properly update the encoder settings would be greatly appreciated.

Here is my code:

titanicdata = ExampleData[{"Dataset", "Titanic"}];
titanicdata = DeleteMissing[titanicdata, 1, 2];
{trainingData, testData} = TakeDrop[RandomSample@titanicdata, 800];
Take[trainingData, 5]

net1 = NetGraph[
  {CatenateLayer[], LinearLayer[10], BatchNormalizationLayer[],
   LinearLayer[], LogisticSigmoid},
  {{NetPort["age"], NetPort["class"], NetPort["sex"]} -> 1,
   1 -> 2 -> 3 -> 4 -> 5 -> NetPort["survived"]},
  "age" -> "Scalar",
  "class" -> NetEncoder[{"Class", {"1st", "2nd", "3rd"}, "UnitVector"}],
  "sex" -> NetEncoder[{"Class", {"male", "female"}, "UnitVector"}],
  "survived" -> "Boolean"]
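For reference, this is the variant I tried with the bare "Real" port specification (as the deprecation message suggests), which still triggers the warning for me:

```wolfram
net1 = NetGraph[
  {CatenateLayer[], LinearLayer[10], BatchNormalizationLayer[],
   LinearLayer[], LogisticSigmoid},
  {{NetPort["age"], NetPort["class"], NetPort["sex"]} -> 1,
   1 -> 2 -> 3 -> 4 -> 5 -> NetPort["survived"]},
  "age" -> "Real", (* bare spec instead of the deprecated "Scalar" string *)
  "class" -> NetEncoder[{"Class", {"1st", "2nd", "3rd"}, "UnitVector"}],
  "sex" -> NetEncoder[{"Class", {"male", "female"}, "UnitVector"}],
  "survived" -> "Boolean"]
```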
POSTED BY: Sangdon Lee
3 Replies
Posted 1 day ago

Dear Giulio, Thank you for your valuable information -- I really appreciate it.

I have one more question. In my current setup, the response variable is a numerical value rather than a categorical one. I've created a simple training dataset and defined a neural network accordingly. In this case, the "survived" variable consists of real numbers, and therefore, the final layer is specified as LinearLayer[1], not an ElementwiseLayer with a LogisticSigmoid activation function.

However, I'm encountering the following error message:

Specification NetEncoder[{"Function", ...}] is not compatible with port "survived", which must be a length-1 vector of real numbers.

Do you have any suggestions for resolving this issue?

Ultimately, my goal is to develop a multilayer perceptron in which:

  • the predictors include multiple categorical and multiple numerical variables, and
  • the responses include multiple categorical and multiple numerical variables.

An example of an MLP that handles multiple categorical and numerical predictors and responses would be greatly appreciated.

I've already reviewed the example provided at the following link: https://community.wolfram.com/groups/-/m/t/1402774

Here is my code:

titanicdata02 = Dataset[{
   <|"class" -> "1st", "age" -> 61, "sex" -> "male", "survived" -> 1.0|>,
   <|"class" -> "3rd", "age" -> 15, "sex" -> "female", "survived" -> 1.0|>,
   <|"class" -> "3rd", "age" -> 15, "sex" -> "male", "survived" -> 2.0|>,
   <|"class" -> "1st", "age" -> 31, "sex" -> "female", "survived" -> 2.0|>,
   <|"class" -> "3rd", "age" -> 33, "sex" -> "male", "survived" -> 1.0|>}]

net2 = NetGraph[
  {
   CatenateLayer[],
   LinearLayer[10],
   ElementwiseLayer["SELU"], 
   LinearLayer[1] (* final output is ????? *)
   },
  {
   {NetPort["age"], NetPort["class"], NetPort["sex"]} -> 1,
   1 -> 2 -> 3 -> 4 -> NetPort["survived"]
   },
  "age" -> NetEncoder[{"Function", List, 1}],
  "class" -> NetEncoder[{"Class", {"1st", "2nd", "3rd"}, "UnitVector"}],
  "sex" -> NetEncoder[{"Class", {"male", "female"}, "UnitVector"}],
  "survived" -> NetEncoder[{"Function", List, 1}]
  ]
POSTED BY: Sangdon Lee

You could use

"survived" -> NetDecoder[{"Function", First}]

to pluck out the real number from the length-1 list produced by LinearLayer[1].
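For instance, applied to net2 from the post above (a sketch, keeping LinearLayer[1]; only the "survived" specification changes):

```wolfram
net2 = NetGraph[
  {CatenateLayer[], LinearLayer[10], ElementwiseLayer["SELU"], LinearLayer[1]},
  {{NetPort["age"], NetPort["class"], NetPort["sex"]} -> 1,
   1 -> 2 -> 3 -> 4 -> NetPort["survived"]},
  "age" -> NetEncoder[{"Function", List, 1}],
  "class" -> NetEncoder[{"Class", {"1st", "2nd", "3rd"}, "UnitVector"}],
  "sex" -> NetEncoder[{"Class", {"male", "female"}, "UnitVector"}],
  "survived" -> NetDecoder[{"Function", First}] (* unwrap the length-1 list *)]
```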

But the easiest fix would be to attach nothing to the "survived" port and simply use

LinearLayer[{}]

to output a scalar.

If you want your real number wrapped in a list (to work with CatenateLayer), the best option is probably a function encoder:

"age" -> NetEncoder[{"Function", List, 1}]
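Putting these suggestions together, here is a minimal sketch (untested; the NetTrain call and the MeanSquaredLossLayer choice are my own assumptions, not something stated earlier in the thread):

```wolfram
(* LinearLayer[{}] outputs a scalar, so no spec is attached to "survived";
   the "Function" encoder wraps each age in a length-1 vector for CatenateLayer *)
net2 = NetGraph[
  {CatenateLayer[], LinearLayer[10], ElementwiseLayer["SELU"], LinearLayer[{}]},
  {{NetPort["age"], NetPort["class"], NetPort["sex"]} -> 1,
   1 -> 2 -> 3 -> 4 -> NetPort["survived"]},
  "age" -> NetEncoder[{"Function", List, 1}],
  "class" -> NetEncoder[{"Class", {"1st", "2nd", "3rd"}, "UnitVector"}],
  "sex" -> NetEncoder[{"Class", {"male", "female"}, "UnitVector"}]]

(* train on the small example dataset defined earlier in the thread *)
trained = NetTrain[net2, titanicdata02, LossFunction -> MeanSquaredLossLayer[]]
```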
