User Portlet

Giulio Alessandrini
Discussions
You mean something like this? ``` NetGraph[ <|"norm" -> AggregationLayer[Total, 1], "divide" -> ThreadingLayer[Divide, -1]|>, {NetPort["Input"] -> "norm", {NetPort["Input"], "norm"} -> "divide"} ] ```
Hi Iuval, the third argument is used to extract training properties. To specify the probability-based cross-entropy instead of the index-based one, you must use the `LossFunction` option. Try this: ``` famTrained = NetTrain[famNet,...
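A minimal sketch of what that call might look like (`famNet` and `trainData` stand in for the net and training data from this thread):

```
(* probability-based cross-entropy: the targets are probability vectors, not class indices *)
famTrained = NetTrain[famNet, trainData,
  LossFunction -> CrossEntropyLossLayer["Probabilities"]]
```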
You can use the `LearningRateMultipliers` option to freeze layers during training. Also, could you edit your post with your current solution? It will be useful for future readers ending up here. Thanks!
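For reference, a hedged sketch of freezing a layer this way (the layer name "conv1", and `net` and `data`, are hypothetical placeholders):

```
(* a multiplier of 0 freezes a layer; the catch-all _ -> 1 trains everything else normally *)
trained = NetTrain[net, data,
  LearningRateMultipliers -> {"conv1" -> 0, _ -> 1}]
```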
The NN framework works on numerical data only.
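So symbolic inputs have to be converted numerically at the ports; a minimal sketch using a class encoder (the labels and net shape are hypothetical):

```
(* "UnitVector" turns each label into a numeric one-hot vector before it reaches the layers *)
enc = NetEncoder[{"Class", {"cat", "dog"}, "UnitVector"}];
net = NetInitialize@NetChain[{LinearLayer[2], SoftmaxLayer[]}, "Input" -> enc]
```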
The encoders and decoders are not layers but net port properties, and you need to attach them to a port. This will work as you expect: ``` dec = NetDecoder[{"Class", {"a", "b", "c"}}]; NNcompare = NetInitialize@ NetChain[{3,...
You could use `"survived" -> NetDecoder[{"Function", First}]` to pluck out the real number from the list generated by `LinearLayer[1]`. But the easiest would be to not attach anything to the "survived" port and just use `LinearLayer[{}]` ...
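A minimal sketch of the decoder variant (the input size of 2 is hypothetical):

```
(* the "Function" decoder post-processes the output; First unwraps the length-1 list *)
net = NetInitialize@NetChain[{LinearLayer[1]},
   "Input" -> 2, "Output" -> NetDecoder[{"Function", First}]];
net[{0.5, 1.2}]  (* a bare real number rather than a 1-element list *)
```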
Hi Dalila, I cannot reproduce the error you see with this simple example ``` data = ResourceData["Sample Data: Fisher's Irises"]; c = Classify[data -> "Species"] FeatureImpactPlot[c] ``` Can you share a sample of the data you are using?
Hi Haoyu, the loss is just a function that is getting minimized during training. There is nothing special about the value 0. Typically, you can define it in a way that makes 0 mean "no error", "perfect result" or a similar concept, but it's not...
Not perfect, but you could use color-based segmentation ``` mask = Binarize[ColorDetect[img, ColorsNear[Brown]], .1] ``` ![mask][1] Then use the centers of the morphological components ``` HighlightImage[img, ...
Hi Dinesh, the benchmark is measuring the performance of raw LLMs on the task of writing WL code. The notebook assistant is using [RAG][1] based on language documentation to boost the relevance of its answers so it would be like cheating! ...