Message Boards


Use NetTrain[net,f] to read in a large set of rules (Is there an example?)

Posted 8 months ago
6 Replies
2 Total Likes

Hello, is there an example of using NetTrain[net, f] to read in a large file (1.8 million lines)? The file is in text format with lines of the form


. . .

Thank You


6 Replies
Posted 8 months ago

This would be much easier and faster if each line did not have the , at the end. Strip the commas off using sed or awk, which will be much faster than doing it in Mathematica. Then:

$fileStream = OpenRead["data.txt"];
(* generator form: NetTrain[net, f] calls f with an association containing keys
   such as "BatchSize"; f returns one batch of examples per call *)
NetTrain[net, Function[ReadList[$fileStream, Expression, #BatchSize]]]
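The comma-stripping step mentioned above is a one-liner. A sketch, assuming every line ends in exactly one trailing comma; the `printf` lines just create a small stand-in for the real data.txt:

```shell
# Demo input: lines of rules, each ending in a comma (stand-in for the real data.txt)
printf 'x -> 1,\ny -> 2,\n' > data.txt
# Strip the single trailing comma from every line
sed 's/,$//' data.txt > data_clean.txt
cat data_clean.txt
```

awk would work just as well, e.g. `awk '{sub(/,$/, ""); print}'`.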

Training is probably going to take a while with such a large dataset, depending on the complexity of the model. So, if you have not already done so, you should first train and test on a random subset of the data.
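As a sketch, that random subset can also be drawn at the shell level before the data ever reaches Mathematica (assumes GNU `shuf` from coreutils; file names are placeholders, and `seq` just fabricates demo input in place of the real 1.8M-line file):

```shell
# Demo input: 100 numbered lines standing in for the full dataset
seq 1 100 > data.txt
# Draw a uniformly random sample of 10 lines for a quick train/test run
shuf -n 10 data.txt > sample.txt
```

On systems without `shuf`, `sort -R data.txt | head -n 10` is a slower alternative.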

Michel, I hope this will help if you cannot delete the , at the end:

in = Import["data.txt", "Words"];
f = ToExpression@StringTrim[#, ","] & /@ in


Posted 8 months ago


I assumed Michel is concerned that 1.8M training samples may not fit into available memory and was looking for a solution that loads samples in smaller batches. Some techniques for dealing with this are documented here.

If that is not an issue then certainly your solution is more straightforward.

Posted 7 months ago

Thank You Rohit Namjoshi

What you offered is what I needed.

Thank You

Posted 8 months ago

Thank you Okazaki-san

Very much appreciated.

Anonymous User
Posted 7 months ago

Hello Michel Mesedahl, greetings of the day, I hope all is going well. When I tried this code:

in = Import["data.txt", "Words"];
f = ToExpression@StringTrim[#, ","] & /@ in

I got the same result. You may try it with your code.

POSTED BY: Anonymous User
