Hi everyone,
I often have to write code that processes the text in many files and then exports each file's results to its own output file. It usually looks something like this:
myProcess := Block[{text, results, filePath},
  Do[
   text = Import[thisFile];
   results = stuffDoneToText[text];
   Export[filePath, results, "CSV"],
   {thisFile, theseFiles}]]
The files are small, but very often the procedure slows to a crawl. It might happen after eight files or after 128. When I get fed up and abort, I get the spinning beachball on my Mac telling me that it cannot satisfy Mathematica's demand for memory. I don't understand why. I figure that the text and results variables are being overwritten on each iteration of the Do loop, so where is the data accumulating to consume memory? Does it have to do with my use of Block? I have had the same problem with Module. I'm wondering whether I should perhaps not use either and stick with global variables instead.
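In case it helps, the Module version I tried is essentially identical to the Block one; stuffDoneToText stands in for my actual processing function, and filePath is set per file in my real code:

```
myProcessModule := Module[{text, results, filePath},
  Do[
   text = Import[thisFile];
   results = stuffDoneToText[text];
   Export[filePath, results, "CSV"],
   {thisFile, theseFiles}]]
```

Swapping Block for Module made no difference to the memory growth, which is what makes me suspect the problem is not the scoping construct itself.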
Any insights would be much appreciated,
Gregory