Hi community :-)
I have 1024 CSV files in my folder. Each CSV file represents a 16-bit image with 1024 × 1024 pixels (rows × columns).
My goal: import the images, apply ImageTrim[] to each one to remove the unnecessary parts (the same region for every image), then calculate the total power of each trimmed image and save the results.
Problem: the import takes a very long time (20-30 minutes), and the post-processing takes another 20 minutes.
Question: How can I optimize my code?
Read file names and create temporary file to save results:
SetDirectory["C:\\Users\\images"]; (*path to the folder where all the images are stored*)
NameofFiles = FileNames["*.csv"]; (*lists all CSV files in the current working directory*)
NumberofFiles = Length[NameofFiles]; (*number of files, reusing the list instead of scanning the directory again*)
file = OpenWrite["file.dat", FormatType -> OutputForm] (*opens file.dat, used later to temporarily store the results*)
Loop in which I import every single CSV file, calculate the intensity, and save it:
Do[
 ImagesCSV = Import[NameofFiles[[t]]]; (*import the CSV file as a numeric array*)
 Img = Image[ImagesCSV, "Bit16"]; (*convert the CSV data to a 16-bit image*)
 ImgTrim = ImageTrim[Img, {{150, 130}, {1000, 890}}]; (*trim the image*)
 TotalIntensity = ImageMeasurements[ImgTrim, "Total"]; (*total intensity of the trimmed image*)
 Write[file, TotalIntensity]; (*write the result to the temporary file*)
 , {t, 1, NumberofFiles}]
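One way the loop above might be sped up (a sketch, not tested against your data): skip the Image[] and ImageTrim[] construction entirely and total the trimmed region of the raw array, optionally in parallel. The row/column spans below are approximations derived from the ImageTrim box {{150, 130}, {1000, 890}} and would need to be checked against your data, since image coordinates have their origin at the bottom left while matrix rows count from the top.

```mathematica
(* Sketch: total the trimmed region directly on the raw CSV data.
   Rows 135;;895 correspond roughly to y = 130..890 in a 1024-row image
   (row = 1024 - y + 1); columns 150;;1000 correspond to x = 150..1000. *)
files = FileNames["*.csv"];
totals = ParallelTable[
   Total[Import[f, "CSV"][[135 ;; 895, 150 ;; 1000]], 2],
   {f, files}];
Export["file.dat", totals, "Table"]; (*write all results at once instead of Write[] per file*)
```

ParallelTable[] distributes the imports over the available parallel kernels; for a purely sequential version, replace it with Table[]. Writing all totals with a single Export[] also avoids keeping a stream open for the whole run.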
Now I try to close my temporary file, file.dat (I don't think it actually closes, because when I afterwards try to delete it with DeleteFile[] it doesn't work).
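If the stream opened with OpenWrite[] is never closed, the file stays locked and DeleteFile[] will fail. Closing it explicitly after the loop should fix that:

```mathematica
Close[file]; (*releases the output stream; DeleteFile["file.dat"] works after this*)
```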
The code works fine, but is very slow. Any ideas how to speed it up?
Peter
EDIT: For some reason the first step (reading the file names) is significantly faster today. Maybe I have to quit the kernel before I run the code. The loop now takes 15-20 minutes, which I think is reasonable, so today it is much faster than yesterday. Anyway, do you have ideas to improve the code in terms of speed? If not, I can live with 20 minutes of processing time.