I have created a folder action script that sorts incoming print jobs according to file type, copying the files into a job folder hierarchy. When run on jobs with about 50 files, no problems occur. However, I tested it on a worst-case scenario by dropping a job of about 700 files onto it (an entire CD). It executed the entire job, but after it finished I got an error message that said something to the effect of “Folder Action script got an error. Out of memory.” I cleared the error dialog, but each time I opened or closed that folder it popped back up. Does anyone have any insight into this and how to correct it? The script uses recursion to dig out all the files in nested folders and copies the files to the proper folders. I don’t know if it’s the recursion, the copying of 700 files, or both.
I would look at your code to see if data is piling up during the recursive routine. It might be something simple that can be overcome by periodically clearing a variable.
Thanks, Rob. I’ll look at that. I also thought that I could copy the entire job in one shot to the “Originals” folder and then simply move the files to the appropriate folders, so there is only one copy command instead of 700. Thanks again.