[socal-piggies] weird behavior with files on different filesystems

William Yardley piggies at veggiechinese.net
Fri Mar 7 11:50:04 PST 2008


I was working on a Python script today, basically my first. It's just a
simple script that iterates through some files / directories, appends a
comment with an SVN "Id" string to each file, and then sets a property on
the file. I probably could have done it more quickly in Perl / shell /
sed / whatever, but I wanted to play around with using Python instead.

My home directory (and the directory tree that I'm modifying) is on a
filesystem mounted via NFS. /tmp is on a local disk.

The program works fine *if* the source and destination files are on the
same filesystem (for example, if I do the svn co to /tmp and make the
temporary files in /tmp, or if I just name the temporary file
"filename.tmp" alongside the original).

However, if I make a unique temp file in /tmp using tempfile.mkstemp
and then copy it into place, some (but not all) of the resulting
files are 0 size, and many or all of the remaining ones are truncated,
with no trailing newline. The behavior seems to be consistent: the same
files end up 0 size, and the truncated files come out at the same sizes
each time I re-check out the tree and run the script again.

Earlier, I had some problems with closing files in the wrong place,
which was causing similar 0 size output files.
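Stripped down, the temp-file part of what I'm doing looks roughly like
this (just a sketch of the shape of it, with made-up names; the actual
script is linked below):

    import os
    import shutil
    import tempfile

    ID_COMMENT = "# $Id$\n"   # the comment line being appended

    def append_id(path):
        # Slurp the original file (they're all smallish text files).
        f = open(path)
        text = f.read()
        f.close()

        # Write the modified contents to a unique temp file (lands in /tmp).
        fd, tmp_path = tempfile.mkstemp()
        tmp = os.fdopen(fd, 'w')
        tmp.write(text + ID_COMMENT)
        tmp.close()              # close (and flush) before copying it anywhere

        # Copy the temp file back over the original, across the filesystem
        # boundary, then clean up the temp file.
        shutil.copy(tmp_path, path)
        os.remove(tmp_path)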

I put the program at:
http://veggiechinese.net/id_tags.txt

Any thoughts? Also, would it be better to read the text files in line by
line rather than sucking each one into a string? None of them are super-huge.
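Just to be concrete about what I'm asking, I mean the difference between
something like these two (made-up snippet, not the actual script):

    import sys

    path = sys.argv[1]

    # whole file at once (what I'm doing now):
    f = open(path)
    text = f.read()
    f.close()

    # vs. line by line:
    lines = []
    f = open(path)
    for line in f:
        lines.append(line)
    f.close()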

Oddly, I just ran into the same kind of problem with a Perl script that
does something similar, on a totally different machine.

w




