BUG: System Events gives wrong file size

I recently tried to determine the size of files using System Events and found an important bug: System Events gives the wrong result for files larger than 2^32 (about 2.16 GB).

Is that the size (the number of bytes in the file) or the physical size (the number of bytes in the blocks the file occupies on disk)?
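For reference, Finder’s dictionary exposes the two as separate properties, so comparing them is easy enough (a quick sketch; the variable names are mine):

set f to choose file
tell application "Finder"
	set logicalSize to size of f -- bytes in the file itself
	set onDiskSize to physical size of f -- bytes in the blocks it occupies
end tell
{logicalSize, onDiskSize}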

2^31, surely? 2.16 GB is about 2^31 bytes.

I can confirm—albeit with limited testing—that I’m observing the same behavior.
When pointed at the Mojave installer, System Events reports a negative number nowhere near the actual file size:

set isTarget to choose file -- here, the ~3.4 GB Mojave installer
tell application "System Events" to set SEsize to isTarget's size
tell application "Finder" to set Fsize to isTarget's size
{SEsize, Fsize} -- {-9.26437325E+8, 3.368529971E+9}
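Those two numbers are consistent with a signed 32-bit wraparound: adding 2^32 to the System Events figure recovers the Finder figure exactly. Continuing from the snippet above:

SEsize + 2 ^ 32 -- 3.368529971E+9, i.e. Fsize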

*Edited to reconcile variable names.

It looks like sizes are limited to the maximum size of an AppleScript integer (a signed 32-bit int, or 2^31 - 1). You’ll see in Finder’s dictionary that it declares its file sizes as a double integer, which in practice means they end up converted to reals.
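Sure enough, the value Finder returned for the installer has already been coerced (reusing Fsize from the snippet above):

class of Fsize -- real, not integer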

Strangely, here (in Catalina) AS seems intent on turning any integer value above 2^29 into a real.

2^29 - 1 is specified in the language guide as the largest expressible integer (Class Reference, integer, p. 112). Above that, compiling converts the value to a real. The same goes for negative values.
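It’s easy to see: paste these two literals into Script Editor and compile, and the second changes before your eyes (at least it does here):

536870911 -- 2^29 - 1: stays an integer
536870912 -- 2^29: recompiles as 5.36870912E+8, a real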

So it is, thanks. It’s a bit odd, because the Objective-C code to create an integer specifies SInt32, which is a full signed 32-bit integer.
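If anyone wants to poke at that layer from a script, the public equivalent is presumably NSAppleEventDescriptor’s descriptorWithInt32:, which does take a full SInt32. A quick ASObjC sketch (untested; the interesting part is what comes back):

use framework "Foundation"

set maxInt32 to 2 ^ 31 - 1 -- the full signed 32-bit maximum
set d to current application's NSAppleEventDescriptor's descriptorWithInt32:maxInt32
d's int32Value() -- does the full value survive the round trip back into AppleScript?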

Maybe it’s a dawn-of-time issue? When would that limit have been set? I was under the impression that Apple didn’t begin using Objective-C until they purchased NeXT, which was several years after AppleScript came out. That said, I have no clue where those last bits went.

The Objective-C class NSAppleEventDescriptor is just a wrapper around the old Carbon AEDesc stuff. Perhaps they use a couple of bits as flags in actual AppleScript structures, as opposed to in Apple events.

While that makes sense, it would place the matter well beyond my (very) limited knowledge.

That said, I did just spend a while rooting through Bill Cheeseman’s posts in the Versions forum, and found only two references to integer limits. One is a mention of a ‘30-bit signed integer’ and overflow on the v1.1.2 (Mac OS 8) page, which discusses a bug in v1.1.1. So this predates Carbon by several years.

The second was a reference to the ‘true range’ (i.e. ~2^29) and some of the problems encountered in AS 1.3–1.4; again, somewhat before Carbon.

Anyway, that’s as much digging as I’m going to do 🙂

Now that I think about it more, I suspect it was an optimization: AS stores values internally as 32-bit pointers, but by reserving a couple of bits as a tag, small integers can be stored directly in the pointer itself. That would also explain the numbers above: two tag bits leave 30 bits of payload, and a signed 30-bit integer gives you roughly ±2^29, which matches both the documented limit and Cheeseman’s ‘30-bit signed integer’.