Get x/y pixel coordinates of an image from a mouse click

The subject pretty much says it all. I’d like to build a small app that loads an image into an image view and then captures the coordinates of a mouse click on that image. Is there a way to do this in ASOC?

Yes. You need to make a subclass of NSImageView and change your view to that subclass. In the subclass you override -mouseDown:, which receives an NSEvent. You can then get the event’s locationInWindow() and use -convertPoint:fromView: (passing nil, i.e. missing value, for the view) to convert that point from window coordinates into the view’s own coordinates.
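
Roughly, it would look something like this in AppleScriptObjC (a minimal, untested sketch; "ClickableImageView" is just an example name, and you’d set your image view’s custom class to it in Interface Builder):

```applescript
-- Sketch of a custom NSImageView subclass in an AppleScriptObjC Xcode project.
-- "ClickableImageView" is a made-up name; assign it to the image view in IB.
script ClickableImageView
	property parent : class "NSImageView"

	on mouseDown_(theEvent)
		-- Click location in window coordinates
		set windowPoint to theEvent's locationInWindow()
		-- Convert to this view's coordinate system
		-- (missing value stands for nil, meaning window coordinates)
		set viewPoint to my convertPoint_fromView_(windowPoint, missing value)
		log viewPoint
	end mouseDown_
end script
```

From viewPoint you can then work out pixel coordinates within the image, keeping in mind that the view’s origin is at the bottom left unless the view is flipped, and that the image may be scaled or centered within the view.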

Thanks. I’m looking around for a good tutorial/example using NSImageView, anything to get me started. Probably a dumb question, but is this something I should be doing in Xcode instead of the AppleScript Editor?

Oh, sorry – yes, what I suggested would have to be done in Xcode.

You might be able to get it to work otherwise, perhaps using a scripting addition to track the mouse click.