I’ve noticed that most of the posts (if not all) about problems with URL Access scripting here have been about uploading. Well, here’s a new problem with downloading. It had been working fine for me until today, when the server my script accesses was down. The script hangs for several minutes before finally throwing an error. That’s unacceptable behavior for this particular script, for reasons I won’t bother going into. I looked in the “URL Access” scripting dictionary hoping to find some way of telling it to only try to reach the URL for a much shorter period, say 5 or 10 seconds, before giving up and moving on, but there was nothing. I tried using a try block, but that was no help either; the script still tries for just as long.
Anybody have any ideas on how to make it only try for a few seconds?
Without a snippet of your script, it’s hard to know exactly what’s going on, but can’t you just test whether the server is up before you try to download from it? (The script by Kai, further down, is one way to do that for a named server.)
Alternatively, if the server will answer pings, why not ping it before executing your script?
on idle
    -- check if the machine is running and answering pings
    set msg to ""
    set Png to ""
    set add to "192.168.1.101"
    try
        -- this takes a while if the machine is down
        set Png to do shell script "ping -c 1 " & add
    on error
        set p to number of paragraphs in Png
        if p < 5 then set msg to "Not answering pings" & return
    end try
    -- check if the server is running
    try
        set S to do shell script "curl http://" & add
    on error
        set msg to msg & "Host down"
    end try
    display dialog msg giving up after 10
    return 5 * minutes
end idle
Interesting. This returns true for apple.com, but it also returns true for the other server I’m trying to access, which is still down, so that doesn’t seem too useful. I’ll try the other solution now.
Unfortunately this suffers from the same problem as my original code: it just takes too long. What I really need is a way to make it stop trying after a few seconds. Whether it’s simply pinging the server or trying to download a file, either way it needs to stop trying after 10 seconds.
Actually, I don’t understand why there isn’t a setting for that in System Preferences/Network. There should be a field there for entering how long to keep trying to connect to a server before giving up. I realize some servers are slow to respond, but quite frankly, if a server is slow to respond I don’t like it when Safari or any app I’m using keeps trying for several minutes. I’d rather give up on the slow servers and move on. Oh well. Back to the drawing board. If I come up with something that works I’ll post it, in hopes that future users looking for a solution to the same problem might have a more fruitful search than I did. Thanks for trying, everybody.
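In the meantime, one direction I plan to try (just a sketch, untested, with a placeholder URL and file path, and assuming the file sits on a plain HTTP server) is to skip URL Access altogether and let curl handle the download, since curl takes explicit timeout switches:
-- Sketch only: the URL and destination path below are placeholders
set theURL to "http://www.example.com/somefile.txt"
set destPath to POSIX path of (path to desktop) & "somefile.txt"
try
    -- give up if no connection within 5 seconds, or if the whole transfer takes longer than 10
    do shell script "curl --fail --silent --connect-timeout 5 --max-time 10 " & ¬
        quoted form of theURL & " -o " & quoted form of destPath
on error errMsg
    -- curl exits with an error when either limit is hit, so control comes back in seconds, not minutes
    display dialog "Download gave up: " & errMsg buttons {"OK"} default button 1
end try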
The first script just checks that the server has an address - that you’re not trying for the impossible. The second (as it says within) is under the control of the ping software once the shell instruction is passed, and it’s fairly “persistent”. If you look at the man page for ping, however, you’ll see that it has a -t timeout option that sets the number of seconds it will wait before exiting. Did you play with that?
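Something along these lines, for example (just a sketch, untested, and assuming the stock BSD ping that ships with OS X, where -t is a timeout in seconds; the address is the placeholder from the earlier script):
set add to "192.168.1.101" -- same placeholder address as in the idle script
try
    -- -c 1 sends a single packet; -t 5 makes ping itself quit after 5 seconds
    do shell script "ping -c 1 -t 5 " & add
    set serverUp to true
on error
    -- no reply within 5 seconds (or an unreachable host) lands here almost immediately
    set serverUp to false
end try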
If you run Terminal.app (in Applications/Utilities) and, after the prompt, type man ping, you’ll get several screens full of explanation. To see more, just tap the space bar; eventually you’ll see [END].
If you get tired of it before then or have reached the end and want out, type “q” and you’ll be back at the prompt. You can do another lookup or:
Type “exit” and the terminal will log you out of your shell session,
then: command-q and you’re outta there.
This script by Kai will return the ip address of any URL you type in. You don’t need the http:// part, so just www.Apple.com will do.
set theURL to text returned of (display dialog "Enter a URL in the text box below:" default answer ¬
    "www.apple.com" buttons {"OK"} default button 1)
-- the following line presumes that a reasonable URL has been entered in the form given by the default - it will not correct anything else.
if theURL does not contain "http://" then set theURL to "http://" & theURL
try
    set theHost to dotted decimal form of ((theURL as URL)'s host)
    display dialog (theURL & "'s IP Address is: " & theHost) buttons {"OK"} default button 1
on error
    display dialog "No such URL" buttons {"Too Bad"} default button 1
end try
set theURL to "http://www.google.com" -- the whole thing
try
    set theHost to dotted decimal form of ((theURL as URL)'s host)
on error e
    display dialog e -- no such site found
end try
You can leave out the “on error” dialog too, since this will rarely error. You can make up the most gobbledygook URL and usually a registrar will respond, hoping you’ll register that URL with them.