A coworker needs frequent access to our webalizer data, but I can't give him the login and password.
I need to create a script that logs into our web host, gets the webalizer data, saves it to the desktop, and logs out.
I've already created a script that logs in and opens the site in Safari, but it shows the login and password in the URL field - can't have that.
I also tried a script that logs in via FTP (thereby mounting the remote folder), which gives me all the contents of the webalizer folder. This is good, but I need the script to continue by copying all of the contents of the mounted volume to a local folder (perhaps with a prompt to choose a folder), then unmount the remote volume. Ideally this would all happen without the volume actually mounting - the script would just go in and get the remote contents.
I am open to other solutions. Essentially, I just need a coworker to be able to access the web stats data anytime without me having to give him the login and password.
In your situation, I would be inclined to write a stay-open ‘on idle’ script to run on my own machine, accessing the stats every 30 minutes, say; then have your colleague give you access to a shared folder on his machine, and transfer the stats to that folder every 30 minutes. When idle, ‘on idle’ scripts don’t consume any resources (except a bit of memory) on your machine, and your colleague can surely work with periodic updates.
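A minimal sketch of such a stay-open applet (save it as an application with "Stay Open" checked; the URL, credentials, and folder path below are just placeholders):

property statsURL : "ftp://ftp.myServer.com/path/to/stats.html" -- placeholder stats URL
property sharedFolder : "/Volumes/SharedStats/" -- placeholder path to the colleague's shared folder

on idle
	-- fetch the current stats and drop them in the shared folder
	do shell script "curl -u username:password -o " & quoted form of (sharedFolder & "stats.html") & " " & quoted form of statsURL
	return 30 * minutes -- run again in 30 minutes
end idle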
In a more elaborate scenario, you could write him a script that would trigger yours for stats on demand.
Finally, if you’re willing to parse out the data required, you can use the Unix command-line tool ‘curl’ to download the page source from the server and extract the report you want him to have. That could run as a run-only AppleScript application with the password embedded in it.
Thanks. Can you direct me to any examples of the last option you suggested (using curl)?
I am a novice at AppleScript and am not sure how to implement this. I would even be OK with the first version I did, where the script just launches Safari (login and password embedded) and goes to the web stats index page, but the problem is that Safari shows the login/password in the URL.
With the curl command you can either download the file and save it, or send the contents to standard output to parse it.
Here is a sample script which shows the syntax of both commands:
property ftpserver : "ftp.myServer.com" -- ftp server
property filename : "theFile.xyz"
property ftpfile : "/path/to/" & filename -- the path to the file on the server
property server_username : "•••••" -- server username
property server_password : "•••••" -- server password
property destPath : (POSIX path of (path to desktop)) & filename -- e.g. save the file on the desktop

set servername to ftpserver & ftpfile -- ftpfile already includes the file name

-- save theFile.xyz on the desktop
set theCommand to "curl -o " & quoted form of destPath & " -u " & server_username & ":" & server_password & " " & servername

-- or

-- send the contents of theFile.xyz to standard output
set theCommand to "curl -u " & server_username & ":" & server_password & " " & servername

-- download
do shell script theCommand
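If you use the second form, the page source comes back as the result of do shell script, so you can capture it in a variable for later parsing:

set pageSource to do shell script theCommand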
Thanks, I am trying this out, though I need a little help with two things:
1. I am not sure how to script the path to the desktop so that the entire script will work on any computer.
2. I don't know how to tell the script to prompt the user to specify a folder on their desktop where the remote files should be copied, and then copy all of the contents of the remote directory to that folder.
If this can be done, then the user will have everything they need to view the web stats locally.
Sorry if I am asking too much, but I've done only the basics of AppleScripting (mostly copying and pasting from Apple's example scripts).
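A minimal sketch of those two pieces, reusing the filename, server_username, server_password and servername values from the sample script above (copying a whole remote directory is more involved, since curl fetches one URL at a time, but per file it looks roughly like this):

-- the desktop of whoever runs the script
set desktopFolder to path to desktop

-- ask the user where the stats should go, defaulting to the desktop
set statsFolder to choose folder with prompt "Choose a folder for the web stats:" default location desktopFolder
set statsPath to POSIX path of statsFolder

-- download one file from the server into the chosen folder
do shell script "curl -o " & quoted form of (statsPath & filename) & " -u " & server_username & ":" & server_password & " " & servername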
When you use curl, you are basically downloading the page source to a variable in your script as the original HTML. The next step is to parse that text using text item delimiters, and it’s tough to demonstrate how to do that without a sample of what has to be parsed. Can you provide a ‘cleansed’ sample?
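In the meantime, the general shape of the text item delimiters technique looks like this (the marker strings are invented purely for illustration):

set pageSource to do shell script theCommand -- the curl command from the sample above

-- grab everything between two hypothetical marker comments in the HTML
set AppleScript's text item delimiters to "<!-- BEGIN REPORT -->"
set chunk to text item 2 of pageSource
set AppleScript's text item delimiters to "<!-- END REPORT -->"
set reportHTML to text item 1 of chunk
set AppleScript's text item delimiters to ""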