Return input to a shell script through AppleScript

Hey, I’ve searched a lot and couldn’t find anything about this (maybe I’m using the wrong search terms?). What I’m trying to make is a script that downloads a whole imgur album without using the Terminal. I use a shell script that I found on the internet to download the album; the only thing I still need is to pass the variable to the shell script. Can anyone help me with that?

 
display dialog "Paste Imgur URL :" default answer "http://imgur.com/"
set imgurURL to text returned of result
set bashFile to path to resource "Scripts/imgur-ripper.sh"

tell application "Terminal"
	do shell script "bash " & quoted form of (POSIX path of bashFile)
end tell 

And the shell script:

Thanks!

Hi devvr,

A good search term is “passing arguments to shell script”.

Anyway, all you need to do is add parameters after the path to the script:

do shell script "/some/path/ arg1 arg2 arg3"

In your shell script use $1, $2, $3, etc. to use the parameters.
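
For example, applied to your snippet above (a minimal sketch; quoted form keeps the URL safe even if it contains shell metacharacters, and since do shell script is a Standard Additions command you don’t need the Terminal tell block at all):

set imgurURL to text returned of (display dialog "Paste Imgur URL :" default answer "http://imgur.com/")
set bashFile to path to resource "Scripts/imgur-ripper.sh"
-- append the URL as an argument; the shell script sees it as $1
do shell script "bash " & quoted form of (POSIX path of bashFile) & " " & quoted form of imgurURL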

I don’t know much about the different shells.

gl,
kel

Hello and welcome.

Please provide a sample URL for testing. And I assume you want to specify some download folder? As it stands, I guess the bash script downloads the files into the current directory.

Thanks for your replies, although it’s a bit unclear to me what you’re trying to say, kel1. I already tried that, but it wasn’t working for me.

McUsrII, here is the link: http://imgur.com/a/WLbdJ. And yes, the next step is to specify a download folder; can you help me with that as well?

Hello.

You have to insert your own path to wget.* I don’t have the --no-check-certificate option in my local version of wget, so I can’t take this any further myself.

*The reason is that AppleScript’s do shell script command doesn’t read your .profile or .bashrc; try this to be convinced:

do shell script "echo $PATH"
Anyway, here’s an attempt for you to try:

set imgurURL to text returned of (display dialog "Paste Imgur URL :" default answer "http://imgur.com/")

set thePath to POSIX path of (choose folder with prompt "Choose Download Folder")
set res to (do shell script "cd " & quoted form of thePath & "
export gallery_url=" & quoted form of imgurURL & "
wget --no-check-certificate -q  \"$gallery_url\" -O - | grep 'data-src'|cut -d\\\" -f10|while read id
do
echo theId=$id
hashid=`basename \"$id\" \"s.jpg\"`
echo \"Downloading $hashid.jpg\"
wget -q -c \"http://i.imgur.com/$hashid.jpg\"
done")


Hey,

I tried your solution, but it doesn’t seem to download anything into the folder?

Thanks!

Hey Devvr,

This is working.

You’ll have to change downLoadDir to your preferred local download directory.

You’ll also have to make sure that wget is in the PATH (in the script). You can find out where it lives by typing ‘which wget’ in the Terminal.

I’ve added /opt/local/bin and /usr/local/bin to the PATH variable in the script, but you might need to change that on your system. (This method is more convenient than using the full path to the command, in my opinion.)


set imgurURL to "http://imgur.com/a/WLbdJ"
set downLoadDir to "~/Downloads/test_imgur/"

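-- Expand a leading "~/" by hand: the directory is passed through quoted form below, which would stop the shell from expanding the tilde itself.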
if downLoadDir starts with "~/" then set downLoadDir to ¬
	(POSIX path of (path to home folder)) & text 3 thru -1 of downLoadDir

set shCMD to "
PATH=/opt/local/bin:/usr/local/bin:$PATH;
cd " & quoted form of downLoadDir & ";
gallery_url=" & quoted form of imgurURL & ";
wget --no-check-certificate -q  \"$gallery_url\" -O - \\
| grep 'data-src' \\
| cut -d\\\" -f10 \\
| while read id
	do
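		# basename strips the trailing s.jpg thumbnail suffix, leaving the bare image hash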
		hashid=`basename \"$id\" \"s.jpg\"`
		wget -q -c \"http://i.imgur.com/$hashid.jpg\"
	done
"
do shell script shCMD

Hey Devvr,

Okay. We’ve got the first script working.

It’s rather inefficient, since it calls wget once for every file to download, so let’s switch to curl (since it’s on everyone’s system already) and feed it a list of JPEGs to download successively as one job.
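
For reference, what gets piped into the second curl through -K - is just a list of curl config lines, one per image, along these lines (hypothetical IDs):

url = http://i.imgur.com/abc1234.jpg
url = http://i.imgur.com/def5678.jpg

curl reads that list from stdin and fetches everything in a single invocation, with --remote-name-all naming each file after its remote name.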

Let’s also have the script create a dated gallery download folder within your ~/Downloads/ folder and open it upon completion of the job.


set galleryURL to "http://imgur.com/a/WLbdJ"

set shCMD to "
galleryURL=" & quoted form of galleryURL & ";
downloadDir=~/\"Downloads/Gallery Download `date \"+%Y.%m.%d %H%M%S\"`/\";
mkdir -p \"$downloadDir\";
cd \"$downloadDir\";
curl -A 'Opera/9.70 (Linux ppc64 ; U; en) Presto/2.2.1' -Ls \"$galleryURL\" \\
| egrep -iv 'thumb' \\
| sed -En '/data-src/{s!^.+data-src=\"(//i.imgur.com.+).(\\.jpg).+!url = http:\\1\\2! p; }' \\
| curl -Ls --remote-name-all -A 'Opera/9.70 (Linux ppc64 ; U; en) Presto/2.2.1' -K - ;
open \"$downloadDir\";
"
do shell script shCMD

On my system I would use gsed instead of sed, because it has a case-insensitive flag and other goodies, but you’d have to install that via MacPorts or Homebrew (or build it yourself).

Perl is a viable alternative to that of course (and again is on everyone’s system already).


set galleryURL to "http://imgur.com/a/WLbdJ"

set shCMD to "
galleryURL=" & quoted form of galleryURL & ";
downloadDir=~/\"Downloads/Gallery Download `date \"+%Y.%m.%d %H%M%S\"`/\";
mkdir -p \"$downloadDir\";
cd \"$downloadDir\";
curl -A 'Opera/9.70 (Linux ppc64 ; U; en) Presto/2.2.1' -Ls \"$galleryURL\" \\
| egrep -iv 'thumb' | egrep -i 'data-src' \\
| perl -wlne 'if ( m!(//i.imgur.com.+).(\\.jpg).+!i ) { print \"url = http:$1$2\" }' \\
| curl -Ls --remote-name-all -A 'Opera/9.70 (Linux ppc64 ; U; en) Presto/2.2.1' -K - ;
open \"$downloadDir\";
"
do shell script shCMD

The Perl version will be a bit more robust, since its matching is case-insensitive. However, this type of download script is inherently fragile: it depends on the markup of the web pages you’re downloading from. Be aware of that, and be prepared to fix the script if the web devs break it in the future.