I'm trying to download multiple .mp4 files from a website. Right now I'm having to go to each individual page, go into the source code and finding the text ".mp4" and dragging the URL that ends with the mp4 file name to Downie which downloads it.
The problem with this process is that there are 30 files that I'd have to work on individually to download. Was wondering if there was a Macro I could create to automate this process?
This is much easier if you have Lynx installed, but you can do it with the standard old tools that ship with macOS.
Something like this should work:
#!/usr/bin/env zsh -f

URL=""    # put the actual page URL between the quotes

cd "$HOME/Downloads/"

curl -sfLS "$URL" \
| tr '>|<| |"|\47' '\012' \
| egrep -i '^http.*\.mp4$' \
| while read line
do
    curl -fL --remote-name "$line"
done
The URL= line is where you should put in the actual URL between the quotes.
The cd line says "change directory to ~/Downloads/". You could change this to anywhere you want the files downloaded.
The first curl line will fetch the web page (like what you'd see if you did View Source in Safari).
The tr line says: "Look through the HTML source and whenever you find a > or < or space or " or ' (that's the \47), replace it with a newline (that's the \012)."
The egrep line says "look for lines that start with http and end with .mp4" (the -i makes it ignore case). ¹
That will create a list of URLs. The while read line starts a loop which will process each line, that is, each URL, and the second curl line will download each file into ~/Downloads/.
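If you want to see what the tr and egrep stages actually produce before pointing the script at a real page, you can feed them a scrap of sample HTML. The markup below is invented purely for illustration:

```shell
#!/bin/sh
# Sample HTML (made up for illustration) with one absolute .mp4 link
# and one unrelated link.
sample='<a href="https://example.com/videos/clip1.mp4">Clip 1</a> <a href="/about.html">About</a>'

# Same two stages as the script above: tr splits the HTML into one
# token per line, then egrep keeps only the http...mp4 tokens.
printf '%s\n' "$sample" \
| tr '>|<| |"|\47' '\012' \
| egrep -i '^http.*\.mp4$'
# Prints: https://example.com/videos/clip1.mp4
```

Only the absolute .mp4 URL survives the filter; everything else on the page (tag names, attributes, link text, the non-mp4 link) is discarded.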
¹ Sometimes URLs are not given completely. For example, instead of a full URL such as https://example.com/videos/clip1.mp4 (a generic example), they might just use a relative path like /videos/clip1.mp4.
If that is the case, I'll need to know the actual details of the download page to explain how to do it.
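If the page does turn out to use relative links, one rough approach (assuming the site's base URL is known; BASE and the sample paths below are placeholders, not taken from any real page) is to prepend the base URL before handing each path to curl:

```shell
#!/bin/sh
# BASE is a placeholder -- replace it with the real site's scheme + host.
BASE='https://example.com'

# Stand-in for the tr/egrep output when links are site-relative;
# matching on '\.mp4$' instead of '^http' catches these paths.
printf '%s\n' '/videos/clip1.mp4' '/videos/clip2.mp4' \
| egrep -i '\.mp4$' \
| while read line
do
    # Build the absolute URL; the real script would then run:
    #   curl -fL --remote-name "${BASE}${line}"
    printf '%s\n' "${BASE}${line}"
done
```

The loop prints one absolute URL per input path; swapping the printf for the commented-out curl line would download them.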
Is there a URL that has all of the videos linked from it? That would be the only way to try to automate getting all of them.
I have another idea but still need help:
- I've made Downie download to 1 specific folder
- I've set up a Hazel Rule to rename the video .mp4 files as they are downloaded into the folder above
- I've created a text file with the mp4 URLs of all 30 videos
How do I create a Macro to copy and paste the URLs one by one from the text file to Downie?
Don't know how to do it with Keyboard Maestro, but here's a shell script.
Assuming that the text file with the URLs (I assume each URL is on a separate line) is on your Desktop and named filename.txt, it would work like this:
#!/usr/bin/env zsh -f

# change this to the actual folder and filename
FILE="$HOME/Desktop/filename.txt"

fgrep -i http "$FILE" \
| while read line
do
    open -a 'Downie 3' "$line"
done
But, having said that, I think you can just copy the URLs from the text file and paste them right into the Downie window, as shown here:
That's probably the best option
Thank you for all your help!