Automator: take screenshots of websites recursively


I would like to use Automator to:

1. Extract the URLs from a text file containing about 50 URLs
2. Open a URL in Firefox
3. Take a screenshot of the window
4. Close the window
5. Do it again for the remaining 49 URLs

First step: I can't extract the URLs from the text file; Automator gives me nothing when I do it.

Well, this is done now. My mistake: I had to use Get Contents of TextEdit Document before extracting the URLs.

Second thing: I don't know how to make it go through the URLs one after another.

Right now it opens all the URLs at the same time in different tabs, which makes Firefox shut down because of the number of tabs open at once. How could I make it go URL after URL?

It's the first time I've used Automator and I know nothing about AppleScript.

Any help?


There is 1 solution below.


No need for Automator; just use webkit2png, which you can install easily with Homebrew like this:

brew install webkit2png

Then put a list of all your sites in a file called sites.txt that looks like this:

http://www.google.com
http://www.ibm.com

and then run webkit2png like this:

webkit2png - < sites.txt
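
If that stdin form doesn't work with your copy of webkit2png, a minimal fallback (assuming only its basic usage of taking a single URL as an argument) is to loop over the file yourself:

# call webkit2png once per line of sites.txt
while read -r url
do
   webkit2png "$url"
done < sites.txt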

Or, if you don't like that approach, you can do something like this with just the built-in tools in OS X. Save the following in a text file called GrabThem:

#!/bin/bash
i=1                            # number the screenshots starting at 1
while read -r f
do
   echo "Processing $f..."
   open "$f"                   # opens the URL in your default browser (open -a Firefox "$f" to force Firefox)
   sleep 3                     # give the page a few seconds to load
   screencapture "${i}.png"    # capture the whole screen to a numbered file
   ((i++))
done < sites.txt

Then make it executable in Terminal (you only need to do this once) with

chmod +x GrabThem

Then run it like this in Terminal:

./GrabThem

and the files will be called 1.png, 2.png etc.
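
If numbered files are hard to match back to their URLs, here is a variant of the same loop that names each screenshot after its URL instead. This is only a sketch: the sed expression is my own, stripping the scheme and replacing awkward characters with underscores.

#!/bin/bash
# Sketch: name each screenshot after its URL rather than a counter
while read -r f
do
   name=$(echo "$f" | sed -e 's|^[a-z]*://||' -e 's|[^A-Za-z0-9._-]|_|g')
   open "$f"
   sleep 3
   screencapture "${name}.png"   # e.g. www.google.com.png
done < sites.txt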

You can see the newest files at the bottom of the list if you run:

ls -lrt

You may want to look at the options for screencapture, maybe the ones for selecting a specific window rather than the whole screen. You can look at the options by typing:

man screencapture

and hitting SPACE to go forwards a page and q to quit.
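
For instance, two options documented there that fit this script are -x, which silences the camera shutter sound, and -T, which waits a given number of seconds before capturing. Swapping them into the capture line of GrabThem might look like this (just a sketch; check that your OS X version supports both flags):

screencapture -x -T 2 "${i}.png"   # -x: no shutter sound, -T 2: wait 2 seconds before capturing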