I would like to use Automator to:
1- extract URLs from a text file containing about 50 URLs
2- open a URL in Firefox
3- take a screenshot of the window
4- close the window
5- repeat for each of the remaining 49 URLs.
First problem: I couldn't extract the URLs from the text file; Automator gave me nothing when I tried.
That part is solved now; it was my mistake, I had to use Get Contents of TextEdit Document before Extract URLs.
Second problem: I don't know how to make it work through the URLs one after another.
Right now it opens all the URLs at the same time in different tabs, which makes Firefox shut down because of the number of tabs open at once. How could I make it handle one URL at a time?
This is the first time I have used Automator, and I know nothing about AppleScript.
Any help?
No need for Automator, just use webkit2png, which you can install easily with Homebrew.
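The install command was dropped from the answer; presumably it was the standard Homebrew one:

```shell
# install webkit2png via Homebrew (assumes brew is already installed)
brew install webkit2png
```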
Then put a list of all your sites, one URL per line, in a file called sites.txt, and run webkit2png over that list.
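The file sample and command were lost from the answer; a plausible version looks like this (webkit2png takes URLs as arguments, and -F is its documented flag for keeping only the full-size image — the exact invocation in the original answer is unknown):

```shell
# example sites.txt: one URL per line
printf 'http://example.com\nhttp://example.org\n' > sites.txt

# run webkit2png on each URL in turn; -F = save only the full-size image
xargs -n 1 webkit2png -F < sites.txt
```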
Or, if you don't like that approach, you can do something like this with just the built-in tools in OS X. Save the following in a text file called GrabThem.
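The script body was lost from the answer; here is a plausible reconstruction. The loop structure, the open/sleep/screencapture sequence, and the 5-second delay are my assumptions, while the 1.png, 2.png naming follows the text below:

```shell
# write the GrabThem script: it reads URLs one per line from the file
# given as its first argument, opens each one, waits, and screenshots
cat > GrabThem <<'EOF'
#!/bin/bash
i=1
while read -r url; do
   open "$url"             # open the URL in the default browser
   sleep 5                 # give the page time to load; adjust to taste
   screencapture "$i.png"  # capture the whole screen to 1.png, 2.png, ...
   i=$((i+1))
done < "$1"
EOF
```

If you specifically want Firefox rather than the default browser, `open -a Firefox "$url"` should do it, and screencapture's -x option suppresses the shutter sound.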
Then make it executable in Terminal (you only need to do this once) with chmod, and run it in Terminal, passing the name of your URL file.
The screenshots will be called 1.png, 2.png etc. You can see the newest files at the bottom of the list if you run ls sorted by time, oldest first.
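The listing command was presumably ls with reverse time sort:

```shell
# long listing sorted by modification time, oldest first,
# so the newest files appear at the bottom
ls -lrt
```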
You may want to look at the options for screencapture, maybe the ones for selecting a specific window rather than the whole screen. You can read them by typing man screencapture, hitting SPACE to go forwards a page and q to quit.