Is it possible in a bash script to fetch a web page's content as a browser renders it (using lynx), store it in a variable (so there is only one network access to the page), and then run many greps against that variable to extract information?

I have tried things like:

```bash
content="$(lynx -dump -nolist $url')"
var1=`cat "$content" | grep myre1`
var2=`cat "$content" | grep myre2`
```

but I mess up the assignments, the quotes, the backquotes and so on.

If someone has a solution without lynx: I am looking for something like `lynx -dump`, i.e. the "browser rendering view" (not wget), and I would prefer to avoid creating a file on the system (or, if that is the only solution, how do I delete the temporary file afterwards?).

Thank you and best regards.
I fixed your code (see the end of this answer), and I also want to pass on some general tips; short runnable examples follow the list.

- Use `$()` instead of backticks. Usually there is no difference, but one great advantage of `$()` is that you can nest it. Also, in some fonts backticks look similar to single quotes, and if you paste your code into some sites it may break, so `$()` is more robust.
- Put double quotes `" "` around parameter expansions. Examples: `echo "$myvar"`, `wget "$myurl"`.
- Use `[[ ]]` instead of `[`. `[` is a `test` command located in `/bin/test` (and `/bin/[` is usually symlinked to `/bin/test`), while `[[ ]]` is bash syntax. (However, `[` is also a bash builtin now.)
- Don't use `let` for math operations; use `(( ))`. Examples: `(( a = 5 * b ))`, `echo $(( a / 20 ))`.
- Use `(( ))` for math comparisons instead of `[[ ]]`, since it allows the intuitive `<=`, `<`, `>`, `>=` operators. Example: `if (( a <= b )); then ...` instead of `if [[ $a -le $b ]]; then ...`.
- Avoid `tr` and other external utilities where a bash parameter expansion can do the same job. Example: use `${myvar^^}` instead of `echo "$myvar" | tr a-z A-Z`.
- Avoid the useless use of `cat`: instead of `cat filename | grep somestr`, use `grep somestr filename`.
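For instance, `$()` nests cleanly, inner double quotes and all; with backticks you would need backslash escaping:

```bash
# The inner substitution runs first; its quotes don't clash with the outer ones.
parent="$(basename "$(dirname "$PWD")")"
echo "$parent"
```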
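And here is what the quoting rule protects you from, using a hypothetical filename that contains a space:

```bash
file="my file.txt"
touch "$file"     # creates a single file named "my file.txt"
ls -l "$file"     # quoted: passed as one argument -- works
ls -l $file       # unquoted: split into "my" and "file.txt" -- fails
```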
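A small `[[ ]]` example; unlike `[`, it does not word-split `$name`, and it supports pattern matching:

```bash
name="John Smith"
if [[ $name == J* ]]; then   # right-hand side is a glob pattern
    echo "name starts with J"
fi
```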
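The `(( ))` forms in action, combining arithmetic and a comparison:

```bash
a=7 b=3
(( c = a * b ))          # arithmetic assignment: no let, no $ required
echo $(( c / 2 ))        # prints 10 (integer division)
if (( c >= 20 )); then   # natural operators instead of -ge
    echo "c=$c is at least 20"
fi
```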
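And the pure-bash case conversion (available since bash 4.0):

```bash
myvar="hello world"
echo "${myvar^^}"   # HELLO WORLD -- no tr subprocess needed
echo "${myvar^}"    # Hello world -- uppercases the first character only
```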
Your fixed code:
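A minimal sketch of that fix, applying the tips above (`myre1` and `myre2` stand in for your real patterns, and the URL is a placeholder):

```bash
#!/usr/bin/env bash
url="http://example.com/page"

# One network access: capture lynx's rendered view in a variable.
content="$(lynx -dump -nolist "$url")"

# Grep the variable as many times as you like -- no temporary file.
# <<< (a here-string) feeds the variable to grep's stdin.
var1="$(grep myre1 <<< "$content")"
var2="$(grep myre2 <<< "$content")"

echo "$var1"
echo "$var2"
```

Because `<<<` feeds `$content` straight to `grep`, nothing is ever written to disk, which also answers the temporary-file part of your question.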
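If you ever did need a temporary file instead, the usual pattern is `mktemp` plus a `trap`, so the file is deleted even if the script exits early:

```bash
tmpfile="$(mktemp)"              # create a unique temporary file
trap 'rm -f "$tmpfile"' EXIT     # remove it whenever the script exits
lynx -dump -nolist "$url" > "$tmpfile"
grep myre1 "$tmpfile"
```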