Hey There,
Ah, finally. Here we are. Regular readers will remember our previous entries in this series for finding your search index ranking on Google, MSN/Live and Yahoo. We've finally gotten around to Ask and, with this script, that should be that... until one or any combination of them changes their source ;)
IMPORTANT NOTE: Although this warning is on the original Google search rank index page, it bears repeating here and now. If you use wget (as we do in this script), or any CLI web-browsing/webpage-grabbing software, and want to fake the User-Agent, please be careful. Please check this online article regarding the likelihood that you may be sued if you masquerade as Mozilla.
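On that note, if you'd rather not borrow a browser's name at all, wget's --user-agent flag will happily take any string you give it, so you can identify yourself honestly (the string and address here are just placeholders):

host # wget -O - --user-agent="arank-script (admin@yourdomain.com)" "http://www.ask.com/web?q=test"

The script below goes with --user-agent=Firefox; swap in whatever you're comfortable with.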
This Ask script is most similar to our Yahoo script, except we haven't been able to get Ask to bump us, so we don't know what error message to try and capture ;)
There are at least three different ways you can call it. The most basic is:
host # ./arank www.yourdomain.com all these key words
It doesn't matter whether they're enclosed in double quotes or not. If you "really" want to get the double quote experience, you just need to backslash your double quotes:
host # ./arank www.yourdomain.com \"all these key words\"
Other ways include creating files with the URL and keyword information (same format as the command line) and feeding them to the script's STDIN:
host # cat FILE|./arank
host # ./arank <FILE
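The file just needs a URL and its search terms on each line, in the same order as on the command line (the domains and keywords below are only placeholders):

host # cat FILE
www.yourdomain.com all these key words
www.yourotherdomain.com some other key words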
Check out the image below to see the script in action. We put the DEBUG line in there so you could see how simple the pattern matching is. Don't stand too close to the screen, though. You'll ruin your eyes ;)
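If you can't make out the screenshot, you can get the same view by dropping a quick echo into the inner while-read loop, right before the grep for your URL (it's not in the script below - it's just there to show you what the pattern matching spits out for each hit):

echo "DEBUG: $line"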
And, at last, here's the script. Enjoy and have fun re-tooling it. Just be sure to double-check Ask to make sure you're not overstaying your welcome ;)
Cheers,
This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 United States License

#!/bin/bash
#
# arank - Get your Ask.com Search Ranking Index
#
# 2008 - Mike Golvach - eggi@comcast.net
#
# Creative Commons Attribution-Noncommercial-Share Alike 3.0 United States License
#
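# Need at least a URL and one search term on the command line
# (zero arguments means we read from STDIN instead)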
if [ $# -lt 2 -a $# -ne 0 ]
then
echo "Usage: $0 URL Search_Term(s)"
echo "URL with or without http(s)://"
echo "Double Quote Search If More Than 1 Term"
exit 1
fi
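# No arguments: read URL and search terms, one pair per line, from STDIN
# and re-invoke ourselves for each line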
if [ $# -eq 0 ]
then
while read x y
do
url=$x
search=$y
$0 $x "$y"
done
exit 0
else
url=$1
shift
search=$@
fi
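# Join the search terms with +'s so they can ride in the query string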
search_terms=`echo $search|sed 's/ /+/g'`
start=1
count=1
echo
echo "Searching ASK for URL $url with search terms: $search"
echo
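# Pull the approximate total result count from the first results page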
results=`wget -O - --user-agent=Firefox http://www.ask.com/web?q=${search_terms}\&qsrc=0\&o=0\&l=dir 2>&1|sed -n 's/.*Showing .* of \([0-9,]*\).*/\1/p'`
while [ $count -lt 1001 ]
do
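# Fetch the current results page, weed out Ask's own links and the ad/cache
# domains, and pull each remaining hit's URL out of its href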
wget -O - --user-agent=Firefox http://www.ask.com/web?q=${search_terms}\&qsrc=0\&o=0\&l=dir\&page=${start} 2>&1|grep http|egrep -v 'ask.com|www.google|ask.pronto.com|googleadservices.com|askcache.com'|sed '/www.askcareers.com.*Careers/,$'d|sed 's/<[^>]*href="\([^"]*\)"[^>]*>/\n\1\n/g'|sed -e :a -e 's/<[^>]*>//g;/</N;//ba'|grep http|uniq|while read line
#|grep "^http"|sed '/^http[s]*:\/\/[^\.]*\.*[^\.]*\.yahoo.com/d'|sed '/cache?ei/d'|uniq|while read line
do
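# Does this hit contain our URL?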
echo "$line"|grep $url >/dev/null 2>&1
yes=$?
if [ $yes -eq 0 ]
then
echo "Result $count of approximately " $results " results for URL:"
echo "$line"
exit 1
else
let count=$count+1
fi
done
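# The piped while loop runs in a subshell - it exits 1 when it finds our URL,
# 0 when it runs out of hits on this page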
end=$?
if [ $end -eq 1 ]
then
exit 0
else
let start=$start+1
let count=$count+10
let next_hop=$count-1
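# Wait a pseudo-random 0 to 54 seconds between pages so we don't hammer Ask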
let random=${RANDOM}/600
echo "Not in first $next_hop results"
echo "waiting $random seconds..."
sleep $random
fi
done
, Mike