Category: LINUX
2011-02-06 01:11:20
curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | sed -n "s/<title>\(.*\)<\/title.*name>\(.*\)<\/name>.*/\2 - \1/p"
Checks the Gmail ATOM feed for your account, parses it and outputs a list of unread messages.
For some reason sed gets stuck on OS X, so here’s a Perl version for the Mac:
curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | perl -ne 'print "$2 - $1\n" if /<title>(.*)<\/title>.*name>(.*)<\/name>/'
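If you only need the unread count, the feed also carries a <fullcount> element, so something like this shorter variant (same feed URL assumed as above) works too:

curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | sed -n 's/.*<fullcount>\([0-9]*\)<\/fullcount>.*/\1/p'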
awk '!x[$0]++'
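This one-liner removes duplicate lines while preserving their order, no sort required: the array x counts how often each line has been seen, and a line is printed only on its first occurrence. For example:

printf 'a\nb\na\n' | awk '!x[$0]++'

prints "a" and "b" once each.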
lynx -dump http://www.ip-adress.com/ip_tracer/?QRY=$1 | grep address | egrep 'city|state|country' | awk '{print $3,$4,$5,$6,$7,$8}' | sed 's/ip address flag //' | sed 's/My//'
I save this to bin/iptrace and run "iptrace ipaddress" to get the Country, City and State of an IP address using the ip-adress.com service.
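A minimal sketch of that wrapper script (same lookup URL assumed as above):

#!/bin/sh
# bin/iptrace -- print country, city and state for the IP address given as $1
lynx -dump "http://www.ip-adress.com/ip_tracer/?QRY=$1" | grep address | egrep 'city|state|country' | awk '{print $3,$4,$5,$6,$7,$8}' | sed 's/ip address flag //' | sed 's/My//'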
I add the following to my script to get a tinyurl of the map as well:
URL=`lynx -dump details | awk '{print $2}'`
lynx -dump tinyurl | grep "19. http" | awk '{print $2}'
7) Block known dirty hosts from reaching your machine

wget -qO - http://infiltrated.net/blacklisted | awk '!/#|[a-z]/&&/./{print "iptables -A INPUT -s "$1" -j DROP"}'
Blacklisted is a compiled list of all known dirty hosts (botnets, spammers, bruteforcers, etc.) which is updated on an hourly basis. This command fetches the list and creates the iptables rules for you; if you want them applied automatically, append | sh to the end of the command line. It is more practical to block everything and allow only specific hosts, but many people don't or can't do that, which is where this script comes in handy. For those using ipfw, a quick fix would be {print "add deny ip from "$1" to any"}. Posted in the sample output are the top two entries. Be advised that the blacklisted file itself filters out RFC1918 addresses (10.x.x.x, 172.16-31.x.x, 192.168.x.x); however, it is advisable to check/parse the list before you implement the rules.
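As a sketch of that advice (same list URL assumed), you could strip the private ranges yourself before generating and applying the rules as root:

wget -qO - http://infiltrated.net/blacklisted | egrep -v '^(10\.|172\.(1[6-9]|2[0-9]|3[01])\.|192\.168\.)' | awk '!/#|[a-z]/&&/./{print "iptables -A INPUT -s "$1" -j DROP"}' | sh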
8) Display a list of committers sorted by the frequency of commits

svn log -q | grep '|' | awk '{print $3}' | sort | uniq -c | sort -nr
Use this command to list each committer with a count of their commits, most active first.
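For a git repository, the built-in shortlog gives a similar summary (run inside the repo):

git shortlog -s -n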
9) List a Facebook user's friends from the site's JSON feed

lynx -useragent=Opera -dump '' | gawk -F'\"t\":\"' -v RS='\",' 'RT{print $NF}' | grep -v '\"n\":\"' | cut -d, -f2
There's no need to be logged in to Facebook. I could do more JSON filtering, but you get the idea…
Replace u=4 (Mark Zuckerberg, Facebook creator) with desired uid.
Hidden or not hidden… scary, isn't it?
11) List recorded form fields of Firefox

cd ~/.mozilla/firefox/ && sqlite3 `cat profiles.ini | grep Path | awk -F= '{print $2}'`/formhistory.sqlite "select * from moz_formhistory" && cd - > /dev/null
When you fill in a form with Firefox, it suggests what you entered in previous forms with the same field names. This command lists everything Firefox has recorded. Using a "delete from" statement, you can remove annoying Google queries, for example ;-)
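For instance, Google's search box is a field named q, so something like this clears those entries (a sketch; <profile> stands in for your actual profile directory):

sqlite3 ~/.mozilla/firefox/<profile>/formhistory.sqlite "delete from moz_formhistory where fieldname = 'q';"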
12) Brute force discovery

sudo zcat /var/log/auth.log.*.gz | awk '/Failed password/&&!/for invalid user/{a[$9]++}/Failed password for invalid user/{a["*" $11]++}END{for (i in a) printf "%6s\t%s\n", a[i], i|"sort -n"}'
Show the number of failed login attempts per account; accounts that do not exist are marked with *.
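If your system keeps /var/log/btmp, lastb offers a rough cross-check of the same data (a sketch):

sudo lastb | awk '{print $1}' | sort | uniq -c | sort -nr | head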
13) Archive the current directory to a tgz with a progress bar

tar -cf - . | pv -s $(du -sb . | awk '{print $1}') | gzip > out.tgz
What happens here: we tell tar to create ("-c") an archive of all files in the current dir "." (recursively) and output the data to stdout ("-f -"). Next we give pv the total size ("-s") of all files in the current dir: du -sb . | awk '{print $1}' returns the number of bytes in the current dir, and that gets fed as the "-s" parameter to pv. Finally we gzip the stream and write the result to out.tgz. This way pv knows how much data is still left to be processed and can show, for example, that it will take another 4 mins 49 secs to finish.
Credit: Peteris Krumins http://www.catonmat.net/blog/unix-utilities-pipe-viewer/
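pv works in the other direction as well; a quick sketch for unpacking the same archive with a progress bar:

pv out.tgz | tar -xzf -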
16) Print lines longer than 72 characters

awk 'length>72' file
alias busy='my_file=$(find /usr/include -type f | sort -R | head -n 1); my_len=$(wc -l < $my_file); let "r = $RANDOM % $my_len" 2>/dev/null; vim +$r $my_file'
This makes an alias for a command named 'busy'. The 'busy' command opens a random file in /usr/include at a random line with vim. Drop this in your .bash_aliases and make sure that file is sourced from your .bashrc.
19) Number each line and its fields

awk '{print NR": "$0; for(i=1;i<=NF;++i)print "\t"i": "$i}'
Numbers each line and each of its fields. This is really useful when you are about to parse something with awk but aren't sure exactly where to start.
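For instance:

echo "foo bar baz" | awk '{print NR": "$0; for(i=1;i<=NF;++i)print "\t"i": "$i}'

prints the whole line numbered 1, then foo, bar and baz as fields 1 through 3.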
21) Browse system RAM in a human readable form

sudo cat /proc/kcore | strings | awk 'length > 20' | less
This command lets you see and scroll through all of the strings stored in RAM at any given time. Press the space bar to see more pages, or use the arrow keys, etc.
Sometimes, if you didn't save a file you were working on, or want to get back something you closed, it can be found floating around in here!
The awk command only shows lines that are longer than 20 characters (to avoid seeing lots of junk that probably isn’t “human readable”).
If you want to dump the whole thing to a file, replace the final '| less' with '> memorydump'. This is great for searching through many times (and with the added bonus that it doesn't overwrite any memory…).
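A sketch of that workflow (the search term is only an example):

sudo cat /proc/kcore | strings | awk 'length > 20' > memorydump
grep -i 'searchterm' memorydump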
Here's a neat example that turns up conversations held in Pidgin (it will probably still work after Pidgin has been closed)…
sudo cat /proc/kcore | strings | grep '([0-9]\{2\}:[0-9]\{2\}:[0-9]\{2\})'

(Depending on your sudo settings, it might be best to run sudo su first to get a # prompt.)
22) Monitor open connections for httpd, including listen state, counted and sorted per IP

It's not my code, but I found it useful for seeing how many open connections a machine has per client IP, so you can debug connections without opening another http connection to do it.
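A common pipeline to this effect (a sketch, assuming httpd listens on port 80):

netstat -plan | grep :80 | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -nk1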
23) Purge all configuration files of removed packages

sudo aptitude purge `dpkg --get-selections | grep deinstall | awk '{print $1}'`

Packages removed without purging are left in the "deinstall" state with their configuration files still on disk; this feeds their names to aptitude purge so the leftovers go too.
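To review the candidates before purging, run the inner pipeline on its own:

dpkg --get-selections | grep deinstall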
24) Quick glance at who's been using your system recently

This command takes the output of 'last', removes empty lines, keeps just the first field (the username), sorts the usernames in reverse order and then gives a summary count of unique matches.
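Put together, that description corresponds to something like:

last | grep -v '^$' | awk '{print $1}' | sort -r | uniq -c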
25) Number of open connections per IP

BSD version:
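A sketch, assuming BSD netstat's address.port notation (the port is the last dot-separated component of the foreign address):

netstat -an | grep ESTABLISHED | awk '{print $5}' | sed 's/\.[0-9]*$//' | sort | uniq -c | sort -nr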