Unix Ground
A unique blog for Unix and Linux tips, tricks, and shell scripts. This is intended to be a one-stop information center for all your Unix and Linux needs.
Wednesday, June 9, 2010
Bourne shell redirection
The standard file descriptors:
0 is stdin;
1 is stdout;
2 is stderr.
Redirect stderr to a file:
$ command 2>file
Redirect a command's output to stderr:
$ command >&2
Redirect both stdout and stderr to a file or /dev/null:
$ command >file.log 2>&1
$ command >>file.log 2>&1   (append instead of overwrite)
$ command >/dev/null 2>&1
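A quick, self-contained demonstration of these redirections (the file names here are only for the demo):

```shell
# One command that writes to both stdout and stderr.
( echo "normal output"; echo "error output" >&2 ) >out.log 2>err.log

# out.log now holds only stdout; err.log holds only stderr.
cat out.log
cat err.log

# Merge both streams into one file. Order matters: 2>&1 must come
# after >all.log, so that fd 2 is duplicated onto the already-redirected fd 1.
( echo "normal output"; echo "error output" >&2 ) >all.log 2>&1

# Discard everything.
( echo "noise"; echo "more noise" >&2 ) >/dev/null 2>&1

rm -f out.log err.log all.log
```

Note that `2>&1 >file.log` (the wrong order) would send stderr to the terminal, not the file.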
Some cool one-liners:
Sum the numbers in the first column of a file:
$ awk '{ tot += $1 } END { print tot }' filename
Print only certain lines of a file, e.g. lines 20-30:
$ sed -n '20,30p' filename
Search all files in the current directory and below for the string xyz:
$ find . -type f -exec egrep xyz {} \; -print
Remove all files named ".log.tmp":
$ find . -name .log.tmp -exec rm {} \;
Search a file for any of three strings:
$ egrep 'abc|xyz|or' filename
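The awk and sed one-liners above can be sanity-checked on a throwaway file (names here are only for the demo):

```shell
# Build a sample file with the numbers 1..5, one per line.
printf '%s\n' 1 2 3 4 5 >nums.txt

# Sum the first column. awk reads the file directly; no cat needed.
awk '{ tot += $1 } END { print tot }' nums.txt

# Print only lines 2 through 4 of the file.
sed -n '2,4p' nums.txt

rm -f nums.txt
```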
Wednesday, January 20, 2010
Sexy Unix Commands
The following are the sexy commands:
date;
unzip;
touch;
strip;
finger;
mount;
gasp;
yes;
uptime;
Wednesday, May 6, 2009
removing duplicate files using shell script
Friends,
The one-liner below can be used to identify duplicate files on the system. To search a particular path, just substitute it for the "/tmp" in the sample command. It writes the duplicate file names (as commented-out rm commands) into removal_list.txt, which can then be reviewed and used to delete them.
find /tmp "$@" -type f -print0 | xargs -0 -n1 md5sum | sort --key=1,32 | uniq -w 32 -d --all-repeated=separate | sed -r 's/^[0-9a-f]*( )*//;s/([^a-zA-Z0-9./_-])/\\\1/g;s/(.+)/#rm \1/' >> removal_list.txt
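The core idea (group files by their MD5 checksum, then keep only groups with more than one member) can be checked on a small scale like this; the file names are invented for the demo:

```shell
# Two identical files and one different file.
printf 'hello\n' >a.txt
printf 'hello\n' >b.txt
printf 'world\n' >c.txt

# md5sum prints "<32-hex-char hash>  <name>"; sorting and then keeping
# only lines whose first 32 characters repeat leaves just the duplicates,
# grouped and separated by blank lines.
md5sum a.txt b.txt c.txt | sort | uniq -w 32 -d --all-repeated=separate

rm -f a.txt b.txt c.txt
```

Here a.txt and b.txt are printed together as one duplicate group, while the unique c.txt is dropped.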
Performance comparison
Checkout the "time" command.
To measure the execution time of any command, do the following.
Say you want to run "ls -l | grep sanju.txt" and measure how long it takes.
Open a bash shell and, at the prompt, type:
$ time ls -l | grep sanju.txt
It shows you the time taken by the command's execution.
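For example, in bash `time` is a shell keyword, so it can time a whole pipeline; the statistics go to stderr while the command's own output still goes to stdout (timing.txt here is just a demo name):

```shell
# Time an entire pipeline; real/user/sys are printed on stderr.
time ls -l | grep '\.txt'

# To capture only the timing output, redirect stderr of a group:
{ time sleep 1 ; } 2>timing.txt
cat timing.txt
rm -f timing.txt
```

Because the timing goes to stderr, the timed command's stdout can still be piped or captured normally.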
Tuesday, May 5, 2009
Unaliasing in BASH
How can we bypass an alias temporarily, without using the unalias command?
Just put the escape character "\" before the command, e.g.:
$ \ls
It will run ls without the alias.
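A quick illustration, meant for an interactive bash prompt (the alias here is invented for the demo); `command` is another standard way to get a one-off bypass:

```shell
# Define an alias that changes ls behaviour.
alias ls='ls -l'

# Backslash suppresses alias expansion for this one invocation only.
\ls

# 'command' also skips aliases (and shell functions) for one call.
command ls

# Remove the alias for the rest of the session.
unalias ls
```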