My Favorite Linux Commands
There are lots of articles on the web about Linux commands. Here I have listed some of my favorites. These commands are a bit more advanced than the ones covered in typical blogs, but they are very useful.
- Start a service in CentOS/Red Hat 7
systemctl start httpd
- Enable a service to start at boot in CentOS/Red Hat 7
systemctl enable httpd
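To confirm that the service is running and enabled at boot, you can use the standard systemctl subcommands:
systemctl status httpd
systemctl is-enabled httpd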
- Find all files matching a pattern
find /home/mdn2000/ms-h/bin -name "core*"
Description: The above command finds all files prefixed with "core" under /home/mdn2000/ms-h/bin and prints them on the screen.
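If the directory tree may also contain subdirectories whose names start with "core", you can restrict the match to regular files with the standard -type flag:
find /home/mdn2000/ms-h/bin -type f -name "core*"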
- Find all files matching a pattern and older than a specific time
find /home/mdn2000/ms-h/bin -name "core*" -ctime +7
Description: The above command finds all files prefixed with "core" that are older than 7 days (based on status change time, which is what -ctime checks).
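If you want to match on the last modification time instead of the status change time, use -mtime, e.g.:
find /home/mdn2000/ms-h/bin -name "core*" -mtime +7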
- Find all files matching a pattern and older than a specific time, then move them to a specific directory
find /home/huawei/mdn2000/ms-h/bin -name "core.hmsserver*" -ctime +7 -exec mv {} /home/hms/data/c/core_backup/ \;
Description: The above command finds all files prefixed with "core.hmsserver" that are older than 7 days and moves them to the specified backup directory.
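Before running a destructive action like mv, it is wise to do a dry run by printing the matches first. With GNU find and mv you can also use -exec ... + together with mv -t to move the files in batches rather than one at a time:
find /home/huawei/mdn2000/ms-h/bin -name "core.hmsserver*" -ctime +7 -print
find /home/huawei/mdn2000/ms-h/bin -name "core.hmsserver*" -ctime +7 -exec mv -t /home/hms/data/c/core_backup/ {} +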
- Compress and archive all files of a specific pattern
i) tar -czvf test.tar.gz *.unl
ii) tar -czvf test.tar.gz prm*
Description: The first command will compress and archive all files with the ".unl" file extension. The second will compress and archive all files prefixed with "prm". Both use gzip compression, which offers a good balance of speed and compression ratio.
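To check what went into the archive without extracting it, list its contents with the -t flag:
tar -tzvf test.tar.gz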
- Uncompress and Untar
tar -xzvf test.tar.gz
Description: The above command will untar and uncompress the files that were previously archived and compressed into test.tar.gz by the "tar -czvf" command.
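If you want to extract somewhere other than the current directory, tar's -C flag changes into the target directory first (the directory must already exist):
mkdir -p /tmp/extracted
tar -xzvf test.tar.gz -C /tmp/extracted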
- Check how many HTTP processes are running
# ps -ylC httpd | wc -l
Description: The above command shows how many httpd processes are running. When more requests come to the web server (Apache, Nginx, etc.), it spawns more processes, and more processes consume more memory and CPU.
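Keep in mind that ps prints a header line, so the count above is off by one. An alternative that counts only the matching processes is pgrep:
pgrep -c httpd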
- Check the total and average memory consumption of a process
For the httpd process:
# ps -ylC httpd | awk '{x += $8;y += 1} END {print "Apache Memory Usage (MB): "x/1024; print "Average Process Size (MB): "x/((y-1)*1024)}'
Apache Memory Usage (MB): 284.121
Average Process Size (MB): 10.523
For the mysqld process:
# ps -ylC mysqld | awk '{x += $8;y += 1} END {print "MySQL Memory Usage (MB): "x/1024; print "Average Process Size (MB): "x/((y-1)*1024)}'
MySQL Memory Usage (MB): 15840.1
Average Process Size (MB): 15840.1
Description: The above two commands output how much total memory is consumed by the Apache and MySQL servers, as well as the average memory consumption per process. Apache spawns many httpd processes, while MySQL runs a single mysqld process, which is why its total and average are identical.
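For readers wondering about the (y-1): the awk script also reads the ps header line, which adds nothing to the sum (field 8, the RSS column in KB, is non-numeric there) but does increment the counter, so one is subtracted when computing the average. An equivalent sketch that skips the header explicitly:
ps -ylC httpd | awk 'NR>1 {x += $8; y += 1} END {print "Total (MB): " x/1024; print "Average (MB): " x/(y*1024)}'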
- Check which package is required for a command with yum on CentOS/Red Hat
[root@localhost ~]# yum whatprovides netstat
Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
 * base: centosmirror.go4hosting.in
 * extras: centos.ustc.edu.cn
 * updates: centos.ustc.edu.cn
net-tools-2.0-0.17.20131004git.el7.x86_64 : Basic networking tools
Repo        : @base
Matched from:
Filename    : /usr/bin/netstat
Description: The above output shows that to get the netstat command (if it is not already installed on your Linux server), you need to install the net-tools package. So the next command will be
# yum install net-tools
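On newer Red Hat-family systems where dnf has replaced yum, the equivalent query is:
dnf provides netstat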
- Open a tcp/udp port in the firewall on Red Hat 7/CentOS 7
Allow the port through firewalld:
firewall-cmd --permanent --add-port=21/tcp
And reload the firewall:
firewall-cmd --reload
Description: Here we open FTP port 21 so that remote hosts can get FTP access. The --permanent flag writes the rule to the permanent configuration rather than the running one, which is why a reload is needed for it to take effect immediately.
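To verify the rule took effect after the reload, list the currently open ports:
firewall-cmd --list-ports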
- Split a big file into smaller ones by line count
split -l 200000 -d NID_formatted.txt nid
Description: The file NID_formatted.txt contains 2,000,000 lines. The split command will divide it into 10 files of 200,000 lines each. The -d option adds a numeric suffix (length 2), and nid is the prefix, so the resulting files will be named nid00, nid01, nid02 ... nid09. If you do not use the -d option, the suffix will be alphabetic instead, generating files like nidaa, nidab, and so on.
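A quick way to verify the split, and to reassemble the pieces later, is shown below; since the shell expands nid* in sorted order, the numeric suffixes keep the lines in their original sequence (NID_rejoined.txt is just an arbitrary output name):
wc -l nid*
cat nid* > NID_rejoined.txt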