10 Feb 2014 on web and iot

Starting commands for DevOps

20 *nix commands that I used most often when getting started with DevOps.

In the past month, I was immersed in DevOps, starting up various EC2 and RDS instances in AWS and trying to deploy my app, consisting of a Rails API server, an AngularJS frontend and a PostgreSQL database, in various environments such as staging and production.

It was a challenging time for me as I needed to have a systems view instead of just a single application's view. Yet, firing up instances on different operating systems also gave me some insight into their similarities and differences.

Below is the list of the 20 commands I used most. Why is noticing the most used commands important? Because ultimately we can include them in scripts and build automation for various DevOps tasks.

Here we start!


Logging into the Server

1. ssh-keygen generate ssh keys

Even before we log in to the remote server, we generate a pair of public and private ssh keys. Ultimately, the contents of the public key *.pub can go into GitHub SSH keys or the remote computer's ~/.ssh/authorized_keys file.

$ ssh-keygen -t rsa -f project-name -C "name@email.com"
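
Once the pair exists, the public key still has to reach the remote server before key-based login works. One way, assuming a placeholder user@hostname, is to append project-name.pub to the server's ~/.ssh/authorized_keys; ssh-copy-id is a shortcut that does the same on systems that ship it:

$ cat project-name.pub | ssh user@hostname 'cat >> ~/.ssh/authorized_keys'
$ ssh-copy-id -i project-name.pub user@hostname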

2. ssh secure shell

Next, to log in there are various ways of using the ssh command. The 3 common ways that I have used are:

  1. with a config file at ~/.ssh/config

Host projectname
      HostName website.com
      Port 8000
      IdentityFile ~/.ssh/sshkey
      IdentitiesOnly yes
      User username
  

And now we can ssh into the server with a much simpler command

$ ssh projectname

  2. ssh with a username and a key file

$ ssh -i mykey.pem user@hostname

  3. ssh directly into a particular directory

$ ssh projectname -t "cd /path/to/app ; /bin/bash"


Transferring files

3. scp secure copy

Secure copy scp comes in handy for transferring files from the local machine to the remote machine and vice versa.

  1. from local to remote

$ scp -i /path/to/key.pem /path/to/local/file user@hostname:~/path/to/destination

  2. from remote to local

$ scp -i ~/.ssh/key.pem user@hostname:~/path/to/source/file /path/to/local/destination

4. cp copy

To copy files or entire folders:

$ cp /path/to/source-folder/filename /path/to/destination-folder
$ cp -r /path/to/source-folder/folder /path/to/destination
$ cp /path/to/source-folder/filename . # dot: destination is current folder

5. mv move

Moving files or folders with mv is also how we rename them.

$ mv /path/to/source-folder/filename . # move file to current directory
$ mv /path/to/source-folder . # move folder to current directory

6. echo output

To print out text and, with redirection, write or append the output to a file.

$ echo hello
$ echo `date` > currenttime.txt # write to file, overwriting existing content
$ echo `date` >> currenttime.txt # append to file

7. touch update a file

When used without any options, touch simply creates the file if it does not exist, or updates its modification time if it does.

$ touch filename1 filename2 filename3

8. ln symlink

Symbolic links are useful when we want the exact same file contents to be available and kept in sync in 2 different places. This can be handy for executables too, as shown in the example after the command below.

$ ln -s /path/to/source/filename /path/to/destination/filename
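
For instance, linking a deploy script into a folder on the PATH (the script path here is a made-up placeholder) lets us run it from anywhere by name:

$ ln -s /path/to/app/bin/deploy.sh /usr/local/bin/deploy # now `deploy` works from any directory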

File contents

9. head top of a file

head is very useful for quickly viewing the first few lines of a file, with the -n option specifying how many lines from the top.

$ head -n 5 /path/to/filename

10. tail end of a file

tail, as the name suggests, is the opposite of head and shows the last lines of a file. With the -f option, the output keeps updating as the file grows, which is handy for continuously watching server log files as changes happen.

$ tail -n 5 /path/to/filename
$ tail -f log/production.log

11. cat see the contents of a file

cat is a simple way to view the file contents, but it can also be used to concatenate a few files into a new file.

$ cat /path/to/filename
$ cat file1 file2 > newfile # newfile has the contents of file1 and file2

Permissions

12. chmod change permissions

File permissions control the ability to read, write or execute a file for the owner, the group and others. chmod can change these permissions with either a number or a symbolic expression, as sketched after the examples below.

$ chmod u+x /path/to/script # owner (u) can execute
$ chmod 600 ~/.ssh/authorized_keys # owner can read and write, nobody else has access
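
The numeric form adds up read (4), write (2) and execute (1) for each of owner, group and others, so the two styles above can be mixed freely, for example:

$ chmod 755 /path/to/script # owner: rwx (4+2+1), group and others: r-x (4+1)
$ chmod u+x,go-w /path/to/script # symbolic form: add execute for owner, remove write for group and others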

13. sudo act as root

sudo gives permission to run commands as the root user. It can be used for a single command or a whole root session. It should definitely be used with caution, as the root user can execute things a normal user cannot.

$ sudo !! # execute the previous command with sudo
$ sudo cp /path/to/source /path/to/destination # execute as root
$ sudo su # switches from current user to root

14. chown own the folder/file

To quickly change the owner of a file or folder, chown comes in handy in various situations.

$ sudo chown -R username:username ~/path/to/folder

Directory

15. cd change directory

Changing directories is one of the most common tasks. Here are some ways to use cd:

$ cd ~ # go to home directory
$ cd - # go to the last visited directory
$ cd ../path/to/folder # go to relative directory one level up
$ cd /path/to/folder # absolute path

16. ls list files/folders

Another common command is ls to list the directory contents. Using ls with these 3 options shows dotfiles as well, in a long, human-readable format, giving a comprehensive look inside a folder.

$ ls -lah . # current folder
$ ls -lah ~ # home folder
$ ls -lah /path/to/folder # absolute path to a folder

Monitoring processes

17. ps running processes

ps is another useful command for seeing which processes are currently running. Often the list will be long, so piping the output to grep to search for a term is useful.

$ ps aux | grep unicorn # find all unicorn processes

The command ps aux gives a long list of useful information about each process, including the user, CPU usage and the path to the command, as well as the Process ID (PID), which comes in handy if we want to stop a particular process.
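
Since the grep command itself also shows up in that list, one common sketch (assuming awk, which most *nix systems have) filters it out and prints just the PID column:

$ ps aux | grep unicorn | grep -v grep | awk '{print $2}' # PIDs of the unicorn processes only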

18. kill terminate a process

Often, we want to find a particular process and kill it so that we can make some changes and then run it again. Coupled with the previous command ps, which gives the PID, we can kill that process with Unix signal 9. Here's an example:

$ ps aux | grep jekyll
sayanee  1835   0.0  0.8  2509028  68756 s003  S+    7:59PM   1:00.64 ruby /Users/command
$ kill -9 1835

19. lsof list open files

At other times, knowing which process is running on a particular port is also useful. For this, lsof comes in handy!

$ sudo lsof -i:8080 # complete info on the process
$ sudo lsof -t -i:8080 # just the PID of the process
$ kill -9 `sudo lsof -t -i:8080` # feed the PID straight to kill

20. crontab cron jobs for scheduling

cron jobs are helpful for scheduling scripts to run at a certain time or on an interval. Sometimes cron jobs are created by a backend application using application-specific variables. Listing the current cron jobs is a useful way to see what is scheduled.

$ crontab -l
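
To add or change a job, crontab -e opens the schedule in an editor. A sample entry (the script path below is a made-up placeholder) that runs a backup script every day at 2am looks like this:

$ crontab -e # edit the current user's cron jobs
# minute hour day-of-month month day-of-week command
0 2 * * * /path/to/backup.sh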

Getting started with DevOps

Noticing the most used commands is the first step towards automation and shortcuts, which can be done in the following ways:

  1. write the commands in a bash script
  2. include the commands in a cron job
  3. turn commonly used commands into aliases as shorthands (see the sketch after this list)
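
As a small sketch (the alias names and paths are made up), a couple of aliases built from the commands above already save a lot of typing:

# in ~/.bashrc or ~/.zshrc
alias sshapp='ssh projectname -t "cd /path/to/app ; /bin/bash"'
alias applog='ssh projectname -t "tail -f /path/to/app/log/production.log"'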

There are also frameworks such as Chef and Puppet that help kick-start a list of commands, making it easy to bring up a production or staging environment with just one command or a click.

I have also come to realize that there is a change in thought process when I'm doing development versus DevOps. But in the end, it was rather rewarding to get to understand the file systems in various Linux distributions and operating systems.

What common commands are you using? How do you automate your DevOps processes?

Updates: Some useful tools were suggested:

  1. Ansible
  2. Chef
  3. Capistrano
  4. Puppet
  5. Fabric