Archive for the ‘Tips and Tricks’ Category


Using ‘shopt’ To Adjust Bash Terminal Number Columns After Resizing Window

This solved a long-time frustration I had with PuTTY: when you resize the window, the terminal's idea of its column count gets out of sync and you see some pretty strange behavior (as seen in this blog post). The solution is to use shopt with the checkwinsize option, which makes bash re-check the window size after each command so your terminal always has the correct number of columns.

shopt -s checkwinsize

After throwing this into my ~/.bashrc file (or ~/.bash_profile, if you prefer), my frustration is gone! Whew!
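To confirm the option has taken effect, you can query it with shopt itself; a quick sketch (bash only):

```shell
# Query the current state (prints "checkwinsize" followed by "on" or "off")
shopt checkwinsize

# Enable it for this session; -q queries quietly via the exit status
shopt -s checkwinsize
shopt -q checkwinsize && echo "checkwinsize is on"

# After each command, bash updates COLUMNS and LINES to match the window
# (these may be empty in non-interactive shells)
echo "Terminal size: ${COLUMNS}x${LINES}"
```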

Check out the man page or this page for more information.


Use CDPATH to Quickly Change Directories

You can create shortcuts to frequently accessed directories by adding them to the CDPATH environment variable. Say I frequently access /var/www/html/. Instead of typing cd /var/www/html, I can add /var/www/ to CDPATH and then I only have to type cd html.

Open ~/.bashrc (or ~/.bash_profile) and add the following line with your frequently used directories separated with a colon (similar to PATH variable).

export CDPATH=$CDPATH:/var/www/

Here’s an example usage:

user@host:~> export CDPATH=$CDPATH:/var/www/
user@host:~> cd html
user@host:html>

There’s one caveat to using this that I’ve run into in the past: if you are working with Makefiles and building C/C++ apps, this variable can confuse the build. So, if you suddenly can’t build your project after adding it, try removing it.
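A refinement worth trying (my own habit, not part of the tip above): put "." at the front of CDPATH so the current directory is always searched first, which sidesteps most of these build surprises:

```shell
# "." first: cd checks the current directory before any CDPATH entry,
# so a local "html" directory always wins over /var/www/html.
export CDPATH=.:/var/www/

# And if a build still misbehaves, unset it for that shell:
unset CDPATH
```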


Handy Terminal Keyboard Shortcuts

Put these into your “Terminal Guru” belt and be more productive!

Cursor Movement Control
Ctrl-a: Move cursor to the start of a line
Ctrl-e: Move cursor to the end of a line
Ctrl-Left/Right: Navigate word by word (may not work in all terminals)

Modify Text
Ctrl-w: Delete the whole word to the left of the cursor
Ctrl-k: Erase to end of line
Ctrl-u: Erase to beginning of line

Scrolling/Buffer Control
Shift-PageUp/PageDown: Scroll through current buffer
Ctrl-s: Pause terminal output (program will keep running)
Ctrl-q: Release terminal output (after being paused)
Ctrl-l: Clears the screen. Use this instead of the clear command.

Ctrl-r: Search the history (enter to run the command once found)

Bonus Tip: Use the ‘!!’ command to re-run the last command, and ‘!com’ to run the most recent command starting with ‘com’.
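History expansion is normally interactive-only, but bash can switch it on inside a script too, which makes for a quick demonstration (a sketch; in day-to-day use you would simply type sudo !! or !ssh at the prompt):

```shell
set -o history   # start recording commands into the history list
set -H           # enable "!" history expansion

echo one         # goes into the history
!!               # expands to the previous command, i.e. echo one again
```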

Process Control
Ctrl-d: Send EOF (exits the shell when the line is empty)
Ctrl-c: Interrupt (kill) the current foreground process
Ctrl-z: Suspend the current process (fg brings it back to the foreground; bg resumes it in the background)

Are there any keyboard shortcuts that you can’t live without? Tell us about them in the comments below.


msmtp – a (fairly) simple mail submission program

As an old-time Unix guy, I’ve always been used to having the BSD mail utility to hand, and a suitably configured mail system, so that I can script jobs to run and email the results back to me. I use mail as a sort of glorified syslog facility. With smaller single-board Linux computers we don’t always want to install a full mail setup – resources often tend to be limited. A few years back I discovered msmtp.

This utility is an SMTP client that submits a file in standard mail format to a mail server. It can submit plain-text email or use TLS/SSL etc. I use a couple of script wrappers to emulate, sort of, sendmail and the sending part of the BSD mail utility.

Of course, to use msmtp you need a mail server to which you can submit email for delivery. My home server is my mail server, but you could use your ISP’s SMTP server. Another problem is that msmtp just fails if it can’t connect to the mail server – it’s up to you to handle that and do something else with that precious message you can’t mail just now! My sample scripts do not deal with that situation.
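One way to cope (a sketch of my own, not part of the original scripts; the host and spool path are placeholders) is to keep any message that fails to submit in a spool directory for later retry:

```shell
#!/bin/sh
# Sketch only: wrap msmtp so that a failed submission keeps the message
# in a spool directory instead of losing it. mail.example.com and the
# spool path are placeholders for your own setup.
SPOOL="$HOME/.msmtp-queue"

msg=`mktemp` || exit 1
cat > "$msg"                        # read the message from stdin

if msmtp --host=mail.example.com "$@" < "$msg"; then
    rm -f "$msg"                    # delivered, discard our copy
else
    mkdir -p "$SPOOL"
    mv "$msg" "$SPOOL/msg.$$.`date +%s`"
    echo "msmtp failed; message kept in $SPOOL" 1>&2
fi
```

A cron job could then periodically re-run msmtp over anything left in the spool.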

This very simple script I call sendmail, and it will need customising for your setup…


# set these for your setup (the values below are placeholders)...
MailServer=mail.example.com
Domain=example.com
From=me@example.com

exec msmtp --host=$MailServer --domain=$Domain --from=$From "$@"

This is my simple script to emulate the simple parts of the send functionality in the BSD mail utility. It has many shortcomings, but it has served me well…

#!/bin/sh
# A sort of shell replacement for the send functionality of
# the standard "mail" utility.
# mail [-s subject] recipient(s)

u=`id -un`
d=`hostname`     # domain used in the From address
s="" ; r="" ; v=""

while [ $# -gt 0 ]; do
 p="$1" ; shift
 case "$p" in
 -s) if [ $# -gt 0 ]; then s="$1" ; shift
     else echo 1>&2 "Option \"-s\" needs a subject." ; exit 1
     fi ;;
 -v) v="-v" ;;
 -*) echo 1>&2 "Option \"$p\" not recognised."
     exit 1 ;;
 *)  if [ "$r" = "" ]; then r="$p"
     else r="$r , $p"
     fi ;;
 esac
done

if [ "$r" = "" ]; then
 echo 1>&2 "No Recipients."
 exit 1
fi

if [ "$s" = "" ]; then
 printf "Subject: "
 read s
fi

s="Subject: $s \n"

(echo -e "From: ${u}@$d \nTo: $r \n$s \n\n"
 cat ) | sendmail -t $v

So if you have a job to run on the platform, then this will email the output to you…

my_job | mail -s "my_job output" me@example.com

msmtp can be loaded from the package systems of most distributions, but I have had occasion to cross-compile the package for installing on a system without package management. I had only limited libraries on my cross-compile system, and found that after downloading and extracting the source code from SourceForge, I had to cross-compile without some of the advanced features. I used

./configure --build=arm --disable-ssl --disable-gsasl --disable-nls

before doing the make to build the binaries. The resultant binary just submitted plain-text email, but that was OK for my use – YMMV. Suitably stripped, it is pretty lean.

I recently revisited msmtp to pre-test a change to my ISP’s new SMTP server before committing the change to my mail server’s sendmail setup. It can also be useful for testing security settings etc. on mail submission systems.
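For that kind of pre-testing, a one-off submission is easy to script. This is a sketch with placeholder host and addresses; msmtp's --serverinfo option is also handy, since it prints the server's capabilities without sending anything:

```shell
# Build a minimal RFC 822 message (headers, blank line, body)...
printf 'From: me@example.com\nTo: you@example.com\nSubject: msmtp test\n\ntest body\n' > /tmp/testmsg

# ...then submit it. Host and addresses are placeholders; swap in your
# own. "msmtp --serverinfo --host=..." probes the server without sending.
msmtp --host=smtp.example.com --from=me@example.com you@example.com < /tmp/testmsg || \
    echo "submission failed (no server configured here)" 1>&2
```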



Self Documenting Scripts

As a “good” programmer I like to put comments at the top of my scripts to say what the script does and how it is used. I also like the script to output a useful help message when a user gets options/arguments wrong, or when they use the option ‘-h’. I found it was a pain keeping the two in step, and developed this simple scheme so the information lives in only one place.

I’ll use bash here but I’m sure people can adapt to other scripting languages.

#!/bin/bash
## Usage: _PROG_ [-h]
## This demos a self documenting scheme for scripts.

prog="$0"
me=`basename "$prog"`

dohelp () {
  grep '^##' "$prog" | sed -e 's/^##//' -e "s/_PROG_/$me/" 1>&2
}

if [ "$1" = "-h" ]; then
  dohelp
  exit 0
fi

echo "Program name is: $me"
echo "Program file is: $prog"


Prefix any lines you want to be output as “help” with ‘##’ at the beginning of the line. All such lines are printed to stderr by the dohelp function. The ‘sed’ in this function also strips off the leading ‘##’ and substitutes the filename of the invoked script for ‘_PROG_’, so that if you rename the script, the help still magically refers to the new name.

It’s a simple scheme, and can obviously be extended, e.g. one could change the dohelp function thus…

dohelp () {
  if [ "$pfx" = "" ]; then pfx='##' ; fi
  grep "^$pfx" "$prog" | sed -e "s/^$pfx//" -e "s/_PROG_/$me/" 1>&2
}

dohelp can now be called to select lines with a different prefix, but its default behaviour, when given no prefix, is as before.
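For example (a sketch; the -V option and the ‘#V#’ prefix are my own invention here), version text can live in the same file under a second prefix:

```shell
#!/bin/bash
## Usage: _PROG_ [-h|-V]
#V# _PROG_ version 1.0

prog="$0"
me=`basename "$prog"`

dohelp () {
  if [ "$pfx" = "" ]; then pfx='##' ; fi
  grep "^$pfx" "$prog" | sed -e "s/^$pfx//" -e "s/_PROG_/$me/" 1>&2
}

case "$1" in
-h) dohelp ; exit 0 ;;            # prints the "##" help lines
-V) pfx='#V#' dohelp ; exit 0 ;;  # prints only the "#V#" lines
esac
```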

This has formed part of my standard shell script template for many years. Hope others find it useful.

Jim Jackson


The watch Command

The watch command lets you run another command periodically (every 2 seconds by default; use the -n or --interval option to change that). It’s essentially shorthand for a while-true/sleep loop and is really simple to use.

watch [options] command command_options

When I first used it, I wanted to see the progress of the ‘dd’ command.
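For example, watch -n 5 -d df -h refreshes a disk-usage listing every 5 seconds, with -d highlighting what changed between updates; quoting allows whole pipelines, as in watch -n 2 'du -s /var/log | tail -n 1'. Under the hood it behaves much like this loop (bounded here so the sketch terminates; watch additionally redraws the screen each cycle):

```shell
# Three iterations standing in for watch's endless refresh cycle
i=0
while [ $i -lt 3 ]; do
  df -h          # the watched command
  sleep 1        # the interval (watch defaults to 2 seconds)
  i=$((i+1))
done
```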


What Connections Have I Got

It is easy to use the utility netstat to list the active connections you have to your machine

netstat -t

provides the information. But often it is good to know which process/program owns the connection. Again netstat obliges

netstat -tp

However, if like me you use 80-column xterms (or even one of the Linux VTs in 80-column mode), the long lines make the output hard to read. So, using the “cut” utility, we can cut down the verbiage to what is important

netstat -tp | cut -c21-63,80-

I often like to use the “--numeric-hosts” option to netstat too.

Another use of netstat is to show those programs that are listening for connections. This can help check if services have crashed, or whether you have unwelcome “services” running on your machine.

netstat -tpl

Again “cut” can be used to trim the fat and keep the output manageable

netstat -tpl | cut -c21-36,56-63,80-

It is worth giving the netstat man page a check; it is a very useful utility.


Creating Busybox Links

This tip came in from a subscriber/regular contributor. It’s a script for quickly creating busybox links based on the commands included in your build. I know it will be helpful to somebody out there.

Hi Derek,

Dunno if you are still doing your “DailyLinux” thing, but here’s
a bit of script I knocked together today to create busybox links.
Yes I know there is an “--install” option on a lot of busybox builds,
but it doesn’t seem to play well with installing in a chroot area.

So copy the busybox executable into the “bin” directory…

cp -p /bin/busybox /data/chroot/bin

and cd into that directory

cd /data/chroot/bin

Then run this shell snippet…

  ./busybox --help | \
  sed -e '1,/^Currently defined functions:/d' \
      -e 's/[ \t]//g' -e 's/,$//' -e 's/,/\n/g' | \
  while read app ; do
    if [ "$app" != "" ]; then
      printf "linking %-12s ...\n" "$app"
      ln -sf "./busybox" "$app"
      ls -ld "$app"
    fi
  done

and all the links for the apps built-in to the busybox version you have
get built for you.

It takes advantage of the fact that “busybox --help” gives a list of the
built apps at the end of its output, after the line “Currently defined
functions:”. The sed script removes all lines up to and including this line,
then strips spaces, tabs and trailing commas, and converts the remaining
commas to newlines. This gives a list of apps, one app per line. The while
loop reads each line and does the linking.

Dunno if this would interest anyone else.

all the best


Adding Individual Files or Folders to an Archive

The following steps will walk you through how to create an archive, or tarball, of specific files. This is handy when you don’t necessarily want to archive an entire directory, but would rather just have a subset of files spread across different directories. Open a terminal and we’ll begin with this example of grabbing files within our home directory ending in .php.

Step 1: Create a blank/empty tarball in our home directory to add to

tar cvf ~/php_backups.tar --files-from /dev/null

Step 2: Add/append folders to tarball

The following will search for all files ending in .php and pipe the results into xargs, which feeds the tar command to append them to our archive.

find ~/ -type f -name "*.php" | xargs tar rvf ~/php_backups.tar
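One caveat worth noting (my addition): filenames containing spaces break a plain find | xargs pipe. With GNU find and xargs, the null-delimited variant is safe:

```shell
# -print0 / -0 delimit names with NUL bytes, so spaces and newlines in
# filenames survive the trip; -r skips running tar if nothing matched.
find ~/ -type f -name "*.php" -print0 | xargs -0 -r tar rvf ~/php_backups.tar
```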

Step 3: Compress the archive

gzip ~/php_backups.tar

Step 4: Take a look at the archive to verify the directory structure

tar tzvf ~/php_backups.tar.gz

From here, you can untar the file to whatever directory you want. Keep in mind that it will overwrite any existing files. You can combine other commands (like for or ls) to get a list of files that you’re interested in. You could also save all of the files with the directory path into a file and use cat to pipe into the xargs command.
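The file-list variant mentioned above looks like this; the list can be reviewed or hand-edited before anything is archived:

```shell
# Save the matching paths (with their directory parts) to a file...
find ~/ -type f -name "*.php" > /tmp/php_files.txt

# ...then feed the list to tar via xargs; -r skips tar if the list is empty
cat /tmp/php_files.txt | xargs -r tar rvf ~/php_backups.tar
```

GNU tar can also read the list itself with --files-from /tmp/php_files.txt, the same option used to create the empty tarball in step 1.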


Proper authorized_keys Permissions for Passwordless SSH Access

I recently followed an online guide to set up passwordless SSH access between hosts, but for some reason I was still being prompted to enter a password. I spent a while studying the verbose messages from ssh -vv without any resolution. I returned to the guide and saw a small note about permissions at the bottom of the page. I ran through the instructions there as well, but with no luck. I finally came across the OpenSSH FAQ page, which thankfully explained the permission settings I was missing. If you find yourself in the same shoes, try running the following commands on the remote host you’re trying to connect to:

chmod 700 $HOME/.ssh
chmod go-w $HOME $HOME/.ssh
chmod 600 $HOME/.ssh/authorized_keys
chown `whoami` $HOME/.ssh/authorized_keys
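To review the result in one go (stat -c is the GNU coreutils form):

```shell
# None of these may be group- or world-writable, and the .ssh files
# must be owned by you, or sshd's strict modes will reject the key.
ls -ld "$HOME" "$HOME/.ssh" "$HOME/.ssh/authorized_keys" 2>/dev/null || \
    echo "some of these do not exist yet"

# Octal mode, owner, name: expect 700 for .ssh and 600 for authorized_keys
stat -c '%a %U %n' "$HOME/.ssh" "$HOME/.ssh/authorized_keys" 2>/dev/null || true
```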

This is what got things to work for me after fighting with it for a while. I hope it helps somebody else out there too. Also, if there’s any room for improvement, let me know. I wasn’t looking for an optimized solution at the time because I was tired of fighting.
