Category Archives: Shell Games & Scripts

An All-Variable Method to Send a Bash Command over SSH

I needed to send a command (actually two separate commands) over ssh to check for the presence of a user (a username) on a remote system. The command needed to be constructed entirely from variables. This is the solution I crafted (after much trial and error).

You will see that I send two ssh commands because I need to check for two styles of username (flast and first.last).  These functions are run inside a follower-script called by a leader-script responsible for gathering a host of data regarding departing users.

The leader passes a trio of variables into the follower.  The follower then crafts some variables of its own.  It is from all of this that the final command is crafted.

#! /usr/bin/env bash 
# Title   : 
# Parent  : 
# Author  :  JamesIsIn 20180719 Do something nice today. 

# Purpose :   
# SOP     :  https://some.url/display/ops/NOC+Terminator+Script 
#         :  https://some.url/display/ops/Termination+Procedure+-+NOC 

#  Variables  # 

# # debugging 
# scriptUser_linux= 
# termuser_flast= 
# termuser_firstDotLast= 
# # 

# the original script included the following note:  
# using test nagios results in halved execution times (compared to prod); all configs in sync 
readonly const_nagiosHost="nagios.some.url" 
readonly const_nagios_cgi="/usr/local/nagios/etc/cgi.cfg" 
declare termuser_firstDotLast_grep 
declare termuser_flast_grep 


#  Functions  # 

function func_sshCommand() { 
	local loc_sshArguments="-n -l ${scriptUser_linux} -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o LogLevel=quiet" 
	local loc_sshHostCommand="grep -c \"${1}\" ${const_nagios_cgi}" 
	# this is complex 
	# the export is for filling the passed parameter which is a variable name 
	# that passed variable name is then filled with the output from ssh 
	# where ssh is loaded from a trio of variables 
	export "$( printf '%s' "$( printf '%s' "${2}" )"="$( ssh $( printf '%s' " ${loc_sshArguments} ${const_nagiosHost} ${loc_sshHostCommand}" ) )" )" 
}

function main() { 
	# printf '%s\n' "Your password will be required for two tests.  " 
	func_sshCommand "${termuser_firstDotLast}" termuser_firstDotLast_grep 
	func_sshCommand "${termuser_flast}" termuser_flast_grep 
	if [ ! "${termuser_firstDotLast_grep}" == "0" ] || [ ! "${termuser_flast_grep}" == "0" ] ; then 
		printf '%b\n' "" "Nagios user found.  Please remove the following user from the nagios cgi.cfg file.  " 
		if [ ! "${termuser_firstDotLast_grep}" == "0" ] ; then 
			printf '%s\n' "${termuser_firstDotLast} (${termuser_firstDotLast_grep}) " 
		fi 
		if [ ! "${termuser_flast_grep}" == "0" ] ; then 
			printf '%s\n' "${termuser_flast} (${termuser_flast_grep}) " 
		fi 
		printf '\n' 
		return 1 
	fi 
	return 0 
}


#  Main  # 

main 

exit $? 



The magic finally happens in the admittedly somewhat cryptic export line (hence the preceding lines of comment).

Clever? You be the judge.
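The core of the trick stands on its own:  export accepts a single word of the form name=value, and both the name and the value can come from expansions.  Here is a minimal sketch of the same mechanism with the ssh call stripped out (func_fillVariable and my_result are hypothetical names, not part of the original script):

```shell
# fill a variable whose NAME is passed as the second argument
function func_fillVariable() {
    export "$( printf '%s' "${2}" )"="$( printf '%s' "hello, ${1}" )"
}

func_fillVariable "world" my_result
printf '%s\n' "${my_result}"   # → hello, world
```

Swap the inner printf for any command substitution (such as the ssh call above) whose output you want captured into the named variable.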


Using sort as a Version Test in Bash

I have several bash scripts where I use readarray (aka mapfile) only if it will function on that system. This is dependent upon the version of bash being used. I have this test which is interesting enough to be worthy of some small discussion.

Here is the function that gets the bash version, makes the test, and then either uses mapfile (if the version is high enough) or uses read (if it is not). Basically, you are asking if the local version of bash is greater than or equal to x.x.x (your target version).

function func_getTorrentList() {
    local loc_bashVersion
    loc_bashVersion="$( bash --version | head -n1 | cut -d " " -f4 | cut -d "(" -f1 )"
    # readarray or mapfile -d fails before bash 4.4.0
    if printf '%s\n' "4.4.0" "${loc_bashVersion}" | sort -V -C ; then
        mapfile -d $'\0' A_torrentList < <( transmission-remote --list | sed -e '1d;$d;s/^ *//' | cut --only-delimited --delimiter=' ' --fields=1 )
    else
        while IFS= read -r ; do
            A_torrentList+=( "$REPLY" )
        done < <( transmission-remote --list | sed -e '1d;$d;s/^ *//' | cut --only-delimited --delimiter=' ' --fields=1 )
    fi
}

# bonus: surround the local version for limiting the test to a version range
if printf '%s\n' "4.4.0" "${loc_bashVersion}" "6.6.6" | sort -V -C ; then

# or with variables
if printf '%s\n' "${version_lowerLimit}" "${loc_bashVersion}" "${version_upperLimit}" | sort -V -C ; then


If you look at the if printf line you will see the last operation of the if is a sort call.  In that sort call the -V tells sort to use version sorting.  The -C tells sort to check (without printing anything) whether its input is already sorted, exiting with success if it is and with failure if it is not.

If the first number (your target version) is the same or smaller than the second number (the local version of bash), the input is already in version order, so sort passes (-C is true).  Otherwise, sort fails (-C is false, and so goes the if statement).

You can further create a range by surrounding the local version with lower- and upper-limit version numbers, as you see in the bonus line at the end.
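The behavior of -C is easy to verify at the prompt; the exit status alone drives the if (the version numbers here are arbitrary examples):

```shell
# succeeds: 5.1.16 sorts at or after 4.4.0
printf '%s\n' "4.4.0" "5.1.16" | sort -V -C && printf '%s\n' "at least 4.4.0"

# fails: 3.2.57 is out of order relative to 4.4.0
printf '%s\n' "4.4.0" "3.2.57" | sort -V -C || printf '%s\n' "older than 4.4.0"

# the range form passes only if the middle version sits between the limits
printf '%s\n' "4.4.0" "5.1.16" "6.6.6" | sort -V -C && printf '%s\n' "in range"
```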



Understanding Perl Locality

Perl has a variable declaration called local.  One might be tempted to interpret this as creating a local variable in the sense that other scripting languages (like Bash or Python) use local.  Avoid temptation.  I’m going to unpack this a bit here.

The places on the Web where Perl documentation lives tend to be a little shy about revealing much concerning this distinction.  For example Perl Tutorial (org) cleverly avoids mentioning local in its Perl Variables page.  And w3schools (io) has a single ill-defined definition and an awkward attempt at scope:  Perl – Comments (see Variable Scopes and Perl Local Variable).

Geeks for Geeks provides a much better explanation of the my keyword, which will help to understand how my is similar to local in other languages (like Python or Bash):  Perl | my Keyword.  Also, plover has a thorough explanation of the uses of local:  Seven Useful Uses of local.  (If you’d like to dive even deeper, plover also has a thorough discussion of scope in Perl:  Coping with Scoping.)

Ok, let’s see how simple we can make this.

If you are writing a Bash script and putting your executable code only into functions (in programming there should be no executable code outside of functions), then when you declare a local variable inside a function that variable lives only as long as that function call.  A function that calls this function cannot see the variable.

The reverse, however, is not symmetrical:  when this function calls another function, the called function can read (and even modify) the calling function’s local variable, because Bash resolves variables dynamically down the call-stack.  (No export is needed for this; export only makes a variable visible to separate child processes, such as other programs or scripts the function launches.)

Let’s phrase this as a local variable in Bash is visible down the call-stack, but never up it.

One other caveat of Bash:  a function runs in the current shell (not a subshell) unless you invoke it in a pipeline, a command substitution, or an explicit ( ) subshell.  Assigning to a variable inside a function without declaring it local will therefore overwrite any global variable of the same name.  This will catch one off-guard from time to time; it has me.  (And nothing, export included, can send a variable up the call-stack out of a subshell.)
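A quick experiment shows this scoping in action:  a local is visible to functions called from its owner, but gone once the owning function returns (the function and variable names here are illustrative):

```shell
function func_inner() {
    # no declaration here: this reads the caller's local
    printf '%s\n' "inner sees: ${loc_greeting}"
}

function func_outer() {
    local loc_greeting="hello"
    func_inner
}

func_outer                                        # → inner sees: hello
printf '%s\n' "outside: ${loc_greeting:-unset}"   # → outside: unset
```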

Back in Perl, a my variable is stuck within what’s called its lexical block.  This is usually defined by a pair of squigglies:  {}.  To continue with our functions example (and again, you want all of your code to live within functions when you script), if you declare a my variable inside a function in Perl (what Perl calls a subroutine, declared using sub), that my variable will be trapped in its function (subroutine) the same way that a local variable in Bash would be trapped in its function.

Now that we have that aligned, let’s look at the relationship between a global variable and a local variable in Perl, because this is a bit tricky.

In both Perl and Bash, you will want to prefer declaring your global variables in the declarations section at the beginning of your script.  (See my script templates for assistance.)  Once declared, these global variables will be available throughout the current script by any function or call-stack.

This is where the power of Perl’s poorly named local keyword comes into play (though you will likely never need or use it).  Declaring an existing global variable as local inside a function temporarily re-assigns the global variable for use in this function and down any call-stack therein.  Once this function (and any call-stack it invoked) ends execution, the former global variable is restored for the remainder of the script.

(Attempting to declare a local variable for a non-existent global variable will throw an error.)

So, in Perl local temporarily substitutes a value for a global variable.  Could be powerfully useful in the right situation, but you probably just want my.


testing & expansion in code blocks

WordPress insists on changing certain characters into their &-entity equivalents (when those characters appear within a code block) each time the article is updated.  This wouldn’t be a problem except that the ampersand is one of the characters which gets this expansion.

If the article is always saved via the Text side this bug is circumvented.  It only appears to happen when clicking the Update button while viewing the Visual side in the editor.

This is how the problem increases with each subsequent save.

Watch the ampersand go crazy:   &amp;amp;amp;amp;quot;&amp;amp;amp;amp;quot; &amp;amp;amp;amp;gt;&amp;amp;amp;amp;amp; 
function func_error() {
    printf '%s\n' &amp;amp;amp;amp;quot;[$( date +'%Y-%m-%d_%H:%M:%S%z' )]: $@&amp;amp;amp;amp;quot; &amp;amp;amp;amp;gt;&amp;amp;amp;amp;amp;2

if ! do_something; then
    err &amp;amp;amp;amp;quot;Unable to do_something&amp;amp;amp;amp;quot;
    exit &amp;amp;amp;amp;quot;${error_did_nothing}&amp;amp;amp;amp;quot;

Just a fucking mess.

When I see that I’ve made this mistake (of clicking the Update button in an article with a code block while viewing the Visual side of the editor), if it’s one or two I just edit them directly and hit the Update button from the Text side.  If it’s a lot (I’m looking at you, &quot;) then I pull the entire article from the Text side and paste that into VSCode and make any find-and-replaces necessary.

The order is important here.  First I change all amp;amp; into just amp; (repeating until none remain).  Then I change all &amp; into &.  And finally I can fix each of the other expansions as required (usually &lt;, &gt;, and &quot;).
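The same ordered find-and-replaces can be expressed as a single sed pass (post.txt is a hypothetical file holding the article text pulled from the Text side; note the branch loop, since amp;amp; must be collapsed repeatedly for deeply nested expansions):

```shell
# a deeply mangled sample to repair
printf '%s' '&amp;amp;amp;quot;' > post.txt

# collapse amp;amp; until none remain, then fix the remaining entities in order
sed -e ':a' -e 's/amp;amp;/amp;/g' -e 'ta' \
    -e 's/&amp;/\&/g' \
    -e 's/&lt;/</g' -e 's/&gt;/>/g' -e 's/&quot;/"/g' \
    post.txt
```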

Maybe this is fixed in a newer version? Could be. Have fun.


Remove Subtitles and Set Default Audio in Video Files

Some of the time my DLNA server fails to parse the main language correctly and at the same time fails to offer any language selection for that file. So I dug around and found a simple method using ffmpeg to set the file as I want it. (I used this thread as part of my solution.)

ffmpeg -i 01\ –\ Spellbound.mp4 -map 0:0 -map 0:2 -map 0:1 -sn -disposition:a:1 default 01\ –\ Spellbound__ffmpeg.mkv 

In my case (outlined in the above command) English was the second language option and the original default was, I don’t remember, not English. The -map 0:0 just uses the existing video portion. The next two set the second language as the first and the first as the second (in order). Finally, that -sn removes all (or at least doesn’t bother to copy any) subtitles. The bit about disposition ensures the (new) first language is set as default.

It takes a while because it’s re-writing all the frames. If I come across a faster or more efficient method I’ll return to this article.

One could easily loop this command for all files in a folder.
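One way to sketch that loop is a dry run that prints each command instead of executing it, so you can eyeball the file names first (func_buildCommand is a hypothetical helper, and the map/disposition flags are reused from the example above, which of course depend on each file’s stream layout):

```shell
# print the ffmpeg command that would run for one file
function func_buildCommand() {
    local loc_file="$1"
    printf '%s\n' "ffmpeg -i \"${loc_file}\" -map 0:0 -map 0:2 -map 0:1 -sn -disposition:a:1 default \"${loc_file%.mp4}__ffmpeg.mkv\""
}

for loc_file in ./*.mp4 ; do
    [ -e "${loc_file}" ] || continue   # skip the literal glob when no files match
    func_buildCommand "${loc_file}"
done
```

Once the printed commands look right, replace the printf with the real ffmpeg invocation.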

Here is another example where I squashed six VOB files into one mkv (and reduced the total size from 5.78 GB to under 1 GB):

ffmpeg -i "concat:VTS_01_1.VOB|VTS_01_2.VOB|VTS_01_3.VOB|VTS_01_4.VOB|VTS_01_5.VOB|VTS_01_6.VOB" output.mkv 

Also, converting from mp4 to mkv was just an easy way to keep the same file name. You could convert either way or with the same extension and into a separate folder or with a different file name instead.


bash Style Guide


(Based originally on Google Shell Style Guide r1.26.)


Why this Guide?

The essential purpose of this guide is to ensure a consistency of style, an adherence to and improvement of best practices, and a generally positive direction for writing code.  As best practices are changed or discovered, we will endeavor to update this guide.  Nothing here is etched in stone and community involvement in this evolutionary process is encouraged (and likely essential).  After the Conclusion there is a section where we can offer code examples which fall outside the other structures of this guide.

Have fun!

Which Shell to Use

Scripts must start with #!/usr/bin/env bash and a minimum number of flags. Use set to set shell options so that calling your script as bash script_name does not break its functionality.

When to Use bash

While shell scripting isn’t a development language, it is used for writing various utility scripts.

This style guide is more a guide to best practices during its use rather than a suggestion that it be used for any particular deployment.

Further bash Guidelines

If you’re mostly calling other utilities and are doing relatively little data manipulation, shell is an acceptable choice for the task.

If performance matters, use something other than (faster than) shell.

Consider using Python for scripts that perform multiple parsing or transformation operations on collections/arrays.

Prefer shell if writing the script in Python would require several subprocess calls (provided the complexity doesn’t make using shell prohibitive).

For a script that is more than 100 lines long, Google prefers Python, bearing in mind that scripts grow.

Rewrite your script in another language early to avoid a time-consuming rewrite at a later date.

Prefer printf over echo as echo has a number of issues when run in scripts (especially portability).

When using printf, remember that you must explicitly add newline ( \n ) characters.
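A two-line illustration of the difference:

```shell
printf '%s' "one" "two"      # the format is reused per argument: prints onetwo, no newline
printf '\n'                  # newline added explicitly
printf '%s\n' "one" "two"    # prints one and two on separate lines
```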

Shell Files and Interpreter Invocation

File Extensions

Executables should have an sh extension ( .sh ).

Libraries must have an sh extension ( .sh ) and should not be executable.


SUID/SGID

SUID and SGID are forbidden in shell scripts.

There are too many security issues with shell that make it nearly impossible to secure sufficiently to allow SUID/SGID.

While bash does make it difficult to run SUID, it’s still possible on some platforms which is why we’re being explicit about banning it.

Use sudo to provide elevated access if you need it.



STDOUT vs STDERR

All error messages should go to STDERR.  This makes it easier to separate normal status from actual issues.

function func_error() {
    printf '%s\n' "[$( date +'%Y-%m-%d_%H:%M:%S%z' )]: $*" >&2
}

if ! do_something; then
    func_error "Unable to do_something"
    exit "${error_did_nothing}"
fi


File Header

Begin each file with a description of its contents.

Every file must have a top-level comment including a brief overview of its contents.

A ticket number for tracking is required. If one does not exist, create it.

A copyright notice and author information are optional.

#! /usr/bin/env bash
# Purpose: Perform hot backups of Oracle databases.
# Ticket: 

Function Comments

Any function that is not both obvious and short must be commented.

Any function in a library must be commented regardless of length or complexity.

It should be possible for someone else to learn how to use your program (or to use a function in your library) by reading the comments (and self-help, if provided) without reading the code.

All function comments should contain:

  • Description of the function
  • Global variables used and modified
  • Arguments taken
  • Returned values other than the default exit status of the last command run
#! /usr/bin/env bash
# Perform hot backups of Oracle databases.

export PATH='/usr/xpg4/bin:/usr/bin:/opt/csw/bin:/opt/goog/bin'

# Cleanup files from the backup dir
# Globals:
#     c_ORACLE_SID
# Arguments:
#     None
# Returns:
#     None
function cleanup() {
    # ...
}

Implementation Comments

Comment tricky, non-obvious, interesting, or important parts of your code.  This follows general coding comment practice.

Don’t comment everything.  If there’s a complex algorithm or you’re doing something out of the ordinary, put a short comment.

ToDo Comments

Use ToDo comments for code that is temporary, a short-term solution, or good-enough but not perfect.  This approximates the convention in Google’s C++ Guide.

ToDo’s should include the string ToDo, followed by your username in parentheses.  It’s preferable to put a ticket number next to the ToDo item as well.

# ToDo (john.doe):  Handle the unlikely edge cases (ticket:  ####) 


Formatting

Always use the following for any new code; for existing code, follow the style that’s already there – unless the existing style violates current best-practices or is otherwise problematic.  In that case, update all instances of the existing style.


Indentation

Indentations should be 4-spaced tabs.  Set the indentations in your IDE accordingly. Keep indentation consistent.

Use blank lines between blocks to improve readability.

Line Length and Long Strings

Avoid line lengths longer than 80 characters.  If you have to write strings that are longer than 80 characters, consider using a here document or an embedded newline if possible.

Literal strings that have to be longer than 80 chars and can’t sensibly be split are ok, but it’s strongly preferred to find a way to make it shorter.

# DO use here documents
cat <<END
I am an exceptionally long
string.
END

# Embedded newlines are ok too
long_string="I am an exceptionally
long string." 


Pipelines

Pipelines should be split one per line if they don’t all fit on one line.  Indent for clarity.  If a pipeline all fits on one line, it should be on one line.

This applies to a chain of commands combined using | as well as to logical compounds using || and &&.  Line these elements up where sensible.  Be sure to include the trailing \ which causes the executed code to act as though it were on a single line (by escaping the newline).

# All fits on one line
command1 | command2

# Long commands (with escaped newlines)
command1 \
| command2 \
| command3 \
| command4 


Loops

Put ; do and ; then on the same line as the while, for, or if.

Loops should follow the same principles for braces as when declaring functions.  However, else should be on its own line (and closing statements should be on their own lines vertically aligned with their opening statements).

for dir in ${dirs_to_cleanup} ; do
    if [[ -d "${dir}/${c_ORACLE_SID}" ]] ; then
        log_date "Cleaning up old files in ${dir}/${c_ORACLE_SID}"
        rm "${dir}/${c_ORACLE_SID}/"*
        if [[ "$?" -ne 0 ]] ; then
            error_message
        fi
    else
        mkdir -p "${dir}/${c_ORACLE_SID}"
        if [[ "$?" -ne 0 ]] ; then
            error_message
        fi
    fi
done


Case Statements

Indent alternatives.

A one-line alternative needs a space after the close parenthesis of the pattern and before the ;;

Long or multi-command alternatives should be split over multiple lines with the pattern, actions, and ;; on separate lines.

The matching expressions are indented one level from the case and esac.  Multiline actions are indented another level.

Pattern expressions should not be preceded by an open parenthesis.

Avoid the ;& and ;;& notations.

case "${expression}" in
    a )
        some_command "${variable}" "${other_expr}" ...
        ;;
    absolute )
        another_command "${actions}" "${other_expr}" ...
        ;;
    * )
        error "Unexpected expression '${expression}'"
        ;;
esac
Simple commands may be put on the same line as the pattern and the ;; as long as the expression remains readable.  This is often appropriate for single-letter option processing.

When on the same line as the actions, use a space after the close parenthesis of the pattern and another before the ;;

while getopts 'abf:v' flag ; do
    case "${flag}" in
        a ) aflag='true' ;;
        b ) bflag='true' ;;
        f ) files="${OPTARG}" ;;
        v ) verbose='true' ;;
        * ) error "Unexpected option ${flag}" ;;
    esac
done

Variable Expansion

In order of precedence:  stay consistent with what you find; quote your variables; prefer “${var}” over “$var” – but see details.  Avoid brace-quoting single-character shell specials or positional parameters, unless strictly necessary or needed to avoid deep confusion.

# Section of recommended cases.

# Preferred style for 'special' variables:
printf '%s\n' "Positional: $1" "$5" "$3"
printf '%s\n' "Specials: !=$!, -=$-, _=$_, ?=$?, #=$# *=$* @=$@ \$=$$ ..."

# Braces necessary:
printf '%s\n' "many parameters: ${10}"

# Braces avoiding confusion:
# Output is "a0b0c0"
set -- a b c
printf '%s\n' "${1}0${2}0${3}0"

# Preferred style for other variables:
printf '%s\n' "PATH=${PATH}, PWD=${PWD}, mine=${some_var}"
while read f ; do
    printf '%s\n' "file=${f}"
done < <( ls -l /tmp )

# Section of discouraged cases

# Unquoted vars, unbraced vars, brace-quoted single letter
# shell specials.
printf '%s\n' a=$avar "b=$bvar" "PID=${$}" "${1}"

# Confusing use:  this is expanded as "${1}0${2}0${3}0",
# not "${10}${20}${30}"
set -- a b c
printf '%s\n' "$10$20$30" 


Quoting

Always quote strings containing variables, command substitutions, spaces, or shell meta characters (unless careful unquoted expansion is required).

In short, unless there is a reason not to do it, format your variables as “${variable}” when calling them.

Prefer quoting strings that are words (as opposed to command options or path names).

Never quote literal integers as this makes an integer into a string.

Be aware of the quoting rules for pattern matches in [[

Use “$@” unless you have a specific reason to use $*

# 'Single' quotes indicate that no substitution is desired.
# "Double" quotes indicate that substitution is required/tolerated.

# Simple examples
# "quote command substitutions"
flag="$(some_command and its args "$@" 'quoted separately')"

# "quote variables"
printf '%s\n'  "${flag}"

# "never quote literal integers"
# "quote command substitutions", even when you expect integers

# "prefer quoting words", not compulsory
readonly const_USE_INTEGER='true'

# "quote shell meta characters"
printf '%s\n' 'Hello, stranger, and well met.  Earn lots of $$$'
printf '%s\n' "Process $$:  Done making \$\$\$."

# "command options or path names"
# ($1 is assumed to contain a value here)
grep -li Hugo /dev/null "$1"

# Less simple examples
# "quote variables, unless proven false": ccs might be empty
git send-email --to "${reviewers}" ${ccs:+"--cc" "${ccs}"}

# Positional parameter precautions: $1 might be unset
# Single quotes leave regex as-is.
grep -cP '([Ss]pecial|\|?characters*)$' ${1:+"$1"}

# For passing on arguments,
# "$@" is right almost every time, and
# $* is wrong almost every time:
# * $* and $@ will split on spaces, clobbering up arguments
# that contain spaces and dropping empty strings;
# * "$@" will retain arguments as-is, so no args
# provided will result in no args being passed on;
# This is in most cases what you want to use for passing
# on arguments.
# * "$*" expands to one argument, with all args joined
# by (usually) spaces,
# so no args provided will result in one empty string
# being passed on.
# (Consult 'man bash' for the nit-grits ;-)
set -- 1 "2 two" "3 three tres" ; printf '%s\n' "$#" ; set -- "$*" ; printf '%s\n' "$#, $*"
set -- 1 "2 two" "3 three tres" ; printf '%s\n' "$#" ; set -- "$@" ; printf '%s\n' "$#, $*" 

Features and Bugs

Command Substitution

Nested backticks require escaping the inner backticks with \

Prefer $( command ) instead of backticks.  The $( command ) format doesn’t change when nested and is easier to read.

# This is preferred:
var="$( command "$( command1 )" )"

# This is not:
var="`command \`command1\``" 

test, [, and [[

[[ … ]] is preferred over [, test, and /usr/bin/[

[[ … ]] reduces errors as no pathname expansion or word splitting takes place between [[ and ]]

plus [[ … ]] allows for regular expression matching where [ … ] does not.

# This ensures the string on the left is made up of characters in the
# alnum character class followed by the string name.
# Note that the RHS should not be quoted here.
# For the gory details, see
# E14 at
if [[ "filename" =~ ^[[:alnum:]]+name ]] ; then
    printf '%s\n' "Match"
fi

# This matches the exact pattern "f*" (Does not match in this case)
if [[ "filename" == "f*" ]] ; then
    printf '%s\n' "Match"
fi

# This gives a "too many arguments" error as f* is expanded to the
# contents of the current directory
if [ "filename" == f* ] ; then
    printf '%s\n' "Match"
fi

Testing Strings

Since bash is smart enough to deal with an empty string in a test (and given that the code is much easier to read), use tests for empty or non-empty strings rather than filler characters.

Use quotes rather than filler characters where possible.

# Do this:
if [[ "${my_var}" = "some_string" ]] ; then

# -z (string length is zero) and -n (string length is not zero) are
# preferred over testing for an empty string
if [[ -z "${my_var}" ]] ; then

# This is ok (ensure quotes on the empty side) but not preferred:
if [[ "${my_var}" = "" ]] ; then

# Not this:
if [[ "${my_var}X" = "some_stringX" ]] ; then

Avoid confusion about your test by explicitly using -z or -n.

# Use this
if [[ -n "${my_var}" ]] ; then

# Instead of this as errors can occur if ${my_var} expands to a test
# flag
if [[ "${my_var}" ]] ; then

Wildcard Expansion of File Names

Use an explicit path when doing wildcard expansion of filenames.

As filenames can begin with a -, it’s a lot safer to expand wildcards with ./* instead of *

# Here's the contents of the directory:
# -f -r somedir somefile

# This deletes almost everything in the directory by force
psa@bilby$ rm -v *
removed directory: `somedir'
removed `somefile'

# As opposed to:
psa@bilby$ rm -v ./*
removed `./-f'
removed `./-r'
rm: cannot remove `./somedir': Is a directory
removed `./somefile' 


eval

Avoid eval.  It munges the input when used for assignment to variables and can set variables without making it possible to check what those variables were.

# What does this set?
# Did it succeed?  In part or whole?
eval $( set_my_variables )

# What happens if one of the returned values has a space in it?
variable="$( eval func_someFunction )" 

Pipes to while

Use process substitution or for loops in preference to piping to while.  Variables modified in a while loop do not propagate to the parent because the loop’s commands run in a subshell.  The implicit subshell in a pipe to while can make it difficult to track down bugs.

last_line='NULL'
your_command | while read -r line ; do
    last_line="${line}"
done

# This will output 'NULL'
printf '%s\n' "${last_line}" 

Use a for loop if you are confident that the input will not contain spaces or special characters (usually, this means not user input).

total=0
# Only do this if there are no spaces in return values.
for value in $( command ) ; do
    total+="${value}"
done

Using process substitution allows redirecting output but puts the commands in an explicit subshell rather than the implicit subshell that bash creates for the while loop.

total=0
last_file=''
while read -r count filename ; do
    total+="${count}"
    last_file="${filename}"
done < <( your_command | uniq -c )

# This will output the second field of the last line of output from
# the command.
printf '%s\n' "Total = ${total}"
printf '%s\n' "Last one = ${last_file}" 

Use while loops where it is not necessary to pass complex results to the parent shell; this is typically where some more complex parsing is required.

Beware that simple examples are probably more easily done with a tool such as awk.  This may also be useful where you specifically don’t want to change the parent scope variables.

# Trivial implementation of awk expression:
# awk '$3 == "nfs" { print $2 " maps to " $1 }' /proc/mounts
cat /proc/mounts | while read -r src dest type opts rest ; do
    if [[ ${type} == "nfs" ]] ; then
        printf '%s\n' "NFS ${dest} maps to ${src}"
    fi
done

Naming Conventions

Function Names

Names for both functions and variables should be clear as to purpose and origin.

Lower-case with underscores to separate words is preferred (while camel case is also acceptable).

So that functions are easily identifiable at all locations in the code it is preferred to prepend them with either f_ or func_ (so f_my_function or func_my_function).

Separate libraries with ::

Parentheses are required after the function name.

If you’re writing a package, separate package names with ::

Braces will typically be on the same line as the function name.

The function keyword is extraneous when the () is present after the function name, but this enhances quick identification of functions and thus is preferred.  (It also allows easy transformation of functions when porting to another shell.)

# Single function
function func_my_function() {

# Part of a package
mypackage::func_my_function() {

Variable Names

Names for both functions and variables should be clear as to purpose and origin.

There are three kinds of variables which are marked with prepends:  locals, constants (read-only), and arrays (both types).

Local variables get the l_ or loc_ prepend (l_variable or loc_variable).  Constants or read-only variables get the c_ or const_ prepend (c_variable, c_VARIABLE, const_variable, or const_VARIABLE); if it is your preference to ALL-CAPS your constants (a la C languages), only do so after the prepend.  Finally, arrays and associative arrays should get the a_ or aa_ prepend (or A_ or AA_, but be sure to avoid an all-caps array name), respectively.

Variable names within loops should be similarly named for any variable they are looping through.

for zone in ${zones} ; do
    something_with "${zone}"
done

Constants and Environment Variable Names

Names for both functions and variables should be clear as to purpose and origin.

Environment and system variables are all-caps variables; as such it is best to avoid using all-caps for your own variables (and thus avoid conflicts).

So that constants (read-only variables) are easily identifiable at all locations in the code it is preferred to prepend them with either c_ or const_ (so c_my_read_only).

Other languages use all caps for constants and so some may prefer that, but we can still avoid conflict by prepending those with c_ or const_ (so either const_VARIABLE or c_VARIABLE).

Declarations for these variables should come near the top of the file.

# Constant
readonly c_PATH_TO_FILES='/some/path'

# Both constant and environment
declare -xr c_ORACLE_SID='PROD' 

Some things become constant at their first setting (for example, via getopts).  Thus, it’s ok to set a constant in getopts or based on a condition, but it should be made read-only immediately afterwards.  Note that declare doesn’t operate on global variables within functions, so readonly or export are recommended instead.

while getopts 'v' flag ; do
    case "${flag}" in
        v) c_VERBOSE='true' ;;
readonly c_VERBOSE 

Source File Names

Lowercase with underscores to separate words if desired.  (Prefer underscore style over camel case, though both are acceptable.)  Avoid spaces in file paths!

Be consistent.

Read-Only Variables (aka Constants)

Names for both functions and variables should be clear as to purpose and origin.

When you declare a variable that is meant to be read-only, make this explicit.

Use readonly or declare -r to ensure they’re read only.

As globals are widely used in shells, it’s important to catch errors when working with them.

c_zip_version="$( dpkg --status zip | grep Version: | cut -d ' ' -f 2 )"
if [[ -z "${c_zip_version}" ]] ; then
    error_message
else
    readonly c_zip_version
fi

Local Variables

Names for both functions and variables should be clear as to purpose and origin.

To make local variables easy to parse and locate within the code they are prepended with l_ or loc_ (as l_my_local_variable or loc_my_local_variable).

Declare function-specific variables with local.  Ensure that local variables are only seen inside a function and its children by using local when declaring them.

Declaration and assignment should be on different lines.  This is especially true when the assignment value is provided by a command substitution as local does not propagate the exit code from the command substitution.

func_my_function2() {
    local loc_name="$1"

    # Separate lines for declaration and assignment:
    local loc_my_var
    loc_my_var="$( func_my_function )" || return

    # DO NOT do this:  $? contains the exit code of 'local', not func_my_function
    local loc_my_variable="$( func_my_function )"
    [[ $? -eq 0 ]] || return
}

Function Location

Put all functions together at the top of the file just below the constants.

Don’t hide executable code between functions!

Only includes, set statements, and constants should be done before declaring functions.


A function called main is required for scripts long enough to contain at least one other function.  (A script containing only one function ought to call that function main regardless.)

In order to easily find the start of the program, put the main program in a function called main, placed as the bottom-most function.

This provides consistency with the rest of the code base as well as allowing you to define more variables as local (which can’t be done if the main code is not a function).

The last non-comment line in the file should be a call to main:

main "$@"
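Put together, a minimal skeleton might look like the following sketch (func_helper is only an illustrative stand-in for real work):

```shell
#! /usr/bin/env bash

#  Functions  #

function func_helper() {
    # a stand-in for real work; only here to give main something to call
    printf '%s\n' "doing work"
}

function main() {
    func_helper "$@"
}

main "$@"
```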

Calling Commands

Checking Return Values

Always check return values and give informative return values.

For unpiped commands use $? or check directly via an if statement to keep it simple.

if ! mv "${file_list}" "${dest_dir}/" ; then
    printf '%s\n' "Unable to move ${file_list} to ${dest_dir}" >&2
    exit "${error_bad_move}"
fi

bash also has the PIPESTATUS variable that allows checking of the return code from all parts of a pipe.  If it’s only necessary to check success or failure of the whole pipe, then the following is acceptable:

tar -cf - ./* | ( cd "${dir}" && tar -xf - )
if [[ "${PIPESTATUS[0]}" -ne 0 || "${PIPESTATUS[1]}" -ne 0 ]] ; then
    printf '%s\n' "Unable to tar files to ${dir}" >&2
fi

However, PIPESTATUS will be overwritten as soon as you do any other command.

If you need to act differently on errors based on where it happened in the pipe, you’ll need to assign PIPESTATUS to another variable immediately after running the command.

Remember that [ is a command and will wipe out PIPESTATUS.

tar -cf - ./* | ( cd "${dir}" && tar -xf - )
return_codes=( "${PIPESTATUS[@]}" )
if [[ "${return_codes[0]}" -ne 0 ]] ; then
    printf '%s\n' "Unable to tar files. " >&2
fi
if [[ "${return_codes[1]}" -ne 0 ]] ; then
    printf '%s\n' "Unable to untar files into ${dir}. " >&2
fi

Built-in Commands v External Commands

Given the choice between invoking a shell built-in and invoking a separate process, choose the built-in.

Prefer the use of built-ins such as the Parameter Expansion functions in bash(1) as it’s more robust and portable (especially when compared to things like sed).

# Prefer this:
addition="$(( x + y ))"
substitution="${string/#foo/bar}"

# Instead of this:
addition="$( expr "${x}" + "${y}" )"
substitution="$( printf '%s' "${string}" | sed -e 's/^foo/bar/' )"
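Parameter expansion can stand in for several other external calls as well; a few illustrative examples (the path here is invented for the demonstration):

```shell
path='/usr/local/etc/config.cfg'

# basename without spawning a process
printf '%s\n' "${path##*/}"             # config.cfg
# dirname without spawning a process
printf '%s\n' "${path%/*}"              # /usr/local/etc
# simple substitution without sed
printf '%s\n' "${path/config/backup}"   # /usr/local/etc/backup.cfg
```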


Use common sense and be consistent!

Code Examples and Other Useful Tips


There are some scripts which make unusual use of read.  Here are some suggestions for fixing these and for future scripts.

# Offer a default.  (The -e is required for the -i to work.)
read -e -p "Enter the path to the file:  " -i "/usr/local/etc/" file_path
# or
read -ep "You may accept the default or delete it and type another: " -i "${mail_to}" mail_to

# Add a newline.
read -p "Please Enter a Message: "$'\n' message
printf '%s\n' "${message}"

# Do this...
read -e -p "Do you want to go again?  [y/n] "
printf '%s\n' "${REPLY}"
# ... or possibly this...
read -ep "Do you want to go again?  [y/n] " answer
printf '%s\n' "${answer}"
# ... but never this.
read "Do you want to go again?  [y/n] "
printf '%s\n' "${answer}"


As mentioned above, we should be using printf in scripts and avoiding echo.  Here are some pointers for printf.

# New lines.
printf '%s\n' "" "Make some space if you want your message to stand out. " ""

# More than one line.
printf "You can
make multi-line
prints as well.
"

# As printf has no included newline it must be specified.
printf "Just remember printf behaves like echo -n. "
printf '%s\n' "So if you want a new line, specify it using \\n. " ""

# Though there is likely no issue with using echo to output an empty line, you may as well use a good habit.
printf '\n'

# You want some color?
# (A list exists here: )
printf "Here is some \e[0;31mred\e[0m text. "
printf "Here is some \e[31mred\e[m text. "
printf '\n' 



This can be a very useful variable.  You can use it to test the results of a previous command as it contains the return status code of whatever came before it.

# $? contains the return status code of the previous command.
$ echo

$ echo $?
0
$ [ ]
$ echo $?
1
$ []
[]: command not found
$ echo $?
127
You may want to change the case of a string you have stored in a variable.  This is the simplest way we’ve found.
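The snippet itself appears to have gone missing here; the likeliest candidate is bash's case-modification parameter expansion (available since bash 4), shown with an invented string:

```shell
myString='Mixed Case Text'

# convert the whole string to lower case
printf '%s\n' "${myString,,}"   # mixed case text
# convert the whole string to upper case
printf '%s\n' "${myString^^}"   # MIXED CASE TEXT
```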


Of course you can also force case in your variable declaration.

# for lower-case letters conversion
declare -l lowerCaseLetters
# for upper-case letters conversion
declare -u upperCaseLetters
# any text input for variables thus declared will convert to the specified case 
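A quick sketch of that behavior, with invented variable names and values:

```shell
# any assignment to these variables is converted on the spot
declare -l loweredText
declare -u raisedText

loweredText='SHOUTY Input'
raisedText='quiet input'

printf '%s\n' "${loweredText}"   # shouty input
printf '%s\n' "${raisedText}"    # QUIET INPUT
```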

Decimal Numeration and Math

Let’s look at how we might handle decimal numeration.

declare numberWithDecimals
numberWithDecimals='3.14159' # any sample value will do
# print it as it is
printf '%s\n' "Original number: ${numberWithDecimals}"
# print the number with the decimal places removed
printf '%s\n' "Clipped: ${numberWithDecimals%.*}"
# printf the number rounded
printf '%s' "Rounded: " ; printf '%.0f' "${numberWithDecimals}" ; printf '\n' 

What about more basic math problems?

# subtraction here; addition and the rest work similarly
# the set of double parentheses provides a space for doing math
printf '%d\n' "$(( firstNumber - secondNumber ))"

Suppose you like decimals.

# get a start and stop time (or any pair of decimal numbers)
time_start="$( date +%s.%N )"
time_end="$( date +%s.%N )"
# subtract them via bc into another variable
time_elapsed="$( printf '%b\n' "${time_end} - ${time_start}" | bc )"
# and trim that output to three decimal places
time_elapsed="$( printf '%0.3f' "${time_elapsed}" )" 

Common Functions Farm

Function for pausing input to wait for the user to continue.

function func_EnterToContinue() {
    # just waits for the user before proceeding; a timer could be added later
    read -rp "Press [Enter] to continue... "
}

Check root and sudo.

It is a good idea to test for root use. It may also be useful to ensure a user is using or not using sudo when running a script.

function func_test_root() {
    if [[ ! "${USER}" == root ]] ; then
        printf '%s\n' "If this is not run with sudo, something something. " ""
        exit 0
    elif [[ "${USER}" == root ]] ; then # check if some naughty monster is logged in as root
        if [[ "${SUDO_USER}" == "" ]] ; then # an empty SUDO_USER means a direct root login, not sudo
            printf '%s\n' "" "It is a bad practice to log in as root. Use sudo instead. " ""
            exit 0
        fi
    fi
}

Or you may want to get the current user while testing for root and sudo.

function func_GetCurrentUser() {
    if [[ ! "${USER}" == root ]] ; then
        current_user="${USER}" # a plain user; take the name directly
    elif [[ "${USER}" == root ]] ; then # check if some naughty monster is logged in as root
        if [[ "${SUDO_USER}" == "" ]] ; then # an empty SUDO_USER means a direct root login, not sudo
            printf '%s\n' "" "It is a bad practice to log in as root. " "Log in as yourself and use sudo if necessary. " ""
            exit 0
        else
            current_user="${SUDO_USER}" # the user behind the sudo
        fi
    fi
}
Get a directory for working.

Function for obtaining working directory from user input.

function func_getContainingFolder() {
    # obtain directory in which to work
    printf '%s\n' "Hello. " ""
    printf '%s\n' "Ctrl-c at any time abandons any changes and exits the script. " ""
    while [[ ! -d "${containingFolderPath}" ]] ; do
        read -rep "Please provide the containing folder for the files to be renamed: " -i "${containingFolderPath}" containingFolderPath
        # expand the ~/ if it gets submitted (spaces are safe inside the quoted variable)
        containingFolderPath="${containingFolderPath/#~\//${HOME}/}"
        if [[ -d "${containingFolderPath}" ]] ; then
            printf '%s\n' "I have confirmed this is a directory. "
            printf '%s\n' "${containingFolderPath}" ""
        else
            printf '%s\n' "I cannot confirm this is a directory. "
            printf '%s\n' "${containingFolderPath}" ""
        fi
    done
}
Get numeric input.

You may want to ensure input is numeric.  Here is a pair of functions for testing ticket numbers.

function func_TestTicket_isNumeric() {
    # pre-fill the next prompt with just the digits from the passed value
    ticket_number=$( printf '%s' "${1}" | grep -E -o '[0-9]+' )
    if [[ "${1}" =~ ^[0-9]+$ ]] ; then # check the passed value is an integer with a regex
        return 0
    else
        return 1
    fi
}

function func_TestTicket_isNumeric_loop() {
    until func_TestTicket_isNumeric "${ticket_number}" ; do
        read -rep "Ticket number: " -i "${ticket_number}" ticket_number
    done
}

Enhance bash

Functions can also be used to enhance bash itself.  Here is a function for launching two scripts.  (Since this lives in a bashrc file and not in a script it lacks the func_ prepend.)

# add to /etc/bash.bashrc file for a function for all users

# function for using termuser_auto and termuser_manual shell scripts
function termuser() {
    case "${1}" in
        -a|--auto) termuser_auto ; return 0 ;;
        -m|--manual) termuser_manual ; return 0 ;;
        *)
            printf '%s\n' "In future you can call this function using -a or -m. "
            read -rp "Automated or manual? (a/m) " -n1
            printf '\n'
            if [[ "${REPLY}" == "a" ]] ; then
                termuser_auto
                return 0
            elif [[ "${REPLY}" == "m" ]] ; then
                termuser_manual
                return 0
            fi ;;
    esac
    printf '%s\n' "Oops. "
    return 1
}

Function v Method

I’ve been trying to get my head around the difference between a function and a method.  Part of the trouble in clearing this distinction is that, depending on the language in question, the naming conventions are all over the place.  C never calls anything a method while Java and C# never call anything a function.  Yet each does have both?  Yeah.  A rose by any other name would make people scratch their heads.

In essence, we can rephrase this problem as Classless Functions v Classed Functions.  A Classless Function is a function not associated with a class (nor a struct in C, nor a type in Go), while a Classed Function is a function tied to a class.  To call a classless, one merely calls the function by name, while to call a classed, one must call it using dot notation (class.function).

A Classless Function may be called a function or a static method and a Classed Function may be called a member function or a method.  See how this adds to the confusion?  No wonder developers get paid extra.

This question page has several answers which attempt to dive into this problem:

What’s the difference between a method and a function?


Quickly Make Directories Based on Files in Location

I needed to make a bunch of folders for a slew of singles I’d downloaded.  I wanted each single to have its own folder but when you download singles they tend to offer just the flac file and no containing folder.  Reasonable, I suppose.

Anyway, I wrote this to create the folders.

for f in ./* ; do ff=$( printf '%s\n' "${f//.\//}" ) ; mkdir "${ff//.flac/}" ; done 

Good enough.  I could have added something to move each file ("${f}") into the newly created folder ("${ff//.flac/}") but didn't think of that until I was done.  Probably an addition like this (untested):

for f in ./* ; do ff=$( printf '%s\n' "${f//.\//}" ) ; mkdir "${ff//.flac/}" ; mv "${f}" "./${ff//.flac/}/" ; done 

Enjoy! Maybe I’ll get around to testing the move version at some point.