Verify the validity of scripts with shellcheck:
shellcheck somescript.sh
var=value
with no spaces around the =, otherwise bash will try to run it as a command
echo "$var"
Always quote your variables
a=2 and a="2" are the same; both values are stored as strings. Technically there is a way to do arithmetic, but it's best avoided.
cat "${var}sometexttoappend"
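A runnable sketch pulling these assignment and quoting rules together (the variable names are illustrative):

```shell
#!/bin/bash
# Assignment: no spaces around '='
greeting="hello"

# Always quote expansions so the value stays one word
echo "$greeting"

# ${var} delimits the name when appending text directly
file="report"
echo "${file}_final.txt"   # -> report_final.txt
```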
Run env
to list all available environment variables
# To set an env variable
NEWENVVAR=somevalue
export NEWENVVAR
Child processes inherit environment variables but not shell variables
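A small demonstration of that inheritance rule, using a child bash to play the role of the child process (variable names are illustrative):

```shell
#!/bin/bash
SHELLVAR=local_only          # plain shell variable: not inherited
export NEWENVVAR=somevalue   # export makes it part of the environment

# The child (a fresh bash) sees only the exported variable
bash -c 'echo "child sees: ${NEWENVVAR:-unset} ${SHELLVAR:-unset}"'
# -> child sees: somevalue unset
```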
$ svg2png pic1.svg pic1.png
Access arguments with:
$0 -> svg2png, $1 -> pic1.svg, $2 -> pic1.png
When a command is run, you can access all arguments as an array with $@ (all of them except $0).
for i in "$@"
do
...
done
echo "$1" prints the first argument
shift removes the first argument
echo "$1" now prints what used to be the second argument
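The argument handling above can be tried in one block by faking the script arguments with set -- (the filenames are just the ones from the svg2png example):

```shell
#!/bin/bash
# 'set --' replaces the positional parameters, simulating script arguments
set -- pic1.svg pic1.png

echo "$1"   # -> pic1.svg
shift       # drops the first argument
echo "$1"   # -> pic1.png

# iterate over all remaining arguments
for arg in "$@"; do
  echo "arg: $arg"
done
```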
Most bash commands are programs, but there are builtins as well, such as:
- type
- source
- read
- echo
- cd
- printf
which somecommand or type somecommand
will tell you which it is.
Single quotes will not expand variables while double quotes will.
Thus echo 'home: $HOME'
will print home: $HOME literally, while echo "home: $HOME" expands the variable.
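A quick sketch of the two quoting styles side by side:

```shell
#!/bin/bash
# Single quotes: literal text, no expansion
echo 'home: $HOME'    # -> home: $HOME

# Double quotes: variables are expanded
echo "home: $HOME"    # -> home: followed by whatever $HOME holds
```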
grep is a way to match strings. If you want to match a regular expression, quote it with single quotes so the shell doesn't expand the pattern first:
egrep 'b.*' file.txt
vs grep b.* file.txt (the unquoted pattern may be glob-expanded by the shell before grep ever sees it)
When you glob like:
cat *.txt
it will match all text files, but not
the ones that start with a dot.
You have to ask for those explicitly:
cat .*.txt
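This hidden-file behavior can be checked in a throwaway directory (the filenames are illustrative):

```shell
#!/bin/bash
dir=$(mktemp -d)
cd "$dir" || exit 1
touch visible.txt .hidden.txt

echo *.txt     # matches visible.txt only
echo .*.txt    # matches .hidden.txt
```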
STDIN STDOUT STDERR
somecommand 2> file.txt
-> redirects stderr to file.txt
somecommand 2>&1
-> redirects stderr to stdout
sudo echo x > /etc/xyz
won't work: the redirection is performed by your non-root shell before sudo runs. Instead do:
echo x | sudo tee /etc/xyz
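The sudo tee trick itself needs root, but the stderr redirections above can be sketched with a deliberately failing command (the path here is made up so that ls errors):

```shell
#!/bin/bash
err=$(mktemp)

# 2> sends stderr to a file; stdout is untouched
ls /nonexistent-demo-path 2> "$err" || true
echo "stderr captured in $err"

# 2>&1 merges stderr into stdout, so the error message travels through the pipe
ls /nonexistent-demo-path 2>&1 | grep nonexistent-demo-path
```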
x=$((2+2))
does arithmetics
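A couple of runnable arithmetic examples; note that inside $(( )) variables can be referenced without the $:

```shell
#!/bin/bash
x=$((2+2))
echo "$x"        # -> 4

n=5
n=$((n * 3))     # no '$' needed on n inside $(( ))
echo "$n"        # -> 15
```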
(cd ~/downloads; pwd)
-> runs the commands in a subshell; changes like cd don't affect the current shell
{ cd ~/downloads; pwd; }
-> groups commands and runs them in the current process (note the required trailing ;)
command a{.png,.svg}
-> expands to command a.png a.svg
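A runnable sketch of the subshell vs. brace-group difference, using / and /tmp as known directories:

```shell
#!/bin/bash
cd /            # known starting point for the demo

(cd /tmp)       # subshell: the cd dies with the subshell
pwd             # -> /

{ cd /tmp; }    # brace group: runs in the current shell
pwd             # -> /tmp
```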
<(command)
-> process substitution: lets the command's output be read as if it were a file
x=(1 2 3)
creates an array
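A few basic array operations, continuing the x=(1 2 3) example:

```shell
#!/bin/bash
x=(1 2 3)
echo "${x[0]}"     # first element -> 1
echo "${#x[@]}"    # number of elements -> 3
echo "${x[@]}"     # all elements -> 1 2 3

x+=(4)             # append an element
echo "${#x[@]}"    # -> 4
```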
Bash features that POSIX shells lack:
- [[ ]] extended tests
- arrays - POSIX shells only have one array: $@ for arguments
- the local keyword - in POSIX all variables are global
- brace expansion: a.{png,svg}
- C-style loops, e.g. for ((i=0; i<3; i++))
- {1..5} - POSIX alternative: $(seq 1 5)
- $'\n' - POSIX alternative: printf '\n'
for i in panda swan
do
echo "$i"
done
for i in panda swan; do echo "$i"; done
for word in $(cat file.txt)
do
echo "$word"
done
while COMMAND
do
...
done
for i in {1..5}
do
...
done
read -r variable
-> reads STDIN into a variable
By default read uses IFS's value to strip leading/trailing whitespace. Set IFS='' to avoid this.
Something similar applies to for loops: for word in $(cat file.txt) iterates over every word in the file. Setting IFS=$'\n' makes the splitting happen on newlines only, so the loop runs over each line instead.
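The IFS effect on read can be seen directly (the padded string is illustrative); the while-read pattern below is the usual way to loop over lines:

```shell
#!/bin/bash
# Default IFS: read strips the surrounding whitespace
printf '   padded   \n' | { read -r line; echo "[$line]"; }        # -> [padded]

# IFS='' (written IFS=): whitespace is preserved
printf '   padded   \n' | { IFS= read -r line; echo "[$line]"; }   # -> [   padded   ]

# Looping line by line with while + read
printf 'one line\ntwo line\n' | while IFS= read -r line; do
  echo "line: $line"
done
```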
sayHello() {
  echo "Hello, $1"
}
function sayHello {
  echo "Hello, $1"
}
# With the function keyword, () may be omitted; parameters are not declared,
# they are accessed with $1, $2, $3
# local x -> sets a local variable x
# y=somevalue -> without local, the assignment is global
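A runnable version of the local-vs-global point, reusing the sayHello name from above:

```shell
#!/bin/bash
sayHello() {
  local name=$1        # local: visible only inside the function
  greeting="hi $name"  # no 'local': this leaks into the global scope
  echo "Hello, $name"
}

sayHello world          # -> Hello, world
echo "${name:-unset}"   # -> unset (the local died with the function)
echo "$greeting"        # -> hi world (the global survived)
```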
The OS creates a buffer for each pipe; when it fills up, the writer pauses until there is space again.
We can create named pipes:
mkfifo mypipe
ls > mypipe &
wc < mypipe
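The same pairing can be sketched end to end with a throwaway fifo (the path and the printf data are illustrative; the writer blocks until a reader opens the pipe):

```shell
#!/bin/bash
dir=$(mktemp -d)
mkfifo "$dir/mypipe"

printf 'a\nb\nc\n' > "$dir/mypipe" &   # writer waits in the background
wc -l < "$dir/mypipe"                  # reader counts 3 lines
wait                                   # reap the background writer
rm -r "$dir"
```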
${}
is powerful - a few usages:
- ${var:-$othervar} : use a default value if var is unset or null
- ${#var} : length of a string (or number of elements of an array)
- ${var:?some error} : prints "some error" and exits if the variable is unset or null
- search and replace:
  x="some text"
  ${x/text/newtext}    # replace only the first occurrence
  ${x//text/newtext}   # replace all occurrences
- ${var} : use when interpolating within a string etc.
- ${var:offset:length} : get a substring of var
- ${x#pattern} / ${x%pattern} : remove a prefix / suffix matching pattern from x
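The expansions above, run against concrete values (the filename is illustrative):

```shell
#!/bin/bash
x="some text"
echo "${x/text/words}"      # replace first occurrence -> some words
echo "${#x}"                # length -> 9
echo "${x:5:4}"             # substring from offset 5, length 4 -> text
echo "${unsetvar:-backup}"  # default value -> backup

f="archive.tar.gz"
echo "${f%.gz}"             # strip suffix -> archive.tar
echo "${f#archive.}"        # strip prefix -> tar.gz
```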
Schedule a background process
somecommand &
Wait for background processes to finish
somecommand &
somecommand2 &
wait
Keep processes running even after the shell session is closed with nohup
nohup command &
jobs
: list all background jobs spawned by the current shell
fg
: bring a background job into the foreground
bg
: resume a suspended job in the background
disown
: like nohup
but applied after the process has already started
Callback functions in bash. We can listen to many different events and act on them:
- unix signals(INT, TERM ...)
- script exits(EXIT)
- each line of code(DEBUG)
- function returns(RETURN)
trap 'kill $(jobs -p)' INT
Will kill all jobs on SIGINT signal
cleanup() {
  rm -rf "$TMPDIR"
}
trap cleanup EXIT
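The EXIT trap can be watched in a child bash so it fires at a predictable point; here an echo stands in for the rm cleanup:

```shell
#!/bin/bash
bash -c '
  cleanup() { echo "cleaning up"; }
  trap cleanup EXIT
  echo "doing work"
'
# prints:
# doing work
# cleaning up
```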
By default bash will continue working through the script even if one of its commands exits with a non-zero status code.
Also by default, referencing unset variables will not error.
To avoid this we can:
set -e # stop scripts on error
set -u # unset variables will error
set -o pipefail # by default, when piping commands, a failure on the left side does not fail the whole pipeline
# to combine all above
set -euo pipefail
set -x
-> will print out every line of the script as it executes
trap read DEBUG
-> will stop before each line of code
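The effect of these flags can be seen by running failing snippets in child shells (the variable name and the echoed messages are illustrative):

```shell
#!/bin/bash
# set -e: the script stops at 'false'; 'after' is never printed
bash -c 'set -e; false; echo after' || echo "stopped on error"

# set -u: referencing an unset variable is an error
bash -c 'set -u; echo "$not_defined"' 2>/dev/null || echo "unset caught"

# pipefail: a failure on the left of a pipe is no longer hidden by 'cat'
bash -c 'set -o pipefail; false | cat' || echo "pipe failure caught"
```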