Why shell scripts?
A shell script is a plain text file containing a sequence of commands that the shell runs for you. Scripts help you repeat tasks reliably, reduce typing mistakes, and make your work reproducible. In this chapter you will build small scripts with inputs (parameters), decisions (conditionals), repetition (loops), and structure (functions), while keeping safety in mind.
1) The shebang: choosing the interpreter
The first line of many scripts is the shebang, which tells the system which interpreter should run the file.
```bash
#!/usr/bin/env bash
```

`/usr/bin/env bash` is a common, portable way to find bash in your environment. After the shebang, write normal shell commands.
Step-by-step: your first script file
- Create a file named `hello.sh`.
- Add:

```bash
#!/usr/bin/env bash
echo "Hello from a script"
```

2) Making scripts executable (and running them)
To run a script as a program, it needs execute permission. Then you can run it using a relative or absolute path.
```bash
chmod +x hello.sh
./hello.sh
```

If you do not make it executable, you can still run it by explicitly calling the interpreter:
```bash
bash hello.sh
```

3) Variables and quoting (the most common source of bugs)
Variables store values for reuse. In Bash, you assign without spaces and reference with $.
```bash
name="Ada"
echo "Hello, $name"
```

Quoting rules you should use by default
- Use double quotes around variable expansions unless you explicitly want word splitting and globbing.
- Prefer `"$var"`, not `$var`.
- Use single quotes to prevent expansion: `'$HOME'` prints literally `$HOME`.
Example showing why quotes matter:
```bash
folder="My Files"
# Good:
ls "$folder"
# Risky (breaks into two words: My and Files):
ls $folder
```

4) Parameters: taking input with $1, $2, and friends
Scripts can accept command-line parameters. The first argument is $1, the second is $2, and so on. $0 is the script name.
```bash
#!/usr/bin/env bash
echo "Script: $0"
echo "First arg: $1"
echo "Second arg: $2"
```

Two very useful special variables:
- `$#`: the number of arguments
- `$@`: all arguments as separate quoted words when used as `"$@"`
Example: forward all arguments safely to another command:
```bash
some_command "$@"
```

5) Exit codes: signaling success or failure
Every command returns an exit code: 0 means success; non-zero means failure. Your scripts should also return meaningful exit codes so other tools (or you) can detect failures.
```bash
#!/usr/bin/env bash
if cp source.txt dest.txt; then
  echo "Copy succeeded"
  exit 0
else
  echo "Copy failed" >&2
  exit 1
fi
```

You can check the exit code of the last command with `$?`:
```bash
some_command
code=$?
echo "Exit code was: $code"
```

6) Basic conditionals with if
Use if to make decisions. In Bash, [ ... ] (test) is commonly used for file checks and string comparisons.
Common file tests
- `-e`: exists
- `-d`: is a directory
- `-f`: is a regular file
```bash
#!/usr/bin/env bash
path="$1"
if [ -z "$path" ]; then
  echo "Usage: $0 PATH" >&2
  exit 2
fi
if [ -d "$path" ]; then
  echo "It's a directory"
else
  echo "Not a directory" >&2
  exit 1
fi
```

Notes for safety: always quote variables inside tests, and keep spaces around `[` and `]`.
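To see the failure mode that quoting prevents, here is a short sketch (the variable and its value are hypothetical) comparing a quoted test with what the unquoted form would expand to:

```bash
#!/usr/bin/env bash
# Hypothetical value containing a space, to show why quoting matters.
value="two words"

# Quoted: the test sees exactly one word on each side and works.
if [ "$value" = "two words" ]; then
  echo "quoted test: match"
fi

# Unquoted, [ $value = "two words" ] would expand to
# [ two words = "two words" ] -- four arguments instead of three --
# and fail with "bash: [: too many arguments".
```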
7) Simple loops: for
Loops help you repeat actions over a list of items.
Loop over a fixed list
```bash
for ext in txt md log; do
  echo "Handling extension: $ext"
done
```

Loop over script arguments safely
```bash
for item in "$@"; do
  echo "Arg: $item"
done
```

Be careful with patterns like `for f in *`: the result depends on the current directory, and when nothing matches, the loop runs once with the literal string `*` instead of not running at all. Filenames with spaces are safe as long as you quote `"$f"` when you use it, because each glob match stays one word. When in doubt, test with a dry-run pattern (shown below).
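One way to make glob loops predictable is bash's `nullglob` option. A minimal sketch, using a hypothetical temporary directory so it is self-contained:

```bash
#!/usr/bin/env bash
# With nullglob set, a pattern that matches nothing expands to nothing,
# so the loop body simply does not run -- instead of running once with
# the literal string "*.txt".
shopt -s nullglob

dir="$(mktemp -d)"                    # hypothetical scratch directory
touch "$dir/a file.txt" "$dir/b.txt"  # note the space in the first name

for f in "$dir"/*.txt; do
  echo "Would process: $f"            # quoting "$f" keeps the space intact
done

rm -r "$dir"
```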
8) Functions: readability and reuse
Functions let you group logic into named blocks. This makes scripts easier to read and maintain.
```bash
#!/usr/bin/env bash
log() {
  echo "[$(date +%H:%M:%S)] $*"
}
die() {
  echo "ERROR: $*" >&2
  exit 1
}
log "Starting"
# ...
log "Done"
```

`$*` inside a function collects the function arguments into a single string. For safer handling of multiple arguments, you can also use `"$@"` depending on your needs.
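The difference between `$*` and `"$@"` shows up as soon as an argument contains a space. A small sketch (the function names here are hypothetical):

```bash
#!/usr/bin/env bash
count_args() {
  echo "$#"
}

compare() {
  count_args "$@"   # passes each argument through as its own word
  count_args "$*"   # joins all arguments into one single word
}

compare one "two words" three
# Prints 3, then 1
```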
9) Safer scripting practices
Using set -e (and its considerations)
set -e tells Bash to exit when a command fails. This can prevent scripts from continuing in a broken state, but it can also surprise you in conditionals and pipelines. Use it deliberately.
```bash
#!/usr/bin/env bash
set -e
```

Practical guidance:
- Use `set -e` for scripts where any failure should stop the run.
- When you expect a command might fail and you want to handle it, wrap it in an `if` or use explicit error handling.
- Consider adding your own checks and clear error messages rather than relying only on `set -e`.
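The second point above can be sketched like this: a command that runs as an `if` condition does not trip `set -e`, so an expected failure can be handled explicitly (the `grep` target here is just an illustrative stand-in):

```bash
#!/usr/bin/env bash
set -e

# grep exits non-zero when it finds nothing, but because it runs as an
# if condition, set -e does not kill the script.
if ! grep -q "needle" /dev/null; then
  echo "pattern not found, handling it and continuing"
fi

# An unguarded failing command here -- e.g. a bare `false` -- would
# terminate the script immediately with that command's exit code.
echo "reached the end"
```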
Echo for debugging
Strategic echo statements help you see what the script is doing, especially with variables and paths.
```bash
echo "DEBUG: src=$src"
echo "DEBUG: dest=$dest"
```

Tip: send debug output to stderr so it does not mix with "real" output:

```bash
echo "DEBUG: something" >&2
```

Dry-run patterns (do nothing, but show what would happen)
A dry-run mode prints actions instead of executing them. This is one of the simplest ways to make scripts safer.
```bash
DRY_RUN=0
run() {
  if [ "$DRY_RUN" -eq 1 ]; then
    echo "DRY-RUN: $*"
  else
    "$@"
  fi
}
```

Now use `run` instead of calling commands directly:

```bash
run mkdir -p "$dest"
run cp -a "$src" "$dest/"
```

Mini-lab: backup a folder to a timestamped directory and log results
You will write a script that:
- Takes a source folder as `$1` and an optional destination base folder as `$2`
- Creates a timestamped backup directory
- Copies the source into it
- Logs what happened to a log file
- Supports a `--dry-run` flag
Step 1: Create the script skeleton
Create a file named backup_folder.sh with this content:
```bash
#!/usr/bin/env bash
set -e

DRY_RUN=0

log() {
  # Writes to both screen and log file (LOG_FILE must be set)
  local msg="$*"
  echo "$(date '+%Y-%m-%d %H:%M:%S') $msg" | tee -a "$LOG_FILE"
}

die() {
  echo "ERROR: $*" >&2
  exit 1
}

run() {
  if [ "$DRY_RUN" -eq 1 ]; then
    log "DRY-RUN: $*"
  else
    "$@"
  fi
}

usage() {
  echo "Usage: $0 [--dry-run] SOURCE_DIR [DEST_BASE_DIR]" >&2
}

# Parse optional flag
if [ "${1:-}" = "--dry-run" ]; then
  DRY_RUN=1
  shift
fi

SRC="${1:-}"
DEST_BASE="${2:-$HOME/backups}"

if [ -z "$SRC" ]; then
  usage
  exit 2
fi

if [ ! -d "$SRC" ]; then
  die "Source is not a directory: $SRC"
fi

TS="$(date +%Y%m%d_%H%M%S)"
BACKUP_DIR="$DEST_BASE/backup_$TS"
LOG_FILE="$BACKUP_DIR/backup.log"

# Create destination and log file
if [ "$DRY_RUN" -eq 1 ]; then
  echo "DRY-RUN: would create $BACKUP_DIR"
else
  mkdir -p "$BACKUP_DIR"
fi

# Now that BACKUP_DIR exists (unless dry-run), set up logging
if [ "$DRY_RUN" -eq 1 ]; then
  # In dry-run, log to stderr to avoid pretending a file exists
  LOG_FILE="/dev/stderr"
fi

log "Backup starting"
log "Source: $SRC"
log "Destination: $BACKUP_DIR"

# Copy source folder into backup directory
# -a preserves attributes and copies directories recursively
run cp -a "$SRC" "$BACKUP_DIR/"

log "Backup finished successfully"
```

Step 2: Make it executable and run a dry-run
```bash
chmod +x backup_folder.sh
./backup_folder.sh --dry-run /path/to/source
```

Verify that it prints the actions it would take, without creating files.
Step 3: Run a real backup
```bash
./backup_folder.sh /path/to/source
```

This should create a directory like $HOME/backups/backup_20260116_153012 containing a copy of your source folder and a backup.log file.
Step 4: Use a custom destination base directory
```bash
./backup_folder.sh /path/to/source /path/to/backup_root
```

Step 5: Add a simple loop to back up multiple folders
If you want to back up several folders in one run, you can extend the script to accept multiple sources. One simple approach is to treat all remaining arguments as sources and use $DEST_BASE from an option or environment variable. For practice, modify the script so it loops over "$@" and performs the same backup steps for each source directory.
```bash
for SRC in "$@"; do
  if [ ! -d "$SRC" ]; then
    log "Skipping (not a directory): $SRC"
    continue
  fi
  TS="$(date +%Y%m%d_%H%M%S)"
  BACKUP_DIR="$DEST_BASE/backup_$TS"
  LOG_FILE="$BACKUP_DIR/backup.log"
  run mkdir -p "$BACKUP_DIR"
  log "Backing up: $SRC"
  run cp -a "$SRC" "$BACKUP_DIR/"
  log "Done: $SRC"
done
```

When you add loops like this, dry-run mode becomes even more valuable: you can confirm the script will touch the right folders before it actually copies anything.