Linux Command Line for Beginners: Navigate, Search, and Automate Simple Tasks

Daily-Use Mini Projects and a Linux Command Line Cheat Sheet

Chapter 10

Estimated reading time: 11 minutes

How to Use This Chapter

This chapter is a set of repeatable routines you can run on any Linux machine. Each mini-project includes: input data you can generate, a target output you can verify, and troubleshooting checkpoints to help you recover when results look wrong. The final lab combines multiple skills into one scenario and ends with concrete checks.

Mini-Project 1: Log Triage Pipeline (Find the Signal Fast)

Concept

Log triage means turning a large, noisy log into a short, actionable summary. The command line is ideal for this because you can chain small tools into a pipeline that: selects relevant lines, extracts fields, counts patterns, and saves results for later.
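As a minimal sketch of that select -> extract -> count shape, here is the same pipeline run on a tiny inline sample (printf stands in for a real log file):

```shell
# Select lines with ' ERROR ', extract the 3rd field, then count.
# Output: "2 x" because two ERROR lines share the value x.
printf '%s\n' 'a ERROR x' 'b INFO  y' 'c ERROR x' \
  | grep ' ERROR ' \
  | awk '{print $3}' \
  | sort | uniq -c | sort -nr
```

Every stage is a small, replaceable tool, which is why the same skeleton adapts to status codes, users, or components later in this project.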

Input Data (Create a Practice Log)

Create a small log file with mixed severity and repeated messages:

mkdir -p ~/mini-projects/log-triage && cd ~/mini-projects/log-triage
cat > app.log <<'EOF'
2026-01-16T09:00:01Z INFO  api user=alice action=login status=200
2026-01-16T09:00:02Z WARN  api user=bob   action=login status=401
2026-01-16T09:00:03Z ERROR db  query=select timeout=1s
2026-01-16T09:00:04Z INFO  api user=alice action=list status=200
2026-01-16T09:00:05Z ERROR api user=bob   action=login status=500
2026-01-16T09:00:06Z WARN  api user=carol action=login status=401
2026-01-16T09:00:07Z ERROR db  query=update deadlock=1
2026-01-16T09:00:08Z INFO  api user=carol action=list status=200
2026-01-16T09:00:09Z ERROR api user=bob   action=list status=500
EOF

Expected Output

  • A file errors.txt containing only ERROR lines.
  • A short summary showing counts by component (api/db) and by status code (401/500).
  • A “top offenders” list (e.g., which user appears most in error lines).

Step-by-Step

1) Extract ERROR lines into a reusable artifact

grep ' ERROR ' app.log > errors.txt
wc -l errors.txt

2) Count errors by component (3rd field: api/db)

awk '{print $3}' errors.txt | sort | uniq -c | sort -nr

3) Count authentication failures (status=401) and server errors (status=500)

grep -o 'status=[0-9]\+' app.log | sort | uniq -c | sort -nr

4) Find “top users” involved in ERROR lines (extract user=...)

grep ' ERROR ' app.log | grep -o 'user=[^ ]\+' | sort | uniq -c | sort -nr

5) Save a triage report (append multiple sections)

{
  echo '=== ERROR LINES ==='
  cat errors.txt
  echo
  echo '=== ERRORS BY COMPONENT ==='
  awk '{print $3}' errors.txt | sort | uniq -c | sort -nr
  echo
  echo '=== STATUS COUNTS (ALL) ==='
  grep -o 'status=[0-9]\+' app.log | sort | uniq -c | sort -nr
  echo
  echo '=== USERS IN ERROR LINES ==='
  grep ' ERROR ' app.log | grep -o 'user=[^ ]\+' | sort | uniq -c | sort -nr
} > triage-report.txt
ls -l triage-report.txt

Troubleshooting Checkpoints

  • Nothing matches: confirm the exact spacing/case. For example, grep ' ERROR ' expects spaces around ERROR. Try grep 'ERROR' app.log to test.
  • Awk field seems wrong: print the first few lines with field numbers: awk '{print "1=" $1, "2=" $2, "3=" $3, "4=" $4}' errors.txt | head.
  • Counts look off: ensure you are sorting before uniq -c. uniq only counts adjacent duplicates.
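The last checkpoint is easy to verify directly; this sketch compares the two pipelines on a tiny inline sample:

```shell
# uniq -c merges only ADJACENT duplicates, so 'a b a' yields three groups:
printf 'a\nb\na\n' | uniq -c
# Sorting first makes identical lines adjacent, so 'a' is counted once, as 2:
printf 'a\nb\na\n' | sort | uniq -c
```

If your counts ever look suspiciously low, a missing sort before uniq -c is the first thing to check.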

Mini-Project 2: Bulk Rename and Move (Organize a Messy Download Folder)

Concept

Bulk organization is about applying consistent naming rules and moving files into predictable folders. The safe approach is: preview changes, then perform them. You will use a “dry run” preview with echo and then run the real commands.
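The preview idea can be shown in isolation: prefixing echo to a command prints what would run without running it. A minimal sketch with a hypothetical filename:

```shell
# echo prints the command instead of executing it; nothing changes on disk.
f='Old Name.txt'
echo mv -n -- "$f" "old_name.txt"
# prints: mv -n -- Old Name.txt old_name.txt
```

Once the printed commands look right, you delete the echo and run them for real, exactly as the steps below do.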

Input Data (Create a Messy Folder)

mkdir -p ~/mini-projects/bulk-organize/incoming
cd ~/mini-projects/bulk-organize/incoming
touch 'Report Final (1).txt' 'Report Final (2).txt' 'holiday photo 1.JPG' 'holiday photo 2.JPG' 'notes 2026-01-16 .md'
ls -1

Expected Output

  • All filenames are “normalized” (lowercase, spaces to underscores, no trailing underscores).
  • Images moved into ../photos/, text/markdown into ../docs/.
  • No files lost; counts before and after match.

Step-by-Step

1) Create destination folders

cd ~/mini-projects/bulk-organize
mkdir -p photos docs

2) Preview a normalization rename (dry run)

This preview prints what would happen without changing anything:

cd ~/mini-projects/bulk-organize/incoming
for f in *; do
  new=$(printf '%s' "$f" | tr '[:upper:]' '[:lower:]' | tr ' ' '_' | sed 's/_\+/_/g; s/_\././g; s/_$//')
  if [ "$f" != "$new" ]; then
    echo mv -n -- "$f" "$new"
  fi
done

3) Perform the rename (remove echo)

for f in *; do
  new=$(printf '%s' "$f" | tr '[:upper:]' '[:lower:]' | tr ' ' '_' | sed 's/_\+/_/g; s/_\././g; s/_$//')
  if [ "$f" != "$new" ]; then
    mv -n -- "$f" "$new"
  fi
done
ls -1

4) Move files by type (preview first)

echo mv -n -- *.jpg ../photos/ 2>/dev/null
echo mv -n -- *.txt *.md ../docs/ 2>/dev/null

5) Perform the move

mv -n -- *.jpg ../photos/ 2>/dev/null
mv -n -- *.txt *.md ../docs/ 2>/dev/null

6) Verify counts and locations

cd ..
find . -maxdepth 2 -type f | sort
printf 'incoming=%s\n' "$(find incoming -type f | wc -l)"
printf 'photos=%s\n' "$(find photos -type f | wc -l)"
printf 'docs=%s\n' "$(find docs -type f | wc -l)"

Troubleshooting Checkpoints

  • “No such file” for globs: if there are no matches, the pattern stays literal in some shells. Redirecting errors (2>/dev/null) hides noise. You can also check matches first: ls *.jpg.
  • Name collisions: mv -n prevents overwriting. If two files normalize to the same name, decide a rule (e.g., add a suffix) before forcing.
  • Weird characters: always quote variables ("$f") and use -- before filenames to avoid option confusion.
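The quoting checkpoint is worth exercising once in a scratch directory (the path below is just an example). A name with a leading dash and a space hits both pitfalls at once:

```shell
# Unquoted $f would split on the space; without -- the leading dash
# would be parsed by mv as an option.
mkdir -p ~/mini-projects/quote-demo && cd ~/mini-projects/quote-demo
touch -- '-weird name.txt'
f='-weird name.txt'
mv -n -- "$f" 'renamed.txt'
ls -1
```

With quoting and --, the rename succeeds; remove either safeguard and the command misbehaves.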

Mini-Project 3: Permissions Audit (Spot Risky Files Quickly)

Concept

A permissions audit is a targeted scan to find files that are unexpectedly writable or executable. You are not changing permissions here; you are producing a short list for review. This is useful in shared directories, project folders, or scripts directories.
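Before scanning, it helps to see what -perm -0002 actually matches: any file whose "others can write" bit is set, regardless of the rest of the mode. A quick sketch in a scratch directory:

```shell
mkdir -p ~/mini-projects/perm-demo && cd ~/mini-projects/perm-demo
touch safe.txt risky.txt
chmod 644 safe.txt     # rw-r--r--: others cannot write
chmod 646 risky.txt    # rw-r--rw-: world-writable
find . -type f -perm -0002   # matches only ./risky.txt
```

The leading dash in -perm -0002 means "these bits at minimum", so modes like 666 or 777 match too.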

Input Data (Create a Small Audit Target)

mkdir -p ~/mini-projects/perm-audit/target
cd ~/mini-projects/perm-audit/target
echo 'secret' > private.txt
printf '#!/bin/sh\necho hi\n' > tool.sh
chmod 755 tool.sh
chmod o+w private.txt
ls -l

Expected Output

  • A report listing world-writable files.
  • A report listing executable files (useful for reviewing what can run).
  • Counts for each category.

Step-by-Step

1) Find world-writable files

cd ~/mini-projects/perm-audit
find target -type f -perm -0002 -print > world-writable.txt
cat world-writable.txt
wc -l world-writable.txt

2) Find executable files

find target -type f -perm -0111 -print > executables.txt
cat executables.txt
wc -l executables.txt

3) Produce a single audit report with context

{
  echo '=== WORLD-WRITABLE FILES (review carefully) ==='
  while IFS= read -r p; do ls -l "$p"; done < world-writable.txt
  echo
  echo '=== EXECUTABLE FILES (review what can run) ==='
  while IFS= read -r p; do ls -l "$p"; done < executables.txt
} > audit-report.txt
sed -n '1,120p' audit-report.txt

Troubleshooting Checkpoints

  • Empty results: confirm you are scanning the right directory and that test permissions exist. Run ls -l target to inspect.
  • Too many results on real systems: narrow scope (project folder only) and add -maxdepth or exclude directories with -path and -prune.
  • Symbolic links: if you want to include or exclude them explicitly, add -type l or avoid following links (default is not to follow).
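Excluding a directory with -path and -prune, as the second checkpoint suggests, looks like this (node_modules is just an example of a noisy directory):

```shell
mkdir -p ~/mini-projects/prune-demo/node_modules ~/mini-projects/prune-demo/src
touch ~/mini-projects/prune-demo/node_modules/pkg.js \
      ~/mini-projects/prune-demo/src/main.js
cd ~/mini-projects/prune-demo
# -prune stops find from descending into node_modules;
# the -o -type f -print branch emits everything else.
find . -path ./node_modules -prune -o -type f -print
```

Only ./src/main.js is printed; the file inside node_modules is never visited.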

Printable Linux Command Line Cheat Sheet (By Category)

Navigation

  • pwd show current directory
  • ls, ls -l, ls -a list files (long, all)
  • cd /path go to path
  • cd .. up one level
  • cd - toggle to previous directory
  • find . -maxdepth 2 -type d list directories (shallow)

File Operations

  • mkdir -p dir/subdir create directories
  • cp -r src/ dest/ copy directory recursively
  • mv old new rename/move
  • mv -n src dest move without overwriting
  • rm file, rm -r dir remove (be careful)
  • touch file create empty file / update timestamp

Viewing

  • cat file print file
  • less file scroll view
  • head -n 20 file first lines
  • tail -n 20 file last lines
  • tail -f file follow appended lines
  • wc -l file count lines

Permissions

  • ls -l view mode/owner/group
  • chmod 644 file set permissions (rw-r--r--)
  • chmod +x script.sh add execute bit
  • chown user:group file change owner/group (requires privileges)
  • find . -type f -perm -0002 world-writable files
  • find . -type f -perm -0111 executable files

Search / grep

  • grep 'pattern' file search
  • grep -n 'pattern' file include line numbers
  • grep -i 'pattern' file case-insensitive
  • grep -r 'pattern' dir/ recursive search
  • grep -v 'pattern' file invert match (exclude)
  • grep -o 'regex' file print only matching part

Pipes and Redirection

  • cmd1 | cmd2 pipe output to next command
  • cmd > out.txt overwrite output file
  • cmd >> out.txt append output file
  • cmd 2> err.txt redirect errors
  • cmd > out.txt 2>&1 redirect output and errors together
  • cmd | tee out.txt save and also display
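One redirection detail trips people up: the order of > and 2>&1 matters. 2>&1 duplicates stderr onto wherever stdout points at that moment, so it must come after the file redirection. A small check (both.txt is a throwaway file):

```shell
# stdout goes to both.txt first, then stderr is pointed at the same place,
# so both lines land in the file.
{ echo "to stdout"; echo "to stderr" >&2; } > both.txt 2>&1
wc -l < both.txt   # both lines are in both.txt
```

Writing `cmd 2>&1 > out.txt` instead would send stderr to the terminal and only stdout to the file.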

Scripting Templates (Copy/Paste)

Safe loop over files (handles spaces)

find . -type f -print0 | while IFS= read -r -d '' f; do
  echo "FILE: $f"
done

Make a timestamped backup before editing

cp -n -- config.conf "config.conf.$(date +%Y%m%d-%H%M%S).bak"

Batch run with a log

#!/bin/sh
set -eu
log="run-$(date +%Y%m%d-%H%M%S).log"
{
  echo "Started: $(date)"
  # commands here
  echo "Finished: $(date)"
} >"$log" 2>&1

Final Lab: Triage, Fix, and Automate One Step

Scenario

You are given a small “incident bundle” directory. Your tasks: navigate to it, search for errors, filter to the most important lines, edit a config value to reduce noise, and automate creation of a short report. You must verify outputs with checks.

Setup Input Data

mkdir -p ~/mini-projects/final-lab/bundle
cd ~/mini-projects/final-lab/bundle
cat > service.log <<'EOF'
2026-01-16 10:00:01 INFO  scheduler job=sync interval=5
2026-01-16 10:00:02 WARN  api request_id=aa1 status=401 user=bob
2026-01-16 10:00:03 ERROR api request_id=aa2 status=500 user=bob msg="upstream timeout"
2026-01-16 10:00:04 INFO  scheduler job=sync interval=5
2026-01-16 10:00:05 ERROR db  request_id=aa2 msg="deadlock"
2026-01-16 10:00:06 WARN  api request_id=aa3 status=401 user=carol
2026-01-16 10:00:07 INFO  scheduler job=sync interval=5
EOF
cat > app.conf <<'EOF'
# app configuration
LOG_LEVEL=INFO
SYNC_INTERVAL=5
EOF
ls -l

Tasks (Do in Order)

1) Navigate and confirm you are in the right place

cd ~/mini-projects/final-lab/bundle
pwd
ls -1

2) Search and extract only ERROR lines into a file

grep ' ERROR ' service.log > errors.txt
cat errors.txt

3) Filter to the most important identifiers (component and request_id)

Unlike the single-field ISO timestamps in Mini-Project 1, this log splits date and time into two fields, so the component is field 4 and the request_id is field 5:

awk '{print $4, $5}' errors.txt | sort | uniq -c | sort -nr > error-keys.txt
cat error-keys.txt

4) Edit configuration to reduce noise (change LOG_LEVEL to WARN)

Edit app.conf with your terminal editor and change LOG_LEVEL=INFO to LOG_LEVEL=WARN. Then verify:

grep '^LOG_LEVEL=' app.conf

5) Automate one step: generate a single incident report file

Create a small script that builds a report from the log and config. This script should be rerunnable and overwrite the report each time.

cat > make-report.sh <<'EOF'
#!/bin/sh
set -eu
out="incident-report.txt"
{
  echo "=== CONFIG ==="
  grep -E '^(LOG_LEVEL|SYNC_INTERVAL)=' app.conf
  echo
  echo "=== ERROR LINES ==="
  grep ' ERROR ' service.log || true
  echo
  echo "=== ERROR COUNTS BY COMPONENT ==="
  grep ' ERROR ' service.log | awk '{print $4}' | sort | uniq -c | sort -nr || true
  echo
  echo "=== TOP REQUEST_IDS IN ERRORS ==="
  grep ' ERROR ' service.log | grep -o 'request_id=[^ ]\+' | sort | uniq -c | sort -nr || true
} >"$out"
echo "Wrote $out"
EOF
chmod +x make-report.sh
./make-report.sh
sed -n '1,120p' incident-report.txt

Verification Checks (Must Pass)

  • test -s errors.txt && echo OK (file exists and is not empty)
  • grep -q '^LOG_LEVEL=WARN' app.conf && echo OK (config edited correctly)
  • grep -q '=== ERROR LINES ===' incident-report.txt && echo OK (report structure present)
  • grep -c ' ERROR ' service.log equals wc -l errors.txt (extraction matches)
  • test -x make-report.sh && echo OK (script is executable)

Troubleshooting Checkpoints

  • Report shows empty sections: confirm your grep pattern matches the log format. Try grep 'ERROR' service.log to test.
  • Script stops unexpectedly: with set -eu, missing matches can cause failures in pipelines. The template uses || true after greps to keep the report generation robust when there are no errors.
  • Counts don’t match: ensure you are reading the same file and not an older copy in a different directory; re-check with pwd and ls -l.
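The second checkpoint can be reproduced in miniature; this sketch (with a hypothetical demo.log containing no errors) shows why || true keeps a set -e script alive:

```shell
#!/bin/sh
set -eu
printf 'INFO ok\n' > demo.log
# grep exits 1 when nothing matches; under set -e that alone would stop
# the script here. || true converts the failure status to success.
grep ' ERROR ' demo.log || true
echo "still running"
```

Without the || true, "still running" would never print and the report would be left half-written.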

Now answer the exercise about the content:

In a log triage pipeline, why is it important to run sort before uniq -c when counting repeated values (like status codes or users)?

Answer: uniq -c counts only repeated lines that are next to each other. Using sort first groups identical items together so the counts reflect all occurrences.