L2 Support Engineer · Fintech · Week 3
Week 3 · Day 3 & Day 4

Log Parsing & Cron Jobs

Day 3 teaches you to extract exactly what you need from logs using powerful command combinations. Day 4 teaches you to schedule tasks so they run automatically without you being there.

Day 3 Log Parsing
01 The Simple Idea
Real-life Analogy

Think of a log file like a very long CCTV recording. You don't watch 8 hours of footage to find one incident. You fast-forward, filter by time, zoom in.

grep, awk, and pipes are your fast-forward and zoom tools — they let you cut through millions of log lines and extract exactly the one piece of information you need in seconds.

02 Commands — grep, awk & Pipes
grep
Search & Filter
Advanced patterns for log investigation
Used for: Finding lines that match a pattern. Combine with flags for powerful filtering.
grep — useful flags
# Count ERROR lines
grep -c "ERROR" payment.log

# Show line number of each ERROR
grep -n "ERROR" payment.log

# Show 3 lines BEFORE the error (context)
grep -B 3 "ERROR" payment.log

# Show 3 lines AFTER the error (context)
grep -A 3 "ERROR" payment.log

# Search for multiple patterns at once
grep -E "ERROR|WARN|TIMEOUT" payment.log

# Search recursively in all files in a folder
grep -r "DB_TIMEOUT" /var/log/
💡 L2 daily use: grep -c "ERROR" payment.log gives you the exact error count for a client report in one second.
awk
Extract Columns
Pull specific fields from each log line
What it does: awk treats each log line like a table row and lets you pick specific columns (fields) from it. By default, fields are separated by spaces. $1 = first word, $2 = second word, and so on.
awk examples
# Sample log line:
# [2024-03-15 14:02:05] ERROR DB_TIMEOUT TXN-9823
# $1=date $2=time $3=level $4=error $5=txn_id

# Print only the timestamp (field 1 and 2)
awk '{print $1, $2}' payment.log

# Print only the error type (field 4)
awk '{print $4}' payment.log

# Print lines where field 3 equals ERROR
awk '$3 == "ERROR" {print $0}' payment.log

# Custom separator — for CSV files
awk -F',' '{print $2}' report.csv
💡 L2 use: Extract just the TXN IDs of all failed transactions from a log — no manual copy-pasting. Send the list straight to the client.
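The tip above can be written as a one-liner. A sketch, assuming the sample line format from the awk examples ($3 is the level, $5 is the TXN ID):

```shell
# Print the TXN ID of every ERROR line. Field positions assume the
# sample format shown above: [date time] LEVEL ERROR_CODE TXN_ID
awk '$3 == "ERROR" {print $5}' payment.log
```

Redirect with `> txn-ids.txt` to get a file you can attach to the client email.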
pipes |
Connect Commands
Chain commands together — output of one becomes input of next
What it does: The pipe | connects two commands — the output of the first becomes the input of the second. This is how you build powerful one-line commands that do multiple things at once.
pipe combinations
# Watch live log but only show ERROR lines
tail -f payment.log | grep "ERROR"

# Count how many unique error types appear
grep "ERROR" payment.log | awk '{print $4}' | sort | uniq -c

# Get top 5 most repeated errors
grep "ERROR" payment.log | awk '{print $4}' | sort | uniq -c | sort -rn | head -5

# Extract disk usage % as a bare number (strips the % sign)
df / | awk 'NR==2 {print $5}' | tr -d '%'
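The last pipe only extracts the number. A sketch of the complete check (the 80% threshold is an example value, not a standard):

```shell
# Alert if root filesystem usage crosses a threshold
USAGE=$(df / | awk 'NR==2 {print $5}' | tr -d '%')
if [ "$USAGE" -gt 80 ]; then
  echo "WARNING: root disk at ${USAGE}%"
else
  echo "OK: root disk at ${USAGE}%"
fi
```

This is the shape of the health-check scripts you'll schedule with cron on Day 4.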
🔗 Pipe Chain Explained — Most Used Combination
Command → What it does in the chain
grep "ERROR" payment.log → Get all lines containing ERROR
| awk '{print $4}' → From those lines, extract only the 4th field (the error code)
| sort → Sort the error codes alphabetically so duplicates sit together
| uniq -c → Count how many times each unique error code appears
| sort -rn → Sort by count, highest first
| head -5 → Show only the top 5 most frequent errors
💡 This single pipe chain tells you which 5 errors are causing the most failures — the most useful command you'll run after an outage.
03 Day 3 Lab — Extract ERROR Counts

🔬 Lab: Parse Logs & Extract Error Report

Kali Linux · Terminal
01
Create a sample log file to parse
Use the log file you created in Week 2 Day 2, or create a fresh one.
terminal
cat ~/payment-service.log # view it first
02
Count total ERROR lines
Get the exact number of errors in the log.
terminal
grep -c "ERROR" ~/payment-service.log
→ Expected output: 5
03
Extract unique error types and their count
Use the full pipe chain to get a ranked error summary.
terminal
grep "ERROR" ~/payment-service.log | awk '{print $4}' | sort | uniq -c | sort -rn
→ Expected: count + error type, sorted highest first
04
Save the error report to a file
Redirect the output into a report file you can share.
terminal
grep "ERROR" ~/payment-service.log | awk '{print $4}' | sort | uniq -c | sort -rn > ~/error-report.txt
cat ~/error-report.txt
→ error-report.txt created — contains ranked error summary ✅
05
Find errors with context — 2 lines before and after
See what happened just before and after each error.
terminal
grep -B 2 -A 2 "ERROR" ~/payment-service.log
→ Shows each ERROR with 2 lines of context — reveals the build-up ✅
Day 4 Cron Jobs
04 The Simple Idea
Real-life Analogy

Think of a bank's scheduled reports. Every morning at 9 AM, the system automatically generates yesterday's transaction summary and emails it to management. Nobody manually triggers it — it's scheduled.

Cron jobs are exactly that — you tell Linux "run this script every day at 8 AM" and it does it automatically forever, even when you're asleep.

05 Crontab — How It Works

What is Crontab?

Crontab is Linux's built-in task scheduler. You give it a time pattern + a command and it runs that command automatically at the right time. Every scheduled job is called a cron job.

To open and edit your cron schedule, you run: crontab -e

⏱️ Cron Syntax — 5 Fields + Command
Minute · * · 0–59
Hour · * · 0–23
Day of Month · * · 1–31
Month · * · 1–12
Day of Week · * · 0=Sun, 6=Sat
Command · /path/script.sh · full path
Cron Expression → When it runs
0 8 * * * → Every day at 8:00 AM
0 8 * * 1 → Every Monday at 8:00 AM
*/30 * * * * → Every 30 minutes
0 0 * * * → Every day at midnight
0 8,17 * * * → Every day at 8 AM and 5 PM
0 8 1 * * → 1st of every month at 8 AM
crontab
Scheduler
Schedule any script or command to run automatically
crontab commands
# Open crontab editor to add/edit jobs
crontab -e

# List all current cron jobs
crontab -l

# Remove all cron jobs (careful!)
crontab -r
example cron jobs for L2 use
# Run health check every day at 8 AM — save output to log
0 8 * * * /home/kali/health-check.sh >> /home/kali/daily-report.txt

# Run error count script every hour
0 * * * * /home/kali/error-count.sh

# Run disk check every 30 minutes
*/30 * * * * /home/kali/disk-check.sh

# Run log cleanup every Sunday at midnight
0 0 * * 0 /home/kali/cleanup-logs.sh
⚠️ Important: Always use the full path to your script in crontab: not ./script.sh but /home/kali/script.sh. Cron runs jobs from its own minimal environment and doesn't know your current directory.
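Most cron implementations (Vixie cron/cronie, which Kali ships) also let you set environment variables at the top of the crontab. A hedged sketch of what that looks like — the SHELL and PATH values below are illustrative, not required:

```shell
# Optional lines at the top of your crontab (crontab -e) to pin
# the environment cron jobs run with — values here are examples.
SHELL=/bin/bash
PATH=/usr/local/bin:/usr/bin:/bin

# Jobs below now run under bash with a known PATH
0 8 * * * /home/kali/health-check.sh >> /home/kali/health-report.txt
```

Without a PATH line, cron typically provides only a minimal PATH, which is why a script that works in your terminal can fail silently under cron.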
06 Day 4 Lab — Schedule Daily Health Check

🔬 Lab: Schedule Your Health Check Script with Crontab

Kali Linux · Crontab
01
Create a log parser script to schedule
This script counts errors and saves the result — this is what cron will run.
error-count.sh
#!/bin/bash
LOG="/home/kali/payment-service.log"
OUT="/home/kali/daily-error-report.txt"
echo "=== Error Report: $(date) ===" >> "$OUT"
grep -c "ERROR" "$LOG" >> "$OUT"
grep "ERROR" "$LOG" | awk '{print $4}' | sort | uniq -c >> "$OUT"
02
Give it permission and test it manually first
Always test a script manually before scheduling it.
terminal
chmod +x /home/kali/error-count.sh
/home/kali/error-count.sh
cat /home/kali/daily-error-report.txt
→ Report file created with timestamp and error counts ✅
03
Open crontab and schedule it
Add two cron jobs — one for daily health check, one for hourly error count.
crontab -e
# Daily health check at 8 AM
0 8 * * * /home/kali/health-check.sh >> /home/kali/health-report.txt

# Error count every hour
0 * * * * /home/kali/error-count.sh
04
Confirm the cron jobs are scheduled
List all active cron jobs to confirm they were saved correctly.
terminal
crontab -l
→ Both jobs listed — health check at 8 AM, error count hourly ✅
07 Quick Cheat Sheet — Day 3 & 4
⌨️ Log Parsing & Cron Cheat Sheet
grep -c "ERR" file → Count matching lines
grep -n "ERR" file → Show line numbers with matches
grep -B 2 -A 2 "ERR" → Show 2 lines before and after each match
grep -E "ERR|WARN" → Match multiple patterns at once
awk '{print $2}' → Print the 2nd field of each line
awk -F',' '{print $1}' → Use comma as field separator (CSV)
sort | uniq -c → Sort and count duplicates
sort -rn | head -5 → Top 5 highest-count items
cmd > file.txt → Save output to file (overwrites)
cmd >> file.txt → Append output to file (adds to end)
crontab -e → Open cron editor to add/edit jobs
crontab -l → List all current cron jobs
08 Real L2 Scenarios
01

After an outage, manager asks: "What were the top 3 errors?" — You run the full pipe chain in 5 seconds: grep "ERROR" payment.log | awk '{print $4}' | sort | uniq -c | sort -rn | head -3 — done.

02

Client says: "Can you send us the error summary every morning?" — You schedule error-count.sh with crontab at 8 AM. It runs automatically every day, saves to a file, and you forward it. No manual work.

03

You need to find all logs that contain a specific TXN ID across 20 files: grep -r "TXN-9823" /var/log/ searches every file recursively in one command.

04

Disk is filling up because old logs are never deleted. You write a cleanup script and schedule it with crontab: every Sunday at midnight (0 0 * * 0) it runs automatically and keeps the disk healthy without manual intervention.
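The cleanup script in scenario 04 isn't shown in this lesson. A minimal sketch, assuming logs live in /home/kali/logs and "old" means more than 7 days — both values are illustrative:

```shell
#!/bin/bash
# Hypothetical cleanup-logs.sh (not one of the course's files).
# Deletes *.log files older than 7 days under LOG_DIR.
LOG_DIR="/home/kali/logs"
find "$LOG_DIR" -name "*.log" -type f -mtime +7 -delete
```

Pair it with the crontab line from section 05: 0 0 * * 0 /home/kali/cleanup-logs.sh.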

✅ Week 3 · Day 3 & 4 Outcomes