@ravikant-pal
Last active July 15, 2025 07:02
Log Master Challenge

📂 Log Master — The Ultimate Shell Text Processing Challenge 🧠💥

⏱ Duration: 25 minutes


🎯 Objective

Become the Log Master by:

  1. Counting the ERROR, WARNING, and INFO messages across all log files
  2. Creating a summary report
  3. Archiving the logs with a timestamp
  4. Bonus: Compressing and cleaning up old log files

πŸ“ Setup Instructions

  1. Run this command to generate sample log files:
mkdir -p log-master/logs && cd log-master/logs

for i in {1..5}; do
  for j in {1..10}; do
    echo "$(date '+%Y-%m-%d %H:%M:%S') [INFO] Service started" >> app$i.log
    echo "$(date '+%Y-%m-%d %H:%M:%S') [WARNING] Memory usage high" >> app$i.log
    echo "$(date '+%Y-%m-%d %H:%M:%S') [ERROR] Failed DB connection" >> app$i.log
  done
done

cd ../..
echo "✅ 5 log files generated in log-master/logs/"

🧪 Your Tasks

You'll work inside the log-master/ directory.

1. 📊 Count Log Types

Write commands to count how many times each log level appears across all logs.

Example:

grep -roh 'ERROR' logs/ | wc -l
grep -roh 'WARNING' logs/ | wc -l
grep -roh 'INFO' logs/ | wc -l

🧠 Hint: Try using cut, sort, and uniq -c if the logs were more complex!
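Building on that hint, here is one way to tally every level in a single pass (a sketch; the sample data below stands in for the generated logs):

```shell
# Create a tiny stand-in log for demonstration:
mkdir -p logs
printf '%s\n' '2025-07-15 07:00:01 [INFO] Service started' \
              '2025-07-15 07:00:02 [ERROR] Failed DB connection' \
              '2025-07-15 07:00:03 [ERROR] Failed DB connection' > logs/app1.log

# Pull the bracketed level out of every line, strip the brackets, then tally:
grep -rohE '\[(ERROR|WARNING|INFO)\]' logs/ | tr -d '[]' | sort | uniq -c
# Output (counts): 2 ERROR, 1 INFO
```

One pass over the data scales better than three separate greps when the logs grow.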

2. πŸ“ Create a Summary Report

Write the counts to a file called log_summary.txt in this format (with the generated sample logs, each level appears 50 times):

ERROR: 50
WARNING: 50
INFO: 50

Use a script or compound command:

{
  echo "ERROR: $(grep -roh 'ERROR' logs/ | wc -l)"
  echo "WARNING: $(grep -roh 'WARNING' logs/ | wc -l)"
  echo "INFO: $(grep -roh 'INFO' logs/ | wc -l)"
} > log_summary.txt

3. 📦 Archive Logs

Compress the entire logs/ folder into a tarball named with today's date:

tar -czvf logs_$(date +%F).tar.gz logs/

4. 🧹 Cleanup (Bonus)

Delete the original logs/ directory after compression (simulating archive-then-clean log rotation):

rm -rf logs/
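A safer variant worth practicing (a sketch; the demo data is only for illustration): verify the tarball is readable before deleting, so a failed tar never costs you the logs.

```shell
# Demo data, standing in for real logs:
mkdir -p logs && echo 'sample line' > logs/app1.log

archive="logs_$(date +%F).tar.gz"
tar -czf "$archive" logs/

# Only remove the originals if the tarball can actually be listed:
tar -tzf "$archive" > /dev/null && rm -rf logs/
```

Chaining with `&&` means the `rm` runs only when the verification step succeeds.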

πŸ† Bonus Challenges

  • Sort logs by frequency of log level: sort | uniq -c | sort -nr
  • Combine everything into a bash script called process_logs.sh
  • Use awk to extract timestamps and group by hour (advanced)
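For the advanced awk bonus, one possible sketch that groups entries by hour (the field positions assume the `YYYY-MM-DD HH:MM:SS` timestamp format from the setup script):

```shell
# Demo data, standing in for the generated logs:
mkdir -p logs
printf '%s\n' '2025-07-15 07:02:11 [INFO] Service started' \
              '2025-07-15 07:15:42 [ERROR] Failed DB connection' \
              '2025-07-15 08:01:05 [WARNING] Memory usage high' > logs/app1.log

# $2 is the HH:MM:SS field; substr takes the first two characters (the hour):
awk '{ print substr($2, 1, 2) }' logs/*.log | sort | uniq -c
# Output (entries per hour): 2 07, 1 08
```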

🧠 Learnings & Real-World Relevance

  Skill            Real-World Use Case
  grep, wc         Analyze service logs for errors/warnings
  tar, gzip        Archive rotated logs before deletion
  uniq -c, sort    Frequency analysis of log patterns
  Log reporting    Basis for alerting, monitoring, analytics

🏁 Who Wins?

  • Fastest to complete all 4 tasks correctly
  • Neatest summary file
  • Most efficient (fewest commands)
  • Bonus: One-liner challenge!
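One possible take on the one-liner bonus (a sketch, not the canonical answer): count, rank by frequency, save the summary, and archive in a single pipeline.

```shell
# Demo data, standing in for the generated logs:
mkdir -p logs
echo '2025-07-15 07:00:01 [ERROR] Failed DB connection' > logs/app1.log

# Count, rank, write the summary, then archive, all in one line:
grep -rohE '\[(ERROR|WARNING|INFO)\]' logs/ | sort | uniq -c | sort -nr > log_summary.txt \
  && tar -czf "logs_$(date +%F).tar.gz" logs/
```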

🎉 Winner earns the prestigious title of "Log Master of SIT Pune" 🔥

📜 Reference Solution: process_logs.sh

#!/bin/bash
# process_logs.sh — count log levels, summarize, sort, and archive
cd log-master || { echo "❌ log-master directory not found."; exit 1; }
echo "📊 Counting log levels..."
{
  echo "ERROR: $(grep -roh '\[ERROR\]' logs/ | wc -l)"
  echo "WARNING: $(grep -roh '\[WARNING\]' logs/ | wc -l)"
  echo "INFO: $(grep -roh '\[INFO\]' logs/ | wc -l)"
} > log_summary.txt
echo "🔢 Sorting log levels by frequency..."
# Extract log levels and sort by count descending
grep -rohE '\[INFO\]|\[WARNING\]|\[ERROR\]' logs/ | \
  sed 's/\[//;s/\]//' | \
  sort | uniq -c | sort -nr > log_frequency_sorted.txt
echo "🔤 Sorting log levels by frequency and then alphabetically..."
# Sort by count (desc) and then label (asc)
grep -rohE '\[INFO\]|\[WARNING\]|\[ERROR\]' logs/ | \
  sed 's/\[//;s/\]//' | \
  sort | uniq -c | sort -k1,1nr -k2 > log_frequency_sorted_by_label.txt
echo "📦 Creating archive..."
tar -czvf "logs_$(date +%F).tar.gz" logs/
echo "🧹 Cleaning up logs/"
rm -rf logs/
echo "✅ Done."
echo "📄 Summary: log_summary.txt"
echo "📄 Sorted by frequency: log_frequency_sorted.txt"
echo "📄 Sorted by frequency and label: log_frequency_sorted_by_label.txt"