Project Overview and Requirements
In this capstone, you will assemble previously learned skills into a cohesive console utility that feels like a “real” tool: a menu-driven log analyzer that reads one or more log files, extracts structured information, and produces readable summaries. The focus is on integrating pieces—clean project organization, a simple domain model, user flows, persistence, validation, and a final polish pass—rather than re-teaching fundamentals.
Utility Definition: Log Analyzer
You will build a console utility that can:
- Import log files (plain text lines) into a local store.
- Parse each line into a structured record (timestamp, level, message, optional source).
- Provide menu actions: list recent entries, filter by level/date range, show counts by level, search by keyword, export a report.
- Persist imported entries to disk as JSON so the tool can resume later.
- Validate user input and handle errors without crashing.
- Use at least one NuGet package to improve output readability.
- Add one feature by deliberately consulting API documentation (for example, robust timestamp parsing with DateTimeOffset and exact formats, or regex-based parsing with timeouts; see the sketch after this list).
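As a taste of the documentation-driven option above, here is a minimal sketch of exact-format timestamp parsing. It assumes ISO-8601-style timestamps like the ones used later in this project; the format strings are illustrative, not prescribed.

```csharp
using System.Globalization;

// Accept only known timestamp shapes instead of whatever the current culture allows.
// The format strings below are assumptions matching values like 2026-01-16T10:15:30Z.
string[] formats = { "yyyy-MM-dd'T'HH:mm:ssK", "yyyy-MM-dd HH:mm:ss" };

bool ok = DateTimeOffset.TryParseExact(
    "2026-01-16T10:15:30Z",
    formats,
    CultureInfo.InvariantCulture,
    DateTimeStyles.None,
    out DateTimeOffset ts);

Console.WriteLine(ok ? ts.ToString("u") : "unrecognized timestamp");
```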
Milestones
- Milestone 1: Project scaffold and folder structure
- Milestone 2: Domain model and parsing
- Milestone 3: Menu and user flows
- Milestone 4: Persistence layer (JSON store)
- Milestone 5: Error-handling and validation pass
- Milestone 6: Final polish (formatted output, clean organization)
Milestone 1: Project Scaffold and Organization
Keep the app small but organized. A simple structure makes later changes safer.
Suggested folders
- Domain: entities and value objects (LogEntry, LogLevel)
- Parsing: log line parsing logic (ILogParser, SimpleLogParser)
- Persistence: JSON repository (ILogRepository, JsonLogRepository)
- UI: menu rendering and input helpers (Menu, ConsolePrompts)
- Reporting: summary generation and export (ReportService)
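Under these suggestions, a possible on-disk layout (file names are illustrative) might be:

```
LogAnalyzer/
  Program.cs
  Domain/        LogEntry.cs, LogLevel.cs
  Parsing/       ILogParser.cs, SimpleLogParser.cs
  Persistence/   ILogRepository.cs, JsonLogRepository.cs
  UI/            Menu.cs, ConsolePrompts.cs, Output.cs
  Reporting/     ReportService.cs
```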
NuGet package choice
Use Spectre.Console to render tables, prompts, and colored output. This improves usability and demonstrates integrating a package into a cohesive app rather than a single isolated example.
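If the package is not already referenced, one way to add it is from the project directory with the .NET CLI:

```
dotnet add package Spectre.Console
```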
App entry point responsibilities
Keep Program.cs thin: wire up services, load persisted data, run the menu loop, and save on exit. Avoid putting parsing, persistence, and reporting logic directly in the entry point.
```csharp
using LogAnalyzer;             // LogAnalyzerApp lives in the root namespace
using LogAnalyzer.Domain;
using LogAnalyzer.Parsing;
using LogAnalyzer.Persistence;
using LogAnalyzer.Reporting;
using LogAnalyzer.UI;
using Spectre.Console;

var repository = new JsonLogRepository("data/logstore.json");
var parser = new SimpleLogParser();
var reportService = new ReportService();

var app = new LogAnalyzerApp(repository, parser, reportService);
await app.RunAsync();
```

The LogAnalyzerApp class becomes the coordinator for user flows, delegating real work to focused components.
Milestone 2: Domain Model and Parsing
The domain model is the “language” of your utility. It should be small and stable: once you have a good LogEntry, everything else becomes easier.
Domain model
```csharp
namespace LogAnalyzer.Domain;

public enum LogLevel
{
    Trace,
    Debug,
    Info,
    Warn,
    Error,
    Fatal,
    Unknown
}

public sealed record LogEntry(
    DateTimeOffset Timestamp,
    LogLevel Level,
    string Message,
    string? Source,
    string RawLine);
```

Include RawLine so you can still display or export the original text when parsing is imperfect.
Parsing strategy
Define a parser interface so you can swap implementations later (for different log formats).
```csharp
namespace LogAnalyzer.Parsing;

using LogAnalyzer.Domain;

public interface ILogParser
{
    bool TryParse(string line, out LogEntry? entry, out string? error);
}
```

Implement a simple parser (regex-based)
Assume a common format such as 2026-01-16T10:15:30Z [INFO] (Auth) User logged in. Use a regex with a match timeout so a pathological line (catastrophic backtracking) cannot hang the import. This is a good place to consult API documentation: Regex accepts a timeout via a constructor overload.
```csharp
namespace LogAnalyzer.Parsing;

using System.Text.RegularExpressions;
using LogAnalyzer.Domain;

public sealed class SimpleLogParser : ILogParser
{
    private static readonly Regex Pattern = new Regex(
        @"^(?<ts>\S+)\s+\[(?<lvl>\w+)\]\s+(?:\((?<src>[^)]+)\)\s+)?(?<msg>.*)$",
        RegexOptions.Compiled | RegexOptions.CultureInvariant,
        matchTimeout: TimeSpan.FromMilliseconds(200));

    public bool TryParse(string line, out LogEntry? entry, out string? error)
    {
        entry = null;
        error = null;

        if (string.IsNullOrWhiteSpace(line))
        {
            error = "Empty line.";
            return false;
        }

        Match m;
        try
        {
            m = Pattern.Match(line);
        }
        catch (RegexMatchTimeoutException)
        {
            error = "Parsing timed out.";
            return false;
        }

        if (!m.Success)
        {
            error = "Line did not match expected format.";
            return false;
        }

        var tsText = m.Groups["ts"].Value;
        var lvlText = m.Groups["lvl"].Value;
        var src = m.Groups["src"].Success ? m.Groups["src"].Value : null;
        var msg = m.Groups["msg"].Value;

        if (!DateTimeOffset.TryParse(tsText, out var ts))
        {
            error = $"Invalid timestamp: {tsText}";
            return false;
        }

        var level = ParseLevel(lvlText);
        entry = new LogEntry(ts, level, msg, src, line);
        return true;
    }

    private static LogLevel ParseLevel(string text)
    {
        return text.ToUpperInvariant() switch
        {
            "TRACE" => LogLevel.Trace,
            "DEBUG" => LogLevel.Debug,
            "INFO" => LogLevel.Info,
            "WARN" => LogLevel.Warn,
            "WARNING" => LogLevel.Warn,
            "ERROR" => LogLevel.Error,
            "FATAL" => LogLevel.Fatal,
            _ => LogLevel.Unknown
        };
    }
}
```

Deliberate API documentation use: the regex timeout overload and RegexOptions.CultureInvariant are easy to miss unless you read the docs. This improves reliability and makes parsing behavior predictable across environments.
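A quick usage check, using the sample line from above (a sketch; the output formatting is arbitrary):

```csharp
var parser = new SimpleLogParser();

// The sample line matches the format the regex above expects.
if (parser.TryParse("2026-01-16T10:15:30Z [INFO] (Auth) User logged in", out var entry, out var error))
{
    Console.WriteLine($"{entry!.Timestamp:u} [{entry.Level}] {entry.Source}: {entry.Message}");
}
else
{
    Console.WriteLine($"Parse failed: {error}");
}
```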
Milestone 3: Menu and User Flows
Design user flows first, then implement them as methods. Each menu option should map to a single method that orchestrates input, calls services, and prints output.
Core menu actions
- Import log file(s)
- Show recent entries
- Filter entries (level, date range)
- Search by keyword
- Show counts by level
- Export report to file
- Exit
App coordinator
```csharp
namespace LogAnalyzer;

using LogAnalyzer.Domain;
using LogAnalyzer.Parsing;
using LogAnalyzer.Persistence;
using LogAnalyzer.Reporting;
using LogAnalyzer.UI;
using Spectre.Console;

public sealed class LogAnalyzerApp
{
    private readonly ILogRepository _repo;
    private readonly ILogParser _parser;
    private readonly ReportService _reports;
    private List<LogEntry> _entries = new();

    public LogAnalyzerApp(ILogRepository repo, ILogParser parser, ReportService reports)
    {
        _repo = repo;
        _parser = parser;
        _reports = reports;
    }

    public async Task RunAsync()
    {
        _entries = (await _repo.LoadAsync()).ToList();

        while (true)
        {
            var choice = Menu.ShowMainMenu();
            switch (choice)
            {
                case MainMenuChoice.Import: await ImportFlowAsync(); break;
                case MainMenuChoice.Recent: ShowRecentFlow(); break;
                case MainMenuChoice.Filter: FilterFlow(); break;
                case MainMenuChoice.Search: SearchFlow(); break;
                case MainMenuChoice.Counts: ShowCountsFlow(); break;
                case MainMenuChoice.Export: await ExportFlowAsync(); break;
                case MainMenuChoice.Exit: await _repo.SaveAsync(_entries); return;
            }
        }
    }

    private async Task ImportFlowAsync()
    {
        var path = ConsolePrompts.PromptForExistingFilePath("Enter path to log file");
        var lines = await File.ReadAllLinesAsync(path);

        var imported = new List<LogEntry>();
        var failures = 0;

        foreach (var line in lines)
        {
            if (_parser.TryParse(line, out var entry, out _))
            {
                imported.Add(entry!);
            }
            else
            {
                failures++;
            }
        }

        _entries.AddRange(imported);
        AnsiConsole.MarkupLine($"Imported [green]{imported.Count}[/] entries. Failed: [yellow]{failures}[/].");
        await _repo.SaveAsync(_entries);
    }

    private void ShowRecentFlow()
    {
        var count = ConsolePrompts.PromptForInt("How many recent entries?", min: 1, max: 200);
        var recent = _entries
            .OrderByDescending(e => e.Timestamp)
            .Take(count)
            .OrderBy(e => e.Timestamp)
            .ToList();
        Output.RenderEntriesTable(recent);
    }

    private void FilterFlow()
    {
        var level = ConsolePrompts.PromptForOptionalLevel();
        var from = ConsolePrompts.PromptForOptionalDate("From date (yyyy-MM-dd) or blank");
        var to = ConsolePrompts.PromptForOptionalDate("To date (yyyy-MM-dd) or blank");

        var filtered = _entries.AsEnumerable();
        if (level is not null) filtered = filtered.Where(e => e.Level == level);
        if (from is not null) filtered = filtered.Where(e => e.Timestamp.Date >= from.Value.Date);
        if (to is not null) filtered = filtered.Where(e => e.Timestamp.Date <= to.Value.Date);

        Output.RenderEntriesTable(filtered.OrderBy(e => e.Timestamp).Take(500).ToList());
    }

    private void SearchFlow()
    {
        var term = ConsolePrompts.PromptForNonEmptyString("Keyword to search");
        var results = _entries
            .Where(e => e.Message.Contains(term, StringComparison.OrdinalIgnoreCase)
                        || (e.Source?.Contains(term, StringComparison.OrdinalIgnoreCase) ?? false))
            .OrderBy(e => e.Timestamp)
            .Take(500)
            .ToList();
        Output.RenderEntriesTable(results);
    }

    private void ShowCountsFlow()
    {
        var counts = _reports.CountByLevel(_entries);
        Output.RenderCountsTable(counts);
    }

    private async Task ExportFlowAsync()
    {
        var path = ConsolePrompts.PromptForOutputFilePath("Export report path", defaultName: "report.txt");
        var reportText = _reports.BuildTextReport(_entries);
        await File.WriteAllTextAsync(path, reportText);
        AnsiConsole.MarkupLine($"Report exported to [green]{path}[/].");
    }
}
```

This style keeps each flow readable and testable. The heavy lifting (formatting, counting, report generation) is delegated to specialized classes.
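The coordinator refers to a Menu helper and a MainMenuChoice enum that are not shown in this section. A minimal sketch, assuming a Spectre.Console selection prompt and the menu actions listed above (the exact labels and mapping are up to you):

```csharp
namespace LogAnalyzer.UI;

using Spectre.Console;

public enum MainMenuChoice
{
    Import, Recent, Filter, Search, Counts, Export, Exit
}

public static class Menu
{
    public static MainMenuChoice ShowMainMenu()
    {
        // Render a selection list and map the chosen label back to the enum.
        var choice = AnsiConsole.Prompt(
            new SelectionPrompt<string>()
                .Title("What do you want to do?")
                .AddChoices(
                    "Import log file(s)",
                    "Show recent entries",
                    "Filter entries",
                    "Search by keyword",
                    "Show counts by level",
                    "Export report",
                    "Exit"));

        return choice switch
        {
            "Import log file(s)" => MainMenuChoice.Import,
            "Show recent entries" => MainMenuChoice.Recent,
            "Filter entries" => MainMenuChoice.Filter,
            "Search by keyword" => MainMenuChoice.Search,
            "Show counts by level" => MainMenuChoice.Counts,
            "Export report" => MainMenuChoice.Export,
            _ => MainMenuChoice.Exit
        };
    }
}
```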
Milestone 4: Persistence Layer (JSON Store)
Persistence should be boring and reliable. Use a repository abstraction so the rest of the app doesn’t care whether data is stored in JSON, a database, or something else later.
Repository interface
```csharp
namespace LogAnalyzer.Persistence;

using LogAnalyzer.Domain;

public interface ILogRepository
{
    Task<IReadOnlyList<LogEntry>> LoadAsync();
    Task SaveAsync(IEnumerable<LogEntry> entries);
}
```

JSON repository implementation
Store a single JSON file on disk. Ensure the directory exists and handle the “file not found” case by returning an empty list.
```csharp
namespace LogAnalyzer.Persistence;

using System.Text.Json;
using LogAnalyzer.Domain;

public sealed class JsonLogRepository : ILogRepository
{
    private readonly string _path;
    private static readonly JsonSerializerOptions Options = new JsonSerializerOptions
    {
        WriteIndented = true
    };

    public JsonLogRepository(string path)
    {
        _path = path;
    }

    public async Task<IReadOnlyList<LogEntry>> LoadAsync()
    {
        if (!File.Exists(_path)) return Array.Empty<LogEntry>();

        try
        {
            var json = await File.ReadAllTextAsync(_path);
            var data = JsonSerializer.Deserialize<List<LogEntry>>(json, Options);
            return data ?? new List<LogEntry>();
        }
        catch
        {
            return Array.Empty<LogEntry>();
        }
    }

    public async Task SaveAsync(IEnumerable<LogEntry> entries)
    {
        var dir = Path.GetDirectoryName(_path);
        if (!string.IsNullOrWhiteSpace(dir)) Directory.CreateDirectory(dir);

        var json = JsonSerializer.Serialize(entries, Options);
        await File.WriteAllTextAsync(_path, json);
    }
}
```

Note the deliberate choice: on load failure, return an empty list rather than crashing. In a later pass, you can improve this by showing a warning and offering to back up the corrupted file.
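If you later want the "warn and back up" behavior instead of silently starting empty, a small helper along these lines could be called from the catch block (a sketch; the helper and backup naming are assumptions):

```csharp
// Copies an unreadable store file aside (e.g. logstore.json.20260116103000.bak)
// so the user's data is preserved before the app starts with empty data.
private static void BackUpCorruptStore(string path)
{
    if (!File.Exists(path)) return;

    var backupPath = $"{path}.{DateTime.UtcNow:yyyyMMddHHmmss}.bak";
    File.Copy(path, backupPath, overwrite: true);
}
```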
Milestone 5: Validation and Error-Handling Pass
After the “happy path” works, do a dedicated pass to harden the utility. This is where the tool becomes trustworthy.
Input helpers
Centralize validation in a small set of prompt methods. This prevents duplicated validation logic across menu flows.
```csharp
namespace LogAnalyzer.UI;

using LogAnalyzer.Domain;
using Spectre.Console;

public static class ConsolePrompts
{
    public static string PromptForExistingFilePath(string label)
    {
        while (true)
        {
            var path = AnsiConsole.Ask<string>($"{label}:").Trim(' ', '"');
            if (File.Exists(path)) return path;
            AnsiConsole.MarkupLine("[red]File not found.[/] Try again.");
        }
    }

    public static int PromptForInt(string label, int min, int max)
    {
        while (true)
        {
            var text = AnsiConsole.Ask<string>($"{label} ({min}-{max}):");
            if (int.TryParse(text, out var value) && value >= min && value <= max) return value;
            AnsiConsole.MarkupLine("[red]Invalid number.[/]");
        }
    }

    public static string PromptForNonEmptyString(string label)
    {
        while (true)
        {
            var text = AnsiConsole.Ask<string>($"{label}:");
            if (!string.IsNullOrWhiteSpace(text)) return text.Trim();
            AnsiConsole.MarkupLine("[red]Value cannot be empty.[/]");
        }
    }

    public static LogLevel? PromptForOptionalLevel()
    {
        var options = new[] { "(any)", "Trace", "Debug", "Info", "Warn", "Error", "Fatal", "Unknown" };
        var choice = AnsiConsole.Prompt(
            new SelectionPrompt<string>()
                .Title("Level filter")
                .AddChoices(options));
        if (choice == "(any)") return null;
        return Enum.Parse<LogLevel>(choice, ignoreCase: true);
    }

    public static DateTimeOffset? PromptForOptionalDate(string label)
    {
        while (true)
        {
            // AllowEmpty lets the user press Enter to skip the filter;
            // a plain Ask<string> would reject blank input.
            var text = AnsiConsole.Prompt(
                new TextPrompt<string>($"{label}:").AllowEmpty()).Trim();
            if (string.IsNullOrEmpty(text)) return null;
            if (DateTimeOffset.TryParse(text, out var dt)) return dt;
            AnsiConsole.MarkupLine("[red]Invalid date.[/] Example: 2026-01-16");
        }
    }

    public static string PromptForOutputFilePath(string label, string defaultName)
    {
        // AllowEmpty so pressing Enter falls back to the default file name.
        var text = AnsiConsole.Prompt(
            new TextPrompt<string>($"{label} (default: {defaultName}):").AllowEmpty()).Trim();
        return string.IsNullOrWhiteSpace(text) ? defaultName : text;
    }
}
```

Error handling policy
- Parsing errors: count and optionally log them; do not stop import.
- File I/O errors: show a clear message and return to menu.
- Corrupt JSON store: warn and start with empty data (optionally back up the file).
- Unexpected exceptions: catch at the top-level menu loop, show a friendly error, and continue when safe.
Add a top-level guard in RunAsync around each action to prevent a single bug from terminating the session.
```csharp
try
{
    // execute selected flow
}
catch (Exception ex)
{
    AnsiConsole.MarkupLine("[red]An unexpected error occurred.[/]");
    AnsiConsole.WriteException(ex, ExceptionFormats.ShortenEverything);
}
```

Use shortened exception formatting for developer-friendly diagnostics while keeping output readable.
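One way to apply this guard without repeating the try/catch in every case is a small wrapper inside LogAnalyzerApp (a sketch; the helper name RunGuardedAsync is an assumption):

```csharp
// Runs one menu flow; a failure is reported but does not end the session.
private static async Task RunGuardedAsync(Func<Task> flow)
{
    try
    {
        await flow();
    }
    catch (Exception ex)
    {
        AnsiConsole.MarkupLine("[red]An unexpected error occurred.[/]");
        AnsiConsole.WriteException(ex, ExceptionFormats.ShortenEverything);
    }
}

// Example use in the RunAsync switch:
// case MainMenuChoice.Import: await RunGuardedAsync(ImportFlowAsync); break;
```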
Milestone 6: Final Polish (Readable Output and Clean Code)
Polish is where the utility becomes pleasant to use: consistent formatting, predictable sorting, and clear summaries.
Formatted output with Spectre.Console tables
```csharp
namespace LogAnalyzer.UI;

using LogAnalyzer.Domain;
using Spectre.Console;

public static class Output
{
    public static void RenderEntriesTable(IReadOnlyList<LogEntry> entries)
    {
        var table = new Table()
            .Border(TableBorder.Rounded)
            .AddColumn("Time")
            .AddColumn("Level")
            .AddColumn("Source")
            .AddColumn("Message");

        foreach (var e in entries)
        {
            // Escape user-controlled text so bracketed content in log messages
            // is not interpreted as Spectre.Console markup.
            table.AddRow(
                e.Timestamp.ToString("u"),
                ColorizeLevel(e.Level),
                Markup.Escape(e.Source ?? "-"),
                Markup.Escape(Truncate(e.Message, 80)));
        }

        AnsiConsole.Write(table);
        AnsiConsole.MarkupLine($"Showing [green]{entries.Count}[/] entries.");
    }

    public static void RenderCountsTable(IReadOnlyDictionary<LogLevel, int> counts)
    {
        var table = new Table()
            .Border(TableBorder.Rounded)
            .AddColumn("Level")
            .AddColumn("Count");

        foreach (var kvp in counts.OrderByDescending(k => k.Value))
        {
            table.AddRow(kvp.Key.ToString(), kvp.Value.ToString());
        }

        AnsiConsole.Write(table);
    }

    private static string Truncate(string text, int max)
    {
        if (string.IsNullOrEmpty(text) || text.Length <= max) return text;
        return text.Substring(0, max - 1) + "…";
    }

    private static string ColorizeLevel(LogLevel level)
    {
        return level switch
        {
            LogLevel.Error => "[red]Error[/]",
            LogLevel.Fatal => "[maroon]Fatal[/]",
            LogLevel.Warn => "[yellow]Warn[/]",
            LogLevel.Info => "[green]Info[/]",
            LogLevel.Debug => "[blue]Debug[/]",
            LogLevel.Trace => "[grey]Trace[/]",
            _ => "Unknown"
        };
    }
}
```

Reporting service
Keep reporting logic separate from UI so you can reuse it for export and on-screen summaries.
```csharp
namespace LogAnalyzer.Reporting;

using System.Text;
using LogAnalyzer.Domain;

public sealed class ReportService
{
    public IReadOnlyDictionary<LogLevel, int> CountByLevel(IEnumerable<LogEntry> entries)
    {
        return entries
            .GroupBy(e => e.Level)
            .ToDictionary(g => g.Key, g => g.Count());
    }

    public string BuildTextReport(IEnumerable<LogEntry> entries)
    {
        var list = entries.OrderBy(e => e.Timestamp).ToList();
        var sb = new StringBuilder();

        sb.AppendLine("Log Analyzer Report");
        sb.AppendLine($"Generated: {DateTimeOffset.Now:u}");
        sb.AppendLine($"Total entries: {list.Count}");
        sb.AppendLine();

        var counts = CountByLevel(list).OrderByDescending(k => k.Value);
        sb.AppendLine("Counts by level:");
        foreach (var c in counts)
        {
            sb.AppendLine($"- {c.Key}: {c.Value}");
        }

        sb.AppendLine();
        sb.AppendLine("First 20 entries:");
        foreach (var e in list.Take(20))
        {
            sb.AppendLine($"{e.Timestamp:u} [{e.Level}] {(e.Source ?? "-")} {e.Message}");
        }

        return sb.ToString();
    }
}
```

Clean code checklist
- Keep parsing, persistence, UI, and reporting in separate namespaces and files.
- Prefer small methods that do one thing (import, filter, search, export).
- Use consistent ordering (usually by timestamp) and consistent limits (e.g., show up to 500 rows).
- Make defaults explicit (default export name, default filters, default display counts).
- Ensure saving happens after import and on exit.
- Keep user-facing messages actionable (what failed, what to do next).
Optional enhancements (if time permits)
- Support importing a whole directory of .log files (see the sketch after this list).
- Deduplicate entries by a Timestamp + Message hash to avoid repeated imports.
- Add a "bookmark" feature: save named filters and rerun them quickly.
- Export JSON report in addition to text.
- Allow multiple parser formats by selecting a parser strategy from the menu.
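A minimal sketch of the first two enhancements, assuming it runs inside ImportFlowAsync-style code with access to _entries (the Timestamp + Message key comes from the list above; variable names are hypothetical):

```csharp
// Enumerate every .log file under a user-supplied directory; recursing into
// subdirectories is a choice, not a requirement.
var directory = ConsolePrompts.PromptForNonEmptyString("Directory to import");
var logFiles = Directory.EnumerateFiles(directory, "*.log", SearchOption.AllDirectories);

foreach (var file in logFiles)
{
    var lines = await File.ReadAllLinesAsync(file);
    // ... parse each line exactly as in ImportFlowAsync ...
}

// Deduplicate by timestamp + message so re-importing the same file is harmless.
_entries = _entries
    .DistinctBy(e => (e.Timestamp, e.Message))
    .ToList();
```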