Learn to steer command output and errors into files, other programs, or combined streams to build reliable shell pipelines.

15.11.2025 | reading time: 2 min

Pipes and redirection let you steer a command's output and error streams so that small programs combine into powerful workflows; this article covers what to send where, and why.

## Log analysis in practice

Collect the most recent log lines, filter for errors, count unique occurrences, and save the summary. The pipeline below produces a concise report and then displays it:

```bash
sudo tail -n 1000 /var/log/syslog | grep -i error | sort | uniq -c > error_summary.txt
cat error_summary.txt
# sample output: "12 kernel: usb disconnect"
```

## Operators that matter

Use `>` to overwrite a file, `>>` to append, `2>` to capture standard error, and `2>&1` to merge standard error into standard output. The pipe `|` forwards only stdout, while `|&` (a Bash and Zsh shorthand for `2>&1 |`) forwards both stdout and stderr. Note that order matters: `2>&1` must come after `> out.txt`, because it duplicates stderr onto wherever stdout points at that moment. For example:

```bash
grep pattern file 2> errors.txt   # stderr to a file, stdout to the terminal
command > out.txt 2>&1            # both streams into out.txt
cmd1 |& cmd2                      # pipe stdout and stderr together
```

## When to reach for redirects

Redirect output when you log cron jobs, capture failures for debugging, or chain filters for data processing. Be cautious with `>`, which silently clobbers existing files; prefer `>>` or `tee -a` for safe appends.

## Complementary utilities

Tools like `tee` duplicate a stream to both the terminal and a file, `xargs` turns piped lists into command arguments, and `pv` shows throughput for large transfers. Combine them to monitor, persist, and transform pipelines without losing data.

## Next steps

Try rewriting a multi-step task as a single pipeline and add error capture with `2>`. Mastering these patterns speeds up daily administration and prepares you for certifications and deeper Linux study at bitsandbytes.academy, for CompTIA Linux+ or LPIC-1 exam prep.
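The log-analysis pipeline can be exercised without `sudo` by pointing it at a mock log file. This is a minimal sketch: the log lines and the `mktemp` working directory are invented here purely for illustration.

```shell
# Build a throwaway log with a few fabricated entries (illustrative only).
tmp=$(mktemp -d)
printf '%s\n' \
  'kernel: usb disconnect ERROR' \
  'cron: job ok' \
  'kernel: usb disconnect ERROR' \
  'sshd: Error: auth failed' > "$tmp/syslog"

# Same shape as the article's pipeline: last lines -> error filter ->
# sort (uniq needs adjacent duplicates) -> count -> save to a file.
tail -n 1000 "$tmp/syslog" | grep -i error | sort | uniq -c > "$tmp/error_summary.txt"
cat "$tmp/error_summary.txt"

# Two distinct error lines survive: the kernel line (count 2) and the sshd line.
summary_lines=$(grep -c . "$tmp/error_summary.txt")
echo "unique error lines: $summary_lines"
```

Swapping a real path like `/var/log/syslog` back in (with `sudo` for read access) turns the rehearsal into the production command.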
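The `tee -a` pattern recommended for safe appends can be sketched as follows; the file names and sample data are made up for the demonstration:

```shell
tmp=$(mktemp -d)
log="$tmp/run.log"

# tee -a appends to the log AND forwards the data down the pipe,
# so the next filter still sees every line.
count=$(printf 'alpha\nbeta\n' | tee -a "$log" | grep -c 'a')

# Running a second pipeline appends instead of clobbering the log.
printf 'gamma\ndelta\n' | tee -a "$log" > /dev/null

lines=$(grep -c . "$log")
echo "matched=$count logged=$lines"
```

Unlike `>`, which would have wiped the log between runs, `tee -a` preserves earlier output while still letting the pipeline continue.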