Windows PowerShell for Beginners: Everyday Automation and System Insights

Writing Your First PowerShell Scripts: From One-Liners to Reusable Tools

Chapter 9

Estimated reading time: 7 minutes

From Interactive to Script: the Same Command, Saved

You already know how to run commands interactively. The next step is to take a command that works, save it into a .ps1 file, and make it reusable: accept input (parameters), produce consistent output (objects), and behave predictably (errors and safety).

Start with a working one-liner

Pick a small task you can run repeatedly. Example: list large files in your Downloads folder.

Get-ChildItem -Path $env:USERPROFILE\Downloads -File -Recurse -ErrorAction SilentlyContinue | Sort-Object Length -Descending | Select-Object -First 20 FullName, Length, LastWriteTime

This is fine interactively, but it has limitations: the path is fixed, the output is tailored for the screen, and errors are hidden rather than handled.

Save It as a Script File (.ps1)

Create a script file and run it

Create a file named Get-LargeDownloads.ps1 and paste the command inside. Then run it by providing the path to the script file.

# From the folder where the script is saved (example: Desktop)
.\Get-LargeDownloads.ps1
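If the script lives somewhere else, you can also call it by its full path; the Desktop location below is only an example:

& "$env:USERPROFILE\Desktop\Get-LargeDownloads.ps1"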

If your environment blocks script execution, you may need to run it in a way allowed by your organization or current policy. In many environments, scripts are permitted when signed or when run from trusted locations.
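For example, you can check the current policy and, where your organization permits it, relax it only for your own user account (which policy to choose is an assumption you should confirm locally):

Get-ExecutionPolicy -List
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser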

Script anatomy you will use most often

  • Comment-based help (optional but recommended): makes your script self-documenting.
  • param() block: defines inputs and defaults.
  • Functions: small reusable pieces inside the script.
  • Output: return objects, not formatted strings.
  • Error handling: decide what to do when something fails.
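A minimal skeleton that puts these pieces together might look like the sketch below (the names are hypothetical, chosen only to show the shape):

<#
.SYNOPSIS
Returns a greeting as an object.
#>
[CmdletBinding()]
param(
  [string]$Name = 'World'       # input with a sensible default
)

function Get-Greeting {         # small reusable piece
  param([string]$To)
  "Hello, $To"
}

try {
  [pscustomobject]@{ Message = Get-Greeting -To $Name }   # object output, not formatted text
}
catch {
  Write-Error $_.Exception.Message                         # explicit error handling
}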

Refactor Step 1: Add Parameters

Replace hard-coded values with parameters so the script is reusable.

<#
.SYNOPSIS
Lists the largest files under a folder.
.DESCRIPTION
Searches a folder recursively and returns the top N largest files as objects.
#>

param(
  [Parameter(Mandatory=$false)]
  [string]$Path = "$env:USERPROFILE\Downloads",

  [Parameter(Mandatory=$false)]
  [ValidateRange(1,500)]
  [int]$Top = 20
)

Get-ChildItem -Path $Path -File -Recurse -ErrorAction SilentlyContinue |
  Sort-Object Length -Descending |
  Select-Object -First $Top FullName, Length, LastWriteTime

Now you can run:

.\Get-LargeDownloads.ps1 -Path "C:\Temp" -Top 10
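Because the script includes comment-based help, Get-Help can read it straight from the file:

Get-Help .\Get-LargeDownloads.ps1 -Detailed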

Refactor Step 2: Return Objects (Not Pre-Formatted Text)

A common beginner mistake is to format output inside the script (for example, with Format-Table) and then try to export or reuse it. Prefer returning objects, and let the caller decide whether to format, export, or process them further.
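To see why, consider what happens when formatted output is exported (a small illustration, not part of the script): the CSV ends up holding formatting objects instead of file data.

# Anti-pattern: Format-Table emits formatting objects, not data
Get-ChildItem -File | Format-Table Name, Length | Export-Csv -Path "$env:TEMP\bad.csv" -NoTypeInformation
# The resulting CSV contains format-object internals rather than Name and Length values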

Good: return objects with consistent properties.

param(
  [string]$Path = "$env:USERPROFILE\Downloads",
  [ValidateRange(1,500)]
  [int]$Top = 20
)

Get-ChildItem -Path $Path -File -Recurse -ErrorAction SilentlyContinue |
  Sort-Object Length -Descending |
  Select-Object -First $Top @{Name='Path';Expression={$_.FullName}},
                       @{Name='SizeMB';Expression={[math]::Round($_.Length/1MB,2)}},
                       LastWriteTime

Example usage:

$items = .\Get-LargeDownloads.ps1 -Top 5
$items | Export-Csv -NoTypeInformation -Path "$env:TEMP\largest-downloads.csv"

Refactor Step 3: Add Clear, Controlled Messaging

Separate “data output” from “messages.” Data should be objects written to the pipeline. Messages should use streams intended for humans.

  • Write-Verbose for optional detail (enabled with -Verbose).
  • Write-Warning for non-fatal issues.
  • Write-Error for errors (can be terminating or non-terminating).

Example with verbose messaging:

[CmdletBinding()]
param(
  [string]$Path = "$env:USERPROFILE\Downloads",
  [ValidateRange(1,500)]
  [int]$Top = 20
)

Write-Verbose "Scanning: $Path"

Get-ChildItem -Path $Path -File -Recurse -ErrorAction SilentlyContinue |
  Sort-Object Length -Descending |
  Select-Object -First $Top @{Name='Path';Expression={$_.FullName}},
                       @{Name='SizeMB';Expression={[math]::Round($_.Length/1MB,2)}},
                       LastWriteTime

Run with verbose messages:

.\Get-LargeDownloads.ps1 -Verbose
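If you also want to flag a non-fatal condition without polluting the data output, Write-Warning fits. One possible addition inside the script (a sketch, not part of the version above) is to capture the results first and warn when nothing matches:

$results = Get-ChildItem -Path $Path -File -Recurse -ErrorAction SilentlyContinue |
  Sort-Object Length -Descending |
  Select-Object -First $Top FullName, Length, LastWriteTime

if (-not $results) { Write-Warning "No files found under $Path" }

$results   # the data still goes to the pipeline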

Refactor Step 4: Handle Errors Intentionally

Many cmdlets emit non-terminating errors by default. To catch them with try/catch, you typically need to make them terminating using -ErrorAction Stop.
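A quick way to see the difference (the folder name is just an example that should not exist on your machine):

# Non-terminating by default: the catch block never runs
try { Get-ChildItem -Path 'C:\DoesNotExist' } catch { 'caught' }

# -ErrorAction Stop makes the error terminating: the catch block runs
try { Get-ChildItem -Path 'C:\DoesNotExist' -ErrorAction Stop } catch { 'caught' }

Applied to the script: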

[CmdletBinding()]
param(
  [string]$Path = "$env:USERPROFILE\Downloads",
  [ValidateRange(1,500)]
  [int]$Top = 20
)

try {
  if (-not (Test-Path -Path $Path)) {
    throw "Path not found: $Path"
  }

  Write-Verbose "Scanning: $Path"

  Get-ChildItem -Path $Path -File -Recurse -ErrorAction Stop |
    Sort-Object Length -Descending |
    Select-Object -First $Top @{Name='Path';Expression={$_.FullName}},
                         @{Name='SizeMB';Expression={[math]::Round($_.Length/1MB,2)}},
                         LastWriteTime
}
catch {
  Write-Error "Failed to scan '$Path'. $($_.Exception.Message)"
}

This pattern makes failures explicit and easier to troubleshoot.

Introduce a Simple Function (Reusable Building Block)

Functions help you avoid repeating logic and keep scripts readable. A practical use is converting bytes to a friendly size.

function ConvertTo-Size {
  param(
    [Parameter(Mandatory=$true)]
    [long]$Bytes
  )

  if ($Bytes -ge 1GB) { return "{0:N2} GB" -f ($Bytes/1GB) }
  if ($Bytes -ge 1MB) { return "{0:N2} MB" -f ($Bytes/1MB) }
  if ($Bytes -ge 1KB) { return "{0:N2} KB" -f ($Bytes/1KB) }
  return "$Bytes B"
}
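For example, once the function is loaded into your session (dot-sourced or pasted in), calling it looks like this:

ConvertTo-Size -Bytes 1536000   # "1.46 MB"
ConvertTo-Size -Bytes 512       # "512 B"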

Use it while still returning objects:

[CmdletBinding()]
param(
  [string]$Path = "$env:USERPROFILE\Downloads",
  [ValidateRange(1,500)]
  [int]$Top = 20
)

function ConvertTo-Size {
  param([long]$Bytes)
  if ($Bytes -ge 1GB) { return "{0:N2} GB" -f ($Bytes/1GB) }
  if ($Bytes -ge 1MB) { return "{0:N2} MB" -f ($Bytes/1MB) }
  if ($Bytes -ge 1KB) { return "{0:N2} KB" -f ($Bytes/1KB) }
  return "$Bytes B"
}

try {
  if (-not (Test-Path -Path $Path)) { throw "Path not found: $Path" }

  Get-ChildItem -Path $Path -File -Recurse -ErrorAction Stop |
    Sort-Object Length -Descending |
    Select-Object -First $Top @{Name='Path';Expression={$_.FullName}},
                         @{Name='Size';Expression={ ConvertTo-Size -Bytes $_.Length }},
                         LastWriteTime
}
catch {
  Write-Error $($_.Exception.Message)
}

Capstone: A Safe, Reversible Downloads Cleanup Tool

This capstone script demonstrates a common real-world pattern: identify candidates, take a safe action, and make it reversible. Instead of deleting files, it moves them into a dated archive folder. It also supports -WhatIf so you can preview changes.

What the tool will do

  • Scan a folder (default: Downloads) for files older than a chosen number of days.
  • Move them into an archive folder (default: Downloads\_Archive\YYYY-MM-DD).
  • Return an object per moved file (or per candidate when using -WhatIf).
  • Use SupportsShouldProcess so -WhatIf and -Confirm work.

Script: Move-OldDownloads.ps1

<#
.SYNOPSIS
Moves old files from a folder into a dated archive subfolder.
.DESCRIPTION
Safe cleanup: files are moved (reversible) rather than deleted.
Supports -WhatIf and -Confirm.
#>

[CmdletBinding(SupportsShouldProcess=$true, ConfirmImpact='Medium')]
param(
  [Parameter(Mandatory=$false)]
  [string]$Path = "$env:USERPROFILE\Downloads",

  [Parameter(Mandatory=$false)]
  [ValidateRange(1,3650)]
  [int]$OlderThanDays = 30,

  [Parameter(Mandatory=$false)]
  [string]$ArchiveRoot = "$env:USERPROFILE\Downloads\_Archive"
)

function New-ArchiveFolder {
  param(
    [Parameter(Mandatory=$true)]
    [string]$Root
  )

  $dateStamp = (Get-Date).ToString('yyyy-MM-dd')
  $target = Join-Path -Path $Root -ChildPath $dateStamp

  if (-not (Test-Path -Path $target)) {
    New-Item -Path $target -ItemType Directory -ErrorAction Stop | Out-Null
  }

  return $target
}

try {
  if (-not (Test-Path -Path $Path)) { throw "Path not found: $Path" }

  $cutoff = (Get-Date).AddDays(-$OlderThanDays)
  Write-Verbose "Cutoff date: $cutoff"

  $archiveFolder = New-ArchiveFolder -Root $ArchiveRoot
  Write-Verbose "Archive folder: $archiveFolder"

  $candidates = Get-ChildItem -Path $Path -File -ErrorAction Stop |
    Where-Object { $_.LastWriteTime -lt $cutoff }

  foreach ($file in $candidates) {
    $destination = Join-Path -Path $archiveFolder -ChildPath $file.Name

    $action = "Move '$($file.FullName)' to '$destination'"
    if ($PSCmdlet.ShouldProcess($file.FullName, $action)) {
      Move-Item -Path $file.FullName -Destination $destination -ErrorAction Stop
    }

    [pscustomobject]@{
      SourcePath      = $file.FullName
      DestinationPath = $destination
      LastWriteTime   = $file.LastWriteTime
      SizeBytes       = $file.Length
      Action          = if ($WhatIfPreference) { 'Planned' } else { 'Moved' }
    }
  }
}
catch {
  Write-Error "Cleanup failed. $($_.Exception.Message)"
}

Run it safely (preview first)

Preview the moves without changing anything:

.\Move-OldDownloads.ps1 -OlderThanDays 60 -WhatIf

Run for real with verbose details:

.\Move-OldDownloads.ps1 -OlderThanDays 60 -Verbose

Prompt before each move:

.\Move-OldDownloads.ps1 -OlderThanDays 60 -Confirm

Make it easy to undo

Because the script moves files into a dated folder, undo is simply moving them back. You can also use the returned objects to drive an undo action:

$moved = .\Move-OldDownloads.ps1 -OlderThanDays 60
# Example undo: move everything back to Downloads
foreach ($item in $moved) {
  Move-Item -Path $item.DestinationPath -Destination (Join-Path "$env:USERPROFILE\Downloads" (Split-Path $item.DestinationPath -Leaf))
}
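Because each returned object also records SourcePath, an even simpler undo is to send every file back to where it came from:

foreach ($item in $moved) {
  Move-Item -Path $item.DestinationPath -Destination $item.SourcePath
}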

Checklist: Turning a One-Liner into a Reusable Tool

Goal | What to add/change | Why it matters
Reuse | param() with defaults and validation | Run the same script in different contexts safely
Clarity | [CmdletBinding()], Write-Verbose, Write-Warning | Human-friendly messages without polluting data output
Reliability | try/catch + -ErrorAction Stop | Predictable failure behavior and easier troubleshooting
Composability | Return objects ([pscustomobject]) | Enables exporting, filtering, reporting, and reuse
Safety | SupportsShouldProcess + -WhatIf/-Confirm | Preview changes and reduce accidental impact
Maintainability | Small helper functions | Keeps scripts readable and reduces duplication

Now answer the exercise about the content:

When turning a PowerShell one-liner into a reusable script that may move files, what approach best preserves safe automation and keeps output reusable?

Answer: A reusable, safe tool should accept inputs via param() (with validation), return objects for composability, and use SupportsShouldProcess so -WhatIf/-Confirm work before making changes.

Next chapter

Safe Execution and Troubleshooting: Errors, Permissions, and Risk Reduction
