From Interactive to Script: the Same Command, Saved
You already know how to run commands interactively. The next step is to take a command that works, save it into a .ps1 file, and make it reusable: accept input (parameters), produce consistent output (objects), and behave predictably (errors and safety).
Start with a working one-liner
Pick a small task you can run repeatedly. Example: list large files in your Downloads folder.
Get-ChildItem -Path $env:USERPROFILE\Downloads -File -Recurse -ErrorAction SilentlyContinue | Sort-Object Length -Descending | Select-Object -First 20 FullName, Length, LastWriteTime

This is fine interactively, but it has limitations: the path is fixed, the output is tailored for the screen, and errors are hidden rather than handled.
Save It as a Script File (.ps1)
Create a script file and run it
Create a file named Get-LargeDownloads.ps1 and paste the command inside. Then run it by providing the path to the script file.
# From the folder where the script is saved (example: Desktop)
.\Get-LargeDownloads.ps1

If your environment blocks script execution, you may need to run it in a way allowed by your organization or current policy. In many environments, scripts are permitted when signed or when run from trusted locations.
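If you are permitted to adjust the policy yourself, the usual checks look like this (a hedged sketch; whether these commands are allowed depends on your organization):

# Inspect the effective policy at each scope
Get-ExecutionPolicy -List

# A common, relatively conservative setting for the current user only
# (assumes your organization permits this change)
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser

# Scripts downloaded from the internet may also need to be unblocked
Unblock-File -Path .\Get-LargeDownloads.ps1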
Script anatomy you will use most often
- Comment-based help (optional but recommended): makes your script self-documenting.
- param() block: defines inputs and defaults.
- Functions: small reusable pieces inside the script.
- Output: return objects, not formatted strings.
- Error handling: decide what to do when something fails.
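Putting those pieces together, here is a minimal skeleton (the names, parameter, and messages are illustrative placeholders, not a finished tool):

<#
.SYNOPSIS
Short description of what the script does.
#>
[CmdletBinding()]
param(
    # Input with a sensible default
    [string]$Path = '.'
)

function Get-Something {
    param([string]$Target)
    # Small reusable piece of logic
    Get-Item -Path $Target
}

try {
    Write-Verbose "Working on: $Path"   # message stream, not data
    Get-Something -Target $Path         # data: objects to the pipeline
}
catch {
    Write-Error "Something failed. $($_.Exception.Message)"
}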
Refactor Step 1: Add Parameters
Replace hard-coded values with parameters so the script is reusable.
<#
.SYNOPSIS
Lists the largest files under a folder.
.DESCRIPTION
Searches a folder recursively and returns the top N largest files as objects.
#>
param(
[Parameter(Mandatory=$false)]
[string]$Path = "$env:USERPROFILE\Downloads",
[Parameter(Mandatory=$false)]
[ValidateRange(1,500)]
[int]$Top = 20
)
Get-ChildItem -Path $Path -File -Recurse -ErrorAction SilentlyContinue |
Sort-Object Length -Descending |
Select-Object -First $Top FullName, Length, LastWriteTime

Now you can run:
.\Get-LargeDownloads.ps1 -Path "C:\Temp" -Top 10

Refactor Step 2: Return Objects (Not Pre-Formatted Text)
A common beginner mistake is to format output inside the script (for example, with Format-Table) and then try to export or reuse that output. Prefer returning objects, and let the caller decide whether to format, export, or process them further.
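To see why, here is a sketch of the anti-pattern: once Format-Table has run, the pipeline carries formatting objects rather than file objects, so an export produces unusable rows (exact output varies by PowerShell version):

# Anti-pattern: formatting inside the pipeline, then trying to reuse the result
Get-ChildItem -Path $env:USERPROFILE\Downloads -File |
    Format-Table Name, Length |
    Export-Csv -NoTypeInformation -Path "$env:TEMP\broken.csv"
# The CSV contains serialized formatting data, not Name/Length columns.

# Preferred: keep objects in the pipeline; format only at the very end, in the console
Get-ChildItem -Path $env:USERPROFILE\Downloads -File |
    Select-Object Name, Length |
    Export-Csv -NoTypeInformation -Path "$env:TEMP\usable.csv"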
Good: return objects with consistent properties.
param(
[string]$Path = "$env:USERPROFILE\Downloads",
[ValidateRange(1,500)]
[int]$Top = 20
)
Get-ChildItem -Path $Path -File -Recurse -ErrorAction SilentlyContinue |
Sort-Object Length -Descending |
Select-Object -First $Top @{Name='Path';Expression={$_.FullName}},
@{Name='SizeMB';Expression={[math]::Round($_.Length/1MB,2)}},
LastWriteTime

Example usage:
$items = .\Get-LargeDownloads.ps1 -Top 5
$items | Export-Csv -NoTypeInformation -Path "$env:TEMP\largest-downloads.csv"

Refactor Step 3: Add Clear, Controlled Messaging
Separate “data output” from “messages.” Data should be objects written to the pipeline. Messages should use streams intended for humans.
- Write-Verbose for optional detail (enabled with -Verbose).
- Write-Warning for non-fatal issues.
- Write-Error for errors (can be terminating or non-terminating).
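Because these cmdlets write to separate streams, messages never contaminate the data you capture. A small sketch (4> redirects the verbose stream; the log path is just an example):

# Data goes into the variable; verbose text goes to the console only
$items = .\Get-LargeDownloads.ps1 -Verbose

# Or redirect the verbose stream (4) to a log file instead
$items = .\Get-LargeDownloads.ps1 -Verbose 4> "$env:TEMP\scan-verbose.log"
$items.Count   # still counts only the data objects

With that separation in mind, here is the script with [CmdletBinding()] added so common parameters such as -Verbose work: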
[CmdletBinding()]
param(
[string]$Path = "$env:USERPROFILE\Downloads",
[ValidateRange(1,500)]
[int]$Top = 20
)
Write-Verbose "Scanning: $Path"
Get-ChildItem -Path $Path -File -Recurse -ErrorAction SilentlyContinue |
Sort-Object Length -Descending |
Select-Object -First $Top @{Name='Path';Expression={$_.FullName}},
@{Name='SizeMB';Expression={[math]::Round($_.Length/1MB,2)}},
LastWriteTime

Run with verbose messages:
.\Get-LargeDownloads.ps1 -Verbose

Refactor Step 4: Handle Errors Intentionally
Many cmdlets emit non-terminating errors by default. To catch them with try/catch, you typically need to make them terminating using -ErrorAction Stop.
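A quick sketch of the difference (the path is deliberately invalid; exact error text varies by version):

# Non-terminating: the error is printed, but the catch block never runs
try {
    Get-ChildItem -Path 'C:\Path\That\Does\Not\Exist'
}
catch {
    Write-Host 'You will not see this.'
}

# Terminating: -ErrorAction Stop promotes the error, so catch handles it
try {
    Get-ChildItem -Path 'C:\Path\That\Does\Not\Exist' -ErrorAction Stop
}
catch {
    Write-Host "Caught: $($_.Exception.Message)"
}

Applying the same pattern to the script: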
[CmdletBinding()]
param(
[string]$Path = "$env:USERPROFILE\Downloads",
[ValidateRange(1,500)]
[int]$Top = 20
)
try {
if (-not (Test-Path -Path $Path)) {
throw "Path not found: $Path"
}
Write-Verbose "Scanning: $Path"
Get-ChildItem -Path $Path -File -Recurse -ErrorAction Stop |
Sort-Object Length -Descending |
Select-Object -First $Top @{Name='Path';Expression={$_.FullName}},
@{Name='SizeMB';Expression={[math]::Round($_.Length/1MB,2)}},
LastWriteTime
}
catch {
Write-Error "Failed to scan '$Path'. $($_.Exception.Message)"
}

This pattern makes failures explicit and easier to troubleshoot.
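Because the script uses [CmdletBinding()], the caller also gets the common error parameters. A hedged usage sketch:

# Capture any reported errors into a variable instead of showing them in the console
$results = .\Get-LargeDownloads.ps1 -Path 'C:\DoesNotExist' -ErrorAction SilentlyContinue -ErrorVariable scanErrors
if ($scanErrors) {
    Write-Warning "Scan reported $($scanErrors.Count) error(s)."
}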
Introduce a Simple Function (Reusable Building Block)
Functions help you avoid repeating logic and keep scripts readable. A practical use is converting bytes to a friendly size.
function ConvertTo-Size {
param(
[Parameter(Mandatory=$true)]
[long]$Bytes
)
if ($Bytes -ge 1GB) { return "{0:N2} GB" -f ($Bytes/1GB) }
if ($Bytes -ge 1MB) { return "{0:N2} MB" -f ($Bytes/1MB) }
if ($Bytes -ge 1KB) { return "{0:N2} KB" -f ($Bytes/1KB) }
return "$Bytes B"
}

Use it while still returning objects:
[CmdletBinding()]
param(
[string]$Path = "$env:USERPROFILE\Downloads",
[ValidateRange(1,500)]
[int]$Top = 20
)
function ConvertTo-Size {
param([long]$Bytes)
if ($Bytes -ge 1GB) { return "{0:N2} GB" -f ($Bytes/1GB) }
if ($Bytes -ge 1MB) { return "{0:N2} MB" -f ($Bytes/1MB) }
if ($Bytes -ge 1KB) { return "{0:N2} KB" -f ($Bytes/1KB) }
return "$Bytes B"
}
try {
if (-not (Test-Path -Path $Path)) { throw "Path not found: $Path" }
Get-ChildItem -Path $Path -File -Recurse -ErrorAction Stop |
Sort-Object Length -Descending |
Select-Object -First $Top @{Name='Path';Expression={$_.FullName}},
@{Name='Size';Expression={ ConvertTo-Size -Bytes $_.Length }},
LastWriteTime
}
catch {
Write-Error $($_.Exception.Message)
}Capstone: A Safe, Reversible Downloads Cleanup Tool
This capstone script demonstrates a common real-world pattern: identify candidates, take a safe action, and make it reversible. Instead of deleting files, it moves them into a dated archive folder. It also supports -WhatIf so you can preview changes.
What the tool will do
- Scan a folder (default: Downloads) for files older than a chosen number of days.
- Move them into an archive folder (default: Downloads\_Archive\YYYY-MM-DD).
- Return an object per moved file (or per candidate when using -WhatIf).
- Use SupportsShouldProcess so -WhatIf and -Confirm work (see the short sketch after this list).
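The SupportsShouldProcess pattern is worth seeing in isolation before the full script. A minimal sketch (the function name and action text are illustrative):

function Invoke-DemoAction {
    [CmdletBinding(SupportsShouldProcess=$true, ConfirmImpact='Medium')]
    param([string]$Target)

    # ShouldProcess returns $false under -WhatIf (and prompts under -Confirm),
    # so the guarded action is previewed or confirmed instead of just running.
    if ($PSCmdlet.ShouldProcess($Target, 'Archive')) {
        Write-Verbose "Archiving $Target"
        # ...the real action would go here...
    }
}

Invoke-DemoAction -Target 'example.txt' -WhatIf    # prints "What if: ..." and does nothing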
Script: Move-OldDownloads.ps1
<#
.SYNOPSIS
Moves old files from a folder into a dated archive subfolder.
.DESCRIPTION
Safe cleanup: files are moved (reversible) rather than deleted.
Supports -WhatIf and -Confirm.
#>
[CmdletBinding(SupportsShouldProcess=$true, ConfirmImpact='Medium')]
param(
[Parameter(Mandatory=$false)]
[string]$Path = "$env:USERPROFILE\Downloads",
[Parameter(Mandatory=$false)]
[ValidateRange(1,3650)]
[int]$OlderThanDays = 30,
[Parameter(Mandatory=$false)]
[string]$ArchiveRoot = "$env:USERPROFILE\Downloads\_Archive"
)
function New-ArchiveFolder {
param(
[Parameter(Mandatory=$true)]
[string]$Root
)
$dateStamp = (Get-Date).ToString('yyyy-MM-dd')
$target = Join-Path -Path $Root -ChildPath $dateStamp
if (-not (Test-Path -Path $target)) {
New-Item -Path $target -ItemType Directory -ErrorAction Stop | Out-Null
}
return $target
}
try {
if (-not (Test-Path -Path $Path)) { throw "Path not found: $Path" }
$cutoff = (Get-Date).AddDays(-$OlderThanDays)
Write-Verbose "Cutoff date: $cutoff"
$archiveFolder = New-ArchiveFolder -Root $ArchiveRoot
Write-Verbose "Archive folder: $archiveFolder"
$candidates = Get-ChildItem -Path $Path -File -ErrorAction Stop |
Where-Object { $_.LastWriteTime -lt $cutoff }
foreach ($file in $candidates) {
$destination = Join-Path -Path $archiveFolder -ChildPath $file.Name
$action = "Move '$($file.FullName)' to '$destination'"
if ($PSCmdlet.ShouldProcess($file.FullName, $action)) {
Move-Item -Path $file.FullName -Destination $destination -ErrorAction Stop
}
[pscustomobject]@{
SourcePath = $file.FullName
DestinationPath = $destination
LastWriteTime = $file.LastWriteTime
SizeBytes = $file.Length
Action = if ($WhatIfPreference) { 'Planned' } else { 'Moved' }
}
}
}
catch {
Write-Error "Cleanup failed. $($_.Exception.Message)"
}

Run it safely (preview first)
Preview the moves without changing anything:
.\Move-OldDownloads.ps1 -OlderThanDays 60 -WhatIf

Run for real with verbose details:
.\Move-OldDownloads.ps1 -OlderThanDays 60 -Verbose

Prompt before each move:
.\Move-OldDownloads.ps1 -OlderThanDays 60 -Confirm

Make it easy to undo
Because the script moves files into a dated folder, undo is simply moving them back. You can also use the returned objects to drive an undo action:
$moved = .\Move-OldDownloads.ps1 -OlderThanDays 60
# Example undo: move everything back to Downloads
foreach ($item in $moved) {
Move-Item -Path $item.DestinationPath -Destination (Join-Path $env:USERPROFILE\Downloads (Split-Path $item.DestinationPath -Leaf))
}

Checklist: Turning a One-Liner into a Reusable Tool
| Goal | What to add/change | Why it matters |
|---|---|---|
| Reuse | param() with defaults and validation | Run the same script in different contexts safely |
| Clarity | [CmdletBinding()], Write-Verbose, Write-Warning | Human-friendly messages without polluting data output |
| Reliability | try/catch + -ErrorAction Stop | Predictable failure behavior and easier troubleshooting |
| Composability | Return objects ([pscustomobject]) | Enables exporting, filtering, reporting, and reuse |
| Safety | SupportsShouldProcess + -WhatIf/-Confirm | Preview changes and reduce accidental impact |
| Maintainability | Small helper functions | Keeps scripts readable and reduces duplication |