Calibration Concepts for Robotics Sensors: Offsets, Scale Factors, and Frame Alignment

Chapter 11

Estimated reading time: 13 minutes

Calibration as a Process (Not a One-Time Tweak)

In robotics, calibration is the structured identification and management of parameters that map a sensor’s raw output to a physically meaningful quantity in the robot’s coordinate frames. Treat calibration as a lifecycle: (1) define the model and parameters, (2) estimate parameters using a controlled procedure, (3) validate on holdout data and against references, (4) deploy with traceable storage, and (5) monitor drift and trigger recalibration when needed.

A useful mental model is that every measurement pipeline has two layers of calibration:

  • Sensor model calibration: offsets, scale factors, nonlinearity terms, temperature dependence.
  • Geometric calibration: alignment between sensor axes and robot frames (misalignment, mounting angles, lever arms), plus timing alignment when relevant.

Factory Calibration vs In-System Calibration

Factory calibration is performed with specialized fixtures and reference instruments. It often provides baseline parameters (e.g., nominal scale factors, lens intrinsics, IMU bias specs) and sometimes per-unit calibration constants. It is valuable, but it assumes a controlled environment and does not include your robot’s mounting, wiring, mechanical tolerances, or operational temperature range.

In-system calibration is performed on the assembled robot. It captures installation-specific effects: wheel diameter under load, encoder-to-wheel coupling, IMU axis misalignment due to mounting, range sensor tilt, camera extrinsics relative to the robot base, and any changes after maintenance. In-system calibration should be repeatable and documented, and it should be re-run after defined triggers (e.g., wheel replacement, sensor remounting, collision, firmware changes).

Core Calibration Parameters

Offset (Bias)

An offset is an additive error: the sensor reads a nonzero value when the true input is zero (or reads a constant shift across the range). Offsets commonly arise from electronics, mounting stress, and temperature. A simple model is:

y = x + b

where b is the offset.

Scale Factor (Gain)

A scale factor is a multiplicative error: the sensor changes with the input but with the wrong slope. A simple model is:

y = s * x

where s is the scale factor.

Offset + Scale Together

Most practical calibrations start with an affine model:

y = s * x + b

and solve for s and b from reference measurements. This is often sufficient for encoders (distance), simple range sensors (distance), and some IMU checks (bias).
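
As a concrete illustration, the sketch below fits s and b by ordinary least squares from paired reference/raw measurements, then inverts the model to correct new readings. The data values and names are hypothetical; any least-squares routine would do.

  import numpy as np

  # Paired calibration data (hypothetical): x_ref are reference (true) inputs,
  # y_raw are the corresponding raw sensor readings.
  x_ref = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
  y_raw = np.array([0.58, 1.09, 1.61, 2.12, 2.63])

  # Fit y = s*x + b by ordinary least squares.
  A = np.column_stack([x_ref, np.ones_like(x_ref)])
  (s, b), *_ = np.linalg.lstsq(A, y_raw, rcond=None)

  # Correction inverts the model: estimate the true input from a raw reading.
  def correct(y_measured):
      return (y_measured - b) / s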

Nonlinearity

Nonlinearity means the relationship between raw output and true input cannot be captured by a single scale factor. You may need a higher-order model (polynomial), a piecewise linear model, or a lookup table:

y = f(x)  (e.g., y = a2*x^2 + a1*x + a0)

Nonlinearity calibration is justified when residuals after affine calibration show systematic curvature rather than random scatter.
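
When residuals do show curvature, a quadratic is the smallest step up from the affine model. The sketch below uses synthetic data for illustration; fitting the inverse mapping (raw to true) directly makes correction a single polynomial evaluation.

  import numpy as np

  # Synthetic calibration pairs with mild curvature that an affine fit cannot remove.
  x_ref = np.linspace(0.2, 2.0, 10)
  y_raw = 0.05 * x_ref**2 + 1.02 * x_ref + 0.01

  # Fit the inverse mapping raw -> true, so correction is one evaluation.
  inv_coeffs = np.polyfit(y_raw, x_ref, deg=2)   # highest-order coefficient first
  x_est = np.polyval(inv_coeffs, y_raw)

  # Keep the lowest order whose residuals show scatter rather than structure.
  residuals = x_ref - x_est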

Misalignment and Frame Alignment

Misalignment occurs when the sensor’s measurement axes are rotated relative to the robot’s expected axes. This is a geometric calibration problem: you estimate a rotation (and sometimes translation) between frames.

For a 3-axis sensor, a common model is:

v_robot = R * v_sensor

where R is a 3×3 rotation matrix (or equivalent quaternion). In practice, you may estimate small-angle misalignment (roll/pitch/yaw offsets) or a full rotation depending on accuracy needs.

A Structured Calibration Workflow Template

1) Define the Measurement Model

Write down what you will correct: offset only, offset+scale, nonlinearity, and/or alignment. Decide the parameterization (e.g., b, s, polynomial coefficients, rotation angles).

2) Choose Reference(s) and Fixtures

Pick a reference measurement that is more accurate than the sensor under calibration and is traceable in your context (tape measure, calibration target, precision turntable, known distance markers, motion capture, or a well-characterized reference sensor). Ensure the setup constrains the degrees of freedom you are trying to estimate.

3) Collect Data with Coverage

Collect data across the operating range: different speeds, distances, angles, and temperatures if relevant. Avoid collecting only “easy” points (e.g., only short distances or only one orientation).

4) Estimate Parameters

Fit the model parameters using least squares or robust regression. Keep raw logs so you can re-fit later if you change the model.

5) Validate (Holdout + Cross-Checks)

Validate on data not used for fitting, and cross-check against independent references or alternative sensing modalities. Quantify residual errors and look for systematic patterns.

6) Deploy + Monitor Drift

Store calibration constants with versioning and metadata. Monitor residuals during operation to detect drift and trigger recalibration.

Mini-Workflow 1: Encoder Distance Calibration (Effective Wheel Scale)

Goal: calibrate the mapping from encoder ticks to traveled distance for your assembled robot, capturing effective wheel diameter, tire compression, and drivetrain coupling. This is typically a scale-factor problem, sometimes with left/right differences.

Parameters

  • Distance scale per wheel: s_L, s_R (meters per tick, or correction multiplier).
  • Optional offset: usually not needed for incremental encoders, but you may handle a start/stop detection offset in software separately.

Setup

  • Flat surface with a straight reference line.
  • Reference distance measurement (tape measure, marked floor, or a calibrated track).
  • Consistent payload and tire pressure (if applicable).

Step-by-step

  1. Log raw ticks for left and right encoders while driving a straight segment of known length D_ref (e.g., 5–20 m to reduce relative measurement error).
  2. Repeat for multiple runs and speeds (slow/medium), and in both directions to average out floor slope effects.
  3. Compute measured distance from current parameters: D_meas = s * ticks (or using your existing ticks-to-distance conversion).
  4. Estimate correction: if using a single scale factor k, set k = D_ref / D_meas. If calibrating per wheel, compute k_L and k_R from each wheel’s ticks against the same D_ref (assuming straight motion and negligible slip).
  5. Update parameters: s_new = k * s_old (or directly set meters-per-tick from D_ref / ticks).
  6. Holdout validation: drive a different distance (not used in fitting), and verify the residual error stays within your tolerance.
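
Steps 3-5 reduce to a few lines of arithmetic. A minimal sketch, with hypothetical tick counts and scale values:

  import numpy as np

  D_ref = 10.0  # tape-measured straight-line distance (m)
  runs = [(32010, 32340), (31985, 32290), (32044, 32371)]  # hypothetical (left, right) ticks

  ticks_L = np.mean([left for left, right in runs])
  ticks_R = np.mean([right for left, right in runs])

  # Option A: set meters-per-tick directly from the reference distance.
  s_L = D_ref / ticks_L
  s_R = D_ref / ticks_R

  # Option B: multiplicative correction of an existing scale, k = D_ref / D_meas.
  s_old = 0.000312               # current meters-per-tick (hypothetical)
  k = D_ref / (s_old * ticks_L)  # equals s_L / s_old
  s_new = k * s_old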

Practical notes

  • If results vary strongly with speed or acceleration, you are likely seeing slip or compliance effects; treat calibration as “effective” for a defined operating regime and document that regime.
  • If left/right scales differ significantly, inspect mechanical asymmetry (wheel wear, tire pressure, encoder mounting).

Mini-Workflow 2: IMU Bias and Axis Alignment Checks

Goal: verify and correct IMU biases (especially gyro bias) and confirm that the IMU axes are aligned with the robot frame (or estimate the rotation between them). This is both a sensor-model and geometric calibration task.

Parameters

  • Gyro bias vector b_g (rad/s).
  • Accel bias vector b_a (m/s²), often more sensitive to mounting stress and temperature.
  • Axis alignment rotation R_robot_from_imu (or small roll/pitch/yaw misalignment angles).

Step-by-step: Gyro bias (static)

  1. Warm up the robot for a consistent time (e.g., 5–10 minutes) if the IMU exhibits temperature-dependent bias.
  2. Keep the robot motionless on a stable surface for a fixed interval (e.g., 30–120 s).
  3. Log gyro readings g(t).
  4. Estimate bias as the mean: b_g = mean(g(t)).
  5. Apply correction: g_corr(t) = g(t) - b_g.
  6. Holdout check: repeat a second static interval later; the corrected mean should be near zero within your expected tolerance.
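
In code, the estimation step is a single mean over the static log. A minimal sketch, assuming a CSV log with one (gx, gy, gz) row per sample; the file name is hypothetical:

  import numpy as np

  gyro = np.loadtxt("static_gyro_log.csv", delimiter=",")  # N x 3, rad/s

  b_g = gyro.mean(axis=0)    # step 4: bias = mean over the static interval
  noise = gyro.std(axis=0)   # sanity check: a large std suggests the robot moved

  def correct(g):
      # Step 5: subtract the estimated bias from a raw gyro sample.
      return g - b_g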

Step-by-step: Accelerometer bias sanity check (static orientations)

  1. Place the robot in several known static orientations (e.g., level, tilted on one side, tilted forward) if feasible and safe.
  2. For each pose, log accelerometer readings and compare the magnitude to expected gravity |a| ≈ g.
  3. Look for axis-specific offsets: if one axis shows a consistent shift across poses, it suggests bias or misalignment.
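
A quick script for this check might look like the following sketch (pose names and file paths are hypothetical); it prints the gravity-magnitude deviation and per-axis means for each pose:

  import numpy as np

  G = 9.81  # m/s^2, approximate local gravity

  poses = {"level": "accel_level.csv", "on_side": "accel_side.csv", "nose_down": "accel_nose.csv"}
  for name, path in poses.items():
      a = np.loadtxt(path, delimiter=",").mean(axis=0)  # mean (ax, ay, az) in m/s^2
      mag = np.linalg.norm(a)
      print(f"{name}: |a| = {mag:.3f} m/s^2 (deviation {mag - G:+.3f}), axes = {a.round(3)}")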

Step-by-step: Axis alignment check (qualitative + quantitative)

  1. Command a pure rotation about the robot’s vertical axis (e.g., rotate in place slowly).
  2. Observe gyro axes: the dominant angular rate should appear primarily on the IMU axis that corresponds to robot yaw. Significant energy on other axes suggests misalignment.
  3. Estimate alignment by fitting a rotation R that best maps IMU angular rate vectors to the expected robot-frame rotation axis over the dataset (least squares on direction vectors).
  4. Validate by repeating the motion and checking that transformed rates concentrate on the expected axis.
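
Step 3 is a classic least-squares rotation fit (the Kabsch/Wahba problem). A sketch with NumPy, assuming a CSV gyro log from a slow in-place yaw rotation; note the caveat in the final comment.

  import numpy as np

  rates = np.loadtxt("yaw_spin_gyro.csv", delimiter=",")  # hypothetical N x 3 log, rad/s
  rates = rates[np.linalg.norm(rates, axis=1) > 0.05]     # keep samples with clear rotation
  u_imu = rates / np.linalg.norm(rates, axis=1, keepdims=True)
  v_robot = np.tile([0.0, 0.0, 1.0], (len(u_imu), 1))     # expected direction: robot yaw axis

  # Least-squares rotation: minimize sum ||R @ u_imu - v_robot||^2 via SVD.
  H = u_imu.T @ v_robot
  U, _, Vt = np.linalg.svd(H)
  d = np.sign(np.linalg.det(Vt.T @ U.T))                  # guard against a reflection
  R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T                 # maps IMU vectors into the robot frame

  # Caveat: one rotation axis leaves the spin about that axis unconstrained;
  # repeat the fit with data from a second axis to pin down all three angles.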

Practical notes

  • Bias is not always constant; document the temperature and warm-up state when estimating it.
  • Mounting changes (replacing foam tape, tightening screws) can change accelerometer bias due to stress; treat this as a recalibration trigger.

Mini-Workflow 3: Range Sensor Mounting Angle Verification

Goal: verify that a range sensor (ultrasonic/IR/ToF) is mounted at the intended pitch/yaw so that its reported distance corresponds to the correct direction in the robot frame. This is primarily an extrinsic (geometric) calibration: a small tilt can create systematic distance errors when measuring a flat wall or floor.

Parameters

  • Mounting yaw/pitch angles (or a rotation R_robot_from_sensor).
  • Optional sensor origin offset (translation) if you need accurate mapping to the robot base frame.

Step-by-step: Wall-based angle check

  1. Place a flat wall (or large flat board) in front of the robot. Ensure the wall is approximately perpendicular to the intended sensor forward axis.
  2. Measure reference distance from the sensor face (or defined sensor origin) to the wall at several robot positions along a straight line normal to the wall (e.g., 0.5 m, 1.0 m, 1.5 m).
  3. Log sensor readings at each position, averaging multiple samples per position to reduce random variation.
  4. Check residual pattern: if the sensor is tilted, the error often grows with distance or changes sign depending on geometry. Plot error = d_sensor - d_ref versus d_ref.
  5. Estimate tilt: adjust the assumed mounting angle in your geometric model until the residuals are minimized across positions (simple 1D search over pitch/yaw if small).
  6. Validate on holdout positions not used in fitting (different distances and slight lateral offsets).
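
Step 5 can be a brute-force one-dimensional search, as in the sketch below. It assumes the simple flat-wall geometry in which a tilt theta lengthens the beam path to d_ref / cos(theta); the distances are hypothetical.

  import numpy as np

  d_ref = np.array([0.5, 1.0, 1.5])          # reference perpendicular distances (m)
  d_meas = np.array([0.502, 1.004, 1.507])   # averaged sensor readings (hypothetical)

  # 1D search over tilt: pick the angle that minimizes squared residuals.
  thetas = np.radians(np.linspace(0.0, 10.0, 1001))
  cost = [np.sum((d_meas - d_ref / np.cos(t)) ** 2) for t in thetas]
  theta_hat = thetas[np.argmin(cost)]
  print(f"estimated tilt: {np.degrees(theta_hat):.2f} deg")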

Practical notes

  • Use a large target to avoid edge effects; ensure the beam/field-of-view is fully on the wall.
  • If the sensor is used for mapping, also verify the sensor’s height and forward offset relative to the robot base frame; small translation errors can matter near obstacles.

Mini-Workflow 4: Camera Intrinsic and Extrinsic Calibration

Goal: obtain camera parameters that allow accurate projection between 3D rays and image pixels, and accurate placement of the camera in the robot frame. Intrinsics describe the camera’s internal geometry; extrinsics describe the camera pose relative to a robot frame (e.g., base_link) or another sensor.

Parameters

  • Intrinsics: focal lengths, principal point, distortion coefficients (model-dependent).
  • Extrinsics: rotation and translation between camera frame and robot frame (or between camera and a calibration target frame).

Step-by-step: Intrinsic calibration (target-based)

  1. Select a calibration target (e.g., checkerboard or fiducial grid) with known dimensions.
  2. Capture images covering the full field of view: move the target to corners/edges, vary distance, and vary orientation (tilt/roll) to excite distortion parameters.
  3. Detect target features (corners/markers) and reject frames with poor detection.
  4. Fit intrinsics by minimizing reprojection error (difference between observed feature pixels and projected pixels from the model).
  5. Holdout validation: calibrate using a subset of images, then compute reprojection error on the remaining images. Inspect whether errors are larger near edges (often indicates insufficient coverage or wrong distortion model).
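
With OpenCV, the procedure above is a handful of calls. A sketch for a checkerboard target; the pattern size, square size, and image folder are hypothetical.

  import glob
  import cv2
  import numpy as np

  pattern = (9, 6)   # inner corners of the checkerboard (hypothetical)
  square = 0.025     # square edge length in meters (hypothetical)
  objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
  objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

  obj_points, img_points = [], []
  for path in glob.glob("calib_images/*.png"):
      gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
      found, corners = cv2.findChessboardCorners(gray, pattern)
      if not found:
          continue  # step 3: reject frames with poor detection
      obj_points.append(objp)
      img_points.append(corners)

  # Step 4: fit intrinsics by minimizing reprojection error.
  rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
      obj_points, img_points, gray.shape[::-1], None, None)
  print("RMS reprojection error (px):", rms)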

Step-by-step: Extrinsic calibration (camera-to-robot)

  1. Define the robot frame you care about (e.g., base frame at the center of the axle, or a frame used by navigation).
  2. Place the target in a known pose relative to the robot frame (using a fixture, measured offsets, or a repeatable docking position).
  3. Estimate camera pose relative to the target from images using the already-calibrated intrinsics.
  4. Compute camera-to-robot transform from the known target-to-robot transform and the estimated camera-to-target transform.
  5. Validate by projecting known 3D points (e.g., target corners in robot frame) into the image and checking pixel residuals, or by checking consistency with another sensor (e.g., range sensor detections aligned with image features).
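
Steps 3-4 amount to one PnP solve and one transform chain. A sketch as a helper function, assuming the intrinsics (K, dist), the detected target correspondences, and the fixture-measured T_robot_from_target are available from the earlier steps:

  import numpy as np
  import cv2

  def to_T(R, t):
      # Pack a rotation matrix and translation into a 4x4 homogeneous transform.
      T = np.eye(4)
      T[:3, :3] = R
      T[:3, 3] = np.asarray(t).ravel()
      return T

  def camera_extrinsics(obj_pts, img_pts, K, dist, T_robot_from_target):
      # Step 3: estimate the target pose in the camera frame from one image.
      ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist)
      if not ok:
          raise RuntimeError("PnP failed; check correspondences")
      R_cam_from_target, _ = cv2.Rodrigues(rvec)
      T_cam_from_target = to_T(R_cam_from_target, tvec)
      # Step 4: chain robot<-target with target<-camera (inverse of camera<-target).
      return T_robot_from_target @ np.linalg.inv(T_cam_from_target)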

Practical notes

  • If the camera mount can flex, extrinsics can drift with vibration; consider mechanical reinforcement or periodic recalibration checks.
  • Document the image resolution and any cropping/binning; intrinsics depend on these settings.

Validation Methodology: Proving Calibration Works

Holdout Tests (Train/Validate Split)

Do not validate on the same dataset used to fit parameters. A simple practice is to split data by time or by condition:

  • By condition: calibrate using short distances, validate on long distances; calibrate at one speed, validate at another.
  • By geometry: calibrate with target centered, validate with target near image edges.

Report validation metrics relevant to the sensor: distance error (m), angular error (deg), reprojection error (pixels), or frame alignment error (deg).

Cross-Checking Against Reference Measurements

Use at least one independent reference path to detect “calibrating to your own mistakes.” Examples:

  • Encoder distance vs measured tape distance (independent of encoder model).
  • IMU yaw rate integration vs a turntable angle or a known rotation command with external verification.
  • Range sensor distance vs a physical gauge block or a laser distance meter (when available).
  • Camera extrinsics vs a second method (e.g., mechanical measurement of mount offsets) as a sanity check.

Monitoring Calibration Drift Over Time

Calibration parameters can change due to wear, temperature cycling, shocks, and maintenance. Implement lightweight monitors:

  • Residual monitors: track the difference between predicted and observed measurements in scenarios with known structure (e.g., repeated docking distance, known wall distance at a station).
  • Consistency checks: compare redundant sensors (e.g., left vs right wheel distance over straight runs; camera-based landmark distance vs range sensor).
  • Trend logging: store time series of estimated biases (e.g., gyro bias at startup) to detect gradual drift.

Sensor        Common drift symptom                         Simple monitor
Encoders      Distance error grows after wheel wear        Periodic straight-line distance check on a marked track
IMU           Startup bias changes with temperature/age    Log static gyro mean at startup and track trend
Range sensor  Mount loosens, tilt changes                  Check wall distance residual at a fixed station
Camera        Extrinsics shift after impact                Reprojection error on a quick target snapshot
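
A lightweight trend logger for the IMU row above might look like the sketch below; the tolerance and file name are hypothetical and should come from your own validation data.

  import json
  import time

  BIAS_TOL = 0.01  # rad/s, hypothetical recalibration threshold

  def log_startup_bias(b_g, path="gyro_bias_trend.jsonl"):
      # Append this startup's static gyro bias as one JSON record per line.
      with open(path, "a") as f:
          f.write(json.dumps({"t": time.time(), "bias_rad_s": list(b_g)}) + "\n")
      if max(abs(x) for x in b_g) > BIAS_TOL:
          print("WARN: startup gyro bias exceeds tolerance; schedule recalibration")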

Documentation and Deployment Practices

Store Calibration Constants with Versioning

Calibration should be treated like software configuration: traceable, reviewable, and reproducible.

  • Use a structured file (JSON/YAML) per robot and per sensor, including parameter names, units, and frames.
  • Include a version field and a schema version so you can evolve the model without ambiguity.
  • Record provenance: who/what generated it (script name, git commit hash, tool version).
{  "robot_id": "R17",  "calibration_version": "2026-01-15",  "schema": "calib_v2",  "encoder": {"meters_per_tick_left": 0.000312, "meters_per_tick_right": 0.000309},  "imu": {"gyro_bias_rad_s": [0.0012, -0.0004, 0.0009], "R_robot_from_imu": [[...],[...],[...]]},  "camera": {"resolution": [1280,720], "intrinsics": {...}, "T_robot_from_camera": {...}},  "metadata": {"tool": "calib_suite", "commit": "a1b2c3d", "operator": "tech_04"}}

Record Environmental and Operational Conditions

Calibration results depend on conditions. Store at least:

  • Temperature (and warm-up duration), especially for IMUs and cameras.
  • Payload and tire pressure (for wheeled odometry).
  • Surface type (for wheel slip sensitivity).
  • Sensor settings (camera exposure, resolution; range sensor mode; IMU sampling rate).

Define Recalibration Triggers (Explicit Policies)

Make recalibration a planned maintenance action with clear triggers:

  • Hardware changes: wheel/tire replacement, encoder remounting, IMU/camera/range sensor remounting, chassis repair.
  • Shocks and incidents: collisions, drops, or any event that could shift mounts.
  • Software/firmware changes: changes to filtering, timing, or sensor configuration that affect interpretation of raw data.
  • Performance thresholds: validation residual exceeds a limit (e.g., encoder distance error > 1% on holdout track; camera reprojection error above a set pixel threshold; range sensor station check error above tolerance).
  • Time-based: periodic recalibration schedule for fleets (e.g., monthly) if drift is known.

Now answer the exercise about the content:

Which activity is primarily part of geometric calibration rather than sensor model calibration?

Geometric calibration focuses on aligning frames, such as estimating a rotation between sensor axes and robot axes. Bias and scale factor corrections are part of the sensor model calibration layer.

Next chapter

Filtering Sensor Data: From Moving Averages to Complementary Filters
