Open Source · 2025-01 to Ongoing · Solo

DisplayAnalysis

Detect the display artifacts that cause eye strain.

3.8K

Python LOC

3

UI Modes

480Hz+

Video Support

6

Metric Types

CLI · GUI · FFT · CIELAB · PDF Reports

Overview

I built DisplayAnalysis to quantify display artifacts that are hard to see but easy to feel. It analyzes high-frame-rate captures to detect temporal dithering, PWM backlight flicker, and uniformity issues, then generates a PDF report with plain-language explanations and risk assessments. Under the hood it uses FFT frequency analysis and perceptual (CIELAB) color metrics.

Problem

Display artifacts cause eye strain but are invisible.

Modern displays can use techniques like temporal dithering or PWM backlight dimming that operate below conscious perception—but still cause headaches and fatigue for some people.

I wanted an objective way to answer “is my display doing something weird?” without lab equipment: take a high-speed capture, compute metrics, and generate a report that’s shareable and understandable.

Solution

High-speed video analysis makes the invisible measurable.

I run a repeatable analysis pipeline: extract frames, compute temporal/spatial/color metrics, detect periodic flicker in the frequency domain, then render a PDF.

The output is quantified and explainable: you get numbers, charts, worst-case frames, and thresholds that translate “signal strength” into a practical risk assessment.
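As a minimal sketch of the temporal-metrics step, the frame-to-frame MAD and RMS described above can be computed with NumPy. The synthetic frame stack and the exact metric definitions here are illustrative assumptions, not the project's actual code:

```python
import numpy as np

# Synthetic grayscale capture: 8 frames of a flat 128 field with one
# pixel toggling +/-1 each frame (a dither-like artifact).
frames = np.full((8, 4, 4), 128.0)
frames[:, 1, 2] += np.where(np.arange(8) % 2 == 0, 1.0, -1.0)

# Frame-to-frame differences drive both temporal metrics.
diffs = np.diff(frames, axis=0)        # shape (7, 4, 4)
mad = np.mean(np.abs(diffs))           # mean absolute difference
rms = np.sqrt(np.mean(diffs ** 2))     # root-mean-square difference

print(round(float(mad), 4), round(float(rms), 4))  # 0.125 0.5
```

Even a single toggling pixel shows up in both aggregates, which is why these metrics can surface shimmer that is invisible frame by frame.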

Workflow

Capture → Analyze → Report

Record your display with a high-speed camera, run the analysis, and receive a comprehensive PDF report with quantified metrics and risk assessments.

Lifecycle

INPUT_VALIDATION → ROI_SELECTION → FRAME_EXTRACTION → METRIC_CALCULATION → REPORT_GENERATION

Command Line Interface

CLI

analyze-display <input>

Run full analysis pipeline

CLI

analyze-display --interactive

Launch guided wizard mode

CLI

analyze-display-gui

Launch Tkinter GUI

Python

run_analysis(input_path, **opts)

Programmatic API

Architecture

Modular pipeline architecture separating frame processing, metric calculation, and report generation.

User Interface

  • CLI (argparse)
  • Interactive Wizard
  • Tkinter GUI

Orchestration

  • analyze_display.py
  • run_analysis()
  • Frame Generator

Analysis

  • Temporal Metrics
  • Spatial Metrics
  • Color Metrics
  • FFT Flicker Detection

Output

  • PDF Reports
  • CSV Export
  • JSON Export
  • Heatmap PNGs

Analysis Capabilities

8 Detection Methods

Temporal Stability (MAD/RMS)

core

Tracks sub-perceptual shimmer and pixel-level toggling

PWM Flicker Analysis

core

FFT-based frequency detection up to 480Hz+

Dither Pixel Detection

core

Tracks ±1 pixel variations indicative of temporal dithering

Spatial & Color Uniformity

core

Block-based CIELAB analysis for backlight bleed and tinting

Interactive Setup Wizard

dx

Guided FPS and ROI selection for non-technical users

Tkinter GUI Interface

dx

Native file selection and ROI confirmation preview

Worst-Case Frame Capture

core

Automatically identifies and exports peak artifact frames

Docker noVNC Support

core

Full GUI access inside isolated containers via browser

Tech Stack

Language

Python 3.8+

Core implementation, cross-version support

Computer Vision

OpenCV 4.8+

Video/image I/O, frame extraction

scikit-image 0.21+

CIELAB color space conversion

Scientific Computing

NumPy 1.24+

Vectorized array operations, pixel analysis

SciPy 1.10+

FFT for flicker frequency detection

pandas 2.0+

Metrics tabulation and CSV export

Visualization

Matplotlib 3.7+

PDF reports, graphs, heatmaps

Infrastructure

Docker

Containerized CLI and GUI deployment

GitHub Actions

CI/CD across 5 Python versions

Tradeoffs & Decisions

Why FFT over simple threshold detection for flicker?

Frame differencing alone can’t separate motion/content changes from periodic display artifacts. FFT moves the signal into the frequency domain, which makes PWM-like periodic components show up clearly at specific Hz rates.

Alternatives: Frame differencing · Peak detection · Autocorrelation
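The idea can be sketched in a few lines. The project lists SciPy for its FFT; this self-contained example uses NumPy's equivalent, and the 120 Hz PWM signal and 960 fps capture rate are illustrative assumptions:

```python
import numpy as np

# Synthetic per-frame mean luminance: 1 s of capture at 960 fps with a
# 120 Hz PWM-like flicker riding on a constant backlight level.
fps = 960
t = np.arange(fps) / fps
luminance = 128 + 2.0 * np.sin(2 * np.pi * 120 * t)

# Remove the DC component, then inspect the magnitude spectrum.
signal = luminance - luminance.mean()
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fps)

dominant_hz = freqs[np.argmax(spectrum)]
print(dominant_hz)  # 120.0
```

A threshold on raw frame differences would flag any content change; the spectral peak isolates the periodic component at its actual frequency.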

Why CIELAB over RGB for color uniformity?

RGB distances don’t map to perception. CIELAB is designed to be perceptually uniform, so a delta‑E of ~1 means roughly the same visible difference anywhere in the space.

Alternatives: RGB Euclidean distance · HSV/HSL · XYZ color space
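The uniformity comparison reduces to a Euclidean distance once pixels are in CIELAB. A minimal sketch of the CIE76 delta-E, with hypothetical L*a*b* values for a screen center and a tinted corner (the project's actual conversion goes through scikit-image's rgb2lab):

```python
import numpy as np

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB."""
    a = np.asarray(lab1, dtype=float)
    b = np.asarray(lab2, dtype=float)
    return float(np.linalg.norm(a - b))

# Illustrative patches: screen center vs. a slightly tinted corner.
center = (70.0, 2.0, -1.0)   # L*, a*, b*
corner = (68.5, 3.0, -2.5)

de = delta_e_76(center, corner)
print(round(de, 2))  # 2.35
```

A delta-E around 2.3 is commonly treated as roughly one just-noticeable difference, which is what makes the number directly interpretable in a report.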

Why three UI modes instead of one?

I wanted it to be usable by both technical and non-technical users. CLI enables automation, the interactive wizard reduces setup mistakes (FPS/ROI), and the GUI provides visual ROI confirmation.

Alternatives: CLI only · GUI only · Web interface

Why PDF reports over interactive dashboards?

I optimized for shareability. PDFs are self-contained, printable, work offline, and make it easy to compare displays or share results with someone else.

Alternatives: HTML dashboard · Jupyter notebook · Terminal output only

Challenges

High-speed video files exceed available RAM for pixel-level temporal analysis

I stream frames through the pipeline instead of loading everything at once, and I surface actionable errors (use ROI selection or frame skipping) when memory becomes a constraint.
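The streaming approach can be sketched as a running accumulator that never holds more than two frames at once. The generator below stands in for real video decoding (e.g. OpenCV frame reads), and the function name and metric are illustrative, not the project's API:

```python
import numpy as np

def stream_mean_abs_diff(frame_iter):
    """Accumulate the per-pixel mean absolute frame-to-frame difference
    while keeping only the previous frame and the running total in RAM."""
    prev = None
    total = None
    n = 0
    for frame in frame_iter:
        frame = frame.astype(np.float64)
        if prev is not None:
            d = np.abs(frame - prev)
            total = d if total is None else total + d
            n += 1
        prev = frame
    return total / n if n else None

# Synthetic stand-in for streamed video frames.
def fake_frames(count=100, shape=(4, 4)):
    rng = np.random.default_rng(0)
    for _ in range(count):
        yield rng.integers(0, 256, size=shape)

mean_abs_diff = stream_mean_abs_diff(fake_frames())
print(mean_abs_diff.shape)  # (4, 4)
```

Memory use is bounded by frame size rather than video length, which is what keeps multi-gigabyte high-speed captures tractable.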

ROI preview requires display environment but tool runs headless in Docker/SSH

I detect when a display environment isn’t available and fall back gracefully, and I added an optional noVNC image when GUI access in a container is necessary.

OpenCV uses BGR while scikit-image expects RGB, causing color analysis errors

I made the color conversion pipeline explicit (BGR → RGB + normalization) so CIELAB computations stay correct and reproducible.
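The explicit conversion step amounts to a channel reorder plus normalization. A minimal sketch, assuming 8-bit frames (the helper name is hypothetical):

```python
import numpy as np

def bgr_to_rgb_float(frame_bgr):
    """Reorder OpenCV's BGR channel layout to RGB and normalize to
    [0, 1], the format scikit-image's rgb2lab expects."""
    return frame_bgr[..., ::-1].astype(np.float64) / 255.0

bgr = np.zeros((1, 1, 3), dtype=np.uint8)
bgr[0, 0] = (255, 0, 0)          # pure blue in OpenCV's BGR order
rgb = bgr_to_rgb_float(bgr)
print(rgb[0, 0])                  # blue channel (index 2) is now 1.0
```

Skipping this step silently swaps red and blue, which corrupts every downstream CIELAB metric without raising an error.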

Users don’t know what FPS their slow-motion camera actually captured at

I added an interactive prompt with common camera examples plus an `--override-fps` escape hatch so analysis isn’t silently wrong.
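The escape hatch can be sketched with argparse. The `--override-fps` flag name comes from the text above; the program name, default, and help wording are illustrative assumptions:

```python
import argparse

parser = argparse.ArgumentParser(prog="analyze-display")
parser.add_argument("input", help="Path to the capture to analyze")
parser.add_argument(
    "--override-fps", type=float, default=None,
    help="Actual capture FPS when the video container's metadata is "
         "wrong (slow-motion files often report playback rate instead)",
)

args = parser.parse_args(["capture.mp4", "--override-fps", "240"])
print(args.override_fps)  # 240.0
```

Keeping the override explicit means a wrong container FPS produces a visible choice for the user rather than a silently mis-scaled frequency axis.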

Outcomes

  • I detect temporal dithering patterns down to subtle pixel-level toggling using temporal analysis
  • I identify PWM flicker frequencies by extracting dominant periodic components via FFT
  • I generate multi-page PDF reports with per-frame metrics, worst-case captures, and heatmaps
  • I support inputs from normal videos to high-speed captures (240–480Hz+)
  • I ship three ways to run it: CLI automation, an interactive wizard, and a lightweight GUI for ROI selection
  • I provide containerized Docker usage (including optional noVNC) for headless environments

Investigate your display

If you’re experiencing eye strain, headaches, or fatigue, use this tool to objectively measure your display’s flicker and dithering artifacts.