Image Processing Map (OpenCV → Production)
Goal (shortest path to “usable”): build a loop of stable input × reproducible processing × quantitative evaluation with three pillars:
(1) imaging (lens & lighting included), (2) an OpenCV pipeline, (3) evaluation & operations.

1) Learning Path (Core → Applied)
- Basics: image I/O, BGR/RGB & HSV, histogram equalization, gamma, thresholding (Otsu), morphology, Canny (see the minimal sketch after this list).
- Shapes / Features: contours & moments, Hough (lines/circles), template matching, ORB/SIFT (keypoints).
- Geometry / Calibration: checkerboard for intrinsics/extrinsics, distortion correction, homography / perspective (bird’s-eye).
- Markers: ArUco/ChArUco for coordinate alignment & solvePnP (pose).
- Motion / Tracking: background subtraction (MOG2), Lucas–Kanade / Farnebäck optical flow, Kalman filter.
- DNN: OpenCV DNN with ONNX / YOLO-family inference (CPU → CUDA/NNAPI and other accelerations).
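The basics row maps onto only a handful of OpenCV calls. A minimal sketch, assuming an input file `sample.png` and illustrative kernel/threshold values (not from the article): grayscale → Otsu threshold → morphological opening → Canny edges → contours.

```python
import cv2

img = cv2.imread("sample.png")                      # placeholder path; loads as BGR
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Otsu picks the binarization threshold automatically from the histogram.
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Morphological opening removes small speckle noise before contour extraction.
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
clean = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

# Canny edges on the grayscale image; contours on the cleaned binary mask.
edges = cv2.Canny(gray, 50, 150)
contours, _ = cv2.findContours(clean, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
print(f"{len(contours)} contours found")
```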
 
2) Pipeline Design for Production
- Flow: requirements → sample capture → fix lighting & FOV first → calibration → preprocessing → detection/measurement → thresholding → decision.
- Non-functionals: latency, throughput, logging/observability, re-training playbook (data versioning).
- Evaluation: manage PSNR/SSIM (preprocessing quality) separately from mAP/F1 (detection accuracy); see the sketch after this list.
- Data ops: annotate with CVAT/labelImg; harden models with augmentations (e.g., albumentations).
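One way to keep the two kinds of metrics apart: compute PSNR with OpenCV, SSIM with scikit-image (an assumed extra dependency), and F1 from labeled detection outcomes. File names and the TP/FP/FN counts below are placeholders, not values from the article.

```python
import cv2
from skimage.metrics import structural_similarity

ref = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)        # placeholder paths
proc = cv2.imread("preprocessed.png", cv2.IMREAD_GRAYSCALE)

# Preprocessing quality: pixel-level fidelity against a reference image.
psnr = cv2.PSNR(ref, proc)
ssim = structural_similarity(ref, proc, data_range=255)

# Detection accuracy: computed from labeled outcomes, not from pixels.
tp, fp, fn = 42, 3, 5                                            # placeholder counts
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
print(f"PSNR={psnr:.1f} dB, SSIM={ssim:.3f}, F1={f1:.3f}")
```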
 
3) Hardware (Input) Essentials
- Camera: sensor (sensitivity/dynamic range), global vs rolling shutter, lens (focal length/distortion).
- Lens: working distance, field of view, depth of field (aperture), distortion control.
- Lighting: diffuse / ring / coaxial / low-angle; use cross-polarization (polarizer on light + analyzer on lens) to suppress specular glare.
 
4) Common Task Patterns
- Dimensional measurement: calibrate → edge extraction → sub-pixel localization → mm conversion → control charting.
- Appearance inspection: stable lighting → background subtraction or learning-based → defect region → features → pass/fail.
- Positioning / robot pick: ring/coaxial lighting → ArUco for coordinates → PnP (pose) → robot command (see the sketch after this list).
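A minimal sketch of the positioning pattern, assuming OpenCV ≥ 4.7 (the ArucoDetector API; older builds expose `cv2.aruco.detectMarkers` instead), an already calibrated camera, and a 30 mm marker. The camera matrix, distortion coefficients, and image path are placeholder values.

```python
import cv2
import numpy as np

camera_matrix = np.array([[900.0, 0.0, 640.0],
                          [0.0, 900.0, 360.0],
                          [0.0, 0.0, 1.0]])              # placeholder; from calibration
dist_coeffs = np.zeros(5)                                # placeholder; from calibration

marker_len = 0.030                                       # marker side length in metres
obj_pts = np.array([[-marker_len / 2,  marker_len / 2, 0],
                    [ marker_len / 2,  marker_len / 2, 0],
                    [ marker_len / 2, -marker_len / 2, 0],
                    [-marker_len / 2, -marker_len / 2, 0]], dtype=np.float32)

frame = cv2.imread("scene.png")                          # placeholder path
detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())
corners, ids, _ = detector.detectMarkers(frame)

if ids is not None:
    # Pose of the first detected marker in the camera frame (rvec/tvec).
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners[0].reshape(4, 2),
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if ok:
        print("translation (m):", tvec.ravel())
```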
 
5) Deployment Template
- PoC (bench): USB camera + variable lighting to secure a reproducible image.
- Pilot (near line): rigid mounts, power/heat/dust handling, logging pipeline.
- Production: versioning (model/threshold/calibration), monitoring, drift detection & refresh rules (a minimal drift check is sketched below).
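Drift detection can start very simply. The sketch below is one illustrative approach, not a prescribed one: compare each incoming frame's intensity histogram against a baseline frame stored at commissioning, and flag a review when the Bhattacharyya distance exceeds a threshold (the 0.2 value and file name are assumptions).

```python
import cv2

def gray_hist(img):
    # 64-bin grayscale histogram, normalized so frames of any exposure compare fairly.
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    hist = cv2.calcHist([gray], [0], None, [64], [0, 256])
    return cv2.normalize(hist, hist)

baseline = gray_hist(cv2.imread("golden_frame.png"))     # stored alongside version info

def check_drift(frame, threshold=0.2):
    # Larger Bhattacharyya distance = incoming images look less like the baseline.
    dist = cv2.compareHist(baseline, gray_hist(frame), cv2.HISTCMP_BHATTACHARYYA)
    return dist > threshold, dist   # True -> trigger recalibration/retraining review
```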
 
Lead: Books × USB Camera × Lighting (Start Here)
Books (foundation to implementation)
- Learning OpenCV (Gary Bradski, Adrian Kaehler; O’Reilly): the classic end-to-end introduction from the project’s founders.
 
USB Cameras (UVC for speed; industrial-leaning for headroom)
- Arducam IMX477 12MP USB3.0: high-resolution, high-speed board camera; interchangeable lenses make it versatile.
- e-con Systems See3CAM series: UVC-compliant lineup with options like global shutter, low-light, and 4K for industrial use.
 
Lighting (start with ring + polarization to “make the image”)
- CCS ring lights: widely used in inspection; variants for longer working distance and larger fields.
- Cross-polarization kit (e.g., Edmund Optics): polarizer on the light + analyzer on the lens to cut glare and boost contrast.
 
Appendix: On-Site Checklist
- Lighting fixed (color temperature / intensity / diffusion / polarization)
- Field of view & resolution (derive mm/px from requirements; a worked example follows this checklist)
- Calibration procedure (with temperature / daily re-do plan)
- Image logging and exact reproduction steps (version control)
- Document the reasoning for decisions (rules vs learning)
- SLOs for accuracy & throughput (and approval flow for threshold changes)
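For the mm/px item, a back-of-envelope calculation like the one below (example numbers only, not requirements from the article) is usually enough to sanity-check whether the chosen sensor and field of view resolve the smallest feature of interest.

```python
# Example figures: 100 mm horizontal FOV on a 4056-px-wide sensor, 0.2 mm smallest feature.
fov_width_mm = 100.0        # required horizontal field of view
sensor_width_px = 4056      # horizontal resolution of the chosen sensor
min_feature_mm = 0.2        # smallest defect/feature that must be resolved

mm_per_px = fov_width_mm / sensor_width_px          # ~0.025 mm per pixel
px_on_feature = min_feature_mm / mm_per_px          # ~8 px across the smallest feature
print(f"{mm_per_px:.3f} mm/px, {px_on_feature:.1f} px on the smallest feature")
```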
 