- •Preface
- •Biological Vision Systems
- •Visual Representations from Paintings to Photographs
- •Computer Vision
- •The Limitations of Standard 2D Images
- •3D Imaging, Analysis and Applications
- •Book Objective and Content
- •Acknowledgements
- •Contents
- •Contributors
- •2.1 Introduction
- •Chapter Outline
- •2.2 An Overview of Passive 3D Imaging Systems
- •2.2.1 Multiple View Approaches
- •2.2.2 Single View Approaches
- •2.3 Camera Modeling
- •2.3.1 Homogeneous Coordinates
- •2.3.2 Perspective Projection Camera Model
- •2.3.2.1 Camera Modeling: The Coordinate Transformation
- •2.3.2.2 Camera Modeling: Perspective Projection
- •2.3.2.3 Camera Modeling: Image Sampling
- •2.3.2.4 Camera Modeling: Concatenating the Projective Mappings
- •2.3.3 Radial Distortion
- •2.4 Camera Calibration
- •2.4.1 Estimation of a Scene-to-Image Planar Homography
- •2.4.2 Basic Calibration
- •2.4.3 Refined Calibration
- •2.4.4 Calibration of a Stereo Rig
- •2.5 Two-View Geometry
- •2.5.1 Epipolar Geometry
- •2.5.2 Essential and Fundamental Matrices
- •2.5.3 The Fundamental Matrix for Pure Translation
- •2.5.4 Computation of the Fundamental Matrix
- •2.5.5 Two Views Separated by a Pure Rotation
- •2.5.6 Two Views of a Planar Scene
- •2.6 Rectification
- •2.6.1 Rectification with Calibration Information
- •2.6.2 Rectification Without Calibration Information
- •2.7 Finding Correspondences
- •2.7.1 Correlation-Based Methods
- •2.7.2 Feature-Based Methods
- •2.8 3D Reconstruction
- •2.8.1 Stereo
- •2.8.1.1 Dense Stereo Matching
- •2.8.1.2 Triangulation
- •2.8.2 Structure from Motion
- •2.9 Passive Multiple-View 3D Imaging Systems
- •2.9.1 Stereo Cameras
- •2.9.2 3D Modeling
- •2.9.3 Mobile Robot Localization and Mapping
- •2.10 Passive Versus Active 3D Imaging Systems
- •2.11 Concluding Remarks
- •2.12 Further Reading
- •2.13 Questions
- •2.14 Exercises
- •References
- •3.1 Introduction
- •3.1.1 Historical Context
- •3.1.2 Basic Measurement Principles
- •3.1.3 Active Triangulation-Based Methods
- •3.1.4 Chapter Outline
- •3.2 Spot Scanners
- •3.2.1 Spot Position Detection
- •3.3 Stripe Scanners
- •3.3.1 Camera Model
- •3.3.2 Sheet-of-Light Projector Model
- •3.3.3 Triangulation for Stripe Scanners
- •3.4 Area-Based Structured Light Systems
- •3.4.1 Gray Code Methods
- •3.4.1.1 Decoding of Binary Fringe-Based Codes
- •3.4.1.2 Advantage of the Gray Code
- •3.4.2 Phase Shift Methods
- •3.4.2.1 Removing the Phase Ambiguity
- •3.4.3 Triangulation for a Structured Light System
- •3.5 System Calibration
- •3.6 Measurement Uncertainty
- •3.6.1 Uncertainty Related to the Phase Shift Algorithm
- •3.6.2 Uncertainty Related to Intrinsic Parameters
- •3.6.3 Uncertainty Related to Extrinsic Parameters
- •3.6.4 Uncertainty as a Design Tool
- •3.7 Experimental Characterization of 3D Imaging Systems
- •3.7.1 Low-Level Characterization
- •3.7.2 System-Level Characterization
- •3.7.3 Characterization of Errors Caused by Surface Properties
- •3.7.4 Application-Based Characterization
- •3.8 Selected Advanced Topics
- •3.8.1 Thin Lens Equation
- •3.8.2 Depth of Field
- •3.8.3 Scheimpflug Condition
- •3.8.4 Speckle and Uncertainty
- •3.8.5 Laser Depth of Field
- •3.8.6 Lateral Resolution
- •3.9 Research Challenges
- •3.10 Concluding Remarks
- •3.11 Further Reading
- •3.12 Questions
- •3.13 Exercises
- •References
- •4.1 Introduction
- •Chapter Outline
- •4.2 Representation of 3D Data
- •4.2.1 Raw Data
- •4.2.1.1 Point Cloud
- •4.2.1.2 Structured Point Cloud
- •4.2.1.3 Depth Maps and Range Images
- •4.2.1.4 Needle map
- •4.2.1.5 Polygon Soup
- •4.2.2 Surface Representations
- •4.2.2.1 Triangular Mesh
- •4.2.2.2 Quadrilateral Mesh
- •4.2.2.3 Subdivision Surfaces
- •4.2.2.4 Morphable Model
- •4.2.2.5 Implicit Surface
- •4.2.2.6 Parametric Surface
- •4.2.2.7 Comparison of Surface Representations
- •4.2.3 Solid-Based Representations
- •4.2.3.1 Voxels
- •4.2.3.3 Binary Space Partitioning
- •4.2.3.4 Constructive Solid Geometry
- •4.2.3.5 Boundary Representations
- •4.2.4 Summary of Solid-Based Representations
- •4.3 Polygon Meshes
- •4.3.1 Mesh Storage
- •4.3.2 Mesh Data Structures
- •4.3.2.1 Halfedge Structure
- •4.4 Subdivision Surfaces
- •4.4.1 Doo-Sabin Scheme
- •4.4.2 Catmull-Clark Scheme
- •4.4.3 Loop Scheme
- •4.5 Local Differential Properties
- •4.5.1 Surface Normals
- •4.5.2 Differential Coordinates and the Mesh Laplacian
- •4.6 Compression and Levels of Detail
- •4.6.1 Mesh Simplification
- •4.6.1.1 Edge Collapse
- •4.6.1.2 Quadric Error Metric
- •4.6.2 QEM Simplification Summary
- •4.6.3 Surface Simplification Results
- •4.7 Visualization
- •4.8 Research Challenges
- •4.9 Concluding Remarks
- •4.10 Further Reading
- •4.11 Questions
- •4.12 Exercises
- •References
- •1.1 Introduction
- •Chapter Outline
- •1.2 A Historical Perspective on 3D Imaging
- •1.2.1 Image Formation and Image Capture
- •1.2.2 Binocular Perception of Depth
- •1.2.3 Stereoscopic Displays
- •1.3 The Development of Computer Vision
- •1.3.1 Further Reading in Computer Vision
- •1.4 Acquisition Techniques for 3D Imaging
- •1.4.1 Passive 3D Imaging
- •1.4.2 Active 3D Imaging
- •1.4.3 Passive Stereo Versus Active Stereo Imaging
- •1.5 Twelve Milestones in 3D Imaging and Shape Analysis
- •1.5.1 Active 3D Imaging: An Early Optical Triangulation System
- •1.5.2 Passive 3D Imaging: An Early Stereo System
- •1.5.3 Passive 3D Imaging: The Essential Matrix
- •1.5.4 Model Fitting: The RANSAC Approach to Feature Correspondence Analysis
- •1.5.5 Active 3D Imaging: Advances in Scanning Geometries
- •1.5.6 3D Registration: Rigid Transformation Estimation from 3D Correspondences
- •1.5.7 3D Registration: Iterative Closest Points
- •1.5.9 3D Local Shape Descriptors: Spin Images
- •1.5.10 Passive 3D Imaging: Flexible Camera Calibration
- •1.5.11 3D Shape Matching: Heat Kernel Signatures
- •1.6 Applications of 3D Imaging
- •1.7 Book Outline
- •1.7.1 Part I: 3D Imaging and Shape Representation
- •1.7.2 Part II: 3D Shape Analysis and Processing
- •1.7.3 Part III: 3D Imaging Applications
- •References
- •5.1 Introduction
- •5.1.1 Applications
- •5.1.2 Chapter Outline
- •5.2 Mathematical Background
- •5.2.1 Differential Geometry
- •5.2.2 Curvature of Two-Dimensional Surfaces
- •5.2.3 Discrete Differential Geometry
- •5.2.4 Diffusion Geometry
- •5.2.5 Discrete Diffusion Geometry
- •5.3 Feature Detectors
- •5.3.1 A Taxonomy
- •5.3.2 Harris 3D
- •5.3.3 Mesh DOG
- •5.3.4 Salient Features
- •5.3.5 Heat Kernel Features
- •5.3.6 Topological Features
- •5.3.7 Maximally Stable Components
- •5.3.8 Benchmarks
- •5.4 Feature Descriptors
- •5.4.1 A Taxonomy
- •5.4.2 Curvature-Based Descriptors (HK and SC)
- •5.4.3 Spin Images
- •5.4.4 Shape Context
- •5.4.5 Integral Volume Descriptor
- •5.4.6 Mesh Histogram of Gradients (HOG)
- •5.4.7 Heat Kernel Signature (HKS)
- •5.4.8 Scale-Invariant Heat Kernel Signature (SI-HKS)
- •5.4.9 Color Heat Kernel Signature (CHKS)
- •5.4.10 Volumetric Heat Kernel Signature (VHKS)
- •5.5 Research Challenges
- •5.6 Conclusions
- •5.7 Further Reading
- •5.8 Questions
- •5.9 Exercises
- •References
- •6.1 Introduction
- •Chapter Outline
- •6.2 Registration of Two Views
- •6.2.1 Problem Statement
- •6.2.2 The Iterative Closest Points (ICP) Algorithm
- •6.2.3 ICP Extensions
- •6.2.3.1 Techniques for Pre-alignment
- •Global Approaches
- •Local Approaches
- •6.2.3.2 Techniques for Improving Speed
- •Subsampling
- •Closest Point Computation
- •Distance Formulation
- •6.2.3.3 Techniques for Improving Accuracy
- •Outlier Rejection
- •Additional Information
- •Probabilistic Methods
- •6.3 Advanced Techniques
- •6.3.1 Registration of More than Two Views
- •Reducing Error Accumulation
- •Automating Registration
- •6.3.2 Registration in Cluttered Scenes
- •Point Signatures
- •Matching Methods
- •6.3.3 Deformable Registration
- •Methods Based on General Optimization Techniques
- •Probabilistic Methods
- •6.3.4 Machine Learning Techniques
- •Improving the Matching
- •Object Detection
- •6.4 Quantitative Performance Evaluation
- •6.5 Case Study 1: Pairwise Alignment with Outlier Rejection
- •6.6 Case Study 2: ICP with Levenberg-Marquardt
- •6.6.1 The LM-ICP Method
- •6.6.2 Computing the Derivatives
- •6.6.3 The Case of Quaternions
- •6.6.4 Summary of the LM-ICP Algorithm
- •6.6.5 Results and Discussion
- •6.7 Case Study 3: Deformable ICP with Levenberg-Marquardt
- •6.7.1 Surface Representation
- •6.7.2 Cost Function
- •Data Term: Global Surface Attraction
- •Data Term: Boundary Attraction
- •Penalty Term: Spatial Smoothness
- •Penalty Term: Temporal Smoothness
- •6.7.3 Minimization Procedure
- •6.7.4 Summary of the Algorithm
- •6.7.5 Experiments
- •6.8 Research Challenges
- •6.9 Concluding Remarks
- •6.10 Further Reading
- •6.11 Questions
- •6.12 Exercises
- •References
- •7.1 Introduction
- •7.1.1 Retrieval and Recognition Evaluation
- •7.1.2 Chapter Outline
- •7.2 Literature Review
- •7.3 3D Shape Retrieval Techniques
- •7.3.1 Depth-Buffer Descriptor
- •7.3.1.1 Computing the 2D Projections
- •7.3.1.2 Obtaining the Feature Vector
- •7.3.1.3 Evaluation
- •7.3.1.4 Complexity Analysis
- •7.3.2 Spin Images for Object Recognition
- •7.3.2.1 Matching
- •7.3.2.2 Evaluation
- •7.3.2.3 Complexity Analysis
- •7.3.3 Salient Spectral Geometric Features
- •7.3.3.1 Feature Points Detection
- •7.3.3.2 Local Descriptors
- •7.3.3.3 Shape Matching
- •7.3.3.4 Evaluation
- •7.3.3.5 Complexity Analysis
- •7.3.4 Heat Kernel Signatures
- •7.3.4.1 Evaluation
- •7.3.4.2 Complexity Analysis
- •7.4 Research Challenges
- •7.5 Concluding Remarks
- •7.6 Further Reading
- •7.7 Questions
- •7.8 Exercises
- •References
- •8.1 Introduction
- •Chapter Outline
- •8.2 3D Face Scan Representation and Visualization
- •8.3 3D Face Datasets
- •8.3.1 FRGC v2 3D Face Dataset
- •8.3.2 The Bosphorus Dataset
- •8.4 3D Face Recognition Evaluation
- •8.4.1 Face Verification
- •8.4.2 Face Identification
- •8.5 Processing Stages in 3D Face Recognition
- •8.5.1 Face Detection and Segmentation
- •8.5.2 Removal of Spikes
- •8.5.3 Filling of Holes and Missing Data
- •8.5.4 Removal of Noise
- •8.5.5 Fiducial Point Localization and Pose Correction
- •8.5.6 Spatial Resampling
- •8.5.7 Feature Extraction on Facial Surfaces
- •8.5.8 Classifiers for 3D Face Matching
- •8.6 ICP-Based 3D Face Recognition
- •8.6.1 ICP Outline
- •8.6.2 A Critical Discussion of ICP
- •8.6.3 A Typical ICP-Based 3D Face Recognition Implementation
- •8.6.4 ICP Variants and Other Surface Registration Approaches
- •8.7 PCA-Based 3D Face Recognition
- •8.7.1 PCA System Training
- •8.7.2 PCA Training Using Singular Value Decomposition
- •8.7.3 PCA Testing
- •8.7.4 PCA Performance
- •8.8 LDA-Based 3D Face Recognition
- •8.8.1 Two-Class LDA
- •8.8.2 LDA with More than Two Classes
- •8.8.3 LDA in High Dimensional 3D Face Spaces
- •8.8.4 LDA Performance
- •8.9 Normals and Curvature in 3D Face Recognition
- •8.9.1 Computing Curvature on a 3D Face Scan
- •8.10 Recent Techniques in 3D Face Recognition
- •8.10.1 3D Face Recognition Using Annotated Face Models (AFM)
- •8.10.2 Local Feature-Based 3D Face Recognition
- •8.10.2.1 Keypoint Detection and Local Feature Matching
- •8.10.2.2 Other Local Feature-Based Methods
- •8.10.3 Expression Modeling for Invariant 3D Face Recognition
- •8.10.3.1 Other Expression Modeling Approaches
- •8.11 Research Challenges
- •8.12 Concluding Remarks
- •8.13 Further Reading
- •8.14 Questions
- •8.15 Exercises
- •References
- •9.1 Introduction
- •Chapter Outline
- •9.2 DEM Generation from Stereoscopic Imagery
- •9.2.1 Stereoscopic DEM Generation: Literature Review
- •9.2.2 Accuracy Evaluation of DEMs
- •9.2.3 An Example of DEM Generation from SPOT-5 Imagery
- •9.3 DEM Generation from InSAR
- •9.3.1 Techniques for DEM Generation from InSAR
- •9.3.1.1 Basic Principle of InSAR in Elevation Measurement
- •9.3.1.2 Processing Stages of DEM Generation from InSAR
- •The Branch-Cut Method of Phase Unwrapping
- •The Least Squares (LS) Method of Phase Unwrapping
- •9.3.2 Accuracy Analysis of DEMs Generated from InSAR
- •9.3.3 Examples of DEM Generation from InSAR
- •9.4 DEM Generation from LIDAR
- •9.4.1 LIDAR Data Acquisition
- •9.4.2 Accuracy, Error Types and Countermeasures
- •9.4.3 LIDAR Interpolation
- •9.4.4 LIDAR Filtering
- •9.4.5 DTM from Statistical Properties of the Point Cloud
- •9.5 Research Challenges
- •9.6 Concluding Remarks
- •9.7 Further Reading
- •9.8 Questions
- •9.9 Exercises
- •References
- •10.1 Introduction
- •10.1.1 Allometric Modeling of Biomass
- •10.1.2 Chapter Outline
- •10.2 Aerial Photo Mensuration
- •10.2.1 Principles of Aerial Photogrammetry
- •10.2.1.1 Geometric Basis of Photogrammetric Measurement
- •10.2.1.2 Ground Control and Direct Georeferencing
- •10.2.2 Tree Height Measurement Using Forest Photogrammetry
- •10.2.2.2 Automated Methods in Forest Photogrammetry
- •10.3 Airborne Laser Scanning
- •10.3.1 Principles of Airborne Laser Scanning
- •10.3.1.1 Lidar-Based Measurement of Terrain and Canopy Surfaces
- •10.3.2 Individual Tree-Level Measurement Using Lidar
- •10.3.2.1 Automated Individual Tree Measurement Using Lidar
- •10.3.3 Area-Based Approach to Estimating Biomass with Lidar
- •10.4 Future Developments
- •10.5 Concluding Remarks
- •10.6 Further Reading
- •10.7 Questions
- •References
- •11.1 Introduction
- •Chapter Outline
- •11.2 Volumetric Data Acquisition
- •11.2.1 Computed Tomography
- •11.2.1.1 Characteristics of 3D CT Data
- •11.2.2 Positron Emission Tomography (PET)
- •11.2.2.1 Characteristics of 3D PET Data
- •Relaxation
- •11.2.3.1 Characteristics of the 3D MRI Data
- •Image Quality and Artifacts
- •11.2.4 Summary
- •11.3 Surface Extraction and Volumetric Visualization
- •11.3.1 Surface Extraction
- •Example: Curvatures and Geometric Tools
- •11.3.2 Volume Rendering
- •11.3.3 Summary
- •11.4 Volumetric Image Registration
- •11.4.1 A Hierarchy of Transformations
- •11.4.1.1 Rigid Body Transformation
- •11.4.1.2 Similarity Transformations and Anisotropic Scaling
- •11.4.1.3 Affine Transformations
- •11.4.1.4 Perspective Transformations
- •11.4.1.5 Non-rigid Transformations
- •11.4.2 Points and Features Used for the Registration
- •11.4.2.1 Landmark Features
- •11.4.2.2 Surface-Based Registration
- •11.4.2.3 Intensity-Based Registration
- •11.4.3 Registration Optimization
- •11.4.3.1 Estimation of Registration Errors
- •11.4.4 Summary
- •11.5 Segmentation
- •11.5.1 Semi-automatic Methods
- •11.5.1.1 Thresholding
- •11.5.1.2 Region Growing
- •11.5.1.3 Deformable Models
- •Snakes
- •Balloons
- •11.5.2 Fully Automatic Methods
- •11.5.2.1 Atlas-Based Segmentation
- •11.5.2.2 Statistical Shape Modeling and Analysis
- •11.5.3 Summary
- •11.6 Diffusion Imaging: An Illustration of a Full Pipeline
- •11.6.1 From Scalar Images to Tensors
- •11.6.2 From Tensor Image to Information
- •11.6.3 Summary
- •11.7 Applications
- •11.7.1 Diagnosis and Morphometry
- •11.7.2 Simulation and Training
- •11.7.3 Surgical Planning and Guidance
- •11.7.4 Summary
- •11.8 Concluding Remarks
- •11.9 Research Challenges
- •11.10 Further Reading
- •Data Acquisition
- •Surface Extraction
- •Volume Registration
- •Segmentation
- •Diffusion Imaging
- •Software
- •11.11 Questions
- •11.12 Exercises
- •References
- •Index
M.-A. Drouin and J.-A. Beraldin

Table 3.2 Approximate depth of field as a function of a few beam radii. The laser wavelength is 0.633 μm. Table courtesy of NRC Canada

| Beam radius (w0) | Approximate depth of field (Df) |
|------------------|---------------------------------|
| 10 μm            | 1 mm                            |
| 100 μm           | 100 mm                          |
| 1 mm             | 10 m                            |
volume of the scanner. Moreover, the usable measurement volume of a sheet-of-light scanner could be computed similarly.
3.8.6 Lateral Resolution
Intuitively, the lateral resolution is the capability of a scanner to discriminate two adjacent structures on the surface of a sample; a formal definition can be found in [3]. For some applications, such as the one presented in Sect. 3.7.4, it is critical to use a 3D scanner with sufficient lateral resolution. The lateral resolution is limited by two factors: the structural resolution and the spatial resolution [3].
For a phase-shift system working out of focus, the lateral resolution is not limited by the camera resolution (spatial resolution) but by the optical resolution (structural resolution) of the camera lens. Thus, to increase the lateral resolution, one may have to reduce the depth of field of the scanner or the size of the lens aperture. When a digital projector is used, artifacts induced by inter-pixel gaps and discretization may also limit the lateral resolution of the system; these artifacts can be alleviated by the hybrid hardware-software solution presented in [33].
For a laser spot scanner, knowledge of the beam radius at the scene determines the structural component of the lateral resolution, while the spatial resolution is the smallest possible variation of the scan angle α. Increasing the angular resolution of α improves the lateral resolution only as long as the spatial resolution does not exceed the structural one; beyond that point, reducing the beam radius may be the only way to increase the lateral resolution. However, when the beam radius is reduced, the depth of field is also reduced unless an auto-focusing method is used while measuring. There is therefore a trade-off between lateral resolution and depth of field. Table 3.2 gives some numerical examples of beam radii and the corresponding depths of field.
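The trend in Table 3.2 can be reproduced numerically. As a sketch, assume the Gaussian-beam relation Df ≈ 2π·w0²/λ, which matches the table entries for λ = 0.633 μm to within a few percent; this approximation is our assumption here, and the exact expression used elsewhere in the chapter may differ by a constant factor.

```python
import math

def laser_depth_of_field(w0_m, wavelength_m=0.633e-6):
    """Approximate depth of field of a focused Gaussian laser beam.

    Uses Df ~ 2*pi*w0^2/lambda (an assumed Gaussian-beam relation,
    not necessarily the authors' exact formula); it reproduces the
    entries of Table 3.2 for a 0.633 um laser.
    """
    return 2.0 * math.pi * w0_m ** 2 / wavelength_m

# Reproduce the three rows of Table 3.2.
for w0 in (10e-6, 100e-6, 1e-3):
    print(f"w0 = {w0 * 1e6:7.1f} um -> Df ~ {laser_depth_of_field(w0):.3g} m")
```

Note the quadratic dependence on the beam radius: halving w0 to improve the lateral resolution divides the depth of field by four, which is the trade-off discussed above.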
3.9 Research Challenges
In Sect. 3.6 we presented the error propagation from the image formation to the 3D points for some area scanners. To the best of our knowledge, no commercial scanner associates with each 3D point a covariance matrix that can be used to perform a first-order error propagation. An important research issue is the understanding and
modeling of error propagation from the calibration step to the visualization step of the modeling pipeline. This is challenging because the modeling pipeline can contain a significant amount of geometric processing such as the fusion of multiple scans, the transformation of point clouds into meshes, the decimation of triangles, and the fitting of geometric primitives.
As the individual components of 3D imaging systems continue to improve, the spatial resolution of 3D imaging systems is expected to increase up to the limits imposed by physics. For example, the resolution of cameras has increased significantly in recent years, with a correspondingly significant impact on the performance of fringe projection systems; nevertheless, physical limitations ultimately cap the improvement of any 3D scanner. A laser point scanner, for instance, can be designed to reduce the effect of speckle, but speckle cannot be removed entirely, since it is a physical limit of any system that uses coherent light. Another physical limitation is the diffraction introduced by the finite size of a lens aperture. Thus, one of the main challenges in the development of 3D imaging systems is to combine improvements in commercially available components with innovative designs and algorithms in order to bring system performance as close as possible to these physical limits. Another interesting area of research is the design of systems for niche applications that must operate in harsh environments or scan very challenging objects, such as translucent objects, objects with grooves or other surface concavities, and underwater objects.
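To make the first-order error propagation mentioned above concrete, the sketch below (an illustration of the standard linearized rule, not any commercial scanner's pipeline) propagates a per-point covariance through a rigid transformation of the modeling pipeline: if y = R·x + t, then Σy = R·Σx·Rᵀ. The covariance values and rotation are arbitrary choices for illustration.

```python
import numpy as np

def propagate_covariance(R, cov_x):
    """First-order propagation of a 3D point covariance through a
    rigid transform y = R @ x + t.  The translation t does not
    affect the covariance, only the rotation does."""
    return R @ cov_x @ R.T

# Example: a scanner point with anisotropic uncertainty, largest
# along the depth axis z (variances in mm^2, illustrative values).
cov = np.diag([1e-4, 1e-4, 1e-2])
theta = np.pi / 2  # rotate 90 degrees about the x axis
R = np.array([[1, 0, 0],
              [0, np.cos(theta), -np.sin(theta)],
              [0, np.sin(theta),  np.cos(theta)]])
cov_y = propagate_covariance(R, cov)
# After the rotation, the large depth variance lies along y.
print(np.diag(cov_y))
```

Carrying such a matrix along with each 3D point would let downstream steps (scan fusion, meshing, primitive fitting) weight the data by its actual uncertainty.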
3.10 Concluding Remarks
Many of the traditional measurement instruments, such as theodolites and CMMs, are being replaced by non-contact optical scanners based on triangulation, time-of-flight or interferometry technology. This rapid change in process design and quality-assurance practices needs to be addressed by research organizations and companies. When the goal of a business is to make a quality product at a profit, metrology has a direct impact on that business: the quality of the measurements planned in the design stage, applied during manufacturing and performed during inspection directly affects the quality of the product. Poor measurements (those without an accuracy statement) may even create waste in the form of scrapped products, whereas precise measurements (those with an accuracy statement) lead to superior products. The dimensional deviations between as-designed, as-built and as-measured devices can only be understood and controlled if traceable measurements can be made in compliance with clear standards. While 3D imaging systems are now widely available, standards, best practices and comparative data remain limited. In the near future, we expect to see more comparative data in scientific publications, and industrial standards aimed at active 3D imaging systems.
3.11 Further Reading
One of the earliest papers on triangulation-based spot scanning for the capture and recording of 3D data was published by Forsen in 1968 [36]. Kanade presents a collection of chapters from different authors that describe a number of close-range active 3D imaging systems [44]. Many survey papers that review range sensors have been published [7, 16]. The geometric descriptions of the spot-based and stripe-based systems in this chapter assumed a simple scanner geometry. In practice, mirrors can be used to fold the baseline, so that the effective baseline is larger than the physical scanner, and to dynamically modify the field of view of the camera so that the sensor sees only a small area around the laser spot [54]. The calibration of spot-based triangulation scanners is discussed in [12, 13, 25].
An article by Salvi et al. [57] presents an in-depth classification of the different types of structured light patterns. Davis et al. [29] present a unifying framework within which 3D triangulation sensors can be categorized, for example on the basis of their coding within the spatial and temporal domains. An analysis of the uncertainty of a white-light fringe projection system based on Gray codes is presented in [63], and many analyses of the impact of random noise on phase shift methods have been conducted [31, 40, 53, 62].
The two authoritative texts on the matter of uncertainty and vocabulary related to metrology are the Guide to the Expression of Uncertainty in Measurement (GUM) and the International Vocabulary of Metrology (VIM) [1, 5]. The document designated E 2544 from the American Society for Testing and Materials (ASTM) provides the definition and description of terms for 3D imaging systems [6]. Moreover, the VDI 2634 is a document from a standardization body that addresses the characterization of optical distance sensors [2]. Error propagation in the context of multiple-view geometry is discussed in [28]. The characterization of active 3D imaging systems is discussed in [19, 24, 26, 27, 35, 38, 45–47].
3.12 Questions
1. Name and explain three categories of triangulation scanner, based on different methods of scene illumination.
2. In recent years, the resolution of cameras has significantly increased. What are the impacts of this on each type of triangulation scanner?
3. What are the impacts on the 3D data of varying the baseline of a laser stripe scanner without recalibrating the system?
4. What are the impacts on the 3D data of varying the distance d (the distance between the camera center and the image plane) of a laser stripe scanner without recalibrating the system?
5. What are the values of Rp, Tp, x1, y1 and x2 for which the three constraints in Eq. (3.30) are not linearly independent?
6. What are the elements that can limit the lateral resolution of a stripe scanner? Classify those elements as belonging to the spatial or the structural resolution.
3 Active 3D Imaging Systems
3.13 Exercises
1. Using a programming environment of your choice, develop a 2D simulator of a phase shift fringe projection scanner that can reproduce the intensity artifact shown in Fig. 3.15. Assume that optically induced blurring is present only in the camera images and that the camera has an infinite spatial resolution. Repeat the experiment for a fringe projection system that uses a Gray code.
2. Modify the previously developed prototype to apply it to a stripe scanner. Plot a graph showing the variation of the error as a function of the width of the stripe.
3. For a stripe-based scanner, an occlusion occurs when the linear detector does not see the laser spot. However, since the spot size is not infinitesimal, there are intermediate situations where only a fraction of the spot is seen by the detector. Using a prototyping environment, develop a model to evaluate the impact of this on the recovered geometry.
4. Perform the error propagation computation for a stripe scanner, including the uncertainty on the angle α.
5. Using Fig. 3.17, trigonometry and the thin lens equation, give the derivation of Eq. (3.46).
6. Modify the camera model presented in Sect. 3.3.1 to incorporate the Scheimpflug condition.
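As a starting point for Exercise 1, the core decoding step of a phase-shift simulator can be sketched as follows. This is an illustrative implementation of the standard N-step formula of Sect. 3.4.2 — given intensities I_k = A + B·cos(φ + 2πk/N), the wrapped phase is φ = atan2(−Σ I_k sin(2πk/N), Σ I_k cos(2πk/N)) — and not code from the chapter; the offset (100), amplitude (50) and scene size are arbitrary, and the blur and discretization artifacts the exercise asks about are deliberately omitted.

```python
import numpy as np

def decode_phase(images):
    """Recover the wrapped phase from N equally shifted fringe images.

    images: array of shape (N, ...) with I_k = A + B*cos(phi + 2*pi*k/N).
    Returns phi wrapped to (-pi, pi].
    """
    n = len(images)
    deltas = 2.0 * np.pi * np.arange(n) / n
    # Weighted sums over the N shifts (contracts the first axis).
    s = np.tensordot(np.sin(deltas), images, axes=1)
    c = np.tensordot(np.cos(deltas), images, axes=1)
    return np.arctan2(-s, c)

# Simulate a 1D scene with a linearly increasing phase, 4-step shift.
phi_true = np.linspace(-np.pi + 0.1, np.pi - 0.1, 50)
images = np.stack([100 + 50 * np.cos(phi_true + 2 * np.pi * k / 4)
                   for k in range(4)])
phi_est = decode_phase(images)
print(np.max(np.abs(phi_est - phi_true)))  # ~0 (machine precision)
```

From here, the exercise amounts to convolving each simulated camera image with a blur kernel before decoding and observing the resulting intensity artifact.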
References
1. ISO Guide 98-3: Uncertainty of Measurement, Part 3: Guide to the Expression of Uncertainty in Measurement (GUM 1995) (1995)
2. VDI 2634, Part 2: Optical 3-d Measuring Systems: Optical System Based on Area Scanning (2002)
3. VDI 2617, Part 6.2: Accuracy of Coordinate Measuring Machines: Characteristics and Their Testing. Guideline for the Application of DIN EN ISO 10360 to Coordinate Measuring Machines with Optical Distance Sensors. Beuth Verlag GmbH (2005)
4. ANSI Z136, Parts 1–6: American National Standard for Safe Use of Lasers (2007)
5. JCGM 200:2008: International Vocabulary of Metrology: Basic and General Concepts and Associated Terms (VIM) (2008)
6. ASTM E2544-10: Standard Terminology for Three-Dimensional (3D) Imaging Systems (2010)
7. Amann, M.C., Bosch, T., Lescure, M., Myllylä, R., Rioux, M.: Laser ranging: a critical review of usual techniques for distance measurement. Opt. Eng. 40(1), 10–19 (2001)
8. Baribeau, R., Rioux, M.: Influence of speckle on laser range finders. Appl. Opt. 30(20), 2873–2878 (1991)
9. Benoit, P., Mathieu, E., Hormire, J., Thomas, A.: Characterization and control of three-dimensional objects using fringe projection techniques. Nouv. Rev. Opt. 6(2), 67–86 (1975)
10. Beraldin, J.A., Blais, F., Lohr, U.: Laser scanning technology. In: Vosselman, G., Maas, H.-G. (eds.) Airborne and Terrestrial Laser Scanning. Whittles Publishing, Dunbeath (2010)
11. Beraldin, J.A., Blais, F., Rioux, M., Domey, J., Gonzo, L., Nisi, F.D., Comper, F., Stoppa, D., Gottardi, M., Simoni, A.: Optimized position sensors for flying-spot active triangulation systems. In: Proc. Int. Conf. 3D Digital Imaging and Modeling, pp. 29–36 (2003)
12. Beraldin, J.A., El-Hakim, S.F., Cournoyer, L.: Practical range camera calibration. In: Videometrics, SPIE Proceedings, vol. II, pp. 21–31 (1993)
13. Beraldin, J.A., Rioux, M., Blais, F., Godin, G., Baribeau, R.: Model-Based Calibration of a Range Camera (1992)
14. Besl, P.J.: Active, optical range imaging sensors. Mach. Vis. Appl. 1(2), 127–152 (1988)
15. Blahut, R.E.: Theory of Remote Image Formation. Cambridge University Press, Cambridge (2004)
16. Blais, F.: Review of 20 years of range sensor development. J. Electron. Imaging 13(1), 231–243 (2004)
17. Blais, F., Beraldin, J.A.: Recent developments in 3d multi-modal laser imaging applied to cultural heritage. Mach. Vis. Appl. 17(3), 395–409 (2006)
18. Blais, F., Rioux, M.: Real-time numerical peak detector. Signal Process. 11(2), 145–155 (1986)
19. Boehler, W., Marbs, A.: Investigating scanner accuracy. Tech. rep., German University FH, Mainz (2003)
20. Born, M., Wolf, E.: Principles of Optics: Electromagnetic Theory of Propagation, Interference and Diffraction of Light, 7th edn. Cambridge University Press, Cambridge (1999)
21. Bosch, T., Lescure, M. (eds.): Selected Papers on Laser Distance Measurements, vol. 115. SPIE Milestone Series (1995). B.J. Thompson (general editor)
22. Breuckmann GmbH: Projector for an arrangement for three-dimensional optical measurement of objects. United States Patent Office 7532332 (2009)
23. Brown, D.: Decentering distortion of lenses. Photom. Eng. 32(3), 444–462 (1966)
24. Brownhill, A., Brade, R., Robson, S.: Performance study of non-contact surface measurement technology for use in an experimental fusion device. In: 21st Annual IS&T/SPIE Symposium on Electronic Imaging (2009)
25. Bumbaca, F., Blais, F.: Real-time correction of three-dimensional non-linearities for a laser range finder. Opt. Eng. (1986)
26. Carrier, B., Mackinnon, D., Cournoyer, L., Beraldin, J.A.: Proposed NRC portable target case for short-range triangulation-based 3-d imaging systems characterization. In: 23rd Annual IS&T/SPIE Symposium on Electronic Imaging (2011)
27. Cox, M.G., Siebert, B.R.L.: The use of a Monte Carlo method for evaluating uncertainty and expanded uncertainty. Metrologia 43(4), S178 (2006)
28. Criminisi, A.: Accurate Visual Metrology from Single and Multiple Uncalibrated Images. Springer, New York (2001)
29. Davis, J., Nehab, D., Ramamoorthi, R., Rusinkiewicz, S.: Spacetime stereo: a unifying framework for depth from triangulation. IEEE Trans. Pattern Anal. Mach. Intell. 27(2), 296–302 (2005)
30. Dorsch, R.G., Häusler, G., Herrmann, J.M.: Laser triangulation: fundamental uncertainty in distance measurement. Appl. Opt. 33(7), 1306–1314 (1994)
31. Dorsch, R.G., Häusler, G., Herrmann, J.M.: Fourier-transform method of phase-shift determination. Appl. Opt. 40(17), 2886–2894 (2001)
32. Drouin, M.A.: Mise en correspondance active et passive pour la vision par ordinateur multivue [Active and passive correspondence for multi-view computer vision]. Université de Montréal (2007)
33. Drouin, M.A., Blais, F.: Method and System for Alleviating the Discretization and Inter-Pixel Gaps Effect of a Digital Fringe Projection System. United States Patent Office 13/108,378 (Application) (2011)
34. Fisher, R.B., Naidu, D.K.: A comparison of algorithms for subpixel peak detection. In: Image Technology, Advances in Image Processing, Multimedia and Machine Vision, pp. 385–404. Springer, Berlin (1996)
35. Forbes, A.B., Hughes, B., Sun, W.: Comparison of measurements in co-ordinate metrology. Measurement 42(10), 1473–1477 (2009)
36. Forsen, G.: Processing visual data with an automaton eye. In: Pictorial Pattern Recognition, pp. 471–502 (1968)
37. Ghiglia, D.C., Pritt, M.D.: Two-Dimensional Phase Unwrapping: Theory, Algorithms, and Software. Wiley, New York (1998)
38. Goesele, M., Fuchs, C., Seidel, H.P.: Accuracy of 3d range scanners by measurement of the slanted edge modulation transfer function. In: International Conference on 3D Digital Imaging and Modeling, vol. 37 (2003)
39. Gray, F.: Pulse code communication. United States Patent Office 2632058 (1953)
40. Hibino, K.: Susceptibility of systematic error-compensating algorithms to random noise in phase-shifting interferometry. Appl. Opt. 36(10), 2084–2093 (1997)
41. Inokuchi, S., Sato, K., Matsuda, F.: Range imaging system for 3-D object recognition. In: Proc. Int. Conf. Pattern Recognition, pp. 806–808 (1984)
42. Inspect, Inc.: Optional 3d digitizer, system and method for digitizing an object. United States Patent Office 6493095 (2002)
43. Jähne, B., Haussecker, H.W., Geissler, P.: Handbook of Computer Vision and Applications, Vol. 1: Sensors and Imaging. Academic Press, San Diego (1999)
44. Kanade, T. (ed.): Three-Dimensional Machine Vision. Kluwer Academic, Norwell (1987)
45. Leach, R. (ed.): Optical Measurement of Surface Topography. Springer, Berlin (2011)
46. Luhmann, T., Bethmann, F., Herd, B., Ohm, J.: Comparison and verification of optical 3-d surface measurement systems. In: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XXXVII, Part B5, Beijing (2008)
47. MacKinnon, D., Aitken, V., Blais, F.: Review of measurement quality metrics for range imaging. J. Electron. Imaging 17 (2008)
48. Naidu, K., Fisher, R.B.: A comparative analysis of algorithms for determining the peak position of a stripe to sub-pixel accuracy. In: Proc. British Machine Vision Conf. (1991)
49. Newhall, B.: Photosculpture. Image: Journal of Photography and Motion Pictures of the George Eastman House 7(5), 100–105 (1958)
50. Nitzan, D.: Three-dimensional vision structure for robot applications. IEEE Trans. Pattern Anal. Mach. Intell. 10(3), 291–309 (1988)
51. Ohyama, N., Kinoshita, S., Cornejo-Rodriguez, A., Tsujiuchi, J.: Accuracy of phase determination with unequal reference phase shift. J. Opt. Soc. Am. A 12(9), 1997–2008 (1995)
52. Malacara, D. (ed.): Optical Shop Testing. Wiley, New York (1978)
53. Rathjen, C.: Statistical properties of phase-shift algorithms. J. Opt. Soc. Am. A 12(9), 1997–2008 (1995)
54. Rioux, M.: Laser range finder based on synchronized scanners. Appl. Opt. 23(21), 3837–3855 (1984)
55. Rioux, M., Taylor, D., Duggan, M.: Design of a large depth of view three-dimensional camera for robot vision. Opt. Eng. 26(12), 1245–1250 (1987)
56. Robson, S., Beraldin, A., Brownhill, A., MacDonald, L.: Artefacts for optical surface measurement. In: Videometrics, Range Imaging, and Applications XI. SPIE Proceedings (2011)
57. Salvi, J., Pages, J., Batlle, J.: Pattern codification strategies in structured light systems. Pattern Recognit. 37(4), 827–849 (2004)
58. Sansoni, G., Patrioli, A.: Noncontact 3d sensing of free-form complex surfaces. In: Proc. SPIE, vol. 4309 (2001)
59. Seitz, P.: Photon-noise limited distance resolution of optical metrology methods. In: Optical Measurement Systems for Industrial Inspection V. Proceedings of SPIE, vol. 6616 (2007)
60. Singer, C.J., Williams, T.I., Raper, R.: A History of Technology. Clarendon Press, Oxford (1954)
61. Smith, W.J.: Modern Optical Engineering, 3rd edn. McGraw-Hill, New York (2000)
62. Surrel, Y.: Additive noise effect in digital phase detection. Appl. Opt. 36(1), 271–276 (1994)
63. Trobina, M.: Error model of a coded-light range sensor. Tech. Rep. BIWI-TR-164, ETH-Zentrum (1995)
64. Will, P.M., Pennington, K.S.: Grid coding: a novel technique for image processing. Proc. IEEE 60(6), 669–680 (1972)
65. Willème, F.: Photo-sculpture. United States Patent Office 43822 (1864)