

Fig. 3.16 The same surface containing defects scanned by four different scanners. (Top left) Point-based laser triangulation system. (Top right) Profile-based laser triangulation system. (Bottom left) Fringe projection system. (Bottom right) Scanner based on conoscopic holography. Figure courtesy of [33]

3.7.4 Application-Based Characterization

We illustrate the principle behind this family of tests using a surface defect detection application. The objective of this application is to localize defects that create a variation in the surface of a product. In this type of application, the calibration of the system is of limited importance; what matters is the capability of the system to image small structural details. In this test, an object known to contain defects is scanned, and the presence of those defects can then be verified in the 3D data. Figure 3.16 illustrates surface defects as detected by four different systems.

3.8 Selected Advanced Topics

This section may be omitted on a first reading. It contains material that requires in-depth knowledge of the image formation process. Section 3.8.1 presents the thin lens equation. Sections 3.8.2 and 3.8.3 examine the depth of field of a triangulation-based 3D camera. Sections 3.8.4 and 3.8.5 give some important results whose derivations would require in-depth knowledge of diffraction and Gaussian beam optics. Finally, Sect. 3.8.6 uses those results to discuss the lateral resolution of phase shift and spot scanners. Further information concerning optical issues can be found in [15, 43, 61].

3.8.1 Thin Lens Equation

Optical systems are complex and difficult to model. A very useful approximation is the thin lens equation, which provides a first-order approximation of a lens with negligible thickness.


Fig. 3.17 The image of the point at distance Z from the optical center is in focus on the image plane, while the point at distance Z' from the optical center is imaged as a circle of diameter c on the image plane. The lens aperture is Φ and the distance between the image plane and the optical center is d. Figure courtesy of NRC Canada

Given the distance Z between an object and the optical center of the lens and the focal length f of this lens, one may compute, using the thin lens equation, the distance between the optical center and the image plane needed in order to obtain a sharp image of the object. Since optical engineering falls outside the scope of this chapter, we provide the thin lens equation without derivation (see [61] for details). The thin lens equation is

\[ \frac{1}{Z} + \frac{1}{d} = \frac{1}{f} \tag{3.45} \]

where f is the lens focal length, Z is the distance between the optical center and the object plane and d is the distance between the optical center and the image plane (i.e. the CCD or CMOS sensor).

Since d ≈ f when the distance between the camera and the object is sufficiently large, Chap. 2 and other textbooks use f (for focal length) rather than d in their camera models (see Sect. 3.3.1).
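To make Eq. (3.45) concrete, the short Python sketch below solves it for the image distance d given an object distance Z and a focal length f; the function name and sample values are illustrative, not taken from the chapter. It also shows why d ≈ f for distant objects.

```python
def image_distance(Z, f):
    """Solve the thin lens equation 1/Z + 1/d = 1/f for d.

    Z: distance from the optical center to the object plane.
    f: focal length of the lens (same units as Z).
    """
    if Z <= f:
        raise ValueError("the object must lie beyond the focal length (Z > f)")
    return f * Z / (Z - f)

# As Z grows, d converges to f, which is why camera models
# often substitute f for d (see Sect. 3.3.1).
f = 25.0  # focal length in mm (illustrative value)
for Z in (100.0, 1000.0, 10000.0):
    print(f"Z = {Z:8.1f} mm  ->  d = {image_distance(Z, f):.3f} mm")
```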

Usually, 3D imaging systems are used for applications that require the scanning of non-planar objects. Thus, Eq. (3.45) cannot be fulfilled simultaneously for all the points on the surface of the object. As will be explained next, this induces out-of-focus blurring in some parts of the image.

3.8.2 Depth of Field

The point located at Z' in Fig. 3.17 will be imaged as a circle of diameter c on the image plane. This circle is named the circle of confusion. Using simple trigonometry and the thin lens equation, the diameter of this circle can be computed as

\[ c = \Phi\,\frac{|Z' - Z|}{Z'}\,\frac{f}{Z - f} \tag{3.46} \]

where Φ is the size of the lens aperture. The proof is left as an exercise to the reader.

Fig. 3.18 (Left) The circle of confusion acts as a box filter. The diameter of the first circle of confusion is two units, while the diameter of the second one is one unit. (Right) Effect of blurring induced by the circles of confusion (shown left) on the magnitude of a sinusoidal pattern having a period of three units. Figure courtesy of NRC Canada

Given a maximum diameter c_max for the circle of confusion and assuming that Z' > Z > f, the depth of field can be computed, using Eq. (3.46), as

\[ D_f = 2\,|Z' - Z| \tag{3.47} \]

where

\[ Z' = \frac{\Phi f Z}{\Phi f + (f - Z)\,c_{\max}}. \tag{3.48} \]

Thus, a large lens diameter will induce a small focusing range, even though more light is captured by the imaging system.
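As a rough numerical sketch of Eqs. (3.46)–(3.48) (the function names and sample values are mine, not the chapter's), the snippet below evaluates the blur-circle diameter and the resulting depth of field; note how doubling the aperture Φ roughly halves D_f:

```python
def circle_of_confusion(Z_prime, Z, f, aperture):
    """Blur-circle diameter c of Eq. (3.46) for a point at Z_prime,
    given a lens of focal length f and aperture diameter Φ focused
    at distance Z (all lengths in the same units)."""
    return aperture * abs(Z_prime - Z) / Z_prime * f / (Z - f)

def depth_of_field(Z, f, aperture, c_max):
    """Depth of field D_f of Eqs. (3.47)-(3.48): twice the distance
    between the in-focus plane Z and the plane Z' whose circle of
    confusion reaches the tolerated maximum c_max."""
    Z_prime = aperture * f * Z / (aperture * f + (f - Z) * c_max)
    return 2.0 * abs(Z_prime - Z)

# Illustrative values: a 25 mm lens focused at 1 m with a 0.02 mm
# blur tolerance. A larger aperture yields a smaller depth of field.
f, Z, c_max = 25.0, 1000.0, 0.02
for aperture in (5.0, 10.0):  # aperture diameter Φ in mm
    print(f"Phi = {aperture:4.1f} mm  ->  D_f = "
          f"{depth_of_field(Z, f, aperture, c_max):6.1f} mm")
```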

Figure 3.18 illustrates the impact of blurring on the magnitude of the sinusoidal patterns used by a phase shift scanner. As the ratio between the circle-of-confusion diameter and the spatial period of the pattern increases, the magnitude of the signal is reduced. In Eq. (3.40), it can be seen that reducing the spatial period reduces the uncertainty. However, once the optical components of the system are taken into account, one can see that reducing the spatial period may also reduce the magnitude of the sinusoidal pattern, possibly increasing the uncertainty rather than reducing it. Furthermore, because the blurring depends on the distance of a 3D point, the magnitude of the sinusoidal pattern also depends on the distance. Thus, one should expect the curve of standard deviation shown in Fig. 3.10 to look more like a U shape once optically-induced blurring is taken into account. It is, therefore, important to factor in the optically-induced blurring and other optics-related degradations when designing a system, because those define the usable measurement volume, which is generally smaller than the reconstruction volume. Note that even when a system is perfectly in focus, diffraction and aberrations induce a degradation of the image which is similar to out-of-focus blurring [61].
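The magnitude loss of Fig. 3.18 can also be reproduced numerically. Treating the circle of confusion as a 1D box filter of width c, as the figure does, convolution with a normalized box scales a sinusoid of period T by |sin(πc/T)/(πc/T)|; the sketch below illustrates this under that simplification and is not code from the chapter.

```python
import numpy as np

def box_blur_attenuation(c, T):
    """Magnitude scaling of a sinusoid of period T after convolution
    with a normalized box filter of width c (the 1D model of the
    circle of confusion used in Fig. 3.18). np.sinc(x) computes
    sin(pi*x)/(pi*x)."""
    return abs(np.sinc(c / T))

T = 3.0  # pattern period, in the units of Fig. 3.18
for c in (1.0, 2.0):  # the two circle-of-confusion diameters of Fig. 3.18
    print(f"c = {c:.0f} units -> magnitude scaled by {box_blur_attenuation(c, T):.3f}")
```

The wider circle attenuates the pattern more strongly, which is the behavior shown on the right of Fig. 3.18.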


Fig. 3.19 Scheimpflug geometry for a point-based triangulation sensor. The baseline is H. The projection and collection angles are α and β respectively. The angle between the photo-detector and the collecting lens is ρ. Finally, d' and H' are respectively the distance along the Z-axis and the X-axis between the lens optical center and the position of the laser spot on the detector. Figure courtesy of NRC Canada

3.8.3 Scheimpflug Condition

As explained previously, a large lens diameter will induce a small focusing range. This affects all triangulation-based 3D cameras, and many of them use the Scheimpflug condition in order to mitigate the impact of this reduced focusing range [16]. In order to simplify the discussion, the Scheimpflug condition will be presented for a point-based scanner. Nevertheless, it could be used with profile-based and area-based scanners. Figure 3.19 shows an optical geometry based on the Scheimpflug condition for the point-based scanner presented in Sect. 3.2. Note that the optical axis is no longer perpendicular to the photo-detector. The angle between the photo-detector and the collecting lens is set to ρ and, as will be shown, this ensures that for a given illumination direction (i.e. angle α) all the points along the laser beam path will be in focus on the position detector. Using simple trigonometry, one can verify that

\[ \frac{d'}{\tan\rho} = H + H' \tag{3.49} \]

and

\[ H' = \frac{d'}{\tan(\pi/2 - \beta)}, \tag{3.50} \]

where H' is a line segment in Fig. 3.19. We obtain

\[ d' = \frac{H\tan\rho}{1 - \tan\beta\tan\rho} \tag{3.51} \]

by substituting Eq. (3.50) in Eq. (3.49). Finally, substituting Eq. (3.51) and Eq. (3.2) in Eq. (3.45), we obtain

\[ \cot\rho = \frac{H - f\tan\alpha}{f}. \tag{3.52} \]
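Finally, a minimal sketch (the function name and sample values are illustrative assumptions) showing how Eq. (3.52) yields the detector tilt ρ from the baseline, the focal length and the projection angle; since β does not appear in Eq. (3.52), this single tilt keeps all depths along the beam in focus for the given α:

```python
import math

def scheimpflug_tilt(H, f, alpha):
    """Detector tilt ρ (radians) from Eq. (3.52):
    cot ρ = (H - f tan α) / f, with baseline H, focal length f
    (same units as H) and projection angle α in radians."""
    return math.atan2(f, H - f * math.tan(alpha))

# Illustrative values: 100 mm baseline, 25 mm lens, 20° projection angle.
H, f, alpha = 100.0, 25.0, math.radians(20.0)
print(f"detector tilt rho = {math.degrees(scheimpflug_tilt(H, f, alpha)):.2f} degrees")
```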

 

f