
3 Active 3D Imaging Systems


which part of the projected pattern corresponds to which part of the imaged pattern.

When working with coherent light sources (lasers), eye safety is of paramount importance and one should never operate laser-based 3D imaging sensors without appropriate eye-safety training. Many 3D imaging systems use a laser in the visible spectrum, where fractions of a milliwatt are sufficient to cause eye damage, since the laser light entering the pupil is focused by the lens onto a small area of the retina. For an operator using any laser, an important safety parameter is the maximum permissible exposure (MPE), which is defined as the level of laser radiation to which a person may be exposed without hazardous effect or adverse biological changes in the eye or skin [4]. The MPE varies with the wavelength and operating conditions of a system. We do not have space to discuss eye safety extensively here and refer the reader to the American National Standard for Safe Use of Lasers [4]. Note that high-power low-coherence (and non-coherent) light sources can also pose eye-safety issues.

3.1.4 Chapter Outline

Firstly, we present spot scanners, followed by stripe scanners. These types of scanner are used to introduce the concepts needed for the presentation of structured light systems in Sect. 3.4. In the following section, we discuss the calibration of active 3D imaging systems. Then, the measurement uncertainty associated with triangulation systems is presented; this section is optional advanced material and may be omitted on a first reading. The experimental characterization of active 3D imaging systems is then presented. In Sect. 3.8, further advanced topics are included; this section may also be omitted on a first reading. Towards the end of the chapter, we present the main challenges for future research, concluding remarks and suggestions for further reading. Finally, a set of questions and exercises is presented for the reader to develop and consolidate their understanding of active 3D imaging systems.

3.2 Spot Scanners

Usually, spot scanners use a laser, and we limit the discussion to this type of technology. In order to study the basic principle of triangulation, we assume an infinitely thin laser beam and constrain the problem to the plane (X, Z), i.e. Y = 0. The basic geometrical principle of optical triangulation for a spot scanner is shown in Fig. 3.2 and is identical to that of the passive stereo discussed in Chap. 2.

In Fig. 3.2, a laser source projects a beam of light on a surface of interest. The light scattered by that surface is collected from a vantage point spatially distinct from the projected light beam. This light is focused (imaged) onto a linear spot


M.-A. Drouin and J.-A. Beraldin

Fig. 3.2 Schematic diagram of a single point optical triangulation sensor based on a laser beam and a linear spot detector. The baseline is H and d is the distance between the lens and the linear spot detector. The projection angle is α. The collection angle is β and it is computed using the distance d and the position p on the linear spot detector. The point [X, Z]T is determined by the baseline H , the projection angle α and the collection angle β. Figure courtesy of [11]

detector.1 Knowledge of both the projection and collection angles (α and β) relative to a baseline (H) determines the [X, Z]T coordinate of a point on the surface. Note that it is assumed that the only light traversing the lens passes through its optical center, which is the well-known pinhole model of the imaging process. Furthermore, just as we speak of projecting the laser light onto the scene, we can think of the ray through the imaged spot position on the detector and the lens optical center as a back-projected ray, traveling in the opposite direction to the light, back into the scene. This back-projected ray intersects the projected laser ray to determine the 3D scene point.

The linear spot detector acts as an angle sensor and provides signals that are interpreted as a position p. Explicitly, given the value of p, the value of β in radians is computed as

β = arctan(p/d)  (3.1)

where d is the distance between the linear spot detector and the collection lens. (Typically this distance will be slightly larger than the focal length of the lens, such that the imaged spot is well focused at the depth at which most parts of the object surface are imaged. The relevant thin lens equation is discussed in Sect. 3.8.1.)
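As an illustration of this point, here is a minimal sketch of the thin lens relation, assuming the standard Gaussian form 1/f = 1/z_o + 1/d with z_o the stand-off distance to the surface (the function name and example values are ours, not from the text):

```python
def detector_distance(f, z_o):
    """Thin lens equation 1/f = 1/z_o + 1/d, solved for the
    lens-to-detector distance d given the focal length f and the
    object (stand-off) distance z_o, all in the same units."""
    if z_o <= f:
        raise ValueError("object must lie beyond the focal length")
    return f * z_o / (z_o - f)

# A 25 mm lens focused on a surface 500 mm away:
d = detector_distance(25.0, 500.0)  # ~26.3 mm
```

For a 25 mm focal length focused at 500 mm, d is about 26.3 mm, slightly larger than f, consistent with the remark above.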

The position of p on the linear spot detector is computed using a peak detector which will be described later. Using simple trigonometry, one can verify that

Z = H / (tan α + tan β)  (3.2)

and

X = Z tan α.  (3.3)

1A linear spot detector can be conceptually viewed as a conventional camera that has a single row of pixels. Many linear spot detectors have been proposed in the past for 3D imaging [11].


Substituting Eq. (3.1) into Eq. (3.2) gives

Z = H d / (p + d tan α).  (3.4)

In order to acquire a complete profile without using a translation stage, the laser beam can be scanned around some [X, Z]T coordinate using a mirror mounted on a mechanical scanner (typically a galvanometer drive). In this case, the angle α is varied according to a predefined field of view. For practical reasons, the total scanned angle for a configuration like the one in Fig. 3.2 is about 30 degrees. Larger angles may be scanned by more sophisticated optical arrangements called synchronized scanners, where the field of view of the camera is scanned using the same mirror that scans the laser. Sometimes the reverse side of a double-sided mirror is used [21, 54].
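The triangulation equations above can be sketched numerically; the variable names and example values below are ours, not from the text:

```python
import math

def triangulate(p, alpha, H, d):
    """Recover the [X, Z] coordinate of the illuminated point from the
    spot position p on the linear detector, using Eqs. (3.1)-(3.3).
    alpha: projection angle in radians, H: baseline,
    d: lens-to-detector distance (same units as H and p)."""
    beta = math.atan(p / d)                     # Eq. (3.1)
    Z = H / (math.tan(alpha) + math.tan(beta))  # Eq. (3.2)
    X = Z * math.tan(alpha)                     # Eq. (3.3)
    return X, Z

def depth_direct(p, alpha, H, d):
    """Eq. (3.4): Z obtained by substituting Eq. (3.1) into Eq. (3.2)."""
    return H * d / (p + d * math.tan(alpha))

# Illustrative values: 200 mm baseline, 26 mm lens-detector distance,
# 20 degree projection angle, spot imaged 1.5 mm off the detector origin.
X, Z = triangulate(1.5, math.radians(20.0), 200.0, 26.0)
assert abs(Z - depth_direct(1.5, math.radians(20.0), 200.0, 26.0)) < 1e-9
```

The assertion confirms that the closed form of Eq. (3.4) yields the same depth as first computing β explicitly.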

3.2.1 Spot Position Detection

It is crucial to obtain the position of the laser spot on the linear spot detector to sub-pixel accuracy. In order to accomplish this, the image of the laser spot must be a few pixels wide on the detector, which is easy to achieve in a real system. Many peak detectors have been proposed to compute the position of the laser spot and two studies compare different peak detectors [34, 48]. We examine two peak detectors [18, 34, 48]. The first one localizes the ‘center of mass’ of the imaged spot intensity. In this method, the pixel iM with the maximum intensity is found in the 1D image, which is denoted I. Then a window of size 2N + 1 centered on iM is used to compute the centroid position. Explicitly, the peak position p is defined as

p = iM + (Σ_{i=−N}^{N} i I(iM + i)) / (Σ_{i=−N}^{N} I(iM + i)).  (3.5)

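A minimal sketch of this centroid detector follows (the function and variable names are ours; it assumes the peak lies at least N pixels from the image border):

```python
def centroid_peak(I, N=2):
    """Sub-pixel spot position via the center-of-mass rule of Eq. (3.5):
    locate the brightest pixel iM, then compute the intensity-weighted
    mean position over a window of size 2N + 1 centered on iM."""
    iM = max(range(len(I)), key=lambda i: I[i])
    num = sum(i * I[iM + i] for i in range(-N, N + 1))
    den = sum(I[iM + i] for i in range(-N, N + 1))
    return iM + num / den

# A laser spot a few pixels wide, slightly to the right of pixel 4:
profile = [0, 0, 1, 5, 9, 7, 2, 0, 0]
p = centroid_peak(profile)  # ≈ 4.17
```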
The second peak detector uses convolution with a derivative filter, followed by a linear interpolation. Explicitly, for each pixel, i, let

g(i) = Σ_{j=−N}^{N} I(i − j) F(j + N)  (3.6)

where F = [1, 1, 1, 1, 0, −1, −1, −1, −1] and N = 4. Finally, the linear interpolation process is implemented as

p = i0 + g(i0) / (g(i0) − g(i0 + 1))  (3.7)

where i0 is a pixel such that g(i0) ≥ 0 and g(i0 + 1) < 0. Moreover, F has the property of filtering out some of the frequency content of the image [18]. This makes it