WO2021131684A1 - Ranging device, method for controlling ranging device, and electronic apparatus - Google Patents

Ranging device, method for controlling ranging device, and electronic apparatus

Info

Publication number
WO2021131684A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
light
light source
unit
distance
Prior art date
Application number
PCT/JP2020/045758
Other languages
French (fr)
Japanese (ja)
Inventor
ジャエシュ ハナーカル
竜生 諸角
陽太郎 安
Original Assignee
ソニーセミコンダクタソリューションズ株式会社
Priority date
Filing date
Publication date
Application filed by ソニーセミコンダクタソリューションズ株式会社 filed Critical ソニーセミコンダクタソリューションズ株式会社
Publication of WO2021131684A1 publication Critical patent/WO2021131684A1/en


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S17/32 Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483 Details of pulse systems

Definitions

  • The present technology relates to a distance measuring device, a method for controlling the distance measuring device, and an electronic apparatus, and in particular to a distance measuring device, a control method therefor, and an electronic apparatus that make it possible to measure distance limited to a desired measurement range.
  • a distance measuring module is mounted on a mobile terminal such as a smartphone.
  • a distance measuring method in the distance measuring module for example, there is a method called a ToF (Time of Flight) method.
  • In the ToF method, light is emitted toward an object, the light reflected on the surface of the object is detected, and the distance to the object is calculated based on a measured value obtained by measuring the flight time of the light (see, for example, Patent Document 1).
  • With a ToF type distance measuring device, there is a demand to measure distance limited to a desired measurement range.
  • This technology was made in view of such a situation, and makes it possible to measure a distance limited to a desired measurement range.
  • The distance measuring device according to the first aspect of the present technology includes a light emitting source that emits irradiation light, a light receiving sensor that receives the reflected light returned after being reflected by an object, and a coding unit that generates a coded light source modulation signal and a coded sensor modulation signal by encoding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor.
  • In the control method of the distance measuring device according to the second aspect of the present technology, a distance measuring device having a light emitting source that emits irradiation light and a light receiving sensor that receives the reflected light returned after being reflected by an object generates a coded light source modulation signal and a coded sensor modulation signal by encoding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor, and generates a delayed light source modulation signal or a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount.
  • The electronic apparatus according to the third aspect of the present technology is provided with a distance measuring device including a light emitting source that emits irradiation light, a light receiving sensor that receives the reflected light returned after being reflected by an object, a coding unit that generates a coded light source modulation signal and a coded sensor modulation signal by encoding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor, and a sensor delay unit that generates a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount.
  • In a distance measuring device having a light emitting source that emits irradiation light and a light receiving sensor that receives the reflected light returned after being reflected by an object, a coded light source modulation signal and a coded sensor modulation signal are generated by encoding, according to a predetermined code, the light source modulation signal that controls the light emission timing of the light emitting source and the sensor modulation signal that controls the light reception timing of the light receiving sensor, and a delayed light source modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded light source modulation signal, or a delayed sensor modulation signal whose phase is delayed by the predetermined delay amount with respect to the coded sensor modulation signal, is generated.
  • the distance measuring device and the electronic device may be an independent device or a module incorporated in another device.
  • the relationship between the distance to the object and the signal strength in the normal mode and the code period 1 is shown.
  • the relationship between the distance to the object and the signal strength in the normal mode and the code period 2 is shown.
  • the relationship between the distance to the object and the signal strength in the long-distance mode and the code period 1 is shown.
  • the relationship between the distance to the object and the signal strength in the long-distance mode and the code period 2 is shown.
  • the relationship between the distance to the object and the signal strength in the short-distance mode and the code period 1 is shown.
  • the relationship between the distance to the object and the signal strength in the short-distance mode and the code period 2 is shown.
  • the relationship between the distance to the object and the signal strength in the short-distance mode and the code period 1 is shown.
  • FIG. 1 is a block diagram showing a configuration example of a distance measuring device according to an embodiment to which the present technology is applied.
  • The distance measuring device 1 shown in FIG. 1 is a distance measuring module that performs distance measurement by the Indirect ToF method: it irradiates a predetermined object (measurement target) as a subject with light, receives the light (reflected light) obtained when that light (irradiation light) is reflected by the object, and generates and outputs a depth map and a reliability map as distance information to the object.
  • The distance measuring device 1 includes a timing signal generation unit 11, a phase setting unit 12, a light source modulation unit 13, a sensor modulation unit 14, a code generation unit 15, a coding unit 16, a light source delay unit 17, a sensor delay unit 18, a light emitting source 19, a light receiving sensor 20, and a control unit 21.
  • the timing signal generation unit 11 generates a timing signal that serves as a reference for the light emitting operation of the light emitting source 19 and the light receiving operation of the light receiving sensor 20. Specifically, the timing signal generation unit 11 generates a modulation signal having a predetermined modulation frequency Fmod (for example, 20 MHz) and supplies it to the light source modulation unit 13, the sensor modulation unit 14, and the code generation unit 15.
  • the modulated signal is, for example, as shown in FIG. 4, a pulse signal that repeats on (High) and off (Low) at the modulation frequency Fmod.
  • When performing distance measurement by the Indirect ToF method, the phase setting unit 12 sets the phase difference φD between the light emission timing of the light emitting source 19 and the light reception timing of the light receiving sensor 20, and supplies it to the light source modulation unit 13 or the sensor modulation unit 14.
  • The phase difference φD between the light emission timing and the light reception timing is referred to as the drive phase difference φD to distinguish it from the phase difference φ detected according to the distance to the subject.
  • When the drive phase difference φD is supplied from the phase setting unit 12, the light source modulation unit 13 generates a light source modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal supplied from the timing signal generation unit 11, and supplies it to the coding unit 16.
  • Likewise, when the drive phase difference φD is supplied from the phase setting unit 12, the sensor modulation unit 14 generates a sensor modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal supplied from the timing signal generation unit 11, and supplies it to the coding unit 16.
  • Since the phases of the light source modulation signal generated by the light source modulation unit 13 and of the sensor modulation signal generated by the sensor modulation unit 14 need only differ by the drive phase difference φD, the phase setting unit 12 may supply the drive phase difference φD to either the light source modulation unit 13 or the sensor modulation unit 14. For example, when the phase setting unit 12 supplies the drive phase difference φD to the light source modulation unit 13, the light source modulation unit 13 generates a light source modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal.
  • In this case, the sensor modulation unit 14 supplies the modulation signal from the timing signal generation unit 11 to the coding unit 16 as it is, as the sensor modulation signal.
  • Conversely, when the phase setting unit 12 supplies the drive phase difference φD to the sensor modulation unit 14, the sensor modulation unit 14 generates a sensor modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal,
  • and the light source modulation unit 13 supplies the modulation signal from the timing signal generation unit 11 to the coding unit 16 as it is, as the light source modulation signal.
  • a modulation signal which is a reference timing signal, is supplied to the code generation unit 15 from the timing signal generation unit 11, and a code period is supplied from the control unit 21.
  • the code generation unit 15 randomly generates a code of 0 or 1 in a code cycle unit supplied from the control unit 21 and supplies the code to the coding unit 16.
  • The unit of the code period is one cycle of the modulation signal, and the code period supplied from the control unit 21 is an integer multiple of one cycle of the modulation signal.
  • the code period is also referred to as Chip Length.
  • The coding unit 16 generates coded signals corresponding to the code supplied from the code generation unit 15 from the light source modulation signal supplied from the light source modulation unit 13 and from the sensor modulation signal supplied from the sensor modulation unit 14. With respect to the supplied light source modulation signal or sensor modulation signal, the generated coded signal has the same phase when the code is 0 and an inverted phase when the code is 1.
  • That is, the coding unit 16 generates a coded light source modulation signal corresponding to the code supplied from the code generation unit 15 from the light source modulation signal supplied from the light source modulation unit 13, and supplies it to the light source delay unit 17.
  • The coding unit 16 also generates a coded sensor modulation signal corresponding to the code supplied from the code generation unit 15 from the sensor modulation signal supplied from the sensor modulation unit 14, and supplies it to the sensor delay unit 18.
  • The light source delay unit 17 generates a delayed light source modulation signal whose phase is delayed by the delay amount φD supplied from the control unit 21 with respect to the coded light source modulation signal supplied from the coding unit 16, and supplies it to the light emitting source 19.
  • The sensor delay unit 18 generates a delayed sensor modulation signal whose phase is delayed by the delay amount φD supplied from the control unit 21 with respect to the coded sensor modulation signal supplied from the coding unit 16, and supplies it to the light receiving sensor 20.
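  • As an illustration of this signal chain (modulation, coding, and delay), a minimal sketch in Python is given below. It simulates the signals as sampled square waves; the sampling resolution, the helper names, and the specific parameter values are assumptions made for illustration, not part of the device described here.

      import numpy as np

      SAMPLES_PER_CYCLE = 64   # samples per modulation period (1 / Fmod)

      def modulation_signal(num_cycles):
          # Square wave repeating on (High) / off (Low) at the modulation frequency.
          one_cycle = np.concatenate([np.ones(SAMPLES_PER_CYCLE // 2),
                                      np.zeros(SAMPLES_PER_CYCLE // 2)])
          return np.tile(one_cycle, num_cycles)

      def phase_shift(signal, phase_deg):
          # Shift the signal by a phase expressed in degrees of one modulation cycle.
          return np.roll(signal, int(round(phase_deg / 360.0 * SAMPLES_PER_CYCLE)))

      def encode(signal, codes, chip_length):
          # Phase-invert (180 degree shift) every chip whose code is 1; for a
          # 50 percent duty square wave this equals logical inversion.
          out = signal.copy()
          chip = chip_length * SAMPLES_PER_CYCLE
          for i, c in enumerate(codes):
              if c == 1:
                  out[i * chip:(i + 1) * chip] = 1 - out[i * chip:(i + 1) * chip]
          return out

      # Example: code period (Chip Length) of 2 cycles, four random chips, a drive
      # phase difference of 90 degrees on the light-source side, and a delay amount
      # of 1Pi (180 degrees) applied on the light-source side.
      chip_length = 2
      codes = np.random.randint(0, 2, size=4)
      mod = modulation_signal(chip_length * len(codes))

      light_source_mod = phase_shift(mod, 90)        # light source modulation signal
      sensor_mod = mod                               # sensor modulation signal (as-is)

      coded_light = encode(light_source_mod, codes, chip_length)
      coded_sensor = encode(sensor_mod, codes, chip_length)

      delayed_light = phase_shift(coded_light, 180)  # delayed light source modulation signal
      delayed_sensor = coded_sensor                  # no delay on the sensor side

  • In the normal mode, both delay amounts would be zero, so the delayed signals are identical to the coded signals.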
  • The light emitting source 19 is composed of, for example, an infrared laser diode as a light source and a laser driver, and emits light modulated at the timing corresponding to the delayed light source modulation signal supplied from the light source delay unit 17, thereby irradiating the object with irradiation light.
  • The light receiving sensor 20, which will be described in detail later with reference to FIG. 2, has a pixel array unit 32 in which a plurality of pixels 31 are two-dimensionally arranged in a matrix, and receives the reflected light from the object. The light receiving sensor 20 supplies a pixel signal corresponding to the amount of received reflected light to the control unit 21.
  • the control unit 21 controls the operation of the entire distance measuring device 1.
  • In response to a distance measurement instruction from the host control unit, which is the control unit of the host device in which the distance measuring device 1 is incorporated, the control unit 21 outputs a trigger signal that starts the operation of the timing signal generation unit 11, the phase setting unit 12, the code generation unit 15, and so on. Further, the control unit 21 determines the code period (Chip Length) and supplies it to the code generation unit 15, and determines the delay amount φD according to the measurement mode and supplies it to either the light source delay unit 17 or the sensor delay unit 18.
  • The control unit 21 generates a depth value and a reliability for each pixel based on the pixel signals supplied from the light receiving sensor 20, generates a depth map in which the depth value is stored as the pixel value of each pixel and a reliability map in which the reliability is stored as the pixel value of each pixel, and outputs them to the host control unit.
  • the distance measuring device 1 of FIG. 1 has the above configuration.
  • The distance measuring device 1 has a first measurement mode (hereinafter also referred to as the normal mode), a second measurement mode (hereinafter also referred to as the short-distance mode) that focuses on distance measurement at shorter distances than the first measurement mode, and a third measurement mode (hereinafter also referred to as the long-distance mode) that focuses on distance measurement at longer distances than the first measurement mode.
  • FIG. 2 shows a detailed configuration example of the light receiving sensor 20.
  • the light receiving sensor 20 has a pixel array unit 32 in which the pixels 31 are two-dimensionally arranged in a matrix in the row direction and the column direction, and a drive control circuit 33 arranged in a peripheral region of the pixel array unit 32.
  • the pixel 31 generates an electric charge according to the amount of reflected light received, and outputs a pixel signal corresponding to the electric charge.
  • the pixel 31 includes a photodiode 41 and FD (Floating Diffusion) units 42A and 42B as charge storage units for detecting the charges photoelectrically converted by the photodiode 41.
  • the FD section 42A is also referred to as a tap A (first tap)
  • the FD section 42B is also referred to as a tap B (second tap).
  • The pixel 31 includes a transfer transistor 43A, a selection transistor 44A, and a reset transistor 45A, which are pixel transistors that control charge accumulation in the FD section 42A serving as tap A, and a transfer transistor 43B, a selection transistor 44B, and a reset transistor 45B, which are pixel transistors that control charge accumulation in the FD section 42B serving as tap B.
  • a reset operation is performed to reset the excess charge before the start of exposure.
  • In the reset operation, the drive control circuit 33 drives the distribution signals GDA and GDB and the reset signals RSA and RSB to High, turning on the transfer transistor 43A and the reset transistor 45A on the tap A side and the transfer transistor 43B and the reset transistor 45B on the tap B side.
  • After the reset, the transfer transistor 43A and the reset transistor 45A on the tap A side and the transfer transistor 43B and the reset transistor 45B on the tap B side are turned off.
  • the drive control circuit 33 alternately controls the distribution signals GDA and GDB to High, and alternately turns on the transfer transistor 43A on the tap A side and the transfer transistor 43B on the tap B side.
  • the electric charge generated by the photodiode 41 is distributed to the FD section 42A as the tap A or the FD section 42B as the tap B.
  • the operation of distributing the electric charge generated by the photodiode 41 to the tap A or the tap B is periodically repeated for a time corresponding to the light emission period of one frame.
  • the charges transferred via the transfer transistor 43A are sequentially stored in the FD section 42A, and the charges transferred via the transfer transistor 43B are sequentially stored in the FD section 42B.
  • The drive control circuit 33 drives the selection signals ROA and ROB to High, so that the detection signal A corresponding to the accumulated charge of the FD unit 42A, which is tap A, and the detection signal B corresponding to the accumulated charge of the FD unit 42B, which is tap B, are output as pixel signals.
  • That is, when the selection transistor 44A is turned on according to the selection signal ROA, the detection signal A corresponding to the amount of charge accumulated in the FD unit 42A is output from the pixel 31 via the signal line 46A. Similarly, when the selection transistor 44B is turned on according to the selection signal ROB, the detection signal B corresponding to the amount of charge accumulated in the FD unit 42B is output from the pixel 31 via the signal line 46B.
  • In this way, the pixel 31 distributes the charge generated from the reflected light received by the photodiode 41 to tap A or tap B according to the distribution signals GDA and GDB, and outputs the detection signal A and the detection signal B.
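  • The following sketch illustrates, under idealized assumptions (continuous-wave reflected light, 50 percent duty gating, no background light), how the charge of one pixel is split between tap A and tap B depending on the arrival phase of the reflected light. The function names and parameter values are illustrative assumptions, not part of the filing.

      import numpy as np

      def square_wave(phase_deg, n=3600):
          # 50 percent duty square wave over one period, delayed by phase_deg.
          t = (np.arange(n) / n - phase_deg / 360.0) % 1.0
          return (t < 0.5).astype(float)

      def tap_charges(reflected_phase_deg, gate_phase_deg=0.0):
          # Tap A is gated by the distribution signal GDA, tap B by its inverse (GDB);
          # the reflected light is the emission pattern delayed by the flight time.
          reflected = square_wave(reflected_phase_deg)
          gate_a = square_wave(gate_phase_deg)
          gate_b = 1.0 - gate_a
          return (reflected * gate_a).mean(), (reflected * gate_b).mean()

      # Reflected light arriving 60 degrees late: the charge splits about 2:1 between the taps.
      qa, qb = tap_charges(reflected_phase_deg=60.0)
      print(f"tap A: {qa:.3f}  tap B: {qb:.3f}")   # about 0.333 and 0.167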
  • The depth value d corresponding to the distance from the distance measuring device 1 to the object can be calculated by the following equation (1).
  • Δt in equation (1) is the time until the irradiation light emitted from the light emitting source 19 is reflected by the object as the subject and is incident on the light receiving sensor 20, and c is the speed of light.
  • pulsed light having a light emitting pattern that repeats on / off at high speed at a modulation frequency Fmod as shown in FIG. 3 is adopted.
  • One cycle T of the light emission pattern is 1 / Fmod.
  • The reflected light (light reception pattern) is detected with a phase shift corresponding to the time Δt taken from the light emitting source 19 to the light receiving sensor 20.
  • The time Δt can be calculated by the following equation (2).
  • the depth value d from the distance measuring device 1 to the object can be calculated from the equations (1) and (2) by the following equation (3).
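  • Equations (1) to (3) themselves are not reproduced in this text. Under the standard Indirect ToF relations implied by the surrounding description (a reconstruction, not a verbatim copy of the filing), they can be written as:

      d = \frac{c \cdot \Delta t}{2}                     \qquad (1)

      \Delta t = \frac{\phi}{2\pi \cdot F_{mod}}          \qquad (2)

      d = \frac{c \cdot \phi}{4\pi \cdot F_{mod}}         \qquad (3)

  • Here φ is the phase difference of the reflected light detected by the light receiving sensor 20 relative to the emission pattern, corresponding to the time Δt.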
  • Each pixel 31 of the pixel array unit 32 formed in the light receiving sensor 20 repeats ON / OFF of the transfer transistors 43A and 43B at high speed as described above, and accumulates electric charges only during the ON period.
  • The light receiving sensor 20 sequentially switches the ON/OFF execution timing of each pixel 31 of the pixel array unit 32, for example, in frame units, accumulates charge at each execution timing, and outputs a detection signal corresponding to the accumulated charge.
  • There are four types of execution timings: phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees.
  • the execution timing of the phase 0 degree is a timing at which the ON timing (light receiving timing) of the tap A or the tap B of each pixel 31 of the pixel array unit 32 is set to the emission timing of the irradiation light, that is, the same phase as the emission pattern.
  • The execution timing of phase 90 degrees is a timing at which the ON timing (light reception timing) of tap A or tap B of each pixel 31 of the pixel array unit 32 is set to a phase 90 degrees behind the emission timing (emission pattern) of the irradiation light.
  • The execution timing of phase 180 degrees is a timing at which the ON timing (light reception timing) of tap A or tap B of each pixel 31 of the pixel array unit 32 is set to a phase 180 degrees behind the emission timing (emission pattern) of the irradiation light.
  • The execution timing of phase 270 degrees is a timing at which the ON timing (light reception timing) of tap A or tap B of each pixel 31 of the pixel array unit 32 is set to a phase 270 degrees behind the emission timing (emission pattern) of the irradiation light.
  • The ON timing of tap A and the ON timing of tap B are timings whose phases are mutually inverted.
  • For example, in a frame in which tap A of the pixel 31 operates at the 0-degree-phase execution timing, tap B operates at the 180-degree-phase execution timing, and in a frame in which tap A of the pixel 31 operates at the 90-degree-phase execution timing, tap B operates at the 270-degree-phase execution timing.
  • the sensor 20 may receive light (imaging) for at least two frames.
  • A method of acquiring detection signals of four phases with two frames of light reception and calculating the depth value d in this way is called the 2-Phase method.
  • Alternatively, the light receiving sensor 20 may use a method called the 4-Phase method, in which each of tap A and tap B acquires detection signals of all four phases: phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees.
  • In the 4-Phase method, light reception (imaging) of four frames is required, but a result in which the characteristic variation between tap A and tap B has been removed can be obtained.
  • When the light receiving sensor 20 adopts, for example, the 4-Phase method, the light reception timing is switched sequentially in the order of phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees in frame units, and the received light amount (accumulated charge) of the reflected light is detected at each light reception timing.
  • the timing at which the reflected light is incident is shaded.
  • From the detection signals obtained at these light reception timings, the depth value d from the distance measuring device 1 to the object can be calculated.
  • the reliability conf is a value representing the intensity of the light received by each pixel, and is also called a signal intensity conf, and can be calculated by, for example, the following equation (5).
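  • As a concrete illustration of how the four phase measurements are converted into a depth value d and a reliability conf, a minimal sketch is given below. Since equations (4) and (5) are not reproduced in this text, the arctangent form of the detected phase and the specific form of the reliability used here are the commonly used Indirect ToF expressions and should be read as assumptions.

      import math

      C = 299_792_458.0      # speed of light [m/s]
      F_MOD = 20e6           # modulation frequency Fmod [Hz], example value from the text

      def depth_and_confidence(q0, q90, q180, q270):
          # q0..q270: detection signals acquired at the light reception timings of
          # phase 0, 90, 180 and 270 degrees.
          i = q0 - q180
          q = q90 - q270
          phi = math.atan2(q, i) % (2 * math.pi)   # detected phase difference
          d = C * phi / (4 * math.pi * F_MOD)      # depth value d, cf. equation (3)
          conf = math.hypot(i, q)                  # assumed form of the reliability conf
          return d, conf

      # Example: a detected phase difference of 90 degrees corresponds to about 1.9 m at 20 MHz.
      d, conf = depth_and_confidence(q0=500, q90=800, q180=500, q270=200)
      print(f"depth = {d:.2f} m, conf = {conf:.0f}")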
  • The drive control circuit 33 of the pixel array unit 32 generates the distribution signals GDA and GDB so that the timing (light reception timing) at which the charge generated by the photodiode 41 of each pixel 31 is accumulated in tap A or tap B has a phase of 0 degrees, 90 degrees, 180 degrees, or 270 degrees with respect to the emission timing of the irradiation light.
  • The phase difference between the light emission timing and the light reception timing is the drive phase difference φD set by the phase setting unit 12.
  • FIG. 5 is a diagram illustrating processing from the timing signal generation unit 11 to the coding unit 16 of the distance measuring device 1.
  • the timing signal generation unit 11 generates a modulation signal of the modulation frequency Fmod and supplies it to the light source modulation unit 13, the sensor modulation unit 14, and the code generation unit 15.
  • The phase setting unit 12 sets the drive phase difference φD and supplies it to either the light source modulation unit 13 or the sensor modulation unit 14.
  • FIG. 5 shows an example in which the drive phase difference φD set by the phase setting unit 12 is supplied to the light source modulation unit 13.
  • The light source modulation unit 13 generates a light source modulation signal by shifting the phase of the modulation signal from the timing signal generation unit 11 by the drive phase difference φD, and supplies it to the coding unit 16.
  • the sensor modulation unit 14 supplies the modulation signal from the timing signal generation unit 11 as it is to the coding unit 16 as a sensor modulation signal. Therefore, the sensor modulation signal shown in FIG. 5 is the same as the modulation signal generated by the timing signal generation unit 11.
  • the control unit 21 determines the code period (Chip Length) and supplies it to the code generation unit 15.
  • the code generation unit 15 randomly generates a code of 0 or 1 in a code cycle unit (two cycles in FIG. 5) and supplies the code to the coding unit 16.
  • In FIG. 5, the codes are generated in the order of "0", "1", "0", and "1".
  • The coding unit 16 performs phase shift processing according to the code on the light source modulation signal supplied from the light source modulation unit 13 and on the sensor modulation signal supplied from the sensor modulation unit 14, and generates a coded light source modulation signal and a coded sensor modulation signal as coded signals.
  • the phase shift process according to the code is a process of generating a coded signal having the same phase when the code is 0 and generating a coded signal having an inverted phase when the code is 1.
  • When the code is 0, the coded light source modulation signal and the coded sensor modulation signal are the same as the light source modulation signal and the sensor modulation signal; when the code is 1, the coded light source modulation signal and the coded sensor modulation signal are signals whose phases are inverted (shifted by 180 degrees) with respect to the light source modulation signal and the sensor modulation signal.
  • FIG. 6 is a diagram illustrating processing of the light source delay unit 17 and the sensor delay unit 18 of the distance measuring device 1.
  • The control unit 21 determines the delay amount φD according to the measurement mode and supplies it to either the light source delay unit 17 or the sensor delay unit 18.
  • The light source delay unit 17 generates a delayed light source modulation signal whose phase is delayed by the delay amount φD with respect to the coded light source modulation signal supplied from the coding unit 16, and supplies it to the light emitting source 19.
  • The sensor delay unit 18 generates a delayed sensor modulation signal whose phase is delayed by the delay amount φD with respect to the coded sensor modulation signal supplied from the coding unit 16, and supplies it to the light receiving sensor 20.
  • In FIG. 6, the delayed light source modulation signal and the delayed sensor modulation signal are shown.
  • When the delay amount φD is supplied to the sensor delay unit 18, the delayed light source modulation signal is the same signal as the coded light source modulation signal in the upper stage, and the delayed sensor modulation signal is a signal whose phase is delayed by the delay amount φD with respect to the coded sensor modulation signal in the upper stage.
  • Conversely, when the delay amount φD is supplied to the light source delay unit 17, the delayed light source modulation signal is a signal whose phase is delayed by the delay amount φD with respect to the coded light source modulation signal in the upper stage, and the delayed sensor modulation signal is the same signal as the coded sensor modulation signal in the upper stage.
  • the light emitting source 19 emits light while being modulated at a timing corresponding to the delayed light source modulation signal supplied from the light source delay unit 17, and irradiates the object with irradiation light.
  • the light receiving sensor 20 receives reflected light at each pixel 31 at a timing corresponding to the delay sensor modulation signal supplied from the sensor delay unit 18, and outputs a pixel signal corresponding to the amount of the received reflected light.
  • distribution signals GDA and GDB that alternately turn on tap A and tap B are generated based on the delay sensor modulation signal supplied from the sensor delay unit 18.
  • For example, the delayed sensor modulation signal supplied from the sensor delay unit 18 serves as the distribution signal GDA, and the signal obtained by inverting the phase of the delayed sensor modulation signal serves as the distribution signal GDB.
  • the distance measuring device 1 encodes the light source modulation signal and the sensor modulation signal in the coding unit 16 in code cycle units according to the code generated by the code generation unit 15.
  • The coded light source modulation signal and the coded sensor modulation signal become BPSK (Binary Phase Shift Keying) signals whose phases are shifted by 0 degrees or 180 degrees according to the binary code of "0" or "1".
  • The light source delay unit 17 and the sensor delay unit 18 then generate a delayed light source modulation signal and a delayed sensor modulation signal in which either the coded light source modulation signal or the coded sensor modulation signal is phase-delayed by the delay amount φD.
  • the light emitting source 19 emits light and the light receiving sensor 20 receives light.
  • the signal intensity of the reflected light is a reliability conf calculated by the equation (5) based on the detection signal of the pixel 31.
  • the timing signal shown in the upper left part of FIG. 7 shows the waveform of the irradiation light and the reflected light reflected by the object when the object to be measured is ideally located at a position of zero distance.
  • the light receiving timings of tap A and tap B that receive light are shown. Since it is assumed that the object to be measured is ideally located at a distance of zero, the waveform of the irradiation light is also the waveform of the reflected light.
  • the timing at which the reflected light is incident is marked with a pattern.
  • the light receiving timings of tap A and tap B that receive the reflected light reflected by the object are shown.
  • the waveform of the irradiation light is the same as the waveform of the irradiation light in the upper stage.
  • the timing at which the reflected light is incident is marked with a pattern.
  • The signal strength Conf detected by each pixel 31 of the light receiving sensor 20 when the object shown in the upper left is at a position of zero distance is the signal strength C2 shown in the graph on the right side of FIG. 7.
  • The signal strength C2 corresponds to the case where the drive phase difference φD is other than 0; when the drive phase difference φD is 0, the signal strength Conf is the maximum signal strength C1.
  • As shown in the lower left, the reflected light reaches the light receiving sensor 20 with a delay of 2Pi.
  • The light reception timing for that cycle is lost, and the signal strength Conf becomes the signal strength C3 shown in the graph on the right side of FIG. 7.
  • In this way, the signal strength Conf is attenuated according to the distance and becomes zero at a predetermined distance.
  • the distance of the zero point where the signal strength Conf becomes zero depends on the code period (Chip Length).
  • For example, when the code period is 2, the distance at the zero point is a distance equivalent to 4Pi.
  • the distance at the zero point where the signal strength Conf becomes zero is a distance equivalent to (Chip Length x 2Pi).
  • In other words, the signal strength Conf beyond a predetermined distance can be set to zero, so that signals beyond a predetermined distance can be cut off. Further, by controlling the code period (Chip Length), the distance at which the signal is cut off can be set to an arbitrary distance. According to the distance measuring device 1, it is therefore possible to measure distance limited to a desired measurement range. However, there is a trade-off with the SN ratio of the effective ranging range.
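  • As a rough numeric illustration, assuming the example modulation frequency Fmod = 20 MHz mentioned above, the distance equivalent to a phase delay of 2Pi is c / (2 x Fmod), about 7.5 m, so the cut-off distance scales with the code period as follows:

      C = 299_792_458.0
      F_MOD = 20e6                       # example modulation frequency from the text

      d_2pi = C / (2 * F_MOD)            # distance equivalent to 2Pi, about 7.5 m
      for chip_length in (1, 2, 4):
          cutoff = chip_length * d_2pi   # zero point at a distance equivalent to Chip Length x 2Pi
          print(f"Chip Length {chip_length}: signal cut off beyond about {cutoff:.1f} m")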
  • Short-distance mode operation: Next, the short-distance mode (second measurement mode) will be described with reference to FIG. 9.
  • the timing signal shown in the upper left part of FIG. 9 shows the waveform of the irradiation light in the normal mode shown in FIG. 7 and the reception timing of tap A and tap B.
  • In the normal mode, the delay amount φD is 0.
  • the timing signal shown in the lower left part of FIG. 9 shows the waveform of the irradiation light in the short-distance mode and the reception timing of tap A and tap B.
  • In the short-distance mode, the control unit 21 sets the delay amount φD to a predetermined value and supplies it to the light source delay unit 17.
  • The light source delay unit 17 generates a delayed light source modulation signal whose phase is delayed by the delay amount φD with respect to the coded light source modulation signal supplied from the coding unit 16.
  • As a result, the relationship between the distance to the object and the signal strength Conf is shifted to the left, that is, toward the short-distance side, relative to the normal mode, and the distance at the zero point becomes a distance equivalent to (2Pi - φD).
  • In this way, by setting the delay amount φD to a predetermined value and delaying the emission of the irradiation light, the distance measuring device 1 can perform measurement with the distance measurement range limited further to the short-distance side than the distance equivalent to (Chip Length x 2Pi) of the normal mode.
  • The short distance to be measured can be set arbitrarily by controlling the delay amount φD. However, there is a trade-off with the SN ratio of the effective ranging range.
  • the timing signal shown in the upper left part of FIG. 10 shows the waveform of the irradiation light in the normal mode shown in FIG. 7 and the reception timing of tap A and tap B.
  • In the normal mode, the delay amount φD is 0.
  • the timing signal shown in the lower left part of FIG. 10 shows the waveform of the irradiation light in the long-distance mode and the reception timing of the tap A and the tap B.
  • In the long-distance mode, the control unit 21 sets the delay amount φD to a predetermined value and supplies it to the sensor delay unit 18.
  • The sensor delay unit 18 generates a delayed sensor modulation signal whose phase is delayed by the delay amount φD with respect to the coded sensor modulation signal supplied from the coding unit 16.
  • Here, the delay amount φD is set to 2Pi.
  • The relationship between the distance to the object and the signal strength Conf is shown in the graph on the right side of FIG. 10. The relationship is shifted to the right, that is, toward the long-distance side, relative to the normal mode, and the signal strength Conf becomes maximum (signal strength C1) at a distance equivalent to 2Pi.
  • In this way, by setting the delay amount φD to a predetermined value and delaying the light reception of the light receiving sensor 20, the distance measuring device 1 can perform measurement with the distance measurement range limited further to the long-distance side than the distance equivalent to (Chip Length x 2Pi) of the normal mode. Although the distance measurement performance on the short-distance side is degraded, the measurement can be performed so that the distance measurement performance is maximized at a desired distance (2Pi in FIG. 10).
  • The distance to be measured can be set arbitrarily by controlling the delay amount φD. Since the long-distance mode attenuates short-distance signals, for example, the influence of scattered light generated between the lens and the sensor can be reduced, and the signal amount of a distant subject can be relatively increased.
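  • The way the peak and the zero point move with the delay amount can be summarized in a small helper. The sign conventions below are read off the simulation results described later (FIGS. 14 to 18) and are therefore an interpretation rather than text from the filing; Fmod = 20 MHz is again the example value.

      import math

      C = 299_792_458.0
      F_MOD = 20e6

      def to_meters(phase_rad):
          # Distance equivalent to a given phase delay of the reflected light.
          return C * phase_rad / (4 * math.pi * F_MOD)

      def measurement_window(chip_length, delay_rad=0.0, mode="normal"):
          # Approximate peak / zero-point distances for the three measurement modes.
          peak = 0.0
          zero = chip_length * 2 * math.pi
          if mode == "long":       # delay amount applied on the sensor side
              peak, zero = delay_rad, zero + delay_rad
          elif mode == "short":    # delay amount applied on the light-source side
              zero = zero - delay_rad
          return to_meters(peak), to_meters(zero)

      print(measurement_window(1))                          # normal mode: peak at 0 m, zero at about 7.5 m
      print(measurement_window(1, math.pi, mode="long"))    # peak at 1Pi, zero at 3Pi
      print(measurement_window(1, math.pi, mode="short"))   # zero at 1Pi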
  • In step S1, the control unit 21 determines the code period (Chip Length) and the delay amount φD according to the measurement mode.
  • The determined code period is supplied to the code generation unit 15.
  • The determined delay amount φD is supplied to the light source delay unit 17 when the measurement mode is the short-distance mode, and to the sensor delay unit 18 when the measurement mode is the long-distance mode.
  • In the normal mode, the delay amount φD is 0, so the delay amount φD is not supplied to either the light source delay unit 17 or the sensor delay unit 18.
  • In step S2, the timing signal generation unit 11 generates a modulation signal of the modulation frequency Fmod and supplies it to the light source modulation unit 13, the sensor modulation unit 14, and the code generation unit 15.
  • In step S3, the phase setting unit 12 sets the drive phase difference φD and supplies it to either the light source modulation unit 13 or the sensor modulation unit 14.
  • Here, it is assumed that the drive phase difference φD is supplied to the light source modulation unit 13.
  • In step S3 of the first frame of the 4-Phase method, for example, 0 degrees is set as the drive phase difference φD.
  • In step S4, the light source modulation unit 13 and the sensor modulation unit 14 generate modulation signals corresponding to the drive phase difference φD from the phase setting unit 12. Specifically, the light source modulation unit 13 generates a light source modulation signal by shifting the phase of the modulation signal from the timing signal generation unit 11 by the drive phase difference φD, and supplies it to the coding unit 16. The sensor modulation unit 14 supplies the modulation signal from the timing signal generation unit 11 to the coding unit 16 as it is, as the sensor modulation signal.
  • In step S5, the code generation unit 15 randomly generates a code of 0 or 1 in the code period unit set by the control unit 21 and supplies it to the coding unit 16.
  • For example, when the code period is two cycles, the code generation unit 15 randomly generates a code of 0 or 1 in units of two cycles of the modulation frequency Fmod and supplies it to the coding unit 16.
  • steps S4 and S5 may be executed in the reverse order, or may be executed in parallel.
  • In step S6, the coding unit 16 performs phase shift processing according to the code on the light source modulation signal supplied from the light source modulation unit 13 and on the sensor modulation signal supplied from the sensor modulation unit 14, and generates a coded light source modulation signal and a coded sensor modulation signal as coded signals.
  • the generated coded light source modulation signal is supplied to the light source delay unit 17, and the generated coded sensor modulation signal is supplied to the sensor delay unit 18.
  • In step S7, the light source delay unit 17 and the sensor delay unit 18 generate a delayed light source modulation signal and a delayed sensor modulation signal whose phases are delayed by the delay amount φD supplied from the control unit 21.
  • That is, when the delay amount φD is supplied to the light source delay unit 17, the light source delay unit 17 generates a delayed light source modulation signal whose phase is delayed by the delay amount φD with respect to the coded light source modulation signal from the coding unit 16, and supplies it to the light emitting source 19.
  • Similarly, when the delay amount φD is supplied to the sensor delay unit 18, the sensor delay unit 18 generates a delayed sensor modulation signal whose phase is delayed by the delay amount φD with respect to the coded sensor modulation signal from the coding unit 16, and supplies it to the light receiving sensor 20. If the delay amount φD is not supplied, the input signal is output as it is.
  • In step S8, the distance measuring device 1 emits the irradiation light and receives the reflected light.
  • the light emitting source 19 emits light while being modulated at a timing corresponding to the delayed light source modulation signal supplied from the light source delay unit 17, and irradiates the object with irradiation light.
  • Each pixel 31 of the light receiving sensor 20 receives the reflected light at the timing corresponding to the delayed sensor modulation signal supplied from the sensor delay unit 18, and outputs a pixel signal corresponding to the amount of the received reflected light to the control unit 21.
  • In step S9, the control unit 21 of the distance measuring device 1 determines whether the phase data of all frames has been acquired. Specifically, in the case of the 2-Phase method, the control unit 21 determines whether light reception for two frames has been performed, and if so, it determines that the phase data of all frames has been acquired. Similarly, in the case of the 4-Phase method, the control unit 21 determines whether light reception for four frames has been performed, and if so, it determines that the phase data of all frames has been acquired.
  • If it is determined in step S9 that the phase data of all frames has not been acquired, the process returns to step S3, and the processes of steps S3 to S7 described above are repeated. In the next step S3, in the case of the 4-Phase method, for example, the drive phase difference φD is set to 90 degrees.
  • If it is determined in step S9 that the phase data of all frames has been acquired, the process proceeds to step S10, and the control unit 21 generates and outputs a depth map and a reliability map. More specifically, the control unit 21 calculates the depth value d for each pixel 31 of the pixel array unit 32 from the acquired phase data (detection signals) of all frames using equation (3), and calculates the reliability conf using equation (5). Then, the control unit 21 generates a depth map in which the depth value is stored as the pixel value of each pixel 31 and a reliability map in which the reliability conf is stored as the pixel value of each pixel 31, and outputs them.
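  • The overall flow of steps S1 to S10 can be sketched for a single pixel as follows. The detection model is reduced to ideal phase arithmetic (no coding, no noise), so this shows only the per-frame loop and the final depth/reliability computation; all names and the cosine detection model are illustrative assumptions, not an interface of the device.

      import math

      C, F_MOD = 299_792_458.0, 20e6

      def detection_signal(true_distance_m, drive_phase_deg):
          # Idealized tap-A detection signal for one frame: correlation of the
          # emission pattern with the gating shifted by the drive phase difference.
          phi = 4 * math.pi * F_MOD * true_distance_m / C    # phase of the reflected light
          return 0.5 * (1 + math.cos(phi - math.radians(drive_phase_deg)))

      def ranging_process(true_distance_m):
          # S1/S2: measurement mode, Chip Length, delay amount and the modulation
          # signal would be set up here (omitted in this idealized sketch).
          frames = {}
          for drive_phase in (0, 90, 180, 270):              # S3 to S8, one frame per phase
              frames[drive_phase] = detection_signal(true_distance_m, drive_phase)
          # S9: all phase data acquired -> S10: depth value and reliability
          i = frames[0] - frames[180]
          q = frames[90] - frames[270]
          phi = math.atan2(q, i) % (2 * math.pi)
          return C * phi / (4 * math.pi * F_MOD), math.hypot(i, q)

      print(ranging_process(2.0))    # about (2.0, 1.0) for an ideal target at 2 m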
  • In the simulations described below, the measurement mode is set to the normal mode, the short-distance mode, or the long-distance mode, and the code period (Chip Length) and the delay amount φD are set to predetermined values.
  • The upper part of FIG. 12 shows the coded light source modulation signal when the drive phase difference φD is set to 0 degrees, 90 degrees, 180 degrees, and 270 degrees, respectively.
  • the lower part of FIG. 12 shows the result of simulating the relationship between the distance to the object and the signal strength Conf with the settings shown in the upper part.
  • In FIG. 13, the types of signals shown in the upper row and the graph shown in the lower row are the same as in FIG. 12, so a detailed description is omitted and only the relationship between the distance to the object and the signal strength Conf is described. The same applies to FIGS. 14 to 18 described later.
  • The distance at the zero point, in other words the distance at which the signal is cut off, can be set to an arbitrary distance.
  • In the long-distance mode, the delay amount φD is supplied to the sensor delay unit 18, and a delayed sensor modulation signal whose phase is delayed by the delay amount φD with respect to the coded sensor modulation signal is generated.
  • Here, the delay amount φD is 1Pi.
  • The distance at which the signal strength Conf peaks shifts from 0 to Pi, that is, in the long-distance direction.
  • The distance of the zero point also changes from 2Pi to 3Pi, shifting in the long-distance direction.
  • In the next example, the delay amount φD supplied to the sensor delay unit 18 is 2Pi.
  • The distance at which the signal strength Conf peaks shifts from 0 to 2Pi, in the long-distance direction.
  • The distance of the zero point also changes from 4Pi to 6Pi, shifting in the long-distance direction.
  • In this way, it can be seen that the distances of the peak and the zero point of the signal strength Conf can be set farther than in the normal mode, and can be set to arbitrary distances.
  • In the short-distance mode, the delay amount φD is supplied to the light source delay unit 17, and a delayed light source modulation signal whose phase is delayed by the delay amount φD with respect to the coded light source modulation signal is generated.
  • Here, the delay amount φD is 1Pi.
  • The peak value of the signal strength Conf (the signal strength Conf at a distance of zero) is half the value in the normal mode.
  • The distance of the zero point is 1Pi, half of the 2Pi in the normal mode, shifting in the short-distance direction.
  • In the next example, with a code period of 2, the delay amount φD supplied to the light source delay unit 17 is 1Pi.
  • The peak value of the signal strength Conf (the signal strength Conf at a distance of zero) is 3/4 of the value in the normal mode.
  • The distance of the zero point is 3Pi, 3/4 of the 4Pi in the normal mode, shifting in the short-distance direction.
  • In the next example, the delay amount φD supplied to the light source delay unit 17 is Pi x 7/4.
  • The peak value of the signal strength Conf (the signal strength Conf at a distance of zero) is 0.16 times the value in the normal mode.
  • The distance of the zero point is Pi/4, which is 1/8 of the 2Pi in the normal mode, shifting in the short-distance direction.
  • In this way, it can be seen that the distances of the peak and the zero point of the signal strength Conf can be set closer than in the normal mode, and can be set to arbitrary distances.
  • FIG. 19 is a cross-sectional view of a smartphone 101 as an electronic device in which the distance measuring device 1 is incorporated, as viewed from a surface parallel to the display surface.
  • the distance measuring device 1 is incorporated in the smartphone 101.
  • a cover glass 102 is arranged on the front surface of the display panel (not shown) of the smartphone 101, and the distance measuring device 1 is arranged on the back side (inside the main body) of the display panel.
  • the irradiation light L1 emitted from the light emitting source 19 of the distance measuring device 1 passes through the cover glass 102 and is irradiated to the subject 103.
  • the subject 103 is, for example, a user using the smartphone 101.
  • the irradiation light L1 is reflected by the subject 103, passes through the cover glass 102 as the reflected light L2, and is incident on the light receiving sensor 20 via the lens 104.
  • foreign matter 121 such as dust or fingerprints may adhere to the surface of the cover glass 102. Since the user does not know where the distance measuring device 1 is located inside the smartphone 101, the user often does not notice the influence of the foreign matter 121 on the distance measuring device 1.
  • the irradiation light L1 is reflected by the foreign matter 121, refracted like the reflected light L3, and incident on the light receiving sensor 20.
  • By setting the measurement mode to the short-distance mode, it is possible to perform measurement that excludes the subject 103.
  • FIG. 20 is a flowchart of the distance measuring process in which the distance measuring device 1 measures the distance to the original measurement target while detecting the foreign matter described with reference to FIG. 19. This process is started, for example, when a distance measurement instruction is supplied from the control unit (AP) of the smartphone 101 in which the distance measuring device 1 is incorporated.
  • In step S21, the control unit 21 sets the measurement mode to the short-distance mode, and in step S22, the distance measuring device 1 executes the measurement in the short-distance mode.
  • The code period (Chip Length) and the delay amount φD in the short-distance mode are set to values optimal for measuring the distance corresponding to the case where the irradiation light L1 is reflected on the surface of the cover glass 102.
  • the distance measuring device 1 executes the distance measuring process shown in FIG. 11, and the control unit 21 generates a depth map and a reliability map.
  • In step S23, the control unit 21 determines whether or not a foreign matter has been detected based on the depth map and the reliability map acquired in the short-distance mode. For example, when the distance to the surface of the cover glass 102 is detected in the depth map, the control unit 21 determines that a foreign matter has been detected.
  • If it is determined in step S23 that a foreign matter has been detected, the process proceeds to step S24, and the control unit 21 notifies the control unit of the smartphone 101 of the detection of the foreign matter.
  • the control unit of the smartphone 101 notified from the distance measuring device 1 that the foreign matter has been detected displays, for example, an alert screen requesting the user to remove the foreign matter as shown in FIG. 21 on the display.
  • On the display 141 of the smartphone 101, the message 142 "There is dust or fingerprints on the front of the camera. Wipe the area of the red line to clean it." and the red area 143 indicating the area to be wiped are displayed.
  • the red area 143 corresponds to the position of the distance measuring device 1 inside the smartphone 101.
  • the foreign matter 121 is removed by the user wiping the vicinity of the red area 143 based on the message 142.
  • When the user finishes the wiping work, the user operates (presses) the wiping end button 144 as an operation in response to the message 142.
  • When the wiping end button 144 is operated by the user, the distance measurement instruction is again supplied from the control unit of the smartphone 101 to the distance measuring device 1.
  • After notifying the control unit of the smartphone 101 of the detection of the foreign matter in step S24, the control unit 21 determines in step S25 whether a distance measurement instruction corresponding to the operation of the wiping end button 144 has been notified from the control unit of the smartphone 101, and waits until it determines that the distance measurement instruction has been notified.
  • When it is determined in step S25 that the distance measurement instruction has been notified, the process returns to step S22, and the measurement in the short-distance mode is executed again.
  • If it is determined in step S23 that no foreign matter has been detected, the process proceeds to step S26, where the control unit 21 sets the measurement mode to the normal mode, and in step S27 the distance measuring device 1 performs the measurement in the normal mode. Then, in step S28, the control unit 21 outputs the measurement result of the normal mode. That is, the control unit 21 generates a depth map in which the depth value obtained as a result of the measurement in the normal mode is stored as the pixel value of each pixel 31 and a reliability map in which the reliability conf obtained as a result of the measurement in the normal mode is stored as the pixel value of each pixel 31, outputs them to the control unit of the smartphone 101, and ends the process.
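  • The foreign matter handling flow of FIG. 20 (steps S21 to S28) maps onto a small loop. In the sketch below, the measurement, detection, and host-notification functions are stubbed out with hypothetical behavior purely so that the control flow can be executed; none of them is an actual interface of the distance measuring device 1.

      import random

      def measure_in_mode(mode):
          # Stub: would run the distance measuring process of FIG. 11 in the given
          # mode and return the resulting depth map and reliability map.
          return {"mode": mode}, {"mode": mode}

      def foreign_matter_detected(depth_map, reliability_map):
          # Stub: in the real device, True if the depth map contains distances
          # corresponding to the surface of the cover glass 102.
          return random.random() < 0.3

      def notify_host_and_wait_for_wipe():
          # Stub: notify the smartphone control unit (S24), which shows the alert of
          # FIG. 21, and wait for the renewed distance measurement instruction (S25).
          pass

      def ranging_with_foreign_matter_check():
          while True:
              depth_map, conf_map = measure_in_mode("short")        # S21 / S22
              if not foreign_matter_detected(depth_map, conf_map):  # S23
                  break
              notify_host_and_wait_for_wipe()                       # S24 / S25
          depth_map, conf_map = measure_in_mode("normal")           # S26 / S27
          return depth_map, conf_map                                # S28

      print(ranging_with_foreign_matter_check())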
  • As described above, by setting the measurement mode to the normal mode, the short-distance mode, or the long-distance mode according to the detection target, the distance measuring device 1 can perform distance measurement limited to a desired distance range.
  • the distance measuring device 1 described above can be mounted on an electronic device such as a smartphone, a tablet terminal, a mobile phone, a personal computer, a game machine, a television receiver, a wearable terminal, a digital still camera, or a digital video camera.
  • FIG. 22 is a block diagram showing a configuration example of a smartphone as an electronic device equipped with a ranging module.
  • The smartphone 201 is configured by connecting a distance measuring module 202, an image pickup device 203, a display 204, a speaker 205, a microphone 206, a communication module 207, a sensor unit 208, a touch panel 209, and a control unit 210 via a bus 211. Further, the control unit 210 functions as an application processing unit 221 and an operation system processing unit 222 by the CPU executing a program.
  • the distance measuring device 1 of FIG. 1 is applied to the distance measuring module 202.
  • The distance measuring module 202 is arranged on the front of the smartphone 201 and, by performing distance measurement for the user of the smartphone 201, can output the depth value of the surface shape of the user's face, hand, fingers, or the like as a distance measurement result.
  • The image pickup device 203 is arranged on the front of the smartphone 201 and acquires an image of the user of the smartphone 201 by capturing the user as a subject. Although not shown, the image pickup device 203 may also be arranged on the back of the smartphone 201.
  • the display 204 displays an operation screen for performing processing by the application processing unit 221 and the operation system processing unit 222, an image captured by the image pickup device 203, and the like.
  • the speaker 205 and the microphone 206 for example, output the voice of the other party and collect the voice of the user when making a call by the smartphone 201.
  • the communication module 207 communicates via the communication network.
  • the sensor unit 208 senses speed, acceleration, proximity, etc., and the touch panel 209 acquires a touch operation by the user on the operation screen displayed on the display 204.
  • the application processing unit 221 performs processing for providing various services by the smartphone 201.
  • The application processing unit 221 can perform, for example, a process of creating a face by computer graphics that virtually reproduces the user's facial expression based on the depth value supplied from the distance measuring module 202 and displaying it on the display 204. Further, the application processing unit 221 can perform a process of creating, for example, three-dimensional shape data of an arbitrary three-dimensional object based on the depth value supplied from the distance measuring module 202.
  • the operation system processing unit 222 performs processing for realizing the basic functions and operations of the smartphone 201.
  • the operation system processing unit 222 can perform a process of authenticating the user's face and unlocking the smartphone 201 based on the depth value supplied from the distance measuring module 202.
  • Further, the operation system processing unit 222 can perform, for example, a process of recognizing the user's gestures based on the depth value supplied from the distance measuring module 202, and a process of inputting various operations according to the gestures.
  • In such an electronic device, the measurement mode can be switched to the normal mode, the short-distance mode, or the long-distance mode according to the purpose of the application, so that distance measurement can be performed within a desired distance range.
  • the technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any kind of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 23 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via the communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050.
  • a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I / F (interface) 12053 are shown as a functional configuration of the integrated control unit 12050.
  • the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device of a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating the braking force of the vehicle.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, blinkers or fog lamps.
  • Radio waves transmitted from a portable device that substitutes for a key, or signals of various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 receives inputs of these radio waves or signals and controls a vehicle door lock device, a power window device, a lamp, and the like.
  • the vehicle outside information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
  • the image pickup unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • the vehicle outside information detection unit 12030 causes the image pickup unit 12031 to capture an image of the outside of the vehicle and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing for detecting a person, a vehicle, an obstacle, a sign, characters on the road surface, or the like, or may perform distance detection processing.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of the light received.
  • the image pickup unit 12031 can output an electric signal as an image or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
  • the in-vehicle information detection unit 12040 detects the in-vehicle information.
  • a driver state detection unit 12041 that detects the driver's state is connected to the in-vehicle information detection unit 12040.
  • The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
  • The microcomputer 12051 can calculate control target values of the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation of the vehicle, follow-up driving based on the inter-vehicle distance, vehicle-speed-maintaining driving, vehicle collision warning, vehicle lane departure warning, and the like.
  • Further, the microcomputer 12051 can perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of anti-glare, such as switching from high beam to low beam, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
  • the audio image output unit 12052 transmits an output signal of at least one of audio and an image to an output device capable of visually or audibly notifying information to the passenger or the outside of the vehicle.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
  • the display unit 12062 may include, for example, at least one of an onboard display and a heads-up display.
  • FIG. 24 is a diagram showing an example of the installation position of the imaging unit 12031.
  • the vehicle 12100 has image pickup units 12101, 12102, 12103, 12104, 12105 as the image pickup unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, 12105 are provided at positions such as the front nose, side mirrors, rear bumpers, back doors, and the upper part of the windshield in the vehicle interior of the vehicle 12100, for example.
  • the imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided in the side mirrors mainly acquire images of the side of the vehicle 12100.
  • the imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100.
  • the images in front acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 24 shows an example of the photographing range of the imaging units 12101 to 12104.
  • The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the image pickup units 12101 to 12104 may be a stereo camera composed of a plurality of image pickup elements, or may be an image pickup element having pixels for phase difference detection.
  • For example, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (relative speed with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104, and can thereby extract, in particular, the closest three-dimensional object on the traveling path of the vehicle 12100 that travels at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100 as a preceding vehicle.
  • the microcomputer 12051 can set an inter-vehicle distance to be secured in front of the preceding vehicle in advance, and can perform automatic braking control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automatic driving or the like in which the vehicle travels autonomously without depending on the operation of the driver.
  • For example, the microcomputer 12051 classifies three-dimensional object data related to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects based on the distance information obtained from the imaging units 12101 to 12104, extracts them, and uses them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. Then, the microcomputer 12051 determines the collision risk indicating the degree of risk of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of a collision, it can provide driving support for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured image of the imaging units 12101 to 12104.
  • Such pedestrian recognition is performed by, for example, a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not the object is a pedestrian.
  • When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so as to superimpose and display a rectangular contour line for emphasizing the recognized pedestrian. Further, the audio image output unit 12052 may control the display unit 12062 so as to display an icon or the like indicating the pedestrian at a desired position.
  • the above is an example of a vehicle control system to which the technology according to the present disclosure can be applied.
  • the technique according to the present disclosure can be applied to the vehicle exterior information detection unit 12030 and the vehicle interior information detection unit 12040 among the configurations described above.
  • For example, by using the distance measurement by the distance measuring device 1, processing for recognizing the driver's gestures can be performed, various operations according to the gestures (for example, operations on the audio system, the navigation system, or the air conditioning system) can be input, and the driver's condition can be detected more accurately.
  • the distance measurement by the distance measuring device 1 can be used to recognize the unevenness of the road surface and reflect it in the control of the suspension.
  • The structure of the photodiode 41 of the pixel 31 can be applied to a distance measuring sensor having a structure that distributes charge to two charge storage units, such as a distance measuring sensor having a CAPD (Current Assisted Photonic Demodulator) structure or a gate-type distance measuring sensor that alternately applies the charge of the photodiode to two gates.
  • the configuration described as one device (or processing unit) may be divided and configured as a plurality of devices (or processing units).
  • the configurations described above as a plurality of devices (or processing units) may be collectively configured as one device (or processing unit).
  • a configuration other than the above may be added to the configuration of each device (or each processing unit).
  • Further, a part of the configuration of one device (or processing unit) may be included in the configuration of another device (or another processing unit).
  • In this specification, the system means a set of a plurality of components (devices, modules (parts), etc.), regardless of whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • the present technology can have the following configurations.
  • (1) A distance measuring device including: a light emitting source that emits irradiation light; a light receiving sensor that receives reflected light that is the irradiation light reflected by an object and returned; a coding unit that generates a coded light source modulation signal and a coded sensor modulation signal by coding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor; a light source delay unit that generates a delayed light source modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded light source modulation signal; and a sensor delay unit that generates a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded sensor modulation signal.
  • The distance measuring device according to (2) or (3), wherein the control unit supplies the predetermined delay amount determined when the measurement mode is the second mode to the light source delay unit, and supplies the predetermined delay amount determined when the measurement mode is the third mode to the sensor delay unit.
  • The distance measuring device according to (2), wherein the measurement mode includes a first mode in which the predetermined delay amount is zero and a second mode in which the predetermined delay amount is positive, and the control unit sets the measurement mode to the second mode and executes distance measurement, and then sets the measurement mode to the first mode and executes distance measurement.
  • (7) The distance measuring device according to any one of (1) to (6), further including a code generation unit that generates the predetermined code in units of an integral multiple of the period of the light source modulation signal and the sensor modulation signal.
  • A method for controlling a distance measuring device that has a light emitting source that emits irradiation light and a light receiving sensor that receives reflected light that is the irradiation light reflected by an object and returned, in which the distance measuring device generates a coded light source modulation signal and a coded sensor modulation signal by coding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor.
  • (10) An electronic apparatus including a distance measuring device that includes: a light emitting source that emits irradiation light; a light receiving sensor that receives reflected light that is the irradiation light reflected by an object and returned; a coding unit that generates a coded light source modulation signal and a coded sensor modulation signal by coding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor; a light source delay unit that generates a delayed light source modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded light source modulation signal; and a sensor delay unit that generates a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded sensor modulation signal.
  • 1 distance measuring device, 11 timing signal generation unit, 12 phase setting unit, 13 light source modulation unit, 14 sensor modulation unit, 15 code generation unit, 16 coding unit, 17 light source delay unit, 18 sensor delay unit, 19 light emitting source, 20 light receiving sensor, 21 control unit, 101 smartphone, 201 smartphone, 202 distance measuring module

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The present disclosure pertains to: a ranging device configured to be able to measure a distance limited to a desired measurement range; a control method thereof; and an electronic apparatus. The ranging device is provided with: a light-emitting source that emits radiated light; a light-receiving sensor that receives reflected light, which is the radiated light reflected by an object and returned; an encoding unit that generates an encoded light source modulation signal and an encoded sensor modulation signal by subjecting a light source modulation signal that controls the light-emission timing of the light-emitting source and a sensor modulation signal that controls the light-receiving timing of the light-receiving sensor to encoding corresponding to a prescribed code; a light source delay unit that generates a delayed light source modulation signal whose phase has been delayed by a prescribed delay amount with respect to the encoded light source modulation signal; and a sensor delay unit that generates a delayed sensor modulation signal whose phase has been delayed by a prescribed delay amount with respect to the encoded sensor modulation signal. The present disclosure is applicable, for example, to a ranging module that measures the distance to a subject.

Description

Distance measuring device, control method thereof, and electronic apparatus
 The present technology relates to a distance measuring device, a control method thereof, and an electronic apparatus, and more particularly to a distance measuring device, a control method thereof, and an electronic apparatus capable of measuring a distance limited to a desired measurement range.
 In recent years, advances in semiconductor technology have led to the miniaturization of distance measuring modules that measure the distance to an object. As a result, it has become possible, for example, to mount a distance measuring module on a mobile terminal such as a smartphone.
 As a distance measuring method used in a distance measuring module, there is, for example, a method called the ToF (Time of Flight) method. In the ToF method, light is emitted toward an object, the light reflected by the surface of the object is detected, and the distance to the object is calculated based on a measured value of the flight time of the light (see, for example, Patent Document 1).
Patent Document 1: JP-A-2017-150893 (Japanese Patent Application Laid-Open No. 2017-150893)
 In a ToF distance measuring device, there is a demand to measure distances limited to a desired measurement range.
 The present technology has been made in view of such a situation, and makes it possible to measure a distance limited to a desired measurement range.
 A distance measuring device according to a first aspect of the present technology includes: a light emitting source that emits irradiation light; a light receiving sensor that receives reflected light that is the irradiation light reflected by an object and returned; a coding unit that generates a coded light source modulation signal and a coded sensor modulation signal by coding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor; a light source delay unit that generates a delayed light source modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded light source modulation signal; and a sensor delay unit that generates a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded sensor modulation signal.
 In a method for controlling a distance measuring device according to a second aspect of the present technology, a distance measuring device having a light emitting source that emits irradiation light and a light receiving sensor that receives reflected light that is the irradiation light reflected by an object and returned generates a coded light source modulation signal and a coded sensor modulation signal by coding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor, and generates either a delayed light source modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded light source modulation signal or a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded sensor modulation signal.
 An electronic apparatus according to a third aspect of the present technology includes a distance measuring device including: a light emitting source that emits irradiation light; a light receiving sensor that receives reflected light that is the irradiation light reflected by an object and returned; a coding unit that generates a coded light source modulation signal and a coded sensor modulation signal by coding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor; a light source delay unit that generates a delayed light source modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded light source modulation signal; and a sensor delay unit that generates a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded sensor modulation signal.
 In the first to third aspects of the present technology, in a distance measuring device having a light emitting source that emits irradiation light and a light receiving sensor that receives reflected light that is the irradiation light reflected by an object and returned, a coded light source modulation signal and a coded sensor modulation signal are generated by coding, according to a predetermined code, a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor, and either a delayed light source modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded light source modulation signal or a delayed sensor modulation signal whose phase is delayed by a predetermined delay amount with respect to the coded sensor modulation signal is generated.
 The distance measuring device and the electronic apparatus may be independent devices or may be modules incorporated in other devices.
FIG. 1 is a block diagram showing a configuration example of a distance measuring device according to an embodiment to which the present technology is applied.
FIG. 2 is a diagram showing a detailed configuration example of the light receiving sensor.
FIG. 3 is a diagram explaining the operation of the light receiving sensor.
FIG. 4 is a diagram explaining the drive phase difference.
FIG. 5 is a diagram explaining the processing of the distance measuring device.
FIG. 6 is a diagram explaining the processing of the distance measuring device.
FIG. 7 is a diagram explaining the operation of the normal mode.
FIG. 8 is a diagram explaining the characteristics of the code period.
FIG. 9 is a diagram explaining the operation of the short-distance mode.
FIG. 10 is a diagram explaining the operation of the long-distance mode.
FIG. 11 is a flowchart explaining the distance measurement processing by the distance measuring device.
FIG. 12 is a diagram showing the relationship between the distance to an object and the signal strength in the normal mode with code period 1.
FIG. 13 is a diagram showing the relationship between the distance to an object and the signal strength in the normal mode with code period 2.
FIG. 14 is a diagram showing the relationship between the distance to an object and the signal strength in the long-distance mode with code period 1.
FIG. 15 is a diagram showing the relationship between the distance to an object and the signal strength in the long-distance mode with code period 2.
FIG. 16 is a diagram showing the relationship between the distance to an object and the signal strength in the short-distance mode with code period 1.
FIG. 17 is a diagram showing the relationship between the distance to an object and the signal strength in the short-distance mode with code period 2.
FIG. 18 is a diagram showing the relationship between the distance to an object and the signal strength in the short-distance mode with code period 1.
FIG. 19 is a cross-sectional view of a smartphone incorporating the distance measuring device.
FIG. 20 is a flowchart of distance measurement processing by the distance measuring device of FIG. 19.
FIG. 21 is a diagram showing an example of an alert screen of a smartphone.
FIG. 22 is a block diagram showing a configuration example of an electronic device to which the present technology is applied.
FIG. 23 is a block diagram showing an example of a schematic configuration of a vehicle control system.
FIG. 24 is an explanatory diagram showing an example of the installation positions of the vehicle exterior information detection unit and the imaging unit.
 Hereinafter, embodiments for carrying out the present technology (hereinafter referred to as embodiments) will be described with reference to the accompanying drawings. In the present specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and duplicate description is omitted. The description will be given in the following order.
1. Configuration example of the distance measuring device
2. Distance measurement principle of the Indirect ToF method
3. Explanation of the operation of each part of the distance measuring device
4. Operation in the normal mode
5. Operation in the short-distance mode
6. Operation in the long-distance mode
7. Flowchart of the distance measurement processing
8. Simulation results
9. Application example
10. Configuration example of an electronic device
11. Application example to a moving body
<1. Configuration example of the distance measuring device>
 FIG. 1 is a block diagram showing a configuration example of a distance measuring device according to an embodiment to which the present technology is applied.
 The distance measuring device 1 shown in FIG. 1 is a distance measuring module that performs distance measurement by the Indirect ToF method. It irradiates a predetermined object (measurement target) as a subject with light, receives the light (reflected light) that is the emitted light (irradiation light) reflected by the object, and thereby generates and outputs a depth map and a reliability map as distance information to the object.
 The distance measuring device 1 includes a timing signal generation unit 11, a phase setting unit 12, a light source modulation unit 13, a sensor modulation unit 14, a code generation unit 15, a coding unit 16, a light source delay unit 17, a sensor delay unit 18, a light emitting source 19, a light receiving sensor 20, and a control unit 21.
 The timing signal generation unit 11 generates a timing signal that serves as a reference for the light emitting operation of the light emitting source 19 and the light receiving operation of the light receiving sensor 20. Specifically, the timing signal generation unit 11 generates a modulation signal having a predetermined modulation frequency Fmod (for example, 20 MHz) and supplies it to the light source modulation unit 13, the sensor modulation unit 14, and the code generation unit 15. The modulation signal is, for example, as shown in FIG. 4, a pulse signal that repeats on (High) and off (Low) at the modulation frequency Fmod.
 The phase setting unit 12 sets the phase difference φD between the light emission timing of the light emitting source 19 and the light reception timing of the light receiving sensor 20 when distance measurement is performed by the Indirect ToF method, and supplies it to the light source modulation unit 13 and the sensor modulation unit 14. Hereinafter, the phase difference φD between the light emission timing and the light reception timing is referred to as the drive phase difference φD, to distinguish it from the phase difference φ detected according to the distance to the subject.
 When the drive phase difference φD is supplied from the phase setting unit 12, the light source modulation unit 13 generates a light source modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal supplied from the timing signal generation unit 11, and supplies it to the coding unit 16.
 When the drive phase difference φD is supplied from the phase setting unit 12, the sensor modulation unit 14 generates a sensor modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal supplied from the timing signal generation unit 11, and supplies it to the coding unit 16.
 Since it is sufficient that the phases of the light source modulation signal generated by the light source modulation unit 13 and the sensor modulation signal generated by the sensor modulation unit 14 differ by the drive phase difference φD, the phase setting unit 12 may supply the drive phase difference φD to either the light source modulation unit 13 or the sensor modulation unit 14. For example, when the phase setting unit 12 supplies the drive phase difference φD to the light source modulation unit 13, the light source modulation unit 13 generates a light source modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal and supplies it to the coding unit 16, while the sensor modulation unit 14 supplies the modulation signal from the timing signal generation unit 11 to the coding unit 16 as it is as the sensor modulation signal. Conversely, when the phase setting unit 12 supplies the drive phase difference φD to the sensor modulation unit 14, the sensor modulation unit 14 generates a sensor modulation signal whose phase is shifted by the drive phase difference φD with respect to the modulation signal and supplies it to the coding unit 16, while the light source modulation unit 13 supplies the modulation signal from the timing signal generation unit 11 to the coding unit 16 as it is as the light source modulation signal.
 The code generation unit 15 is supplied with the modulation signal, which is the reference timing signal, from the timing signal generation unit 11, and is supplied with the code period from the control unit 21. The code generation unit 15 randomly generates a code of 0 or 1 for each code period unit supplied from the control unit 21 and supplies it to the coding unit 16. One code period corresponds to one period of the modulation signal, and the code period unit supplied from the control unit 21 is an integral multiple of one period of the modulation signal. The code period is also referred to as the chip length.
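 As a rough illustration of this code generation, the following Python sketch draws a random 0/1 code that stays constant over a chip length equal to an integral multiple of the modulation period; the function name and the array representation are assumptions made for illustration and do not appear in the original text.

```python
import numpy as np

def generate_code(num_periods: int, chip_length: int, rng=None) -> np.ndarray:
    """Randomly draw a 0/1 code per chip, where one chip spans `chip_length`
    modulation periods (an integral multiple of the modulation period)."""
    rng = np.random.default_rng() if rng is None else rng
    num_chips = -(-num_periods // chip_length)           # ceiling division
    chips = rng.integers(0, 2, size=num_chips)           # random 0 or 1 per chip
    code = np.repeat(chips, chip_length)[:num_periods]   # hold each chip for chip_length periods
    return code
```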
 The coding unit 16 generates coded signals corresponding to the code supplied from the code generation unit 15 from the light source modulation signal supplied from the light source modulation unit 13 and the sensor modulation signal supplied from the sensor modulation unit 14. With respect to the supplied light source modulation signal or sensor modulation signal, the generated coded signal has the same phase when the code is 0 and the inverted phase when the code is 1. The coding unit 16 generates a coded light source modulation signal corresponding to the code supplied from the code generation unit 15 from the light source modulation signal supplied from the light source modulation unit 13, and supplies it to the light source delay unit 17. Similarly, the coding unit 16 generates a coded sensor modulation signal corresponding to the code supplied from the code generation unit 15 from the sensor modulation signal supplied from the sensor modulation unit 14, and supplies it to the sensor delay unit 18.
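 A minimal sketch of this encoding, assuming that both the modulation signal and the code are represented as sampled 0/1 arrays (a representation chosen only for illustration): inverting a binary square wave is equivalent to XOR-ing it with 1, so a code of 0 leaves a period unchanged and a code of 1 inverts its phase.

```python
import numpy as np

def encode(modulation: np.ndarray, code_per_period: np.ndarray) -> np.ndarray:
    """Apply the per-period code to a binary (0/1) modulation waveform.
    The waveform is assumed to contain a whole number of samples per period."""
    samples_per_period = len(modulation) // len(code_per_period)
    code_per_sample = np.repeat(code_per_period, samples_per_period)
    # code 0: keep the period as it is; code 1: invert it (180-degree phase flip)
    return np.bitwise_xor(modulation.astype(np.int64), code_per_sample.astype(np.int64))
```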
 The light source delay unit 17 generates a delayed light source modulation signal whose phase is delayed by the delay amount ΔD supplied from the control unit 21 with respect to the coded light source modulation signal supplied from the coding unit 16, and supplies it to the light emitting source 19.
 The sensor delay unit 18 generates a delayed sensor modulation signal whose phase is delayed by the delay amount ΔD supplied from the control unit 21 with respect to the coded sensor modulation signal supplied from the coding unit 16, and supplies it to the light receiving sensor 20.
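 A simplified sketch of the delay applied by the light source delay unit 17 or the sensor delay unit 18, assuming the coded signal is represented as a sampled waveform and the delay amount ΔD is expressed in samples smaller than the waveform length (both assumptions made for illustration):

```python
import numpy as np

def apply_delay(coded_signal: np.ndarray, delay_samples: int) -> np.ndarray:
    """Delay the coded modulation signal by delay_samples (phase delay ΔD).
    Samples shifted in at the start are held at the signal's initial level."""
    if delay_samples <= 0:
        return coded_signal.copy()
    delayed = np.empty_like(coded_signal)
    delayed[:delay_samples] = coded_signal[0]
    delayed[delay_samples:] = coded_signal[:-delay_samples]
    return delayed
```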
 The light emitting source 19 is composed of, for example, an infrared laser diode as a light source and a laser driver, emits light while being modulated at the timing corresponding to the delayed light source modulation signal supplied from the light source delay unit 17, and irradiates the object with the irradiation light.
 The light receiving sensor 20, which will be described in detail later with reference to FIG. 2, receives the reflected light from the object with a pixel array unit 32 in which a plurality of pixels 31 are two-dimensionally arranged in a matrix. A pixel signal corresponding to the amount of the received reflected light is supplied to the control unit 21.
 The control unit 21 controls the operation of the entire distance measuring device 1. For example, the control unit 21 outputs a trigger signal for starting operation to the timing signal generation unit 11, the phase setting unit 12, the code generation unit 15, and the like in accordance with a distance measurement instruction from the host control unit, which is the control unit of the host device in which the distance measuring device 1 is incorporated. Further, the control unit 21 determines the code period (chip length) and supplies it to the code generation unit 15, and determines the delay amount ΔD according to the measurement mode and supplies it to either the light source delay unit 17 or the sensor delay unit 18. Furthermore, the control unit 21 generates a depth value and a reliability for each pixel based on the pixel signals supplied from the light receiving sensor 20, generates a depth map in which the depth value is stored as the pixel value of each pixel and a reliability map in which the reliability is stored as the pixel value of each pixel, and outputs them to the host control unit.
 The distance measuring device 1 of FIG. 1 has the configuration described above.
 The distance measuring device 1 has, as measurement modes, a first measurement mode (hereinafter also referred to as the normal mode), a second measurement mode that focuses on distance measurement at shorter distances than the first measurement mode (hereinafter also referred to as the short-distance mode), and a third measurement mode that focuses on distance measurement at longer distances than the first measurement mode (hereinafter also referred to as the long-distance mode).
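 As a schematic sketch of how the control unit 21 might route the delay amount ΔD, following the configurations listed earlier (the delay amount is zero in the first mode, is supplied to the light source delay unit 17 in the second mode, and to the sensor delay unit 18 in the third mode); the enum and function names are illustrative and do not appear in the original text.

```python
from enum import Enum, auto

class MeasurementMode(Enum):
    NORMAL = auto()          # first mode: delay amount is zero
    SHORT_DISTANCE = auto()  # second mode: delay supplied to the light source delay unit
    LONG_DISTANCE = auto()   # third mode: delay supplied to the sensor delay unit

def route_delay(mode: MeasurementMode, delay_amount: float):
    """Return (light_source_delay, sensor_delay) according to the measurement mode."""
    if mode is MeasurementMode.NORMAL:
        return 0.0, 0.0
    if mode is MeasurementMode.SHORT_DISTANCE:
        return delay_amount, 0.0
    return 0.0, delay_amount
```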
<2. Distance measurement principle of the Indirect ToF method>
 Before describing the first to third measurement modes executed by the distance measuring device 1, the distance measurement principle of the Indirect ToF method on which they are based will be described with reference to FIGS. 2 to 4.
 In the description with reference to FIGS. 2 to 4, the pixel configuration of the light receiving sensor 20 is also described, and for ease of understanding, the operations are described as operations performed by the light receiving sensor 20.
 FIG. 2 shows a detailed configuration example of the light receiving sensor 20.
 The light receiving sensor 20 has a pixel array unit 32 in which the pixels 31 are two-dimensionally arranged in a matrix in the row direction and the column direction, and a drive control circuit 33 arranged in the peripheral region of the pixel array unit 32. The pixel 31 generates a charge corresponding to the amount of the received reflected light and outputs a pixel signal corresponding to the charge.
 The pixel 31 includes a photodiode 41, and FD (Floating Diffusion) units 42A and 42B as charge storage units that detect the charge photoelectrically converted by the photodiode 41. Hereinafter, for simplicity, the FD unit 42A is also referred to as tap A (first tap), and the FD unit 42B is also referred to as tap B (second tap).
 The pixel 31 includes a transfer transistor 43A, a selection transistor 44A, and a reset transistor 45A, which are a plurality of pixel transistors that control charge accumulation in the FD unit 42A as tap A, and a transfer transistor 43B, a selection transistor 44B, and a reset transistor 45B, which are a plurality of pixel transistors that control charge accumulation in the FD unit 42B as tap B.
 The operation of the pixel 31 will be described.
 First, a reset operation for resetting excess charge before the start of exposure is performed. Specifically, the drive control circuit 33 controls the distribution signals GDA and GDB and the reset signals RSA and RSB to High, and turns on the transfer transistor 43A and the reset transistor 45A on the tap A side and the transfer transistor 43B and the reset transistor 45B on the tap B side. As a result, the charges accumulated in the FD unit 42A and the FD unit 42B are reset, and the accumulated charge of the photodiode 41 is also reset. After the reset operation is completed, the transfer transistor 43A and the reset transistor 45A on the tap A side and the transfer transistor 43B and the reset transistor 45B on the tap B side are turned off again.
 Next, the exposure operation is started. Specifically, the drive control circuit 33 alternately controls the distribution signals GDA and GDB to High, and alternately turns on the transfer transistor 43A on the tap A side and the transfer transistor 43B on the tap B side. As a result, the charge generated in the photodiode 41 is distributed to the FD unit 42A as tap A or the FD unit 42B as tap B. The operation of distributing the charge generated in the photodiode 41 to tap A or tap B is periodically repeated for a time corresponding to the light emission period of one frame. The charge transferred via the transfer transistor 43A is sequentially accumulated in the FD unit 42A, and the charge transferred via the transfer transistor 43B is sequentially accumulated in the FD unit 42B.
 Then, after the end of the exposure period, the drive control circuit 33 controls the selection signals ROA and ROB to High, whereby a detection signal A corresponding to the accumulated charge of the FD unit 42A, which is tap A, and a detection signal B corresponding to the accumulated charge of the FD unit 42B, which is tap B, are output as pixel signals. That is, when the selection transistor 44A is turned on in accordance with the selection signal ROA, the detection signal A corresponding to the amount of charge accumulated in the FD unit 42A is output from the pixel 31 via the signal line 46A. Similarly, when the selection transistor 44B is turned on in accordance with the selection signal ROB, the detection signal B corresponding to the amount of charge accumulated in the FD unit 42B is output from the pixel 31 via the signal line 46B.
 In this way, the pixel 31 distributes the charge generated by the reflected light received by the photodiode 41 to tap A or tap B according to the distribution signals GDA and GDB, and outputs the detection signal A and the detection signal B.
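 The charge distribution just described can be sketched as follows, assuming for illustration that the incident reflected light and the distribution signal GDA are represented as sampled arrays (a simplification of the actual pixel circuit):

```python
import numpy as np

def accumulate_taps(reflected_light: np.ndarray, gda: np.ndarray):
    """Distribute photo-generated charge between tap A and tap B.
    `reflected_light` is the sampled intensity arriving at the photodiode and
    `gda` is the binary distribution signal (1: charge goes to tap A,
    0: charge goes to tap B); GDB is assumed to be the complement of GDA."""
    gdb = 1 - gda
    q_a = float(np.sum(reflected_light * gda))  # charge accumulated in FD unit 42A (tap A)
    q_b = float(np.sum(reflected_light * gdb))  # charge accumulated in FD unit 42B (tap B)
    return q_a, q_b
```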
 The depth value d corresponding to the distance from the distance measuring device 1 to the object can be calculated by the following equation (1).

\[ d = \frac{c \cdot \Delta t}{2} \qquad \text{(1)} \]

 In equation (1), Δt is the time from when the irradiation light is emitted from the light emitting source 19 until it is reflected by the object as the subject and is incident on the light receiving sensor 20, and c represents the speed of light.
 As the irradiation light emitted from the light emitting source 19, pulsed light with an emission pattern that repeats on and off at high speed at the modulation frequency Fmod, as shown in FIG. 3, is adopted. One period T of the emission pattern is 1/Fmod. In the light receiving sensor 20, the reflected light (light reception pattern) is detected with a phase shift corresponding to the time Δt it takes for the light to travel from the light emitting source 19 to the light receiving sensor 20. Assuming that the amount of phase shift (phase difference) between the emission pattern and the light reception pattern is φ, the time Δt can be calculated by the following equation (2).

\[ \Delta t = \frac{\varphi}{2\pi \cdot F_{\mathrm{mod}}} \qquad \text{(2)} \]
 Therefore, the depth value d from the distance measuring device 1 to the object can be calculated from equations (1) and (2) by the following equation (3).

\[ d = \frac{c \cdot \varphi}{4\pi \cdot F_{\mathrm{mod}}} \qquad \text{(3)} \]
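 Equation (3) translates directly into a small helper function; the function name and the units (phase in radians, Fmod in hertz, result in meters) are choices made only for this sketch.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # c, in meters per second

def depth_from_phase(phi: float, fmod: float) -> float:
    """Equation (3): d = c * phi / (4 * pi * Fmod)."""
    return SPEED_OF_LIGHT * phi / (4.0 * math.pi * fmod)
```

 For example, with the 20 MHz modulation frequency mentioned above, a phase difference of π corresponds to about 3.75 m, half of the roughly 7.5 m unambiguous range given by c/(2·Fmod).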
 Next, a method of calculating the above-described phase difference φ will be described.
 Each pixel 31 of the pixel array unit 32 formed in the light receiving sensor 20 repeats ON/OFF of the transfer transistors 43A and 43B at high speed as described above, and accumulates charge only during the ON period.
 The light receiving sensor 20 sequentially switches the ON/OFF execution timing of each pixel 31 of the pixel array unit 32, for example, in frame units, accumulates charge at each execution timing, and outputs detection signals corresponding to the accumulated charge.
 There are four types of ON/OFF execution timing, for example: phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees.
 The execution timing of phase 0 degrees is a timing at which the ON timing (light reception timing) of tap A or tap B of each pixel 31 of the pixel array unit 32 has the same phase as the emission timing of the irradiation light, that is, the emission pattern.
 The execution timing of phase 90 degrees is a timing at which the ON timing (light reception timing) of tap A or tap B of each pixel 31 of the pixel array unit 32 has a phase delayed by 90 degrees from the emission timing (emission pattern) of the irradiation light.
 The execution timing of phase 180 degrees is a timing at which the ON timing (light reception timing) of tap A or tap B of each pixel 31 of the pixel array unit 32 has a phase delayed by 180 degrees from the emission timing (emission pattern) of the irradiation light.
 The execution timing of phase 270 degrees is a timing at which the ON timing (light reception timing) of tap A or tap B of each pixel 31 of the pixel array unit 32 has a phase delayed by 270 degrees from the emission timing (emission pattern) of the irradiation light.
 As described above, the transfer transistor 43A on the tap A side and the transfer transistor 43B on the tap B side of each pixel 31 of the pixel array unit 32 are turned on alternately, so the ON timing of tap A and the ON timing of tap B have inverted phases. For example, when tap A of the pixel 31 is driven at the phase 0 degree execution timing, tap B is driven at the phase 180 degree execution timing, and when tap A of the pixel 31 is driven at the phase 90 degree execution timing, tap B is driven at the phase 270 degree execution timing. Therefore, since detection signals of two phases can be acquired in one frame, the light receiving sensor 20 only needs to receive light (capture images) for at least two frames in order to acquire the detection signals of the four phases of 0 degrees, 90 degrees, 180 degrees, and 270 degrees. A method of acquiring the four-phase detection signals by receiving light for two frames in this way and calculating the depth value d is called the 2Phase method. On the other hand, there is also a method called the 4Phase method, in which the light receiving sensor 20 acquires detection signals of all four phases of 0 degrees, 90 degrees, 180 degrees, and 270 degrees at each of tap A and tap B. The 4Phase method requires light reception (imaging) of four frames, but a result in which the characteristic variation between tap A and tap B is removed can be obtained.
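 The relationship between frames, taps, and phases described above can be written out as simple schedules; these data structures are purely illustrative.

```python
# 2Phase method: two frames provide all four phase samples, because tap B is
# always driven at the phase inverted (shifted by 180 degrees) from tap A.
TWO_PHASE_SCHEDULE = [
    {"tap_A": 0,  "tap_B": 180},   # frame 1
    {"tap_A": 90, "tap_B": 270},   # frame 2
]

# 4Phase method: four frames, in which each tap measures all four phases,
# which cancels the characteristic variation between tap A and tap B.
FOUR_PHASE_SCHEDULE = [
    {"tap_A": 0,   "tap_B": 180},  # frame 1
    {"tap_A": 90,  "tap_B": 270},  # frame 2
    {"tap_A": 180, "tap_B": 0},    # frame 3
    {"tap_A": 270, "tap_B": 90},   # frame 4
]
```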
 When the light receiving sensor 20 adopts, for example, the 4Phase method, it sequentially switches the light reception timing in frame units in the order of phase 0 degrees, phase 90 degrees, phase 180 degrees, and phase 270 degrees, and acquires the amount of received reflected light (accumulated charge) at each light reception timing. In FIG. 3, at the light reception timing of each phase, the timing at which the reflected light is incident is shaded.
 図3に示されるように、受光タイミングを、位相0度、位相90度、位相180度、および、位相270度としたときに蓄積された電荷を、それぞれ、Q、Q90、Q180、および、Q270とすると、位相差φは、Q、Q90、Q180、および、Q270を用いて、下記の式(4)で算出することができる。
Figure JPOXMLDOC01-appb-M000004
As shown in FIG. 3, when the light receiving timing is set to phase 0 degree, phase 90 degree, phase 180 degree, and phase 270 degree, the accumulated charges are Q 0 , Q 90 , Q 180 , respectively. and, when Q 270, the phase difference φ, Q 0, Q 90, Q 180 and, using a Q 270, can be calculated by the following equation (4).
Figure JPOXMLDOC01-appb-M000004
 式(4)で算出された位相差φを上記の式(3)に入力することにより、測距装置1から物体までのデプス値dを算出することができる。 By inputting the phase difference φ calculated by the equation (4) into the above equation (3), the depth value d from the distance measuring device 1 to the object can be calculated.
The reliability conf (confidence) is a value representing the intensity of the light received at each pixel, and is also called the signal intensity conf. It can be calculated, for example, by the following equation (5):

conf = √{(Q0 − Q180)² + (Q90 − Q270)²}   ... (5)
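As a concrete illustration of how equations (3) to (5) combine, the following is a minimal Python sketch, assuming that equation (3) has the standard Indirect ToF form d = c·φ/(4π·Fmod); the function name and the charge values are illustrative only.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def depth_and_confidence(q0, q90, q180, q270, fmod_hz):
    """Phase difference (eq. 4), depth value d (eq. 3, assumed standard form)
    and reliability conf (eq. 5) from the four accumulated charges."""
    i = q0 - q180                                  # in-phase component
    q = q90 - q270                                 # quadrature component
    phi = math.atan2(q, i) % (2.0 * math.pi)       # phase difference in [0, 2Pi)
    d = C * phi / (4.0 * math.pi * fmod_hz)        # depth value
    conf = math.hypot(i, q)                        # signal intensity / reliability
    return phi, d, conf

# e.g. a 20 MHz modulation frequency and arbitrary charge values:
print(depth_and_confidence(120.0, 200.0, 80.0, 40.0, 20e6))
```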
As shown in FIG. 4, the drive control circuit 33 of the pixel array unit 32 generates the distribution signals GDA and GDB so that the timing at which the charge generated in the photodiode 41 of each pixel 31 is accumulated in tap A or tap B (the light receiving timing) has a phase of 0, 90, 180, or 270 degrees with respect to the emission timing of the irradiation light. This phase difference between the emission timing and the light receiving timing is the drive phase difference φD set by the phase setting unit 12.

The example of FIG. 4 delays the light receiving timing by the drive phase difference φD (= 0, 90, 180, 270) with respect to the emission timing, but the result is the same if the emission timing is instead delayed by the drive phase difference φD with respect to the light receiving timing.
<3. Operation of each part of the distance measuring device>

Next, the processing of the distance measuring device 1 of FIG. 1 will be described, assuming the basic Indirect ToF ranging operation described above.

FIG. 5 is a diagram illustrating the processing from the timing signal generation unit 11 to the coding unit 16 of the distance measuring device 1.
The timing signal generation unit 11 generates a modulation signal of the modulation frequency Fmod and supplies it to the light source modulation unit 13, the sensor modulation unit 14, and the code generation unit 15.

The phase setting unit 12 sets the drive phase difference φD and supplies it to either the light source modulation unit 13 or the sensor modulation unit 14. FIG. 5 shows the case in which the drive phase difference φD set by the phase setting unit 12 is supplied to the light source modulation unit 13.

In that case, the light source modulation unit 13 generates a light source modulation signal by shifting the phase of the modulation signal from the timing signal generation unit 11 by the drive phase difference φD, and supplies it to the coding unit 16. The sensor modulation unit 14 supplies the modulation signal from the timing signal generation unit 11 to the coding unit 16 as-is as the sensor modulation signal. The sensor modulation signal shown in FIG. 5 is therefore identical to the modulation signal generated by the timing signal generation unit 11.
The control unit 21 determines the code period (Chip Length) and supplies it to the code generation unit 15. The code period is an integer multiple of the period of the light source modulation signal and the sensor modulation signal. In the example of FIG. 5, the code period is determined to be 2 (Chip Length = 2) and supplied from the control unit 21 to the code generation unit 15.

The code generation unit 15 randomly generates a code of 0 or 1 for each code period (two modulation periods in FIG. 5) and supplies it to the coding unit 16. In the example of FIG. 5, the codes are generated in the order "0", "1", "0", "1".

The coding unit 16 applies a phase shift corresponding to the code to the light source modulation signal supplied from the light source modulation unit 13 and to the sensor modulation signal supplied from the sensor modulation unit 14, generating a coded light source modulation signal and a coded sensor modulation signal as the coded signals. Specifically, the phase shift processing corresponding to the code generates a coded signal of the same phase when the code is 0, and a coded signal of inverted phase when the code is 1.

Accordingly, in FIG. 5, the coded light source modulation signal and the coded sensor modulation signal are identical to the light source modulation signal and the sensor modulation signal while the code is 0, and are signals whose phase is inverted (shifted by 180 degrees) with respect to the light source modulation signal and the sensor modulation signal while the code is 1.
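A minimal sketch of this coding step is shown below, assuming a square-wave modulation signal sampled at a fixed number of samples per cycle; the sampling choice and helper names are illustrative, not the patent's implementation.

```python
import random

def square_wave(n_cycles, samples_per_cycle):
    """50% duty square wave: 1 for the first half of each cycle, 0 for the rest."""
    half = samples_per_cycle // 2
    return [1 if (i % samples_per_cycle) < half else 0
            for i in range(n_cycles * samples_per_cycle)]

def bpsk_encode(signal, chip_length, samples_per_cycle, codes):
    """Keep the signal as-is while the code is 0, invert it (180-degree phase
    shift) while the code is 1; one code value covers Chip Length cycles."""
    chip_samples = chip_length * samples_per_cycle
    return [s if codes[i // chip_samples] == 0 else 1 - s
            for i, s in enumerate(signal)]

n_cycles, spc, chip_len = 8, 8, 2
codes = [random.randint(0, 1) for _ in range(n_cycles // chip_len)]
mod = square_wave(n_cycles, spc)
coded_light = bpsk_encode(mod, chip_len, spc, codes)
coded_sensor = bpsk_encode(mod, chip_len, spc, codes)  # same code sequence for both
```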
FIG. 6 is a diagram illustrating the processing of the light source delay unit 17 and the sensor delay unit 18 of the distance measuring device 1.

For ease of comparison, the upper part of FIG. 6 shows the same signals as FIG. 5, and their description is omitted.

The control unit 21 determines the delay amount ΔD according to the measurement mode and supplies it to either the light source delay unit 17 or the sensor delay unit 18.

The light source delay unit 17 generates a delayed light source modulation signal by delaying the phase of the coded light source modulation signal supplied from the coding unit 16 by the delay amount ΔD, and supplies it to the light emitting source 19. The sensor delay unit 18 generates a delayed sensor modulation signal by delaying the phase of the coded sensor modulation signal supplied from the coding unit 16 by the delay amount ΔD, and supplies it to the light receiving sensor 20.

The lower part of FIG. 6 shows the delayed light source modulation signal and the delayed sensor modulation signal when the delay amount ΔD is set on the sensor side (sensor delay unit 18) and when it is set on the light source side (light source delay unit 17).

When the delay amount ΔD is set on the sensor side (sensor delay unit 18), the delayed light source modulation signal is the same as the coded light source modulation signal in the upper part, and the delayed sensor modulation signal is the coded sensor modulation signal phase-delayed by the delay amount ΔD.

When the delay amount ΔD is set on the light source side (light source delay unit 17), the delayed light source modulation signal is the coded light source modulation signal phase-delayed by the delay amount ΔD, and the delayed sensor modulation signal is the same as the coded sensor modulation signal in the upper part.

The light emitting source 19 emits light modulated with the timing indicated by the delayed light source modulation signal supplied from the light source delay unit 17, and irradiates the object with the irradiation light.

The light receiving sensor 20 receives the reflected light at each pixel 31 with the timing indicated by the delayed sensor modulation signal supplied from the sensor delay unit 18, and outputs a pixel signal corresponding to the amount of reflected light received. Inside the light receiving sensor 20, the distribution signals GDA and GDB that turn on tap A and tap B alternately are generated on the basis of the delayed sensor modulation signal supplied from the sensor delay unit 18. For example, the delayed sensor modulation signal is used as the distribution signal GDA, and a signal obtained by inverting its phase is used as the distribution signal GDB.
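The delay and tap-distribution steps can be sketched in the same style; the wrap-around delay and the toy signal below are illustrative simplifications rather than the patent's circuitry.

```python
def delay_samples(signal, n):
    """Delay a sampled 0/1 signal by n samples (wrap-around for brevity)."""
    n %= len(signal)
    return signal[-n:] + signal[:-n]

def apply_mode_delay(coded_light, coded_sensor, delay_n, target):
    """Apply the delay amount to the light-source side, the sensor side, or neither."""
    if target == "light":        # short-distance mode: delay the light source
        return delay_samples(coded_light, delay_n), coded_sensor
    if target == "sensor":       # long-distance mode: delay the sensor
        return coded_light, delay_samples(coded_sensor, delay_n)
    return coded_light, coded_sensor   # normal mode (delay amount = 0)

# toy coded signals: two modulation cycles of 8 samples each
coded_light  = [1, 1, 1, 1, 0, 0, 0, 0] * 2
coded_sensor = [1, 1, 1, 1, 0, 0, 0, 0] * 2

dl, ds = apply_mode_delay(coded_light, coded_sensor, delay_n=8, target="sensor")
gda = ds                         # distribution signal GDA
gdb = [1 - s for s in ds]        # distribution signal GDB (phase inverted)
```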
As described above, the distance measuring device 1 encodes the light source modulation signal and the sensor modulation signal in the coding unit 16 in units of the code period, according to the code generated by the code generation unit 15. The resulting coded light source modulation signal and coded sensor modulation signal are BPSK (Binary Phase Shift Keying) signals whose phase is shifted between 0 and 180 degrees according to the binary code "0" or "1".

Then, in the light source delay unit 17 and the sensor delay unit 18, a delayed light source modulation signal and a delayed sensor modulation signal are generated by phase-delaying either the coded light source modulation signal or the coded sensor modulation signal by the delay amount ΔD, and the light emitting source 19 emits light and the light receiving sensor 20 receives light on the basis of these delayed signals.
<4. Normal mode operation>

Next, the signal intensity of the reflected light obtained by the operation of the distance measuring device 1 described above will be explained. The signal intensity of the reflected light is the reliability conf calculated by equation (5) from the detection signals of the pixel 31.

First, the case in which the delay amount ΔD is not taken into account, in other words the case of ΔD = 0, will be described.
The timing signals shown in the upper left of FIG. 7 show the waveform of the irradiation light and the light receiving timings of tap A and tap B, assuming that the object to be measured is ideally at a distance of zero. Because the object is assumed to be at zero distance, the waveform of the irradiation light is also the waveform of the reflected light. In the light receiving timings of tap A and tap B, the periods in which the reflected light is incident are shaded.

The timing signals shown in the lower left, on the other hand, show the waveform of the reflected light and the light receiving timings of tap A and tap B when the object to be measured is at a long distance, for example at a position corresponding to a phase of 2Pi (= 2π = 360°). The waveform of the irradiation light is the same as in the upper part. Again, the periods in which the reflected light is incident are shaded.

When the object is at zero distance as in the upper left, the signal intensity Conf detected by each pixel 31 of the light receiving sensor 20 is the signal intensity C2 shown in the graph on the right of FIG. 7. The signal intensity C2 corresponds to a drive phase difference φD other than 0; when the drive phase difference φD is 0, the signal intensity Conf takes its maximum value, the signal intensity C1.

In contrast, when the object to be measured is at a position corresponding to 2Pi in phase, the reflected light reaches the light receiving sensor 20 delayed by 2Pi, as shown in the lower left, so one cycle of the light receiving timing is lost and the signal intensity Conf becomes the signal intensity C3 shown in the graph on the right of FIG. 7.

The farther the object to be measured, the later the reflected light reaches the light receiving sensor 20, so, as shown in the graph on the right of FIG. 7, the signal intensity Conf attenuates with distance and becomes zero at a certain distance.

The distance of the zero point, at which the signal intensity Conf becomes zero, depends on the code period (Chip Length). When the code period is 2 (Chip Length = 2), the zero-point distance corresponds to 4Pi.
A of FIG. 8 is a graph showing the relationship between the distance to the object and the signal intensity Conf when the code period is 1 (Chip Length = 1). When the code period is 1, the zero-point distance corresponds to 2Pi.

B of FIG. 8 is a graph showing the same relationship when the code period is 8 (Chip Length = 8). When the code period is 8, the zero-point distance corresponds to 16Pi.

In other words, the distance of the zero point at which the signal intensity Conf becomes zero corresponds to (Chip Length × 2Pi).

Therefore, with the scheme of the distance measuring device 1, which performs ranging on the basis of signals BPSK-modulated according to the generated code, the signal intensity Conf beyond a predetermined distance can be brought to zero, so signals from beyond that distance can be cut off. Moreover, by controlling the code period (Chip Length), the distance at which signals are cut off can be set to an arbitrary distance. The distance measuring device 1 can therefore measure distances limited to a desired measurement range, although there is a trade-off with the S/N ratio within the effective ranging range.
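For a rough sense of scale, the cut-off point can be converted from phase to metric distance, under the assumption that equation (3) has the standard Indirect ToF form, so that a phase of 2Pi corresponds to a distance of c/(2·Fmod); the helper names below are illustrative.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def cutoff_distance_m(chip_length, fmod_hz):
    """Distance at which Conf reaches zero: (Chip Length x 2Pi) in phase terms."""
    return chip_length * C / (2.0 * fmod_hz)

def chip_length_for_range(max_range_m, fmod_hz):
    """Smallest integer Chip Length whose cut-off covers max_range_m."""
    return max(1, math.ceil(max_range_m / (C / (2.0 * fmod_hz))))

# e.g. at Fmod = 20 MHz one cycle (2Pi) is about 7.5 m, so Chip Length = 2
# places the zero point near 15 m.
print(cutoff_distance_m(2, 20e6), chip_length_for_range(12.0, 20e6))
```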
The distance measuring device 1 executes the measurement mode with the delay amount ΔD = 0, described with reference to FIGS. 7 and 8, as the normal mode (first measurement mode).
<5. Short-distance mode operation>

Next, the short-distance mode (second measurement mode) will be described with reference to FIG. 9. The code period is set to 2 (Chip Length = 2), as in FIG. 7.

The timing signals shown in the upper left of FIG. 9 show the waveform of the irradiation light in the normal mode shown in FIG. 7 and the light receiving timings of tap A and tap B. In the normal mode, as described above, the delay amount ΔD = 0.

The timing signals shown in the lower left of FIG. 9, on the other hand, show the waveform of the irradiation light in the short-distance mode and the light receiving timings of tap A and tap B.

In the short-distance mode, the control unit 21 sets the delay amount ΔD to a predetermined value and supplies it to the light source delay unit 17. The light source delay unit 17 generates a delayed light source modulation signal by delaying the phase of the coded light source modulation signal supplied from the coding unit 16 by the delay amount ΔD.

In the example shown in the lower left of FIG. 9, the delay amount ΔD is set to 2Pi. Because the emission timing of the irradiation light emitted from the light emitting source 19 is delayed by ΔD = 2Pi compared with the normal mode, the light also reaches the light receiving sensor 20 later, so the signal intensity Conf falls further below that of the normal mode.

As shown in the graph on the right of FIG. 9, the relationship between the distance to the object and the signal intensity Conf becomes that of the normal mode shifted toward the near side (the left), and the zero-point distance corresponds to 2Pi + φD.

As described above, in the short-distance mode the distance measuring device 1 sets the delay amount ΔD to a predetermined value and delays the emission of the irradiation light, thereby performing measurement limited to a range closer than the distance corresponding to (Chip Length × 2Pi) of the normal mode. How close a range is measured can be set arbitrarily by controlling the delay amount ΔD, although there is a trade-off with the S/N ratio within the effective ranging range.
<6. Long-distance mode operation>

Next, the long-distance mode (third measurement mode) will be described with reference to FIG. 10. The code period is set to 2 (Chip Length = 2), as in FIG. 7.

The timing signals shown in the upper left of FIG. 10 show the waveform of the irradiation light in the normal mode shown in FIG. 7 and the light receiving timings of tap A and tap B. In the normal mode, as described above, the delay amount ΔD = 0.

The timing signals shown in the lower left of FIG. 10, on the other hand, show the waveform of the irradiation light in the long-distance mode and the light receiving timings of tap A and tap B.

In the long-distance mode, the control unit 21 sets the delay amount ΔD to a predetermined value and supplies it to the sensor delay unit 18. The sensor delay unit 18 generates a delayed sensor modulation signal by delaying the phase of the coded sensor modulation signal supplied from the coding unit 16 by the delay amount ΔD.
In the example shown in the lower left of FIG. 10, the delay amount ΔD is set to 2Pi. When the light receiving timing of the light receiving sensor 20 is delayed by ΔD = 2Pi compared with the normal mode, the relationship between the distance to the object and the signal intensity Conf becomes that of the normal mode shifted toward the far side (the right), as shown in the graph on the right of FIG. 10, and the signal intensity Conf becomes maximum (signal intensity C1) at a distance corresponding to 2Pi.

As described above, in the long-distance mode the distance measuring device 1 sets the delay amount ΔD to a predetermined value and delays the light reception of the light receiving sensor 20, thereby performing measurement limited to a range farther than the distance corresponding to (Chip Length × 2Pi) of the normal mode. Although the ranging performance on the near side decreases, ranging can be performed such that the performance becomes maximum at a desired distance (2Pi in FIG. 10). How far a range is measured can be set arbitrarily by controlling the delay amount ΔD. Because the long-distance mode attenuates short-distance signals, it can, for example, reduce the influence of light scattered between the lens and the sensor and make the signal amount of a distant subject relatively large.
<7. Flowchart of the ranging process>

Next, the ranging process performed by the distance measuring device 1 will be described with reference to the flowchart of FIG. 11. This process starts, for example, when a ranging instruction is supplied, together with the measurement mode, from the host control unit of the host device in which the distance measuring device 1 is incorporated. Instead of the measurement mode, the host control unit may specify a target distance range, or it may specify both the measurement mode and the distance range.

First, in step S1, the control unit 21 determines the code period (Chip Length) and the delay amount ΔD according to the measurement mode. The determined code period is supplied to the code generation unit 15. The determined delay amount ΔD is supplied to the light source delay unit 17 when the measurement mode is the short-distance mode, and to the sensor delay unit 18 when the measurement mode is the long-distance mode. When the measurement mode is the normal mode, the delay amount ΔD = 0, so ΔD is supplied to neither the light source delay unit 17 nor the sensor delay unit 18.

In step S2, the timing signal generation unit 11 generates a modulation signal of the modulation frequency Fmod and supplies it to the light source modulation unit 13, the sensor modulation unit 14, and the code generation unit 15.

In step S3, the phase setting unit 12 sets the drive phase difference φD and supplies it to either the light source modulation unit 13 or the sensor modulation unit 14. Here, as in the examples of FIGS. 7 to 10, the drive phase difference φD is supplied to the light source modulation unit 13. In step S3 of the first frame of the 4Phase method, for example, the drive phase difference φD is set to 0.
In step S4, the light source modulation unit 13 and the sensor modulation unit 14 generate modulation signals corresponding to the drive phase difference φD from the phase setting unit 12. Specifically, the light source modulation unit 13 generates a light source modulation signal by shifting the phase of the modulation signal from the timing signal generation unit 11 by the drive phase difference φD and supplies it to the coding unit 16, while the sensor modulation unit 14 supplies the modulation signal from the timing signal generation unit 11 to the coding unit 16 as-is as the sensor modulation signal.

In step S5, the code generation unit 15 randomly generates a code of 0 or 1 for each code period set by the control unit 21 and supplies it to the coding unit 16. Here, as in the examples of FIGS. 7 to 10, if the code period is 2 (Chip Length = 2), the code generation unit 15 randomly generates a code of 0 or 1 every two periods of the modulation frequency Fmod and supplies it to the coding unit 16.

The processing of steps S4 and S5 may be performed in the reverse order or in parallel.
In step S6, the coding unit 16 applies the phase shift corresponding to the code to the light source modulation signal supplied from the light source modulation unit 13 and to the sensor modulation signal supplied from the sensor modulation unit 14, generating a coded light source modulation signal and a coded sensor modulation signal as the coded signals. The generated coded light source modulation signal is supplied to the light source delay unit 17, and the generated coded sensor modulation signal is supplied to the sensor delay unit 18.

In step S7, the light source delay unit 17 and the sensor delay unit 18 generate a delayed light source modulation signal and a delayed sensor modulation signal whose phase is delayed by the delay amount ΔD supplied from the control unit 21. Specifically, when the delay amount ΔD is supplied to the light source delay unit 17, the light source delay unit 17 generates a delayed light source modulation signal by delaying the phase of the coded light source modulation signal from the coding unit 16 by ΔD and supplies it to the light emitting source 19. When the delay amount ΔD is supplied to the sensor delay unit 18, the sensor delay unit 18 generates a delayed sensor modulation signal by delaying the phase of the coded sensor modulation signal from the coding unit 16 by ΔD and supplies it to the light receiving sensor 20. A unit to which no delay amount ΔD is supplied outputs the input modulation signal as it is.

In step S8, the distance measuring device 1 emits the irradiation light and receives the reflected light. Specifically, the light emitting source 19 emits light modulated with the timing indicated by the delayed light source modulation signal supplied from the light source delay unit 17 and irradiates the object with the irradiation light. Each pixel 31 of the light receiving sensor 20 receives the reflected light with the timing indicated by the delayed sensor modulation signal supplied from the sensor delay unit 18, and outputs a pixel signal corresponding to the amount of reflected light received to the control unit 21.

In step S9, the control unit 21 of the distance measuring device 1 determines whether the phase data of all frames has been acquired. Specifically, in the 2Phase method the control unit 21 determines whether light reception has been performed for two frames, and if so, determines that the phase data of all frames has been acquired. Similarly, in the 4Phase method the control unit 21 determines whether light reception has been performed for four frames, and if so, determines that the phase data of all frames has been acquired.

If it is determined in step S9 that the phase data of all frames has not yet been acquired, the process returns to step S3, and the processing of steps S3 to S7 described above is repeated. In the next step S3, in the case of the 4Phase method, the drive phase difference φD is set to, for example, 90 degrees.

If it is determined in step S9 that the phase data of all frames has been acquired, the process proceeds to step S10, and the control unit 21 generates and outputs a depth map and a reliability map. More specifically, on the basis of the acquired phase data (detection signals) of all frames, the control unit 21 calculates the depth value d by equation (3) and the reliability conf by equation (5) for each pixel 31 of the pixel array unit 32, then generates and outputs a depth map storing the depth value as the pixel value of each pixel 31 and a reliability map storing the reliability conf as the pixel value of each pixel 31.

This completes the ranging process performed by the distance measuring device 1.
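The overall flow of FIG. 11 can be condensed into host-side pseudologic as follows; the mode-to-parameter table and the two injected callables are assumptions made for illustration, not the patent's API or firmware.

```python
import math

def run_ranging(mode, fmod_hz, capture_frame, compute_depth_conf):
    """Sketch of FIG. 11: parameters from the mode, one frame per drive phase
    difference (4Phase case), then per-pixel depth and reliability maps."""
    # S1: code period (Chip Length), delay amount and delay target from the mode
    # (the concrete values here are example settings, not mandated by the patent)
    chip_length, delay, delay_target = {
        "normal": (2, 0.0, None),
        "near":   (2, 2 * math.pi, "light"),
        "far":    (2, 2 * math.pi, "sensor"),
    }[mode]

    phase_data = []
    for phi_d in (0, 90, 180, 270):          # S3-S9: one frame per phase
        frame = capture_frame(fmod_hz, chip_length, delay, delay_target, phi_d)
        phase_data.append(frame)

    # S10: depth map and reliability map computed from the collected phase data
    return compute_depth_conf(phase_data)
```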
<8. Simulation results>

With reference to FIGS. 12 to 18, the following describes the results of simulating the relationship between the distance to the object and the signal intensity Conf when the measurement mode of the distance measuring device 1 is set to the normal mode, the short-distance mode, or the long-distance mode, and the code period (Chip Length) and the delay amount ΔD are set to predetermined values.

FIG. 12 shows the coded modulation signals and a graph of the relationship between the distance to the object and the signal intensity Conf when the measurement mode is the normal mode and the code period is set to 1 (Chip Length = 1).

The upper part of FIG. 12 shows the code of 0 or 1 randomly generated by the code generation unit 15, and the coded light source modulation signal and coded sensor modulation signal generated by the coding unit 16. The coded light source modulation signal is shown for the drive phase difference φD set to each of 0°, 90°, 180°, and 270°.

The lower part of FIG. 12 shows the result of simulating the relationship between the distance to the object and the signal intensity Conf with the settings shown in the upper part.

Although the relationship is not a clean straight line, the signal intensity Conf takes its maximum value at a distance of zero, attenuates as the distance increases, and becomes zero at a distance corresponding to (Chip Length × 2Pi) = 2Pi.
FIG. 13 shows the coded modulation signals and a graph of the relationship between the distance to the object and the signal intensity Conf when the measurement mode is the normal mode and the code period is set to 2 (Chip Length = 2).

In FIG. 13, the types of signals shown in the upper part and the graph shown in the lower part are the same as in FIG. 12, so their detailed description is omitted and only the relationship between the distance to the object and the signal intensity Conf is described. The same applies to FIGS. 14 to 18 described later.

When the code period is set to 2 (Chip Length = 2), the zero-point distance at which the signal intensity Conf becomes zero is 4Pi, twice that of the case with a code period of 1 (Chip Length = 1) shown in FIG. 12.

The simulation results of FIGS. 12 and 13 show that, by controlling the code period (Chip Length), the zero-point distance, in other words the distance at which the signal is cut off, can be set to an arbitrary distance.
FIG. 14 shows the coded modulation signals and a graph of the relationship between the distance to the object and the signal intensity Conf when the measurement mode is the long-distance mode and the code period is set to 1 (Chip Length = 1).

When the measurement mode is the long-distance mode, the delay amount ΔD is supplied to the sensor delay unit 18, and a delayed sensor modulation signal whose phase is delayed by ΔD with respect to the coded sensor modulation signal is generated. In FIG. 14, the delay amount ΔD is 1Pi.

Comparing the graph of the distance to the object versus the signal intensity Conf in FIG. 14 with the normal-mode graph of FIG. 12 for the same code period of 1 (Chip Length = 1), the distance at which the signal intensity Conf peaks moves from 0 to Pi, shifting toward the far side. The zero-point distance also moves from 2Pi to 3Pi, shifting toward the far side.

FIG. 15 shows the coded modulation signals and a graph of the relationship between the distance to the object and the signal intensity Conf when the measurement mode is the long-distance mode and the code period is set to 2 (Chip Length = 2). In FIG. 15, the delay amount ΔD supplied to the sensor delay unit 18 is 2Pi.

Comparing the graph of FIG. 15 with the normal-mode graph of FIG. 13 for the same code period of 2 (Chip Length = 2), the distance at which the signal intensity Conf peaks moves from 0 to 2Pi, and the zero-point distance moves from 4Pi to 6Pi, both shifting toward the far side.

The simulation results of FIGS. 14 and 15 show that, by delaying the phase of the coded sensor modulation signal by a predetermined delay amount ΔD, the peak of the signal intensity Conf and the zero-point distance can be set to arbitrary distances farther than in the normal mode.
FIG. 16 shows the coded modulation signals and a graph of the relationship between the distance to the object and the signal intensity Conf when the measurement mode is the short-distance mode and the code period is set to 1 (Chip Length = 1).

When the measurement mode is the short-distance mode, the delay amount ΔD is supplied to the light source delay unit 17, and a delayed light source modulation signal whose phase is delayed by ΔD with respect to the coded light source modulation signal is generated. In FIG. 16, the delay amount ΔD is 1Pi.

Comparing the graph of FIG. 16 with the normal-mode graph of FIG. 12 for the same code period of 1 (Chip Length = 1), the peak value of the signal intensity Conf (the signal intensity Conf at zero distance) is 1/2 of that of the normal mode of FIG. 12. The zero-point distance is also Pi, 1/2 of the normal mode's 2Pi, shifting toward the near side.

FIG. 17 shows the coded modulation signals and a graph of the relationship between the distance to the object and the signal intensity Conf when the measurement mode is the short-distance mode and the code period is set to 2 (Chip Length = 2). In FIG. 17, the delay amount ΔD supplied to the light source delay unit 17 is 1Pi.

Comparing the graph of FIG. 17 with the normal-mode graph of FIG. 13 for the same code period of 2 (Chip Length = 2), the peak value of the signal intensity Conf (the signal intensity Conf at zero distance) is 3/4 of that of the normal mode of FIG. 13. The zero-point distance is also 3Pi, 3/4 of the normal mode's 4Pi, shifting toward the near side.

FIG. 18 shows the coded modulation signals and a graph of the relationship between the distance to the object and the signal intensity Conf when the measurement mode is the short-distance mode and the code period is set to 1 (Chip Length = 1). In FIG. 18, the delay amount ΔD supplied to the light source delay unit 17 is Pi × 7/4.

Comparing the graph of FIG. 18 with the normal-mode graph of FIG. 12 for the same code period of 1 (Chip Length = 1), the peak value of the signal intensity Conf (the signal intensity Conf at zero distance) is 0.16 times that of the normal mode of FIG. 12. The zero-point distance is Pi/4, 1/8 of the normal mode's 2Pi, shifting toward the near side.

The simulation results of FIGS. 16 to 18 show that, by delaying the phase of the coded light source modulation signal by a predetermined delay amount ΔD, the peak of the signal intensity Conf and the zero-point distance can be set to arbitrary distances nearer than in the normal mode.
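A first-order model consistent with these simulated trends treats the confidence as a triangle in phase: it peaks where the delayed light and delayed sensor timings coincide and falls to zero one code period (Chip Length × 2Pi) away. The sketch below is an approximation of the curves in FIGS. 12 to 18, not a reproduction of the patent's simulator.

```python
import math

def conf_model(distance_phase, chip_length, delta_d=0.0, mode="normal"):
    """Approximate normalized signal intensity versus distance (in phase):
    shift is -dD in the short-distance mode (light source delayed) and +dD in
    the long-distance mode (sensor delayed); decay span is Chip Length x 2Pi."""
    span = chip_length * 2.0 * math.pi
    shift = {"normal": 0.0, "near": -delta_d, "far": +delta_d}[mode]
    return max(0.0, 1.0 - abs(distance_phase - shift) / span)

# Roughly reproduces the reported values, e.g. FIG. 16 (Chip Length = 1,
# dD = 1Pi, short-distance mode): peak ~1/2 at zero distance, zero point at Pi.
print(conf_model(0.0, 1, math.pi, "near"))              # ~0.5
print(conf_model(math.pi, 1, math.pi, "near"))          # 0.0
print(conf_model(2 * math.pi, 2, 2 * math.pi, "far"))   # peak at 2Pi, as in FIG. 15
```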
<9. Application example>

Next, an application example in which the distance measuring device 1 switches between the measurement modes described above will be explained.

FIG. 19 is a cross-sectional view of a smartphone 101, an electronic apparatus incorporating the distance measuring device 1, seen from a plane parallel to the display surface.

As shown in FIG. 19, the distance measuring device 1 is incorporated in the smartphone 101. A cover glass 102 is arranged on the front of the display panel (not shown) of the smartphone 101, and the distance measuring device 1 is arranged behind the display panel (inside the body).

The irradiation light L1 emitted from the light emitting source 19 of the distance measuring device 1 passes through the cover glass 102 and illuminates a subject 103. The subject 103 is, for example, the user of the smartphone 101. The irradiation light L1 is reflected by the subject 103, passes through the cover glass 102 as reflected light L2, and enters the light receiving sensor 20 via a lens 104.

In the smartphone 101 incorporating the distance measuring device 1, foreign matter 121 such as dust or a fingerprint may adhere to the surface of the cover glass 102. Because the user does not know where inside the smartphone 101 the distance measuring device 1 is located, the user often does not notice the influence of the foreign matter 121 on the distance measuring device 1. When foreign matter 121 adheres to the cover glass 102, the irradiation light L1 is reflected by the foreign matter 121, refracted as reflected light L3, and enters the light receiving sensor 20.

Since the distance to the foreign matter 121 is extremely short compared with the distance to the subject 103, the intended measurement target, measuring in the short-distance mode makes it possible to perform a measurement that excludes the subject 103.

FIG. 20 is a flowchart of the ranging process of the distance measuring device 1, which measures the distance to the intended measurement target while also performing the foreign matter detection described with reference to FIG. 19. This process starts, for example, when a ranging instruction is supplied from the control unit (AP) of the smartphone 101 in which the distance measuring device 1 is incorporated.
First, in step S21, the control unit 21 sets the measurement mode to the short-distance mode, and in step S22 the distance measuring device 1 performs measurement in the short-distance mode. The code period (Chip Length) and the delay amount ΔD in the short-distance mode are set to values optimal for measuring the distance corresponding to reflection of the irradiation light L1 at the surface of the cover glass 102. The distance measuring device 1 executes the ranging process of FIG. 11, and the control unit 21 generates a depth map and a reliability map.

In step S23, the control unit 21 determines whether foreign matter has been detected, on the basis of the depth map and the reliability map acquired in the short-distance mode. For example, when a distance corresponding to the surface of the cover glass 102 appears in the depth map, the control unit 21 determines that foreign matter has been detected.

If it is determined in step S23 that foreign matter has been detected, the process proceeds to step S24, and the control unit 21 notifies the control unit of the smartphone 101 of the detection of the foreign matter.

The control unit of the smartphone 101, notified by the distance measuring device 1 of the detection of foreign matter, displays on the display an alert screen asking the user to remove the foreign matter, for example as shown in FIG. 21.

In FIG. 21, the display 141 of the smartphone 101 shows the message 142, "There is dust or a fingerprint on the front of the camera. Please wipe the area inside the red line clean.", together with a red area 143 indicating the range to wipe. The red area 143 corresponds to the position of the distance measuring device 1 inside the smartphone 101. The foreign matter 121 is removed when the user wipes the vicinity of the red area 143 in accordance with the message 142.

When the user finishes wiping, the user operates (presses) the wipe-done button 144 as a response to the message 142. When the wipe-done button 144 is operated by the user, a ranging instruction is again supplied from the control unit of the smartphone 101 to the distance measuring device 1.

After notifying the control unit of the smartphone 101 of the detection of foreign matter in step S24, the control unit 21 determines in step S25 whether a ranging instruction corresponding to the operation of the wipe-done button 144 has been notified from the control unit of the smartphone 101, and waits until it is determined that a ranging instruction has been notified. When it is determined in step S25 that a ranging instruction has been notified, the process returns to step S22 and measurement in the short-distance mode is performed again.

If it is determined in step S23 that no foreign matter has been detected, the process proceeds to step S26, where the control unit 21 sets the measurement mode to the normal mode, and in step S27 the distance measuring device 1 performs measurement in the normal mode. Then, in step S28, the control unit 21 outputs the measurement result of the normal mode: it generates a depth map storing the depth values obtained by the normal-mode measurement as the pixel values of the pixels 31 and a reliability map storing the reliability conf obtained by the normal-mode measurement as the pixel values of the pixels 31, outputs them to the control unit of the smartphone 101, and ends the process.
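The application-side flow of FIG. 20 can be summarized as a short host-side sketch; device.measure(), foreign_object_detected(), and the UI helpers are hypothetical placeholder names, not the patent's or any real API.

```python
def measure_with_foreign_object_check(device, host_ui):
    """Measure in the short-distance mode until no foreign matter is detected
    on the cover glass, then run the real measurement in the normal mode."""
    while True:
        depth_map, conf_map = device.measure(mode="near")                 # S21-S22
        if not device.foreign_object_detected(depth_map, conf_map):      # S23
            break
        host_ui.show_wipe_alert()                                         # S24
        host_ui.wait_for_wipe_done_button()                               # S25
    return device.measure(mode="normal")                                  # S26-S28
```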
As described above, by setting the measurement mode to the normal mode, the short-distance mode, or the long-distance mode according to the detection target, the distance measuring device 1 can measure distances limited to a desired distance range.
<10.電子機器の構成例>
 上述した測距装置1は、例えば、スマートフォン、タブレット型端末、携帯電話機、パーソナルコンピュータ、ゲーム機、テレビ受像機、ウェアラブル端末、デジタルスチルカメラ、デジタルビデオカメラなどの電子機器に搭載することができる。
<10. Configuration example of electronic device>
The distance measuring device 1 described above can be mounted on an electronic device such as a smartphone, a tablet terminal, a mobile phone, a personal computer, a game machine, a television receiver, a wearable terminal, a digital still camera, or a digital video camera.
 図22は、測距モジュールを搭載した電子機器としてのスマートフォンの構成例を示すブロック図である。 FIG. 22 is a block diagram showing a configuration example of a smartphone as an electronic device equipped with a ranging module.
 図22に示すように、スマートフォン201は、測距モジュール202、撮像装置203、ディスプレイ204、スピーカ205、マイクロフォン206、通信モジュール207、センサユニット208、タッチパネル209、および制御ユニット210が、バス211を介して接続されて構成される。また、制御ユニット210では、CPUがプログラムを実行することによって、アプリケーション処理部221およびオペレーションシステム処理部222としての機能を備える。 As shown in FIG. 22, in the smartphone 201, the distance measuring module 202, the image pickup device 203, the display 204, the speaker 205, the microphone 206, the communication module 207, the sensor unit 208, the touch panel 209, and the control unit 210 are connected via the bus 211. Is connected and configured. Further, the control unit 210 has functions as an application processing unit 221 and an operation system processing unit 222 by executing a program by the CPU.
 測距モジュール202には、図1の測距装置1が適用される。例えば、測距モジュール202は、スマートフォン201の前面に配置され、スマートフォン201のユーザを対象とした測距を行うことにより、そのユーザの顔や手、指などの表面形状のデプス値を測距結果として出力することができる。 The distance measuring device 1 of FIG. 1 is applied to the distance measuring module 202. For example, the distance measuring module 202 is arranged in front of the smartphone 201, and by performing distance measurement for the user of the smartphone 201, the depth value of the surface shape of the user's face, hand, finger, etc. is measured as a distance measurement result. Can be output as.
 撮像装置203は、スマートフォン201の前面に配置され、スマートフォン201のユーザを被写体とした撮像を行うことにより、そのユーザが写された画像を取得する。なお、図示しないが、スマートフォン201の背面にも撮像装置203が配置された構成としてもよい。 The image pickup device 203 is arranged in front of the smartphone 201, and by taking an image of the user of the smartphone 201 as a subject, the image taken by the user is acquired. Although not shown, the image pickup device 203 may be arranged on the back surface of the smartphone 201.
 ディスプレイ204は、アプリケーション処理部221およびオペレーションシステム処理部222による処理を行うための操作画面や、撮像装置203が撮像した画像などを表示する。スピーカ205およびマイクロフォン206は、例えば、スマートフォン201により通話を行う際に、相手側の音声の出力、および、ユーザの音声の収音を行う。 The display 204 displays an operation screen for performing processing by the application processing unit 221 and the operation system processing unit 222, an image captured by the image pickup device 203, and the like. The speaker 205 and the microphone 206, for example, output the voice of the other party and collect the voice of the user when making a call by the smartphone 201.
 通信モジュール207は、通信ネットワークを介した通信を行う。センサユニット208は、速度や加速度、近接などをセンシングし、タッチパネル209は、ディスプレイ204に表示されている操作画面に対するユーザによるタッチ操作を取得する。 The communication module 207 communicates via the communication network. The sensor unit 208 senses speed, acceleration, proximity, etc., and the touch panel 209 acquires a touch operation by the user on the operation screen displayed on the display 204.
 アプリケーション処理部221は、スマートフォン201によって様々なサービスを提供するための処理を行う。例えば、アプリケーション処理部221は、測距モジュール202から供給されるデプス値に基づいて、ユーザの表情をバーチャルに再現したコンピュータグラフィックスによる顔を作成し、ディスプレイ204に表示する処理を行うことができる。また、アプリケーション処理部221は、測距モジュール202から供給されるデプス値に基づいて、例えば、任意の立体的な物体の三次元形状データを作成する処理を行うことができる。 The application processing unit 221 performs processing for providing various services with the smartphone 201. For example, based on the depth values supplied from the distance measuring module 202, the application processing unit 221 can create a computer-graphics face that virtually reproduces the user's facial expression and display it on the display 204. Further, based on the depth values supplied from the distance measuring module 202, the application processing unit 221 can perform processing of creating, for example, three-dimensional shape data of an arbitrary three-dimensional object.
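 As one concrete illustration of how three-dimensional shape data could be built from the supplied depth values, the following Python sketch back-projects a depth map into a point cloud using a pinhole camera model. The intrinsic parameters and the synthetic depth map are placeholder assumptions and do not describe the actual output format of the distance measuring module 202.

import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """depth: (H, W) array of depth values in meters.
    Returns an (N, 3) array of XYZ points for valid (depth > 0) pixels."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0
    z = depth[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    return np.stack([x, y, z], axis=1)

# Example with a synthetic 4x4 depth map and placeholder intrinsics.
depth = np.full((4, 4), 0.5)  # 0.5 m everywhere
points = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)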
 オペレーションシステム処理部222は、スマートフォン201の基本的な機能および動作を実現するための処理を行う。例えば、オペレーションシステム処理部222は、測距モジュール202から供給されるデプス値に基づいて、ユーザの顔を認証し、スマートフォン201のロックを解除する処理を行うことができる。また、オペレーションシステム処理部222は、測距モジュール202から供給されるデプス値に基づいて、例えば、ユーザのジェスチャを認識する処理を行い、そのジェスチャに従った各種の操作を入力する処理を行うことができる。 The operation system processing unit 222 performs processing for realizing the basic functions and operations of the smartphone 201. For example, based on the depth values supplied from the distance measuring module 202, the operation system processing unit 222 can authenticate the user's face and unlock the smartphone 201. Further, based on the depth values supplied from the distance measuring module 202, the operation system processing unit 222 can perform, for example, processing of recognizing the user's gestures and inputting various operations according to the gestures.
 このように構成されているスマートフォン201では、上述した測距装置1を適用することで、例えば、アプリケーションの目的に応じて、測定モードを通常モード、近距離モード、または、遠距離モードに切り替えて測定することができ、所望の距離範囲に限定した距離の測定を行うことができる。 In the smartphone 201 configured in this way, by applying the distance measuring device 1 described above, the measurement mode can be switched to the normal mode, the short distance mode, or the long distance mode according to the purpose of the application, for example, so that distance measurement limited to a desired distance range can be performed.
<11.移動体への応用例>
 本開示に係る技術(本技術)は、様々な製品へ応用することができる。例えば、本開示に係る技術は、自動車、電気自動車、ハイブリッド電気自動車、自動二輪車、自転車、パーソナルモビリティ、飛行機、ドローン、船舶、ロボット等のいずれかの種類の移動体に搭載される装置として実現されてもよい。
<11. Application example to a mobile body>
The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any kind of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
 図23は、本開示に係る技術が適用され得る移動体制御システムの一例である車両制御システムの概略的な構成例を示すブロック図である。 FIG. 23 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technology according to the present disclosure can be applied.
 車両制御システム12000は、通信ネットワーク12001を介して接続された複数の電子制御ユニットを備える。図23に示した例では、車両制御システム12000は、駆動系制御ユニット12010、ボディ系制御ユニット12020、車外情報検出ユニット12030、車内情報検出ユニット12040、及び統合制御ユニット12050を備える。また、統合制御ユニット12050の機能構成として、マイクロコンピュータ12051、音声画像出力部12052、及び車載ネットワークI/F(interface)12053が図示されている。 The vehicle control system 12000 includes a plurality of electronic control units connected via the communication network 12001. In the example shown in FIG. 23, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050. Further, as a functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I / F (interface) 12053 are shown.
 駆動系制御ユニット12010は、各種プログラムにしたがって車両の駆動系に関連する装置の動作を制御する。例えば、駆動系制御ユニット12010は、内燃機関又は駆動用モータ等の車両の駆動力を発生させるための駆動力発生装置、駆動力を車輪に伝達するための駆動力伝達機構、車両の舵角を調節するステアリング機構、及び、車両の制動力を発生させる制動装置等の制御装置として機能する。 The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
 ボディ系制御ユニット12020は、各種プログラムにしたがって車体に装備された各種装置の動作を制御する。例えば、ボディ系制御ユニット12020は、キーレスエントリシステム、スマートキーシステム、パワーウィンドウ装置、あるいは、ヘッドランプ、バックランプ、ブレーキランプ、ウィンカー又はフォグランプ等の各種ランプの制御装置として機能する。この場合、ボディ系制御ユニット12020には、鍵を代替する携帯機から発信される電波又は各種スイッチの信号が入力され得る。ボディ系制御ユニット12020は、これらの電波又は信号の入力を受け付け、車両のドアロック装置、パワーウィンドウ装置、ランプ等を制御する。 The body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, blinkers, or fog lamps. In this case, the body system control unit 12020 can receive radio waves transmitted from a portable device that substitutes for a key or signals from various switches. The body system control unit 12020 accepts the input of these radio waves or signals and controls the door lock device, the power window device, the lamps, and the like of the vehicle.
 車外情報検出ユニット12030は、車両制御システム12000を搭載した車両の外部の情報を検出する。例えば、車外情報検出ユニット12030には、撮像部12031が接続される。車外情報検出ユニット12030は、撮像部12031に車外の画像を撮像させるとともに、撮像された画像を受信する。車外情報検出ユニット12030は、受信した画像に基づいて、人、車、障害物、標識又は路面上の文字等の物体検出処理又は距離検出処理を行ってもよい。 The vehicle exterior information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000. For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the outside of the vehicle and receives the captured image. Based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing of objects such as people, vehicles, obstacles, signs, or characters on the road surface, or distance detection processing.
 撮像部12031は、光を受光し、その光の受光量に応じた電気信号を出力する光センサである。撮像部12031は、電気信号を画像として出力することもできるし、測距の情報として出力することもできる。また、撮像部12031が受光する光は、可視光であっても良いし、赤外線等の非可視光であっても良い。 The imaging unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of the light received. The image pickup unit 12031 can output an electric signal as an image or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
 車内情報検出ユニット12040は、車内の情報を検出する。車内情報検出ユニット12040には、例えば、運転者の状態を検出する運転者状態検出部12041が接続される。運転者状態検出部12041は、例えば運転者を撮像するカメラを含み、車内情報検出ユニット12040は、運転者状態検出部12041から入力される検出情報に基づいて、運転者の疲労度合い又は集中度合いを算出してもよいし、運転者が居眠りをしていないかを判別してもよい。 The vehicle interior information detection unit 12040 detects information inside the vehicle. For example, a driver state detection unit 12041 that detects the state of the driver is connected to the vehicle interior information detection unit 12040. The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing.
 マイクロコンピュータ12051は、車外情報検出ユニット12030又は車内情報検出ユニット12040で取得される車内外の情報に基づいて、駆動力発生装置、ステアリング機構又は制動装置の制御目標値を演算し、駆動系制御ユニット12010に対して制御指令を出力することができる。例えば、マイクロコンピュータ12051は、車両の衝突回避あるいは衝撃緩和、車間距離に基づく追従走行、車速維持走行、車両の衝突警告、又は車両のレーン逸脱警告等を含むADAS(Advanced Driver Assistance System)の機能実現を目的とした協調制御を行うことができる。 The microcomputer 12051 can calculate control target values for the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System) including collision avoidance or impact mitigation of the vehicle, follow-up driving based on the inter-vehicle distance, vehicle speed maintenance driving, collision warning of the vehicle, lane departure warning of the vehicle, and the like.
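 As a toy illustration of the follow-up driving based on inter-vehicle distance mentioned above, the Python sketch below turns a measured gap and relative speed into an acceleration command with a simple proportional law. The gains, limits, and target gap are assumptions for the sketch and do not represent the control actually performed by the microcomputer 12051.

def follow_control(distance_m, relative_speed_mps,
                   target_gap_m=30.0, kp=0.4, kv=0.8):
    """Return an acceleration command [m/s^2] that tries to keep
    the inter-vehicle gap near target_gap_m.
    relative_speed_mps > 0 means the gap is opening."""
    gap_error = distance_m - target_gap_m          # > 0: too far, speed up
    accel = kp * gap_error + kv * relative_speed_mps
    return max(-3.0, min(1.5, accel))              # clamp to a comfortable range

# Example: 22 m gap, closing at 1 m/s -> the command saturates at -3.0 m/s^2.
print(follow_control(distance_m=22.0, relative_speed_mps=-1.0))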
 また、マイクロコンピュータ12051は、車外情報検出ユニット12030又は車内情報検出ユニット12040で取得される車両の周囲の情報に基づいて駆動力発生装置、ステアリング機構又は制動装置等を制御することにより、運転者の操作に拠らずに自律的に走行する自動運転等を目的とした協調制御を行うことができる。 Further, the microcomputer 12051 can perform cooperative control for the purpose of automatic driving or the like in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
 また、マイクロコンピュータ12051は、車外情報検出ユニット12030で取得される車外の情報に基づいて、ボディ系制御ユニット12020に対して制御指令を出力することができる。例えば、マイクロコンピュータ12051は、車外情報検出ユニット12030で検知した先行車又は対向車の位置に応じてヘッドランプを制御し、ハイビームをロービームに切り替える等の防眩を図ることを目的とした協調制御を行うことができる。 Further, the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can perform cooperative control for the purpose of anti-glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching the high beam to the low beam.
 音声画像出力部12052は、車両の搭乗者又は車外に対して、視覚的又は聴覚的に情報を通知することが可能な出力装置へ音声及び画像のうちの少なくとも一方の出力信号を送信する。図23の例では、出力装置として、オーディオスピーカ12061、表示部12062及びインストルメントパネル12063が例示されている。表示部12062は、例えば、オンボードディスプレイ及びヘッドアップディスプレイの少なくとも一つを含んでいてもよい。 The audio image output unit 12052 transmits an output signal of at least one of audio and an image to an output device capable of visually or audibly notifying information to the passenger or the outside of the vehicle. In the example of FIG. 23, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices. The display unit 12062 may include, for example, at least one of an onboard display and a heads-up display.
 図24は、撮像部12031の設置位置の例を示す図である。 FIG. 24 is a diagram showing an example of the installation position of the imaging unit 12031.
 図24では、車両12100は、撮像部12031として、撮像部12101,12102,12103,12104,12105を有する。 In FIG. 24, the vehicle 12100 has image pickup units 12101, 12102, 12103, 12104, 12105 as the image pickup unit 12031.
 撮像部12101,12102,12103,12104,12105は、例えば、車両12100のフロントノーズ、サイドミラー、リアバンパ、バックドア及び車室内のフロントガラスの上部等の位置に設けられる。フロントノーズに備えられる撮像部12101及び車室内のフロントガラスの上部に備えられる撮像部12105は、主として車両12100の前方の画像を取得する。サイドミラーに備えられる撮像部12102,12103は、主として車両12100の側方の画像を取得する。リアバンパ又はバックドアに備えられる撮像部12104は、主として車両12100の後方の画像を取得する。撮像部12101及び12105で取得される前方の画像は、主として先行車両又は、歩行者、障害物、信号機、交通標識又は車線等の検出に用いられる。 The imaging units 12101, 12102, 12103, 12104, 12105 are provided at positions such as the front nose, side mirrors, rear bumpers, back doors, and the upper part of the windshield in the vehicle interior of the vehicle 12100, for example. The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100. The imaging units 12102 and 12103 provided in the side mirrors mainly acquire images of the side of the vehicle 12100. The imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100. The images in front acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
 なお、図24には、撮像部12101ないし12104の撮影範囲の一例が示されている。撮像範囲12111は、フロントノーズに設けられた撮像部12101の撮像範囲を示し、撮像範囲12112,12113は、それぞれサイドミラーに設けられた撮像部12102,12103の撮像範囲を示し、撮像範囲12114は、リアバンパ又はバックドアに設けられた撮像部12104の撮像範囲を示す。例えば、撮像部12101ないし12104で撮像された画像データが重ね合わせられることにより、車両12100を上方から見た俯瞰画像が得られる。 Note that FIG. 24 shows an example of the imaging ranges of the imaging units 12101 to 12104. The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as viewed from above can be obtained.
 撮像部12101ないし12104の少なくとも1つは、距離情報を取得する機能を有していてもよい。例えば、撮像部12101ないし12104の少なくとも1つは、複数の撮像素子からなるステレオカメラであってもよいし、位相差検出用の画素を有する撮像素子であってもよい。 At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the image pickup units 12101 to 12104 may be a stereo camera composed of a plurality of image pickup elements, or may be an image pickup element having pixels for phase difference detection.
 例えば、マイクロコンピュータ12051は、撮像部12101ないし12104から得られた距離情報を基に、撮像範囲12111ないし12114内における各立体物までの距離と、この距離の時間的変化(車両12100に対する相対速度)を求めることにより、特に車両12100の進行路上にある最も近い立体物で、車両12100と略同じ方向に所定の速度(例えば、0km/h以上)で走行する立体物を先行車として抽出することができる。さらに、マイクロコンピュータ12051は、先行車の手前に予め確保すべき車間距離を設定し、自動ブレーキ制御(追従停止制御も含む)や自動加速制御(追従発進制御も含む)等を行うことができる。このように運転者の操作に拠らずに自律的に走行する自動運転等を目的とした協調制御を行うことができる。 For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the closest three-dimensional object on the traveling path of the vehicle 12100 that travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, 0 km/h or more). Furthermore, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured in front of the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automatic driving or the like in which the vehicle travels autonomously without depending on the operation of the driver.
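 The selection of a preceding vehicle described above can be illustrated with the short Python sketch below. The DetectedObject structure, the lane half-width, and the speed threshold are assumptions introduced for the sketch; the actual microcomputer 12051 works on the distance information from the imaging units 12101 to 12104 in a form not specified here.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DetectedObject:
    distance_m: float   # longitudinal distance to the object
    lateral_m: float    # lateral offset from the own travel path
    speed_mps: float    # speed along the own travel direction

def select_preceding_vehicle(objects: List[DetectedObject],
                             lane_half_width_m: float = 1.8,
                             min_speed_mps: float = 0.0) -> Optional[DetectedObject]:
    """Pick the closest object on the own travel path that moves in roughly
    the same direction at or above min_speed_mps."""
    candidates = [o for o in objects
                  if abs(o.lateral_m) <= lane_half_width_m
                  and o.speed_mps >= min_speed_mps]
    return min(candidates, key=lambda o: o.distance_m, default=None)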
 例えば、マイクロコンピュータ12051は、撮像部12101ないし12104から得られた距離情報を元に、立体物に関する立体物データを、2輪車、普通車両、大型車両、歩行者、電柱等その他の立体物に分類して抽出し、障害物の自動回避に用いることができる。例えば、マイクロコンピュータ12051は、車両12100の周辺の障害物を、車両12100のドライバが視認可能な障害物と視認困難な障害物とに識別する。そして、マイクロコンピュータ12051は、各障害物との衝突の危険度を示す衝突リスクを判断し、衝突リスクが設定値以上で衝突可能性がある状況であるときには、オーディオスピーカ12061や表示部12062を介してドライバに警報を出力することや、駆動系制御ユニット12010を介して強制減速や回避操舵を行うことで、衝突回避のための運転支援を行うことができる。 For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify and extract three-dimensional object data related to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, and use the data for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. Then, the microcomputer 12051 determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of a collision, the microcomputer 12051 can provide driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
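 The collision-risk judgment described above can be sketched as follows, under the assumption that time-to-collision (TTC) is used as the risk measure. The document does not specify how the collision risk is actually computed, so both the metric and the thresholds here are illustrative only.

def collision_risk_action(distance_m, closing_speed_mps,
                          warn_ttc_s=2.5, brake_ttc_s=1.2):
    """Return an action based on time-to-collision.
    closing_speed_mps > 0 means the gap to the obstacle is shrinking."""
    if closing_speed_mps <= 0:
        return "none"                      # gap is stable or opening
    ttc = distance_m / closing_speed_mps   # seconds until contact at current rate
    if ttc < brake_ttc_s:
        return "forced_deceleration"
    if ttc < warn_ttc_s:
        return "driver_warning"
    return "none"

# Example: 18 m ahead, closing at 10 m/s -> TTC = 1.8 s -> warn the driver.
print(collision_risk_action(18.0, 10.0))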
 撮像部12101ないし12104の少なくとも1つは、赤外線を検出する赤外線カメラであってもよい。例えば、マイクロコンピュータ12051は、撮像部12101ないし12104の撮像画像中に歩行者が存在するか否かを判定することで歩行者を認識することができる。かかる歩行者の認識は、例えば赤外線カメラとしての撮像部12101ないし12104の撮像画像における特徴点を抽出する手順と、物体の輪郭を示す一連の特徴点にパターンマッチング処理を行って歩行者か否かを判別する手順によって行われる。マイクロコンピュータ12051が、撮像部12101ないし12104の撮像画像中に歩行者が存在すると判定し、歩行者を認識すると、音声画像出力部12052は、当該認識された歩行者に強調のための方形輪郭線を重畳表示するように、表示部12062を制御する。また、音声画像出力部12052は、歩行者を示すアイコン等を所望の位置に表示するように表示部12062を制御してもよい。 At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared light. For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed by, for example, a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not it is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so as to superimpose a rectangular contour line for emphasis on the recognized pedestrian. Further, the audio image output unit 12052 may control the display unit 12062 so as to display an icon or the like indicating the pedestrian at a desired position.
 以上、本開示に係る技術が適用され得る車両制御システムの一例について説明した。本開示に係る技術は、以上説明した構成のうち、車外情報検出ユニット12030や車内情報検出ユニット12040に適用され得る。具体的には、車外情報検出ユニット12030や車内情報検出ユニット12040として測距装置1による測距を利用することで、運転者のジェスチャを認識する処理を行い、そのジェスチャに従った各種(例えば、オーディオシステム、ナビゲーションシステム、エアーコンディショニングシステム)の操作を実行したり、より正確に運転者の状態を検出することができる。また、測距装置1による測距を利用して、路面の凹凸を認識して、サスペンションの制御に反映させたりすることができる。 An example of a vehicle control system to which the technology according to the present disclosure can be applied has been described above. The technology according to the present disclosure can be applied to the vehicle exterior information detection unit 12030 and the vehicle interior information detection unit 12040 among the configurations described above. Specifically, by using distance measurement by the distance measuring device 1 for the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, it is possible to perform processing of recognizing the driver's gestures, to execute various operations according to the gestures (for example, operations of an audio system, a navigation system, or an air conditioning system), and to detect the driver's state more accurately. Further, the distance measurement by the distance measuring device 1 can be used to recognize unevenness of the road surface and reflect it in the control of the suspension.
 なお、本技術は、Indirect ToF方式の中でもContinuous-Wave方式と称する、物体へ投射する光を振幅変調する方式に適用することができる。また、画素31のフォトダイオード41の構造としては、CAPD(Current Assisted Photonic Demodulator)構造の測距センサや、フォトダイオードの電荷を2つのゲートに交互にパルスを加えるゲート方式の測距センサなど、2つの電荷蓄積部に電荷を振り分ける構造の測距センサに適用することができる。 Note that the present technology can be applied to the Continuous-Wave method among indirect ToF methods, in which the light projected onto an object is amplitude-modulated. Further, as the structure of the photodiode 41 of the pixel 31, the present technology can be applied to ranging sensors having a structure that distributes charges to two charge storage units, such as a ranging sensor having a CAPD (Current Assisted Photonic Demodulator) structure and a gate-type ranging sensor in which pulses are alternately applied to two gates to distribute the charge of the photodiode.
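 For reference, the following Python sketch shows the textbook four-phase depth calculation generally associated with the Continuous-Wave indirect ToF method mentioned above. It is a generic formulation with synthetic sample values, not the signal processing actually implemented in the ranging sensors described here, and it omits phase unwrapping.

import math

C = 299_792_458.0  # speed of light [m/s]

def cw_itof_depth(a0, a90, a180, a270, f_mod_hz):
    """a0..a270: charge amounts sampled at 0/90/180/270 degree phase offsets.
    Returns the estimated distance in meters within the unambiguous range."""
    i = a0 - a180
    q = a90 - a270
    phase = math.atan2(q, i) % (2.0 * math.pi)       # phase shift of the reflection
    return C * phase / (4.0 * math.pi * f_mod_hz)    # d = c * phi / (4 * pi * f)

# Example: 20 MHz modulation with synthetic samples.
print(cw_itof_depth(110.0, 140.0, 90.0, 60.0, 20e6))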
 本技術の実施の形態は、上述した実施の形態に限定されるものではなく、本技術の要旨を逸脱しない範囲において種々の変更が可能である。 The embodiment of the present technology is not limited to the above-described embodiment, and various changes can be made without departing from the gist of the present technology.
 本明細書において複数説明した本技術は、矛盾が生じない限り、それぞれ独立に単体で実施することができる。もちろん、任意の複数の本技術を併用して実施することもできる。例えば、いずれかの実施の形態において説明した本技術の一部または全部を、他の実施の形態において説明した本技術の一部または全部と組み合わせて実施することもできる。また、上述した任意の本技術の一部または全部を、上述していない他の技術と併用して実施することもできる。 The present technologies described in this specification can each be implemented independently and alone as long as no contradiction arises. Of course, any plurality of the present technologies can be implemented in combination. For example, a part or all of the present technology described in any of the embodiments can be implemented in combination with a part or all of the present technology described in another embodiment. A part or all of any of the present technologies described above can also be implemented in combination with another technology not described above.
 また、例えば、1つの装置(または処理部)として説明した構成を分割し、複数の装置(または処理部)として構成するようにしてもよい。逆に、以上において複数の装置(または処理部)として説明した構成をまとめて1つの装置(または処理部)として構成されるようにしてもよい。また、各装置(または各処理部)の構成に上述した以外の構成を付加するようにしてももちろんよい。さらに、システム全体としての構成や動作が実質的に同じであれば、ある装置(または処理部)の構成の一部を他の装置(または他の処理部)の構成に含めるようにしてもよい。 Further, for example, the configuration described as one device (or processing unit) may be divided and configured as a plurality of devices (or processing units). Conversely, the configurations described above as a plurality of devices (or processing units) may be combined and configured as one device (or processing unit). Of course, a configuration other than those described above may be added to the configuration of each device (or each processing unit). Furthermore, as long as the configuration and operation of the system as a whole are substantially the same, a part of the configuration of one device (or processing unit) may be included in the configuration of another device (or another processing unit).
 さらに、本明細書において、システムとは、複数の構成要素(装置、モジュール(部品)等)の集合を意味し、すべての構成要素が同一筐体中にあるか否かは問わない。したがって、別個の筐体に収納され、ネットワークを介して接続されている複数の装置、及び、1つの筐体の中に複数のモジュールが収納されている1つの装置は、いずれも、システムである。 Furthermore, in this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
 なお、本明細書に記載された効果はあくまで例示であって限定されるものではなく、本明細書に記載されたもの以外の効果があってもよい。 It should be noted that the effects described in the present specification are merely examples and are not limited, and effects other than those described in the present specification may be obtained.
 なお、本技術は、以下の構成を取ることができる。
(1)
 照射光を照射する発光源と、
 前記照射光が物体で反射されて返ってきた反射光を受光する受光センサと、
 前記発光源の発光タイミングを制御する光源変調信号と、前記受光センサの受光タイミングを制御するセンサ変調信号とに対して所定の符号に対応した符号化を行うことにより、符号化光源変調信号と符号化センサ変調信号とを生成する符号化部と、
 前記符号化光源変調信号に対して、所定の遅延量だけ位相を遅延させた遅延光源変調信号を生成する光源遅延部と、
 前記符号化センサ変調信号に対して、所定の遅延量だけ位相を遅延させた遅延センサ変調信号を生成するセンサ遅延部と
 を備える測距装置。
(2)
 前記所定の遅延量を決定し、決定した前記所定の遅延量を、前記光源遅延部または前記センサ遅延部のいずれか一方に供給する制御部をさらに備える
 前記(1)に記載の測距装置。
(3)
 前記制御部は、測定モードが第1モードの場合に、前記所定の遅延量をゼロに決定する
 前記(2)に記載の測距装置。
(4)
 前記制御部は、測定モードが第2モードの場合に、決定した前記所定の遅延量を、前記光源遅延部に供給し、測定モードが第3モードの場合に、決定した前記所定の遅延量を、前記センサ遅延部に供給する
 前記(2)または(3)に記載の測距装置。
(5)
 測定モードには、前記所定の遅延量がゼロの第1モードと、前記所定の遅延量が正の第2モードとがあり、
 前記制御部は、測定モードを前記第2モードに設定して測距を実行させた後、測定モードを前記第1モードに設定して測距を実行させる制御を行う
 前記(2)に記載の測距装置。
(6)
 前記制御部は、測定モードが前記第2モードの場合に、決定した前記所定の遅延量を、前記光源遅延部に供給する
 前記(5)に記載の測距装置。
(7)
 前記光源変調信号と前記センサ変調信号の周期の整数倍単位で、前記所定の符号を生成する符号生成部をさらに備える
 前記(1)乃至(6)のいずれかに記載の測距装置。
(8)
 前記符号生成部は、0または1の符号をランダムに生成する
 前記(7)に記載の測距装置。
(9)
 照射光を照射する発光源と、前記照射光が物体で反射されて返ってきた反射光を受光する受光センサとを有する測距装置が、
 前記発光源の発光タイミングを制御する光源変調信号と、前記受光センサの受光タイミングを制御するセンサ変調信号とに対して所定の符号に対応した符号化を行うことにより、符号化光源変調信号と符号化センサ変調信号とを生成し、
 前記符号化光源変調信号に対して、所定の遅延量だけ位相を遅延させた遅延光源変調信号か、または、前記符号化センサ変調信号に対して、所定の遅延量だけ位相を遅延させた遅延センサ変調信号を生成する
 測距装置の制御方法。
(10)
 照射光を照射する発光源と、
 前記照射光が物体で反射されて返ってきた反射光を受光する受光センサと、
 前記発光源の発光タイミングを制御する光源変調信号と、前記受光センサの受光タイミングを制御するセンサ変調信号とに対して所定の符号に対応した符号化を行うことにより、符号化光源変調信号と符号化センサ変調信号とを生成する符号化部と、
 前記符号化光源変調信号に対して、所定の遅延量だけ位相を遅延させた遅延光源変調信号を生成する光源遅延部と、
 前記符号化センサ変調信号に対して、所定の遅延量だけ位相を遅延させた遅延センサ変調信号を生成するセンサ遅延部と
 を備える測距装置
 を備える電子機器。
The present technology can have the following configurations.
(1)
A distance measuring device including:
a light emitting source that emits irradiation light;
a light receiving sensor that receives reflected light that is the irradiation light reflected by an object and returned;
an encoding unit that generates an encoded light source modulation signal and an encoded sensor modulation signal by performing encoding corresponding to a predetermined code on a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor;
a light source delay unit that generates a delayed light source modulation signal in which the phase of the encoded light source modulation signal is delayed by a predetermined delay amount; and
a sensor delay unit that generates a delayed sensor modulation signal in which the phase of the encoded sensor modulation signal is delayed by a predetermined delay amount.
(2)
The distance measuring device according to (1), further including a control unit that determines the predetermined delay amount and supplies the determined predetermined delay amount to either the light source delay unit or the sensor delay unit.
(3)
The distance measuring device according to (2), in which the control unit determines the predetermined delay amount to be zero when the measurement mode is a first mode.
(4)
The distance measuring device according to (2) or (3), in which the control unit supplies the determined predetermined delay amount to the light source delay unit when the measurement mode is a second mode, and supplies the determined predetermined delay amount to the sensor delay unit when the measurement mode is a third mode.
(5)
The distance measuring device according to (2), in which the measurement modes include a first mode in which the predetermined delay amount is zero and a second mode in which the predetermined delay amount is positive, and the control unit performs control to set the measurement mode to the second mode to execute distance measurement and then set the measurement mode to the first mode to execute distance measurement.
(6)
The distance measuring device according to (5), in which the control unit supplies the determined predetermined delay amount to the light source delay unit when the measurement mode is the second mode.
(7)
The distance measuring device according to any one of (1) to (6), further including a code generation unit that generates the predetermined code in units of an integral multiple of the period of the light source modulation signal and the sensor modulation signal.
(8)
The distance measuring device according to (7), in which the code generation unit randomly generates a code of 0 or 1.
(9)
A method for controlling a distance measuring device including a light emitting source that emits irradiation light and a light receiving sensor that receives reflected light that is the irradiation light reflected by an object and returned, the method including:
generating an encoded light source modulation signal and an encoded sensor modulation signal by performing encoding corresponding to a predetermined code on a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor; and
generating either a delayed light source modulation signal in which the phase of the encoded light source modulation signal is delayed by a predetermined delay amount, or a delayed sensor modulation signal in which the phase of the encoded sensor modulation signal is delayed by a predetermined delay amount.
(10)
An electronic apparatus including a distance measuring device, the distance measuring device including:
a light emitting source that emits irradiation light;
a light receiving sensor that receives reflected light that is the irradiation light reflected by an object and returned;
an encoding unit that generates an encoded light source modulation signal and an encoded sensor modulation signal by performing encoding corresponding to a predetermined code on a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor;
a light source delay unit that generates a delayed light source modulation signal in which the phase of the encoded light source modulation signal is delayed by a predetermined delay amount; and
a sensor delay unit that generates a delayed sensor modulation signal in which the phase of the encoded sensor modulation signal is delayed by a predetermined delay amount.
 1 測距装置, 11 タイミング信号生成部, 12 位相設定部, 13 光源変調部, 14 センサ変調部, 15 符号生成部, 16 符号化部, 17 光源遅延部, 18 センサ遅延部, 19 発光源, 20 受光センサ, 21 制御部, 101 スマートフォン, 201 スマートフォン, 202 測距モジュール 1 distance measuring device, 11 timing signal generation unit, 12 phase setting unit, 13 light source modulation unit, 14 sensor modulation unit, 15 code generation unit, 16 encoding unit, 17 light source delay unit, 18 sensor delay unit, 19 light emitting source, 20 light receiving sensor, 21 control unit, 101 smartphone, 201 smartphone, 202 ranging module

Claims (10)

  1.  照射光を照射する発光源と、
     前記照射光が物体で反射されて返ってきた反射光を受光する受光センサと、
     前記発光源の発光タイミングを制御する光源変調信号と、前記受光センサの受光タイミングを制御するセンサ変調信号とに対して所定の符号に対応した符号化を行うことにより、符号化光源変調信号と符号化センサ変調信号とを生成する符号化部と、
     前記符号化光源変調信号に対して、所定の遅延量だけ位相を遅延させた遅延光源変調信号を生成する光源遅延部と、
     前記符号化センサ変調信号に対して、所定の遅延量だけ位相を遅延させた遅延センサ変調信号を生成するセンサ遅延部と
     を備える測距装置。
    A distance measuring device including:
    a light emitting source that emits irradiation light;
    a light receiving sensor that receives reflected light that is the irradiation light reflected by an object and returned;
    an encoding unit that generates an encoded light source modulation signal and an encoded sensor modulation signal by performing encoding corresponding to a predetermined code on a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor;
    a light source delay unit that generates a delayed light source modulation signal in which the phase of the encoded light source modulation signal is delayed by a predetermined delay amount; and
    a sensor delay unit that generates a delayed sensor modulation signal in which the phase of the encoded sensor modulation signal is delayed by a predetermined delay amount.
  2.  前記所定の遅延量を決定し、決定した前記所定の遅延量を、前記光源遅延部または前記センサ遅延部のいずれか一方に供給する制御部をさらに備える
     請求項1に記載の測距装置。
    The distance measuring device according to claim 1, further comprising a control unit that determines the predetermined delay amount and supplies the determined predetermined delay amount to either the light source delay unit or the sensor delay unit.
  3.  前記制御部は、測定モードが第1モードの場合に、前記所定の遅延量をゼロに決定する
     請求項2に記載の測距装置。
    The distance measuring device according to claim 2, wherein the control unit determines the predetermined delay amount to be zero when the measurement mode is a first mode.
  4.  前記制御部は、測定モードが第2モードの場合に、決定した前記所定の遅延量を、前記光源遅延部に供給し、測定モードが第3モードの場合に、決定した前記所定の遅延量を、前記センサ遅延部に供給する
     請求項2に記載の測距装置。
    The distance measuring device according to claim 2, wherein the control unit supplies the determined predetermined delay amount to the light source delay unit when the measurement mode is a second mode, and supplies the determined predetermined delay amount to the sensor delay unit when the measurement mode is a third mode.
  5.  測定モードには、前記所定の遅延量がゼロの第1モードと、前記所定の遅延量が正の第2モードとがあり、
     前記制御部は、測定モードを前記第2モードに設定して測距を実行させた後、測定モードを前記第1モードに設定して測距を実行させる制御を行う
     請求項2に記載の測距装置。
    The distance measuring device according to claim 2, wherein the measurement modes include a first mode in which the predetermined delay amount is zero and a second mode in which the predetermined delay amount is positive, and the control unit performs control to set the measurement mode to the second mode to execute distance measurement and then set the measurement mode to the first mode to execute distance measurement.
  6.  前記制御部は、測定モードが前記第2モードの場合に、決定した前記所定の遅延量を、前記光源遅延部に供給する
     請求項5に記載の測距装置。
    The distance measuring device according to claim 5, wherein the control unit supplies the determined predetermined delay amount to the light source delay unit when the measurement mode is the second mode.
  7.  前記光源変調信号と前記センサ変調信号の周期の整数倍単位で、前記所定の符号を生成する符号生成部をさらに備える
     請求項1に記載の測距装置。
    The distance measuring device according to claim 1, further comprising a code generation unit that generates the predetermined code in units of integral multiples of the period of the light source modulation signal and the sensor modulation signal.
  8.  前記符号生成部は、0または1の符号をランダムに生成する
     請求項7に記載の測距装置。
    The distance measuring device according to claim 7, wherein the code generating unit randomly generates a code of 0 or 1.
  9.  照射光を照射する発光源と、前記照射光が物体で反射されて返ってきた反射光を受光する受光センサとを有する測距装置が、
     前記発光源の発光タイミングを制御する光源変調信号と、前記受光センサの受光タイミングを制御するセンサ変調信号とに対して所定の符号に対応した符号化を行うことにより、符号化光源変調信号と符号化センサ変調信号とを生成し、
     前記符号化光源変調信号に対して、所定の遅延量だけ位相を遅延させた遅延光源変調信号か、または、前記符号化センサ変調信号に対して、所定の遅延量だけ位相を遅延させた遅延センサ変調信号を生成する
     測距装置の制御方法。
    A method for controlling a distance measuring device including a light emitting source that emits irradiation light and a light receiving sensor that receives reflected light that is the irradiation light reflected by an object and returned, the method including:
    generating an encoded light source modulation signal and an encoded sensor modulation signal by performing encoding corresponding to a predetermined code on a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor; and
    generating either a delayed light source modulation signal in which the phase of the encoded light source modulation signal is delayed by a predetermined delay amount, or a delayed sensor modulation signal in which the phase of the encoded sensor modulation signal is delayed by a predetermined delay amount.
  10.  照射光を照射する発光源と、
     前記照射光が物体で反射されて返ってきた反射光を受光する受光センサと、
     前記発光源の発光タイミングを制御する光源変調信号と、前記受光センサの受光タイミングを制御するセンサ変調信号とに対して所定の符号に対応した符号化を行うことにより、符号化光源変調信号と符号化センサ変調信号とを生成する符号化部と、
     前記符号化光源変調信号に対して、所定の遅延量だけ位相を遅延させた遅延光源変調信号を生成する光源遅延部と、
     前記符号化センサ変調信号に対して、所定の遅延量だけ位相を遅延させた遅延センサ変調信号を生成するセンサ遅延部と
     を備える測距装置
     を備える電子機器。
    An electronic apparatus including a distance measuring device, the distance measuring device including:
    a light emitting source that emits irradiation light;
    a light receiving sensor that receives reflected light that is the irradiation light reflected by an object and returned;
    an encoding unit that generates an encoded light source modulation signal and an encoded sensor modulation signal by performing encoding corresponding to a predetermined code on a light source modulation signal that controls the light emission timing of the light emitting source and a sensor modulation signal that controls the light reception timing of the light receiving sensor;
    a light source delay unit that generates a delayed light source modulation signal in which the phase of the encoded light source modulation signal is delayed by a predetermined delay amount; and
    a sensor delay unit that generates a delayed sensor modulation signal in which the phase of the encoded sensor modulation signal is delayed by a predetermined delay amount.
PCT/JP2020/045758 2019-12-23 2020-12-09 Ranging device, method for controlling ranging device, and electronic apparatus WO2021131684A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019231535A JP2021099271A (en) 2019-12-23 2019-12-23 Distance measuring device, control method therefor, and electronic apparatus
JP2019-231535 2019-12-23

Publications (1)

Publication Number Publication Date
WO2021131684A1 true WO2021131684A1 (en) 2021-07-01

Family

ID=76541050

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/045758 WO2021131684A1 (en) 2019-12-23 2020-12-09 Ranging device, method for controlling ranging device, and electronic apparatus

Country Status (2)

Country Link
JP (1) JP2021099271A (en)
WO (1) WO2021131684A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11281744A (en) * 1998-01-28 1999-10-15 Nikon Corp Distance measuring instrument
JP2002181934A (en) * 2000-12-15 2002-06-26 Nikon Corp Apparatus and method for clocking as well as distance measuring apparatus
JP2005300233A (en) * 2004-04-07 2005-10-27 Denso Corp Radar apparatus for vehicle
WO2010098454A1 (en) * 2009-02-27 2010-09-02 パナソニック電工株式会社 Distance measuring apparatus
JP2016045066A (en) * 2014-08-22 2016-04-04 浜松ホトニクス株式会社 Distance-measuring method and distance-measuring device
WO2016075885A1 (en) * 2014-11-11 2016-05-19 パナソニックIpマネジメント株式会社 Distance detection device and distance detection method


Also Published As

Publication number Publication date
JP2021099271A (en) 2021-07-01

Similar Documents

Publication Publication Date Title
US10746874B2 (en) Ranging module, ranging system, and method of controlling ranging module
EP3572834A1 (en) Distance measurement processing apparatus, distance measurement module, distance measurement processing method, and program
WO2021085128A1 (en) Distance measurement device, measurement method, and distance measurement system
JP7321834B2 (en) Lighting device and ranging module
US11561303B2 (en) Ranging processing device, ranging module, ranging processing method, and program
WO2021065494A1 (en) Distance measurement sensor, signal processing method, and distance measurement module
US20220381917A1 (en) Lighting device, method for controlling lighting device, and distance measurement module
WO2020246264A1 (en) Distance measurement sensor, signal processing method, and distance measurement module
WO2020209079A1 (en) Distance measurement sensor, signal processing method, and distance measurement module
US20220113410A1 (en) Distance measuring device, distance measuring method, and program
WO2021065500A1 (en) Distance measurement sensor, signal processing method, and distance measurement module
WO2021065495A1 (en) Ranging sensor, signal processing method, and ranging module
WO2021131684A1 (en) Ranging device, method for controlling ranging device, and electronic apparatus
WO2021039458A1 (en) Distance measuring sensor, driving method therefor, and distance measuring module
WO2021106624A1 (en) Distance measurement sensor, distance measurement system, and electronic apparatus
JP7490653B2 (en) Measurement device, measurement method, and program
US20220413144A1 (en) Signal processing device, signal processing method, and distance measurement device
JP7476170B2 (en) Signal processing device, signal processing method, and ranging module
WO2021106623A1 (en) Distance measurement sensor, distance measurement system, and electronic apparatus
WO2022004441A1 (en) Ranging device and ranging method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20906321

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20906321

Country of ref document: EP

Kind code of ref document: A1