WO2014203658A1 - Distance measuring device and distance measuring method - Google Patents


Info

Publication number
WO2014203658A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
unit
state
image
captured image
Prior art date
Application number
PCT/JP2014/062897
Other languages
English (en)
Japanese (ja)
Inventor
自広 山谷
基広 浅野
Original Assignee
Konica Minolta, Inc. (コニカミノルタ株式会社)
Priority date
Filing date
Publication date
Application filed by Konica Minolta, Inc. (コニカミノルタ株式会社)
Priority to JP2015522667A
Publication of WO2014203658A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 Details
    • G01C 3/06 Use of electric means to obtain final indication
    • G01C 3/08 Use of electric radiation detectors
    • G01C 3/085 Use of electric radiation detectors with electronic parallax measurement

Definitions

  • The present invention relates to a distance measuring device and a distance measuring method for measuring, from a captured image, the distance to a predetermined object.
  • In the prior art, the distance to another vehicle is obtained as follows. First, another vehicle is detected from an image captured by a camera mounted on the host vehicle, and the detected image of the other vehicle is extracted and stored as a reference template. Next, the correlation between an input image newly captured by the camera and the reference template is computed while enlarging or reducing the input image or the reference template, whereby the new position and size of the other vehicle in the input image are detected. Then, the distance between the host vehicle and the other vehicle is obtained based on the detected position and size of the other vehicle.
  • the present invention has been made in view of the above circumstances, and an object thereof is to provide a distance measuring device and a distance measuring method capable of improving detection accuracy even when the imaging state changes.
  • A distance to a predetermined object included in the latest captured image is obtained based on the latest of the time-series captured images captured by the imaging unit and the reference image stored in the reference information storage unit.
  • The imaging state of the imaging unit is acquired, whether the acquired imaging state satisfies a predetermined condition is determined, and whether to discard the reference image stored in the reference information storage unit is determined based on the determination result. For this reason, such a distance measuring device and distance measuring method can improve detection accuracy even when the imaging state changes.
  • FIG. 1 is a block diagram showing the structure of the distance measuring device in the embodiment. FIG. 2 is a diagram for explaining the distance calculation method in the distance measuring device of the embodiment. FIG. 3 is a flowchart showing the operation of the distance measuring device in the embodiment.
  • FIG. 1 is a block diagram showing a configuration of a distance measuring device in the embodiment.
  • FIG. 2 is a diagram for explaining a distance calculation method in the distance measuring apparatus according to the embodiment.
  • FIG. 2A is a diagram for describing a reference image extracted from a captured image, and FIG. 2B is a diagram for describing a region that matches the reference image in the latest captured image, found by pattern matching (correlation) processing.
  • the distance measuring device is a device that measures a distance from the moving body to a predetermined object included in the captured image based on a captured image captured by an imaging unit mounted on the moving body.
  • the moving body is, for example, a vehicle such as an automobile or a train, a robot having a moving function, a movement support device for a visually impaired person, or the like.
  • The moving body may be capable of self-propulsion.
  • Such a distance measuring device DM of the present embodiment includes, for example, an imaging unit 1, a processing unit 2, a storage unit 3, and a notification unit 4 as shown in FIG.
  • the imaging unit 1 is a device that is mounted on the mobile body (not shown) and continuously images a subject.
  • the imaging unit 1 is disposed on the moving body such that the imaging direction (optical axis direction) coincides with the moving direction of the moving body.
  • When the imaging unit 1 is mounted on a vehicle, the imaging unit 1 is disposed on the dashboard, for example, with the imaging direction (optical axis direction) facing forward.
  • the imaging unit 1 includes, for example, an imaging optical system, a color or monochrome image sensor, and an image processing unit, and an optical image of a subject is formed on the imaging surface (light receiving surface) of the image sensor by the imaging optical system.
  • the optical image of the subject is photoelectrically converted by the image sensor, and the captured image of the subject is generated from the signal obtained by the photoelectric conversion by the image processing of the image processing unit.
  • the imaging unit 1 continuously images the subject at predetermined time intervals. For example, the imaging unit 1 continuously images the subject in time so as to obtain a predetermined frame rate such as 15 frames / second, 24 frames / second, and 30 frames / second.
  • the imaging unit 1 is connected to the processing unit 2 and sequentially outputs the generated captured images to the processing unit 2.
  • The storage unit 3 is connected to the processing unit 2 and includes a non-volatile memory element, such as a ROM (Read Only Memory) or an EEPROM (Electrically Erasable Programmable Read Only Memory), that stores in advance the various programs executed by the processing unit 2 and the data necessary for their execution, a volatile memory element such as a RAM (Random Access Memory) serving as the so-called working memory of the processing unit 2, and their peripheral circuits.
  • the storage unit 3 is functionally configured with a reference information storage unit 31 that stores a later-described reference image.
  • the processing unit 2 controls each unit of the distance measuring device DM according to the function of each unit in order to measure the distance from the moving body to a predetermined object included in the captured image.
  • the processing unit 2 includes, for example, a CPU (Central Processing Unit) and its peripheral circuits.
  • By executing a ranging program for measuring the distance from the moving body to a predetermined object included in the captured image, the processing unit 2 functionally implements a control unit 21, a distance calculation unit 22, a state acquisition unit 23, a discard determination unit 24, and a notification unit 25.
  • the control unit 21 controls the entire distance measuring device DM.
  • The distance calculation unit 22 stores one of the time-series captured images captured continuously in time by the imaging unit 1 in the reference information storage unit 31 as a reference image, and obtains the distance to a predetermined object included in the latest captured image based on the latest of the time-series captured images captured by the imaging unit 1 and the reference image stored in the reference information storage unit 31.
  • The reference image may be the entire captured image, with the image of the predetermined object extracted from it when the distance is obtained; in the present embodiment, however, the reference image is an image of the predetermined object extracted from the captured image, i.e., a part of the captured image.
  • At a predetermined timing (for example, when distance measurement is started or when no reference image is stored in the reference information storage unit 31), the distance calculation unit 22 of the present embodiment extracts an image of a predetermined object set in advance (an object image) from the captured image and stores the extracted object image in the reference information storage unit 31 as the reference image. More specifically, as shown in FIG. 2A, the distance calculation unit 22 detects a predetermined object Ob set in advance from a captured image Pp captured temporally before the latest captured image, extracts a rectangular image surrounding the detected predetermined object Ob (the object image) from the captured image Pp, and stores the extracted object image in the reference information storage unit 31 as the reference image TP. If the predetermined object is not present in the image, the process returns to the beginning without extracting a reference image TP.
  • The predetermined object Ob is an object that can obstruct the movement of the moving body, such as a vehicle (an automobile or a train), a two-wheeled vehicle (a bicycle or a motorcycle), a person, an animal, or an artificial object.
  • When the predetermined object Ob is an automobile, well-known conventional means (conventional methods) are used for vehicle detection by image processing, as disclosed in Patent Document 1: for example, (1) a first horizontal edge (lateral edge) sandwiched between white lines drawn on the road is detected from the bottom of the image upward, (2) a second horizontal edge above the first horizontal edge is detected, and (3) a pair of first and second vertical edges is detected in the region sandwiched between the first horizontal edge and the second horizontal edge. That is, for vehicle detection by image processing, known conventional means are used, such as detecting first and second vertical edges that form a top-bottom or left-right pair from a histogram (columnar diagram) of the edges sandwiched between the white lines.
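As an illustrative sketch (not the patent's own code), the per-row horizontal-edge strength used in step (1) of such edge-based vehicle detection could be computed as follows; the function name and threshold are hypothetical.

```python
import numpy as np

def horizontal_edge_rows(gray, th):
    """Return the row indices whose mean absolute vertical gradient is
    at least `th`, i.e. rows containing strong horizontal (lateral)
    edges such as a vehicle's shadow line or bumper edge."""
    grad = np.abs(np.diff(gray.astype(float), axis=0))  # difference between row i and i+1
    strength = grad.mean(axis=1)                        # one edge strength per row boundary
    return [i for i, s in enumerate(strength) if s >= th]
```

Scanning these rows from the bottom of the image upward gives candidate first and second horizontal edges; vertical edges would be found analogously along columns.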
  • The distance calculation unit 22 detects the predetermined object Ob from the latest captured image Pt by template matching using the reference image TP as a template while enlarging or reducing the reference image TP or the latest captured image Pt. The distance calculation unit 22 thereby detects the new position and size of the predetermined object Ob on the latest captured image Pt, and then obtains the distance between the moving body and the predetermined object Ob based on the detected new position and size.
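A minimal sketch of such multi-scale template matching, assuming normalized cross-correlation as the similarity measure (the patent does not fix the correlation function, and all names here are illustrative):

```python
import numpy as np

def ncc(patch, template):
    """Zero-mean normalized cross-correlation of two equally sized arrays."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def resize_nn(img, scale):
    """Nearest-neighbour resize used to enlarge or reduce the template."""
    h, w = img.shape
    nh, nw = max(1, int(round(h * scale))), max(1, int(round(w * scale)))
    rows = (np.arange(nh) / scale).astype(int).clip(0, h - 1)
    cols = (np.arange(nw) / scale).astype(int).clip(0, w - 1)
    return img[rows[:, None], cols[None, :]]

def match_template_multiscale(image, template, scales=(0.8, 1.0, 1.25)):
    """Return (score, (row, col), scale) of the best match over all scales:
    the new position and size of the object on the latest captured image."""
    best = (-2.0, (0, 0), 1.0)
    H, W = image.shape
    for s in scales:
        t = resize_nn(template, s)
        th, tw = t.shape
        if th > H or tw > W:
            continue
        for r in range(H - th + 1):
            for c in range(W - tw + 1):
                score = ncc(image[r:r + th, c:c + tw], t)
                if score > best[0]:
                    best = (score, (r, c), s)
    return best
```

The brute-force scan is for clarity; a real implementation would use an FFT-based correlation or a library routine.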
  • As disclosed in Patent Document 1, the direction θ of the object and the distance z to it are given by the following equations (1) and (2):
  • θ = atan(xc / f) ... (1)
  • z = f × w / wc ... (2)
  • Here, xc is the detected position of the center of the object Ob in the width direction on the image, so θ is the direction in which the object Ob exists; f is the focal length of the imaging unit 1; w is the actual width of the object Ob; and wc is the width of the object Ob on the image.
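Equations (1) and (2) can be written out directly; the function names are hypothetical, and the units must match pairwise (xc, f, wc in pixels; z comes out in the units of w):

```python
import math

def object_direction(xc, f):
    """Equation (1): direction (radians) of the object whose detected
    width-direction centre is offset xc on the image, for a camera of
    focal length f (both in pixels)."""
    return math.atan(xc / f)

def object_distance(f, w, wc):
    """Equation (2): z = f * w / wc, with f and wc in pixels and w the
    real width of the object."""
    return f * w / wc
```

For example, a 1.7 m wide car imaged 85 px wide by a camera with a 1000 px focal length would be 20 m away.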
  • the state acquisition unit 23 acquires the imaging state of the imaging unit 1. More specifically, the state acquisition unit 23 acquires the imaging state of the imaging unit 1 based on a predetermined feature amount in the latest captured image Pt captured by the imaging unit 1. For example, preferably, the state acquisition unit 23 acquires a statistical value (for example, an average value or a median value) of a pixel value or a luminance value as the predetermined feature amount. Further, for example, preferably, the state acquisition unit 23 acquires a frequency characteristic in the latest captured image as the predetermined feature amount. When calculating the predetermined feature amount, the state acquisition unit 23 may use a part of the latest captured image.
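A minimal sketch of such a feature computation (the function name, the `region` convention, and the `stat` switch are assumptions for illustration, not the patent's API):

```python
import numpy as np

def luminance_feature(frame, region=None, stat="median"):
    """Median (or mean) luminance of the latest captured image, optionally
    restricted to a sub-region (top, bottom, left, right), used as the
    imaging-state feature amount."""
    if region is not None:
        t, b, l, r = region
        frame = frame[t:b, l:r]
    return float(np.median(frame) if stat == "median" else frame.mean())
```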
  • the state acquisition unit 23 may acquire the imaging direction of the imaging unit 1 as the imaging state of the imaging unit 1.
  • the discard determination unit 24 determines whether or not the imaging state acquired by the state acquisition unit 23 satisfies a predetermined condition, and uses the reference image TP stored in the reference information storage unit 31 based on the determination result. It is determined whether or not to discard.
  • the discard determination unit 24 discards the reference image TP stored in the reference information storage unit 31 when it is determined that the imaging state acquired by the state acquisition unit 23 satisfies a predetermined condition. That is, the discard determination unit 24 discards the reference image TP in the reference information storage unit 31.
  • the discard determination unit 24 functionally includes a state determination unit 241, a holdability determination unit 242, and a discard control unit 243.
  • the state determination unit 241 determines whether the imaging state acquired by the state acquisition unit 23 satisfies a predetermined condition.
  • the holdability determination unit 242 determines whether or not to discard the reference image TP stored in the reference information storage unit 31 based on the determination result determined by the state determination unit 241.
  • the discard control unit 243 discards the reference image TP in the reference information storage unit 31 when it is determined that the imaging state acquired by the state acquisition unit 23 satisfies a predetermined condition.
  • When the number n of consecutive determinations by the discard determination unit 24 that the predetermined condition is satisfied reaches or exceeds a threshold value Thn, the notification unit 25 causes the notification unit 4 to notify the outside that the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition.
  • The notification unit 4 is a device that is connected to the processing unit 2 and notifies the outside, under the control of the notification unit 25, that the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition.
  • the notification unit 4 includes, for example, a light emitting diode that emits light, and turns on (including the case of blinking) in accordance with the control of the notification unit 25 to notify the outside.
  • the notification unit 4 is configured to include a buzzer or a speaker that emits a sound, and notifies the outside by issuing a warning sound according to the control of the notification unit 25.
  • the notification unit 4 is configured to include a display device such as a CRT display or a liquid crystal display, for example, and notifies the outside by displaying a message indicating the above in accordance with the control of the notification unit 25.
  • FIG. 3 is a flowchart showing the operation of the distance measuring apparatus in the embodiment.
  • When distance measurement is started by, for example, an input from an unillustrated start switch or a distance measurement start instruction switch, the imaging unit 1, under the control of the control unit 21 of the processing unit 2, captures images of the subject at a predetermined frame rate and sequentially outputs the time-series captured images to the processing unit 2.
  • In FIG. 3, the distance calculation unit 22 of the processing unit 2 first determines whether a reference image TP is stored in the reference information storage unit 31 of the storage unit 3 (S1). When the reference image TP is not stored in the reference information storage unit 31 (N), the distance calculation unit 22 executes the next process S3. On the other hand, when the reference image TP is stored in the reference information storage unit 31 (Y), the distance calculation unit 22 acquires the reference image TP for template matching against the captured image (input image) Pt (S2), and then executes process S3.
  • In process S3, the distance calculation unit 22 detects the predetermined object Ob. More specifically, when it is determined in process S1 that there is no reference image TP, the distance calculation unit 22 newly detects a predetermined object Ob in the captured image (input image) Pt by, for example, the known technique described above. On the other hand, when it is determined in process S1 that the reference image TP exists, the distance calculation unit 22 detects the predetermined object Ob by template matching on the captured image (input image) Pt using the reference image TP acquired in process S2 as a template, and detects the new position and size of the predetermined object Ob on the captured image (input image) Pt.
  • the state acquisition unit 23 of the processing unit 2 acquires the imaging state of the imaging unit 1, and notifies the state determination unit 241 of the discard determination unit 24 of the acquired imaging state of the imaging unit 1 (S4).
  • Next, the state determination unit 241 determines whether the imaging state of the imaging unit 1 satisfies the predetermined condition and notifies the holdability determination unit 242 of the determination result (S5). The acquisition of the imaging state in process S4 and the determination of the imaging state in process S5 will be described later.
  • When the imaging state of the imaging unit 1 does not satisfy the predetermined condition in process S5 (N), the state determination unit 241 notifies the distance calculation unit 22 accordingly, and the distance calculation unit 22 stores the image including the predetermined object Ob detected in process S3 in the reference information storage unit 31 as a new reference image TP (S6).
  • The distance calculation unit 22 then obtains the distance between the moving body and the predetermined object Ob based on the new position and size of the predetermined object Ob on the captured image (input image) Pt obtained in process S3, and returns to process S1 in order to process the next captured image (next input image) sequentially input in time series.
  • When the imaging state of the imaging unit 1 satisfies the predetermined condition in process S5 (Y), the state determination unit 241 notifies the holdability determination unit 242 of the discard determination unit 24 to that effect. Accordingly, the holdability determination unit 242 determines whether to discard the reference image TP stored in the reference information storage unit 31 based on the determination result of the state determination unit 241; that is, the holdability determination unit 242 determines whether a reference image TP is stored in the reference information storage unit 31 (S7).
  • When a reference image TP is stored, the holdability determination unit 242 notifies the discard control unit 243 of the discard determination unit 24 to that effect, and the discard control unit 243 causes the reference information storage unit 31 to discard the stored reference image TP (S9); the process then returns to process S1 in order to process the next captured image (next input image).
  • The reference image TP may be discarded by deleting its data from the reference information storage unit 31, or by releasing the storage area in which its data is stored.
  • The holdability determination unit 242 also notifies the notification unit 25 that the imaging state of the imaging unit 1 satisfies the predetermined condition. When the notification unit 25 is continuously notified that the imaging state of the imaging unit 1 satisfies the predetermined condition, it counts the number of notifications as the consecutive determination count n and determines whether the counted count n is equal to or greater than a threshold value Thn (S8).
  • When the consecutive determination count n is less than the threshold value Thn in process S8 (N), the notification unit 25 returns to process S1 in order to process the next captured image (next input image) sequentially input in time series.
  • On the other hand, when the result of the determination in process S8 is that the consecutive determination count n is equal to or greater than the threshold value Thn (Y), the notification unit 25 causes the notification unit 4 to notify the outside that the imaging state acquired by the state acquisition unit 23 satisfies the predetermined condition (S10).
  • the notification unit 25 returns the process to step S1 in order to process the next captured image (next input image) sequentially input in time series.
  • The processes S1 to S10 shown in FIG. 3 are executed for each captured image (input image) Pt sequentially input in time series as described above. Therefore, when it is continuously determined in process S8 that the imaging state of the imaging unit 1 satisfies the predetermined condition, the captured images (input images) Pt sequentially input in time series have been captured continuously in an imaging state that satisfies the predetermined condition.
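Condensing the flowchart: the S1–S10 loop above can be sketched as follows. The callback names (detect, match, distance, state_of, is_bad) and the simplified ordering of discard, counting, and notification are assumptions for illustration, not the patent's implementation.

```python
def ranging_loop(frames, detect, match, distance, state_of, is_bad, thn):
    ref = None      # reference image TP (S1: present or not)
    n_bad = 0       # consecutive determination count n (S8)
    results = []
    for frame in frames:
        # S1-S3: detect the object, by template matching if a reference exists
        obj = match(frame, ref) if ref is not None else detect(frame)
        # S4-S5: acquire the imaging state and test the predetermined condition
        if is_bad(state_of(frame)):
            ref = None                  # S7/S9: discard the stored reference image
            n_bad += 1
            if n_bad >= thn:            # S8/S10: warn after thn consecutive bad frames
                results.append(("warn", None))
            else:
                results.append(("skip", None))
        else:
            n_bad = 0
            ref = obj                   # S6: store the detection as the new reference
            results.append(("dist", distance(obj)))  # distance from position and size
    return results
```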
  • FIG. 4 is a diagram for explaining the determination method of the imaging state when the luminance value of the image is used as the imaging state of the imaging unit in the distance measuring device of the embodiment, for the case where a vehicle equipped with the imaging unit enters a tunnel.
  • FIG. 5 is a diagram for explaining the determination method of the imaging state when the luminance value of the image is used as the imaging state of the imaging unit in the distance measuring device of the embodiment, for the case where a vehicle equipped with the imaging unit exits a tunnel.
  • FIGS. 4A and 5A are diagrams showing the temporal change of the median luminance value in the time-series captured images (each frame); the horizontal axis represents the frame (time) and the vertical axis represents the luminance value.
  • FIGS. 4B and 5B are diagrams showing the temporal change of the amount of change in the median value; the horizontal axis represents the frame (time) and the vertical axis represents the amount of change in the luminance value.
  • FIG. 6 is a diagram schematically illustrating captured images captured by the imaging unit before and after water droplets adhere to the imaging unit in the distance measuring device of the embodiment. FIG. 6A schematically shows a captured image captured before water droplets adhere to the imaging unit, and FIG. 6B schematically shows a captured image captured after water droplets adhere to the imaging unit.
  • FIG. 7 is a diagram for describing a method for determining the imaging state when the frequency characteristics of the image are used as the imaging state of the imaging unit in the distance measuring apparatus according to the embodiment.
  • FIG. 7A shows the frequency characteristics of the captured images before and after water droplets adhere to the imaging unit.
  • FIG. 7B is a diagram for explaining a determination method according to the first aspect for determining whether water droplets have adhered to the imaging unit.
  • FIG. 7C is a diagram for explaining a determination method according to the second aspect for determining whether water droplets have adhered to the imaging unit.
  • FIG. 8 is a diagram for explaining an image region used by the state acquisition unit in the distance measuring apparatus according to the embodiment.
  • FIG. 8A is a diagram for explaining an image area of the first mode used by the state acquisition unit
  • FIG. 8B is a diagram for explaining an image region of the second mode used by the state acquisition unit.
  • FIG. 9 is a diagram for explaining a method for determining the imaging state when the steering angle of a vehicle equipped with the imaging unit is used as the imaging state of the imaging unit in the distance measuring apparatus according to the embodiment.
  • FIG. 9A is a diagram showing the time change of the steering angle, the horizontal axis is time, and the vertical axis is the steering angle.
  • FIG. 9B is a diagram showing the change over time in the change amount of the steering angle, the horizontal axis is time, and the vertical axis is the change amount of the steering angle.
  • The state acquisition unit 23 described above may acquire the imaging state of the imaging unit 1 based on a predetermined feature amount in the latest of the time-series captured images Pt captured by the imaging unit 1.
  • Such a distance measuring device DM uses the latest captured image (input image) Pt imaged by the imaging unit 1, and therefore does not require a separate device for acquiring the imaging state of the imaging unit 1.
  • The predetermined feature amount is a predetermined statistic of the pixels in the latest captured image Pt, such as the average of the pixel values of the pixels in the latest captured image Pt, the median of those pixel values, the average of the luminance values of the pixels in the latest captured image Pt, or the median of those luminance values.
  • Preferably, the predetermined feature amount is a median value. More specifically, the temporal change of the median value of the pixel values of the pixels in the latest captured image Pt is as shown in FIGS. 4 and 5.
  • The discard determination unit 24 can determine whether the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition by determining whether the amount of change in the median value of the latest captured image Pt acquired by the state acquisition unit 23a is equal to or greater than a predetermined threshold Th1.
  • That is, when the discard determination unit 24 determines that the amount of change in the median value of the latest captured image Pt is equal to or greater than the predetermined threshold Th1, it determines that the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition; when it determines that the amount of change is less than the predetermined threshold Th1, it determines that the imaging state does not satisfy the predetermined condition.
  • the predetermined threshold Th1 is statistically appropriately set by measuring a plurality of samples, for example.
  • The changes in the median value shown in FIGS. 4 and 5 can occur not only when entering or exiting a tunnel but also whenever the brightness changes, such as when passing under a relatively long overpass, when the lighting changes from front-lit to backlit or from backlit to front-lit, or when the imaging unit 1 is illuminated by the headlights of an oncoming vehicle.
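For example, the per-frame change amount plotted in FIGS. 4B and 5B and its comparison against Th1 can be sketched as follows (the function name and the sample values are hypothetical):

```python
def flag_brightness_jumps(medians, th1):
    """Given the per-frame median luminance values (FIG. 4A/5A style),
    return one flag per frame transition: True where the absolute change
    from the previous frame (FIG. 4B/5B style) is at least th1."""
    return [abs(cur - prev) >= th1 for prev, cur in zip(medians, medians[1:])]
```

Entering a tunnel produces a single large drop in the median, so exactly one transition is flagged.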
  • The distance measuring device DM may be configured to further include a reference information update unit that stores the reference image TP in the reference information storage unit 31. With such a reference information update unit, storing a captured image captured in such an unstable state in the reference information storage unit 31 is reduced, and the distance measuring device DM can improve detection accuracy.
  • Alternatively, the predetermined feature amount is a frequency characteristic of the latest captured image Pt.
  • The frequency characteristic of a captured image P is the frequency spectrum of the image, i.e., the distribution of the intensities of the frequency components contained in the captured image P, expressed as intensity versus frequency.
  • Before the water droplet Dp adheres, the captured image P1 is relatively clear as a whole, as schematically shown in FIG. 6A; after the water droplet Dp adheres, the captured image P2 is blurred in the region of the water droplet Dp, as schematically shown in FIG. 6B.
  • Accordingly, the frequency characteristic of the captured image P1 before the water droplet Dp adheres differs from the frequency characteristic of the captured image P2 after the water droplet Dp adheres: the frequency characteristic of the captured image P2 after adhesion has a profile with higher intensity on the low-frequency side (for example, the frequency range at or below a frequency fL) and lower intensity on the high-frequency side (for example, the frequency range at or above a frequency fH) than the frequency characteristic of the captured image P1 before adhesion.
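The low- and high-band intensities being compared here can be sketched with a 2-D FFT; the band edges fL and fH are expressed as fractions of the sampling frequency, and all names and default values are illustrative assumptions:

```python
import numpy as np

def band_intensities(image, f_lo=0.05, f_hi=0.25):
    """Mean spectral magnitude of an image at radial frequencies at or
    below f_lo (low band) and at or above f_hi (high band), the
    quantities compared against the thresholds in the text."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    h, w = image.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
    r = np.sqrt(fy * fy + fx * fx)          # radial frequency of each bin
    low = float(spec[r <= f_lo].mean())     # note: includes the DC bin
    high = float(spec[r >= f_hi].mean())
    return low, high
```

A droplet-blurred image would show a lower high-band value than a sharp one, matching the profile difference described above.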
  • Since the frequency characteristic of the captured image changes before and after the adhesion of foreign matter, the discard determination unit 24 can determine whether the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition by determining, for example as shown in FIG. 7B, whether the amount of change at the low frequency fL in the frequency characteristic of the latest captured image Pt acquired by the state acquisition unit 23a is equal to or greater than a predetermined threshold Th21. That is, when the discard determination unit 24 determines that the amount of change at the low frequency fL in the frequency characteristic of the latest captured image Pt is equal to or greater than the predetermined threshold Th21, it determines that the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition.
  • When the discard determination unit 24 determines that the amount of change is less than the predetermined threshold Th21, it determines that the imaging state does not satisfy the predetermined condition. Further, for example as shown in FIG. 7B, the discard determination unit 24 can determine whether the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition by determining whether the amount of change at the high frequency fH in the frequency characteristic of the latest captured image Pt acquired by the state acquisition unit 23a is equal to or greater than a predetermined threshold Th22.
  • That is, when the discard determination unit 24 determines that the amount of change at the high frequency fH in the frequency characteristic of the latest captured image Pt is equal to or greater than the predetermined threshold Th22, it determines that the imaging state satisfies the predetermined condition; when it determines that the amount of change is less than the predetermined threshold Th22, it determines that the imaging state does not satisfy the predetermined condition. Further, for example, the discard determination unit 24 can determine whether the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition by determining whether the amount of change at the low frequency fL in the frequency characteristic of the latest captured image Pt acquired by the state acquisition unit 23 is equal to or greater than the predetermined threshold Th21 and the amount of change at the high frequency fH is equal to or greater than the predetermined threshold Th22. That is, when the discard determination unit 24 determines that the amount of change at the low frequency fL is equal to or greater than the predetermined threshold Th21 and the amount of change at the high frequency fH is equal to or greater than the predetermined threshold Th22, it determines that the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition; otherwise, it determines that the imaging state does not satisfy the predetermined condition.
  • For example, the discard determination unit 24 can determine whether the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition by determining whether the intensity of the low frequency fL in the frequency characteristics of the latest captured image Pt is equal to or greater than a predetermined threshold Th31. That is, when the discard determination unit 24 determines that the intensity of the low frequency fL is equal to or greater than the predetermined threshold Th31, it determines that the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition; when it determines that the intensity is less than the predetermined threshold Th31, it determines that the imaging state does not satisfy the predetermined condition.
  • Further, for example, as illustrated in FIG. 7C, the discard determination unit 24 can determine whether the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition by determining whether the intensity of the high frequency fH in the frequency characteristics of the latest captured image Pt is equal to or greater than a predetermined threshold Th32. That is, when the discard determination unit 24 determines that the intensity of the high frequency fH is equal to or greater than the predetermined threshold Th32, it determines that the imaging state satisfies the predetermined condition; when it determines that the intensity is less than the predetermined threshold Th32, it determines that the imaging state does not satisfy the predetermined condition.
  • Further, for example, the discard determination unit 24 can determine whether the imaging state acquired by the state acquisition unit 23a satisfies the predetermined condition by determining whether the intensity of the low frequency fL is equal to or greater than the predetermined threshold Th31 and the intensity of the high frequency fH is equal to or greater than the predetermined threshold Th32; when both conditions hold, it determines that the imaging state satisfies the predetermined condition, and otherwise that it does not.
  • These predetermined threshold values Th21, Th22, Th31, and Th32 are set to statistically appropriate values, for example by measuring a plurality of samples.
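The frequency-based discard check above can be sketched as follows. This is an illustrative sketch only: the patent does not specify how the low- and high-frequency components fL and fH are computed, so the FFT-radius band split, the function names, and the concrete thresholds are all assumptions.

```python
import numpy as np

def band_energies(img, low_cut=0.1):
    """Hypothetical split of a grayscale frame's 2-D FFT magnitude into
    low-frequency (fL) and high-frequency (fH) energy, by normalized radius."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot((yy - h / 2) / (h / 2), (xx - w / 2) / (w / 2))
    f_low = spec[r < low_cut].sum()    # low-frequency energy fL
    f_high = spec[r >= low_cut].sum()  # high-frequency energy fH
    return f_low, f_high

def should_discard(prev_img, latest_img, th21, th22):
    """True when both the low- and high-frequency change amounts reach
    their thresholds Th21 and Th22 (the combined condition in the text)."""
    fl0, fh0 = band_energies(prev_img)
    fl1, fh1 = band_energies(latest_img)
    return abs(fl1 - fl0) >= th21 and abs(fh1 - fh0) >= th22
```

A sudden fog-out or lens fouling would shift both bands at once, which is the situation the combined Th21/Th22 test is meant to catch.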
  • In the above description, the state acquisition unit 23 acquires one feature amount, and the discard determination unit 24 determines the imaging state based on that one feature amount; however, the state acquisition unit 23 may acquire a plurality of feature amounts, and the discard determination unit 24 may determine the imaging state based on the plurality of feature amounts acquired by the state acquisition unit 23.
  • Preferably, the state acquisition unit 23a uses a part of the latest captured image Pt.
  • The part of the latest captured image Pt may be, for example, a rectangular region Pt11 extending from the lower end, which is closest to the moving body, up to the vanishing point in the latest captured image Pt1, as shown in FIG. 9A.
  • The area above the vanishing point of the road surface usually shows the sky, and can therefore be excluded from the area used to detect an obstacle to the movement of the moving body.
  • In this way, the distance measuring device DM can appropriately set the image area used to detect the feature amount from the captured image P. Further, since what matters is whether an obstacle to the movement of the moving body is present in the traveling area of the moving body, the part of the latest captured image Pt may be, for example, the traveling area Pt21 of the moving body, as shown in FIG. 9B.
  • The traveling area Pt21 of the moving body may be, for example, a triangular area (a virtual traveling area) bounded by both ends of the lower end, which is closest to the moving body, and the vanishing point in the latest captured image Pt2.
  • Alternatively, the traveling area Pt21 may be a triangular area (an actual traveling area) bounded by the lower end closest to the moving body in the latest captured image Pt2 and a pair of white lines drawn at both ends of the road surface.
  • The pair of white lines drawn at both ends of the road surface intersect at the vanishing point.
  • The vanishing point is the point in the image at which lines that are actually parallel appear to intersect; it arises because the captured image is formed in perspective projection.
  • The state acquisition unit 23 described above may be a state acquisition unit 23b that acquires the imaging direction of the imaging unit 1 as the imaging state of the imaging unit 1. Since such a distance measuring device DM uses the imaging direction of the imaging unit 1 as its imaging state, the moving direction of the moving body can serve as the imaging state of the imaging unit 1, which makes it possible to cope with changes in the moving direction of the moving body, such as a right or left turn.
  • Since the imaging unit 1 mounted on the moving body is normally arranged so that its imaging direction coincides with the moving direction of the moving body, as described above, the imaging direction of the imaging unit 1 can be obtained by acquiring the moving direction of the moving body. More specifically, for example, the distance measuring device DM described above is mounted on the moving body and further includes a gyro sensor that detects the moving direction of the moving body, and the state acquisition unit 23b acquires the moving direction of the moving body, that is, the imaging direction of the imaging unit 1, based on the detection result of the gyro sensor.
  • Alternatively, for example, the state acquisition unit 23b acquires steering angle information from the vehicle information transmitted over a so-called CAN (Controller Area Network), and acquires the moving direction of the moving body, that is, the imaging direction of the imaging unit 1, based on the acquired steering angle information.
  • The vehicle information also includes the lateral acceleration, the yaw rate, operation information of the skid prevention device, and the like, and the state acquisition unit 23b may acquire the moving direction of the moving body, that is, the imaging direction of the imaging unit 1, taking these pieces of information into account.
  • The discard determination unit 24 determines whether the imaging state satisfies the predetermined condition by determining whether the imaging direction of the imaging unit 1 acquired by the state acquisition unit 23b satisfies the predetermined condition. For example, when the state acquisition unit 23b uses the steering angle as the imaging direction of the imaging unit 1, the steering angle follows a profile as shown in FIG.: it begins to increase as the driver starts turning the steering wheel, is held while the moving body changes direction (the steering wheel being kept fixed), and begins to decrease once the direction of the moving body has reached the direction desired by the driver and the steering wheel is returned.
  • Therefore, for example, the discard determination unit 24 can determine whether the imaging state acquired by the state acquisition unit 23b satisfies the predetermined condition by determining whether the steering angle acquired by the state acquisition unit 23b is equal to or greater than a predetermined threshold Th41. That is, when the discard determination unit 24 determines that the steering angle acquired by the state acquisition unit 23b is equal to or greater than the predetermined threshold Th41, it determines that the imaging state acquired by the state acquisition unit 23b satisfies the predetermined condition; when it determines that the steering angle is less than the predetermined threshold Th41, it determines that the imaging state does not satisfy the predetermined condition.
  • Alternatively, for example, the discard determination unit 24 can determine whether the imaging state acquired by the state acquisition unit 23b satisfies the predetermined condition by determining whether the amount of change in the steering angle acquired by the state acquisition unit 23b is equal to or greater than a predetermined threshold Th42.
  • That is, when the discard determination unit 24 determines that the amount of change (absolute value) in the steering angle acquired by the state acquisition unit 23b is equal to or greater than the predetermined threshold Th42, it determines that the imaging state acquired by the state acquisition unit 23b satisfies the predetermined condition; when it determines that the amount of change (absolute value) is less than the predetermined threshold Th42, it determines that the imaging state does not satisfy the predetermined condition.
  • These predetermined threshold values Th41 and Th42 are set to statistically appropriate values, for example by measuring a plurality of samples.
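The two steering-based checks just described can be sketched as follows; the function name and units are illustrative assumptions, and either check alone suffices according to the text.

```python
def imaging_state_from_steering(angle_deg, prev_angle_deg, th41, th42):
    """Sketch of the steering-angle checks: the condition is met when the
    steering angle itself reaches Th41, or when its change amount
    (absolute value) between samples reaches Th42."""
    by_angle = abs(angle_deg) >= th41            # absolute-angle check (Th41)
    by_change = abs(angle_deg - prev_angle_deg) >= th42  # change-amount check (Th42)
    return by_angle, by_change
```

During the held phase of the steering profile only the absolute-angle check fires, while a sharp steering input can trip the change-amount check even before Th41 is reached.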
  • As described above, with the distance measuring device DM and the distance measuring method of the present embodiment, it is determined whether the imaging state satisfies a predetermined condition, and it is determined, based on the determination result, whether the reference image TP stored in the reference information storage unit 31 is to be discarded. Because the distance measuring device DM and the distance measuring method of this embodiment can thus decide whether the stored reference image TP should be discarded, the erroneous detection and non-detection described above can be reduced, and the detection accuracy can be improved, even when the imaging state changes.
  • Since the distance measuring device DM and the distance measuring method of the present embodiment discard the reference image TP stored in the reference information storage unit 31 when the imaging state is determined to satisfy the predetermined condition, the erroneous detection and non-detection described above can be reliably prevented.
  • Further, through the notification by the notification unit 25 using the notification unit 4, the user can recognize that the imaging state has been continuously determined to satisfy the predetermined condition.
  • In such a case, a non-temporary abnormality, such as a foreign object sticking to the imaging unit 1 or a failure of the distance measuring device DM, may have occurred. Since the distance measuring device DM and the distance measuring method of the present embodiment include the notification unit 25, the user can be prompted to take appropriate measures, such as inspecting the imaging unit 1. From this viewpoint, the threshold value Thn is set to an appropriate number of times according to the frame rate, for example 15 times or 30 times.
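The consecutive-determination count and the notification at Thn can be sketched as follows. The class and callback names are illustrative; the patent only specifies the counting behavior and an external notification, with Thn chosen per the frame rate (e.g. 15 or 30).

```python
class DiscardMonitor:
    """Counts consecutive frames on which the imaging state satisfied the
    predetermined condition; when the count reaches Thn, a callback stands
    in for the notification units 25/4."""

    def __init__(self, thn, notify):
        self.thn = thn          # e.g. 15 or 30, per the frame rate
        self.notify = notify    # external notification hook
        self.count = 0

    def update(self, condition_met):
        if condition_met:
            self.count += 1
            if self.count >= self.thn:
                self.notify()   # e.g. warn that the lens may be fouled
        else:
            self.count = 0      # the condition must hold *consecutively*
        return condition_met    # True -> the reference image is discarded
```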
  • A distance measuring device according to one aspect is mounted on a moving body and includes: an imaging unit that images a subject continuously in time; a reference information storage unit that stores, as a reference image, one of the time-series captured images captured by the imaging unit; a distance calculation unit that obtains a distance to a predetermined object included in the latest captured image, which is the most recent of the time-series captured images captured by the imaging unit, based on the latest captured image and the reference image stored in the reference information storage unit; a state acquisition unit that acquires an imaging state of the imaging unit; and a discard determination unit that determines whether the imaging state acquired by the state acquisition unit satisfies a predetermined condition and determines, based on the determination result, whether to discard the reference image stored in the reference information storage unit.
  • A distance measuring method according to another aspect is performed on a moving body and includes: an imaging step of imaging a subject continuously in time; a storing step of storing, as a reference image in a reference information storage unit, one of the time-series captured images captured in the imaging step; a distance calculation step of obtaining a distance to a predetermined object included in the latest captured image, which is the most recent of the time-series captured images captured in the imaging step, based on the latest captured image and the reference image stored in the reference information storage unit; a state acquisition step of acquiring an imaging state of the imaging unit; and a discard determination step of determining whether the imaging state acquired in the state acquisition step satisfies a predetermined condition and determining, based on the determination result, whether to discard the reference image stored in the reference information storage unit.
  • With such a distance measuring device and distance measuring method, it is determined whether the imaging state satisfies a predetermined condition, and whether to discard the reference image stored in the reference information storage unit is determined based on the determination result. Because the device and method can thus decide whether the stored reference image should be discarded, the detection accuracy can be improved even when the imaging state changes.
  • In another aspect, when the discard determination unit determines that the imaging state acquired by the state acquisition unit satisfies the predetermined condition, the discard determination unit discards the reference image stored in the reference information storage unit.
  • Preferably, the distance measuring device further includes a reference information update unit that stores, as a new reference image in the reference information storage unit, a captured image captured by the imaging unit after a predetermined number of images have been captured following the latest captured image.
  • Since such a distance measuring device discards the reference image stored in the reference information storage unit when the imaging state is determined to satisfy the predetermined condition, the erroneous detection and non-detection described above can be reliably prevented.
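The discard-and-update cycle of the reference information storage unit described above can be sketched as follows. The class and method names are illustrative, and the choice of which later frame becomes the new reference is an assumption beyond the "predetermined number of images" stated in the text.

```python
class ReferenceStore:
    """Sketch of the reference-image lifecycle: the reference is discarded
    when the imaging-state condition is met, and a frame captured a fixed
    number of images later is stored as the new reference."""

    def __init__(self, update_interval):
        self.update_interval = update_interval  # the "predetermined number"
        self.reference = None
        self.frames_since_discard = 0

    def on_frame(self, frame, condition_met):
        if condition_met:
            self.reference = None          # discard the stored reference image
            self.frames_since_discard = 0
            return None
        if self.reference is None:
            self.frames_since_discard += 1
            if self.frames_since_discard >= self.update_interval:
                self.reference = frame     # store as the new reference image
            return None
        return self.reference              # usable for distance calculation
```

Waiting a few frames before re-storing lets a transient disturbance (spray, glare) clear before the new reference is captured.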
  • In another aspect, the state acquisition unit acquires the imaging state of the imaging unit based on a predetermined feature amount in the latest captured image captured by the imaging unit.
  • For example, the state acquisition unit acquires a statistic (for example, an average value or a median value) of the pixel values or luminance values as the predetermined feature amount.
  • As another example, the state acquisition unit acquires a frequency characteristic of the latest captured image as the predetermined feature amount.
  • Since such a distance measuring device acquires the imaging state of the imaging unit based on a predetermined feature amount in the latest captured image captured by the imaging unit, no separate apparatus for acquiring the imaging state of the imaging unit is required.
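A minimal sketch of the statistic-based feature amount mentioned above, assuming a mean or median of the luminance values and a threshold on its change amount; the function names and threshold are hypothetical.

```python
import numpy as np

def luminance_statistic(img, kind="mean"):
    """Illustrative feature amount: a statistic (mean or median, as
    suggested above) of the luminance values of the latest captured image."""
    return float(np.mean(img) if kind == "mean" else np.median(img))

def state_changed(prev_stat, latest_stat, threshold):
    """The imaging state is taken to have changed when the statistic's
    change amount reaches an (assumed) threshold."""
    return abs(latest_stat - prev_stat) >= threshold
```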
  • In another aspect, the state acquisition unit uses a part of the latest captured image.
  • Since such a distance measuring device acquires the imaging state of the imaging unit using only a part of the latest captured image, the information-processing load can be reduced.
  • In another aspect, the state acquisition unit acquires the imaging direction of the imaging unit as the imaging state of the imaging unit.
  • Normally, the imaging unit is arranged on the moving body so that its imaging direction coincides with the moving direction of the moving body. Since such a distance measuring device uses the imaging direction of the imaging unit as its imaging state, the moving direction of the moving body can serve as the imaging state of the imaging unit, which makes it possible to cope with changes in the moving direction of the moving body, such as turning right or left.
  • In another aspect, the distance measuring device further includes a notification unit that notifies the outside that the imaging state acquired by the state acquisition unit satisfies the predetermined condition when the number of consecutive determinations by the discard determination unit that the imaging state satisfies the predetermined condition is equal to or greater than a threshold value.
  • With such a distance measuring device, the notification by the notification unit lets the user recognize that the imaging state has been continuously determined to satisfy the predetermined condition, and the user can therefore be prompted to take appropriate measures, such as inspecting the imaging unit.
  • According to the above, a distance measuring device and a distance measuring method can be provided.

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The present invention relates to a distance measuring device and a distance measuring method in which the distance to a predetermined object included in a latest captured image, the most recent of the time-series images captured by an imaging unit, is obtained based on the latest captured image and a reference image stored in a reference information storage unit. The imaging state of the imaging unit is acquired, it is determined whether the acquired imaging state satisfies a predetermined condition, and based on the determination result it is determined whether or not to discard the reference image stored in the reference information storage unit.
PCT/JP2014/062897 2013-06-21 2014-05-14 Distance measuring device and distance measuring method WO2014203658A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015522667A JPWO2014203658A1 (ja) 2013-06-21 2014-05-14 Distance measuring device and distance measuring method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-130709 2013-06-21
JP2013130709 2013-06-21

Publications (1)

Publication Number Publication Date
WO2014203658A1 true WO2014203658A1 (fr) 2014-12-24

Family

ID=52104400

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/062897 WO2014203658A1 (fr) 2013-06-21 2014-05-14 Distance measuring device and distance measuring method

Country Status (2)

Country Link
JP (1) JPWO2014203658A1 (fr)
WO (1) WO2014203658A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07141592A (ja) * 1993-11-12 1995-06-02 Toyota Motor Corp Road white line detection device
JP2004112144A (ja) * 2002-09-17 2004-04-08 Nissan Motor Co Ltd Forward vehicle tracking system and forward vehicle tracking method
JP2005229444A (ja) * 2004-02-13 2005-08-25 Toshiba Corp Vehicle tracking device and program
JP2009085651A (ja) * 2007-09-28 2009-04-23 Hitachi Ltd Image processing system


Also Published As

Publication number Publication date
JPWO2014203658A1 (ja) 2017-02-23

Similar Documents

Publication Publication Date Title
JP4987573B2 (ja) Vehicle exterior monitoring device
JP6257792B2 (ja) Method for recognizing the covered state of a camera, camera system, and motor vehicle
JP4914233B2 (ja) Vehicle exterior monitoring device
JP4631096B2 (ja) Vehicle periphery monitoring device
JP5887219B2 (ja) Lane departure warning device
JP6364797B2 (ja) Image analysis device and image analysis method
JP6220327B2 (ja) Travel lane marking recognition device and travel lane marking recognition program
US10127460B2 (en) Lane boundary line information acquiring device
JP2011180982A (ja) Lane marking detection device
JP2007257449A (ja) Road lane marking detection device
JP2008027309A (ja) Collision determination system and collision determination method
JP5759950B2 (ja) In-vehicle camera device
JP4528283B2 (ja) Vehicle periphery monitoring device
JP2014160322A (ja) Lane boundary line departure prevention device
JP2009219555A (ja) Drowsiness detection device, driving support device, and drowsiness detection method
JP2019219719A (ja) Abnormality detection device and abnormality detection method
JP2007128460A (ja) Collision determination system, collision determination method, computer program, and determination device
JP2010108182A (ja) Vehicle driving support device
WO2014203658A1 (fr) Distance measuring device and distance measuring method
JP2008042759A (ja) Image processing device
JP2018005441A (ja) Inter-vehicle distance warning and collision warning device
JP5742676B2 (ja) Lane boundary line recognition device
JP6495742B2 (ja) Object detection device, object detection method, and object detection program
JP6564682B2 (ja) Object detection device, object detection method, and object detection program
JP2014013451A (ja) In-vehicle lane recognition device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14813005

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015522667

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14813005

Country of ref document: EP

Kind code of ref document: A1