CN112513571B - Distance calculating device - Google Patents

Distance calculating device

Info

Publication number
CN112513571B
Authority
CN
China
Prior art keywords
distance
information
distance information
subject
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201980049441.1A
Other languages
Chinese (zh)
Other versions
CN112513571A (en)
Inventor
稻田圭介
内田裕介
野中进一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Astemo Ltd
Original Assignee
Hitachi Astemo Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Astemo Ltd filed Critical Hitachi Astemo Ltd
Publication of CN112513571A
Application granted
Publication of CN112513571B

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 - Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 - Details
    • G01C 3/06 - Use of electric means to obtain final indication
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/87 - Combinations of systems using electromagnetic waves other than radio waves
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/50 - Depth or shape recovery
    • G06T 7/55 - Depth or shape recovery from multiple images
    • G06T 7/593 - Depth or shape recovery from multiple images from stereo images
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 - Lidar systems specially adapted for specific applications
    • G01S 17/93 - Lidar systems specially adapted for specific applications for anti-collision purposes

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The distance calculating device of the present invention includes: a high-precision distance information generating unit that generates high-precision distance information, that is, distance information with high precision, using the output of a sensor; a low-precision distance information generating unit that generates, using the output of the sensor, low-precision distance information whose precision is lower than that of the high-precision distance information; and a distance correction unit that corrects the low-precision distance information using the high-precision distance information.

Description

Distance calculating device
Technical Field
The present invention relates to a distance calculating device.
Background
There is a known technique of recognizing an observation target by calculating the distance between a camera and the target using a stereo camera. A method is also known in which the two cameras constituting a stereo camera are offset so as to create a compound eye region in front and monocular regions on the left and right, thereby achieving high-precision, wide-field monitoring of the vehicle periphery. Patent document 1 discloses an object detection device for detecting an object present on a reference surface, the device comprising: a parallax calculation unit that calculates the parallax between images captured by a plurality of imaging devices; a region calculation unit that calculates, based on the parallax, a common region of the plurality of images and a non-common region other than the common region; a 1st generation unit that generates, as a candidate rectangle for detecting the object, a rectangle in the common region in which the uniformity of the parallax of the points contained in the rectangle is greater than a predetermined threshold value; a 2nd generation unit that generates a rectangle in the non-common region as a candidate rectangle; and a judgment unit that judges whether or not the generated candidate rectangles contain a detection target.
Prior art literature
Patent literature
Patent document 1: japanese patent application laid-open No. 2010-79582
Disclosure of Invention
Problems to be solved by the invention
There is a need to improve the distance information of an area where the distance information cannot be calculated with high accuracy.
Means for solving the problems
A distance calculating device according to a first aspect of the present invention includes: a high-precision distance information generating unit that generates high-precision distance information, that is, distance information with high precision, using the output of a sensor; a low-precision distance information generating unit that generates, using the output of the sensor, low-precision distance information whose precision is lower than that of the high-precision distance information; and a distance correction unit that corrects the low-precision distance information using the high-precision distance information.
Advantageous effects of the invention
According to the present invention, the distance information of the area with low accuracy can be corrected.
Drawings
Fig. 1 is a schematic diagram of a distance calculation system S including a distance calculation device 6.
Fig. 2 is a block diagram of the vehicle control system S of the first embodiment.
Fig. 3 (a) is a diagram showing a 1 st captured image 100, fig. 3 (b) is a diagram showing a 2 nd captured image 101, and fig. 3 (c) is a diagram showing a group.
Fig. 4 is a block diagram of the correction unit 20.
Fig. 5 is a block diagram of the group information generating unit 30.
Fig. 6 is a flowchart showing the operation of the distance calculating device 6.
Fig. 7 is a block diagram of the correction unit 20 in modification 3.
Fig. 8 is a diagram showing a calculation method of the distance difference information 120 and a correction process of the monocular distance information 102 for the border-crossing group G1 in modification 4.
Fig. 9 is a diagram showing a method for determining the weighting coefficient η shown in expression 6 or expression 7.
Fig. 10 is a diagram showing an example of correction processing of the monocular distance information 102 using the correction coefficient η shown in fig. 9. Fig. 10 (a) shows the state before correction, and fig. 10 (b) shows the state after correction.
Fig. 11 is a diagram showing a correction process of the distance difference information 120 and the monocular distance information 102 for the border-crossing group G1 in modification 11.
Fig. 12 is a diagram showing the correction processing performed by the correction execution unit 21 for the extended border-crossing group G2 in modification 12. Fig. 12 (a) shows the state before correction, and fig. 12 (b) shows the state after correction.
Fig. 13 is a diagram showing a method for determining the correction coefficient η shown in expression 6 or expression 7.
Fig. 14 is a diagram showing the threshold L1 shown in fig. 9.
Fig. 15 is a diagram illustrating correction for the extended border-crossing group in modification 19. Fig. 15 (a) shows the state before correction, and fig. 15 (b) shows the state after correction.
Fig. 16 is a diagram showing the configuration of the correction unit 20 in modification 16.
Fig. 17 is a diagram showing a method for determining the correction coefficient η by the correction determination unit 23.
Fig. 18 is a block diagram of the group information generation unit 30 in modification 21.
Fig. 19 is a diagram showing a correspondence relationship between the type of subject and the correction method.
Fig. 20 is a schematic diagram of the distance calculation system S in the second embodiment.
Fig. 21 is a block diagram of the distance calculating device 6 in the second embodiment.
Fig. 22 is a schematic diagram of the distance calculation system S in the third embodiment.
Fig. 23 is a configuration diagram of the distance calculating device 6 in the third embodiment.
Fig. 24 is a schematic diagram of the distance calculation system S in the fourth embodiment.
Fig. 25 is a configuration diagram of the distance calculating device 6 in the fourth embodiment.
Fig. 26 is a block diagram of a distance calculation system S in the fifth embodiment.
Detailed Description
First embodiment
A first embodiment of the distance calculating device will be described below with reference to fig. 1 to 6.
(Structure)
Fig. 1 is a schematic diagram of a distance calculation system S including a distance calculating device 6. The distance calculation system S is mounted on the vehicle 1 and includes a 1st imaging unit 2 and a 2nd imaging unit 3. The 1st imaging unit 2 and the 2nd imaging unit 3 are arranged side by side, facing the front in the traveling direction of the vehicle 1. As shown in fig. 1, the angles of view of the 1st imaging unit 2 and the 2nd imaging unit 3 are known, and their fields of view partially overlap. The hatched region in fig. 1 is where the fields of view overlap.
Fig. 2 is a block diagram of the vehicle control system S of the first embodiment. The vehicle control system S includes the 1st imaging unit 2, the 2nd imaging unit 3, the distance calculating device 6, a recognition processing unit 7, and a vehicle control unit 8. The distance calculating device 6 includes a monocular distance information generating unit 10, a compound eye distance information generating unit 11, and a monocular distance information correction unit 12. The monocular distance information correction unit 12 includes a correction unit 20 and a group information generating unit 30. The 1st captured image 100, which is the image captured by the 1st imaging unit 2, is output to the distance calculating device 6. The 2nd captured image 101, which is the image captured by the 2nd imaging unit 3, is output to the distance calculating device 6.
The distance calculating device 6 may be a single ECU (Electronic Control Unit, electronic control device) or may be a computing device having other functions. The distance calculating device 6 may be integrally formed with the 1 st imaging unit 2 and the 2 nd imaging unit 3. The distance calculating device 6, the recognition processing unit 7, and the vehicle control unit 8 may be realized by an integrated device. The recognition processing unit 7 and the vehicle control unit 8 may be realized by an integrated electronic control device, or may be realized by different electronic control devices, respectively.
The monocular distance information generating unit 10 generates monocular distance information 102 from each of the 1st captured image 100 and the 2nd captured image 101 individually. The monocular distance information generating unit 10 calculates the distance from a single image in a known manner. That is, the monocular distance information generating unit 10 does not combine the 1st captured image 100 and the 2nd captured image 101 when generating the monocular distance information 102: the distance information calculated using the 1st captured image 100 and the distance information calculated using the 2nd captured image 101 together constitute the monocular distance information 102.
For example, assuming that the image is captured horizontally over a flat plane, the monocular distance information generating unit 10 treats positions closer to the upper part of the captured image as being farther away. The monocular distance information generating unit 10 calculates a distance for each distance information block consisting of one or more pixels. A distance information block may be, for example, 16 pixels (4 vertical × 4 horizontal), 2 pixels (2 vertical × 1 horizontal), or a single pixel (1 × 1). The monocular distance information 102 generated by the monocular distance information generating unit 10 is output to the correction unit 20 of the monocular distance information correction unit 12.
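As a concrete illustration of this kind of monocular estimate, the following sketch computes a per-block distance map from the image row under a flat-ground assumption; the focal length, camera height, horizon row, and function name are hypothetical, not values defined in this embodiment.

    import numpy as np

    def monocular_distance_map(image_height, image_width, block=4,
                               focal_px=1400.0, cam_height=1.2, horizon_row=None):
        # Rough per-block distance map from a flat-ground assumption: rows farther
        # below the horizon are closer; Z = focal_px * cam_height / (row - horizon_row).
        # All numeric parameters here are illustrative only.
        if horizon_row is None:
            horizon_row = image_height // 2
        rows = np.arange(0, image_height, block) + block / 2.0   # block-centre rows
        cols = np.arange(0, image_width, block)
        dist = np.full(rows.shape, np.inf)                       # at or above the horizon: treated as far away
        below = rows > horizon_row
        dist[below] = focal_px * cam_height / (rows[below] - horizon_row)
        return np.tile(dist[:, None], (1, cols.size))            # same distance for every column of a row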
The compound eye distance information generating unit 11 generates compound eye distance information 103 using the region where the 1st captured image 100 and the 2nd captured image 101 can be combined, specifically the region where their fields of view overlap. Since the positional relationship between the 1st imaging unit 2 and the 2nd imaging unit 3, including the baseline length, is known, the compound eye distance information generating unit 11 calculates the distance from the parallax of pixels common to the 1st captured image 100 and the 2nd captured image 101. The compound eye distance information 103 calculated by the compound eye distance information generating unit 11 is also calculated in units of distance information blocks.
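The parallax-to-distance conversion itself can be sketched as follows, assuming a simple pinhole model with a known baseline and focal length; the numeric defaults are illustrative only, not values from this embodiment.

    import numpy as np

    def compound_eye_distance(disparity_px, baseline_m=0.35, focal_px=1400.0):
        # Distance of each pixel (or block) of the overlapping region from its
        # disparity: Z = focal_px * baseline_m / disparity.
        disparity = np.asarray(disparity_px, dtype=float)
        dist = np.full_like(disparity, np.inf)
        valid = disparity > 0                                    # zero disparity -> no usable distance
        dist[valid] = focal_px * baseline_m / disparity[valid]
        return dist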
Compared with the monocular distance information 102, the compound eye distance information 103 has higher precision. Therefore, the monocular distance information 102 can be called "low-precision distance information", and the compound eye distance information 103 can be called "high-precision distance information". Similarly, the monocular distance information generating unit 10 can be called a "low-precision distance information generating unit", and the compound eye distance information generating unit 11 a "high-precision distance information generating unit".
The size of the distance information block used by the monocular distance information generating unit 10 may differ from that used by the compound eye distance information generating unit 11, but in this embodiment they are assumed to be the same for simplicity of description. The compound eye distance information 103 generated by the compound eye distance information generating unit 11 is output to the correction unit 20 and the group information generating unit 30 of the monocular distance information correction unit 12.
The group information generating unit 30 uses the 1 st captured image 100, the 2 nd captured image 101, and the compound eye distance information 103 to detect the subject and perform group processing on a plurality of subjects, and outputs the result as group information 106 to the correcting unit 20. The details of the operation of the group information generating unit 30 will be described later.
The group information 106 generated by the group information generating unit 30 includes the type of the group and the positions, on the image, of the subjects constituting the group. The group information 106 may also include the type of each subject. Examples of subject types include a person, an animal, a vehicle, a road surface, a curb, a side road, a white line, a guardrail, a sign, a structure, a traffic signal, a fallen object on the road, a pothole in the road surface, and a puddle.
The correction unit 20 corrects the monocular distance information 102 using the compound eye distance information 103 and the group information 106, and outputs the corrected monocular distance information 102 and the compound eye distance information 103 as corrected distance information 104. The detailed structure and operation of the correction unit 20 will be described later. The recognition processing unit 7 performs recognition processing using the corrected distance information 104 generated by the distance calculating device 6.
The vehicle control unit 8 controls the vehicle 1 using the identification information 140 generated by the recognition processing unit 7. The vehicle control unit 8 outputs the input information 105 to the distance calculating device 6. The input information 105 includes at least one of travel information of the vehicle 1, the identification information 140, traffic infrastructure information, map information, and vehicle control information. The vehicle control unit 8 can acquire the travel information of the vehicle 1 from a sensor, not shown, mounted on the vehicle 1. The vehicle control unit 8 can acquire the traffic infrastructure information from outside the vehicle 1 using a communication device, not shown. The vehicle control unit 8 can acquire the map information by referring to a map database using a communication device, not shown, or by referring to a map database built into the vehicle 1.
Examples of the travel information include travel speed information, acceleration information, position information, the traveling direction, and the travel mode of the vehicle 1. Examples of the travel mode include an automatic driving mode, a travel mode based on the travel route, a travel mode based on the travel situation, a travel mode based on the surrounding natural environment, and an energy-saving mode in which the vehicle travels with reduced power or fuel consumption. Examples of the automatic driving mode include a manual driving mode in which vehicle control is performed at the driver's judgment, a driving support mode in which part of the vehicle control is supported by the system's judgment, and an automatic driving mode in which vehicle control is performed at the system's judgment. Examples of the travel mode based on the travel route include an urban travel mode, an expressway travel mode, and legal speed information. Examples of the travel mode based on the travel situation include a congestion travel mode, a parking lot mode, and a travel mode according to the position and movement of surrounding vehicles. Examples of the travel mode based on the surrounding natural environment include a night travel mode and a backlight travel mode. Examples of the map information include position information of the vehicle 1 on the map, road shape information, road-surface feature information, road width information, lane information, and road gradient information. Examples of the road shape information include a T-junction and an intersection. Examples of the road-surface feature information include traffic signals, lane sections, sidewalk sections, railroad crossings, bicycle parking areas, car parking areas, and pedestrian crossings.
Examples of the identification information 140 include position information, type information, motion information, and danger information of the subject. Examples of the position information include the direction and distance from the vehicle 1. Examples of the type information include a pedestrian, an adult, a child, an elderly person, an animal, a falling rock, a bicycle, a surrounding vehicle, a surrounding structure, and a curb. Examples of the motion information include staggering, sudden running-out, road crossing, the moving direction, the moving speed, and the movement track of a pedestrian or a bicycle. Examples of the danger information include a pedestrian suddenly running out, falling rocks, and abnormal behavior of surrounding vehicles such as sudden braking, sudden deceleration, and sudden steering. Examples of the traffic infrastructure information include traffic control information such as congestion information, speed limits, and road closures, travel route guidance information for guiding the vehicle to another route, accident information, caution information, surrounding vehicle information, and map information. Examples of the vehicle control information include brake control, steering wheel control, accelerator control, in-vehicle lamp control, warning sound generation, in-vehicle camera control, and information about an observation target around the imaging device that is output to surrounding vehicles, a remote center device, and the like connected via a network. The vehicle control unit 8 may perform subject detection processing based on the information processed by the distance calculating device 6 or the recognition processing unit 7, may display, on a display device connected to the vehicle control unit 8, the images obtained by the 1st imaging unit 2 or the 2nd imaging unit 3 together with the recognition results, and may provide information on the observation target detected by the distance calculating device 6 or the recognition processing unit 7 to an information device that processes traffic information such as map information and congestion information.
(Captured images and groups)
Fig. 3 is a diagram showing the captured images and their regions. Fig. 3 (a) shows the 1st captured image 100 acquired by the 1st imaging unit 2, fig. 3 (b) shows the 2nd captured image 101 acquired by the 2nd imaging unit 3, and fig. 3 (c) shows the groups. The 1st imaging unit 2 and the 2nd imaging unit 3 are arranged side by side in the horizontal direction, and fig. 3 (a) and fig. 3 (b) are aligned laterally according to the imaged positions. That is, the region Q of the 1st captured image 100 and the region R of the 2nd captured image 101 capture the same area.
Therefore, by using the region Q of the 1st captured image 100 and the region R of the 2nd captured image 101 together, the compound eye distance information generating unit 11 can calculate the distance with high accuracy. Hereinafter, the information contained in the region Q and the region R is collectively referred to as "information of the region T", with the meaning that the two pieces of information of the region Q and the region R are used together. The region T is referred to as the "compound eye region". On the other hand, for the region P of the 1st captured image 100 and the region S of the 2nd captured image 101, information is obtained from only one of the 1st imaging unit 2 and the 2nd imaging unit 3, so the monocular distance information generating unit 10 can calculate the distance only with low accuracy. Hereinafter, the regions P and S are each referred to as a "monocular region".
The subjects shown in fig. 3 are as follows. In the 1st captured image 100, the subject 410, the subject 420, and the subject 430 are captured. The subject 430 spans the region P and the region Q; for convenience of description, the part of the subject 430 in the region P is referred to as the partial subject 401, and the part in the region Q as the partial subject 400. The subject 410 is located at the right end of the region Q, and the subject 420 near the center of the region P. In the 2nd captured image 101, the partial subject 400 is captured at the left end of the region R, and the subject 410 near the right end of the region R. In addition, in the region S of the 2nd captured image 101, the subject 411, the subject 412, and the subject 413 are captured, in this order from the left end. The subjects 410, 411, 412, and 413 have substantially the same shape.
In the present embodiment, the group G1 is the set of the partial subject 400 and the partial subject 401, the group G2 is the set of the subject 420 and the subject 430, and the group G3 is the set of the subject 410, the subject 411, the subject 412, and the subject 413. The groups G1, G2, and G3 are also referred to as the 1st group, the 2nd group, and the 3rd group, respectively.
In the present embodiment, a group of the type consisting of the portions, in each region, of a subject that exists across the compound eye region and the monocular region is referred to as a "border-crossing group". Applied to the example shown in fig. 3, the group G1, consisting of the portions of the subject 430 existing across the compound eye region Q and the monocular region P, that is, the partial subjects 400 and 401, is classified as a border-crossing group. The group G1 is hereinafter referred to as the "border-crossing group G1".
A group of the type consisting of a subject that exists in the monocular region in the vicinity of a subject belonging to a border-crossing group, together with the subjects constituting that border-crossing group, is referred to as an "extended border-crossing group". Applied to the example shown in fig. 3, the group G2, consisting of the subject 420, which exists in the monocular region P in the vicinity of the subject 430 belonging to the border-crossing group G1, and the subject 430 constituting the border-crossing group G1, is classified as an extended border-crossing group. The group G2 is hereinafter referred to as the "extended border-crossing group G2". A subject such as the subject 420 that is included in the extended border-crossing group G2 but not in the border-crossing group G1 is hereinafter referred to as an "additional subject".
A group of the type consisting of a plurality of subjects existing adjacent to one another across the compound eye region and the monocular region is referred to as a "single group". This applies on the condition that the subjects belonging to the single group are the same or similar in at least one of the subject type, the subject shape, the subject color, and the vertical position of the subject. Applied to the example shown in fig. 3, the group G3, consisting of the subject 410 and the subjects 411, 412, and 413 existing adjacently in the monocular region S, is classified as a single group. The group G3 is hereinafter referred to as the "single group G3".
(Correction unit 20)
Fig. 4 is a block diagram of the correction unit 20. The correction unit 20 includes a distance difference information generation unit 22, a correction determination unit 23, and a correction execution unit 21. The distance difference information generating section 22 generates distance difference information 120 based on the single-eye distance information 102, compound-eye distance information 103, and group information 106. The distance difference information 120 will be described later.
The correction determination unit 23 determines a correction method for the monocular distance information 102 based on the distance difference information 120, the compound eye distance information 103, and the input information 105, and outputs correction determination information 121. The correction execution unit 21 performs correction processing on the single-eye distance information 102 based on the distance difference information 120, the group information 106, and the correction judgment information 121, and generates corrected distance information 104 including the compound-eye distance information 103 and the corrected single-eye distance information. The correction determination information 121 will be described later.
(Distance difference information generating unit 22)
The operation of the distance difference information generating unit 22 will be described concretely with reference to fig. 3. The distance difference information generating unit 22 calculates an average value LAm for the partial subject 400 included in the compound eye region T based on the compound eye distance information 103. Next, the distance difference information generating unit 22 calculates an average value LAs based on the monocular distance information 102 of the partial subject 400, which is calculated using only a single image. Then, the distance difference information generating unit 22 calculates the distance difference information 120 using the average value LAm and the average value LAs. The averages LAm and LAs may be calculated, for example, as simple averages over the entire region of the partial subject 400, or as weighted averages in which blocks closer to the boundary are weighted more heavily.
An example of the calculation in the case of a weighted average is as follows. Let Lm(k) denote the compound eye distance information 103 of the n distance information blocks k constituting the partial subject 400 to be calculated, Ls(k) the monocular distance information 102, α(k) the weighting coefficient for the compound eye distance information 103, and β(k) the weighting coefficient for the monocular distance information 102, where k is an integer from 0 to n-1. Then the averages LAm and LAs are expressed by formula 1 and formula 2.
LAm = (1/n) Σ_{k=0}^{n-1} (α(k) × Lm(k)) … (formula 1)
LAs = (1/n) Σ_{k=0}^{n-1} (β(k) × Ls(k)) … (formula 2)
The distance difference information 120 is calculated as, for example, a difference between the average LAm and the average LAs. However, the average LAm or the average LAs may also be weighted when calculating the difference. When the weight coefficient with respect to the compound eye distance information 103 indicated by the symbol LAm is γ and the weight coefficient with respect to the monocular distance information 102 indicated by LAs is δ, the distance difference information 120 indicated by LD can be calculated as in the following equation 3.
LD = γ × LAm - δ × LAs … (formula 3)
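A minimal sketch of formulas 1 to 3, assuming the per-block values Lm(k) and Ls(k) are already available as arrays and using uniform weights unless others are supplied (function and parameter names are illustrative), could look as follows.

    import numpy as np

    def distance_difference(Lm, Ls, alpha=None, beta=None, gamma=1.0, delta=1.0):
        # Formulas 1-3: weighted averages LAm and LAs over the n distance information
        # blocks of the partial subject, and their weighted difference LD.
        Lm = np.asarray(Lm, dtype=float)          # compound eye distances Lm(k)
        Ls = np.asarray(Ls, dtype=float)          # monocular distances Ls(k)
        n = Lm.size
        alpha = np.ones(n) if alpha is None else np.asarray(alpha, dtype=float)
        beta = np.ones(n) if beta is None else np.asarray(beta, dtype=float)
        LAm = np.sum(alpha * Lm) / n              # formula 1
        LAs = np.sum(beta * Ls) / n               # formula 2
        LD = gamma * LAm - delta * LAs            # formula 3
        return LAm, LAs, LD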
Alternatively, the distance difference information 120 may be calculated as follows. The difference between the compound eye distance information 103 of the partial subject 400 in the compound eye region T and the monocular distance information 102 of the partial subject 400 calculated from a single image may be computed for each distance information block, and the distance difference information 120 may be generated from these per-block differences of the partial subject 400 in the compound eye region T.
Further, the distance difference information 120 may simply be a weighted value of the average LAm. With a weighting coefficient ε, the distance difference information 120, represented by the symbol LD, can be calculated as in formula 4 below.
LD = ε × LAm … (formula 4)
(Correction determination unit 23)
The correction determination unit 23 determines whether correction is necessary for the distance information of the monocular region using the input information 105 acquired from the vehicle control unit 8, and generates correction determination information 121. However, the correction determination information 121 may include parameters used for correction.
The correction determination unit 23 determines that the distance information of the monocular region is not to be corrected, for example, in the following cases: the vehicle speed exceeds a predetermined speed, the driving mode is a predetermined mode, the current position of the vehicle 1 is on an expressway, the steering angle is equal to or greater than a predetermined angle, and so on. These conditions may also be used in combination. The reason for this determination is that close-range monitoring is important during low-speed traveling: at close range, the same object appears large in the image and the parallax becomes large, so the distance accuracy is high. In urban areas, for example when traveling or turning right or left at an intersection, pedestrians are likely to run out in front of the vehicle, and the risk of contact between a person and the vehicle increases. Conversely, on expressways, national roads where vehicles travel at high speed, and the like, the risk of contact between a person and a vehicle is greatly reduced.
The correction determination unit 23 determines the parameters used for weighting, for example, as follows. The larger the parameter value, the more strongly the distance information of the compound eye region is reflected in the correction. The correction determination unit 23 sets the parameter larger as the vehicle speed increases, making it largest when the vehicle speed exceeds a first predetermined value and smallest when the vehicle speed is below a second predetermined value. The correction determination unit 23 sets the parameter large when the place where the vehicle 1 is traveling is, for example, a place with heavy pedestrian traffic, medium on ordinary roads other than such places, and smallest on expressways. The correction determination unit 23 sets the parameter larger as the steering angle of the vehicle 1 increases and smaller as the steering angle decreases. These conditions may also be combined.
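A hedged sketch of such a parameter determination is shown below; the thresholds, the road-type categories, and the simple averaging of the three terms are illustrative assumptions rather than values defined in this embodiment.

    def correction_parameter(speed_kmh, road_type, steering_deg,
                             v_low=15.0, v_high=60.0, steer_max=35.0):
        # Maps the vehicle state to a parameter in [0, 1]; larger values reflect the
        # compound eye distance more strongly.  All thresholds are hypothetical.
        if speed_kmh <= v_low:
            p_speed = 0.0                          # below the second predetermined value
        elif speed_kmh >= v_high:
            p_speed = 1.0                          # above the first predetermined value
        else:
            p_speed = (speed_kmh - v_low) / (v_high - v_low)
        # large in places with heavy pedestrian traffic, medium on ordinary roads,
        # smallest on expressways
        p_road = {"urban": 1.0, "ordinary": 0.5, "expressway": 0.0}.get(road_type, 0.5)
        p_steer = min(abs(steering_deg) / steer_max, 1.0)   # larger steering angle -> larger parameter
        return (p_speed + p_road + p_steer) / 3.0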
By providing the correction determination unit 23, the distances to objects around the vehicle, that is, people, vehicles, parked vehicles, and the like, are measured accurately in situations where the risk of contact between a person and the vehicle is high, such as low-speed traveling or turning right or left at an intersection, so the risk of an accident can be greatly reduced. In automatic driving, improving the accuracy of the distance information referenced by the automatic driving system enables more accurate driving decisions and prevents erroneous automatic driving caused by false detections due to erroneous distance information. Restricting the correction to low-speed traveling also reduces the processing load during high-speed traveling.
(correction execution unit 21)
The correction execution unit 21 corrects the monocular distance information 102 using a different method for each type of group, based on the group information 106. The distance difference information 120 output from the distance difference information generating unit 22 is mainly used when the border-crossing group G1 is the processing target. The correction determination information 121 is used to determine whether correction of the border-crossing group G1 is necessary and to determine the correction parameters. The correction process is described below for each group type.
(Correction of the border-crossing group by the correction execution unit 21)
The correction execution unit 21 corrects the monocular distance information 102 of the partial subject 401 existing in the monocular region P. For example, the monocular distance information 102 of all or part of the distance information blocks of the partial subject 401 is replaced with the average LAm of the partial subject in the compound eye region T. In the replacement, predetermined pixels may be weighted; for example, the weight may be made smaller as the distance from the boundary increases. With a weighting coefficient ζ, the corrected monocular distance information 102, represented by the symbol LCs, is calculated as in formula 5 below.
LCs = ζ × LD … (formula 5)
The correction execution unit 21 outputs the corrected monocular distance information 102 as a part of the corrected distance information 104.
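One possible implementation of this replacement, assuming the blocks of the partial subject 401 are held in a 2-D array whose column 0 touches the boundary with the compound eye region, and using a hypothetical linear weight falloff, is sketched below.

    import numpy as np

    def correct_border_crossing(mono_blocks, LAm, decay=0.1):
        # Replace the monocular distances of the partial subject in the monocular
        # region with the compound eye average LAm, weighting each block column by
        # its distance from the boundary (column 0 is taken to touch the boundary).
        # The linear falloff 'decay' per column is an illustrative choice.
        mono = np.asarray(mono_blocks, dtype=float)          # shape: (lines, columns)
        w = np.clip(1.0 - decay * np.arange(mono.shape[1]), 0.0, 1.0)
        return w * LAm + (1.0 - w) * mono                    # weighted replacement per column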
(Correction of the extended border-crossing group by the correction execution unit 21)
When an extended border-crossing group is the processing target, the correction execution unit 21 corrects the distance information as follows. The distance information of the partial subject of the group lying in the monocular region is corrected in the same manner as for the border-crossing group. The distance information of the additional subject in the monocular region is corrected using the distance information of the partial subject in the compound eye region. Specifically, in the example shown in fig. 3, the distance information of the partial subject 401 in the monocular region P of the extended border-crossing group G2 is corrected in the same way as for the border-crossing group G1, and the distance information of the subject 420 in the monocular region P is corrected using the distance information of the partial subject 400 in the compound eye region T.
An example of correcting the distance information of a subject in the monocular region using the distance information of the partial subject in the compound eye region is as follows: the distance of each distance information block of the subject in the monocular region is set to the average of the mean distance over the entire partial subject in the compound eye region and the value of that distance information block of the subject in the monocular region.
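A minimal sketch of this averaging, with hypothetical function and variable names, is as follows.

    import numpy as np

    def correct_additional_subject(mono_blocks, LAm):
        # Each distance information block of the additional subject in the monocular
        # region becomes the mean of the compound eye average LAm of the partial
        # subject and the block's own monocular distance.
        mono = np.asarray(mono_blocks, dtype=float)
        return 0.5 * (mono + LAm)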
(Correction of the single group by the correction execution unit 21)
When the single group is the processing target, the correction execution unit 21 corrects the monocular distance information 102 of the subjects captured only in the monocular region by using the compound eye distance information 103 of a subject belonging to the single group. Using the example shown in fig. 3, the monocular distance information 102 of the subjects 411, 412, and 413 is corrected using the compound eye distance information 103 of the subject 410 calculated from the compound eye region T and the monocular distance information 102 of the subject 410 calculated from the monocular region R. The subject 410 is also captured in the monocular region Q, but the monocular distance information 102 obtained from the region Q is not used for this correction, because it comes from a camera different from the one that captured the monocular region S to be corrected.
The correction execution unit 21 calculates, for example, a difference between the compound eye distance information 103 of the subject 410 and the monocular distance information 102, and adds the difference to the monocular distance information 102 of the subject 411, the subject 412, and the subject 413.
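This offset-based correction can be sketched as follows; the reference values are assumed to be the compound eye and monocular distances of the subject 410, and the names are illustrative.

    import numpy as np

    def correct_single_group(mono_blocks, ref_compound_eye, ref_monocular):
        # Single-group correction: the difference between the compound eye and
        # monocular distances of the reference subject (e.g. subject 410) is added to
        # the monocular distances of subjects seen only in the monocular region.
        offset = float(ref_compound_eye) - float(ref_monocular)
        return np.asarray(mono_blocks, dtype=float) + offset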
(Group information generating unit 30)
Fig. 5 is a block diagram of the group information generating unit 30. The group information generating section 30 includes an object detecting section 32 and a group judging section 31.
(Subject detection unit 32)
The subject detection unit 32 of the group information generating unit 30 detects subjects using the 1st captured image 100 output from the 1st imaging unit 2 and the 2nd captured image 101 output from the 2nd imaging unit 3, and outputs subject detection information 130. For the detection of subjects, either the 1st captured image 100 or the 2nd captured image 101 may be used, or both may be used. Furthermore, 1st captured images 100 or 2nd captured images 101 captured at different times may also be used.
Since the positional relationship between the 1 st imaging unit 2 and the 2 nd imaging unit 3, the attitudes of the 1 st imaging unit 2 and the 2 nd imaging unit 3, and the angles of view of the 1 st imaging unit 2 and the 2 nd imaging unit 3 are known, a rough region overlapping the 1 st captured image 100 and the 2 nd captured image 101 can be calculated in advance. The subject detecting unit 32 calculates the area where the 1 st captured image 100 and the 2 nd captured image 101 overlap with each other by using the previous calculation, and specifies a single-eye area and a compound-eye area. Thus, the boundary between the compound eye region and the monocular region is clear.
For example, pattern matching and a neural network can be used for detecting the subject. The condition of the subject detected by the subject detecting unit 32 may be input as the input information 105. The conditions of the subject refer to the type, size, position information, and the like of the subject. The subject detection information 130 includes the type of subject, the size of the subject, the subject crossing the boundary between the compound eye region and the monocular region, the subject in the monocular region, the positional information of the subject, and the information of the distance information block constituting the subject.
(Group judgment unit 31)
The group judgment unit 31 calculates the group information 106 based on the subject detection information 130, the compound eye distance information 103, and the input information 105. The group judgment unit 31 classifies a subject crossing the boundary between the compound eye region and the monocular region as one group, namely a border-crossing group. When another subject exists in the monocular region in the vicinity of a border-crossing group, the group judgment unit 31 classifies it, together with that group, as another group, namely an extended border-crossing group. The group judgment unit 31 classifies a plurality of adjacently existing subjects in the monocular region as one group, namely a single group.
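A heavily simplified sketch of such a grouping is given below; the subject representation, the nearness threshold, and the use of the subject kind as the similarity test are illustrative assumptions, not the actual conditions used by the group judgment unit 31.

    def classify_groups(subjects, boundary_x, near_px=80):
        # Each subject is a dict with 'x_min', 'x_max' (image columns) and 'kind'.
        # The monocular region is assumed to lie left of boundary_x; near_px and the
        # 'kind' similarity test are hypothetical simplifications.
        cross = [s for s in subjects if s["x_min"] < boundary_x < s["x_max"]]
        compound_kinds = {s["kind"] for s in subjects if s["x_min"] >= boundary_x}
        enlarged, single = [], []
        for s in subjects:
            if s in cross or s["x_max"] > boundary_x:
                continue                                   # keep only monocular-region subjects
            if cross and (boundary_x - s["x_max"]) < near_px:
                enlarged.append(s)                         # near a border-crossing subject
            elif s["kind"] in compound_kinds:
                single.append(s)                           # same kind as a compound eye subject
        return {"cross": cross, "enlarged": enlarged, "single": single}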
(flow chart)
Fig. 6 is a flowchart showing the operation of the distance calculating device 6. First, the distance calculating device 6 acquires the 1 st captured image 100 and the 2 nd captured image 101 from the 1 st image capturing section 2 and the 2 nd image capturing section 3 (S01). Next, the distance calculating device 6 executes S02, S03, and S04 in parallel. However, the above is merely conceptual, and S02, S03, and S04 may be executed in any order. S02 is executed by the monocular distance information generating unit 10, S03 is executed by the compound eye distance information generating unit 11, and S04 is executed by the group information generating unit 30.
In S02, the distance calculating device 6 generates monocular distance information corresponding to the 1st captured image 100 using the 1st captured image 100, and monocular distance information corresponding to the 2nd captured image 101 using the 2nd captured image 101. That is, in S02 distance information is calculated twice for the compound eye region, once from each image. In S03, the distance calculating device 6 generates compound eye distance information of the overlapping region using the 1st captured image 100 and the 2nd captured image 101. In S04, the distance calculating device 6 detects subjects, generates the groups (S05), and determines whether or not the 1st group exists (S06).
When the execution of S02 and S03 is completed and the affirmative determination is made in S06, the distance calculating device 6 generates the distance difference information 120 of the 1 st group (S07). If the 1 st group does not exist (S06: NO), the distance calculating device 6 ends the process.
Finally, the distance calculating device 6 determines whether the generated distance difference information 120 is greater than a predetermined value (S08); if so, the correction execution unit 21 performs the correction processing of the monocular distance information 102 as described above and outputs the corrected distance information 104 (S09). If it is smaller than the predetermined value, the distance calculating device 6 ends the process. The distance calculating device 6 repeats this processing, for example, once per frame.
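The per-frame flow of fig. 6 can be summarized by the following skeleton; the individual processing units are passed in as callables, and the threshold of S08 is a hypothetical value.

    def process_frame(img1, img2, mono_fn, compound_fn, group_fn, diff_fn, correct_fn,
                      threshold=1.0):
        # Skeleton of the flow of fig. 6 (S01-S09); the units are supplied as callables.
        mono = mono_fn(img1, img2)                         # S02: monocular distance per image
        compound = compound_fn(img1, img2)                 # S03: compound eye distance of the overlap
        groups = group_fn(img1, img2, compound)            # S04/S05: subject detection and grouping
        if not groups.get("cross"):                        # S06: does a 1st group exist?
            return mono, compound
        ld = diff_fn(groups["cross"], mono, compound)      # S07: distance difference information
        if abs(ld) > threshold:                            # S08: compare with the predetermined value
            mono = correct_fn(groups, mono, compound, ld)  # S09: correct the monocular distances
        return mono, compound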
According to the first embodiment described above, the following operational effects can be obtained.
(1) The distance calculating device 6 includes: a compound eye distance information generating unit 11 that generates compound eye distance information 103, which is information of a distance with high accuracy, using the output of the sensor; a monocular distance information generating unit 10 that generates monocular distance information 102 having lower accuracy than the compound eye distance information 103 using the output of the sensor; and a single eye distance information correction unit 12 for correcting the single eye distance information 102 by using the compound eye distance information 103. Therefore, the distance information of the area with low accuracy can be corrected.
(2) The compound eye distance information generating unit 11 generates the compound eye distance information 103 using the outputs of the compound eye region, where the fields of view of the plurality of cameras overlap. The monocular distance information generating unit 10 generates the monocular distance information 102 of a monocular region using the output of a single camera. The distance calculating device 6 includes the group information generating unit 30, which associates, as a 1st group, one or more subjects included in the compound eye region and the monocular region based on a predetermined condition. The monocular distance information correction unit 12 corrects the monocular distance information 102 using the compound eye distance information 103, generated by the compound eye distance information generating unit 11, of the subjects belonging to the 1st group. Therefore, the monocular distance information 102 can be corrected using the compound eye distance information 103.
(3) The subjects belonging to group 1G 1 are the same subject existing across the single-eye region and the compound-eye region.
(4) The group information generating unit 30 associates an additional subject, which is included in the monocular region and satisfies a predetermined condition, with the 1st group to form a 2nd group. In the example shown in fig. 3, the subject 420, as the additional subject, and the 1st group G1 are associated with each other as the 2nd group G2. The monocular distance information correction unit 12 corrects the monocular distance information of the additional subject using the distance information of the 1st group.
(5) The monocular distance information correction unit 12 calculates a difference between the compound eye distance information 103 and the monocular distance information 102 in the same area as distance difference information, and determines whether correction is necessary for the monocular distance information 102 based on the distance difference information. Therefore, when the effect of the estimated correction is small, the correction can be omitted, and the processing load can be reduced.
(6) The monocular distance information correction unit 12 corrects the monocular distance information of the additional subject using the compound eye distance information of the 1 st group generated by the compound eye distance information generation unit 11 or the corrected monocular distance information generated by the monocular distance information correction unit 12.
Modification 1
The monocular distance information correction unit 12 may, in the correction of the monocular distance of the border-crossing group G1, replace the monocular distance information 102 of the partial subject 401 with the compound eye distance information 103 of the partial subject 400.
Modification 2
The monocular distance information correction unit 12 may, in the correction of the monocular distance of the single group G3, replace the monocular distance information 102 of the subjects 411, 412, and 413 existing in the monocular region with the compound eye distance information 103 of the subject 410 existing in the compound eye region.
Modification 3
Fig. 7 is a block diagram of the correction unit 20 in modification 3. In this modification, the input information 105 from the vehicle control unit 8 is not input to the correction determination unit 23. The correction determination unit 23 of this modification generates the correction determination information 121, which determines the correction method for the monocular distance information 102, based on the distance difference information 120. Specifically, the correction determination unit 23 determines that the monocular distance information 102 is to be corrected when the distance difference information 120 is equal to or greater than a predetermined threshold value, and that it is not to be corrected when the distance difference information 120 is less than the threshold value. Since this modification involves no control based on the input information 105, it has the advantage of a smaller processing load.
Modification 4
Fig. 8 is a diagram showing a calculation method of the distance difference information 120 and a correction process of the monocular distance information 102 for the border-crossing group G1 in modification 4. The grid cells drawn in the partial subject 400 and the partial subject 401 in fig. 8 are distance information blocks. Hereinafter, a row of distance information blocks arranged in the horizontal direction of the figure within each region of the partial subject 400 and the partial subject 401 is referred to as a "distance information block line". In this modification, the distance difference information 120 is generated for each distance information block line.
For each distance information block line, the monocular distance information 102 in the monocular region is replaced using distance difference information 120 generated from the compound eye distance information 103 of the distance information block line in the compound eye region T located at the boundary between the compound eye region T and the monocular region. Alternatively, for each distance information block line, the monocular distance information 102 in the monocular region may be replaced using differences between the monocular distance information 102 and the compound eye distance information 103 generated in distance information block units from the distance information block line 700.
The distance information block line 700 shown in fig. 8 is a distance information block line in the partial subject 400 in the compound eye region T. The distance information block line 701 in the figure is a distance information block line in the partial subject 401 in the monocular region.
The distance difference information generating unit 22 generates an average value LAm for each distance information block line based on the compound eye distance information 103 of the partial subject in the compound eye region T. It also generates an average value LAs for each distance information block line based on the monocular distance information 102 of the partial subject in the monocular region P, and generates the distance difference information 120 for each distance information block line from the average value LAm and the average value LAs.
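A minimal sketch of this per-line computation, assuming the block values of the two partial subjects are stored as 2-D arrays with one row per distance information block line, is as follows.

    import numpy as np

    def per_line_distance_difference(compound_blocks, mono_blocks):
        # One row per distance information block line: average each line of the
        # compound eye blocks (LAm) and of the monocular blocks (LAs), then return
        # the per-line difference LAm - LAs as the distance difference information.
        LAm_lines = np.asarray(compound_blocks, dtype=float).mean(axis=1)
        LAs_lines = np.asarray(mono_blocks, dtype=float).mean(axis=1)
        return LAm_lines - LAs_lines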
Modification 5
In the correction for the border-crossing group G1, the correction execution unit 21 may perform a correction in which the distance difference information 120 is added to the monocular distance information 102 before correction. In this case, predetermined pixels may be weighted. With weighting coefficients θ and η, and with Ls denoting the monocular distance information 102 and LD the distance difference information 120, the corrected monocular distance information, represented by the symbol LCs, is calculated as in formula 6 or formula 7 below.
LCs = θ × Ls + η × LD … (formula 6)
LCs = (1 - η) × Ls + η × LD … (formula 7)
Fig. 9 is a diagram showing a method of determining the weighting coefficient η in formula 6 or formula 7; here the weighting coefficient is referred to as the "correction coefficient". The correction execution unit 21 determines the correction coefficient η according to the correction function 1100, with the compound eye distance information 103 as the variable. The distance indicated by the compound eye distance information 103 is referred to as a short distance 1101 when it is less than L0, an intermediate distance 1102 when it is at least L0 and less than L1, and a long distance 1103 when it is L1 or more. The correction coefficient η is set to 0.0 when the compound eye distance information 103 is in the short distance 1101 and to 1.0 when it is in the long distance 1103. When the compound eye distance information 103 is in the intermediate distance 1102, the correction coefficient η varies with the compound eye distance information 103. L0 and L1 are predetermined thresholds.
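The correction function 1100 and the blending of formula 7 can be sketched as follows, using the mapping just described (η = 0 at short range, 1 at long range); the thresholds L0 and L1 are illustrative values only.

    import numpy as np

    def correction_coefficient(compound_eye_distance, L0=20.0, L1=60.0):
        # Correction coefficient eta: 0 below L0 (short distance), 1 above L1 (long
        # distance), linear in between.  L0 and L1 are illustrative thresholds.
        return float(np.clip((compound_eye_distance - L0) / (L1 - L0), 0.0, 1.0))

    def corrected_distance(Ls, LD, compound_eye_distance):
        # Formula 7: LCs = (1 - eta) * Ls + eta * LD.
        eta = correction_coefficient(compound_eye_distance)
        return (1.0 - eta) * Ls + eta * LD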
Fig. 10 is a diagram showing an example of the correction processing of the monocular distance information 102 using the correction coefficient η shown in fig. 9. Fig. 10 (a) shows the state before correction and fig. 10 (b) the state after correction. Regions 90A and 90B represent the partial subject, within the compound eye region T, of a subject at the intermediate distance 1102 that spans the compound eye region T and the monocular region P; the partial subjects 90A and 90B are the same. Regions 91A and 91B represent the partial subject, within the monocular region P, of the same subject as regions 90A and 90B; the partial subjects 91A and 91B are the same.
Regions 92A and 92B represent the partial subject, within the compound eye region T, of a subject at the long distance 1103 that spans the compound eye region T and the monocular region P; the partial subjects 92A and 92B are the same. Regions 93A and 93B represent the partial subject, within the monocular region P, of the same subject as regions 92A and 92B. The reference characters A, B, m, M, and n written in the regions schematically indicate the distance information of each region; regions marked with the same character have the same distance information.
For the subject at the intermediate distance 1102, the monocular distance information 102 of the partial subject in the monocular region P constituting the same subject is corrected based on the compound eye distance information 103 (90A) of the partial subject in the compound eye region T. For the subject at the long distance 1103, the monocular distance information 102 of the same subject in the monocular region P is not corrected, regardless of the compound eye distance information 103 (92A) of the partial subject in the compound eye region T.
According to this modification, the following operational effects can be obtained.
(7) The monocular distance information correction unit 12 determines whether to correct the monocular distance information 102 and parameters used for correction based on the compound eye distance information 103.
Modification 6
The distance difference information 120 may be calculated as follows. For each distance information block unit in a distance information block line, the difference between the compound eye distance information 103 of the partial subject in the compound eye region T and the monocular distance information 102 of the partial subject in the monocular region is generated. From these differences, an average difference value over the partial subject in the compound eye region T may be calculated and used as the distance difference information 120.
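A minimal sketch of this averaging, assuming the per-block distances of the same partial subject are available as two equal-length lists; the names are hypothetical and not part of the embodiment.

def distance_difference_per_line(compound_eye_blocks, monocular_blocks):
    # Difference per distance information block unit within one distance
    # information block line, then averaged to obtain the distance
    # difference information 120 for that line.
    diffs = [c - m for c, m in zip(compound_eye_blocks, monocular_blocks)]
    return sum(diffs) / len(diffs) if diffs else 0.0

# Illustrative usage with placeholder block values
ld = distance_difference_per_line([10.2, 10.4, 10.1], [11.0, 11.3, 10.9])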
Modification 7
In generating the distance difference information 120, the compound eye distance information 103 of the upper and lower distance information block lines may also be used. For example, a vertical N-tap filter (N: integer) may be used, such as an N-tap FIR (Finite Impulse Response) filter using N pixels in the vertical direction.
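As an illustration only, a vertical N-tap FIR filter applied across neighbouring distance information block lines might look like the sketch below; the symmetric smoothing coefficients and the edge clamping are assumptions chosen for the sketch, not values from the embodiment.

def vertical_fir(values_by_line, line_index, taps):
    # values_by_line: one value (e.g. a distance difference) per distance
    # information block line; taps: N FIR coefficients applied to N
    # vertically neighbouring lines centred on line_index.
    n = len(taps)
    half = n // 2
    acc = 0.0
    for k, coeff in enumerate(taps):
        # Clamp at the top and bottom of the image (an assumption).
        idx = min(max(line_index - half + k, 0), len(values_by_line) - 1)
        acc += coeff * values_by_line[idx]
    return acc

# Example: a 3-tap filter smoothing a line with its upper and lower neighbours
smoothed = vertical_fir([1.2, 1.4, 1.3, 1.5], line_index=1, taps=[0.25, 0.5, 0.25])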
Modification 8
The distance difference information 120 may be an average LAm within the distance information block line.
Modification 9
In the correction for the border-crossing group G1, the correction execution unit 21 may replace the monocular distance information 102 of all or a part of the distance information blocks in the distance information block line of the partial subject 401 with the average value LAm of the corresponding distance information block line of the partial subject in the compound eye region T. In the replacement, predetermined pixels may be weighted; for example, the weighting may be reduced as the distance from the boundary between the compound eye region T and the monocular region P increases.
Modification 10
In the correction for the border-crossing group G1, the correction execution unit 21 may add, to the monocular distance information 102 in a distance information block line, the distance difference information 120 of the same distance information block line. That is, corrected monocular distance information = monocular distance information 102 + distance difference information 120. In the addition, predetermined pixels may be weighted; for example, the weighting may be reduced as the distance from the boundary between the compound eye region T and the monocular region P increases.
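A sketch covering both modification 9 (replacement with LAm) and modification 10 (addition of LD), under the assumption of a simple linear decay of the weight with distance from the boundary; the decay shape and rate are illustrative only.

def boundary_weight(blocks_from_boundary, decay=0.2):
    # Weight 1.0 at the boundary with the compound eye region T,
    # decreasing as the block lies further into the monocular region P.
    return max(0.0, 1.0 - decay * blocks_from_boundary)

def correct_block_line(monocular_line, reference_value, mode="add"):
    # mode "replace": modification 9, blend towards the line average LAm.
    # mode "add":     modification 10, add the distance difference LD.
    corrected = []
    for i, ls in enumerate(monocular_line):
        w = boundary_weight(i)
        if mode == "replace":
            corrected.append((1.0 - w) * ls + w * reference_value)
        else:
            corrected.append(ls + w * reference_value)
    return corrected

# Illustrative usage with placeholder values
new_line = correct_block_line([11.0, 11.2, 11.5], reference_value=-0.8, mode="add")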
Modification 11
Fig. 11 is a diagram showing the correction processing of the monocular distance information 102 using the distance difference information 120 for the border-crossing group G1 according to modification 11. In the present modification, the subject 430 spans the compound eye region T and the monocular region P, and is constituted by the partial subject 401 in the monocular region P and the partial subject 400 in the compound eye region T. In this modification, the average value of the compound eye distance information 103 is calculated from the distance information block column 700, located within the compound eye region T at the boundary between the compound eye region T and the monocular region P, and the monocular distance information 102 of the partial subject 401 in the monocular region P is replaced with this average value. The distance information block column 700 is the column of the partial subject 400 in the compound eye region T that is closest to the monocular region P, that is, at the boundary with the monocular region P.
The distance difference information generating unit 22 generates an average value LAm of the distance information blocks at the left end of the partial subject 400 in the compound eye region T based on the compound eye distance information 103 of the partial subject in the compound eye region T. The distance difference information generating unit 22 also generates an average value LAs based on the monocular distance information 102 of the partial subject 401 in the monocular region P, and generates the distance difference information 120 for each distance information block line based on the average value LAm and the average value LAs.
The average value LAm can be, for example, the average of all or a part of the distance information in the distance information block column 700 at the left end of the partial subject 400 in the compound eye region T, and the average value LAs can be calculated in the same manner from the partial subject 401. In calculating the averages, predetermined pixels may be weighted. The distance difference information 120 can be, for example, the difference LD between the average value LAm and the average value LAs; the average value LAm or the average value LAs may be weighted when calculating the difference.
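A minimal sketch of modification 11, assuming the distance information blocks of the boundary column 700 and of the partial subject 401 are given as plain lists and omitting the optional weighting; all values are placeholders.

def column_average(distance_blocks):
    # Average of all (or, in practice, a selected part) of the distance
    # information in one distance information block column.
    return sum(distance_blocks) / len(distance_blocks)

# LAm: boundary column 700 of the partial subject 400 (compound eye region T)
# LAs: corresponding blocks of the partial subject 401 (monocular region P)
lam = column_average([10.1, 10.3, 10.2])
las = column_average([11.0, 11.2, 11.1])
ld = lam - las  # distance difference information 120 (difference LD)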
Modification 12
The distance difference information 120 may be an average value LA described below. The average value LA is generated from the differences, per distance information block unit of the partial subject 400 in the compound eye region T, between the compound eye distance information 103 of the partial subject 400 in the compound eye region T and the monocular distance information 102 of the partial subject 401 in the monocular region.
Modification 13
The distance difference information 120 may be the average value LAm of the distance information block column 700 at the left end of the partial subject 400 in the compound eye region T.
Modification 14
In the correction processing performed by the correction execution unit 21 for the border-crossing group G1, the monocular distance information 102 of all or a part of the distance information blocks of the partial subject 401 may be replaced with the average value LAm of the partial subject 400 in the compound eye region T. In the replacement, predetermined pixels may be weighted; for example, the weighting can be reduced as the distance from the boundary between the compound eye region T and the monocular region P increases.
Modification 15
In the correction processing performed by the correction execution unit 21 for the border-crossing group G1, the distance difference information 120 of the same distance information block may be added to the monocular distance information 102. In the addition, predetermined pixels may be weighted; for example, the weighting can be reduced as the distance from the boundary between the compound eye region T and the monocular region P increases.
Modification 16
Fig. 12 is a diagram showing the correction processing performed by the correction execution unit 21 according to modification 16 for the extended border-crossing group G2. Fig. 12 (a) and 12 (b) each show two subjects disposed in the compound eye region T and the monocular region P together with their distance information; fig. 12 (a) shows the state before correction, and fig. 12 (b) shows the state after correction.
Regions 83A and 83B represent a part, within the compound eye region T, of a subject spanning the compound eye region T and the monocular region P. The partial subjects 83A and 83B are the same. Regions 84A and 84B represent a part, within the monocular region P, of the same subject spanning the compound eye region T and the monocular region P. The partial subjects 84A and 84B are the same. Regions 85A and 85B represent a subject present only in the monocular region P. Note that the reference signs A, n, m, M, and N1 described in each region schematically indicate the distance information of each region, and the distance information of regions indicated by the same sign is the same.
The correction determination unit 23 first corrects the monocular distance information 102 of the partial subject in the monocular region P constituting the same subject from m to M based on the compound eye distance information 103 (A) of the partial subject in the compound eye region T. Next, the correction determination unit 23 corrects the monocular distance information 102 of the other subject in the monocular region P included in the extended border-crossing group G2 from n to N1 based on the corrected monocular distance information M generated for the partial subject in the monocular region P. For example, the value of N1 is determined so as to satisfy the relationship n − N1 = m − M.
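A sketch of this two-step correction, under the assumption that the same correction amount (M − m) is propagated to the other subject of the extended border-crossing group G2, i.e. n − N1 = m − M, and that for illustration the compound eye value A is adopted directly as M; all names and values are hypothetical.

def propagate_group_correction(a, m, n):
    # Step 1: correct the partial subject in the monocular region P
    #         from m to M, using the compound eye distance A (here the
    #         compound eye value is simply adopted as M).
    big_m = a
    shift = big_m - m  # correction amount applied to the group
    # Step 2: apply the same correction amount to the other subject of
    #         the extended border-crossing group G2 (n -> N1).
    n1 = n + shift
    return big_m, n1

# Illustrative usage with placeholder distances
M, N1 = propagate_group_correction(a=12.0, m=11.2, n=25.0)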
Modification 17
Fig. 13 is a diagram showing a method for determining the correction coefficient η, that is, the weighting coefficient shown in expression 6 or expression 7. In this modification, the correction coefficient η is determined using the traveling speed of the host vehicle included in the input information 105. The correction coefficient η is set to 1.0 when the traveling speed is equal to or lower than a predetermined value F0, and to 0.0 when the traveling speed exceeds the predetermined value F0.
Modification 18
The threshold L1 shown in fig. 9 may also be made variable. Fig. 14 is a diagram showing the threshold L1 in that case. The threshold L1 is determined based on the traveling speed of the vehicle 1 included in the input information 105. When the traveling speed is equal to or lower than a predetermined value F0, the threshold L1 takes its minimum value D0. When the traveling speed exceeds a predetermined value F1, the threshold L1 takes its maximum value D1. When the traveling speed is between F0 and F1, the value of the threshold L1 increases with an increase in the traveling speed.
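Sketches of modifications 17 and 18, assuming the step behaviour of fig. 13 and a linear increase of L1 between F0 and F1 as in fig. 14; the linear shape of the increase and all constants are assumptions for illustration.

def eta_from_speed(speed, f0):
    # Fig. 13: eta = 1.0 at or below F0, 0.0 above F0.
    return 1.0 if speed <= f0 else 0.0

def threshold_l1(speed, f0, f1, d0, d1):
    # Fig. 14: L1 = D0 up to F0, D1 above F1, increasing in between
    # (a linear increase is assumed here).
    if speed <= f0:
        return d0
    if speed > f1:
        return d1
    return d0 + (d1 - d0) * (speed - f0) / (f1 - f0)

# Illustrative usage with placeholder speeds and thresholds
eta = eta_from_speed(speed=15.0, f0=20.0)
l1 = threshold_l1(speed=45.0, f0=20.0, f1=60.0, d0=30.0, d1=80.0)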
Modification 19
Fig. 15 is a diagram illustrating the correction for the extended border-crossing group according to modification 19. Fig. 15 (a) shows the state before correction, and fig. 15 (b) shows the state after correction. The correction determination unit 23 determines whether or not to correct the monocular distance information of an additional subject in the extended border-crossing group as follows: correction is performed when the distance between the additional subject and the compound eye region is shorter than a predetermined threshold value, and is not performed when the distance is longer than the predetermined threshold value; the result is reflected in the correction determination information 121.
In the example shown in fig. 15, the subject 88A and the subject 89A both belong to the extended border-crossing group. The distance from the subject 88A to the compound eye region T, indicated by reference numeral 1500, is shorter than the predetermined value DB0. Therefore, the monocular distance information of the subject 88A is corrected from n to N. The distance from the subject 89A to the compound eye region T, indicated by reference numeral 1501, is greater than the predetermined value DB0. Therefore, the monocular distance information of the subject 89A is not corrected and remains k.
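A sketch of this determination, with the distance measure and the threshold DB0 treated as given inputs; the function name is hypothetical.

def should_correct_additional_subject(distance_to_compound_eye_region, db0):
    # Modification 19: correct the additional subject of the extended
    # border-crossing group only when it lies closer to the compound eye
    # region T than the predetermined threshold DB0.
    return distance_to_compound_eye_region < db0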
According to this modification, the following operational effects can be obtained.
(8) The monocular distance information correction unit 12 determines the presence or absence of the distance information correction and the correction parameter based on the distance between the other subject and the boundary between the monocular image and the compound eye image.
Modification 20
Fig. 16 is a diagram showing the configuration of the correction unit 20 in modification 20. The correction unit 20 in the present modification includes, in addition to the configuration of the first embodiment, a monocular region stay period holding unit 24. The monocular region stay period holding unit 24 generates, based on the group information 106, monocular region stay period information 122 indicating the length of time, for example the number of frames, for which the subject has existed in the monocular region P, and outputs it to the correction determination unit 23.
An example of the monocular region stay period information 122 is frame count information and/or time information for the period during which the subject has continuously existed in the monocular region P. When all or a part of the subject is present in the compound eye region T, the monocular region stay period information 122 may be initialized to 0. When the subject first appears in the monocular region, the monocular region stay period information 122 may be initialized to the maximum possible value or to a predetermined value.
Fig. 17 is a diagram showing a method by which the correction determination unit 23 determines the correction coefficient η. In this modification, the correction determination unit 23 determines the correction coefficient η used in expressions 6 and 7 based on the monocular region stay period information 122. The correction coefficient η is set to 1.0 when the monocular region stay period information 122 is equal to or smaller than a predetermined value Ta, and to 0.0 when it exceeds a predetermined value Tb. When the monocular region stay period information 122 is between Ta and Tb, the correction coefficient η is decreased as the monocular region stay period information 122 increases.
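A sketch of the correction coefficient of fig. 17, assuming a linear decrease between Ta and Tb; the linear shape is an assumption for illustration.

def eta_from_stay_period(stay_frames, ta, tb):
    # 1.0 while the subject has stayed in the monocular region P for at
    # most Ta frames, 0.0 beyond Tb frames, decreasing in between.
    if stay_frames <= ta:
        return 1.0
    if stay_frames > tb:
        return 0.0
    return 1.0 - (stay_frames - ta) / (tb - ta)

# Illustrative usage with placeholder frame counts
eta = eta_from_stay_period(stay_frames=12, ta=5, tb=20)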
According to this modification, the following operational effects can be obtained.
(9) The distance calculating device 6 includes the monocular region stay period holding unit 24, which generates the monocular region stay period information 122 for a group existing outside the compound eye region. The monocular distance information correction unit 12 determines the distance information correction method based on the monocular region stay period information 122.
Modification 21
Fig. 18 is a block diagram of the group information generation unit 30 according to modification 21. The group information generating unit 30 includes, in addition to the configuration shown in fig. 5, a monocular region stay period holding unit 33. The monocular region stay period holding unit 33 generates, based on the subject detection information 130, monocular region stay period information 131 indicating the length of time, for example the number of frames, for which the subject has existed in the monocular region P, and outputs it to the group determination unit 31.
An example of the monocular region stay period information 131 is frame count information and/or time information for the period during which the subject has continuously existed in the monocular region P. When all or a part of the subject is present in the compound eye region T, the monocular region stay period information 131 may be initialized to 0. When the subject first appears in the monocular region, the monocular region stay period information 131 may be initialized to the maximum possible value or to a predetermined value.
Modification 22
The correction execution unit 21 may determine the manner of correcting the monocular distance information according to the type of the subject and the speed of the vehicle 1. Here, the manner of correction refers to, for example, whether or not correction is performed and the parameters used for the correction.
Fig. 19 is a diagram showing a correspondence relationship between the type of subject and the manner of correction; it also shows the relationship between the speed of the vehicle 1 and the correction method. In fig. 19, the correction methods are classified into two stages according to the case distinctions. "Correction valid" indicates that the monocular distance information is corrected, and "correction invalid" indicates that it is not corrected. A percentage in fig. 19 indicates the value of the parameter used for the correction, that is, a correction using the parameters shown in expressions 6 and 7; "0%" means that no correction is performed.
When the correction execution unit 21 uses correction method 1, the monocular distance information is corrected when the type of subject is a person, a vehicle, or a road surface, and is not corrected when the type of subject is a structure. When the correction execution unit 21 uses correction method 3, whether to correct is decided according to the speed of the vehicle 1: during low-speed traveling, correction is performed regardless of the type of subject; during high-speed traveling, correction is not performed for persons and vehicles; and during medium-speed traveling, persons and vehicles are corrected in accordance with the speed.
In correction method 3, the road surface and structures are large regions and are stationary objects with continuity, so their correction is easy and is therefore performed regardless of the speed. Persons and vehicles, on the other hand, are relatively small in area and lack such continuity, so their correction is limited to low-speed traveling, where the necessity is high.
In correction method 4, the distance accuracy for persons is improved during low-speed traveling, and the distance accuracy for vehicles is improved during high-speed traveling. By changing the correction target according to the surrounding environment of the vehicle, in particular according to the difference in speed, the overall load of the correction processing is reduced, which contributes to cost reduction. Specifically, by reducing the computational capability required of the distance calculation device 6, the distance calculation device 6 can be manufactured using a computation device with lower processing capability.
Further, in the present modification, the correction may be performed only when the distance indicated by the monocular distance information before correction is shorter than a predetermined distance. When the type of subject is a person, the correction target may be limited to persons existing within a predetermined distance range from the road surface. This is because the road surface is a large region with continuity, so the correction accuracy is high, and because persons near the travel road require particular attention.
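The kind of correspondence shown in fig. 19 could be held as a simple lookup table, for example as sketched below; the table entries, subject types, and speed classes are placeholders and do not reproduce the actual values of fig. 19.

# Strength = weighting used in expressions 6 and 7; 0.0 corresponds to "0%",
# i.e. no correction. All entries are illustrative placeholders only.
CORRECTION_TABLE = {
    # (subject type, speed class): correction strength
    ("person",    "low"):  1.00,
    ("person",    "high"): 0.00,
    ("vehicle",   "low"):  1.00,
    ("vehicle",   "high"): 0.00,
    ("road",      "low"):  1.00,
    ("road",      "high"): 1.00,
    ("structure", "low"):  1.00,
    ("structure", "high"): 1.00,
}

def correction_strength(subject_type, speed_class):
    # Look up the correction strength; unknown combinations are not corrected.
    return CORRECTION_TABLE.get((subject_type, speed_class), 0.0)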
(10) The monocular distance information correction unit 12 determines a mode of correcting the monocular distance information based on at least one of the traveling state of the vehicle 1 and the type of the subject. Therefore, the distance calculating device 6 can perform correction based on the traveling state of the vehicle and the type of the subject.
Second embodiment-
A second embodiment of the distance calculating device will be described with reference to fig. 20 to 21. In the following description, the same reference numerals are given to the same constituent elements as those of the first embodiment, and differences will be mainly described. The contents not specifically described are the same as those of the first embodiment. In the present embodiment, mainly, a sensor mounted on the vehicle 1 is different from the first embodiment.
Fig. 20 is a schematic diagram of the distance calculation system S in the second embodiment. The vehicle 1 includes, as sensors for collecting surrounding information, the 1st image pickup unit 2 and a distance sensor 2000. The distance sensor 2000 is a sensor capable of acquiring distance information, for example a LiDAR (Light Detection and Ranging), a millimeter wave radar, a TOF (Time of Flight) type distance sensor, a stereo camera, or the like. The distance sensor 2000 obtains distance information of the area indicated by the solid line, that is, the hatched area in the center of the drawing. The 1st image pickup unit 2 has, as its imaging range, the wide range indicated by the dotted line, that is, a range including the range in which the distance sensor 2000 acquires distance information.
The area indicated by hatching corresponds to the compound eye area T in the first embodiment. The region other than the region indicated by hatching in the imaging range of the 1 st imaging unit 2 corresponds to the monocular region P and the monocular region S in the first embodiment.
Fig. 21 is a block diagram of the distance calculating device 6 according to the second embodiment; compared with the configuration of the first embodiment, the compound eye distance information generating unit 11 is omitted. The distance sensor 2000 outputs the distance information 2010 to the monocular distance information correcting unit 12. The monocular distance information correcting unit 12 handles the distance information 2010 as the compound eye distance information 103 of the first embodiment. Therefore, although the compound eye distance information generating unit 11 does not actually exist, it can be regarded as virtually existing and outputting the input sensor output directly as the compound eye distance information 103. The other structure is the same as the first embodiment.
According to the second embodiment described above, the following operational effects can be obtained.
(11) The compound eye distance information generating unit 11 generates the compound eye distance information 103 using an output of at least one of a TOF-type distance sensor, a millimeter wave radar, and a LiDAR. Therefore, the distance information of the monocular region can be corrected using the information of a distance sensor with higher accuracy.
Third embodiment-
A third embodiment of the distance calculating device will be described with reference to fig. 22 to 23. In the following description, the same reference numerals are given to the same constituent elements as those of the first embodiment, and differences will be mainly described. The contents not specifically described are the same as those of the first embodiment. In the present embodiment, mainly, a sensor mounted on the vehicle 1 is different from the first embodiment.
Fig. 22 is a schematic diagram of the distance calculation system S in the third embodiment. The vehicle 1 includes, as sensors for collecting surrounding information, a 1 st image pickup unit 2, a 2 nd image pickup unit 3, a 1 st distance sensor 2200, and a 2 nd distance sensor 2201. The 1 st distance sensor 2200 and the 2 nd distance sensor 2201 are sensors capable of acquiring distance information, and examples thereof include LiDAR, millimeter wave radar, TOF type distance sensors, and stereo cameras.
The 1st imaging unit 2 and the 2nd imaging unit 3 share the same field of view and capture the hatched area in the center of the drawing. The field of view of the 1st distance sensor 2200 is roughly the front left of the vehicle 1, and the field of view of the 2nd distance sensor 2201 is roughly the front right of the vehicle 1. The angle of view, installation position, and attitude of all the sensors are known. The area indicated by hatching corresponds to the compound eye region T in the first embodiment. The areas above and below the hatching correspond to the monocular regions P and S in the first embodiment.
Fig. 23 is a configuration diagram of the distance calculating device 6 according to the third embodiment; compared with the configuration of the first embodiment, the monocular distance information generating unit 10 is omitted. The 1st distance sensor 2200 and the 2nd distance sensor 2201 output the distance information 2210 and the distance information 2211 to the monocular distance information correcting unit 12. The monocular distance information correcting unit 12 extracts information of the required regions from the input distance information 2210 and distance information 2211, and handles it as the monocular distance information 102 of the first embodiment. Therefore, although the monocular distance information generating unit 10 does not actually exist, it can be regarded as virtually existing and outputting the input sensor output directly as the monocular distance information 102. The other structure is the same as the first embodiment.
According to the third embodiment described above, the following operational effects can be obtained.
(12) The monocular distance information generating unit 10 generates the monocular distance information 102 using an output of at least one of a TOF-type distance sensor, a millimeter wave radar, and a LiDAR.
Fourth embodiment-
A fourth embodiment of the distance calculating device will be described with reference to fig. 24 to 25. In the following description, the same reference numerals are given to the same constituent elements as those of the first embodiment, and differences will be mainly described. The contents not specifically described are the same as those of the first embodiment. In the present embodiment, mainly, the angle of view of a camera mounted in the vehicle 1 and the posture of the camera are different from those of the first embodiment.
Fig. 24 is a schematic diagram of the distance calculation system S in the fourth embodiment. The vehicle 1 includes, as sensors that collect surrounding information, the 1st image pickup unit 2 and the 2nd image pickup unit 3, as in the first embodiment. However, in the present embodiment, the 1st image pickup unit 2 and the 2nd image pickup unit 3 have wide angles of view, and their fields of view coincide. The 1st imaging unit 2 and the 2nd imaging unit 3 have, for example, fisheye lenses and can capture a wide range, but the distortion at the periphery of the lens is large and the resolution there is low.
The region indicated by hatching is near the center of the fields of view of the 1st imaging unit 2 and the 2nd imaging unit 3, and corresponds to the compound eye region T in the first embodiment. The regions outside the center of the fields of view of the 1st imaging unit 2 and the 2nd imaging unit 3 have large distortion and low spatial resolution, and are therefore regarded as corresponding to the monocular region P and the monocular region S in the first embodiment. In other words, an area with poor photographing conditions is treated as a monocular region, and an area with relatively good photographing conditions is treated as a compound eye region.
Fig. 25 is a block diagram of the distance calculating device 6 according to the fourth embodiment; compared with the configuration of the first embodiment, the monocular distance information generating unit 10 is omitted. The monocular distance information correction unit 12 handles only the vicinity of the center of the compound eye distance information 103 output from the compound eye distance information generation unit 11 as the compound eye distance information 103 of the first embodiment, and handles the peripheral portion as the monocular distance information 102 of the first embodiment. The other structure is the same as the first embodiment.
According to the fourth embodiment described above, the following operational effects can be obtained.
(13) The compound eye distance information generating unit 11 generates the compound eye distance information 103 using information of a region in which the fields of view output from the two cameras overlap and the photographing conditions are good, and the monocular distance information generating unit 10 generates the monocular distance information 102 using information of a region in which the fields of view output from the two cameras overlap but the photographing conditions are worse than in the region used by the compound eye distance information generating unit 11. The present invention can thus also be applied to a system in which no monocular region exists.
(modification of the fourth embodiment)
The photographing conditions may include not only lens distortion but also the amount of light and dust adhering to the lens. For example, when a lens with small distortion over its entire circumference is used, the distance information of a dark area with a small amount of light may be handled as the monocular distance information 102, and the distance information of a bright, clearly photographed area may be handled as the compound eye distance information 103.
Fifth embodiment-
A fifth embodiment of the distance calculating device will be described with reference to fig. 26. In the following description, the same reference numerals are given to the same constituent elements as those of the first embodiment, and differences will be mainly described. The content not specifically described is the same as that of the first embodiment. In the present embodiment, mainly, correspondence between functions and hardware is different from that in the first embodiment.
Fig. 26 is a block diagram of the distance calculation system S according to the fifth embodiment. In the present embodiment, the monocular distance information correction unit 12 is implemented by hardware different from that of the monocular distance information generation unit 10 and the compound eye distance information generation unit 11. The camera processing device 15 includes the monocular distance information generating unit 10, the compound eye distance information generating unit 11, and a feature information generating unit 13. The feature information generating unit 13 generates a feature amount of the input image, for example an edge image or a histogram, and outputs it as feature information 107 to the image processing unit 60. The ECU 14 includes the monocular distance information correction unit 12, the identification processing unit 7, and the vehicle control unit 8.
According to the fifth embodiment described above, the present invention can be implemented by various hardware configurations.
The above embodiments and modifications may be combined with each other. In the above description, various embodiments and modifications have been described, but the present invention is not limited to these. Other aspects that are considered to be within the scope of the technical idea of the present invention are also included in the scope of the present invention.
The disclosures of the following priority-based filed applications are incorporated herein by reference.
Japanese patent application 2018-141891 (filed on 27 days 7.2018)
Description of the reference numerals
1 … vehicle
2 … No. 1 image pickup section
3 nd image pickup unit 3 …
6 … distance calculating device
7 … identification processing part
8 … vehicle control part
10 … monocular distance information generating part
11 … compound eye distance information generating part
12 … monocular distance information correction unit
15 … camera processing device
20 … correction part
21 … correction executing part
22 … distance difference information generating part
23 … correction judging part
24 … holding part during monocular region stay
30 … group information generating part
31 and … group judgment part
32 … subject detection unit
33 … holding part during monocular region stay
100 … 1 st captured image
101 … No. 2 captured image
102 … monocular distance information
103 … compound eye distance information
104 … corrected distance information
105 … input information
106 … group information
120 … distance difference information
121 … correction judgment information
122 … monocular region stay period information
130 … shot object detection information
131 … monocular region stay period information
140 … identification information.

Claims (11)

1. A distance calculating apparatus, comprising:
a high-precision distance information generating unit that generates high-precision distance information, which is information of a distance having high precision, using the output of the sensor;
a low-precision distance information generating unit that generates low-precision distance information having a lower precision than the high-precision distance information, using an output of the sensor; and
A distance correcting section for correcting the low-precision distance information using the high-precision distance information,
the high-precision distance information generating section generates the high-precision distance information using outputs of compound eye regions in which fields of view overlap among outputs of a plurality of cameras,
the low-precision distance information generating section generates the low-precision distance information of a monocular region using an output of a single camera,
the distance calculating device further includes a group information generating unit that sets a 1 st group of 1 or more subjects included in each of the compound eye region and the monocular region based on a predetermined condition, wherein the plurality of subjects existing adjacently in the compound eye region and the monocular region can be divided into one 1 st group on the condition that at least 1 of the kind of subject, the shape of subject, the color of subject, and the position of subject in the up-down direction is the same or similar,
the distance correction unit corrects the low-precision distance information using the high-precision distance information of the subject belonging to the group 1 generated by the high-precision distance information generation unit.
2. The distance calculating apparatus according to claim 1, wherein:
the subject belonging to the group 1 is the same subject existing across the single-eye region and the compound-eye region.
3. The distance calculating apparatus according to claim 2, wherein:
the group information generating unit associates the subject included in the monocular region and satisfying a predetermined condition with the 1 st group as a 2 nd group as an additional subject,
the distance correction unit corrects the low-precision distance information of the additional subject using the distance information of the 1 st group.
4. The distance calculating apparatus according to claim 1, wherein:
the distance correction unit calculates a difference between the high-precision distance information and the low-precision distance information for the same area as distance difference information, and determines whether to correct the low-precision distance information based on the distance difference information.
5. A distance calculating apparatus according to claim 3, wherein:
the distance correction unit corrects the low-precision distance information of the additional subject using the compound eye distance information of the 1 st group generated by the high-precision distance information generation unit or the corrected single eye distance information generated by the distance correction unit.
6. The distance calculating apparatus according to claim 1, wherein:
the distance correction unit determines a mode of correcting the low-precision distance information based on the high-precision distance information.
7. The distance calculating apparatus according to claim 1, wherein:
the distance correction unit determines a mode for correcting the low-precision distance information based on the traveling state of the vehicle.
8. The distance calculating apparatus according to claim 1, wherein:
the distance correction unit determines a mode for correcting the low-precision distance information based on the type of the subject.
9. The distance calculating apparatus according to claim 1, wherein:
the distance correction unit determines a distance information correction mode based on a distance between an object to be corrected for the low-precision distance information and a boundary between the single-eye region and the compound-eye region.
10. The distance calculating apparatus according to claim 1, wherein:
also comprises a compound eye region outside stay period generating part for generating period information of the group outside the compound eye region,
the distance correction unit determines a mode of correcting the distance information based on the period information.
11. The distance calculating apparatus according to claim 1, wherein:
the distance correction unit determines grouping based on at least one condition among the compound eye distance of the 1 st group, travel information, the type of group, the distance from the boundary, and the period.
CN201980049441.1A 2018-07-27 2019-07-04 Distance calculating device Active CN112513571B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018141891A JP7042185B2 (en) 2018-07-27 2018-07-27 Distance calculation device
JP2018-141891 2018-07-27
PCT/JP2019/026647 WO2020022021A1 (en) 2018-07-27 2019-07-04 Distance calculation device

Publications (2)

Publication Number Publication Date
CN112513571A CN112513571A (en) 2021-03-16
CN112513571B true CN112513571B (en) 2023-08-25

Family

ID=69180414

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980049441.1A Active CN112513571B (en) 2018-07-27 2019-07-04 Distance calculating device

Country Status (3)

Country Link
JP (1) JP7042185B2 (en)
CN (1) CN112513571B (en)
WO (1) WO2020022021A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7341079B2 (en) * 2020-02-10 2023-09-08 住友重機械工業株式会社 Distance image estimation device and control device
JP2022041219A (en) * 2020-08-31 2022-03-11 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Control device, distance measurement sensor, imaging device, control method, and program
JP7499140B2 (en) * 2020-10-14 2024-06-13 日立Astemo株式会社 Object Recognition Device
US20220268899A1 (en) * 2021-02-22 2022-08-25 Shenzhen Camsense Technologies Co., Ltd Ranging apparatus, lidar, and mobile robot

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000121319A (en) * 1998-10-15 2000-04-28 Sony Corp Image processor, image processing method and supply medium
CN1820496A (en) * 2003-10-31 2006-08-16 三菱电机株式会社 Image correcting method and imaging apparatus
WO2006123615A1 (en) * 2005-05-19 2006-11-23 Olympus Corporation Distance measuring apparatus, distance measuring method and distance measuring program
JP2007263669A (en) * 2006-03-28 2007-10-11 Denso It Laboratory Inc Three-dimensional coordinates acquisition system
CN106662441A (en) * 2014-09-11 2017-05-10 日立汽车***株式会社 Image processing device
CN110053625A (en) * 2018-01-19 2019-07-26 本田技研工业株式会社 Apart from computing device and controller of vehicle

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1881450A1 (en) * 2005-05-10 2008-01-23 Olympus Corporation Image processing apparatus, image processing method, and image processing program
JP2008304202A (en) * 2007-06-05 2008-12-18 Konica Minolta Holdings Inc Method and apparatus for distance image generation and program
JP6660751B2 (en) * 2016-02-04 2020-03-11 日立オートモティブシステムズ株式会社 Imaging device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000121319A (en) * 1998-10-15 2000-04-28 Sony Corp Image processor, image processing method and supply medium
CN1820496A (en) * 2003-10-31 2006-08-16 三菱电机株式会社 Image correcting method and imaging apparatus
WO2006123615A1 (en) * 2005-05-19 2006-11-23 Olympus Corporation Distance measuring apparatus, distance measuring method and distance measuring program
JP2007263669A (en) * 2006-03-28 2007-10-11 Denso It Laboratory Inc Three-dimensional coordinates acquisition system
CN106662441A (en) * 2014-09-11 2017-05-10 日立汽车***株式会社 Image processing device
CN110053625A (en) * 2018-01-19 2019-07-26 本田技研工业株式会社 Apart from computing device and controller of vehicle

Also Published As

Publication number Publication date
CN112513571A (en) 2021-03-16
JP7042185B2 (en) 2022-03-25
JP2020016628A (en) 2020-01-30
WO2020022021A1 (en) 2020-01-30

Similar Documents

Publication Publication Date Title
CN112513571B (en) Distance calculating device
EP3358302B1 (en) Travel control method and travel control device
EP3304886B1 (en) In-vehicle camera system and image processing apparatus
US9074906B2 (en) Road shape recognition device
US6734787B2 (en) Apparatus and method of recognizing vehicle travelling behind
JP6254084B2 (en) Image processing device
JP2019525568A5 (en)
KR102397156B1 (en) A method of providing a camera system and driver assistance functions for photographing the surrounding area of one's vehicle
WO2016194296A1 (en) In-vehicle camera system and image processing apparatus
CN103381825B (en) Use the full speed lane sensing of multiple photographic camera
CN110053625B (en) Distance calculation device and vehicle control device
US20200118280A1 (en) Image Processing Device
JP7371053B2 (en) Electronic devices, mobile objects, imaging devices, and control methods, programs, and storage media for electronic devices
JP2019146012A (en) Imaging apparatus
JP2017129543A (en) Stereo camera device and vehicle
US20220036730A1 (en) Dangerous driving detection device, dangerous driving detection system, dangerous driving detection method, and storage medium
KR20170014626A (en) Image providing apparatus for curved road
EP3865815A1 (en) Vehicle-mounted system
WO2021131064A1 (en) Image processing device, image processing method, and program
JP6891082B2 (en) Object distance detector
KR20210034626A (en) How to determine the type of parking space
JP7185571B2 (en) Viewing direction estimation device, viewing direction estimation method, and program
CN112634653B (en) Image processing apparatus
EP4246467A1 (en) Electronic instrument, movable apparatus, distance calculation method, and storage medium
JP2021154951A (en) Evaluation device, evaluation method and evaluation program

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
CB02 Change of applicant information

Address after: Ibaraki

Applicant after: Hitachi astemo Co.,Ltd.

Address before: Ibaraki

Applicant before: HITACHI AUTOMOTIVE SYSTEMS, Ltd.

GR01 Patent grant
GR01 Patent grant