CN112513571A - Distance calculating device


Info

Publication number
CN112513571A
CN112513571A
Authority
CN
China
Prior art keywords
distance
information
distance information
monocular
group
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201980049441.1A
Other languages
Chinese (zh)
Other versions
CN112513571B (en)
Inventor
稻田圭介
内田裕介
野中进一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Astemo Ltd
Original Assignee
Hitachi Automotive Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Automotive Systems Ltd
Publication of CN112513571A
Application granted
Publication of CN112513571B
Legal status: Active


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/593 Depth or shape recovery from multiple images from stereo images
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The distance calculation device of the present invention includes: a high-precision distance information generating unit that generates high-precision distance information using an output of a sensor; a low-precision distance information generating unit that generates, using an output of the sensor, low-precision distance information whose precision is lower than that of the high-precision distance information; and a distance correction unit that corrects the low-precision distance information using the high-precision distance information.

Description

Distance calculating device
Technical Field
The present invention relates to a distance calculation device.
Background
A technique is known in which the distance between a camera and an observation target is calculated using a stereo camera and recognition processing of the observation target is performed. A method is also known in which the two cameras constituting a stereo camera are laterally offset so that a compound-eye region is formed in front and monocular regions are formed on the left and right, thereby realizing highly accurate, wide-field monitoring of the vehicle periphery. Patent document 1 discloses an object detection device for detecting an object present on a reference surface, the device including: a parallax calculation unit that calculates a parallax between images captured by a plurality of imaging devices; a region calculation unit that calculates, based on the parallax, a common region of the plurality of images and a non-common region other than the common region; a 1st generation unit that generates, in the common region, a candidate rectangle for object detection, namely a rectangle in which the degree of uniformity of the parallax at each point included in the rectangle is greater than a predetermined threshold value; a 2nd generation unit that generates such a candidate rectangle in the non-common region; and a determination unit that determines whether or not the generated candidate rectangle includes a detection target.
Documents of the prior art
Patent document
Patent document 1: japanese laid-open patent publication No. 2010-79582
Disclosure of Invention
Problems to be solved by the invention
There is a demand for improving the distance information of regions where distance information cannot be calculated with high accuracy.
Means for solving the problems
A distance calculation device according to a first aspect of the present invention includes: a high-precision distance information generating unit that generates high-precision distance information using an output of a sensor; a low-precision distance information generating unit that generates, using an output of the sensor, low-precision distance information whose precision is lower than that of the high-precision distance information; and a distance correction unit that corrects the low-precision distance information using the high-precision distance information.
Advantageous Effects of Invention
According to the present invention, distance information of a region with low accuracy can be corrected.
Drawings
Fig. 1 is a schematic diagram of a distance calculation system S including a distance calculation device 6.
Fig. 2 is a configuration diagram of a vehicle control system S of the first embodiment.
Fig. 3(a) is a diagram showing the 1 st captured image 100, fig. 3 (b) is a diagram showing the 2 nd captured image 101, and fig. 3 (c) is a diagram showing a group.
Fig. 4 is a structural diagram of the correction unit 20.
Fig. 5 is a configuration diagram of the group information generating unit 30.
Fig. 6 is a flowchart showing the operation of the distance calculation device 6.
Fig. 7 is a configuration diagram of the correction unit 20 in modification 3.
Fig. 8 is a diagram showing a method of calculating the distance difference information 120 and a process of correcting the monocular distance information 102 for the cross-boundary group G1 in modification 4.
Fig. 9 is a diagram showing a method of determining the weighting coefficient η shown in equation 6 or equation 7.
Fig. 10 is a diagram showing an example of the correction process of the monocular distance information 102 using the correction coefficient η shown in fig. 9. Fig. 10 (a) shows a diagram before correction, and fig. 10 (b) shows a diagram after correction.
Fig. 11 is a diagram showing the correction processing of the distance difference information 120 and the monocular distance information 102 for the cross-boundary group G1 in modification 11.
Fig. 12 is a diagram showing the correction processing performed by the correction execution unit 21 for the extended cross-boundary group G2 in modification 12. Fig. 12 (a) is a diagram before correction, and fig. 12 (b) is a diagram after correction.
Fig. 13 is a diagram illustrating a method of determining the correction coefficient η expressed by equation 6 or equation 7.
Fig. 14 is a diagram showing the threshold value L1 shown in fig. 9.
Fig. 15 is a diagram for explaining the correction for the extended cross-boundary group in modification 19. Fig. 15 (a) shows a diagram before correction, and fig. 15 (b) shows a diagram after correction.
Fig. 16 is a diagram showing a configuration of a correction unit 20 in modification 16.
Fig. 17 is a diagram illustrating a method of determining the correction coefficient η of the correction determination unit 23.
Fig. 18 is a configuration diagram of the group information generating unit 30 according to modification 21.
Fig. 19 is a diagram showing a correspondence relationship between the type of subject and the correction method.
Fig. 20 is a schematic diagram of a distance calculation system S according to the second embodiment.
Fig. 21 is a configuration diagram of the distance calculation device 6 in the second embodiment.
Fig. 22 is a schematic diagram of a distance calculation system S according to the third embodiment.
Fig. 23 is a configuration diagram of the distance calculation device 6 in the third embodiment.
Fig. 24 is a schematic diagram of a distance calculation system S according to the fourth embodiment.
Fig. 25 is a configuration diagram of the distance calculation device 6 in the fourth embodiment.
Fig. 26 is a configuration diagram of the distance calculation system S in the fifth embodiment.
Detailed Description
First embodiment
A first embodiment of a distance calculating device will be described below with reference to fig. 1 to 6.
(Structure)
Fig. 1 is a schematic diagram of a distance calculation system S including a distance calculation device 6. The distance calculation system S is mounted on the vehicle 1 and includes a 1st image pickup unit 2 and a 2nd image pickup unit 3. The 1st image pickup unit 2 and the 2nd image pickup unit 3 are arranged side by side facing the front in the traveling direction of the vehicle 1. The angles of view of the 1st image pickup unit 2 and the 2nd image pickup unit 3 are known, and, as shown in fig. 1, their fields of view partially overlap. The hatched area in fig. 1 is the area where the fields of view overlap.
Fig. 2 is a configuration diagram of a vehicle control system S of the first embodiment. The vehicle control system S includes a 1 st image pickup unit 2, a 2 nd image pickup unit 3, a distance calculation device 6, a recognition processing unit 7, and a vehicle control unit 8. The distance calculation device 6 includes a monocular distance information generation unit 10, a compound eye distance information generation unit 11, and a monocular distance information correction unit 12. The monocular distance information correcting unit 12 includes a correcting unit 20 and a group information generating unit 30. The 1 st captured image 100, which is an image captured by the 1 st imaging unit 2, is output to the distance calculation device 6. The 2 nd captured image 101, which is an image captured by the 2 nd imaging unit 3, is output to the distance calculation device 6.
The distance calculating device 6 may be a single ECU (Electronic Control Unit) or may be an arithmetic device having other functions. The distance calculation device 6 may be integrally configured with the 1 st image pickup unit 2 and the 2 nd image pickup unit 3. The distance calculation device 6, the recognition processing unit 7, and the vehicle control unit 8 may be realized by an integrated device. The recognition processing unit 7 and the vehicle control unit 8 may be realized by an integrated electronic control device, or may be realized by different electronic control devices.
The monocular distance information generating unit 10 generates monocular distance information 102 using the 1 st captured image 100 and the 2 nd captured image 101, respectively. The monocular distance information generating unit 10 calculates a distance from the monocular image in a known manner. That is, the monocular distance information generating unit 10 does not use the 1 st captured image 100 and the 2 nd captured image 101 in combination in generating the monocular distance information 102. The monocular distance information generating unit 10 uses the distance information calculated using the 1 st captured image 100 and the distance information calculated using the 2 nd captured image 101 as the monocular distance information 102.
For example, assuming that the camera views a flat plane in the horizontal direction, the monocular distance information generating unit 10 calculates the distance by treating positions higher in the captured image as being farther away. The monocular distance information generating unit 10 calculates the distance for each distance information block composed of 1 or more pixels. The distance information block may be, for example, 16 pixels composed of 4 pixels in the vertical direction × 4 pixels in the horizontal direction, 2 pixels composed of 2 pixels in the vertical direction × 1 pixel in the horizontal direction, or 1 pixel in the vertical direction × 1 pixel in the horizontal direction. The monocular distance information 102 generated by the monocular distance information generating unit 10 is output to the correcting unit 20 of the monocular distance information correcting unit 12.
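As a rough illustration of this flat-plane assumption, the sketch below estimates a distance for each distance information block from the image row of the block. The function name, the camera height, the focal length, and the horizon position are hypothetical values chosen for the example, not parameters taken from the patent.
import numpy as np

def monocular_block_distances(image_height_px, block_rows_px, camera_height_m=1.2,
                              focal_length_px=1000.0, horizon_ratio=0.45):
    # Sketch: assume the camera looks horizontally over a flat plane; blocks whose
    # row centers are closer to the horizon line are treated as farther away.
    horizon_px = image_height_px * horizon_ratio
    distances = []
    for row_center_px in block_rows_px:
        pixels_below_horizon = row_center_px - horizon_px
        if pixels_below_horizon <= 0:
            distances.append(np.inf)  # at or above the horizon: no plane intersection
        else:
            distances.append(camera_height_m * focal_length_px / pixels_below_horizon)
    return np.array(distances)

# Blocks lower in a 960-pixel-high image are estimated as nearer.
print(monocular_block_distances(960, block_rows_px=[900, 700, 500]))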
The compound eye distance information generating unit 11 generates compound eye distance information 103 using a combination of the 1 st captured image 100 and the 2 nd captured image 101, specifically, a region where the fields of view of the 1 st captured image 100 and the 2 nd captured image 101 overlap. Since the positional relationship including the base line length of the 1 st image pickup unit 2 and the 2 nd image pickup unit 3 is known, the compound eye distance information generating unit 11 calculates the distance from the parallax of the pixel common to the 1 st captured image 100 and the 2 nd captured image 101. The compound eye distance information 103 calculated by the compound eye distance information generating unit 11 is also calculated in units of distance information blocks.
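Since the base line length and focal length are known, the per-block distance in the compound-eye region follows the standard stereo relation Z = f × B / d. The sketch below illustrates this conversion; the baseline and focal-length values and the function name are illustrative assumptions, not values from the patent.
import numpy as np

def compound_eye_block_distances(disparity_px, baseline_m=0.30, focal_length_px=1000.0):
    # Sketch: convert the per-block parallax (disparity, in pixels) between the
    # 1st and 2nd captured images into a distance using Z = f * B / d.
    d = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return np.where(d > 0, baseline_m * focal_length_px / d, np.inf)

# Larger parallax means a closer object.
print(compound_eye_block_distances([30.0, 10.0, 2.0]))  # roughly 10 m, 30 m, 150 m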
Comparing the monocular distance information 102 with the compound eye distance information 103, the compound eye distance information 103 is more accurate. Accordingly, the monocular distance information 102 can be referred to as "low-precision distance information", and the compound eye distance information 103 can be referred to as "high-precision distance information". Similarly, the monocular distance information generator 10 may be referred to as a "low-accuracy distance information generator", and the compound-eye distance information generator 11 may be referred to as a "high-accuracy distance information generator".
The size of the distance information block used by the monocular distance information generating unit 10 may differ from the size of the distance information block used by the compound eye distance information generating unit 11; in the present embodiment, the distance information blocks are made the same size for simplicity of description. The compound eye distance information 103 generated by the compound eye distance information generating unit 11 is output to the correction unit 20 of the monocular distance information correcting unit 12 and to the group information generating unit 30.
The group information generating unit 30 detects a subject and performs group processing on a plurality of subjects using the 1 st captured image 100, the 2 nd captured image 101, and the compound eye distance information 103, and outputs the result to the correcting unit 20 as group information 106. The operation of the group information generating unit 30 will be described in detail later.
The group information 106 generated by the group information generating unit 30 includes the type of the group and the positions, on the image, of the subjects constituting the group. The group information 106 may also include information on the type of subject. The type of the subject is, for example, a person, an animal, a vehicle, a road surface, a curb, a side road, a white line, a guardrail, a sign, a structure, a signal, a fallen object on the road, a pit in the road surface, a puddle, or the like. The recognition processing unit 7 performs recognition processing using the corrected distance information 104 generated by the distance calculation device 6.
The correction unit 20 corrects the monocular distance information 102 using the compound eye distance information 103 and the group information 106, and outputs the corrected monocular distance information 102 and the compound eye distance information 103 as the corrected distance information 104. The detailed configuration and operation of the correction unit 20 will be described later.
The vehicle control unit 8 controls the vehicle 1 using the identification information 140 generated by the identification processing unit 7. Further, the vehicle control unit 8 outputs the input information 105 to the distance calculation device 6. The input information 105 contains at least one of the traveling information, the identification information 140, the traffic infrastructure information, the map information, and the vehicle control information of the vehicle 1. The vehicle control unit 8 can acquire the traveling information of the vehicle 1 from a sensor, not shown, mounted on the vehicle 1. The vehicle control unit 8 can acquire traffic infrastructure information from outside the vehicle 1 using a communication device not shown. The vehicle control unit 8 can acquire map information by referring to a map database using a communication device not shown, or by referring to a map database built in the vehicle 1.
Examples of the travel information include travel speed information, acceleration information, position information, the traveling direction, and the travel mode of the vehicle 1. Examples of the travel mode include an automatic driving mode, a travel mode based on the travel route, a travel mode based on the travel situation, a travel mode based on the surrounding natural environment, and an energy-saving mode for traveling with reduced power or fuel consumption. Examples of the automatic driving mode include a manual driving mode in which vehicle control is executed according to human judgment, a driving assistance mode in which a part of vehicle control is assisted according to system judgment, and an automatic driving mode in which vehicle control is executed according to system judgment. Examples of the travel mode based on the travel route include a city travel mode, an expressway travel mode, and legal speed information. Examples of the travel mode based on the travel situation include a traffic-jam travel mode, a parking lot mode, and a travel mode corresponding to the positions and activity of surrounding vehicles. Examples of the travel mode based on the surrounding natural environment include a night travel mode and a backlight travel mode. Examples of the map information include position information of the vehicle 1 on the map, road shape information, road surface and ground object information, road width information, lane information, and road gradient information. Examples of the road shape information include a T-junction and an intersection. Examples of the road surface and ground object information include a traffic signal, a lane portion, a sidewalk portion, a railroad crossing, a bicycle parking area, a car parking area, and a pedestrian crossing.
Examples of the identification information 140 include position information, type information, motion information, and risk information of the subject. Examples of the position information include the direction and distance from the vehicle 1. Examples of the type information include pedestrians, adults, children, the elderly, animals, falling rocks, bicycles, surrounding vehicles, surrounding structures, and curbs. Examples of the motion information include wobbling, sudden running-out, road crossing, and the moving direction, moving speed, and moving trajectory of pedestrians and bicycles. Examples of the risk information include sudden running-out of a pedestrian, falling rocks, and abnormal behavior of surrounding vehicles such as sudden braking, sudden deceleration, and sudden steering. Examples of the traffic infrastructure information include traffic congestion information, traffic control information such as speed limits and traffic restrictions, travel route guidance information for guiding alternative routes, accident information, warning information, surrounding vehicle information, and map information. Examples of the vehicle control information include brake control, steering wheel control, accelerator control, vehicle lamp control, emission of warning sounds, vehicle camera control, and output of information on an observation target object in the vicinity of the imaging device to surrounding vehicles or to a remote center apparatus connected via a network. The vehicle control unit 8 may perform subject detection processing based on the information processed by the distance calculation device 6 or the recognition processing unit 7; it may display, on a display device connected to the vehicle control unit 8, the image obtained by the 1st image pickup unit 2 or the 2nd image pickup unit 3 together with identification results for the viewer; or it may provide information on the observation target object detected by the distance calculation device 6 or the recognition processing unit 7 to an information device that processes traffic information such as map information and traffic congestion information.
(captured image and group)
Fig. 3 is a diagram showing the regions of the captured images. Fig. 3(a) is a diagram showing the 1st captured image 100 acquired by the 1st imaging unit 2, fig. 3(b) is a diagram showing the 2nd captured image 101 acquired by the 2nd imaging unit 3, and fig. 3(c) is a diagram showing the groups. The 1st image pickup unit 2 and the 2nd image pickup unit 3 are arranged side by side in the horizontal direction, and figs. 3(a) and 3(b) are drawn so that their horizontal positions correspond to the image pickup positions. That is, the region Q of the 1st captured image 100 captures the same area as the region R of the 2nd captured image 101.
Therefore, by using the region Q of the 1st captured image 100 and the region R of the 2nd captured image 101 together, the compound eye distance information generating section 11 can calculate the distance with high accuracy. Hereinafter, the information contained in the region Q and the region R is collectively referred to as "information of the region T", with the meaning that the two pieces of information of the region Q and the region R are used together, and the region T is referred to as the "compound-eye region". On the other hand, for the region P of the 1st captured image 100 and the region S of the 2nd captured image 101, information can be obtained from only one of the 1st imaging unit 2 and the 2nd imaging unit 3, and the monocular distance information generating unit 10 can calculate the distance only with low accuracy. Hereinafter, each of the regions P and S is referred to as a "monocular region".
The subjects shown in fig. 3 are as follows. In the 1st captured image 100, a subject 410, a subject 420, and a subject 430 are captured. The subject 430 spans the region P and the region Q; for convenience of description, the part of the subject 430 in the region P is referred to as the partial subject 401, and the part of the subject 430 in the region Q is referred to as the partial subject 400. The subject 410 is present at the right end of the region Q, and the subject 420 is present near the center of the region P. In the 2nd captured image 101, the partial subject 400 is captured at the left end portion of the region R, and the subject 410 is captured near the right end of the region R. In addition, in the region S of the 2nd captured image 101, a subject 411, a subject 412, and a subject 413 are captured in order from the left end. The shapes of the subject 410, the subject 411, the subject 412, and the subject 413 are substantially the same.
In the present embodiment, the set of the partial object 400 and the partial object 401, in other words, the object 430 itself is set as a group G1, the set of the object 420 and the object 430 is set as a group G2, and the set of the object 410, the object 411, the object 412, and the object 413 is set as a group G3. Group G1, group G2, and group G3 are also referred to as group 1, group 2, and group 3, respectively.
In the present embodiment, a group consisting of the parts of a single subject that exists across the compound-eye region and a monocular region is referred to as a "cross-boundary group". In the example shown in fig. 3, the group G1 consisting of the parts of the subject 430 existing across the compound-eye region Q and the monocular region P, that is, the partial subject 400 and the partial subject 401, is classified as a cross-boundary group. The group G1 is hereinafter referred to as the "cross-boundary group G1".
A group consisting of the subjects constituting a cross-boundary group together with a subject that exists in a monocular region in the vicinity of a subject belonging to that cross-boundary group is referred to as an "extended cross-boundary group". In the example shown in fig. 3, the group G2 consisting of the subject 420, which exists in the monocular region P in the vicinity of the subject 430 belonging to the cross-boundary group G1, and the subject 430 constituting the cross-boundary group G1, is classified as an extended cross-boundary group. The group G2 is hereinafter referred to as the "extended cross-boundary group G2". A subject such as the subject 420, which is included in the extended cross-boundary group G2 but not in the cross-boundary group G1, is hereinafter referred to as an "additional subject".
A group consisting of a plurality of adjacent subjects spanning the compound-eye region and a monocular region is referred to as an "individual group". The condition is that the subjects belonging to the individual group are the same as or similar to one another in at least one of the type of subject, the shape of the subject, the color of the subject, and the position of the subject in the vertical direction. In the example shown in fig. 3, the group G3, in which the subject 411, the subject 412, and the subject 413 exist adjacent to one another in the monocular region S, is classified as an individual group. The group G3 is hereinafter referred to as the "individual group G3".
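As an illustration of how such group information 106 might be represented and how a cross-boundary group could be detected, a minimal sketch follows. The class names, field names, and classification rule are hypothetical simplifications, not structures defined in the patent.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SubjectRegion:
    subject_id: int
    kind: str               # e.g. "vehicle", "person"
    in_compound_eye: bool   # part of the subject lies in the compound-eye region
    in_monocular: bool      # part of the subject lies in a monocular region
    bbox: tuple             # (x, y, width, height) on the captured image

@dataclass
class Group:
    group_type: str         # "cross_boundary", "extended_cross_boundary", "individual"
    subjects: List[SubjectRegion] = field(default_factory=list)

def classify_cross_boundary(subjects: List[SubjectRegion]) -> List[Group]:
    # A subject that straddles the compound-eye / monocular boundary forms a
    # cross-boundary group (group G1 in the text).
    return [Group("cross_boundary", [s])
            for s in subjects if s.in_compound_eye and s.in_monocular]

groups = classify_cross_boundary([
    SubjectRegion(430, "vehicle", True, True, (0, 100, 300, 120)),
    SubjectRegion(420, "vehicle", False, True, (40, 110, 80, 60)),
])
print([g.group_type for g in groups])  # ['cross_boundary']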
(correcting section 20)
Fig. 4 is a structural diagram of the correction unit 20. The correction unit 20 includes a distance difference information generation unit 22, a correction determination unit 23, and a correction execution unit 21. The distance difference information generating unit 22 generates distance difference information 120 based on the monocular distance information 102, the compound-eye distance information 103, and the group information 106. The distance difference information 120 will be described later.
The correction determination unit 23 determines a correction method for the monocular distance information 102 based on the distance difference information 120, the compound eye distance information 103, and the input information 105, and outputs correction determination information 121. The correction execution unit 21 performs correction processing on the monocular distance information 102 based on the distance difference information 120, the group information 106, and the correction determination information 121, and generates corrected distance information 104 including the compound eye distance information 103 and the corrected monocular distance information. The correction judgment information 121 will be described later.
(distance difference information generating section 22)
The operation of the distance difference information generating unit 22 will be described in detail with reference to fig. 3 again. The distance difference information generating unit 22 calculates an average value LAm for the partial subject 400 included in the compound-eye region T based on the compound eye distance information 103. Next, the distance difference information generating unit 22 calculates an average value LAs based on the monocular distance information 102 of the partial subject 400 calculated using only the monocular image. Then, the distance difference information generating unit 22 calculates the distance difference information 120 using the average value LAm and the average value LAs. The average value LAm and the average value LAs may each be calculated as a simple average over the entire region of the partial subject 400, or as a weighted average in which blocks closer to the boundary are weighted more heavily.
The calculation for the weighted average is described below. Let the compound eye distance information 103 of the n distance information blocks k constituting the partial subject 400 to be calculated be Lm(k), the monocular distance information 102 be Ls(k), the weighting coefficient for the compound eye distance information 103 be α(k), and the weighting coefficient for the monocular distance information 102 be β(k), where k is an integer from 0 to n-1. In this case, the average values LAm and LAs are expressed by formula 1 and formula 2.
LAm = (1/n) Σ_{k=0}^{n-1} ( α(k) × Lm(k) ) … (formula 1)
LAs = (1/n) Σ_{k=0}^{n-1} ( β(k) × Ls(k) ) … (formula 2)
The distance difference information 120 is calculated as the difference between the average value LAm and the average value LAs, for example. However, the mean LAm or the mean LAs may also be weighted when calculating the difference. When the weighting coefficient for the compound-eye distance information 103 indicated by the symbol LAm is γ and the weighting coefficient for the monocular distance information 102 indicated by LAs is δ, the distance difference information 120 indicated by LD can be calculated as shown in the following equation 3.
LD = γ × LAm - δ × LAs … (formula 3)
The distance difference information 120 may also be calculated as follows. The difference between the compound eye distance information 103 of the partial subject 400 in the compound-eye region T and the monocular distance information 102 of the partial subject 400 in the region Q may be calculated for each distance information block, and the distance difference information 120 may be generated from these per-block differences for the partial subject 400 in the compound-eye region T.
Further, the distance difference information 120 may be a value obtained by weighting the average value LAm. That is, when the weighting coefficient is ε, the distance difference information 120 represented by the symbol LD can be calculated as in the following formula 4.
LD = ε × LAm … (formula 4)
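The sketch below puts formulas 1 to 3 together for one partial subject; the weights default to 1 so that the simple averages are obtained. The function name and the example values are assumptions for illustration only.
import numpy as np

def distance_difference(Lm_blocks, Ls_blocks, alpha=None, beta=None, gamma=1.0, delta=1.0):
    # LAm, LAs: weighted averages over the n distance information blocks of the
    # partial subject (formulas 1 and 2); LD: their weighted difference (formula 3).
    Lm = np.asarray(Lm_blocks, dtype=float)   # compound eye distance information Lm(k)
    Ls = np.asarray(Ls_blocks, dtype=float)   # monocular distance information Ls(k)
    n = len(Lm)
    alpha = np.ones(n) if alpha is None else np.asarray(alpha, dtype=float)
    beta = np.ones(n) if beta is None else np.asarray(beta, dtype=float)
    LAm = np.sum(alpha * Lm) / n              # formula 1
    LAs = np.sum(beta * Ls) / n               # formula 2
    LD = gamma * LAm - delta * LAs            # formula 3
    return LAm, LAs, LD

# The compound-eye blocks read about 20 m while the monocular estimate drifts to about 26 m.
print(distance_difference([20.1, 19.8, 20.0], [26.0, 25.5, 26.5]))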
(correction judging section 23)
The correction determination unit 23 determines whether or not the distance information of the monocular area needs to be corrected using the input information 105 acquired from the vehicle control unit 8, and generates the correction determination information 121. However, the correction determination information 121 may include parameters used for correction.
The correction determination unit 23 determines not to correct the distance information of the monocular region, for example, in the following cases: when the vehicle speed exceeds a predetermined speed, when the driving mode is a predetermined mode, when the current position of the vehicle 1 is on an expressway, or when the steering angle is equal to or larger than a predetermined angle. These conditions may also be used in combination. This determination is made because monitoring at short distances is important when traveling at low speed: at short distances the same subject is captured large and the parallax increases, so the accuracy of the distance increases. In urban driving and when turning right or left at urban intersections, people are likely to dart out in front of the vehicle, and the risk of contact between a person and the vehicle increases. Conversely, the risk of contact between a person and the vehicle is greatly reduced on expressways, national roads on which vehicles travel at high speed, and the like.
The correction determination unit 23 determines the parameter used for weighting, for example, as follows. The larger the value of the parameter, the more strongly the distance information of the compound-eye region is reflected in the correction. The correction determination unit 23 sets the parameter larger as the vehicle speed increases, maximizing the parameter when the vehicle speed is larger than a first predetermined value and minimizing it when the vehicle speed is smaller than a second predetermined value. The correction determination unit 23 also determines the place where the vehicle 1 is traveling: for example, it sets the parameter large when traveling in a place where people frequently travel, medium on an ordinary road other than such a place, and minimum on an expressway. The correction determination unit 23 further sets the parameter larger as the steering angle of the vehicle 1 becomes larger and smaller as the steering angle becomes smaller. These conditions may be combined.
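A literal sketch of this parameter determination is shown below. The speed thresholds, the steering-angle scale, the place categories, and the rule that combines the three conditions (here simply the maximum) are illustrative assumptions, not values from the patent.
def correction_parameter(speed_kmh, place, steering_deg, v_low=15.0, v_high=60.0):
    # Speed term: minimal below v_low, maximal above v_high, linear in between.
    if speed_kmh <= v_low:
        p_speed = 0.0
    elif speed_kmh >= v_high:
        p_speed = 1.0
    else:
        p_speed = (speed_kmh - v_low) / (v_high - v_low)
    # Place term: large where people frequently travel, medium on ordinary roads,
    # minimal on expressways.
    p_place = {"pedestrian_area": 1.0, "ordinary_road": 0.5, "expressway": 0.0}.get(place, 0.5)
    # Steering term: a larger steering angle gives a larger parameter.
    p_steer = min(abs(steering_deg) / 45.0, 1.0)
    return max(p_speed, p_place, p_steer)  # one possible way to combine the conditions

print(correction_parameter(30.0, "pedestrian_area", 10.0))  # 1.0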
By providing the correction determination unit 23, the distances to objects around the vehicle, that is, people, other vehicles, parked vehicles, and the like, can be measured accurately during low-speed driving and when turning right or left at intersections, where the risk of contact between a person and the vehicle is high, and the risk of an accident can thereby be greatly reduced. In automatic driving, improving the accuracy of the distance information referred to by the automatic driving system enables more accurate automatic driving judgments and prevents automatic driving errors caused by erroneous detection resulting from erroneous distance information. Limiting the correction to low-speed travel also reduces the processing load during high-speed travel.
(correction execution part 21)
The correction execution unit 21 corrects the monocular distance information 102 using a different method for each group type, based on the group information 106. The distance difference information 120 output by the distance difference information generating unit 22 is used mainly when the cross-boundary group G1 is the processing target. The correction determination information 121 is used to determine whether correction of the cross-boundary group G1 is necessary and to determine the correction parameters. The correction processing for each group type is described below.
(Correction of the cross-boundary group by the correction execution unit 21)
The correction execution unit 21 corrects the monocular distance information 102 of the partial subject 401 existing in the monocular region P. In the correction, for example, the monocular distance information 102 of all or some of the distance information blocks of the partial subject 401 is replaced with the average value LAm of the partial subject 400 in the compound-eye region T. When replacing, predetermined pixels may be weighted; as an example of the weighting, the weight may be made smaller as the distance from the boundary becomes longer. The corrected monocular distance information 102, represented by the symbol LCs, is calculated as in the following formula 5, where ζ is the weighting coefficient.
LCs = ζ × LD … (formula 5)
Then, the correction execution unit 21 outputs the corrected monocular distance information 102 as a part of the corrected distance information 104.
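The following sketch shows one plausible form of this replacement, in which each block of the partial subject 401 is blended with the compound-eye average LAm using a weight that is largest at the boundary and decreases with distance from it. The blending form and the weight profile are assumptions; the text above only states that the replacement may be weighted.
import numpy as np

def correct_cross_boundary_blocks(Ls_blocks, LAm, zeta=None):
    # Ls_blocks: monocular distances of the partial subject in the monocular region,
    # ordered from the block nearest the compound-eye boundary outward.
    Ls = np.asarray(Ls_blocks, dtype=float)
    if zeta is None:
        zeta = np.linspace(1.0, 0.5, len(Ls))  # weight decays away from the boundary
    return zeta * LAm + (1.0 - zeta) * Ls

print(correct_cross_boundary_blocks([26.0, 26.5, 27.0], LAm=20.0))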
(Correction of the extended cross-boundary group by the correction execution unit 21)
When the extended cross-boundary group is the processing target, the correction execution unit 21 corrects the distance information as follows. That is, the correction execution unit 21 corrects the distance information of the partial subject in the monocular region of the extended cross-boundary group in the same manner as for the cross-boundary group, and corrects the distance information of the additional subject in the monocular region of the extended cross-boundary group using the distance information of the partial subject in the compound-eye region. Specifically, in the example shown in fig. 3, the distance information of the partial subject 401 in the monocular region P of the extended cross-boundary group G2 is corrected in the same way as for the cross-boundary group G1, and the distance information of the subject 420 in the monocular region P is corrected using the distance information of the partial subject 400 in the compound-eye region T.
The correction of the distance information of a subject in the monocular region using the distance information of the partial subject in the compound-eye region is performed, for example, as follows. That is, the distance information of each distance information block of the subject in the monocular region is set to the average of the distance of the entire partial subject in the compound-eye region and the average of the distance information blocks of that subject in the monocular region.
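Under that reading, the correction of an additional subject could look like the sketch below; the function name and values are hypothetical.
import numpy as np

def correct_additional_subject(additional_blocks, LAm):
    # Each block of the additional subject in the monocular region is set to the
    # average of (a) the compound-eye average LAm of the partial subject and
    # (b) the mean of the additional subject's own monocular blocks.
    blocks = np.asarray(additional_blocks, dtype=float)
    corrected_value = 0.5 * (LAm + blocks.mean())
    return np.full_like(blocks, corrected_value)

print(correct_additional_subject([24.0, 25.0, 26.0], LAm=20.0))  # every block becomes 22.5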
(Correction of the individual group by the correction execution unit 21)
When the individual group is the processing target, the correction execution unit 21 corrects the monocular distance information 102 of the subjects imaged only in the monocular region, using the compound eye distance information 103 of a subject belonging to the individual group. Taking the example shown in fig. 3, the monocular distance information 102 of the subject 411, the subject 412, and the subject 413 is corrected using the compound eye distance information 103 of the subject 410 calculated from the compound-eye region T and the monocular distance information 102 of the subject 410 calculated from the monocular region R. The subject 410 is also captured in the monocular region Q, but since that is an image captured by a camera different from the one that captured the monocular region S to be corrected, the monocular distance information 102 obtained from the region Q is not used for the correction in this case.
The correction execution unit 21 calculates, for example, a difference between the compound eye distance information 103 and the monocular distance information 102 of the subject 410, and adds the difference to the monocular distance information 102 of the subject 411, the subject 412, and the subject 413.
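A minimal sketch of this offset-based correction follows; the subject identifiers and distance values are illustrative.
import numpy as np

def correct_individual_group(compound_eye_ref, monocular_ref, monocular_targets):
    # The reference subject (410) is seen in both the compound-eye region and a
    # monocular region of the same camera; the difference between its compound-eye
    # and monocular distances is added to the other subjects in the group.
    offset = np.mean(compound_eye_ref) - np.mean(monocular_ref)
    return {sid: np.asarray(blocks, dtype=float) + offset
            for sid, blocks in monocular_targets.items()}

print(correct_individual_group(
    compound_eye_ref=[18.0, 18.2],   # subject 410, compound-eye distance
    monocular_ref=[21.0, 21.2],      # subject 410, monocular distance (region R)
    monocular_targets={411: [22.0, 22.4], 412: [23.0], 413: [25.0]}))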
(group information generating section 30)
Fig. 5 is a configuration diagram of the group information generating unit 30. The group information generating unit 30 includes an object detecting unit 32 and a group determining unit 31.
(subject detecting section 32)
The subject detection unit 32 of the group information generation unit 30 detects a subject using the 1 st captured image 100 output from the 1 st imaging unit 2 and the 2 nd captured image 101 output from the 2 nd imaging unit 3, and outputs subject detection information 130. In the detection of the subject, the 1 st captured image 100 or the 2 nd captured image 101 may be used, or both the 1 st captured image 100 and the 2 nd captured image 101 may be used. Further, the 1 st captured image 100 or the 2 nd captured image 101, which are captured at different timings, may also be used.
Since the positional relationship between the 1 st image pickup unit 2 and the 2 nd image pickup unit 3, the postures of the 1 st image pickup unit 2 and the 2 nd image pickup unit 3, and the angles of view of the 1 st image pickup unit 2 and the 2 nd image pickup unit 3 are known, a rough region where the 1 st captured image 100 and the 2 nd captured image 101 overlap can be calculated in advance. The subject detection unit 32 calculates the region where the 1 st captured image 100 and the 2 nd captured image 101 overlap each other by the calculation in advance, and specifies the monocular region and the compound eye region. This makes the boundary between the compound eye region and the single eye region clear.
For example, pattern matching and a neural network can be used for detecting the subject. The condition of the subject detected by the subject detecting unit 32 may be input as the input information 105. The condition of the subject refers to the type, size, positional information, and the like of the subject. The subject detection information 130 includes the type of subject, the size of the subject, the subject across the boundary between the compound eye region and the monocular region, the subject within the monocular region, the position information of the subject, and the information of the distance information block constituting the subject.
(group judgment section 31)
The group determination section 31 calculates the group information 106 based on the subject detection information 130, the compound eye distance information 103, and the input information 105. The group determination unit 31 classifies a subject that crosses the boundary between the compound-eye region and a monocular region as one group, a cross-boundary group. When another subject exists in the monocular region in the vicinity of a cross-boundary group, the group determination unit 31 classifies the cross-boundary group together with that subject as another group, an extended cross-boundary group. The group determination unit 31 also classifies a plurality of adjacent subjects in the monocular region as one group, an individual group.
(flow chart)
Fig. 6 is a flowchart showing the operation of the distance calculation device 6. First, the distance calculation device 6 acquires the 1 st captured image 100 and the 2 nd captured image 101 from the 1 st image capturing unit 2 and the 2 nd image capturing unit 3 (S01). Next, the distance calculation means 6 executes S02, S03, S04 in parallel. However, this is only a conceptual representation, and S02, S03, and S04 may be performed in any order. S02 is executed by the monocular distance information generating section 10, S03 is executed by the compound-eye distance information generating section 11, and S04 is executed by the group information generating section 30.
In S02, the distance calculation device 6 generates monocular distance information corresponding to the 1st captured image 100 using the 1st captured image 100, and generates monocular distance information corresponding to the 2nd captured image 101 using the 2nd captured image 101. That is, in S02, distance information is calculated twice for the compound-eye region, once from each image. In S03, the distance calculation device 6 generates compound eye distance information for the overlapping region using the 1st captured image 100 and the 2nd captured image 101. In S04, the distance calculation device 6 detects the subjects, creates the groups (S05), and determines whether or not a 1st group is present (S06).
When the execution of S02 and S03 is completed and an affirmative determination is made in S06, the distance calculation device 6 generates the distance difference information 120 of the 1st group (S07). If the 1st group is not present (S06: NO), the distance calculation device 6 ends the process.
Finally, the distance calculation device 6 determines whether or not the generated distance difference information 120 is larger than a predetermined value (S08); if it is, the correction execution unit 21 performs the correction processing of the monocular distance information 102 as described above and outputs the corrected distance information 104 (S09). If the distance difference information 120 is not larger than the predetermined value, the distance calculation device 6 ends the processing. The distance calculation device 6 repeats these processes, for example, every frame.
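The per-frame flow of fig. 6 can be summarized as in the sketch below. The callables bundled in processing_fns, the group representation, and the threshold handling are placeholders for the units described above, not an implementation taken from the patent.
def distance_calculation_frame(image_1, image_2, processing_fns):
    mono_1 = processing_fns["monocular"](image_1)                   # S02
    mono_2 = processing_fns["monocular"](image_2)                   # S02
    compound = processing_fns["compound_eye"](image_1, image_2)     # S03
    groups = processing_fns["group"](image_1, image_2, compound)    # S04, S05
    group_1 = [g for g in groups if g["type"] == "cross_boundary"]  # S06
    if not group_1:
        return (mono_1, mono_2), compound                           # no 1st group: end
    ld = processing_fns["distance_difference"](group_1, (mono_1, mono_2), compound)  # S07
    if abs(ld) > processing_fns["threshold"]:                       # S08
        mono_1, mono_2 = processing_fns["correct"](group_1, (mono_1, mono_2), ld)    # S09
    return (mono_1, mono_2), compound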
According to the first embodiment described above, the following operational effects can be obtained.
(1) The distance calculation means 6 includes: a compound eye distance information generating unit 11 that generates compound eye distance information 103, which is information of a distance with high accuracy, using the output of the sensor; a monocular distance information generating unit 10 that generates monocular distance information 102 having lower accuracy than the compound eye distance information 103 using the output of the sensor; and a monocular distance information correcting unit 12 for correcting the monocular distance information 102 using the compound eye distance information 103. Therefore, the distance information of the region with low accuracy can be corrected.
(2) The compound eye distance information generation unit 11 generates compound eye distance information 103 using an output of a compound eye region in which fields of view overlap among outputs of a plurality of cameras. The monocular distance information generating unit 10 generates monocular distance information 102 for the monocular area using the output of the single camera. The distance calculation device 6 includes a group information generation unit 30 that associates 1 or more subjects included in each of the compound-eye region and the single-eye region as a 1 st group based on a predetermined condition. The monocular distance information correcting unit 12 corrects the monocular distance information 102 using the compound eye distance information 103 of the subject belonging to the 1 st group generated by the compound eye distance information generating unit 11. Therefore, the monocular distance information 102 can be corrected using the compound eye distance information 103.
(3) The subject belonging to the 1 st group G1 is the same subject existing across the monocular area and the compound eye area.
(4) The group information generating unit 30 associates the subject included in the monocular area and satisfying a predetermined condition as an additional subject with the 1 st group as the 2 nd group. In the example shown in fig. 3, the subject 420 as the additional subject and the 1 st group G1 are associated with each other as the 2 nd group G2. The monocular distance information correcting unit 12 corrects the monocular distance information of the additional object using the distance information of the group 1.
(5) The monocular distance information correcting unit 12 calculates a difference between the compound eye distance information 103 and the monocular distance information 102 for the same area as distance difference information, and determines whether the monocular distance information 102 needs to be corrected based on the distance difference information. Therefore, when the effect of the estimated correction is small, the correction can be omitted, and the processing load can be reduced.
(6) The monocular distance information correcting unit 12 corrects the monocular distance information of the additional object using the compound eye distance information of the group 1 generated by the compound eye distance information generating unit 11 or the corrected monocular distance information generated by the monocular distance information correcting unit 12.
(modification 1)
In the monocular distance correction for the cross-boundary group G1, the monocular distance information correcting unit 12 may replace the monocular distance information 102 of the partial subject 401 with the compound eye distance information 103 of the partial subject 400.
(modification 2)
In the monocular distance correction for the individual group G3, the monocular distance information correcting unit 12 may replace the monocular distance information 102 of the subjects 411, 412, and 413 existing in the monocular region with the compound eye distance information 103 of the subject 410 existing in the compound-eye region.
(modification 3)
Fig. 7 is a configuration diagram of the correction unit 20 in modification 3. The correction determination unit 23 in the present modification does not receive input of the input information 105 from the vehicle control unit 8. The correction determination unit 23 of the present modification generates correction determination information 121 for determining a method of correcting the monocular distance information 102 based on the distance difference information 120. Specifically, the correction determination unit 23 determines that the monocular distance information 102 is to be corrected when the distance difference information 120 is equal to or greater than a predetermined threshold, and determines that the monocular distance information 102 is not to be corrected when the distance difference information 120 is less than the predetermined threshold. According to this modification, there is no control based on the input information 105, and thus there is an advantage that the processing load is small.
(modification 4)
Fig. 8 is a diagram showing a method of calculating the distance difference information 120 and a process of correcting the monocular distance information 102 for the cross-boundary group G1 in modification 4. Each cell of the grid drawn on the partial subject 400 and the partial subject 401 in fig. 8 is a distance information block. Hereinafter, a row of distance information blocks arranged in the horizontal direction in the figure within each region of the partial subject 400 and the partial subject 401 is referred to as a "distance information block line". In the present modification, the distance difference information 120 is generated for each distance information block line.
For each distance information block line, the monocular distance information 102 in the monocular region is replaced using the distance difference information 120 generated from the compound eye distance information 103 of the compound-eye region T at the boundary between the compound-eye region T and the monocular region. Alternatively, for each distance information block line, the monocular distance information 102 in the monocular region may be replaced using the difference between the compound eye distance information 103 and the monocular distance information 102 generated in units of distance information blocks in the distance information block line 700.
A distance information block line 700 shown in fig. 8 is a distance information block line in a part of the subject 400 in the compound eye region T. In the figure, a distance information block line 701 is a distance information block line in a part of the subject 401 in the monocular area.
The distance difference information generating unit 22 generates an average value LAm for each distance information block line based on the compound eye distance information 103 of the partial subject in the compound-eye region T. It then generates an average value LAs for each distance information block line based on the monocular distance information 102 of the partial subject in the monocular region P, and generates the distance difference information 120 for each distance information block line based on the average value LAm and the average value LAs.
(modification 5)
In the correction for the cross-boundary group G1, the correction execution unit 21 may perform a correction in which the distance difference information 120 is added to the monocular distance information 102 before correction. In this case, predetermined pixels may be weighted. With Ls the symbol of the monocular distance information 102 and LD the symbol of the distance difference information 120, the corrected monocular distance information, represented by the symbol LCs, is calculated as in the following formula 6 or formula 7, where η and θ are weighting coefficients.
LCs = θ × Ls + η × LD … (formula 6)
LCs = (1 - η) × Ls + η × LD … (formula 7)
Fig. 9 is a diagram showing a method of determining the weighting coefficient η shown in formula 6 or formula 7; here the weighting coefficient is referred to as the "correction coefficient". The correction execution unit 21 determines the correction coefficient η in accordance with the correction function 1100, which takes the compound eye distance information 103 as a variable. Here, the distance indicated by the compound eye distance information 103 is referred to as the short distance 1101 when it is less than L0, the intermediate distance 1102 when it is equal to or greater than L0 and less than L1, and the long distance 1103 when it is equal to or greater than L1. The correction coefficient η is set to 0.0 when the compound eye distance information 103 indicates the short distance 1101, and to 1.0 in the case of the long distance 1103. When the compound eye distance information 103 indicates the intermediate distance 1102, the correction coefficient η changes in accordance with the compound eye distance information 103. L0 and L1 are predetermined thresholds.
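The sketch below is a literal reading of the correction function 1100 described in the preceding paragraph; the threshold values L0 and L1 and the linear ramp between them are illustrative assumptions.
def correction_coefficient(compound_eye_distance_m, L0=10.0, L1=50.0):
    # eta = 0.0 for the short distance 1101, 1.0 for the long distance 1103,
    # and a value that changes with the compound-eye distance in between (1102).
    if compound_eye_distance_m < L0:
        return 0.0
    if compound_eye_distance_m >= L1:
        return 1.0
    return (compound_eye_distance_m - L0) / (L1 - L0)

# Formula 7 applied with this coefficient:
Ls, LD = 30.0, 22.0
eta = correction_coefficient(25.0)
print((1 - eta) * Ls + eta * LD)  # 27.0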
Fig. 10 is a diagram showing an example of the correction process of the monocular distance information 102 using the correction coefficient η shown in fig. 9. Fig. 10(a) shows the state before correction, and fig. 10(b) shows the state after correction. The regions 90A and 90B represent the partial subject, within the compound-eye region T, of a subject that is at the intermediate distance 1102 and spans the compound-eye region T and the monocular region P. The partial subjects 90A and 90B are the same. The regions 91A and 91B represent the partial subject within the monocular region P of the same subject as the regions 90A and 90B. The partial subjects 91A and 91B are the same.
The regions 92A and 92B represent the partial subject, within the compound-eye region T, of a subject that is at the long distance 1103 and spans the compound-eye region T and the monocular region P. The partial subjects 92A and 92B are the same. The regions 93A and 93B represent the partial subject within the monocular region P of the same subject as the regions 92A and 92B. The reference symbols a, B, M, and n written in the respective regions schematically indicate the distance information of each region, and regions given the same symbol have the same distance information.
Based on the compound eye distance information 103 (90A) of the partial subject that is in the compound-eye region T and at the intermediate distance 1102, the monocular distance information 102 of the partial subject in the monocular region P constituting the same subject is corrected. On the other hand, for the subject at the long distance 1103, the correction processing of the monocular distance information 102 of its partial subject in the monocular region P is not performed, regardless of the compound eye distance information 103 (92A) of its partial subject in the compound-eye region T.
According to this modification, the following operational effects can be obtained.
(7) The monocular distance information correcting unit 12 determines whether or not to correct the monocular distance information 102 and a parameter used for the correction, based on the compound eye distance information 103.
(modification 6)
The distance difference information 120 may also be calculated as follows (see the sketch below). For each distance information block unit in a distance information block line, a difference value is generated between the compound eye distance information 103 of the partial subject in the compound eye region T and the monocular distance information 102 of the partial subject in the non-compound-eye region. An average difference value for the partial subject in the compound eye region T may then be calculated from these difference values, and this average difference value may be used as the distance difference information 120.
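A minimal sketch of this per-line averaging, assuming the two distance maps are already aligned block by block (the names and data layout are illustrative only):

```python
def distance_difference_for_line(compound_eye_blocks, monocular_blocks):
    """Average of the per-block differences between the compound eye distance
    information 103 and the monocular distance information 102 within one
    distance information block line, used as the distance difference information 120."""
    diffs = [c - m for c, m in zip(compound_eye_blocks, monocular_blocks)]
    return sum(diffs) / len(diffs) if diffs else 0.0
```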
(modification 7)
When generating the distance difference information 120, the compound eye distance information 103 of the distance information block lines above and below may also be used. One example is a vertical N-tap filter (N being an integer), such as an N-tap FIR (Finite Impulse Response) filter that uses N pixels in the vertical direction.
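A sketch of such a vertical N-tap FIR filter is shown below; the tap count, the coefficients, and the border handling are arbitrary choices for illustration and are not specified in this disclosure.

```python
def vertical_fir(distance_map, row, col, taps=(0.25, 0.5, 0.25)):
    """Filter the compound eye distance information 103 vertically over the
    distance information block lines above and below the target line.
    distance_map is a list of block lines (rows) of distance values."""
    half = len(taps) // 2
    acc = 0.0
    for k, weight in enumerate(taps):
        r = min(max(row + k - half, 0), len(distance_map) - 1)  # clamp at the top/bottom border
        acc += weight * distance_map[r][col]
    return acc
```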
(modification 8)
The distance difference information 120 may be an average value LAm in the line of the distance information block.
(modification 9)
In the correction for the border crossing group G1, the correction execution unit 21 may replace the monocular distance information 102 of all or some of the distance information blocks in the distance information block line within the partial subject 401 with the average value LAm in the distance information block line of the partial subject in the compound eye region T. When replacing, predetermined pixels may be weighted. For example, the weighting is reduced as the distance from the boundary between the compound eye region T and the monocular region increases.
(modification 10)
In the correction for the border crossing group G1, the correction execution unit 21 may add the distance difference information 120 for the same distance information block line to the monocular distance information 102 in that distance information block line. That is, the corrected monocular distance information may be set to the monocular distance information 102 plus the distance difference information 120. When the addition is performed, predetermined pixels may be weighted. For example, the weighting is made smaller as the distance from the boundary between the compound eye region T and the monocular region increases (see the sketch below).
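The sketch below illustrates such boundary-distance weighting for both the replacement of modification 9 and the addition of modification 10. The linear decay and all names are assumptions for illustration, not part of this disclosure.

```python
def boundary_weight(block_index, decay_length):
    """Weight that shrinks as the block moves away from the boundary between the
    compound eye region T and the monocular region; block_index is 0 at the boundary."""
    return max(0.0, 1.0 - block_index / decay_length)


def correct_block_line(monocular_line, lam, ld, decay_length, mode="add"):
    """mode='replace': blend toward the in-line average LAm (modification 9);
    mode='add': add the weighted distance difference information LD (modification 10)."""
    corrected = []
    for i, ls in enumerate(monocular_line):
        w = boundary_weight(i, decay_length)
        if mode == "replace":
            corrected.append((1.0 - w) * ls + w * lam)
        else:
            corrected.append(ls + w * ld)
    return corrected
```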
(modification 11)
Fig. 11 is a diagram showing the correction processing using the distance difference information 120 and the monocular distance information 102 for the border crossing group G1 according to modification 11. In the present modification, the subject 430 lies across the compound eye region T and the monocular region P and is composed of the partial subject 401 in the monocular region P and the partial subject 400 in the compound eye region T. In the present modification, an average value of the compound eye distance information 103 is generated from the distance information block line 700 in the compound eye region T located at the boundary between the compound eye region T and the monocular region P, and the monocular distance information 102 of the partial subject 401 in the monocular region P is replaced with this average value. The distance information block line 700 is located at the position of the partial subject 400 in the compound eye region T that is closest to the monocular region P, that is, at the boundary with the monocular region P.
The distance difference information generating unit 22 generates the average value LAm of the distance information blocks in the left-end line 700 of the partial subject 400 in the compound eye region T based on the compound eye distance information 103 of the partial subject in the compound eye region T. The distance difference information generating unit 22 also generates an average value LAs based on the monocular distance information 102 of the partial subject 401 in the monocular region P, and generates the distance difference information 120 for each distance information block line based on the average value LAm and the average value LAs.
The average value LAm and the average value LAs may each be, for example, an average of all or part of the distance information in the distance information blocks of the left-end line 700 of the partial subject 400 in the compound eye region T. In calculating the averages, predetermined pixels may be weighted. The distance difference information 120 can be, for example, the difference LD between the average value LAm and the average value LAs. The average value LAm or the average value LAs may also be weighted when calculating the difference.
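A minimal sketch of this per-line computation is shown below; equal weighting is assumed, and since the exact block range used for LAs is left open in the text, it is simply passed in as a list here.

```python
def distance_difference_at_boundary(boundary_blocks_compound, monocular_blocks):
    """LAm: average of the compound eye distance information 103 in the boundary
    block line 700 of the partial subject 400; LAs: average of the monocular
    distance information 102 of the partial subject 401. LD = LAm - LAs is used
    as the distance difference information 120."""
    lam = sum(boundary_blocks_compound) / len(boundary_blocks_compound)
    las = sum(monocular_blocks) / len(monocular_blocks)
    return lam - las
```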
(modification 12)
The distance difference information 120 may be an average value LA described below. A difference value between the compound eye distance information 103 of the partial subject 400 in the compound eye region T and the monocular distance information 102 of the partial subject 401 in the monocular region is generated for each distance information block unit in the compound eye region T, and the average value LA is generated from these difference values.
(modification 13)
The distance difference information 120 may be an average value LAm in the distance information block of the left end 700 of the partial object 400 in the compound-eye region T.
(modification 14)
The correction processing performed by the correction execution unit 21 for the border crossing group G1 may replace the monocular distance information 102 of all or part of the distance information blocks in the partial subject 401 with the average value LAm of the partial subject 400 in the compound eye region T. When replacing, predetermined pixels may be weighted. For example, the weighting can be reduced as the distance from the boundary between the compound eye region T and the monocular region P increases.
(modification 15)
The correction processing performed by the correction execution unit 21 for the border crossing group G1 may add the distance difference information 120 of the same distance information block to the monocular distance information 102. When the addition is performed, predetermined pixels may be weighted. For example, the weighting can be reduced as the distance from the boundary between the compound eye region T and the monocular region P increases.
(modification 16)
Fig. 12 is a diagram showing the correction processing performed by the correction execution unit 21 for the enlarged border crossing group G2 in modification 16. Fig. 12 (a) and Fig. 12 (b) each show two subjects arranged in the compound eye region T and the monocular region P together with their distance information; Fig. 12 (a) shows the state before correction, and Fig. 12 (b) shows the state after correction.
The regions 83A and 83B represent the partial subject within the compound eye region T of a subject that spans the compound eye region T and the monocular region P; the partial subjects 83A and 83B are the same. The regions 84A and 84B represent the partial subject within the monocular region P of the same subject; the partial subjects 84A and 84B are the same part. The regions 85A and 85B represent a subject existing only in the monocular region P. The reference letters a, N, M, and N1 written in the respective regions schematically indicate the distance information of each region, and regions marked with the same letter have the same distance information.
Based on the compound eye distance information 103 (a) of the partial subject in the compound eye region T, the correction execution unit 21 first corrects the monocular distance information 102 of the partial subject in the monocular region P constituting the same subject from m to M. Next, based on the corrected monocular distance information M generated for the partial subject in the monocular region P, the correction execution unit 21 corrects the monocular distance information 102 of the other subject in the monocular region P included in the enlarged border crossing group G2 from N to N1. For example, the value of N1 is determined so as to satisfy the relationship N − N1 = M − m (see the sketch below).
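A sketch of the second step under the example relationship N − N1 = M − m given above. The variable names follow the letters in Fig. 12, and the convention that lower case denotes the value before correction and upper case the value after is itself an assumption about the figure's notation.

```python
def correct_other_subject(m_before, m_after, n_before):
    """The other subject in the enlarged border crossing group G2 receives the same
    offset as the spanning subject, i.e. N1 = N - (M - m), so that N - N1 = M - m."""
    offset = m_after - m_before   # correction applied to the spanning subject (M - m)
    return n_before - offset      # N1
```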
(modification 17)
Fig. 13 is a diagram illustrating another method of determining the correction coefficient η, that is, the weighting coefficient used in Formula 6 or Formula 7. In the present modification, the correction coefficient η is determined using the traveling speed of the host vehicle included in the input information 105. The correction coefficient η is set to 1.0 when the traveling speed is equal to or less than a predetermined value F0, and is set to 0.0 when the traveling speed exceeds the predetermined value F0.
(modification 18)
The threshold L1 shown in Fig. 9 may also be made variable. Fig. 14 is a diagram showing such a variable threshold L1. The threshold L1 is determined based on the traveling speed of the vehicle 1 included in the input information 105. When the traveling speed is equal to or less than a predetermined value F0, the threshold L1 takes its minimum value D0. When the traveling speed exceeds a predetermined value F1, the threshold L1 takes its maximum value D1. When the traveling speed is between F0 and F1, the value of the threshold L1 increases as the traveling speed increases.
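A sketch of the speed-dependent threshold of Fig. 14 follows; the linear increase between F0 and F1 is an assumption, since only an increase with the traveling speed is stated.

```python
def threshold_l1(speed, f0, f1, d0, d1):
    """Variable threshold L1: minimum D0 at or below speed F0, maximum D1 above F1,
    increasing with the traveling speed in between (linear increase assumed)."""
    if speed <= f0:
        return d0
    if speed > f1:
        return d1
    return d0 + (d1 - d0) * (speed - f0) / (f1 - f0)
```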
(modification 19)
Fig. 15 is a diagram illustrating the correction for the enlarged border crossing group according to modification 19. Fig. 15 (a) shows the state before correction, and Fig. 15 (b) shows the state after correction. The correction determination unit 23 determines whether or not to correct the monocular distance information of an additional subject of the enlarged border crossing group as follows: it determines that correction is to be performed when the distance between the additional subject and the compound eye region is shorter than a predetermined threshold, and that correction is not to be performed when the distance is longer than the threshold.
In the example shown in Fig. 15, both the subject 88A and the subject 89A belong to the enlarged border crossing group. The distance from the subject 88A to the compound eye region T, indicated by reference numeral 1500, is shorter than the predetermined value DB0; therefore, the monocular distance information of the subject 88A is corrected. The distance from the subject 89A to the compound eye region T, indicated by reference numeral 1501, is longer than the predetermined value DB0; therefore, the monocular distance information of the subject 89B is not corrected and remains k.
According to this modification, the following operational effects can be obtained.
(8) The monocular distance information correcting unit 12 determines the presence or absence of distance information correction and a parameter for correction based on the distance between the other subject and the boundary between the monocular image and the compound eye image.
(modification 20)
Fig. 16 is a diagram showing the configuration of the correction unit 20 in modification 20. The correction unit 20 in the present modification includes a monocular region dwell period holding unit 24 in addition to the configuration of the first embodiment. The monocular region dwell period holding unit 24 generates, based on the group information 106, monocular region dwell period information 122 indicating the length of time, for example the number of frames, for which the subject has been present in the monocular region P, and outputs it to the correction determination unit 23.
An example of the monocular region dwell period information 122 is frame count information and/or time information for the case where the subject exists continuously within the monocular region P. When all or part of the subject exists in the compound eye region T, the monocular region dwell period information 122 may be initialized to 0. When the subject first appears in the monocular region, the monocular region dwell period information 122 may be initialized to an expected maximum value or a predetermined value.
Fig. 17 is a diagram illustrating the method by which the correction determination unit 23 determines the correction coefficient η. In the present modification, the correction determination unit 23 determines the correction coefficient η used in Formula 6 and Formula 7 based on the monocular region dwell period information 122. The correction coefficient η is set to 1.0 when the monocular region dwell period information 122 is equal to or less than a predetermined value Ta, and is set to 0.0 when the monocular region dwell period information 122 exceeds a predetermined value Tb. When the monocular region dwell period information 122 is between Ta and Tb, the correction coefficient η is decreased as the monocular region dwell period information 122 increases.
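A sketch of this dwell-period-dependent coefficient of Fig. 17 follows; a linear decrease between Ta and Tb is assumed, since only a decrease with growing dwell period is stated.

```python
def correction_coefficient_from_dwell(dwell, ta, tb):
    """Correction coefficient eta: 1.0 at or below Ta, 0.0 above Tb, and decreasing
    with the monocular region dwell period information 122 in between (linear assumed)."""
    if dwell <= ta:
        return 1.0
    if dwell > tb:
        return 0.0
    return 1.0 - (dwell - ta) / (tb - ta)
```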
According to this modification, the following operational effects can be obtained.
(9) The distance calculation device 6 includes the monocular region dwell period holding unit 24, which generates the monocular region dwell period information 122 indicating the period during which the group exists outside the compound eye region. The monocular distance information correcting unit 12 determines the distance information correction method based on the monocular region dwell period information 122.
(modification 21)
Fig. 18 is a configuration diagram of the group information generating unit 30 according to modification 21. The group information generating unit 30 includes a monocular region dwell period holding unit 33 in addition to the configuration shown in Fig. 5. The monocular region dwell period holding unit 33 generates, based on the subject detection information 130, monocular region dwell period information 131 indicating the length of time, for example the number of frames, for which the subject has been present in the monocular region P, and outputs it to the group determination unit 31.
An example of the monocular region dwell period information 131 is frame count information and/or time information for the case where the subject exists continuously within the monocular region P. When all or part of the subject exists in the compound eye region T, the monocular region dwell period information 131 may be initialized to 0. When the subject first appears in the monocular region, the monocular region dwell period information 131 may be initialized to an expected maximum value or a predetermined value.
(modification 22)
The correction execution unit 21 may determine the correction mode of the monocular distance information according to the type of subject and the speed of the vehicle 1. The correction mode refers to, for example, whether or not correction is required and the parameters used for the correction.
Fig. 19 is a diagram showing the correspondence between the type of subject and the correction mode; it also shows the relationship between the speed of the vehicle 1 and the correction mode. Fig. 19 is split into two tiers for reasons of drawing space. In Fig. 19, "correction valid" indicates that the monocular distance information is corrected, and "correction invalid" indicates that it is not corrected. The percentages in Fig. 19 indicate the values of the parameters used for the correction; that is, a correction described by a percentage is a correction using the parameters shown in Formula 6 and Formula 7. In the case of "0%", the correction may be omitted.
When the correction execution unit 21 uses correction mode 1, the monocular distance information is corrected when the type of subject is a person, a vehicle, or a road surface, and is not corrected when the type of subject is a structure. When the correction execution unit 21 uses correction mode 3, the correction depends on the speed of the vehicle 1: during low-speed traveling, the correction is performed regardless of the type of subject; during high-speed traveling, persons and vehicles are not corrected; and during medium-speed traveling, persons and vehicles are corrected in accordance with the speed.
In correction mode 3, the road surface and structures are large regions and are stationary objects with continuity, so they are easy to correct and are corrected regardless of the speed. The regions of persons and vehicles, on the other hand, are relatively small and lack continuity, so their correction is limited to low-speed traveling, where the necessity is high.
In correction mode 4, the distance accuracy for persons is improved during low-speed traveling, and the distance accuracy for vehicles is improved during high-speed traveling. By exploiting the way the vehicle's surroundings, in particular the objects to be watched, change with speed, the overall load of the correction processing is reduced, which leads to a cost reduction. Specifically, by lowering the computational capability required of the distance calculation device 6, the distance calculation device 6 can be built with a processing device of lower performance.
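The sketch below illustrates, under stated assumptions, how such a per-type, per-speed table of correction modes could be represented; the entries are placeholders that roughly follow the description of correction mode 3 above and are not the actual values of Fig. 19.

```python
# Placeholder table in the spirit of correction mode 3: road surface and structures
# are corrected at any speed, persons and vehicles only at low (and partly medium) speed.
CORRECTION_MODE_3 = {
    "person":    {"low": 1.0, "medium": 0.5, "high": 0.0},
    "vehicle":   {"low": 1.0, "medium": 0.5, "high": 0.0},
    "road":      {"low": 1.0, "medium": 1.0, "high": 1.0},
    "structure": {"low": 1.0, "medium": 1.0, "high": 1.0},
}


def correction_parameter(subject_type, speed_band, table=CORRECTION_MODE_3):
    """Look up whether and how strongly to correct (0.0 means no correction)."""
    return table.get(subject_type, {}).get(speed_band, 0.0)
```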
Further, in the present modification, the correction may be performed only when the distance indicated by the monocular distance information before correction is shorter than a predetermined distance. In addition, when the type of subject is a person, the correction target may be limited to persons present within a predetermined distance range from the road surface. This is because the road surface is a large region with continuity, so its correction accuracy is high, and because persons near the traveling road require more attention.
According to this modification, the following operational effects can be obtained.
(10) The monocular distance information correcting unit 12 determines the method of correcting the monocular distance information based on at least one of the traveling state of the vehicle 1 and the type of subject. Therefore, the distance calculation device 6 can perform correction suited to the traveling state of the vehicle and the type of subject.
Second Embodiment
A second embodiment of the distance calculating device will be described with reference to fig. 20 to 21. In the following description, the same components as those in the first embodiment are denoted by the same reference numerals, and the differences will be mainly described. The contents not particularly described are the same as those of the first embodiment. In the present embodiment, mainly the sensors mounted on the vehicle 1 are different from those of the first embodiment.
Fig. 20 is a schematic diagram of the distance calculation system S according to the second embodiment. The vehicle 1 includes the 1st imaging unit 2 and a distance sensor 2000 as sensors for collecting surrounding information. The distance sensor 2000 is a sensor capable of acquiring distance information, and examples thereof include LiDAR (Light Detection and Ranging), a millimeter wave radar, a TOF (Time of Flight) distance sensor, and a stereo camera. The distance sensor 2000 acquires distance information of the range indicated by the solid line, that is, the hatched area in the center of the figure. The 1st imaging unit 2 has as its imaging range the wide range indicated by the broken line, which includes the range in which the distance sensor 2000 acquires distance information.
The hatched region corresponds to the compound eye region T in the first embodiment. The regions other than the hatched regions in the imaging range of the 1 st imaging unit 2 correspond to the monocular region P and the monocular region S in the first embodiment.
Fig. 21 is a configuration diagram of the distance calculation device 6 in the second embodiment; compared with the configuration of the first embodiment, the compound eye distance information generating unit 11 is removed. The distance sensor 2000 outputs distance information 2010 to the monocular distance information correcting unit 12, which processes the distance information 2010 as the compound eye distance information 103 of the first embodiment. Therefore, although it is not actually present, a virtual compound eye distance information generating unit 11 can be regarded as outputting the input sensor output as the compound eye distance information 103 as it is. The other structure is the same as that of the first embodiment.
According to the second embodiment described above, the following operational effects can be obtained.
(11) The compound-eye distance information generation unit 11 generates compound-eye distance information 103 using an output of at least one of a distance sensor of a TOF system, a millimeter wave radar, and a LiDAR. Therefore, the distance information of the monocular area can be corrected using the information of the distance sensor with higher accuracy.
Third Embodiment
A third embodiment of the distance calculating device will be described with reference to fig. 22 to 23. In the following description, the same components as those in the first embodiment are denoted by the same reference numerals, and the differences will be mainly described. The contents not particularly described are the same as those of the first embodiment. In the present embodiment, mainly the sensors mounted on the vehicle 1 are different from those of the first embodiment.
Fig. 22 is a schematic diagram of the distance calculation system S according to the third embodiment. The vehicle 1 includes the 1st imaging unit 2, the 2nd imaging unit 3, a 1st distance sensor 2200, and a 2nd distance sensor 2201 as sensors for collecting surrounding information. The 1st distance sensor 2200 and the 2nd distance sensor 2201 are sensors capable of acquiring distance information, for example LiDAR, a millimeter wave radar, a TOF distance sensor, or a stereo camera.
The 1st imaging unit 2 and the 2nd imaging unit 3 image the hatched range in the center of the figure with the same field of view. The field of view of the 1st distance sensor 2200 is roughly the left front of the vehicle 1, and the field of view of the 2nd distance sensor 2201 is roughly the right front of the vehicle 1. The angle of view, mounting position, and posture of all the sensors are known. The hatched region corresponds to the compound eye region T in the first embodiment, and the regions above and below the hatched region correspond to the monocular region P and the monocular region S in the first embodiment.
Fig. 23 is a configuration diagram of the distance calculation device 6 according to the third embodiment; compared with the configuration of the first embodiment, the monocular distance information generating unit 10 is removed. The 1st distance sensor 2200 and the 2nd distance sensor 2201 output distance information 2210 and distance information 2211 to the monocular distance information correcting unit 12. The monocular distance information correcting unit 12 extracts the information of the necessary regions from the input distance information 2210 and distance information 2211 and processes it as the monocular distance information 102 of the first embodiment. Therefore, although it is not actually present, a virtual monocular distance information generating unit 10 can be regarded as outputting the input sensor output as the monocular distance information 102 as it is. The other structure is the same as that of the first embodiment.
According to the third embodiment described above, the following operational effects can be obtained.
(12) The monocular distance information generating unit 10 generates the monocular distance information 102 using an output of at least one of a distance sensor of a TOF system, a millimeter wave radar, and a LiDAR.
Fourth Embodiment
A fourth embodiment of the distance calculating device will be described with reference to fig. 24 to 25. In the following description, the same components as those in the first embodiment are denoted by the same reference numerals, and the differences will be mainly described. The contents not particularly described are the same as those of the first embodiment. In the present embodiment, mainly the angle of view and the posture of the camera mounted on the vehicle 1 are different from those of the first embodiment.
Fig. 24 is a schematic diagram of the distance calculation system S according to the fourth embodiment. The vehicle 1 includes the 1st imaging unit 2 and the 2nd imaging unit 3 as sensors for collecting surrounding information, as in the first embodiment. However, in the present embodiment, the 1st imaging unit 2 and the 2nd imaging unit 3 have wide angles of view and their fields of view coincide with each other. The 1st imaging unit 2 and the 2nd imaging unit 3 have, for example, fisheye lenses and can image a wide range, but the distortion in the lens peripheral portion is large and the resolution there is low.
The vicinity of the center of the fields of view of the 1st imaging unit 2 and the 2nd imaging unit 3 is the hatched region and corresponds to the compound eye region T in the first embodiment. The regions other than this central region have large distortion and low spatial resolution, and are therefore regarded as corresponding to the monocular regions P and S in the first embodiment. In other words, a region with poor imaging conditions is treated as a monocular region, and a region with relatively good imaging conditions is treated as a compound eye region.
Fig. 25 is a configuration diagram of the distance calculation device 6 in the fourth embodiment; compared with the configuration of the first embodiment, the monocular distance information generating unit 10 is removed. The monocular distance information correcting unit 12 treats only the vicinity of the center of the compound eye distance information 103 output from the compound eye distance information generating unit 11 as the compound eye distance information 103 of the first embodiment, and treats the peripheral portion as the monocular distance information 102 of the first embodiment. The other structure is the same as that of the first embodiment.
According to the fourth embodiment described above, the following operational effects can be obtained.
(13) The compound eye distance information generating unit 11 generates the compound eye distance information 103 using the information of the region, in the outputs of the two cameras, where the fields of view overlap and the imaging conditions are good, and the monocular distance information generating unit 10 generates the monocular distance information 102 using the information of the region where the fields of view of the two cameras overlap but the imaging conditions are poorer than in the region used by the compound eye distance information generating unit 11. The invention can therefore also be applied to systems in which there is no monocular region.
(modification of the fourth embodiment)
The imaging conditions may include not only the lens distortion but also the amount of light and dust adhering to the lens. For example, when a lens with small distortion over its entire circumference is used, the distance information of a dark region with a small amount of light may be processed as the monocular distance information 102, and the distance information of a bright and clear region may be processed as the compound eye distance information 103.
Fifth Embodiment
A fifth embodiment of the distance calculation device is explained with reference to fig. 26. In the following description, the same components as those in the first embodiment are denoted by the same reference numerals, and the differences will be mainly described. The contents not specifically described are the same as those of the first embodiment. In the present embodiment, mainly the correspondence between functions and hardware is different from that in the first embodiment.
Fig. 26 is a configuration diagram of a distance calculation system S of the fifth embodiment. In the present embodiment, the monocular distance information correcting unit 12 is implemented by hardware different from the monocular distance information generating unit 10 and the compound eye distance information generating unit 11. The camera processing device 15 includes a monocular distance information generating unit 10, a compound eye distance information generating unit 11, and a feature information generating unit 13. The feature information generating unit 13 creates feature quantities of the inputted image, such as an edge image and a histogram, and outputs the feature quantities as feature information 107 to the image processing unit 60. The ECU14 includes the monocular distance information correcting unit 12, the recognition processing unit 7, and the vehicle control unit 8.
According to the fifth embodiment described above, the present invention can be implemented by various hardware configurations.
The above-described embodiments and modifications may be combined and implemented. In the above description, various embodiments and modifications have been described, but the present invention is not limited to these. Other modes contemplated within the scope of the technical idea of the present invention are also included within the scope of the present invention.
The following priority-based applications are hereby incorporated by reference.
Japanese patent application 2018-141891 (published in 2018, 7 and 27)
Description of reference numerals
1 … vehicle
2 … 1st imaging unit
3 … 2nd imaging unit
6 … distance calculation device
7 … recognition processing unit
8 … vehicle control unit
10 … monocular distance information generating unit
11 … compound eye distance information generating unit
12 … monocular distance information correcting unit
15 … camera processing device
20 … correcting unit
21 … correction execution unit
22 … distance difference information generating unit
23 … correction determination unit
24 … monocular region dwell period holding unit
30 … group information generating unit
31 … group determination unit
32 … subject detection unit
33 … monocular region dwell period holding unit
100 … 1st captured image
101 … 2nd captured image
102 … monocular distance information
103 … compound eye distance information
104 … corrected distance information
105 … input information
106 … group information
120 … distance difference information
121 … correction judgment information
122 … monocular region dwell period information
130 … subject detection information
131 … monocular area dwell period information
140 … recognition information.

Claims (15)

1. A distance calculation apparatus, comprising:
a high-precision distance information generating unit that generates high-precision distance information, which is information of a distance with high precision, using an output of the sensor;
a low-precision distance information generating unit that generates low-precision distance information having a lower precision than the high-precision distance information, using an output of the sensor; and
a distance correction unit that corrects the low-precision distance information using the high-precision distance information.
2. The distance calculation apparatus according to claim 1, wherein:
the high-precision distance information generating unit generates the high-precision distance information using an output of a compound eye region in which the fields of view of a plurality of cameras overlap,
the low-precision distance information generating unit generates the low-precision distance information of a monocular region using an output of a single camera,
the distance calculation device further includes a group information generating unit that associates, as a 1st group, one or more subjects included in each of the compound eye region and the monocular region based on a predetermined condition, and
the distance correction unit corrects the low-precision distance information using the high-precision distance information, generated by the high-precision distance information generating unit, of the subject belonging to the 1st group.
3. The distance calculation apparatus according to claim 2, wherein:
the subject belonging to the group 1 is the same subject existing across the monocular area and the compound eye area.
4. A distance calculation apparatus according to claim 3, characterized in that:
the group information generating unit associates, as a 2nd group, an additional subject that is included in the monocular region and satisfies a predetermined condition with the 1st group, and
the distance correction unit corrects the low-precision distance information of the additional subject using the distance information of the 1st group.
5. The distance calculation apparatus according to claim 2, wherein:
the distance correction unit calculates a difference between the high-precision distance information and the low-precision distance information for the same area as distance difference information, and determines whether or not to correct the low-precision distance information based on the distance difference information.
6. The distance calculation apparatus according to claim 4, wherein:
the distance correction unit corrects the low-precision distance information of the additional subject using the compound eye distance information of the 1st group generated by the high-precision distance information generating unit or the corrected monocular distance information generated by the distance correction unit.
7. The distance calculation apparatus according to claim 2, wherein:
the distance correction unit determines the mode of correcting the low-precision distance information based on the high-precision distance information.
8. The distance calculation apparatus according to claim 2, wherein:
the distance correction unit determines a mode of correcting the low-precision distance information based on a traveling state of the vehicle.
9. The distance calculation apparatus according to claim 2, wherein:
the distance correction unit determines a method of correcting the low-precision distance information based on a type of a subject.
10. The distance calculation apparatus according to claim 2, wherein:
the distance correction unit determines the distance information correction mode based on the distance between the subject whose low-precision distance information is to be corrected and the boundary between the monocular region and the compound eye region.
11. The distance calculation apparatus according to claim 2, wherein:
the distance calculation device further comprises a stay period generating unit for the outside of the compound eye region, which generates period information indicating the period during which the group is located outside the compound eye region, and
the distance correction unit determines the distance information correction method based on the period information.
12. The distance calculation apparatus according to claim 2, wherein:
the distance correction unit determines the method of correcting based on at least one of the compound eye distance of the 1st group, traveling information, the type of the group, the distance to the boundary, and the period.
13. The distance calculation apparatus according to claim 1, wherein:
the high-precision distance information generating unit generates the high-precision distance information using an output of at least one of a distance sensor of a TOF system, a millimeter wave radar, and a LiDAR.
14. The distance calculation apparatus according to claim 1, wherein:
the low-precision distance information generating unit generates the low-precision distance information using an output of at least one of a distance sensor of a TOF system, a millimeter wave radar, and a LiDAR.
15. The distance calculation apparatus according to claim 1, wherein:
the high-precision distance information generating unit generates the high-precision distance information using information of a region, in the outputs of two cameras, in which the fields of view overlap and the imaging conditions are good, and
the low-precision distance information generating unit generates the low-precision distance information using information of a region in which the fields of view of the two cameras overlap but the imaging conditions are poorer than in the region used by the high-precision distance information generating unit.
CN201980049441.1A 2018-07-27 2019-07-04 Distance calculating device Active CN112513571B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018141891A JP7042185B2 (en) 2018-07-27 2018-07-27 Distance calculation device
JP2018-141891 2018-07-27
PCT/JP2019/026647 WO2020022021A1 (en) 2018-07-27 2019-07-04 Distance calculation device

Publications (2)

Publication Number Publication Date
CN112513571A true CN112513571A (en) 2021-03-16
CN112513571B CN112513571B (en) 2023-08-25

Family

ID=69180414

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980049441.1A Active CN112513571B (en) 2018-07-27 2019-07-04 Distance calculating device

Country Status (3)

Country Link
JP (1) JP7042185B2 (en)
CN (1) CN112513571B (en)
WO (1) WO2020022021A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7341079B2 (en) * 2020-02-10 2023-09-08 住友重機械工業株式会社 Distance image estimation device and control device
JP2022041219A (en) * 2020-08-31 2022-03-11 エスゼット ディージェイアイ テクノロジー カンパニー リミテッド Control device, distance measurement sensor, imaging device, control method, and program
JP7499140B2 (en) * 2020-10-14 2024-06-13 日立Astemo株式会社 Object Recognition Device
EP4047388B1 (en) * 2021-02-22 2024-07-24 Shenzhen Camsense Technologies Co., Ltd Ranging apparatus, lidar and mobile robot

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000121319A (en) * 1998-10-15 2000-04-28 Sony Corp Image processor, image processing method and supply medium
CN1820496A (en) * 2003-10-31 2006-08-16 三菱电机株式会社 Image correcting method and imaging apparatus
WO2006123615A1 (en) * 2005-05-19 2006-11-23 Olympus Corporation Distance measuring apparatus, distance measuring method and distance measuring program
JP2007263669A (en) * 2006-03-28 2007-10-11 Denso It Laboratory Inc Three-dimensional coordinates acquisition system
CN106662441A (en) * 2014-09-11 2017-05-10 日立汽车***株式会社 Image processing device
CN110053625A (en) * 2018-01-19 2019-07-26 本田技研工业株式会社 Apart from computing device and controller of vehicle

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1881450A1 (en) * 2005-05-10 2008-01-23 Olympus Corporation Image processing apparatus, image processing method, and image processing program
JP2008304202A (en) * 2007-06-05 2008-12-18 Konica Minolta Holdings Inc Method and apparatus for distance image generation and program
JP6660751B2 (en) * 2016-02-04 2020-03-11 日立オートモティブシステムズ株式会社 Imaging device


Also Published As

Publication number Publication date
JP7042185B2 (en) 2022-03-25
CN112513571B (en) 2023-08-25
JP2020016628A (en) 2020-01-30
WO2020022021A1 (en) 2020-01-30

Similar Documents

Publication Publication Date Title
CN112513571B (en) Distance calculating device
EP3358302B1 (en) Travel control method and travel control device
US9074906B2 (en) Road shape recognition device
US10964217B2 (en) Travel control method and travel control apparatus
CN114375467B (en) System and method for detecting an emergency vehicle
US20120277990A1 (en) Method and apparatus for determining a plausible lane for guiding a vehicle and an automobile
CN104335264A (en) Lane partition marking detection apparatus, and drive assist system
CN110858405A (en) Attitude estimation method, device and system of vehicle-mounted camera and electronic equipment
EP2224743B1 (en) Vehicle periphery monitoring device, vehicle, and vehicle periphery monitoring program
KR102397156B1 (en) A method of providing a camera system and driver assistance functions for photographing the surrounding area of one's vehicle
JP2023184572A (en) Electronic apparatus, movable body, imaging apparatus, and control method for electronic apparatus, program, and storage medium
JP2019146012A (en) Imaging apparatus
CN110053625B (en) Distance calculation device and vehicle control device
US20200118280A1 (en) Image Processing Device
JP2017129543A (en) Stereo camera device and vehicle
US10864856B2 (en) Mobile body surroundings display method and mobile body surroundings display apparatus
JP6253175B2 (en) Vehicle external environment recognition device
US11830254B2 (en) Outside environment recognition device
EP3865815A1 (en) Vehicle-mounted system
CN114868381A (en) Image processing apparatus, image processing method, and program
KR20210034626A (en) How to determine the type of parking space
EP4246467A1 (en) Electronic instrument, movable apparatus, distance calculation method, and storage medium
JP2024078845A (en) Lane Estimation Device and Map Generation Device
JPH1137752A (en) Distance detecting device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Ibaraki

Applicant after: Hitachi astemo Co.,Ltd.

Address before: Ibaraki

Applicant before: HITACHI AUTOMOTIVE SYSTEMS, Ltd.

GR01 Patent grant