WO2023175741A1 - External environment recognition device - Google Patents



Publication number
WO2023175741A1
Authority
WO
WIPO (PCT)
Prior art keywords
road edge
road
feature point
edge feature
feature points
Application number
PCT/JP2022/011718
Other languages
French (fr)
Japanese (ja)
Inventor
航汰郎 猪股
雅幸 竹村
Original Assignee
日立Astemo株式会社 (Hitachi Astemo, Ltd.)
Application filed by 日立Astemo株式会社 (Hitachi Astemo, Ltd.)
Priority to PCT/JP2022/011718
Publication of WO2023175741A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/10Path keeping
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • The present invention relates to an external environment recognition device mounted on a vehicle or the like.
  • An external environment recognition device recognizes the external environment (the surrounding environment outside the vehicle) using sensors installed in the vehicle.
  • Advanced driving support systems that use external environment recognition devices have become widespread as a means of preventing traffic accidents.
  • In the front sensing of advanced driving support systems, in addition to detecting the lanes of the driving road, there is also technology that detects the edges of the driving road (hereinafter referred to as road edges).
  • In such technology, road edge feature points corresponding to the road edge are extracted from the sensing results output from an in-vehicle sensing device that detects information outside the vehicle, and the road edge feature points are connected to detect the road edge.
  • For example, Patent Document 1 describes a technique that uses an on-vehicle camera as the on-vehicle sensing device to detect the boundary of a stepped surface on the running road as a road edge.
  • The present invention has been made in view of the above problems, and its purpose is to provide an external environment recognition device that can accurately detect the left and right road edges.
  • To achieve this, an external environment recognition device according to the present invention includes a processing section that performs road edge detection. The processing section includes: a road edge feature point extraction unit that obtains road edge feature points corresponding to the road edge; a road edge direction line calculation unit that calculates, for each road edge feature point, a line representing the direction of the road edge constituted by the road edge feature points; and a road edge identification unit that groups the road edge feature points into a right road edge group and a left road edge group according to the direction of the line, and identifies the road edge from the road edge feature points grouped into the same group.
  • According to the present invention, a line representing the direction of the road edge constituted by the road edge feature points is calculated for each road edge feature point, and the road edge is detected according to the direction of the calculated line, so the left and right road edges can be detected accurately.
  • FIG. 1 is a block configuration diagram of an external environment recognition device according to an embodiment of the present invention.
  • FIG. 2 is a processing flowchart of the road edge direction line calculation unit.
  • FIG. 3 shows an example of road edge normal calculation results (overhead view).
  • FIG. 4 is a block configuration diagram of the road edge feature three-dimensional sorting section.
  • FIG. 5 is a block configuration diagram of the road edge direction line angle sorting section.
  • FIG. 6 is a processing flowchart of the road edge direction line angle sorting section.
  • FIG. 7 shows an example of grouping according to the direction of the road edge direction line.
  • FIG. 8 shows an example of searching around a target road edge direction line.
  • FIG. 9 shows an example in which the road edge direction line straddles the center of the image.
  • FIG. 10 is a block configuration diagram of the road edge height direction sorting section.
  • FIG. 11 shows an example of road surface height detection results.
  • FIG. 12 shows an example of road edge identification threshold calculation results assuming off-road and snow-covered roads.
  • FIG. 13 is a block configuration diagram of the road edge depth direction sorting section.
  • FIG. 14 shows an example of road edge feature point selection based on the depth direction.
  • FIG. 1 shows the overall configuration of an external world recognition device according to an embodiment of the present invention.
  • The external environment recognition device 1 of this embodiment extracts road edge feature points corresponding to the road edge from the sensing results output from an on-vehicle sensing device that is mounted on a vehicle (own vehicle) and detects information outside the vehicle, and then detects the road edge by connecting the extracted road edge feature points.
  • The road edge is the end of the drivable area that includes the vehicle's travel path (also referred to as the vehicle's course), and a road edge feature point is a feature point that represents the end of the drivable area (details are explained later).
  • The external environment recognition device 1 of this embodiment mainly includes a stereo camera section 100 and a processing section 900.
  • The processing section 900 includes a parallax generation section 200, a road edge feature point extraction section 300, a road edge direction line calculation section 400, a road edge identification section 600 having a road edge feature three-dimensional sorting section 500, a road edge detection/judgment section 700, and an alarm control section 800.
  • The stereo camera section 100 is equipped with an in-vehicle stereo camera.
  • The in-vehicle stereo camera constitutes an in-vehicle sensing device that detects information outside the vehicle.
  • The in-vehicle stereo camera acquires images of the outside of the vehicle using its left and right cameras, and outputs the acquired (photographed) images to the processing section 900.
  • In this embodiment, a stereo camera is described as an example of the front sensor, but the sensor itself is not limited to a stereo camera; it may be a single camera, a LiDAR, or the like, or a fusion sensor combining a camera and LiDAR.
  • The processing section 900 performs road edge detection based on the sensing results (images in this embodiment) output from the stereo camera section 100.
  • The parallax generation unit 200 identifies, from the images taken by the left and right cameras of the stereo camera section 100, the positions on the images where the same object is captured, and generates the difference between those positions as a parallax image. By generating a parallax image, it is possible to obtain three-dimensional position information of the surrounding environment, such as the driving road surface, roadside objects, and obstacles.
  • In this embodiment, a stereo camera is described as an example of the front sensor, so the parallax generation unit 200 is provided. However, the front sensor may also be a LiDAR or a fusion sensor with a camera; any device that can detect three-dimensional position information of the surrounding environment, such as a three-dimensional point cloud map, may be used.
  • The road edge feature point extraction unit 300 uses the parallax images generated by the parallax generation unit 200 and the principle of triangulation to calculate each three-dimensional position as a 3D point group.
  • Feature points serving as road edge candidates (hereinafter also referred to as road edge feature points or road edge candidate feature points) are extracted from the calculated 3D point group.
  • The road edge feature point extraction unit 300 extracts feature points of tall three-dimensional objects by comparison with the area corresponding to the driving road surface.
  • Specifically, each three-dimensional position is determined using the horizontal direction of the image as the image coordinate and the vertical direction of the image as a parallax value indicating depth. Voting processing is then performed for each column of the generated image to extract feature points that serve as road edge candidates.
  • The extracted road edge feature points hold three-dimensional position information on the image. They can also be arranged on a bird's-eye view around the own vehicle, and can be converted into a two-dimensional image.
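The relationship between a parallax value and a three-dimensional position follows standard stereo triangulation. The sketch below is illustrative only; the focal length, baseline, and principal point values are assumptions, not parameters from this publication:

```python
def parallax_to_3d(u, v, d, f=1000.0, baseline=0.12, cx=640.0, cy=360.0):
    """Convert an image point (u, v) with parallax d [px] into camera
    coordinates (x, y, z) [m] by triangulation: z = f * baseline / d.
    f is the focal length in pixels, baseline the camera spacing in
    metres, (cx, cy) the principal point (all illustrative values)."""
    if d <= 0:
        raise ValueError("parallax must be positive")
    z = f * baseline / d      # depth: larger parallax means a closer object
    x = (u - cx) * z / f      # lateral offset from the optical axis
    y = (v - cy) * z / f      # vertical offset from the optical axis
    return x, y, z
```

With these assumed parameters, a point 10 px to the right of the principal point with 12 px of parallax resolves to (0.1, 0.0, 10.0) m.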
  • Objects extracted as road edge feature points include not only general road edges such as curbs and walls, but also moving objects such as oncoming and parallel-running vehicles, and low steps of about 5 cm.
  • In this embodiment, road edges with a height are mainly described as examples, but negative road edges such as side ditches and paths between rice fields, as well as step-less road edges at the same height as the running road, such as grass, gravel, and dirt, are also included. It is possible to extract road edge feature points even from the edge of a road without steps.
  • For negative road edges, the height distribution of the area corresponding to the driving road surface and the vertical height distribution are calculated in the same way as for the above-mentioned raised road edges. This makes it possible to extract feature points at road edges that are lower than the road surface.
  • Step-less road edges such as grass and gravel can be detected using a learning-based detection method that applies machine learning algorithms.
  • An example of such learning is to annotate the region corresponding to the road edge in an image in which a step-less road edge exists, and to learn that image as a correct (ground truth) image. By learning in this way, it becomes possible to extract feature points of the area corresponding to a step-less road edge and to use them in the same way as other road edge feature points.
  • As described above, the road edge feature points extracted by the road edge feature point extraction unit 300 of this embodiment cover raised road edges, negative road edges, and step-less road edges, and are candidate feature points for the regions corresponding to the edges of the travelable area including the road on which the vehicle is traveling.
  • The road edge direction line calculation unit 400 uses the road edge feature points extracted by the road edge feature point extraction unit 300 to calculate a line representing the direction of the road edge constituted by those road edge feature points.
  • In this embodiment, the normal line is described as an example of the line representing the direction of the road edge, but the line is not limited to the normal; any line that points from the road edge toward the position or direction of the front sensor, or toward the travel path of the own vehicle, may be used. That is, the line calculated by the road edge direction line calculation unit 400 of this embodiment is a line that can represent the relative direction (inclination) of the road edge with respect to (the front sensor of) the own vehicle.
  • The road edge direction line calculation unit 400 will now be described in detail.
  • The road edge direction line calculation unit 400 calculates lines representing the directions of the left and right road edges.
  • Below, a normal line is described as an example of a line representing the direction of the road edge.
  • FIG. 2 shows a processing flowchart of the road edge direction line calculation unit 400. The operation based on this flowchart is as follows.
  • In step 401, the road edge feature points extracted by the road edge feature point extraction section 300 are acquired.
  • In step 402, the road edge feature points acquired in step 401 are searched one by one in the depth direction, from the bottom to the top of the image, on both the left and right sides.
  • In step 403, a surface fitting process is performed on the 3D point cloud generated by the road edge feature point extraction unit 300, using the points within a certain area around the position coordinates of the road edge feature point detected in step 402.
  • In this embodiment, surface fitting using a 3D point group generated from 3D position information is described as an example, but in a UD map or an overhead view of 2D position information, it is also possible to perform line fitting from a sequence of road edge feature points.
  • In step 404, a line in the normal direction of the road edge is calculated from the road edge feature point position using the result calculated in step 403.
  • Here, the normal is calculated only in the direction facing the sensor; the normal pointing away from the sensor is not calculated.
  • FIG. 3 shows an example of the calculation results of the road edge normal.
  • In this embodiment, the line representing the direction of the road edge is described using the normal as an example, but it is not limited to the normal; any line heading toward the position or direction of the sensor, or toward the vehicle's driving route, is sufficient.
  • By the above processing, a line representing the direction of the road edge (in this example, the road edge normal) is found for the road edge feature point detected in step 402.
  • In step 405, it is determined whether any acquired road edge feature points remain. If so, the process returns to step 402 and repeats steps 402 to 404; if not, the process ends.
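Steps 403 and 404 can be sketched as a least-squares plane fit followed by a sign choice that keeps only the sensor-facing normal. This is one possible implementation, not the publication's own; the function and argument names are illustrative:

```python
import numpy as np

def road_edge_normal(surrounding_points, feature_point, sensor_pos=(0.0, 0.0, 0.0)):
    """Fit a plane to the 3D points around a road edge feature point
    (cf. step 403) and return its unit normal, flipped so that it
    points toward the sensor (cf. step 404)."""
    pts = np.asarray(surrounding_points, dtype=float)
    centroid = pts.mean(axis=0)
    # The right singular vector for the smallest singular value of the
    # centred points is the least-squares plane normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    # Discard the normal pointing away from the sensor (step 404).
    to_sensor = np.asarray(sensor_pos, float) - np.asarray(feature_point, float)
    if np.dot(to_sensor, normal) < 0:
        normal = -normal
    return normal / np.linalg.norm(normal)
```

For a vertical wall-like patch at x = 2 m seen from a sensor at the origin, the returned normal points back along the negative x-axis, i.e. toward the sensor.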
  • The road edge identification unit 600 identifies road edges using the road edge feature three-dimensional sorting unit 500.
  • The road edge feature three-dimensional sorting unit 500 uses the lines representing the direction of the road edge (also referred to as road edge direction lines) calculated by the road edge direction line calculation unit 400 to sort the road edge feature points extracted by the road edge feature point extraction unit 300 into a right road edge group and a left road edge group.
  • The road edge feature three-dimensional sorting unit 500 will now be described in detail.
  • FIG. 4 shows the block configuration of the road edge feature three-dimensional sorting section 500.
  • The road edge feature three-dimensional sorting unit 500 includes: a road edge direction line angle sorting unit 510 that performs sorting according to the direction of the line calculated by the road edge direction line calculation unit 400; a road edge height direction sorting unit 530 that performs sorting according to height distribution information from the road surface; and a road edge depth direction sorting unit 540 that performs sorting according to the series of road edge direction lines in the direction toward the camera vanishing point on the image. Together, these group the road edge feature points into left and right road edge groups.
  • The road edge direction line angle sorting section 510 includes: a road edge direction line acquisition unit 511 that acquires the road edge direction lines generated by the road edge direction line calculation unit 400; a nearby road edge direction line search unit 512 that searches for road edge direction lines around a target line; and a target direction line angle sorting unit 513 that performs sorting according to the angle of the target direction line.
  • FIG. 6 shows a processing flowchart of the road edge direction line angle sorting unit 510. The operation based on this flowchart is as follows.
  • In step 514, the lines (road edge direction lines) representing the directions of the road edges calculated by the road edge direction line calculation unit 400 are acquired.
  • In step 515, the road edge direction lines acquired in step 514 are searched one by one in the depth direction, from the bottom to the top of the image, on both the left and right sides.
  • FIG. 7 shows an example of sorting according to the direction of the road edge direction line in steps 516, 518, and 520 below, and is used to explain these steps in detail.
  • In FIG. 7, the y-axis indicates the direction in which the own vehicle (its front sensor) points, and the x-axis indicates the direction perpendicular to the y-axis in the overhead view (the same applies to figures other than FIG. 7).
  • In step 516, using the road edge direction line detected in step 515, it is determined whether the angle of the line belongs to the second quadrant, the negative x-axis, or the third quadrant, as shown in FIG. 7.
  • In step 517, if the determination result in step 516 is YES, the target road edge feature point is grouped as the right road edge.
  • Step 518 operates if the determination result in step 516 is NO. It determines whether the angle of the line belongs to the first quadrant, the positive x-axis, or the fourth quadrant.
  • In step 519, if the determination result in step 518 is YES, the target road edge feature point is grouped as the left road edge.
  • Step 520 operates when the determination result in step 518 is NO, that is, when the road edge direction line lies on the negative y-axis. A search is performed within a certain area around the target road edge direction line.
  • FIG. 8 shows an example of searching around the target road edge direction line.
  • In step 521, the target road edge feature point is grouped into the same group as the surrounding road edge direction lines, using the search result of step 520.
  • In the example of FIG. 8, a search is performed within the dashed rectangular area centered on the target road edge direction line. Since the directions of the nearby road edge direction lines detected there belong to the third quadrant and are therefore classified into the right road edge group, the target road edge direction line is grouped into the same right road edge group as its surroundings.
  • In step 522, it is determined whether any acquired road edge direction lines remain. If so, the process returns to step 515 and repeats steps 515 to 521; if not, the process ends.
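Steps 516 to 521 amount to a quadrant test on the angle of the direction line, with the y-axis case deferred to the surrounding lines' grouping. A minimal sketch, where the angle convention and function name are assumptions:

```python
def group_by_direction_line(angle_deg, neighbor_group=None):
    """Group a road edge feature point by the angle of its road edge
    direction line in the overhead view (y-axis = sensor heading, angle
    measured counter-clockwise from the positive x-axis, in degrees).
    2nd quadrant, negative x-axis, or 3rd quadrant -> right road edge;
    1st quadrant, positive x-axis, or 4th quadrant -> left road edge;
    on the y-axis, defer to the surrounding lines' group (steps 520-521)."""
    a = angle_deg % 360.0
    if 90.0 < a < 270.0:        # steps 516-517
        return "right"
    if a < 90.0 or a > 270.0:   # steps 518-519
        return "left"
    return neighbor_group       # a == 90 or 270: use the neighbours' group
```

A line along the negative x-axis (180 degrees) lands in the right group, while a line on the negative y-axis (270 degrees) inherits the group of its neighbours, matching the FIG. 8 example.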
  • Step 514 is executed by the road edge direction line acquisition unit 511, step 515 by the nearby road edge direction line search unit 512, and the steps from 516 onward by the target direction line angle sorting unit 513.
  • FIG. 9 shows an example in which the road edge direction line straddles the center of the image.
  • Here, a case is described in which the direction of the road edge direction line lies in the third quadrant, but the line should be classified as the left road edge in the actual driving road environment. When the target road edge direction line straddles the center of the image, the grouping results of the surrounding road edge direction lines are referred to in addition to the sorting by the direction of the line itself.
  • In this case, the target road edge direction line is grouped into the left or right road edge group, giving priority to the grouping results of the surrounding road edge direction lines.
  • In the example of FIG. 9, the angle of the target road edge direction line (A) lies in the third quadrant, but the surrounding road edge direction line (B) lies in the fourth quadrant and is grouped into the left road edge group.
  • Therefore, the grouping result of the surrounding road edge direction line (B) is given priority, and the target road edge direction line (A) is grouped into the same left road edge group as (B).
  • The road edge height direction sorting section 530 includes: a running road surface height detection unit 531 that detects the running road surface height from the acquired image; a road edge feature point height detection unit 532 that detects the height from the running road surface at each road edge feature point; a road edge identification threshold calculation unit 533 that calculates a height threshold for identifying road edges; and a road edge feature point height sorting unit 534 that compares and selects road edge feature points. The road edge feature points are grouped based on this height information.
  • The running road surface height detection unit 531 uses the image generated by the parallax generation unit 200 to calculate the height of the running road surface area based on the parallax information.
  • FIG. 11 shows an example of detecting the running road surface height. From the generated image, height information of each feature point is detected, centered on the vehicle's path. Among the detected feature points, those located in the vehicle's travel road area are given high priority, and their height information is used as candidates for the road surface height. Height information of feature points is then detected outward from the vehicle travel path area to the left and right. After weighting the feature points in the vehicle's path area, the height of the running road surface is detected, including the surrounding feature points.
  • The road edge feature point height detection unit 532 detects the height of each road edge feature point from the running road surface.
  • Specifically, the running road surface height detected by the running road surface height detection unit 531 is acquired, and then the height distribution from the road surface at each road edge feature point is detected using the three-dimensional position information of the feature point and the road surface height.
  • The road edge identification threshold calculation unit 533 calculates a height threshold for identifying a road edge based on the road surface height detected by the running road surface height detection unit 531. This height threshold is calculated based on the flatness of the running road surface. On a paved road such as a national highway, the rate of change in height is generally extremely small over the entire driving road surface. However, when the road surface is off-road, covered with snow, or the like, the rate of change in height varies over the entire road surface. Therefore, if the height threshold were fixed, it might not be possible to cope with slight changes in road surface height when identifying the road edge, and the road edge could be erroneously detected.
  • FIG. 12 shows an example of the results of calculating the road edge identification threshold (height threshold) based on the flatness of the driving road surface, assuming an off-road or snow-covered road.
  • The flatness of the running road surface is calculated using the running road surface feature points detected by the running road surface height detection section 531. As in the running road surface height detection section 531, the feature points existing in the vehicle's travel road area are weighted with high priority, and the average height of the running road surface is then calculated including the surrounding road surface feature points. This calculated average height is set as the road edge identification threshold.
  • The road edge feature point height sorting unit 534 selects, based on the road edge identification threshold calculated by the road edge identification threshold calculation unit 533, the road edge candidate feature points that constitute the left and right road edges using their height information. For example, when the height of a road edge candidate feature point exceeds the road edge identification threshold, the road edge feature point height sorting unit 534 determines that it is a road edge feature point forming the same road edge as an adjacent road edge candidate feature point, and selects it.
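The threshold computation described above can be sketched as a weighted average of road surface feature point heights plus a margin. The weight and margin values here are illustrative assumptions, not figures from the publication:

```python
def road_edge_identification_threshold(surface_points, path_weight=3.0, margin=0.05):
    """Weighted average height [m] of the running road surface, with
    feature points inside the vehicle's travel path weighted higher,
    plus a small margin. surface_points is an iterable of
    (height_m, in_travel_path) pairs."""
    weighted_sum = total_weight = 0.0
    for height, in_path in surface_points:
        w = path_weight if in_path else 1.0
        weighted_sum += w * height
        total_weight += w
    return weighted_sum / total_weight + margin

def select_by_height(candidate_heights, threshold):
    """Keep candidate feature points whose height from the road surface
    exceeds the identification threshold."""
    return [h for h in candidate_heights if h > threshold]
```

Because the threshold follows the measured surface heights rather than being fixed, a slightly uneven snow-covered surface raises the threshold instead of producing false road edges.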
  • FIG. 13 shows the block configuration of the road edge depth direction sorting section 540.
  • The road edge depth direction sorting section 540 includes: a road edge feature point sorting result acquisition unit 541 that acquires the sorting results of the road edge direction line angle sorting unit 510 and the road edge height direction sorting unit 530; and a road edge feature point depth direction sorting unit 542 that searches in the depth direction toward the vanishing point of the camera and selects whether the target road edge feature point constitutes a road edge. Sorting is thus based on the series of feature points in the depth direction.
  • In this embodiment, the vanishing point of a camera is described as an example, but it is also possible to search in the depth direction using LiDAR or the like.
  • With LiDAR, three-dimensional sensing is possible as with a camera, so the area of the running road surface can be detected. By regarding the direction along the detected running road surface as the depth direction, road edge feature points can be searched for in the depth direction.
  • The search is performed from the near side of the own vehicle toward the depth direction.
  • In this way, a series of road edge feature points is selected starting from the near side of the own vehicle, and the direction in which the feature points form a series can be predicted at the same time as the series in the depth direction is selected. Therefore, at the beginning of the search, in the case of a camera, the search starts in the direction toward the vanishing point, but during the search in the depth direction the search direction can also be changed dynamically. This makes it possible to select road edge feature points in the depth direction even where the shape of the road edge is changing.
  • The road edge feature point sorting result acquisition unit 541 obtains the results of road edge feature point sorting from the road edge direction line angle sorting unit 510 and the road edge height direction sorting unit 530.
  • The road edge feature point depth direction sorting unit 542 determines the road edge feature points to be selected based on the results obtained by the road edge feature point sorting result acquisition unit 541.
  • FIG. 14 shows an example of road edge feature point selection in the depth direction.
  • Here, the depth direction refers to the direction from the bottom to the top of a captured image, mainly the direction toward the vanishing point of the camera. In this embodiment, the direction toward the vanishing point is described as an example, but it may also be the direction from the lower end to the upper end of a bird's-eye view.
  • The target road edge feature points determined from the results obtained by the road edge feature point sorting result acquisition unit 541 are searched from the bottom edge of the image. As shown in FIG. 14, a processing area, drawn as a dashed rectangle pointing toward the vanishing point of the camera, is set based on the position information of the road edge feature point.
  • This processing area does not necessarily have to be directed toward the vanishing point; it may be in any direction from the bottom edge of the image toward the top edge. When this processing area is searched and a road edge feature point with the same sorting result as the target road edge feature point is found, the target road edge feature point is determined to be a feature point constituting either the left or right road edge. This process is performed on each side, and the left and right road edge groups are sorted in the depth direction.
  • That is, when a road edge feature point with the same sorting result exists in the processing area, the road edge feature point depth direction sorting unit 542 determines that the target feature point is a road edge feature point constituting the same road edge as the adjacent feature point, and selects it.
  • The road edge feature points are already sorted based on the series of road edge features in the height direction, but this alone is not sufficient: as shown in FIG. 14, objects or small amounts of snow at a certain height from the road surface risk being classified as a road edge. However, many such feature points are one-off when the series in the depth direction is considered. Therefore, for the road edge feature points selected by the road edge direction line angle sorting unit 510 and the road edge height direction sorting unit 530, the feature points constituting the left and right road edges are selected in consideration of continuity in the depth direction. In FIG. 14, since the road edge feature points of the left and right curb road edges exist continuously in the depth direction, they are selected as road edge feature points constituting the road edge.
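The one-off rejection described above can be sketched as a continuity check in the depth direction: a feature point survives only if another point of the same group lies within a small processing window. The window sizes and field names are illustrative assumptions:

```python
def select_by_depth_series(points, lateral_tol=0.5, depth_window=5.0):
    """Keep road edge feature points that form a series in the depth
    direction: a point survives if another point of the same left/right
    group lies within depth_window metres in depth and lateral_tol
    metres sideways (cf. the dashed processing area of FIG. 14).
    points: list of dicts {"x": lateral_m, "y": depth_m, "group": str}."""
    kept = []
    for p in points:
        has_series = any(
            q is not p
            and q["group"] == p["group"]
            and abs(q["x"] - p["x"]) <= lateral_tol
            and abs(q["y"] - p["y"]) <= depth_window
            for q in points
        )
        if has_series:
            kept.append(p)
    return kept
```

A continuous curb line survives the check, while an isolated clump of snow a couple of metres off the curb line has no neighbour in its window and is discarded.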
  • The target is not limited to road edges where feature points exist densely and continuously in the depth direction, such as curbs and walls; road edges where curb stones are present at regular intervals can also be treated as the same road edge.
  • The road edge identification unit 600 uses the grouping results obtained by the road edge feature three-dimensional sorting unit 500 to identify the left and right road edges from the road edge feature points extracted by the road edge feature point extraction unit 300.
  • Specifically, based on the sorting results of the road edge feature three-dimensional sorting unit 500, the road edge identification unit 600 connects the road edge feature points constituting the left and right road edges (more specifically, it connects the road edge feature points grouped into the same right or left road edge group) to identify the left road edge and the right road edge, respectively. It uses only the road edge feature points that satisfy all three conditions: the left/right grouping result of the road edge direction line angle sorting section 510, the height direction sorting result of the road edge height direction sorting section 530, and the depth direction sorting result of the road edge depth direction sorting section 540. The left and right road edges are identified using these road edge feature points.
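The three-way condition amounts to a logical AND over the three sorting results. The data layout below (feature point ids and boolean maps) is an illustrative assumption:

```python
def identify_road_edges(points, angle_ok, height_ok, depth_ok):
    """Split feature points into left/right road edge lists, keeping
    only those that pass all three sortings: direction line angle,
    height direction, and depth direction. Each *_ok argument maps a
    feature point id to True/False; "group" comes from the angle
    sorting (left or right)."""
    left, right = [], []
    for p in points:
        pid = p["id"]
        if angle_ok.get(pid) and height_ok.get(pid) and depth_ok.get(pid):
            (left if p["group"] == "left" else right).append(pid)
    return left, right
```

Connecting the surviving points of each list in depth order then yields the identified left and right road edge polylines.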
  • The road edge detection determination unit 700 uses the road edge information identified by the road edge identification unit 600 to calculate the information necessary for road edge departure prevention control, such as the lateral position and yaw angle of the road edge, and determines the reliability of the identified road edge. By using the position information of the feature points identified as road edges, the unit can accurately detect the shape of the road edge and predict whether the host vehicle will depart toward the road edge in the future.
  • The alarm control unit 800 executes warnings and control to prevent the host vehicle from departing toward the road edge, based on the road edge information calculated by the road edge detection determination unit 700.
  • By calculating the lateral position and yaw angle of the road edge, road edge departure prevention control is possible even for complex road edge shapes. Furthermore, when the host vehicle has a slip angle or the like, the road edge detection result may be inaccurate, so control using the road edge information may be deactivated.
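One way to derive a lateral position and yaw angle from the identified road edge points is to fit a straight line in the overhead plane. The sketch below is an illustrative assumption, not the patent's actual computation: it uses a least-squares fit of x = a·y + b (y = depth ahead of the vehicle, x = lateral offset) and reads off the edge's lateral position at the vehicle (y = 0) and the angle between the vehicle heading and the edge.

```python
import math

# Illustrative sketch (assumed model, not from the patent): fit x = a*y + b
# to the identified road edge points in the overhead view, then report
# (lateral position at y = 0, yaw angle of the edge relative to heading).

def edge_lateral_and_yaw(points):
    """points: list of (x, y) road edge points, y = depth [m].
    Returns (lateral_position_m, yaw_angle_rad)."""
    n = len(points)
    sy = sum(p[1] for p in points)
    sx = sum(p[0] for p in points)
    syy = sum(p[1] * p[1] for p in points)
    syx = sum(p[1] * p[0] for p in points)
    a = (n * syx - sy * sx) / (n * syy - sy * sy)  # slope dx/dy
    b = (sx - a * sy) / n                          # lateral offset at y = 0
    return b, math.atan(a)
```

For curved road edges a higher-order fit would be used instead; the straight-line form keeps the idea visible.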
  • As described above, the external environment recognition device 1 of this embodiment has a processing unit 900 that performs road edge detection. The processing unit 900 includes a road edge feature point extraction unit 300 that obtains road edge feature points corresponding to road edges from the sensing results output by an in-vehicle sensing device that detects information outside the vehicle; a road edge direction line calculation unit 400 that calculates, for each road edge feature point, a line representing the direction of the road edge constituted by the road edge feature points (for example, the normal of the road edge); and a road edge identification unit 600 that groups the road edge feature points into a right road edge group and a left road edge group according to the direction of the line (for example, first/fourth quadrant versus second/third quadrant) and identifies each road edge using the road edge feature points grouped into the same group.
  • In other words, the external environment recognition device 1 of this embodiment obtains road edge feature points corresponding to road edges from the sensing results output by an in-vehicle sensing device that detects information outside the vehicle, calculates for each road edge feature point a line representing the direction of the road edge constituted by the feature points (for example, the normal of the road edge), and groups the feature points into a right road edge group and a left road edge group according to the direction of the line, thereby enabling more accurate road edge detection.
  • Each of the above-described configurations, functions, processing units, and processing means may be partially or entirely realized in hardware, for example by designing them as an integrated circuit.
  • Each of the above configurations, functions, etc. may also be realized in software, with a processor interpreting and executing a program that implements each function.
  • Information such as programs, tables, and files that realize each function can be stored in memory, in a recording device such as a hard disk or SSD (Solid State Drive), or on a recording medium such as an IC card, SD card, or DVD.
  • 1 External environment recognition device
  • 100 Stereo camera unit
  • 200 Parallax generation unit
  • 300 Road edge feature point extraction unit
  • 400 Road edge direction line calculation unit
  • 500 Road edge feature three-dimensional selection unit
  • 510 Road edge direction line angle selection unit
  • 511 Road edge direction line acquisition unit
  • 512 Nearby road edge direction line search unit
  • 513 Target direction line angle selection unit
  • 530 Road edge height direction selection unit
  • 531 Travel road surface height detection unit
  • 532 Road edge identification threshold calculation unit
  • 534 Road edge feature point height selection unit
  • 540 Road edge depth direction selection unit
  • 541 Road edge feature point depth direction selection unit
  • 600 Road edge identification unit
  • 700 Road edge detection determination unit
  • 800 Alarm control unit
  • 900 Processing unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

Provided is an external environment recognition device capable of accurately detecting left and right road edges. This external environment recognition device: obtains road edge feature points corresponding to road edges from sensing results output from a vehicle-mounted sensing device for detecting information outside the vehicle; calculates, for each road edge feature point, a line representing the direction of the road edge constituted by the road edge feature point (e.g., a normal to the road edge); groups the road edge feature points into a right road edge group and a left road edge group according to the direction of each line; and identifies each road edge on the basis of the road edge feature points grouped in the same group corresponding to the road edge, thereby making it possible to accurately detect left and right road edges.

Description

External environment recognition device
The present invention relates to an external environment recognition device mounted on a vehicle or the like.
An external environment recognition device recognizes the environment outside the vehicle by utilizing sensors mounted on the vehicle. In recent years, advanced driver assistance systems that use such devices have become widespread as a means of preventing traffic accidents. In the front sensing of advanced driver assistance systems, in addition to lane detection on the travel path, there is also technology for detecting the edges of the travel path (hereinafter referred to as road edges).
More specifically, warning and control functions for lane departure prevention, which apply the lane detection function, have become widespread in recent years. In such advanced driver assistance systems, a road edge detection function that detects the edge of the drivable area (the road edge) is being developed as a future technology. One background factor is that single-vehicle accidents, such as run-off-road accidents, are common in Japan as well as in Europe, the United States, and other countries. In addition, on roads where lanes cannot be detected, for example because there are no lane markings such as white lines, warning and control functions based on lane detection cannot be used, so the driver receives no assistance. Against this background, there is a need to develop functions that suppress not only lane departure but also departure toward the road edge. To realize such a road edge departure prevention function, an external environment recognition device capable of detecting road edges must be developed.
In the road edge detection technology described above, road edge feature points corresponding to the road edge are extracted from the sensing results output by an in-vehicle sensing device that detects information outside the vehicle, and the road edge is constructed by connecting these feature points.
For example, Patent Document 1 describes a technique that uses an in-vehicle camera as the in-vehicle sensing device and detects the boundary of a stepped surface on the travel path as the road edge.
Japanese Patent Application Publication No. 2015-184900
However, in the conventional technology, when the extracted road edge feature points are connected, feature points of the left road edge and feature points of the right road edge may mistakenly be regarded as the same road edge and connected together. Such a problem can occur, for example, on a sharply curving left-hand road, when the right road edge intersects the extension of the left road edge within the sensing range of the in-vehicle sensing device.
The present invention has been made in view of the above problem, and its object is to provide an external environment recognition device capable of accurately detecting the left and right road edges.
To solve the above problem, an external environment recognition device according to the present invention includes a processing unit that performs road edge detection. The processing unit includes: a road edge feature point extraction unit that obtains road edge feature points corresponding to road edges from the sensing results output by an in-vehicle sensing device that detects information outside the vehicle; a road edge direction line calculation unit that calculates, for each road edge feature point, a line representing the direction of the road edge constituted by the road edge feature points; and a road edge identification unit that groups the road edge feature points into a right road edge group and a left road edge group according to the direction of the line, and identifies each road edge using the road edge feature points grouped into the same group.
According to the present invention, in the road edge detection process, a line representing the direction of the road edge constituted by the road edge feature points is calculated for each feature point, and the feature points are grouped into right and left road edges according to the direction of the calculated line, enabling more accurate road edge detection. Furthermore, by accurately detecting the left and right road edges, erroneous control based on road edge information (for example, erroneous activation of the road edge departure prevention function) can be suppressed.
Problems, configurations, and effects other than those described above will become clear from the following description of the embodiments.
FIG. 1 is a block configuration diagram of an external environment recognition device according to an embodiment of the present invention.
FIG. 2 is a processing flowchart of a road edge direction line calculation unit.
FIG. 3 shows an example of road edge normal calculation results (overhead view).
FIG. 4 is a block configuration diagram of a road edge feature three-dimensional selection unit.
FIG. 5 is a block configuration diagram of a road edge direction line angle selection unit.
FIG. 6 is a processing flowchart of the road edge direction line angle selection unit.
FIG. 7 shows an example of left/right grouping according to the direction of a road edge direction line.
FIG. 8 shows a grouping example when a road edge direction line lies on the negative y-axis.
FIG. 9 shows an example in which road edge direction lines exist across the image center.
FIG. 10 is a block configuration diagram of a road edge height direction selection unit.
FIG. 11 shows an example of travel road surface height detection and road edge identification threshold calculation results.
FIG. 12 shows an example of road edge identification threshold calculation results assuming off-road or snow-covered roads.
FIG. 13 is a block configuration diagram of a road edge depth direction selection unit.
FIG. 14 shows an example of road edge feature point selection in the depth direction.
Embodiments of the present invention will be described below with reference to the drawings.
FIG. 1 shows the overall configuration of an external environment recognition device according to an embodiment of the present invention.
The external environment recognition device 1 of this embodiment is mounted on a vehicle (the host vehicle), extracts road edge feature points corresponding to road edges from the sensing results output by an in-vehicle sensing device that detects information outside the vehicle, and detects road edges by connecting the extracted feature points. A road edge is the end of the drivable area including the host vehicle's travel path, and a road edge feature point is a feature point representing that end (details are described later). As shown in FIG. 1, the external environment recognition device 1 mainly comprises a stereo camera unit 100 and a processing unit 900. The processing unit 900 comprises a parallax generation unit 200, a road edge feature point extraction unit 300, a road edge direction line calculation unit 400, a road edge identification unit 600 having a road edge feature three-dimensional selection unit 500, a road edge detection determination unit 700, and an alarm control unit 800. Each part of the external environment recognition device 1 is described below.
(Stereo camera unit 100)
The stereo camera unit 100 is equipped with an in-vehicle stereo camera, which constitutes an in-vehicle sensing device for detecting information outside the vehicle. The stereo camera captures images of the outside of the vehicle with its left and right cameras and outputs the captured images to the processing unit 900. In this embodiment, a stereo camera is described as an example of a front sensor, but the sensor is not limited to a stereo camera; it may be a monocular camera, a LiDAR, or the like, or a fusion sensor combining a camera and a LiDAR.
(Processing unit 900)
The processing unit 900 performs road edge detection based on the sensing results (images in this embodiment) output from the stereo camera unit 100.
(Parallax generation unit 200)
The parallax generation unit 200 identifies, from the images captured by the left and right cameras of the stereo camera unit 100, the positions at which the same object appears in each image, and generates the difference between these image positions as a parallax image. By generating a parallax image, three-dimensional position information of the surrounding environment, such as the travel road surface, roadside objects, and obstacles, can be obtained. This embodiment includes the parallax generation unit 200 because a stereo camera is used as the front sensor; if the front sensor is a LiDAR or a camera fusion sensor, anything that can detect three-dimensional position information of the surroundings, such as a three-dimensional point cloud map, may be used instead.
(Road edge feature point extraction unit 300)
The road edge feature point extraction unit 300 uses the parallax image generated by the parallax generation unit 200 and the principle of triangulation to generate the three-dimensional position of each point as a 3D point cloud. From the calculated 3D point cloud, feature points serving as road edge candidates (hereinafter also referred to as road edge feature points or road edge candidate feature points) are extracted.
The road edge feature point extraction unit 300 is described in detail below. It extracts feature points for three-dimensional objects that are tall compared with the region corresponding to the travel road surface. Using the parallax image generated by the parallax generation unit 200, with image coordinates in the horizontal direction and parallax values representing depth in the vertical direction, the principle of triangulation is applied to generate each three-dimensional position as a 3D point cloud. On the generated image, a voting process is performed for each column of the abscissa to extract feature points serving as road edge candidates. The extracted road edge feature points retain three-dimensional position information on the image. They can also be arranged on an overhead view seen from above the host vehicle, and can be converted into a two-dimensional image.
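The triangulation underlying the 3D point cloud can be sketched with the standard stereo relation Z = f·B/d. This is textbook stereo geometry, not code from the patent, and all camera parameters below (focal length f in pixels, baseline B in metres, principal point cx, cy) are illustrative assumptions.

```python
# Illustrative sketch of stereo triangulation (standard geometry, assumed
# camera parameters): convert a disparity d at pixel (u, v) into a 3D point.

def disparity_to_point(u, v, d, f=1000.0, B=0.35, cx=640.0, cy=400.0):
    """Return (X, Y, Z) in metres for pixel (u, v) with disparity d > 0.
    Depth follows Z = f * B / d; X and Y are back-projected through the
    pinhole model relative to the principal point (cx, cy)."""
    Z = f * B / d
    X = (u - cx) * Z / f
    Y = (v - cy) * Z / f
    return X, Y, Z
```

Points far from the camera have small disparities, so depth resolution degrades with distance; this is one reason the later selection stages re-check feature points in the height and depth directions.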
Objects extracted as road edge feature points include not only typical road edges such as curbs and walls, but also moving objects such as oncoming and parallel-running vehicles, as well as low steps of about 5 cm. This embodiment mainly describes road edges with height as examples, but road edge feature points can also be extracted for negative road edges such as side ditches and paddy field paths, and for step-free road edges, such as grass, gravel, or soil, that are at the same height as the travel path.
For the so-called negative road edges, such as side ditches and paddy field paths, feature points of road edge portions lower than the travel road surface can be extracted by calculating the height distribution of the region corresponding to the travel road surface and the height distribution in the vertical direction, in the same way as for road edges with height.
Step-free road edges such as grass and gravel can be detected by a learning-based method applying a learning algorithm. As a learning example, images containing step-free road edges are annotated with the regions corresponding to road edge portions, and these images are used as ground truth for training. Through such training, feature points of regions corresponding to step-free road edges can be extracted and used in the same way as other road edge feature points.
That is, the road edge feature points extracted by the road edge feature point extraction unit 300 of this embodiment include road edges with height, negative road edges, and step-free road edges, and are candidate feature points for regions corresponding to the ends of the drivable area including the host vehicle's travel path.
(Road edge direction line calculation unit 400)
The road edge direction line calculation unit 400 uses the road edge feature points extracted by the road edge feature point extraction unit 300 to calculate a line representing the direction of the road edge constituted by those feature points. In this embodiment, the normal is used as an example of such a line, but the line is not limited to the normal; any line directed toward the position or direction of the front sensor, or toward the direction of the host vehicle's travel path, may be used. In other words, the line calculated by the road edge direction line calculation unit 400 is a line that can represent the relative direction (inclination) of the road edge with respect to (the front sensor of) the host vehicle.
The road edge direction line calculation unit 400 is described in detail below. It calculates lines representing the directions of the left and right road edges; in this embodiment, the normal is used as the example. FIG. 2 shows the processing flowchart of the road edge direction line calculation unit 400, whose operation is as follows.
In step 401, the road edge feature points extracted by the road edge feature point extraction unit 300 are acquired.
In step 402, for the road edge feature points acquired in step 401, a search is performed on the left and right sides, one point at a time in the depth direction from the bottom to the top of the image.
In step 403, in the 3D point cloud generated by the road edge feature point extraction unit 300, a surface fitting process is performed using the points within a fixed area centered on the position coordinates of the road edge feature point detected in step 402. This embodiment describes surface fitting using a 3D point cloud generated from three-dimensional position information as an example, but for two-dimensional position information such as a UD map or an overhead view, line fitting can also be performed from the sequence of road edge feature points.
In step 404, using the result calculated in step 403, a line in the normal direction of the road edge is calculated from the road edge feature point position. The normal is calculated only in the direction toward the sensor; normals pointing away from the sensor are not calculated. FIG. 3 shows an example of road edge normal calculation results. The normal is used here as an example of a line representing the direction of the road edge, but the line is not limited to the normal; any line directed toward the position or direction of the sensor, or toward the direction of the host vehicle's travel path, may be used.
Through the processing of steps 403 and 404, for the road edge feature point detected in step 402, the line representing the direction of the road edge (the road edge normal in this embodiment) is obtained from its positional relationship to the surrounding road edge feature points.
In step 405, it is determined whether any acquired road edge feature points remain. If road edge feature points still exist, the process returns to step 402 and repeats steps 402 to 404; otherwise, the process ends.
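A simplified two-dimensional version of steps 403 and 404 can be sketched as follows. The patent's example fits a surface in the 3D point cloud; this sketch instead uses the line-fitting variant mentioned for overhead-view data, and the neighbour-based tangent estimate is an assumption for illustration.

```python
import math

# Illustrative 2-D sketch of steps 403-404 (assumed simplification): estimate
# the local road edge direction from neighbouring points, take its
# perpendicular as the road edge normal, and keep only the orientation that
# points toward the sensor at the origin, as the flowchart requires.

def edge_normal_toward_sensor(p, prev_p, next_p):
    """p, prev_p, next_p: (x, y) points along one road edge in the overhead
    view, with the sensor at the origin. Returns a unit normal (nx, ny) at p
    oriented toward the sensor side."""
    tx, ty = next_p[0] - prev_p[0], next_p[1] - prev_p[1]  # local tangent
    nx, ny = -ty, tx                                        # one perpendicular
    # Flip the normal if it points away from the sensor at (0, 0):
    # the normal should have a positive component along the vector p -> origin.
    if nx * (-p[0]) + ny * (-p[1]) < 0:
        nx, ny = -nx, -ny
    norm = math.hypot(nx, ny)
    return nx / norm, ny / norm
```

For a left road edge (negative x) the resulting normal points in the positive x direction, and for a right road edge in the negative x direction, which is exactly the property the quadrant-based grouping in steps 516 to 521 relies on.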
(Road edge identification unit 600)
The road edge identification unit 600 identifies road edges using the road edge feature three-dimensional selection unit 500.
(Road edge feature three-dimensional selection unit 500)
The road edge feature three-dimensional selection unit 500 uses the lines representing the direction of the road edge (sometimes referred to as road edge direction lines) calculated by the road edge direction line calculation unit 400 to sort the road edge feature points extracted by the road edge feature point extraction unit 300 into a right road edge group and a left road edge group.
The road edge feature three-dimensional selection unit 500 is described in detail below. FIG. 4 shows its block configuration. The unit comprises a road edge direction line angle selection unit 510 that performs selection according to the direction of the road edge direction lines calculated by the road edge direction line calculation unit 400, a road edge height direction selection unit 530 that performs selection according to the height distribution of the road edge direction lines above the road surface, and a road edge depth direction selection unit 540 that performs selection according to the continuity of the road edge direction lines toward the camera vanishing point on the image; together these group the road edge feature points into left and right road edge groups.
(Road edge direction line angle selection unit 510)
The road edge direction line angle selection unit 510 is described in detail below. FIG. 5 shows its block configuration. The unit comprises a road edge direction line acquisition unit 511 that acquires the road edge direction lines generated by the road edge direction line calculation unit 400, a nearby road edge direction line search unit 512 that searches for the road edge direction lines to be selected in the direction from the bottom to the top of the image, and a target direction line angle selection unit 513 that groups the target road edge direction lines to the left or right according to their direction; it thereby groups the road edge feature points into left and right road edge groups according to the direction of the road edge direction line. FIG. 6 shows the processing flowchart of the road edge direction line angle selection unit 510, whose operation is as follows.
 ステップ514では、路端方向線算出部400で算出された路端の方向を表す線(路端方向線)を取得する。 In step 514, a line (roadside direction line) representing the direction of the roadside calculated by the roadside direction line calculation unit 400 is acquired.
 ステップ515では、ステップ514で取得した路端特徴点について、画像の下端から上端に向かう奥行方向に1点ずつ路端特徴点があるか左右でそれぞれ探索する。 In step 515, the road edge feature points acquired in step 514 are searched one by one in the depth direction from the bottom to the top of the image on the left and right sides.
 ここで、図7に、以下のステップ516、518、520において、路端方向線の向きに応じた選別例を示す。以下、図7は、ステップ516、518、520の詳細説明時に用いる。なお、図7の座標軸において、y軸は自車両(のフロントセンサ)が指向する方向、x軸は俯瞰図においてy軸に垂直な方向を示している(図7以外の他図も同じ)。 Here, FIG. 7 shows an example of sorting according to the direction of the road-end direction line in steps 516, 518, and 520 below. Hereinafter, FIG. 7 will be used to explain steps 516, 518, and 520 in detail. In the coordinate axes of FIG. 7, the y-axis indicates the direction toward which the own vehicle (the front sensor thereof) points, and the x-axis indicates the direction perpendicular to the y-axis in the overhead view (the same applies to other figures other than FIG. 7).
In step 516, the road edge direction line detected in step 515 is used; as shown in FIG. 7, with the starting point of the line as the origin of the coordinate axes, it is determined whether the angle of the line belongs to the second quadrant, the negative x-axis, or the third quadrant.
In step 517, if the determination result in step 516 is YES, the target road edge feature point is grouped as a right road edge.
Step 518 operates when the determination result in step 516 is NO. Using the road edge direction line detected in step 515, as shown in FIG. 7, with the starting point of the line as the origin of the coordinate axes, it is determined whether the angle of the line belongs to the first quadrant, the positive x-axis, or the fourth quadrant.
In step 519, if the determination result in step 518 is YES, the target road edge feature point is grouped as a left road edge.
Step 520 operates when the determination result in step 518 is NO, that is, when the direction of the road edge direction line lies on the negative y-axis. A fixed region surrounding the target road edge direction line is searched. FIG. 8 shows an example of searching the surroundings of the target road edge direction line.
In step 521, using the search result of step 520, the target road edge feature point is grouped into the same group as the surrounding road edge direction lines. Taking FIG. 8 as an example, the area inside the dashed rectangle centered on the target road edge direction line is searched; because the detected surrounding road edge direction lines point into the third quadrant and have therefore been classified into the right road edge group, the target road edge direction line is grouped into the same right road edge group as the surrounding road edge direction lines.
In step 522, it is determined whether any acquired road edge feature points remain. If road edge feature points still remain, the process returns to step 515 and the processing from step 515 to step 521 is repeated. If no road edge feature points remain, the process ends.
Step 514 is executed by the road edge direction line acquisition unit 511, step 515 by the nearby road edge direction line search unit 512, and step 516 onward by the target direction line angle selection unit 513.
The case where a road edge direction line straddles the center of the image is also described. FIG. 9 shows an example in which a road edge direction line straddles the center of the image. As shown in FIG. 9, a case is described in which the direction of the road edge direction line lies in the third quadrant but, in the actual driving road environment, the line should be classified as a left road edge. When the target road edge direction line straddles the center of the image, the grouping results of the surrounding road edge direction lines are also referred to, in addition to the sorting according to the direction of the road edge direction line. When the sorting result based on the direction of the target road edge direction line differs from that of the surrounding road edge direction lines, the grouping result of the surrounding road edge direction lines is given priority, and the target road edge direction line is grouped into the left or right road edge group accordingly. In the case of FIG. 9, the angle of the target road edge direction line (A) lies in the third quadrant, but the surrounding road edge direction line (B) lies in the fourth quadrant and has been grouped into the left road edge group; therefore the grouping result of the surrounding road edge direction line (B) is given priority, and the target road edge direction line (A) is grouped into the same left road edge group as the surrounding road edge direction line (B).
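The grouping logic of steps 516 to 521 can be sketched as follows. This is a minimal illustration, not code from the embodiment; the function name, the direction-vector representation, and the majority-vote fallback are assumptions, and the same neighbor-based grouping also covers the image-center straddling case described above.

```python
def classify_direction_line(dx, dy, neighbor_groups=None):
    """Group one road edge direction line as 'right' or 'left'.

    (dx, dy) is the direction vector of the line with its start point
    taken as the origin, x pointing right and y pointing forward in
    the overhead view. neighbor_groups is a list of group labels of
    nearby direction lines, used when the angle alone is ambiguous.
    """
    # Steps 516-517: second quadrant, negative x-axis, or third quadrant
    # (dx < 0) -> the target feature point is grouped as a RIGHT edge.
    if dx < 0:
        return "right"
    # Steps 518-519: first quadrant, positive x-axis, or fourth quadrant
    # (dx > 0) -> the target feature point is grouped as a LEFT edge.
    if dx > 0:
        return "left"
    # Steps 520-521: the line lies on the y-axis (dx == 0); fall back to
    # the majority group of neighboring direction lines in a fixed window.
    if neighbor_groups:
        return max(set(neighbor_groups), key=neighbor_groups.count)
    return "unknown"
```

A line pointing into the third quadrant, for example, is classified as a right road edge unless the neighbor fallback applies.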
(Road edge height direction sorting unit 530)
 The road edge height direction sorting unit 530 will be described in detail. FIG. 10 shows a block configuration of the road edge height direction sorting unit 530. The road edge height direction sorting unit 530 includes a running road surface height detection unit 531 that detects the running road surface height from the acquired image, a road edge feature point height detection unit 532 that detects the height from the running road surface at each of the road edge feature points, a road edge identification threshold calculation unit 533 that dynamically calculates a road surface height threshold based on the unevenness of the road surface, and a road edge feature point height selection unit 534 that compares the road surface height threshold with the height information of the road edge feature points and selects road edge feature points; the road edge feature points are grouped based on the height information.
The running road surface height detection unit 531 uses the image generated by the parallax generation unit 200 and calculates the height of the running road surface area based on the parallax information. FIG. 11 shows an example of running road surface height detection. From the generated image, the height information of each feature point is detected, centered on the vicinity of the host vehicle's travel path. Among the detected feature points, those located in the travel-path region of the host vehicle are used with high priority, and their height information is taken as height candidates for the running road surface. The height information of feature points is then detected outward to the left and right of the travel-path region. After weighting the feature points in the travel-path region, the height of the running road surface is detected including the surrounding feature points.
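The weighted surface-height estimation described for unit 531 can be sketched as a weighted average in which travel-path points count more heavily. The data layout and the weight value are illustrative assumptions, not taken from the embodiment:

```python
def estimate_surface_height(points, ego_weight=3.0):
    """Estimate the running road surface height from feature points.

    points: list of (height, in_ego_path) tuples, where height is the
    3-D height of a feature point and in_ego_path tells whether the
    point lies in the host vehicle's travel-path region. Travel-path
    points are weighted more heavily, as described for unit 531.
    """
    num = 0.0  # weighted sum of heights
    den = 0.0  # sum of weights
    for height, in_ego_path in points:
        w = ego_weight if in_ego_path else 1.0
        num += w * height
        den += w
    return num / den if den else 0.0
```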
The road edge feature point height detection unit 532 detects the height from the running road surface at each of the road edge feature points. The running road surface height detected by the running road surface height detection unit 531 is acquired. Then, using the three-dimensional position information of the road edge feature points and the running road surface height, the distribution of heights from the running road surface at each of the road edge feature points is detected.
The road edge identification threshold calculation unit 533 calculates the height threshold for identifying a road edge based on the running road surface height detected by the running road surface height detection unit 531. This height threshold is calculated based on the flatness of the running road surface. On a paved road such as a national highway, the rate of change in height is generally extremely small over the entire running road surface. However, when the running road surface is off-road, snow-covered, or the like, the rate of change in height varies over the surface. Therefore, if the height threshold were fixed, slight changes in the running road surface height could not be accommodated when identifying the road edge, and the road edge could be erroneously detected. To solve this problem, the height threshold is changed dynamically according to the flatness and the degree of unevenness of the running road surface. FIG. 12 shows an example, assuming an off-road or snow-covered road, of the road edge identification threshold (height threshold) calculated from the flatness of the running road surface. The degree of flatness of the running road surface is calculated using the running road surface feature points detected by the running road surface height detection unit 531. As in the running road surface height detection unit 531, the feature points in the travel-path region of the host vehicle are weighted with high priority, and the average height of the running road surface is then calculated including the surrounding running road surface feature points. This calculated average height of the running road surface is set as the road edge identification threshold. By dynamically changing the road edge identification threshold according to the flatness of the running road surface, the road edge can be detected accurately even in environments where the height of the running road surface is not constant, such as off-road or snow-covered roads.
Based on the road edge identification threshold calculated by the road edge identification threshold calculation unit 533, the road edge feature point height selection unit 534 uses the height information of the road edge candidate feature points to select the feature points constituting the left and right road edges. For example, when the rate of change in height from the running road surface between a road edge candidate feature point and an adjacent road edge candidate feature point is equal to or less than the road edge identification threshold, the road edge feature point height selection unit 534 determines that the candidate feature point is a road edge feature point constituting the same road edge as the adjacent candidate feature point, and selects it.
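The height-continuity test of unit 534 can be sketched as follows, with the threshold supplied externally (in the embodiment it is the dynamically calculated value from unit 533). The function name and the simplified "absolute height difference between adjacent candidates" measure are illustrative assumptions:

```python
def select_edge_points(candidates, threshold):
    """Select road edge candidate feature points by height continuity.

    candidates: heights above the running road surface, ordered along
    the road edge. A candidate is kept as part of the same road edge
    as its predecessor when the height change between the adjacent
    candidates does not exceed the road edge identification threshold.
    """
    if not candidates:
        return []
    kept = [candidates[0]]
    for prev, cur in zip(candidates, candidates[1:]):
        # Small change rate -> same road edge; a sudden jump (e.g. a
        # lone tall object) is rejected.
        if abs(cur - prev) <= threshold:
            kept.append(cur)
    return kept
```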
(Road edge depth direction sorting unit 540)
 The road edge depth direction sorting unit 540 will be described in detail. FIG. 13 shows a block configuration of the road edge depth direction sorting unit 540. The road edge depth direction sorting unit 540 includes a road edge feature point sorting result acquisition unit 541 that acquires the sorting results of the road edge direction line angle sorting unit 510 and the road edge height direction sorting unit 530, and a road edge feature point depth direction selection unit 542 that searches the road edge feature points in the depth direction toward the vanishing point of the camera and determines whether each target road edge feature point constitutes a road edge; the road edge feature points are sorted based on their continuity in the depth direction. In this embodiment the vanishing point of a camera is described as an example, but a search in the depth direction is also possible with LiDAR or the like. Because LiDAR, like a camera, is capable of three-dimensional sensing, it can detect the running road surface area; by regarding the direction along the detected running road surface as the depth direction, road edge feature points can be searched for in the depth direction.
In the depth-direction search, the search proceeds from the near side of the host vehicle toward the far side. The sequence of road edge feature points is sorted starting from the near side, and while the sequence in the depth direction is being sorted, the direction in which the road edge feature points continue can be predicted at the same time. Therefore, although the search initially proceeds, in the case of a camera, in the direction toward the vanishing point, the search direction can also be changed dynamically during the depth-direction search. This makes it possible to select road edge feature points in the depth direction even along a road edge whose shape changes.
The road edge feature point sorting result acquisition unit 541 acquires the results of the road edge feature point sorting performed by the road edge direction line angle sorting unit 510 and the road edge height direction sorting unit 530.
The road edge feature point depth direction selection unit 542 determines the road edge feature points to be sorted based on the results acquired by the road edge feature point sorting result acquisition unit 541. FIG. 14 shows an example of road edge feature point selection in the depth direction. The depth direction is the direction from the bottom toward the top of the captured image, and mainly means the direction toward the vanishing point of the camera. In this embodiment, the direction toward the vanishing point of the camera is described as an example, but the direction from the bottom toward the top of the overhead view may also be used. The target road edge feature points determined from the results acquired by the road edge feature point sorting result acquisition unit 541 are each sorted starting from the bottom of the image. As shown in FIG. 14, when a target road edge feature point exists, a processing region such as the dashed rectangle pointing toward the vanishing point of the camera is set based on the position information of that road edge feature point. This processing region does not necessarily have to point toward the vanishing point of the camera; any direction from the bottom toward the top of the image may be used. This processing region is then searched, and if a road edge feature point having the same sorting result as the target road edge feature point exists, the target road edge feature point is determined to be a road edge feature point constituting either the left or the right road edge. This processing is performed on each of the left and right sides, and the left and right road edge groups are sorted with respect to the depth direction. For example, when a road edge feature point is located along the direction from an adjacent road edge feature point toward the camera vanishing point on the image, the road edge feature point depth direction selection unit 542 determines that the feature point is a road edge feature point constituting the same road edge as the adjacent feature point, and selects it.
The road edge height direction sorting unit 530 sorts road edge feature points based on their heights, but this alone is insufficient: as shown in FIG. 14, objects of some height above the running road surface, such as a fallen tire or a small patch of snow, could also be sorted as road edges. Considered in terms of continuity in the depth direction, however, such feature points are often isolated. Therefore, the road edge feature points sorted by the road edge direction line angle sorting unit 510 and the road edge height direction sorting unit 530 are further selected as feature points constituting the left and right road edges in consideration of their continuity in the depth direction. In FIG. 14, the road edge feature points of the left and right curb edges exist continuously in the depth direction and are therefore selected as road edge feature points constituting the road edges.
Furthermore, the targets are not limited to road edges, such as curbs and walls, where feature points exist uninterruptedly in the depth direction. For example, a road edge formed by pylons installed for construction can also be targeted, and a road edge where curbstones are present at regular intervals can likewise be sorted as a single road edge.
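The depth-continuity filtering of unit 542 can be sketched as follows. For simplicity, the processing region toward the vanishing point is approximated here by a straight corridor in the overhead view; the function name, the corridor width, and the (x, y) representation are illustrative assumptions:

```python
def depth_continuity_filter(points, region_half_width=0.5):
    """Keep feature points that continue in the depth direction.

    points: (x, y) positions of road edge candidates for one side in
    the overhead view (y = depth away from the vehicle), ordered by y.
    A point is kept only if some other candidate lies ahead of or
    behind it within a corridor of +/- region_half_width in x, so
    isolated points such as a fallen tire or a small snow patch are
    discarded.
    """
    kept = []
    for i, (x, y) in enumerate(points):
        has_successor = any(
            y2 > y and abs(x2 - x) <= region_half_width
            for x2, y2 in points[i + 1:]
        )
        has_predecessor = any(
            y2 < y and abs(x2 - x) <= region_half_width
            for x2, y2 in points[:i]
        )
        if has_successor or has_predecessor:
            kept.append((x, y))
    return kept
```

A curb line survives the filter, while a single off-line obstacle point does not.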
(Road edge identification unit 600: after processing by the road edge feature three-dimensional sorting unit 500)
 The road edge identification unit 600 uses the grouping results of the road edge feature three-dimensional sorting unit 500 to identify the road edges by assembling the road edge feature points extracted by the road edge feature point extraction unit 300 into a left road edge and a right road edge, respectively.
The road edge identification unit 600 will be described in detail. Based on the sorting results of the road edge feature three-dimensional sorting unit 500, the road edge identification unit 600 connects the road edge feature points constituting the left and right road edges (specifically, it connects the road edge feature points grouped into the same group, the right road edge group or the left road edge group) and identifies the left road edge and the right road edge, respectively. Based on the left/right grouping results of the road edge direction line angle sorting unit 510, the road edge feature points satisfying all three conditions (the left/right grouping result, the height-direction sorting result of the road edge height direction sorting unit 530, and the depth-direction sorting result of the road edge depth direction sorting unit 540) are used. The left and right road edges are identified from these road edge feature points.
This makes it possible to detect the left and right road edges accurately, without misjudging the feature points extracted as road edge feature points.
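The three-condition combination performed by the road edge identification unit 600 can be sketched as a simple conjunction over per-point sorting results. The field names are illustrative assumptions, not identifiers from the embodiment:

```python
def identify_edges(points):
    """Combine the three sorting results to form left/right road edges.

    points: list of dicts with keys 'pos' (overhead-view position),
    'angle_group' ('left' or 'right' from unit 510), 'height_ok'
    (unit 530 result) and 'depth_ok' (unit 540 result). Only points
    satisfying all three conditions are used, split by the
    angle-based left/right grouping.
    """
    left, right = [], []
    for p in points:
        if p["height_ok"] and p["depth_ok"]:
            (left if p["angle_group"] == "left" else right).append(p["pos"])
    return left, right
```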
(Road edge detection determination unit 700)
 Using the road edge information identified by the road edge identification unit 600, the road edge detection determination unit 700 calculates the information, such as the lateral position and yaw angle of the road edge, that is required for road edge departure prevention control and the like, and determines the reliability of the identified road edge. Using the position information of the feature points identified as the road edge, the road edge detection determination unit 700 can also accurately detect the shape of the road edge and predict whether the host vehicle will depart toward the road edge ahead.
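The lateral position and yaw angle mentioned above can be obtained, for example, by fitting a straight line to the identified road edge points. The following sketch assumes a least-squares fit of x = a*y + b in the overhead view; this is one possible realization, and the embodiment does not prescribe a specific method:

```python
import math

def edge_lateral_and_yaw(points):
    """Estimate a road edge's lateral position and yaw angle.

    points: (x, y) positions of one identified road edge in the
    overhead view (y = forward). A line x = a*y + b is fitted by
    least squares; b is the lateral offset of the edge at the vehicle
    position (y = 0) and atan(a) its yaw angle relative to the
    vehicle heading.
    """
    n = len(points)
    mean_x = sum(p[0] for p in points) / n
    mean_y = sum(p[1] for p in points) / n
    sxy = sum((p[1] - mean_y) * (p[0] - mean_x) for p in points)
    syy = sum((p[1] - mean_y) ** 2 for p in points)
    a = sxy / syy if syy else 0.0  # slope: lateral drift per meter forward
    b = mean_x - a * mean_y        # lateral offset at y = 0
    return b, math.atan(a)
```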
(Warning control unit 800)
 Based on the road edge information calculated by the road edge detection determination unit 700, the warning control unit 800 issues warnings and executes control to prevent the host vehicle from departing toward the road edge. By calculating the lateral position and yaw angle of the road edge, road edge departure prevention control can be performed even for complex road edge shapes. Furthermore, when a slip angle or the like occurs in the host vehicle, the road edge detection result may be inaccurate, so it is also possible not to activate control that uses the road edge information.
As described above, the external environment recognition device 1 of this embodiment has a processing unit 900 that performs road edge detection. The processing unit 900 includes: a road edge feature point extraction unit 300 that obtains road edge feature points corresponding to a road edge from the sensing results output by an in-vehicle sensing device that detects information outside the vehicle; a road edge direction line calculation unit 400 that calculates, for each road edge feature point, a line representing the direction of the road edge constituted by the road edge feature points (for example, the normal of the road edge); and a road edge identification unit 600 that groups the road edge feature points into a right road edge group and a left road edge group according to the direction of the line (for example, first/fourth quadrants versus second/third quadrants) and identifies the road edge from the road edge feature points grouped into the same group.
That is, the external environment recognition device 1 of this embodiment obtains road edge feature points corresponding to a road edge from the sensing results output by the in-vehicle sensing device that detects information outside the vehicle, calculates for each road edge feature point a line representing the direction of the road edge constituted by the road edge feature points (for example, the normal of the road edge), groups the road edge feature points into a right road edge group and a left road edge group according to the direction of the line, and identifies the road edge from the road edge feature points grouped into the same group, thereby making it possible to detect the left and right road edges accurately.
According to this embodiment, in the road edge detection processing, a line representing the direction of the road edge constituted by the road edge feature points is calculated for each road edge feature point, and the road edge feature points are grouped into right and left road edges according to the direction of the calculated line, enabling more accurate road edge detection. Furthermore, by accurately detecting the left and right road edges, erroneous control that uses road edge information (for example, erroneous operation of the road edge departure prevention function) can be suppressed.
Note that the present invention is not limited to the embodiment described above and includes various modifications. For example, the above embodiment is described in detail to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to one having all the configurations described.
Further, each of the above configurations, functions, processing units, processing means, and the like may be partially or entirely realized in hardware, for example by designing them as an integrated circuit. Each of the above configurations, functions, and the like may also be realized in software by a processor interpreting and executing a program that realizes each function. Information such as the programs, tables, and files that realize each function can be stored in a memory, in a recording device such as a hard disk or SSD (Solid State Drive), or on a recording medium such as an IC card, SD card, or DVD.
1 ... External environment recognition device
100 ... Stereo camera unit (in-vehicle sensing device)
200 ... Parallax generation unit
300 ... Road edge feature point extraction unit
400 ... Road edge direction line calculation unit
500 ... Road edge feature three-dimensional sorting unit
510 ... Road edge direction line angle sorting unit
511 ... Road edge direction line acquisition unit
512 ... Nearby road edge direction line search unit
513 ... Target direction line angle selection unit
530 ... Road edge height direction sorting unit
531 ... Running road surface height detection unit
532 ... Road edge feature point height detection unit
533 ... Road edge identification threshold calculation unit
534 ... Road edge feature point height selection unit
540 ... Road edge depth direction sorting unit
541 ... Road edge feature point sorting result acquisition unit
542 ... Road edge feature point depth direction selection unit
600 ... Road edge identification unit
700 ... Road edge detection determination unit
800 ... Warning control unit
900 ... Processing unit

Claims (7)

  1.  An external environment recognition device comprising a processing unit that performs road edge detection, the processing unit comprising:
     a road edge feature point extraction unit that obtains road edge feature points corresponding to a road edge from sensing results output by an in-vehicle sensing device that detects information outside the vehicle;
     a road edge direction line calculation unit that calculates, for each of the road edge feature points, a line representing the direction of the road edge constituted by the road edge feature points; and
     a road edge identification unit that groups the road edge feature points into a right road edge group and a left road edge group according to the direction of the line, and identifies the road edge from the road edge feature points grouped into the same group.
  2.  The external environment recognition device according to claim 1, wherein
     the line representing the direction of the road edge is determined from the positional relationship of the road edge feature point to road edge feature points existing around it.
  3.  The external environment recognition device according to claim 1, further comprising:
     a road edge feature point height detection unit that obtains a distribution of heights from the road surface at least at each of the road edge feature points,
     wherein the road edge identification unit determines, when the rate of change in height from the road surface between a road edge feature point and an adjacent road edge feature point is equal to or less than a threshold, that the road edge feature point is a road edge feature point constituting the same road edge as the adjacent road edge feature point.
  4.  The external environment recognition device according to claim 3, wherein
     the threshold is determined according to the flatness of the road surface.
  5.  The external environment recognition device according to claim 1, wherein
     the sensing result is an image, and
     the road edge identification unit determines, when a road edge feature point is located along the direction from an adjacent road edge feature point toward the camera vanishing point on the image, that the road edge feature point is a road edge feature point constituting the same road edge as the adjacent road edge feature point.
  6.  The external environment recognition device according to claim 1,
     wherein the road edge identification unit groups the road edge feature points into a right road edge group and a left road edge group according to the orientation of the lines representing the direction of the road edge, the distribution of the heights of those lines from the road surface, and the continuity of those lines in the depth direction.
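A toy sketch of the left/right grouping in claim 6, using only the lateral sign and the depth-direction continuity of the segments; the claim additionally uses the line orientation and the height distribution, which are omitted here for brevity:

```python
def group_left_right(segments, max_gap=5.0):
    """Split road edge segments into left and right road edge groups.

    segments: list of (depth, lateral, height) tuples in vehicle
              coordinates, with lateral < 0 to the left of the vehicle.
    max_gap: assumed maximum depth gap [m] for a segment to continue an
             existing group; larger gaps are treated as outliers.
    """
    left, right = [], []
    for seg in sorted(segments, key=lambda s: s[0]):  # order by depth
        group = left if seg[1] < 0 else right
        if group and seg[0] - group[-1][0] > max_gap:
            continue  # break in the depth-direction series: drop outlier
        group.append(seg)
    return left, right
```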
  7.  The external environment recognition device according to claim 1,
     wherein the line representing the direction of the road edge is a normal line to the road edge.
PCT/JP2022/011718 2022-03-15 2022-03-15 External environment recognition device WO2023175741A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/011718 WO2023175741A1 (en) 2022-03-15 2022-03-15 External environment recognition device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/011718 WO2023175741A1 (en) 2022-03-15 2022-03-15 External environment recognition device

Publications (1)

Publication Number Publication Date
WO2023175741A1 true WO2023175741A1 (en) 2023-09-21

Family

ID=88022479

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/011718 WO2023175741A1 (en) 2022-03-15 2022-03-15 External environment recognition device

Country Status (1)

Country Link
WO (1) WO2023175741A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017223511A (en) * 2016-06-14 2017-12-21 日本電信電話株式会社 Road structuring device, road structuring method and road structuring program
WO2018047295A1 (en) * 2016-09-09 2018-03-15 三菱電機株式会社 Parking assistance device


Similar Documents

Publication Publication Date Title
EP2767927B1 (en) Road surface information detection apparatus, vehicle device control system employing road surface information detection apparatus, and carrier medium of road surface information detection program
US9569673B2 (en) Method and device for detecting a position of a vehicle on a lane
JP6747269B2 (en) Object recognition device
US8605947B2 (en) Method for detecting a clear path of travel for a vehicle enhanced by object detection
JP4650079B2 (en) Object detection apparatus and method
JP6657789B2 (en) Image processing device, imaging device, device control system, frequency distribution image generation method, and program
US8670592B2 (en) Clear path detection using segmentation-based method
JP3711405B2 (en) Method and system for extracting vehicle road information using a camera
US9360332B2 (en) Method for determining a course of a traffic lane for a vehicle
US11260861B2 (en) Method, device and computer-readable storage medium with instructions for determining the lateral position of a vehicle relative to the lanes on a roadway
JP5145585B2 (en) Target detection device
CN107389084B (en) Driving path planning method and storage medium
EP1901259A1 (en) Vehicle and lane recognizing device
JP2011511281A (en) Map matching method with objects detected by sensors
JP6911312B2 (en) Object identification device
JP6303362B2 (en) MAP MATCHING DEVICE AND NAVIGATION DEVICE HAVING THE SAME
JP6456682B2 (en) Traveling line recognition device
CN103366179A (en) Top-down view classification in clear path detection
Cerri et al. Computer vision at the hyundai autonomous challenge
JP2007264712A (en) Lane detector
JP5974923B2 (en) Road edge detection system, method and program
JP7454685B2 (en) Detection of debris in vehicle travel paths
EP3410345B1 (en) Information processing apparatus and non-transitory recording medium storing thereon a computer program
JP5888275B2 (en) Road edge detection system, method and program
CN115195773A (en) Apparatus and method for controlling vehicle driving and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22932023; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 2024507271; Country of ref document: JP)