WO2021033312A1 - Information output device, automated driving device, and method for information output - Google Patents

Information output device, automated driving device, and method for information output

Info

Publication number
WO2021033312A1
Authority
WO
WIPO (PCT)
Prior art keywords
lane
information
shape
map
vehicle
Prior art date
Application number
PCT/JP2019/032816
Other languages
French (fr)
Japanese (ja)
Inventor
雄治 五十嵐
優子 大田
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2019/032816 priority Critical patent/WO2021033312A1/en
Publication of WO2021033312A1 publication Critical patent/WO2021033312A1/en

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network
    • G01C21/28 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network, with correlation of data from several navigational instruments
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems

Definitions

  • The present invention relates to an information output device, an automated driving device, and an information output method.
  • Patent Document 1 proposes a travel control device that realizes automated driving using lane marking positions detected by a lane marking position sensor device. On roads where lanes, such as lane markings, cannot be seen because of snow or the like, this travel control device can continue automated driving based on the shape of the lane in the map data and the current position of the vehicle.
  • The present invention has been made in view of the above problems, and an object of the present invention is to provide a technique that makes it possible to use map-related information appropriately.
  • The information output device includes: a first acquisition unit that acquires first lane shape information which is generated based on the result of detecting the surroundings of a vehicle and indicates the shape of a first lane around the vehicle; a second acquisition unit that acquires the position of the vehicle; a first generation unit that generates, based on the vehicle position acquired by the second acquisition unit and map data, map vehicle position information indicating the position of the vehicle on the map data and second lane shape information indicating the shape of a second lane around the vehicle among the lanes on the map data; a second generation unit that generates map accuracy information indicating the accuracy of map-related information including at least one of the map vehicle position information and the second lane shape information, based on the difference between the shape of the first lane indicated by the first lane shape information acquired by the first acquisition unit and the shape of the second lane indicated by the second lane shape information generated by the first generation unit; and an output unit that outputs the map-related information and the map accuracy information.
  • In this way, map accuracy information indicating the accuracy of the map-related information is generated based on the difference between the shape of the first lane indicated by the first lane shape information and the shape of the second lane indicated by the second lane shape information, and the map-related information and the map accuracy information are output. With such a configuration, the map-related information can be used appropriately.
  • FIG. 1 is a block diagram showing the configuration of the information output device according to Embodiment 1.
  • FIG. 2 is a block diagram showing the configuration of the information output device according to Embodiment 2.
  • FIG. 3 is a flowchart showing the operation of the information output device according to Embodiment 2.
  • FIG. 4 is a diagram for explaining the operation of the information output device according to Embodiment 2.
  • FIG. 5 is a diagram for explaining the operation of the information output device according to Embodiment 2.
  • FIG. 6 is a diagram for explaining the operation of the information output device according to Modification 1.
  • FIGS. 7 and 8 are block diagrams each showing a hardware configuration of the information output device according to another modification.
  • FIG. 9 is a block diagram showing the configuration of a server according to another modification.
  • FIG. 10 is a block diagram showing the configuration of a communication terminal according to another modification.
  • FIG. 1 is a block diagram showing the configuration of an information output device 30 according to Embodiment 1 of the present invention.
  • Hereinafter, the vehicle on which the information output device 30 is mounted and which is the subject of attention is referred to as the "own vehicle".
  • The information output device 30 of FIG. 1 includes a first acquisition unit 31, a second acquisition unit 32, a first generation unit 34, a second generation unit 35, and an output unit 36.
  • The first acquisition unit 31 acquires detected lane shape information, which is the first lane shape information.
  • The detected lane shape information is generated based on the result of detecting the surroundings of the own vehicle and indicates the shape of the first lane (hereinafter referred to as the "detection lane") around the own vehicle.
  • A lane such as the first lane may be any of various lines of the road, such as a lane marking, or may be the road portion between adjacent lane markings.
  • The surroundings of the own vehicle may be the range from the own vehicle to a position several tens of meters to several hundreds of meters ahead in the traveling direction of the own vehicle, or may be some other range.
  • When an external device (not shown) of the information output device 30 can generate the detected lane shape information based on the result of detecting the surroundings of the own vehicle, the first acquisition unit 31 is, for example, an interface that acquires the detected lane shape information from that external device.
  • On the other hand, when the first acquisition unit 31 itself generates the detected lane shape information based on the result of detecting the surroundings of the own vehicle, the first acquisition unit 31 includes, for example, a detection device that detects the surroundings of the own vehicle and an analysis device that analyzes the detection results of the detection device.
  • For the detection device, for example, an imaging sensor such as a camera or an optical ranging sensor such as LiDAR (Light Detection and Ranging) is used.
  • The second acquisition unit 32 acquires the position of the own vehicle.
  • The position of the own vehicle includes, for example, latitude, longitude, altitude, and vehicle heading.
  • The vehicle heading is the angle between a specific direction (for example, true north) in the world geodetic system and the direction the vehicle is facing, measured clockwise from the specific direction as an angle from 0 to 360 degrees.
  • For the second acquisition unit 32, for example, a device that detects the position of the own vehicle using satellite positioning results obtained from a GNSS (Global Navigation Satellite System) or the like, a device that detects the position of the own vehicle by dead reckoning (inertial navigation) based on the detection results of cameras, sensors, and the like, or an interface to such devices is used.
  • The first generation unit 34 generates map vehicle position information and map lane shape information, which is the second lane shape information, based on the position of the own vehicle acquired by the second acquisition unit 32 and the map data.
  • The map vehicle position information is information indicating the position of the own vehicle on the map data.
  • The map lane shape information is information indicating the shape of the second lane (hereinafter referred to as the "map lane") around the own vehicle among the lanes on the map data.
  • The map data used by the first generation unit 34 is data indicating the shapes and connections of lanes.
  • This map data may be data stored in the memory of the information output device 30, or, when the information output device 30 includes a communication device, data that the communication device receives from a device external to the information output device 30.
  • The second generation unit 35 calculates the difference between the shape of the detection lane indicated by the detected lane shape information acquired by the first acquisition unit 31 and the shape of the map lane indicated by the map lane shape information generated by the first generation unit 34. The second generation unit 35 then generates, based on the calculated difference, map accuracy information indicating the accuracy of map-related information that includes at least one of the map vehicle position information and the map lane shape information.
  • The output unit 36 outputs the map-related information and the map accuracy information.
  • In this way, the map-related information and the map accuracy information indicating the accuracy of the map-related information are output.
  • With such a configuration, the output destination of the information output device 30 can, for example, use at least a part of the map-related information or decide not to use the map-related information, based on the accuracy indicated by the map accuracy information. That is, since the output destination of the information output device 30 can use the map-related information appropriately, robustness can be enhanced.
  • In the following embodiments, the output destination of the information output device 30 is described as a travel control device that controls the travel of the own vehicle, but the output destination is not limited to this.
  • For example, the output destination of the information output device 30 may be a communication device or the like that transmits the map-related information to a server or the like when the accuracy indicated by the map accuracy information is determined to be low.
  • The server referred to here includes, for example, a server that manages map data that needs to be updated from time to time because of construction work or the like.
  • The output destination of the information output device 30 may also be a display device or the like that indicates that automated driving or the like is not appropriate when the accuracy indicated by the map accuracy information is determined to be low.
  • FIG. 2 is a block diagram showing the configuration of an automated driving device including the information output device 30 according to Embodiment 2 of the present invention.
  • Among the components according to Embodiment 2, components that are the same as or similar to those described above are given the same or similar reference numerals, and the description below focuses on the differing components.
  • The automated driving device of FIG. 2 includes an in-vehicle sensor device group 10, a lane shape measuring device 20, the information output device 30, and a travel control device 40.
  • The information output device 30 of FIG. 2 is the same device as the information output device 30 described in Embodiment 1, and outputs the map-related information and the map accuracy information.
  • The information output device 30 of FIG. 2 is connected to the in-vehicle sensor device group 10, the lane shape measuring device 20, and the travel control device 40 by wire or wirelessly.
  • The in-vehicle sensor device group 10 outputs in-vehicle sensor information including at least one of the vehicle speed, yaw rate, satellite positioning position (latitude, longitude, altitude), and time information to the information output device 30 and the travel control device 40.
  • In this embodiment, the in-vehicle sensor device group 10 outputs all of the vehicle speed, yaw rate, satellite positioning position (latitude, longitude, altitude), and time information to the information output device 30 and the travel control device 40 according to specifications predetermined for each item of information.
  • The predetermined specifications here specify, for example, that the vehicle speed and yaw rate are output in a 20-millisecond cycle and that the satellite positioning position and time information are output in a 100-millisecond cycle.
  • The lane shape measuring device 20 generates the detected lane shape information indicating the shape of the detection lane described in Embodiment 1, based on the result of detecting the surroundings of the own vehicle with a camera, LiDAR, or the like mounted on the own vehicle. The lane shape measuring device 20 then outputs the detected lane shape information to the information output device 30 and the travel control device 40 in a fixed cycle (for example, a 100-millisecond cycle).
  • The travel control device 40 controls the travel of the own vehicle based on the in-vehicle sensor information output from the in-vehicle sensor device group 10, the detected lane shape information output from the lane shape measuring device 20, and the map-related information and map accuracy information output from the information output device 30.
  • The information output device 30 of FIG. 2 includes a lane shape acquisition unit 31a, a position calculation unit 32a, a map data storage unit 33, a map vehicle position generation unit 34a, a map lane shape generation unit 34b, a map accuracy generation unit 35a, and an information output unit 36a.
  • The lane shape acquisition unit 31a is included in the concept of the first acquisition unit 31 of FIG. 1, the position calculation unit 32a is included in the concept of the second acquisition unit 32 of FIG. 1, the map vehicle position generation unit 34a and the map lane shape generation unit 34b are included in the concept of the first generation unit 34 of FIG. 1, the map accuracy generation unit 35a is included in the concept of the second generation unit 35 of FIG. 1, and the information output unit 36a is included in the concept of the output unit 36 of FIG. 1.
  • The lane shape acquisition unit 31a acquires the detected lane shape information output from the lane shape measuring device 20.
  • The position calculation unit 32a calculates, in a fixed cycle (for example, a 100-millisecond cycle), the position (latitude, longitude, altitude, vehicle heading) of the own vehicle at the current time based on the in-vehicle sensor information output from the in-vehicle sensor device group 10. For this calculation, for example, satellite positioning results or dead reckoning is used.
  • The position calculation unit 32a may also calculate the position of the own vehicle based on the detected lane shape information acquired by the lane shape acquisition unit 31a.
  • The in-lane position of the own vehicle, in particular its lateral position, obtained by a camera or a sensor and included in the detected lane shape information is more accurate than the satellite positioning position. Therefore, if the detected lane shape information is used in calculating the position of the own vehicle, the accuracy of the calculated position can be expected to improve.
  • The map data storage unit 33 stores map data indicating the shapes and connections of lanes.
  • The map vehicle position generation unit 34a generates map vehicle position information indicating the position of the own vehicle on the map data, for example by performing map matching based on the position of the own vehicle calculated by the position calculation unit 32a and the map data stored in the map data storage unit 33.
  • The map lane shape generation unit 34b searches the lanes on the map data stored in the map data storage unit 33 for the shape of the lane around the position indicated by the map vehicle position information generated by the map vehicle position generation unit 34a, taking it as the shape of the map lane. The map lane shape generation unit 34b then generates map lane shape information indicating the shape of the found map lane.
  • The map accuracy generation unit 35a generates map accuracy information indicating the accuracy of map-related information including at least one of the map vehicle position information and the map lane shape information, based on the difference between the shape of the detection lane indicated by the detected lane shape information acquired by the lane shape acquisition unit 31a and the shape of the map lane indicated by the map lane shape information generated by the map lane shape generation unit 34b.
  • The map accuracy generation unit 35a generates the map accuracy information, for example, at fixed time intervals, at fixed travel-distance intervals, or each time the detected lane shape information is output from the lane shape measuring device 20.
  • The information output unit 36a outputs the map-related information and the map accuracy information to the travel control device 40.
  • The map-related information may be generated by the map accuracy generation unit 35a or by the information output unit 36a.
  • FIG. 3 is a flowchart showing the operation of the information output device 30 according to Embodiment 2.
  • In step S1, the lane shape acquisition unit 31a acquires the detected lane shape information from the lane shape measuring device 20.
  • In step S2, the position calculation unit 32a calculates the position of the own vehicle at the current time based on the in-vehicle sensor information output from the in-vehicle sensor device group 10.
  • In step S3, the map vehicle position generation unit 34a generates the map vehicle position information based on the position of the own vehicle calculated by the position calculation unit 32a and the map data.
  • In step S4, the map lane shape generation unit 34b generates the map lane shape information based on the map vehicle position information generated by the map vehicle position generation unit 34a and the map data.
  • In step S5, the map accuracy generation unit 35a generates the map accuracy information based on the difference between the shape of the detection lane indicated by the detected lane shape information acquired by the lane shape acquisition unit 31a and the shape of the map lane indicated by the map lane shape information generated by the map lane shape generation unit 34b. The generation of the map accuracy information by the map accuracy generation unit 35a is described in detail later.
  • In step S6, the information output unit 36a outputs the map-related information and the map accuracy information to the travel control device 40.
  • The operation of the travel control device 40 is described in detail later. The operation of FIG. 3 then ends. After step S6, the process may return to step S1 as appropriate.
  • FIG. 4 is a conceptual diagram for explaining the process in which, in step S5, the map accuracy generation unit 35a calculates the difference between the shape of the detection lane indicated by the acquired detected lane shape information and the shape of the map lane indicated by the map lane shape information generated by the map lane shape generation unit 34b.
  • The detected lane shape information acquired from the lane shape measuring device 20 by the lane shape acquisition unit 31a represents the shape of the detection lane as a plurality of points in a coordinate system. As this coordinate system, an XY coordinate system is used in which the lane shape measuring device 20 is the origin, the traveling direction of the own vehicle is the Y axis (positive in the traveling direction), and the lateral direction with respect to the traveling direction is the X axis (positive to the right of the traveling direction).
  • The position of each point on the shape of the detection lane is therefore a relative position with the lane shape measuring device 20 as the origin.
  • The map accuracy generation unit 35a converts the coordinate system of the map lane shape information generated by the map lane shape generation unit 34b into the coordinate system of the detected lane shape information generated by the lane shape measuring device 20.
  • For example, when the coordinate system of the map lane shape information is the world geodetic coordinate system (values represented by latitude, longitude, altitude, and vehicle heading) and the coordinate system of the detected lane shape information is the above-described XY coordinate system with the own vehicle as the origin, the world geodetic coordinate system may be converted into the XY coordinate system. This conversion may be performed by combining a translation of the coordinates such that the coordinates (latitude, longitude, altitude) of the own-vehicle position used when generating the map lane shape information become the origin of the XY coordinate system with a rotation of the coordinates such that the vehicle heading indicated by the map lane shape information points in the Y-axis direction of the XY coordinate system.
  • Through this conversion, the position of each point on the shape of the map lane becomes a relative position with the lane shape measuring device 20 as the origin, like the position of each point on the shape of the detection lane.
  • In the above, the absolute coordinate system of the map lane shape information is converted into the relative coordinate system of the detected lane shape information; conversely, the relative coordinate system of the detected lane shape information may be converted into the absolute coordinate system of the map lane shape information. However, in view of the fact that relative coordinates with the own vehicle as the origin are generally used, it is preferable to convert the absolute coordinate system of the map lane shape information into the relative coordinate system of the detected lane shape information.
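  • A minimal sketch of this translation-plus-rotation, assuming the map-lane points have already been expressed as east/north offsets in meters from the own-vehicle position used when generating the map lane shape information, and that the vehicle heading is given in degrees clockwise from north (the function name and data layout are illustrative, not taken from the patent):

```python
import math

def map_points_to_vehicle_xy(map_points_en, heading_deg):
    """Convert map-lane points, given as (east, north) offsets in meters from the
    own-vehicle position, into the vehicle-relative XY frame of the detected lane:
    Y points in the traveling direction, X points to the right of it."""
    heading = math.radians(heading_deg)  # vehicle heading, clockwise from north
    converted = []
    for east, north in map_points_en:
        x = east * math.cos(heading) - north * math.sin(heading)  # right of travel
        y = east * math.sin(heading) + north * math.cos(heading)  # along travel
        converted.append((x, y))
    return converted
```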
  • FIG. 4 shows the lane marking 51, which is the detection lane, and the lane marking 52, which is the map lane, after the above coordinate system conversion has been performed.
  • FIG. 4 also shows a calculation range L, which is the range of the lane shapes over which the difference is to be calculated. If the calculation range L is too large, the detection accuracy of the shape of the detection lane tends to be low, so the maximum distance of the calculation range L is set to, for example, about 100 m.
  • The map accuracy generation unit 35a calculates the average value Davg and the maximum value Dmax of the difference D between the lane marking 51 and the lane marking 52 within the calculation range L.
  • The map accuracy generation unit 35a may calculate these values for the lane marking 51 and the lane marking 52 on only one of the left and right sides within the calculation range L, or for the lane markings 51 and 52 on both the left and right sides within the calculation range L.
  • The map accuracy generation unit 35a then determines the accuracy of the map-related information based on the calculated average value Davg and maximum value Dmax.
  • FIG. 5 is a diagram for explaining this determination by the map accuracy generation unit 35a.
  • In the information of FIG. 5, determination conditions into which the average value Davg and the maximum value Dmax are classified are associated with accuracy levels into which the accuracy of the map-related information is classified.
  • The map accuracy generation unit 35a searches the information of FIG. 5 for the determination condition that the calculated average value Davg and maximum value Dmax satisfy, and determines the accuracy level corresponding to that determination condition as the accuracy of the map-related information.
  • The accuracy level "high" means that the accuracy of the map-related information is high, and the accuracy level "low" means that the accuracy of the map-related information is low.
  • The accuracy level "undetermined" means that the information on the lane marking 51 and the lane marking 52 could not be obtained normally. It applies, for example, when, for at least one of the lane marking 51 and the lane marking 52, the information output device 30 has obtained only one of the left and right lane markings, or has obtained neither of them.
  • In FIG. 5, the accuracy level is classified into four levels (high, medium, low, undetermined), but the number of levels is not limited to four.
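  • One way to picture this determination is the following sketch, which computes Davg and Dmax over the calculation range and maps them to an accuracy level. It assumes the two markings are sampled at corresponding points, and the numeric thresholds are placeholders, since the concrete determination conditions of FIG. 5 are not given in the text.

```python
def lane_difference_stats(detected_pts, map_pts, calc_range_m=100.0):
    """Average (Davg) and maximum (Dmax) of the lateral difference D between the
    detection lane marking 51 and the map lane marking 52, both given as (x, y)
    points in the vehicle-relative XY frame and sampled at corresponding points,
    restricted to the calculation range L along the Y axis."""
    diffs = [abs(dx - mx)
             for (dx, dy), (mx, _my) in zip(detected_pts, map_pts)
             if 0.0 <= dy <= calc_range_m]
    if not diffs:
        return None, None  # nothing usable -> "undetermined"
    return sum(diffs) / len(diffs), max(diffs)


def accuracy_level(d_avg, d_max):
    """Map (Davg, Dmax) to an accuracy level; the thresholds are illustrative only."""
    if d_avg is None or d_max is None:
        return "undetermined"
    if d_avg < 0.3 and d_max < 0.5:
        return "high"
    if d_avg < 1.0 and d_max < 1.5:
        return "medium"
    return "low"
```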
  • The map accuracy generation unit 35a generates information indicating the accuracy determined as described above as the map accuracy information indicating the accuracy of the map-related information.
  • In step S6 of FIG. 3, the travel control device 40 receives the map-related information and the map accuracy information output from the information output device 30.
  • The travel control device 40 determines, based on the accuracy indicated by the received map accuracy information, whether to execute automated driving using the received map-related information, and performs travel control related to automated driving based on the determination result. For example, when the map accuracy information indicates "high", the travel control device 40 performs travel control related to automated driving using the map-related information, and when the map accuracy information indicates "low", the travel control device 40 stops travel control related to automated driving that uses the map-related information.
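  • A minimal sketch of such a decision on the travel control side, assuming the accuracy level is delivered as a string; how levels other than "high" and "low" are handled is an assumption here, not something the text specifies.

```python
def should_use_map_info(accuracy_level):
    """Decide whether travel control related to automated driving may use the
    map-related information: use it when the level is "high"; otherwise
    (including "low" and "undetermined") fall back to not using it."""
    return accuracy_level == "high"
```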
  • <Modification 1> For the calculation range L in FIG. 4, it is desirable to use the range in which the accuracy of the lane shape measuring device 20 is best. For example, when the lane shape measuring device 20 generates the detected lane shape information based on camera images, the detection accuracy of the shape of the detection lane at a place where the road has curvature, such as a curve, tends to be lower than the detection accuracy of the shape of the detection lane on a straight road.
  • Therefore, the map accuracy generation unit 35a may change the calculation range L based on the curvature of the shape of the lane marking 51, which is the detection lane.
  • The curvature here includes not only the curvature itself but also the radius of curvature, which is its reciprocal.
  • In the following, a case where the map accuracy generation unit 35a changes the calculation range L based on the radius of curvature of the shape of the lane marking 51 is described as an example.
  • FIG. 6 is a diagram for explaining the change of the calculation range L by the map accuracy generation unit 35a.
  • In the information of FIG. 6, the radius of curvature R [m] of the shape of the lane marking 51 is associated with the calculation range L, shown as a range along the Y axis.
  • The map accuracy generation unit 35a calculates the radius of curvature of the shape of the lane marking 51, searches the information of FIG. 6 for the calculation range L corresponding to the calculated radius of curvature, and uses the found calculation range L to calculate the difference between the shape of the lane marking 51 and the shape of the lane marking 52.
  • With the above configuration, it is possible to suppress the influence that the drop in detection accuracy caused by a curve or the like has on the accuracy indicated by the map accuracy information.
  • For example, when the radius of curvature R of the lane marking 51 is 1000 m or more, the road up to 50 m ahead of the own vehicle is regarded as a straight road, and the difference for the lane marking 51 is calculated from the position of the own vehicle to 50 m ahead.
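  • A sketch of this range selection is shown below; only the 1000 m / 50 m pair comes from the example above, and the remaining table rows are illustrative placeholders.

```python
# (minimum radius of curvature R [m], calculation range L [m]).
# Only the first row reflects the example in the text; the others are placeholders.
CALC_RANGE_TABLE = [
    (1000.0, 50.0),
    (500.0, 30.0),
    (0.0, 15.0),
]

def calc_range_for_radius(radius_m):
    """Pick the calculation range L for the radius of curvature of the detected
    lane marking 51, using a longer range on straighter roads."""
    for min_radius, calc_range in CALC_RANGE_TABLE:
        if radius_m >= min_radius:
            return calc_range
    return CALC_RANGE_TABLE[-1][1]
```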
  • It is preferable that the information output device 30 notifies the travel control device 40, which automatically drives the own vehicle, that the accuracy of the map-related information is low before the own vehicle enters a section in which the accuracy of the map-related information is low. For this purpose, the near end of the calculation range L on the own-vehicle side may be set apart from the own vehicle. With such a configuration, the travel control device 40 can take appropriate measures, such as shifting to an automated driving mode that does not use the map-related information, before the own vehicle enters that section.
  • In the above description, the map accuracy generation unit 35a changes the calculation range L based on the curvature (radius of curvature) of the shape of the lane marking 51, which is the detection lane, but the calculation range L may instead be changed based on the curvature of the shape of the lane marking 52, which is the map lane.
  • The representation format in which the detected lane shape information expresses the shape of the detection lane and the representation format in which the map lane shape information expresses the shape of the map lane may differ.
  • For the position of the detection lane indicated by the detected lane shape information output from the lane shape measuring device 20, various positions are assumed to be used, such as the position of the edge of the detection lane on the own-vehicle side, the center position of the detection lane in the width direction, or the position of the edge of the detection lane opposite the own vehicle.
  • Therefore, the map accuracy generation unit 35a may obtain the above difference using a predetermined lane marking width W.
  • For example, suppose that, of the position of the lane marking 51 (the detection lane) in the detected lane shape information and the position of the lane marking 52 (the map lane) in the map lane shape information, one is the center position of the lane marking in the width direction while the other is an edge position.
  • In this case, the map accuracy generation unit 35a may use the average value and the maximum value of the values obtained by subtracting W/2 from the difference D between the shape of the lane marking 51 and the shape of the lane marking 52 as the average value Davg and the maximum value Dmax described in Embodiment 2, respectively.
  • Since the lane marking width W differs depending on the region or country, among the lane marking widths indicated in the map data (map lane shape information), the width corresponding to the position of the own vehicle may be used in the above calculation. On a general road in Japan, the lane marking width W is about 15 cm.
  • According to the above configuration, the accuracy of the difference between the shape of the lane marking 51 output by the lane shape measuring device 20 and the shape of the lane marking 52 in the map data can be further improved, and therefore the accuracy indicated by the map accuracy information can be improved.
  • In addition, a portion of the image data that is not a lane marking, such as a channelizing strip (zebra zone) or a curb, may be erroneously detected as a lane marking, and the influence of such false detections can be expected to be reduced.
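  • Assuming, purely for illustration, that one shape is given by an edge of the painted marking and the other by its center, the W/2 correction could be applied to the raw differences as follows; the clamp at zero is an added safeguard, not something stated above.

```python
def corrected_difference_stats(raw_diffs, marking_width_m=0.15):
    """Subtract half the lane marking width W from each raw difference D before
    computing Davg and Dmax, to cancel the offset caused by differing
    representation formats (edge position versus center position of the marking)."""
    adjusted = [max(d - marking_width_m / 2.0, 0.0) for d in raw_diffs]
    if not adjusted:
        return None, None
    return sum(adjusted) / len(adjusted), max(adjusted)
```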
  • In the above description, the map accuracy generation unit 35a reduces the influence that the difference between the measurement specifications of the lane shape measuring device 20 and the specifications (representation formats) of the map data has on the map accuracy information by using the lane marking width W, but the method is not limited to this.
  • For example, the map accuracy generation unit 35a may obtain, as the above difference, the difference between the shape of the center line of the two lane markings 51 and the shape of the center line of the two lane markings 52.
  • That is, the map accuracy generation unit 35a treats the difference between a first lane center line calculated from the two lane markings 51 and a second lane center line calculated from the two lane markings 52 in the same way as the difference D between the lane marking 51 and the lane marking 52. At this time, it is preferable that the map accuracy generation unit 35a calculates the first lane center line from the midpoints of pairs of points at the shortest distance between the two lane markings 51, and calculates the second lane center line from the midpoints of pairs of points at the shortest distance between the two lane markings 52.
  • Since the first lane center line and the second lane center line both lie midway between the left and right lane markings, the representation formats of the first lane center line and the second lane center line can be made the same even if the representation formats of the lane marking 51 and the lane marking 52 differ slightly. This reduces the influence that the difference between the measurement specifications of the lane shape measuring device 20 and the specifications of the map data has on the map accuracy information. Therefore, according to the above configuration, the accuracy of the difference between the shape of the lane marking 51 output by the lane shape measuring device 20 and the shape of the lane marking 52 in the map data can be further improved, and thus the accuracy indicated by the map accuracy information can be improved.
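  • A rough sketch of building such a center line, assuming each lane is described by its left and right marking as point lists in the same XY frame (pairing by nearest point is one illustrative way to find the shortest-distance pairs):

```python
import math

def lane_center_line(left_pts, right_pts):
    """Approximate a lane center line: for each point of the left marking, find the
    closest point of the right marking and take the midpoint of that shortest-distance
    pair, as suggested for the first and second lane center lines."""
    center = []
    for lx, ly in left_pts:
        rx, ry = min(right_pts, key=lambda p: math.hypot(p[0] - lx, p[1] - ly))
        center.append(((lx + rx) / 2.0, (ly + ry) / 2.0))
    return center
```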
  • The measurement time required for the lane shape measuring device 20 to measure the shape of the detection lane may be long.
  • In that case, the time of the detected lane shape information used for calculating the difference may lag behind the time of the map lane shape information used for calculating the difference.
  • As a result, the lane marking 51, which is the detection lane indicated by the detected lane shape information, may be shifted in the traveling direction of the own vehicle (the plus direction of the Y axis in FIG. 4) relative to the lane marking 52, which is the map lane indicated by the map lane shape information.
  • Therefore, the map accuracy generation unit 35a may obtain the above difference after shifting the shape of the lane marking 51 relative to the shape of the lane marking 52 based on the generation time T of the detected lane shape information and the speed of the own vehicle. For example, the value of the generation time T required for the lane shape measuring device 20 to generate the detected lane shape information is obtained in advance by experiments or the like and stored as a parameter in the map accuracy generation unit 35a. Then, in step S5 of FIG. 3, the map accuracy generation unit 35a multiplies the generation time T by the vehicle speed V [m/s] of the own vehicle at the current time to obtain the distance Ldelay [m].
  • After converting the lane marking 52 indicated by the map lane shape information into the same XY coordinate system as the lane marking 51 indicated by the detected lane shape information, the map accuracy generation unit 35a translates the lane marking 51 by the distance Ldelay in the minus direction of the Y axis.
  • The generation time T may include the communication delay time from when the detected lane shape information is output from the lane shape measuring device 20 until it is received by the information output device 30.
  • With such a configuration, the influence that the generation time T of the lane shape measuring device 20 has on the map accuracy information can be reduced, so the accuracy indicated by the map accuracy information can be improved.
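  • A sketch of this delay compensation, treating the generation time T as a pre-measured parameter:

```python
def compensate_measurement_delay(detected_pts, vehicle_speed_mps, generation_time_s):
    """Shift the detected lane marking 51 by Ldelay = T * V in the minus Y direction
    so that it lines up with the map lane marking 52 despite the measurement delay."""
    l_delay = generation_time_s * vehicle_speed_mps  # Ldelay [m]
    return [(x, y - l_delay) for (x, y) in detected_pts]
```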
  • In the above description, the map accuracy generation unit 35a generates the map accuracy information indicating the accuracy of the map-related information from the average value Davg and the maximum value Dmax of the difference D between the lane marking 51 and the lane marking 52 using the information shown in FIG. 5, but the map accuracy information is not limited to this. For example, the map accuracy generation unit 35a may generate the average value Davg and the maximum value Dmax themselves as the map accuracy information.
  • The first acquisition unit 31, the second acquisition unit 32, the first generation unit 34, the second generation unit 35, and the output unit 36 of FIG. 1 described above are hereinafter referred to as the "first acquisition unit 31 and the like".
  • The first acquisition unit 31 and the like are realized by a processing circuit 81 shown in FIG. 7. That is, the processing circuit 81 includes: a first acquisition unit 31 that acquires first lane shape information which is generated based on the result of detecting the surroundings of the vehicle and indicates the shape of a first lane around the vehicle; a second acquisition unit 32 that acquires the position of the vehicle; a first generation unit 34 that generates, based on the vehicle position acquired by the second acquisition unit 32 and the map data, map vehicle position information indicating the position of the vehicle on the map data and second lane shape information indicating the shape of a second lane around the vehicle among the lanes on the map data; a second generation unit 35 that generates map accuracy information indicating the accuracy of map-related information including at least one of the map vehicle position information and the second lane shape information, based on the difference between the shape of the first lane indicated by the first lane shape information acquired by the first acquisition unit 31 and the shape of the second lane indicated by the second lane shape information generated by the first generation unit 34; and an output unit 36 that outputs the map-related information and the map accuracy information.
  • Dedicated hardware may be applied to the processing circuit 81, or a processor that executes a program stored in memory may be applied. Examples of the processor include a central processing unit (CPU), a processing unit, an arithmetic unit, a microprocessor, a microcomputer, and a DSP (Digital Signal Processor).
  • When the processing circuit 81 is dedicated hardware, the processing circuit 81 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination of these.
  • The functions of the respective units such as the first acquisition unit 31 may each be realized by separate processing circuits, or the functions may be realized collectively by one processing circuit.
  • When the processing circuit 81 is a processor, the functions of the first acquisition unit 31 and the like are realized in combination with software or the like.
  • The software or the like corresponds to, for example, software, firmware, or both software and firmware.
  • The software or the like is written as a program and stored in memory.
  • The processor 82 applied to the processing circuit 81 realizes the functions of the respective units by reading and executing the program stored in the memory 83. That is, the information output device 30 includes the memory 83 for storing a program that, when executed by the processing circuit 81, results in the execution of: a step of acquiring first lane shape information which is generated based on the result of detecting the surroundings of the vehicle and indicates the shape of a first lane around the vehicle; a step of acquiring the position of the vehicle; a step of generating, based on the acquired vehicle position and the map data, map vehicle position information indicating the position of the vehicle on the map data and second lane shape information indicating the shape of a second lane around the vehicle among the lanes on the map data; a step of generating map accuracy information indicating the accuracy of map-related information including at least one of the map vehicle position information and the second lane shape information, based on the difference between the shape of the first lane and the shape of the second lane; and a step of outputting the map-related information and the map accuracy information.
  • The memory 83 may be, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), a drive device for these, or any storage medium that may be used in the future.
  • The configuration in which each function of the first acquisition unit 31 and the like is realized by either hardware or software has been described above. However, the present invention is not limited to this, and a part of the first acquisition unit 31 and the like may be realized by dedicated hardware while another part is realized by software or the like.
  • For example, the functions of the first acquisition unit 31 and the second acquisition unit 32 can be realized by the processing circuit 81 as dedicated hardware, such as an interface or a receiver, while the other functions can be realized by the processing circuit 81 as the processor 82 reading and executing the program stored in the memory 83.
  • As described above, the processing circuit 81 can realize each of the above functions by hardware, software, or a combination thereof.
  • The information output device 30 described above can also be applied to an information output system constructed by appropriately combining a vehicle device such as a PND (Portable Navigation Device), a navigation device, or a DMS (Driver Monitoring System) device, a communication terminal including a mobile terminal such as a mobile phone, a smartphone, or a tablet, the functions of applications installed in at least one of the vehicle device and the communication terminal, and a server. In this case, each function or each component of the information output device 30 described above may be distributed among the devices that construct the system, or may be concentrated in any one of those devices.
  • FIG. 9 is a block diagram showing the configuration of a server 91 according to this modification.
  • The server 91 of FIG. 9 includes a communication unit 91a and a control unit 91b, and can perform wireless communication with a vehicle device 93 of a vehicle 92.
  • The communication unit 91a, which serves as the first acquisition unit and the second acquisition unit, receives the first lane shape information acquired by the vehicle device 93 and the position of the vehicle 92 by performing wireless communication with the vehicle device 93.
  • The control unit 91b has the same functions as the first generation unit 34 and the second generation unit 35 of FIG. 1, realized by a processor (not shown) of the server 91 executing a program stored in a memory (not shown) of the server 91. That is, the control unit 91b generates the map vehicle position information and the second lane shape information based on the vehicle position received by the communication unit 91a and the map data. The control unit 91b then generates map accuracy information indicating the accuracy of the map-related information based on the difference between the shape of the first lane indicated by the first lane shape information received by the communication unit 91a and the shape of the second lane indicated by the second lane shape information generated by the control unit 91b.
  • The communication unit 91a, which serves as the output unit, transmits the map-related information and the map accuracy information to the vehicle device 93. With the server 91 configured in this way, the same effects as those of the information output device 30 described in Embodiment 1 can be obtained.
  • FIG. 10 is a block diagram showing the configuration of a communication terminal 96 according to this modification.
  • The communication terminal 96 of FIG. 10 includes a communication unit 96a similar to the communication unit 91a and a control unit 96b similar to the control unit 91b, and can perform wireless communication with a vehicle device 98 of a vehicle 97.
  • For example, a mobile terminal such as a mobile phone, smartphone, or tablet carried by the driver of the vehicle 97 is used as the communication terminal 96.
  • With the communication terminal 96 configured in this way, the same effects as those of the information output device 30 described in Embodiment 1 can be obtained.
  • The embodiments and modifications described above can be freely combined, and each embodiment and each modification can be modified or omitted as appropriate.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The purpose of the present invention is to provide a technology that makes it possible to use map-related information appropriately. This information output device comprises: a second generation unit that generates map accuracy information indicating the accuracy of map-related information containing at least one of map vehicle position information and second lane shape information, on the basis of the difference between the shape of a first lane indicated by first lane shape information acquired by a first acquisition unit and the shape of a second lane indicated by the second lane shape information generated by a first generation unit; and an output unit that outputs the map-related information and the map accuracy information.

Description

Information output device, automated driving device, and information output method
 The present invention relates to an information output device, an automated driving device, and an information output method.
 In recent years, various technologies have been proposed for automated driving devices that automatically drive vehicles. For example, Patent Document 1 proposes a travel control device that realizes automated driving using lane marking positions detected by a lane marking position sensor device. On roads where lanes, such as lane markings, cannot be seen because of snow or the like, this travel control device can continue automated driving based on the shape of the lane in the map data and the current position of the vehicle.
Japanese Patent No. 6438516
 However, in actual operation, the lane marking position sensor device may produce false detections, or the map data may differ significantly from the actual road shape. In such cases, if the shape of the lane and the current position of the vehicle on the map data, that is, the map-related information, is used for automated driving or the like, appropriate automated driving cannot be realized.
 Therefore, the present invention has been made in view of the above problems, and an object of the present invention is to provide a technique that makes it possible to use map-related information appropriately.
 The information output device according to the present invention includes: a first acquisition unit that acquires first lane shape information which is generated based on the result of detecting the surroundings of a vehicle and indicates the shape of a first lane around the vehicle; a second acquisition unit that acquires the position of the vehicle; a first generation unit that generates, based on the vehicle position acquired by the second acquisition unit and map data, map vehicle position information indicating the position of the vehicle on the map data and second lane shape information indicating the shape of a second lane around the vehicle among the lanes on the map data; a second generation unit that generates map accuracy information indicating the accuracy of map-related information including at least one of the map vehicle position information and the second lane shape information, based on the difference between the shape of the first lane indicated by the first lane shape information acquired by the first acquisition unit and the shape of the second lane indicated by the second lane shape information generated by the first generation unit; and an output unit that outputs the map-related information and the map accuracy information.
 According to the present invention, map accuracy information indicating the accuracy of the map-related information is generated based on the difference between the shape of the first lane indicated by the first lane shape information acquired by the first acquisition unit and the shape of the second lane indicated by the second lane shape information generated by the first generation unit, and the map-related information and the map accuracy information are output. With such a configuration, the map-related information can be used appropriately.
 The objects, features, aspects, and advantages of the present invention will become more apparent from the following detailed description and the accompanying drawings.
FIG. 1 is a block diagram showing the configuration of the information output device according to Embodiment 1.
FIG. 2 is a block diagram showing the configuration of the information output device according to Embodiment 2.
FIG. 3 is a flowchart showing the operation of the information output device according to Embodiment 2.
FIG. 4 is a diagram for explaining the operation of the information output device according to Embodiment 2.
FIG. 5 is a diagram for explaining the operation of the information output device according to Embodiment 2.
FIG. 6 is a diagram for explaining the operation of the information output device according to Modification 1.
FIG. 7 is a block diagram showing a hardware configuration of the information output device according to another modification.
FIG. 8 is a block diagram showing a hardware configuration of the information output device according to another modification.
FIG. 9 is a block diagram showing the configuration of a server according to another modification.
FIG. 10 is a block diagram showing the configuration of a communication terminal according to another modification.
 <Embodiment 1>
 FIG. 1 is a block diagram showing the configuration of an information output device 30 according to Embodiment 1 of the present invention. Hereinafter, the vehicle on which the information output device 30 is mounted and which is the subject of attention is referred to as the "own vehicle".
 図1の情報出力装置30は、第1取得部31と、第2取得部32と、第1生成部34と、第2生成部35と、出力部36とを備える。 The information output device 30 of FIG. 1 includes a first acquisition unit 31, a second acquisition unit 32, a first generation unit 34, a second generation unit 35, and an output unit 36.
 第1取得部31は、第1車線形状情報である検出車線形状情報を取得する。検出車線形状情報は、自車両周辺を検出した結果に基づいて生成され、自車両周辺の第1車線(以下「検出車線」と記す)の形状を示す情報である。第1車線などの車線は、区画線などの道路の様々な線であってもよいし、隣り合う区画線同士の間の道路部分であってもよい。自車両周辺は、自車両から自車両の進行方向に数十m~数百m程度進んだ位置までの範囲であってもよいし、これ以外の範囲であってもよい。 The first acquisition unit 31 acquires the detected lane shape information which is the first lane shape information. The detected lane shape information is generated based on the result of detecting the vicinity of the own vehicle, and is information indicating the shape of the first lane (hereinafter referred to as "detection lane") around the own vehicle. The lane such as the first lane may be various lines of the road such as a lane marking, or may be a road portion between adjacent lane markings. The periphery of the own vehicle may be a range from the own vehicle to a position advanced by about several tens of meters to several hundreds of meters in the traveling direction of the own vehicle, or may be a range other than this.
 情報出力装置30の図示しない外部装置が、自車両周辺を検出した結果に基づいて検出車線形状情報を生成可能である場合、第1取得部31には、例えば、当該外部装置から検出車線形状情報を取得するインターフェースが用いられる。一方、第1取得部31自体が、自車両周辺を検出した結果に基づいて検出車線形状情報を生成可能である場合、第1取得部31には、例えば、自車両周辺を検出する検出装置、及び、当該検出装置の検出結果を解析する解析装置などが用いられる。ここでいう検出装置には、例えば、カメラなどの撮像センサ、または、LiDAR(Light Detection and Ranging)などの光学測距センサなどが用いられる。 When the external device (not shown) of the information output device 30 can generate the detected lane shape information based on the result of detecting the vicinity of the own vehicle, the first acquisition unit 31 can receive, for example, the detected lane shape information from the external device. The interface to get is used. On the other hand, when the first acquisition unit 31 itself can generate the detection lane shape information based on the result of detecting the vicinity of the own vehicle, the first acquisition unit 31 may include, for example, a detection device for detecting the vicinity of the own vehicle. In addition, an analysis device that analyzes the detection result of the detection device is used. As the detection device referred to here, for example, an image sensor such as a camera, an optical range finder such as LiDAR (Light Detection and Ringing), or the like is used.
 第2取得部32は、自車両の位置を取得する。自車両の位置は、例えば、緯度、経度、高度、車頭方位などを含む。車頭方位は、世界測地系での特定の方向(例えば真北)と車頭の方位との間の角度であり、特定の方向から時計回りに測定される0度から360度までの角度である。第2取得部32には、例えば、GNSS(Global Navigation Satellite System)などから得られる衛星測位結果を用いて自車両の位置を検出する装置、カメラやセンサなどの検出結果に慣性航法(Dead Reckoning)を用いて自車両の位置を検出する装置、または、これらのインターフェースなどが用いられる。 The second acquisition unit 32 acquires the position of the own vehicle. The position of the own vehicle includes, for example, latitude, longitude, altitude, head direction, and the like. The head direction is the angle between a specific direction (for example, due north) in the world geodetic system and the head direction, and is an angle from 0 degrees to 360 degrees measured clockwise from the specific direction. The second acquisition unit 32 includes, for example, a device that detects the position of the own vehicle using satellite positioning results obtained from GNSS (Global Navigation Satellite System), etc., and inertial navigation (Dead Reckoning) based on the detection results of a camera, a sensor, or the like. A device for detecting the position of the own vehicle using the above, or an interface thereof or the like is used.
 第1生成部34は、第2取得部32で取得された自車両の位置と、地図データとに基づいて、地図車両位置情報と、第2車線形状情報である地図車線形状情報とを生成する。地図車両位置情報は、地図データ上における自車両の位置を示す情報である。地図車線形状情報は、地図データ上の車線のうち自車両周辺の第2車線(以下「地図車線」と記す)の形状を示す情報である。 The first generation unit 34 generates the map vehicle position information and the map lane shape information which is the second lane shape information based on the position of the own vehicle acquired by the second acquisition unit 32 and the map data. .. The map vehicle position information is information indicating the position of the own vehicle on the map data. The map lane shape information is information indicating the shape of the second lane (hereinafter referred to as "map lane") around the own vehicle among the lanes on the map data.
 The map data used by the first generation unit 34 is data indicating the shapes and connections of lanes. This map data may be data stored in the memory of the information output device 30, or, when the information output device 30 includes a communication device, data that the communication device receives from an external device of the information output device 30.
 The second generation unit 35 calculates the difference between the shape of the detection lane indicated by the detected lane shape information acquired by the first acquisition unit 31 and the shape of the map lane indicated by the map lane shape information generated by the first generation unit 34. Then, based on the calculated difference, the second generation unit 35 generates map accuracy information indicating the accuracy of the map-related information including at least one of the map vehicle position information and the map lane shape information.
 出力部36は、地図関連情報及び地図精度情報を出力する。 The output unit 36 outputs map-related information and map accuracy information.
 <実施の形態1のまとめ>
 以上のような本実施の形態1に係る情報出力装置30によれば、地図関連情報と、当該地図関連情報の精度を示す地図精度情報とを出力する。このような構成によれば、情報出力装置30の出力先は、例えば、地図精度情報が示す精度に基づいて地図関連情報の少なくとも一部を使用したり、地図関連情報を使用しなかったりすることができる。つまり、情報出力装置30の出力先は、地図関連情報を適切に使用することができるので、ロバスト性を高めることができる。
<Summary of Embodiment 1>
 According to the information output device 30 according to the first embodiment described above, the map-related information and the map accuracy information indicating the accuracy of the map-related information are output. With such a configuration, the output destination of the information output device 30 can, for example, use at least a part of the map-related information or refrain from using the map-related information, depending on the accuracy indicated by the map accuracy information. That is, since the output destination of the information output device 30 can use the map-related information appropriately, robustness can be enhanced.
 なお、以下の実施の形態2などでは、情報出力装置30の出力先は、自車両の走行制御を行う走行制御装置であるものとして説明するが、これに限ったものではない。例えば、情報出力装置30の出力先は、地図精度情報が示す精度が低いと判定された場合にサーバなどに地図関連情報を送信する通信装置などであってもよい。ここでいうサーバは、例えば工事等によって地図データの更新が随時必要な、地図データを管理するサーバを含む。また例えば、情報出力装置30の出力先は、地図精度情報が示す精度が低いと判定された場合に自動運転などが適切でないことを表示する表示装置などであってもよい。 In the following second embodiment and the like, the output destination of the information output device 30 will be described as being a travel control device that controls the travel of the own vehicle, but the present invention is not limited to this. For example, the output destination of the information output device 30 may be a communication device or the like that transmits map-related information to a server or the like when it is determined that the accuracy indicated by the map accuracy information is low. The server referred to here includes a server that manages map data, which needs to be updated at any time due to construction work or the like. Further, for example, the output destination of the information output device 30 may be a display device or the like that displays that automatic operation or the like is not appropriate when it is determined that the accuracy indicated by the map accuracy information is low.
 <実施の形態2>
 図2は、本発明の実施の形態2に係る情報出力装置30を備える自動運転装置の構成を示すブロック図である。以下、本実施の形態2に係る構成要素のうち、上述の構成要素と同じまたは類似する構成要素については同じまたは類似する参照符号を付し、異なる構成要素について主に説明する。
<Embodiment 2>
FIG. 2 is a block diagram showing a configuration of an automatic driving device including the information output device 30 according to the second embodiment of the present invention. Hereinafter, among the components according to the second embodiment, the components that are the same as or similar to the above-mentioned components are designated by the same or similar reference numerals, and different components will be mainly described.
 図2の自動運転装置は、車載センサ装置群10と、車線形状測定装置20と、情報出力装置30と、走行制御装置40とを備える。図2の情報出力装置30は、実施の形態1で説明した情報出力装置30と同様の装置であり、地図関連情報及び地図精度情報を出力する。図2の情報出力装置30は、有線または無線を介して、車載センサ装置群10、車線形状測定装置20、及び、走行制御装置40と接続されている。 The automatic driving device of FIG. 2 includes an in-vehicle sensor device group 10, a lane shape measuring device 20, an information output device 30, and a traveling control device 40. The information output device 30 of FIG. 2 is the same device as the information output device 30 described in the first embodiment, and outputs map-related information and map accuracy information. The information output device 30 of FIG. 2 is connected to the vehicle-mounted sensor device group 10, the lane shape measuring device 20, and the traveling control device 40 via a wire or wirelessly.
 The vehicle-mounted sensor device group 10 outputs vehicle-mounted sensor information including at least one of vehicle speed, yaw rate, satellite positioning position (latitude, longitude, altitude), and time information to the information output device 30 and the travel control device 40. In the second embodiment, the vehicle-mounted sensor device group 10 outputs all of the vehicle speed, the yaw rate, the satellite positioning position (latitude, longitude, altitude), and the time information to the information output device 30 and the travel control device 40 in accordance with specifications predetermined for each information item. The predetermined specifications referred to here specify, for example, that the vehicle speed and the yaw rate are output at a 20-millisecond cycle and that the satellite positioning position and the time information are output at a 100-millisecond cycle.
 車線形状測定装置20は、自車両に搭載されたカメラやLiDARなどで自車両周辺を検出した結果に基づいて、実施の形態1で説明した検出車線の形状を示す検出車線形状情報を生成する。そして、車線形状測定装置20は、当該検出車線形状情報を一定周期(例えば100ミリ秒周期)で情報出力装置30及び走行制御装置40に出力する。 The lane shape measuring device 20 generates detection lane shape information indicating the shape of the detection lane described in the first embodiment based on the result of detecting the vicinity of the own vehicle with a camera mounted on the own vehicle, LiDAR, or the like. Then, the lane shape measuring device 20 outputs the detected lane shape information to the information output device 30 and the traveling control device 40 at a fixed cycle (for example, a cycle of 100 milliseconds).
 The travel control device 40 controls the travel of the own vehicle based on the vehicle-mounted sensor information output from the vehicle-mounted sensor device group 10, the detected lane shape information output from the lane shape measuring device 20, and the map-related information and the map accuracy information output from the information output device 30.
 Next, the configuration of the information output device 30 according to the second embodiment shown in FIG. 2 will be described. The information output device 30 of FIG. 2 includes a lane shape acquisition unit 31a, a position calculation unit 32a, a map data storage unit 33, a map vehicle position generation unit 34a, a map lane shape generation unit 34b, a map accuracy generation unit 35a, and an information output unit 36a. The lane shape acquisition unit 31a is included in the concept of the first acquisition unit 31 in FIG. 1, the position calculation unit 32a is included in the concept of the second acquisition unit 32 in FIG. 1, and the map vehicle position generation unit 34a and the map lane shape generation unit 34b are included in the concept of the first generation unit 34 in FIG. 1. Further, the map accuracy generation unit 35a is included in the concept of the second generation unit 35 in FIG. 1, and the information output unit 36a is included in the concept of the output unit 36 in FIG. 1.
 車線形状取得部31aは、車線形状測定装置20から出力された検出車線形状情報を取得する。 The lane shape acquisition unit 31a acquires the detected lane shape information output from the lane shape measuring device 20.
 The position calculation unit 32a calculates the position (latitude, longitude, altitude, vehicle heading) of the own vehicle at the current time at a fixed cycle (for example, a 100-millisecond cycle) based on the vehicle-mounted sensor information output from the vehicle-mounted sensor device group 10. For the calculation of the position of the own vehicle by the position calculation unit 32a, for example, satellite positioning results or dead reckoning is used.
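 For illustration only, the following is a minimal Python sketch of such a dead-reckoning update from speed and yaw rate; the function name, the heading convention, and the update period are assumptions and are not part of the embodiment.

```python
import math

def dead_reckoning_step(x, y, heading_rad, speed_mps, yaw_rate_rps, dt=0.1):
    """One dead-reckoning update by simple Euler integration.

    x, y         : position in a local metric frame [m]
    heading_rad  : vehicle heading [rad], 0 = +Y axis, clockwise positive
    speed_mps    : vehicle speed from the on-board sensors [m/s]
    yaw_rate_rps : yaw rate from the on-board sensors [rad/s]
    dt           : update period [s] (100 ms in this sketch)
    """
    heading_rad += yaw_rate_rps * dt
    x += speed_mps * math.sin(heading_rad) * dt   # lateral (X) component
    y += speed_mps * math.cos(heading_rad) * dt   # longitudinal (Y) component
    return x, y, heading_rad
```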
 なお、位置算出部32aは、車線形状取得部31aで取得された検出車線形状情報に基づいて自車両の位置を算出してもよい。検出車線形状情報に含まれる、カメラやセンサなどで取得した自車両の車線内の位置、特に自車両の左右方向の位置は、衛星測位位置と比較して精度が良い。このため、検出車線形状情報が自車両の位置の算出に用いられると、自車両の位置の精度向上が期待できる。 Note that the position calculation unit 32a may calculate the position of the own vehicle based on the detected lane shape information acquired by the lane shape acquisition unit 31a. The position in the lane of the own vehicle, particularly the position in the left-right direction of the own vehicle, which is included in the detected lane shape information and is acquired by a camera or a sensor, is more accurate than the satellite positioning position. Therefore, if the detected lane shape information is used to calculate the position of the own vehicle, the accuracy of the position of the own vehicle can be expected to be improved.
 地図データ記憶部33は、車線の形状及び接続を示す地図データを記憶している。 The map data storage unit 33 stores map data indicating the shape and connection of lanes.
 The map vehicle position generation unit 34a generates the map vehicle position information indicating the position of the own vehicle on the map data, for example, by performing map matching based on the position of the own vehicle calculated by the position calculation unit 32a and the map data stored in the map data storage unit 33.
 The map lane shape generation unit 34b searches the lanes on the map data stored in the map data storage unit 33 for the shape of the lane around the position indicated by the map vehicle position information generated by the map vehicle position generation unit 34a, and uses it as the shape of the map lane. Then, the map lane shape generation unit 34b generates map lane shape information indicating the retrieved shape of the map lane.
 The map accuracy generation unit 35a generates map accuracy information indicating the accuracy of the map-related information including at least one of the map vehicle position information and the map lane shape information, based on the difference between the shape of the detection lane indicated by the detected lane shape information acquired by the lane shape acquisition unit 31a and the shape of the map lane indicated by the map lane shape information generated by the map lane shape generation unit 34b. The map accuracy generation unit 35a generates the map accuracy information, for example, at fixed time intervals, at fixed travel-distance intervals, or every time the lane shape measuring device 20 outputs the detected lane shape information.
 情報出力部36aは、地図関連情報及び地図精度情報を、走行制御装置40に出力する。なお、地図関連情報は、地図精度生成部35aで生成されてもよいし、情報出力部36aで生成されてもよい。 The information output unit 36a outputs map-related information and map accuracy information to the travel control device 40. The map-related information may be generated by the map accuracy generation unit 35a or may be generated by the information output unit 36a.
 <動作>
 図3は、本実施の形態2に係る情報出力装置30の動作を示すフローチャートである。
<Operation>
FIG. 3 is a flowchart showing the operation of the information output device 30 according to the second embodiment.
 まずステップS1にて、車線形状取得部31aは、車線形状測定装置20から検出車線形状情報を取得する。 First, in step S1, the lane shape acquisition unit 31a acquires the detected lane shape information from the lane shape measuring device 20.
 ステップS2にて、位置算出部32aは、車載センサ装置群10から出力された車載センサ情報に基づいて、現時刻の自車両の位置を算出する。 In step S2, the position calculation unit 32a calculates the position of the own vehicle at the current time based on the in-vehicle sensor information output from the in-vehicle sensor device group 10.
 ステップS3にて、地図車両位置生成部34aは、位置算出部32aで算出された自車両の位置と、地図データとに基づいて、地図車両位置情報を生成する。 In step S3, the map vehicle position generation unit 34a generates map vehicle position information based on the position of the own vehicle calculated by the position calculation unit 32a and the map data.
 ステップS4にて、地図車線形状生成部34bは、地図車両位置生成部34aで生成された地図車両位置情報と、地図データとに基づいて、地図車線形状情報を生成する。 In step S4, the map lane shape generation unit 34b generates map lane shape information based on the map vehicle position information generated by the map vehicle position generation unit 34a and the map data.
 In step S5, the map accuracy generation unit 35a generates the map accuracy information based on the difference between the shape of the detection lane indicated by the detected lane shape information acquired by the lane shape acquisition unit 31a and the shape of the map lane indicated by the map lane shape information generated by the map lane shape generation unit 34b. The generation of the map accuracy information by the map accuracy generation unit 35a will be described in detail later.
 ステップS6にて、情報出力部36aは、地図関連情報及び地図精度情報を、走行制御装置40に出力する。なお、走行制御装置40の動作については後で詳細に説明する。その後、図3の動作が終了する。なお、ステップS6の後、ステップS1に処理が適宜戻ってもよい。 In step S6, the information output unit 36a outputs the map-related information and the map accuracy information to the travel control device 40. The operation of the travel control device 40 will be described in detail later. After that, the operation of FIG. 3 ends. After step S6, the process may return to step S1 as appropriate.
 <ステップS5における地図精度情報の生成>
 図4は、ステップS5にて、地図精度生成部35aが、車線形状取得部31aで取得された検出車線形状情報が示す検出車線の形状と、地図車線形状生成部34bで生成された地図車線形状情報が示す地図車線の形状との差を算出する処理を説明するための概念図である。
<Generation of map accuracy information in step S5>
 FIG. 4 is a conceptual diagram for explaining the process in which, in step S5, the map accuracy generation unit 35a calculates the difference between the shape of the detection lane indicated by the detected lane shape information acquired by the lane shape acquisition unit 31a and the shape of the map lane indicated by the map lane shape information generated by the map lane shape generation unit 34b.
 In the detected lane shape information that the lane shape acquisition unit 31a acquires from the lane shape measuring device 20, the shape of the detection lane is expressed as a plurality of points on a coordinate system. For this coordinate system, for example, an XY coordinate system is used in which the lane shape measuring device 20 is the origin, the Y axis is the traveling direction of the own vehicle (positive in the traveling direction), and the X axis is the lateral direction with respect to the traveling direction (positive to the right of the traveling direction). In this case, the position of each point on the shape of the detection lane is a relative position with the lane shape measuring device 20 as the origin.
 The map accuracy generation unit 35a converts the coordinate system of the map lane shape information generated by the map lane shape generation unit 34b into the coordinate system of the detected lane shape information generated by the lane shape measuring device 20. For example, when the coordinate system of the map lane shape information is the coordinate system of the world geodetic system (values expressed by latitude, longitude, altitude, and vehicle heading) and the coordinate system of the detected lane shape information is the above-described XY coordinate system with the own vehicle as the origin, the coordinate system of the world geodetic system is converted into the XY coordinate system.
 Specifically, it suffices to combine a translation of the coordinates such that the coordinates (latitude, longitude, altitude) of the position of the own vehicle used when the map lane shape information was generated become the origin of the XY coordinate system, and a rotation of the coordinates such that the vehicle heading indicated by the map lane shape information coincides with the Y-axis direction of the XY coordinate system. As a result, the position of each point on the shape of the map lane becomes a relative position with the lane shape measuring device 20 as the origin, similarly to the position of each point on the shape of the detection lane.
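 For illustration, a minimal Python sketch of this conversion follows. It assumes a small-area tangent-plane approximation for converting latitude and longitude to metres and a heading measured clockwise from true north; the actual geodetic conversion used by the device is not specified, and all names are hypothetical.

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS84 equatorial radius, used for a local tangent-plane approximation

def geodetic_to_vehicle_xy(points_latlon, vehicle_lat, vehicle_lon, heading_deg):
    """Convert map-lane points (lat, lon in degrees) into the vehicle XY frame.

    The vehicle position becomes the origin; the Y axis points along the vehicle
    heading (clockwise from true north) and the X axis points to its right.
    """
    heading = math.radians(heading_deg)
    lat0 = math.radians(vehicle_lat)
    result = []
    for lat, lon in points_latlon:
        # small-area approximation: degree differences -> metres east/north of the vehicle
        north = math.radians(lat - vehicle_lat) * EARTH_RADIUS_M
        east = math.radians(lon - vehicle_lon) * EARTH_RADIUS_M * math.cos(lat0)
        # rotate so that the heading direction becomes +Y and its right-hand side +X
        x = east * math.cos(heading) - north * math.sin(heading)
        y = east * math.sin(heading) + north * math.cos(heading)
        result.append((x, y))
    return result
```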
 In the above description, the absolute coordinate system of the map lane shape information is converted into the relative coordinate system of the detected lane shape information, but conversely, the relative coordinate system of the detected lane shape information may be converted into the absolute coordinate system of the map lane shape information. However, in view of the fact that relative coordinates with the own vehicle as the origin are generally used in an automatic driving device that performs automatic driving as in the second embodiment, it is preferable that the absolute coordinate system of the map lane shape information be converted into the relative coordinate system of the detected lane shape information.
 図4には、以上の座標系の変換が行われた後の、検出車線である区画線51と、地図車線である区画線52とが示されている。また図4には、差を算出すべき車線の形状の範囲である算出範囲Lも示されている。算出範囲Lが大きすぎると、検出車線の形状の検出精度が低くなる傾向があることに鑑みて、算出範囲Lの最大距離は、例えば、100[m]程度に設定される。 FIG. 4 shows the lane marking 51, which is the detection lane, and the lane marking 52, which is the map lane, after the above coordinate system conversion is performed. FIG. 4 also shows a calculation range L, which is a range of lane shapes for which the difference should be calculated. If the calculation range L is too large, the detection accuracy of the shape of the detection lane tends to be low. Therefore, the maximum distance of the calculation range L is set to, for example, about 100 [m].
 The map accuracy generation unit 35a calculates the average value Davg and the maximum value Dmax of the difference D between the lane marking 51 and the lane marking 52 within the calculation range L. The map accuracy generation unit 35a may calculate these values for the lane markings 51 and 52 on only one of the left and right sides within the calculation range L, or for the lane markings 51 and 52 on both the left and right sides within the calculation range L.
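 A minimal Python sketch of this calculation is shown below. It assumes that both lane shapes have already been resampled at common Y stations within the calculation range L, so that the difference D reduces to a lateral offset per station; the sampling scheme and the function name are assumptions.

```python
def lane_shape_difference(detected_x, map_x):
    """Average and maximum lateral difference between two lane shapes.

    detected_x, map_x : lists of X offsets [m] of the detected lane (51) and the
                        map lane (52), sampled at the same Y stations inside L.
    Returns (Davg, Dmax).
    """
    if len(detected_x) != len(map_x) or not detected_x:
        return None, None  # corresponds to the case where the difference cannot be evaluated
    diffs = [abs(d - m) for d, m in zip(detected_x, map_x)]
    return sum(diffs) / len(diffs), max(diffs)
```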
 次に、地図精度生成部35aは、算出された平均値Davg及び最大値Dmaxに基づいて、地図関連情報の精度を判定する。図5は、地図精度生成部35aによる判定を説明するための図である。図5では、平均値Davg及び最大値Dmaxを区分した判定条件と、地図関連情報の精度を区分した精度レベルとが対応付けられている。地図精度生成部35aは、算出された平均値Davg及び最大値Dmaxが該当する判定条件を図5の情報から検索し、当該判定条件に対応する精度レベルを地図関連情報の精度として判定する。 Next, the map accuracy generation unit 35a determines the accuracy of the map-related information based on the calculated average value Davg and the maximum value Dmax. FIG. 5 is a diagram for explaining the determination by the map accuracy generation unit 35a. In FIG. 5, the determination condition in which the average value Davg and the maximum value Dmax are classified is associated with the accuracy level in which the accuracy of the map-related information is classified. The map accuracy generation unit 35a searches the information in FIG. 5 for the determination condition to which the calculated average value Davg and the maximum value Dmax correspond, and determines the accuracy level corresponding to the determination condition as the accuracy of the map-related information.
 The accuracy level "high" means that the accuracy of the map-related information is high, and the accuracy level "low" means that the accuracy of the map-related information is low. The accuracy level "undetermined" means, for example, that the information on the lane marking 51 and the lane marking 52 was not obtained normally. For example, the accuracy level is "undetermined" when, for at least one of the lane marking 51 and the lane marking 52, the information output device 30 obtained only one of the left and right lane markings, or obtained neither the left nor the right lane marking. In the example of FIG. 5, there are four accuracy levels (high, medium, low, undetermined), but the number of levels is not limited to four.
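 For illustration, a minimal Python sketch of such a level determination follows. The threshold values are placeholders only; the actual judgment conditions are defined in FIG. 5 and are not reproduced here.

```python
def accuracy_level(davg, dmax):
    """Map (Davg, Dmax) to an accuracy level in the style of the FIG. 5 lookup.

    The numeric thresholds below are hypothetical; the real table is given in FIG. 5.
    """
    if davg is None or dmax is None:
        return "undetermined"
    if davg < 0.3 and dmax < 0.5:
        return "high"
    if davg < 1.0 and dmax < 1.5:
        return "medium"
    return "low"
```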
 地図精度生成部35aは、以上によって判定された精度示す情報を、地図関連情報の精度を示す地図精度情報として生成する。 The map accuracy generation unit 35a generates the information indicating the accuracy determined as described above as the map accuracy information indicating the accuracy of the map-related information.
 <走行制御装置40の動作>
 走行制御装置40は、図3のステップS6で情報出力装置30から出力された地図関連情報及び地図精度情報を受け取る。走行制御装置40は、受け取った地図精度情報が示す精度に基づいて、受信した地図関連情報を用いた自動運転を実行するか否かを判定し、その判定結果に基づいて自動運転に関する走行制御を行う。走行制御装置40は、例えば、地図精度情報が「高」を示す場合には、地図関連情報を用いた自動運転に関する走行制御を行い、地図精度情報が「低」を示す場合には、地図関連情報を用いた自動運転に関する走行制御を停止する。
<Operation of travel control device 40>
 The travel control device 40 receives the map-related information and the map accuracy information output from the information output device 30 in step S6 of FIG. 3. Based on the accuracy indicated by the received map accuracy information, the travel control device 40 determines whether or not to perform automatic driving using the received map-related information, and performs travel control related to automatic driving based on the determination result. For example, when the map accuracy information indicates "high", the travel control device 40 performs travel control related to automatic driving using the map-related information, and when the map accuracy information indicates "low", the travel control device 40 stops travel control related to automatic driving that uses the map-related information.
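 A minimal Python sketch of this decision is shown below. The controller object and its methods are hypothetical; the actual interface of the travel control device 40 is not specified.

```python
def update_map_usage(level, controller):
    """Enable or disable map-based automatic driving according to the accuracy level.

    `controller` is a hypothetical object exposing enable_map_driving() and
    disable_map_driving(); `level` is one of "high", "medium", "low", "undetermined".
    """
    if level == "high":
        controller.enable_map_driving()
    elif level in ("low", "undetermined"):
        controller.disable_map_driving()
    # intermediate levels ("medium") are left to the controller's own policy
```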
 <実施の形態2のまとめ>
 以上のような本実施の形態2に係る情報出力装置30を備える自動運転装置によれば、情報出力装置30から出力された地図関連情報及び地図精度情報に基づいて、自車両の走行制御を行う。このような構成によれば、適切な自動運転を実現することができる。
<Summary of Embodiment 2>
 According to the automatic driving device including the information output device 30 according to the second embodiment described above, the travel of the own vehicle is controlled based on the map-related information and the map accuracy information output from the information output device 30. With such a configuration, appropriate automatic driving can be realized.
 <変形例1>
 図4の算出範囲Lには、車線形状測定装置20の精度が最も良い範囲を用いることが望ましい。例えば、車線形状測定装置20がカメラの画像に基づいて検出車線形状情報を生成する場合には、カーブなど道路に曲率がある場所での検出車線の形状の検出精度は、直線路での検出車線の形状の検出精度よりも低くなる傾向がある。
<Modification example 1>
 For the calculation range L in FIG. 4, it is desirable to use the range in which the accuracy of the lane shape measuring device 20 is the best. For example, when the lane shape measuring device 20 generates the detected lane shape information based on camera images, the detection accuracy of the shape of the detection lane at a place where the road has curvature, such as a curve, tends to be lower than the detection accuracy of the shape of the detection lane on a straight road.
 そこで、地図精度生成部35aは、検出車線である区画線51の形状の曲率に基づいて、算出範囲Lを変更してもよい。なお、ここでいう曲率は、曲率そのものだけでなく、曲率の逆数である曲率半径も含む。以下、地図精度生成部35aは、区画線51の形状の曲率半径に基づいて、算出範囲Lを変更する場合を例にして説明する。 Therefore, the map accuracy generation unit 35a may change the calculation range L based on the curvature of the shape of the lane marking 51 which is the detection lane. The curvature here includes not only the curvature itself but also the radius of curvature, which is the reciprocal of the curvature. Hereinafter, the map accuracy generation unit 35a will be described by taking as an example a case where the calculation range L is changed based on the radius of curvature of the shape of the lane marking 51.
 FIG. 6 is a diagram for explaining the change of the calculation range L by the map accuracy generation unit 35a. In FIG. 6, the radius of curvature R [m] of the shape of the lane marking 51 is associated with the calculation range L expressed as a range in the Y-axis direction. The map accuracy generation unit 35a calculates the radius of curvature of the shape of the lane marking 51, retrieves the calculation range L corresponding to the calculated radius of curvature from the information in FIG. 6, and uses it for calculating the difference between the shape of the lane marking 51 and the shape of the lane marking 52.
 以上のような構成によれば、地図精度情報が示す精度に、カーブなどによる検出精度の低下が影響することを抑制することができる。これにより、例えば、区画線51の曲率半径Rが1000m以上である場合、自車両の前方50mまでは直線路とみなされ、区画線51の精度の計算が車両の位置から前方50mまでとなる。 According to the above configuration, it is possible to suppress the influence of the decrease in detection accuracy due to a curve or the like on the accuracy indicated by the map accuracy information. As a result, for example, when the radius of curvature R of the lane marking 51 is 1000 m or more, the road up to 50 m ahead of the own vehicle is regarded as a straight road, and the accuracy of the lane marking 51 is calculated from the position of the vehicle to 50 m ahead.
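 For illustration, a minimal Python sketch of such a FIG. 6 style lookup follows. Only the entry in which a radius of 1000 m or more corresponds to 50 m is mentioned in the text; the remaining rows of the table are assumptions.

```python
# Hypothetical FIG. 6 style table: (minimum radius of curvature R [m], calculation range L [m]).
# Only the R >= 1000 m -> 50 m entry appears in the text; the other rows are assumed.
RANGE_TABLE = [
    (1000.0, 50.0),
    (500.0, 30.0),
    (200.0, 20.0),
    (0.0, 10.0),
]

def calculation_range_for_curvature(radius_of_curvature_m):
    """Return the calculation range L for a given radius of curvature of lane marking 51."""
    for min_radius, range_l in RANGE_TABLE:
        if radius_of_curvature_m >= min_radius:
            return range_l
    return RANGE_TABLE[-1][1]
```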
 Note that it is preferable for the information output device 30 to notify the travel control device 40, which is making the own vehicle travel automatically, that the accuracy of the map-related information is low before the own vehicle automatically travels into a section where the accuracy of the map-related information is low. For this reason, the vehicle-side end of the calculation range L may be set apart from the own vehicle. With such a configuration, before the own vehicle enters such a section, the travel control device 40 can take appropriate measures such as shifting to an automatic driving mode that does not use the map-related information.
 In the above description, the map accuracy generation unit 35a changes the calculation range L based on the curvature (radius of curvature) of the shape of the lane marking 51, which is the detection lane, but the calculation range L may instead be changed based on the curvature of the shape of the lane marking 52, which is the map lane.
 <変形例2>
 検出車線形状情報が検出車線の形状を示す表現形式と、地図車線形状情報が地図車線の形状を示す表現形式とが異なる場合がある。例えば、車線形状測定装置20から出力される検出車線形状情報が示す検出車線の位置としては、検出車線の自車両側の端部の位置、検出車線の幅方向の中心位置、または、検出車線の自車両と逆側の端部の位置など、様々な位置が適用されることが想定される。
<Modification 2>
 The representation format in which the detected lane shape information expresses the shape of the detection lane may differ from the representation format in which the map lane shape information expresses the shape of the map lane. For example, as the position of the detection lane indicated by the detected lane shape information output from the lane shape measuring device 20, various positions are conceivable, such as the position of the edge of the detection lane on the own-vehicle side, the center position of the detection lane in the width direction, or the position of the edge of the detection lane on the side opposite to the own vehicle.
 Therefore, when the representation format in which the detected lane shape information expresses the shape of the detection lane differs from the representation format in which the map lane shape information expresses the shape of the map lane, the map accuracy generation unit 35a may obtain the above difference using a predetermined lane width. Here, as an example, the position of the lane marking 51, which is the detection lane in the detected lane shape information, is the position of the own-vehicle-side edge of the lane marking in its width direction, and the position of the lane marking 52, which is the map lane in the map lane shape information, is the center position of the lane marking in its width direction. In this case, there is an error of about half of the lane width (lane marking width) W, that is, W/2, between the position of the lane marking 51 and the position of the lane marking 52. Therefore, the map accuracy generation unit 35a may use the average value and the maximum value of the values obtained by subtracting W/2 from the difference D between the shape of the lane marking 51 and the shape of the lane marking 52 as the average value Davg and the maximum value Dmax described in the second embodiment, respectively.
 Since the width W of the lane marking differs depending on the region or country, among the widths indicated in the map data (map lane shape information), the width corresponding to the position of the own vehicle may be used in the above calculation. On a general road in Japan, the lane marking width W is about 15 cm.
 According to the above configuration, the accuracy of the difference between the shape of the lane marking 51 output by the lane shape measuring device 20 and the shape of the lane marking 52 in the map data can be further improved, so that the accuracy indicated by the map accuracy information can be improved. Further, in practice, a portion of the image data that is not a lane marking may be erroneously detected as a lane marking because of channelizing strips, curbs, and the like. According to the above configuration, such erroneous detections can also be expected to be reduced.
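 A minimal Python sketch of this W/2 correction follows. Clamping negative corrected values at zero is an assumption made for illustration; the text only specifies the subtraction of W/2.

```python
# Default lane-marking width W [m]; about 0.15 m on a general road in Japan.
DEFAULT_MARKING_WIDTH_M = 0.15

def corrected_differences(diffs, marking_width_m=DEFAULT_MARKING_WIDTH_M):
    """Subtract the W/2 offset caused by differing reference positions of the lane marking.

    diffs : raw differences D [m] between lane marking 51 and lane marking 52.
    Returns (Davg, Dmax) computed from the corrected values.
    """
    corrected = [max(d - marking_width_m / 2.0, 0.0) for d in diffs]  # clamp at zero (assumption)
    if not corrected:
        return None, None
    return sum(corrected) / len(corrected), max(corrected)
```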
 <変形例3>
 変形例2では、地図精度生成部35aは、車線形状測定装置20の測定仕様と地図データ仕様との差異が地図精度情報に与える影響を、車線幅Wを用いて低減したが、仕様(表現形式)による影響の低減はこれに限ったものではない。
<Modification example 3>
 In the second modification, the map accuracy generation unit 35a used the lane marking width W to reduce the influence that the difference between the measurement specifications of the lane shape measuring device 20 and the map data specifications has on the map accuracy information, but the way of reducing the influence of the specifications (representation formats) is not limited to this.
 例えば、2つの検出車線である2つの区画線51の形状を示す表現形式と、2つの地図車線である2つの区画線52の形状を示す表現形式とが異なる場合を想定する。このような場合に、地図精度生成部35aは、上記差として、2つの区画線51の中心線の形状と2つの区画線52の中心線の形状との差を求めてもよい。 For example, it is assumed that the expression format showing the shapes of the two lane markings 51, which are the two detection lanes, and the expression format showing the shapes of the two lane markings 52, which are the two map lanes, are different. In such a case, the map accuracy generation unit 35a may obtain the difference between the shape of the center line of the two division lines 51 and the shape of the center line of the two division lines 52 as the above difference.
 For example, the map accuracy generation unit 35a treats the difference between the first lane center line calculated from the two lane markings 51 and the second lane center line calculated from the two lane markings 52 in the same way as the difference D between the lane marking 51 and the lane marking 52 described above. At this time, it is preferable that the map accuracy generation unit 35a calculate the first lane center line from the midpoints of the pairs of points giving the shortest distance between the two lane markings 51, and calculate the second lane center line from the midpoints of the pairs of points giving the shortest distance between the two lane markings 52.
 Since the positions of the first lane center line and the second lane center line are in the middle of the left and right lane markings, the representation formats of the first lane center line and the second lane center line can be made identical even if the representation formats of the lane marking 51 and the lane marking 52 differ somewhat. Therefore, the influence that the difference between the measurement specifications of the lane shape measuring device 20 and the map data specifications has on the map accuracy information can be reduced. Consequently, according to the above configuration, the accuracy of the difference between the shape of the lane marking 51 output by the lane shape measuring device 20 and the shape of the lane marking 52 in the map data can be further improved, so that the accuracy indicated by the map accuracy information can be improved.
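 For illustration, a minimal Python sketch of the center-line computation follows. Pairing each point of one marking with its nearest point on the other marking is used here to approximate the shortest-distance point pairs; the exact pairing rule is not fixed by the text, and the function name is an assumption.

```python
import math

def centerline(left_line, right_line):
    """Approximate lane center line from two lane-marking polylines.

    Each polyline is a list of (x, y) points in the vehicle XY frame. For every
    point on the left marking, its nearest point on the right marking is found,
    and the midpoint of that pair becomes a center-line point.
    """
    def nearest(p, line):
        return min(line, key=lambda q: math.hypot(p[0] - q[0], p[1] - q[1]))

    center = []
    for p in left_line:
        q = nearest(p, right_line)
        center.append(((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0))
    return center
```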
 <変形例4>
 車線形状測定装置20が検出車線の形状を測定するのに要する測定時間、つまり車線形状測定装置20が検出車線形状情報を生成する生成時間Tが長い場合がある。この場合、検出車線形状情報が上記差の算出に用いられる時間が、地図車線形状情報が上記差の算出に用いられる時間から遅延する場合がある。この結果として、検出車線形状情報が示す検出車線である区画線51が、地図車線形状情報が示す地図車線である区画線52に対して、自車両の進行方向(図4のY軸のプラス方向)にずれる場合がある。
<Modification example 4>
 The measurement time required for the lane shape measuring device 20 to measure the shape of the detection lane, that is, the generation time T required for the lane shape measuring device 20 to generate the detected lane shape information, may be long. In this case, the time at which the detected lane shape information is used for calculating the difference may lag behind the time at which the map lane shape information is used for calculating the difference. As a result, the lane marking 51, which is the detection lane indicated by the detected lane shape information, may be shifted in the traveling direction of the own vehicle (the positive direction of the Y axis in FIG. 4) with respect to the lane marking 52, which is the map lane indicated by the map lane shape information.
 Therefore, the map accuracy generation unit 35a may obtain the above difference after moving the shape of the lane marking 51 relative to the shape of the lane marking 52 based on the generation time T of the detected lane shape information and the speed of the own vehicle. For example, the value of the generation time T required for the lane shape measuring device 20 to generate the detected lane shape information is obtained in advance by experiments or the like and stored as a parameter in the map accuracy generation unit 35a. Then, in step S5 of FIG. 3, the map accuracy generation unit 35a multiplies the generation time T by the vehicle speed V [m/s] of the own vehicle at the current time to obtain the distance Ldelay [m]. Then, when converting the lane marking 52 indicated by the map lane shape information into the same XY coordinate system as the lane marking 51 indicated by the detected lane shape information, the map accuracy generation unit 35a translates the lane marking 51 by the distance Ldelay in the negative direction of the Y axis.
 なお、車速は車載センサ装置群10から情報出力装置30に入力されるものとする。また、自車両が直進移動しているとみなすことができる程度に、生成時間Tは短い(例えば100~300ミリ秒)ものとする。なお、生成時間Tは、検出車線形状情報が車線形状測定装置20から出力されて情報出力装置30で受信されるまでの通信遅延時間を含んでもよい。 It is assumed that the vehicle speed is input to the information output device 30 from the in-vehicle sensor device group 10. Further, it is assumed that the generation time T is short (for example, 100 to 300 milliseconds) so that the own vehicle can be regarded as moving straight. The generation time T may include a communication delay time until the detected lane shape information is output from the lane shape measuring device 20 and received by the information output device 30.
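 A minimal Python sketch of this delay compensation follows; the function name and the point layout are assumptions.

```python
def compensate_generation_delay(detected_points, generation_time_s, vehicle_speed_mps):
    """Shift the detected lane marking 51 to compensate for the generation time T.

    detected_points    : list of (x, y) points of lane marking 51 in the vehicle XY frame
    generation_time_s  : generation (and, optionally, communication) delay T [s]
    vehicle_speed_mps  : current vehicle speed V [m/s]
    The points are translated by Ldelay = T * V in the negative Y direction,
    assuming the vehicle moved straight during T.
    """
    l_delay = generation_time_s * vehicle_speed_mps
    return [(x, y - l_delay) for (x, y) in detected_points]
```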
 以上のような構成によれば、車線形状測定装置20の生成時間Tが地図精度情報に与える影響を低減することができるため、地図精度情報が示す精度を高めることができる。 According to the above configuration, the influence of the generation time T of the lane shape measuring device 20 on the map accuracy information can be reduced, so that the accuracy indicated by the map accuracy information can be improved.
 <変形例5>
 実施の形態2に係る地図精度生成部35aは、区画線51と区画線52との差Dの平均値Davg及び最大値Dmaxから図5の情報を用いて地図関連情報の精度を示す地図精度情報を生成したが、地図精度情報はこれに限ったものではない。例えば、走行制御装置40が、差Dの平均値Davg及び最大値Dmaxに基づいて、自動運転に関する走行制御を行うことが可能であれば、地図精度生成部35aは、平均値Davg及び最大値Dmaxをそのまま地図精度情報として生成してもよい。
<Modification 5>
 The map accuracy generation unit 35a according to the second embodiment generates the map accuracy information indicating the accuracy of the map-related information from the average value Davg and the maximum value Dmax of the difference D between the lane marking 51 and the lane marking 52 by using the information in FIG. 5, but the map accuracy information is not limited to this. For example, if the travel control device 40 can perform travel control related to automatic driving based on the average value Davg and the maximum value Dmax of the difference D, the map accuracy generation unit 35a may generate the average value Davg and the maximum value Dmax themselves as the map accuracy information.
 <その他の変形例>
 上述した図1の第1取得部31、第2取得部32、第1生成部34、第2生成部35及び出力部36を、以下「第1取得部31等」と記す。第1取得部31等は、図7に示す処理回路81により実現される。すなわち、処理回路81は、車両周辺を検出した結果に基づいて生成され、車両周辺の第1車線の形状を示す第1車線形状情報を取得する第1取得部31と、車両の位置を取得する第2取得部32と、第2取得部32で取得された車両の位置と、地図データとに基づいて、地図データ上における車両の位置を示す地図車両位置情報と、地図データ上の車線のうち車両周辺の第2車線の形状を示す第2車線形状情報とを生成する第1生成部34と、第1取得部31で取得された第1車線形状情報が示す第1車線の形状と、第1生成部34で生成された第2車線形状情報が示す第2車線の形状との差に基づいて、地図車両位置情報及び第2車線形状情報の少なくともいずれか1つを含む地図関連情報の精度を示す地図精度情報を生成する第2生成部35と、地図関連情報及び地図精度情報を出力する出力部36と、を備える。処理回路81には、専用のハードウェアが適用されてもよいし、メモリに格納されるプログラムを実行するプロセッサが適用されてもよい。プロセッサには、例えば、中央処理装置、処理装置、演算装置、マイクロプロセッサ、マイクロコンピュータ、DSP(Digital Signal Processor)などが該当する。
<Other variants>
 The first acquisition unit 31, the second acquisition unit 32, the first generation unit 34, the second generation unit 35, and the output unit 36 of FIG. 1 described above are hereinafter referred to as "the first acquisition unit 31 and the like". The first acquisition unit 31 and the like are realized by the processing circuit 81 shown in FIG. 7. That is, the processing circuit 81 includes: the first acquisition unit 31 that acquires the first lane shape information, which is generated based on the result of detecting the vicinity of the vehicle and indicates the shape of the first lane around the vehicle; the second acquisition unit 32 that acquires the position of the vehicle; the first generation unit 34 that generates, based on the position of the vehicle acquired by the second acquisition unit 32 and the map data, the map vehicle position information indicating the position of the vehicle on the map data and the second lane shape information indicating the shape of the second lane around the vehicle among the lanes on the map data; the second generation unit 35 that generates, based on the difference between the shape of the first lane indicated by the first lane shape information acquired by the first acquisition unit 31 and the shape of the second lane indicated by the second lane shape information generated by the first generation unit 34, the map accuracy information indicating the accuracy of the map-related information including at least one of the map vehicle position information and the second lane shape information; and the output unit 36 that outputs the map-related information and the map accuracy information. Dedicated hardware may be applied to the processing circuit 81, or a processor that executes a program stored in a memory may be applied. Examples of the processor include a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, and a DSP (Digital Signal Processor).
 When the processing circuit 81 is dedicated hardware, the processing circuit 81 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof. The functions of the respective units such as the first acquisition unit 31 may each be realized by separate processing circuits, or the functions of the units may be collectively realized by a single processing circuit.
 When the processing circuit 81 is a processor, the functions of the first acquisition unit 31 and the like are realized in combination with software or the like. The software or the like corresponds to, for example, software, firmware, or software and firmware. The software or the like is described as a program and stored in a memory. As shown in FIG. 8, the processor 82 applied to the processing circuit 81 realizes the functions of the respective units by reading and executing the program stored in the memory 83. That is, the information output device 30 includes the memory 83 for storing a program that, when executed by the processing circuit 81, results in the execution of: a step of acquiring the first lane shape information, which is generated based on the result of detecting the vicinity of the vehicle and indicates the shape of the first lane around the vehicle; a step of acquiring the position of the vehicle; a step of generating, based on the acquired position of the vehicle and the map data, the map vehicle position information indicating the position of the vehicle on the map data and the second lane shape information indicating the shape of the second lane around the vehicle among the lanes on the map data; a step of generating, based on the difference between the shape of the first lane indicated by the acquired first lane shape information and the shape of the second lane indicated by the generated second lane shape information, the map accuracy information indicating the accuracy of the map-related information including at least one of the map vehicle position information and the second lane shape information; and a step of outputting the map-related information and the map accuracy information. In other words, it can be said that this program causes a computer to execute the procedures or methods of the first acquisition unit 31 and the like. Here, the memory 83 may be, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), a drive device therefor, or any storage medium to be used in the future.
 The configuration in which each function of the first acquisition unit 31 and the like is realized by either hardware or software or the like has been described above. However, the configuration is not limited to this, and a part of the first acquisition unit 31 and the like may be realized by dedicated hardware and another part by software or the like. For example, the functions of the first acquisition unit 31 and the second acquisition unit 32 can be realized by the processing circuit 81 as dedicated hardware, an interface, a receiver, and the like, and the other functions can be realized by the processing circuit 81 as the processor 82 reading and executing the program stored in the memory 83.
 以上のように、処理回路81は、ハードウェア、ソフトウェア等、またはこれらの組み合わせによって、上述の各機能を実現することができる。 As described above, the processing circuit 81 can realize each of the above-mentioned functions by hardware, software, or a combination thereof.
 Further, the information output device 30 described above can also be applied to an information output system constructed as a system by appropriately combining vehicle devices such as a PND (Portable Navigation Device), a navigation device, and a DMS (Driver Monitoring System) device, communication terminals including mobile terminals such as mobile phones, smartphones, and tablets, functions of applications installed in at least one of the vehicle device and the communication terminal, and a server. In this case, each function or each component of the information output device 30 described above may be distributed among the devices constituting the system, or may be concentrated in any one of the devices.
 図9は、本変形例に係るサーバ91の構成を示すブロック図である。図9のサーバ91は、通信部91aと制御部91bとを備えており、車両92の車両装置93と無線通信を行うことが可能となっている。 FIG. 9 is a block diagram showing the configuration of the server 91 according to this modification. The server 91 of FIG. 9 includes a communication unit 91a and a control unit 91b, and can perform wireless communication with the vehicle device 93 of the vehicle 92.
 第1取得部及び第2取得部である通信部91aは、車両装置93と無線通信を行うことにより、車両装置93で取得された第1車線形状情報と、車両92の位置とを受信する。 The communication unit 91a, which is the first acquisition unit and the second acquisition unit, receives the first lane shape information acquired by the vehicle device 93 and the position of the vehicle 92 by performing wireless communication with the vehicle device 93.
 By executing a program stored in a memory (not shown) of the server 91 with a processor (not shown) of the server 91, the control unit 91b has the same functions as the first generation unit 34 and the second generation unit 35 of FIG. 1. That is, the control unit 91b generates the map vehicle position information and the second lane shape information based on the position of the vehicle and the map data received by the communication unit 91a. Then, the control unit 91b generates the map accuracy information indicating the accuracy of the map-related information based on the difference between the shape of the first lane indicated by the first lane shape information received by the communication unit 91a and the shape of the second lane indicated by the second lane shape information generated by the control unit 91b.
 出力部である通信部91aは、地図関連情報及び地図精度情報を車両装置93に送信する。このように構成されたサーバ91によれば、実施の形態1で説明した情報出力装置30と同様の効果を得ることができる。 The communication unit 91a, which is an output unit, transmits map-related information and map accuracy information to the vehicle device 93. According to the server 91 configured in this way, the same effect as that of the information output device 30 described in the first embodiment can be obtained.
 図10は、本変形例に係る通信端末96の構成を示すブロック図である。図10の通信端末96は、通信部91aと同様の通信部96aと、制御部91bと同様の制御部96bとを備えており、車両97の車両装置98と無線通信を行うことが可能となっている。なお、通信端末96には、例えば車両97の運転者が携帯する携帯電話、スマートフォン、及びタブレットなどの携帯端末が適用される。このように構成された通信端末96によれば、実施の形態1で説明した情報出力装置30と同様の効果を得ることができる。 FIG. 10 is a block diagram showing the configuration of the communication terminal 96 according to this modification. The communication terminal 96 of FIG. 10 includes a communication unit 96a similar to the communication unit 91a and a control unit 96b similar to the control unit 91b, and can perform wireless communication with the vehicle device 98 of the vehicle 97. ing. A mobile terminal such as a mobile phone, a smartphone, or a tablet carried by the driver of the vehicle 97 is applied to the communication terminal 96, for example. According to the communication terminal 96 configured in this way, the same effect as that of the information output device 30 described in the first embodiment can be obtained.
 なお、本発明は、その発明の範囲内において、各実施の形態及び各変形例を自由に組み合わせたり、各実施の形態及び各変形例を適宜、変形、省略したりすることが可能である。 It should be noted that, within the scope of the present invention, each embodiment and each modification can be freely combined, and each embodiment and each modification can be appropriately modified or omitted.
 本発明は詳細に説明されたが、上記した説明は、すべての態様において、例示であって、本発明がそれに限定されるものではない。例示されていない無数の変形例が、本発明の範囲から外れることなく想定され得るものと解される。 Although the present invention has been described in detail, the above description is exemplary in all embodiments and the present invention is not limited thereto. It is understood that innumerable variations not illustrated can be assumed without departing from the scope of the present invention.
 30 情報出力装置、31 第1取得部、32 第2取得部、34 第1生成部、35 第2生成部、36 出力部、40 走行制御装置、51,52 区画線、L 算出範囲。 30 information output device, 31 first acquisition unit, 32 second acquisition unit, 34 first generation unit, 35 second generation unit, 36 output unit, 40 travel control device, 51, 52 lane markings, L calculation range.

Claims (7)

  1.  車両周辺を検出した結果に基づいて生成され、前記車両周辺の第1車線の形状を示す第1車線形状情報を取得する第1取得部と、
     前記車両の位置を取得する第2取得部と、
     前記第2取得部で取得された前記車両の位置と、地図データとに基づいて、前記地図データ上における前記車両の位置を示す地図車両位置情報と、前記地図データ上の車線のうち前記車両周辺の第2車線の形状を示す第2車線形状情報とを生成する第1生成部と、
     前記第1取得部で取得された前記第1車線形状情報が示す前記第1車線の形状と、前記第1生成部で生成された前記第2車線形状情報が示す前記第2車線の形状との差に基づいて、前記地図車両位置情報及び前記第2車線形状情報の少なくともいずれか1つを含む地図関連情報の精度を示す地図精度情報を生成する第2生成部と、
     前記地図関連情報及び前記地図精度情報を出力する出力部と
    を備える、情報出力装置。
    A first acquisition unit that acquires first lane shape information that is generated based on a result of detecting the vicinity of a vehicle and indicates the shape of a first lane around the vehicle;
    A second acquisition unit that acquires the position of the vehicle;
    A first generation unit that generates, based on the position of the vehicle acquired by the second acquisition unit and map data, map vehicle position information indicating the position of the vehicle on the map data and second lane shape information indicating the shape of a second lane around the vehicle among the lanes on the map data;
    A second generation unit that generates, based on the difference between the shape of the first lane indicated by the first lane shape information acquired by the first acquisition unit and the shape of the second lane indicated by the second lane shape information generated by the first generation unit, map accuracy information indicating the accuracy of map-related information including at least one of the map vehicle position information and the second lane shape information; and
    An information output device including an output unit that outputs the map-related information and the map accuracy information.
  2.  請求項1に記載の情報出力装置であって、
     前記第2生成部は、
     前記第1車線の形状の曲率、または、前記第2車線の形状の曲率に基づいて、前記差を求めるべき車線の形状の範囲を変更する、情報出力装置。
    The information output device according to claim 1.
    The second generation unit
    An information output device that changes the range of the lane shape from which the difference should be obtained based on the curvature of the shape of the first lane or the curvature of the shape of the second lane.
  3.  請求項1に記載の情報出力装置であって、
     前記第2生成部は、
     前記第1車線形状情報が前記第1車線の形状を示す表現形式と、前記第2車線形状情報が前記第2車線の形状を示す表現形式とが異なる場合に、予め定められた車線幅を用いて前記差を求める、情報出力装置。
    The information output device according to claim 1.
    The second generation unit
    An information output device that obtains the difference using a predetermined lane width when the representation format in which the first lane shape information indicates the shape of the first lane differs from the representation format in which the second lane shape information indicates the shape of the second lane.
  4.  請求項1に記載の情報出力装置であって、
     前記第2生成部は、
     前記第1車線形状情報が隣り合う2つの前記第1車線の形状を示す表現形式と、前記第2車線形状情報が隣り合う2つの前記第2車線の形状を示す表現形式とが異なる場合に、前記差として、前記2つの第1車線の中心線の形状と前記2つの第2車線の中心線の形状との差を求める、情報出力装置。
    The information output device according to claim 1.
    The second generation unit
    An information output device that, when the representation format in which the first lane shape information indicates the shapes of two adjacent first lanes differs from the representation format in which the second lane shape information indicates the shapes of two adjacent second lanes, obtains, as the difference, the difference between the shape of the center line of the two first lanes and the shape of the center line of the two second lanes.
5.  The information output device according to claim 1, wherein
     the second generation unit obtains the difference after moving the shape of the first lane relative to the shape of the second lane based on a generation time of the first lane shape information and a speed of the vehicle.
6.  An automated driving device comprising:
     the information output device according to claim 1; and
     a travel control device that performs travel control of the vehicle based on the map-related information and the map accuracy information output from the information output device.
7.  An information output method comprising:
     acquiring first lane shape information which is generated based on a result of detecting surroundings of a vehicle and which indicates a shape of a first lane around the vehicle;
     acquiring a position of the vehicle;
     generating, based on the acquired position of the vehicle and map data, map vehicle position information indicating the position of the vehicle on the map data and second lane shape information indicating a shape of a second lane around the vehicle among lanes on the map data;
     generating map accuracy information indicating accuracy of map-related information including at least one of the map vehicle position information and the second lane shape information, based on a difference between the shape of the first lane indicated by the acquired first lane shape information and the shape of the second lane indicated by the generated second lane shape information; and
     outputting the map-related information and the map accuracy information.
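
Claims 1 and 7 describe the same processing flow, once as a device and once as a method. A minimal sketch of that flow follows, under illustrative assumptions that are not part of the claims: lane shapes are polylines of (x, y) points in the vehicle coordinate frame, and the map accuracy information is a score between 0 and 1 that falls as the detected and map lane shapes diverge.

from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class MapRelatedInfo:
    map_vehicle_position: Point        # vehicle position on the map data
    second_lane_shape: List[Point]     # lane shape taken from the map data

def lane_shape_difference(first_lane: List[Point], second_lane: List[Point]) -> float:
    # Mean point-to-point distance between the detected (first) and map (second) lane shapes.
    n = min(len(first_lane), len(second_lane))
    if n == 0:
        return float("inf")
    total = 0.0
    for i in range(n):
        dx = first_lane[i][0] - second_lane[i][0]
        dy = first_lane[i][1] - second_lane[i][1]
        total += (dx * dx + dy * dy) ** 0.5
    return total / n

def map_accuracy(difference_m: float, tolerance_m: float = 0.5) -> float:
    # Map a shape difference in metres to an accuracy score in [0, 1] (assumed scale).
    return max(0.0, 1.0 - difference_m / tolerance_m)

def output_map_information(first_lane: List[Point], map_info: MapRelatedInfo):
    # Second generation unit: difference between the two shapes -> map accuracy information.
    difference = lane_shape_difference(first_lane, map_info.second_lane_shape)
    accuracy = map_accuracy(difference)
    # Output unit: map-related information together with its accuracy information.
    return map_info, accuracy

A downstream consumer can then weigh the map-related information by the accompanying accuracy score rather than treating the map as always correct.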
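Claim 2 varies the range of lane shape over which the difference is taken according to curvature. A sketch under the same polyline assumption, using a three-point circumscribed-circle estimate of curvature (an illustrative choice; the claim does not specify how curvature is obtained):

import math
from typing import List, Tuple

Point = Tuple[float, float]

def three_point_curvature(p0: Point, p1: Point, p2: Point) -> float:
    # Curvature of the circle through three points: 4 * triangle area / product of side lengths.
    a = math.dist(p0, p1)
    b = math.dist(p1, p2)
    c = math.dist(p0, p2)
    if a * b * c == 0.0:
        return 0.0
    area = abs((p1[0] - p0[0]) * (p2[1] - p0[1]) - (p2[0] - p0[0]) * (p1[1] - p0[1])) / 2.0
    return 4.0 * area / (a * b * c)

def comparison_range_m(lane: List[Point],
                       straight_range_m: float = 50.0,
                       curved_range_m: float = 20.0,
                       curvature_threshold: float = 0.01) -> float:
    # Shorten the range over which the difference is evaluated when the lane curves sharply,
    # so that small longitudinal misalignment does not dominate the comparison.
    if len(lane) < 3:
        return straight_range_m
    max_k = max(three_point_curvature(lane[i - 1], lane[i], lane[i + 1])
                for i in range(1, len(lane) - 1))
    return curved_range_m if max_k > curvature_threshold else straight_range_m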
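Claim 3 falls back on a predetermined lane width when the two sources express lane shape in different formats, for example one as a lane center line and the other as lane boundary lines. A sketch of that conversion, with 3.5 m as an assumed (not claimed) default width:

import math
from typing import List, Tuple

Point = Tuple[float, float]

DEFAULT_LANE_WIDTH_M = 3.5   # assumed value for the "predetermined lane width"

def centerline_from_boundaries(left: List[Point], right: List[Point]) -> List[Point]:
    # If both boundary lines are available, the center line is simply their mid-line.
    n = min(len(left), len(right))
    return [((left[i][0] + right[i][0]) / 2.0, (left[i][1] + right[i][1]) / 2.0)
            for i in range(n)]

def centerline_from_one_boundary(boundary: List[Point],
                                 width_m: float = DEFAULT_LANE_WIDTH_M,
                                 side: float = 1.0) -> List[Point]:
    # If only one boundary line is available, offset it by half the predetermined lane
    # width towards the lane interior (side = +1 or -1), using the local heading.
    out = []
    for i in range(len(boundary)):
        j = min(i + 1, len(boundary) - 1)
        k = max(i - 1, 0)
        heading = math.atan2(boundary[j][1] - boundary[k][1], boundary[j][0] - boundary[k][0])
        x, y = boundary[i]
        out.append((x - side * (width_m / 2.0) * math.sin(heading),
                    y + side * (width_m / 2.0) * math.cos(heading)))
    return out

Once both shapes are expressed in the same format, the difference of claim 1 can be taken as before.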
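Claim 4 handles the case where each source describes two adjacent lanes but in different formats, by comparing the center line of the two detected lanes with the center line of the two map lanes. A sketch, reusing lane_shape_difference from the claim 1 sketch above:

from typing import List, Tuple

Point = Tuple[float, float]

def midline(shape_a: List[Point], shape_b: List[Point]) -> List[Point]:
    # Mid-line between two polylines, e.g. the center lines of two adjacent lanes.
    n = min(len(shape_a), len(shape_b))
    return [((shape_a[i][0] + shape_b[i][0]) / 2.0, (shape_a[i][1] + shape_b[i][1]) / 2.0)
            for i in range(n)]

def two_lane_difference(first_lane_a: List[Point], first_lane_b: List[Point],
                        second_lane_a: List[Point], second_lane_b: List[Point]) -> float:
    # Compare the center line of the two adjacent detected lanes with the center line
    # of the two adjacent map lanes, rather than comparing the lanes individually.
    return lane_shape_difference(midline(first_lane_a, first_lane_b),
                                 midline(second_lane_a, second_lane_b))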
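Claim 5 compensates for the time it took to generate the first lane shape information: while the shape was being produced the vehicle kept moving, so the detected shape is shifted by the travelled distance before the difference is taken. A sketch, assuming the vehicle-frame x axis points along the direction of travel:

from typing import List, Tuple

Point = Tuple[float, float]

def shift_for_generation_time(first_lane: List[Point],
                              generation_time_s: float,
                              vehicle_speed_mps: float) -> List[Point]:
    # Distance the vehicle travelled while the first lane shape information was generated.
    travelled_m = generation_time_s * vehicle_speed_mps
    # Slide the detected shape backwards along the travel axis so that it lines up
    # with the map lane shape expressed around the current vehicle position.
    return [(x - travelled_m, y) for (x, y) in first_lane]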
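Claim 6 only requires that the travel control device use both the map-related information and the map accuracy information; how it does so is left open. One plausible pattern, shown purely as an illustration, is to prefer the map lane shape while the accuracy score is high and to fall back to the directly detected lane otherwise:

from typing import List, Tuple

Point = Tuple[float, float]

def select_reference_lane(map_lane: List[Point], detected_lane: List[Point],
                          map_accuracy_score: float,
                          accuracy_threshold: float = 0.7) -> List[Point]:
    # Illustrative policy: trust the map-related information only while its
    # accompanying accuracy information stays above a threshold.
    if map_accuracy_score >= accuracy_threshold:
        return map_lane
    return detected_lane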
PCT/JP2019/032816 2019-08-22 2019-08-22 Information output device, automated driving device, and method for information output WO2021033312A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/032816 WO2021033312A1 (en) 2019-08-22 2019-08-22 Information output device, automated driving device, and method for information output

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/032816 WO2021033312A1 (en) 2019-08-22 2019-08-22 Information output device, automated driving device, and method for information output

Publications (1)

Publication Number Publication Date
WO2021033312A1 true WO2021033312A1 (en) 2021-02-25

Family

ID=74660455

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/032816 WO2021033312A1 (en) 2019-08-22 2019-08-22 Information output device, automated driving device, and method for information output

Country Status (1)

Country Link
WO (1) WO2021033312A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11211492A (en) * 1998-01-29 1999-08-06 Fuji Heavy Ind Ltd Road information recognizing device
JP2001291197A (en) * 2000-04-07 2001-10-19 Honda Motor Co Ltd Vehicle controller
JP2006290072A (en) * 2005-04-07 2006-10-26 Toyota Motor Corp Vehicle control device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7376634B2 (en) 2022-03-22 2023-11-08 本田技研工業株式会社 Vehicle control device, vehicle control method, and program
JP7449971B2 (en) 2022-03-25 2024-03-14 本田技研工業株式会社 Vehicle control device, vehicle control method, and program

Similar Documents

Publication Publication Date Title
JP6260114B2 (en) Traveling route information generation device
US7463974B2 (en) Systems, methods, and programs for determining whether a vehicle is on-road or off-road
US8112222B2 (en) Lane determining device, method, and program
CN110249207B (en) Method and apparatus for updating digital map
JP6229523B2 (en) Own vehicle traveling position specifying device and own vehicle traveling position specifying program
US20150354968A1 (en) Curve modeling device, curve modeling method, and vehicular navigation device
US8731825B2 (en) Trajectory display device
JP2009008590A (en) Vehicle-position-recognition apparatus and vehicle-position-recognition program
WO2015131464A1 (en) Method and device for correcting error of vehicle positioning
JP4596566B2 (en) Self-vehicle information recognition device and self-vehicle information recognition method
CN101576387B (en) Navigation information revision method and navigation device thereof
JP5762656B2 (en) Vehicle position display control device and vehicle position specifying program
US11668583B2 (en) Method, apparatus, and computer program product for establishing reliability of crowd sourced data
JP2010276545A (en) Apparatus and method for measuring vehicle position
WO2021033312A1 (en) Information output device, automated driving device, and method for information output
CN110375786B (en) Calibration method of sensor external parameter, vehicle-mounted equipment and storage medium
JP6790951B2 (en) Map information learning method and map information learning device
JP2017187989A (en) Position acquisition system
JP4953015B2 (en) Own vehicle position recognition device, own vehicle position recognition program, and navigation device using the same
JP4848931B2 (en) Signal correction device for angular velocity sensor
JP2018091714A (en) Vehicle position measurement system
JP2007322312A (en) Navigation system
KR101676145B1 (en) Curvature calculation device and curvature correction method
WO2020021596A1 (en) Vehicle position estimation device and vehicle position estimation method
WO2020031295A1 (en) Self-location estimation method and self-location estimation device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19942093

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19942093

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP