WO2019107397A1 - Data structure of map data - Google Patents

Data structure of map data

Info

Publication number
WO2019107397A1
WO2019107397A1 (PCT application PCT/JP2018/043742)
Authority
WO
WIPO (PCT)
Prior art keywords
data
feature
information
map data
data structure
Prior art date
Application number
PCT/JP2018/043742
Other languages
French (fr)
Japanese (ja)
Inventor
多史 藤谷
加藤 正浩
岩井 智昭
研司 水戸
Original Assignee
パイオニア株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パイオニア株式会社 filed Critical パイオニア株式会社
Publication of WO2019107397A1 publication Critical patent/WO2019107397A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3807 Creation or updating of map data characterised by the type of data
    • G01C21/3815 Road data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3833 Creation or updating of map data characterised by the source of data
    • G01C21/3837 Data obtained from a single source
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3863 Structures of map data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10 Map spot or coordinate position indicators; Map reading aids

Definitions

  • the present application relates to the technical field of the data structure of map data used to detect features.
  • in an autonomous driving vehicle, the feature positions measured by a sensor such as a LIDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging) sensor must be matched against the feature positions described in the map data for automated driving in order to estimate the vehicle position with high accuracy.
  • features to be used include signs and billboards, and map data for automated driving that includes the positions of these features is important for stable automated driving.
  • Patent Document 1 discloses a technique for extracting the data that measures the road surface from point cloud data that also contains many points from objects other than the road surface, such as buildings and roadside trees. Using the technique of Patent Document 1, a road area can be determined from point cloud data obtained by measuring the ground surface. Moreover, with laser measurement techniques including that of Patent Document 1, it is also possible to detect features other than the road surface, such as curve mirrors installed at the roadside.
  • because it is used to estimate the vehicle position as described above, information indicating how easily the LiDAR sensor can detect each feature present at the roadside or the like is important. For example, if the vehicle position is estimated using only features that are easy to detect, the vehicle position can be estimated with high accuracy.
  • in view of this, one example of the problem addressed by the present application is to provide a data structure of map data that includes information indicating the detectability of an object such as a feature.
  • the invention according to claim 1 is a data structure of map data that includes, in association with each other, position information indicating the position of a feature and detectability information indicating how easily the feature can be detected by a detection means. It is characterized in that the map data is used in a process in which the detection means mounted on a moving body determines, based on the detectability information, the detectability of the feature specified by the position information.
  • FIG. 1 is a block diagram showing the data structure of the map data of the embodiment.
  • FIG. 2 is a block diagram showing a schematic configuration of the map data management system of the example.
  • FIGS. 3A, 3B, and 3C are diagrams (I), (II), and (III) illustrating the format of the map data of the example.
  • FIG. 4 is a flowchart showing an operation example of the process using the detectability information of the example.
  • FIG. 1 is a block diagram showing the data structure of the map data of this embodiment.
  • as shown in FIG. 1, the map data FM has a data structure in which position information 1A indicating the position of a feature is included in association with detectability information 1B indicating how easily the feature can be detected by the detection means.
  • this map data FM is used in a process in which the detection means mounted on a moving body determines, based on the detectability information 1B, the detectability of the feature specified by the position information 1A.
  • in this way, the data structure of the map data FM includes the position information 1A and the detectability information 1B associated with it, and it is used in the process in which the detection unit mounted on the moving body determines, based on the detectability information 1B, the detectability of the feature specified by the position information 1A. A data structure of map data that includes information indicating the detectability of an object such as a feature can therefore be provided.
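As a rough illustration only (the patent specifies no concrete encoding), the association between position information 1A and detectability information 1B might be modeled as a record like the following; all field names and the threshold are invented for this sketch:

```python
from dataclasses import dataclass

@dataclass
class FeatureRecord:
    """One entry of map data FM: position information (1A) held in
    association with detectability information (1B). Names are
    hypothetical; they are not taken from the patent."""
    latitude: float
    longitude: float
    elevation: float
    detectability: int  # 1 (hardest to detect) .. 10 (easiest)

def is_reliable(record: FeatureRecord, threshold: int = 7) -> bool:
    """Terminal-side check: is this feature easy enough to detect to use?"""
    return record.detectability >= threshold

sign = FeatureRecord(latitude=35.6, longitude=139.7, elevation=12.0, detectability=9)
print(is_reliable(sign))  # True
```

The point of the structure is that position and detectability travel together, so a consumer can filter features before ever attempting detection.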
  • FIG. 2 is a block diagram showing a schematic configuration of the map data management system of the embodiment.
  • FIG. 3 is a view exemplifying the format of the map data of the embodiment.
  • FIG. 4 is a flowchart showing an operation example of the process using the detectability information of the embodiment.
  • the map data management system S of this embodiment includes a server device 100 that manages map data, and an on-board terminal 200 (an example of a terminal of the embodiment) mounted on each of a plurality of vehicles.
  • the server device 100 and each in-vehicle terminal 200 are connected so as to be able to transmit and receive various data via a network NW such as the Internet, for example.
  • the map data management system S may include a plurality of in-vehicle terminals 200.
  • the server apparatus 100 may also be configured by a plurality of apparatuses.
  • the on-vehicle terminal 200 detects features existing in the vicinity of the vehicle by receiving the laser light emitted by the LiDAR sensor and reflected by those features, and transmits transmission data including data corresponding to the detection result to the server device 100.
  • in addition, the on-vehicle terminal 200 performs a process, described later, on a feature based on the laser light reflected by that feature.
  • the server device 100 updates the map data (described later) recorded in the server device 100 based on the plurality of transmission data received from each of the plurality of in-vehicle terminals 200 and, in response to a request from any in-vehicle terminal 200, transmits the map data corresponding to the request to the requesting on-vehicle terminal 200.
  • the update of the map data may be performed by another device that has received an instruction from the server device 100.
  • the on-vehicle terminal 200 is roughly divided into a control unit 201, a storage unit 202, a communication unit 203, and an interface unit 204; the interface unit 204 is connected to the LiDAR sensor 205 and the internal sensor 206.
  • the storage unit 202 is configured of, for example, a hard disk drive (HDD) or a solid state drive (SSD), and stores an operating system (OS), a processing program of the embodiment, map data, various data, and the like.
  • the map data includes, for each feature to be detected by the LiDAR sensor 205, map position information indicating the position of the feature, a feature ID for distinguishing the feature from other features, and detectability information for the feature (an example of the detectability information 1B of the embodiment).
  • because the map position information and the feature ID are both linked to one feature, the feature ID can be regarded as one kind of position information indicating the position of that feature.
  • map data similar to the map data stored in the storage unit 202, that is, map data in which map position information, a feature ID, and detectability information are described for each feature, is also held on the server device 100 side.
  • the map data stored in the storage unit 202 may be, for example, map data of the whole country, or map data corresponding to a certain area including the current position of the vehicle may be received in advance from the server device 100 or the like and stored.
  • the communication unit 203 controls the communication state between the in-vehicle terminal 200 and the server device 100.
  • the interface unit 204 implements the interface function for exchanging data between the on-vehicle terminal 200 and the LiDAR sensor 205 and internal sensor 206.
  • the LiDAR sensor 205 is attached, for example, to the roof of the vehicle. As one of its functions, it emits and scans a pulsed laser beam (more specifically, for example, a pulsed infrared laser beam) so that the irradiation point of the beam draws a circle around the vehicle; at this time, the LiDAR sensor 205 emits the laser light downward at a constant angle from, for example, the roof. It then receives the light reflected at points on the surface of each feature around the vehicle and generates reflection intensity data indicating the reflection intensity at each point, together with point cloud distance data.
  • the reflection intensity data is data indicating the reflection intensity of the laser light by the ground and features.
  • the point cloud distance data is data indicating the direction in which the laser beam is emitted and the distance to the point on the ground or feature irradiated by the beam.
  • These reflection intensity data and point cloud distance data are generically called sensor data.
  • a plurality of LiDAR sensors 205 may be attached to the front and rear of the vehicle, and the reflection intensity data and point cloud distance data of the fields of view acquired by each may be combined to generate sensor data covering the surroundings of the vehicle.
  • the LiDAR sensor 205 immediately generates sensor data when the reflection intensity is measured, and transmits the sensor data to the on-vehicle terminal 200 via the interface unit 204.
  • the control unit 201 stores the received sensor data in the storage unit 202 in association with measured position information indicating the position of the vehicle at the time the sensor data was received (that is, the position of the in-vehicle terminal 200 including the LiDAR sensor 205) and measurement date and time information indicating the date and time the sensor data was received. Note that the control unit 201 may delete from the storage unit 202 any sensor data, measured position information, and measurement date and time information for which a predetermined time has elapsed since measurement, or which has already been transmitted to the server device 100.
  • the control unit 201 includes a central processing unit (CPU) that controls the entire control unit 201, a read only memory (ROM) in which a control program for controlling the control unit 201 and the like is stored in advance, and a random access memory (RAM) that temporarily stores various data. The CPU reads out and executes various programs stored in the ROM and the storage unit 202 to realize various functions, including the processing using the map data of the embodiment.
  • the control unit 201 acquires estimated own vehicle position information.
  • the estimated own vehicle position information may be generated by a device outside the on-board terminal 200, or may be generated by the control unit 201.
  • the estimated vehicle position information may be generated, for example, by matching the feature positions measured by the LiDAR sensor 205 against the feature positions in the map data for automated driving, or it may be generated based on information detected by the internal sensor 206 and the above map data, or on a combination of these.
  • the control unit 201 detects, for example, a feature present on the side of the road based on the estimated own vehicle position information and the sensor data from the LiDAR sensor 205, and transmits transmission data corresponding to the detection result to the server device 100.
  • the control unit 201 also detects a feature corresponding to the detectability information based on the estimated own vehicle position information and the detectability information of the embodiment included in the map data. The detection of a feature using the detectability information of this embodiment will be described in detail later using FIG. 4.
  • the server apparatus 100 is roughly divided into a control unit 101, a storage unit 102, a communication unit 103, a display unit 104, and an operation unit 105.
  • the storage unit 102 is configured by, for example, an HDD or an SSD, and stores the OS, the map data, transmission data received from the in-vehicle terminal 200, and various other data.
  • the communication unit 103 controls the communication state with the on-vehicle terminal 200.
  • the display unit 104 is configured of, for example, a liquid crystal display or the like, and displays information such as characters and images.
  • the operation unit 105 includes, for example, a keyboard, a mouse, and the like, receives an operation instruction from an operator, and outputs the content of the instruction to the control unit 101 as an instruction signal.
  • the control unit 101 includes a CPU that controls the entire control unit 101, a ROM in which a control program that controls the control unit 101 and the like is stored in advance, and a RAM that temporarily stores various data. Then, the CPU reads and executes various programs stored in the ROM and the storage unit 102 to realize various functions including reception processing of transmission data by the map data management system S of the embodiment.
  • the control unit 101 receives the transmission data from each of the one or more on-vehicle terminals 200 and uses them to update the map data for the corresponding features.
  • a first example of the data structure is described with reference to FIG. 3A. That is, as a first example of the data structure of the map data stored in the on-vehicle terminal 200 and the server device 100 according to the embodiment, there is a data structure composed of basic information FMB and detectability information FMS, as illustrated as map data FM1 in FIG. 3A.
  • the basic information FMB is composed of object ID data FMB1, object type data FMB2, position data FMB3, affiliated road data FMB4, and size data FMB5.
  • the object ID data FMB1 is ID data for identifying the corresponding feature from the other features.
  • the object type data FMB2 is code data for specifying the type of the corresponding feature.
  • the position data FMB3 is data indicating the latitude and longitude and the height (elevation) of the position where the corresponding feature exists.
  • the belonging road data FMB4 is data indicating the link ID, within the map data FM, of the road to which the corresponding feature belongs (that is, the road on whose side, for example, the feature stands).
  • the size data FMB5 is data indicating the vertical and horizontal sizes of the feature part of the corresponding feature.
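To make the layout of FMB1 through FMB5 concrete, the basic information FMB could be sketched as the record below; the field names and sample values are hypothetical, chosen only to mirror the five data items just listed:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class BasicInformation:
    """Sketch of the basic information FMB of map data FM1."""
    object_id: str                        # FMB1: identifies the feature among others
    object_type: int                      # FMB2: code specifying the feature type
    position: Tuple[float, float, float]  # FMB3: latitude, longitude, elevation
    belonging_link_id: str                # FMB4: link ID of the road the feature belongs to
    size: Tuple[float, float]             # FMB5: vertical and horizontal size of the feature part

sign = BasicInformation(
    object_id="obj-0001",
    object_type=3,
    position=(35.0, 135.0, 4.5),
    belonging_link_id="link-42",
    size=(0.9, 0.6),
)
print(sign.belonging_link_id)  # link-42
```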
  • in the map data FM1, the detectability information FMS is described for each type of sensor connected to the on-vehicle terminal 200. That is, LiDAR detectability information FMS1 (for the LiDAR sensor 205) and camera detectability information FMS2 are described, each divided into time zones. A ten-step level is illustrated here as the detectability, where 10 means easiest to detect and 1 means hardest to detect.
  • the detectability by the LiDAR sensor 205 is determined in advance by, for example, the color or material of the sign that is the feature. For example, a sign using a so-called retroreflective material has high detectability by the LiDAR sensor 205, while a sign with many portions of other colors has lower detectability. Because this detectability does not depend on the time of day, it is constant from 00:00 to 23:59.
  • the detectability by the camera is determined in advance by, for example, the brightness of the surrounding environment. Therefore, from night to early morning, a level indicating that detection is difficult is set.
  • the method of expressing the detectability information FMS may be changed according to the type of sensor (for example, the LiDAR sensor 205 or the camera). Specifically, for the LiDAR sensor 205, since it is not necessary to separate the detectability information FMS by time zone, a detectability level according to the intensity of the emitted laser beam may be defined, while for the camera, detectability information FMS2 per time zone can be set, as illustrated in FIG. 3A.
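A hedged sketch of the two representations just described: a single level for the LiDAR sensor (no time-zone split needed) and a per-time-zone table for the camera. The levels and hour boundaries below are invented for illustration:

```python
# LiDAR detectability (FMS1): one level, valid at all times of day.
LIDAR_DETECTABILITY = 9

# Camera detectability (FMS2): level per time zone (half-open hour ranges);
# low levels from night to early morning, when the scene is dark.
CAMERA_DETECTABILITY = [
    (0, 5, 2),    # 00:00-04:59 -> level 2
    (5, 18, 8),   # 05:00-17:59 -> level 8
    (18, 24, 3),  # 18:00-23:59 -> level 3
]

def camera_level(hour: int) -> int:
    """Look up the camera detectability level for a given hour of day."""
    for start, end, level in CAMERA_DETECTABILITY:
        if start <= hour < end:
            return level
    raise ValueError("hour out of range")

print(camera_level(12))  # 8: daytime, easy to detect
print(camera_level(23))  # 3: night, hard to detect
```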
  • as a second example of the data structure of the map data of the embodiment, there is a data structure composed of the basic information FMB and the detectability information FMS, as illustrated as map data FM2 in FIG. 3B.
  • the basic information FMB has the same configuration as the map data FM1 illustrated in FIG. 3A, so the detailed description will be omitted.
  • the number indicating the detectability is expressed on a ten-step level, as in FIG. 3A.
  • in the map data FM2, the detectability information FMS is described for each road to which the corresponding feature belongs. That is, as illustrated in FIG. 3B, road correspondence detectability information FMS5 is described that indicates, in steps, how easily the corresponding feature can be detected by the sensor (for example, the LiDAR sensor 205) when viewed from each road (that is, when the laser light of the LiDAR sensor 205 is emitted from above that road).
  • for a road from which the sign is easily detected, the road correspondence detectability information FMS5 is set to a high value ("10" in the example of FIG. 3B) compared to the others.
  • for the road indicated by link ID: xxx2, which corresponds to the opposite lane of the road indicated by link ID: xxx1, the sign is also easily detected from the road, so a high value is likewise set as the road correspondence detectability information FMS5.
  • a third example of the data structure of the map data of the embodiment will be described using FIG. 3C. That is, as a third example of the data structure of the map data FM of the embodiment, there is a data structure composed of basic information FMB and detectability information FMS, as illustrated as map data FM3 in FIG. 3C.
  • the basic information FMB has the same configuration as the map data FM1 illustrated in FIG. 3A, so the detailed description will be omitted.
  • in the map data FM3, the detectability information FMS describes the degree to which the corresponding feature is occluded, for example by nearby vegetation. That is, as illustrated as occlusion correspondence detectability information FMS6 in FIG. 3C, when the sign that is the corresponding feature is occluded by vegetation, occlusion correspondence detectability information FMS6 is described that indicates, as a proportion, the detectability of the sign according to the degree of occlusion. Since an occluded sign is harder to detect than other signs, the degree of occlusion can serve as detectability information. Note that the occlusion correspondence detectability information FMS6 may instead be information indicating the presence or absence of occlusion of the feature, rather than the proportion described above.
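One plausible reading of FMS6, with invented numbers: the occlusion ratio discounts a feature's base detectability, or is collapsed to the simple occluded/not-occluded flag mentioned above:

```python
def effective_detectability(base_level: int, occlusion_ratio: float) -> float:
    """Scale a 1-10 detectability level by the unoccluded fraction of the feature."""
    return base_level * (1.0 - occlusion_ratio)

# A sign 40% hidden by vegetation is markedly harder to detect.
print(effective_detectability(10, 0.4))  # 6.0

def is_occluded(occlusion_ratio: float) -> bool:
    """The presence/absence variant of FMS6."""
    return occlusion_ratio > 0.0

print(is_occluded(0.4))  # True
```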
  • control unit 201 of the on-vehicle terminal 200 acquires estimated own vehicle position information (step S101).
  • next, the control unit 201 acquires from the map data FM the detectability information FMS corresponding to the object ID data representing features present along the moving direction of the vehicle on which the on-board terminal 200 is mounted, together with the position data FMB3 and the like indicating the positions of those features (step S102).
  • the control unit 201 then detects the presence of each feature using the position data FMB3 and the like (step S103).
  • at this time, the control unit 201 determines the detectability of each feature specified by the position data FMB3 based on the detectability information FMS and, based on the determination result, selects the features to be used, for example, for estimating the vehicle position.
  • specifically, the control unit 201 selects features in descending order of the value of the LiDAR detectability information FMS1, the camera detectability information FMS2, or the road correspondence detectability information FMS5.
  • for the map data FM3, the control unit 201 selects features in ascending order of the occlusion correspondence detectability information FMS6 (that is, starting from features whose occlusion ratio is low). The control unit 201 then detects the positions of the selected features and estimates, for example, the position of the vehicle on which the on-vehicle terminal 200 is mounted using the position data FMB3 of the detected features (step S104). Thereafter, the control unit 201 ends the process using the detectability information of the embodiment.
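The selection performed in steps S102 through S104 can be sketched as ranking the candidate features by their detectability level and keeping the best for position estimation; the candidate list and the cut-off are hypothetical:

```python
# Candidate features along the moving direction: (object_id, detectability level 1-10).
candidates = [
    ("obj-a", 4),
    ("obj-b", 9),
    ("obj-c", 7),
]

def select_features(features, count=2):
    """Pick the features that are easiest to detect (highest level first),
    as the control unit 201 does before estimating the vehicle position."""
    ranked = sorted(features, key=lambda f: f[1], reverse=True)
    return [object_id for object_id, _ in ranked[:count]]

print(select_features(candidates))  # ['obj-b', 'obj-c']
```

For the occlusion variant (map data FM3), the same routine would simply sort ascending on the occlusion ratio instead.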
  • as described above, the map data FM in the map data management system S of the present embodiment has a data structure in which position data FMB3 (an example of the position information of the embodiment) indicating the position of a feature is associated with detectability information FMS indicating the detectability of the feature by the LiDAR sensor 205 (an example of the detection means of the embodiment), and it is used in the process in which the on-vehicle terminal 200 determines, based on the detectability information FMS, the detectability of the feature identified by the position data FMB3. Therefore, a data structure of map data FM including information indicating the detectability of a feature can be provided.
  • furthermore, since the map data FM is used in the process in which the on-vehicle terminal 200 selects features based on the detectability determined as described above, features that can be used, for example, for detecting the vehicle position can be selected.
  • in addition, since the map data FM is used in the process of specifying the position of a feature using the position data FMB3 of the selected feature, the on-vehicle terminal 200 can accurately specify, for example, the positions of the features required for detecting the vehicle position.
  • furthermore, since the road correspondence detectability information FMS5 is included in the map data FM distinguished for each road on which the feature is disposed, the required features can be detected for each road.
  • note that the road correspondence detectability information FMS5 may be included in the map data FM distinguished both for each road on which the feature is disposed and for each movement direction of the vehicle on which the on-board terminal 200 is mounted. In this case, the necessary features can be detected at a finer granularity.
  • in addition, since the LiDAR detectability information FMS1 and the camera detectability information FMS2 are included in the map data FM distinguished by time zone, the required features can be detected for each time zone.
  • note that the LiDAR detectability information FMS1 may also be described in terms of the reflectance of the laser light.
  • furthermore, since the detectability is expressed in a representation suited to the type of sensor connected to the on-vehicle terminal 200, the necessary features can be detected more accurately according to the sensor type.
  • in addition, since the occlusion correspondence detectability information FMS6 indicates the detectability by the presence or absence of occlusion of the feature by an obstacle, or by the occlusion ratio of the feature due to the obstacle, a feature occluded by an obstacle such as vegetation can be detected accurately in consideration of that influence.
  • furthermore, the detectability information FMS may be included in the map data FM distinguished by weather, and the appropriate detectability information FMS may be used selectively according to the weather detected at the time the feature is actually detected. In this case, the necessary features can be detected more accurately according to the weather.
  • the material of the target feature can also be indicated as the detectability information FMS. Specifically, it indicates whether the target feature uses a retroreflective material or is a light-emitting board that displays an image using LEDs (Light Emitting Diodes) or the like. In this case, a feature using retroreflective material can be treated as easy for the LiDAR sensor 205 to recognize, whereas a feature on a light-emitting board can be treated as hard to recognize.
  • the detectability information FMS can also be indicated by the presence or absence of deterioration of the target feature, or by the progress of that deterioration.
  • for example, a lane marking drawn on a road becomes difficult for the LiDAR sensor 205 to recognize when it is worn away by deterioration.
  • it is also possible to record a program corresponding to the flowchart shown in FIG. 4 on a recording medium such as an optical disk or a hard disk, or to obtain it via a network such as the Internet and record it, and then to read it out and execute it with a general-purpose microcomputer or the like, thereby causing that microcomputer or the like to function as the control unit 201 of the embodiment.
  • Reference signs: FM1, FM2, FM3: map data; 1A: position information; 1B: detectability information; S: map data management system; 100: server device; 101, 201: control units; 200: on-vehicle terminal; 205: LiDAR sensor; FMB: basic information; FMB1: object ID data; FMB2: object type data; FMB3: position data; FMB4: belonging road data; FMB5: size data; FMS: detectability information; FMS1: LiDAR detectability information; FMS2: camera detectability information; FMS5: road correspondence detectability information; FMS6: occlusion correspondence detectability information

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Databases & Information Systems (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

Provided is a data structure of map data including information that indicates the ease of detecting an object such as a feature. The data structure of map data FM1 includes, in association with each other, position data FMB3 indicating the position of a feature and detectability information FMS indicating the ease of detecting the feature by a LiDAR sensor, and it is used in a process in which an onboard terminal determines, on the basis of the detectability information FMS, the ease of detecting the feature specified by the position data FMB3.

Description

地図データのデータ構造Data structure of map data
 本願は、地物の検出に用いられる地図データのデータ構造の技術分野に関する。 The present application relates to the technical field of the data structure of map data used to detect features.
 自動運転車両では、LIDAR(Light Detection and Ranging、Laser Imaging Detection and Ranging)センサなどのセンサで計測した地物位置と、自動運転用の地図データに記述された地物位置と、をマッチングして高精度に自車位置を推定する必要がある。利用する地物として標識や看板などがあり、これらの地物の位置を含む自動運転用の地図データは、安定した自動運転を行うために重要である。 In an autonomous driving vehicle, the feature position measured by a sensor such as a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) sensor is matched with the feature position described in map data for automatic It is necessary to estimate the vehicle position to the accuracy. There are signs and signs as features to be used, and map data for automatic operation including the positions of these features is important for performing stable automatic operation.
 このような地図データに関連する一技術として、地表面にレーザー光を照射し、その反射光を受光して得られた点群データを用いて地図データの利用・更新等に活かすレーザー計測技術の開発が進められている。例えば、特許文献1には、点群データに建物や街路樹等、道路面以外のデータも多く含まれているところ、この中から道路面を計測しているデータを抽出する技術が開示されている。特許文献1の技術を用いることにより、地表面を計測した点群データから道路領域を判定することができる。また、特許文献1の技術を含む上記レーザー計測技術を用いれば、上記道路面以外に、道路脇等に設置されているカーブミラー等の地物を検出することも可能である。 One of the techniques related to such map data is a laser measurement technology that uses laser light on the ground surface and receives point light from the reflected light to utilize and update map data, etc. Development is in progress. For example, Patent Document 1 discloses a technique for extracting data measuring a road surface from among points cloud data including many data other than the road surface, such as buildings and road trees, etc. There is. By using the technology of Patent Document 1, it is possible to determine a road area from point cloud data obtained by measuring the ground surface. Moreover, if the said laser measurement technique including the technique of patent document 1 is used, it is also possible to detect terrestrial features, such as a curve mirror installed in the roadside etc. besides the said road surface.
Patent Document 1: JP 2017-9378 A
Here, because it is used in estimating the vehicle position as described above, information indicating how easily each feature present at the roadside or elsewhere can be detected by a LiDAR sensor is important. For example, if the vehicle position is estimated using only features that are easy to detect, the vehicle position can be estimated with high accuracy.
The present application has been made in view of this requirement, and one example of the problem it addresses is to provide a data structure for map data that includes information indicating how easily an object such as a feature can be detected.
The invention according to claim 1 is a data structure for map data that includes, in association with each other, position information indicating the position of a feature and detectability information indicating how easily the feature can be detected by a detecting means, wherein the map data is used in a process in which the detecting means mounted on a moving body determines, based on the detectability information, how easily the feature specified by the position information can be detected.
FIG. 1 is a block diagram showing the data structure of the map data of the embodiment. FIG. 2 is a block diagram showing the schematic configuration of the map data management system of the example. FIGS. 3(A), 3(B), and 3(C) are diagrams (I), (II), and (III) illustrating formats of the map data of the example. FIG. 4 is a flowchart showing an operation example of processing that uses the detectability information of the example.
A mode for carrying out the present invention will be described with reference to FIG. 1. FIG. 1 is a block diagram showing the data structure of the map data of this embodiment.
As shown in FIG. 1, the data structure of this embodiment is a data structure for map data FM that includes, in association with each other, position information 1A indicating the position of a feature and detectability information 1B indicating how easily the feature can be detected by a detecting means.
The map data FM is used in a process in which the detecting means mounted on a moving body determines, based on the detectability information 1B, how easily the feature specified by the position information 1A can be detected.
As described above, the data structure of the map data FM of this embodiment includes the position information 1A and the detectability information 1B associated with it, and is used in a process in which a detecting means mounted on a moving body determines, based on the detectability information 1B, how easily the feature specified by the position information 1A can be detected. A data structure for map data that includes information indicating how easily an object such as a feature can be detected can therefore be provided.
An example will be described with reference to FIGS. 2 to 4. The example described below applies the above embodiment to a map data management system S. FIG. 2 is a block diagram showing the schematic configuration of the map data management system of the example, FIG. 3 illustrates formats of the map data of the example, and FIG. 4 is a flowchart showing an operation example of processing that uses the detectability information of the example.
[1. Configuration and Outline of Map Data Management System S]
As shown in FIG. 2, the map data management system S of this example comprises a server device 100 that manages map data and in-vehicle terminals 200 (an example of the terminal of the embodiment) mounted on a plurality of vehicles, and the server device 100 and each in-vehicle terminal 200 are connected so that various data can be exchanged via a network NW such as the Internet. Although FIG. 2 shows a single in-vehicle terminal 200, the map data management system S may include a plurality of in-vehicle terminals 200, and the server device 100 may likewise be composed of a plurality of devices.
In a vehicle equipped with the in-vehicle terminal 200 and the LiDAR sensor described later, the in-vehicle terminal 200 detects a feature present around the vehicle by receiving, at the LiDAR sensor, the reflection from that feature of laser light that the sensor itself has emitted, and transmits to the server device 100 transmission data that includes data corresponding to the detection result. A roadside traffic sign is an example of such a feature in this example. In addition, the in-vehicle terminal 200 performs the processing described later for the feature based on the laser light reflected by it.
Based on the plurality of transmission data received from the plurality of in-vehicle terminals 200, the server device 100 updates the map data, described later, recorded in the server device 100, and, in response to a request from any of the in-vehicle terminals 200, transmits the map data corresponding to the request to the requesting in-vehicle terminal 200. The map data may instead be updated by another device instructed by the server device 100.
[2. Configuration of In-Vehicle Terminal 200]
Next, the configuration of the in-vehicle terminal 200 of this example will be described.
As shown in FIG. 2, the in-vehicle terminal 200 broadly comprises a control unit 201, a storage unit 202, a communication unit 203, and an interface unit 204, and is connected to a LiDAR sensor 205 and internal sensors 206, which are external devices.
The storage unit 202 is composed of, for example, an HDD (Hard Disk Drive) or an SSD (Solid State Drive), and stores an OS (Operating System), the processing program of the example, map data, and various other data. For each feature to be detected by the LiDAR sensor 205, the map data describes map position information indicating its position, a feature ID for distinguishing the feature from other features, and the detectability information of the example (an example of the detectability information 1B of the embodiment). Because the map position information and the feature ID are both tied to a single feature, the feature ID can be regarded as one form of position information indicating the position of that feature. Besides the traffic sign of this example, examples of such features include people, other vehicles, traffic lights, signboards, buildings, and roadside vegetation present around the vehicle on which the in-vehicle terminal 200 is mounted.
Map data similar to that stored in the storage unit 202 (that is, map data in which map position information, a feature ID, and detectability information are described for each feature) is also stored in the storage unit 102 of the server device 100, so that the in-vehicle terminal 200 and the server device 100 can each identify the same feature by its feature ID. The map data stored in the storage unit 202 may, for example, cover the whole country, or map data corresponding to a certain area including the current position of the vehicle may be received in advance from the server device 100 or elsewhere and stored.
The communication unit 203 controls the state of communication between the in-vehicle terminal 200 and the server device 100.
The interface unit 204 provides the interface function used when data is exchanged between the LiDAR sensor 205 and internal sensors 206 on the one hand and the in-vehicle terminal 200 on the other.
The LiDAR sensor 205 is attached to, for example, the roof of the vehicle and, as one of its functions, emits and scans pulsed laser light (more specifically, for example, pulsed infrared laser light) so that the irradiation points trace a circle around the vehicle. In doing so, the LiDAR sensor 205 emits the laser light downward at a fixed angle from, for example, the roof. It then receives the light reflected at points on the surfaces of the features around the vehicle and generates reflection intensity data indicating the reflection intensity at each point, together with point cloud distance data. The reflection intensity data indicates the intensity with which the laser light is reflected by the ground and by features. The point cloud distance data indicates the direction in which the laser light was emitted and the distance to the irradiation point on the ground or feature lying in that direction. The reflection intensity data and the point cloud distance data are collectively called sensor data. A plurality of LiDAR sensors 205 may also be attached to the front, rear, and other parts of the vehicle, in which case the reflection intensity data and point cloud distance data acquired over their respective fields of view are combined to generate sensor data covering the vehicle's surroundings.
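As a rough illustration only (not part of the publication), the sensor data just described, reflection intensity plus point cloud distance per irradiation direction, mergeable across multiple LiDAR units, might be represented as follows; all names are hypothetical:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ScanPoint:
    azimuth_deg: float    # direction in which the pulse was emitted
    distance_m: float     # range to the irradiation point (point cloud distance data)
    reflectivity: float   # reflection intensity at that point

@dataclass
class SensorData:
    # "sensor data" in the text = reflection intensity data + point cloud distance data
    points: List[ScanPoint] = field(default_factory=list)

    def merged_with(self, other: "SensorData") -> "SensorData":
        # scans from several LiDAR units (front, rear, ...) can be
        # combined into surround-view sensor data
        return SensorData(points=self.points + other.points)
```

A front-unit scan and a rear-unit scan would then simply be concatenated into one surround-view scan.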
The LiDAR sensor 205 generates sensor data as soon as it measures the reflection intensity and transmits it to the in-vehicle terminal 200 via the interface unit 204. On receiving sensor data from the LiDAR sensor 205, the control unit 201 stores it in the storage unit 202 in association with measurement position information indicating the position of the vehicle at the time of reception (that is, the position of the in-vehicle terminal 200 including the LiDAR sensor 205) and measurement date-and-time information indicating the date and time of reception. The control unit 201 may delete from the storage unit 202 any sensor data, measurement position information, and measurement date-and-time information for which a predetermined time has elapsed since measurement, or which has already been transmitted to the server device 100.
The internal sensors 206 are a collective term for the satellite positioning sensor (GNSS (Global Navigation Satellite System) receiver), gyro sensor, vehicle speed sensor, and the like mounted on the vehicle.
The control unit 201 is composed of a CPU (Central Processing Unit) that controls the control unit 201 as a whole, a ROM (Read Only Memory) in which a control program for the control unit 201 and the like are stored in advance, and a RAM (Random Access Memory) that temporarily stores various data. The CPU reads and executes the various programs stored in the ROM and the storage unit 202, thereby realizing various functions including the processing of the example that uses the map data.
The control unit 201 acquires estimated vehicle position information. The estimated vehicle position information may be generated by a device outside the in-vehicle terminal 200 or by the control unit 201 itself. It can be generated, for example, by matching feature positions measured by the LiDAR sensor 205 against the feature positions in the map data for autonomous driving, from information detected by the internal sensors 206 together with the map data, or by a combination of these.
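The publication does not specify a matching algorithm, so the following is only a hedged sketch of the idea: if each detected feature yields its position relative to the vehicle, each (map position, relative position) pair implies one estimate of the vehicle position, and the estimates can be averaged. Rotation, data association, and noise handling are omitted; all names are illustrative:

```python
from typing import List, Tuple

def estimate_vehicle_position(
        pairs: List[Tuple[Tuple[float, float], Tuple[float, float]]]
) -> Tuple[float, float]:
    """pairs: ((map_x, map_y), (rel_x, rel_y)) per matched feature, where
    (rel_x, rel_y) is the sensor-measured position of the feature relative
    to the vehicle. Each pair implies vehicle = map - rel; average them."""
    xs = [mx - rx for (mx, my), (rx, ry) in pairs]
    ys = [my - ry for (mx, my), (rx, ry) in pairs]
    return (sum(xs) / len(pairs), sum(ys) / len(pairs))
```

A real implementation would instead use scan matching or a filter fusing the internal sensors 206, but the averaging above shows how map positions and measured relative positions combine.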
Based on the estimated vehicle position information and the sensor data from the LiDAR sensor 205, the control unit 201 also detects features present at, for example, the roadside, and transmits transmission data corresponding to the detection result to the server device 100.
At this time, the control unit 201 detects the features corresponding to the detectability information of the example contained in the map data, based on that information and on the estimated vehicle position information. Feature detection using the detectability information of the example is described in detail later with reference to FIG. 4.
[3. Configuration of Server Device 100]
Next, the configuration of the server device 100 will be described. As shown in FIG. 2, the server device 100 broadly comprises a control unit 101, a storage unit 102, a communication unit 103, a display unit 104, and an operation unit 105.
The storage unit 102 is composed of, for example, an HDD or an SSD, and stores the OS, the map data, the transmission data received from the in-vehicle terminals 200, and various other data.
The communication unit 103 controls the state of communication with the in-vehicle terminals 200.
The display unit 104 is composed of, for example, a liquid crystal display and displays information such as characters and images.
The operation unit 105 is composed of, for example, a keyboard and a mouse, accepts operation instructions from an operator, and outputs the content of those instructions to the control unit 101 as instruction signals.
The control unit 101 is composed of a CPU that controls the control unit 101 as a whole, a ROM in which a control program for the control unit 101 and the like are stored in advance, and a RAM that temporarily stores various data. The CPU reads and executes the various programs stored in the ROM and the storage unit 102, thereby realizing various functions including the reception of transmission data in the map data management system S of the example.
The control unit 101 receives the transmission data sent from one or more in-vehicle terminals 200 and uses it to update the map data for the corresponding features.
The data structure of the map data of the example will now be described with reference to FIG. 3.
First, a first example of the data structure is described with reference to FIG. 3(A). As illustrated as map data FM1 in FIG. 3(A), the first example of the data structure of the map data stored in the in-vehicle terminal 200 and the server device 100 of the example is composed of basic information FMB and detectability information FMS.
The basic information FMB consists of object ID data FMB1, object type data FMB2, position data FMB3, belonging-road data FMB4, and size data FMB5. The object ID data FMB1 is ID data for distinguishing the corresponding feature from other features. The object type data FMB2 is code data specifying the type of the corresponding feature. The position data FMB3 indicates the latitude, longitude, and height (elevation) of the position where the corresponding feature exists. The belonging-road data FMB4 indicates the link ID, in the map data FM, of the road to which the corresponding feature belongs (that is, the road that the feature stands beside, for example). The size data FMB5 indicates the vertical and horizontal dimensions of the characteristic part of the corresponding feature.
The detectability information FMS is described for each type of sensor connected to the in-vehicle terminal 200. Specifically, LiDAR detectability information FMS1 for the LiDAR sensor 205 and camera detectability information FMS2 are described, divided into time bands. Detectability is illustrated here on a ten-step scale, where 10 means easiest to detect and 1 means hardest. The ease of detection by the LiDAR sensor 205 is determined in advance by, for example, the color and material of the sign serving as the feature. For example, a sign using a so-called retroreflective material is highly detectable by the LiDAR sensor 205, whereas a sign with many black areas reflects light poorly and is therefore less detectable than signs dominated by other colors. Because the LiDAR sensor 205 can detect features regardless of the ambient brightness, its detectability in the case illustrated in FIG. 3 is constant from 00:00 to 23:59. The ease of detection by a camera, on the other hand, is determined in advance by, for example, the brightness of the surrounding environment, so levels indicating that detection is difficult are set for the night-to-early-morning hours.
The way the detectability information FMS is expressed may also be varied according to the type of sensor (for example, the LiDAR sensor 205 versus a camera). Specifically, since the detectability information FMS for the LiDAR sensor 205 does not need to be divided into time bands, a detectability level can instead be defined according to the intensity of the emitted laser light, while the camera detectability information FMS2 can be set per time band as illustrated in FIG. 3(A).
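The first example (basic information FMB plus per-sensor, per-time-band detectability on a ten-step scale) can be sketched roughly as follows; the field names, the time-band encoding, and the lookup helper are illustrative assumptions, not taken from the publication:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class BasicInfo:          # basic information FMB
    object_id: str        # FMB1: distinguishes the feature from others
    object_type: int      # FMB2: code for the feature type
    lat: float            # FMB3: latitude ...
    lon: float            #       ... longitude ...
    elevation_m: float    #       ... and height
    road_link_id: str     # FMB4: link ID of the road the feature belongs to
    size_wh_m: Tuple[float, float]  # FMB5: width and height of the feature

# detectability FMS: sensor type -> list of (start "HH:MM", end "HH:MM", level 1..10)
Detectability = Dict[str, List[Tuple[str, str, int]]]

sign_fms: Detectability = {
    # LiDAR is unaffected by ambient light, so a single constant band
    "lidar":  [("00:00", "23:59", 9)],
    # the camera is assumed hard to use at night (levels are illustrative)
    "camera": [("00:00", "05:59", 2),
               ("06:00", "17:59", 8),
               ("18:00", "23:59", 2)],
}

def level_at(fms: Detectability, sensor: str, hhmm: str) -> int:
    # zero-padded "HH:MM" strings compare correctly as text
    for start, end, level in fms[sensor]:
        if start <= hhmm <= end:
            return level
    raise ValueError(f"no band covers {hhmm}")
```

A consumer would call `level_at(sign_fms, "camera", "22:30")` to obtain the night-time camera level for this sign.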
Next, a second example of the data structure of the map data FM of the example is described with reference to FIG. 3(B). As illustrated as map data FM2 in FIG. 3(B), the second example of the data structure is likewise composed of basic information FMB and detectability information FMS.
The basic information FMB has the same configuration as in the map data FM1 illustrated in FIG. 3(A), so its details are omitted. As in FIG. 3(A), the numbers indicating detectability are expressed on a ten-step scale.
Here the detectability information FMS is described per road to which the corresponding feature relates. That is, as illustrated as road-specific detectability information FMS5 in FIG. 3(B), it indicates, in steps, how easily the corresponding feature can be detected when viewed from each road from which a sensor (for example, the LiDAR sensor 205) can detect it (that is, when the laser light of the LiDAR sensor 205 is emitted from that road). When the sign serving as the corresponding feature belongs to the road with link ID xxx1, the sign is normally installed so as to face vehicles on that road and is therefore easy to detect from it, so the road-specific detectability information FMS5 is given a value higher than the others ("10" in the case illustrated in FIG. 3(B)). Similarly, the sign is easy to detect from the road with link ID xxx2, which corresponds to the opposite lane of the road with link ID xxx1, so a high value is set for that road as well.
When the same sign is viewed from a vehicle on the road with link ID yyy1, which crosses the road with link ID xxx1 at a right angle, the sign is seen from the side and is therefore hard to detect (that is, the area that the laser light of the LiDAR sensor 205 can strike from that road is small), so the road-specific detectability information FMS5 is given a value lower than the others ("3" in the case illustrated in FIG. 3(B)). The sign is also hard to detect from the road with link ID yyy2, which corresponds to the opposite lane of the road with link ID yyy1, but because the angle to the sign is larger than from the road with link ID yyy1, a slightly larger value is set.
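The second example keys detectability to the road from which the feature is seen. A minimal sketch using the link IDs from the text follows; the mapping type, the helper, and the specific values for xxx2 and yyy2 (the text says only "high" and "slightly larger") are assumptions:

```python
# road-specific detectability FMS5 for one sign, on the ten-step scale
road_fms5 = {
    "xxx1": 10,  # road the sign belongs to: the sign faces its traffic
    "xxx2": 9,   # opposite lane of the same road: still easy to detect
    "yyy1": 3,   # crossing road: the sign is seen nearly edge-on
    "yyy2": 4,   # opposite lane of the crossing road: slightly wider angle
}

def detectability_from_road(link_id: str, default: int = 1) -> int:
    # roads not listed are treated as "hardest to detect" here
    return road_fms5.get(link_id, default)
```

A vehicle would look up the link ID of the road it is currently on to decide whether this sign is worth using.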
Finally, a third example of the data structure of the map data of the example is described with reference to FIG. 3(C). As illustrated as map data FM3 in FIG. 3(C), the third example of the data structure is likewise composed of basic information FMB and detectability information FMS.
The basic information FMB has the same configuration as in the map data FM1 illustrated in FIG. 3(A), so its details are omitted.
Here the detectability information FMS is described as the degree to which the corresponding feature is occluded by, for example, nearby vegetation. That is, as illustrated as occlusion detectability information FMS6 in FIG. 3(C), when the sign serving as the corresponding feature is occluded by vegetation, the detectability of that sign according to the degree of occlusion is described as a percentage. Because an occluded sign is harder to detect than one that is not, the degree of occlusion can be used as detectability information. Instead of the percentage described above, the occlusion detectability information FMS6 may also be information simply indicating whether the feature is occluded or not.
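The third example stores an occlusion percentage (or flag). How a consumer folds that percentage into a detectability score is not specified in the publication, so the linear scaling below is only an illustrative assumption:

```python
def occlusion_adjusted_level(base_level: int, occluded_ratio: float) -> float:
    """Scale a ten-step detectability level by the visible fraction of the
    feature (occluded_ratio = 0.3 means 30% of the sign is hidden)."""
    if not 0.0 <= occluded_ratio <= 1.0:
        raise ValueError("occluded_ratio must be in [0, 1]")
    return base_level * (1.0 - occluded_ratio)
```

With the flag-only variant, a consumer could instead simply deprioritize any feature marked as occluded.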
[4. Operation Example of Map Data Management System S]
Next, of the operation of the map data management system S, an operation example of feature detection by the in-vehicle terminal 200 using the detectability information FMS will be described with reference to the flowchart in FIG. 4. The processing of steps S101 to S104 in FIG. 4 is executed periodically (for example, every predetermined time and/or every time the vehicle carrying the in-vehicle terminal 200 travels a predetermined distance).
First, the control unit 201 of the in-vehicle terminal 200 acquires the estimated vehicle position information (step S101).
Next, based on the sensor data from the LiDAR sensor 205, the control unit 201 acquires from the map data FM the object ID data of the features present along the direction of travel of the vehicle carrying the in-vehicle terminal 200, the detectability information FMS corresponding to each of those features, and the position data FMB3 indicating their positions (step S102). The control unit 201 then detects the presence of each feature using the position data FMB3 and the like (step S103). In doing so, the control unit 201 determines, based on the detectability information FMS, how easily the feature specified by the position data FMB3 can be detected, and based on that determination selects and detects the features to be used, for example, for estimating the vehicle position. More specifically, the control unit 201 selects features in descending order of the value of the LiDAR detectability information FMS1, the camera detectability information FMS2, or the road-specific detectability information FMS5, and in ascending order of the value of the occlusion detectability information FMS6 (that is, starting with the least occluded features).
The control unit 201 then detects the positions of the selected features and, using the position data FMB3 of the detected features, estimates, for example, the position of the vehicle carrying the in-vehicle terminal 200 (step S104). The control unit 201 then ends the processing that uses the detectability information of the example.
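The selection part of steps S102 to S103 above can be sketched as follows: rank candidate features by detectability level (descending) and occlusion ratio (ascending), then keep the top-ranked ones. This is a simplified illustration; the dictionary field names are assumptions:

```python
def select_features(candidates, k=2):
    # higher detectability level first; among equal levels,
    # less-occluded features first (steps S102-S103)
    ranked = sorted(candidates,
                    key=lambda f: (-f["level"], f.get("occluded", 0.0)))
    return [f["object_id"] for f in ranked[:k]]

candidates = [
    {"object_id": "sign_A", "level": 10, "occluded": 0.0},
    {"object_id": "sign_B", "level": 10, "occluded": 0.4},
    {"object_id": "sign_C", "level": 3,  "occluded": 0.0},
]
```

The positions of the selected features would then feed the position estimation of step S104.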
As described above, the map data FM of the map data management system S of this example has a data structure associating position data FMB3 indicating the position of a feature (an example of the position information of the embodiment) with detectability information FMS indicating how easily the feature can be detected by the LiDAR sensor 205 (an example of the detecting means of the embodiment), and is used in a process in which the in-vehicle terminal 200 determines, based on the detectability information FMS, how easily the feature specified by the position data FMB3 can be detected. A data structure for map data FM that includes information indicating how easily features can be detected can therefore be provided.
 Further, because the map data FM is used in processing in which the on-board terminal 200 selects features based on the determined detectability, features that can be used, for example, to detect the vehicle's own position can be selected reliably.
 Furthermore, because the map data FM is used in processing in which the on-board terminal 200 specifies the position of a feature using the position data FMB3 of the selected feature, the position of a feature needed, for example, to detect the vehicle's own position can be specified accurately.
 Furthermore, in the second example of the data structure of the map data FM illustrated in FIG. 3(B), the road-specific detectability information FMS5 is included separately for each road on which the feature is located, so the necessary features can be detected for each road. The road-specific detectability information FMS5 may also be included in the map data FM separately for each road on which the feature is located and for each moving direction of the vehicle in which the on-board terminal 200 is mounted. In that case, the necessary features can be detected with finer granularity.
 In the first example of the data structure of the map data FM illustrated in FIG. 3(A), the LiDAR detectability information FMS1 and the camera detectability information FMS2 are included in the map data FM separately for each time zone, so the necessary features can be detected for each time zone.
 Furthermore, in the first example of the data structure illustrated in FIG. 3(A), the LiDAR detectability information FMS1 may be described, for example, by the reflectance of laser light. In that case, the detectability is expressed in a form suited to the type of sensor connected to the on-board terminal 200, so the necessary features can be detected more accurately according to the sensor type.
 Furthermore, in the third example of the data structure of the map data FM illustrated in FIG. 3(C), the occlusion detectability information FMS6 expresses detectability as whether the feature is occluded by an obstacle, or as the proportion of the feature occluded by an obstacle. When a feature is occluded by an obstacle such as vegetation, it can therefore be detected accurately with that influence taken into account.
 The detectability information FMS may also be included in the map data FM separately for each weather condition, so that the appropriate detectability information FMS is used according to the weather detected when features are actually detected. In that case, the necessary features can be detected more accurately according to the weather.
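As one hypothetical illustration of how a single map-data record might carry detectability information keyed by sensor type, time zone, and weather, as the variations above describe, consider the sketch below. The record layout, field names, and lookup helper are assumptions made for illustration, not the actual format of the map data FM.

```python
# Hypothetical sketch of a map-data record carrying detectability
# information (FMS) keyed by sensor type, time zone, and weather.
# Field names echo the reference signs (FMB1-FMB3, FMS1, FMS2); the
# concrete layout is an assumption, not the publication's format.

feature_record = {
    "fmb1": "OBJ-0042",           # object ID data
    "fmb2": "road_sign",          # object type data
    "fmb3": (35.6812, 139.7671),  # position data (lat, lon)
    "fms": {
        "lidar": {                # FMS1, per time zone and weather
            ("day", "clear"): 0.95,
            ("night", "clear"): 0.90,
            ("day", "rain"): 0.60,
        },
        "camera": {               # FMS2, per time zone and weather
            ("day", "clear"): 0.85,
            ("night", "clear"): 0.30,
            ("day", "rain"): 0.50,
        },
    },
}

def detectability(record, sensor, time_zone, weather):
    """Look up the detectability value for the current conditions."""
    return record["fms"][sensor][(time_zone, weather)]

print(detectability(feature_record, "camera", "night", "clear"))  # 0.3
```

An on-board terminal could then consult the entry matching its attached sensor and the detected time and weather, falling back across conditions as its designer sees fit.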
 The detectability information FMS can also indicate the material of the target feature, specifically whether the target feature uses a retroreflective material or is an electric display board using LEDs (Light Emitting Diodes) or the like. In this case, a feature using a retroreflective material can be treated as easy for the LiDAR sensor 205 to recognize, while a feature on an electric display board can be treated as hard to recognize.
 The detectability information FMS can also indicate whether a target feature is degraded, or how far its degradation has progressed. For example, a lane marking drawn on a road becomes difficult for the LiDAR sensor 205 to recognize when it has worn away through degradation.
 A program corresponding to the flowchart shown in FIG. 4 may be recorded on a recording medium such as an optical disc or hard disk, or acquired and recorded via a network such as the Internet, and then read out and executed by a general-purpose microcomputer or the like, thereby causing that microcomputer or the like to function as the control unit 201 of the embodiment.
 FM1, FM2, FM3  Map data
 1A  Position information
 1B  Detectability information
 S  Map data management system
 100  Server device
 101, 201  Control unit
 200  On-board terminal
 205  LiDAR sensor
 FMB  Basic information
 FMB1  Object ID data
 FMB2  Object type data
 FMB3  Position data
 FMB4  Affiliated road data
 FMB5  Size data
 FMS  Detectability information
 FMS1  LiDAR detectability information
 FMS2  Camera detectability information
 FMS5  Road-specific detectability information
 FMS6  Occlusion detectability information

Claims (9)

  1.  A data structure of map data including, in association with each other, position information indicating a position of a feature and detectability information indicating ease of detection of the feature by a detection means,
     wherein the data structure is used in processing in which the detection means mounted on a moving body determines the ease of detection of the feature specified by the position information on the basis of the detectability information.
  2.  The data structure of the map data according to claim 1,
     wherein the data structure is used in processing in which the detection means mounted on the moving body selects, based on the determined ease of detection, a feature to be used for estimating a position of the moving body.
  3.  The data structure of the map data according to claim 1 or 2,
     wherein the detectability information is defined separately for each type of the detection means.
  4.  The data structure of the map data according to any one of claims 1 to 3,
     wherein the detectability information is defined separately for each road around the feature.
  5.  The data structure of the map data according to claim 4,
     wherein the detectability information is defined separately for each road around the feature and for each traveling direction of a moving body set for the road.
  6.  The data structure of the map data according to any one of claims 1 to 5,
     wherein the detectability information is defined separately for each time zone.
  7.  The data structure of the map data according to any one of claims 1 to 6,
     wherein the detectability information is defined separately for each weather condition.
  8.  The data structure of the map data according to any one of claims 1 to 7,
     wherein the detectability information indicates the ease of detection by an expression method corresponding to the type of the detection means.
  9.  The data structure of the map data according to any one of claims 1 to 8,
     wherein the detectability information indicates the ease of detection as presence or absence of occlusion of the feature by an obstacle, or as an occlusion rate of the feature by an obstacle.
PCT/JP2018/043742 2017-11-30 2018-11-28 Data structure of map data WO2019107397A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017229997 2017-11-30
JP2017-229997 2017-11-30

Publications (1)

Publication Number Publication Date
WO2019107397A1 true WO2019107397A1 (en) 2019-06-06

Family

ID=66664985

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/043742 WO2019107397A1 (en) 2017-11-30 2018-11-28 Data structure of map data

Country Status (1)

Country Link
WO (1) WO2019107397A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110054791A1 (en) * 2009-08-25 2011-03-03 Southwest Research Institute Position estimation for ground vehicle navigation based on landmark identification/yaw rate and perception of landmarks
JP2013025401A (en) * 2011-07-15 2013-02-04 Fujitsu Ltd Self-position estimation device, self-position estimation method and program
JP2015108604A (en) * 2013-12-06 2015-06-11 日立オートモティブシステムズ株式会社 Vehicle position estimation system, device, method, and camera device
JP2016156973A (en) * 2015-02-25 2016-09-01 パイオニア株式会社 Map data storage device, control method, program and recording medium
WO2016139747A1 (en) * 2015-03-03 2016-09-09 パイオニア株式会社 Vehicle control device, control method, program, and storage medium
WO2016152873A1 (en) * 2015-03-24 2016-09-29 パイオニア株式会社 Automatic driving assistance device, control method, program, and storage medium


Similar Documents

Publication Publication Date Title
US9891071B2 (en) Mapping road illumination
KR102420568B1 (en) Method for determining a position of a vehicle and vehicle thereof
US9863928B1 (en) Road condition detection system
US7877187B2 (en) Driving support method and device
US10303181B1 (en) Self-driving vehicle systems and methods
US10481606B1 (en) Self-driving vehicle systems and methods
US20200209005A1 (en) Systems and methods for loading object geometry data on a vehicle
US20200271453A1 (en) Lane marking localization and fusion
WO2009098319A2 (en) Navigational device for a vehicle
CN112042169B (en) Method for determining traffic information
JP2019513996A (en) Method of determining the attitude of at least a partially autonomous driving vehicle in the vicinity by a ground sign
US11142196B2 (en) Lane detection method and system for a vehicle
JP2023121809A (en) Deteriorated ground object identification device, deteriorated ground object identification system, deteriorated ground object identification method, deteriorated ground object identification program and computer readable recording medium with deteriorated ground object identification program recorded thereon
CN112654892A (en) Method for creating a map of an environment of a vehicle
US20200200870A1 (en) Radar Sensor Misalignment Detection for a Vehicle
WO2019107397A1 (en) Data structure of map data
JP2009019979A (en) Light receiver, distance calculator, distance calculation program, recording medium, and distance calculation method
JP5029186B2 (en) POSITION CALCULATION DEVICE, POSITION CALCULATION PROGRAM, RECORDING MEDIUM, AND POSITION CALCULATION METHOD
CN110599762A (en) Road condition sensing system and method
CN115597578A (en) Self-positioning of a vehicle in a parking facility with selective sensor activation
JP2019101138A (en) Data structure of map data
KR20220148378A (en) Apparatus for assisting driving of a host vehicle and method therof
JP2008158791A (en) Driving support system
JP2019101605A (en) Data structure for transmission data
JP2019100787A (en) Data structure of transmission data, determination device, determination method and determination program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18882874

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18882874

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP