WO2014013985A1 - Driving support system and driving support method - Google Patents
Driving support system and driving support method
- Publication number
- WO2014013985A1 (PCT/JP2013/069294, JP2013069294W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- driving
- vehicle
- map
- statistical
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3848—Data obtained from both position sensors and additional sensors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3635—Guidance using 3D or perspective road maps
- G01C21/3638—Guidance using 3D or perspective road maps including 3D objects and buildings
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3841—Data obtained from two or more sources, e.g. probe vehicles
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/10—Map spot or coordinate position indicators; Map reading aids
- G09B29/106—Map spot or coordinate position indicators; Map reading aids using electronic means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W2050/0062—Adapting control system settings
- B60W2050/0075—Automatic parameter input, automatic initialising or calibrating means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/10—Historical data
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2556/50—External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2556/55—External transmission of data to or from the vehicle using telemetry
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2556/65—Data transmitted between vehicles
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0112—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
- G08G1/0129—Traffic data processing for creating historical data or processing based on historical data
Definitions
- the present invention relates to a driving support system and a driving support method for supporting driving of a vehicle.
- As a technique for collecting information on the traffic environment while traveling, using an imaging device or the like mounted on a vehicle, there is the map information collecting apparatus described in Patent Document 1 below.
- This map information collection device collects information included in a vehicle peripheral image acquired by an imaging device mounted on a vehicle by performing an image recognition process.
- This map information collecting apparatus suppresses the influence of recognition position errors by performing statistical learning processing when image information of the same place is recognized multiple times.
- However, the above-described map information collecting apparatus performs statistical learning processing only when information is acquired by image recognition and similar image information is recognized multiple times at the same place. It therefore cannot statistically build dynamic traffic environment information that changes over time, such as the movements of the host vehicle and surrounding vehicles, into map information.
- Accordingly, an object of the present invention is to provide a driving support system and a driving support method that can acquire time-varying information and use it for driving support.
- The present invention acquires driving information, and generates and updates map information by associating statistical information, obtained by statistically processing the driving information as a time-series pattern, with the position information of the vehicle at the time the driving information was acquired.
- When driving assistance is performed, the map information is read based on the vehicle position information, and driving assistance is provided based on the read map information.
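The generate-and-update loop summarized above can be sketched as follows. This is a hypothetical minimal illustration, not the patent's concrete design: the store keyed by a road-link identifier and the incremental-averaging rule for the time-series pattern are assumptions for the sketch.

```python
from collections import defaultdict


class MapInfoStore:
    """Hypothetical sketch: map information keyed by a position identifier
    (e.g. a road-link number), each entry holding running statistics of the
    fixed-length time-series driving patterns observed at that position."""

    def __init__(self):
        self.entries = defaultdict(lambda: {"count": 0, "mean_pattern": None})

    def update(self, position_key, pattern):
        """Incrementally average a time-series pattern into the statistical
        information stored for this position (the map update step)."""
        e = self.entries[position_key]
        if e["mean_pattern"] is None:
            e["mean_pattern"] = list(pattern)
        else:
            n = e["count"]
            e["mean_pattern"] = [
                (m * n + x) / (n + 1) for m, x in zip(e["mean_pattern"], pattern)
            ]
        e["count"] += 1

    def read(self, position_key):
        """Read back the statistical information for driving support."""
        return self.entries.get(position_key)
```

For example, after two speed patterns `[10, 20, 30]` and `[30, 40, 50]` are recorded for the same link, reading that link returns the averaged pattern `[20, 30, 40]`.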
- FIG. 1 is a block diagram showing the configuration of the driving support system shown as the first embodiment of the present invention.
- FIG. 2 is a diagram illustrating color feature amounts and texture feature amounts in the driving support system shown as the first embodiment of the present invention.
- FIG. 3 is a diagram for explaining the object shape feature amount in the driving support system shown as the first embodiment of the present invention.
- FIG. 4 is a flowchart showing the time series processing when the driving information is a surrounding video in the driving support system shown as the first embodiment of the present invention.
- FIG. 5 is a flowchart showing a process of generating statistical information and updating map information in the driving support system shown as the first embodiment of the present invention.
- FIG. 6 is a diagram illustrating segmentation of time-series driving information in the driving support system shown as the first embodiment of the present invention.
- FIG. 7 is a diagram illustrating hierarchical classification of segmented driving information in the driving support system shown as the first embodiment of the present invention, where (a) shows one piece of driving information and (b) shows the other.
- FIG. 8 is a flowchart showing a process for generating and storing map information in the driving support system shown as the first embodiment of the present invention.
- FIG. 9 is a flowchart showing a process of generating statistical information for each section in the driving support system shown as the first embodiment of the present invention.
- FIG. 10 is a flowchart including a process of predicting driving information in the driving support system shown as the first embodiment of the present invention.
- FIG. 11 is a block diagram showing a configuration in which an integrated pattern generation unit is added to the driving support system shown as the second embodiment of the present invention.
- FIG. 12 is a flowchart showing a process of generating an integrated pattern when generating map information in the driving support system shown as the second embodiment of the present invention.
- FIG. 13 is a diagram illustrating a process for generating an integrated pattern in the driving support system shown as the second embodiment of the present invention.
- FIG. 14 is a block diagram showing objects to communicate with each other in the driving support system shown as the third embodiment of the present invention.
- FIG. 15 is a block diagram showing a configuration in which in-vehicle equipment includes communication means in the driving support system shown as the third embodiment of the present invention.
- FIG. 16 is a block diagram in which a configuration for adding attributes to driving information is added in the driving support system shown as the fourth embodiment of the present invention.
- FIG. 17 is a flowchart showing a process of generating statistical information for each attribute of driving information in the driving support system shown as the fourth embodiment of the present invention.
- the driving support system shown as the first embodiment of the present invention is configured, for example, as shown in FIG.
- The driving support system includes a driving information acquisition unit 1, a vehicle position information acquisition unit 2, a statistical information generation unit 3, a map information generation unit 4, a map information reference unit 5, a driving support unit 6, and a map information storage unit 7.
- Part of the driving support system is actually composed of a ROM, a RAM, a CPU, and the like, but it is described here as functional blocks that are realized when the CPU executes a driving support program stored in the ROM.
- the driving information acquisition unit 1 may be any one of the driver state information acquisition unit 11, the vehicle surrounding information acquisition unit 12, and the vehicle information acquisition unit 13.
- the driver state information acquisition unit 11 detects the state of the driver.
- the vehicle surrounding information acquisition unit 12 acquires environmental information around the vehicle.
- the vehicle information acquisition unit 13 detects a driver's operation amount and vehicle behavior as vehicle information.
- The driving information acquisition unit 1 supplies the acquired driving information to the vehicle position information acquisition unit 2, the statistical information generation unit 3, and the driving support unit 6.
- the vehicle position information acquisition unit 2 acquires vehicle position information.
- the vehicle position information acquisition unit 2 includes, for example, a GPS antenna and a GPS arithmetic device.
- the vehicle position information acquisition unit 2 may calculate the vehicle position information based on the behavior of the vehicle as the driving information acquired from the driving information acquisition unit 1.
- the vehicle position information acquisition unit 2 supplies the acquired vehicle position information to the driving support unit 6, the map information generation unit 4, and the map information reference unit 5.
- the statistical information generation unit 3 statistically processes the driving information supplied from the driving information acquisition unit 1 as a time series pattern, and generates statistical information.
- The map information generation unit 4 generates map information by associating the vehicle position information obtained from the vehicle position information acquisition unit 2 at the time the driving information was acquired with the statistical information generated by the statistical information generation unit 3. That is, the map information generation unit 4 treats the statistical information generated by the statistical information generation unit 3, organized by position information, as map information.
- the map information storage unit 7 stores road map data as a database.
- the map information storage unit 7 includes a map information update unit 71 into which the map information sent from the map information generation unit 4 can be written.
- the map information update unit 71 updates existing map information to the map information generated by the map information generation unit 4.
- The map information reference unit 5 reads the map information stored in the map information storage unit 7 with reference to the position information acquired by the vehicle position information acquisition unit 2. Thereby, the map information reference unit 5 extracts the statistical information regarding driving associated with the current position.
- the driving support unit 6 performs driving support based on the statistical information (map information) obtained by the map information reference unit 5. Further, the driving support unit 6 may perform driving support based on not only map information but also real-time driving information obtained by the driving information acquisition unit 1. This driving assistance includes providing information useful for driving to the driver. Further, the driving support unit 6 may control the vehicle based on the statistical information and the driving information.
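As one hedged illustration of such support, a simple rule might compare real-time driving information against the statistical pattern read from the map information. The speed-based criterion, the threshold `k`, and the message text are assumptions for this sketch, not the patent's method:

```python
def advise(current_speed_kmh, stat_mean_kmh, stat_std_kmh, k=2.0):
    """Hypothetical driving-support rule: flag the driver when the current
    speed deviates from the statistical speed pattern stored in the map
    information for this location by more than k standard deviations."""
    if abs(current_speed_kmh - stat_mean_kmh) > k * stat_std_kmh:
        return "advise: speed differs from the typical pattern here"
    return None


# e.g. the map information says this spot is typically driven at
# 40 km/h with a 5 km/h spread
print(advise(65.0, 40.0, 5.0))  # a warning is produced
print(advise(42.0, 40.0, 5.0))  # None: within the usual pattern
```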
- The vehicle position information acquisition unit 2 uses signals from a GPS (Global Positioning System) receiver mounted on an in-vehicle navigation system or on the driver's mobile phone.
- The vehicle position information acquisition unit 2 may use, as position information, map information held by an in-vehicle navigation system or the like, in addition to the latitude, longitude, altitude, and so on acquired by GPS.
- For known navigation systems, maps are known whose map information includes, for example, the number of a section obtained by dividing the map into a mesh at regular intervals, the link number when a road is expressed as a link, and the node number when an intersection or junction (a contact point between links) is expressed as a node.
- The vehicle position information acquisition unit 2 may detect the latitude and longitude of the vehicle position by in-vehicle GPS, then match the vehicle position against the map information, and add the mesh number, link number, and node number obtained by the matching to the latitude and longitude as position information.
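A minimal sketch of attaching a map identifier to a GPS fix, assuming a simple nearest-node lookup (real map matching would also use links, headings, and road geometry; the node table is hypothetical):

```python
import math


def match_to_node(lat, lon, nodes):
    """Given a GPS fix and a table of map nodes {node_number: (lat, lon)},
    return the nearest node number so it can be attached to the position
    information alongside latitude and longitude."""
    def dist2(p):
        # small-area flat-earth approximation; adequate for
        # nearest-neighbour comparison over short distances
        dlat = lat - p[0]
        dlon = (lon - p[1]) * math.cos(math.radians(lat))
        return dlat * dlat + dlon * dlon

    return min(nodes, key=lambda n: dist2(nodes[n]))
```

Usage: with two nodes at `(35.000, 139.000)` and `(35.100, 139.200)`, a fix near the first is matched to its node number, which is then stored with the latitude/longitude as position information.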
- the driver state information acquisition unit 11 detects the movement of the face as the driver state.
- the driver state information acquisition unit 11 acquires an image including the driver's head by using, for example, an imaging device installed in front of the driver's seat, and inputs the image information to the face motion detection device.
- a known recognition method may be used as the face motion detection device.
- The driver state information acquisition unit 11 acquires the posture angle of the head. In addition to the head posture angle, the driver state information acquisition unit 11 may detect, from the captured image, the driver's line of sight, blinks, mouth movements and facial expressions, movements of a hand touching the face or head, and the like.
- The driver state information acquisition unit 11 may use body pressure distribution values from the seat surface, seat back, and side supports as the movement of the driver's body. It may also use blood pressure, breathing, heartbeat, sweating, myoelectric potential, or the like as the driver's biological signals, or a brain activity amount simply measured from cerebral blood flow or the like.
- The driver state information acquisition unit 11 may combine biometric information with multiple measurement values such as line of sight, facial expression, and body temperature, and use a known calculation technique to compute the degree of sleepiness, irritation, arousal, tension, or relaxation, acquiring it as information indicating the driver's state.
- the vehicle surrounding information acquisition unit 12 acquires observation values such as a distance and an object shape to an object around the host vehicle as environmental information around the vehicle.
- The vehicle surrounding information acquisition unit 12 uses, for example, a distance sensor such as a laser sensor, a millimeter-wave radar, or an ultrasonic sensor, like those used in a lane keeping system or an inter-vehicle distance control system.
- The vehicle surrounding information acquisition unit 12 uses moving images and still images from an imaging device such as a video camera to detect targets to which the driver pays attention when driving, such as white lines, stop lines, pedestrian crossings, and course-indicating arrows painted on the road surface, the lateral position of the vehicle, the curvature of the lane ahead, traffic lights, road signs, pedestrians and bicycles, and vehicles traveling around the host vehicle.
- The shape and color of each target, its relative distance (position) from the host vehicle, and the like may be acquired as environmental information around the vehicle.
- If the atmosphere of the place where the vehicle is operating, such as a congested line of vehicles, a crowded area, a busy road, a quiet road, or a narrow street in a residential area, can be converted to time-series data as described later, it may also be acquired as environmental information around the vehicle.
- the vehicle information acquisition unit 13 acquires information on a driver's operation amount and vehicle behavior.
- The vehicle information acquisition unit 13 acquires, for example, the steering angle, accelerator opening, brake operation amount, direction indicator switch signal, wiper switch signal, vehicle speed, longitudinal acceleration, vertical acceleration, lateral acceleration, yaw rate, roll rate, pitching rate, and the like.
- the vehicle information acquisition unit 13 may acquire a signal directly from a sensor or an operation device, or may acquire a signal flowing through an in-vehicle network such as the vehicle CAN.
- The most intuitive method is to detect the vehicle body yaw rate with a yaw rate sensor provided on the vehicle body.
- Alternatively, a steering angle sensor and a vehicle speed sensor may be provided instead of the yaw rate sensor, and the vehicle yaw rate estimated from the detected steering angle and vehicle speed.
- the vehicle information acquisition unit 13 may acquire vehicle information as an estimated value by calculation without directly detecting vehicle information.
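One standard way to obtain such an estimated value is the kinematic bicycle model, sketched below. The steering ratio and wheelbase are assumed example parameters, not values from the patent:

```python
import math


def estimate_yaw_rate(steering_wheel_angle_rad, speed_mps,
                      steering_ratio=16.0, wheelbase_m=2.7):
    """Estimate vehicle yaw rate [rad/s] from steering-wheel angle and
    vehicle speed using a kinematic bicycle model:

        road-wheel angle  delta = steering_wheel_angle / steering_ratio
        yaw rate          r     = v * tan(delta) / L

    steering_ratio and wheelbase_m are assumed vehicle parameters.
    """
    delta = steering_wheel_angle_rad / steering_ratio
    return speed_mps * math.tan(delta) / wheelbase_m
```

Driving straight ahead (zero steering angle) gives a zero yaw rate, and a positive steering input at speed gives a positive yaw rate, as expected of the model.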
- The information acquired by the driving information acquisition unit 1 and the vehicle position information acquisition unit 2 may come from sensor devices that are standard equipment on the vehicle, such as information flowing over the vehicle CAN, or from sensor devices that the vehicle's user mounts on the vehicle later.
- For example, in addition to the sensor devices that measure the driver's biological signals described above, an existing navigation system equipped with a GPS or gyro sensor, a mobile device such as a mobile phone equipped with a video camera, an acceleration sensor, or GPS, an existing drive recorder, or the like may be installed in the vehicle, and the output of the sensors mounted on those devices may be used.
- the driving information acquired by the driving information acquisition unit 1 includes one-dimensional information such as vehicle speed, steering angle, and inter-vehicle distance.
- the driving information includes multidimensional time series information in addition to the one-dimensional information.
- the driving information includes information measured on a “surface” such as a camera image or a millimeter wave radar.
- the driving information includes information measured by “line (point sequence)” by a scanning sensor such as a laser radar.
- The driving information also includes information that is handled as a set from multiple sensors, such as ultrasonic sensors installed at the four corners of the vehicle or myoelectric potentials, and is measured in relation to "multiple points".
- A camera image or the like contains multiple pieces of information with different meanings in each pixel of the screen, such as color, brightness, and density, texture information, and object-shape features such as contours (edges).
- The driving support system of the present embodiment is characterized in that it statistically processes the above-described one-dimensional and multidimensional time-series information while maintaining the relationships among the pieces of information acquired at the same time, at the same point, in the same traffic environment.
- the driving information acquisition unit 1 calculates a feature value 101 for each cell into which the camera video 100 is divided as shown in FIG.
- the time change of each feature amount is set as time series data.
- a known color space such as an RGB color space, a CIE-XYZ color space, or an L * a * b * color space may be used as the color space.
- A color histogram, or color moments (mean, variance, covariance, skewness, etc.), which are statistics of the color distribution in the color space, may be used as the color feature amount.
- The driving information acquisition unit 1 can calculate the feature amount of the entire image. Alternatively, the driving information acquisition unit 1 may divide the image into cells, calculate the feature amount of each region, and use the texture feature amount of a region of interest. Texture features include structural features that describe regular textures (morphological operations, quantities derived from adjacency graphs, etc.), statistical features described by the statistical distribution of image luminance (Fourier power spectrum, gray-level co-occurrence matrix, Markov random fields, fractal models, etc.), and various multi-resolution feature quantities such as the Gabor transform and the wavelet transform; a known recognition method may be used. The driving information acquisition unit 1 calculates the color feature amounts and texture feature amounts of the entire image and uses them as time-series data.
- In this way, the time change of the traffic environment (its "atmosphere") can be statistically accumulated and used as time-series data.
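The per-cell feature computation can be sketched as follows, assuming a frame represented as nested lists of RGB pixels and using the per-cell mean color, the simplest color moment, as a stand-in feature:

```python
def cell_color_means(frame, grid=(2, 2)):
    """Divide an RGB frame (a list of rows of (r, g, b) pixels) into grid
    cells and return each cell's mean color. Tracking these values across
    successive frames yields the per-cell time-series data described in
    the text."""
    rows, cols = len(frame), len(frame[0])
    gr, gc = grid
    feats = []
    for i in range(gr):
        for j in range(gc):
            cell = [frame[r][c]
                    for r in range(i * rows // gr, (i + 1) * rows // gr)
                    for c in range(j * cols // gc, (j + 1) * cols // gc)]
            n = len(cell)
            feats.append(tuple(sum(p[k] for p in cell) / n
                               for k in range(3)))
    return feats
```

In practice an image library would be used here; the nested-list representation just keeps the sketch self-contained.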
- the driving information acquisition unit 1 may detect the feature quantity 102 of the object shape as shown in FIG.
- The driving information acquisition unit 1 detects objects to which the driver pays attention when driving, such as the lateral position of the vehicle in the driving lane, the curvature of the lane ahead, road surface markings such as stop lines and pedestrian crossings, traffic lights and road signs, pedestrians and bicycles, and other vehicles traveling nearby.
- the driving information acquisition unit 1 detects a relative distance (positional relationship) with respect to the detection target, its shape, color, and the like.
- the driving information acquisition unit 1 acquires the shape, color, and the like of a target to which the driver pays attention when driving a car as the feature amount 102 of the object shape.
- The driving information acquisition unit 1 may use, as time-series data, changes in the feature amount 102 of the object shape and the movement of the target (temporal change in position) obtained in the process of detecting the target.
- A known recognition method may be used to detect a specific target and its shape feature amount. For example, specific targets and shape feature amounts can be detected using region feature amounts (Zernike moments, 2D Angular Radial Transformation (ART), etc.), contour feature amounts (Fourier descriptors, Curvature Scale Space, edge direction histograms, etc.), and local feature amounts ((Color) SIFT, Spin Images, Video Google, the bag-of-features approach, etc.).
- the driving information acquisition unit 1 converts the acquired driving information into time series data by a process as shown in FIG.
- In step S1, the driving information acquisition unit 1 and the vehicle position information acquisition unit 2 acquire driving information together with the vehicle position information at the time of acquisition.
- In step S2, the driving information acquisition unit 1 converts the acquired driving information into time-series data.
- Step S2 may be skipped and the process may proceed directly to step S3.
- the driving information acquisition unit 1 calculates a feature amount or the like from the driving information and uses the time change of the calculated information as time series data.
- In step S3, the time-series data set of each feature amount obtained in step S2 is associated with the vehicle position information.
- the driving information acquisition unit 1 transmits the driving information as multidimensional time series data associated with the vehicle position information in step S3 to the statistical information generation unit 3.
- the driving information acquisition unit 1 acquires the image around the vehicle together with the position information of the vehicle at the time of imaging in step S1.
- the driving information acquisition unit 1 extracts a feature amount from the acquired video. Since various information is included in the video, for example, the video is cut out as a still image at regular sample times, and the feature amount is extracted by the known image recognition method described above. The driving information acquisition unit 1 sets the time change for each extracted feature amount as time series data.
- the obtained time-series data set of each feature amount is associated with the imaging position information (vehicle position information) of the video.
- the driving information acquisition unit 1 sends a set of each feature quantity associated with the video position information to the statistical information generation unit 3 as multidimensional time-series data.
- the driving information is described as an example of a camera image that captures the outside of the vehicle, but is not limited thereto.
- As for time-series data, not only can the observed values of each dimension be converted into time-series data individually; a single time series can also be calculated from multi-dimensional observed values, or feature quantities can be extracted and their time change used as time-series data.
- For example, the relative positions of other vehicles may be calculated from laser radar observation values using a known recognition method, and the relative position data calculated in this way may be used as time-series data.
- Statistic information generation processing and map information update processing can be realized by the processing shown in FIG. 5, for example.
- In step S11, the statistical information generation unit 3 acquires the driving information that has been time-series processed by the driving information acquisition unit 1 and associated with the vehicle position information.
- the statistical information generating unit 3 segments the time series data of the driving information acquired in step S11 with the time point when the change is large as the boundary of the time series pattern.
- the statistical information generation unit 3 reads the time series pattern of the driving information already stored as the map information from the map information storage unit 7 as a database.
- In step S14, the statistical information generation unit 3 determines whether a time series pattern with high likelihood, that is, one similar to the time series pattern of the driving information segmented in step S12, is already stored in the map information storage unit 7. If a similar time series pattern already exists in the map information storage unit 7, the driving information is supplied to the map information generation unit 4 and the process proceeds to step S15. If no such high-likelihood, similar time series pattern exists, the process proceeds to step S17.
- step S15 the statistical information generation unit 3 and the map information update unit 71 perform statistical processing by adding the newly acquired driving information, and update the time series pattern stored in the map information storage unit 7.
- step S17 the statistical information generation unit 3 performs statistical processing based on the newly acquired driving information and the time series pattern of other stored driving information. As a result, the statistical information generation unit 3 newly generates a time series pattern of driving information.
- step S16 the statistical information generating unit 3 transmits the updated or newly generated time series pattern of the driving information to the map information generating unit 4 as statistical information.
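- The update-or-create decision of steps S14 to S17 can be sketched as follows; the similarity measure, the running-mean update rule, and the threshold are illustrative assumptions, since the patent only requires "a time series pattern that has a high likelihood and similarity":

```python
def update_map_patterns(stored_patterns, new_segment, similarity, threshold):
    """Steps S14-S17 sketch: if a stored pattern is similar enough to the new
    segment, update it (running mean here, one simple choice); otherwise
    register the segment as a new pattern."""
    for pat in stored_patterns:
        if similarity(pat["mean"], new_segment) >= threshold:
            n = pat["count"]
            pat["mean"] = [(m * n + x) / (n + 1)
                           for m, x in zip(pat["mean"], new_segment)]
            pat["count"] = n + 1
            return "updated"                 # step S15
    stored_patterns.append({"mean": list(new_segment), "count": 1})
    return "created"                         # step S17

def sim(a, b):
    """Negative Euclidean distance as a similarity score (illustrative)."""
    return -sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

patterns = [{"mean": [0.0, 0.0], "count": 4}]
print(update_map_patterns(patterns, [0.1, 0.0], sim, threshold=-1.0))  # updated
print(update_map_patterns(patterns, [9.0, 9.0], sim, threshold=-1.0))  # created
```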
- FIG. 6 is a conceptual diagram of segmenting time-series data of multidimensional driving information. Regardless of the number of dimensions of the acquired time-series data, the statistical information generation unit 3 processes the data A, B, C, D, and E together while maintaining their relationship, and treats the time points at which the change is large as time series pattern boundaries. In the example of FIG. 6, the time series data A, B, C, D, and E of the driving information are segmented into the segments t1 to t2, t2 to t3, t4 to t5, t6 to t7, and t8 onward.
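- The boundary detection of FIG. 6 can be sketched as follows; the patent does not specify the change criterion, so the summed absolute change across dimensions and the threshold here are assumptions:

```python
def segment_boundaries(series, threshold):
    """Return indices where the summed absolute change across all dimensions
    exceeds `threshold` -- a minimal stand-in for the boundary detection of
    FIG. 6. `series` is a list of equal-length tuples, one tuple per time
    step with one entry per dimension (A, B, C, ...)."""
    boundaries = []
    for t in range(1, len(series)):
        change = sum(abs(a - b) for a, b in zip(series[t], series[t - 1]))
        if change > threshold:
            boundaries.append(t)
    return boundaries

def split_segments(series, threshold):
    """Cut the multidimensional series into segments at the boundaries."""
    cuts = [0] + segment_boundaries(series, threshold) + [len(series)]
    return [series[a:b] for a, b in zip(cuts, cuts[1:]) if b > a]

data = [(0, 0), (0.1, 0), (5, 5), (5.1, 5), (5.2, 5.1)]
print(split_segments(data, threshold=1.0))   # two segments, cut at index 2
```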
- FIG. 7 is an image diagram of a tree structure when segmented time-series driving information (segment) is classified into classes by a hierarchical clustering technique as an example of a technique for calculating a time-series pattern from driving information.
- The time series pattern may be obtained as a statistical time series pattern for each class by statistically processing the segmented time-series driving information of each class, as shown in (a) and (b).
- one representative segmented time-series operation information may be selected from the class to form a time-series pattern.
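- The class formation of FIG. 7 can be sketched with the group average method named below; this pure-Python agglomerative clustering over fixed-length segments is an illustration, and the distance function and target class count are assumptions:

```python
def avg_distance(c1, c2):
    """Group-average distance between two clusters of equal-length segments."""
    total = 0.0
    for s1 in c1:
        for s2 in c2:
            total += sum((a - b) ** 2 for a, b in zip(s1, s2)) ** 0.5
    return total / (len(c1) * len(c2))

def agglomerate(segments, n_classes):
    """Repeatedly merge the closest pair of clusters until `n_classes`
    remain (hierarchical clustering, group average method)."""
    clusters = [[s] for s in segments]
    while len(clusters) > n_classes:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = avg_distance(clusters[i], clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters.pop(j)
    return clusters

segs = [(0.0, 0.1), (0.1, 0.0), (5.0, 5.0), (5.1, 4.9)]
classes = agglomerate(segs, n_classes=2)     # two well-separated classes
```

Each resulting class can then yield a statistical time series pattern, or one representative segment can be selected, as described above.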
- Japanese Patent No. 4027838 proposes one means of discovering methods for recognizing and generating (estimating) time series data and of controlling a system using those recognition/generation methods; this method may be used as the pattern extraction/recognition method here.
- As a pattern extraction method from segmented time series data, a stochastic statistical model called a Hidden Markov Model (HMM) may be used.
- Alternatively, a method using a neural network or a genetic algorithm, or a method using a linear discriminant function with identification by a threshold value or a feature plane, may be used.
- a known pattern identification method or data set clustering method may be used.
- Well-known techniques such as supervised or unsupervised machine learning, or hierarchical clustering methods such as the group average method or Ward's method, may be used.
- step S21 the map information generating unit 4 acquires the driving information acquired by the driving information acquiring unit 1, the position information of the vehicle that acquired the driving information, and the time series pattern (driving pattern information).
- step S22 the map information generating unit 4 converts the driving information acquired in step S21 into a time series pattern, and describes the driving information in time series.
- In step S23, the map information generation unit 4 calculates the transition probability from the temporally preceding driving information pattern to the temporally succeeding driving information pattern in the driving information sequence described on the time axis in step S22.
- step S24 the map information generation unit 4 stores the time series pattern of the driving information together with the transition probability obtained in step S23 as map information in association with the position in the map space, and stores it in the map information storage unit 7.
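- Steps S22 to S24 can be sketched as follows; maximum-likelihood counting is one obvious estimator for the transition probability, and the pattern labels and map-key structure are illustrative assumptions:

```python
from collections import defaultdict

def transition_probabilities(pattern_sequence):
    """Estimate P(next pattern | current pattern) from a driving-pattern
    sequence described on the time axis (steps S22-S23 sketch)."""
    counts = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(pattern_sequence, pattern_sequence[1:]):
        counts[prev][nxt] += 1
    return {
        prev: {nxt: n / sum(nxts.values()) for nxt, n in nxts.items()}
        for prev, nxts in counts.items()
    }

# patterns observed along one stretch of road
seq = ["cruise", "brake", "stop", "cruise", "brake", "cruise"]
probs = transition_probabilities(seq)
print(probs["brake"])          # brake is followed by stop and cruise equally

# step S24 sketch: store patterns and transitions per map position
map_information = {"link_42": {"patterns": set(seq), "transitions": probs}}
```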
- The map information thus associates the time series pattern of the statistically processed driving information with the position information. Therefore, by combining the vehicle position information obtained by the vehicle position information acquisition unit 2 with the time series pattern of the driving information obtained as statistical information by the map information reference unit 5, it is possible to know, as a statistical time series pattern, how the vehicle behavior, driving operation, driver state, traffic environment, and so on change over time in a specific section or point in the map space.
- the time series pattern of driving information that has been subjected to statistical processing is handled as map information. Thereby, it is not necessary to record all the data acquired by the driving information acquisition unit 1 as it is. For this reason, the recording data capacity of the map information storage unit 7 can be saved, and information can be stored efficiently.
- temporal and spatial transitions from one time series pattern to another time series pattern are obtained as transition probabilities and stored as map information in association with the time series pattern. For this reason, it becomes possible to discriminate a time series pattern of driving information having a high probability of transition based on the observed driving information and position information.
- FIG. 9 is a flowchart showing the contents of statistical information generation processing for each section performed by the statistical information generation unit 3 and the map information update unit 71.
- In the processing of FIG. 5, the time-series processed driving information associated with the position information is acquired in step S11, the acquired time-series driving information is segmented in step S12, and time series patterning processing is carried out in step S13 and subsequent steps.
- In that time series patterning process, time series patterns are extracted from all segmented driving information regardless of the location where the driving information was acquired.
- the map information generation unit 4 associates the time-series driving information together with the transition probability with the position on the map space to obtain the map information.
- the time-series driving information acquired in step S11 is divided by the statistical information generation unit 3 for each section on the map in step S41.
- The sections may be defined by dividing the road at fixed distances, or according to road alignment, such as road divisions and road classifications.
- Alternatively, the map may be divided for each navigation link or mesh, or by latitude/longitude or areas of a certain range.
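- Division by latitude/longitude areas (step S41) can be sketched as follows; the cell size and the record layout are illustrative assumptions, not values from the patent:

```python
def mesh_key(lat, lon, cell_deg=0.01):
    """Assign a position to a fixed-size latitude/longitude cell, a simplified
    stand-in for dividing the map into sections (links, meshes, or
    fixed-range areas); cell_deg of about 0.01 deg is roughly 1 km."""
    return (int(lat // cell_deg), int(lon // cell_deg))

def divide_by_section(records, cell_deg=0.01):
    """Group (lat, lon, driving_info) records by map section."""
    sections = {}
    for lat, lon, info in records:
        sections.setdefault(mesh_key(lat, lon, cell_deg), []).append(info)
    return sections

records = [
    (35.6810, 139.7670, "segA"),
    (35.6812, 139.7671, "segB"),   # same cell as segA
    (35.7010, 139.7670, "segC"),   # different cell
]
sections = divide_by_section(records)
print(len(sections))               # 2 sections
```

Per-section time series patterns would then be computed from each group independently, as steps S12 and S42 describe.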
- The statistical information generating unit 3 segments the driving information divided for each section in step S12, and then proceeds to step S42.
- The map information generation unit 4 performs the reading, updating, and new generation of time series patterns for each section, rather than over all time series patterns as a whole.
- step S43 the statistical information generation unit 3 transmits to the map information generation unit 4 a time series pattern for each section that is calculated for each section and is already associated with the map space.
- driving information is divided for each section on the acquired map, and a time series pattern of driving information is calculated for each section based on the divided driving information. Thereby, the time series pattern of the driving information peculiar to the section can be extracted.
- The timing for statistically processing the driving information to calculate and update the time series patterns may be sequential as driving information is collected, at a fixed period, when a certain amount of recorded driving information has accumulated, or at a timing designated by the driver.
- The driving information used for the update may be statistically processed over all driving information from the start of recording to the latest, or only over the driving information recorded during a certain period.
- the driving support system shown as the first embodiment may predict the next information provision or driving operation by the map information reference unit 5 and the driving support unit 6 as shown in FIG.
- In step S61, the map information reference unit 5 extracts, from the data stored in the map information storage unit 7, the statistical information relating to driving stored for the current position, based on the vehicle position information obtained by the vehicle position information acquisition unit 2.
- the map information reference unit 5 acquires a time series pattern of driving information at the current position.
- the map information reference unit 5 probabilistically predicts the time series pattern of the driving information that transitions next from the time series pattern of the driving information acquired in step S62.
- The map information reference unit 5 then uses both the time series pattern of statistical driving information predicted stochastically in step S63 as the next transition and the real-time information obtained by the driving information acquisition unit 1 to provide driving support, such as information provision and operation assistance, to the driver.
- The driving support system uses the driving information observed in real time by the driving information acquisition unit 1 and the statistical information obtained by the map information reference unit 5 to replace the current driving state with a time series pattern. Based on the transition probabilities of that time series pattern, it becomes possible to discriminate the time series pattern of driving information that has a high probability of occurring next.
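- The stochastic prediction of steps S62 and S63 can be sketched as a lookup into the stored transition probabilities; taking the argmax is one obvious choice, since the patent only requires a probabilistic prediction, and the pattern names here are illustrative:

```python
def predict_next_pattern(current_pattern, transitions):
    """Pick the most probable next time series pattern from the transition
    probabilities stored as map information (steps S62-S63 sketch)."""
    candidates = transitions.get(current_pattern)
    if not candidates:
        return None          # no statistics stored for this pattern
    return max(candidates, key=candidates.get)

# transition probabilities read from the map information for this section
transitions = {"cruise": {"brake": 0.7, "lane_change": 0.3},
               "brake": {"stop": 0.9, "cruise": 0.1}}
print(predict_next_pattern("cruise", transitions))   # brake
```

The predicted pattern, combined with real-time observations, is what the driving support unit 6 would turn into information provision or operation assistance in step S64.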
- The vehicle position information observed in real time by the vehicle position information acquisition unit 2 and the statistical information obtained by the map information reference unit 5 are used together. Thereby, the time series pattern of driving information associated with the vehicle's current location can be acquired. As a result, it is possible to provide driving assistance by informing the driver what kind of traffic environment the vehicle's current location usually presents.
- the driving support unit 6 sets audio information, visual information, and operation information corresponding to the time-series pattern of driving information in advance.
- The driving support unit 6 includes, as means of providing support to the driver, an audio output device such as a speaker, a visual information presentation device such as a monitor, or a power assist device that can control the operation amount and operating force of the steering wheel, accelerator, and brake.
- the driving support unit 6 may generate a driving operation amount for safety from the predicted time series pattern and present the operating amount to the driver using the driving support means. Further, the driving support unit 6 may present in advance information for preparing for a complicated traffic environment. Further, the driving support unit 6 can perform driving support such as information on route change for traveling more easily, presentation of timing for changing the route, and assisting operation amount.
- the driving support system shown as the first embodiment statistically processes driving information as a time-series pattern, generates map information in association with vehicle position information, and updates existing map information.
- Since the time series pattern of the statistically processed driving information is associated with the position information and used as map information, dynamic temporal changes such as vehicle behavior, driving operations, the driver's state, and the traffic environment in sections and points on the map can be described and stored. Therefore, according to the driving support system, it is possible to provide the driver with information and driving support such as operation support regarding the dynamic traffic environment that changes over time as the host vehicle and surrounding vehicles move.
- the driving support system can acquire the time series pattern associated with the location where the current vehicle exists by using the fact that the time series pattern is stored as the map information. As a result, the driving support system can provide information on what kind of traffic environment the vehicle is usually located in.
- For example, if the time series pattern of the current section shows many lane changes, the driving support system can inform the driver to "be careful of vehicles changing lanes". If the time series pattern of the spatially next section shows many vehicles crossing lanes, it can inform the driver to "beware of vehicles changing lanes ahead". If the driver's state frequently shows a pattern of turning the face or gaze sideways, combining it with the time series pattern of vehicle information such as vehicle speed makes it possible to provide information such as inattentive-driving and rear-end collision warnings.
- The driving support system can not only provide the above information as audio information and visual information on a display, but can also assist by controlling the pedal operation amount as a driving operation amount so as to guide the driver's driving to the safe side. Furthermore, by obtaining time series patterns from vehicle information such as acceleration and deceleration when starting from or coming to a stop, the stop position and movement at intersections with stop signs, cruising speed relative to the speed limit, and steering amount relative to road alignment, the system can use them as driving operation amount support for the driver or as control target values for an autonomous vehicle.
- the driving information is divided for each section on the map from which the information is acquired, and a time series pattern is calculated for each section based on the divided information.
- the driving support system can extract a time series pattern of driving information specific to the section.
- Alternatively, a time series pattern of driving information may be calculated regardless of the section on the map, and it may then be determined which of a plurality of preset time series patterns that pattern corresponds to. This keeps the number of time series patterns small compared with calculating individual time series patterns for each place, saving calculation amount and data capacity.
- the driving support system shown as the second embodiment is configured as shown in FIG. 11, for example.
- This driving support system differs from the driving support system shown as the first embodiment in that an integrated pattern generation unit 41 is provided in the map information generation unit 4.
- the driving information acquisition unit 1 acquires a plurality of information among a driver's operation amount and vehicle behavior, a driver's state, and environmental information around the vehicle.
- the integrated pattern generation unit 41 associates each time series pattern calculated using each driving information acquired by the driving information acquisition unit 1 as a time series pattern acquired in the same traffic environment.
- the integrated pattern generation unit 41 performs a process as shown in FIG. For example, the process of the integrated pattern generation unit 41 when the driving information acquisition unit 1 acquires vehicle information, driver state information, and vehicle surrounding information will be described.
- step S31, step S33, and step S35 the driving information acquisition unit 1 and the vehicle position information acquisition unit 2 simultaneously acquire driving information of vehicle information, driver state information, and vehicle surrounding information together with position information.
- step S32, step S34, and step S36 the driving information acquisition unit 1 converts each vehicle information, driver state information, and vehicle surrounding information into a time series pattern, and describes each information in a time series of driving patterns.
- The vehicle information, the driver state information, and the vehicle surrounding information are driving information acquired in the same traffic environment, at the same point, and at the same moment, and are therefore inherently related to each other.
- This relationship is not the one represented by temporal transition probabilities within the same type of information, but the relevance (causality, correlation) that exists between different types of driving information by virtue of their being acquired in the same traffic environment, at the same point, and at the same moment.
- the integrated pattern generation unit 41 can express the relevance (causality, correlation) between patterns of different driving information as shown in FIG.
- FIG. 13 is a conceptual diagram explaining the processing flow performed by the integrated pattern generation unit 41; it is described in detail later.
- the integrated pattern generation unit 41 calculates a conditional probability at each segment point.
- the integrated pattern generation unit 41 sets the time series pattern of each driving information together with the conditional probability calculated in step S37, and stores it in the map information storage unit 7 in association with the position on the map.
- a method for expressing the relationship (causality, correlation) between time series patterns of different driving information using a statistical model will be described by taking as an example the case of acquiring vehicle information and vehicle surrounding information.
- the driving information acquisition unit 1 converts each vehicle information and vehicle surrounding information into a time series pattern and describes each information in a time series pattern, the positions of the nodes are different as shown in FIG.
- the transition between the vehicle information and the vehicle surrounding information at each node is described as a conditional probability, and the relationship between time series patterns of different driving information is expressed.
- n(γi, δj) is the number of times the combination (γi, δj) of a vehicle information time series pattern γi and a vehicle surrounding information time series pattern δj appears in the data.
- n(γi, δj, γk, δl) represents the number of transitions from the pattern pair (γi, δj) to (γk, δl).
- The integrated pattern generation unit 41 obtains the (γk, δl) that maximizes the conditional probability P(γk, δl | γi, δj), so that the next vehicle information and vehicle surrounding information can be estimated. This resembles the way a driver predicts the future traffic environment from the vehicle information and surroundings perceived through experience. Therefore, the driving support system can present information that the driver can readily accept.
- the prediction of the next vehicle information and the vehicle surrounding information can be obtained by the following Expression 2.
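- The pair-count bookkeeping and argmax prediction described above can be sketched as follows; the estimator P(next pair | current pair) = (transition count) / (pair count) is a maximum-likelihood sketch consistent with the counts n(·) defined above, and the pattern labels are illustrative:

```python
from collections import defaultdict

def joint_transition_counts(pairs):
    """Count pair occurrences and pair-to-pair transitions from a sequence of
    (vehicle_pattern, surrounding_pattern) observations."""
    n_pair = defaultdict(int)    # n(gi, dj)
    n_trans = defaultdict(int)   # n(gi, dj, gk, dl)
    for (gi, dj), (gk, dl) in zip(pairs, pairs[1:]):
        n_pair[(gi, dj)] += 1
        n_trans[(gi, dj, gk, dl)] += 1
    return n_pair, n_trans

def predict_pair(gi, dj, n_pair, n_trans):
    """Return the (gk, dl) maximizing n(gi,dj,gk,dl) / n(gi,dj)."""
    best, best_p = None, 0.0
    for (a, b, gk, dl), n in n_trans.items():
        if (a, b) == (gi, dj):
            p = n / n_pair[(gi, dj)]
            if p > best_p:
                best, best_p = (gk, dl), p
    return best

pairs = [("cruise", "clear"), ("brake", "queue"), ("stop", "queue"),
         ("cruise", "clear"), ("brake", "queue")]
n_pair, n_trans = joint_transition_counts(pairs)
print(predict_pair("cruise", "clear", n_pair, n_trans))  # ('brake', 'queue')
```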
- each time series pattern calculated using each driving information is associated as a time series pattern acquired in the same traffic environment. That is, two or more time series patterns based on driving information collected in the same traffic environment are stored in association with each other.
- With this driving support system, it is possible to accumulate information while maintaining the relationship between the driving operation corresponding to a traffic environment and information such as the traffic environment and the driver's state.
- Based on the observable driving information, the system can not only predict the next state in time but also estimate related driving information that should be occurring at the same time in the same traffic environment yet cannot be observed, and provide the result as driving assistance to the driver.
- the driving support system shown as the third embodiment is realized by a network system as shown in FIG. 14, for example.
- the driving support system shown in FIG. 14 has a communication function in which the in-vehicle device 201 transmits and receives information to and from various devices outside the vehicle.
- This communication function can communicate with the base station 202, the infrastructure device 203, the on-vehicle device 204, and the mobile device 205.
- This communication function exchanges information via a telephone line or an Internet line via packet communication of a mobile communication service or wireless communication such as a wireless LAN.
- The components of the driving support system may be connected wirelessly; for example, the base station 202 may be provided as a data center outside the vehicle, and the arithmetic processing may be shared with the base station 202 rather than performed only by the host vehicle's in-vehicle device 201.
- this driving support system has a configuration in which the vehicle-mounted device 201 and the base station 202 are connected by a communication line.
- Compared with the driving support system of the first embodiment, this driving support system adds a communication unit 201a and a temporary storage unit 201b to the in-vehicle device 201, and a communication unit 202a to the base station 202.
- The communication unit 201a, which exchanges information between the in-vehicle device 201 and the outside of the vehicle, does not necessarily need to be able to communicate at all times. For example, it may communicate at regular intervals, only when the communication line condition is good, when idling at zero vehicle speed, while an electric vehicle is charging, at the start or after the end of travel, or only when the driver permits communication. Alternatively, recorded driving information may be stored while the communication line is disconnected, and exchanged in parts or in a batch when the connection is established.
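- One illustrative policy combining the opportunistic conditions listed above can be sketched as follows; the thresholds, the gate on driver permission, and the OR structure are assumptions for illustration, not conditions fixed by the patent:

```python
def should_communicate(vehicle_speed, line_quality, charging, driver_allows):
    """Decide whether the in-vehicle device 201 may exchange data now.

    Encodes, as one example policy, the conditions named above: idling at
    zero speed, a good communication line, charging, and driver permission
    (treated here as a mandatory gate)."""
    if not driver_allows:
        return False
    idling = vehicle_speed == 0
    good_line = line_quality >= 0.8      # illustrative threshold (0..1 scale)
    return idling or good_line or charging

print(should_communicate(0, 0.2, charging=False, driver_allows=True))   # True
print(should_communicate(60, 0.2, charging=False, driver_allows=True))  # False
```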
- the communication unit 201a is not necessarily required to perform wireless communication such as packet communication of a mobile communication service or wireless LAN, and may exchange information by wire when the vehicle is parked.
- a removable storage medium is installed in the vehicle on-vehicle device 201, and driving information is stored in the storage medium during traveling. Then, the owner of the vehicle may attach / detach the storage medium and manually upload the driving information and download the map information.
- The communication units 201a and 202a may also use information distribution from communication equipment provided as the infrastructure device 203, such as roadside DSRC units (spot communication: 5.8 GHz narrow-band Dedicated Short Range Communication, as used in ETC), VICS (registered trademark) (road traffic information communication system: Vehicle Information and Communication System), FM multiplex broadcasting, digital broadcasting, radio wave beacons, and optical beacons.
- The communication unit 201a may also exchange information with an imaging device (traffic monitoring camera) or a sensor installed on the road, and treat the result as information obtained by the driving information acquisition unit 1. For example, it is possible to acquire fixed-point traffic observations that cannot be collected by a moving vehicle from sensors installed on the road.
- The communication unit 201a is not limited to road-to-vehicle exchange and may also communicate with the in-vehicle device 204 of another vehicle. Furthermore, the communication unit 201a may exchange information with a mobile device 205 carried by a pedestrian or a cyclist. By acquiring travel data from vehicles other than the host vehicle, driving information acquired by vehicles with many in-vehicle sensors can be provided to vehicles with few, and relative positional relationships in the same environment can be included in the information.
- By exchanging information with the mobile devices 205 of pedestrians and cyclists, the presence and movement of moving objects other than vehicles can be used as information, and pattern information can be exchanged among pedestrians, bicycles, and vehicles to avoid danger.
- Although FIG. 14 illustrates only two vehicles, that is, the host vehicle carrying the in-vehicle device 201 and another vehicle carrying the in-vehicle device 204, any number of vehicles can transmit and receive information to and from the base station (data center) 202 via the communication path. Similarly, only one infrastructure device 203 and one mobile device 205 are shown, but other devices (not shown) may also exchange information with multiple devices and vehicles via the communication path; their number is not limited.
- Since the system includes the communication unit 201a, which exchanges information with communication devices outside the vehicle, the information acquired by the communication unit 201a can be used as driving information by the driving information acquisition unit 1.
- the driving support system can acquire travel data of other vehicles by the vehicle on-vehicle device 201, and can create and use a wide range of map information such as a place where the vehicle has not traveled yet.
- The driving support system can also collect information from imaging devices and sensors installed on the road in the traffic environment. Furthermore, since the in-vehicle device 201 can communicate with a computing device outside the vehicle, such as a server environment, it is not necessary to perform all calculation processing on the in-vehicle device 201, and the processing can be shared optimally in consideration of communication load and calculation amount.
- the driving support system shown as the fourth embodiment has an attribute assigning unit 8 as compared with the driving support system of the above-described embodiment, for example, as shown in FIG.
- the attribute assigning unit 8 assigns an attribute to the driving information acquired by the driving information acquiring unit 1.
- the statistical information generation unit 3 performs statistical processing using the driving information as a time series pattern for each attribute assigned by the attribute assigning unit 8.
- An attribute assigned by the attribute assigning unit 8 is information for classifying the situation at the time of driving information acquisition. Attributes include, for example, attributes related to time (date, time, time zone, day of the week, month, quarter, season, etc.), attributes related to the driving environment (congestion, crowding, cruising, stop-and-go, residential areas, school zones, mountainous areas, etc.), attributes related to the purpose of the trip (commuting, work, leisure, long-distance travel, etc.), and attributes related to the vehicle occupants (one person, multiple persons, family members, senior citizens, novice drivers, etc.).
- the attribute assigning unit 8 may automatically give an attribute to the driving information when the driving information is acquired, or may give the attribute to the driving information according to the driver's switch operation.
- This driving support system performs, for example, the processing shown in FIG. 17 and performs statistical processing on the driving information to which the attribute is given.
- in step S51, the statistical information generation unit 3 of the driving support system classifies, by attribute, the attribute-tagged time-series driving information acquired in step S11.
- in step S12, the statistical information generation unit 3 segments the driving information classified by attribute.
- in step S52, the statistical information generation unit 3 reads out the time-series patterns stored as map information together with each attribute.
- in steps S14, S15, and S17, the time-series pattern is updated or a new time-series pattern is generated, as in the above-described embodiments, irrespective of the attribute.
- the attribute assigned before the calculation process is attached to the newly calculated time-series pattern, which is then transmitted to the map information generation unit 4.
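The flow above (classify by attribute, read out the stored patterns for that attribute, then update a matching pattern or create a new one) can be sketched as follows. The averaging update rule and the distance threshold are assumptions; the text only states that a pattern is updated or newly generated.

```python
import numpy as np

def process_by_attribute(samples, pattern_store, threshold=1.0, alpha=0.1):
    """samples: list of (attribute_key, time_series) pairs.
    pattern_store: dict mapping attribute_key -> list of stored patterns.

    Updates a stored pattern when the new series is close to it,
    otherwise registers the series as a new pattern (a rough analogue
    of steps S51, S52, and S14-S17 described in the text).
    """
    # S51: classify the time-series driving information by attribute.
    by_attr = {}
    for key, series in samples:
        by_attr.setdefault(key, []).append(np.asarray(series, dtype=float))
    for key, series_list in by_attr.items():
        # S52: read out the patterns stored together with this attribute.
        patterns = pattern_store.setdefault(key, [])
        for series in series_list:
            # S14/S15: find the nearest stored pattern of equal length.
            candidates = [p for p in patterns if len(p) == len(series)]
            if candidates:
                dists = [float(np.mean(np.abs(p - series))) for p in candidates]
                i = int(np.argmin(dists))
                if dists[i] < threshold:
                    # Update the matched pattern toward the new series in place.
                    candidates[i] += alpha * (series - candidates[i])
                    continue
            # S17: no sufficiently close pattern -> generate a new one.
            patterns.append(series.copy())
    return pattern_store
```

Because the store is keyed by attribute, the same stretch of road can accumulate separate patterns for, say, rainy weekday mornings and clear Sunday afternoons.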
- in this way, attributes are assigned to driving information, and statistical processing is performed on the driving information as a time-series pattern for each attribute, so a time-series pattern can be obtained per attribute and used for driving support. For example, even at the same place on the map, the characteristics of locations where the traffic environment changes with the time zone, season, day of the week, or weather can be described in more detail, attribute by attribute.
- a time-series pattern of driving information can be extracted for each level of driving skill, and driving support such as information provision and operation support matched to that skill level can be performed.
- the driver can select an avoidance route in advance even when passing through at a different time of day or passing through a place for the first time.
- because the time-series patterns of the statistically processed driving information are stored as map information in association with position information, temporal changes in dynamic driving information can be stored, and driving assistance can be provided with respect to the dynamic environment.
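A minimal sketch of how such attribute-tagged patterns might be stored and referenced as map information. The class name, the grid-cell keying, and the lookup interface are illustrative assumptions, not the patented structure.

```python
class PatternMap:
    """Stores time-series patterns of driving information keyed by a
    (grid cell, attribute) pair, so that the same place on the map can
    hold different characteristics per time zone, season, weather, etc."""

    def __init__(self, cell_deg=0.001):
        self.cell_deg = cell_deg  # assumed grid resolution (roughly 100 m)
        self._store = {}

    def _cell(self, lat, lon):
        # Quantize a position into a grid cell; register and reference
        # use the same quantization, so lookups stay consistent.
        return (round(lat / self.cell_deg), round(lon / self.cell_deg))

    def register(self, lat, lon, attribute, pattern):
        # Associate a statistically processed pattern with position + attribute.
        self._store[(self._cell(lat, lon), attribute)] = pattern

    def reference(self, lat, lon, attribute):
        # Read out the pattern for the current position and situation;
        # a driving support unit would use this to assist the driver.
        return self._store.get((self._cell(lat, lon), attribute))
```

A reference that returns nothing simply means no pattern has been learned yet for that place and situation, in which case a system like this would fall back to generic support.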
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Theoretical Computer Science (AREA)
- Mathematical Physics (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Traffic Control Systems (AREA)
- Instructional Devices (AREA)
- Navigation (AREA)
Abstract
Description
The driving support system shown as the first embodiment of the present invention is configured, for example, as shown in FIG. 1. As shown in FIG. 1, this driving support system includes a driving information acquisition unit 1, a vehicle position information acquisition unit 2, a statistical information generation unit 3, a map information generation unit 4, a map information reference unit 5, a driving support unit 6, and a map information storage unit 7.
Next, the driving support system according to the second embodiment will be described. Parts identical to those of the above-described first embodiment are denoted by the same reference numerals, and their detailed description is omitted.
Next, the driving support system according to the third embodiment will be described. Parts identical to those of the above-described embodiments are denoted by the same reference numerals, and their detailed description is omitted.
Next, the driving support system according to the fourth embodiment will be described. Parts identical to those of the above-described embodiments are denoted by the same reference numerals, and their detailed description is omitted.
2 vehicle position information acquisition unit
3 statistical information generation unit
4 map information generation unit
5 map information reference unit
6 driving support unit
7 map information storage unit
8 attribute assigning unit
11 driver state information acquisition unit
12 vehicle surroundings information acquisition unit
13 vehicle information acquisition unit
41 integrated pattern generation unit
71 map information update unit
201 host-vehicle on-vehicle device
201a communication unit
201b temporary storage unit
202 base station
202a communication unit
203 infrastructure device
204 other-vehicle on-vehicle device
205 mobile device
Claims (7)
- driving information acquisition means for acquiring, as driving information, any of a driver's operation amount and vehicle behavior, a driver's state, and environmental information around the vehicle;
vehicle position information acquisition means for acquiring position information of the vehicle;
statistical information generation means for statistically processing the driving information as a time-series pattern to generate statistical information;
map information generation means for generating map information by associating the statistical information with the position information of the vehicle at the time the driving information was acquired;
map update means for updating existing map information to the map information generated by the map information generation means;
map information reference means for referring to and reading out the map information based on the position information acquired by the vehicle position information acquisition means; and
driving support means for providing driving support based on the map information read out by the map information reference means,
a driving support system characterized by comprising the above. - The driving information acquisition means acquires a plurality of pieces of information among the driver's operation amount and vehicle behavior, the driver's state, and the environmental information around the vehicle, and
the system has integrated pattern generation means for associating the time-series patterns calculated from the respective pieces of driving information acquired by the driving information acquisition means as time-series patterns acquired in the same environment,
the driving support system according to claim 1 being characterized by the above. - The system has communication means for exchanging information with a communication device outside the vehicle, and
the driving information acquisition means uses the information acquired by the communication means as driving information,
the driving support system according to claim 1 or claim 2 being characterized by the above. - The driving support system according to any one of claims 1 to 3, characterized in that the statistical information generation means divides the driving information into sections on the map and calculates a time-series pattern for each section using the driving information divided for each section.
- The statistical information generation means calculates a time-series pattern of the driving information irrespective of sections on the map,
the system has pattern determination means for determining which of a plurality of preset time-series patterns the time-series pattern calculated by the statistical information generation means corresponds to, and
the map information generation means generates map information by associating the time-series pattern determined by the pattern determination means with the driving information,
the driving support system according to any one of claims 1 to 3 being characterized by the above. - The system has attribute assigning means for assigning an attribute to the driving information acquired by the driving information acquisition means, and
the statistical information generation means performs statistical processing on the driving information as a time-series pattern for each attribute assigned by the attribute assigning means,
the driving support system according to any one of claims 1 to 5 being characterized by the above. - A driving support method comprising: acquiring, as driving information, any of a driver's operation amount and vehicle behavior, a driver's state, and environmental information around the vehicle, and acquiring position information of the vehicle;
statistically processing the driving information as a time-series pattern to generate statistical information;
generating map information by associating the statistical information with the position information of the vehicle at the time the driving information was acquired;
updating existing map information to the generated map information;
referring to and reading out the map information based on the acquired position information of the vehicle; and
providing driving support based on the read-out map information,
the driving support method being characterized by the above.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP13819767.8A EP2876620B1 (en) | 2012-07-17 | 2013-07-16 | Driving assistance system and driving assistance method |
CN201380037746.3A CN104508719B (zh) | 2012-07-17 | 2013-07-16 | 驾驶辅助***以及驾驶辅助方法 |
JP2014525822A JP6200421B2 (ja) | 2012-07-17 | 2013-07-16 | 運転支援システム及び運転支援方法 |
RU2015105174A RU2015105174A (ru) | 2012-07-17 | 2013-07-16 | Система помощи при вождении и способ помощи при вождении |
US14/415,162 US10161754B2 (en) | 2012-07-17 | 2013-07-16 | Driving assistance system and driving assistance method |
MX2015000832A MX2015000832A (es) | 2012-07-17 | 2013-07-16 | Sistema de asistencia a la conduccion y metodo de asistencia a la conduccion. |
BR112015000983A BR112015000983A2 (pt) | 2012-07-17 | 2013-07-16 | sistema de assistência de condução e método de assistência de condução |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012158314 | 2012-07-17 | ||
JP2012-158314 | 2012-07-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014013985A1 true WO2014013985A1 (ja) | 2014-01-23 |
Family
ID=49948811
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/069294 WO2014013985A1 (ja) | 2012-07-17 | 2013-07-16 | 運転支援システム及び運転支援方法 |
Country Status (8)
Country | Link |
---|---|
US (1) | US10161754B2 (ja) |
EP (1) | EP2876620B1 (ja) |
JP (1) | JP6200421B2 (ja) |
CN (1) | CN104508719B (ja) |
BR (1) | BR112015000983A2 (ja) |
MX (1) | MX2015000832A (ja) |
RU (1) | RU2015105174A (ja) |
WO (1) | WO2014013985A1 (ja) |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105059288A (zh) * | 2015-08-11 | 2015-11-18 | 奇瑞汽车股份有限公司 | 一种车道保持控制***及方法 |
JP2015230694A (ja) * | 2014-06-06 | 2015-12-21 | 株式会社デンソー | 運転コンテキスト情報生成装置 |
JP2016095707A (ja) * | 2014-11-14 | 2016-05-26 | 株式会社デンソー | 遷移予測データ生成装置および遷移予測装置 |
CN105629977A (zh) * | 2016-03-06 | 2016-06-01 | 王保亮 | 一种全自动导航式无人驾驶电动汽车的使用方法 |
JP2016103267A (ja) * | 2014-11-14 | 2016-06-02 | 株式会社デンソー | 運転データ収集システム |
CN105691230A (zh) * | 2016-03-06 | 2016-06-22 | 王保亮 | 全自动导航式无人驾驶电动汽车 |
CN105810013A (zh) * | 2014-12-30 | 2016-07-27 | ***通信集团公司 | 一种基于车群风险的车辆防撞控制方法和装置 |
CN107074178A (zh) * | 2014-09-16 | 2017-08-18 | 本田技研工业株式会社 | 驾驶辅助装置 |
WO2017179209A1 (ja) * | 2016-04-15 | 2017-10-19 | 本田技研工業株式会社 | 車両制御システム、車両制御方法、および車両制御プログラム |
WO2017179151A1 (ja) * | 2016-04-13 | 2017-10-19 | 本田技研工業株式会社 | 車両制御システム、車両制御方法、および車両制御プログラム |
WO2017208529A1 (ja) * | 2016-06-02 | 2017-12-07 | オムロン株式会社 | 運転者状態推定装置、運転者状態推定システム、運転者状態推定方法、運転者状態推定プログラム、対象者状態推定装置、対象者状態推定方法、対象者状態推定プログラム、および記録媒体 |
JP2017220197A (ja) * | 2016-06-12 | 2017-12-14 | バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド | 車両制御方法と装置及び判断モジュールの獲得方法と装置 |
US10093320B2 (en) | 2016-04-14 | 2018-10-09 | Toyota Jidosha Kabushiki Kaisha | Drive support device for controlling drive support functions and evaluating driver performance |
WO2018189990A1 (ja) * | 2017-04-11 | 2018-10-18 | 株式会社デンソー | 車両用報知装置 |
WO2019008755A1 (ja) * | 2017-07-07 | 2019-01-10 | マクセル株式会社 | 情報処理システム及びそれに用いる情報処理システムインフラ及び情報処理方法 |
JP2019016238A (ja) * | 2017-07-07 | 2019-01-31 | Kddi株式会社 | 運転車両信号から個人特性を特定しやすい道路区間を推定する推定装置、車両端末、プログラム及び方法 |
WO2019022119A1 (ja) * | 2017-07-26 | 2019-01-31 | 学校法人五島育英会 | 移動体データ処理装置及びコンピュータプログラム |
CN109993966A (zh) * | 2018-01-02 | 2019-07-09 | ***通信有限公司研究院 | 一种构建用户画像的方法及装置 |
JP2020060674A (ja) * | 2018-10-10 | 2020-04-16 | トヨタ自動車株式会社 | 地図情報システム |
US10692369B2 (en) | 2016-04-14 | 2020-06-23 | Toyota Jidosha Kabushiki Kaisha | Server and information providing device |
US10691123B2 (en) | 2016-04-08 | 2020-06-23 | Honda Motor Co., Ltd. | Vehicle control system, vehicle control method, and vehicle control program |
JP2020144332A (ja) * | 2019-03-08 | 2020-09-10 | トヨタ自動車株式会社 | 仮想現実システムおよび仮想現実方法 |
WO2020188691A1 (ja) * | 2019-03-18 | 2020-09-24 | 三菱電機株式会社 | 車載機器及び通信方法 |
JP2022058625A (ja) * | 2014-09-30 | 2022-04-12 | Case特許株式会社 | 自動運転制御装置及び車両 |
Families Citing this family (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014157359A1 (ja) * | 2013-03-28 | 2014-10-02 | 本田技研工業株式会社 | マップ生成システム、マップ生成装置、マップ生成方法およびプログラム |
JP6316068B2 (ja) * | 2014-03-31 | 2018-04-25 | 国立大学法人 東京大学 | 検査システムおよび検査方法 |
JP6433268B2 (ja) | 2014-03-31 | 2018-12-05 | 国立大学法人 東京大学 | 検査システムおよび検査方法 |
WO2015183143A1 (en) * | 2014-05-26 | 2015-12-03 | Telefonaktiebolaget L M Ericsson (Publ) | Methods and network nodes for notifying vehicle drivers about their driving |
US9321461B1 (en) * | 2014-08-29 | 2016-04-26 | Google Inc. | Change detection using curve alignment |
US9731713B2 (en) * | 2014-09-10 | 2017-08-15 | Volkswagen Ag | Modifying autonomous vehicle driving by recognizing vehicle characteristics |
CN104820424B (zh) * | 2015-05-15 | 2017-12-01 | 山东省计算中心(国家超级计算济南中心) | 基于北斗导航的电动汽车自动驾驶***及其控制方法 |
CN104952123A (zh) * | 2015-05-27 | 2015-09-30 | 关晓芙 | 安装在车辆中的车载设备及相关设备与方法 |
JP6439591B2 (ja) | 2015-05-31 | 2018-12-19 | 株式会社デンソー | 走行制御装置、走行制御方法 |
JP5945999B1 (ja) * | 2015-07-31 | 2016-07-05 | パナソニックIpマネジメント株式会社 | 運転支援装置、運転支援システム、運転支援方法、運転支援プログラム及び自動運転車両 |
US10846537B2 (en) * | 2015-09-30 | 2020-11-24 | Nec Corporation | Information processing device, determination device, notification system, information transmission method, and program |
DE102016205436A1 (de) * | 2015-11-25 | 2017-06-01 | Volkswagen Aktiengesellschaft | Verfahren und System zum Erstellen einer digitalen Karte |
DE102016205434A1 (de) * | 2015-11-25 | 2017-06-01 | Volkswagen Aktiengesellschaft | Verfahren und System zum Erstellen einer spurgenauen Belegungskarte für Fahrspuren |
US11663508B1 (en) * | 2015-12-15 | 2023-05-30 | Lytx, Inc. | Environmental condition-based risk level |
US10165171B2 (en) | 2016-01-22 | 2018-12-25 | Coban Technologies, Inc. | Systems, apparatuses, and methods for controlling audiovisual apparatuses |
DE102017101343A1 (de) * | 2016-01-26 | 2017-07-27 | GM Global Technology Operations LLC | Systeme und verfahren zur fahrzeugsystemsteuerung auf grundlage physiologischer merkmale |
CN107200022B (zh) * | 2016-03-15 | 2019-11-12 | 奥迪股份公司 | 驾驶辅助***和方法 |
KR20170109275A (ko) * | 2016-03-21 | 2017-09-29 | 현대자동차주식회사 | 차량 및 그 제어 방법 |
JP6214796B1 (ja) * | 2016-03-30 | 2017-10-18 | 三菱電機株式会社 | 走行計画生成装置、走行計画生成方法及び走行計画生成プログラム |
JP6790417B2 (ja) * | 2016-03-31 | 2020-11-25 | ソニー株式会社 | 情報処理装置及び情報処理サーバ |
US10370102B2 (en) | 2016-05-09 | 2019-08-06 | Coban Technologies, Inc. | Systems, apparatuses and methods for unmanned aerial vehicle |
US10152858B2 (en) | 2016-05-09 | 2018-12-11 | Coban Technologies, Inc. | Systems, apparatuses and methods for triggering actions based on data capture and characterization |
US10789840B2 (en) * | 2016-05-09 | 2020-09-29 | Coban Technologies, Inc. | Systems, apparatuses and methods for detecting driving behavior and triggering actions based on detected driving behavior |
US11222438B2 (en) * | 2016-05-27 | 2022-01-11 | Kabushiki Kaisha Toshiba | Information processing apparatus, vehicle, and information processing method for presence probability of object |
US11132611B2 (en) * | 2016-05-27 | 2021-09-28 | Kabushiki Kaisha Toshiba | Information processing apparatus and information processing method for determining presence probability of object |
US10007854B2 (en) * | 2016-07-07 | 2018-06-26 | Ants Technology (Hk) Limited | Computer vision based driver assistance devices, systems, methods and associated computer executable code |
DE102016214257A1 (de) * | 2016-08-02 | 2018-02-08 | Continental Teves Ag & Co. Ohg | Verfahren zum Bereitstellen einer Karte in einem Fahrzeug |
CN106326873B (zh) * | 2016-08-29 | 2019-04-16 | 吉林大学 | Cacc驾驶员肢体肌电信号表征的操纵意图预测方法 |
US9739627B1 (en) | 2016-10-18 | 2017-08-22 | Allstate Insurance Company | Road frustration index risk mapping and mitigation |
US10830605B1 (en) | 2016-10-18 | 2020-11-10 | Allstate Insurance Company | Personalized driving risk modeling and estimation system and methods |
CN106611481B (zh) * | 2016-11-02 | 2018-12-25 | 纳智源科技(唐山)有限责任公司 | 疲劳驾驶监控装置及可穿戴设备 |
CN109565564B (zh) * | 2016-11-18 | 2021-07-02 | Jvc 建伍株式会社 | 记录装置、记录方法、再现方法以及存储介质 |
EP3563365A4 (en) * | 2017-01-02 | 2020-08-12 | Visteon Global Technologies, Inc. | USE OF VEHICLE SENSOR INFORMATION FOR DATA RECOVERY |
US10380886B2 (en) | 2017-05-17 | 2019-08-13 | Cavh Llc | Connected automated vehicle highway systems and methods |
EP3348964A1 (de) | 2017-01-13 | 2018-07-18 | Carrosserie Hess AG | Verfahren zur vorhersage zukünftiger fahrbedingungen für ein fahrzeug |
DE102017100671A1 (de) * | 2017-01-16 | 2018-07-19 | Voith Patent Gmbh | Verfahren zur Optimierung einer Schaltstrategie |
JP6528797B2 (ja) * | 2017-03-30 | 2019-06-12 | トヨタ自動車株式会社 | 車載ミリ波通信装置および通信方法 |
JP7016351B2 (ja) * | 2017-03-31 | 2022-02-14 | 本田技研工業株式会社 | 車載装置、情報管理サーバ、情報管理システム、および方法 |
US10692365B2 (en) | 2017-06-20 | 2020-06-23 | Cavh Llc | Intelligent road infrastructure system (IRIS): systems and methods |
US11735035B2 (en) | 2017-05-17 | 2023-08-22 | Cavh Llc | Autonomous vehicle and cloud control (AVCC) system with roadside unit (RSU) network |
JP2018203214A (ja) * | 2017-06-09 | 2018-12-27 | アイシン精機株式会社 | 駐車支援装置、駐車支援方法、運転支援装置、および運転支援方法 |
JP7013722B2 (ja) * | 2017-08-22 | 2022-02-01 | 株式会社デンソー | 運転支援装置 |
CN107704870B (zh) * | 2017-09-07 | 2021-04-20 | 武汉大学 | 基于ble指纹定位与imu动作识别融合的可靠人车临近感知装置与方法 |
CA3079656C (en) * | 2017-10-24 | 2021-11-23 | Nissan North America, Inc. | Localization determination for vehicle operation |
EP3495220B1 (en) * | 2017-12-11 | 2024-04-03 | Volvo Car Corporation | Path prediction for a vehicle |
CA3088142A1 (en) | 2018-02-06 | 2019-08-15 | Cavh Llc | Intelligent road infrastructure system (iris): systems and methods |
CN108600938B (zh) * | 2018-03-29 | 2020-09-22 | 南京邮电大学 | 一种车联网中感知服务节点智能选择方法 |
DE102018205199B4 (de) * | 2018-04-06 | 2021-03-18 | Volkswagen Aktiengesellschaft | Ermittlung und Verwendung von Haltepunkten für Kraftfahrzeuge |
JP7272530B2 (ja) | 2018-05-09 | 2023-05-12 | シーエーブイエイチ エルエルシー | 車両と幹線道路間のドライビングインテリジェンス割り当てのためのシステム及び方法 |
US11842642B2 (en) | 2018-06-20 | 2023-12-12 | Cavh Llc | Connected automated vehicle highway systems and methods related to heavy vehicles |
CN110634324A (zh) * | 2018-06-22 | 2019-12-31 | 上海擎感智能科技有限公司 | 一种基于车载终端的礼让行人的提醒方法、***及车载终端 |
US12057011B2 (en) | 2018-06-28 | 2024-08-06 | Cavh Llc | Cloud-based technology for connected and automated vehicle highway systems |
WO2020014227A1 (en) | 2018-07-10 | 2020-01-16 | Cavh Llc | Route-specific services for connected automated vehicle highway systems |
US11373122B2 (en) | 2018-07-10 | 2022-06-28 | Cavh Llc | Fixed-route service system for CAVH systems |
KR102677702B1 (ko) * | 2018-09-11 | 2024-06-25 | 현대자동차주식회사 | 차량 및 그 제어방법 |
JP7031005B2 (ja) * | 2018-09-17 | 2022-03-07 | 日産自動車株式会社 | 車両挙動予測方法及び車両挙動予測装置 |
AT521724A1 (de) * | 2018-09-24 | 2020-04-15 | Avl List Gmbh | Verfahren und Vorrichtung zur Analyse eines Sensordatenstroms sowie Verfahren zum Führen eines Fahrzeugs |
CN109376682A (zh) * | 2018-11-06 | 2019-02-22 | 东莞市凯木金电子科技有限公司 | 一种智能摄像头及疲劳状态识别方法 |
CN109540162B (zh) * | 2018-11-12 | 2021-12-21 | 北京四维图新科技股份有限公司 | Adas地图数据的处理方法、获取方法、装置及车载设备 |
CN111413957B (zh) * | 2018-12-18 | 2021-11-02 | 北京航迹科技有限公司 | 用于确定自动驾驶中的驾驶动作的***和方法 |
DE102019203739A1 (de) * | 2018-12-20 | 2020-06-25 | Continental Automotive Gmbh | Datenspeicher, Recheneinheit und Verfahren zum Ausführen einer Funktion eines Fahrzeuges |
JP7238393B2 (ja) * | 2018-12-25 | 2023-03-14 | 株式会社デンソー | 地図データ生成装置、地図データ生成システム、地図データ生成プログラム及び記憶媒体 |
CN109774492B (zh) * | 2018-12-29 | 2021-06-22 | 江苏大学 | 一种基于未来驱动功率需求的纯电动汽车整车功率分配方法 |
CN109808705B (zh) * | 2019-01-23 | 2021-11-02 | 青岛慧拓智能机器有限公司 | 一种用于远程遥控驾驶控制的*** |
TWI715958B (zh) * | 2019-04-08 | 2021-01-11 | 國立交通大學 | 評估駕駛者之疲勞分數的方法 |
KR20200135630A (ko) * | 2019-05-23 | 2020-12-03 | 현대자동차주식회사 | 자율 주행 차량의 제어장치 및 그 방법 |
JP7269103B2 (ja) * | 2019-06-05 | 2023-05-08 | 日立Astemo株式会社 | 電子制御装置、制御方法、自動運転システム |
US11142214B2 (en) | 2019-08-06 | 2021-10-12 | Bendix Commercial Vehicle Systems Llc | System, controller and method for maintaining an advanced driver assistance system as active |
CN110689642B (zh) * | 2019-09-18 | 2020-10-09 | 山东大学 | 基于车载obd数据及概率统计的异常驾驶判别方法及*** |
CN112309117A (zh) * | 2020-10-30 | 2021-02-02 | 上海炬宏信息技术有限公司 | 一种基于密度聚类的交通事件融合***和方法 |
US11718314B1 (en) * | 2022-03-11 | 2023-08-08 | Aptiv Technologies Limited | Pedestrian alert system |
GB2618341A (en) * | 2022-05-03 | 2023-11-08 | Oxa Autonomy Ltd | Controlling an autonomous vehicle |
CN115346362B (zh) * | 2022-06-10 | 2024-04-09 | 斑马网络技术有限公司 | 行车数据处理方法、装置、电子设备及存储介质 |
CN116486606B (zh) * | 2023-03-07 | 2023-11-24 | 智能网联汽车(山东)协同创新研究院有限公司 | 一种智能网联车载终端中央控制*** |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11321690A (ja) * | 1998-03-10 | 1999-11-24 | Nissan Motor Co Ltd | レーンキープシステム |
JP2005196567A (ja) | 2004-01-08 | 2005-07-21 | Nissan Motor Co Ltd | 顔向き検出装置 |
JP4027838B2 (ja) | 2003-05-08 | 2007-12-26 | 独立行政法人科学技術振興機構 | 隠れマルコフモデルによる運動データの認識・生成方法、それを用いた運動制御方法及びそのシステム |
JP2008250687A (ja) | 2007-03-30 | 2008-10-16 | Aisin Aw Co Ltd | 地物情報収集装置及び地物情報収集方法 |
JP2008292498A (ja) * | 2001-08-06 | 2008-12-04 | Panasonic Corp | 情報提供方法 |
JP2011134207A (ja) * | 2009-12-25 | 2011-07-07 | Konica Minolta Holdings Inc | 運転記録装置および地図作成システム |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7715961B1 (en) * | 2004-04-28 | 2010-05-11 | Agnik, Llc | Onboard driver, vehicle and fleet data mining |
US7783417B2 (en) * | 2007-03-09 | 2010-08-24 | Mitac International Corporation | Methods and apparatus for determining a route having an estimated minimum fuel usage for a vehicle |
JP4427759B2 (ja) | 2007-06-29 | 2010-03-10 | アイシン・エィ・ダブリュ株式会社 | 車両挙動学習装置及び車両挙動学習プログラム |
WO2009098071A1 (de) * | 2008-02-08 | 2009-08-13 | FKFS Forschungsinstitut für Kraftfahrwesen und Fahrzeugmotoren Stuttgart | Vorrichtung und verfahren zur bereitstellung von informationen über fahrsituationen |
EP2356640A4 (en) | 2008-11-13 | 2012-11-14 | Aser Rich Ltd | SYSTEM AND METHOD FOR IMPROVING VEHICLE SAFETY BY BETTER SENSITIZATION TO THE DRIVER SITUATION OF A VEHICLE |
JP5691145B2 (ja) * | 2009-08-10 | 2015-04-01 | ソニー株式会社 | 車両経路判定方法およびナビゲーション装置 |
DE102010048263A1 (de) * | 2010-10-12 | 2011-05-19 | Daimler Ag | Verfahren und Vorrichtung zum Unterstützen eines Fahrers eines Fahrzeugs |
CN201910918U (zh) * | 2010-12-30 | 2011-07-27 | 上海博泰悦臻电子设备制造有限公司 | 业务***及车载终端 |
-
2013
- 2013-07-16 EP EP13819767.8A patent/EP2876620B1/en active Active
- 2013-07-16 JP JP2014525822A patent/JP6200421B2/ja active Active
- 2013-07-16 CN CN201380037746.3A patent/CN104508719B/zh active Active
- 2013-07-16 RU RU2015105174A patent/RU2015105174A/ru not_active Application Discontinuation
- 2013-07-16 WO PCT/JP2013/069294 patent/WO2014013985A1/ja active Application Filing
- 2013-07-16 MX MX2015000832A patent/MX2015000832A/es not_active Application Discontinuation
- 2013-07-16 US US14/415,162 patent/US10161754B2/en active Active
- 2013-07-16 BR BR112015000983A patent/BR112015000983A2/pt not_active IP Right Cessation
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11321690A (ja) * | 1998-03-10 | 1999-11-24 | Nissan Motor Co Ltd | レーンキープシステム |
JP2008292498A (ja) * | 2001-08-06 | 2008-12-04 | Panasonic Corp | 情報提供方法 |
JP4027838B2 (ja) | 2003-05-08 | 2007-12-26 | 独立行政法人科学技術振興機構 | 隠れマルコフモデルによる運動データの認識・生成方法、それを用いた運動制御方法及びそのシステム |
JP2005196567A (ja) | 2004-01-08 | 2005-07-21 | Nissan Motor Co Ltd | 顔向き検出装置 |
JP2008250687A (ja) | 2007-03-30 | 2008-10-16 | Aisin Aw Co Ltd | 地物情報収集装置及び地物情報収集方法 |
JP2011134207A (ja) * | 2009-12-25 | 2011-07-07 | Konica Minolta Holdings Inc | 運転記録装置および地図作成システム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2876620A4 |
Cited By (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015230694A (ja) * | 2014-06-06 | 2015-12-21 | 株式会社デンソー | 運転コンテキスト情報生成装置 |
CN107074178A (zh) * | 2014-09-16 | 2017-08-18 | 本田技研工业株式会社 | 驾驶辅助装置 |
CN107074178B (zh) * | 2014-09-16 | 2018-07-24 | 本田技研工业株式会社 | 驾驶辅助装置 |
JP2022058625A (ja) * | 2014-09-30 | 2022-04-12 | Case特許株式会社 | 自動運転制御装置及び車両 |
JP2016103267A (ja) * | 2014-11-14 | 2016-06-02 | 株式会社デンソー | 運転データ収集システム |
JP2016095707A (ja) * | 2014-11-14 | 2016-05-26 | 株式会社デンソー | 遷移予測データ生成装置および遷移予測装置 |
CN105810013A (zh) * | 2014-12-30 | 2016-07-27 | ***通信集团公司 | 一种基于车群风险的车辆防撞控制方法和装置 |
CN105059288A (zh) * | 2015-08-11 | 2015-11-18 | 奇瑞汽车股份有限公司 | 一种车道保持控制***及方法 |
CN105691230A (zh) * | 2016-03-06 | 2016-06-22 | 王保亮 | 全自动导航式无人驾驶电动汽车 |
CN105629977A (zh) * | 2016-03-06 | 2016-06-01 | 王保亮 | 一种全自动导航式无人驾驶电动汽车的使用方法 |
US10691123B2 (en) | 2016-04-08 | 2020-06-23 | Honda Motor Co., Ltd. | Vehicle control system, vehicle control method, and vehicle control program |
JPWO2017179151A1 (ja) * | 2016-04-13 | 2018-10-25 | 本田技研工業株式会社 | 車両制御システム、車両制御方法、および車両制御プログラム |
WO2017179151A1 (ja) * | 2016-04-13 | 2017-10-19 | 本田技研工業株式会社 | 車両制御システム、車両制御方法、および車両制御プログラム |
US10676101B2 (en) | 2016-04-13 | 2020-06-09 | Honda Motor Co., Ltd. | Vehicle control system, vehicle control method, and vehicle control program |
US10093320B2 (en) | 2016-04-14 | 2018-10-09 | Toyota Jidosha Kabushiki Kaisha | Drive support device for controlling drive support functions and evaluating driver performance |
US10692369B2 (en) | 2016-04-14 | 2020-06-23 | Toyota Jidosha Kabushiki Kaisha | Server and information providing device |
WO2017179209A1 (ja) * | 2016-04-15 | 2017-10-19 | 本田技研工業株式会社 | 車両制御システム、車両制御方法、および車両制御プログラム |
CN108885828B (zh) * | 2016-04-15 | 2021-08-17 | 本田技研工业株式会社 | 车辆控制***、车辆控制方法及存储介质 |
CN108885828A (zh) * | 2016-04-15 | 2018-11-23 | 本田技研工业株式会社 | 车辆控制***、车辆控制方法及车辆控制程序 |
JPWO2017179209A1 (ja) * | 2016-04-15 | 2018-10-18 | 本田技研工業株式会社 | 車両制御システム、車両制御方法、および車両制御プログラム |
US11169537B2 (en) | 2016-04-15 | 2021-11-09 | Honda Motor Co., Ltd. | Providing driving support in response to changes in driving environment |
WO2017208529A1 (ja) * | 2016-06-02 | 2017-12-07 | オムロン株式会社 | 運転者状態推定装置、運転者状態推定システム、運転者状態推定方法、運転者状態推定プログラム、対象者状態推定装置、対象者状態推定方法、対象者状態推定プログラム、および記録媒体 |
US10429841B2 (en) | 2016-06-12 | 2019-10-01 | Baidu Online Network Technology (Beijing) Co., Ltd. | Vehicle control method and apparatus and method and apparatus for acquiring decision-making model |
JP2017220197A (ja) * | 2016-06-12 | 2017-12-14 | バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド | 車両制御方法と装置及び判断モジュールの獲得方法と装置 |
JP2018179704A (ja) * | 2017-04-11 | 2018-11-15 | 株式会社デンソー | 車両用報知装置 |
US11191468B2 (en) | 2017-04-11 | 2021-12-07 | Denso Corporation | Vehicular notification apparatus |
WO2018189990A1 (ja) * | 2017-04-11 | 2018-10-18 | 株式会社デンソー | 車両用報知装置 |
WO2019008755A1 (ja) * | 2017-07-07 | 2019-01-10 | マクセル株式会社 | 情報処理システム及びそれに用いる情報処理システムインフラ及び情報処理方法 |
JP2019016238A (ja) * | 2017-07-07 | 2019-01-31 | Kddi株式会社 | 運転車両信号から個人特性を特定しやすい道路区間を推定する推定装置、車両端末、プログラム及び方法 |
WO2019022119A1 (ja) * | 2017-07-26 | 2019-01-31 | 学校法人五島育英会 | 移動体データ処理装置及びコンピュータプログラム |
CN109993966A (zh) * | 2018-01-02 | 2019-07-09 | ***通信有限公司研究院 | 一种构建用户画像的方法及装置 |
JP2020060674A (ja) * | 2018-10-10 | 2020-04-16 | トヨタ自動車株式会社 | 地図情報システム |
JP7147448B2 (ja) | 2018-10-10 | 2022-10-05 | トヨタ自動車株式会社 | 地図情報システム |
JP2020144332A (ja) * | 2019-03-08 | 2020-09-10 | トヨタ自動車株式会社 | 仮想現実システムおよび仮想現実方法 |
WO2020188691A1 (ja) * | 2019-03-18 | 2020-09-24 | 三菱電機株式会社 | 車載機器及び通信方法 |
Also Published As
Publication number | Publication date |
---|---|
EP2876620A1 (en) | 2015-05-27 |
EP2876620A4 (en) | 2016-03-23 |
EP2876620B1 (en) | 2019-08-14 |
CN104508719A (zh) | 2015-04-08 |
BR112015000983A2 (pt) | 2017-06-27 |
CN104508719B (zh) | 2018-02-23 |
JP6200421B2 (ja) | 2017-09-20 |
US10161754B2 (en) | 2018-12-25 |
RU2015105174A (ru) | 2016-09-10 |
JPWO2014013985A1 (ja) | 2016-06-30 |
MX2015000832A (es) | 2015-04-08 |
US20150211868A1 (en) | 2015-07-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6200421B2 (ja) | 運転支援システム及び運転支援方法 | |
US11640174B2 (en) | Smart vehicle | |
US10816993B1 (en) | Smart vehicle | |
US10928830B1 (en) | Smart vehicle | |
US11990036B2 (en) | Driver behavior monitoring | |
US20200151479A1 (en) | Method and apparatus for providing driver information via audio and video metadata extraction | |
US10769456B2 (en) | Systems and methods for near-crash determination | |
Bila et al. | Vehicles of the future: A survey of research on safety issues | |
JP7499256B2 (ja) | ドライバの挙動を分類するためのシステムおよび方法 | |
US20150178572A1 (en) | Road surface condition classification method and system | |
US10521749B2 (en) | Risk information processing method and server device | |
JP2015075398A (ja) | 車両用車線案内システム及び車両用車線案内方法 | |
US12049218B2 (en) | Evaluating the safety performance of vehicles | |
JP2014081947A (ja) | 情報配信装置 | |
US20230048304A1 (en) | Environmentally aware prediction of human behaviors | |
US20230134342A1 (en) | System and/or method for vehicle trip classification | |
US20230303122A1 (en) | Vehicle of interest detection by autonomous vehicles based on amber alerts | |
JP2023171455A (ja) | 経路予測装置、それを備えた車載装置、経路予測システム、経路予測方法、及びコンピュータプログラム | |
CN117416344A (zh) | 自主驾驶***中校车的状态估计 | |
Rodemerk | Potential of Driving Style Adaptation for a Maneuver Prediction System at Urban Intersections |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13819767 Country of ref document: EP Kind code of ref document: A1 |
|
DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) | ||
ENP | Entry into the national phase |
Ref document number: 2014525822 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14415162 Country of ref document: US Ref document number: MX/A/2015/000832 Country of ref document: MX |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: IDP00201500304 Country of ref document: ID |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2013819767 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2015105174 Country of ref document: RU Kind code of ref document: A |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112015000983 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112015000983 Country of ref document: BR Kind code of ref document: A2 Effective date: 20150115 |