CN110936960A - Driving assisting method and system - Google Patents

Driving assisting method and system

Info

Publication number
CN110936960A
CN110936960A
Authority
CN
China
Prior art keywords
driving
vehicle
information
road
road data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811108869.2A
Other languages
Chinese (zh)
Inventor
吴栋磊
蔡岭
朱永盛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Banma Zhixing Network Hongkong Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201811108869.2A priority Critical patent/CN110936960A/en
Priority to TW108124236A priority patent/TW202031538A/en
Priority to PCT/CN2019/105278 priority patent/WO2020057406A1/en
Publication of CN110936960A publication Critical patent/CN110936960A/en


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models, related to ambient conditions
    • B60W40/06 Road conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/10 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models, related to vehicle motion
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a driving assisting method comprising the following steps: acquiring road data within a predetermined range; identifying, among the objects in the road data, one or more vehicles and their motion information; determining driving-related information for each identified vehicle based on the road data and the vehicle motion information; and transmitting the driving-related information to the identified vehicle through a predetermined communication means. The invention also discloses a corresponding roadside sensing device and driving assistance system.

Description

Driving assisting method and system
Technical Field
The present invention relates to the field of vehicle driving assistance, and in particular to the field of using road environment data to assist in vehicle driving.
Background
As the automotive industry enters the internet and intelligent era, sensors and computing units in or around the vehicle can provide increasingly rich driving-related data and computing power. These data and capabilities can assist vehicle driving more effectively than before, making driving simpler, more intelligent, and safer.
Safety and convenience are the driver's primary concerns when driving a vehicle. In existing vehicle-mounted driving assistance schemes, on-board sensors collect data during driving, such as the distance to the vehicle ahead, the vehicle's own speed, and its real-time position; an on-board computing unit then analyzes the data and provides driving assistance based on the analysis result. This solution is limited, on the one hand, by the sensors installed on the vehicle: it cannot be implemented on vehicles not equipped with the relevant sensors. On the other hand, vehicle sensors can only sense data in a small range around the vehicle and cannot provide information about the driving environment at a greater distance, which is an obvious limitation.
Existing road monitoring equipment only measures quantities such as traffic flow, vehicle spacing, and vehicle speed. It can provide only limited traffic prompts for vehicle driving and cannot effectively assist the driving of a vehicle.
With the development of vehicle-to-everything (V2X) technology, collaborative environment awareness systems have appeared. Such a system uses data from the vehicle and its surrounding environment together to assist driving. However, how to construct the environmental data, and how to fuse the vehicle's own data with the environmental data, are problems faced by collaborative environment awareness systems.
Therefore, a new driving assistance scheme for vehicles is needed, one that can provide driving assistance without depending on the vehicle's own sensor capability and can provide assistance beyond the vehicle's visual range, thereby breaking through the limitations of existing driving assistance systems.
Disclosure of Invention
To this end, the present invention provides a new vehicle assisted driving solution in an attempt to solve or at least alleviate at least one of the problems presented above.
According to an aspect of the present invention, there is provided a driving assistance method including the steps of: acquiring road data within a predetermined range, wherein the road data comprises static and/or dynamic information of each object within the predetermined range; identifying, among the objects, one or more vehicles and their motion information based on the road data; determining driving-related information of the identified vehicle based on the road data and the vehicle motion information; and transmitting the driving-related information to the identified vehicle through a predetermined communication means.
Alternatively, in the driving assistance method according to the present invention, the step of acquiring road data within a predetermined range includes: acquiring pre-stored static information about the predetermined range; obtaining static and/or dynamic information of each object within the predetermined range by using the sensors of a roadside sensing device deployed within the predetermined range; and generating the road data by combining the pre-stored static information with the information obtained by the respective sensors.
Alternatively, in the driving assistance method according to the present invention, the step of acquiring road data within a predetermined range further includes: receiving, via the predetermined communication means, vehicle travel information sent by vehicles within the predetermined range; and combining the pre-stored static information, the information obtained by the respective sensors, and the received vehicle travel information to generate the road data.
Alternatively, in the driving assist method according to the present invention, the step of acquiring static information about a predetermined range includes: determining the geographical position of the roadside sensing equipment; and obtaining static information from the server that is within a predetermined range of the geographic location.
Alternatively, in the driving assistance method according to the present invention, the identifying one or more vehicles and vehicle motion information in each object based on the road data includes: determining vehicle objects belonging to the vehicle and motion information thereof based on the motion characteristics of the objects; and identifying the identity of each vehicle object.
Optionally, in the driving assistance method according to the present invention, the communication means includes one or more of: V2X, 5G, 4G and 3G communications.
Alternatively, in the driving assistance method according to the present invention, the objects include one or more of the following: lane lines, guardrails, isolation strips, vehicles, pedestrians, and spilled objects; and the static and/or dynamic information includes one or more of the following: position, distance, speed, angular velocity, license plate, type, and size.
Optionally, in the driving assistance method according to the present invention, the sensor in the roadside sensing device includes one or more of: millimeter wave radar, laser radar, camera, infrared probe.
Alternatively, in the driving assist method according to the present invention, the vehicle travel information includes one or more of the following: current time, size, velocity, acceleration, angular velocity, and position.
Alternatively, in the driving assistance method according to the present invention, the driving-related information includes a potential collision risk, and the step of determining the driving-related information of the identified vehicle based on the road data and the vehicle motion information includes: determining, by modeling or by deep learning, potential collision risks for the identified vehicle based on the road data and the vehicle motion information.
Alternatively, in the driving assistance method according to the present invention, the step of determining the driving-related information of the identified vehicle based on the road data includes: receiving a scene request sent by a vehicle within the predetermined range; and determining driving-related information corresponding to the scene based on the road data.
Optionally, the driving assistance method according to the present invention is adapted to be executed in a roadside sensing device disposed in the predetermined range or on a cloud server coupled to the roadside sensing device.
According to another aspect of the present invention, there is provided a driving assistance method performed in a vehicle traveling on a road along which a roadside sensing device is disposed, the method including the steps of: receiving driving-related information from the roadside sensing device via a predetermined communication means, wherein the driving-related information is generated by the roadside sensing device from road data within a predetermined range; and outputting the received driving-related information in the vehicle.
According to still another aspect of the present invention, there is provided a roadside sensing apparatus including: a sensor group adapted to obtain static and/or dynamic information of each object within a predetermined range thereof; a storage unit adapted to store the road data, the road data including static and/or dynamic information of each object within a predetermined range; and a calculation unit adapted to perform the driving assistance method according to the present invention.
According to still another aspect of the present invention, there is provided a driving assistance system including: the roadside sensing devices are deployed at the side positions of the road; and a vehicle that travels on a road and performs the driving assistance method according to the present invention.
According to still another aspect of the present invention, there is also provided a computing device. The computing device includes at least one processor and a memory storing program instructions, wherein the program instructions are configured to be adapted to be executed by the at least one processor and include instructions for performing the driving assistance method described above.
According to still another aspect of the present invention, there is also provided a readable storage medium storing program instructions that, when read and executed by a computing device, cause the computing device to execute the driving assistance method described above.
According to the driving assistance scheme of the present invention, the sensing capability of the roadside sensing device is fully utilized, significantly reducing the requirements on vehicle-mounted sensors. Various driving assistance capabilities can thus be obtained even if no additional sensor is installed on the vehicle.
In addition, various driving-related information is obtained by analyzing the sensed data and sent to the vehicle, providing more efficient and safer driving assistance and breaking through the limitations of conventional driving assistance systems.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which are indicative of various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to be within the scope of the claimed subject matter. The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description read in conjunction with the accompanying drawings. Throughout this disclosure, like reference numerals generally refer to like parts or elements.
FIG. 1 shows a schematic diagram of a driving assistance system according to an embodiment of the invention;
FIG. 2 shows a schematic diagram of a roadside sensing device according to one embodiment of the invention;
FIG. 3 shows a schematic diagram of a driving assistance method according to an embodiment of the invention; and
FIG. 4 shows a schematic diagram of a driving assistance method according to another embodiment of the invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Fig. 1 shows a schematic view of a driving assistance system 100 according to an embodiment of the invention. As shown in fig. 1, the driving assistance system 100 includes a vehicle 110 and a roadside sensing device 200. The vehicle 110 travels on a road 140, which includes a plurality of lanes 150. While traveling on the road 140, the vehicle 110 may switch between lanes 150 according to road conditions and its driving target. The roadside sensing device 200 is disposed at the periphery of the road and uses its various sensors to collect information within a predetermined range around it, in particular road data related to the road.
The roadside sensing device 200 has a predetermined coverage area. According to the coverage area of each roadside sensing device 200 and the road conditions, a sufficient number of roadside sensing devices 200 can be deployed on both sides of the road to fully cover the entire road. Alternatively, according to an embodiment, instead of fully covering the entire road, roadside sensing devices 200 may be deployed at the characteristic points of each road (corners, intersections, and diversions) to obtain characteristic data of the road. The present invention is not limited by the specific number of roadside sensing devices 200 or by the extent of road coverage.
When the roadside sensing devices 200 are deployed, the positions at which to deploy them are calculated according to the coverage area of a single roadside sensing device 200 and the conditions of the road 140. The coverage area of a roadside sensing device 200 depends at least on the mounting height of the device and the effective sensing distance of its sensors. The conditions of the road 140 include road length, number of lanes 150, road curvature, grade, and so on. The deployment locations of the sensing devices 200 may be calculated in any manner known in the art.
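The deployment-position calculation described above can be sketched as follows. This is an illustrative sketch only: the geometric model (slant sensing range projected onto the road surface) and the `overlap` margin are assumptions for illustration, not details taken from the patent.

```python
import math

def coverage_radius(mount_height_m: float, sensing_range_m: float) -> float:
    """Ground-projected coverage radius of one roadside sensing device.

    The slant sensing range is projected onto the road surface; the
    mounting height shortens the usable ground distance slightly.
    """
    if sensing_range_m <= mount_height_m:
        return 0.0
    return math.sqrt(sensing_range_m ** 2 - mount_height_m ** 2)

def devices_needed(road_length_m: float, mount_height_m: float,
                   sensing_range_m: float, overlap: float = 0.1) -> int:
    """Number of devices for gapless coverage of a straight road segment.

    Each device covers a stretch of 2 * radius; `overlap` reserves a
    fraction of that stretch so adjacent coverage areas overlap.
    """
    radius = coverage_radius(mount_height_m, sensing_range_m)
    if radius <= 0:
        raise ValueError("sensing range must exceed mounting height")
    effective = 2 * radius * (1 - overlap)
    return math.ceil(road_length_m / effective)
```

For example, a 2 km straight segment with devices mounted at 6 m and an effective sensing distance of 200 m would, under these assumptions, need six devices.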
After the deployment locations are determined, the roadside sensing devices 200 are deployed at those positions. Since the data that a roadside sensing device 200 needs to sense includes motion data of a large number of objects, the clocks of the roadside sensing devices 200 are synchronized, that is, the time of each sensing device 200 is kept consistent with the time of the vehicles 110 and the cloud platform.
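The patent does not specify how clock synchronization is performed; one common approach, shown here purely as an illustrative sketch, is the classic NTP-style round-trip offset estimate between a sensing device and a reference clock (e.g. the cloud platform):

```python
def clock_offset(t0: float, t1: float, t2: float, t3: float) -> float:
    """NTP-style clock offset estimate between a sensing device and a
    reference clock.

    t0: request sent (device clock)    t1: request received (reference clock)
    t2: reply sent (reference clock)   t3: reply received (device clock)
    Returns the estimated amount by which the device clock lags the
    reference, assuming symmetric network delay.
    """
    return ((t1 - t0) + (t2 - t3)) / 2.0
```

For a device clock running 0.5 s behind the reference with a symmetric 0.1 s one-way delay, the estimate recovers the 0.5 s offset exactly.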
Subsequently, the position of each deployed roadside sensing device 200 is determined. Since the sensing device 200 is to provide driving assistance for vehicles 110 traveling at high speed on the road 140, a highly accurate absolute position of the sensing device 200 is required. There are a number of ways to obtain this high-accuracy absolute position. According to one embodiment, a Global Navigation Satellite System (GNSS) may be utilized to determine it.
The roadside sensing device 200 uses its sensors to collect and sense the static conditions (lane lines 120, guardrails, isolation strips, and the like) and dynamic conditions (moving vehicles 110, pedestrians 130, and spilled objects) of the road within its coverage area, and fuses the sensing data of the different sensors to form the road data of that road section. The road data comprises static and dynamic information of all objects within the coverage area of the sensing device 200, in particular within the road-related field of view. The roadside sensing device 200 may then calculate driving-related information for each vehicle based on the road data, such as whether the vehicle has a potential collision risk, or traffic conditions outside the vehicle's field of view (such as road conditions beyond a curve, or ahead of a preceding vehicle).
A vehicle 110 entering the coverage area of a roadside sensing device 200 may communicate with that device. A typical communication method is V2X. Alternatively, the vehicle may communicate with the roadside sensing device 200 over the mobile internet provided by a mobile communication service provider, using mobile communication means such as 5G, 4G, or 3G. Considering that vehicles travel at high speed and the communication delay should be as short as possible, the V2X communication method is adopted in typical embodiments of the present invention. However, any communication means that meets the delay requirements of the present invention is within its scope.
The vehicle 110 may receive driving-related information related to the vehicle 110 from the roadside sensing device 200 and assist the vehicle driving using the driving-related information.
Optionally, the driving assistance system 100 further comprises a server 160. Although only one server 160 is shown in fig. 1, it should be understood that the server 160 may be a cloud service platform consisting of a plurality of servers. Each roadside sensing device 200 transmits the sensed road data to the server 160. The server 160 may combine the road data based on the location of each roadside sensing device 200 to form road data for the entire road. The server 160 may also further process the road data to form driving-related information, such as traffic conditions, accident sections, and expected transit times for the entire road.
The server 160 may transmit the combined road data and driving-related information for the entire road to each roadside sensing device 200, or may transmit to a given roadside sensing device 200 the road data and driving-related information for the road section corresponding to its several neighboring devices. In this way, the vehicle 110 may obtain a greater range of driving-related information from the roadside sensing device 200. Of course, the vehicle 110 may also obtain the driving-related information and the road data directly from the server 160 without passing through the roadside sensing device 200.
If roadside sensing devices 200 are deployed on all roads within an area and the roadside sensing devices 200 transmit road data to the server 160, navigation instructions for road traffic within the area may be formed at the server 160. Vehicle 110 may receive the navigation instructions from server 160 and navigate accordingly.
FIG. 2 shows a schematic diagram of a roadside sensing device 200 according to one embodiment of the invention. As shown in fig. 2, the roadside sensing device 200 includes a communication unit 210, a sensor group 220, a storage unit 230, and a calculation unit 240.
The roadside sensing device 200 communicates with each vehicle 110 entering its coverage area to provide driving-related information to the vehicle 110 and to receive vehicle travel information from it. At the same time, the roadside sensing device 200 also needs to communicate with the server 160. The communication unit 210 provides these communication functions for the roadside sensing device 200. The communication unit 210 may employ various communication methods, including but not limited to Ethernet, V2X, and 5G, 4G, and 3G mobile communication, as long as data communication can be completed with as little delay as possible. In one embodiment, the roadside sensing device 200 may communicate with vehicles 110 entering its coverage area using V2X, while communicating with the server 160 over, for example, a high-speed internet connection.
The sensor group 220 includes various sensors, for example radar sensors such as a millimeter wave radar 222 and a laser radar 224, and image sensors such as a camera 226 and an infrared probe 228 with supplemental lighting. For the same object, different sensors obtain different properties: radar sensors can measure object velocity and acceleration, while image sensors can obtain object shape, relative angle, and the like.
The sensor group 220 uses its sensors to collect and sense the static conditions (lane lines 120, guardrails, isolation strips, etc.) and dynamic conditions (moving vehicles 110, pedestrians 130, and spilled objects) of the road within the coverage area, and stores the collected and sensed data in the storage unit 230.
The computing unit 240 fuses the data sensed by the sensors to form road data for the road section and also stores the road data in the storage unit 230. In addition, the computing unit 240 may perform data analysis based on the road data, identify one or more vehicles and their motion information, and further determine driving-related information for the vehicle 110. Such data and information may be stored in the storage unit 230 for transmission to the vehicle 110 or the server 160 via the communication unit 210.
In addition, various calculation models, such as a collision detection model, a license plate recognition model, and the like, may be stored in the storage unit 230. These computational models may be used by the computational unit 240 to implement the corresponding steps in the method 300 described below with reference to fig. 3.
Fig. 3 shows a schematic representation of a driving assistance method 300 according to an embodiment of the invention. The driving assistance method 300 is suitable for being executed in the roadside sensing device 200 shown in fig. 2 and is also suitable for being executed in the server 160 of fig. 1. When executed in server 160, all relevant data acquired by roadside sensing devices 200 may be sent to server 160 for execution in server 160.
As shown in fig. 3, the driving assistance method 300 starts at step S310.
In step S310, road data within a predetermined range of road positions is acquired. As described above with reference to fig. 1, the roadside sensing device 200 is generally fixedly disposed near a certain road, and thus has a corresponding road position. In addition, the roadside sensing device 200 has a predetermined coverage area depending on at least the arrangement height of the sensing device 200, the effective distance for sensing by the sensors in the sensing device 200, and the like. Once the roadside sensing device 200 is deployed at a side of a certain road, a predetermined range of the road that can be covered by the sensing device can be determined according to the specific positions, heights and effective sensing distances of the sensing device and the road.
The roadside sensing device 200 uses its various sensors to collect and/or sense the static conditions (lane lines 120, guardrails, isolation strips, etc.) and dynamic conditions (moving vehicles 110, pedestrians 130, and spilled objects) of the road within the coverage area, thereby obtaining and storing various sensor data.
As described above, the roadside sensing device 200 includes various sensors, for example, radar sensors such as the millimeter wave radar 222 and the laser radar 224, and image sensors such as the camera 226 and the infrared probe 228 having a light supplement function, and the like. For the same object, various sensors can obtain different properties of the object, for example, a radar sensor can perform object velocity and acceleration measurements, and an image sensor can obtain the shape and relative angle of the object.
In step S310, the obtained raw sensor data may be processed and fused to form unified road data. In one embodiment, step S310 may further include a substep S312. In step S312, pre-stored static information about the predetermined range of the road position is acquired. After the roadside sensing device is deployed at a position on a road, the range of road covered by the device is fixed. Static information about that range, such as road width, number of lanes, and turning radius, may then be obtained. There are a number of ways to obtain static information about a road. In one embodiment, this static information may be stored in the sensing device when it is deployed. In another embodiment, the location information of the sensing device may be obtained first, and a request containing that location information sent to the server 160, which returns the static information for the relevant road range.
Subsequently, in step S314, the raw sensor data is processed per sensor to form sensing data such as distance, speed, type, and size measurements. Next, in step S316, based on the static road data obtained in step S312, the data of the different sensors are calibrated against one another, and unified road data is finally formed.
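Steps S314 and S316 can be illustrated with a minimal fusion sketch. The class names, the greedy nearest-neighbour matching rule, and the `gate_m` threshold below are all hypothetical; the patent does not specify a fusion algorithm. The sketch assumes the radar supplies position and speed while the camera supplies type and license plate:

```python
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class RadarTrack:
    x: float          # metres, road-plane frame
    y: float
    speed: float      # m/s

@dataclass
class CameraObject:
    x: float
    y: float
    obj_type: str     # "vehicle", "pedestrian", ...
    plate: Optional[str] = None

@dataclass
class FusedObject:
    x: float
    y: float
    speed: float
    obj_type: str = "unknown"
    plate: Optional[str] = None

def fuse(radar: list[RadarTrack], camera: list[CameraObject],
         gate_m: float = 2.5) -> list[FusedObject]:
    """Greedy nearest-neighbour fusion: each radar track keeps its radar
    position and speed, and takes type/plate from the closest camera
    object within `gate_m` (if any)."""
    used: set[int] = set()
    fused = []
    for r in radar:
        best, best_d = None, gate_m
        for i, c in enumerate(camera):
            if i in used:
                continue
            d = math.hypot(r.x - c.x, r.y - c.y)
            if d < best_d:
                best, best_d = i, d
        obj = FusedObject(r.x, r.y, r.speed)
        if best is not None:
            used.add(best)
            obj.obj_type = camera[best].obj_type
            obj.plate = camera[best].plate
        fused.append(obj)
    return fused
```

A radar track with a camera detection 0.5 m away inherits that detection's type and plate; a track with no camera object inside the gate remains of "unknown" type.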
Steps S312-S316 describe one way to obtain road data. The invention is not limited to the particular manner in which the data of the various sensors are fused; any approach is within the scope of the present invention as long as the resulting road data contains static and dynamic information for the various objects within the predetermined range of the road position.
According to one embodiment, each vehicle 110 entering the coverage area of the roadside sensing device 200 actively communicates with the device 200 through various communication means (e.g., V2X). Thus, as described in step S318, the vehicle 110 transmits its vehicle travel information to the sensing device 200. The vehicle travel information includes information generated during travel, for example the current time at which the information is generated, and the size, speed, acceleration, angular velocity, and position of the vehicle. The method S310 further includes a step S319, in which the vehicle travel information obtained in step S318 is fused with the road data formed in step S316 to form new road data.
Next, in step S320, one or more vehicles within the coverage of the sensing unit, together with their motion information, are identified based on the road data obtained in step S310. The identification in step S320 has two aspects. The first is vehicle identification, i.e., determining which objects in the road data are vehicles. Vehicle objects have distinctive motion characteristics, such as relatively high speed, traveling within a lane in one direction, and generally not colliding with other objects. A conventional classification model or a deep-learning-based model can be constructed from these motion characteristics and applied to the road data to determine the vehicle objects and their motion characteristics, such as motion trajectories.
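As an illustrative sketch of the rule-based side of this classification (the deep-learning variant is not shown), the following heuristic separates vehicles from pedestrians and debris using the motion characteristics named above. All thresholds are invented for illustration and are not taken from the patent:

```python
def is_vehicle(avg_speed_mps: float, heading_dev_deg: float,
               length_m: float) -> bool:
    """Crude motion/size heuristic separating vehicle objects from
    pedestrians and spilled objects: vehicles move fast, roughly along
    the lane direction, and have a vehicle-scale length."""
    return (avg_speed_mps > 2.5            # faster than walking pace
            and heading_dev_deg < 20.0     # travels along the lane direction
            and 2.0 <= length_m <= 20.0)   # car to truck scale
```

A 4.5 m object moving at 15 m/s along the lane classifies as a vehicle; a slow, small object drifting across lanes does not.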
The second aspect is identifying the vehicle's identity. For each recognized vehicle object, its vehicle identifier is determined. One way to determine the identifier is to determine the vehicle's unique license plate, for example by image recognition. When the license plate cannot be identified, another way is to generate a unique mark for the vehicle by combining the size, type, position information, driving speed, and so on of the vehicle object. The vehicle identifier uniquely identifies the vehicle object within the road section and distinguishes it from other vehicle objects. It is used in subsequent data transmission and is passed between the different roadside sensing devices along the road to facilitate overall analysis.
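The fallback identifier described here (combining size, type, position, and speed when no plate is readable) might be sketched as a short hash; the function name and field encoding are hypothetical:

```python
import hashlib

def vehicle_fingerprint(size_m: tuple[float, float], obj_type: str,
                        first_seen_pos: tuple[float, float],
                        first_seen_speed: float) -> str:
    """Fallback identifier for a vehicle whose license plate could not
    be read: a short hash over size, type, and the state at first
    detection. Only unique within one road section / sensing session."""
    key = (f"{obj_type}|{size_m[0]:.1f}x{size_m[1]:.1f}|"
           f"{first_seen_pos[0]:.0f},{first_seen_pos[1]:.0f}|"
           f"{first_seen_speed:.0f}")
    return hashlib.sha1(key.encode()).hexdigest()[:12]
```

The hash is deterministic, so the same vehicle observed with the same first-detection state always maps to the same identifier, which can then be handed to the next roadside sensing device along the road.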
Subsequently, in step S330, based on the road data obtained in step S310 and the vehicle object and its motion information recognized in step S320, data analysis is performed to determine driving-related information of the vehicle.
The present invention contemplates a variety of driving-related information and, accordingly, a plurality of analysis models for deriving it.
According to one embodiment, data analysis is actively performed in step S330 to determine driving-related information. In this case, for example, if the driving-related information is a potential collision possibility, vehicles in the road having a potential collision possibility are detected in step S330. The collision may include a forward collision, an overtaking collision, a lane change collision, and the like. Potential collision detection may be performed in various ways. One way is to construct a collision detection model that detects vehicles with a possibility of collision from the road data. Another way is to use deep learning, analyzing a large number of actual road collision examples to determine vehicles with a possibility of collision. The invention is not limited to the particular manner in which potential collision detection is performed.
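One possible rule-based collision detection model of the kind mentioned above: project each pair of vehicles forward at constant velocity and flag pairs whose predicted separation drops below a safety radius. The horizon, time step, and radius are illustrative assumptions, not values from the patent.

```python
import math

def closest_approach(p1, v1, p2, v2, horizon_s=5.0, step_s=0.1):
    """Minimum predicted distance between two vehicles within the horizon,
    assuming each keeps its current velocity."""
    best = math.dist(p1, p2)
    t = step_s
    while t <= horizon_s:
        a = (p1[0] + v1[0] * t, p1[1] + v1[1] * t)
        b = (p2[0] + v2[0] * t, p2[1] + v2[1] * t)
        best = min(best, math.dist(a, b))
        t += step_s
    return best

def potential_collisions(vehicles, safety_radius_m=3.0):
    """Return pairs of vehicle ids whose predicted separation is unsafe."""
    flagged = []
    for i in range(len(vehicles)):
        for j in range(i + 1, len(vehicles)):
            a, b = vehicles[i], vehicles[j]
            if closest_approach(a["pos"], a["vel"], b["pos"], b["vel"]) < safety_radius_m:
                flagged.append((a["id"], b["id"]))
    return flagged
```

For example, a fast vehicle closing on a slower one in the same lane is flagged, while a vehicle in an adjacent lane at a constant lateral offset is not.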
According to another embodiment, in step S330, data analysis may be performed at the request of the vehicle to determine driving-related information of the vehicle 110. In this case, the driving-related information is, for example, scene information related to the scene requested by the vehicle.
Each scene, and the information corresponding to it, may be defined in advance. For example, when the scene is night vision assistance, the driving-related information includes information on the road and vehicles within a certain range in front of the vehicle; when the scene is a 360-degree panoramic view, the driving-related information includes all information within a certain range around the vehicle; and when the scene is beyond-line-of-sight viewing, the driving-related information includes all information within the range of view that is blocked for the vehicle.
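The pre-defined scenes can be encoded as a table mapping each scene name to a spatial filter over the perceived objects, relative to the requesting vehicle. The scene names, ranges, and simple geometric predicates here are assumptions for the sketch; in particular, a real beyond-line-of-sight scene would add occlusion reasoning rather than a plain radius.

```python
import math

def ahead_within(dist_m):
    """Objects in a corridor ahead of the ego vehicle (lane frame assumed)."""
    return lambda dx, dy: 0 < dx <= dist_m and abs(dy) <= 10

def around_within(dist_m):
    """Objects within a radius of the ego vehicle."""
    return lambda dx, dy: math.hypot(dx, dy) <= dist_m

SCENES = {
    "night_vision": ahead_within(150.0),          # road/vehicles ahead
    "panorama_360": around_within(50.0),          # everything nearby
    "beyond_line_of_sight": around_within(300.0),  # occlusion check omitted
}

def scene_objects(scene, ego_pos, objects):
    """Filter road-data objects to those inside the requested scene."""
    in_scene = SCENES[scene]
    return [o for o in objects
            if in_scene(o["pos"][0] - ego_pos[0], o["pos"][1] - ego_pos[1])]
```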
To this end, step S330 may further include steps S332 and S334. In step S332, a scene request transmitted from the vehicle 110 is received, and in step S334, driving-related information corresponding to the scene is determined based on the road data and the identifier and motion information of the vehicle 110. Since the identifier of the vehicle 110 is known, dynamic and static information about other vehicles and the environment around the vehicle 110 can be determined from the road data, so that driving-related information corresponding to the requested scene can be provided.
Whether the data analysis at step S330 is performed actively or passively at the request of the vehicle, vehicle matching is required to determine which vehicle object or objects within the coverage of the current perception device 200 correspond to the vehicle 110 that is to receive the driving analysis results. For example, if potential collision detection is performed at step S330, then after a vehicle with a higher possibility of collision is determined, the vehicle identifier and the corresponding communication means that match it need to be determined. If the driving-related information is scene-related information, then after a scene request is received from the vehicle 110, the requesting vehicle needs to be matched against the vehicles within the coverage area, so as to determine which vehicle made the scene request and to perform the data analysis for that vehicle.
Vehicle matching can be performed through various matching modes, alone or in combination, such as license plate matching, driving speed and type matching, and fuzzy matching of position information. According to one embodiment, the vehicle 110 may bind its license plate information through V2X or application verification, and the license plate information may then be matched to the vehicle data of the corresponding license plate in the roadside sensing device and the server, thereby implementing license plate matching.
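The matching cascade above can be sketched as: try exact plate matching first, then fall back to fuzzy matching on reported position, speed, and type, declining to match when the fuzzy step is ambiguous. The tolerances and field names are illustrative assumptions.

```python
import math

def match_vehicle(report, perceived, pos_tol_m=5.0, speed_tol_mps=2.0):
    """Match a vehicle's self-report to a perceived object id, or None."""
    # 1) License-plate matching (plate bound via V2X or app verification).
    if report.get("plate"):
        for obj in perceived:
            if obj.get("plate") == report["plate"]:
                return obj["id"]
    # 2) Fuzzy matching on position, speed and type; require a unique hit.
    candidates = [
        obj for obj in perceived
        if obj["type"] == report["type"]
        and math.dist(obj["pos"], report["pos"]) <= pos_tol_m
        and abs(obj["speed"] - report["speed"]) <= speed_tol_mps
    ]
    return candidates[0]["id"] if len(candidates) == 1 else None
```

Returning None on ambiguity is a design choice: delivering a collision warning to the wrong vehicle is worse than delivering none.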
After the driving-related information is determined at step S330, it is transmitted to the corresponding vehicle 110 through a predetermined communication means at step S340. In step S340, the communication means associated with the matched vehicle 110 is determined, and the driving-related information is transmitted to the corresponding vehicle using that means. Typically, the communication means is a mobile communication means such as V2X, 5G, 4G, or 3G.
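A minimal sketch of the dispatch in step S340, assuming a JSON envelope: look up the channel bound to the matched vehicle and package the driving-related information for it. The envelope fields are assumptions for illustration; the patent does not prescribe a message format.

```python
import json
import time

CHANNELS = ("V2X", "5G", "4G", "3G")

def dispatch(vehicle_id, channel, info):
    """Build the message the roadside device would send to the vehicle
    over the communication means associated with it."""
    if channel not in CHANNELS:
        raise ValueError(f"unsupported channel: {channel}")
    return json.dumps({
        "to": vehicle_id,
        "channel": channel,
        "sent_at": int(time.time()),  # for staleness checks on the vehicle
        "payload": info,
    })
```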
After receiving the driving-related information, the vehicle 110 may perform different processing according to the attributes of the information. For example, if the data is scene-related data, the driving-related information is displayed, according to the scene definition, on a display screen or application such as the in-vehicle central control screen, an intelligent instrument panel, or navigation software.
If it is warning information such as a collision warning, it can be presented to the vehicle owner in various ways, such as display, voice, alarm, and vibration, according to the type and urgency of the warning.
Fig. 4 shows a schematic diagram of a driving assistance method 400 according to another embodiment of the invention. The driving assistance method 400 is adapted to be executed in a vehicle 110 that runs on a road on which the roadside sensing device 200 is deployed. The method 400 includes step S410. In step S410, driving-related information is received from a nearby roadside sensing device 200 through a predetermined communication means. Step S410 corresponds to step S340 of the method 300 described above with reference to Fig. 3; the driving-related information is thus generated by the roadside sensing device from road data within a predetermined range of its road position. The processing of S410 is not described again here.
Subsequently, in step S420, the received driving-related information is output in the vehicle 110. In step S420, the output manner may be determined according to the attribute of the driving-related information.
If the driving-related information is warning information such as a collision warning, then, in addition to presenting the warning within the vehicle in a conventional manner, the method 400 may include step S430, in which the driver or owner of the vehicle is notified of the potential collision hazard; for example, the warning may be presented in a number of different ways, such as display, voice, alarm, and vibration, depending on the type and urgency of the warning. In addition, the method 400 may further include step S440, which converts the warning information into vehicle control, directly controlling the vehicle's driving or providing various driving assistance capabilities including forward collision warning, overtaking warning, lane change warning, blind zone warning, and rear vehicle protection, so as to reduce the possibility of collision and thereby form more efficient, safer, and more direct driving assistance.
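The two in-vehicle paths (steps S430 and S440) can be sketched together: scale the presentation channels by urgency, and optionally map the warning type to a control action. The three-level urgency scale and the named actions are assumptions for illustration, not defined by the patent.

```python
PRESENTATION = {
    1: ["display"],
    2: ["display", "voice"],
    3: ["display", "voice", "alarm", "vibration"],
}

CONTROL_ACTION = {
    "forward_collision": "brake",
    "lane_change": "hold_lane",
    "overtaking": "abort_overtake",
    "blind_zone": "suppress_turn",
    "rear_vehicle": "maintain_speed",
}

def handle_warning(warning, allow_control=False):
    """Return the presentation channels (S430) and, when control conversion
    is enabled, the control action derived from the warning (S440)."""
    level = max(1, min(warning["urgency"], 3))
    action = CONTROL_ACTION.get(warning["type"]) if allow_control else None
    return {"channels": PRESENTATION[level], "control": action}
```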
If the driving-related information is scene-related data, the method 400 may further include a corresponding step S450, as described above with reference to Fig. 3. In step S450, a scene request is sent to the roadside sensing device through a predetermined communication means, and in step S420 the driving-related information is displayed, according to the scene definition, on a display screen or application such as the in-vehicle central control screen, an intelligent instrument panel, or navigation software.
In addition, optionally, in order to better construct the road data, the method 400 may further include step S460, in which vehicle driving information is transmitted to the roadside sensing device through a predetermined communication means. The processing in step S460 corresponds to step S318 and is not described again here.
According to the driving assistance scheme of the invention, the perception capability of the roadside units can be fully utilized, and the perceived data can be further analyzed and processed before being provided to the vehicle, thereby providing efficient driving assistance.
It should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.

Claims (22)

1. A driving assistance method, the method comprising the steps of:
acquiring road data in a preset range, wherein the road data comprises static and/or dynamic information of each object in the preset range;
identifying one or more vehicles and vehicle motion information in the objects based on the road data;
determining driving-related information of the identified vehicle based on the road data and vehicle motion information; and
transmitting the driving-related information to the identified vehicle through a predetermined communication means.
2. The driving assistance method according to claim 1, wherein the step of acquiring road data within a predetermined range includes:
acquiring static information which is stored in advance and is about the preset range;
obtaining static and/or dynamic information of each object in the predetermined range by using each sensor in the roadside sensing equipment deployed in the predetermined range;
combining the pre-stored static information and information obtained by the respective sensors to generate the road data.
3. The driving assistance method according to claim 2, wherein the step of acquiring road data within a predetermined range includes:
receiving vehicle running information sent by vehicles in the preset range in a preset communication mode; and
the pre-stored static information, the information obtained by the respective sensors, and the received vehicle travel information are combined to generate the road data.
4. The driving assistance method according to claim 2 or 3, wherein the step of acquiring pre-stored static information about the predetermined range includes:
determining a geographical location of the roadside sensing device; and
static information within a predetermined range of the geographic location is obtained from a server.
5. The driving assistance method according to any one of claims 1 to 4, wherein the step of identifying one or more vehicles and vehicle motion information in the objects based on the road data includes:
determining vehicle objects belonging to the vehicle and motion information thereof based on the motion characteristics of the objects; and
an identification of each vehicle object is identified.
6. The driving assistance method according to claim 5, wherein the communication means includes one or more of:
V2X, 5G, 4G and 3G communications.
7. The driving assistance method according to any one of claims 1 to 6, wherein the objects include one or more of the following objects: lane lines, guardrails, isolation strips, vehicles, pedestrians, and spilled objects; and
the static and/or dynamic information includes one or more of the following: location, distance, velocity, angular velocity, license plate, type, and size.
8. The driving assistance method according to any one of claims 2 to 7, the sensor in the roadside sensing device including one or more of:
millimeter wave radar, laser radar, camera, infrared probe.
9. The driving assist method according to any one of claims 3 to 8, the vehicle travel information including one or more of:
current time, size, velocity, acceleration, angular velocity, and position.
10. The driving assistance method according to any one of claims 1 to 9, the driving-related information comprising a potential collision risk, and the step of determining the driving-related information of the identified vehicle based on the road data and the vehicle motion information comprising:
determining potential collision hazards for the identified vehicle based on the road data and vehicle motion information by way of modeling or deep learning.
11. The driving assistance method according to any one of claims 1 to 9, the step of determining driving-related information of the identified vehicle based on the road data comprising:
receiving a scene request sent by vehicles within the preset range; and
determining driving-related information corresponding to the scene based on the road data.
12. Driving assistance method according to any one of claims 1 to 11, wherein the method is adapted to be performed in a roadside sensing device deployed in the predetermined range or on a cloud server coupled to the roadside sensing device.
13. A driving assistance method performed in a vehicle that runs on a road on which a roadside sensing device is disposed, the method comprising the steps of:
receiving driving related information in a preset communication mode, wherein the driving related information is generated by the road side sensing equipment according to road data in a preset range of the road side sensing equipment;
the received driving-related information is output in the vehicle.
14. The driving assist method according to claim 13, further comprising the step of:
sending vehicle driving information to the roadside sensing device through a predetermined communication means.
15. The driving assistance method according to claim 13 or 14, the driving-related information including a potential collision risk, the method further comprising:
notifying a driver of the vehicle of the potential collision hazard.
16. The driving assist method according to claim 15, further comprising:
controlling the driving of the vehicle based on the received driving-related information to reduce the potential collision risk.
17. The driving assist method according to any one of claims 13 to 16, further comprising:
sending a scene request to the roadside sensing equipment in a preset communication mode; and
the driving-related information includes driving-related information corresponding to the scene.
18. A roadside sensing device comprising:
each sensor adapted to obtain static and dynamic information of each object within its predetermined range;
a storage unit adapted to store the road data, the road data including static and/or dynamic information of each object within the predetermined range; and
a computing unit adapted to perform the method of any of claims 1-12.
19. A driving assistance system comprising:
a plurality of roadside sensing devices as recited in claim 18, deployed at roadside locations along a road; and
a vehicle that travels on the road and that executes the driving assist method according to any one of claims 13 to 17.
20. The driver assistance system according to claim 19, further comprising:
a cloud server adapted to receive the road data of the roadside sensing devices and to combine the road data based on the deployment position of each roadside sensing device, so as to generate road data for the whole road.
21. A computing device, comprising:
at least one processor; and
a memory storing program instructions configured for execution by the at least one processor, the program instructions comprising instructions for performing the method of any of claims 1-17.
22. A readable storage medium storing program instructions that, when read and executed by a computing device, cause the computing device to perform the method of any of claims 1-17.
CN201811108869.2A 2018-09-21 2018-09-21 Driving assisting method and system Pending CN110936960A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201811108869.2A CN110936960A (en) 2018-09-21 2018-09-21 Driving assisting method and system
TW108124236A TW202031538A (en) 2018-09-21 2019-07-10 Auxiliary driving method and system
PCT/CN2019/105278 WO2020057406A1 (en) 2018-09-21 2019-09-11 Driving aid method and system


Publications (1)

Publication Number Publication Date
CN110936960A true CN110936960A (en) 2020-03-31

Family

ID=69888290

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811108869.2A Pending CN110936960A (en) 2018-09-21 2018-09-21 Driving assisting method and system

Country Status (3)

Country Link
CN (1) CN110936960A (en)
TW (1) TW202031538A (en)
WO (1) WO2020057406A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102431556A (en) * 2011-11-15 2012-05-02 武汉理工大学 Integrated driver early warning device based on vehicle-road cooperation
CN108399792A (en) * 2018-01-25 2018-08-14 北京墨丘科技有限公司 A kind of automatic driving vehicle preventing collision method, device and electronic equipment
CN108417087A (en) * 2018-02-27 2018-08-17 浙江吉利汽车研究院有限公司 A kind of vehicle safety traffic system and method
CN108447291A (en) * 2018-04-03 2018-08-24 南京锦和佳鑫信息科技有限公司 A kind of Intelligent road facility system and control method
CN108765982A (en) * 2018-05-04 2018-11-06 东南大学 Signalized crossing speed guiding system and bootstrap technique under bus or train route cooperative surroundings

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2007224206A1 (en) * 2006-03-03 2007-09-13 Inrix, Inc. Assessing road traffic conditions using data from mobile data sources
US9494940B1 (en) * 2015-11-04 2016-11-15 Zoox, Inc. Quadrant configuration of robotic vehicles
US9507346B1 (en) * 2015-11-04 2016-11-29 Zoox, Inc. Teleoperation system and method for trajectory modification of autonomous vehicles
DE102016009760A1 (en) * 2016-08-11 2018-02-15 Trw Automotive Gmbh Control system and control method for guiding a motor vehicle along a path
EP3315359B1 (en) * 2016-10-28 2022-03-02 Volvo Car Corporation Road vehicle turn signal assist system and method


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111540224A (en) * 2020-06-12 2020-08-14 深圳市元征科技股份有限公司 Road data processing method and related equipment
CN111899625A (en) * 2020-07-16 2020-11-06 北京理工大学 Intelligent driving assisting development device
WO2022028314A1 (en) * 2020-08-06 2022-02-10 索尼集团公司 Electronic device and method for wireless communications, and computer readable storage medium
EP4181543A4 (en) * 2020-08-06 2023-12-27 Sony Group Corporation Electronic device and method for wireless communications, and computer readable storage medium
CN113066289B (en) * 2021-04-30 2024-03-15 腾讯科技(深圳)有限公司 Driving assistance processing method, driving assistance processing device, computer readable medium and electronic equipment
CN113066289A (en) * 2021-04-30 2021-07-02 腾讯科技(深圳)有限公司 Driving assistance processing method and device, computer readable medium and electronic device
WO2022227616A1 (en) * 2021-04-30 2022-11-03 腾讯科技(深圳)有限公司 Driving assistance processing method and apparatus, computer-readable medium, and electronic device
US20230078241A1 (en) * 2021-04-30 2023-03-16 Tencent Technology (Shenzhen) Company Limited Driving assistance processing method and apparatus, computer-readable medium, and electronic device
CN113538913A (en) * 2021-07-16 2021-10-22 河南理工大学 Expressway overtaking prompt and psychological intervention device and method
CN114368388A (en) * 2022-01-28 2022-04-19 中国第一汽车股份有限公司 Driving behavior analysis method, device, equipment and storage medium
CN114368388B (en) * 2022-01-28 2024-05-03 中国第一汽车股份有限公司 Driving behavior analysis method, device, equipment and storage medium
CN114724366A (en) * 2022-03-29 2022-07-08 北京万集科技股份有限公司 Driving assistance method, device, equipment, storage medium and program product
CN114724366B (en) * 2022-03-29 2023-06-20 北京万集科技股份有限公司 Driving assistance method, device, equipment and storage medium

Also Published As

Publication number Publication date
WO2020057406A1 (en) 2020-03-26
TW202031538A (en) 2020-09-01

Similar Documents

Publication Publication Date Title
CN110936960A (en) Driving assisting method and system
US11126877B2 (en) Predicting vehicle movements based on driver body language
CN111354182A (en) Driving assisting method and system
US10800455B2 (en) Vehicle turn signal detection
US11597396B2 (en) Vehicle control device
CN111429739A (en) Driving assisting method and system
CN110942623B (en) Auxiliary traffic accident handling method and system
CN111354214B (en) Auxiliary parking method and system
CN111354222A (en) Driving assisting method and system
CN110796007B (en) Scene recognition method and computing device
CN110940347B (en) Auxiliary vehicle navigation method and system
CN111915915A (en) Driving scene reconstruction method, device, system, vehicle, equipment and storage medium
JP6384419B2 (en) Animal type determination device
KR20170082674A (en) Apparatus and method for assisting safety driving
CN111508276B (en) High-precision map-based V2X reverse overtaking early warning method, system and medium
US20200271453A1 (en) Lane marking localization and fusion
JP2019049774A (en) Vehicle control device, vehicle control method, and program
US20170103271A1 (en) Driving assistance system and driving assistance method for vehicle
WO2017104209A1 (en) Driving assistance device
CN110962744A (en) Vehicle blind area detection method and vehicle blind area detection system
Fuerstenberg et al. Advanced intersection safety-The EC project INTERSAFE
CN110763244B (en) Electronic map generation system and method
CN114572243A (en) Target object detection device and vehicle equipped with the same
CN112866636A (en) Group fog recognition early warning method and system based on farthest visible distance and electronic equipment
CN112950995A (en) Parking assistance device, corresponding method, vehicle and server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201216

Address after: Room 603, 6 / F, Roche Plaza, 788 Cheung Sha Wan Road, Kowloon, China

Applicant after: Zebra smart travel network (Hong Kong) Ltd.

Address before: The big Cayman capital building, a four - story mailbox 847

Applicant before: Alibaba Group Holding Ltd.

REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40026901

Country of ref document: HK

RJ01 Rejection of invention patent application after publication

Application publication date: 20200331