WO2018161774A1 - Driving behavior determination method, apparatus, device, and storage medium - Google Patents

Driving behavior determination method, apparatus, device, and storage medium

Info

Publication number
WO2018161774A1
WO2018161774A1 (application PCT/CN2018/075954)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
driving
information
driving behavior
image
Prior art date
Application number
PCT/CN2018/075954
Other languages
English (en)
French (fr)
Inventor
Dafeng Wang (王达峰)
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Priority to JP2019528487A (granted as JP7072763B2)
Priority to EP18764635.1A (published as EP3561780A4)
Priority to KR1020197025758A (published as KR20190115040A)
Publication of WO2018161774A1
Priority to US16/428,332 (granted as US10913461B2)

Classifications

    • G07C5/0866: Registering performance data using electronic data carriers, the carrier being a digital video recorder in combination with a video camera
    • B60W30/09: Taking automatic action to avoid collision, e.g. braking and steering
    • B60R16/02: Electric or fluid circuits specially adapted for vehicles; electric constitutive elements
    • B60R16/023: Transmission of signals between vehicle parts or subsystems
    • B60W10/18: Conjoint control of vehicle sub-units, including control of braking systems
    • B60W10/20: Conjoint control of vehicle sub-units, including control of steering systems
    • B60W30/0953: Predicting travel path or likelihood of collision, the prediction being responsive to vehicle dynamic parameters
    • B60W30/0956: Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
    • B60W40/02: Estimation of non-directly measurable driving parameters related to ambient conditions
    • B60W40/09: Driving style or behaviour
    • B60W40/107: Longitudinal acceleration
    • G06Q40/08: Insurance
    • G07C5/08: Registering or indicating performance data other than driving, working, idle, or waiting time
    • G07C5/0808: Diagnosing performance data
    • B60W2050/0005: Processor details or data handling, e.g. memory registers or chip architecture
    • B60W2050/0022: Gains, weighting coefficients or weighting functions
    • B60W2420/403: Image sensing, e.g. optical camera
    • B60W2520/105: Longitudinal acceleration (vehicle dynamics input)
    • B60W2554/4041: Position of dynamic objects
    • B60W2554/80: Spatial relation or speed relative to objects
    • B60W2554/801: Lateral distance
    • B60W2554/802: Longitudinal distance
    • B60W2554/804: Relative longitudinal speed
    • G07C5/008: Communicating vehicle working information to a remotely located station

Definitions

  • Embodiments of the present invention relate to the field of the Internet of Vehicles, and in particular to a driving behavior determination method, apparatus, device, and storage medium.
  • UBI: Usage-Based Insurance
  • OBD: On-Board Diagnostics
  • The OBD device collects sensor data through a built-in acceleration sensor and gyroscope, and the current driving state of the vehicle is determined from that data, so that a vehicle insurance company can infer the owner's driving style from the driving state determined by the OBD device and formulate a corresponding insurance policy. For example, for owners who frequently brake hard or turn sharply while driving, the insurance company judges the driving style to be aggressive and raises their premiums.
  • However, determining the owner's driving behavior from the vehicle's driving state alone is one-sided and easily leads to misjudgment of dangerous driving behavior. For example, an owner's hard braking on a congested road section may be mistaken for dangerous driving, which reduces the accuracy of the insurance policies the insurance company formulates.
  • Embodiments of the present invention provide a driving behavior determination method, apparatus, device, and storage medium, which can solve the problem that determining the owner's driving behavior from the driving state of the vehicle alone is one-sided, easily leads to misjudgment of dangerous driving behavior, and thereby reduces the accuracy of the insurance policies formulated by vehicle insurance companies.
  • the technical solution is as follows:
  • According to a first aspect, a driving behavior determination method is provided, including:
  • acquiring a driving image, where the driving image includes a road image and vehicle images of the preceding and following vehicles; acquiring vehicle driving information, where the vehicle driving information is used to represent the driving state of the vehicle;
  • determining whether dangerous driving behavior exists according to the driving image and the vehicle driving information, where dangerous driving behavior refers to driving behavior that carries a risk of accident.
  • A driving behavior determination apparatus is provided, including:
  • a first acquiring module configured to acquire a driving image, where the driving image includes a road image and vehicle images of the preceding and following vehicles;
  • a second acquiring module configured to acquire vehicle driving information, where the vehicle driving information is used to represent the driving state of the vehicle;
  • a first determining module configured to determine whether dangerous driving behavior exists according to the driving image and the vehicle driving information, where dangerous driving behavior refers to driving behavior that carries a risk of accident.
  • A driving behavior determination apparatus is provided, including a processor and a memory, where the memory stores at least one instruction, and the at least one instruction is loaded and executed by the processor to implement the driving behavior determination method described in the first aspect.
  • A computer-readable storage medium is provided, having at least one instruction stored therein, the at least one instruction being loaded and executed by a processor to implement the driving behavior determination method described in the first aspect.
  • FIG. 1 shows a schematic diagram of an implementation environment provided by an embodiment of the present invention
  • FIG. 2 is a flow chart showing a driving behavior determining method provided by an embodiment of the present invention.
  • FIG. 3A is a flowchart of a driving behavior determining method according to another embodiment of the present invention.
  • FIG. 3B is a schematic diagram of a driving image provided by an embodiment
  • FIG. 3C is a flowchart showing a driving behavior determining method according to still another embodiment of the present invention.
  • FIG. 3D is a schematic diagram of an implementation of the process of determining lane information in the driving behavior determining method illustrated in FIG. 3C;
  • FIG. 3E is a schematic diagram of an implementation of the process of determining distance information in the driving behavior determining method shown in FIG. 3C;
  • FIG. 4 is a block diagram showing the structure of a driving behavior determining apparatus according to an embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of a driving behavior determining apparatus according to an embodiment of the present invention.
  • FIG. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present invention; the implementation environment includes a driving record device 110, a mobile terminal 120, and a server 130.
  • the driving record device 110 is an electronic device that is disposed in a vehicle for recording driving images.
  • the driving recording device 110 is a driving recorder or a car navigation device equipped with a camera.
  • The driving record device 110 is disposed at the front of the vehicle to record driving images ahead of the vehicle while it is moving; alternatively, driving record devices 110 are installed at both the front and the rear of the vehicle to record driving images ahead of and behind the vehicle while it is moving.
  • a wired or wireless connection is established between the driving record device 110 and the mobile terminal 120.
  • For example, the driving record device 110 establishes a wired connection with the mobile terminal 120 through a data line and performs data interaction over that line; or the driving record device 110 establishes a wireless connection with the mobile terminal 120 through Bluetooth or infrared and performs data interaction over that wireless connection.
  • The mobile terminal 120 is an electronic device with Internet access, such as a smartphone, a tablet computer, or a wearable smart device.
  • The mobile terminal 120 obtains data sent by the driving record device 110 through the connection between them, and reports the obtained data through the Internet.
  • the mobile terminal 120 or the driving record device 110 is also connected to the OBD device installed in the vehicle, and acquires the travel information of the vehicle collected by the OBD device during the running of the vehicle.
  • The vehicle travel information is used to characterize the running state of the vehicle, and includes at least one of the vehicle's current speed, current acceleration, and steering information.
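  • The travel-information fields above can be sketched as a small record type; the field names here are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class VehicleDrivingInfo:
    # Field names are assumptions for illustration only.
    speed_mps: float        # current speed
    accel_mps2: float       # current acceleration
    turn_signal_on: bool    # steering information: is a turn signal active?
    timestamp_s: float      # when the sample was collected
```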
  • The driving record device 110 or the mobile terminal 120 is provided with an artificial intelligence (AI) algorithm, by which it can recognize the road image and the vehicle images contained in the driving image and, from the recognized road image and vehicle images, analyze the driving lane, the distance to the preceding (or following) vehicle, and the speed relative to the preceding (or following) vehicle. Further, by means of the AI algorithm, the driving record device 110 or the mobile terminal 120 can quantify the identified dangerous driving behavior to obtain driving behavior data corresponding to that dangerous driving behavior.
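  • The patent does not disclose the concrete vision algorithm used to derive distance and relative speed, but both can be sketched under a simple pinhole-camera assumption (the focal length and vehicle height below are assumed values, not from the patent):

```python
def estimate_distance(box_height_px: float,
                      focal_px: float = 1000.0,     # assumed focal length (pixels)
                      real_height_m: float = 1.5) -> float:
    """Rough distance to a detected vehicle from its bounding-box height,
    using the pinhole model: distance = focal * real_height / pixel_height."""
    return focal_px * real_height_m / box_height_px

def relative_speed(dist_prev_m: float, dist_curr_m: float, dt_s: float) -> float:
    """Closing speed in m/s between two frames; positive = gap shrinking."""
    return (dist_prev_m - dist_curr_m) / dt_s
```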
  • the mobile terminal 120 and the server 130 are connected by a wired network or a wireless network.
  • The server 130 is a server for managing driving behavior data associated with vehicles, where the driving behavior data is reported by the driving record device 110 or the mobile terminal 120 (in real time while the vehicle is driving, or at predetermined time intervals).
  • the server 130 is a server cluster or a cloud computing center composed of several servers.
  • In a possible implementation, the server 130 receives update data sent by the mobile terminal 120 (obtained by analyzing the driving image collected by the driving record device 110 together with the vehicle driving information), and updates the stored driving behavior data according to that update data.
  • In another possible implementation, when the driving record device 110 has Internet access and establishes a wireless connection with the server 130, the server 130 receives the update data uploaded by the driving record device 110 via the Internet (calculated from real-time driving behavior), and updates the stored driving behavior data according to that update data.
  • the wireless or wired network described above uses standard communication techniques and/or protocols.
  • The network is usually the Internet, but can also be any network, including but not limited to any combination of a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a mobile, wired, or wireless network, a private network, or a virtual private network.
  • Data exchanged over the network is represented using techniques and/or formats including Hyper Text Markup Language (HTML), Extensible Markup Language (XML), and the like. In addition, all or some of the links can be encrypted using conventional encryption techniques such as Secure Sockets Layer (SSL), Transport Layer Security (TLS), Virtual Private Network (VPN), and Internet Protocol Security (IPsec).
  • In other embodiments, custom and/or dedicated data communication techniques may also be used in place of, or in addition to, the data communication techniques described above.
  • The driving behavior determination method provided by the various embodiments of the present invention may be performed by the driving record device 110 alone, by the mobile terminal 120 alone, jointly by the driving record device 110 and the mobile terminal 120, or by the server 130.
  • For example, when the method is performed by the server 130, the driving record device 110 or the mobile terminal 120 uploads the collected driving image and vehicle driving information to the server 130, and the server 130 uses the timestamps corresponding to the driving image and the vehicle driving information to determine whether the vehicle exhibited dangerous driving behavior at each moment.
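  • On the server side, one plausible way to pair each image frame with the driving-information sample for the same moment is nearest-timestamp matching; this pairing rule is an assumption for illustration, since the patent only states that both carry timestamps:

```python
import bisect

def align(frame_ts, info):
    """Pair each image-frame timestamp with the driving-info sample
    (timestamp, data) whose timestamp is closest. Assumes info is
    non-empty and sorted by timestamp."""
    info_ts = [t for t, _ in info]
    pairs = []
    for ft in frame_ts:
        i = bisect.bisect_left(info_ts, ft)
        # Consider the neighbours on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(info)]
        best = min(candidates, key=lambda j: abs(info_ts[j] - ft))
        pairs.append((ft, info[best][1]))
    return pairs
```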
  • For convenience of description, the following embodiments take the mobile terminal 120 performing the driving behavior determination method as an example, but the embodiments are not limited thereto.
  • In the related art, determining driving behavior depends on the sensor data collected by the OBD device, and only driving states such as emergency braking, emergency starting, and sharp turning can be determined from that data; that is, sensor data alone serves as the basis for judging driving behavior.
  • On the one hand, the sensor data collected by the OBD device cannot accurately reflect the current driving state of the vehicle, which affects the accuracy of the driving behavior judgment; on the other hand, measuring accident risk only from driving states such as emergency braking, emergency starting, and sharp turning is one-sided and indirect, so the determined driving behavior has low reference value and accuracy.
  • In the embodiments of the present invention, an AI algorithm determines whether the vehicle exhibits dangerous driving behavior from both the driving image captured during driving (the external situation) and the vehicle driving information (the driving state of the vehicle itself), so the basis for judging dangerous driving behavior is richer and the accuracy of the judged dangerous driving behavior is improved. At the same time, combined with the driving image, dangerous driving behaviors that carry a high risk of accident, such as deviating from the lane, following too closely, and emergency braking, can be identified, so the determined dangerous driving behavior has higher reference value and is more comprehensive and accurate, which helps vehicle insurance companies formulate more accurate insurance policies for car owners based on that behavior.
  • FIG. 2 is a flowchart of a driving behavior determining method according to an embodiment of the present invention.
  • This embodiment is described by taking, as an example, the driving behavior determination method being applied to the mobile terminal 120 shown in FIG. 1. The method includes:
  • Step 201: Acquire a driving image, where the driving image includes a road image and vehicle images of the preceding and following vehicles.
  • The driving image is a video image captured by the image capture component of the driving record device while the vehicle is running. Since the driving record device is usually disposed at the front and/or rear of the vehicle, the driving image includes road images of the road ahead and/or behind, and vehicle images of the preceding and following vehicles.
  • The driving image is a real-time driving image transmitted by the driving record device to the mobile terminal while the vehicle is running, or a driving image transmitted by the driving record device to the mobile terminal at predetermined time intervals (for example, every 5 minutes).
  • Step 202: Acquire vehicle driving information, where the vehicle driving information is used to represent the driving state of the vehicle.
  • While acquiring the driving image, the mobile terminal acquires the real-time vehicle driving information of the vehicle.
  • the vehicle driving information includes a current vehicle speed and steering information of the vehicle.
  • the vehicle driving information further includes other information for indicating the running state of the vehicle, such as the current acceleration, the driving direction, and the like, which is not limited by the embodiment of the present invention.
  • Optionally, the current speed, current acceleration, and travel direction in the vehicle driving information are collected by the driving record device, the OBD device, or the mobile terminal through built-in sensors (such as an acceleration sensor or a gyroscope); the steering information contained in the vehicle driving information is obtained from the vehicle by the OBD device.
  • Optionally, the steering information is used to characterize whether a turn signal is on.
  • Step 203: Determine whether dangerous driving behavior exists according to the driving image and the vehicle driving information at the same moment, where dangerous driving behavior refers to driving behavior that carries a risk of accident.
  • After obtaining the driving image and the vehicle driving information, the mobile terminal analyzes the driving image and the vehicle driving information for the same moment to determine whether the vehicle exhibits dangerous driving behavior.
  • Optionally, the dangerous driving behavior includes at least one of deviating from the lane, following too closely (the distance to the preceding or following vehicle being too small), and emergency braking. It should be noted that the mobile terminal can also determine, from the driving image and the vehicle driving information, other dangerous driving behaviors that can directly cause an accident, which is not limited by the embodiments of the present invention.
  • Optionally, the mobile terminal performs image recognition on the acquired driving image to determine lane information and distance information, and combines these with the steering information indicated by the vehicle driving information to determine whether the vehicle exhibits the dangerous driving behavior of deviating from the lane. Alternatively, the mobile terminal performs image recognition on the acquired driving image to determine distance information and relative speed information, and combines these with the current speed in the vehicle driving information to determine whether the vehicle exhibits the dangerous driving behavior of following too closely. Alternatively, the mobile terminal performs image recognition on the acquired driving image to determine distance information, and combines it with the current acceleration in the vehicle driving information to determine whether the vehicle exhibits the dangerous driving behavior of emergency braking.
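  • The three checks just described can be sketched as simple rules; all thresholds and field names below are illustrative assumptions, since the patent does not specify concrete values:

```python
LANE_OFFSET_MAX_M = 0.5   # assumed lateral offset before counting as lane deviation
MIN_HEADWAY_S = 1.0       # assumed minimum time headway to the vehicle ahead
MIN_TTC_S = 3.0           # assumed minimum time-to-collision when closing
BRAKE_DECEL_MPS2 = -4.0   # assumed deceleration threshold for emergency braking

def dangerous_behaviors(lane_offset_m, turn_signal_on,
                        gap_m, rel_speed_mps, speed_mps, accel_mps2):
    """Return the list of dangerous driving behaviors detected at one moment."""
    found = []
    # Lane deviation: drifting out of the lane without signalling.
    if abs(lane_offset_m) > LANE_OFFSET_MAX_M and not turn_signal_on:
        found.append("lane_deviation")
    # Distance too small: short time headway, or closing fast on the car ahead.
    headway_ok = speed_mps <= 0 or gap_m / speed_mps >= MIN_HEADWAY_S
    ttc_ok = rel_speed_mps <= 0 or gap_m / rel_speed_mps >= MIN_TTC_S
    if not (headway_ok and ttc_ok):
        found.append("gap_too_small")
    # Emergency braking: strong longitudinal deceleration.
    if accel_mps2 < BRAKE_DECEL_MPS2:
        found.append("emergency_braking")
    return found
```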
  • Optionally, the mobile terminal reports the determined dangerous driving behavior to the server, and the server stores the vehicle and the dangerous driving behavior in association, so as to construct a corresponding driving behavior model (indicating the relationship between driving behavior and accident risk) based on the type, frequency of occurrence, and risk level of the dangerous driving behavior.
  • When the driving record device has strong image analysis and data processing capabilities, the above steps may alternatively be performed by the driving record device, which is not limited in the embodiment of the present invention.
  • In summary, a comprehensive judgment of the dangerous driving behavior of the vehicle is realized based on the driving image and the vehicle driving information, so the basis for judging dangerous driving behavior is more comprehensive and the accuracy of the judgment result is improved, thereby improving the accuracy of insurance policies formulated based on the judgment results.
  • the mobile terminal needs to quantify the determined dangerous driving behavior.
  • In a possible implementation, after determining that the vehicle exhibits dangerous driving behavior, the mobile terminal further performs a quantitative calculation on the dangerous driving behavior according to its type, danger level, and frequency of occurrence, and reports the calculated quantitative data to the server, so that the server updates the driving behavior data of the vehicle according to the quantitative data and thereby formulates an insurance policy based on the driving behavior data.
  • the following description is made using the illustrative embodiments.
  • FIG. 3A is a flowchart of a driving behavior determining method according to another embodiment of the present invention.
  • This embodiment is described by taking the driving behavior determining method applied to the mobile terminal 120 shown in FIG. 1 as an example. The method includes:
  • Step 301: Acquire a driving image, where the driving image includes a road image and vehicle images of the vehicles in front and behind.
  • The mobile terminal acquires the driving image collected by the driving record device as shown in FIG. 3B, where the driving image includes the road image 31 and the vehicle image 32 of the preceding vehicle.
  • Step 302: Recognize the driving image to obtain a driving image recognition result, where the driving image recognition result includes at least one of lane information, distance information, and relative speed information.
  • After obtaining the driving image, the mobile terminal analyzes and recognizes the driving image through image analysis technology, thereby obtaining a driving image recognition result that includes at least one of the lane information, the distance information, and the relative speed information.
  • the lane information is used to characterize the lane in which the vehicle is located;
  • the distance information is used to characterize the distance between the vehicle and the vehicle in front and/or the vehicle behind;
  • the relative speed information is used to characterize the speed of the vehicle relative to the vehicle in front and/or the vehicle behind.
  • this step includes the following steps.
  • Step 302A: Identify the lane lines included in the road image, and determine the lane information according to the lane lines.
  • To determine the lane in which the vehicle is currently located, the mobile terminal identifies the lane lines included in the road image of the driving image (also referred to as road traffic markings, including the white dotted line, white solid line, yellow dotted line, yellow solid line, double white dotted line, double white solid line, double yellow solid line, yellow dashed-solid line, etc.), thereby determining the lane information based on the identified lane lines.
  • The mobile terminal recognizes the trapezoid-like region 33 (or a triangle-like region) located in the lower half of the driving image, further identifies the lane line 34 included in the trapezoid-like region 33, and thereby determines the lane information based on the lane line 34.
  • For example, when the trapezoid-like region includes two lane lines, the mobile terminal determines that the vehicle is currently in the middle lane; when only the left side of the trapezoid-like region is identified as including a lane line, it is determined that the vehicle is located in the right lane.
  • In other possible implementations, the mobile terminal identifies the lane lines included in the road image by image recognition technology and assigns a number to each identified lane line. If the lane line located in the middle of the road image is the 1st lane line, it is determined that the vehicle is currently in the 1st lane (the leftmost lane); if the lane lines located in the middle of the road image are the nth and (n+1)th lane lines, it is determined that the vehicle is currently in the (n+1)th lane; if the lane line located in the middle of the road image is the nth lane line, it is determined that the vehicle is currently in the nth lane (the rightmost lane).
  • the mobile terminal may also use other image analysis technologies to determine the lane information, which is not limited by the embodiment of the present invention.
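The lane-numbering rule described above can be sketched as follows. This is a minimal illustration; the function name and the input representation (a list of the lane-line numbers visible in the middle of the image, numbered from the left starting at 1) are assumptions, not from the patent:

```python
def determine_lane(middle_line_numbers):
    """Map the lane-line number(s) seen in the middle of the road image
    to a lane number, following the numbering rule of step 302A."""
    if len(middle_line_numbers) == 2:
        n = middle_line_numbers[0]
        return n + 1          # between the nth and (n+1)th lines -> lane n+1
    n = middle_line_numbers[0]
    if n == 1:
        return 1              # only the 1st line in the middle -> leftmost lane
    return n                  # only the nth line in the middle -> rightmost lane
```

For example, seeing the 2nd and 3rd lane lines in the middle of the image places the vehicle in the 3rd lane under this rule.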
  • Step 302B: Identify the image position of the vehicle image in the driving image, and determine the distance information according to the image position and a preset distance scale, where the preset distance scale is used to represent the mapping relationship between different image positions in the driving image and actual distances.
  • After obtaining the driving image, the mobile terminal determines the image position of the vehicle image in the driving image, and determines the actual distance to the vehicle in front (or the vehicle behind) according to the image position.
  • the preset distance scale indicating the mapping relationship between the image position and the actual distance is pre-stored in the mobile terminal.
  • the driving image corresponds to the preset distance scale 35.
  • In a possible implementation, the mobile terminal performs contour recognition on the driving image to identify the vehicle image of the preceding vehicle; further, the mobile terminal determines the image position of the lower edge of the vehicle image in the driving image, and determines the distance to the preceding vehicle according to the image position and the preset distance scale. For example, as shown in FIG. 3E, the mobile terminal determines that the distance between the vehicle and the preceding vehicle is 50 m.
  • Step 302C: Identify the image position of the vehicle image in the driving image, and determine the relative speed information according to the change of the image position.
  • the relative speed information is used to characterize the speed relationship between the vehicle and the preceding vehicle or the rear vehicle.
  • the relative speed information indicates that the current vehicle speed of the vehicle is greater than the current vehicle speed of the preceding vehicle.
  • To determine the relative speed with respect to the preceding vehicle (or the rear vehicle), for each image frame in the driving image, the mobile terminal identifies the image position of the vehicle image in the driving image (in a manner similar to step 302B above), and determines the relative speed information based on the change of the image position across adjacent image frames.
  • When determining the relative speed with respect to the preceding vehicle: when detecting that the image position of the vehicle image shifts toward the lower side of the frame (the inter-vehicle distance becomes smaller), the mobile terminal determines that the speed of the preceding vehicle is less than the current speed of the vehicle; when detecting that the image position of the vehicle image shifts toward the upper side of the frame (the inter-vehicle distance becomes larger), the mobile terminal determines that the speed of the preceding vehicle is greater than the current speed of the vehicle.
  • When determining the relative speed with respect to the rear vehicle: when detecting that the image position of the vehicle image shifts toward the lower side of the frame (the inter-vehicle distance becomes larger), the mobile terminal determines that the speed of the rear vehicle is less than the current speed of the vehicle; when detecting that the image position of the vehicle image shifts toward the upper side of the frame (the inter-vehicle distance becomes smaller), the mobile terminal determines that the speed of the rear vehicle is greater than the current speed of the vehicle.
  • In other possible implementations, the mobile terminal may further calculate the change in inter-vehicle distance from the change of the image position, then calculate the speed of the front (or rear) vehicle from the current vehicle speed and the distance change, and finally calculate the relative speed difference between the two; this embodiment does not limit this.
  • It should be noted that the above steps 301 and 302 can also be performed by the driving record device, in which case the mobile terminal 120 only needs to obtain the driving image recognition result provided by the driving record device; this is not limited in the embodiment of the present invention.
  • Step 303: Acquire the vehicle driving information, where the vehicle driving information is used to characterize the driving state of the vehicle.
  • Step 304: Determine whether dangerous driving behavior exists according to the driving image recognition result and the vehicle driving information.
  • the mobile terminal determines whether the vehicle has dangerous driving behavior by combining the vehicle driving information and the driving image recognition result obtained in step 302 above.
  • When the driving image recognition result includes the lane information and the distance information, and the vehicle driving information includes the steering information, the mobile terminal performs the following step 304A; when the recognition result includes the distance information and the relative speed information, and the vehicle driving information includes the current vehicle speed, the mobile terminal performs the following step 304B; when the driving image recognition result includes the distance information, and the vehicle driving information includes the current acceleration, the mobile terminal performs the following step 304C.
  • Step 304A: When the lane information changes, the distance to the preceding vehicle indicated by the distance information is less than a first threshold, and the steering information indicates that the turn signal is not turned on, determine that a first dangerous driving behavior exists, where the first dangerous driving behavior refers to the driving behavior of deviating from the lane.
  • The mobile terminal detects whether the lane information obtained in step 302A changes, and detects, according to the distance information obtained in step 302B, whether the distance to the preceding vehicle is less than the first threshold. Optionally, the first threshold is positively correlated with the current vehicle speed; for example, the first threshold is the safe braking distance corresponding to the current vehicle speed. When the above conditions are met and the turn signal is not turned on, the mobile terminal determines that the vehicle has deviated from the lane.
  • Through step 304A, the mobile terminal can recognize the lane-changing behavior of the vehicle and can further determine, according to the distance to the preceding vehicle, whether the lane change carries an accident risk. Compared with identifying driving behavior based only on sensor data, the dangerous driving behaviors that can be identified in this embodiment are richer, and the identified dangerous driving behaviors better match the probability of an accident.
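The judgment condition of step 304A can be written compactly as a single predicate; the function name and argument names are illustrative:

```python
def first_dangerous_behavior(lane_changed, dist_front_m,
                             turn_signal_on, first_threshold_m):
    """Step 304A: deviating from the lane is flagged when the lane
    changes, the distance to the preceding vehicle is below the first
    threshold (e.g. the safe braking distance for the current speed),
    and the turn signal is off."""
    return (lane_changed
            and dist_front_m < first_threshold_m
            and not turn_signal_on)
```

Note that turning on the turn signal, or keeping a large gap to the preceding vehicle, suppresses the flag even when the lane changes.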
  • Step 304B: Determine the safe braking distance corresponding to the current vehicle speed; when the distance to the preceding vehicle indicated by the distance information is less than the safe braking distance, and the relative speed information indicates that the current vehicle speed is greater than the speed of the preceding vehicle, determine that a second dangerous driving behavior exists.
  • After determining the distance to the preceding vehicle through step 302B above, the mobile terminal further detects whether that distance is less than the safe braking distance corresponding to the current vehicle speed; when the distance is less than the safe braking distance and the relative speed information indicates that the current vehicle speed is greater than the speed of the preceding vehicle, it is determined that the dangerous driving behavior of an excessively small inter-vehicle distance exists.
  • the correspondence between the vehicle speed and the safe braking distance under different road conditions is pre-stored in the mobile terminal, and the correspondence relationship is schematically shown in Table 1.
  • In other possible implementations, the mobile terminal can calculate the safe braking distance in real time according to the proportional relationship between the current vehicle speed and the safe braking distance, where the current vehicle speed and the safe braking distance are in a proportional relationship; this is not limited by the embodiment of the present invention.
  • Optionally, the mobile terminal alerts the driver in a predetermined manner (such as issuing a predetermined reminder voice) to avoid an accident.
  • Through step 304B, the mobile terminal can recognize the behavior of the vehicle being too close to, or approaching too fast, the vehicle ahead, and determines this behavior as dangerous driving behavior. Compared with the related art, in which only excessive vehicle speed can be identified according to the current vehicle speed, the dangerous driving behaviors that can be identified in this embodiment are richer, and the identified dangerous driving behaviors better match the probability of an accident.
  • Step 304C: When the distance to the preceding vehicle indicated by the distance information is greater than a second threshold, the distance to the rear vehicle indicated by the distance information is less than a third threshold, and the current acceleration indicates that the vehicle is braking, determine that a third dangerous driving behavior exists, where the third dangerous driving behavior refers to the driving behavior of emergency braking.
  • Sudden braking of the vehicle can easily cause the distance to the rear vehicle to become too small, or even directly lead to an accident.
  • The mobile terminal acquires the distance to the vehicle in front, the distance to the vehicle behind, and the current acceleration of the vehicle; when it detects that the distance to the preceding vehicle is greater than the second threshold, the distance to the rear vehicle is less than the third threshold, and the current acceleration indicates that the vehicle is braking, it determines that the distance to the rear vehicle has become too small due to sudden braking.
  • the second threshold and the third threshold are positively correlated with the current vehicle speed.
  • In the related art, emergency braking is directly determined to be dangerous driving behavior. By incorporating image recognition technology into the dangerous driving behavior judgment, it is possible to avoid erroneously determining emergency braking on a crowded road section as dangerous driving behavior (because during emergency braking on a crowded road section, the distance to the preceding vehicle is less than the second threshold, so the above judgment condition is not satisfied), thereby improving the accuracy of the dangerous driving behavior judgment.
  • Step 305: Determine the danger level of the dangerous driving behavior.
  • Optionally, after determining that dangerous driving behavior exists, the mobile terminal further determines the danger level of the dangerous driving behavior according to data such as the current traffic volume of the road, the current vehicle speed, and the distance information, where the higher the danger level, the higher the possibility of an accident.
  • this step includes the following steps as shown in FIG. 3C.
  • Step 305A: When the dangerous driving behavior is the first dangerous driving behavior, determine the danger level according to the current vehicle speed and the traffic volume.
  • Optionally, the mobile terminal acquires the current speed of the vehicle and calculates the traffic volume according to the number of vehicles appearing in the driving image within a predetermined duration (for example, one minute), thereby determining the danger level according to the current vehicle speed and the traffic volume.
  • the current vehicle speed is positively correlated with the hazard level
  • the traffic flow is positively correlated with the hazard level.
  • the mobile terminal determines the dangerous level of dangerous driving behavior according to the correspondence shown in Table 2.
  • Step 305B: When the dangerous driving behavior is the second dangerous driving behavior or the third dangerous driving behavior, determine the danger level according to the current vehicle speed and the current inter-vehicle distance.
  • The mobile terminal acquires the current speed of the vehicle and the current distance to the preceding vehicle, thereby determining the danger level according to the current vehicle speed and the current distance.
  • the current speed is positively correlated with the dangerous level
  • the current distance is negatively correlated with the dangerous level.
  • the mobile terminal determines the dangerous level of dangerous driving behavior according to the correspondence shown in Table 3.
  • The mobile terminal acquires the current speed of the vehicle and the current distance to the rear vehicle, thereby determining the danger level according to the current vehicle speed and the current distance.
  • the current speed is positively correlated with the dangerous level
  • the current distance is negatively correlated with the dangerous level.
  • In other possible implementations, the mobile terminal may alternatively calculate the danger level corresponding to the dangerous driving behavior according to danger level calculation formulas corresponding to different types of dangerous driving behaviors, which is not limited by the embodiment of the present invention.
  • Step 306: Calculate update data according to the frequency of occurrence of the dangerous driving behavior, its danger level, and the weight corresponding to the dangerous driving behavior, where the update data is used to update the driving behavior data corresponding to the vehicle.
  • The mobile terminal needs to combine the frequency of occurrence of the dangerous driving behavior, its danger level, and the weight corresponding to the dangerous driving behavior to convert the dangerous driving behavior into quantitative data (i.e., the update data used for updating the driving behavior data).
  • Optionally, the quantitative data corresponding to the dangerous driving behavior = frequency of occurrence of the dangerous driving behavior × danger level × weight of the dangerous driving behavior.
  • the above steps 303 to 306 can be performed by the driving record device, which is not limited by the embodiment of the present invention.
  • Step 307: Send a data update request to the server, where the data update request includes the vehicle identifier and the update data, and the server is configured to update, according to the update data, the driving behavior data of the vehicle indicated by the vehicle identifier.
  • Optionally, the mobile terminal sends the update data calculated in real time to the server, or sends the update data generated within a time interval to the server at predetermined intervals (for example, every 10 minutes).
  • After receiving the data update request carrying the update data and the vehicle identifier, the server updates the driving behavior data of the vehicle indicated by the vehicle identifier according to the update data.
  • Optionally, the data update request sent to the server further carries information such as the type, frequency of occurrence, location of occurrence, and time of occurrence of the dangerous driving behavior corresponding to the update data, so that the server can construct corresponding insurance models according to driving behavior data of different information dimensions.
  • the server constructs an insurance model for different regions according to the geographical distribution of the vehicle, thereby using the corresponding insurance model to formulate an insurance policy for the vehicle.
  • the above step 307 can be performed by the driving record device, which is not limited in the embodiment of the present invention.
  • In summary, in this embodiment the dangerous driving behavior judgment is performed based on both the driving image and the vehicle driving information, thereby improving the accuracy of the dangerous driving behavior judgment.
  • In addition, by combining the driving image, this embodiment can identify dangerous driving behaviors that directly lead to accidents, such as deviating from the lane, following the preceding vehicle too closely, or emergency braking, further improving the reference value of the identified dangerous driving behaviors.
  • In addition, the mobile terminal quantitatively calculates the dangerous driving behavior according to its type, danger level, and frequency of occurrence, and reports the calculated quantitative data to the server, so that the server can update the driving behavior data of the vehicle according to the quantitative data and formulate an insurance policy based on the driving behavior data, which is beneficial to improving the accuracy of the insurance policy.
  • FIG. 4 is a structural block diagram of a driving behavior determining apparatus according to an embodiment of the present invention.
  • The driving behavior determining device may be implemented as all or part of the mobile terminal 120 in FIG. 1 by means of hardware or a combination of hardware and software.
  • the driving behavior determining device includes a first acquiring module 410, a second acquiring module 420, and a first determining module 430.
  • the first obtaining module 410 is configured to implement the functions of the foregoing step 201 or 301;
  • the second obtaining module 420 is configured to implement the functions of the foregoing step 202 or 303;
  • the first determining module 430 is configured to implement the function of the foregoing step 203.
  • the first determining module 430 includes: an identifying unit and a first determining unit;
  • An identification unit configured to implement the function of step 302 above;
  • the first determining unit is configured to implement the function of step 304 above.
  • the identifying unit is further configured to implement the functions of the foregoing step 302A, 302B or 302C.
  • the vehicle driving information includes current vehicle speed and steering information
  • the first determining unit is further configured to implement the functions of the foregoing step 304A, 304B or 304C.
  • the device further includes: a second determining module, a calculating module, and a sending module;
  • a second determining module configured to implement the function of step 305 above;
  • a calculation module configured to implement the functions of step 306 above;
  • the sending module is configured to implement the function of step 307 above.
  • the second determining module includes: a second determining unit and a third determining unit;
  • a second determining unit configured to implement the function of step 305A above;
  • the third determining unit is configured to implement the function of the foregoing step 305B.
  • The device 500 may be the driving record device 110 or the mobile terminal 120 of FIG. 1. Specifically:
  • The apparatus 500 can include an RF (Radio Frequency) circuit 510, a memory 520 including one or more computer readable storage media, an input unit 530, a display unit 540, a sensor 550, an audio circuit 560, a near field communication module 570, a processor 580 having one or more processing cores, a power supply 590, and the like.
  • The structure shown in FIG. 5 does not constitute a limitation on the terminal, which may include more or fewer components than those illustrated, combine certain components, or adopt a different component arrangement. Among them:
  • The RF circuit 510 can be used for receiving and transmitting signals in the process of receiving or transmitting information; in particular, after receiving downlink information of the base station, it passes the information to one or more processors 580 for processing, and in addition, it transmits uplink-related data to the base station.
  • the RF circuit 510 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier). , duplexer, etc.
  • RF circuitry 510 can also communicate with the network and other devices via wireless communication.
  • the wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System of Mobile communication), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access). , Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and the like.
  • the memory 520 can be used to store software programs and modules, and the processor 580 executes various functional applications and data processing by running software programs and modules stored in the memory 520.
  • The memory 520 may mainly include a storage program area and a storage data area, where the storage program area may store the operating system and the applications required for at least one function (such as a sound playing function, an image playing function, etc.), and the storage data area may store data created according to the use of the device 500 (such as audio data, a phone book, etc.), and the like.
  • The memory 520 can include high-speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 520 may also include a memory controller to provide the processor 580 and the input unit 530 with access to the memory 520.
  • Input unit 530 can be used to receive input numeric or character information, as well as to generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function controls.
  • the input unit 530 may include an image input device 531 and other input devices 532.
  • the image input device 531 may be a camera or an optical scanning device.
  • the input unit 530 may also include other input devices 532.
  • other input devices 532 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackballs, mice, joysticks, and the like.
  • Display unit 540 can be used to display information entered by the user or information provided to the user and various graphical user interfaces of device 500, which can be comprised of graphics, text, icons, video, and any combination thereof.
  • the display unit 540 can include a display panel 541.
  • the display panel 541 can be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
  • Device 500 may also include at least one type of sensor 550, such as a light sensor, motion sensor, and other sensors.
  • The light sensor may include an ambient light sensor and a proximity sensor, where the ambient light sensor may adjust the brightness of the display panel 541 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 541 and/or the backlight when the device 500 moves close to the ear.
  • As a kind of motion sensor, the gravity acceleration sensor can detect the magnitude of acceleration in all directions (usually three axes), and when stationary can detect the magnitude and direction of gravity; it can be used for applications that recognize the attitude of the mobile phone (such as switching between horizontal and vertical screens, related games, and magnetometer attitude calibration), vibration recognition related functions (such as a pedometer and tapping), and the like. The device 500 may also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, and details are not described herein again.
  • Audio circuit 560, speaker 561, and microphone 562 can provide an audio interface between the user and device 500.
  • The audio circuit 560 can transmit the electrical signal converted from the received audio data to the speaker 561, which converts it into a sound signal for output; on the other hand, the microphone 562 converts the collected sound signal into an electrical signal, which is received by the audio circuit 560 and converted into audio data; the audio data is then output to the processor 580 for processing and sent, via the RF circuit 510, to another electronic device, or the audio data is output to the memory 520 for further processing.
  • the audio circuit 560 may also include an earbud jack to provide communication of the peripheral earphones with the device 500.
  • the device 500 establishes a near field communication connection with the external device through the near field communication module 570 and performs data interaction through the near field communication connection.
  • the near field communication module 570 specifically includes a Bluetooth module and/or a WiFi module.
  • The processor 580 is the control center of the device 500; it connects various parts of the entire device using various interfaces and lines, and performs the various functions of the device 500 and processes data by running or executing software programs and/or modules stored in the memory 520 and calling data stored in the memory 520, thereby monitoring the device as a whole.
  • Optionally, the processor 580 may include one or more processing cores; preferably, the processor 580 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, applications, and the like, and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 580.
  • the device 500 also includes a power source 590 (such as a battery) that supplies power to the various components.
  • the power source can be logically coupled to the processor 580 via a power management system to manage functions such as charging, discharging, and power management through the power management system.
  • Power supply 590 may also include any one or more of a DC or AC power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
  • the device 500 may further include a Bluetooth module or the like, and details are not described herein again.
  • an embodiment of the present invention further provides a computer-readable storage medium, the storage medium storing at least one instruction, the at least one instruction being loaded and executed by a processor to implement the driving behavior determining method provided by each of the foregoing method embodiments.
  • an embodiment of the present invention further provides a computer program product, the computer program product storing at least one instruction, the at least one instruction being loaded and executed by a processor to implement the driving behavior determining method provided by the foregoing method embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Transportation (AREA)
  • Business, Economics & Management (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Mathematical Physics (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • Technology Law (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)
  • Time Recorders, Drive Recorders, Access Control (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A driving behavior determining method, apparatus, device, and storage medium. The method includes: acquiring a driving video, the driving video containing a road image and vehicle images of vehicles ahead and behind (step 201); acquiring vehicle traveling information, the vehicle traveling information being used to characterize the traveling state of the vehicle (step 202); and determining, according to the driving video and the vehicle traveling information at the same moment, whether a dangerous driving behavior exists, a dangerous driving behavior being a driving behavior that carries an accident risk (step 203). Compared with judging whether dangerous driving behavior exists from the vehicle's traveling state alone, the method incorporates the more intuitive driving video into the judgment, making the basis for judging dangerous driving behavior more comprehensive and improving the accuracy of the judgment result.

Description

Driving behavior determining method, apparatus, device, and storage medium
The embodiments of the present invention claim priority to Chinese Patent Application No. 201710127810.7, entitled "Driving Behavior Determining Method and Apparatus", filed with the State Intellectual Property Office of China on March 6, 2017, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present invention relate to the field of the Internet of Vehicles, and in particular to a driving behavior determining method, apparatus, device, and storage medium.
Background
As public awareness of insurance keeps growing, more and more owners choose to purchase insurance for their vehicles after buying them, so as to reduce the economic loss when an accident occurs.
Usage Based Insurance (UBI), a type of insurance based on driving behavior, is widely used in the vehicle insurance industry. After choosing UBI, the owner needs to install an On-Board Diagnostic (OBD) device in the vehicle. During driving, the OBD device collects sensor data through its built-in acceleration sensor and gyroscope and determines the vehicle's current traveling state from that data, so that the vehicle insurance company can judge the owner's driving style based on the traveling state determined by the OBD device and ultimately formulate a corresponding insurance policy according to that style. For example, for owners who frequently brake hard or turn sharply while driving, the vehicle insurance company judges that such owners have an aggressive driving style and raises the premiums they pay.
However, determining the owner's driving behavior solely from the vehicle's traveling state is one-sided and easily leads to misjudgment of dangerous driving behavior; for example, an owner's emergency braking on a congested road may be mistaken for dangerous driving behavior, which in turn affects the accuracy of the insurance policy formulated by the vehicle insurance company.
Summary
The embodiments of the present invention provide a driving behavior determining method, apparatus, device, and storage medium, which can solve the problem in the related art that determining the owner's driving behavior solely from the vehicle's traveling state is one-sided and easily leads to misjudgment of dangerous driving behavior, which in turn affects the accuracy of the insurance policies formulated by vehicle insurance companies. The technical solutions are as follows:
According to a first aspect of the embodiments of the present invention, a driving behavior determining method is provided, the method including:
acquiring a driving video, the driving video containing a road image and vehicle images of vehicles ahead and behind;
acquiring vehicle traveling information, the vehicle traveling information being used to characterize the traveling state of the vehicle;
determining, according to the driving video and the vehicle traveling information, whether a dangerous driving behavior exists, a dangerous driving behavior being a driving behavior that carries an accident risk.
According to a second aspect of the embodiments of the present invention, a driving behavior determining apparatus is provided, the apparatus including:
a first acquisition module, configured to acquire a driving video, the driving video containing a road image and vehicle images of vehicles ahead and behind;
a second acquisition module, configured to acquire vehicle traveling information, the vehicle traveling information being used to characterize the traveling state of the vehicle;
a first determination module, configured to determine, according to the driving video and the vehicle traveling information, whether a dangerous driving behavior exists, a dangerous driving behavior being a driving behavior that carries an accident risk.
According to a third aspect of the embodiments of the present invention, a driving behavior determining device is provided, the device including a processor and a memory, the memory storing at least one instruction, the at least one instruction being loaded and executed by the processor to implement the driving behavior determining method according to the first aspect.
According to a fourth aspect of the embodiments of the present invention, a computer-readable storage medium is provided, the storage medium storing at least one instruction, the at least one instruction being loaded and executed by a processor to implement the driving behavior determining method according to the first aspect.
The beneficial effects brought by the technical solutions provided in the embodiments of the present invention include at least the following:
By acquiring a driving video containing a road image and vehicle images, together with vehicle traveling information characterizing the traveling state of the vehicle, a comprehensive judgment of the vehicle's dangerous driving behavior is made based on the driving video and the vehicle traveling information. Compared with judging whether dangerous driving behavior exists from the traveling state alone, incorporating the more intuitive driving video makes the basis for judging dangerous driving behavior more comprehensive, improves the accuracy of the judgment result, and thereby improves the accuracy of the insurance policy formulated based on that result.
Brief Description of the Drawings
In order to describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present invention;
FIG. 2 is a flowchart of a driving behavior determining method provided by an embodiment of the present invention;
FIG. 3A is a flowchart of a driving behavior determining method provided by another embodiment of the present invention;
FIG. 3B is a schematic diagram of a driving video provided by an embodiment;
FIG. 3C is a flowchart of a driving behavior determining method provided by yet another embodiment of the present invention;
FIG. 3D is a schematic diagram of the process of determining lane information in the driving behavior determining method shown in FIG. 3C;
FIG. 3E is a schematic diagram of the process of determining distance information in the driving behavior determining method shown in FIG. 3C;
FIG. 4 is a structural block diagram of a driving behavior determining apparatus provided by an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a driving behavior determining device provided by an embodiment of the present invention.
Detailed Description
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the implementations of the present invention are described in further detail below with reference to the accompanying drawings.
Referring to FIG. 1, which shows a schematic diagram of an implementation environment provided by an embodiment of the present invention, the implementation environment includes: a driving recording device 110, a mobile terminal 120, and a server 130.
The driving recording device 110 is an electronic device installed in a vehicle for recording driving video. The driving recording device 110 is a dashboard camera or an in-vehicle navigation device equipped with a camera. Optionally, the driving recording device 110 is installed at the front of the vehicle to record driving video of the area ahead of the vehicle while driving; or the driving recording device 110 is installed at both the front and the rear of the vehicle to record driving video of the areas both ahead of and behind the vehicle while driving.
A wired or wireless connection is established between the driving recording device 110 and the mobile terminal 120. Optionally, the driving recording device 110 establishes a wired connection with the mobile terminal 120 through a data cable and exchanges data through the data cable; or the driving recording device 110 establishes a wireless connection with the mobile terminal 120 through Bluetooth or infrared and exchanges data through that wireless connection.
The mobile terminal 120 is an electronic device with Internet access, such as a smartphone, a tablet computer, or a wearable smart device. In a possible implementation, when the driving recording device 110 does not have Internet access, the mobile terminal 120 obtains the data sent by the driving recording device 110 through its connection with the driving recording device 110 and reports the obtained data over the Internet.
Optionally, the mobile terminal 120 or the driving recording device 110 is also connected to an OBD device installed in the vehicle, and obtains the vehicle traveling information collected by the OBD device while the vehicle is traveling. The vehicle traveling information is used to characterize the traveling state of the vehicle and includes at least one of the vehicle's current speed, current acceleration, and turn signal information.
In the embodiments of the present invention, an artificial intelligence (AI) algorithm is deployed in the driving recording device 110 or the mobile terminal 120. Through the AI algorithm, the driving recording device 110 or the mobile terminal 120 can recognize the road image and the vehicle images contained in the driving video, and analyze, from the recognized road image and vehicle images, information such as the traveling lane, the distance to the vehicle ahead (or behind), and the relative speed with respect to the vehicle ahead (or behind). Further, through the AI algorithm, the driving recording device 110 or the mobile terminal 120 can quantify the recognized dangerous driving behaviors to obtain driving behavior data corresponding to the dangerous driving behaviors.
The mobile terminal 120 and the server 130 are connected through a wired or wireless network.
The server 130 is a server for managing the driving behavior data corresponding to vehicles, where the driving behavior data is reported by the driving recording device 110 or the mobile terminal 120 (in real time while the vehicle is traveling, or at predetermined intervals). The server 130 is a server cluster composed of several servers or a cloud computing center.
In a possible implementation, when the driving recording device 110 does not have Internet access, the server 130 receives the update data sent by the mobile terminal 120 (obtained by analyzing the driving video and vehicle traveling information collected by the driving recording device 110), and updates the stored driving behavior data according to that update data.
In another possible implementation, when the driving recording device 110 has Internet access and has established a wireless connection with the server 130, the server 130 receives the update data uploaded by the driving recording device 110 over the Internet (calculated from real-time driving behavior), and updates the stored driving behavior data according to that update data.
Optionally, the wireless or wired network described above uses standard communication technologies and/or protocols. The network is usually the Internet, but may be any network, including but not limited to any combination of a Local Area Network (LAN), a Metropolitan Area Network (MAN), a Wide Area Network (WAN), a mobile, wired, or wireless network, a private network, or a virtual private network. In some embodiments, technologies and/or formats such as Hyper Text Markup Language (HTML) and Extensible Markup Language (XML) are used to represent the data exchanged over the network. In addition, conventional encryption technologies such as Secure Socket Layer (SSL), Transport Layer Security (TLS), Virtual Private Network (VPN), and Internet Protocol Security (IPsec) may be used to encrypt all or some of the links. In other embodiments, custom and/or dedicated data communication technologies may also be used in place of or in addition to the data communication technologies described above.
The driving behavior determining method provided by the embodiments of the present invention may be performed by the driving recording device 110 alone, by the mobile terminal 120 alone, by the driving recording device 110 and the mobile terminal 120 cooperatively, or by the server 130.
Optionally, when the driving behavior determining method is performed by the server 130, the driving recording device 110 or the mobile terminal 120 uploads the collected driving video and vehicle traveling information to the server 130, and the server 130 determines, according to the timestamps corresponding to the driving video and the vehicle traveling information, whether a dangerous driving behavior exists at each moment.
For ease of description, the following embodiments are described by taking the driving behavior determining method being performed by the mobile terminal 120 as an example, but this does not constitute a limitation.
In the related art, judging driving behavior relies on the sensor data collected by the OBD device, and only traveling states such as emergency braking, abrupt starting, and sharp turning can be determined from that sensor data. However, with sensor data as the sole basis for judging driving behavior, in situations such as the OBD device falling or natural vehicle vibration, the sensor data collected by the OBD device cannot accurately reflect the vehicle's current traveling state, which affects the accuracy of the driving behavior judgment. Moreover, measuring accident risk solely from the vehicle's own traveling states, such as emergency braking, abrupt starting, and sharp turning, without considering the traveling states of other vehicles, is one-sided and indirect, so the determined driving behavior has low reference value and accuracy.
In the embodiments of the present invention, by contrast, with the help of an AI algorithm, whether the vehicle exhibits dangerous driving behavior is determined from both the driving video during travel (the external situation) and the vehicle traveling information (the vehicle's own traveling state), enriching the basis for judging dangerous driving behavior and improving the accuracy of the judged dangerous driving behavior. At the same time, combining the driving video makes it possible to recognize dangerous driving behaviors with a high accident risk, such as lane departure, an excessively small following distance, and emergency braking, so the determined dangerous driving behaviors are of higher reference value and are more comprehensive and accurate, which helps the vehicle insurance company formulate a more accurate insurance policy for the owner based on those dangerous driving behaviors.
Referring to FIG. 2, which shows a flowchart of a driving behavior determining method provided by an embodiment of the present invention, this embodiment is described by taking the driving behavior determining method being applied to the mobile terminal 120 shown in FIG. 1 as an example. The method includes:
Step 201: Acquire a driving video, the driving video containing a road image and vehicle images of vehicles ahead and behind.
The driving video is video footage captured by the image acquisition component of the driving recording device while the vehicle is traveling. Since the driving recording device is usually installed at the front and/or the rear of the vehicle, the driving video contains road images of the road ahead and/or behind as well as vehicle images of the vehicles ahead and behind.
Optionally, the driving video is real-time driving video transmitted by the driving recording device to the mobile terminal while the vehicle is traveling, or cached driving video transmitted by the driving recording device to the mobile terminal at predetermined intervals (for example, every 5 minutes).
Step 202: Acquire vehicle traveling information, the vehicle traveling information being used to characterize the traveling state of the vehicle.
Optionally, while acquiring the driving video, the mobile terminal acquires the vehicle's real-time traveling information, which includes the vehicle's current speed and turn signal information. Optionally, the vehicle traveling information also includes other information indicating the traveling state of the vehicle, such as the current acceleration and the traveling direction, which is not limited in the embodiments of the present invention.
In a possible implementation, the current speed, current acceleration, and traveling direction in the vehicle traveling information are collected by the driving recording device, the OBD device, or the mobile terminal through built-in sensors (including an acceleration sensor, a gyroscope, and the like); the turn signal information in the vehicle traveling information is obtained by the OBD device from the vehicle head unit. Optionally, the turn signal information indicates whether the turn signal is on.
Step 203: Determine, according to the driving video and the vehicle traveling information at the same moment, whether a dangerous driving behavior exists, a dangerous driving behavior being a driving behavior that carries an accident risk.
After acquiring the driving video and the vehicle traveling information, the mobile terminal analyzes the driving video and the vehicle traveling information at the same moment to determine whether the vehicle exhibits dangerous driving behavior. Optionally, the dangerous driving behavior includes at least one of: lane departure, an excessively small following distance (too small a gap to the vehicle ahead or the vehicle behind), and emergency braking. It should be noted that the mobile terminal may also determine other dangerous driving behaviors that directly cause accidents from the driving video and the vehicle traveling information, which is not limited in the embodiments of the present invention.
Optionally, the mobile terminal performs image recognition on the acquired driving video to determine lane information and distance information, and determines, in combination with the turn signal information indicated by the vehicle traveling information, whether the vehicle exhibits the dangerous driving behavior of lane departure; or the mobile terminal performs image recognition on the acquired driving video to determine distance information and relative speed information, and determines, in combination with the current speed in the vehicle traveling information, whether the vehicle exhibits the dangerous driving behavior of an excessively small following distance; or the mobile terminal performs image recognition on the acquired driving video to determine distance information, and determines, in combination with the current acceleration in the vehicle traveling information, whether the vehicle exhibits the dangerous driving behavior of emergency braking.
Optionally, the mobile terminal reports the determined dangerous driving behavior to the server, and the server stores the vehicle in association with the dangerous driving behavior, so that a driving behavior model corresponding to the vehicle (indicating the relationship between driving behavior and accident risk) can subsequently be built based on the type, occurrence frequency, and danger level of the dangerous behavior.
It should be noted that, when the driving recording device has strong image analysis and data processing capabilities, the above steps may be performed by the driving recording device, which is not limited in the embodiments of the present invention.
In summary, in this embodiment, by acquiring a driving video containing a road image and vehicle images, together with vehicle traveling information characterizing the traveling state of the vehicle, a comprehensive judgment of the vehicle's dangerous driving behavior is made based on the driving video and the vehicle traveling information. Compared with judging whether dangerous driving behavior exists from the traveling state alone, incorporating the more intuitive driving video makes the basis for the judgment more comprehensive, improves the accuracy of the judgment result, and thereby improves the accuracy of the insurance policy formulated based on that result.
To facilitate the subsequent formulation of a corresponding insurance policy based on dangerous driving behaviors, the mobile terminal needs to quantify the determined dangerous driving behaviors. In a possible implementation, after determining that the vehicle exhibits a dangerous driving behavior, the mobile terminal further computes a quantitative value for the behavior from its type, danger level, and occurrence frequency, and reports the computed quantitative data to the server, so that the server updates the vehicle's driving behavior data according to the quantitative data and formulates an insurance policy based on that driving behavior data. This is described below using an illustrative embodiment.
Referring to FIG. 3A, which shows a flowchart of a driving behavior determining method provided by another embodiment of the present invention, this embodiment is described by taking the driving behavior determining method being applied to the mobile terminal 120 shown in FIG. 1 as an example. The method includes:
Step 301: Acquire a driving video, the driving video containing a road image and vehicle images of vehicles ahead and behind.
The implementation of this step is similar to step 201 above and is not repeated here.
Illustratively, in the traveling state, the driving video captured by the driving recording device and acquired by the mobile terminal is shown in FIG. 3B, where the driving video contains a road image 31 and a vehicle image 32 of the vehicle ahead.
Step 302: Recognize the driving video to obtain a driving video recognition result, the driving video recognition result containing at least one of lane information, distance information, and relative speed information.
After acquiring the driving video, the mobile terminal analyzes and recognizes the driving video through image analysis technology to obtain a driving video recognition result containing at least one of lane information, distance information, and relative speed information. The lane information is used to characterize the lane in which the vehicle is located, the distance information is used to characterize the distance to the vehicle ahead and/or the vehicle behind, and the relative speed information is used to characterize the relative speed with respect to the vehicle ahead and/or the vehicle behind.
In a possible implementation, as shown in FIG. 3C, this step includes the following steps.
Step 302A: Recognize the lane lines contained in the road image; determine the lane information according to the lane lines.
To determine the lane in which the vehicle is currently located, the mobile terminal recognizes the lane lines contained in the road image of the driving video (also called road traffic markings, including white dashed lines, white solid lines, yellow dashed lines, yellow solid lines, double white dashed lines, double white solid lines, double yellow solid lines, yellow dashed-solid lines, and the like), and determines the lane information based on the recognized lane lines.
In a possible implementation, as shown in FIG. 3D, the mobile terminal recognizes a trapezoid-like region 33 (or triangle-like region) in the lower half of the driving video, and further recognizes the lane lines 34 contained in the trapezoid-like region 33, thereby determining the lane information according to the lane lines 34. Optionally, when two lane lines are recognized in the trapezoid-like region, it is determined that the vehicle is in the middle lane; when lane lines are recognized only on the left side of the trapezoid-like region, it is determined that the vehicle is in the right lane; when lane lines are recognized only on the right side of the trapezoid-like region, it is determined that the vehicle is in the left lane. For example, as shown in FIG. 3D, the mobile terminal determines that the vehicle is currently in the middle lane.
In another possible implementation, the mobile terminal recognizes the lane lines contained in the road image through image recognition technology and assigns lane line numbers to the recognized lane lines. If the lane line in the middle of the road image is the 1st lane line, it is determined that the vehicle is currently in the 1st lane (the leftmost lane); if the lane lines in the middle of the road image are the nth and (n+1)th lane lines, it is determined that the vehicle is currently in the (n+1)th lane; if the lane line in the middle of the road image is the nth lane line, it is determined that the vehicle is currently in the nth lane (the rightmost lane).
It should be noted that the mobile terminal may also use other image analysis technologies to determine the lane information, which is not limited in the embodiments of the present invention.
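As a minimal sketch, the lane rule of the first implementation above reduces to checking on which side of the trapezoid-like region lane lines were recognized. The function name and the boolean inputs, which an actual lane-line detector would supply, are assumptions for illustration:

```python
def lane_from_region_lines(has_left_line: bool, has_right_line: bool) -> str:
    """Apply the trapezoid-region rule described above: lane lines on
    both sides -> middle lane; only on the left -> right lane; only on
    the right -> left lane."""
    if has_left_line and has_right_line:
        return "middle"
    if has_left_line:
        return "right"
    if has_right_line:
        return "left"
    return "unknown"  # no lane line recognized in the region
```

A change in this value between successive frames is what step 304A below treats as a lane change.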
Step 302B: Recognize the image position of the vehicle image in the driving video; determine the distance information according to the image position and a preset distance scale, the preset distance scale being used to characterize the mapping between different image positions in the driving video and actual distances.
To determine whether the distance to the vehicle ahead (or behind) is too small, after acquiring the driving video the mobile terminal determines the image position of the vehicle image within the frame, and determines the actual distance to the vehicle ahead (or behind) from that image position.
In a possible implementation, a preset distance scale characterizing the mapping between image positions and actual distances is stored in the mobile terminal in advance. Illustratively, as shown in FIG. 3E, the driving video corresponds to a preset distance scale 35. Optionally, the mobile terminal performs contour recognition on the images contained in the driving video to recognize the vehicle image of the vehicle ahead; further, the mobile device determines the image position of the lower edge of the vehicle image in the driving video, and determines the distance to the vehicle ahead according to that image position and the preset distance scale. For example, as shown in FIG. 3E, the mobile terminal determines that the distance to the vehicle ahead is 50 m.
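The preset distance scale can be sketched as a small table of calibration points mapping the image row of the vehicle image's lower edge to an actual distance, with linear interpolation between points. The calibration values below are invented for illustration and are not taken from the patent:

```python
def distance_from_scale(lower_edge_row: float,
                        scale: list[tuple[float, float]]) -> float:
    """Map the image row of the vehicle image's lower edge to an actual
    distance (metres) using calibration points (row, distance) sorted by
    row, with linear interpolation between adjacent points."""
    points = sorted(scale)
    for (r0, d0), (r1, d1) in zip(points, points[1:]):
        if r0 <= lower_edge_row <= r1:
            t = (lower_edge_row - r0) / (r1 - r0)
            return d0 + t * (d1 - d0)
    raise ValueError("row outside the calibrated range")

# Hypothetical calibration: rows further down the frame mean shorter distances.
SCALE = [(400.0, 100.0), (600.0, 50.0), (800.0, 10.0)]
```

A real scale would be calibrated per camera mounting position and lens; the piecewise-linear lookup is only one plausible realization of the mapping the patent describes.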
Step 302C: Recognize the image position of the vehicle image in the driving video; determine the relative speed information according to the change in the image position.
Optionally, the relative speed information is used to characterize the speed relationship with the vehicle ahead or the vehicle behind; for example, the relative speed information indicates that the vehicle's current speed is greater than the current speed of the vehicle ahead.
To determine the relative speed with respect to the vehicle ahead (or behind), for each image frame in the driving video, the mobile terminal recognizes the image position of the vehicle image in the driving video (in a manner similar to step 302B above), and determines the relative speed information according to the change in the image position between adjacent image frames.
Optionally, when determining the relative speed with respect to the vehicle ahead: when it is detected that the image position of the vehicle image shifts toward the bottom of the frame (the distance decreases), the mobile terminal determines that the speed of the vehicle ahead is lower than the vehicle's current speed; when it is detected that the image position of the vehicle image shifts toward the top of the frame (the distance increases), the mobile terminal determines that the speed of the vehicle ahead is higher than the vehicle's current speed.
Optionally, when determining the relative speed with respect to the vehicle behind: when it is detected that the image position of the vehicle image shifts toward the bottom of the frame (the distance increases), the mobile terminal determines that the speed of the vehicle behind is lower than the vehicle's current speed; when it is detected that the image position of the vehicle image shifts toward the top of the frame (the distance decreases), the mobile terminal determines that the speed of the vehicle behind is higher than the vehicle's current speed.
In other possible implementations, the mobile terminal may also calculate the change in distance from the change in image position, calculate the speed of the vehicle ahead (or behind) from the vehicle's current speed and the change in distance, and finally calculate the relative speed difference between the two, which is not limited in this embodiment.
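The two ways of obtaining relative speed described above, the qualitative sign from the direction of image-position drift and the quantitative difference computed from successive distance estimates, can be sketched as follows (the row coordinates and timings are hypothetical inputs):

```python
def lead_vehicle_speed_relation(prev_row: float, curr_row: float) -> str:
    """For the vehicle ahead: the image position drifting toward the
    bottom of the frame (row increasing) means the gap is shrinking, so
    the lead vehicle is slower; drifting upward means it is faster."""
    if curr_row > prev_row:
        return "slower"
    if curr_row < prev_row:
        return "faster"
    return "same"

def lead_vehicle_speed(own_speed_mps: float, d_prev_m: float,
                       d_curr_m: float, dt_s: float) -> float:
    """Quantitative variant: the lead vehicle's speed is our speed minus
    the closing rate derived from the change in estimated distance."""
    closing_rate = (d_prev_m - d_curr_m) / dt_s  # positive = gap shrinking
    return own_speed_mps - closing_rate
```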
It should be noted that, when the driving recording device has an image analysis and recognition function, steps 301 and 302 above may be performed by the driving recording device, and the mobile terminal 120 only needs to obtain the driving video recognition result provided by the driving recording device, which is not limited in the embodiments of the present invention.
Step 303: Acquire vehicle traveling information, the vehicle traveling information being used to characterize the traveling state of the vehicle.
The implementation of this step is similar to step 202 above and is not repeated here.
Step 304: Determine, according to the driving video recognition result and the vehicle traveling information, whether a dangerous driving behavior exists.
Optionally, the mobile terminal determines whether the vehicle exhibits dangerous driving behavior by combining the vehicle traveling information with the driving video recognition result obtained in step 302 above. In a possible implementation, as shown in FIG. 3C, when the driving video recognition result contains lane information and distance information and the vehicle traveling information contains turn signal information, the mobile terminal performs step 304A below; when the driving video recognition result contains distance information and relative speed information and the vehicle traveling information contains the current speed, the mobile terminal performs step 304B below; when the driving video recognition result contains distance information and the vehicle traveling information contains the current acceleration, the mobile terminal performs step 304C below.
Step 304A: When the lane information changes, the distance information indicates that the distance to the vehicle ahead is less than a first threshold, and the turn signal information indicates that the turn signal is not on, determine that a first dangerous driving behavior exists, the first dangerous driving behavior being lane-departure driving behavior.
During normal driving, when a lane change is needed because the vehicle ahead is slow, or when a turn is needed, the driver must turn on the vehicle's turn signal and change lanes while keeping a certain distance from the vehicle ahead to ensure the lane change is safe. Therefore, in a possible implementation, the mobile terminal detects whether the lane information obtained in step 302A above changes; when a change in the lane information is detected, the mobile terminal detects, according to the distance information obtained in step 302B above, whether the distance to the vehicle ahead is less than a first threshold, detects from the turn signal information whether the vehicle's turn signal is on, and determines that the vehicle exhibits dangerous driving behavior when it detects that the distance to the vehicle ahead is less than the first threshold and the turn signal is not on. The first threshold is positively correlated with the current speed; for example, the first threshold is the safe braking distance corresponding to the current speed.
For example, when it is detected that the vehicle crosses from the middle lane to the right lane, the distance to the vehicle ahead is less than the first threshold, and the right turn signal is not on, the mobile terminal determines that the vehicle has departed from its lane.
Through step 304A above, the mobile terminal can recognize the vehicle's lane changes and, according to the distance to the vehicle ahead, further determine whether the lane-change behavior carries an accident risk. Compared with recognizing driving behavior from sensor data alone, the dangerous driving behaviors that can be recognized in this embodiment are richer, and the recognized dangerous driving behaviors match the probability of an accident more closely.
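Step 304A reduces to a three-way conjunction over the recognition result and the turn signal state. A minimal sketch (in practice the first threshold would be derived from the current speed, e.g. the safe braking distance):

```python
def is_lane_departure(lane_changed: bool, gap_ahead_m: float,
                      first_threshold_m: float, turn_signal_on: bool) -> bool:
    """First dangerous driving behavior: a lane change while the gap to
    the vehicle ahead is below the first threshold and the turn signal
    is off."""
    return lane_changed and gap_ahead_m < first_threshold_m and not turn_signal_on
```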
Step 304B: Determine the safe braking distance corresponding to the current speed; when the distance information indicates that the distance to the vehicle ahead is less than the safe braking distance, and the relative speed information indicates that the current speed is greater than the speed of the vehicle ahead, determine that a second dangerous driving behavior exists.
In the traveling state, when the distance between the vehicle and the vehicle ahead is small, or when the vehicle approaches the vehicle ahead rapidly, the probability of a collision is extremely high. Therefore, after determining the distance to the vehicle ahead through step 302B above, the mobile terminal further detects whether that distance is less than the safe braking distance corresponding to the current speed, and determines that the dangerous driving behavior of an excessively small following distance exists when the distance is less than the safe braking distance corresponding to the current speed and the relative speed information indicates that the current speed is greater than the speed of the vehicle ahead. Optionally, the correspondence between vehicle speed and safe braking distance under different road surface conditions is stored in the mobile terminal in advance, as illustrated in Table 1.
Table 1
Road surface condition  Speed  Safe braking distance
Dry  40 km/h  30 m
Dry  60 km/h  50 m
Wet  40 km/h  50 m
Wet  60 km/h  80 m
In addition to looking up the safe braking distance corresponding to the vehicle speed from the correspondence shown in Table 1, the mobile terminal may also calculate the safe braking distance in real time from the proportional relationship between the current speed and the safe braking distance, where the current speed is directly proportional to the safe braking distance, which is not limited in the embodiments of the present invention.
Optionally, when determining that the vehicle exhibits dangerous driving behavior, the mobile terminal reminds the driver in a predetermined manner (for example, by playing a predetermined reminder voice) to avoid an accident.
Through step 304B above, the mobile terminal can recognize the behavior of the vehicle being too close to, or rapidly approaching, the vehicle ahead, and further determine that behavior to be dangerous driving behavior. Compared with the related art, which can only recognize from the current speed whether the vehicle is traveling too fast, the dangerous driving behaviors that can be recognized in this embodiment are richer, and the recognized dangerous driving behaviors match the probability of an accident more closely.
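Using the Table 1 correspondence, step 304B can be sketched as a lookup followed by the two-condition check. The dictionary mirrors Table 1 exactly; a real implementation would interpolate or compute the safe braking distance for arbitrary speeds, so the exact-key lookup is a simplification:

```python
# Speed (km/h) -> safe braking distance (m), per Table 1.
SAFE_BRAKING_M = {
    ("dry", 40): 30, ("dry", 60): 50,
    ("wet", 40): 50, ("wet", 60): 80,
}

def is_distance_too_small(surface: str, speed_kmh: int,
                          gap_ahead_m: float, lead_speed_kmh: float) -> bool:
    """Second dangerous driving behavior: the gap is below the safe
    braking distance for the current speed, while the vehicle is
    travelling faster than the vehicle ahead."""
    safe = SAFE_BRAKING_M[(surface, speed_kmh)]
    return gap_ahead_m < safe and speed_kmh > lead_speed_kmh
```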
Step 304C: When the distance information indicates that the distance to the vehicle ahead is greater than a second threshold, the distance information indicates that the distance to the vehicle behind is less than a third threshold, and the current acceleration indicates that the vehicle is braking, determine that a third dangerous driving behavior exists, the third dangerous driving behavior being emergency-braking driving behavior.
During normal driving (on non-congested roads), hard braking (also called emergency braking) can easily make the distance to the vehicle behind too small, and can even directly cause an accident. To recognize the case where hard braking makes the following distance too small, while the vehicle is traveling the mobile terminal obtains the distance to the vehicle ahead, the distance to the vehicle behind, and the vehicle's current acceleration, and determines that hard braking has made the gap to the vehicle behind too small when it detects that the distance to the vehicle ahead is greater than a second threshold, the distance information indicates that the distance to the vehicle behind is less than a third threshold, and the current acceleration indicates that the vehicle is braking. The second threshold and the third threshold are positively correlated with the current speed.
Unlike the related art, which directly determines emergency braking to be dangerous driving behavior, this embodiment, by incorporating image recognition technology into the judgment of dangerous driving behavior, can avoid misjudging emergency braking on a congested road as dangerous driving behavior (because when braking hard on a congested road, the distance to the vehicle ahead is less than the second threshold, which does not satisfy the above judgment condition), thereby improving the accuracy of the dangerous driving behavior judgment.
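Step 304C is again a conjunction; the congested-road exclusion is the `gap_ahead_m > second_threshold_m` term. The numeric cut-off used below to decide that the acceleration "indicates braking" is an assumption for illustration, not a value from the patent:

```python
def is_emergency_braking(gap_ahead_m: float, gap_behind_m: float,
                         accel_mps2: float, second_threshold_m: float,
                         third_threshold_m: float) -> bool:
    """Third dangerous driving behavior: braking hard on an uncongested
    stretch (gap ahead above the second threshold) while the vehicle
    behind is closer than the third threshold."""
    BRAKING_ACCEL = -3.0  # m/s^2; assumed cut-off for "braking", not from the patent
    return (gap_ahead_m > second_threshold_m
            and gap_behind_m < third_threshold_m
            and accel_mps2 <= BRAKING_ACCEL)
```

Note how a small gap ahead (congestion) makes the first term false, reproducing the exclusion described above.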
Step 305: Determine the danger level of the dangerous driving behavior.
Optionally, after determining that a dangerous driving behavior exists, the mobile terminal further determines the danger level of the dangerous driving behavior from data such as the traffic flow on the current road, the current speed, and the distance information, where a higher danger level indicates a higher probability of an accident.
For different types of dangerous driving behavior, in a possible implementation, as shown in FIG. 3C, this step includes the following steps.
Step 305A: When the dangerous driving behavior is the first dangerous driving behavior, determine the danger level according to the current speed and the traffic flow.
Optionally, when the dangerous driving behavior indicates that the vehicle has departed from its lane, the mobile terminal obtains the vehicle's current speed, calculates the traffic flow from the number of vehicles in the driving video within a predetermined period (for example, one minute), and determines the danger level of the dangerous driving behavior from the current speed and the traffic flow. The current speed is positively correlated with the danger level, and the traffic flow is positively correlated with the danger level.
For example, the mobile terminal determines the danger level of the dangerous driving behavior according to the correspondence shown in Table 2.
Table 2
Current speed  Traffic flow  Danger level
40 km/h  10 vehicles/min  2
40 km/h  20 vehicles/min  4
60 km/h  10 vehicles/min  5
60 km/h  20 vehicles/min  8
Step 305B: When the dangerous driving behavior is the second dangerous driving behavior or the third dangerous driving behavior, determine the danger level according to the current speed and the current distance.
Optionally, when the dangerous driving behavior indicates that the distance to the vehicle ahead is too small, the mobile terminal obtains the vehicle's current speed and the current distance to the vehicle ahead, and determines the danger level from the current speed and the current distance. The current speed is positively correlated with the danger level, and the current distance is negatively correlated with the danger level.
For example, the mobile terminal determines the danger level of the dangerous driving behavior according to the correspondence shown in Table 3.
Table 3
Current speed  Current distance  Danger level
40 km/h  20 m  2
40 km/h  10 m  5
60 km/h  40 m  3
60 km/h  20 m  8
Optionally, when the dangerous driving behavior indicates that emergency braking exists, the mobile terminal obtains the vehicle's current speed and the current distance to the vehicle behind, and determines the danger level from the current speed and the current distance. The current speed is positively correlated with the danger level, and the current distance is negatively correlated with the danger level.
In other possible implementations, the mobile terminal may also calculate the danger level corresponding to the dangerous driving behavior according to danger level calculation formulas corresponding to different types of dangerous driving behavior, which is not limited in the embodiments of the present invention.
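Tables 2 and 3 can be sketched as dictionary lookups keyed by the quantities each branch of step 305 says determine the level. A real implementation would bucket or interpolate arbitrary speeds, flows, and distances; the exact-key lookup is a simplification for illustration:

```python
# Table 2: (current speed km/h, traffic flow vehicles/min) -> danger level.
LEVEL_BY_FLOW = {(40, 10): 2, (40, 20): 4, (60, 10): 5, (60, 20): 8}
# Table 3: (current speed km/h, current distance m) -> danger level.
LEVEL_BY_GAP = {(40, 20): 2, (40, 10): 5, (60, 40): 3, (60, 20): 8}

def danger_level(behavior: str, speed_kmh: int,
                 flow_per_min: int = 0, gap_m: int = 0) -> int:
    """Lane departure (the first behavior) is graded by speed and
    traffic flow; the second and third behaviors by speed and the
    current distance (ahead or behind, respectively)."""
    if behavior == "lane_departure":
        return LEVEL_BY_FLOW[(speed_kmh, flow_per_min)]
    return LEVEL_BY_GAP[(speed_kmh, gap_m)]
```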
Step 306: Calculate update data according to the occurrence frequency of the dangerous driving behavior, the danger level, and the weight corresponding to the dangerous driving behavior, the update data being used to update the driving behavior data corresponding to the vehicle.
Since different types of dangerous driving behavior correspond to different accident risks, the mobile terminal needs to combine the occurrence frequency of the dangerous driving behavior, the danger level, and the weight corresponding to the dangerous driving behavior to convert the dangerous driving behavior into quantitative data (that is, the update data used to update the driving behavior data).
In a possible implementation, for dangerous driving behavior of the same type, the quantitative data corresponding to the dangerous driving behavior = occurrence frequency of the dangerous driving behavior × danger level × weight of the dangerous driving behavior. For example, when the determined dangerous driving behavior indicates that the distance to the vehicle ahead is too small, with an occurrence frequency of 2, a danger level of 5, and a weight of 1.5, the quantitative data corresponding to that dangerous driving behavior is 2 × 5 × 1.5 = 15; when the determined dangerous driving behavior indicates lane departure, with an occurrence frequency of 1, a danger level of 8, and a weight of 2, the quantitative data corresponding to that dangerous driving behavior is 1 × 8 × 2 = 16.
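The quantification rule above is a straight product; a minimal sketch reproducing the two worked examples:

```python
def quantitative_data(frequency: int, danger_level: int, weight: float) -> float:
    """Quantitative data = occurrence frequency x danger level x weight,
    computed per type of dangerous driving behavior."""
    return frequency * danger_level * weight
```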
It should be noted that, when the driving recording device has strong data processing capabilities, steps 303 to 306 above may be performed by the driving recording device, which is not limited in the embodiments of the present invention.
Step 307: Send a data update request to the server, the data update request containing the vehicle identifier and the update data, the server being configured to update, according to the update data, the driving behavior data of the vehicle indicated by the vehicle identifier.
Optionally, the mobile terminal sends the update data calculated in real time to the server; or, at predetermined intervals (for example, every 10 minutes), it sends the update data accumulated within that interval to the server.
Correspondingly, after receiving the data update request carrying the update data and the vehicle identifier, the server updates the driving behavior data of the vehicle indicated by the vehicle identifier according to the update data.
Optionally, the data update request sent to the server also carries information such as the type, occurrence frequency, location, and time of the dangerous driving behavior corresponding to the update data, so that the server can build corresponding insurance models from driving behavior data along different information dimensions, thereby further improving the accuracy of the insurance policies subsequently formulated. For example, the server builds insurance models for different regions according to the regional distribution of vehicles, and uses the corresponding insurance model to formulate an insurance policy for a vehicle.
It should be noted that, when the driving recording device has Internet access, step 307 above may be performed by the driving recording device, which is not limited in the embodiments of the present invention.
In this embodiment, by recognizing the driving video, the lane information, distance information, and relative speed information are determined, and dangerous driving behavior is judged based on this information, improving the accuracy of the dangerous driving behavior judgment.
In this embodiment, combining the driving video makes it possible to recognize dangerous driving behaviors that directly cause accidents, such as the vehicle departing from its lane, being too close to the vehicle ahead, or braking hard, further improving the reference value of the recognized dangerous driving behaviors.
In this embodiment, the mobile terminal quantifies the dangerous driving behavior according to its type, danger level, and occurrence frequency, and reports the computed quantitative data to the server, so that the server updates the vehicle's driving behavior data according to the quantitative data and formulates an insurance policy based on that driving behavior data, which helps improve the accuracy of the formulated insurance policy.
The following are apparatus embodiments of the present invention. For details not fully described in the apparatus embodiments, reference may be made to the corresponding method embodiments above.
Referring to FIG. 4, which shows a structural block diagram of a driving behavior determining apparatus provided by an embodiment of the present invention. The driving behavior determining apparatus is implemented, through hardware or a combination of software and hardware, as all or part of the mobile terminal 120 in FIG. 1. The driving behavior determining apparatus includes: a first acquisition module 410, a second acquisition module 420, and a first determination module 430.
The first acquisition module 410 is configured to implement the function of step 201 or 301 above;
the second acquisition module 420 is configured to implement the function of step 202 or 303 above;
the first determination module 430 is configured to implement the function of step 203 above.
Optionally, the first determination module 430 includes: a recognition unit and a first determination unit;
the recognition unit is configured to implement the function of step 302 above;
the first determination unit is configured to implement the function of step 304 above.
Optionally, the recognition unit is further configured to implement the function of step 302A, 302B, or 302C above.
Optionally, the vehicle traveling information contains the current speed and turn signal information;
the first determination unit is further configured to implement the function of step 304A, 304B, or 304C above.
Optionally, the apparatus further includes: a second determination module, a calculation module, and a sending module;
the second determination module is configured to implement the function of step 305 above;
the calculation module is configured to implement the function of step 306 above;
the sending module is configured to implement the function of step 307 above.
Optionally, the second determination module includes: a second determination unit and a third determination unit;
the second determination unit is configured to implement the function of step 305A above;
the third determination unit is configured to implement the function of step 305B above.
Referring to FIG. 5, which shows a schematic structural diagram of a vehicle driving behavior determining device provided by an embodiment of the present invention. The device 500 is the driving recording device 110 or the mobile terminal 120 in FIG. 1. Specifically:
The device 500 may include a radio frequency (RF) circuit 510, a memory 520 including one or more computer-readable storage media, an input unit 530, a display unit 540, a sensor 550, an audio circuit 560, a near field communication module 570, a processor 580 including one or more processing cores, a power source 590, and other components. Those skilled in the art will understand that the terminal structure shown in FIG. 5 does not constitute a limitation on the terminal, which may include more or fewer components than shown, combine certain components, or use a different component arrangement. Specifically:
The RF circuit 510 may be used to receive and send signals in the course of sending and receiving information or during a call. In particular, after receiving downlink information from a base station, it hands the information to one or more processors 580 for processing, and it sends uplink data to the base station. Typically, the RF circuit 510 includes but is not limited to an antenna, at least one amplifier, a tuner, one or more oscillators, a subscriber identity module (SIM) card, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 510 may also communicate with networks and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile Communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and the like.
The memory 520 may be used to store software programs and modules, and the processor 580 executes various functional applications and processes data by running the software programs and modules stored in the memory 520. The memory 520 may mainly include a program storage area and a data storage area, where the program storage area may store the operating system, the applications required by at least one function (such as a sound playback function or an image playback function), and the like, and the data storage area may store data created according to the use of the device 500 (such as audio data or a phone book) and the like. In addition, the memory 520 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device. Correspondingly, the memory 520 may also include a memory controller to provide the processor 580 and the input unit 530 with access to the memory 520.
The input unit 530 may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control. Specifically, the input unit 530 may include an image input device 531 and other input devices 532. The image input device 531 may be a camera or a photoelectric scanning device. In addition to the image input device 531, the input unit 530 may also include other input devices 532. Specifically, the other input devices 532 may include but are not limited to one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 540 may be used to display information input by the user or information provided to the user, as well as the various graphical user interfaces of the device 500, which may be composed of graphics, text, icons, video, and any combination thereof. The display unit 540 may include a display panel 541; optionally, the display panel 541 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
The device 500 may also include at least one sensor 550, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, where the ambient light sensor may adjust the brightness of the display panel 541 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 541 and/or the backlight when the device 500 is moved to the ear. As one kind of motion sensor, a gravitational acceleration sensor can detect the magnitude of acceleration in each direction (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications that recognize the phone's posture (such as portrait/landscape switching, related games, and magnetometer posture calibration) and vibration-recognition-related functions (such as a pedometer and tapping). As for the gyroscope, barometer, hygrometer, thermometer, infrared sensor, and other sensors that the device 500 may also be configured with, they are not described here again.
The audio circuit 560, a speaker 561, and a microphone 562 may provide an audio interface between the user and the device 500. The audio circuit 560 may convert received audio data into an electrical signal and transmit it to the speaker 561, which converts it into a sound signal for output. Conversely, the microphone 562 converts a collected sound signal into an electrical signal, which is received by the audio circuit 560 and converted into audio data; the audio data is output to the processor 580 for processing and then sent via the RF circuit 510 to, for example, another electronic device, or output to the memory 520 for further processing. The audio circuit 560 may also include an earbud jack to allow a peripheral earphone to communicate with the device 500.
The device 500 establishes a near field communication connection with an external device through the near field communication module 570 and exchanges data through that connection. In this embodiment, the near field communication module 570 specifically includes a Bluetooth module and/or a WiFi module.
The processor 580 is the control center of the device 500. It connects the various parts of the entire device using various interfaces and lines, and performs the various functions of the device 500 and processes data by running or executing the software programs and/or modules stored in the memory 520 and recalling the data stored in the memory 520, thereby monitoring the device as a whole. Optionally, the processor 580 may include one or more processing cores. Preferably, the processor 580 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 580.
The device 500 also includes a power source 590 (such as a battery) that supplies power to the various components. Preferably, the power source may be logically coupled to the processor 580 via a power management system, so that charging, discharging, and power consumption management are handled through the power management system. The power source 590 may also include any one or more of a DC or AC power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, the device 500 may also include a Bluetooth module and the like, which is not described here again.
An embodiment of the present invention further provides a computer-readable storage medium storing at least one instruction, the at least one instruction being loaded and executed by a processor to implement the driving behavior determining method provided by each of the foregoing method embodiments.
An embodiment of the present invention further provides a computer program product storing at least one instruction, the at least one instruction being loaded and executed by a processor to implement the driving behavior determining method provided by the foregoing method embodiments.
The sequence numbers of the above embodiments of the present invention are for description only and do not represent the superiority or inferiority of the embodiments.
The above are merely preferred embodiments of the present invention and are not intended to limit the embodiments of the present invention. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the embodiments of the present invention shall be included within the protection scope of the embodiments of the present invention.

Claims (18)

  1. A driving behavior determining method, characterized in that the method comprises:
    acquiring a driving video, the driving video containing a road image and vehicle images of vehicles ahead and behind;
    acquiring vehicle traveling information, the vehicle traveling information being used to characterize the traveling state of the vehicle;
    determining, according to the driving video and the vehicle traveling information at the same moment, whether a dangerous driving behavior exists, the dangerous driving behavior being a driving behavior that carries an accident risk.
  2. The method according to claim 1, characterized in that the determining, according to the driving video and the vehicle traveling information at the same moment, whether a dangerous driving behavior exists comprises:
    recognizing the driving video to obtain a driving video recognition result, the driving video recognition result containing at least one of lane information, distance information, and relative speed information;
    determining, according to the driving video recognition result and the vehicle traveling information, whether the dangerous driving behavior exists;
    wherein the lane information is used to characterize the lane in which the vehicle is located, the distance information is used to characterize the distance to the vehicle ahead and/or the vehicle behind, and the relative speed information is used to characterize the relative speed with respect to the vehicle ahead and/or the relative speed with respect to the vehicle behind.
  3. The method according to claim 2, characterized in that the recognizing the driving video to obtain a driving video recognition result comprises:
    recognizing the lane lines contained in the road image; determining the lane information according to the lane lines;
    and/or,
    recognizing the image position of the vehicle image in the driving video; determining the distance information according to the image position and a preset distance scale, the preset distance scale being used to characterize the mapping between different image positions in the driving video and actual distances;
    and/or,
    recognizing the image position of the vehicle image in the driving video; determining the relative speed information according to the change in the image position.
  4. The method according to claim 2, characterized in that the driving video recognition result contains the lane information and the distance information, and the vehicle traveling information contains turn signal information;
    the determining, according to the driving video recognition result and the vehicle traveling information, whether the dangerous driving behavior exists comprises:
    when the lane information changes, the distance information indicates that the distance to the vehicle ahead is less than a first threshold, and the turn signal information indicates that the turn signal is not on, determining that a first dangerous driving behavior exists, the first dangerous driving behavior being lane-departure driving behavior.
  5. The method according to claim 2, characterized in that the driving video recognition result contains the distance information and the relative speed information, and the vehicle traveling information contains the current speed;
    the determining, according to the driving video recognition result and the vehicle traveling information, whether the dangerous driving behavior exists comprises:
    determining the safe braking distance corresponding to the current speed; when the distance information indicates that the distance to the vehicle ahead is less than the safe braking distance, and the relative speed information indicates that the current speed is greater than the speed of the vehicle ahead, determining that a second dangerous driving behavior exists, the second dangerous driving behavior being driving with an excessively small following distance.
  6. The method according to claim 2, characterized in that the driving video recognition result contains the distance information, and the vehicle traveling information contains the current acceleration;
    the determining, according to the driving video recognition result and the vehicle traveling information, whether the dangerous driving behavior exists comprises:
    when the distance information indicates that the distance to the vehicle ahead is greater than a second threshold, the distance information indicates that the distance to the vehicle behind is less than a third threshold, and the current acceleration indicates that the vehicle is braking, determining that a third dangerous driving behavior exists, the third dangerous driving behavior being emergency-braking driving behavior.
  7. The method according to any one of claims 4 to 6, characterized in that after the determining, according to the driving video recognition result and the vehicle traveling information, whether the dangerous driving behavior exists, the method further comprises:
    determining the danger level of the dangerous driving behavior;
    calculating update data according to the occurrence frequency of the dangerous driving behavior, the danger level, and the weight corresponding to the dangerous driving behavior, the update data being used to update the driving behavior data corresponding to the vehicle.
  8. The method according to claim 7, characterized in that the determining the danger level of the dangerous driving behavior comprises:
    when the dangerous driving behavior is the first dangerous driving behavior, determining the danger level according to the current speed and the traffic flow;
    or,
    when the dangerous driving behavior is the second dangerous driving behavior or the third dangerous driving behavior, determining the danger level according to the current speed and the current distance;
    wherein the current speed is positively correlated with the danger level, the traffic flow is positively correlated with the danger level, and the current distance is negatively correlated with the danger level.
  9. A driving behavior determining apparatus, characterized in that the apparatus comprises:
    a first acquisition module, configured to acquire a driving video, the driving video containing a road image and vehicle images of vehicles ahead and behind;
    a second acquisition module, configured to acquire vehicle traveling information, the vehicle traveling information being used to characterize the traveling state of the vehicle;
    a first determination module, configured to determine, according to the driving video and the vehicle traveling information at the same moment, whether a dangerous driving behavior exists, the dangerous driving behavior being a driving behavior that carries an accident risk.
  10. The apparatus according to claim 9, characterized in that the first determination module comprises:
    a recognition unit, configured to recognize the driving video to obtain a driving video recognition result, the driving video recognition result containing at least one of lane information, distance information, and relative speed information;
    a first determination unit, configured to determine, according to the driving video recognition result and the vehicle traveling information, whether the dangerous driving behavior exists;
    wherein the lane information is used to characterize the lane in which the vehicle is located, the distance information is used to characterize the distance to the vehicle ahead and/or the vehicle behind, and the relative speed information is used to characterize the relative speed with respect to the vehicle ahead and/or the relative speed with respect to the vehicle behind.
  11. The apparatus according to claim 10, characterized in that the recognition unit is further configured to:
    recognize the lane lines contained in the road image; determine the lane information according to the lane lines;
    and/or,
    recognize the image position of the vehicle image in the driving video; determine the distance information according to the image position and a preset distance scale, the preset distance scale being used to characterize the mapping between different image positions in the driving video and actual distances;
    and/or,
    recognize the image position of the vehicle image in the driving video; determine the relative speed information according to the change in the image position.
  12. The apparatus according to claim 10, characterized in that the driving video recognition result contains the lane information and the distance information, and the vehicle traveling information contains turn signal information;
    the first determination unit is configured to:
    when the lane information changes, the distance information indicates that the distance to the vehicle ahead is less than a first threshold, and the turn signal information indicates that the turn signal is not on, determine that a first dangerous driving behavior exists, the first dangerous driving behavior being lane-departure driving behavior.
  13. The apparatus according to claim 10, characterized in that the driving video recognition result contains the distance information and the relative speed information, and the vehicle traveling information contains the current speed;
    the first determination unit is configured to:
    determine the safe braking distance corresponding to the current speed; when the distance to the vehicle ahead indicated by the distance information is less than the safe braking distance, and the relative speed information indicates that the current speed is greater than the speed of the vehicle ahead, determine that a second dangerous driving behavior exists, the second dangerous driving behavior being driving with an excessively small following distance.
  14. The apparatus according to claim 10, characterized in that the driving video recognition result contains the distance information, and the vehicle traveling information contains the current acceleration;
    the first determination unit is configured to:
    when the distance information indicates that the distance to the vehicle ahead is greater than a second threshold, the distance information indicates that the distance to the vehicle behind is less than a third threshold, and the current acceleration indicates that the vehicle is braking, determine that a third dangerous driving behavior exists, the third dangerous driving behavior being emergency-braking driving behavior.
  15. The apparatus according to any one of claims 12 to 14, characterized in that the apparatus further comprises:
    a second determination module, configured to determine the danger level of the dangerous driving behavior;
    a calculation module, configured to calculate update data according to the occurrence frequency of the dangerous driving behavior, the danger level, and the weight corresponding to the dangerous driving behavior, the update data being used to update the driving behavior data corresponding to the vehicle.
  16. The apparatus according to claim 15, characterized in that the second determination module comprises:
    a second determination unit, configured to, when the dangerous driving behavior is the first dangerous driving behavior, determine the danger level according to the current speed and the traffic flow;
    or,
    a third determination unit, configured to, when the dangerous driving behavior is the second dangerous driving behavior or the third dangerous driving behavior, determine the danger level according to the current speed and the current distance;
    wherein the current speed is positively correlated with the danger level, the traffic flow is positively correlated with the danger level, and the current distance is negatively correlated with the danger level.
  17. A driving behavior determining device, characterized in that the device comprises a processor and a memory, the memory storing at least one instruction, the at least one instruction being loaded and executed by the processor to implement the driving behavior determining method according to any one of claims 1 to 9.
  18. A computer-readable storage medium, characterized in that the storage medium stores at least one instruction, the at least one instruction being loaded and executed by a processor to implement the driving behavior determining method according to any one of claims 1 to 9.
PCT/CN2018/075954 2017-03-06 2018-02-09 Driving behavior determining method, apparatus, device and storage medium WO2018161774A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2019528487A JP7072763B2 (ja) 2017-03-06 2018-02-09 運転行為の確定方法、装置、機器及び記憶媒体
EP18764635.1A EP3561780A4 (en) 2017-03-06 2018-02-09 METHOD FOR DETERMINING THE DRIVING BEHAVIOR, DEVICE, EQUIPMENT AND STORAGE MEDIUM
KR1020197025758A KR20190115040A (ko) 2017-03-06 2018-02-09 운전 거동 결정 방법, 디바이스, 장비 및 저장 매체
US16/428,332 US10913461B2 (en) 2017-03-06 2019-05-31 Driving behavior determining method, apparatus, and device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710127810.7 2017-03-06
CN201710127810.7A 2017-03-06 2017-03-06 Driving behavior determining method and apparatus

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/428,332 Continuation US10913461B2 (en) 2017-03-06 2019-05-31 Driving behavior determining method, apparatus, and device, and storage medium

Publications (1)

Publication Number Publication Date
WO2018161774A1 true WO2018161774A1 (zh) 2018-09-13

Family

ID=62831524

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/075954 WO2018161774A1 (zh) 2017-03-06 2018-02-09 驾驶行为确定方法、装置、设备及存储介质

Country Status (7)

Country Link
US (1) US10913461B2 (zh)
EP (1) EP3561780A4 (zh)
JP (1) JP7072763B2 (zh)
KR (1) KR20190115040A (zh)
CN (1) CN108288312A (zh)
TW (1) TWI670191B (zh)
WO (1) WO2018161774A1 (zh)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111832376A * 2019-07-18 2020-10-27 北京骑胜科技有限公司 Vehicle wrong-way driving detection method and apparatus, electronic device, and storage medium
CN113500993A * 2021-06-21 2021-10-15 上汽通用五菱汽车股份有限公司 Calibration method for anti-collision function parameters, vehicle, and readable storage medium
CN113611007A * 2021-08-05 2021-11-05 北京百姓车服网络科技有限公司 Data processing method and data acquisition system
CN113619609A * 2021-09-18 2021-11-09 北京声智科技有限公司 Danger prompting method and apparatus, electronic device, and computer-readable medium
CN115376335A * 2022-10-25 2022-11-22 创辉达设计股份有限公司 Multi-objective optimization control method and apparatus for urban road weaving areas
EP3973695A4 (en) * 2019-05-23 2023-08-02 Streetscope, Inc. APPARATUS AND METHOD FOR PROCESSING VEHICLE SIGNALS TO COMPUTE A BEHAVIORAL HAZARD MEASURE
CN116644585A * 2023-05-30 2023-08-25 清华大学 Method and apparatus for generating dangerous-state scenario data based on the danger degree of a target vehicle

Families Citing this family (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9754338B2 (en) 2015-10-09 2017-09-05 Gt Gettaxi Limited System to facilitate a correct identification of a service provider
US10636108B2 (en) 2016-09-30 2020-04-28 Lyft, Inc. Identifying matched requestors and providers
WO2018229550A1 (en) * 2017-06-16 2018-12-20 Nauto Global Limited System and method for adverse vehicle event determination
WO2019189908A1 * 2018-03-30 2019-10-03 パナソニックIpマネジメント株式会社 Driving assistance device, vehicle, improper driving detection system, and server device
CN108860162B (zh) * 2018-07-18 2022-02-15 平安科技(深圳)有限公司 Electronic apparatus, safety early-warning method based on user driving behavior, and storage medium
CN108860158B (zh) * 2018-07-27 2021-08-24 平安科技(深圳)有限公司 Vehicle, vehicle-insurance anti-fraud early warning, and storage medium
JP7155750B2 (ja) * 2018-08-23 2022-10-19 トヨタ自動車株式会社 Information system and program
JP7070278B2 (ja) * 2018-09-20 2022-05-18 株式会社デンソー Drive recorder and situation information management system
CN109447127A (zh) * 2018-09-29 2019-03-08 深圳市元征科技股份有限公司 Data processing method and apparatus
JP7438126B2 (ja) * 2018-10-22 2024-02-26 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ Information processing method and information processing system
JP7163152B2 (ja) * 2018-11-28 2022-10-31 京セラ株式会社 Image processing device, imaging device, movable body, and image processing method
CN109801490A (zh) * 2018-12-10 2019-05-24 百度在线网络技术(北京)有限公司 Traveling data processing method, apparatus, device, and computer-readable storage medium
CN109887124B (zh) * 2019-01-07 2022-05-13 平安科技(深圳)有限公司 Vehicle motion data processing method and apparatus, computer device, and storage medium
CN111526311B (zh) * 2019-01-17 2023-02-28 北京嘀嘀无限科技发展有限公司 Method, system, computer device, and storage medium for judging driving user behavior
JP2020160481A (ja) * 2019-03-25 2020-10-01 株式会社野村総合研究所 Abnormality determination device
US11157784B2 * 2019-05-08 2021-10-26 GM Global Technology Operations LLC Explainable learning system and methods for autonomous driving
US11910452B2 2019-05-28 2024-02-20 Lyft, Inc. Automatically connecting wireless computing devices based on recurring wireless signal detections
CN110766826B (zh) * 2019-09-10 2023-07-25 中国平安财产保险股份有限公司 Driving behavior analysis method
CN110784845B (zh) * 2019-09-12 2021-07-16 腾讯科技(深圳)有限公司 Internet-of-Vehicles-based attribute data balancing method and apparatus, electronic device, and storage medium
CN110737736B (zh) * 2019-09-16 2024-02-09 连尚(新昌)网络科技有限公司 Method and device for obtaining vehicle-condition map information
EP3819891A1 * 2019-11-07 2021-05-12 Ningbo Geely Automobile Research & Development Co. Ltd. Threat mitigation for vehicles
JP2021096530A (ja) * 2019-12-13 トヨタ自動車株式会社 Driving assistance device, driving assistance program, and driving assistance system
CN111191949B (zh) * 2020-01-03 2022-06-17 北京三快在线科技有限公司 Method, apparatus, and electronic device for identifying abnormal driving behavior of online ride-hailing vehicles
CN111260882A (zh) * 2020-02-04 2020-06-09 上海博泰悦臻电子设备制造有限公司 Driving behavior reminding method, system, and server
CN111326019A (zh) * 2020-02-27 2020-06-23 腾讯科技(深圳)有限公司 Driving risk early-warning method and apparatus, computer medium, and electronic device
JP2021152701A (ja) * 2020-03-24 2021-09-30 株式会社東芝 Driving characteristic evaluation device and driving characteristic evaluation program
US11887386B1 * 2020-03-30 2024-01-30 Lyft, Inc. Utilizing an intelligent in-cabin media capture device in conjunction with a transportation matching system
JP7371562B2 (ja) * 2020-04-08 2023-10-31 トヨタ自動車株式会社 Information processing device, information processing system, and program
KR102325998B1 * 2020-04-08 2021-11-15 주식회사 별따러가자 Method and system for analyzing driving records, and driving record device
US20230148350A1 * 2020-04-08 2023-05-11 Star Pickers. Inc. Method and system for analyzing operation record, and operation record device
CN111619561B (zh) * 2020-05-22 2022-01-14 安徽江淮汽车集团股份有限公司 Escape and lifesaving method, apparatus, device, and storage medium for autonomous vehicles
CN111832901A (zh) * 2020-06-17 2020-10-27 北京嘀嘀无限科技发展有限公司 Online ride-hailing monitoring method, apparatus, server, and storage medium
US11644835B2 2020-07-29 2023-05-09 Toyota Research Institute, Inc. Game-theoretic planning for risk-aware interactive agents
CN111951548B (zh) * 2020-07-30 2023-09-08 腾讯科技(深圳)有限公司 Vehicle driving risk determining method, apparatus, system, and medium
CN112258837B (zh) * 2020-10-19 2024-04-12 腾讯科技(深圳)有限公司 Vehicle early-warning method, related apparatus, device, and storage medium
CN112810617A (zh) * 2021-01-04 2021-05-18 宝能(西安)汽车研究院有限公司 Control method, control apparatus, and system for a driving behavior analysis system
CN113112866B (zh) * 2021-04-14 2022-06-03 深圳市旗扬特种装备技术工程有限公司 Intelligent traffic early-warning method and intelligent traffic early-warning system
CN113071505B (zh) * 2021-04-16 2024-02-13 阿波罗智联(北京)科技有限公司 Method, apparatus, and device for determining driving behavior habits and controlling vehicle travel
JP7447870B2 (ja) * 2021-06-04 2024-03-12 トヨタ自動車株式会社 Information processing server, processing method for the information processing server, and program
CN114023109B (zh) * 2021-11-03 2023-06-30 中国矿业大学 Early-warning system for preventing rear-end collisions of mobile tank trucks
TWI816233B (zh) * 2021-11-19 2023-09-21 公信電子股份有限公司 Vehicle door control system and method thereof
CN114103985A (zh) * 2021-11-26 2022-03-01 国汽智控(北京)科技有限公司 Obstacle-based prompting method, apparatus, and device
CN114419888A (zh) * 2022-01-21 2022-04-29 北京汇通天下物联科技有限公司 Safety early-warning method, apparatus, device, and storage medium for freight vehicles
CN114973669B (zh) * 2022-05-23 2023-09-26 江苏智行未来汽车研究院有限公司 Highway hazardous environment early-warning method, apparatus, and medium based on vehicle-road cooperation
KR20240052218 2022-10-14 강성호 Meat seasoning injection system
KR20240052220 2022-10-14 강성호 Stick-type seasoning injection method for meat and cooking method using the same
CN115966100B (zh) * 2022-12-19 2023-11-03 深圳市昂星科技有限公司 Driving safety control method and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101161524A * 2006-10-12 2008-04-16 财团法人车辆研究测试中心 Method and apparatus for detecting vehicle distance
CN103043021A * 2013-01-04 2013-04-17 浙江海康集团有限公司 Omni-directional automobile active safety system integrating five-channel video detection
CN104077819A * 2014-06-17 2014-10-01 深圳前向启创数码技术有限公司 Remote monitoring method and system based on driving safety
CN104260723A * 2014-09-29 2015-01-07 长安大学 Device and method for tracking and predicting the motion state of a rear vehicle
DE102013021866A1 * 2013-12-20 2015-06-25 Audi Ag Method for providing a function of a motor vehicle

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4075172B2 1998-12-24 2008-04-16 マツダ株式会社 Vehicle obstacle warning device
WO2009013815A1 * 2007-07-24 2009-01-29 Nissan Motor Co., Ltd. Driving assistance device for a vehicle, and vehicle equipped with the driving assistance device
JP4995046B2 * 2007-11-21 2012-08-08 株式会社日立製作所 Automobile insurance premium setting system
WO2013008996A1 * 2011-07-14 2013-01-17 에스케이플래닛 주식회사 Apparatus and method for a telematics service
US20140379385A1 (en) * 2013-06-24 2014-12-25 Elwha, Llc System and method for monitoring driving to determine an insurance property
US20140379384A1 (en) * 2013-06-24 2014-12-25 Elwha, Llc System and method for monitoring driving to determine an insurance property
JP2015184968A (ja) * 2014-03-25 2015-10-22 株式会社日立製作所 Driving characteristics diagnosis method
US10475127B1 * 2014-07-21 2019-11-12 State Farm Mutual Automobile Insurance Company Methods of providing insurance savings based upon telematics and insurance incentives
JP6486640B2 * 2014-10-09 2019-03-20 株式会社日立製作所 Driving characteristics diagnosis device, driving characteristics diagnosis system, driving characteristics diagnosis method, information output device, and information output method
JP6480143B2 * 2014-10-09 2019-03-06 株式会社日立製作所 Driving characteristics diagnosis device, driving characteristics diagnosis system, and driving characteristics diagnosis method
US9836963B1 * 2015-01-20 2017-12-05 State Farm Mutual Automobile Insurance Company Determining corrective actions based upon broadcast of telematics data originating from another vehicle
JP2016197378A (ja) 2015-04-06 2016-11-24 株式会社リコー Information providing system and providing method for driving characteristic evaluation
EP3272610B1 * 2015-04-21 2019-07-17 Panasonic Intellectual Property Management Co., Ltd. Information processing system, information processing method, and program
CN106447496A (zh) * 2015-08-06 2017-02-22 平安科技(深圳)有限公司 Automatic vehicle insurance policy generation method, vehicle, electronic device, and insurance server
US9996756B2 * 2015-08-31 2018-06-12 Lytx, Inc. Detecting risky driving with machine vision
CN105206052B (zh) * 2015-09-21 2018-05-11 张力 Driving behavior analysis method and device
CN105261225A (zh) * 2015-09-30 2016-01-20 肖建辉 Monitoring system for improving driving behavior habits
CN105894610A (zh) * 2015-11-11 2016-08-24 乐卡汽车智能科技(北京)有限公司 Data processing method, apparatus, and system
CN105698874B (zh) * 2016-04-12 2018-02-02 吉林大学 Device for detecting sudden changes in vehicle traveling state
CN105956625B (zh) * 2016-05-11 2019-07-05 清华大学深圳研究生院 Vehicle motion state recognition method and system based on a given physical model

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101161524A * 2006-10-12 2008-04-16 财团法人车辆研究测试中心 Method and apparatus for detecting vehicle distance
CN103043021A * 2013-01-04 2013-04-17 浙江海康集团有限公司 Omni-directional automobile active safety system integrating five-channel video detection
DE102013021866A1 * 2013-12-20 2015-06-25 Audi Ag Method for providing a function of a motor vehicle
CN104077819A * 2014-06-17 2014-10-01 深圳前向启创数码技术有限公司 Remote monitoring method and system based on driving safety
CN104260723A * 2014-09-29 2015-01-07 长安大学 Device and method for tracking and predicting the motion state of a rear vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3561780A4 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3973695A4 (en) * 2019-05-23 2023-08-02 Streetscope, Inc. DEVICE AND METHOD FOR PROCESSING VEHICLE SIGNALS TO CALCULATE A BEHAVIORAL RISK MEASURE
CN111832376A * 2019-07-18 2020-10-27 北京骑胜科技有限公司 Vehicle wrong-way driving detection method and apparatus, electronic device, and storage medium
CN113500993A * 2021-06-21 2021-10-15 上汽通用五菱汽车股份有限公司 Calibration method for anti-collision function parameters, vehicle, and readable storage medium
CN113500993B * 2021-06-21 2022-09-20 上汽通用五菱汽车股份有限公司 Calibration method for anti-collision function parameters, vehicle, and readable storage medium
CN113611007A * 2021-08-05 2021-11-05 北京百姓车服网络科技有限公司 Data processing method and data acquisition system
CN113611007B * 2021-08-05 2023-04-18 北京百姓车服网络科技有限公司 Data processing method and data acquisition system
CN113619609A * 2021-09-18 2021-11-09 北京声智科技有限公司 Danger prompting method and apparatus, electronic device, and computer-readable medium
CN115376335A * 2022-10-25 2022-11-22 创辉达设计股份有限公司 Multi-objective optimization control method and apparatus for urban road weaving areas
CN116644585A * 2023-05-30 2023-08-25 清华大学 Method and apparatus for generating dangerous-state scenario data based on the danger degree of a target vehicle
CN116644585B * 2023-05-30 2024-01-09 清华大学 Method and apparatus for generating dangerous-state scenario data based on the danger degree of a target vehicle

Also Published As

Publication number Publication date
EP3561780A4 (en) 2020-02-26
KR20190115040A (ko) 2019-10-10
TWI670191B (zh) 2019-09-01
TW201832962A (zh) 2018-09-16
JP2020513617A (ja) 2020-05-14
EP3561780A1 (en) 2019-10-30
US20190283763A1 (en) 2019-09-19
US10913461B2 (en) 2021-02-09
CN108288312A (zh) 2018-07-17
JP7072763B2 (ja) 2022-05-23

Similar Documents

Publication Publication Date Title
WO2018161774A1 (zh) 2018-09-13 Driving behavior determining method, apparatus, device and storage medium
US10972975B2 (en) 2021-04-06 Electronic device for transmitting communication signal related to pedestrian safety and method of operating same
US10470131B2 (en) 2019-11-05 Electronic device for controlling communication circuit based on identification information received from external device and operation method thereof
Engelbrecht et al. 2015 Survey of smartphone‐based sensing in vehicles for intelligent transportation system applications
CN108665678B (zh) 2020-08-18 Rescue requesting method and apparatus
US10234867B2 (en) 2019-03-19 Information processing device, vehicle-mounted device, and information processing method
CN105788321B (zh) 2020-01-14 Vehicle communication method, apparatus, and system
CN107749194B (zh) 2020-11-13 Lane change assistance method and mobile terminal
US8907772B1 (en) 2014-12-09 System and method for automatic unsafe driving determination and notification
KR20190032090A (ko) 2019-03-27 Electronic device for transmitting a relay message to an external vehicle and method of operating the same
US20150061875A1 (en) 2015-03-05 Method and system for detecting conditions of drivers, and electronic apparatus thereof
CN107826109B (zh) 2020-04-10 Lane keeping method and apparatus
CN110775059B (zh) 2021-08-27 Automatic car-following method based on artificial intelligence and related apparatus
CN112258837B (zh) 2024-04-12 Vehicle early-warning method, related apparatus, device, and storage medium
US20230022123A1 (en) 2023-01-26 Autonomous driving method and apparatus
CN109064746A (zh) 2018-12-21 Information processing method, terminal, and computer-readable storage medium
CN108091159A (zh) 2018-05-29 Method for avoiding congested routes and mobile terminal
CN111885500A (zh) 2020-11-03 Road condition reminding method and apparatus based on narrowband Internet of Things, and storage medium
CN115762138A (zh) 2023-03-07 Road surface condition reminding method and apparatus, electronic device, and storage medium
US20140031061A1 (en) 2014-01-30 Systems And Methods For Monitoring Device And Vehicle
CN105151173A (zh) 2015-12-16 Bicycle light control method and apparatus
CN109855643B (zh) 2021-03-23 Lane guidance method and navigation device
CN109685850B (zh) 2021-02-12 Lateral positioning method and vehicle-mounted device
CN113299098A (zh) 2021-08-24 Vehicle guidance method and apparatus for traffic intersections
CN112298184B (zh) 2021-10-15 Driving switching method, apparatus, device, and storage medium based on artificial intelligence

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18764635

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019528487

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2018764635

Country of ref document: EP

Effective date: 20190723

ENP Entry into the national phase

Ref document number: 20197025758

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE