WO2021195955A1 - Method and device for detecting the complexity of a vehicle driving scene - Google Patents

Method and device for detecting the complexity of a vehicle driving scene

Info

Publication number
WO2021195955A1
WO2021195955A1 (PCT/CN2020/082412)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
complexity
static
host vehicle
target vehicle
Prior art date
Application number
PCT/CN2020/082412
Other languages
English (en)
French (fr)
Inventor
陈伟
要志良
余荣杰
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to PCT/CN2020/082412 (WO2021195955A1)
Priority to EP20928784.6A (EP4120180A4)
Priority to CN202080005175.5A (CN112740295B)
Publication of WO2021195955A1
Priority to US17/956,087 (US20230050063A1)

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06Road conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/14Adaptive cruise control
    • B60W30/16Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/04Traffic conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
    • B60W40/105Speed
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/005Handover processes
    • B60W60/0059Estimation of the risk associated with autonomous or manual driving, e.g. situation too complex, sensor failure or driver incapacity
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/052Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00Input parameters relating to infrastructure
    • B60W2552/05Type of road, e.g. motorways, local streets, paved or unpaved roads
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00Input parameters relating to infrastructure
    • B60W2552/10Number of lanes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/20Static objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/40Dynamic objects, e.g. animals, windblown objects
    • B60W2554/408Traffic behavior, e.g. swarm
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/80Spatial relation or speed relative to objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/80Spatial relation or speed relative to objects
    • B60W2554/803Relative lateral speed
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/80Spatial relation or speed relative to objects
    • B60W2554/804Relative longitudinal speed
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/80Spatial relation or speed relative to objects
    • B60W2554/805Azimuth angle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2555/00Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
    • B60W2555/20Ambient conditions, e.g. wind or rain

Definitions

  • This application relates to the field of automatic driving technology, and in particular to a method and device for detecting the complexity of a vehicle driving scene.
  • the following methods can usually be used to determine the complexity of a driving scene.
  • the above-mentioned static factors include: the number of lanes, the form of intermediate separation, and the width of the lane where the vehicle is located.
  • the complexity corresponding to the actual values of the above-mentioned static factors is determined.
  • Based on the complexity corresponding to each travel speed interval, the complexity corresponding to the travel speed of the host vehicle is determined. Then, the determined complexities are added together to obtain the complexity of the driving scene where the vehicle is currently located.
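The related-technology lookup described above can be sketched as follows. All factor names, table entries, and speed intervals below are illustrative assumptions, not values from this application:

```python
# Illustrative sketch of the prior-art approach: map each static factor's
# actual value to a preset complexity, map the host vehicle's travel speed
# to the complexity of its speed interval, and sum everything.

STATIC_COMPLEXITY = {
    ("num_lanes", 2): 0.2,
    ("separation", "none"): 0.5,
    ("lane_width", "narrow"): 0.4,
}

# (lower bound inclusive, upper bound exclusive) in m/s -> complexity
SPEED_INTERVALS = [((0, 10), 0.1), ((10, 20), 0.3), ((20, 40), 0.6)]

def prior_art_complexity(static_values, speed):
    # Sum the per-factor complexities, then add the speed-interval complexity.
    total = sum(STATIC_COMPLEXITY[(factor, value)]
                for factor, value in static_values.items())
    for (lo, hi), c in SPEED_INTERVALS:
        if lo <= speed < hi:
            total += c
            break
    return total
```

As the next paragraphs note, a score built only from these factors ignores the surrounding vehicles entirely.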
  • the existing technology has at least the following problems:
  • the driving conditions of the remaining vehicles within a certain range around the vehicle affect the driving safety of the vehicle. Therefore, the complexity of the driving scene determined solely by the above factors cannot fully reflect the actual complexity of the current driving scene of the vehicle, which poses a certain safety hazard to the automatic driving of the vehicle.
  • the embodiments of the present application provide a method and device for detecting the complexity of a vehicle driving scene, so as to overcome the problem that the complexity determined by related technologies cannot fully reflect the actual complexity of the current driving scene of the vehicle.
  • a method for detecting the complexity of a vehicle driving scene includes:
  • the target vehicle is a vehicle that meets a preset distance condition with respect to the host vehicle.
  • the dynamic complexity of the driving scene where the host vehicle is located is acquired. Determine the static information of each static factor in the current driving scene of the vehicle. Based on the static information of the static factors, the static complexity of the driving scene where the own vehicle is located is determined. Based on the dynamic complexity and the static complexity, the comprehensive complexity of the driving scene where the own vehicle is located is determined.
  • the solution shown in the embodiments of this application can detect the driving speed of the target vehicles within a certain range of the vehicle through the on-board radar, and calculate the dynamic complexity of the driving scene where the host vehicle is located according to the driving speed of the target vehicle and the driving speed of the host vehicle.
  • the dynamic complexity is used to characterize the degree of influence of the driving situation of the target vehicle in the driving scene of the host vehicle on the driving of the host vehicle.
  • the static information of each static factor in the driving scene of the vehicle can also be obtained through the on-board GPS and high-definition map. For example, if the static factor is the width of the lane, the static information can be the actual width of the lane.
  • the static complexity of the driving scene of the vehicle is obtained.
  • the static complexity is used to characterize the degree of influence of static factors in the driving scene of the vehicle on the driving of the vehicle.
  • the above dynamic complexity and static complexity can be combined to calculate the comprehensive complexity of the driving scene where the vehicle is located. This solution combines static factors and the driving situation of the target vehicles within a certain range to determine the comprehensive complexity of the driving scene where the vehicle is located, which can more comprehensively reflect the actual complexity of the current driving scene of the vehicle.
  • the acquiring the dynamic complexity of the driving scene in which the own vehicle is located based on the running speed of the own vehicle and the running speed of the target vehicle includes:
  • For each target vehicle, obtain the complexity corresponding to the target vehicle based on the traveling speed of the host vehicle and the traveling speed of the target vehicle;
  • the complexity corresponding to each target vehicle is added to obtain the dynamic complexity of the driving scene in which the own vehicle is located.
  • when determining the dynamic complexity, each target vehicle can be taken as the research object, the complexity corresponding to each target vehicle can be calculated separately, and the complexities corresponding to the target vehicles can then be synthesized to obtain the dynamic complexity of the driving scene where the host vehicle is located.
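The per-target aggregation above can be sketched as follows. The per-vehicle complexity function passed in is a placeholder, since the text develops its exact speed-based form only in later paragraphs:

```python
# Sketch of the dynamic-complexity aggregation: compute a complexity for each
# target vehicle from the host speed and that target's speed, then sum the
# per-vehicle complexities. `per_target` stands in for the per-vehicle rule.

def dynamic_complexity(host_speed, target_speeds, per_target):
    return sum(per_target(host_speed, t) for t in target_speeds)

# Example with a made-up per-vehicle rule (relative-speed magnitude / 10):
result = dynamic_complexity(20.0, [25.0, 15.0],
                            lambda h, t: abs(t - h) / 10.0)
```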
  • the obtaining the complexity corresponding to the target vehicle based on the traveling speed of the host vehicle and the traveling speed of the target vehicle includes:
  • the complexity corresponding to the target vehicle is determined.
  • the relative travel speed of the host vehicle and the target vehicle may be the travel speed of the target vehicle minus the travel speed of the host vehicle.
  • the calculation can be performed as follows: θij ∈ [0°, 90°]
  • the relative lateral speed of the host vehicle and the target vehicle, that is, the component of the traveling speed of the target vehicle perpendicular to the traveling direction of the host vehicle minus the component of the traveling speed of the host vehicle perpendicular to the traveling direction of the host vehicle.
  • the relative longitudinal speed of the host vehicle and the target vehicle, that is, the component of the traveling speed of the target vehicle in the traveling direction of the host vehicle minus the component of the traveling speed of the host vehicle in the traveling direction of the host vehicle.
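The decomposition above can be sketched in a frame aligned with the host vehicle's traveling direction. The text only states that the included angle θij lies in [0°, 90°]; defining it here as the angle between the relative velocity and the longitudinal axis is an assumption for illustration, as are the function names:

```python
import math

# Sketch of the relative-speed decomposition: x = longitudinal (host's
# traveling direction), y = lateral. The target's heading is measured from
# the host's traveling direction, in radians; the host's own speed lies
# entirely along x, so its lateral component is zero.

def relative_components(v_host, v_target, target_heading):
    rel_longitudinal = v_target * math.cos(target_heading) - v_host
    rel_lateral = v_target * math.sin(target_heading)
    return rel_longitudinal, rel_lateral

def included_angle_deg(rel_longitudinal, rel_lateral):
    # Angle between the relative velocity and the longitudinal axis;
    # using absolute values keeps it within [0 deg, 90 deg] as stated above.
    return math.degrees(math.atan2(abs(rel_lateral), abs(rel_longitudinal)))
```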
  • the obtaining the complexity corresponding to the target vehicle based on the included angle θij includes:
  • the initial complexity f(θij) corresponding to the target vehicle is corrected to obtain the complexity corresponding to the target vehicle.
  • the solution shown in the embodiment of the present application can calculate the complexity of each target vehicle more accurately based on the distance between the host vehicle and the target vehicle, and can correct the initial complexity based on the distance between the target vehicle and the host vehicle and the relative speed between them.
  • the initial complexity f(θij) corresponding to the target vehicle is modified to obtain the complexity corresponding to the target vehicle, including:
  • Based on the traveling speed, maximum deceleration, and minimum deceleration of the host vehicle, the traveling speed and maximum deceleration of the target vehicle, and the preset driver reaction time, the first safety distance between the host vehicle and the target vehicle is obtained.
  • the initial complexity f(θij) corresponding to the target vehicle, the longitudinal correction coefficient f1(q), and the lateral correction coefficient f1(p) are multiplied to obtain the complexity corresponding to the target vehicle.
  • the solution shown in the embodiment of the present application considers both the lateral and longitudinal directions when correcting the initial complexity.
  • the lateral direction is perpendicular to the traveling direction of the host vehicle;
  • the longitudinal direction is the traveling direction of the host vehicle.
  • the correction can be made based on the lateral relative speed between the host vehicle and the target vehicle.
  • the correction can be made based on the longitudinal distance between the own vehicle and the target vehicle.
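The multiplicative correction described above can be sketched as follows. The exact forms of the longitudinal coefficient f1(q) and the lateral coefficient f1(p) are not given in this text, so the forms below are illustrative assumptions only:

```python
import math

# Sketch of the correction step: scale the initial complexity f(theta) by a
# longitudinal coefficient (driven by how the longitudinal gap compares to
# the safety distance) and a lateral coefficient (driven by the lateral
# closing speed). Both coefficient forms are assumed for illustration.

def corrected_complexity(initial, longitudinal_gap, safe_gap,
                         lateral_closing_speed):
    # q in [0, 1]: how far inside the safety distance the target sits
    # (0 at or beyond the safe gap, approaching 1 as the gap closes).
    q = max(0.0, 1.0 - longitudinal_gap / safe_gap)
    f1_q = 1.0 + q                     # closer than safe -> larger weight
    # p >= 0: lateral speed of approach toward the host's path.
    p = max(0.0, lateral_closing_speed)
    f1_p = 1.0 + math.tanh(p)          # faster lateral approach -> larger
    return initial * f1_q * f1_p
```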
  • the obtaining the static complexity of the driving scene of the own vehicle based on the static information of the various static factors includes:
  • the static complexity of the driving scene where the own vehicle is located is obtained.
  • the value of the static factor may be determined first according to the static information of each static factor.
  • For example, if the static factor is lane width, the corresponding static information can be 3m, 3.5m, 4m, 4.5m, etc.
  • the value of the static factor can then be one of two kinds: wide lane (lane width ≥ 3.5m) or narrow lane (lane width < 3.5m).
  • the complexity corresponding to the multiple static factors is obtained.
  • the static complexity of the driving scene of the vehicle can be obtained.
  • the determining the static complexity of the driving scene in which the own vehicle is located based on the respective complexity corresponding to the multiple static factors includes:
  • the corresponding weighted complexity of the multiple static factors are added together to obtain the static complexity of the driving scene where the own vehicle is located.
  • the solution shown in the embodiment of the present application takes into account the different impacts of various static factors on the driving of the vehicle.
  • when determining the static complexity of the driving scene of the vehicle based on the complexity corresponding to each static factor, the complexity corresponding to each static factor in the scene can first be multiplied by the weight corresponding to that static factor to obtain the weighted complexity corresponding to the static factor, and then the weighted complexities corresponding to the static factors can be added together to obtain the static complexity of the driving scene of the vehicle.
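The weighted aggregation above is a straightforward weighted sum; the factor names, complexities, and weights in this sketch are illustrative placeholders:

```python
# Sketch of the weighted static-complexity aggregation: multiply each static
# factor's complexity by its weight, then sum the weighted values.

def static_complexity(factor_complexity, factor_weight):
    return sum(c * factor_weight[f] for f, c in factor_complexity.items())

example = static_complexity(
    {"lane_width": 0.4, "road_type": 0.6, "num_lanes": 0.2},
    {"lane_width": 0.5, "road_type": 0.3, "num_lanes": 0.2},
)
```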
  • the method further includes:
  • For the first value of the first static factor, obtain N sample images corresponding to the first value, where N is an integer greater than 1;
  • the ratio of M to N is obtained as the complexity corresponding to the first value.
  • the complexity corresponding to the value can be determined in advance, and the value of the static factor and the corresponding complexity can be stored correspondingly.
  • N sample images corresponding to the value can be obtained.
  • image recognition is performed on the N sample images, and the predicted value corresponding to each sample image is obtained.
  • Count the number M of sample images whose corresponding predicted value is different from the corresponding calibrated true value.
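The calibration steps above reduce to a misrecognition rate. In this sketch, `recognize` is a stand-in for the image-recognition step, which is not specified here:

```python
# Sketch of the sample-based calibration: for one value of a static factor,
# run image recognition on the N sample images and take the fraction M/N
# whose predicted value differs from the calibrated true value as that
# value's complexity (harder to recognize -> higher complexity).

def value_complexity(sample_images, true_value, recognize):
    n = len(sample_images)  # N > 1 per the text above
    m = sum(1 for img in sample_images if recognize(img) != true_value)
    return m / n
```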
  • the static factors include at least one of road type, number of lanes in the same direction, lane width, central separation form, motor/non-motor vehicle separation form, traffic signs, and traffic signal lights.
  • the static information corresponding to the road type can be urban expressway, trunk road, secondary trunk road, or branch road, and the corresponding value can also be urban expressway, trunk road, secondary trunk road, or branch road.
  • the static information corresponding to the number of lanes in the same direction can be 1, 2, 3, 4, 5, 6, etc.
  • the corresponding value can be 1, 2, 3, 4, and greater than or equal to 5.
  • the static information corresponding to the lane width can be 3m, 3.5m, 4m, 4.5m, etc., then the corresponding value can be wide lane (lane width ≥ 3.5m) or narrow lane (lane width < 3.5m).
  • the static information corresponding to the central separation form can be no central separation, marking separation, hard fence separation, and green-belt separation.
  • the corresponding value can also be no central separation, marking separation, hard fence separation, and green-belt separation.
  • the static information corresponding to the motor/non-motor vehicle separation form can be no motor/non-motor separation, marking separation, hard fence separation, and green-belt separation. Then, the corresponding value can also be no motor/non-motor separation, marking separation, hard fence separation, and green-belt separation.
  • the static information corresponding to the traffic sign can be a speed limit sign with a limit of 40 km/h, a speed limit sign with a limit of 60 km/h, a no-speed-limit sign, etc. Then, the corresponding value can be speed limit sign or no-speed-limit sign.
  • the static information corresponding to the traffic signal is that the traffic signal is red, the traffic signal is green, the traffic signal is yellow, and there is no traffic signal. Then, the corresponding value can be with or without traffic signal.
  • the method further includes:
  • the determining the comprehensive complexity of the driving scene in which the own vehicle is located based on the dynamic complexity and the static complexity includes:
  • the dynamic complexity and the static complexity are added together and multiplied by the target complexity correction coefficient to obtain the comprehensive complexity of the driving scene where the own vehicle is located.
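The combination rule above can be sketched directly; the numeric values in the example are illustrative:

```python
# Sketch of the combination rule: add the dynamic and static complexities,
# then multiply the sum by the environment-dependent target complexity
# correction coefficient.

def comprehensive_complexity(dynamic, static, env_correction):
    return (dynamic + static) * env_correction
```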
  • environmental factors are also important factors that affect the driving of the vehicle. That is, environmental factors will also have a certain impact on the complexity of the driving scene where the vehicle is located.
  • the environment image in the driving scene can be obtained through the camera of the vehicle's perception system, the environmental information of each environmental factor can be identified in the environment image through a pre-built pattern recognition algorithm, and the environmental information of the environmental factor can then be used as the value of the environmental factor.
  • the target complexity correction coefficient corresponding to the value of the environmental factor in the driving scene of the vehicle can be obtained according to the corresponding relationship between the value of the environmental factor and the complexity correction coefficient, and the target complexity correction coefficient can be used to correct the dynamic complexity and the static complexity.
  • the environmental factors include light, weather, and road conditions.
  • the value corresponding to light can include daytime, dusk or dawn, night with light, and night without light; the value corresponding to weather can include sunny, cloudy, rain, snow, and fog; and the value corresponding to road conditions can include dry, wet, snow-covered, and icy.
  • the determining a target vehicle that meets a preset distance condition from the own vehicle includes:
  • the front vehicle whose distance from the host vehicle in the traveling direction of the host vehicle is not greater than the third safe distance is determined as the target vehicle.
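The selection rule above is a distance threshold over the vehicles ahead; the distances and threshold in this sketch are illustrative:

```python
# Sketch of the target-vehicle selection: among vehicles ahead of the host,
# take as target vehicles those whose distance along the host's traveling
# direction does not exceed the (third) safety distance.

def select_targets(front_vehicle_distances, third_safe_distance):
    return [d for d in front_vehicle_distances if d <= third_safe_distance]
```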
  • a device for detecting the complexity of a vehicle driving scene includes:
  • An acquisition module for acquiring the traveling speed of the own vehicle and the traveling speed of the target vehicle, where the target vehicle is a vehicle that meets a preset distance condition from the own vehicle;
  • the first determining module is configured to obtain the dynamic complexity of the driving scene where the own vehicle is located based on the running speed of the own vehicle and the running speed of the target vehicle;
  • the second determining module is used to obtain static information of static factors in the driving scene where the own vehicle is located;
  • the third determining module is configured to obtain the static complexity of the driving scene of the vehicle based on the static information of the static factors;
  • the fourth determining module is configured to obtain the comprehensive complexity of the driving scene in which the own vehicle is located based on the dynamic complexity and the static complexity.
  • the first determining module is configured to:
  • For each target vehicle, obtain the complexity corresponding to the target vehicle based on the traveling speed of the host vehicle and the traveling speed of the target vehicle;
  • the complexity corresponding to each target vehicle is added to obtain the dynamic complexity of the driving scene in which the own vehicle is located.
  • the first determining module is configured to:
  • the first determining module is configured to:
  • the initial complexity f(θij) corresponding to the target vehicle is corrected to obtain the complexity corresponding to the target vehicle.
  • the first determining module is configured to:
  • Based on the traveling speed, maximum deceleration, and minimum deceleration of the host vehicle, the traveling speed and maximum deceleration of the target vehicle, and the preset driver reaction time, the first safety distance between the host vehicle and the target vehicle is obtained.
  • the initial complexity f(θij) corresponding to the target vehicle, the longitudinal correction coefficient f1(q), and the lateral correction coefficient f1(p) are multiplied to obtain the complexity corresponding to the target vehicle.
  • the third determining module is configured to:
  • the static complexity of the driving scene where the own vehicle is located is obtained.
  • the third determining module is configured to:
  • the corresponding weighted complexity of the multiple static factors are added together to obtain the static complexity of the driving scene where the own vehicle is located.
  • the device further includes:
  • the statistics module is used to obtain N sample images corresponding to the first value of the first static factor, where N is an integer greater than 1; perform image recognition on the N sample images respectively to obtain the predicted value corresponding to each sample image; count the number M of sample images whose corresponding predicted value differs from the corresponding calibrated true value, where the calibrated true value is used to uniquely identify the first value; and obtain the ratio of M to N as the complexity corresponding to the first value.
  • the static factors include at least one of road type, number of lanes in the same direction, lane width, central separation form, motor/non-motor vehicle separation form, traffic signs, and traffic signal lights.
  • the device further includes:
  • the correction module is used to obtain the environmental information of the environmental factors in the driving scene where the own vehicle is located;
  • the fourth determining module is used for:
  • the dynamic complexity and the static complexity are added together and multiplied by the target complexity correction coefficient to obtain the comprehensive complexity of the driving scene where the own vehicle is located.
  • the environmental factors include at least one of light, weather, and road conditions.
  • the acquisition module is used to:
  • the vehicle ahead whose distance from the host vehicle in the traveling direction of the host vehicle is not greater than the third safe distance is taken as the target vehicle.
  • a vehicle driving decision controller including a processor and a memory;
  • the memory stores at least one computer-readable instruction configured to be executed by the processor for implementing the method for detecting the complexity of a vehicle driving scene as described in the first aspect described above.
  • a computer-readable storage medium including computer-readable instructions is provided; when the computer-readable instructions run on a vehicle driving decision controller, the vehicle driving decision controller executes the method for detecting the complexity of a vehicle driving scene described in the first aspect above.
  • a computer program product containing instructions is provided; when it runs on a vehicle driving decision controller, the vehicle driving decision controller executes the method for detecting the complexity of a vehicle driving scene described in the first aspect.
  • In addition to determining the static complexity of the driving scene of the vehicle based on the static information of the static factors in the driving scene where the vehicle is located, the dynamic complexity of the driving scene where the vehicle is located is also determined based on the driving speed of the host vehicle and the driving speed of the target vehicle, where the target vehicle is a vehicle that meets the preset distance condition with respect to the host vehicle.
  • Therefore, the comprehensive complexity can more comprehensively reflect the complexity of the current driving scene of the vehicle.
  • FIG. 1 is a schematic structural diagram of a vehicle driving decision controller provided by an embodiment of the present application
  • FIG. 2 is a flowchart of a method for detecting the complexity of a vehicle driving scene provided by an embodiment of the present application
  • Fig. 3 is a schematic diagram of a target vehicle provided by an embodiment of the present application.
  • Fig. 4 is a schematic diagram of a target vehicle provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a target vehicle provided by an embodiment of the present application.
  • Fig. 6 is a schematic structural diagram of an apparatus for detecting the complexity of a vehicle driving scene provided by an embodiment of the present application.
  • the embodiment of the present application provides a method for detecting the complexity of a vehicle driving scene, and the method can be applied to an automatic driving car.
  • the method can be implemented by a vehicle driving decision controller in an autonomous vehicle.
  • Sensing systems, positioning systems, etc. can be deployed in autonomous vehicles.
  • the sensing system may include a radar, a camera, etc.
  • the positioning system may be a global positioning system (Global Positioning System, GPS), a Beidou system, etc.
  • Fig. 1 is a schematic diagram of a vehicle driving decision controller 100 provided by an embodiment of the present application.
  • the vehicle driving decision controller may include a processor 101 and a memory 102.
  • the processor 101 may be a central processing unit (CPU).
  • the processor 101 may refer to one processor, or may include multiple processors.
  • the memory 102 may include volatile memory, such as random access memory (RAM); the memory may also include non-volatile memory, such as read-only memory (ROM), flash memory, etc.
  • the memory may also include a combination of the above-mentioned types of memory.
  • the memory 102 may refer to one memory, or may include multiple memories.
  • the memory 102 stores computer-readable instructions, and the computer-readable instructions can be executed by the processor 101 to implement the method for detecting the complexity of a vehicle driving scene provided by the embodiment of the present application.
  • the vehicle driving decision controller collects the driving data of vehicles ahead within a certain range through the sensing system, obtains the value of each static factor in the driving scene of the host vehicle through the combination of the positioning system and the high-precision map, and comprehensively calculates the complexity of the driving scene of the host vehicle.
  • the processing flow of a method for detecting the complexity of a vehicle driving scene may include the following steps:
  • Step 201 Obtain the running speed of the host vehicle and the running speed of the target vehicle, where the target vehicle is a vehicle that meets a preset distance condition from the host vehicle.
  • the target vehicle that meets the preset distance condition from the host vehicle can be determined first, as a research vehicle for subsequent calculation of the dynamic complexity of the driving scene where the host vehicle is located.
  • the method for determining the target vehicle that meets the preset distance condition from the own vehicle can be as follows:
  • the front vehicle in the driving scene of the vehicle is detected by radar. If there is a front vehicle in the same lane as the vehicle, the front vehicle with the smallest distance to the vehicle is determined as the reference vehicle. Then, the travel speed of the reference vehicle and the component of the distance between the reference vehicle and the host vehicle in the travel direction of the host vehicle are obtained by radar.
  • the component of the distance between the vehicle and the host vehicle in the traveling direction of the host vehicle may also be referred to as the longitudinal distance between the vehicle and the host vehicle.
  • the calculation formula can be as follows, where:
  • V_following is the traveling speed of the host vehicle;
  • V_leading is the traveling speed of the reference vehicle;
  • a_max,accel is the preset maximum acceleration of the host vehicle, for example 0.2g, where g is the acceleration due to gravity;
  • a_min,brake is the preset minimum deceleration of the host vehicle, for example 0.3g;
  • b_max,brake is the preset maximum deceleration of the preceding vehicle, for example 0.4g;
  • ρ is the preset driver reaction time, for example 2 s.
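The formula for the second safety distance appears only as an image in the original document. Given the parameters listed above, it matches the standard RSS-style longitudinal safe-distance form, so the sketch below is an assumption of that form rather than the patent's exact equation (the function name is mine):

```python
# Hedged sketch of the second safety distance, assuming the standard
# RSS-style longitudinal safe-distance form implied by the parameters above.
def safe_distance(v_following, v_leading,
                  a_max_accel=0.2 * 9.8,   # preset max acceleration of host vehicle
                  a_min_brake=0.3 * 9.8,   # preset min deceleration of host vehicle
                  b_max_brake=0.4 * 9.8,   # preset max deceleration of preceding vehicle
                  rho=2.0):                # preset driver reaction time (s)
    """Distance the host vehicle needs so that it can still stop safely if
    the preceding vehicle brakes at its maximum deceleration."""
    # Host vehicle travels (and may accelerate) during the reaction time,
    # then brakes at its minimum deceleration; the preceding vehicle brakes
    # at its maximum deceleration.
    d = (v_following * rho
         + 0.5 * a_max_accel * rho ** 2
         + (v_following + rho * a_max_accel) ** 2 / (2 * a_min_brake)
         - v_leading ** 2 / (2 * b_max_brake))
    return max(d, 0.0)

print(round(safe_distance(20.0, 15.0), 2))
```

With the example parameter values (0.2g, 0.3g, 0.4g, 2 s), a host vehicle at 20 m/s following a 15 m/s vehicle yields a safety distance of roughly 113 m.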
  • the smaller of the longitudinal distance between the reference vehicle and the host vehicle and the second safety distance determined above can then be taken.
  • among the vehicles ahead in the lane of the host vehicle and in the adjacent lanes, those whose longitudinal distance from the host vehicle is not greater than this minimum value are determined as target vehicles.
  • in this way, the second safety distance serves as the bound when it is smaller than the longitudinal distance between the reference vehicle and the host vehicle, and that longitudinal distance serves as the bound otherwise.
  • when there is no vehicle ahead in the same lane as the host vehicle, a third safety distance is determined based on the traveling speed, preset maximum acceleration and preset minimum deceleration of the host vehicle, a preset preceding-vehicle traveling speed of 0 together with the preset maximum deceleration, and the preset driver reaction time; the vehicles ahead whose longitudinal distance from the host vehicle is not greater than the third safety distance are then determined as target vehicles.
  • Step 202 Obtain the dynamic complexity of the driving scene in which the own vehicle is located based on the running speed of the own vehicle and the running speed of the target vehicle.
  • the complexity corresponding to the target vehicle is determined.
  • the determined complexity of each target vehicle is added together to obtain the dynamic complexity of the driving scene where the vehicle is located. The method for determining the corresponding complexity of the target vehicle will be described below.
  • the relative lateral speed of the host vehicle and the target vehicle is the component of the traveling speed of the target vehicle in the direction horizontally perpendicular to the traveling direction of the host vehicle, minus the component of the traveling speed of the host vehicle in that same direction.
  • the relative longitudinal speed of the host vehicle and the target vehicle is the component of the traveling speed of the target vehicle in the traveling direction of the host vehicle, minus the component of the traveling speed of the host vehicle in the traveling direction of the host vehicle.
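From these two components the included angle θ_ij between the relative speed and the host vehicle's travel direction can be computed. The exact formula is an image in the original, so the arctangent form below is an assumption, chosen to be consistent with the stated constraint θ_ij ∈ [0°, 90°]:

```python
import math

def included_angle(rel_lateral, rel_longitudinal):
    """Angle theta_ij (degrees) between the relative speed of host and
    target vehicle and the host vehicle's travel direction."""
    if rel_longitudinal == 0:
        # Purely lateral relative motion: angle is 90 degrees (0 if at rest).
        return 90.0 if rel_lateral != 0 else 0.0
    # Absolute values keep the angle in the first quadrant, matching
    # theta_ij in [0 deg, 90 deg] as stated in the patent.
    return math.degrees(math.atan(abs(rel_lateral) / abs(rel_longitudinal)))

print(included_angle(1.0, 1.0))  # equal components give 45 degrees
```

The initial complexity f(θ_ij) is then obtained by substituting this angle into the patent's preset equation, which is not reproduced here.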
  • the correction method can have the following steps:
  • the calculation method of the longitudinal correction coefficient can be as follows:
  • first, calculate the first safe distance between the host vehicle and the target vehicle.
  • the calculation formula of the first safety distance is the same as the calculation formula of the second safety distance described above, and will not be repeated here.
  • calculate the first difference between the longitudinal distance between the target vehicle and the host vehicle and the preset dangerous distance.
  • calculate the second difference between the safety critical value and the preset dangerous distance.
  • the safety critical value can be the first safe distance between the host vehicle and the target vehicle, and the preset dangerous distance can be 0 m.
  • calculate the ratio of the first difference to the second difference to obtain the standard longitudinal distance p between the host vehicle and the target vehicle.
  • the standard longitudinal distance p can be expressed as: p = (longitudinal distance - preset dangerous distance) / (safety critical value - preset dangerous distance).
  • the calculation method of the lateral correction coefficient can be as follows.
  • calculate the third difference between the lateral relative speed of the target vehicle and the host vehicle and the preset dangerous lateral relative speed.
  • calculate the fourth difference between the preset safe lateral relative speed and the preset dangerous lateral relative speed. The preset dangerous lateral relative speed can be 3.5 m/s, meaning that the host vehicle and the target vehicle approach each other laterally at 3.5 m/s, and the preset safe lateral relative speed can be -3.5 m/s, meaning that the host vehicle and the target vehicle move apart laterally at 3.5 m/s.
  • calculate the ratio of the third difference to the fourth difference to obtain the standard lateral relative speed q between the host vehicle and the target vehicle.
  • the standard lateral relative speed q can be expressed as: q = (lateral relative speed - preset dangerous lateral relative speed) / (preset safe lateral relative speed - preset dangerous lateral relative speed).
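Putting the two normalizations together with the correction function f1(x) = (1 - x)·lg(1/x) given in the claims, the correction step might be sketched as follows. The 50 m safe bound in the example and the clamping of out-of-range values are my assumptions, not the patent's:

```python
import math

def standardize(value, danger, safe):
    """Map a raw value onto [0, 1]: 0 at the dangerous bound, 1 at the safe bound."""
    x = (value - danger) / (safe - danger)
    return min(max(x, 1e-6), 1.0)  # clamp to avoid log10(1/0) at exactly 0

def correction(x):
    """f1(x) = (1 - x) * lg(1 / x), per the claims; lg is log base 10."""
    return (1.0 - x) * math.log10(1.0 / x)

# Standard longitudinal distance p: dangerous distance 0 m, safe bound =
# first safe distance (assumed 50 m here, for illustration only).
p = standardize(25.0, danger=0.0, safe=50.0)
# Standard lateral relative speed q: danger +3.5 m/s (closing), safe -3.5 m/s.
q = standardize(0.0, danger=3.5, safe=-3.5)
f1_p, f1_q = correction(p), correction(q)
print(round(f1_p, 4), round(f1_q, 4))
```

The corrected complexity for a target vehicle is then the product f(θ_ij) · f1(p) · f1(q), as stated in the claims.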
  • Step 203 Obtain static information of various static factors in the current driving scene of the vehicle.
  • static factors include road type, number of lanes in the same direction, lane width, central median isolation form, motorized/non-motorized lane isolation form, traffic signs, and traffic signal lights.
  • the current location can be determined through the positioning system, and the static information of the static factors of the current location of the vehicle (the driving scene of the vehicle) can be determined through the high-precision map.
  • for example, the road type in the current driving scene of the host vehicle is an urban expressway;
  • the number of lanes in the same direction is 2;
  • the lane width is greater than 3.5 m, i.e. a wide lane;
  • the central median isolation form is greenery isolation;
  • the motorized/non-motorized lane isolation form is greenery isolation;
  • the traffic sign is a speed limit sign;
  • and there are traffic signal lights. The following describes the possible static information of each static factor.
  • the static information of road types can be urban express roads, main roads, sub-main roads, and branch roads.
  • the static information of the number of lanes in the same direction can be 1, 2, 3, 4, 5, 6, etc.
  • the static information of the lane width can be 3m, 3.5m, 4m, 4.5m, etc.
  • the static information of the central median isolation form can be no central isolation, marking-line isolation, rigid isolation fence, or greenery isolation.
  • the static information of the motorized/non-motorized lane isolation form can be no isolation, marking-line isolation, rigid isolation fence, or greenery isolation.
  • the static information of traffic signs can be speed limit signs 40km/h, speed limit signs 60km/h, unlimited speed signs, etc.
  • the static information of the traffic signal light can be: the signal is red, the signal is green, the signal is yellow, or there is no traffic signal light.
  • Step 204 Obtain the static complexity of the driving scene of the vehicle based on the static information of each static factor.
  • the values of the multiple static factors in the driving scene of the vehicle may be obtained first according to the determined static information of the multiple static factors. Then, according to the corresponding relationship between the value of the static factor and the complexity, the complexity corresponding to the value of the multiple static factors in the driving scene of the vehicle is determined. Finally, according to the corresponding complexity of multiple static factors, the static complexity of the driving scene of the vehicle is obtained.
  • the static information of the road type can be directly determined as the value of the road type; that is, the static information urban expressway, main road, secondary main road, and branch road
  • correspond to the values urban expressway, main road, secondary main road, and branch road, respectively.
  • when the static information of the number of lanes in the same direction is 1 to 4,
  • the static information of the number of lanes in the same direction can be directly determined as the value of the number of lanes in the same direction.
  • when the static information of the number of lanes in the same direction is 5, 6, 7, etc., it can uniformly correspond to the value "≥5".
  • when the static information of the lane width is 3 m, etc., it can uniformly correspond to the value "<3.5 m" (narrow lane); when the static information of the lane width is 3.5 m, 4 m, 4.5 m, etc., it can uniformly correspond to the value "≥3.5 m" (wide lane).
  • the static information of the central median isolation form can be directly determined as the value of the central median isolation form.
  • the static information of the motorized/non-motorized lane isolation form can be directly determined as the value of the motorized/non-motorized lane isolation form.
  • when the static information of the traffic sign is a 40 km/h speed limit sign,
  • a 60 km/h speed limit sign, etc.,
  • the corresponding value can be "speed limit sign".
  • when there is no speed limit sign, the corresponding value can be "no speed limit sign".
  • when the static information of the traffic signal light is that the signal is red, green, or yellow, it can uniformly correspond to the value "traffic signal light present".
  • when the static information of the traffic signal light is that there is no traffic signal light, the corresponding value can be "no traffic signal light".
  • the weight corresponding to the static factor can be obtained based on the corresponding relationship between the static factor and the weight, and the complexity corresponding to the static factor is multiplied by the weight corresponding to the static factor to obtain the weighted complexity of the static factor.
  • the corresponding relationship between static factors and weights can be shown in Table 3.
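The weight-and-sum described above can be sketched as follows. The factor names and numbers are illustrative only, not the actual Table 3 weights:

```python
def static_complexity(factor_complexity, factor_weight):
    """Weighted sum of per-factor complexities, as in step 204:
    each static factor's complexity times its weight, summed."""
    return sum(factor_complexity[f] * factor_weight[f] for f in factor_complexity)

# Illustrative values only; the real weights come from the AHP-derived Table 3.
complexities = {"road_type": 0.2, "lane_width": 0.5}
weights = {"road_type": 0.3, "lane_width": 0.7}
print(static_complexity(complexities, weights))
```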
  • the technician can predetermine the complexity corresponding to each value of each static factor.
  • the method for determining the complexity corresponding to each value of each static factor can be as follows.
  • N sample images corresponding to the value are obtained.
  • N is an integer greater than 1.
  • the sample images can be obtained in the following ways: intercepting through high-definition street view maps, shooting on the spot, or downloading through the Internet, etc.
  • the embodiment of the present application does not limit the method for acquiring the sample image.
  • image recognition is performed on the N sample images, and the predicted value corresponding to each sample image is obtained.
  • Count the number M of sample images whose corresponding predicted value is different from the corresponding calibrated true value. Among them, the calibrated true value is used to uniquely identify the value.
  • the ratio of M to N is determined as the complexity corresponding to this value. The following takes the determination of the complexity corresponding to the greening isolation as an example for description.
  • for example, suppose image recognition is performed on 100 sample images of greenery isolation and the number of incorrectly recognized sample images is counted. If 90 sample images are recognized incorrectly, dividing 90 by the total number of sample images, 100, gives a recognition error rate of 90%.
  • this recognition error rate can be used as the complexity corresponding to greenery isolation.
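The complexity of a value is thus its image-recognition error rate M/N. A minimal sketch of that computation (the labels are hypothetical):

```python
def value_complexity(predicted, truth):
    """Ratio M/N: fraction of sample images whose predicted value differs
    from the calibrated true value for this static-factor value."""
    assert len(predicted) > 1 and len(predicted) == len(truth)
    mismatches = sum(1 for p, t in zip(predicted, truth) if p != t)
    return mismatches / len(predicted)

# Example from the text: 100 sample images of greenery isolation,
# 90 of which are recognized incorrectly.
truth = ["greenery"] * 100
predicted = ["greenery"] * 10 + ["fence"] * 90
print(value_complexity(predicted, truth))
```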
  • the weight corresponding to each static factor can be determined and stored using analytic hierarchy process.
  • the following is an explanation of determining the weights corresponding to static factors through the analytic hierarchy process.
  • the static factor importance judgment matrix table can be entered as shown in Table 4, and the ratio scale table can be shown in Table 5 below.
  • here, n = 7, corresponding to the seven static factors listed above.
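The patent does not reproduce the full judgment matrix, but the analytic hierarchy process step can be sketched using the common geometric-mean approximation of the principal eigenvector. The 2-factor matrix below is illustrative only:

```python
def ahp_weights(judgment):
    """Approximate AHP weights from a pairwise importance judgment matrix
    using the geometric-mean (root) method: take the geometric mean of
    each row, then normalize the means to sum to 1."""
    n = len(judgment)
    geo_means = []
    for row in judgment:
        prod = 1.0
        for v in row:
            prod *= v
        geo_means.append(prod ** (1.0 / n))
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Illustrative 2-factor matrix: factor 0 judged twice as important as factor 1.
print(ahp_weights([[1.0, 2.0], [0.5, 1.0]]))
```

For a consistent matrix, this yields the exact eigenvector weights; for the patent's 7x7 matrix it would give the weights stored in Table 3.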
  • Step 205 Determine the comprehensive complexity of the driving scene where the vehicle is located based on the dynamic complexity and the static complexity.
  • the dynamic complexity and static complexity determined above can be directly added to obtain the comprehensive complexity of the driving scene where the vehicle is located.
  • the environmental factors of the driving scene where the vehicle is located can be considered to comprehensively determine the comprehensive complexity of the driving scene where the vehicle is located.
  • the determination method may be as follows: obtain environmental information of environmental factors in the driving scene where the vehicle is located. Obtain the value of the environmental factor based on the environmental information of the environmental factor. Based on the stored corresponding relationship between the value of the environmental factor and the complexity correction coefficient, the target complexity correction coefficient corresponding to the environmental factor in the driving scene of the vehicle is determined. The dynamic complexity and the static complexity are added together and multiplied by the target complexity correction coefficient to obtain the comprehensive complexity of the driving scene where the vehicle is located.
  • environmental factors can include sunlight, weather, and road conditions.
  • the environmental image in the driving scene can be acquired through the camera of the vehicle's perception system, and the environmental information of each environmental factor can be identified in the environmental image through the pre-built pattern recognition algorithm. Since the output of the pattern recognition algorithm is several pre-set possible environmental information, the environmental information of the environmental factor can be directly determined as the value of the environmental factor. The value of each environmental factor can be shown in Table 6 below.
  • the corresponding relationship between the value of each environmental factor and the complexity correction coefficient can be shown in Table 7 below. It should be noted that Table 7 is only an example of complexity correction coefficients.
  • Table 7:
    Environmental factor | Values (complexity correction coefficient)
    Illumination | day (1), dusk or dawn (1.2), night with lighting (1.5), night without lighting (2)
    Weather | sunny (1), cloudy (1.2), rain (1.8), snow (1.5), fog (2)
    Road condition | dry (1), wet (1.5), snow-covered (1.8), icy (2)
  • the corresponding relationship between the values of the aforementioned environmental factors and the complexity correction coefficients can be queried, and the complexity correction coefficients corresponding to each environmental factor in the driving scene of the vehicle can be determined. Then, the determined complexity correction coefficient corresponding to each environmental factor is multiplied to obtain the target complexity correction coefficient.
  • for example, the obtained values of the environmental factors are: illumination is daytime, weather is rainy, and the road condition is wet.
  • the complexity correction coefficient corresponding to daytime is 1, the complexity correction coefficient corresponding to rain is 1.8, and the complexity correction coefficient corresponding to wet is 1.5; multiplying the three gives a target complexity correction coefficient of 2.7.
  • the obtained target complexity correction coefficient is multiplied by the sum of the dynamic complexity and the static complexity determined above to obtain the comprehensive complexity of the driving scene where the vehicle is located.
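The full correction step can be sketched end to end. The coefficient table mirrors the example Table 7 values, and the example reproduces the day/rain/wet case with coefficient 2.7; the dynamic and static complexities passed in are illustrative:

```python
# Complexity correction coefficients, following the example values of Table 7.
CORRECTION = {
    "illumination": {"day": 1.0, "dusk_or_dawn": 1.2, "night_lit": 1.5, "night_unlit": 2.0},
    "weather": {"sunny": 1.0, "cloudy": 1.2, "rain": 1.8, "snow": 1.5, "fog": 2.0},
    "road": {"dry": 1.0, "wet": 1.5, "snow": 1.8, "icing": 2.0},
}

def comprehensive_complexity(dynamic, static, environment):
    """(dynamic + static) * product of the per-factor correction coefficients."""
    coeff = 1.0
    for factor, value in environment.items():
        coeff *= CORRECTION[factor][value]
    return (dynamic + static) * coeff

# Day / rain / wet gives a target correction coefficient of 1 * 1.8 * 1.5 = 2.7.
env = {"illumination": "day", "weather": "rain", "road": "wet"}
print(comprehensive_complexity(2.0, 3.0, env))
```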
  • the comprehensive complexity can be displayed in real-time on the display screen in the self-driving car, so that the driver can understand the complexity of the driving environment in which the vehicle is located in real time.
  • a dangerous complexity threshold can also be set. When the calculated comprehensive complexity is greater than the set dangerous complexity threshold, a manual takeover prompt can be displayed on the display screen, optionally together with a voice prompt reminding the driver that the current driving environment is relatively complex and that manual driving should be taken over.
  • the target vehicle is a vehicle that meets the preset distance condition relative to the host vehicle.
  • the static complexity determined by the static factors and the dynamic complexity determined by the driving behavior of surrounding vehicles are combined to determine the comprehensive complexity of the driving scene in which the host vehicle is located.
  • the comprehensive complexity can therefore more comprehensively reflect the actual complexity of the driving scene in which the vehicle is currently located.
  • an embodiment of the present application also provides a device for detecting the complexity of a vehicle driving scene. As shown in FIG. 6, the device includes:
  • the acquiring module 610 is configured to acquire the traveling speed of the host vehicle and the traveling speed of the target vehicle, where the target vehicle is a vehicle that meets a preset distance condition relative to the host vehicle. Specifically, it can implement the acquisition function in step 201 above, as well as other implicit steps.
  • the first determining module 620 is configured to obtain the dynamic complexity of the driving scene in which the host vehicle is located based on the traveling speed of the host vehicle and the traveling speed of the target vehicle. Specifically, it can implement the acquisition function in step 202 above, as well as other implicit steps.
  • the second determining module 630 is configured to obtain the static information of the static factors in the driving scene in which the host vehicle is located. Specifically, it can implement the acquisition function in step 203 above, as well as other implicit steps.
  • the third determining module 640 is configured to obtain the static complexity of the driving scene of the host vehicle based on the static information of the static factors. Specifically, it can implement the acquisition function in step 204 above, as well as other implicit steps.
  • the fourth determining module 650 is configured to obtain the comprehensive complexity of the driving scene in which the host vehicle is located based on the dynamic complexity and the static complexity. Specifically, it can implement the acquisition function in step 205 above, as well as other implicit steps.
  • the first determining module 620 is configured to:
  • for each target vehicle, obtain the complexity corresponding to the target vehicle based on the traveling speed of the host vehicle and the traveling speed of the target vehicle;
  • the complexity corresponding to each target vehicle is added to obtain the dynamic complexity of the driving scene in which the own vehicle is located.
  • the first determining module 620 is configured to:
  • the first determining module 620 is configured to:
  • the initial complexity f(θ_ij) corresponding to the target vehicle is corrected to obtain the complexity corresponding to the target vehicle.
  • the first determining module 620 is configured to:
  • based on the traveling speed, preset maximum acceleration and preset minimum deceleration of the host vehicle, the traveling speed and preset maximum deceleration of the target vehicle, and the preset driver reaction time,
  • the first safety distance between the host vehicle and the target vehicle is obtained;
  • the initial complexity f(θ_ij) corresponding to the target vehicle, the longitudinal correction coefficient f_1(p) and the lateral correction coefficient f_1(q) are multiplied to obtain the complexity corresponding to the target vehicle.
  • the third determining module 640 is configured to:
  • the static complexity of the driving scene where the own vehicle is located is obtained.
  • the third determining module 640 is configured to:
  • the corresponding weighted complexity of the multiple static factors are added together to obtain the static complexity of the driving scene where the own vehicle is located.
  • the device further includes:
  • the statistics module is used to obtain N sample images corresponding to the first value of the first static factor, where N is an integer greater than 1; perform image recognition on the N sample images respectively to obtain the predicted value corresponding to each sample image; count the number M of sample images whose corresponding predicted value differs from the corresponding calibrated true value, where the calibrated true value is used to uniquely identify the first value; and obtain
  • the ratio of M to N as the complexity corresponding to the first value.
  • the static factors include at least one of road type, number of lanes in the same direction, lane width, central median isolation form, motorized/non-motorized lane isolation form, traffic sign, and traffic signal light.
  • the device further includes:
  • the correction module is used to obtain the environmental information of the environmental factors in the driving scene where the own vehicle is located;
  • the fourth determining module 650 is used for:
  • the dynamic complexity and the static complexity are added together and multiplied by the target complexity correction coefficient to obtain the comprehensive complexity of the driving scene where the own vehicle is located.
  • the environmental factors include at least one of sunlight, weather, and road conditions.
  • the obtaining module 610 is configured to:
  • the vehicle ahead whose distance from the host vehicle in the traveling direction of the host vehicle is not greater than the third safe distance is taken as the target vehicle.
  • when the device for detecting the complexity of a vehicle driving scene provided by the above embodiment detects the complexity of a vehicle driving scene, the division into the above functional modules is merely used as an example for illustration. In actual applications, the above functions can be allocated
  • to different functional modules as required; that is, the internal structure of the vehicle driving decision controller can be divided into different functional modules to complete all or part of the functions described above.
  • the device for detecting the complexity of a vehicle driving scene provided by the foregoing embodiment belongs to the same concept as the embodiment of the method for detecting the complexity of a vehicle driving scene. For the specific implementation process, please refer to the method embodiment, which will not be repeated here.
  • the computer program product includes one or more computer instructions, and when the computer program instructions are loaded and executed on a device, the processes or functions described in the embodiments of the present application are generated in whole or in part.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center.
  • the computer-readable storage medium may be any available medium that can be accessed by the device or a data storage device such as a server or data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (such as a floppy disk, a hard disk, a magnetic tape, etc.), an optical medium (such as a digital video disk (Digital Video Disk, DVD), etc.), or a semiconductor medium (such as a solid-state hard disk, etc.).
  • the program can be stored in a computer-readable storage medium.
  • the storage medium mentioned can be a read-only memory, a magnetic disk or an optical disk, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Transportation (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Mathematical Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present application discloses a method and apparatus for detecting the complexity of a vehicle driving scene, belonging to the technical field of autonomous driving, and applicable to intelligent vehicles, new energy vehicles, and connected vehicles. The method includes: determining a target vehicle that meets a preset distance condition relative to the host vehicle, and obtaining the traveling speed of the host vehicle and the traveling speed of the target vehicle; determining the dynamic complexity of the driving scene in which the host vehicle is located based on the traveling speed of the host vehicle and the traveling speed of the target vehicle; determining the static information of each static factor in the driving scene in which the host vehicle is currently located; obtaining the static complexity of the driving scene based on the static information of each static factor; and obtaining the comprehensive complexity of the driving scene based on the dynamic complexity and the static complexity. The comprehensive complexity determined by this method can more comprehensively reflect the actual complexity of the driving scene in which the vehicle is currently located.

Description

Method and apparatus for detecting the complexity of a vehicle driving scene

Technical Field

The present application relates to the technical field of autonomous driving, and in particular to a method and apparatus for detecting the complexity of a vehicle driving scene.
Background

With the rapid development of autonomous vehicles, the traffic safety of autonomous vehicles has attracted increasing attention. In high-complexity driving scenes, autonomous driving may involve certain risks, and the driver needs to take over and drive manually. Therefore, in order to remind the driver to prepare to take over driving in high-complexity driving scenes, determining the complexity of the driving scene is crucial.

At present, the complexity of a driving scene can usually be determined as follows. Through the on-board Global Positioning System (GPS) and a high-precision map, the actual values of the static factors affecting driving safety in the driving scene in which the host vehicle is currently located, as well as the traveling speed of the host vehicle, are determined. The above static factors include the number of lanes, the central separation form, the width of the lane in which the host vehicle is located, and so on. Based on the pre-stored complexities corresponding to the different values of the static factors, the complexities corresponding to the actual values of the static factors are determined. Based on the complexity corresponding to each traveling-speed interval, the complexity corresponding to the traveling speed of the host vehicle is determined. The determined complexities are then added together to obtain the complexity of the driving scene in which the host vehicle is currently located.

The prior art has at least the following problems:

In a vehicle's driving scene, in addition to the static factors and the vehicle's traveling speed mentioned in the above method, driving safety is also affected by the driving behavior of other vehicles within a certain range around the vehicle. Therefore, the complexity of a driving scene determined only from the above factors cannot fully reflect the actual complexity of the driving scene in which the vehicle is currently located, so that autonomous driving involves certain safety risks.
Summary

The embodiments of the present application provide a method and apparatus for detecting the complexity of a vehicle driving scene, to overcome the problem in the related art that the determined complexity cannot fully reflect the actual complexity of the driving scene in which the vehicle is currently located.

In a first aspect, a method for detecting the complexity of a vehicle driving scene is provided, the method including:

obtaining the traveling speed of the host vehicle and the traveling speed of a target vehicle, where the target vehicle is a vehicle that meets a preset distance condition relative to the host vehicle; obtaining the dynamic complexity of the driving scene in which the host vehicle is located based on the traveling speed of the host vehicle and the traveling speed of the target vehicle; determining the static information of each static factor in the driving scene in which the host vehicle is currently located; determining the static complexity of the driving scene based on the static information of each static factor; and determining the comprehensive complexity of the driving scene based on the dynamic complexity and the static complexity.
In the solution shown in the embodiments of the present application, the traveling speed of target vehicles within a certain range of the host vehicle can be detected by an on-board radar, and the dynamic complexity of the driving scene in which the host vehicle is located can be calculated from the traveling speed of the target vehicles and the traveling speed of the host vehicle. The dynamic complexity characterizes the degree to which the driving behavior of the target vehicles in the driving scene affects the driving of the host vehicle. Meanwhile, the static information of each static factor in the driving scene can be obtained through on-board GPS and a high-definition map. For example, if the static factor is lane width, the static information can be the actual width value of the lane.

Based on the static information of each static factor, the static complexity of the driving scene is obtained. The static complexity characterizes the degree to which the static factors in the driving scene affect the driving of the host vehicle. Finally, the comprehensive complexity of the driving scene can be calculated by combining the dynamic complexity and the static complexity. This solution combines static factors with the driving behavior of target vehicles within a certain range to comprehensively determine the comprehensive complexity of the driving scene, which can more comprehensively reflect the actual complexity of the driving scene in which the vehicle is currently located.
In a possible implementation, obtaining the dynamic complexity of the driving scene in which the host vehicle is located based on the traveling speed of the host vehicle and the traveling speed of the target vehicle includes:

for each target vehicle, obtaining the complexity corresponding to the target vehicle based on the traveling speed of the host vehicle and the traveling speed of the target vehicle; and

adding the complexities corresponding to the target vehicles to obtain the dynamic complexity of the driving scene in which the host vehicle is located.

In the solution shown in the embodiments of the present application, when determining the dynamic complexity, each target vehicle can be taken as a research object, the complexity corresponding to each target vehicle can be calculated separately, and the complexities corresponding to the target vehicles can be combined to calculate the dynamic complexity of the driving scene in which the host vehicle is located.
In a possible implementation, obtaining the complexity corresponding to the target vehicle based on the traveling speed of the host vehicle and the traveling speed of the target vehicle includes:

obtaining the included angle θ_ij between the relative traveling speed of the host vehicle and the target vehicle and the traveling direction of the host vehicle; and

determining the complexity corresponding to the target vehicle based on the included angle θ_ij.

In the solution shown in the embodiments of the present application, the relative traveling speed of the host vehicle and the target vehicle can be the traveling speed of the target vehicle minus the traveling speed of the host vehicle. The included angle θ_ij between the relative traveling speed and the traveling direction of the host vehicle can be calculated as

θ_ij = arctan(|Δv_x| / |Δv_y|), θ_ij ∈ [0°, 90°],

where Δv_x is the relative lateral speed of the host vehicle and the target vehicle, that is, the component of the traveling speed of the target vehicle in the direction horizontally perpendicular to the traveling direction of the host vehicle minus the component of the traveling speed of the host vehicle in that same direction, and Δv_y is the relative longitudinal speed of the host vehicle and the target vehicle, that is, the component of the traveling speed of the target vehicle in the traveling direction of the host vehicle minus the component of the traveling speed of the host vehicle in the traveling direction of the host vehicle.
In a possible implementation, obtaining the complexity corresponding to the target vehicle based on the included angle θ_ij includes:

substituting the included angle θ_ij into a preset equation to obtain the initial complexity f(θ_ij) corresponding to the target vehicle; and

correcting the initial complexity f(θ_ij) corresponding to the target vehicle based on the distance between the target vehicle and the host vehicle and the relative speed between the target vehicle and the host vehicle, to obtain the complexity corresponding to the target vehicle.

In the solution shown in the embodiments of the present application, in order to calculate the complexity of each target vehicle more accurately, the initial complexity can be corrected based on the distance between the target vehicle and the host vehicle and the relative speed between the target vehicle and the host vehicle.
In a possible implementation, correcting the initial complexity f(θ_ij) corresponding to the target vehicle based on the distance between the target vehicle and the host vehicle and the relative speed between the target vehicle and the host vehicle, to obtain the complexity corresponding to the target vehicle, includes:

obtaining the first safe distance between the host vehicle and the target vehicle based on the traveling speed, maximum deceleration and minimum deceleration of the host vehicle, the traveling speed and maximum deceleration of the target vehicle, and the preset driver reaction time;

calculating the first difference between the component, in the traveling direction of the host vehicle, of the distance between the target vehicle and the host vehicle and the preset dangerous distance;

calculating the second difference between the first safe distance and the preset dangerous distance;

calculating the ratio of the first difference to the second difference to obtain the standard longitudinal distance p between the host vehicle and the target vehicle;

substituting the standard longitudinal distance p into the equation f_1(p) = (1 - p)·lg(1/p) to obtain the longitudinal correction coefficient f_1(p);

calculating the third difference between the lateral relative speed between the target vehicle and the host vehicle and the preset dangerous lateral relative speed;

calculating the fourth difference between the preset safe lateral relative speed and the preset dangerous lateral relative speed;

calculating the ratio of the third difference to the fourth difference to obtain the standard lateral relative speed q between the host vehicle and the target vehicle;

substituting the standard lateral relative speed q into the equation f_1(q) = (1 - q)·lg(1/q) to obtain the lateral correction coefficient f_1(q); and

multiplying the initial complexity f(θ_ij) corresponding to the target vehicle, the longitudinal correction coefficient f_1(p) and the lateral correction coefficient f_1(q) to obtain the complexity corresponding to the target vehicle.

In the solution shown in the embodiments of the present application, when correcting the initial complexity, both the lateral and longitudinal directions can be considered, where the lateral direction is horizontally perpendicular to the traveling direction of the host vehicle and the longitudinal direction is the traveling direction of the host vehicle. The lateral correction can be made based on the lateral relative speed between the host vehicle and the target vehicle, and the longitudinal correction can be made based on the longitudinal distance between the host vehicle and the target vehicle.
In a possible implementation, obtaining the static complexity of the driving scene in which the host vehicle is located based on the static information of the static factors includes:

obtaining the values of the multiple static factors based on their static information, and obtaining the complexities corresponding to the multiple static factors based on the correspondence between the values of the static factors and the complexities; and

obtaining the static complexity of the driving scene in which the host vehicle is located based on the complexities corresponding to the multiple static factors.

In the solution shown in the embodiments of the present application, the value of a static factor can first be determined from its static information. For example, if the static factor is lane width, the corresponding static information can be 3 m, 3.5 m, 4 m, 4.5 m, etc., and the value of the static factor can then be one of two options: wide lane (lane width ≥ 3.5 m) or narrow lane (lane width < 3.5 m). The complexities corresponding to the multiple static factors are then obtained from the correspondence between the values of the static factors and the complexities. Finally, the complexities corresponding to the multiple static factors are added to obtain the static complexity of the driving scene in which the host vehicle is located.

In a possible implementation, determining the static complexity of the driving scene in which the host vehicle is located based on the complexities corresponding to the multiple static factors includes:

obtaining the weights corresponding to the multiple static factors;

multiplying the complexities corresponding to the values of the multiple static factors by the corresponding weights to obtain the weighted complexities corresponding to the multiple static factors; and

adding the weighted complexities corresponding to the multiple static factors to obtain the static complexity of the driving scene in which the host vehicle is located.

In the solution shown in the embodiments of the present application, considering that different static factors affect vehicle driving to different degrees, when determining the static complexity of the driving scene based on the complexities corresponding to the static factors, the complexity corresponding to each static factor in the driving scene can first be multiplied by the weight corresponding to that static factor to obtain its weighted complexity, and the weighted complexities corresponding to the static factors are then added to obtain the static complexity of the driving scene in which the host vehicle is located.
在一种可能的实现方式中,所述方法还包括:
对于第一静态因素的第一取值,获取N个所述第一取值对应的样本图像,其中,N为大于1的整数;
对所述N个样本图像分别进行图像识别,得到每个样本图像对应的预测值;
统计对应的预测值和对应的标定真值不同的样本图像的数目M,其中,所述标定真值用于唯一标识所述第一取值;
获取M与N的比值,作为所述第一取值对应的复杂度。
本申请实施例所示的方案,对于每个静态因素的每个取值,均可以预先确定出该取值对应的复杂度,并将静态因素的取值与对应的复杂度对应存储。在预先确定静态因素的取值对应的复杂度时,可以获取N个该取值对应的样本图像。然后,对N个样本图像分别进行图像识别,得到每个样本图像对应的预测值。统计对应的预测值和对应的标定真值不同的样本图像的数目M。获取M与N的比值,作为该取值对应的复杂度。早获取样本图像时,为了节省人力,可以通过高清街景地图获取各种静态因素的各取值对应的样本图像。
在一种可能的实现方式中,所述静态因素包括道路类型、同向车道数、车道宽度、中央隔离形式、机非隔离形式、交通标志和交通信号灯中的至少一个。
本申请实施例所示的方案,道路类型对应的静态信息可以为城市快速路、主干路、次干路、支路,那么,对应的取值也可以为城市快速路、主干路、次干路、支路。同向车道数对应的静态信息可以为1、2、3、4、5、6等,那么,对应的取值可以为1、2、3、4、大于等于5。车道宽度对应的静态信息可以为3m、3.5m、4m、4.5m等,那么,对应的取值可以为宽车道(车道宽度≥3.5m)、窄车道(车道宽度<3.5m)。中央隔离形式对应的静态信息可以为无中央隔离、标线隔离、硬质隔离栏、绿化隔离,那么,对应的取值也可以为无中央隔离、标线隔离、硬质隔离栏、绿化隔离。机非隔离形式对应的静态信息可以为无机非隔离、标线隔离、硬质隔离栏、绿化隔离,那么,对应的取值也可以为无机非隔离、标线隔离、硬质隔离栏、绿化隔离。交通标志对应的静态信息可以为限速牌限速40km/h,限速牌限速60km/h,无限速牌等,那么,对应的取值可以为有限速牌和无限速牌。交通信号灯对应的静态信息可以为交通信号灯为红灯、交通信号灯为绿灯、交通信号灯为黄灯、无交通信号灯,那么,对应取值可以为有交通信号灯和无交通信号灯。
在一种可能的实现方式中,所述方法还包括:
获取所述本车辆所在行驶场景中的环境因素的环境信息;
基于所述环境因素的环境信息获取所述环境因素的取值;
基于存储的环境因素的取值与复杂度修正系数的对应关系,确定所述本车辆所在行驶场景中的环境因素对应的目标复杂度修正系数;
所述基于所述动态复杂度和所述静态复杂度,确定所述本车辆所在行驶场景的综合复杂度,包括:
将所述动态复杂度和所述静态复杂度相加,乘以所述目标复杂度修正系数,得到所述本车辆所在行驶场景的综合复杂度。
本申请实施例所示的方案,除上述静态因素和目标车辆行驶状况外,环境因素也是影响车辆行驶的重要因素。即环境因素对本车辆所在行驶场景的复杂程度也会有一定的影响。可以通过本车辆的感知系统的摄像头获取行驶场景中的环境图像,并通过预先构建的模式识别算法,在环境图像中识别出各环境因素的环境信息,然后,可以将环境因素的环境信息作为该环境因素的取值。这样,便可以根据环境因素的取值和复杂度修正系数的对应关系,得到本车辆所在行驶场景的环境因素的取值对应的目标复杂度修正系数,并使用目标复杂度修正系数,对动态复杂度和静态复杂度进行修正。
在一种可能的实现方式中,所述环境因素包括光照、天气和路面情况。
本申请实施例所示的方案,光照对应的取值可以包括白天、黄昏或黎明、黑夜有光照、黑夜无光照,天气对应的取值可以包括晴、阴、雨、雪、雾,路面情况对应的取值可以包括干燥、潮湿、积雪、结冰。
在一种可能的实现方式中,所述确定与本车辆满足预设距离条件的目标车辆,包括:
如果存在与所述本车辆在相同车道的前方车辆,则确定所述与所述本车辆在相同车道的前方车辆中与所述本车辆之间的距离最小的前方车辆为参考车辆;
确定所述参考车辆的行驶速度,以及所述参考车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量;
基于所述本车辆的行驶速度、预设最大加速度和预设最小减速度,所述参考车辆的行驶速度和预设最大减速度,以及预设驾驶员反应时间,确定所述本车辆与所述参考车辆之间的第二安全距离;
确定所述参考车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量,和所述第二安全距离中的最小值;
将所述本车辆所在车道以及相邻车道中,与所述本车辆之间的距离在所述本车辆的行驶方向上的分量不大于所述最小值的前方车辆,确定为目标车辆;
如果不存在与所述本车辆在相同车道的前方车辆,则基于所述本车辆的行驶速度、预设最大加速度和预设最小减速度,预设前方车辆行驶速度和所述预设最大减速度,以及所述预设驾驶员反应时间,确定第三安全距离,其中,所述预设前方车辆行驶速度为0;
将所述本车辆所在车道以及相邻车道中,与所述本车辆之间的距离在所述本车辆的行驶方向上的分量不大于所述第三安全距离的前方车辆,确定为目标车辆。
第二方面、提供了一种检测车辆行驶场景的复杂度的装置,所述装置包括:
获取模块,用于获取本车辆的行驶速度和目标车辆的行驶速度,所述目标车辆为与本车辆满足预设距离条件的车辆;
第一确定模块,用于基于所述本车辆的行驶速度与所述目标车辆的行驶速度,获取所述本车辆所在行驶场景的动态复杂度;
第二确定模块,用于获取所述本车辆所在行驶场景中的静态因素的静态信息;
第三确定模块,用于基于所述各静态因素的静态信息,获取所述本车辆所在行驶场景的静态复杂度;
第四确定模块,用于基于所述动态复杂度和所述静态复杂度,获取所述本车辆所在行驶场景的综合复杂度。
在一种可能的实现方式中,所述第一确定模块,用于:
对于每个目标车辆,基于所述本车辆的行驶速度与所述目标车辆的行驶速度,得到所述目标车辆对应的复杂度;
将各目标车辆对应的复杂度相加,得到所述本车辆所在行驶场景的动态复杂度。
在一种可能的实现方式中,所述第一确定模块,用于:
基于所述本车辆的行驶速度与所述目标车辆的行驶速度,得到所述本车辆和所述目标车辆的相对行驶速度;
获取所述本车辆和所述目标车辆的相对行驶速度与所述本车辆的行驶方向的夹角θ ij
基于所述夹角θ ij,得到所述目标车辆对应的复杂度。
在一种可能的实现方式中,所述第一确定模块,用于:
将所述夹角θ ij,代入如下方程:
Figure PCTCN2020082412-appb-000005
得到所述目标车辆对应的初始复杂度f(θ ij);
基于所述目标车辆与所述本车辆之间的距离,以及所述目标车辆与所述本车辆之间的相对行驶速度,对所述目标车辆对应的初始复杂度f(θ ij)进行修正,得到所述目标车辆对应的复杂度。
在一种可能的实现方式中,所述第一确定模块,用于:
基于所述本车辆的行驶速度、最大加速度和最小减速度,所述目标车辆的行驶速度和最大减速度,以及预设的驾驶员反应时间,得到所述本车辆与所述目标车辆之间的第一安全距离;
计算所述目标车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量和预设危险距离之间的第一差值;
计算所述第一安全距离与所述预设危险距离之间的第二差值;
计算所述第一差值和所述第二差值之间的比值,得到所述本车辆与所述目标车辆之间的标准纵向距离p;
将所述标准纵向距离p,代入如下方程:f 1(p)=(1-p)lg(1/p),得到纵向修正系数f 1(p);
计算所述目标车辆与所述本车辆之间的横向相对速度和预设危险横向相对速度之间的第三差值;
计算所述预设安全横向相对速度与所述预设危险横向相对速度之间的第四差值;
计算所述第三差值和所述第四差值之间的比值,得到所述本车辆与所述目标车辆之间的标准横向相对速度q;
将所述标准横向相对速度q,代入如下方程:f 1(q)=(1-q)lg(1/q),得到横向修正系数f 1(q);
将所述目标车辆对应的初始复杂度f(θ ij),所述纵向修正系数f 1(p)和所述横向修正系数f 1(q)相乘,得到所述目标车辆对应的复杂度。
在一种可能的实现方式中,所述第三确定模块,用于:
基于所述多个静态因素的静态信息获取所述多个静态因素的取值,并基于所述多个静态因素的取值与复杂度的对应关系,获取所述多个静态因素分别对应的复杂度;
基于所述多个静态因素分别对应的复杂度,得到所述本车辆所在行驶场景的静态复杂度。
在一种可能的实现方式中,所述第三确定模块,用于:
获取所述多个静态因素分别对应的权重;
将所述多个静态因素的取值对应的复杂度分别乘以相对应的权重,得到所述多个静态因素分别对应的加权复杂度;
将所述多个静态因素对应的加权复杂度相加,得到所述本车辆所在行驶场景的静态复杂度。
在一种可能的实现方式中,所述装置还包括:
统计模块,用于对于第一静态因素的第一取值,获取N个所述第一取值对应的样本图像,其中,N为大于1的整数;对所述N个样本图像分别进行图像识别,得到每个样本图像对应的预测值;统计对应的预测值和对应的标定真值不同的样本图像的数目M,其中,所述标定真值用于唯一标识所述第一取值;获取M与N的比值,作为所述第一取值对应的复杂度。
在一种可能的实现方式中,所述静态因素包括道路类型、同向车道数、车道宽度、中央隔离形式、机非隔离形式、交通标志和交通信号灯中的至少一个。
在一种可能的实现方式中,所述装置还包括:
修正模块,用于获取所述本车辆所在行驶场景中的环境因素的环境信息;
基于所述环境因素的环境信息获取所述环境因素的取值;
基于存储的环境因素的取值与复杂度修正系数的对应关系,得到所述本车辆所在行驶场景中的环境因素的取值对应的目标复杂度修正系数;
所述第四确定模块,用于:
将所述动态复杂度和所述静态复杂度相加,乘以所述目标复杂度修正系数,得到所述本车辆所在行驶场景的综合复杂度。
在一种可能的实现方式中,所述环境因素包括光照、天气和路面情况中的至少一个。
在一种可能的实现方式中,所述获取模块,用于:
如果存在与所述本车辆在相同车道的前方车辆,则确定所述与所述本车辆在相同车道的前方车辆中与所述本车辆之间的距离最小的前方车辆为参考车辆;
获取所述参考车辆的行驶速度,以及所述参考车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量;
基于所述本车辆的行驶速度、预设最大加速度和预设最小减速度,所述参考车辆的行驶速度和预设最大减速度,以及预设驾驶员反应时间,得到所述本车辆与所述参考车辆之间的第二安全距离;
获取所述参考车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量,和所述第二安全距离中的最小值;
将所述本车辆所在车道以及相邻车道中,与所述本车辆之间的距离在所述本车辆的行驶方向上的分量不大于所述最小值的前方车辆,作为目标车辆;
如果不存在与所述本车辆在相同车道的前方车辆,则基于所述本车辆的行驶速度、预设最大加速度和预设最小减速度,预设前方车辆行驶速度和所述预设最大减速度,以及所述预设驾驶员反应时间,得到第三安全距离,其中,所述预设前方车辆行驶速度为0;
将所述本车辆所在车道以及相邻车道中,与所述本车辆之间的距离在所述本车辆的行驶方向上的分量不大于所述第三安全距离的前方车辆,作为目标车辆。
第三方面、提供了一种车辆行驶决策控制器,所述车辆行驶决策控制器包括处理器和存储器;
所述存储器存储有至少一个计算机可读指令,所述计算机可读指令被配置成由所述处理器执行,用于实现如上述第一方面所述的检测车辆行驶场景的复杂度的方法。
第四方面、提供了一种计算机可读存储介质,包括计算机可读指令,当所述计算机可读存储介质在车辆行驶决策控制器上运行时,使得所述车辆行驶决策控制器执行如上述第一方面所述的检测车辆行驶场景的复杂度的方法。
第五方面、提供了一种包含指令的计算机程序产品,当所述计算机程序产品在车辆行驶决策控制器上运行时,使得所述车辆行驶决策控制器执行上述第一方面所述的检测车辆行驶场景的复杂度的方法。
本申请提供的技术方案至少包括以下有益效果:
本申请中除了基于本车辆所在的行驶场景中的各静态因素的静态信息,确定本车辆所在行驶场景的静态复杂度外,还基于本车辆的行驶速度与目标车辆的行驶速度,确定所述本车辆所在行驶场景的动态复杂度,目标车辆为与本车辆满足预设距离条件的车辆。这样,结合了静态因素确定出的静态复杂度和周围车辆的行驶情况确定出的动态复杂度,来综合确定本车辆所在行驶场景的综合复杂度,该综合复杂度可以更加全面地反映出车辆当前所在行驶场景的复杂程度。
附图说明
图1是本申请实施例提供的一种车辆行驶决策控制器的结构示意图;
图2是本申请实施例提供的一种检测车辆行驶场景的复杂度的方法的流程图;
图3是本申请实施例提供的一种目标车辆的示意图;
图4是本申请实施例提供的一种目标车辆的示意图;
图5是本申请实施例提供的一种目标车辆的示意图;
图6是本申请实施例提供的一种检测车辆行驶场景的复杂度的装置的结构示意图。
具体实施方式
为使本申请的目的、技术方案和优点更加清楚,下面将结合附图对本申请实施方式作进一步地详细描述。
本申请实施例提供了一种检测车辆行驶场景的复杂度的方法,该方法可以应用于自动驾驶汽车中。该方法可以由自动驾驶汽车中的车辆行驶决策控制器实现。在自动驾驶汽车中可部署有感知系统、定位系统等。其中,感知系统可以包括有雷达、摄像头等,定位系统可以为全球定位系统(Global Positioning System,GPS)、北斗系统等。
如图1所示是本申请实施例提供的一种车辆行驶决策控制器100的示意图。在图1中,车辆行驶决策控制器可以包括有处理器101和存储器102。处理器101可以是中央处理器(central processing unit,CPU)。处理器101可以是指一个处理器,也可以包括多个处理器。存储器102可以包括易失性存储器,例如随机存取存储器(random access memory,RAM);存储器也可以包括非易失性存储器,例如只读存储器(read-only memory,ROM),快闪存储器等。存储器还可以包括上述种类的存储器的组合。存储器102可以是指一个存储器,也可以包括多个存储器。存储器102中存储有计算机可读指令,该计算机可读指令可以由处理器101执行,以实现本申请实施例提供的检测车辆行驶场景的复杂度的方法。
车辆行驶决策控制器通过感知系统采集到一定范围内的前方车辆的行驶数据,通过定位系统和高精度地图结合获取到本车辆所在行驶场景中的各静态因素的取值,并综合计算出本车辆所在行驶场景的复杂度。
参见图2,本申请实施例提供的一种检测车辆行驶场景的复杂度的方法的处理流程可以包括如下步骤:
步骤201、获取本车辆的行驶速度和目标车辆的行驶速度,其中,目标车辆为与本车辆满足预设距离条件的车辆。
在实施中,可以先确定与本车辆满足预设距离条件的目标车辆,作为后续计算本车辆所在行驶场景的动态复杂度的研究车辆。对于与本车辆满足预设距离条件的目标车辆的确定方法可以如下:
通过雷达探测本车辆所在行驶场景的前方车辆,如果存在与本车辆在相同车道的前方车辆,则确定与本车辆之间的距离最小的前方车辆为参考车辆。然后,通过雷达获取参考车辆的行驶速度,以及该参考车辆与本车辆之间的距离在本车辆的行驶方向上的分量。此处,车辆与本车辆之间的距离在本车辆的行驶方向上的分量也可以称为该车辆与本车辆之间的纵向距离。
再然后,计算该参考车辆和本车辆之间的第二安全距离d min,计算公式可以如下:
Figure PCTCN2020082412-appb-000006
其中,V following为本车辆的行驶速度,V leading为参考车辆的行驶速度,a max,accel为本车辆的预设最大加速度,例如,可以为0.2g,g为重力加速度。a min,brake为本车辆的预设最小减速度,例如,可以为0.3g。b max,brake表示前车的预设最大减速度,例如,可以为0.4g。ρ为预设驾驶员反应时间,例如,可以为2s。
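该第二安全距离的具体公式以附图形式给出,这里按与上述参数一致的责任敏感安全(RSS)模型纵向安全距离的形式给出一个示意性的Python实现(公式形式、函数名与默认参数均为假设,仅供理解,具体应以专利附图中的公式为准):

```python
# 示意性实现:本车辆与前方参考车辆之间的第二安全距离
# (公式形式按RSS纵向安全距离假设;默认参数取自文中示例:0.2g/0.3g/0.4g、反应时间2s)
def compute_safe_distance(v_following, v_leading,
                          a_max_accel=0.2 * 9.8,   # 本车辆预设最大加速度,示例取0.2g
                          a_min_brake=0.3 * 9.8,   # 本车辆预设最小减速度,示例取0.3g
                          b_max_brake=0.4 * 9.8,   # 前车预设最大减速度,示例取0.4g
                          rho=2.0):                # 预设驾驶员反应时间,示例取2s
    """返回安全距离(单位m),结果不小于0。"""
    d = (v_following * rho
         + 0.5 * a_max_accel * rho ** 2
         + (v_following + rho * a_max_accel) ** 2 / (2 * a_min_brake)
         - v_leading ** 2 / (2 * b_max_brake))
    return max(d, 0.0)
```

例如,本车辆以20m/s行驶、参考车辆以15m/s行驶时,按上述示例参数得到的安全距离约为112.5m;本车辆速度越大,所需安全距离越大。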
接着,可以确定参考车辆与本车辆之间的纵向距离和上述确定出的第二安全距离之中的较小值。将本车辆所在车道以及相邻车道中,与本车辆之间的纵向距离不大于该较小值的前方车辆,确定为目标车辆。如图3所示,为第二安全距离小于参考车辆与本车辆之间纵向距离的情况下,确定出的目标车辆。如图4所示,为第二安全距离大于参考车辆与本车辆之间纵向距离的情况下,确定出的目标车辆。
如果不存在与本车辆在相同车道的前方车辆,则可以假设存在与本车辆在相同车道的前方车辆作为参考车辆,且该假设的参考车辆的行驶速度为0。计算该参考车辆和本车辆之间的第三安全距离。该第三安全距离的计算公式与上述第二安全距离的计算公式相同,在此不做赘述。如图5所示,为不存在与本车辆在相同车道的前方车辆的情况下,确定出的目标车辆。
步骤202、基于本车辆的行驶速度与目标车辆的行驶速度,得到本车辆所在行驶场景的动态复杂度。
在实施中,对于每个目标车辆,基于本车辆的行驶速度与该目标车辆的行驶速度,确定该目标车辆对应的复杂度。将确定出的各目标车辆对应的复杂度相加,得到本车辆所在行驶场景的动态复杂度。下面对于确定目标车辆对应的复杂度的方法进行说明。
首先,计算本车辆和目标车辆的相对行驶速度与本车辆的行驶方向的夹角θ ij。计算公式可以如下:
Figure PCTCN2020082412-appb-000007
其中,
Figure PCTCN2020082412-appb-000008
为本车辆和目标车辆的相对横向速度,即,目标车辆的行驶速度在本车辆的行驶方向的水平垂直方向的分量减去本车辆的行驶速度在本车辆的行驶方向的水平垂直方向的分量。
Figure PCTCN2020082412-appb-000009
为本车辆和目标车辆的相对纵向速度,即,目标车辆的行驶速度在本车辆的行驶方向的分量减去本车辆的行驶速度在本车辆的行驶方向的分量。
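夹角θ ij的具体公式同样以附图形式给出,此处假设其可以由上述相对横向速度与相对纵向速度经atan2求得,给出一个示意性实现(函数名与具体计算方式均为本文之外的假设):

```python
import math

# 示意性实现:本车辆与目标车辆的相对行驶速度与本车辆行驶方向的夹角θij
# (假设θij = atan2(相对横向速度, 相对纵向速度),横向为行驶方向的水平垂直方向)
def relative_speed_angle(v_target_lat, v_target_lon, v_host_lat, v_host_lon):
    dv_lat = v_target_lat - v_host_lat  # 相对横向速度:目标车辆横向分量减本车辆横向分量
    dv_lon = v_target_lon - v_host_lon  # 相对纵向速度:目标车辆纵向分量减本车辆纵向分量
    return math.atan2(dv_lat, dv_lon)
```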
然后,将上述计算出的夹角θ ij,代入如下公式:
Figure PCTCN2020082412-appb-000010
即可得到目标车辆对应的初始复杂度f(θ ij)。
再然后,可以基于目标车辆与本车辆之间的距离,以及目标车辆与本车辆之间的相对速度,对目标车辆对应的初始复杂度f(θ ij)进行修正,得到目标车辆对应的复杂度。修正方法可以包括如下步骤:
一、计算纵向修正系数,该纵向修正系数的计算方法可以如下:
首先,计算本车辆与目标车辆之间的第一安全距离。该第一安全距离的计算公式与上述第二安全距离的计算公式相同,在此不做赘述。然后,计算目标车辆与本车辆之间的纵向距离
Figure PCTCN2020082412-appb-000011
和预设危险距离
Figure PCTCN2020082412-appb-000012
之间的第一差值。再然后,计算安全临界值
Figure PCTCN2020082412-appb-000013
与预设危险距离
Figure PCTCN2020082412-appb-000014
之间的第二差值。其中,安全临界值
Figure PCTCN2020082412-appb-000015
可以为本车辆与目标车辆之间的第一安全距离,
Figure PCTCN2020082412-appb-000016
可以为0m。接着,计算第一差值和第二差值之间的比值,得到本车辆与目标车辆之间的标准纵向距离p。该标准纵向距离p可以表示为:
Figure PCTCN2020082412-appb-000017
再然后,将标准纵向距离p代入公式f 1(p)=(1-p)lg(1/p),得到纵向修正系数f 1(p)。
二、计算横向修正系数,该横向修正系数的计算方法可以如下。
首先,计算目标车辆与本车辆之间的横向相对速度
Figure PCTCN2020082412-appb-000018
和预设危险横向相对速度
Figure PCTCN2020082412-appb-000019
之间的第三差值。然后,计算预设安全横向相对速度
Figure PCTCN2020082412-appb-000020
与所述预设危险横向相对速度
Figure PCTCN2020082412-appb-000021
之间的第四差值。其中,
Figure PCTCN2020082412-appb-000022
可以为3.5m/s,表示本车辆和目标车辆在横向相互靠近的速度为3.5m/s,
Figure PCTCN2020082412-appb-000023
可以为-3.5m/s,表示本车辆和目标车辆在横向相互远离的速度为3.5m/s。再然后,计算第三差值和第四差值之间的比值,得到本车辆与目标车辆之间的标准横向相对速度q。该标准横向相对速度q可以表示为:
Figure PCTCN2020082412-appb-000024
再然后,将标准横向相对速度q,代入公式f 1(q)=(1-q)lg(1/q),得到横向修正系数f 1(q)。
三、将初始复杂度f(θ ij),纵向修正系数f 1(p)和横向修正系数f 1(q)相乘,即可得到目标车辆对应的复杂度。此处需要说明的是,上述纵向修正系数和横向修正系数的计算顺序,仅为一种示例,本申请实施例对于二者的计算顺序不做限定,在一种可能的实现方式中二者还可以并行计算。
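上述纵向与横向修正步骤合起来,可以写成如下示意性实现。其中危险距离0m、危险横向相对速度3.5m/s(相互靠近)、安全横向相对速度-3.5m/s(相互远离)取自文中示例;对p、q的截断处理以及函数名均为本文之外的假设:

```python
import math

# 示意性实现:修正系数f1与目标车辆复杂度
def correction_factor(x):
    """f1(x) = (1 - x) * lg(1 / x),x为(0, 1]内的标准化值。"""
    x = min(max(x, 1e-6), 1.0)  # 假设的截断处理,避免对数在0处发散
    return (1.0 - x) * math.log10(1.0 / x)

def corrected_complexity(f_theta, d_lon, d_safe, v_lat_rel,
                         d_danger=0.0, v_danger=3.5, v_safe=-3.5):
    p = (d_lon - d_danger) / (d_safe - d_danger)      # 标准纵向距离p
    q = (v_lat_rel - v_danger) / (v_safe - v_danger)  # 标准横向相对速度q
    return f_theta * correction_factor(p) * correction_factor(q)
```

可以看到,当纵向距离达到安全距离且横向相对速度达到安全值时,p=q=1,两个修正系数均为0;越接近危险阈值,修正系数越大,符合文中"越危险复杂度越高"的设计。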
步骤203、获取本车辆当前所在行驶场景中的各静态因素的静态信息。
其中,静态因素包括道路类型、同向车道数、车道宽度、中央隔离形式、机非隔离形式、交通标志和交通信号灯。
在实施中,可以通过定位系统确定当前所在位置,并通过高精度地图确定本车辆当前所在位置(本车辆所在行驶场景)的各静态因素的静态信息。例如,获取到本车辆当前所在行驶场景中的道路类型为城市快速路,同向车道数为2,车道宽度大于3.5m为宽车道,中央隔离形式为绿化隔离,机非隔离形式为绿化隔离,交通标志为有限速牌,交通信号灯为有交通信号灯。下面对于各静态因素可能的静态信息进行说明。
道路类型的静态信息可以为城市快速路、主干路、次干路、支路。同向车道数的静态信息可以为1、2、3、4、5、6等。车道宽度的静态信息可以为3m、3.5m、4m、4.5m等。中央隔离形式的静态信息可以为无中央隔离、标线隔离、硬质隔离栏、绿化隔离。机非隔离形式的静态信息可以为无机非隔离、标线隔离、硬质隔离栏、绿化隔离。交通标志的静态信息可以为限速牌限速40km/h,限速牌限速60km/h,无限速牌等。交通信号灯的静态信息为交通信号灯为红灯、交通信号灯为绿灯、交通信号灯为黄灯、无交通信号灯。
步骤204、基于各静态因素的静态信息,获取本车辆所在行驶场景的静态复杂度。
在实施中,可以先根据确定出的多个静态因素的静态信息,获取本车辆所在行驶场景中的多个静态因素的取值。然后,根据静态因素的取值与复杂度的对应关系,确定出本车辆所在行驶场景中的多个静态因素的取值分别对应的复杂度。最后,根据多个静态因素分别对应的复杂度,得到本车辆所在行驶场景的静态复杂度。
由于静态因素的静态信息情况过于繁多,如果使每个静态信息均单独对应一个取值,那么,静态因素的取值和复杂度的对应关系也会相对复杂。因此,对于部分静态信息,可以使多个静态信息对应一个取值。下面对于静态因素的静态信息和静态因素的取值之间的关系进行说明。
对于道路类型的静态信息可以直接确定为道路类型的取值,即静态信息城市快速路、主干路、次干路、支路,分别对应的取值也可以为城市快速路、主干路、次干路、支路。对于同向车道数的静态信息为1到4时,可以直接将同向车道数的静态信息确定为同向车道数的取值,对于同向车道数的静态信息为5、6、7等时,可以统一对应取值"≥5"。对于车道宽度的静态信息为3m时,可以对应取值"窄车道(车道宽度<3.5m)",对于车道宽度的静态信息为3.5m、4m、4.5m等时,可以统一对应取值"宽车道(车道宽度≥3.5m)"。对于中央隔离形式的静态信息可以直接确定为中央隔离形式的取值。对于机非隔离形式的静态信息可以直接确定为机非隔离形式的取值。对于交通标志的静态信息为限速牌限速40km/h,限速牌限速60km/h等时,对应的取值可以为"有限速牌",对于交通标志的静态信息为无限速牌时,对应的取值即可以为"无限速牌"。对于交通信号灯对应的静态信息为交通信号灯为红灯、交通信号灯为绿灯、交通信号灯为黄灯时,可以统一对应取值"有交通信号灯",对于交通信号灯对应的静态信息为无交通信号灯时,对应的取值即可以为"无交通信号灯"。
各静态因素的取值情况可以如下表1所示。
表1
静态因素 取值
道路类型 城市快速路、主干路、次干路、支路
同向车道数 1、2、3、4、5及以上
车道宽度 宽车道(车道宽度≥3.5m)、窄车道(车道宽度<3.5m)
中央隔离形式 无中央隔离、标线隔离、硬质隔离栏、绿化隔离
机非隔离形式 无机非隔离、标线隔离、硬质隔离栏、绿化隔离
交通标志 有限速牌、无限速牌
交通信号灯 有交通信号灯、无交通信号灯
静态因素的取值与复杂度的对应关系可以如表2所示。
表2
Figure PCTCN2020082412-appb-000025
对于每个静态因素,可以基于静态因素和权重的对应关系,获取该静态因素对应的权重,将该静态因素对应的复杂度乘以该静态因素对应的权重,得到静态因素的加权复杂度。其中,静态因素和权重的对应关系可以如表3所示。
最后,将各静态因素对应的加权复杂度相加,即可得到本车辆所在行驶场景的静态复杂度。
表3
静态因素 权重
道路类型 W 1
同向车道数 W 2
车道宽度 W 3
中央隔离形式 W 4
机非隔离形式 W 5
交通标志 W 6
交通信号灯 W 7
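上述加权求和过程可以写成如下示意性实现(示例中的因素名称与数值均为假设,实际应按表2、表3的对应关系取值):

```python
# 示意性实现:静态复杂度 = Σ(各静态因素取值对应的复杂度 × 该静态因素的权重)
def static_complexity(factor_complexities, factor_weights):
    return sum(factor_complexities[name] * factor_weights[name]
               for name in factor_complexities)
```

例如,假设"道路类型"对应的复杂度为0.2、权重为0.3,"车道宽度"对应的复杂度为0.1、权重为0.1,则静态复杂度为0.2×0.3+0.1×0.1=0.07。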
在一种可能的实现方式中,技术人员可以预先确定每个静态因素的每个取值对应的复杂度。对于每个静态因素的每个取值对应的复杂度的确定方法可以如下。
对于每个静态因素的每个取值,获取N个该取值对应的样本图像。其中,N为大于1的整数。对于样本图像的获取可以有如下方式:通过高清街景地图截取,实地拍摄或者通过互联网下载等。本申请实施例对于样本图像的获取方法不作限定。然后,对N个样本图像分别进行图像识别,得到每个样本图像对应的预测值。统计对应的预测值和对应的标定真值不同的样本图像的数目M。其中,标定真值用于唯一标识取值。将M与N的比值,确定为该取值对应的复杂度。下面以绿化隔离对应的复杂度的确定为例进行说明。
在高清街景地图中截取100张有绿化隔离的街景图像作为绿化隔离对应的样本图像。当然,也可以实地拍摄100张有绿化隔离的街景图像作为绿化隔离对应的样本图像,或者通过互联网下载100张有绿化隔离的街景图像作为绿化隔离对应的样本图像。然后,可以通过图像识别算法对100张样本图像分别进行图像识别。此处,图像识别算法可以为经过训练的神经网络模型等。对于中央隔离形式的每个取值可以分别对应一个标定真值,例如,无中央隔离为0、标线隔离为1、硬质隔离栏为2、绿化隔离为3。对于每张样本图像经过图像识别后,可以得到该样本图像对应的预测值,如果预测值和标定真值不相同,则认为识别错误,并记录识别错误的样本图像的数目。例如,识别错误的样本图像有90张,则使用识别错误的样本图像数目90除以样本图像总数目100,得到识别错误率为90%,该识别错误率即可以作为绿化隔离对应的复杂度。
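上述识别错误率的统计过程可以概括为如下示意性实现(函数名与输入形式为假设):

```python
# 示意性实现:以图像识别错误率M/N作为静态因素某取值对应的复杂度
# predictions为N个样本图像的识别预测值,ground_truth为该取值的标定真值
def value_complexity(predictions, ground_truth):
    n = len(predictions)
    m = sum(1 for p in predictions if p != ground_truth)  # 识别错误的样本数M
    return m / n
```

例如,100张绿化隔离(标定真值为3)样本图像中有90张被识别为其他取值,则绿化隔离对应的复杂度为0.9,与文中示例一致。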
对于每个静态因素对应的权重可以采用层次分析法确定并存储。下面对于通过层次分析法确定静态因素对应的权重进行说明。
首先,技术人员可以根据静态因素比较时使用的比例标度表,填写静态因素重要性判断矩阵表。该静态因素重要性判断矩阵表可以如表4所示,比例标度表可以如下表5所示。
然后,计算静态因素重要性判断矩阵表中每一行元素b kl的乘积,计算公式可以如下:
Figure PCTCN2020082412-appb-000026
其中,n=7。
再然后,计算M k的n次方根V k,计算公式可以如下:
Figure PCTCN2020082412-appb-000027
最后,计算静态因素重要性判断矩阵的特征向量W k,即静态因素对应的权重,计算公式可以如下:
Figure PCTCN2020082412-appb-000028
表4
Figure PCTCN2020082412-appb-000029
表5
因素k比因素l 量化值
同等重要 1
稍微重要 3
较强重要 5
强烈重要 7
极端重要 9
两相邻判断的中间值 2,4,6,8
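上述层次分析法的权重计算步骤(每行元素乘积M k、其n次方根V k、归一化得到W k)可以写成如下示意性实现(示例判断矩阵为假设,实际应按表4、表5填写,本文中n=7):

```python
# 示意性实现:层次分析法由判断矩阵b_kl计算各静态因素的权重W_k
def ahp_weights(matrix):
    n = len(matrix)
    products = [1.0] * n
    for k, row in enumerate(matrix):            # M_k:第k行元素b_kl的乘积
        for b in row:
            products[k] *= b
    roots = [m ** (1.0 / n) for m in products]  # V_k:M_k的n次方根
    total = sum(roots)
    return [v / total for v in roots]           # W_k:V_k归一化,即为权重
```

例如,对两个因素、判断矩阵[[1, 3], [1/3, 1]](因素1比因素2稍微重要),计算得到的权重为[0.75, 0.25]。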
步骤205、基于动态复杂度和静态复杂度,确定本车辆所在行驶场景的综合复杂度。
在实施中,可以直接将上述确定出的动态复杂度和静态复杂度相加,即可得到本车辆所在行驶场景的综合复杂度。
在一种可能的实现方式中,可以考虑本车辆所在行驶场景的环境因素,来综合确定本车辆所在行驶场景的综合复杂度。确定方法可以如下:获取本车辆所在行驶场景中的环境因素的环境信息。基于环境因素的环境信息获取环境因素的取值。基于存储的环境因素的取值与复杂度修正系数的对应关系,确定本车辆所在行驶场景中的环境因素对应的目标复杂度修正系数。将动态复杂度和静态复杂度相加,乘以目标复杂度修正系数,得到本车辆所在行驶场景的综合复杂度。
其中,环境因素可以包括光照、天气和路面情况。
在实施中,可以通过本车辆的感知系统的摄像头获取行驶场景中的环境图像,并通过预先构建的模式识别算法,在环境图像中识别出各环境因素的环境信息。由于模式识别算法的输出为预先设置的几种可能的环境信息,那么,可以直接将环境因素的环境信息确定为环境因素的取值。各环境因素的取值情况可以如下表6所示。
表6
环境因素 取值
光照 白天、黄昏或黎明、黑夜有光照、黑夜无光照
天气 晴、阴、雨、雪、雾
路面情况 干燥、潮湿、积雪、结冰
各环境因素的取值与复杂度修正系数的对应关系可以如下表7所示,需要说明的是,表7中仅是给出了一种复杂度修正系数的示例。
表7
环境因素 取值(复杂度修正系数)
光照 白天(1)、黄昏或黎明(1.2)、黑夜有光照(1.5)、黑夜无光照(2)
天气 晴(1)、阴(1.2)、雨(1.8)、雪(1.5)、雾(2)
路面情况 干燥(1)、潮湿(1.5)、积雪(1.8)、结冰(2)
然后,可以查询上述各环境因素的取值与复杂度修正系数的对应关系,确定出本车辆所在行驶场景中的每个环境因素对应的复杂度修正系数。再然后,将确定出的每个环境因素对应的复杂度修正系数相乘,得到目标复杂度修正系数。
例如,获取到的各环境因素的取值为:光照为白天,天气为雨、路面情况为潮湿,相应的,可以查询出白天对应的复杂度修正系数为1,雨对应的复杂度修正系数为1.8,潮湿对应的复杂度修正系数为1.5,将三者相乘得到目标复杂度修正系数为2.7。
最后,将得到的目标复杂度修正系数与上述确定出的动态复杂度和静态复杂度之和相乘,即可得到本车辆所在行驶场景的综合复杂度。
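上述查表与修正过程可以写成如下示意性实现(修正系数取自表7的示例数值,函数名为假设):

```python
# 示意性实现:综合复杂度 = (动态复杂度 + 静态复杂度) × 各环境因素修正系数之积
ENV_COEFFS = {
    "光照": {"白天": 1.0, "黄昏或黎明": 1.2, "黑夜有光照": 1.5, "黑夜无光照": 2.0},
    "天气": {"晴": 1.0, "阴": 1.2, "雨": 1.8, "雪": 1.5, "雾": 2.0},
    "路面情况": {"干燥": 1.0, "潮湿": 1.5, "积雪": 1.8, "结冰": 2.0},
}

def overall_complexity(dynamic, static, env_values):
    coeff = 1.0
    for factor, value in env_values.items():
        coeff *= ENV_COEFFS[factor][value]  # 查表得到各环境因素的修正系数并相乘
    return (dynamic + static) * coeff
```

例如,白天、雨、潮湿对应的目标复杂度修正系数为1×1.8×1.5=2.7,与文中示例一致。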
在一种可能的实现方式中,在计算出本车辆所在的行驶场景的综合复杂度之后,可以根据计算出的综合复杂度,确定是否要进行人工接管驾驶提示。
在实施中,该综合复杂度可以在自动驾驶汽车内的显示屏上实时显示,使得驾驶员可以实时了解到车辆所在的行驶环境的复杂度。此外,还可以设置危险复杂度阈值,当计算出的综合复杂度大于该设置的危险复杂度阈值时,可以在显示屏显示人工接管驾驶提示,并且还可以同时进行语音提示,提示驾驶人员当前行驶环境的复杂度较高,请进行人工驾驶。
本申请实施例中除了基于本车辆所在的行驶场景中的各静态因素的静态信息,确定本车辆所在行驶场景的静态复杂度外,还基于本车辆的行驶速度与目标车辆的行驶速度,确定所述本车辆所在行驶场景的动态复杂度,目标车辆为与本车辆满足预设距离条件的车辆。这样,结合了静态因素确定出的静态复杂度和周围车辆的行驶情况确定出的动态复杂度,来综合确定本车辆所在行驶场景的综合复杂度,该综合复杂度可以更加全面地反映出车辆当前所在行驶场景的实际复杂程度。
基于相同的技术构思,本申请实施例还提供了一种检测车辆行驶场景的复杂度的装置,如图6所示,该装置包括:
获取模块610,用于获取本车辆的行驶速度和目标车辆的行驶速度,所述目标车辆为与本车辆满足预设距离条件的车辆。具体可以实现上述步骤201中的获取功能,以及其他隐含步骤
第一确定模块620,用于基于所述本车辆的行驶速度与所述目标车辆的行驶速度,获取所述本车辆所在行驶场景的动态复杂度。具体可以实现上述步骤202中的获取功能,以及其他隐含步骤。
第二确定模块630,用于获取所述本车辆所在行驶场景中的静态因素的静态信息。具体可以实现上述步骤203中的获取功能,以及其他隐含步骤
第三确定模块640,用于基于所述各静态因素的静态信息,获取所述本车辆所在行驶场景的静态复杂度。具体可以实现上述步骤204中的获取功能,以及其他隐含步骤
第四确定模块650,用于基于所述动态复杂度和所述静态复杂度,获取所述本车辆所在行驶场景的综合复杂度。具体可以实现上述步骤205中的获取功能,以及其他隐含步骤
在一种可能的实现方式中,所述第一确定模块620,用于:
对于每个目标车辆,基于所述本车辆的行驶速度与所述目标车辆的行驶速度,得到所述目标车辆对应的复杂度;
将各目标车辆对应的复杂度相加,得到所述本车辆所在行驶场景的动态复杂度。
在一种可能的实现方式中,所述第一确定模块620,用于:
基于所述本车辆的行驶速度与所述目标车辆的行驶速度,得到所述本车辆和所述目标车辆的相对行驶速度;
获取所述本车辆和所述目标车辆的相对行驶速度与所述本车辆的行驶方向的夹角θ ij
基于所述夹角θ ij,得到所述目标车辆对应的复杂度。
在一种可能的实现方式中,所述第一确定模块620,用于:
将所述夹角θ ij,代入如下方程:
Figure PCTCN2020082412-appb-000030
得到所述目标车辆对应的初始复杂度f(θ ij);
基于所述目标车辆与所述本车辆之间的距离,以及所述目标车辆与所述本车辆之间的相对行驶速度,对所述目标车辆对应的初始复杂度f(θ ij)进行修正,得到所述目标车辆对应的复杂度。
在一种可能的实现方式中,所述第一确定模块620,用于:
基于所述本车辆的行驶速度、最大加速度和最小减速度,所述目标车辆的行驶速度和最大减速度,以及预设的驾驶员反应时间,得到所述本车辆与所述目标车辆之间的第一安全距离;
计算所述目标车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量和预设危险距离之间的第一差值;
计算所述第一安全距离与所述预设危险距离之间的第二差值;
计算所述第一差值和所述第二差值之间的比值,得到所述本车辆与所述目标车辆之间的标准纵向距离p;
将所述标准纵向距离p,代入如下方程:f 1(p)=(1-p)lg(1/p),得到纵向修正系数f 1(p);
计算所述目标车辆与所述本车辆之间的横向相对速度和预设危险横向相对速度之间的第三差值;
计算所述预设安全横向相对速度与所述预设危险横向相对速度之间的第四差值;
计算所述第三差值和所述第四差值之间的比值,得到所述本车辆与所述目标车辆之间的标准横向相对速度q;
将所述标准横向相对速度q,代入如下方程:f 1(q)=(1-q)lg(1/q),得到横向修正系数f 1(q);
将所述目标车辆对应的初始复杂度f(θ ij),所述纵向修正系数f 1(p)和所述横向修正系数f 1(q)相乘,得到所述目标车辆对应的复杂度。
在一种可能的实现方式中,所述第三确定模块640,用于:
基于所述多个静态因素的静态信息获取所述多个静态因素的取值,并基于所述多个静态因素的取值与复杂度的对应关系,获取所述多个静态因素分别对应的复杂度;
基于所述多个静态因素分别对应的复杂度,得到所述本车辆所在行驶场景的静态复杂度。
在一种可能的实现方式中,所述第三确定模块640,用于:
获取所述多个静态因素分别对应的权重;
将所述多个静态因素的取值对应的复杂度分别乘以相对应的权重,得到所述多个静态因素分别对应的加权复杂度;
将所述多个静态因素对应的加权复杂度相加,得到所述本车辆所在行驶场景的静态复杂度。
在一种可能的实现方式中,所述装置还包括:
统计模块,用于对于第一静态因素的第一取值,获取N个所述第一取值对应的样本图像,其中,N为大于1的整数;对所述N个样本图像分别进行图像识别,得到每个样本图像对应的预测值;统计对应的预测值和对应的标定真值不同的样本图像的数目M,其中,所述标定真值用于唯一标识所述第一取值;获取M与N的比值,作为所述第一取值对应的复杂度。
在一种可能的实现方式中,所述静态因素包括道路类型、同向车道数、车道宽度、中央隔离形式、机非隔离形式、交通标志和交通信号灯中的至少一个。
在一种可能的实现方式中,所述装置还包括:
修正模块,用于获取所述本车辆所在行驶场景中的环境因素的环境信息;
基于所述环境因素的环境信息获取所述环境因素的取值;
基于存储的环境因素的取值与复杂度修正系数的对应关系,得到所述本车辆所在行驶场景中的环境因素的取值对应的目标复杂度修正系数;
所述第四确定模块650,用于:
将所述动态复杂度和所述静态复杂度相加,乘以所述目标复杂度修正系数,得到所述本车辆所在行驶场景的综合复杂度。
在一种可能的实现方式中,所述环境因素包括光照、天气和路面情况中的至少一个。
在一种可能的实现方式中,所述获取模块610,用于:
如果存在与所述本车辆在相同车道的前方车辆,则确定所述与所述本车辆在相同车道的前方车辆中与所述本车辆之间的距离最小的前方车辆为参考车辆;
获取所述参考车辆的行驶速度,以及所述参考车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量;
基于所述本车辆的行驶速度、预设最大加速度和预设最小减速度,所述参考车辆的行驶速度和预设最大减速度,以及预设驾驶员反应时间,得到所述本车辆与所述参考车辆之间的第二安全距离;
获取所述参考车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量,和所述第二安全距离中的最小值;
将所述本车辆所在车道以及相邻车道中,与所述本车辆之间的距离在所述本车辆的行驶方向上的分量不大于所述最小值的前方车辆,作为目标车辆;
如果不存在与所述本车辆在相同车道的前方车辆,则基于所述本车辆的行驶速度、预设最大加速度和预设最小减速度,预设前方车辆行驶速度和所述预设最大减速度,以及所述预设驾驶员反应时间,得到第三安全距离,其中,所述预设前方车辆行驶速度为0;
将所述本车辆所在车道以及相邻车道中,与所述本车辆之间的距离在所述本车辆的行驶方向上的分量不大于所述第三安全距离的前方车辆,作为目标车辆。
还需要说明的是,上述实施例提供的检测车辆行驶场景的复杂度的装置在检测车辆行驶场景的复杂度时,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将车辆行驶决策控制器的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。另外,上述实施例提供的检测车辆行驶场景的复杂度的装置与检测车辆行驶场景的复杂度的方法实施例属于同一构思,其具体实现过程详见方法实施例,这里不再赘述。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现,当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令,在设备上加载和执行所述计算机程序指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是设备能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质(如软盘、硬盘和磁带等),也可以是光介质(如数字视盘(Digital Video Disk,DVD)等),或者半导体介质(如固态硬盘等)。
本领域普通技术人员可以理解实现上述实施例的全部或部分步骤可以通过硬件来完成,也可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,上述提到的存储介质可以是只读存储器,磁盘或光盘等。
以上所述仅为本申请的示例性实施例,并不用以限制本申请,凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (26)

  1. 一种检测车辆行驶场景的复杂度的方法,其特征在于,所述方法包括:
    获取本车辆的行驶速度和目标车辆的行驶速度,所述目标车辆为与所述本车辆满足预设距离条件的车辆;
    基于所述本车辆的行驶速度与所述目标车辆的行驶速度,获取所述本车辆所在行驶场景的动态复杂度;
    获取所述本车辆所在行驶场景中的静态因素的静态信息;
    基于所述各静态因素的静态信息,获取所述本车辆所在行驶场景的静态复杂度;
    基于所述动态复杂度和所述静态复杂度,获取所述本车辆所在行驶场景的综合复杂度。
  2. 根据权利要求1所述的方法,其特征在于,所述基于所述本车辆的行驶速度与所述目标车辆的行驶速度,获取所述本车辆所在行驶场景的动态复杂度,包括:
    对于每个目标车辆,基于所述本车辆的行驶速度与所述目标车辆的行驶速度,得到所述目标车辆对应的复杂度;
    将各目标车辆对应的复杂度相加,得到所述本车辆所在行驶场景的动态复杂度。
  3. 根据权利要求2所述的方法,其特征在于,所述基于所述本车辆的行驶速度与所述目标车辆的行驶速度,确定所述目标车辆对应的复杂度,包括:
    基于所述本车辆的行驶速度与所述目标车辆的行驶速度,得到所述本车辆和所述目标车辆的相对行驶速度;
    获取所述本车辆和所述目标车辆的相对行驶速度与所述本车辆的行驶方向的夹角θ ij
    基于所述夹角θ ij,得到所述目标车辆对应的复杂度。
  4. 根据权利要求3所述的方法,其特征在于,所述基于所述夹角θ ij,得到所述目标车辆对应的复杂度,包括:
    将所述夹角θ ij,代入如下方程:
    Figure PCTCN2020082412-appb-100001
    得到所述目标车辆对应的初始复杂度f(θ ij);
    基于所述目标车辆与所述本车辆之间的距离,以及所述目标车辆与所述本车辆之间的相对行驶速度,对所述目标车辆对应的初始复杂度f(θ ij)进行修正,得到所述目标车辆对应的复杂度。
  5. 根据权利要求4所述的方法,其特征在于,所述基于所述目标车辆与所述本车辆之间的距离,以及所述目标车辆与所述本车辆之间的相对行驶速度,对所述目标车辆对应的初始复杂度f(θ ij)进行修正,得到所述目标车辆对应的复杂度,包括:
    基于所述本车辆的行驶速度、最大加速度和最小减速度,所述目标车辆的行驶速度和最大减速度,以及预设的驾驶员反应时间,得到所述本车辆与所述目标车辆之间的第一安全距离;
    计算所述目标车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量和预设危险距离之间的第一差值;
    计算所述第一安全距离与所述预设危险距离之间的第二差值;
    计算所述第一差值和所述第二差值之间的比值,得到所述本车辆与所述目标车辆之间的标准纵向距离p;
    将所述标准纵向距离p,代入如下方程:f 1(p)=(1-p)lg(1/p),得到纵向修正系数f 1(p);
    计算所述目标车辆与所述本车辆之间的横向相对速度和预设危险横向相对速度之间的第三差值;
    计算所述预设安全横向相对速度与所述预设危险横向相对速度之间的第四差值;
    计算所述第三差值和所述第四差值之间的比值,得到所述本车辆与所述目标车辆之间的标准横向相对速度q;
    将所述标准横向相对速度q,代入如下方程:f 1(q)=(1-q)lg(1/q),得到横向修正系数f 1(q);
    将所述目标车辆对应的初始复杂度f(θ ij),所述纵向修正系数f 1(p)和所述横向修正系数f 1(q)相乘,得到所述目标车辆对应的复杂度。
  6. 根据权利要求1-5中任一项所述的方法,其特征在于,所述静态因素包括多个静态因素,所述基于所述静态因素的静态信息,获取所述本车辆所在行驶场景的静态复杂度,包括:
    基于所述多个静态因素的静态信息获取所述多个静态因素的取值,并基于所述多个静态因素的取值与复杂度的对应关系,获取所述多个静态因素分别对应的复杂度;
    基于所述多个静态因素分别对应的复杂度,得到所述本车辆所在行驶场景的静态复杂度。
  7. 根据权利要求6所述的方法,其特征在于,所述基于所述多个静态因素分别对应的复杂度,确定所述本车辆所在行驶场景的静态复杂度,包括:
    获取所述多个静态因素分别对应的权重;
    将所述多个静态因素的取值对应的复杂度分别乘以相对应的权重,得到所述多个静态因素分别对应的加权复杂度;
    将所述多个静态因素对应的加权复杂度相加,得到所述本车辆所在行驶场景的静态复杂度。
  8. 根据权利要求6或7中任一项所述的方法,其特征在于,所述方法还包括:
    对于第一静态因素的第一取值,获取N个所述第一取值对应的样本图像,其中,N为大于1的整数;
    对所述N个样本图像分别进行图像识别,得到每个样本图像对应的预测值;
    统计对应的预测值和对应的标定真值不同的样本图像的数目M,其中,所述标定真值用于唯一标识所述第一取值;
    获取M与N的比值,作为所述第一取值对应的复杂度。
  9. 根据权利要求1-8中任一项所述的方法,其特征在于,所述静态因素包括道路类型、同向车道数、车道宽度、中央隔离形式、机非隔离形式、交通标志和交通信号灯中的至少一个。
  10. 根据权利要求1-9中任一项所述的方法,其特征在于,所述方法还包括:
    获取所述本车辆所在行驶场景中的环境因素的环境信息;
    基于所述环境因素的环境信息获取所述环境因素的取值;
    基于存储的环境因素的取值与复杂度修正系数的对应关系,得到所述本车辆所在行驶场景中的环境因素对应的目标复杂度修正系数;
    所述基于所述动态复杂度和所述静态复杂度,获取所述本车辆所在行驶场景的综合复杂度,包括:
    将所述动态复杂度和所述静态复杂度相加,乘以所述目标复杂度修正系数,得到所述本车辆所在行驶场景的综合复杂度。
  11. 根据权利要求10所述的方法,其特征在于,所述环境因素包括光照、天气和路面情况中的至少一个。
  12. 根据权利要求1-11中任一项所述的方法,其特征在于,所述确定与本车辆满足预设距离条件的目标车辆,包括:
    如果存在与所述本车辆在相同车道的前方车辆,则确定所述与所述本车辆在相同车道的前方车辆中与所述本车辆之间的距离最小的前方车辆为参考车辆;
    获取所述参考车辆的行驶速度,以及所述参考车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量;
    基于所述本车辆的行驶速度、预设最大加速度和预设最小减速度,所述参考车辆的行驶速度和预设最大减速度,以及预设驾驶员反应时间,得到所述本车辆与所述参考车辆之间的第二安全距离;
    获取所述参考车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量,和所述第二安全距离中的最小值;
    将所述本车辆所在车道以及相邻车道中,与所述本车辆之间的距离在所述本车辆的行驶方向上的分量不大于所述最小值的前方车辆,作为目标车辆;
    如果不存在与所述本车辆在相同车道的前方车辆,则基于所述本车辆的行驶速度、预设最大加速度和预设最小减速度,预设前方车辆行驶速度和所述预设最大减速度,以及所述预设驾驶员反应时间,得到第三安全距离,其中,所述预设前方车辆行驶速度为0;
    将所述本车辆所在车道以及相邻车道中,与所述本车辆之间的距离在所述本车辆的行驶方向上的分量不大于所述第三安全距离的前方车辆,作为目标车辆。
  13. 一种确定车辆行驶场景的复杂度的装置,其特征在于,所述装置包括:
    获取模块,用于获取本车辆的行驶速度和目标车辆的行驶速度,所述目标车辆为与所述本车辆满足预设距离条件的车辆;
    第一确定模块,用于基于所述本车辆的行驶速度与所述目标车辆的行驶速度,获取所述本车辆所在行驶场景的动态复杂度;
    第二确定模块,用于获取所述本车辆所在行驶场景中的静态因素的静态信息;
    第三确定模块,用于基于所述各静态因素的静态信息,获取所述本车辆所在行驶场景的静态复杂度;
    第四确定模块,用于基于所述动态复杂度和所述静态复杂度,获取所述本车辆所在行驶场景的综合复杂度。
  14. 根据权利要求13所述的装置,其特征在于,所述第一确定模块,用于:
    对于每个目标车辆,基于所述本车辆的行驶速度与所述目标车辆的行驶速度,得到所述目标车辆对应的复杂度;
    将各目标车辆对应的复杂度相加,得到所述本车辆所在行驶场景的动态复杂度。
  15. 根据权利要求14所述的装置,其特征在于,所述第一确定模块,用于:
    基于所述本车辆的行驶速度与所述目标车辆的行驶速度,得到所述本车辆和所述目标车辆的相对行驶速度;
    获取所述本车辆和所述目标车辆的相对行驶速度与所述本车辆的行驶方向的夹角θ ij
    基于所述夹角θ ij,得到所述目标车辆对应的复杂度。
  16. 根据权利要求15所述的装置,其特征在于,所述第一确定模块,用于:
    将所述夹角θ ij,代入如下方程:
    Figure PCTCN2020082412-appb-100002
    得到所述目标车辆对应的初始复杂度f(θ ij);
    基于所述目标车辆与所述本车辆之间的距离,以及所述目标车辆与所述本车辆之间的相对行驶速度,对所述目标车辆对应的初始复杂度f(θ ij)进行修正,得到所述目标车辆对应的复杂度。
  17. 根据权利要求16所述的装置,其特征在于,所述第一确定模块,用于:
    基于所述本车辆的行驶速度、最大加速度和最小减速度,所述目标车辆的行驶速度和最大减速度,以及预设的驾驶员反应时间,得到所述本车辆与所述目标车辆之间的第一安全距离;
    计算所述目标车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量和预设危险距离之间的第一差值;
    计算所述第一安全距离与所述预设危险距离之间的第二差值;
    计算所述第一差值和所述第二差值之间的比值,得到所述本车辆与所述目标车辆之间的标准纵向距离p;
    将所述标准纵向距离p,代入如下方程:f 1(p)=(1-p)lg(1/p),得到纵向修正系数f 1(p);
    计算所述目标车辆与所述本车辆之间的横向相对速度和预设危险横向相对速度之间的第三差值;
    计算所述预设安全横向相对速度与所述预设危险横向相对速度之间的第四差值;
    计算所述第三差值和所述第四差值之间的比值,得到所述本车辆与所述目标车辆之间的标准横向相对速度q;
    将所述标准横向相对速度q,代入如下方程:f 1(q)=(1-q)lg(1/q),得到横向修正系数f 1(q);
    将所述目标车辆对应的初始复杂度f(θ ij),所述纵向修正系数f 1(p)和所述横向修正系数f 1(q)相乘,得到所述目标车辆对应的复杂度。
  18. 根据权利要求13-17中任一项所述的装置,其特征在于,所述第三确定模块,用于:
    基于所述多个静态因素的静态信息获取所述多个静态因素的取值,并基于所述多个静态因素的取值与复杂度的对应关系,获取所述多个静态因素分别对应的复杂度;
    基于所述多个静态因素分别对应的复杂度,得到所述本车辆所在行驶场景的静态复杂度。
  19. 根据权利要求18所述的装置,其特征在于,所述第三确定模块,用于:
    获取所述多个静态因素分别对应的权重;
    将所述多个静态因素的取值对应的复杂度分别乘以相对应的权重,得到所述多个静态因素分别对应的加权复杂度;
    将所述多个静态因素对应的加权复杂度相加,得到所述本车辆所在行驶场景的静态复杂度。
  20. 根据权利要求18或19中任一项所述的装置,其特征在于,所述装置还包括:
    统计模块,用于对于第一静态因素的第一取值,获取N个所述第一取值对应的样本图像,其中,N为大于1的整数;对所述N个样本图像分别进行图像识别,得到每个样本图像对应的预测值;统计对应的预测值和对应的标定真值不同的样本图像的数目M,其中,所述标定真值用于唯一标识所述第一取值;获取M与N的比值,作为所述第一取值对应的复杂度。
  21. 根据权利要求13-20中任一项所述的装置,其特征在于,所述静态因素包括道路类型、同向车道数、车道宽度、中央隔离形式、机非隔离形式、交通标志和交通信号灯中的至少一个。
  22. 根据权利要求13-21中任一项所述的装置,其特征在于,所述装置还包括:
    修正模块,用于获取所述本车辆所在行驶场景中的环境因素的环境信息;
    基于所述环境因素的环境信息获取所述环境因素的取值;
    基于存储的环境因素的取值与复杂度修正系数的对应关系,得到所述本车辆所在行驶场景中的环境因素的取值对应的目标复杂度修正系数;
    所述第四确定模块,用于:
    将所述动态复杂度和所述静态复杂度相加,乘以所述目标复杂度修正系数,得到所述本车辆所在行驶场景的综合复杂度。
  23. 根据权利要求22所述的装置,其特征在于,所述环境因素包括光照、天气和路面情况中的至少一个。
  24. 根据权利要求13-23中任一项所述的装置,其特征在于,所述获取模块,用于:
    如果存在与所述本车辆在相同车道的前方车辆,则确定所述与所述本车辆在相同车道的前方车辆中与所述本车辆之间的距离最小的前方车辆为参考车辆;
    获取所述参考车辆的行驶速度,以及所述参考车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量;
    基于所述本车辆的行驶速度、预设最大加速度和预设最小减速度,所述参考车辆的行驶速度和预设最大减速度,以及预设驾驶员反应时间,得到所述本车辆与所述参考车辆之间的第二安全距离;
    获取所述参考车辆与所述本车辆之间的距离在所述本车辆的行驶方向上的分量,和所述第二安全距离中的最小值;
    将所述本车辆所在车道以及相邻车道中,与所述本车辆之间的距离在所述本车辆的行驶方向上的分量不大于所述最小值的前方车辆,作为目标车辆;
    如果不存在与所述本车辆在相同车道的前方车辆,则基于所述本车辆的行驶速度、预设最大加速度和预设最小减速度,预设前方车辆行驶速度和所述预设最大减速度,以及所述预设驾驶员反应时间,得到第三安全距离,其中,所述预设前方车辆行驶速度为0;
    将所述本车辆所在车道以及相邻车道中,与所述本车辆之间的距离在所述本车辆的行驶方向上的分量不大于所述第三安全距离的前方车辆,作为目标车辆。
  25. 一种车辆行驶决策控制器,其特征在于,所述车辆行驶决策控制器包括处理器和存储器;
    所述存储器存储有至少一个计算机可读指令,所述计算机可读指令被配置成由所述处理器执行,用于实现如权利要求1-12中任一项所述的检测车辆行驶场景的复杂度的方法。
  26. 一种计算机可读存储介质,其特征在于,包括计算机可读指令,当所述计算机可读存储介质在车辆行驶决策控制器上运行时,使得所述车辆行驶决策控制器执行所述权利要求1-12中任一项权利要求所述的检测车辆行驶场景的复杂度的方法。
PCT/CN2020/082412 2020-03-31 2020-03-31 检测车辆行驶场景的复杂度的方法和装置 WO2021195955A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/CN2020/082412 WO2021195955A1 (zh) 2020-03-31 2020-03-31 检测车辆行驶场景的复杂度的方法和装置
EP20928784.6A EP4120180A4 (en) 2020-03-31 2020-03-31 METHOD AND DEVICE FOR MEASURING THE COMPLEXITY OF A VEHICLE MOTION SCENE
CN202080005175.5A CN112740295B (zh) 2020-03-31 2020-03-31 检测车辆行驶场景的复杂度的方法和装置
US17/956,087 US20230050063A1 (en) 2020-03-31 2022-09-29 Method and Apparatus for Detecting Complexity of Traveling Scenario of Vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/082412 WO2021195955A1 (zh) 2020-03-31 2020-03-31 检测车辆行驶场景的复杂度的方法和装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/956,087 Continuation US20230050063A1 (en) 2020-03-31 2022-09-29 Method and Apparatus for Detecting Complexity of Traveling Scenario of Vehicle

Publications (1)

Publication Number Publication Date
WO2021195955A1 true WO2021195955A1 (zh) 2021-10-07

Family

ID=75609568

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/082412 WO2021195955A1 (zh) 2020-03-31 2020-03-31 检测车辆行驶场景的复杂度的方法和装置

Country Status (4)

Country Link
US (1) US20230050063A1 (zh)
EP (1) EP4120180A4 (zh)
CN (1) CN112740295B (zh)
WO (1) WO2021195955A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114638420A (zh) * 2022-03-22 2022-06-17 交通运输部公路科学研究所 道路智能度评测方法及危化品车辆道路级导航方法
CN115107765A (zh) * 2022-06-29 2022-09-27 重庆长安汽车股份有限公司 车辆限速方法、装置、车辆及存储介质
WO2023103459A1 (zh) * 2021-12-07 2023-06-15 中兴通讯股份有限公司 车辆控制方法、决策服务器及存储介质
WO2023201964A1 (zh) * 2022-04-19 2023-10-26 合众新能源汽车股份有限公司 一种跟车目标确定方法、装置、设备及介质

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114590262B (zh) * 2022-03-28 2024-07-09 智己汽车科技有限公司 一种交通参与者静态距离测距准确性测试方法、装置及车辆
CN115376324B (zh) * 2022-10-24 2023-03-24 中国汽车技术研究中心有限公司 车辆场景复杂度确定方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2444947A1 (en) * 2010-10-20 2012-04-25 Yan-Hong Chiang Assistant driving system with video radar
US8447437B2 (en) * 2010-11-22 2013-05-21 Yan-Hong Chiang Assistant driving system with video recognition
CN109017786A (zh) * 2018-08-09 2018-12-18 北京智行者科技有限公司 车辆避障方法
US20190176841A1 (en) * 2017-12-13 2019-06-13 Luminar Technologies, Inc. Training multiple neural networks of a vehicle perception component based on sensor settings

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015014139A1 (de) * 2015-10-31 2017-05-04 Daimler Ag Verfahren zum Betreiben einer Abstands- und Geschwindigkeitsregelfunktion eines Fahrzeugs und Fahrerassistenzsystem zur Durchführung des Verfahrens
US9802599B2 (en) * 2016-03-08 2017-10-31 Ford Global Technologies, Llc Vehicle lane placement
JP6699831B2 (ja) * 2016-04-28 2020-05-27 トヨタ自動車株式会社 運転意識推定装置
CN106205169B (zh) * 2016-07-20 2019-04-19 天津职业技术师范大学 基于车路协同的主干道交叉口进口道车速控制方法
CN107180219A (zh) * 2017-01-25 2017-09-19 问众智能信息科技(北京)有限公司 基于多模态信息的驾驶危险系数评估方法和装置
CN107672597A (zh) * 2017-09-25 2018-02-09 驭势科技(北京)有限公司 一种用于控制车辆驾驶模式的方法与设备
CN107697071B (zh) * 2017-11-06 2019-07-02 东南大学 一种基于场论的驾驶安全等级确定方法及装置
US10745006B2 (en) * 2018-02-01 2020-08-18 GM Global Technology Operations LLC Managing automated driving complexity of the forward path using perception system measures
CN110660270B (zh) * 2018-06-29 2021-09-21 比亚迪股份有限公司 车辆碰撞风险评价模型的建立方法和碰撞风险评价方法
CN108922177B (zh) * 2018-06-29 2021-08-10 东南大学 一种无人驾驶车辆通过交叉路口时速度控制***及方法
CN110660214A (zh) * 2018-06-29 2020-01-07 比亚迪股份有限公司 车辆及其能耗数据的获取方法、装置
CN110579359B (zh) * 2019-09-10 2021-11-09 武汉光庭信息技术股份有限公司 自动驾驶失效场景库的优化方法及***、服务器及介质


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4120180A4 *


Also Published As

Publication number Publication date
CN112740295B (zh) 2022-05-10
CN112740295A (zh) 2021-04-30
US20230050063A1 (en) 2023-02-16
EP4120180A1 (en) 2023-01-18
EP4120180A4 (en) 2023-01-25

Similar Documents

Publication Publication Date Title
WO2021195955A1 (zh) 检测车辆行驶场景的复杂度的方法和装置
US11024165B2 (en) Driver behavior monitoring
CN109377726B (zh) 一种基于车联网的高速公路团雾精确警示、诱导***及方法
US20230019164A1 (en) Image Processing Method and Apparatus
CN112700470B (zh) 一种基于交通视频流的目标检测和轨迹提取方法
WO2020042348A1 (zh) 自动驾驶导航地图的生成方法、***、车载终端及服务器
JP6424761B2 (ja) 運転支援システム及びセンタ
CN105512623B (zh) 基于多传感器雾天行车视觉增强与能见度预警***及方法
WO2019183751A1 (zh) 一种车前积雪与结冰的检测报警方法、存储介质和服务器
EP3674971B1 (en) Method and system for training machine learning algorithm to detect objects at distance
WO2022246852A1 (zh) 基于航测数据的自动驾驶***测试方法、测试***及存储介质
CN103810854B (zh) 一种基于人工标定的智能交通参数检测方法
US10705530B2 (en) Vehicle travel control method and vehicle travel control device
CN111723854B (zh) 一种高速公路交通拥堵检测方法、设备及可读存储介质
WO2023240805A1 (zh) 一种基于滤波校正的网联车超速预警方法及***
CN109753841B (zh) 车道线识别方法和装置
US11727595B2 (en) Assessing visibility of a target object with autonomous vehicle fleet
CN113191030A (zh) 一种自动驾驶测试场景构建方法及装置
US20230148097A1 (en) Adverse environment determination device and adverse environment determination method
CN112687103A (zh) 基于车联网技术的车辆变道的检测方法及***
US20210300356A1 (en) Vehicle uncertainty sharing
CN116142186A (zh) 不良环境下车辆安全行驶预警方法、装置、介质和设备
CN110550041B (zh) 一种基于云端数据共享的路面附着系数估计方法
CN115909240A (zh) 一种基于车道线和车辆识别的道路拥堵检测方法
CN110660141A (zh) 路面状况检测方法、装置、电子设备及可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20928784

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020928784

Country of ref document: EP

Effective date: 20221013

NENP Non-entry into the national phase

Ref country code: DE