CN110816524B - Vehicle control device, vehicle control method, and storage medium

Vehicle control device, vehicle control method, and storage medium

Info

Publication number
CN110816524B
Authority
CN
China
Prior art keywords
vehicle
object information
processing
recognition result
output
Prior art date
Legal status
Active
Application number
CN201910715385.2A
Other languages
Chinese (zh)
Other versions
CN110816524A
Inventor
齐京真理奈
三浦弘
高冈贤人
三井相和
樫本凉
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Honda Motor Co Ltd
Publication of CN110816524A
Application granted
Publication of CN110816524B

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B60W30/095 Predicting travel path or likelihood of collision
    • B60W30/0956 Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Image Processing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Image Analysis (AREA)

Abstract

Provided are an object recognition device, a vehicle control device, an object recognition method, and a storage medium capable of obtaining a preliminary sensor fusion result more quickly. The object recognition device is mounted on a vehicle and includes: a first processing unit that outputs object information by performing, based on an output of a first device for recognizing an object, at least a process A and a process B having more man-hours than the process A; a second processing unit that outputs object information by performing at least a process C and a process D having more man-hours than the process C, based on an output of a second device attached to the vehicle such that the same direction as the first device is set as a detection range; a first integrated identification unit that identifies an object based on the object information obtained as a result of the process A and the object information obtained as a result of the process C; and a second integrated identification unit that identifies the object based on the object information obtained as a result of the process B and the object information obtained as a result of the process D.

Description

Vehicle control device, vehicle control method, and storage medium
Technical Field
The invention relates to an object recognition device, a vehicle control device, an object recognition method, and a storage medium.
Background
A technique of recognizing an object by integrating the outputs of a plurality of devices has been disclosed (for example, Japanese Patent Laid-Open No. 2005-239114). Such a technique is called sensor fusion. In general, recognizing an object by sensor fusion is more reliable than recognizing it based on the output of a single device. Therefore, in the technique described in Patent Document 1, the degree of vehicle control is made smaller when an object is recognized by a single device than when it is recognized by sensor fusion.
When an object is recognized by sensor fusion, the reliability is high, but the processing completes later than when the object is recognized based on the output of a single device. Therefore, even in a situation where preliminary control should be performed for an object still far from the vehicle, the preliminary control may not be performed because the sensor fusion processing has not yet been completed. In this way, with the existing sensor fusion technology, recognition of an object can be delayed.
Disclosure of Invention
The present invention has been made in view of the above circumstances, and an object thereof is to provide an object recognition device, a vehicle control device, an object recognition method, and a storage medium capable of obtaining a preliminary sensor fusion result more quickly.
The object recognition device, the vehicle control device, the object recognition method, and the storage medium according to the present invention adopt the following configurations.
(1) An object recognition device according to an aspect of the present invention is an object recognition device mounted on a vehicle, and includes: a first processing unit that outputs object information by performing, based on an output of a first device for recognizing an object, at least a process A and a process B having more man-hours than the process A; a second processing unit that outputs object information by performing at least a process C and a process D having more man-hours than the process C, based on an output of a second device for recognizing an object, the second device being attached to the vehicle such that the same direction as that of the first device is set as a detection range; a first integrated identification unit that identifies an object based on object information obtained as a result of the process A and object information obtained as a result of the process C; and a second integrated identification unit that identifies an object based on object information obtained as a result of the process B and object information obtained as a result of the process D.
(2) In the aspect of (1) above, the process A is a process performed on data output by the first device over a smaller number of output cycles than the process B, and/or the process C is a process performed on data output by the second device over a smaller number of output cycles than the process D.
(3) In the aspect of (2) above, the first device is a camera, and the process A includes a process of generating the object information based on a smaller number of frames of images than the process B.
(4) In the aspect of (1) above, the process A is a process performed in fewer process steps than the process B, and/or the process C is a process performed in fewer process steps than the process D.
(5) In the aspect of (4) above, the second device is a LIDAR, and the process C includes a process of determining a distance between the reference position and the object and including the determined distance in the object information to output.
(6) In the aspect (5) above, the processing D includes processing for determining the distance and the size of the object and including the determined distance and size in the object information to output.
(7) Another aspect of the present invention relates to a vehicle control device including: the object identifying device according to any one of the above (1) to (6); and a driving control unit that controls one or both of a speed and a steering angle of a vehicle, wherein the driving control unit performs control to adjust a positional relationship between the vehicle and the object using a recognition result of the first integrated recognition unit until a recognition result of the second integrated recognition unit is obtained, and the driving control unit performs control to adjust the positional relationship between the vehicle and the object by increasing a ratio of the recognition results of the second integrated recognition unit to be higher than a ratio of the recognition results of the first integrated recognition unit after the recognition result of the second integrated recognition unit is obtained.
(8) In the aspect of (7) above, the driving control unit sets the control degree of the period until the recognition result of the second integrated recognition unit is obtained to be smaller than the control degree of the period after the recognition result of the second integrated recognition unit is obtained.
(9) In the aspect of (7) or (8) described above, the driving control unit sets a relative control degree with respect to the positional relationship between the vehicle and the object in a period until the recognition result of the second integrated recognition unit is obtained to be smaller than a relative control degree with respect to the positional relationship between the vehicle and the object after the recognition result of the second integrated recognition unit is obtained.
(10) Another aspect of the present invention relates to an object recognition method executed by an object recognition device mounted on a vehicle, the method including: outputting object information by performing, based on an output of a first device for recognizing an object, at least a process A and a process B having more man-hours than the process A; outputting object information by performing at least a process C and a process D having more man-hours than the process C, based on an output of a second device for recognizing an object, the second device being attached to the vehicle such that the same direction as the first device is set as a detection range; identifying an object based on the object information obtained by the process A and the object information obtained by the process C; and identifying an object based on the object information obtained by the process B and the object information obtained by the process D.
(11) Another aspect of the present invention relates to a storage medium storing a program to be executed by a processor of an object recognition device mounted on a vehicle, the program causing the processor to perform: outputting object information by performing, based on an output of a first device for recognizing an object, at least a process A and a process B having more man-hours than the process A; outputting object information by performing at least a process C and a process D having more man-hours than the process C, based on an output of a second device for recognizing an object, the second device being attached to the vehicle such that the same direction as the first device is set as a detection range; identifying an object based on the object information obtained by the process A and the object information obtained by the process C; and identifying an object based on the object information obtained by the process B and the object information obtained by the process D.
According to the aspects (1) to (11), a preliminary sensor fusion result can be obtained more quickly.
Drawings
Fig. 1 is a configuration diagram of a vehicle system 1 using a first embodiment of an object recognition device 300.
Fig. 2 is a block diagram of the object recognition apparatus 300.
Fig. 3 is a diagram illustrating a travel scene of the host vehicle M and the processing performed by the vehicle system 1 in that scene.
Fig. 4 is a flowchart showing an example of the flow of the processing executed by the first processing unit 310.
Fig. 5 is a flowchart showing an example of the flow of processing executed by the second processing unit 320.
Fig. 6 is a flowchart showing an example of the flow of processing executed by the first integrated identification unit 330.
Fig. 7 is a flowchart showing an example of the flow of processing executed by the second integrated identification unit 340.
Fig. 8 is a diagram showing an example of the relationship between the distance and the detection accuracy.
Fig. 9 is a diagram showing an example of the relationship between the distance and the detection accuracy.
Fig. 10 is a diagram showing an example of the relationship between the distance and the detection accuracy in the embodiment.
Fig. 11 is a configuration diagram of a vehicle system 2 using a second embodiment of an object recognition device 300.
Detailed Description
Embodiments of an object recognition device, a vehicle control device, an object recognition method, and a storage medium according to the present invention will be described below with reference to the drawings.
< first embodiment >
[ Overall structure of the vehicle ]
Fig. 1 is a configuration diagram of a vehicle system 1 using a first embodiment of an object recognition device 300. The vehicle on which the vehicle system 1 is mounted is, for example, a two-wheeled, three-wheeled, or four-wheeled vehicle, and its driving source is an internal combustion engine such as a diesel engine or a gasoline engine, an electric motor, or a combination thereof. When an electric motor is provided, the electric motor operates using electric power generated by a generator connected to the internal combustion engine, or electric power discharged from a secondary battery or a fuel cell.
The vehicle system 1 includes, for example, a camera 10, a radar device 12, a probe 14, an object recognition device 300, a communication device 20, an HMI (Human Machine Interface) 30, a vehicle sensor 40, a navigation device 50, an MPU (Map Positioning Unit) 60, a driving operation Unit 80, an automatic driving control device 100, a traveling driving force output device 200, a brake device 210, and a steering device 220. The above-described apparatuses and devices are connected to each other by a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication Network, or the like. The configuration shown in fig. 1 is merely an example, and a part of the configuration may be omitted, or another configuration may be further added. The camera 10 and the detector 14 are examples of a first device and a second device, respectively.
The camera 10 is a digital camera using a solid-state imaging Device such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor). One or more cameras 10 are mounted on an arbitrary portion of a vehicle (hereinafter, referred to as a host vehicle M) on which the vehicle system 1 is mounted. When shooting the front, the camera 10 is attached to the upper part of the front windshield, the rear surface of the interior mirror, or the like. The camera 10 repeatedly captures the periphery of the host vehicle M periodically, for example. The camera 10 may also be a stereo camera.
The radar device 12 radiates radio waves such as millimeter waves around the host vehicle M and detects radio waves (reflected waves) reflected by an object to detect at least the position (distance and direction) of the object. One or more radar devices 12 are mounted on an arbitrary portion of the host vehicle M. The radar device 12 may detect the position and velocity of the object by FM-CW (Frequency Modulated Continuous Wave) method.
The detector 14 is a LIDAR (Light Detection and Ranging) sensor. The detector 14 irradiates the periphery of the host vehicle M with light and measures the scattered light. The detector 14 detects the distance to an object based on the time from light emission to light reception. The irradiated light is, for example, pulsed laser light. One or more detectors 14 are mounted on arbitrary portions of the host vehicle M. The detector 14 irradiates light while shifting the angle in the horizontal direction, for example, and measures the scattered light. The detector 14 outputs information on each reflection point, in association with the irradiation angle at that time, to the object recognition device 300.
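For reference, the time-of-flight relation underlying this kind of distance measurement is the general formula d = c·Δt/2, where Δt is the time from light emission to light reception and c is the speed of light; the division by two accounts for the round trip of the light. This is a general physical relation and not a detail claimed by the patent.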
The object recognition device 300 performs a sensor fusion process on the detection results of some or all of the camera 10, the radar device 12, and the probe 14 to recognize the position, the type, the speed, and the like of the object. The object recognition device 300 outputs the recognition result to the automatic driving control device 100. Further, the object recognition device 300 may output the detection results of the camera 10, the radar device 12, and the detector 14 directly to the automatic driving control device 100 as necessary. Details of the object recognition device 300 will be described later.
The communication device 20 communicates with other vehicles present in the vicinity of the host vehicle M by using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), DSRC (Dedicated Short Range Communication), or the like, or communicates with various server devices via a wireless base station.
The HMI30 presents various information to the passenger of the host vehicle M and accepts input operations by the passenger. The HMI30 includes various display devices, a speaker, a buzzer, a touch panel, switches, keys, and the like.
The vehicle sensors 40 include a vehicle speed sensor that detects the speed of the own vehicle M, an acceleration sensor that detects acceleration, a yaw rate sensor that detects an angular velocity about a vertical axis, an orientation sensor that detects the orientation of the own vehicle M, and the like.
The Navigation device 50 includes, for example, a GNSS (Global Navigation Satellite System) receiver 51, a Navigation HMI52, and a route determination unit 53, and holds first map information 54 in a storage device such as an HDD (Hard Disk Drive) or a flash memory. The GNSS receiver 51 determines the position of the own vehicle M based on the signals received from the GNSS satellites. The position of the host vehicle M may also be determined or supplemented by an INS (Inertial Navigation System) that utilizes the output of the vehicle sensors 40. The navigation HMI52 includes a display device, a speaker, a touch panel, keys, and the like. The navigation HMI52 may also be partially or wholly shared with the aforementioned HMI 30. The route determination unit 53 determines a route (hereinafter, referred to as an on-map route) from the position of the own vehicle M (or an arbitrary input position) specified by the GNSS receiver 51 to the destination input by the passenger using the navigation HMI52, for example, with reference to the first map information 54. The first map information 54 is, for example, information representing a road shape by a line representing a road and nodes connected by the line. The on-map route determined by the route determination unit 53 is output to the MPU 60. The navigation device 50 may perform route guidance using the navigation HMI52 based on the on-map route determined by the route determination unit 53. The navigation device 50 may be realized by a function of a terminal device such as a smartphone or a tablet terminal held by a passenger. The navigation apparatus 50 may transmit the current position and the destination to the navigation server via the communication apparatus 20, and acquire the on-map route returned from the navigation server.
The MPU60 functions as, for example, a recommended lane determining unit 61, and holds second map information 62 in a storage device such as an HDD or a flash memory. The recommended lane determining unit 61 divides the route provided from the navigation device 50 into a plurality of sections (for example, every 100m in the vehicle traveling direction), and determines the recommended lane for each section with reference to the second map information 62. When there is a branch point, a junction point, or the like in the route, the recommended lane determining unit 61 determines the recommended lane so that the host vehicle M can travel on a reasonable route for traveling to the branch point.
The second map information 62 is map information with higher accuracy than the first map information 54. The second map information 62 includes, for example, information on the center of a lane, information on the boundary of a lane, and the like. The second map information 62 may include road information, traffic control information, address information (address, zip code), facility information, telephone number information, and the like. The second map information 62 may also be updated at any time by using the communication device 20 to access other devices.
The driving operation element 80 includes, for example, an accelerator pedal, a brake pedal, a shift lever, a steering wheel, a joystick, and other operation elements. A sensor that detects the amount of operation or the presence or absence of operation is attached to the driving operation element 80, and the detection result of the sensor is output to the automatic driving control device 100, or to some or all of the running driving force output device 200, the brake device 210, and the steering device 220.
The automatic driving control device 100 includes, for example, a first control unit 120 and a second control unit 160. The first control Unit 120 and the second control Unit 160 are each realized by a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (including Circuit Unit) such as LSI (Large Scale Integration), ASIC (Application Specific Integrated Circuit), FPGA (Field-Programmable Gate Array), GPU (Graphics Processing Unit), or the like, or may be realized by cooperation of software and hardware. The object recognition device 300 is an example of a vehicle control device in combination with the automatic driving control device 100. The object recognition device 300 may be a function of the automatic driving control device 100.
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160. The first control unit 120 includes, for example, a recognition unit 130, a travel track prediction unit 140, and an action plan generation unit 150.
The recognition unit 130 recognizes the surrounding situation of the host vehicle M based on information input from the camera 10, the radar device 12, and the probe 14 via the object recognition device 300. The surrounding situation recognized by the recognition unit includes various objects. The position of the object is first recognized as a position on absolute coordinates with the representative point (sensor position, center of gravity, center of drive axis, etc.) of the host vehicle M as the origin, and is converted into a position on road coordinates along the road as necessary for control. In the present embodiment, the recognition unit 130 is designed to exclusively acquire information (object information) relating to an object from the object recognition device 300.
The surrounding situation recognized by the recognition unit 130 may include road structures, other vehicles, the relative position and posture of the host vehicle M with respect to the traveling lane, the states of bicycles and pedestrians, and road items and other information such as temporary stop lines, obstacles, red lights, and toll booths. The recognition result of the recognition unit 130 is output to the travel track prediction unit 140 and the action plan generation unit 150.
The action plan generating unit 150 generates a target trajectory along which the host vehicle M will travel in the future so that, in principle, the host vehicle M travels on the recommended lane determined by the recommended lane determining unit 61 and does not come into contact with an object recognized by the object recognition device 300 or the recognition unit 130. The target trajectory contains, for example, a plurality of trajectory points and speed elements. For example, the target trajectory may take the form of a sequence of points (track points) that the host vehicle M should reach. A track point is a point that the host vehicle M should reach at every predetermined travel distance (for example, every several meters) along the route, and, separately, a target speed and a target acceleration for every predetermined sampling time (for example, every several tenths of a second) are generated as part of the target trajectory. A track point may instead be a position that the host vehicle M should reach at each predetermined sampling time. In this case, the information on the target speed and the target acceleration is expressed by the interval between the track points.
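Purely as an illustration of the data structure described above (the class and field names are hypothetical and do not appear in the patent), the target trajectory can be pictured as follows:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TrackPoint:
    x: float                   # position along the route [m]
    y: float                   # lateral offset [m]
    target_speed: float        # target speed at this point [m/s]
    target_accel: float        # target acceleration at this point [m/s^2]

@dataclass
class TargetTrajectory:
    # Track points the host vehicle M should reach, ordered along the route.
    # When points are sampled at fixed time intervals, the speed information
    # is implied by the spacing between consecutive points.
    points: List[TrackPoint]
```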
The second control unit 160 controls the running driving force output device 200, the brake device 210, and the steering device 220 so that the host vehicle M passes along the target trajectory generated by the action plan generation unit 150 at the scheduled times. The second control unit 160 acquires information on the target trajectory (track points) generated by the action plan generating unit 150, stores it in a memory (not shown), and controls the running driving force output device 200 or the brake device 210 based on the speed elements attached to the target trajectory stored in the memory. The second control unit 160 controls the steering device 220 in accordance with the degree of curvature of the target trajectory stored in the memory. In this way, the action plan generating unit 150 determines the target trajectory to which the speed elements are added, and both acceleration/deceleration and steering of the host vehicle M are controlled.
The running driving force output device 200 outputs a running driving force (torque) for running the vehicle to the drive wheels. The running driving force output device 200 includes, for example, a combination of an internal combustion engine, an electric motor, a transmission, and the like, and an ECU that controls them. The ECU controls the above configuration in accordance with information input from the second control unit 160 or information input from the driving operation element 80.
The brake device 210 includes, for example, a caliper, a hydraulic cylinder that transmits hydraulic pressure to the caliper, an electric motor that generates hydraulic pressure in the hydraulic cylinder, and a brake ECU. The brake ECU controls the electric motor in accordance with information input from the second control unit 160 or information input from the driving operation element 80, and outputs a braking torque corresponding to a braking operation to each wheel. The brake device 210 may be provided with a mechanism for transmitting the hydraulic pressure generated by the operation of the brake pedal included in the driving operation element 80 to the hydraulic cylinder via the master cylinder as a backup. The brake device 210 is not limited to the above-described configuration, and may be an electronically controlled hydraulic brake device that controls an actuator in accordance with information input from the second control unit 160 and transmits the hydraulic pressure of the master cylinder to the hydraulic cylinder.
The steering device 220 includes, for example, a steering ECU and an electric motor. The electric motor changes the orientation of the steered wheels by applying a force to a rack-and-pinion mechanism, for example. The steering ECU drives the electric motor in accordance with information input from the second control unit 160 or information input from the driving operation element 80 to change the direction of the steered wheels.
[ object recognition device and control associated therewith ]
Fig. 2 is a block diagram of the object recognition apparatus 300. The object recognition device 300 includes, for example, a first processing unit 310, a second processing unit 320, a first integrated recognition unit 330, and a second integrated recognition unit 340. These components are realized by executing a program (software) by a hardware processor such as a CPU. Some or all of these components may be realized by hardware (including circuit units) such as LSIs, ASICs, FPGAs, GPUs, or the like, or may be realized by cooperation of software and hardware. In addition, these functional units need not be realized by a single processor, but may be realized by performing distributed processing by a plurality of processors.
The first processing unit 310 analyzes an image captured by the camera 10, for example, generates object information, and outputs the object information. That is, the first processing unit 310 functions as a so-called image analysis unit. The first processing unit 310 may generate the object information by extracting contours in the image and performing pattern matching, or may generate the object information by using a classifier trained by deep learning or the like. The processes performed by the first processing unit 310 include at least a process A and a process B having more man-hours than the process A.
The process A is, for example, the following process: the distance, height, and size of an object shown in the captured image are derived (obtained) based on k frames of images, and the derived contents are output as object information. For example, k is 1.
The process B includes, for example: a process of deriving (obtaining) the distance, height, and size of an object shown in the captured image based on n frames of images; a process of targeting the object; and a process of outputting the targeted content as object information. For example, n is a natural number of 2 or more; k and n may be any natural numbers as long as the relationship k < n holds. Targeting refers to the following processing: the type of the object (four-wheeled vehicle, two-wheeled vehicle, bicycle, pedestrian, falling object, etc.) is estimated from the distance and speed of the object derived by the processing based on the plurality of frames, and the estimation result is associated with the distance, height, size, and the like of the object. The speed of the object needs to be derived from the relative speed by taking the speed of the host vehicle M into account; the object recognition device 300 may, for example, obtain the speed of the host vehicle M from the automatic driving control device 100 and use it for this calculation. The process B may also include approximation processing based on a Kalman filter or the like.
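As a minimal sketch of the difference between the two camera-side processes (the class, field names, and the type-estimation thresholds below are assumptions introduced only for illustration, not the patent's implementation):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:                 # one object seen by the camera (fields assumed)
    distance: float              # [m]
    height: float                # [m]
    size: float                  # [m]
    relative_speed: float = 0.0  # [m/s], meaningful only when tracked over frames

def process_a(detections: List[Detection]) -> List[dict]:
    """Process A: object information from detections in a single frame (k = 1)."""
    return [{"distance": d.distance, "height": d.height, "size": d.size}
            for d in detections]

def estimate_type(size: float, speed: float) -> str:
    """Rough stand-in for targeting's type estimation (thresholds assumed)."""
    if size > 1.5:
        return "vehicle"
    return "pedestrian" if speed < 3.0 else "two-wheeled vehicle"

def process_b(tracked: List[Detection], ego_speed: float) -> List[dict]:
    """Process B: object information from objects tracked over n >= 2 frames, with targeting."""
    objects = []
    for t in tracked:
        # Combine relative speed with the host vehicle speed; the sign convention
        # depends on how the relative speed is measured.
        speed = t.relative_speed + ego_speed
        objects.append({"distance": t.distance, "height": t.height, "size": t.size,
                        "speed": speed, "type": estimate_type(t.size, speed)})
    return objects
```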
The second processing unit 320 processes the detection result output from the probe 14, for example, generates object information based on the processed information, and outputs the object information. The processes performed by the second processing unit 320 include at least process C and process D having more man-hours than process C.
The processing C is, for example, processing of obtaining the distance (and orientation) of the object based on the detection result output by the probe 14 and outputting the obtained information as object information.
The process D includes, for example: a process of obtaining the distance (and orientation) and size of the object based on the detection result output by the probe 14; a process of targeting the object; and a process of outputting the targeted content as object information. As described above, the probe 14 is controlled so as to operate while changing the irradiation direction of the light; therefore, for example, when reflection points located at an equivalent distance are arranged between the irradiation angles θ1 and θ2, the lateral dimension of the object can be approximated by multiplying the distance by the absolute value |θ2 − θ1| of the angle difference. The process D includes such a process. The process D may also include approximation processing based on a Kalman filter or the like. The targeting is the same as in the process B.
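A minimal sketch of the LIDAR-side processing, assuming a scan is given as a list of (irradiation angle, distance) reflection points; the tolerance value and the simplifications are illustrative assumptions, not the patent's implementation:

```python
from typing import List, Tuple, Optional

def process_c(scan: List[Tuple[float, float]]) -> Optional[dict]:
    """Process C: distance (and orientation) of the nearest reflection point."""
    if not scan:
        return None
    angle, distance = min(scan, key=lambda p: p[1])
    return {"distance": distance, "azimuth": angle}

def process_d(scan: List[Tuple[float, float]], tolerance: float = 0.3) -> Optional[dict]:
    """Process D: distance plus an approximate lateral size of the object."""
    info = process_c(scan)
    if info is None:
        return None
    # Reflection points lying at an equivalent distance (within an assumed tolerance)
    # between angles theta1 and theta2 span a lateral width of roughly
    # distance * |theta2 - theta1|.
    angles = [a for a, d in scan if abs(d - info["distance"]) < tolerance]
    theta1, theta2 = min(angles), max(angles)
    info["size"] = info["distance"] * abs(theta2 - theta1)
    return info
```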
The object information obtained as a result of the process A (hereinafter referred to as object information A) and the object information obtained as a result of the process C (hereinafter referred to as object information C) are output to the first integrated identification unit 330. "Output" is used here in a broad sense, and may refer to an operation involving physical "transmission" from an output source to an output destination, or may refer to an operation of writing the information into an area of memory accessible to the output destination.
The first integrated identification unit 330 identifies an object based on the object information A and the object information C. The recognition result (first integrated recognition result) of the first integrated identification unit 330 is output to the automatic driving control device 100 as preliminary-report information. The first integrated identification unit 330 identifies that the object is present at the distance in question when the difference between the distance of the object included in the object information A and the distance of the object included in the object information C is within a predetermined range. That is, when the first processing unit 310 determines that a certain object is present at the position of the predetermined distance X1 in a one-frame image from the camera 10, and the second processing unit 320 likewise determines, based on the output of the detector 14, that an object is present at a position close to the predetermined distance X1, the first integrated identification unit 330 generates a first integrated recognition result indicating that the object is present at the position of the predetermined distance X1 and outputs the first integrated recognition result to the automatic driving control device 100.
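A sketch of this early matching rule, assuming object information is carried as dictionaries with a "distance" field; the allowable gap is an arbitrary illustrative value:

```python
def first_integrated_recognition(obj_info_a: dict, obj_info_c: dict,
                                 max_distance_gap: float = 2.0):
    """Report that an object exists when the camera-derived and LIDAR-derived
    distances agree within a predetermined range (value assumed here)."""
    if abs(obj_info_a["distance"] - obj_info_c["distance"]) <= max_distance_gap:
        return {"present": True, "distance": obj_info_a["distance"]}
    return None
```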
The object information obtained as a result of the process B (hereinafter referred to as object information B) and the object information obtained as a result of the process D (hereinafter referred to as object information D) are output to the second integrated identification unit 340.
The second integrated identification unit 340 identifies an object based on the object information B and the object information D. The recognition result (second integrated recognition result) of the second integrated identification unit 340 is output to the automatic driving control device 100 as information having higher reliability than the recognition result of the first integrated identification unit 330. When the targeted information included in both the object information B and the object information D matches within an allowable range and the estimated object types match, the second integrated identification unit 340 identifies that an object of the estimated type exists at the position indicated by the matching portions of the object information B and the object information D, and outputs the identification result to the automatic driving control device 100. For example, when the first processing unit 310 determines that a "vehicle" is present at the position of the predetermined distance X2 in the images of a plurality of frames from the camera 10, and the second processing unit 320 likewise determines, based on the output of the detector 14, that a "vehicle" is present at a position close to the predetermined distance X2 (for example, when the size of the object included in the object information matches the size of a vehicle in the lateral direction and the object moves at a speed unlike that of a pedestrian on foot), the second integrated identification unit 340 generates a second integrated recognition result indicating that the "vehicle" is present at the position of the predetermined distance X2 and outputs the second integrated recognition result to the automatic driving control device 100.
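Correspondingly, a sketch of the higher-reliability matching rule; the field names and the threshold are again assumptions for illustration:

```python
def second_integrated_recognition(obj_info_b: dict, obj_info_d: dict,
                                  max_distance_gap: float = 2.0):
    """Require agreement of both the targeted position and the estimated object
    type before reporting the high-reliability (second) recognition result."""
    if (abs(obj_info_b["distance"] - obj_info_d["distance"]) <= max_distance_gap
            and obj_info_b.get("type") == obj_info_d.get("type")):
        return {"present": True,
                "distance": obj_info_b["distance"],
                "type": obj_info_b["type"]}
    return None
```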
In the automatic driving control device 100, for example, the action plan generating unit 150 generates a target trajectory on which the host vehicle M will travel so as not to contact an object indicated by each recognition result based on each of the first integrated recognition result and the second integrated recognition result, and performs vehicle control (adjustment of the positional relationship between the host vehicle M and the object).
At this time, the action plan generating unit 150 performs control for adjusting the positional relationship between the host vehicle M and the object using the first integrated recognition result during the period after the first integrated recognition result is obtained and until the second integrated recognition result is obtained, and, after the second integrated recognition result is obtained, performs control for adjusting the positional relationship between the host vehicle M and the object with the ratio of the second integrated recognition result increased so as to be higher than the ratio of the first integrated recognition result. The "increased ratio" includes the case where the ratio of the second integrated recognition result is 100% and the ratio of the first integrated recognition result is 0%.
The action plan generating unit 150 makes the degree of control in the period from when the first integrated recognition result is obtained until the second integrated recognition result is obtained smaller than the degree of control after the second integrated recognition result is obtained. "Reducing the degree of control" refers to reducing the degree of control applied to the running driving force output device 200, the brake device 210, or the steering device 220. When the degree of control is to be reduced, the action plan generating unit 150 may generate the target trajectory so that control amounts such as the braking torque, the acceleration torque, and the amount of change of the steering angle are reduced, or may generate the target trajectory so that vehicle behavior amounts such as the acceleration, the deceleration, and the turning angle that appear as a result of the control are reduced.
The above-described reduction in the degree of control may be based on an absolute criterion that does not depend on the positional relationship between the host vehicle M and the object, or on a relative criterion that depends on the positional relationship between the host vehicle M and the object. In the former case, for example, the upper limit of the braking torque during the period from when the first integrated recognition result is obtained until the second integrated recognition result is obtained is set smaller than the upper limit of the braking torque after the second integrated recognition result is obtained, thereby reducing the degree of control. In the latter case, for example, when the braking torque is determined so as to tend to be inversely proportional to the TTC (Time To Collision) between the object and the host vehicle M, the gain used for calculating the braking torque during the period from when the first integrated recognition result is obtained until the second integrated recognition result is obtained may be set smaller than the gain after the second integrated recognition result is obtained. In addition, the action plan generating unit 150 may perform limiting control (preliminary deceleration) such as "decelerating to an upper limit speed when the speed of the host vehicle M exceeds the upper limit speed" during the period from when the first integrated recognition result is obtained until the second integrated recognition result is obtained.
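As one way to picture the reduced degree of control (all numeric constants below are assumptions; the patent does not specify concrete gains or limits):

```python
def braking_torque(ttc: float, second_result_available: bool,
                   base_gain: float = 1.0, preliminary_gain: float = 0.5,
                   preliminary_cap: float = 0.3) -> float:
    """Braking torque that tends to be inversely proportional to TTC.  Before the
    second integrated recognition result is available, both the gain (relative
    criterion) and the upper limit (absolute criterion) are kept smaller."""
    gain = base_gain if second_result_available else preliminary_gain
    torque = gain / max(ttc, 0.1)              # inverse-to-TTC tendency
    if not second_result_available:
        torque = min(torque, preliminary_cap)  # smaller upper limit before confirmation
    return torque
```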
Fig. 3 is a diagram exemplifying a travel scene of the host vehicle M and the processing executed by the vehicle system 1 in that scene. In the figure, an arrow D indicates the traveling direction of the host vehicle M. An obstacle OB exists ahead of the host vehicle M in the traveling direction. Since actual distances vary depending on the performance of the devices and other factors, the numerical values are merely examples: only the first integrated recognition result starts to be obtained at a point 200 [m] away from the obstacle OB. Accordingly, the action plan generating unit 150 performs control such as the aforementioned preliminary braking. The second integrated recognition result starts to be obtained, for example, at a point 150 [m] away from the obstacle OB. Accordingly, the action plan generating unit 150 performs braking control corresponding to the TTC between the host vehicle M and the obstacle OB, and stops the host vehicle M as necessary (for example, when avoidance by steering is impossible).
[ Processing flow ]
Fig. 4 is a flowchart showing an example of the flow of the processing executed by the first processing unit 310. In the processing of the present flowchart, for example, a routine is executed each time an image of one frame is input from the camera 10.
First, the first processing unit 310 performs a process of extracting a target region to be processed from an image input from the camera 10 (step S100). The target region is a region other than the left and right ends and the upper portion of the image, and is a region captured in front of the host vehicle M.
Next, the first processing unit 310 performs object extraction processing in the target region (step S102). The details of the processing in this step are as described above. Then, the first processing unit 310 determines whether or not an object is detected (extracted) in the image (step S104). Here, an object is preferably defined so as to satisfy a condition such as "occupying several pixels or more in the image". When no object is detected, the routine of the present flowchart ends.
When an object is detected, the first processing unit 310 generates the object information A and outputs the object information A to the first integrated identification unit 330 (step S106). The first processing unit 310 then performs a process of determining whether the object detected in step S104 is the same as an object detected in the past, by comparing it with the most recent past frames (step S108). At this time, the first processing unit 310 performs this discrimination processing in consideration of the absolute speed and the size of the object.
Next, the first processing unit 310 refers to the processing result of step S108, and determines whether or not the same (estimated to be the same) object is continuously detected m times while the processing of the present flowchart is repeatedly executed (step S110). When it is determined that the same object is continuously detected m times, the first processing unit 310 generates the object information B and outputs the object information B to the second integrated identification unit 340 (step S112).
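The Fig. 4 flow can be sketched as follows; the callables passed in and the value of m stand in for the steps described above and are assumptions, not the patent's code:

```python
def first_processing_routine(frame, history, detect, associate, m=5):
    """One execution per camera frame (Fig. 4).  `detect` and `associate` are
    injected helpers for object extraction and past-frame association."""
    detections = detect(frame)                           # S100-S102: target region + extraction
    if not detections:                                   # S104: no object detected
        return None, None
    info_a = [{"distance": d.distance, "height": d.height, "size": d.size}
              for d in detections]                       # S106: object information A -> unit 330
    tracks = associate(detections, history)              # S108: discriminate against past detections
    info_b = None
    if any(t.consecutive_hits >= m for t in tracks):     # S110: same object detected m times
        info_b = [{"distance": t.distance, "size": t.size, "type": t.type}
                  for t in tracks if t.consecutive_hits >= m]  # S112: object information B -> unit 340
    return info_a, info_b
```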
Fig. 5 is a flowchart showing an example of the flow of processing executed by the second processing unit 320. In the processing of the present flowchart, for example, a routine is executed each time information of a reflection point is input from the detector 14.
First, the second processing unit 320 performs area limiting processing on the information input from the probe 14 (step S200). The area limiting process is a process of excluding reflection points measured outside an area on the road surface.
Next, the second processing unit 320 determines whether or not there is a reflection point on the road (step S202). When it is determined that there is a reflection point on the road, the second processing unit 320 generates the object information C and outputs the object information C to the first integrated identification unit 330 (step S204).
The second processing unit 320 determines whether or not reflection points located at substantially the same distance are detected from one end to the other end (step S206). When the reflection points located at substantially the same distance are detected from one end to the other end, the second processing unit 320 generates the object information D and outputs the object information D to the second integrated identification unit 340 (step S208).
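The Fig. 5 flow can be sketched in the same spirit; the area-limiting criterion is simplified to an angle range here, and all thresholds are assumed values:

```python
from typing import List, Tuple

def second_processing_routine(points: List[Tuple[float, float]],
                              road_angle_range=(-0.5, 0.5),
                              distance_tolerance: float = 0.3):
    """One execution per LIDAR measurement (Fig. 5); points are (angle, distance)."""
    # S200: area limiting - keep only reflection points within the road area (range assumed).
    on_road = [(a, d) for a, d in points
               if road_angle_range[0] <= a <= road_angle_range[1]]
    if not on_road:                                      # S202: no reflection point on the road
        return None, None
    nearest = min(d for _, d in on_road)
    info_c = {"distance": nearest}                       # S204: object information C -> unit 330
    # S206: reflection points at substantially the same distance from one end to the other.
    info_d = None
    if all(abs(d - nearest) < distance_tolerance for _, d in on_road):
        angles = [a for a, _ in on_road]
        info_d = {"distance": nearest,                   # S208: object information D -> unit 340
                  "size": nearest * (max(angles) - min(angles))}
    return info_c, info_d
```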
Fig. 6 is a flowchart showing an example of the flow of the processing executed by the first integrated identification unit 330. First, the first integrated identification unit 330 determines whether or not the object information A and the object information C are input in synchronization with each other (or within a predetermined time of each other) (step S300). When the object information A and the object information C are input in synchronization with each other, the first integrated identification unit 330 determines whether or not the two coincide (step S302). Coincidence means, for example, that the differences between the two fall within an allowable range for a predetermined number or more of items. When the two coincide, the first integrated identification unit 330 generates the first integrated identification information (step S304) and outputs the first integrated identification information to the automatic driving control device 100.
Fig. 7 is a flowchart showing an example of the flow of the processing executed by the second integrated identification unit 340. First, the second integrated identification unit 340 determines whether or not the object information B and the object information D are input in synchronization with each other (or within a predetermined time) (step S400). When the object information B and the object information D are input in synchronization with each other, the second integrated identification unit 340 determines whether or not both of them match (step S402). When both are matched, the second integrated identification unit 340 generates second integrated identification information (step S404) and outputs the second integrated identification information to the automatic driving control device 100.
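The synchronization check that both Figs. 6 and 7 begin with can be sketched as follows (the time window is an assumed value):

```python
def inputs_synchronized(t_first: float, t_second: float, window_s: float = 0.05) -> bool:
    """True when the two pieces of object information arrive together or within a
    predetermined time of each other (S300 / S400); the window value is assumed."""
    return abs(t_first - t_second) <= window_s
```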
[ Summary ]
According to the object recognition device 300 of the first embodiment described above, a preliminary sensor fusion result can be obtained more quickly. Figs. 8 to 10 are diagrams showing examples of the relationship between distance and detection accuracy. These figures show assumed detection rates as a function of the distance between the host vehicle M and an obstacle, on the premise that obstacles of 10 cm³ and 15 cm³ are placed on the road.
Fig. 8 shows assumed detection rates, that is, the probabilities with which the first processing unit 310 detects obstacles of various sizes based on the output of the camera 10. Here, "detecting an obstacle" means "generating object information B". Fig. 9 shows assumed detection rates, that is, the probabilities with which the second processing unit 320 detects obstacles of various sizes based on the output of the probe 14. Here, "detecting an obstacle" means "generating object information D". As shown in Fig. 8, for a 10 cm³ obstacle in particular, the detection rate is assumed not to reach 80% or more unless the distance is closer than about 160 m. As shown in Fig. 9, for a 10 cm³ obstacle in particular, the object information D is assumed not to be generated unless the distance is closer than about 180 m.
The second integrated identification information, which is generated by integrating the object information B and the object information D, is obtained even later than the timings suggested by the detection rates shown in Figs. 8 and 9. As a result, it is assumed that sufficient information cannot be provided to the automatic driving control device 100, particularly when the distance between the host vehicle M and the obstacle is 150 m or more.
In contrast, Fig. 10 shows the expected performance in the embodiment. Fig. 10 shows the change in the probability (generation rate) with which the first integrated identification unit 330 generates the first integrated identification information based on the object information A and the object information C, which start to be generated at earlier timings than in Figs. 8 and 9. As shown in the figure, the generation rate of the first integrated identification information in the present embodiment is expected to be sufficiently close to 100% already at a distance of about 190 m. Therefore, in a scene of approaching an obstacle, a preliminary fusion result can be obtained at an earlier timing and provided to the automatic driving control device 100.
The camera 10 and the detector 14 have been described as examples of the first device and the second device, but one or both of them may be the radar device 12 or another device for recognizing objects.
< second embodiment >
Hereinafter, a second embodiment will be described. Fig. 11 is a configuration diagram of a vehicle system 2 using a second embodiment of an object recognition device 300. The same reference numerals as in the first embodiment are given to the common components with the first embodiment, and detailed description thereof is omitted. In the second embodiment, the object recognition device 300 outputs the recognition results (the first integrated recognition result and the second integrated recognition result) to the driving support device 400 instead of to the automatic driving control device 100.
The driving support apparatus 400 is an apparatus that performs various driving support controls, such as intervention control of acceleration/deceleration or steering and output of a reaction force to the driving operation element 80, when the driver performs manual driving. In the following description, the driving support apparatus 400 is an apparatus that performs brake control with respect to an obstacle. The driving support apparatus 400 includes, for example, a collision possibility determination unit 402, a brake operation amount derivation unit 404, and a braking amount determination unit 406. These functional units are realized by a hardware processor such as a CPU executing a program (software). Some or all of these components may be realized by hardware (including circuit units) such as LSIs, ASICs, FPGAs, or GPUs, or may be realized by cooperation of software and hardware. Further, these functional units need not be realized by a single processor, but may be realized by distributed processing among a plurality of processors.
The collision possibility determination unit 402 determines, with reference to the recognition result of the object recognition device 300, whether there is a possibility that the host vehicle will collide with an object. For example, the collision possibility determination unit 402 calculates the TTC between the host vehicle and the object, and determines that there is a possibility of collision between the host vehicle and the object when the TTC is smaller than a collision determination threshold value Tcol.
The brake operation amount derivation unit 404 derives the amount of brake operation performed by the occupant of the host vehicle (or the amount of brake output as a result) with reference to the detection result of the brake depression amount sensor included in the driving operation element 80.
First, based on the determination result of the collision possibility determination unit 402 and the course of that determination, the braking amount determination unit 406 determines the amount of braking to be applied in response to the presence of the object (the object braking amount) and outputs it to the brake device 210. The object braking amount is determined, for example, such that it becomes larger as the TTC between the host vehicle and the object becomes shorter. When the braking amount determined based on the braking operation amount derived by the brake operation amount derivation unit 404 exceeds the object braking amount, the braking amount determination unit 406 may output the braking amount determined based on the braking operation amount to the brake device 210. The braking amount determination unit 406 sets the degree of control in the period until the second integrated recognition result is obtained to be smaller than the degree of control after the second integrated recognition result is obtained. In addition, the braking amount determination unit 406 may perform limiting control (preliminary deceleration) such as "decelerating to an upper limit speed when the speed of the host vehicle M exceeds the upper limit speed" during the period from when the first integrated recognition result is obtained until the second integrated recognition result is obtained.
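A sketch of this braking decision under assumed values (Tcol, the scaling applied before the second integrated recognition result is obtained, and the torque formula are all illustrative, not specified by the patent):

```python
def decide_braking(ttc: float, driver_brake_amount: float,
                   second_result_available: bool,
                   t_col: float = 4.0, preliminary_scale: float = 0.5) -> float:
    """Object braking amount grows as TTC shrinks; the driver's own braking wins
    when it is larger; the degree of control is reduced before the second result."""
    if ttc >= t_col:                       # no collision possibility determined
        return driver_brake_amount
    object_brake = 1.0 / max(ttc, 0.1)     # larger braking amount for shorter TTC
    if not second_result_available:
        object_brake *= preliminary_scale  # reduced degree of control before confirmation
    return max(object_brake, driver_brake_amount)
```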
According to the second embodiment described above, the same effects as those of the first embodiment can be obtained.
The above-described embodiment can be expressed as follows.
An object recognition device mounted on a vehicle, wherein,
the object recognition device is provided with:
a storage device storing a program; and
a hardware processor capable of executing the program,
the hardware processor performs the following processing by executing the program:
outputting object information by performing, based on an output of a first device for recognizing an object, at least a process A and a process B having more man-hours than the process A;
outputting object information by performing at least a process C and a process D having more man-hours than the process C, based on an output of a second device for recognizing an object, the second device being attached to the vehicle such that the same direction as the first device is set as a detection range;
identifying an object based on object information obtained as a result of the process A and object information obtained as a result of the process C; and
identifying an object based on object information obtained as a result of the process B and object information obtained as a result of the process D.
While specific embodiments of the present invention have been described above, the present invention is not limited to these embodiments, and various modifications and substitutions can be made without departing from the spirit of the present invention.

Claims (10)

1. A vehicle control device mounted on a vehicle, wherein,
the vehicle control device includes:
a first processing unit that outputs object information by performing at least a process A and a process B, which requires a greater amount of processing than the process A, based on an output of a first device for recognizing an object;
a second processing unit that outputs object information by performing at least a process C and a process D, which requires a greater amount of processing than the process C, based on an output of a second device for recognizing an object, the second device being attached to the vehicle such that the same direction as that of the first device is set as its detection range;
a first integrated recognition unit that recognizes an object based on the object information obtained as a result of the process A and the object information obtained as a result of the process C;
a second integrated recognition unit that recognizes an object based on the object information obtained as a result of the process B and the object information obtained as a result of the process D; and
a driving control unit that controls one or both of a speed and a steering angle of the vehicle,
the driving control unit performs control for adjusting the positional relationship between the vehicle and the object using the recognition result of the first integrated recognition unit until the recognition result of the second integrated recognition unit is obtained, and, after the recognition result of the second integrated recognition unit is obtained, performs control for adjusting the positional relationship between the vehicle and the object while making the ratio of the recognition result of the second integrated recognition unit higher than the ratio of the recognition result of the first integrated recognition unit.
2. The vehicle control apparatus according to claim 1,
the process A is a process executed on data output by the first device over fewer output cycles than the process B, and/or
the process C is a process executed on data output by the second device over fewer output cycles than the process D.
3. The vehicle control apparatus according to claim 2,
the first device is a camera and the second device is a camera,
the process A includes a process of generating object information based on images of a smaller number of frames than the process B.
4. The vehicle control apparatus according to claim 1,
the process A is a process executed in fewer processing steps than the process B, and/or
the process C is a process executed in fewer processing steps than the process D.
5. The vehicle control apparatus according to claim 4,
the second device is a LIDAR,
the process C includes a process of determining a distance between the LIDAR and an object and outputting object information that includes the determined distance.
6. The vehicle control apparatus according to claim 5,
the process D includes a process of determining the distance and the size of the object and outputting object information that includes the determined distance and size.
7. The vehicle control apparatus according to any one of claims 1 to 6,
the driving control unit sets the control degree during the period until the recognition result of the second integrated recognition unit is obtained to be smaller than the control degree after the recognition result of the second integrated recognition unit is obtained.
8. The vehicle control apparatus according to any one of claims 1 to 6,
the driving control unit sets the relative control degree with respect to the positional relationship between the vehicle and the object during the period until the recognition result of the second integrated recognition unit is obtained to be smaller than the relative control degree with respect to the positional relationship between the vehicle and the object after the recognition result of the second integrated recognition unit is obtained.
9. A vehicle control method executed by a vehicle control device mounted on a vehicle, wherein,
the vehicle control method includes the steps of:
performing at least a process A to output object information and performing a process B, which requires a greater amount of processing than the process A, to output object information, based on an output of a first device for recognizing an object;
performing at least a process C to output object information and performing a process D, which requires a greater amount of processing than the process C, to output object information, based on an output of a second device for recognizing an object, the second device being attached to the vehicle such that the same direction as that of the first device is set as its detection range;
recognizing an object based on the object information obtained by the process A and the object information obtained by the process C, and outputting a first integrated recognition result;
recognizing an object based on the object information obtained by the process B and the object information obtained by the process D, and outputting a second integrated recognition result; and
controlling one or both of a speed and a steering angle of the vehicle,
the controlling includes: performing control for adjusting the positional relationship between the vehicle and the object using the first integrated recognition result until the second integrated recognition result is obtained, and, after the second integrated recognition result is obtained, performing control for adjusting the positional relationship between the vehicle and the object while making the ratio of the second integrated recognition result higher than the ratio of the first integrated recognition result.
10. A storage medium storing a program to be executed by a processor of a vehicle control device mounted on a vehicle, wherein
the program, when executed by the processor, causes the processor to perform the following processing:
performing at least a process A to output object information and performing a process B, which requires a greater amount of processing than the process A, to output object information, based on an output of a first device for recognizing an object;
performing at least a process C to output object information and performing a process D, which requires a greater amount of processing than the process C, to output object information, based on an output of a second device for recognizing an object, the second device being attached to the vehicle such that the same direction as that of the first device is set as its detection range;
recognizing an object based on the object information obtained by the process A and the object information obtained by the process C, and outputting a first integrated recognition result;
recognizing an object based on the object information obtained by the process B and the object information obtained by the process D, and outputting a second integrated recognition result; and
controlling one or both of a speed and a steering angle of the vehicle,
the controlling includes causing the processor to execute: control for adjusting the positional relationship between the vehicle and the object using the first integrated recognition result until the second integrated recognition result is obtained, and, after the second integrated recognition result is obtained, control for adjusting the positional relationship between the vehicle and the object while making the ratio of the second integrated recognition result higher than the ratio of the first integrated recognition result.
CN201910715385.2A 2018-08-07 2019-08-02 Vehicle control device, vehicle control method, and storage medium Active CN110816524B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018148551A JP7027279B2 (en) 2018-08-07 2018-08-07 Vehicle control devices, vehicle control methods, and programs
JP2018-148551 2018-08-07

Publications (2)

Publication Number Publication Date
CN110816524A (en) 2020-02-21
CN110816524B (en) 2022-12-06

Family

ID=69547726

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910715385.2A Active CN110816524B (en) 2018-08-07 2019-08-02 Vehicle control device, vehicle control method, and storage medium

Country Status (2)

Country Link
JP (1) JP7027279B2 (en)
CN (1) CN110816524B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112512887B (en) * 2020-07-21 2021-11-30 华为技术有限公司 Driving decision selection method and device
US20240071095A1 (en) 2020-12-28 2024-02-29 Hitachi Astemo, Ltd. Vehicle control system externality recognition device and vehicle control method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4967840B2 (en) 2007-06-14 2012-07-04 トヨタ自動車株式会社 Collision mitigation device
JP5727356B2 (en) 2011-11-30 2015-06-03 日立オートモティブシステムズ株式会社 Object detection device
JP5870908B2 (en) * 2012-12-11 2016-03-01 株式会社デンソー Vehicle collision determination device
JP6190758B2 (en) * 2014-05-21 2017-08-30 本田技研工業株式会社 Object recognition device and vehicle
JP6558733B2 (en) * 2015-04-21 2019-08-14 パナソニックIpマネジメント株式会社 Driving support method, driving support device, driving control device, vehicle, and driving support program using the same
US10160448B2 (en) * 2016-11-08 2018-12-25 Ford Global Technologies, Llc Object tracking using sensor fusion within a probabilistic framework

Also Published As

Publication number Publication date
JP7027279B2 (en) 2022-03-01
JP2020024562A (en) 2020-02-13
CN110816524A (en) 2020-02-21

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant