CN109987082B - Vehicle control device, vehicle control method, and storage medium

Info

Publication number
CN109987082B
Authority
CN
China
Prior art keywords
vehicle
unit
linear object
curve
road
Legal status
Active
Application number
CN201811587645.4A
Other languages
Chinese (zh)
Other versions
CN109987082A (en)
Inventor
川边浩司
三浦弘
石川诚
土屋成光
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Application filed by Honda Motor Co Ltd
Publication of CN109987082A
Application granted
Publication of CN109987082B

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00: Conjoint control of vehicle sub-units of different type or different function
    • B60W10/20: Conjoint control including control of steering systems
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18: Propelling the vehicle
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02: Parameters related to ambient conditions
    • B60W40/06: Road conditions
    • B60W40/072: Curvature of the road
    • B60W40/10: Parameters related to vehicle motion
    • B60W40/105: Speed

Landscapes

  • Engineering & Computer Science (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Image Processing (AREA)
  • Steering Control In Accordance With Driving Conditions (AREA)

Abstract

The invention provides a vehicle control device, a vehicle control method, and a storage medium that can perform substitute control in a wider range of scenes in which a road dividing line is missing. The vehicle control device includes: an image pickup unit that picks up an image of the front or rear of a vehicle; a linear object recognition unit that recognizes, in the image captured by the image pickup unit, a linear object that exists at a height different from the road and extends along the road as viewed from above; and a driving control unit that controls the traveling of the vehicle according to the position of the linear object in the image.

Description

Vehicle control device, vehicle control method, and storage medium
Technical Field
The invention relates to a vehicle control device, a vehicle control method, and a storage medium.
Background
In recent years, research on automatic control of vehicle traveling (hereinafter, "automatic driving") and its practical application have advanced. In automatic driving, recognizing the lane in which the vehicle should travel is important for stable traveling. Strong information for recognizing a lane is the position of the white line drawn on the road, which is detected by an in-vehicle camera or the like. The position information of the white line is used not only for automatic driving but also for driving support control such as lane keeping control. When the white line has worn away and cannot be seen, these controls become difficult to perform.
In this connection, there is known a road white line detection method that calculates spatial differential values of the luminance of pixels in an image captured by an in-vehicle camera, extracts white line edges from the positions indicating their extreme values, collects detected edges with similar luminance values as white line candidates, groups the collected candidates according to their positional relationships, and detects one white line on each of the left and right sides (see, for example, Japanese Patent Laid-Open No. 2002-175534).
However, with this conventional technique, a white line cannot be detected unless it remains to some extent. Moreover, the technique cannot cope with cases in which the white line cannot be seen because of a preceding or following vehicle.
Disclosure of Invention
An aspect of the present invention has been made in view of such circumstances, and an object thereof is to provide a vehicle control device, a vehicle control method, and a program that enable substitute control in a wider range of scenes in which the road dividing line is missing.
The vehicle control device, the vehicle control method, and the storage medium of the present invention adopt the following configurations.
(1): a vehicle control device according to an aspect of the present invention includes: an image pickup unit (10) that picks up an image of the front or rear of the vehicle; a linear object recognition unit (132) that recognizes a linear object that is present at a height different from the road and extends along the road when viewed from above, in the image captured by the image capture unit; and a driving control unit (150, 160) that controls the travel of the vehicle according to the position of the linear object in the image.
(2): in the aspect (1) described above, the driving control unit controls steering of the vehicle such that an inclination of the linear object in the image is fixed.
(3): In the aspect (2) described above, the driving control unit performs feedback control of the steering angle of the vehicle so that the inclination of the linear object in the image is kept constant.
(4): in the aspect (1), the image pickup unit picks up an image of the front of the vehicle, and the vehicle control device further includes a curve estimation unit (134) that estimates the presence of a curve ahead of the travel of the vehicle, based on the extended form of the linear object recognized by the linear object recognition unit.
(5): in the aspect (4) described above, the curve estimation unit further estimates a curvature of a curve existing ahead of the travel of the vehicle, and the driving control unit controls the steering of the vehicle based on the curvature of the curve estimated by the curve estimation unit.
(6): in the aspect (1), the linear object is an object provided above the imaging unit.
(7): in the aspect (6) above, the linear object is an upper end portion of a side wall of the road.
(8): in a vehicle control method according to another aspect of the present invention, an imaging unit images a front or a rear of a vehicle, a linear object recognition unit recognizes a linear object that exists at a different height from a road and extends along the road in a plan view in an image captured by the imaging unit, and a driving control unit controls at least steering of the vehicle according to a position of the linear object in the image.
(9): a storage medium according to another aspect of the present invention stores a program for causing a computer mounted in a vehicle having an imaging unit for imaging the front or rear of the vehicle to execute: in the image captured by the imaging unit, a linear object that is present at a different height from a road and extends along the road in a plan view is recognized, and at least steering of the vehicle is controlled in accordance with a position of the linear object in the image.
Effects of the invention
According to the aspects (1) to (9), substitute control in the case where the road dividing line is missing can be performed in a wider range of scenes.
Drawings
Fig. 1 is a configuration diagram of a vehicle system 1 using a vehicle control device according to a first embodiment.
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160.
Fig. 3 is a diagram showing an example of the captured image IM of the camera 10.
Fig. 4 is a diagram showing the relationship between the turning angle of the vehicle M and the inclination of the upper end portion UE(SW) of the side wall SW.
Fig. 5 is a flowchart showing an example of the processing flow executed by the linear object recognition unit 132 and the control unit 152 when the dividing line is missing.
Fig. 6 is a diagram showing the relationship between a curve existing ahead of the host vehicle M and the upper end portion UE(SW) of the side wall SW.
Fig. 7 is a flowchart showing an example of the processing flow executed by the linear object recognition unit 132, the curve estimation unit 134, and the control unit 152 when the dividing line is missing.
Fig. 8 is a configuration diagram of the driving support apparatus 400 according to the second embodiment.
Fig. 9 is a diagram showing an example of a hardware configuration of the automatic driving control apparatus 100 according to the first embodiment or the driving support apparatus 400 according to the second embodiment.
Detailed Description
Embodiments of a vehicle control device, a vehicle control method, and a program according to the present invention will be described below with reference to the drawings.
< first embodiment >
[ integral Structure ]
Fig. 1 is a configuration diagram of a vehicle system 1 using the vehicle control device according to the first embodiment. The vehicle on which the vehicle system 1 is mounted is, for example, a two-wheeled, three-wheeled, or four-wheeled vehicle, and its drive source is an internal combustion engine such as a diesel or gasoline engine, an electric motor, or a combination thereof. When an electric motor is provided, it operates using power generated by a generator connected to the internal combustion engine, or the discharge power of a secondary battery or a fuel cell.
The vehicle system 1 includes, for example, a camera 10, a radar device 12, a detector 14, an object recognition device 16, a communication device 20, an HMI (Human Machine Interface) 30, a vehicle sensor 40, a navigation device 50, an MPU (Map Positioning Unit) 60, a driving operation element 80, an automatic driving control device 100, a running driving force output device 200, a brake device 210, a steering device 220, and a headlamp device 250. These apparatuses and devices are connected to one another by a multiplex communication line such as a CAN (Controller Area Network) communication line, a serial communication line, a wireless communication network, or the like. The configuration shown in Fig. 1 is merely an example; a part of it may be omitted, and other components may be added.
The camera 10 is a digital camera using a solid-state imaging device such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor. One or more cameras 10 are mounted at arbitrary positions on the vehicle (hereinafter, the host vehicle M) on which the vehicle system 1 is mounted. When imaging the area ahead, the camera 10 is attached to the upper part of the windshield, the back of the interior mirror, or the like. When imaging the area behind, the camera 10 is mounted, for example, near the rear bumper. The camera 10, for example, periodically and repeatedly images the periphery of the host vehicle M. The camera 10 may also be a stereo camera.
The radar device 12 radiates radio waves such as millimeter waves around the host vehicle M and detects the waves reflected by an object (reflected waves), thereby detecting at least the position (distance and direction) of the object. One or more radar devices 12 are mounted at arbitrary positions on the host vehicle M. The radar device 12 may detect the position and velocity of an object by the FM-CW (Frequency Modulated Continuous Wave) method.
The detector 14 is, for example, a LIDAR (Light Detection and Ranging) sensor. The detector 14 irradiates light around the host vehicle M and measures the scattered light, detecting the distance to an object from the time between light emission and light reception. The irradiated light is, for example, a pulsed laser. One or more detectors 14 are mounted at arbitrary positions on the host vehicle M. The detector 14 is an example of an object detection device.
The object recognition device 16 performs sensor fusion processing on the detection results of some or all of the camera 10, the radar device 12, and the detector 14, and recognizes the position, type, speed, and the like of objects. The object recognition device 16 outputs the recognition results to the automatic driving control device 100. The object recognition device 16 may also output the detection results of the camera 10, the radar device 12, and the detector 14 directly to the automatic driving control device 100 as necessary.
The communication device 20 communicates with another vehicle present in the vicinity of the host vehicle M by using, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), DSRC (Dedicated Short Range Communications), or the like, or communicates with various server devices via a wireless base station.
The HMI30 presents various information to the passenger of the host vehicle M and accepts an input operation by the passenger. The HMI30 includes various display devices, speakers, buzzers, touch panels, switches, keys, and the like.
The vehicle sensors 40 include a vehicle speed sensor that detects the speed of the own vehicle M, an acceleration sensor that detects acceleration, a yaw rate sensor that detects an angular velocity about a vertical axis, an orientation sensor that detects the orientation of the own vehicle M, and the like.
The navigation device 50 includes, for example, a GNSS (Global Navigation Satellite System) receiver 51, a navigation HMI 52, and a route determination unit 53, and holds first map information 54 in a storage device such as an HDD (Hard Disk Drive) or a flash memory. The GNSS receiver 51 determines the position of the host vehicle M from signals received from GNSS satellites. The position of the host vehicle M may also be determined or supplemented by an INS (Inertial Navigation System) using the output of the vehicle sensors 40. The navigation HMI 52 includes a display device, a speaker, a touch panel, keys, and the like, and may be partly or wholly shared with the HMI 30. The route determination unit 53 determines, for example, a route (hereinafter, an on-map route) from the position of the host vehicle M determined by the GNSS receiver 51 (or an arbitrarily input position) to a destination input by the passenger using the navigation HMI 52, with reference to the first map information 54. The first map information 54 is, for example, information expressing road shapes by links representing roads and nodes connected by the links, and may include road curvature, POI (Point Of Interest) information, and the like. The on-map route determined by the route determination unit 53 is output to the MPU 60. The navigation device 50 may perform route guidance using the navigation HMI 52 based on the on-map route. The navigation device 50 may be realized by the functions of a terminal device such as a smartphone or tablet carried by the passenger, and may transmit its current position and destination to a navigation server via the communication device 20 and acquire the on-map route returned from the server.
The MPU 60 functions as, for example, a recommended lane determining unit 61, and holds second map information 62 in a storage device such as an HDD or a flash memory. The recommended lane determining unit 61 divides the route provided from the navigation device 50 into a plurality of blocks (for example, every 100 [m] in the vehicle traveling direction), and determines a recommended lane for each block with reference to the second map information 62. The recommended lane determining unit 61 determines, for example, in which lane numbered from the left to travel. When a branch point, a merge point, or the like exists on the route, the recommended lane determining unit 61 determines the recommended lane so that the host vehicle M can travel on a reasonable route for proceeding beyond the branch.
The second map information 62 is map information with higher accuracy than the first map information 54. The second map information 62 includes, for example, information on the center of a lane, information on the boundary of a lane, and the like. The second map information 62 may also include road information, traffic regulation information, address information (address, zip code), facility information, telephone number information, and the like. The second map information 62 can be updated at any time by accessing other devices using the communication device 20.
The driving operation member 80 includes, for example, an accelerator pedal, a brake pedal, a shift lever, a steering wheel, a joystick, and other operation members. A sensor for detecting the operation amount or the presence or absence of operation is attached to the driving operation element 80, and the detection result is output to some or all of the automatic driving control device 100, the running driving force output device 200, the brake device 210, and the steering device 220.
The automatic driving control device 100 includes, for example, a first control unit 120 and a second control unit 160, each of which is realized by a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of these components may be realized by hardware (circuit unit; circuitry) such as an LSI (Large Scale Integration) chip, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or may be realized by cooperation of software and hardware. The automatic driving control device 100 is an example of a vehicle control device.
Fig. 2 is a functional configuration diagram of the first control unit 120 and the second control unit 160. The first control unit 120 includes, for example, a recognition unit 130 and an action plan generation unit 150. The first control unit 120 implements, for example, an AI (Artificial Intelligence) function and a model function in parallel. For example, the function of "recognizing an intersection" is realized by executing intersection recognition by deep learning or the like and recognition based on a predetermined condition (presence of a signal that can be pattern-matched, a road sign, or the like) in parallel, and adding scores to both of them to comprehensively evaluate them. Thereby, the reliability of the automatic driving is ensured.
The recognition unit 130 recognizes the surrounding situation of the host vehicle M based on information input from the camera 10, the radar device 12, and the detector 14 via the object recognition device 16. For example, the recognition unit 130 recognizes the position, speed, acceleration, and other states of objects located in the periphery of the host vehicle M. The position of an object is recognized, for example, as a position on absolute coordinates whose origin is a representative point of the host vehicle M (the center of gravity, the center of the drive axle, or the like), and is used for control. The position of an object may be represented by a representative point such as its center of gravity or a corner, or by a represented region. The "state" of an object may include its acceleration or jerk, or its "behavior state" (for example, whether it is making or about to make a lane change). The recognition unit 130 also recognizes the shape of a curve that the host vehicle M will pass through next, from the captured image of the camera 10. The recognition unit 130 converts the shape of the curve from the captured image of the camera 10 onto a real plane and outputs, for example, two-dimensional point sequence information, or information expressed using an equivalent model, to the action plan generating unit 150 as information indicating the shape of the curve.
The recognition unit 130 recognizes, for example, a lane (traveling lane) in which the host vehicle M is traveling. For example, the recognition unit 130 recognizes the traveling lane by comparing the pattern of road dividing lines (for example, an arrangement of solid lines and broken lines) obtained from the second map information 62 with the pattern of road dividing lines around the host vehicle M recognized from the image captured by the camera 10. In this recognition, the position of the own vehicle M acquired from the navigation device 50 and the processing result by the INS may be considered. The recognition unit 130 recognizes a temporary stop line, an obstacle, a red light, a toll booth, and other road items.
The recognition unit 130 recognizes the position and posture of the host vehicle M with respect to the travel lane when recognizing the travel lane. The recognition unit 130 may recognize, for example, a deviation of the reference point of the host vehicle M from the lane center and an angle formed with respect to a line connecting the lane centers in the traveling direction of the host vehicle M as the relative position and posture of the host vehicle M with respect to the traveling lane. Alternatively, the recognition unit 130 may recognize, as the relative position of the host vehicle M with respect to the travel lane, the position of the reference point of the host vehicle M with respect to any one of the side end portions (road dividing line or road boundary) of the travel lane.
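As an illustrative aside, the following Python sketch computes the two quantities just described: a signed lateral offset from the lane center and a heading angle relative to the lane direction. The two-dimensional setup, function name, and arguments are all assumptions made for illustration, not part of the patent.

```python
import numpy as np

def pose_relative_to_lane(ref_point, heading_vec, center_a, center_b):
    """Signed lateral offset of the vehicle's reference point from the
    lane-center segment a->b, and the heading angle relative to that
    segment (positive offset = left of center in this convention)."""
    a = np.asarray(center_a, dtype=float)
    b = np.asarray(center_b, dtype=float)
    t = (b - a) / np.linalg.norm(b - a)            # unit tangent along the lane
    d = np.asarray(ref_point, dtype=float) - a
    offset = t[0] * d[1] - t[1] * d[0]             # 2D cross product
    yaw = np.arctan2(heading_vec[1], heading_vec[0]) - np.arctan2(t[1], t[0])
    yaw = (yaw + np.pi) % (2.0 * np.pi) - np.pi    # wrap to (-pi, pi]
    return offset, yaw

# Example: 0.4 m left of a lane running along the x-axis, heading 5 deg off it.
off, ang = pose_relative_to_lane((0.0, 0.4), (np.cos(0.087), np.sin(0.087)),
                                 (-10.0, 0.0), (10.0, 0.0))
```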
The recognition unit 130 may derive the recognition accuracy in the recognition processing and output the recognition accuracy to the action plan generation unit 150 as recognition accuracy information. For example, the recognition unit 130 generates recognition accuracy information based on the frequency with which the road dividing line can be recognized for a certain period.
The recognition unit 130 includes, for example, a linear object recognition unit 132 and a curve estimation unit 134. They are described later.
The action plan generating unit 150 determines events to be sequentially executed during automated driving so that, in principle, the host vehicle M travels in the recommended lane determined by the recommended lane determining unit 61, and so that it can cope with the surrounding situation of the host vehicle M. The action plan generating unit 150 generates a target trajectory along which the host vehicle M will travel in the future, based on the activated event. The target trajectory contains, for example, a plurality of trajectory points and velocity elements. For example, the target trajectory expresses, in order, the points (trajectory points) that the host vehicle M should reach. A trajectory point is a point the host vehicle M should reach at every predetermined travel distance (for example, about every several [m]) along the road; separately from this, a target speed and a target acceleration for every predetermined sampling period (for example, about several tenths of a [sec]) are generated as part of the target trajectory. A trajectory point may instead be a position the host vehicle M should reach at each predetermined sampling time, in which case the information of the target speed and target acceleration is expressed by the spacing of the trajectory points.
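As a sketch of one plausible data layout for such a target trajectory (the structure and field names are assumptions, not taken from the patent):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TrackPoint:
    x: float                    # position on the road plane [m]
    y: float                    # position on the road plane [m]
    target_speed: float         # velocity element attached to the point [m/s]
    target_acceleration: float  # [m/s^2]

# The target trajectory is then an ordered sequence of points the host
# vehicle should reach, e.g. one every few metres of travelled distance.
TargetTrajectory = List[TrackPoint]
```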
The action plan generating unit 150 also includes, for example, a dividing line missing control unit 152, which will be described later.
The second control unit 160 controls the running driving force output device 200, the brake device 210, and the steering device 220 so that the host vehicle M passes through the target trajectory generated by the action plan generation unit 150 at a predetermined timing. A portion obtained by combining the action plan generating unit 150 and the second control unit 160 is an example of the "driving control unit".
The second control unit 160 includes, for example, an acquisition unit 162, a speed control unit 164, and a steering control unit 166. The acquisition unit 162 acquires information of the target track (track point) generated by the action plan generation unit 150 and stores the information in a memory (not shown). The speed control unit 164 controls the running drive force output device 200 or the brake device 210 based on the speed element attached to the target track stored in the memory. The steering control unit 166 controls the steering device 220 according to the degree of curvature of the target track stored in the memory. The processing of the speed control section 164 and the steering control section 166 is realized by, for example, a combination of feedforward control and feedback control. As an example, the steering control unit 166 performs a combination of feedforward control corresponding to the curvature of the road ahead of the host vehicle M and feedback control based on the deviation from the target trajectory.
The running drive force output device 200 outputs running drive force (torque) for running of the vehicle to the drive wheels. The running driving force output device 200 includes, for example, a combination of an internal combustion engine, a motor, a transmission, and the like, and an ECU that controls them. The ECU controls the above configuration based on information input from the second control unit 160 or information input from the driving operation element 80.
The brake device 210 includes, for example, a caliper, a hydraulic cylinder that transmits hydraulic pressure to the caliper, an electric motor that generates hydraulic pressure in the hydraulic cylinder, and a brake ECU. The brake ECU controls the electric motor based on information input from the second control unit 160 or information input from the driving operation element 80, and outputs braking torque corresponding to a braking operation to each wheel. The brake device 210 may have a mechanism for transmitting hydraulic pressure generated by operation of a brake pedal included in the driving operation element 80 to the hydraulic cylinder via the master cylinder as a backup. The brake device 210 is not limited to the above-described configuration, and may be an electronically controlled hydraulic brake device as follows: the actuator is controlled based on information input from the second control unit 160, and the hydraulic pressure of the master cylinder is transmitted to the hydraulic cylinder.
The steering device 220 includes, for example, a steering ECU and an electric motor. The electric motor changes the orientation of the steering wheel by applying a force to a rack-and-pinion mechanism, for example. The steering ECU drives the electric motor based on information input from the second control unit 160 or information input from the driving operation element 80, and changes the direction of the steered wheels.
[ control when dividing line is missing ]
Next, the contents of the processing executed when the dividing line is missing by the linear object recognition unit 132 of the recognition unit 130 and by the dividing line missing control unit 152 of the action plan generation unit 150 will be described.
As described above, the recognition unit 130 recognizes the travel lane mainly from the positions of the road dividing lines in the image captured by the camera 10. Fig. 3 is a diagram showing an example of the captured image IM of the camera 10, depicting a scene in which the host vehicle M is traveling on an expressway. In the captured image IM shown in the figure, road dividing lines LM1 and LM2 that divide the lane L1 in which the host vehicle M is traveling, road dividing lines LM2 and LM3 that divide the adjacent lane L2, a guard rail GR, a side wall SW, and the like appear. These image elements are linear or curved, and the contrast (luminance difference) between their outline portions and the surrounding image is large, so their positions can be recognized relatively easily by image recognition. For example, the contour of a road dividing line can be recognized by connecting horizontal edges (pixels having a large contrast with laterally adjacent pixels), and the upper end portion UE(GR) of the guard rail and the upper end portion UE(SW) of the side wall can be recognized by connecting vertical edges (pixels having a large contrast with vertically adjacent pixels). In principle, the recognition unit 130 recognizes road dividing lines lying within a predetermined range of the captured image IM as the lines dividing the own lane, and recognizes the region between them as the travel lane.
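As a rough sketch of the vertical-edge recognition just described (the use of OpenCV, the threshold, and the topmost-pixel heuristic are all assumptions; the patent does not specify an implementation):

```python
import cv2
import numpy as np

def upper_edge_candidates(image_gray: np.ndarray, thresh: float = 50.0) -> np.ndarray:
    """Collect the topmost strong vertical-contrast pixel in each image
    column as a candidate point of the upper end portion UE(SW)."""
    gy = cv2.Sobel(image_gray, cv2.CV_32F, 0, 1, ksize=3)  # vertical gradient
    strong = np.abs(gy) > thresh
    pts = [(x, ys[0]) for x in range(strong.shape[1])
           if (ys := np.flatnonzero(strong[:, x])).size]
    return np.array(pts, dtype=np.float32)  # (x, y) image coordinates
```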
Then, the action plan generating unit 150 generates the target trajectory so as not to deviate from the travel lane in principle (when the lane is changed, branched, or merged, the travel lane is switched in the middle). Therefore, in the control of the automatic driving, it is an important element to be able to recognize the lane to be followed.
Here, the recognition unit 130 cannot always clearly recognize the road dividing line. For example, a road dividing line of the type drawn on the road as a white or yellow line may wear away through aging. In addition, during road construction or in the vicinity of a toll gate, there may be stretches of road with no dividing line at all. Further, another vehicle or the like present around the host vehicle M may prevent the camera 10 from capturing the road dividing line. Hereinafter, these various scenes are collectively referred to as "when the dividing line is missing." Even when the dividing line is missing, the host vehicle M must still travel along the road, so some substitute processing is desirable.
In view of this, the linear object recognition unit 132 recognizes a linear object that exists at a height different from the road and extends along the road as viewed from above. In the example of Fig. 3, the upper end portion UE(GR) of the guard rail and the upper end portion UE(SW) of the side wall correspond to such linear objects. Further, since a linear object is hard to recognize when another vehicle or the like comes between it and the camera 10, the linear object is preferably an object located above the camera 10; the upper end portion UE(SW) of the side wall SW is such an object. In the following, the linear object recognition unit 132 is described as recognizing the position, in the captured image IM, of the upper end portion UE(SW) of the side wall SW.
The linear object recognition unit 132 recognizes, for example, the positions of the plurality of pixels constituting the upper end portion UE(SW) of the side wall SW, approximates them with a straight line, and treats that straight line in the image plane as the recognized object. When the dividing line is missing, the control unit 152 then feedback-controls the steering angle of the host vehicle M so that the inclination of this straight line remains constant.
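A minimal sketch of this straight-line approximation, assuming NumPy: the candidate pixels are fitted with a least-squares line, and the line's angle relative to the image's lateral (x) axis is the inclination that the feedback control holds constant.

```python
import numpy as np

def inclination(points: np.ndarray) -> float:
    """Least-squares line fit y = m*x + c through the UE(SW) pixel
    positions; returns the angle phi of the line relative to the
    lateral direction of the image."""
    m, _c = np.polyfit(points[:, 0], points[:, 1], deg=1)
    return float(np.arctan(m))
```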
Fig. 4 is a diagram showing the relationship between the turning angle of the host vehicle M and the inclination of the upper end portion UE(SW) of the side wall SW. As shown in the figure, when the host vehicle M turns leftward, the inclination of the upper end portion UE(SW) with respect to the lateral direction of the image increases (becomes steeper), and when the host vehicle M turns rightward, that inclination decreases (becomes gentler). Using this, the control unit 152 performs the feedback control shown in equation (1) when the dividing line is missing, where θ is the steering angle, Δθ is the steering-angle change amount, φ is, for example, the angle of the upper end portion UE(SW) of the side wall SW with respect to a reference direction in the image, and φ0 is that angle at the initial time at which the dividing line was lost. Further, KP is a proportional gain, KI is an integral gain, and KD is a derivative gain.

Δθ = KP·(φ - φ0) + KI·∫(φ - φ0)·dt + KD·{d(φ - φ0)/dt} ... (1)

Although equation (1) shows PID control as an example, the control unit 152 may instead perform P control, PI control, or any similar feedback control when the dividing line is missing.
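A discrete-time sketch of the PID update in equation (1); the class shape, gains, and time step are placeholders rather than anything specified by the patent. The same controller also serves equation (3) if the error is taken as (h - h0).

```python
class PID:
    """Incremental PID on the error e = phi - phi0, yielding the
    steering-angle change delta-theta of equation (1)."""
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, error: float) -> float:
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```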
Fig. 5 is a flowchart showing an example of the processing flow executed by the linear object recognition unit 132 and the control unit 152 when the dividing line is missing.
First, the linear object recognition unit 132 determines whether the recognition unit 130 has lost (can no longer recognize) the road dividing line (step S100). When the road dividing line has been lost, the linear object recognition unit 132 recognizes the linear object and stores its inclination φ0 with respect to the captured image IM in the memory (step S102). Next, the control unit 152 performs feedback control of the steering angle θ of the host vehicle M so that the angle φ that the linear object forms in the image approaches φ0 (step S104). The process of step S104 is repeated until the road dividing line is recognized again by the recognition unit 130 (step S106).
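The loop below is a hypothetical rendering of the Fig. 5 flow using the PID helper sketched above; the recognizer and steering interfaces are assumed names, not part of the patent.

```python
import time

def dividing_line_fallback_loop(recognizer, steering, pid, period_s=0.1):
    phi0 = None
    while True:
        if recognizer.dividing_line_visible():          # S100 / S106
            phi0 = None                                 # normal control resumes
        else:
            phi = recognizer.linear_object_angle()      # S102
            if phi0 is None:
                phi0 = phi                              # latch angle at loss time
            steering.add_angle(pid.update(phi - phi0))  # S104, equation (1)
        time.sleep(period_s)
```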
[ estimation of Curve ]
Next, the contents of the processing executed when the dividing line is missing by the linear object recognition unit 132 and the curve estimation unit 134 of the recognition unit 130, and by the dividing line missing control unit 152 of the action plan generation unit 150, will be described.
The curve estimation unit 134 estimates a curve existing ahead of the host vehicle M from the position of the linear object recognized by the linear object recognition unit 132. Fig. 6 is a diagram showing the relationship between a curve existing ahead of the host vehicle M and the upper end portion UE(SW) of the side wall SW. As shown in the figure, when a right curve lies ahead of the host vehicle M, the upper end portion UE(SW) of the side wall SW appears as a downward-convex curve, and its inclination with respect to the lateral direction of the image becomes smaller (gentler) than when no curve lies ahead. Conversely, when a left curve lies ahead of the host vehicle M, the upper end portion UE(SW) appears as an upward-convex curve, and its inclination with respect to the lateral direction of the image becomes larger (steeper) than when no curve lies ahead.
The curve estimation unit 134 approximates the arrangement of the pixels constituting the linear object recognized by the linear object recognition unit 132 with a curve such as a cubic curve, and estimates the start point, the curvature, and the like of the road curve from the parameters of the fitted curve, for example by applying those parameters to a map prepared in advance.
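A sketch of the cubic approximation, assuming NumPy. The analytic curvature below is computed in image coordinates purely as a stand-in; as stated above, the patent instead applies the fitted parameters to a map prepared in advance to obtain the road curvature and the curve start point.

```python
import numpy as np

def fit_cubic(points: np.ndarray) -> np.poly1d:
    """Cubic least-squares approximation y(x) of the pixels constituting
    the linear object, in image coordinates."""
    return np.poly1d(np.polyfit(points[:, 0], points[:, 1], deg=3))

def image_curvature(curve: np.poly1d, x: float) -> float:
    """Plane-curve curvature |y''| / (1 + y'^2)^1.5 at column x."""
    d1 = curve.deriv(1)(x)
    d2 = curve.deriv(2)(x)
    return abs(d2) / (1.0 + d1 * d1) ** 1.5
```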
When the host vehicle M reaches the start point of the curve estimated by the curve estimation unit 134 while the recognition unit 130 is still missing the road dividing line, the control unit 152 controls the steering angle θ based on the curvature of the curve estimated by the curve estimation unit 134.
For example, when the dividing line is missing, the control unit 152 derives the feedforward steering angle θ_ff from equation (2), where C is the curvature of the curve.

θ_ff = f(C) ... (2)
The control unit 152 may also determine a feedback steering angle θ_fb from the height position (h in Fig. 6) at which the upper end portion UE(SW) of the side wall SW meets the edge of the image when the dividing line is missing. The feedback steering angle θ_fb is given by equation (3).

Δθ_fb = KP·(h - h0) + KI·∫(h - h0)·dt + KD·{d(h - h0)/dt} ... (3)
Then, as shown in equation (4), the control unit 152 may determine the steering angle θ of the host vehicle M from a weighted sum of the feedforward steering angle θ_ff and the feedback steering angle θ_fb when the dividing line is missing, where, for example, α + β = 1.

θ = α·θ_ff + β·θ_fb ... (4)
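The sketch below mirrors equations (2) to (4). The lookup table standing in for the feedforward map f(C) and the weights α and β are assumed values; θ_fb would come from a PID on (h - h0) as in equation (3).

```python
import numpy as np

# Hypothetical table standing in for f(C) in equation (2), "prepared in advance".
CURVATURES = np.array([0.0, 0.002, 0.005, 0.010])  # [1/m], assumed values
FF_ANGLES  = np.array([0.0, 0.020, 0.050, 0.100])  # [rad], assumed values

def feedforward_angle(curvature: float) -> float:
    """Equation (2): theta_ff = f(C), here by linear interpolation."""
    return float(np.interp(curvature, CURVATURES, FF_ANGLES))

def blended_angle(theta_ff: float, theta_fb: float,
                  alpha: float = 0.7, beta: float = 0.3) -> float:
    """Equation (4): theta = alpha*theta_ff + beta*theta_fb, alpha + beta = 1."""
    return alpha * theta_ff + beta * theta_fb
```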
Fig. 7 is a flowchart showing an example of the processing flow executed by the linear object recognition unit 132, the curve estimation unit 134, and the control unit 152 when the dividing line is missing. The processing of the flowchart of fig. 7 may be executed in parallel with the processing of the flowchart of fig. 5, or only one of the processing of the flowcharts of fig. 5 and 7 may be executed.
First, the linear object recognition unit 132 recognizes a linear object (step S200). Next, the curve estimation unit 134 determines whether the extending form of the linear object has changed (step S202). As described with reference to Fig. 6, the curve estimation unit 134 determines this from the inclination of the linear object, and from the manner of the change infers the direction and degree of the curve.
When it is determined that the extending form of the linear object has changed, the curve estimation unit 134 estimates the presence, start point, curvature, and the like of the curve and stores this information in the memory (step S204). Next, the dividing line missing control unit 152 determines whether the recognition unit 130 is missing (cannot recognize) the road dividing line (step S206). If the road dividing line is not missing, the process returns to step S200.
When the recognition unit 130 is missing the road dividing line, the control unit 152 determines the feedforward steering angle from the curvature stored in the memory (step S208) and waits until the host vehicle M reaches the curve start point (step S210). When the host vehicle M reaches the curve start point, the control unit 152 controls the steering angle of the host vehicle M by the feedforward and feedback control described above (step S212). The process of step S212 is executed until the road dividing line is recognized again by the recognition unit 130 (step S214). If the road dividing line is recognized again during steps S208 and S210, the processing may exit this flowchart.
The vehicle control device according to the first embodiment described above includes: an image pickup unit (10) that picks up an image of the front or rear of the vehicle; a linear object recognition unit (132) that recognizes, in the image captured by the image pickup unit, a linear object that is present at a height different from the road and extends along the road as viewed from above; and driving control units (150, 160) that control the travel of the vehicle according to the position of the linear object in the image. It can thereby perform substitute control in a wider range of scenes in which the road dividing line is missing.
The description of Fig. 4 assumed that the camera 10 captures an image of the area ahead of the host vehicle M, but the camera 10 may instead capture an image of the area behind it. In that case, the relationship between the turning angle and the inclination differs from the one described for Fig. 4. Note that if the camera 10 does not image the area ahead of the host vehicle M, estimating a curve is difficult.
< second embodiment >
In the second embodiment, an example is described in which the vehicle control device is applied to a driving support device that performs driving support such as lane keeping control. Unlike the first embodiment, the driving support device is not mounted on an automated driving vehicle; it is assumed to be mounted mainly on a vehicle that is driven manually, or on a vehicle that selectively performs automated driving and manual driving with driving support.
Fig. 8 is a configuration diagram of the driving support apparatus 400 according to the second embodiment. The driving support apparatus 400 includes, for example, a lane keeping control unit 420, a linear object recognition unit 432, a curve estimation unit 434, and a dividing line missing control unit 452. These components are realized by a hardware processor such as a CPU executing a program (software). Some or all of them may instead be realized by hardware (circuit unit) such as an LSI, an ASIC, an FPGA, or a GPU, or by cooperation of software and hardware.
Like the recognition unit 130 of the first embodiment, the lane keeping control unit 420 recognizes the road dividing line and the traveling lane, and gives a steering reaction force to the steering wheel so that the host vehicle M does not deviate from the traveling lane, or controls the steering angle in the direction opposite to the deviation direction.
The linear object recognition unit 432, the curve estimation unit 434, and the dividing line missing control unit 452 have the same functions as the linear object recognition unit 132, the curve estimation unit 134, and the dividing line missing control unit 152 of the first embodiment, respectively. Thus, as in the first embodiment, when the lane keeping control unit 420 loses the road dividing line, the driving support apparatus 400 of the second embodiment performs control that keeps the lane according to the position of a linear object instead. Since the driving support presumes manual driving, if this substitute lane keeping control continues for a certain period, the driving support may be stopped after notifying the passenger.
According to the second embodiment described above, the same effects as those of the first embodiment can be exhibited.
< hardware Structure >
Fig. 9 is a diagram showing an example of the hardware configuration of the automatic driving control device 100 according to the first embodiment or the driving support device 400 according to the second embodiment (hereinafter, "the automatic driving control device 100 and the like"). As shown in the figure, the automatic driving control device 100 and the like are configured such that a communication controller 100-1, a CPU 100-2, a RAM (Random Access Memory) 100-3 used as a working memory, a ROM (Read Only Memory) 100-4 storing a boot program and the like, a storage device 100-5 such as a flash memory or an HDD (Hard Disk Drive), a drive device 100-6, and the like are connected to one another by an internal bus or a dedicated communication line. The communication controller 100-1 communicates with components other than the automatic driving control device 100 and the like. The storage device 100-5 stores a program 100-5a executed by the CPU 100-2. This program is expanded into the RAM 100-3 by a DMA (Direct Memory Access) controller (not shown) or the like and executed by the CPU 100-2. As a result, one or both of the recognition unit 130 and the action plan generation unit 150, or some or all of the linear object recognition unit 432, the curve estimation unit 434, and the dividing line missing control unit 452, are realized.
The above-described embodiments can be expressed as follows.
The vehicle control device is configured to include:
an image pickup unit that picks up an image of the front or rear of the vehicle;
a storage device storing a program; and
a hardware processor for executing a program of a program,
the hardware processor executes the program, thereby,
recognizing a linear object existing at a height different from a road and extending along the road as viewed from above, in an image captured by the image capturing unit,
and controlling the vehicle to travel according to the position of the linear object in the image.
While modes for carrying out the present invention have been described above using the embodiments, the present invention is not limited to these embodiments in any way, and various modifications and substitutions can be made without departing from the scope of the present invention.

Claims (7)

1. A vehicle control device is characterized by comprising:
an image pickup unit that picks up an image of the front of the vehicle;
a surrounding situation recognition unit that recognizes a situation of the surroundings of the vehicle;
a linear object recognition unit that recognizes a linear object that exists at a different height from a road and extends along the road when viewed from above, from an image captured by the imaging unit;
a driving control unit that controls traveling of the vehicle according to a position of the linear object in the image; and
a curve estimation unit that estimates the presence of a curve ahead of the travel of the vehicle and the curvature of the curve based on the extended form of the linear object recognized by the linear object recognition unit,
the driving control unit controls steering of the vehicle according to a weighted sum of a feedforward steering angle calculated based on a curvature of the curve estimated by the curve estimation unit and a feedback steering angle calculated based on a height position of an image end of the linear object, when the vehicle reaches a start point of the curve estimated by the curve estimation unit and the surrounding situation recognition unit does not recognize the road dividing line at that time.
2. The vehicle control apparatus according to claim 1,
the driving control unit controls steering of the vehicle such that an inclination of the linear object in the image is fixed.
3. The vehicle control apparatus according to claim 2,
the driving control unit performs feedback control for maintaining the inclination of the linear object in the image constant with respect to a steering angle of the vehicle.
4. The vehicle control apparatus according to claim 1,
the linear object is an object provided above the imaging unit.
5. The vehicle control apparatus according to claim 4,
the linear object is an upper end portion of a side wall of the road.
6. A vehicle control method characterized in that,
the image pickup unit picks up an image of the front of the vehicle,
the surrounding situation recognition unit recognizes a situation of the surroundings of the vehicle,
the linear object recognition unit recognizes, in an image captured by the image pickup unit, a linear object that is present at a height different from the road and extends along the road in a plan view,
the curve estimation unit estimates the presence of a curve ahead of the travel of the vehicle and the curvature of the curve based on the extended form of the linear object recognized by the linear object recognition unit,
when the vehicle reaches the start point of the curve estimated by the curve estimation unit and the surrounding situation recognition unit does not recognize the road dividing line at that time, the driving control unit controls the steering of the vehicle based on a weighted sum of a feedforward steering angle calculated based on the curvature of the curve estimated by the curve estimation unit and a feedback steering angle calculated based on the height position of the image end portion of the linear object.
7. A storage medium characterized in that,
the storage medium stores a program that causes a computer mounted in a vehicle having an imaging unit that images the front of the vehicle to execute:
the surrounding condition of the vehicle is identified,
recognizing a linear object that is present at a different height from a road and extends along the road in a plan view, in an image captured by the imaging unit,
estimating the presence of a curve ahead of the vehicle and the curvature of the curve based on the extended form of the linear object recognized by the linear object recognition unit,
when the vehicle reaches the start point of the estimated curve and no road dividing line is recognized at that time, the steering of the vehicle is controlled based on a weighted sum of a feed-forward steering angle calculated based on the estimated curvature of the curve and a feedback steering angle calculated based on the height position of the image end of the linear object.
CN201811587645.4A 2017-12-26 2018-12-24 Vehicle control device, vehicle control method, and storage medium Active CN109987082B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017248942A JP6965152B2 (en) 2017-12-26 2017-12-26 Vehicle control devices, vehicle control methods, and programs
JP2017-248942 2017-12-26

Publications (2)

Publication Number Publication Date
CN109987082A CN109987082A (en) 2019-07-09
CN109987082B true CN109987082B (en) 2022-09-09

Family

ID=67129154

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811587645.4A Active CN109987082B (en) 2017-12-26 2018-12-24 Vehicle control device, vehicle control method, and storage medium

Country Status (2)

Country Link
JP (1) JP6965152B2 (en)
CN (1) CN109987082B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11352010B2 (en) * 2019-09-30 2022-06-07 Baidu Usa Llc Obstacle perception calibration system for autonomous driving vehicles
JP7325296B2 (en) 2019-10-25 2023-08-14 日産自動車株式会社 Object recognition method and object recognition system
JP7304379B2 (en) * 2021-03-30 2023-07-06 本田技研工業株式会社 DRIVER ASSISTANCE SYSTEM, DRIVER ASSISTANCE METHOD, AND PROGRAM
CN115000469B (en) * 2022-07-11 2022-11-08 佛山市清极能源科技有限公司 Power control method of fuel cell system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202271993U (en) * 2011-07-29 2012-06-13 富士重工业株式会社 Vehicle drive-assistant device
JP2014019262A (en) * 2012-07-17 2014-02-03 Fuji Heavy Ind Ltd Vehicular drive support device
JP2017220056A (en) * 2016-06-08 2017-12-14 株式会社デンソー Information processing device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07125563A (en) * 1993-11-01 1995-05-16 Mitsubishi Motors Corp Travel controller of automobile
JP5007840B2 (en) * 2009-05-22 2012-08-22 トヨタ自動車株式会社 Driving assistance device
JP6456761B2 (en) * 2015-04-21 2019-01-23 本田技研工業株式会社 Road environment recognition device, vehicle control device, and vehicle control method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202271993U (en) * 2011-07-29 2012-06-13 富士重工业株式会社 Vehicle drive-assistant device
JP2014019262A (en) * 2012-07-17 2014-02-03 Fuji Heavy Ind Ltd Vehicular drive support device
JP2017220056A (en) * 2016-06-08 2017-12-14 株式会社デンソー Information processing device

Also Published As

Publication number Publication date
JP2019112007A (en) 2019-07-11
JP6965152B2 (en) 2021-11-10
CN109987082A (en) 2019-07-09

Similar Documents

Publication Publication Date Title
CN109484404B (en) Vehicle control device, vehicle control method, and storage medium
JP6793845B2 (en) Vehicle control devices, vehicle control methods, and programs
CN111201170B (en) Vehicle control device and vehicle control method
CN109987082B (en) Vehicle control device, vehicle control method, and storage medium
JP6859239B2 (en) Vehicle control devices, vehicle control methods, and programs
CN109835344B (en) Vehicle control device, vehicle control method, and storage medium
CN109693667B (en) Vehicle control device, vehicle control method, and storage medium
US11370420B2 (en) Vehicle control device, vehicle control method, and storage medium
CN110341704B (en) Vehicle control device, vehicle control method, and storage medium
JP7071250B2 (en) Vehicle control devices, vehicle control methods, and programs
CN111273651B (en) Vehicle control device, vehicle control method, and storage medium
JP6614509B2 (en) Vehicle control device, vehicle control method, and program
CN110194153B (en) Vehicle control device, vehicle control method, and storage medium
CN109559540B (en) Periphery monitoring device, periphery monitoring method, and storage medium
CN115158348A (en) Mobile object control device, mobile object control method, and storage medium
CN114954511A (en) Vehicle control device, vehicle control method, and storage medium
CN115158347A (en) Mobile object control device, mobile object control method, and storage medium
JP7028838B2 (en) Peripheral recognition device, peripheral recognition method, and program
CN113492844A (en) Vehicle control device, vehicle control method, and storage medium
CN113479204A (en) Vehicle control device, vehicle control method, and storage medium
CN112141097A (en) Vehicle control device, vehicle control method, and storage medium
US20240182024A1 (en) Vehicle control device, vehicle control method, and storage medium
JP7160730B2 (en) VEHICLE SYSTEM, VEHICLE SYSTEM CONTROL METHOD, AND PROGRAM
JP6575016B2 (en) Vehicle control apparatus, vehicle control method, and program
JP6951547B2 (en) Vehicle control devices, vehicle control methods, and programs

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant