CN110785334B - Vehicle control device - Google Patents

Vehicle control device

Info

Publication number
CN110785334B
Authority
CN
China
Prior art keywords
vehicle
driver
control
sight
line
Prior art date
Legal status
Active
Application number
CN201780092380.8A
Other languages
Chinese (zh)
Other versions
CN110785334A
Inventor
木间塚渡
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Publication of CN110785334A publication Critical patent/CN110785334A/en
Application granted granted Critical
Publication of CN110785334B publication Critical patent/CN110785334B/en


Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/02 - Control of vehicle driving stability
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/08 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models related to drivers or passengers

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

A vehicle control device is provided with: a determination unit that determines an area toward which a driver's line of sight is directed; and a control unit that operates based on an operation by a driver and executes a first control for causing the vehicle to travel so as to approach a first reference position, which is a relative position with respect to a travel lane, wherein the control unit automatically operates when the determination unit determines that the line of sight of the driver is out of a predetermined area or when the determination unit determines that the line of sight of the driver is directed toward a specific area, which is an area different from the predetermined area, and executes a second control for causing the vehicle to travel so as to approach a second reference position, which is based on the position of the vehicle when the determination unit determines that the line of sight is out of the predetermined area or directed toward the specific area.

Description

Vehicle control device
Technical Field
The invention relates to a vehicle control device, a vehicle control method, and a program.
Background
Conventionally, there has been disclosed a vehicle travel control device including: a normal-time control execution means that executes lane keeping control for causing the host vehicle to travel along a travel lane when a predetermined switch operated by the vehicle driver is in an on state that permits execution of the control; a distraction state determination unit that determines whether or not the vehicle driver is in a distracted state; and a distraction-time control execution unit that executes the lane keeping control when the predetermined switch is in an off state in which execution of the control is not permitted and the distraction state determination unit determines that the vehicle driver is in a distracted state (see, for example, patent document 1).
Documents of the prior art
Patent document
Patent document 1: Japanese Patent No. 4835291
Disclosure of Invention
Problems to be solved by the invention
However, if, when it is determined that the driver is in the distracted state as described above, the vehicle is controlled in the same manner as when the predetermined switch is in the on state that permits execution of the control, the occupant may be given a sense of discomfort by the behavior of the vehicle.
The present invention has been made in view of the above circumstances, and an object thereof is to provide a vehicle control device, a vehicle control method, and a program that can improve the comfort of a passenger.
Means for solving the problems
(1): a vehicle control device is provided with: a determination unit that determines an area toward which a driver's line of sight is directed; and a control unit that operates based on an operation by a driver and executes a first control for causing the vehicle to travel so as to approach a first reference position, which is a relative position with respect to a travel lane, wherein the control unit automatically operates when the determination unit determines that the line of sight of the driver is out of a predetermined area or when the determination unit determines that the line of sight of the driver is directed toward a specific area, which is an area different from the predetermined area, and executes a second control for causing the vehicle to travel so as to approach a second reference position, which is based on the position of the vehicle when the determination unit determines that the line of sight is out of the predetermined area or directed toward the specific area.
(2): in the aspect (1), the predetermined area is a predetermined area in an external environment in front of the vehicle, and the specific area is a specific area in a vehicle interior of the vehicle.
(3): in (1) or (2), the first reference position is a position on a center line of the traveling lane.
(4): in any one of (1) to (3), the determination unit determines whether or not the driver has moved the line of sight from the predetermined area to an in-vehicle device that can be operated by the driver, and the control unit executes the second control when the operation is not performed and the determination unit determines that the driver has moved the line of sight from the predetermined area to the in-vehicle device.
(5): in any one of (1) to (4), the determination unit determines whether or not the driver has moved the line of sight from the predetermined area to a direction of a rear seat, and the control unit executes the second control when the determination unit determines that the driver has moved the line of sight from the predetermined area to the direction of the rear seat without performing the operation.
(6): in any one of (1) to (3), the control unit executes the second control at an earlier timing when the determination unit determines that the line of sight is out of the predetermined area and is directed toward the specific area than when the determination unit determines that the line of sight is out of the predetermined area and is not directed toward the specific area.
(7): in any one of (1) to (6), the control unit executes the second control when a speed of the vehicle is equal to or higher than a predetermined speed.
(8): in any one of (1) to (7), the control portion executes the second control when the vehicle is traveling on an expressway.
(9): in any one of (1) to (8), the vehicle control apparatus further includes a driver determination unit that determines whether or not the hand of the driver is located within a setting area set with respect to an in-vehicle device that can be operated by the driver, and the control unit executes the second control when the determination unit determines that the line of sight of the driver is out of the predetermined area and the driver determination unit determines that the hand of the driver is located within the setting area.
(10): in any one of (1) to (9), the determination unit determines whether or not a line of sight falling point is located on a rear mirror unit provided in a vehicle for confirming a periphery of the vehicle or a display unit for displaying an image obtained by imaging the periphery of the vehicle after the line of sight of the driver has deviated, and the control unit does not execute the second control when the determination unit determines that the line of sight falling point is located on the rear mirror unit or the display unit after the line of sight of the driver has deviated from the predetermined area without performing the operation.
(11): in any one of (1) to (10), the control unit may cause the vehicle to travel so as to approach the first reference position after the second control is executed for a predetermined time.
(12): in any one of (1) to (10), the control unit sets, as the second reference position, a position that is gradually closer to the first reference position from a position of the vehicle when it is determined that the line of sight is deviated, in the second control.
(13): a vehicle control device is provided with:
a driver determination unit that determines whether or not a hand of a driver is located within a setting area set for an in-vehicle device that can be operated by the driver;
a control unit that operates in accordance with an operation by a driver, and executes a first control for causing the vehicle to travel so as to approach a first reference position that is a relative position with respect to a travel lane,
the control unit is automatically operated when the driver determination unit determines that the hand of the driver is located within the setting area, and executes a second control for causing the vehicle to travel so as to approach a second reference position based on the position of the vehicle when the hand of the driver is determined to be located within the setting area.
(14): a vehicle control device is provided with:
an inquiry judging unit for judging whether or not an inquiry requiring a response from a driver is made;
a control unit that operates in accordance with an operation by a driver, and executes a first control for causing the vehicle to travel so as to approach a first reference position that is a relative position with respect to a travel lane,
the control unit automatically operates when the inquiry determination unit determines that the inquiry has been made, and executes second control for causing the vehicle to travel so as to approach a second reference position that is based on the position of the vehicle when the inquiry is determined to have been made.
(15): in (14), the query is a voice guide-based query from an in-vehicle device, and the response is made by the voice of the driver.
Effects of the invention
According to (1) to (6) and (9) to (13), the comfort of the passenger can be improved.
According to (7) and (8), the passenger comfort can be improved in a situation where the behavior of the vehicle in the lateral direction affects the passenger comfort.
According to (14) and (15), the second control is executed when an inquiry or a conversation that triggers the driver to start a thinking action is made, whereby the comfort of the passenger can be further improved.
Drawings
Fig. 1 is a diagram showing an example of a configuration of a vehicle on which a driving support apparatus 100 is mounted.
Fig. 2 is a diagram showing an example of the installation position of the touch panel 12.
Fig. 3 is a diagram illustrating the position of the sight line falling point when it is determined that the driver's sight line is out of the predetermined area.
Fig. 4 is a diagram for explaining the processing of the driver detection unit.
Fig. 5 is a diagram showing an example of the relationship between the reaction force and the relative position of the host vehicle M with respect to the host lane.
Fig. 6 is a diagram showing an example of behavior of the host vehicle M when the first control and the second control are executed.
Fig. 7 is a flowchart showing a flow of the first control process.
Fig. 8 is a flowchart showing the flow of the process of the second control.
Fig. 9 is a flowchart showing a flow of processing executed by the driving support apparatus 100.
Fig. 10 is a flowchart showing another example of the flow of the line-of-sight determination process.
Fig. 11 is a diagram showing an example of behavior of the own vehicle M in modification 1.
Fig. 12 is a diagram showing an example of behavior of the own vehicle M in modification 2.
Fig. 13 is a diagram showing an example of the configuration of a vehicle on which the driving support apparatus 100A according to the second embodiment is mounted.
Fig. 14 is a diagram showing an example of the hardware configuration of the driving support apparatus 100 according to the embodiment.
Detailed Description
< first embodiment >
Fig. 1 is a diagram showing an example of the configuration of a vehicle (hereinafter, the host vehicle M) on which a driving support apparatus 100 is mounted. The driving support apparatus 100 is connected to, for example, an HMI (Human Machine Interface) 10, a radar device 20, a camera 22, an image recognition device 24, and a vehicle sensor 30. The driving support apparatus 100 is also connected to, for example, a traveling driving force output device 40, a brake device 42, a steering device 44, and a vehicle interior camera 50.
The driving support apparatus 100 includes, for example, a line-of-sight determination unit 110, a driver detection unit 120, a lane keeping support control unit 130, and a follow-up running support control unit 150. The above-described components are realized by a hardware processor such as a CPU (Central Processing Unit) executing a program (software). Some or all of the above components may be realized by hardware (circuitry) such as an LSI (Large Scale Integrated circuit), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or may be realized by cooperation of software and hardware. The lane keeping support control unit 130 and the follow-up running support control unit 150 may be realized by one processor or by a plurality of processors. In the latter case, the driving support apparatus 100 may be a system in which a plurality of ECUs (Electronic Control Units) are combined.
The HMI 10 is a device that receives operations from a passenger of the host vehicle M and outputs information. The HMI 10 includes, for example, a touch panel 12, switches not shown, and the like. The touch panel 12 may be configured by combining a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) display with a touch pad. The touch panel 12 is an example of an "in-vehicle device". The HMI 10 may include an in-vehicle device other than the touch panel 12.
Fig. 2 is a diagram showing an example of the installation position of the touch panel 12. As shown in the drawing, the touch panel 12 is located, for example, below the front windshield and is provided on an instrument panel located in front of the driver's seat and the front passenger's seat. The touch panel 12 may also function as an instrument display, provided in front of the driver's seat, that shows meters such as a speedometer and a tachometer. A steering wheel provided in front of the driver's seat is provided with a main switch 14 and an LKAS (Lane Keeping Assistance System) operation switch 16, which will be described later. The main switch 14 and the LKAS operation switch 16 may be configured to be included in the HMI 10. An ACC (Adaptive Cruise Control) switch or other switches for causing the vehicle to perform predetermined control may also be provided on the steering wheel.
The radar device 20 radiates radio waves such as millimeter waves to the front of the host vehicle M, and detects radio waves reflected by an object (reflected waves) to detect at least the position (distance and direction) of the object. One or more radar devices 20 are mounted on an arbitrary portion of the host vehicle M. The radar device 20 may detect the position and speed of an object by an FM-CW (Frequency Modulated Continuous Wave) method. The radar device 20 transmits the detection result to the driving support device 100.
The camera 22 is a digital camera using a solid-state image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor). The camera 22 is mounted at any position of the host vehicle M. When shooting the front, the camera 22 is attached to the upper part of the front windshield, the rear surface of the vehicle interior mirror, or the like. The camera 22 repeatedly photographs the periphery of the own vehicle M periodically, for example. The camera 22 may be a still camera. The camera 22 outputs the captured image to the image recognition device 24.
The image recognition device 24 performs image processing on the image captured by the camera 22 to recognize the position, type, speed, and the like of an object existing around the vehicle M. The image recognition device 24 outputs the recognition result to the driving support device 100.
The vehicle sensor 30 includes: a vehicle speed sensor that detects a speed of the vehicle M; an acceleration sensor that detects acceleration; a yaw rate sensor that detects an angular velocity about a vertical axis; and an orientation sensor for detecting the orientation of the vehicle M. The vehicle sensor 30 outputs the detection result to the driving support apparatus 100.
The running drive force output device 40 outputs running drive force (torque) for running of the vehicle M to the drive wheels. The running drive force output device 40 includes, for example, a combination of an internal combustion engine, a motor, a transmission, and the like, and an ECU that controls these components. The ECU controls the above configuration in accordance with information input from the driving support apparatus 100 or information input from a driving operation member not shown.
The brake device 42 includes, for example, a caliper, a hydraulic cylinder that transmits hydraulic pressure to the caliper, an electric motor that generates hydraulic pressure in the hydraulic cylinder, and a brake ECU. The brake ECU controls the electric motor in accordance with information input from the driving assistance device 100 or information input from a driving operation member not shown, and outputs a braking torque corresponding to a braking operation to each wheel. The brake device 42 may have a mechanism for transmitting a hydraulic pressure generated by an operation of a brake pedal included in the driving operation tool to the hydraulic cylinder via the master cylinder as a backup. The brake device 42 is not limited to the above-described configuration, and may be an electronically controlled hydraulic brake device as follows: the actuator is controlled in accordance with information input from the driving support apparatus 100, and the hydraulic pressure of the master cylinder is transmitted to the hydraulic cylinder.
The steering device 44 includes, for example, a steering ECU and an electric motor. The electric motor changes the direction of the steered wheels by, for example, applying a force to a rack and pinion mechanism. The steering ECU drives the electric motor to change the direction of the steered wheels in accordance with information input from the driving support apparatus 100 or information input from the driving operation member.
The vehicle interior camera 50 photographs the upper body of a passenger seated in the driver's seat. The captured image of the vehicle interior camera 50 is output to the driving support apparatus 100.
[ Sight line judging section ]
The line-of-sight determination unit 110 performs the line-of-sight determination process as follows. The sight line determination unit 110 analyzes an image captured by the vehicle interior camera 50. The sight line determination unit 110 determines the area to which the driver's sight line is directed based on the analysis result, and outputs the determination result to the lane keeping support control unit 130. The predetermined area is, for example, an area that is considered to require visual confirmation when the driver monitors the front.
For example, the sight line determination unit 110 detects the positional relationship between the head and the eyes of the driver and the combination of a reference point and a moving point in the eyes from the image by using a method such as template matching. The sight line determination unit 110 performs conversion processing from the image plane to the actual plane and the like based on the position of the eyes with respect to the head and the position of the moving point with respect to the reference point, and derives the direction of the line of sight. For example, when the reference point is the inner canthus, the moving point is the iris; when the reference point is the corneal reflection region, the moving point is the pupil. The corneal reflection region is a region of the cornea that reflects infrared light when the driver is irradiated with infrared light by the vehicle interior camera 50 or the like. In this way, the sight line determination unit 110 determines whether the derived sight line direction is directed toward the predetermined area.
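As a rough illustration of this derivation (not part of the disclosure), the following Python sketch maps the displacement of the moving point relative to the reference point, combined with the head pose, into a gaze direction; the gain constant, function names, and coordinate conventions are assumptions.

```python
GAZE_GAIN_DEG_PER_PX = 0.5  # assumed conversion from pixel displacement to eye rotation

def derive_gaze_direction(head_yaw_deg, head_pitch_deg, ref_pt, moving_pt):
    """Return (yaw, pitch) of the line of sight in degrees.

    ref_pt / moving_pt are (x, y) pixel positions of the reference point
    (e.g. inner canthus or corneal reflection region) and the moving point
    (iris or pupil) detected in the image of the vehicle interior camera 50.
    """
    dx = moving_pt[0] - ref_pt[0]
    dy = moving_pt[1] - ref_pt[1]
    eye_yaw = dx * GAZE_GAIN_DEG_PER_PX      # eye-in-head rotation approximated as linear
    eye_pitch = -dy * GAZE_GAIN_DEG_PER_PX   # image y grows downward
    # Head pose and eye rotation are combined to give the gaze direction.
    return head_yaw_deg + eye_yaw, head_pitch_deg + eye_pitch

if __name__ == "__main__":
    yaw, pitch = derive_gaze_direction(5.0, -2.0, (100, 60), (104, 58))
    print(f"gaze yaw={yaw:.1f} deg, pitch={pitch:.1f} deg")
```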
The line-of-sight determination unit 110 may directly determine from the image whether or not the line of sight of the driver is directed to the predetermined area based on information stored in a storage device provided in the driving support device 100. The storage device stores information indicating whether or not the line of sight exists in a predetermined area with respect to the relationship between the head direction of the driver, the position of the reference point, and the position of the moving point in the image.
Fig. 3 is a diagram illustrating the position of the sight line falling point when it is determined that the driver's sight line is out of the predetermined area. Hereinafter, the XY coordinates will be used as necessary for explanation. For example, the X direction is the central axis direction of the host vehicle M, and the Y direction is the width direction of the host vehicle M.
The area AR is a determination area for determining whether the line of sight of the driver is directed toward the predetermined area. For example, the sight line determination unit 110 obtains a position D1 as the landing point of the derived direction of the driver's line of sight. In this case, the position D1 is included in the area AR, so the sight line determination unit 110 determines that the driver's line of sight is directed toward the area AR. When the line of sight of the driver moves from the landing point D1 to the landing point D2, the sight line determination unit 110 determines that the line of sight of the driver is out of the predetermined area because the landing point D2 is outside the predetermined area. For example, by determining the eye movement of the driver as described above, the sight line determination unit 110 can determine whether or not a rapid eye movement (a so-called saccade) for obtaining foveal vision has been performed. The area AR in fig. 3 is merely an example and can be set arbitrarily.
The sight line determination unit 110 may determine that the line of sight of the driver of the host vehicle M has deviated from the predetermined area when the time during which the line of sight deviates from the area AR is equal to or longer than a predetermined time. Further, the area AR may be divided into a plurality of areas. For example, predetermined areas may be set for the interior mirror and the side mirrors in addition to the area AR. In this case, the sight line determination unit 110 does not determine that the line of sight of the driver is out of the predetermined area when the line of sight momentarily leaves the area AR (for less than the predetermined time) and moves to the interior mirror or a side mirror.
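A minimal sketch of this dwell-time check, assuming illustrative rectangular regions for the area AR and the mirrors and a hypothetical 2-second threshold (the region coordinates, threshold, and names are not taken from the disclosure):

```python
# Assumed rectangular regions for the gaze landing point, given as
# (x_min, x_max, y_min, y_max) in the X/Y coordinates of Fig. 3.
REGIONS = {
    "AR": (5.0, 100.0, -3.0, 3.0),            # forward monitoring area
    "interior_mirror": (0.4, 0.8, -0.2, 0.2),
    "left_side_mirror": (0.0, 0.3, 0.8, 1.1),
}

def region_of(point):
    """Return the name of the region containing the landing point, or None."""
    x, y = point
    for name, (x0, x1, y0, y1) in REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

class GazeAwayMonitor:
    """Reports "line of sight out of the predetermined area" only after the gaze
    has stayed outside area AR and the mirror regions for threshold_s seconds."""

    def __init__(self, threshold_s=2.0):
        self.threshold_s = threshold_s
        self._away_since = None

    def update(self, landing_point, now_s):
        if region_of(landing_point) is not None:
            self._away_since = None   # gaze is on AR or a mirror: reset the timer
            return False
        if self._away_since is None:
            self._away_since = now_s  # gaze has just left every allowed region
        return (now_s - self._away_since) >= self.threshold_s
```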
The area AR may be set for a display type side mirror or a display type mirror that displays an image captured by an imaging unit that captures the periphery of the vehicle.
In the above example, the determination of the line-of-sight landing point of the driver based on the position of the reference point of the eyes and the position of the moving point has been described. Instead, the sight line determination unit 110 may determine the sight line landing point of the driver based on the orientation of the head of the driver.
[ driver detecting part ]
The driver detection unit 120 determines whether or not a part of the body of the driver (for example, an arm or a hand) is located within a setting region set for an in-vehicle device (for example, the touch panel 12) that can be operated by the driver, based on an image captured by the in-vehicle camera 50 or another camera, a detection result of an infrared sensor, a capacitance sensor, or the like, or an operation of the in-vehicle device by the driver, and outputs the determination result to the driving support apparatus 100. In addition, the driver detection unit 120 may determine whether or not a part of the body (for example, an arm or a hand) of the driver is located within the setting area, by combining the above.
The infrared sensor is provided at a position where an object enters a set area and detects the object. The electrostatic capacitance sensor is, for example, a sensor provided in the touch panel. The electrostatic capacitance sensor outputs information indicating a change in electrostatic capacitance to the driver detection unit 120 when the hand of the driver touches the touch panel. When acquiring information indicating a change in the capacitance, the driver detection unit 120 detects that the hand of the driver touches the touch panel. The electrostatic capacity sensor described above may be a sensor as follows: when the hand of the driver approaches the touch panel (for example, when the hand approaches a position of about 15 to 20[ cm ] from the touch panel 12), the change in the electrostatic capacitance is detected, and information indicating the change in the electrostatic capacitance is output to the driver detection unit 120.
Fig. 4 is a diagram for explaining the processing of the driver detection unit. Fig. 4 shows an example in which the touch panel 12 is viewed from the +Y direction. As shown in the drawing, a setting area AR1 is set with respect to the touch panel 12. The setting area AR1 is set, for example, to an area that the hand (or a part of the body) of the driver enters immediately before an operation (a predetermined time before the operation is performed) when the driver operates the touch panel 12 while sitting in the driver's seat. When the hand of the driver enters the setting area AR1, the driver detection unit 120 outputs information indicating that the hand of the driver has entered the setting area AR1 to the lane keeping support control unit 130.
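The following sketch shows one way the cues mentioned above could be combined; the sensor fields, threshold, and function names are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class SensorReadings:
    """Assumed inputs available to the driver detection unit 120."""
    ir_beam_interrupted: bool     # infrared sensor at the boundary of area AR1
    capacitance_delta: float      # change reported by the touch-panel sensor
    hand_in_image_area: bool      # hand detected inside AR1 in the camera image

# Assumed magnitude of capacitance change treated as an approaching hand.
CAPACITANCE_THRESHOLD = 0.2

def hand_in_setting_area(readings: SensorReadings) -> bool:
    """Combine the available cues, as the text describes, to decide whether a
    part of the driver's body is inside setting area AR1 of the touch panel 12."""
    return (readings.ir_beam_interrupted
            or readings.capacitance_delta >= CAPACITANCE_THRESHOLD
            or readings.hand_in_image_area)

if __name__ == "__main__":
    r = SensorReadings(False, 0.35, False)
    print(hand_in_setting_area(r))   # True: the capacitance change suggests a nearby hand
```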
[ lane keeping support control (1) ]
The lane keeping support control unit 130 includes, for example, a recognition processing unit 132 and a steering support control unit 134. When the main switch 14 is operated, the recognition processing unit 132 starts processing for recognizing the relative position and posture of the host vehicle M with respect to the travel lane. For example, the recognition processing unit 132 recognizes, as the traveling lane, a region defined by the two dividing lines closest to the host vehicle M among the dividing lines of the road, and recognizes the relative position and posture of the host vehicle M with respect to the traveling lane.
[ first control ]
The steering support control unit 134 controls the steering device 44 so that the vehicle M approaches the own lane (for example, the center of the own lane) recognized by the recognition processing unit 132 when the LKAS operation switch 16 is operated. Hereinafter, such control is referred to as "first control". The center of the lane is an example of the "first reference position" which is a relative position with respect to the driving lane.
The LKAS operation switch 16 is set to a state of not accepting an operation (a state of disabling an operation) until a predetermined time elapses after the main switch 14 is operated, and to a state of accepting an operation (a state of enabling an operation) after the predetermined time elapses, for example. The predetermined time is set in advance to a time longer than the time required from the start of the processing by the recognition processing unit 132 to the recognition of the traveling lane and the position and posture of the host vehicle M with respect to the lane, for example.
For example, when an operation of the LKAS operation switch 16 in the state of accepting operations is received from the passenger, the steering support control unit 134 applies, as the first control, a reaction force to the shaft of the steering wheel so that the host vehicle M passes through the center of its lane. The reaction force at this time is a steering torque in the same direction as the steering torque applied to the shaft when the steering wheel is turned toward the lane center side.
[ second control ]
When the line-of-sight determination unit 110 determines that the line of sight of the driver is out of the predetermined area, the steering support control unit 134 operates automatically, even when the LKAS operation switch 16 is not operated, and controls the steering device 44 so that the host vehicle M approaches a second reference position based on the position of the host vehicle M at the time the line of sight is determined to be out of the predetermined area. Hereinafter, such control is referred to as "second control".
Note that the second control may be performed as described above even when neither the main switch 14 nor the LKAS operation switch 16 is operated. The second control may be executed when the speed of the host vehicle M is equal to or higher than a predetermined speed or when the host vehicle M is traveling on a vehicle-dedicated road, or may be executed when the speed is equal to or higher than the predetermined speed and the host vehicle M is traveling on an expressway.
For example, when the line-of-sight determination unit 110 determines that the line of sight of the driver is out of the predetermined area, the steering support control unit 134 applies, as the second control, a reaction force to the shaft of the steering wheel so that the host vehicle M approaches a second reference position based on the position of the host vehicle M at the time the line of sight is determined to be out of the predetermined area. The reaction force at this time is a steering torque in the same direction as the steering torque applied to the shaft when the steering wheel is turned toward the second reference position side.
Fig. 5 is a diagram showing an example of the relationship between the reaction force and the relative position of the host vehicle M with respect to the host lane. In the figure, the vertical axis represents the absolute value of the reaction force (steering torque) applied to the shaft of the steering wheel, and the horizontal axis represents the position within the lane in the vehicle width direction. LM_R denotes the dividing line on the right side in the traveling direction, and LM_L denotes the dividing line on the left side in the traveling direction. As shown in the drawing, for example, when the first control is performed, the steering support control unit 134 sets the point at which the reaction force is minimum (hereinafter, sometimes referred to as the "minimum point of the reaction force") to the first reference position, and increases the reaction force as the host vehicle M moves away from the lane center CL. When the second control is performed, the steering support control unit 134 sets the minimum point of the reaction force to the second reference position, and increases the reaction force as the host vehicle M moves away from the second reference position PL.
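The shape of this relationship can be pictured as a simple function whose minimum sits at the reference position; the gain, saturation value, and function name below are illustrative assumptions, not values from the disclosure.

```python
def reaction_torque(y, y_ref, gain=2.0, max_torque=5.0):
    """Steering reaction torque applied to the steering shaft as a function of
    the lateral position y [m] of the host vehicle M.

    The torque is minimal at the reference position y_ref (lane center CL in
    the first control, second reference position PL in the second control)
    and grows as the vehicle moves away from it; gain and max_torque are
    illustrative values only.
    """
    magnitude = min(gain * abs(y - y_ref), max_torque)
    # Sign convention: the torque pushes the steering back toward y_ref.
    if y < y_ref:
        return magnitude
    if y > y_ref:
        return -magnitude
    return 0.0

# Example: first control, reference at the lane center (y_ref = 0.0).
for y in (-0.6, -0.2, 0.0, 0.3):
    print(f"y={y:+.1f} m -> torque={reaction_torque(y, 0.0):+.2f}")
```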
Fig. 6 is a diagram showing an example of the behavior of the host vehicle M when the first control and the second control are executed. Fig. 6 (A) shows the behavior of the host vehicle M when the first control is executed in the traveling lane LM, and Fig. 6 (B) shows the behavior of the host vehicle M when the second control is executed. For example, as shown in Fig. 6 (A), the recognition processing unit 132 recognizes the dividing lines LM_R and LM_L of the host vehicle M, and recognizes the area between the dividing lines LM_R and LM_L as the traveling lane LM of the host vehicle M. The recognition processing unit 132 recognizes the deviation OS of the reference point P of the host vehicle M (for example, the center of the host vehicle M in the width direction) from the lane center CL as the relative position of the host vehicle M with respect to the traveling lane LM. The recognition processing unit 132 recognizes the angle θ formed between the traveling direction of the host vehicle M and the lane center CL as the posture of the host vehicle M with respect to the traveling lane LM. Instead, the recognition processing unit 132 may recognize the position of the reference point of the host vehicle M with respect to either side end of the traveling lane LM as the relative position of the host vehicle M with respect to the traveling lane.
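A minimal sketch of how the deviation OS and the angle θ could be computed from the recognized dividing lines; the simple flat-road geometry and the argument names are assumptions.

```python
import math

def lane_relative_pose(left_line_y, right_line_y, vehicle_y,
                       vehicle_heading_rad, lane_heading_rad):
    """Return (offset OS [m], angle theta [rad]) of the host vehicle M with
    respect to the traveling lane LM.

    left_line_y / right_line_y are the lateral positions of the recognized
    dividing lines LM_L and LM_R at the vehicle's longitudinal position, and
    vehicle_y is the lateral position of the reference point P (the center of
    the vehicle in the width direction).
    """
    lane_center_y = 0.5 * (left_line_y + right_line_y)   # lane center CL
    offset = vehicle_y - lane_center_y                    # deviation OS
    theta = vehicle_heading_rad - lane_heading_rad        # heading error
    return offset, theta

if __name__ == "__main__":
    os_, th = lane_relative_pose(1.7, -1.7, 0.4, math.radians(2.0), 0.0)
    print(f"OS = {os_:.2f} m, theta = {math.degrees(th):.1f} deg")
```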
In the first control, as shown in fig. 6 (a), the host vehicle M is controlled so as to pass through the center CL of the lane of the host vehicle M.
In the second control, as shown in fig. 6 (B), the host vehicle M is controlled so as to pass through the second reference position PL. For example, as shown in fig. 6B, when the host vehicle M is not moving straight (when the host vehicle M is not traveling along the lane), the steering support control unit 134 sets the second reference position PL between the reference point P of the host vehicle M and the lane center CL so that the host vehicle M can smoothly move straight. Thus, the host vehicle M travels so that the reference point P of the host vehicle M smoothly coincides with the second reference position PL. As a result, the behavior of the host vehicle M is suppressed from becoming abrupt, and the comfort of the passengers of the host vehicle M is improved.
[ lane keeping support control (2) ]
The lane keeping support control unit 130 executes the second control based on, for example, the detection result of the driver detection unit 120. For example, when the LKAS operation switch 16 is not operated, the steering support control unit 134 executes the second control when the driver detection unit 120 detects that the hand of the driver has entered the setting area AR1. In this case, the lane keeping control is performed automatically, and thus the comfort of the passenger is improved.
[ flow chart of lane keeping support control ]
Fig. 7 is a flowchart showing a flow of the first control process. First, the recognition processing unit 132 acquires an image of the road surface (step S100). Next, the recognition processing unit 132 detects a lane from the image acquired in step S100 (step S102). Next, the recognition processing unit 132 derives a first target route based on the first reference position (step S104). The first target route is a route that coincides with the lane center CL.
Next, the recognition processing unit 132 derives a first travel route for traveling on the first target route (step S106). The first travel route is a route traveled by the host vehicle M and is set so that the host vehicle M smoothly enters the first target route from the current position.
Next, the recognition processing unit 132 derives the degree of deviation between the first travel route derived in step S106 and the position of the host vehicle M (step S108). Next, the steering support control unit 134 controls the steering so that the vehicle travels on the first target route, based on the degree of deviation derived in step S108 (step S110). This concludes one routine of the processing in this flowchart.
Fig. 8 is a flowchart showing the flow of the process of the second control. The processing of steps S200 and S202 in fig. 8 is the same as the processing of steps S100 and S102 in fig. 7, and therefore, the description thereof is omitted.
In fig. 8, in step S204, the recognition processing unit 132 derives a second target route with the second reference position as a reference (step S204). The second target route is a route coinciding with the second reference position. Next, the recognition processing unit 132 derives a second travel route for traveling on the second target route (step S206). The second travel route is a route traveled by the host vehicle M and is set so that the host vehicle M smoothly enters the second target route from the current position. Next, the recognition processing unit 132 derives the degree of deviation between the second travel route derived in step S206 and the position of the host vehicle M (step S208). Next, the steering support control unit 134 controls the steering so that the vehicle travels on the second target route, based on the degree of deviation derived in step S208 (step S210). This concludes one routine of the processing in this flowchart.
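Since Figs. 7 and 8 differ only in the reference position used, one control cycle can be pictured as a single routine parameterized by that reference; the proportional gains and the way the deviation is turned into a steering command below are assumptions for illustration.

```python
def control_step(reference_y, vehicle_y, vehicle_heading_rad,
                 k_offset=0.8, k_heading=1.5):
    """One cycle of the routines in Figs. 7 and 8.

    reference_y is the lane center CL for the first control and the second
    reference position PL for the second control; the target route coincides
    with it.  The degree of deviation (steps S108 / S208) is converted into a
    steering command (steps S110 / S210) with assumed proportional gains.
    """
    deviation = reference_y - vehicle_y
    steering_cmd = k_offset * deviation - k_heading * vehicle_heading_rad
    return steering_cmd   # handed to the steering device 44

# Example: vehicle 0.3 m right of the lane center, heading about 1 degree off.
print(control_step(0.0, 0.3, 0.017))
```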
[ tracking travel support control ]
The follow-up running support control unit 150 includes, for example, a vehicle recognition unit 152 and a speed support control unit 154. The vehicle recognition unit 152 recognizes the positions and speeds of vehicles present in the vicinity of the host vehicle M based on the detection result of the radar device 20 and the recognition result of the image recognition device 24.
The speed support control unit 154 controls the running driving force output device 40 and the brake device 42 to accelerate or decelerate the host vehicle M within a preset vehicle speed range (e.g., 50 to 100 [km/h]) so that the host vehicle M follows a peripheral vehicle (hereinafter referred to as a preceding vehicle) existing within a predetermined distance (e.g., about 50 [m]) ahead of the host vehicle M, among the peripheral vehicles recognized by the vehicle recognition unit 152. "Following" refers to a running mode in which, for example, the relative distance (inter-vehicle distance) between the host vehicle M and the preceding vehicle is kept constant. Hereinafter, such control is referred to as "follow-up running support control". The speed support control unit 154 may cause the host vehicle M to travel simply within the set vehicle speed range when the vehicle recognition unit 152 does not recognize a preceding vehicle.
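One possible reading of this speed control, written as a sketch (the gains, the desired gap, and the behavior when no preceding vehicle is recognized are assumptions; only the 50 to 100 km/h range comes from the text):

```python
def follow_up_speed_command(own_speed, lead_speed, gap,
                            desired_gap=30.0,
                            v_min=50 / 3.6, v_max=100 / 3.6,
                            k_gap=0.1, k_speed=0.5):
    """Target speed [m/s] for the follow-up running support control.

    Keeps the inter-vehicle distance near desired_gap while staying inside the
    preset vehicle speed range (50 to 100 km/h in the text).
    """
    if lead_speed is None:
        # No preceding vehicle recognized: simply travel within the set range
        # (the upper bound is used here purely for illustration).
        return v_max
    target = own_speed + k_speed * (lead_speed - own_speed) + k_gap * (gap - desired_gap)
    return max(v_min, min(v_max, target))

# Example: preceding vehicle slightly slower and 25 m ahead.
print(round(follow_up_speed_command(25.0, 23.0, 25.0), 2))
```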
[ example of drive support control ]
Fig. 9 is a flowchart showing a flow of processing executed by the driving support apparatus 100. The lane keeping assist control unit 130 determines whether or not the LKAS operation switch 16 is operated (step S300). When the LKAS operation switch 16 is operated, the lane keeping support control unit 130 executes the first control (step S302).
When the LKAS operation switch 16 is not operated, the lane keeping support control unit 130 executes the line of sight determination process (step S304). The sight line determination process is a process of determining whether or not the driver's sight line deviates from a predetermined area, for example. The line of sight determination process may be a process of the flowchart of fig. 10 described later.
Next, the lane keeping assist control unit 130 determines whether or not a predetermined condition is satisfied based on the result of the line-of-sight determination process (step S306). The predetermined condition is, for example, a case where the driver's line of sight is out of a predetermined area. The above-described processing in steps S304 and S306 may be omitted.
When the predetermined condition is satisfied, the driving support apparatus 100 executes the second control and the follow-up running support control (step S308). If the predetermined condition is not satisfied, the driver detection unit 120 determines whether or not the hand of the driver is present within the setting area (step S310). When the hand of the driver is within the setting area, the lane keeping support control unit 130 executes the second control (step S312). When the hand of the driver is not within the setting area, this routine of the flowchart ends. The processing of steps S310 and S312 may be given priority over the line-of-sight determination processing (the processing of steps S304 and S306). Further, the driving support apparatus 100 may execute the second control when the predetermined condition is satisfied in step S306 and the hand of the driver is within the setting area.
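The branch structure of Fig. 9 can be summarized in a few lines; the string return values are an illustrative encoding, not the disclosed implementation.

```python
def driving_support_step(lkas_switch_on, gaze_condition_met, hand_in_area):
    """Decision logic of Fig. 9 reduced to its branches."""
    if lkas_switch_on:
        return "first_control"                   # S300 -> S302
    if gaze_condition_met:
        return "second_control_and_follow_up"    # S304/S306 -> S308
    if hand_in_area:
        return "second_control"                  # S310 -> S312
    return "none"                                # end of this routine

print(driving_support_step(False, False, True))  # -> "second_control"
```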
As described above, the lane keeping support control unit 130 executes the second control when the predetermined condition is satisfied or the hand of the driver is present in the setting area even when the LKAS operation switch 16 is not operated, and therefore the comfort of the passenger can be improved.
[ Another example of the line-of-sight determination processing ]
Fig. 10 is a flowchart showing another example of the flow of the line-of-sight determination process. First, the sight line determination unit 110 acquires position information of a sight line landing point of the driver (step S400). Next, the sight line determination unit 110 compares the position information of the sight line landing point acquired last time with the position information of the sight line landing point acquired in step S400, derives the movement amount of the sight line per unit time, and determines whether or not the derived movement amount is larger than a predetermined value α (step S402).
When the derived movement amount is larger than the predetermined value α, the line-of-sight determination unit 110 determines whether or not the line of sight is directed outside the predetermined area (step S404). That the line of sight is directed outside the predetermined area means, for example, that the line of sight is directed out of the area AR or out of the areas set for the interior mirror or the side mirrors. Outside the predetermined area is, for example, the direction of the in-vehicle device operable by the driver or the direction of the rear seat. When the line of sight is directed outside the predetermined area, the line-of-sight determination unit 110 determines whether or not the time spent outside the predetermined area exceeds a predetermined time β (step S406).
When the time outside the predetermined area exceeds the predetermined time β, the line-of-sight determination unit 110 determines that the predetermined condition is satisfied (step S408). If a negative determination result is obtained in step S402, S404, or S406, the line-of-sight determination unit 110 determines that the predetermined condition is not satisfied (step S410). This concludes one routine of the processing in this flowchart. Note that the processing of one or both of steps S402 and S406 may be omitted.
By the above-described processing, the state of the driver's sight line can be monitored with high accuracy, and whether or not to execute the second control can be determined more appropriately.
In the above processing, it is determined whether or not the line of sight is directed outside the "predetermined area", and whether or not the time of gazing "outside the predetermined area" exceeds the predetermined time β; however, the determination may instead be made by replacing "outside the predetermined area" with "the specific area". The specific area is an area different from the predetermined area and is an area set arbitrarily in advance. The specific area is, for example, the direction of the in-vehicle device operable by the driver or the direction of the rear seat.
In the above-described processing, the second control may be executed at an earlier timing when the line of sight is directed outside the predetermined area and toward the specific area than when the line of sight is directed outside the predetermined area but not toward the specific area. For example, when the line of sight is directed outside the predetermined area and not toward the specific area, it may be determined that the predetermined condition is satisfied when the predetermined time β is exceeded, and when the line of sight is directed outside the predetermined area and toward the specific area, it may be determined that the predetermined condition is satisfied when a predetermined time β1 shorter than the predetermined time β is exceeded.
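A compact sketch of this condition check, with the variation described in this paragraph (the numeric values of α, β, and β1 are assumptions):

```python
def predetermined_condition_met(movement_per_unit_time, off_area_time,
                                gaze_on_specific_area,
                                alpha=0.15, beta=2.0, beta1=1.0):
    """Condition check of Fig. 10 with the earlier-timing variation.

    The dwell-time threshold is shortened from beta to beta1 when the line of
    sight is directed toward the specific area (in-vehicle device, rear seat).
    """
    if movement_per_unit_time <= alpha:   # S402: no rapid eye movement detected
        return False
    threshold = beta1 if gaze_on_specific_area else beta
    return off_area_time > threshold      # S406 with beta, or beta1 for the specific area

print(predetermined_condition_met(0.3, 1.5, True))   # True: the shorter threshold applies
print(predetermined_condition_met(0.3, 1.5, False))  # False: beta not yet exceeded
```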
[ modified example 1 of the second control ]
In the above example, the lane keeping support control unit 130 controls the host vehicle M so as to approach the second reference position PL during the second control, but the host vehicle M may instead be controlled so as to approach the first reference position after traveling along the second reference position PL for a predetermined time.
Fig. 11 is a diagram showing an example of the behavior of the host vehicle M in modification 1. For example, the host vehicle M is controlled as follows: the lane keeping support control unit 130 causes the host vehicle M to travel so as to approach the second reference position PL at times t+1 and t+2, and causes the host vehicle M to travel so as to approach the first reference position CL from time t+3 onward.
By performing the processing as described above, the host vehicle M travels so as to approach the first reference position after traveling so as to approach the second reference position PL for a predetermined time. When the host vehicle M travels from the state of traveling along the second reference position PL so as to approach the first reference position, the host vehicle M is controlled to travel smoothly along the first reference position.
[ modified example 2 of the second control ]
In the second control, the lane keeping support control unit 130 sets, as the second reference position, a position that approaches the first reference position in a stepwise manner (or gradually) with time from the position at which it is determined that the line of sight of the driver deviates from the predetermined area. That is, the lane keeping support control unit 130 updates the second reference position so that it approaches the first reference position in stages.
Fig. 12 is a diagram showing an example of the behavior of the host vehicle M in modification 2. In the illustrated example, the minimum point of the reaction force moves in the order of the second reference positions PL to PL3 and then the first reference position. The second reference positions PL, PL1, PL2, and PL3 are positions successively closer to the lane center CL. For example, when the line of sight of the driver deviates from the predetermined area at time t, the lane keeping support control unit 130 causes the host vehicle M to travel so as to approach the second reference position PL. Thereafter, the lane keeping support control unit 130 sets the second reference position to PL1 at time t+1, PL2 at time t+2, and PL3 at time t+3, and causes the host vehicle M to travel so as to approach the second reference position set at each time. Then, the lane keeping support control unit 130 causes the host vehicle M to travel so as to approach the first reference position at time t+4.
By performing the processing as described above, the host vehicle M is controlled to approach the lane center CL in stages. As a result, the vehicle travels smoothly approaching the center of the lane.
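A sketch of this stepwise update of the second reference position; the linear interpolation and the number of steps are illustrative assumptions.

```python
def stepwise_reference(initial_y, lane_center_y, step_index, n_steps=4):
    """Second reference position used in modification 2.

    Starting from the vehicle position when the gaze left the predetermined
    area (initial_y), the reference approaches the lane center CL in stages
    (PL, PL1, PL2, PL3) and finally coincides with the first reference position.
    """
    if step_index >= n_steps:
        return lane_center_y          # switch to the first reference position
    fraction = step_index / n_steps
    return initial_y + fraction * (lane_center_y - initial_y)

# Example: the gaze left the area while the vehicle was 0.8 m from the lane center.
print([round(stepwise_reference(0.8, 0.0, i), 2) for i in range(5)])
# [0.8, 0.6, 0.4, 0.2, 0.0]  -> PL, PL1, PL2, PL3, then CL
```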
< second embodiment >
A second embodiment will be explained. In the second embodiment, the second control is performed when it is determined that an inquiry requiring a response of the driver is made by the in-vehicle apparatus. The second reference position in the second embodiment is a position based on the position of the vehicle when the inquiry is made. Hereinafter, differences from the first embodiment will be mainly described.
Fig. 13 is a diagram showing an example of the configuration of a vehicle on which the driving support apparatus 100A according to the second embodiment is mounted. The driving support apparatus 100A includes an inquiry determining unit 122 in addition to the functional configuration of the driving support apparatus 100 according to the first embodiment.
The inquiry determination unit 122 determines whether or not an inquiry requiring a response from the driver has been made. The inquiry requiring a response from the driver is, for example, an inquiry by voice guidance issued from an in-vehicle device. The response is, for example, a response by the voice (speech) of the driver. For example, in the case where the in-vehicle device is a navigation device, the inquiry requiring a response from the driver is an inquiry such as "Do you want to change the route because of the traffic congestion ahead?". Such an inquiry becomes a trigger for the driver to start a thinking action. Therefore, the comfort of the passenger can be improved by performing the second control as described later.
The in-vehicle device such as the navigation device described above includes a microphone to which the voice of the driver is input, and a recognition control unit that analyzes the voice input to the microphone and recognizes meaning information corresponding to the input voice. The recognition control unit is not limited to being included in the navigation device, and may be included in the driving support apparatus 100A. The recognition control unit recognizes the meaning of the response to the inquiry and executes control according to the recognition result. For example, when the driver utters "please change the route" in response to the above inquiry, the recognition control unit recognizes the utterance and resets a route that avoids the traffic congestion.
The lane keeping support control unit 130 according to the second embodiment is a control unit that operates based on an operation by the driver and executes the first control of causing the host vehicle M to travel so as to approach the first reference position, which is a relative position with respect to the traveling lane; the lane keeping support control unit 130 operates automatically when the inquiry determination unit 122 determines that an inquiry has been made, and executes the second control of causing the host vehicle M to travel so as to approach a second reference position based on the position of the host vehicle M at the time the inquiry is determined to have been made. The second control may be continued until it is determined that the later-described thinking operation has been completed.
In the above example, the in-vehicle device that makes an inquiry or recognizes the voice of the driver generates an opportunity for the driver to start a thought operation, but instead of (or in addition to) this, the opportunity for the driver to start a thought operation may be an inquiry or a conversation made by the passenger of the vehicle to the driver. In this case, the inquiry judging unit 122 distinguishes between the utterance of the driver and the utterance of the passenger based on the voice information stored in advance in the storage device. The voice information is the voice of the driver and the voice of the passenger stored in advance.
Then, the inquiry determination unit 122 may determine, based on inquiry information or conversation information that is stored in advance in the storage device as triggers for starting a thinking operation, whether or not a passenger other than the driver has made to the driver an inquiry or a conversation that starts the thinking operation, and may output the determination result to the lane keeping support control unit 130. Thus, the second control is executed when the condition is satisfied during a conversation between the driver and a passenger, and the comfort of the passenger can be improved.
The inquiry that triggers the driver's thinking operation may be an inquiry made when the passenger searches for an address in the navigation device. The inquiry at the time of address search is an inquiry made by the navigation device such as "Please state the address (prefecture, city/ward/town/village, house number, etc.)".
The inquiry that triggers the driver to start a thinking operation may be any of the following inquiries (1) to (5). For example, when the passenger operates a predetermined key, the navigation device (HMI 10) issues an inquiry by voice such as (1) "Please state a command." The command is, for example, searching for a destination, selecting a song, making a call, or the like. When the driver answers "search for a destination" in response to the inquiry of (1), the navigation device makes inquiries in the order of, for example, (2) "Please state the prefecture.", (3) "Please state the city (the next level of the address).", and (4) "Please state the house number."
When the driver has answered the inquiries (2) to (4), the navigation device performs control such as searching the map information and outputting a voice such as "The destination has been set to ΔΔ City, ×× 1." or setting the destination to the specified address. When such control is performed (i.e., when the navigation device is in a state in which it can search for or set the address), it is determined that the driver's thinking operation has ended. When the inquiry determination unit 122 acquires information indicating that the thinking operation has ended from the navigation device, it outputs information indicating that the thinking operation has ended to the lane keeping support control unit 130. When the information indicating that the thinking operation has ended is acquired from the inquiry determination unit 122, the lane keeping support control unit 130 suspends the second control and executes the first control.
Further, completion of the driver's thinking operation may also be determined when the navigation device in each of the above examples ends the series of conversations with the driver, or when the vehicle (navigation device) transitions, upon the end of the conversation, to a state in which the next process can be started. The inquiry determination unit 122 determines whether or not the series of conversations described above has ended based on conversation information set and stored in advance in the storage device.
Instead of the above-described inquiry, the predetermined utterance of the passenger may be recognized as a trigger to start the thinking operation. The predetermined utterance of the passenger is, for example, a predetermined utterance made in response to the inquiry of (1) above.
In the above example, the description has been given of whether or not the second control is executed based on the inquiry by the navigation device and the reply to the inquiry, but the present invention is not limited to this, and the same processing may be performed based on the inquiry by another in-vehicle device and the reply to the inquiry.
In the driving support apparatus 100A according to the second embodiment, one or both of the line-of-sight determination unit 110 and the driver detection unit 120 may be omitted.
According to the second embodiment described above, the second control is executed when it is determined that an inquiry requiring a response from the driver has been made, whereby the comfort of the passenger can be improved.
According to the embodiment described above, the comfort of the passengers can be improved by providing the line-of-sight determination unit 110, which determines the area toward which the line of sight of the driver is directed, and the lane keeping support control unit 130, which operates in accordance with an operation by the driver and executes the first control for causing the host vehicle M to travel so as to approach the first reference position that is a relative position with respect to the travel lane. When the line-of-sight determination unit 110 determines that the line of sight of the driver has deviated from the predetermined area or is directed toward the specific area, which is an area different from the predetermined area, the lane keeping support control unit 130 automatically executes the second control for causing the host vehicle M to travel so as to approach the second reference position, which is based on the position of the host vehicle M at the time of that determination.
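For illustration only, the switching between the first control and the second control based on the determined gaze area can be sketched in Python as follows. The area names, the return-to-first-control condition, and the use of a lateral offset from the lane center as the reference are assumptions of this sketch, not values prescribed by the embodiment.

# Hypothetical sketch of gaze-based switching. The first control targets the
# lane-center reference; the second control targets the lateral position held
# at the moment the gaze left the predetermined area.

PREDETERMINED_AREA = "front_outside"                   # area ahead of and outside the vehicle
SPECIFIC_AREAS = {"navigation_screen", "rear_seat"}    # assumed examples of in-cabin areas

class LaneKeepingController:
    def __init__(self):
        self.mode = "first"
        self.second_reference = None    # lateral offset [m] held during the second control

    def update(self, gaze_area, lateral_offset):
        """Return the target lateral offset from the lane center for this cycle."""
        gaze_away = gaze_area != PREDETERMINED_AREA
        gaze_specific = gaze_area in SPECIFIC_AREAS

        if self.mode == "first" and (gaze_away or gaze_specific):
            # Second reference position: the vehicle position at the moment the
            # gaze was determined to have left the predetermined area.
            self.mode = "second"
            self.second_reference = lateral_offset
        elif self.mode == "second" and not gaze_away:
            # Assumed here: the gaze returning to the predetermined area ends the
            # second control (the embodiment also describes returning after a
            # predetermined time).
            self.mode = "first"
            self.second_reference = None

        # First control tracks the lane center (offset 0); second control holds
        # the position captured when the gaze deviated.
        return 0.0 if self.mode == "first" else self.second_reference

controller = LaneKeepingController()
print(controller.update("front_outside", 0.10))      # 0.0  -> first control
print(controller.update("navigation_screen", 0.12))  # 0.12 -> second control holds position
print(controller.update("front_outside", 0.12))      # 0.0  -> first control resumes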
<Hardware configuration>
The driving support apparatuses 100 and 100A according to the above embodiments are realized by, for example, the hardware configuration shown in Fig. 14. Fig. 14 is a diagram showing an example of the hardware configuration of the driving support apparatuses 100 and 100A according to the embodiments.
The driving support apparatuses 100 and 100A have a configuration in which a communication controller 100-1, a CPU 100-2, a RAM 100-3, a ROM 100-4, a secondary storage device 100-5 such as a flash memory or an HDD, and a drive device 100-6 are connected to one another via an internal bus or a dedicated communication line. A removable storage medium such as an optical disc is mounted on the drive device 100-6. A program 100-5a stored in the secondary storage device 100-5 is loaded into the RAM 100-3 by a DMA controller (not shown) or the like and executed by the CPU 100-2, whereby the line-of-sight determination unit 110, the driver detection unit 120, the inquiry determination unit 122, the lane keeping support control unit 130, and the follow-up running support control unit 150 are realized. The program referred to by the CPU 100-2 may be stored in a removable storage medium mounted on the drive device 100-6, or may be downloaded from another device via the network NW.
The above-described embodiments can be described as follows.
A vehicle control device is provided with a storage device and a hardware processor,
the storage device stores a program for causing the hardware processor to execute the steps of:
determining an area toward which a driver's sight line is directed;
executing, in accordance with an operation by the driver, a first control that causes the vehicle to travel so as to approach a first reference position that is a relative position with respect to a travel lane;
when it is determined that the driver's line of sight is out of a predetermined area, or when it is determined that the driver's line of sight is directed toward a specific area that is an area different from the predetermined area, automatically executing a second control that causes the vehicle to travel so as to approach a second reference position that is based on the position of the vehicle at the time of that determination.
While the embodiments of the present invention have been described above, the present invention is not limited to the above embodiments, and various modifications and substitutions can be made without departing from the scope of the present invention.
Description of the reference numerals
10·HMI, 12·touch panel, 14·main switch, 16·LKAS operation switch, 20·radar device, 22·camera, 24·image recognition device, 30·vehicle sensor, 100·driving support apparatus, 110·line-of-sight determination unit, 120·driver detection unit, 130·lane keeping support control unit, 132·recognition processing unit, 134·steering support control unit, 150·follow-up running support control unit, 152·vehicle recognition unit, 154·speed support control unit.

Claims (15)

1. A vehicle control device is provided with:
a determination unit that determines an area toward which a driver's line of sight is directed;
a control unit that operates in accordance with an operation by a driver and executes a first control for causing the vehicle to travel so as to approach a first reference position that is a relative position with respect to a travel lane,
the control unit automatically operates when the determination unit determines that the line of sight of the driver is out of a predetermined area or that the line of sight of the driver is directed toward a specific area that is an area different from the predetermined area, and executes a second control for causing the vehicle to travel so as to approach a second reference position that is based on the position of the vehicle at the time of that determination.
2. The vehicle control apparatus according to claim 1,
the predetermined area is a predetermined area outside of and in front of the vehicle,
the specific area is a specific area within a cabin of the vehicle.
3. The vehicle control apparatus according to claim 1 or 2, wherein,
the first reference position is a position on a center line of the driving lane.
4. The vehicle control apparatus according to claim 1 or 2, wherein,
the determination unit determines whether or not the driver has moved his or her line of sight from the predetermined area to an in-vehicle device that can be operated by the driver,
the control unit executes the second control when the determination unit determines that the driver has moved the line of sight from the predetermined area to the in-vehicle device without performing the operation.
5. The vehicle control apparatus according to claim 1 or 2, wherein,
the determination unit determines whether or not the driver has moved the line of sight from the predetermined area toward a rear seat,
the control unit executes the second control when the determination unit determines that the driver has moved the line of sight from the predetermined area toward a rear seat without performing the operation.
6. The vehicle control apparatus according to claim 1 or 2, wherein,
the control unit executes the second control at an earlier timing when the determination unit determines that the line of sight is directed to the outside of the predetermined area and to the specific area than when the determination unit determines that the line of sight is directed to the outside of the predetermined area and not to the specific area.
7. The vehicle control apparatus according to claim 1 or 2, wherein,
the control unit executes the second control when the speed of the vehicle is equal to or higher than a predetermined speed.
8. The vehicle control apparatus according to claim 1 or 2, wherein,
the control portion executes the second control in a case where the vehicle is traveling on an expressway.
9. The vehicle control apparatus according to claim 1 or 2, wherein,
the vehicle control device further includes a driver determination unit that determines whether or not a hand of the driver is located within a setting area set with respect to an in-vehicle device that can be operated by the driver,
the control unit executes the second control when the determination unit determines that the line of sight of the driver is out of the predetermined area and the driver determination unit determines that the hand of the driver is located within the set area.
10. The vehicle control apparatus according to claim 1 or 2, wherein,
the determination unit determines whether or not, after the line of sight of the driver has deviated, the point at which the line of sight falls is located on a rear mirror unit provided in the vehicle for confirming the periphery of the vehicle or on a display unit that displays an image obtained by imaging the periphery of the vehicle,
the control unit does not execute the second control when the determination unit determines that, after the line of sight of the driver has deviated from the predetermined area, the line of sight of the driver is located on the rear mirror unit or the display unit.
11. The vehicle control apparatus according to claim 1 or 2, wherein,
the control unit causes the vehicle to travel so as to approach the first reference position after the second control is executed for a predetermined time.
12. The vehicle control apparatus according to claim 1 or 2, wherein,
in the second control, the control unit sets, as the second reference position, a position that gradually approaches the first reference position from the position of the vehicle at which the line of sight is determined to have deviated.
13. A vehicle control device is provided with:
a driver determination unit that determines whether or not a hand of a driver is located within a setting area set for an in-vehicle device that can be operated by the driver;
a control unit that operates in accordance with an operation by a driver, and executes a first control for causing the vehicle to travel so as to approach a first reference position that is a relative position with respect to a travel lane,
the control unit is automatically operated when the driver determination unit determines that the hand of the driver is located within the setting area, and executes a second control for causing the vehicle to travel so as to approach a second reference position based on the position of the vehicle when the hand of the driver is determined to be located within the setting area.
14. A vehicle control device is provided with:
an inquiry determination unit that determines whether or not an inquiry requiring a response from a driver has been made;
a control unit that operates in accordance with an operation by a driver, and executes a first control for causing the vehicle to travel so as to approach a first reference position that is a relative position with respect to a travel lane,
the control unit automatically operates when the inquiry determination unit determines that the inquiry has been made, and executes a second control for causing the vehicle to travel so as to approach a second reference position that is based on the position of the vehicle when the inquiry is determined to have been made.
15. The vehicle control apparatus according to claim 14,
the inquiry is an inquiry made by voice guidance from an in-vehicle device,
the response is made by a voice of the driver.
CN201780092380.8A 2017-08-02 2017-08-02 Vehicle control device Active CN110785334B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/028012 WO2019026199A1 (en) 2017-08-02 2017-08-02 Vehicle control device, vehicle control method, and program

Publications (2)

Publication Number Publication Date
CN110785334A CN110785334A (en) 2020-02-11
CN110785334B true CN110785334B (en) 2023-01-10

Family

ID=65232477

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780092380.8A Active CN110785334B (en) 2017-08-02 2017-08-02 Vehicle control device

Country Status (2)

Country Link
CN (1) CN110785334B (en)
WO (1) WO2019026199A1 (en)

Also Published As

Publication number Publication date
WO2019026199A1 (en) 2019-02-07
CN110785334A (en) 2020-02-11

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant