CN117657147A - Driving assistance device - Google Patents


Info

Publication number
CN117657147A
Authority
CN
China
Prior art keywords
stop line
host vehicle
line
stop
intersection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311120791.7A
Other languages
Chinese (zh)
Inventor
喜多正一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Publication of CN117657147A


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B60W30/18 Propelling the vehicle
    • B60W30/18009 Propelling the vehicle related to particular drive situations
    • B60W30/18109 Braking
    • B60W30/18159 Traversing an intersection
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/53 Road markings, e.g. lane marker or crosswalk

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

A driving support device (100) supports the traveling operation of a host vehicle when the host vehicle enters an intersection defined by a first boundary line on the side near the host vehicle and a second boundary line on the side far from the host vehicle. The driving support device (100) includes: a recognition unit (111) that recognizes a first stop line in front of the first boundary line, a second stop line in front of the second boundary line on the opposing lane, and the second boundary line, based on the external situation ahead of the host vehicle detected by a camera (1); a setting unit (112) that sets a virtual stop line ahead of the first stop line based on the positional relationship between the second stop line and the second boundary line recognized by the recognition unit (111); and a travel control unit (114) that controls a travel actuator (AC) mounted on the host vehicle such that the host vehicle, after temporarily stopping at the first stop line, enters the intersection and temporarily stops again at the virtual stop line set by the setting unit.

Description

Driving assistance device
Technical Field
The present invention relates to a driving support device for supporting a driving operation of a vehicle when entering an intersection.
Background
As such a device, a device has conventionally been known that sets a target stop position in front of a stop line and automatically starts the vehicle after it stops at the target stop position (see, for example, Patent Document 1). In the device described in Patent Document 1, when a crosswalk is present in front of the stop line, the target stop position is set further toward the near side than when no crosswalk is present.
However, when the host vehicle enters an intersection after temporarily stopping, the view before reaching the intersection may be poor, and it is desirable to provide driving assistance in such a case.
Prior art literature
Patent document 1: international publication No. 2018/092298.
Disclosure of Invention
A driving support device according to an aspect of the present invention supports the traveling operation of a host vehicle when the host vehicle enters an intersection defined by a first boundary line on the side near the host vehicle and a second boundary line on the side far from the host vehicle. The driving support device includes: an external situation detection unit that detects the external situation ahead of the host vehicle; a recognition unit that recognizes a first stop line in front of the first boundary line, a second stop line in front of the second boundary line on the opposing lane, and the second boundary line, based on the external situation detected by the external situation detection unit; a setting unit that sets a virtual stop line ahead of the first stop line based on the positional relationship between the second stop line and the second boundary line recognized by the recognition unit; and a travel control unit that controls a travel actuator mounted on the host vehicle such that the host vehicle, after temporarily stopping at the first stop line, temporarily stops again at the virtual stop line set by the setting unit when entering the intersection.
Drawings
The objects, features and advantages of the present invention are further elucidated by the following description of embodiments in connection with the accompanying drawings.
Fig. 1 is a diagram showing an example of a driving scenario in which a driving assistance device according to an embodiment of the present invention is applied.
Fig. 2 is a diagram showing a schematic configuration of a driving support device according to an embodiment of the present invention.
Fig. 3 is a bird's eye view of the intersection of fig. 1.
Fig. 4 is a flowchart showing an example of processing performed by the controller of fig. 2.
Fig. 5 is a diagram for explaining the operation of the driving assistance device.
Fig. 6 is a flowchart showing another example of the processing performed by the controller of fig. 2.
Detailed Description
An embodiment of the present invention will now be described with reference to Figs. 1 to 6. The vehicle to which the driving support device according to the present embodiment is applied is referred to as the host vehicle to distinguish it from other vehicles. In the following, the host vehicle is assumed to be a manually driven vehicle equipped with an ADAS (Advanced Driver-Assistance System) and to be an engine vehicle having an internal combustion engine (engine) as its travel drive source. The host vehicle may travel not only in a manual driving mode based on driving operations by the driver but also in an automatic driving mode requiring no driving operation by the driver. The host vehicle may also be an electric vehicle having a travel motor as its travel drive source, or a hybrid vehicle having both an engine and a travel motor as travel drive sources. The driving support device of the present embodiment supports the traveling operation of the host vehicle when the host vehicle enters a road intersecting the road on which it travels (hereinafter referred to as the intersecting road).
Fig. 1 is a diagram showing an example of a driving scenario in which the driving assistance device according to the present embodiment is applied. When the driver of the host vehicle 101, traveling on the left-hand road SR, recognizes the temporary-stop road sign SG11 or the road surface marking SG12 at the intersection IS, the driver temporarily stops the host vehicle 101 immediately before the stop line SL1. At this time, as shown in Fig. 1, when a structure such as a block wall WL or a building BL is present between the host vehicle 101 and the intersecting road IR, it is difficult for the driver to see other vehicles, pedestrians, and the like passing along the intersecting road IR. Therefore, after temporarily stopping immediately before the stop line SL1, the driver moves the host vehicle 101 forward at a crawl, to make its presence known to other vehicles, pedestrians, and the like passing along the intersecting road IR, until reaching a position from which the condition of the intersecting road IR can be visually confirmed.
However, when such a multi-stage stop is performed, depending on the driver's skill, the front end of the host vehicle 101 may cross the boundary of the intersecting road IR during the multi-stage stop and rapidly approach another vehicle, pedestrian, or the like passing along the intersecting road IR, possibly resulting in a collision. It is therefore desirable to perform an appropriate multi-stage stop when entering an intersection IS with a poor view, as shown in Fig. 1. To improve driving safety when entering an intersection with a poor view, the driving support device according to the present embodiment is configured as follows. Hereinafter, the driving assistance performed by the driving support device is referred to as multi-stage stop assistance, or simply stop assistance. In Fig. 1, the signs SG11 and SG21, inverted-triangle sign plates bearing the character "stop", are illustrated as temporary-stop road signs; however, the shape of such sign plates and the characters on them are determined by the road traffic law of each country and the like, and differ by country and region. Similarly, in Fig. 1 the road surface markings SG12 and SG22 bearing the character "stop" are illustrated as temporary-stop road surface markings; their characters and the like are likewise specified by the road traffic law of each country and differ by country and region.
Fig. 2 is a block diagram showing a main part configuration of a driving support device 100 according to an embodiment of the present invention. The driving assistance device 100 includes a controller 10, a camera 1, an input-output device 2, a positioning unit 3, a communication unit 4, and an actuator AC.
The camera 1 is a stereo camera having an imaging element (image sensor) such as a CCD (charge-coupled device) or CMOS (complementary metal-oxide-semiconductor) sensor. The camera 1 may instead be a monocular camera. The camera 1 detects the external situation around the host vehicle 101. The camera 1 is mounted, for example, at a predetermined position at the front of the host vehicle 101 and continuously captures images of objects in the space ahead of the host vehicle 101. The objects include the road signs SG11, SG21, the road surface markings SG12, SG22, the block wall WL, the building BL, and the like in front of the host vehicle 101.
The input-output device 2 is a generic term for devices for inputting instructions from the driver or outputting information to the driver. The input-output device 2 includes various switches for inputting various instructions by a driver through operation of an operation member, a microphone for inputting instructions by a driver with voice, a display for providing information to the driver through a display image, a speaker for providing information to the driver with sound, and the like.
In addition, the input-output device 2 includes a stop-assist on/off switch for enabling (turning on) or disabling (turning off) the stop assist function. The stop-assist on/off switch is provided around the driver's seat (for example, on the instrument panel or the steering device). When the engine is started (or, when the host vehicle 101 is an electric vehicle, when the system power is turned on), the stop assist function is automatically set to on. Therefore, even when the stop assist function has been set to off with the stop-assist on/off switch, it is turned on again at the next engine start.
The positioning unit (GNSS unit) 3 has a positioning sensor that receives positioning signals transmitted from positioning satellites. The positioning satellite is a satellite such as a GPS (global positioning system) satellite or a quasi-zenith satellite. The positioning unit 3 measures the current position (latitude, longitude, and altitude) of the vehicle 101 using the positioning information received by the positioning sensor.
The actuator AC is a travel actuator for controlling travel of the host vehicle 101. In the case where the travel drive source is an engine, the actuator AC includes a throttle actuator that adjusts an opening degree (throttle opening degree) of a throttle valve of the engine. In the case where the travel drive source is a travel motor, the travel motor is included in the actuator AC. A brake actuator for driving the brake device of the vehicle 101 and a steering actuator for driving the steering device are also included in the actuator AC.
The communication unit 4 communicates with various servers, not shown, via networks including wireless communication networks typified by the Internet and mobile phone networks, and acquires map information, travel history information, traffic information, and the like from the servers periodically or at arbitrary timing. The networks include not only public wireless communication networks but also closed communication networks provided for each prescribed management area, such as wireless LAN, Wi-Fi (registered trademark), and Bluetooth (registered trademark). The acquired map information is output to the storage unit 12, and the road map information stored in the storage unit 12 is thereby updated as needed. The communication unit 4 may also communicate with other vehicles and roadside apparatuses via a network and acquire map information from them.
The controller 10 includes a computer having an arithmetic unit 11 such as a CPU (microprocessor), a storage unit 12 such as a ROM (read only memory) and a RAM (random access memory), and other peripheral circuits not shown such as an I/O interface.
The storage unit 12 stores high-accuracy road map information. The road map information includes road position information, road shape information (curvature and the like), road gradient information, position information of intersections and junctions, lane count information, lane width information, position information of each lane (lane center positions and lane boundary line information), and position information of landmarks (traffic signals, signs, buildings, and the like) serving as marks on the map.
The computing unit 11 has a functional configuration of a recognition unit 111, a setting unit 112, an output control unit 113, and a travel control unit 114.
The recognition unit 111 recognizes an intersection IS ahead of the host vehicle 101 based on the external situation around the host vehicle 101 detected by the camera 1. The recognition unit 111 may also recognize the intersection IS ahead of the host vehicle 101 based on the current position of the host vehicle 101 measured by the positioning unit 3 and the road map information stored in the storage unit 12.
The recognition unit 111 recognizes stop lines based on the external situation around the host vehicle 101 detected by the camera 1. Fig. 3 is a bird's-eye view of the intersection IS of Fig. 1. In the traveling scene shown in Fig. 1, when the host vehicle 101 travels on the road SR toward the intersection IS, the recognition unit 111 recognizes the stop line SL1 immediately before the boundary line BD1 on the side of the intersecting road IR near the host vehicle 101. The recognition unit 111 also recognizes the boundary line BD2 on the side of the intersecting road IR far from the host vehicle 101. Specifically, the recognition unit 111 recognizes a free space (road shoulder or the like) on the far side of the intersecting road IR and recognizes the edge of that free space (hereinafter referred to as the free space edge) as the boundary line BD2. When no free space is present or recognized, the recognition unit 111 recognizes the road end (road edge) of the intersecting road IR on the far side as the boundary line BD2. The method of recognizing the boundary line BD2 is not limited to these. The recognition unit 111 further recognizes the stop line SL2 immediately before the boundary line BD2 on the lane (opposing lane) opposite the lane (own lane) in which the host vehicle 101 is traveling.
The setting unit 112 sets the virtual stop line VL ahead of the stop line SL1 (on the inner side in the traveling direction), at a time before the host vehicle 101 reaches the stop line SL1, based on the positional relationship between the stop line SL2 and the boundary line BD2 recognized by the recognition unit 111. Specifically, the setting unit 112 first calculates the distance DT2 in the traveling direction (vertical direction in Fig. 3) between the stop line SL2 and the boundary line BD2, based on their positions recognized by the recognition unit 111. As shown in Fig. 3, the setting unit 112 then sets the virtual stop line VL such that the distance DT1 between the stop line SL1 and the virtual stop line VL equals the distance DT2. As a result, as in the example shown in Fig. 3, even when the position (in the traveling direction) of the boundary line BD1 of the intersecting road IR cannot be recognized from the host vehicle 101 because of the block wall WL or the building BL, the virtual stop line VL can be set at an appropriate position, specifically at the position of the boundary line BD1, based on the distance DT2 between the easily visible stop line SL2 on the opposing-lane side and the boundary line BD2. The virtual stop line VL is set when the stop assist function is set to on.
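The geometry above reduces to simple one-dimensional arithmetic along the traveling direction. The following Python sketch (hypothetical function and variable names, not part of the disclosed embodiment) illustrates how the position of the virtual stop line VL could be derived from the recognized positions of SL1, SL2, and BD2, each measured as a distance from the host vehicle along its traveling direction:

```python
def virtual_stop_line(sl1: float, sl2: float, bd2: float) -> float:
    """Return the position of the virtual stop line VL.

    All arguments are positions [m] along the host vehicle's traveling
    direction. Seen from the host vehicle, the opposing lane's stop line
    SL2 lies beyond the far boundary line BD2, so DT2 = sl2 - bd2.
    VL is placed so that DT1 (distance from SL1 to VL) equals DT2.
    """
    dt2 = sl2 - bd2
    if dt2 < 0:
        raise ValueError("SL2 should lie beyond BD2 as seen from the host vehicle")
    return sl1 + dt2  # DT1 = DT2


# Example: SL1 at 10 m, BD2 at 30 m, SL2 at 33 m places VL at 13 m,
# coinciding with the (occluded) near boundary line BD1.
vl = virtual_stop_line(10.0, 33.0, 30.0)
```

With the example values, VL lands exactly DT2 = 3 m beyond SL1, which is where BD1 sits in the symmetric case of Fig. 3.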
When the field of view of the intersection IS is good, the setting unit 112 may determine that stop assistance is not required and may refrain from setting the virtual stop line VL. A method of determining whether the field of view of the intersection IS is good will now be described. The setting unit 112 performs blind spot detection over predetermined ranges on the left side (left side in Fig. 3) and the right side (right side in Fig. 3) of the intersection IS based on the captured image of the camera 1, using a recognition technique such as semantic segmentation. The predetermined ranges are, for example, a range extending a predetermined length to the left from the left corner of the intersection IS and a range extending a predetermined length to the right from the right corner of the intersection IS. The setting unit 112 determines that the field of view of the intersection IS is good when no blind spot is detected, and that it is poor when a blind spot is detected. In addition to whether a blind spot is detected, the setting unit 112 may determine whether the field of view of the intersection IS is good based on the position, shape, and size of the detected blind spot, using a technique such as machine learning.
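As a rough sketch of the visibility determination described above (illustrative only: the range length and the data layout are assumptions, and in the embodiment the blind spots themselves come from semantic segmentation of the camera image):

```python
def visibility_is_poor(blind_spots, check_range=10.0):
    """Judge the intersection's field of view as poor when any detected
    blind spot falls within the predetermined range measured from the
    near corner of the intersection on its side.

    blind_spots: list of (side, offset) pairs, side in {"left", "right"},
    offset [m] of the blind spot from the corner on that side.
    """
    return any(0.0 <= offset <= check_range for _side, offset in blind_spots)
```

For example, a wall detected 4 m past the left corner makes the view poor, while a structure 15 m away, outside the assumed 10 m check range, does not.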
However, when the road structure around the stop line SL1 differs from that around the stop line SL2, for example when a crosswalk is provided in front of the stop line SL1 but no crosswalk exists in front of the stop line SL2 on the opposing lane, the distance between the stop line SL1 and the boundary line BD1 may differ from the distance between the stop line SL2 and the boundary line BD2. In that case, if the virtual stop line VL is set based on the distance DT2 between the stop line SL2 on the opposing-lane side and the boundary line BD2, the virtual stop line VL is not set at an appropriate position. If the virtual stop line VL is set beyond (inward of) the boundary line BD1, the front end of the host vehicle 101 may cross the boundary line BD1 during the multi-stage stop, and the host vehicle 101 may rapidly approach, and possibly collide with, another vehicle or pedestrian traveling on the intersecting road IR. In addition, at a T-junction where, for example, the road SR on which the host vehicle 101 travels dead-ends, the stop line SL2 and the boundary line BD2 on the opposing-lane side do not exist, so the distance DT2 cannot be calculated and the virtual stop line VL cannot be set.
Therefore, when the road structure around the stop line SL1 is different from the road structure around the stop line SL2, the setting unit 112 determines that the stop assistance cannot be performed, and does not set the virtual stop line VL. Specifically, the setting unit 112 performs feature point matching or the like after aligning the captured image around the stop line SL1 and the captured image around the stop line SL2 obtained by the camera 1 using affine transformation or the like, and calculates the degree of coincidence (similarity) between the road structure around the stop line SL1 and the road structure around the stop line SL2. Note that other matching methods may be used for calculating the similarity, and the present invention is not limited to this. The degree of coincidence (similarity) may be calculated from the road map information stored in the storage unit 12. When the calculated similarity is smaller than the predetermined threshold, the setting unit 112 does not set the virtual stop line VL.
When the road width of the intersection IR is large, the distance from the host vehicle 101 to the boundary line BD2 and the stop line SL2 may be long, and the accuracy of the positions of the stop line SL2 and the boundary line BD2 recognized based on the captured image of the camera 1 may be reduced. In this case, since the calculation accuracy of the distance DT2 by the setting unit 112 is reduced, if the virtual stop line VL is set based on the distance DT2, the virtual stop line VL may not be set at an appropriate position.
Therefore, when the distance between the stop line SL1 and the stop line SL2 is equal to or greater than the predetermined distance, the setting unit 112 determines that the stop assistance is not possible, and does not set the virtual stop line VL. Specifically, the setting unit 112 calculates the distance between the stop line SL1 and the stop line SL2 from the captured image including the stop line SL1 and the stop line SL2 obtained by the camera 1, and does not set the virtual stop line VL when the calculated distance is equal to or greater than the predetermined distance.
Further, when a structure or the like is present between the host vehicle 101 and the stop line SL2 or the boundary line BD2, the recognition unit 111 may be unable to recognize the stop line SL2 or the boundary line BD2 from the captured image of the camera 1. For example, when a median strip is provided on the intersecting road IR, or when road construction is being performed on the intersecting road IR and fences or signs relating to the construction stand around it, so-called occlusion occurs in which the stop line SL2 or the boundary line BD2 is hidden behind these structures in the captured image of the camera 1, and the stop line SL2 or the boundary line BD2 cannot be recognized. In this case, the setting unit 112 cannot calculate the distance DT2 and cannot set the virtual stop line VL at an appropriate position. Therefore, when the recognition unit 111 cannot recognize the stop line SL2 or the boundary line BD2 because of a structure or the like on the intersecting road IR, the setting unit 112 determines that stop assistance is not possible and does not set the virtual stop line VL.
In addition, when the road surface of the road SR is wet and road surface reflection occurs, or when the view around the intersection IS deteriorates due to bad weather (rain or snow), the recognition accuracy of the stop lines SL1 and SL2 by the recognition unit 111 decreases. The recognition accuracy also decreases when the stop lines SL1 and SL2 are worn. Since the position of the virtual stop line VL is determined based on the positions of the stop lines SL1 and SL2, the setting unit 112 cannot set the virtual stop line VL at an appropriate position when this recognition accuracy is low. Therefore, when the recognition accuracy of the stop line SL1 or SL2 by the recognition unit 111 is low, the virtual stop line VL is set closer to the near side in the traveling direction. Specifically, the setting unit 112 calculates the recognition accuracy of the recognition unit 111 for the stop line SL1 or SL2 and, when that accuracy is below a predetermined level, sets the position of the virtual stop line VL closer to the near side in the traveling direction than when the accuracy is at or above the predetermined level. More specifically, when the width (length in the vertical direction in Fig. 3) of the stop line SL1 or SL2 recognized by the recognition unit 111 is smaller than a predetermined width, the setting unit 112 determines that the recognition accuracy of the stop lines SL1 and SL2 is below the predetermined level. This suppresses the host vehicle 101 from crossing the boundary line BD1 during the multi-stage stop.
When the recognition accuracy of the stop line SL1 or the stop line SL2 is less than a predetermined level, it may be determined that the stop assist is not possible, and the virtual stop line VL may not be set. Since the predetermined widths of the stop lines SL1 and SL2 are different depending on the number of lanes of the road SR, the setting unit 112 recognizes the number of lanes of the road SR from the road map information stored in the storage unit 12 or the captured image of the camera 1, and uses the predetermined width corresponding to the number of lanes for the determination.
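The accuracy-dependent placement described above can be sketched as follows; the safety margin and the width threshold are illustrative assumptions, not values from the disclosure:

```python
def place_virtual_stop_line(sl1, dt2, line_width_m, min_width_m, margin_m=0.5):
    """Place VL at SL1 + DT2, pulled toward the near side by a margin
    when the painted stop line's width is below the threshold, i.e. when
    recognition accuracy is judged to be below the predetermined level."""
    vl = sl1 + dt2
    if line_width_m < min_width_m:  # worn or poorly visible stop line
        vl -= margin_m              # stop earlier to avoid crossing BD1
    return vl
```

With a 0.40 m width threshold, a cleanly painted 0.45 m line places VL at SL1 + DT2, while a worn 0.30 m line pulls VL 0.5 m toward the host vehicle.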
When a preceding vehicle is present between the host vehicle 101 and the intersection IS, the recognition unit 111 may fail to recognize the stop line SL1, the stop line SL2, the boundary line BD2, and the like. When an oncoming vehicle is present on the opposing lane, the stop line SL2, the boundary line BD2, and the like may fall into the blind spot created by the oncoming vehicle, and the recognition unit 111 may be unable to recognize them. Therefore, when such a preceding vehicle or oncoming vehicle is recognized by the recognition unit 111, the setting unit 112 determines that stop assistance is not possible and does not set the virtual stop line VL.
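Taken together, the conditions under which the setting unit withholds assistance can be summarized in a single predicate. The function name and the numeric thresholds below are illustrative assumptions; the disclosure only states that such thresholds exist:

```python
def stop_assist_feasible(similarity, sl1_to_sl2_m, sl2_recognized, bd2_recognized,
                         preceding_vehicle, oncoming_vehicle,
                         min_similarity=0.8, max_sl_distance_m=30.0):
    """True only when none of the disqualifying conditions applies."""
    if similarity < min_similarity:              # road structures around SL1/SL2 differ
        return False
    if sl1_to_sl2_m >= max_sl_distance_m:        # SL2/BD2 too far; position accuracy drops
        return False
    if not (sl2_recognized and bd2_recognized):  # occlusion by structures on IR
        return False
    if preceding_vehicle or oncoming_vehicle:    # lines hidden by other vehicles
        return False
    return True
```

Each branch mirrors one paragraph above: structure mismatch, excessive stop line separation, occlusion, and interference by preceding or oncoming vehicles.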
When the setting unit 112 sets the virtual stop line VL, the output control unit 113 outputs information indicating that the stop assist function is available (hereinafter referred to as advice information) to the input/output device 2 before the host vehicle 101 reaches the stop line SL1. For example, the output control unit 113 outputs display information including the advice information to a display of the input/output device 2, for example a multi-information display (MID) provided in the meter at the driver's seat. The output control unit 113 may also output sound information including the advice information to a speaker of the input/output device 2 (a speaker provided in the vehicle interior). This notifies the driver that the stop assist function can be used. In response to this notification, when the driver performs an approval operation via an operation member of the input/output device 2, for example a predetermined switch provided near the steering device, the travel control unit 114 performs the stop assistance. Note that the notification processing may also be performed by the travel control unit 114 instead of the output control unit 113.
The travel control unit 114 controls the actuator AC to assist the vehicle 101 in stopping. More specifically, when the host vehicle 101 is stopped at the stop line SL1, the travel control unit 114 controls the actuator AC so that the host vehicle 101 automatically starts from the stop line SL1, and the host vehicle 101 slowly travels to the virtual stop line VL and temporarily stops at the virtual stop line VL. When the stop assist function is turned off or when the setting unit 112 determines that the stop assist is not possible, the travel control unit 114 does not perform the stop assist.
Fig. 4 is a flowchart showing an example of processing executed by the controller 10 of fig. 2 according to a predetermined program. The processing shown in the flowchart is repeated at a predetermined cycle, for example, starting when the engine of the host vehicle 101 is started.
First, in step S1, it is determined whether or not an intersection IS exists ahead of the host vehicle 101. More specifically, it is determined whether or not an intersection IS is included in the image ahead of the host vehicle 101 captured by the camera 1. When step S1 is affirmative (S1: YES), that is, when the captured image includes an intersection IS, it is determined that an intersection IS has been recognized ahead of the host vehicle 101, and the flow proceeds to step S2. When step S1 is negative (S1: NO), that is, when the captured image does not include an intersection IS, it is determined that no intersection has been recognized ahead of the host vehicle 101, and the processing ends.
In step S2, it is determined whether or not a stop line SL1 corresponding to the host lane exists immediately before the intersection IS. Specifically, it is determined whether or not the stop line SL1 corresponding to the host lane is included in the captured image of the area in front of the host vehicle 101 obtained by the camera 1. When step S2 is negative (S2: NO), that is, when the stop line SL1 is not present, it is determined that the travel priority of the road SR at the intersection IS is higher than that of the intersecting road IR, that is, that the road SR is a priority road; stop assistance is therefore judged unnecessary, and the process ends. When step S2 is affirmative (S2: YES), it is determined in step S3 whether or not the field of view at the intersection IS is poor. When step S3 is negative (S3: NO), that is, when the field of view at the intersection IS is good, stop assistance is judged unnecessary, and the process ends. On the other hand, when step S3 is affirmative (S3: YES), it is determined in step S4 whether or not the stop line SL2 on the opposite-lane side can be recognized. Specifically, it is determined whether or not the stop line SL2 corresponding to the opposite lane is included in the captured image of the area in front of the host vehicle 101 obtained by the camera 1. When step S4 is affirmative (S4: YES), it is determined in step S5 whether or not the boundary line BD2 on the far side of the intersecting road IR can be recognized. When step S5 is affirmative (S5: YES), in step S6 the distance DT2 between the stop line SL2 recognized in step S4 and the boundary line BD2 recognized in step S5 is calculated, and a virtual stop line VL is set based on the calculated distance DT2 and the stop line SL1 recognized in step S2. In step S7, the driver is notified via the input/output device 2 that the stop assist function is available. Specifically, display information (an icon, a message, or the like) including advice information is output to the MID. Sound information including the advice information may also be output to a speaker in the vehicle cabin. On the other hand, when step S4 or S5 is negative (S4 or S5: NO), the driver is notified via the input/output device 2 in step S8 that the stop assist function is unavailable. Specifically, information notifying that the stop assist function is unavailable (hereinafter referred to as "assist-unavailable information") is output to the input/output device 2. Like the advice information, the assist-unavailable information may be display information (an icon, a message, or the like) output to the MID, or sound information output to a speaker in the vehicle cabin.
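The decision flow of steps S1 to S8 can be sketched as a single function. This is an illustrative reconstruction, not code from the patent: the `Perception` fields, the use of forward distances in metres, and the function names are all assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Perception:
    """Hypothetical per-frame camera output; field names are illustrative."""
    intersection_visible: bool          # S1: intersection IS in the image?
    stop_line_sl1: Optional[float]      # distance (m) to SL1, None if absent
    stop_line_sl2: Optional[float]      # distance (m) to SL2, None if absent
    boundary_bd2: Optional[float]       # distance (m) to BD2, None if absent
    poor_visibility: bool               # S3: blind spots flank the intersection

def plan_stop_assist(p: Perception) -> Optional[float]:
    """Sketch of steps S1-S8: returns the virtual stop line position, or
    None when assistance is not offered (steps ending the process, or S8)."""
    if not p.intersection_visible:       # S1: no intersection ahead
        return None
    if p.stop_line_sl1 is None:          # S2: no SL1 -> host road has priority
        return None
    if not p.poor_visibility:            # S3: good view -> assist unnecessary
        return None
    if p.stop_line_sl2 is None or p.boundary_bd2 is None:
        return None                      # S4/S5 failed -> S8: assist unavailable
    dt2 = p.boundary_bd2 - p.stop_line_sl2   # S6: gap between SL2 and BD2
    return p.stop_line_sl1 + dt2             # VL is DT2 ahead of SL1 (S6, S7)
```

With SL1 at 10 m, SL2 at 25 m, and BD2 at 30 m, DT2 is 5 m and the virtual stop line is placed 15 m ahead of the host vehicle.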
The operation of the driving assistance device 100 of the present embodiment is summarized as follows. Fig. 5 is a diagram for explaining the operation of the driving assistance device 100. Fig. 5 shows a case where the host vehicle 101, traveling on the road SR in the manual drive mode, temporarily stops at the stop line SL1 immediately before the intersection IS and then temporarily stops again at the virtual stop line VL before entering the intersection IS. While the host vehicle 101 travels in the manual drive mode, when the intersection IS is recognized at time t0 (S1), it is determined whether or not a stop line SL1 corresponding to the host lane exists immediately before the intersection IS (S2). When it is determined at time t1 that the stop line SL1 is present, it is determined whether or not the field of view at the intersection IS is poor, that is, whether or not stop assistance is required (S3). In the example of Fig. 5, blind spots are created by structures on the left and right sides of the intersection IS, so the intersection IS is determined to have a poor field of view and stop assistance is judged necessary. Then, the virtual stop line VL is set based on the stop line SL2 on the opposite-lane side and the boundary line BD2 on the far side of the intersecting road IR (S4, S5, S6). Specifically, the distance DT2 between the stop line SL2 and the boundary line BD2 is calculated, and the virtual stop line VL is set at a position separated forward (upward in Fig. 5) from the stop line SL1 by the distance DT2.
When the virtual stop line VL is set, the advice information is output to the MID or the like until the time (time t2) at which the driver temporarily stops the host vehicle 101 at the stop line SL1, notifying the driver that the stop assist function is available (S7). When the driver performs an approval operation before restarting the host vehicle 101 from its temporary stop at the stop line SL1, the start of stop assistance is announced. The host vehicle 101, temporarily stopped at the stop line SL1, then automatically starts to move forward slowly. When the front end of the host vehicle 101 reaches the virtual stop line VL at time t3, the host vehicle 101 is temporarily stopped again at the virtual stop line VL. Such stop assistance makes it possible to avoid a sudden approach to, or collision with, another vehicle 102 or a pedestrian PD passing along the intersecting road IR, while moving the host vehicle 101 from the stop line SL1 to a position from which the driver can easily see the other vehicle 102 or the pedestrian PD on the intersecting road IR.
When the host vehicle 101 stops at the virtual stop line VL, the stop assistance ends. At this time, information notifying the end of stop assistance (hereinafter referred to as assist-end information) is output to the input/output device 2 (MID, speaker). When the driver recognizes from the assist-end information that the stop assistance has ended, the driver resumes the driving operation. Note that, instead of ending the stop assistance at the point when the host vehicle 101 stops at the virtual stop line VL, the stop assistance may be ended at the point when the driver performs a steering operation or an acceleration operation on the host vehicle 101 stopped at the virtual stop line VL. In addition, when the driver performs a steering operation or an acceleration operation before the host vehicle 101 reaches the virtual stop line VL, it may be determined that the driver has canceled the stop assistance, and the stop assistance may be ended at that point.
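The assist lifecycle described above — start on driver approval, end on stopping at VL or on driver override — can be viewed as a small state machine. The states and event names below are illustrative assumptions, not terms from the patent.

```python
from enum import Enum, auto

class AssistState(Enum):
    WAITING_APPROVAL = auto()   # host stopped at SL1, advice shown
    CREEPING = auto()           # moving slowly from SL1 toward VL
    DONE = auto()               # assist ended; driver resumes control

def step(state: AssistState, event: str) -> AssistState:
    """Sketch of the lifecycle: creep begins after the approval operation;
    assist ends when the vehicle stops at VL, or immediately when the driver
    steers or accelerates (treated as cancellation). Other events are ignored."""
    if state is AssistState.WAITING_APPROVAL and event == "driver_approved":
        return AssistState.CREEPING
    if state is AssistState.CREEPING and event in ("stopped_at_vl",
                                                   "driver_override"):
        return AssistState.DONE
    return state
```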
According to the embodiment of the present invention, the following operations and effects can be achieved.
(1) The driving assistance device 100 assists the travel operation of the host vehicle 101 when the host vehicle 101 enters an intersecting road IR defined by a boundary line BD1 on the side closer to the host vehicle 101 and a boundary line BD2 on the side farther from the host vehicle 101. The driving assistance device 100 includes: a camera 1 that detects the external situation in front of the host vehicle 101; a recognition unit 111 that recognizes, based on the external situation detected by the camera 1, a stop line SL1 near the boundary line BD1, a stop line SL2 of the opposite lane near the boundary line BD2, and the boundary line BD2; a setting unit 112 that sets a virtual stop line VL ahead of the stop line SL1 based on the positional relationship between the stop line SL2 and the boundary line BD2 recognized by the recognition unit 111; and a travel control unit 114 that controls the travel actuator AC mounted on the host vehicle 101 so that the host vehicle 101, temporarily stopped at the stop line SL1, is further temporarily stopped at the virtual stop line VL set by the setting unit 112 when entering the intersecting road IR. The setting unit 112 sets the virtual stop line at a position separated from the stop line SL1, toward the front side in the traveling direction of the host vehicle 101, by the distance between the boundary line BD2 and the stop line SL2. Thus, even at an intersection with a poor field of view, the host vehicle can enter the intersection while avoiding a sudden approach to, or collision with, another vehicle or a pedestrian passing along the intersecting road. By appropriately providing driving assistance when entering an intersection with a poor field of view in this way, traffic safety can be improved.
(2) The setting unit 112 calculates the similarity between the road structure around the stop line SL1 and the road structure around the stop line SL2 based on the external situation detected by the camera 1, and does not set the virtual stop line when the similarity is less than a predetermined threshold. The setting unit 112 also does not set the virtual stop line VL when the distance between the stop line SL1 and the stop line SL2 in the traveling direction of the host vehicle is equal to or greater than a predetermined distance. Further, when the stop line SL2 or the boundary line BD2 cannot be recognized because of a structure on or around the intersecting road IR, the setting unit 112 does not set the virtual stop line VL. Accordingly, when the virtual stop line VL is set based on the positional relationship between the stop line SL2 and the boundary line BD2, the virtual stop line can be prevented from being set beyond the boundary line BD1 (inside the intersecting road).
(3) The setting unit 112 calculates the recognition accuracy of the stop line SL1 and the stop line SL2 recognized by the recognition unit 111 and, when the recognition accuracy of the stop line SL1 or the stop line SL2 is less than a predetermined level, sets the position of the virtual stop line closer to the near side in the traveling direction (toward the host vehicle) than when the recognition accuracy of both the stop line SL1 and the stop line SL2 is equal to or greater than the predetermined level. In this way, when the recognition accuracy of the stop line SL1 or the stop line SL2 is low, the stop assistance is ended earlier, so that the host vehicle can be prevented from crossing the boundary line BD1 during stop assistance.
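Effect (3) amounts to pulling the virtual stop line back toward the host vehicle when recognition is unreliable. A minimal sketch, assuming normalized accuracy scores and an illustrative safety margin (neither value appears in the patent):

```python
def adjust_virtual_stop_line(vl: float, accuracy_sl1: float, accuracy_sl2: float,
                             level: float = 0.8, margin: float = 0.5) -> float:
    """If either stop line's recognition accuracy falls below `level`,
    move the virtual stop line `margin` metres back toward the host vehicle
    so that the assisted stop ends before the vehicle can cross BD1.
    `level` and `margin` are assumed values for illustration."""
    if accuracy_sl1 < level or accuracy_sl2 < level:
        return vl - margin
    return vl
```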
The above-described embodiment can be modified in various ways. Some modifications are described below. In the above embodiment, the recognition unit 111 recognizes the first stop line (the stop line SL1 in Fig. 1) located before the first boundary line (the boundary line BD1 in Fig. 1) and the second stop line (the stop line SL2 in Fig. 1) of the opposite lane located before the second boundary line (the boundary line BD2 in Fig. 1), based on the external situation detected by the camera 1 serving as the external situation detection unit. However, the recognition unit 111 may recognize the first stop line, the second stop line, and the like by using high-precision road map information stored in the storage unit 12 instead of, or together with, the information detected by the camera 1.
In the above embodiment, when it is determined in step S3 that the field of view at the intersection IS is poor, the virtual stop line VL is set based on the stop line SL1, the stop line SL2, and the boundary line BD2 (steps S4, S5, S6). However, even when the field of view at the intersection IS is poor, the boundary line BD1 may be recognizable from the captured image of the camera 1. The boundary line BD1 may also be recognized based on the captured image of the camera 1 and high-precision road map information stored in the storage unit 12. Accordingly, the controller 10 may execute the processing of Fig. 6 instead of the processing of Fig. 4. Fig. 6 is a flowchart showing another example of processing executed by the controller 10 of Fig. 2 in accordance with a predetermined program. The processing in steps S11 to S13, S15, S16, S18, and S19 of Fig. 6 is the same as that in steps S1 to S3, S4, S5, S7, and S8 of Fig. 4, respectively, and its description is therefore omitted.
As shown in Fig. 6, when step S13 is affirmative (S13: YES), it is determined in step S14 whether or not the boundary line BD1 on the near side of the intersecting road IR can be recognized. When step S14 is negative (S14: NO), the flow proceeds to step S15. On the other hand, when step S14 is affirmative (S14: YES), the flow proceeds to step S17. The processing of step S17 is the same as that of step S6 of Fig. 4, except that, when the boundary line BD1 is recognized in step S14, the virtual stop line VL is set based on the positional relationship between the stop line SL1 and the boundary line BD1. In this case, the virtual stop line VL may be set so as to just overlap the boundary line BD1.
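The Fig. 6 variant can be sketched as a fallback: prefer the directly recognized near boundary BD1, and otherwise mirror the SL2-BD2 gap as in Fig. 4. The function shape and the use of forward distances in metres are assumptions for illustration.

```python
from typing import Optional

def set_virtual_stop_line(sl1: float,
                          bd1: Optional[float],
                          sl2: Optional[float],
                          bd2: Optional[float]) -> Optional[float]:
    """Fig. 6 sketch (steps S14/S17): when BD1 is recognized, place VL so it
    just overlaps BD1; otherwise fall back to the Fig. 4 rule VL = SL1 + DT2,
    where DT2 is the SL2-to-BD2 distance; return None when neither works."""
    if bd1 is not None:
        return bd1                      # S14 YES -> S17 with BD1
    if sl2 is not None and bd2 is not None:
        return sl1 + (bd2 - sl2)        # S14 NO -> S15/S16 -> S17 as in Fig. 4
    return None
```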
In the above embodiment, the setting unit 112 determines that the recognition accuracy of the stop lines SL1 and SL2 is less than the predetermined level when the widths of the stop lines SL1 and SL2 recognized by the recognition unit 111 are less than a predetermined width. However, when the stop lines SL1 and SL2 are worn, it is difficult to detect their edges (more precisely, their long-side edges) from the captured image of the camera 1. Likewise, when the road surface around the stop lines SL1 and SL2 is wet, road-surface reflection makes it difficult to detect the edges of the stop lines SL1 and SL2 from the captured image of the camera 1. In such cases, the positions of the stop lines SL1 and SL2 cannot be recognized with high accuracy, and if the virtual stop line VL is set based on the stop lines SL1 and SL2, it cannot be set at an appropriate position. Therefore, the setting unit may determine that the recognition accuracy of the stop lines SL1 and SL2 is less than the predetermined level when the edges of the stop lines SL1 and SL2 (both the long-side edge on the host vehicle 101 side and the long-side edge on the opposite side) cannot be detected by the recognition unit 111.
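The two accuracy criteria above — the width check of the embodiment and the edge check of the modification — can be combined into one predicate. The threshold value and the reading that accuracy is insufficient when neither long-side edge is detected are assumptions.

```python
def stop_line_accuracy_ok(near_edge_detected: bool,
                          far_edge_detected: bool,
                          width_m: float,
                          min_width_m: float = 0.2) -> bool:
    """Accuracy is judged insufficient when the painted line is narrower
    than a threshold (the embodiment's criterion), or when neither long-side
    edge can be detected, e.g. due to wear or wet-road reflection (the
    modified criterion). `min_width_m` is an illustrative value."""
    if width_m < min_width_m:
        return False
    return near_edge_detected or far_edge_detected
```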
In the above embodiment, when the recognition accuracy of the recognition unit 111 for the stop line SL1 is less than the predetermined level, the setting unit 112 sets the position of the virtual stop line VL closer to the near side in the traveling direction, or determines that stop assistance is impossible and does not set the virtual stop line VL, compared with the case where the recognition accuracy of the stop line SL1 is equal to or greater than the predetermined level. Likewise, when the recognition accuracy of the stop road sign SG11 installed immediately before the intersection IS is less than the predetermined level, the setting unit may set the position of the virtual stop line VL closer to the near side in the traveling direction, or may determine that stop assistance is impossible and not set the virtual stop line VL. The setting unit may determine that the recognition accuracy of the road sign SG11 is less than the predetermined level when the recognition unit 111 alternately recognizes and fails to recognize the road sign SG11 during the period until the host vehicle 101 reaches the intersection IS. The setting unit may also determine that the recognition accuracy of the road sign SG11 is less than the predetermined level when the road sign SG11 is recognized based on the road map information stored in the storage unit 12 but is not recognized based on the captured image of the camera 1.
In the above embodiment, when it is determined in step S3 that the field of view at the intersection IS is good, stop assistance is judged unnecessary. However, when the field of view at the intersection IS is good, the driver may relax and pay less attention to the surroundings of the host vehicle 101. Therefore, even when it is determined in step S3 that the field of view at the intersection IS is good, the flow may proceed to step S4 instead of ending the process, if drowsiness, inattentive glancing around, or the like of the driver is detected by a driver monitoring system (not shown) mounted on the host vehicle 101. Alternatively, the process may end after the driver is alerted (for example, by an alert sound) via the MID or the in-vehicle speaker.
In the above embodiment, when it is determined in step S2 that the stop line SL1 is not present, the road SR is determined to be a priority road, stop assistance is judged unnecessary, and the process ends. However, if the stop line SL1 fails to be recognized because of wear or road-surface reflection, step S2 may erroneously determine that the stop line SL1 is not present. Therefore, to further improve safety, step S2 may also determine whether or not the stop road sign SG11 is present. Specifically, only when neither the road sign SG11 nor the stop line SL1 is recognized is it determined that stop assistance is unnecessary and the process ended. The presence or absence of the road surface marking SG12 may be determined as well.
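The modified step S2 above is an OR over several cues. A minimal sketch, with argument names that are illustrative rather than from the patent:

```python
def stop_assist_candidate(stop_line_seen: bool,
                          stop_sign_seen: bool,
                          road_marking_seen: bool = False) -> bool:
    """Modified step S2: treat the host road as a priority road (and skip
    stop assistance) only when none of the stop line SL1, the stop road sign
    SG11, and the road surface marking SG12 is recognized."""
    return stop_line_seen or stop_sign_seen or road_marking_seen
```

This makes the check robust to a single missed cue, e.g. a worn stop line with an intact stop sign still triggers the assistance flow.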
The above description is merely an example, and the present invention is not limited to the above embodiment and modifications as long as the characteristic features of the present invention are not impaired. One or more of the above embodiment and modifications may be combined arbitrarily, and the modifications may also be combined with one another.
The present invention can appropriately assist driving when entering an intersection with a poor visual field.
While the invention has been described in connection with the preferred embodiments thereof, it will be understood by those skilled in the art that various modifications and changes can be made without departing from the scope of the disclosure of the following claims.

Claims (8)

1. A driving assistance device (100) that assists a travel operation of a host vehicle when the host vehicle enters an intersecting road defined by a first boundary line on a side closer to the host vehicle and a second boundary line on a side farther from the host vehicle, the driving assistance device comprising:
an external situation detection unit (1) that detects an external situation in front of the host vehicle;
a recognition unit (111) that recognizes, based on the external situation detected by the external situation detection unit (1), a first stop line located before the first boundary line, a second stop line of an opposite lane located before the second boundary line, and the second boundary line;
a setting unit (112) that sets a virtual stop line in front of the first stop line, based on the positional relationship between the second stop line and the second boundary line recognized by the recognition unit (111); and
a travel control unit (114) that controls a travel actuator mounted on the host vehicle so that the host vehicle, having temporarily stopped at the first stop line, is further temporarily stopped at the virtual stop line set by the setting unit (112) when entering the intersecting road.
2. The driving assistance device according to claim 1, characterized in that,
the setting unit (112) calculates the similarity between the road structure around the first stop line and the road structure around the second stop line based on the external situation detected by the external situation detection unit (1), and does not set the virtual stop line when the similarity is smaller than a predetermined threshold.
3. The driving assistance device according to claim 1, characterized in that,
the setting unit (112) does not set the virtual stop line when the distance between the first stop line and the second stop line in the traveling direction of the vehicle is equal to or greater than a predetermined distance.
4. The driving assistance device according to claim 1, characterized in that,
the setting unit (112) does not set the virtual stop line when the second stop line or the second boundary line cannot be recognized because of a structure on or around the intersecting road.
5. The driving assistance device according to claim 1, characterized in that,
the setting unit (112) performs blind-spot detection on predetermined ranges on left and right sides of an intersection where a road on which the host vehicle is traveling intersects the intersecting road, based on the external situation detected by the external situation detection unit (1), and does not set the virtual stop line when no blind spot is detected in the predetermined ranges on the left and right sides of the intersection.
6. The driving assistance device according to claim 1, characterized in that,
the setting unit (112) calculates recognition accuracy of the first stop line and the second stop line recognized by the recognition unit (111) and, when the recognition accuracy of the first stop line or the second stop line is less than a predetermined level, sets the position of the virtual stop line closer to the near side in the traveling direction than when the recognition accuracy of both the first stop line and the second stop line is equal to or greater than the predetermined level.
7. The driving assistance device according to claim 1, characterized in that,
the setting unit (112) sets the virtual stop line at a position separated from the first stop line, toward the front side in the traveling direction of the host vehicle, by a distance between the second boundary line and the second stop line.
8. A driving assistance device (100) that assists a travel operation of a host vehicle when the host vehicle enters an intersecting road defined by a first boundary line on a side closer to the host vehicle and a second boundary line on a side farther from the host vehicle, the driving assistance device comprising:
an external situation detection unit (1) that detects an external situation in front of the host vehicle;
a recognition unit (111) that recognizes, based on the external situation detected by the external situation detection unit (1), a first stop line located immediately before the first boundary line, and the first boundary line;
a setting unit (112) that sets a virtual stop line in front of the first stop line according to the positional relationship between the first stop line and the first boundary line recognized by the recognition unit (111); and
a travel control unit (114) that controls a travel actuator mounted on the host vehicle so that the host vehicle, having temporarily stopped at the first stop line, is further temporarily stopped at the virtual stop line set by the setting unit (112) when entering the intersecting road.

Applications Claiming Priority

Application JP2022-141974 (JP2022141974A), filed 2022-09-07 — Driving Support Device

Publication

CN117657147A, published 2024-03-08




