WO2023027039A1 - Parking assistance device and parking assistance method - Google Patents

Parking assistance device and parking assistance method

Info

Publication number
WO2023027039A1
Authority
WO
WIPO (PCT)
Prior art keywords
parking
route
vehicle
information
image
Prior art date
Application number
PCT/JP2022/031613
Other languages
English (en)
Japanese (ja)
Inventor
賢治 小原
Original Assignee
株式会社デンソー
株式会社J-QuAD DYNAMICS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー (DENSO CORPORATION) and 株式会社J-QuAD DYNAMICS
Priority to CN202280055839.8A (publication CN117836183A)
Priority to JP2023543911A (publication JPWO2023027039A1)
Publication of WO2023027039A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/06: Automatic manoeuvring for parking
    • B60W30/08: Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09: Taking automatic action to avoid collision, e.g. braking and steering
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of such parameters related to ambient conditions
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • the present disclosure relates to a parking assistance device and a parking assistance method.
  • Conventionally, a parking assistance device is known that automatically parks a vehicle at a predetermined parking position (see, for example, Patent Document 1).
  • The parking assistance device described in Patent Document 1 learns the vehicle's driving route from a reference start position to the parking position while the driver parks the vehicle at the planned parking position, and uses the learning result to automatically park the vehicle at the predetermined position.
  • An object of the present disclosure is to provide a parking assistance device and a parking assistance method capable of improving usability.
  • The parking assistance device comprises: a route generation unit that generates a target route the vehicle should follow when it is parked, based on route information that includes the vehicle's driving route and the surroundings of the vehicle on that route when the vehicle was parked by the user; a follow-up control unit that performs a follow-up control process for automatically moving the vehicle to the planned parking position along the target route; and an information providing unit that provides information to the user.
  • The information providing unit visually presents to the user the information regarding the planned parking position included in the route information before the follow-up control process is started.
  • The parking assistance method includes: generating a target route the vehicle should follow when it is parked, based on route information that includes the vehicle's travel route and the surroundings of the vehicle on that route when the vehicle was parked by the user; performing a follow-up control process for automatically moving the vehicle to the planned parking position along the target route; and providing information to the user. Providing information to the user includes visually presenting the information on the planned parking position included in the route information before the follow-up control process is started.
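As a rough sketch of how these three steps fit together, the following toy pipeline presents the planned parking position to the user first, then generates and follows the target route. All names and data structures here are hypothetical illustrations, not the actual implementation.

```python
from dataclasses import dataclass

@dataclass
class RouteInfo:
    """Hypothetical route information recorded while the user parked manually."""
    travel_route: list               # sequence of (x, y) positions driven by the user
    planned_parking_position: tuple  # end point of the recorded route

def generate_target_route(info: RouteInfo) -> list:
    """Step 1: derive the target route the vehicle should follow."""
    return list(info.travel_route)

def provide_parking_position(info: RouteInfo) -> str:
    """Step 3, done before follow-up control starts: visually present the
    planned parking position to the user, here as a text stand-in."""
    x, y = info.planned_parking_position
    return f"Planned parking position: ({x}, {y})"

def follow_up_control(target_route: list) -> tuple:
    """Step 2: move the vehicle along the target route; returns the final pose."""
    position = target_route[0]
    for waypoint in target_route[1:]:
        position = waypoint  # a real controller would steer toward each waypoint
    return position

info = RouteInfo(travel_route=[(0, 0), (5, 0), (5, 8)], planned_parking_position=(5, 8))
message = provide_parking_position(info)   # shown to the user before control starts
route = generate_target_route(info)
final = follow_up_control(route)
```

The point of the ordering is the one the disclosure stresses: `provide_parking_position` runs before `follow_up_control`, so the user confirms the destination before the vehicle moves.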
  • the user can start automatically parking the vehicle at the planned parking position after clearly grasping the planned parking position. Therefore, according to the parking assistance device and the parking assistance method of the present disclosure, usability of automatic parking can be improved.
  • "Usability" refers to the degree of effectiveness, efficiency, and user satisfaction with which a product is used by a specific user to achieve a specified goal in a specific context of use.
  • FIG. 1 is a schematic configuration diagram of an automatic parking system according to an embodiment of the present disclosure;
  • FIG. 2 is an explanatory diagram of a parking lot containing a parking space for the vehicle;
  • FIG. 3 is a flowchart showing an example of the learning process executed by the parking control unit of the parking assistance device;
  • FIG. 4 is a flowchart showing an example of the assistance process executed by the parking control unit;
  • FIG. 5 is an explanatory diagram of an example of the content displayed on the touch panel display unit before the assistance process is started;
  • FIG. 6 is a flowchart showing an example of the target route generation process executed by the parking control unit;
  • FIG. 7 is an explanatory diagram of an example of the display contents on the touch panel display unit before the follow-up control process is started;
  • FIG. 8 is an explanatory diagram of an example of the display mode of the planned parking position on the touch panel display unit;
  • FIG. 9 is an explanatory diagram of an example of the display contents on the display unit when there are a plurality of candidates for the planned parking position;
  • FIG. 10 is a flowchart showing an example of the follow-up control process executed by the parking control unit;
  • FIG. 11 is an explanatory diagram of an example of the display contents on the touch panel display unit when the follow-up control process is started;
  • FIG. 12 is an explanatory diagram of an example of the display contents on the touch panel display unit after the follow-up control process is started;
  • FIG. 13 is an explanatory diagram of an example of automatic adjustment of the angle of the virtual viewpoint of the virtual viewpoint image according to the change in position between a target and the vehicle;
  • FIG. 14 is an explanatory diagram of another example of the display contents on the touch panel display unit after the follow-up control process is started;
  • FIG. 15 is an explanatory diagram of an example of the display contents on the touch panel display unit while an avoidance route is being searched for;
  • FIG. 16 is an explanatory diagram of an example of an avoidance route;
  • FIG. 17 is an explanatory diagram of an example of the display mode of the avoidance route and the like on the touch panel display unit.
  • As shown in FIG. 1, the automatic parking system 1 includes a perimeter monitoring sensor 3, various ECUs 4, and a parking assistance device 5.
  • the parking assistance device 5 is communicably connected to the perimeter monitoring sensor 3 and various ECUs 4 directly or via an in-vehicle LAN (Local Area Network).
  • the surroundings monitoring sensor 3 is an autonomous sensor that monitors the surroundings of the vehicle V itself.
  • The surroundings monitoring sensor 3 detects, as objects to be detected, obstacles OB, i.e., three-dimensional objects around the own vehicle such as moving dynamic targets (pedestrians, other vehicles) and stationary static targets (structures on the road).
  • It also detects parking assistance marks indicating parking information, i.e., information about the parking lot PL and the like.
  • As the surroundings monitoring sensor 3, the vehicle is provided with, for example, a surroundings monitoring camera 31 that captures a predetermined range around the vehicle, a sonar 32 that transmits search waves over a predetermined range around the vehicle, a millimeter wave radar 33, a LiDAR (Light Detection and Ranging) 34, and the like.
  • the surroundings monitoring camera 31 corresponds to an image capturing device, captures an image of the surroundings of the own vehicle, and outputs the imaged data to the parking assistance device 5 as sensing information.
  • In the present embodiment, the front camera 31a, the rear camera 31b, the left side camera 31c, and the right side camera 31d, which capture images in front of, behind, and on the left and right sides of the vehicle, are exemplified as the surroundings monitoring camera 31, but the camera is not limited to these.
  • The search wave sensors output search waves, acquire the reflected waves, and sequentially output measurement results, such as the relative speed and relative distance to a target and the azimuth angle at which the target exists, to the parking assistance device 5 as sensing information.
  • The sonar 32 performs measurement using ultrasonic waves as search waves and is provided at a plurality of locations on the vehicle V. For example, a plurality of sonars 32 are arranged side by side in the vehicle's left-right direction on the front and rear bumpers, and perform measurement by outputting search waves around the vehicle.
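As a rough illustration of how a search-wave sensor such as the sonar 32 turns an echo into a relative distance, the round-trip time of the ultrasonic pulse can be converted using the speed of sound; the constant and the timing value below are illustrative only, not taken from the disclosure.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (illustrative constant)

def echo_to_distance(round_trip_s: float) -> float:
    """Convert a round-trip echo time (seconds) to a one-way distance (metres).

    The pulse travels to the obstacle and back, so the one-way
    distance is half of speed * time.
    """
    return SPEED_OF_SOUND * round_trip_s / 2.0

# An echo returning after 12 ms corresponds to an obstacle about 2.06 m away.
distance_m = echo_to_distance(0.012)
```

Radar and LiDAR follow the same time-of-flight principle with the speed of light instead of the speed of sound.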
  • the millimeter wave radar 33 performs measurement using millimeter waves as search waves.
  • the LiDAR 34 performs measurements using laser light as the probe wave. Both the millimeter wave radar 33 and the LiDAR 34 output search waves, for example, within a predetermined range in front of the vehicle V, and perform measurement within that output range.
  • In the present embodiment, the surroundings monitoring sensor 3 includes the surroundings monitoring camera 31, the sonar 32, the millimeter wave radar 33, and the LiDAR 34; however, it suffices that the necessary detection can be performed, and not all of these sensors need to be provided.
  • The parking assistance device 5 constitutes an ECU (that is, an electronic control unit) that performs various controls for realizing the parking assistance method in the automatic parking system 1, and is composed of a microcomputer having a CPU, a storage unit 50, I/O, and the like.
  • the storage unit 50 includes ROM, RAM, EEPROM, and the like. That is, the storage unit 50 has a volatile memory such as RAM and a nonvolatile memory such as EEPROM.
  • The storage unit 50 is composed of a non-transitory tangible recording medium.
  • Based on the travel route of the vehicle V and information about the surroundings of the vehicle V on that route when the vehicle V was parked by the user, the parking assistance device 5 generates a target route TP that the vehicle V should follow when it is parked.
  • the "information around the vehicle V" includes, for example, dynamic targets such as people and other vehicles around the vehicle V, curbs around the vehicle V, static targets such as buildings, various signs, guide lines, and the like. information such as road markings.
  • the parking assistance device 5 automatically moves the vehicle V from the assistance start position STP to the planned parking position SEP along the target route TP.
  • the planned parking position SEP is the end point of the target route TP.
  • the planned parking position SEP is registered in advance by the user as the parking space SP for the own vehicle.
  • The parking assistance device 5 stores the sensing information, which is the detection result of the periphery monitoring sensor 3, in the non-volatile memory of the storage unit 50 when the user performs the parking operation of the vehicle V.
  • the parking assistance device 5 generates the target route TP and performs various controls for parking assistance based on the sensing information stored in the storage unit 50 and the sensing information from the surrounding monitoring sensor 3 during parking assistance.
  • The learning process, which stores information about the driving route and the surroundings of the vehicle V during manual driving by the user, is executed when an instruction to perform it is issued, for example when a learning switch (not shown) is operated by the user. Parking assistance is executed when the user issues an instruction to perform it, for example when the parking assistance start switch 35 is operated.
  • The parking assistance device 5 recognizes targets, available free spaces, the parking position, and the like on the travel route of the vehicle V based on the sensing information from the surroundings monitoring sensor 3. These recognition results are sequentially stored in the non-volatile memory of the storage unit 50 and used for parking assistance.
  • When the user issues a parking assistance instruction, the parking assistance device 5 generates a target route TP based on the sensing information stored in the storage unit 50 and the sensing information of the surroundings monitoring sensor 3 during parking assistance, and performs route-following control along that route.
  • the parking assistance device 5 includes a recognition processing unit 51, a vehicle information acquisition unit 52, and a parking control unit 53 as functional units that execute various controls.
  • The recognition processing unit 51 receives sensing information from the surroundings monitoring sensor 3 and, based on that information, recognizes the surrounding environment of the vehicle to be parked, the scene in which parking is to be performed, and the objects existing around the vehicle.
  • the recognition processing section 51 is composed of an image recognition section 51a, a space recognition section 51b, and a free space recognition section 51c.
  • the image recognition unit 51a performs scene recognition, three-dimensional object recognition, and the like. Various recognitions by the image recognition unit 51a are realized by image analysis of image data from the peripheral monitoring camera 31 that is input as sensing information.
  • In scene recognition, the image recognition unit 51a recognizes what kind of scene the parking scene is. For example, it recognizes whether the scene is a normal parking scene, in which there is no obstacle OB near the planned parking position SEP and parking of the vehicle V is not particularly restricted, or a special parking scene, in which parking of the vehicle V is restricted by an obstacle OB.
  • Since the imaging data input from the surroundings monitoring camera 31 shows the surroundings of the vehicle, analyzing the image makes it possible to determine whether the scene is a normal or a special parking scene. For example, if an object is detected around the planned parking position SEP from the imaging data and the object obstructs parking at the planned parking position SEP, the scene can be determined to be a special parking scene.
  • the scene recognition may be performed based on not only the sensing information of the perimeter monitoring camera 31 but also the sensing information of the survey wave sensor.
  • In three-dimensional object recognition, obstacles OB, i.e., three-dimensional objects existing around the vehicle such as dynamic targets and static targets, are recognized as objects to be detected. Based on the detected objects recognized by this three-dimensional object recognition, preferably the shapes of the static targets among them, the scene recognition described above and the generation of a parking support map including the obstacles OB are performed.
  • the space recognition unit 51b performs three-dimensional object recognition and the like.
  • the space recognition unit 51b recognizes three-dimensional objects in the space around the vehicle based on sensing information from at least one of the sonar 32, the millimeter wave radar 33, and the LiDAR 34.
  • the three-dimensional object recognition here is the same as the three-dimensional object recognition performed by the image recognition section 51a. Therefore, if either one of the image recognition section 51a and the space recognition section 51b is provided, three-dimensional object recognition can be performed.
  • In the present embodiment, the space recognition unit 51b does not perform scene recognition, but the space recognition unit 51b can also perform scene recognition based on sensing information from at least one of the sonar 32, the millimeter wave radar 33, and the LiDAR 34.
  • Although three-dimensional object recognition and scene recognition can be performed by either the image recognition unit 51a or the space recognition unit 51b alone, using both makes it possible to perform them with higher accuracy.
  • For example, by complementing the three-dimensional object recognition and scene recognition of the image recognition unit 51a with those of the space recognition unit 51b, recognition can be performed with higher accuracy.
  • the free space recognition unit 51c performs free space recognition to recognize a free space in the parking lot PL.
  • the free space means, for example, a space with a size and shape that allows the vehicle V to stop in the parking lot PL.
  • The number of such spaces in the parking lot PL is not limited to a plurality; there may be only one.
  • the free space recognition unit 51c recognizes free spaces in the parking lot PL based on the recognition results of scene recognition and three-dimensional object recognition by the image recognition unit 51a and the space recognition unit 51b. For example, from the results of scene recognition and three-dimensional object recognition, the shape of the parking lot PL and the presence or absence of parking of other vehicles can be grasped, so based on this, the free space in the parking lot PL is recognized.
  • The free space recognition unit 51c identifies free spaces in the image, for example, by using semantic segmentation, which classifies each pixel in the image based on the information surrounding that pixel.
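The free-space check described above can be illustrated with a toy per-pixel label mask: after a (hypothetical) semantic-segmentation model has assigned each pixel a class, a region qualifies as a candidate free space if it contains a contiguous block of "drivable" pixels at least as large as the vehicle footprint. The class id, mask, and block test below are illustrative stand-ins, not the actual recognition pipeline.

```python
DRIVABLE = 1  # hypothetical class id produced by a segmentation model

def has_free_space(mask, req_rows, req_cols):
    """Return True if `mask` contains a req_rows x req_cols block of DRIVABLE pixels.

    mask is a list of equal-length rows of per-pixel class ids.
    A brute-force scan over all candidate block positions suffices
    for this toy-sized example.
    """
    rows, cols = len(mask), len(mask[0])
    for r in range(rows - req_rows + 1):
        for c in range(cols - req_cols + 1):
            if all(mask[r + dr][c + dc] == DRIVABLE
                   for dr in range(req_rows) for dc in range(req_cols)):
                return True
    return False

mask = [
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [0, 1, 1, 1],
]
fits_2x2 = has_free_space(mask, 2, 2)  # a 2x2 drivable block exists
fits_3x3 = has_free_space(mask, 3, 3)  # every 3x3 block touches an occupied pixel
```

A real system would additionally account for the camera-to-ground projection and the vehicle's turning envelope, which this sketch omits.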
  • The vehicle information acquisition unit 52 acquires information on the operation amounts of the vehicle V from the other ECUs 4 and the like. Specifically, the vehicle information acquisition unit 52 acquires detection signals output from sensors mounted on the vehicle V, such as an accelerator position sensor, a brake depression force sensor, a steering angle sensor, a wheel speed sensor, and a shift position sensor.
  • the parking control unit 53 executes various controls required for parking assistance.
  • the parking control unit 53 includes a route storage unit 54, a route generation unit 55, a position estimation unit 56, a tracking control unit 57, an information provision unit 58, and an image generation unit 59 as functional units that execute various controls. It is configured with
  • The route storage unit 54 stores, in the storage unit 50, the sensing information of the perimeter monitoring sensor 3 obtained when the user performs the parking operation of the vehicle V. For example, when the learning process is started, the route storage unit 54 stores, in the storage unit 50 as route information, the targets on the travel route of the vehicle V, the available free spaces, the parking position, and the like that are sequentially acquired by the recognition processing unit 51.
  • The route storage unit 54 also stores, in the storage unit 50 as route information, the image data and the like sequentially input from the perimeter monitoring camera 31. Note that if the imaging data of the front camera 31a, the rear camera 31b, the left side camera 31c, and the right side camera 31d were each stored sequentially in the storage unit 50, the amount of route information would grow and strain the capacity of the storage unit 50. Therefore, the route storage unit 54 may instead store, in the storage unit 50, composite image data obtained by synthesizing the imaging data of these four cameras.
  • the route generation unit 55 generates a route based on the results of scene recognition, three-dimensional object recognition, and free space recognition.
  • The route generation unit 55 generates a target route TP that the vehicle V should follow when it is parked, based on the travel route of the vehicle V during the learning process and information about the surroundings of the vehicle V on that route. For example, the route generation unit 55 sets the travel route of the vehicle V as a reference route, and if the reference route contains a section in which the distance between the vehicle V and an obstacle OB is equal to or less than a predetermined value, it generates the target route TP by replacing that section with a route in which the distance from the obstacle OB exceeds the predetermined value.
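The replacement rule above can be sketched numerically. The geometry here is a deliberate simplification: waypoints are treated as 2-D points, an obstacle as a single point, and any waypoint at or below the clearance threshold is pushed radially away from the obstacle until the clearance exceeds the threshold. The actual generator would reason about the vehicle body and a smooth drivable path, which this illustration omits.

```python
import math

def generate_target_route(reference, obstacle, min_clearance):
    """Replace waypoints closer to `obstacle` than `min_clearance`.

    reference:     list of (x, y) waypoints recorded during the learning drive.
    obstacle:      (x, y) position of the obstacle OB.
    min_clearance: the predetermined distance value.
    Returns a target route whose clearance everywhere exceeds min_clearance.
    """
    target = []
    for (x, y) in reference:
        d = math.hypot(x - obstacle[0], y - obstacle[1])
        if d <= min_clearance:
            # push the waypoint directly away from the obstacle,
            # with a 10% margin past the threshold (arbitrary choice)
            scale = (min_clearance * 1.1) / max(d, 1e-9)
            x = obstacle[0] + (x - obstacle[0]) * scale
            y = obstacle[1] + (y - obstacle[1]) * scale
        target.append((x, y))
    return target

reference = [(0.0, 0.0), (1.0, 0.5), (2.0, 0.0)]  # middle point passes too close
route = generate_target_route(reference, obstacle=(1.0, 1.0), min_clearance=1.0)
```

Only the offending middle waypoint moves; the sections that already had sufficient clearance are kept from the reference route, matching the described behavior.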
  • the obstacle OB is composed of a three-dimensional object recognized by three-dimensional object recognition.
  • the position estimation unit 56 estimates the current position of the vehicle V based on the sensing information stored in the storage unit 50 and the sensing information sequentially acquired by the surroundings monitoring sensor 3 during parking assistance.
  • the position estimation unit 56 compares, for example, the sensing information stored in the storage unit 50 with the sensing information acquired during parking assistance, and estimates the current position based on the difference between them.
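One way to picture the comparison the position estimation unit 56 performs is landmark matching: the offset between where a stored landmark was seen during learning and where it is observed now gives a correction to the dead-reckoned vehicle pose. This is only an illustrative sketch (2-D translation only, hypothetical landmark data), not the actual estimation algorithm.

```python
def estimate_position(dead_reckoned, stored_landmarks, observed_landmarks):
    """Correct a dead-reckoned (x, y) pose from landmark re-observations.

    stored_landmarks:   {id: (x, y)} positions recorded during learning.
    observed_landmarks: {id: (x, y)} positions measured now, mapped into the
                        same frame via the dead-reckoned pose.
    Landmarks seen in both sets vote on the pose correction; with no
    common landmarks the dead-reckoned pose is returned unchanged.
    """
    common = set(stored_landmarks) & set(observed_landmarks)
    if not common:
        return dead_reckoned
    # average offset between stored and currently observed positions
    dx = sum(stored_landmarks[i][0] - observed_landmarks[i][0] for i in common) / len(common)
    dy = sum(stored_landmarks[i][1] - observed_landmarks[i][1] for i in common) / len(common)
    return (dead_reckoned[0] + dx, dead_reckoned[1] + dy)

stored = {"curb": (4.0, 2.0), "pillar": (6.0, 5.0)}
observed = {"curb": (3.8, 2.1), "pillar": (5.8, 5.1)}  # both appear shifted the same way
pose = estimate_position((10.0, 10.0), stored, observed)
```

The consistent shift of both landmarks is interpreted as drift in the dead-reckoned pose, which is what "estimating the current position from the difference" amounts to in this simplified picture.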
  • The follow-up control unit 57 automatically moves the vehicle V from the support start position STP to the planned parking position SEP along the target route TP by performing vehicle motion control such as acceleration/deceleration control and steering control of the vehicle V. Specifically, the follow-up control unit 57 outputs control signals to the various ECUs 4 so that the current position of the vehicle V estimated by the position estimation unit 56 reaches the planned parking position SEP along the target route TP.
  • the various ECUs 4 include a steering ECU 41 that controls steering, a brake ECU 42 that controls acceleration and deceleration, a power management ECU 43, and a body ECU 44 that controls various electrical components such as lights and door mirrors.
  • Through the vehicle information acquisition unit 52, the follow-up control unit 57 acquires the detection signals output from sensors mounted on the vehicle V, such as the accelerator position sensor, brake depression force sensor, steering angle sensor, wheel speed sensor, and shift position sensor. The follow-up control unit 57 then detects the state of each unit from the acquired detection signals and outputs control signals to the various ECUs 4 so that the vehicle V moves following the target route TP.
  • The information providing unit 58 provides information to the user using an HMI (Human Machine Interface) 45.
  • the HMI 45 is a device for providing various types of support to the user.
  • The HMI 45 has a touch panel display unit 46 and a speaker 47.
  • the touch panel display unit 46 is a touch panel type display used in a navigation system or a meter system.
  • the information providing unit 58 provides the user with information regarding the planned parking position SEP included in the route information stored in the storage unit 50 in a visual manner before the follow-up control process is started. For example, the information providing unit 58 displays an image showing the surroundings of the planned parking position SEP on the touch panel display unit 46 before starting the follow-up control process.
  • The information providing unit 58 displays, on the touch panel display unit 46, various buttons that prompt the user to perform touch operations.
  • These various buttons are operation buttons that the user operates by touch.
  • The information providing unit 58 displays, for example, a start button STB for the follow-up control process, a selection button SLB for selecting the planned parking position SEP, and the like on the touch panel display unit 46.
  • the touch panel display section 46 of the present embodiment not only displays information, but also serves as an "operation section" operated by the user.
  • the information providing unit 58 changes the display contents of the touch panel display unit 46 according to the operation signal of the touch operation of the touch panel display unit 46.
  • the information providing unit 58 changes the viewpoint of the three-dimensional display (that is, 3D view) displayed on the touch panel display unit 46 in response to an operation signal of the touch panel display unit 46 by the user.
  • The image generation unit 59 generates image data to be displayed on the touch panel display unit 46 using the image data of the perimeter monitoring camera 31.
  • Although the image generation unit 59 and the image recognition unit 51a are separate in this embodiment, the image generation unit 59 may be included in the image recognition unit 51a.
  • The image generation unit 59 periodically or irregularly generates peripheral image data (hereinafter also referred to as a peripheral image) using, for example, the captured data from the front camera 31a, the rear camera 31b, the left side camera 31c, and the right side camera 31d.
  • the peripheral image is an image corresponding to at least a partial range of the area around the vehicle V, and includes the camera viewpoint image Gc, a synthesized image, and the like.
  • The camera viewpoint image Gc is an image whose viewpoint is the arrangement position of each lens of the periphery monitoring camera 31.
  • One of the synthesized images is an image of the surroundings of the vehicle V viewed from a virtual viewpoint set at an arbitrary position around the vehicle V (hereinafter also referred to as a virtual viewpoint image). A method of generating a virtual viewpoint image will be described below.
  • The image generation unit 59 projects the information of each pixel included in the imaging data of the front camera 31a, the rear camera 31b, the left side camera 31c, and the right side camera 31d onto a predetermined projection curved surface (for example, a bowl-shaped curved surface) in a virtual three-dimensional space. Specifically, the image generation unit 59 projects the information of each pixel onto a portion of the projection surface other than its center.
  • the center of the projected curved surface is defined as the vehicle V position.
  • The image generation unit 59 sets a virtual viewpoint in the virtual three-dimensional space and generates a virtual viewpoint image by extracting, as image data, the predetermined region of the projection surface included within a predetermined viewing angle when viewed from that virtual viewpoint.
  • The virtual viewpoint image obtained in this manner is a three-dimensional representation of the surroundings of the vehicle V.
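The bowl-surface extraction can be sketched in miniature: ground positions around the vehicle are lifted onto a bowl-shaped surface centered on the vehicle, and a virtual viewpoint keeps only the surface points that fall inside its viewing cone. The bowl profile, viewpoint placement, and cone test below are simplified stand-ins for the actual rendering, chosen only to show the geometry.

```python
import math

def bowl_height(x, y, flat_radius=2.0, slope=0.5):
    """Bowl-shaped projection surface: flat near the vehicle, rising outside."""
    r = math.hypot(x, y)
    return 0.0 if r <= flat_radius else slope * (r - flat_radius) ** 2

def visible_from(viewpoint, points, half_angle_deg=60.0):
    """Keep the surface points lying within the virtual viewpoint's viewing cone.

    viewpoint: ((x, y, z) position, (x, y, z) unit view direction).
    A point is kept when the angle between the view direction and the
    ray to the point is at most half_angle_deg.
    """
    (vx, vy, vz), (dx, dy, dz) = viewpoint
    cos_limit = math.cos(math.radians(half_angle_deg))
    out = []
    for (x, y) in points:
        px, py, pz = x - vx, y - vy, bowl_height(x, y) - vz
        norm = math.sqrt(px * px + py * py + pz * pz)
        if norm and (px * dx + py * dy + pz * dz) / norm >= cos_limit:
            out.append((x, y))
    return out

# A virtual viewpoint 5 m behind and 4 m above the vehicle, looking forward and down.
viewpoint = ((-5.0, 0.0, 4.0), (0.78, 0.0, -0.62))
ground = [(x * 0.5, y * 0.5) for x in range(-8, 9) for y in range(-8, 9)]
region = visible_from(viewpoint, ground)
```

The extracted `region` corresponds to the portion of the projection surface that would appear in the virtual viewpoint image; moving the viewpoint changes the region, which is the basis for the viewpoint changes driven by the user's touch operations.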
  • The image generation unit 59 further generates images in which a virtual vehicle image Gv representing the vehicle V and lines, frames, marks, and the like for supporting the parking operation are superimposed on the camera viewpoint image Gc and the virtual viewpoint image.
  • the virtual vehicle image Gv is composed of, for example, opaque or translucent polygons representing the shape of the vehicle V.
  • the automatic parking system 1 is configured as described above. Next, the operation of the automatic parking system 1 configured in this manner will be described.
  • the case where the vehicle V is parked in the parking lot PL shown in FIG. 2 will be described as an example.
  • Four parking spaces SP for vehicles V are set in the parking lot PL shown in FIG.
  • a first parking space SP1 and a second parking space SP2 are vertically arranged along a passage PS that extends linearly from a vehicle entrance/exit B.
  • a third parking space SP3 and a fourth parking space SP4 are provided adjacent to each other so as to cross the passage PS.
  • a third parking space SP3 and a fourth parking space SP4 are provided between the building BL and the house HM.
  • In the third parking space SP3, the vehicle V can be parked facing forward by moving back and forth (that is, by turning). This also applies to the fourth parking space SP4.
  • the third parking space SP3 is assumed to be the planned parking position SEP, and the vehicle V is parked facing forward at the planned parking position SEP.
  • The learning process shown in FIG. 3 is executed by the parking control unit 53 at each predetermined control cycle when an instruction to perform the learning process is issued, for example when a learning switch (not shown) is operated by the user.
  • Each processing shown in this flowchart is implemented by each functional unit of the parking assistance device 5 . Further, each step for realizing this processing can also be grasped as each step for realizing the parking assistance method.
  • the parking control unit 53 starts recognition processing in step S100.
  • In this recognition processing, scene recognition, three-dimensional object recognition, and free space recognition by the recognition processing unit 51 are started based on the sensing information of the periphery monitoring sensor 3.
  • the parking control unit 53 determines whether or not the learning start condition is satisfied.
  • the learning start condition is, for example, a condition that is met when the vehicle V enters a learning start area designated in advance by the user around the parking lot PL.
  • the learning start condition may be a condition that is satisfied when a learning switch (not shown) is turned on.
  • the parking control unit 53 waits until the learning start condition is satisfied, and when the learning start condition is satisfied, in step S120, starts storing various information necessary for parking assistance.
  • The parking control unit 53 stores, as route information in the storage unit 50, for example, the targets on the traveling route of the vehicle V, the free spaces available for parking, and the parking positions that are sequentially acquired by the recognition processing unit 51.
  • The parking control unit 53 of the present embodiment also stores in the storage unit 50 peripheral images captured while the vehicle V is traveling and while it is parked at the parking position. Specifically, the parking control unit 53 stores the images captured by the front camera 31a, the rear camera 31b, the left side camera 31c, and the right side camera 31d in the storage unit 50 as the surrounding images during parking.
  • A composite image obtained by synthesizing the images captured by the front camera 31a, the rear camera 31b, the left side camera 31c, and the right side camera 31d is preferably stored in the storage unit 50 as the surrounding image during parking.
  • the parking control unit 53 determines whether or not the learning stop condition is satisfied.
  • the learning stop condition is a condition that is met when the vehicle V stops at the planned parking position SEP designated in advance by the user or in the vicinity of the planned parking position SEP.
  • the learning stop condition may be a condition that is met when the shift position is switched to a position that means parking (for example, the P position).
  • the parking control unit 53 continues storing various information in the storage unit 50 until the learning stop condition is satisfied. On the other hand, when the learning stop condition is established, the parking control unit 53 stops storing various information in step S140.
  • In step S150, the parking control unit 53 notifies the user via the HMI 45 that the storage of various information has been completed, and exits the learning process.
  • In step S150, for example, the travel route of the vehicle V during the learning process and the conditions around the travel route may also be reported.
  • the learning process from step S100 to step S150 is performed by the route storage section 54 of the parking control section 53.
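The learning flow above (wait for the start condition, store route information each control cycle, then stop and notify) can be sketched as follows. This is an illustrative sketch only; the class, method, and state names are assumptions, not taken from the disclosure.

```python
# Illustrative sketch of steps S100-S150 of the learning process.
# All identifiers are assumptions; they are not from the patent.
from dataclasses import dataclass, field

@dataclass
class RouteRecorder:
    recording: bool = False
    route_info: list = field(default_factory=list)

    def on_control_cycle(self, start_condition: bool, stop_condition: bool,
                         observation: dict) -> str:
        """Run once per control cycle; returns the recorder's state."""
        if not self.recording:
            if start_condition:          # S110: learning start condition met
                self.recording = True
            else:
                return "waiting"
        if stop_condition:               # S130: learning stop condition met
            self.recording = False       # S140: stop storing information
            return "finished"            # S150: user notification (omitted)
        # S120: store targets, free spaces, and surrounding images
        self.route_info.append(observation)
        return "recording"
```

A recorder driven over four cycles stores only the observations made while recording was active.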
  • An example of the support processing for automatically moving the vehicle V from the support start position STP to the planned parking position SEP along the target route TP will be described with reference to the flowchart shown in FIG. 4.
  • the support process shown in FIG. 4 is executed by the parking control unit 53 in each predetermined control cycle under the condition that the learning process has been performed at least once.
  • Each process shown in this flowchart is implemented by the corresponding functional unit of the parking assistance device 5. Each step for realizing this processing can also be understood as a step for realizing the parking assistance method.
  • In step S200, the parking control unit 53 determines whether or not the current position of the vehicle V is near the support start position STP using the sensing information of the surroundings monitoring sensor 3, a GPS receiver (not shown), and a map database. The support start position STP is set near the vehicle entrance/exit B of the parking lot PL.
  • the vehicle entrance/exit B is a boundary portion between the public road OL and the parking lot PL.
  • the support start position STP may be set on the public road OL side instead of the parking lot PL side.
  • In step S210, the parking control unit 53 notifies the user via the HMI 45 that the current position of the vehicle V is near the support start position STP.
  • For example, the parking control unit 53 displays the camera viewpoint image Gc in the left area of the touch panel display unit 46 and the overhead image Gh in the right area, thereby notifying the user that the current position of the vehicle V is near the support start position STP.
  • The notification to the user may also be realized by displaying a message on the touch panel display unit 46 indicating that the vehicle is near the support start position STP, or by outputting a voice from the speaker 47 announcing that the vehicle is near the support start position STP.
  • the camera viewpoint image Gc shown in FIG. 5 is an image taken from the viewpoint of the arrangement position of the lens of the camera (the front camera 31a in this example) that captures the scenery in the direction in which the vehicle V is scheduled to move.
  • The bird's-eye view image Gh shown in FIG. 5 is an image of the vehicle V and its surroundings viewed from above, on which a virtual vehicle image Gv is superimposed.
  • the camera viewpoint image Gc and the bird's-eye view image Gh are generated by the image generation unit 59 based on the images captured by the surroundings monitoring camera 2 during execution of the support process.
  • The parking spaces SP, objects, and the like appearing in the camera viewpoint image Gc and the bird's-eye view image Gh are given the same reference numerals as the actual objects. The same applies to images other than the camera viewpoint image Gc and the overhead image Gh.
  • In step S220, the parking control unit 53 determines whether or not an instruction to perform parking assistance has been given by the user through the operation of the parking assistance start switch 35.
  • The parking control unit 53 skips the subsequent processing and exits this processing when the user does not turn on the parking assistance start switch 35.
  • the parking control unit 53 performs processing for generating the target route TP in step S230 when the start switch 35 for parking assistance is turned on by the user. Details of the processing in step S230 will be described below with reference to the flowchart shown in FIG.
  • In step S300, the parking control unit 53 reads the route information stored in the storage unit 50 during the learning process.
  • When the storage unit 50 stores a plurality of pieces of route information, the parking control unit 53 reads the plurality of pieces of route information.
  • the parking control unit 53 starts recognition processing in step S310.
  • In the recognition processing, scene recognition, three-dimensional object recognition, and free space recognition by the recognition processing unit 51 are started based on the sensing information of the periphery monitoring sensor 3.
  • In step S320, the parking control unit 53 generates the target route TP based on the route information. Specifically, the parking control unit 53 generates the target route TP that the vehicle V should follow when it is parked, based on the travel route of the vehicle V during the learning process and the information about the surroundings of the vehicle V on that travel route. As shown in FIG. 7, this target route TP passes in front of the third parking space SP3 as in the learning process, after which the vehicle V is turned back so that it is parked facing forward in the third parking space SP3.
  • For example, the parking control unit 53 generates the target route TP with the third parking space SP3 as the planned parking position SEP.
  • When route information has been stored for both the third parking space SP3 and the fourth parking space SP4, the parking control unit 53 sets each of the third parking space SP3 and the fourth parking space SP4 as a candidate position for the planned parking position SEP. A target route TP is then generated for each candidate position of the planned parking position SEP.
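The generation of one target route per learned candidate position (step S320) can be illustrated with a minimal sketch. The dictionary-based layout of the route information is an assumption made only for this illustration.

```python
# Hedged sketch: one target route per learned candidate parking position,
# reusing the travel route stored during the learning process (step S320).
# Field names such as "parking_position" are assumptions.

def generate_target_routes(route_infos):
    """route_infos: list of dicts, each with 'parking_position' (the learned
    planned parking position) and 'travel_route' (list of (x, y) waypoints).
    Returns {candidate_position: target_route}."""
    routes = {}
    for info in route_infos:
        routes[info["parking_position"]] = list(info["travel_route"])
    return routes
```

With two learned pieces of route information, two candidate positions each receive their own target route.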
  • In step S330, the parking control unit 53 determines whether or not there is, on the target route TP, a new obstacle OB that did not exist during the learning process. Specifically, the parking control unit 53 determines whether or not there is a new obstacle OB based on the recognition result of the three-dimensional object recognition at the support start position STP and the recognition result of the three-dimensional object recognition included in the route information.
  • the obstacle OB is composed of a three-dimensional object recognized by three-dimensional object recognition.
  • When there is no new obstacle OB on the target route TP, the parking control unit 53 skips the subsequent processes and exits this process.
  • In step S340, the parking control unit 53 searches for an object avoidance route that avoids the obstacle OB on the target route TP and reaches the planned parking position SEP, and attempts to generate the object avoidance route. Specifically, the parking control unit 53 searches, on the traveling route of the vehicle V included in the route information, for a section in which the distance between the vehicle V and the obstacle OB is equal to or less than a predetermined value, and generates the object avoidance route by replacing that section with a route in which the distance from the obstacle OB exceeds the predetermined value.
  • The object avoidance route generated in this manner is, for example, a route that avoids a collision between the vehicle V and the obstacle OB, as shown in FIG. Among the obstacles OB, dynamic targets move; it is therefore desirable that the object avoidance route be a route that avoids only static targets.
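The section-replacement idea of step S340 — waypoints whose clearance from a static obstacle is at or below a threshold are replaced with waypoints whose clearance exceeds it — might be sketched as follows. The radial push-away strategy and the 1.1 margin factor are assumptions for illustration, not the patented method.

```python
# Illustrative sketch of the avoidance-route search in step S340:
# waypoints too close to a static obstacle are pushed radially away
# until their clearance exceeds the threshold. Values are assumptions.
import math

def avoidance_route(route, obstacle, clearance=1.0):
    """route: list of (x, y); obstacle: (x, y) static target.
    Returns a new route whose points all keep > `clearance` from the
    obstacle, or None if a point coincides with the obstacle."""
    ox, oy = obstacle
    out = []
    for x, y in route:
        d = math.hypot(x - ox, y - oy)
        if d <= clearance:
            if d == 0.0:
                return None  # no offset direction can be decided
            s = (clearance * 1.1) / d   # push just beyond the threshold
            x, y = ox + (x - ox) * s, oy + (y - oy) * s
        out.append((x, y))
    return out
```

Only the too-close middle waypoint is moved; waypoints that already have enough clearance are kept unchanged.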
  • In step S350, the parking control unit 53 determines whether the object avoidance route has been generated. If the object avoidance route could be generated, the parking control unit 53 sets the object avoidance route as the target route TP in step S360 and exits this process. By setting the object avoidance route as the target route TP, the object avoidance route is visually provided to the user in the display processing of the planned parking position SEP described later.
  • If the object avoidance route could not be generated, the parking control unit 53 turns on, in step S370, the parking prohibition flag indicating that parking at the planned parking position SEP is impossible, and ends this processing.
  • the processing of steps S320 to S370 is performed by the route generation section 55 of the parking control section 53.
  • In step S240, the parking control unit 53 determines whether the vehicle can be parked at the planned parking position SEP. In this determination processing, for example, it is determined that parking at the planned parking position SEP is possible when the parking prohibition flag is off, and that parking at the planned parking position SEP is not possible when the parking prohibition flag is on.
  • When parking at the planned parking position SEP is possible, the parking control unit 53 performs the display processing of the planned parking position SEP in step S250.
  • the processing for displaying the expected parking position SEP will be described with reference to the flowchart shown in FIG.
  • the parking control unit 53 determines in step S400 whether there are multiple candidate positions for the planned parking position SEP. For example, the parking control unit 53 determines whether the storage unit 50 stores route information obtained when the vehicle V is parked in a different parking space SP.
  • In step S410, the parking control unit 53 visually provides the user with the information on the planned parking position SEP included in the route information.
  • the processing of step S410 is performed by the information providing section 58 of the parking control section 53.
  • the parking control unit 53 provides the user with a virtual parking image Gp obtained as information about the vicinity of the planned parking position SEP among the route information before starting the follow-up control process. For example, the parking control unit 53 displays a virtual parking image Gp in the upper right area of the touch panel display unit 46, as shown in FIG.
  • the virtual parking image Gp is generated by the image generation unit 59 based on the image stored in the storage unit 50 as the route information during the learning process.
  • The parking control unit 53 displays, in the upper right area of the touch panel display section 46, the virtual parking image Gp in which the virtual vehicle image Gv and the parking frame image Gf indicating the planned parking position SEP are superimposed on a virtual viewpoint image showing the surroundings of the planned parking position SEP.
  • The virtual vehicle image Gv is an image representing the vehicle V (a polygon image in this example).
  • the parking frame image Gf is a thick-line image colored in blue or red so that it can be distinguished from the parking frame shown in the virtual viewpoint image.
  • the virtual parking image Gp is a three-dimensional representation of an image showing the surroundings of the planned parking position SEP.
  • The parking control unit 53 changes the viewpoint of the virtual parking image Gp according to the operation signal of a touch operation performed by the user on the touch panel display unit 46.
  • For example, the parking control unit 53 acquires an operation signal corresponding to a touch operation, such as a flick or a drag on the touch panel display unit 46 in the directions indicated by the vertical and horizontal rotation arrows R shown in the virtual parking image Gp, and changes the viewpoint of the virtual parking image Gp according to the operation signal.
  • the parking control unit 53 enlarges and reduces the virtual parking image Gp, for example, by touching the zoom-in icon ZI and the zoom-out icon ZO shown in the virtual parking image Gp.
  • enlargement and reduction of the virtual parking image Gp may be realized by operations other than icon operations.
  • Enlargement and reduction of the virtual parking image Gp are desirably realized, for example, by a pinch-out operation that widens the distance between two fingers on the surface of the touch panel display unit 46 and a pinch-in operation that narrows that distance. If enlargement and reduction of the virtual parking image Gp are realized by such screen operations, it is possible to avoid a reduction in the display size of the virtual parking image Gp due to icon display and the obscuring of part of the image due to superimposed icons. That is, the display size of the image and the visibility of the image can be ensured. The same applies not only to enlargement and reduction of an image but also to changing the viewpoint of an image.
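The pinch-out/pinch-in behavior described above reduces to a scale computation on the two-finger distance; the image scale follows the ratio of the finger distance before and after the gesture. The clamping bounds below are assumptions, not values from the disclosure.

```python
# Minimal sketch of pinch-based zoom for the virtual parking image Gp.
# The scale follows the ratio of the two-finger distance before and after
# the gesture, clamped to assumed bounds.
import math

def pinch_scale(p1_start, p2_start, p1_end, p2_end,
                current_scale=1.0, min_scale=0.5, max_scale=4.0):
    d0 = math.dist(p1_start, p2_start)   # finger distance at gesture start
    d1 = math.dist(p1_end, p2_end)       # finger distance at gesture end
    if d0 == 0:
        return current_scale             # degenerate gesture: ignore
    new_scale = current_scale * (d1 / d0)
    return max(min_scale, min(max_scale, new_scale))
```

Pinching out doubles the scale when the finger distance doubles; pinching in shrinks it, and both directions are clamped.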
  • The parking control unit 53 displays a vehicle surrounding image Ga, which includes a surrounding image showing the surroundings of the vehicle V, in the left area of the touch panel display unit 46, and displays an illustration image Gi in the lower right area of the touch panel display unit 46.
  • The vehicle peripheral image Ga is a virtual viewpoint image, obtained by viewing a projection curved surface from a virtual viewpoint set behind the vehicle V and extracting as an image the area on the projection curved surface included in a predetermined viewing angle, on which a target route image Gt and an icon P are superimposed.
  • This vehicle surroundings image Ga is generated by the image generating unit 59 based on the image captured by the surroundings monitoring camera 2 during the execution of the support process and the information regarding the target route TP.
  • the target route image Gt is an image showing the target route TP from the current position of the vehicle V to the planned parking position SEP.
  • the parking control unit 53 identifies a route in front of an object appearing in the vehicle peripheral image Ga on the target route TP as a front route, and identifies a route behind the object on the target route TP as a back route.
  • the parking control unit 53 uses semantic segmentation to identify an object appearing in the vehicle peripheral image Ga, and identifies the positional relationship between the object and the target route TP.
  • the image generation unit 59 of the parking control unit 53 superimposes the portion Gtb corresponding to the back route and the portion Gtf corresponding to the front route in the target route image Gt on the vehicle peripheral image Ga in different manners.
  • For example, the image generating unit 59 superimposes on the vehicle peripheral image Ga, as the target route image Gt, an image in which the portion Gtf corresponding to the front route is drawn as a solid line and the portion Gtb corresponding to the back route is drawn as a broken line.
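The split into a solid "front route" portion Gtf and a dashed "back route" portion Gtb can be approximated by comparing each waypoint's depth from the virtual viewpoint with the occluding object's depth. Depth-only occlusion (rather than the semantic segmentation named in the description) is a simplifying assumption for this sketch.

```python
# Hedged sketch: classify target-route waypoints as in front of or behind
# an object relative to the virtual viewpoint, so the renderer can draw
# the front portion solid and the back portion dashed. Depth-only
# occlusion is an assumption of this sketch.
import math

def split_route(route, viewpoint, object_pos):
    """Returns (front, back): waypoints nearer/farther than the object."""
    obj_depth = math.dist(viewpoint, object_pos)
    front = [p for p in route if math.dist(viewpoint, p) <= obj_depth]
    back = [p for p in route if math.dist(viewpoint, p) > obj_depth]
    return front, back
```

A waypoint closer to the viewpoint than the object falls in the front portion; one farther away falls in the back portion.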
  • a virtual vehicle image Gv is displayed in an opaque manner in the vehicle peripheral image Ga.
  • a front portion of the virtual vehicle image Gv is superimposed on the vehicle peripheral image Ga so that the viewpoint can be easily grasped.
  • a virtual vehicle image Gv showing the entire vehicle V may be superimposed on the vehicle peripheral image Ga.
  • the icon P indicates the planned parking position SEP.
  • the icon P is superimposed near the planned parking position SEP.
  • the icon P is superimposed in a translucent manner.
  • the illustration image Gi is an image showing the relationship between the current position of the vehicle V, the target route TP, and the planned parking position SEP.
  • the illustration image Gi of this embodiment is composed of an overall bird's-eye view image including all of the current position of the vehicle V, the target route TP, and the planned parking position SEP.
  • the illustration image Gi is composed of pictures and diagrams.
  • the illustration image Gi is generated by the image generation unit 59 based on information about the target route TP and icons indicating the vehicle V and the planned parking position SEP prepared in advance. Note that the illustration image Gi shown in FIG. 10 shows only the current position of the vehicle V, the target route TP, and the planned parking position SEP, but other information such as an obstacle OB may also be shown.
  • the parking control unit 53 displays a start button STB in the area below the illustration image Gi on the touch panel display unit 46.
  • the start button STB is a button touched by the user when instructing the start of the follow-up control process.
  • the parking control unit 53 determines in step S420 whether or not the user has touched the start button STB. Then, the parking control unit 53 waits until the start button STB is touched, and when the start button STB is touched, exits this process and proceeds to the follow-up control process of step S260 shown in FIG.
  • When there are a plurality of candidate positions, the parking control unit 53 provides the user, in step S430, with information on the plurality of candidate positions and information prompting the user to select the planned parking position SEP.
  • The processing of step S430 is performed by the information providing section 58 of the parking control section 53.
  • The parking control unit 53 displays, on the touch panel display section 46, images in which icons P1 and P2 indicating the candidate positions are superimposed on the portions corresponding to the candidate positions of the planned parking position SEP shown in each of the virtual parking image Gp, the vehicle peripheral image Ga, and the illustration image Gi.
  • The parking control unit 53 displays the selection buttons SLB1 and SLB2 corresponding to the respective candidate positions in the area below the vehicle periphery image Ga on the touch panel display unit 46, and displays a decision button DB for determining the planned parking position SEP in the area below the illustration image Gi.
  • In step S440, the parking control unit 53 determines whether or not the user has selected a candidate position. For example, when one of the selection buttons SLB1 and SLB2 is selected and the decision button DB is touched, the parking control unit 53 determines that the user has selected a candidate position. The parking control unit 53 determines that the user has not selected a candidate position when either the touch operation of the selection buttons SLB1 and SLB2 or the touch operation of the decision button DB has not been performed.
  • the parking control unit 53 waits until the user selects a candidate position, and when the user selects a candidate position, the process proceeds to step S450. After setting the candidate position selected by the user as the planned parking position SEP in step S450, the parking control unit 53 proceeds to step S410.
  • the parking control unit 53 displays an image showing all of the plurality of candidate positions on the touch panel display unit 46 before the user selects a candidate position.
  • the parking control unit 53 displays an image focused on the selected candidate position on the touch panel display unit 46 after the user selects the candidate position. According to this, the user can clearly grasp the planned parking position SEP.
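The candidate-selection flow of steps S400 to S450 reduces to a small decision function: with a single learned position it is used directly, and with several the user's selection fixes the planned parking position SEP. The identifiers and the None-while-waiting convention are assumptions.

```python
# Sketch of the candidate-selection flow (steps S400-S450). Names and the
# None-while-waiting convention are assumptions, not from the disclosure.

def resolve_planned_position(candidates, user_choice=None):
    """candidates: list of position ids; user_choice: id selected via the
    selection buttons and decision button, or None while still waiting."""
    if len(candidates) == 1:
        return candidates[0]          # S400 "no": single learned position
    if user_choice in candidates:     # S440/S450: user picked a candidate
        return user_choice
    return None                       # still waiting for a selection
```

With one candidate the answer is immediate; with two candidates the function waits until the user's choice arrives.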
  • The parking control unit 53 then proceeds to step S260 shown in FIG. 4.
  • The parking control unit 53 starts the follow-up control processing in step S260.
  • the follow-up control process is a process of automatically moving the vehicle V to the planned parking position SEP along the target route TP. The follow-up control process will be described with reference to the flowchart shown in FIG.
  • the parking control unit 53 starts recognition processing in step S500.
  • In the recognition processing, scene recognition, three-dimensional object recognition, and free space recognition by the recognition processing unit 51 are started based on the sensing information of the periphery monitoring sensor 3.
  • In step S510, the parking control unit 53 estimates the current position of the vehicle V based on the sensing information stored in the storage unit 50 and the sensing information sequentially acquired by the periphery monitoring sensor 3 during parking assistance.
  • The processing of step S510 is performed by the position estimation section 56 of the parking control section 53.
  • In step S520, the parking control unit 53 starts automatic parking of the vehicle V at the planned parking position SEP by performing vehicle motion control such as acceleration/deceleration control and steering control of the vehicle V.
  • The processing of step S520 is performed by the follow-up control section 57 of the parking control section 53.
  • In step S530, the parking control unit 53 determines whether there is a new obstacle OB, which did not exist during the learning process, on the target route TP or at the planned parking position SEP. Specifically, the parking control unit 53 determines whether or not there is a new obstacle OB based on the recognition result of the three-dimensional object recognition after the start of the tracking control process and the recognition result of the three-dimensional object recognition included in the route information.
  • the obstacle OB is composed of a three-dimensional object recognized by three-dimensional object recognition.
  • The processing of step S540 is performed by the information providing section 58 of the parking control section 53.
  • The parking control unit 53 displays, in the left area of the touch panel display unit 46, an image in which an icon P is superimposed on the portion corresponding to the planned parking position SEP shown in the vehicle peripheral image Ga.
  • FIG. 14 illustrates an example of display contents on the touch panel display unit 46 at the start of the follow-up control process.
  • FIG. 15 illustrates an example of display contents on the touch panel display unit 46 after the follow-up control process is started.
  • the icon P is superimposed in a translucent manner.
  • the icon P is superimposed in an opaque manner. This makes it possible to grasp the position of the planned parking position SEP.
  • A part of the planned parking position SEP is cut off in the vehicle peripheral image Ga; the icon P is therefore superimposed on the right end of the vehicle peripheral image Ga.
  • The parking control unit 53 displays, in the right area of the touch panel display unit 46, the target route image Gt superimposed on the overhead image Gh. Furthermore, the parking control unit 53 displays a progress bar PB indicating the progress of the automatic parking of the vehicle V in the area below the vehicle peripheral image Ga on the touch panel display unit 46.
  • This progress bar PB has a horizontally long bar shape, and as the distance from the support start position STP to the current position of the vehicle V increases, the colored portion inside the bar increases. This allows the user to visually grasp the progress of automatic parking.
  • the remaining distance to the planned parking position SEP may be displayed in the lower area of the vehicle periphery image Ga, instead of the progress bar PB, for example.
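The progress bar fill described above can be computed as the fraction of the target route's length already covered from the support start position STP; the remaining-distance variant follows from the same sums. The waypoint-index interface is an assumption of this sketch.

```python
# Minimal sketch: progress bar fill as the fraction of the target route's
# length already covered from STP. The waypoint-index interface is an
# assumption, not from the disclosure.
import math

def progress_fraction(route, current_index):
    """route: list of (x, y) from STP to SEP; current_index: index of the
    waypoint nearest the vehicle. Returns a value in [0, 1]."""
    total = sum(math.dist(route[i], route[i + 1])
                for i in range(len(route) - 1))
    if total == 0:
        return 1.0
    done = sum(math.dist(route[i], route[i + 1])
               for i in range(current_index))
    return min(1.0, done / total)
```

On a straight 10 m route, the halfway waypoint yields a half-filled bar.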
  • The parking control unit 53 changes the display mode of the virtual viewpoint image by increasing the angle of the virtual viewpoint of the virtual viewpoint image when the vehicle approaches the planned parking position SEP and decreasing it when the vehicle moves away from the planned parking position SEP. That is, as with the user's own field of view, the closer the target is, the narrower the range displayed in the image. Since the display mode of the virtual viewpoint image thus changes in the same manner as the user's field of view, it becomes easier for the user to grasp the distance to the planned parking position SEP.
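The distance-dependent viewpoint change can be modeled as a linear interpolation of the viewing angle between a far, wide overview and a near, steep view. All numeric bounds here are illustrative assumptions, not values from the disclosure.

```python
# Hedged sketch: virtual-viewpoint angle grows as the vehicle nears the
# planned parking position SEP, narrowing the displayed range like the
# user's own field of view. All bounds are assumptions.

def viewpoint_angle_deg(distance_to_sep, far=20.0, near=2.0,
                        angle_far=20.0, angle_near=70.0):
    """Linearly interpolate the downward viewing angle from `angle_far`
    (distant, wide overview) to `angle_near` (close, steep narrow view)."""
    d = max(near, min(far, distance_to_sep))
    t = (far - d) / (far - near)       # 0 when far, 1 when near
    return angle_far + t * (angle_near - angle_far)
```

The angle increases monotonically as the remaining distance shrinks, saturating at the near and far bounds.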
  • the image displayed in the right area of the touch panel display section 46 is not limited to the overhead image Gh.
  • an illustration image Gi may be displayed in the right area of the touch panel display section 46 instead of the overhead image Gh.
  • In step S550, the parking control unit 53 determines whether or not the vehicle V has reached the planned parking position SEP.
  • the parking control unit 53 returns to the process of step S510 when the vehicle V has not reached the planned parking position SEP, and exits the follow-up control process when the vehicle V reaches the planned parking position SEP.
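The loop of steps S500 to S550 — estimate the position, command motion along the target route, and finish on arrival — can be sketched with a simple waypoint follower. The step size, tolerance, and point-mass motion model are assumptions; a real vehicle controller would use the actual acceleration/deceleration and steering control.

```python
# Illustrative loop mirroring steps S510-S550 of the follow-up control
# process. The point-mass stepping model and all tolerances are
# assumptions, not the patented control law.
import math

def follow_route(start, route, step=0.5, tol=0.25, max_iters=1000):
    """Move a point from `start` along `route` (last element is SEP)."""
    pos = start
    sep = route[-1]
    idx = 0                                  # next waypoint to reach
    for _ in range(max_iters):
        if math.dist(pos, sep) <= tol:       # S550: reached SEP
            return pos, True
        # S510: skip waypoints already reached within tolerance
        while idx < len(route) - 1 and math.dist(pos, route[idx]) <= tol:
            idx += 1
        # S520: command one motion step toward the next waypoint
        target = route[idx]
        d = math.dist(pos, target)
        s = min(step, d) / d
        pos = (pos[0] + (target[0] - pos[0]) * s,
               pos[1] + (target[1] - pos[1]) * s)
    return pos, False                        # safety bound: gave up
```

A two-waypoint straight route is followed to the final position within the arrival tolerance.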
  • In step S560, the parking control unit 53 searches for an avoidance route that avoids the obstacle OB on the target route TP and reaches the planned parking position SEP, and attempts to generate the avoidance route. The avoidance route search is the same as the processing in step S340, so its description is omitted.
  • For example, as shown in FIG. 18, the parking control unit 53 displays on the touch panel display unit 46 a message image Gm indicating that the avoidance route is being searched. By notifying the user of this internal state of the system, the user can be prepared for a route change during automatic parking.
  • The notification that the avoidance route is being searched is not limited to the display of the message image Gm on the touch panel display unit 46; for example, the user may be notified by voice that the avoidance route is being searched.
  • In step S570, the parking control unit 53 determines whether the avoidance route has been generated. If the avoidance route could be generated, the parking control unit 53 replaces the target route TP with the avoidance route and displays information about the avoidance route on the touch panel display unit 46 in step S580. For example, the parking control unit 53 displays, on the touch panel display unit 46, an image indicating the avoidance route superimposed on the vehicle peripheral image Ga and the overhead image Gh. In addition, the parking control unit 53 uses the speaker 47 to announce to the user that the target route TP is replaced with the avoidance route. A message indicating that the target route is to be changed to the avoidance route may also be displayed on the touch panel display section 46. Since changing the target route to the avoidance route is not intended by the user, it is desirable to combine the display of the avoidance route on the touch panel display unit 46 with the display of a message regarding the route change and a voice notification.
  • If the avoidance route could not be generated, the parking control unit 53 identifies, in step S590, a stop position TSP where the vehicle V can be stopped within the parking lot PL. Specifically, the parking control unit 53 identifies the stop position TSP using the recognition result of the free space recognition. For example, as shown in FIG. 19, when the second parking space SP2 is vacant, the parking control unit 53 identifies the second parking space SP2 as the stop position TSP. This stop position TSP is a stoppable position different from the planned parking position SEP. The parking control unit 53 may also specify a free space other than the second parking space SP2 as the stop position TSP.
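Choosing an alternative stop position TSP from recognized free spaces (step S590) might look like the following nearest-free-space rule. The rule itself is an assumption; the disclosure only states that free-space recognition results are used.

```python
# Hedged sketch of step S590: pick an alternative stop position TSP from
# the recognized free spaces when the planned parking position is blocked.
# The nearest-free-space rule and identifiers are assumptions.
import math

def choose_stop_position(free_spaces, current_pos, blocked):
    """free_spaces: {space_id: (x, y)}; blocked: ids that cannot be used
    (e.g. the planned parking position SEP). Returns the nearest usable
    id, or None when every free space is blocked."""
    usable = {k: v for k, v in free_spaces.items() if k not in blocked}
    if not usable:
        return None
    return min(usable, key=lambda k: math.dist(current_pos, usable[k]))
```

With the learned space blocked, the nearest remaining vacant space is selected; with everything blocked, no stop position is returned.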
  • In step S600, the parking control unit 53 provides the user with information regarding the stop position TSP and the route to the stop position TSP, and recommends that the user stop at a position different from the planned parking position SEP.
  • For example, the parking control unit 53 displays, in the left area of the touch panel display unit 46, an image in which an icon P is superimposed on the portion corresponding to the stop position TSP shown in the vehicle peripheral image Ga, as shown in FIG.
  • the parking control unit 53 displays, in the right area of the touch panel display unit 46, an image showing the route to the stop position TSP superimposed on the bird's-eye view image Gh.
  • In step S600, the parking control unit 53 also uses the speaker 47 to announce that the vehicle cannot be parked at the planned parking position SEP and will stop at a position different from the planned parking position SEP.
  • The processing of step S600 is performed by the information providing section 58 of the parking control section 53.
  • The parking control unit 53 displays a start button STB in the area below the vehicle periphery image Ga on the touch panel display unit 46, and when the start button STB is touch-operated, the process proceeds to the another position stop processing of step S610.
  • In step S610, the parking control unit 53 executes the another position stop processing for moving the vehicle V to the stop position TSP and stopping it there.
  • The parking control unit 53 starts automatic parking of the vehicle V at the stop position TSP by performing vehicle motion control such as acceleration/deceleration control and steering control of the vehicle V.
  • The process of step S610 is performed by the follow-up control section 57 of the parking control section 53.
  • the parking control unit 53 stops the vehicle V at the stop position TSP and exits from this process.
  • When parking at the planned parking position SEP is not possible, the parking control unit 53 specifies, in step S270, a stoppable position where the vehicle V can be stopped.
  • The parking control unit 53 specifies the stoppable position when the object avoidance route could not be generated at the time the target route TP was generated.
  • The parking control unit 53 uses the recognition result of the free space recognition to specify the stoppable position. For example, as shown in FIG. 19, when the second parking space SP2 is vacant, the parking control unit 53 identifies the second parking space SP2 as a stoppable position. This stoppable position is different from the planned parking position SEP.
  • the parking control unit 53 may specify a free space other than the second parking space SP2 as a possible stop position.
  • In step S280, the parking control unit 53 provides the user with information on the stoppable position and the route to the stoppable position, and recommends that the user stop at a position different from the planned parking position SEP.
  • the touch panel display unit 46 displays an icon P superimposed on the stopable position shown in the vehicle peripheral image Ga or the bird's-eye view image Gh.
  • the parking control unit 53 uses the speaker 47 to announce that the vehicle cannot be parked at the planned parking position SEP and will stop at a position different from the planned parking position SEP.
  • the processing of step S280 is performed by the information providing section 58 of the parking control section 53.
  • The parking control unit 53 then displays the start button STB on the touch panel display unit 46, and when the start button STB is touch-operated, the process proceeds to the other-position stop processing of step S290.
  • In step S290, the parking control unit 53 executes the other-position stop processing for moving the vehicle V to the possible stop position and stopping it there. Specifically, the parking control unit 53 performs vehicle motion control such as acceleration/deceleration control and steering control of the vehicle V to start automatic movement of the vehicle V to the possible stop position. The process of step S290 is performed by the follow-up control section 57 of the parking control unit 53. The parking control unit 53 stops the vehicle V at the possible stop position and exits this process.
  • The parking assistance device 5 and the parking assistance method described above generate a target route TP that the vehicle V should follow when it is parked, based on route information including the travel route taken when the user parked the vehicle V and information about the surroundings of the vehicle V on that travel route.
  • The parking assistance device 5 and the parking assistance method perform a follow-up control process for automatically moving the vehicle V to the planned parking position SEP along the target route TP.
  • The parking assistance device 5 and the parking assistance method provide the information regarding the planned parking position SEP included in the above route information to the user in a visual manner before the follow-up control process starts.
  • The user can therefore start automatic parking of the vehicle V at the planned parking position SEP after clearly grasping that position. According to the parking assistance device 5 and the parking assistance method of the present disclosure, the usability of automatic parking can thus be improved.
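The disclosure records the route driven during the user's own parking operation and later reuses it as the target route TP. It does not specify how the recorded trace becomes a route; as one hedged illustration, the recorded positions could be resampled into waypoints evenly spaced by arc length (the function name, the spacing parameter, and the 2-D point format are assumptions of this sketch, not the disclosed implementation):

```python
import math

def resample_route(waypoints, spacing):
    """Resample a recorded polyline of (x, y) positions so that the
    returned target-route points are evenly spaced by arc length."""
    # cumulative arc length along the recorded polyline
    cum = [0.0]
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        cum.append(cum[-1] + math.hypot(x1 - x0, y1 - y0))
    total = cum[-1]
    out = []
    s, i = 0.0, 0
    while s <= total:
        # advance to the segment that contains arc length s
        while cum[i + 1] < s:
            i += 1
        seg = cum[i + 1] - cum[i]
        t = 0.0 if seg == 0 else (s - cum[i]) / seg
        (x0, y0), (x1, y1) = waypoints[i], waypoints[i + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
        s += spacing
    if out[-1] != waypoints[-1]:
        out.append(waypoints[-1])  # always end exactly at the parking position
    return out
```

A real implementation would also carry heading and gear (forward/reverse) for each waypoint; this sketch keeps only positions to show the resampling idea.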
  • The information providing unit 58 superimposes a virtual vehicle image Gv showing the vehicle V on the planned parking position SEP in the image obtained as information about the planned parking position SEP in the route information, and provides the resulting image to the user before the follow-up control process starts. The user can thus visually grasp the parking state of the vehicle V at the planned parking position SEP before the follow-up control process starts. That is, the user can easily visualize, before the follow-up control process starts, the position at which the vehicle V will be parked by automatic parking. This increases user satisfaction and greatly contributes to improving the usability of automatic parking.
  • The virtual vehicle image Gv is an image showing the vehicle V in a translucent manner. This allows the user to recognize that the image showing the parking state of the vehicle V at the planned parking position SEP does not show the current position of the vehicle V, and prevents the misunderstanding that it does.
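Rendering the virtual vehicle image Gv translucently amounts to alpha compositing a vehicle sprite over the camera image. The disclosure gives no implementation; the following is a minimal per-pixel sketch (the function names, the `None`-as-transparent convention, and the alpha value of 0.5 are assumptions):

```python
def blend_translucent(base_rgb, sprite_rgb, alpha):
    """Alpha-blend one sprite pixel over one camera-image pixel.
    alpha=0.5 gives the semi-transparent look that signals "this is not
    the vehicle's current position"; alpha=1.0 would look like a real car."""
    return tuple(round((1 - alpha) * b + alpha * s)
                 for b, s in zip(base_rgb, sprite_rgb))

def overlay_vehicle(image, sprite, top_left, alpha=0.5):
    """Superimpose a translucent vehicle sprite (2-D list of RGB tuples)
    onto `image` at `top_left` = (row, col). Sprite pixels of value None
    are treated as fully transparent background."""
    r0, c0 = top_left
    for dr, row in enumerate(sprite):
        for dc, px in enumerate(row):
            if px is None:
                continue
            image[r0 + dr][c0 + dc] = blend_translucent(
                image[r0 + dr][c0 + dc], px, alpha)
    return image
```

In practice the sprite would first be projected into the camera view at the planned parking position SEP; the blend itself is the part that produces the translucent appearance.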
  • The information providing unit 58 also superimposes a parking frame image Gf indicating the planned parking position SEP on the image obtained as information about the surroundings of the planned parking position SEP in the route information, and provides the resulting image to the user before the follow-up control process starts. By emphasizing the planned parking position SEP in this way, the user can more easily grasp it visually before the follow-up control process starts.
  • The information providing unit 58 provides the user with a three-dimensional display of an image showing the surroundings of the planned parking position SEP, and changes the viewpoint of the three-dimensional display according to an operation signal from the touch panel display unit 46 operated by the user. This makes it possible to provide the user with detailed information regarding the planned parking position SEP. In particular, since the viewpoint of the three-dimensional display can be changed by touch operation of the touch panel display unit 46, information can be provided in accordance with the user's intention.
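One common way to realize a touch-driven viewpoint change is an orbit camera circling the planned parking position. The mapping below from drag pixels to yaw/pitch (class name, gain, and clamp limits) is an illustrative assumption, not the disclosed implementation:

```python
import math

class OrbitViewpoint:
    """Virtual camera orbiting a target point (e.g. the planned parking
    position SEP). A touch-drag delta in pixels is mapped to yaw/pitch;
    pitch is clamped so the camera stays above ground and below zenith."""
    def __init__(self, target, radius, yaw=0.0, pitch=math.radians(30)):
        self.target, self.radius = target, radius
        self.yaw, self.pitch = yaw, pitch

    def on_drag(self, dx_px, dy_px, gain=0.005):
        # horizontal drag orbits around the target, vertical drag tilts
        self.yaw = (self.yaw + dx_px * gain) % (2 * math.pi)
        self.pitch = min(math.radians(85),
                         max(math.radians(5), self.pitch + dy_px * gain))

    def camera_position(self):
        tx, ty, tz = self.target
        cp = math.cos(self.pitch)
        return (tx + self.radius * cp * math.cos(self.yaw),
                ty + self.radius * cp * math.sin(self.yaw),
                tz + self.radius * math.sin(self.pitch))
```

The returned camera position would feed the renderer that re-projects the surround-view image from the new viewpoint.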
  • When the route information includes a plurality of candidate positions for the planned parking position SEP, the information providing unit 58 presents information regarding those candidate positions to the user in a visual manner. The information providing unit 58 then provides information prompting the user to select the planned parking position SEP from among the plurality of candidate positions. If the user can select the intended parking position SEP in this way, parking assistance that appropriately reflects the user's intention can be realized.
  • When the vehicle V cannot be parked at the planned parking position SEP after the follow-up control process starts, the follow-up control unit 57 specifies a possible stop position different from the planned parking position SEP, based on information about the surroundings of the vehicle V obtained after the start of the process. The information providing unit 58 then provides information recommending that the vehicle be stopped at the possible stop position. In this way, even if the vehicle V cannot be parked at the planned parking position SEP after automatic parking starts, the user is encouraged to stop at a possible stop position different from the planned parking position SEP.
  • A situation in which the vehicle V cannot be parked at the planned parking position SEP arises, for example, when another vehicle is parked at the planned parking position SEP, or when an obstacle OB placed there obstructs the parking of the vehicle V.
  • After the follow-up control process starts, the information providing unit 58 superimposes a target route image Gt indicating the target route TP on a surrounding image showing the surroundings of the vehicle V obtained during execution of the follow-up control process, and provides the resulting image to the user. In this way, it is desirable that the route along which the vehicle V is scheduled to travel be provided visually to the user during automatic parking. This allows the user to have the vehicle V automatically parked at the planned parking position SEP after clearly grasping its travel route to that position.
  • The information providing unit 58 identifies the portion of the target path TP in front of an object appearing in the surrounding image as the front path, and the portion of the target path TP behind the object as the back path. The information providing unit 58 then superimposes the portion Gtb corresponding to the back path and the portion Gtf corresponding to the front path of the target route image Gt on the surrounding image in different manners. In this way, it is desirable that the target route TP of the vehicle V be presented so that the part of the route the user can actually see is distinguished from the part the user cannot see.
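The front-path/back-path split is an occlusion test between the camera and each route point. The disclosure does not state how the split is computed; the plan-view sketch below labels a route point as part of the back path when the line of sight from the camera to the point crosses an obstacle's bounding box (the 2-D slab test and the axis-aligned box representation are assumptions):

```python
def segment_hits_box(p, q, box):
    """True if segment p->q intersects the axis-aligned box
    (xmin, ymin, xmax, ymax). Standard 2-D slab clipping."""
    (px, py), (qx, qy) = p, q
    xmin, ymin, xmax, ymax = box
    dx, dy = qx - px, qy - py
    t0, t1 = 0.0, 1.0
    for d, lo, hi, o in ((dx, xmin, xmax, px), (dy, ymin, ymax, py)):
        if d == 0:
            if o < lo or o > hi:  # parallel and outside this slab
                return False
        else:
            ta, tb = (lo - o) / d, (hi - o) / d
            if ta > tb:
                ta, tb = tb, ta
            t0, t1 = max(t0, ta), min(t1, tb)
            if t0 > t1:
                return False
    return True

def split_front_back(route_pts, camera_xy, boxes):
    """Label each target-route point 'front' (visible from the camera) or
    'back' (line of sight crosses an obstacle box), so the two parts can
    be drawn in different styles (Gtf vs Gtb)."""
    labels = []
    for pt in route_pts:
        occluded = any(segment_hits_box(camera_xy, pt, b) for b in boxes)
        labels.append('back' if occluded else 'front')
    return labels
```

A production system would more likely test against a depth buffer of the rendered scene, but the per-point visibility decision is the same idea.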
  • The information providing unit 58 also provides the user with an illustration image Gi showing the relationship among the current position of the vehicle V, the target route TP, and the planned parking position SEP. Since the illustration image Gi contains less extraneous information than a captured image, the current position of the vehicle V, the target route TP, and the planned parking position SEP stand out. Providing the illustration image Gi to the user therefore makes it easier to convey an overview of the automatic parking.
  • When an obstacle OB is found on the target route TP, the follow-up control unit 57 tries to generate an avoidance route that avoids the obstacle OB and reaches the planned parking position SEP. When the follow-up control unit 57 has generated the avoidance route, the information providing unit 58 provides the user with information regarding the avoidance route in a visual manner. In this way, when there is an obstacle OB on the target route TP, it is desirable to provide the user with information on the avoidance route in a visual manner.
  • When the avoidance route cannot be generated, the follow-up control unit 57 identifies a stop position TSP where the vehicle V can be stopped, and the information providing unit 58 provides the user with information regarding the stop position TSP and the route to it. In this way, when the avoidance route cannot be generated, it is desirable to provide the user with information regarding the stop position TSP of the vehicle V as an alternative to the avoidance route.
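Neither the avoidance-route search nor its failure case is tied to a particular algorithm in the disclosure. As one hedged sketch, a breadth-first search over an occupancy grid either returns a detour around the obstacle OB or reports that no avoidance route exists, which is the case where a stop position TSP would be identified instead (the grid representation, function name, and 4-connected motion are assumptions):

```python
from collections import deque

def find_avoidance_route(start, goal, blocked, width, height):
    """Breadth-first search on an occupancy grid. Returns a list of cells
    from start to goal that detours around `blocked` cells, or None when
    no avoidance route exists."""
    q = deque([start])
    came = {start: None}  # visited set doubling as predecessor map
    while q:
        cur = q.popleft()
        if cur == goal:
            path = []
            while cur is not None:  # walk predecessors back to start
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        x, y = cur
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < width and 0 <= ny < height
                    and (nx, ny) not in blocked and (nx, ny) not in came):
                came[(nx, ny)] = cur
                q.append((nx, ny))
    return None
```

A vehicle-grade planner would additionally respect the car's turning radius and footprint (e.g. a kinematic planner rather than grid BFS); the sketch only shows the succeed-or-fall-back structure described above.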
  • The information providing unit 58 also provides the user with information indicating that the follow-up control unit 57 is generating the avoidance route. It is desirable to notify the user that the avoidance route is being generated before presenting the route itself: the route change is then conveyed in stages rather than suddenly, which can be expected to reduce the psychological burden on the user accompanying the change.
  • When an obstacle OB is found on the target route TP before the follow-up control process starts, the route generation unit 55 attempts to generate an object avoidance route that avoids the obstacle OB and reaches the planned parking position SEP. When the object avoidance route is generated by the route generation unit 55, the information providing unit 58 provides the user with information on the object avoidance route in a visual manner before the follow-up control process starts. In this way, when it is found before automatic parking starts that there is an obstacle OB on the target route TP, it is desirable to provide the user with visual information regarding the object avoidance route before automatic parking starts.
  • When the object avoidance route cannot be generated by the route generation unit 55, the follow-up control unit 57 identifies a possible stop position where the vehicle V can be stopped, and the information providing unit 58 provides the user with information regarding the possible stop position and the route to it. In this way, when it is determined before automatic parking starts that parking at the planned parking position SEP is impossible, it is desirable to provide the user with information regarding a possible stop position of the vehicle V as an alternative to the object avoidance route.
  • Although the parking assistance device 5 described above displays the virtual parking image Gp on the touch panel display unit 46 as information about the planned parking position SEP before the follow-up control process starts, the present disclosure is not limited to this. For example, the parking assistance device 5 may provide the user with an image of the detection results of a search wave sensor at the planned parking position SEP.
  • The parking assistance device 5 described above displays not only the virtual parking image Gp but also the vehicle periphery image Ga and the illustration image Gi on the touch panel display unit 46 before the follow-up control process starts, but it is not limited to this; for example, only the virtual parking image Gp may be displayed.
  • In the above embodiment, the vehicle periphery image Ga is displayed in the left area of the touch panel display unit 46, and the overhead image Gh, the virtual parking image Gp, and the illustration image Gi are displayed in the right area, but the display layout is not limited to this. The image display layout, image sizes, and the like on the touch panel display unit 46 may differ from those described above.
  • Although the HMI 45 has the touch panel display unit 46 in the above embodiment, the HMI 45 is not limited to this. For example, the HMI 45 may have a display operated by an operation device such as a remote controller instead of the touch panel display unit 46, or the HMI 45 may be implemented using part of the navigation system.
  • Although the touch panel display unit 46 also serves as the operation unit, the operation unit and the display unit may be configured separately. The operation unit is not limited to touch operation and may, for example, be operated by the user's voice.
  • In the above embodiment, the parking assistance device 5 superimposes the virtual vehicle image Gv and the parking frame image Gf on an image obtained as information about the area around the planned parking position SEP and provides the result to the user, but it is not limited to this. The parking assistance device 5 may provide the user with the image itself, or with an image on which only one of the virtual vehicle image Gv and the parking frame image Gf is superimposed. The virtual vehicle image Gv is not limited to showing the vehicle V in a translucent manner and may show the vehicle V in an opaque manner.
  • In the above embodiment, the parking assistance device 5 provides the user with a three-dimensional display of an image showing the surroundings of the planned parking position SEP and changes the display according to the user's operation of the operation unit, but it is not limited to this. For example, the parking assistance device 5 may provide the user with a two-dimensional display of an image showing the surroundings of the planned parking position SEP.
  • When the route information includes a plurality of candidate positions for the planned parking position SEP, it is desirable that the parking assistance device 5 provide the user with information regarding those candidate positions in a visual manner, but it is not limited to this. For example, the parking assistance device 5 may automatically set one of the candidate positions as the planned parking position SEP based on predetermined criteria.
  • In the above embodiment, the parking assistance device 5 allows the user to visually grasp the progress of automatic parking using the progress bar PB or the like, but it is not limited to this; for example, the progress of automatic parking may be conveyed audibly.
  • In the above embodiment, the parking assistance device 5 changes the angle of the virtual viewpoint of the virtual viewpoint image according to the distance to the target, but it is not limited to this; the angle of the virtual viewpoint may be kept constant regardless of the distance to the target.
  • In the above embodiment, when the vehicle cannot be parked at the planned parking position SEP after automatic parking starts, the parking assistance device 5 prompts the user to stop the vehicle at a possible stop position different from the planned parking position SEP, but it is not limited to this. The parking assistance device 5 may instead notify the driver of the situation, stop the vehicle V on the spot, and forcibly terminate the automatic parking.
  • In the above embodiment, after the follow-up control process starts, the parking assistance device 5 superimposes the target route image Gt indicating the target route TP on the surrounding image obtained during execution of the follow-up control process, but it is not limited to this; the surrounding image obtained during execution of the follow-up control process may be displayed as it is.
  • In the above embodiment, the parking assistance device 5 displays the part of the target route image Gt behind an object and the part in front of the object in different manners, but it is not limited to this; for example, both parts may be displayed in the same manner.
  • Although it is desirable that the parking assistance device 5 provide the user with an illustration image Gi showing the relationship among the current position of the vehicle V, the target route TP, and the planned parking position SEP during automatic parking, the illustration image Gi need not be provided.
  • Although it is desirable that the parking assistance device 5 notify the user when searching for a route that avoids an obstacle OB on the target route TP, it is not limited to this, and no notification need be given.
  • In the above embodiment, the parking assistance device 5 searches for a route that avoids the obstacle OB when there is an obstacle OB on the target route TP, but it is not limited to this. For example, the parking assistance device 5 may prompt the vehicle to stop on the spot or request designation of a parking position without searching for a route that avoids the obstacle OB.
  • Although the parking assistance device 5 of the present disclosure is applied above to parking assistance in a parking lot PL having a plurality of parking spaces SP, its application is not limited to this. The parking assistance device 5 can also be applied to parking assistance on land with a single parking space SP, such as in front of one's own house.
  • The controller and method of the present disclosure may be implemented by a dedicated computer provided by configuring a processor and memory programmed to perform one or more functions embodied by a computer program. Alternatively, the controller and method of the present disclosure may be implemented by a dedicated computer provided by configuring the processor with one or more dedicated hardware logic circuits. Alternatively, the controller and method of the present disclosure may be implemented by one or more dedicated computers configured by a combination of a processor and memory programmed to perform one or more functions and a processor configured with one or more hardware logic circuits. The computer program may also be stored, as instructions to be executed by a computer, on a computer-readable non-transitory tangible recording medium.
  • A parking assistance device comprising: a route generation unit (55) that generates a target route (TP) to be followed by a vehicle (V) when the vehicle is parked, based on route information including a travel route of the vehicle when a parking operation of the vehicle is performed by a user and information about the surroundings of the vehicle on the travel route; a follow-up control unit (57) that performs a follow-up control process for automatically moving the vehicle to a planned parking position (SEP) along the target route; and an information providing unit (58) that provides information to the user, wherein the information providing unit provides the information regarding the planned parking position included in the route information to the user in a visual manner before the follow-up control process starts.
  • The information providing unit superimposes a virtual vehicle image (Gv) showing the vehicle on the planned parking position in an image obtained as information about the planned parking position in the route information, and provides the resulting image to the user.
  • The information providing unit superimposes a parking frame image (Gf) indicating the planned parking position on an image obtained as information about the surroundings of the planned parking position in the route information, and provides the resulting image to the user before the follow-up control process starts.
  • The parking assistance device according to any one of Disclosures 1 to 4, wherein the information providing unit provides the user with a three-dimensional display of an image showing the surroundings of the planned parking position and changes the viewpoint of the three-dimensional display in response to an operation signal from an operation unit (46) operated by the user.
  • The parking assistance device according to any one of Disclosures 1 to 5, wherein, when the route information includes a plurality of candidate positions for the planned parking position, the information providing unit displays information regarding the plurality of candidate positions to the user in a visual manner and provides information prompting selection of the planned parking position from among the plurality of candidate positions.
  • The parking assistance device according to any one of the above disclosures, wherein, when the vehicle cannot be parked at the planned parking position after the start of the follow-up control process, the follow-up control unit identifies a possible stop position different from the planned parking position based on information about the vehicle's surroundings obtained after the start of the process, and the information providing unit provides information recommending that the vehicle be stopped at the possible stop position.
  • The parking assistance device according to any one of Disclosures 1 to 7, wherein, after the follow-up control process starts, the information providing unit superimposes a target route image (Gt) indicating the target route on a surrounding image showing the surroundings of the vehicle obtained during execution of the follow-up control process, and provides the resulting image to the user.
  • The parking assistance device according to any one of Disclosures 1 to 9, wherein, when an obstacle is found on the target route, the follow-up control unit tries to generate an avoidance route that avoids the obstacle and reaches the planned parking position, and when the follow-up control unit generates the avoidance route, the information providing unit provides the user with information regarding the avoidance route in a visual manner.
  • The follow-up control unit identifies a stop position where the vehicle can be stopped when the avoidance route cannot be generated.
  • The parking assistance device according to any one of Disclosures 1 to 13, wherein, when an obstacle is found on the target route before the follow-up control process starts, the route generation unit attempts to generate an object avoidance route that avoids the obstacle and reaches the planned parking position, and the information providing unit provides the user with information regarding the object avoidance route in a visual manner before the follow-up control process starts.
  • The parking assistance device according to Disclosure 14, wherein the follow-up control unit identifies a possible stop position where the vehicle can be stopped when the object avoidance route cannot be generated by the route generation unit, and the information providing unit provides the user with information regarding the possible stop position and the route to the possible stop position.
  • A parking assistance method comprising: generating a target route (TP) to be followed by a vehicle (V) when the vehicle is parked, based on route information including a travel route of the vehicle when a parking operation of the vehicle is performed by a user and information about the surroundings of the vehicle on the travel route; performing a follow-up control process for automatically moving the vehicle to a planned parking position (SEP) along the target route; and providing information to the user, wherein providing the information to the user includes providing the information regarding the planned parking position included in the route information to the user in a visual manner before the follow-up control process starts.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

The present invention relates to a parking assistance device (5) comprising a route generation unit (55) that generates a target route (TP) to be followed by a vehicle during parking, based on route information including a travel route of the vehicle when a parking operation of the vehicle (V) is performed by a user and information about the surroundings of the vehicle on the travel route. The parking assistance device comprises: a follow-up control unit (57) that performs a follow-up control process for automatically moving the vehicle to a planned parking position along the target route; and an information providing unit (58) that provides information to the user. The information providing unit provides information regarding the planned parking position included in the route information to the user in a visual manner before the follow-up control process starts.
PCT/JP2022/031613 2021-08-24 2022-08-22 Dispositif d'aide au stationnement et procédé d'aide au stationnement WO2023027039A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202280055839.8A CN117836183A (zh) 2021-08-24 2022-08-22 停车辅助装置、停车辅助方法
JP2023543911A JPWO2023027039A1 (fr) 2021-08-24 2022-08-22

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021136598 2021-08-24
JP2021-136598 2021-08-24

Publications (1)

Publication Number Publication Date
WO2023027039A1 true WO2023027039A1 (fr) 2023-03-02

Family

ID=85323198

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/031613 WO2023027039A1 (fr) 2021-08-24 2022-08-22 Dispositif d'aide au stationnement et procédé d'aide au stationnement

Country Status (3)

Country Link
JP (1) JPWO2023027039A1 (fr)
CN (1) CN117836183A (fr)
WO (1) WO2023027039A1 (fr)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001133277A (ja) * 1999-11-09 2001-05-18 Equos Research Co Ltd ナビゲーション装置
JP2004340827A (ja) * 2003-05-16 2004-12-02 Xanavi Informatics Corp ルート図表示方法および表示制御装置
JP2010034645A (ja) * 2008-07-25 2010-02-12 Nissan Motor Co Ltd 駐車支援装置および駐車支援方法
JP2011035729A (ja) * 2009-08-03 2011-02-17 Alpine Electronics Inc 車両周辺画像表示装置および車両周辺画像表示方法
JP2011079372A (ja) * 2009-10-05 2011-04-21 Sanyo Electric Co Ltd 駐車支援装置
JP2012066614A (ja) * 2010-09-21 2012-04-05 Aisin Seiki Co Ltd 駐車支援装置
JP2018184091A (ja) * 2017-04-26 2018-11-22 株式会社Jvcケンウッド 運転支援装置、運転支援方法およびプログラム
JP2018203214A (ja) * 2017-06-09 2018-12-27 アイシン精機株式会社 駐車支援装置、駐車支援方法、運転支援装置、および運転支援方法
WO2019058781A1 (fr) * 2017-09-20 2019-03-28 日立オートモティブシステムズ株式会社 Dispositif d'aide au stationnement
DE102018220298A1 (de) * 2017-11-28 2019-05-29 Jaguar Land Rover Limited Einparkassistenzverfahren und Vorrichtung
WO2020095636A1 (fr) * 2018-11-09 2020-05-14 日立オートモティブシステムズ株式会社 Dispositif d'aide au stationnement et procédé d'aide au stationnement
US20200307616A1 (en) * 2019-03-26 2020-10-01 DENSO TEN AMERICA Limited Methods and systems for driver assistance


Also Published As

Publication number Publication date
CN117836183A (zh) 2024-04-05
JPWO2023027039A1 (fr) 2023-03-02

Similar Documents

Publication Publication Date Title
EP3367367B1 (fr) Dispositif d'aide au stationnement et procédé d'aide au stationnement
CN108140311B (zh) 停车辅助信息的显示方法及停车辅助装置
JP6493545B2 (ja) 情報提示装置及び情報提示方法
EP3650285B1 (fr) Procédé et dispositif d'aide au stationnement
JP6547836B2 (ja) 駐車支援方法及び駐車支援装置
US11479238B2 (en) Parking assist system
JP4614005B2 (ja) 移動軌跡生成装置
WO2020261781A1 (fr) Dispositif de commande d'affichage, programme de commande d'affichage et support lisible par ordinateur tangible persistant
JP7218822B2 (ja) 表示制御装置
JP7443705B2 (ja) 周辺監視装置
CN110831818B (zh) 泊车辅助方法以及泊车辅助装置
CN111891119A (zh) 一种自动泊车控制方法及***
US20220309803A1 (en) Image display system
US20220308345A1 (en) Display device
CN112124092A (zh) 驻车辅助***
US20200398865A1 (en) Parking assist system
WO2023027039A1 (fr) Dispositif d'aide au stationnement et procédé d'aide au stationnement
US11222552B2 (en) Driving teaching device
JP2022043996A (ja) 表示制御装置及び表示制御プログラム
JP7473087B2 (ja) 駐車支援装置、駐車支援方法
WO2023002863A1 (fr) Dispositif d'aide à la conduite, procédé d'aide à la conduite
WO2024157449A1 (fr) Procédé d'aide au stationnement et dispositif d'aide au stationnement
US20240075879A1 (en) Display system and display method
JP2023123353A (ja) 情報処理装置、および情報処理方法、並びにプログラム
JP2023028238A (ja) 障害物表示装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22861327

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023543911

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 202280055839.8

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE