WO2022090758A1 - Parking support method and parking support device - Google Patents
Parking support method and parking support device
- Publication number
- WO2022090758A1 (PCT/IB2020/000915)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- image
- parking space
- reference images
- parking
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/168—Driving aids for parking, e.g. acoustic or visual feedback on parking space
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/06—Automatic manoeuvring for parking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators ; Steering position determination; Steering aids
- B62D15/027—Parking aids, e.g. instruction means
- B62D15/0285—Parking performed automatically
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/586—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of parking space
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/14—Traffic control systems for road vehicles indicating individual free spaces in parking areas
- G08G1/141—Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces
- G08G1/143—Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces inside the vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
Definitions
- the present invention relates to a parking support method and a parking support device.
- conventionally, there is known a parking support device in which the location information of a parking space is registered in advance and, when parking in that parking space from the next time onward, the location information of the parking space is read from the memory and parking support control is executed (Patent Document 1).
- this parking support device uses four cameras provided on the front bumper, rear trunk, and left and right door mirrors of the vehicle to generate a viewpoint image of the vehicle and its surroundings from one virtual viewpoint, for example a bird's-eye view image looking down on the vehicle from a position directly above it, and the parking space reflected in this image is registered in the memory or collated with the template image registered in the memory.
- one bird's-eye view image is generated, and this is stored as an image of the surroundings of the vehicle captured at the current stop position or as a template image. Therefore, if the current image around the vehicle, or the template image itself registered in the memory in advance, is an image unsuitable for the recognition process, for example one lacking feature points that are easy to match, there is a problem that the recognition accuracy of the target parking space to be parked in deteriorates.
- the problem to be solved by the present invention is to provide a parking support method and a parking support device having high recognition accuracy of the target parking space.
- the present invention solves the above-mentioned problem by storing in advance a plurality of different reference images including the target parking space, generated by using an image pickup device that images the surroundings of the own vehicle, and, when the own vehicle is subsequently parked in the target parking space, recognizing the target parking space by collating the current image of the surroundings of the own vehicle with at least one of the plurality of different reference images.
- since the matching process for the target parking space is executed using a plurality of different reference images, it is possible to provide a parking support method and a parking support device with high recognition accuracy of the target parking space.
- FIG. 1 is a block diagram showing an embodiment of the parking support system to which the parking support method and the parking support device of the present invention are applied. The subsequent figures are: a perspective view showing the image pickup device of the parking support system of FIG. 1; an example of a display image showing a bird's-eye view image generated by the image processing device of the parking support system of FIG. 1; a top view showing the distance measuring device of the parking support system of FIG. 1; top views showing examples of line patterns defining parking spaces according to the embodiment of the present invention; a top view showing an example of a parking space; and a figure showing an example of the first reference image registered in the parking support device of FIG. 1.
- FIG. 1 is a block diagram showing an embodiment of a parking support system 1 to which the parking support method and the parking support device of the present invention are applied.
- although the parking support system 1 described below is not particularly limited, a parking space of a parking lot that is used relatively frequently, such as a parking lot at home or at work, is registered in the parking support system 1 in advance. After that, when the own vehicle is parked in the registered parking space, the parking operation is supported by using the information of the parking space registered in advance and autonomous driving control.
- the parking support system 1 of the present embodiment includes a parking support device 10, an image pickup device 20, an image processing device 30, a distance measuring device 40, a vehicle controller 50, a drive system 60, a vehicle speed sensor 70, and a steering angle sensor 80. These devices are connected by a CAN (Controller Area Network) or other in-vehicle LAN so that they can exchange information with one another.
- the parking support device 10 of the present embodiment includes a control device 11 and an output device 12.
- the control device 11 is a computer equipped with a ROM (Read Only Memory) 111 in which a parking support program is stored, a CPU (Central Processing Unit) 112 that executes the program stored in the ROM 111, and a RAM (Random Access Memory) 113 that functions as an accessible storage device.
- the ROM 111 stores, for example, a parking support program that detects a parking space in which the own vehicle can be parked, displays this parking space on the display 121, calculates a parking route for parking the own vehicle in the target parking space set by the driver, and executes autonomous driving of the own vehicle along the parking route.
- the output device 12 includes a display 121 for presenting image information such as a parking space as a parking candidate for the own vehicle to the driver, and a speaker 122 for outputting instructions to the driver by voice.
- in addition to autonomous parking, in which the steering, accelerator, and brake are all operated autonomously, the parking support device 10 of the present embodiment can also be applied to semi-autonomous parking, in which at least one of the steering, accelerator, and brake is operated manually and the remaining operations are controlled autonomously. Further, the parking support device 10 of the present embodiment can also be applied to parking support in which the driver is presented with a parking route and manually operates the steering, accelerator, and brake to park the vehicle in the target parking space. When executing autonomous or semi-autonomous parking, it is preferable to use an automatic return type switch such as a deadman switch.
- this is because the autonomous driving control of the own vehicle is executed only while the deadman switch is pressed, and the autonomous and semi-autonomous travel control of the own vehicle is canceled when the pressing of the deadman switch is released.
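As an illustrative sketch of the switch behavior described above (class and function names are hypothetical, not taken from the patent), an automatic-return deadman switch permits autonomous control only while it is held down:

```python
class DeadmanSwitch:
    """Automatic-return (momentary) switch: active only while held down."""

    def __init__(self):
        self.pressed = False

    def press(self):
        self.pressed = True

    def release(self):
        # An automatic-return switch springs back as soon as it is let go.
        self.pressed = False


def autonomous_control_allowed(switch: DeadmanSwitch) -> bool:
    # Autonomous / semi-autonomous parking control runs only while the
    # deadman switch is pressed; releasing it cancels the travel control.
    return switch.pressed
```

Releasing the switch at any point immediately withdraws permission, which is the safety property the automatic-return type provides over a latching switch.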
- the parking support device 10 may be remotely controlled by a signal transmitted from a portable terminal device (a device such as a smartphone or a PDA) capable of exchanging information with the parking support device 10.
- the parking support device 10 of the present embodiment can also be applied to remote parking in which the driver operates and parks his / her own vehicle from outside the vehicle.
- the image pickup device 20, the image processing device 30, and the distance measuring device 40 of the present embodiment are devices for detecting information on the driving environment around the entire circumference of the own vehicle (front, sides, and rear), including white lines indicating parking spaces, curbs, steps, or walls that mark parking spaces, obstacles around the own vehicle, and other conditions around the own vehicle.
- the image pickup device 20 of the present embodiment is an in-vehicle device for recognizing information on the traveling environment around the own vehicle from images.
- the image pickup device 20 acquires information on the driving environment around the own vehicle by imaging the surroundings of the own vehicle, including white lines indicating parking spaces existing around the own vehicle, curbs, steps, or walls that serve as marks of parking spaces, and obstacles around the own vehicle. The image pickup device 20 includes a camera provided with an image pickup element such as a CCD, an ultrasonic camera, an infrared camera, or another type of camera. The information on the driving environment around the own vehicle acquired by the image pickup device 20 is output to the control device 11.
- FIG. 2 is a perspective view showing the image pickup device 20 of the parking support system 1 of the present embodiment, and shows an arrangement example of the image pickup device 20 mounted on the own vehicle V1.
- the image pickup device 20a is arranged on the front grille portion of the own vehicle V1
- the image pickup device 20b is arranged under the left door mirror
- the image pickup device 20c is arranged under the right door mirror
- the image pickup device 20d is arranged on the upper part of the rear bumper.
- the image pickup devices 20a to 20d may be cameras provided with a wide-angle lens having a large viewing angle. As shown in FIG. 2, the image pickup device 20a mainly images from the right front to the left front of the own vehicle V1, the image pickup device 20b mainly images from the left front to the left rear, the image pickup device 20c mainly images from the left rear to the right rear, and the image pickup device 20d mainly images from the right rear to the right front.
- the image processing device 30 of the present embodiment is a device for generating a bird's-eye view image showing the surrounding situation of the own vehicle V1 when the own vehicle V1 is viewed from an upper virtual viewpoint.
- the virtual viewpoint is, for example, the virtual viewpoint VP1 shown in FIG.
- the image processing device 30 generates a bird's-eye view image using a plurality of captured images acquired by using the image pickup device 20.
- the image processing performed in the image processing device 30 to generate the bird's-eye view image is not particularly limited; for example, the method described in "Masayasu Suzuki, Satoshi Chino, Teruhisa Takano, 'Development of a Bird's-Eye View System', Preprints of the Society of Automotive Engineers of Japan Academic Lecture, 116-07 (2007-10), pp. 17-22" can be used.
- the bird's-eye view image generated by the image processing device 30 is output to the control device 11 and presented to the driver by the display 121 shown in FIGS. 2 and 3.
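A common way to realize the bird's-eye view generation described above is inverse perspective mapping: each output pixel of the top-down view is mapped back through a ground-plane homography to a pixel of a camera image. The following is a minimal, dependency-free sketch (the homography values and function names are illustrative assumptions, not the patent's implementation):

```python
def apply_homography(H, u, v):
    """Map pixel (u, v) through a 3x3 homography H (row-major nested lists)."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return (x / w, y / w)


def warp_to_birds_eye(image, H_inv, out_w, out_h):
    """Build a top-down view by sampling the source image for every output
    pixel (inverse mapping, so the result has no holes).  `image` is a 2D
    list of luminance values; `H_inv` maps output coords to source coords."""
    out = [[0] * out_w for _ in range(out_h)]
    src_h, src_w = len(image), len(image[0])
    for gy in range(out_h):
        for gx in range(out_w):
            u, v = apply_homography(H_inv, gx, gy)
            ui, vi = int(round(u)), int(round(v))
            if 0 <= vi < src_h and 0 <= ui < src_w:
                out[gy][gx] = image[vi][ui]
    return out
```

In practice one such homography is calibrated per camera (four cameras in this system), and the four warped images are stitched around the vehicle icon to form the composite bird's-eye view IM1.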
- the installation position of the display 121 is not particularly limited; the display 121 can be installed at any appropriate position that is visible to the driver.
- the image displayed on the display 121 is a bird's-eye view image from the virtual viewpoint VP1
- an image viewed in the horizontal direction from another virtual viewpoint VP2 can be generated by the same image conversion method and is used for parking space registration. Details will be described later.
- FIG. 3 is a display screen of the display 121 showing an example of a bird's-eye view image generated by the image processing device 30.
- FIG. 3 shows a scene in which the own vehicle V1 is driving slowly in a parking lot in order to find a suitable parking space P.
- the bird's-eye view image IM1 generated by the image processing device 30 is displayed on the left side of the display screen of the display 121 shown in FIG. 3, and the monitoring image IM2 for monitoring the surroundings of the own vehicle V1 is displayed on the right side.
- the own vehicle V1 is displayed in the center of the bird's-eye view image IM1, and parking spaces separated by white lines are displayed on the left and right sides of the own vehicle V1.
- the other vehicle V2 is displayed in the parking space in which the other vehicle V2 is detected.
- a broken line frame indicating the parking space P is displayed as a space in which the own vehicle V1 can park.
- a thick solid line frame is displayed in the target parking space Pt selected by the driver from the parking space P.
- the monitoring image IM2 is an image acquired from the image pickup device 20a arranged on the front grille portion of the own vehicle V1 and is displayed in order to present information on the surrounding environment ahead in the current traveling direction of the own vehicle V1.
- an obstacle such as another vehicle V2 is displayed.
- the distance measuring device 40 of the present embodiment is a device for calculating the relative distance and the relative speed between the own vehicle V1 and the object.
- the distance measuring device 40 is, for example, a radar device such as a laser radar, a millimeter wave radar, a laser range finder (LRF), a LiDAR (light detection and ranging) unit, an ultrasonic radar, or a sonar.
- the distance measuring device 40 detects the presence / absence of an object, the position of the object, and the distance to the object based on the received signal of the radar device or sonar.
- the object is, for example, an obstacle around the own vehicle V1, a pedestrian, a bicycle, or another vehicle.
- the information of the object detected by the distance measuring device 40 is output to the control device 11.
- when the object and the own vehicle V1 come close to each other, the control device 11 stops the own vehicle V1, and the display 121 and the speaker 122 notify the driver to that effect.
- FIG. 4 is a plan view showing the distance measuring device 40 of the parking support system 1 of the present embodiment, and shows an arrangement example when the distance measuring device 40 is mounted on the own vehicle V1.
- the own vehicle V1 shown in FIG. 4 is provided with four front distance measuring devices 40a for detecting an object in front of the own vehicle V1, two right-side distance measuring devices 40b for detecting an object on the right side of the own vehicle V1, two left-side distance measuring devices 40c for detecting an object on the left side of the own vehicle V1, and four rear distance measuring devices 40d for detecting an object behind the own vehicle V1.
- These ranging devices 40a to 40d can be installed, for example, in the front bumper and the rear bumper of the own vehicle V1.
- the vehicle controller 50 of the present embodiment is an in-vehicle computer such as an electronic control unit (ECU: Electronic Control Unit) for electronically controlling the drive system 60 that regulates the operation of the own vehicle V1.
- the vehicle controller 50 controls the drive device, the braking device, and the steering device included in the drive system 60, and when the parking support routine is executed, supports the driving movement of the own vehicle V1 from the current position to the target parking space Pt.
- the vehicle controller 50 receives a control command based on a parking route, a target vehicle speed, and a target steering angle calculated in advance from the parking support device 10. Details of these parking routes, target vehicle speeds, and target steering angles will be described later.
- the drive system 60 of the present embodiment includes various devices such as an internal combustion engine and/or an electric motor as a traveling drive source, a power transmission device including a drive shaft and a transmission for transmitting the output from the traveling drive source to the drive wheels, a drive device for controlling the power transmission device, a braking device for braking the wheels, and a steering device for steering the steered wheels according to the steering angle of the steering wheel.
- the vehicle controller 50 receives a control command based on the parking route and the target vehicle speed calculated in advance from the parking support device 10.
- the vehicle controller 50 generates a control signal to the drive device and the braking device of the drive system 60 based on the control command from the parking support device 10, and executes the control of the driving behavior including the acceleration / deceleration of the vehicle.
- the drive system 60 can autonomously control the vehicle speed of the own vehicle V1 by receiving the control signal from the vehicle controller 50.
- the drive system 60 includes a steering device.
- the steering device includes a steering actuator, and the steering actuator includes a motor and the like attached to the column shaft of the steering.
- the steering device of the drive system 60 is controlled by the vehicle controller 50 so that the vehicle travels while maintaining a predetermined lateral position (position in the left-right direction of the vehicle) of the own vehicle with respect to the pre-calculated parking route. Further, the vehicle controller 50 controls the steering device using at least one of the environmental information around the own vehicle V1 acquired by the image pickup device 20, the bird's-eye view image IM1 generated by the image processing device 30, and the information on obstacles, pedestrians, and other vehicles around the own vehicle V1 detected by the distance measuring device 40.
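The lateral-position-keeping control described above can be sketched as a simple proportional steering law. The gains, limits, and function name here are hypothetical illustrations, not values from the patent:

```python
def steering_command(lateral_error, heading_error,
                     k_lat=0.5, k_head=1.0, max_angle=0.6):
    """Compute a steering-angle command (radians) that steers the vehicle
    back toward the pre-calculated parking route: proportional feedback on
    the lateral offset and the heading offset, clipped to an actuator limit.

    lateral_error : signed offset from the route, metres (+ = left of route)
    heading_error : signed heading deviation from the route tangent, radians
    """
    angle = -k_lat * lateral_error - k_head * heading_error
    # Clip to the physical steering-actuator range.
    return max(-max_angle, min(max_angle, angle))
```

The vehicle controller would evaluate such a law each control cycle and send the resulting angle to the steering actuator on the column shaft.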
- the parking support device 10 transmits a control command based on the parking route and the target steering angle calculated in advance to the vehicle controller 50. Then, the vehicle controller 50 generates a control signal to the steering device of the drive system 60 based on the control command from the parking support device 10, and executes the steering control of the own vehicle V1.
- the drive system 60 can autonomously control the steering of the own vehicle V1 by receiving the control signal from the vehicle controller 50.
- the vehicle speed sensor 70 of the present embodiment is a sensor provided in the drive device of the drive system 60 for detecting the vehicle speed of the own vehicle V1.
- the steering angle sensor 80 of the present embodiment is a sensor provided in the steering device of the drive system 60 for detecting the steering angle of the own vehicle V1.
- the vehicle speed of the own vehicle V1 detected by the vehicle speed sensor 70 and the steering angle of the own vehicle V1 detected by the steering angle sensor 80 are output to the control device 11 via the vehicle controller 50.
- by the CPU 112 executing the parking support program stored in the ROM 111, the control device 11 of the present embodiment realizes an environment information acquisition function for acquiring environment information around the own vehicle V1, a parking space detection function for detecting a parking area in which the own vehicle V1 can park, a parking space display function for displaying the detected parking space P as the bird's-eye view image IM1 on the display 121, a parking route calculation function for calculating a parking route for the own vehicle V1 to move from the current position to the target parking space Pt and park, and a traveling operation planning function for planning the traveling operation for the own vehicle V1 to park along the calculated parking route.
- the control device 11 of the present embodiment acquires information on the surrounding environment, including the existence of obstacles located around the own vehicle V1, by the environment information acquisition function. For example, the control device 11 acquires, as information on the surrounding environment, the vehicle speed information of the own vehicle V1 detected by the vehicle speed sensor 70 and the steering angle information of the own vehicle V1 detected by the steering angle sensor 80. Further, for example, the control device 11 acquires, as information on the surrounding environment, the position information of the own vehicle V1 detected by the own vehicle position detection device equipped with a GPS unit, a gyro sensor, and the like, together with a three-dimensional high-precision map (including the position information of various facilities and specific points) stored in the ROM 111.
- the environment information acquisition function of the control device 11 may automatically determine whether or not the current traveling state of the own vehicle V1 corresponds to a scene in which the own vehicle V1 is to be parked in the target parking space Pt.
- the parking space detection function is a function for detecting a parking area where the own vehicle V1 can park by using the environmental information around the own vehicle V1 acquired by the environmental information acquisition function.
- the control device 11 creates a bird's-eye view image IM1 by the image processing device 30 using the captured image acquired by the image pickup device 20.
- the control device 11 detects a line defining the boundary of the area from the created bird's-eye view image IM1 by the parking space detection function, and identifies a candidate line defining the parking area from the detected line.
- the control device 11 determines, by the parking space detection function, whether or not the identified candidate line defines a parking area, and if it is determined that the candidate line defines a parking area, the control device 11 further determines whether or not the own vehicle V1 can park in the detected parking area.
- in order to detect the lines defining the boundary of the area from the bird's-eye view image IM1, the control device 11 performs edge detection on the bird's-eye view image IM1 by the parking space detection function and calculates the luminance difference (contrast). Then, the control device 11 identifies pixel sequences whose luminance difference is equal to or greater than a predetermined value from the bird's-eye view image IM1 and calculates the line thickness and line length.
- the color of the detected line does not necessarily have to be white, and may be red, yellow or other colors.
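The line-detection step just described, finding pixel runs whose luminance difference exceeds a threshold and measuring their extent, can be sketched per scanline as follows (a toy stand-in for the patent's edge detection, with hypothetical names):

```python
def detect_line_runs(row, threshold):
    """Find runs of bright pixels in one scanline of a bird's-eye view image.

    A pixel starts a run when the luminance step up from its left neighbour
    is at least `threshold` (a rising edge), and the run ends at the matching
    falling edge.  The run length approximates the painted line's thickness
    in that row.  Returns a list of (start_index, run_length) pairs.
    """
    runs = []
    start = None
    for i in range(1, len(row)):
        rising = row[i] - row[i - 1] >= threshold
        falling = row[i - 1] - row[i] >= threshold
        if rising and start is None:
            start = i
        elif falling and start is not None:
            runs.append((start, i - start))
            start = None
    return runs
```

Aggregating such runs across rows (and columns) yields candidate line segments whose thickness and length can then be checked against plausible white-line dimensions. Because only contrast is used, the scheme works for red, yellow, or other line colors as well.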
- the control device 11 uses a known image processing technique such as pattern matching.
- the pattern used for pattern matching is stored in advance in the ROM 111 of the control device 11.
- FIG. 5A shows a scene in which the own vehicle V1 is about to park in the target parking space Pt1 where the own vehicle V1 can park, which is located between the other vehicle V2a and the other vehicle V2b.
- the lines defining the parking space corresponding to the target parking space Pt1 are the lines L1, the line L2, and the line L3, which are three sides of the sides constituting the rectangle of the target parking space Pt1.
- the ROM 111 of the control device 11 stores the combination of the line L1, the line L2, and the line L3 defining the parking space as a pattern corresponding to the scene of FIG. 5A.
- FIG. 5B shows a scene in which the own vehicle V1 is trying to parallel park in the target parking space Pt2 between the other vehicle V2c and the other vehicle V2d.
- the control device 11 detects the line L4 as a line defining the parking space corresponding to the target parking space Pt2 by using the image pickup device 20, and further detects the other vehicle V2c, the other vehicle V2d, and a curb, wall, or the like (not shown) located on the left side.
- the control device 11 cannot recognize the parking space from the line L4 and the curb alone. Therefore, in the scene of FIG. 5B, the control device 11 sets the virtual line L5 behind the other vehicle V2c and sets the virtual line L6 in front of the other vehicle V2d. Then, when the parking space can be recognized by adding the virtual lines L5 and L6 to the line L4 detected by the image pickup device 20, it is determined that the line L4 is a line defining the parking space.
- the ROM 111 of the control device 11 stores the combination of the line L4, the virtual line L5 arranged behind the other vehicle V2c, and the virtual line L6 arranged in front of the other vehicle V2d as a pattern corresponding to the scene of FIG. 5B.
- FIG. 5C shows a scene in which the own vehicle V1 is trying to park in an oblique posture in the target parking space Pt3 where the own vehicle V1 can park, which is located between the other vehicle V2e and the other vehicle V2f.
- the lines defining the parking area corresponding to the target parking space Pt3 are the lines L7, L8, L9, and L10, which are the sides constituting the parallelogram of the target parking space Pt3.
- the ROM 111 of the control device 11 stores the combination of the line L7, the line L8, the line L9, and the line L10 defining the parking area as a pattern corresponding to the scene of FIG. 5C. Then, the parking space P detected by the parking space detection function is presented to the driver by the parking space display function.
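The pattern matching across the three scenes of FIGS. 5A to 5C can be sketched as checking which stored combination of bounding lines (real or virtual) is present among the detected lines. The pattern names and line labels below are illustrative assumptions, not the patent's stored representation:

```python
# Hypothetical patterns, following FIGS. 5A-5C: each parking-space type is
# stored as the combination of lines (real or virtual) that bounds it.
PATTERNS = {
    "perpendicular": {"L1", "L2", "L3"},            # three sides of a rectangle
    "parallel": {"L4", "virtual_rear", "virtual_front"},
    "angled": {"L7", "L8", "L9", "L10"},            # sides of a parallelogram
}


def match_parking_pattern(detected_lines):
    """Return the name of the first stored pattern whose required lines
    are all present among the detected lines, or None if nothing matches."""
    detected = set(detected_lines)
    for name, required in PATTERNS.items():
        if required <= detected:  # subset test: all required lines found
            return name
    return None
```

A real implementation would also verify the geometric relations between the lines (spacing, parallelism, enclosed area large enough for the vehicle), but the lookup structure is the same.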
- the parking space detection function of the control device 11 of the present embodiment detects whether or not there is a space in which the own vehicle can park at an arbitrary place. In addition, when a parking space of a parking lot that is used relatively frequently, such as a parking lot at home or at work, is registered in the parking support system 1 in advance, it also has a function of detecting the pre-registered parking space.
- next, the registration process of a specific parking space and the collation process thereof will be described. If there is a parking lot at home, a parking lot at work, or any other parking space where the driver wants to use autonomous parking control the next time he or she parks, the driver parks the own vehicle in that parking space and, in this state, presses the parking space registration button provided on the parking support device 10. As a result, the latitude and longitude of the parking space are stored in the memory of the control device 11 by the own vehicle position detection device (environment information acquisition function), such as the GPS unit and the gyro sensor included in the control device 11. At the same time, the image pickup device 20 is used to image the surroundings of the own vehicle, and this is stored in the memory of the control device 11 as a reference image.
- the position information of the parking space and the reference image are stored in association with each other, completing the registration process of the specific parking space. Then, when the vehicle subsequently approaches the parking space and tries to park, the space can be set as one of the target parking spaces for autonomous parking control.
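The registration step just described pairs the space's latitude/longitude with its reference images, and the approach check compares the current position against the registered ones. A minimal sketch, with hypothetical names and an illustrative proximity radius:

```python
import math


def register_parking_space(memory, space_id, lat, lon, reference_images):
    """Store a parking space's position together with its reference images,
    mimicking the association made when the registration button is pressed."""
    memory[space_id] = {"lat": lat, "lon": lon, "refs": list(reference_images)}


def nearby_registered_space(memory, lat, lon, radius_deg=0.001):
    """When the vehicle approaches a registered space, that space becomes a
    candidate target parking space for autonomous parking control.  The
    planar degree distance is a crude stand-in for a proper geodesic check."""
    for space_id, rec in memory.items():
        if math.hypot(rec["lat"] - lat, rec["lon"] - lon) <= radius_deg:
            return space_id
    return None
```

Example use: register a "home" space once, then on a later drive query `nearby_registered_space` with the current GPS fix to decide whether to offer the registered space as a parking target.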
- the reference images stored in advance in the control device 11 of the present embodiment are images for specifying the target parking space, and a plurality of different reference images, rather than a single image, are stored.
- the term "different reference images" as used herein means to include at least one of a plurality of reference images having different viewpoint positions, a plurality of reference images having different line-of-sight directions, and a plurality of reference images having different conversion methods.
- a plurality of reference images having different viewpoint positions can be generated from the image data captured by the image pickup device 20, in the same manner as the image processing using the image processing device 30 described with reference to FIG. For example, images from viewpoint positions having different heights (z coordinate) or different positions on a plane (xy coordinates) can be used as reference images even when the line-of-sight direction is the same vertically downward direction. Similarly, a plurality of reference images having different line-of-sight directions can also be generated by the image processing device 30 using the image data captured by the image pickup device 20.
- images with different line-of-sight directions, such as a vertically downward line of sight and a horizontal line of sight, can be used as reference images.
- a plurality of reference images having different conversion methods, for example an image with fisheye conversion and an image without fisheye conversion, can also be generated by the image processing device 30 using the image data captured by the image pickup device 20.
- a plurality of reference images in which both the viewpoint position and the line-of-sight direction, both the viewpoint position and the conversion method, both the line-of-sight direction and the conversion method, or all three differ may also be generated and stored.
- each reference image may be generated using only the image data captured by one camera, or may be generated using a plurality of image data captured by a plurality of cameras.
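Reading the preceding bullets together, each stored reference image is characterized by up to three independent axes: viewpoint position, line-of-sight direction, and conversion method, and any subset of the axes may vary between images. A minimal sketch of enumerating candidate forms (the axis values themselves are hypothetical, not from the patent):

```python
from itertools import product

# Hypothetical axis values; the text names the axes but not these values.
viewpoints = ["VP1", "VP2"]            # different viewpoint positions
directions = ["down", "horizontal"]    # different line-of-sight directions
conversions = ["none", "fisheye"]      # different conversion methods

# Every combination of the three axes is a candidate reference-image form;
# in practice only a few of these forms would be generated and stored.
forms = list(product(viewpoints, directions, conversions))
print(len(forms))  # 8
```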
- Such reference images are generated when the registration button or the like provided on the parking support device 10 of the present embodiment is pressed, and are stored in the memory of the control device 11. That is, when the registration button is pressed, the image pickup device 20 captures an image of the surroundings of the own vehicle, and the image processing device 30 processes the image to generate the reference images.
- as the plurality of different reference images to be stored, images in forms set in advance, such as the viewpoint position, the line-of-sight direction, or the conversion method, may be stored automatically, or some or all of the forms may be presented to the driver and selected by the driver. Further, when a reference image is generated, the entire image generated by the image processing device 30 may be used as the reference image, or a portion of the generated image that should serve as a feature portion may be extracted automatically or selected by the driver.
- FIG. 6 is a plan view showing an example of a parking space to be registered in advance. It is assumed that the parking space shown in the figure is the parking space of the driver's home, the driver's home has a house H1, a courtyard H2, a courtyard passage H3, and a target parking space Pt next to the courtyard H2. H4 indicates a pole erected at the end of passage H3, and RD indicates a road in front of the house.
- it is assumed that, in a state where the own vehicle V1 is parked in the target parking space Pt, that is, in a state where parking is completed, the parking support device 10 of the present embodiment uses the image data captured by the image pickup device 20b attached to the left door mirror among the image pickup devices 20, and the image processing device 30 generates an image IM3 viewed vertically downward from the virtual viewpoint VP1 shown in FIG. 2 and an image IM4 viewed horizontally outward from the virtual viewpoint VP2.
- FIG. 7A is a diagram showing an example of the first reference image IM31 registered in the parking support device 10
- FIG. 7B is a diagram showing an example of the second reference images IM41 and IM42 also registered in the parking support device 10.
- the image IM3 shown in FIG. 7A is a plan image of the left side of the own vehicle V1 parked in the target parking space Pt, viewed vertically downward from the virtual viewpoint VP1. In this image, the boundary line L11 between the courtyard H2 and the passage H3 and the boundary line L12 between the courtyard H2 and the parking space Pt are feature points suitable for collation.
- the courtyard H2 itself is not suitable for collation because it has a uniform color, no pattern, and no feature points. Therefore, in the present embodiment, the image IM31 of the portion including the boundary lines L11 and L12 is used as the first reference image.
- the image IM4 shown in FIG. 7B is a front image of the left side of the own vehicle V1 parked in the target parking space Pt, viewed horizontally from the virtual viewpoint VP2 toward the house H1. In this image, the vertical line L13 on the entrance wall of the house H1 and the pole H4 at the end of the passage H3 are feature points suitable for collation. On the other hand, the courtyard H2 and the passage H3, although they contain the boundary lines L11 and L12, show few features in the horizontal front image and are therefore not suitable for collation. Therefore, in the present embodiment, at least one of the image IM41 of the portion including the vertical line L13 and the image IM42 of the portion including the pole H4 is used as the second reference image.
- when the driver subsequently sets the target parking space Pt, the first reference image IM31 and the second reference images IM41 and IM42 are read from the memory of the control device 11 and serve as the reference images for the collation process.
- in the present embodiment, the reference images stored in the memory of the control device 11 are the images IM31, IM41, and IM42 obtained by cutting out particularly characteristic portions of the entire images IM3 and IM4, but the entire images IM3 and IM4 may also be used as reference images.
- a goodness of fit for collation may be set for each of a plurality of different reference images, and the set goodness of fit may be used for calculating the position information of the target parking space Pt.
- the goodness of fit of the present embodiment is a characteristic value, such as a likelihood or a correlation coefficient, indicating the degree to which a reference image is estimated to have a high matching probability, and the goodness of fit of all the reference images sums to 1 or 100%. For example, if a reference image includes a feature point that can be easily distinguished from other images, or includes a plurality of different feature points, its goodness of fit becomes high.
- for example, the position information of the target parking space calculated based on a reference image with a goodness of fit of 60% may be weighted by 60%, the position information calculated based on a reference image with a goodness of fit of 40% may be weighted by 40%, and the weighted values may be combined to calculate the final position information of the target parking space.
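The 60%/40% weighting just described works out as an ordinary weighted average of the per-image position estimates. A sketch, with made-up positions and a hypothetical `fuse_positions` helper:

```python
def fuse_positions(estimates):
    """Weighted average of (x, y) position estimates of the target
    parking space, weighted by each reference image's goodness of fit.
    `estimates` is a list of ((x, y), weight) pairs; the weights are
    assumed to sum to 1 (or 100%) over all reference images, as stated
    in the text.
    """
    total_w = sum(w for _, w in estimates)
    x = sum(p[0] * w for p, w in estimates) / total_w
    y = sum(p[1] * w for p, w in estimates) / total_w
    return (x, y)

# Position from the 60%-fit image: (10.0, 5.0);
# position from the 40%-fit image: (10.5, 5.5).
print(fuse_positions([((10.0, 5.0), 0.6), ((10.5, 5.5), 0.4)]))  # ≈ (10.2, 5.2)
```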
- further, when the registration is performed, a travel route R from a predetermined parking start position P0 on the road RD to the target parking space Pt may be tracked and stored in the memory of the control device 11 in advance in association with the registration.
- when the own vehicle is subsequently parked in the registered target parking space, the image pickup device 20 captures an image of the surroundings of the own vehicle, and by collating this image with the reference images, the position information of the target parking space specified by the reference images is calculated. At this time, the image around the current own vehicle is converted into an image form corresponding to each of the plurality of different reference images. For example, when one reference image is the first reference image IM31, which is a plan image viewed vertically downward from the virtual viewpoint VP1 shown in FIG. 7A, the image processing device 30 converts the image data captured by the image pickup device 20 into a plan image viewed vertically downward from the virtual viewpoint VP1 when collating with the first reference image IM31.
- similarly, when the other reference images are the second reference images IM41 and IM42, which are front images viewed in the horizontal direction from the virtual viewpoint VP2 shown in FIG. 7B, the image processing device 30 converts the image data captured by the image pickup device 20 into a front image viewed in the horizontal direction from the virtual viewpoint VP2 when collating with the second reference images IM41 and IM42.
- when collating the image around the current own vehicle with the plurality of different reference images, the image may be collated with all of the different reference images, or with only some of them. When the image around the current own vehicle is collated with all of the different reference images, the position information of the target parking space Pt is calculated from all the recognized reference images with weighting by the above-mentioned goodness of fit. On the other hand, when the image around the current own vehicle is collated with only some of the plurality of different reference images, the different reference images are collated in order, and once a reference image is recognized, the collation process is terminated even if reference images remain, and the position information of the target parking space Pt is calculated based on the recognized reference image.
- the parking space display function of the control device 11 of the present embodiment is a function of displaying, on the display 121 (FIG. 1), the parking space P detected by the control device 11 or the pre-registered parking space Pt (FIG. 6) in order to present it to the driver.
- the control device 11 presents the parking spaces P to the driver as broken-line frames.
- the driver selects the target parking space Pt for parking the own vehicle V1 from the parking spaces P shown in FIG.
- the target parking space Pt can be selected, for example, by touching the screen of the display 121, which is a touch panel.
- the control device 11 can calculate the parking route by the parking route calculation function.
- the parking route calculation function of the control device 11 of the present embodiment is a function of calculating the parking route to the target parking space Pt in order to park the own vehicle V1 in the set target parking space Pt.
- FIG. 8 is a flowchart showing an example of the basic processing of the parking support process of the present embodiment, FIG. 9 is a flowchart showing an example of the subroutine of step S6 of FIG. 8, and FIG. 10 is a flowchart showing another example of the subroutine of step S6 of FIG. 8.
- the parking support process described below is executed by the control device 11 at predetermined time intervals.
- in step S1 of FIG. 8, the control device 11 determines whether or not the autonomous parking start button has been pressed. For example, when the driver presses the autonomous parking button upon entering the parking lot, the parking support process of the present embodiment starts. Alternatively, it may be determined by the environmental information acquisition function whether or not autonomous parking is to be started automatically. If it is determined that autonomous parking is to be started, the process proceeds to step S2; if not, step S1 is repeated.
- in step S2, it is determined whether or not the driver has selected a registered parking space stored in advance in the memory of the control device 11. If the registered parking space is selected, the process proceeds to step S6; if not, the process proceeds to step S3.
- in step S3, the control device 11 creates a bird's-eye view image IM1 with the image processing device 30 using the images acquired by the image pickup device 20 by the parking space detection function. Then, the parking area is detected from the bird's-eye view image IM1, and the parking space P is detected from the parking area by using the image pickup device 20 and the distance measuring device 40.
- in step S4, the control device 11 presents the parking spaces P to the driver using the display 121 by the parking space display function.
- the driver selects a desired parking space from the displayed parking spaces P and sets it as the target parking space.
- in step S5, the control device 11 determines whether or not the target parking space Pt for parking the own vehicle V1 has been set from the parking spaces P. If it is determined that the driver has not selected the target parking space Pt, the process returns to step S4, and the parking spaces P continue to be presented to the driver until the target parking space Pt is selected. On the other hand, if it is determined that the driver has selected the target parking space Pt from the parking spaces P, the process proceeds to step S7.
- FIG. 9 is a flowchart showing the subroutine of step S6. When the driver selects the registered parking space, the image of the current surroundings of the own vehicle captured by the image pickup device 20 is collated with the plurality of different reference images registered in advance, the target parking space Pt is recognized, and the position information of the target parking space Pt is calculated.
- in step S611 of FIG. 9, the image around the current own vehicle captured by the image pickup device 20 is converted according to the form of the first reference image IM31 and collated with the first reference image IM31. Since the first reference image IM31 is a plan image viewed vertically downward from the virtual viewpoint VP1, the image around the current own vehicle is converted into a plan image viewed vertically downward from the virtual viewpoint VP1. Then, the first reference image IM31 is recognized in the image around the current own vehicle by using a conventionally known matching method using a template image, a feature point matching method, or another matching method.
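The "matching method using a template image" is only named, not specified, in the text. As one concrete illustration, a brute-force sum-of-absolute-differences (SAD) template search over a small grayscale image might look like the sketch below; a real system would use an optimized matcher, and the `max_sad` acceptance threshold is an arbitrary assumption.

```python
def find_template(image, template, max_sad=0):
    """Slide `template` over `image` (2D lists of gray values) and return
    (row, col, sad) for the placement with the smallest sum of absolute
    differences, or None when no placement scores <= max_sad.
    """
    H, W = len(image), len(image[0])
    h, w = len(template), len(template[0])
    best = None
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            # SAD between the template and the image patch at (r, c)
            sad = sum(abs(image[r + i][c + j] - template[i][j])
                      for i in range(h) for j in range(w))
            if best is None or sad < best[2]:
                best = (r, c, sad)
    if best is not None and best[2] <= max_sad:
        return best
    return None

image = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 6, 0],
         [0, 0, 0, 0]]
template = [[9, 8],
            [7, 6]]
print(find_template(image, template))  # (1, 1, 0)
```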
- in step S612, the image around the current own vehicle captured by the image pickup device 20 is converted according to the form of the second reference images IM41 and IM42 and collated with the second reference images IM41 and IM42. Since the second reference images IM41 and IM42 are front images viewed in the horizontal direction from the virtual viewpoint VP2, the image around the current own vehicle is converted into a front image viewed in the horizontal direction from the virtual viewpoint VP2. Then, the second reference images IM41 and IM42 are recognized in the image around the current own vehicle by using a conventionally known matching method using a template image, a feature point matching method, or another matching method.
- in step S613, as a result of the collation executed in step S611, it is determined whether or not there is a portion matching the first reference image IM31 in the image around the current own vehicle. If it is determined that there is a matching portion, the process proceeds to step S614; if not, the process proceeds to step S617.
- in step S614, as a result of the collation executed in step S612, it is determined whether or not there is a portion matching the second reference images IM41 and IM42 in the image around the current own vehicle. If it is determined that there is a matching portion, the process proceeds to step S615; if not, the process proceeds to step S616.
- in step S617, it is determined whether or not there is a portion matching the second reference images IM41 and IM42 in the surrounding image. If it is determined that there is a matching portion, the process proceeds to step S618; if not, the process proceeds to step S619.
- in step S615, it has been determined in step S613 that there is a portion matching the first reference image IM31 and in step S614 that there is a portion matching the second reference images IM41 and IM42, so the position of the target parking space Pt is calculated based on the detection positions of the first reference image IM31 and the second reference images IM41 and IM42. At this time, when the goodness of fit is set for the first reference image IM31 and the second reference images IM41 and IM42, the position of the target parking space Pt is calculated by a weighted average using the goodness of fit.
- in step S616, it has been determined in step S613 that there is a portion matching the first reference image IM31 and in step S614 that there is no portion matching the second reference images IM41 and IM42, so the position of the target parking space Pt is calculated based on the detection position of the first reference image IM31. In this case, the second reference images IM41 and IM42 are not used for calculating the position of the target parking space Pt.
- in step S618, it has been determined in step S613 that there is no portion matching the first reference image IM31 and in step S617 that there is a portion matching the second reference images IM41 and IM42, so the position of the target parking space Pt is calculated based on the detection positions of the second reference images IM41 and IM42. In this case, the first reference image IM31 is not used for calculating the position of the target parking space Pt.
- in step S619, it has been determined in step S613 that there is no portion matching the first reference image IM31 and in step S617 that there is no portion matching the second reference images IM41 and IM42, so the target parking space Pt cannot be recognized, and the parking support process is terminated without calculating the position of the target parking space Pt.
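The branching of steps S613 through S619 can be condensed into a small decision function. In this sketch the detection positions and the goodness-of-fit weights passed in are illustrative stand-ins for the actual collation results; the function name and signature are assumptions.

```python
def locate_target(first_pos, second_pos, w_first=0.5, w_second=0.5):
    """Decision logic corresponding to FIG. 9: `first_pos`/`second_pos`
    are the (x, y) detection positions derived from the first and second
    reference images, or None when no matching portion was found.
    """
    if first_pos is not None and second_pos is not None:
        # S615: both matched -> weighted average using the goodness of fit
        return (first_pos[0] * w_first + second_pos[0] * w_second,
                first_pos[1] * w_first + second_pos[1] * w_second)
    if first_pos is not None:
        return first_pos   # S616: only the first reference image matched
    if second_pos is not None:
        return second_pos  # S618: only the second reference images matched
    return None            # S619: no match -> terminate the support process

print(locate_target((10.0, 5.0), (11.0, 6.0)))  # (10.5, 5.5)
print(locate_target(None, None))                # None
```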
- in step S7, the control device 11 calculates, by the parking route calculation function, the parking route R along which the own vehicle V1 moves to the set target parking space Pt, for example, from the current position or from a position after changing the traveling direction.
- in step S8, the control device 11 plans, by the traveling operation planning function, a traveling operation for the own vehicle V1 to park along the calculated parking route R, and executes the parking support of the own vehicle V1 in accordance with the calculated parking route R and the planned traveling operation by using the vehicle controller 50, the drive system 60, the vehicle speed sensor 70, and the steering angle sensor 80.
- during the execution of the parking support in step S8, the control device 11 detects obstacles around the own vehicle V1 by the environmental information acquisition function using the image pickup device 20 and the distance measuring device 40, and carries out the parking support while avoiding interference with the obstacles.
- step S9 it is determined whether or not the own vehicle V1 has finished parking in the target parking space Pt, and the parking support in step S8 is continued until the own vehicle V1 completes parking.
- FIG. 10 is a flowchart showing another example of the subroutine of step S6 of FIG.
- in step S621 of FIG. 10, the image around the current own vehicle captured by the image pickup device 20 is converted according to the form of the first reference image IM31 and collated with the first reference image IM31. Since the first reference image IM31 is a plan image viewed vertically downward from the virtual viewpoint VP1, the image around the current own vehicle is converted into a plan image viewed vertically downward from the virtual viewpoint VP1. Then, the first reference image IM31 is recognized in the image around the current own vehicle by using a conventionally known matching method using a template image, a feature point matching method, or another matching method.
- in step S622, as a result of the collation executed in step S621, it is determined whether or not there is a portion matching the first reference image IM31 in the image around the current own vehicle. If it is determined that there is a matching portion, the process proceeds to step S623; if not, the process proceeds to step S624.
- in step S623, since it has been determined in step S622 that there is a portion matching the first reference image IM31, the position of the target parking space Pt is calculated based on the detection position of the first reference image IM31. In this case, the second reference images IM41 and IM42 are not used for calculating the position of the target parking space Pt.
- in step S624, the image around the current own vehicle captured by the image pickup device 20 is converted according to the form of the second reference images IM41 and IM42 and collated with the second reference images IM41 and IM42. Since the second reference images IM41 and IM42 are front images viewed in the horizontal direction from the virtual viewpoint VP2, the image around the current own vehicle is converted into a front image viewed in the horizontal direction from the virtual viewpoint VP2. Then, the second reference images IM41 and IM42 are recognized in the image around the current own vehicle by using a conventionally known matching method using a template image, a feature point matching method, or another matching method.
- in step S625, as a result of the collation executed in step S624, it is determined whether or not there is a portion matching the second reference images IM41 and IM42 in the image around the current own vehicle. If it is determined that there is a matching portion, the process proceeds to step S626; if not, the process proceeds to step S627.
- in step S626, it has been determined in step S622 that there is no portion matching the first reference image IM31 and in step S625 that there is a portion matching the second reference images IM41 and IM42, so the position of the target parking space Pt is calculated based on the detection positions of the second reference images IM41 and IM42. In this case, the first reference image IM31 is not used for calculating the position of the target parking space Pt.
- in step S627, it has been determined in step S622 that there is no portion matching the first reference image IM31 and in step S625 that there is no portion matching the second reference images IM41 and IM42, so the target parking space Pt cannot be recognized, and the parking support process is terminated without calculating the position of the target parking space Pt.
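By contrast with FIG. 9, the FIG. 10 subroutine collates the reference images in order and stops at the first success. A sketch of that early-exit structure, where the matcher callables are placeholders for the actual collation steps and all names are illustrative:

```python
def locate_target_sequential(matchers):
    """FIG. 10 style flow: try each collation in order and stop at the
    first reference image that is recognized; return None when none
    matches (the case where the support process is terminated).
    `matchers` is a list of zero-argument callables, each returning an
    (x, y) position or None.
    """
    for match in matchers:
        pos = match()
        if pos is not None:
            return pos  # remaining reference images are skipped
    return None

# First collation fails, second succeeds -> the second's position is used.
print(locate_target_sequential([lambda: None, lambda: (12.0, 4.0)]))  # (12.0, 4.0)
```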
- as described above, according to the parking support method and parking support device of the present embodiment, a plurality of different reference images (the first reference image IM31 and the second reference images IM41 and IM42) specifying the target parking space Pt are stored in advance, and when the own vehicle V1 is subsequently parked in the target parking space Pt, the target parking space Pt is recognized by collating the image around the current own vehicle V1 with at least one of the plurality of different reference images, so that the recognition accuracy of the target parking space Pt can be improved.
- further, since the traveling operation of the own vehicle V1 to reach the target parking space Pt along the travel route R is autonomously controlled based on the recognized position information of the target parking space Pt, the own vehicle V1 can be autonomously controlled more reliably.
- further, since the target parking space Pt is recognized by collating the image around the current own vehicle V1 with all of the plurality of different reference images (the first reference image IM31 and the second reference images IM41 and IM42), the recognition accuracy of the target parking space Pt is further improved.
- alternatively, since the target parking space Pt is recognized by collating the image around the current own vehicle V1 with some of the plurality of different reference images (the first reference image IM31 and the second reference images IM41 and IM42), the recognition processing time of the target parking space Pt is shortened and the recognition processing load is reduced.
- further, since the plurality of different reference images include at least one of a plurality of reference images having different viewpoint positions, a plurality of reference images having different line-of-sight directions, and a plurality of reference images having different conversion methods, the recognition accuracy of the target parking space Pt is further improved.
- further, when the image around the current own vehicle V1 is collated with the plurality of different reference images (the first reference image IM31 and the second reference images IM41 and IM42), the image around the current own vehicle is converted into an image form corresponding to each of the plurality of different reference images, so that the collation process is simplified and the collation accuracy is increased.
- further, since the goodness of fit for collation is set for each of the plurality of different reference images (the first reference image IM31 and the second reference images IM41 and IM42) and the target parking space Pt is recognized by weighting using the set goodness of fit, the suitability of the collation can be improved according to the environment of the pre-registered target parking space Pt, and the recognition accuracy of the target parking space Pt is further improved.
- further, since the travel route R from a predetermined parking start position to the target parking space Pt is stored when the plurality of reference images are stored, and, when the own vehicle V1 is subsequently parked in the target parking space Pt, the traveling operation of the own vehicle V1 is autonomously controlled along the stored travel route R once the target parking space Pt is recognized, the calculation load of the travel route can be reduced and, at the same time, the time until the start of the parking support can be shortened.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Traffic Control Systems (AREA)
- Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
Abstract
Description
10…parking support device
11…control device
111…ROM
112…CPU
113…RAM
12…output device
121…display
122…speaker
20…image pickup device
2a…image pickup device arranged on the front grille of the vehicle
2b…image pickup device arranged under the left door mirror of the vehicle
2c…image pickup device arranged under the right door mirror of the vehicle
2d…image pickup device arranged near the rear bumper of the vehicle
30…image processing device
40…distance measuring device
4a…front distance measuring device
4b…right-side distance measuring device
4c…left-side distance measuring device
4d…rear distance measuring device
50…vehicle controller
60…drive system
70…vehicle speed sensor
IM1…bird's-eye view image
IM2…monitoring image
IM31…first reference image
IM41, IM42…second reference images
L1 to L10…lines
P…parking space
Pt, Pt1, Pt2, Pt3…target parking spaces
V1…own vehicle
V2, V2a, V2b, V2c, V2d, V2e…other vehicles
VP1, VP2…virtual viewpoints
Claims (9)
- A parking support method comprising: storing in advance a plurality of different reference images that specify a target parking space and are generated using an image pickup device that images the surroundings of an own vehicle; and, when the own vehicle is subsequently parked in the target parking space, recognizing the target parking space by collating an image of the current surroundings of the own vehicle with at least one of the plurality of different reference images.
- The parking support method according to claim 1, wherein a traveling operation of the own vehicle to reach the target parking space is autonomously controlled based on position information of the recognized target parking space.
- The parking support method according to claim 1 or 2, wherein the target parking space is recognized by collating the image of the current surroundings of the own vehicle with all of the plurality of different reference images.
- The parking support method according to claim 1 or 2, wherein the target parking space is recognized by collating the image of the current surroundings of the own vehicle with some of the plurality of different reference images.
- The parking support method according to any one of claims 1 to 4, wherein the plurality of different reference images include at least one of a plurality of reference images having different viewpoint positions, a plurality of reference images having different line-of-sight directions, and a plurality of reference images having different conversion methods.
- The parking support method according to any one of claims 1 to 5, wherein, when the image of the current surroundings of the own vehicle is collated with at least one of the plurality of different reference images, the image of the current surroundings of the own vehicle is converted into an image form corresponding to each of the plurality of different reference images.
- The parking support method according to any one of claims 1 to 6, wherein a goodness of fit for collation is set for each of the plurality of different reference images, and, when the image of the current surroundings of the own vehicle is collated with at least one of the plurality of different reference images, the target parking space is recognized by weighting using the set goodness of fit.
- The parking support method according to any one of claims 1 to 7, wherein, when the plurality of reference images are stored in advance, a travel route from a predetermined parking start position to the target parking space is also stored, and, when the own vehicle is subsequently parked in the target parking space, the traveling operation of the own vehicle is autonomously controlled along the stored travel route once the target parking space is recognized.
- A parking support device comprising: a memory that stores in advance a plurality of different reference images that include a target parking space and are generated using an image pickup device that images the surroundings of an own vehicle; and a controller that, when the own vehicle is subsequently parked in the target parking space, recognizes the target parking space by collating an image of the current surroundings of the own vehicle with at least one of the plurality of different reference images.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BR112023007828A BR112023007828A2 (pt) | 2020-11-02 | 2020-11-02 | Método de assistência de estacionamento e dispositivo de assistência de estacionamento |
EP20959661.8A EP4239613A4 (en) | 2020-11-02 | 2020-11-02 | PARKING ASSISTANCE METHOD AND PARKING ASSISTANCE DEVICE |
MX2023004918A MX2023004918A (es) | 2020-11-02 | 2020-11-02 | Método de asistencia al estacionamiento y dispositivo de asistencia al estacionamiento. |
CN202080106745.XA CN116508083B (zh) | 2020-11-02 | 2020-11-02 | 停车辅助方法及停车辅助装置 |
PCT/IB2020/000915 WO2022090758A1 (ja) | 2020-11-02 | 2020-11-02 | 駐車支援方法及び駐車支援装置 |
US18/033,850 US11884265B1 (en) | 2020-11-02 | 2020-11-02 | Parking assistance method and parking assistance device |
JP2022558363A JP7405277B2 (ja) | 2020-11-02 | 2020-11-02 | 駐車支援方法及び駐車支援装置 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2020/000915 WO2022090758A1 (ja) | 2020-11-02 | 2020-11-02 | 駐車支援方法及び駐車支援装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022090758A1 true WO2022090758A1 (ja) | 2022-05-05 |
Family
ID=81381414
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2020/000915 WO2022090758A1 (ja) | 2020-11-02 | 2020-11-02 | 駐車支援方法及び駐車支援装置 |
Country Status (7)
Country | Link |
---|---|
US (1) | US11884265B1 (ja) |
EP (1) | EP4239613A4 (ja) |
JP (1) | JP7405277B2 (ja) |
CN (1) | CN116508083B (ja) |
BR (1) | BR112023007828A2 (ja) |
MX (1) | MX2023004918A (ja) |
WO (1) | WO2022090758A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007011994A (ja) * | 2005-07-04 | 2007-01-18 | Toyota Motor Corp | 道路認識装置 |
JP2007315956A (ja) * | 2006-05-26 | 2007-12-06 | Aisin Aw Co Ltd | 駐車場マップ作成方法、駐車場案内方法及びナビゲーション装置 |
JP2016197314A (ja) * | 2015-04-03 | 2016-11-24 | 株式会社日立製作所 | 運転支援システム、運転支援装置及び運転支援方法 |
JP2018163530A (ja) * | 2017-03-27 | 2018-10-18 | クラリオン株式会社 | 対象物検知装置、対象物検知方法、及び対象物検知プログラム |
JP2019202697A (ja) | 2018-05-25 | 2019-11-28 | トヨタ自動車株式会社 | 駐車支援装置 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009111536A (ja) * | 2007-10-26 | 2009-05-21 | Victor Co Of Japan Ltd | 車両用後方確認システム及びその誘導方法 |
JP5268368B2 (ja) * | 2008-01-14 | 2013-08-21 | トヨタホーム株式会社 | 駐車支援システム及び建物 |
JP4940168B2 (ja) * | 2008-02-26 | 2012-05-30 | 日立オートモティブシステムズ株式会社 | 駐車スペース認識装置 |
JP6406159B2 (ja) * | 2015-08-04 | 2018-10-17 | 株式会社デンソー | 車載表示制御装置、車載表示制御方法 |
JP2017067466A (ja) * | 2015-09-28 | 2017-04-06 | アイシン精機株式会社 | 駐車支援装置 |
KR102154510B1 (ko) * | 2016-10-13 | 2020-09-10 | 닛산 지도우샤 가부시키가이샤 | 주차 지원 방법 및 주차 지원 장치 |
JP6946652B2 (ja) * | 2017-02-02 | 2021-10-06 | 株式会社アイシン | 駐車支援装置 |
JP6722616B2 (ja) * | 2017-04-07 | 2020-07-15 | クラリオン株式会社 | 駐車支援装置 |
JP7000822B2 (ja) * | 2017-12-06 | 2022-01-19 | 株式会社アイシン | 周辺監視装置 |
DE102018202738A1 (de) * | 2018-02-23 | 2019-08-29 | Bayerische Motoren Werke Aktiengesellschaft | Fernbedienbares Parkassistenzsystem mit selbstständiger Entscheidung über das Vorliegen einer Ein- oder Ausparksituation und entsprechendes Parkverfahren |
JP7151293B2 (ja) * | 2018-09-06 | 2022-10-12 | 株式会社アイシン | 車両周辺表示装置 |
JP7130056B2 (ja) * | 2018-11-08 | 2022-09-02 | 日立Astemo株式会社 | 車両制御装置、経路配信装置、車両誘導システム |
US11107354B2 (en) * | 2019-02-11 | 2021-08-31 | Byton North America Corporation | Systems and methods to recognize parking |
JP2020142752A (ja) * | 2019-03-08 | 2020-09-10 | Toyota Motor Corp | Parking assistance device |
JP7139284B2 (ja) * | 2019-05-14 | 2022-09-20 | Honda Motor Co Ltd | Vehicle control device, parking lot management device, vehicle control method, and program |
KR102061750B1 (ko) * | 2019-05-15 | 2020-01-03 | RideFlux Inc | Method and device for controlling driving of a vehicle using prior information |
JP7053560B2 (ja) * | 2019-12-13 | 2022-04-12 | Honda Motor Co Ltd | Parking assistance system and control method therefor |
2020
- 2020-11-02 US US18/033,850 patent/US11884265B1/en active Active
- 2020-11-02 BR BR112023007828A patent/BR112023007828A2/pt unknown
- 2020-11-02 CN CN202080106745.XA patent/CN116508083B/zh active Active
- 2020-11-02 WO PCT/IB2020/000915 patent/WO2022090758A1/ja active Application Filing
- 2020-11-02 MX MX2023004918A patent/MX2023004918A/es unknown
- 2020-11-02 EP EP20959661.8A patent/EP4239613A4/en active Pending
- 2020-11-02 JP JP2022558363A patent/JP7405277B2/ja active Active
Non-Patent Citations (4)
Title |
---|
HUANG, Y. ET AL.: "Vision-based Semantic Mapping and Localization for Autonomous Indoor Parking", 2018 IEEE INTELLIGENT VEHICLES SYMPOSIUM, June 2018 (2018-06-01), pages 636 - 641, XP033423415, DOI: 10.1109/IVS.2018.8500516 * |
JUNG, H. G. ET AL.: "Parking Slot Markings Recognition for Automatic Parking Assist System", 2006 IEEE INTELLIGENT VEHICLES SYMPOSIUM, June 2006 (2006-06-01), pages 106 - 113, XP010936998 * |
See also references of EP4239613A4 |
SUZUKI, M.; CHINOMI, S.; TAKANO, T.: "Development of Around View System", Proceedings of JSAE Annual Congress, vol. 116-07, October 2007, pages 17 - 22 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022090758A1 (ja) | 2022-05-05 |
EP4239613A4 (en) | 2023-09-20 |
BR112023007828A2 (pt) | 2024-02-06 |
US20240034305A1 (en) | 2024-02-01 |
JP7405277B2 (ja) | 2023-12-26 |
MX2023004918A (es) | 2023-11-10 |
CN116508083B (zh) | 2024-04-26 |
US11884265B1 (en) | 2024-01-30 |
CN116508083A (zh) | 2023-07-28 |
EP4239613A1 (en) | 2023-09-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10351060B2 (en) | Parking assistance apparatus and vehicle having the same | |
US11086335B2 (en) | Driving assistance system and vehicle comprising the same | |
US11377099B2 (en) | Parking assist system | |
EP3184365A2 (en) | Display device for vehicle and control method thereof | |
EP3290301A1 (en) | Parking assist device | |
WO2020116195A1 (ja) | Information processing device, information processing method, program, mobile body control device, and mobile body | |
JP2018531175A (ja) | Method for identifying a parking area for parking a motor vehicle, driver assistance system, and motor vehicle | |
JP2018531175A6 (ja) | Method for identifying a parking area for parking a motor vehicle, driver assistance system, and motor vehicle | |
US11842548B2 (en) | Parking space recognition system and parking assist system including the same | |
WO2022138123A1 (en) | Available parking space identification device, available parking space identification method, and program | |
CN114792475A (zh) | Automatic parking system | |
US20210294338A1 (en) | Control apparatus, control method, and computer-readable storage medium storing program | |
JP2022098397A (ja) | Information processing device, information processing method, and program | |
KR20170018701A (ko) | Driver assistance apparatus and control method thereof | |
CN112602124A (zh) | Communication method for vehicle dispatch system, vehicle dispatch system, and communication device | |
WO2022090758A1 (ja) | Parking assistance method and parking assistance device | |
US20220379879A1 (en) | Parking assistance device and parking assistance method | |
EP4102323B1 (en) | Vehicle remote control device, vehicle remote control system, vehicle remote control method, and vehicle remote control program | |
EP4105087A1 (en) | Parking assist method and parking assist apparatus | |
US20200282978A1 (en) | Vehicle control system, vehicle control method, and storage medium | |
RU2791347C1 (ru) | Parking assistance method and parking assistance device | |
WO2024069689A1 (ja) | Driving assistance method and driving assistance device | |
WO2023002863A1 (ja) | Driving assistance device and driving assistance method | |
US11548500B2 (en) | Parking assist system | |
WO2024069690A1 (ja) | Driving assistance method and driving assistance device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20959661 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022558363 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202080106745.X Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202347030417 Country of ref document: IN |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112023007828 Country of ref document: BR |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2020959661 Country of ref document: EP Effective date: 20230602 |
|
ENP | Entry into the national phase |
Ref document number: 112023007828 Country of ref document: BR Kind code of ref document: A2 Effective date: 20230425 |