WO2023026760A1 - Object detection device and object detection method - Google Patents

Object detection device and object detection method

Info

Publication number
WO2023026760A1
Authority
WO
WIPO (PCT)
Prior art keywords
detection
vehicle
movement
detection device
sensor
Prior art date
Application number
PCT/JP2022/029149
Other languages
English (en)
Japanese (ja)
Inventor
秀典 田中
真澄 福万
幹生 大林
拓也 中川
Original Assignee
株式会社デンソー
トヨタ自動車株式会社
Priority date
Filing date
Publication date
Application filed by 株式会社デンソー, トヨタ自動車株式会社
Publication of WO2023026760A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/50 Systems of measurement, based on relative movement of the target
    • G01S15/52 Discriminating between fixed and moving objects or between objects moving at different speeds
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 Sonar systems specially adapted for specific applications
    • G01S15/89 Sonar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • The present disclosure relates to an object detection device and an object detection method.
  • Patent Document 1 proposes an in-vehicle object discrimination device that determines whether an object around the vehicle is a moving object or a stationary object. With this device, two ultrasonic sensors detect the same area at different times, and if the detection results are the same, the object is determined to be a stationary object; in this way a judgment is made as to whether the object is a moving object or a stationary object. An obstacle map of only stationary objects is then created, and a parking assistance system is constructed.
  • An object of the present disclosure is to provide an object detection device and an object detection method that can identify in a short time whether an object existing around a vehicle is a moving object.
  • The object detection device detects an object existing around the vehicle and includes: an object detection unit that detects, as a first object, an object within the detection area of a search wave sensor installed in the vehicle based on the detection result of the search wave sensor; a movement detection unit that detects movement of a second object in a target area based on the detection result of a detection device that sequentially detects an object within the range of the target area as the second object, the target area being an area including the detection area; and an object identification unit that identifies the first object as a moving object when movement of the second object is detected by the movement detection unit and the first object and the second object are the same.
  • The object detection method detects an object existing around the vehicle and includes: detecting, as a first object, an object within the detection area of a search wave sensor installed in the vehicle based on the detection result of the search wave sensor; detecting movement of a second object in a target area based on the detection result of a detection device that sequentially detects an object within the range of the target area as the second object, the target area being an area including the detection area; and identifying the first object as a moving object when movement of the second object is detected and the first object and the second object are the same.
  • Since the detection result of the detection device is used to detect movement of the object within the detection area of the search wave sensor, the detection of the object based on the detection result of the search wave sensor and the detection of movement of the object based on the detection result of the detection device can be carried out in parallel. Therefore, because no standby state occurs as in the conventional technology, it is possible to quickly identify whether or not an object existing around the vehicle is a moving object.
  • FIG. 1 is a schematic configuration diagram of a parking assistance system including an object detection device according to an embodiment of the present disclosure
  • FIG. 2 is a schematic plan view showing a schematic configuration of a vehicle equipped with an object detection device according to an embodiment of the present disclosure
  • FIG. 3 is an explanatory diagram for explaining a state in which another vehicle passes the own vehicle when the own vehicle is parked
  • FIG. 4 is a flowchart showing an example of object detection processing executed by the control device
  • FIG. 5 is an explanatory diagram for explaining identification of a moving object by the object detection device
  • FIG. 6 is an explanatory diagram for explaining identification of a stationary object by the object detection device
  • FIG. 7 is an explanatory view for explaining an obstacle map.
  • An embodiment of the present disclosure will be described with reference to FIGS. 1 to 7.
  • As shown in FIG. 1, a case where the object detection device 10 is incorporated in the parking assistance system 1 of the vehicle will be described as an example.
  • Although the object detection device 10 is incorporated into the parking assistance system 1 here, it may be incorporated into a system other than the parking assistance system 1.
  • The parking assist system 1 includes a perimeter monitoring device 20, a control device 30, an HMIECU 40, a brake ECU 50, and a power train ECU 60.
  • The perimeter monitoring device 20 is directly connected to the control device 30 so that the monitoring result of the perimeter monitoring device 20 is input to the control device 30.
  • Communication between the control device 30 and the HMIECU 40, the brake ECU 50, and the power train ECU 60 is possible via an in-vehicle communication bus 70 such as an in-vehicle LAN (Local Area Network).
  • The control device 30 is connected to various sensors for vehicle control such as a wheel speed sensor and a steering angle sensor.
  • a wheel speed sensor is provided for each of the four wheels, and generates a detection signal corresponding to the rotation state of each wheel as a pulse output.
  • the steering angle sensor outputs a detection signal corresponding to the direction of steering and the amount of operation by the steering operation.
  • The perimeter monitoring device 20 is an autonomous sensor that monitors the surrounding environment of the vehicle in which it is mounted (hereinafter referred to as own vehicle Va).
  • The perimeter monitoring device 20 detects, as obstacles, three-dimensional objects around the own vehicle Va, such as moving objects like other vehicles Vb and stationary objects like structures on the road. The vehicle is equipped with a search wave sensor 21, a surroundings monitoring camera 22, and the like as the perimeter monitoring device 20.
  • the search wave sensor 21 transmits search waves to a predetermined range around the own vehicle Va.
  • The search wave sensor 21 includes a plurality of ultrasonic sensors arranged at predetermined intervals along the traveling direction of the vehicle Va on the side of the vehicle Va.
  • The search wave sensor 21 includes a left front side sensor SLf, a left rear side sensor SLb, a right front side sensor SRf, a right rear side sensor SRb, a millimeter wave radar (not shown), and a LiDAR (light detection and ranging, not shown).
  • The left front side sensor SLf, the left rear side sensor SLb, the right front side sensor SRf, and the right rear side sensor SRb are arranged on the side of the vehicle Va along the traveling direction of the vehicle Va.
  • Ultrasonic sensors are provided on the front left, rear left, front right, and rear right of the own vehicle Va.
  • The front-rear direction of the vehicle Va is indicated by DR1, and the left-right direction of the vehicle Va is indicated by DR2.
  • The ultrasonic sensor generates ultrasonic waves, receives the reflected waves, detects the distance to an object existing in the pointing direction of the ultrasonic sensor based on the time from generation of the ultrasonic waves to reception of the reflected waves, and outputs the result as a detection signal.
  • Hereinafter, the distance detected by the ultrasonic sensor is referred to as the detection distance.
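  • The following is a minimal illustrative sketch of the time-of-flight relation described above. The speed-of-sound constant and the function name are assumptions made for illustration and do not appear in the publication.

```python
# Illustrative sketch of the time-of-flight relation described above.
# The speed of sound (about 340 m/s in air at room temperature) is an
# assumed constant, not a value given in the publication.
SPEED_OF_SOUND_M_PER_S = 340.0

def detection_distance(round_trip_time_s: float) -> float:
    """Distance to the reflecting object, from the time between emitting
    the ultrasonic wave and receiving its reflected wave."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0

# Example: an echo received 6 ms after emission corresponds to about 1.02 m.
print(detection_distance(0.006))
```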
  • The detection area of the left front side sensor SLf is Rfl, the detection area of the left rear side sensor SLb is Rrl, the detection area of the right front side sensor SRf is Rfr, and the detection area of the right rear side sensor SRb is Rrr.
  • The distance between the left front side sensor SLf and the left rear side sensor SLb and the distance between the right front side sensor SRf and the right rear side sensor SRb are each set to a predetermined distance and are substantially the same.
  • The left front side sensor SLf and the right front side sensor SRf are arranged in a tire housing on the front wheel side of the own vehicle Va.
  • The left rear side sensor SLb and the right rear side sensor SRb are arranged in a portion of the rear bumper located on the side of the own vehicle Va, a rear fender portion, and the like.
  • The left front side sensor SLf, the left rear side sensor SLb, the right front side sensor SRf, and the right rear side sensor SRb are arranged to face the side of the own vehicle Va, and detect the distance to an object present on the side of the own vehicle Va.
  • the surroundings monitoring camera 22 images a predetermined range around the own vehicle Va as target areas RL and RR.
  • the perimeter monitoring camera 22 outputs imaging data obtained by imaging the perimeter of the vehicle Va as an imaging result.
  • the surrounding monitoring camera 22 includes a front camera CF, a back camera CB, a left side camera CL, a right side camera CR, and the like.
  • the front camera CF is provided, for example, on the front bumper or front grill that constitutes the front mask of the vehicle.
  • the back camera CB is provided, for example, on the rear bumper or in the vicinity of the rear bumper.
  • the left side camera CL is provided, for example, on the left side mirror ML or near the left side mirror ML.
  • the left side camera CL has a target area RL that includes the detection areas Rfl and Rrl of the left front side sensor SLf and the left rear side sensor SLb.
  • the right side camera CR is provided, for example, on the right side mirror MR or in the vicinity of the right side mirror MR.
  • the right side camera CR uses an area including the detection areas Rfr and Rrr of the right front side sensor SRf and the right rear side sensor SRb as a target area RR.
  • The surroundings monitoring camera 22 thus constitutes a detection device that regards the areas including the detection areas Rfl, Rrl, Rfr, and Rrr of the search wave sensor 21 as the target areas RL and RR, and sequentially detects an object within the range of the target areas RL and RR as the "second object".
  • Of the side sensors SLf, SLb, SRf, and SRb, one of the ultrasonic sensors adjacent in the traveling direction of the vehicle Va is assumed to be the first sensor, and the other is assumed to be the second sensor.
  • the surroundings monitoring camera 22 sequentially detects the “second object” by using the areas including the detection area of the first sensor and the detection area of the second sensor as target areas RL and RR.
  • The control device 30 constitutes an ECU (that is, an electronic control device) for performing various controls to realize the parking assistance method in the parking assistance system 1, and is composed of a microcomputer equipped with a CPU, a storage unit 31, an I/O, and the like.
  • the storage unit 31 includes ROM, RAM, EEPROM, and the like. That is, the storage unit 31 has a volatile memory such as RAM and a nonvolatile memory such as EEPROM.
  • The storage unit 31 is composed of a non-transitory tangible recording medium. For example, the storage unit 31 holds a predetermined amount of various information obtained from the search wave sensor 21 and the surroundings monitoring camera 22 in chronological order.
  • the control device 30 executes parking support control including object detection processing based on the monitoring results of the perimeter monitoring device 20 described above.
  • In this parking support control, obstacles such as "stationary objects" existing around the own vehicle Va are recognized, a movement route that avoids the obstacles is calculated when the vehicle is parked, and the vehicle is assisted in moving along the movement route. If a recognized obstacle is a "moving object", the "moving object" may no longer exist when the vehicle starts to park after the recognition. For this reason, for example, processing such as calculating the movement route with "moving objects" excluded from the obstacles is performed.
  • Assistance for moving the vehicle along the movement route is provided, for example, by presenting the movement route so that the driver can visually grasp it, or by controlling the braking/driving force so as to move the vehicle directly along the movement route. To execute such control, control signals are transmitted from the control device 30 to the HMIECU 40, the brake ECU 50, and the power train ECU 60 through the in-vehicle communication bus 70.
  • The control device 30 includes an object detection unit 32, a movement detection unit 33, an object identification unit 34, a map generation unit 35, and a support control unit 36 as functional units.
  • The object detection unit 32 detects, as the "first object", an object within the detection areas Rfl, Rrl, Rfr, and Rrr of the search wave sensor 21 based on the detection result of the search wave sensor 21 installed in the own vehicle Va.
  • The object detection unit 32 calculates the distance between the vehicle and the object as the detection distance based on the sensor output of each of the side sensors SLf, SLb, SRf, and SRb. Then, the object detection unit 32 identifies where the object is by moving triangulation from the detection distances based on the sensor outputs of the side sensors SLf, SLb, SRf, and SRb.
  • For example, if the detection distance changes so as to become shorter as the ultrasonic sensor moves forward, it is estimated that the object is positioned ahead of the ultrasonic sensor, and if the detection distance does not change, it is estimated that the object exists at a side position of the ultrasonic sensor.
  • Note that the detection of the "first object" by the object detection unit 32 may be realized by a method other than the moving triangulation method.
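  • The following is a minimal sketch of one way such moving triangulation could be carried out, assuming two range readings taken from two known sensor positions. The function name, coordinate convention, and example values are illustrative assumptions rather than details taken from the publication.

```python
import math

def locate_by_moving_triangulation(p1, r1, p2, r2):
    """Estimate the object position from two detection distances r1 and r2
    measured at two known sensor positions p1 and p2 (for example, the same
    side sensor before and after the vehicle has moved, or two adjacent
    side sensors). Returns the two candidate intersection points of the
    range circles, or None if the circles do not intersect. This is a
    generic circle-intersection sketch, not the exact procedure of the
    publication."""
    x1, y1 = p1
    x2, y2 = p2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return None
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))
    xm = x1 + a * (x2 - x1) / d
    ym = y1 + a * (y2 - y1) / d
    # Two mirror-image candidates; the one on the side the sensor faces is kept.
    return ((xm + h * (y2 - y1) / d, ym - h * (x2 - x1) / d),
            (xm - h * (y2 - y1) / d, ym + h * (x2 - x1) / d))

# Example: a side sensor reads 1.5 m, then 1.3 m after the vehicle has
# advanced 0.5 m along the x axis.
print(locate_by_moving_triangulation((0.0, 0.0), 1.5, (0.5, 0.0), 1.3))
```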
  • The movement detection unit 33 detects the movement of the "second object" in the detection areas Rfl, Rrl, Rfr, and Rrr of the side sensors SLf, SLb, SRf, and SRb based on the detection result of the surroundings monitoring camera 22 that constitutes the detection device.
  • The movement detection unit 33, for example, extracts feature points corresponding to the "second object" in the captured image based on the imaging data around the own vehicle Va, and detects movement of the "second object" based on the temporal change of the feature points and the movement state of the own vehicle Va. Extraction of feature points can be realized, for example, by using a Sobel filter based on first-order differentiation or a Laplacian filter based on second-order differentiation. The movement state of the own vehicle Va can be acquired, for example, based on the sensor outputs of the wheel speed sensor and the steering angle sensor. Note that the method of detecting movement of the "second object" is not limited to the above.
  • Detection of movement of the "second object" may also be implemented using, for example, optical flow, which detects movement of the object by vectorizing the movement of feature points between frames, or inter-frame differencing, which detects movement of the object from the difference between frames of the captured image.
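  • The following is a minimal sketch of camera-based movement detection of the kind described above, using sparse optical flow on tracked feature points (here, Shi-Tomasi corners stand in for the Sobel/Laplacian-based extraction mentioned in the text). The parameter values and the displacement threshold are illustrative assumptions, and compensation for the own vehicle's movement is omitted.

```python
import cv2
import numpy as np

def detect_moved_feature_points(prev_gray: np.ndarray, curr_gray: np.ndarray,
                                min_moved_px: float = 2.0):
    """Return feature points whose displacement between two grayscale frames
    exceeds a threshold. Sketch only; ego-motion compensation from the wheel
    speed and steering angle sensors is not included here."""
    # Feature points (corners) in the previous frame.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=7)
    if pts is None:
        return []
    # Track the points into the current frame (Lucas-Kanade optical flow).
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    moved = []
    for p0, p1, ok in zip(pts.reshape(-1, 2), nxt.reshape(-1, 2),
                          status.reshape(-1)):
        if ok and float(np.linalg.norm(p1 - p0)) > min_moved_px:
            moved.append((tuple(p0), tuple(p1)))
    return moved
```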
  • The object identification unit 34 identifies whether the "first object" detected by the object detection unit 32 is a "moving object" or a "stationary object".
  • When the movement detection unit 33 detects movement of the "second object" and the "first object" and the "second object" are the same, the object identification unit 34 identifies the "first object" as a "moving object".
  • The object identification unit 34 grasps the position of the "first object" from the detection result of the object detection unit 32, and grasps the position of the "second object" from the monitoring result of the surroundings monitoring camera 22.
  • When the position of the "first object" and the position of the "second object" substantially match, the object identification unit 34 determines that the "second object" and the "first object" are the same.
  • For example, as shown in FIG. 3, when the other vehicle Vb passes the own vehicle Va, the object detection unit 32 detects the other vehicle Vb as the "first object", and the movement detection unit 33 detects the movement of the other vehicle Vb as movement of the "second object". Then, since the position of the "first object" and the position of the "second object" substantially match, the object identification unit 34 identifies the other vehicle Vb, which is the "first object", as a "moving object".
  • the map generation unit 35 generates an obstacle map MP that defines the positional relationship between the vehicle Va and obstacles existing around the vehicle Va.
  • The map generation unit 35 generates the obstacle map MP based on the detection result of the "first object" by the object detection unit 32 and the identification result of the "first object" by the object identification unit 34, for example, using SLAM technology.
  • the map generator 35 generates, for example, a two-dimensional obstacle map MP or a three-dimensional obstacle map MP.
  • SLAM is an abbreviation for Simultaneous Localization and Mapping.
  • the map generator 35 excludes “moving objects” from the obstacle map MP. Specifically, the map generation unit 35 registers the “stationary object” identified by the object identification unit 34 in the obstacle map MP, and registers the “moving object” identified by the object identification unit 34 in the obstacle map MP. Do not register.
  • The support control unit 36 performs parking support control based on the information about the obstacles in the obstacle map MP so that the vehicle can be parked while avoiding the obstacles. For example, the support control unit 36 excludes moving objects from the obstacles, as objects that will already have moved and no longer exist at the time of parking, treats stationary objects among the obstacles as obstacles to be avoided, and calculates a movement route that avoids them. Then, in order to assist the vehicle in moving along the movement route, it outputs appropriate control signals to the HMIECU 40, the brake ECU 50, and the power train ECU 60 through the in-vehicle communication bus 70.
  • the HMIECU 40 has a display section 41 and an audio output section 42 such as a speaker, which constitute an HMI (abbreviation for Human Machine Interface).
  • the display unit 41 is a device that provides the user with information in a visual manner.
  • the display unit 41 is configured by, for example, a touch panel display in which a display function and an operation function are integrated, or a head-up display that projects information onto a transparent glass element.
  • the HMIECU 40 receives image data of the surroundings of the own vehicle Va captured by the surroundings monitoring camera 22, and controls the display unit 41 to display the data together with a virtual vehicle image showing the own vehicle Va.
  • The HMIECU 40, for example, performs processing to clearly indicate the planned parking position of the own vehicle Va in the bird's-eye view image showing the own vehicle Va, to indicate the movement route with an arrow, and to emphasize obstacles.
  • the HMIECU 40 controls various meter displays such as vehicle speed display and engine speed display.
  • The display unit 41 displays information related to parking assistance, and the audio output unit 42 issues an alarm sound or audio information related to parking assistance.
  • Through the display unit 41 or the audio output unit 42, the HMIECU 40 indicates a planned operation of the vehicle such as "Move forward" or "Reverse", and issues instructions for preparations for automatic parking such as "Set the shift position to 'D'".
  • The brake ECU 50 constitutes a braking control device that performs various brake controls: it automatically generates brake fluid pressure by driving an actuator for brake fluid pressure control, pressurizes the wheel cylinders, and thereby generates braking force. When the brake ECU 50 receives the control signal from the support control unit 36, it controls the braking force of each wheel so as to move the vehicle along the movement route.
  • the power train ECU 60 constitutes a driving force control device that performs various driving force controls, and generates a desired driving force by controlling the engine or motor rotation speed and controlling the transmission.
  • When the power train ECU 60 receives the control signal from the support control unit 36, it controls the driving force of the drive wheels so as to move the vehicle along the movement route.
  • the brake ECU 50 and the power train ECU 60 are included here as systems capable of performing automatic parking.
  • the HMIECU 40 is included in order to display a bird's-eye view image and display relating to parking assistance. However, these are not essential, and are selectively used as needed.
  • the parking assistance system 1 is configured as described above.
  • the parking assistance system 1 includes an object detection device 10 .
  • The parking assistance system 1 identifies whether an object existing around the own vehicle Va is a "moving object" or a "stationary object" through the object detection processing executed by the object detection device 10, and thereby acquires information about obstacles around the vehicle. In the parking assistance system 1, the support control unit 36 performs parking assistance control based on the information about the obstacles acquired by the object detection device 10 so that the vehicle can be parked while avoiding the obstacles.
  • The object detection device 10 of the present embodiment includes the search wave sensor 21, the surroundings monitoring camera 22, the storage unit 31, the object detection unit 32, the movement detection unit 33, the object identification unit 34, and the map generation unit 35. An overview of the object detection processing executed by the object detection device 10 will be described below with reference to the flowchart of FIG. 4.
  • the object detection device 10 performs object detection processing periodically or irregularly, for example, when a start switch such as an ignition switch of the own vehicle Va is turned on.
  • Each process shown in this flowchart is implemented by the corresponding functional unit of the object detection device 10. Further, each step for realizing this processing can also be understood as a step for realizing the object detection method.
  • In step S100, the object detection device 10 reads various information.
  • The object detection device 10 sequentially reads sensor outputs from, for example, the search wave sensor 21, the surroundings monitoring camera 22, the wheel speed sensor, the steering angle sensor, and the like.
  • In step S110, the object detection device 10 performs detection processing for the "first object". Specifically, the object detection device 10 detects, as the "first object", an object within the detection areas Rfl, Rrl, Rfr, and Rrr of the search wave sensor 21 based on the detection result of the search wave sensor 21 installed in the own vehicle Va. This processing is performed by the object detection unit 32 of the object detection device 10.
  • In step S120, the object detection device 10 performs movement detection processing for the "second object". Specifically, the object detection device 10 detects movement of the "second object" in the detection areas Rfl, Rrl, Rfr, and Rrr of the search wave sensor 21 based on the detection result of the "second object" from the surroundings monitoring camera 22. This processing is performed by the movement detection unit 33 of the object detection device 10.
  • In step S130, the object detection device 10 determines whether or not the object detection unit 32 has detected the "first object". If the "first object" is not detected, the object detection device 10 skips the subsequent processes and exits this processing. If the "first object" is detected, the object detection device 10 proceeds to step S140.
  • In step S140, the object detection device 10 determines whether or not the movement detection unit 33 has detected movement of the "second object". Specifically, the object detection device 10 determines whether movement of an object has been detected in the target areas RL and RR including the detection areas Rfl, Rrl, Rfr, and Rrr of the search wave sensor 21.
  • In step S150, the object detection device 10 determines whether or not the "first object" detected by the object detection unit 32 and the "second object" whose movement was detected by the movement detection unit 33 are the same object. For example, when the position of the "first object" ascertained from the detection result of the object detection unit 32 and the position of the "second object" ascertained from the monitoring result of the surroundings monitoring camera 22 substantially match, the object detection device 10 determines that the "second object" and the "first object" are the same object. Note that this determination process is not limited to the above; it may also be realized, for example, by grasping and comparing the outer shapes of the "first object" and the "second object".
  • If the "second object" and the "first object" are determined in step S150 to be the same object, the object detection device 10 proceeds to step S160 and identifies the "first object" as a "moving object".
  • If movement of the "second object" is not detected in step S140, or if the objects are determined in step S150 not to be the same, the object detection device 10 proceeds to step S170 and identifies the "first object" as a "stationary object". Note that each process from steps S130 to S170 is performed by the object identification unit 34.
  • In step S180, the object detection device 10 generates the obstacle map MP and exits the object detection processing. Specifically, the object detection device 10 registers obstacles identified as "stationary objects" by the object identification unit 34 in the obstacle map MP, and does not register obstacles identified as "moving objects" by the object identification unit 34 in the obstacle map MP. The object detection device 10 stores the obstacle map MP in the storage unit 31. Note that the process of step S180 is performed by the map generation unit 35.
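  • The following is a minimal sketch of the decision logic of steps S130 to S180 described above. The function and parameter names, the position-matching tolerance, and the use of a simple position comparison for the "same object" check are illustrative assumptions, not details specified in the publication.

```python
from typing import Optional, Tuple

Position = Tuple[float, float]  # (x, y) in metres, in the vehicle frame

def classify_first_object(first_obj_pos: Optional[Position],
                          moving_second_obj_pos: Optional[Position],
                          tol_m: float = 0.5) -> Optional[str]:
    """Classify the sonar-detected "first object" given the position of a
    camera-detected "second object" whose movement was observed (None if no
    movement was detected). Returns None when there is no first object."""
    if first_obj_pos is None:                      # S130: nothing detected
        return None
    if moving_second_obj_pos is not None:          # S140: movement detected
        dx = first_obj_pos[0] - moving_second_obj_pos[0]
        dy = first_obj_pos[1] - moving_second_obj_pos[1]
        if (dx * dx + dy * dy) ** 0.5 <= tol_m:    # S150: same object?
            return "moving object"                 # S160
    return "stationary object"                     # S170

# Example: the sonar sees an object at (1.4, 0.2) m and the camera sees a
# moving object at (1.5, 0.1) m, so the object is classified as moving and
# would not be registered in the obstacle map (S180).
print(classify_first_object((1.4, 0.2), (1.5, 0.1)))
```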
  • FIG. 5 is an example showing, in chronological order, the positional relationship between the own vehicle Va and a mobile object MO such as the other vehicle Vb when they pass each other on the road.
  • FIG. 6 is an example showing, in chronological order, the positional relationship between the own vehicle Va and an installation object OB when the own vehicle Va travels on the road past the installation object OB. A specific example of the object detection processing will be described below with reference to FIGS. 5 and 6.
  • When the own vehicle Va is in the position shown in the upper part of FIG. 5, the side wall SW provided along the road is detected by the left front side sensor SLf and the left rear side sensor SLb as the "first object". On the other hand, since the mobile object MO is located outside the range of the target area RL of the surroundings monitoring camera 22, the object detection device 10 does not detect movement of the mobile object MO. Therefore, the object detection device 10 identifies the side wall SW, which is the "first object", as a "stationary object".
  • The left front side sensor SLf then detects the mobile object MO as the "first object".
  • The object detection device 10 also detects the movement of the mobile object MO as movement of the "second object".
  • The "first object" and the "second object" are the same mobile object MO, and have the same position, outer shape, and the like. Therefore, the object detection device 10 identifies the mobile object MO, which is the "first object", as a "moving object".
  • Next, the left rear side sensor SLb detects the mobile object MO as the "first object".
  • The object detection device 10 again detects the movement of the mobile object MO as movement of the "second object".
  • The "first object" and the "second object" are the same mobile object MO, and have the same position, outer shape, and the like. Therefore, the object detection device 10 also identifies the mobile object MO, which is the "first object", as a "moving object".
  • In the case shown in FIG. 6 as well, the side wall SW provided along the road is detected by the left front side sensor SLf and the left rear side sensor SLb as the "first object".
  • Since the installation object OB is located outside the range of the target area RL of the surroundings monitoring camera 22, the object detection device 10 does not detect the installation object OB at this point. Therefore, the object detection device 10 identifies the side wall SW, which is the "first object", as a "stationary object".
  • The left front side sensor SLf then detects the installation object OB as the "first object".
  • The object detection device 10 merely detects the installation object OB as the "second object", and movement of the "second object" is not detected. Therefore, the object detection device 10 identifies the installation object OB, which is the "first object", as a "stationary object".
  • Next, the left rear side sensor SLb detects the installation object OB as the "first object".
  • Again, the object detection device 10 merely detects the installation object OB as the "second object", and movement of the "second object" is not detected. Therefore, the object detection device 10 identifies the installation object OB, which is the "first object", as a "stationary object".
  • After that, the object detection device 10 generates the obstacle map MP.
  • The obstacle map MP is constructed by adding information corresponding to "stationary objects" to a grid map obtained by dividing the area around the own vehicle Va into small areas (for example, a mesh).
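  • The following is a minimal sketch of such a grid-based obstacle map, in which only objects identified as "stationary objects" are registered. The map size, cell resolution, and class and method names are illustrative assumptions and are not specified in the publication.

```python
import numpy as np

class GridObstacleMap:
    """Grid map of the area around the vehicle; each cell records whether a
    stationary obstacle has been registered there."""

    def __init__(self, size_m: float = 20.0, cell_m: float = 0.2):
        self.cell_m = cell_m
        self.half_size_m = size_m / 2.0            # vehicle at the map centre
        n = int(size_m / cell_m)
        self.cells = np.zeros((n, n), dtype=bool)

    def register(self, x_m: float, y_m: float, kind: str) -> None:
        # Only stationary objects are entered in the map; moving objects are
        # excluded, as described for the map generation unit 35.
        if kind != "stationary object":
            return
        i = int((x_m + self.half_size_m) / self.cell_m)
        j = int((y_m + self.half_size_m) / self.cell_m)
        if 0 <= i < self.cells.shape[0] and 0 <= j < self.cells.shape[1]:
            self.cells[i, j] = True

# Example: a side wall point 1.5 m to the left is registered, while a passing
# vehicle identified as a moving object is not.
mp = GridObstacleMap()
mp.register(0.0, 1.5, "stationary object")
mp.register(2.0, 1.5, "moving object")
print(int(mp.cells.sum()))  # -> 1
```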
  • As described above, the object detection device 10 detects, as the "first object", an object within the detection areas Rfl, Rrl, Rfr, and Rrr of the search wave sensor 21 based on the detection result of the search wave sensor 21 installed in the own vehicle Va.
  • The object detection device 10 also detects, as the "second object", an object in the target areas RL and RR including the detection areas Rfl, Rrl, Rfr, and Rrr of the search wave sensor 21 based on the detection result of the surroundings monitoring camera 22, and detects movement of the "second object" in the target areas RL and RR. Then, when movement of the "second object" is detected and the "first object" and the "second object" are the same, the object detection device 10 identifies the "first object" as a "moving object".
  • In this way, the detection of an object based on the detection result of the search wave sensor 21 and the detection of movement of the object based on the detection result of the surroundings monitoring camera 22 can be performed in parallel.
  • Therefore, since no standby state occurs as in the conventional technology, it is possible to identify in a short period of time whether or not an object existing around the own vehicle Va is a "moving object".
  • When the movement detection unit 33 does not detect movement of the "second object", the object identification unit 34 identifies the "first object" as a "stationary object". According to this, it is possible to identify in a short time whether an object existing around the own vehicle Va is a "stationary object".
  • When the movement detection unit 33 detects movement of the "second object" but the "first object" and the "second object" are different, the object identification unit 34 also identifies the "first object" as a "stationary object". This also makes it possible to identify in a short period of time whether or not an object existing around the own vehicle Va is a "stationary object".
  • the object detection device 10 includes a map generator 35 that generates an obstacle map MP that defines the positional relationship between the vehicle and obstacles existing around the own vehicle Va.
  • the map generation unit 35 registers the “stationary object” identified by the object identification unit 34 in the obstacle map MP, and does not register the “moving object” identified by the object identification unit 34 in the obstacle map MP. According to this, a "stationary object” among the objects existing around the host vehicle Va can be quickly reflected in the obstacle map MP as an obstacle.
  • The search wave sensor 21 includes a plurality of ultrasonic sensors arranged at predetermined intervals along the traveling direction of the vehicle Va on the side of the vehicle Va.
  • Of the plurality of ultrasonic sensors, one of the ultrasonic sensors adjacent to each other in the traveling direction of the vehicle Va is assumed to be the first sensor, and the other is assumed to be the second sensor.
  • The surroundings monitoring camera 22 sequentially detects the second object by using the areas including the detection area of the first sensor and the detection area of the second sensor as the target areas RL and RR.
  • According to this, the means for detecting the "first object" becomes redundant, so it is possible to appropriately specify whether the "first object" is a "stationary object" or a "moving object". For example, even if another vehicle Vb that is temporarily stopped while the own vehicle Va and the other vehicle Vb pass each other is specified as a "stationary object" based on the detection result of the first sensor, the identification can be changed from "stationary object" to "moving object" based on the detection result of the second sensor.
  • the target areas RL and RR for the perimeter monitoring camera 22 to detect the "second object" include the detection area of the first sensor and the detection area of the second sensor. According to this, the device configuration of the object detection device 10 can be simplified as compared with the case where a camera is provided for each of the detection area of the first sensor and the detection area of the second sensor.
  • The detection device is configured by the surroundings monitoring camera 22 in the above description, but the detection device is not limited to this and may be composed of a device other than the surroundings monitoring camera 22 as long as it can detect the movement of an object.
  • It is desirable that the object detection device 10 identifies the "first object" as a "stationary object" when the movement detection unit 33 does not detect movement of the "second object", but this is not a limitation. For example, when the movement detection unit 33 does not detect movement of the "second object", the object detection device 10 may identify each of the "first object" and the "second object" as a "stationary object".
  • It is likewise desirable that the object detection device 10 identifies the "first object" as a "stationary object" when the movement detection unit 33 detects movement of the "second object" but the "first object" and the "second object" are different, but this is also not a limitation. For example, in that case, the object detection device 10 may identify the "first object" as a "stationary object" while identifying the "second object" as a "moving object".
  • In the above description, the object detection device 10 creates the obstacle map MP after identifying the "first object" as either a "stationary object" or a "moving object", but this is not a limitation. For example, instead of creating the obstacle map MP, the object detection device 10 may notify the user of the detection of a "moving object" through at least one of the display unit 41 and the audio output unit 42.
  • In the above description, two ultrasonic sensors are installed on each of the left and right sides of the vehicle Va, but one ultrasonic sensor may be installed on each side.
  • The search wave sensor 21 may also be composed of, for example, one ultrasonic sensor.
  • the object detection device 10 may detect the "first object" based on the output of a millimeter wave radar or LiDAR instead of the ultrasonic sensor.
  • The target areas RL and RR of the surroundings monitoring camera 22 are desirably set so as to include the respective detection areas of the ultrasonic sensors adjacent in the traveling direction of the own vehicle Va, but this is not a limitation.
  • The target areas RL and RR of the surroundings monitoring camera 22 may include the detection area of one of the adjacent ultrasonic sensors and not include the detection area of the other sensor.
  • The object detection device 10 may also identify whether an object around the own vehicle Va is a "moving object" or a "stationary object" when the own vehicle Va moves backward (so-called reversing).
  • In the above description, the object detection device 10 of the present disclosure is incorporated in the parking assistance system 1, but the object detection device 10 is not limited to this and may be incorporated in another system such as a collision prevention system. That is, the object detection device 10 can be applied to applications other than those described above.
  • the vehicle's external environment information is obtained from a sensor
  • The controller and techniques of the present disclosure may be implemented by a dedicated computer provided by configuring a processor and memory programmed to perform one or more functions embodied by a computer program.
  • The controller and techniques of the present disclosure may also be implemented by a dedicated computer provided by configuring the processor with one or more dedicated hardware logic circuits.
  • Alternatively, the controller and techniques of the present disclosure may be implemented by one or more dedicated computers configured as a combination of a processor and memory programmed to perform one or more functions and a processor configured by one or more dedicated hardware logic circuits.
  • the computer program may also be stored as computer-executable instructions on a computer-readable non-transitional tangible recording medium.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The present disclosure relates to an object detection device (10) comprising an object detection unit (32) that detects, as a first object, an object within the range of a detection area (Rfl, Rrl, Rfr, Rrr) of a search wave sensor (21) installed in a vehicle, based on a detection result from the search wave sensor. The object detection device comprises a movement detection unit (33) that detects movement of a second object in a target area (RL, RR) based on a detection result from a detection device (22) that sequentially detects, as the second object, an object within the range of the target area, the target area being set to an area that includes the detection area. The object detection device comprises an object identification unit (34) that identifies the first object as a moving object if movement of the second object is detected by the movement detection unit and the first object and the second object are the same.
PCT/JP2022/029149 2021-08-27 2022-07-28 Object detection device and object detection method WO2023026760A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021139030A JP2023032736A (ja) 2021-08-27 2021-08-27 物体検知装置、物体検知方法
JP2021-139030 2021-08-27

Publications (1)

Publication Number Publication Date
WO2023026760A1 true WO2023026760A1 (fr) 2023-03-02

Family

ID=85323039

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/029149 WO2023026760A1 (fr) Object detection device and object detection method

Country Status (2)

Country Link
JP (1) JP2023032736A (fr)
WO (1) WO2023026760A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013020458A (ja) * 2011-07-12 2013-01-31 Daihatsu Motor Co Ltd 車載用物体判別装置
US20160116593A1 (en) * 2014-10-23 2016-04-28 Hyundai Mobis Co., Ltd. Object detecting apparatus, and method of operating the same
JP2018010466A (ja) * 2016-07-13 2018-01-18 株式会社Soken 物体検知装置
JP2021064098A (ja) * 2019-10-11 2021-04-22 株式会社デンソー 制御装置

Also Published As

Publication number Publication date
JP2023032736A (ja) 2023-03-09


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22861051

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE