WO2018088262A1 - Parking frame recognition device - Google Patents

Parking frame recognition device

Info

Publication number
WO2018088262A1
Authority
WO
WIPO (PCT)
Prior art keywords
parking frame
line
parking
same
shape
Prior art date
Application number
PCT/JP2017/039137
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
大輔 杉浦
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー filed Critical 株式会社デンソー
Priority to CN201780069123.2A priority Critical patent/CN109923586B/zh
Priority to DE112017005670.5T priority patent/DE112017005670T5/de
Publication of WO2018088262A1 publication Critical patent/WO2018088262A1/ja

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/586Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of parking space
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/48Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/14Traffic control systems for road vehicles indicating individual free spaces in parking areas
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This disclosure relates to a technique for recognizing a parkable frame, that is, a parking frame in which the host vehicle can park.
  • As a technique for recognizing a parking frame, Patent Document 1 proposes a technique that determines that a frame is a parking frame when, of the white lines constituting a parking frame observable from the host vehicle, the white line farther from the host vehicle is longer than the white line closer to it.
  • One aspect of the present disclosure provides a technique that allows a parking frame recognition device, which recognizes a parkable frame in which the host vehicle can park, to recognize the parkable frame more accurately.
  • The parking frame recognition device includes an image acquisition unit, a line extraction unit, and a parkable frame recognition unit.
  • The image acquisition unit is configured to acquire a plurality of captured images captured by a plurality of imaging devices arranged at different positions on the host vehicle.
  • The line extraction unit is configured to extract a parking frame from the plurality of captured images and to extract a target line, i.e., the line that defines the end of the parking frame farther from the host vehicle.
  • The parkable frame recognition unit is configured to determine, for the same parking frame extracted from the plurality of captured images, whether or not the shape of the target line is the same, and to recognize a parking frame whose target line is determined to have the same shape as a parkable frame.
  • This utilizes the property that, when the target line is partly hidden by a parked vehicle, the shape of the same target line observed from different positions appears different.
  • Whether the shapes of the target lines are the same can be judged by how well comparison parameters such as the number of line corners, the line length, the aspect ratio, and the edge curvature match.
  • The degree of matching can be determined, for example, by comparing the ratio of the comparison parameters with a threshold value.
  • Therefore, the parkable frame can be accurately recognized.
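As a concrete illustration of this parameter comparison, the sketch below judges two observed target lines as "the same shape" when every comparison parameter ratio exceeds a threshold. The parameter names and the 0.9 ratio threshold are illustrative assumptions, not values taken from the patent.

```python
def shapes_match(params_a, params_b, ratio_threshold=0.9):
    """Judge whether two observations of a target line have the same shape
    by comparing parameter ratios (corner count, length, aspect ratio, edge
    curvature) against a threshold. Illustrative sketch only."""
    for key in params_a:
        a, b = params_a[key], params_b[key]
        if a == 0 and b == 0:          # parameter absent in both views
            continue
        ratio = min(a, b) / max(a, b)  # ratio lies in (0, 1]
        if ratio < ratio_threshold:
            return False
    return True

# Example: lengths observed by two cameras differ by only about 2 %.
front = {"length": 5.0, "corners": 2, "aspect_ratio": 33.3}
left = {"length": 4.9, "corners": 2, "aspect_ratio": 32.7}
print(shapes_match(front, left))  # True: all ratios exceed the threshold
```

If a parked vehicle hid part of the line in one view, the length ratio would drop well below the threshold and the frame would be rejected.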
  • The imaging system 1 of this embodiment is mounted on a vehicle such as a passenger car and includes a control unit 10. It may further include a front camera 4F, a rear camera 4B, a right camera 4R, a left camera 4L, a display unit 30, a vehicle control unit 32, and the like.
  • A vehicle on which the imaging system 1 is mounted is also referred to as the host vehicle.
  • The front camera 4F and the rear camera 4B image the roads ahead of and behind the host vehicle, and are attached to its front and rear. Likewise, the right camera 4R and the left camera 4L image the roads to the right and left of the host vehicle, and are attached to its right and left sides. That is, the front camera 4F, the rear camera 4B, the right camera 4R, and the left camera 4L are arranged at different positions on the host vehicle. Here, "different positions" means that the imaging devices are not arranged on the same vertical line.
  • The control unit 10 generates, from the images captured by the cameras 4F, 4B, 4R, and 4L, a bird's-eye image in which the road around the vehicle is viewed from vertically above. The generated bird's-eye image is displayed on the display unit 30, which is configured with a liquid crystal display or the like and disposed in the vehicle interior.
  • A parking frame represents an area between at least one line having a width within a preset range and an object that has a side or line parallel to that line at a position separated from it by a distance corresponding to the vehicle width.
  • A parkable frame means a parking frame in which the host vehicle can park.
  • Here, "parallel" includes substantially parallel.
  • The "object having a side or line parallel to the line" includes, for example, a curbstone, another line, a wall, a tree, a guardrail, and the like.
  • The control unit 10 includes an imaging signal input unit 12, a detection signal input unit 14, a memory 16, a display control unit 18, and an image processing unit 20.
  • The imaging signal input unit 12 captures imaging signals from the front camera 4F, the rear camera 4B, the right camera 4R, and the left camera 4L, and inputs them to the image processing unit 20 as captured image data.
  • The detection signal input unit 14 takes in detection signals from the wheel speed sensor 6, which detects the rotational speed of each wheel of the host vehicle, and from the steering angle sensor 8, which detects the steering angle, and inputs them to the image processing unit 20 as wheel speed data and steering angle data.
  • The image processing unit 20 is mainly configured by a known microcomputer having a CPU 18 and semiconductor memory (hereinafter, memory 16) such as RAM, ROM, and flash memory.
  • The various functions of the image processing unit 20 are realized by the CPU 18 executing a program stored in a non-transitory tangible recording medium.
  • In this example, the memory 16 corresponds to the non-transitory tangible recording medium that stores the program.
  • Here, a non-transitory tangible recording medium means a recording medium excluding electromagnetic waves (i.e., transitory signals).
  • The number of microcomputers constituting the image processing unit 20 may be one or more.
  • The image processing unit 20 includes, as functional configurations realized by the CPU 18 executing a program, a line detection unit 22, a position correspondence unit 24, a frame estimation unit 26, and a tracking unit 28.
  • The method of realizing these elements constituting the image processing unit 20 is not limited to software; some or all of the elements may be realized using one or more pieces of hardware.
  • For example, when these functions are realized by an electronic circuit, the circuit may be a digital circuit including many logic circuits, an analog circuit, or a combination thereof.
  • The line detection unit 22 performs processing for detecting lines such as white lines and yellow lines by applying known image processing, such as the Hough transform, to each captured image.
  • The position correspondence unit 24 determines whether lines detected in different captured images are the same line: correspondence information, in which the imaging range shared with other cameras and the corresponding coordinates are set for each camera, is stored in the memory 16, and the determination is made using this correspondence information.
  • The frame estimation unit 26 performs processing for estimating one or more parking frames present in the captured images and estimating parkable frames from among them.
  • The tracking unit 28 performs processing for tracking an object in the captured images by associating the amount of movement of the host vehicle with the amount of movement of the object in the captured images.
  • In this way the parking frame is tracked and its position is recognized; information on the position of the parking frame is sent to the vehicle control unit 32.
  • The tracking unit 28 also generates an image showing the parking frame while taking the amount of movement of the host vehicle into account, and this image is output to the display control unit 18.
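The association between the host vehicle's movement and an object's position can be pictured as a simple dead-reckoning update: a point observed in the previous vehicle coordinate frame is re-expressed in the current one using the displacement and heading change derived from the wheel speed and steering angle data. This is an illustrative assumption; the patent does not specify the exact computation.

```python
import math

def track_point(point, dx, dy, dyaw):
    """Re-express a point from the previous vehicle frame in the current
    vehicle frame, given the vehicle's translation (dx, dy) and heading
    change dyaw in radians. Illustrative dead-reckoning sketch."""
    # Shift the origin to the new vehicle position, then undo the rotation.
    px, py = point[0] - dx, point[1] - dy
    c, s = math.cos(dyaw), math.sin(dyaw)
    return (c * px + s * py, -s * px + c * py)

# The vehicle moved 1 m forward (x axis) without turning: a frame corner
# that was 3 m ahead is now 2 m ahead.
print(track_point((3.0, 0.0), 1.0, 0.0, 0.0))  # (2.0, 0.0)
```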
  • The display control unit 18 converts the image sent from the image processing unit 20 into a video signal that can be displayed on the display unit 30, and sends it to the display unit 30.
  • The vehicle control unit 32 receives the position of the parking frame, generates a trajectory for parking the host vehicle in the parking frame, and controls the steering angle and so on to move the host vehicle along the trajectory.
  • The parkable frame recognition process is started, for example, when the vehicle's power is turned on, and is then repeatedly executed.
  • Alternatively, the parkable frame recognition process may be activated every imaging cycle of the cameras.
  • As the image captured by the front camera 4F, a captured image as illustrated in FIG. 3 is obtained, for example; as the image captured by the left camera 4L, a captured image as illustrated in FIG. 4 is obtained.
  • Two or more of the captured images obtained by the cameras 4F, 4B, 4R, and 4L mounted on the host vehicle can be used in any combination. However, to simplify the description, this embodiment describes only an example in which the images captured by the front camera 4F and the left camera 4L are used in combination.
  • Next, lines such as white lines and yellow lines are detected in each acquired captured image.
  • Specifically, edges, i.e., boundaries of brightness or color, are detected among the many pixels constituting the captured image, and processing such as the well-known Hough transform is applied to the edges to detect all the lines present in the captured image. A line here has a width and includes, for example, paint on the road.
  • A detected line is called a detection line.
  • Among the white lines 40 indicating the parking frames, the lines 41, 42, and 43 are all detection lines.
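To make the Hough step concrete, here is a toy, pure-Python accumulator over the (ρ, θ) parameter space: each edge point votes for every line that could pass through it, and bins with enough votes are reported as lines. This is a didactic sketch only; real implementations operate on binarized edge images with optimized library routines.

```python
import math
from collections import Counter

def hough_lines(edge_points, rho_step=1.0, theta_steps=180, min_votes=3):
    """Minimal Hough transform: each edge pixel votes into (rho, theta)
    bins; bins with at least min_votes votes are returned as lines."""
    votes = Counter()
    for x, y in edge_points:
        for t in range(theta_steps):
            theta = math.pi * t / theta_steps
            rho = x * math.cos(theta) + y * math.sin(theta)
            votes[(round(rho / rho_step), t)] += 1
    return [(r * rho_step, math.pi * t / theta_steps)
            for (r, t), n in votes.items() if n >= min_votes]

# Five collinear edge points on the vertical line x = 4 all vote into the
# bin (rho = 4, theta = 0), so that line is among the detections.
pts = [(4, y) for y in range(5)]
print((4.0, 0.0) in hough_lines(pts, min_votes=5))  # True
```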
  • In S130, it is determined whether or not there are a plurality of detection lines in each captured image.
  • Since an area between two lines is recognized as a parking frame, no parking frame is recognized when fewer than two lines are detected. S130 can therefore be said to determine whether a parking frame may exist.
  • If there are not a plurality of detection lines, the parkable frame recognition process is terminated. If there are a plurality of detection lines, then in S140 the entire captured image, or at least the detection lines in it, is converted into a planar coordinate system. This conversion may be realized by well-known processing for generating a bird's-eye image that looks down on the captured image from vertically above.
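The conversion of image pixels into a planar (ground) coordinate system is commonly done with a 3×3 homography. The sketch below shows only the projective mapping itself; in practice the matrix comes from camera calibration, which the patent does not detail, so the identity matrix here is a placeholder assumption.

```python
def apply_homography(H, x, y):
    """Map an image pixel (x, y) to ground-plane coordinates using a 3x3
    homography H given as a list of rows, with projective division."""
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return (u / w, v / w)

# With the identity homography the point is unchanged: a quick sanity
# check of the projective division.
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(apply_homography(I, 320, 240))  # (320.0, 240.0)
```

Applying such a mapping to every detection line endpoint yields the bird's-eye geometry on which the frame spacing can be measured in metres.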
  • In S150, the parking frames are estimated.
  • Specifically, parking frames are first extracted from the plurality of captured images.
  • A parking frame is an area sandwiched between at least two lines; areas in which the interval between the two lines falls within a preset range, set based on the width of the vehicle, are extracted.
  • Here, all parking frames present in the captured images are extracted.
  • For example, the area between the white lines 41 and 42 and the area between the white lines 42 and 43 are extracted as parking frames.
  • Although the area between the white lines 41 and 43 could also be extracted as a parking frame, this process recognizes as parking frames only areas where the distance between the white lines is within the preset range, so that area is excluded.
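The spacing-based extraction can be sketched as pairing detected lines whose lateral gap falls within a range derived from the vehicle width. The lateral positions and gap bounds below are illustrative assumptions.

```python
def extract_parking_frames(line_xs, min_gap=2.0, max_gap=3.5):
    """Return candidate parking frames as pairs of lines whose lateral
    spacing (metres) lies within a range based on the vehicle width."""
    xs = sorted(line_xs)
    frames = []
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if min_gap <= xs[j] - xs[i] <= max_gap:
                frames.append((xs[i], xs[j]))
    return frames

# Lines 41, 42, 43 at lateral positions 0, 2.5 and 5.0 m: the 0-5.0 pair
# is excluded because its spacing exceeds the range, as in the text.
print(extract_parking_frames([0.0, 2.5, 5.0]))  # [(0.0, 2.5), (2.5, 5.0)]
```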
  • Then, for each parking frame, the target line, i.e., the line that defines the end of the parking frame farther from the host vehicle, is extracted.
  • Here, the white line 42 is the target line of the parking frame between the white lines 41 and 42, and the white line 43 is the target line of the parking frame between the white lines 42 and 43.
  • A parking frame candidate includes parkable frames but is not limited to them; it refers to parking frames in general, including parking frames in which another vehicle is already parked.
  • If there is no parking frame candidate, the process proceeds to S240. If there is a parking frame candidate, it is determined in S210 whether or not the shape of the target line is the same in an imaging region common to the plurality of captured images.
  • Specifically, the same parking frame observed from different positions is extracted from the plurality of captured images, and the shapes of the same target line constituting that parking frame are compared.
  • The same parking frame is identified by specifying, according to the conditions under which the captured images were obtained (for example, the offset between the camera positions), a common area in which the same object appears in the plurality of captured images, and by extracting the parking frames in each captured image that fall within that common area.
  • In the present embodiment, the lengths of the target lines are compared.
  • The length of a line means its length in the longitudinal direction, where the longitudinal direction is the direction orthogonal to the line width direction.
  • When the target line, such as the white line 43, is observed over a relatively wide range without being hidden by a vehicle, the difference between the lengths observed in the captured images is small.
  • If the difference in the lengths of the target lines is within a preset threshold value, the shapes are recognized as the same. If the shapes of the target lines are determined to be the same in S210, the parking frame is recognized in S220 as a parkable frame. If the shapes are not the same, the parking frame is recognized in S230 as not being a parkable frame.
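The S210–S230 decision thus reduces to a threshold test on the length difference. A minimal sketch follows; the 0.3 m threshold is an assumed value, not one given in the patent.

```python
def is_parkable(length_a, length_b, max_diff=0.3):
    """S210-S230 decision sketch: the same frame's target line is measured
    in two captured images; the frame is recognized as parkable when the
    lengths agree within a preset threshold (metres, assumed value)."""
    return abs(length_a - length_b) <= max_diff

# An empty space: both cameras see the full far line (5.0 m vs 4.9 m).
print(is_parkable(5.0, 4.9))  # True
# An occupied space: a parked vehicle hides part of the line in one view.
print(is_parkable(5.0, 1.8))  # False
```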
  • As described above, the image processing unit 20 acquires a plurality of captured images captured by a plurality of imaging devices arranged at different positions on the host vehicle. The image processing unit 20 then extracts parking frames from the plurality of captured images and extracts, for each, the target line that defines the end of the parking frame farther from the host vehicle. Further, the image processing unit 20 determines whether or not the shape of the target line is the same for the same parking frame extracted from the plurality of captured images, and recognizes a parking frame whose target line is determined to have the same shape as a parkable frame.
  • Thus, the parkable frame can be accurately recognized.
  • Also, the image processing unit 20 determines whether the shapes of the target lines are the same by determining whether their lengths are the same. Since the shape is judged by comparing lengths, the parkable frame can be recognized with a simpler configuration than one that compares the shape of the entire line.
  • The image processing unit 20 extracts an area sandwiched between at least two lines as a parking frame.
  • This allows the parking frame to be extracted using known white line recognition techniques, so the parking frame can be recognized with a simpler configuration.
  • Further, the image processing unit 20 converts each captured image into a planar coordinate system, determines, for the same parking frame extracted from the converted captured images, whether or not the shape of the target line is the same, and recognizes a parking frame whose target line is determined to have the same shape as a parkable frame.
  • In the above embodiment, whether or not the shapes of the target lines are the same is determined based on their lengths, but the present disclosure is not limited to this.
  • For example, it may be determined whether the overall shape of the target line is the same, or whether the shape of the end of the target line farther from the host vehicle is the same.
  • In the latter case, the shapes of the white line 42 in region [A] and region [B] are compared.
  • When a vehicle is parked in the frame, the end of the white line 42 far from the host vehicle is hidden by the parked vehicle, and which part of the white line 42 is hidden changes depending on the position of the camera,
  • so the shape of the end far from the host vehicle changes between the captured images.
  • When the white line 42 is not hidden, the shape of its end far from the host vehicle does not change. It can therefore be determined whether a parking frame is a parkable frame based on whether the shape of the far-side end of the target line is the same.
  • With this configuration, since the shape of the target line is judged by comparing only the shape of its far-side end, the parkable frame can be recognized with a simpler configuration than one that compares the shape of the entire line.
  • In the imaging system 1, when determining whether a parking frame is a parkable frame, the determination may use only the parkable frame recognition process described above, or may combine this process with a well-known process. For example, parkable frame candidates may be tentatively extracted by a known process, with the parkable frame recognition process used for the final determination. Alternatively, a parking frame may be determined to be a parkable frame when at least one, or a majority, of the well-known process and the parkable frame recognition process determine it to be parkable.
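The combination policies named here ("at least one" or "a majority" of several processes) can be sketched as a simple vote over boolean outcomes:

```python
def combined_decision(results, mode="any"):
    """Combine the outcomes of several recognition processes.
    mode='any' accepts if at least one process accepts;
    mode='majority' requires more than half. Illustrative sketch."""
    if mode == "any":
        return any(results)
    return sum(results) > len(results) / 2

# One of two processes accepts -> parkable under the 'any' policy.
print(combined_decision([True, False], mode="any"))  # True
# One of three accepts -> not parkable under the 'majority' policy.
print(combined_decision([True, False, False], mode="majority"))  # False
```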
  • A plurality of functions of one constituent element in the embodiment may be realized by a plurality of constituent elements, or a single function of one constituent element may be realized by a plurality of constituent elements. Conversely, a plurality of functions of a plurality of constituent elements may be realized by one constituent element, or one function realized by a plurality of constituent elements may be realized by one constituent element. Part of the configuration of the above embodiment may also be omitted.
  • The present disclosure can also be realized in various forms other than the device described above, such as a system having the parking frame recognition device as a component of the imaging system 1, a program for causing a computer to function as the imaging system 1, a non-transitory tangible recording medium such as a semiconductor memory storing the program, and a parking frame recognition method.
  • In the above embodiment, the image processing unit 20 of the imaging system 1 corresponds to the parking frame recognition device in the present disclosure.
  • Among the processes executed by the image processing unit 20, the process of S110 corresponds to the image acquisition unit in the present disclosure,
  • the processes of S120 and S150 correspond to the line extraction unit,
  • the process of S140 corresponds to the coordinate conversion unit,
  • and the processes of S210 and S220 correspond to the parkable frame recognition unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
PCT/JP2017/039137 2016-11-10 2017-10-30 駐車枠認識装置 WO2018088262A1 (ja)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780069123.2A CN109923586B (zh) 2016-11-10 2017-10-30 停车框识别装置
DE112017005670.5T DE112017005670T5 (de) 2016-11-10 2017-10-30 Parklücken-Erkennungsvorrichtung

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016219644A JP6677141B2 (ja) 2016-11-10 2016-11-10 駐車枠認識装置
JP2016-219644 2016-11-10

Publications (1)

Publication Number Publication Date
WO2018088262A1 true WO2018088262A1 (ja) 2018-05-17

Family

ID=62110479

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/039137 WO2018088262A1 (ja) 2016-11-10 2017-10-30 駐車枠認識装置

Country Status (4)

Country Link
JP (1) JP6677141B2 (zh)
CN (1) CN109923586B (zh)
DE (1) DE112017005670T5 (zh)
WO (1) WO2018088262A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7466324B2 (ja) * 2020-02-12 2024-04-12 フォルシアクラリオン・エレクトロニクス株式会社 画像処理装置及び画像処理方法
JP7482054B2 (ja) 2020-02-27 2024-05-13 フォルシアクラリオン・エレクトロニクス株式会社 画像処理装置及び画像処理方法

Citations (1)

Publication number Priority date Publication date Assignee Title
JP2016016681A (ja) * 2014-07-04 2016-02-01 クラリオン株式会社 駐車枠認識装置

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
JP5287344B2 (ja) * 2009-02-26 2013-09-11 日産自動車株式会社 駐車支援装置及び障害物検知方法
JP6152261B2 (ja) * 2012-11-27 2017-06-21 クラリオン株式会社 車載用駐車枠認識装置
KR101470240B1 (ko) * 2013-11-14 2014-12-08 현대자동차주식회사 주차 영역 검출 장치 및 그 방법
JP2016219644A (ja) 2015-05-22 2016-12-22 日東工業株式会社 電気機器収納用箱

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
JP2016016681A (ja) * 2014-07-04 2016-02-01 クラリオン株式会社 駐車枠認識装置

Also Published As

Publication number Publication date
JP2018077705A (ja) 2018-05-17
DE112017005670T5 (de) 2019-08-08
JP6677141B2 (ja) 2020-04-08
CN109923586A (zh) 2019-06-21
CN109923586B (zh) 2023-02-28

Similar Documents

Publication Publication Date Title
JP6197291B2 (ja) 複眼カメラ装置、及びそれを備えた車両
JP6649738B2 (ja) 駐車区画認識装置、駐車区画認識方法
CN107273788B (zh) 在车辆中执行车道检测的成像***与车辆成像***
JP7206583B2 (ja) 情報処理装置、撮像装置、機器制御システム、移動体、情報処理方法およびプログラム
JP2013190421A (ja) 車両において通行物***置検出を向上する方法
WO2014002692A1 (ja) ステレオカメラ
JP6617150B2 (ja) 物体検出方法及び物体検出装置
KR20170055738A (ko) 영상 기반 주행 차로 판단 장치 및 방법
WO2018088262A1 (ja) 駐車枠認識装置
JP6483360B2 (ja) 対象物認識装置
WO2011016257A1 (ja) 車両用距離算出装置
JP5097681B2 (ja) 地物位置認識装置
US20110019000A1 (en) Vehicular image processing device and vehicular image processing program
JP2018074411A (ja) 物体検出装置及び物体検出方法
WO2018088263A1 (ja) 駐車枠認識装置
JP4704998B2 (ja) 画像処理装置
JP6032141B2 (ja) 走行路面標示検知装置および走行路面標示検知方法
WO2022009537A1 (ja) 画像処理装置
US9030560B2 (en) Apparatus for monitoring surroundings of a vehicle
JP5785515B2 (ja) 歩行者検出装置及び方法、並びに車両用衝突判定装置
JP6729358B2 (ja) 認識装置
WO2018097269A1 (en) Information processing device, imaging device, equipment control system, mobile object, information processing method, and computer-readable recording medium
EP2919191B1 (en) Disparity value deriving device, equipment control system, movable apparatus, robot, and disparity value producing method
JP4876277B2 (ja) 車両用画像処理装置、車両、及び車両用画像処理プログラム
JP7322651B2 (ja) 障害物認識装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17868804

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17868804

Country of ref document: EP

Kind code of ref document: A1