CN109435942B - Information fusion-based parking space line and parking space recognition method and device - Google Patents
- Publication number: CN109435942B (application CN201811283453.4A)
- Authority
- CN
- China
- Prior art keywords
- parking space
- line
- vehicle
- inter
- point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/06—Automatic manoeuvring for parking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/93—Sonar systems specially adapted for specific applications for anti-collision purposes
- G01S15/931—Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/607—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/806—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
Abstract
The invention discloses a method and a device for recognizing parking space lines and parking spaces based on information fusion. The parking space line and parking space recognition method comprises the following steps: a parking space corner point is recognized from a bird's-eye view of the side of the vehicle body, and its coordinates are recorded at the moment the corner point and the camera are at corresponding positions; this corner point is taken as the first parking space corner point. The ultrasonic radar of the vehicle then detects whether an obstacle exists in the lateral direction of the vehicle facing the first corner point; if so, the method returns to the previous step, otherwise it proceeds to the next step. A second parking space corner point is obtained in the same way. Finally, the approximate width of the parking space and the lateral distance between the vehicle and the parking space are calculated from the two corner points. The invention can detect and recognize parking spaces with standard parking space mark lines: the ultrasonic radar determines whether a space is occupied by an obstacle, the driving distance acquired by the wheel speed sensor is fused with the visual information to obtain the corner-point coordinates, and the parking space line and parking space are thereby recognized.
Description
Technical Field
The invention relates to the technical field of automatic parking, in particular to a method and a device for identifying a parking space line and a parking space based on information fusion.
Background
The automatic parking system (APS) is a comprehensive system integrating environment sensing, decision making, planning, intelligent control and execution, and is an important component of intelligent driving assistance systems. Environment sensing is one of the three key technologies of an automatic parking system; sensing and recognition systems based mainly on ultrasonic radar or a monocular camera have seen some successful applications and have been commercially deployed on production vehicles. In recent years, multi-sensor information fusion has become a research hotspot at universities and research institutions, and results have been obtained in the field of mobile robots, but research on information fusion in the field of automatic parking remains limited.
At present, research on parking space line and parking space recognition in the automatic parking field mainly performs parking space detection based on monocular vision or a 360-degree surround-view system. A recognition system based on monocular vision can hardly cover an entire parallel parking space because the field of view of a single camera is narrow: covering the whole space would require a lateral distance of more than 4 m from the vehicle to the space, yet the lane beside a parallel space is usually narrow, so the space cannot be recognized. A recognition system based on a 360-degree surround-view system overcomes the narrow field of view, but as the field of view widens, noise, illumination and ground texture in the image reduce the recognition precision of the parking space line; improving that precision requires a more complex algorithm design, which increases the computational load and reduces real-time performance.
Disclosure of Invention
The invention provides a method and a device for recognizing parking space lines and parking spaces based on information fusion, aiming to solve the technical problem that the narrow field of view of a single camera can hardly cover an entire parallel parking space.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
In the information fusion-based parking space line and parking space recognition method, an O1xy coordinate system is established with the camera of the vehicle as the coordinate origin and the driving direction of the vehicle as the positive x-axis. An Ox'y' coordinate system is established with the center point of the rear axle of the vehicle as the origin, the straight line through the first and second parking space corner points as the x' axis, and the direction from the first corner point to the second corner point as the positive x' direction. The parking space is determined by the first parking space corner point P1 and the second parking space corner point P2 on the same side, together with the approximate width L_p of the parking space obtained while the vehicle drives from the first corner point to the second corner point. During driving, the two parking space corner points are imaged separately, and the corresponding bird's-eye views of the side of the vehicle body are obtained.
the method for identifying the line parking spaces of the parking spaces comprises the following steps:
step one, a parking space corner point (x_inter, y_inter) in the coordinate system of the bird's-eye view is recognized from the bird's-eye view of the side of the vehicle body. When the corner point and the camera are at corresponding positions, the position of the camera at that moment is taken as the origin of the O1xy coordinate system, and the corner-point coordinates in O1xy are recorded as the first parking space corner point P1;
wherein the corner point (x_inter, y_inter) is extracted on the basis that pixels of the parking space line in the bird's-eye view of the side of the vehicle body are brighter than the road surface area;
the judgment condition for the corner point and the camera being at corresponding positions is: X/2 − δ3 ≤ x_inter ≤ X/2 + δ3, where X is the maximum abscissa of the bird's-eye view, with value 540, and δ3 is 10;
step two, the ultrasonic radar of the vehicle starts to detect whether there is an obstacle in the lateral direction of the vehicle facing the first parking space corner point P1; if so, return to step one, otherwise proceed to step three;
step three, the second parking space corner point P2 is obtained in the same way as step one;
step four, the approximate width L_p of the parking space and the lateral distance y_p between the vehicle and the parking space are calculated from the first corner point P1 and the second corner point P2;
the lateral distance y_p is: y_p = K + (Y − y_inter) · k, where K is the blind-zone distance, determined by the installation pitch angle and installation height of the camera; Y is the maximum ordinate of the bird's-eye view, with value 430; and k is the ratio of the real-world plane to the inverse perspective image plane.
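The calculations of step four can be sketched as follows. The blind-zone distance K and the plane ratio k below are illustrative placeholder values, not the calibrated ones, since both depend on the actual camera installation; only Y = 430 comes from the text.

```python
import math

Y_MAX = 430      # maximum ordinate of the bird's-eye view (from the patent)
K_BLIND = 0.5    # blind-zone distance in metres (assumed placeholder)
K_RATIO = 0.01   # real-world / inverse-perspective-image ratio, m per pixel (assumed)

def lateral_distance(y_inter: float) -> float:
    """y_p = K + (Y - y_inter) * k for a corner ordinate y_inter."""
    return K_BLIND + (Y_MAX - y_inter) * K_RATIO

def approx_slot_width(p1: tuple, p2: tuple) -> float:
    """Approximate slot width as the distance between the two corner
    points recorded in the O1xy coordinate system."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])
```

A corner detected at the very bottom of the bird's-eye view (y_inter = Y) thus lies exactly at the blind-zone boundary K.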
As a further improvement of the above scheme, the extraction method of the parking space corner point (x_inter, y_inter) comprises the following steps:
a) Performing feature extraction and binarization on the aerial view of the side face of the vehicle body;
b) Edge noise point removal is carried out on the aerial view of the side face of the vehicle body after feature extraction and binarization;
c) Performing image thinning on the bird's-eye view of the side of the vehicle body after edge noise removal;
d) Performing straight-line detection on the thinned bird's-eye view of the side of the vehicle body and obtaining the intersection of the detected side straight lines, namely the parking space corner point (x_inter, y_inter).
Further, the feature extraction and binarization steps are as follows:
In the bird's-eye view of the side of the vehicle body, the gray value of the parking space line is higher than the pixel values on both sides of the line. If a pixel is brighter than the pixels a preset parking-space-line width to its left and right, or brighter than the pixels a preset line width above and below it, it is considered a possible parking-space-line pixel and is set to 255; otherwise it is set to 0. That is, g1(x, y) is:
g1(x, y) = 255 if (d_{V+e}(x, y) > δ0 and d_{V−e}(x, y) > δ0) or (d_{P+e}(x, y) > δ0 and d_{P−e}(x, y) > δ0), and 0 otherwise, where d_{V+e}(x, y) is the feature obtained when the parking-space-line feature template moves right, d_{V−e}(x, y) when it moves left, d_{P+e}(x, y) when it moves up, and d_{P−e}(x, y) when it moves down; p(x, y) is the input gray image, i.e. the bird's-eye view of the side of the vehicle body, p(x, y ± e) is the gray image shifted horizontally by e pixels, p(x ± e, y) is the gray image shifted vertically by e pixels, and δ0 is 40.
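A minimal NumPy sketch of this local-highlight binarization follows; the line-width offset E is an illustrative value (the patent only states δ0 = 40), and border effects of the shifted copies are ignored.

```python
import numpy as np

DELTA0 = 40  # gray-level margin from the patent
E = 7        # expected slot-line width in pixels (illustrative assumption)

def binarize_slot_lines(p: np.ndarray) -> np.ndarray:
    """A pixel is kept (255) when it is brighter by DELTA0 than the pixels
    E columns to its left AND right, or E rows above AND below, matching
    the rule that a slot line is brighter than the road on both sides."""
    p = p.astype(np.int32)
    d_v_plus  = p - np.roll(p,  E, axis=1)   # difference vs pixel E columns left
    d_v_minus = p - np.roll(p, -E, axis=1)   # difference vs pixel E columns right
    d_p_plus  = p - np.roll(p,  E, axis=0)   # difference vs pixel E rows above
    d_p_minus = p - np.roll(p, -E, axis=0)   # difference vs pixel E rows below
    horiz = (d_v_plus > DELTA0) & (d_v_minus > DELTA0)
    vert  = (d_p_plus > DELTA0) & (d_p_minus > DELTA0)
    return np.where(horiz | vert, 255, 0).astype(np.uint8)
```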
Further, the edge noise removal step is as follows: the binarized bird's-eye view of the side of the vehicle body is processed so that pixels outside the region bounded by the left and right boundary curves are set to 0, where F_L(x_n) and F_R(x_n) are the left and right boundary curve equations, g1(x, y) is the input binarized feature map, and g2(x, y) is the output binarized feature map.
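As a sketch of this step: pixels outside the band bounded by the two boundary curves (the edges of the warped camera field of view, where inverse perspective mapping produces noise) are zeroed. The exact curve equations in the patent are given as an image and did not survive extraction, so the boundary functions are left as arguments here.

```python
import numpy as np

def remove_edge_noise(g1: np.ndarray, f_left, f_right) -> np.ndarray:
    """Zero out binarized pixels outside [f_left(row), f_right(row)).
    f_left/f_right are the boundary curves F_L, F_R as callables mapping
    a row index to a column bound (assumed form, see lead-in)."""
    g2 = g1.copy()
    for y in range(g1.shape[0]):
        lo, hi = int(f_left(y)), int(f_right(y))
        g2[y, :max(lo, 0)] = 0   # left of the admissible band
        g2[y, max(hi, 0):] = 0   # right of the admissible band
    return g2
```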
Further, the image thinning step is as follows: the Zhang-Suen thinning algorithm is applied to the image to obtain the image skeleton, namely the thinned bird's-eye view of the side of the vehicle body.
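The Zhang-Suen algorithm named above is standard; a minimal pure-Python/NumPy sketch (not the patent's own implementation) is:

```python
import numpy as np

def zhang_suen_thin(img: np.ndarray) -> np.ndarray:
    """Zhang-Suen thinning of a binary image (1 = foreground): repeatedly
    delete boundary pixels in two sub-iterations until stable, leaving a
    one-pixel-wide skeleton."""
    sk = (img > 0).astype(np.uint8)
    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_delete = []
            for y in range(1, sk.shape[0] - 1):
                for x in range(1, sk.shape[1] - 1):
                    if sk[y, x] == 0:
                        continue
                    # neighbours p2..p9, clockwise starting from north
                    p = [sk[y-1, x], sk[y-1, x+1], sk[y, x+1], sk[y+1, x+1],
                         sk[y+1, x], sk[y+1, x-1], sk[y, x-1], sk[y-1, x-1]]
                    b = sum(p)                        # foreground neighbours
                    a = sum(p[i] == 0 and p[(i+1) % 8] == 1
                            for i in range(8))        # 0->1 transitions
                    if step == 0:
                        cond = p[0]*p[2]*p[4] == 0 and p[2]*p[4]*p[6] == 0
                    else:
                        cond = p[0]*p[2]*p[6] == 0 and p[0]*p[4]*p[6] == 0
                    if 2 <= b <= 6 and a == 1 and cond:
                        to_delete.append((y, x))
            for y, x in to_delete:
                sk[y, x] = 0
            changed = changed or bool(to_delete)
    return sk
```

On a thick bar of foreground pixels, the result is a thin skeleton that is a subset of the original image.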
Further, probabilistic Hough transform detection is performed on the thinned bird's-eye view of the side of the vehicle body, and the longest radial and the longest tangential line segments are retained according to their lengths; each such longest segment is a detected side straight line, and the parking space corner point (x_inter, y_inter) is then obtained from them.
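The "retain the longest radial and tangential segments" selection can be sketched as below, operating on (x1, y1, x2, y2) segments such as probabilistic Hough output. The 45-degree split between roughly horizontal (tangential) and roughly vertical (radial) segments is an assumption, not stated in the patent.

```python
import math

def keep_longest_segments(segments):
    """Return the longest roughly horizontal (tangential) and the longest
    roughly vertical (radial) segment from a list of (x1, y1, x2, y2)."""
    best = {"tangential": None, "radial": None}
    for x1, y1, x2, y2 in segments:
        length = math.hypot(x2 - x1, y2 - y1)
        angle = abs(math.degrees(math.atan2(y2 - y1, x2 - x1))) % 180
        kind = "tangential" if angle < 45 or angle > 135 else "radial"
        if best[kind] is None or length > best[kind][0]:
            best[kind] = (length, (x1, y1, x2, y2))
    return {k: (v[1] if v else None) for k, v in best.items()}
```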
Still further, the calculation process of the parking space corner point (x_inter, y_inter) is as follows:
The straight line containing the tangential segment is defined as line one and the straight line containing the radial segment as line two; the intersection is obtained from the two points (x1, y1), (x2, y2) of line one and the two points (x3, y3), (x4, y4) of line two:
1) When both lines are perpendicular to the x-axis, the slopes k1 and k2 do not exist and no unique intersection point is obtained;
2) When only one line is perpendicular to the x-axis, i.e. k1 or k2 does not exist: suppose line two is x = x3; by the definition of slope and the slope-intercept equation, k1 = (y2 − y1)/(x2 − x1) and b1 = y1 − k1·x1, so the equation of line one is y = k1·x + b1;
the intersection point of the two lines is then (x3, k1·x3 + b1);
3) When neither line is perpendicular to the x-axis, i.e. k1 and k2 both exist: by the definition of slope and the slope-intercept equation, k1 = (y2 − y1)/(x2 − x1), b1 = y1 − k1·x1, k2 = (y4 − y3)/(x4 − x3), b2 = y3 − k2·x3;
the equation of line one is y = k1·x + b1 and the equation of line two is y = k2·x + b2;
the intersection point of the two lines is then ((b2 − b1)/(k1 − k2), (k1·b2 − k2·b1)/(k1 − k2)).
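The three cases above translate directly into code; the following sketch also returns None for parallel non-vertical lines, a case the text does not enumerate.

```python
def line_intersection(p1, p2, p3, p4):
    """Intersection of line one through p1, p2 and line two through p3, p4,
    following the three cases in the text. Returns None when no unique
    intersection exists (parallel lines)."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    v1, v2 = x1 == x2, x3 == x4              # perpendicular to the x-axis?
    if v1 and v2:                            # case 1: both vertical
        return None
    if v1 or v2:                             # case 2: exactly one vertical
        if v2:                               # make line one the vertical one
            (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p3, p4, p1, p2
        k2 = (y4 - y3) / (x4 - x3)
        b2 = y3 - k2 * x3
        return (x1, k2 * x1 + b2)
    k1 = (y2 - y1) / (x2 - x1)               # case 3: both slopes exist
    b1 = y1 - k1 * x1
    k2 = (y4 - y3) / (x4 - x3)
    b2 = y3 - k2 * x3
    if k1 == k2:                             # parallel, no unique intersection
        return None
    x = (b2 - b1) / (k1 - k2)
    return (x, k1 * x + b1)
```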
Preferably, the extraction method of the parking space corner point (x_inter, y_inter) further comprises the step of: e) type classification.
Still preferably, according to the parking space corner point (x_inter, y_inter) and the two points (x1, y1), (x2, y2) of line one, the parking space mark lines are divided into three types: T type, left L type and right L type, where the judgment thresholds are δ1 = 30 and δ2 = 5.
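The exact judgment criterion is given in the patent as an image and did not survive extraction. The following is only a plausible sketch of the natural reading (a T-type corner sits well inside the tangential segment; an L-type corner sits at one of its ends), using the stated thresholds δ1 = 30 and δ2 = 5; the mapping of "left" and "right" L types to the segment direction is an assumption.

```python
DELTA1, DELTA2 = 30, 5  # thresholds from the patent (pixels)

def classify_mark_line(x_inter, x1, x2):
    """Classify the mark line from the corner abscissa x_inter and the
    endpoint abscissas x1, x2 of the tangential segment (line one)."""
    lo, hi = min(x1, x2), max(x1, x2)
    if x_inter - lo > DELTA1 and hi - x_inter > DELTA1:
        return "T"              # corner well inside the segment
    if abs(x_inter - lo) < DELTA2:
        return "right-L"        # segment extends to the right of the corner
    if abs(x_inter - hi) < DELTA2:
        return "left-L"         # segment extends to the left of the corner
    return "unknown"
```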
The invention also provides a parking space line and parking space recognition device based on information fusion, which adopts any of the above information fusion-based parking space line and parking space recognition methods. The parking space line and parking space recognition device comprises:
parking space corner detection module for acquiring first parking space cornerAnd a second point of view
An obstacle detection module for detecting a first vehicle-position-angle-point-oriented vehicle using an ultrasonic radar of the vehicleWhether there is an obstacle in the lateral direction of (2);
a calculation module, for calculating, when the obstacle detection module detects no obstacle, the approximate width L_p of the parking space and the lateral distance y_p between the vehicle and the parking space from the first corner point P1 and the second corner point P2.
The information fusion-based parking space line and parking space recognition method can detect and recognize parking spaces with standard parking space mark lines: the ultrasonic radar determines whether a space contains an obstacle, the driving distance acquired by the wheel speed sensor is fused with the visual information to obtain the corner-point coordinates, and the parking space line and parking space are thereby recognized.
Drawings
Fig. 1 is a software system frame diagram of a parking space line and space recognition system applying the parking space line and space recognition method of the invention.
Fig. 2 is a diagram of intermediate results of parking space line detection in the parking space line and parking space recognition method of the present invention.
Fig. 3 is a diagram of defining types of parking space mark lines in the method for identifying parking space lines and parking spaces according to the invention.
Fig. 4 is a flow chart of information fusion of the method for identifying the parking space line and the parking space of the invention.
Fig. 5 is a plane vision ranging model diagram based on inverse perspective transformation in the parking space line and parking space recognition method of the invention.
Fig. 6 is an explanatory diagram of the establishment of a parking space coordinate system in the parking space line and parking space recognition method of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention.
The information fusion-based parking space line and space recognition method is applied to the information fusion-based parking space line and space recognition device, and the information fusion-based parking space line and space recognition device can be loaded in a parking space line and space recognition system. The parking space line and parking space recognition system comprises the following systems:
1. A power supply system: a storage battery provides 12 V direct current; the ultrasonic sensor uses an additional power module for voltage conversion, the computer uses an inverter, and the other devices have their own matched conversion devices.
2. A sensing system: the monocular camera is an HD DIGITAL CAMERA high-definition USB camera; the ultrasonic sensor is a KS103 ultrasonic radar produced by a certain manufacturer; the wheel speed sensor is an incremental rotary encoder produced by a certain company, model BS80T20-5-24F-360BM, with 360 pulses per revolution.
3. Communication system: the communication data of the current system includes:
(1) The image processing result is sent to a parking controller through CAN;
(2) The ultrasonic radar data are sent to the parking controller in an IIC communication mode;
(3) The wheel speed sensor data is sent to the parking controller through the CAN.
The parking space line and parking space recognition system can be installed as a whole on a Jianghuai iEV4 pure electric vehicle. The HD DIGITAL CAMERA monocular camera of the system is arranged at the rearview mirror, on the side of the vehicle body; the ultrasonic radar of the system is arranged on a front wheel fender, 80 cm above the ground; and the wheel speed sensors of the system are arranged on the two rear wheel axles of the vehicle.
The system mainly comprises an offline calibration and back-projection transformation module, a parking space corner detection and visual ranging module, an information fusion module and a parking space coordinate system establishment module, and can recognize parking spaces with standard parking space mark lines accurately and in real time. Its parts are as follows: the ultrasonic radar is a KS103 ultrasonic radar produced by a certain manufacturer, with a maximum detection range of 5 m; the camera is an HD DIGITAL CAMERA high-definition USB camera produced by a certain manufacturer, calibrated with a pitch angle of 30 degrees at a height of 80 cm above the ground; the wheel speed sensor is an incremental rotary encoder produced by a certain company, model BS80T20-5-24F-360BM, with 360 pulses per revolution; and the parking space coordinate system establishment module is an independently developed parking controller based on the minimum system of the Freescale MC9S128 single-chip microcomputer. The camera and wheel speed sensor send information to the parking controller through the CAN channel, and the ultrasonic radar sends signals to the parking controller via IIC communication. The system can stably provide parking space mark line detection results within a range of 5.5 m in front of the side of the vehicle body at a frequency of 100 Hz, and can distinguish the T-type mark line from the left and right L-type mark lines. The system has low cost, low power consumption and good overall portability, and is suitable for productization.
The working principle of the parking space line and parking space recognition system is as follows: the offline calibration and back-projection transformation module completes the intrinsic and extrinsic calibration of the camera using a program written with Zhang Zhengyou's camera calibration method, acquires the current parking space image with the HD DIGITAL CAMERA high-definition USB camera, and performs back-projection transformation according to the intrinsic and extrinsic parameters to obtain the bird's-eye view. The parking space corner detection and visual ranging module obtains the corner detection result of the current frame based on the local-highlight feature and the angle constraints implied by the parallelism and perpendicularity of the parking space mark lines, and its visual ranging part obtains the lateral distance value of the current frame's corner point based on the back-projected bird's-eye view.
The information fusion-based parking space line and parking space recognition method of the invention establishes an O1xy coordinate system with the camera of the vehicle as the coordinate origin and the driving direction of the vehicle as the positive x-axis, and establishes an Ox'y' coordinate system with the coordinates of the first parking space corner point as the origin, the straight line through the first and second corner points as the x' axis, and the direction from the first corner point to the second as the positive x' direction. The parking space is determined by the first parking space corner point P1 and the second parking space corner point P2 on the same side, together with the approximate width L_p of the parking space obtained while the vehicle drives from the first corner point to the second. During driving, the two parking space corner points are imaged separately, and the corresponding bird's-eye views of the side of the vehicle body are obtained.
The method for identifying the line parking spaces of the parking spaces comprises the following steps:
step one, a parking space corner point (x_inter, y_inter) in the coordinate system of the bird's-eye view is recognized from the bird's-eye view of the side of the vehicle body. When the corner point and the camera are at corresponding positions, the position of the camera at that moment is taken as the origin of the O1xy coordinate system, and the corner-point coordinates in O1xy are recorded as the first parking space corner point P1;
wherein the corner point (x_inter, y_inter) is extracted on the basis that pixels of the parking space line in the bird's-eye view are brighter than the road surface area;
the judgment condition for the corner point and the camera being at corresponding positions is: X/2 − δ3 ≤ x_inter ≤ X/2 + δ3, where X is the maximum abscissa of the bird's-eye view, with value 540, and δ3 is 10;
step two, the ultrasonic radar of the vehicle starts to detect whether there is an obstacle in the lateral direction of the vehicle facing the first parking space corner point P1; if so, return to step one, otherwise proceed to step three;
step three, the second parking space corner point P2 is obtained in the same way as step one;
step four, the approximate width L_p of the parking space and the lateral distance y_p between the vehicle and the parking space are calculated from the first corner point P1 and the second corner point P2;
wherein the approximate width L_p of the parking space is:
the lateral distance y_p is: y_p = K + (Y − y_inter) · k, where K is the blind-zone distance, determined by the installation pitch angle of the camera; Y is the maximum ordinate of the bird's-eye view, with value 430; and k is the ratio of the real-world plane to the inverse perspective image plane.
The information fusion-based parking space line and parking space recognition method can be programmed into an information fusion-based parking space line and parking space recognition device, which comprises: a parking space corner detection module, for acquiring the first parking space corner point P1 and the second parking space corner point P2; an obstacle detection module, for detecting with the ultrasonic radar of the vehicle whether there is an obstacle in the lateral direction of the vehicle facing the first corner point P1; and a calculation module, for calculating, when the obstacle detection module detects no obstacle, the approximate width L_p of the parking space and the lateral distance y_p between the vehicle and the parking space from the first corner point P1 and the second corner point P2.
The information fusion-based parking space line and parking space recognition system can detect and recognize standard parking space mark lines and parking spaces: the ultrasonic radar determines whether there is an obstacle in the parking space, the driving distance acquired by the wheel speed sensor is fused with the visual information to obtain the corner-point coordinates, and the parking space line and parking space are thereby recognized.
Example 1
Referring to fig. 1, a parking space line and parking space recognition system based on information fusion mainly includes on a software framework: the system comprises an off-line calibration and back projection conversion module M1, a parking space corner detection and visual ranging module M2, an information fusion module M3 and a parking space coordinate system establishment module M4.
The off-line calibration and back projection conversion module M1 is used for collecting image information and completing back projection conversion relative to the vehicle body side plane to obtain a vehicle body side bird's-eye view; it comprises an off-line calibration module M11 for calibrating the camera intrinsic and extrinsic parameters and a back projection conversion module M12. The off-line calibration module M11 uses characteristic mark points to obtain the intrinsic parameters of the camera and its extrinsic parameters relative to the vehicle body coordinates. The back projection conversion module M12 acquires BGR images in real time, converts the images into a bird's-eye view using the calculated intrinsic and extrinsic parameters of the camera, and converts the bird's-eye view into a gray image, namely the vehicle body side bird's-eye view used in the subsequent data processing.
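As a sketch of the back projection (inverse perspective) step, the function below maps a single image point through a 3x3 homography. In practice the homography would be composed off-line from the intrinsic and extrinsic parameters calibrated by module M11 (for example with OpenCV's getPerspectiveTransform/warpPerspective); the identity matrix here is only a placeholder, not the patent's actual calibration result.

```python
def warp_point(H, x, y):
    """Map pixel (x, y) through the 3x3 homography H (row-major nested
    lists) and return the normalized bird's-eye-view coordinates."""
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    if w == 0:
        raise ValueError("point maps to infinity")
    return u / w, v / w

# Placeholder homography; a real one comes from the off-line calibration.
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```

Note that the division by w makes the mapping projective: scaling the whole matrix leaves the mapped point unchanged.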
The parking space corner detection and visual ranging module M2 is used for parking space corner detection and for measuring the distance between the vehicle side and the parking space corner; it comprises a parking space corner detection module M21 and a plane ranging module M22 based on inverse perspective transformation. The parking space corner detection module M21 extracts a parking space corner point (x_inter, y_inter) from the vehicle body side bird's-eye view, based on the fact that the pixels of the parking space line are brighter than the road surface area. The plane ranging module M22 obtains the distance value y_p of the parking space corner point (x_inter, y_inter) from the vehicle body side; in this embodiment, a planar ranging model is used.
The parking space corner detection module M21 outputs the parking space corner point (x_inter, y_inter) through the following steps:
a) Feature extraction and binarization.
Referring to fig. 2 (a), the parking space line feature extraction is based on the following assumption: the gray value of the parking space line is higher than the pixel values on both sides of it. If a pixel is brighter than the pixels one parking space line width to its left and right, or brighter than the pixels one parking space line width above and below it, the pixel is considered a possible parking space line pixel and is set to 255; if a pixel and its neighbours do not meet the above condition, the pixel is set to 0.
Wherein, d_V+e(x, y) = p(x, y) - p(x, y + e) is the feature obtained when the parking space line feature template moves right, d_V-e(x, y) = p(x, y) - p(x, y - e) is the feature obtained when it moves left, d_P+e(x, y) = p(x, y) - p(x + e, y) is the feature obtained when it moves up, and d_P-e(x, y) = p(x, y) - p(x - e, y) is the feature obtained when it moves down; p(x, y) is the input gray map, p(x, y ± e) is the map obtained by shifting the input gray map by e pixels left or right, and p(x ± e, y) is the map obtained by shifting it by e pixels up or down. The preliminary parking space line feature map g_1(x, y) is set to 255 where both d_V+e and d_V-e, or both d_P+e and d_P-e, exceed the threshold δ_0, and to 0 otherwise.
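The feature extraction and binarization step can be sketched as follows, assuming p is a grayscale bird's-eye view stored as nested lists, e is the parking space line width in pixels, and δ_0 = 40 (the value stated in the claims); border pixels whose shifted neighbours fall outside the image are left at 0:

```python
def line_feature_binarize(p, e, delta0=40):
    """Preliminary parking space line feature map g_1: a pixel is set to
    255 if it is brighter by more than delta0 than the pixels e columns to
    its left AND right (a vertical line), or e rows above AND below it
    (a horizontal line); otherwise 0."""
    h, w = len(p), len(p[0])
    g1 = [[0] * w for _ in range(h)]
    for x in range(h):
        for y in range(w):
            v = 0
            if e <= y < w - e:  # horizontal shifts stay inside the image
                if p[x][y] - p[x][y + e] > delta0 and p[x][y] - p[x][y - e] > delta0:
                    v = 255
            if v == 0 and e <= x < h - e:  # vertical shifts
                if p[x][y] - p[x + e][y] > delta0 and p[x][y] - p[x - e][y] > delta0:
                    v = 255
            g1[x][y] = v
    return g1
```

On a dark image containing a bright stripe of width e, exactly the stripe pixels are marked 255.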
b) Bird's-eye view edge noise removal.
Referring to fig. 2 (b), the inner and outer edges of the bird's-eye view obtained by the off-line calibration and back projection conversion module also satisfy the condition that a pixel is brighter than its neighbours one parking space line width to the left and right, so they appear as false line features; once the calibration is fixed, these edge lines are fixed curves. A number of pixel coordinates on the two curves are taken off-line, and curve fitting is performed with the cftool toolbox in Matlab to obtain the fitted curve equations F_L(x_n) and F_R(x_n); the feature map of fig. 2 (a) is then processed as follows:
wherein F_L(x_n) and F_R(x_n) are the left and right edge curve equations respectively, g_1(x, y) is the input binarized feature map, and g_2(x, y) is the output binarized feature map in which the pixels on the fitted edge curves are set to 0.
c) Image refinement.
Referring to fig. 2 (c), the binarized parking space line feature map is refined with the Zhang-Suen thinning algorithm to obtain the image skeleton.
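A self-contained sketch of the Zhang-Suen thinning algorithm named here, operating on a 0/1 binary image stored as nested lists (border pixels are assumed to be background):

```python
def zhang_suen_thin(img):
    """Thin a binary image (rows of 0/1) to a roughly one-pixel-wide
    skeleton using the Zhang-Suen algorithm: two sub-iterations per pass,
    repeated until no pixel changes."""
    h, w = len(img), len(img[0])
    img = [row[:] for row in img]  # work on a copy

    def neighbours(x, y):
        # P2..P9, clockwise starting from the pixel above (north).
        return [img[x-1][y], img[x-1][y+1], img[x][y+1], img[x+1][y+1],
                img[x+1][y], img[x+1][y-1], img[x][y-1], img[x-1][y-1]]

    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_zero = []
            for x in range(1, h - 1):
                for y in range(1, w - 1):
                    if img[x][y] != 1:
                        continue
                    n = neighbours(x, y)
                    b = sum(n)  # B(P1): number of foreground neighbours
                    # A(P1): number of 0->1 transitions around the pixel.
                    a = sum(n[i] == 0 and n[(i + 1) % 8] == 1 for i in range(8))
                    p2, p4, p6, p8 = n[0], n[2], n[4], n[6]
                    if step == 0:
                        cond = p2 * p4 * p6 == 0 and p4 * p6 * p8 == 0
                    else:
                        cond = p2 * p4 * p8 == 0 and p2 * p6 * p8 == 0
                    if 2 <= b <= 6 and a == 1 and cond:
                        to_zero.append((x, y))
            for x, y in to_zero:  # delete only after the full scan
                img[x][y] = 0
                changed = True
    return img
```

A one-pixel-wide line is already a skeleton and passes through unchanged, while a solid block is eroded toward its medial line.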
d) Detecting the side straight lines and obtaining the intersection point of the perpendicular and parallel straight lines.
Referring to fig. 2 (d), probabilistic Hough transform detection is performed on the lower edge of the obtained image skeleton; the longest radial and tangential line segments are kept according to length, the straight line equations of the segments are calculated, and their intersection point is computed as follows:
Two points (x_1, y_1), (x_2, y_2) of straight line 1 (the straight line where the tangential line segment lies) and two points (x_3, y_3), (x_4, y_4) of straight line 2 (the straight line where the radial line segment lies) are known; the intersection is obtained as follows:
1) When both straight lines are perpendicular to the x-axis, the slopes k_1, k_2 do not exist; this case is not within the scope of the present discussion.
2) When only one straight line is perpendicular to the x-axis, i.e. k_1 or k_2 does not exist (in the invention this is straight line 2), then according to the definition of the slope and the slope-intercept equation, k_1 = (y_2 - y_1)/(x_2 - x_1) and b_1 = y_1 - k_1·x_1;
then the equation of straight line 1 is y = k_1·x + b_1;
the intersection point coordinates of the two straight lines are then (x_3, k_1·x_3 + b_1).
3) When neither straight line is perpendicular to the x-axis, i.e. k_1, k_2 both exist, then k_1 = (y_2 - y_1)/(x_2 - x_1), k_2 = (y_4 - y_3)/(x_4 - x_3), b_1 = y_1 - k_1·x_1 and b_2 = y_3 - k_2·x_3;
then the equation of straight line 1 is y = k_1·x + b_1 and the equation of straight line 2 is y = k_2·x + b_2;
the intersection point coordinates of the two straight lines are then ((b_2 - b_1)/(k_1 - k_2), k_1·(b_2 - b_1)/(k_1 - k_2) + b_1).
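The intersection cases can be sketched as one function; the case of two vertical lines is rejected, matching the case excluded from the discussion:

```python
def line_intersection(p1, p2, p3, p4):
    """Return (x, y) of the intersection of line 1 through p1, p2 and
    line 2 through p3, p4; raises ValueError if no unique intersection."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    v1, v2 = x1 == x2, x3 == x4  # vertical-line flags
    if v1 and v2:
        raise ValueError("both lines are vertical: no unique intersection")
    if v2:  # line 2 (radial segment) vertical
        k1 = (y2 - y1) / (x2 - x1)
        b1 = y1 - k1 * x1
        return x3, k1 * x3 + b1
    if v1:  # symmetric case: line 1 vertical
        k2 = (y4 - y3) / (x4 - x3)
        b2 = y3 - k2 * x3
        return x1, k2 * x1 + b2
    k1 = (y2 - y1) / (x2 - x1)
    k2 = (y4 - y3) / (x4 - x3)
    if k1 == k2:
        raise ValueError("parallel lines")
    b1 = y1 - k1 * x1
    b2 = y3 - k2 * x3
    x = (b2 - b1) / (k1 - k2)
    return x, k1 * x + b1
```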
e) Type classification.
Referring to fig. 2 (e) and fig. 3, according to the positional relationship between the intersection point (x_inter, y_inter) and the two points (x_1, y_1), (x_2, y_2) of straight line 1 (the straight line where the tangential line segment lies), the parking space marking lines are divided into three types, T type, left L type and right L type, with the following judgment criterion:
the information fusion module M3 is used for fusing the distance information acquired by the ultrasonic radar and the wheel speed sensor and the visual information acquired by the camera to acquire and establish the characteristic point coordinates of the parking space coordinate system.
Referring to fig. 4, the information fusion module M3 mainly includes the following steps:
Step one, the camera detects a parking space corner point of a marked parking space line parking space; if a corner point is identified and it is in a parallel position with the camera, the corner point coordinates at this moment are recorded as the first parking space corner point P1(x_P1, y_P1);
Step two, the ultrasonic radar starts to detect whether an obstacle exists to the side; if so, return to step one, otherwise continue to step three;
Step three, the camera again detects a parking space corner point of the marked parking space line parking space; if a corner point is identified and it is in a parallel position with the camera, the corner point coordinates at this moment are recorded as the second parking space corner point P2(x_P2, y_P2);
Step four, the width of the target parking space and the lateral distance between the vehicle and the target parking space are calculated according to the coordinates of the first and second parking space corner points.
Wherein, for the first parking space corner coordinates P1(x_P1, y_P1) and the second parking space corner coordinates P2(x_P2, y_P2), the abscissa is obtained from the self-vehicle driving distance measured by the wheel speed sensor, and the ordinate is obtained by the plane ranging module M22 based on inverse perspective transformation in the parking space corner detection and visual ranging module M2.
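The four fusion steps above can be sketched as a small state machine over time-ordered sensor events. The event dictionary keys ('x_inter', 'odo_x', 'obstacle') are illustrative names, not from the patent:

```python
def fuse_corners(events, X=540, delta3=10):
    """Sketch of the fusion steps: `events` is a time-ordered list of dicts
    with keys 'x_inter' (corner abscissa in the bird's-eye view, or None if
    no corner is seen), 'odo_x' (driven distance from the wheel speed
    sensor) and 'obstacle' (ultrasonic radar reading). Returns the odometry
    abscissas (x_P1, x_P2) of the two corners, or None."""
    def aligned(x):
        # Corner parallel to the camera: X/2 - delta3 <= x_inter <= X/2 + delta3
        return x is not None and X / 2 - delta3 <= x <= X / 2 + delta3

    p1 = None
    for ev in events:
        if p1 is None:
            if aligned(ev['x_inter']):      # step one: first corner found
                p1 = ev['odo_x']
        elif ev.get('obstacle'):            # step two: obstacle -> restart
            p1 = None
        elif aligned(ev['x_inter']):        # step three: second corner
            return p1, ev['odo_x']          # step four uses x_P2 - x_P1
    return None
```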
The parking space coordinate system establishment module M4 is used for judging whether the detected parking space is a perpendicular parking space or a parallel parking space, determining the relative position of the vehicle and the parking space, and then planning a corresponding path. Using the first parking space corner coordinates P1(x_P1, y_P1) and the second parking space corner coordinates P2(x_P2, y_P2) obtained by the information fusion module, the approximate width L_P of the parking space is calculated as L_P = x_P2 - x_P1; whether the parking space is a parallel or perpendicular one is judged against a set threshold related to the vehicle width, and a path is planned from the calculated center coordinates of the vehicle rear axle.
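A minimal sketch of the width computation and the parallel/perpendicular judgment follows. The 1.5x factor is an illustrative placeholder: the patent only states that a set threshold related to the vehicle width is used, without giving its value:

```python
def classify_space(x_p1, x_p2, vehicle_width_mm):
    """Approximate space width L_P = x_P2 - x_P1 (same units as the
    odometry, e.g. mm), then classify: a space whose extent along the
    driving direction clearly exceeds the vehicle width is treated as a
    parallel space, otherwise as a perpendicular one."""
    l_p = x_p2 - x_p1
    kind = "parallel" if l_p > 1.5 * vehicle_width_mm else "perpendicular"
    return l_p, kind
```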
In this embodiment, referring to fig. 5, the camera is installed at the vehicle body side rearview mirror. The principle of the plane ranging module based on inverse perspective transformation is as follows: the inverse perspective image theoretically has a strict linear relationship with the real plane, i.e. the real area corresponding to each pixel block (1 pix) in the bird's-eye view image is equal; equivalently, the distance between two points in the bird's-eye view image and the distance between the two corresponding points in the real world plane satisfy
D_AB = k·d_ab
wherein D_AB represents the Euclidean distance between points A and B on the real world plane, in mm; d_ab is the Euclidean distance between the corresponding points a and b in the bird's-eye view image, in pixels (pix); and k is the relationship coefficient between the bird's-eye view image and the real world plane, in mm/pix.
In practical vehicles, since the camera has a limited field of view, there is generally a blind zone, which is related to the pitch angle of the camera when installed, as shown in fig. 3:
D = K + k·d
wherein D is the distance between the parking space marking line (the tangential straight line) and the vehicle body side; K is the blind-zone distance of the camera, which can be measured experimentally; k is the ratio coefficient between the real world plane and the inverse perspective image plane; and d is the pixel distance between the position of the parking space marking line in the inverse perspective image and the lowest end of the image.
The distance from the parking space corner point to the side surface of the vehicle body, which is obtained by the plane ranging module based on inverse perspective transformation, is as follows:
y_P = K + (Y - y_inter)·k
wherein y is P The vertical coordinate of the parking space corner point is the distance between the parking space mark line (tangential straight line) and the side surface of the vehicle body; k is the distance of the blind area and is determined by the installation pitching angle of the camera; y is the maximum value of the ordinate of the aerial view, and the value is 430; y is inter A longitudinal coordinate value of the intersection point of two straight lines of the parking space mark line; k is the ratio of the real world plane to the inverse perspective image plane.
Referring to fig. 6, with the camera as the coordinate origin, an O_1-xy coordinate system is established with the vehicle driving direction as the positive x-axis direction; an O-x'y' coordinate system is established with the first parking space corner point as origin and the self-vehicle driving direction as the positive x'-axis direction. The parking space corner coordinates in the information fusion module and the vehicle rear-axle coordinates in the parking space coordinate system establishment module are determined as shown in the figure: when a parking space corner point is identified and the corner point and the camera are in parallel positions (position I in the figure), the corner point coordinates at this moment are recorded as the first parking space corner point P1(x_P1, y_P1); the second parking space corner coordinates are obtained in the same way. The judgment condition that a parking space corner point and the camera are in parallel positions is:
X/2 - δ_3 ≤ x_inter ≤ X/2 + δ_3
wherein X is the maximum value of the aerial view abscissa, and the value thereof is 540.
When the vehicle travels to position II in the figure, the coordinate point of the camera becomes
wherein x_c is the driving distance of the self-vehicle;
w is the actual width of the parking space, obtained by ultrasonic radar detection or from the national standard.
When the vehicle travels to position II in the figure, the coordinate point of the center of the vehicle rear axle is:
wherein L is the distance between the camera and the rear axle, and w is the width of the vehicle.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the method embodiments described above may be performed by hardware associated with program instructions. The foregoing program may be stored in a computer readable storage medium. The program, when executed, performs steps including the method embodiments described above; and the aforementioned storage medium includes: various media that can store program code, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.
Claims (10)
1. A method for identifying a parking space line and a parking space based on information fusion is characterized in that,
establishing an O_1-xy coordinate system with a camera of the vehicle as the coordinate origin and the driving direction of the vehicle as the positive x-axis direction; establishing an O-x'y' coordinate system with the first parking space corner coordinates as origin, the straight line through the first and second parking space corner points as the x' axis, and the direction from the first parking space corner point to the second parking space corner point as the positive x' axis; the parking space line parking space is determined by the first parking space corner point P1(x_P1, y_P1) and the second parking space corner point P2(x_P2, y_P2) located on the same side, and by the approximate width L_p of the parking space obtained while the vehicle drives from the first parking space corner point to the second parking space corner point; during the driving of the vehicle, the two parking space corner points are imaged respectively, and the corresponding vehicle body side bird's-eye views are obtained;
the method for identifying the parking space line parking space comprises the following steps:
Step one, recognizing a parking space corner point (x_inter, y_inter) from the vehicle body side bird's-eye view, in the coordinate system corresponding to the bird's-eye view; when the parking space corner point (x_inter, y_inter) and the camera are at corresponding positions, taking the position of the camera at this moment as the origin of the O_1-xy coordinate system, and recording the parking space corner coordinates in the O_1-xy coordinate system as the first parking space corner point P1(x_P1, y_P1);
wherein the parking space corner point (x_inter, y_inter) is extracted based on the fact that the pixels of the parking space line in the vehicle body side bird's-eye view are brighter than the road surface area;
the judgment condition that the parking space corner point and the camera are at corresponding positions is:
X/2 - δ_3 ≤ x_inter ≤ X/2 + δ_3, wherein X is the maximum value of the bird's-eye view abscissa, whose value is 540, and δ_3 is 10;
Step two, the ultrasonic radar of the vehicle starts to detect whether there is an obstacle on the side of the vehicle facing the first parking space corner point; if so, return to step one, otherwise carry out step three;
Step three, obtaining the second parking space corner point P2(x_P2, y_P2) in the same way as step one;
Step four, according to the first parking space corner point P1(x_P1, y_P1) and the second parking space corner point P2(x_P2, y_P2), calculating the approximate width L_p of the parking space line parking space and the lateral distance y_p between the vehicle and the parking space line parking space;
wherein the approximate width L_p of the parking space is: L_P = x_P2 - x_P1;
the lateral distance y_p is: y_P = K + (Y - y_inter)·k, wherein K is the blind-zone distance, determined by the installation pitch angle and height of the camera; Y is the maximum value of the ordinate of the bird's-eye view, whose value is 430; and k is the ratio of the real world plane to the inverse perspective image plane.
2. The information fusion-based parking space line and space recognition method according to claim 1, wherein the extraction method of the parking space corner point (x_inter, y_inter) comprises the following steps:
a) Performing feature extraction and binarization on the vehicle body side bird's-eye view;
b) Performing edge noise point removal on the vehicle body side bird's-eye view after feature extraction and binarization;
c) Performing image refinement on the vehicle body side bird's-eye view with the edge noise removed;
d) Performing side straight line detection on the refined vehicle body side bird's-eye view and obtaining the intersection of the detected side straight lines, namely the parking space corner point (x_inter, y_inter).
3. The information fusion-based parking space line and space recognition method according to claim 2, wherein the steps of feature extraction and binarization are as follows:
In the vehicle body side bird's-eye view, the pixel value of the parking space line is higher than the pixel values on both sides of it. If a pixel is brighter than the pixels a preset parking space line width to its left and right, or brighter in gray value than the pixels a preset parking space line width above and below it, the pixel is considered a possible parking space line pixel and is set to 255; otherwise it is set to 0, i.e. g_1(x, y) is:
wherein d_V+e(x, y) is the feature obtained when the parking space line feature template moves right, d_V-e(x, y) is the feature obtained when it moves left, d_P+e(x, y) is the feature obtained when it moves up, and d_P-e(x, y) is the feature obtained when it moves down; p(x, y) is the input gray map, namely the vehicle body side bird's-eye view; p(x, y ± e) is the map obtained by shifting the gray map horizontally by e pixels, p(x ± e, y) is the map obtained by shifting it vertically by e pixels, and δ_0 is 40.
4. The information fusion-based parking space line and space recognition method according to claim 2, wherein the step of removing edge noise points is as follows: the vehicle body side bird's-eye view is further processed by:
wherein F_L(x_n) and F_R(x_n) are the left and right edge curve equations respectively, g_1(x, y) is the input binarized feature map, and g_2(x, y) is the output binarized feature map.
5. The information fusion-based parking space line and space recognition method according to claim 2, wherein the image refinement step is as follows: the Zhang-Suen thinning algorithm is used to refine the image and obtain the image skeleton, namely the refined vehicle body side bird's-eye view.
6. The information fusion-based parking space line and space recognition method according to claim 2, wherein probabilistic Hough transform detection is performed on the lower edge of the obtained refined vehicle body side bird's-eye view; the longest radial and tangential line segments are respectively kept according to length, each longest line segment being a detected side straight line, and the parking space corner point (x_inter, y_inter) is then obtained.
7. The information fusion-based parking space line and space recognition method according to claim 6, wherein the process of calculating the parking space corner point (x_inter, y_inter) is as follows:
the straight line where the tangential line segment lies is defined as straight line one, and the straight line where the radial line segment lies is defined as straight line two; two points (x_1, y_1), (x_2, y_2) of straight line one and two points (x_3, y_3), (x_4, y_4) of straight line two are known; the method comprises the following cases:
1) When both straight lines are perpendicular to the x-axis, the slopes k_1, k_2 do not exist;
2) When only one straight line is perpendicular to the x-axis, i.e. k_1 or k_2 does not exist, according to the definition of the slope and the slope-intercept equation, k_1 = (y_2 - y_1)/(x_2 - x_1) and b_1 = y_1 - k_1·x_1;
then the equation of straight line one is y = k_1·x + b_1;
the intersection point coordinates of the two straight lines are then (x_3, k_1·x_3 + b_1);
3) When neither straight line is perpendicular to the x-axis, i.e. k_1, k_2 both exist, according to the definition of the slope and the slope-intercept equation, k_1 = (y_2 - y_1)/(x_2 - x_1), k_2 = (y_4 - y_3)/(x_4 - x_3), b_1 = y_1 - k_1·x_1 and b_2 = y_3 - k_2·x_3;
then the equation of straight line one is y = k_1·x + b_1 and the equation of straight line two is y = k_2·x + b_2;
the intersection point coordinates of the two straight lines are then ((b_2 - b_1)/(k_1 - k_2), k_1·(b_2 - b_1)/(k_1 - k_2) + b_1).
8. The information fusion-based parking space line and space recognition method according to claim 7, wherein the extraction method of the parking space corner point (x_inter, y_inter) further comprises the step of: e) Type classification.
9. The information fusion-based parking space line and space recognition method according to claim 8, wherein, according to the positional relationship between the parking space corner point (x_inter, y_inter) and the two points (x_1, y_1), (x_2, y_2) of straight line one, the parking space marking lines are divided into three types, T type, left L type and right L type, with the following judgment criterion:
wherein δ_1 is 30 and δ_2 is 5.
10. A parking space line and parking space recognition device based on information fusion, adopting the information fusion-based parking space line and space recognition method according to any one of claims 1 to 9, the device comprising:
a parking space corner detection module for acquiring the first parking space corner point P1(x_P1, y_P1) and the second parking space corner point P2(x_P2, y_P2);
an obstacle detection module for detecting, using an ultrasonic radar of the vehicle, whether there is an obstacle in the lateral direction of the vehicle facing the first parking space corner point; and
a calculation module for, when the obstacle detection module does not detect an obstacle, calculating the approximate width L_p of the parking space line parking space and the lateral distance y_p between the vehicle and the parking space line parking space according to the first parking space corner point P1(x_P1, y_P1) and the second parking space corner point P2(x_P2, y_P2).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811283453.4A CN109435942B (en) | 2018-10-31 | 2018-10-31 | Information fusion-based parking space line and parking space recognition method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109435942A CN109435942A (en) | 2019-03-08 |
CN109435942B true CN109435942B (en) | 2024-04-09 |
Family
ID=65548957
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811283453.4A Active CN109435942B (en) | 2018-10-31 | 2018-10-31 | Information fusion-based parking space line and parking space recognition method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109435942B (en) |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109740584B (en) * | 2019-04-02 | 2019-06-25 | 纽劢科技(上海)有限公司 | Automatic parking parking space detection method based on deep learning |
CN110097087B (en) * | 2019-04-04 | 2021-06-11 | 浙江科技学院 | Automatic reinforcing steel bar binding position identification method |
CN110070752A (en) * | 2019-05-29 | 2019-07-30 | 北京百度网讯科技有限公司 | One kind is parked appraisal procedure, device, electronic equipment and storage medium |
CN110293964B (en) * | 2019-06-25 | 2020-11-03 | 重庆长安汽车股份有限公司 | Automatic parking fusion parking space judgment method and system, computer readable storage medium and vehicle |
CN110414355A (en) * | 2019-06-27 | 2019-11-05 | 沈阳工业大学 | The right bit sky parking stall of view-based access control model and parking stall line detecting method during parking |
CN110390306B (en) * | 2019-07-25 | 2021-08-10 | 湖州宏威新能源汽车有限公司 | Method for detecting right-angle parking space, vehicle and computer readable storage medium |
CN110544386A (en) * | 2019-09-18 | 2019-12-06 | 奇瑞汽车股份有限公司 | parking space identification method and device and storage medium |
CN111311925B (en) * | 2020-01-21 | 2022-02-11 | 阿波罗智能技术(北京)有限公司 | Parking space detection method and device, electronic equipment, vehicle and storage medium |
CN113255405B (en) * | 2020-02-12 | 2024-06-25 | 广州汽车集团股份有限公司 | Parking space line identification method and system, parking space line identification equipment and storage medium |
CN111754468A (en) * | 2020-06-10 | 2020-10-09 | 深圳南方德尔汽车电子有限公司 | Parking space detection method and device based on projection, computer equipment and storage medium |
CN112052782B (en) * | 2020-08-31 | 2023-09-05 | 安徽江淮汽车集团股份有限公司 | Method, device, equipment and storage medium for recognizing parking space based on looking around |
CN112172797B (en) * | 2020-09-27 | 2021-12-17 | 华人运通(上海)自动驾驶科技有限公司 | Parking control method, device, equipment and storage medium |
CN112201078B (en) * | 2020-09-30 | 2021-08-10 | 中国人民解放军军事科学院国防科技创新研究院 | Automatic parking space detection method based on graph neural network |
CN112036385B (en) * | 2020-11-04 | 2021-02-02 | 天津天瞳威势电子科技有限公司 | Library position correction method and device, electronic equipment and readable storage medium |
CN114475434B (en) * | 2020-11-11 | 2023-08-04 | 广州汽车集团股份有限公司 | Control and adjustment method for reversing outside rearview mirror, system and storage medium thereof |
CN112455430B (en) * | 2020-12-02 | 2023-05-30 | 苏州优达斯汽车科技有限公司 | Method for detecting inclined parking places without parking place lines, parking method and parking system |
CN112598922B (en) * | 2020-12-07 | 2023-03-21 | 安徽江淮汽车集团股份有限公司 | Parking space detection method, device, equipment and storage medium |
CN112633152B (en) * | 2020-12-22 | 2021-11-26 | 深圳佑驾创新科技有限公司 | Parking space detection method and device, computer equipment and storage medium |
CN112622885B (en) * | 2020-12-30 | 2022-03-22 | 惠州市德赛西威汽车电子股份有限公司 | Method and system for constructing inclined parking spaces based on ultrasonic radar |
CN112767425A (en) * | 2020-12-30 | 2021-05-07 | 智车优行科技(北京)有限公司 | Parking space detection method and device based on vision |
CN112927552B (en) * | 2021-01-20 | 2022-03-11 | 广州小鹏自动驾驶科技有限公司 | Parking space detection method and device |
CN112863242B (en) * | 2021-02-08 | 2022-07-01 | 广州小鹏自动驾驶科技有限公司 | Parking space detection method and device |
CN112776797A (en) * | 2021-02-27 | 2021-05-11 | 重庆长安汽车股份有限公司 | Original parking space parking establishment method and system, vehicle and storage medium |
CN112983085A (en) * | 2021-04-30 | 2021-06-18 | 的卢技术有限公司 | Parking space line identification method based on vision |
CN113053164A (en) * | 2021-05-11 | 2021-06-29 | 吉林大学 | Parking space identification method using look-around image |
CN113311437B (en) * | 2021-06-08 | 2022-04-19 | 安徽域驰智能科技有限公司 | Method for improving angular point position accuracy of vehicle-mounted radar positioning side parking space |
CN113822156B (en) * | 2021-08-13 | 2022-05-24 | 北京易航远智科技有限公司 | Parking space detection processing method and device, electronic equipment and storage medium |
CN113799769B (en) * | 2021-09-28 | 2023-06-16 | 北京经纬恒润科技股份有限公司 | Parking space recognition precision detection method and device and automatic driving vehicle |
CN113885532B (en) * | 2021-11-11 | 2023-07-25 | 江苏昱博自动化设备有限公司 | Unmanned floor truck control system of barrier is kept away to intelligence |
CN115206130B (en) * | 2022-07-12 | 2023-07-18 | 合众新能源汽车股份有限公司 | Parking space detection method, system, terminal and storage medium |
CN115148047B (en) * | 2022-07-25 | 2024-05-24 | 中汽创智科技有限公司 | Parking space detection method and device |
CN118097999A (en) * | 2024-04-29 | 2024-05-28 | 知行汽车科技(苏州)股份有限公司 | Parking space identification method, device, equipment and medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2436577A3 (en) * | 2010-09-30 | 2012-10-10 | Valeo Schalter und Sensoren GmbH | Device and method for detecting free parking spots |
CN104933409A (en) * | 2015-06-12 | 2015-09-23 | 北京理工大学 | Parking space identification method based on point and line features of panoramic image |
CN105946853A (en) * | 2016-04-28 | 2016-09-21 | 中山大学 | Long-distance automatic parking system and method based on multi-sensor fusion |
CN108281041A (en) * | 2018-03-05 | 2018-07-13 | 东南大学 | A kind of parking space's detection method blended based on ultrasonic wave and visual sensor |
Also Published As
Publication number | Publication date |
---|---|
CN109435942A (en) | 2019-03-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109435942B (en) | Information fusion-based parking space line and parking space recognition method and device | |
CN107738612B (en) | Automatic parking space detection and identification system based on panoramic vision auxiliary system | |
CN109460709B (en) | RTG visual barrier detection method based on RGB and D information fusion | |
WO2021259344A1 (en) | Vehicle detection method and device, vehicle, and storage medium | |
US8699754B2 (en) | Clear path detection through road modeling | |
US8634593B2 (en) | Pixel-based texture-less clear path detection | |
US8670592B2 (en) | Clear path detection using segmentation-based method | |
US8452053B2 (en) | Pixel-based texture-rich clear path detection | |
CN106952308B (en) | Method and system for determining position of moving object | |
CN111563412B (en) | Rapid lane line detection method based on parameter space voting and Bézier fitting |
JP4714104B2 (en) | Object tilt detection device | |
US9042639B2 (en) | Method for representing surroundings | |
CN112349144B (en) | Monocular vision-based vehicle collision early warning method and system | |
KR20160123668A (en) | Device and method for recognition of obstacles and parking slots for unmanned autonomous parking | |
US20100104137A1 (en) | Clear path detection using patch approach | |
US20090121899A1 (en) | Parking assistance device | |
CN109997148B (en) | Information processing apparatus, imaging apparatus, device control system, moving object, information processing method, and computer-readable recording medium | |
CN101910781A (en) | Moving state estimation device | |
CN110555407B (en) | Pavement vehicle space identification method and electronic equipment | |
KR20120072020A (en) | Method and apparatus for detecting run and road information of autonomous driving system | |
TWI656518B (en) | Marked parking space identification system and method thereof | |
Liu et al. | Development of a vision-based driver assistance system with lane departure warning and forward collision warning functions | |
CN112927303B (en) | Lane line-based automatic driving vehicle-mounted camera pose estimation method and system | |
JP4296287B2 (en) | Vehicle recognition device | |
JP4956099B2 (en) | Wall detector |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||