WO2021218310A1 - Parking method and apparatus, and vehicle - Google Patents


Publication number
WO2021218310A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
moment
gear
parking
sensor information
Application number
PCT/CN2021/077351
Other languages
English (en)
Chinese (zh)
Inventor
李俊超 (LI Junchao)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2021218310A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/06Automatic manoeuvring for parking

Definitions

  • This application relates to the technical field of smart cars, and in particular to a parking method, device and vehicle.
  • Parking is one of the biggest challenges people face when driving, and a common daily problem for car owners.
  • As technology has developed, vehicles equipped with sensors have come into public view. Sensors not only play a key role in autonomous driving, but can also help people solve parking problems.
  • Vehicles equipped with sensors can detect idle parking spaces through the sensors, determine the position of the vehicle relative to an idle parking space from its pose changes while driving, and park automatically or assist the driver in parking according to that relative position, thereby helping people with the problem of difficult parking.
  • Whether the relative position can be determined accurately determines, to a large extent, whether the vehicle can complete parking safely and accurately.
  • Vehicles generally detect pose changes through an on-board inertial measurement unit (IMU), so as to determine the position of the vehicle relative to an empty parking space.
  • the embodiments of the present application provide a parking method, device, and vehicle, which are used to improve the accuracy of the relative position between the vehicle and the idle parking space.
  • In a first aspect, an embodiment of the present application provides a parking method, including: acquiring a first relative position between a vehicle and an idle parking space; acquiring first sensor information of the vehicle at a first moment; acquiring second sensor information of the vehicle at a second moment, where the direction of movement of the vehicle from the first moment to a first position is opposite to the direction of movement of the vehicle from the first position to the second moment; acquiring relative pose information of the vehicle from the second moment to the first moment; matching the second sensor information with the first sensor information to obtain the cumulative error in the relative pose information; and determining a second relative position of the vehicle and the idle parking space according to the cumulative error and the first relative position.
  • For ease of description, the moment at which the first relative position is acquired is called the initial moment;
  • the relative pose information of the vehicle from the second moment to the first moment is called the first relative pose information;
  • and the relative pose information of the vehicle from the second moment to the initial moment is called the second relative pose information.
  • The cumulative error in the second relative pose information includes the cumulative error in the first relative pose information.
  • The method in the embodiment of the present application is therefore beneficial for eliminating the cumulative error in the second relative pose information.
  • Although the second relative position could be determined directly from the first relative position and the second relative pose information, the method of the embodiment of the present application instead determines the second relative position according to the first relative position and the cumulative error in the first relative pose information. This helps eliminate the cumulative error in the second relative position by eliminating the cumulative error in the second relative pose information, improves the accuracy of the second relative position, and in turn helps improve the parking success rate.
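As an illustrative sketch (not part of the patent disclosure), the correction step can be pictured as subtracting the estimated cumulative error from the dead-reckoned relative pose before composing it with the first relative position. The function names and the planar (x, y, heading) pose representation are assumptions for illustration:

```python
import math

def compose(pose, delta):
    """Compose a 2D pose (x, y, heading) with a relative motion
    expressed in that pose's own frame."""
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

def corrected_relative_position(first_relative_pos, relative_pose, cumulative_error):
    """Remove the estimated cumulative error from the dead-reckoned
    relative pose, then compose it with the first relative position."""
    ex, ey, eth = cumulative_error
    px, py, pth = relative_pose
    corrected = (px - ex, py - ey, pth - eth)
    return compose(first_relative_pos, corrected)
```

Here the cumulative error is assumed to be expressed in the same frame as the relative pose; in practice the error model would depend on how the matching step parameterizes the drift.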
  • the first relative position and the second relative position may also include the posture of the vehicle relative to the idle parking space.
  • The first moment is a moment at which at least one of the following signals is acquired: a braking signal of the vehicle, a start signal of the vehicle, an identification signal of the idle parking space, or a selection signal of the idle parking space.
  • Obtaining the first sensor information at any one of the foregoing moments helps obtain, in a timely manner, the object against which the second sensor information is matched, thereby helping to eliminate the accumulated error in the second relative position.
  • The relative pose information is used to indicate the change in the pose of the vehicle from the second moment to the first moment; the pose change includes a change in the position of the vehicle and/or a change in the attitude of the vehicle, which is beneficial for eliminating the accumulated error in both the position change and the attitude change, so as to determine the relative position of the vehicle and the idle parking space more accurately.
  • the relative pose information is detected by an inertial measurement unit in the vehicle.
  • The first sensor information and the second sensor information are obtained by at least one of the following sensors: ultrasonic radar, microwave radar, laser rangefinder, and image detector.
  • The above-mentioned sensors are used to detect the vehicle's environmental information; the first sensor information and second sensor information obtained reflect the environment around the vehicle, which facilitates matching the first sensor information with the second sensor information and constructing a loss function to calculate the cumulative error.
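As a hedged sketch of this idea (not the patent's actual algorithm), matching can be posed as minimizing a loss over candidate alignments of the two sensor readings; the alignment that minimizes the loss is then an estimate of the accumulated error. The brute-force one-dimensional search below, and all names in it, are illustrative assumptions:

```python
def estimate_drift(first_scan, second_scan, max_shift):
    """Brute-force search for the index shift that best aligns two range
    profiles, using a sum-of-squared-differences loss. The best shift is
    a coarse estimate of the accumulated along-track error."""
    best_shift, best_loss = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        total, count = 0.0, 0
        for i, a in enumerate(first_scan):
            j = i + shift
            if 0 <= j < len(second_scan):
                total += (a - second_scan[j]) ** 2
                count += 1
        if count and total / count < best_loss:
            best_loss, best_shift = total / count, shift
    return best_shift
```

A real system would align 2-D point clouds or image features rather than 1-D profiles, but the structure (candidate transform, loss, argmin) is the same.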
  • The gear position of the vehicle at the first moment is a first gear, and the gear position of the vehicle at the second moment is a second gear; either the first gear is a reverse gear and the second gear is a forward gear, or the first gear is a forward gear and the second gear is a reverse gear.
  • In this way, the matching success rate of the second sensor information and the first sensor information can be improved at a low computational cost.
  • The gear of the vehicle is switched from the first gear to the second gear at the first position; the difference between a first travel distance and a second travel distance does not exceed a threshold, where the first travel distance is the distance traveled by the vehicle from the first moment to the first position, and the second travel distance is the distance traveled by the vehicle from the first position to the second moment.
  • In this way, the matching success rate of the second sensor information and the first sensor information can be improved more reliably.
  • the threshold value is determined according to the sensing range of at least one of the following sensors: ultrasonic radar, microwave radar, laser rangefinder, and image detector.
  • Determining the threshold according to the sensing range of the sensor helps set the threshold more accurately, further improving the matching success rate of the second sensor information and the first sensor information.
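A minimal sketch of this distance check, assuming the threshold is simply proportional to the sensing range (the patent does not specify the exact relation; the `margin` factor and all names are hypothetical):

```python
def within_matching_threshold(first_travel, second_travel, sensing_range,
                              margin=1.0):
    """Return True when the travel distances before and after the gear
    switch differ by no more than a threshold tied to the sensing range,
    so the first and second sensor readings cover overlapping terrain."""
    threshold = margin * sensing_range
    return abs(first_travel - second_travel) <= threshold
```

The intuition: if the two distances differ by less than the sensor's reach, the two readings observe largely the same stretch of environment and can be matched.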
  • In a second aspect, the present application provides a parking device, including: an acquisition module, configured to acquire a first relative position between a vehicle and an idle parking space, acquire first sensor information of the vehicle at a first moment, acquire second sensor information of the vehicle at a second moment (where the direction of movement of the vehicle from the first moment to a first position is opposite to the direction of movement of the vehicle from the first position to the second moment), acquire relative pose information of the vehicle from the second moment to the first moment, and match the second sensor information with the first sensor information to obtain the cumulative error in the relative pose information; and a determining module, configured to determine a second relative position of the vehicle and the idle parking space according to the cumulative error and the first relative position.
  • The first moment is a moment at which at least one of the following signals is acquired: a braking signal of the vehicle, a start signal of the vehicle, an identification signal of the idle parking space, or a selection signal of the idle parking space.
  • The relative pose information is used to indicate the change in the pose of the vehicle from the second moment to the first moment; the pose change includes a change in the position of the vehicle and/or a change in the posture of the vehicle.
  • the relative pose information is detected by an inertial measurement unit in the vehicle.
  • The first sensor information and the second sensor information are obtained by at least one of the following sensors: ultrasonic radar, microwave radar, laser rangefinder, and image detector.
  • The gear position of the vehicle at the first moment is a first gear, and the gear position of the vehicle at the second moment is a second gear; either the first gear is a reverse gear and the second gear is a forward gear, or the first gear is a forward gear and the second gear is a reverse gear.
  • The gear of the vehicle is switched from the first gear to the second gear at the first position; the difference between a first travel distance and a second travel distance does not exceed a threshold, where the first travel distance is the distance traveled by the vehicle from the first moment to the first position, and the second travel distance is the distance traveled by the vehicle from the first position to the second moment.
  • the threshold value is determined according to the sensing range of at least one of the following sensors: ultrasonic radar, microwave radar, laser rangefinder, and image detector.
  • The present application further provides a parking device, including one or more processors and one or more memories; the processor is coupled to the memory, the memory is configured to store a program, and the processor is configured to execute the program in the memory to perform the steps described in the first aspect or any one of its possible implementations.
  • an embodiment of the present application provides a vehicle, including the parking device described in the second aspect or any one of the possible implementation manners of the second aspect.
  • The embodiments of the present application further provide a computer-readable storage medium, including a program which, when run on a computer, causes the computer to execute the method of the first aspect or any one of the possible implementations of the first aspect.
  • This application further provides a computer program product which, when run on a computer, causes the computer to execute the method described in the first aspect or any one of the possible implementations of the first aspect.
  • The present application further provides a chip system including a processor and a memory; the memory is used to store a computer program, and the processor is used to call and run the computer program stored in the memory to execute the method described in the first aspect or any one of the possible implementations of the first aspect.
  • The chip system may consist of chips, or may include chips and other discrete devices.
  • Fig. 1 is an exemplary functional block diagram of a vehicle involved in an embodiment of the present application
  • Figure 2a shows a possible parking process of a vehicle taking parallel parking spaces as an example
  • Figure 2b shows a possible parking process of a vehicle with a vertical parking space as an example
  • Figure 3a is a possible schematic diagram of a vehicle provided with an image detector
  • Figure 3b is a possible schematic diagram of a vehicle equipped with an ultrasonic radar
  • Fig. 3c is a schematic diagram of the detection distance detected by the vehicle shown in Fig. 3b changing with time;
  • Fig. 4 shows a possible schematic diagram of parking the vehicle according to the relative position with accumulated error during the parking process described in Fig. 2a;
  • Figure 5a is a schematic diagram of a possible implementation of the parking method of the present application.
  • Figure 5b is a schematic diagram of a possible refinement of step 505 in Figure 5a;
  • Figure 5c is a schematic diagram of the principle of constructing an error function
  • Fig. 6 is a schematic diagram of the parking process obtained by translating the away path in Fig. 2a for a certain distance along the y direction;
  • FIG. 7 is a schematic diagram of a possible embodiment of the parking device of the present application.
  • Fig. 8 is a schematic diagram of another possible embodiment of the parking device of the present application.
  • The terms "first" and "second" in the description and claims of the embodiments of the present application are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that terms used in this way are interchangeable under appropriate circumstances; this is merely a way of distinguishing objects of the same attribute when describing the embodiments of the present application.
  • The terms "including" and "having" and any variations thereof are intended to cover non-exclusive inclusion, so that a process, method, system, product, or device that includes a series of units is not necessarily limited to those units, but may include other units not expressly listed or inherent to the process, method, product, or device.
  • At least one (item) refers to one or more, and “multiple” refers to two or more.
  • "And/or" describes an association relationship between associated objects and indicates that three relationships may exist.
  • For example, "A and/or B" can mean: only A exists, only B exists, or both A and B exist, where A and B can be singular or plural.
  • The character "/" generally indicates an "or" relationship between the associated objects before and after it.
  • "At least one of the following items" or similar expressions refer to any combination of these items, including any combination of single items or plural items.
  • For example, "at least one of a, b, or c" can mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c can each be singular or plural.
  • The embodiment of the present application relates to a vehicle, which refers to a power-driven, non-rail vehicle with four or more wheels, used, for example, to carry people and/or goods.
  • The vehicle can integrate sensors (for example, ultrasonic radar and image detectors), controllers, actuators, and other devices, so as to have environment perception capabilities and realize different levels of automatic parking functions.
  • FIG. 1 is an exemplary functional block diagram of a vehicle 100 involved in an embodiment of the present application.
  • the vehicle 100 is configured in a fully or partially autonomous driving mode.
  • The vehicle 100 can control itself while in the automatic driving mode; it can determine the current state of the vehicle and its surrounding environment, determine the possible behavior of at least one other vehicle in the surrounding environment and the confidence level corresponding to the possibility that the other vehicle performs that behavior, and control the vehicle 100 based on the determined information.
  • the vehicle 100 can be placed to operate without human interaction.
  • the vehicle 100 may include various subsystems, such as a travel system 102, a sensor system 104, a control system 106, one or more peripheral devices 108 and a power supply 110, a computer system 112, and a user interface 116.
  • the vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements.
  • each of the subsystems and elements of the vehicle 100 may be wired or wirelessly interconnected.
  • the travel system 102 may include components that provide power movement for the vehicle 100.
  • the traveling system 102 may include an engine 118, an energy source 119, a transmission 120, and wheels 121.
  • the sensor system 104 may include several sensors.
  • The sensor system 104 may include a positioning system 122 (which may be a GPS system, a Beidou system, or another positioning system) and an inertial measurement unit (IMU) 124; the sensor system 104 may also include sensors that detect information about the environment around the vehicle 100, such as the radar 126, the laser rangefinder 128, and the image detector 130.
  • the radar 126 may be an ultrasonic radar, a microwave radar, or the like.
  • the positioning system 122 can be used to estimate the geographic location of the vehicle 100.
  • the IMU 124 is used to detect the pose change of the vehicle 100 based on the inertial acceleration.
  • the IMU 124 may be a combination of an accelerometer and a gyroscope.
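Pose changes come from integrating IMU measurements, which is why small per-sample errors accumulate into the drift this application targets. A simplified planar dead-reckoning sketch, assuming a forward accelerometer and a yaw-rate gyroscope (the names and the single-axis model are illustrative, not from the patent):

```python
import math

def dead_reckon(imu_samples, dt, pose=(0.0, 0.0, 0.0), v=0.0):
    """Integrate (forward_accel, yaw_rate) samples into a planar pose.
    Small errors in each sample accumulate over time, producing the
    cumulative error that the parking method aims to eliminate."""
    x, y, th = pose
    for accel, yaw_rate in imu_samples:
        v += accel * dt            # integrate acceleration into speed
        th += yaw_rate * dt        # integrate yaw rate into heading
        x += v * math.cos(th) * dt # integrate speed into position
        y += v * math.sin(th) * dt
    return (x, y, th), v
```

Each integration step compounds sensor bias and noise, so the position estimate degrades with driven distance; this is the motivation for correcting the pose with sensor-information matching.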
  • the radar 126 may use radio signals to detect objects in the surrounding environment of the vehicle 100. In some embodiments, in addition to detecting an object, the radar 126 may also be used to detect the speed and/or direction of the object.
  • the laser rangefinder 128 may use laser light to detect objects in the environment where the vehicle 100 is located. In some embodiments, the laser rangefinder 128 may include one or more laser sources, laser scanners, and one or more detectors, as well as other system components.
  • The image detector 130 may be used to capture multiple images of the surrounding environment of the vehicle 100. The image detector 130 may be a static image detector or a video image detector.
  • the control system 106 controls the operation of the vehicle 100 and its components.
  • the control system 106 may include various components, including a steering system 132, a throttle 134, a braking unit 136, a sensor fusion algorithm 138, a computer vision system 140, a route control system 142, and an obstacle avoidance system 144.
  • the computer vision system 140 may be operable to process and analyze the images captured by the image detector 130 in order to identify objects and/or features in the surrounding environment of the vehicle 100. In some embodiments, the computer vision system 140 may be used to map the environment, track objects, estimate the speed of objects, identify free parking spaces, and so on.
  • the route control system 142 is used to determine the travel route of the vehicle 100. In some embodiments, the route control system 142 may combine data from the sensor system 104 to determine a travel route for the vehicle 100.
  • the vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through the peripheral device 108.
  • the peripheral device 108 may include a wireless communication system 146, an onboard computer 148, a microphone 150, and/or a speaker 152.
  • the peripheral device 108 provides a means for the user of the vehicle 100 to interact with the user interface 116.
  • the touch display screen of the onboard computer 148 can provide information to the user of the vehicle 100.
  • The on-board computer 148 can also receive user input through its touch screen.
  • the computer system 112 may include at least one processor 113 that executes instructions 115 stored in a non-transitory computer readable medium such as the memory 114.
  • the computer system 112 may also be multiple computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
  • the memory 114 may contain instructions 115 (eg, program logic), which may be executed by the processor 113 to perform various functions of the vehicle 100, including those functions described above.
  • The memory 114 may also contain additional instructions, including instructions for sending data to, receiving data from, interacting with, and/or controlling one or more of the travel system 102, the sensor system 104, the control system 106, and the peripheral devices 108.
  • the memory 114 may also store data, such as road maps, route information, and other information. Such information may be used by the vehicle 100 and the computer system 112 during the operation of the vehicle 100 in autonomous, semi-autonomous, and/or manual modes.
  • the user interface 116 is used to provide information to or receive information from a user of the vehicle 100.
  • the user interface 116 may include one or more input/output devices in the set of peripheral devices 108, such as a wireless communication system 146, a car computer 148, a microphone 150, and a speaker 152.
  • the computer system 112 may control the functions of the vehicle 100 based on inputs received from various subsystems (for example, the travel system 102, the sensor system 104, and the control system 106) and from the user interface 116. For example, the computer system 112 may utilize input from the control system 106 in order to control the steering unit 132 to avoid obstacles detected by the sensor system 104 and the obstacle avoidance system 144. In some embodiments, the computer system 112 is operable to provide control of many aspects of the vehicle 100 and its subsystems.
  • the above-mentioned computer system 112 may be referred to as a computing center, which is used to perform calculation and control of one or more functions.
  • one or more of these components described above may be installed or associated with the vehicle 100 separately.
  • The memory 114 may exist partially or completely separately from the vehicle 100.
  • the above-mentioned components may be communicatively coupled together in a wired and/or wireless manner.
  • The various systems in the above vehicle 100 (such as the travel system 102, the control system 106, and the computer system 112) are logical concepts.
  • In practice, each system may take one or more physical forms; for example, a system may be a physical device, a single board, or a chip or an area on a board.
  • the application scenarios of the embodiments of the present application may include, but are not limited to: indoor and outdoor parking.
  • Parking space types may include, but are not limited to: vertical parking spaces, parallel parking spaces, or oblique parking spaces.
  • the parking space can be a marked area (for example, a lined parking space) or an unmarked area.
  • parking space 1 and parking space 3 are occupied by vehicle 1 and vehicle 3 respectively, and parking space 2 is not occupied and is an idle parking space.
  • Fig. 2a and Fig. 2b the areas in three adjacent rectangular frames represent parking space 1, parking space 2 and parking space 3, respectively.
  • FIG. 2a shows the parking process of vehicle 2 to parking space 2.
  • the parking space lines of parking space 1, parking space 2 and parking space 3 are three rectangular boxes with short sides parallel to each other.
  • Vehicle 2 starts from position A, where parking space 2 is found, and enters parking space 2 along the path shown by the dotted line with an arrow.
  • FIG. 2b shows the parking process of vehicle 2 from position A to parking space 2 as an example.
  • the parking space lines of parking space 1, parking space 2 and parking space 3 are three rectangular boxes with parallel long sides.
  • The parking path includes two sub-paths, respectively called the away path and the entry path.
  • The away path refers to the path from position A through position B to position C;
  • the entry path refers to the path from position C, back past the vicinity of position B, into parking space 2.
  • The process of vehicle 2 driving into parking space 2 along the parking path is as follows: on the away path, vehicle 2 is in forward gear and, starting from position A, first approaches parking space 2, then drives past it, passes position B, and stops at position C; there, vehicle 2 is switched to reverse gear, after which it drives along the entry path and enters parking space 2 from position C through the vicinity of position B.
  • The vehicle involved in the embodiment of the present application can perform parking at any one of the following SAE driving automation levels:
  • Level 0: provides warnings and momentary assistance during parking, such as active braking, blind spot monitoring, lane departure warning, and body stabilization systems.
  • Level 1: the driver controls the speed of the vehicle, and the vehicle determines and executes the steering according to the vehicle speed and the surrounding environment. Level 1 corresponds to semi-automatic parking.
  • Level 2: the vehicle determines and executes all operations such as steering, acceleration, and deceleration according to the surrounding environment, while the driver monitors from inside or outside the vehicle. Level 2 corresponds to fully automatic parking.
  • Level 3: the vehicle parks in a designated or arbitrary idle parking space without the driver's operation or supervision. Level 3 corresponds to autonomous parking.
  • the parking process is divided into multiple parking links in the following, and each parking link is exemplified.
  • The parking process generally includes the following links: 1) environment perception, 2) idle parking space positioning, 3) parking path planning, 4) parking path following control, and 5) simulation display.
  • Environment perception: refers to detecting objects around the vehicle body, such as obstacles and idle parking spaces, from the sensor information obtained by the vehicle.
  • the sensor used for environment perception may be a sensor that detects information about the environment around the vehicle, such as a fusion of one or more of the radar 126, the laser rangefinder 128, and the image detector 130 in FIG. 1.
  • Idle parking space positioning: refers to detecting idle parking spaces on the basis of the environment perception link and determining the relative position of the vehicle and the idle parking space from sensor information, completing the positioning of the idle parking space.
  • Environment perception and vacant parking space location can be performed by the computer vision system 140 in FIG. 1, for example.
  • Fig. 3a is a schematic diagram of vehicle 2 with image detectors deployed, taking a fisheye camera as the image detector as an example. In Fig. 3a, each black-filled rectangle represents a fisheye camera; one fisheye camera is provided on each of the front, rear, left, and right of vehicle 2 for environment perception.
  • At position A in Fig. 2a, vehicle 2 can obtain 4 fisheye images through the 4 fisheye cameras and produce a top view using image stitching. After vehicle 2 detects parking space 2 in the top view, it can determine the pixel position of parking space 2 in the top view. Then, according to the pixel position and the pre-calibrated measurement matrix of the vehicle's fisheye cameras, the position of parking space 2 in the vehicle body coordinate system is obtained; that is, the relative position of the vehicle and parking space 2 is determined, completing the idle parking space positioning.
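The pixel-to-body-frame step can be sketched under the assumption that the pre-calibrated measurement matrix acts like a 3x3 planar homography from top-view pixels to vehicle-body coordinates (the patent does not define the matrix's exact form; all names here are hypothetical):

```python
def pixel_to_body(pixel, H):
    """Map a top-view pixel (u, v) to vehicle-body coordinates (x, y)
    using a 3x3 homography H, a stand-in for the pre-calibrated
    measurement matrix. Homogeneous coordinates, then perspective divide."""
    u, v = pixel
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return (x / w, y / w)
```

For a stitched, rectified top view the bottom row of H is close to (0, 0, 1) and the mapping degenerates to a scale-and-offset from pixels to meters.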
  • the following introduces the environment perception and idle parking space locating links performed by vehicle 2 using ultrasonic radar.
  • Fig. 3b is a schematic diagram of vehicle 2 with ultrasonic radar deployed. As shown in Fig. 3b, four ultrasonic parking assist (UPA) sensors, represented by black-filled triangles, are installed at each of the front and rear of vehicle 2 to detect obstacles around the vehicle; the left and right sides of vehicle 2 are fitted with automatic parking assist (APA) sensors, used to detect parking spaces.
  • the vehicle 2 starts from the position A and passes through the parking space 2 in parallel at a certain speed.
  • the vehicle 2 continuously obtains the sensor information detected by the APA sensor, and the vehicle 2 can determine the change over time of the detection distance of the APA according to the sensor information.
  • When the detection distance increases and the increase exceeds a threshold d, vehicle 2 considers that one boundary of a candidate idle parking space has been detected; when the detection distance decreases and the decrease exceeds the threshold d, vehicle 2 considers that the other boundary of the candidate idle parking space has been detected.
  • For example, vehicle 2 obtains the change of the detection distance over time from the data collected by the APA sensor at the right rear of the vehicle body (the APA sensor inside the dashed circle in Fig. 3b), as shown in Fig. 3c.
  • the time t1 in FIG. 3c corresponds to the time when the vehicle 2 is at the position A.
  • the detection distance is d1, which means that the distance between the vehicle 2 and the vehicle 1 is d1.
  • At time t2, the detection distance increases to d2; assuming d2 - d1 > d, vehicle 2 considers that one boundary of the candidate idle parking space is detected at time t2, which corresponds to the common boundary between parking space 1 and parking space 2 in Fig. 2a.
  • From t2 to t3, the detection distance remains d2, but after time t3 it gradually drops back to d1.
  • At time t3, the position of vehicle 2 corresponds to position B in Fig. 2a. Since d2 - d1 > d, vehicle 2 considers that the other boundary of the candidate idle parking space is detected at time t3, which corresponds to the common boundary between parking space 2 and parking space 3 in Fig. 2a.
  • the length of the candidate idle parking space may be determined according to the vehicle speed and the time between the two moments.
  • Vehicle 2 then considers the candidate idle parking space (i.e., parking space 2) an available idle parking space.
  • The position of vehicle 2 relative to parking space 2 can be determined while vehicle 2 is at position B, completing the idle parking space positioning.
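The boundary-detection logic described above can be sketched as a scan over the distance/time series: a rise larger than d marks one boundary, a later fall larger than d marks the other, and the space length follows from the vehicle speed multiplied by the elapsed time. This is an illustrative simplification (Fig. 3c shows a gradual fall, and a real implementation would filter sensor noise); all names are assumptions:

```python
def find_space_length(distances, times, speed, d, min_length):
    """Detect candidate parking-space boundaries from an APA-style
    distance series sampled at the given times. Returns the space length
    (speed * elapsed time between boundaries) if it meets min_length,
    otherwise None."""
    rise_t = fall_t = None
    for i in range(1, len(distances)):
        step = distances[i] - distances[i - 1]
        if rise_t is None and step > d:
            rise_t = times[i]           # first boundary: distance jumps up
        elif rise_t is not None and step < -d:
            fall_t = times[i]           # second boundary: distance drops
            break
    if rise_t is None or fall_t is None:
        return None
    length = speed * (fall_t - rise_t)
    return length if length >= min_length else None
```

With a constant speed of 2 m/s and boundaries 3 s apart, the candidate space is 6 m long, enough for a parallel space but subject to the vehicle's own length requirement.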
  • Parking path planning: refers to formulating a parking path according to the relative position of the vehicle and the idle parking space.
  • the parking path planning may be performed by, for example, the route control system 142 in FIG. 1.
  • the parking path is used to guide the driver to enter the idle parking space by means of images or voice.
  • the parking path is used to instruct the vehicle to automatically enter the free parking space.
  • Parking path following control refers to continuously detecting the surrounding environment of the vehicle through the vehicle's sensors (such as an IMU), recording the change of the vehicle's pose, judging whether the vehicle follows the parking path according to the change of pose, and, if the vehicle deviates from the parking path, adjusting the parking path according to the current relative position of the vehicle and the free parking space.
  • Simulation display refers to the construction and output of a parking simulation environment based on the sensor information collected by the vehicle's sensors and the relative position of the vehicle and the free parking space.
  • the parking simulation environment has prompt and interactive effects.
  • the simulation display can be realized by the user interface 116 in FIG. 1, for example.
  • when the vehicle finds a free parking space, the displayed parking simulation environment prompts that a free parking space has been found and indicates its location; if the user does not select this free parking space, the vehicle continues to look for other free parking spaces.
  • the parking process involved in this application may only include part of the parking links in the above-mentioned parking links.
  • the parking link 3) can be executed after an empty parking space is found.
  • the away path is mainly used to move the vehicle away from parking space 2 to provide a sufficient parking distance.
  • the demand for parking assistance is not large.
  • the parking step 3) may not be executed on the away path, but may be executed on the entering path.
  • the vehicle may execute the parking link 3) according to the relative position determined in the parking link 2).
  • the vehicle may not be able to locate free parking spaces through the parking steps 1) and 2), so the vehicle cannot perform the parking step 3) or the parking step 4) based on the relative position determined in the parking step 2). Therefore, in the parking step 3) or the parking step 4) performed on the driving path, the prior art generally determines the relative position between the vehicle and the free parking space according to the pose change of the vehicle and the relative position previously determined in the parking step 2).
  • a pose detection device such as an IMU is generally used to detect the pose change of the vehicle.
  • the working principle of this type of pose detection device is generally: by recording the pose changes of the vehicle between adjacent moments, accumulatively calculate the pose change of the vehicle over a period of time. Because the pose change detected between adjacent moments contains a certain error, the errors in the detected pose changes continue to accumulate and increase during the driving of the vehicle.
  • the pose change detected by the pose detection device introduces a large cumulative error into the relative position determined in the parking step 4), which reduces the accuracy of the relative position and causes parking failure.
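The accumulation described above can be illustrated with a toy 1-D dead-reckoning sketch (purely illustrative; a real IMU error model involves noise and bias in all six pose components): a small constant bias per step produces an error that grows with the number of integrated steps.

```python
def dead_reckon(increments, bias_per_step):
    """Integrate per-step displacement measurements, each corrupted by a
    constant bias, and return (estimated_position, accumulated_error)."""
    true_position = sum(increments)
    estimated_position = sum(inc + bias_per_step for inc in increments)
    return estimated_position, estimated_position - true_position
```

After 100 steps of 0.1 m with a 1 mm per-step bias, the accumulated error reaches about 0.1 m even though each individual measurement is nearly correct.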
  • the embodiment of the present application provides a parking method, which is used to eliminate the accumulated error in the relative position of the vehicle and the idle parking space, thereby improving the success rate of parking.
  • the following describes an embodiment of the parking method provided in the present application.
  • the execution subject of the parking method of the present application is a parking device.
  • the parking device may be the aforementioned vehicle 100 or the computer system 112 of the vehicle 100.
  • FIG. 5a is a flowchart of a parking method according to an embodiment of this application.
  • a possible embodiment of the parking method of this application may include the following steps:
  • the parking device can obtain the relative position of the vehicle and the free parking space after finding the free parking space.
  • the relative position is referred to as the first relative position
  • the time at which the first relative position is acquired is referred to as the initial time.
  • step 501 can be understood with reference to the introduction of the parking links 1) and 2).
  • the initial moment may be the earliest moment when the free parking space is detected through environmental perception, for example, corresponding to position A in FIG. 2a or FIG. 2b; alternatively, the initial moment may be the latest moment when the free parking space is detected through environmental perception.
  • the parking device may generate a parking path from the initial moment to the free parking space according to the first relative position.
  • the parking device may record the pose change of the vehicle.
  • a sensor may be provided in the vehicle, and the parking device may obtain corresponding sensor information.
  • the parking device can obtain the sensor information of the vehicle at the first moment after finding an empty parking space.
  • this sensor information is referred to as first sensor information.
  • the first sensor information is obtained according to data collected by the sensor.
  • the data is detected at the first time, or the data includes data detected at the first time and data detected within a certain period of time before the first time.
  • the sensor may include one or more sensors in the aforementioned sensor system 104 for detecting environmental information around the vehicle 100.
  • the sensor is an image detector, and the first sensor information is image information.
  • the sensor may be a fish-eye camera.
  • the vehicle may be provided with 4 fisheye cameras, and the first sensor information may be a fisheye image or top view obtained from the data collected by the fisheye cameras at the first moment.
  • the sensor is an ultrasonic radar
  • the first sensor information is the distribution of the detection distance over time, or is referred to as an envelope diagram of an obstacle.
  • the sensor may be an APA sensor.
  • the vehicle may be equipped with 4 APA sensors, and the first sensor information may be the detection distance within a certain period of time at and before the first moment, obtained according to the APA sensor data.
  • the first sensor information can be used to represent the time distribution information of the detection distance in FIG. 3c.
  • the vehicle travels from the first moment to the first position, and then continues to travel from the first position to the second moment; the direction of movement of the vehicle from the first moment to the first position is opposite to the direction of movement of the vehicle from the first position to the second moment.
  • the first position may be position C
  • the first moment is a moment away from the path
  • the second moment is a moment when entering the path.
  • the parking device may obtain the sensor information of the vehicle at the second moment.
  • the sensor information is referred to as second sensor information.
  • for the understanding of the second sensor information, reference may be made to the description of the first sensor information in step 502, which will not be repeated here.
  • the parking device can acquire the relative pose information of the vehicle from the second moment to the first moment.
  • the relative pose information is used to indicate the pose change of the vehicle's pose at the second moment relative to the pose of the vehicle at the first moment.
  • the pose of the vehicle includes the position and posture of the vehicle.
  • the position of the vehicle can be represented by three-dimensional coordinates
  • the posture of the vehicle can be represented by the yaw angle, pitch angle, and roll angle of the vehicle.
  • the position and posture changes of the vehicle include the position change and the posture change of the vehicle.
  • the position change can be represented by a translation vector
  • the pose change can be represented by a rotation matrix.
  • the rotation matrix can be used to represent changes in the yaw angle, pitch angle, and roll angle of the vehicle.
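As a concrete sketch of this representation, restricted to the planar yaw-only case for brevity (the full pose additionally carries pitch, roll, and a third translation component), a pose change can be held as a rotation matrix R and a translation vector t, and two successive changes compose as R = R2·R1, t = R2·t1 + t2. All names here are illustrative, not the patent's.

```python
import math

def yaw_rotation(yaw):
    """2-D rotation matrix for a yaw angle given in radians."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s], [s, c]]

def apply_pose(R, t, p):
    """Transform point p by the pose change (R, t): p' = R*p + t."""
    return [R[0][0] * p[0] + R[0][1] * p[1] + t[0],
            R[1][0] * p[0] + R[1][1] * p[1] + t[1]]

def compose(R2, t2, R1, t1):
    """Compose pose change (R1, t1) followed by (R2, t2)."""
    R = [[sum(R2[i][k] * R1[k][j] for k in range(2)) for j in range(2)]
         for i in range(2)]
    t = apply_pose(R2, t2, t1)  # t = R2*t1 + t2
    return R, t
```

Composing two quarter-turn yaw changes, for example, yields a half turn, which maps the point (1, 0) to (-1, 0).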
  • the parking device detects the pose change of the vehicle through the pose detection device.
  • the working principle of the pose detection device is generally: by recording the pose change of the vehicle between adjacent moments, accumulatively calculate the pose change of the vehicle over a period of time. Therefore, there is a certain cumulative error in the relative pose information.
  • the pose detection device is an IMU.
  • the parking device can match the second sensor information with the first sensor information to obtain the cumulative error in the pose change of the vehicle from the second moment to the first moment.
  • "cumulative error in pose change" and "cumulative error in relative pose information" have the same meaning.
  • the parking device may determine the second relative position of the vehicle and the free parking space according to the first relative position, the relative pose information of the vehicle from the second moment to the initial moment, and the cumulative error.
  • the cumulative error in the second relative pose information includes the cumulative error in the first relative pose information.
  • the method in the embodiment of the present application is beneficial to eliminating the cumulative error in the second relative pose information. Since the second relative position can be determined according to the first relative position and the second relative pose information, in the method of the embodiment of the present application, the second relative position is determined according to the first relative position and the cumulative error in the first relative pose information, which is beneficial to eliminating the cumulative error in the second relative position and improving the accuracy of the second relative position, thereby helping to increase the parking success rate.
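In planar terms, the combination in this step can be sketched as below. This is a simplified 2-D, translation-only illustration with assumed sign conventions (relative position taken as parking-space position minus vehicle position); the actual method also corrects attitude.

```python
def second_relative_position(first_rel_pos, dead_reckoned_motion, cumulative_error):
    """Relative position at the second moment: start from the first
    relative position, remove the cumulative error from the dead-reckoned
    motion since the initial moment, then subtract the corrected motion.
    All arguments are 2-D (x, y) vectors."""
    corrected = [m - e for m, e in zip(dead_reckoned_motion, cumulative_error)]
    return [p - c for p, c in zip(first_rel_pos, corrected)]
```

With a first relative position of (5, 2), a dead-reckoned motion of (3, 0), and a cumulative error of (0.5, 0), the corrected second relative position is (2.5, 2) rather than the uncorrected (2, 2).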
  • the relative pose information of the vehicle from the second moment to the initial moment is used to indicate the pose change of the vehicle from the second moment to the initial moment.
  • the parking device may detect the change in the pose of the vehicle at the second moment relative to the pose of the vehicle at the initial moment through the pose detection device. Reference may be made to the aforementioned related descriptions for the understanding of the pose change and the pose detection device, which will not be repeated here.
  • if the first moment is the initial moment, then the pose change of the vehicle from the second moment to the initial moment is the pose change of the vehicle from the second moment to the first moment.
  • the first relative position may be determined according to the first sensor information.
  • the first moment is a moment after the initial moment.
  • the pose change of the vehicle from the second moment to the initial moment includes: the pose change of the vehicle from the second moment to the first moment and the pose change of the vehicle from the first moment to the initial moment.
  • the embodiment of the present application may also obtain the relative pose information of the vehicle from the second moment to the initial moment, or obtain the relative pose information of the vehicle from the first moment to the initial moment.
  • the parking device may generate a parking path according to the second relative position.
  • based on step 501, the parking device generates a parking path from the initial moment to the free parking space according to the first relative position.
  • the parking device can also correct the relative pose information based on the cumulative error
  • the parking device judges whether the vehicle is driving according to the parking path based on the corrected relative pose information, and if the vehicle deviates from the parking path, the parking device executes step 506 and adjusts the parking path according to the second relative position.
  • step 505 may specifically include the following steps:
  • A feature refers to an identifiable point, line, or area in the sensor information.
  • the sensor information corresponding to the image detector is image information, and the features extracted from the image information can be point features, line features, or parking spaces in the image.
  • the sensor information corresponding to the ultrasonic radar is the outer envelope image of the obstacle scanned by the probe wave emitted by the ultrasonic radar.
  • the features extracted from the outer envelope image can be information such as point features and line features in the outer envelope map.
  • the point feature refers to the corner point that satisfies certain characteristics.
  • for example, the gradients at the corner point take larger values in multiple directions, or a relatively long run of continuous pixels on a ring centered on the corner point differs from the corner point in value.
  • Line features are one or more groups of straight lines that satisfy collinear, parallel, or perpendicular relationships.
  • the parking space information includes the position of the midpoint of the parking space entrance line, the direction of the parking space, the length of the parking space, and so on.
  • if a feature in the first sensor information matches a feature in the second sensor information, then the two features correspond to the same object in the world coordinate system. The parking device matches the features in the first sensor information with the features in the second sensor information.
  • the matching feature in the first sensor information (called the first feature) can be determined, and the matching feature in the second sensor information can also be determined (called the second feature).
  • the first sensor information includes n first features
  • the second sensor information includes n second features
  • the i-th first feature and the i-th second feature match each other
  • n is a positive integer
  • i is a positive integer not greater than n.
  • n error functions with the cumulative error as a variable can be constructed; after that, the cumulative error is determined by calculating the optimal solution of the n error functions.
  • the error function can also be called an optimization function or a constraint function.
  • taking the first sensor information and the second sensor information as the first image and the second image detected by the camera on the vehicle as an example, a possible specific implementation of step 5053 is introduced below.
  • Figure 5c is a schematic diagram of the principle of constructing an error function.
  • point O1 represents the position of the camera at the first moment
  • point O2 represents the position of the camera at the second moment
  • rectangle I1 represents the first image
  • rectangle I2 represents the second image
  • the point P1 in the rectangle I1 represents the first first feature
  • the P2 point in the rectangle I2 represents the first second feature
  • the P1 and P2 points correspond to the same point P in the world coordinate system.
  • the coordinates of point P1 in the camera coordinate system are (X1, Y1, Z1)^T
  • the coordinates of point P2 in the camera coordinate system (that is, the pixel position in the second image) are (X2, Y2, Z2)^T
  • the internal parameter matrix of the camera is K.
  • n error functions can be constructed according to the n first features and the n second features, and the cumulative error in R and t can be calculated according to the n error functions.
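A drastically simplified instance of this construction, in the plane and with the error restricted to translation (the patented scheme works in three dimensions with the camera intrinsic matrix K and can also estimate rotation error), minimizes the n residuals e_i = p2_i − (R·p1_i + t) in the least-squares sense; for a pure translation error the minimizer is the mean residual. Names and conventions are assumptions.

```python
def estimate_translation_error(first_feats, second_feats, R, t):
    """Build the n residuals e_i = p2_i - (R*p1_i + t) from matched
    planar features and the dead-reckoned pose change (R, t), and return
    the least-squares translation error, i.e. the mean residual.
    Adding it to t corrects the accumulated drift."""
    n = len(first_feats)
    ex = ey = 0.0
    for p1, p2 in zip(first_feats, second_feats):
        pred = [R[0][0] * p1[0] + R[0][1] * p1[1] + t[0],
                R[1][0] * p1[0] + R[1][1] * p1[1] + t[1]]
        ex += p2[0] - pred[0]
        ey += p2[1] - pred[1]
    return [ex / n, ey / n]
```

For instance, if dead reckoning reports a translation of (1, 0) with no rotation but every matched feature lands 0.2 further along x than predicted, the estimated translation error is (0.2, 0).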
  • the first moment is a moment when at least one of the following signals is acquired: a braking signal of the vehicle, a start signal of the vehicle, an identification signal of an empty parking space, and a selection signal of an empty parking space.
  • the vehicle recognizes an idle parking space through the parking link 1) at the first moment, and the vehicle acquires the first sensor information at this time;
  • after the vehicle identifies a free parking space through the parking link 1), it can prompt the information of the free parking space (such as the relative position of the vehicle and the free parking space); when the selection signal of the free parking space is acquired at the first moment, the vehicle can obtain the first sensor information at this time;
  • the vehicle detects free parking spaces through the parking link 1) during the driving process; after the vehicle recognizes the free parking space and outputs the information of the free parking space to the user, the braking signal of the vehicle is acquired at the first moment, and at this time the vehicle can obtain the first sensor information;
  • the vehicle detects the free parking space through the parking link 1) in the stopped state; after the free parking space is recognized and the information of the free parking space is output to the user, the start signal of the vehicle is acquired at the first moment, and the vehicle can obtain the first sensor information at this time.
  • the position of the vehicle at the first moment is exemplarily described with reference to FIG. 6.
  • the parking process shown in FIG. 6 is the same as the parking process of FIG. 2a. The difference is only that, in order to distinguish the away path and the entering path, in FIG. 6, the away path in FIG. 2a is translated for a certain distance along the y direction. In FIG. 6, the position 1 on the far away path represents the position of the vehicle 2 at the first moment.
  • the parking device may also obtain sensor information at one or more other moments for matching with the second sensor information, which is beneficial to providing more matching objects for the second sensor information and improving the matching success rate and accuracy.
  • the position 2 on the far path is used to represent the position of the vehicle 2 at one of the one or more times.
  • the gear of the vehicle at the first time is the first gear
  • the gear of the vehicle at the second time is the second gear.
  • the first gear is a reverse gear
  • the second gear is a forward gear
  • the first gear is a forward gear
  • the second gear is a reverse gear
  • the gear of the vehicle in the first position is switched from the first gear to the second gear.
  • the vehicle travels from the first time to the first position in reverse gear, and then travels from the first position to the second time in forward gear.
  • the vehicle travels from the first time to the first position in the forward gear, and then travels from the first position to the second time in the reverse gear.
  • the parking device may execute step 503 to step 506 at multiple times, for example, execute step 503 to step 506 at regular intervals.
  • if the position of the vehicle at the second moment is far from the position of the vehicle at the first moment, the sensor information obtained by the parking device at these two moments may not match, which will waste the computing resources and storage resources of the parking device.
  • the travel distance of the vehicle from the first time to the first position is referred to as the first travel distance
  • the travel distance of the vehicle from the first position to the second time is referred to as the second travel distance.
  • the difference between the first travel distance and the second travel distance does not exceed the threshold.
  • the parking device records the first travel distance of the vehicle from the first time to the first position, and then, when the vehicle travels from the first position to the free parking space, the parking device detects the travel distance of the vehicle, When the difference between the travel distance and the first travel distance is less than the threshold, the parking device may select the second time and execute step 503 to step 506.
  • selecting the second moment such that the distance between the position of the vehicle at the second moment and the position of the vehicle at the first moment is less than the threshold is beneficial to improving the matching success rate of the first sensor information and the second sensor information, and to saving the computing resources and storage resources of the parking device.
  • step 503 to step 506 can be executed.
  • the parking device may determine the threshold according to the sensing range of at least one of the following sensors: ultrasonic radar, microwave radar, laser rangefinder, infrared detector, and image detector.
  • the sensor used to determine the threshold is a sensor provided on the vehicle, and may be a sensor used to detect the first sensor information and the second sensor information.
  • the larger the sensing range of the sensor, the larger the threshold corresponding to the sensor; conversely, the smaller the sensing range of the sensor, the smaller the threshold corresponding to the sensor.
  • the parking device obtains the first sensor information and the second sensor information according to the data collected by the fisheye camera, then the threshold value is determined according to the perception range of the fisheye camera.
  • the parking device obtains the first sensor information and the second sensor information according to the data collected by the ultrasonic radar, then the threshold value is determined according to the sensing range of the ultrasonic radar.
  • the perception range of the fisheye camera is larger than the perception range of the ultrasonic radar. Therefore, the threshold value determined according to the perception range of the fisheye camera is greater than the threshold value determined according to the perception range of the ultrasonic radar.
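The threshold selection and second-moment triggering described above can be sketched as follows. The sensing-range values, the scale factor, and the rule of taking the smallest range among the sensors used are all assumptions for illustration; the text only states that a larger sensing range yields a larger threshold.

```python
# Assumed illustrative sensing ranges in metres; real values depend on
# the specific sensors fitted to the vehicle.
SENSING_RANGE = {"fisheye_camera": 10.0, "ultrasonic_radar": 3.0}

def matching_threshold(sensors, scale=0.5):
    """Derive the threshold from the smallest sensing range among the
    sensors used, so every sensor can still observe overlapping scenery
    at both moments. The 0.5 scale factor is an assumption."""
    return scale * min(SENSING_RANGE[s] for s in sensors)

def should_capture_second_info(first_travel, current_travel, sensors):
    """On the entering path, trigger acquisition of the second sensor
    information once the travel distance from the first position is
    within the threshold of the first travel distance."""
    return abs(current_travel - first_travel) <= matching_threshold(sensors)
```

Under these assumed ranges, a fisheye-only configuration triggers earlier (threshold 5.0 m) than an ultrasonic-only configuration (threshold 1.5 m), mirroring positions 3 and 4 in FIG. 6.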
  • for the case where the first sensor information and the second sensor information are obtained according to the data collected by the fisheye camera, the parking device can obtain the second sensor information at the position 3; for the case where they are obtained according to the data collected by the ultrasonic radar, the position 3 does not yet meet the threshold condition, and when the vehicle drives to the position 4 on the entering path, the parking device can obtain the second sensor information.
  • the parking device may obtain first sensor information based on data collected by multiple sensors, and obtain second sensor information based on data collected by at least one of the multiple sensors.
  • the parking device can obtain a top view and an envelope map at position 1. After that, when the vehicle travels from the first position to the position 3, since the position 3 meets the threshold condition corresponding to the fisheye camera, the parking device can obtain the top view of the position 3; taking the top view of the position 1 as the first sensor information and the top view of the position 3 as the second sensor information, step 505 and step 506 are executed.
  • the parking device can obtain the top view and envelope diagram of position 3; taking the top view and envelope diagram of position 1 as the first sensor information and the top view and envelope diagram of position 3 as the second sensor information, step 505 and step 506 are executed.
  • the parking device matches the sensor information of the same category, and then obtains the cumulative error according to the matching results of the sensor information of the two categories, which is beneficial to improve the accuracy of the cumulative error.
  • the size of the sequence numbers of the above-mentioned processes does not mean the order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
  • an embodiment of the present application provides a parking device for implementing the parking method provided in any of the foregoing embodiments.
  • the parking device may be a vehicle or a device installed in the vehicle, such as the vehicle 100 or the computer system 112 of the vehicle 100 in FIG. 1.
  • FIG. 7 is a schematic diagram of a possible structure of a parking device provided by an embodiment of the present application.
  • the parking device 700 may include an acquiring module 701, configured to: acquire the first relative position of the vehicle and the free parking space; acquire the first sensor information of the vehicle at the first moment; acquire the second sensor information of the vehicle at the second moment, where the direction of movement of the vehicle from the first moment to the first position is opposite to the direction of movement of the vehicle from the first position to the second moment; acquire the relative pose information of the vehicle from the second moment to the first moment; and match the second sensor information with the first sensor information to obtain the cumulative error in the relative pose information.
  • the parking device 700 may further include a determining module 702, configured to determine a second relative position of the vehicle and the free parking space according to the accumulated error and the first relative position.
  • the obtaining module 701 is coupled with the determining module 702.
  • for the specific execution process, please refer to the detailed description of the corresponding steps in the above method embodiment, which will not be repeated here.
  • the coupling in the embodiments of the present application is an indirect coupling or communication connection between devices, units or modules, and may be in electrical, mechanical or other forms, and is used for information exchange between devices, units or modules.
  • the division of modules in the embodiments of this application is illustrative, and it is only a logical function division. In actual implementation, there may be other division methods.
  • the functional modules in the various embodiments of this application can be integrated into one processing module, or each module can exist alone physically, or two or more modules can be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or software functional modules.
  • FIG. 8 is a schematic diagram of another possible structure of the parking device provided by the embodiment of the present application.
  • the parking device 800 may be a chip system.
  • the chip system may be composed of chips, or may include chips and other discrete devices.
  • the parking device 800 includes at least one processor 801 for implementing the method provided in the embodiment of the present application.
  • the parking device 800 may further include at least one memory 802 for storing program instructions and/or data.
  • the memory 802 is coupled with the processor 801.
  • the processor 801 may cooperate with the memory 802 to operate.
  • the processor 801 may execute program instructions stored in the memory 802.
  • One or more of the at least one memory 802 may be included in the processor 801.
  • the specific connection medium between the foregoing processor 801 and the memory 802 is not limited in the embodiment of the present application.
  • the memory 802 and the processor 801 are connected through a bus 803 as an example.
  • the bus is represented by a thick line in FIG. 8, which does not constitute a limitation.
  • the bus can be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, only one thick line is used in FIG. 8, but it does not mean that there is only one bus or one type of bus.
  • the processor may be a general-purpose processor, a digital signal processor, an application specific integrated circuit, a field programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, which may implement or perform the methods, steps, and logic block diagrams disclosed in the embodiments of the present application.
  • the general-purpose processor may be a microprocessor or any conventional processor or the like.
  • the steps of the method disclosed in combination with the embodiments of the present application may be directly embodied as being executed and completed by a hardware processor, or executed and completed by a combination of hardware and software modules in the processor.
  • the memory may be a non-volatile memory, such as a hard disk drive (HDD) or a solid-state drive (SSD), or a volatile memory, such as a random-access memory (RAM).
  • the memory may also be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto.
  • the memory in the embodiments of the present application may also be a circuit or any other device capable of realizing a storage function for storing program instructions and/or data.
  • the technical solutions provided in the embodiments of the present application may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • software it can be implemented in the form of a computer program product in whole or in part.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, a network device, a terminal device, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center via wired means (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (such as infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or a data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, a digital video disc (DVD)), or a semiconductor medium.
  • where there is no conflict, the embodiments can reference each other: the methods and/or terms between the method embodiments may refer to each other, the functions and/or terms between the device embodiments may refer to each other, and the functions and/or terms between the device embodiments and the method embodiments may refer to each other.
  • At least one may also be described as one or more, and the multiple may be two, three, four or more, which is not limited in the present application.
  • “/" can indicate that the associated objects are in an "or” relationship.
  • A/B can indicate A or B; and "and/or” can be used to describe that there are three types of associated objects.
  • the relationship, for example, A and/or B can mean that: A alone exists, A and B exist at the same time, and B exists alone, where A and B can be singular or plural.
  • words such as "first" and "second" may be used to distinguish technical features with the same or similar functions; they do not limit the quantity or order of execution, nor do they limit the distinguished features to necessarily being different.
  • words such as "exemplary" or "for example" are used to indicate examples, illustrations, or explanations, and any embodiment or design solution described as "exemplary" or "for example" shall not be interpreted as being more preferable or more advantageous than other embodiments or design solutions.
  • the use of words such as “exemplary” or “for example” is intended to present related concepts in a specific manner to facilitate understanding.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

A parking method and apparatus, and a vehicle, applied to the fields of autonomous driving, intelligent vehicle technology, and the like. The method comprises: acquiring a first relative position between a vehicle and a free parking space; acquiring first sensor information of the vehicle at a first moment; acquiring second sensor information of the vehicle at a second moment, a driving direction of the vehicle from the first moment to a first position being opposite to a driving direction of the vehicle from the first position to the second moment; acquiring relative pose information of the vehicle from the second moment to the first moment; matching the second sensor information with the first sensor information to obtain an accumulated error in the relative pose information; and determining, according to the accumulated error and the first relative position, a second relative position between the vehicle and the free parking space. The accuracy of the relative position between a vehicle and a free parking space is thereby improved.
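The correction step described in the abstract can be illustrated with a small sketch. This is not the patented implementation: the (x, y, yaw) pose convention, the function names, and the assumption that the relative poses express the moment-1 vehicle frame in the moment-2 vehicle frame are all illustrative choices.

```python
import math


def compose(pose_a, pose_b):
    """Compose two 2D poses (x, y, yaw): express pose_b, given in the
    frame of pose_a, in the outer frame."""
    xa, ya, ta = pose_a
    xb, yb, tb = pose_b
    return (xa + xb * math.cos(ta) - yb * math.sin(ta),
            ya + xb * math.sin(ta) + yb * math.cos(ta),
            ta + tb)


def accumulated_error(odom_rel_pose, matched_rel_pose):
    """Drift: the odometry-based relative pose minus the pose recovered
    by matching the second sensor information against the first."""
    return tuple(o - m for o, m in zip(odom_rel_pose, matched_rel_pose))


def second_relative_position(first_rel_pos, odom_rel_pose, matched_rel_pose):
    """Second relative position of the free slot, determined from the
    accumulated error and the first relative position.

    All poses are (x, y, yaw); the two relative poses express the
    moment-1 vehicle frame in the moment-2 vehicle frame (an assumed
    convention for this sketch)."""
    err = accumulated_error(odom_rel_pose, matched_rel_pose)
    # Remove the accumulated error from the drifting odometry estimate ...
    corrected = tuple(o - e for o, e in zip(odom_rel_pose, err))
    # ... and re-express the slot (known at moment 1) at moment 2.
    return compose(corrected, first_rel_pos)
```

For example, if the slot was 5 m ahead at the first moment and the vehicle frame has advanced 2 m by the second moment (with odometry drifting to 2.1 m), the corrected slot position at the second moment comes out 3 m ahead.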
PCT/CN2021/077351 2020-04-29 2021-02-23 Parking method and apparatus, and vehicle WO2021218310A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010355549.8 2020-04-29
CN202010355549.8A CN113561963B (zh) 2020-04-29 2020-04-29 Parking method, apparatus, and vehicle

Publications (1)

Publication Number Publication Date
WO2021218310A1 (fr)

Family

ID=78158503

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/077351 WO2021218310A1 (fr) 2020-04-29 2021-02-23 Procédé et appareil de stationnement, et véhicule

Country Status (2)

Country Link
CN (1) CN113561963B (fr)
WO (1) WO2021218310A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113928309A (zh) * 2021-11-24 2022-01-14 纵目科技(上海)股份有限公司 Automatic parking method, system, device, and computer-readable storage medium
CN114228701A (zh) * 2021-11-30 2022-03-25 岚图汽车科技有限公司 Parking control method and apparatus based on sensor data fusion
CN114494428A (zh) * 2021-12-23 2022-05-13 禾多科技(北京)有限公司 Vehicle pose correction method and apparatus, electronic device, and computer-readable medium
CN115140022A (zh) * 2022-06-24 2022-10-04 重庆金康赛力斯新能源汽车设计院有限公司 Automatic parking debugging method and apparatus, computer device, and storage medium
CN115629386A (zh) * 2022-12-21 2023-01-20 广州森弘信息科技有限公司 High-precision positioning system and method for automatic parking
CN116189137A (zh) * 2022-12-07 2023-05-30 深圳市速腾聚创科技有限公司 Parking space detection method, electronic device, and computer-readable storage medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115223132B (zh) * 2021-11-10 2023-10-27 广州汽车集团股份有限公司 Empty parking space recognition method and system, and computer-readable storage medium
CN116740982B (zh) * 2023-08-15 2023-12-01 禾多科技(北京)有限公司 Method and apparatus for determining a target parking space, storage medium, and electronic apparatus

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101573257A (zh) * 2006-12-28 2009-11-04 株式会社丰田自动织机 Parking assist device, parking assist device component, parking assist method, parking assist program, driving parameter calculation method and calculation program, driving parameter calculation device, and driving parameter calculation device component
CN102407848A (zh) * 2010-09-21 2012-04-11 高强 Controller system with automatic parking and intelligent driving functions
CN102874252A (zh) * 2012-08-30 2013-01-16 江苏大学 Assisted parking trajectory planning and correction method and system
JP2015013596A (ja) * 2013-07-05 2015-01-22 トヨタ自動車株式会社 Parking assist device and parking assist method
CN106458212A (zh) * 2014-06-30 2017-02-22 日立汽车***有限公司 Parking trajectory calculation device and parking trajectory calculation method
CN109733384A (zh) * 2018-12-25 2019-05-10 科大讯飞股份有限公司 Parking path setting method and system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102227855B1 (ko) * 2015-01-22 2021-03-15 현대모비스 주식회사 Parking guide system and control method thereof
US11393340B2 (en) * 2016-12-30 2022-07-19 Hyundai Motor Company Automatic parking system and automatic parking method
CN109949609B (zh) * 2019-04-30 2020-11-13 广州小鹏汽车科技有限公司 Vehicle positioning correction method and system, and vehicle
CN110333510A (zh) * 2019-06-29 2019-10-15 惠州市德赛西威汽车电子股份有限公司 Single-radar parking positioning method and system
CN110422167A (zh) * 2019-07-26 2019-11-08 浙江吉利汽车研究院有限公司 Drive control system for automatic parking of a hybrid vehicle


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113928309A (zh) * 2021-11-24 2022-01-14 纵目科技(上海)股份有限公司 Automatic parking method, system, device, and computer-readable storage medium
CN113928309B (zh) * 2021-11-24 2024-04-30 纵目科技(上海)股份有限公司 Automatic parking method, system, device, and computer-readable storage medium
CN114228701A (zh) * 2021-11-30 2022-03-25 岚图汽车科技有限公司 Parking control method and apparatus based on sensor data fusion
CN114228701B (zh) * 2021-11-30 2023-10-20 岚图汽车科技有限公司 Parking control method and apparatus based on sensor data fusion
CN114494428A (zh) * 2021-12-23 2022-05-13 禾多科技(北京)有限公司 Vehicle pose correction method and apparatus, electronic device, and computer-readable medium
CN114494428B (zh) * 2021-12-23 2022-11-11 禾多科技(北京)有限公司 Vehicle pose correction method and apparatus, electronic device, and computer-readable medium
CN115140022A (zh) * 2022-06-24 2022-10-04 重庆金康赛力斯新能源汽车设计院有限公司 Automatic parking debugging method and apparatus, computer device, and storage medium
CN115140022B (zh) * 2022-06-24 2024-05-24 重庆赛力斯新能源汽车设计院有限公司 Automatic parking debugging method and apparatus, computer device, and storage medium
CN116189137A (zh) * 2022-12-07 2023-05-30 深圳市速腾聚创科技有限公司 Parking space detection method, electronic device, and computer-readable storage medium
CN116189137B (zh) * 2022-12-07 2023-08-04 深圳市速腾聚创科技有限公司 Parking space detection method, electronic device, and computer-readable storage medium
CN115629386A (zh) * 2022-12-21 2023-01-20 广州森弘信息科技有限公司 High-precision positioning system and method for automatic parking

Also Published As

Publication number Publication date
CN113561963A (zh) 2021-10-29
CN113561963B (zh) 2023-05-05

Similar Documents

Publication Publication Date Title
WO2021218310A1 (fr) Parking method and apparatus, and vehicle
CN110969655B (zh) 用于检测车位的方法、装置、设备、存储介质以及车辆
US11915492B2 (en) Traffic light recognition method and apparatus
CN110796063B (zh) 用于检测车位的方法、装置、设备、存储介质以及车辆
US11024055B2 (en) Vehicle, vehicle positioning system, and vehicle positioning method
KR20210050925A (ko) 차량 충돌 회피 장치 및 방법
CN111507157A (zh) 基于强化学习而在自动驾驶时优化资源分配的方法及装置
US11325590B2 (en) Autonomous driving device and driving method thereof
US20170359561A1 (en) Disparity mapping for an autonomous vehicle
US11679783B2 (en) Method of and system for generating trajectory for self-driving car (SDC)
US11403947B2 (en) Systems and methods for identifying available parking spaces using connected vehicles
US11673581B2 (en) Puddle occupancy grid for autonomous vehicles
US20230046289A1 (en) Automatic labeling of objects in sensor data
CN113442908B (zh) 自动泊车路径规划方法及***、泊车控制设备
CN114973050A (zh) 自动驾驶应用中深度神经网络感知的地面实况数据生成
KR20220035946A (ko) 시각적 추적 및 이미지 재투영에 의한 자율 주행을 위한 객체 로컬라이제이션
US11760384B2 (en) Methods and systems for determining trajectory estimation order for vehicles
US11810371B1 (en) Model-based localization on high-definition maps
US20220388531A1 (en) Method and device for operating a self-driving car
CN113469045A (zh) 无人集卡的视觉定位方法、***、电子设备和存储介质
CN114511834A (zh) 一种确定提示信息的方法、装置、电子设备及存储介质
US11082622B2 (en) Systems and methods for dynamically switching image signal processor configurations
US20230237793A1 (en) False track mitigation in object detection systems
US20240208488A1 (en) Information processing device, control method, and recording medium
EP4397561A1 (fr) Vehicle control method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21796572

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21796572

Country of ref document: EP

Kind code of ref document: A1