CN113125795B - Obstacle speed detection method, device, equipment and storage medium - Google Patents

Obstacle speed detection method, device, equipment and storage medium

Info

Publication number
CN113125795B
Authority
CN
China
Prior art keywords
obstacle
coordinate system
information
obstacle position
target
Prior art date
Legal status: Active (assumed; not a legal conclusion)
Application number
CN202110426631.XA
Other languages
Chinese (zh)
Other versions
CN113125795A (en)
Inventor
孔炤
尹周建铖
韩旭
Current Assignee (the listed assignees may be inaccurate)
Guangzhou Weride Technology Co Ltd
Original Assignee
Guangzhou Weride Technology Co Ltd
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Guangzhou Weride Technology Co Ltd
Priority to CN202110426631.XA
Publication of CN113125795A
Application granted
Publication of CN113125795B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01P: MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P 3/00: Measuring linear or angular speed; measuring differences of linear or angular speeds
    • G01P 3/64: Devices characterised by the determination of the time taken to traverse a fixed distance

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses an obstacle speed detection method, device, equipment, and storage medium. The method comprises: judging, in a first coordinate system, whether at least one of a first obstacle position and a second obstacle position of a target obstacle is located on a curve; and, when at least one of the first obstacle position and the second obstacle position is judged to be located on the curve, calculating speed information of the target obstacle in a second coordinate system according to mapping information of the first obstacle position and the second obstacle position in the second coordinate system. This solves the technical problem that conventional speed detection methods yield low-accuracy results for obstacles located on curves.

Description

Obstacle speed detection method, device, equipment and storage medium
Technical Field
The present application relates to the field of unmanned driving, and in particular, to a method, apparatus, device, and storage medium for detecting speed of an obstacle.
Background
An unmanned vehicle cannot run safely without accurately perceiving its driving environment. Because obstacles appear in the driving environment, sensing obstacle speed is critical to the driving safety of the unmanned vehicle.
When facing different road types, conventional obstacle speed detection methods produce errors and have low accuracy.
Disclosure of Invention
The application provides an obstacle speed detection method, device, equipment, and storage medium, which solve the technical problem that conventional speed detection methods yield low-accuracy results for obstacles located on curves.
In view of the above, a first aspect of the present application provides an obstacle speed detection method, including:
Judging, in a first coordinate system, whether at least one of a first obstacle position and a second obstacle position of a target obstacle is located on a curve;
When at least one of the first obstacle position and the second obstacle position is judged to be located on the curve, calculating speed information of the target obstacle in a second coordinate system according to mapping information of the first obstacle position and the second obstacle position in the second coordinate system.
Optionally, the method further comprises:
When it is judged that neither the first obstacle position nor the second obstacle position is located on the curve, calculating the speed information of the target obstacle in the second coordinate system according to the speed information of the target obstacle in the first coordinate system.
Optionally, the calculating speed information of the target obstacle in the second coordinate system according to the mapping information of the first obstacle position and the second obstacle position in the second coordinate system specifically includes:
Mapping first pose information of the first obstacle position under the first coordinate system to the second coordinate system to obtain first mapping information;
Mapping second pose information of the second obstacle position under the first coordinate system to the second coordinate system to obtain second mapping information;
And calculating the speed information of the target obstacle under a second coordinate system according to the mapping information difference value between the first mapping information and the second mapping information.
Optionally, the mapping the first pose information of the first obstacle position in the first coordinate system to the second coordinate system to obtain first mapping information specifically includes:
Decomposing the first pose information in a second coordinate system corresponding to the target obstacle to obtain decomposition information;
And summing a preset distance and the decomposition information to obtain the first mapping information, wherein the preset distance is the distance between a reference point and the vehicle.
Optionally, the configuration process of the second coordinate system corresponding to the target obstacle specifically includes:
Sampling a lane corresponding to the target obstacle along a lane center line under the first coordinate system to obtain a sampling point sequence;
And taking a sampling point closest to the first obstacle position in the sampling point sequence as a reference point, and taking a second coordinate system corresponding to the reference point as a second coordinate system corresponding to the target obstacle.
Optionally, the determining whether at least one of the first obstacle position and the second obstacle position corresponding to the target obstacle is located in the curve specifically includes:
Acquiring an obstacle driving route corresponding to the first obstacle position and the second obstacle position;
Acquiring a route curvature of the obstacle driving route;
Judging whether the route curvature is larger than a preset curvature threshold; if so, judging that at least one of the first obstacle position and the second obstacle position is located on a curve; if not, judging that neither the first obstacle position nor the second obstacle position is located on a curve.
A second aspect of the present application provides an obstacle speed detection device comprising:
A judging unit configured to judge whether at least one of a first obstacle position and a second obstacle position of the target obstacle is located in a curve in the first coordinate system;
a first calculation unit configured to calculate, when it is determined that at least one of the first obstacle position and the second obstacle position is located in the curve, speed information of the target obstacle in a second coordinate system according to mapping information of the first obstacle position and the second obstacle position in the second coordinate system.
Optionally, the method further comprises:
And the second calculation unit is used for calculating the speed information of the target obstacle in the second coordinate system according to the speed information of the target obstacle in the first coordinate system when judging that the first obstacle position and the second obstacle position are not positioned in the curve.
A third aspect of the present application provides an obstacle speed detection device, the device comprising a processor and a memory;
The memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to execute any one of the obstacle speed detecting methods described in the first aspect according to instructions in the program code.
A fourth aspect of the present application provides a storage medium storing program code for executing the obstacle speed detecting method according to any one of the first aspects described above.
From the above technical solutions, the application has the following advantages:
According to the application, whether the target obstacle is located on a curve is first judged in the first coordinate system, i.e., whether at least one of the first obstacle position and the second obstacle position corresponding to the target obstacle is located on the curve. If so, the target obstacle is located on the curve, and the speed information of the target obstacle in the second coordinate system is then calculated according to the mapping information of the first obstacle position and the second obstacle position in the second coordinate system.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required for the description of the embodiments are briefly described below. It is apparent that the drawings in the following description show only some embodiments of the present application; other drawings may be obtained from them by those of ordinary skill in the art without inventive effort.
Fig. 1 is a schematic flow chart of an embodiment of a method for detecting speed of an obstacle according to an embodiment of the application;
FIG. 2 is a schematic diagram of the construction of SL coordinate system;
FIG. 3 is a diagram illustrating the calculation of step 103 in the first embodiment of the present application;
Fig. 4 is a schematic flow chart of a second embodiment of a method for detecting speed of an obstacle according to the embodiment of the application;
FIG. 5 is a diagram illustrating the calculation of the first mapping information according to the present application;
Fig. 6 is a schematic structural diagram of an embodiment of an obstacle speed detecting device according to an embodiment of the application.
Detailed Description
The embodiment of the application provides a method, a device, equipment and a storage medium for detecting obstacle speed, which solve the technical problem that the accuracy of the obstacle detection result in a curve is lower in the traditional speed detection method.
In order that the present application may be better understood by those skilled in the art, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without inventive effort fall within the scope of the application.
For easy understanding, referring to fig. 1, fig. 1 is a flowchart illustrating an embodiment of a method for detecting an obstacle speed according to an embodiment of the application.
An obstacle speed detection method in this embodiment includes:
step 101, in the first coordinate system, it is determined whether at least one of the first obstacle position and the second obstacle position of the target obstacle is located in the curve.
In this embodiment, the first obstacle position is a target obstacle position at a first time, and the second obstacle position is a target obstacle position at a second time. It is to be understood that the first time may be a time before the second time or a time after the second time, which is not particularly limited in this embodiment.
The coordinate system used by the automatic driving system of an unmanned vehicle differs from a conventional coordinate system (e.g., a Cartesian coordinate system) used to represent spatial information, so the speed in the conventional coordinate system needs to be mapped into the automatic-driving coordinate system.
It should be noted that, the first coordinate system is a coordinate system describing the spatial information of the object, and the coordinate system used by the autopilot system is a second coordinate system, specifically, the first coordinate system in this embodiment is a cartesian coordinate system. When the vehicle is running, the speed information acquired by the vehicle-mounted sensor is the speed information under the first coordinate system, but for the automatic driving system, the speed information of the target obstacle under the second coordinate system is required to be known, so that obstacle avoidance planning of the automatic driving system can be performed.
In order to accurately detect the speed information of the target obstacle in the second coordinate system, this embodiment performs different calculations according to the lane in which the target obstacle is located. Driving lanes are generally classified into curves and straight roads, so this embodiment likewise classifies the lane where the target obstacle is located as a curve or a straight road. Therefore, after the target obstacle is acquired, it is first judged whether the target obstacle is located on a curve. It will be appreciated that in this embodiment the target obstacle is considered to be located on a curve when at least one of the first obstacle position and the second obstacle position is located on the curve. There are various ways of judging whether at least one of the two positions is located on the curve, which are not enumerated one by one here.
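One such judgment is the curvature-threshold test listed in the summary above. The following is an illustrative sketch only, not the patent's implementation: the Menger-curvature formula, the function names, and the default threshold value are all assumptions.

```python
import math

def menger_curvature(p1, p2, p3):
    """Curvature (1/radius) of the circle through three (x, y) route points."""
    a = math.dist(p1, p2)
    b = math.dist(p2, p3)
    c = math.dist(p1, p3)
    if a * b * c == 0.0:
        return 0.0  # coincident points: treat as straight
    # |cross product| of the two edge vectors is twice the triangle's area
    area2 = abs((p2[0] - p1[0]) * (p3[1] - p1[1])
                - (p2[1] - p1[1]) * (p3[0] - p1[0]))
    return 2.0 * area2 / (a * b * c)

def is_on_curve(route_points, threshold=0.01):
    """True when any local curvature along the obstacle's driving route
    exceeds the preset curvature threshold (units: 1/m)."""
    return any(menger_curvature(route_points[i], route_points[i + 1],
                                route_points[i + 2]) > threshold
               for i in range(len(route_points) - 2))
```

Collinear route points yield zero curvature everywhere, so only routes that bend faster than the threshold are classified as curves.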
In a driving scene of a vehicle, there may be a plurality of obstacles, wherein not all of the obstacles need to be subjected to speed detection or the like. In this embodiment, therefore, after obtaining a plurality of obstacles in the current driving scene of the vehicle, the target obstacle is determined according to the obstacle information corresponding to each obstacle.
It will be appreciated that, in one embodiment, the target obstacle is an obstacle in the vicinity of the vehicle; the target obstacle is therefore determined according to the distance between each obstacle in the vehicle's current driving scene and the vehicle. Correspondingly, the step of determining the target obstacle specifically includes:
step S11, obtaining the corresponding interval distance of each of a plurality of obstacles in the current running scene of the vehicle.
It is to be understood that the above-mentioned distance is specifically the distance between each obstacle and the vehicle, and may be calculated by a vehicle-mounted camera or vehicle-mounted radar, which is not limited or described in detail here. An obstacle may be a person, another traveling vehicle, a fixed building, or the like, and is not particularly limited in this embodiment.
Step S12, judging whether a target obstacle exists in the plurality of obstacles according to the interval distance.
At this time, the target obstacle needs to satisfy the following preset conditions: the interval distance is smaller than a preset threshold, and the corresponding step S12 specifically includes:
Step S121, when it is determined that all the separation distances are greater than the preset threshold, it is determined that the target obstacle is not present among the plurality of obstacles.
That all the separation distances are larger than the preset threshold indicates that none of the plurality of obstacles is near the vehicle, i.e., no target obstacle exists among the plurality of obstacles and no target obstacle is detected.
Step S122, when at least one interval distance is smaller than a preset threshold value, determining that a target obstacle exists in the plurality of obstacles, and taking the obstacle with the interval distance smaller than the preset threshold value as the target obstacle.
When at least one of the separation distances is smaller than the preset threshold, at least one of the plurality of obstacles is located near the vehicle, i.e., a target obstacle exists among the plurality of obstacles and is detected; each obstacle whose separation distance is smaller than the preset threshold is taken as a target obstacle.
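The filtering in steps S121 and S122 amounts to a distance threshold over the detected obstacles. A minimal sketch, with a hypothetical data layout and function name:

```python
def select_target_obstacles(obstacle_distances, threshold):
    """Return the ids of obstacles whose separation distance from the ego
    vehicle is below the preset threshold. An empty result corresponds to
    step S121 (no target obstacle detected); a non-empty one to step S122.

    obstacle_distances: dict mapping obstacle id -> separation distance (m).
    """
    return [oid for oid, dist in obstacle_distances.items() if dist < threshold]
```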
The number of the target obstacle may be one or plural, and the speed detection method in the present embodiment is performed for each target obstacle when the number of the target obstacles is plural.
It will be appreciated that the above description of the target obstacle is merely illustrative, and that one skilled in the art may arrange for other ways in accordance with the above description and will not be described in detail herein.
And 102, when at least one of the first obstacle position and the second obstacle position is judged to be positioned in the curve, calculating the speed information of the target obstacle in the second coordinate system according to the mapping information of the first obstacle position and the second obstacle position in the second coordinate system.
At least one of the first obstacle position and the second obstacle position being located on the curve includes three cases: only the first obstacle position is on the curve, only the second obstacle position is on the curve, or both positions are on the curve.
It can be understood that the second coordinate system in this embodiment is specifically an SL coordinate system. As shown in fig. 2, the forward direction of the lane is taken as the S axis and the left-right direction of the lane as the L axis; the curved lane is thereby converted into an orthogonal coordinate system with the forward direction pointing upward, which also greatly reduces the description parameters needed for decision planning in the automatic driving system.
When it is determined that at least one of the first obstacle position and the second obstacle position is located in the curve, it is indicated that the target obstacle is located in the curve, and at this time, the conventional linear mapping method for directly detecting the speed is no longer used, but the speed information of the target obstacle in the second coordinate system is calculated through the mapping information of the first obstacle position and the second obstacle position in the second coordinate system.
And 103, calculating the speed information of the target obstacle in the second coordinate system according to the speed information of the target obstacle in the first coordinate system when judging that the first obstacle position and the second obstacle position are not positioned in the curve.
When neither the first obstacle position nor the second obstacle position is located on the curve, the target obstacle is not on a curve, and speed detection in the second coordinate system uses the linear mapping method: the speed information in the second coordinate system can be detected directly from the speed information of the target obstacle in the first coordinate system at the first moment.
It can be understood that in the process of detecting the speed information of the target obstacle in the second coordinate system by using the linear mapping method of speed detection, the speed information in the first coordinate system can be decomposed according to each direction of the second coordinate system, so as to obtain the speed information of the target obstacle in the second coordinate system.
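This linear decomposition is a plain axis projection. As a hedged sketch, assuming the lane's forward (S) direction is available as a heading angle in the first (Cartesian) coordinate system (the patent does not name this parameter):

```python
import math

def decompose_velocity(vx, vy, lane_heading):
    """Project a first-coordinate-system (Cartesian) velocity onto the
    S (along-lane) and L (cross-lane) axes of the second coordinate system.

    lane_heading: heading of the lane's forward direction in radians,
    measured in the same Cartesian frame as (vx, vy).
    """
    v_s = vx * math.cos(lane_heading) + vy * math.sin(lane_heading)
    v_l = -vx * math.sin(lane_heading) + vy * math.cos(lane_heading)
    return v_s, v_l
```

When the lane heading is zero the S axis coincides with the Cartesian x axis, so the components pass through essentially unchanged.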
Since the speeds of the target obstacle in both coordinate systems correspond to the first moment, the second coordinate system is established according to the first obstacle position, i.e., the position of the target obstacle at the first moment. The specific establishing process may include the following steps S21-S22:
And S21, under a first coordinate system, sampling a lane corresponding to the target obstacle along the lane center line to obtain a sampling point sequence.
The lane corresponding to the target obstacle is a lane where the target obstacle is located at the current moment, and may be obtained by identifying a field image acquired by a vehicle-mounted camera.
As can be seen from fig. 2 and its description, the second coordinate system is determined according to the lane corresponding to the target obstacle. Therefore, to obtain the data basis for establishing the second coordinate system, the lane corresponding to the target obstacle in the visual-field image is first located in the first coordinate system; the lane usually carries a marking of the lane center line, which is used to locate the center line of the lane where the target obstacle is located; finally, sampling points are acquired along the lane center line to obtain the sampling point sequence.
The number of sampling points in the sequence may be, for example, 100, 500, or 1000, and may be adjusted adaptively according to the distance to the target obstacle. Sampling may be equidistant, or different sampling gradients may be adjusted in real time according to the distance between the current position and the target obstacle, to improve the accuracy of the sampling points.
Step S22, taking the sampling point closest to the first obstacle position in the sampling point sequence as a reference point, and taking a second coordinate system corresponding to the reference point as a second coordinate system corresponding to the target obstacle.
After the sampling point sequence is obtained, the distances between each sampling point in the sampling point sequence and the first obstacle position are calculated respectively, and the sampling point with the shortest distance is used as a reference point. The distance between each sampling point and the first obstacle position is calculated according to the position information corresponding to each sampling point and the first obstacle position. Since the distance between the reference point and the first obstacle position is the shortest, the detection error of the speed by using the second coordinate system established by the reference point is the smallest, and at this time, the second coordinate system established by the reference point can be used as the second coordinate system corresponding to the target obstacle.
As shown in fig. 3, the reference point in this embodiment is point 1 in fig. 3 and the first obstacle position is point 2. Taking the second coordinate system corresponding to point 1 as the second coordinate system corresponding to the target obstacle, the speed information v in the first coordinate system is then mapped into the second coordinate system to obtain the speed information.
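Steps S21-S22 reduce to a nearest-neighbour search over the centre-line samples. A minimal sketch with illustrative names:

```python
import math

def nearest_reference_point(sample_points, first_obstacle_pos):
    """Among the lane centre-line sampling points, pick the one closest to
    the first obstacle position (steps S21-S22); the SL frame anchored at
    this point becomes the obstacle's second coordinate system."""
    return min(sample_points, key=lambda p: math.dist(p, first_obstacle_pos))
```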
For the acquisition of the speed information of the target obstacle in the first coordinate system, in one embodiment, the speed information of the target obstacle in the first coordinate system is a speed value calculated according to the speed detection information of the target obstacle at the first moment. The speed detection information may be image information collected by a vehicle-mounted camera, the vehicle-mounted camera may continuously shoot the target obstacle, and then speed estimation is performed on each frame of image shot by the vehicle-mounted camera based on a speed estimation method, so as to calculate speed information of the target obstacle. It is to be understood that, in other embodiments, the speed detection information may be motion state information of the target obstacle acquired by the vehicle radar.
In this embodiment, different speed detection methods are adopted for a target obstacle on a curve and on a straight road. First, in the first coordinate system, it is judged whether the target obstacle is located on a curve, i.e., whether at least one of the first obstacle position and the second obstacle position corresponding to the target obstacle is located on the curve. If so, the target obstacle is on the curve, and the speed information of the target obstacle in the second coordinate system is calculated according to the mapping information of the first obstacle position and the second obstacle position in the second coordinate system; otherwise, the speed information of the target obstacle in the second coordinate system is detected directly according to its speed information in the first coordinate system.
The above is a detailed description of the first embodiment of the obstacle speed detection method provided by the present application; a second embodiment, building on the first, is described in detail below with reference to fig. 4.
In this embodiment, the speed information of the target obstacle in the second coordinate system is solved according to the difference value of the mapping information of the first obstacle position and the second obstacle position in the second coordinate system, and specifically the implementation steps include:
Step 401, mapping first pose information of a first obstacle position under a first coordinate system to a second coordinate system to obtain first mapping information.
It may be appreciated that in one embodiment, the mapping of the first pose information to obtain the first mapping information is calculated according to the decomposition information of the first pose information under the second coordinate system corresponding to the target obstacle, and specifically, the mapping process in step 401 includes:
And S31, decomposing the first pose information in a second coordinate system corresponding to the target obstacle to obtain decomposition information.
It is to be understood that the determination of the second coordinate system corresponding to the target obstacle may be performed according to the descriptions of the above step S21 and step S22, which are not described herein.
Specifically, as shown in fig. 5, the reference point in this embodiment is the point labeled 1 and the first obstacle position is the point labeled 2 in fig. 5. The second coordinate system corresponding to the target obstacle is the coordinate system drawn at point 1. The first pose information corresponding to point 2 is (x1, y1, z1, yaw1); decomposing point 2 into the coordinate system corresponding to point 1 yields the decomposition information (s1', l1', w_sl1').
And step S32, summing the preset distance and the decomposition information to obtain first mapping information, wherein the preset distance is the distance between the reference point and the vehicle.
As shown in fig. 5, after the decomposition information (s1', l1', w_sl1') is obtained, it is summed with the preset distance d to obtain the first mapping information (s1, l1, w_sl1). The preset distance d may be calculated from the position information of the reference point and of the vehicle, or obtained by laser ranging with a laser sensor mounted on the vehicle; this embodiment is not limited in this respect.
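Under the assumption that the preset distance d is measured along the lane and therefore offsets only the longitudinal S component (the patent does not spell out which components the sum affects), step S32 can be sketched as:

```python
def first_mapping_info(decomposed, preset_distance):
    """Combine the decomposition information (s', l', w') with the preset
    reference-point-to-vehicle distance d. Assumption: d offsets only the
    longitudinal S component, i.e. (s, l, w) = (s' + d, l', w')."""
    s_p, l_p, w_p = decomposed
    return (s_p + preset_distance, l_p, w_p)
```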
In another embodiment, the mapping the first pose information to obtain the first mapping information is calculated according to a mapping function between the first coordinate system and the second coordinate system, and the specific implementation steps include:
Step S41, a mapping function when the first coordinate system and the second coordinate system are mapped in position is obtained.
In one embodiment of the present application, the mapping function may be preset in the local storage of the vehicle, so that the mapping function may be directly obtained from the vehicle. In another embodiment, the mapping function may be stored in a server connected to the vehicle, and the mapping function is transmitted to the vehicle by the server in response to the start of the location mapping when the location mapping is performed.
For ease of understanding, in this embodiment the first pose information is (x1, y1, z1, yaw1) and the obtained mapping function is T.
And step S42, mapping the first pose information into a second coordinate system according to the mapping function to obtain first mapping information.
After the mapping function T is obtained, the first pose information (x1, y1, z1, yaw1) is substituted into it to obtain the first mapping information: (s1, l1, w_sl1) = T(x1, y1, z1, yaw1).
Specifically, the first pose information includes first position information and a first yaw angle. To ensure the accuracy of the first position information, in one implementation it may be obtained by combining a camera on the vehicle with a neural network; the specific steps may be:
step S51, acquiring an acquired image containing the target obstacle through the vehicle-mounted camera.
In this embodiment, an obstacle positioning model relating obstacle plane coordinates to obstacle images is preconfigured, so the target obstacle can subsequently be positioned by using the obstacle positioning model directly.
Step S52, inputting the acquired image into an obstacle positioning model to obtain a plane coordinate of the target obstacle under a plane coordinate system corresponding to the acquired image, wherein the obstacle positioning model is obtained after training a preset neural network according to a plurality of training samples.
After the acquired image, which serves as the input of the obstacle positioning model, is obtained, it is input into the model, yielding the plane coordinates of the target obstacle in the plane coordinate system corresponding to the acquired image.
It can be understood that training the preset neural network into the obstacle positioning model proceeds as follows: the training samples containing obstacles are input into the preset neural network; the actual plane coordinates of the obstacles in the training samples serve as the ideal output; the network continuously adjusts its model parameters according to the deviation between its actual output and the ideal output until the two are very close (the deviation is smaller than a preset value), at which point the network is taken as the obstacle positioning model.
Step S53, converting the plane coordinate into a space coordinate under the first coordinate system according to the corresponding relation between the plane coordinate system and the first coordinate system, and taking the space coordinate as the first position information.
It can be understood that, denoting the above correspondence as M and the plane coordinates as (x, y), the obtained first position information is (x1, y1, z1) = M(x, y). The correspondence is pre-constructed: the same point is labelled in the plane coordinate system to obtain its plane coordinate, and labelled in the first coordinate system, which represents spatial coordinates, to obtain its spatial coordinate; the correspondence between the two coordinates can then be established from these paired data. In the correspondence, each pixel is associated with two coordinates: its coordinate in the plane coordinate system, and its corresponding spatial coordinate (i.e., in the first coordinate system). Illustratively, a first image has a pixel 1 whose coordinate in the plane coordinate system is (x1, y1) and whose spatial coordinate in the first coordinate system is (x1, y1, z1).
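The pre-built correspondence M can be sketched as a simple lookup from labelled pixel coordinates to labelled spatial coordinates. The calibration entries below are made-up values for illustration, not real calibration data.

```python
# Minimal sketch of the correspondence M between plane (pixel) coordinates
# and spatial coordinates in the first coordinate system. Entries are
# hypothetical calibration points.

correspondence_M = {
    (100, 200): (1.0, 2.0, 0.0),   # pixel coordinate -> spatial coordinate
    (150, 220): (1.5, 2.2, 0.0),
}

def plane_to_space(plane_xy):
    """Return the spatial coordinate recorded for a labelled pixel."""
    return correspondence_M[plane_xy]

first_position = plane_to_space((100, 200))  # -> (1.0, 2.0, 0.0)
```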
In one embodiment, the first yaw angle in the first pose information may be acquired as follows: a millimeter-wave radar sensor on the vehicle fits the motion trajectory of the target obstacle up to the first obstacle position, and the included angle between that trajectory and the vehicle body direction is taken as the yaw angle of the target obstacle.
Step 402, mapping second pose information of the second obstacle position under the first coordinate system to the second coordinate system to obtain second mapping information.
It is to be understood that the process of mapping the second pose information to obtain the second mapping information may refer to the description of step 401, which is not repeated here.
In one embodiment, the second pose information corresponding to the second moment may be obtained from the first pose information and the first speed information at the first moment. In this embodiment, the first moment is specifically described as a moment before the second moment. At this time, the specific implementation steps for obtaining the second pose information may be:
step S61, acquiring first pose information and first speed information of a target obstacle at a first moment in time under a first coordinate system.
The acquisition of the first pose information may refer to the above description and is not detailed here. It may be appreciated that the first coordinate system in this embodiment is a Cartesian coordinate system; the corresponding first pose information is (x1, y1, z1, yaw1) and the first velocity information is (vx1, vy1, vz1, vaw1), where (x1, y1, z1) in the first pose information is the first position information and yaw1 is the first yaw angle.
Step S62, calculating second pose information according to the first pose information and the first speed information.
Having obtained the first pose information (x1, y1, z1, yaw1) and the first velocity information (vx1, vy1, vz1, vaw1), the second pose information (x2, y2, z2, yaw2) is calculated from the components of the first pose information and the first velocity information. Specifically, the calculation may be:
x2=x1+vx1*dt;
y2=y1+vy1*dt;
z2=z1+vz1*dt;
yaw2=yaw1+vaw1*dt.
Where dt is the time interval between the first instant and the second instant.
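The constant-velocity extrapolation of steps S61 to S62 can be sketched directly from the four formulas above; the numeric pose and velocity values below are illustrative.

```python
# Sketch of steps S61-S62: extrapolate the second pose from the first
# pose and first velocity over the interval dt, assuming each component
# changes at its constant measured rate.

def extrapolate_pose(pose1, vel1, dt):
    """pose1 = (x1, y1, z1, yaw1); vel1 = (vx1, vy1, vz1, vaw1)."""
    x1, y1, z1, yaw1 = pose1
    vx1, vy1, vz1, vaw1 = vel1
    return (x1 + vx1 * dt,
            y1 + vy1 * dt,
            z1 + vz1 * dt,
            yaw1 + vaw1 * dt)

pose2 = extrapolate_pose((0.0, 0.0, 0.0, 0.1), (10.0, 1.0, 0.0, 0.05), 0.1)
# pose2 -> approximately (1.0, 0.1, 0.0, 0.105)
```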
Step 403, calculating the speed information of the target obstacle under the second coordinate system according to the mapping information difference between the first mapping information and the second mapping information.
In this embodiment, the speed information of the target obstacle in the second coordinate system is calculated based on the mapping information difference between the first mapping information and the second mapping information.
Specifically, the speed information of the target obstacle in the second coordinate system may be a linear speed, an angular speed, or both a linear speed and an angular speed. The solutions for each case are described as follows:
in one embodiment, the speed information of the target obstacle in the second coordinate system is: linear velocity;
Calculating the speed information of the target obstacle in the second coordinate system according to the mapping information difference between the first mapping information and the second mapping information specifically includes:
Step S71, calculating a position information difference between the first mapping information and the second mapping information.
Specifically, when the above-described speed information is a linear speed, the position information difference between the first mapping information (s1, l1, wsl1) and the second mapping information (s2, l2, wsl2) is:
Δs=s2-s1
Δl=l2-l1
Step S72, calculating the linear velocity of the target obstacle under the second coordinate system according to the position information difference value.
At this time, the linear speed is specifically calculated as:
vs=Δs/dt;
vl=Δl/dt。
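Steps S71 to S72 can be sketched as follows; the (s, l) values are illustrative positions along and across the lane in the second coordinate system.

```python
# Sketch of steps S71-S72: linear velocity in the second coordinate
# system from the position difference between the two mapped positions.

def linear_velocity(map1, map2, dt):
    """map1 = (s1, l1), map2 = (s2, l2): along-lane and cross-lane positions."""
    s1, l1 = map1
    s2, l2 = map2
    return ((s2 - s1) / dt,   # vs = Δs / dt
            (l2 - l1) / dt)   # vl = Δl / dt

vs, vl = linear_velocity((10.0, 0.5), (12.0, 1.0), 0.5)
# vs -> 4.0 (along the lane), vl -> 1.0 (across the lane)
```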
In one embodiment, the speed information of the target obstacle in the second coordinate system is: angular velocity;
Calculating the speed information of the target obstacle in the second coordinate system according to the mapping information difference between the first mapping information and the second mapping information specifically includes:
Step S81, a yaw angle difference between the first map information and the second map information is calculated.
Specifically, when the above-described speed information is an angular speed, the yaw angle difference between the first mapping information (s1, l1, wsl1) and the second mapping information (s2, l2, wsl2) is:
Δwsl=wsl2-wsl1
Step S82, calculating the angular velocity of the target obstacle under the second coordinate system according to the yaw angle difference value.
At this time, the calculation of the angular velocity is specifically:
vaw=Δwsl/dt.
In one embodiment, the speed information of the target obstacle in the second coordinate system is: linear and angular velocities;
Calculating the speed information of the target obstacle in the second coordinate system according to the mapping information difference between the first mapping information and the second mapping information specifically includes:
Step S91, calculating a position information difference value between the first mapping information and the second mapping information;
Step S92, calculating a yaw angle difference value between the first mapping information and the second mapping information;
Step S93, calculating the linear velocity of the target obstacle under a second coordinate system according to the position information difference value;
step S94, calculating the angular velocity of the target obstacle under the second coordinate system according to the yaw angle difference value.
When the speed information of the target obstacle in the second coordinate system is the linear velocity and the angular velocity, the calculation may refer to the description of steps S71 to S82 above and is not repeated here.
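The combined case of steps S91 to S94 can be sketched by differencing the full mapped poses; the (s, l, wsl) values below are illustrative.

```python
# Sketch of steps S91-S94: both linear and angular velocity from the
# full mapped poses (s, l, wsl) at the two moments.

def velocity_in_second_frame(map1, map2, dt):
    s1, l1, w1 = map1
    s2, l2, w2 = map2
    vs = (s2 - s1) / dt   # linear velocity along the lane
    vl = (l2 - l1) / dt   # linear velocity across the lane
    vw = (w2 - w1) / dt   # angular velocity (yaw rate)
    return vs, vl, vw

vs, vl, vw = velocity_in_second_frame((10.0, 0.5, 0.0), (12.0, 1.0, 0.25), 0.5)
# vs -> 4.0, vl -> 1.0, vw -> 0.5
```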
It can be understood that the above description takes the first moment as a moment before the second moment. The first moment may also be a moment after the second moment, in which case the calculation of the second pose information is the reverse of the above process and is not repeated here.
This embodiment specifically describes how to detect the speed of a target obstacle in a curve. When the speed information in the first coordinate system is mapped to the second coordinate system, the speed-estimation error is corrected, which avoids the problem that the calculation result is inconsistent with the actual situation because the automatic driving system estimates the speed with error.
The above is embodiment two of the obstacle speed detection method provided by the embodiments of the present application. The following is embodiment three, in which the process of determining whether the target obstacle is located in a curve is described in detail on the basis of embodiment one.
Specifically, in one embodiment, the specific implementation steps include:
Step S101, acquiring first pose information of a target obstacle at a first obstacle position and second pose information of the target obstacle at a second obstacle position under a first coordinate system.
It will be appreciated that the acquisition of the first pose information and the second pose information has been described in the foregoing embodiments, which are not described in detail in this embodiment.
Step S102, judging whether at least one of the first pose information and the second pose information is located in a curve area in a preset map, if so, judging that at least one of the first obstacle position and the second obstacle position is located in the curve, and if not, judging that neither the first obstacle position nor the second obstacle position is located in the curve.
For judging whether the first pose information is located in a curve area in the preset map, it can be understood that the preset map in this embodiment is a high-precision semantic map in which the coordinate areas of curve areas and straight roads are recorded. When the first pose information is located in a curve area, the first obstacle position is judged to be located in the curve; when the first pose information is located in the coordinate area of a straight road, the first obstacle position is determined to be located in the straight road.
Judging whether the second pose information is located in a curve area in the preset map proceeds in the same way: when the second pose information is located in a curve area, the second obstacle position is judged to be located in the curve; when the second pose information is located in the coordinate area of a straight road, the second obstacle position is determined to be located in the straight road.
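The map-region membership test of step S102 can be sketched with a standard ray-casting point-in-polygon check. Modelling a curve area as a polygon, and the polygon coordinates themselves, are assumptions for illustration; a real high-precision semantic map would supply its own region representation.

```python
# Hedged sketch of step S102: test whether a pose lies inside a curve
# area recorded in the map, modelling the area as a polygon and using
# ray casting. Coordinates are illustrative, not real map data.

def point_in_polygon(pt, polygon):
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal ray
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

curve_area = [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0), (0.0, 5.0)]
in_curve = point_in_polygon((3.0, 2.0), curve_area)    # True: in the curve
out_curve = point_in_polygon((20.0, 2.0), curve_area)  # False: not in it
```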
In another embodiment, whether the first obstacle position and the second obstacle position are located in a curve is determined according to the curvature of the obstacle driving route corresponding to the first pose information and the second pose information. The specific implementation steps include:
Step S111, an obstacle travel route corresponding to the first obstacle position and the second obstacle position is acquired.
The obstacle driving route may be obtained by combining the driving direction of the target obstacle with the first obstacle position and the second obstacle position, which is not limited or described in detail in this embodiment.
It will be appreciated that the obstacle driving route may run from the first obstacle position to the second obstacle position, or from the second obstacle position to the first obstacle position. Specifically, when the first moment precedes the second moment, the route runs from the first obstacle position to the second obstacle position; when the first moment follows the second moment, the route runs from the second obstacle position to the first obstacle position.
Step S112, obtaining the route curvature of the obstacle driving route.
The route curvature may be calculated according to a curvature formula, which is not specifically limited or described here.
Step S113, judging whether the curvature of the route is larger than a preset curvature threshold value, if so, judging that at least one of the first obstacle position and the second obstacle position is positioned on the curve, and if not, judging that neither the first obstacle position nor the second obstacle position is positioned on the curve.
It will be appreciated that when the target obstacle travels straight, the obstacle driving route is a straight line, whose curvature is 0. Considering actual driving, however, a target obstacle does not travel in a strictly straight line even on a straight road, so a curvature threshold may be preset as the judgment parameter for determining whether the target obstacle is located in a curve; a small value greater than or equal to 0 may be set, and this embodiment is not particularly limited.
When the target obstacle is in a curve, the obstacle driving route is arc-shaped, and its curvature is greater than the preset curvature threshold. At this time, the first obstacle position may be located in the curve, or the second obstacle position may be, or both may be; that is, the first obstacle position and/or the second obstacle position are located in the curve.
When the route curvature of the obstacle driving route is not greater than the preset curvature threshold, the obstacle driving route is a straight line and the target obstacle is located on a straight road; at this time, neither the first obstacle position nor the second obstacle position is located in a curve.
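Steps S111 to S113 can be sketched by estimating curvature from three sampled route positions. The Menger (three-point circumscribed-circle) curvature used here, the threshold value, and the sample coordinates are illustrative choices, not the patent's prescribed formula.

```python
import math

# Sketch of steps S111-S113: estimate route curvature from three sampled
# positions via the Menger curvature (4 * triangle area / product of the
# three side lengths) and compare it against a preset threshold.

def menger_curvature(p1, p2, p3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # twice the unsigned triangle area, via the cross product
    area2 = abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))
    a = math.dist(p1, p2)
    b = math.dist(p2, p3)
    c = math.dist(p1, p3)
    if a * b * c == 0:
        return 0.0
    return 2 * area2 / (a * b * c)  # equals 4*Area/(a*b*c)

def on_curve(p1, p2, p3, threshold=0.01):
    return menger_curvature(p1, p2, p3) > threshold

# Collinear points -> curvature 0 -> straight road
straight = on_curve((0, 0), (1, 0), (2, 0))       # False
# Points on a circle of radius 10 -> curvature ~ 0.1 -> curve
curved = on_curve((10, 0), (0, 10), (-10, 0))     # True
```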
It will be appreciated that the above determinations of whether the first obstacle position and the second obstacle position are located in a curve are schematic illustrations; those skilled in the art may adopt other determination manners with reference to the above description.
The above is embodiment three of the obstacle speed detection method provided by the embodiments of the present application. The following is an embodiment of an obstacle speed detection apparatus provided by the embodiments of the present application.
Referring to fig. 6, the obstacle speed detecting apparatus in the present embodiment specifically includes:
a determining unit 601, configured to determine, in a first coordinate system, whether at least one of a first obstacle position and a second obstacle position of a target obstacle is located in a curve;
A first calculating unit 602, configured to calculate, when it is determined that at least one of the first obstacle position and the second obstacle position is located in the curve, speed information of the target obstacle in the second coordinate system according to mapping information of the first obstacle position and the second obstacle position in the second coordinate system.
Further, the speed detecting apparatus in the present embodiment further includes:
The second calculating unit 603 is configured to calculate, when it is determined that neither the first obstacle position nor the second obstacle position is located in the curve, speed information of the target obstacle in the second coordinate system according to the speed information of the target obstacle in the first coordinate system.
Further, calculating the speed information of the target obstacle in the second coordinate system according to the mapping information of the first obstacle position and the second obstacle position in the second coordinate system specifically includes:
Mapping first pose information of the first obstacle position under a first coordinate system to a second coordinate system to obtain first mapping information;
mapping second pose information of the second obstacle position under the first coordinate system to the second coordinate system to obtain second mapping information;
and calculating the speed information of the target obstacle under the second coordinate system according to the mapping information difference value between the first mapping information and the second mapping information.
Further, mapping the first pose information of the first obstacle position under the first coordinate system to the second coordinate system to obtain first mapping information, which specifically includes:
Decomposing the first pose information in a second coordinate system corresponding to the target obstacle to obtain decomposed information;
And summing the preset distance and the decomposition information to obtain first mapping information, wherein the preset distance is the distance between the reference point and the vehicle.
Further, the method comprises the steps of,
The configuration process of the second coordinate system corresponding to the target obstacle specifically comprises the following steps:
Sampling a lane corresponding to a target obstacle along a lane center line under a first coordinate system to obtain a sampling point sequence;
and taking a sampling point closest to the first obstacle position in the sampling point sequence as a reference point, and taking a second coordinate system corresponding to the reference point as a second coordinate system corresponding to the target obstacle.
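The configuration of the second coordinate system described above can be sketched as follows. The straight-segment centre line, sampling spacing, and obstacle position are made-up values; a real system would take the centre-line geometry from the map.

```python
import math

# Sketch of the second-coordinate-system setup: sample the lane centre
# line into a point sequence, then take the sample nearest the first
# obstacle position as the reference point.

def sample_centerline(start, end, n):
    """Evenly sample n points from start to end along a straight centre line."""
    (x0, y0), (x1, y1) = start, end
    return [(x0 + (x1 - x0) * i / (n - 1),
             y0 + (y1 - y0) * i / (n - 1)) for i in range(n)]

def nearest_sample(samples, obstacle_pos):
    """Reference point: the sampling point closest to the obstacle position."""
    return min(samples, key=lambda p: math.dist(p, obstacle_pos))

samples = sample_centerline((0.0, 0.0), (100.0, 0.0), 11)  # one sample every 10 m
ref_point = nearest_sample(samples, (42.0, 1.5))           # -> (40.0, 0.0)
```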
Further, the speed information of the target obstacle in the second coordinate system is: linear velocity;
Calculating the speed information of the target obstacle in the second coordinate system according to the mapping information difference between the first mapping information and the second mapping information specifically includes:
Calculating a position information difference between the first mapping information and the second mapping information;
And calculating the linear velocity of the target obstacle in the second coordinate system according to the position information difference value.
Further, the speed information of the target obstacle in the second coordinate system is: angular velocity;
Calculating the speed information of the target obstacle in the second coordinate system according to the mapping information difference between the first mapping information and the second mapping information specifically includes:
calculating a yaw angle difference between the first mapping information and the second mapping information;
and calculating the angular speed of the target obstacle under the second coordinate system according to the yaw angle difference value.
Further, the speed information of the target obstacle in the second coordinate system is: linear and angular velocities;
Calculating the speed information of the target obstacle in the second coordinate system according to the mapping information difference between the first mapping information and the second mapping information specifically includes:
Calculating a position information difference between the first mapping information and the second mapping information;
calculating a yaw angle difference between the first mapping information and the second mapping information;
Calculating the linear speed of the target obstacle under a second coordinate system according to the position information difference value;
and calculating the angular speed of the target obstacle under the second coordinate system according to the yaw angle difference value.
Further, determining whether at least one of the first obstacle position and the second obstacle position corresponding to the target obstacle is located in the curve specifically includes:
acquiring an obstacle driving route corresponding to the first obstacle position and the second obstacle position;
Acquiring a route curvature of a driving route of the obstacle;
Judging whether the curvature of the route is larger than a preset curvature threshold value, if so, judging that at least one of the first obstacle position and the second obstacle position is positioned on the curve, and if not, judging that neither the first obstacle position nor the second obstacle position is positioned on the curve.
Further, determining whether at least one of the first obstacle position and the second obstacle position corresponding to the target obstacle is located in the curve specifically includes:
Acquiring first pose information of a target obstacle at a first obstacle position and second pose information of the target obstacle at a second obstacle position in a first coordinate system;
Judging whether at least one of the first pose information and the second pose information is located in a curve area in a preset map, if so, judging that at least one of the first obstacle position and the second obstacle position is located in the curve, and if not, judging that neither the first obstacle position nor the second obstacle position is located in the curve.
In this embodiment, a different speed detection method is adopted for a target obstacle located in a curve. First, in the first coordinate system, it is judged whether the target obstacle is located in a curve, that is, whether at least one of the first obstacle position and the second obstacle position corresponding to the target obstacle is located in a curve. If so, the target obstacle is located in a curve, and the speed information of the target obstacle in the second coordinate system is calculated according to the mapping information of the first obstacle position and the second obstacle position in the second coordinate system. This differs from the traditional speed detection method for target obstacles located in curves, thereby solving the technical problem that traditional speed detection methods yield detection results of low accuracy for obstacles located in curves.
The embodiments of the present application also provide an embodiment of an obstacle speed detection device, which includes a processor and a memory. The memory is used to store program code and transmit the program code to the processor; the processor is configured to execute the obstacle speed detection method in the foregoing embodiments according to instructions in the program code.
The present application also provides an embodiment of a computer-readable storage medium storing program code for executing the obstacle speed detection method in the foregoing embodiments.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
The terms "first," "second," "third," "fourth," and the like in the description of the application and in the above figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in the present application, "at least one (item)" means one or more, and "a plurality" means two or more. "and/or" for describing the association relationship of the association object, the representation may have three relationships, for example, "a and/or B" may represent: only a, only B and both a and B are present, wherein a, B may be singular or plural. The character "/" generally indicates that the context-dependent object is an "or" relationship. "at least one of" or the like means any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one (one) of a, b or c may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", wherein a, b, c may be single or plural.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of elements is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: u disk, mobile hard disk, read-Only Memory (ROM), random access Memory (Random Access Memory, RAM), magnetic disk or optical disk, etc.
The above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (9)

1. An obstacle speed detection method, comprising:
Judging whether at least one of a first obstacle position and a second obstacle position of a target obstacle is positioned on a curve or not in a first coordinate system;
Calculating speed information of the target obstacle in a second coordinate system according to mapping information of the first obstacle position and the second obstacle position in the second coordinate system when at least one of the first obstacle position and the second obstacle position is judged to be located in the curve;
the establishing step of the second coordinate system specifically includes:
Sampling a lane corresponding to the target obstacle along a lane center line under the first coordinate system to obtain a sampling point sequence;
And taking a sampling point closest to the first obstacle position in the sampling point sequence as a reference point, and taking a second coordinate system corresponding to the reference point as a second coordinate system of the target obstacle.
2. The obstacle speed detection method according to claim 1, further comprising:
And when judging that the first obstacle position and the second obstacle position are not positioned on the curve, calculating speed information of the target obstacle in a second coordinate system according to the speed information of the target obstacle in the first coordinate system.
3. The obstacle speed detection method according to claim 1, wherein the calculating speed information of the target obstacle in the second coordinate system based on the mapping information of the first obstacle position and the second obstacle position in the second coordinate system specifically includes:
Mapping first pose information of the first obstacle position under the first coordinate system to a second coordinate system to obtain first mapping information;
Mapping second pose information of the second obstacle position under the first coordinate system to the second coordinate system to obtain second mapping information;
And calculating the speed information of the target obstacle under a second coordinate system according to the mapping information difference value between the first mapping information and the second mapping information.
4. The obstacle speed detecting method according to claim 3, wherein mapping the first pose information of the first obstacle position in the first coordinate system to the second coordinate system to obtain the first mapping information comprises:
Decomposing the first pose information in a second coordinate system corresponding to the target obstacle to obtain decomposition information;
And summing the preset distance and the decomposition information to obtain first mapping information, wherein the preset distance is the distance between the reference point and the vehicle.
5. The obstacle speed detecting method according to claim 1, wherein the determining whether at least one of the first obstacle position and the second obstacle position of the target obstacle is located in a curve, specifically includes:
Acquiring an obstacle driving route corresponding to the first obstacle position and the second obstacle position;
Acquiring a route curvature of the obstacle driving route;
Judging whether the curvature of the route is larger than a preset curvature threshold value, if so, judging that at least one of the first obstacle position and the second obstacle position is positioned on a curve, and if not, judging that neither the first obstacle position nor the second obstacle position is positioned on the curve.
6. An obstacle speed detecting apparatus, comprising:
A judging unit configured to judge whether at least one of a first obstacle position and a second obstacle position of the target obstacle is located in a curve in the first coordinate system;
A first calculation unit configured to calculate, when it is determined that at least one of the first obstacle position and the second obstacle position is located in the curve, speed information of the target obstacle in a second coordinate system according to mapping information of the first obstacle position and the second obstacle position in the second coordinate system;
the establishing step of the second coordinate system specifically includes:
Sampling a lane corresponding to the target obstacle along a lane center line under the first coordinate system to obtain a sampling point sequence;
And taking the sampling point in the sampling point sequence closest to the first obstacle position as a reference point, and taking the second coordinate system corresponding to the reference point as the second coordinate system of the target obstacle.
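The reference-point selection in claim 6 reduces to a nearest-point search over the centerline samples. A minimal sketch, assuming samples and the obstacle position are 2-D points in the first coordinate system (the function name is hypothetical):

```python
import math


def pick_reference_point(sample_points, first_obstacle_pos):
    """Take the lane-centerline sampling point closest to the first obstacle
    position as the reference point anchoring the second coordinate system."""
    return min(sample_points, key=lambda p: math.dist(p, first_obstacle_pos))


# Example: three centerline samples one metre apart; the obstacle sits near the middle one.
ref = pick_reference_point([(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)], (1.2, 0.3))
```

A linear scan is fine for a single lane's samples; a production system would likely use a spatial index when many lanes are involved.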
7. The obstacle speed detecting device as claimed in claim 6, further comprising:
And a second calculation unit, configured to calculate the speed information of the target obstacle in the second coordinate system from the speed information of the target obstacle in the first coordinate system when it is determined that neither the first obstacle position nor the second obstacle position is located in a curve.
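Claim 7 does not say how the first-frame speed is converted; a plausible sketch, assuming that on a straight (curve-free) segment the second frame differs from the first by a pure rotation through the lane heading, is a simple velocity rotation. The name `rotate_velocity_to_lane` and the pure-rotation assumption are illustrative only.

```python
import math


def rotate_velocity_to_lane(vx, vy, lane_heading):
    """Rotate a first-frame velocity (vx, vy) into the lane-aligned second frame.
    Returns (longitudinal speed, lateral speed)."""
    vs = vx * math.cos(lane_heading) + vy * math.sin(lane_heading)
    vl = -vx * math.sin(lane_heading) + vy * math.cos(lane_heading)
    return (vs, vl)
```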
8. An obstacle speed detection device, the device comprising a processor and a memory;
The memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to execute the obstacle speed detection method according to any one of claims 1 to 5 according to instructions in the program code.
9. A storage medium storing program code for executing the obstacle speed detection method according to any one of claims 1 to 5.
CN202110426631.XA 2021-04-20 2021-04-20 Obstacle speed detection method, device, equipment and storage medium Active CN113125795B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110426631.XA CN113125795B (en) 2021-04-20 2021-04-20 Obstacle speed detection method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113125795A CN113125795A (en) 2021-07-16
CN113125795B true CN113125795B (en) 2024-06-18

Family

ID=76778412

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110426631.XA Active CN113125795B (en) 2021-04-20 2021-04-20 Obstacle speed detection method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113125795B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113849971B (en) * 2021-09-16 2023-03-28 广州文远知行科技有限公司 Driving system evaluation method and device, computer equipment and storage medium
CN114742958B (en) * 2022-02-18 2023-02-17 禾多科技(北京)有限公司 Three-dimensional lane information generation method, device, equipment and computer readable medium

Citations (2)

Publication number Priority date Publication date Assignee Title
CN109191487A (en) * 2018-08-30 2019-01-11 百度在线网络技术(北京)有限公司 Collision checking method, device, equipment and storage medium based on unmanned vehicle
CN109212530A (en) * 2017-07-04 2019-01-15 百度在线网络技术(北京)有限公司 Method and apparatus for determining barrier speed

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
JP2001199260A (en) * 2000-01-20 2001-07-24 Matsushita Electric Ind Co Ltd Inter-vehicle distance controller, vehicle traveling condition display device, vehicle speed control releasing device, and vehicle sudden brake warning device
CN109557925B (en) * 2018-12-29 2021-12-10 北京智行者科技有限公司 Obstacle avoiding method and device for automatic driving vehicle
CN110189547B (en) * 2019-05-30 2020-10-20 广州小鹏汽车科技有限公司 Obstacle detection method and device and vehicle
CN111497852B (en) * 2020-04-30 2021-05-11 安徽江淮汽车集团股份有限公司 Method, device and equipment for judging position of obstacle in curve scene and storage medium
CN111596086B (en) * 2020-05-15 2022-10-11 北京百度网讯科技有限公司 Method and device for estimating speed of obstacle in automatic driving and electronic equipment
CN112633101A (en) * 2020-12-14 2021-04-09 深兰人工智能(深圳)有限公司 Obstacle speed detection method and device

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
CN109212530A (en) * 2017-07-04 2019-01-15 百度在线网络技术(北京)有限公司 Method and apparatus for determining barrier speed
CN109191487A (en) * 2018-08-30 2019-01-11 百度在线网络技术(北京)有限公司 Collision checking method, device, equipment and storage medium based on unmanned vehicle


Similar Documents

Publication Publication Date Title
JP6272566B2 (en) Route prediction device
CN106405555B (en) Obstacle detection method and device for Vehicular radar system
KR101714145B1 (en) Apparatus for identifying peripheral vehicle and method thereof
US20190310651A1 (en) Object Detection and Determination of Motion Information Using Curve-Fitting in Autonomous Vehicle Applications
JP6246609B2 (en) Self-position estimation apparatus and self-position estimation method
KR100899820B1 (en) The discriminative apparatus and method of ground/obstacle for autonomous mobile vehicle
WO2018221453A1 (en) Output device, control method, program, and storage medium
WO2018181974A1 (en) Determination device, determination method, and program
CN113125795B (en) Obstacle speed detection method, device, equipment and storage medium
KR20180009755A (en) Lane estimation method
US11231285B2 (en) Map information system
WO2021056499A1 (en) Data processing method and device, and movable platform
WO2020029706A1 (en) Dummy lane line elimination method and apparatus
US20200116509A1 (en) Assistance control system
JP7051366B2 (en) Information processing equipment, trained models, information processing methods, and programs
CN109085829A (en) A kind of sound state target identification method
CN110632617A (en) Laser radar point cloud data processing method and device
CN110850859B (en) Robot and obstacle avoidance method and obstacle avoidance system thereof
CN108628318A (en) Congestion environment detection method, device, robot and storage medium
CN108573272A (en) Track approximating method
CN115993597A (en) Visual radar perception fusion method and terminal equipment
CN112686951A (en) Method, device, terminal and storage medium for determining robot position
JP2020003463A (en) Vehicle's self-position estimating device
CN112711255B (en) Mobile robot obstacle avoidance method, control equipment and storage medium
CN113158779B (en) Walking method, walking device and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant