CN113296116A - Obstacle detection method, driving device, and storage medium - Google Patents

Obstacle detection method, driving device, and storage medium

Info

Publication number
CN113296116A
Authority
CN
China
Prior art keywords
coordinate
obstacle
coordinate value
point
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110528166.0A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tonn Intelligent Technology Suzhou Co ltd
Original Assignee
Tonn Intelligent Technology Suzhou Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tonn Intelligent Technology Suzhou Co ltd filed Critical Tonn Intelligent Technology Suzhou Co ltd
Priority to CN202110528166.0A
Publication of CN113296116A
Legal status: Pending


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application relates to the technical field of intelligent robots, and discloses an obstacle detection method, a driving device, and a storage medium. The method comprises the following steps: acquiring a first coordinate value obtained by the single-line laser radar detecting an obstacle, the first coordinate value being determined with reference to the coordinate system of the single-line laser radar; performing coordinate conversion on the first coordinate value to obtain a second coordinate value, the second coordinate value being determined with reference to the coordinate system of the driving device; and determining the type of the obstacle based on the second coordinate value. Polar coordinate values of a plurality of detection points are collected by the single-line laser radar; after coordinate-system conversion and coordinate-origin conversion, Cartesian coordinate values referenced to the driving device's coordinate system are obtained; obstacle points are then screened by their vertical coordinate values for linear-fitting analysis to determine the type of the obstacle. The size and height of a ground obstacle, or the slope or slope angle of a slope obstacle, can further be determined, which improves the robot's obstacle-avoidance performance at a low implementation cost.

Description

Obstacle detection method, driving device, and storage medium
Technical Field
The present application relates to the field of intelligent robot technology, and in particular, to an obstacle detection method, a driving device, and a storage medium.
Background
Intelligent driving devices, such as unmanned robots, are used in fields such as inspection, sanitation cleaning, express delivery, and food service. Unmanned robots have greatly relieved manual labor and brought the convenience of artificial intelligence into daily life. To meet the requirements of various usage scenarios, an unmanned robot generally needs multiple sensing capabilities, such as the ability to sense obstacles so that it can avoid them while traveling.
Existing multi-line three-dimensional laser radar sensors generally extract obstacle features from the returned three-dimensional point cloud using the Vector Field Histogram (VFH) method, so as to determine obstacle targets for avoidance. Although this technique can also identify slope obstacles, multi-line 3D laser radar is expensive and therefore unsuitable for wide deployment. Moreover, most mounting positions of multi-line 3D laser radars on a robot currently leave detection blind zones of varying degrees, so the robot cannot sense low obstacles or the lower part of a slope obstacle, which degrades its obstacle-avoidance performance. For robots in basic service fields such as logistics and sanitation, the implementation cost of detecting obstacles with multi-line 3D laser radar is too high, so obstacles are generally avoided using only a low-cost single-line laser radar.
In obstacle-avoidance schemes implemented with a single-line laser radar, the robot can only detect whether an obstacle exists in its traveling direction; the specific characteristics of the obstacle are difficult to analyze or predict. For example, a cleaning robot with some climbing capability that encounters a gently sloped obstacle while cleaning could in principle climb it and continue. But without distinguishing the type of obstacle according to preset obstacle characteristics, the robot simply treats it as a ground obstacle to avoid and bypass, unreasonably changing its preset travel path or direction and ultimately harming the effectiveness and timeliness of its cleaning task.
Disclosure of Invention
The embodiments of the present application provide an obstacle detection method, a driving device, and a storage medium. A single-line laser radar collects the polar coordinate values of a plurality of detection points in the traveling direction of a driving device such as a robot. After coordinate-system conversion and coordinate-origin conversion, Cartesian coordinate values referenced to the driving device's coordinate system are obtained; obstacle points are screened by their vertical coordinate values for linear-fitting analysis, and the type of the obstacle is finally determined as a basis for accurately selecting an obstacle-avoidance scheme. The single-line laser radar used by this obstacle detection method is simple in structure and low in cost; when several single-line laser radars are installed on the robot, the scheme achieves higher accuracy and sensitivity, which improves the robot's obstacle-avoidance performance while keeping its manufacturing cost under control.
In a first aspect, an embodiment of the present application provides an obstacle detection method applied to a driving device equipped with a single-line laser radar. The method includes: acquiring a first coordinate value obtained by the single-line laser radar detecting an obstacle, the first coordinate value being determined with reference to the coordinate system of the single-line laser radar; performing coordinate conversion on the first coordinate value to obtain a second coordinate value, the second coordinate value being determined with reference to the coordinate system of the driving device; and determining the type of the obstacle based on the second coordinate value.
The original point cloud polar coordinates are formed by collecting, through the single-line laser radar, a number of detection points on the ground and on encountered obstacles in the traveling direction of the driving device and its surroundings; the original point cloud is the set of all detection points. When the single-line laser radar scans the ground, it returns the coordinates of detection points on the ground; when it scans an obstacle, it returns the coordinates of detection points on the obstacle surface. Together these form the original point cloud coordinates, and the detection points on the obstacle surface are the effective coordinates for determining the type of the obstacle, i.e., the first coordinate values.
For example, if the obstacle detected by the single-line laser radar is a ground obstacle, the first coordinate values acquired by scanning the ground obstacle's surface are the original polar coordinate values of the detection points on that surface, and the second coordinate values obtained by coordinate conversion are, for example, coordinate values in a Cartesian coordinate system.
In a possible implementation of the first aspect, determining the type of the obstacle based on the second coordinate value includes: determining those points whose z-axis coordinate value exceeds the vertical coordinate value corresponding to a preset height threshold as obstacle points; performing a linear fit on the obstacle points; and determining the type of the obstacle based on the linear-fit result.
That is, the vertical coordinate value obtained by converting a detection point into the three-dimensional coordinate system can be used to decide whether that point lies on the ground, is an obstacle point above the ground, or is a cliff point below the ground. The vertical coordinate value corresponding to the preset height threshold is that of a position point a certain height above the ground; when a detection point's z-axis coordinate value exceeds it, the detection point can be classified as an obstacle point, and the resulting obstacle points are used to analyze the characteristics of the obstacle the robot has encountered.
For example, the second coordinate value is a Cartesian coordinate value referenced to the robot coordinate system, and the preset height threshold is, for example, the first height threshold z_u in the following embodiments. By checking whether the z-axis coordinate value of each detection point, in the robot coordinate system, exceeds the vertical coordinate value corresponding to z_u, each point can be judged to be an obstacle point or not. Linear fitting over the resulting set of obstacle points then reveals the obstacle's characteristics, chiefly whether they match those of a slope obstacle; if they do not match, the type of the obstacle is determined to be a ground obstacle.
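The screening step can be sketched in a few lines. This is a minimal Python illustration, not code from the patent; the function name, the point format, and the value of the threshold z_u are all assumptions for demonstration.

```python
def screen_obstacle_points(points, z_u=0.02):
    """Keep the (x, y, z) points whose z coordinate, in the robot frame,
    exceeds the preset height threshold z_u (an assumed value here)."""
    return [p for p in points if p[2] > z_u]

# Example cloud: two near-ground points and two points on a low obstacle.
cloud = [(0.5, 0.0, 0.0), (0.6, 0.0, 0.001), (0.7, 0.0, 0.05), (0.8, 0.0, 0.12)]
obstacle_points = screen_obstacle_points(cloud)
```

Only the last two points survive the screen; they would then be passed to the linear-fitting analysis.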
In a possible implementation of the first aspect, coordinate-converting the first coordinate value into the second coordinate value includes: converting the coordinate system of the first coordinate value to obtain a third coordinate value, where the first coordinate value is determined with reference to the polar coordinate system of the single-line laser radar and the third coordinate value with reference to its Cartesian coordinate system; and performing coordinate conversion based on the third coordinate value and a preset coordinate transformation matrix to obtain the second coordinate value. The coordinate system of the single-line laser radar takes the laser emission point of the single-line laser radar as its coordinate origin; the coordinate system of the driving device takes the center point of the driving device's rear-wheel connecting shaft as its coordinate origin.
That is, converting the first coordinate value into the second coordinate value requires two steps: a coordinate-system conversion and a coordinate-origin conversion. For example, the first coordinate value is a polar coordinate value of an obstacle point in the sensor coordinate system whose origin is the laser emission point of the single-line laser radar. It is first converted into a Cartesian coordinate value in that same sensor coordinate system, i.e., the third coordinate value, and the coordinate origin is then converted to obtain a Cartesian coordinate value in the robot coordinate system, whose origin is the center point of the robot's rear-wheel connecting shaft. The origin conversion is realized through a coordinate transformation matrix; the values of the entries in the preset coordinate transformation matrix are set in advance by experiment, based on the relative positions of the laser emission point and the rear-axle center point, and on the angles between the coordinate planes of the sensor coordinate system and the robot coordinate system.
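The two-step conversion can be sketched as follows: a minimal Python illustration under assumed mounting offsets. The function names and the matrix values are hypothetical; as described above, a real transformation matrix would be calibrated experimentally from the lidar's mounting pose.

```python
import math

def polar_to_cartesian(r, theta):
    """Step 1: polar (r, theta) in the lidar scanning plane -> Cartesian
    coordinates in the sensor frame (origin at the laser emission point)."""
    return (r * math.cos(theta), r * math.sin(theta), 0.0)

def sensor_to_robot(point, T):
    """Step 2: apply a preset 4x4 homogeneous transform T encoding the
    lidar's pose relative to the rear-axle center point."""
    x, y, z = point
    v = (x, y, z, 1.0)
    return tuple(sum(T[i][j] * v[j] for j in range(4)) for i in range(3))

# Illustrative transform: lidar mounted 0.3 m forward of and 0.2 m above the
# rear-axle center, with no rotation (a real matrix would also encode tilt).
T = [[1.0, 0.0, 0.0, 0.3],
     [0.0, 1.0, 0.0, 0.0],
     [0.0, 0.0, 1.0, 0.2],
     [0.0, 0.0, 0.0, 1.0]]

p_sensor = polar_to_cartesian(1.0, 0.0)   # a point 1 m straight ahead of the lidar
p_robot = sensor_to_robot(p_sensor, T)    # the same point in the robot frame
```

With these assumed offsets, a point 1 m ahead of the emitter lands at roughly (1.3, 0.0, 0.2) in the robot frame.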
In one possible implementation of the first aspect, the coordinate system of the single-line laser radar includes a polar coordinate system with the laser emission point of the single-line laser radar as the coordinate origin, and a Cartesian coordinate system with the same laser emission point as origin; the first coordinate value is a polar coordinate value in the polar coordinate system with the laser emission point as origin, and the third coordinate value is a Cartesian coordinate value in the Cartesian coordinate system with the laser emission point as origin.
In one possible implementation of the first aspect, the coordinate system of the driving device includes a cartesian coordinate system with a center point of a rear wheel connecting shaft of the driving device as a coordinate origin; the second coordinate value is a Cartesian coordinate value in a Cartesian coordinate system with the center point of the rear wheel connecting shaft as the origin of coordinates.
In a possible implementation of the first aspect, the coordinate transformation matrix is determined based on the relative position between the laser emission point of the single-line laser radar and the center point of the rear-wheel connecting shaft of the driving device, and on the deflection angle of the laser scanning plane of the single-line laser radar with respect to the bottom or side surface of the driving device.
In a possible implementation of the first aspect, obtaining the second coordinate value based on the third coordinate value and a preset coordinate transformation matrix includes: and obtaining the second coordinate value based on the third coordinate value multiplied by the coordinate transformation matrix.
In one possible implementation of the first aspect, the types of obstacles include a slope obstacle and a ground obstacle; the method further comprises the following steps: determining the obstacle to be a slope obstacle under the condition that the linear fitting result meets a preset fitting condition; and under the condition that the linear fitting result does not meet the preset fitting condition, determining that the obstacle is a ground obstacle.
That is, whether the obstacle is a ground obstacle or a slope obstacle can be determined from the result of the linear fit performed over the obstacle-point set described above. For example, obstacle points matching the characteristics of a slope obstacle fit an inclined straight line whose slope exceeds the fitted-line slope threshold; a linear-fit result that does not match the slope-obstacle characteristics indicates that the type of the obstacle is a ground obstacle.
In a possible implementation of the first aspect, the linear fit includes a straight-line fit, and the preset fitting condition includes a straight-line fitting error threshold and a fitted-line slope threshold. The method further includes: determining that the linear-fit result meets the preset fitting condition when the fitting error of the straight-line fit is smaller than the fitting error threshold and the slope of the fitted line is greater than the slope threshold; and determining that the linear-fit result does not meet the preset fitting condition when the fitting error is greater than the error threshold or the slope of the fitted line is less than or equal to the slope threshold.
That is, in the result of fitting a straight line to the obstacle points, an approximately straight fit with a small error may be regarded as a straight line, for example when the fitting error is below the error threshold. In addition, the slope of the fitted line must be compared against a threshold: a fitted slope greater than the preset fitted-line slope threshold indicates an inclined surface consistent with a slope obstacle, whereas a smaller slope suggests a near-flat surface, such as the relatively flat top face of a ground obstacle.
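The decision rule can be sketched with an ordinary least-squares fit over the obstacle points projected onto the x-z plane. This is a hypothetical illustration: the threshold values, the point set, and the use of mean squared error as the "fitting error" are all assumptions, not values specified by the patent.

```python
def fit_line(points):
    """Least-squares fit z = k*x + b over (x, z) pairs;
    returns (slope k, intercept b, mean squared error)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    mz = sum(z for _, z in points) / n
    k = (sum((x - mx) * (z - mz) for x, z in points)
         / sum((x - mx) ** 2 for x, _ in points))
    b = mz - k * mx
    mse = sum((z - (k * x + b)) ** 2 for x, z in points) / n
    return k, b, mse

def classify(points, err_threshold=1e-3, slope_threshold=0.1):
    """Apply the preset fitting condition: small error AND steep-enough slope
    -> slope obstacle; otherwise -> ground obstacle."""
    k, _, mse = fit_line(points)
    if mse < err_threshold and k > slope_threshold:
        return "slope obstacle"
    return "ground obstacle"

# Points whose z rises steadily with x resemble a ramp surface.
ramp = [(0.5, 0.05), (0.6, 0.08), (0.7, 0.11), (0.8, 0.14)]
result = classify(ramp)
```

A set of points at constant height (e.g. the flat top of a box) fits a near-horizontal line, so its slope fails the threshold and it is classified as a ground obstacle.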
In a second aspect, an embodiment of the present application provides a driving device, including: a processor, a memory, a single-line laser radar, a communication interface, and a communication bus; the single-line laser radar, the memory, and the communication interface are connected through the communication bus. The memory is configured to store at least one instruction, and when the processor executes the at least one instruction stored in the memory, the driving device performs the above obstacle detection method.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, where instructions are stored on the storage medium, and when executed on a computer, the instructions cause the computer to execute the above obstacle detection method.
Drawings
Fig. 1 is a schematic view illustrating an application scenario of the obstacle detection method based on the single line laser radar according to an embodiment of the present application.
Fig. 2 shows a schematic structural diagram of a system 200 of a driving device according to an embodiment of the present application.
Fig. 3a is a schematic view illustrating an application scenario of the obstacle detection method according to a first embodiment of the present application.
Fig. 3b is a schematic view illustrating a scene in which the type of the obstacle is determined to be a slope obstacle according to the first embodiment of the present application.
Fig. 4 is a schematic flowchart illustrating a specific flow of an obstacle detection method according to a first embodiment of the present application.
Fig. 5 is a schematic diagram illustrating a process of adjusting the installation angle of the single line laser radar 102 according to a first embodiment of the present application.
Fig. 6a is a schematic diagram illustrating a straight line fitting result according to a first embodiment of the present application.
Fig. 6b is a diagram illustrating another straight line fitting result according to the first embodiment of the present application.
Fig. 7a to 7b are schematic diagrams illustrating another application scenario of the obstacle detection method according to the second embodiment of the present application.
Fig. 8 is a schematic diagram illustrating another installation position of the single-line lidar in the scenario illustrated in fig. 7a according to the second embodiment of the present application.
Fig. 9 is a schematic view illustrating another application scenario of the obstacle detection method according to the third embodiment of the present application.
Fig. 10 is a schematic perspective view illustrating a mounting position of a single-line lidar in the scene shown in fig. 9 according to a third embodiment of the present application.
Fig. 11a to 11b are schematic diagrams illustrating changes in an included angle between single line laser radars according to a third embodiment of the present application.
Detailed Description
The present application is further described with reference to the following detailed description and the accompanying drawings. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. The terms of orientation such as "upper", "lower", "front", "back", etc. are used in a positional description corresponding to the specific drawings, and are not intended to limit the position and orientation of the specific structures. In addition, for convenience of description, only a part of structures or processes related to the present application, not all of them, is illustrated in the drawings. It should be noted that in this specification, like reference numerals and letters refer to like items in the following drawings.
The application provides a barrier detection scheme based on single line laser radar, and particularly provides a barrier detection method, driving equipment and a storage medium.
Fig. 1 shows an application scenario diagram of the obstacle detection method based on the single line laser radar of the present application.
As shown in fig. 1, the scene includes a driving device 101 and a single-line laser radar 102, where the laser scanning plane 1021 formed by the laser emitted from the single-line laser radar 102 can detect an obstacle 103 within a certain distance range. In general, the single-line laser radar 102 is used to detect obstacles 103 in the traveling direction of the driving device 101 and is therefore usually arranged on the front end surface 1011 of the driving device 101, for example at the position 1012 shown in fig. 1, which is located above the center point of the lower edge line of the front end surface 1011.
To solve the problems in the prior art, the single-line-laser-radar-based obstacle detection method provided by the present application proceeds as follows. The single-line laser radar 102 collects a number of detection points on the ground, on encountered obstacles 103, and elsewhere in the robot's traveling direction and surroundings, forming an original point cloud. The polar coordinate values of the original point cloud, referenced to the sensor coordinate system whose origin is the laser emission point of the single-line laser radar 102, are the original point cloud polar coordinates. After coordinate-system conversion and coordinate-origin conversion, the point cloud's Cartesian coordinates referenced to the coordinate origin of the driving device are obtained. Based on the converted Cartesian coordinate values, obstacle points higher than a preset height threshold are screened from the original point cloud for linear-fitting analysis, and the type of the obstacle 103 is determined from the fit result of the screened obstacle points. The driving device 101 then selects an obstacle-avoidance scheme based on the determined obstacle type. This improves the accurate obstacle-avoidance capability of the driving device 101: it can correctly judge slope obstacles, low obstacles, and the like, and when it can climb smoothly, it can appropriately reduce detours, keep to its preset travel path and direction, and optimize its working effect.
Further, when the obstacle 103 is determined to be a ground obstacle, its size and height can be determined from the coordinate values of the screened obstacle points, and the driving device 101 can decide, from that size and height together with its own stability, whether to detour around it. For example, if the obstacle 103 is a ground obstacle of small size or low height, the driving device 101 can pass directly over it without avoidance, experiencing only slight shaking that does not affect the normal execution of its work task. When the obstacle 103 is determined to be a slope obstacle, the present application can also determine its slope or slope angle from the linear-fit result, and the driving device 101 can choose a climbing scheme based on the determined slope or slope angle together with its own climbing capability. In addition, since single-line laser radar is inexpensive, the implementation cost of the technical scheme of the present application is low, which favors its wide adoption on unmanned driving devices 101.
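The slope angle of a slope obstacle follows directly from the fitted line's slope via the arctangent. The sketch below is illustrative: the 15-degree climbing limit is an invented figure standing in for a robot's actual climbing capability, which the patent leaves device-specific.

```python
import math

def slope_angle_deg(k):
    """Convert a fitted-line slope k (rise over run) to a slope angle in degrees."""
    return math.degrees(math.atan(k))

def can_climb(k, max_climb_deg=15.0):
    """Compare the slope angle against an assumed climbing-capability limit."""
    return slope_angle_deg(k) <= max_climb_deg

angle = slope_angle_deg(0.2)   # a fitted slope of 0.2 is roughly an 11.3-degree ramp
```

A gentle ramp (slope 0.2, about 11.3 degrees) would pass the assumed limit and be climbed; a slope of 0.5 (about 26.6 degrees) would exceed it and trigger a detour.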
The single-line laser radar 102 collects the original point cloud polar coordinates of a number of detection points on the ground and on encountered obstacles 103 in the traveling direction of the robot 101 and its surroundings; these are the polar coordinate values of the detection points returned by the sensor of the single-line laser radar 102 as it laser-scans the ground ahead of and around the driving device 101.
It is to be understood that fig. 1 does not limit the arrangement position or number of the single-line laser radars 102; in some embodiments, one or more single-line laser radars 102 may be arranged on the driving device 101, without limitation here. Since a single-line laser radar 102 typically has a fixed scanning frequency, a plurality of single-line laser radars 102 together generally achieve a higher overall detection frequency than a single one, making the driving device 101 more sensitive in detecting obstacles 103.
Fig. 2 shows a schematic structural diagram of a system 200 of the driving apparatus 101 according to an embodiment of the present application.
As shown in fig. 2, system 200 may include one or more processors 204, system control logic 208 coupled to at least one of the processors 204, system memory 212 coupled to system control logic 208, non-volatile memory (NVM)/storage 216 coupled to system control logic 208, network interface 210 coupled to system control logic 208, and the single-line laser radar 102.
The processor 204 may include one or more single-core or multi-core processors. The processor 204 may include any combination of general-purpose processors and dedicated processors (e.g., graphics processors, application processors, baseband processors, etc.). In the embodiment of the present application, the processor 204 may operate a preset algorithm to perform operations such as coordinate transformation operation, linear fitting operation, calculation of the size and height of the ground obstacle, and calculation of the slope obstacle.
System control logic 208 for an embodiment may include any suitable interface controllers to provide any suitable interface to at least one of processors 204 and/or any suitable device or component in communication with system control logic 208.
System control logic 208 for one embodiment may include one or more memory controllers to provide an interface to system memory 212. System memory 212 may be used to load and store data and/or instructions for system 200; system memory 212 for one embodiment may comprise any suitable volatile memory, such as suitable dynamic random access memory (DRAM).
NVM/storage 216 may include one or more tangible, non-transitory computer-readable media for storing data and/or instructions. For example, NVM/storage 216 may include any suitable non-volatile memory, such as flash memory, and/or any suitable non-volatile storage device(s), such as one or more hard disk drives (HDDs), one or more compact disc (CD) drives, and/or one or more digital versatile disc (DVD) drives.
NVM/storage 216 may comprise a portion of the storage resource on the device on which system 200 is installed, or it may be accessible by, but not necessarily a part of, the device. For example, NVM/storage 216 may be accessed over a network via network interface 210.
In particular, system memory 212 and NVM/storage 216 may each include temporary and permanent copies of instructions 224. The instructions 224 may include instructions that, when executed by at least one of the processors 204, cause the system 200 to implement the obstacle detection method of the present application. In various embodiments, the instructions 224 or hardware, firmware, and/or software components thereof may additionally or alternatively reside in the system control logic 208, the network interface 210, and/or the processor 204. In the embodiments of the present application, programs for coordinate transformation, linear fitting, calculation of ground obstacle size and height, calculation of the slope of a slope obstacle, and other operations performed on the original point cloud coordinates collected by the single-line laser radar 102 may be stored in NVM/storage 216 in advance for the processor 204 to call.
Network interface 210 is used to provide a radio interface for system 200 to communicate with any other suitable devices (e.g., front end modules, antennas, etc.) over one or more networks. The network interface 210 may further include any suitable hardware and/or firmware to provide a multiple-input multiple-output radio interface. For example, network interface 210 for one embodiment may be a network adapter, a wireless network adapter, a telephone modem, and/or a wireless modem.
The single-line laser radar 102 detects the environment around the driving device 101 and returns the coordinate information of a number of detection points (i.e., the original point cloud coordinates). The original point cloud coordinates are used to determine whether an obstacle 103 exists around the driving device 101 and to determine its type, to calculate the size and height of a ground obstacle, the slope or slope angle of a slope obstacle, and so on. The original point cloud coordinates collected by the single-line laser radar 102 are generally coordinate values in a polar coordinate system; they are sent to the processor 204, directly or through the system control logic 208, for coordinate conversion, linear-fitting analysis, and other processing. That processing includes converting the collected original point cloud coordinates from the laser radar's coordinate system into Cartesian coordinate values referenced to the coordinate origin of the driving device 101 through a preset coordinate transformation matrix, and performing linear-fitting analysis on the obstacle points whose Cartesian coordinate values meet the screening condition to determine the type of the obstacle 103 encountered by the driving device 101. If the obstacle 103 is a ground obstacle, its size, height, and so on are further calculated; if it is a slope obstacle, its slope angle and so on are further calculated. The computation process is described in detail in the specific embodiments below and is not repeated here.
It is to be understood that the structure of the system 200 shown in fig. 2 is not a specific limitation to the system of the driving apparatus 101, and in other embodiments, the driving apparatus 101 may implement the obstacle detection scheme of the present application through other forms of system structures.
It is understood that the driving device 101 provided in the present application may be various driving devices having an automatic driving and sensing function, for example, the driving device 101 may be a logistics distribution robot, an unmanned cleaning vehicle, a catering robot, an unmanned inspection robot, etc., without limitation. The following describes an implementation procedure of the obstacle detection scheme of the present application, taking the driving apparatus 101 as a logistics distribution robot (hereinafter referred to as the robot 101) as an example.
First, a detailed description will be given of a specific implementation process of implementing the obstacle detection scheme of the present application by arranging a single line laser radar on the robot 101 according to an embodiment.
Example one
In the present embodiment, a single line laser radar 102 is disposed on the robot 101, and the following describes in detail the installation position of the single line laser radar 102 and the specific process of implementing the obstacle detection method of the present application based on the single line laser radar 102 in the present embodiment with reference to the drawings.
Fig. 3a is a schematic view illustrating an application scenario of the obstacle detection method according to an embodiment of the present application. As shown in fig. 3a, the single line laser radar 102 is provided on the front end surface 1011 of the robot 101, and an obstacle 103 is present in the traveling direction of the robot 101. Assume that the coordinate system taking the laser emitting point of the single line laser radar 102 as the coordinate origin is the sensor coordinate system, for example, the rectangular planar coordinate system XOY shown in fig. 3a. During travel of the robot 101, the single line laser radar 102 emits laser to detect the ground environment in the traveling direction and returns the coordinate information of each detection point as the original point cloud coordinates. For example, as shown in fig. 3a, if the laser emitted by the single line laser radar 102 encounters the obstacle 103, the returned detection point 1022 is located on the surface of the obstacle 103; if the laser of the single line laser radar 102 scans a point on the ground, the returned detection point 1022 is located on the ground.
As shown in fig. 3a, in determining whether there is an obstacle in the traveling direction of the robot 101, the coordinate system taking the center point of the rear wheel connecting shaft of the robot 101 as the coordinate origin is the robot coordinate system, such as the XOZ coordinate plane shown in fig. 3a. With reference to the robot coordinate system, a first height threshold z_u above the ground and a second height threshold z_d below the ground can be preset. A detection point whose height is greater than z_u can then be determined as an obstacle point, a detection point whose height is smaller than z_d can be determined as a cliff point, and a detection point whose height lies between z_u and z_d can be determined as a ground point.
It is understood that if an obstacle point is determined to exist based on the coordinate information of the detection points returned by the sensor of the single line laser radar 102, it can be determined that an obstacle 103 exists in the traveling direction of the robot, and the type of the obstacle 103 can be further determined based on a linear fitting analysis of the determined obstacle points. If the obstacle 103 is a ground obstacle, as shown in fig. 3a, the robot 101 further calculates the size, height, and the like of the ground obstacle; if the obstacle 103 is a slope obstacle, as shown in fig. 3b, the robot 101 can further determine the slope or the slope angle of the obstacle 103, so as to decide whether it needs to detour around the encountered obstacle 103.
It is understood that the single line laser radar 102 may be disposed at any position on the front end surface 1011 of the robot 101, for example, the single line laser radar 102 is mounted at the center point of the edge on the front end surface of the robot 101 shown in fig. 3a, and in other embodiments, the mounting position of the single line laser radar 102 may be other positions different from the position shown in fig. 3a, which is not limited herein.
Fig. 4 shows a detailed flowchart of an obstacle detection method according to an embodiment of the present application. It will be appreciated that the steps in the flow chart shown in fig. 4 may be implemented by the processor 204 of the robot 101 by running a preset algorithm or software program.
Specifically, as shown in fig. 4, the obstacle detection method provided in the embodiment of the present application includes the following steps:
step 401: and acquiring polar coordinate values of all the detection points in a reference sensor coordinate system as original point cloud coordinates.
For example, the polar coordinate value of the detection point i acquired by the single line lidar 102 with reference to the sensor coordinate system may be expressed as {r_i, θ_i}, where the detection point i is any detection point collected by the single line laser radar 102, r_i is the radius of the detection point i on the laser scanning plane, and θ_i is the polar angle of the detection point i on the laser scanning plane.
Step 402: and converting the original point cloud coordinate into a Cartesian coordinate value of each detection point reference robot coordinate system.
For example, first, the polar coordinate value {r_i, θ_i} of the detection point i referenced to the sensor coordinate system is converted into the Cartesian coordinate value {x_si, y_si, z_si} in the same coordinate system; the conversion refers to the following formula (1):

x_si = r_i · sin(θ_i), y_si = r_i · cos(θ_i), z_si = 0    (1)
As an example, when i = 1, r_1 = 1.58, and θ_1 = -1.38, then x_s1 = 1.58 × sin(-1.38) = -1.55, y_s1 = 1.58 × cos(-1.38) = 0.28, and z_s1 = 0; the Cartesian coordinate values of point 1 referenced to the sensor coordinate system are therefore {-1.55, 0.28, 0}.
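As a sketch, formula (1) can be written directly in Python (angles in radians). Note that with θ_1 = -1.38 the y value evaluates to approximately 0.30, so the 0.28 in the text's example likely reflects rounding of the example angle:

```python
import math

def polar_to_sensor_cartesian(r, theta):
    """Convert a lidar detection point from polar coordinates (r, theta)
    on the laser scanning plane to Cartesian coordinates {x_s, y_s, z_s}
    in the sensor coordinate system, per formula (1)."""
    x_s = r * math.sin(theta)
    y_s = r * math.cos(theta)
    z_s = 0.0  # the scan lies in the sensor's own XOY plane
    return x_s, y_s, z_s

# Worked example from the text: i = 1, r_1 = 1.58, theta_1 = -1.38 rad
x, y, z = polar_to_sensor_cartesian(1.58, -1.38)
print(round(x, 2), round(y, 2), z)  # prints: -1.55 0.3 0.0
```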
Then, the Cartesian coordinate value of the detection point i in the reference sensor coordinate system is transformed through a preset coordinate transformation matrix M to obtain the Cartesian coordinate value of the detection point i in the reference robot coordinate system. The coordinate transformation matrix M is the matrix that transforms Cartesian coordinate values in the sensor coordinate system into Cartesian coordinate values in the robot coordinate system; generally, the entries of M are determined based on experiments and preset in the system of the robot 101.
The process of performing coordinate conversion by the coordinate transformation matrix M may refer to the following formula (2):
{x_i, y_i, z_i, 1}^T = M · {x_si, y_si, z_si, 1}^T,  i = 1, 2, …, n    (2)
where n is the number of detection points returned by the sensor of the single line lidar 102, M is the preset coordinate transformation matrix, and {x_i, y_i, z_i} is the Cartesian coordinate value of the detection point i referenced to the robot coordinate system.
It will be appreciated that in some embodiments, the coordinate origin of the robot coordinate system may refer to the center point of the rear wheel connecting shaft on the robot 101 in fig. 3a and the related description, and the XOZ coordinate plane in the cartesian coordinate system of the robot coordinate system may refer to the XOZ coordinate plane shown in fig. 3a or fig. 3 b; in other embodiments, the coordinate origin of the robot coordinate system may also be another reference point, for example, the center of gravity of the robot 101 or the center point of the front wheel connecting shaft, and the XOZ coordinate plane in the cartesian coordinate system of the reference robot coordinate system may also be another coordinate plane that forms a certain included angle with the XOZ coordinate plane shown in fig. 3a, which is not limited herein.
As an example, taking the above detection point i = 1, whose Cartesian coordinate values referenced to the sensor coordinate system were calculated in step 402 as {-1.55, 0.28, 0}, the coordinate transformation based on the preset coordinate transformation matrix M follows formula (2):

{x_1, y_1, z_1, 1}^T = M · {-1.55, 0.28, 0, 1}^T
Finally, the Cartesian coordinate values {2.23, 0.08, -0.001} of the detection point i = 1 referenced to the robot coordinate system are obtained.
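To make formula (2) concrete, here is a small pure-Python sketch. The matrix M below is hypothetical: its entries are not given in the text (they depend on the lidar's calibrated mounting pose) and were chosen here only so that the worked example reproduces the values {2.23, 0.08, -0.001} above.

```python
# Hypothetical coordinate transformation matrix M in 4x4 homogeneous form.
# The real entries are calibrated from the lidar's mounting position and
# angle; these values are assumptions, not values from the patent.
M = [
    [0.0, 1.0, 0.0, 1.95],
    [1.0, 0.0, 0.0, 1.63],
    [0.0, 0.0, 1.0, -0.001],
    [0.0, 0.0, 0.0, 1.0],
]

def sensor_to_robot(point_s, M):
    """Formula (2): map a sensor-frame Cartesian point into the robot
    coordinate system through the homogeneous transform M."""
    p = list(point_s) + [1.0]  # homogeneous coordinates
    x, y, z, _ = [sum(M[r][c] * p[c] for c in range(4)) for r in range(4)]
    return x, y, z

xr, yr, zr = sensor_to_robot([-1.55, 0.28, 0.0], M)
print(round(xr, 2), round(yr, 2), round(zr, 3))  # prints: 2.23 0.08 -0.001
```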
It can be understood that the preset value of each entry in the coordinate transformation matrix M depends on the position and the installation angle of the single line laser radar 102 disposed on the robot 101, and the installation angle of the single line laser radar 102 determines the deflection angle of its laser scanning plane. The process of adjusting the installation angle of the single line lidar 102 can refer to fig. 5: for example, with the single line lidar installed on the front end surface 1011 of the robot 101, the laser scanning plane of the single line laser radar 102 can be adjusted during installation to be parallel to the horizontal plane 601, or deflected upwards to an angle parallel to the plane 602, or deflected downwards to a position parallel to the plane 603, which is not limited herein. In actual practice, the optimal installation angle of the single line lidar 102 may be determined based on experimental data.
Step 403: adding a classification mark to each detection point based on a preset Z value. Specifically, the preset Z value may refer to the above-mentioned first height threshold z_u and second height threshold z_d, or to the ordinate values corresponding to the first height threshold z_u and the second height threshold z_d. The Z-axis coordinate value in the converted Cartesian coordinate value of each detection point referenced to the robot coordinate system is compared with the preset Z value (or the ordinate value corresponding to the preset Z value) to determine which of the obstacle point, ground point, and cliff point categories each detection point belongs to. The ordinate values corresponding to the first height threshold z_u and the second height threshold z_d refer to the Cartesian coordinate system of the robot coordinate system.
It is understood that any point on an obstacle 103 encountered by the robot 101 during travel is above the travel surface (i.e., the ground) of the robot 101. Referring to fig. 3a, the height of each detection point relative to the ground can be calculated based on the Cartesian coordinate value of each detection point referenced to the robot coordinate system converted in step 402: the absolute value of the difference between the z-axis coordinate value of each detection point and the z-axis coordinate value of a point on the ground, both referenced to the robot coordinate system, gives the height of that detection point. For the calculated height of each detection point, as described above, a detection point whose height is greater than z_u is marked as an obstacle point, a detection point whose height is smaller than z_d can be marked as a cliff point, and a detection point whose height lies between z_d and z_u can be marked as a ground point. During the travel of the robot 101, the blocking effect of the marked ground points is almost negligible, so if the detection points are all ground points, there is no obstruction in the traveling direction or surrounding environment of the robot 101 detected by the single line laser radar 102; if the detection points include obstacle points, there is an obstacle 103 above the ground in the traveling direction or surrounding environment of the robot 101 detected by the single line laser radar 102; if the detection points include cliff points, there may be a cliff or a threshold in the traveling direction or surrounding environment of the robot 101 detected by the single line lidar 102.
As described above, taking the z-axis coordinate value of the detection point i referenced to the robot coordinate system as z_i, and the z-axis coordinate value of a point on the ground referenced to the robot coordinate system as z_0, the judgment conditions for adding a classification mark to each detection point may be as follows:
When |z_i - z_0| > z_u, the detection point i is judged to be an obstacle point, and the classification mark corresponding to the obstacle point is added to the detection point i. For example, the classification mark corresponding to the obstacle point may be the label "obstacle point", which is not limited herein.

When |z_i - z_0| ≤ z_u and |z_i - z_0| ≥ z_d, the detection point i is judged to be a ground point, and the classification mark corresponding to the ground point is added to the detection point i. For example, the classification mark corresponding to the ground point may be the label "ground point" or "floor point", which is not limited herein.

When |z_i - z_0| < z_d, the detection point i is judged to be a cliff point, and the classification mark corresponding to the cliff point is added to the detection point i. For example, the classification mark corresponding to the cliff point may be the label "cliff point", which is not limited herein.
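Sketched in Python, the three judgment conditions of step 403 read as follows. The threshold values z_u and z_d and the ground ordinate z_0 below are illustrative assumptions, not values from the text; the branch order follows the conditions above literally:

```python
def classify_point(z_i, z_0, z_u, z_d):
    """Add a classification mark to a detection point from the absolute
    height difference |z_i - z_0|, following the three judgment
    conditions of step 403."""
    h = abs(z_i - z_0)
    if h > z_u:
        return "obstacle point"
    if h < z_d:
        return "cliff point"
    return "ground point"  # z_d <= h <= z_u

# Hypothetical thresholds (illustrative only): z_u = 5 cm, z_d = 3 cm
z_u, z_d, z_0 = 0.05, 0.03, 0.0
print(classify_point(0.12, z_0, z_u, z_d))  # prints: obstacle point
print(classify_point(0.04, z_0, z_u, z_d))  # prints: ground point
print(classify_point(0.01, z_0, z_u, z_d))  # prints: cliff point
```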
In other embodiments, the preset first height threshold z_u and second height threshold z_d may also be set based on the ordinate values of the corresponding height position points. For example, in the Cartesian coordinate system O-XYZ with the center point of the robot rear wheel connecting axle as the origin, as shown in fig. 3a or fig. 3b, the ordinate values corresponding to the first height threshold z_u and the second height threshold z_d may be set as negative numbers; in that case, the following judgment conditions can be referred to for adding classification marks to the detection points:
when the ordinate value z_i of the detection point i referenced to the above O-XYZ coordinate system is greater than the ordinate value corresponding to the first height threshold z_u, the detection point i is marked as an obstacle point;

when the ordinate value z_i of the detection point i referenced to the above O-XYZ coordinate system is less than or equal to the ordinate value corresponding to the first height threshold z_u and not less than the ordinate value corresponding to the second height threshold z_d, the detection point i is marked as a ground point;

when the ordinate value z_i of the detection point i referenced to the above O-XYZ coordinate system is less than the ordinate value corresponding to the second height threshold z_d, the detection point i is marked as a cliff point. This is not limited herein.
Step 404: a linear fit is made on the XOZ coordinate plane for all obstacle points. Specifically, the projected points of the detected points marked as the obstacle points on the XOZ coordinate plane shown in fig. 3a or 3b may be linearly fitted.
It will be appreciated that linear fitting is an important tool for analyzing the characteristics of the discrete point distribution. Among these, the most basic and most common of linear fits is a straight line fit. Taking the least square method for fitting a straight line as an example, the functional relationship between the independent variable x and the dependent variable y is given by the following relation (3):
y_i = a + b·x_i    (3)
where x_i is the x-axis coordinate value of the projection of the obstacle point on the XOZ coordinate plane shown in fig. 3a or fig. 3b, and y_i is the z-axis coordinate value of that projection. In addition, relation (3) has two undetermined parameters a and b, where a represents the intercept and b represents the slope. For N groups of data (x_i, y_i), i = 1, 2, …, N, obtained by equal-precision measurement, the x_i values are considered accurate and all errors are attributed to y_i.
Taking the obstacle points as observation points, the corresponding observation point data are fitted to a straight line by the least squares method. When estimating the parameters by least squares, the weighted sum of squares of the deviations of the observed values y_i is required to be minimal. For straight-line fitting of observed values obtained by equal-precision measurement, this means minimizing the fitting error calculated by the following formula (4):

error = Σ_{i=1}^{N} (y_i - a - b·x_i)^2    (4)
during fitting, the value a and the value b are iterated by using a program, so that the fitting error is finally converged, the final fitting equation y is a + bx and the fitting error is obtained, and the fitting error is smaller than a preset threshold value, namely the fitting result is regarded as a straight line. For example, referring to the graphs of the fitting results of the straight lines shown in fig. 6a to 6b, when error < 0.001, the distribution characteristic of each obstacle point meets the straight line standard, and the fitting result is a straight line, as shown in fig. 6 a; when error is greater than or equal to 0.001, the distribution characteristics of the obstacle points do not meet the straight line standard, and the fitting result is a non-straight line, which is shown in reference to fig. 6 b.
It is understood that in other embodiments, the marked obstacle points may be linearly fitted on other coordinate planes, for example, the obstacle points may be linearly fitted on the YOZ coordinate plane shown in fig. 3a or fig. 3b, which is not limited herein.
Step 405: the type of the detected obstacle 103 is determined based on preset fitting conditions. If the type of the obstacle 103 is judged to be a slope obstacle, step 406 is executed; if the type of the obstacle 103 is determined to be a ground obstacle, step 407 is performed. Specifically, when the linear fitting result obtained in step 404 satisfies the preset fitting condition, it is determined that the obstacle 103 is a slope obstacle; and under the condition that the linear fitting result obtained in the step 404 does not meet the preset fitting condition, judging that the obstacle 103 is the ground obstacle. The preset fitting condition comprises whether the error of the straight line fitting is smaller than a preset error threshold value or not and whether the slope of the fitting straight line is larger than a slope threshold value of the fitting straight line or not.
As described above, in the linear fitting analysis of step 404, b is the slope of the fitted straight line. It can be understood that when the slope b obtained by the fitting analysis is small, the robot 101 may determine that the obstacle 103 is a low ground obstacle and execute step 407 to determine its size and height; when the slope b is large, the obstacle 103 may be a slope obstacle. A lower slope threshold can therefore be set as a further condition for determining whether the obstacle 103 encountered by the robot 101 is a slope obstacle. For example, if the lower slope threshold is set to 0.0524, the obstacle 103 encountered by the robot 101 is determined to be a slope obstacle when b ≥ 0.0524.
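A minimal sketch of the step 405 decision, assuming the example thresholds from the text (fitting error threshold 0.001; slope lower limit 0.0524, i.e. tan 3°):

```python
def classify_obstacle(b, error, error_threshold=0.001, slope_min=0.0524):
    """Step 405 sketch: an obstacle whose obstacle points fit a straight
    line (error below threshold) with a sufficiently large slope is
    treated as a slope obstacle; otherwise as a ground obstacle."""
    if error < error_threshold and b >= slope_min:
        return "slope obstacle"
    return "ground obstacle"

print(classify_obstacle(b=0.1, error=1e-5))   # prints: slope obstacle
print(classify_obstacle(b=0.1, error=0.5))    # prints: ground obstacle
print(classify_obstacle(b=0.01, error=1e-5))  # prints: ground obstacle
```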
Step 406: the slope or the slope angle of the slope obstacle is calculated. Specifically, the slope or the slope angle of the slope obstacle is calculated from the slope b of the fitted straight line with reference to fig. 3b described above.
It will be appreciated that the slope b represents the gradient of the obstacle 103, i.e., the tangent of the slope angle α of the obstacle 103, that is:
tanα=b
therefore, the calculation formula (5) of the slope angle α can be obtained as:
α = tan⁻¹(b)    (5)
therefore, it can be understood that in other embodiments, it may also be determined whether the obstacle 103 met by the robot 101 is a slope obstacle by presetting a slope angle lower limit threshold, for example, the slope angle lower limit threshold is set to 3 °, a slope angle α of the slope obstacle may be calculated based on the slope b and the above equation (5), and when α > 3 °, it is determined that the obstacle 103 met by the robot 101 is a slope obstacle, which is not limited herein.
Further, based on the climbing ability of the robot 101, an upper slope threshold for slope obstacles that the robot 101 can climb smoothly may be preset; when the slope b determined in step 404 is smaller than this preset upper slope threshold, it is determined that the robot 101 can smoothly pass the obstacle 103.
For example, the preset upper limit gradient threshold value is 0.364, and when the gradient b of the slope obstacle is less than 0.364, the robot 101 can smoothly pass through the slope obstacle.
It is understood that in other embodiments, whether the robot 101 can smoothly pass the slope obstacle may also be determined by presetting an upper slope-angle threshold. For example, with the upper slope-angle threshold set to 10°, the slope angle α of the slope obstacle can be calculated from the slope b determined in step 404 and the above formula (5); when α < 10°, it is determined that the robot 101 can smoothly pass the slope obstacle, otherwise the robot 101 cannot pass it, in which case the robot 101 may choose to detour around the slope obstacle or stop advancing. This is not limited herein.
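Formula (5) and the climb decision can be sketched together; the 10° upper slope-angle threshold follows the example above:

```python
import math

def can_climb(b, slope_angle_max_deg=10.0):
    """Formula (5): slope angle alpha = arctan(b). The robot is judged
    able to pass the slope obstacle when alpha is below the preset
    upper slope-angle threshold."""
    alpha = math.degrees(math.atan(b))
    return alpha, alpha < slope_angle_max_deg

alpha, ok = can_climb(0.1)      # gentle ramp
alpha2, ok2 = can_climb(0.364)  # steep ramp (0.364 = tan 20 degrees)
print(round(alpha, 1), ok, round(alpha2, 1), ok2)  # prints: 5.7 True 20.0 False
```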
Step 407: and calculating the size and height of the ground obstacle. Specifically, referring to fig. 3a described above, the size and height of the ground obstacle are calculated with reference to cartesian coordinate values of the robot coordinate system according to each obstacle point.
As shown in fig. 3a, when the robot 101 equipped with the single line laser radar 102 travels, the single line laser radar 102 collects the polar coordinate values of several obstacle points, for example the points K, L, M, N, at different times, and these coordinate values are converted in step 402 into Cartesian coordinate values referenced to the robot coordinate system. As shown in fig. 3a, point K is the obstacle point with the largest ordinate value and point N is the obstacle point with the smallest ordinate value, so the difference between the ordinate values of point K and point N can be determined as the height of the obstacle 103.
Similarly, based on the Cartesian coordinate values of the points K, L, M, N shown in fig. 3a or of other obstacle points, the extreme positions of the marked obstacle points on the left and right sides of the traveling direction of the robot 101 can be determined, for example the points L and M; the distance between the projections of the points L and M on a coordinate plane parallel to the ground can then be determined as the width of the obstacle 103.
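The height and width estimates of step 407 can be sketched as below. The points K, L, M, N are hypothetical values, and the choice of the y axis as the direction "across the traveling direction" is an assumption about the robot frame convention:

```python
def obstacle_height_and_width(points):
    """Step 407 sketch: from obstacle points {x, y, z} in the robot
    frame, the height is the spread of the z values (highest minus
    lowest point) and the width is the spread of the y values
    (leftmost to rightmost projection, y axis assumed lateral)."""
    zs = [p[2] for p in points]
    ys = [p[1] for p in points]
    return max(zs) - min(zs), max(ys) - min(ys)

# Hypothetical obstacle points K, L, M, N in the robot frame
pts = [(2.2, 0.00, 0.30),   # K: highest point
       (2.2, -0.25, 0.10),  # L: leftmost point
       (2.2, 0.25, 0.12),   # M: rightmost point
       (2.2, 0.05, 0.02)]   # N: lowest point
h, w = obstacle_height_and_width(pts)
print(round(h, 2), w)  # prints: 0.28 0.5
```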
Based on the determined height and width of the obstacle 103, the robot 101 may determine an obstacle avoidance scheme. For example, if the height of the obstacle 103 is greater than the preset height threshold, the robot 101 selects an obstacle avoidance bypassing scheme, and the avoidance distance is reasonably determined according to the determined width of the obstacle 103, which is not described herein again.
It can be understood that the detection range based on the single line laser radar is limited, for example, the detection range is 5m, so that the obstacle detection method implemented by the embodiment of the application can detect obstacles within a certain distance range in the specific implementation process.
In addition, the x-axis coordinate value x_i of the projection of an obstacle point on the XOZ coordinate plane shown in fig. 3a or fig. 3b, determined based on step 404 above, can be used to determine the distance between the robot 101 and the obstacle 103, so that the robot 101 can generate a climbing or obstacle avoidance scheme in advance or in time using this distance information, improving the sensitivity of the robot 101.
It can be understood that in this embodiment only one single line laser radar is arranged on the robot 101 to implement the obstacle detection method of the present application, so its detection range has certain limitations. In practical applications it is mainly necessary to detect obstacles in the traveling direction of the robot 101, and the single line laser radar 102 arranged on the front end surface 1011 of the robot 101 shown in fig. 3a is therefore relatively limited in detecting obstacles on the left and right sides of the traveling direction of the robot 101.
A specific process of implementing the obstacle detection method of the present application by providing two single line laser radars on the robot 101 is described below by another embodiment.
Example two
The embodiment introduces a specific implementation process of the obstacle detection method of the present application by providing two single line laser radars on the robot 101.
Figs. 7a to 7b are schematic diagrams illustrating another application scenario of the obstacle detection method applied to the present embodiment. As shown in fig. 7a, a single line laser radar 102-1 is arranged above the center point of the lower edge line of the front end surface 1011 of the robot 101, a single line laser radar 102-2 is arranged at the center point of the upper edge of the front end surface 1011 of the robot 101, and the laser scanning plane 102-1' of the single line laser radar 102-1 and the laser scanning plane 102-2' of the single line laser radar 102-2 intersect at the line O_1O_2. Fig. 7b shows the corresponding top view of the robot 101 for the mounting positions shown in fig. 7a.
The difference from the first embodiment is that in the present embodiment two single line laser radars are provided on the robot 101. The two single line laser radars detect obstacles on the ground in the traveling direction of the robot 101 and return original point cloud coordinates, and the processor 204 of the robot 101 determines, based on the returned original point cloud coordinates, the type of the obstacle 103 encountered ahead in the traveling direction of the robot 101 so as to select an appropriate obstacle avoidance or climbing scheme.
In other embodiments, the two single line laser radars may also be set at other positions on the robot 101; refer to fig. 8, a schematic diagram of another single line laser radar installation position in the application scenario of this embodiment. Described from the top view angle shown in fig. 8, a single line laser radar 102-1 and a single line laser radar 102-2 are respectively provided at the two ends of the edge where the left side of the top surface and the upper side of the front end surface of the robot 101 meet, and the intersection line of the laser scanning plane 102-1' of the single line laser radar 102-1 and the laser scanning plane 102-2' of the single line laser radar 102-2 is O_1O_2.
As can be seen by comparing the coverage of the scanning planes of the single line lidars shown in figs. 7a to 7b and fig. 8, the installation positions shown in figs. 7a to 7b can more accurately detect the obstacle 103 encountered in the traveling direction of the robot 101 and the distance from the robot 101 to the obstacle 103, but with this installation the robot 101 cannot easily sense the surrounding environment on the left and right sides of the traveling direction. The installation positions of the single line laser radars 102-1 and 102-2 shown in fig. 8 can detect not only the obstacle 103 encountered in the traveling direction of the robot 101 but also obstacles 103 on the left and right sides of the traveling direction; that is, the installation positions shown in fig. 8 give the robot 101 a larger detection range. However, because the number of single line laser radars covering the traveling direction of the robot 101 and their scanning frequency are limited, fewer detection points are collected in the traveling direction, so the robot 101 determines obstacle information from the detection point coordinates more slowly and with poorer timeliness. The following describes in detail a specific implementation procedure of the obstacle detection method according to the present embodiment based on the installation positions of the single line laser radars shown in fig. 7a.
Referring to steps 401 to 406 in the first embodiment, the process of determining the type of the obstacle 103, calculating the size and height of the ground obstacle, and calculating the slope or the slope angle of the slope obstacle by the processor 204 of the robot 101 based on the coordinate information of each detection point in the original point cloud coordinates returned by the single line laser radar for detecting the obstacle is specifically as follows:
Referring to step 401, the processor 204 of the robot 101 obtains, as the original point cloud coordinates, the polar coordinate values referenced to the sensor coordinate systems of the detection points returned when the single line laser radars 102-1 and 102-2 perform obstacle detection. Referring to fig. 7a, it can be understood that in the present embodiment the polar coordinate values of the detection points include those returned by the single line laser radar 102-1 in its own sensor coordinate system and those returned by the single line laser radar 102-2 in its own sensor coordinate system. The sensor coordinate system of the single line laser radar 102-1 takes the laser emitting point of the single line laser radar 102-1 as the origin and the laser scanning plane 102-1' of the single line laser radar 102-1 as the coordinate plane X_1O_1Y_1 shown in fig. 7a; the sensor coordinate system of the single line laser radar 102-2 takes the laser emitting point of the single line laser radar 102-2 as the origin and the laser scanning plane 102-2' of the single line laser radar 102-2 as the coordinate plane X_2O_2Y_2 shown in fig. 7a. In this embodiment, the processor 204 of the robot 101 can acquire the polar coordinate values of the detection points returned by the sensors of both single line laser radars.
Referring to step 402, the processor 204 of the robot 101 converts the obtained polar coordinate values of the probe points returned by the two sensors of the single line laser radar into cartesian coordinate values in the robot coordinate system by running a corresponding algorithm. The coordinate transformation process refers to equations (1) to (2) in the first embodiment, and is not described herein again.
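The two-lidar variant of the conversion can be sketched as follows: each lidar's polar points are converted to sensor Cartesian coordinates (formula (1)) and mapped into the common robot frame with that lidar's own calibration matrix (formula (2)), and the two clouds are then merged. The matrices M1 and M2 below are identity-plus-offset stand-ins, not values from the patent:

```python
import math

def polar_to_robot(points, M):
    """Convert one lidar's polar points (r, theta) to sensor Cartesian
    coordinates per formula (1), then map them into the shared robot
    frame with that lidar's own 4x4 calibration matrix M per formula (2)."""
    out = []
    for r, theta in points:
        p = [r * math.sin(theta), r * math.cos(theta), 0.0, 1.0]
        out.append(tuple(sum(M[i][c] * p[c] for c in range(4)) for i in range(3)))
    return out

def identity_with_z(tz):
    """Build a homogeneous transform that only translates along z."""
    M = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
    M[2][3] = tz
    return M

# Assumed mounting heights for the two lidars (illustrative only)
M1, M2 = identity_with_z(0.10), identity_with_z(0.80)
cloud = polar_to_robot([(1.0, 0.0)], M1) + polar_to_robot([(2.0, 0.0)], M2)
print(len(cloud), cloud[0])  # prints: 2 (0.0, 1.0, 0.1)
```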
Referring to step 403, the processor 204 of the robot 101, by running a corresponding algorithm, compares the Z-axis coordinate value in the converted Cartesian coordinate value of each detection point referenced to the robot coordinate system with the preset Z value, classifies each detection point, and adds a classification mark. The classification of each detection point refers to the obstacle points, ground points, and cliff points in the first embodiment, and the preset Z value refers to the settings of the first height threshold z_u and the second height threshold z_d in the first embodiment, which are not repeated here.
Referring to step 404, the processor 204 of the robot 101 runs a corresponding algorithm to perform linear fitting on the XOZ coordinate plane, for example, straight-line fitting by the least squares method, for all the detection points marked as obstacle points. For the fitting process, refer to the description in the first embodiment; the fitting relation used for the linear fitting refers to relation (3) in the first embodiment, and the calculation formula of the fitting error refers to formula (4) in the first embodiment, which are not repeated here.
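A least-squares straight-line fit on the XOZ projection, together with a mean-squared fitting error, might look like the following; the concrete error formula is an assumption, since the patent's relation (3) and formula (4) are not reproduced in this section:

```python
import numpy as np

def fit_line_xoz(points_robot):
    """Fit z = k*x + b by least squares through the XOZ projections of
    the obstacle points, and return (k, b, mean squared residual)."""
    pts = np.asarray(points_robot, dtype=float)
    x, z = pts[:, 0], pts[:, 2]
    A = np.stack([x, np.ones_like(x)], axis=1)      # design matrix [x, 1]
    (k, b), *_ = np.linalg.lstsq(A, z, rcond=None)
    residuals = z - (k * x + b)
    error = float(np.mean(residuals ** 2))
    return float(k), float(b), error
```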
Referring to step 405, the processor 204 of the robot 101 determines the type of the detected obstacle 103 by running a corresponding algorithm based on whether the linear fitting result satisfies the preset fitting condition, and the specific determination process may refer to the related description of step 405 in the first embodiment, which is not described herein again.
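The fitting-condition check of step 405 can be illustrated with the two thresholds described in claim 10 (a straight-line fitting error threshold and a fitted-slope threshold); the function and parameter names here are hypothetical:

```python
def classify_obstacle(k, fit_error, err_threshold, slope_threshold):
    """Decide the obstacle type from the line-fit result: a fitting
    error below the error threshold combined with a fitted slope above
    the slope threshold indicates a slope (ramp) obstacle; otherwise
    the obstacle is treated as a ground obstacle."""
    if fit_error < err_threshold and k > slope_threshold:
        return "slope"
    return "ground"
```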
Referring to step 406, if the obstacle 103 detected by the robot 101 is determined to be a slope obstacle, the processor 204 of the robot 101 further determines the slope or the slope angle of the slope obstacle by running a corresponding algorithm, and determines whether the robot can climb the slope smoothly according to the upper slope threshold or the upper slope-angle threshold preset in the system of the robot 101. For the specific process of calculating the slope or the slope angle of the slope obstacle and the process of determining whether the robot 101 can climb the slope smoothly, reference may be made to the related description of step 406 in the first embodiment, and details are not repeated here.
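Converting the fitted slope into a slope angle and comparing it against a configured climbing limit might be sketched as follows; expressing the limit in degrees is an assumption, since the patent only states that an upper slope or slope-angle threshold is preset in the system:

```python
import math

def can_climb(k, max_slope_angle_deg):
    """Convert the fitted slope k on the XOZ plane to a slope angle
    (degrees) and compare it against the robot's climbing limit.
    Returns (climbable, slope_angle_deg)."""
    angle_deg = math.degrees(math.atan(k))
    return angle_deg <= max_slope_angle_deg, angle_deg
```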
Referring to step 407, if the obstacle 103 detected by the robot 101 is determined as a ground obstacle, the processor 204 of the robot 101 further determines the size and height of the ground obstacle by running a corresponding algorithm, so as to determine a reasonable obstacle avoidance scheme. For the specific process of calculating the size and height of the ground obstacle and selecting a reasonable obstacle avoidance scheme, reference may be made to the related description of step 407 in the first embodiment, and details are not described here.
In addition, referring to step 404 in the first embodiment, based on the x-axis coordinate value xi of the projection of each obstacle point on the XOZ coordinate plane shown in fig. 7a, determined by the processor 204 of the robot 101 in step 404, the distance between the robot 101 and the obstacle 103 can be determined, so that the robot 101 can generate a climbing or obstacle avoidance scheme in advance or in time by using the distance information, improving the sensitivity of the robot 101.
It can be understood that, in the present embodiment, the obstacle detection method of the present application is implemented by two single line laser radars provided on the robot 101, and as described above, the installation positions of the single line laser radars shown in fig. 7a to 7b and fig. 8 have certain limitations in obstacle detection. In order to overcome the limitation of installing two single line laser radars to detect obstacles in the embodiment and realize a better obstacle detection effect, three or more single line laser radars can be arranged on the robot 101.
A specific process of implementing the obstacle detection method of the present application by providing three single line laser radars on the robot 101 is described below by another embodiment.
EXAMPLE III
The embodiment introduces a specific implementation process of the obstacle detection method of the present application by providing three single line laser radars on the robot 101.
Fig. 9 is a schematic view showing another application scenario of the obstacle detection method applied to the present embodiment. As shown in FIG. 9, a single line laser radar 102-1 is disposed above the center point of the lower edge line of the front end surface 1011 of the robot 101, and single line laser radars 102-2 and 102-3 are disposed at the two end positions of the upper edge of the front end surface 1011 of the robot 101, respectively. The single-line laser radars 102-1, 102-2 and 102-3 return original point cloud coordinates after detecting obstacles on the ground in the traveling direction of the robot 101, and the processor 204 of the robot 101 determines the type of the obstacle 103 in front of the traveling direction of the robot 101 based on the returned original point cloud coordinates so as to select a proper obstacle avoidance scheme or a suitable climbing scheme.
Wherein, the laser scanning plane formed by the rotation of the laser emitted by the single-line laser radar 102-1 is 102-1 ', the laser scanning plane formed by the rotation of the laser emitted by the single-line laser radar 102-2 is 102-2 ', and the laser scanning plane formed by the rotation of the laser emitted by the single-line laser radar 102-3 is 102-3 '.
In order to more clearly describe the installation positions of the three single line lidar in the embodiment, fig. 10 shows a schematic perspective structure of the installation positions of the single line lidar in the scene shown in fig. 9. As shown in fig. 10, a single line laser radar 102-1 is disposed above a center point of a lower edge line of a front end surface in a traveling direction of the robot 101, and single line laser radars 102-2 and 102-3 are respectively disposed at both ends of an upper edge of a front end surface 1011 of the robot 101.
The installation angles of the single line laser radars 102-2 and 102-3 are adjusted with reference to the method shown in fig. 5. Taking the single line laser radar 102-3 as an example, the initial laser scanning plane (hereinafter referred to as the sensor plane) of the single line laser radar 102-3 is parallel to the XOZ coordinate plane of the robot coordinate system shown in fig. 10, forming an original robot coordinate system O-XYZ. During installation, the sensor plane is first rotated by an angle A about the Z axis of the original robot coordinate system as the central axis, so that the original X axis rotates to the X1 axis position and the original Y axis rotates to the Y1 axis position; the sensor plane is then rotated by an angle B about the X1 axis as the central axis, so that the sensor plane of the single line laser radar 102-3 rotates to the position of the laser scanning plane 102-3'. It is understood that in other embodiments, the installation angle of the single line laser radar 102-3 may be adjusted by other rotation methods to place the single line laser radar 102-3 in a preferred scanning position, which is not limited herein.
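The two-step mounting rotation described above (first about the Z axis by angle A, then about the rotated X1 axis by angle B) corresponds to composing two elementary rotation matrices; this sketch assumes intrinsic rotations with angles in radians:

```python
import numpy as np

def rotz(a):
    """Rotation matrix about the Z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rotx(b):
    """Rotation matrix about the X axis by angle b (radians)."""
    c, s = np.cos(b), np.sin(b)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def mounting_rotation(A, B):
    """Compose the mounting rotations: rotate about Z by A, then about
    the rotated X1 axis by B (intrinsic composition: Rz(A) @ Rx(B))."""
    return rotz(A) @ rotx(B)
```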
In other embodiments, the installation positions of the three single line laser radars on the robot 101 may also be other positions on the front end surface of the robot 101, and the installation angles of the single line laser radars 102-2 and 102-3 may also be other installation angles, which is not limited herein. Referring to fig. 11a to 11b, which are schematic diagrams illustrating two variations of the included angle between the single line laser radars applied to the present embodiment, the included angle C between the laser scanning plane 102-2' of the single line laser radar 102-2 and the laser scanning plane 102-3' of the single line laser radar 102-3 may be larger (refer to fig. 11a) or smaller (refer to fig. 11b).
Next, based on the installation positions of the single line laser radars 102-1, 102-2 and 102-3 shown in fig. 9, the specific implementation process of the obstacle detection method of the present embodiment is described in detail with reference to the flow of steps shown in fig. 4 in the first embodiment of the present invention for implementing the obstacle detection method of the present application.
Referring to steps 401 to 406 in the first embodiment, the process by which the processor 204 of the robot 101 determines the type of the obstacle 103 and calculates the slope or the slope angle of the obstacle 103, based on the coordinate information of each detection point returned by the single line laser radars when detecting the obstacle, is specifically as follows:
Referring to step 401, the processor 204 of the robot 101 obtains the polar coordinate values, in the respective sensor coordinate systems, of the detection points returned when the single line laser radars 102-1, 102-2 and 102-3 perform obstacle detection, and takes them as the original point cloud coordinates. Referring to fig. 9, it can be understood that the polar coordinate values of the detection points in the sensor coordinate systems in the present embodiment include the polar coordinate values, returned by the single line laser radar 102-1, of the detection points in the sensor coordinate system of the single line laser radar 102-1, the polar coordinate values, returned by the single line laser radar 102-2, of the detection points in the sensor coordinate system of the single line laser radar 102-2, and the polar coordinate values, returned by the single line laser radar 102-3, of the detection points in the sensor coordinate system of the single line laser radar 102-3. The sensor coordinate system of the single line laser radar 102-1 takes the laser emission point of the single line laser radar 102-1 as the origin and the laser scanning plane 102-1' of the single line laser radar 102-1 as the coordinate plane X1O1Y1 shown in fig. 9; the sensor coordinate system of the single line laser radar 102-2 takes the laser emission point of the single line laser radar 102-2 as the origin and the laser scanning plane 102-2' of the single line laser radar 102-2 as the coordinate plane X2O2Y2 shown in fig. 9; the sensor coordinate system of the single line laser radar 102-3 takes the laser emission point of the single line laser radar 102-3 as the origin and the laser scanning plane 102-3' of the single line laser radar 102-3 as the coordinate plane X3O3Y3 shown in fig. 9.
In this embodiment, the processor 204 of the robot 101 may acquire the polar coordinate values of the detection points returned by the sensors of the three single line laser radars.
Referring to step 402, the processor 204 of the robot 101 runs a corresponding algorithm to convert the obtained polar coordinate values of the detection points, returned by the three single line laser radars, into Cartesian coordinate values in the robot coordinate system. For the coordinate transformation process, refer to equations (1) to (2) in the first embodiment, which are not repeated here.
Referring to step 403, the processor 204 of the robot 101 runs a corresponding algorithm to compare the Z-axis coordinate value, among the Cartesian coordinate values of each detection point in the robot coordinate system obtained by the conversion, with the preset Z value, classifies each detection point, and adds a classification mark. The classification of the detection points refers to the obstacle points, ground points and cliff points in the first embodiment, and the preset Z value refers to the settings of the first height threshold zu and the second height threshold zd in the first embodiment, which are not repeated here.
Referring to step 404, the processor 204 of the robot 101 performs linear fitting, for example, linear fitting by a least square method, on the XOZ coordinate plane shown in fig. 9 for all the detection points marked as the obstacle points by operating a corresponding algorithm, the fitting process refers to the description in the first embodiment, the fitting relation used for the linear fitting refers to the relation (3) in the first embodiment, and the calculation formula of the fitting error refers to the formula (4) in the first embodiment, which is not described herein again.
Referring to step 405, the processor 204 of the robot 101 determines the type of the detected obstacle 103 by running a corresponding algorithm based on whether the result of the linear fitting satisfies a preset fitting condition, and the specific determination process may refer to the related description of step 405 in the first embodiment, which is not described herein again.
Referring to step 406, if the obstacle 103 detected by the robot 101 is determined to be a slope obstacle, the processor 204 of the robot 101 further determines the slope or the slope angle of the slope obstacle by running a corresponding algorithm, and determines whether the robot can climb the slope smoothly according to the upper slope threshold or the upper slope-angle threshold preset in the system of the robot 101. For the specific process of calculating the slope or the slope angle of the slope obstacle and the process of determining whether the robot 101 can climb the slope smoothly, reference may be made to the related description of step 406 in the first embodiment, and details are not repeated here.
Referring to step 407, if the obstacle 103 detected by the robot 101 is determined as a ground obstacle, the processor 204 of the robot 101 further determines the size and height of the ground obstacle by running a corresponding algorithm, so as to determine a reasonable obstacle avoidance scheme. For the specific process of calculating the size and height of the ground obstacle and selecting a reasonable obstacle avoidance scheme, reference may be made to the related description of step 407 in the first embodiment, and details are not described here.
In addition, referring to step 404 in the first embodiment, based on the x-axis coordinate value xi of the projection of each obstacle point on the XOZ coordinate plane shown in fig. 9, determined by the processor 204 of the robot 101 in step 404, the distance between the robot 101 and the obstacle 103 can be determined, so that the robot 101 can generate a climbing or obstacle avoidance scheme in advance or in time by using the distance information, improving the sensitivity of the robot 101.
It can be understood that, in the present embodiment, the obstacle detection method of the present application is implemented by three single line laser radars arranged on the robot 101. As described above, when the three single line laser radars are installed at the positions shown in fig. 9 for obstacle detection, compared with the obstacle detection schemes described in the first and second embodiments, a more comprehensive laser scanning range can be achieved while the timeliness of returning detection points is also taken into account. That is to say, in the implementation of this embodiment, the robot 101 can sense a wider range of the surrounding environment and receive denser original point cloud coordinates, so as to timely and accurately determine whether the obstacle 103 exists around the robot 101 and the type of the obstacle 103, and can also timely calculate the size and height of a ground obstacle or the slope or slope angle of a slope obstacle to further determine an accurate obstacle avoidance scheme or climbing scheme, which is beneficial to improving the accurate obstacle avoidance capability of the robot 101.
In addition, because the cost of the single line laser radar is low, a small number of single line laser radars are adopted to realize the obstacle detection in the embodiments of the application, so that the production cost of the robot 101 can be greatly reduced, and meanwhile, the obstacle detection method can achieve the technical effect superior to that of a multi-line laser radar to detect obstacles, and is beneficial to wide popularization and application of the obstacle detection scheme.
Embodiments of the mechanisms disclosed herein may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the application may be implemented as computer programs or program code executing on programmable systems comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. The program code can also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in this application are not limited in scope to any particular programming language. In any case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed via a network or via other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or tangible machine-readable storage used in transmitting information over the Internet via electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings of embodiments of the present application, some features of structure or method may be shown in a particular arrangement and/or order. However, it is to be understood that such specific arrangement and/or ordering may not be required. Rather, in some embodiments, the features may be arranged in a manner and/or order different from that shown in the illustrative figures. In addition, the inclusion of a structural or methodical feature in a particular figure is not meant to imply that such feature is required in all embodiments, and in some embodiments, may not be included or may be combined with other features.
It should be noted that, in the apparatus embodiments of the present application, each unit/module is a logical unit/module. Physically, one logical unit/module may be one physical unit/module, a part of one physical unit/module, or a combination of multiple physical units/modules; the physical implementation of the logical unit/module itself is not what matters most, and the combination of functions implemented by the logical units/modules is the key to solving the technical problem addressed by the present application. Furthermore, in order to highlight the innovative part of the present application, the above apparatus embodiments do not introduce units/modules that are less closely related to solving the technical problem presented in the present application, which does not indicate that no other units/modules exist in the above apparatus embodiments.
It is noted that, in the examples and descriptions of this patent, relational terms such as first and second are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present application.

Claims (12)

1. An obstacle detection method applied to a drive apparatus including a single line laser radar, comprising:
acquiring a first coordinate value obtained by detecting an obstacle by the single-line laser radar, wherein the first coordinate value is determined by referring to a coordinate system of the single-line laser radar;
performing coordinate conversion on the first coordinate value to obtain a second coordinate value, wherein the second coordinate value is determined by referring to a coordinate system of the driving equipment;
determining a type of the obstacle based on the second coordinate value.
2. The method of claim 1, wherein said determining the type of the obstacle based on the second coordinate value comprises:
determining a point of the second coordinate value, of which the z-axis coordinate value is greater than a longitudinal coordinate value corresponding to a preset height threshold, as an obstacle point;
performing a linear fit based on the barrier points;
determining a type of the obstacle based on the linear fit result.
3. The method of claim 2, wherein the coordinate transforming the first coordinate value to a second coordinate value comprises:
converting a coordinate system based on the first coordinate value to obtain a third coordinate value, wherein the first coordinate value is determined by referring to a polar coordinate system of the single line laser radar, and the third coordinate value is determined by referring to a Cartesian coordinate system of the single line laser radar;
and performing coordinate conversion based on the third coordinate value and a preset coordinate transformation matrix to obtain the second coordinate value.
4. The method of claim 3, wherein the coordinate system of the singlet lidar includes a coordinate system having a laser emission point of the singlet lidar as a coordinate origin;
the coordinate system of the driving device comprises a coordinate system with a central point of a rear wheel connecting shaft of the driving device as a coordinate origin.
5. The method of claim 4, wherein: the coordinate system of the single-line laser radar comprises a polar coordinate system taking a laser emission point of the single-line laser radar as a coordinate origin and a Cartesian coordinate system taking the laser emission point of the single-line laser radar as the origin;
the first coordinate value is a polar coordinate value under a polar coordinate system taking the laser emission point as a coordinate origin;
the third coordinate value is a cartesian coordinate value in a cartesian coordinate system with the laser emission point as the origin of coordinates.
6. The method of claim 5, wherein: the coordinate system of the driving equipment comprises a Cartesian coordinate system taking the center point of a rear wheel connecting shaft of the driving equipment as a coordinate origin;
the second coordinate value is a Cartesian coordinate value in a Cartesian coordinate system with the center point of the rear wheel connecting shaft as the origin of coordinates.
7. The method of claim 6, wherein the coordinate transformation matrix is determined based on a relative position between a laser emission point of the singlet lidar and a center point of a rear wheel connecting shaft of the apparatus, and a deflection angle of a laser scanning plane of the singlet lidar with respect to a bottom surface or a side surface of the apparatus.
8. The method of claim 7, wherein: obtaining the second coordinate value based on the third coordinate value and a preset coordinate transformation matrix, including:
and obtaining the second coordinate value based on the third coordinate value multiplied by the coordinate transformation matrix.
9. The method of claim 2, wherein: the types of the obstacles comprise slope obstacles and ground obstacles; the method further comprises the following steps:
determining the obstacle to be a slope obstacle under the condition that the linear fitting result meets a preset fitting condition;
and under the condition that the linear fitting result does not meet the preset fitting condition, determining that the obstacle is a ground obstacle.
10. The method of claim 9, wherein the linear fit comprises a straight line fit, wherein the preset fit conditions comprise a straight line fit error threshold and a fit straight line slope threshold, and wherein the method further comprises:
determining that the linear fitting result meets a preset fitting condition under the condition that the fitting error value of the linear fitting is smaller than the linear fitting error threshold value and the slope of the fitting straight line is larger than the fitting straight line slope threshold value;
and under the condition that the fitting error value of the straight line fitting is greater than the straight line fitting error threshold value or the slope of the fitting straight line is less than or equal to the fitting straight line slope threshold value, determining that the linear fitting result does not meet the preset fitting condition.
11. A drive apparatus, characterized by comprising: the system comprises a processor, a memory, a single-wire laser radar, a communication interface and a communication bus;
the single-wire laser radar, the memory and the communication interface are connected through the communication bus;
the memory is configured to store at least one instruction that, when executed by the processor, causes the drive apparatus to perform the obstacle detection method of any one of claims 1 to 10.
12. A computer-readable storage medium having stored thereon instructions which, when executed on a computer, cause the computer to perform the obstacle detection method of any one of claims 1 to 10.
CN202110528166.0A 2021-05-14 2021-05-14 Obstacle detection method, driving device, and storage medium Pending CN113296116A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110528166.0A CN113296116A (en) 2021-05-14 2021-05-14 Obstacle detection method, driving device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110528166.0A CN113296116A (en) 2021-05-14 2021-05-14 Obstacle detection method, driving device, and storage medium

Publications (1)

Publication Number Publication Date
CN113296116A true CN113296116A (en) 2021-08-24

Family

ID=77322227

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110528166.0A Pending CN113296116A (en) 2021-05-14 2021-05-14 Obstacle detection method, driving device, and storage medium

Country Status (1)

Country Link
CN (1) CN113296116A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114543725A (en) * 2022-01-26 2022-05-27 深圳市云鼠科技开发有限公司 Line laser calibration method, device, equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108398672A (en) * 2018-03-06 2018-08-14 厦门大学 Road surface based on the 2D laser radar motion scans that lean forward and disorder detection method
CN110032188A (en) * 2019-04-10 2019-07-19 湖南汽车工程职业学院 A kind of automatic obstacle avoiding method based on unmanned sightseeing electric car
CN110726993A (en) * 2019-09-06 2020-01-24 武汉光庭科技有限公司 Obstacle detection method using single line laser radar and millimeter wave radar
CN111308500A (en) * 2020-04-07 2020-06-19 三一机器人科技有限公司 Obstacle sensing method and device based on single-line laser radar and computer terminal
CN112255633A (en) * 2020-09-25 2021-01-22 中国矿业大学 Method for automatic unloading of unmanned dump truck in refuse landfill

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108398672A (en) * 2018-03-06 2018-08-14 厦门大学 Road surface based on the 2D laser radar motion scans that lean forward and disorder detection method
CN110032188A (en) * 2019-04-10 2019-07-19 湖南汽车工程职业学院 A kind of automatic obstacle avoiding method based on unmanned sightseeing electric car
CN110726993A (en) * 2019-09-06 2020-01-24 武汉光庭科技有限公司 Obstacle detection method using single line laser radar and millimeter wave radar
CN111308500A (en) * 2020-04-07 2020-06-19 三一机器人科技有限公司 Obstacle sensing method and device based on single-line laser radar and computer terminal
CN112255633A (en) * 2020-09-25 2021-01-22 中国矿业大学 Method for automatic unloading of unmanned dump truck in refuse landfill

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114543725A (en) * 2022-01-26 2022-05-27 深圳市云鼠科技开发有限公司 Line laser calibration method, device, equipment and storage medium
CN114543725B (en) * 2022-01-26 2023-08-18 深圳市云鼠科技开发有限公司 Line laser calibration method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
US10203409B2 (en) Method and device for the localization of a vehicle from a fixed reference map
AU2024201606A1 (en) Systems and methods for optical target based indoor vehicle navigation
CN110889808B (en) Positioning method, device, equipment and storage medium
CN111258320B (en) Robot obstacle avoidance method and device, robot and readable storage medium
JP2020087415A (en) Map construction and positioning of robot
CN112513679B (en) Target identification method and device
JP4971369B2 (en) Object detection method using swivelable sensor device
US20170276487A1 (en) Target recognition and localization methods using a laser sensor for wheeled mobile robots
US20210213962A1 (en) Method for Determining Position Data and/or Motion Data of a Vehicle
CN110794406B (en) Multi-source sensor data fusion system and method
US20200233061A1 (en) Method and system for creating an inverse sensor model and method for detecting obstacles
CN113093218A (en) Slope detection method, drive device, and storage medium
US20150198735A1 (en) Method of Processing 3D Sensor Data to Provide Terrain Segmentation
Fu et al. SLAM for mobile robots using laser range finder and monocular vision
Wettach et al. Dynamic frontier based exploration with a mobile indoor robot
CN113296116A (en) Obstacle detection method, driving device, and storage medium
CN112990151B (en) Precision detection method of obstacle detection module and electronic equipment
CN112731337B (en) Map construction method, device and equipment
CN112734619B (en) Free-form surface coverage viewpoint automatic sampling method for detecting feature scanning uncertainty
CN114859380A (en) Cliff detection method, driving device and storage medium
CN115902839A (en) Port laser radar calibration method and device, storage medium and electronic equipment
US20230359186A1 (en) System And Method for Controlling a Mobile Industrial Robot Using a Probabilistic Occupancy Grid
CN113050103A (en) Ground detection method, device, electronic equipment, system and medium
CN114529539A (en) Method and device for detecting road surface obstacle of unmanned equipment, unmanned equipment and storage medium
Lecking et al. Localization in a wide range of industrial environments using relative 3D ceiling features

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination