CN112700471B - Collision detection method, apparatus, and computer-readable storage medium - Google Patents


Info

Publication number
CN112700471B
Authority
CN
China
Prior art keywords
boundary
component
collision detection
coordinate value
detection method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011627667.6A
Other languages
Chinese (zh)
Other versions
CN112700471A
Inventor
吴博文
朱林楠
杨林
黄健东
陈凌之
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Midea Group Co Ltd
Guangdong Midea White Goods Technology Innovation Center Co Ltd
Original Assignee
Midea Group Co Ltd
Guangdong Midea White Goods Technology Innovation Center Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Midea Group Co Ltd and Guangdong Midea White Goods Technology Innovation Center Co Ltd
Priority to CN202011627667.6A
Publication of CN112700471A
Application granted
Publication of CN112700471B
Status: Active

Classifications

    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06F 18/2135: Feature extraction by transforming the feature space, based on approximation criteria, e.g. principal component analysis
    • G06F 18/2321: Non-hierarchical clustering techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F 18/2433: Classification techniques with a single-class perspective, e.g. one-against-all classification; novelty detection; outlier detection
    • G06T 1/0014: Image feed-back for automatic industrial control, e.g. robot with camera
    • G06T 17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/10028: Range image; depth image; 3D point clouds
    • G06T 2207/30241: Trajectory
    • G06T 2210/12: Bounding box
    • G06T 2210/21: Collision detection, intersection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Multimedia (AREA)
  • Robotics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a collision detection method, an apparatus, and a computer-readable storage medium. The collision detection method includes: determining a first boundary of a first object; determining a second boundary of the first object using the first boundary and a first prediction coefficient; and determining whether a second object at least partially falls within the second boundary, so as to determine whether the first object collides with the second object. In this way, both the accuracy and the speed of collision detection can be improved.

Description

Collision detection method, apparatus, and computer-readable storage medium
Technical Field
The present invention relates to the field of automation technology, and in particular, to a collision detection method, an apparatus, and a computer-readable storage medium.
Background
With the rapid development of artificial intelligence, intelligent manufacturing has become a focus of attention. Assembly robots, for example, can effectively replace traditional, complex manual assembly processes; particularly in mass production, they greatly reduce manufacturing costs, improve production efficiency, and accelerate industrial automation and the intelligentization of manufacturing. However, because objects tend to move relative to one another, collisions may occur between different objects, and in industrial settings with high precision requirements even a minor collision can bring an entire assembly line to a standstill. Therefore, when an industrial robot plans its path for an assembly task, it must avoid moving or stationary obstacles in real time, which requires collision detection. In existing collision detection, the speed and accuracy of detection are often difficult to balance, and improvement is needed.
Disclosure of Invention
The present invention mainly solves the technical problem of providing a collision detection method, an apparatus, and a computer-readable storage medium that can improve both the accuracy and the speed of collision detection.
In order to solve the above technical problem, one technical solution adopted by the present invention is to provide a collision detection method, including: determining a first boundary of a first object; determining a second boundary of the first object using the first boundary and a first prediction coefficient; and determining whether a second object at least partially falls within the second boundary, so as to determine whether the first object collides with the second object.
Wherein determining whether the second object at least partially falls within the second boundary to determine whether the first object collides with the second object includes: in response to part of the second object falling within the second boundary, determining whether a first component collides with the first object, the first component being the portion of the second object that falls within the second boundary.
Wherein determining whether the first component collides with the first object includes: determining a third boundary of the first component; determining a fourth boundary of the first component using the third boundary and a second prediction coefficient; and determining whether the first object at least partially falls within the fourth boundary, so as to determine whether the first component collides with the first object.
Wherein, in response to part of the first object falling within the fourth boundary, it is determined whether a second component collides with the first component, the second component being the portion of the first object that falls within the fourth boundary.
Wherein determining whether the first component collides with the second component includes: determining whether the minimum distance between the first component and the second component is smaller than an early-warning value.
Wherein, in response to the minimum distance between the first component and the second component being smaller than the early-warning value, it is determined that the first component and the second component may collide; or, in response to the minimum distance being greater than or equal to the early-warning value, it is determined that the first component and the second component will not collide.
Wherein determining whether the minimum distance between the first component and the second component is smaller than the early-warning value includes: acquiring point cloud data of the first component and of the second component respectively; calculating the minimum distance between the point cloud of the first component and the point cloud of the second component; and determining whether this minimum distance is smaller than the early-warning value.
Wherein the first prediction coefficient is the same as the second prediction coefficient.
The fourth boundary coordinate value is the sum of the third boundary coordinate value and the second prediction coefficient; the fourth boundary coordinate value is the coordinate value of the boundary point on the fourth boundary, and the third boundary coordinate value is the coordinate value of the boundary point on the third boundary.
The second prediction coefficient is the product of the running speed of the first object and a prediction interval, and the prediction interval is the time interval between the current moment and the predicted next moment.
The second boundary coordinate value is the sum of the first boundary coordinate value and the first prediction coefficient; the second boundary coordinate value is the coordinate value of the boundary point on the second boundary, and the first boundary coordinate value is the coordinate value of the boundary point on the first boundary.
Wherein determining whether the first component collides with the first object includes: determining whether the minimum distance between the first component and the whole of the first object is smaller than the early-warning value.
Wherein determining whether the second object at least partially falls within the second boundary range to determine whether the first object collides with the second object includes: in response to the second object not falling within the second boundary, determining that the first object and the second object will not collide.
Wherein the envelope algorithm comprises any one of an axis-aligned bounding box algorithm, an oriented bounding box algorithm, a discrete oriented polytope bounding box algorithm, and a sphere bounding box algorithm.
Wherein the method further includes: acquiring original point cloud data of the first object and of the second object respectively; performing three-dimensional reconstruction on the original point cloud data of each to obtain dense point cloud data of the first object and of the second object; and downsampling the dense point cloud data to obtain sample point cloud data of the first object and of the second object, the collision detection steps for the first object and the second object being executed on the basis of the sample point cloud data.
Wherein acquiring the original point cloud data of the first object and the second object respectively includes: acquiring the original point cloud data of each using a ToF depth camera.
Wherein all track points on the running trajectory of the first object from the initial moment to the current moment are acquired; the coordinate values of all the track points are obtained; the coordinate values of all the track points are combined into a feature vector; and the feature vector is processed using a density-based clustering method with noise (DBSCAN) to judge whether the running trajectory of the first object is abnormal.
Wherein, after the coordinate values of all the track points are obtained, the method includes: removing abnormal track points using an isolation forest algorithm.
Wherein, after the coordinate values of all the track points are combined into the feature vector, the method includes: performing dimensionality reduction on the feature vector using principal component analysis.
In order to solve the above technical problem, another technical solution adopted by the present invention is to provide a collision detection apparatus including a processor, the processor being configured to execute instructions to implement any of the collision detection methods described above.
In order to solve the above technical problem, yet another technical solution adopted by the present invention is to provide a computer-readable storage medium for storing instructions/program data that can be executed to implement any of the collision detection methods described above.
The beneficial effects of the invention are as follows: different from the prior art, the present invention provides a collision detection method in which a first boundary of a first object is determined; a second boundary of the first object is determined based on an envelope algorithm, using the first boundary and a first prediction coefficient; and whether a second object at least partially falls within the second boundary is determined, so as to determine whether the first object collides with the second object. When the distance between two objects is calculated, a critical portion of an object can be selected, the critical portion being the part of the object most likely to collide with the other object, and the distance between the critical portions of the two objects, or between the critical portion of one object and the whole of the other object, can be calculated. The distance between the two whole objects therefore does not need to be calculated, which further reduces the amount of computation required for collision detection.
Drawings
FIG. 1 is a flow chart of a collision detection method according to an embodiment of the present application;
FIG. 2 is a flow chart of a collision detection method according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a collision detection apparatus in an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and effects of the present application clearer and more specific, the present application will be described in further detail below with reference to the accompanying drawings and examples.
Because objects tend to move relative to one another, there may be a risk of collision between different objects. Particularly in industrial fields with high precision requirements, such as industrial robotics, a robot performing path planning must avoid moving or stationary obstacles in real time, which requires collision detection. Collision detection generally involves at least two objects in relative motion, although more objects may be involved. The present application describes the collision detection method using two relatively moving objects as an example, but is not limited thereto. For convenience of description, the two objects involved in collision detection are referred to as the first object and the second object; the specific identities of the two objects are not limited, and the roles of first object and second object may be interchanged.
Of the two relatively moving objects involved in collision detection, generally one is stationary and one is moving; of course, both objects may also be moving. For ease of calculation, when both objects are moving, one of them may be treated as stationary relative to the other, in which case the speed of the moving object is taken as the sum of the speeds of the two objects. The present application describes the collision detection method using one moving object and one stationary object as an example, but is not limited thereto. The moving object may be taken as the first object and the stationary object as the second object, although the reverse is equally possible.
When performing collision detection, the geometry of an object that may collide is often quite complex, and any position on the object may be involved in a collision; detecting collisions exactly would therefore require a large amount of computation and make collision detection very complex. To simplify collision detection, a complex object can be approximated by a simple geometric body (such as a sphere, cuboid, or ellipsoid): the original object is enclosed in a simple bounding volume, and if another object does not collide with the bounding volume, it cannot collide with the object inside it. Such a simplified enclosure can be obtained with an envelope algorithm, which may be any one of the axis-aligned bounding box (AABB) algorithm, the oriented bounding box (OBB) algorithm, the discrete oriented polytope (k-DOP) bounding box algorithm, or the sphere bounding box algorithm; different envelope algorithms enclose the original object with correspondingly shaped bounding volumes. The present application explains the collision detection method using the AABB algorithm as an example.
In collision detection based on an envelope algorithm, an object is abstracted into a bounding volume, which reduces the amount of computation and increases speed, but at the cost of detection precision. The present application therefore provides an optimized collision detection method that applies different levels of checking in different situations, so that collision detection can meet requirements on both speed and precision.
In one embodiment, if two objects are far apart, it is sufficient to roughly check whether the other object lies within the envelope range to determine whether the two objects may collide. Specifically, if the other object does not lie within the envelope range, it is determined that the two objects will not collide; if it does, it is determined that the two objects may collide.
In one embodiment, if the two objects are relatively close, it is first checked whether the other object lies within the envelope range; if not, it is determined that the two objects will not collide. If the other object does lie within the envelope, the distance between the two objects must then be calculated to further determine whether they will collide. In the method provided by the present application, when the distance between two objects is calculated, a critical portion of an object can be selected, the critical portion being the part of the object most likely to collide with the other object, and the distance between the critical portions of the two objects, or between the critical portion of one object and the whole of the other object, is calculated. The distance between the two whole objects does not need to be calculated, which further reduces the amount of computation required for collision detection.
Referring to FIG. 1, FIG. 1 is a flow chart of a collision detection method according to an embodiment of the present application. In this embodiment, the collision detection method includes:
S110: A first boundary of a first object is determined.
The first boundary is the base boundary of the first object. The first boundary may be determined using an envelope algorithm, the bounding volume enclosed by the first boundary being the smallest bounding volume capable of enclosing the first object. In other words, the first boundary is an abstract geometric boundary of the first object, and whether another object collides with the first object can be judged by whether it reaches the first boundary. Spatial position data of the first object can be acquired and its minimum circumscribed cube computed, giving the smallest bounding volume that encloses the first object and hence the first boundary.
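For illustration only (this sketch is not part of the claimed method), the minimum axis-aligned bounding box of a point cloud can be computed as below; Python with NumPy is assumed, and the helper name aabb is hypothetical:

```python
import numpy as np

def aabb(points):
    """First boundary of an object: the two diagonally opposite corner
    points (min(x), min(y), min(z)) and (max(x), max(y), max(z)) of its
    (N, 3) point cloud."""
    points = np.asarray(points, dtype=float)
    return points.min(axis=0), points.max(axis=0)
```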
S120: a second boundary of the first object is determined using the first boundary and the first prediction coefficient.
The second boundary is a predicted boundary of the first object. A prediction area can be drawn around the first object and used to identify objects likely to collide with it; in particular, objects falling within the prediction area are considered likely to collide with the first object.
The second boundary may be obtained by expanding the first boundary outward by the prediction coefficient: the first boundary of the first object is enlarged using the first prediction coefficient, and the resulting region, which covers where a collision with the first object may occur, is the second boundary. The first prediction coefficient can be set according to the required detection precision; if high speed and a small amount of computation are required, the prediction coefficient can be set smaller, so that fewer enclosed points need to be checked.
The second boundary can thus be obtained by enlarging the enclosed range on the basis of the first boundary. Namely, the second boundary coordinate value is the sum of the first boundary coordinate value and the first prediction coefficient, where the second boundary coordinate value is the coordinate value of a boundary point on the second boundary and the first boundary coordinate value is the coordinate value of a boundary point on the first boundary.
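A minimal sketch of this expansion, under one stated assumption: so that the box grows in every direction, the minimum corner is offset outward (by -h) while the maximum corner is offset by +h, whereas the text above expresses both offsets as sums with the prediction coefficient. The helper builds on aabb from the sketch above:

```python
def expand_aabb(lo, hi, h):
    """Second (predicted) boundary: offset each corner of the base
    boundary by the prediction coefficient h so the box encloses the
    region where a collision may occur.

    Note: moving the min corner by -h is an assumed interpretation;
    the patent text states both corner offsets as sums with h."""
    return lo - h, hi + h
```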
S130: it is determined whether the second object at least partially falls within the second boundary.
If the second object does not fall within the second boundary, it is determined that the second object and the first object will not collide. If part of the second object falls within the second boundary, the second object may collide with the first object, and it is then specifically determined whether the first component collides with the first object.
The first component is the portion of the second object that falls within the second boundary. This portion is the critical part of the second object that may collide with the first object; that is, the second boundary is used to select the critical part of the second object most likely to collide with the first object, and it is then determined whether this critical part collides with the first object. Whether the second object collides with the first object can thus be determined by calculating only whether the first component collides with the first object.
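A hedged sketch of how the first component might be selected: a simple per-axis containment test picks out the points of the second object that fall within the second boundary (lo2 and hi2 are assumed to come from the expansion sketch above):

```python
def points_inside(points, lo, hi):
    """Boolean mask marking which points of an (N, 3) cloud lie within
    the axis-aligned box [lo, hi] on every axis."""
    points = np.asarray(points, dtype=float)
    return np.all((points >= lo) & (points <= hi), axis=1)

# First component T1: the part of the second object inside the second boundary.
# mask = points_inside(second_object_points, lo2, hi2)
# t1 = second_object_points[mask]
```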
In one embodiment, after the critical portion of the second object (i.e., the first component) has been selected, the distances between the first component and all points on the first object can be calculated directly, and whether the minimum distance is smaller than the early-warning value then determines whether the first component may collide with the first object. In other embodiments, the envelope algorithm can be applied again to select the critical portion of the first object that may collide with the first component, and whether that critical portion collides with the first component then determines whether the first object collides with the second object. In this way, the amount of computation can be reduced even further.
Specifically, a third boundary of the first component may be determined, the third boundary being the base boundary of the first component. A fourth boundary of the first component, its predicted boundary, is then determined based on the envelope algorithm, using the third boundary and a second prediction coefficient. The fourth boundary may be obtained by enlarging the enclosed range on the basis of the third boundary: the fourth boundary coordinate value is the sum of the third boundary coordinate value and the second prediction coefficient, where the fourth boundary coordinate value is the coordinate value of a boundary point on the fourth boundary and the third boundary coordinate value is the coordinate value of a boundary point on the third boundary. It is then determined whether the first object at least partially falls within the fourth boundary. In response to part of the first object falling within the fourth boundary, it is determined whether the second component collides with the first component, the second component being the portion of the first object that falls within the fourth boundary. In this embodiment, the fourth boundary region is thus used to select the critical portion of the first object most likely to collide with the first component; the principle is as described above and is not repeated here. The first prediction coefficient and the second prediction coefficient may be the same or different.
In the above embodiment, by alternately applying the envelope algorithm to the first object and the second object, the critical portions likely to collide (the second component and the first component) can be selected, and determining whether these two components collide suffices to determine whether the second object collides with the first object. In this way, the amount of computation for collision detection is reduced while its accuracy is maintained.
The method could be iterated further to select ever more precise critical portions, but this would increase computation time and delay the judgment. To balance the real-time performance, accuracy, and computational cost of collision detection, the method may iterate twice, selecting the critical portions on the first object and the second object that are likely to collide.
In one embodiment, whether the minimum distance between the first component and the second component is smaller than an early-warning value may be determined, so as to determine whether the first component collides with the second component.
In response to the minimum distance between the first component and the second component being smaller than the early-warning value, it is determined that the first component and the second component may collide; or, in response to the minimum distance being greater than or equal to the early-warning value, it is determined that they will not collide. The early-warning value can be set according to the precision requirements of collision detection.
In one embodiment, determining whether the minimum distance between the first component and the second component is smaller than the early-warning value includes: acquiring point cloud data of the first component and of the second component respectively; calculating the minimum distance between the point cloud of the first component and the point cloud of the second component; and determining whether this minimum distance is smaller than the early-warning value.
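One way this minimum point cloud distance could be computed is with a k-d tree, as sketched below; SciPy is assumed available, and warning_value is a placeholder for the early-warning value:

```python
from scipy.spatial import cKDTree

def min_cloud_distance(cloud_a, cloud_b):
    """Smallest Euclidean distance from any point of cloud_a to any
    point of cloud_b, both given as (N, 3) arrays."""
    dists, _ = cKDTree(cloud_b).query(cloud_a, k=1)
    return dists.min()

# A collision is flagged when the separation drops below the early-warning value:
# may_collide = min_cloud_distance(t1, t2) < warning_value
```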
A ToF depth camera can be used to collect the original point cloud data of the first object and of the second object respectively, and three-dimensional reconstruction can be performed on each to obtain dense point cloud data of the first object and of the second object. Depth-camera-based three-dimensional reconstruction takes an RGB image and a depth image as input and recovers a sparse point cloud model of the object; matching the sparse point cloud against a mathematical model then reconstructs a dense point cloud model. Each pixel value in the depth image is the distance from the corresponding point in the scene to the vertical plane in which the depth camera lies. The dense point cloud data of the first object and the second object are then downsampled to obtain sample point cloud data, and whether the first object collides with the second object is judged on the basis of the sample point clouds, as sketched below. Downsampling and sparsification increase the speed of collision detection and improve the real-time performance of positioning.
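The embodiment does not fix a particular downsampling scheme; one simple possibility is voxel-grid sparsification, sketched here as an assumption rather than the patent's exact method:

```python
def voxel_downsample(points, voxel_size):
    """Keep one point per occupied voxel cell, thinning a dense cloud
    into sample point cloud data."""
    points = np.asarray(points, dtype=float)
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, first_idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(first_idx)]
```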
In the above embodiment, the ToF depth camera performs 3D point cloud acquisition and dense reconstruction of the surroundings and locates the first object and the second object in real time, while the AABB envelope algorithm is applied alternately to select the critical portions. The detection boundary is expanded outward by a preset collision early-warning coefficient, so that the amount of minimum-distance computation remains closely tied to the separation of the objects as they move, ensuring that collision detection meets real-time requirements.
Referring to FIG. 2, FIG. 2 is a flow chart of a collision detection method according to an embodiment of the present application. In this embodiment, the collision detection method includes:
S210: A first boundary of a first object is determined.
A ToF depth camera can be used to collect 3D point cloud data of the first object, and the minimum circumscribed cube of the acquired point cloud is computed to obtain the first boundary.
S220: a second boundary of the first object is determined using the first boundary and the first prediction coefficient.
Boundary points of the first boundary are selected, for example the two diagonally opposite corner points of the cube, (min(x), min(y), min(z)) and (max(x), max(y), max(z)), and the minimum cube is expanded outward to obtain the second boundary. The boundary points of the second boundary, i.e., the two diagonally opposite corner points of the expanded cube, are (min(x)+h, min(y)+h, min(z)+h) and (max(x)+h, max(y)+h, max(z)+h), where h is the first prediction coefficient.
S230: it is determined whether the second object at least partially falls within the second boundary.
The ToF depth camera can likewise be used to collect 3D point cloud data of the second object, and the point cloud coordinates of the second object are compared against the coordinates of the second boundary to determine whether the second object at least partially falls within it.
S240: a third boundary of the first component is determined, and a fourth boundary of the first component is determined using the third boundary and the second prediction coefficient.
The first component is the portion of the second object that falls within the second boundary, and its point cloud data may be denoted T1. The minimum circumscribed cube of T1 gives the third boundary of the first component, and the fourth boundary of the first component is then determined using the third boundary and the second prediction coefficient. For example, the two diagonally opposite corner points of the cube, (min(x1), min(y1), min(z1)) and (max(x1), max(y1), max(z1)), are selected as boundary points, and the minimum cube is expanded outward to obtain the fourth boundary, whose diagonally opposite corner points are (min(x1)+h, min(y1)+h, min(z1)+h) and (max(x1)+h, max(y1)+h, max(z1)+h), h being the prediction coefficient, which in this embodiment is the same for both expansions.
S250: it is determined whether the first object at least partially falls within the fourth boundary.
The point cloud coordinates of the first object are compared against the coordinates of the fourth boundary to determine whether the first object at least partially falls within it.
S260: It is determined whether the minimum distance between the first component and the second component is smaller than the early-warning value.
The second component is the portion of the first object that falls within the fourth boundary, and its point cloud data may be denoted T2. The minimum distance d between T1 and T2 is computed; if d is smaller than h, where h serves as the early-warning value, a collision is predicted; otherwise no collision occurs. That is, in this embodiment the first prediction coefficient and the second prediction coefficient are the same and are equal to the early-warning value.
In an embodiment of the present application, the position of an object at the current moment can be combined with its running speed to predict in advance whether two objects will collide at the next moment. In this embodiment, the first prediction coefficient may be a displacement value of the object, the displacement value being the product v×t of the running speed v of the first object and a prediction interval t, where the prediction interval is the time interval between the current moment and the predicted next moment.
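Tying steps S210 through S260 together, a minimal end-to-end sketch is shown below, reusing the hypothetical helpers from the earlier sketches; h = v×t is taken as both the prediction coefficient and the early-warning value, as in this embodiment:

```python
def detect_collision(obj1_pts, obj2_pts, v, t):
    """Two-level collision check following S210-S260 (illustrative)."""
    h = v * t                                  # prediction coefficient
    lo1, hi1 = aabb(obj1_pts)                  # S210: first boundary
    lo2, hi2 = expand_aabb(lo1, hi1, h)        # S220: second boundary
    m1 = points_inside(obj2_pts, lo2, hi2)     # S230: containment test
    if not m1.any():
        return False                           # second object outside: no collision
    t1 = obj2_pts[m1]                          # first component T1
    lo3, hi3 = aabb(t1)                        # S240: third boundary
    lo4, hi4 = expand_aabb(lo3, hi3, h)        #       fourth boundary
    m2 = points_inside(obj1_pts, lo4, hi4)     # S250: containment test
    if not m2.any():
        return False
    t2 = obj1_pts[m2]                          # second component T2
    return min_cloud_distance(t1, t2) < h      # S260: early-warning test
```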
In one embodiment, the trajectory may also be subjected to anomaly detection prior to collision detection.
Abnormal noise in the object's position coordinates from the initial moment to the current moment can be filtered out using an isolation forest algorithm, yielding cleaner trajectory coordinates. The DBSCAN density clustering algorithm is then applied to these trajectory coordinates to judge whether the trajectory at the current moment is abnormal. The input to DBSCAN is a feature vector composed of the coordinates of all points of the current trajectory from the initial moment to the current moment. Because the feature vectors constructed in this way have a large dimension, PCA can be used for dimensionality reduction. If the trajectory is abnormal, the entire operation of the object is terminated early; otherwise, the collected point cloud data are used to judge whether the minimum distance between the two objects predicted for the next moment is smaller than the collision early-warning value. If it is smaller, a collision is predicted; otherwise no collision occurs and operation continues. Whether the trajectory is abnormal may be checked once before each collision judgment, or at predetermined intervals.
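A hedged sketch of this trajectory check using scikit-learn (assumed available); treating each complete trajectory as one flattened feature vector that DBSCAN can label as noise relative to a set of reference trajectories is one plausible reading of the description above, not the patent's exact procedure:

```python
from sklearn.cluster import DBSCAN
from sklearn.decomposition import PCA
from sklearn.ensemble import IsolationForest

def drop_outlier_points(track):
    """Filter abnormal coordinate noise from one (N, 3) trajectory with
    an isolation forest (fit_predict returns 1 for inliers)."""
    keep = IsolationForest(random_state=0).fit_predict(track) == 1
    return track[keep]

def current_track_abnormal(reference_tracks, current_track, eps=0.5):
    """Flag the current trajectory as abnormal when DBSCAN labels its
    PCA-reduced feature vector as noise. reference_tracks is a list of
    (N, 3) arrays assumed resampled to the same number of points."""
    feats = np.stack([t.ravel() for t in reference_tracks + [current_track]])
    feats = PCA(n_components=min(8, len(feats) - 1)).fit_transform(feats)
    labels = DBSCAN(eps=eps, min_samples=3).fit_predict(feats)
    return labels[-1] == -1  # DBSCAN noise label => abnormal trajectory
```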
In the above embodiment, a clustering algorithm is first used to detect abnormal trajectories of the robot arm, so that abnormal motion can be reacted to in advance and in real time, reducing unnecessary subsequent computation. DBSCAN clustering is used for abnormal-trajectory detection, taking as input the feature vector formed by the coordinates of all points of the current trajectory from the initial moment to the current moment; this detects not only suddenly occurring abnormal coordinate points but also abnormal trajectories that accumulate over time. During collision detection, the scheme uses the running speed of the robot arm in the current frame and the time interval between adjacent frames to estimate whether there is a risk of collision in the next frame, making the judgment in advance. Through the combination of these functions, the probability of collision is greatly reduced, as is the amount of computation in the collision detection process.
Referring to FIG. 3, FIG. 3 is a schematic structural diagram of a collision detection apparatus in an embodiment of the present application. In this embodiment, the collision detection apparatus 10 includes a processor 11.
The processor 11 may also be referred to as a CPU (Central Processing Unit). The processor 11 may be an integrated circuit chip with signal processing capabilities. The processor 11 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The general-purpose processor may be a microprocessor, or the processor 11 may be any conventional processor or the like.
The collision detection apparatus 10 may further include a memory (not shown) for storing instructions and data required for the operation of the processor 11.
The processor 11 is configured to execute instructions to implement the method provided by any of the embodiments of the collision detection method of the present application and any non-conflicting combination described above.
The collision detection apparatus may be a computer device such as a single server, a server cluster, or the like.
Referring to FIG. 4, FIG. 4 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present application. The computer-readable storage medium 30 of an embodiment of the present application stores instructions/program data 31 which, when executed, implement the method provided by any of the above-described embodiments of the collision detection method of the present application, or any non-conflicting combination thereof. The instructions/program data 31 may be stored in the storage medium 30 as a software product in the form of a program file, so that a computer device (which may be a personal computer, a server, a network device, or the like) or a processor executes all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium 30 includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code, or a terminal device such as a computer, a server, a mobile phone, or a tablet.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of elements is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The foregoing description is only of embodiments of the present application, and is not intended to limit the scope of the application, and all equivalent structures or equivalent processes using the descriptions and the drawings of the present application or directly or indirectly applied to other related technical fields are included in the scope of the present application.

Claims (9)

1. A collision detection method, characterized by comprising:
Determining a first boundary of a first object; the first boundary is the base boundary of the first object, and the bounding volume enclosed by the first boundary is the smallest bounding volume capable of enclosing the first object;
determining a second boundary of the first object using the first boundary and a first prediction coefficient; the second boundary is a predicted boundary of the first object, and is obtained by expanding the first boundary of the first object outward by the first prediction coefficient;
determining whether a second object at least partially falls within the second boundary, so as to determine whether the first object collides with the second object;
wherein, in response to part of the second object falling within the second boundary, the determining whether the first object collides with the second object comprises:
determining a third boundary of a first component, the first component being the portion of the second object that falls within the second boundary; the third boundary is the base boundary of the first component, and the bounding volume enclosed by the third boundary is the smallest bounding volume capable of enclosing the first component;
determining a fourth boundary of the first component using the third boundary and a second prediction coefficient; the fourth boundary is a predicted boundary of the first component, and is obtained by expanding the third boundary of the first component outward by the second prediction coefficient; and
determining whether the first object at least partially falls within the fourth boundary.
2. The collision detection method according to claim 1, wherein,
In response to part of the first object falling within the fourth boundary, the determining whether the first object collides with the second object comprises:
determining whether the minimum distance between the first component and a second component is smaller than an early-warning value, wherein the second component is the portion of the first object that falls within the fourth boundary.
3. The collision detection method according to claim 2, wherein,
Determining that the first component and the second component collide in response to the minimum distance between the first component and the second component being smaller than the early-warning value; or
determining that the first component and the second component will not collide in response to the minimum distance between the first component and the second component being greater than or equal to the early-warning value.
4. The collision detection method according to claim 1, wherein,
The first prediction coefficient is the same as the second prediction coefficient.
5. The collision detection method according to claim 1, wherein,
The second boundary coordinate value is the sum of the first boundary coordinate value and the first prediction coefficient, wherein the second boundary coordinate value is the coordinate value of a boundary point on the second boundary, and the first boundary coordinate value is the coordinate value of a boundary point on the first boundary; and
the fourth boundary coordinate value is the sum of the third boundary coordinate value and the second prediction coefficient, wherein the fourth boundary coordinate value is the coordinate value of a boundary point on the fourth boundary, and the third boundary coordinate value is the coordinate value of a boundary point on the third boundary.
6. The collision detection method according to any one of claims 1 to 5, wherein,
The boundary of an object is determined by an envelope algorithm, wherein the envelope algorithm comprises any one of an axis-aligned bounding box algorithm, an oriented bounding box algorithm, a discrete oriented polytope bounding box algorithm, and a sphere bounding box algorithm.
7. The collision detection method according to claim 1, characterized in that the method further comprises:
Respectively acquiring original point cloud data of the first object and the second object;
respectively carrying out three-dimensional reconstruction on the original point cloud data of the first object and the second object to obtain dense point cloud data of the first object and the second object;
And downsampling the dense point cloud data of the first object and the second object to obtain sample point cloud data of the first object and the second object, and executing a collision detection step of the first object and the second object based on the sample point cloud data.
8. A collision detection apparatus, comprising a processor for executing instructions to implement the collision detection method according to any one of claims 1 to 7.
9. A computer-readable storage medium for storing instructions/program data executable to implement the collision detection method according to any one of claims 1-7.
CN202011627667.6A 2020-12-31 2020-12-31 Collision detection method, apparatus, and computer-readable storage medium Active CN112700471B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011627667.6A CN112700471B (en) 2020-12-31 2020-12-31 Collision detection method, apparatus, and computer-readable storage medium


Publications (2)

Publication Number Publication Date
CN112700471A CN112700471A (en) 2021-04-23
CN112700471B (en) 2024-06-07

Family

Family ID: 75513379

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011627667.6A Active CN112700471B (en) 2020-12-31 2020-12-31 Collision detection method, apparatus, and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN112700471B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116810493B (en) * 2023-08-31 2023-11-21 山东惠硕重工机械有限公司 Anti-collision detection method and system for numerical control machine tool based on data driving


Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012066690A (en) * 2010-09-24 2012-04-05 Fujitsu Ten Ltd Vehicle control system, vehicle control apparatus, and vehicle control method
CN104092180A (en) * 2014-06-20 2014-10-08 三星电子(中国)研发中心 Collision detection processing method and device
CN109844671A (en) * 2016-08-26 2019-06-04 克朗设备公司 The path confirmation and dynamic route modification of materials handling vehicle
CN108062600A (en) * 2017-12-18 2018-05-22 北京星云互联科技有限公司 A kind of vehicle collision prewarning method and device based on rectangle modeling
CN108714303A (en) * 2018-05-16 2018-10-30 深圳市腾讯网络信息技术有限公司 Collision checking method, equipment and computer readable storage medium
EP3739356A1 (en) * 2019-05-12 2020-11-18 Origin Wireless, Inc. Method, apparatus, and system for wireless tracking, scanning and monitoring
CN110232741A (en) * 2019-06-17 2019-09-13 腾讯科技(深圳)有限公司 Multilayer bounding box determines method, collision detection and motion control method and equipment
WO2020258218A1 (en) * 2019-06-28 2020-12-30 深圳市大疆创新科技有限公司 Obstacle detection method and device for mobile platform, and mobile platform
CN110428663A (en) * 2019-08-30 2019-11-08 合肥鑫晟光电科技有限公司 A kind of vehicle collision prewarning method, car-mounted terminal and server
CN111177888A (en) * 2019-12-09 2020-05-19 武汉光庭信息技术股份有限公司 Simulation scene collision detection method and system
CN111402633A (en) * 2020-03-23 2020-07-10 北京安捷工程咨询有限公司 Object anti-collision method based on UWB positioning and civil engineering anti-collision system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Improved Hybrid Bounding Box Collision Detection Algorithm; Wang Chao et al.; Journal of System Simulation; 2018-09-06; vol. 30, no. 11; pp. 4236-4243 *
Research and Discussion on Mechanical Design, Manufacturing and Automation (关于机械设计制造与自动化的研究探讨); Wu Jianjun (吴建军); Technology Wind (《科技风》); 2019-01-09; no. 1, 2019; p. 141 *
Research on Motion Planning Methods for Underground Coal Mine Mobile Robots (煤矿井下移动机器人运动规划方法研究); Yang Lin (杨林) et al.; Industry and Mine Automation (《工矿自动化》); 2020-06-16; vol. 46, no. 6; pp. 23-30 *
Research on Collision Detection in Digital Assembly of Aircraft Mating Parts (飞机插配零部件数字化装配碰撞检测研究); Li Lijuan (李丽娟) et al.; Machinery Design & Manufacture (《机械设计与制造》); 2020-02-08; no. 2, 2020; pp. 145-148 *

Also Published As

Publication number Publication date
CN112700471A (en) 2021-04-23


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant