CN112700471A - Collision detection method, device and computer-readable storage medium

Collision detection method, device and computer-readable storage medium

Info

Publication number
CN112700471A
CN112700471A (application CN202011627667.6A; granted as CN112700471B)
Authority
CN
China
Prior art keywords
boundary
component
coordinate value
point cloud
cloud data
Prior art date
Legal status
Granted
Application number
CN202011627667.6A
Other languages
Chinese (zh)
Other versions
CN112700471B (en)
Inventor
吴博文
朱林楠
杨林
黄健东
陈凌之
Current Assignee
Midea Group Co Ltd
Guangdong Midea White Goods Technology Innovation Center Co Ltd
Original Assignee
Midea Group Co Ltd
Guangdong Midea White Goods Technology Innovation Center Co Ltd
Priority date
Filing date
Publication date
Application filed by Midea Group Co Ltd, Guangdong Midea White Goods Technology Innovation Center Co Ltd
Priority to CN202011627667.6A
Publication of CN112700471A
Application granted
Publication of CN112700471B
Legal status: Active

Classifications

    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06F 18/2135: Feature extraction by transforming the feature space, based on approximation criteria, e.g. principal component analysis
    • G06F 18/2321: Non-hierarchical clustering techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F 18/2433: Classification techniques, single-class perspective, e.g. one-against-all classification; novelty detection; outlier detection
    • G06T 1/0014: Image feed-back for automatic industrial control, e.g. robot with camera
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/10028: Range image; depth image; 3D point clouds
    • G06T 2207/30241: Trajectory
    • G06T 2210/12: Bounding box
    • G06T 2210/21: Collision detection, intersection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Multimedia (AREA)
  • Robotics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

A collision detection method includes determining a first boundary of a first object; determining a second boundary of the first object using the first boundary and a first prediction coefficient; and determining whether a second object at least partially falls within the second boundary, so as to determine whether the first object and the second object will collide. In this way, both the accuracy and the speed of collision detection can be improved.

Description

Collision detection method, device and computer-readable storage medium
Technical Field
The present invention relates to the field of automation technologies, and in particular, to a collision detection method, device, and computer-readable storage medium.
Background
With the rapid development of artificial intelligence, intelligent manufacturing has gradually become a focus of attention. Assembly robots, for example, can effectively replace traditional, complex manual assembly processes, especially in batch production, greatly reducing manufacturing costs, improving production efficiency, and accelerating automation in the manufacturing field. However, because objects tend to move relative to one another, collisions may occur between different objects, and in industrial settings with high precision requirements even a minor collision accident can bring an entire production line to a standstill. Therefore, when an industrial robot plans a path for an assembly process, it must avoid moving or static obstacles in real time, which involves the problem of collision detection. Existing collision detection methods struggle to balance detection speed against detection accuracy, and improvement is urgently needed.
Disclosure of Invention
The invention mainly solves the technical problem of providing a collision detection method, a device, and a computer-readable storage medium that can improve both the accuracy and the speed of collision detection.
In order to solve the above technical problem, the invention adopts a technical solution of providing a collision detection method. The method includes: determining a first boundary of a first object; determining a second boundary of the first object using the first boundary and a first prediction coefficient; and determining whether a second object at least partially falls within the second boundary, so as to determine whether the first object and the second object will collide.
Wherein, determining whether the second object at least partially falls within the second boundary to determine whether the first object collides with the second object includes: in response to a portion of the second object falling within the second boundary, determining whether a first component collides with the first object, the first component being the portion of the second object that falls within the second boundary.
Wherein, determining whether the first component collides with the first object includes: determining a third boundary of the first component; determining a fourth boundary of the first component using the third boundary and a second prediction coefficient; and determining whether the first object at least partially falls within the fourth boundary, so as to determine whether the first component collides with the first object.
Wherein, in response to a portion of the first object falling within the fourth boundary, it is determined whether a second component collides with the first component, the second component being the portion of the first object that falls within the fourth boundary.
Wherein, determining whether the first component and the second component collide includes: determining whether the minimum distance between the first component and the second component is smaller than an early warning value.
Wherein, in response to the minimum distance between the first component and the second component being smaller than the early warning value, it is determined that the first component and the second component will collide; or, in response to the minimum distance being greater than or equal to the early warning value, it is determined that the first component and the second component will not collide.
Wherein, determining whether the minimum distance between the first component and the second component is smaller than the early warning value includes: acquiring point cloud data of the first component and of the second component respectively; calculating the minimum distance between the point cloud of the first component and the point cloud of the second component; and determining whether that minimum distance is smaller than the early warning value.
Wherein the first prediction coefficient is the same as the second prediction coefficient.
The fourth boundary coordinate value is the sum of the third boundary coordinate value and the second prediction coefficient; the fourth boundary coordinate value is the coordinate value of the boundary point on the fourth boundary, and the third boundary coordinate value is the coordinate value of the boundary point on the third boundary.
The second prediction coefficient is the product of the running speed of the first object and the prediction interval, and the prediction interval is the time interval between the current moment and the predicted next moment.
The second boundary coordinate value is the sum of the first boundary coordinate value and the first prediction coefficient; the second boundary coordinate value is a coordinate value of a boundary point on the second boundary, and the first boundary coordinate value is a coordinate value of a boundary point on the first boundary.
Wherein, determining whether the first component collides with the first object includes: determining whether the minimum distance between the first component and the whole of the first object is smaller than an early warning value, so as to determine whether the first component collides with the first object.
Wherein, determining whether the second object at least partially falls within the second boundary range to determine whether the first object collides with the second object includes: in response to no part of the second object falling within the second boundary, determining that the first object and the second object do not collide.
Wherein, the boundaries are determined using an envelope algorithm, the envelope algorithm including any one of an axis-aligned bounding box algorithm, an oriented bounding box algorithm, a discrete oriented polytope bounding box algorithm, and a sphere bounding box algorithm.
Wherein, the method includes: acquiring original point cloud data of the first object and of the second object respectively; performing three-dimensional reconstruction on the original point cloud data of the first object and of the second object respectively to obtain dense point cloud data of the first object and of the second object; and down-sampling the dense point cloud data of the first object and the second object to obtain sample point cloud data of the first object and the second object, the collision detection steps for the first object and the second object being performed on the basis of the sample point cloud data.
Wherein, acquiring the original point cloud data of the first object and of the second object respectively includes: acquiring the original point cloud data of the first object and of the second object respectively using a ToF (time-of-flight) depth camera.
Wherein, the method includes: acquiring all track points on the running trajectory of the first object from the initial moment to the current moment; obtaining the coordinate values of all the track points; combining the coordinate values of all the track points into a feature vector; and processing the feature vector using a density-based clustering method to determine whether the running trajectory of the first object is abnormal.
Wherein, after obtaining the coordinate values of all the track points, the method includes: removing abnormal track points using an isolation forest algorithm.
Wherein, after combining the coordinate values of all the track points into the feature vector, the method includes: performing dimensionality reduction on the feature vector using principal component analysis.
In order to solve the above technical problem, the invention adopts another technical solution: a collision detection device is provided, which includes a processor configured to execute instructions to implement any one of the collision detection methods described above.
In order to solve the above technical problem, the invention adopts yet another technical solution: a computer-readable storage medium is provided for storing instructions/program data that can be executed to implement any one of the collision detection methods described above.
The invention has the following beneficial effects. In contrast to the prior art, the present invention provides a collision detection method that determines a first boundary of a first object; determines, based on an envelope algorithm, a second boundary of the first object, the second boundary being obtained using the first boundary and a first prediction coefficient; and determines whether a second object at least partially falls within the second boundary, so as to determine whether the first object and the second object will collide. When the distance between the two objects has to be calculated, a key part of an object can first be selected, the key part being the portion of the object most likely to collide with the other object; the distance is then calculated between the key parts of the two objects, or between the key part of one object and the whole of the other object, so that the distance between the two whole objects never has to be calculated, further reducing the calculation amount of collision detection.
Drawings
FIG. 1 is a schematic flowchart of a collision detection method in an embodiment of the present application;
FIG. 2 is a schematic flowchart of a collision detection method in another embodiment of the present application;
FIG. 3 is a schematic structural diagram of a collision detection device in an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a computer-readable storage medium in an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solution, and effects of the present application clearer, the present application is further described in detail below with reference to the accompanying drawings and embodiments.
Because objects tend to move relative to one another, there may be a risk of collision between different objects, particularly in industrial fields with high precision requirements such as industrial robotics: when a robot plans its path, it must avoid moving or static obstacles in real time, and this process requires collision detection. Collision detection typically involves two objects moving relative to each other, but may involve more than two. The present application describes the collision detection method using two relatively moving objects as an example, without being limited thereto. For convenience of description, the two objects participating in collision detection are referred to as the first object and the second object. These designations are relative: when one object is called the first object, the opposing object is the second object. The specific identities of the two objects are not limited here, and the first object and the second object may be interchanged.
Collision detection involves two relatively moving objects, typically one stationary and one moving; of course, both objects may be moving. For ease of calculation, when both objects are moving, one of them may be treated as stationary relative to the other, in which case the speed of the moving object is taken as the sum of the moving speeds of the two objects. The collision detection method is described below using one moving object and one stationary object as an example, without being limited thereto. The moving object may be referred to as the first object and the stationary object as the second object, or vice versa.
When collision detection is performed, the geometry of the objects that may collide is usually quite complex, and any position on an object may be involved in a collision, so detecting collisions against the full geometry entails a large amount of calculation and becomes very complex. To simplify collision detection, a simple geometric volume (such as a sphere, a cuboid, or an ellipsoid) can be used to approximate a complex object; that is, a simple bounding volume encloses the original object, and if another object does not collide with the bounding volume, it cannot collide with the object inside. An envelope algorithm can be used to realize this simplified enclosure of an object. The envelope algorithm may be any one of the axis-aligned bounding box (AABB) algorithm, the oriented bounding box (OBB) algorithm, the discrete oriented polytope (k-DOP) bounding box algorithm, and the sphere bounding box algorithm; different envelope algorithms enclose the original object with a correspondingly shaped bounding volume. The present application explains the collision detection method using the AABB algorithm as an example.
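As a minimal illustrative sketch of the AABB envelope (Python with NumPy; the function and variable names are illustrative and not part of the disclosure), the two diagonally opposite corner points of the smallest axis-aligned box enclosing a point cloud can be computed as follows:
    import numpy as np
    def aabb(points):
        """Return (min corner, max corner) of the smallest axis-aligned box
        enclosing an (N, 3) point cloud, i.e. its minimum circumscribed cuboid."""
        return points.min(axis=0), points.max(axis=0)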
In envelope-based collision detection, abstracting an object into its envelope reduces the amount of computation and increases the operation speed, but sacrifices detection accuracy. The present application therefore provides an optimized collision detection method that applies different levels of detection to different situations, so that collision detection can satisfy both speed and accuracy requirements.
In one embodiment, if the two objects are far apart from each other, it suffices to roughly estimate whether the opposing object lies within the envelope range to decide whether the two objects collide. Specifically, if no opposing object lies within the envelope range, it is determined that the two objects do not collide; if an opposing object does lie within the envelope range, it is determined that the two objects collide.
In one embodiment, if the two objects are close to each other, it is first estimated whether the opposing object lies within the envelope range. If no opposing object lies within the envelope range, the two objects are determined not to collide; if an opposing object does lie within the envelope range, the distance between the two objects must additionally be calculated to further determine whether they collide. In the method of the present application, when this distance is calculated, a key part of an object can first be selected, the key part being the portion of the object most likely to collide with the other object; the distance is then calculated between the key parts of the two objects, or between the key part of one object and the whole of the other object, so that the distance between the two whole objects never has to be calculated, further reducing the calculation amount of collision detection.
Referring to FIG. 1, FIG. 1 is a schematic flowchart of a collision detection method according to an embodiment of the present disclosure. In this embodiment, the collision detection method includes:
S110: A first boundary of a first object is determined.
The first boundary is the base boundary of the first object. It may be determined using an envelope algorithm, the enclosure bounded by the first boundary being the smallest bounding volume capable of enclosing the first object. In other words, the first boundary is an abstracted geometric boundary of the first object, and whether something collides with the first object can be determined by determining whether it crosses the first boundary. In practice, the spatial position data of the first object can be acquired and its minimum circumscribed cube solved; this smallest bounding volume enclosing the first object yields the first boundary.
S120: a second boundary of the first object is determined using the first boundary and the first prediction coefficient.
The second boundary is a predicted boundary of the first object. A prediction region can be drawn around the first object and used to identify objects that may collide with it; specifically, an object falling within the prediction region may collide with the first object.
The second boundary can be obtained by extending the first boundary outward by a prediction coefficient. That is, the first boundary of the first object is expanded using the first prediction coefficient, predicting the region within which a collision with the first object may occur, and thereby obtaining the second boundary. The first prediction coefficient can be set according to the required detection accuracy; if high speed and a small calculation amount are required, the prediction coefficient can be set smaller, so that fewer points to be detected are enclosed.
The second boundary is thus obtained by enlarging the enclosed range on the basis of the first boundary: each second boundary coordinate value is the sum of the corresponding first boundary coordinate value and the first prediction coefficient, where the second boundary coordinate value is the coordinate value of a boundary point on the second boundary and the first boundary coordinate value is the coordinate value of a boundary point on the first boundary.
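A minimal sketch of this expansion, reading the "sum with the prediction coefficient" as an outward offset of each boundary point by h (the min corner moves by -h and the max corner by +h on every axis, consistent with the outward expansion described in the embodiments below; names are illustrative):
    def expand_boundary(bmin, bmax, h):
        """Offset the two corner points of an axis-aligned box outward by the
        prediction coefficient h, enlarging the enclosed range on every axis."""
        return bmin - h, bmax + h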
S130: it is determined whether the second object at least partially falls within the second boundary.
If no part of the second object falls within the second boundary, it is determined that the second object does not collide with the first object. If part of the second object falls within the second boundary, the second object may collide with the first object, and it is then specifically determined whether the first component collides with the first object.
The first component is the portion of the second object that falls within the second boundary. This portion is the key part of the second object that may collide with the first object; that is, the second boundary range can be used to delimit and select the key part of the second object most likely to collide with the first object, and it is then determined whether this key part will collide with the first object. Whether the second object collides with the first object can thus be determined by calculating and judging only whether the first component collides with the first object.
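A minimal sketch of this key-part selection, assuming the boundaries are AABBs and the point clouds are NumPy arrays (names are illustrative):
    import numpy as np
    def points_in_box(points, bmin, bmax):
        """Return the subset of an (N, 3) point cloud lying inside the
        axis-aligned box [bmin, bmax]; this subset is the selected key part."""
        mask = np.all((points >= bmin) & (points <= bmax), axis=1)
        return points[mask]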
In one embodiment, after the key part of the second object that may collide with the first object (i.e., the first component) has been selected, the distances between the first component and all points on the first object can be calculated directly, and it is determined whether the minimum distance is smaller than the early warning value, so as to determine whether the first component may collide with the first object. In other embodiments, the envelope algorithm can be applied once more to select the key part of the first object that may collide with the first component, and it is determined whether that key part may collide with the first component, so as to determine whether the first object may collide with the second object. In this way, the calculation amount can be reduced further.
Specifically, a third boundary of the first component may be determined, the third boundary being the base boundary of the first component. A fourth boundary of the first component, the predicted boundary of the first component, is then determined based on the envelope algorithm: the fourth boundary may be determined using the third boundary and a second prediction coefficient, i.e., obtained by enlarging the enclosed range on the basis of the third boundary. Each fourth boundary coordinate value is the sum of the corresponding third boundary coordinate value and the second prediction coefficient, where the fourth boundary coordinate value is the coordinate value of a boundary point on the fourth boundary and the third boundary coordinate value is the coordinate value of a boundary point on the third boundary. It is then determined whether the first object at least partially falls within the fourth boundary. In response to part of the first object falling within the fourth boundary, it is determined whether the second component collides with the first component, the second component being the portion of the first object that falls within the fourth boundary. In this embodiment, the fourth boundary range is used to delimit and select the key part of the first object most likely to collide with the first component. For details of the principle, refer to the description above, which is not repeated here. The first prediction coefficient and the second prediction coefficient may be the same or different.
In the above embodiment, by alternately applying the envelope algorithm to the first object and the second object, the key parts of the two objects that are likely to collide (the second component and the first component) can be selected, and whether the second object collides with the first object can be determined merely by determining whether the second component and the first component collide. In this way, the calculation amount of collision detection is reduced while its accuracy is preserved.
The iteration can be repeated more times to select ever more precise key parts, but this increases the calculation time and delays the judgment. To balance the real-time performance, accuracy, and calculation amount of collision detection, two iterations can be used, selecting the potentially colliding key parts on the first object and on the second object respectively.
In one embodiment, it may be determined whether the minimum distance between the first component and the second component is smaller than the early warning value, so as to determine whether the first component collides with the second component.
In response to the minimum distance between the first component and the second component being smaller than the early warning value, it is determined that the first component and the second component are likely to collide; in response to the minimum distance being greater than or equal to the early warning value, it is determined that they will not collide. The early warning value can be set according to the accuracy requirement of collision detection.
In one embodiment, determining whether the minimum distance between the first component and the second component is smaller than the early warning value includes: acquiring point cloud data of the first component and of the second component respectively; calculating the minimum distance between the point cloud of the first component and the point cloud of the second component; and determining whether that minimum distance is smaller than the early warning value.
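A minimal sketch of this minimum-distance test, assuming SciPy is available; a k-d tree makes the nearest-neighbour query far cheaper than a naive pairwise scan over both clouds (names are illustrative):
    from scipy.spatial import cKDTree
    def min_distance(pc_a, pc_b):
        """Smallest Euclidean distance between two (N, 3) point clouds."""
        dists, _ = cKDTree(pc_b).query(pc_a, k=1)  # nearest point in pc_b for each point of pc_a
        return float(dists.min())
    # a collision is predicted when min_distance(first component, second component) < early warning value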
A ToF depth camera can be used to acquire the original point cloud data of the first object and of the second object respectively, and three-dimensional reconstruction is performed on each to obtain dense point cloud data of the first object and of the second object. Depth-camera-based three-dimensional reconstruction takes an RGB image and a depth image as input and recovers a sparse point cloud three-dimensional model of the object; the sparse point cloud is then matched against a mathematical model to reconstruct a dense point cloud model. Each pixel value in the depth image is the distance from the corresponding scene point to the vertical plane in which the depth camera lies. The dense point cloud data of the first object and the second object can further be down-sampled to obtain sample point cloud data of the first object and the second object, on the basis of which it is determined whether the first object collides with the second object. Down-sampling and sparsifying the point clouds increases the operation speed of collision detection and improves the real-time performance of positioning.
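A minimal sketch of the down-sampling step as a voxel-grid filter that keeps one point per occupied voxel; the voxel size is an assumed tuning parameter and is not a value given in this disclosure:
    import numpy as np
    def voxel_downsample(points, voxel_size):
        """Keep one representative point per occupied voxel of edge length voxel_size."""
        keys = np.floor(points / voxel_size).astype(np.int64)  # voxel index of each point
        _, first = np.unique(keys, axis=0, return_index=True)  # first point seen in each voxel
        return points[np.sort(first)]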
In the above embodiment, a ToF depth camera performs 3D point cloud data acquisition and dense reconstruction of the surrounding environment, the first object and the second object are positioned in real time, and the AABB envelope algorithm is used to alternately select key parts. The detection boundary is expanded by a preset collision early-warning coefficient, so that the amount of minimum-distance calculation during collision detection remains closely tied to the distance between the objects as they move, ensuring the real-time requirement of the collision detection process.
Referring to FIG. 2, FIG. 2 is a schematic flowchart of a collision detection method according to another embodiment of the present disclosure. In this embodiment, the collision detection method includes:
S210: A first boundary of a first object is determined.
A ToF depth camera can be used to collect the 3D point cloud data of the first object, and the minimum circumscribed cube of the collected point cloud data of the first object is solved to obtain the first boundary.
S220: a second boundary of the first object is determined using the first boundary and the first prediction coefficient.
Boundary points of the first boundary can be selected, for example the two diagonally opposite corner points of the cube, (min(x), min(y), min(z)) and (max(x), max(y), max(z)), and the minimum cube is expanded outward to obtain the second boundary. The boundary points of the second boundary, i.e., the two diagonally opposite corner points of the expanded cube, are then (min(x) - h, min(y) - h, min(z) - h) and (max(x) + h, max(y) + h, max(z) + h), where h is the first prediction coefficient.
S230: it is determined whether the second object at least partially falls within the second boundary.
A ToF depth camera can be used to acquire the 3D point cloud data of the second object, and the point cloud coordinates of the second object are compared with the coordinates of the second boundary to determine whether the second object at least partially falls within the second boundary.
S240: a third boundary of the first component is determined, and a fourth boundary of the first component is determined using the third boundary and the second prediction coefficient.
The first component is the portion of the second object that falls within the second boundary, and its corresponding point cloud data may be denoted T1. The minimum circumscribed cube of T1 is solved to obtain the third boundary of the first component, and the fourth boundary of the first component is determined using the third boundary and the second prediction coefficient. For example, the two diagonally opposite corner points of the cube, (min(x1), min(y1), min(z1)) and (max(x1), max(y1), max(z1)), are selected as boundary points, and the minimum cube is expanded outward to obtain the fourth boundary, whose diagonally opposite corner points are (min(x1) - h, min(y1) - h, min(z1) - h) and (max(x1) + h, max(y1) + h, max(z1) + h), where h is the second prediction coefficient (equal to the first prediction coefficient in this embodiment).
S250: it is determined whether the first object is at least partially within the fourth boundary.
The point cloud coordinates of the first object are compared with the coordinates of the fourth boundary to determine whether the first object at least partially falls within the fourth boundary.
S260: It is determined whether the minimum distance between the first component and the second component is smaller than the early warning value.
The second component is the portion of the first object that falls within the fourth boundary, and its corresponding point cloud data may be denoted T2. The minimum distance d between T1 and T2 is calculated; if d < h, where h is the early warning value, a collision will occur, and otherwise no collision occurs. That is, in this embodiment the first prediction coefficient and the second prediction coefficient are the same and are equal to the early warning value.
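Tying steps S210 to S260 together, the following end-to-end sketch reuses the illustrative helpers above (aabb, expand_boundary, points_in_box, min_distance), with h serving as both the prediction coefficient and the early warning value, as in this embodiment:
    def will_collide(pc1, pc2, h):
        """pc1, pc2: (N, 3) point clouds of the first and second objects;
        h: prediction coefficient, also used as the early warning value."""
        lo, hi = expand_boundary(*aabb(pc1), h)  # S210/S220: first boundary expanded to second boundary
        t1 = points_in_box(pc2, lo, hi)          # S230: first component T1
        if len(t1) == 0:
            return False                         # second object entirely outside the second boundary
        lo, hi = expand_boundary(*aabb(t1), h)   # S240: third boundary expanded to fourth boundary
        t2 = points_in_box(pc1, lo, hi)          # S250: second component T2
        if len(t2) == 0:
            return False
        return min_distance(t1, t2) < h          # S260: d < h means a collision is predicted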
In the embodiments provided in the present application, the position of an object at the current moment can further be combined with its moving speed to predict in advance whether the two objects will collide at the next moment. In this case, the first prediction coefficient may be a displacement value of the object, namely the product of the running speed v of the first object and the prediction interval t (v * t), where the prediction interval is the time interval between the current moment and the predicted next moment.
In one embodiment, the trajectory may also be checked for anomalies before collision detection is performed.
Abnormal noise in the object's position coordinates from the initial moment to the current moment can be filtered out with an isolation forest algorithm, yielding cleaner trajectory coordinates for the object. The DBSCAN density clustering algorithm is then applied to the trajectory coordinates to determine whether the trajectory at the current moment is abnormal. The input to DBSCAN is a feature vector composed of the coordinates of all points of the current trajectory from the initial moment to the current moment. Since the feature vectors constructed this way have a large dimension, PCA can be used for dimensionality reduction. If the trajectory is abnormal, the whole operation of the object is terminated early. Otherwise, the collected point cloud data are used to determine whether the predicted minimum distance at the next moment between the two objects (for example, a vehicle body and a tire in an assembly scenario) is smaller than the collision early warning value; if it is smaller than this threshold, a collision will occur, and otherwise no collision occurs and operation continues. Whether the trajectory is abnormal may be determined once before every collision judgment, or at predetermined intervals.
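A minimal sketch of this anomaly check with scikit-learn, under the assumption that DBSCAN clusters the current trajectory's feature vector together with feature vectors of previously recorded trajectories of equal length; all hyperparameters and names are illustrative:
    import numpy as np
    from sklearn.ensemble import IsolationForest
    from sklearn.decomposition import PCA
    from sklearn.cluster import DBSCAN
    def clean_track(track):
        """Drop abnormal track points; the isolation forest labels outliers -1."""
        keep = IsolationForest(random_state=0).fit_predict(track) == 1
        return track[keep]
    def abnormal_trajectories(feature_vectors):
        """feature_vectors: (M, D) array, one flattened trajectory per row.
        Returns a boolean mask, True where DBSCAN labels a trajectory as noise."""
        n_comp = min(8, feature_vectors.shape[0], feature_vectors.shape[1])
        reduced = PCA(n_components=n_comp).fit_transform(feature_vectors)  # dimensionality reduction
        return DBSCAN(eps=0.5, min_samples=3).fit_predict(reduced) == -1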
In the above embodiment, the clustering algorithm is used to identify abnormal trajectories of the mechanical arm, so that abnormal motion can be reacted to in advance and in real time, reducing unnecessary operations. Because the DBSCAN-based anomaly check takes as input a feature vector composed of the coordinates of all points of the current trajectory from the initial moment to the current moment, it can detect not only sudden abnormal coordinate points but also abnormal trajectories that build up over time. During collision detection, the running speed of the current frame and the time interval between two adjacent frames are used to estimate whether there is a collision danger in the next frame, giving an advance judgment. The combination of these functions greatly reduces the probability of collision while also reducing the calculation amount of the collision detection process.
Referring to FIG. 3, FIG. 3 is a schematic structural diagram of a collision detection device according to an embodiment of the present disclosure. In this embodiment, the collision detection device 10 includes a processor 11.
The processor 11 may also be referred to as a CPU (Central Processing Unit). The processor 11 may be an integrated circuit chip having signal processing capabilities. The processor 11 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor 11 may be any conventional processor or the like.
The collision detection device 10 may further comprise a memory (not shown in the figures) for storing the instructions and data required for the operation of the processor 11.
The processor 11 is configured to execute instructions to implement the methods provided by any of the embodiments of the collision detection method of the present application and any non-conflicting combinations thereof.
The collision detection device may be a computer device such as a server; it may be a standalone server, a server cluster, or the like.
Referring to FIG. 4, FIG. 4 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present disclosure. The computer-readable storage medium 30 of the embodiments of the present application stores instructions/program data 31 that, when executed, implement the methods provided by any of the embodiments of the collision detection method of the present application and any non-conflicting combinations thereof. The instructions/program data 31 may form a program file stored in the storage medium 30 in the form of a software product, so that a computer device (which may be a personal computer, a server, or a network device) or a processor can execute all or part of the steps of the methods of the embodiments of the present application. The storage medium 30 includes media capable of storing program code, such as a USB flash drive, a mobile hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk, or terminal devices such as a computer, a server, a mobile phone, or a tablet.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The above description is only an embodiment of the present application and is not intended to limit its scope of protection. Any equivalent structural or process transformation made using the contents of this specification and the accompanying drawings, and any direct or indirect application in other related technical fields, are likewise included within the scope of patent protection of the present application.

Claims (10)

1. A collision detection method, characterized by comprising:
determining a first boundary of a first object;
determining a second boundary of the first object using the first boundary and a first prediction coefficient;
and judging whether a second object at least partially falls into the second boundary so as to judge whether the first object and the second object collide.
2. The collision detection method according to claim 1,
in response to a portion of the second object falling within the second boundary, the determining whether the first object collides with the second object includes:
determining a third boundary of a first component, the first component being a portion of the second object that falls within the second boundary;
determining a fourth boundary of the first component using the third boundary and a second prediction coefficient;
determining whether the first object at least partially falls within the fourth boundary.
3. The collision detection method according to claim 2,
in response to a portion of the first object falling within the fourth boundary, the determining whether the first object collides with the second object includes:
and judging whether the minimum distance between the first component and a second component is smaller than an early warning value, wherein the second component is a part of the first object falling within the fourth boundary.
4. The collision detection method according to claim 3,
determining that the first component and the second component are likely to collide in response to the minimum distance between the first component and the second component being smaller than the early warning value; or
determining that the first component and the second component will not collide in response to the minimum distance between the first component and the second component being greater than or equal to the early warning value.
5. The collision detection method according to claim 2,
the first prediction coefficient is the same as the second prediction coefficient.
6. The collision detection method according to claim 2,
the second boundary coordinate value is the sum of the first boundary coordinate value and the first prediction coefficient, the second boundary coordinate value being the coordinate value of a boundary point on the second boundary and the first boundary coordinate value being the coordinate value of a boundary point on the first boundary; and
the fourth boundary coordinate value is the sum of the third boundary coordinate value and the second prediction coefficient, the fourth boundary coordinate value being the coordinate value of a boundary point on the fourth boundary and the third boundary coordinate value being the coordinate value of a boundary point on the third boundary.
7. The collision detection method according to any one of claims 1 to 6,
wherein the boundaries of the objects are determined using an envelope algorithm, the envelope algorithm comprising any one of an axis-aligned bounding box algorithm, an oriented bounding box algorithm, a discrete oriented polytope bounding box algorithm, and a sphere bounding box algorithm.
8. The collision detection method according to claim 1, characterized in that the method further comprises:
respectively acquiring original point cloud data of the first object and the second object;
respectively carrying out three-dimensional reconstruction on the original point cloud data of the first object and the original point cloud data of the second object to obtain dense point cloud data of the first object and the dense point cloud data of the second object;
and performing down-sampling on the dense point cloud data of the first object and the second object to obtain sample point cloud data of the first object and the second object, and performing a collision detection step of the first object and the second object on the basis of the sample point cloud data.
9. A collision detection apparatus, characterized in that it comprises a processor for executing instructions to implement a collision detection method according to any one of claims 1-8.
10. A computer-readable storage medium for storing instructions/program data executable to implement the collision detection method according to any one of claims 1-8.
CN202011627667.6A 2020-12-31 2020-12-31 Collision detection method, apparatus, and computer-readable storage medium Active CN112700471B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011627667.6A CN112700471B (en) 2020-12-31 2020-12-31 Collision detection method, apparatus, and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN112700471A (en) 2021-04-23
CN112700471B (en) 2024-06-07

Family

ID=75513379

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011627667.6A Active CN112700471B (en) 2020-12-31 2020-12-31 Collision detection method, apparatus, and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN112700471B (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012066690A (en) * 2010-09-24 2012-04-05 Fujitsu Ten Ltd Vehicle control system, vehicle control apparatus, and vehicle control method
CN104092180A (en) * 2014-06-20 2014-10-08 三星电子(中国)研发中心 Collision detection processing method and device
CN109844671A (en) * 2016-08-26 2019-06-04 克朗设备公司 The path confirmation and dynamic route modification of materials handling vehicle
CN108062600A (en) * 2017-12-18 2018-05-22 北京星云互联科技有限公司 A kind of vehicle collision prewarning method and device based on rectangle modeling
CN108714303A (en) * 2018-05-16 2018-10-30 深圳市腾讯网络信息技术有限公司 Collision checking method, equipment and computer readable storage medium
EP3739356A1 (en) * 2019-05-12 2020-11-18 Origin Wireless, Inc. Method, apparatus, and system for wireless tracking, scanning and monitoring
CN110232741A (en) * 2019-06-17 2019-09-13 腾讯科技(深圳)有限公司 Multilayer bounding box determines method, collision detection and motion control method and equipment
WO2020258218A1 (en) * 2019-06-28 2020-12-30 深圳市大疆创新科技有限公司 Obstacle detection method and device for mobile platform, and mobile platform
CN110428663A (en) * 2019-08-30 2019-11-08 合肥鑫晟光电科技有限公司 A kind of vehicle collision prewarning method, car-mounted terminal and server
CN111177888A (en) * 2019-12-09 2020-05-19 武汉光庭信息技术股份有限公司 Simulation scene collision detection method and system
CN111402633A (en) * 2020-03-23 2020-07-10 北京安捷工程咨询有限公司 Object anti-collision method based on UWB positioning and civil engineering anti-collision system

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
ESSAY: "矩形包围盒碰撞检测" [Rectangle bounding box collision detection], pages 1-7, retrieved from the Internet: <URL:https://heptaluan.github.io/2020/11/28/Essay/31/> *
WANG CHAO et al.: "Improved Hybrid Bounding Box Collision Detection Algorithm", 《JOURNAL OF SYSTEM SIMULATION》, vol. 30, no. 11, 6 September 2018, pages 4236-4243 *
吴建军 [WU Jianjun]: "关于机械设计制造与自动化的研究探讨" [Research and discussion on mechanical design, manufacturing and automation], 《科技风》, vol. 2019, no. 1, 9 January 2019, page 141 *
李丽娟 et al. [LI Lijuan et al.]: "飞机插配零部件数字化装配碰撞检测研究" [Research on collision detection in the digital assembly of aircraft mating parts], 《机械设计与制造》 [Machinery Design & Manufacture], vol. 2020, no. 2, 8 February 2020, pages 145-148 *
杨林 et al. [YANG Lin et al.]: "煤矿井下移动机器人运动规划方法研究" [Research on motion planning methods for mobile robots in underground coal mines], 《工矿自动化》 [Industry and Mine Automation], vol. 46, no. 6, 16 June 2020, pages 23-30 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116810493A (en) * 2023-08-31 2023-09-29 山东惠硕重工机械有限公司 Anti-collision detection method and system for numerical control machine tool based on data driving
CN116810493B (en) * 2023-08-31 2023-11-21 山东惠硕重工机械有限公司 Anti-collision detection method and system for numerical control machine tool based on data driving

Also Published As

Publication number Publication date
CN112700471B (en) 2024-06-07

Similar Documents

Publication Publication Date Title
US11216971B2 (en) Three-dimensional bounding box from two-dimensional image and point cloud data
JP2021523443A (en) Association of lidar data and image data
US9171403B2 (en) Contour completion for augmenting surface reconstructions
WO2021052283A1 (en) Method for processing three-dimensional point cloud data and computing device
CN111640089A (en) Defect detection method and device based on feature map center point
CN107133966B (en) Three-dimensional sonar image background segmentation method based on sampling consistency algorithm
Markovic et al. Feature sensitive three-dimensional point cloud simplification using support vector regression
WO2022133770A1 (en) Method for generating point cloud normal vector, apparatus, computer device, and storage medium
Sveier et al. Object detection in point clouds using conformal geometric algebra
CN115131521A (en) Generating a three-dimensional representation of an object surface from a structured point cloud
CN112700471B (en) Collision detection method, apparatus, and computer-readable storage medium
CN110070606B (en) Space rendering method, target detection method, detection device, and storage medium
CN114764885A (en) Obstacle detection method and device, computer-readable storage medium and processor
CN112446952B (en) Three-dimensional point cloud normal vector generation method and device, electronic equipment and storage medium
CN114972492A (en) Position and pose determination method and device based on aerial view and computer storage medium
CN112308917A (en) Vision-based mobile robot positioning method
CN112700474A (en) Collision detection method, device and computer-readable storage medium
Goforth et al. Joint pose and shape estimation of vehicles from lidar data
Lim et al. Integration of Vehicle Detection and Distance Estimation using Stereo Vision for Real-Time AEB System.
CN114241011A (en) Target detection method, device, equipment and storage medium
Khalfaoui et al. View planning approach for automatic 3d digitization of unknown objects
CN114022630A (en) Method, device and equipment for reconstructing three-dimensional scene and computer readable storage medium
Mukhaimar et al. Comparative analysis of 3D shape recognition in the presence of data inaccuracies
US20240028784A1 (en) Segmenting a building scene
CN114219832B (en) Face tracking method and device and computer readable storage medium

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant