AU2019222803A1 - Volume measurement apparatus and method - Google Patents
- Publication number: AU2019222803A1
- Authority
- AU
- Australia
- Prior art keywords
- depth
- calibration
- volume
- cameras
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01F—MEASURING VOLUME, VOLUME FLOW, MASS FLOW OR LIQUID LEVEL; METERING BY VOLUME
- G01F17/00—Methods or apparatus for determining the capacity of containers or cavities, or the volume of solid bodies
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01G—WEIGHING
- G01G19/00—Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups
- G01G19/52—Weighing apparatus combined with other objects, e.g. furniture
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Fluid Mechanics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The disclosure provides a volume measurement apparatus and method. The apparatus includes a central processing unit and a measurement platform; the measurement platform includes a vision imaging unit and a dynamic weighing sensor. The vision imaging unit mainly consists of depth cameras aimed at an imaging area from a plurality of angles; the depth cameras are all connected with the central processing unit; the dynamic weighing sensor is disposed under the imaging area and configured to collect weight data and send it to the central processing unit. According to the disclosure, depth data from the plurality of visual angles can be fused by a multi-depth-camera calibration technique and a point cloud matching technique to reconstruct the three-dimensional spatial size information of an irregular object, thereby improving the stability, robustness and measurement precision of the measurement unit. The disclosure utilizes a slicing method to cut the three-dimensional shape of the irregular object into n irregular cross-sections and calculates the volume of the irregular object by an integral method, thereby greatly improving the calculation precision of the volume and the adaptability to the degree of irregularity.
Description
[0001] The disclosure belongs to the field of automation equipment, and particularly relates to a volume measurement apparatus and method.
[0002] Coal gangues are solid wastes discharged in the process of coal production and processing, and are rocks which are low in carbon content and harder than coal. In order to improve the quality of coal, sorting of coal gangues is an indispensable link in coal mine production.
[0003] At present, sorting of coal gangue mainly includes a wet separation method, a dry separation method and a manual sorting method. The wet separation method includes a jigging method and a heavy-medium method, in which raw coal is placed in a solution and the different densities of coal and coal gangue achieve their separation. This method needs huge equipment, is complicated in process and low in sorting efficiency, and causes serious environmental pollution. The dry separation method either adopts roller breaking, exploiting the different strengths of coal and coal gangue, or adopts recognition sorting using dual-energy γ-rays, exploiting their different transmittances; the former destroys the shape of the coal briquettes and involves poor sorting efficiency and huge equipment, while the latter requires expensive facilities and involves radiation. In addition, manual sorting involves a poor working environment and high labor intensity, and easily produces false sorting and missed sorting.
[0004] In order to achieve the automatic sorting of coals and coal gangues, the volume measurement and density calculation of the materials can be performed, thereby identifying the coals and the coal gangues. Thus, it is currently urgent to automatically measure the volume of the material, especially, to calculate the volume of the irregular object.
[0005] The disclosure provides a volume measurement apparatus and measurement method to calculate the volume of an irregular object, thereby greatly improving the calculation precision of the volume and the adaptability to the degree of irregularity.
[0006] In order to achieve the above objective, the technical solutions of the disclosure are implemented as follows:
[0007] A volume measurement apparatus, comprising a central processing unit and a measurement platform, wherein the measurement platform comprises a vision imaging unit and a dynamic weighing sensor; the vision imaging unit mainly consists of depth cameras aimed at an imaging area from a plurality of angles; the depth cameras are all connected with the central processing unit; the dynamic weighing sensor is disposed under the imaging area and configured for collecting weight data to be sent to the central processing unit.
[0008] Advantageously, the depth cameras are uniformly distributed within an area of 360° around an object to be measured.
[0009] Advantageously, further comprising an imaging shutter trigger arranged on an edge of an entrance of the measurement platform.
[0010] Advantageously, further comprising a sorting execution mechanism comprising a pushing plate and a hopper, wherein the pushing plate is arranged on the measurement platform and is connected with the central processing unit; the hopper is arranged at the measurement platform side in a pushing direction of the pushing plate.
[0011] The invention also provides a volume measurement method using the above volume measurement apparatus, comprising:
[0012] S1, calibration of a single depth camera: comprising calibration of internal parameters of a RGB camera head and an infrared camera head contained in the depth camera and calibration of external parameters of the two camera heads;
[0013] S2, calibration of a plurality of depth cameras: obtaining an internal parameter matrix and an external parameter matrix of the single camera through the calibration method in step S1, wherein, the external parameter matrix of the camera characterizes a pose of the camera relative to a calibration plate; calibration of the plurality of cameras is based on this principle, and a coordinate system transformation matrix among various depth cameras is obtained by acquiring the external parameter matrixes of the plurality of cameras relative to a same calibration plate;
[0014] S3, acquisition of depth information of an object: setting time intervals for depth cameras at different angles to start shooting and capturing images, and transforming depth information collected by various cameras to a same coordinate system according to a calibration result among various depth cameras, thereby obtaining the full-scene depth information of the object;
[0015] S4, preprocessing of depth information: extracting background data by a dimension filter, and then performing Gaussian smoothing filtering to extract noise data according to distribution and a relative geometrical relationship between point clouds; and
[0016] S5, volume measurement of the measured object: slicing an irregular object along one dimension with reference to the idea of multiple integrals, and transforming calculation of the irregular volume into calculation of the area of an irregular geometric figure.
[0017] Advantageously, the step S5 particularly comprises the following steps:
firstly, obtaining a continuous contour of a specific slice of the object by utilizing a local polynomial fitting method: assuming that there are N discrete contour points P_t(x_t, y_t), t = 1, 2, ..., N, a polynomial function fitted by these contour points is as follows:
y(x) = a0 + a1·x + a2·x^2 + a3·x^3
wherein a0, a1, a2, a3 are the polynomial coefficients;
constructing the fitting target function as the sum of squared distances from the N discrete contour points to the above polynomial curve:
E = sum_{t=1..N} (y(x_t) - y_t)^2
wherein y_t is the vertical coordinate of the t-th discrete contour point; directly calculating the optimal solution of the fitted polynomial function y(x) through the generalized matrix inverse;
then, calculating the area of the specific slice of the object by an integral method, namely the area enclosed by the fitted polynomial function y(x) and the x axis over the interval [x_min, x_max], wherein the formula for calculating the area through the integral is as follows:
S(z) = ∫_{x_min}^{x_max} y(x) dx, and
finally, calculating the volume of the object to be measured by the integral method, wherein the volume of the object to be measured is the integral of the slice area along the z axis, and the calculation formula is as follows:
V = ∫_{z_min}^{z_max} S(z) dz.
[0018] Compared with the prior art, the disclosure has the following beneficial effects:
[0019] (1) according to the disclosure, by a multi-depth camera calibration technique and a point cloud matching technique, depth data at a plurality of visual angles can be fused, the three-dimensional space size information of the irregular object is reconstructed, thereby improving the stability, robustness and measurement precision of the measurement unit;
[0020] (2) according to the disclosure, the three-dimensional shape of the irregular object is cut into n irregular cross-sections by utilizing a slicing method, and the volume of the irregular object is calculated by an integral method, thereby greatly improving the calculation precision of the volume and the adaptability to the degree of irregularity;
[0021] (3) according to the disclosure, a target object is shot by depth cameras to acquire its volume, thereby excluding the influence of natural light; there is no need to add an auxiliary light source in the process of capturing images;
[0022] (4) according to the disclosure, there is no need to rotate the object through 360°, thereby improving the adaptability to multiple measurement environments and equipment; the structure of the image capture unit becomes simpler; and
[0023] (5) according to the disclosure, only three frames of depth camera data are enough for measurement, thereby improving the timeliness of measurement and meeting the requirement of on-line measurement on an assembly line.
[0024] FIG. 1 is a structural diagram of a measurement platform according to an embodiment of the disclosure;
[0025] FIG. 2 is another structural diagram of a measurement platform according to an embodiment of the disclosure; and
[0026] FIG. 3 is a flowchart showing a volume measurement procedure according to an embodiment of the disclosure.
[0027] In the figures:
[0028] 13. opposite-type photoelectric sensor; 14. depth camera; 15. pushing plate; and 16. hopper
[0029] It should be noted that the embodiments of the disclosure and the characteristics of the embodiments can be mutually combined under the condition of no conflict.
[0030] The design principle of the disclosure is as follows: the volume of coal or coal gangue is acquired by an intelligent vision and image processing algorithm; the mass is acquired from a dynamic weighing sensor installed on the roller conveyor line; and coal and coal gangue are then separated by exploiting their different densities.
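The density-based separation principle above can be sketched in a few lines; this is an illustrative sketch, not the patent's implementation, and the 1.8 g/cm³ threshold is an assumed value chosen only for this example.

```python
# Illustrative sketch of the sorting principle: density = mass / volume,
# with a threshold separating coal from the denser gangue. The threshold
# value 1.8 g/cm^3 is an assumption for this example, not from the patent.

def classify_sample(mass_g: float, volume_cm3: float,
                    density_threshold: float = 1.8) -> str:
    """Classify a sample as 'coal' or 'gangue' from measured mass and volume."""
    if volume_cm3 <= 0:
        raise ValueError("volume must be positive")
    density = mass_g / volume_cm3
    return "gangue" if density >= density_threshold else "coal"

print(classify_sample(1300.0, 1000.0))  # density 1.3 g/cm^3 -> coal
```

In the apparatus, `mass_g` would come from the dynamic weighing sensor and `volume_cm3` from the depth-camera reconstruction described below.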
[0031] As shown in FIG. 1 and FIG. 2, in the disclosure, the imaging shutter trigger is a pair of opposite-type photoelectric sensors 13 disposed on the roller conveyor line of the measurement platform close to an edge of the shooting area; the opposite-type photoelectric sensors 13 are connected with the central processing unit.
[0032] The measurement platform consists of a visual imaging unit and a dynamic weighing sensor in the shooting area.
[0033] The visual imaging unit consists of depth cameras 14 located at a plurality of angles around the object to be measured; each depth camera 14 may be, for example, a TOF camera or a Kinect camera. The depth cameras 14 are all connected with the central processing unit and controlled by it.
[0034] When a sample enters the shooting area of the measurement platform, the opposite-type photoelectric sensors 13 send a trigger signal to the central processing unit, which controls the depth cameras 14 to capture images. To avoid mutual interference between the cameras, the depth cameras 14 do not shoot simultaneously; instead, they each shoot at certain time intervals.
[0035] The dynamic weighing sensor is disposed under the imaging area, and sends the collected weight data to the central processing unit.
[0036] The sorting execution mechanism is arranged on the measurement platform and includes a pushing plate 15 and a hopper 16. The pushing plate 15 is arranged on the measurement platform and is connected with the central processing unit; the hopper 16 is arranged at the side of the measurement platform in the pushing direction of the pushing plate 15.
[0037] The volume measurement steps are as follows:
[0038] (I) calibration of a single depth camera 14
[0039] A commercial depth camera 14, for example TOF camera Microsoft Kinect, generally contains a RGB camera head and an infrared camera head sensing depth, and thus calibration of the single depth camera 14 involves calibration of the inner parameters of the RGB camera head and the infrared camera head and calibration of the outer parameters of the two camera heads.
[0040] Calibration of the inner parameters adopts the well-known Zhang Zhengyou calibration method. The positions of the cameras are fixed to capture a plurality of images of checkerboards having different poses (note: when the infrared camera head performs capture, the active infrared light source of the depth camera 14 needs to be shielded). The inner parameter matrix of the camera is calculated by utilizing the following formula:
s·[u v 1]^T = M·[R T]·[X Y Z 1]^T    (1)
wherein, M is the internal parameter matrix of the camera, with dimension 3 × 3; R is the external parameter rotation matrix of the camera, with dimension 3 × 3; T is the external parameter translation vector of the camera, with dimension 3 × 1; [X Y Z]^T is the point coordinate under the world coordinate system; [u v 1]^T is the point coordinate under the image coordinate system; and s is the scaling coefficient, which generally equals the depth under the camera coordinate system.
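Formula (1) can be checked numerically with a short sketch; the intrinsic matrix M and the pose (R, T) below are synthetic example values, not calibration results from the disclosure.

```python
import numpy as np

def project_point(M, R, T, Pw):
    """Project world point Pw via s*[u v 1]^T = M*(R*Pw + T); return (u, v)."""
    Pc = R @ Pw + T      # world -> camera coordinates
    uvw = M @ Pc         # the scale s is the camera-frame depth uvw[2]
    return uvw[:2] / uvw[2]

# synthetic example: 500 px focal length, principal point (320, 240)
M = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
uv = project_point(M, np.eye(3), np.zeros(3), np.array([0.1, 0.2, 1.0]))
# the point at depth 1 m projects to (320 + 500*0.1, 240 + 500*0.2) = (370, 340)
```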
[0041] Calibration of the external parameters is actually calibration of the relative poses of the RGB camera head and the infrared camera head. The pose relationship between the two can be expressed as follows:
P_rgb = R·P_ir + T    (2)
wherein, P_ir is the point coordinate under the infrared camera head coordinate system, with dimension 3 × 1; P_rgb is the point coordinate under the RGB camera head coordinate system, with dimension 3 × 1; R is the rotation matrix from the infrared camera head coordinate system to the RGB camera head coordinate system, with dimension 3 × 3; and T is the translation vector from the infrared camera head coordinate system to the RGB camera head coordinate system, with dimension 3 × 1.
[0042] Similarly, a transformation relationship from the point under the world coordinate system to the RGB camera head coordinate system and the infrared camera head coordinate system can be expressed as follows:
P_rgb = R_w^rgb·P_w + T_w^rgb    (3)
P_ir = R_w^ir·P_w + T_w^ir    (4)
wherein, R_w^rgb is the rotation matrix from the world coordinate system to the RGB camera head coordinate system, with dimension 3 × 3; P_w is the point coordinate under the world coordinate system, with dimension 3 × 1; T_w^rgb is the translation vector from the world coordinate system to the RGB camera head coordinate system, with dimension 3 × 1; R_w^ir is the rotation matrix from the world coordinate system to the infrared camera head coordinate system, with dimension 3 × 3; and T_w^ir is the translation vector from the world coordinate system to the infrared camera head coordinate system, with dimension 3 × 1.
[0043] Assuming that the points seen by the RGB camera head and the infrared camera head are identical, formulas (3) and (4) are substituted into formula (2) to obtain:
R_w^rgb·P_w + T_w^rgb = R·(R_w^ir·P_w + T_w^ir) + T    (5)
[0044] The following is obtained after expansion:
R_w^rgb·P_w + T_w^rgb = R·R_w^ir·P_w + R·T_w^ir + T    (6)
[0045] The following is obtained from equation (6):
R = R_w^rgb·(R_w^ir)^(-1)    (7)
T = T_w^rgb - R·T_w^ir    (8)
[0046] When the RGB camera head and the infrared camera head simultaneously observe the same calibration plate, the external parameter matrices of both heads relative to the world coordinate system defined by the current calibration plate, namely R_w^rgb, T_w^rgb, R_w^ir and T_w^ir, can be estimated by the Zhang Zhengyou calibration algorithm. A multi-frame observation result is then substituted into formulas (7) and (8) to optimally estimate R and T, namely the rotation matrix and translation vector between the RGB camera head and the infrared camera head.
[0047] It can be seen from formula (1) that the mapping of a point coordinate under the RGB camera head coordinate system to the image coordinate system is obtained by the following formula:
s_rgb·[u v 1]^T = M_rgb·P_rgb    (9)
wherein, s_rgb is the depth information provided by the RGB camera head.
[0048] Similarly, the following can be obtained:
s_ir·[u v 1]^T = M_ir·P_ir    (10)
P_ir = s_ir·M_ir^(-1)·[u v 1]^T    (11)
wherein, s_ir is the depth information provided by the infrared camera head.
[0049] Formula (11) is substituted into formula (2) to obtain:
P_rgb = s_ir·R·M_ir^(-1)·[u v 1]^T + T    (12)
[0050] Formula (12) is substituted into formula (9) to obtain:
s_rgb·[u v 1]_rgb^T = M_rgb·(s_ir·R·M_ir^(-1)·[u v 1]_ir^T + T)    (13)
[0051] This formula registers the depth information to the RGB image; that is, it is the re-projection formula. Re-projection optimization is performed by utilizing the multi-frame observation result, thereby further improving the calibration precision of the depth camera 14.
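The re-projection chain (formulas (11), (12) and (9)) can be sketched per pixel as follows. The intrinsics here are placeholders; with identical heads and an identity pose the pixel must map to itself, which serves as a sanity check.

```python
import numpy as np

def register_depth_pixel(u, v, s_ir, M_ir, M_rgb, R, T):
    """Map one infrared pixel (u, v) with depth s_ir into the RGB image."""
    P_ir = s_ir * (np.linalg.inv(M_ir) @ np.array([u, v, 1.0]))  # eq. (11)
    P_rgb = R @ P_ir + T                                         # eq. (12)
    uvw = M_rgb @ P_rgb                                          # eq. (9)
    return uvw[:2] / uvw[2]

M = np.array([[365.0, 0.0, 256.0],
              [0.0, 365.0, 212.0],
              [0.0, 0.0, 1.0]])   # placeholder intrinsics for both heads
uv = register_depth_pixel(100.0, 80.0, 1.5, M, M, np.eye(3), np.zeros(3))
```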
[0052] (II) Calibration of a plurality of depth cameras 14
[0053] The objective of calibration of a plurality of depth cameras 14 is to obtain the coordinate transformation matrices among the depth cameras 14 by a calibration method. On the premise that calibration of each single depth camera 14 is completed, the essence of multi-depth-camera calibration is to calibrate the transformation matrices between the RGB camera coordinate systems of the plurality of depth cameras 14. Similar to the above derivation, the calibration method can use re-projection optimization to improve the calibration precision of the plurality of cameras.
[0054] The matrix transformation particularly comprises:
[0055] The external parameter matrix of a camera characterizes the pose of the camera relative to a certain calibration plate. Calibration of the plurality of cameras depends on this principle: the coordinate system transformation matrices among the depth cameras 14 can be obtained by acquiring the external parameter matrices of the plurality of cameras relative to the same calibration plate.
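This principle reduces to one homogeneous-matrix product: if E_i and E_j are the (world-to-camera) extrinsics of cameras i and j relative to the same plate, then E_j·inv(E_i) carries camera-i coordinates into camera-j coordinates. The poses below are synthetic illustrations.

```python
import numpy as np

def make_extrinsic(R, t):
    """Pack a world-to-camera rotation R and translation t into a 4x4 matrix."""
    E = np.eye(4)
    E[:3, :3] = R
    E[:3, 3] = t
    return E

def cam_to_cam(E_i, E_j):
    """Transform taking camera-i homogeneous coordinates into camera-j's frame."""
    return E_j @ np.linalg.inv(E_i)

th = np.deg2rad(120.0)   # e.g. two cameras 120 degrees apart around the plate
Rz = np.array([[np.cos(th), -np.sin(th), 0.0],
               [np.sin(th),  np.cos(th), 0.0],
               [0.0, 0.0, 1.0]])
E_i = make_extrinsic(np.eye(3), np.array([0.0, 0.0, 1.0]))
E_j = make_extrinsic(Rz, np.array([0.1, 0.0, 1.0]))
T_ij = cam_to_cam(E_i, E_j)
# for any plate point Pw: T_ij applied to E_i*Pw equals E_j*Pw
```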
[0056] (III) Acquisition of object depth information
[0057] In order to achieve volume measurement of irregular objects, this method needs to obtain the depth information of the measured object within the panoramic range of 360 degrees. To meet this requirement, avoid mutual interference between the plurality of depth cameras 14 and realize dynamic measurement of objects on the conveyor belt, the depth cameras 14 located at different angles around the object to be measured must start shooting at staggered time intervals.
[0058] When the object on the conveyor belt passes through the shooting area, the shutter trigger triggers the corresponding depth camera 14 to capture the image after detecting the object. Then, according to the calibration results of the plurality of depth cameras 14, the depth information collected by each camera can be transformed into the same coordinate system, and then the panoramic depth information of the object is obtained.
[0059] In this embodiment, the method for obtaining the object depth information particularly comprises: three depth cameras at different angles (0, 120 and 240 degrees) start shooting at staggered time intervals to capture images; the three depth maps captured by the three cameras are then transformed to the same coordinate system according to the calibration results of the depth cameras (the coordinate system conversion matrices), and the panoramic depth information of the object is thereby obtained.
[0060] (IV) Preprocessing of depth information
[0061] Depth data acquired by the plurality of depth cameras 14 contains object data and background data such as the conveyor belt, and also contains noise data caused by environmental interference or by the precision of the equipment itself. Therefore, the objective of preprocessing is to eliminate the background data and noise data through effective methods so as to acquire the effective depth information of the measured object. Relative to the object data, the background data basically has a specific depth, such as the conveyor belt plane; therefore, most of the background data can be extracted by a dimension filter. Then, according to the distribution of the point clouds and their relative geometric relationship, Gaussian smoothing filtering is performed: an octree describing the spatial position relationship of the point clouds is constructed so as to quickly retrieve them; according to the distribution and relative geometric relationship between the nearest K points, the mean and variance of the Gaussian mixture model are calculated, and Gaussian smoothing filtering is performed to extract the noise data.
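The two preprocessing stages can be sketched as below. This is an assumption-laden simplification: the "dimension filter" is modeled as a pass-through filter on the z (depth) coordinate, and the octree-accelerated Gaussian step is replaced by a brute-force k-nearest-neighbour statistical outlier filter that follows the same mean/variance idea.

```python
import numpy as np

def remove_background(points, belt_z, tol=0.01):
    """Pass-through filter: drop points within tol of the belt plane z = belt_z."""
    return points[np.abs(points[:, 2] - belt_z) > tol]

def remove_outliers(points, k=8, n_std=2.0):
    """Drop points whose mean distance to their k nearest neighbours deviates
    from the global mean by more than n_std standard deviations."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)   # column 0 is the self-distance 0
    mu, sigma = mean_knn.mean(), mean_knn.std()
    return points[mean_knn <= mu + n_std * sigma]
```

A production implementation would use an octree or k-d tree for the neighbour search instead of the O(n²) distance matrix used here for clarity.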
[0062] (V) Volume measurement of a measured object
[0063] This method aims at volume measurement of irregular objects, whose volume cannot be calculated by simply multiplying length, width and height. Generally speaking, the acquired point clouds are sparse, so to calculate the volume the continuous contour edges of the object must be acquired by a fitting method. Because of the irregularity of the object surface, a plane or surface fitting method would inevitably cause larger errors and lower computational efficiency. This method therefore draws on the idea of multiple integrals: it slices the irregular object along one dimension and converts the calculation of an irregular volume into the calculation of irregular geometric cross-section areas. Considering the sparseness of the point cloud data, the method first utilizes a local polynomial fitting method to obtain the continuous contour of a specific slice of the object: assuming there are N discrete contour points P_t(x_t, y_t), t = 1, ..., N, the polynomial function fitted by these contour points is as follows:
y(x) = a0 + a1·x + a2·x^2 + a3·x^3
wherein a0, a1, a2, a3 are the polynomial coefficients.
[0064] The fitting objective function is constructed as the sum of squared distances from the N discrete contour points to the above polynomial curve:
E = sum_{t=1..N} (y(x_t) - y_t)^2
wherein, y_t is the longitudinal coordinate of the t-th discrete contour point.
[0065] Through generalized matrix inversion, the optimal solution of the fitting polynomial function can be directly obtained.
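The "generalized matrix inverse" solution is the Moore-Penrose pseudo-inverse applied to the Vandermonde system of the contour points; a sketch with synthetic points lying exactly on a cubic:

```python
import numpy as np

def fit_cubic(xs, ys):
    """Least-squares cubic fit: a = pinv(A) @ ys, with A the Vandermonde matrix."""
    A = np.vander(xs, 4, increasing=True)   # columns: 1, x, x^2, x^3
    return np.linalg.pinv(A) @ ys           # coefficients a0, a1, a2, a3

# synthetic contour points lying exactly on y = 1 + 2x - 3x^2
xs = np.linspace(0.0, 1.0, 20)
ys = 1.0 + 2.0 * xs - 3.0 * xs ** 2
a = fit_cubic(xs, ys)
# a recovers (1, 2, -3, 0) up to floating-point error
```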
[0066] Then the area of a specific slice of the object is calculated by an integral method, namely the area enclosed by the fitted polynomial function y(x) and the x axis over the interval [x_min, x_max]. The formula for calculating the area via the integral is as follows:
S(z) = ∫_{x_min}^{x_max} y(x) dx
[0067] Finally, the volume of the object to be measured is calculated by the integral method: the volume is the integral of the slice area S(z) along the z axis. The calculation formula is as follows:
V = ∫_{z_min}^{z_max} S(z) dz
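Step (V)'s two integrals can be sketched end to end: the slice area is the analytic integral of the fitted cubic over [x_min, x_max], and the volume accumulates the areas along z. A trapezoidal rule is used here as one reasonable choice; the disclosure does not prescribe the numerical scheme.

```python
import numpy as np

def slice_area(a, x_min, x_max):
    """Analytic integral of a0 + a1*x + a2*x^2 + a3*x^3 over [x_min, x_max]."""
    F = lambda x: a[0]*x + a[1]*x**2/2 + a[2]*x**3/3 + a[3]*x**4/4
    return F(x_max) - F(x_min)

def volume_from_slices(zs, areas):
    """V = integral of S(z) dz, approximated by the trapezoidal rule."""
    zs, areas = np.asarray(zs), np.asarray(areas)
    return float(np.sum(0.5 * (areas[:-1] + areas[1:]) * np.diff(zs)))

# sanity check: a unit-area slice repeated over a unit z-range gives volume 1
area = slice_area(np.array([1.0, 0.0, 0.0, 0.0]), 0.0, 1.0)
vol = volume_from_slices(np.linspace(0.0, 1.0, 11), np.full(11, area))
```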
[0068] The above description is only a preferred embodiment of the disclosure. It should be noted that those of ordinary skill in the art can also make several improvements and modifications without departing from the concept of the disclosure, and these improvements and modifications should be deemed as being included within the protection scope of the disclosure.
[0069] It will be understood that the term "comprise" and any of its derivatives (e.g. comprises, comprising) as used in this specification is to be taken to be inclusive of features to which it refers, and is not meant to exclude the presence of any additional features unless otherwise stated or implied.
[0070] The reference to any prior art in this specification is not, and should not be taken as, an acknowledgement or any form of suggestion that such prior art forms part of the common general knowledge.
Claims (6)
1. A volume measurement apparatus, comprising a central processing unit and a measurement platform, wherein the measurement platform comprises a vision imaging unit and a dynamic weighing sensor; the vision imaging unit mainly consists of depth cameras aimed at an imaging area from a plurality of angles; the depth cameras are all connected with the central processing unit; the dynamic weighing sensor is disposed under the imaging area and configured for collecting weight data to be sent to the central processing unit.
2. The volume measurement apparatus according to claim 1, wherein the depth cameras are uniformly distributed within an area of 360° around an object to be measured.
3. The volume measurement apparatus according to claim 1, further comprising an imaging shutter trigger arranged on an edge of an entrance of the measurement platform.
4. The volume measurement apparatus according to claim 1, further comprising a sorting execution mechanism comprising a pushing plate and a hopper, wherein the pushing plate is arranged on the measurement platform and is connected with the central processing unit; the hopper is arranged at the measurement platform side in a pushing direction of the pushing plate.
5. A volume measurement method using the volume measurement apparatus according to any one of claims 1-4, comprising: S1, calibration of a single depth camera: comprising calibration of internal parameters of a RGB camera head and an infrared camera head contained in the depth camera and calibration of external parameters of the two camera heads; S2, calibration of a plurality of depth cameras: obtaining an internal parameter matrix and an external parameter matrix of the single camera through the calibration method in step S1, wherein, the external parameter matrix of the camera characterizes a pose of the camera relative to a calibration plate; calibration of the plurality of cameras is based on this principle, and a coordinate system transformation matrix among various depth cameras is obtained by acquiring the external parameter matrixes of the plurality of cameras relative to a same calibration plate; S3, acquisition of depth information of an object: setting time intervals for depth cameras at different angles to start shooting and capturing images, and transforming depth information collected by various cameras to a same coordinate system according to a calibration result among various depth cameras, thereby obtaining the full-scene depth information of the object; S4, preprocessing of depth information: extracting background data by a dimension filter, and then performing Gaussian smoothing filtering to extract noise data according to distribution and a relative geometrical relationship between point clouds; and S5, volume measurement of the measured object: slicing an irregular object along one dimension with reference to the idea of multiple integrals, and transforming calculation of the irregular volume into calculation of the area of an irregular geometric figure.
6. The volume measurement method according to claim 5, wherein the step S5 particularly comprises the following steps:
firstly, obtaining a continuous contour of a specific slice of the object by utilizing a local polynomial fitting method: assuming that there are N discrete contour points P_t(x_t, y_t), t = 1, 2, ..., N, a polynomial function fitted by these contour points is as follows:
y(x) = a0 + a1·x + a2·x^2 + a3·x^3
wherein a0, a1, a2, a3 are the polynomial coefficients;
constructing the fitting target function as the sum of squared distances from the N discrete contour points to the above polynomial curve:
E = sum_{t=1..N} (y(x_t) - y_t)^2
wherein y_t is the vertical coordinate of the t-th discrete contour point; directly calculating the optimal solution of the fitted polynomial function y(x) through the generalized matrix inverse;
then, calculating the area of the specific slice of the object by an integral method, namely the area enclosed by the fitted polynomial function y(x) and the x axis over the interval [x_min, x_max], wherein the formula for calculating the area through the integral is as follows:
S(z) = ∫_{x_min}^{x_max} y(x) dx; and
finally, calculating the volume of the object to be measured by the integral method, wherein the volume of the object to be measured is the integral of the slice area along the z axis, and the calculation formula is as follows:
V = ∫_{z_min}^{z_max} S(z) dz.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910132568.1A CN109632033B (en) | 2019-02-22 | 2019-02-22 | Volume measurement device and method |
CN201910132568.1 | 2019-02-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
AU2019222803A1 true AU2019222803A1 (en) | 2020-09-10 |
Family
ID=66065869
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2019222803A Abandoned AU2019222803A1 (en) | 2019-02-22 | 2019-08-26 | Volume measurement apparatus and method |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN109632033B (en) |
AU (1) | AU2019222803A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111739031A (en) * | 2020-06-19 | 2020-10-02 | 华南农业大学 | Crop canopy segmentation method based on depth information |
CN113640177A (en) * | 2021-06-29 | 2021-11-12 | 阿里巴巴新加坡控股有限公司 | Cargo density measuring method and system and electronic equipment |
CN113916125A (en) * | 2021-08-04 | 2022-01-11 | 清华大学 | Vinasse volume measuring method based on depth imaging |
CN114972351A (en) * | 2022-08-01 | 2022-08-30 | 深圳煜炜光学科技有限公司 | Mine car ore quantity detection method, device and equipment |
CN116152344A (en) * | 2023-04-18 | 2023-05-23 | 天津德通电气有限公司 | Coal dressing method and system based on shape database identification |
CN117036953A (en) * | 2023-08-21 | 2023-11-10 | 中国科学院自动化研究所 | Multi-feature fusion coal gangue identification method |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10930001B2 (en) * | 2018-05-29 | 2021-02-23 | Zebra Technologies Corporation | Data capture system and method for object dimensioning |
CN110189347B (en) * | 2019-05-15 | 2021-09-24 | 深圳市优***科技股份有限公司 | Method and terminal for measuring volume of object |
CN110220567A (en) * | 2019-07-12 | 2019-09-10 | 四川长虹电器股份有限公司 | Real time inventory volume measurement method for irregular container |
CN111027405B (en) * | 2019-11-15 | 2023-09-01 | 浙江大华技术股份有限公司 | Method and device for estimating space occupancy of article, terminal and storage device |
CN111307659A (en) * | 2020-03-11 | 2020-06-19 | 河南理工大学 | Rapid density measuring system for irregular rigid object |
CN111047652B (en) * | 2020-03-13 | 2020-06-30 | 杭州蓝芯科技有限公司 | Rapid multi-TOF camera external parameter calibration method and device |
CN113496142A (en) * | 2020-03-19 | 2021-10-12 | 顺丰科技有限公司 | Method and device for measuring volume of logistics piece |
CN111429507A (en) * | 2020-04-14 | 2020-07-17 | 深圳市异方科技有限公司 | Volume measurement device based on multiple 3D lenses |
CN111879662A (en) * | 2020-08-03 | 2020-11-03 | 浙江万里学院 | Weight measurement system and method for open water of iron ore |
CN112270702A (en) * | 2020-11-12 | 2021-01-26 | Oppo广东移动通信有限公司 | Volume measurement method and device, computer readable medium and electronic equipment |
CN112419393B (en) * | 2020-11-15 | 2022-06-14 | 浙江大学 | Real-time measuring and calculating device and method for volume of garbage in hopper of garbage incinerator |
CN113237423B (en) * | 2021-04-16 | 2023-09-05 | 北京京东乾石科技有限公司 | Article volume measuring device |
CN113513980B (en) * | 2021-06-07 | 2023-05-02 | 昂视智能(深圳)有限公司 | Volume measuring device based on auxiliary ruler |
CN113393383B (en) * | 2021-08-17 | 2021-11-16 | 常州市新创智能科技有限公司 | Splicing method for photographed images of double-depth camera |
CN114264355B (en) * | 2021-11-18 | 2024-06-25 | 河南讯飞智元信息科技有限公司 | Weight detection method, device, electronic equipment and storage medium |
CN114459362B (en) * | 2021-12-31 | 2024-03-26 | 深圳市瑞图生物技术有限公司 | Measuring device and measuring method thereof |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104132613B (en) * | 2014-07-16 | 2017-01-11 | 佛山科学技术学院 | Noncontact optical volume measurement method for complex-surface and irregular objects |
CN104457574A (en) * | 2014-12-11 | 2015-03-25 | 天津大学 | Device and method for non-contact volume measurement of irregular objects |
CN104492725A (en) * | 2014-12-16 | 2015-04-08 | 郑州大河智信科技股份公司 | Intelligent coal gangue separating machine |
CN105180806A (en) * | 2015-08-25 | 2015-12-23 | 大连理工大学 | Cross-scale geometric parameter measurement method based on microscopic vision |
CN106839975B (en) * | 2015-12-03 | 2019-08-30 | 杭州海康威视数字技术股份有限公司 | Volume measuring method and its system based on depth camera |
CN107514983B (en) * | 2016-08-16 | 2024-05-10 | 上海汇像信息技术有限公司 | System and method for measuring surface area of object based on three-dimensional measurement technology |
CN107449501A (en) * | 2017-08-17 | 2017-12-08 | 深圳市异方科技有限公司 | Automatic product weighing, code reading, and dimension/volume measuring system |
CN107730555A (en) * | 2017-08-25 | 2018-02-23 | 徐州科融环境资源股份有限公司 | Machine vision-based online recognition and monitoring method for coal lump granularity on a coal conveyor belt |
CN108717728A (en) * | 2018-07-19 | 2018-10-30 | 安徽中科智链信息科技有限公司 | Three-dimensional reconstruction apparatus and method based on multi-view depth cameras |
CN209230716U (en) * | 2019-02-22 | 2019-08-09 | 浙江大学滨海产业技术研究院 | A kind of volume measurement device |
2019
- 2019-02-22 CN CN201910132568.1A patent/CN109632033B/en active Active
- 2019-08-26 AU AU2019222803A patent/AU2019222803A1/en not_active Abandoned
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111739031A (en) * | 2020-06-19 | 2020-10-02 | 华南农业大学 | Crop canopy segmentation method based on depth information |
CN113640177A (en) * | 2021-06-29 | 2021-11-12 | 阿里巴巴新加坡控股有限公司 | Cargo density measuring method and system and electronic equipment |
CN113916125A (en) * | 2021-08-04 | 2022-01-11 | 清华大学 | Vinasse volume measuring method based on depth imaging |
CN114972351A (en) * | 2022-08-01 | 2022-08-30 | 深圳煜炜光学科技有限公司 | Mine car ore quantity detection method, device and equipment |
CN114972351B (en) * | 2022-08-01 | 2022-11-11 | 深圳煜炜光学科技有限公司 | Mine car ore quantity detection method, device and equipment |
CN116152344A (en) * | 2023-04-18 | 2023-05-23 | 天津德通电气有限公司 | Coal dressing method and system based on shape database identification |
CN116152344B (en) * | 2023-04-18 | 2023-07-11 | 天津德通电气有限公司 | Coal dressing method and system based on shape database identification |
CN117036953A (en) * | 2023-08-21 | 2023-11-10 | 中国科学院自动化研究所 | Multi-feature fusion coal gangue identification method |
CN117036953B (en) * | 2023-08-21 | 2024-02-13 | 中国科学院自动化研究所 | Multi-feature fusion coal gangue identification method |
Also Published As
Publication number | Publication date |
---|---|
CN109632033A (en) | 2019-04-16 |
CN109632033B (en) | 2024-04-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2019222803A1 (en) | Volume measurement apparatus and method | |
WO2018028103A1 (en) | Unmanned aerial vehicle power line inspection method based on characteristics of human vision | |
CN109544679B (en) | Three-dimensional reconstruction method for inner wall of pipeline | |
CN112418103B (en) | Bridge crane hoisting safety anti-collision system and method based on dynamic binocular vision | |
CN105346706B (en) | Flight instruments, flight control system and method | |
CN105551064B (en) | Method for estimating stockpile volume change based on image features |
CN106969706A (en) | Workpiece sensing and three-dimension measuring system and detection method based on binocular stereo vision | |
CN104482860B (en) | Fish morphological parameters self-operated measuring unit and method | |
CN105716539B (en) | Rapid, high-accuracy three-dimensional shape measurement method |
CN109848073A (en) | A kind of apparatus and method for sorting coal and gangue | |
CN101512551A (en) | A method and a system for measuring an animal's height | |
Rau et al. | Bridge crack detection using multi-rotary UAV and object-base image analysis | |
CN105447853A (en) | Flight device, flight control system and flight control method | |
CN110910350B (en) | Nut loosening detection method for wind power tower cylinder | |
CN114241298A (en) | Tower crane environment target detection method and system based on laser radar and image fusion | |
CN110047111B (en) | Parking apron corridor bridge butt joint error measuring method based on stereoscopic vision | |
CN111266315A (en) | Ore material online sorting system and method based on visual analysis | |
CN104200492B (en) | Video object automatic detection tracking of taking photo by plane based on profile constraints | |
Son et al. | Rapid 3D object detection and modeling using range data from 3D range imaging camera for heavy equipment operation | |
CN103913149B (en) | Binocular ranging system and ranging method based on an STM32 microcontroller |
CN209230716U (en) | A kind of volume measurement device | |
CN110458785B (en) | Magnetic levitation ball levitation gap detection method based on image sensing | |
CN115909025A (en) | Terrain vision autonomous detection and identification method for small celestial body surface sampling point | |
CN108180871B (en) | method for quantitatively evaluating composite insulator surface pulverization roughness | |
CN112258398B (en) | Conveyor belt longitudinal tearing detection device and method based on TOF and binocular image fusion |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
MK5 | Application lapsed section 142(2)(e) - patent request and compl. specification not accepted |