CN114723885A - Plant cold tolerance analysis method based on RGBD image dense three-dimensional reconstruction - Google Patents

Plant cold tolerance analysis method based on RGBD image dense three-dimensional reconstruction Download PDF

Info

Publication number
CN114723885A
CN114723885A CN202210353858.0A CN202210353858A CN114723885A CN 114723885 A CN114723885 A CN 114723885A CN 202210353858 A CN202210353858 A CN 202210353858A CN 114723885 A CN114723885 A CN 114723885A
Authority
CN
China
Prior art keywords
plant
model
image
dimensional
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210353858.0A
Other languages
Chinese (zh)
Other versions
CN114723885B (en
Inventor
卜佳俊
石泽鑫
蔡晓旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN202210353858.0A priority Critical patent/CN114723885B/en
Publication of CN114723885A publication Critical patent/CN114723885A/en
Application granted granted Critical
Publication of CN114723885B publication Critical patent/CN114723885B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/04Indexing scheme for image data processing or generation, in general involving 3D image data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/08Indexing scheme for image data processing or generation, in general involving all processing steps from image acquisition to 3D model generation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2012Colour editing, changing, or manipulating; Use of colour codes

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a plant cold tolerance analysis method based on RGBD image dense three-dimensional reconstruction, which comprises the following steps: (1) RGBD images of plants at different angles shot at different stages are obtained, and the image of each stage of each plant is used as a set of data. Before the three-dimensional reconstruction starts, firstly obtaining the depth position of a plant in a depth image, and setting configuration file parameters; (2) for each group of images, reconstructing a plant model by a dense three-dimensional reconstruction method; (3) and analyzing the change of the three-dimensional model of the plant at each stage, and counting the proportion of the model at the subsequent stage to the model at the initial state to distinguish whether the plant is a cold-resistant plant. The invention has the advantages of low cost, mature and efficient function, wide application scene and the like.

Description

Plant cold tolerance analysis method based on RGBD image dense three-dimensional reconstruction
Technical Field
The invention belongs to the field of plant three-dimensional structure analysis, and particularly relates to a plant cold tolerance analysis method based on RGBD image dense three-dimensional reconstruction.
Background
The intelligent agriculture is the combination of modern science and technology and agricultural planting, so that unmanned, automatic and intelligent management is realized.
For plants, environmental factors play a key role in growth, different plants can survive under various complex geographical environmental conditions, and more particularly, due to the adaptability of plants to environmental changes, for agriculture, crops with cold-resistant planting environments can improve yield and quality. The cold resistance of the plant can be expressed as the morphological difference of the plant in different environments, and the cold resistance can be judged by analyzing a plant model by using a three-dimensional reconstruction technology.
And based on three-dimensional reconstruction of the image, the three-dimensional reconstruction is to establish a three-dimensional model according to the data. The early three-dimensional technology usually takes a two-dimensional image as input to reconstruct a three-dimensional model, and the three-dimensional model established by the RGB image has low precision and limited application range; with the emergence of depth cameras for common consumers, depth image data is provided, reconstruction difficulty is reduced, application scenes are wider, and three-dimensional scanning and reconstruction technologies based on the depth cameras are developed rapidly.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a plant cold tolerance analysis method based on RGBD image dense three-dimensional reconstruction.
The technical scheme provided by the invention is as follows:
(1) RGBD images of plants at different angles shot at different stages are obtained, and the image of each stage of each plant is used as a set of data. Before the three-dimensional reconstruction starts, the depth position of the plant in the depth image is firstly obtained, and configuration file parameters are set.
(2) For each group of images, reconstructing a plant model by means of a dense three-dimensional reconstruction
(3) And analyzing the change of the plant three-dimensional model in each stage, and counting the proportion of the model in the subsequent stage to the model in the initial state to distinguish whether the plant is a cold-resistant plant.
Further, the plant image shooting stage in the step (1) comprises three stages of before-treatment, after-treatment and after-recovery, the operation is cold treatment, and three contrasts of different cold-resistant plants, namely a middle T, a cold-resistant J and a sensitive C, are set in each stage.
Further, the image capturing angle in step (1) is converted by: the camera is fixed, the shooting direction faces to the plants, the plants are placed on a turntable to rotate at the speed of m minutes by one turn, the frame rate of the shot images is f frames per second, m frames by 60 frames by f frames are shot in one turn, and each frame comprises two pictures, an RGB (red, green and blue) image and a depth image.
Further, for the depth position of the plant in the step (1), counting the number of depth values by using traversal pixels, and finding the depth value of the plant as model initialization for the shooting environment of the experiment in a certain interval, wherein the parameters of the set configuration file mainly include: the method comprises the steps of data set path, Volume size, cube grid scale, camera-to-model center distance initialization, pyramid processing layer number, distance threshold value of matched associated projection points, angle threshold value, three-layer pyramid iteration times, filtering kernel size and the like.
Further, the method for dense three-dimensional reconstruction in step (2) to reconstruct a plant model specifically includes:
(2.1) from the sequence of captured RGBD plant images, the RGB images are not processed and used for coloring in the model surface reconstruction stage described in step S2. Processing the depth map to generate a depth image pyramid and performing bilateral filtering processing, and processing each layer of depth image DkGenerating three-dimensional points under the current world coordinate system through camera internal reference back projection to obtain a vertex diagram V of the three-dimensional pointsk. And then cross-multiplying and normalizing by using the difference value of the vertical u direction and the vertical v direction according to the top point diagram, calculating the normal vector of the three-dimensional point, and obtaining a normal diagram Nk. Vertex diagram VkThe formula of (1) is:
Vk(p)=Dk(p)K-1p (1)
wherein p is the coordinate of the pixel point of the camera, K is the calibration matrix, and subscript K represents the kth frame.
Normal graph NkThe formula of (1) is:
Nk(p)=Normalize[(Vk(u+1,v)-Vk(u,v))×(Vk(u,v+1)-Vk(u,v))] (2)
wherein u, v are the horizontal and vertical coordinates of the point p.
(2.2) pose estimation step, for the case of the first frame, this step will be skipped and the bits of the first frame will be skippedThe posture is set up to an initial value. For the case of non-first frame, using the vertex diagram V obtained in step (2.1)kAnd normal diagram NkVertex map of surface model data inferred from existing models
Figure BDA0003581650670000021
Hema normal map
Figure BDA0003581650670000022
And the pose obtained by the last iteration is used for measuring the angle and distance error of the vertex data of the current frame and the vertex of the existing surface model data on the pixel coordinate, the associated projection point u is obtained by matching, and after the iteration is carried out for multiple times, the camera pose error of the current frame is minimized, wherein the error estimation formula is as follows:
Figure BDA0003581650670000023
wherein, TkFor the pose transformation of the current frame and the previous frame, the error between the current k frame image data and the previous frame k-1 image data through the pose transformation is as small as possible, and the current k frame image data is also the optimal solution required to be obtained by minimizing the error.
The objective function is a nonlinear least square problem, which can be linearized by an approximation method and solved by calculation by using a matrix SVD decomposition method.
(2.3) reconstructing the surface of the model, fusing frame data to Global Volume by using a TSDF method, and according to the obtained current frame camera pose TkA vertex diagram V of the current framekAnd normal diagram NkEach thread corresponds to (x, y, x) in the Global Volume, each thread is responsible for processing all data on the z axis, a cubic grid where each (x, y, z) is located is called a voxel, namely a cubic grid of the spatial scene model, each grid stores a distance value and a weight value, the distance value represents a distance difference value to the surface of a nearby object, and the closer to 0 represents the closer to the surface of the current voxel point. The TSDF value is calculated by the formula:
Figure BDA0003581650670000024
wherein t iskIs the coordinate of the camera of the k-th frame under a world coordinate system, p is the currently traversed voxel center point, lambda is the corresponding relation between the three-dimensional distance and the depth difference value of the camera and the voxel point, RkIs a depth image and x is the projected point pixel coordinate of the voxel center on the camera image.
The calculation formula of the corresponding relation lambda between the three-dimensional distance between the camera and the voxel point and the depth difference value is as follows:
λ=||K-1x||2 (5)
the calculation formula of the projection point pixel coordinate x of the voxel center on the camera image is as follows:
Figure BDA0003581650670000025
where pi (u) represents the perspective projection of the u-space three-dimensional point onto the camera plane.
The formula of the truncation operation Ψ (η) of the TSDF is:
Figure BDA0003581650670000031
where μ represents the truncation distance and less than- μ is a value where it is considered that object occlusion cannot be acquired.
Obtained
Figure BDA0003581650670000032
Represents the TSDF value of the p voxel point when processing the k frame image.
And updating the existing TSDF value and weight in the Volume by using the TSDF value and weight obtained by the new frame of image, wherein a value updating formula and a weight updating formula are as follows:
Figure BDA0003581650670000033
Figure BDA0003581650670000034
(2.4) by using ray casting, starting from a camera, traversing each pixel point, going against a ray, passing through a voxel in a Global Volume until the TSDF value passing through the voxel changes from positive to negative or from negative to positive, judging the position of a surface near the value of 0, and performing linear interpolation on two adjacent points with changed value signs, wherein the linear interpolation formula is as follows:
Figure BDA0003581650670000035
where t represents the ray path, Δ t is the step size,
Figure BDA0003581650670000036
for the value of the TSDF for the previous step,
Figure BDA0003581650670000037
is the current TSDF value.
The point of the three-dimensional coordinate with the value of 0 obtained by interpolation is the surface three-dimensional top point diagram of the object model
Figure BDA0003581650670000038
And calculating to obtain a normal map
Figure BDA0003581650670000039
As model surface data for comparison in the next frame pose estimation step. Enter the next frame of the loop.
Further, for analyzing the change of the plant three-dimensional model in each stage in the step (3), the statistics of the ratio of the model in the subsequent stage to the model in the initial state specifically includes:
and after a plant three-dimensional point cloud model and a mesh model are obtained, analyzing the point cloud model. By PK,S,NRepresenting the plant model, K the cold tolerance type, S the stage pre, post, re, N the plant number.
Counting the number of point clouds of the plant three-dimensional model at each stage, considering the atrophy of leaf morphology of the plant in a withered state, reducing the surface area, and showing that the number of the point clouds in the three-dimensional model is reduced; after the recovery stage, the three cold-resistant plants can be in different states, and whether the plants are the cold-resistant plants or not can be distinguished according to the proportion of the point cloud number of the leaves in the model to the initial state.
Compared with the prior art, the invention has the advantages that:
(1) the depth camera shooting equipment required by the method is low in cost, mature and efficient in function, simple in implementation and operation, capable of realizing automation and free of professional plant phenotype analysis knowledge.
(2) The method for plant model analysis has strong universality and wide application scenes, and can obtain better results for different plants or in different scenes.
(3) The plant phenotype analysis method combined with the three-dimensional reconstruction basis improves the efficiency, reduces the workload and improves the accuracy of the traditional analysis work. The analysis of plant phenotype parameters is closely related to breeding, and the method is beneficial to research and development of new crop varieties, improves the yield and quality of crops, and improves the capability of resisting drought or diseases and pests.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be fully described below, and all other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative efforts belong to the protection scope of the present invention.
The embodiment is as follows:
the invention provides a plant cold tolerance analysis method based on RGBD image three-dimensional reconstruction, which comprises the following steps:
(1) RGBD images of plants at different angles shot at different stages are obtained, and the image of each stage of each plant is used as a set of data. The three stages are before treatment, after treatment and after recovery, the operation of the treatment is cold treatment, and three controls of different cold-resistant plants, namely middle T, cold-resistant J and sensitive C, are set in each stage. The image is taken at an angle by first fixing the camera with the direction of the image facing the plant, rotating the plant on a turntable at a speed of two minutes and one turn, at a frame rate of 6 frames per second for a total of 720 frames.
Each frame of the shot data comprises two pictures, an RGB picture and a depth picture, a color image folder containing 720 RGB pictures and a depth image folder containing 720 depth pictures are placed into the folders, the folders are used as a group of data sets and contain camera internal reference files, and model files and pose files generated by subsequent processing are placed into outfile folders at the same level.
Before the three-dimensional reconstruction starts, the position of a plant in a depth image is firstly obtained, the number of depth values is counted by using traversal pixels, the shooting environment of an experiment is in a range of 300-600, the depth value of the plant is about 450, a configuration file parameter needs to be set, the path of a data set is provided, the Volume size is [500, 300, 300], the cube grid scale is 1, the distance from a camera to the center of a model is initialized to be 450, the pyramid processing layer number is 3, the distance threshold value of a matched and associated projection point is 5, the angle threshold value is 10 degrees, the three layers of pyramids of iteration times are [5, 7, 10], and the size of a filter kernel is 5. The first 1 second or so of the shot results in unsharp due to exposure problems, so the first 12 frames of images are discarded in the example.
(2) For 708 frames in the RGBD plant image sequence, the RGB image is not processed and is used for coloring in the work of the model surface reconstruction stage. Processing the depth map to generate a 3-layer pyramid of the depth image and perform bilateral filtering processing, and processing each layer of depth image DkAnd transmitting the image to a GPU for operation. Traversing each pixel according to the depth map, obtaining depth and judging background, generating three-dimensional points under the current world coordinate system for non-background points through camera internal reference back projection, and obtaining a top point map V of the three-dimensional pointsk. And according to the top point diagram, each pixel is traversed, the difference value of the u direction and the v direction is cross-multiplied,and normalizing, calculating normal vector of three-dimensional point to obtain normal graph Nk. Vertex diagram VkThe formula of (1) is:
Vk(p)=Dk(p)K-1p (1)
wherein p is the coordinate of the pixel point of the camera, K is a calibration matrix, and subscript K represents the kth frame.
Normal graph NkThe formula of (1) is:
Nk(p)=Normalize[(Vk(u+1,v)-Vk(u,v))×(Vk(u,v+1)-Vk(u,v))] (2)
wherein u, v are the horizontal and vertical coordinates of the point p.
Then, a pose estimation step is carried out, if the current frame is the first frame, the pose estimation step is skipped, and an initial value is established for the pose of the first frame, wherein the initial value is that the pose is according to the Volume size and the distance from the camera to the center of the model:
Figure BDA0003581650670000051
according to the obtained vertex diagram VkAnd normal diagram NkVertex map of surface model data inferred from existing models
Figure BDA0003581650670000052
Hema normal map
Figure BDA0003581650670000053
And the pose obtained by the last iteration is transmitted into a GPU for processing, each pixel of the image size is traversed, the vertex data of the current frame and the angle and distance error of the vertex of the existing surface model data on the pixel coordinate are measured, the associated projection point u is obtained in a matched mode, after the iteration is carried out for multiple times, the camera pose error of the current frame is minimized, and the error estimation formula is as follows:
Figure BDA0003581650670000054
wherein, TkFor the pose transformation of the current frame and the previous frame, the error between the current k frame image data and the previous frame k-1 image data through the pose transformation is as small as possible, and the current k frame image data is also the optimal solution required to be obtained by minimizing the error.
The objective function is a nonlinear least square problem, which can be linearized by an approximation method and solved by calculation by using a matrix SVD decomposition method.
Then fusing frame data to Global Volume by using a TSDF method, and obtaining the camera pose T of the current frame according to the obtained camera pose T of the current framekA vertex diagram V of the current framekAnd normal diagram NkAnd transmitting the image into a GPU, wherein each thread corresponds to (x, y, x) in a Global Volume, each thread is responsible for processing all data on a z-axis, a cubic grid where each (x, y, z) is located is called a voxel and is a cubic grid of a spatial scene model, each grid is stored with a distance value and a weight value, the distance value represents a distance difference value to the surface of a nearby object, and the approach of 0 represents that the front pixel point is closer to the surface. The TSDF value is calculated by the formula:
Figure BDA0003581650670000055
wherein t iskIs the coordinate of the camera of the k-th frame under a world coordinate system, p is the currently traversed voxel center point, lambda is the corresponding relation between the three-dimensional distance and the depth difference value of the camera and the voxel point, RkIs a depth image and x is the projected point pixel coordinate of the voxel center on the camera image.
The calculation formula of the corresponding relation lambda of the three-dimensional distance between the camera and the voxel point and the depth difference value is as follows:
λ=||K-1x||2 (5)
the calculation formula of the projection point pixel coordinate x of the voxel center on the camera image is as follows:
Figure BDA0003581650670000056
where pi (u) represents the perspective projection of a three-dimensional point in u space onto the camera plane.
The formula of the truncation operation Ψ (η) of the TSDF is:
Figure BDA0003581650670000057
where μ represents the truncation distance, and less than- μ is a value where object occlusion is considered to be unacquirable.
Obtained
Figure BDA0003581650670000061
Represents the TSDF value of the p voxel point when processing the k frame image.
Updating the existing TSDF value and weight in the Volume by using the TSDF value and weight obtained by the new frame of image, wherein the value updating formula and the weight updating formula are as follows:
Figure BDA0003581650670000062
Figure BDA0003581650670000063
then, using ray casting, starting from a camera, traversing each pixel point, moving against light, passing through a voxel in a Global Volume until the TSDF value passing through the voxel changes from positive to negative or from negative to positive, judging the position of a surface near the value of 0, and performing linear interpolation on two adjacent points with changed value signs, wherein the linear interpolation formula is as follows:
Figure BDA0003581650670000064
where t represents the ray path, Δ t is the step size,
Figure BDA0003581650670000065
for the value of the TSDF for the previous step,
Figure BDA0003581650670000066
is the current TSDF value.
The point of the three-dimensional coordinate with the value of 0 obtained by interpolation is the surface three-dimensional top point diagram of the object model
Figure BDA0003581650670000067
And calculating to obtain a normal map
Figure BDA0003581650670000068
As model surface data for comparison in the next frame pose estimation step. The next frame of the loop is entered.
(3) And after 708 frames of the cycle are finished, a plant three-dimensional point cloud model and a mesh model can be obtained, the final model results of the example are 27 cases, the three stages of the experiment are before treatment, after treatment and after recovery, the operation is cold treatment, and the contrast of three different cold-resistant plants, namely a middle T, a cold-resistant J and a sensitive C, is set up in each stage.
With PK,S,NRepresenting the plant model, K the cold tolerance type, S the stage pre, post, re, N the plant numbers 1, 2, 3. And counting the point cloud number of the plant three-dimensional model at each stage, considering the atrophy of the leaf form of the plant in a withered state, reducing the surface area, and reflecting that the point cloud number in the three-dimensional model is reduced, and after the recovery stage, the three cold-resistant plants can be in different states, and whether the plant is the cold-resistant plant can be distinguished according to the proportion of the point cloud number of the leaf in the model to the initial state.
The above are preferred embodiments of the present invention, and all changes made according to the technical scheme of the present invention that produce functional effects do not exceed the scope of the technical scheme of the present invention belong to the protection scope of the present invention.

Claims (6)

1. A plant cold tolerance analysis method based on RGBD image dense three-dimensional reconstruction is characterized by comprising the following steps:
s1: the method comprises the steps of obtaining RGBD images of plants at different angles shot at different stages, and taking the image of each stage of each plant as a group of data; before the three-dimensional reconstruction starts, firstly obtaining the depth position of a plant in a depth image, and setting configuration file parameters;
s2: for each group of images, reconstructing a plant model by a dense three-dimensional reconstruction method;
s3: and analyzing the change of the plant three-dimensional model in each stage, and counting the proportion of the model in the subsequent stage to the model in the initial state to distinguish whether the plant is a cold-resistant plant.
2. The method for analyzing plant cold tolerance based on RGBD image dense three-dimensional reconstruction as claimed in claim 1, wherein the stages of the image of the plant of step S1, including three stages of before treatment, after treatment and after recovery, are operated as cold treatment, and each stage is set up with three different cold tolerance plant controls, namely middle T, cold tolerance J and sensitive C.
3. The method for analyzing plant cold tolerance based on RGBD image dense three-dimensional reconstruction as claimed in claim 1, wherein the capturing angle of the image in step S1 is transformed by: the camera is fixed, the shooting direction faces to the plants, the plants are placed on a turntable to rotate at the speed of m minutes by one turn, the frame rate of the shot images is f frames per second, m frames by 60 frames by f frames are shot in one turn, and each frame comprises two pictures, an RGB (red, green and blue) image and a depth image.
4. The method as claimed in claim 1, wherein the depth position of the plant in step S1 is obtained by counting the number of depth values with traversal pixels, and finding the depth value of the plant as model initialization for the shooting environment of the experiment within a certain interval, and the parameters of the configuration file include the path of the data set, Volume size, cube grid size, initialization of the distance from the camera to the center of the model, the number of pyramid processing layers, the distance threshold matching the associated projection point, angle threshold, the number of pyramid iterations of three layers, and the size of the filter kernel.
5. The method for analyzing the cold tolerance of the plant based on the dense three-dimensional reconstruction of the RGBD image as claimed in claim 1, wherein the method for reconstructing the dense three-dimensional reconstruction of the RGBD image in step S2 is used for reconstructing a plant model, and specifically comprises:
s21: from the sequence of RGBD plant images captured, the RGB images are not processed and used for coloring in the model surface reconstruction stage described in step S23. Processing the depth map to generate a depth image pyramid and performing bilateral filtering processing, and processing each layer of depth image DkGenerating three-dimensional points under the current world coordinate system through camera internal reference back projection to obtain a vertex diagram V of the three-dimensional pointsk. And cross multiplying and normalizing the difference value of the vertical u direction and the vertical v direction according to the top point diagram, and calculating a normal vector of the three-dimensional point to obtain a normal diagram Nk. Vertex diagram VkThe formula of (1) is:
Vk(p)=Dk(p)K-1p (1)
wherein p is the coordinate of the pixel point of the camera, K is the calibration matrix, and subscript K represents the kth frame.
Normal graph NkThe formula of (1) is:
Nk(p)=Normalize[(Vk(u+1,v)-Vk(u,v))×(Vk(u,v+1)-Vk(u,v))] (2)
wherein u and v are horizontal and vertical coordinates of the point p;
s22: the pose estimation step is skipped for the case of the first frame and the pose of the first frame is set to an initial value. For the case of non-first frame, using the vertex diagram V obtained in step (2.1)kAnd normal graph NkVertex map of surface model data inferred from existing models
Figure FDA0003581650660000011
Hema normal map
Figure FDA0003581650660000012
And the pose obtained by the last iteration is used for measuring the angle and distance error of the vertex data of the current frame and the vertex of the existing surface model data on the pixel coordinate, the associated projection point u is obtained by matching, and after the iteration is carried out for multiple times, the camera pose error of the current frame is minimized, wherein the error estimation formula is as follows:
Figure FDA0003581650660000021
wherein, TkFor the pose transformation of the current frame and the previous frame, the error between the current k frame image data and the previous frame k-1 image data through the pose transformation is as small as possible, and the current k frame image data is also the optimal solution required to be obtained by minimizing the error.
This objective function is a nonlinear least-squares problem; it can be linearised by an approximation and solved by matrix SVD decomposition;
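A minimal sketch of one linearised solve, assuming the point-to-plane associations (source vertices, matched model vertices, model normals) are already given. The small-angle approximation replaces the rotation by I + [w]×, and `np.linalg.lstsq` solves the resulting system with an SVD-based routine, matching the SVD solution mentioned above; the function name and interface are illustrative:

```python
import numpy as np

def point_to_plane_step(src, dst, nrm):
    """One Gauss-Newton step of point-to-plane ICP.

    Minimises sum_i (((I + [w]_x) p_i + t - q_i) . n_i)^2 over x = [w, t]
    using the small-angle linearisation; lstsq solves it via SVD.
    """
    A = np.hstack([np.cross(src, nrm), nrm])       # N x 6 Jacobian: rows [p_i x n_i, n_i]
    b = -np.einsum('ij,ij->i', src - dst, nrm)     # negated point-to-plane residuals
    x, *_ = np.linalg.lstsq(A, b, rcond=None)      # SVD-based least squares
    w, t = x[:3], x[3:]
    T = np.eye(4)
    T[:3, :3] = np.array([[1.0, -w[2], w[1]],      # I + skew(w): linearised rotation
                          [w[2], 1.0, -w[0]],
                          [-w[1], w[0], 1.0]])
    T[:3, 3] = t
    return T
```

In practice this step is iterated, re-associating points each round, until the pose increment becomes negligible.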
S23: Model surface reconstruction: frame data are fused into the Global Volume by the TSDF method. Given the camera pose T_k of the current frame obtained above, together with the vertex map V_k and normal map N_k of the current frame, each thread corresponds to an (x, y) position of the Global Volume and is responsible for processing all data along the z axis. The cubic cell at each (x, y, z), i.e. a cubic grid cell of the spatial scene model, is called a voxel; each cell stores a distance value and a weight, where the distance value represents the signed distance to the nearby object surface, and the closer it is to 0, the closer the voxel point is to the surface. The TSDF value is computed as:

F_{R_k}(p) = Ψ(λ^{-1} ‖t_k − p‖_2 − R_k(x))   (4)

where t_k is the camera position of the k-th frame in the world coordinate system, p is the centre of the currently traversed voxel, λ is the conversion between the three-dimensional camera-to-voxel distance and the depth difference, R_k is the depth image, and x is the pixel coordinate of the voxel centre projected onto the camera image.

The conversion λ between the camera-to-voxel three-dimensional distance and the depth difference is computed as:

λ = ‖K^{-1} x‖_2   (5)

The pixel coordinate x of the voxel centre projected onto the camera image is computed as:

x = ⌊π(K T_k^{-1} p)⌋   (6)

where π(u) denotes the perspective projection of the three-dimensional point u onto the camera plane.

The truncation operation Ψ(η) of the TSDF is:

Ψ(η) = min(1, |η|/μ) · sgn(η) if η ≥ −μ, and null otherwise   (7)

where μ is the truncation distance; values below −μ are regarded as occluded by the object and unobtainable.

The resulting F_{R_k}(p) is the TSDF value of voxel point p when processing the k-th frame image.
The existing TSDF values and weights in the Volume are updated with those obtained from the new frame image; the value update formula and weight update formula are:

F_k(p) = (W_{k-1}(p) F_{k-1}(p) + W_{R_k}(p) F_{R_k}(p)) / (W_{k-1}(p) + W_{R_k}(p))   (8)

W_k(p) = W_{k-1}(p) + W_{R_k}(p)   (9)
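An illustrative sketch (not the patent's implementation) of the truncation Ψ and the weighted running average described by the value and weight update formulas above; NaN stands in for the "null" unobserved case, and the function names are assumptions:

```python
import numpy as np

def psi(eta, mu):
    """Truncation: clamp eta to [-mu, mu] in units of mu; eta < -mu is
    treated as unobserved (occluded), encoded here as NaN."""
    out = np.clip(np.asarray(eta, dtype=float) / mu, -1.0, 1.0)
    return np.where(np.asarray(eta) >= -mu, out, np.nan)

def tsdf_update(F_prev, W_prev, F_new, W_new=1.0):
    """Weighted running average: fuse a new frame's TSDF into the volume.
    Unobserved voxels (NaN in F_new) keep their old value and weight."""
    valid = ~np.isnan(F_new)
    F, W = F_prev.copy(), W_prev.copy()
    F[valid] = (W_prev[valid] * F_prev[valid] + W_new * F_new[valid]) \
               / (W_prev[valid] + W_new)
    W[valid] = W_prev[valid] + W_new
    return F, W
```

The running average means each voxel converges toward the consensus surface distance as more frames observe it, while the weight records how many observations contributed.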
S24: Using ray casting, starting from the camera, each pixel is traversed and its ray is marched through the voxels of the Global Volume until the TSDF value changes sign, from positive to negative or from negative to positive; the surface lies near the zero crossing, which is located by linear interpolation between the two adjacent samples of opposite sign. The linear interpolation formula is:

t* = t − Δt · F_t / (F_{t+Δt} − F_t)   (10)

where t is the position along the ray, Δt is the step size, F_t is the TSDF value at the previous step, and F_{t+Δt} is the current TSDF value.

The three-dimensional points where the interpolated value is 0 form the surface vertex map V̂_k of the object model, from which the normal map N̂_k is computed; these serve as the model surface data compared against in the pose estimation step of the next frame. The loop then proceeds to the next frame.
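The marching-and-interpolation step can be sketched as below. This is illustrative only: `tsdf_at` stands in for trilinear sampling of the Global Volume along the ray, and the sketch detects only the positive-to-negative (front-surface) crossing:

```python
def march_ray(tsdf_at, t0, t_max, dt):
    """March along a ray sampling the TSDF; on a sign change, locate the
    zero crossing by linear interpolation between the two samples."""
    t, f_prev = t0, tsdf_at(t0)
    while t + dt <= t_max:
        f_cur = tsdf_at(t + dt)
        if f_prev > 0 >= f_cur:                       # front surface crossed
            return t - dt * f_prev / (f_cur - f_prev)  # linear interpolation
        t, f_prev = t + dt, f_cur
    return None                                        # no surface on this ray
```

For a TSDF that is linear in t the interpolated crossing is exact; for a trilinearly sampled volume it is a close approximation, which is why a small step size Δt is used near the surface.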
6. The plant cold tolerance analysis method based on RGBD image dense three-dimensional reconstruction according to claim 1, wherein in step S3 the change of the plant three-dimensional model at each stage is analysed and the ratio of each later-stage model to the initial-state model is counted, specifically comprising:

After the plant three-dimensional point cloud model and mesh model are obtained, the point cloud model is analysed. A plant model is denoted P_{K,S,N}, where K is the cold-tolerance type, S is the stage (pre, post, re), and N is the plant number.

The number of points in the plant three-dimensional model is counted at each stage. In the withered state the plant's leaves shrivel and their surface area decreases, which appears as a drop in the point count of the three-dimensional model; after the recovery stage, plants of the three cold-tolerance types present different states, and whether a plant is cold resistant is distinguished by the ratio of the leaf point count in the model to that of the initial state.
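A toy sketch of this point-count bookkeeping (the stage keys follow the pre/post/re naming above; the 0.8 recovery threshold is an illustrative assumption, not a value given in the patent):

```python
def analyse_stages(counts, recovery_threshold=0.8):
    """Ratio of each later stage's point count to the initial (pre) stage.

    Returns the per-stage ratios and a cold-tolerance verdict based on how
    much of the initial point cloud survives recovery (threshold assumed).
    """
    pre = counts['pre']
    ratios = {stage: counts[stage] / pre for stage in ('post', 're')}
    return ratios, ratios['re'] >= recovery_threshold
```

A withered plant shows a low post-stage ratio; a cold-tolerant one climbs back toward 1.0 after recovery, which is what the verdict tests.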
CN202210353858.0A 2022-04-06 2022-04-06 Plant cold resistance analysis method based on RGBD image dense three-dimensional reconstruction Active CN114723885B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210353858.0A CN114723885B (en) 2022-04-06 2022-04-06 Plant cold resistance analysis method based on RGBD image dense three-dimensional reconstruction


Publications (2)

Publication Number Publication Date
CN114723885A true CN114723885A (en) 2022-07-08
CN114723885B CN114723885B (en) 2024-06-21

Family

ID=82242896

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210353858.0A Active CN114723885B (en) 2022-04-06 2022-04-06 Plant cold resistance analysis method based on RGBD image dense three-dimensional reconstruction

Country Status (1)

Country Link
CN (1) CN114723885B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117710469A (en) * 2024-02-06 2024-03-15 四川大学 Online dense reconstruction method and system based on RGB-D sensor

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108802759A (en) * 2018-06-07 2018-11-13 北京大学 The nearly sensing system of movable type towards plant phenotype and data capture method
CN109242873A (en) * 2018-08-22 2019-01-18 浙江大学 A method of 360 degree of real-time three-dimensionals are carried out to object based on consumer level color depth camera and are rebuild
CN110176032A (en) * 2019-04-28 2019-08-27 暗物智能科技(广州)有限公司 A kind of three-dimensional rebuilding method and device
CN110223383A (en) * 2019-06-17 2019-09-10 重庆大学 A kind of plant three-dimensional reconstruction method and system based on depth map repairing
WO2021115071A1 (en) * 2019-12-12 2021-06-17 中国科学院深圳先进技术研究院 Three-dimensional reconstruction method and apparatus for monocular endoscope image, and terminal device
CN113112504A (en) * 2021-04-08 2021-07-13 浙江大学 Plant point cloud data segmentation method and system
WO2021197341A1 (en) * 2020-04-03 2021-10-07 速度时空信息科技股份有限公司 Monocular image-based method for updating road signs and markings
CN114119574A (en) * 2021-11-30 2022-03-01 安徽农业大学 Picking point detection model construction method and picking point positioning method based on machine vision
CN114170288A (en) * 2021-11-30 2022-03-11 北京理工大学 Three-dimensional reconstruction method and device based on surface element


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
RUICHENG QIU 等: "Detection of the 3D temperature characteristics of maize under water stress using thermal and RGB-D cameras", COMPUTERS AND ELECTRONICS IN AGRICULTURE, vol. 191, 31 December 2021 (2021-12-31), pages 1 - 17 *
YU HOU 等: "Investigation on performance of RGB point cloud and thermal Information data fusion for 3D building thermal map modeling using aerial images under different experimental conditions", JOURNAL OF BUILDING ENGINEERING, vol. 45, 31 January 2022 (2022-01-31), pages 1 - 19 *
于广州: "白噪声干扰下三维点云数据重建方法仿真", 计算机仿真, vol. 34, no. 05, 15 May 2017 (2017-05-15), pages 444 - 447 *
孙婷 等: "基于序列图像的植株三维重建试验与分析", 软件, vol. 38, no. 04, 15 April 2017 (2017-04-15), pages 6 - 11 *
易兵 等: "基于2D/3D多特征融合的近距离空间非合作目标相对测量方法", 航天控制, vol. 35, no. 02, 15 April 2017 (2017-04-15), pages 60 - 65 *




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant