CN116894933B - Three-dimensional model comparison method, device, equipment and storage medium - Google Patents

Three-dimensional model comparison method, device, equipment and storage medium

Info

Publication number
CN116894933B
CN116894933B (application CN202311153320.6A)
Authority
CN
China
Prior art keywords
dimensional
model
graph
curved surface
coordinate graph
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311153320.6A
Other languages
Chinese (zh)
Other versions
CN116894933A (en)
Inventor
李硕文
陆炎
江腾飞
李洲强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shining 3D Technology Co Ltd
Original Assignee
Shining 3D Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shining 3D Technology Co Ltd filed Critical Shining 3D Technology Co Ltd
Priority to CN202311153320.6A priority Critical patent/CN116894933B/en
Publication of CN116894933A publication Critical patent/CN116894933A/en
Application granted granted Critical
Publication of CN116894933B publication Critical patent/CN116894933B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/04Texture mapping
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2012Colour editing, changing, or manipulating; Use of colour codes
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T90/00Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a three-dimensional model comparison method, a device, equipment and a storage medium, wherein the three-dimensional model comparison method comprises the following steps: acquiring a three-dimensional design model of a first target object; calculating a complete coordinate graph of the three-dimensional design model according to a preset global resolution; carrying out scanning reconstruction on a first target object to obtain a first three-dimensional scanning model, calculating the distance from the complete coordinate graph to the first three-dimensional scanning model, and generating a first distance graph; and coloring the three-dimensional design model based on the preset color range, the complete coordinate graph and the first distance graph, and generating a three-dimensional comparison model of the first target object. The method provided by the disclosure effectively improves the rendering efficiency while guaranteeing the overall rendering effect of the model.

Description

Three-dimensional model comparison method, device, equipment and storage medium
Technical Field
The present invention relates to the field of computer application technologies, and in particular, to a method, an apparatus, a device, and a storage medium for comparing three-dimensional models.
Background
With the rapid development of industrial production and manufacturing, the precision requirements of industrial production are steadily increasing. To visually compare the differences between an industrial product and its sample, a 3D chromatogram comparison method is generally adopted: the distance differences between a sample model and a scanning model of the industrial product are mapped to a set of continuously graded colors, and the corresponding colors are rendered on the sample model or the scanning model, so that 3D chromatogram comparison is realized and the similarities and differences between the sample and the industrial product can be observed intuitively.
At present, 3D chromatogram comparison can be realized by calculating a texture map of a model and using the texture map to represent the differences between the sample model and the scanning model. However, the overall texture expansion of the sample model is complex; if the texture expansion of each curved surface of the sample model is calculated independently to generate a texture map, each curved surface has to be drawn separately and textured separately, which greatly reduces the overall rendering efficiency of the model.
Disclosure of Invention
In order to solve the technical problems, the embodiments of the present disclosure provide a three-dimensional model comparison method, apparatus, device, and storage medium, which effectively improve rendering efficiency while ensuring the overall rendering effect of a model.
In a first aspect, an embodiment of the present disclosure provides a three-dimensional model comparison method, including:
acquiring a three-dimensional design model of a first target object;
calculating a complete coordinate graph of the three-dimensional design model according to a preset global resolution;
carrying out scanning reconstruction on the first target object to obtain a first three-dimensional scanning model, calculating the distance from the complete coordinate graph to the first three-dimensional scanning model, and generating a first distance graph;
and coloring the three-dimensional design model based on a preset color range, the complete coordinate graph and the first distance graph to generate a three-dimensional comparison model of the first target object.
Optionally, the calculating the complete coordinate graph of the three-dimensional design model according to the preset global resolution includes:
calculating the resolution of each curved surface of the three-dimensional design model in a preset direction according to the preset global resolution;
sampling each curved surface based on the resolution of each curved surface in a preset direction and a preset coordinate range, and generating a curved surface coordinate graph of each curved surface;
and combining the curved surface coordinate graphs of each curved surface to obtain a complete coordinate graph of the three-dimensional design model.
Optionally, the coloring the three-dimensional design model based on the preset color range, the complete coordinate graph and the first distance graph, and generating the three-dimensional comparison model of the first target object includes:
determining the initial row and column positions of the curved surface coordinate graph of each curved surface of the three-dimensional design model in the complete coordinate graph;
calculating a first texture coordinate of each pixel in the complete coordinate graph in the three-dimensional design model according to the initial row and column positions, the height of the curved surface coordinate graph and the height and width of the complete coordinate graph;
and coloring the three-dimensional design model based on a preset color range, the first texture coordinates and the first distance map, and generating a three-dimensional comparison model of the first target object.
Optionally, the calculating the first texture coordinates of each pixel in the complete coordinate graph in the three-dimensional design model according to the initial row and column positions, the height of the curved surface coordinate graph, and the height and the width of the complete coordinate graph includes:
calculating pixel positions of second texture coordinates of grids in each curved surface of the three-dimensional design model in the corresponding curved surface coordinate graph;
and calculating the first texture coordinates of each pixel in the complete coordinate graph in the three-dimensional design model according to the initial row and column positions, the pixel positions, the height of the curved surface coordinate graph and the height and width of the complete coordinate graph.
Optionally, the start row and column position includes a start row value and a start column value, the pixel position includes an abscissa value and an ordinate value, and the first texture coordinate includes a first coordinate value in a horizontal direction and a second coordinate value in a vertical direction.
Optionally, the calculating the first texture coordinate of each pixel in the complete coordinate graph in the three-dimensional design model according to the starting row and column position, the pixel position, the height of the curved coordinate graph, and the height and width of the complete coordinate graph includes:
calculating to obtain the first coordinate value according to the abscissa value, the initial column value and the width of the complete coordinate graph;
and calculating to obtain the second coordinate value according to the ordinate value, the height of the curved surface coordinate graph, the initial row value and the height of the complete coordinate graph.
Optionally, the coloring the three-dimensional design model based on the preset color range, the first texture coordinates and the first distance map, and generating a three-dimensional comparison model of the first target object includes:
taking the acquired gradient color array as one-dimensional texture data, and taking the first distance map and the first texture coordinates as two-dimensional texture data;
and coloring the three-dimensional design model by using a shader based on the one-dimensional texture data, the two-dimensional texture data and a preset color range, and generating a comparison model of the first target object.
Optionally, after the generating the three-dimensional comparison model of the first target object, the method further includes:
carrying out scanning reconstruction on a second target object to obtain a second three-dimensional scanning model, wherein the second target object is a physical object produced based on the three-dimensional design model other than the first target object;
calculating the distance from the complete coordinate graph to the second three-dimensional scanning model, and generating a second distance graph;
and re-coloring the three-dimensional design model based on the preset color range, the complete coordinate graph and the second distance graph to generate a three-dimensional comparison model of the second target object.
In a second aspect, embodiments of the present disclosure provide a three-dimensional model comparison apparatus, including:
an acquisition unit configured to acquire a three-dimensional design model of a first target object;
the first calculation unit is used for calculating a complete coordinate graph of the three-dimensional design model according to a preset global resolution;
the second calculation unit is used for carrying out scanning reconstruction on the first target object to obtain a first three-dimensional scanning model, calculating the distance from the complete coordinate graph to the first three-dimensional scanning model and generating a first distance graph;
and the generating unit is used for coloring the three-dimensional design model based on a preset color range, the complete coordinate graph and the first distance graph, and generating a three-dimensional comparison model of the first target object.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including:
a memory;
a processor; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor to implement the three-dimensional model comparison method described above.
In a fourth aspect, embodiments of the present disclosure provide a computer readable storage medium having a computer program stored thereon, wherein the computer program when executed by a processor implements the steps of the three-dimensional model comparison method described above.
The embodiment of the disclosure provides a three-dimensional model comparison method, which comprises the following steps: acquiring a three-dimensional design model of a first target object; calculating a complete coordinate graph of the three-dimensional design model according to a preset global resolution; carrying out scanning reconstruction on a first target object to obtain a first three-dimensional scanning model, calculating the distance from the complete coordinate graph to the first three-dimensional scanning model, and generating a first distance graph; and coloring the three-dimensional design model based on the preset color range, the complete coordinate graph and the first distance graph, and generating a three-dimensional comparison model of the first target object. According to the method, 3D chromatographic comparison is achieved by calculating the distance between the coordinate graph of the three-dimensional design model and the scanning model to generate the distance graph, the calculation complexity is reduced, the rendering effect of the model is improved, and the color difference transition effect is better.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
In order to more clearly illustrate the embodiments of the present disclosure or the solutions in the prior art, the drawings that are required for the description of the embodiments or the prior art will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic flow chart of a three-dimensional model comparison method according to an embodiment of the disclosure;
FIG. 2 is a schematic diagram of a refinement flow of S140 in the three-dimensional model comparison method shown in FIG. 1;
fig. 3 is a schematic structural diagram of a three-dimensional model comparing device according to an embodiment of the disclosure;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, a further description of aspects of the present disclosure will be provided below. It should be noted that, without conflict, the embodiments of the present disclosure and features in the embodiments may be combined with each other.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure, but the present disclosure may be practiced otherwise than as described herein; it will be apparent that the embodiments in the specification are only some, but not all, embodiments of the disclosure.
3D chromatograms are a common technique for comparing differences between samples, especially when the difference between a sample to be measured and its standard sample needs to be compared. In industrial manufacturing, the standard sample is usually a CAD (Computer Aided Design) digital model, and the sample to be measured is an industrial object produced according to that CAD model. More often, the industrial object is scanned to generate a grid model (namely, a three-dimensional scanning model), which is compared with the discretized grid model of the CAD model (namely, the three-dimensional design model). On the premise that the two models are aligned, the distance between a point on the CAD model and the corresponding point on the scanning model is expressed as a color (the colors come from a set of continuously graded color bands, with different colors representing different distances), and the colors cover the whole CAD model, so that the similarities and differences between it and the scanning model can be perceived through the color variation on the CAD model.
Currently, there are two main existing techniques for implementing 3D chromatograms. 1) The first method calculates the distance between every vertex on one grid model and the corresponding point on the other grid model, passes the distance value into a shader as an attribute of the corresponding vertex, converts the distance value into a texture coordinate, samples the color represented by that distance from a one-dimensional texture representing the color band, uses that color as the color of the corresponding vertex, and relies on the Open Graphics Library (OpenGL) to automatically interpolate the interior of each triangle, thereby realizing chromatographic rendering of the whole model. 2) The second method calculates a texture map of the model through uv expansion (texture unfolding) of the model; the effective pixels on the texture map are the colors represented by the distances between points on this model and the other model.
The first method mainly assigns a color to each triangle vertex in the model and then calculates the colors inside each triangle using OpenGL's automatic triangle interpolation. However, when the model is a discretized grid model of a CAD digital model, the discretized triangles are kept sparse for loading efficiency while preserving accuracy; for example, a rectangular plane can be formed by only two triangles. The grid model therefore contains a large number of elongated, uneven triangles, the color values represented by the three vertices of a triangle may differ greatly, the internal interpolation cannot reflect the real difference between the triangle and the scanning model, and the visual color transition on the CAD digital model is discontinuous.
The second method uses a texture map to represent the differences between the CAD digital model and the scanned model. Compared with the first method, the color differences shown on the model transition more continuously and can represent the true distance difference at every location on the model. However, calculating the overall texture expansion (uv expansion) of a CAD digital model is complicated: if the texture expansion of each curved surface of the CAD digital model is calculated separately to generate a texture map, each curved surface needs to be drawn separately and given its own texture, the workload is huge, and the overall rendering efficiency of the CAD digital model is also greatly reduced; moreover, when the 3D chromatogram is updated in real time, a large amount of data has to be recalculated, which is time-consuming.
To address these technical problems, the embodiments of the present disclosure provide a three-dimensional model comparison method applied to 3D chromatogram drawing scenarios in which comparison differences are displayed on a CAD digital model. A complete coordinate graph is generated for the CAD digital model, the shortest distance from each pixel in the coordinate graph to the scanning model is calculated to generate a distance graph, and the color of each fragment is calculated in a shader from the coordinate graph and the distance graph, so that the real differences between the two models are displayed effectively and accurately on the CAD digital model, the overall rendering efficiency of the CAD digital model is improved, and the color transition effect is better. This is described in detail through one or more of the following embodiments.
The three-dimensional model comparison method may be performed by a terminal or a server. In one possible application scenario, a terminal or a server directly acquires the three-dimensional design model and the three-dimensional scanning model and displays the comparison difference on the three-dimensional design model. In another possible application scenario, a terminal scans the target object to obtain the three-dimensional scanning model, and a server acquires the three-dimensional scanning model from the terminal and displays the comparison difference on the three-dimensional design model. The specific application scenario is not limited here.
Fig. 1 is a flow chart of a three-dimensional model comparison method provided in an embodiment of the present disclosure. The method is applied to a terminal that automatically scans a target object, performs the difference comparison, and displays the real differences on the three-dimensional design model; it specifically includes steps S110 to S140 shown in fig. 1:
s110, acquiring a three-dimensional design model of the first target object.
Wherein the first target object may be a real object produced based on the three-dimensional design model.
It is understood that a three-dimensional design model of the first target object is obtained. The three-dimensional design model can be a discretized grid model of a CAD digital model, and the first target object is an industrial object produced according to that CAD digital model. In industrial production, a batch of industrial objects can be produced according to the same CAD digital model, and the first target object is one object in that batch.
And S120, calculating a complete coordinate graph of the three-dimensional design model according to a preset global resolution.
It can be appreciated that, following S110, the global resolution is preset and can be customized according to the user's requirements. A complete coordinate graph of the three-dimensional design model is then calculated according to the global resolution. Each effective pixel on the complete coordinate graph represents a coordinate, which can be understood as a vertex coordinate describing the shape and position of an object, for example the shape of a curved surface and its position in the three-dimensional design model; the coordinate represented by a pixel can be an xyz three-dimensional coordinate stored in the rgb three-color channels of the complete coordinate graph.
Optionally, in S120, the calculating a complete coordinate graph of the three-dimensional design model according to the preset global resolution is specifically implemented by the following steps:
calculating the resolution of each curved surface of the three-dimensional design model in a preset direction according to the preset global resolution; sampling each curved surface based on the resolution of each curved surface in a preset direction and a preset coordinate range, and generating a curved surface coordinate graph of each curved surface; and combining the curved surface coordinate graphs of each curved surface to obtain a complete coordinate graph of the three-dimensional design model.
Wherein the preset direction includes a vertical direction (u-direction) and a horizontal direction (v-direction).
It can be appreciated that the resolution of each curved surface of the three-dimensional design model in the u direction and the v direction is calculated according to the preset global resolution. Each curved surface is then uniformly sampled based on its resolution in the uv directions and a preset coordinate range, generating an image that records the coordinates of the curved surface, denoted a curved surface coordinate graph; the three-dimensional design model consists of a plurality of curved surfaces, and each curved surface has a curved surface coordinate graph recording its coordinates. The curved surface coordinate graphs of all the curved surfaces are then combined into a complete coordinate graph, denoted img_c; the combination can splice all the curved surface coordinate graphs in a non-overlapping way, for example, 10 curved surface coordinate graphs can be spliced into a complete coordinate graph consisting of 2 rows and 5 columns of curved surface coordinate graphs, that is, 5 curved surface coordinate graphs per row, as illustrated in the sketch below.
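A minimal sketch of S120, for illustration only (the function and variable names are hypothetical, and the patent's exact sampling and splicing rules may differ): each parametric surface is uniformly sampled on a uv grid derived from the preset global resolution, the sampled xyz coordinates are stored as a per-surface float image, and the per-surface coordinate graphs are spliced without overlap into one complete coordinate graph while the start row and column of each block are recorded.

```python
import numpy as np

def surface_coordinate_graph(surface_fn, res_u, res_v, u_range=(0.0, 1.0), v_range=(0.0, 1.0)):
    """Uniformly sample one parametric surface; each pixel stores an (x, y, z) coordinate."""
    u = np.linspace(u_range[0], u_range[1], res_u)
    v = np.linspace(v_range[0], v_range[1], res_v)
    uu, vv = np.meshgrid(u, v)              # rows follow v, columns follow u
    return surface_fn(uu, vv)               # shape (res_v, res_u, 3)

def merge_coordinate_graphs(graphs, cols_per_row=5):
    """Splice per-surface coordinate graphs, without overlap, into one complete coordinate graph.

    Returns the complete graph (invalid pixels are NaN) and the start row/column,
    in pixels, of each surface graph inside the complete graph."""
    img_h = max(g.shape[0] for g in graphs)
    img_w = max(g.shape[1] for g in graphs)
    n_rows = -(-len(graphs) // cols_per_row)            # ceiling division
    complete = np.full((n_rows * img_h, cols_per_row * img_w, 3), np.nan, dtype=np.float32)
    start_positions = []
    for i, g in enumerate(graphs):
        r0, c0 = (i // cols_per_row) * img_h, (i % cols_per_row) * img_w
        complete[r0:r0 + g.shape[0], c0:c0 + g.shape[1]] = g
        start_positions.append((r0, c0))
    return complete, start_positions

# Two hypothetical parametric patches standing in for the curved surfaces of a design model.
plane = lambda u, v: np.stack([u, v, np.zeros_like(u)], axis=-1)
dome = lambda u, v: np.stack([u, v, 0.2 * np.sin(np.pi * u) * np.sin(np.pi * v)], axis=-1)
global_res = 64                              # preset global resolution (assumed: samples per direction)
graphs = [surface_coordinate_graph(f, global_res, global_res) for f in (plane, dome)]
complete_graph, start_positions = merge_coordinate_graphs(graphs)
```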
S130, carrying out scanning reconstruction on the first target object to obtain a first three-dimensional scanning model, calculating the distance from the complete coordinate graph to the first three-dimensional scanning model, and generating a first distance graph.
It can be understood that, following S120, a scanning device is used to scan and reconstruct the first target object, so as to obtain a scanning model of the first target object, denoted the first three-dimensional scanning model. The shortest distance between each effective pixel in the complete coordinate graph and the first three-dimensional scanning model is then calculated, and a first distance graph recording the distance values is generated; each pixel value in the complete coordinate graph represents a three-dimensional coordinate value of the three-dimensional design model, whereas each pixel value in the first distance graph represents a distance value.
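One possible CPU realization of S130, for illustration: treat the reconstructed scanning model as a point set and, for every valid pixel of the complete coordinate graph, query the distance to the nearest scan point with a KD-tree. The patent computes the shortest distance to the scanning model, which may be point-to-triangle rather than the point-to-point simplification assumed here; all names below are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree

def distance_map(coord_graph, scan_points):
    """Shortest distance from every valid pixel of the coordinate graph to the scan model.

    coord_graph: (H, W, 3) array of xyz coordinates with NaN marking invalid pixels.
    scan_points: (N, 3) array of vertices of the reconstructed scanning model.
    Returns an (H, W) float array of distances; invalid pixels stay NaN."""
    h, w, _ = coord_graph.shape
    dist = np.full((h, w), np.nan, dtype=np.float32)
    valid = ~np.isnan(coord_graph[..., 0])
    d, _ = cKDTree(scan_points).query(coord_graph[valid])    # nearest-neighbour distances
    dist[valid] = d
    return dist

# Hypothetical usage with synthetic data standing in for the real models.
rng = np.random.default_rng(0)
coord_graph = rng.uniform(0.0, 1.0, size=(64, 320, 3)).astype(np.float32)
scan_points = rng.uniform(0.0, 1.0, size=(10_000, 3))
first_distance_map = distance_map(coord_graph, scan_points)
```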
And S140, coloring the three-dimensional design model based on a preset color range, the complete coordinate graph and the first distance graph, and generating a three-dimensional comparison model of the first target object.
It will be appreciated that, following S130, the three-dimensional design model is colored according to the preset color range, the complete coordinate graph and the first distance graph; for example, fragment coloring is performed with a shader to implement 3D chromatogram rendering, so as to generate the three-dimensional comparison model of the first target object, that is, the comparison difference between the scanning model of the first target object and the CAD digital model is displayed on the CAD digital model. A shader is an editable program used to implement image rendering in place of the fixed rendering pipeline; shaders mainly include vertex shaders (Vertex Shader) and pixel shaders (Pixel Shader), and the user can select the required shader as needed.
Optionally, after generating the three-dimensional comparison model of the first target object, the method further includes:
carrying out scanning reconstruction on a second target object to obtain a second three-dimensional scanning model; calculating the distance from the complete coordinate graph to the second three-dimensional scanning model, and generating a second distance graph; and re-coloring the three-dimensional design model based on the preset color range, the complete coordinate graph and the second distance graph to generate a three-dimensional comparison model of the second target object.
Wherein the second target object may be a physical object produced based on the three-dimensional design model, other than the first target object.
It can be understood that, after the three-dimensional design model has been chromatogram-rendered to obtain a three-dimensional comparison model, the chromatogram rendering effect can be updated. If the three-dimensional scanning model is unchanged, that is, the same scanning model of the same target object is used, the chromatogram can be changed simply by changing the preset color range (color band). If the three-dimensional scanning model is replaced, it may be because the target object has changed and the difference between a second target object and the three-dimensional design model needs to be compared (the second target object being another industrial object produced according to the three-dimensional design model, possibly from the same batch as the first target object), or because the first target object has been scanned again to obtain an updated three-dimensional scanning model. In the replaced-scanning-model case, only the distance map needs to be recalculated from the complete coordinate graph and the replaced three-dimensional scanning model: the shortest distance from each effective pixel in the complete coordinate graph to the second three-dimensional scanning model is calculated to generate a second distance graph, the second distance graph is input into the shader as two-dimensional texture data while the remaining data stay unchanged, and the three-dimensional design model is fragment-colored again to generate the three-dimensional comparison model of the second target object. This update logic is sketched below.
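The incremental update described above can be condensed into a small helper. This is only an illustrative sketch based on the patent's description; the function name updated_shader_textures, the parameter names, and the callable compute_distance_map (any routine with the same contract as the distance_map() sketch earlier) are hypothetical.

```python
def updated_shader_textures(complete_graph, cached_distance_map, compute_distance_map,
                            new_color_band=None, new_scan_points=None):
    """Return the (1D color texture, 2D distance texture) pair needed to re-render.

    The complete coordinate graph is cached and never recomputed here; only the
    texture that actually changed is rebuilt (hypothetical helper, illustration only)."""
    color_texture = new_color_band          # None means: keep the current color band
    distance_texture = cached_distance_map
    if new_scan_points is not None:         # scanning model replaced: rebuild only the distance map
        distance_texture = compute_distance_map(complete_graph, new_scan_points)
    return color_texture, distance_texture
```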
According to the three-dimensional model comparison method provided by the embodiments of the present disclosure, a complete coordinate graph is generated for the three-dimensional design model, the distance graph between the two models is calculated, and fragment coloring is performed in the shader, so that 3D chromatogram drawing displays the real comparison differences on the CAD digital model, the overall rendering efficiency of the digital model is improved, and the visual color differences are continuous. When the chromatogram is updated, if the two compared models are unchanged, the chromatogram drawing can be updated in real time simply by updating the color range; when the CAD digital model is unchanged and the scanning model changes, the chromatogram drawing can be updated simply by updating the distance graph.
On the basis of the foregoing embodiment, fig. 2 is a schematic diagram of a refinement flow of S140 in the three-dimensional model comparison method shown in fig. 1, and optionally, coloring the three-dimensional design model based on a preset color range, the complete coordinate graph and the first distance graph to generate a three-dimensional comparison model of the first target object, which specifically includes steps S210 to S230 shown in fig. 2:
S210, determining the initial row and column positions of the curved surface coordinate graph of each curved surface of the three-dimensional design model in the complete coordinate graph.
It can be understood that, after the curved surface coordinate graph of each curved surface of the three-dimensional design model has been calculated, the initial row and column position of each curved surface coordinate graph in the complete coordinate graph is recorded and denoted as (row, col). For example, taking a complete coordinate graph combined from 2 rows and 5 columns of curved surface coordinate graphs, the initial row and column position of curved surface coordinate graph 1 of curved surface 1 in the complete coordinate graph is (1, 2), that is, curved surface coordinate graph 1 occupies the 1st row and 2nd column of the complete coordinate graph.
S220, calculating first texture coordinates of each pixel in the complete coordinate graph in the three-dimensional design model according to the initial row and column positions, the height of the curved surface coordinate graph and the height and width of the complete coordinate graph.
It can be understood that, following S210, for each curved surface, the first texture coordinates of all pixels of the complete coordinate graph that fall within the range of that curved surface's coordinate graph are calculated according to the curved surface coordinate graph, its initial row and column position, and the width and height of the complete coordinate graph. After all curved surfaces have been processed, the first texture coordinates of all pixels of the complete coordinate graph that correspond to the three-dimensional design model are obtained, that is, every pixel in the complete coordinate graph has a corresponding first texture coordinate. Texture coordinates describe the position of a texture on an object's surface and are generally expressed in the form (u, v), where u represents the coordinate in the horizontal direction and v the coordinate in the vertical direction. Texture coordinates map a texture image onto the object surface so that the surface has richer detail and realism; by defining a texture coordinate for each pixel (each of which represents one coordinate), a better rendering effect can be obtained on the object surface.
Optionally, in S220, according to the initial row and column position, the height of the curved coordinate graph, and the height and width of the complete coordinate graph, a first texture coordinate of each pixel in the complete coordinate graph in the three-dimensional design model is calculated, which is specifically implemented by the following steps:
calculating pixel positions of second texture coordinates of grids in each curved surface of the three-dimensional design model in the corresponding curved surface coordinate graph; and calculating the first texture coordinates of each pixel in the complete coordinate graph in the three-dimensional design model according to the initial row and column positions, the pixel positions, the height of the curved surface coordinate graph and the height and width of the complete coordinate graph.
It can be appreciated that the second texture coordinates of the meshes in each curved surface of the three-dimensional design model are determined. The three-dimensional design model is a discretized mesh model of a CAD digital model; it consists of a plurality of curved surfaces, each of which includes a plurality of triangular meshes. The second texture coordinate of each mesh in the three-dimensional design model is calculated, and the pixel position of each second texture coordinate in the corresponding curved surface coordinate graph is determined; each mesh has one second texture coordinate, each second texture coordinate has a corresponding pixel, and each pixel has a pixel position in the curved surface coordinate graph. Then, for each curved surface, the first texture coordinates of all pixels within the range of that curved surface coordinate graph in the complete coordinate graph are calculated according to the height and the initial row and column position of the curved surface coordinate graph, all pixel positions belonging to that curved surface, and the height and width of the complete coordinate graph; in this way the first texture coordinates of all pixels in the complete coordinate graph are obtained.
The initial row and column positions comprise initial row values and initial column values, the pixel positions comprise horizontal coordinate values and vertical coordinate values, and the first texture coordinates comprise first coordinate values in the horizontal direction and second coordinate values in the vertical direction.
Optionally, calculating the first texture coordinate of each pixel in the complete coordinate graph in the three-dimensional design model according to the initial row and column position, the pixel position, the height of the curved coordinate graph, and the height and width of the complete coordinate graph includes:
calculating to obtain the first coordinate value according to the abscissa value, the initial column value and the width of the complete coordinate graph; and calculating to obtain the second coordinate value according to the ordinate value, the height of the curved surface coordinate graph, the initial row value and the height of the complete coordinate graph.
It can be understood that the first coordinate value of a pixel in the horizontal direction is calculated from the abscissa value of its pixel position, the initial column value of the curved surface coordinate graph where the pixel is located, and the width of the complete coordinate graph; the second coordinate value of the pixel in the vertical direction is calculated from the ordinate value of its pixel position, the height of the curved surface coordinate graph where the pixel is located, the initial row value, and the height of the complete coordinate graph. The first coordinate value and the second coordinate value form the first texture coordinate of the pixel, which is calculated as shown in the following formulas (1) and (2):
u = (x + col) / w        Formula (1)
v = (img_h - y + row) / h        Formula (2)
In formulas (1) and (2), u denotes the first coordinate value (texture coordinate u value) in the horizontal direction, v denotes the second coordinate value (texture coordinate v value) in the vertical direction, (x, y) denotes the pixel position, (row, col) denotes the start row and column position, where row is the start row value and col is the start column value, img_h is the height of the curved surface coordinate graph, and w and h are the width and height of the complete coordinate graph.
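For illustration, the sketch below evaluates formulas (1) and (2) for one pixel. Since the extracted patent text only lists which quantities enter each formula, the exact expressions used here, u = (x + col) / w and v = (img_h - y + row) / h with the vertical term flipping the image row axis (rows grow downwards while the texture v axis grows upwards), are an assumption consistent with those quantities rather than a verbatim reproduction of the patent's formulas; (row, col) is likewise assumed to be measured in pixels.

```python
def first_texture_coord(x, y, row, col, img_h, w, h):
    """Texture coordinate (u, v) of pixel (x, y) of one curved surface coordinate graph.

    (row, col): start row/column, in pixels, of this surface graph inside the complete graph.
    img_h:      height of this surface coordinate graph.
    (w, h):     width and height of the complete coordinate graph.
    NOTE: the exact formulas are an assumption reconstructed from the listed quantities."""
    u = (x + col) / w            # horizontal: shift by the start column, normalize by the total width
    v = (img_h - y + row) / h    # vertical: flip the image row axis, shift by the start row, normalize
    return u, v

# Example: a 64-pixel-high surface graph whose block starts at pixel row 64, column 128
# of a 320 x 128 (width x height) complete coordinate graph.
u, v = first_texture_coord(x=10, y=5, row=64, col=128, img_h=64, w=320, h=128)
```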
And S230, coloring the three-dimensional design model based on a preset color range, the first texture coordinates and the first distance map, and generating a three-dimensional comparison model of the first target object.
Optionally, in S230, the three-dimensional design model is colored based on the preset color range, the first texture coordinate and the first distance map, so as to generate a three-dimensional comparison model of the first target object, which is specifically implemented by the following steps:
taking the acquired gradient color array as one-dimensional texture data, and taking the first distance map and the first texture coordinates as two-dimensional texture data; and coloring the three-dimensional design model by using a shader based on the one-dimensional texture data, the two-dimensional texture data and a preset color range, and generating a comparison model of the first target object.
It can be understood that, following S220, a set of gradient color arrays is used as one-dimensional texture data and the first distance map is used as two-dimensional texture data. The preset color range and the texture data (one-dimensional and two-dimensional) are input into a shader, and the shader performs fragment coloring on the three-dimensional design model according to the preset color range, thereby implementing 3D chromatogram rendering. The gradient color array and the distance values have a correspondence; for example, the first distance value in the distance map corresponds to the first color value in the array, and the shader renders the three-dimensional point corresponding to that distance value in the three-dimensional design model with that color value. A CPU-side reference for this fragment coloring is sketched below.
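As a CPU-side reference for what the fragment shader does (illustration only; the patent performs this step in a shader, and the nearest-neighbour texture lookup, the v-axis orientation, and the linear mapping into the color band below are assumptions): each fragment samples its distance from the two-dimensional texture data at its first texture coordinate, normalizes the distance into the preset color range, and looks the result up in the one-dimensional gradient color texture.

```python
import numpy as np

def shade_fragment(u, v, distance_texture, color_band, d_min, d_max):
    """CPU reference of the per-fragment coloring (illustrative stand-in for the shader).

    distance_texture: (H, W) distance map sampled at texture coordinate (u, v).
    color_band:       (K, 3) gradient color array used as a one-dimensional texture.
    (d_min, d_max):   preset color range into which distances are clamped."""
    h, w = distance_texture.shape
    px = min(int(u * w), w - 1)                 # nearest-neighbour 2D texture lookup
    py = min(int((1.0 - v) * h), h - 1)         # v measured upwards, image rows downwards (assumption)
    d = distance_texture[py, px]
    t = np.clip((d - d_min) / (d_max - d_min), 0.0, 1.0)
    idx = int(t * (len(color_band) - 1) + 0.5)  # 1D color-band lookup
    return color_band[idx]

# Hypothetical data: a blue-to-red band and a synthetic distance map.
band = np.stack([np.linspace(0, 1, 256), np.zeros(256), np.linspace(1, 0, 256)], axis=-1)
dist = np.abs(np.random.default_rng(0).normal(0.0, 0.5, size=(128, 320))).astype(np.float32)
rgb = shade_fragment(u=0.25, v=0.5, distance_texture=dist, color_band=band, d_min=0.0, d_max=1.0)
```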
According to the three-dimensional model comparison method, the texture coordinates of the pixels in the complete coordinate graph are calculated, which facilitates accurate rendering of the model according to the distance graph generated from the complete coordinate graph and the scanning model and reflects the real differences between the compared models.
Fig. 3 is a schematic structural diagram of a three-dimensional model comparing device according to an embodiment of the disclosure. The three-dimensional model comparing apparatus provided in the embodiments of the present disclosure may perform a process flow provided in the embodiments of the three-dimensional model comparing method, as shown in fig. 3, the three-dimensional model comparing apparatus 300 includes an obtaining unit 310, a first calculating unit 320, a second calculating unit 330, and a generating unit 340, where:
An acquisition unit 310 for acquiring a three-dimensional design model of the first target object;
a first calculating unit 320, configured to calculate a complete coordinate graph of the three-dimensional design model according to a preset global resolution;
the second calculating unit 330 is configured to perform scan reconstruction on the first target object to obtain a first three-dimensional scan model, calculate a distance from the complete coordinate graph to the first three-dimensional scan model, and generate a first distance graph;
the generating unit 340 is configured to color the three-dimensional design model based on a preset color range, the complete coordinate graph and the first distance graph, and generate a three-dimensional comparison model of the first target object.
Optionally, the first computing unit 320 is configured to:
calculating the resolution of each curved surface of the three-dimensional design model in a preset direction according to the preset global resolution;
sampling each curved surface based on the resolution of each curved surface in a preset direction and a preset coordinate range, and generating a curved surface coordinate graph of each curved surface;
and combining the curved surface coordinate graphs of each curved surface to obtain a complete coordinate graph of the three-dimensional design model.
Optionally, the generating unit 340 is configured to:
determining the initial row and column positions of the curved surface coordinate graph of each curved surface of the three-dimensional design model in the complete coordinate graph;
calculating a first texture coordinate of each pixel in the complete coordinate graph in the three-dimensional design model according to the initial row and column positions, the height of the curved surface coordinate graph and the height and width of the complete coordinate graph;
and coloring the three-dimensional design model based on a preset color range, the first texture coordinates and the first distance map, and generating a three-dimensional comparison model of the first target object.
Optionally, the generating unit 340 is configured to:
calculating pixel positions of second texture coordinates of grids in each curved surface of the three-dimensional design model in the corresponding curved surface coordinate graph;
and calculating the first texture coordinates of each pixel in the complete coordinate graph in the three-dimensional design model according to the initial row and column positions, the pixel positions, the height of the curved surface coordinate graph and the height and width of the complete coordinate graph.
Wherein the start row and column position in the apparatus 300 includes a start row value and a start column value, the pixel position includes an abscissa value and an ordinate value, and the first texture coordinate includes a first coordinate value in a horizontal direction and a second coordinate value in a vertical direction.
Optionally, the generating unit 340 is configured to:
calculating to obtain the first coordinate value according to the abscissa value, the initial column value and the width of the complete coordinate graph;
and calculating to obtain the second coordinate value according to the ordinate value, the height of the curved surface coordinate graph, the initial row value and the height of the complete coordinate graph.
Optionally, the generating unit 340 is configured to:
taking the acquired gradient color array as one-dimensional texture data, and taking the first distance map and the first texture coordinates as two-dimensional texture data;
and coloring the three-dimensional design model by using a shader based on the one-dimensional texture data, the two-dimensional texture data and a preset color range, and generating a comparison model of the first target object.
Optionally, the apparatus 300 is further configured to:
carrying out scanning reconstruction on a second target object to obtain a second three-dimensional scanning model;
calculating the distance from the complete coordinate graph to the second three-dimensional scanning model, and generating a second distance graph;
and re-coloring the three-dimensional design model based on the preset color range, the complete coordinate graph and the second distance graph to generate a three-dimensional comparison model of the second target object.
The three-dimensional model comparing device in the embodiment shown in fig. 3 may be used to implement the technical solution of the above method embodiment, and its implementation principle and technical effects are similar, and will not be described herein again.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure. Referring now in particular to fig. 4, a schematic diagram of an electronic device 400 suitable for use in implementing embodiments of the present disclosure is shown. The electronic device 400 in the embodiments of the present disclosure may include, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), wearable electronic devices, and the like, and fixed terminals such as digital TVs, desktop computers, smart home devices, and the like. The electronic device shown in fig. 4 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 4, the electronic device 400 may include a processing means (e.g., a central processor, a graphics processor, etc.) 401, which may perform various suitable actions and processes according to a program stored in a Read Only Memory (ROM) 402 or a program loaded from a storage means 408 into a Random Access Memory (RAM) 403 to implement a three-dimensional model comparison method of an embodiment as described in the present disclosure. In the RAM 403, various programs and data necessary for the operation of the electronic device 400 are also stored. The processing device 401, the ROM 402, and the RAM 403 are connected to each other by a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
In general, the following devices may be connected to the I/O interface 405: input devices 406 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 407 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 408 including, for example, magnetic tape, hard disk, etc.; and a communication device 409. The communication means 409 may allow the electronic device 400 to communicate with other devices wirelessly or by wire to exchange data. While fig. 4 shows an electronic device 400 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts, thereby implementing the three-dimensional model comparison method as described above. In such an embodiment, the computer program may be downloaded and installed from a network via communications device 409, or from storage 408, or from ROM 402. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 401.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients, servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol ), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), the internet (e.g., the internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
Alternatively, the electronic device may perform other steps described in the above embodiments when the above one or more programs are executed by the electronic device.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, an object oriented programming language such as Java, smalltalk, C ++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It should be noted that in this document, relational terms such as "first" and "second" and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or gateway that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or gateway. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or gateway comprising the element.
The foregoing describes merely specific embodiments of the disclosure, provided to enable one skilled in the art to understand or practice the disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown and described herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (8)

1. A method for comparing three-dimensional models, comprising:
acquiring a three-dimensional design model of a first target object;
calculating a complete coordinate graph of the three-dimensional design model according to a preset global resolution, wherein each effective pixel on the complete coordinate graph represents a three-dimensional coordinate, and the three-dimensional coordinate is used for describing the shape of a curved surface and the position of the curved surface in the three-dimensional design model;
carrying out scanning reconstruction on the first target object to obtain a first three-dimensional scanning model, calculating the distance from the complete coordinate graph to the first three-dimensional scanning model, and generating a first distance graph;
coloring the three-dimensional design model based on a preset color range, the complete coordinate graph and the first distance graph to generate a three-dimensional comparison model of the first target object;
the coloring the three-dimensional design model based on the preset color range, the complete coordinate graph and the first distance graph, and generating a three-dimensional comparison model of the first target object comprises the following steps:
determining the initial row and column positions of the curved surface coordinate graph of each curved surface of the three-dimensional design model in the complete coordinate graph; calculating a first texture coordinate of each pixel in the complete coordinate graph in the three-dimensional design model according to the initial row and column positions, the height of the curved surface coordinate graph and the height and width of the complete coordinate graph; coloring the three-dimensional design model based on a preset color range, the first texture coordinates and the first distance map, and generating a three-dimensional comparison model of the first target object;
the calculating, according to the initial row and column positions, the height of the curved surface coordinate graph, and the height and width of the complete coordinate graph, a first texture coordinate of each pixel in the complete coordinate graph in the three-dimensional design model comprises:
calculating pixel positions of second texture coordinates of grids in each curved surface of the three-dimensional design model in the corresponding curved surface coordinate graph; and calculating the first texture coordinates of each pixel in the complete coordinate graph in the three-dimensional design model according to the initial row and column positions, the pixel positions, the height of the curved surface coordinate graph and the height and width of the complete coordinate graph.
2. The method of claim 1, wherein said calculating a complete graph of the three-dimensional design model from a preset global resolution comprises:
calculating the resolution of each curved surface of the three-dimensional design model in a preset direction according to the preset global resolution;
sampling each curved surface based on the resolution of each curved surface in a preset direction and a preset coordinate range, and generating a curved surface coordinate graph of each curved surface;
and combining the curved surface coordinate graphs of each curved surface to obtain a complete coordinate graph of the three-dimensional design model.
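One way the per-surface sampling and chart merging of claim 2 might be organised is sketched below. Several points are assumptions rather than claim language: each curved surface is exposed as a parametric evaluator over its preset (u, v) coordinate range, the per-surface resolution in the preset direction is derived from the global resolution and the surface's extent in that direction, and the per-surface coordinate graphs share a common width and are packed row-wise.

    import numpy as np

    def build_complete_coordinate_graph(surfaces, global_resolution, width):
        # surfaces: list of (evaluate, (u_range, v_range), extent_v) tuples, where
        #   evaluate(u, v) returns the (x, y, z) point of the curved surface,
        #   (u_range, v_range) is the preset coordinate range, and
        #   extent_v is the surface's length along the preset direction.
        charts, start_rows, row = [], [], 0
        for evaluate, (u_range, v_range), extent_v in surfaces:
            height = max(2, int(round(extent_v * global_resolution)))  # per-surface resolution
            us = np.linspace(u_range[0], u_range[1], width)
            vs = np.linspace(v_range[0], v_range[1], height)
            chart = np.array([[evaluate(u, v) for u in us] for v in vs],
                             dtype=np.float32)                         # (height, width, 3) chart
            charts.append(chart)
            start_rows.append(row)                                     # initial row of this chart
            row += height
        return np.concatenate(charts, axis=0), start_rows              # complete graph + start rows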
3. The method of claim 1, wherein the initial row and column positions comprise an initial row value and an initial column value, the pixel position comprises an abscissa value and an ordinate value, and the first texture coordinate comprises a first coordinate value in a horizontal direction and a second coordinate value in a vertical direction,
the calculating the first texture coordinates of each pixel in the complete coordinate graph in the three-dimensional design model according to the initial row and column positions, the pixel positions, the height of the curved surface coordinate graph and the height and width of the complete coordinate graph comprises the following steps:
calculating the first coordinate value according to the abscissa value, the initial column value and the width of the complete coordinate graph;
and calculating the second coordinate value according to the ordinate value, the height of the curved surface coordinate graph, the initial row value and the height of the complete coordinate graph.
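Claim 3 only fixes which quantities each texture coordinate value depends on, so the formula below is one plausible reading; the pixel-centre offset of 0.5 and the vertical flip are conventions assumed for this sketch, and first_texture_coordinates is a name introduced here.

    def first_texture_coordinates(x, y, start_row, start_col,
                                  chart_height, full_width, full_height):
        # x, y:              abscissa and ordinate of the pixel inside its surface chart
        # start_row/col:     initial row and column of that chart in the complete graph
        # chart_height:      height of the curved surface coordinate graph
        # full_width/height: width and height of the complete coordinate graph
        u = (start_col + x + 0.5) / full_width                           # first coordinate value
        v = (start_row + (chart_height - 1 - y) + 0.5) / full_height     # second value, v-flip assumed
        return u, v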
4. The method of claim 1, wherein the coloring the three-dimensional design model based on the preset color range, the first texture coordinates and the first distance map and generating a three-dimensional comparison model of the first target object comprises:
taking the acquired gradient color array as one-dimensional texture data, and taking the first distance map and the first texture coordinates as two-dimensional texture data;
and coloring the three-dimensional design model with a shader based on the one-dimensional texture data, the two-dimensional texture data and the preset color range, and generating a three-dimensional comparison model of the first target object.
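The shader pass of claim 4 can be mimicked on the CPU roughly as follows; the nearest-texel sampling, the linear mapping of distance into the preset color range and the treatment of unmapped pixels are all assumptions made only for this sketch, as is the shade_vertices name.

    import numpy as np

    def shade_vertices(distance_map, tex_coords, gradient_colors, color_range):
        # distance_map:    (H, W) first distance map, NaN for invalid pixels
        # tex_coords:      (N, 2) first texture coordinates of the model's vertices
        # gradient_colors: (K, 3) gradient color array used as one-dimensional texture data
        # color_range:     (lo, hi) preset color range in distance units
        h, w = distance_map.shape
        lo, hi = color_range
        cols = np.clip((tex_coords[:, 0] * w).astype(int), 0, w - 1)  # nearest-texel sampling
        rows = np.clip((tex_coords[:, 1] * h).astype(int), 0, h - 1)
        d = np.nan_to_num(distance_map[rows, cols], nan=lo)           # unmapped pixels -> low end
        t = np.clip((d - lo) / (hi - lo), 0.0, 1.0)                   # normalise into [0, 1]
        idx = np.round(t * (len(gradient_colors) - 1)).astype(int)    # index into the 1-D texture
        return gradient_colors[idx]                                   # per-vertex RGB colors

In a real renderer the same lookup would typically run in a fragment shader, with the gradient bound as a one-dimensional texture and the distance map as a two-dimensional texture.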
5. The method of claim 1, wherein after the generating the three-dimensional comparison model of the first target object, the method further comprises:
carrying out scanning reconstruction on a second target object to obtain a second three-dimensional scanning model;
calculating the distance from the complete coordinate graph to the second three-dimensional scanning model, and generating a second distance graph;
and re-coloring the three-dimensional design model based on the preset color range, the complete coordinate graph and the second distance graph to generate a three-dimensional comparison model of the second target object.
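In terms of the hypothetical helpers sketched after claims 1 and 4, claim 5 amounts to keeping the complete coordinate graph and the first texture coordinates and recomputing only the distance map against the second scan; a minimal composition, under the same assumptions, could be:

    def recolor_for_second_scan(coordinate_graph, valid_mask, tex_coords,
                                gradient_colors, color_range, second_scan_vertices):
        # Reuses first_distance_map and shade_vertices from the sketches above:
        # only the distance map changes, the geometry and texture coordinates do not.
        second_map = first_distance_map(coordinate_graph, valid_mask, second_scan_vertices)
        return shade_vertices(second_map, tex_coords, gradient_colors, color_range)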
6. A three-dimensional model comparison apparatus, comprising:
an acquisition unit configured to acquire a three-dimensional design model of a first target object;
the first calculation unit is used for calculating a complete coordinate graph of the three-dimensional design model according to a preset global resolution, each effective pixel on the complete coordinate graph represents a three-dimensional coordinate, and the three-dimensional coordinate is used for describing the shape of a curved surface and the position of the curved surface in the three-dimensional design model;
the second calculation unit is used for carrying out scanning reconstruction on the first target object to obtain a first three-dimensional scanning model, calculating the distance from the complete coordinate graph to the first three-dimensional scanning model and generating a first distance graph;
the generating unit is used for coloring the three-dimensional design model based on a preset color range, the complete coordinate graph and the first distance graph, and generating a three-dimensional comparison model of the first target object;
the generating unit is used for determining the initial row and column positions of the curved surface coordinate graph of each curved surface of the three-dimensional design model in the complete coordinate graph; calculating a first texture coordinate of each pixel in the complete coordinate graph in the three-dimensional design model according to the initial row and column positions, the height of the curved surface coordinate graph and the height and width of the complete coordinate graph; coloring the three-dimensional design model based on a preset color range, the first texture coordinates and the first distance map, and generating a three-dimensional comparison model of the first target object;
the generating unit is used for calculating the pixel position of the second texture coordinates of each grid in each curved surface of the three-dimensional design model in the corresponding curved surface coordinate graph; and calculating the first texture coordinates of each pixel in the complete coordinate graph in the three-dimensional design model according to the initial row and column positions, the pixel positions, the height of the curved surface coordinate graph and the height and width of the complete coordinate graph.
7. An electronic device, comprising:
a memory;
a processor; and
a computer program;
wherein the computer program is stored in the memory and configured to be executed by the processor to implement the three-dimensional model comparison method of any one of claims 1 to 5.
8. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the three-dimensional model comparison method according to any one of claims 1 to 5.
CN202311153320.6A 2023-09-08 2023-09-08 Three-dimensional model comparison method, device, equipment and storage medium Active CN116894933B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311153320.6A CN116894933B (en) 2023-09-08 2023-09-08 Three-dimensional model comparison method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311153320.6A CN116894933B (en) 2023-09-08 2023-09-08 Three-dimensional model comparison method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116894933A (en) 2023-10-17
CN116894933B (en) 2024-01-26

Family

ID=88311038

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311153320.6A Active CN116894933B (en) 2023-09-08 2023-09-08 Three-dimensional model comparison method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116894933B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112819559A (en) * 2019-11-18 2021-05-18 北京沃东天骏信息技术有限公司 Article comparison method and device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013074153A1 (en) * 2011-11-17 2013-05-23 University Of Southern California Generating three dimensional models from range sensor data
CN111814558A (en) * 2020-06-10 2020-10-23 深圳市美鸣齿科技术有限公司 Method, system, equipment and medium for measuring precision of 3D printing molded tooth model
CN114782607A (en) * 2021-01-06 2022-07-22 Arm有限公司 Graphics texture mapping
CN113413229A (en) * 2021-06-04 2021-09-21 昆明医科大学附属口腔医院 Digital method for evaluating quality of dental preparation
CN116295002A (en) * 2023-02-09 2023-06-23 陕西正诚路桥工程研究院有限公司 Bridge pier size deviation detection method, system, terminal and storage medium thereof

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"基于工件曲面形貌信息的数控机床误差分离研究";邓永红;《中国优秀硕士学位论文全文数据库 工程科技Ⅰ辑(月刊)》;第三,五章) *
Unsupervised analysis of a chromatographic signal based on an infinite Gaussian mixture model;O. Harant 等;《IEEE Xplore》;全文 *
基于激光扫描数据的三维可视化建模;路兴昌 等;《***仿真学报》(07);全文 *

Also Published As

Publication number Publication date
CN116894933A (en) 2023-10-17

Similar Documents

Publication Publication Date Title
US20230053462A1 (en) Image rendering method and apparatus, device, medium, and computer program product
US9495767B2 (en) Indexed uniform styles for stroke rendering
CN111260766A (en) Virtual light source processing method, device, medium and electronic equipment
US20070182762A1 (en) Real-time interactive rubber sheeting using dynamic delaunay triangulation
CN109544668B (en) Texture coordinate processing method, terminal device and computer readable storage medium
CN111803952A (en) Topographic map editing method and device, electronic equipment and computer readable medium
CN111127603B (en) Animation generation method and device, electronic equipment and computer readable storage medium
CN114742931A (en) Method and device for rendering image, electronic equipment and storage medium
CN116310036A (en) Scene rendering method, device, equipment, computer readable storage medium and product
CN114596399A (en) Image processing method and device and electronic equipment
CN113724331B (en) Video processing method, video processing apparatus, and non-transitory storage medium
CN114742934A (en) Image rendering method and device, readable medium and electronic equipment
WO2023193613A1 (en) Highlight shading method and apparatus, and medium and electronic device
CN111862342A (en) Texture processing method and device for augmented reality, electronic equipment and storage medium
CN116894933B (en) Three-dimensional model comparison method, device, equipment and storage medium
CN110378948B (en) 3D model reconstruction method and device and electronic equipment
CN113506356B (en) Method and device for drawing area map, readable medium and electronic equipment
CN114996374A (en) Online data visualization implementation method, system, device and medium
CN115019021A (en) Image processing method, device, equipment and storage medium
CN114723600A (en) Method, device, equipment, storage medium and program product for generating cosmetic special effect
CN114022601A (en) Volume element rendering method, device and equipment
CN110390717B (en) 3D model reconstruction method and device and electronic equipment
CN114020390A (en) BIM model display method and device, computer equipment and storage medium
CN113379814A (en) Three-dimensional space relation judgment method and device
CN117333560B (en) Scene-adaptive stripe structure optical decoding method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant