WO2022121653A1 - Transparency determination method and apparatus, electronic device, and storage medium

Transparency determination method and apparatus, electronic device, and storage medium

Info

Publication number
WO2022121653A1
Authority
WO
WIPO (PCT)
Prior art keywords
detection point
target detection
current
model
sub
Prior art date
Application number
PCT/CN2021/131498
Other languages
French (fr)
Chinese (zh)
Inventor
冯乐乐
Original Assignee
上海米哈游天命科技有限公司
Priority date
Filing date
Publication date
Application filed by 上海米哈游天命科技有限公司 filed Critical 上海米哈游天命科技有限公司
Publication of WO2022121653A1 publication Critical patent/WO2022121653A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation

Definitions

  • the embodiments of the present application relate to game development technologies, for example, to a method, apparatus, electronic device, and storage medium for determining transparency.
  • in game development, a translucent effect is usually set between an inner model and an outer model, for example the translucent display of a skin model under a clothes model.
  • the translucent display mainly depends on the effect that is seen after light reflected by the inner-layer object travels a certain distance, penetrates the outer-layer object and is incident on the human eye.
  • each model is composed of points. The farther a point on the inner-layer object is from the corresponding point on the outer-layer object, the weaker the translucent effect; conversely, the closer they are, the stronger the translucent effect.
  • in related technologies, a transparency display value of the outer-layer model is usually set, this value is usually fixed, and all points on the outer-layer model are displayed transparently according to the same set value. As a result, the transparent display deviates from the actual situation, the transparent display effect is poor, and the user experience is poor.
  • the embodiments of the present application provide a method, apparatus, electronic device, and storage medium for determining transparency, so that the transparent display effect is consistent with the actual situation, and the user experience is improved.
  • an embodiment of the present application provides a method for determining transparency, the method comprising:
  • for each target detection point on the first sub-model, determine the current coordinate information of the current target detection point, and determine a distance function of the current target detection point based on the current coordinate information; the current coordinate information is the coordinates obtained after matrix transformation of the three-dimensional space coordinates of the current target detection point, and the distance function is determined according to the relative distance information between the target detection point and the second sub-model in each direction; the first sub-model is a model wrapping the second sub-model;
  • the distance function of each target detection point is processed based on the preset spherical harmonic function to obtain the projection coefficient value of each target detection point;
  • the spherical harmonic function is composed of a plurality of basis functions;
  • for each target detection point, the projection coefficient value of the current target detection point is stored in a target storage position in the engine, so that when transparent display is detected, a target reconstruction function is reconstructed based on the projection coefficient values stored in the target storage position, target distance information between the first sub-model and the second sub-model under a target shooting angle is determined based on the target reconstruction function, a transparency parameter corresponding to the target distance information is determined, and each target detection point on the first sub-model is displayed based on the transparency parameter of each target detection point.
  • the embodiments of the present application also provide a method for determining transparency, the method comprising:
  • for each target detection point, target coordinate information is determined according to the current coordinate information of the current target detection point, and a target reconstruction function of the current target detection point is reconstructed according to the target coordinate information and the projection coefficient value of the current target detection point;
  • the current coordinate information is the coordinates obtained by performing matrix transformation on the three-dimensional space coordinates of the current target detection point;
  • the first sub-model is a model wrapping the second sub-model;
  • the projection coefficient value is determined after processing, based on the spherical harmonic function, the distance function of the spherical distribution of each target detection point on the first sub-model;
  • the distance information corresponding to each target detection point is determined, a transparency parameter between the first sub-model and the second sub-model is determined based on the distance information, and the first sub-model is displayed based on the transparency parameter.
  • the embodiments of the present application also provide a device for determining transparency, including:
  • a distance information determination module configured to determine, for each target detection point on the first sub-model, the current coordinate information of the current target detection point, and determine a distance function of the current target detection point based on the current coordinate information;
  • the current coordinate information is the coordinates obtained after matrix transformation of the three-dimensional space coordinates of the current target detection point, and the distance function is determined according to the relative distance information between the target detection point and the second sub-model in each direction; the first sub-model is a model wrapping the second sub-model;
  • a projection coefficient value determination module configured to process the distance function of each target detection point based on a preset spherical harmonic function to obtain a projection coefficient value of each target detection point;
  • the spherical harmonic function is composed of a plurality of basis functions;
  • a transparency parameter determination module configured to store, for each target detection point, the projection coefficient value of the current target detection point in a target storage position in the engine, so that when transparent display is detected, a target reconstruction function is reconstructed based on the projection coefficient values stored in the target storage position, target distance information between the first sub-model and the second sub-model under a target shooting angle is determined based on the target reconstruction function, a transparency parameter corresponding to the target distance information is determined, and each target detection point on the first sub-model is displayed based on the transparency parameter of each target detection point.
  • the embodiments of the present application also provide a device for determining transparency, including:
  • a target shooting angle determination module configured to determine the target shooting angle corresponding to the shooting device and each target detection point on the first sub-model;
  • a target reconstruction function reconstruction module configured to determine, for each target detection point, target coordinate information according to the current coordinate information of the current target detection point, and reconstruct a target reconstruction function of the current target detection point according to the target coordinate information and the projection coefficient value of the current target detection point; the current coordinate information is the coordinates obtained by performing matrix transformation on the three-dimensional space coordinates of the current target detection point;
  • the first sub-model is a model wrapping the second sub-model;
  • the projection coefficient value is determined after processing the distance function of the spherical distribution of each target detection point on the first sub-model based on the spherical harmonic function;
  • a transparent display module configured to determine the distance information corresponding to each target detection point based on the reconstruction function corresponding to each target detection point and the corresponding target shooting angle, determine a transparency parameter between the first sub-model and the second sub-model based on the distance information, and display the first sub-model based on the transparency parameter.
  • an embodiment of the present application further provides an electronic device, the electronic device comprising:
  • one or more processors;
  • storage means arranged to store one or more programs
  • when the one or more programs are executed by the one or more processors, the one or more processors implement the method for determining transparency described in any one of the embodiments of the present application.
  • the embodiments of the present application further provide a storage medium containing computer-executable instructions, where the computer-executable instructions, when executed by a computer processor, are configured to perform the method for determining transparency described in any one of the embodiments of the present application.
  • FIG. 1 is a flowchart of a method for determining transparency provided by an embodiment of the present application
  • FIG. 2 is a flowchart of a method for determining transparency provided by another embodiment of the present application.
  • FIG. 3 is a flowchart of a method for determining transparency provided by another embodiment of the present application.
  • FIG. 4 is a flowchart of a method for determining transparency provided by another embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of a device for determining transparency according to an embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of a device for determining transparency provided by another embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 1 is a flowchart of a method for determining transparency provided by an embodiment of the present application. This embodiment can be applied to the case where the transparency parameter is adjusted according to the actual situation, so that the transparent display effect is consistent with the actual theoretical effect.
  • the method can be performed by an apparatus for determining transparency.
  • the apparatus may be implemented in the form of software and/or hardware, and the hardware may be an electronic device, for example, the electronic device may be a mobile terminal or the like.
  • this embodiment includes the following steps:
  • the first sub-model and the second sub-model are relative. If the application scene is a skin model and a clothing model, the model corresponding to the clothes can be used as the first sub-model, and the model corresponding to the skin can be used as the second sub-model.
  • the target detection point may be a preset detection point on the first sub-model; it may also be that the first sub-model is divided into a plurality of blocks and the center point of each block is used as a target detection point; it may be a detection point set by the developer according to actual needs; or, since the first sub-model is composed of multiple points, each point may be used as a target detection point.
  • the current coordinate information is the coordinates obtained by performing a matrix transformation on the three-dimensional space coordinates of the current target detection point, and may be determined based on the tangent, subtangent and normal of the current target detection point.
  • a coordinate system can be established based on the tangent, subtangent and normal of the current target detection point, and the spatial coordinates in the three-dimensional space coordinate system are converted into coordinates along the tangent, subtangent and normal directions; the coordinates obtained in this way are the current coordinate information.
  • the distance function corresponds to each target detection point, that is, a target detection point corresponds to a distance function. For each distance function, it can be understood as: it can be determined according to the relative distance information between a certain target detection point and the second sub-model in each direction in space.
  • the distance between each target detection point and the second sub-model in each direction can be determined as follows: take the target detection point as the center of a sphere, emit physical rays in each direction in space, and determine the distance information between the target detection point and the intersection of each physical ray with the second sub-model.
  • for example, if the number of target detection points is 1000, the number of distance functions is also 1000, and each distance function is determined based on the distances between one target detection point and the second sub-model in each direction.
  • the same method is used to determine the spherical distribution distance function of different target detection points.
  • the spherical distribution function of one of the target detection points is determined as an example to introduce.
  • a target detection point on the first sub-model can be used as the center of the sphere, physical rays can be emitted in each direction in the space, and the distance information between the intersection of each physical ray and the second sub-model and the target detection point can be determined . If the physical ray has an intersection with the second sub-model in the direction of the ray, the distance between the intersection and the target detection point is used as the distance information between the target detection point and the second sub-model in this direction. If there is no intersection between the physical ray and the second sub-model in the direction of the ray, the preset maximum distance information may be used as the distance information between the target detection point and the second sub-model in this direction.
  • if the current target detection point emits a physical ray in a certain direction in space and the physical ray has an intersection with the second sub-model, the distance information of the current target detection point in this direction may be determined from the coordinates of the current target detection point and the intersection point; the distance between the two points can also be determined in other ways, which are not enumerated here, as long as the distance between the two points can be determined.
  • the distance information between the corresponding first sub-model and the second sub-model can be determined.
  • the distance function of the spherical distribution of the target detection point can be determined.
  • the spherical distribution distance function of target detection point A can be written as F(i) = dist_i, i = 1, 2, ..., n, where i represents the i-th direction, F(i) represents the distance information in the i-th direction, dist_i represents the distance information between target detection point A and the second sub-model in that direction, and n represents the total number of directions.
  • the distance function of the spherical distribution of each target detection point is a composite function
  • the number of sub-functions in the composite function can be determined according to the preset number of samples.
  • the default can be 16 × 32 precision, that is, the composite function contains 512 sub-functions.
  • the number of samples can be determined according to actual needs.
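  • as an illustration of the sampling described above, the following is a minimal Python sketch (not taken from the application) that builds the spherical distribution distance function of one target detection point by casting rays over a 16 × 32 grid of directions; a sphere is used as a stand-in for the second sub-model, and the names MAX_DIST, sample_directions and spherical_distance_function are illustrative only.

```python
import numpy as np

MAX_DIST = 100.0  # "set value" used when a ray does not hit the second sub-model

def sample_directions(n_theta=16, n_phi=32):
    """Build a 16 x 32 grid of unit directions covering the whole sphere."""
    dirs = []
    for i in range(n_theta):
        theta = (i + 0.5) * np.pi / n_theta          # polar angle
        for j in range(n_phi):
            phi = (j + 0.5) * 2.0 * np.pi / n_phi    # azimuth
            dirs.append((np.sin(theta) * np.cos(phi),
                         np.sin(theta) * np.sin(phi),
                         np.cos(theta)))
    return np.array(dirs)

def ray_hit_sphere(origin, direction, center, radius):
    """Nearest positive hit distance of a ray against a sphere, or None."""
    oc = origin - center
    b = np.dot(oc, direction)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return None
    t = -b - np.sqrt(disc)
    return t if t > 0.0 else None

def spherical_distance_function(detection_point, inner_center, inner_radius):
    """F(i) = dist_i for every sampled direction i around one target detection
    point; 512 samples for the default 16 x 32 precision."""
    p = np.asarray(detection_point, dtype=float)
    c = np.asarray(inner_center, dtype=float)
    dists = []
    for d in sample_directions():
        t = ray_hit_sphere(p, d, c, inner_radius)
        dists.append(t if t is not None else MAX_DIST)
    return np.array(dists)
```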
  • S120 Process the distance function of each target detection point based on a preset spherical harmonic function to obtain a projection coefficient value of each target detection point.
  • the same method is used to determine the projection coefficient values of different target detection points.
  • the projection coefficient value of one of the target detection points is determined as an example.
  • the spherical harmonic function is composed of multiple basis functions, and each distance function can be processed by using the basis function. Different orders of spherical harmonics correspond to different numbers of basis functions.
  • the second-order spherical harmonics include 4 basis functions, the third-order spherical harmonics include 9 basis functions, and the fourth-order spherical harmonics include 16 basis functions.
  • the projection coefficient value is the coefficient value obtained by processing the distance function according to the basis function in the spherical harmonic function, and the number of the coefficient value is the same as that of the basis function. According to the spherical harmonic function and its corresponding multiple basis functions, the spherical distribution distance function of the target detection point can be compressed to obtain the projection coefficient value.
  • if the spherical harmonic function is second-order, that is, it includes 4 basis functions, then inputting the distance function corresponding to the target detection point into the spherical harmonic function can obtain 4 projection coefficient values, thereby compressing the distance function into 4 projection coefficient values.
  • the higher the order of the spherical harmonic function the higher the similarity between the reconstructed sphere and the actual sphere during reconstruction, so developers can choose spherical harmonic functions of different orders according to actual needs.
  • the higher-order spherical harmonic function contains more basis functions, and the degree of restoration of the distance function is higher in the subsequent restoration of the distance function according to the spherical harmonic function and the projection coefficient value.
  • the distance function of the spherical distribution of the target detection point can be input into the spherical harmonic function, and the distance function can be processed based on the basis function in the spherical harmonic function to obtain the above distance function in The projection coefficient values on each basis function. It follows that the number of projection coefficient values is equal to the number of basis functions in spherical harmonics.
  • the distance functions of the spherical distribution of different target detection points can be the same or different.
  • whether the distance functions corresponding to different target detection points are the same is determined by the distance information of those target detection points in each direction in space: if the distance information is the same in every direction, the distance functions are the same; otherwise they are different. After different distance functions are input into the preset spherical harmonic function for processing, the obtained projection coefficient values are different.
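  • the projection step itself can be sketched as follows; this is an assumption-laden illustration that pairs with the sampling sketch above, using the standard real spherical-harmonic basis of a third-order expansion (9 basis functions) and a simple quadrature over the sampled directions. The function names and the quadrature scheme are not taken from the application.

```python
import numpy as np

def sh_basis_9(d):
    """The 9 real spherical-harmonic basis functions of a third-order expansion
    (bands l = 0, 1, 2), evaluated at a unit direction d = (x, y, z)."""
    x, y, z = d
    return np.array([
        0.282095,                        # Y(0, 0)
        0.488603 * y,                    # Y(1,-1)
        0.488603 * z,                    # Y(1, 0)
        0.488603 * x,                    # Y(1, 1)
        1.092548 * x * y,                # Y(2,-2)
        1.092548 * y * z,                # Y(2,-1)
        0.315392 * (3.0 * z * z - 1.0),  # Y(2, 0)
        1.092548 * x * z,                # Y(2, 1)
        0.546274 * (x * x - y * y),      # Y(2, 2)
    ])

def grid_solid_angles(n_theta=16, n_phi=32):
    """Per-sample solid angle dOmega = sin(theta) * dtheta * dphi of the grid."""
    d_theta, d_phi = np.pi / n_theta, 2.0 * np.pi / n_phi
    w = []
    for i in range(n_theta):
        theta = (i + 0.5) * d_theta
        w.extend([np.sin(theta) * d_theta * d_phi] * n_phi)
    return np.array(w)

def project_to_sh(directions, distances, weights):
    """Projection coefficients c_k = sum_i F(i) * Y_k(dir_i) * dOmega_i:
    the distance function is compressed into 9 coefficient values."""
    coeffs = np.zeros(9)
    for d, f, w in zip(directions, distances, weights):
        coeffs += f * sh_basis_9(d) * w
    return coeffs
```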
  • For each target detection point, store the projection coefficient value of the current target detection point in the target storage position in the engine, so that when transparent display is detected, the target reconstruction function is reconstructed based on the projection coefficient values stored in the target storage position, the target distance information between the first sub-model and the second sub-model under the target shooting angle is determined based on the target reconstruction function, the transparency parameter corresponding to the target distance information is determined, and each target detection point on the first sub-model is displayed based on the transparency parameter of each target detection point.
  • the projection coefficient value corresponding to the current target detection point may be stored in the vertex color corresponding to the current target detection point and/or in the attribute information of the current target detection point, and then imported into the target storage location in the engine according to the index coordinates.
  • each target detection point on the first sub-model can have a corresponding vertex color; each vertex color contains four pixel channels, that is, the four channels of RGBA, and a projection coefficient value of the target detection point can be stored in each channel.
  • the attribute information can be extensible information corresponding to each target detection point, for example UV, i.e., the u, v texture map coordinates.
  • the vertex color corresponding to the target detection point and the attribute information of the current target detection point can be used together to store the projection coefficient value.
  • when the projection coefficient values corresponding to the current target detection point are stored in the vertex colors corresponding to the current target detection point, four projection coefficient values can be stored in the four channels R, G, B and A respectively, and the number of vertex colors required can be determined according to the number of projection coefficient values. It is also possible to store 4 projection coefficient values in one vertex color and store the remaining projection coefficient values in the UV coordinates corresponding to the current target detection point. The latter method reduces the use of vertex colors and facilitates the storage and subsequent retrieval of the projection coefficient values.
  • the vertex colors corresponding to the target detection point and the UV coordinate corresponding to the current target detection point can also be used together to store the projection coefficient values. For example, if 9 projection coefficient values are stored, two vertex colors can be used to store 8 projection coefficient values, and the remaining 1 projection coefficient value can be stored in the UV coordinate corresponding to the current target detection point.
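  • a minimal sketch of this 9-coefficient packing (two RGBA vertex colors plus one UV slot) is given below; the range remapping a real engine would need so the values fit a vertex color is omitted, and the names are illustrative only.

```python
def pack_coefficients(coeffs):
    """Pack 9 projection coefficient values as described above: 4 + 4 values in
    two RGBA vertex colors and the remaining one in the u channel of a UV set."""
    assert len(coeffs) == 9
    vertex_color_0 = tuple(coeffs[0:4])  # R, G, B, A
    vertex_color_1 = tuple(coeffs[4:8])  # R, G, B, A
    uv = (coeffs[8], 0.0)                # 9th coefficient in UV.u
    return vertex_color_0, vertex_color_1, uv
```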
  • the storage method of the projection coefficient values can also be: for each target detection point, the projection coefficient values corresponding to the current target detection point are stored in the pixel points of at least one picture, where the number of pictures is the same as the number of projection coefficient values corresponding to the current target detection point; all pictures storing projection coefficient values are then imported into the target storage location in the engine according to the index coordinates of the vertices.
  • 9 projection coefficient values may be stored, 9 pictures may be used, and the projection coefficient values may be respectively stored in the pixel points corresponding to the current target detection point.
  • after the projection coefficient values are stored, they can be imported into the target position in the engine according to the index coordinates of the current target detection point, so that the projection coefficient values corresponding to the current target detection point can later be retrieved from the engine according to the target detection point.
  • the transparency parameter of each target detection point is then determined from the target distance information reconstructed from the projection coefficient values, and the first sub-model and the second sub-model are displayed based on each transparency parameter.
  • in the technical solution of this embodiment, for each target detection point on the first sub-model, the current coordinate information of the current target detection point is determined, the distance function corresponding to the current target detection point is determined based on the current coordinate information, the distance function of each target detection point is processed based on the preset spherical harmonic function to obtain the projection coefficient value of each target detection point, and the projection coefficient value of the current target detection point is then stored to the target storage location in the engine. This solves the problem that, when the first sub-model is displayed transparently with a fixed transparency display value, the transparent display deviates from the actual situation, resulting in a poor transparent display effect and poor user experience. The transparency parameters are adjusted according to the actual situation, so that the transparent display effect is consistent with the actual theoretical effect, and the user experience is improved.
  • FIG. 2 is a flowchart of a method for determining transparency provided by another embodiment of the present application.
  • for the manner of determining the current coordinate information and the distance function, determining the projection coefficient value of each target detection point, and storing the projection coefficient value to the target storage location, reference may be made to the technical solution of this embodiment. The explanations of terms that are the same as or correspond to those in the above-mentioned embodiments are not repeated here.
  • the method for determining transparency includes:
  • a point in space is used as the origin of space coordinates to establish a three-dimensional space coordinate system.
  • the coordinate information of the current target detection point in the three-dimensional space coordinate system is the three-dimensional space coordinates of the current target detection point.
  • a certain point under the person's feet is the origin of the space coordinates, namely (0, 0, 0).
  • a three-dimensional space coordinate system is established.
  • the current target detection point may be a certain point on the head of the person, and the coordinate information of this point in the three-dimensional space coordinate system is (0, 1, 0); this coordinate information is the three-dimensional space coordinate of the point.
  • the three-dimensional space coordinate of each target detection point is fixed.
  • S220 Process the three-dimensional space coordinates according to the predetermined coordinate transformation matrix, and determine the current coordinate information of the current target detection point.
  • the coordinate transformation matrix is a transformation matrix used to convert the three-dimensional space coordinates into the current coordinate information, and the current coordinate information is determined by obtaining the values of the tangent, subtangent and normal of the current target detection point according to the transformation matrix.
  • the current coordinate axis is defined based on the tangent, subtangent and normal of the current target detection point, and the coordinates of the three-dimensional space coordinate are changed according to the coordinate transformation matrix corresponding to the current coordinate axis, and the current coordinate information of the current target detection point can be obtained.
  • the transformation can be written as x' = Tx, where T is the coordinate transformation matrix, determined by the tangent, subtangent and normal of the current target detection point, and x is the three-dimensional space coordinate to be transformed.
  • a certain point on the hand of a certain person in space is used as the current target detection point, for example, the three-dimensional space coordinates of the current target detection point are (0, 0, 0).
  • the current coordinate axis is determined according to the tangent, subtangent and normal of the current target detection point, and the coordinates of the three-dimensional space coordinate are changed according to the coordinate transformation matrix corresponding to the current coordinate axis.
  • the three-dimensional space coordinate of the current target detection point (0, 0, 0) becomes (-0.2, 0.5, 0.3).
  • the current coordinate information of the current target detection point is expressed in the local space of the current target detection point itself, rather than being relative coordinate information determined based on other points in space.
  • if the current target detection point changes, the tangent, subtangent and normal of the current target detection point also change accordingly, and the changed current coordinate information can then be determined according to the changed current target detection point.
  • the reconstruction function can be reconstructed based on the coordinates. It is suitable for determining the distance information of the deformed object, and then determining the transparency parameter according to the distance information.
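  • the coordinate change can be sketched as follows, assuming the tangent, subtangent (bitangent) and normal are given as orthonormal row vectors; whether a translation relative to the detection point is also applied depends on the engine's convention, so this is only an illustrative reading of x' = Tx with hypothetical names.

```python
import numpy as np

def to_local_frame(x, tangent, subtangent, normal):
    """x' = T * x, where the rows of T are the tangent, subtangent (bitangent)
    and normal of the current target detection point."""
    T = np.vstack([tangent, subtangent, normal])  # coordinate transformation matrix
    return T @ np.asarray(x, dtype=float)

# usage with an illustrative frame; in practice the three vectors come from the mesh
local = to_local_frame(
    x=[0.0, 1.0, 0.0],
    tangent=[1.0, 0.0, 0.0],
    subtangent=[0.0, 0.0, 1.0],
    normal=[0.0, 1.0, 0.0],
)
```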
  • For each target detection point, determine the information of each collision point to be processed when the current target detection point emits physical rays in each direction in space and the rays pass through the second sub-model.
  • the information of each collision point to be processed may include the position information of each collision point, such as the corresponding spatial coordinate information and the like.
  • physical rays can be emitted in each direction based on each target detection point on the first sub-model, and each of the above-mentioned physical rays may pass through the second sub-model.
  • the collision point information to be processed may be information used to describe the position of the point, such as the spatial coordinate information of the collision point; therefore, the spatial coordinate information corresponding to each collision point to be processed can be taken as the information of that collision point.
  • the manner of determining the information of the collision points to be processed may be: taking the current target detection point as the center of a sphere, emitting physical rays in all directions in space, and determining the collision point information to be processed where each physical ray passes through the second sub-model.
  • the emission of physical rays from each target detection point on the first sub-model to each direction can be regarded as taking the current target detection point as the center of the sphere, and emitting physical rays in each direction of the spherical surface. If the physical ray passes through the second sub-model, the spatial coordinate information of the intersection of the physical ray and the second sub-model is used as the collision point information to be processed.
  • S240 Determine distance information between the current target detection point and the second sub-model in each direction according to the current target detection point and the information of each collision point to be processed.
  • the distance information between each collision point to be processed and the current target detection point is determined.
  • the spatial coordinate information corresponding to the collision point to be processed is taken as its position, and the distance information between the collision point to be processed and the current target detection point is calculated using the formula for the distance between two points in space.
  • if there is no collision point between the physical ray and the second sub-model in a certain direction, the distance information corresponding to that direction is set to a set value.
  • the set value may be the maximum distance information between a collision point to be processed and the current target detection point.
  • the distance information between the current target detection point and the second sub-model in each direction is determined.
  • in this way, the distance information or the set value corresponding to each physical ray emitted by the current target detection point can be determined, and the distance information or the set value is used as the distance information between the current target detection point and the second sub-model in each direction.
  • the distance function of the spherical distribution of the target detection point can be obtained.
  • the number of sub-functions in this distance function is the same as the amount of distance information of the target detection point in each direction of the sphere.
  • the number of sub-functions can be increased, that is, the density of physical rays can be increased, and the number of physical rays can be determined according to actual requirements.
  • S260 Determine the order of the spherical harmonic function, and determine the representation of the basis function in the spherical harmonic function and the quantity of the basis function according to the order.
  • Different orders of spherical harmonics contain different numbers of basis functions.
  • the second-order spherical harmonics contain 4 basis functions
  • the third-order spherical harmonics contain 9 basis functions
  • the fourth-order spherical harmonics contain 16 basis functions, etc. .
  • the higher the order of the spherical harmonic function the better the reconstruction effect will be when the reconstruction function is used later, and the order needs to be set according to the actual needs.
  • if the order of the spherical harmonic function is determined to be a according to requirements, the number of basis functions in the spherical harmonic function can be determined to be a².
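  • for reference, the four basis functions of a second-order expansion (a = 2, so a² = 4), written for a unit direction (x, y, z), are the standard real spherical-harmonic definitions below; they are not reproduced from the application.

```latex
Y_0^0 = \tfrac{1}{2}\sqrt{\tfrac{1}{\pi}}, \qquad
Y_1^{-1} = \sqrt{\tfrac{3}{4\pi}}\,y, \qquad
Y_1^{0} = \sqrt{\tfrac{3}{4\pi}}\,z, \qquad
Y_1^{1} = \sqrt{\tfrac{3}{4\pi}}\,x
```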
  • each basis function can be determined according to the relationship between the distance function and the value of the projection coefficient.
  • the same method is used to determine the projection coefficient values of different target detection points.
  • the projection coefficient value of one of the target detection points is determined as an example.
  • the projection coefficient value is a value determined by calculating the distance function using each basis function of the preset spherical harmonic functions.
  • the number of projection coefficient values is the same as the number of basis functions.
  • the distance function of the spherical distribution of each target detection point is different, and after inputting the different distance functions into the preset spherical harmonic function for processing, the obtained projection coefficient values are different.
  • the projection coefficient value corresponding to the current target detection point may be stored in the vertex color corresponding to the current target detection point and/or the attribute information of the current target detection point.
  • the projection coefficient value corresponding to the current target detection point may also be stored in the pixel point of at least one picture corresponding to the current target detection point.
  • for example: determine the target number of projection coefficient values corresponding to the current target detection point; determine the number of vertex colors corresponding to the current target detection point based on the target number and the storage number corresponding to a vertex color; and store the projection coefficient values in the vertex colors corresponding to the current target detection point.
  • the target number is the number of projection coefficient values, and is also the number of basis functions in the preset spherical harmonics.
  • the storage quantity is the number of projection coefficient values that each vertex color can store.
  • the vertex color contains four channels of RGBA, and the storage quantity is 4.
  • the number of vertex colors is the number of vertex colors used to store the projection coefficient values.
  • the vertex color can be stored by RGBA channel, that is, there are 4 channels of values, or it can be stored by RGB channels, that is, there are 3 channels of values.
  • taking vertex colors stored with RGBA channels as an example, if the preset spherical harmonic function is a second-order spherical harmonic function and contains 4 basis functions, then 4 projection coefficient values can be obtained, and these 4 projection coefficient values can be stored in one vertex color corresponding to the current target detection point. If the preset spherical harmonic function is a fourth-order spherical harmonic function and contains 16 basis functions, then 16 projection coefficient values can be obtained, and these 16 projection coefficient values are stored in the four vertex colors corresponding to the current target detection point; the four vertex colors belong to different pictures and all correspond to the current target detection point.
  • each vertex color can store 4 projection coefficient values; if the number of projection coefficient values is not a multiple of 4, the number of vertex colors needs to be rounded up. For example, the third-order spherical harmonic function contains 9 basis functions corresponding to 9 projection coefficient values; two vertex colors can store 8 projection coefficient values, and the remaining 1 projection coefficient value still needs one vertex color, so a total of 3 vertex colors are needed.
  • Another possible way is to use one vertex color to store part of the projection coefficient value, and then use the attribute information of the vertex color to store the remaining projection coefficient value.
  • the preset number is the number of projection coefficient values that the vertex color can store.
  • the remaining projection coefficient value may be determined according to the difference between the target number and the preset number, and the remaining projection coefficient value may be stored in the attribute information of the vertex color.
  • each vertex color can store 4 projection coefficient values
  • the third-order spherical harmonic function contains 9 basis functions, corresponding to 9 projection coefficient values.
  • Using one vertex color corresponding to the target detection point can store 4 projection coefficient values, and store the 5 remaining projection coefficient values in the UV coordinates corresponding to the target detection point.
  • the projection coefficient value may also be selected to be stored in the pixel point of at least one picture corresponding to the current target detection point, so as to facilitate subsequent retrieval and use.
  • 9 projection coefficient values may be stored, 9 pictures may be used, and the projection coefficient values may be respectively stored in the pixel points corresponding to the current target detection point.
  • vertex color and/or attribute information and the current coordinate information are imported into the target position in the engine for storage.
  • the engine can be a core component of a programmed editable computer game system or some interactive real-time image application programs.
  • the target location may be a storage space used to store data and/or information in the engine, and in this embodiment, is a storage space used to store coordinate information of the target detection point.
  • after the projection coefficient values corresponding to the current target detection point are stored in the vertex color corresponding to the current target detection point and/or the attribute information of the current target detection point, or stored in the pixel points of at least one picture corresponding to the current target detection point, it is necessary to import the vertex color and/or attribute information of the current target detection point, or the pixel points of the at least one picture, into the target position in the engine for storage, and the current coordinate information is also imported into the target position in the engine for storage. If the engine needs to use the projection coefficient values corresponding to a certain target detection point, the projection coefficient values corresponding to the coordinate information can be found at the target position in the engine for subsequent reconstruction.
  • in the technical solution of this embodiment, the three-dimensional space coordinates of each target detection point are determined, and the three-dimensional space coordinates are processed according to a predetermined coordinate transformation matrix to determine the current coordinate information of the current target detection point.
  • the distance information of the current target detection point in each direction is determined, and the distance function of the spherical distribution of the corresponding target detection point is then determined.
  • based on each basis function of the preset spherical harmonic function, the distance function of the current target detection point is processed to obtain the projection coefficient values of the current target detection point, and the projection coefficient values and the current coordinate information of the current target detection point are stored to the target storage location. This solves the problem that, when the first sub-model is displayed transparently with a fixed transparency display value, the transparent display deviates from the actual situation, the transparent display effect is poor, and the user experience is poor. The transparency parameters are adjusted according to the actual situation, so that the transparent display effect is consistent with the actual theoretical effect, and the user experience is improved.
  • FIG. 3 is a flowchart of a method for determining transparency provided by another embodiment of the present application.
  • This embodiment can be applied to the case where the distance information between the target detection point and the second sub-model at each angle is reconstructed according to the projection coefficient values, and the transparent display is performed according to the distance information.
  • the method can be performed by a device for determining transparency, and the device can be implemented in the form of software and/or hardware; the hardware can be an electronic device, for example, the electronic device can be a mobile terminal or the like.
  • this embodiment includes the following steps:
  • the photographing device is a device for observing and photographing the first sub-model
  • the target photographing angle is the relative angle between the photographing device and each target detection point on the first sub-model.
  • the relative angle information between the shooting device and each target detection point can be determined, and this angle information can be used as the target shooting angle.
  • For each target detection point determine target coordinate information according to the current coordinate information of the current target detection point, and reconstruct a target reconstruction function of the current target detection point according to the target coordinate information and the projection coefficient value of the current target detection point.
  • the first sub-model and the second sub-model are relative. If the application scene is a skin model and a clothing model, the model corresponding to the clothes can be used as the first sub-model, and the model corresponding to the skin can be used as the second sub-model.
  • the target coordinate information is the coordinate information of the current target detection point in the current scene; for example, the current coordinate information describes the current target detection point before it is deformed, and is converted into the target coordinate information after the deformation occurs.
  • the projection coefficient value of the current target detection point includes the projection coefficient value determined after processing the distance function of the spherical distribution of the current target detection point on the first sub-model based on the spherical harmonic function.
  • the target reconstruction function is a function obtained by processing the projection coefficient values of the current target detection point, which gives the distance information corresponding to the collision between a physical ray and the second sub-model when the current target detection point emits physical rays in each direction in space.
  • the target coordinate information of the current target detection point in the current scene can be determined according to the current coordinate information of the current target detection point and the coordinate axis determined according to the tangent, subtangent and normal of the current detection point.
  • the projection coefficient value of the current target detection point can be obtained from the target storage location corresponding to the current target detection point.
  • a distance function including distance information corresponding to each angle is simulated according to the stored projection coefficient value and the target coordinate information of the current target detection point.
  • the distance value corresponding to the second sub-model in each direction can be simulated when physical rays are emitted in each direction of the space with the current target detection point as the spherical center.
  • the target reconstruction function of the current target detection point can be reconstructed according to the distance information.
  • for example, the second sub-model is an arm, and the clothes on the arm are the first sub-model; a certain target detection point on the first sub-model is in the y-axis direction, the x-axis and z-axis are parallel to the ground and approximately in a straight line with the eyes, and the first sub-model appears approximately opaque; when the arm stands up, the target detection point on the first sub-model is located in the positive direction of the y-axis, but the z-axis faces the second sub-model.
  • in this case the coordinate information of the same target detection point has changed, that is, the tangent, subtangent and normal coordinates corresponding to the same target detection point at different positions are different, and the reconstruction functions reconstructed based on those coordinates are also different. Even when the object deforms, the reconstruction function corresponding to each target detection point can still be determined, and the distance information corresponding to each target detection point can then be determined.
  • the above method can be used to determine the target shooting angle between the target detection point and the shooting device, and, based on the target shooting angle, to determine the distance information between each target detection point and the collision point in the second sub-model and to reconstruct the target reconstruction function.
  • for example, when the current target detection point on the first sub-model is displayed transparently, the 2 vertex colors and 1 piece of attribute information corresponding to the current target detection point can be determined according to the current target detection point; since one projection coefficient value is stored in each of the four RGBA channels of each vertex color, nine projection coefficient values corresponding to the current target detection point can be obtained.
  • nine pieces of distance information can be determined.
  • the above 9 distance information is the distance information corresponding to the target detection point under 9 angles in space, and a reconstruction function can be constructed according to the distance information.
  • the transparency parameter is used to represent the degree of transparency when the model is displayed, which can be represented by percentage, for example, the transparency is 80% and so on.
  • the target reconstruction function can be used to process the input target shooting angle to determine the distance information between the first sub-model and the second sub-model corresponding to the target detection point under the target shooting angle.
  • the target shooting angle between the shooting device and the current target detection point can be determined, and the target shooting angle can be input into the target reconstruction function.
  • the collision point on the second sub-model corresponding to the current target detection point can be determined, and then the distance information between the collision point and the current target detection point can be determined.
  • the same method is used to determine the transparency parameters of different target detection points.
  • the transparency parameter of one of the target detection points is determined as an example.
  • the manner of determining the transparency parameter of the current target detection point may be: determining, according to the distance information, the transparency parameter in a pre-stored correspondence between distance information and transparency parameters; or calculating it according to a preset transparency parameter calculation model, where the distance information is input into the transparency parameter calculation model and the transparency parameter corresponding to the distance information is obtained through calculation.
  • for example, the transparency parameter calculation model f may be a monotonically decreasing function of the distance information.
  • the target detection points on the first sub-model can be displayed according to the corresponding transparency parameters to obtain the effect of transparent display.
  • in the technical solution of this embodiment, the target shooting angle is determined, the target coordinate information is determined according to the current coordinate information of the current target detection point, the target reconstruction function is reconstructed according to the target coordinate information and the projection coefficient value of the current target detection point, the corresponding distance information is determined from the reconstruction function and the corresponding target shooting angle, and the transparency parameter is then determined and used for display. This solves the problem of the deviation between the transparent display and the actual situation when the first sub-model is displayed transparently with a fixed transparency display value; the transparency parameters are adjusted according to the actual situation so that the transparent display effect is consistent with the actual theoretical effect, and the user experience is improved.
  • FIG. 4 is a flowchart of a method for determining transparency provided by another embodiment of the present application.
  • in this embodiment, a reconstruction function can be reconstructed according to the spherical harmonic function and the projection coefficient values, and the distance information of the target detection point can be determined based on the reconstruction function.
  • the implementation can refer to the following description. The explanations of the terms that are the same as or corresponding to each of the above-mentioned embodiments are not repeated here.
  • the method for determining transparency includes:
  • S420 Determine the target coordinate information of the current target detection point in the current scene according to the current coordinate information of the current target detection point.
  • the current coordinate information of the current target detection point is coordinate information obtained by processing the three-dimensional space coordinates of the current target detection point according to a predetermined coordinate transformation matrix.
  • the target coordinate information of the current target detection point in the current scene can be determined according to the current coordinate information of the current target detection point and the coordinate axis determined according to the tangent, subtangent and normal of the current detection point.
  • S430 Process the target coordinate information and the projection coefficient value of the current target detection point by using a preset spherical harmonic function to reconstruct the target reconstruction function of the current target detection point.
  • the target coordinate information is the coordinate information obtained by further converting the current coordinate information of the target detection point in the current scene.
  • the reconstruction function corresponding to each target detection point can be reconstructed.
  • the reconstruction function of reconstructing one of the target detection points can be described as an example.
  • the reconstruction function of the target detection point is a function constructed by processing the projection coefficients of the current target detection point according to the spherical harmonic function, from the distance values corresponding to the collisions between the physical rays and the second sub-model when the current target detection point emits physical rays in each direction in space.
  • the reconstruction function can be used to process the input target shooting angle to determine the distance information between the first sub-model and the second sub-model corresponding to the target detection point at the target shooting angle.
  • by processing the target coordinate information and the projection coefficient values of the current target detection point based on the preset spherical harmonic function, the distance information between the current target detection point and the collision point where a physical ray collides with the second sub-model can be obtained for each direction in which the current target detection point emits physical rays in space. According to the distance information of each direction in space corresponding to the current target detection point, a reconstruction function corresponding to the current target detection point is constructed.
  • the target shooting angle can be input into the reconstruction function corresponding to the current target detection point, and the reconstruction function processes the target shooting angle and outputs the distance information between the current target detection point and the collision point where the line from the current target detection point toward the shooting device collides with the second sub-model under the target shooting angle.
  • for example, after the 9 projection coefficient values corresponding to the current target detection point are obtained, inverse transformation processing is performed through the preset spherical harmonic function according to the above-mentioned 9 projection coefficient values and the target coordinate information, and the reconstruction function corresponding to the target detection point can be obtained. The target shooting angle, such as 45°, is input into the reconstruction function, and the distance information between the first sub-model and the second sub-model corresponding to the current target detection point at the target shooting angle, such as 5 nm, can be determined based on the reconstruction function.
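  • a minimal sketch of this reconstruction step, assuming a third-order expansion with 9 stored coefficients: the reconstruction function is evaluated by summing the stored projection coefficients against the same real basis functions used for projection, at the unit direction from the detection point toward the shooting device. The function names and the basis convention are assumptions, not the application's own implementation.

```python
import numpy as np

def sh_basis_9(d):
    # the same 9 real basis functions (bands l = 0, 1, 2) used for projection
    x, y, z = d
    return np.array([
        0.282095,
        0.488603 * y, 0.488603 * z, 0.488603 * x,
        1.092548 * x * y, 1.092548 * y * z,
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z, 0.546274 * (x * x - y * y),
    ])

def reconstruct_distance(coeffs, view_dir):
    """Target reconstruction function: sum the stored projection coefficients
    against the basis functions evaluated at the direction from the detection
    point toward the shooting device, recovering the distance along that direction."""
    v = np.asarray(view_dir, dtype=float)
    v = v / np.linalg.norm(v)
    return float(np.dot(np.asarray(coeffs, dtype=float), sh_basis_9(v)))
```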
  • S450 Determine the transparency parameter of each target detection point according to the preset correspondence between the distance information and the transparency parameter and the distance information corresponding to each target detection point.
  • each piece of distance information and its corresponding transparency parameter can be stored in advance. For example, if the transparency parameter decreases by 10% for every 10 nm increase in distance, and the distance information is recorded as dist, then for 0 nm ≤ dist < 10 nm the transparency parameter is 100%, for 10 nm ≤ dist < 20 nm the transparency parameter is 90%, for 20 nm ≤ dist < 30 nm the transparency parameter is 80%, and so on.
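  • a sketch of this example correspondence as a lookup function; the 10-unit step and 10% decrement follow the example above, while the clamping at 0% and the function name are assumptions.

```python
def transparency_from_distance(dist, step=10.0, decrement=0.1):
    """Transparency parameter from distance using the example correspondence:
    0-10 -> 100%, 10-20 -> 90%, 20-30 -> 80%, ... (clamped at 0%)."""
    return max(0.0, 1.0 - decrement * int(dist // step))
```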
  • The transparency parameter may include the transparency parameter of the first sub-model and the transparency parameter of the second sub-model.
  • After the distance information is obtained, the transparency parameter corresponding to the distance information may be determined from the pre-stored correspondence between each distance information and the transparency parameter, which may include the transparency parameter of the first sub-model and the transparency parameter of the second sub-model, for subsequent transparent display.
  • The transparency of the display effect at the relative position of the first sub-model and the second sub-model can then be realized based on the above-mentioned transparency parameters.
  • In the technical solution of this embodiment, the target shooting angle is determined in advance, the target coordinate information of the current target detection point in the current scene is determined, the target coordinate information and the projection coefficient values of the current target detection point are processed based on the preset spherical harmonic function to reconstruct the target reconstruction function of the current target detection point, the distance information is determined according to the target shooting angle and the reconstruction function, and the transparency parameter is then determined and used for display. This solves the problem that the transparent display deviates from the actual situation when the first sub-model is displayed with a fixed transparency display value; the transparency parameter is adjusted according to the actual situation so that the transparent display effect is consistent with the actual theoretical effect, and the user experience is improved.
  • FIG. 5 is a schematic structural diagram of an apparatus for determining transparency according to an embodiment of the present application. As shown in FIG. 5, the apparatus includes: a distance information determination module 510, a projection coefficient value determination module 520 and a transparency parameter determination module 530.
  • The distance information determination module 510 is configured to determine, for each target detection point on the first sub-model, the current coordinate information of the current target detection point, and determine the distance function corresponding to the current target detection point based on the current coordinate information;
  • the current coordinate information is the coordinates obtained after matrix transformation of the three-dimensional space coordinates of the current target detection point, and the distance function is determined according to the relative distance information between the target detection point and the second sub-model in each direction;
  • the first sub-model is a model wrapping the second sub-model;
  • the projection coefficient value determination module 520 is configured to process the distance function of each target detection point based on a preset spherical harmonic function to obtain the projection coefficient value of each target detection point;
  • the spherical harmonic function is composed of a plurality of basis functions;
  • the transparency parameter determination module 530 is configured to store, for each target detection point, the projection coefficient value of the current target detection point to the target storage position in the engine, so that when transparent display is detected, the target reconstruction function is reconstructed based on the projection coefficient value stored in the target storage position, the target distance information between the first sub-model and the second sub-model at the target shooting angle is determined based on the target reconstruction function, and the transparency parameter corresponding to the target distance information is determined, so as to display each target detection point on the first sub-model based on the transparency parameter of each target detection point.
  • The distance information determination module 510 is configured to: for each target detection point, determine the information of each to-be-processed collision point when the current target detection point emits physical rays in each direction in space and the rays pass through the second sub-model; determine, according to the current target detection point and the information of each to-be-processed collision point, the distance information between the current target detection point and the second sub-model in each direction; and determine the spherical distribution distance function of the current target detection point according to the current coordinate information of the current target detection point and the distance information of the current target detection point in each direction.
  • The distance information determination module 510 is configured to take the current target detection point as the center of a sphere, emit physical rays in each direction in space, and determine the to-be-processed collision point information when each physical ray passes through the second sub-model.
  • The distance information determination module 510 is configured to: when there is a to-be-processed collision point between a physical ray and the second sub-model, determine the distance information between the to-be-processed collision point and the current target detection point; when there is no to-be-processed collision point, set the distance information corresponding to that ray to a set value; and determine, according to the distance information and set values corresponding to the individual rays, the distance information between the current target detection point and the second sub-model in each direction.
  • The distance information determination module 510 is configured to: determine, for each target detection point, the three-dimensional space coordinates of the current target detection point; and process the three-dimensional space coordinates according to a predetermined coordinate transformation matrix to determine the current coordinate information of the current target detection point; the current coordinate information is determined based on the tangent, subtangent and normal of the current target detection point.
  • The projection coefficient value determination module 520 is configured to: determine the order of the spherical harmonic function, and determine the representation of the basis functions in the spherical harmonic function and the number of basis functions according to the order; and, for each target detection point, process the distance function of the current target detection point based on each basis function to obtain the projection coefficient values of the current target detection point; the number of projection coefficient values is the same as the number of basis functions.
  • the transparency parameter determination module 530 is configured to store, for each target detection point, the projection coefficient value of the current target detection point and the current coordinate information to the target storage location.
  • In the technical solution of this embodiment, for each target detection point on the first sub-model, the current coordinate information of the current target detection point is determined, the distance function corresponding to the current target detection point is determined based on the current coordinate information, the distance function of each target detection point is processed based on the preset spherical harmonic function to obtain the projection coefficient value of each target detection point, and the projection coefficient value of the current target detection point is then stored to the target storage location in the engine. This solves the problem that the transparent display deviates from the actual situation when the first sub-model is displayed with a fixed transparency display value, which results in a poor transparent display effect and a poor user experience; the transparency parameter is adjusted according to the actual situation, so that the transparent display effect is consistent with the actual theoretical effect, and the user experience is improved.
  • the apparatus for determining transparency provided by the embodiment of the present application can execute the method for determining transparency provided by any embodiment of the present application, and has corresponding functional modules and beneficial effects for executing the method.
  • FIG. 6 is a schematic structural diagram of an apparatus for determining transparency according to another embodiment of the present application. As shown in FIG. 6 , the apparatus includes: a target shooting angle determination module 610 , a target reconstruction function reconstruction module 620 and a transparent display module 630 .
  • The target shooting angle determination module 610 is configured to determine the target shooting angle corresponding to the shooting device and each target detection point on the first sub-model; the target reconstruction function reconstruction module 620 is configured to, for each target detection point, determine the target coordinate information according to the current coordinate information of the current target detection point, and reconstruct the target reconstruction function of the current target detection point according to the target coordinate information and the projection coefficient value of the current target detection point; the current coordinate information is the coordinates obtained by matrix transformation of the three-dimensional space coordinates of the current target detection point; the first sub-model is the model wrapping the second sub-model; the projection coefficient value is determined by processing, based on the spherical harmonic function, the distance function of the spherical distribution of each target detection point on the first sub-model.
  • The transparent display module 630 is configured to determine the distance information corresponding to each target detection point based on the reconstruction function corresponding to each target detection point and the corresponding target shooting angle, determine the transparency parameter between the first sub-model and the second sub-model based on the distance information, and display the first sub-model based on the transparency parameter.
  • The target reconstruction function reconstruction module 620 is configured to determine the target coordinate information of the current target detection point in the current scene according to the current coordinate information of the current target detection point, and to process the target coordinate information and the projection coefficient values of the current target detection point by using the preset spherical harmonic function to reconstruct the target reconstruction function of the current target detection point;
  • the spherical harmonic function includes at least one basis function.
  • The transparent display module 630 is configured to input the target shooting angle corresponding to the current target detection point into the reconstruction function, and obtain the distance information between the first sub-model and the second sub-model when the line connecting the current target detection point and the shooting device collides with the second sub-model.
  • The transparent display module 630 is configured to determine the transparency parameter of each target detection point according to the preset correspondence between the distance information and the transparency parameter and the distance information corresponding to each target detection point, and to display the first sub-model and the second sub-model based on the transparency parameters.
  • In the technical solution of this embodiment, for each target detection point, the three-dimensional space coordinates of the current target detection point are determined, and the three-dimensional space coordinates are processed according to a predetermined coordinate transformation matrix to determine the current coordinate information of the current target detection point. According to the current target detection point and the information of each to-be-processed collision point, the distance information of the current target detection point in each direction is determined, and the distance function of the spherical distribution of the corresponding target detection point is then determined. For each target detection point, the distance function of the current target detection point is processed based on each basis function to obtain the projection coefficient values of the current target detection point, and the projection coefficient values of the current target detection point and the current coordinate information are stored to the target storage location. This solves the problem that the transparent display deviates from the actual situation when the first sub-model is displayed with a fixed transparency display value, which results in a poor transparent display effect and a poor user experience; the transparency parameters are adjusted according to the actual situation to make the transparent display effect consistent with the actual theoretical effect, and the user experience is improved.
  • the apparatus for determining transparency provided by the embodiment of the present application can execute the method for determining transparency provided by any embodiment of the present application, and has corresponding functional modules and beneficial effects for executing the method.
  • FIG. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application; FIG. 7 shows a block diagram of an exemplary electronic device 70 suitable for implementing the embodiments of the present application.
  • the electronic device 70 shown in FIG. 7 is only an example, and should not impose any limitations on the functions and scope of use of the embodiments of the present application.
  • electronic device 70 takes the form of a general-purpose computing device.
  • Components of electronic device 70 may include, but are not limited to, one or more processors or processing units 701, system memory 702, and a bus 703 connecting different system components (including system memory 702 and processing unit 701).
  • Bus 703 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a graphics acceleration port, a processor, or a local bus using any of a variety of bus structures.
  • these architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, enhanced ISA bus, Video Electronics Standards Association (VESA) local bus and Peripheral Component Interconnect (PCI) bus.
  • Electronic device 70 typically includes a variety of computer system readable media. These media can be any available media that can be accessed by electronic device 70, including both volatile and non-volatile media, removable and non-removable media.
  • System memory 702 may include computer system readable media in the form of volatile memory, such as random access memory (RAM) 704 and/or cache memory 705 .
  • Electronic device 70 may further include other removable/non-removable, volatile/non-volatile computer system storage media.
  • storage system 706 may be configured to read and write to non-removable, non-volatile magnetic media (not shown in FIG. 7, commonly referred to as a "hard drive").
  • A magnetic disk drive configured to read and write to removable non-volatile magnetic disks (eg "floppy disks") and an optical disk drive configured to read and write to removable non-volatile optical disks (eg CD-ROM, DVD-ROM or other optical media) may also be provided.
  • each drive may be connected to bus 703 through one or more data media interfaces.
  • System memory 702 may include at least one program product having a set (eg, at least one) of program modules configured to perform the functions of each embodiment of the present application.
  • Program modules 707 generally perform the functions and/or methods of the embodiments described herein.
  • The electronic device 70 may also communicate with one or more external devices 709 (eg, keyboard, pointing device, display 710, etc.), with one or more devices that enable a user to interact with the electronic device 70, and/or with any device (eg, network card, modem, etc.) that enables the electronic device 70 to communicate with one or more other computing devices. Such communication may take place through an input/output (I/O) interface 711. Also, the electronic device 70 may communicate with one or more networks (eg, a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) through a network adapter 712. As shown, the network adapter 712 communicates with the other modules of the electronic device 70 via the bus 703.
  • It should be understood that, although not shown in FIG. 7, other hardware and/or software modules may be used in conjunction with the electronic device 70, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Arrays of Independent Disks (RAID) systems, tape drives, and data backup storage systems, etc.
  • the processing unit 701 executes each functional application and data processing by running the program stored in the system memory 702, for example, implementing the method for determining transparency provided by the embodiments of the present application.
  • An embodiment of the present application further provides a storage medium containing computer-executable instructions, the computer-executable instructions, when executed by a computer processor, are configured to perform a method of determining transparency.
  • the method includes:
  • For each target detection point on the first sub-model, determine the current coordinate information of the current target detection point, and determine the distance function corresponding to the current target detection point based on the current coordinate information; the current coordinate information is the coordinates obtained after matrix transformation of the three-dimensional space coordinates of the current target detection point, and the distance function is determined according to the relative distance information between the target detection point and the second sub-model in each direction; the first sub-model is a model wrapping the second sub-model;
  • the distance function of each target detection point is processed based on the preset spherical harmonic function, and the projection coefficient value of each target detection point is obtained;
  • the spherical harmonic function is composed of multiple basis functions;
  • the projection coefficient value of the current target detection point is stored in the target storage position in the engine, so that when transparent display is detected, the target reconstruction function is reconstructed based on the projection coefficient value stored in the target storage position, based on The target reconstruction function determines the target distance information between the first sub-model and the second sub-model under the target shooting angle, and determines the transparency parameter corresponding to the target distance information, so as to display the first sub-model based on the transparency parameter of each target detection point.
  • Alternatively, the method includes:
  • determining the target shooting angle corresponding to the shooting device and each target detection point on the first sub-model;
  • for each target detection point, determining the target coordinate information according to the current coordinate information of the current target detection point, and reconstructing the target reconstruction function of the current target detection point according to the target coordinate information and the projection coefficient value of the current target detection point; the current coordinate information is the coordinates obtained by performing matrix transformation on the three-dimensional space coordinates of the current target detection point; the first sub-model is the model wrapping the second sub-model; the projection coefficient value is determined by processing, based on the spherical harmonic function, the distance function of the spherical distribution of each target detection point on the first sub-model;
  • determining, based on the reconstruction function corresponding to each target detection point and the corresponding target shooting angle, the distance information corresponding to each target detection point, determining the transparency parameter between the first sub-model and the second sub-model based on the distance information, and displaying the first sub-model based on the transparency parameter.
  • the computer storage medium of the embodiments of the present application may adopt any combination of one or more computer-readable media.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • the computer-readable storage medium can be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a combination of any of the above.
  • Examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection with one or more wires, a portable computer disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), optical storage devices, magnetic storage devices, or any suitable combination of the above.
  • a computer-readable storage medium can be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a propagated data signal in baseband or as part of a carrier wave, with computer-readable program code embodied thereon. Such propagated data signals may take a variety of forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a computer-readable signal medium can also be any computer-readable medium other than a computer-readable storage medium that can transmit, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device .
  • Program code embodied on a computer readable medium may be transmitted using any suitable medium, including but not limited to wireless, wire, optical fiber cable, radio frequency (RF), etc., or any suitable combination of the foregoing.
  • Computer program code configured to perform the operations of the embodiments of the present application may be written in one or more programming languages, or combinations thereof, including object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (eg, through the Internet using an Internet service provider).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The embodiments of the present application disclose a transparency determination method and apparatus, an electronic device and a storage medium. Said method comprises: for each target detection point on a first sub-model, determining current coordinate information of a current target detection point, and determining a distance function corresponding to the current target detection point on the basis of the current coordinate information; processing the distance function of each target detection point on the basis of a preset spherical harmonic function, so as to obtain a projection coefficient value of each target detection point; and for each target detection point, storing the projection coefficient value of the current target detection point to a target storage position in an engine.

Description

Method, apparatus, electronic device and storage medium for determining transparency
This application claims the priority of the Chinese Patent Application No. 202011444007.4 filed with the China Patent Office on December 08, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of the present application relate to game development technologies, for example, to a method, apparatus, electronic device, and storage medium for determining transparency.
Background
In animation design, the translucent effect between the inner model and the outer model is usually set, for example, the translucent display of the skin model and the clothes model. Correspondingly, the translucent display mainly depends on the effect displayed after the light reflected by the inner layer object penetrates the outer layer object after a certain distance and is incident on the human eye. Among them, each model is composed of points. When the distance between the point on the inner layer object and the corresponding point on the outer layer object is farther, the translucent effect is weaker, and vice versa, the translucent effect is relatively strong.
In the related art, to determine the translucent effect, the transparency display value of the outer layer model is usually set, and the transparency display value is usually fixed, and all points on the outer layer model realize the transparent display according to the set transparency display value. In this way, there is a certain deviation between the transparent display and the actual situation, resulting in poor transparent display effect and poor user experience.
Summary of the Invention
The embodiments of the present application provide a method, apparatus, electronic device, and storage medium for determining transparency, so that the transparent display effect is consistent with the actual situation, and the user experience is improved.
In a first aspect, an embodiment of the present application provides a method for determining transparency, the method comprising:
for each target detection point on a first sub-model, determining the current coordinate information of the current target detection point, and determining the distance function corresponding to the current target detection point based on the current coordinate information, where the current coordinate information is the coordinates obtained after matrix transformation of the three-dimensional space coordinates of the current target detection point, the distance function is determined according to the relative distance information between the target detection point and a second sub-model in each direction, and the first sub-model is a model wrapping the second sub-model;
processing the distance function of each target detection point based on a preset spherical harmonic function to obtain the projection coefficient value of each target detection point, where the spherical harmonic function is composed of a plurality of basis functions; and
for each target detection point, storing the projection coefficient value of the current target detection point to a target storage position in an engine, so that when transparent display is detected, a target reconstruction function is reconstructed based on the projection coefficient value stored in the target storage position, the target distance information between the first sub-model and the second sub-model at a target shooting angle is determined based on the target reconstruction function, and the transparency parameter corresponding to the target distance information is determined, so as to display each target detection point on the first sub-model based on the transparency parameter of each target detection point.
In a second aspect, an embodiment of the present application further provides a method for determining transparency, the method comprising:
determining the target shooting angle corresponding to the shooting device and each target detection point on the first sub-model;
for each target detection point, determining the target coordinate information according to the current coordinate information of the current target detection point, and reconstructing the target reconstruction function of the current target detection point according to the target coordinate information and the projection coefficient value of the current target detection point, where the current coordinate information is the coordinates obtained by performing matrix transformation on the three-dimensional space coordinates of the current target detection point, the first sub-model is a model wrapping the second sub-model, and the projection coefficient value is determined by processing, based on the spherical harmonic function, the distance function of the spherical distribution of each target detection point on the first sub-model; and
determining, based on the reconstruction function corresponding to each target detection point and the corresponding target shooting angle, the distance information corresponding to each target detection point, determining the transparency parameter between the first sub-model and the second sub-model based on the distance information, and displaying the first sub-model based on the transparency parameter.
In a third aspect, an embodiment of the present application further provides an apparatus for determining transparency, comprising:
a distance information determination module, configured to determine, for each target detection point on the first sub-model, the current coordinate information of the current target detection point, and determine the distance function corresponding to the current target detection point based on the current coordinate information, where the current coordinate information is the coordinates obtained after matrix transformation of the three-dimensional space coordinates of the current target detection point, the distance function is determined according to the relative distance information between the target detection point and the second sub-model in each direction, and the first sub-model is a model wrapping the second sub-model;
a projection coefficient value determination module, configured to process the distance function of each target detection point based on a preset spherical harmonic function to obtain the projection coefficient value of each target detection point, where the spherical harmonic function is composed of a plurality of basis functions; and
a transparency parameter determination module, configured to store, for each target detection point, the projection coefficient value of the current target detection point to a target storage position in the engine, so that when transparent display is detected, the target reconstruction function is reconstructed based on the projection coefficient value stored in the target storage position, the target distance information between the first sub-model and the second sub-model at the target shooting angle is determined based on the target reconstruction function, and the transparency parameter corresponding to the target distance information is determined, so as to display each target detection point on the first sub-model based on the transparency parameter of each target detection point.
In a fourth aspect, an embodiment of the present application further provides an apparatus for determining transparency, comprising:
a target shooting angle determination module, configured to determine the target shooting angle corresponding to the shooting device and each target detection point on the first sub-model;
a target reconstruction function reconstruction module, configured to determine, for each target detection point, the target coordinate information according to the current coordinate information of the current target detection point, and reconstruct the target reconstruction function of the current target detection point according to the target coordinate information and the projection coefficient value of the current target detection point, where the current coordinate information is the coordinates obtained by performing matrix transformation on the three-dimensional space coordinates of the current target detection point, the first sub-model is a model wrapping the second sub-model, and the projection coefficient value is determined by processing, based on the spherical harmonic function, the distance function of the spherical distribution of each target detection point on the first sub-model; and
a transparent display module, configured to determine the distance information corresponding to each target detection point based on the reconstruction function corresponding to each target detection point and the corresponding target shooting angle, determine the transparency parameter between the first sub-model and the second sub-model based on the distance information, and display the first sub-model based on the transparency parameter.
In a fifth aspect, an embodiment of the present application further provides an electronic device, the electronic device comprising:
one or more processors;
storage means arranged to store one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors implement the method for determining transparency described in any one of the embodiments of the present application.
In a sixth aspect, the embodiments of the present application further provide a storage medium containing computer-executable instructions, where the computer-executable instructions, when executed by a computer processor, are configured to perform the method for determining transparency described in any one of the embodiments of the present application.
Brief Description of the Drawings
FIG. 1 is a flowchart of a method for determining transparency provided by an embodiment of the present application;
FIG. 2 is a flowchart of a method for determining transparency provided by another embodiment of the present application;
FIG. 3 is a flowchart of a method for determining transparency provided by another embodiment of the present application;
FIG. 4 is a flowchart of a method for determining transparency provided by another embodiment of the present application;
FIG. 5 is a schematic structural diagram of an apparatus for determining transparency provided by an embodiment of the present application;
FIG. 6 is a schematic structural diagram of an apparatus for determining transparency provided by another embodiment of the present application;
FIG. 7 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
Detailed Description of Embodiments
The present application will be further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the embodiments described herein are only used to explain the present application, but not to limit the present application. In addition, it should be noted that, for the convenience of description, the drawings only show some but not all of the structures related to the present application.
FIG. 1 is a flowchart of a method for determining transparency provided by an embodiment of the present application. This embodiment can be applied to the case where the transparency parameter is adjusted according to the actual situation, so that the transparent display effect is consistent with the actual theoretical effect. The method can be executed by an apparatus for determining transparency, which can be implemented in the form of software and/or hardware; the hardware may be an electronic device, for example, a mobile terminal or the like.
As shown in FIG. 1, this embodiment includes the following steps:
S110. For each target detection point on the first sub-model, determine the current coordinate information of the current target detection point, and determine a distance function corresponding to the current target detection point based on the current coordinate information.
The first sub-model and the second sub-model are relative to each other: if the application scene contains a skin model and a clothes model, the model corresponding to the clothes can be used as the first sub-model, and the model corresponding to the skin can be used as the second sub-model. The target detection points may be detection points preset on the first sub-model; or the preset points on the first sub-model may be divided into a plurality of blocks and the center point of each block used as a target detection point; or they may be detection points set by the developer according to actual needs; or, since the first sub-model is composed of multiple points, each point may be used as a target detection point. The current coordinate information is the coordinates obtained by performing a matrix transformation on the three-dimensional space coordinates of the current target detection point, and may be determined based on the tangent, the subtangent and the normal of the current target detection point: a coordinate system can be established from the tangent, subtangent and normal of the current target detection point, the spatial coordinates in the three-dimensional space coordinate system are converted into coordinates in the tangent, subtangent and normal directions, and the coordinates obtained in this way are the current coordinate information. The distance function corresponds to each target detection point, that is, one target detection point corresponds to one distance function. Each distance function can be understood as being determined according to the relative distance information between a certain target detection point and the second sub-model in each direction in space. Since it is a distance function of spherical distribution, the distance between each target detection point and the second sub-model in each direction can be obtained by taking the target detection point as the center of a sphere, emitting physical rays in each direction in space, and determining the distance information between the intersection of each physical ray with the second sub-model and the target detection point. Exemplarily, if there are 1000 target detection points, the number of distance functions is also 1000, where each distance function is determined based on the distances between a certain target detection point and the second sub-model in each direction.
It should be noted that the spherical distribution distance functions of different target detection points are determined in the same way; in order to clearly introduce the technical solution of this embodiment, determining the spherical distribution function of one of the target detection points is described as an example.
For example, a target detection point on the first sub-model can be taken as the center of a sphere, physical rays can be emitted in each direction in space, and the distance information between the intersection of each physical ray with the second sub-model and the target detection point can be determined. If a physical ray has an intersection with the second sub-model in the direction of the ray, the distance between the intersection and the target detection point is used as the distance information between the target detection point and the second sub-model in this direction. If there is no intersection between the physical ray and the second sub-model in the direction of the ray, the preset maximum distance information may be used as the distance information between the target detection point and the second sub-model in this direction.
It should be noted that the distance information of the current target detection point in different directions in space is determined in the same way; in order to clearly introduce the technical solution of this embodiment, the determination for one of the directions is described as an example.
The current target detection point emits a physical ray in a certain direction in space. If the physical ray has an intersection with the second sub-model, the distance information of the current target detection point in the current direction can be determined from the three-dimensional space coordinate information of the current target detection point and the three-dimensional space coordinate information of the collision point at which the current physical ray intersects the second sub-model, using the straight-line distance formula between two points in space. For example, if the current coordinate information is (am1, am2, am3) and the collision point coordinate information is (ap1, ap2, ap3), the distance information between the first sub-model and the second sub-model corresponding to the current target detection point based on the current physical ray can be determined as
L = sqrt((am1 - ap1)^2 + (am2 - ap2)^2 + (am3 - ap3)^2).
If there is no intersection between the physical ray and the second sub-model, the distance information of the current target detection point in the current direction may be the preset maximum distance information, for example: L = 10 nm.
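As a concrete illustration of this step, the following is a minimal Python sketch, assuming a hypothetical ray-casting step has already produced either the collision point coordinates or None when the ray misses the second sub-model; the preset maximum distance of 10 is only the example value used above.

```python
import math

PRESET_MAX_DISTANCE = 10.0  # fallback used when a ray does not hit the second sub-model

def distance_in_direction(detection_point, collision_point):
    """Straight-line distance between the detection point and the collision point,
    or the preset maximum distance when there is no collision point."""
    if collision_point is None:
        return PRESET_MAX_DISTANCE
    am1, am2, am3 = detection_point
    ap1, ap2, ap3 = collision_point
    return math.sqrt((am1 - ap1) ** 2 + (am2 - ap2) ** 2 + (am3 - ap3) ** 2)

# Hypothetical example values:
print(distance_in_direction((0.0, 1.0, 0.0), (0.0, 0.98, 0.05)))  # ray hit the inner model
print(distance_in_direction((0.0, 1.0, 0.0), None))               # ray missed: preset maximum
```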
It should be noted that the distance between the two points can also be determined in other ways, which is not shown here, as long as the distance information between the two points can be determined.
Further, according to each target detection point, the distance information between the corresponding first sub-model and the second sub-model can be determined.
After the distance information is determined, the distance function of the spherical distribution of the target detection point can be determined. For example, the spherical distribution distance function of target detection point A can be written as
F(i) = dist_i, i = 1, 2, ..., n,
where i represents the i-th direction, F(i) represents the distance information in the i-th direction, dist_i represents the distance information, and n represents the total number of directions.
It should be noted that the distance function of the spherical distribution of each target detection point is a composite function, and the number of sub-functions in the composite function can be determined according to a preset number of samples. The default may be a 16×32 precision, that is, the composite function contains 512 sub-functions. The larger the number of samples, the more sub-functions the composite function contains and the higher the precision. The number of samples can be determined according to actual needs.
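The sampled spherical distance function described above can be sketched as follows, assuming a hypothetical cast_ray helper provided by the physics engine that returns the collision point with the second sub-model or None; the direction parameterization, the default 16×32 grid and the max_distance fallback are illustrative assumptions.

```python
import math

def build_spherical_distance_samples(detection_point, cast_ray,
                                     theta_steps=16, phi_steps=32,
                                     max_distance=10.0):
    """Sample the spherical distribution distance function on a theta x phi grid.

    cast_ray(origin, direction) is assumed to return the collision point with the
    second sub-model, or None when the ray does not hit it.
    Returns a list of (direction, distance) pairs (16 x 32 = 512 samples by default).
    """
    samples = []
    for ti in range(theta_steps):
        theta = math.pi * (ti + 0.5) / theta_steps   # polar angle
        for pj in range(phi_steps):
            phi = 2.0 * math.pi * pj / phi_steps     # azimuth
            direction = (math.sin(theta) * math.cos(phi),
                         math.sin(theta) * math.sin(phi),
                         math.cos(theta))
            hit = cast_ray(detection_point, direction)
            dist = max_distance if hit is None else math.dist(detection_point, hit)
            samples.append((direction, dist))
    return samples
```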
S120. Process the distance function of each target detection point based on the preset spherical harmonic function to obtain the projection coefficient value of each target detection point.
It should be noted that the projection coefficient values of different target detection points are determined in the same way; in order to clearly introduce the technical solution of this embodiment, determining the projection coefficient values of one of the target detection points is described as an example.
The spherical harmonic function is composed of multiple basis functions, and each distance function can be processed using the basis functions. Different orders of the spherical harmonic function correspond to different numbers of basis functions: a second-order spherical harmonic function contains 4 basis functions, a third-order spherical harmonic function contains 9 basis functions, a fourth-order spherical harmonic function contains 16 basis functions, and so on. The projection coefficient values are the coefficient values obtained after the distance function is processed according to the basis functions in the spherical harmonic function, and the number of coefficient values is the same as the number of basis functions. The spherical distribution distance function of the target detection point can be compressed according to the spherical harmonic function and its corresponding basis functions to obtain the projection coefficient values. For example, if the spherical harmonic function is of second order, that is, it includes 4 basis functions, then inputting the distance function corresponding to the target detection point into the spherical harmonic function yields 4 projection coefficient values, so that the distance function is compressed into 4 projection coefficient values.
It should be noted that the higher the order of the spherical harmonic function, the higher the similarity between the sphere reconstructed during reconstruction and the actual sphere; therefore, developers can select spherical harmonic functions of different orders according to actual needs. A higher-order spherical harmonic function contains more basis functions, and the degree of restoration is higher when the distance function is subsequently restored from the spherical harmonic function and the projection coefficient values.
For example, after the order of the spherical harmonic function is determined, the spherical distribution distance function of the target detection point can be input into the spherical harmonic function, and the distance function is processed based on the basis functions in the spherical harmonic function to obtain the projection coefficient value of the distance function on each basis function. It follows that the number of projection coefficient values is equal to the number of basis functions in the spherical harmonic function.
It should be noted that the spherical distribution distance functions of different target detection points may be the same or different. When the distance information of different target detection points in each direction in space is exactly the same, the distance functions corresponding to the different target detection points are the same; when at least one piece of the distance information of different target detection points in each direction in space differs, the distance functions corresponding to the different target detection points are different; whether the distance functions are the same is determined according to the distance information of the target detection points in each direction in space. After different distance functions are input into the preset spherical harmonic function for processing, the obtained projection coefficient values are different.
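A minimal sketch of this projection step, assuming the 9 real basis functions of a third-order basis (as counted in this document) and a simple Monte Carlo style estimate of the projection integral over the sampled directions; the basis constants are the standard real spherical harmonic coefficients, and the sample set in the usage example is only illustrative.

```python
import math

def sh_basis_9(x, y, z):
    # First nine real spherical harmonic basis functions (bands l = 0, 1, 2).
    return [
        0.282095,
        0.488603 * y, 0.488603 * z, 0.488603 * x,
        1.092548 * x * y, 1.092548 * y * z,
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z, 0.546274 * (x * x - y * y),
    ]

def project_distance_function(samples):
    """Project sampled (direction, distance) pairs onto the 9 basis functions.

    Uses a simple estimate of the projection integral over the sphere:
    coefficient_k ~= (4*pi / N) * sum_i dist_i * Y_k(direction_i).
    """
    coefficients = [0.0] * 9
    for (x, y, z), dist in samples:
        basis = sh_basis_9(x, y, z)
        for k in range(9):
            coefficients[k] += dist * basis[k]
    weight = 4.0 * math.pi / len(samples)
    return [c * weight for c in coefficients]

# Hypothetical usage: a constant distance of 5 in six symmetric directions
# projects almost entirely onto the first (constant) basis function.
uniform_samples = [((0.0, 0.0, 1.0), 5.0), ((0.0, 0.0, -1.0), 5.0),
                   ((1.0, 0.0, 0.0), 5.0), ((-1.0, 0.0, 0.0), 5.0),
                   ((0.0, 1.0, 0.0), 5.0), ((0.0, -1.0, 0.0), 5.0)]
print(project_distance_function(uniform_samples))
```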
S130. For each target detection point, store the projection coefficient value of the current target detection point to the target storage position in the engine, so that when transparent display is detected, the target reconstruction function is reconstructed based on the projection coefficient value stored in the target storage position, the target distance information between the first sub-model and the second sub-model at the target shooting angle is determined based on the target reconstruction function, and the transparency parameter corresponding to the target distance information is determined, so as to display each target detection point on the first sub-model based on the transparency parameter of each target detection point.
For example, for each target detection point, the projection coefficient values corresponding to the current target detection point may be stored in the vertex color corresponding to the current target detection point and/or in the attribute information of the current target detection point, and imported into the target storage position in the engine according to the index coordinates of the vertex.
Each target detection point on the first sub-model can carry a vertex color, and each target detection point corresponds to pixel channels, namely the four RGBA channels, so a projection coefficient value of the target detection point can be stored in each channel. The attribute information may be extensible information corresponding to each target detection point, for example UV, that is, the u, v texture mapping coordinates. The vertex color corresponding to the target detection point and the attribute information of the current target detection point can be used together to store the projection coefficient values.
For example, when the projection coefficient values corresponding to the current target detection point are stored in the vertex color corresponding to the current target detection point, 4 projection coefficient values can be stored in the four channels R, G, B and A respectively, and the number of vertex colors required can be determined according to the number of projection coefficient values. It is also possible to store 4 projection coefficient values in one vertex color and store the remaining projection coefficient values in the UV coordinates corresponding to the current target detection point; this reduces the use of vertex colors and facilitates the storage of the projection coefficient values as well as their subsequent retrieval and use.
For example, the vertex colors corresponding to the target detection point and the UV coordinates corresponding to the current target detection point can also be used together to store the projection coefficient values; for example, if 9 projection coefficient values are to be stored, two vertex colors can be used to store 8 projection coefficient values, and the remaining 1 projection coefficient value can be stored in the UV coordinates corresponding to the current target detection point.
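A minimal sketch of this packing scheme, assuming 9 coefficients per detection point and plain Python containers standing in for the engine's vertex color and UV channels (the actual engine data structures and import API are not specified in this document):

```python
def pack_coefficients(coefficients):
    """Pack 9 projection coefficient values into two RGBA vertex colors plus one UV slot.

    Returns (vertex_color_0, vertex_color_1, uv_extra) where each vertex color is a
    4-tuple (R, G, B, A) and uv_extra holds the remaining coefficient.
    """
    assert len(coefficients) == 9
    vertex_color_0 = tuple(coefficients[0:4])  # R, G, B, A
    vertex_color_1 = tuple(coefficients[4:8])  # R, G, B, A
    uv_extra = coefficients[8]                 # stored in a spare UV channel
    return vertex_color_0, vertex_color_1, uv_extra

def unpack_coefficients(vertex_color_0, vertex_color_1, uv_extra):
    """Inverse of pack_coefficients, used when the reconstruction function is rebuilt."""
    return list(vertex_color_0) + list(vertex_color_1) + [uv_extra]

coeffs = [4.8, 0.1, -0.2, 0.05, 0.0, 0.01, -0.03, 0.02, 0.0]
packed = pack_coefficients(coeffs)
assert unpack_coefficients(*packed) == coeffs
```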
The projection coefficient values may alternatively be stored as follows: for each target detection point, the projection coefficient values corresponding to the current target detection point are stored in pixels of at least one picture, where the number of pictures is the same as the number of projection coefficient values corresponding to the current target detection point, and all the pictures storing the projection coefficient values are imported into the target storage position in the engine according to the index coordinates of the vertices.
Exemplarily, if 9 projection coefficient values are to be stored, 9 pictures may be used, and the projection coefficient values are respectively stored in the pixel corresponding to the current target detection point in each picture.
After the projection coefficient values have been stored, they can be imported into the target position in the engine according to the index coordinates of the current target detection point, so that the projection coefficient values corresponding to the current target detection point can be retrieved from the engine according to the target detection point; the transparency parameter of each target detection point is then determined according to the target distance information reconstructed from the projection coefficient values, and the first sub-model and the second sub-model are displayed based on each transparency parameter.
In the technical solution of this embodiment, for each target detection point on the first sub-model, the current coordinate information of the current target detection point is determined, the distance function corresponding to the current target detection point is determined based on the current coordinate information, the distance function of each target detection point is processed based on the preset spherical harmonic function to obtain the projection coefficient value of each target detection point, and the projection coefficient value of the current target detection point is then stored to the target storage position in the engine. This solves the technical problem that the transparent display deviates from the actual situation when the first sub-model is displayed with a fixed transparency display value, which results in a poor transparent display effect and a poor user experience; the transparency parameter is adjusted according to the actual situation, so that the transparent display effect is consistent with the actual theoretical effect, and the user experience is improved.
FIG. 2 is a flowchart of a method for determining transparency provided by another embodiment of the present application. On the basis of the above embodiment, the manner of determining the current coordinate information and the distance function, and the manner of storing the projection coefficient values of each target detection point at the target storage location, may refer to the technical solution of this embodiment. Explanations of terms that are the same as or correspond to those in each of the above embodiments are not repeated here.

Referring to FIG. 2, the method for determining transparency provided by this embodiment includes the following steps.
S210. For each target detection point, determine the three-dimensional space coordinates of the current target detection point.

For example, a three-dimensional space coordinate system is established with a certain point in space as the origin of the space coordinates, and the coordinate information of the current target detection point in this coordinate system is the three-dimensional space coordinates of the current target detection point. For example, a point under the feet of a character in the space is taken as the origin of the space coordinates, that is, (0, 0, 0), and a three-dimensional space coordinate system is established; the current target detection point may be a point on the head of the character whose coordinate information in the coordinate system is (0, 1, 0), and this coordinate information is the three-dimensional space coordinates of that point. In this three-dimensional space coordinate system, the three-dimensional space coordinates of each target detection point are fixed.
S220. Process the three-dimensional space coordinates according to a predetermined coordinate transformation matrix, and determine the current coordinate information of the current target detection point.

The coordinate transformation matrix is a transformation matrix used to convert the three-dimensional space coordinates into the current coordinate information, and the current coordinate information is determined by obtaining, according to the transformation matrix, the values of the current target detection point along its tangent, bitangent, and normal.

For example, a current coordinate axis is defined with the tangent, bitangent, and normal of the current target detection point as references, and the three-dimensional space coordinates are transformed according to the coordinate transformation matrix corresponding to the current coordinate axis to obtain the current coordinate information of the current target detection point. For example, the formula x' = Tx may be used, where T is the coordinate transformation matrix determined by the tangent, bitangent, and normal of the current target detection point, and x is the three-dimensional space coordinate to be transformed.

Exemplarily, a point on the hand of a character in space is taken as the current target detection point; for example, the three-dimensional space coordinates of the current target detection point are (0, 0, 0). The current coordinate axis is determined with the tangent, bitangent, and normal of the current target detection point as references, and the three-dimensional space coordinates are transformed according to the coordinate transformation matrix corresponding to the current coordinate axis, so that the three-dimensional space coordinates (0, 0, 0) of the current target detection point become (-0.2, 0.5, 0.3).

It should be noted that the current coordinate information of the current target detection point is expressed in the local space of the current target detection point, rather than being relative coordinate information determined with reference to other points in space. When the position of the current target detection point changes in space, the tangent, bitangent, and normal of the current target detection point also change accordingly, and the changed current coordinate information can be further determined according to the changed current target detection point.

That is, when the three-dimensional space coordinates of each target detection point in the model change, the coordinates in the corresponding tangent, normal, and bitangent directions can be determined, and the reconstruction function can then be rebuilt based on these coordinates. This makes the method applicable to the case where distance information is determined for a deformed object, and the transparency parameter is then determined according to the distance information.
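A minimal sketch of the transform x' = Tx described above follows, under assumed vector values: the rows of T are the tangent, bitangent, and normal of the detection point, so multiplying by T expresses a world-space position in the point's local frame. The explicit origin subtraction and the example vectors are illustrative assumptions, not values from the embodiments.

```python
import numpy as np

def to_local_frame(position_world, origin_world, tangent, bitangent, normal):
    """Return the coordinates of position_world in the tangent/bitangent/normal frame."""
    T = np.vstack([tangent, bitangent, normal])  # 3x3 coordinate transformation matrix
    return T @ (np.asarray(position_world, dtype=float) - np.asarray(origin_world, dtype=float))

# Hypothetical values, for illustration only:
tangent   = np.array([1.0, 0.0, 0.0])
bitangent = np.array([0.0, 0.0, 1.0])
normal    = np.array([0.0, 1.0, 0.0])
local = to_local_frame([0.3, 0.5, -0.2], [0.5, 0.0, 0.0], tangent, bitangent, normal)
```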
S230. For each target detection point, determine information of each to-be-processed collision point at which a physical ray emitted from the current target detection point in each direction in space passes through the second sub-model.

The information of each to-be-processed collision point may include position information of each to-be-processed collision point, for example, the corresponding spatial coordinate information, and the like.

For example, physical rays may be emitted in each direction from each target detection point on the first sub-model, and each of these physical rays may pass through the second sub-model. When a physical ray passes through the second sub-model, it is determined that a to-be-processed collision point exists between the physical ray and the second sub-model, and the intersection of the physical ray and the second sub-model is taken as the to-be-processed collision point. The to-be-processed collision point information may be information used to describe the position of that point, such as the corresponding spatial coordinate information. Therefore, according to the to-be-processed collision point, the spatial coordinate information corresponding to the to-be-processed collision point can be determined as the to-be-processed collision point information.

In this embodiment, determining the to-be-processed collision point information may be: taking the current target detection point as the center of a sphere, emitting physical rays in arbitrary directions in space, and determining the to-be-processed collision point information when each physical ray passes through the second sub-model.

For example, emitting physical rays in each direction from each target detection point on the first sub-model can be regarded as taking the current target detection point as the center of a sphere and emitting physical rays in each direction of the spherical surface. If a physical ray passes through the second sub-model, the spatial coordinate information of the intersection of the physical ray and the second sub-model is taken as the to-be-processed collision point information.
S240. Determine, according to the current target detection point and the information of each to-be-processed collision point, distance information between the current target detection point and the second sub-model in each direction.

When a to-be-processed collision point exists between the physical ray and the second sub-model, the distance information between the to-be-processed collision point and the current target detection point is determined.

For example, when a to-be-processed collision point exists, the spatial coordinate information corresponding to the to-be-processed collision point is taken as the spatial coordinate information of the to-be-processed collision point, and the distance information between the to-be-processed collision point and the current target detection point can be calculated from the current coordinate information of the current target detection point and the spatial coordinate information of the to-be-processed collision point using the formula for the distance between two points in space.

When no to-be-processed collision point exists between the physical ray and the second sub-model, the distance information corresponding to that to-be-processed collision point is set to a set value.

For example, if the physical ray does not pass through the second sub-model, for example, the emitted physical ray is parallel to the second sub-model or is emitted in a direction facing away from the second sub-model, no collision point exists between the physical ray and the second sub-model, and the to-be-processed collision information in this case may be set to a set value. The set value may be the maximum distance information between a to-be-processed collision point and the current target detection point.

According to the distance information or set value corresponding to each to-be-processed collision point, the distance information between the current target detection point and the second sub-model in each direction is determined.

For example, based on the above two cases, the distance information or set value corresponding to each physical ray emitted from the current target detection point can be determined, and the distance information or set value is taken as the distance information between the current target detection point and the second sub-model in each direction.
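The following is a minimal, self-contained sketch of S230/S240 under simplifying assumptions: the second sub-model is approximated by an analytic sphere so that the ray/model intersection can be computed in closed form, directions are sampled quasi-uniformly, and misses fall back to a set maximum value. A real implementation would ray-cast against the actual inner mesh; all names and constants here are illustrative.

```python
import numpy as np

MAX_DISTANCE = 100.0  # the "set value" used when a ray does not hit the second sub-model

def sample_directions(n: int) -> np.ndarray:
    """Roughly uniform unit directions on the sphere (Fibonacci spiral sampling)."""
    k = np.arange(n) + 0.5
    phi = np.arccos(1.0 - 2.0 * k / n)
    theta = np.pi * (1.0 + 5 ** 0.5) * k
    return np.stack([np.sin(phi) * np.cos(theta),
                     np.sin(phi) * np.sin(theta),
                     np.cos(phi)], axis=1)

def distances_to_inner_model(point, sphere_center, sphere_radius, directions):
    """Distance from the detection point to the inner model along each sampled direction."""
    dists = np.full(len(directions), MAX_DISTANCE)
    oc = np.asarray(point, dtype=float) - np.asarray(sphere_center, dtype=float)
    for i, d in enumerate(directions):
        # Solve |oc + t*d|^2 = r^2 for the nearest positive t (ray/sphere intersection).
        b = 2.0 * np.dot(oc, d)
        c = np.dot(oc, oc) - sphere_radius ** 2
        disc = b * b - 4.0 * c
        if disc >= 0.0:
            t = (-b - np.sqrt(disc)) / 2.0
            if t > 0.0:
                dists[i] = t
    return dists
```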
S250. Determine, according to the distance information of each target detection point in each direction, a spherically distributed distance function of the corresponding target detection point.

For example, by taking the distance information of each target detection point in each direction in space as a sub-function of the spherically distributed distance function of that target detection point, the spherically distributed distance function of the target detection point can be obtained. The number of sub-functions in the distance function is the same as the amount of distance information of the target detection point in each direction of the spherical surface.

It should be noted that, in order to improve accuracy, the number of sub-functions can be increased, that is, the density of physical rays can be increased, and the number of physical rays can be determined according to actual requirements.
S260. Determine the order of the spherical harmonic functions, and determine, according to the order, the representation of the basis functions in the spherical harmonic functions and the number of basis functions.

Spherical harmonic functions of different orders contain different numbers of basis functions. For example, second-order spherical harmonics contain 4 basis functions, third-order spherical harmonics contain 9 basis functions, fourth-order spherical harmonics contain 16 basis functions, and so on. The higher the order of the spherical harmonics, the better the effect when the reconstruction function is subsequently used for reconstruction; the order needs to be set according to actual requirements.

For example, if the order of the spherical harmonics is determined to be a according to requirements, the number of basis functions in the spherical harmonics can be determined to be a².

The representation of each basis function can be determined according to the relationship between the distance function and the projection coefficient values.
S270. For each target detection point, process the distance function of the current target detection point based on each basis function to obtain the projection coefficient values of the current target detection point.

It should be noted that the projection coefficient values of different target detection points are determined in the same manner. In order to clearly introduce the technical solution of this embodiment, the determination of the projection coefficient values of one of the target detection points is taken as an example.

The number of projection coefficient values is the same as the number of basis functions. A projection coefficient value is a value determined by calculating the distance function with each basis function of the preset spherical harmonic functions.

For example, processing the distance function of the target detection point based on each basis function yields a projection coefficient value corresponding to that basis function; therefore, the number of projection coefficient values is the same as the number of basis functions. By inputting the distance function of the target detection point into each basis function of the spherical harmonics, the projection coefficient value of the distance function on each basis function can be obtained.

It should be noted that the spherically distributed distance function of each target detection point is different, and after different distance functions are input into the preset spherical harmonic functions for processing, the obtained projection coefficient values are different.
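A minimal sketch of this projection step follows, assuming second-order real spherical harmonics (4 basis functions) for brevity; the constants are the standard real SH normalization factors, and the integral against each basis function is approximated by a Monte-Carlo style sum over the sampled ray directions. The function names are illustrative.

```python
import numpy as np

def sh_basis_order2(direction: np.ndarray) -> np.ndarray:
    """Real spherical harmonic basis Y_00, Y_1-1, Y_10, Y_11 for a unit direction."""
    x, y, z = direction
    return np.array([0.282095,          # Y_00
                     0.488603 * y,      # Y_1-1
                     0.488603 * z,      # Y_10
                     0.488603 * x])     # Y_11

def project_distance_function(directions: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Projection coefficient values c_j ~ (4*pi / N) * sum_i dist_i * Y_j(omega_i)."""
    n = len(directions)
    basis = np.array([sh_basis_order2(d) for d in directions])   # shape (N, 4)
    return (4.0 * np.pi / n) * basis.T @ distances               # shape (4,) coefficients
```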
S280. For each target detection point, store the projection coefficient values of the current target detection point and the current coordinate information at the target storage location.

For example, the projection coefficient values corresponding to the current target detection point may be stored in the vertex color corresponding to the current target detection point and/or in the attribute information of the current target detection point. The projection coefficient values corresponding to the current target detection point may also be stored in pixels of at least one picture corresponding to the current target detection point.

In order to store the projection coefficient values corresponding to the current target detection point, the projection coefficient values may be stored in the vertex color corresponding to the current target detection point and/or in the attribute information of the current target detection point, so as to facilitate subsequent retrieval and use.

For example, the target number of projection coefficient values corresponding to the current target detection point is determined; the number of vertex colors corresponding to the current target detection point is determined based on the target number and the storage number corresponding to a vertex color; and the projection coefficient values are stored in the vertex colors corresponding to the current target detection point.

The target number is the number of projection coefficient values, which is also the number of basis functions in the preset spherical harmonic functions. The storage number is the number of projection coefficient values that each vertex color can store; for example, a vertex color contains four channels, RGBA, so the storage number is 4. The number of vertex colors is the number of vertex colors used to store the projection coefficient values.

For example, a vertex color may be stored by RGBA channels, that is, with values in 4 channels, or by RGB channels, that is, with values in 3 channels. Taking vertex colors stored with RGBA channels as an example, if the preset spherical harmonics are second-order spherical harmonics containing 4 basis functions, 4 projection coefficient values are obtained, and these 4 projection coefficient values are stored in one vertex color corresponding to the current target detection point. If the preset spherical harmonics are fourth-order spherical harmonics containing 16 basis functions, 16 projection coefficient values are obtained, and these 16 projection coefficient values are stored in four vertex colors corresponding to the current target detection point, where the four vertex colors belong to different pictures and correspond to the current target detection point.

It should be noted that, since each vertex color can store 4 projection coefficient values, if the number of projection coefficient values is not a multiple of 4, the number of vertex colors needs to be rounded up. For example, third-order spherical harmonics contain 9 basis functions, corresponding to 9 projection coefficient values; two vertex colors can then store 8 projection coefficient values, and the remaining 1 projection coefficient value still requires one vertex color, so a total of 3 vertex colors are needed.

Another possible manner is: after one or more vertex colors are used to store part of the projection coefficient values, the attribute information of the vertex color is used to store the remaining projection coefficient values.

For example, the target number of projection coefficient values corresponding to the current target detection point is determined; the preset number of projection coefficient values that can be stored in the vertex color of the current target detection point is determined; and the remaining projection coefficient values are stored in the attribute information of the vertex color according to the target number and the preset number.

The preset number is the number of projection coefficient values that the vertex color can store.

For example, the remaining projection coefficient values can be determined according to the difference between the target number and the preset number, and the remaining projection coefficient values are stored in the attribute information of the vertex color. For example, each vertex color can store 4 projection coefficient values, while third-order spherical harmonics contain 9 basis functions, corresponding to 9 projection coefficient values; one vertex color corresponding to the target detection point can store 4 projection coefficient values, and the 5 remaining projection coefficient values are stored in the UV coordinates corresponding to the target detection point.

It is also possible to use two vertex colors corresponding to the target detection point to store 8 projection coefficient values and to store the 1 remaining projection coefficient value in the UV coordinates corresponding to the target detection point. The above storage manners can reduce the number of vertex colors that need to be used.
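A minimal sketch of the packing scheme discussed above follows, assuming 9 projection coefficient values per vertex (third-order SH): 8 values go into two RGBA vertex colors and the remaining value goes into one channel of a spare UV set. The layout and names are illustrative, not the engine's actual data format.

```python
import numpy as np

def pack_coefficients(coeffs: np.ndarray):
    """coeffs: (9,) projection coefficient values for one target detection point."""
    assert coeffs.shape == (9,)
    vertex_color_0 = coeffs[0:4]             # RGBA channels of the first vertex color
    vertex_color_1 = coeffs[4:8]             # RGBA channels of the second vertex color
    uv_extra = np.array([coeffs[8], 0.0])    # remaining value in a spare UV coordinate
    return vertex_color_0, vertex_color_1, uv_extra

def vertex_colors_needed(num_coeffs: int, channels_per_color: int = 4) -> int:
    """Rounded-up number of vertex colors if only vertex colors are used (e.g. 9 -> 3)."""
    return -(-num_coeffs // channels_per_color)   # ceiling division
```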
In order to store the projection coefficient values corresponding to the current target detection point, the projection coefficient values may also be stored in pixels of at least one picture corresponding to the current target detection point, so as to facilitate subsequent retrieval and use.

Exemplarily, if 9 projection coefficient values are to be stored, 9 pictures may be used, and the projection coefficient values are respectively stored in the pixels corresponding to the current target detection point.

Further, the vertex colors and/or attribute information and the current coordinate information are imported into the target location in the engine for storage. Alternatively, the pixels of the at least one picture and the current coordinate information are imported into the target location in the engine for storage.

The engine may be a core component of an already-written editable computer game system or of an interactive real-time graphics application. The target location may be a storage space in the engine for storing data and/or information; in this embodiment, it is a storage space for storing the coordinate information of the target detection points.

For example, after the projection coefficient values corresponding to the current target detection point are stored in the vertex color corresponding to the current target detection point and/or in the attribute information of the current target detection point, or in the pixels of at least one picture corresponding to the current target detection point, in order to enable the engine to retrieve and use the stored projection coefficient values, the vertex color and/or attribute information of the current target detection point, or the pixels of the at least one picture, need to be imported into the target location in the engine for storage, and the current coordinate information is also imported into the target location in the engine for storage. If the engine needs to use the projection coefficient values corresponding to a certain target detection point, the projection coefficient values corresponding to that coordinate information can be determined at the target location in the engine for subsequent reconstruction.
In the technical solution of the embodiments of the present application, for each target detection point, the three-dimensional space coordinates of the current target detection point are determined; the three-dimensional space coordinates are processed according to a predetermined coordinate transformation matrix to determine the current coordinate information of the current target detection point; the distance information of the current target detection point in each direction is determined according to the current target detection point and the information of each to-be-processed collision point, and the spherically distributed distance function of the corresponding target detection point is then determined; for each target detection point, the distance function of the current target detection point is processed based on each basis function to obtain the projection coefficient values of the current target detection point, and the projection coefficient values of the current target detection point and the current coordinate information are stored at the target storage location. This solves the technical problem that, when the first sub-model is displayed transparently with a fixed transparency display value, the transparent display deviates from the actual situation, resulting in a poor transparent display effect and a poor user experience. The transparency parameter is adjusted according to the actual situation, so that the transparent display effect matches the actual theoretical effect, thereby improving the user experience.
FIG. 3 is a flowchart of a method for determining transparency provided by another embodiment of the present application. This embodiment is applicable to the case where the distance information between a target detection point and the second sub-model at each angle is reconstructed from projection coefficient values and transparent display is performed according to the distance information. The method may be performed by an apparatus for determining transparency, which may be implemented in the form of software and/or hardware; the hardware may be an electronic device, for example, a mobile terminal or the like. Explanations of terms that are the same as or correspond to those in each of the above embodiments are not repeated here.

As shown in FIG. 3, this embodiment includes the following steps.
S310. Determine the target shooting angle corresponding to the shooting device and each target detection point on the first sub-model.

The shooting device is a device for observing and shooting the first sub-model, and the target shooting angle is the relative angle between the shooting device and each target detection point on the first sub-model.

For example, the target shooting angles corresponding to the shooting device and the respective target detection points differ from one another. According to the relative positional relationship between the shooting device and each target detection point on the first sub-model, the relative angle information between the shooting device and each target detection point can be determined, and this angle information can be taken as the target shooting angle.
S320. For each target detection point, determine target coordinate information according to the current coordinate information of the current target detection point, and reconstruct a target reconstruction function of the current target detection point according to the target coordinate information and the projection coefficient values of the current target detection point.

The first sub-model and the second sub-model are relative to each other. If the application scenario involves a skin model and a clothing model, the model corresponding to the clothing may be taken as the first sub-model and the model corresponding to the skin as the second sub-model. The target coordinate information is the coordinate information of the current target detection point in the current scene; for example, the current coordinate information before the current target detection point is deformed is converted into target coordinate information after the deformation occurs. The projection coefficient values of the current target detection point include the projection coefficient values determined after the spherically distributed distance function of the current target detection point on the first sub-model is processed based on the spherical harmonic functions. The target reconstruction function is a function constructed, after the projection coefficient values of the current target detection point are processed, from the distance information corresponding to the collisions between the physical rays and the second sub-model when the current target detection point emits physical rays in each direction in space.

For example, in the current scene, the target coordinate information of the current target detection point can be determined according to the current coordinate information of the current target detection point and the coordinate axes determined from the tangent, bitangent, and normal of the current detection point. When each target detection point on the first sub-model needs to be displayed transparently, the projection coefficient values of the current target detection point can be obtained from the target storage location corresponding to the current target detection point. A distance function including the distance information corresponding to each angle is simulated from the stored projection coefficient values and the target coordinate information of the current target detection point. For example, from the projection coefficient values, the distance value to the second sub-model in each direction, when physical rays are emitted in each direction of space with the current target detection point as the center of a sphere, can be simulated, and the target reconstruction function of the current target detection point can then be reconstructed according to the distance information. Exemplarily, taking the eyes as the shooting device: when the second sub-model, such as an arm, is stretched straight forward, a certain target detection point on the first sub-model (for example, the clothing on the arm is the first sub-model) lies in the y-axis direction; at this time, the x-axis and z-axis are parallel to the ground and approximately in a straight line with the eyes, and in this case the first sub-model is approximately opaque. When the arm is raised, the target detection point on the first sub-model is located in the positive direction of the y-axis, but the z-axis faces the second sub-model. In this case, the coordinate information of the same target detection point has changed, that is, the coordinates of the tangent, bitangent, and normal corresponding to the same target detection point at different positions are different, and the reconstruction functions rebuilt from these coordinates are therefore also different. Even under the condition that the object is deformed, the reconstruction function corresponding to each target detection point can still be determined, and the distance information corresponding to each target detection point can then be determined.

It should be noted that, for each target detection point, the above manner can be used to determine the target shooting angle between the target detection point and the shooting device, to determine, based on the target shooting angle, the distance information between each target detection point and the collision point on the second sub-model, and to reconstruct the target reconstruction function.

Exemplarily, when the current target detection point on the first sub-model is displayed transparently, 2 vertex colors and 1 piece of attribute information corresponding to the current target detection point can be determined according to the current target detection point. Since one projection coefficient value is stored in each of the four RGBA channels of each vertex color, the 9 projection coefficient values corresponding to the current target detection point can be obtained. By processing the above 9 projection coefficient values according to the preset spherical harmonic functions, 9 pieces of distance information can be determined. The above 9 pieces of distance information are the distance information corresponding to the target detection point at 9 angles in space, and a reconstruction function can be constructed according to the distance information.
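The following is a minimal sketch of reconstructing the distance from the stored projection coefficient values: the reconstruction function is the sum of each coefficient multiplied by its basis function, evaluated at the view direction expressed in the detection point's local frame. Second-order real SH is assumed here for brevity, matching the projection sketch above; function names are illustrative.

```python
import numpy as np

def sh_basis_order2(direction):
    """Real spherical harmonic basis Y_00, Y_1-1, Y_10, Y_11 for a unit direction."""
    x, y, z = direction
    return np.array([0.282095, 0.488603 * y, 0.488603 * z, 0.488603 * x])

def reconstruct_distance(coeffs: np.ndarray, view_direction_local) -> float:
    """Approximate distance to the second sub-model along the given view direction."""
    d = np.asarray(view_direction_local, dtype=float)
    d = d / np.linalg.norm(d)
    return float(np.dot(coeffs, sh_basis_order2(d)))
```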
S330. Determine, based on the reconstruction function corresponding to each target detection point and the corresponding target shooting angle, the distance information corresponding to each target detection point; determine, based on the distance information, a transparency parameter between the first sub-model and the second sub-model; and display the first sub-model based on the transparency parameter.

The transparency parameter is used to represent the degree of transparency when the model is displayed, and may be represented by a percentage, for example, a transparency of 80%, and so on.

For example, the target reconstruction function may be used to process an input target shooting angle to determine the distance information between the first sub-model and the second sub-model corresponding to the target detection point at that target shooting angle. According to the current target detection point and the shooting device, for example, the camera device corresponding to the display screen or the pupil position of a human eye, the target shooting angle between the shooting device and the current target detection point can be determined. By inputting this target shooting angle into the target reconstruction function, the collision point on the second sub-model corresponding to the current target detection point can be determined, and the distance information between that collision point and the current target detection point can then be determined.

It should be noted that the transparency parameters of different target detection points are determined in the same manner. In order to clearly introduce the technical solution of this embodiment, the determination of the transparency parameter of one of the target detection points is taken as an example.

For example, determining the transparency parameter of the current target detection point according to the distance information between the first sub-model and the second sub-model corresponding to the current target detection point may be: determining the transparency parameter from a pre-stored correspondence between distance information and transparency parameters according to the distance information; or calculating it according to a preset transparency parameter calculation model, that is, inputting the distance information into the transparency parameter calculation model and obtaining, through calculation, the transparency parameter corresponding to the distance information.

Exemplarily, the distance information is denoted as dist. If 0 nm < dist ≤ 5 nm, the transparency parameter is 90%; if 5 nm < dist ≤ 10 nm, the transparency parameter is 80%; when dist = 7.5 nm, the corresponding transparency parameter is 80%.

Exemplarily, the transparency calculation formula is a_i = f(l_i), where a_i denotes the transparency parameter of the i-th target detection point, l_i denotes the distance information between the first sub-model and the second sub-model corresponding to the i-th target detection point, and f is a monotonic, monotonically decreasing function.
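A minimal sketch of such a monotonically decreasing mapping a_i = f(l_i) from distance to transparency follows. The threshold table mirrors the exemplary values above, while the far-distance fallback value is an illustrative assumption rather than a value taken from the embodiments.

```python
def transparency_from_distance(dist: float) -> float:
    """Return a transparency parameter in [0, 1]; a closer inner model gives higher transparency."""
    table = [(5.0, 0.90), (10.0, 0.80), (20.0, 0.60)]  # (upper distance bound, transparency)
    for bound, alpha in table:
        if dist <= bound:
            return alpha
    return 0.40  # far away: the outer model is displayed mostly opaque (assumed fallback)
```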
After the transparency parameter of each target detection point is determined, in order to provide a better visual experience of the first sub-model and the second sub-model, the target detection points on the first sub-model may be displayed according to the corresponding transparency parameters to obtain the effect of transparent display.
In the technical solution of this embodiment, the target shooting angle is determined, the target coordinate information is determined according to the current coordinate information of the current target detection point, the target reconstruction function is reconstructed according to the target coordinate information and the projection coefficient values of the current target detection point, the corresponding distance information is determined based on the reconstruction function and the corresponding target shooting angle, and the transparency parameter is then determined and used for display. This solves the technical problem that, when the first sub-model is displayed transparently with a fixed transparency display value, the transparent display deviates from the actual situation, resulting in a poor transparent display effect and a poor user experience. The transparency parameter is adjusted according to the actual situation, so that the transparent display effect matches the actual theoretical effect, thereby improving the user experience.
FIG. 4 is a flowchart of a method for determining transparency provided by another embodiment of the present application. On the basis of the above embodiments, the reconstruction function can be reconstructed according to the spherical harmonic functions and the projection coefficient values, and the transparency parameter corresponding to a target detection point at the target shooting angle between the target detection point and the shooting device can be determined based on the reconstruction function; for the implementation, reference may be made to the following description. Explanations of terms that are the same as or correspond to those in each of the above embodiments are not repeated here.

Referring to FIG. 4, the method for determining transparency provided by this embodiment includes the following steps.
S410. Determine the target shooting angle corresponding to the shooting device and each target detection point on the first sub-model.

S420. Determine, according to the current coordinate information of the current target detection point, the target coordinate information of the current target detection point in the current scene.

For example, the current coordinate information of the current target detection point is coordinate information obtained by processing the three-dimensional space coordinates of the current target detection point according to a predetermined coordinate transformation matrix. In the current scene, the target coordinate information of the current target detection point can be determined according to the current coordinate information of the current target detection point and the coordinate axes determined from the tangent, bitangent, and normal of the current detection point.

S430. Process the target coordinate information and the projection coefficient values of the current target detection point by using the preset spherical harmonic functions, and reconstruct the target reconstruction function of the current target detection point.

The target coordinate information is the coordinate information obtained by further converting the current coordinate information of the target detection point in the current scene.
Based on the preset spherical harmonic functions, a reconstruction function corresponding to each target detection point can be reconstructed. In order to clearly introduce the technical solution of this embodiment, the reconstruction of the reconstruction function of one of the target detection points is taken as an example.

The reconstruction function of a target detection point is a function constructed, after the projection coefficients of the current target detection point are processed according to the spherical harmonic functions, from the distance values corresponding to the collisions between the physical rays and the second sub-model when the current target detection point emits physical rays in each direction in space. The reconstruction function may be used to process an input target shooting angle to determine the distance information between the first sub-model and the second sub-model corresponding to the target detection point at that target shooting angle.

For example, by processing the target coordinate information and the projection coefficient values of the current target detection point based on the preset spherical harmonic functions, the distance information between the current target detection point and the collision point at which a physical ray emitted from the current target detection point in each direction in space collides with the second sub-model can be obtained. The reconstruction function corresponding to the current target detection point is constructed according to the distance information in each direction in space corresponding to the current target detection point.
S440. Input the target shooting angle corresponding to the current target detection point into the reconstruction function, and obtain the distance information at which the straight line through the current target detection point and the target shooting device intersects the first sub-model and the second sub-model.

For example, after the target shooting angle between the current target detection point and the target shooting device is determined, the target shooting angle can be input into the reconstruction function corresponding to the current target detection point, the reconstruction function can process the target shooting angle, and the distance information between the collision point and the current target detection point, when the straight line through the current target detection point and the target shooting device collides with the second sub-model at the target shooting angle, is output.

Exemplarily, if the 9 projection coefficient values corresponding to the current target detection point are obtained, the reconstruction function corresponding to the target detection point can be obtained by performing inverse transform processing through the preset spherical harmonic functions according to the above 9 projection coefficient values and the target coordinate information. By inputting the target shooting angle, such as 45°, into the reconstruction function, the distance information between the first sub-model and the second sub-model corresponding to the current target detection point at the target shooting angle, such as 5 nm, can be determined based on the reconstruction function.
S450. Determine the transparency parameter of each target detection point according to the preset correspondence between distance information and transparency parameters and the distance information corresponding to each target detection point.

The correspondence between each piece of distance information and its corresponding transparency parameter may be stored in advance. For example, for every 10 nm increase the transparency parameter decreases by ten percent: with the distance information denoted as dist, for 0 nm < dist ≤ 10 nm the transparency parameter is 100%, for 10 nm < dist ≤ 20 nm the transparency parameter is 90%, for 20 nm < dist ≤ 30 nm the transparency parameter is 80%, and so on.

It should be noted that the transparency parameter may include the transparency parameter of the first sub-model and the transparency parameter of the second sub-model.

For example, according to the distance information corresponding to each target detection point, the transparency parameter corresponding to the distance information, which may include the transparency parameter of the first sub-model and the transparency parameter of the second sub-model, can be determined from the pre-stored correspondence between each piece of distance information and the transparency parameter, for use in subsequent transparent display.
S460. Display the first sub-model and the second sub-model based on the transparency parameter.

For example, after the transparency parameters corresponding to each target detection point for displaying the first sub-model and the second sub-model at the target shooting angle are determined, the transparent display effect of the relative positions of the first sub-model and the second sub-model can be realized based on the above transparency parameters.
In the technical solution of this embodiment, the target shooting angle and the target coordinate information of the current target detection point in the current scene are determined, the target coordinate information and the projection coefficient values of the current target detection point are processed based on the preset spherical harmonic functions to reconstruct the target reconstruction function of the current target detection point, the distance information is determined according to the target shooting angle and the reconstruction function, and the transparency parameter is then determined and used for display. This solves the technical problem that, when the first sub-model is displayed transparently with a fixed transparency display value, the transparent display deviates from the actual situation, resulting in a poor transparent display effect and a poor user experience. The transparency parameter is adjusted according to the actual situation, so that the transparent display effect matches the actual theoretical effect, thereby improving the user experience.
图5为本申请一实施例提供的一种确定透明度的装置的结构示意图,如图5所示,该装置包括:距离信息确定模块510,投影系数值确定模块520和透明度参数确定模块530。5 is a schematic structural diagram of an apparatus for determining transparency provided by an embodiment of the present application. As shown in FIG. 5 , the apparatus includes: a distance information determination module 510 , a projection coefficient value determination module 520 and a transparency parameter determination module 530 .
其中,距离信息确定模块510,设置为针对第一子模型上的每个目标检测点, 确定当前目标检测点的当前坐标信息,并基于当前坐标信息确定与当前目标检测点对应的距离函数;当前坐标信息为对当前目标检测点的三维空间坐标进行矩阵变化后得到的坐标,距离函数是根据目标检测点在每个方向上与第二子模型之间的相对距离信息来确定的;第一子模型为包裹第二子模型的模型;投影系数值确定模块520,设置为基于预先设置的球谐函数对每个目标检测点的距离函数进行处理,得到每个目标检测点的投影系数值;球谐函数由多个基函数构成;透明度参数确定模块530,设置为针对每个目标检测点,将当前目标检测点的投影系数值存储至引擎中的目标存储位置,以在检测到透明显示时,基于目标存储位置中存储的投影系数值重建出目标重建函数,基于目标重建函数确定目标拍摄角度下第一子模型与第二子模型之间的目标距离信息,并确定与目标距离信息相对应的透明度参数,以基于每个目标检测点的透明度参数显示第一子模型上的每个目标检测点。Wherein, the distance information determination module 510 is configured to determine the current coordinate information of the current target detection point for each target detection point on the first sub-model, and determine the distance function corresponding to the current target detection point based on the current coordinate information; The coordinate information is the coordinates obtained after the three-dimensional space coordinates of the current target detection point are changed in a matrix, and the distance function is determined according to the relative distance information between the target detection point and the second sub-model in each direction; The model is a model wrapping the second sub-model; the projection coefficient value determination module 520 is set to process the distance function of each target detection point based on a preset spherical harmonic function to obtain the projection coefficient value of each target detection point; The harmonic function is composed of a plurality of basis functions; the transparency parameter determination module 530 is configured to store the projection coefficient value of the current target detection point in the target storage position in the engine for each target detection point, so that when transparent display is detected, The target reconstruction function is reconstructed based on the projection coefficient value stored in the target storage location, the target distance information between the first sub-model and the second sub-model under the target shooting angle is determined based on the target reconstruction function, and the target distance information corresponding to the target distance information is determined. A transparency parameter to display each target detection point on the first sub-model based on the transparency parameter of each target detection point.
例如,距离信息确定模块510,设置为针对每个目标检测点,确定当前目标检测点向空间中每个方向发射物理射线透过第二子模型时的每个待处理碰撞点信息;根据当前目标检测点与每个待碰撞点信息,确定当前目标检测在每个方向上与第二子模型之间的距离信息;根据当前目标检测点的当前坐标信息,以及当前目标检测点在每个方向上的距离信息,确定当前目标检测点的球面分布距离函数。For example, the distance information determination module 510 is configured to, for each target detection point, determine the information of each to-be-processed collision point when the current target detection point emits physical rays to each direction in space and passes through the second sub-model; The detection point and the information of each point to be collided, determine the distance information between the current target detection and the second sub-model in each direction; according to the current coordinate information of the current target detection point, and the current target detection point in each direction The distance information is determined to determine the spherical distribution distance function of the current target detection point.
例如,距离信息确定模块510,设置为以当前目标检测点为球心,向空间中的任意方向发射物理射线,确定每个物理射线透过第二子模型时的待处理碰撞点信息。For example, the distance information determination module 510 is set to take the current target detection point as the center of the sphere, emit physical rays in any direction in space, and determine the collision point information to be processed when each physical ray passes through the second sub-model.
例如,距离信息确定模块510,设置为当物理射线与第二子模型存在待处理碰撞点时,则确定待处理碰撞点信息与当前目标检测点的距离信息;当物理信息与第二子模型不存在待处理碰撞点时,则将与待处理碰撞点所对应的距离信息设置为设定值;根据与每个待处理碰撞点所对应的距离信息和设定值,确定当前目标检测点在每个方向上与第二子模型之间的距离信息。For example, the distance information determination module 510 is configured to determine the distance information between the collision point information to be processed and the current target detection point when there is a collision point to be processed between the physical ray and the second sub-model; When there is a collision point to be processed, the distance information corresponding to the collision point to be processed is set as the set value; according to the distance information and set value corresponding to each collision point to be processed, it is determined that the current target detection point is in each collision point. distance information from the second sub-model in this direction.
例如,距离信息确定模块510,设置为针对每个目标检测点,确定当前目标检测点的三维空间坐标;根据预先确定的坐标变换矩阵,对三维空间坐标进行处理,确定当前目标检测点的当前坐标信息;当前坐标信息是基于当前目标检测点的切线、副切线以及法线来确定。For example, the distance information determination module 510 is configured to determine the three-dimensional space coordinates of the current target detection point for each target detection point; process the three-dimensional space coordinates according to a predetermined coordinate transformation matrix to determine the current coordinates of the current target detection point Information; the current coordinate information is determined based on the tangent, subtangent and normal of the current target detection point.
For example, the projection coefficient value determination module 520 is configured to determine the order of the spherical harmonic function and to determine, according to the order, the representation of the basis functions in the spherical harmonic function and the number of basis functions; and, for each target detection point, to process the distance function of the current target detection point based on each basis function to obtain the projection coefficient values of the current target detection point; the number of projection coefficient values is the same as the number of basis functions.
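For an order-n expansion there are (n + 1)^2 basis functions, so an order-2 expansion yields nine projection coefficients per detection point. The sketch below uses the real spherical-harmonic basis constants that are standard in graphics and a simple Monte-Carlo-style estimate of the projection integral; the choice of order 2 and the uniform-direction weighting are assumptions for illustration rather than requirements of the embodiment.

```python
import numpy as np

def sh_basis_order2(direction):
    """Real spherical-harmonic basis values up to order 2 (9 values) for a unit vector."""
    x, y, z = direction
    return np.array([
        0.282095,                         # l=0, m= 0
        0.488603 * y,                     # l=1, m=-1
        0.488603 * z,                     # l=1, m= 0
        0.488603 * x,                     # l=1, m= 1
        1.092548 * x * y,                 # l=2, m=-2
        1.092548 * y * z,                 # l=2, m=-1
        0.315392 * (3.0 * z * z - 1.0),   # l=2, m= 0
        1.092548 * x * z,                 # l=2, m= 1
        0.546274 * (x * x - y * y),       # l=2, m= 2
    ])

def project_onto_sh(directions, distances):
    """Project sampled distances onto the order-2 basis, giving 9 coefficients.

    c_k ~= (4*pi / N) * sum_i f(w_i) * Y_k(w_i), assuming the sample directions
    are spread roughly uniformly over the sphere.
    """
    coeffs = np.zeros(9)
    weight = 4.0 * np.pi / len(directions)
    for d, dist in zip(directions, distances):
        coeffs += weight * dist * sh_basis_order2(d)
    return coeffs
```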
For example, the transparency parameter determination module 530 is configured to store, for each target detection point, the projection coefficient values and the current coordinate information of the current target detection point at the target storage location.
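A minimal sketch of what the per-point record written to the target storage location might contain is given below, assuming nine order-2 coefficients per detection point. The field names and the idea of keeping the tangent-frame data alongside the coefficients are illustrative only; the embodiment does not prescribe a concrete layout, and a real engine would typically pack such data into vertex attributes or textures.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class DetectionPointRecord:
    """Illustrative per-point payload stored at the engine's target storage location."""
    sh_coefficients: List[float] = field(default_factory=lambda: [0.0] * 9)  # order-2 SH
    tangent: Tuple[float, float, float] = (1.0, 0.0, 0.0)
    bitangent: Tuple[float, float, float] = (0.0, 1.0, 0.0)
    normal: Tuple[float, float, float] = (0.0, 0.0, 1.0)
```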
In the technical solution of this embodiment, for each target detection point on the first sub-model, the current coordinate information of the current target detection point is determined, the distance function corresponding to the current target detection point is determined based on the current coordinate information, the distance function of each target detection point is processed based on a preset spherical harmonic function to obtain the projection coefficient value of each target detection point, and the projection coefficient value of the current target detection point is then stored at a target storage location in the engine. This solves the technical problem that, when the first sub-model is displayed transparently with a fixed transparency display value, the transparent display deviates from the actual situation, resulting in a poor transparent display effect and a poor user experience; the transparency parameter is adjusted according to the actual situation, so that the transparent display effect matches the theoretically expected effect and the user experience is improved.
The apparatus for determining transparency provided by this embodiment of the present application can execute the method for determining transparency provided by any embodiment of the present application, and has the functional modules and beneficial effects corresponding to executing the method.
FIG. 6 is a schematic structural diagram of an apparatus for determining transparency according to another embodiment of the present application. As shown in FIG. 6, the apparatus includes a target shooting angle determination module 610, a target reconstruction function reconstruction module 620 and a transparent display module 630.
The target shooting angle determination module 610 is configured to determine the target shooting angle corresponding to the shooting device and each target detection point on the first sub-model. The target reconstruction function reconstruction module 620 is configured to determine, for each target detection point, target coordinate information according to the current coordinate information of the current target detection point, and to reconstruct the target reconstruction function of the current target detection point according to the target coordinate information and the projection coefficient values of the current target detection point; the current coordinate information is the coordinates obtained by applying a matrix transformation to the three-dimensional space coordinates of the current target detection point, the first sub-model is a model wrapping the second sub-model, and the projection coefficient values are determined by processing, based on the spherical harmonic function, the spherically distributed distance function of each target detection point on the first sub-model. The transparent display module 630 is configured to determine, based on the reconstruction function corresponding to each target detection point and the corresponding target shooting angle, the distance information corresponding to each target detection point, to determine the transparency parameter between the first sub-model and the second sub-model based on the distance information, and to display the first sub-model based on the transparency parameter.
For example, the target reconstruction function reconstruction module 620 is configured to determine, according to the current coordinate information of the current target detection point, the target coordinate information of the current target detection point in the current scene, and to process the target coordinate information and the projection coefficient values of the current target detection point by means of a preset spherical harmonic function to reconstruct the target reconstruction function of the current target detection point; the spherical harmonic function includes at least one basis function.
For example, the transparent display module 630 is configured to input the target shooting angle corresponding to the current target detection point into the reconstruction function to obtain the distance information at the points where the line defined by the current target detection point and the target shooting device intersects the first sub-model and the second sub-model.
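At runtime, reconstructing the stored function and evaluating it at the target shooting angle reduces to a dot product between the stored coefficients and the basis values for the viewing direction, as sketched below. The basis evaluator is passed in as a parameter (for instance, the hypothetical order-2 helper sketched earlier), and the viewing direction is assumed to be expressed in the same frame that was used during projection.

```python
import numpy as np

def reconstruct_distance(coeffs, view_direction, sh_basis):
    """Evaluate the reconstructed distance function for one viewing direction.

    `coeffs` are the projection coefficient values read back from the target
    storage location; `sh_basis(direction)` must return the same basis values
    that were used when the coefficients were projected.
    """
    return float(np.dot(np.asarray(coeffs), sh_basis(np.asarray(view_direction))))
```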
For example, the transparent display module 630 is configured to determine the transparency parameter of each target detection point according to a preset correspondence between distance information and transparency information and the distance information corresponding to each target detection point, and to display the first sub-model and the second sub-model based on the transparency parameters.
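The correspondence between distance and transparency is preset rather than fixed by the embodiment, so the mapping below is only one plausible choice: a clamped linear ramp in which a larger gap between the two sub-models weakens the translucent effect. The threshold and the alpha range are illustrative parameters, and the convention that alpha denotes the opacity of the outer sub-model is an assumption.

```python
def distance_to_alpha(distance, max_distance=0.05, min_alpha=0.1, max_alpha=1.0):
    """Map a reconstructed distance to a transparency parameter (alpha).

    Assumed convention: alpha is the opacity of the outer (first) sub-model, so a
    larger distance gives a more opaque outer surface and a weaker translucent effect.
    """
    t = min(max(distance / max_distance, 0.0), 1.0)   # clamp to [0, 1]
    return min_alpha + t * (max_alpha - min_alpha)
```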
In the technical solution of this embodiment, for each target detection point, the three-dimensional space coordinates of the current target detection point are determined and processed according to a predetermined coordinate transformation matrix to determine the current coordinate information of the current target detection point, the distance information of the current target detection point in each direction is determined according to the current target detection point and the information of each to-be-processed collision point, the spherically distributed distance function of the corresponding target detection point is then determined, and, for each target detection point, the distance function of the current target detection point is processed based on each basis function to obtain the projection coefficient values of the current target detection point, which are stored together with the current coordinate information at the target storage location. This solves the technical problem that, when the first sub-model is displayed transparently with a fixed transparency display value, the transparent display deviates from the actual situation, resulting in a poor transparent display effect and a poor user experience; the transparency parameter is adjusted according to the actual situation, so that the transparent display effect matches the theoretically expected effect and the user experience is improved.
The apparatus for determining transparency provided by this embodiment of the present application can execute the method for determining transparency provided by any embodiment of the present application, and has the functional modules and beneficial effects corresponding to executing the method.
FIG. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application, and shows a block diagram of an exemplary electronic device 70 suitable for implementing the embodiments of the present application. The electronic device 70 shown in FIG. 7 is merely an example and should not impose any limitation on the functions or scope of use of the embodiments of the present application.
As shown in FIG. 7, the electronic device 70 takes the form of a general-purpose computing device. The components of the electronic device 70 may include, but are not limited to, one or more processors or processing units 701, a system memory 702, and a bus 703 connecting the different system components (including the system memory 702 and the processing unit 701).
The bus 703 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus structures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
The electronic device 70 typically includes a variety of computer-system-readable media. These media may be any available media that can be accessed by the electronic device 70, including volatile and non-volatile media as well as removable and non-removable media.
The system memory 702 may include computer-system-readable media in the form of volatile memory, such as a random access memory (RAM) 704 and/or a cache memory 705. The electronic device 70 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, the storage system 706 may be configured to read from and write to a non-removable, non-volatile magnetic medium (not shown in FIG. 7 and typically called a "hard drive"). Although not shown in FIG. 7, a magnetic disk drive configured to read from and write to a removable non-volatile magnetic disk (such as a "floppy disk"), and an optical disk drive configured to read from and write to a removable non-volatile optical disk (such as a CD-ROM, a DVD-ROM or other optical media), may be provided. In these cases, each drive may be connected to the bus 703 through one or more data media interfaces. The system memory 702 may include at least one program product having a set (for example, at least one) of program modules configured to perform the functions of the embodiments of the present application.
A program/utility 708 having a set (at least one) of program modules 707 may be stored, for example, in the system memory 702. Such program modules 707 include, but are not limited to, an operating system, one or more application programs, other program modules and program data, and each or some combination of these examples may include an implementation of a network environment. The program modules 707 generally perform the functions and/or methods of the embodiments described in the present application.
The electronic device 70 may also communicate with one or more external devices 709 (such as a keyboard, a pointing device and a display 710), with one or more devices that enable a user to interact with the electronic device 70, and/or with any device (such as a network card or a modem) that enables the electronic device 70 to communicate with one or more other computing devices. Such communication may take place through an input/output (I/O) interface 711. In addition, the electronic device 70 may communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN) and/or a public network such as the Internet) through a network adapter 712. As shown in the figure, the network adapter 712 communicates with the other modules of the electronic device 70 through the bus 703. It should be understood that, although not shown in FIG. 7, other hardware and/or software modules may be used in conjunction with the electronic device 70, including but not limited to microcode, device drivers, redundant processing units, external disk drive arrays, Redundant Arrays of Independent Disks (RAID) systems, tape drives, and data backup storage systems.
The processing unit 701 executes various functional applications and data processing by running the programs stored in the system memory 702, for example, implementing the method for determining transparency provided by the embodiments of the present application.
An embodiment of the present application further provides a storage medium containing computer-executable instructions which, when executed by a computer processor, are configured to perform a method for determining transparency.
The method includes:
for each target detection point on a first sub-model, determining current coordinate information of the current target detection point, and determining a distance function corresponding to the current target detection point based on the current coordinate information, where the current coordinate information is the coordinates obtained by applying a matrix transformation to the three-dimensional space coordinates of the current target detection point, the distance function is determined according to the relative distance information between the target detection point and a second sub-model in each direction, and the first sub-model is a model wrapping the second sub-model;
processing the distance function of each target detection point based on a preset spherical harmonic function to obtain a projection coefficient value of each target detection point, where the spherical harmonic function is composed of a plurality of basis functions; and
for each target detection point, storing the projection coefficient value of the current target detection point at a target storage location in the engine, so that, when transparent display is detected, a target reconstruction function is reconstructed based on the projection coefficient values stored at the target storage location, target distance information between the first sub-model and the second sub-model at the target shooting angle is determined based on the target reconstruction function, and a transparency parameter corresponding to the target distance information is determined, so as to display each target detection point on the first sub-model based on the transparency parameter of each target detection point.
Alternatively, the method includes:
determining the target shooting angle corresponding to the shooting device and each target detection point on a first sub-model;
for each target detection point, determining target coordinate information according to the current coordinate information of the current target detection point, and reconstructing a target reconstruction function of the current target detection point according to the target coordinate information and the projection coefficient values of the current target detection point, where the current coordinate information is the coordinates obtained by applying a matrix transformation to the three-dimensional space coordinates of the current target detection point, the first sub-model is a model wrapping a second sub-model, and the projection coefficient values are determined by processing, based on the spherical harmonic function, the spherically distributed distance function of each target detection point on the first sub-model; and
determining, based on the reconstruction function corresponding to each target detection point and the corresponding target shooting angle, the distance information corresponding to each target detection point, determining a transparency parameter between the first sub-model and the second sub-model based on the distance information, and displaying the first sub-model based on the transparency parameter.
The computer storage medium of the embodiments of the present application may adopt any combination of one or more computer-readable media. A computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. The computer-readable storage medium may be, for example, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. Examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection having one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In this document, a computer-readable storage medium may be any tangible medium that contains or stores a program for use by or in connection with an instruction execution system, apparatus or device.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take a variety of forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and may send, propagate or transmit a program for use by or in connection with an instruction execution system, apparatus or device.
The program code contained on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, radio frequency (RF) and the like, or any suitable combination of the above.
Computer program code configured to perform the operations of the embodiments of the present application may be written in one or more programming languages, or a combination thereof, including object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. Where a remote computer is involved, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet by using an Internet service provider).

Claims (15)

  1. A method for determining transparency, comprising:
    for each target detection point on a first sub-model, determining current coordinate information of a current target detection point, and determining a distance function corresponding to the current target detection point based on the current coordinate information, wherein the current coordinate information is coordinates obtained by applying a matrix transformation to three-dimensional space coordinates of the current target detection point, the distance function is determined according to relative distance information between the target detection point and a second sub-model in each direction, and the first sub-model is a model wrapping the second sub-model;
    processing the distance function of each target detection point based on a preset spherical harmonic function to obtain a projection coefficient value of each target detection point, wherein the spherical harmonic function is composed of a plurality of basis functions; and
    for each target detection point, storing the projection coefficient value of the current target detection point at a target storage location in an engine, so that, when transparent display is detected, a target reconstruction function is reconstructed based on the projection coefficient values stored at the target storage location, target distance information between the first sub-model and the second sub-model at a target shooting angle is determined based on the target reconstruction function, and a transparency parameter corresponding to the target distance information is determined, so as to display each target detection point on the first sub-model based on the transparency parameter of each target detection point.
  2. The method according to claim 1, wherein determining the current coordinate information of the current target detection point and determining the distance function of the current target detection point based on the current coordinate information comprises:
    for each target detection point, determining information of each to-be-processed collision point obtained when the current target detection point emits physical rays in each direction in space and the rays pass through the second sub-model;
    determining, according to the current target detection point and the information of each to-be-processed collision point, distance information between the current target detection point and the second sub-model in each direction; and
    determining a spherically distributed distance function of the current target detection point according to the current coordinate information of the current target detection point and the distance information of the current target detection point in each direction.
  3. The method according to claim 2, wherein determining, for each target detection point, the information of each to-be-processed collision point obtained when the current target detection point emits physical rays in each direction in space and the rays pass through the second sub-model comprises:
    taking the current target detection point as a sphere center, emitting physical rays in arbitrary directions in space, and determining the to-be-processed collision point information obtained when each physical ray passes through the second sub-model.
  4. The method according to claim 2, wherein determining, according to the current target detection point and the information of each to-be-processed collision point, the distance information between the current target detection point and the second sub-model in each direction comprises:
    when a to-be-processed collision point exists between a physical ray and the second sub-model, determining distance information between the to-be-processed collision point and the current target detection point;
    when no to-be-processed collision point exists between a physical ray and the second sub-model, setting the distance information corresponding to the to-be-processed collision point to a set value; and
    determining, according to the distance information and the set values corresponding to each to-be-processed collision point, the distance information between the current target detection point and the second sub-model in each direction.
  5. The method according to claim 1, wherein determining, for each target detection point on the first sub-model, the current coordinate information of the current target detection point comprises:
    for each target detection point, determining the three-dimensional space coordinates of the current target detection point; and
    processing the three-dimensional space coordinates according to a predetermined coordinate transformation matrix to determine the current coordinate information of the current target detection point;
    wherein the current coordinate information is determined based on a tangent, a bitangent and a normal of the current target detection point.
  6. The method according to claim 1, wherein processing the distance function of each target detection point based on the preset spherical harmonic function to obtain the projection coefficient value of each target detection point comprises:
    determining an order of the spherical harmonic function, and determining, according to the order, a representation of the basis functions in the spherical harmonic function and a number of the basis functions; and
    for each target detection point, processing the distance function of the current target detection point based on each basis function to obtain projection coefficient values of the current target detection point, wherein the number of the projection coefficient values is the same as the number of the basis functions.
  7. The method according to claim 1, wherein storing the projection coefficient value of the current target detection point at the target storage location in the engine comprises:
    for each target detection point, storing the projection coefficient values and the current coordinate information of the current target detection point at the target storage location.
  8. A method for determining transparency, comprising:
    determining a target shooting angle corresponding to a shooting device and each target detection point on a first sub-model;
    for each target detection point, determining target coordinate information according to current coordinate information of a current target detection point, and reconstructing a target reconstruction function of the current target detection point according to the target coordinate information and projection coefficient values of the current target detection point, wherein the current coordinate information is coordinates obtained by applying a matrix transformation to three-dimensional space coordinates of the current target detection point, the first sub-model is a model wrapping a second sub-model, and the projection coefficient values are determined by processing, based on a spherical harmonic function, a spherically distributed distance function of each target detection point on the first sub-model; and
    determining, based on the reconstruction function corresponding to each target detection point and the corresponding target shooting angle, distance information corresponding to each target detection point, determining a transparency parameter between the first sub-model and the second sub-model based on the distance information, and displaying the first sub-model based on the transparency parameter.
  9. The method according to claim 8, wherein determining the target coordinate information according to the current coordinate information of the current target detection point, and reconstructing the target reconstruction function of the current target detection point according to the target coordinate information and the projection coefficient values of the current target detection point, comprises:
    determining, according to the current coordinate information of the current target detection point, the target coordinate information of the current target detection point in a current scene; and
    processing the target coordinate information and the projection coefficient values of the current target detection point by means of a preset spherical harmonic function to reconstruct the target reconstruction function of the current target detection point, wherein the spherical harmonic function comprises at least one basis function.
  10. The method according to claim 8, wherein determining, based on the reconstruction function corresponding to each target detection point and the corresponding target shooting angle, the distance information corresponding to each target detection point comprises:
    inputting the target shooting angle corresponding to the current target detection point into the reconstruction function to obtain the distance information at the points where the line defined by the current target detection point and the target shooting device intersects the first sub-model and the second sub-model.
  11. The method according to claim 8, wherein determining the transparency parameter between the first sub-model and the second sub-model based on the distance information, and displaying the first sub-model based on the transparency parameter, comprises:
    determining a transparency parameter of each target detection point according to a preset correspondence between distance information and transparency information and the distance information corresponding to each target detection point; and
    displaying the first sub-model and the second sub-model based on the transparency parameters.
  12. An apparatus for determining transparency, comprising:
    a distance information determination module, configured to determine, for each target detection point on a first sub-model, current coordinate information of a current target detection point, and to determine a distance function of the current target detection point based on the current coordinate information, wherein the current coordinate information is coordinates obtained by applying a matrix transformation to three-dimensional space coordinates of the current target detection point, the distance function is determined according to relative distance information between the target detection point and a second sub-model in each direction, and the first sub-model is a model wrapping the second sub-model;
    a projection coefficient value determination module, configured to process the distance function of each target detection point based on a preset spherical harmonic function to obtain a projection coefficient value of each target detection point, wherein the spherical harmonic function is composed of a plurality of basis functions; and
    a transparency parameter determination module, configured to store, for each target detection point, the projection coefficient value of the current target detection point at a target storage location in an engine, so that, when transparent display is detected, a target reconstruction function is reconstructed based on the projection coefficient values stored at the target storage location, target distance information between the first sub-model and the second sub-model at a target shooting angle is determined based on the target reconstruction function, and a transparency parameter corresponding to the target distance information is determined, so as to display each target detection point on the first sub-model based on the transparency parameter of each target detection point.
  13. An apparatus for determining transparency, comprising:
    a target shooting angle determination module, configured to determine a target shooting angle corresponding to a shooting device and each target detection point on a first sub-model;
    a target reconstruction function reconstruction module, configured to determine, for each target detection point, target coordinate information according to current coordinate information of a current target detection point, and to reconstruct a target reconstruction function of the current target detection point according to the target coordinate information and projection coefficient values of the current target detection point, wherein the current coordinate information is coordinates obtained by applying a matrix transformation to three-dimensional space coordinates of the current target detection point, the first sub-model is a model wrapping a second sub-model, and the projection coefficient values are determined by processing, based on a spherical harmonic function, a spherically distributed distance function of each target detection point on the first sub-model; and
    a transparent display module, configured to determine, based on the reconstruction function corresponding to each target detection point and the corresponding target shooting angle, distance information corresponding to each target detection point, to determine a transparency parameter between the first sub-model and the second sub-model based on the distance information, and to display the first sub-model based on the transparency parameter.
  14. An electronic device, comprising:
    one or more processors; and
    a storage apparatus, configured to store one or more programs,
    wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method for determining transparency according to any one of claims 1-8 or 9-11.
  15. A storage medium containing computer-executable instructions which, when executed by a computer processor, are configured to perform the method for determining transparency according to any one of claims 1-8 or 9-11.
PCT/CN2021/131498 2020-12-08 2021-11-18 Transparency determination method and apparatus, electronic device, and storage medium WO2022121653A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011444007.4 2020-12-08
CN202011444007.4A CN114612602A (en) 2020-12-08 2020-12-08 Method and device for determining transparency, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
WO2022121653A1 true WO2022121653A1 (en) 2022-06-16

Family

ID=81856760

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/131498 WO2022121653A1 (en) 2020-12-08 2021-11-18 Transparency determination method and apparatus, electronic device, and storage medium

Country Status (2)

Country Link
CN (1) CN114612602A (en)
WO (1) WO2022121653A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140204087A1 (en) * 2013-01-18 2014-07-24 Pixar Photon beam diffusion
CN108525300A (en) * 2018-04-27 2018-09-14 腾讯科技(深圳)有限公司 Position indication information display methods, device, electronic device and storage medium
CN111494945A (en) * 2020-04-22 2020-08-07 网易(杭州)网络有限公司 Virtual object processing method and device, storage medium and electronic equipment
CN111659117A (en) * 2020-07-08 2020-09-15 腾讯科技(深圳)有限公司 Virtual object display method and device, computer equipment and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115222896A (en) * 2022-09-20 2022-10-21 荣耀终端有限公司 Three-dimensional reconstruction method and device, electronic equipment and computer-readable storage medium
CN115222896B (en) * 2022-09-20 2023-05-23 荣耀终端有限公司 Three-dimensional reconstruction method, three-dimensional reconstruction device, electronic equipment and computer readable storage medium
CN115661426A (en) * 2022-12-15 2023-01-31 山东捷瑞数字科技股份有限公司 Model modification method, device, equipment and medium based on three-dimensional engine

Also Published As

Publication number Publication date
CN114612602A (en) 2022-06-10

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21902360

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21902360

Country of ref document: EP

Kind code of ref document: A1