CN116542847B - Low-small slow target high-speed image simulation method, storage medium and device - Google Patents
- Publication number
- CN116542847B CN116542847B CN202310815341.3A CN202310815341A CN116542847B CN 116542847 B CN116542847 B CN 116542847B CN 202310815341 A CN202310815341 A CN 202310815341A CN 116542847 B CN116542847 B CN 116542847B
- Authority
- CN
- China
- Prior art keywords
- coordinate
- target
- image
- projection plane
- affine
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/08—Projecting images onto non-planar surfaces, e.g. geodetic screens
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/60—Rotation of whole images or parts thereof
- G06T3/604—Rotation of whole images or parts thereof using coordinate rotation digital computer [CORDIC] devices
Abstract
The invention discloses a low-small-slow target high-speed image simulation method, a storage medium and a device, relating to the technical field of photoelectric image simulation. The method comprises the steps of: determining an observation coordinate system of the photoelectric equipment, and mapping the background and occlusion objects in the target simulated flight scene to a 2D projection plane in the observation coordinate system to obtain a background image and occlusion object images; determining a first correspondence, namely the coordinate transformation relation mapping the original coordinates of the target in the simulated flight scene to the 2D projection plane, and determining an affine transformation matrix of the target according to imaging parameters of the photoelectric equipment; calculating coordinate values of the target in the 2D projection plane according to the affine transformation matrix and the first correspondence to obtain a target image; resolving the occlusion relation; and generating a first or second simulated image. The simulated flight scene of the low-small-slow target in three-dimensional space is reduced to occlusion-relation analysis on two-dimensional images, which simplifies the calculation process and makes simulated-image generation faster.
Description
Technical Field
The invention belongs to the technical field of photoelectric image simulation, in particular photoelectric image simulation for high-definition photoelectric turntable performance detection, and specifically relates to a low-small-slow target high-speed image simulation method, a storage medium and a device.
Background
"Low-small-slow" aircraft is a collective term for low-altitude, small-sized, slow-speed aircraft, mainly including some unmanned aerial vehicles, aeromodels, some manned aircraft (such as powered parachutes and gliders), aerostat balloons, and the like. With the popularity of such aircraft, the demand for systems that detect, identify and counter them is growing increasingly intense, and the development of such systems is becoming more and more urgent. In particular, small and miniature civil unmanned aerial vehicles are typical low-small-slow targets in urgent need of detection, identification and countering.
As a piece of photoelectric equipment, the high-definition photoelectric turntable mainly comprises an imaging module, a pan-tilt servo mechanism and the like; it completes the acquisition and compression of low-small-slow flight scene images, the control of the lens and camera, and so on, and is an important component of a low-small-slow target countermeasure system. Performance testing of the high-definition photoelectric turntable is therefore an important part of developing such a system. The various photoelectric image simulation systems existing at the present stage cannot simulate camera image input at high speed, which brings great difficulty to performance testing and upgrade verification of the high-definition photoelectric turntable. The core existing method for such testing and verification forms a simulated image based on three-dimensional scene modeling. That method depends on large servers and dedicated image acceleration cards and, considering the modeling method, the simulation flow and the complex amount of calculation, is difficult to realize in an embedded system.
In summary, targeting embedded systems with general computing power, a scheme for simulating images of a low-small-slow target flight scene and quickly generating simulated images needs to be proposed, so that performance testing of the high-definition photoelectric turntable can be carried out efficiently.
Disclosure of Invention
In view of the above, the invention provides a low-small-slow target high-speed image simulation method, a storage medium and a device, to solve the technical problems that existing photoelectric image simulation systems cannot simulate camera image input at high speed and that the simulation imaging method based on three-dimensional scene modeling is difficult to realize in an embedded system, so that performance tests on the high-definition photoelectric turntable cannot be carried out normally in many scenarios.
The aim of the invention is realized by the following technical scheme:
the first aspect of the invention provides a low-small-slow target high-speed image simulation method, which comprises the following steps:
determining an observation coordinate system of the photoelectric equipment, and mapping a background and an occlusion object in a target simulated flight scene to a 2D projection plane in the observation coordinate system to obtain a background image and an occlusion object image;
determining a first corresponding relation and an affine transformation matrix of a target according to imaging parameters of photoelectric equipment, wherein the first corresponding relation is a coordinate transformation relation between an original coordinate of the target in a simulated flight scene and a projection coordinate obtained by mapping the original coordinate to a 2D projection plane;
calculating affine coordinate values of the target in the 2D projection plane according to the affine transformation matrix and the first corresponding relation, and generating a target image according to the affine coordinate values;
comparing the three-dimensional coordinates of the target image with the three-dimensional coordinates of all occlusion object images to judge whether the target is occluded; if so, generating a first simulated image, otherwise generating a second simulated image;
the three-dimensional coordinates of the target image consist of the affine coordinate values of the target image in the 2D projection plane and a coordinate value in a third coordinate axis direction different from the coordinate axes in the 2D projection plane, where that third-axis coordinate value is the affine-transformed value of the distance between the origin of the three-dimensional coordinate system containing the target's original coordinates in the simulated flight scene and the 2D projection plane; the three-dimensional coordinates of an occlusion object image consist of the coordinate values of the occlusion object image in the 2D projection plane and a coordinate value in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, where that third-axis coordinate value is the distance between the origin of the three-dimensional coordinate system containing the occlusion object's original coordinates in the simulated flight scene and the 2D projection plane;
the first simulation image is synthesized according to the background image and the occlusion object image, and the second simulation image is synthesized according to the background image, the occlusion object image of the non-occlusion object and the object image.
Further, the determining the affine transformation matrix of the target according to the imaging parameters of the optoelectronic device specifically includes:
according to imaging parameters of the photoelectric equipment, determining translation distances of targets in directions of all coordinate axes of the 2D projection plane, and scale transformation coefficients and rotation coefficients on the 2D projection plane;
and combining the translation distance, the scale transformation coefficient and the rotation coefficient to obtain an affine transformation matrix of the target.
Further, the affine transformation matrix is expressed as:

F' = | s·cosθ  -s·sinθ  T_x |
     | s·sinθ   s·cosθ  T_y |
     |   0        0      1  |

where T_x represents the translation distance of the target in one of the coordinate axis directions, T_y represents the translation distance of the target in the other coordinate axis direction, s represents the scale transformation coefficient, and θ represents the rotation coefficient.
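The combination of translation, scaling and rotation described above can be assembled in a few lines. This is an illustrative Python sketch; the names `affine_matrix`, `tx`, `ty`, `s` and `theta` are assumptions of the sketch, following the symbols in the text:

```python
import numpy as np

def affine_matrix(tx, ty, s, theta):
    """Homogeneous 2D affine matrix F' combining rotation (theta),
    uniform scaling (s) and translation (tx, ty)."""
    c = s * np.cos(theta)
    n = s * np.sin(theta)
    return np.array([[c,  -n,  tx],
                     [n,   c,  ty],
                     [0.0, 0.0, 1.0]])

# A pure translation by (2, 3) moves the point (1, 1) to (3, 4):
p = affine_matrix(2.0, 3.0, 1.0, 0.0) @ np.array([1.0, 1.0, 1.0])
```

Applying the matrix to homogeneous points `(x', y', 1)` then yields the affine in-plane coordinates used in the steps below.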
Further, the 2D projection plane is the XOY plane. When the background and occlusion objects in the target simulated flight scene are mapped to the 2D projection plane in the observation coordinate system, the mapping is performed based on a first formula, and the first correspondence is generated according to the first formula. The first formula is:

x' = F·x / z,  y' = F·y / z,  z' = z

where (x, y, z) represents the original coordinates of a background or an occlusion object within the target simulated flight scene; (x', y') represents the coordinate values obtained by mapping (x, y, z) to the 2D projection plane; z' represents the coordinate value of the background or occlusion object in the third coordinate axis direction, different from the coordinate axes in the 2D projection plane; z represents the distance between the origin of the three-dimensional coordinate system in which the original coordinates of the background or occlusion object lie and the 2D projection plane; F is the normalized proportionality coefficient associated with the 2D projection plane; and (x', y', z') represents the three-dimensional coordinates of the background image or occlusion object image.
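A minimal numeric sketch of this first formula (illustrative Python; `project_point` and the parameter names are assumptions of the sketch): the in-plane values are scaled by F/z, while the third coordinate keeps the distance z for the later occlusion resolution.

```python
def project_point(x, y, z, F):
    """First-formula mapping onto the XOY projection plane: the in-plane
    values are scaled by F/z, and z itself is carried along as the third
    coordinate of the resulting background/occluder image point."""
    return (F * x / z, F * y / z, z)

# A point twice as far away lands twice as close to the optical axis:
near = project_point(2.0, 4.0, 2.0, 1.0)   # (1.0, 2.0, 2.0)
far = project_point(2.0, 4.0, 4.0, 1.0)    # (0.5, 1.0, 4.0)
```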
Further, calculating the affine coordinate values of the target in the 2D projection plane according to the affine transformation matrix and the first correspondence, and generating the target image according to the affine coordinate values, specifically comprises:
mapping the original coordinates of the target in the simulated flight scene onto the 2D projection plane according to the first correspondence, to obtain the projection coordinates corresponding to each original coordinate:

x'_i = F·x_i / z_i,  y'_i = F·y_i / z_i,  z'_i = z_i,  i = 1, 2, …, W

where W represents the number of original coordinates of the target; (x_i, y_i, z_i) represents the i-th original coordinate of the target; (x'_i, y'_i) represents the projection coordinates obtained by mapping (x_i, y_i, z_i) to the 2D projection plane; z'_i represents the initial coordinate value of the target in the third coordinate axis direction different from the coordinate axes in the 2D projection plane; and z_i represents the distance between the origin of the three-dimensional coordinate system in which the original coordinates of the target lie and the 2D projection plane;

calculating, based on each projection coordinate and the affine transformation matrix, the affine coordinate values of the target in the 2D projection plane and the coordinate value of the target in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, and forming the three-dimensional coordinates of the target (x''_i, y''_i, z''_i) from these values:

(x''_i, y''_i, 1)^T = F'·(x'_i, y'_i, 1)^T

where F' represents the affine transformation matrix; (x''_i, y''_i) represents the affine coordinate values of the target in the 2D projection plane; and z''_i, the coordinate value of the target in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, is the affine-transformed value of z'_i;
and obtaining a target image according to affine coordinate values of the target in the 2D projection plane.
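The two steps above, projection by the first correspondence followed by the affine transformation, can be sketched as follows. This is an illustrative Python sketch; in particular, dividing the third-axis value by the scale coefficient `s` is only one plausible reading of "affine transformation value of the distance" and is an assumption of the sketch:

```python
import numpy as np

def target_coords(points, F, A, s=1.0):
    """Project each original target coordinate (x, y, z) with the first
    correspondence, then apply the 3x3 homogeneous affine matrix A to the
    in-plane values. The third-axis value is divided by the scale
    coefficient s (an assumption for this sketch).

    points: (W, 3) array of original target coordinates.
    Returns an (W, 3) array of three-dimensional target coordinates
    (x'', y'', z'').
    """
    pts = np.asarray(points, dtype=float)
    proj = np.column_stack([F * pts[:, 0] / pts[:, 2],   # x' = F*x/z
                            F * pts[:, 1] / pts[:, 2],   # y' = F*y/z
                            np.ones(len(pts))])          # homogeneous 1
    out = proj @ A.T              # rows become (x'', y'', 1)
    out[:, 2] = pts[:, 2] / s     # transformed third-axis coordinate
    return out
```

For example, with F = 1 and A a pure translation by (1, 1), the original point (2, 4, 2) projects to (1, 2) and then lands at (2, 3) with third-axis value 2.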
Further, the determining whether the target is blocked specifically includes:
according to the affine coordinate values of the target image in the 2D projection plane: if a coordinate point with the same affine coordinate values can be found in some occlusion object image, and that occlusion object image's coordinate value in the third coordinate axis direction is smaller than the target image's coordinate value in the third coordinate axis direction, the target is determined to be occluded; otherwise, the target is determined not to be occluded.
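This judgment can be sketched as a per-pixel depth comparison (illustrative Python; integer pixel coordinates are assumed so that equality of in-plane coordinates is meaningful):

```python
def is_occluded(target_pts, occluder_pts):
    """target_pts / occluder_pts: iterables of (x, y, z) three-dimensional
    image coordinates. The target counts as occluded when some occluder
    pixel shares an in-plane coordinate (x, y) with a target pixel and
    has a smaller third-axis value, i.e. lies nearer the projection plane."""
    nearest = {}
    for x, y, z in occluder_pts:
        # keep the nearest occluder sample at each in-plane position
        key = (x, y)
        if key not in nearest or z < nearest[key]:
            nearest[key] = z
    return any((x, y) in nearest and nearest[(x, y)] < z
               for x, y, z in target_pts)
```

A single overlapping occluder pixel with smaller depth is enough to trigger the first-simulated-image branch.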
The first aspect of the invention has the following beneficial effects:
(1) The simulated flight scene of the low-small-slow target in three-dimensional space is reduced to occlusion-relation analysis on two-dimensional images, which avoids the complex projection calculations required when a simulated image is generated by the traditional three-dimensional scene modeling method. The calculation process is simplified and calculation efficiency is improved, so the simulated image is generated faster. The method can therefore be applied to real-time simulation of the low-small-slow target's simulated flight scene, so that simulated-image generation also achieves real-time performance, which in turn improves the efficiency of testing the performance of the high-definition photoelectric turntable;
(2) Simplifying the calculation process reduces the number of simulation parameters and the required communication bandwidth. The low-small-slow target high-speed image simulation method of the first aspect can therefore be carried on test equipment with ordinary computing power; compared with the traditional three-dimensional scene modeling method's dependence on large servers, the functional module design of the image simulation equipment is simplified and equipment construction cost is saved.
A second aspect of the present invention provides a storage medium for connection to an external processor, the storage medium having stored therein at least one instruction, at least one program, a set of codes or a set of instructions, the at least one instruction, the at least one program, the set of codes or the set of instructions being loaded and executed by the processor to implement a method according to the first aspect of the present invention.
The second aspect of the present invention brings about the same advantageous effects as the first aspect and is not described in detail herein.
The third aspect of the invention provides a low-small slow target high-speed image simulation device, which comprises an upper computer and an FPGA module, wherein the upper computer is in communication connection with the FPGA module, and the FPGA module is also used for being connected with external photoelectric equipment;
the upper computer is used for determining an observation coordinate system of the photoelectric equipment, mapping a background and a shielding object in a target simulated flight scene to a 2D projection plane in the observation coordinate system to obtain a background image and a shielding object image, determining a first corresponding relation and determining an affine transformation matrix of the target according to imaging parameters of the photoelectric equipment, wherein the first corresponding relation is a coordinate transformation relation between an original coordinate of the target in the simulated flight scene and a projection coordinate obtained by mapping the original coordinate to the 2D projection plane, and sending the background image, the shielding object image, the affine transformation matrix and the first corresponding relation to the FPGA module;
the FPGA module is used for calculating affine coordinate values of the target in the 2D projection plane according to the affine transformation matrix and the first correspondence, generating a target image according to the affine coordinate values, comparing the three-dimensional coordinates of the target image with the three-dimensional coordinates of all occlusion object images, and judging whether the target is occluded; if so, a first simulated image is generated, otherwise a second simulated image is generated;
the three-dimensional coordinate of the target image consists of an affine coordinate value of the target image in a 2D projection plane and a coordinate value in a third coordinate axis direction different from the coordinate axis in the 2D projection plane, wherein the coordinate value in the third coordinate axis direction is an affine transformation value of the distance between the three-dimensional coordinate origin of the coordinate system of the original coordinate of the target in the simulated flight scene and the 2D projection plane; the three-dimensional coordinate of the shielding object image consists of a coordinate value of the shielding object image in a 2D projection plane and a coordinate value in a third coordinate axis direction different from the coordinate value in the 2D projection plane, wherein the coordinate value in the third coordinate axis direction is the distance between the three-dimensional coordinate origin of the coordinate system of the original coordinate of the shielding object in the simulated flight scene and the 2D projection plane;
the first simulation image is synthesized according to the background image and the shelter image; the second simulation image is synthesized according to the background image, the occlusion object image of the non-occlusion object and the object image;
the FPGA module is also used for sending the generated first or second simulated image to the photoelectric equipment.
Further, the determining the affine transformation matrix of the target according to the imaging parameters of the optoelectronic device specifically includes:
according to imaging parameters of the photoelectric equipment, determining translation distances of targets in directions of all coordinate axes of the 2D projection plane, and scale transformation coefficients and rotation coefficients on the 2D projection plane;
obtaining the affine transformation matrix of the target by combining the translation distance, the scale transformation coefficient and the rotation coefficient:

F' = | s·cosθ  -s·sinθ  T_x |
     | s·sinθ   s·cosθ  T_y |
     |   0        0      1  |

where T_x represents the translation distance of the target in one of the coordinate axis directions, T_y represents the translation distance of the target in the other coordinate axis direction, s represents the scale transformation coefficient, and θ represents the rotation coefficient.
Further, the 2D projection plane is the XOY plane. When the background and occlusion objects in the target simulated flight scene are mapped to the 2D projection plane in the observation coordinate system, the mapping is performed based on a first formula, and the first correspondence is generated according to the first formula. The first formula is:

x' = F·x / z,  y' = F·y / z,  z' = z

where (x, y, z) represents the original coordinates of a background or an occlusion object within the target simulated flight scene; (x', y') represents the coordinate values obtained by mapping (x, y, z) to the 2D projection plane; z' represents the coordinate value of the background or occlusion object in the third coordinate axis direction, different from the coordinate axes in the 2D projection plane; z represents the distance between the origin of the three-dimensional coordinate system in which the original coordinates of the background or occlusion object lie and the 2D projection plane; F is the normalized proportionality coefficient associated with the 2D projection plane; and (x', y', z') represents the three-dimensional coordinates of the background image or occlusion object image;
the affine coordinate value of the target in the 2D projection plane is calculated according to the affine transformation matrix and the first corresponding relation, and a target image is generated according to the affine coordinate value, specifically:
mapping the original coordinates of the target in the simulated flight scene onto the 2D projection plane according to the first correspondence, to obtain the projection coordinates corresponding to each original coordinate:

x'_i = F·x_i / z_i,  y'_i = F·y_i / z_i,  z'_i = z_i,  i = 1, 2, …, W

where W represents the number of original coordinates of the target; (x_i, y_i, z_i) represents the i-th original coordinate of the target; (x'_i, y'_i) represents the projection coordinates obtained by mapping (x_i, y_i, z_i) to the 2D projection plane; z'_i represents the initial coordinate value of the target in the third coordinate axis direction different from the coordinate axes in the 2D projection plane; and z_i represents the distance between the origin of the three-dimensional coordinate system in which the original coordinates of the target lie and the 2D projection plane;

calculating, based on each projection coordinate and the affine transformation matrix, the affine coordinate values of the target in the 2D projection plane and the coordinate value of the target in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, and forming the three-dimensional coordinates of the target (x''_i, y''_i, z''_i) from these values:

(x''_i, y''_i, 1)^T = F'·(x'_i, y'_i, 1)^T

where F' represents the affine transformation matrix; (x''_i, y''_i) represents the affine coordinate values of the target in the 2D projection plane; and z''_i, the coordinate value of the target in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, is the affine-transformed value of z'_i;
obtaining a target image according to affine coordinate values of the target in the 2D projection plane;
the judging whether the target is shielded or not specifically comprises:
according to the affine coordinate values of the target image in the 2D projection plane: if a coordinate point with the same affine coordinate values can be found in some occlusion object image, and that occlusion object image's coordinate value in the third coordinate axis direction is smaller than the target image's coordinate value in the third coordinate axis direction, the target is determined to be occluded; otherwise, the target is determined not to be occluded.
The third aspect of the present invention brings about the same advantageous effects as the first aspect, which are not repeated here. Meanwhile, the algorithmic processes of occlusion-relation resolution and image synthesis are integrated in the FPGA module, so the parallel computing resources of the FPGA module are fully utilized, realizing high-frame-rate image simulation of the low-small-slow target simulated flight scene.
Drawings
FIG. 1 is a schematic flow chart of a low-small-slow target high-speed image simulation method;
FIG. 2 is a schematic flow chart of another low-small-slow target high-speed image simulation method;
FIG. 3 is a schematic diagram of the projective transformation;
FIG. 4 is a schematic diagram of a low-small-slow target high-speed image simulation device.
Detailed Description
The technical solutions of the present invention will be clearly and completely described below with reference to the embodiments, and it is apparent that the described embodiments are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by a person skilled in the art without any inventive effort, are intended to be within the scope of the present invention, based on the embodiments of the present invention.
Example 1
This embodiment provides a low-small-slow target high-speed image simulation method, by which a simulated image of a low-small-slow target (hereinafter simply referred to as the target) simulated flight scene is generated; the simulated image is used as input to photoelectric equipment, particularly a high-definition photoelectric turntable, to perform performance testing or upgrade verification on it.
As shown in fig. 1 to 3, a specific implementation procedure of the low-small-slow target high-speed image simulation method is as follows:
s100, after various parameters of the photoelectric equipment to be tested are known, determining an observation coordinate system of the photoelectric equipment, and mapping a background and a shelter in a target simulated flight scene to a 2D projection plane in the observation coordinate system to obtain a background image and a shelter image. The number of occlusions within the simulated flight scene of the target is typically multiple. The acquisition of the background and the shielding object in the target simulation flight scene is based on the acquisition process in the common embodiment, and the premise of generating the simulation image in this embodiment is obtained, for example, after automatic identification and manual labeling by using an online target identification model, and this embodiment will not be described in detail in this section.
In some embodiments, when mapping the background and occlusion objects within the target simulated flight scene to the 2D projection plane within the observation coordinate system, the parameters of the mapping process are determined based on the position of the target simulated flight scene relative to the observation coordinate system and a normalized proportionality coefficient associated with the 2D projection plane.
In some embodiments, the XOY plane within the observation coordinate system is chosen as the 2D projection plane.
In particular, in connection with fig. 3, when mapping the background and the occlusion in the target simulated flight scene to the XOY plane in the observation coordinate system, mapping is performed based on a first formula, where the first formula is as follows:
x' = F·x / z,  y' = F·y / z,  z' = z

where (x, y, z) represents the original coordinates of a background or an occlusion object within the target simulated flight scene; (x', y') represents the coordinate values obtained by mapping (x, y, z) to the XOY plane; z' represents the coordinate value of the background or occlusion object in the Z-axis direction of the observation coordinate system; z represents the distance between the origin of the three-dimensional coordinate system in which the original coordinates of the background or occlusion object lie and the 2D projection plane; F is the normalized proportionality coefficient associated with the 2D projection plane; and (x', y', z') represents the three-dimensional coordinates of the background image or occlusion object image.
When another two-dimensional plane in the observation coordinate system is selected as the 2D projection plane, the background and occlusion objects in the target simulated flight scene are mapped to that plane based on the same mapping principle. For example, when the XOZ plane of the observation coordinate system is selected as the 2D projection plane, the mapped coordinate value in the Y-axis direction of the observation coordinate system is 0, the coordinate value of the background or occlusion object in the Y-axis direction equals the distance between the origin of the three-dimensional coordinate system containing its original coordinates and the 2D projection plane, and F remains unchanged.
S200, determining the first correspondence and the affine transformation matrix of the target according to the imaging parameters of the photoelectric equipment, where the first correspondence is the coordinate transformation relation between the original coordinates of the target in the simulated flight scene and the projection coordinates obtained by mapping those original coordinates to the 2D projection plane. For a given photoelectric device under test, the mapping process that maps the target's original coordinates to the 2D projection plane is the same as the mapping process that maps the background and occlusion objects in the target simulated flight scene to the 2D projection plane; when mapping is performed based on the first formula, the above first correspondence is obtained from the first formula. The target within the simulated flight scene is acquired in advance, as a precondition of generating the simulated image in this embodiment, for example through automatic recognition by an online target recognition model followed by manual labeling; this embodiment does not describe that part in detail.
In some embodiments, one implementation of determining an affine transformation matrix of a target from imaging parameters of an optoelectronic device is:
S001, according to the imaging parameters of the photoelectric device, determining the translation distances of the target in the directions of the coordinate axes of the 2D projection plane, together with the scale transformation coefficient and the rotation coefficient on the 2D projection plane, where the rotation coefficient is also referred to as the pitch angle of the target.
S002, combining the translation distance, the scale transformation coefficient and the rotation coefficient to obtain the affine transformation matrix of the target:

$$F' = \begin{bmatrix} s\cos\theta & -s\sin\theta & T_x \\ s\sin\theta & s\cos\theta & T_y \\ 0 & 0 & 1 \end{bmatrix}$$

where $T_x$ represents the translation distance of the target in one coordinate axis direction, $T_y$ represents the translation distance of the target in the other coordinate axis direction, $s$ represents the scale transformation coefficient, and $\theta$ represents the rotation coefficient. For example, when the 2D projection plane is selected as the XOY plane of the observation coordinate system, the above translation distances are the translation distance in the X direction and the translation distance in the Y direction.
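The combination of translation, scale and rotation in S001–S002 can be sketched as a standard 2D homogeneous affine matrix (the rotation–scale–translation composition below is the conventional one and is an assumption, since the patent does not spell out the parameterisation):

```python
import math

def affine_matrix(tx, ty, s, theta):
    """Build a 3x3 homogeneous affine matrix from translation (tx, ty),
    uniform scale s and rotation angle theta (radians)."""
    c, n = math.cos(theta), math.sin(theta)
    return [
        [s * c, -s * n, tx],
        [s * n,  s * c, ty],
        [0.0,    0.0,   1.0],
    ]

def apply_affine(m, x, y):
    """Apply the affine matrix to a 2D point in homogeneous form."""
    xh = m[0][0] * x + m[0][1] * y + m[0][2]
    yh = m[1][0] * x + m[1][1] * y + m[1][2]
    return xh, yh
```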
S300, calculating affine coordinate values of the target in the 2D projection plane according to the affine transformation matrix and the first corresponding relation, and generating a target image according to the affine coordinate values.
In particular, in combination with the first correspondence obtained based on the first formula and the generation of the affine transformation matrix described in the foregoing embodiment, a specific implementation procedure for calculating the affine coordinate value of the target in the 2D projection plane according to the affine transformation matrix and the first correspondence is as follows:
S301, mapping the original coordinates of the target in the simulated flight scene to the 2D projection plane according to the first correspondence, to obtain the projection coordinates corresponding to the original coordinates:

$$x_i' = \frac{F\,x_i}{z}, \qquad y_i' = \frac{F\,y_i}{z}, \qquad i = 1, 2, \dots, W$$

where $W$ represents the number of original coordinates of the target, $P_i = (x_i, y_i, z_i)$ represents the $i$-th original coordinate of the target, $(x_i', y_i')$ represents the projection coordinate obtained by mapping $P_i$ to the 2D projection plane, $z_i$ represents the initial coordinate value of the target in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, and $z$ represents the distance between the three-dimensional coordinate origin of the coordinate system where the original coordinates of the target are located and the 2D projection plane;
S302, calculating, based on each projection coordinate and the affine transformation matrix, the affine coordinate values of the target in the 2D projection plane and the coordinate value of the target in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, the three-dimensional coordinate of the target being composed of the two:

$$\begin{bmatrix} x_i'' \\ y_i'' \\ 1 \end{bmatrix} = F' \begin{bmatrix} x_i' \\ y_i' \\ 1 \end{bmatrix}$$

where $F'$ represents the affine transformation matrix, $(x_i'', y_i'')$ represents the affine coordinate values of the target in the 2D projection plane, and $z''$ represents the coordinate value of the target in the third coordinate axis direction, so that the three-dimensional coordinate of the target is $(x_i'', y_i'', z'')$;
s303, obtaining a target image according to affine coordinate values of the target in the 2D projection plane.
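Steps S301–S302 above can be sketched as follows (an illustrative, non-authoritative Python rendering; the divisor z and the homogeneous affine product follow the formulas in the text):

```python
def target_affine_coords(original_points, F, z, affine):
    """S301: project each original coordinate (x, y, z_i) onto the 2D plane;
    S302: apply the 3x3 homogeneous affine matrix to each projected point.
    Returns the affine coordinate values in the 2D projection plane."""
    result = []
    for x, y, _zi in original_points:
        # S301: first-formula projection onto the 2D plane
        xp, yp = F * x / z, F * y / z
        # S302: (x'', y'', 1)^T = F' (x', y', 1)^T
        xa = affine[0][0] * xp + affine[0][1] * yp + affine[0][2]
        ya = affine[1][0] * xp + affine[1][1] * yp + affine[1][2]
        result.append((xa, ya))
    return result
```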
S400, comparing the three-dimensional coordinates of the target image with the three-dimensional coordinates of all occlusion object images to judge whether the target is occluded; if so, generating a first simulation image, and otherwise generating a second simulation image. The three-dimensional coordinate of the target image is composed of its affine coordinate values in the 2D projection plane and a coordinate value in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, the latter being the affine-transformed value of the distance between the three-dimensional coordinate origin of the coordinate system where the original coordinates of the target are located in the simulated flight scene and the 2D projection plane. The three-dimensional coordinate of an occlusion object image is composed of its coordinate values in the 2D projection plane and a coordinate value in the third coordinate axis direction, the latter being the distance between the three-dimensional coordinate origin of the coordinate system where the original coordinates of the occlusion object are located in the simulated flight scene and the 2D projection plane. The first simulation image is synthesized from the background image and the occlusion object images; the second simulation image is synthesized from the background image, the occlusion object images of the occlusion objects that do not occlude the target, and the target image.
Further, when the 2D projection plane is preferably the XOY plane, the third coordinate axis direction is the Z direction. The first simulated image, also known as the occlusion scene composite map, and the second simulated image, also known as the target scene composite map, are both synthesized using a conventional synthesis process.
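The selection in S400 between the first and second simulation images can be sketched as follows (illustrative only: image layers are stood in for by opaque handles, and a real implementation would alpha-blend pixel data rather than return a list of layers):

```python
def synthesize_layers(background, occluder_imgs, target_img, occluding_flags):
    """Pick compositing layers per S400. occluding_flags[j] is the occlusion
    relation P_j for the j-th occlusion object image."""
    if any(occluding_flags):
        # first simulation image: background + all occlusion object images
        return [background] + occluder_imgs
    # second simulation image: background + (non-occluding) occluders + target
    return [background] + occluder_imgs + [target_img]
```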
In some embodiments, one specific implementation of determining whether a target is occluded is:
According to the affine coordinate values of the target image in the 2D projection plane, if coordinate points identical to those affine coordinate values can all be found in one and the same occlusion object image, and the coordinate value of that occlusion object image in the third coordinate axis direction is smaller than the coordinate value of the target image in the third coordinate axis direction, the target is determined to be occluded; otherwise, the target is determined not to be occluded.
When the third coordinate axis direction is the Z direction, the above occlusion relation solving process can be described as:

$$P_j = \begin{cases} 1, & Z_j < Z_t \ \text{and every affine coordinate point of the target image is found in the } j\text{-th occlusion object image} \\ 0, & \text{otherwise} \end{cases}$$

where $P_j$ represents the occlusion relation between the target image and the $j$-th occlusion object image, $Z_j$ represents the coordinate value of the $j$-th occlusion object image in the Z direction, and $Z_t$ represents the coordinate value of the target image in the Z direction. If every coordinate point of the target image on the XOY plane (an affine coordinate value in the X direction and an affine coordinate value in the Y direction) has an identical coordinate point in the $j$-th occlusion object, that is, the affine coordinate values in the X direction and in the Y direction are both the same, and $Z_j < Z_t$, then $P_j$ takes the value 1, indicating that the target is occluded by the occlusion object; otherwise $P_j$ takes the value 0, indicating that the target is not occluded by any occlusion object.
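A minimal Python sketch of this occlusion test (representing each image as a set of 2D coordinate points plus a single Z value is an assumption made for illustration):

```python
def is_occluded(target_points, target_z, occluders):
    """Return True if some occluder covers every 2D point of the target
    and lies closer to the observer (smaller Z) than the target.
    occluders: list of (set_of_xy_points, z_value) pairs."""
    tset = set(target_points)
    for occ_points, occ_z in occluders:
        if occ_z < target_z and tset <= set(occ_points):
            return True  # P_j = 1 for this occluder
    return False  # P_j = 0 for all occluders
```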
Example II
The present embodiment provides a storage medium configured to be connected to an external processor, the storage medium storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the low-small-slow target high-speed image simulation method of the first embodiment.
Example III
As shown in fig. 4, the present embodiment provides a device implementing the low-small-slow target high-speed image simulation method of the first embodiment. The device comprises an upper computer and an FPGA module connected to the upper computer through a communication interface. The FPGA module is realized on a commercially available FPGA chip and is also used for sending the generated simulated image to the photoelectric device through its output interface.
The upper computer is used for determining an observation coordinate system of the photoelectric device, mapping a background and a shielding object in a target simulated flight scene to a 2D projection plane in the observation coordinate system to obtain a background image and a shielding object image, determining a first corresponding relation and determining an affine transformation matrix of the target according to imaging parameters of the photoelectric device, wherein the first corresponding relation is a coordinate transformation relation between an original coordinate of the target in the simulated flight scene and a projection coordinate obtained by mapping the original coordinate to the 2D projection plane, and sending the background image, the shielding object image, the affine transformation matrix and the first corresponding relation to the FPGA module.
The FPGA module is used for calculating the affine coordinate values of the target in the 2D projection plane according to the affine transformation matrix and the first correspondence, generating the target image according to the affine coordinate values, comparing the three-dimensional coordinates of the target image with the three-dimensional coordinates of all occlusion object images to judge whether the target is occluded, and generating a first simulation image if so, and a second simulation image otherwise. The three-dimensional coordinate of the target image is composed of its affine coordinate values in the 2D projection plane and a coordinate value in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, the latter being the affine-transformed value of the distance between the three-dimensional coordinate origin of the coordinate system where the original coordinates of the target are located in the simulated flight scene and the 2D projection plane. The three-dimensional coordinate of an occlusion object image is composed of its coordinate values in the 2D projection plane and a coordinate value in the third coordinate axis direction, the latter being the distance between the three-dimensional coordinate origin of the coordinate system where the original coordinates of the occlusion object are located in the simulated flight scene and the 2D projection plane. The first simulation image is synthesized from the background image and the occlusion object images; the second simulation image is synthesized from the background image, the occlusion object images of the occlusion objects that do not occlude the target, and the target image; and the first simulation image or the second simulation image is sent to the photoelectric device.
The affine transformation matrix determination process, the mapping process based on the first formula, and the occlusion relation solving process described in the first embodiment also apply in the present embodiment and are not repeated here; from them, those skilled in the art can clearly understand how the functions of the upper computer and the FPGA module in this embodiment are implemented.
The foregoing is merely a preferred embodiment of the invention. It should be understood that the invention is not limited to the form disclosed herein and is not to be construed as excluding other embodiments; it is capable of use in various other combinations, modifications and environments, and of changes within the scope of the inventive concept, whether guided by the above teachings or by the skill or knowledge of the relevant art. Modifications and variations that do not depart from the spirit and scope of the invention are intended to fall within the scope of the appended claims.
Claims (10)
1. A low-small-slow target high-speed image simulation method, comprising:
determining an observation coordinate system of the photoelectric equipment, and mapping a background and an occlusion object in a target simulated flight scene to a 2D projection plane in the observation coordinate system to obtain a background image and an occlusion object image;
determining a first corresponding relation and an affine transformation matrix of a target according to imaging parameters of photoelectric equipment, wherein the first corresponding relation is a coordinate transformation relation between an original coordinate of the target in a simulated flight scene and a projection coordinate obtained by mapping the original coordinate to a 2D projection plane;
calculating affine coordinate values of the target in the 2D projection plane according to the affine transformation matrix and the first corresponding relation, and generating a target image according to the affine coordinate values;
comparing the three-dimensional coordinates of the target image with all the three-dimensional coordinates of the occlusion object image, judging whether the target is occluded, if so, generating a first simulation image, otherwise, generating a second simulation image;
the three-dimensional coordinate of the target image consists of an affine coordinate value of the target image in a 2D projection plane and a coordinate value in a third coordinate axis direction different from the coordinate axis in the 2D projection plane, wherein the coordinate value in the third coordinate axis direction is an affine transformation value of the distance between the three-dimensional coordinate origin of the coordinate system of the original coordinate of the target in the simulated flight scene and the 2D projection plane; the three-dimensional coordinate of the shielding object image consists of a coordinate value of the shielding object image in a 2D projection plane and a coordinate value in a third coordinate axis direction different from the coordinate value in the 2D projection plane, wherein the coordinate value in the third coordinate axis direction is the distance between the three-dimensional coordinate origin of the coordinate system of the original coordinate of the shielding object in the simulated flight scene and the 2D projection plane;
the first simulation image is synthesized according to the background image and the occlusion object image, and the second simulation image is synthesized according to the background image, the occlusion object image of the non-occlusion object and the object image.
2. The low-small-slow target high-speed image simulation method according to claim 1, wherein the determining of the affine transformation matrix of the target according to the imaging parameters of the optoelectronic device is specifically:
according to imaging parameters of the photoelectric equipment, determining translation distances of targets in directions of all coordinate axes of the 2D projection plane, and scale transformation coefficients and rotation coefficients on the 2D projection plane;
and combining the translation distance, the scale transformation coefficient and the rotation coefficient to obtain an affine transformation matrix of the target.
3. The low-small-slow target high-speed image simulation method according to claim 2, wherein the affine transformation matrix is expressed as:

$$F' = \begin{bmatrix} s\cos\theta & -s\sin\theta & T_x \\ s\sin\theta & s\cos\theta & T_y \\ 0 & 0 & 1 \end{bmatrix}$$

where $T_x$ represents the translation distance of the target in one coordinate axis direction, $T_y$ represents the translation distance of the target in the other coordinate axis direction, $s$ represents the scale transformation coefficient, and $\theta$ represents the rotation coefficient.
4. The low-small-slow target high-speed image simulation method according to claim 1, wherein the 2D projection plane is the XOY plane, the mapping of the background and the occlusion object in the target simulated flight scene to the 2D projection plane in the observation coordinate system is performed based on a first formula, and the first correspondence is generated according to the first formula, the first formula being:

$$x' = \frac{F\,x}{z}, \qquad y' = \frac{F\,y}{z}$$

where $(x, y, z_0)$ represents the original coordinate of the background or occlusion object within the target simulated flight scene, $(x', y')$ represents the coordinate value obtained by mapping it to the 2D projection plane, $z_0$ represents the coordinate value of the background or occlusion object in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, $z$ represents the distance between the three-dimensional coordinate origin of the coordinate system where the original coordinate of the background or occlusion object is located and the 2D projection plane, $F$ is a normalized proportionality coefficient associated with the 2D projection plane, and $(x', y', z)$ represents the three-dimensional coordinate of the background image or occlusion object image.
5. The low-small-slow target high-speed image simulation method according to claim 4, wherein the calculating of the affine coordinate values of the target in the 2D projection plane according to the affine transformation matrix and the first correspondence, and the generating of the target image according to the affine coordinate values, are specifically:

mapping the original coordinates of the target in the simulated flight scene onto the 2D projection plane according to the first correspondence to obtain the projection coordinates corresponding to the original coordinates:

$$x_i' = \frac{F\,x_i}{z}, \qquad y_i' = \frac{F\,y_i}{z}, \qquad i = 1, 2, \dots, W$$

where $W$ represents the number of original coordinates of the target, $P_i = (x_i, y_i, z_i)$ represents the $i$-th original coordinate of the target, $(x_i', y_i')$ represents the projection coordinate obtained by mapping $P_i$ to the 2D projection plane, $z_i$ represents the initial coordinate value of the target in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, and $z$ represents the distance between the three-dimensional coordinate origin of the coordinate system where the original coordinates of the target are located and the 2D projection plane;

calculating, based on each projection coordinate and the affine transformation matrix, the affine coordinate values of the target in the 2D projection plane and the coordinate value of the target in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, the three-dimensional coordinate of the target being composed of the two:

$$\begin{bmatrix} x_i'' \\ y_i'' \\ 1 \end{bmatrix} = F' \begin{bmatrix} x_i' \\ y_i' \\ 1 \end{bmatrix}$$

where $F'$ represents the affine transformation matrix, $(x_i'', y_i'')$ represents the affine coordinate values of the target in the 2D projection plane, and $z''$ represents the coordinate value of the target in the third coordinate axis direction, the three-dimensional coordinate of the target being $(x_i'', y_i'', z'')$;
and obtaining a target image according to affine coordinate values of the target in the 2D projection plane.
6. The low-small-slow target high-speed image simulation method according to claim 1, wherein the judging of whether the target is occluded is specifically:
according to the affine coordinate values of the target image in the 2D projection plane, if coordinate points identical to those affine coordinate values can all be found in one and the same occlusion object image, and the coordinate value of that occlusion object image in the third coordinate axis direction is smaller than the coordinate value of the target image in the third coordinate axis direction, determining that the target is occluded; otherwise, determining that the target is not occluded.
7. A storage medium for connection to an external processor, wherein the storage medium has stored therein at least one instruction, at least one program, code set, or instruction set, which is loaded and executed by the processor to implement the method of any one of claims 1 to 6.
8. A low-small-slow target high-speed image simulation device, characterized by comprising an upper computer and an FPGA module, wherein the upper computer is in communication connection with the FPGA module, and the FPGA module is also used for connecting to an external photoelectric device;
the upper computer is used for determining an observation coordinate system of the photoelectric equipment, mapping a background and a shielding object in a target simulated flight scene to a 2D projection plane in the observation coordinate system to obtain a background image and a shielding object image, determining a first corresponding relation and determining an affine transformation matrix of the target according to imaging parameters of the photoelectric equipment, wherein the first corresponding relation is a coordinate transformation relation between an original coordinate of the target in the simulated flight scene and a projection coordinate obtained by mapping the original coordinate to the 2D projection plane, and sending the background image, the shielding object image, the affine transformation matrix and the first corresponding relation to the FPGA module;
the FPGA module is used for calculating affine coordinate values of the target in the 2D projection plane according to the affine transformation matrix and the first corresponding relation, generating a target image according to the affine coordinate values, comparing the three-dimensional coordinates of the target image with all the three-dimensional coordinates of the occlusion object image, judging whether the target is occluded, if so, generating a first analog image, and otherwise, generating a second analog image;
the three-dimensional coordinate of the target image consists of an affine coordinate value of the target image in a 2D projection plane and a coordinate value in a third coordinate axis direction different from the coordinate axis in the 2D projection plane, wherein the coordinate value in the third coordinate axis direction is an affine transformation value of the distance between the three-dimensional coordinate origin of the coordinate system of the original coordinate of the target in the simulated flight scene and the 2D projection plane; the three-dimensional coordinate of the shielding object image consists of a coordinate value of the shielding object image in a 2D projection plane and a coordinate value in a third coordinate axis direction different from the coordinate value in the 2D projection plane, wherein the coordinate value in the third coordinate axis direction is the distance between the three-dimensional coordinate origin of the coordinate system of the original coordinate of the shielding object in the simulated flight scene and the 2D projection plane;
the first simulation image is synthesized according to the background image and the shelter image; the second simulation image is synthesized according to the background image, the occlusion object image of the non-occlusion object and the object image;
the FPGA module is also used for sending the generated first analog image or the generated second analog image to the optoelectronic device.
9. The low-small-slow target high-speed image simulation device according to claim 8, wherein the affine transformation matrix of the target is determined according to the imaging parameters of the optoelectronic device, specifically:
according to imaging parameters of the photoelectric equipment, determining translation distances of targets in directions of all coordinate axes of the 2D projection plane, and scale transformation coefficients and rotation coefficients on the 2D projection plane;
obtaining the affine transformation matrix of the target by combining the translation distance, the scale transformation coefficient and the rotation coefficient:

$$F' = \begin{bmatrix} s\cos\theta & -s\sin\theta & T_x \\ s\sin\theta & s\cos\theta & T_y \\ 0 & 0 & 1 \end{bmatrix}$$

where $T_x$ represents the translation distance of the target in one coordinate axis direction, $T_y$ represents the translation distance of the target in the other coordinate axis direction, $s$ represents the scale transformation coefficient, and $\theta$ represents the rotation coefficient.
10. The device according to claim 8, wherein the 2D projection plane is the XOY plane, the mapping of the background and the occlusion object in the target simulated flight scene to the 2D projection plane in the observation coordinate system is performed based on a first formula, and the first correspondence is generated according to the first formula, the first formula being:

$$x' = \frac{F\,x}{z}, \qquad y' = \frac{F\,y}{z}$$

where $(x, y, z_0)$ represents the original coordinate of the background or occlusion object within the target simulated flight scene, $(x', y')$ represents the coordinate value obtained by mapping it to the 2D projection plane, $z_0$ represents the coordinate value of the background or occlusion object in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, $z$ represents the distance between the three-dimensional coordinate origin of the coordinate system where the original coordinate of the background or occlusion object is located and the 2D projection plane, $F$ is a normalized proportionality coefficient associated with the 2D projection plane, and $(x', y', z)$ represents the three-dimensional coordinate of the background image or occlusion object image;
the affine coordinate value of the target in the 2D projection plane is calculated according to the affine transformation matrix and the first corresponding relation, and a target image is generated according to the affine coordinate value, specifically:
mapping the original coordinates of the target in the simulated flight scene onto the 2D projection plane according to the first correspondence to obtain the projection coordinates corresponding to the original coordinates:

$$x_i' = \frac{F\,x_i}{z}, \qquad y_i' = \frac{F\,y_i}{z}, \qquad i = 1, 2, \dots, W$$

where $W$ represents the number of original coordinates of the target, $P_i = (x_i, y_i, z_i)$ represents the $i$-th original coordinate of the target, $(x_i', y_i')$ represents the projection coordinate obtained by mapping $P_i$ to the 2D projection plane, $z_i$ represents the initial coordinate value of the target in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, and $z$ represents the distance between the three-dimensional coordinate origin of the coordinate system where the original coordinates of the target are located and the 2D projection plane;

calculating, based on each projection coordinate and the affine transformation matrix, the affine coordinate values of the target in the 2D projection plane and the coordinate value of the target in the third coordinate axis direction different from the coordinate axes in the 2D projection plane, the three-dimensional coordinate of the target being composed of the two:

$$\begin{bmatrix} x_i'' \\ y_i'' \\ 1 \end{bmatrix} = F' \begin{bmatrix} x_i' \\ y_i' \\ 1 \end{bmatrix}$$

where $F'$ represents the affine transformation matrix, $(x_i'', y_i'')$ represents the affine coordinate values of the target in the 2D projection plane, and $z''$ represents the coordinate value of the target in the third coordinate axis direction, the three-dimensional coordinate of the target being $(x_i'', y_i'', z'')$;
obtaining a target image according to affine coordinate values of the target in the 2D projection plane;
the judging whether the target is shielded or not specifically comprises:
according to the affine coordinate values of the target image in the 2D projection plane, if coordinate points identical to those affine coordinate values can all be found in one and the same occlusion object image, and the coordinate value of that occlusion object image in the third coordinate axis direction is smaller than the coordinate value of the target image in the third coordinate axis direction, determining that the target is occluded; otherwise, determining that the target is not occluded.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310815341.3A CN116542847B (en) | 2023-07-05 | 2023-07-05 | Low-small slow target high-speed image simulation method, storage medium and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116542847A CN116542847A (en) | 2023-08-04 |
CN116542847B true CN116542847B (en) | 2023-10-10 |
Family
ID=87449199
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310815341.3A Active CN116542847B (en) | 2023-07-05 | 2023-07-05 | Low-small slow target high-speed image simulation method, storage medium and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116542847B (en) |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103489214A (en) * | 2013-09-10 | 2014-01-01 | 北京邮电大学 | Virtual reality occlusion handling method, based on virtual model pretreatment, in augmented reality system |
CN105513112A (en) * | 2014-10-16 | 2016-04-20 | 北京畅游天下网络技术有限公司 | Image processing method and device |
CN106251282A (en) * | 2016-07-19 | 2016-12-21 | 中国人民解放军63920部队 | A kind of generation method and device of mechanical arm sampling environment analogous diagram |
CN106803286A (en) * | 2017-01-17 | 2017-06-06 | 湖南优象科技有限公司 | Mutual occlusion real-time processing method based on multi-view image |
CN107515392A (en) * | 2017-08-11 | 2017-12-26 | 北京中科罗宾雷达技术有限公司 | The crime prevention system and method for low small slow target |
CN107564089A (en) * | 2017-08-10 | 2018-01-09 | 腾讯科技(深圳)有限公司 | Three dimensional image processing method, device, storage medium and computer equipment |
KR20200019016A (en) * | 2018-08-13 | 2020-02-21 | 오스템임플란트 주식회사 | Method and apparatus for generating panoramic image, computer-readable recording medium based on 2D image registration and 3D image reconstruction |
CN111459046A (en) * | 2020-02-20 | 2020-07-28 | 南京理工大学 | Real-time dynamic generation system and method for target and scene for image seeker |
CN113240692A (en) * | 2021-06-30 | 2021-08-10 | 北京市商汤科技开发有限公司 | Image processing method, device, equipment and storage medium |
WO2022088419A1 (en) * | 2020-10-26 | 2022-05-05 | 成都极米科技股份有限公司 | Projection method, apparatus, and device, and computer-readable storage medium |
CN114782691A (en) * | 2022-04-20 | 2022-07-22 | 安徽工程大学 | Robot target identification and motion detection method based on deep learning, storage medium and equipment |
WO2022193739A1 (en) * | 2021-03-19 | 2022-09-22 | 深圳市火乐科技发展有限公司 | Projection image adjustment method and apparatus, storage medium, and electronic device |
KR102460864B1 (en) * | 2021-04-27 | 2022-10-28 | 재단법인대구경북과학기술원 | Bone surgical apparatus and method based on registration between coordinates of a surgical robot and coordinates of a patient |
CN115311133A (en) * | 2022-08-09 | 2022-11-08 | 北京淳中科技股份有限公司 | Image processing method and device, electronic equipment and storage medium |
Non-Patent Citations (2)
Title |
---|
Video sequence synchronization algorithm based on 3D trajectory reconstruction of moving targets; Wang Xue, SHI Jian-Bo, PARK Hyun-Soo, Wang Qing; Acta Automatica Sinica (10); full text *
Markerless 3D human motion tracking for monocular video; Zou Beiji, Chen Shu, Peng Xiaoning, Shi Cao; Journal of Computer-Aided Design & Computer Graphics (08); full text *
Also Published As
Publication number | Publication date |
---|---|
CN116542847A (en) | 2023-08-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11954870B2 (en) | Dynamic scene three-dimensional reconstruction method, apparatus and system, server, and medium | |
JP6830139B2 (en) | 3D data generation method, 3D data generation device, computer equipment and computer readable storage medium | |
JP6057298B2 (en) | Rapid 3D modeling | |
CN109961522A (en) | Image projecting method, device, equipment and storage medium | |
CN111292408B (en) | Shadow generation method based on attention mechanism | |
CN109300143A (en) | Method, apparatus, device, storage medium and vehicle for determining a motion vector field | |
CN110135396A (en) | Recognition method, device, equipment and medium for surface marks | |
CN110969687A (en) | Collision detection method, device, equipment and medium | |
CN114187589A (en) | Target detection method, device, equipment and storage medium | |
CN117197388A (en) | Live-action three-dimensional virtual reality scene construction method and system based on generation of antagonistic neural network and oblique photography | |
Fu et al. | Image segmentation of cabin assembly scene based on improved RGB-D mask R-CNN | |
CN112023400A (en) | Height map generation method, device, equipment and storage medium | |
CN113496503A (en) | Point cloud data generation and real-time display method, device, equipment and medium | |
CN116542847B (en) | Low-small slow target high-speed image simulation method, storage medium and device | |
CN114627438A (en) | Target detection model generation method, target detection method, device and medium | |
CN116642490A (en) | Visual positioning navigation method based on hybrid map, robot and storage medium | |
Bownes | Using motion capture and augmented reality to test AAR with boom occlusion | |
Zhang et al. | A smart method for developing game-based virtual laboratories | |
CN113436242B (en) | Method for obtaining high-precision depth value of static object based on mobile depth camera | |
CN117872590A (en) | Space target optical imaging simulation method and system | |
Deshmukh et al. | 2D to 3D Floor plan Modeling using Image Processing and Augmented Reality | |
CN117475058A (en) | Image processing method, device, electronic equipment and storage medium | |
Fumin | Design of an Electric Vehicle Modeling and Visualization System Based on Industrial CT and Mixed Reality Technology | |
Agarwal et al. | Simulating City-Scale Aerial Data Collection Using Unreal Engine | |
Fu et al. | Simulation design of mechanical laser radar sensor based on raytrace technology |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||