CN111462089B - Virtual scene precision testing method based on optical dynamic capturing system and related equipment - Google Patents


Info

Publication number
CN111462089B
CN111462089B (application CN202010252267.5A)
Authority
CN
China
Prior art keywords
scale
virtual
optical dynamic
capturing system
standard
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010252267.5A
Other languages
Chinese (zh)
Other versions
CN111462089A (en)
Inventor
杭建伟
刘爽
陈文涛
许秋子
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Realis Multimedia Technology Co Ltd
Original Assignee
Shenzhen Realis Multimedia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Realis Multimedia Technology Co Ltd filed Critical Shenzhen Realis Multimedia Technology Co Ltd
Priority to CN202010252267.5A priority Critical patent/CN111462089B/en
Publication of CN111462089A publication Critical patent/CN111462089A/en
Application granted granted Critical
Publication of CN111462089B publication Critical patent/CN111462089B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0004 - Industrial image inspection
    • G06T7/0006 - Industrial image inspection using a design-rule based approach
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01M - TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00 - Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10016 - Video; Image sequence
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Testing Of Optical Devices Or Fibers (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to the technical field of computers, and in particular to a virtual scene precision testing method based on an optical dynamic capturing system and related equipment. The method comprises the following steps: setting a virtual scale in a virtual scene to be tested; determining, through the optical dynamic capturing system, the position of the center point of a rigid body and multi-frame three-dimensional coordinate data of a first mark point and a second mark point on a physical scale; calculating a virtual reading m_f, its average $\bar{m}_f$ and its standard deviation; and determining, through multiple rounds of judgment, whether the precision of the virtual scene meets the standard. By using the optical dynamic capturing system to mark virtual scales and calculate the virtual reading, the invention can effectively determine whether the content precision of the virtual scene meets the standard; the required hardware is easy to obtain, the whole testing process is simple and convenient, and the method is well suited to VR/AR scenes based on an optical dynamic capturing system in which personnel can be trained, such as virtual games, racing cars, tanks and aircraft.

Description

Virtual scene precision testing method based on optical dynamic capturing system and related equipment
Technical Field
The invention relates to the technical field of computers, in particular to a virtual scene precision testing method based on an optical dynamic capturing system and related equipment.
Background
In virtual scenes based on an optical motion capture system in which personnel can be trained, such as racing cars, tanks and aircraft, and in virtual games, accuracy tests need to be carried out on displacement distances and the like in the virtual scene: the precision must lie within a certain error δ, i.e. the Euclidean distance in the virtual scene and the actual Euclidean distance must agree within a certain error range, before the accuracy of the virtual scene's content can be considered up to standard. At present, however, a method for testing the accuracy of virtual scenes is lacking.
In addition, with the increasingly wide application of machine vision, multi-camera optical dynamic capturing systems for large-space environments are widely used for high-precision positioning and tracking in large spaces.
Disclosure of Invention
The invention mainly aims to provide a virtual scene precision testing method based on an optical dynamic capturing system and related equipment, and aims to solve the problem of virtual scene precision testing by combining the optical dynamic capturing system.
In order to achieve the above purpose, the present invention provides a virtual scene precision testing method based on an optical dynamic capturing system, the method comprising the following steps:
setting a virtual scale in a virtual scene to be tested, wherein the virtual scale is consistent with the scale of a physical scale and scales in equal proportion, and the virtual scale is provided with a scale arrow that indicates a reading on the virtual scale;
determining the position of the center point of a rigid body through a preset optical dynamic capturing system, wherein the rigid body consists of at least three rigid body mark points, and the physical scale is placed at a preset distance from the rigid body;
obtaining the target scale corresponding to a second mark point on the physical scale to obtain a target reading m_E; taking the position of the center point of the rigid body as the center and the preset distance as the radius, acquiring multi-frame three-dimensional coordinate data of a first mark point and the second mark point on the physical scale through the optical dynamic capturing system; calculating the Euclidean distance of each frame and assigning it to the virtual scale, so that the scale arrow of the virtual scale is marked to the corresponding virtual scale, giving a virtual reading m_f; and calculating the average $\bar{m}_f$ of the virtual readings over a plurality of frames;
calculating the standard deviation of the virtual readings according to the average value;
obtaining the maximum value C_max and the minimum value C_min of the virtual reading according to the sliding range of the scale arrow of the virtual scale, and judging whether |C_max - m_E| ≤ δ and |C_min - m_E| ≤ δ both hold, where δ is a preset error; if both hold, calculating the difference $|\bar{m}_f - m_E|$; if the difference is smaller than a first threshold and the standard deviation is smaller than a second threshold, the test precision of this round is considered to meet the standard, otherwise it is considered not to meet the standard.
Optionally, the acquiring, by the optical dynamic capturing system, multi-frame three-dimensional coordinate data of the first mark point and the second mark point on the physical scale, and calculating the euclidean distance of each frame respectively includes:
setting the average value window of the optical dynamic capturing system as delta t, and setting the average value window as F at a preset sampling frequency S Collecting the three-dimensional coordinate data through the optical dynamic capturing system, marking the first mark point and the second mark point as f frames at the time t, and calculating the Euclidean distance d of the f frames t
Optionally, the average $\bar{m}_f$ of the virtual readings over a Δt time window is calculated as:

$$\bar{m}_f = \frac{1}{F_S \Delta t} \sum_{i=f-F_S \Delta t+1}^{f} m_i$$

wherein F_S is the sampling frequency, f is the current frame, and m_i denotes the virtual reading of the i-th frame within the time window.
Optionally, the standard deviation of the virtual readings over the Δt time window is calculated as:

$$\sigma_f = \sqrt{\frac{1}{F_S \Delta t} \sum_{i=f-F_S \Delta t+1}^{f} \left(m_i - \bar{m}_f\right)^2}$$

wherein F_S is the sampling frequency, f is the current frame, and m_i denotes the virtual reading of the i-th frame within the time window.
Optionally, after the round's test precision has been judged to meet the standard (the difference smaller than the first threshold and the standard deviation smaller than the second threshold) or not to meet it, the method further includes:
adding one to the test times, adding one to the standard reaching times when the precision is considered to reach the standard, judging whether the continuous standard reaching times reach a preset standard reaching threshold, and finally considering that the precision of the virtual scene reaches the standard if the continuous standard reaching times reach the standard reaching threshold;
if the standard reaching times do not reach the standard reaching threshold, continuing to judge whether the test times reach a preset test threshold, and if the test times reach the test threshold, finally considering that the accuracy of the virtual scene does not reach the standard;
if the test times do not reach the test threshold, continuing with the step of acquiring the target scale corresponding to the second mark point in the physical scale to obtain a target reading m_E.
Optionally, before continuing, when the test times do not reach the test threshold, with the step of acquiring the target scale corresponding to the second mark point to obtain a target reading m_E, the method further includes:
if the test times do not reach the test threshold, sliding the second mark point to indicate another target scale so as to obtain another target reading m_E, or changing the posture of the slide rail while keeping the physical scale within the preset distance from the rigid body, and then continuing with the step of acquiring the target scale corresponding to the second mark point in the physical scale to obtain a target reading m_E.
Further, in order to achieve the above object, the present invention also provides a virtual scene accuracy testing device based on an optical dynamic capturing system, including:
the virtual scale determining module is used for setting a virtual scale in a virtual scene to be tested, wherein the virtual scale is consistent with the scale of the physical scale and scales in equal proportion, the virtual scale is provided with a scale arrow, and the scale arrow indicates the virtual scale;
the rigid body center point determining module is used for determining the position of the center point of a rigid body through a preset optical dynamic capturing system, wherein the rigid body is composed of at least three rigid body mark points, and the physical scale is placed at a preset distance from the rigid body;
the sampling and calculating module is used for obtaining the target scale corresponding to a second mark point on the physical scale to obtain a target reading m_E; taking the position of the center point of the rigid body as the center and the preset distance as the radius, acquiring multi-frame three-dimensional coordinate data of a first mark point and the second mark point on the physical scale through the optical dynamic capturing system; calculating the Euclidean distance of each frame and assigning it to the virtual scale, so that the scale arrow of the virtual scale is marked to the corresponding virtual scale, giving a virtual reading m_f; calculating the average $\bar{m}_f$ of the virtual readings over a plurality of frames; and calculating the standard deviation of the virtual readings according to the average value;
the judging module is used for obtaining the maximum value C_max and the minimum value C_min of the virtual reading according to the sliding range of the scale arrow of the virtual scale, judging whether |C_max - m_E| ≤ δ and |C_min - m_E| ≤ δ both hold, and, if both hold, calculating the difference $|\bar{m}_f - m_E|$; if the difference is smaller than a first threshold and the standard deviation is smaller than a second threshold, the test precision of this round is considered to meet the standard, otherwise it is considered not to meet the standard.
In order to achieve the above object, the present invention further provides a virtual scene precision testing device based on an optical dynamic capturing system, the device comprising a memory, a processor, and a virtual scene precision testing program based on the optical dynamic capturing system that is stored in the memory and executable on the processor; the steps of the virtual scene precision testing method based on the optical dynamic capturing system described above are implemented when the program is executed by the processor.
In order to achieve the above object, the present invention further provides a computer readable storage medium, on which a virtual scene precision test program based on an optical dynamic capturing system is stored, where the steps of the virtual scene precision test method based on an optical dynamic capturing system described above are implemented when the virtual scene precision test program based on the optical dynamic capturing system is executed by a processor.
In order to achieve the above purpose, the invention also provides a scale, which comprises a physical scale, wherein scales are arranged on the physical scale along the length direction, a slide rail is arranged on the physical scale along the length direction, two sliding blocks are arranged on the slide rail, a first marking point and a second marking point are respectively arranged on the two sliding blocks, and the first marking point and the second marking point slide on the slide rail through the corresponding sliding blocks respectively.
According to the virtual scene precision testing method based on the optical dynamic capturing system, the optical dynamic capturing system is utilized to obtain the scales of the physical scale, and whether the content precision of the virtual scene meets the standard can be effectively determined by virtually marking the scales and calculating the virtual scale.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention.
Fig. 1 is a schematic structural diagram of a virtual scene precision testing device based on an optical dynamic capturing system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a physical scale according to an embodiment of the present invention;
FIG. 3 is a flow chart of a virtual scene accuracy test method based on an optical dynamic capture system in an embodiment of the invention;
FIG. 4 is a schematic diagram of a virtual scale according to an embodiment of the present invention;
fig. 5 is a block diagram of a virtual scene accuracy testing device based on an optical dynamic capturing system according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless expressly stated otherwise, as understood by those skilled in the art. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Referring to fig. 1, a schematic structural diagram of a virtual scene accuracy testing device based on an optical dynamic capturing system according to an embodiment of the present invention is shown.
The virtual scene precision testing device based on the optical dynamic capturing system comprises a processor (such as a CPU), a communication bus, a user interface, a network interface and a memory. The communication bus is used to enable connection and communication between these components. The user interface may comprise a display and an input unit such as a keyboard, and the network interface may optionally comprise a standard wired interface and a wireless interface (e.g. a WI-FI interface). The memory may be a high-speed RAM memory or a non-volatile memory such as a disk memory, and may optionally be a storage device separate from the aforementioned processor. The memory, as a computer-readable storage medium, may include an operating system, a network communication module, a user interface module, and a virtual scene precision testing program based on the optical dynamic capturing system. The operating system is a program that manages and controls the device's hardware and software resources and supports the running of the virtual scene precision testing program and of other software and/or programs. In the hardware structure of the device, the network interface is mainly used for accessing a network; the user interface is mainly used for detecting and confirming instructions, editing instructions and the like; and the processor may be used to call the virtual scene precision testing program stored in the memory and execute the operations of the various embodiments of the virtual scene precision testing method described below.
As shown in fig. 1, the virtual scene precision testing device based on the optical dynamic capturing system further includes a plurality of optical cameras 1, a rigid body 2 and a scale 3. The optical cameras 1 are used for exposure shooting of the rigid body 2 and the scale 3 and send the captured image data to the processor. The rigid body 2 includes a plurality of rigid body mark points, the optical cameras 1 are arranged around the rigid body 2, the physical scale 3 is placed at a preset distance L from the rigid body 2, and the scale 3 can be placed at any rotation within the preset distance L.
As shown in fig. 2, the scale 3 includes a physical scale; a scale 31 is provided on the physical scale along the length direction, a slide rail 32 is provided along the same direction, two sliders are mounted on the slide rail 32, and a first mark point 33 and a second mark point 34 are provided on the two sliders respectively, each sliding along the rail with its slider. The first mark point 33 and the second mark point 34 are reflective markers formed by coating the outer surfaces of the two sliders with a reflective coating. By taking the center point of the rigid body 2 as the center, the optical dynamic capturing system only needs to search for the first mark point 33 and the second mark point 34 on the scale 3 within the preset distance L; this yields the 3D positions of the mark points, shortens the search time and avoids interference from other factors.
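The restricted marker search described above, in which only 3D points within the preset distance L of the rigid-body center are considered, can be sketched as follows. The function name and list-based interface are illustrative assumptions, not part of the patent:

```python
import math

def find_scale_markers(points_3d, rigid_center, search_radius=2.0):
    """Keep only the 3D marker positions within `search_radius` metres of
    the rigid-body center, mirroring the restricted search described above.

    Hypothetical helper: names and interface are illustrative only.
    """
    def dist(p, q):
        # Euclidean distance between two (x, y, z) points
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    return [p for p in points_3d if dist(p, rigid_center) <= search_radius]
```

Restricting the candidate set this way is what shortens the search time: points farther than L from the rigid-body center are discarded before any marker identification is attempted.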
Those skilled in the art will appreciate that the hardware configuration of the virtual scene accuracy testing apparatus based on the optical dynamic capture system does not constitute a limitation of the virtual scene accuracy testing apparatus based on the optical dynamic capture system, and may include more or less components than illustrated, or may combine certain components, or may be a different arrangement of components.
Referring to fig. 3, a flowchart of a virtual scene accuracy testing method based on an optical dynamic capturing system according to an embodiment of the present invention, as shown in fig. 3, is a virtual scene accuracy testing method based on an optical dynamic capturing system, including the following steps:
step S1, determining a virtual ruler: setting a virtual scale in a virtual scene to be tested, wherein the virtual scale is consistent with the scale of the physical scale and scales in equal proportion, the virtual scale is provided with a scale arrow, and the scale arrow indicates the virtual scale.
In this step, a virtual scale is preset in the virtual scene to be tested, as shown in fig. 4, the virtual scale is provided with a virtual scale 41, the virtual scale 41 is consistent with the scale of the physical scale and scales freely in equal proportion, and the virtual scale in the virtual scene is scaled to a proper size according to the requirement, so that the virtual scale 41 is convenient to see. The virtual scale is also provided with a scale arrow 42, the scale arrow 42 can slide left and right, and the scale arrow 42 is aligned with the virtual scale and is used for indicating the size of received virtual data.
Step S2, determining the position of the center point of the rigid body: the position of the central point of the rigid body is determined by a preset optical dynamic capturing system, the rigid body is composed of at least three rigid body mark points, and a real object scale is placed at a preset distance of the rigid body.
The preset distance in this step is recorded as L and can be set as required; for example, L = 2 m, i.e. the distance from the slide rail of the physical scale to the rigid body is not more than 2 m.
Because the rigid body is provided with a plurality of rigid body mark points, for example, five rigid body mark points are arranged, the position of the central point of the rigid body can be determined through the optical dynamic capturing system in the prior art.
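A minimal sketch of obtaining the rigid body's center point, assuming it is taken as the centroid of the rigid body mark points. The actual optical dynamic capturing system uses its own solver; the centroid here is only an illustrative stand-in:

```python
def rigid_body_center(marker_points):
    """Estimate the rigid body's center point as the centroid of its mark
    points. A rigid body consists of at least three mark points; the
    centroid is an assumed simplification, not the patent's method.
    """
    if len(marker_points) < 3:
        raise ValueError("a rigid body needs at least three mark points")
    n = len(marker_points)
    # average each of the x, y, z coordinates across all mark points
    return tuple(sum(p[k] for p in marker_points) / n for k in range(3))
```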
Step S3, sampling and calculating data: obtaining the target scale corresponding to a second mark point on the physical scale to obtain a target reading m_E; taking the position of the center point of the rigid body as the center and the preset distance as the radius, acquiring multi-frame three-dimensional coordinate data of a first mark point and the second mark point on the physical scale through the optical dynamic capturing system; calculating the Euclidean distance of each frame and assigning it to the virtual scale, so that the scale arrow of the virtual scale is marked to the corresponding virtual scale, giving a virtual reading m_f; and calculating the average $\bar{m}_f$ of multiple virtual readings.
The standard deviation of the virtual reading is calculated from the mean.
In this step, before the target scale corresponding to the second mark point is acquired to obtain the target reading m_E, the first mark point on the physical scale is fixed on the slide rail indicating the 0 scale, and the second mark point is slid to indicate an arbitrary target scale, giving the target reading m_E.
The method searches for the first mark point and the second mark point on the scale within the preset distance L, taking the position of the center point of the rigid body as the center, so as to obtain 3D data of the two mark points; the Euclidean distance of each frame is then calculated from these 3D data and the virtual reading of each frame is obtained, from which the average and standard deviation are calculated. In this step the virtual reading m_f, its average $\bar{m}_f$ and standard deviation, and the maximum value C_max and minimum value C_min of the virtual reading m_f can all be displayed on the display screen.
In one embodiment, step S3 includes:
step S301, setting the average window of the optical dynamic capture system to be Deltat, and setting the sampling frequency to be F S Collecting three-dimensional coordinate data through an optical dynamic capturing system, marking a first mark point and a second mark point as an f-th frame at the time t, and calculating the Euclidean distance d of the f-th frame t
The embodiment sets an average value window Deltat, i.e. the length of the historical data, for example 0.5s or 1s, F for the optical dynamic capture system S 60 samples are obtained at a sampling frequency of 120Hz, namely 60 frames of three-dimensional coordinate data are respectively obtained by the first marked point and the second marked point.
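The per-frame Euclidean distance d_t between the two mark points can be computed from their 3D coordinates as sketched below; the function name and tuple-based interface are illustrative:

```python
import math

def frame_distances(p1_frames, p2_frames):
    """Euclidean distance d_t between the two mark points for each frame.

    p1_frames / p2_frames: sequences of (x, y, z) tuples, one per frame,
    e.g. 60 frames for a 0.5 s window at F_S = 120 Hz.
    """
    return [
        math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))
        for p1, p2 in zip(p1_frames, p2_frames)
    ]
```

Each distance in the returned list is what gets assigned to the virtual scale to produce one virtual reading m_f.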
Step S302, assigning the Euclidean distance d_t to the virtual scale so that the scale arrow of the virtual scale is marked to the corresponding virtual scale, obtaining a virtual reading m_f, and calculating the average $\bar{m}_f$ of the virtual readings m_f within the Δt time window:

$$\bar{m}_f = \frac{1}{F_S \Delta t} \sum_{i=f-F_S \Delta t+1}^{f} m_i$$

wherein F_S is the sampling frequency, f is the current frame, and m_i denotes the virtual reading of the i-th frame within the time window.
Step S303, calculating the standard deviation σ_f of the readings within the Δt time period:

$$\sigma_f = \sqrt{\frac{1}{F_S \Delta t} \sum_{i=f-F_S \Delta t+1}^{f} \left(m_i - \bar{m}_f\right)^2}$$

wherein F_S is the sampling frequency, f is the current frame, and m_i denotes the virtual reading of the i-th frame within the time window.
Through the above sampling and calculation, this embodiment finally obtains all the virtual readings m_f within the Δt time period and their standard deviation, providing accurate data for the subsequent judgment.
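The windowed mean and standard deviation formulas above can be sketched as follows, assuming the last F_S·Δt readings are available; parameter names are illustrative:

```python
import math

def window_stats(virtual_readings, sampling_freq=120, window_s=0.5):
    """Mean and (population) standard deviation of the virtual readings
    m_f over the last Δt seconds, i.e. the last F_S * Δt frames, matching
    the formulas above. Parameter names are illustrative assumptions.
    """
    n = int(sampling_freq * window_s)      # F_S * Δt frames in the window
    window = list(virtual_readings)[-n:]
    mean = sum(window) / len(window)
    # population variance, dividing by the window length as in the formula
    var = sum((m - mean) ** 2 for m in window) / len(window)
    return mean, math.sqrt(var)
```

Note the division by the full window length (population form), which follows the σ_f formula above rather than the sample (n − 1) form.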
Step S4, judging whether the precision meets the standard: obtaining the maximum value C_max and the minimum value C_min of the virtual reading according to the sliding range of the scale arrow of the virtual scale; judging whether |C_max - m_E| ≤ δ and |C_min - m_E| ≤ δ both hold; if both hold, calculating the difference $|\bar{m}_f - m_E|$; if the difference is smaller than a first threshold and the standard deviation is smaller than a second threshold, the test precision of this round is considered to meet the standard, otherwise it is considered not to meet the standard.
In this step, multiple judgments are combined within one round of testing. Preferably, the maximum and minimum of the virtual readings from step S3 are first compared against the target reading m_E indicated by the second mark point as a first-round error-range check. The second-round error judgment is performed only when both the difference between the maximum and the target reading and the difference between the minimum and the target reading are within the preset error δ. If either difference is larger than the preset error δ, the judgment is false and the test precision of this round is considered not to meet the standard.
In the second-round error judgment of this step, the obtained average $\bar{m}_f$ is compared with the target reading m_E on the slide rail: the smaller the difference, and the smaller the standard deviation, the higher the precision.
Step S5, optimizing the judgment: after one round of testing yields a judgment result, namely that the test precision of the round meets the standard or does not meet it, it is further determined through optimized judgment whether the precision of the virtual scene as a whole meets the standard.
In order to effectively prove the reliability of the optical dynamic capture system, the steps S3 to S4 need to be repeated for a plurality of times to obtain a conclusion whether the precision meets the standard. Therefore, the step obtains a more accurate result through optimization judgment.
And step S501, adding one to the test times, adding one to the standard reaching times when the precision is considered to reach the standard, judging whether the continuous standard reaching times reach a preset standard reaching threshold, and finally considering that the precision of the virtual scene reaches the standard if the continuous standard reaching times reach the standard reaching threshold.
Whether the judgment result of step S4 is that the round's test precision meets the standard or not, the test count is incremented by one, representing that one round of testing has been performed. Only when step S4 judges that the round's test precision meets the standard is the pass count incremented by one; after incrementing, the consecutive pass count is checked once, and if it has reached the pass threshold the optical dynamic capturing system is considered reliable, and on that premise an accurate conclusion that the precision of the virtual scene meets the standard is obtained.
In the step, when the continuous standard reaching times are judged, the continuous standard reaching rate can be adopted, namely whether the continuous standard reaching rate reaches a preset standard reaching threshold value is judged, and if the continuous standard reaching rate reaches the standard reaching threshold value, the virtual scene precision is finally considered to reach the standard. The standard reaching rate is calculated by dividing the continuous standard reaching times by the test times.
Step S502: if the pass count has not reached the pass threshold, continue by judging whether the test count has reached a preset test threshold; if the test count has reached the test threshold, the virtual scene precision is finally considered not to meet the standard.
After multiple tests, if the test count has reached the test threshold while the pass count still has not reached the pass threshold, the virtual scene precision is considered insufficient and does not meet the expected requirement.
This step may also use the pass rate for judgment: if the pass rate still has not reached the pass threshold after the test count reaches the test threshold, the virtual scene precision is considered insufficient and does not meet the expected requirement.
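The bookkeeping of steps S501 and S502 can be sketched as follows. This is a minimal illustration, not code from the patent: the function and variable names are ours, and we assume a failed round resets the consecutive pass streak, which is one reading of "continuous standard reaching times".

```python
def update_counters(round_passed, test_count, consecutive_passes,
                    pass_threshold, test_threshold):
    """One bookkeeping step of the repeated precision test.

    Returns (test_count, consecutive_passes, verdict), where verdict is
    "pass", "fail", or None (keep testing).
    """
    test_count += 1                      # every round increments the test count
    if round_passed:
        consecutive_passes += 1          # only a passing round increments this
        if consecutive_passes >= pass_threshold:
            return test_count, consecutive_passes, "pass"
    else:
        consecutive_passes = 0           # assumed: a failed round breaks the streak
    if test_count >= test_threshold:
        return test_count, consecutive_passes, "fail"
    return test_count, consecutive_passes, None
```

The pass-rate variant mentioned above would simply replace the streak check with `consecutive_passes / test_count >= rate_threshold`.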
Step S503: if the test count has not reached the test threshold, slide the second mark point so that it indicates another target graduation to obtain a new target reading m_E, or change the pose of the slide rail while keeping the physical scale within the preset distance of the rigid body, and then return to the step of obtaining the graduation indicated by the second mark point on the physical scale to obtain the target reading m_E.
In this step, the newly obtained target reading m_E may also be compared with the previously obtained one. If the two are the same, a prompt is issued asking the user to change the graduation indicated by the second mark point so as to change m_E. If the two differ, the procedure continues: taking the rigid body center point as the center and the preset distance as the radius, multi-frame three-dimensional coordinate data of the first mark point and the second mark point on the physical scale are acquired through the optical dynamic capture system, and the Euclidean distance of each frame is calculated.
This step makes it possible to continue testing. Before a test is repeated, the second mark point must be moved to a different position so that a different target reading is obtained. The pose of the slide rail may also be changed continuously, i.e. the rail is freely rotated and translated within the observable range of the optical dynamic capture system, which helps verify that the system is reliable. Sliding the second mark point and changing the rail pose are not ordered operations and may be combined arbitrarily; after at least one of them has been performed, steps S3 to S5 can be repeated, on the premise that the positioning system is reliable, to obtain the result of whether the virtual scene content precision meets the standard.
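The guard described above, requiring the new target reading to differ from the previous one before a round is repeated, might look like this sketch (names and tolerance are illustrative, not from the patent):

```python
def reading_changed(new_reading, last_reading, tol=1e-9):
    """Return True when the slid second mark point yields a usable new
    target reading, False when the user should be prompted to move it.
    last_reading is None on the very first round."""
    if last_reading is None:
        return True
    # readings on the same graduation are treated as unchanged
    return abs(new_reading - last_reading) > tol
```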
In the virtual scene precision testing method based on the optical dynamic capture system described above, within one round of testing the physical scale remains static for several time periods, during which the precision can be tested and judged multiple times. When the precision test is repeated, the pose of the physical scale differs each time within the observable range of the optical dynamic capture system (Tracker), and so does the distance between the two optical mark points on the scale. Only after the pass rate reaches a certain standard is the precision finally judged to meet the standard. By designing a virtual scale corresponding to the physical scale on top of the optical dynamic capture system, and by judging and testing multiple times, the method can effectively determine whether the virtual scene content precision meets the standard, and the test result is accurate.
In one embodiment, a virtual scene accuracy testing device based on an optical dynamic capturing system is provided, as shown in fig. 5, the device includes:
the virtual scale determining module is used for setting a virtual scale in a virtual scene to be tested, wherein the graduations of the virtual scale are consistent with those of the physical scale and scaled in equal proportion, and the virtual scale is provided with a scale arrow that points to a graduation of the virtual scale;
the rigid body position determining module is used for determining the position of the center point of a rigid body through a preset optical dynamic capture system, wherein the rigid body is composed of at least three rigid body mark points, and the physical scale is placed at a preset distance from the rigid body;
the sampling and calculating module is used for obtaining the graduation indicated by the second mark point on the physical scale to obtain a target reading m_E; taking the rigid body center point position as the center and the preset distance as the radius, acquiring multi-frame three-dimensional coordinate data of the first mark point and the second mark point on the physical scale through the optical dynamic capture system; calculating the Euclidean distance of each frame and assigning it to the virtual scale so that the scale arrow of the virtual scale points to the corresponding graduation, obtaining a virtual reading m_f; calculating the average m̄ of the virtual readings over multiple frames; and calculating the standard deviation of the virtual readings according to the average;
the judging module is used for obtaining the maximum value C_max and the minimum value C_min of the virtual reading according to the sliding range of the scale arrow of the virtual scale, and judging whether |C_max − m_E| ≤ δ and |C_min − m_E| ≤ δ hold, δ being a preset error; if they hold, calculating the difference |m̄ − m_E|; if the difference is smaller than a first threshold and the standard deviation is smaller than a second threshold, the round of testing is considered to meet the precision standard, otherwise it is considered not to meet the standard.
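Taken together, the sampling-and-calculating and judging modules amount to the following per-round computation. This is a sketch under our own naming, not the patent's implementation: `math.dist` stands in for the per-frame Euclidean distance, and the minimum and maximum of the per-frame readings are assumed to give the sliding range C_min/C_max of the scale arrow.

```python
import math

def judge_round(frames, m_E, delta, first_threshold, second_threshold):
    """One round of the precision test.

    frames: per-frame (p1, p2) 3-D coordinates of the first and second
    mark points as captured by the optical dynamic capture system.
    m_E: target reading on the physical scale; delta: allowed error of
    the arrow's sliding range around m_E.
    """
    # Euclidean distance of each frame, assigned to the virtual scale,
    # becomes the virtual reading m_f of that frame
    readings = [math.dist(p1, p2) for p1, p2 in frames]
    mean = sum(readings) / len(readings)          # average over the window
    std = math.sqrt(sum((m - mean) ** 2 for m in readings) / len(readings))
    c_max, c_min = max(readings), min(readings)   # sliding range of the arrow
    # the whole sliding range must stay within +/- delta of the target reading
    if abs(c_max - m_E) > delta or abs(c_min - m_E) > delta:
        return False
    # |mean - m_E| against the first threshold, std against the second
    return abs(mean - m_E) < first_threshold and std < second_threshold
```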
In one embodiment, a virtual scene precision testing device based on an optical dynamic capture system is provided. The device comprises a memory, a processor, and a virtual scene precision testing program based on the optical dynamic capture system that is stored in the memory and executable on the processor; when executed by the processor, the program implements the steps of the virtual scene precision testing method based on the optical dynamic capture system in the embodiments above.
In one embodiment, a computer-readable storage medium stores a virtual scene precision testing program based on an optical dynamic capture system; when executed by a processor, the program implements the steps of the virtual scene precision testing method based on the optical dynamic capture system in the embodiments above. The storage medium may be a volatile or a non-volatile storage medium.
Those of ordinary skill in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing the related hardware. The program may be stored in a computer-readable storage medium, which may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not every possible combination is described; however, any combination of these technical features that contains no contradiction should be considered within the scope of this description.
The above embodiments represent only some exemplary embodiments of the invention and are described in detail, but they are not to be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make several variations and modifications without departing from the spirit of the invention, all of which fall within its scope. Accordingly, the scope of protection of the invention is determined by the appended claims.

Claims (9)

1. A virtual scene precision testing method based on an optical dynamic capturing system, characterized by comprising the following steps:
setting a virtual scale in a virtual scene to be tested, wherein the graduations of the virtual scale are consistent with those of a physical scale and scaled in equal proportion, and the virtual scale is provided with a scale arrow that points to a graduation of the virtual scale;
determining the position of the center point of a rigid body through a preset optical dynamic capturing system, wherein the rigid body is composed of at least three rigid body mark points, and the physical scale is placed at a preset distance from the rigid body;
obtaining the graduation indicated by a second mark point on the physical scale to obtain a target reading m_E; taking the center point position of the rigid body as the center and the preset distance as the radius, acquiring multi-frame three-dimensional coordinate data of a first mark point and the second mark point on the physical scale through the optical dynamic capturing system; calculating the Euclidean distance of each frame, and assigning the Euclidean distance to the virtual scale so that the scale arrow of the virtual scale points to the corresponding graduation to obtain a virtual reading m_f; calculating the average m̄ of the virtual readings over multiple frames;
Calculating the standard deviation of the virtual reading according to the average value;
obtaining the maximum value C_max and the minimum value C_min of the virtual reading according to the sliding range of the scale arrow of the virtual scale, and judging whether |C_max − m_E| ≤ δ and |C_min − m_E| ≤ δ hold, δ being a preset error; if they hold, calculating the difference |m̄ − m_E|;
if the difference is smaller than a first threshold and the standard deviation is smaller than a second threshold, considering that the round of testing meets the precision standard, otherwise considering that it does not meet the standard;
wherein acquiring multi-frame three-dimensional coordinate data of the first mark point and the second mark point on the physical scale through the optical dynamic capturing system and calculating the Euclidean distance of each frame comprises:
setting the averaging window of the optical dynamic capturing system to Δt, acquiring the three-dimensional coordinate data through the optical dynamic capturing system at a preset sampling frequency F_S, recording the first mark point and the second mark point at time t as frame f, and calculating the Euclidean distance d_t of frame f.
2. The method for testing the precision of a virtual scene based on an optical dynamic capturing system according to claim 1, wherein the average m̄ of the virtual readings within a Δt time window is calculated as:
m̄ = (1 / (F_S · Δt)) · Σ_{i = f − F_S·Δt + 1}^{f} m_i
wherein F_S is the sampling frequency, f is the current frame, and m_i represents the virtual reading of frame i, starting from the first frame within the time window.
3. The method for testing the precision of a virtual scene based on an optical dynamic capturing system according to claim 1, wherein the standard deviation of the virtual readings within a Δt time window is calculated as:
σ = sqrt( (1 / (F_S · Δt)) · Σ_{i = f − F_S·Δt + 1}^{f} (m_i − m̄)² )
wherein F_S is the sampling frequency, f is the current frame, and m_i represents the virtual reading of frame i, starting from the first frame within the time window.
4. The method for testing the precision of a virtual scene based on an optical dynamic capturing system according to claim 1, wherein after considering that the round of testing meets the precision standard if the difference is smaller than the first threshold and the standard deviation is smaller than the second threshold, and otherwise that it does not, the method further comprises:
incrementing the test count by one; when the precision is considered to meet the standard, incrementing the pass count by one, and judging whether the consecutive pass count has reached a preset pass threshold; if it has, finally considering that the virtual scene precision meets the standard;
if the pass count has not reached the pass threshold, continuing by judging whether the test count has reached a preset test threshold; if it has, finally considering that the virtual scene precision does not meet the standard;
if the test count has not reached the test threshold, returning to the step of obtaining the graduation indicated by the second mark point on the physical scale to obtain the target reading m_E.
5. The method for testing the precision of a virtual scene based on an optical dynamic capturing system according to claim 4, wherein if the test count has not reached the test threshold, before returning to the step of obtaining the graduation indicated by the second mark point on the physical scale to obtain the target reading m_E, the method further comprises:
if the test count has not reached the test threshold, sliding the second mark point so that it indicates another target graduation to obtain a new target reading m_E, or changing the pose of the slide rail while the physical scale remains within the preset distance of the rigid body, and then returning to the step of obtaining the graduation indicated by the second mark point on the physical scale to obtain the target reading m_E.
6. A virtual scene accuracy testing device based on an optical dynamic capturing system, the device comprising:
the virtual scale determining module is used for setting a virtual scale in a virtual scene to be tested, wherein the graduations of the virtual scale are consistent with those of the physical scale and scaled in equal proportion, and the virtual scale is provided with a scale arrow that points to a graduation of the virtual scale;
the rigid body position determining module is used for determining the position of the center point of a rigid body through a preset optical dynamic capturing system, wherein the rigid body is composed of at least three rigid body mark points, and the physical scale is placed at a preset distance from the rigid body;
the sampling and calculating module is used for obtaining the graduation indicated by a second mark point on the physical scale to obtain a target reading m_E; taking the center point position of the rigid body as the center and the preset distance as the radius, acquiring multi-frame three-dimensional coordinate data of a first mark point and the second mark point on the physical scale through the optical dynamic capturing system; calculating the Euclidean distance of each frame and assigning it to the virtual scale so that the scale arrow of the virtual scale points to the corresponding graduation, obtaining a virtual reading m_f; calculating the average m̄ of the virtual readings over multiple frames; and calculating the standard deviation of the virtual readings according to the average;
the judging module is used for obtaining the maximum value C_max and the minimum value C_min of the virtual reading according to the sliding range of the scale arrow of the virtual scale, and judging whether |C_max − m_E| ≤ δ and |C_min − m_E| ≤ δ hold, δ being a preset error; if they hold, calculating the difference |m̄ − m_E|; if the difference is smaller than a first threshold and the standard deviation is smaller than a second threshold, considering that the round of testing meets the precision standard, otherwise considering that it does not meet the standard;
wherein the sampling and calculating module is specifically configured to:
set the averaging window of the optical dynamic capturing system to Δt, acquire the three-dimensional coordinate data through the optical dynamic capturing system at a preset sampling frequency F_S, record the first mark point and the second mark point at time t as frame f, and calculate the Euclidean distance d_t of frame f.
7. A virtual scene precision testing device based on an optical dynamic capturing system, characterized in that the device comprises:
a memory, a processor, and a virtual scene precision testing program based on the optical dynamic capturing system that is stored on the memory and executable on the processor, the program, when executed by the processor, implementing the steps of the virtual scene precision testing method based on the optical dynamic capturing system according to any one of claims 1 to 5.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a virtual scene precision testing program based on an optical dynamic capturing system, the program, when executed by a processor, implementing the steps of the virtual scene precision testing method based on the optical dynamic capturing system according to any one of claims 1 to 5.
9. A scale for the virtual scene precision test based on an optical dynamic capturing system according to any one of claims 1 to 5, comprising a physical scale provided with graduations along its length direction, characterized in that a slide rail is arranged on the physical scale along the length direction, two sliders are mounted on the slide rail, a first mark point and a second mark point are respectively arranged on the two sliders, and the first mark point and the second mark point slide along the slide rail via their corresponding sliders.
CN202010252267.5A 2020-04-01 2020-04-01 Virtual scene precision testing method based on optical dynamic capturing system and related equipment Active CN111462089B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010252267.5A CN111462089B (en) 2020-04-01 2020-04-01 Virtual scene precision testing method based on optical dynamic capturing system and related equipment


Publications (2)

Publication Number Publication Date
CN111462089A CN111462089A (en) 2020-07-28
CN111462089B true CN111462089B (en) 2023-07-11

Family

ID=71685823

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010252267.5A Active CN111462089B (en) 2020-04-01 2020-04-01 Virtual scene precision testing method based on optical dynamic capturing system and related equipment

Country Status (1)

Country Link
CN (1) CN111462089B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115389246B (en) * 2022-10-31 2023-03-03 之江实验室 Speed precision measuring method, system and device of motion capture system
CN117523678B (en) * 2024-01-04 2024-04-05 广东茉莉数字科技集团股份有限公司 Virtual anchor distinguishing method and system based on optical action data

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2977961B1 (en) * 2014-07-24 2018-06-27 Deutsche Telekom AG Method and communication device for creating and/or editing virtual objects
CN105554247B (en) * 2015-12-07 2019-05-14 魅族科技(中国)有限公司 Measurement method, measuring system and terminal
JP2020008917A (en) * 2018-07-03 2020-01-16 株式会社Eidea Augmented reality display system, augmented reality display method, and computer program for augmented reality display

Also Published As

Publication number Publication date
CN111462089A (en) 2020-07-28

Similar Documents

Publication Publication Date Title
CN107645701B (en) Method and device for generating motion trail
CN111199564A (en) Indoor positioning method and device of intelligent mobile terminal and electronic equipment
CN110163889A (en) Method for tracking target, target tracker, target following equipment
CN108638062B (en) Robot positioning method, device, positioning equipment and storage medium
US20230316690A1 (en) 3-D Reconstruction Using Augmented Reality Frameworks
CN111354042A (en) Method and device for extracting features of robot visual image, robot and medium
CN108810473B (en) Method and system for realizing GPS mapping camera picture coordinate on mobile platform
CN111462089B (en) Virtual scene precision testing method based on optical dynamic capturing system and related equipment
WO2016155074A1 (en) Correcting and focusing method and system for included angle of optical axis, and dual-camera equipment
CN111127559B (en) Calibration rod detection method, device, equipment and storage medium in optical dynamic capture system
CN107621263B (en) Geomagnetic positioning method based on road magnetic field characteristics
CN109977466A (en) A kind of 3-D scanning viewpoint planning method, apparatus and computer readable storage medium
JP2008298685A (en) Measuring device and program
CN113029128B (en) Visual navigation method and related device, mobile terminal and storage medium
CN112465871B (en) Evaluation method and system for accuracy of visual tracking algorithm
CN110648363A (en) Camera posture determining method and device, storage medium and electronic equipment
CN112200838A (en) Projectile trajectory tracking method, device, equipment and storage medium
CN113240806B (en) Information processing method, information processing device, electronic equipment and storage medium
CN113888583A (en) Real-time judgment method and device for visual tracking accuracy
CN105338541A (en) Mobile wireless network data-based abnormal trajectory detection method and device
CN111582385B (en) SLAM quality quantization method, system, computer device and storage medium
CN111818260B (en) Automatic focusing method and device and electronic equipment
CN112419739A (en) Vehicle positioning method and device and electronic equipment
CN112504156A (en) Structural surface strain measurement system and measurement method based on foreground grid
CN106123784B (en) method and device for measuring length

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant