CN112950677A - Image tracking simulation method, device, equipment and storage medium - Google Patents

Image tracking simulation method, device, equipment and storage medium

Info

Publication number
CN112950677A
Authority
CN
China
Prior art keywords
angle
calibration
distance
image tracking
preset
Prior art date
Legal status
Pending
Application number
CN202110036617.9A
Other languages
Chinese (zh)
Inventor
陈普华
张培喜
曾奎
张力
李焰
张帆
罗伟
黄鑫鑫
苏茂
杨利兰
万佩
查凡
Current Assignee
General Designing Institute of Hubei Space Technology Academy
Original Assignee
General Designing Institute of Hubei Space Technology Academy
Priority date
Filing date
Publication date
Application filed by General Designing Institute of Hubei Space Technology Academy
Priority to CN202110036617.9A
Publication of CN112950677A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/269: Analysis of motion using gradient-based methods
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/70: Denoising; Smoothing
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Projection Apparatus (AREA)

Abstract

The invention discloses an image tracking simulation method, device, equipment and storage medium. The method constructs a simulated imaging environment; calibrates the distance and angle of the imaging device relative to the projection screen in that environment with a preset least mean square (LMS) algorithm to obtain a calibration distance and a calibration angle; and performs image tracking simulation using the calibration distance and the calibration angle. The distance between the focal point of the television device and the projection screen is obtained without measurement, and the position at which the imaging device's focal point projects onto the screen is located quickly and accurately. System calibration and focal-point positioning are greatly simplified, the operation is simple, the test is well isolated from the external environment, and image tracking simulation becomes more convenient and efficient while simulation accuracy is preserved and simulation efficiency is improved.

Description

Image tracking simulation method, device, equipment and storage medium
Technical Field
The present invention relates to the field of image simulation, and in particular, to an image tracking simulation method, apparatus, device, and storage medium.
Background
In an image tracking simulation test, the simplest way to simulate target motion is to generate a virtual observation target on a projection screen and have a television device capture and observe the moving target. Imaging quality determines how faithful the simulation is, and the relative geometry between the observation device and the projection screen strongly affects the accuracy of the output observation data, so establishing a good imaging environment and calibrating that relative geometry are both essential. However, observation equipment in an open environment is easily disturbed by external conditions, which degrades the authenticity and accuracy of the simulation test. The imaging focal point of the observation device is not directly visible, which greatly increases the difficulty of measuring its distance to the screen and its projected position on the screen. In addition, because simulation tests must be repeated across different development stages, the distance between the television observation device and the screen inevitably changes to some extent during testing.
The traditional image tracking simulation method simulates target motion with a dual calibrated sliding rail and observes the high-low (elevation) and azimuth angles. Its calibration process takes a long time, it places high demands on the test site and environment, and the imaging environment is easily disturbed. If the relative position of the imaging device and the imaging screen is obtained by manual measurement before the test, the relative distance between the television observation device and the screen must first be calculated from the device's field-of-view characteristics, the device and the screen must then be placed at the pre-computed spacing, and the centre point of the television observation device must finally be made to coincide with the centre point of the screen by repeated manual adjustment. This procedure is not only very cumbersome but also imprecise, and it demands a great deal of time and effort before the test.
Disclosure of Invention
The main purpose of the present invention is to provide an image tracking simulation method, apparatus, device, and storage medium, aiming to solve the technical problems of the prior art: the focal point of the television device is difficult to measure accurately and difficult to align with the centre of the screen, the calibration precision is hard to guarantee, and the operation process is cumbersome, time-consuming, and labour-intensive.
In a first aspect, the present invention provides an image tracking simulation method, including the steps of:
constructing a simulated imaging environment;
calibrating the distance and the angle of the imaging equipment relative to the projection screen in the simulated imaging environment according to a preset Least Mean Square (LMS) algorithm to obtain a calibration distance and a calibration angle;
and carrying out image tracking simulation according to the calibration distance and the calibration angle.
Optionally, the constructing a simulated imaging environment comprises:
the imaging device and the projection screen are positioned in a relatively closed imaging environment through a box body, the projection screen is arranged on one side of the box body, and the imaging device fixed through a turntable mechanism is arranged on the other side of the box body;
the imaging device directly faces the projection screen, the projection screen displays an image of an observed object, and a simulated imaging environment is formed by the box body, the imaging device and the projection screen.
Optionally, the calibrating the distance and the angle of the imaging device relative to the projection screen in the simulated imaging environment according to a preset least mean square algorithm LMS to obtain a calibrated distance and a calibrated angle includes:
acquiring a preset horizontal distance between an imaging device and a projection screen in the simulated imaging environment;
acquiring relative position and posture information of the imaging device and an observed object, and acquiring size and shape information of the observed object;
and calibrating the distance and the angle of the imaging equipment relative to the projection screen by combining a preset least mean square algorithm (LMS) according to the preset horizontal distance, the relative position, the attitude information and the size and shape information to obtain a calibration distance and a calibration angle.
Optionally, the calibrating the distance and the angle of the imaging device relative to the projection screen according to the preset horizontal distance, the relative position, the attitude information, and the size and shape information by combining a preset least mean square algorithm LMS to obtain a calibration distance and a calibration angle includes:
acquiring the screen resolution of the projection screen, and determining the physical size of a single pixel according to the screen resolution and the size and shape information;
determining a high-low angle and an azimuth angle of the imaging device according to the preset horizontal distance, the relative position and the attitude information;
acquiring a preset state quantity, a preset filter coefficient and an objective function of a preset least mean square algorithm LMS;
determining the initial pixel position of a projection point according to the physical size, the elevation angle, the azimuth angle, a preset state quantity and a preset filter coefficient;
determining an error function according to the target function and the initial pixel position, and determining a gradient vector of the error function and the preset filter coefficient according to the error function;
converging the preset filter coefficient through the convergence factor of the LMS and the gradient vector to obtain a target filter coefficient;
and calibrating the coordinates of the target projection points according to the target filter coefficients, and determining the calibration distance and the calibration angle according to the coordinates of the target projection points.
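A minimal numerical sketch of two of the steps above: deriving the physical size of a single pixel from the screen resolution, then converting the high-low (elevation) and azimuth angles into an initial pixel position of the projection point. The function names, the centred screen origin, and the pinhole-style projection geometry are illustrative assumptions, not the patent's implementation:

```python
import math

def pixel_size(screen_w_m, screen_h_m, res_x, res_y):
    """Physical size of a single pixel from screen dimensions and resolution."""
    return screen_w_m / res_x, screen_h_m / res_y

def initial_pixel_position(dist_m, elevation_rad, azimuth_rad,
                           px_w, px_h, res_x, res_y):
    """Project the device's optical axis onto the screen and convert the
    metric offsets to pixel indices (screen centre taken as the origin)."""
    x_m = dist_m * math.tan(azimuth_rad)    # horizontal offset on the screen
    y_m = dist_m * math.tan(elevation_rad)  # vertical offset on the screen
    col = res_x / 2 + x_m / px_w
    row = res_y / 2 - y_m / px_h            # pixel rows grow downwards
    return col, row
```

For example, a 1.92 m by 1.08 m screen at 1920 by 1080 resolution gives 1 mm pixels, and a device aimed straight at the screen centre projects to pixel (960, 540).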
Optionally, the converging the preset filter coefficient through the convergence factor of the LMS and the gradient vector to obtain a target filter coefficient includes:
converging the preset filter coefficient through the convergence factor of the LMS and the gradient vector, and obtaining a target filter coefficient through the following formula:
ω_(k+1) = ω_k + 2μ·e_k·x_k
where ω_(k+1) is the target filter coefficient, ω_k is the preset filter coefficient, μ is the convergence factor, and 2·e_k·x_k is the instantaneous estimate of the gradient vector.
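This update formula is the standard LMS recursion, and it can be exercised with a short sketch. The function name, the zero initialisation of the preset coefficients, and the choice of μ are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def lms_calibrate(x_samples, d_samples, n_taps=2, mu=0.01):
    """Run the LMS recursion w_(k+1) = w_k + 2*mu*e_k*x_k over the samples.

    x_samples: sequence of state-quantity vectors (the x_k).
    d_samples: desired outputs of the objective function (the d_k).
    Returns the converged (target) filter coefficients."""
    w = np.zeros(n_taps)          # preset filter coefficients
    for x, d in zip(x_samples, d_samples):
        e = d - np.dot(w, x)      # error function: desired minus filter output
        w = w + 2.0 * mu * e * x  # step along the instantaneous gradient estimate
    return w
```

For a noiseless linear target the coefficients converge geometrically; the convergence factor μ must be small enough that 2μ times the input power stays below the stability bound.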
Optionally, the calibrating the coordinates of the target projection point according to the target filter coefficient, and determining the calibration distance and the calibration angle according to the coordinates of the target projection point include:
determining the coordinates of the target projection points according to the target filter coefficients and the preset state quantity;
acquiring an observation point coordinate of the imaging equipment, and determining a current distance and a current angle between the two coordinates according to the observation point coordinate and the target projection point coordinate;
and taking the current distance as a calibration distance and the current angle as a calibration angle.
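Once the target projection-point coordinates have converged, the calibration distance and angle reduce to elementary geometry between the observation point and the projection point. The sketch below assumes a hypothetical right-handed frame (x right, y up, z pointing from the screen towards the camera); the patent does not fix a coordinate frame here:

```python
import math

def calibration_from_points(obs, proj):
    """Distance and angles between the observation point (device focal point)
    and the converged projection point; both are (x, y, z) in metres."""
    dx, dy, dz = (p - o for p, o in zip(proj, obs))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)  # calibration distance
    azimuth = math.atan2(dx, -dz)                  # bearing in the horizontal plane
    elevation = math.asin(dy / dist) if dist else 0.0
    return dist, azimuth, elevation
```

A camera 2 m in front of the screen centre yields a calibration distance of 2 m with zero azimuth and elevation.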
Optionally, before performing the image tracking simulation according to the calibration distance and the calibration angle, the image tracking simulation method further includes:
when the imaging equipment and the projection screen have angular deviation, acquiring a high-low inclination angle in the high-low direction of the imaging equipment and an azimuth inclination angle corresponding to the azimuth;
correcting the coordinates of the observation points on the projection screen according to the high-low inclination angle and the azimuth inclination angle;
and re-determining the calibration distance and the calibration angle according to the corrected coordinates of the observation point.
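A first-order version of this correction can be written directly: each tilt angle moves the optical-axis intersection across the screen by roughly distance times tan(tilt), which is then converted to pixels. The sign conventions and the small-angle simplification below are assumptions, not the patent's exact correction:

```python
import math

def correct_observation_point(col, row, dist_m, px_w, px_h,
                              elev_tilt_rad, azim_tilt_rad):
    """Shift the on-screen observation point to compensate for a tilted
    imaging device; px_w/px_h are the physical pixel sizes in metres."""
    col_corr = col - dist_m * math.tan(azim_tilt_rad) / px_w
    row_corr = row + dist_m * math.tan(elev_tilt_rad) / px_h  # rows grow downwards
    return col_corr, row_corr
```

With zero tilt the point is unchanged; the corrected coordinates then feed back into the distance and angle calibration.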
In a second aspect, to achieve the above object, the present invention further provides an image tracking simulation apparatus, including:
the construction module is used for constructing a simulation imaging environment;
the calibration module is used for calibrating the distance and the angle of the imaging equipment relative to the projection screen in the simulated imaging environment according to a preset Least Mean Square (LMS) algorithm to obtain a calibration distance and a calibration angle;
and the simulation module is used for carrying out image tracking simulation according to the calibration distance and the calibration angle.
In a third aspect, to achieve the above object, the present invention further provides an image tracking simulation apparatus, including: a memory, a processor, and an image tracking simulation program stored on the memory and executable on the processor, the image tracking simulation program configured to implement the steps of the image tracking simulation method as recited in the claims above.
In a fourth aspect, to achieve the above object, the present invention further provides a storage medium, on which an image tracking simulation program is stored, the image tracking simulation program implementing the steps of the image tracking simulation method as described above when executed by a processor.
The image tracking simulation method provided by the invention constructs a simulated imaging environment; calibrates the distance and angle of the imaging device relative to the projection screen in that environment with a preset least mean square (LMS) algorithm to obtain a calibration distance and a calibration angle; and performs image tracking simulation using them. The distance between the focal point of the television device and the projection screen is obtained without measurement, the position at which the imaging device's focal point projects onto the screen is located quickly and accurately, system calibration and focal-point positioning are greatly simplified, the test is well isolated from the external environment, and image tracking simulation becomes more convenient and efficient while simulation accuracy is preserved and simulation efficiency is improved.
Drawings
FIG. 1 is a schematic diagram of an apparatus architecture of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a first embodiment of an image tracking simulation method according to the present invention;
FIG. 3 is a flowchart illustrating a second embodiment of an image tracking simulation method according to the present invention;
FIG. 4 is a flowchart illustrating a third embodiment of an image tracking simulation method according to the present invention;
FIG. 5 is a schematic flow chart of a fourth embodiment of an image tracking simulation method according to the present invention;
FIG. 6 is a diagram showing the relationship between the observation device and the projection screen in the image tracking simulation method of the present invention;
FIG. 7 is a flowchart illustrating a fifth embodiment of an image tracking simulation method according to the present invention;
FIG. 8 is a schematic diagram of angular deviation in the high-low (elevation) direction in the image tracking simulation method of the present invention;
FIG. 9 is a functional block diagram of a first embodiment of an image tracking simulation apparatus according to the present invention;
fig. 10 is a schematic view of an imaging configuration of the image tracking simulation apparatus of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The solution of the embodiments of the invention is, in essence: construct a simulated imaging environment; calibrate the distance and angle of the imaging device relative to the projection screen in that environment with a preset least mean square (LMS) algorithm to obtain a calibration distance and a calibration angle; and perform image tracking simulation using them. The distance between the focal point of the television device and the projection screen is obtained without measurement, the position at which the imaging device's focal point projects onto the screen is located quickly and accurately, system calibration and focal-point positioning are greatly simplified, and the test is well isolated from the external environment, making image tracking simulation more convenient and efficient while preserving simulation accuracy and improving simulation efficiency. This solves the technical problems of the traditional image tracking simulation method in the prior art: the focal point of the television device is difficult to measure accurately and to align with the centre of the screen, the calibration precision is hard to guarantee, and the operation process is cumbersome, time-consuming, and labour-intensive.
Referring to fig. 1, fig. 1 is a schematic device structure diagram of a hardware operating environment according to an embodiment of the present invention.
As shown in fig. 1, the apparatus may include: a processor 1001 (e.g., a CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 enables communication between these components. The user interface 1003 may include a display and an input unit such as a keyboard, and may optionally also include standard wired and wireless interfaces. The network interface 1004 may optionally include standard wired and wireless interfaces (e.g., a Wi-Fi interface). The memory 1005 may be high-speed RAM or non-volatile memory such as disk storage, and may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration shown in fig. 1 does not limit the apparatus, which may include more or fewer components than shown, combine some components, or arrange the components differently.
As shown in fig. 1, a memory 1005, which is a storage medium, may include therein an operating system, a network communication module, a user interface module, and an image tracking emulation program.
The apparatus of the present invention calls an image tracking simulation program stored in the memory 1005 by the processor 1001, and performs the following operations:
constructing a simulated imaging environment;
calibrating the distance and the angle of the imaging equipment relative to the projection screen in the simulated imaging environment according to a preset Least Mean Square (LMS) algorithm to obtain a calibration distance and a calibration angle;
and carrying out image tracking simulation according to the calibration distance and the calibration angle.
Further, the processor 1001 may call the image tracking simulation program stored in the memory 1005, and also perform the following operations:
the imaging device and the projection screen are positioned in a relatively closed imaging environment through a box body, the projection screen is arranged on one side of the box body, and the imaging device fixed through a turntable mechanism is arranged on the other side of the box body;
the imaging device directly faces the projection screen, the projection screen displays an image of an observed object, and a simulated imaging environment is formed by the box body, the imaging device and the projection screen.
Further, the processor 1001 may call the image tracking simulation program stored in the memory 1005, and also perform the following operations:
acquiring a preset horizontal distance between an imaging device and a projection screen in the simulated imaging environment;
acquiring relative position and posture information of the imaging device and an observed object, and acquiring size and shape information of the observed object;
and calibrating the distance and the angle of the imaging equipment relative to the projection screen by combining a preset least mean square algorithm (LMS) according to the preset horizontal distance, the relative position, the attitude information and the size and shape information to obtain a calibration distance and a calibration angle.
Further, the processor 1001 may call the image tracking simulation program stored in the memory 1005, and also perform the following operations:
acquiring the screen resolution of the projection screen, and determining the physical size of a single pixel according to the screen resolution and the size and shape information;
determining a high-low angle and an azimuth angle of the imaging device according to the preset horizontal distance, the relative position and the attitude information;
acquiring a preset state quantity, a preset filter coefficient and an objective function of a preset least mean square algorithm LMS;
determining the initial pixel position of a projection point according to the physical size, the elevation angle, the azimuth angle, a preset state quantity and a preset filter coefficient;
determining an error function according to the target function and the initial pixel position, and determining a gradient vector of the error function and the preset filter coefficient according to the error function;
converging the preset filter coefficient through the convergence factor of the LMS and the gradient vector to obtain a target filter coefficient;
and calibrating the coordinates of the target projection points according to the target filter coefficients, and determining the calibration distance and the calibration angle according to the coordinates of the target projection points.
Further, the processor 1001 may call the image tracking simulation program stored in the memory 1005, and also perform the following operations:
converging the preset filter coefficient through the convergence factor of the LMS and the gradient vector, and obtaining a target filter coefficient through the following formula:
ω_(k+1) = ω_k + 2μ·e_k·x_k
where ω_(k+1) is the target filter coefficient, ω_k is the preset filter coefficient, μ is the convergence factor, and 2·e_k·x_k is the instantaneous estimate of the gradient vector.
Further, the processor 1001 may call the image tracking simulation program stored in the memory 1005, and also perform the following operations:
determining the coordinates of the target projection points according to the target filter coefficients and the preset state quantity;
acquiring an observation point coordinate of the imaging equipment, and determining a current distance and a current angle between the two coordinates according to the observation point coordinate and the target projection point coordinate;
and taking the current distance as a calibration distance and the current angle as a calibration angle.
Further, the processor 1001 may call the image tracking simulation program stored in the memory 1005, and also perform the following operations:
when the imaging equipment and the projection screen have angular deviation, acquiring a high-low inclination angle in the high-low direction of the imaging equipment and an azimuth inclination angle corresponding to the azimuth;
correcting the coordinates of the observation points on the projection screen according to the high-low inclination angle and the azimuth inclination angle;
and re-determining the calibration distance and the calibration angle according to the corrected coordinates of the observation point.
According to this scheme, a simulated imaging environment is constructed; the distance and angle of the imaging device relative to the projection screen are calibrated in that environment with a preset least mean square (LMS) algorithm to obtain a calibration distance and a calibration angle; and image tracking simulation is performed using them. The distance between the focal point of the television device and the projection screen is obtained without measurement, the position at which the imaging device's focal point projects onto the screen is located quickly and accurately, system calibration and focal-point positioning are greatly simplified, the test is well isolated from the external environment, and image tracking simulation becomes more convenient and efficient while simulation accuracy is preserved and simulation efficiency is improved.
Based on the hardware structure, the embodiment of the image tracking simulation method is provided.
Referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of the image tracking simulation method of the present invention.
In a first embodiment, the image tracking simulation method comprises the following steps:
and step S10, constructing a simulated imaging environment.
It should be noted that the simulated imaging environment is built from a specific structure or combination of fittings so that it isolates external environmental interference and supports image tracking simulation. The purpose of constructing it is to shield the test from external interference factors, making image tracking simulation more convenient and efficient. It can be constructed in many ways, and this embodiment does not restrict the construction.
And step S20, calibrating the distance and the angle of the imaging device relative to the projection screen in the simulated imaging environment according to a preset Least Mean Square (LMS) algorithm to obtain a calibration distance and a calibration angle.
It should be noted that the preset least mean square algorithm (LMS) is a preset adaptive LMS filter. It simplifies the device calibration process, avoids measuring the relative position of the observation device's focal point and the projection screen, improves simulation accuracy, and makes the simulation process more convenient and efficient. The distance and angle of the imaging device relative to the projection screen are calibrated with the LMS to obtain the calibrated values, namely the calibration distance and the calibration angle.
And step S30, performing image tracking simulation according to the calibration distance and the calibration angle.
It should be noted that the calibration distance and the calibration angle may be used as simulation model parameters to start image tracking simulation, that is, the calibration distance and the calibration angle may be used as control inputs of a simulation model.
According to this scheme, a simulated imaging environment is constructed; the distance and angle of the imaging device relative to the projection screen are calibrated with a preset least mean square (LMS) algorithm to obtain a calibration distance and a calibration angle; and image tracking simulation is performed using them. The distance between the focal point of the television device and the projection screen is obtained without measurement, the projected position of the imaging device's focal point on the screen is located quickly and accurately, system calibration and focal-point positioning are greatly simplified, the test is well isolated from the external environment, and the simulation becomes more convenient and efficient while accuracy is preserved.
Further, fig. 3 is a schematic flow chart of a second embodiment of the image tracking simulation method of the present invention, and as shown in fig. 3, the second embodiment of the image tracking simulation method of the present invention is proposed based on the first embodiment, and in this embodiment, the step S10 specifically includes the following steps:
and step S11, enabling the imaging device and the projection screen to be in a relatively closed imaging environment through a box body, wherein one side of the box body is the projection screen, and the other side of the box body is the imaging device fixed through a rotary table mechanism.
It should be noted that the box body isolates external environmental interference. The imaging device and the projection screen sit inside the box: the projection screen on one side and, on the other side, the imaging device fixed on a turntable mechanism. The turntable mechanism is an adjustable support structure that can change the distance and angle of the imaging device relative to the projection screen.
And step S12, the imaging device directly faces the projection screen, the projection screen displays the image of the observed object, and a simulated imaging environment is formed by the box body, the imaging device and the projection screen.
It can be understood that, by default, the imaging device faces the projection screen, and the projection screen displays an image of the observed object, namely the target that the imaging device is to track. The box body, the imaging device, and the projection screen together form the simulated imaging environment, so the position at which the imaging device's focal point projects onto the screen can be located quickly and accurately, greatly simplifying system calibration.
According to this scheme, the box body keeps the imaging device and the projection screen in a relatively closed imaging environment, with the projection screen on one side and the imaging device, fixed on a turntable mechanism, on the other. The imaging device directly faces the projection screen, which displays an image of the observed object, and the box body, imaging device, and projection screen together form the simulated imaging environment. External interference factors are isolated, the distance between the focal point of the television device and the projection screen need not be obtained by measurement, the position at which the imaging device's focal point projects onto the screen can be located quickly and accurately, and system calibration is greatly simplified.
Further, fig. 4 is a schematic flow chart of a third embodiment of the image tracking simulation method of the present invention, and as shown in fig. 4, the third embodiment of the image tracking simulation method of the present invention is proposed based on the first embodiment, and in this embodiment, the step S20 specifically includes the following steps:
and step S21, acquiring the preset horizontal distance between the imaging device and the projection screen in the simulated imaging environment.
It should be noted that the preset horizontal distance is the default horizontal distance set between the imaging device and the projection screen; it is the distance that needs to be calibrated.
Step S22, acquiring relative position and posture information of the imaging device and the observed object, and acquiring size and shape information of the observed object.
It is understood that the imaging device and the observed object have a relative position, namely their spatial relative distance, and corresponding posture information, namely their spatial relative angular orientation; the size and shape information describes the dimensions and geometry of the observed object.
Step S23, calibrating the distance and the angle of the imaging device relative to the projection screen according to the preset horizontal distance, the relative position, the attitude information and the size and shape information by combining a preset Least Mean Square (LMS) algorithm, and obtaining a calibration distance and a calibration angle.
It should be noted that the preset least mean square algorithm LMS is used to calibrate the distance and angle of the imaging device relative to the projection screen by performing correlation operation on the parameters, such as the preset horizontal distance, the relative position, the attitude information, and the size and shape information, so as to obtain the calibrated distance and calibrated angle.
According to the scheme, the preset horizontal distance between the imaging device and the projection screen is acquired in the simulated imaging environment; acquiring relative position and posture information of the imaging device and an observed object, and acquiring size and shape information of the observed object; calibrating the distance and the angle of the imaging equipment relative to the projection screen according to the preset horizontal distance, the relative position, the attitude information and the size and shape information by combining a preset Least Mean Square (LMS) algorithm to obtain a calibration distance and a calibration angle; the system calibration work can be greatly simplified, the operation is simple, the positioning of the focus of the observation equipment is simplified, the isolation from the external environment is good, the image tracking simulation is more convenient and efficient, the simulation precision is ensured, and the simulation efficiency is improved.
Further, fig. 5 is a schematic flow chart of a fourth embodiment of the image tracking simulation method of the present invention, and as shown in fig. 5, the fourth embodiment of the image tracking simulation method of the present invention is provided based on the third embodiment, in this embodiment, the step S23 specifically includes the following steps:
step S231, acquiring a screen resolution of the projection screen, and determining a physical size of a single pixel according to the screen resolution and the size and shape information.
It should be noted that the screen resolution is that of the projection screen, and the physical size of a single pixel may be determined by combining the size and shape information with the screen resolution. In a specific implementation, the physical size of a single pixel is determined according to the screen resolution and the size and shape information by the following formula:

$$k_{xi} = \frac{L}{x_{imax}}, \qquad k_{yi} = \frac{W}{y_{imax}}$$

where $k_{xi}$ is the X-axis physical size of a single pixel, $k_{yi}$ is the Y-axis physical size of a single pixel, $L$ is the length of the projection screen obtained from the size and shape information, $W$ is the width of the projection screen obtained from the size and shape information, and $x_{imax}$ and $y_{imax}$ are the horizontal and vertical components of the screen resolution.
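As a minimal sketch of this step, the following Python snippet computes the single-pixel physical size from a screen size and resolution; the function name and the example dimensions are illustrative assumptions, not values from the patent:

```python
def pixel_physical_size(length, width, xi_max, yi_max):
    """Physical size of a single pixel: k_xi = L / xi_max, k_yi = W / yi_max."""
    k_xi = length / xi_max  # X-axis physical size of one pixel
    k_yi = width / yi_max   # Y-axis physical size of one pixel
    return k_xi, k_yi

# Example: a 1.92 m x 1.08 m projection screen at 1920x1080 resolution
# gives square pixels of 1 mm.
k_xi, k_yi = pixel_physical_size(1.92, 1.08, 1920, 1080)
```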
Step S232, determining a height angle and an azimuth angle of the imaging device according to the preset horizontal distance, the relative position and the attitude information.
It is understood that the elevation angle and azimuth angle output by the imaging device with respect to the projection screen can be determined based on the preset horizontal distance, the relative position and the attitude information.
In a specific implementation, referring to fig. 6, fig. 6 is a diagram illustrating the relationship between the observation device and the projection screen in the image tracking simulation method of the present invention. As shown in fig. 6, the focal point of the imaging device is $O_c$, the intersection point of its axis with the projection screen is $O$, and the distance between them is $l$. A plane coordinate system $Oxy$ is established on the projection screen with point $O$ as the origin; let $P$ be the observation point, whose projections on $Ox$ and $Oy$ are $N$ and $M$ respectively. The imaging device outputs an elevation angle $\alpha$ and an azimuth angle $\beta$, defined as $\alpha = \angle MO_cO$ and $\beta = \angle NO_cO$. The coordinate system $Oxy$ divides the projection screen into four quadrants; the elevation angle is positive in quadrants I and II, and the azimuth angle is positive in quadrants II and III. The screen pixel coordinate system is denoted $O_i x_i y_i$.
Step S233, a preset state quantity, a preset filter coefficient, and an objective function of a preset least mean square algorithm LMS are obtained.
It should be understood that the preset state quantities are quantities of different states represented by the preset least mean square algorithm LMS, the preset state quantities may be quantities that are not convenient to measure directly or difficult to measure accurately, the filter coefficients are coefficients for performing data filtering by the preset least mean square algorithm LMS, and the objective function is an observed pixel point position corresponding to an observation point of the observation device.
And S234, determining the initial pixel position of the projection point according to the physical size, the elevation angle, the azimuth angle, the preset state quantity and the preset filter coefficient.
It is understood that the initial pixel position can be obtained from the physical size, the elevation angle, the azimuth angle, the preset state quantity and the preset filter coefficient by the following formula:

$$x_{ip} = \frac{l\tan\beta}{k_{xi}} + x_{io}, \qquad y_{ip} = \frac{l\tan\alpha}{k_{yi}} + y_{io}$$

where $k_{xi}$ is the X-axis physical size of a single pixel, $k_{yi}$ is the Y-axis physical size of a single pixel, $x_{ip}$ and $y_{ip}$ are the X-axis and Y-axis coordinates of the initial pixel position of the observation point, $\alpha$ is the elevation angle, $\beta$ is the azimuth angle, $l$ is the preset horizontal distance, $x_{io}$ and $y_{io}$ are the X-axis and Y-axis pixel coordinates corresponding to the coordinate origin, $\omega$ is the preset filter coefficient, and $x$ is the preset state quantity.
In a specific implementation, the above formula is obtained as follows. With continued reference to fig. 6, the elevation and azimuth angles are output by the observation device and the pixel position $(x_{ip}, y_{ip})$ of point $P$ is read from the screen; the size and resolution of the screen can be found from the device properties, so $k_{xi}$ and $k_{yi}$ can be obtained by calculation. The quantities to be calibrated are the distance $l$ and the pixel position $(x_{io}, y_{io})$ of the projection point $O$ (i.e., the zero position of the coordinate system $Oxy$). Let the coordinates of point $P$ in $Oxy$ be $(x_p, y_p)$; then:

$$x_p = k_{xi}(x_{ip} - x_{io}), \qquad y_p = k_{yi}(y_{ip} - y_{io})$$

By the definitions of the elevation and azimuth angles:

$$\tan\beta = \frac{x_p}{l}, \qquad \tan\alpha = \frac{y_p}{l}$$

Combining these formulas gives:

$$x_{ip} = \frac{l\tan\beta}{k_{xi}} + x_{io}, \qquad y_{ip} = \frac{l\tan\alpha}{k_{yi}} + y_{io}$$

which can be expressed in the linear form:

$$x_{ip} = \omega^{T} x, \qquad \omega = \begin{bmatrix} l \\ x_{io} \end{bmatrix}, \qquad x = \begin{bmatrix} \tan\beta / k_{xi} \\ 1 \end{bmatrix}$$

and similarly for $y_{ip}$.
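The mapping from output angles to the initial pixel position can be sketched as follows; this is a hedged illustration, and the function name and numeric values are assumptions rather than patent content:

```python
import math

def initial_pixel_position(alpha, beta, l, k_xi, k_yi, x_io, y_io):
    """x_ip = l*tan(beta)/k_xi + x_io,  y_ip = l*tan(alpha)/k_yi + y_io."""
    x_ip = l * math.tan(beta) / k_xi + x_io
    y_ip = l * math.tan(alpha) / k_yi + y_io
    return x_ip, y_ip

# With zero elevation and azimuth, the observation point projects onto the
# focal projection point (x_io, y_io) itself.
x_ip, y_ip = initial_pixel_position(0.0, 0.0, 2.0, 0.001, 0.001, 960.0, 540.0)
```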
step S235, determining an error function according to the target function and the initial pixel position, and determining a gradient vector of the error function and the preset filter coefficient according to the error function.
It will be appreciated that the error function is determined from the objective function and the initial pixel position by:

$$e_k = d_k - \omega^{T} x_k$$
Under stationary conditions, the mean square error can be expressed as:

$$E[e_k^2] = E[d_k^2] - 2p^{T}\omega + \omega^{T} R \omega$$

In the above formula, $p = E[d_k x_k]$ is the correlation vector of the objective function and the state vector, and

$$R = E[x_k x_k^{T}]$$

is the autocorrelation matrix of the state vector. Herein, the superscript $T$ denotes the transpose of a vector or matrix, and $d_k$, $x_k$ and $e_k$ denote the sampling of the system at time $k$. With the objective function $d_k$, the gradient vector $g_k$ of the error function with respect to the preset filter coefficient is determined from the error function by:

$$g_k = \frac{\partial E[e_k^2]}{\partial \omega} = -2p + 2R\omega$$
and step S236, converging the preset filter coefficient through the convergence factor of the LMS and the gradient vector to obtain a target filter coefficient.
It can be understood that obtaining the target filter coefficient can improve the precision of the image tracking simulation by converging the preset filter coefficient.
Further, the step S236 specifically includes the following steps:
converging the preset filter coefficient through the convergence factor of the LMS and the gradient vector, and obtaining a target filter coefficient through the following formula:
$$\omega_{k+1} = \omega_k + 2\mu e_k x_k$$

where $\omega_{k+1}$ is the target filter coefficient, $\omega_k$ is the preset filter coefficient, $\mu$ is the convergence factor, and $2e_k x_k$ is the instantaneous estimate of the gradient vector.
In a specific implementation, a preset state quantity and an initial value of a preset filter coefficient may be randomly selected, and the pseudo code of the adaptive LMS method is as follows:

for k = 1:N
    x_k = [tan(beta_k)/k_xi, 1]^T    % state quantity from the k-th measurement
    d_k = x_ip(k)                    % objective function: observed pixel position
    e_k = d_k - omega_k^T * x_k      % instantaneous error
    omega_{k+1} = omega_k + 2*mu*e_k*x_k   % filter-coefficient update
end
By randomly selecting initial values of $\omega$ and $x$ and choosing a proper convergence coefficient $\mu$, an estimate of the converged quantities is obtained, yielding the result to be calibrated. The adaptive LMS has low computational complexity, converges easily in a stationary environment, converges without bias to the Wiener solution, and remains stable under finite precision.
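The adaptive loop above can be sketched in Python. In this sketch the regressor is scaled by $k_{xi}$ for numerical conditioning, so the estimated coefficients are $[l/k_{xi}, x_{io}]$; the function name, the scaling choice and all numeric values are illustrative assumptions, not the patent's implementation:

```python
import math
import random

def lms_calibrate(betas, x_ips, k_xi, mu=0.4, n_passes=500):
    """Estimate distance l and focal projection x_io with the LMS update
    w <- w + 2*mu*e*x, cycling over the measurements; w = [l/k_xi, x_io]."""
    random.seed(0)
    w = [random.random(), random.random()]       # random initial coefficients
    for _ in range(n_passes):
        for beta, d in zip(betas, x_ips):
            x = (math.tan(beta), 1.0)            # scaled state quantity
            e = d - (w[0] * x[0] + w[1] * x[1])  # instantaneous error e_k
            w = [w[0] + 2 * mu * e * x[0],
                 w[1] + 2 * mu * e * x[1]]       # filter-coefficient update
    return w[0] * k_xi, w[1]                     # (estimated l, estimated x_io)

# Synthetic check: pixel positions generated from l = 2.0 m, x_io = 960 px.
k_xi = 0.001
betas = [-0.2, -0.1, 0.0, 0.1, 0.2]
x_ips = [2.0 * math.tan(b) / k_xi + 960.0 for b in betas]
l_est, x_io_est = lms_calibrate(betas, x_ips, k_xi)
```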
The above analysis is based on the assumption that the imaging device is facing the projection screen without an angular deviation, and the angular deviation correction method is as in the following embodiment.
It should be understood that the search may generally proceed along the direction of steepest descent:

$$\omega_{k+1} = \omega_k - \mu \hat{g}_k$$

where $\mu$ represents the convergence factor and

$$\hat{g}_k = -2 e_k x_k$$

denotes the instantaneous estimate of $g_k$, obtained by replacing $p$ and $R$ with their instantaneous values. Substituting $\hat{g}_k$ into the steepest-descent update gives

$$\omega_{k+1} = \omega_k + 2\mu e_k x_k$$

so the formula for obtaining the target filter coefficient holds. Convergence of the filter coefficient is ensured by taking the convergence factor within a certain range; the value range of the convergence factor $\mu$ is

$$0 < \mu < \frac{1}{\operatorname{tr}(R)}$$

where $\operatorname{tr}(R)$ denotes the trace of the matrix $R$, i.e., the sum of its diagonal elements.
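The bound on $\mu$ can be estimated from sampled state vectors. The helper below is an illustrative sketch; estimating $R$ by a sample average is an assumption on my part, not something the patent specifies:

```python
def mu_upper_bound(state_samples):
    """Upper bound 1/tr(R) on the convergence factor, estimating the
    autocorrelation matrix R by a sample average; tr(R) then equals the
    average squared norm of the state vectors."""
    n = len(state_samples)
    tr_r = sum(sum(v * v for v in x) for x in state_samples) / n
    return 1.0 / tr_r

# Two unit state vectors give tr(R) = 1, hence mu must stay below 1.0.
bound = mu_upper_bound([(1.0, 0.0), (0.0, 1.0)])
```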
And S237, calibrating the coordinates of the target projection points according to the target filter coefficients, and determining a calibration distance and a calibration angle according to the coordinates of the target projection points.
It can be understood that the target projection point coordinates, i.e., the coordinates of the projection point of the observation device on the projection screen relative to the measured target, can be determined from the target filter coefficient; once the target filter coefficient is determined, they can be obtained by the calculation method based on the initial pixel position.
Further, the step S237 specifically includes the following steps:
determining the coordinates of the target projection points according to the target filter coefficients and the preset state quantity;
acquiring an observation point coordinate of the imaging equipment, and determining a current distance and a current angle between the two coordinates according to the observation point coordinate and the target projection point coordinate;
and taking the current distance as a calibration distance and the current angle as a calibration angle.
It should be understood that the determined target filter coefficient and the preset state quantity are substituted into the formula for the initial pixel position given above:

$$x_{ip} = \frac{l\tan\beta}{k_{xi}} + x_{io}, \qquad y_{ip} = \frac{l\tan\alpha}{k_{yi}} + y_{io}$$

It should be noted that the projection point position is generated on the screen and reflects the motion relationship of the simulation model, so $x_{ip}$ and $y_{ip}$ are known; the quantities to be calibrated are the distance $l$ and the focal projection position $x_{io}$, $y_{io}$, i.e., the final values of the two coefficients $\omega$ estimated in the pseudo code. A common method is to take the final coefficient value, or the average of the last few coefficient values, as the estimate, which yields the target projection point coordinates. After the observation point coordinates of the imaging device are obtained, the current distance and current angle between the two coordinates can be determined from the observation point coordinates and the target projection point coordinates; the current distance is then taken as the calibration distance and the current angle as the calibration angle.
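Once the target projection point is known, the distance and angle to the observation point follow from plane geometry. This is a minimal sketch; the coordinate tuples and names are illustrative:

```python
import math

def calibration_from_points(observation_pt, projection_pt):
    """Current distance and current angle between the observation point
    and the target projection point."""
    dx = projection_pt[0] - observation_pt[0]
    dy = projection_pt[1] - observation_pt[1]
    distance = math.hypot(dx, dy)   # calibration distance
    angle = math.atan2(dy, dx)      # calibration angle, radians
    return distance, angle

# A 3-4-5 triangle: distance 5, angle atan2(4, 3).
dist, ang = calibration_from_points((0.0, 0.0), (3.0, 4.0))
```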
According to the scheme, the physical size of a single pixel is determined according to the screen resolution and the size and shape information by acquiring the screen resolution of the projection screen; determining a high-low angle and an azimuth angle of the imaging device according to the preset horizontal distance, the relative position and the attitude information; acquiring a preset state quantity, a preset filter coefficient and an objective function of a preset least mean square algorithm LMS; determining the initial pixel position of a projection point according to the physical size, the elevation angle, the azimuth angle, a preset state quantity and a preset filter coefficient; determining an error function according to the target function and the initial pixel position, and determining a gradient vector of the error function and the preset filter coefficient according to the error function; converging the preset filter coefficient through the convergence factor of the LMS and the gradient vector to obtain a target filter coefficient; the method has the advantages that the coordinates of the target projection points are calibrated according to the target filter coefficients, the calibration distance and the calibration angle are determined according to the coordinates of the target projection points, the calibration work of the system can be greatly simplified, the operation is simple, the positioning of the focus of the observation equipment is simplified, the isolation from the external environment is good, the image tracking simulation is more convenient and efficient, the simulation precision is ensured, and the simulation efficiency is improved.
Further, fig. 7 is a schematic flowchart of a fifth embodiment of the image tracking simulation method of the present invention, and as shown in fig. 7, the fifth embodiment of the image tracking simulation method of the present invention is proposed based on the first embodiment, in this embodiment, before the step S30, the image tracking simulation method further includes the following steps:
step S301, when the imaging device and the projection screen have an angle deviation, acquiring a high-low inclination angle in the high-low direction of the imaging device and an azimuth inclination angle corresponding to the azimuth.
It can be understood that when the imaging device has an angular deviation from the projection screen, the angular deviation may be corrected, and a high-low inclination angle in the high-low direction of the imaging device and an azimuth inclination angle corresponding to the azimuth at the current moment need to be obtained.
Step S302, correcting the coordinates of the observation point on the projection screen according to the high-low inclination angle and the azimuth inclination angle.
It should be understood that the elevation angle can be corrected by the elevation tilt angle to obtain a new elevation angle, and the azimuth angle by the azimuth tilt angle to obtain a new azimuth angle; the observation point coordinates are then recalculated from the new elevation and azimuth angles, completing the correction of the observation point coordinates on the projection screen.
And step S303, re-determining the calibration distance and the calibration angle according to the corrected observation point coordinates.
It is understood that a new calibration distance and a new calibration angle can be determined again by the method of the above embodiment according to the corrected coordinates of the observation point.
In a specific implementation, referring to fig. 8, fig. 8 is a schematic diagram of the angular deviation in the elevation direction in the image tracking simulation method of the present invention (the azimuth direction is analyzed in the same way). As shown in fig. 8, $O_c$ is the focal point of the camera, $O$ is the projection of the focal point on the projection screen, $O'$ is the projection onto the longitudinal axis of the intersection of the camera axis with the screen surface, $P(x, y)$ is an imaging point, $\alpha$ is the corresponding measured elevation angle, and $\Delta\alpha$ is the tilt of the camera in the elevation direction. The geometric relationship is:

$$|OP| = l\tan(\alpha + \Delta\alpha) \approx l\tan\alpha + l\Delta\alpha + l\Delta\alpha\tan^2\alpha$$

Thus, two measurement points $P_i(x_i, y_i)$ and $P_j(x_j, y_j)$ satisfy the relationship:

$$|P_iP_j| = y_j - y_i = l(\tan\alpha_j - \tan\alpha_i) + l\Delta\alpha(\tan^2\alpha_j - \tan^2\alpha_i)$$
Letting

$$\omega = \begin{bmatrix} l \\ l\Delta\alpha \end{bmatrix}, \qquad x = \begin{bmatrix} \tan\alpha_j - \tan\alpha_i \\ \tan^2\alpha_j - \tan^2\alpha_i \end{bmatrix}$$

$\Delta\alpha$ can be found as shown in the first section; similarly, $\Delta\beta$ can be obtained.
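Assuming the distance $l$ has already been calibrated, the elevation tilt can also be recovered from a single pair of measurement points by solving the two-point relation directly; this sketch is a simplification of the LMS estimate, and its names and numbers are illustrative:

```python
import math

def estimate_delta_alpha(alpha_i, alpha_j, y_i, y_j, l):
    """Solve |PiPj| = l*(tan aj - tan ai) + l*da*(tan^2 aj - tan^2 ai)
    for the elevation tilt da, given one pair of measurement points."""
    t_i, t_j = math.tan(alpha_i), math.tan(alpha_j)
    return ((y_j - y_i) / l - (t_j - t_i)) / (t_j ** 2 - t_i ** 2)

# Synthetic check with an assumed tilt of 0.01 rad and l = 2.0 m, using
# y(a) = l*tan(a) + l*da*(1 + tan(a)^2) from the geometric relation above.
l_cal, da_true = 2.0, 0.01
y = lambda a: l_cal * math.tan(a) + l_cal * da_true * (1.0 + math.tan(a) ** 2)
da_est = estimate_delta_alpha(0.1, 0.3, y(0.1), y(0.3), l_cal)
```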
Correcting the expressions of the above embodiments with $\Delta\alpha$ and $\Delta\beta$,

$$x_{ip} = \frac{l\tan(\beta + \Delta\beta)}{k_{xi}} + x_{io}, \qquad y_{ip} = \frac{l\tan(\alpha + \Delta\alpha)}{k_{yi}} + y_{io}$$

a calibration method with a deviation angle can be obtained.
When the arrangement positions of the imaging device and the projection screen are correct and the calibration error meets the simulation requirement, the angle correction condition can be not considered to simplify the model, and after the calibration is finished, the calibration quantity is used as a simulation model parameter to start the image tracking simulation.
According to the scheme, when the imaging device and the projection screen have angular deviation, the high-low inclination angle of the imaging device in the high-low direction and the azimuth inclination angle corresponding to the azimuth are obtained; correcting the coordinates of the observation points on the projection screen according to the high-low inclination angle and the azimuth inclination angle; the calibration distance and the calibration angle are determined again according to the corrected coordinates of the observation point, and the calibration distance and the calibration angle can be calculated and corrected when the angle deviation exists, so that the simulation precision is effectively guaranteed, and the simulation efficiency is improved.
Correspondingly, the invention further provides an image tracking simulation device.
Referring to fig. 9, fig. 9 is a functional block diagram of the image tracking simulation apparatus according to the first embodiment of the present invention.
In a first embodiment of the image tracking simulation apparatus of the present invention, the image tracking simulation apparatus includes:
a building block 10 for building a simulated imaging environment.
And the calibration module 20 is configured to calibrate the distance and the angle of the imaging device relative to the projection screen according to a preset least mean square algorithm LMS in the simulated imaging environment, so as to obtain a calibrated distance and a calibrated angle.
And the simulation module 30 is configured to perform image tracking simulation according to the calibration distance and the calibration angle.
The steps implemented by each functional module of the image tracking simulation device can refer to each embodiment of the image tracking simulation method of the present invention, and are not described herein again.
In a specific implementation, referring to fig. 10, fig. 10 is a schematic view of an imaging structure of the image tracking simulation apparatus of the present invention. As shown in fig. 10, the box is used for simulating the imaging environment and isolating external environment interference. One side of the box is provided with the projection screen, which generates an image of the observed object; on the other side is fixed the imaging device, which is mounted on a turntable, can be finely adjusted in direction and can move over a certain distance. The fixed support is used for fixing the observation device, allowing the device to move forwards and backwards over a certain distance and its posture to be finely adjusted. The image generation computer generates the object motion image in the projection screen according to the relative position and posture information of the observation device and the observed object calculated by the relative motion model, the distance between the observation device and the projection screen, and the size and shape of the observed object. The simulation system comprises the observation device and the motion model of the observed object; it receives the imaging information of the imaging device and resolves it into observed angle information as the control input of the model.
In addition, an embodiment of the present invention further provides a storage medium, where an image tracking simulation program is stored on the storage medium, and when executed by a processor, the image tracking simulation program implements the following operations:
constructing a simulated imaging environment;
calibrating the distance and the angle of the imaging equipment relative to the projection screen in the simulated imaging environment according to a preset Least Mean Square (LMS) algorithm to obtain a calibration distance and a calibration angle;
and carrying out image tracking simulation according to the calibration distance and the calibration angle.
Further, the image tracking simulation program when executed by the processor further performs the following operations:
the imaging device and the projection screen are positioned in a relatively closed imaging environment through a box body, the projection screen is arranged on one side of the box body, and the imaging device fixed through a turntable mechanism is arranged on the other side of the box body;
the imaging device is right opposite to the projection screen, the projection screen displays an image of an observed object, and a simulated imaging environment is formed by the box body, the imaging device and the projection screen.
Further, the image tracking simulation program when executed by the processor further performs the following operations:
acquiring a preset horizontal distance between an imaging device and a projection screen in the simulated imaging environment;
acquiring relative position and posture information of the imaging device and an observed object, and acquiring size and shape information of the observed object;
and calibrating the distance and the angle of the imaging equipment relative to the projection screen by combining a preset least mean square algorithm (LMS) according to the preset horizontal distance, the relative position, the attitude information and the size and shape information to obtain a calibration distance and a calibration angle.
Further, the image tracking simulation program when executed by the processor further performs the following operations:
acquiring the screen resolution of the projection screen, and determining the physical size of a single pixel according to the screen resolution and the size and shape information;
determining a high-low angle and an azimuth angle of the imaging device according to the preset horizontal distance, the relative position and the attitude information;
acquiring a preset state quantity, a preset filter coefficient and an objective function of a preset least mean square algorithm LMS;
determining the initial pixel position of a projection point according to the physical size, the elevation angle, the azimuth angle, a preset state quantity and a preset filter coefficient;
determining an error function according to the target function and the initial pixel position, and determining a gradient vector of the error function and the preset filter coefficient according to the error function;
converging the preset filter coefficient through the convergence factor of the LMS and the gradient vector to obtain a target filter coefficient;
and calibrating the coordinates of the target projection points according to the target filter coefficients, and determining the calibration distance and the calibration angle according to the coordinates of the target projection points.
Further, the image tracking simulation program when executed by the processor further performs the following operations:
converging the preset filter coefficient through the convergence factor of the LMS and the gradient vector, and obtaining a target filter coefficient through the following formula:
$$\omega_{k+1} = \omega_k + 2\mu e_k x_k$$

where $\omega_{k+1}$ is the target filter coefficient, $\omega_k$ is the preset filter coefficient, $\mu$ is the convergence factor, and $2e_k x_k$ is the instantaneous estimate of the gradient vector.
Further, the image tracking simulation program when executed by the processor further performs the following operations:
determining the coordinates of the target projection points according to the target filter coefficients and the preset state quantity;
acquiring an observation point coordinate of the imaging equipment, and determining a current distance and a current angle between the two coordinates according to the observation point coordinate and the target projection point coordinate;
and taking the current distance as a calibration distance and the current angle as a calibration angle.
Further, the image tracking simulation program when executed by the processor further performs the following operations:
when the imaging equipment and the projection screen have angular deviation, acquiring a high-low inclination angle in the high-low direction of the imaging equipment and an azimuth inclination angle corresponding to the azimuth;
correcting the coordinates of the observation points on the projection screen according to the high-low inclination angle and the azimuth inclination angle;
and re-determining the calibration distance and the calibration angle according to the corrected coordinates of the observation point.
According to the scheme, the simulated imaging environment is constructed; the distance and the angle of the imaging equipment relative to the projection screen are calibrated in the simulated imaging environment according to a preset least mean square (LMS) algorithm to obtain a calibration distance and a calibration angle; and image tracking simulation is carried out according to the calibration distance and the calibration angle. The distance between the focal point of the imaging equipment and the projection screen can be obtained without measurement, the position at which the focal point of the imaging equipment is projected onto the projection screen can be located quickly and accurately, and the calibration work of the system is greatly simplified; the operation is simple, the positioning of the focal point of the observation equipment is simplified, the isolation from the external environment is good, the image tracking simulation is more convenient and efficient, the simulation precision is ensured, and the simulation efficiency is improved.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. An image tracking simulation method, characterized in that the image tracking simulation method comprises:
constructing a simulated imaging environment;
calibrating the distance and the angle of the imaging equipment relative to the projection screen in the simulated imaging environment according to a preset Least Mean Square (LMS) algorithm to obtain a calibration distance and a calibration angle;
and carrying out image tracking simulation according to the calibration distance and the calibration angle.
2. The image tracking simulation method of claim 1, wherein the constructing a simulated imaging environment comprises:
the imaging device and the projection screen are positioned in a relatively closed imaging environment through a box body, the projection screen is arranged on one side of the box body, and the imaging device fixed through a turntable mechanism is arranged on the other side of the box body;
the imaging device is right opposite to the projection screen, the projection screen displays an image of an observed object, and a simulated imaging environment is formed by the box body, the imaging device and the projection screen.
3. The image tracking simulation method according to claim 1, wherein the calibrating the distance and the angle of the imaging device relative to the projection screen in the simulated imaging environment according to a preset least mean square algorithm LMS to obtain a calibrated distance and a calibrated angle comprises:
acquiring a preset horizontal distance between an imaging device and a projection screen in the simulated imaging environment;
acquiring relative position and attitude information between the imaging device and an observed object, and acquiring size and shape information of the observed object;
and calibrating the distance and the angle of the imaging device relative to the projection screen according to the preset horizontal distance, the relative position and attitude information, and the size and shape information, in combination with the preset least mean square (LMS) algorithm, to obtain a calibration distance and a calibration angle.
4. The image tracking simulation method according to claim 3, wherein the calibrating the distance and the angle of the imaging device relative to the projection screen according to the preset horizontal distance, the relative position and attitude information, and the size and shape information, in combination with the preset least mean square (LMS) algorithm, to obtain a calibration distance and a calibration angle comprises:
acquiring the screen resolution of the projection screen, and determining the physical size of a single pixel according to the screen resolution and the size and shape information;
determining an elevation angle and an azimuth angle of the imaging device according to the preset horizontal distance and the relative position and attitude information;
acquiring a preset state quantity, a preset filter coefficient and an objective function of the preset least mean square (LMS) algorithm;
determining an initial pixel position of a projection point according to the physical size, the elevation angle, the azimuth angle, the preset state quantity and the preset filter coefficient;
determining an error function according to the objective function and the initial pixel position, and determining a gradient vector of the error function with respect to the preset filter coefficient;
converging the preset filter coefficient through the convergence factor of the LMS and the gradient vector to obtain a target filter coefficient;
and calibrating coordinates of a target projection point according to the target filter coefficient, and determining the calibration distance and the calibration angle according to the coordinates of the target projection point.
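The first steps of this claim can be sketched in Python; the function names, units, and the coordinate convention below are illustrative assumptions, not taken from the patent:

```python
import math

def pixel_physical_size(screen_w_m, screen_h_m, res_x, res_y):
    """Physical size of a single pixel, from screen dimensions (metres)
    and screen resolution (pixels)."""
    return screen_w_m / res_x, screen_h_m / res_y

def elevation_azimuth(dx, dy, dz):
    """Elevation and azimuth of the observed object relative to the
    imaging device, given its relative position vector (dx, dy, dz)."""
    azimuth = math.atan2(dy, dx)
    elevation = math.atan2(dz, math.hypot(dx, dy))
    return elevation, azimuth
```

For example, a 1.92 m × 1.08 m projection screen at 1920 × 1080 resolution yields a 1 mm square pixel.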
5. The image tracking simulation method according to claim 4, wherein the converging the preset filter coefficient by the convergence factor of the LMS and the gradient vector to obtain a target filter coefficient comprises:
converging the preset filter coefficient through the convergence factor of the LMS and the gradient vector, and obtaining a target filter coefficient through the following formula:
ω_{k+1} = ω_k + 2μ·e_k·x_k
wherein ω_{k+1} is the target filter coefficient, ω_k is the preset filter coefficient, μ is the convergence factor, and 2e_k·x_k is the instantaneous estimate of the gradient vector.
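The update rule in claim 5 is the standard LMS recursion. A minimal self-contained sketch (the signal model and the parameter values are assumptions chosen for illustration):

```python
import numpy as np

def lms(x, d, mu=0.05, n_taps=2):
    """Least-mean-square adaptation: converge filter weights w so that
    the filter output w . x_k tracks the target sequence d."""
    w = np.zeros(n_taps)                      # preset filter coefficient
    for k in range(n_taps - 1, len(x)):
        x_k = x[k - n_taps + 1:k + 1][::-1]   # current state vector
        e_k = d[k] - w @ x_k                  # instantaneous error
        w = w + 2 * mu * e_k * x_k            # w_{k+1} = w_k + 2*mu*e_k*x_k
    return w                                  # target filter coefficient
```

With a noiseless target generated by a known two-tap filter, the weights converge to that filter, which is a quick sanity check of the recursion.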
6. The image tracking simulation method according to claim 4, wherein the calibrating coordinates of a target projection point according to the target filter coefficient, and determining the calibration distance and the calibration angle according to the coordinates of the target projection point, comprises:
determining the coordinates of the target projection point according to the target filter coefficient and the preset state quantity;
acquiring coordinates of an observation point of the imaging device, and determining a current distance and a current angle between the observation point and the target projection point according to the two sets of coordinates;
and taking the current distance as a calibration distance and the current angle as a calibration angle.
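A sketch of the distance-and-angle step in claim 6; the three-dimensional coordinate convention and function name are assumptions:

```python
import math

def distance_and_angles(obs, proj):
    """Distance and (elevation, azimuth) from the observation point to the
    target projection point, both given as (x, y, z) coordinates."""
    dx, dy, dz = (p - o for p, o in zip(proj, obs))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    azimuth = math.atan2(dy, dx)
    elevation = math.atan2(dz, math.hypot(dx, dy))
    return dist, elevation, azimuth
```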
7. The image tracking simulation method according to any one of claims 1 to 6, wherein before performing image tracking simulation based on the calibration distance and the calibration angle, the image tracking simulation method further comprises:
when there is an angular deviation between the imaging device and the projection screen, acquiring an elevation tilt angle of the imaging device in the elevation direction and an azimuth tilt angle in the azimuth direction;
correcting the coordinates of the observation point on the projection screen according to the elevation tilt angle and the azimuth tilt angle;
and re-determining the calibration distance and the calibration angle according to the corrected coordinates of the observation point.
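The patent does not give the correction formula. As a purely hypothetical small-angle sketch, one might shift the on-screen observation point by the offset that the device tilt induces at the screen plane (the function name, parameters, and sign convention are all assumptions):

```python
import math

def correct_observation_point(x, y, tilt_elev, tilt_azim, horiz_dist):
    """Shift the on-screen observation point (x, y) by the offset that a
    tilt of the imaging device induces at a screen horiz_dist away."""
    return (x - horiz_dist * math.tan(tilt_azim),
            y - horiz_dist * math.tan(tilt_elev))
```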
8. An image tracking simulation apparatus, characterized in that the image tracking simulation apparatus comprises:
the construction module is used for constructing a simulation imaging environment;
the calibration module is used for calibrating the distance and the angle of the imaging device relative to the projection screen in the simulated imaging environment according to a preset least mean square (LMS) algorithm to obtain a calibration distance and a calibration angle;
and the simulation module is used for carrying out image tracking simulation according to the calibration distance and the calibration angle.
9. An image tracking simulation device, characterized in that the image tracking simulation device comprises: a memory, a processor, and an image tracking simulation program stored on the memory and executable on the processor, the image tracking simulation program being configured to implement the steps of the image tracking simulation method of any one of claims 1 to 7.
10. A storage medium having stored thereon an image tracking simulation program which, when executed by a processor, implements the steps of the image tracking simulation method of any one of claims 1 to 7.
CN202110036617.9A 2021-01-12 2021-01-12 Image tracking simulation method, device, equipment and storage medium Pending CN112950677A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110036617.9A CN112950677A (en) 2021-01-12 2021-01-12 Image tracking simulation method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112950677A true CN112950677A (en) 2021-06-11

Family

ID=76235282

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110036617.9A Pending CN112950677A (en) 2021-01-12 2021-01-12 Image tracking simulation method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112950677A (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06123765A (en) * 1992-10-09 1994-05-06 Mitsubishi Electric Corp Image tracking system
CN103336272A (en) * 2013-03-26 2013-10-02 中国科学院电子学研究所 Geometric structure based complex target SAR image simulation method
CN104349096A (en) * 2013-08-09 2015-02-11 联想(北京)有限公司 Image calibration method, image calibration device and electronic equipment
CN105069809A (en) * 2015-08-31 2015-11-18 中国科学院自动化研究所 Camera positioning method and system based on planar mixed marker
CN106403900A (en) * 2016-08-29 2017-02-15 上海交通大学 Flyer tracking and locating system and method
CN107273799A (en) * 2017-05-11 2017-10-20 上海斐讯数据通信技术有限公司 A kind of indoor orientation method and alignment system
CN108958475A (en) * 2018-06-06 2018-12-07 阿里巴巴集团控股有限公司 virtual object control method, device and equipment
CN109212545A (en) * 2018-09-19 2019-01-15 长沙超创电子科技有限公司 Multiple source target following measuring system and tracking based on active vision
CN111563962A (en) * 2020-04-09 2020-08-21 中国科学院空天信息创新研究院 Remote sensing image simulation method based on geometric radiation integrated sampling
CN111754580A (en) * 2019-03-28 2020-10-09 阿里巴巴集团控股有限公司 Camera calibration method, roadside sensing equipment and intelligent traffic system
CN111754581A (en) * 2019-03-28 2020-10-09 阿里巴巴集团控股有限公司 Camera calibration method, roadside sensing equipment and intelligent traffic system
CN111862150A (en) * 2020-06-19 2020-10-30 杭州易现先进科技有限公司 Image tracking method and device, AR device and computer device

Similar Documents

Publication Publication Date Title
CN107564069B (en) Method and device for determining calibration parameters and computer readable storage medium
KR102458415B1 (en) System and method for automatic hand-eye calibration of vision system for robot motion
CN112022355B (en) Hand-eye calibration method and device based on computer vision and storage medium
Sun et al. An empirical evaluation of factors influencing camera calibration accuracy using three publicly available techniques
JP5850962B2 (en) Robot system using visual feedback
CN101998136B (en) Homography matrix acquisition method as well as image pickup equipment calibrating method and device
US11277544B2 (en) Camera-specific distortion correction
JP6929123B2 (en) Camera calibration device and camera calibration program
WO2018163450A1 (en) Robot control device and calibration method
Zhang et al. A universal and flexible theodolite-camera system for making accurate measurements over large volumes
CN111801198A (en) Hand-eye calibration method, system and computer storage medium
WO2019029991A1 (en) System and method for recalibrating a projector system
JP2015022027A (en) Image pickup device and method for controlling the same
CN110225321B (en) Training sample data acquisition system and method for trapezoidal correction
CN110465946B (en) Method for calibrating relation between pixel coordinate and robot coordinate
CN110969665A (en) External parameter calibration method, device and system and robot
WO2022067665A1 (en) Coordinate transformation method, apparatus, and system, program and electronic device thereof
CN115564842A (en) Parameter calibration method, device, equipment and storage medium for binocular fisheye camera
CN112229323A (en) Six-degree-of-freedom measurement method of checkerboard cooperative target based on monocular vision of mobile phone and application of six-degree-of-freedom measurement method
KR100520275B1 (en) Method for correcting geometry of pushbroom image using solidbody rotation model
CN114998556A (en) Virtual-real fusion method for mixed reality flight simulation system
JP2015024480A (en) Information processing device, control method and program
CN112669392B (en) Map positioning method and system applied to indoor video monitoring system
CN112950677A (en) Image tracking simulation method, device, equipment and storage medium
CN112785685A (en) Assembly guiding method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination