CN110200584B - Target tracking control system and method based on fundus imaging technology - Google Patents

Target tracking control system and method based on fundus imaging technology

Info

Publication number
CN110200584B
CN110200584B (application CN201910592902.1A)
Authority
CN
China
Prior art keywords
fundus
image
main system
control arm
motion signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910592902.1A
Other languages
Chinese (zh)
Other versions
CN110200584A (en)
Inventor
张�杰
张金莲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Boshi Medical Technology Co ltd
Original Assignee
Nanjing Boshi Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Boshi Medical Technology Co ltd filed Critical Nanjing Boshi Medical Technology Co ltd
Priority to CN201910592902.1A
Publication of CN110200584A
Application granted
Publication of CN110200584B

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/0016 Operational features thereof
    • A61B3/0025 Operational features thereof characterised by electronic signal processing, e.g. eye models
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12 Objective types for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B3/1225 Objective types for looking at the eye fundus using coherent radiation
    • A61B3/14 Arrangements specially adapted for eye photography
    • A61B3/145 Arrangements specially adapted for eye photography by video means
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F9/00 Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
    • A61F9/007 Methods or devices for eye surgery
    • A61F9/008 Methods or devices for eye surgery using laser
    • A61F2009/00844 Feedback systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30041 Eye; Retina; Ophthalmic

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Vascular Medicine (AREA)
  • Signal Processing (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention discloses a target tracking control system and method based on fundus imaging technology. The control system comprises a main system and an auxiliary system. The main system calculates the transformation of the fundus position over time on the basis of its imaging images, obtaining a fundus motion signal; a spatial coordinate transformation relation between the fundus motion signal and the parameters of the laser control arm of the auxiliary system is established and calibrated through the main system, and the value of the fundus motion signal is converted into the parameters of the laser control arm of the auxiliary system. By adopting the invention, random fundus motion can be dynamically compensated through real-time image stabilization of the main system, so that a visually stable fundus dynamic image is presented in the imaging system, making it convenient to select a pathological area for clinical operation and allowing the surgical operator to accurately position the fundus location to be struck by the auxiliary system.

Description

Target tracking control system and method based on fundus imaging technology
Technical Field
The invention relates to fundus target tracking and retinal-imaging image stabilization technology in the medical field, and in particular to a target tracking control system and method based on fundus imaging technology.
Background
In existing image-based fundus laser beam control systems, there is usually a main imaging system (main system for short) for fundus transverse imaging, such as a conventional fundus camera (Fundus Camera), a line-scanning ophthalmoscope (LSO), a confocal scanning laser ophthalmoscope (cSLO), a flood-illuminated adaptive optics fundus camera, an adaptive optics LSO (AO-LSO), or an adaptive optics SLO (AO-SLO). Then, under the navigation of the main system, an auxiliary imaging system (auxiliary system for short) is integrated. The auxiliary system can be used to project a focused laser beam onto the fundus for fundus/retinal laser treatment, or to image the fundus longitudinally (in a section perpendicular to the fundus) in a scanning mode, such as OCT imaging, as well as for other purposes.
The main systems described above generally provide at least three important functions: (1) providing clinicians with information on fundus pathological areas through the recorded fundus images; (2) allowing a clinician, using the main system's image as a reference benchmark, to select on that image the pathological area to be operated on by the auxiliary system; and (3) serving as navigation: capturing fundus motion signals from the dynamic images of the main system and converting them, through a specific spatial transformation relation, to the laser control arm of the auxiliary system, so that the auxiliary system can dynamically adjust the parameters of its laser control arm according to the fundus motion signal of the main system and deliver its focused laser beam to the specified fundus position. However, the above prior art has two obvious disadvantages:
1) Due to the random movement of the eye, the fundus motion imagery (images and video) of the main imaging system also tends to drift randomly over time, often accompanied by rotation. This randomly drifting dynamic image makes it inconvenient for the surgical operator to select a pathological area on the main system. Since the pathological area is fixed on the fundus while the image drifts, it is difficult for the surgical operator, for example during laser striking in fundus treatment, to accurately position the fundus location to be struck by the auxiliary system.
2) In conventional main-system fundus transverse imaging systems, the image frame rate is usually 25-30 frames/second. Existing algorithms for calculating the fundus motion signal usually work in units of frames: for each frame image, applying a Cross Correlation algorithm gives a set (x, y, θ), where (x, y) is the amount of translation and θ is the amount of rotation. However, the motion spectrum of the fundus image, driven by eyeball and head movement, tends to cover a considerable frequency range. Algorithms based on the 25-30 Hz frame rates commonly used in existing main-system image processing have difficulty capturing higher-frequency fundus motion, so the spatial position of the auxiliary system's laser arm cannot be accurately controlled, and the spatial accuracy with which the auxiliary system's laser beam lands on the designated fundus position is poor.
Disclosure of Invention
In view of the above, the main objective of the present invention is to provide a target tracking control system and method based on fundus imaging technology that, by performing real-time image stabilization and dynamic compensation in the main system, presents a visually stable fundus dynamic image in the imaging system, facilitating the selection of a pathological area for clinical operation and allowing the surgical operator to accurately position the fundus location to be struck by the auxiliary system.
In order to achieve the purpose, the technical scheme of the invention is as follows:
a target tracking control system based on fundus imaging technology comprises a main system and an auxiliary system; the main system is used for calculating the transformation relation of the fundus position along with time by taking the imaging image as a basis to obtain a fundus motion signal; and establishing and calibrating a space coordinate transformation relation between the fundus motion signal and the parameters of the laser control arm of the auxiliary system through the main system, and converting the value of the fundus motion signal into the parameter value of the laser control arm of the auxiliary system.
Wherein: fundus motion signal (x)i,yi,θi) With parameters (X) of a laser control arm of the auxiliary systemi, Yi) The spatial coordinate transformation relationship between the two is specifically as follows:
(Xi,Yi)=g(x,y,θ;X,Y)(xi,yi,θi) (1)
wherein: g (X, Y, theta; X, Y) is a space coordinate transformation relation; (x)i,yi) Translation amount, θ, of fundus motion acquired from primary system imaging imageiThe rotation amount of the movement of the fundus; (X)i, Yi) The amount of translation of the laser control arm of the auxiliary system.
The process of calibrating the spatial coordinate transformation relation between the fundus motion signal and the parameters of the laser control arm of the auxiliary system according to the main system comprises the following steps:
a. Set the fundus motion signal at its original position (x = 0, y = 0, θ = 0) and the parameters of the laser control arm at the zero position (X = 0, Y = 0); record a frame of the fundus image as the reference image f_0, and record the position on the fundus of the focused laser beam of the auxiliary system;

b. Change the reference image of the main system by one motion scale k in x, y, and the rotation direction respectively, and record the position (x_k, y_k, θ_k) of the new main-system image f_k relative to the reference image; adjust the laser control arm of the auxiliary system to bring the focused laser beam back to the same fundus position as at the zero position, obtaining the laser control arm parameters (X_k, Y_k);

c. Repeat step b, traversing the motion scales of the translation amounts x, y and rotation angle θ allowed by the optical system of the main system, to obtain the matrix relation:

G[X Y] = [x y θ]    (2)

wherein: the matrix G is the measured spatial transformation relation from the main system to the auxiliary system; X = [X_1 X_2 … X_K]^T, Y = [Y_1 Y_2 … Y_K]^T, x = [x_1 x_2 … x_K]^T, y = [y_1 y_2 … y_K]^T, θ = [θ_1 θ_2 … θ_K]^T.
The main system may be any one of a fundus camera, a line-scanning ophthalmoscope (LSO), a confocal scanning laser ophthalmoscope (cSLO), an adaptive optics fundus camera, an adaptive optics LSO, and an adaptive optics SLO camera.
A target tracking control method based on a fundus imaging technology comprises the following steps:
A. In a fundus imaging system integrated with the auxiliary system, use the main system to calculate the transformation of the fundus position over time on the basis of its imaging images, obtaining a fundus motion signal;
B. Establish and calibrate, through the main system, a spatial coordinate transformation relation between the fundus motion signal and the parameters of the laser control arm of the auxiliary system, and convert the value of the fundus motion signal into the parameters of the laser control arm of the auxiliary system.
Wherein in step B the spatial coordinate transformation relation between the fundus motion signal (x_i, y_i, θ_i) and the parameters (X_i, Y_i) of the laser control arm of the auxiliary system is specifically:

(X_i, Y_i) = g(x, y, θ; X, Y)(x_i, y_i, θ_i)    (1)

wherein: g(x, y, θ; X, Y) is the spatial coordinate transformation relation; (x_i, y_i) is the translation amount and θ_i the rotation amount of the fundus motion acquired from the main system's imaging image; (X_i, Y_i) is the translation amount of the laser control arm of the auxiliary system.
The process of calibrating the spatial coordinate transformation relation between the fundus motion signal and the parameters of the laser control arm of the auxiliary system via the main system comprises the following steps:

B1. Set the fundus motion signal at its original position (x = 0, y = 0, θ = 0) and the parameters of the laser control arm at the zero position (X = 0, Y = 0); record a frame of the fundus image as the reference image f_0, and record the position on the fundus of the focused laser beam of the auxiliary system;

B2. Change the reference image of the main system by one motion scale k in x, y, and the rotation direction respectively, and record the position (x_k, y_k, θ_k) of the new main-system image f_k relative to the reference image; adjust the laser control arm of the auxiliary system to bring the focused laser beam back to the same fundus position as at the zero position, obtaining the laser control arm parameters (X_k, Y_k);

B3. Repeat step B2, traversing the motion scales of the translation amounts x, y and rotation angle θ allowed by the optical system of the main system, to obtain the matrix relation:

G[X Y] = [x y θ]    (2)

wherein: the matrix G is the measured spatial transformation relation from the main system to the auxiliary system; X = [X_1 X_2 … X_K]^T, Y = [Y_1 Y_2 … Y_K]^T, x = [x_1 x_2 … x_K]^T, y = [y_1 y_2 … y_K]^T, θ = [θ_1 θ_2 … θ_K]^T.
The target tracking control system and method based on the fundus imaging technology have the following beneficial effects:
1) With the invention, a real-time image stabilization method is applied in the main system to dynamically compensate random fundus motion, providing a visually stable fundus dynamic video on the imaging system, so that the operator can efficiently and accurately select the pathological area and set the laser striking parameters (such as spot size, spatial position, adjacent time/space intervals, exposure time, etc.) or accurately position the OCT scanning area.
2) With the invention, in some fundus imaging systems, such as a non-scanning main-system fundus camera, fundus motion information of higher frequency can be captured by moderately increasing the image frame rate (2 or 3 times) without greatly increasing the laser radiation dose to the fundus. In other imaging systems, such as scanning imaging systems, higher-frequency fundus motion information can be captured by frequency multiplication. The effect of the higher sampling frequency is shorter time delay and higher spatial control precision of the auxiliary system's laser beam.
3) In laser striking applications, the invention can provide an offline pathological-area editing method, allowing the user to generate a main-system reference map from an existing database of patient fundus maps, select and edit the laser striking parameters on that reference image, and import the reference image carrying the pathological-area parameters into the master control software, so that subsequent fundus laser striking proceeds manually, semi-automatically, or fully automatically according to the set parameters.
4) The technology of the invention also supports combining different main-system optical imaging systems with the same auxiliary system for fundus laser striking or OCT imaging. The optical imaging system may be (including but not limited to): a conventional fundus camera, a line-scanning ophthalmoscope (LSO), a confocal scanning laser ophthalmoscope (cSLO), a flood-illuminated adaptive optics fundus camera, an adaptive optics LSO (AO-LSO), or an adaptive optics SLO (AO-SLO) camera, all of which are fully compatible with the present invention. The auxiliary system of the invention can be used to control the laser beam for high-precision single-point or array laser striking of the fundus; it can also be used for OCT scanning, as well as other opto-electromechanical system applications.
Drawings
FIG. 1 is a schematic view of a fundus imaging system incorporating fundus laser therapy functionality according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of moving a light spot in a two-dimensional plane using two tilting mirrors that move independently about orthogonal motion axes;
FIG. 3 is a schematic diagram of moving a light spot in a two-dimensional plane using one tilting mirror with two orthogonal motion axes;
FIG. 4 is a schematic view of a fundus SLO image obtained by module 1;
FIG. 5 is a schematic view of a fundus AO-SLO image obtained by module 1;
FIG. 6 is a schematic diagram of the process of calculating the fundus motion amount (x_i, y_i, θ_i) from images in units of frames;
FIG. 7 is a schematic diagram of the optical system for accurately calibrating the spatial (coordinate) transformation g(x, y, θ; X, Y), based on the embodiment shown in FIG. 1;
FIG. 8 is a set of spatial transformation relationships between the main and auxiliary systems obtained by actual measurement on a conventional engineering prototype;
FIG. 9 is a schematic diagram of using frequency multiplication to reduce the time delay of the fundus calculation in an embodiment of the present invention;
FIG. 10 is a graph illustrating the relation between control accuracy and sampling time interval according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of using the (x_i, y_i, θ_i) obtained from FIG. 6 to "straighten" the non-scanned images f_1, f_2, …, f_n to the reference plane f_0;
FIG. 12 is a schematic diagram of using the scanning-mode (x_{i,m}, y_{i,m}, θ_{i,m}) obtained from FIG. 9 to "straighten" the scanned image f_k to the reference plane f_0.
Detailed Description
The present invention will be described in further detail below with reference to the accompanying drawings and embodiments thereof.
Fig. 1 is a schematic view of a fundus imaging system integrated with fundus laser therapy in an embodiment of the present invention. The optical structure of this fundus imaging system is similar to that of the Navilas fundus laser treatment system of OD-OS GmbH, Germany.
As shown in fig. 1, module 1 is a fundus imaging system, i.e., the main system; module 2 is a treatment laser control system, i.e., the auxiliary system; module 3 is a focusing lens; and module 4 is the fundus, i.e., the retina.
In the above module 1, the imaging light source 11 may be monochromatic or white light; after passing through the collimating lens 12, it is partially reflected by the beam splitter 13 toward the beam splitter 25, which relays the imaging and treatment light paths, and then enters the fundus 4. The light reflected from the fundus partially transmits through beam splitter 25 and beam splitter 13 into the focusing lens 14 and is finally received by the fundus camera 15. The fundus camera data are received, displayed, and stored by the master control computer 16; specifically, the master control computer 16 may be a PC, a tablet computer, or the like.
In the above module 2, the aiming light source 21 and the treatment light source 22 enter the treatment light path, i.e., the auxiliary system, from the same spatial position through specific optical devices. The two light sources are typically controlled by the master control machine 16, including parameters such as on/off state and output power. The aiming light is typically of relatively low power and, via the real-time image of the fundus camera, provides the operator with a reference location on the fundus to be struck; the treatment laser is then activated to deliver a relatively high-power beam to the reference location indicated by the aiming light. This process is currently commonly referred to as photocoagulation. It should be noted that, across different engineering scenarios, the aiming light and the treatment light often do not land at exactly the same spatial location on the fundus; such errors are related to a number of factors and are discussed further below.
The light source of module 2 passes through the spot size control device 23, is relayed to the spot position control device 24, passes through the beam splitter 25, and is finally focused on the fundus 4.
As mentioned above, device 23 is used to control the size of the spot on the fundus, typically in the range of 50 to 1000 microns. Device 24 is typically a pair of one-dimensional tilting mirrors or a single two-dimensional tilting mirror used to control the lateral spatial position of the spot on the fundus. The on/off state of the spot is controlled by the host.
To achieve two-dimensional lateral spatial position control, the device 24 typically uses two tilting mirrors that can be moved independently in orthogonal axes of motion, as in FIG. 2, or one tilting mirror with two orthogonal axes of motion, as in FIG. 3.
In fig. 1, module 1 may be a fundus camera, or may be replaced by any one of a line-scanning ophthalmoscope (LSO), a confocal scanning laser ophthalmoscope (cSLO), a flood-illuminated adaptive optics fundus camera, an adaptive optics LSO (AO-LSO), or an adaptive optics SLO (AO-SLO); the final purpose is to obtain a transverse dynamic image of the fundus. The working principles of the adaptive optics fundus camera, the AO-LSO, and the AO-SLO are similar to those of the traditional fundus camera, LSO, and cSLO; the main difference is that adaptive optics technology is integrated to compensate eyeball aberrations in real time, raising the optical resolution of the system to the level of single cells.
Module 2 shows a fundus laser treatment system for single-point or array striking of the fundus with a focused laser beam. Module 2 can also be a fundus scanning imaging system such as OCT; in that case the scanning mirror of figure 2 or 3 is used on the one hand for periodic regular scanning to achieve the B-scan and C-scan of OCT, and the fundus location information fed back from module 1 is then superimposed on the B-scan or C-scan so that they track the fundus location.
Fig. 4 is a schematic view of a fundus SLO image obtained by module 1. In this embodiment, module 1 is a conventional wide-angle SLO.

FIG. 5 is a schematic view of a fundus AO-SLO image obtained by module 1. In this embodiment, module 1 is an adaptive optics SLO.

As shown in fig. 5, the white dots in the image are fundus photoreceptor cells, and the curved black shadow in the middle is a blood-vessel shadow. Fig. 5 corresponds to a local optical magnification of the white-box position in fig. 4.
As described above, in order to accurately control the laser arm of the auxiliary system so that the focused laser beam lands on the prescribed spatial position of the fundus, it is first necessary to calculate, via the main system, the change of the fundus position over time on the basis of the imaged picture (image or video). Existing algorithms for calculating fundus motion signals usually work in units of frames: for each frame image, a Cross Correlation algorithm gives a set (x, y, θ), where (x, y) is the amount of translation and θ is the amount of rotation. Please refer to fig. 6.
FIG. 6 is a schematic diagram of the process of calculating the fundus motion amount (x_i, y_i, θ_i) from images in units of frames.
As shown in FIG. 6, assume the first image f_0 is the main-system reference image. f_0 may be the frame immediately preceding the current frame f_1 in temporal order, any single frame obtained earlier, or a processed image of the same patient at a nearby fundus position. In the subsequent temporal order (1, 2, …, n), the main system receives n frames of images in succession. The common method is to take f_0 as the reference and cross-correlate each f_i (i = 1, 2, 3, …, n) with f_0 one by one, obtaining each frame image f_i's spatial position (x_i, y_i, θ_i) relative to f_0, where (x_i, y_i) is the amount of translation and θ_i is the amount of rotation. These (x_i, y_i, θ_i) represent the change of the fundus over time, the subscript i indexing the time series. As can be seen from the schematic diagram of fig. 6, if we define the image center coordinates of each frame to coincide with the center of the orthogonal dash-dot lines, the fundus position drifts with time; for example, the circle marks how the position of the macular region drifts over time. Obviously, f_1 has relative motion (x_1, y_1, θ_1) with respect to the reference plane f_0, f_2 has relative motion (x_2, y_2, θ_2), f_n has relative motion (x_n, y_n, θ_n), and so on.
Typically, the frame rate of a common main-system imaging system is 25 Hz; that is, taking the frames shown in FIG. 6 as the unit, the algorithm of the main system outputs one set (x_i, y_i, θ_i) every 40 milliseconds.
Depending on clinical requirements and their specific optical designs, the main system and the auxiliary system can have different magnifications, different rotation directions, and so on. Obviously, the main and auxiliary systems should point at the same fundus location, regardless of whether the auxiliary system is used for laser striking or OCT-like fundus imaging.
As can be seen in connection with fig. 1 and 6, fig. 6 represents the evolution over time of the fundus image obtained by module 1 of fig. 1. As shown in FIG. 6, the motion signal of the fundus, i.e. (x_i, y_i, θ_i), can be obtained from any current frame image f_i relative to the reference image f_0.
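As a concrete illustration of this per-frame registration step, the following minimal sketch estimates the translational part (x_i, y_i) of the motion signal with an FFT-based cross correlation; the patent does not publish its implementation, and the rotation θ_i would need an additional step (for example a log-polar registration), which is omitted here. Frames are assumed to be equal-sized grayscale NumPy arrays.

```python
import numpy as np

def translation_by_cross_correlation(ref, frame):
    """Estimate the (x, y) shift of `frame` relative to `ref` via FFT-based
    cross correlation; the correlation peak gives the translational part
    (x_i, y_i) of the fundus motion signal."""
    corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(frame)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrapped peak indices to signed shifts.
    dy, dx = (p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))
    return dx, dy  # (x_i, y_i) in pixels
```

Cross-correlating each f_i against f_0 with such a routine yields the per-frame sequence (x_1, y_1), (x_2, y_2), … used in the discussion below.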
Assume the purpose of module 2 is to dynamically lock the laser beam onto the circle position shown in fig. 6. Then, owing to the random motion of the fundus, at the different times f_1, f_2, …, f_n the laser control arm of module 2 needs to adjust its control arm parameters according to (x_i, y_i, θ_i) so that the focused laser beam of module 2 can track the circle position of fig. 6.
However, as can be seen from fig. 1, modules 1 and 2 employ a non-common-path optics design. The parameters controlling the laser arm of module 2 are usually voltage values, current values, or another digital control mode, whereas the (x_i, y_i, θ_i) obtained from module 1 are typically pixel values and angles (or radians) of the image. To convert the (x_i, y_i, θ_i) values of module 1 into the parameters (voltage, current, or another digital control mode) of module 2's laser control arm, so that the focused laser beam of module 2 is dynamically and accurately projected to each circle position in fig. 6, the embodiment of the invention establishes a spatial (coordinate) transformation relation from module 1 to module 2.
The spatial (coordinate) transformation relation from module 1 to module 2 described above can be expressed by the following mathematical relationship:

(X_i, Y_i) = g(x, y, θ; X, Y)(x_i, y_i, θ_i)    (1)

In equation (1), g(x, y, θ; X, Y) is the spatial (coordinate) transformation described here, whose objective is to convert the (x_i, y_i, θ_i) of module 1 into the parameters (X_i, Y_i) of the laser control arm of the auxiliary system.
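If g is realized as a linear map, which is an assumption here (the patent fixes only g's role in equation (1), not its parametrization), converting one motion sample into arm parameters is a single matrix-vector product:

```python
import numpy as np

def to_arm_parameters(g, motion):
    """Apply equation (1): map one motion sample (x_i, y_i, theta_i) to the
    laser-control-arm parameters (X_i, Y_i). `g` is assumed to be a 2x3
    matrix, e.g. the pseudo-inverse obtained from equation (3) below."""
    X_i, Y_i = np.asarray(g) @ np.asarray(motion)
    return X_i, Y_i
```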
FIG. 7 is a schematic diagram of an optical system for accurately calibrating the spatial (coordinate) transformation g (X, Y, θ; X, Y) based on the embodiment shown in FIG. 1.
In the present embodiment, the fundus position of fig. 7 may be provided by a simulated eye instead of a real eye. The simulated eye has 3 independent degrees of freedom of motion: translations x, y and rotation angle θ. The simulated eye can be mounted on a mechanism capable of translating in a 2-dimensional plane, with the whole mechanism mounted on a rotation stage, thereby realizing the 3 independent degrees of freedom; of course, they can also be achieved in other ways.
Referring to fig. 7, the method adopted by the invention for precisely calibrating the spatial (coordinate) transformation relation g(x, y, θ; X, Y) is as follows. With the simulated eye at its original position (x = 0, y = 0, θ = 0), the parameters of the laser control arm are also set at the zero position (X = 0, Y = 0). At this time, a frame of the fundus image is recorded as the reference image, e.g. f_0, and the position on the fundus of the focused laser beam of module 2 is recorded at the same time. Then one motion scale k is adjusted in the x, y, and rotation directions of the simulated eye respectively, and module 1 records the position (x_k, y_k, θ_k) of the new image f_k relative to the reference image. In the same process, the laser control arm of module 2 is adjusted, manually or automatically, to bring the focused laser beam back to the same simulated-eye fundus position as at the zero position, yielding the laser control arm parameters (X_k, Y_k).
The position of the simulated eye is changed continually and the above calibration procedure repeated in a loop; once all the x, y and rotation-angle motion scales allowed by the optical system have been traversed, the following matrix relation is obtained:

G[X Y] = [x y θ]    (2)

In equation (2), the matrix G is the measured spatial transformation from module 1 to module 2, where X = [X_1 X_2 … X_K]^T, Y = [Y_1 Y_2 … Y_K]^T, x = [x_1 x_2 … x_K]^T, y = [y_1 y_2 … y_K]^T, θ = [θ_1 θ_2 … θ_K]^T.
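Read as a least-squares problem over the K calibration samples, equation (2) determines a matrix G that maps per-sample arm parameters (X_k, Y_k) to per-sample motion (x_k, y_k, θ_k); this stacking convention is an assumption (the patent does not spell it out), chosen so that equation (3) below is dimensionally consistent. A minimal fitting sketch:

```python
import numpy as np

def fit_calibration_matrix(X, Y, x, y, theta):
    """Fit G such that [x_k, y_k, theta_k] ~ G @ [X_k, Y_k] for every
    calibration sample k. Inputs are the K-element vectors of equation (2);
    returns the 3x2 matrix G in the least-squares sense."""
    arm = np.column_stack([X, Y])            # K x 2
    motion = np.column_stack([x, y, theta])  # K x 3
    Gt, *_ = np.linalg.lstsq(arm, motion, rcond=None)  # 2 x 3
    return Gt.T                              # 3 x 2
```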
FIG. 8 is a set of spatial transformation relationships between the main and auxiliary systems obtained by actual measurement on a conventional engineering prototype.

The specific process behind fig. 8 is similar to the method of fig. 7 and is not repeated here. The brief results in fig. 8 show that the main and auxiliary systems have different optical magnifications in the x and y directions, and that the control axes are rotated by 90 degrees.
In a practical (control) scenario, the system first obtains (x_i, y_i, θ_i) from module 1 as shown in FIG. 6 and then converts it into the laser-arm control parameters (X_i, Y_i) of module 2. The matrix G of equation (2) therefore needs to be inverted to realize the calculation of equation (1), which obviously yields:

g(x, y, θ; X, Y) = (G^T G)^{-1} G^T    (3)

assuming (G^T G)^{-1} exists. In the case that G^T G is singular, the calculation of equation (3) may be implemented by singular value decomposition.
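A minimal sketch of equation (3): invert G with the normal-equation pseudo-inverse, falling back to an SVD-based pseudo-inverse (which is what np.linalg.pinv computes) when G^T G is singular.

```python
import numpy as np

def invert_calibration(G):
    """Compute g = (G^T G)^{-1} G^T per equation (3); if G^T G is singular,
    fall back to the SVD-based pseudo-inverse, as suggested above."""
    try:
        return np.linalg.inv(G.T @ G) @ G.T
    except np.linalg.LinAlgError:
        return np.linalg.pinv(G)
```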
Owing to eyeball and head motion, the motion spectrum of the fundus image tends to cover a considerable frequency range. Existing algorithms of main-system imaging systems based on the common 25-30 Hz frame rate have difficulty capturing high-frequency fundus motion, so the spatial position of the auxiliary system's laser arm cannot be controlled accurately, and the spatial accuracy of the auxiliary system's laser beam landing on the designated fundus position is poor. To improve the spatial precision with which the main system captures fundus motion, the embodiment of the invention adopts the following two methods:
the method comprises the following steps: under the condition that the laser radiation quantity of the fundus of the main system is not or does not need to be greatly increased, the image frame rate (such as 2-4 times) of a traditional non-scanning fundus camera is properly increased to capture fundus motion information with higher frequency. As an example, if the image frame frequency is increased by a factor of 4 to 100Hz, the algorithm of the host system outputs a set (x) every 10 msi,yi,θi). In this way, by increasing the sampling frequency of the fundus calculation of the main system, the time delay from the eye movement of the auxiliary system to the beginning of the reaction of the laser arm is reduced, and the effect is that the spatial precision of the laser arm for controlling the auxiliary system to focus the laser beam on the fundus is improved.
In the above embodiment, one usable image capture device is the A5131M/CU210 industrial area-array camera from Huaray Technology; this camera can capture 210 frames of 1280x1024-pixel images per second.
Method two: in other imaging systems, such as scanning imaging systems, higher-frequency fundus motion information can be captured by frequency multiplication. In a scanning imaging system, images are formed "point -> line -> plane", as in an SLO or AO-SLO, or "line -> plane", as in an LSO or AO-LSO. In both cases, the image capture device or line camera can be controlled so that each frame image is divided into a plurality of sub-frame elements according to the sequential arrival order of the scan lines, as shown in fig. 9.
In fig. 9, it is assumed that the image frame rate of the SLO/LSO is still 25 Hz (compared with the case shown in fig. 6, the main system does not increase the laser exposure), but thanks to the flexibility of the scanning system described in method two, the invention divides a complete frame into a plurality of sub-frame elements. Assuming M = 20 in fig. 9, the time sequence in which the sub-frame elements of frame i arrive at the master control machine is:

f_{i,1}, f_{i,2}, …, f_{i,M}

The same applies to any frame image, such as the k-th frame:

f_{k,1}, f_{k,2}, …, f_{k,M}
FIG. 9 is a schematic diagram of the embodiment of the present invention using frequency multiplication to reduce the time delay of the fundus calculation.
Referring to fig. 9, for each frame image the main system calculates the fundus motion signals of the M sub-frame elements in the time order in which they arrive at the master control machine:

(x_{i,1}, y_{i,1}, θ_{i,1}), (x_{i,2}, y_{i,2}, θ_{i,2}), …, (x_{i,M}, y_{i,M}, θ_{i,M})

Similarly, the fundus motion signals of the M sub-frame elements are converted to the laser control arm of the auxiliary system according to the relation of equation (1), giving:

(X_{i,m}, Y_{i,m}) = g(x, y, θ; X, Y)(x_{i,m}, y_{i,m}, θ_{i,m})    (4)

where m = 1, 2, 3, …, M.
Compared with equation (1), equation (4) increases the update frequency of the auxiliary system's laser control arm by a factor of M. If the former runs at 25 Hz, then without increasing the exposure, the frequency multiplication corresponding to fig. 9 raises the adjustment frequency of the laser control arm to 25 x M Hz, greatly improving the spatial precision with which the auxiliary system's laser control arm holds the focused laser beam on the fundus.
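The sub-frame scheme can be pictured as registering each of the M scan strips as it arrives instead of waiting for the full frame. A minimal sketch, assuming M equal horizontal strips in scan order; `register` stands in for the per-strip motion estimation and is an assumption, not the patent's implementation:

```python
import numpy as np

def subframe_updates(frame, ref, M, register, g):
    """Yield one laser-arm update per sub-frame element, per equation (4).

    `register(ref_strip, strip)` returns (x, y, theta) for one strip;
    `g` is the calibrated 2x3 transform of equation (3)."""
    h = frame.shape[0] // M
    for m in range(M):
        strip, ref_strip = frame[m * h:(m + 1) * h], ref[m * h:(m + 1) * h]
        x, y, theta = register(ref_strip, strip)  # (x_{i,m}, y_{i,m}, theta_{i,m})
        yield g @ np.array([x, y, theta])         # (X_{i,m}, Y_{i,m})
```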
Obviously, method one and method two improve the control accuracy of the auxiliary system's laser control arm by the same means in principle: raising the sampling frequency.
FIG. 10 is a diagram illustrating control accuracy and sampling time interval according to an embodiment of the present invention.
As shown in fig. 10, two sampling intervals are compared: the short time interval Δt (thin dotted line) and the long time interval ΔT. It is assumed here that the curve in the figure is the change of the fundus motion trajectory with time (for simplicity of description, only the y direction is shown).
In fig. 10, in the case of the long sampling interval ΔT, assume an eye movement occurs at time i. Because a full sampling interval ΔT must elapse, the image data recording the eye movement at time i does not arrive until time i+1. Even if the calculation of (x_i, y_i, θ_i) and the mechanical response of the treatment-light control arm were completed instantaneously, the eye movement at time i+1 would be compensated with the data from time i, so the resulting compensation error is relatively large.
In fig. 10, in the case of the short sampling interval Δt, assume an eye movement occurs at time j. Likewise, the system compensates the eye movement at time j+1 with the data from time j, but because the interval between j and j+1 is short, the resulting compensation error is much smaller.
For the first of the two methods described above, the 25 Hz of a non-scanning system is raised to 100 Hz. At 25 Hz, the current eye movement is compensated with data from 40 milliseconds earlier; since 100-200 microns of eye movement typically occur within 40 ms, this low-frame-rate approach carries an error of at least 100-200 microns, i.e., poor laser-arm control accuracy. At 100 Hz, the current eye movement is compensated with data from 10 milliseconds earlier; since only 30-40 microns of eye movement typically occur within 10 milliseconds, the high-frame-rate approach can hold the error to 40-50 microns, i.e., high laser-arm control accuracy.
For the second of the two methods, if the number of sub-frame elements M in the scanning system is 20, the sampling frequency of the system rises to 500 Hz. At 500 Hz, the current eye movement is compensated with data from 2 milliseconds earlier, during which only 4-5 microns of eye movement typically occur. Even accounting for calculation delay and the mechanical and electronic delays of the laser control arm, this high-multiplication method can hold the error to 15-20 microns; that is, with scanning frequency multiplication the laser-arm control accuracy can be improved by one or more orders of magnitude.
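The error figures above follow from a simple product of eye speed and loop delay. Taking roughly 5 microns of eye motion per millisecond, a rate implied by the patent's own 100-200 microns per 40 ms figure rather than an independent measurement, a quick check reproduces their order of magnitude:

```python
EYE_SPEED_UM_PER_MS = 5.0  # upper end of ~100-200 um per 40 ms

for label, delay_ms in (("25 Hz frame rate", 40.0),
                        ("100 Hz frame rate", 10.0),
                        ("25 Hz x M=20 sub-frames", 2.0)):
    print(f"{label}: ~{EYE_SPEED_UM_PER_MS * delay_ms:.0f} um residual motion")
```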
As described above, another aspect of the invention uses the fundus position information corresponding to any frame image obtained by the main system, such as (x_i, y_i, θ_i) in non-scanning mode or (x_{i,m}, y_{i,m}, θ_{i,m}) in scanning mode: applying (x_i, y_i, θ_i) or (x_{i,m}, y_{i,m}, θ_{i,m}) to the original image, the frame image is "straightened" (dewarped) to the position of the reference frame image. Applied to dynamic images (video), the equivalent effect of this straightening is that of an image stabilization technique, rendering a dynamically drifting image visually stable.
FIG. 11 is a schematic diagram of using the (x_i, y_i, θ_i) obtained from FIG. 6 to "straighten" the non-scanned images (i.e., target images) f_1, f_2, …, f_n to the reference plane f_0.

As shown in FIG. 11, in a non-scanning imaging system, the image-"straightening" technique pulls the target images f_1, f_2, …, f_n back to the reference plane f_0. The dashed thick box represents the position of the reference plane f_0 within the target images f_1, f_2, …, f_n. The purpose of "straightening" is to pull the fundus image features of f_1, f_2, …, f_n back to the position of the circle on the reference plane f_0. The visual effect achieved is that the fundus image features no longer drift over time, thus stabilizing the target image. Clearly, this digital straightening leaves the edges of the "straightened" image randomly missing, namely the parts of fig. 11 where the dashed thick box and the image do not overlap.
FIG. 12 is a schematic diagram of using the scanning-mode (x_{i,m}, y_{i,m}, θ_{i,m}) obtained from FIG. 9 to "straighten" the scanned image f_k to the reference plane f_0. Compared with the "straightening" method shown in fig. 11, the method in this embodiment is obviously finer-grained: not only can the entire frame be pulled back to the position of the reference plane, but distortions inside the image, which are common in scanning systems, can also be straightened out. Likewise, this digital straightening leaves the edges of the "straightened" image randomly missing, namely the black-edged portions of fig. 12.
Although the black-border phenomenon occurs when the main-system imagery (images and video) is digitally tracked and stabilized in this way, the visible part of the image remains stable with respect to the reference image. This digital image stabilization technology makes it far more convenient for clinical operators to select pathological areas and provides technical support for surgical operators to accurately position the fundus location to be struck by the auxiliary system.
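For the non-scanning case, the "straightening" amounts to undoing one rigid motion per frame. A minimal sketch with SciPy, assuming small rotations about the image center and zero fill at the borders (the zero-filled edges are exactly the black border discussed above); the finer scanning-mode variant of fig. 12 would apply the same idea strip by strip:

```python
from scipy import ndimage

def straighten(frame, x_i, y_i, theta_i_deg):
    """Pull one frame back onto the reference plane f0 by inverting its
    measured motion (x_i, y_i, theta_i). Pixels shifted out of the field
    of view return as zeros, producing the black-border effect."""
    undone = ndimage.rotate(frame, -theta_i_deg, reshape=False, order=1)
    return ndimage.shift(undone, shift=(-y_i, -x_i), order=1, cval=0.0)
```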
Note that when the main system is a scanning system, equation (1) may also be used directly without the frequency multiplication method. This technique has been widely used in the Heidelberg Engineering product Spectralis and the Carl Zeiss product Cirrus.
From the above it can be seen that the solution of the invention spans intelligent fundus laser imaging, surgical treatment, and imaging stabilization through to image-stabilized control; the essential difference is that the main system integrates industry-leading digital image stabilization and frequency multiplication techniques, so that the fundus motion signals (x_i, y_i, θ_i) or (x_{i,m}, y_{i,m}, θ_{i,m}) obtained from these systems can control the position of the auxiliary system's focused laser beam on the fundus more accurately.
Meanwhile, the embodiments of the invention adopt an "open-loop" control mode: the fundus motion signal (x_i, y_i, θ_i) or (x_{i,m}, y_{i,m}, θ_{i,m}) is calculated only from the main-system image, and the calculation result does not need to be fed back to the optical system of the main system. The obvious advantage of an optical system adopting this open-loop control mode is that product cost is effectively reduced by simplifying the optical system, while the technology of the invention effectively avoids the loss of control precision that would otherwise accompany removing that hardware (the signal feedback and computation devices of closed-loop control).
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.

Claims (5)

1. A target tracking control system based on fundus imaging technology, comprising a main system and an auxiliary system; characterized in that the main system is used to calculate the transformation of the fundus position over time on the basis of its imaging images, obtaining a fundus motion signal; a spatial coordinate transformation relation between the fundus motion signal and the parameters of the laser control arm of the auxiliary system is established and calibrated through the main system, and the value of the fundus motion signal is converted into the parameter values of the laser control arm of the auxiliary system;

the focused laser beam of the auxiliary system tracks the position given by the fundus motion signal; the main system adopts an open-loop control mode with digital image stabilization and frequency multiplication, the fundus motion signal being calculated only from the main system's image, without the calculation result needing to be fed back to the optical system of the main system;
the fundus motion signal (x)i,yi,θi) With parameters (X) of a laser control arm of the auxiliary systemi,Yi) The spatial coordinate transformation relationship between the two is specifically as follows:
(Xi,Yi)=g(x,y,θ;X,Y)(xi,yi,θi) (1)
wherein: g (X, Y, theta; X, Y) is a space coordinate transformation relation; (x)i,yi) For the fundus obtained from the image of the main systemTranslation of motion, θiThe rotation amount of the movement of the fundus; (X)i,Yi) The amount of translation of the laser control arm of the auxiliary system.
2. The fundus-imaging-technology-based target tracking control system according to claim 1, wherein the process of calibrating the spatial coordinate transformation relation between the fundus motion signal and the parameters of the laser control arm of the auxiliary system via the main system is:
a. Set the fundus motion signal at its original position (x = 0, y = 0, θ = 0) and the parameters of the laser control arm at the zero position (X = 0, Y = 0); record a frame of the fundus image as the reference image f_0, and record the position on the fundus of the focused laser beam of the auxiliary system;

b. Change the reference image of the main system by one motion scale k in x, y, and the rotation direction respectively, and record the position (x_k, y_k, θ_k) of the new main-system image f_k relative to the reference image; adjust the laser control arm of the auxiliary system to bring the focused laser beam back to the same fundus position as at the zero position, obtaining the laser control arm parameters (X_k, Y_k);

c. Repeat step b, traversing the motion scales of the translation amounts x, y and rotation angle θ allowed by the optical system of the main system, to obtain the matrix relation:

G[X Y] = [x y θ]    (2)

wherein: the matrix G is the measured spatial transformation relation from the main system to the auxiliary system; X = [X_1 X_2 … X_K]^T, Y = [Y_1 Y_2 … Y_K]^T, x = [x_1 x_2 … x_K]^T, y = [y_1 y_2 … y_K]^T, θ = [θ_1 θ_2 … θ_K]^T.
3. The fundus-imaging-technology-based target tracking control system according to claim 1, wherein the main system is any one of a fundus camera, a line-scanning ophthalmoscope (LSO), a confocal scanning laser ophthalmoscope (cSLO), an adaptive optics fundus camera, an adaptive optics LSO, and an adaptive optics SLO camera.
4. A target tracking control method based on fundus imaging technology, characterized by comprising the following steps:

A. In a fundus imaging system integrated with the auxiliary system, use the main system to calculate the transformation of the fundus position over time on the basis of its imaging images, obtaining a fundus motion signal;

B. Establish and calibrate, through the main system, a spatial coordinate transformation relation between the fundus motion signal and the parameters of the laser control arm of the auxiliary system, and convert the value of the fundus motion signal into the parameters of the laser control arm of the auxiliary system;

the focused laser beam of the auxiliary system tracks the position given by the fundus motion signal; the main system adopts an open-loop control mode with digital image stabilization and frequency multiplication, the fundus motion signal being calculated only from the main system's image, without the calculation result needing to be fed back to the optical system of the main system;
wherein in step B the fundus motion signal (x_i, y_i, θ_i) and the parameters (X_i, Y_i) of the laser control arm of the auxiliary system are related by the spatial coordinate transformation:

(X_i, Y_i) = g(x, y, θ; X, Y)(x_i, y_i, θ_i)    (1)

wherein: g(x, y, θ; X, Y) is the spatial coordinate transformation relation; (x_i, y_i) is the translation amount and θ_i the rotation amount of the fundus motion acquired from the imaging image of the main system; (X_i, Y_i) is the translation amount of the laser control arm of the auxiliary system.
5. The fundus-imaging-technology-based target tracking control method according to claim 4, wherein the process of calibrating the spatial coordinate transformation relation between the fundus motion signal and the parameters of the laser control arm of the auxiliary system via the main system is:

B1. Set the fundus motion signal at its original position (x = 0, y = 0, θ = 0) and the parameters of the laser control arm at the zero position (X = 0, Y = 0); record a frame of the fundus image as the reference image f_0, and record the position on the fundus of the focused laser beam of the auxiliary system;

B2. Change the reference image of the main system by one motion scale k in x, y, and the rotation direction respectively, and record the position (x_k, y_k, θ_k) of the new main-system image f_k relative to the reference image; adjust the laser control arm of the auxiliary system to bring the focused laser beam back to the same fundus position as at the zero position, obtaining the laser control arm parameters (X_k, Y_k);

B3. Repeat step B2, traversing the motion scales of the translation amounts x, y and rotation angle θ allowed by the optical system of the main system, to obtain the matrix relation:

G[X Y] = [x y θ]    (2)

wherein: the matrix G is the measured spatial transformation relation from the main system to the auxiliary system; X = [X_1 X_2 … X_K]^T, Y = [Y_1 Y_2 … Y_K]^T, x = [x_1 x_2 … x_K]^T, y = [y_1 y_2 … y_K]^T, θ = [θ_1 θ_2 … θ_K]^T.
CN201910592902.1A 2019-07-03 2019-07-03 Target tracking control system and method based on fundus imaging technology Active CN110200584B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910592902.1A CN110200584B (en) 2019-07-03 2019-07-03 Target tracking control system and method based on fundus imaging technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910592902.1A CN110200584B (en) 2019-07-03 2019-07-03 Target tracking control system and method based on fundus imaging technology

Publications (2)

Publication Number Publication Date
CN110200584A CN110200584A (en) 2019-09-06
CN110200584B (en) 2022-04-29

Family

ID=67795928

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910592902.1A Active CN110200584B (en) 2019-07-03 2019-07-03 Target tracking control system and method based on fundus imaging technology

Country Status (1)

Country Link
CN (1) CN110200584B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6210401B1 (en) * 1991-08-02 2001-04-03 Shui T. Lai Method of, and apparatus for, surgery of the cornea
US6702809B1 (en) * 1989-02-06 2004-03-09 Visx, Inc. System for detecting, measuring and compensating for lateral movements of a target
TW200727883A (en) * 2005-08-31 2007-08-01 Alcon Refractive Horizons Inc System and method for automatic self-alignment of a surgical laser

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003102498A1 (en) * 2002-05-30 2003-12-11 Visx, Inc. Tracking torsional eye orientation and position
US9844463B2 (en) * 2008-04-01 2017-12-19 Amo Development, Llc Ophthalmic laser apparatus, system, and method with high resolution imaging
US10751217B2 (en) * 2013-03-13 2020-08-25 Amo Development, Llc Free floating patient interface for laser surgery system
WO2016011043A1 (en) * 2014-07-14 2016-01-21 University Of Rochester Real-time laser modulation and delivery in opthalmic devices for scanning, imaging, and laser treatment of the eye
US20160278983A1 (en) * 2015-03-23 2016-09-29 Novartis Ag Systems, apparatuses, and methods for the optimization of laser photocoagulation
JP6746960B2 (en) * 2016-03-02 2020-08-26 株式会社ニデック Ophthalmic laser treatment device
US11529259B2 (en) * 2017-10-26 2022-12-20 Amo Development, Llc Femtosecond laser system and methods for photorefractive keratectomy
CN109924943A (en) * 2019-04-25 2019-06-25 南京博视医疗科技有限公司 A kind of digital image stabilization method and system based on improved Line-scanning Image Acquisition System
CN109938919B (en) * 2019-04-25 2023-09-29 南京博视医疗科技有限公司 Intelligent fundus laser surgery treatment device, system and implementation method thereof
CN110176297B (en) * 2019-05-24 2020-09-15 南京博视医疗科技有限公司 Intelligent auxiliary diagnosis system for fundus laser surgery

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6702809B1 (en) * 1989-02-06 2004-03-09 Visx, Inc. System for detecting, measuring and compensating for lateral movements of a target
US6210401B1 (en) * 1991-08-02 2001-04-03 Shui T. Lai Method of, and apparatus for, surgery of the cornea
TW200727883A (en) * 2005-08-31 2007-08-01 Alcon Refractive Horizons Inc System and method for automatic self-alignment of a surgical laser

Also Published As

Publication number Publication date
CN110200584A (en) 2019-09-06

Similar Documents

Publication Publication Date Title
CN110200585B (en) Laser beam control system and method based on fundus imaging technology
CN103687532B (en) The misalignment controlled for the image processor of ophthalmic system reduces
US11650320B2 (en) System and method for refining coordinate-based three-dimensional images obtained from a three-dimensional measurement system
JP4610829B2 (en) Two camera off-axis eye tracking devices
KR101900907B1 (en) Electronically controlled fixation light for ophthalmic imaging systems
US8948497B2 (en) System and method for increasing resolution of images obtained from a three-dimensional measurement system
CN110200582B (en) Laser beam control system and method based on fundus imaging technology
JP4499574B2 (en) A system for tracking the movement of spherical objects.
CN110200584B (en) Target tracking control system and method based on fundus imaging technology
CN110301886B (en) Optical system for real-time closed-loop control of fundus camera and implementation method thereof
JP2018022047A (en) Microscope system
EP3804607A1 (en) Ophthalmic scanning system and method
CN210228108U (en) Optical image stabilization system based on line scanning imaging system
CN110051320B (en) Method for calculating fundus target movement amount of line scanning imaging system
CN110215184B (en) Closed-loop control system and method of fundus camera
US20230277257A1 (en) Robotic imaging system with orbital scanning mode
US20240027180A1 (en) Calibration of imaging system with combined optical coherence tomography and visualization module
JP2017068867A (en) Ophthalmologic apparatus and operation method for ophthalmologic apparatus
JP2022157511A (en) Ophthalmologic apparatus
WO2020232309A1 (en) Method and apparatus to track binocular eye motion
Harvey et al. Low bandwidth eye tracker for scanning laser ophthalmoscopy
JPH0314448B2 (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant