CN109091228B - Multi-instrument optical positioning method and system


Info

Publication number
CN109091228B
Authority
CN
China
Prior art keywords: instrument, marker, instruments, frame, markers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810724839.8A
Other languages
Chinese (zh)
Other versions
CN109091228A (en)
Inventor
张楠
武博
王宇
张梦诗
叶灿
贾博奇
梁楠
Current Assignee
Capital Medical University
Original Assignee
Capital Medical University
Priority date
Filing date
Publication date
Application filed by Capital Medical University
Priority to CN201810724839.8A
Publication of CN109091228A
Application granted
Publication of CN109091228B
Legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems

Abstract

The invention discloses a multi-instrument optical positioning method and system. The method comprises: a multi-instrument judging step, which determines whether an instrument has entered or exited from the increase or decrease in the number of markers between consecutive frames; a multi-instrument identification step, which identifies each instrument from the geometric shape formed by the markers on the instrument, and identifies any newly entered instrument; and a multi-instrument tracking step, which tracks the motion of the multiple instruments with a motion-vector tracking algorithm. The invention can track multiple instruments in real time, reduces frame loss and cases where an individual instrument goes untracked, and addresses the real-time requirements of multi-instrument tracking. It proposes identifying instruments by side-length ratios and/or perimeters, which relaxes the restrictions on instrument shape: even similarly shaped instruments can still be distinguished. The invention can detect increases and decreases in the number of instruments, and still correctly determines the position and orientation of every instrument when that number changes.

Description

Multi-instrument optical positioning method and system
Technical Field
The invention relates to the technical field of surgical navigation, and in particular to a multi-instrument optical positioning method and system.
Background
At present, surgical navigation systems are widely applied in neurosurgery, orthopedics, otorhinolaryngology and other fields; they greatly shorten operation time, relieve patient suffering, and improve surgical precision. Among them, optical positioning systems are widely used because of advantages such as high positioning accuracy and high flexibility. An optical positioning system uses a camera to track actively luminous near-infrared LEDs or passive reflective spheres on an instrument, so that the relative position of the instrument tip point and the lesion is displayed in real time. In actual clinical procedures, most current work requires the system to identify and track two or more instruments simultaneously.
Currently, the state of the art in optical instrument positioning is represented by Northern Digital Inc. (NDI) of Canada: the Hybrid Polaris Spectra and Passive Polaris Spectra can track up to 15 wireless instruments (up to 6 of them active wireless instruments), the Polaris Vicra can track up to 6 instruments (up to 1 active wireless instrument), and the Polaris Vega can track up to 25 instruments (up to 6 active wireless instruments).
Unlike the tracking of passive and active wireless instruments, NDI tracks active wired instruments with a controller that can illuminate the markers individually, which is expensive and difficult to popularize in optical positioning systems. In patent publication CN1199054C, NDI proposes a method of tracking multiple objects that requires each object to have a unique segment length among its marker pairs.
In patent publication CN101750607A of Tsinghua University, the distances between marker points on the same instrument differ, and the distances between marker points on different instruments also differ significantly. However, this becomes restrictive when a marker pair on one instrument has the same separation as a pair on another instrument, i.e. when different instruments are similar in shape.
Some prior art proposes Kalman filtering to track multiple objects, which can overcome the delay problem to some extent, but when an object moves too fast the target cannot be tracked well.
Disclosure of Invention
The purpose of the invention is achieved by the following technical solution.
The invention achieves real-time positioning of multiple instruments by combining the distinct geometric shapes formed by the optical markers on each instrument with a motion-vector tracking method. It proposes identifying differently shaped instruments by side-length ratio and, if instrument shapes are similar, distinguishing them by their actual perimeters; tracking multiple moving instruments with a motion-vector tracking method; and monitoring changes in the number of instruments through the increase or decrease of markers between consecutive frames.
Specifically, according to one aspect of the present invention, there is provided a multi-instrument optical positioning method, comprising:
a multi-instrument judging step, which determines whether an instrument has entered or exited from the increase or decrease of markers between consecutive frames;
a multi-instrument identification step, which identifies each instrument from the geometric shape formed by the markers on the instrument, and identifies any newly entered instrument;
a multi-instrument tracking step of tracking the motion of a plurality of said instruments using a motion vector tracking algorithm.
Preferably, the markers are actively and/or passively luminescent.
Preferably, the geometric shape comprises a side length ratio and/or a perimeter between the markers.
Preferably, the multi-instrument recognition step specifically comprises: first identifying differently shaped instruments by the side-length ratio between the markers, and if the instruments are similar in shape, distinguishing them by the perimeters between the markers.
Preferably, the multi-instrument recognition step specifically includes:
acquiring left and right views of one or more instruments newly entering the system; identifying each instrument by the distinct side-length ratios and/or perimeters of its marker pairs to obtain the center pixel coordinates of each instrument's markers in the left and right views; and calculating the spatial position and orientation of each instrument's tip point from the identified pixel coordinates.
preferably, the central pixel coordinate of the marker is determined based on a region growing method and a gray scale centroid method.
Preferably, the multi-instrument tracking step specifically includes:
in the t-th frame (t > 1), the center pixel coordinates of each marker are calculated, and a motion-vector tracking method is used to find, for each marker of an instrument identified in frame t-1, the frame-t marker pixel coordinates with the smallest motion change; from these the spatial position and orientation of each instrument's tip point in frame t are calculated.
According to another aspect of the present invention, there is also provided a multi-instrument optical positioning system, comprising:
a multi-instrument judging module, which determines whether an instrument has entered or exited from the increase or decrease of markers between consecutive frames;
a multi-instrument recognition module, which identifies each instrument from the geometric shape formed by the markers on the instrument, and identifies any newly entered instrument;
a multi-instrument tracking module, which tracks the motion of a plurality of said instruments using a motion-vector tracking algorithm.
The advantages of the invention are as follows: by tracking instruments with a motion-vector method, the invention can track multiple instruments in real time, reduces frame loss and cases where an individual instrument goes untracked, and addresses the real-time requirements of multi-instrument tracking. It proposes identifying instruments by side-length ratios and/or perimeters, relaxing the restrictions on instrument shape, so that even similarly shaped instruments can be distinguished. It detects increases and decreases in the number of instruments from the change in marker count between consecutive frames, and still correctly determines the position and orientation of every instrument when that number changes.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
FIG. 1 illustrates a hardware infrastructure architecture diagram according to an embodiment of the invention;
FIG. 2 illustrates a schematic block diagram of a multi-instrument tracking optical localization method according to an embodiment of the present invention;
FIG. 3 shows a schematic representation of the construction of two differently sized instruments used in the present invention;
FIG. 4 illustrates the multi-instrument recognition algorithm process of the present invention;
FIG. 5 is a flow chart of a method for real-time tracking of an instrument using a time-series motion vector tracking method according to the present invention;
FIG. 6 is a schematic diagram illustrating the operation of a multi-instrument optical positioning system of the present invention;
FIG. 7 is a schematic diagram showing the trajectories of the MSI and TSI tracked simultaneously.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
FIG. 1 illustrates the hardware infrastructure according to an embodiment of the invention. The invention provides a multi-instrument optical positioning system comprising a plurality of instruments 1 (four are shown in FIG. 1, but the number is not limited to four; fewer or more are possible), a binocular camera 3, a host 4 and a display 5. Each instrument 1 carries three markers 2, which may be actively or passively luminous, e.g. actively luminous near-infrared LEDs or passive reflective spheres. The binocular camera 3 captures images of the instruments 1 using the binocular vision principle and tracks the actively luminous near-infrared LEDs or passive reflective spheres on each instrument; the host 4 receives and processes the data, which is shown on the display 5, so that the relative position of each instrument tip point and the lesion is displayed in real time.
As shown in FIG. 2, the technical solution of the invention has three parts: judging changes in the number of instruments from the increase or decrease of markers between consecutive frames; identifying the multiple instruments from their different shapes and the actual perimeters of their markers; and tracking the motion of the multiple instruments with a motion-vector tracking algorithm. First, the system is initialized: the marker coordinate tables of all instruments are cleared and the marker count is set to 0. When an increase in the number of markers is detected in frame 1, a new instrument has entered; the instruments in frame 1 are identified by the distinct side-length ratios and/or perimeters of their marker pairs, yielding the center pixel coordinates of each instrument's markers in the left and right images, from which the spatial position and orientation of each instrument's tip point are calculated. In frame t (t > 1), the center pixel coordinates of each marker are computed first, and a motion-vector tracking method finds in frame t the marker pixel coordinates with the smallest motion change relative to each marker of an instrument identified in frame t-1; from these the spatial position and orientation of each tip point in frame t are calculated. In FIG. 2, the points connected by solid lines in frame t are the markers obtained in frame t, the points connected by solid lines in frame t-1 are the markers obtained in frame t-1, and $\vec{I}$ denotes a motion vector.
The method principle and the process for implementing the present invention are described in detail below.
Example 1
Step S1: judging the increase or decrease in the number of instruments from the increase or decrease in the number of markers
In practical applications, a new instrument is often added, or an instrument removed, while tracking is in progress. To prevent a change in the number of instruments from disturbing the tracking of the others, the invention judges whether an instrument has entered or exited by detecting the increase or decrease of markers between consecutive frames. First, the system is initialized: the marker coordinate tables of all instruments are cleared and the marker count is set to 0.
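The judging step above reduces to comparing marker counts between consecutive frames. A minimal sketch in Python, assuming (as in the embodiments described below) that every instrument carries exactly three markers; the function name and the occlusion handling are illustrative, not from the patent:

```python
def instrument_count_change(prev_markers, curr_markers, markers_per_instrument=3):
    """Infer instrument entries/exits from the change in detected marker count.

    Returns +n when n instruments appear to have entered, -n when n appear to
    have exited, and 0 when the count is unchanged or the change is not a
    whole multiple of markers_per_instrument (e.g. partial occlusion).
    """
    diff = curr_markers - prev_markers
    if diff % markers_per_instrument != 0:
        return 0  # not a whole instrument; treat as noise/occlusion
    return diff // markers_per_instrument
```

For example, going from 3 to 6 detected markers reports one new instrument, and from 6 to 3 reports one departure.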
Step S2: marker geometry based instrument recognition
For convenient tracking and identification, markers can be placed at different positions on each instrument to form different geometric shapes, so that multiple instruments can be distinguished. Under the binocular vision principle, the distances between markers are fixed once the instrument is calibrated; however, when the binocular camera captures the infrared markers of several instruments through its filters, the instruments' poses differ, so the instruments cannot be distinguished simply by tracking inter-marker distances in real time. The invention therefore has each instrument directly face the camera the first time it appears and, based on the fixed geometric shape of its markers, distinguishes different instruments by side-length ratio and/or perimeter.
Taking the tracking of two instruments as an example, as shown in FIG. 3, the small instrument on the left is denoted MSI and the large instrument on the right TSI. Each instrument carries markers that are actively luminous near-infrared LEDs. Line segments can be drawn between the luminous markers, and the two instruments clearly have different geometries: the three markers of the small instrument form approximately a right triangle, and those of the large instrument approximately an isosceles triangle.
If there are L instruments, at least three markers $\{A, B, C\}$ are placed on each instrument, denoted $\{A_1, B_1, C_1\}, \dots, \{A_l, B_l, C_l\}, \dots, \{A_L, B_L, C_L\}$, where $l = 1, 2, \dots, L$ is the index of an instrument involved in optical localization and $L$ is the total number of such instruments.
Denote the Euclidean distances between $A_l$ and $B_l$ and between $A_l$ and $C_l$ as $d_{A_lB_l}$ and $d_{A_lC_l}$, respectively. The ratio $\delta_l = d_{A_lB_l} / d_{A_lC_l}$ differs from instrument to instrument, which enables the registration and identification of multiple instruments. $\delta_l$ is expressed as:

$$\delta_l = \frac{d_{A_lB_l}}{d_{A_lC_l}} = \frac{\sqrt{(x_{A_l}-x_{B_l})^2+(y_{A_l}-y_{B_l})^2}}{\sqrt{(x_{A_l}-x_{C_l})^2+(y_{A_l}-y_{C_l})^2}} \tag{1}$$

where $(x_{A_l}, y_{A_l})$, $(x_{B_l}, y_{B_l})$ and $(x_{C_l}, y_{C_l})$ are the center pixel coordinates of the three markers $A_l$, $B_l$, $C_l$ on the $l$-th instrument. The center pixel coordinates of a marker are computed by a region-growing method combined with a gray-scale centroid method.
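The region-growing plus gray-scale-centroid computation mentioned above can be sketched as follows. This is an illustrative Python implementation under assumed conventions (4-connected growth from a seed pixel inside the blob, a fixed intensity threshold); the patent does not specify these details:

```python
from collections import deque

def marker_centroid(image, seed, threshold=128):
    """Gray-scale centroid of the bright blob containing `seed`.

    `image` is a 2D list of intensities (rows of equal length). Region growing
    with 4-connectivity collects every pixel >= `threshold` reachable from
    `seed`; the centroid is the intensity-weighted mean of the collected
    pixel coordinates.
    """
    h, w = len(image), len(image[0])
    visited = set()
    queue = deque([seed])
    sum_i = sum_x = sum_y = 0.0
    while queue:
        x, y = queue.popleft()
        if (x, y) in visited or not (0 <= x < w and 0 <= y < h):
            continue
        visited.add((x, y))
        if image[y][x] < threshold:
            continue  # outside the blob: visited but not grown from
        i = image[y][x]
        sum_i += i
        sum_x += i * x
        sum_y += i * y
        queue.extend([(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)])
    return (sum_x / sum_i, sum_y / sum_i)
```

A uniform 2x2 bright block at pixels (1,1)-(2,2), for instance, yields centroid (1.5, 1.5).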
FIG. 4 shows the multi-instrument recognition algorithm of the invention. First, the instruments are placed directly facing the camera, the left and right views of frame 1 are acquired, and the center pixel coordinates of the markers are computed in each view. The leftmost point $A_l$ ($l \in [1, L]$) in the left view is found, and the Euclidean distance from $A_l$ to each remaining point $P_i$ is computed (the distance from a point to itself is not computed), denoted $d_{A_lP_i}$, as in formula (2):

$$d_{A_lP_i} = \sqrt{(x_{A_l}-x_{P_i})^2+(y_{A_l}-y_{P_i})^2}, \quad P_i \neq A_l \tag{2}$$

The pairwise ratios of the distances $d_{A_lP_i}$ are then computed and denoted $\eta$. When $\eta \in [\delta_l-\varepsilon, \delta_l+\varepsilon]$, the three markers involved belong to the $l$-th instrument, where $[\delta_l-\varepsilon, \delta_l+\varepsilon]$ is the ratio range set for the $l$-th instrument, $\delta_l$ is the value of the ratio when the $l$-th instrument directly faces the camera, and $\varepsilon$ is an empirical error that absorbs the deviation caused by the camera not being faced exactly when the instruments are placed by hand. The markers in the right view undergo the same operation as in the left view, which completes the registration or pairing of each instrument; the tip point position of each instrument is then calculated using the stereo matching principle.
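The ratio test against the window $[\delta_l-\varepsilon, \delta_l+\varepsilon]$ can be sketched as below. This is a simplified Python illustration for one calibrated instrument at a time; the brute-force search over triples, the tie handling, and the function name are assumptions, not the patent's exact procedure:

```python
import math
from itertools import combinations

def identify_instrument(points, delta, eps=0.05):
    """Find a marker triple whose side-length ratio matches a calibrated value.

    `points` are (x, y) marker centers from one view; `delta` is the calibrated
    ratio d(A,B)/d(A,C) measured with the instrument facing the camera, and
    `eps` absorbs small pose deviations (the text reports eps = 0.05 works
    well). Returns a matching triple or None. Real use would remove matched
    markers and repeat for each calibrated instrument.
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    for triple in combinations(points, 3):
        # Take the leftmost marker of the triple as the reference point A.
        a, b, c = sorted(triple)  # lexicographic: leftmost x first
        d_ab, d_ac = dist(a, b), dist(a, c)
        for ratio in (d_ab / d_ac, d_ac / d_ab):
            if delta - eps <= ratio <= delta + eps:
                return triple
    return None
```

For a 3-4-5 right triangle of markers, the calibrated ratio 3/4 = 0.75 picks out the correct triple even with a distant distractor point present.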
Step S3: motion vector tracking
In practice, the pose of an instrument often has to be adjusted in real time according to the position being tracked, and it is difficult to keep the instrument facing the camera exactly. When the instrument rotates, the shape of the markers captured by the camera changes, and when the rotation angle is too large the ratio falls outside the set side-length-ratio range. To achieve more accurate positioning, the invention proposes tracking the instruments in real time with a motion-vector tracking method over the time series, as shown in FIG. 5.
To calculate the position of an instrument tip in the current frame t (t >= 2), the marker positions in the left and right views of frame t are tracked first, and the center pixel coordinates of the markers in the left and right views of frames t-1 and t are computed. Since the relative motion between adjacent frames is small, i.e. the motion vectors between adjacent frames are small, the magnitude of the motion vector between marker center points in different frames can be defined as:

$$I_{V,W}(t) = \left\lVert \vec{r}_W(t) - \vec{r}_V(t-1) \right\rVert = \sqrt{(x_W(t)-x_V(t-1))^2+(y_W(t)-y_V(t-1))^2} \tag{3}$$

In formula (3), $I_{V,W}(t)$ is the magnitude of the motion vector $\overrightarrow{VW}$ from point $V$ of frame t-1 to point $W$ of frame t, where $\vec{r}_V(t-1)$ and $\vec{r}_W(t)$ are the position vectors of point $V$ in frame t-1 and point $W$ in frame t, respectively.
then, according to step S2, the recognition result of each instrument is obtained, if A isl,Bl,Cl(L is more than or equal to 1 and less than or equal to L) are three mark points for identifying the registered first instrument, and the mark objects belong to the first instrument of the t frame according to the minimum motion vector amplitude of each mark object in the left view of the t frame in the following formula by the central pixel coordinate of the left view of the (t-1) th frame of each instrument.
Figure BDA0001719499820000072
In the formula (4)
Figure BDA0001719499820000073
Respectively representing a marker of the left view of the t-th frame and a marker A of the left view of the (t-1) th framel、Bl、ClThe marker corresponding to the minimum value of the motion vector magnitude of (a) belongs to the i-th instrument at the t-th frame. At the same time, the right view does the same algorithm tracking.
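The nearest-motion assignment of formula (4) can be sketched as follows. A minimal Python illustration assuming independent per-marker minimization (no mutual-exclusion constraint between markers), with hypothetical names:

```python
import math

def match_markers(prev_markers, curr_points):
    """Assign to each identified marker of frame t-1 the frame-t point with
    the smallest motion-vector magnitude.

    `prev_markers` maps marker names (e.g. 'A', 'B', 'C') to (x, y) centers at
    frame t-1; `curr_points` is the list of centers detected at frame t.
    Sketch only: with large motions, two markers could in principle claim
    the same frame-t point.
    """
    def magnitude(v, w):
        # |r_W(t) - r_V(t-1)|, the motion-vector magnitude between frames
        return math.hypot(w[0] - v[0], w[1] - v[1])

    return {
        name: min(curr_points, key=lambda w: magnitude(p, w))
        for name, p in prev_markers.items()
    }
```

Because adjacent-frame motion is small, each marker is matched to the nearby detection rather than to a distant one.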
The method thus registers the multi-instrument markers across consecutive frames and between the left and right views on the basis of motion vectors, calculates the tip-point coordinates of each instrument using the binocular vision principle, and realizes real-time tracking of each instrument.
According to another aspect of the invention, a multi-instrument tracking optical localization system is also provided, comprising: a multi-instrument judging module, which determines whether an instrument has entered or exited from the increase or decrease of markers between consecutive frames; a multi-instrument recognition module, which identifies each instrument from the geometric shape of the markers on the instrument and identifies any newly entered instrument; and a multi-instrument tracking module, which tracks the motion of the multiple instruments with a motion-vector tracking algorithm. FIG. 6 illustrates the operation of the multi-instrument tracking optical positioning system. First, the system is initialized: the marker coordinate tables of all instruments are cleared and the marker count is set to 0. The camera captures left and right images of all markers to be tracked, and the center pixel coordinates of each marker are calculated. The multi-instrument judging module decides whether an instrument has entered or exited; if so, the multi-instrument recognition module identifies the new instrument; if not, each instrument is tracked by motion vectors to obtain the center pixel coordinates of each instrument's markers in the left and right images. Finally, stereo matching and three-dimensional reconstruction are performed for each instrument, and the spatial position coordinates and orientation of each instrument's tip point are calculated.
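The per-frame routing in FIG. 6 — recognize when the marker count changes, otherwise track — can be sketched as a small dispatcher. The structure and names are assumptions for illustration only:

```python
def frame_mode(prev_marker_count, curr_marker_count):
    """Choose the module to run for the current frame, following the Fig. 6
    flow: a changed marker count routes to the multi-instrument recognition
    module, an unchanged count to the motion-vector tracking module.
    """
    if curr_marker_count != prev_marker_count:
        return "identify"  # an instrument entered or exited
    return "track"         # same instruments: track by motion vectors
```

In a full system this decision would be followed by stereo matching and tip-point reconstruction regardless of the branch taken.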
In order to verify the technical effect of the invention, a large number of experiments were also performed, and the following are experimental results.
Results of the experiment
1. Marker matching experiment
In the experiments, multi-instrument multi-marker images were acquired with a Bumblebee2 camera from Point Grey at a resolution of 640 x 480. Near-infrared LEDs with a peak wavelength of 850 nm served as markers, and two near-infrared filters (passband 850-1000 nm) were mounted in front of the camera to filter out interference from natural light. The MSI and TSI were used as the instruments.
A large number of experiments showed that setting the empirical error ε to 0.05 handles well the deviation caused by manually placing the instruments facing the camera.
100 images were acquired with the MSI and then the TSI directly facing the camera, and the ratio $\delta$ was calculated for each image. Table 1 shows the first 10 ratios and the 100-image averages for the MSI and TSI.

TABLE 1. $\delta$ values of the MSI and TSI

[Table 1 is reproduced as an image in the original publication.]
From the data in Table 1, the $\delta$ range of the MSI is set to $[1.332, 1.432]$ and the $\delta$ range of the TSI to $[1.212, 1.312]$. According to step S2, if ratios $\eta_1 \in [1.332, 1.432]$ and $\eta_2 \in [1.212, 1.312]$ are found, the registered identification of the MSI and TSI is complete.
To verify the real-time validity of the algorithm, the two instruments were placed approximately 620 mm from the camera and then moved simultaneously, the MSI circling away from the camera and the TSI circling toward it. The algorithm displays the positions and motion trajectories of the two instrument tip points in real time. FIG. 7 is a schematic diagram of the trajectory tracking after the two instruments moved; the solid line is the trajectory of the MSI tip point and the dashed line that of the TSI tip point.
2. Positioning accuracy experiment
To measure the tracking accuracy of the algorithm, an experiment was designed to measure the distance accuracy of the two instruments: the large and small instruments were fixed on a grating ruler and positioned and tracked. The grating ruler has a resolution of 0.005 mm; the distance moved by its slider is measured and displayed digitally and taken as the ground truth, and the distance calculated by the algorithm is compared against it to obtain the distance accuracy of the algorithm.
First, the grating-ruler display is zeroed and recorded as the zero point; 100 images are acquired and the average tip-point coordinates of the instrument are computed. The slider is then moved to some position, and the distance shown by the grating ruler is recorded as the end point. At the end point, 100 images are again acquired and the average tip-point coordinates computed. Finally, the distance between the zero point and the end point is calculated and compared with the true value.
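The zero-point/end-point comparison above reduces to averaging the tip coordinates at each position and differencing. A sketch in Python; the function name and tuple conventions are illustrative:

```python
import math

def distance_error(tips_zero, tips_end, true_distance):
    """Distance between the mean tip positions at the zero point and the end
    point, compared with the grating-ruler reading (the ground truth).

    `tips_zero` and `tips_end` are lists of 3-D tip coordinates from the ~100
    images captured at each position; returns (measured, absolute_error).
    """
    def mean(points):
        n = len(points)
        return tuple(sum(p[i] for p in points) / n for i in range(3))

    z, e = mean(tips_zero), mean(tips_end)
    measured = math.dist(z, e)  # Euclidean distance between mean positions
    return measured, abs(measured - true_distance)
```

Averaging over many images before differencing suppresses per-frame localization noise in the reported distance.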
The simulated instrument measurements are shown in Tables 2 and 3; Table 2 gives the MSI distance measurements and Table 3 the TSI measurements.

TABLE 2. MSI measurement accuracy

[Table 2 is reproduced as an image in the original publication.]

TABLE 3. TSI measurement accuracy

[Table 3 is reproduced as an image in the original publication.]
As can be seen from Tables 2 and 3, the mean absolute errors between the distances measured by the grating ruler and those calculated by the algorithm are 0.065 mm for the MSI and 0.031 mm for the TSI, and the corresponding average RMSEs are 0.041 mm and 0.102 mm, so the algorithm has high accuracy. Meanwhile, to show that the distance between the two instruments remains essentially unchanged, the Euclidean distance between the tip points of the two instruments was calculated by the algorithm for verification, as shown in Table 4.
TABLE 4 Euclidean distance between MSI and TSI instruments
[Table 4 is reproduced as an image in the original publication.]
The average standard deviation of the Euclidean distance in Table 4 is 0.053 mm, so the distance between the two instrument tips is stable.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (8)

1. A multi-instrument optical positioning method, comprising:
a multi-instrument judging step of determining whether an instrument has entered or exited from the increase or decrease of markers between consecutive frames;
a multi-instrument recognition step of recognizing each instrument based on the geometric shape formed by the markers on the instrument; when a new instrument enters, identifying the new instrument; the geometric shape comprises the side-length ratio and/or the perimeter between the markers;
a multi-instrument tracking step of tracking the motion of a plurality of the instruments using a motion-vector tracking algorithm; in frame t, t > 1, the center pixel coordinates of each marker are calculated, and a motion-vector tracking method is used to find the frame-t marker pixel coordinates with the smallest motion change relative to each marker of an instrument identified in frame t-1, from which the spatial position and orientation of each instrument's tip point in frame t are calculated.
2. The method of claim 1,
the markers are actively and/or passively luminescent.
3. The method of claim 1,
the multi-instrument identification step specifically comprises: first identifying differently shaped instruments by the side-length ratio between the markers, and if the instruments are similar in shape, distinguishing them by the perimeters between the markers.
4. The method of claim 3,
the multi-instrument identification step specifically comprises the following steps:
acquiring left and right views of one or more entered instruments; identifying each instrument by the distinct side-length ratios and/or perimeters of its marker pairs to obtain the center pixel coordinates of each instrument's markers in the left and right views; and calculating the spatial position and orientation of each instrument's tip point from the identified pixel coordinates.
5. The method of claim 4,
wherein the central pixel coordinates of the markers are obtained using a region-growing method and a gray-scale centroid method.
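One way to implement claim 5: 4-connected region growing over bright pixels, followed by an intensity-weighted centroid. The brightness threshold is an assumption; the claim names only the two techniques:

```python
from collections import deque

import numpy as np

def region_grow(img, seed, thresh=128):
    """Collect the 4-connected region of pixels >= thresh containing seed."""
    h, w = img.shape
    region, queue, seen = [], deque([seed]), {seed}
    while queue:
        y, x = queue.popleft()
        if img[y, x] >= thresh:
            region.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in seen:
                    seen.add((ny, nx))
                    queue.append((ny, nx))
    return region

def gray_centroid(img, region):
    """Gray-scale (intensity-weighted) centroid of a region, as (cx, cy)."""
    weights = np.array([float(img[y, x]) for y, x in region])
    ys = np.array([y for y, _ in region], dtype=float)
    xs = np.array([x for _, x in region], dtype=float)
    return (xs * weights).sum() / weights.sum(), (ys * weights).sum() / weights.sum()
```

The gray-scale centroid gives sub-pixel marker centres, which is why it is preferred over a plain geometric centroid of the segmented blob.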
6. The method of claim 4,
the multi-instrument identification step specifically comprises the following steps:
(1) acquiring left and right views of one or more instruments that have entered the system, and computing the central pixel coordinates of the markers in the left and right views respectively;
(2) finding the leftmost point in the left view, computing the Euclidean distances from this point to every other point in the view, computing the ratio between each pair of these distances, and comparing each ratio against the calibrated side-length ratio of an instrument; if the difference is within a given tolerance, the markers associated with that ratio are taken to belong to the calibrated instrument;
(3) deleting the identified markers and checking whether any markers remain; if so, repeating step (2); if not, ending the identification of markers in the left view; the same operation is performed on the markers in the right view, thereby registering or pairing each instrument, after which the tip point position of each instrument is computed using the stereo matching principle.
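Steps (2) and (3) above can be sketched as an iterative loop over one view. The three-marker instrument layout, the single calibrated ratio, and the tolerance are assumptions for illustration:

```python
import math

def identify_view(markers, cal_ratio, tol=0.05):
    """Claim-6 style loop: take the leftmost marker, compute its Euclidean
    distances to all other markers, and if the ratio of two distances is
    within tol of the calibrated side-length ratio, assign those markers
    to that instrument; delete them and repeat until no markers remain.
    cal_ratio is an assumed calibrated ratio for a 3-marker instrument."""
    remaining = sorted(markers)  # x ascending: leftmost point first
    instruments = []
    while len(remaining) >= 3:
        p0, rest = remaining[0], remaining[1:]
        dists = [(math.dist(p0, q), q) for q in rest]
        found = None
        for i in range(len(dists)):
            for j in range(len(dists)):
                if i == j:
                    continue
                if abs(dists[i][0] / dists[j][0] - cal_ratio) < tol:
                    found = [p0, dists[i][1], dists[j][1]]
                    break
            if found:
                break
        if found is None:
            remaining.pop(0)  # no calibrated instrument matched this point
            continue
        instruments.append(found)
        remaining = [m for m in remaining if m not in found]
    return instruments
```

Running the same loop on the right view and pairing the per-instrument marker sets gives the correspondences needed for the stereo matching of step (3).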
7. The method of claim 1,
the multi-instrument tracking step specifically comprises the following steps:
(1) computing the central pixel coordinates of the markers in the left and right views of frames t-1 and t, the instruments to which the markers belong having been identified in the left and right views of frame t-1;
(2) selecting a marker on an instrument identified in the left view of frame t-1 and computing the magnitude of the motion vector between its central pixel coordinates and the coordinates of each marker in frame t; the frame-t marker giving the smallest magnitude corresponds to the frame t-1 marker, the two being images of the same marker at different times;
(3) repeating step (2) for each marker on each instrument until no markers remain;
(4) performing the same tracking on the right view as on the left view.
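The per-view tracking of claim 7 reduces to matching each labelled frame t-1 marker to the frame-t marker with the smallest motion vector magnitude. A greedy one-to-one sketch (the labels and coordinates are illustrative):

```python
import math

def track_markers(prev_markers, curr_markers):
    """Match each identified marker from frame t-1 (dict label -> (x, y))
    to the frame-t marker (list of (x, y)) with the smallest motion
    vector magnitude, consuming each frame-t marker at most once."""
    matches, unused = {}, list(curr_markers)
    for label, p in prev_markers.items():
        best = min(unused, key=lambda q: math.dist(p, q))
        matches[label] = best
        unused.remove(best)  # enforce one-to-one assignment
    return matches
```

Because the assignment is greedy, a production system would typically add a maximum-displacement gate so that an occluded marker is flagged as lost rather than matched to a distant false candidate.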
8. A multi-instrument optical positioning system, comprising:
a multi-instrument judging module that judges whether an instrument has entered or exited the field of view from the increase or decrease in the number of markers between two consecutive frames;
a multi-instrument recognition module that recognizes each instrument based on the geometry of the markers on the instrument, and identifies any newly entered instrument; the geometry comprises the side-length ratios and/or the perimeter formed by the markers;
a multi-instrument tracking module that tracks the motion of a plurality of the instruments with a motion vector tracking algorithm; in the t-th frame (t > 1), the central pixel coordinates of each marker are computed, and for each marker of an instrument identified in frame t-1, the marker in frame t with the smallest motion change is found by motion vector tracking, so that the spatial position and orientation of the tip point of each instrument in frame t are computed.
CN201810724839.8A 2018-07-04 2018-07-04 Multi-instrument optical positioning method and system Active CN109091228B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810724839.8A CN109091228B (en) 2018-07-04 2018-07-04 Multi-instrument optical positioning method and system


Publications (2)

Publication Number Publication Date
CN109091228A CN109091228A (en) 2018-12-28
CN109091228B true CN109091228B (en) 2020-05-12

Family

ID=64845703

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810724839.8A Active CN109091228B (en) 2018-07-04 2018-07-04 Multi-instrument optical positioning method and system

Country Status (1)

Country Link
CN (1) CN109091228B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111461049B (en) * 2020-04-13 2023-08-22 武汉联影智融医疗科技有限公司 Space registration identification method, device, equipment and computer readable storage medium
CN116650119B (en) * 2023-07-24 2024-03-01 北京维卓致远医疗科技发展有限责任公司 Calibration reference frame for adjustable operation reference frame

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102009024339B3 (en) * 2009-06-09 2010-09-16 Atlas Elektronik Gmbh Bearing method and direction finding system for detecting and tracking temporally successive bearing angles
CN101980229A (en) * 2010-10-12 2011-02-23 武汉大学 Single-camera and mirror reflection-based space tracking and positioning method
CN108053491A (en) * 2017-12-12 2018-05-18 重庆邮电大学 The method that the three-dimensional tracking of planar target and augmented reality are realized under the conditions of dynamic visual angle

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7346381B2 (en) * 2002-11-01 2008-03-18 Ge Medical Systems Global Technology Company Llc Method and apparatus for medical intervention procedure planning
US7778686B2 (en) * 2002-06-04 2010-08-17 General Electric Company Method and apparatus for medical intervention procedure planning and location and navigation of an intervention tool
JP5268433B2 (en) * 2008-06-02 2013-08-21 キヤノン株式会社 IMAGING DEVICE AND IMAGING DEVICE CONTROL METHOD
CN101750607B (en) * 2008-07-25 2012-11-14 清华大学 Instrument identifying method for passive optical position fixing navigation system
CN101327148A (en) * 2008-07-25 2008-12-24 清华大学 Instrument recognizing method for passive optical operation navigation
US8657809B2 (en) * 2010-09-29 2014-02-25 Stryker Leibinger Gmbh & Co., Kg Surgical navigation system
US10228428B2 (en) * 2012-03-22 2019-03-12 Stylaero Ab Method and device for pose tracking using vector magnetometers
JP7029932B2 (en) * 2016-11-04 2022-03-04 グローバス メディカル インコーポレイティッド Systems and methods for measuring the depth of instruments


Also Published As

Publication number Publication date
CN109091228A (en) 2018-12-28

Similar Documents

Publication Publication Date Title
CN103575227B (en) A kind of vision extensometer implementation method based on digital speckle
US8811660B2 (en) Object-tracking systems and methods
JP2019537077A (en) Simultaneous positioning map creation navigation method, apparatus and system using indicators
Zhou et al. Optical surgical instrument tracking system based on the principle of stereo vision
Krigslund et al. A novel technology for motion capture using passive UHF RFID tags
CN109091228B (en) Multi-instrument optical positioning method and system
CN110223355B (en) Feature mark point matching method based on dual epipolar constraint
CN109000558A (en) A kind of big visual field non-contact three-dimensional point coordinate measurement method and apparatus
JP2019535467A (en) Medical imaging jig and method of use thereof
JP2014211404A (en) Motion capture method
CN109035345A (en) The TOF camera range correction method returned based on Gaussian process
CN107595388A (en) A kind of near infrared binocular visual stereoscopic matching process based on witch ball mark point
CN112184653B (en) Binocular endoscope-based focus three-dimensional size measuring and displaying method
CN106236264A (en) The gastrointestinal procedures air navigation aid of optically-based tracking and images match and system
EP4067817A1 (en) System and method for spatial positioning of magnetometers
Su et al. Hybrid marker-based object tracking using Kinect v2
CN103673912A (en) Image correcting system for deformation measurement of speckle correlation methods
Tjaden et al. High-speed and robust monocular tracking
CN111156917B (en) Deformation measurement method based on gray level mark points
CN109410277B (en) Virtual mark point filtering method and system
CN112638251A (en) Method for measuring position
CN110887470B (en) Orientation pose measurement method based on microlens array two-dimensional optical coding identification
CN113066126A (en) Positioning method for puncture needle point
García et al. Calibration of a surgical microscope with automated zoom lenses using an active optical tracker
CN111862170A (en) Optical motion capture system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant