CN212326346U - Endoscope imaging system - Google Patents


Publication number
CN212326346U
Authority: CN (China)
Prior art keywords: light, image, light source, fluorescence, sensor
Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202021451250.4U
Other languages: Chinese (zh)
Inventor: 陆汇海
Current Assignee: Shenzhen Bosheng Medical Technology Co., Ltd. (listed assignees may be inaccurate; Google has not performed a legal analysis)
Original Assignee: Shenzhen Bosheng Medical Technology Co., Ltd.
Application filed by Shenzhen Bosheng Medical Technology Co., Ltd.
Priority to CN202021451250.4U
Application granted; publication of CN212326346U

Landscapes

  • Endoscopes (AREA)

Abstract

The utility model discloses an endoscope imaging system comprising a handle, a scope tube, a light source assembly, a first front-end assembly, a second front-end assembly, and a control device. Two sensors at the front end of the scope tube collect the visible light reflected by the human subject and the excited fluorescence, generate the corresponding image signals, and transmit them to the rear end for processing into the final endoscope image. Compared with the prior art, in which visible light and fluorescence are relayed through an optical path to the rear end for collection and processing, this design eliminates the accuracy loss introduced by optical relay, improves imaging accuracy, and saves the cost and space of the relay optics. In addition, because the visible-light and fluorescence paths are independent, each can be focused and collected separately, ensuring consistent sharpness between the visible-light image and the fluorescence image. The endoscope imaging system is a binocular stereoscopic-vision system and can obtain three-dimensional information to compensate the fluorescence brightness of regions at different depths.

Description

Endoscope imaging system
Technical Field
The utility model relates to the technical field of medical equipment, and in particular to an endoscope imaging system.
Background
In minimally invasive surgery, an excitation-light endoscope (fluorescence endoscope) can accurately locate cancer so that cancerous tissue can be resected more precisely. The main principle is as follows: after a fluorescent agent is sprayed onto or injected into the target site of a living body, excitation light from a light source device is irradiated onto the subject and the fluorescence emitted by the cancer is captured, allowing qualitative diagnosis of the presence of cancer, its degree of malignancy, and so on.
The prior art is realized mainly by single-camera time-sharing imaging or dual-camera light-splitting imaging. The light-splitting technique separates white light, excitation light, and fluorescence at the handle end and uses two or more sensors (CCD or CMOS) to sense the white-light image and the fluorescence monochrome image separately. One implementation pairs a color sensor with a single fluorescence monochrome sensor; another uses three R, G, B monochrome sensors plus an additional fluorescence monochrome sensor.
The prior art requires sophisticated light-splitting and filtering techniques to keep white light out of the fluorescence monochrome image and excitation light out of the white color image. Moreover, whether time-shared or light-split, the visible light, returned fluorescence, and excitation light all share one front-end optical path; because visible light and fluorescence have different wavelengths and therefore different focal lengths, the sharpness of the visible-light image cannot be made consistent with that of the fluorescence monochrome image.
Disclosure of Invention
The utility model provides an endoscope imaging system with simple light path and clear imaging.
In one embodiment, an endoscopic imaging system is provided, comprising:
a handle having a threading channel;
the lens tube is connected with the front end of the handle;
the light source assembly comprises a first light source, a second light source and a light guide beam, wherein the first light source is used for emitting white light, the second light source is used for emitting exciting light, one end of the light guide beam penetrates through the lens tube and extends to the front end in the lens tube, and the other end of the light guide beam is connected with the first light source and the second light source;
the first front-end assembly, arranged at the front end inside the scope tube, comprising a first lens group and a first sensor arranged in order from front to rear, wherein the first lens group collects visible light reflected by the human subject, and the first sensor acquires the visible light passed by the first lens group and generates a first image signal;
the second front-end assembly, arranged side by side with the first front-end assembly at the front end inside the scope tube, comprising a second lens group and a second sensor arranged in order from front to rear, wherein the second lens group collects fluorescence excited in the human subject, and the second sensor acquires the fluorescence passed by the second lens group and generates a second image signal;
the control device, connected with the first light source, the second light source, the first sensor, and the second sensor; the control device controls the first light source to emit white light and the second light source to emit excitation light, collects the first image signal generated by the first sensor and the second image signal generated by the second sensor, and outputs an endoscope image.
Furthermore, the first lens group comprises a fluorescence cut-off filter lens which is arranged at the rear end of the first lens group, the second lens group comprises a visible light cut-off filter lens which is arranged at the rear end of the second lens group.
Furthermore, the first lens group comprises a fluorescence cut-off filter film, and the fluorescence cut-off filter film is attached to the front end or the rear end lens of the first lens group; the second lens group comprises a visible light cut-off filter film, and the visible light cut-off filter film is attached to the front end or the rear end lens of the second lens group.
Further, the light source component further comprises a fusion light path, the fusion light path is provided with two input ends and an output end, the two input ends of the fusion light path are respectively aligned with the first light source and the second light source, the emergent end of the fusion light path is connected with the light guide beam, and the fusion light path is used for fusing white light and exciting light to form mixed light.
Furthermore, the output end of the light guide beam is provided with two branch beams, and the two branch beams are symmetrically arranged at the front end in the mirror tube.
Further, the control device comprises a controller, an image acquisition module and an image processing module, wherein the controller is used for controlling the first light source to emit white light and the second light source to emit exciting light respectively, the image acquisition module is used for acquiring first image information generated by the first sensor and second image information generated by the second sensor, and the image processing module is used for converting the first image signal and the second image signal into an endoscope image.
Furthermore, the control device further comprises a sensor driving module, the input end of the sensor driving module is connected with the controller, the output end of the sensor driving module is respectively connected with the first sensor and the second sensor, and the sensor driving module is used for outputting the sensor setting information obtained by calculation of the controller to the first sensor and the second sensor.
Further, the light source assembly further comprises a light source control module, an input end of the light source control module is connected with the controller, an output end of the light source control module is connected with the first light source and the second light source, and the light source control module is used for receiving a control signal of the controller and controlling light intensity of light emitted by the first light source and the second light source.
Further, the control device also comprises an input device, and the input device is connected with the controller.
Further, the endoscope imaging system also comprises a display, and the display is connected with the image processing module.
According to the endoscope imaging system of the above embodiment, two sensors at the front end of the scope tube respectively collect the visible light reflected by the human subject and the excited fluorescence, generate the corresponding image signals, and transmit them to the rear end for processing into the final endoscope image. Compared with the prior art, in which visible light and fluorescence are relayed through an optical path from the front end of the scope tube to the rear end for collection and processing, this design eliminates the accuracy loss introduced by optical relay, improves imaging accuracy, and saves the cost and space of the relay optics. In addition, because the visible-light and fluorescence paths are independent, each can be focused and collected separately, ensuring consistent sharpness between the visible-light image and the fluorescence image. The endoscope imaging system is a binocular stereoscopic-vision system and can obtain three-dimensional information to compensate the fluorescence brightness of regions at different depths.
Drawings
FIG. 1 is a schematic diagram of an endoscopic imaging system in one embodiment;
FIG. 2 is a flow chart of a method of endoscopic imaging in one embodiment;
FIG. 3 is a flow diagram of image processing for an endoscopic imaging method in one embodiment.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and accompanying drawings. Wherein like elements in different embodiments are numbered with like associated elements. In the following description, numerous details are set forth in order to provide a better understanding of the present application. However, those skilled in the art will readily recognize that some of the features may be omitted or replaced with other elements, materials, methods in different instances. In some instances, certain operations related to the present application have not been shown or described in detail in order to avoid obscuring the core of the present application from excessive description, and it is not necessary for those skilled in the art to describe these operations in detail, so that they may be fully understood from the description in the specification and the general knowledge in the art.
Furthermore, the features, operations, or characteristics described in the specification may be combined in any suitable manner to form various embodiments. Also, the order of steps or actions in the method descriptions may be changed or adjusted, as will be apparent to one of ordinary skill in the art. Thus, the various sequences in the specification and drawings are for the purpose of describing certain embodiments only and are not intended to imply a required sequence unless otherwise indicated where such a sequence must be followed.
The numbering of components, e.g., "first", "second", etc., is used herein only to distinguish the described objects and carries no sequential or technical meaning. The terms "connected" and "coupled", when used in this application and unless otherwise indicated, include both direct and indirect connections (couplings). In this text, the front end is the end close to the human subject, and the rear end is the end far from the human subject.
The first embodiment is as follows:
the embodiment provides an endoscope imaging system which is a binocular endoscope and is mainly used for detecting canceration of a human body.
As shown in FIG. 1, the present endoscopic imaging system generally includes a handle 10, a scope tube 20, a light source assembly 30, a first front end assembly 40, a second front end assembly 50, and a control device 60, and other portions of the endoscopic imaging system are not referred to in this application and will not be described in detail.
The scope tube 20 is a rigid or flexible endoscope tube. The rear end of the scope tube 20 is connected to the front end of the handle 10, and a threading channel communicating with the scope tube 20 is arranged in the handle 10; the handle 10 is held and operated by the doctor, and the front end of the scope tube 20 extends into the human body.
The light source assembly 30 includes a first light source 31, a second light source 32, and a light guide bundle 33. The first light source 31 and the second light source 32 are located behind the handle 10, for example installed in the control device 60 or in a separate device. The two ends of the light guide bundle 33 are its input end and output end, and each end has two branch bundles: the two branch bundles at the input end are connected with the first light source 31 and the second light source 32 respectively, while the two branch bundles at the output end pass through the handle 10 and extend to the front position in the scope tube 20, arranged symmetrically. The first light source 31 emits white light (visible light), the second light source 32 emits excitation light, and the white light and the excitation light are fused into mixed light in the light guide bundle 33.
A fusion light path 34 can also be provided in front of the first light source 31 and the second light source 32. The fusion light path 34 has two input ends and one output end and contains refractive optics that merge the two beams into a single output beam. Its output end is connected with the input end of the light guide bundle 33, and its two input ends are aligned with the exit paths of the first light source 31 and the second light source 32, so that the fusion light path 34 fuses the white light emitted by the first light source 31 and the excitation light emitted by the second light source 32 into mixed light, which is then irradiated onto the human subject. The output end of the light guide bundle 33 has two branch bundles so that the mixed light irradiates the human subject uniformly.
The first and second front end assemblies 40, 50 are mounted side-by-side in the mirror tube 20 at front positions, with the first and second front end assemblies 40, 50 being located centrally within the mirror tube 20 and the output ends of the light guide bundle 33 being located at the edge within the mirror tube 20. The first front end assembly 40 is used to collect visible light reflected by the human subject, and the second front end assembly 50 is used to collect fluorescence excited by the human subject.
The first front end module 40 includes a first lens group 41 and a first sensor 42, and the first lens group 41 and the first sensor 42 are arranged in sequence in a front-to-rear direction and aligned on an optical axis. The first lens group 41 has a fluorescence cut filter 411 at the rear end thereof, the fluorescence cut filter 411 is used for filtering fluorescence in the mixed light, and the fluorescence cut filter 411 irradiates the filtered visible light (white light) onto the first sensor 42. The first sensor 42 is a color sensor, and the first sensor 42 is used for acquiring visible light (white light) and generating a corresponding first image signal.
The second front-end assembly 50 includes a second lens group 51 and a second sensor 52, arranged in order from front to rear and aligned on the optical axis. A visible-light cut filter 511 at the rear end of the second lens group 51 filters out the visible light in the mixed light and lets the remaining fluorescence fall on the second sensor 52. The second sensor 52 is a monochrome sensor that acquires the fluorescence and generates the corresponding second image signal. The first lens group 41 and the second lens group 51 are focusing lens groups focused for the visible-light and fluorescence modes respectively; apart from the filters, their lenses may be identical. In other embodiments, the fluorescence cut filter 411 and the visible-light cut filter 511 are filter films attached to the front or rear lens of the respective lens groups.
In this embodiment, the control device 60 is installed in the apparatus behind the handle 10, the control device 60 is a main body part of the endoscopic imaging system, and the control device 60 has control and processing functions. The control device 60 includes a controller 61, an image capturing module 62 and an image processing module 63, and the controller 61 is connected to the image capturing module 62, the image processing module 63, the first light source 31, the second light source 32, the first sensor 42 and the second sensor 52, respectively. The controller 61 is a control center for controlling the entire endoscopic imaging work.
The light source assembly 30 further includes a light source control module 35; an input end of the light source control module 35 is connected to the controller 61, and its two output ends are connected to the first light source 31 and the second light source 32 respectively. The light source control module 35 is configured to receive a control signal from the controller 61 and control the light intensity of the light emitted by the first light source 31 and the second light source 32.
The control device 60 further comprises a sensor drive module 64, an input end of the sensor drive module 64 is connected with the controller 61, one output end of the sensor drive module 64 is connected with the first sensor 42 in the mirror tube 20 through the handle 10 by a cable 43, and the other output end of the sensor drive module 64 is connected with the second sensor 52 in the mirror tube 20 through the handle 10 by a cable 53. The sensor driving module 64 is configured to output the sensor setting information calculated by the controller 61 to the first sensor 42 and the second sensor 52.
The control device 60 further comprises an input device 65, such as a keyboard or a touch screen. The input device 65 is connected with the controller 61 and is used for inputting operation instructions and parameters to the controller 61; the controller 61 responds to the input instructions to control the operation of the other components.
The input end of the image acquisition module 62 is connected with the first sensor 42 in the lens tube 20 through the handle 10 by the cable 44, and the image acquisition module 62 acquires a first image signal generated by the first sensor 42 through the cable 44. The input end of the image acquisition module 62 is also connected with the second sensor 52 in the lens tube 20 through the handle 10 by a cable 54, and the image acquisition module 62 acquires a second image signal generated by the second sensor 52 through the cable 54. The output end of the image acquisition module 62 is connected to the image processing module 63, the image processing module 63 is configured to obtain a first image signal and a second image signal acquired by the image acquisition module 62, and the image processing module 63 is further configured to convert the first image signal into a white light color image and convert the second image signal into a fluorescent monochrome image, and output the endoscope image by image processing the white light color image and the fluorescent monochrome image.
The image processing module 63 is connected to the display 70, and the image processing module 63 outputs the endoscope image generated by calculation to the display 70 for display.
In the endoscopic imaging system of the present embodiment, since the first front end assembly 40 and the second front end assembly 50 are mounted at the front end of the endoscope 20, the first front end assembly 40 and the second front end assembly 50 collect visible light and fluorescence respectively, and convert the visible light and the fluorescence into corresponding image signals, and transmit the image signals to the image processing module 63 at the rear end of the handle 10 for calculation and processing. Compared with the prior art that visible light and fluorescence are transmitted to the rear end from the front end of the endoscope tube through the light path for collection and processing, the endoscope solves the problem of accuracy reduction caused by light path transmission, improves imaging accuracy, and saves light path cost and light path occupation space. In addition, due to the design of independent light paths of visible light and fluorescence, the visible light and the fluorescence can be respectively focused and collected, so that the definition consistency of a visible light image and a fluorescence image is ensured; the endoscope imaging system is binocular stereoscopic vision, and can obtain three-dimensional information and generate a stereoscopic image.
The second embodiment is as follows:
the present embodiment provides an endoscopic imaging method, which is implemented based on the endoscopic imaging system in the above embodiments.
As shown in fig. 2, the endoscopic imaging method of the present embodiment includes the steps of:
s10, illumination;
the light source control module 35 controls the first light source 31 to emit white light, and controls the second light source 32 to emit excitation light, the white light and the excitation light are fused into mixed light through a light path, and the mixed light is irradiated onto a tissue (subject) in a human body through the light guide beam 33; the tissue (object) in the human body reflects white light (visible light), and the cancerated area in the human body is excited to generate fluorescence after being irradiated by the exciting light;
s20, collecting visible light and fluorescence;
visible light is collected by a first front end assembly 40 located at the front end within the tube 20, and fluorescence is collected by a second front end assembly 50 located at the front end within the tube 20.
The first lens group 41 passes the mixture of visible light reflected by the human subject and fluorescence excited in it to the fluorescence cut filter 411; the filter removes the fluorescence and lets the visible light reach the first sensor 42, which converts it into the first image signal.
Meanwhile, the second lens group 51 passes the same mixture of reflected visible light and excited fluorescence to the visible-light cut filter 511; the filter removes the visible light and lets the fluorescence reach the second sensor 52, which converts it into the second image signal.
S30, collecting image signals;
the first image signal and the second image signal are acquired by the image acquisition module 62, and are transmitted to the image processing module 63.
And S40, image processing.
The image processing module 63 converts the first image signal into a white light color image and the second image signal into a fluorescent monochromatic image after acquiring the first image signal and the second image signal, and outputs the endoscope image by performing image processing on the white light color image and the fluorescent monochromatic image.
As shown in fig. 3, the image processing of the white light color image and the fluorescent monochromatic image specifically includes the following sub-steps:
s41, converting the white light color image into an RGB image;
because the white light color image is a Bayer pattern image, the white balance of the white light color image is adjusted, the white balance is converted into an RGB image after the white balance is adjusted, and then the RGB image is subjected to color correction.
The white light color image adjusts the white balance to R, G, B three color channel ratios so that the R, G, B component values in the white or gray image are equal.
And after the white balance of the white light color image is adjusted, the R, G, B component values on each pixel point in the white light color image are complemented by using a difference algorithm. Namely, G and B components are supplemented on an R pixel point in a Bayer pattern, R and B components are supplemented on a G pixel point, and R and G components are supplemented on a B pixel point. And finally, outputting the RGB three-channel color image.
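The white-balance and interpolation steps above can be sketched as follows. This is a minimal numpy sketch that assumes an RGGB Bayer layout (the patent does not name the layout) and uses a crude 2x2 demosaic as a stand-in for full per-pixel interpolation:

```python
import numpy as np

def white_balance_gains(bayer):
    """Gray-world white balance for an RGGB Bayer mosaic: per-channel
    gains that equalize the R, G, B means (scaled toward the G mean)."""
    r = bayer[0::2, 0::2]
    g = np.concatenate([bayer[0::2, 1::2].ravel(),
                        bayer[1::2, 0::2].ravel()])
    b = bayer[1::2, 1::2]
    return g.mean() / r.mean(), 1.0, g.mean() / b.mean()

def demosaic_2x2(bayer):
    """Crude demosaic: collapse each 2x2 RGGB cell into one RGB pixel,
    a stand-in for the interpolation step described in the text."""
    r = bayer[0::2, 0::2]
    g = (bayer[0::2, 1::2] + bayer[1::2, 0::2]) / 2.0
    b = bayer[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)
```

In a real pipeline the gains would be estimated from a known white or gray patch rather than from the whole frame.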
In the color correction process, the color restoration of the image is corrected by a 3x3 color correction matrix:

    [R' G' B']^T = M * [R G B]^T

where M is the color correction matrix, [R G B]^T is the input, and [R' G' B']^T is the corrected output.
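Applied per pixel, the correction is a single matrix multiplication; a sketch follows, where the matrix values are illustrative placeholders rather than values from the patent (each row sums to 1 so neutral grays are preserved):

```python
import numpy as np

def color_correct(rgb, M):
    """Apply [R' G' B']^T = M [R G B]^T to every pixel of an HxWx3 image."""
    return np.clip(rgb @ M.T, 0.0, 1.0)

# Illustrative correction matrix; each row sums to 1 so grays survive.
M = np.array([[ 1.20, -0.10, -0.10],
              [-0.05,  1.10, -0.05],
              [ 0.00, -0.20,  1.20]])
```

In practice M is calibrated against a color chart for the specific sensor and illumination.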
S42, converting the RGB image into a gray image;
The color-corrected RGB image is converted into a grayscale image; the gray value can be obtained in any of the following ways:
The first method is as follows: taking the R, G, B average value, namely (R + G + B)/3;
The second method comprises the following steps: adopting the coding specified in ITU-R BT.709, wherein Y = 0.2126R + 0.7152G + 0.0722B;
The third method comprises the following steps: taking the L channel of the L*a*b* color space.
S43, normalizing the image;
and carrying out image normalization processing on the gray information of the fluorescent monochromatic image and the gray image.
The fluorescence monochrome image is preprocessed before the normalization step; preprocessing includes denoising, smoothing, region segmentation, morphological operations, and the like. The key step is region segmentation of the fluorescence monochrome image, which extracts the fluorescence regions from the background region.
And in the image normalization processing process, the fluorescence area extracted by segmentation and the gray level image are adopted for image normalization processing.
The image normalization process can be calculated in two ways:
the first method is as follows:
let U { U1, U2, …, Un } be the fluorescence regions segmented on the fluorescence image, and n be the number of fluorescence regions.
Figure BDA0002593715850000081
Mean value of fluorescence image wherein
Figure BDA0002593715850000082
The area average of the ith fluorescence area is shown. Is provided with
Figure BDA0002593715850000083
Is the average of the grayscale images. Is provided with
Figure BDA0002593715850000084
The normalized fluorescence image is Ut ═ { W × U1, W × U2, …, W × Un }.
The second method comprises the following steps:
Similar to the first method, differing only in how the grayscale average is calculated. Let X_U be the set of coordinate points corresponding to U, and N_U the sum of the areas of all fluorescence regions (i.e., the total number of pixels corresponding to U). The average of the grayscale image is then calculated as

    G_bar = (1/N_U) * sum over x in X_U of G(x)
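Both variants can be sketched as below. Note the weight W = G_bar / U_bar is our reading of the formulas above and should be checked against the original equations:

```python
import numpy as np

def normalize_fluorescence(fluor, gray, region_masks, mode=1):
    """Scale the segmented fluorescence regions toward the brightness of
    the grayscale image. mode 1 averages the whole grayscale image;
    mode 2 averages only the grayscale pixels under the regions (X_U)."""
    u_bar = np.mean([fluor[m].mean() for m in region_masks])  # mean of region means
    if mode == 1:
        g_bar = gray.mean()
    else:
        union = np.any(np.stack(region_masks), axis=0)
        g_bar = gray[union].mean()
    w = g_bar / u_bar              # assumed weight W = G_bar / U_bar
    out = fluor.astype(float).copy()
    for m in region_masks:
        out[m] = w * fluor[m]      # background pixels are left unchanged
    return out
```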
S44, calculating a stereo disparity map;
and obtaining a stereo parallax image of the fluorescent monochromatic image matched to the white light color image by using a stereo matching algorithm.
Before the following algorithms are used, the binocular camera module must first be stereo-rectified. After rectification, corresponding points in the left and right images of the photographed subject lie at the same vertical position, i.e., have the same y coordinate, so the matching algorithm only needs to search horizontally, which reduces the computation. Binocular rectification is a mature algorithm and is not described again here. During binocular calibration, the visible-light cut filter in the fluorescence camera can temporarily be replaced with a fluorescence cut filter so that white-light images can be used for calibration.
Let the first lens (white light camera) be the left camera, the second lens (fluorescence camera) be the right camera. The stereo matching algorithm has the following two modes:
the first method is as follows: simple block matching algorithm
The following operation is performed for every pixel in the fluorescence regions:
Let Xi be a pixel coordinate in the current fluorescence image, and take its adjacent 3x3 or 5x5 neighborhood, denoted Yi. On the grayscale image, starting from coordinate Xi, search to the right for the gray block most similar to Yi (block matching) and record the coordinate Xm of the matching block on the grayscale image. Then dXi = Xm - Xi is the disparity corresponding to fluorescence pixel Xi.
The second method comprises the following steps: SGM (Semi-Global Matching)
Although the first method is simple and fast, it produces more noise and less accurate disparities. In practical applications, the SGM algorithm is used more often.
Ref:H.Hirschmuller,"Stereo Processing by Semiglobal Matching and Mutual Information,"in IEEE Transactions on Pattern Analysis and Machine Intelligence,vol.30,no.2,pp.328-341,Feb.2008,doi:10.1109/TPAMI.2007.1166.
The purpose of the two methods is to obtain the parallax (disparity) value of each fluorescent pixel point corresponding to the gray scale image.
S45, reconstructing a fluorescence monochromatic image;
and reconstructing the fluorescent monochromatic image into the coordinate system of the white light color image according to the stereo parallax image to obtain a reconstructed fluorescent monochromatic image.
Reconstructing a fluorescent monochromatic image can be achieved in two ways:
Method 1: pixel-based
For any fluorescence pixel X_i, obtain its corresponding disparity dX_i; its coordinate on the gray-scale (white-light) image is then X_m = X_i + dX_i.
Method 2: region-based
For any fluorescence region U_i, compute the average of the disparities dX corresponding to all of its pixels:

dX̄_i = (1 / N_i) · Σ_{X ∈ U_i} dX

where N_i is the number of pixels in U_i. Shifting all pixels of the fluorescence region U_i to the right by dX̄_i then reconstructs U_i into the white-light image coordinate system.
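The region-based shift can be sketched as follows (a numpy illustration under assumed inputs — a boolean region mask and a per-pixel disparity map; names are not from the patent):

```python
import numpy as np

def reconstruct_region(fluo, mask, disparity):
    """Shift one fluorescence region (given by a boolean mask) to the
    right by the rounded mean of its per-pixel disparities, placing it
    in the white-light image coordinate system."""
    out = np.zeros_like(fluo)
    mean_d = int(round(disparity[mask].mean()))   # average disparity of the region
    ys, xs = np.nonzero(mask)
    xs_new = np.clip(xs + mean_d, 0, fluo.shape[1] - 1)  # clamp at the right border
    out[ys, xs_new] = fluo[ys, xs]
    return out
```

Using one mean shift per region avoids the per-pixel noise of Method 1 at the cost of assuming the region is roughly at a single depth.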
S46, image superposition;
and superposing the reconstructed fluorescence monochromatic image on the RGB image after color correction to form an endoscope image.
During superposition, the RGB image also undergoes image enhancement and image parameter adjustment. Image enhancement includes common operations such as sharpening, denoising, and edge enhancement; image parameter adjustment includes adjusting parameters such as brightness, contrast, and saturation. The enhanced and adjusted RGB image has higher quality, so the subsequent superposition yields a clearer endoscope image.
In the endoscopic imaging method provided by this embodiment, visible light and fluorescence are collected by two front-end cameras, each of which can be focused independently, so the sharpness of the visible-light image is consistent with that of the fluorescence image. Stereo disparity is computed between the fluorescence image and the white-light image, and the fluorescence image is reconstructed into the white-light coordinate system, so a clear and accurate image is obtained.
In another embodiment, to further improve the imaging effect, fluorescence intensity compensation is also performed after step S44. The lens collects fluorescence more weakly from more distant fluorescence regions, so distant regions appear dimmer after imaging; compensating the fluorescence intensity makes fluorescence regions at different distances from the lens show the same intensity.
The principle of fluorescence intensity compensation is as follows: the distance between the photographed subject inside the human body and the camera end face is calculated from the stereo disparity map, giving depth information for the different fluorescence regions; the fluorescence brightness of the more distant regions is then compensated.
The depth information is calculated as follows:
Let a pixel coordinate in the current fluorescence image be X_i, with corresponding disparity dX_i. The depth value formula is:

Z_i = f · T / dX_i

where f is the camera focal length and T is the center-to-center distance (baseline) between the left and right cameras; both values are obtained from the stereo rectification result.
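This is the standard stereo triangulation relation; a short numpy sketch (f in pixels and T in the same length unit as the desired depth, both assumed known from the rectification result):

```python
import numpy as np

def depth_from_disparity(disparity, f, T):
    """Z = f * T / dX for each pixel; a disparity of 0 maps to infinity."""
    disparity = np.asarray(disparity, dtype=np.float64)
    with np.errstate(divide="ignore"):  # tolerate zero-disparity pixels
        return f * T / disparity
```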
The fluorescence intensity compensation specifically includes the following two modes:
Method 1:
calculating the average fluorescence value of each fluorescence region and finding the region with the highest average value;
and multiplying the other fluorescence areas except the fluorescence area with the highest average value by a corresponding gain coefficient respectively to ensure that the fluorescence brightness of the other fluorescence areas is consistent with that of the fluorescence area with the highest average value.
Method 2:
designing and calculating a fluorescence compensation gain curve according to the distance between a human body object and the end face of the camera in advance;
and after depth information of different fluorescence areas is obtained, fluorescence compensation is carried out on each area according to the compensation gain curve.
Wherein the compensation curve is calculated as follows:
A fluorescence target is used, and the fluorescence camera collects fluorescence images at distances of 1 cm, 2 cm, ..., 20 cm from the target. The central 32x32 region of each image is taken and its average value computed. With distance as the abscissa and fluorescence value as the ordinate, an attenuation curve of fluorescence intensity versus distance is drawn. Assuming 5 cm (or any distance deemed appropriate) is the optimal fluorescence distance, the attenuation curve is normalized by the fluorescence value at 5 cm, giving the fluorescence-intensity compensation value as a function of distance. One implementation: let y_b be the optimal fluorescence value; the compensation curve is b_i = y_b / y_i, i = {1, 2, ..., 20}, where y_i is the fluorescence value measured at a distance of i cm.
The fluorescence target is a solid-color uniform target. To reduce noise interference, several images can be captured consecutively at each distance and averaged to obtain a mean image.
The compensation curve is a discrete curve, and in application, an interpolation algorithm can be used to obtain a compensation coefficient at the current distance.
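Building and querying the discrete compensation curve with linear interpolation can be sketched as follows (numpy; the function names and the synthetic inverse falloff in the example are illustrative, not measured values):

```python
import numpy as np

def build_compensation_curve(distances_cm, measured, optimal_cm=5.0):
    """b_i = y_b / y_i: normalize the measured attenuation curve by the
    fluorescence value at the chosen optimal distance."""
    measured = np.asarray(measured, dtype=np.float64)
    y_b = np.interp(optimal_cm, distances_cm, measured)  # optimal fluorescence value
    return y_b / measured

def gain_at(distance_cm, distances_cm, curve):
    """Interpolate the discrete compensation curve at an arbitrary distance."""
    return np.interp(distance_cm, distances_cm, curve)
```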
The specific method for applying the compensation curve to the compensation is as follows:
1) calculate the mean depth value of the pixels of any fluorescence region in the fluorescence image;
2) use an interpolation algorithm to obtain the corresponding compensation coefficient α from the fluorescence depth compensation curve;
3) multiply each fluorescence pixel in the current region by the compensation coefficient α.
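The three application steps above can be sketched as follows (numpy; assumes a depth map in the same units as the curve's abscissa — all names are illustrative):

```python
import numpy as np

def compensate_region(fluo, mask, depth, distances_cm, curve):
    """1) mean depth of the region -> 2) interpolated gain alpha ->
    3) multiply every fluorescence pixel in the region by alpha."""
    mean_depth_cm = depth[mask].mean()
    alpha = np.interp(mean_depth_cm, distances_cm, curve)
    out = fluo.astype(np.float64).copy()
    out[mask] *= alpha  # only the region's pixels are compensated
    return out
```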
The present invention has been described above with reference to specific examples, which are intended only to aid understanding and do not limit the invention. A person skilled in the art to which the invention pertains may make several simple deductions, modifications, or substitutions according to the idea of the invention.

Claims (10)

1. An endoscopic imaging system, comprising:
a handle having a threading channel;
a scope tube connected to the front end of the handle;
a light source assembly comprising a first light source, a second light source, and a light guide bundle, the first light source being used to emit white light and the second light source to emit excitation light, one end of the light guide bundle passing through the threading channel and extending to the front end inside the scope tube, the other end being connected to the first light source and the second light source;
a first front-end assembly mounted at the front end inside the scope tube, the first front-end assembly comprising a first lens group and a first sensor arranged in sequence from front to rear, the first lens group being used to collect visible light reflected by a human subject, and the first sensor to collect the visible light filtered by the first lens group and generate a first image signal;
a second front-end assembly arranged side by side with the first front-end assembly at the front end inside the scope tube, the second front-end assembly comprising a second lens group and a second sensor arranged in sequence from front to rear, the second lens group being used to collect fluorescence excited from the human subject, and the second sensor to collect the fluorescence filtered by the second lens group and generate a second image signal; and
a control device connected to the first light source, the second light source, the first sensor, and the second sensor, the control device being used to control the first light source to emit white light and the second light source to emit excitation light respectively, to collect the first image signal generated by the first sensor and the second image signal generated by the second sensor, and to output an endoscope image.
2. The endoscopic imaging system of claim 1, wherein said first lens group comprises a fluorescence cut filter disposed at a rear end of said first lens group, and said second lens group comprises a visible light cut filter disposed at a rear end of said second lens group.
3. The endoscopic imaging system of claim 1, wherein the first lens group comprises a fluorescence cut-off filter affixed to the front or rear lens of the first lens group; the second lens group comprises a visible light cut-off filter film, and the visible light cut-off filter film is attached to the front end or the rear end lens of the second lens group.
4. The endoscopic imaging system of claim 2 or 3, wherein said light source assembly further comprises a fused light path having two input ends and one output end, said two input ends of said fused light path being aligned with said first and second light sources, respectively, said output end of said fused light path being connected to said light guide bundle, said fused light path being configured to fuse the white light and the excitation light into mixed light.
5. The endoscopic imaging system according to claim 4, wherein said output end of said light guide is provided with two branch beams, said two branch beams being symmetrically disposed at a front end within said scope.
6. An endoscopic imaging system as defined in claim 2 or 3, wherein said control device comprises a controller for controlling said first light source to emit white light and said second light source to emit excitation light, respectively, an image capturing module for capturing first image information generated by said first sensor and second image information generated by said second sensor, and an image processing module for converting said first image signal and said second image signal into an endoscopic image.
7. The endoscopic imaging system of claim 6, wherein the control device further comprises a sensor driver module, an input of the sensor driver module is connected to the controller, an output of the sensor driver module is connected to the first and second sensors, respectively, and the sensor driver module is configured to output sensor setting information calculated by the controller to the first and second sensors.
8. The endoscopic imaging system of claim 6, wherein the light source assembly further comprises a light source control module, an input of the light source control module is connected to the controller, an output of the light source control module is connected to the first and second light sources, and the light source control module is configured to receive a control signal from the controller and to control the light intensity of the light emitted by the first and second light sources.
9. The endoscopic imaging system of claim 6 wherein the control apparatus further comprises an input device, the input device being connected to the controller.
10. The endoscopic imaging system of claim 6, further comprising a display coupled to the image processing module.
CN202021451250.4U 2020-07-21 2020-07-21 Endoscope imaging system Active CN212326346U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202021451250.4U CN212326346U (en) 2020-07-21 2020-07-21 Endoscope imaging system


Publications (1)

Publication Number Publication Date
CN212326346U true CN212326346U (en) 2021-01-12

Family

ID=74081975

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202021451250.4U Active CN212326346U (en) 2020-07-21 2020-07-21 Endoscope imaging system

Country Status (1)

Country Link
CN (1) CN212326346U (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115153399A (en) * 2022-09-05 2022-10-11 浙江华诺康科技有限公司 Endoscope system
WO2023109853A1 (en) * 2021-12-14 2023-06-22 微创优通医疗科技(上海)有限公司 Binocular endoscope and binocular endoscope imaging system thereof


Similar Documents

Publication Publication Date Title
CN111803013A (en) Endoscope imaging method and endoscope imaging system
US11330237B2 (en) Medical inspection apparatus, such as a microscope or endoscope using pseudocolors
US11062442B2 (en) Vascular information acquisition device, endoscope system, and vascular information acquisition method
KR101854189B1 (en) Augmented stereoscopic visualization for a surgical robot
US7123756B2 (en) Method and apparatus for standardized fluorescence image generation
US7539335B2 (en) Image data processor, computer program product, and electronic endoscope system
US8724015B2 (en) Image processing apparatus, image processing method, imaging apparatus, and information storage medium
CN113208567A (en) Multispectral imaging system, imaging method and storage medium
CN212326346U (en) Endoscope imaging system
CA2627611A1 (en) Imaging system and method to improve depth perception
CN109310271B (en) Medical observation device and method for temporally and/or spatially modulated false color patterns
US9198564B2 (en) Image processing device and fluoroscopy device
US20110109761A1 (en) Image display method and apparatus
CN110772208B (en) Method, device and equipment for acquiring fluorescence image and endoscope system
CN114414046A (en) Observation support device, information processing method, and program
EP2589328A1 (en) Image processing apparatus and image processing method
EP4248835A1 (en) Fluorescence endoscope system, control method and storage medium
EP4201298A1 (en) Endoscope system with adaptive lighting control
WO2021095672A1 (en) Information processing device and information processing method
CN115804561A (en) Method and apparatus for video endoscopy using fluorescent light

Legal Events

Date Code Title Description
GR01 Patent grant